EarBit: Using Wearable Sensors to Detect Eating Episodes in Unconstrained Environments. Bedri, A., Li, R., Haynes, M., Kosaraju, R. P., Grover, I., Prioleau, T., Beh, M. Y., Goel, M., Starner, T., & Abowd, G. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 1(3), ACM, September 2017.
Abstract: Chronic and widespread diseases such as obesity, diabetes, and hypercholesterolemia require patients to monitor their food intake, and food journaling is currently the most common method for doing so. However, food journaling is subject to self-bias and recall errors, and is poorly adhered to by patients. In this paper, we propose an alternative by introducing EarBit, a wearable system that detects eating moments. We evaluate the performance of inertial, optical, and acoustic sensing modalities and focus on inertial sensing, by virtue of its recognition and usability performance. Using data collected in a simulated home setting with minimum restrictions on participants' behavior, we build our models and evaluate them with an unconstrained outside-the-lab study. For both studies, we obtained video footage as ground truth for participants' activities. Using leave-one-user-out validation, EarBit recognized all the eating episodes in the semi-controlled lab study, and achieved an accuracy of 90.1% and an F1-score of 90.9% in detecting chewing instances. In the unconstrained, outside-the-lab evaluation, EarBit obtained an accuracy of 93% and an F1-score of 80.1% in detecting chewing instances. It also accurately recognized all but one of the recorded eating episodes. These episodes ranged from a 2-minute snack to a 30-minute meal.
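The leave-one-user-out validation mentioned in the abstract tests generalization across participants rather than across random samples: each participant's data is held out in turn while the model trains on everyone else. Below is a minimal sketch of such an evaluation loop; the feature matrix, labels, user IDs, and the random-forest classifier are illustrative assumptions, not the authors' actual pipeline.

# Hedged sketch of leave-one-user-out evaluation of a per-second chewing classifier.
# X, y, user_ids, and the RandomForest choice are illustrative assumptions,
# not EarBit's actual features or model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import LeaveOneGroupOut

def leave_one_user_out_eval(X, y, user_ids):
    """X: per-second feature vectors; y: chew / no-chew labels;
    user_ids: which participant each sample came from."""
    logo = LeaveOneGroupOut()
    accs, f1s = [], []
    for train_idx, test_idx in logo.split(X, y, groups=user_ids):
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X[train_idx], y[train_idx])
        pred = clf.predict(X[test_idx])
        accs.append(accuracy_score(y[test_idx], pred))
        f1s.append(f1_score(y[test_idx], pred))
    return float(np.mean(accs)), float(np.mean(f1s))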
@article{bedri:earbit,
title = {EarBit: Using Wearable Sensors to Detect Eating Episodes in Unconstrained Environments},
type = {article},
year = {2017},
keywords = {auracle,eating,in-ear-mic,in-take-detection,inertial,sensors,wearable},
volume = {1},
url = {http://dx.doi.org/10.1145/3130902},
doi = {10.1145/3130902},
month = {9},
publisher = {ACM},
notes = {a paper from the MD2K project. • they developed a wearable system for detecting chewing and eating, built from off-the-shelf parts. • they experimented in both a controlled setting (a smart-home lab) and a free-living setting (3-hour session). • they included inertial, optical, and acoustic sensors: in the ear, behind the ear, behind the neck, and around the neck. • they found the two inertial sensors (behind the ear and behind the neck) were sufficient. • they used a GoPro mounted on the abdomen, facing up, as the ground truth; employed 4 coders and cross-checked reliability (page 37:8). • they sought to recognize “chewing instances” (1 s granularity) and “eating episodes” (aggregated). • to define “eating episodes” (page 37:15) they aggregated chewing instances within 10 min of each other; they discarded eating episodes less than 2 min long (a minimal sketch of this aggregation follows the BibTeX entry below). • notice how they did all their work to build, tune, and verify their model in-lab, then applied that model to the out-of-lab data “to avoid any unintentional and manual overfitting on the test data.” • do we need to add an accelerometer to Auracle, either as a secondary sensor, to eliminate motion artifacts, or to support some kinds of interaction? • section 6.4 provides some comments on mechanical design and feedback about form factors. • [31] is a survey of systems for wearable dietary monitoring. • although it is nice work, they note (section 6.2) that “the quest for a true evaluation of eating activity in unconstrained environments remains unfinished.”},
abstract = {Chronic and widespread diseases such as obesity, diabetes, and hypercholesterolemia require patients to monitor their food intake, and food journaling is currently the most common method for doing so. However, food journaling is subject to self-bias and recall errors, and is poorly adhered to by patients. In this paper, we propose an alternative by introducing EarBit, a wearable system that detects eating moments. We evaluate the performance of inertial, optical, and acoustic sensing modalities and focus on inertial sensing, by virtue of its recognition and usability performance. Using data collected in a simulated home setting with minimum restrictions on participants' behavior, we build our models and evaluate them with an unconstrained outside-the-lab study. For both studies, we obtained video footage as ground truth for participants activities. Using leave-one-user-out validation, EarBit recognized all the eating episodes in the semi-controlled lab study, and achieved an accuracy of 90.1% and an F1-score of 90.9% in detecting chewing instances. In the unconstrained, outside-the-lab evaluation, EarBit obtained an accuracy of 93% and an F1-score of 80.1% in detecting chewing instances. It also accurately recognized all but one recorded eating episodes. These episodes ranged from a 2 minute snack to a 30 minute meal.},
bibtype = {article},
author = {Bedri, Abdelkareem and Li, Richard and Haynes, Malcolm and Kosaraju, Raj P. and Grover, Ishaan and Prioleau, Temiloluwa and Beh, Min Y. and Goel, Mayank and Starner, Thad and Abowd, Gregory},
journal = {Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies},
number = {3}
}
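The episode-aggregation rule described in the notes above (merge chewing instances within 10 minutes of each other, then discard episodes shorter than 2 minutes) can be sketched as follows. The function name, parameters, and example data are hypothetical, not taken from the paper.

# Hedged sketch of the episode-aggregation heuristic from the notes above:
# merge chewing-instance timestamps separated by <= 10 min into one eating
# episode, then drop episodes shorter than 2 min. Names and second-based
# thresholds are illustrative assumptions, not the paper's code.
def aggregate_episodes(chew_times, max_gap_s=600, min_len_s=120):
    """chew_times: sorted timestamps (in seconds) labeled as chewing."""
    episodes = []
    if not chew_times:
        return episodes
    start = prev = chew_times[0]
    for t in chew_times[1:]:
        if t - prev > max_gap_s:           # a gap > 10 min closes the current episode
            if prev - start >= min_len_s:  # keep only episodes lasting >= 2 min
                episodes.append((start, prev))
            start = t
        prev = t
    if prev - start >= min_len_s:
        episodes.append((start, prev))
    return episodes

# Example: a ~3-minute cluster of chewing plus a 30-second burst; only the
# first cluster survives the 2-minute minimum.
print(aggregate_episodes(list(range(0, 200)) + list(range(2000, 2030))))
# -> [(0, 199)]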
{"_id":"JWyk95zhS2Ky5Tb7p","bibbaseid":"bedri-li-haynes-kosaraju-grover-prioleau-beh-goel-etal-earbitusingwearablesensorstodetecteatingepisodesinunconstrainedenvironments-2017","downloads":0,"creationDate":"2019-02-15T15:15:01.654Z","title":"EarBit: Using Wearable Sensors to Detect Eating Episodes in Unconstrained Environments","author_short":["Bedri, A.","Li, R.","Haynes, M.","Kosaraju, R., P.","Grover, I.","Prioleau, T.","Beh, M., Y.","Goel, M.","Starner, T.","Abowd, G."],"year":2017,"bibtype":"article","biburl":null,"bibdata":{"title":"EarBit: Using Wearable Sensors to Detect Eating Episodes in Unconstrained Environments","type":"article","year":"2017","identifiers":"[object Object]","keywords":"auracle,eating,in-ear-mic,in-take-detection,inertial,sensors,wearable","volume":"1","websites":"http://dx.doi.org/10.1145/3130902","month":"9","publisher":"ACM","id":"08ad0181-7af7-35c8-886b-1b6e0c20137d","created":"2018-07-12T21:32:27.440Z","file_attached":false,"profile_id":"f954d000-ce94-3da6-bd26-b983145a920f","group_id":"b0b145a3-980e-3ad7-a16f-c93918c606ed","last_modified":"2018-07-12T21:32:27.440Z","read":false,"starred":false,"authored":false,"confirmed":"true","hidden":false,"citation_key":"bedri:earbit","source_type":"article","notes":"a paper from the MD2K project. • they developed a wearable system for detecting chewing and eating, from off-the-shelf parts. • they experimented in both controlled setting (a smart-home lab) and free-living setting (3 hour session). • they included inertial, optical, and acoustic sensors: in the ear, behind the ear, behind the neck, and around the neck. • they found the two inertial sensors (behind ear and behind neck) were sufficient. • they used a GoPro mounted on abdomen, facing up, as the ground truth; employed 4 coders and cross-checked reliability (page 37:8). • they sought to recognized ” chewing instances” (1s granularity) and ” eating episodes” (aggregated) • to define ” eating episodes” (page 37:15) they aggregated chewing instances within 10min of each other; they discarded eating episodes less than 2min long. • notice how they did all their work to build, tune, and verify their model in-lab, then applied that model to the out-of-lab data ” to avoid any unintentional and manual overfitting on the test data.” • do we need to add an accelerometer to Auracle, either as secondary sensor or to eliminate motion artifacts or to support some kinds of interaction? • section 6.4 provides some comments on mechanical design and feedback about form factors. • [31] is a survey of systems for wearable dietary monitoring. • although it is nice work, they note (section 6.2) ” the quest for a true evaluation of eating activity in unconstrained environments remains unfinished.”","private_publication":false,"abstract":"Chronic and widespread diseases such as obesity, diabetes, and hypercholesterolemia require patients to monitor their food intake, and food journaling is currently the most common method for doing so. However, food journaling is subject to self-bias and recall errors, and is poorly adhered to by patients. In this paper, we propose an alternative by introducing EarBit, a wearable system that detects eating moments. We evaluate the performance of inertial, optical, and acoustic sensing modalities and focus on inertial sensing, by virtue of its recognition and usability performance. 
Using data collected in a simulated home setting with minimum restrictions on participants' behavior, we build our models and evaluate them with an unconstrained outside-the-lab study. For both studies, we obtained video footage as ground truth for participants activities. Using leave-one-user-out validation, EarBit recognized all the eating episodes in the semi-controlled lab study, and achieved an accuracy of 90.1% and an F1-score of 90.9% in detecting chewing instances. In the unconstrained, outside-the-lab evaluation, EarBit obtained an accuracy of 93% and an F1-score of 80.1% in detecting chewing instances. It also accurately recognized all but one recorded eating episodes. These episodes ranged from a 2 minute snack to a 30 minute meal.","bibtype":"article","author":"Bedri, Abdelkareem and Li, Richard and Haynes, Malcolm and Kosaraju, Raj P and Grover, Ishaan and Prioleau, Temiloluwa and Beh, Min Y and Goel, Mayank and Starner, Thad and Abowd, Gregory","journal":"Proc. ACM Interactive, Mobile and Wearable Ubiquitous Technology","number":"3","bibtex":"@article{\n title = {EarBit: Using Wearable Sensors to Detect Eating Episodes in Unconstrained Environments},\n type = {article},\n year = {2017},\n identifiers = {[object Object]},\n keywords = {auracle,eating,in-ear-mic,in-take-detection,inertial,sensors,wearable},\n volume = {1},\n websites = {http://dx.doi.org/10.1145/3130902},\n month = {9},\n publisher = {ACM},\n id = {08ad0181-7af7-35c8-886b-1b6e0c20137d},\n created = {2018-07-12T21:32:27.440Z},\n file_attached = {false},\n profile_id = {f954d000-ce94-3da6-bd26-b983145a920f},\n group_id = {b0b145a3-980e-3ad7-a16f-c93918c606ed},\n last_modified = {2018-07-12T21:32:27.440Z},\n read = {false},\n starred = {false},\n authored = {false},\n confirmed = {true},\n hidden = {false},\n citation_key = {bedri:earbit},\n source_type = {article},\n notes = {a paper from the MD2K project. • they developed a wearable system for detecting chewing and eating, from off-the-shelf parts. • they experimented in both controlled setting (a smart-home lab) and free-living setting (3 hour session). • they included inertial, optical, and acoustic sensors: in the ear, behind the ear, behind the neck, and around the neck. • they found the two inertial sensors (behind ear and behind neck) were sufficient. • they used a GoPro mounted on abdomen, facing up, as the ground truth; employed 4 coders and cross-checked reliability (page 37:8). • they sought to recognized ” chewing instances” (1s granularity) and ” eating episodes” (aggregated) • to define ” eating episodes” (page 37:15) they aggregated chewing instances within 10min of each other; they discarded eating episodes less than 2min long. • notice how they did all their work to build, tune, and verify their model in-lab, then applied that model to the out-of-lab data ” to avoid any unintentional and manual overfitting on the test data.” • do we need to add an accelerometer to Auracle, either as secondary sensor or to eliminate motion artifacts or to support some kinds of interaction? • section 6.4 provides some comments on mechanical design and feedback about form factors. • [31] is a survey of systems for wearable dietary monitoring. 
• although it is nice work, they note (section 6.2) ” the quest for a true evaluation of eating activity in unconstrained environments remains unfinished.”},\n private_publication = {false},\n abstract = {Chronic and widespread diseases such as obesity, diabetes, and hypercholesterolemia require patients to monitor their food intake, and food journaling is currently the most common method for doing so. However, food journaling is subject to self-bias and recall errors, and is poorly adhered to by patients. In this paper, we propose an alternative by introducing EarBit, a wearable system that detects eating moments. We evaluate the performance of inertial, optical, and acoustic sensing modalities and focus on inertial sensing, by virtue of its recognition and usability performance. Using data collected in a simulated home setting with minimum restrictions on participants' behavior, we build our models and evaluate them with an unconstrained outside-the-lab study. For both studies, we obtained video footage as ground truth for participants activities. Using leave-one-user-out validation, EarBit recognized all the eating episodes in the semi-controlled lab study, and achieved an accuracy of 90.1% and an F1-score of 90.9% in detecting chewing instances. In the unconstrained, outside-the-lab evaluation, EarBit obtained an accuracy of 93% and an F1-score of 80.1% in detecting chewing instances. It also accurately recognized all but one recorded eating episodes. These episodes ranged from a 2 minute snack to a 30 minute meal.},\n bibtype = {article},\n author = {Bedri, Abdelkareem and Li, Richard and Haynes, Malcolm and Kosaraju, Raj P and Grover, Ishaan and Prioleau, Temiloluwa and Beh, Min Y and Goel, Mayank and Starner, Thad and Abowd, Gregory},\n journal = {Proc. ACM Interactive, Mobile and Wearable Ubiquitous Technology},\n number = {3}\n}","author_short":["Bedri, A.","Li, R.","Haynes, M.","Kosaraju, R., P.","Grover, I.","Prioleau, T.","Beh, M., Y.","Goel, M.","Starner, T.","Abowd, G."],"urls":{"Website":"http://dx.doi.org/10.1145/3130902"},"bibbaseid":"bedri-li-haynes-kosaraju-grover-prioleau-beh-goel-etal-earbitusingwearablesensorstodetecteatingepisodesinunconstrainedenvironments-2017","role":"author","keyword":["auracle","eating","in-ear-mic","in-take-detection","inertial","sensors","wearable"],"downloads":0},"search_terms":["earbit","using","wearable","sensors","detect","eating","episodes","unconstrained","environments","bedri","li","haynes","kosaraju","grover","prioleau","beh","goel","starner","abowd"],"keywords":["auracle","eating","in-ear-mic","in-take-detection","inertial","sensors","wearable"],"authorIDs":[]}