Monitoring an individual’s eating behavior will provide a detailed understanding of the causal relationship between eating activity and conditions such as problematic eating and obesity. Manual self-reports are often erroneous and biased. The availability of wearable video cameras makes it possible to capture eating activity objectively. However, manually reviewing video frames to obtain an objective measure of eating is burdensome. Additionally, wearable video cameras pose privacy concerns in real-world settings. To overcome these challenges, we are developing an automated eating detection system that monitors eating behavior in real time via a privacy-conscious wearable device and confirms eating activities as they occur. Such automatic, privacy-conscious, real-time eating detection will advance our understanding of eating behavior and lay the foundation for future interventions to change problematic eating behaviors.
To this end, we will first develop an activity detection algorithm that detects eating using data from an infrared (IR) sensor array and RGB images. Next, we will test several obfuscation methods in a cross-over trial and select the method with the greatest participant acceptability. We will then deploy the eating detection algorithm with the best obfuscation approach on a novel wearable camera equipped with an infrared sensor array and use this camera to test the feasibility of detecting eating in a real-world setting. To validate the algorithm, we will ask participants to confirm or refute predicted eating and non-eating moments, and we will compare the algorithm’s predictions against both real-time user responses and 24-hour dietary recalls to objectively evaluate its performance. Our proposed system will improve current research practices for evaluating dietary intake and pave the way for personalized interventions in behavioral medicine.
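As a rough illustration of the kind of IR-plus-RGB fusion model we have in mind, the sketch below shows a minimal two-branch classifier. The framework choice (PyTorch), the input shapes (an 8x8 thermal grid and a 64x64 RGB crop), and the layer sizes are illustrative assumptions, not the final design.

```python
# Hypothetical sketch of a sensor-fusion eating detector.
# Input shapes, layer sizes, and the two-class output are assumptions for illustration.
import torch
import torch.nn as nn


class EatingDetector(nn.Module):
    """Fuses a low-resolution IR sensor array with an RGB image to predict
    eating vs. non-eating for a single moment in time."""

    def __init__(self):
        super().__init__()
        # Branch for the IR sensor array (assumed 1 x 8 x 8 thermal grid).
        self.ir_branch = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
            nn.Flatten(),                      # -> 16 * 4 * 4 = 256 features
        )
        # Branch for the RGB image (assumed 3 x 64 x 64 crop).
        self.rgb_branch = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
            nn.Flatten(),                      # -> 64 * 4 * 4 = 1024 features
        )
        # Late fusion: concatenate both feature vectors and classify.
        self.classifier = nn.Sequential(
            nn.Linear(256 + 1024, 128),
            nn.ReLU(),
            nn.Linear(128, 2),                 # eating vs. non-eating logits
        )

    def forward(self, ir, rgb):
        features = torch.cat([self.ir_branch(ir), self.rgb_branch(rgb)], dim=1)
        return self.classifier(features)


if __name__ == "__main__":
    model = EatingDetector()
    ir = torch.randn(1, 1, 8, 8)      # one simulated IR sensor-array reading
    rgb = torch.randn(1, 3, 64, 64)   # one simulated RGB frame
    print(model(ir, rgb).shape)       # torch.Size([1, 2])
```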
Participants performed eating and non-eating activities in a controlled environment for an entire day. We developed an automated eating detection algorithm using data from each obfuscation method to determine the accuracy of eating detection across methods.
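The sketch below shows one way per-method detection accuracy could be tabulated; the obfuscation method names and the toy predictions and labels are placeholders for illustration, not study data.

```python
# Hypothetical sketch: compare detection accuracy across obfuscation methods.
# Method names and (prediction, label) pairs are invented examples; in the study
# they would come from the trained detector and the annotated ground truth.
from collections import defaultdict

# (obfuscation_method, predicted_label, true_label) per evaluated window
results = [
    ("blur",       1, 1), ("blur",       0, 0), ("blur",       1, 0),
    ("silhouette", 1, 1), ("silhouette", 0, 1), ("silhouette", 0, 0),
    ("ir_only",    1, 1), ("ir_only",    1, 1), ("ir_only",    0, 0),
]

correct = defaultdict(int)
total = defaultdict(int)
for method, predicted, actual in results:
    total[method] += 1
    correct[method] += int(predicted == actual)

for method in sorted(total):
    accuracy = correct[method] / total[method]
    print(f"{method:11s} accuracy = {accuracy:.2f} ({correct[method]}/{total[method]})")
```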
A cohort of 72 participants will wear the camera with three obfuscation methods and one non-obfuscation method (7 consecutive days per method) during the entire wake period, with conditions separated by a 7-day washout. We will assess subjective acceptability and conduct a fully powered experiment by objectively measuring the total wear time for each version.
New participants (n=60, 50% with obesity, 50% female) will wear EAT with the best obfuscation method, identified from the results of Aims 1 and 2, for 7 days while self-reporting eating moments and context using a quick-tap app on a smartwatch. We will determine how many eating episodes are detected by the EAT-informed algorithm and the number of gestures required to capture an eating moment in a real-world setting.
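The sketch below illustrates one way detected episodes could be matched against smartwatch self-reports; the 15-minute matching tolerance, the helper name, and the example timestamps are illustrative assumptions, not the study protocol.

```python
# Hypothetical sketch: match detected eating episodes to self-reported moments.
# The 15-minute tolerance and the timestamps are made up for illustration.
from datetime import datetime, timedelta

TOLERANCE = timedelta(minutes=15)

def matched_episodes(detected, self_reported, tolerance=TOLERANCE):
    """Count self-reported eating moments with at least one detected episode
    starting within the given tolerance window."""
    hits = 0
    for report in self_reported:
        if any(abs(report - d) <= tolerance for d in detected):
            hits += 1
    return hits

detected = [datetime(2024, 1, 1, 8, 5), datetime(2024, 1, 1, 12, 50)]
reported = [datetime(2024, 1, 1, 8, 0), datetime(2024, 1, 1, 13, 0),
            datetime(2024, 1, 1, 19, 30)]

hits = matched_episodes(detected, reported)
print(f"detected {hits}/{len(reported)} self-reported eating moments")
# -> detected 2/3 self-reported eating moments
```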
Angela Fidler Pfammatter
Associate Professor of Public Health, University of Tennessee Knoxville
Annie W. Lin
Assistant Professor of Nutrition Informatics, Hormel Institute, University of Minnesota
Josiah Hester
Associate Professor of Interactive Computing and Computer Science, College of Computing, Georgia Institute of Technology