
EAT: A Reliable Eating Assessment Technology for Free-living Individuals

Project Overview

Monitoring an individual’s eating behavior can provide a detailed understanding of the causal relationship between eating activity and conditions such as problematic eating and obesity. Manual self-reports are often erroneous and biased, while wearable video cameras make it possible to capture eating activity objectively. However, manually reviewing video frames to obtain an objective measure of eating is burdensome, and wearable video cameras pose privacy concerns in real-world settings. To overcome these challenges, we are developing an automated system that detects and confirms eating behavior in real time via a privacy-conscious wearable device. Such automatic, privacy-conscious, real-time eating detection will advance our understanding of eating activity, laying the foundation for future interventions to change problematic eating behaviors.

To this end, we will first develop an activity detection algorithm that detects eating using data from an infrared (IR) sensor array and RGB images. Next, we will test several obfuscation methods in a cross-over trial and select the method with the greatest participant acceptability. We will then deploy the eating detection algorithm, with the best obfuscation approach, on a novel wearable camera equipped with an IR sensor array, and use this camera to test whether eating can be detected in a real-world setting. To validate our algorithm, we will ask participants to confirm or refute predicted eating and non-eating moments, and we will compare the algorithm’s performance against both real-time user responses and 24-hour dietary recall. The proposed system will improve current research practices for evaluating dietary intake and pave the way for personalized interventions in behavioral medicine.
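As a rough illustration of the kind of pipeline described above, the sketch below shows one way frame-level eating cues could be derived from a low-resolution IR sensor array. Everything here is a hypothetical assumption for illustration: the 8×8 frame size, the temperature threshold, the window sizes, and the function names are not the project’s actual implementation.

```python
import numpy as np

# Hypothetical sketch: treat each 8x8 IR-array frame as "hand near mouth"
# when enough warm (skin-temperature) pixels appear in the mouth-facing
# half of the frame, then flag a possible eating moment when such frames
# cluster within a sliding window. All thresholds are illustrative.

WARM_C = 30.0          # pixels above this (deg C) count as skin-warm
MIN_WARM_PIXELS = 4    # warm pixels needed to call a frame "hand present"
WINDOW = 10            # number of recent frames per decision window
MIN_HITS = 6           # "hand present" frames needed to flag eating

def frame_has_hand(frame: np.ndarray) -> bool:
    """True if the mouth-facing half of an 8x8 frame shows a warm blob."""
    upper = frame[:4, :]  # rows nearest the mouth in this sketch
    return int((upper > WARM_C).sum()) >= MIN_WARM_PIXELS

def detect_eating(frames: list) -> bool:
    """Flag eating if enough recent frames show a hand near the mouth."""
    hits = sum(frame_has_hand(f) for f in frames[-WINDOW:])
    return hits >= MIN_HITS
```

In practice the project’s algorithm would also fuse RGB imagery and a learned classifier; this fixed-threshold version only conveys why a coarse thermal array can signal hand-to-mouth gestures without capturing identifiable video.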

Aims

Aim 1: Compare accuracy of the eating detection algorithm that incorporates each obfuscation method in controlled environments

Participants performed eating and non-eating activities in a controlled environment for an entire day. We developed an automated eating detection algorithm using data from each obfuscation method to determine the accuracy of eating detection across methods.

The study team after setting up for one of the Aim 1 study sessions!


Aim 2: Measure acceptability/feasibility of the EAT real-time obfuscation methods

A cohort of 72 participants will wear the camera with three obfuscation methods and one non-obfuscation method (7 consecutive days per method) during their entire wake period, with a 7-day washout between methods. We will assess subjective acceptability and conduct a fully powered experiment that objectively measures total wear time for each version.

Aim 3: Test and validate algorithm to assess accuracy of detecting eating in free-living people

New participants (n=60; 50% with obesity, 50% female) will wear EAT with the best obfuscation method, identified from the results of Aims 1 and 2, for 7 days while self-reporting eating moments and context using a quick-tap app on a smartwatch. We will determine how many eating episodes the EAT-informed algorithm detects and how many gestures are required to capture an eating moment in a real-world setting.

Principal Investigator


Nabil Alshurafa
Director of the HABits Lab

Profile page

Co-investigators


Angela Fidler Pfammatter
Associate Professor of Public Health, University of Tennessee Knoxville

Profile

Annie W. Lin
Assistant Professor of Nutrition Informatics, Hormel Institute, University of Minnesota

Profile

Josiah Hester
Associate Professor of Interactive Computing and Computer Science, College of Computing, Georgia Institute of Technology

Profile

Jacob M. Schauer
Assistant Professor of Preventive Medicine, Northwestern University

Profile

Student Investigators


Blaine Rothrock

Profile

Soroush Shahi

Profile

Boyang Wei

Profile


Let’s Talk

680 N. Lakeshore Dr., Suite 1400, Chicago, IL 60611

Have a question? Let us know and we’ll get back to you ASAP!