Research Article
SmartNecklace: Designing a Wearable Multi-sensor System for Smart Eating Detection
@INPROCEEDINGS{10.4108/eai.15-12-2016.2267882,
  author={Eli Cohen and William Stogin and Haik Kalantarian and Nabil Alshurafa and Angela Pfammatter and Bonnie Spring},
  title={SmartNecklace: Designing a Wearable Multi-sensor System for Smart Eating Detection},
  proceedings={11th International Conference on Body Area Networks},
  publisher={ACM},
  proceedings_a={BODYNETS},
  year={2017},
  month={4},
  keywords={wearables; passive sensing; eating detection; alone; piezoelectric sensor; audio; accelerometer; wireless},
  doi={10.4108/eai.15-12-2016.2267882}
}
Eli Cohen
William Stogin
Haik Kalantarian
Nabil Alshurafa
Angela Pfammatter
Bonnie Spring
Year: 2017
BODYNETS
EAI
DOI: 10.4108/eai.15-12-2016.2267882
Abstract
Characterizing eating behaviors to inform and prevent obesity requires nutritionists, behaviorists, and interventionists to disrupt subjects’ routines with questionnaires and unfamiliar eating environments. Such a disruption may be necessary as a means of self-reflection; however, it prevents researchers from capturing problematic eating behaviors in a free-living environment. An automated system alleviates many of these disruptions; however, automating the sensing of eating habits has proven to be a challenge due to high within-subject variability in people’s eating habits. Given the positive correlation between eating duration and caloric intake, along with the fact that many problematic eaters spend time alone, this paper presents a passive sensing system designed with three goals: detecting eating episodes through data analytics of passive sensors, detecting time spent alone while eating, and designing a passive sensing system that people will adhere to wearing in the field without disrupting regular activity or behavior. A real-time, coarse, multilayered classification approach is proposed to detect challenging eating episodes with confounding factors using data from piezoelectric, audio, and inertial sensors. The system was tested on 7 participants with 14 eating episodes, yielding average F-measures of 80.8% for eating detection and 91.3% for alone-time detection.
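The abstract describes the coarse multilayered classification approach only at a high level. As an illustration of how such a multi-sensor eating detector might be structured, the sketch below shows a hypothetical two-layer pipeline: a cheap accelerometer-based gate followed by a classifier over fused piezoelectric, audio, and accelerometer window features. This is not the authors' implementation; the window length, sampling rates, feature set, gate threshold, and choice of a random-forest classifier are all assumptions made for the example.

```python
# Hypothetical sketch of a coarse multilayered eating detector over
# windowed piezoelectric, audio, and accelerometer signals.
# NOT the paper's implementation; all parameters below are illustrative.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

WINDOW_S = 10                                     # assumed window length (s)
FS = {"piezo": 20, "audio": 8000, "accel": 50}    # assumed sampling rates (Hz)

def window_features(piezo, audio, accel):
    """Simple per-window statistics from the three sensor streams."""
    feats = []
    for sig in (piezo, audio, accel.ravel()):
        feats += [np.mean(sig), np.std(sig),
                  np.sqrt(np.mean(sig ** 2)),      # RMS energy
                  np.mean(np.abs(np.diff(sig)))]   # mean absolute change
    return np.array(feats)                         # 4 features x 3 sensors

class CoarseMultilayerDetector:
    """Layer 1: coarse motion gate on accelerometer activity.
    Layer 2: classifier over fused piezo/audio/accel window features."""

    def __init__(self, motion_thresh=2.0):
        self.motion_thresh = motion_thresh         # assumed accel-std gate
        self.clf = RandomForestClassifier(n_estimators=100, random_state=0)

    def fit(self, feature_windows, labels):
        self.clf.fit(feature_windows, labels)
        return self

    def predict_window(self, piezo, audio, accel):
        # Layer 1: skip the costlier classifier when the wearer is clearly
        # too active for an eating episode.
        if np.std(accel) > self.motion_thresh:
            return 0                               # not eating
        # Layer 2: fused-feature classification of the window.
        x = window_features(piezo, audio, accel).reshape(1, -1)
        return int(self.clf.predict(x)[0])

if __name__ == "__main__":
    # Synthetic data, shapes only; real input would come from the necklace.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 12))                 # 12 fused features per window
    y = rng.integers(0, 2, size=200)               # 1 = eating window
    det = CoarseMultilayerDetector().fit(X, y)
    piezo = rng.normal(size=WINDOW_S * FS["piezo"])
    audio = rng.normal(size=WINDOW_S * FS["audio"])
    accel = rng.normal(size=(WINDOW_S * FS["accel"], 3))
    print("eating?", det.predict_window(piezo, audio, accel))
```

The two-layer structure mirrors the "coarse multilayered" idea in the abstract only loosely: an inexpensive first stage filters out obviously non-eating windows so the fused-sensor classifier runs on fewer, more plausible candidates, which matters for real-time operation on a wearable.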