Recent studies have shown that the environment in which people eat can affect their nutritional behavior [1]. In this paper, we provide automatic tools for personalized analysis of a person's health habits through the examination of daily recorded egocentric photo-streams. Specifically, we propose a new automatic approach for the classification of food-related environments that is able to distinguish up to 15 such scenes. In this way, people can monitor the context around their food intake and gain an objective insight into their daily eating routine. We propose a model that classifies food-related scenes organized in a semantic hierarchy. Additionally, we present and make available a new egocentric dataset composed of more than 33,000 images recorded by a wearable camera, on which our proposed model has been tested. Our approach obtains an accuracy and F-score of 56% and 65%, respectively, clearly outperforming the baseline methods.
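
To illustrate the general idea of hierarchical scene classification described in the abstract, the sketch below shows a two-level pipeline: a top-level classifier predicts a coarse meta-category, and a per-category classifier then assigns one of 15 fine-grained food-related scene labels, evaluated with accuracy and macro F-score. This is not the authors' implementation; the meta-category and scene names, the random stand-in image features, and the scikit-learn logistic-regression classifiers are assumptions made purely for demonstration.

```python
# Minimal sketch of two-level hierarchical scene classification.
# NOT the paper's model: hierarchy, scene names, features, and classifiers are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, f1_score

rng = np.random.default_rng(0)

# Hypothetical semantic hierarchy: meta-category -> fine-grained food-related scenes (15 total).
HIERARCHY = {
    "eating":      ["restaurant", "kitchen_table", "picnic", "bar", "cafeteria"],
    "preparation": ["kitchen", "food_court", "pantry", "barbecue", "bakery_kitchen"],
    "acquisition": ["supermarket", "market_outdoor", "bakery_shop", "butcher_shop", "ice_cream_parlor"],
}
SCENES = [s for scenes in HIERARCHY.values() for s in scenes]
META_OF = {s: m for m, scenes in HIERARCHY.items() for s in scenes}

# Stand-in for image features (e.g. CNN embeddings of egocentric frames): 512-d random vectors.
n, d = 3000, 512
X = rng.normal(size=(n, d))
y_scene = rng.choice(SCENES, size=n)
y_meta = np.array([META_OF[s] for s in y_scene])

split = int(0.8 * n)
Xtr, Xte = X[:split], X[split:]
ytr_s, yte_s, ytr_m = y_scene[:split], y_scene[split:], y_meta[:split]

# Level 1: predict the meta-category of each image.
top = LogisticRegression(max_iter=500).fit(Xtr, ytr_m)

# Level 2: one fine-grained scene classifier per meta-category.
fine = {
    m: LogisticRegression(max_iter=500).fit(Xtr[ytr_m == m], ytr_s[ytr_m == m])
    for m in HIERARCHY
}

# Inference: route each test image through its predicted meta-category.
pred_meta = top.predict(Xte)
pred_scene = np.array([fine[m].predict(x.reshape(1, -1))[0]
                       for m, x in zip(pred_meta, Xte)])

print("accuracy:", accuracy_score(yte_s, pred_scene))
print("macro F1:", f1_score(yte_s, pred_scene, average="macro"))
```

In such a cascade, the top-level decision restricts the label space of the second stage, which is one common way to exploit a semantic hierarchy; the paper's actual architecture and training details are given in the full text.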

Source: http://dx.doi.org/10.1109/JBHI.2019.2922390
