Technology-enhanced methods of dietary assessment may still face the common limitations of self-report. This study aimed to identify foods and beverages omitted when both a 24 h recall and a smartphone app were used to assess dietary intake, compared with wearable camera images. For three consecutive days, young adults (18-30 years) wore an Autographer camera that took point-of-view images every 30 seconds. Over the same period, participants reported their diet in the app and completed daily 24 h recalls. Camera images were reviewed for foods and beverages, then matched to the items reported in the 24 h recall and the app. ANOVA (with post hoc analysis using the Tukey Honest Significant Difference test) and paired t-tests were conducted. Discretionary snacks were frequently omitted by both methods (p < 0.001). Water was omitted more frequently in the app than in the camera images (p < 0.001) and the 24 h recall (p < 0.001). Dairy and alternatives (p = 0.001), sugar-based products (p = 0.007), savoury sauces and condiments (p < 0.001), fats and oils (p < 0.001) and alcohol (p = 0.002) were more frequently omitted in the app than in the 24 h recall. Traditional self-report methods of assessing diet remain problematic even with the addition of technology, and finding new objective methods that are unobtrusive and of low burden to participants remains a challenge.
Full text:
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8228902
DOI: http://dx.doi.org/10.3390/nu13061806
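The statistical comparisons described in the abstract above (one-way ANOVA with Tukey HSD post hoc tests and paired t-tests) can be sketched in Python with SciPy. This is not the study's code, and the omission counts below are made-up illustration values, not the study's data:

```python
# Hedged sketch: one-way ANOVA with a Tukey HSD post hoc test and a
# paired t-test, applied to hypothetical omission counts for one food
# group across the three assessment methods.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical daily omission counts per participant for each method
camera = rng.poisson(1.0, size=30)   # wearable camera (reference)
recall = rng.poisson(1.5, size=30)   # 24 h recall
app = rng.poisson(2.5, size=30)      # smartphone app

# One-way ANOVA across the three assessment methods
f_stat, p_anova = stats.f_oneway(camera, recall, app)

# Tukey Honest Significant Difference post hoc test (SciPy >= 1.8)
tukey = stats.tukey_hsd(camera, recall, app)

# Paired t-test: app vs. 24 h recall for the same participants
t_stat, p_paired = stats.ttest_rel(app, recall)

print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.4f}")
print(f"App vs recall (paired t): t = {t_stat:.2f}, p = {p_paired:.4f}")
```

The paired t-test is appropriate here because the same participants contribute counts under each method; the ANOVA and Tukey HSD compare all three methods at once while controlling the family-wise error rate.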
Healthc Technol Lett
December 2024
Center for Medical Image Analysis & Navigation, Department of Biomedical Engineering, University of Basel, Basel, Switzerland.
The emergence of augmented reality (AR) in surgical procedures could significantly enhance accuracy and outcomes, particularly in the complex field of orthognathic surgery. This study compares the effectiveness and accuracy of traditional drilling guides with two AR-based navigation techniques: one utilizing ArUco markers and the other employing small-workspace infrared tracking cameras for a drilling task. Additionally, an alternative AR visualization paradigm for surgical navigation is proposed that eliminates the potential inaccuracies of image detection using headset cameras.
Heliyon
July 2024
D-Eye Srl, Padova, 35131, Italy.
Widespread screening is crucial for the early diagnosis and treatment of glaucoma, the leading cause of visual impairment and blindness. The development of portable technologies, such as smartphone-based ophthalmoscopes able to image the optic nerve head, represents a resource for large-scale glaucoma screening. These devices consist of an optical attachment mounted on a common smartphone, making the overall device cheap and easy to use.
Biomed Opt Express
January 2025
Department of Biomedical Engineering, University of Illinois Chicago, Chicago, IL 60607, USA.
The choroid, a critical vascular layer beneath the retina, is essential for maintaining retinal function and monitoring chorioretinal disorders. Existing imaging methods, such as indocyanine green angiography (ICGA) and optical coherence tomography (OCT), face significant limitations, including contrast agent requirements, restricted field of view (FOV), and high costs, limiting accessibility. To address these challenges, we developed a nonmydriatic, contrast agent-free fundus camera utilizing transcranial near-infrared (NIR) illumination.
Biomed Opt Express
January 2025
Biophotonics@Tyndall, IPIC, Tyndall National Institute, Lee Maltings, Dyke Parade, Cork, Ireland.
Cardiovascular imaging with camera-on-tip endoscopes has the potential to provide physiologically relevant data on the tissue state and device placement that can improve clinical outcomes. In this work, we review the unmet clinical need for image-based cardiovascular diagnostics and guidance for minimally invasive procedures. We present a 7 Fr camera-on-tip endoscope with fibre-coupled multispectral illumination that includes methods for imaging in a blood-filled field of view (FOV).
When observing chip-to-free-space light beams emitted from silicon photonics (SiPh) into free space, manual adjustment of the camera lens is often required to obtain a focused image of the beams. In this Letter, we demonstrate an auto-focusing system based on the you-only-look-once (YOLO) model. The trained YOLO model exhibits a high classification accuracy of 99.