Intelligent Visual Acuity Estimation System With Hand Motion Recognition

Visual acuity (VA) measurement is utilized to test a subject's acuteness of vision. Conventional VA measurement requires a physician's assistance to ask a subject to speak out or wave a hand in response to the direction of an optotype. To avoid this repetitive testing procedure, different types of automatic VA tests have been developed in recent years by adopting contact-based responses, such as pushing buttons or keyboards on a device. However, contact-based testing is not as intuitive as speaking or waving hands, and it may distract the subjects from concentrating on the VA test. Moreover, problems related to hygiene may arise if all the subjects operate on the same testing device. To overcome these problems, we propose an intelligent VA estimation (iVAE) system for automatic VA measurements that assists the subject to respond in an intuitive, noncontact manner. VA estimation algorithms using maximum likelihood (VAML) are developed to automatically estimate the subject's vision by compromising between a prespecified logistic function and a machine-learning technique. The neural-network model adapts human learning behavior to consider the accuracy of recognizing the optotype as well as the reaction time of the subject. Furthermore, a velocity-based hand motion recognition algorithm is adopted to classify hand motion data, collected by a sensing device, into one of the four optotype directions. Realistic experiments show that the proposed iVAE system outperforms the conventional line-by-line testing method as it is approximately ten times faster in testing trials while achieving a logarithm of the minimum angle of resolution error of less than 0.2. We believe that our proposed system provides a method for accurate and fast noncontact automatic VA testing.
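The estimation ideas in the abstract can be illustrated with a short sketch. This is not the authors' implementation: the logistic form, the slope value, the grid-search range, and the dominant-axis velocity rule below are all assumptions, chosen only to show the shape of a maximum-likelihood threshold fit with a four-alternative guessing floor and a velocity-based classification into the four optotype directions.

```python
import math

GUESS = 0.25  # chance level for a four-direction optotype task

def p_correct(logmar, threshold, slope):
    # Logistic psychometric function with a guessing floor: the probability
    # of a correct response rises from 25% (pure guessing) toward 100%
    # as the optotype size (logMAR) grows past the subject's threshold.
    return GUESS + (1.0 - GUESS) / (1.0 + math.exp(-slope * (logmar - threshold)))

def fit_threshold(trials, slope=10.0):
    # Grid-search maximum-likelihood estimate of the logMAR threshold.
    # trials: list of (logmar, correct) pairs, correct in {0, 1}.
    best_t, best_ll = 0.0, -math.inf
    for step in range(-30, 101):          # candidate thresholds -0.30 .. 1.00
        t = step / 100.0
        ll = sum(
            math.log(p_correct(lm, t, slope) if ok else 1.0 - p_correct(lm, t, slope))
            for lm, ok in trials
        )
        if ll > best_ll:
            best_t, best_ll = t, ll
    return best_t

def classify_direction(vx, vy):
    # Map a hand-velocity vector to one of the four optotype directions
    # by its dominant axis (a stand-in for the velocity-based recognizer).
    if abs(vx) >= abs(vy):
        return "right" if vx > 0 else "left"
    return "up" if vy > 0 else "down"
```

Feeding the fitter a handful of (logMAR, correct/incorrect) trials places the estimated threshold between the largest missed and smallest seen optotype sizes; the described system additionally weighs reaction time via a neural-network model, which this sketch omits.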

Source
http://dx.doi.org/10.1109/TCYB.2020.2969520

Publication Analysis

Top Keywords

hand motion: 12
visual acuity: 8
motion recognition: 8
ivae system: 8
testing: 6
intelligent visual: 4
acuity estimation: 4
system: 4
estimation system: 4
hand: 4
