AI Article Synopsis

  • Indoor navigation poses significant challenges for visually impaired individuals, who cannot rely on visual cues for orientation and wayfinding.
  • A new iOS app provides accessible turn-by-turn directions using real-time computer vision running on a smartphone.
  • The app combines visual-inertial odometry and map data with improved algorithms for sign detection and distance estimation, allowing users to navigate indoor environments more effectively (a simplified sketch of this fusion follows the list).
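
The bullets above describe fusing relative motion from visual-inertial odometry (VIO) with absolute cues from recognized signs and the building map. The Swift sketch below is a hypothetical illustration of that idea only, not the paper's algorithm: the planar 2-D state, the Pose and Localizer names, and the fixed blend weight are assumptions made here for clarity.

```swift
import Foundation

// Hypothetical illustration of dead reckoning corrected by absolute fixes.
// The 2-D planar state, the Pose/Localizer names, and the simple weighted
// blend are assumptions for this sketch, not the paper's actual method.
struct Pose {
    var x: Double   // meters, in map coordinates
    var y: Double
}

struct Localizer {
    var estimate = Pose(x: 0, y: 0)

    // Accumulate relative motion reported by visual-inertial odometry.
    mutating func applyVIODelta(dx: Double, dy: Double) {
        estimate.x += dx
        estimate.y += dy
    }

    // Blend in an absolute fix, e.g. a position inferred from a recognized
    // sign whose location is known from the map; `trust` trades the fix
    // against the (drifting) dead-reckoning estimate.
    mutating func applySignFix(_ fix: Pose, trust: Double = 0.5) {
        estimate.x = (1 - trust) * estimate.x + trust * fix.x
        estimate.y = (1 - trust) * estimate.y + trust * fix.y
    }
}

// Example: walk forward 3 m by VIO, then correct with a sign-based fix.
var localizer = Localizer()
localizer.applyVIODelta(dx: 3.0, dy: 0.0)
localizer.applySignFix(Pose(x: 3.2, y: 0.1), trust: 0.6)
```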

Article Abstract

Indoor navigation is a major challenge for people with visual impairments, who often lack access to visual cues such as informational signs, landmarks and structural features that people with normal vision rely on for wayfinding. Building on our recent work on a computer vision-based localization approach that runs in real time on a smartphone, we describe an accessible wayfinding iOS app we have created that provides turn-by-turn directions to a desired destination. The localization approach combines dead reckoning obtained using visual-inertial odometry (VIO) with information about the user's location in the environment from informational sign detections and map constraints. We explain how we estimate the user's distance from Exit signs appearing in the image, describe new improvements in the sign detection and range estimation algorithms, and outline our algorithm for determining appropriate turn-by-turn directions.
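
The abstract mentions estimating the user's distance from Exit signs detected in the camera image. One standard way to do this, shown below as a hedged sketch rather than the paper's actual procedure, is the pinhole-camera relation range = f · H / h, where f is the focal length in pixels, H the physical sign height, and h the detected sign height in pixels; the type names, parameter names, and the 20 cm default sign height are assumptions.

```swift
import Foundation

// Hypothetical range estimation from a detected Exit sign using the pinhole
// camera model: range = focalLength * physicalHeight / imageHeight.
// The SignDetection type, parameter names, and default sign height are
// assumptions for this sketch, not values taken from the paper.
struct SignDetection {
    let boundingBoxHeightPixels: Double   // height of the detected sign in the image
}

func estimateRangeMeters(to detection: SignDetection,
                         focalLengthPixels: Double,
                         signHeightMeters: Double = 0.20) -> Double {
    // Similar triangles: the sign's image height shrinks inversely with range.
    return focalLengthPixels * signHeightMeters / detection.boundingBoxHeightPixels
}

// Example: a 1500-pixel focal length and a 60-pixel-tall detection give 5 m.
let detection = SignDetection(boundingBoxHeightPixels: 60)
let range = estimateRangeMeters(to: detection, focalLengthPixels: 1500)
print("Estimated range: \(range) m")   // 5.0 m
```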

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7703403
DOI: http://dx.doi.org/10.1007/978-3-030-58796-3_56

Publication Analysis

Top Keywords

indoor navigation (8)
localization approach (8)
turn-by-turn directions (8)
navigation app (4)
app computer (4)
computer vision (4)
vision sign (4)
sign recognition (4)
recognition indoor (4)
navigation major (4)
