Decoding human big data has great potential to reveal the complex patterns of human dynamics such as physiological and biomechanical signals. In this study, we take special interest in brain visual dynamics, e.g., eye movement signals, and investigate how to leverage eye-signal decoding to provide a voice-free communication possibility for ALS patients who lose the ability to control their muscles. Due to the substantial complexity of visual dynamics, we propose a deep learning framework to decode the visual dynamics when the user performs eye-writing tasks. Further, to enable real-time inference of the eye signals, we design and develop a mobile edge computing platform, called UbiEi-Edge, which can wirelessly receive the eye signals via Bluetooth Low Energy, execute the deep learning algorithm, and visualize the decoding results. This real-world implementation, developed on an Android phone, aims to provide real-time data streaming and automatic, real-time decoding of brain visual dynamics, thereby enabling a new paradigm for ALS patients to communicate with the external world. Our experiments have demonstrated the feasibility and effectiveness of the proposed mobile edge computing prototype. By innovatively bridging AI, edge computing, and mobile health, this study will greatly advance brain-dynamics-decoding-empowered human-centered computing and smart health big data applications.
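To make the described pipeline concrete, the sketch below shows one way an Android edge device could stream eye signals over Bluetooth Low Energy and run an on-device classifier, as the abstract outlines. This is a minimal illustration under stated assumptions, not the authors' implementation: the class names (`EyeWriteDecoder`, `EyeSignalGattCallback`), the window size, the two-channel packet format, and the use of a TensorFlow Lite model are all hypothetical.

```kotlin
// Hypothetical sketch of the BLE-to-decoder pipeline described in the abstract.
// Names, tensor shapes, and the packet format are illustrative assumptions.
import android.bluetooth.BluetoothGatt
import android.bluetooth.BluetoothGattCallback
import android.bluetooth.BluetoothGattCharacteristic
import org.tensorflow.lite.Interpreter
import java.nio.ByteBuffer
import java.nio.ByteOrder

class EyeWriteDecoder(private val interpreter: Interpreter) {
    // Assumed fixed-length window of 2-channel eye-signal samples fed to the model.
    private val windowSize = 256
    private val channels = 2
    private val buffer = ArrayDeque<FloatArray>()

    // Called for every BLE notification carrying one sample per channel.
    fun onSample(sample: FloatArray) {
        buffer.addLast(sample)
        if (buffer.size >= windowSize) {
            val logits = decode(buffer.toList())
            buffer.clear()
            // Hand logits to the UI layer to visualize the decoded character.
        }
    }

    private fun decode(window: List<FloatArray>): FloatArray {
        // Pack the window into the [1, windowSize, channels] float tensor
        // layout the (assumed) TFLite model expects.
        val input = ByteBuffer.allocateDirect(4 * windowSize * channels)
            .order(ByteOrder.nativeOrder())
        window.forEach { s -> s.forEach { input.putFloat(it) } }
        val output = Array(1) { FloatArray(NUM_CLASSES) }
        interpreter.run(input, output)
        return output[0]
    }

    companion object { const val NUM_CLASSES = 26 } // assumed: one class per eye-written letter
}

// BLE callback delivering raw characteristic payloads to the decoder.
class EyeSignalGattCallback(private val decoder: EyeWriteDecoder) : BluetoothGattCallback() {
    override fun onCharacteristicChanged(
        gatt: BluetoothGatt,
        characteristic: BluetoothGattCharacteristic
    ) {
        val bytes = characteristic.value ?: return
        // Assumed packet format: two little-endian int16 samples per notification.
        if (bytes.size >= 4) {
            val bb = ByteBuffer.wrap(bytes).order(ByteOrder.LITTLE_ENDIAN)
            decoder.onSample(floatArrayOf(bb.short.toFloat(), bb.short.toFloat()))
        }
    }
}
```

In this sketch, inference runs entirely on the phone, consistent with the edge-computing goal of avoiding cloud round-trips during real-time communication.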
DOI: http://dx.doi.org/10.1109/EMBC46164.2021.9629820