Bioinspired multisensory neural network with crossmodal integration and recognition.

Nat Commun

NanoSpin, Department of Applied Physics, Aalto University School of Science, P.O. Box 15100, FI-00076, Aalto, Finland.

Published: February 2021

The integration and interaction of vision, touch, hearing, smell, and taste in the human multisensory neural network facilitate high-level cognitive functionalities, such as crossmodal integration, recognition, and imagination, enabling accurate evaluation and comprehensive understanding of the multimodal world. Here, we report a bioinspired multisensory neural network that integrates artificial optic, afferent, and auditory nerves with simulated olfactory and gustatory sensory nerves. With distributed multiple sensors and biomimetic hierarchical architectures, our system can not only sense, process, and memorize multimodal information, but also fuse multisensory data at both the hardware and software levels. Using crossmodal learning, the system can crossmodally recognize and imagine multimodal information, such as visualizing alphabet letters upon handwritten input, recognizing combined visual/smell/taste information, or imagining a never-seen picture when hearing its description. Our multisensory neural network provides a promising approach towards robotic sensing and perception.
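The crossmodal "imagination" described above, recalling a pattern in one modality from a cue in another, can be illustrated in software with a simple Hebbian crossmodal associative memory. The sketch below is purely illustrative and is not the authors' implementation; the pattern dimensions, the bipolar encoding, and the outer-product learning rule are all assumptions chosen for a minimal self-contained demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature vectors for two modalities (e.g. a picture and
# its spoken description), stored as random bipolar (+1/-1) patterns.
n_pairs, dim_visual, dim_audio = 5, 64, 48
visual = rng.choice([-1.0, 1.0], size=(n_pairs, dim_visual))
audio = rng.choice([-1.0, 1.0], size=(n_pairs, dim_audio))

# Crossmodal weights via the Hebbian outer-product rule: W maps an
# auditory cue onto its paired visual pattern.
W = visual.T @ audio / n_pairs

def imagine(audio_cue):
    """Recall ("imagine") the visual pattern associated with an auditory cue."""
    return np.sign(W @ audio_cue)

# Even a noisy auditory cue retrieves the paired visual memory.
cue = audio[2].copy()
flipped = rng.choice(dim_audio, size=5, replace=False)
cue[flipped] *= -1.0  # corrupt ~10% of the cue
recalled = imagine(cue)
accuracy = np.mean(recalled == visual[2])
print(f"recall accuracy: {accuracy:.2f}")
```

With only a handful of stored pairs relative to the pattern dimension, the crossterm interference stays small, so the noisy cue still recovers its paired visual pattern almost perfectly; this is the associative-memory behavior, not a claim about the paper's hardware.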

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7893014
DOI: http://dx.doi.org/10.1038/s41467-021-21404-z
