To realize the full potential of wireless edge artificial intelligence (AI), very large and diverse datasets will often be required for energy-demanding model training on resource-constrained edge devices. This paper proposes lead federated neuromorphic learning (LFNL), a decentralized, energy-efficient, brain-inspired computing technique based on spiking neural networks. The proposed technique enables edge devices to exploit a brain-like biophysiological structure to collaboratively train a global model while helping to preserve privacy. Experimental results show that, under uneven dataset distributions among edge devices, LFNL achieves recognition accuracy comparable to existing edge AI techniques while substantially reducing data traffic by >3.5× and computational latency by >2.0×. Furthermore, LFNL reduces energy consumption by >4.5× compared with standard federated learning, with only a slight accuracy loss of up to 1.5%. The proposed LFNL can therefore facilitate the development of brain-inspired computing and edge AI.
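
The core idea described in the abstract — several edge devices training local spiking neural networks on their own data while a lead device aggregates them into a shared global model — can be illustrated with the minimal sketch below. This is not the authors' LFNL algorithm: the leader-election procedure, the neuromorphic hardware mapping, and the paper's actual SNN training rule are not reproduced. The synthetic data, the Poisson rate encoding, the crude rate-coded delta-rule update, and all parameter values are assumptions made purely for illustration of federated averaging over a spiking classifier.

```python
"""
Minimal sketch: federated averaging over a tiny LIF spiking classifier.

Illustrative only -- the data, encoding, learning rule, and parameters are
assumptions, not the LFNL method described in the paper.
"""
import numpy as np

rng = np.random.default_rng(0)

N_CLIENTS, N_FEATURES, N_CLASSES = 4, 8, 2
T_STEPS, DT = 50, 1.0          # simulation steps per example, step size
V_TH, TAU = 1.0, 10.0          # LIF firing threshold and membrane time constant
ROUNDS, LOCAL_EPOCHS, LR = 10, 3, 0.05


def make_client_data(n, seed):
    """Synthetic, unevenly sized client datasets (placeholder for real edge data)."""
    rng_c = np.random.default_rng(seed)
    x = rng_c.random((n, N_FEATURES))
    # Label: does the first half of the features outweigh the second half?
    y = (x[:, : N_FEATURES // 2].sum(1) > x[:, N_FEATURES // 2:].sum(1)).astype(int)
    return x, y


def lif_spike_counts(w, x):
    """Poisson-encode one input and run it through a single LIF layer; return spike counts."""
    counts = np.zeros(N_CLASSES)
    v = np.zeros(N_CLASSES)
    for _ in range(T_STEPS):
        spikes_in = (rng.random(N_FEATURES) < x).astype(float)  # rate coding
        v += DT / TAU * (-v) + w @ spikes_in                    # leaky integration
        fired = v >= V_TH
        counts += fired
        v[fired] = 0.0                                           # reset after spike
    return counts


def local_train(w, x, y):
    """Crude rate-coded delta rule: nudge the correct class's firing rate upward."""
    w = w.copy()
    for _ in range(LOCAL_EPOCHS):
        for xi, yi in zip(x, y):
            rates = lif_spike_counts(w, xi) / T_STEPS
            target = np.eye(N_CLASSES)[yi]
            w += LR * np.outer(target - rates, xi)
    return w


def fed_avg(weights, sizes):
    """Leader-side aggregation: dataset-size-weighted average of client models."""
    sizes = np.asarray(sizes, dtype=float)
    return np.tensordot(sizes / sizes.sum(), np.stack(weights), axes=1)


# Uneven data distribution across simulated edge devices.
clients = [make_client_data(n, seed=i) for i, n in enumerate([40, 80, 20, 60])]
global_w = rng.normal(0.0, 0.1, size=(N_CLASSES, N_FEATURES))

for r in range(ROUNDS):
    local_ws = [local_train(global_w, x, y) for x, y in clients]
    global_w = fed_avg(local_ws, [len(y) for _, y in clients])
    acc = np.mean([
        np.mean([np.argmax(lif_spike_counts(global_w, xi)) == yi for xi, yi in zip(x, y)])
        for x, y in clients
    ])
    print(f"round {r + 1:2d}  mean client accuracy {acc:.2f}")
```

In this sketch, raw data never leaves a client; only the small weight matrices are shared with the aggregating device, which mirrors the privacy and data-traffic argument made in the abstract.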

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9314401
DOI: http://dx.doi.org/10.1038/s41467-022-32020-w
