Multi-sensor information fusion detection system for fire robot through back propagation neural network.

PLoS One

School of Electronic and Electrical Engineering, University of Leeds, Leeds, United Kingdom.

Published: September 2020

Objective: To reduce the danger to firefighters and ensure their safety as far as possible, a multi-sensor information fusion detection system for fire robots based on the back propagation neural network (BPNN) is investigated.

Method: Drawing on previous studies, the information sources and information-processing methods used in the design are first explained. The basic structure and flowchart of the system are then laid out. Based on these, the BPNN is selected for feature-layer fusion and fuzzy control for decision-layer fusion. The multi-sensor information fusion detection system first collects sensor readings, processes them, and sends them to the robot's processor. The processor analyzes the received signals and transmits the resulting information to the control terminal over a wireless communication link.
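The feature-layer fusion step described above can be illustrated with a minimal sketch. This is not the paper's implementation: the sensor channels (temperature, smoke density, CO level), the network size, and the training data are all hypothetical, chosen only to show a small BPNN fusing several sensor inputs into one fire score.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyBPNN:
    """Toy back propagation network fusing 3 sensor features into 1 output."""

    def __init__(self, n_in=3, n_hidden=7, n_out=1):
        self.W1 = rng.normal(0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def forward(self, X):
        self.h = sigmoid(X @ self.W1 + self.b1)   # hidden (fused) features
        self.y = sigmoid(self.h @ self.W2 + self.b2)
        return self.y

    def train_step(self, X, t, lr=0.5):
        y = self.forward(X)
        # Backpropagate the squared error through both layers.
        d_out = (y - t) * y * (1 - y)
        d_hid = (d_out @ self.W2.T) * self.h * (1 - self.h)
        self.W2 -= lr * self.h.T @ d_out
        self.b2 -= lr * d_out.sum(axis=0)
        self.W1 -= lr * X.T @ d_hid
        self.b1 -= lr * d_hid.sum(axis=0)
        return float(np.mean((y - t) ** 2))

# Hypothetical normalized readings [temperature, smoke, CO]; label 1 = fire.
X = np.array([[0.9, 0.8, 0.7], [0.8, 0.9, 0.9],
              [0.1, 0.2, 0.1], [0.2, 0.1, 0.0]])
t = np.array([[1.0], [1.0], [0.0], [0.0]])

net = TinyBPNN()
for _ in range(2000):
    mse = net.train_step(X, t)
```

After training, the network's output can feed a decision stage (in the paper, fuzzy control) that maps the fused score to an action.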

Results: The tests show that the optimal training result is obtained when the BPNN has 7 hidden-layer nodes. On this basis, the BPNN is tested: after 127 iterations its error reaches the target minimum, indicating that the BPNN achieves an excellent level of accuracy. The trained BPNN has a running time of 0.0276 s and a mean square error of 0.0013. A smaller mean square error means higher accuracy, so the BPNN meets the high-precision requirements of this study.
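The hidden-node selection behind these results can be sketched as a simple sweep: train the same small network with different hidden-layer sizes and keep the one with the lowest mean square error. Everything here is hypothetical (synthetic data, a toy training loop), so which node count wins depends on the data and random seed; the point is only the selection procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_mse(n_hidden, X, t, epochs=3000, lr=2.0):
    """Train a tiny BPNN and return its final training MSE."""
    n = len(X)
    W1 = rng.normal(0, 0.5, (X.shape[1], n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.5, (n_hidden, 1));          b2 = np.zeros(1)
    for _ in range(epochs):
        h = sigmoid(X @ W1 + b1)
        y = sigmoid(h @ W2 + b2)
        d2 = (y - t) * y * (1 - y)
        d1 = (d2 @ W2.T) * h * (1 - h)
        # Mean gradients over the batch keep the step size stable.
        W2 -= lr * h.T @ d2 / n;  b2 -= lr * d2.sum(0) / n
        W1 -= lr * X.T @ d1 / n;  b1 -= lr * d1.sum(0) / n
    return float(np.mean((y - t) ** 2))

# Synthetic 3-sensor data; label 1 when combined readings exceed a threshold.
X = rng.random((40, 3))
t = (X.sum(axis=1, keepdims=True) > 1.5).astype(float)

# Sweep candidate hidden-layer sizes and pick the lowest-MSE configuration.
results = {n: train_mse(n, X, t) for n in (3, 5, 7, 9)}
best = min(results, key=results.get)
```

In the paper's experiments this kind of comparison favored 7 hidden nodes; on other data the optimum will differ.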

Conclusion: This research on a multi-sensor information fusion detection system for fire robots can provide theoretical support for forest-fire detection research in China. Since the proposed BPNN-based robot is applied to the inspection and handling of residual forest fires, the results are applicable to forests in many countries, giving the approach a wide range of applications.

Download full-text PDF

Source
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7380588 (PMC)
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0236482 (PLOS)

Publication Analysis

Top Keywords

multi-sensor fusion (16), fusion detection (16), detection system (16), bpnn (9), system fire (8), propagation neural (8), neural network (8), selected fuse (8), layers study (8), square error (8)

Similar Publications

Application of multi-sensor fusion localization algorithm based on recurrent neural networks.

Sci Rep

March 2025

School of Electrical Engineering and Automation, Anhui University, Hefei, 230601, Anhui, China.

With the rapid advancements in artificial intelligence (AI), 5G technology, and robotics, multi-sensor fusion technologies have emerged as a critical solution for achieving high-precision localization in mobile robots operating within dynamic and unstructured environments. This study proposes a novel hybrid fusion framework that combines the Extended Kalman Filter (EKF) and Recurrent Neural Network (RNN) to address challenges such as sensor frequency asynchrony, drift accumulation, and measurement noise. The EKF provides real-time statistical estimation for initial data fusion, while the RNN effectively models temporal dependencies, further reducing errors and enhancing data accuracy.
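The EKF half of such a hybrid can be illustrated with a minimal sketch. This is not that paper's code: it is a hypothetical scalar Kalman filter (the linear special case of an EKF) smoothing noisy position measurements, to show the predict/update cycle that provides the initial fusion before an RNN models temporal dependencies.

```python
import numpy as np

def kalman_1d(zs, q=1e-3, r=0.04, x0=0.0, p0=1.0):
    """Scalar Kalman filter over a sequence of measurements zs.

    q: process-noise variance, r: measurement-noise variance (assumed values).
    """
    x, p = x0, p0
    estimates = []
    for z in zs:
        p += q                 # predict: process noise inflates uncertainty
        k = p / (p + r)        # Kalman gain: weight measurement vs. prior
        x += k * (z - x)       # update: correct estimate toward measurement
        p *= 1.0 - k           # posterior uncertainty shrinks
        estimates.append(x)
    return np.array(estimates)

# A robot holding position 1.0, observed through a noisy sensor (std 0.2).
rng = np.random.default_rng(0)
zs = 1.0 + rng.normal(0.0, 0.2, 200)
est = kalman_1d(zs)
```

The filtered estimates fluctuate far less than the raw measurements; in the hybrid framework, this smoothed sequence would then be fed to the RNN.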

View Article and Find Full Text PDF

Human Activity Recognition (HAR) finds extensive application across diverse domains. Yet, its integration into healthcare remains challenging due to disparities between prevailing HAR systems optimized for rudimentary actions in controlled settings and the nuanced behaviors and dynamic conditions pertinent to medical diagnostics. Furthermore, prevailing sensor technologies and deployment scenarios present formidable hurdles regarding wearability and adaptability to heterogeneous environments.

View Article and Find Full Text PDF

Hand gestures are a natural form of human communication, making gesture recognition a sensible approach for intuitive human-computer interaction. Wearable sensors on the forearm can be used to detect the muscle contractions that generate these gestures, but classification approaches relying on a single measured modality lack accuracy and robustness. In this work, we analyze sensor fusion of force myography (FMG) and electromyography (EMG) for gesture recognition.

View Article and Find Full Text PDF

Leg workout-based monitoring provides valuable insights into physical and neurological health, supporting healthcare professionals and facilitating in-depth analysis. However, current single-sensing-modality technologies are limited by size constraints, environmental sensitivity, and accuracy issues. Furthermore, despite the widespread use of deep learning (DL) methods for sensor-based gesture recognition, they still encounter challenges in feature extraction.

View Article and Find Full Text PDF
