Tactile technologies that can identify human body features are valuable in clinical diagnosis and human-machine interaction. Cutting-edge tactile platforms have previously been able to identify structured non-living objects; however, identifying human body features remains challenging, mainly because of their irregular contours and the heterogeneous spatial distribution of softness. Here, freestanding and scalable tactile platforms of force-softness bimodal sensor arrays are developed, enabling tactile gloves to identify body features using machine-learning methods. The bimodal sensors are engineered by adding a protrusion on a piezoresistive pressure sensor, so that the resistance signals encode both the applied pressure and the softness of the sample. This simple design allows 112 bimodal sensors to be integrated into a thin, conformal, and stretchable tactile glove, which digitalizes the tactile information while hand skills are performed on the human body. The tactile glove achieves high accuracy (98%) in identifying four body features of a real person and four organ models (healthy and pathological) inside an abdominal simulator, demonstrating the body-feature identification capability of the bimodal tactile platforms and their potential use in future healthcare and robotics.
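The abstract does not specify which machine-learning model or preprocessing the authors used. The sketch below is only a minimal illustration, under that caveat, of how 112-channel glove readings could feed a generic supervised classifier for eight identification targets (four body features plus four organ-model states); all names, data, and the choice of classifier are placeholders, not the paper's method.

```python
# Hypothetical sketch: classifying palpation targets from a 112-channel
# bimodal tactile glove reading. The actual model, features, and
# preprocessing used in the paper are not specified here.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

N_SENSORS = 112   # bimodal force-softness sensors integrated in the glove
N_CLASSES = 8     # assumed: 4 body features + 4 organ-model states

# Placeholder data: each sample is one glove reading, i.e. a vector of
# 112 resistance values (real data would come from the sensor array).
rng = np.random.default_rng(0)
X = rng.normal(size=(800, N_SENSORS))
y = rng.integers(0, N_CLASSES, size=800)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)

# A generic multiclass classifier stands in for whatever model the
# authors trained on their tactile dataset.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```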
DOI: http://dx.doi.org/10.1002/adma.202207016