Deep sparse transfer learning for remote smart tongue diagnosis.

Math Biosci Eng

The School of Software Technology, Dalian University of Technology, Dalian 116620, China.

Published: January 2021

People are exploring new ideas based on artificial intelligence infrastructures for immediate processing, where the main obstacles to widely deploying deep methods are the large size of neural networks and the lack of training data. To meet the high-computing and low-latency requirements of remote smart tongue diagnosis with edge computing, an efficient and compact deep neural network design is necessary, while the scarcity of clinical data makes it difficult to model the intrinsic diagnosis patterns. To address this challenge, a deep transfer learning model for effective tongue diagnosis is proposed, based on a similar-sparse domain adaptation (SSDA) scheme. Concretely, a similar-data transfer strategy is introduced to transfer the necessary knowledge efficiently, overcoming the insufficiency of clinical tongue images. Then, to obtain a simplified structure, the network is pruned while preserving transferability during domain adaptation. Finally, a compact model combining two sparse networks is designed to fit resource-limited edge devices. Extensive experiments are conducted on a real clinical dataset. The proposed model uses fewer training samples and parameters to produce competitive results with lower power and memory consumption, making it possible to run smart tongue diagnosis widely on low-performance infrastructures.
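The abstract does not give the details of the SSDA scheme, but the general pipeline it describes (transfer a pretrained network, prune it for sparsity, then adapt it on scarce clinical data) can be sketched as follows. This is a minimal illustration in PyTorch, not the authors' method: the backbone, pruning ratio, `NUM_CLASSES`, and the `loader` of clinical tongue images are all assumptions made for the example.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune
from torchvision import models

# Start from an ImageNet-pretrained backbone as the source of transferable features
# (the backbone choice is an assumption; the abstract does not name one).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Replace the classifier head for the target tongue-diagnosis label set.
NUM_CLASSES = 5  # placeholder; the actual label count is not given in the abstract
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

# Global magnitude pruning of convolutional weights yields a sparse,
# edge-friendly network while keeping the pretrained structure intact.
conv_params = [(m, "weight") for m in model.modules() if isinstance(m, nn.Conv2d)]
prune.global_unstructured(conv_params, pruning_method=prune.L1Unstructured, amount=0.7)

# Fine-tune the sparse network on the small clinical dataset (the adaptation step).
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

def finetune_epoch(loader):
    model.train()
    for images, labels in loader:  # loader yields (tongue image batch, label batch)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

# After fine-tuning, make the pruning permanent so the zeroed weights can be
# stored and deployed sparsely on a low-performance edge device.
for module, name in conv_params:
    prune.remove(module, name)
```

In this sketch the sparsity is imposed by simple L1 magnitude pruning; the paper's SSDA scheme additionally selects similar source data and preserves transferability during pruning, which this example does not attempt to reproduce.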

DOI: http://dx.doi.org/10.3934/mbe.2021063

Publication Analysis

Top Keywords: tongue diagnosis (16), smart tongue (12), transfer learning (8), remote smart (8), neural network (8), training data (8), domain adaptation (8), tongue (5), diagnosis (5), deep (4)
