Multi-Task Learning Based on Stochastic Configuration Networks.

Front Bioeng Biotechnol

Collaborative Innovation Center of Statistical Data Engineering, Technology & Application, School of Statistics and Mathematics, Zhejiang Gongshang University, Hangzhou, China.

Published: August 2022

When the human brain learns multiple related or sequential tasks, it shares and transfers knowledge among them, which enables fast and effective task learning. This idea motivates multi-task learning, whose key challenge is to identify the correlations between tasks and to build a fast and effective model from that relationship information. This paper proposes a multi-task learning framework based on stochastic configuration networks. It organically combines the classical parameter-sharing idea of multi-task learning with the constraint-sharing configuration mechanism of stochastic configuration networks, and it provides an efficient multi-kernel function selection mechanism. The convergence of the proposed algorithm is proved theoretically. Experiments on one simulated data set and four real-life data sets verify the effectiveness of the proposed algorithm.
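The combination of parameter sharing and constraint-sharing configuration can be illustrated with a minimal sketch. The code below is not the authors' implementation: it assumes a simplified form of the SCN supervisory inequality, a sigmoid hidden node instead of the paper's multi-kernel selection, and hypothetical names (`multitask_scn`, the contraction factor `r`, the candidate pool size). The idea it shows is that hidden-node parameters are drawn at random, accepted only if the inequality holds for every task, and task-specific output weights are then solved by least squares.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def multitask_scn(tasks, max_nodes=50, r=0.99, n_candidates=100, seed=0):
    """Hypothetical sketch of a parameter-sharing multi-task SCN.

    tasks : list of (X_t, Y_t) pairs, X_t of shape (n_t, d), Y_t of shape (n_t, m_t).
    Hidden-node parameters (w, b) are shared by all tasks; a random candidate is
    kept only if a simplified SCN-style supervisory inequality holds for every
    task. Output weights beta_t stay task-specific and are solved by least squares.
    """
    rng = np.random.default_rng(seed)
    d = tasks[0][0].shape[1]
    W, B = [], []                                        # shared hidden parameters
    H = [np.empty((X.shape[0], 0)) for X, _ in tasks]    # hidden outputs per task
    E = [Y.astype(float).copy() for _, Y in tasks]       # current residuals per task
    betas = [None] * len(tasks)

    for _ in range(max_nodes):
        best, best_score = None, -np.inf
        for _ in range(n_candidates):
            w = rng.uniform(-1.0, 1.0, d)
            b = rng.uniform(-1.0, 1.0)
            admissible, score = True, 0.0
            for (X, _), e in zip(tasks, E):
                h = sigmoid(X @ w + b)
                # simplified supervisory constraint, summed over output dimensions
                xi = np.sum((e.T @ h) ** 2) / (h @ h) - (1.0 - r) * np.sum(e ** 2)
                if xi <= 0.0:
                    admissible = False
                    break
                score += xi
            if admissible and score > best_score:
                best, best_score = (w, b), score
        if best is None:                                  # no admissible candidate: stop early
            break
        w, b = best
        W.append(w)
        B.append(b)
        for t, (X, Y) in enumerate(tasks):
            H[t] = np.column_stack([H[t], sigmoid(X @ w + b)])
            betas[t], *_ = np.linalg.lstsq(H[t], Y, rcond=None)
            E[t] = Y - H[t] @ betas[t]
    return np.array(W), np.array(B), betas
```

Under these assumptions, predictions for task t would be obtained as `sigmoid(X @ W.T + B) @ betas[t]`; in the paper's framework the fixed sigmoid would instead be chosen by the multi-kernel function selection mechanism, which this sketch omits.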


Source
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9386079 (PMC)
http://dx.doi.org/10.3389/fbioe.2022.890132 (DOI)

Publication Analysis

Top Keywords

multi-task learning: 24
stochastic configuration: 16
configuration networks: 12
based stochastic: 8
fast effective: 8
networks organically: 8
organically combines: 8
combines idea: 8
idea classical: 8
classical parameter: 8

