Stochastic configuration network ensembles with selective base models.

Neural Networks

The State Key Laboratory of Synthetical Automation for Process Industries, Northeastern University, Liaoning, China; Department of Computer Science and Information Technology, La Trobe University, Melbourne, Australia.

Published: May 2021

Studies have demonstrated that stochastic configuration networks (SCNs) have good potential for rapid data modeling because of their theoretically guaranteed learning power. Empirical studies have verified that the learner models produced by SCNs usually achieve favorable test performance in practice, but a more in-depth theoretical analysis of their generalization power would be useful for constructing SCN-based ensemble models with enhanced generalization capacity. In particular, given a collection of independently developed SCN-based learner models, it is preferable to select the base learners that are likely to generalize well and average only those, rather than simply averaging all of the base models, when building an effective ensemble. In this study, we propose a novel framework for building SCN ensembles by exploring key factors that potentially affect the generalization performance of the base models. Under a mild assumption, we provide a comprehensive theoretical framework for examining a learner model's generalization error and formulate a novel indicator, combining measurements of the training errors, the output weights, and the hidden-layer output matrix, which our proposed algorithm uses to select a subset of appropriate base models from a pool of randomized learner models. A toy example of one-dimensional function approximation, a case study on forecasting student learning performance, and two large-scale data sets were used in our experiments. The experimental results indicate that our proposed method has remarkable advantages for building ensemble models.
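The abstract does not give the indicator's exact formula, but the overall selective-ensemble idea can be sketched. The Python fragment below is a hypothetical illustration: the base learners are plain random-feature networks (a simplified stand-in for true stochastic configuration networks), and the selection score is an assumed combination of training error, output-weight norm, and the conditioning of the hidden-layer output matrix, not the paper's actual indicator.

```python
# Hedged sketch of a selective randomized-network ensemble.
# The base learner and the selection_score below are illustrative assumptions,
# not the formulation published in the paper.
import numpy as np

def train_base_model(X, y, hidden_units=50, rng=None):
    """Fit one randomized base learner: random hidden weights, least-squares output weights."""
    rng = np.random.default_rng(rng)
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], hidden_units))
    b = rng.uniform(-1.0, 1.0, size=hidden_units)
    H = np.tanh(X @ W + b)                        # hidden-layer output matrix
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # output weights
    return {"W": W, "b": b, "beta": beta, "H": H}

def selection_score(model, y):
    """Illustrative indicator (smaller is better): training error, weight norm, conditioning."""
    H, beta = model["H"], model["beta"]
    train_err = np.linalg.norm(H @ beta - y)   # training error term
    weight_norm = np.linalg.norm(beta)         # output-weight magnitude
    cond_H = np.linalg.cond(H)                 # conditioning of the hidden output matrix
    return train_err + 1e-3 * weight_norm + 1e-6 * cond_H

def selective_ensemble_predict(X_train, y_train, X_test, n_models=20, n_select=5):
    """Train a pool of base models, keep the best-scoring subset, average their predictions."""
    pool = [train_base_model(X_train, y_train, rng=i) for i in range(n_models)]
    chosen = sorted(pool, key=lambda m: selection_score(m, y_train))[:n_select]
    preds = [np.tanh(X_test @ m["W"] + m["b"]) @ m["beta"] for m in chosen]
    return np.mean(preds, axis=0)

# Toy usage: one-dimensional function approximation, echoing the paper's toy example.
X = np.linspace(0, 1, 200).reshape(-1, 1)
y = np.sin(4 * np.pi * X).ravel() + 0.05 * np.random.randn(200)
y_hat = selective_ensemble_predict(X, y, X)
```

The key design point mirrored here is that only a scored subset of the independently trained base models contributes to the final average, rather than the whole pool.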

DOI: http://dx.doi.org/10.1016/j.neunet.2021.01.011
