Mixture of Experts with Entropic Regularization for Data Classification.

Entropy (Basel)

Department of Computer Sciences, Pontifical Catholic University of Chile, Santiago 7820436, Chile.

Published: February 2019

Today, there is growing interest in automatic classification across a variety of tasks, such as weather forecasting, product recommendations, intrusion detection, and people recognition. "Mixture-of-experts" is a well-known classification technique; it is a probabilistic model consisting of local expert classifiers weighted by a gate network, typically based on softmax functions, whose combination allows the model to learn complex patterns in the data. In this scheme, one data point is influenced by only one expert; as a result, the training process can be misguided on real datasets in which complex data need to be explained by multiple experts. In this work, we propose a variant of the regular mixture-of-experts model. In the proposed model, the classification cost is penalized by the Shannon entropy of the gating network in order to avoid a "winner-takes-all" output from the gating network. Experiments on several real datasets show the advantage of our approach, with improvements in mean accuracy of 3-6% on some of them. In future work, we plan to embed feature selection into this model.
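Below is a minimal sketch of the idea described in the abstract: the standard mixture-of-experts negative log-likelihood augmented with a Shannon-entropy term on the gate output, so that minimizing the objective rewards spreading responsibility across experts rather than a winner-takes-all gate. This is not the authors' code; the softmax-classifier experts, the parametrization, and the regularization weight `lam` are illustrative assumptions.

```python
# Sketch (not the authors' implementation): softmax-gated mixture of experts
# whose objective subtracts the Shannon entropy of the gate distribution,
# discouraging a "winner-takes-all" gate. All shapes/names are illustrative.
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def moe_entropy_objective(X, y, W_experts, W_gate, lam=0.1):
    """Negative log-likelihood of a mixture of softmax experts, regularized
    by the entropy of the gating network.

    X: (n, d) inputs; y: (n,) integer labels in {0..C-1}
    W_experts: (K, d, C) one softmax classifier per expert
    W_gate: (d, K) gating network weights
    lam: entropy regularization strength (illustrative value)
    """
    n = X.shape[0]
    gate = softmax(X @ W_gate)                                      # (n, K) per-point expert weights
    expert_probs = softmax(np.einsum('nd,kdc->nkc', X, W_experts))  # (n, K, C) per-expert class probs
    p_y_given_expert = expert_probs[np.arange(n), :, y]             # (n, K) prob of the true label
    # Mixture likelihood: sum_k g_k(x) * p_k(y | x)
    mixture = (gate * p_y_given_expert).sum(axis=1)                 # (n,)
    nll = -np.log(mixture + 1e-12).mean()
    # Shannon entropy of the gate; subtracting it rewards distributing
    # responsibility across several experts instead of a single winner
    gate_entropy = -(gate * np.log(gate + 1e-12)).sum(axis=1).mean()
    return nll - lam * gate_entropy
```

Under this sign convention, minimizing the objective with any gradient-based optimizer simultaneously lowers the classification cost and raises the entropy of the gating distribution, which is the regularization effect the abstract describes.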

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7514672
DOI: http://dx.doi.org/10.3390/e21020190
