Performance-guaranteed regularization in maximum likelihood method: Gauge symmetry in Kullback-Leibler divergence.

Phys Rev E

Department of Applied Mathematics, Faculty of Science, Fukuoka University, 8-19-1, Nanakuma, Jonan-ku, Fukuoka City 814-0180, Japan.

Published: October 2023

The maximum likelihood method is the best-known method for estimating the probabilities behind data. However, the conventional method yields the probability model closest to the empirical distribution and therefore overfits. Regularization methods prevent the model from fitting the wrong probabilities too closely, but little is known systematically about their performance. The idea of regularization resembles that of error-correcting codes, which achieve optimal decoding by mixing suboptimal solutions with an incorrectly received code; this optimal decoding is achieved based on gauge symmetry. We propose a regularization of the maximum likelihood method with a theoretical performance guarantee, obtained by focusing on a gauge symmetry in the Kullback-Leibler divergence. Our approach yields the optimal model without searching over the hyperparameters that frequently appear in regularization.
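The overfitting described above can be illustrated with a minimal categorical example: the unregularized maximum likelihood estimate is simply the empirical frequency, which assigns zero probability to any outcome absent from the sample and hence infinite Kullback-Leibler divergence from the true distribution, while a regularizer keeps every probability strictly positive. The sketch below is illustrative only and does not implement the paper's gauge-symmetry method; it uses additive (Laplace) smoothing as a stand-in regularizer, and its strength `alpha` is exactly the kind of hyperparameter the paper's approach aims to eliminate.

```python
import math
from collections import Counter

def mle(samples, k):
    """Maximum likelihood estimate for a categorical distribution over
    outcomes 0..k-1: the empirical frequency of each outcome."""
    counts = Counter(samples)
    n = len(samples)
    return [counts[i] / n for i in range(k)]

def smoothed(samples, k, alpha=1.0):
    """Additive (Laplace) smoothing: a simple stand-in regularizer that
    keeps every probability strictly positive.  `alpha` is a
    hyperparameter that must normally be tuned."""
    counts = Counter(samples)
    n = len(samples)
    return [(counts[i] + alpha) / (n + alpha * k) for i in range(k)]

def kl(p, q):
    """Kullback-Leibler divergence D(p || q); infinite when q assigns
    zero probability to an outcome that p does not."""
    total = 0.0
    for pi, qi in zip(p, q):
        if pi > 0:
            if qi == 0:
                return math.inf
            total += pi * math.log(pi / qi)
    return total

# True distribution over 3 outcomes; a small sample happens to miss outcome 2.
p_true = [0.5, 0.4, 0.1]
samples = [0, 0, 1, 0, 1, 0, 0, 1]   # outcome 2 never observed

p_mle = mle(samples, 3)      # assigns probability 0 to outcome 2
p_reg = smoothed(samples, 3) # strictly positive everywhere

print(kl(p_true, p_mle))  # inf: the MLE overfits the empirical distribution
print(kl(p_true, p_reg))  # finite
```

Because the MLE is exactly the model closest to the empirical distribution, any unseen outcome makes its KL divergence from the truth diverge; the smoothed estimate trades a small bias for a finite divergence, at the cost of having to choose `alpha`.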


Source: http://dx.doi.org/10.1103/PhysRevE.108.044134

Publication Analysis

Top Keywords

maximum likelihood (12)
likelihood method (12)
gauge symmetry (12)
regularization maximum (8)
symmetry kullback-leibler (8)
kullback-leibler divergence (8)
error-correcting codes (8)
optimal decoding (8)
method (5)
performance-guaranteed regularization (4)
