Growing hierarchical self-organizing models are characterized by the flexibility of their structure, which can easily accommodate complex input datasets. However, most proposals use the Euclidean distance as the only error measure. Here we propose a way to introduce Bregman divergences in these models, based on stochastic approximation principles, so that more general distortion measures can be employed. A procedure is derived to compare the performance of networks using different divergences. Moreover, a probabilistic interpretation of the model is provided, which enables its use as a Bayesian classifier. Experimental results are presented for classification and data visualization applications, which show the advantages of these divergences over the classical Euclidean distance.
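As a concrete illustration (not the paper's implementation), a Bregman divergence is generated by any strictly convex, differentiable function F as D_F(x, y) = F(x) - F(y) - ⟨∇F(y), x - y⟩; the squared Euclidean distance and the generalized Kullback-Leibler divergence are both special cases. A minimal sketch, assuming NumPy:

```python
import numpy as np

def bregman_divergence(F, grad_F, x, y):
    """Bregman divergence D_F(x, y) = F(x) - F(y) - <grad F(y), x - y>."""
    return F(x) - F(y) - np.dot(grad_F(y), x - y)

# Squared Euclidean distance arises from the generator F(x) = ||x||^2.
F_sq = lambda x: np.dot(x, x)
grad_sq = lambda x: 2.0 * x

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.0])
d = bregman_divergence(F_sq, grad_sq, x, y)   # equals ||x - y||^2 = 8

# Negative Shannon entropy as generator yields the generalized KL divergence,
# which reduces to KL proper when both arguments lie on the simplex.
F_kl = lambda p: np.sum(p * np.log(p))
grad_kl = lambda p: np.log(p) + 1.0

p = np.array([0.2, 0.8])
q = np.array([0.5, 0.5])
d_kl = bregman_divergence(F_kl, grad_kl, p, q)
```

Swapping the generator thus changes the distortion measure without touching the rest of the training loop, which is the flexibility the model exploits.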
DOI: http://dx.doi.org/10.1142/S0129065714500166
Entropy (Basel)
December 2024
Sony Computer Science Laboratories Inc., Tokyo 141-0022, Japan.
We present a generalization of Bregman divergences in finite-dimensional symplectic vector spaces that we term symplectic Bregman divergences. Symplectic Bregman divergences are derived from a symplectic generalization of the Fenchel-Young inequality which relies on the notion of symplectic subdifferentials. The symplectic Fenchel-Young inequality is obtained using the symplectic Fenchel transform which is defined with respect to the symplectic form.
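The starting point being generalized here is the classical Fenchel-Young inequality, F(x) + F*(y) ≥ ⟨x, y⟩, with equality iff y = ∇F(x). A numerical check of the classical (non-symplectic) case only, using the self-conjugate generator F(x) = ||x||²/2 as an illustrative assumption:

```python
import numpy as np

# Classical Fenchel-Young inequality: F(x) + F*(y) >= <x, y>, with equality
# iff y = grad F(x). The symplectic variant in the paper replaces the pairing
# with a symplectic form; this sketch checks only the classical case.
# For F(x) = ||x||^2 / 2 the convex conjugate is F*(y) = ||y||^2 / 2.
F = lambda x: 0.5 * np.dot(x, x)
F_star = lambda y: 0.5 * np.dot(y, y)

rng = np.random.default_rng(0)
x, y = rng.normal(size=3), rng.normal(size=3)

gap = F(x) + F_star(y) - np.dot(x, y)    # always >= 0 (here ||x - y||^2 / 2)
tight = F(x) + F_star(x) - np.dot(x, x)  # y = grad F(x) = x gives equality
```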
Entropy (Basel)
November 2024
Sony Computer Science Laboratories, Tokyo 141-0022, Japan.
The symmetric Kullback-Leibler centroid, also called the Jeffreys centroid, of a set of mutually absolutely continuous probability distributions on a measure space provides a notion of centrality which has proven useful in many tasks, including information retrieval, information fusion, and clustering. However, the Jeffreys centroid is not available in closed form for sets of categorical or multivariate normal distributions, two widely used statistical models, and thus needs to be approximated numerically in practice. In this paper, we first propose the new Jeffreys-Fisher-Rao center, defined as the Fisher-Rao midpoint of the sided Kullback-Leibler centroids, as a plug-in replacement for the Jeffreys centroid.
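For categorical distributions the two sided KL centroids admit well-known closed forms (the arithmetic mean and the normalized geometric mean), and the Fisher-Rao geodesic midpoint can be computed via the standard square-root embedding of the simplex onto the sphere. A sketch under those standard facts; the function names are illustrative, not code from the paper:

```python
import numpy as np

def sided_kl_centroids(P):
    """Closed-form sided KL centroids of categorical distributions (rows of P)."""
    left = P.mean(axis=0)                 # argmin_c sum_i KL(p_i || c): arithmetic mean
    g = np.exp(np.log(P).mean(axis=0))    # coordinate-wise geometric mean...
    right = g / g.sum()                   # ...normalized: argmin_c sum_i KL(c || p_i)
    return left, right

def fisher_rao_midpoint(p, q):
    """Midpoint of the Fisher-Rao geodesic on the simplex, via the square-root
    embedding p -> sqrt(p) onto the positive orthant of the unit sphere."""
    a, b = np.sqrt(p), np.sqrt(q)
    m = a + b
    m /= np.linalg.norm(m)                # spherical midpoint of unit vectors a, b
    return m ** 2

P = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.5, 0.3]])
l, r = sided_kl_centroids(P)
jfr = fisher_rao_midpoint(l, r)           # Fisher-Rao midpoint of the sided centroids
```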
Entropy (Basel)
July 2024
Department of Mathematics, University of Florida, Gainesville, FL 32611, USA.
Methods used in topological data analysis naturally capture higher-order interactions in point cloud data embedded in a metric space. This methodology was recently extended to data living in an information space, by which we mean a space measured with an information-theoretic distance. One such setting is a finite collection of discrete probability distributions embedded in the probability simplex, measured with the relative entropy (Kullback-Leibler divergence).
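To make the setting concrete: such a point cloud is just a matrix whose rows lie in the simplex, with dissimilarities given by the asymmetric KL divergence. An illustrative sketch of the dissimilarity matrix a Rips-style filtration could be built on (not the paper's pipeline):

```python
import numpy as np

def kl(p, q):
    """Relative entropy KL(p || q) between discrete distributions."""
    return float(np.sum(p * np.log(p / q)))

# A finite point cloud in the probability simplex.
points = np.array([[0.7, 0.2, 0.1],
                   [0.1, 0.8, 0.1],
                   [0.3, 0.3, 0.4]])

# Pairwise dissimilarity matrix: zero on the diagonal, but not symmetric,
# since KL is a divergence rather than a metric.
n = len(points)
D = np.array([[kl(points[i], points[j]) for j in range(n)] for i in range(n)])
```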
Stats (Basel)
June 2024
Department of Mathematics and Statistics, Saint Louis University, St. Louis, MO 63103, USA.
Change-point detection (CPD) is a challenging problem with applications across many real-world domains. Its primary objective is to identify specific time points at which the underlying system transitions between different states, each characterized by a distinct data distribution. Precise identification of change points in time-series omics data can provide insight into the dynamic and temporal characteristics inherent to complex biological systems.
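As a toy illustration of the single change-point case (not the methods studied in the paper), a mean shift can be located by choosing the split that minimizes the summed within-segment squared error:

```python
import numpy as np

def detect_change_point(x):
    """Single change-point estimate for a mean shift: choose the split k that
    minimizes the total within-segment squared error. Illustrative only."""
    n = len(x)
    best_k, best_cost = None, np.inf
    for k in range(2, n - 1):
        left, right = x[:k], x[k:]
        cost = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k

# Synthetic series with a mean shift at index 50.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 0.3, 50), rng.normal(2.0, 0.3, 50)])
k = detect_change_point(x)   # estimate lands near the true change point
```

Real CPD methods extend this idea to multiple change points and to distributional changes beyond a shift in the mean.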
Entropy (Basel)
February 2024
Sony Computer Science Laboratories, Tokyo 141-0022, Japan.
Exponential families are statistical models that serve as workhorses in statistics, information theory, and machine learning, among other fields. An exponential family can be normalized either subtractively, by its cumulant or free energy function, or equivalently divisively, by its partition function. Both the cumulant and the partition function are strictly convex, smooth functions, each inducing a corresponding pair of Bregman and Jensen divergences.
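For instance (an illustrative sketch, not code from the paper), for the Bernoulli family the cumulant of the natural parameter t = log(p/(1-p)) is F(t) = log(1 + e^t), and the Bregman divergence it induces recovers, with swapped arguments, the Kullback-Leibler divergence between the corresponding distributions:

```python
import numpy as np

# Bernoulli as an exponential family: natural parameter t = log(p / (1 - p)),
# cumulant (log-partition) F(t) = log(1 + e^t), with F'(t) = sigmoid(t) = p.
F = lambda t: np.log1p(np.exp(t))
dF = lambda t: 1.0 / (1.0 + np.exp(-t))

def bregman_F(t1, t2):
    """Bregman divergence induced by the cumulant F on natural parameters."""
    return F(t1) - F(t2) - dF(t2) * (t1 - t2)

def kl_bernoulli(p, q):
    return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

p, q = 0.3, 0.7
tp, tq = np.log(p / (1 - p)), np.log(q / (1 - q))

# Standard identity: KL(Ber(p) || Ber(q)) = B_F(t_q, t_p), i.e. the Bregman
# divergence of the cumulant with the natural parameters swapped.
d_kl = kl_bernoulli(p, q)
d_breg = bregman_F(tq, tp)   # these two agree
```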