Flexibly regularized mixture models and application to image segmentation.

Neural Networks

Department of Systems & Computational Biology, Albert Einstein College of Medicine, 1300 Morris Park Ave, Bronx, 10461, NY, USA; Dominick P. Purpura Department of Neuroscience, Albert Einstein College of Medicine, 1300 Morris Park Ave, Bronx, 10461, NY, USA; Department of Ophthalmology & Visual Sciences, Albert Einstein College of Medicine, 1300 Morris Park Ave, Bronx, 10461, NY, USA.

Published: May 2022

Probabilistic finite mixture models are widely used for unsupervised clustering. These models can often be improved by adapting them to the topology of the data. For instance, in order to classify spatially adjacent data points similarly, it is common to introduce a Laplacian constraint on the posterior probability that each data point belongs to a class. Alternatively, the mixing probabilities can be treated as free parameters, while assuming Gauss-Markov or more complex priors to regularize those mixing probabilities. However, these approaches are constrained by the shape of the prior and often lead to complicated or intractable inference. Here, we propose a new parametrization of the Dirichlet distribution to flexibly regularize the mixing probabilities of over-parametrized mixture distributions. Using the Expectation-Maximization algorithm, we show that our approach allows us to define any linear update rule for the mixing probabilities, including spatial smoothing regularization as a special case. We then show that this flexible design can be extended to share class information between multiple mixture models. We apply our algorithm to artificial and natural image segmentation tasks, and we provide a quantitative and qualitative comparison of the performance of Gaussian and Student-t mixtures on the Berkeley Segmentation Dataset. We also demonstrate how to propagate class information across the layers of deep convolutional neural networks in a probabilistically optimal way, suggesting a new interpretation for feedback signals in biological visual systems. Our flexible approach can be easily generalized to adapt probabilistic mixture models to arbitrary data topologies.
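The abstract's core idea, an over-parametrized mixture whose per-point mixing probabilities are updated by a linear (here, spatially smoothing) rule inside EM, can be illustrated with a minimal sketch. This is not the authors' implementation: the toy 1-D "image", the Gaussian components, and the three-tap smoothing kernel are all illustrative assumptions standing in for the paper's Dirichlet-parametrized update.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "image": two spatial halves drawn from different Gaussians.
x = np.concatenate([rng.normal(-2.0, 1.0, 100), rng.normal(2.0, 1.0, 100)])
N, K = x.size, 2

# Over-parametrized mixture: one simplex of mixing probabilities per point.
pi = np.full((N, K), 1.0 / K)
mu = np.array([-1.0, 1.0])
sigma = np.array([1.0, 1.0])

def gauss(x, mu, sigma):
    """Component likelihoods, shape (N, K)."""
    z = (x[:, None] - mu) / sigma
    return np.exp(-0.5 * z**2) / (sigma * np.sqrt(2.0 * np.pi))

for _ in range(50):
    # E-step: posterior responsibilities.
    w = pi * gauss(x, mu, sigma)
    r = w / w.sum(axis=1, keepdims=True)

    # M-step: component parameters.
    nk = r.sum(axis=0)
    mu = (r * x[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

    # Linear update of the mixing probabilities: a moving average of the
    # responsibilities over each point's spatial neighbours -- the spatial
    # smoothing special case of a general linear update rule.
    kernel = np.array([0.25, 0.5, 0.25])  # illustrative smoothing weights
    pi = np.stack(
        [np.convolve(r[:, k], kernel, mode="same") for k in range(K)], axis=1
    )
    pi /= pi.sum(axis=1, keepdims=True)

labels = r.argmax(axis=1)  # per-point segmentation
```

Because the mixing probabilities are smoothed rather than pooled globally, isolated outliers inside a region tend to inherit the label of their neighbours, which is the behaviour the spatial regularization is meant to produce.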


Source:
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8944213
DOI: http://dx.doi.org/10.1016/j.neunet.2022.02.010

Publication Analysis

Top Keywords: mixture models (16); mixing probabilities (16); image segmentation (8); regularize mixing (8); mixture (5); models (5); flexibly regularized (4); regularized mixture (4); models application (4); application image (4)

