Despite the successful use of Gaussian-binary restricted Boltzmann machines (GB-RBMs) and Gaussian-binary deep belief networks (GB-DBNs), little is known about their theoretical capability to approximate distributions of continuous random variables. In this paper, we study the expressive properties of GB-RBMs and GB-DBNs, contributing theoretical insights into the number of hidden variables they require. We first write the GB-RBM's unnormalized log-likelihood as the sum of a special two-layer feedforward neural network and a negative quadratic term. We then establish a series of simulation results that relate GB-RBMs to general two-layer feedforward neural networks, whose expressive properties are much better understood. On this basis, we show that a two-layer ReLU network whose second-layer weights are all equal to 1, combined with a negative quadratic term, can approximate any continuous function. In addition, we provide lower bounds on the number of hidden variables a GB-RBM needs in order to approximate distributions whose log-likelihoods belong to certain classes of smooth functions. Moreover, we study the universal approximation property of GB-DBNs with two hidden layers by giving a number of hidden variables, O(ɛ), that suffices to approximate any strictly positive continuous distribution within a given error ɛ. Finally, numerical experiments are carried out to verify some of the proposed theoretical results.
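The first step of the argument can be made concrete with a short sketch (not the paper's code). Under one common GB-RBM parameterization, with energy E(v, h) = ||v - b||²/(2σ²) - c·h - (v/σ²)·(Wh), marginalizing out the binary hidden units gives log p*(v) = -||v - b||²/(2σ²) + Σⱼ softplus(cⱼ + wⱼ·v/σ²), i.e. a two-layer feedforward network with softplus hidden units and all second-layer weights equal to 1, plus a negative quadratic term. The parameter names (W, b, c, sigma) and this energy convention are assumptions for illustration; other parameterizations differ only in scaling.

```python
# Minimal sketch: the GB-RBM unnormalized log-likelihood equals a negative
# quadratic term plus a two-layer softplus network with unit output weights.
# Energy convention E(v,h) = ||v-b||^2/(2 sigma^2) - c.h - (v/sigma^2).(W h)
# is an assumption; it is not taken from the paper's code.
import itertools
import numpy as np

rng = np.random.default_rng(0)
d, m = 3, 4                      # visible / hidden dimensions (small enough to enumerate h)
W = rng.normal(size=(d, m))      # visible-hidden weights
b = rng.normal(size=d)           # visible biases
c = rng.normal(size=m)           # hidden biases
sigma = 1.0                      # shared visible standard deviation

def log_p_star_bruteforce(v):
    """log sum_h exp(-E(v, h)) by enumerating all 2^m hidden configurations."""
    vals = []
    for h in itertools.product([0, 1], repeat=m):
        h = np.array(h, dtype=float)
        E = np.sum((v - b) ** 2) / (2 * sigma ** 2) - c @ h - (v / sigma ** 2) @ (W @ h)
        vals.append(-E)
    vmax = max(vals)
    return vmax + np.log(np.sum(np.exp(np.array(vals) - vmax)))  # stable log-sum-exp

def log_p_star_decomposed(v):
    """Negative quadratic term + two-layer softplus network with all output weights = 1."""
    quadratic = -np.sum((v - b) ** 2) / (2 * sigma ** 2)
    pre_activations = c + W.T @ v / sigma ** 2             # hidden-layer pre-activations
    network = np.sum(np.logaddexp(0.0, pre_activations))   # softplus units, summed with weight 1
    return quadratic + network

v = rng.normal(size=d)
print(log_p_star_bruteforce(v))   # the two values agree up to floating-point error
print(log_p_star_decomposed(v))
```

Since softplus(x) = log(1 + eˣ) is a smooth surrogate of ReLU(x) = max(0, x), this decomposition is what allows approximation results for two-layer ReLU networks with unit second-layer weights, combined with a negative quadratic term, to be transferred back to GB-RBMs.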

Source: http://dx.doi.org/10.1016/j.neunet.2022.05.020
