A restricted Boltzmann machine is a fully connected shallow neural network that can be used to solve many challenging optimization problems. Boltzmann machines are usually treated as probability models, whose parameters are normally estimated with nondeterministic algorithms. The Hopfield network, also known as the Ising model, is a special case of a Boltzmann machine in which the hidden layer coincides with the visible layer: the weights and biases from the visible layer to the hidden layer are the same as those from the hidden layer to the visible layer. When the Hopfield network is treated as a probabilistic model, everything is stochastic (i.e., random) and nondeterministic, and an optimization problem amounts to searching for samples with higher probabilities under a probability density function. This paper proposes treating the Hopfield network as a deterministic model, in which nothing is random and no stochastic distribution is used. An optimization problem associated with the Hopfield network then has a deterministic objective function (also known as a loss function or cost function), namely the energy function itself. The purpose of the objective function is to drive the Hopfield network toward a state of lower energy. This study suggests that deterministic optimization algorithms can be used for the associated optimization problems. The deterministic algorithm has the same mathematical form as the computation of a perceptron: a dot product, a bias, and a nonlinear activation function. This paper uses examples of searching for stable states to demonstrate that the deterministic optimization method can achieve a faster convergence rate and smaller errors.
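The deterministic view described above can be illustrated with a minimal sketch. The following is not the paper's implementation but a standard Hopfield setup under its stated assumptions: each unit is updated as a perceptron (dot product, bias, sign activation), and each update never increases the energy E(s) = -1/2 s^T W s - b^T s. The Hebbian weights, the stored pattern, and the starting state below are illustrative choices.

```python
import numpy as np

# Illustrative example: symmetric Hopfield weights storing one bipolar
# pattern via the Hebbian rule (an assumption, not the paper's setup).
pattern = np.array([1, -1, 1, -1, 1], dtype=float)
W = np.outer(pattern, pattern)
np.fill_diagonal(W, 0.0)          # no self-connections
b = np.zeros(5)                   # zero biases for simplicity

def energy(s):
    # Deterministic objective: E(s) = -1/2 s^T W s - b^T s
    return -0.5 * s @ W @ s - b @ s

def update(s):
    # Each unit is a perceptron: dot product + bias + sign activation.
    # Asynchronous sign updates with symmetric W and zero diagonal
    # never increase the energy.
    s = s.copy()
    for i in range(len(s)):
        s[i] = 1.0 if W[i] @ s + b[i] >= 0 else -1.0
    return s

s = np.array([1, 1, 1, -1, -1], dtype=float)   # noisy starting state
for _ in range(5):
    s_next = update(s)
    assert energy(s_next) <= energy(s)          # energy is monotonically non-increasing
    s = s_next

print(s)  # recovers the stored pattern [1, -1, 1, -1, 1]
```

Because the update rule is deterministic, the trajectory of states is fully reproducible: no sampling distribution is involved, and convergence to a stable (locally minimal-energy) state follows from the monotone energy descent.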
Full text: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11634054 (PMC)
DOI: http://dx.doi.org/10.47852/bonviewjcce42022789
Hopfield neural networks (HNNs) promise broad applications in areas such as combinatorial optimization, memory storage, and pattern recognition. Among various implementations, optical HNNs are particularly interesting because they can exploit fast optical matrix-vector multiplications. Yet studies of optical HNNs have so far been mostly theoretical, and the effects of optical imperfections and robustness against memory errors remain to be quantified.
Cogn Neurodyn
December 2024
Department of Electronics and Communication Engineering, Vemu Institute of Technology, Chittoor, India.
This contribution analyzes the dynamics of a homogeneous network of five inertial Hopfield-type neurons with a unidirectional ring coupling topology. The coupling is achieved by perturbing each neuron's amplitude with a signal proportional to that of the previous neuron. The system consists of ten coupled ODEs, and the investigations carried out highlight several unusual and rarely reported dynamics, hence the importance of emphasizing them.
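A setup of this kind can be sketched as follows. This is not the paper's exact model; it assumes a generic second-order (inertial) neuron with tanh self-feedback, so each of the five neurons contributes two first-order ODEs (ten in total), and the ring coupling adds a term proportional to the previous neuron's amplitude. All parameter values are illustrative assumptions.

```python
import numpy as np

N = 5
a, w, k = 0.5, 1.8, 0.1   # damping, self-feedback gain, ring coupling strength (assumed)

def rhs(state):
    # State holds five amplitudes x and five velocities v: ten coupled ODEs.
    x, v = state[:N], state[N:]
    xprev = np.roll(x, 1)                  # unidirectional ring: neuron i-1 drives neuron i
    dx = v
    dv = -a * v - x + w * np.tanh(x) + k * xprev
    return np.concatenate([dx, dv])

def rk4_step(state, h):
    # Fixed-step fourth-order Runge-Kutta integration.
    k1 = rhs(state)
    k2 = rhs(state + 0.5 * h * k1)
    k3 = rhs(state + 0.5 * h * k2)
    k4 = rhs(state + h * k3)
    return state + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

state = 0.1 * np.arange(10, dtype=float)   # asymmetric initial condition
for _ in range(1000):
    state = rk4_step(state, 0.01)
```

Varying the initial condition and the coupling strength k in a sketch like this is how coexisting attractors of the kind the abstract mentions are typically searched for numerically.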
Neural Netw
December 2024
Wang Zheng School of Microelectronics, Changzhou University, Changzhou, 213159, PR China. Electronic address:
Memristors are commonly used as the connecting parts of neurons in brain-like neural networks. Unlike in the existing literature, the memristors here can function as both self-connection and interconnection synaptic weights, enabling the generation of intricate, initials-regulated plane-coexistence behaviors. To demonstrate this dynamical effect, a Hopfield neural network with two memristor-interconnected neurons (TMIN-HNN) is proposed.
Proc Natl Acad Sci U S A
December 2024
State Key Laboratory of Surface Physics and Institute for Nanoelectronic Devices and Quantum Computing, Fudan University, Shanghai 200433, China.
Physical neural networks (PNNs), which use physical materials and devices to mimic synapses and neurons, offer an energy-efficient way to implement artificial neural networks. Yet training PNNs is difficult and relies heavily on external computing resources. An emerging concept to address this issue, called physical self-learning, uses intrinsic physical parameters as trainable weights.
J Comput Cogn Eng
November 2024
Department of Computer Science, Utah Valley University, USA.