This article introduces a neural approximation-based method for solving continuous optimization problems with probabilistic constraints. After reformulating the probabilistic constraints in terms of a quantile function, a sample-based neural network model is used to approximate that quantile function. Statistical guarantees for the neural approximation are established through convergence and feasibility analyses. A simulated annealing-based algorithm is then adapted, using the neural approximation, to solve the probabilistically constrained programs. An interval predictor model (IPM) of wind power is investigated to validate the proposed method.
DOI: http://dx.doi.org/10.1109/TNNLS.2021.3102323
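The reformulation referenced above replaces a chance constraint of the form P[g(x, ξ) ≤ 0] ≥ 1 - ε with the equivalent quantile condition Q_{1-ε}(x) ≤ 0, and the map x → Q_{1-ε}(x) is then learned from samples. The following is only a minimal sketch of that sample-based approximation step, assuming a toy constraint g, an illustrative risk level eps, and scikit-learn's MLPRegressor standing in for the paper's network:

```python
# Minimal sketch (not the paper's implementation) of approximating the
# quantile function of a chance constraint with a sample-based neural network.
# The constraint g(x, xi) and the risk level eps are hypothetical examples.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
eps = 0.05                                  # allowed violation probability

def g(x, xi):
    """Example uncertain constraint g(x, xi) <= 0; xi is a random disturbance."""
    return x[..., 0] + 0.5 * x[..., 1] * xi - 1.0

# 1) Build sample-based regression targets: empirical (1 - eps)-quantiles of g.
X = rng.uniform(-2.0, 2.0, size=(500, 2))   # training decision points
xi = rng.normal(size=2000)                  # scenarios of the uncertainty
Q = np.quantile(g(X[:, None, :], xi[None, :]), 1.0 - eps, axis=1)

# 2) Fit a neural network x -> Q_{1-eps}(x); the chance constraint
#    P[g(x, xi) <= 0] >= 1 - eps is then replaced by net(x) <= 0.
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000).fit(X, Q)

x_test = np.array([[0.2, -0.5]])
print("approximate quantile at x_test:", net.predict(x_test)[0])
```

In the optimization stage, the trained surrogate condition net(x) ≤ 0 would be evaluated inside the (here omitted) simulated annealing loop in place of the original probabilistic constraint.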
J Theor Biol
December 2024
Institute of Evolution, Centre for Ecological Research, 1121 Budapest, Hungary; Center for the Conceptual Foundations of Science, Parmenides Foundation, 82343 Pöcking, Germany. Electronic address:
Building on the algorithmic equivalence between finite-population replicator dynamics and particle-filtering-based approximation of Bayesian inference, we design a computational model to demonstrate the emergence of Darwinian evolution over representational units when collectives of units are selected to infer statistics of high-dimensional combinatorial environments. The non-Darwinian starting point is two units undergoing a few cycles of noisy, selection-dependent information transmission, corresponding to a serial (one comparison per cycle), non-cumulative process without heredity. Selection for accurate Bayesian inference at the collective level induces an adaptive path to the emergence of Darwinian evolution within the collectives, capable of maintaining and iteratively improving upon complex combinatorial information.
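The equivalence the abstract builds on can be illustrated in a few lines: in a bootstrap particle filter the importance weights play the role of fitness, so each resampling step acts like one generation of selection in a finite population. A minimal sketch under assumed settings (the target statistic, population size, and noise levels are illustrative, not the authors' model):

```python
# Minimal illustration of resampling-as-selection: importance weights derived
# from a likelihood act as fitness, and resampling is one generation of
# finite-population selection. All numerical settings are illustrative.
import numpy as np

rng = np.random.default_rng(1)
target = 0.7                        # environmental statistic to be inferred
pop = rng.uniform(0, 1, size=100)   # population of representational units

for generation in range(50):
    pop = pop + rng.normal(0, 0.02, size=pop.size)      # noisy transmission ("mutation")
    fitness = np.exp(-((pop - target) ** 2) / 0.01)     # likelihood acting as fitness
    probs = fitness / fitness.sum()
    pop = rng.choice(pop, size=pop.size, p=probs)       # resampling = selection

print("population mean after selection:", pop.mean())
```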
Neural Netw
December 2024
School of Economics and Management, University of Chinese Academy of Sciences, Beijing, 100190, China; Research Center on Fictitious Economy and Data Science, Chinese Academy of Sciences, Beijing, 100190, China; Key Laboratory of Big Data Mining and Knowledge Management, Chinese Academy of Sciences, Beijing, 100190, China. Electronic address:
Optimal transport (OT) is an effective tool for measuring discrepancies between probability distributions and histograms of features. To reduce its high computational complexity, entropy-regularized OT has been proposed; it is computed with the Sinkhorn algorithm and can be readily integrated into neural networks. However, each time the network parameters are updated, both the value and the derivative of the OT cost must be recomputed.
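For context, entropy-regularized OT between histograms a and b with ground-cost matrix C is obtained by alternating Sinkhorn scaling updates on the Gibbs kernel K = exp(-C/ε). A minimal NumPy sketch with an illustrative cost matrix and regularization strength (not tied to the paper's setup):

```python
# Minimal Sinkhorn iteration for entropy-regularized OT between two histograms.
# The cost matrix, regularization eps, and iteration count are illustrative.
import numpy as np

def sinkhorn(a, b, C, eps=0.05, n_iters=200):
    """Entropy-regularized OT cost between histograms a and b with cost matrix C."""
    K = np.exp(-C / eps)                 # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):             # alternating scaling updates
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]      # transport plan
    return np.sum(P * C)                 # transport cost under the regularized plan

# Usage: two small histograms on a 1-D grid with squared-distance cost.
x = np.linspace(0, 1, 50)
a = np.exp(-(x - 0.3) ** 2 / 0.01); a /= a.sum()
b = np.exp(-(x - 0.7) ** 2 / 0.01); b /= b.sum()
C = (x[:, None] - x[None, :]) ** 2
print(sinkhorn(a, b, C))
```

Inside a training loop, this inner iteration would have to be rerun (and differentiated through) after every parameter update, which is the overhead the abstract refers to.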
Neuroimage
December 2024
Department of Psychiatry and the Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Harvard Medical School, Charlestown, MA, 02129, USA; Department of Psychology, Tufts University, Medford, MA, 02155, USA. Electronic address:
During language comprehension, the larger neural response to unexpected versus expected inputs is often taken as evidence for predictive coding, a specific computational architecture and optimization algorithm proposed to approximate probabilistic inference in the brain. However, other predictive processing frameworks can also account for this effect, leaving the unique claims of predictive coding untested. In this study, we used MEG to examine both univariate and multivariate neural activity in response to expected and unexpected inputs during word-by-word reading comprehension.
JMIR Dermatol
December 2024
K.E.M. Hospital, Mumbai, India.
Background: Thus far, considerable research has focused on classifying a lesion as benign or malignant. However, rapid estimation of a lesion's depth is also needed for accurate clinical staging, since a malignant lesion can quickly grow beneath the skin.
Sci Adv
December 2024
Gatsby Computational Neuroscience Unit, University College London, 25 Howland St, London W1T 4JG, UK.
The complex neural activity of prefrontal cortex (PFC) is a hallmark of cognitive processes. How these rich dynamics emerge and support neural computations is largely unknown. Here, we infer mechanisms underlying the context-dependent integration of sensory inputs by fitting dynamical models to PFC population responses of behaving monkeys.