We derive explicit, closed-form expressions for the cumulant densities of a multivariate, self-exciting Hawkes point process, generalizing a result of Hawkes in his earlier work on the covariance density and Bartlett spectrum of such processes. To do this, we represent the Hawkes process in terms of a Poisson cluster process and show how the cumulant density formulas can be derived by enumerating all possible "family trees," representing complex interactions between point events. We also consider the problem of computing the integrated cumulants, characterizing the average measure of correlated activity between events of different types, and derive the relevant equations.
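In the univariate case with an exponential kernel φ(t) = α e^(−βt), the Poisson cluster representation used in the derivation doubles as a simulation recipe. A minimal Python sketch (parameter values illustrative), checking the empirical rate against the closed-form mean intensity λ = μ/(1 − n), where n = α/β is the branching ratio:

```python
import math
import random

def poisson_small(lam, rng):
    """Poisson draw by inversion; adequate for small lam (offspring counts)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def simulate_hawkes(mu, alpha, beta, T, rng):
    """Hawkes process on [0, T) via its cluster representation:
    immigrants arrive as a Poisson(mu) stream, and each event spawns
    Poisson(alpha/beta) offspring at Exp(beta)-distributed delays."""
    queue, events, t = [], [], 0.0
    while True:                      # immigrant (cluster-center) events
        t += rng.expovariate(mu)
        if t >= T:
            break
        queue.append(t)
    while queue:                     # recursively generate offspring
        s = queue.pop()
        if s >= T:
            continue
        events.append(s)
        for _ in range(poisson_small(alpha / beta, rng)):
            queue.append(s + rng.expovariate(beta))
    return sorted(events)

rng = random.Random(1)
mu, alpha, beta, T = 0.5, 0.4, 1.0, 5000.0
events = simulate_hawkes(mu, alpha, beta, T, rng)
n = alpha / beta                     # branching ratio, must be < 1
rate_theory = mu / (1.0 - n)         # first cumulant (mean intensity)
rate_empirical = len(events) / T
```

The enumeration of "family trees" behind the higher cumulant densities starts from exactly this parent/offspring structure.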
Curr Opin Neurobiol
June 2015
Our ability to collect large amounts of data from many cells has been paralleled by the development of powerful statistical models for extracting information from these data. Here we discuss how the activity of cell assemblies can be analyzed using such models, focusing on generalized linear models and maximum entropy models, and describe a number of recent studies that employ these tools for analyzing multi-neuronal activity. We show results from simulations comparing inferred functional connectivity, pairwise correlations, and the real synaptic connections in simulated networks, demonstrating the power of statistical models in inferring functional connectivity.
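As an illustration of the generalized-linear-model approach, the following sketch (synthetic data; all couplings and rates are made up for the example) fits a Bernoulli GLM for one neuron's spiking given the previous-bin activity of the others, by plain gradient ascent on the log-likelihood:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 5000
# synthetic spike raster: neuron 0 is driven by neuron 1 (coupling +2),
# neuron 2 is an independent bystander
X = np.zeros((T, 3), dtype=int)
X[:, 1] = rng.random(T) < 0.3
X[:, 2] = rng.random(T) < 0.3
p0 = 1 / (1 + np.exp(-(-2.0 + 2.0 * X[:-1, 1])))   # ground-truth GLM
X[1:, 0] = rng.random(T - 1) < p0

# fit: P(spike_t) = sigmoid(b + w . x_{t-1}), maximum likelihood
y = X[1:, 0].astype(float)
F = np.column_stack([np.ones(T - 1), X[:-1, 1], X[:-1, 2]]).astype(float)
w = np.zeros(3)
for _ in range(2000):                  # gradient ascent on log-likelihood
    p = 1 / (1 + np.exp(-(F @ w)))
    w += 0.5 * F.T @ (y - p) / len(y)
# w[1] recovers the driving coupling; w[2] stays near zero
```

The inferred weights play the role of the functional connectivity discussed in the text: a strong positive w[1] for the true input, a near-zero w[2] for the uncoupled neuron.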
We derive learning rules for finding the connections between units in stochastic dynamical networks from the recorded history of a "visible" subset of the units. We consider two models. In both of them, the visible units are binary and stochastic.
We describe how the couplings in an asynchronous kinetic Ising model can be inferred. We consider two cases: one in which we know both the spin history and the update times and one in which we know only the spin history. For the first case, we show that one can average over all possible choices of update times to obtain a learning rule that depends only on spin correlations and can also be derived from the equations of motion for the correlations.
There has been recent progress on inferring the structure of interactions in complex networks when they are in stationary states satisfying detailed balance, but little has been done for nonequilibrium systems. Here we introduce an approach to this problem, considering, as an example, the question of recovering the interactions in an asymmetrically coupled, synchronously updated Sherrington-Kirkpatrick model. We derive an exact iterative inversion algorithm and develop efficient approximations based on dynamical mean-field and Thouless-Anderson-Palmer equations that express the interactions in terms of equal-time and one-time-step-delayed correlation functions.
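A minimal sketch of the naive mean-field branch of such an inversion, assuming weak couplings (system size, coupling scale, and run length are illustrative): simulate synchronous Glauber dynamics, then recover the couplings from the equal-time correlations C and the one-step-delayed correlations D via the mean-field relation D ≈ A J C, with A = diag(1 − m_i²):

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 5, 50000
J = rng.normal(0.0, 0.3 / np.sqrt(N), (N, N))   # asymmetric couplings

# synchronous Glauber dynamics: P(s_i(t+1) = +1) = 1 / (1 + exp(-2 h_i(t)))
S = np.empty((T, N))
s = np.ones(N)
for t in range(T):
    h = J @ s
    s = np.where(rng.random(N) < 1 / (1 + np.exp(-2 * h)), 1.0, -1.0)
    S[t] = s

m = S.mean(0)
dS = S - m
C = dS[:-1].T @ dS[:-1] / (T - 1)     # equal-time correlations
D = dS[1:].T @ dS[:-1] / (T - 1)      # one-step-delayed correlations
A = np.diag(1 - m**2)
J_nmf = np.linalg.inv(A) @ D @ np.linalg.inv(C)   # naive mean-field inversion
```

For weak couplings and long runs, J_nmf correlates strongly with the true J; the TAP equations add the next-order correction on top of this estimate.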
Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensionality reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the mean values and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years.
Neural Comput
February 2010
Neuronal firing correlations are studied using simulations of a simple network model for a cortical column in a high-conductance state with dynamically balanced excitation and inhibition. Although correlations between individual pairs of neurons exhibit considerable heterogeneity, population averages show systematic behavior. When the network is in a stationary state, the average correlations are generically small: correlation coefficients are of order 1/N, where N is the number of neurons in the network.
Phys Rev E Stat Nonlin Soft Matter Phys
May 2009
We study pairwise Ising models for describing the statistics of multineuron spike trains, using data from a simulated cortical network. We explore efficient ways of finding the optimal couplings in these models and examine their statistical properties. To do this, we extract the optimal couplings for subsets of size up to 200 neurons, essentially exactly, using Boltzmann learning.
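For very small systems, the Boltzmann-learning loop can be run with exact model moments instead of Monte Carlo sampling, which makes the moment-matching structure of the algorithm explicit. A minimal sketch (N = 4; parameter scales illustrative):

```python
import itertools
import numpy as np

def moments(h, J):
    """Exact <s_i> and <s_i s_j> of P(s) ~ exp(h.s + (1/2) s.J.s)
    by enumerating all 2^N states (feasible only for small N)."""
    N = len(h)
    states = np.array(list(itertools.product([-1, 1], repeat=N)), dtype=float)
    E = states @ h + 0.5 * np.einsum('ki,ij,kj->k', states, J, states)
    p = np.exp(E - E.max())
    p /= p.sum()
    m = p @ states
    chi = states.T @ (states * p[:, None])
    return m, chi

rng = np.random.default_rng(0)
N = 4
h_true = rng.normal(0, 0.3, N)
J_true = rng.normal(0, 0.3, (N, N))
J_true = (J_true + J_true.T) / 2          # symmetric, zero-diagonal couplings
np.fill_diagonal(J_true, 0.0)
m_data, chi_data = moments(h_true, J_true)   # the "data" moments to match

# Boltzmann learning: raise parameters where the model under-predicts moments
h, J = np.zeros(N), np.zeros((N, N))
for _ in range(3000):
    m, chi = moments(h, J)
    h += 0.2 * (m_data - m)
    J += 0.2 * (chi_data - chi)
    np.fill_diagonal(J, 0.0)
```

With exact moments the loop converges to the generating parameters; for the subsets of up to 200 neurons mentioned above, the same update is used with moments estimated by sampling.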
We present a complete mean field theory for a balanced state of a simple model of an orientation hypercolumn, with a numerical procedure for solving the mean-field equations quantitatively. With our treatment, one can determine self-consistently both the firing rates and the firing correlations, without being restricted to specific neuron models. Here, we solve the mean-field equations numerically for integrate-and-fire neurons.
We study the spike statistics of neurons in a network with dynamically balanced excitation and inhibition. Our model, intended to represent a generic cortical column, comprises randomly connected excitatory and inhibitory leaky integrate-and-fire neurons, driven by excitatory input from an external population. The high connectivity permits a mean field description in which synaptic currents can be treated as Gaussian noise, the mean and autocorrelation function of which are calculated self-consistently from the firing statistics of single model neurons.
Phys Rev E Stat Nonlin Soft Matter Phys
September 2004
We present a dynamical description and analysis of nonequilibrium transitions in the noisy one-dimensional Ginzburg-Landau equation for an extensive system based on a weak noise canonical phase space formulation of the Freidlin-Wentzell or Martin-Siggia-Rose methods. We derive propagating nonlinear domain wall or soliton solutions of the resulting canonical field equations with superimposed diffusive modes. The transition pathways are characterized by the nucleation and subsequent propagation of domain walls.
Large-scale expression data are now measured for thousands of genes simultaneously. This development has been followed by an exploration of theoretical tools to get as much information out of these data as possible. Several groups have used principal component analysis (PCA) for this task.
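A minimal sketch of PCA applied to a synthetic expression matrix (dimensions and the two embedded "expression programs" are illustrative), using the SVD of the gene-centered data:

```python
import numpy as np

rng = np.random.default_rng(0)
genes, samples = 200, 12
# synthetic log-expression matrix: two dominant temporal programs plus noise
t = np.linspace(0, 1, samples)
signal = (np.outer(rng.normal(0, 3, genes), np.sin(2 * np.pi * t))
          + np.outer(rng.normal(0, 2, genes), t - t.mean()))
X = signal + rng.normal(0, 0.5, (genes, samples))

Xc = X - X.mean(axis=1, keepdims=True)       # center each gene's profile
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
var_explained = s**2 / (s**2).sum()          # variance captured per component
# rows of Vt are the principal temporal patterns; columns of U load genes on them
```

Because the data contain only two strong programs, the first two components capture most of the variance; inspecting Vt recovers the temporal patterns themselves.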
We introduce a model of generalized Hebbian learning and retrieval in oscillatory neural networks modeling cortical areas such as hippocampus and olfactory cortex. Recent experiments have shown that synaptic plasticity depends on spike timing, especially at synapses from excitatory pyramidal cells in the hippocampus and in sensory and cerebellar cortex. Here we study how such plasticity can be used to form memories and input representations when the neural dynamics are oscillatory, as is common in the brain (particularly in the hippocampus and olfactory cortex).
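A common parameterization of such spike-timing dependence is an exponential STDP window; a minimal sketch (parameter values illustrative, not taken from any specific experiment):

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Weight change for a single pre/post spike pair, dt = t_post - t_pre (ms).
    Pre-before-post (dt > 0) potentiates; post-before-pre (dt <= 0) depresses,
    with the effect decaying exponentially as the spikes move apart."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau_plus)
    return -a_minus * math.exp(dt / tau_minus)
```

In an oscillatory network, the sign of the net weight change then depends on the phase relationship between pre- and postsynaptic firing, which is what ties this window to the oscillatory memory dynamics studied in the text.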
Large-scale expression data are now measured for thousands of genes simultaneously. This development has been followed by an exploration of theoretical tools to get as much information out of these data as possible. One line of work tries to extract the underlying regulatory network.