Neural Comput
September 2019
There is extensive evidence that biological neural networks encode information in the precise timing of the spikes generated and transmitted by neurons, a strategy that offers several advantages over rate-based codes. Here we adopt a vector space formulation of spike train sequences and introduce a new liquid state machine (LSM) network architecture and a new forward orthogonal regression algorithm to learn an input-output signal mapping or to decode brain activity. The proposed algorithm uses precise spike timing to select the presynaptic neurons relevant to each learning task.
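As a rough illustration of the forward orthogonal regression idea (a generic greedy selection sketch, not the specific algorithm of this paper), the following Python snippet selects, from a hypothetical matrix of presynaptic spike-train features, the columns that most reduce the residual error after orthogonalization against the already-selected regressors. The feature construction, function name, and parameters are all assumptions made for illustration.

    import numpy as np

    def forward_orthogonal_regression(Phi, y, n_select):
        # Phi: (T, N) candidate regressors, e.g. filtered presynaptic spike trains.
        # y:   (T,) target signal; returns indices of selected columns in order.
        residual = y.astype(float).copy()
        selected, Q = [], []                # Q: orthonormal basis of chosen subspace
        for _ in range(n_select):
            P = Phi.astype(float).copy()
            for q in Q:                     # orthogonalize candidates against basis
                P -= np.outer(q, q @ P)
            norms = np.linalg.norm(P, axis=0)
            norms[norms < 1e-12] = np.inf   # ignore columns already in the span
            scores = np.abs(P.T @ residual) / norms
            if selected:
                scores[selected] = -np.inf  # never reselect a column
            k = int(np.argmax(scores))
            selected.append(k)
            q = P[:, k] / np.linalg.norm(P[:, k])
            Q.append(q)
            residual = residual - q * (q @ residual)  # remove explained component
        return selected

In the spiking setting described above, the columns of Phi would presumably be features derived from the precise spike times of candidate presynaptic neurons, so that the greedy selection picks out the neurons most relevant to each learning task.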
Inferring mathematical models of sensory processing systems directly from input-output observations, while making the fewest assumptions about the model equations and the types of measurements available, is still a major issue in computational neuroscience. This letter introduces two new approaches for identifying sensory circuit models consisting of linear and nonlinear filters in series with spiking neuron models, based only on the sampled analog input to the filter and the recorded spike train output of the spiking neuron. For an ideal integrate-and-fire neuron model, the first algorithm can identify the spiking neuron parameters as well as the structure and parameters of an arbitrary nonlinear filter connected to it.
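Under strong simplifying assumptions, the flavor of such an identification problem can be sketched as a least-squares fit built from the ideal integrate-and-fire t-transform: between consecutive spikes t_k and t_{k+1}, the integral of the filtered input equals kappa*delta - b*(t_{k+1} - t_k), where kappa, delta, and b denote the integration constant, threshold, and bias. The sketch below further assumes a linear FIR filter parameterization and known neuron parameters; it is not the letter's actual algorithm, and all names are illustrative.

    import numpy as np

    def identify_fir_filter(x, dt, spike_times, kappa, delta, b, n_taps):
        # x: sampled analog input (sampling step dt); spike_times: recorded IAF
        # output spike times in seconds. Returns least-squares FIR taps h.
        t = np.arange(len(x)) * dt
        A = np.zeros((len(spike_times) - 1, n_taps))
        q = np.zeros(len(spike_times) - 1)
        for k in range(len(spike_times) - 1):
            mask = (t >= spike_times[k]) & (t < spike_times[k + 1])
            for j in range(n_taps):
                xj = np.roll(x, j)            # j-sample-delayed copy of the input
                xj[:j] = 0.0
                A[k, j] = np.sum(xj[mask]) * dt   # integral over the interspike interval
            q[k] = kappa * delta - b * (spike_times[k + 1] - spike_times[k])
        h, *_ = np.linalg.lstsq(A, q, rcond=None)
        return h

Each interspike interval contributes one linear equation in the unknown filter taps, which is what makes identification from only the analog input and the output spike train tractable in this simplified setting.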
Neural Comput
September 2015
Integrate-and-fire neurons are time encoding machines that convert the amplitude of an analog signal into a nonuniform, strictly increasing sequence of spike times. Under certain conditions, the encoded signals can be reconstructed from the nonuniform spike time sequences using a time decoding machine. Time encoding and time decoding methods have been studied using nonuniform sampling theory for band-limited spaces, as well as for generic shift-invariant spaces.
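A minimal simulation of the encoding step described here, assuming an ideal integrate-and-fire neuron with bias b, integration constant kappa, and threshold delta (parameter names follow common usage but are illustrative): the neuron accumulates the integral of (b + u)/kappa and emits a spike, then resets, each time the threshold is crossed, producing a strictly increasing spike time sequence whenever b + u stays positive.

    import numpy as np

    def iaf_time_encode(u, dt, kappa=1.0, delta=0.05, b=1.0):
        # Encode a sampled analog signal u into spike times with an ideal
        # integrate-and-fire neuron (forward-Euler integration of (b + u)/kappa).
        spikes, integral = [], 0.0
        for n, un in enumerate(u):
            integral += dt * (b + un) / kappa
            if integral >= delta:
                spikes.append(n * dt)   # record the spike time
                integral -= delta       # reset by subtracting the threshold
        return np.array(spikes)

    # Example: encode a simple band-limited-looking test signal
    t = np.arange(0.0, 1.0, 1e-4)
    u = 0.3 * np.sin(2 * np.pi * 5 * t) + 0.2 * np.sin(2 * np.pi * 11 * t)
    tk = iaf_time_encode(u, dt=1e-4)

A time decoding machine would then reconstruct u from tk by exploiting the same t-transform relation in a suitable band-limited or shift-invariant space; that reconstruction step is not shown here.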