Publications by authors named "Schemmel J"

Background: Finding appropriate model parameters for multi-compartmental neuron models can be challenging. Parameters such as the leak and axial conductance are not always directly derivable from neuron observations but are crucial for replicating desired observations. The objective of this study is to replicate the attenuation behavior of an excitatory postsynaptic potential (EPSP) traveling along a linear chain of compartments on the analog BrainScaleS-2 neuromorphic hardware platform.
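The compartment-chain setup described in this abstract can be sketched in ordinary Python. This is a minimal, hypothetical illustration with made-up units and parameter values (not the BrainScaleS-2 calibration code): a passive chain of compartments coupled by an axial conductance, where an EPSP-like current injected into the first compartment attenuates as it spreads along the chain.

```python
import numpy as np

# Toy passive compartment chain; all parameters are illustrative.
N = 4              # number of compartments
C = 1.0            # membrane capacitance (arbitrary units)
g_leak = 0.1       # leak conductance per compartment
g_axial = 0.5      # axial (inter-compartment) conductance
dt = 0.01
steps = 5000

v = np.zeros(N)    # membrane deflection per compartment
peaks = np.zeros(N)
for t in range(steps):
    i_syn = np.zeros(N)
    if t * dt < 1.0:                         # brief EPSC-like input to compartment 0
        i_syn[0] = 1.0
    # diffusive axial currents between neighbouring compartments
    i_axial = np.zeros(N)
    i_axial[:-1] += g_axial * (v[1:] - v[:-1])
    i_axial[1:] += g_axial * (v[:-1] - v[1:])
    v += dt / C * (-g_leak * v + i_axial + i_syn)
    peaks = np.maximum(peaks, v)

attenuation = peaks / peaks[0]               # peak EPSP relative to injection site
print(attenuation)                           # decreases along the chain
```

The ratio of g_axial to g_leak sets the electrotonic length constant and hence how strongly the EPSP peak decays from compartment to compartment, which is exactly the behavior the parameter search targets.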

Article Synopsis
  • The BrainScaleS-2 system is an advanced analog neuromorphic platform used in computational neuroscience and spike-based machine learning.
  • The system now features a configurable real-time event interface, enhancing its connection with external sensors and actuators for improved performance.
  • A demonstration involves using PyTorch to train a spiking neural network for controlling brushless DC motors, enabling high-speed robotics research focused on event-driven control and online learning.

We report on a survey of 258 psychotherapists from Germany, focusing on their experiences with memory recovery in general, suggestive therapy procedures, evaluations of recovered memories, and memory recovery in training and guidelines. Most therapists (78%) reported instances of memory recovery encompassing negative and positive childhood experiences, though usually in a minority of patients. Most therapists (82%) also reported having held assumptions about unremembered trauma.


Objectives: We tested the effect of true and fabricated baseline statements from the same sender on veracity judgments.

Hypotheses: We predicted that presenting a combination of true and fabricated baseline statements would improve truth and lie detection accuracy, while presenting a true baseline would improve only truth detection, and presenting a fabricated baseline would only improve lie detection compared with presenting no baseline statement.

Method: A 4 × 2 within-subjects design was used with 142 student participants.

Article Synopsis
  • Neuromorphic systems open new avenues for scientific exploration, but making them both efficient and easy to use is challenging.
  • The BrainScaleS-2 system is a neuromorphic hardware platform whose software features make it easier for researchers to run experiments.
  • The paper describes improvements such as faster training methods, new neuron types, and better user access, along with plans to scale the hardware further and simplify its use.

Since the beginning of information processing by electronic components, the nervous system has served as a metaphor for the organization of computational primitives. Brain-inspired computing today encompasses a class of approaches ranging from using novel nano-devices for computation to research into large-scale neuromorphic architectures, such as TrueNorth, SpiNNaker, BrainScaleS, Tianjic, and Loihi. While implementation details differ, spiking neural networks, sometimes referred to as the third generation of neural networks, are the common abstraction used to model computation with such systems.


To rapidly process temporal information at a low metabolic cost, biological neurons integrate inputs as an analog sum, but communicate with spikes, binary events in time. Analog neuromorphic hardware uses the same principles to emulate spiking neural networks with exceptional energy efficiency. However, instantiating high-performing spiking networks on such hardware remains a significant challenge due to device mismatch and the lack of efficient training algorithms.
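The principle described above, analog integration combined with binary spike output, is what a leaky integrate-and-fire neuron captures. The following is a minimal sketch with illustrative parameters, not hardware values:

```python
import numpy as np

# Toy leaky integrate-and-fire neuron: the membrane integrates input as an
# analog sum, but communication happens only through binary spike events.
tau = 20.0          # membrane time constant (ms), illustrative
v_th = 1.0          # spike threshold
dt = 0.1
rng = np.random.default_rng(0)

v = 0.0
spikes = []
inputs = rng.uniform(0.0, 0.2, size=10000)   # noisy analog input current
for t, i_in in enumerate(inputs):
    v += dt / tau * (-v) + i_in * dt          # leaky analog integration
    if v >= v_th:                             # binary event in time
        spikes.append(t * dt)
        v = 0.0                               # reset after the spike

print(f"{len(spikes)} spikes emitted")
```

On analog hardware the integration step is performed by physical circuit dynamics rather than by this explicit Euler loop, which is what yields the energy efficiency the abstract refers to.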


BrainScaleS-2 is an accelerated and highly configurable neuromorphic system with physical models of neurons and synapses. Beyond networks of spiking point neurons, it allows for the implementation of user-defined neuron morphologies. Both passive propagation of electric signals between compartments as well as dendritic spikes and plateau potentials can be emulated.


Spiking neural networks are the basis of versatile and power-efficient information processing in the brain. Although we currently lack a detailed understanding of how these networks compute, recently developed optimization techniques allow us to instantiate increasingly complex functional spiking neural networks in silico. These methods hold the promise of enabling more efficient non-von Neumann computing hardware and offer new vistas in the quest to unravel brain circuit function.


In computational neuroscience, as well as in machine learning, neuromorphic devices promise an accelerated and scalable alternative to neural network simulations. Their neural connectivity and synaptic capacity depend on specific design choices but are always intrinsically limited. Here, we present a strategy to achieve structural plasticity that optimizes resource allocation under these constraints by constantly rewiring the pre- and postsynaptic partners while keeping the neuronal fan-in constant and the connectome sparse.
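The rewiring strategy can be sketched abstractly. All names, sizes, and thresholds below are illustrative, not the paper's implementation: each neuron keeps a fixed fan-in of K synapses, weak synapses are pruned, and fresh candidate partners take their place, so the connectome stays sparse while resources are reallocated.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pre, n_post, K = 100, 10, 8       # presynaptic pool, neurons, fixed fan-in

# each row: the K presynaptic partners of one postsynaptic neuron
partners = np.stack([rng.choice(n_pre, size=K, replace=False)
                     for _ in range(n_post)])
weights = rng.uniform(0.0, 1.0, size=(n_post, K))

def rewire(partners, weights, w_min=0.1):
    """Prune synapses below w_min and draw fresh presynaptic partners."""
    for post in range(n_post):
        for k in range(K):
            if weights[post, k] < w_min:
                # pick a partner not already connected to this neuron
                free = np.setdiff1d(np.arange(n_pre), partners[post])
                partners[post, k] = rng.choice(free)
                weights[post, k] = w_min        # fresh synapse starts weak
    return partners, weights

partners, weights = rewire(partners, weights)
print("fan-in per neuron:", partners.shape[1])   # constant by construction
```

Because a pruned slot is immediately refilled, the fan-in never changes, which matches the constraint that the hardware synapse array has a fixed number of physical slots per neuron.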


We investigated how information about a motive to lie affects the perceived content quality of a statement and its subsequent veracity rating. In an online study, 300 participants rated a statement about an alleged sexual harassment on a scale based on Criteria-based Content Analysis (CBCA) and judged its veracity. In a 3 × 3 between-subjects design, we varied prior information (motive to lie, no motive to lie, and no information on a motive) and presented three statement versions of varying content quality (high, medium, and low).


The critical state is assumed to be optimal for any computation in recurrent neural networks because criticality maximizes a number of abstract computational properties. We challenge this assumption by evaluating the performance of a spiking recurrent neural network on a set of tasks of varying complexity at, and away from, critical network dynamics. To that end, we developed a plastic spiking network on a neuromorphic chip.


The massive parallelism of biological information processing gives it a decisive advantage over human-engineered computing devices. In particular, it may hold the key to overcoming the von Neumann bottleneck that limits contemporary computer architectures. Physical-model neuromorphic devices seek to replicate not only this inherent parallelism but also aspects of its microscopic dynamics in analog circuits emulating neurons and synapses.


Neuronal network models of high-level brain functions such as memory recall and reasoning often rely on the presence of some form of noise. The majority of these models assumes that each neuron in the functional network is equipped with its own private source of randomness, often in the form of uncorrelated external noise. In vivo, synaptic background input has been suggested to serve as the main source of noise in biological neuronal networks.


An increasing body of evidence suggests that the trial-to-trial variability of spiking activity in the brain is not mere noise, but rather the reflection of a sampling-based encoding scheme for probabilistic computing. Since the precise statistical properties of neural activity are important in this context, many models assume an ad hoc source of well-behaved, explicit noise, either on the input or on the output side of single neuron dynamics, most often assuming an independent Poisson process in either case. However, these assumptions are somewhat problematic: neighboring neurons tend to share receptive fields, rendering both their input and their output correlated; at the same time, neurons are known to behave largely deterministically, as a function of their membrane potential and conductance.


Neuromorphic devices represent an attempt to mimic aspects of the brain's architecture and dynamics with the aim of replicating its hallmark functional capabilities in terms of computational power, robust learning and energy efficiency. We employ a single-chip prototype of the BrainScaleS-2 neuromorphic system to implement a proof-of-concept demonstration of reward-modulated spike-timing-dependent plasticity in a spiking network that learns to play a simplified version of the Pong video game by smooth pursuit. This system combines an electronic mixed-signal substrate for emulating neuron and synapse dynamics with an embedded digital processor for on-chip learning, which in this work also serves to simulate the virtual environment and learning agent.
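Reward-modulated STDP can be sketched in a few lines. This is a toy abstraction with invented parameters, not the on-chip plasticity-processor code: each synapse accumulates an eligibility trace from pre/post spike pairings, and a scalar reward signal converts that trace into a weight change.

```python
import numpy as np

rng = np.random.default_rng(2)
n_syn = 5
tau_e = 50.0          # eligibility trace time constant (ms), illustrative
lr = 0.01             # learning rate
dt = 1.0

w = rng.uniform(0.2, 0.8, size=n_syn)
elig = np.zeros(n_syn)

for t in range(200):
    pre = rng.random(n_syn) < 0.05            # presynaptic spikes this step
    post = rng.random() < 0.05                # postsynaptic spike this step
    if post:
        elig += pre.astype(float)             # causal pairings mark the synapse
    elig *= np.exp(-dt / tau_e)               # trace decays between pairings
    reward = 1.0 if post else 0.0             # toy reward signal
    w += lr * reward * elig                   # reward gates the weight update
    w = np.clip(w, 0.0, 1.0)

print(w)
```

The key point is the three-factor structure: pre- and postsynaptic activity alone only tag a synapse; the weight actually changes only when the reward signal arrives, which is what lets the network learn the Pong task from a delayed performance signal.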


Neuromorphic engineering (NE) encompasses a diverse range of approaches to information processing that are inspired by neurobiological systems, and this feature distinguishes neuromorphic systems from conventional computing systems. The brain has evolved over billions of years to solve difficult engineering problems by using efficient, parallel, low-power computation. The goal of NE is to design systems capable of brain-like computation.


Here, we describe a multicompartment neuron circuit based on the adaptive exponential integrate-and-fire (AdEx) model, developed for the second-generation BrainScaleS hardware. Based on an existing modular leaky integrate-and-fire (LIF) architecture designed in 65-nm CMOS, the circuit features exponential spike generation, neuronal adaptation, intercompartmental connections, and a conductance-based reset. The design reproduces a diverse set of firing patterns observed in cortical pyramidal neurons.
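The AdEx dynamics the circuit emulates can be integrated numerically. The sketch below uses the generic textbook parameter set of the model, not the values of the 65-nm circuit: C dV/dt = -g_L (V - E_L) + g_L dT exp((V - V_T)/dT) - w_ad + I, with tau_w dw_ad/dt = a (V - E_L) - w_ad and a spike-triggered increment b.

```python
import numpy as np

# Generic AdEx parameters (Brette/Gerstner-style textbook values).
C, g_L, E_L = 281.0, 30.0, -70.6        # pF, nS, mV
dT, V_T = 2.0, -50.4                    # mV
a, b, tau_w = 4.0, 80.5, 144.0          # nS, pA, ms
V_peak, V_reset = 20.0, -70.6           # mV
dt, I = 0.01, 800.0                     # ms, pA (constant step current)

V, w_ad = E_L, 0.0
spike_times = []
for step in range(int(200.0 / dt)):     # simulate 200 ms
    dV = (-g_L * (V - E_L) + g_L * dT * np.exp((V - V_T) / dT)
          - w_ad + I) / C
    dw = (a * (V - E_L) - w_ad) / tau_w
    V += dt * dV
    w_ad += dt * dw
    if V >= V_peak:                     # spike: reset and adapt
        spike_times.append(step * dt)
        V = V_reset
        w_ad += b
print(len(spike_times), "spikes in 200 ms")
```

The exponential term produces the sharp spike upswing, while the adaptation variable w_ad lengthens successive inter-spike intervals; varying a, b, and the reset yields the diverse firing patterns the abstract mentions.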


Spiking networks that perform probabilistic inference have been proposed both as models of cortical computation and as candidates for solving problems in machine learning. However, the evidence for spike-based computation being in any way superior to non-spiking alternatives remains scarce. We propose that short-term synaptic plasticity can provide spiking networks with distinct computational advantages compared to their classical counterparts.
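A standard abstraction of short-term synaptic plasticity is the Tsodyks-Markram depressing synapse. The sketch below uses generic parameters, not necessarily those of the paper, and shows how a regular presynaptic train depresses the effective synaptic efficacy toward a steady state:

```python
import numpy as np

U = 0.5            # fraction of resources used per spike, illustrative
tau_rec = 100.0    # recovery time constant (ms), illustrative

x = 1.0            # available synaptic resources
psc = []           # effective amplitude of each transmitted spike
spike_times = np.arange(0.0, 200.0, 20.0)   # regular 50 Hz presynaptic train
last_t = 0.0
for t in spike_times:
    # resources recover exponentially between spikes
    x = 1.0 - (1.0 - x) * np.exp(-(t - last_t) / tau_rec)
    psc.append(U * x)          # released fraction sets the PSC amplitude
    x -= U * x                 # resources consumed by this spike
    last_t = t

print(psc)   # amplitudes decrease toward a steady state (depression)
```

This activity-dependent modulation of efficacy is the mechanism the abstract proposes as a source of computational advantage for spiking over non-spiking networks.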


We present results from a new approach to learning and plasticity in neuromorphic hardware systems: to enable flexibility in implementable learning mechanisms while keeping the high efficiency associated with neuromorphic implementations, we combine a general-purpose processor with full-custom analog elements. This processor operates in parallel with a fully parallel neuromorphic system consisting of an array of synapses connected to analog, continuous-time neuron circuits. Novel analog correlation sensor circuits process spike events for each synapse in parallel and in real time.


The highly variable dynamics of neocortical circuits observed in vivo have been hypothesized to represent a signature of ongoing stochastic inference but stand in apparent contrast to the deterministic response of neurons measured in vitro. Based on a propagation of the membrane autocorrelation across spike bursts, we provide an analytical derivation of the neural activation function that holds for a large parameter space, including the high-conductance state. On this basis, we show how an ensemble of leaky integrate-and-fire neurons with conductance-based synapses embedded in a spiking environment can attain the correct firing statistics for sampling from a well-defined target distribution.

Article Synopsis
  • The text includes a collection of research topics related to neural circuits, mental disorders, and computational models in neuroscience.
  • It features various studies examining the functional advantages of neural heterogeneity, propagation waves in the visual cortex, and dendritic mechanisms crucial for precise neuronal functioning.
  • The research covers a range of applications, from understanding complex brain rhythms to modeling auditory processing and investigating the effects of neural regulation on behavior.

The means by which cortical neural networks are able to efficiently solve inference problems remains an open question in computational neuroscience. Recently, abstract models of Bayesian computation in neural circuits have been proposed, but they lack a mechanistic interpretation at the single-cell level. In this article, we describe a complete theoretical framework for building networks of leaky integrate-and-fire neurons that can sample from arbitrary probability distributions over binary random variables.
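At the abstract level, sampling from a distribution over binary random variables can be done by Gibbs sampling with a logistic conditional, which is the computation such LIF networks are constructed to approximate. The weights and biases below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
W = np.array([[0.0, 0.5, -0.3],
              [0.5, 0.0, 0.8],
              [-0.3, 0.8, 0.0]])      # symmetric weights, zero diagonal
b = np.array([-0.2, 0.1, 0.0])       # biases

z = rng.integers(0, 2, size=n).astype(float)
counts = {}
for step in range(20000):
    k = step % n                                  # update units in turn
    u = W[k] @ z + b[k]                           # abstract "membrane potential"
    # logistic conditional: p(z_k = 1 | rest)
    z[k] = float(rng.random() < 1.0 / (1.0 + np.exp(-u)))
    counts[tuple(z)] = counts.get(tuple(z), 0) + 1

print(sorted(counts.items(), key=lambda kv: -kv[1]))
```

The visited-state frequencies approach the Boltzmann distribution defined by W and b; the theoretical framework in the abstract maps this update rule onto the membrane dynamics of leaky integrate-and-fire neurons.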


Advancing the size and complexity of neural network models leads to an ever-increasing demand for computational resources for their simulation. Neuromorphic devices offer a number of advantages over conventional computing architectures, such as high emulation speed or low power consumption, but this usually comes at the price of reduced configurability and precision. In this article, we investigate the consequences of several such factors that are common to neuromorphic devices: limited hardware resources, limited parameter configurability, and parameter variations due to fixed-pattern noise and trial-to-trial variability.
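One of the listed factors, limited parameter precision, can be illustrated with a toy weight-discretization experiment. The bit widths and error values here are illustrative, not measurements from the paper:

```python
import numpy as np

rng = np.random.default_rng(4)
w = rng.uniform(0.0, 1.0, size=10000)     # ideal continuous weights

def quantise(w, bits):
    """Map weights onto 2**bits - 1 evenly spaced hardware levels."""
    levels = 2 ** bits - 1
    return np.round(w * levels) / levels

for bits in (10, 6, 4):
    err = np.abs(quantise(w, bits) - w).mean()
    print(f"{bits}-bit weights: mean abs error {err:.5f}")
```

The mean quantisation error grows as the resolution shrinks, which is the kind of precision constraint whose network-level consequences the article investigates.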
