The nervous system must observe a complex world and produce appropriate, sometimes complex, behavioral responses. In contrast to this complexity, neural responses are often characterized through very simple descriptions such as receptive fields or tuning curves. Do these characterizations adequately reflect the true dimensionality reduction that takes place in the nervous system, or are they merely convenient oversimplifications? Here we address this question for the target-selective descending neurons (TSDNs) of the dragonfly. Using extracellular multielectrode recordings of a population of TSDNs, we quantify the completeness of the receptive field description of these cells and conclude that the information in independent instantaneous position and velocity receptive fields accounts for 70%-90% of the total information in single spikes. Thus, we demonstrate that this simple receptive field model is close to a complete description of the stimulus features that evoke TSDN responses.
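The "fraction of single-spike information" compared here is typically computed as a ratio of Kullback-Leibler divergences between spike-triggered stimulus distributions and the stimulus prior. A minimal sketch of that calculation, using made-up toy distributions (hypothetical numbers, not dragonfly data):

```python
import numpy as np

def single_spike_info(p_stim, p_stim_given_spike):
    """Single-spike information (bits): the KL divergence between the
    spike-triggered stimulus distribution and the stimulus prior."""
    p = np.asarray(p_stim, dtype=float)
    q = np.asarray(p_stim_given_spike, dtype=float)
    mask = q > 0  # terms with q = 0 contribute nothing
    return float(np.sum(q[mask] * np.log2(q[mask] / p[mask])))

# Toy example: 4 discrete stimulus states, uniform prior.
prior = np.array([0.25, 0.25, 0.25, 0.25])
# Hypothetical spike-triggered distribution measured from the full response:
posterior_full = np.array([0.70, 0.20, 0.08, 0.02])
# Hypothetical distribution predicted by a simple receptive-field model:
posterior_rf = np.array([0.60, 0.25, 0.10, 0.05])

i_full = single_spike_info(prior, posterior_full)
i_rf = single_spike_info(prior, posterior_rf)
print(f"RF model captures {100 * i_rf / i_full:.0f}% of single-spike information")
```

The model's information can never exceed the total single-spike information, so the ratio is a natural "completeness" score for a receptive-field description.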
DOI: http://dx.doi.org/10.1016/s0896-6273(03)00680-9
Neuroimage
January 2025
Division of Arts and Sciences, NYU Shanghai, 567 West Yangsi Road, Pudong New District, 200124, Shanghai, China; Center for Neural Science, New York University, 4 Washington Place, NY, 10003, NY, USA; NYU-ECNU Institute of Brain and Cognitive Science, 3663 Zhongshan Road North, Putuo District, 200062, Shanghai, China.
The BOLD response can be fitted using the population receptive field (PRF) model to reveal how visual input is represented on the cortex (Dumoulin and Wandell, 2008). Fitting the PRF model costs considerable time, often requiring days to analyze BOLD signals for a small cohort of subjects. We introduce the qPRF ("quick PRF"), a system for accelerated PRF modeling that reduced the computation time by a factor >1,000 without losing goodness-of-fit when compared to another widely available PRF modeling package (Kay et al.).
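The core of the PRF model of Dumoulin and Wandell (2008) is an isotropic 2D Gaussian in visual-field coordinates whose overlap with each stimulus aperture predicts the neural response. A minimal sketch (toy grid and bar stimulus; a full pipeline would also convolve the prediction with a hemodynamic response function and fit x0, y0, sigma per voxel):

```python
import numpy as np

def gaussian_prf(x0, y0, sigma, grid):
    """Isotropic 2D Gaussian pRF on a visual-field grid."""
    X, Y = grid
    return np.exp(-((X - x0) ** 2 + (Y - y0) ** 2) / (2 * sigma ** 2))

def predict_timecourse(prf, apertures):
    """Neural prediction: overlap of each binary stimulus aperture
    with the pRF (HRF convolution omitted in this sketch)."""
    return apertures.reshape(len(apertures), -1) @ prf.ravel()

# Toy 21x21 visual-field grid, -10 to +10 degrees in 1-degree steps.
coords = np.linspace(-10, 10, 21)
grid = np.meshgrid(coords, coords)
prf = gaussian_prf(x0=2.0, y0=-1.0, sigma=1.5, grid=grid)

# A vertical bar sweeping left to right, one column per frame.
apertures = np.zeros((21, 21, 21))
for t in range(21):
    apertures[t, :, t] = 1.0

pred = predict_timecourse(prf, apertures)
# The predicted response peaks when the bar crosses the pRF center (x0 = 2 deg).
print("peak frame:", pred.argmax())
```

The expensive part that qPRF reportedly accelerates is repeating this forward prediction and nonlinear fit over every voxel and a large grid of candidate (x0, y0, sigma) values.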
Curr Oncol
June 2024
Hôpital Maisonneuve-Rosemont, Montreal, QC H1T 2M4, Canada
On behalf of Cell Therapy Transplant Canada (CTTC), we are pleased to present the Abstracts of the CTTC 2023 Annual Conference. The conference was held in-person, 31 May–2 June 2023, in Halifax, Nova Scotia at the Westin Nova Scotian hotel. Poster authors presented their work during a lively and engaging welcome reception on Thursday, 1 June, and oral abstract authors were featured during the oral abstract session in the afternoon of Friday, 2 June 2023.
Front Comput Neurosci
December 2024
Sussex AI, School of Engineering and Informatics, University of Sussex, Brighton, United Kingdom.
We present a Spiking Neural Network (SNN) model that incorporates learnable synaptic delays through two approaches: per-synapse delay learning via Dilated Convolutions with Learnable Spacings (DCLS) and a dynamic pruning strategy that also serves as a form of delay learning. In the latter approach, the network dynamically selects and prunes connections, optimizing the delays in sparse connectivity settings. We evaluate both approaches on the Raw Heidelberg Digits keyword spotting benchmark using Backpropagation Through Time with surrogate gradients.
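In discrete time, a per-synapse delay simply shifts each presynaptic spike train before it enters the weighted sum, and learning adjusts the shift. A minimal sketch with integer delays (a discrete stand-in for the real-valued, gradient-learned delays of DCLS; variable and function names are illustrative):

```python
import numpy as np

def delayed_response(spikes, weights, delays):
    """Feed-forward drive to one neuron with per-synapse integer delays:
    input i's spike train is shifted by delays[i] time steps, then the
    shifted trains are combined in a weighted sum."""
    T, n_in = spikes.shape
    out = np.zeros(T)
    for i in range(n_in):
        d = int(delays[i])
        shifted = np.zeros(T)
        shifted[d:] = spikes[:T - d, i]  # spike arrives d steps later
        out += weights[i] * shifted
    return out

# Two inputs both firing at t=0; equal delays make them arrive together.
spikes = np.zeros((10, 2))
spikes[0, 0] = 1.0
spikes[0, 1] = 1.0
out = delayed_response(spikes, weights=np.array([1.0, 1.0]),
                       delays=np.array([3, 3]))
print(out)  # coincident arrival at t=3
```

The pruning view in the abstract can be read against this sketch: if each synapse starts with several candidate delayed copies and all but one are pruned away, the surviving copy's shift is the learned delay.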
Sci Rep
January 2025
School of Computer and Control Engineering, Qiqihar University, Qiqihar, 161003, China.
Neural Netw
December 2024
Institute of Automation, Chinese Academy of Sciences, MAIS, Beijing, 100190, China; University of Chinese Academy of Sciences, Beijing, 101408, China.
In the rapidly evolving field of deep learning, Convolutional Neural Networks (CNNs) retain their unique strengths and applicability in processing grid-structured data such as images, despite the surge of Transformer architectures. This paper explores alternatives to the standard convolution, with the objective of augmenting its feature extraction prowess while maintaining a similar parameter count. We propose innovative solutions targeting depthwise separable convolution and standard convolution, culminating in our Multi-scale Progressive Inference Convolution (MPIC).
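The abstract does not give MPIC's internals, but the baseline it targets is standard: depthwise separable convolution trades a single k x k convolution for a per-channel k x k depthwise step plus a 1x1 pointwise step, cutting the parameter count sharply. A quick sketch of the comparison (bias terms omitted):

```python
def conv_params(c_in, c_out, k):
    """Parameter count of a standard k x k convolution (no bias)."""
    return c_in * c_out * k * k

def depthwise_separable_params(c_in, c_out, k):
    """Depthwise k x k (one filter per input channel) + pointwise 1x1."""
    return c_in * k * k + c_in * c_out

c_in, c_out, k = 64, 128, 3
std = conv_params(c_in, c_out, k)                  # 73,728
dws = depthwise_separable_params(c_in, c_out, k)   # 576 + 8,192 = 8,768
print(std, dws, round(std / dws, 1))
```

Designs like the paper's MPIC aim to spend part of this saved budget on richer feature extraction (e.g., multiple kernel scales) while staying near the original parameter count.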