Publications by authors named "Jennifer M Groh"

Understanding how neurons encode multiple simultaneous stimuli is a fundamental question in neuroscience. We have previously introduced a novel theory of stochastic encoding patterns wherein a neuron's spiking activity dynamically switches among its constituent single-stimulus activity patterns when presented with multiple stimuli (Groh et al., 2024).
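
To make the proposed scheme concrete, here is a minimal sketch (illustrative only, not the authors' model or code; the rates, bin size, and switching probability are invented) of a neuron that commits to one of its two constituent single-stimulus firing rates in each short time bin of a dual-stimulus trial:

```python
# Minimal illustration (not the authors' code): a neuron that switches
# between its two single-stimulus firing rates on dual-stimulus trials.
import numpy as np

rng = np.random.default_rng(0)
RATE_A, RATE_B = 40.0, 10.0   # illustrative spikes/s for stimulus A alone, B alone
DUR = 0.5                     # trial duration, s (invented)

def dual_stim_trial(p_pick_a=0.5, n_bins=10):
    """Spike counts in short bins; in each bin the neuron 'commits' to A or B."""
    bin_dur = DUR / n_bins
    rates = np.where(rng.random(n_bins) < p_pick_a, RATE_A, RATE_B)
    return rng.poisson(rates * bin_dur)

counts = np.array([dual_stim_trial().sum() for _ in range(1000)])
print("mean dual-stimulus count:", counts.mean())      # near the A/B average
print("single-stimulus means:", RATE_A * DUR, RATE_B * DUR)
```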

Our ability to perceive multiple objects is mysterious. Sensory neurons are broadly tuned, producing potential overlap in the populations of neurons activated by each object in a scene. This overlap raises questions about how distinct information is retained about each item.

How neural representations preserve information about multiple stimuli is mysterious. Because tuning of individual neurons is coarse (e.g., visual receptive field diameters can exceed perceptual resolution), the populations of neurons potentially responsive to each individual stimulus can overlap, raising the question of how information about each item might be segregated and preserved in the population.

Eye movements alter the relationship between the visual and auditory spatial scenes. Signals related to eye movements affect neural pathways from the ear through auditory cortex and beyond, but how these signals contribute to computing the locations of sounds with respect to the visual scene is poorly understood. Here, we evaluated the information contained in eye movement-related eardrum oscillations (EMREOs), pressure changes recorded in the ear canal that occur in conjunction with eye movements.

We recently discovered a unique type of otoacoustic emission (OAE) time-locked to the onset (and offset) of saccadic eye movements and occurring in the absence of external sound (Gruters et al., 2018). How and why these eye-movement-related eardrum oscillations (EMREOs) are generated is unknown, with a role in visual-auditory integration being the likeliest candidate.

Auditory and visual information involve different coordinate systems, with auditory spatial cues anchored to the head and visual spatial cues anchored to the eyes. Information about eye movements is therefore critical for reconciling visual and auditory spatial signals. The recent discovery of eye movement-related eardrum oscillations (EMREOs) suggests that this process could begin as early as the auditory periphery.
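
The underlying bookkeeping is simple in one dimension. Below is a minimal sketch, assuming azimuths in degrees and a 1-D simplification, of how an eye-position signal converts a head-centered sound location into eye-centered coordinates:

```python
# Toy 1-D coordinate bookkeeping (illustrative): auditory space is
# head-centered, visual space is eye-centered, and eye position is the
# term that converts between them.
def head_to_eye_centered(sound_azimuth_head, eye_position_head):
    """Both arguments in degrees; positive = rightward."""
    return sound_azimuth_head - eye_position_head

# A sound 20 deg right of the head, with the eyes deviated 10 deg right:
print(head_to_eye_centered(20.0, 10.0))  # 10 deg right of the line of gaze
```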

How neural representations preserve information about multiple stimuli is mysterious. Because tuning of individual neurons is coarse (for example, visual receptive field diameters can exceed perceptual resolution), the populations of neurons potentially responsive to each individual stimulus can overlap, raising the question of how information about each item might be segregated and preserved in the population. We recently reported evidence for a potential solution to this problem: when two stimuli were present, some neurons in the macaque visual cortical areas V1 and V4 exhibited fluctuating firing patterns, as if they responded to only one individual stimulus at a time.

We recently discovered a unique type of low-frequency otoacoustic emission (OAE) time-locked to the onset (and offset) of saccadic eye movements and occurring in the absence of external sound (Gruters et al., 2018). How and why these eye-movement-related eardrum oscillations (EMREOs) are generated is unknown, with a role in visual-auditory integration being the likeliest candidate.

Sensory receptive fields are large enough that they can contain more than one perceptible stimulus. How, then, can the brain encode information about each of the stimuli that may be present at a given moment? We recently showed that when more than one stimulus is present, single neurons can fluctuate between coding one vs. the other(s) across some time period, suggesting a form of neural multiplexing of different stimuli (Caruso et al., 2018).

How we distinguish multiple simultaneous stimuli is uncertain, particularly given that such stimuli sometimes recruit largely overlapping populations of neurons. One commonly proposed hypothesis is that the sharpness of tuning curves might change to limit the number of stimuli driving any given neuron when multiple stimuli are present. To test this hypothesis, we recorded the activity of neurons in the inferior colliculus while monkeys made saccades to either one or two simultaneous sounds differing in frequency and spatial location.
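
A toy illustration of the hypothesis under test (all numbers hypothetical): with Gaussian tuning, narrowing the curve's width reduces how strongly a second, off-preference stimulus drives the neuron.

```python
# Illustration of the sharpened-tuning hypothesis (values invented):
# narrowing a Gaussian tuning curve limits how many of the simultaneous
# stimuli substantially drive any given neuron.
import numpy as np

def gauss_tuning(stim, pref, width, peak=50.0):
    """Firing rate (spikes/s) for a stimulus value given Gaussian tuning."""
    return peak * np.exp(-0.5 * ((stim - pref) / width) ** 2)

stims = np.array([420.0, 840.0])      # two simultaneous sound frequencies, Hz
for width in (600.0, 150.0):          # broad vs sharpened tuning
    drive = gauss_tuning(stims, pref=500.0, width=width)
    print(f"width={width}: per-stimulus drive = {np.round(drive, 1)}")
```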

Conventional analysis of neuroscience data involves computing average neural activity over a group of trials and/or a period of time. This approach may be particularly problematic when assessing the response patterns of neurons to more than one simultaneously presented stimulus. In such cases, the brain must represent each individual component of the stimulus bundle, but trial-and-time-pooled averaging methods are fundamentally unequipped to address the means by which multi-item representation occurs.
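
A synthetic example of the problem (illustrative parameters): a neuron that switches between its two single-stimulus rates across dual-stimulus trials has the same pooled mean as a neuron that truly averages, so the two can only be told apart at the single-trial level, here via overdispersion of the count distribution.

```python
# Why trial pooling can mislead (synthetic example): a neuron responding at
# rate A on some dual-stimulus trials and rate B on others has an
# intermediate *average*, indistinguishable from a genuine averaging neuron.
import numpy as np

rng = np.random.default_rng(1)
RATE_A, RATE_B, DUR, N = 40.0, 10.0, 0.5, 2000  # invented parameters

pick_a = rng.random(N) < 0.5
switching = np.where(pick_a, rng.poisson(RATE_A * DUR, N),
                             rng.poisson(RATE_B * DUR, N))
averaging = rng.poisson((RATE_A + RATE_B) / 2 * DUR, N)

print("means:", switching.mean(), averaging.mean())    # nearly identical
print("variances:", switching.var(), averaging.var())  # switching is overdispersed
```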

Coordination between different sensory systems is a necessary element of sensory processing. Where and how signals from different sense organs converge onto common neural circuitry have become topics of increasing interest in recent years. In this article, we focus specifically on visual-auditory interactions in areas of the mammalian brain that are commonly considered to be auditory in function.

Stimulus locations are detected differently by different sensory systems, but ultimately they yield similar percepts and behavioral responses. How the brain transcends initial differences to compute similar codes is unclear. We quantitatively compared the reference frames of two sensory modalities, vision and audition, across three interconnected brain areas involved in generating saccades, namely the frontal eye fields (FEF), lateral and medial parietal cortex (M/LIP), and superior colliculus (SC).

The environment is sampled by multiple senses, which are woven together to produce a unified perceptual state. However, optimally unifying such signals requires assigning particular signals to the same or different underlying objects or events. Many prior studies (especially in animals) have assumed fusion of cross-modal information, whereas recent work in humans has begun to probe the appropriateness of this assumption.
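
For concreteness, a hedged sketch of the standard Bayesian causal-inference computation (a textbook simplification, not the specific model of any paper listed here): given noisy visual and auditory location estimates, how likely is it that they arose from a single source?

```python
# Simplified causal-inference sketch (generic textbook form, invented
# parameters): posterior probability that a visual and an auditory cue
# share one underlying source, based on how far apart they are.
import numpy as np

def p_common_cause(x_vis, x_aud, sig_vis=2.0, sig_aud=8.0,
                   sig_prior=15.0, p_common=0.5):
    var = sig_vis**2 + sig_aud**2
    # likelihood of the cue pair under one shared source...
    like_one = np.exp(-0.5 * (x_vis - x_aud)**2 / var) / np.sqrt(2 * np.pi * var)
    # ...vs a crude flat alternative for two independent sources
    like_two = 1.0 / (2 * np.pi * sig_prior**2)
    return like_one * p_common / (like_one * p_common + like_two * (1 - p_common))

print(round(p_common_cause(0.0, 5.0), 2))    # nearby cues: likely one source
print(round(p_common_cause(0.0, 40.0), 2))   # far apart: likely two sources
```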

We recently reported the existence of fluctuations in neural signals that may permit neurons to code multiple simultaneous stimuli sequentially across time [1]. This required deploying a novel statistical approach to permit investigation of neural activity at the scale of individual trials. Here we present tests using synthetic data to assess the sensitivity and specificity of this analysis.
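
A drastically simplified stand-in for that analysis (the real method is more sophisticated; all rates and trial counts here are invented) shows the shape of such a sensitivity/specificity test: simulate spike counts under a "mixture" and an "intermediate" model, classify each synthetic dataset by likelihood, and score the outcomes.

```python
# Simplified sensitivity/specificity test on synthetic data (not the
# paper's statistical method): can a likelihood comparison tell trials
# drawn from a two-rate mixture apart from a single intermediate rate?
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
LAM_A, LAM_B, N_TRIALS = 20.0, 5.0, 50   # invented per-trial count rates

def mixture_data():
    """Each trial's count comes from A's rate or B's rate, at random."""
    pick_a = rng.random(N_TRIALS) < 0.5
    return np.where(pick_a, rng.poisson(LAM_A, N_TRIALS),
                            rng.poisson(LAM_B, N_TRIALS))

def intermediate_data():
    """Every trial reflects the average of the two rates."""
    return rng.poisson((LAM_A + LAM_B) / 2, N_TRIALS)

def classify(x):
    ll_mix = np.log(0.5 * stats.poisson.pmf(x, LAM_A)
                    + 0.5 * stats.poisson.pmf(x, LAM_B)).sum()
    ll_int = stats.poisson.logpmf(x, (LAM_A + LAM_B) / 2).sum()
    return "mixture" if ll_mix > ll_int else "intermediate"

sensitivity = np.mean([classify(mixture_data()) == "mixture" for _ in range(200)])
false_alarms = np.mean([classify(intermediate_data()) == "mixture" for _ in range(200)])
print(f"sensitivity ~{sensitivity:.2f}, false-alarm rate ~{false_alarms:.2f}")
```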

Keynote by Jenny Groh (Duke University) at the 20th European Conference on Eye Movement Research (ECEM) in Alicante, 19.8.2019. Information about eye movements with respect to the head is required for reconciling visual and auditory space.

How the brain preserves information about multiple simultaneous items is poorly understood. We report that single neurons can represent multiple stimuli by interleaving signals across time. We record single units in an auditory region, the inferior colliculus, while monkeys localize 1 or 2 simultaneous sounds.

Interactions between sensory pathways such as the visual and auditory systems are known to occur in the brain, but where they first occur is uncertain. Here, we show a multimodal interaction evident at the eardrum. Ear canal microphone measurements in humans (n = 19 ears in 16 subjects) and monkeys (n = 5 ears in three subjects) performing a saccadic eye movement task to visual targets indicated that the eardrum moves in conjunction with the eye movement.
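
The core measurement logic is straightforward. A minimal sketch (hypothetical variable names and window lengths): epoch the ear-canal microphone signal around each saccade onset and average across saccades, so that a consistent eye-movement-locked oscillation survives while unrelated noise cancels.

```python
# Sketch of saccade-locked averaging (variable names hypothetical):
# cut the microphone trace into epochs around each saccade onset and
# average them; a consistent EMREO survives the average.
import numpy as np

def saccade_locked_average(mic, onset_samples, fs, pre=0.05, post=0.15):
    """Average microphone epochs from -pre to +post seconds around onsets."""
    n_pre, n_post = int(pre * fs), int(post * fs)
    epochs = [mic[i - n_pre : i + n_post] for i in onset_samples
              if i - n_pre >= 0 and i + n_post <= len(mic)]
    return np.mean(epochs, axis=0)

# Usage with fake data: fs = 48000; mic = np.random.randn(fs * 60); ...
```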

We accurately perceive the visual scene despite moving our eyes ~3 times per second, an ability that requires incorporation of eye position and retinal information. In this study, we assessed how this neural computation unfolds across three interconnected structures: frontal eye fields (FEF), intraparietal cortex (LIP/MIP), and the superior colliculus (SC). Single-unit activity was assessed in head-restrained monkeys performing visually guided saccades from different initial fixations.

Advances in drug potency and tailored therapeutics are promoting pharmaceutical manufacturing to transition from a traditional batch paradigm to more flexible continuous processing. Here we report the development of a multistep continuous-flow CGMP (current good manufacturing practices) process that produced 24 kilograms of prexasertib monolactate monohydrate suitable for use in human clinical trials. Eight continuous unit operations were conducted to produce the target at roughly 3 kilograms per day using small continuous reactors, extractors, evaporators, crystallizers, and filters in laboratory fume hoods.

Continuous processing enables the use of non-standard reaction conditions such as high temperatures and pressures while in the liquid phase. This expands the chemist's toolbox and can enable previously unthinkable chemistry to proceed with ease. For a series of amphoteric amino acid derivatives, we have demonstrated the ability to hydrolyze the tert-butyl ester functionality in protic solvent systems.

Understanding the relationship between the auditory selectivity of neurons and their contribution to perception is critical to the design of effective auditory brain prosthetics. These prosthetics seek to mimic natural activity patterns to achieve desired perceptual outcomes. We measured the contribution of inferior colliculus (IC) sites to perception using combined recording and electrical stimulation.

Saccadic eye movements can be elicited by more than one type of sensory stimulus. This implies substantial transformations of signals originating in different sense organs as they reach a common motor output pathway. In this study, we compared the prevalence and magnitude of auditory- and visually evoked activity in a structure implicated in oculomotor processing, the primate frontal eye fields (FEF).

Maps are a mainstay of visual, somatosensory, and motor coding in many species. However, auditory maps of space have not been reported in the primate brain. Instead, recent studies have suggested that sound location may be encoded via broadly responsive neurons whose firing rates vary roughly proportionately with sound azimuth.
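
A toy version of such a rate code (numbers invented): if firing rate varies roughly linearly with azimuth, location can be read out by inverting a fitted line rather than by locating a peak on a map.

```python
# Rate-code illustration (hypothetical numbers): decode sound azimuth by
# inverting a linear fit of firing rate vs azimuth, with no spatial map.
import numpy as np

azimuths = np.array([-90.0, -45.0, 0.0, 45.0, 90.0])   # degrees
rates = 30 + 0.2 * azimuths + np.random.default_rng(3).normal(0, 1, 5)

slope, intercept = np.polyfit(azimuths, rates, 1)
decoded = (rates - intercept) / slope                   # invert the fit
print(np.round(decoded))                                # ~the true azimuths
```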

A general problem in learning is how the brain determines what lesson to learn (and what lessons not to learn). For example, sound localization is a behavior that is partially learned with the aid of vision. This process requires correctly matching a visual location to that of a sound.
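
One common way to cartoon this kind of visually guided calibration (a generic delta rule, not a model from the paper): when a visual event is judged to correspond to a sound, nudge the auditory spatial estimate toward the visual location; unmatched events teach nothing.

```python
# Toy delta-rule sketch (illustrative only): visually guided recalibration
# of an auditory location estimate, gated by a match judgment.
def update_auditory_estimate(aud_deg, vis_deg, matched, lr=0.1):
    """Shift the auditory estimate toward vision only when matched."""
    return aud_deg + lr * (vis_deg - aud_deg) if matched else aud_deg

est = 12.0                           # auditory estimate, degrees (invented)
for _ in range(20):                  # repeated matched pairings at 0 deg
    est = update_auditory_estimate(est, vis_deg=0.0, matched=True)
print(round(est, 2))                 # drifts toward the visual location
```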
