Publications by authors named "John van Opstal"

We recently developed a biomimetic robotic eye with six independent tendons, each controlled by its own rotary motor, and with insertions on the eyeball that faithfully mimic the biomechanics of the human eye. We constructed an accurate physical computational model of this system, and learned to control its nonlinear dynamics by optimising a cost that penalised saccade inaccuracy, movement duration, and total energy expenditure of the motors. To speed up the calculations, the physical simulator was approximated by a recurrent neural network (NARX).
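
As an illustration of the kind of objective described above, the sketch below combines the three penalty terms (endpoint error, movement duration, motor energy) into a single weighted cost. The function name, weights, and units are assumptions made for this example, not values from the paper.

```python
import numpy as np

def saccade_cost(end_error_deg, duration_s, motor_power_w, dt=0.001,
                 w_acc=1.0, w_dur=0.1, w_energy=0.01):
    """Weighted cost of one simulated saccade (illustrative weights).

    end_error_deg : angular distance between gaze endpoint and target (deg)
    duration_s    : movement duration (s)
    motor_power_w : instantaneous power drawn by the six motors, shape (T, 6)
    """
    accuracy_term = end_error_deg ** 2            # penalise saccade inaccuracy
    duration_term = duration_s                    # penalise slow movements
    energy_term = np.sum(motor_power_w) * dt      # total energy of all motors (J)
    return w_acc * accuracy_term + w_dur * duration_term + w_energy * energy_term
```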

Article Synopsis
  • The study investigated post-saccadic oscillations (PSOs) in individuals with age-related macular degeneration (AMD), retinitis pigmentosa (RP), and those with normal vision to understand differences in eye movement stability.
  • Participants' gaze was measured during a horizontal saccade task, and PSO characteristics such as amplitude, decay time, and frequency were analyzed using a damped oscillation model (see the fitting sketch after this list).
  • Results showed that those with vision loss exhibited larger oscillation amplitudes and longer decay times compared to normal vision participants, indicating that abnormal PSOs contribute to reduced fixation stability in AMD and RP.
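
The following is a minimal sketch of fitting such a damped oscillation model to post-saccadic gaze samples with SciPy; the parameterization and initial guesses are illustrative assumptions, not the study's exact fitting procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

def damped_oscillation(t, amp, tau, freq, phase, offset):
    """Damped sinusoid: amplitude decays with time constant tau, oscillates at freq (Hz)."""
    return amp * np.exp(-t / tau) * np.sin(2 * np.pi * freq * t + phase) + offset

def fit_pso(t, gaze):
    """Fit PSO parameters to gaze-position samples following saccade offset.

    t    : time (s) relative to saccade offset
    gaze : horizontal gaze position (deg)
    Returns (amplitude, decay time, frequency, phase, offset).
    """
    p0 = [0.5, 0.02, 20.0, 0.0, gaze[-1]]   # rough initial guess: 0.5 deg, 20 ms, 20 Hz
    popt, _ = curve_fit(damped_oscillation, t, gaze, p0=p0)
    return popt
```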

Human speech and vocalizations in animals are rich in joint spectrotemporal (S-T) modulations, wherein acoustic changes in both frequency and time are functionally related. In principle, the primate auditory system could process these complex dynamic sounds based on either an inseparable representation of S-T features or, alternatively, a separable representation. The separability hypothesis implies an independent processing of spectral and temporal modulations.
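
A minimal numerical illustration of what separability means: under the separability hypothesis, the joint spectrotemporal sensitivity factorizes into the product of a purely spectral and a purely temporal profile, which can be checked with a rank-one (SVD) test. All numbers below are made up for illustration.

```python
import numpy as np

# Hypothetical sensitivities on a small grid of temporal rates (Hz) and spectral
# densities (cycles/octave); the values are illustrative only.
mtf_t = np.array([0.9, 1.0, 0.4])   # temporal-modulation sensitivity at 2, 8, 32 Hz
mtf_s = np.array([0.8, 1.0, 0.3])   # spectral-modulation sensitivity at 0.25, 1, 4 cyc/oct

# Separable representation: joint sensitivity is the outer product of the two marginals,
# i.e. spectral and temporal modulations are processed independently.
mtf_joint = np.outer(mtf_s, mtf_t)

# A separable (rank-one) joint MTF puts all its energy in the first singular value;
# an inseparable representation does not, so an SVD rank test distinguishes the two.
s = np.linalg.svd(mtf_joint, compute_uv=False)
print(s[0] ** 2 / np.sum(s ** 2))   # 1.0 for a perfectly separable MTF
```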


The midbrain superior colliculus is a crucial sensorimotor stage for programming and generating saccadic eye-head gaze shifts. Although it is well established that superior colliculus cells encode a neural command that specifies the amplitude and direction of the upcoming gaze-shift vector, there is controversy about the role of the firing-rate dynamics of these neurons during saccades. In our earlier work, we proposed a simple quantitative model that explains how the recruited superior colliculus population may specify the detailed kinematics (trajectories and velocity profiles) of head-restrained saccadic eye movements.


A cochlear implant (CI) is a neurotechnological device that restores hearing in cases of total sensorineural hearing loss. It contains a sophisticated speech processor that analyzes the acoustic input and distributes its time-enveloped spectral content to the auditory nerve as trains of electrical pulses on selected frequency channels of a multi-contact electrode array that is surgically inserted into the cochlear duct.
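
A minimal sketch of the analysis stage described above (a bandpass filterbank with per-channel envelope extraction), assuming SciPy filters and hypothetical band edges; a real CI processor additionally performs channel selection, compression, and pulse-train generation.

```python
import numpy as np
from scipy.signal import butter, sosfilt, sosfiltfilt

def ci_envelopes(sound, fs, band_edges_hz, env_cutoff_hz=200.0):
    """Split the signal into frequency channels and extract each channel's temporal
    envelope -- the signal that would modulate the electrical pulse train on the
    corresponding electrode contact.

    band_edges_hz : list of (low, high) band edges per channel (hypothetical values)
    """
    envelopes = []
    for lo, hi in band_edges_hz:
        sos_band = butter(4, [lo, hi], btype='bandpass', fs=fs, output='sos')
        channel = sosfilt(sos_band, sound)
        rectified = np.abs(channel)                          # rectification
        sos_env = butter(2, env_cutoff_hz, btype='lowpass', fs=fs, output='sos')
        envelopes.append(sosfiltfilt(sos_env, rectified))    # smoothed envelope
    return np.array(envelopes)

# Example: 8 channels spanning roughly 200 Hz - 8 kHz (illustrative edges)
# edges = list(zip(np.geomspace(200, 8000, 9)[:-1], np.geomspace(200, 8000, 9)[1:]))
```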


Purpose: Most eye-movement studies in patients with visual field defects have examined the strategies that patients use while exploring a visual scene, but they have not investigated saccade kinematics. In healthy vision, saccade trajectories follow the remarkably stereotyped "main sequence": saccade duration increases linearly with saccade amplitude; peak velocity also increases linearly for small amplitudes, but approaches a saturation limit for large amplitudes. Recent theories propose that these relationships reflect the brain's attempt to optimize vision when planning eye movements.
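
For reference, the main-sequence relationships mentioned here are commonly summarized with a saturating-exponential fit for peak velocity and a linear fit for duration; the sketch below uses typical textbook parameter values, not results from this study.

```python
import numpy as np

def main_sequence(amplitude_deg, v_max=500.0, a0=15.0, t0=0.025, slope=0.0025):
    """Descriptive fits of the saccadic main sequence.

    Peak velocity saturates with amplitude: V = v_max * (1 - exp(-A / a0))
    Duration grows roughly linearly:        T = t0 + slope * A
    Parameter values are typical textbook numbers, not fits from this study.
    """
    a = np.asarray(amplitude_deg, dtype=float)
    peak_velocity = v_max * (1.0 - np.exp(-a / a0))   # deg/s
    duration = t0 + slope * a                          # s
    return peak_velocity, duration
```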


Many cochlear implant users with binaural residual (acoustic) hearing benefit from combining electric and acoustic stimulation (EAS) in the implanted ear with acoustic amplification in the other. These bimodal EAS listeners can potentially use low-frequency binaural cues to localize sounds. However, their hearing is generally asymmetric for mid- and high-frequency sounds, perturbing or even abolishing binaural cues.


Introduction: To reorient gaze (the eye's direction in space) towards a target is an overdetermined problem, as infinitely many combinations of eye and head movements can specify the same gaze-displacement vector. Yet, behavioral measurements show that the primate gaze-control system selects a specific contribution of eye and head movements to the saccade, which depends on the initial eye-in-head orientation. Single-unit recordings in the primate superior colliculus (SC) during head-unrestrained gaze shifts have further suggested that cells may encode the instantaneous trajectory of a desired straight gaze path in a feedforward way by the total cumulative number of spikes in the neural population, and that the instantaneous gaze kinematics are thus determined by the neural firing rates.


We tested whether sensitivity to acoustic spectrotemporal modulations can be observed from reaction times for normal-hearing and impaired-hearing conditions. In a manual reaction-time task, normal-hearing listeners had to detect the onset of a ripple (with a density between 0 and 8 cycles/octave and a fixed modulation depth of 50%) that moved up or down the log-frequency axis at constant velocity (between 0 and 64 Hz) in otherwise unmodulated broadband white noise. Spectral and temporal modulations elicited band-pass filtered sensitivity characteristics, with the fastest detection around 1 cycle/octave and 32 Hz for normal-hearing conditions.
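
A sketch of one standard way to synthesize such a moving ripple, using a dense tone complex as the broadband carrier with a sinusoidal spectrotemporal envelope; the defaults are illustrative, and the study's exact stimulus construction (a ripple embedded in white noise) may differ.

```python
import numpy as np

def moving_ripple(duration=1.0, fs=48000, f0=250.0, n_oct=5, n_tones=128,
                  density=1.0, velocity=32.0, mod_depth=0.5):
    """Spectrotemporal ripple: many tones whose amplitudes follow a sinusoidal
    envelope drifting along the log-frequency axis.

    density   : ripple density in cycles/octave
    velocity  : ripple drift velocity in Hz (sign sets up/down direction)
    mod_depth : modulation depth (0.5 = 50%)
    """
    t = np.arange(int(duration * fs)) / fs
    x = np.linspace(0.0, n_oct, n_tones)             # tone positions in octaves above f0
    freqs = f0 * 2.0 ** x
    phases = 2 * np.pi * np.random.rand(n_tones)     # random carrier phases
    sound = np.zeros_like(t)
    for xi, fi, ph in zip(x, freqs, phases):
        env = 1.0 + mod_depth * np.sin(2 * np.pi * (velocity * t + density * xi))
        sound += env * np.sin(2 * np.pi * fi * t + ph)
    return sound / np.max(np.abs(sound))
```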


Previous studies have indicated that the location of a large neural population in the Superior Colliculus (SC) motor map specifies the amplitude and direction of the saccadic eye-movement vector, while the saccade trajectory and velocity profile are encoded by the population firing rates. We recently proposed a simple spiking neural network model of the SC motor map, based on linear summation of individual spike effects of each recruited neuron, which accounts for many of the observed properties of SC cells in relation to the ensuing eye movement. However, in the model, the cortical input was kept invariant across different saccades.
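
A minimal sketch of the linear spike-summation readout described above: each spike adds a small fixed contribution along its neuron's map vector, so the desired displacement at any moment equals the cumulative spike counts weighted by those vectors. The data layout and scaling are assumptions for illustration.

```python
import numpy as np

def desired_displacement(spike_times, site_vectors, t, scale=1.0):
    """Linear-summation readout of the SC population.

    spike_times  : list over neurons of arrays of spike times (s)
    site_vectors : (N, 2) array with each neuron's preferred (horizontal, vertical)
                   displacement contribution in deg per spike -- a model assumption
    Returns the desired eye displacement at time t as the sum over neurons of
    (cumulative spike count up to t) x (site vector).
    """
    displacement = np.zeros(2)
    for spikes, m in zip(spike_times, site_vectors):
        n_spikes = np.count_nonzero(np.asarray(spikes) <= t)   # cumulative count up to t
        displacement += scale * n_spikes * m
    return displacement
```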


Several studies have proposed that an optimal speed-accuracy tradeoff underlies the stereotyped relationship between amplitude, duration, and peak velocity of saccades (the main sequence). To test this theory, we asked eight participants to make saccades to Gaussian-blurred spots and manipulated the task's accuracy constraints by varying target size (1, 3, and 5°). The largest targets indeed yielded more endpoint scatter (and lower gains) than the smallest targets, although this effect subsided with target eccentricity.


Purpose: Speech understanding in noise and horizontal sound localization are poor in most cochlear implant (CI) users with a hearing aid (bimodal stimulation). This study investigated the effect of static and less-extreme adaptive frequency compression in hearing aids on spatial hearing. By means of frequency compression, we aimed to restore high-frequency audibility, and thus improve sound localization and spatial speech recognition.
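
A minimal sketch of static frequency compression as a frequency-to-frequency mapping: components above a cutoff are compressed toward it by a fixed ratio. The cutoff and ratio below are illustrative, not the fitted hearing-aid settings used in the study.

```python
def compress_frequency(f_hz, cutoff_hz=1500.0, ratio=2.0):
    """Static frequency compression as used in frequency-lowering hearing aids:
    input frequencies above the cutoff are mapped closer to it by a fixed
    compression ratio; frequencies below the cutoff are left unchanged.
    Cutoff and ratio are illustrative assumptions.
    """
    if f_hz <= cutoff_hz:
        return f_hz
    return cutoff_hz + (f_hz - cutoff_hz) / ratio

# Example: a 6000 Hz component maps to 1500 + 4500/2 = 3750 Hz, bringing
# high-frequency cues back into the listener's audible range.
```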


The cochlear implant (CI) allows profoundly deaf individuals to partially recover hearing. Still, due to the coarse acoustic information provided by the implant, CI users have considerable difficulties in recognizing speech, especially in noisy environments. CI users therefore rely heavily on visual cues to augment speech recognition, more so than normal-hearing individuals.


An interesting problem for the human saccadic eye-movement system is how to deal with the degrees-of-freedom problem: the six extra-ocular muscles provide three rotational degrees of freedom, while only two are needed to point gaze in any direction. Measurements show that 3D eye orientations during head-fixed saccades in far-viewing conditions lie in Listing's plane (LP), in which the eye's cyclotorsion is zero (Listing's law, LL). Moreover, while saccades are executed as single-axis rotations around a stable eye-angular-velocity axis, they follow straight trajectories in LP.
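
A worked example of Listing's law in rotation-vector form: the orientation reached from the primary position by a single-axis rotation whose axis lies in Listing's plane has zero torsional component. The coordinate convention (x-axis as the primary, straight-ahead direction) is an assumption made for this sketch.

```python
import numpy as np

def listing_rotation_vector(gaze_dir, primary_dir=(1.0, 0.0, 0.0)):
    """Rotation vector r = tan(theta/2) * axis of the eye orientation that obeys
    Listing's law: the single-axis rotation from the primary direction to the gaze
    direction about an axis perpendicular to the primary direction (zero cyclotorsion).
    """
    p = np.array(primary_dir, dtype=float)
    g = np.array(gaze_dir, dtype=float)
    p /= np.linalg.norm(p)
    g /= np.linalg.norm(g)
    # Shortest-arc rotation taking p to g; its axis lies in the plane orthogonal to p,
    # i.e. in Listing's plane, so the torsional component of r (along p) is zero.
    return np.cross(p, g) / (1.0 + np.dot(p, g))
```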


Although moving sound-sources abound in natural auditory scenes, it is not clear how the human brain processes auditory motion. Previous studies have indicated that, although ocular localization responses to stationary sounds are quite accurate, ocular smooth pursuit of moving sounds is very poor. We here demonstrate that human subjects faithfully track a sound's unpredictable movements in the horizontal plane with smooth-pursuit responses of the head.


The latency of the auditory steady-state response (ASSR) may provide valuable information regarding the integrity of the auditory system, as it could potentially reveal the presence of multiple intracerebral sources. To estimate multiple latencies from high-order ASSRs, we propose a novel two-stage procedure that consists of a nonparametric estimation method, called apparent latency from phase coherence (ALPC), followed by a heuristic sequential forward selection algorithm (SFS). Compared with existing methods, ALPC-SFS requires few prior assumptions, and is straightforward to implement for higher-order nonlinear responses to multi-cosine sound complexes with their initial phases set to zero.
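
A minimal single-latency sketch of the apparent-latency idea behind ALPC: a pure transmission delay makes the response phase decrease linearly with frequency, so the latency follows from the slope of (unwrapped) phase versus frequency. The full ALPC-SFS procedure of the study additionally selects frequency subsets to recover multiple latencies.

```python
import numpy as np

def apparent_latency(freqs_hz, phases_rad):
    """Estimate a single apparent latency from (A)SSR components.

    For a pure delay tau, phase(f) = phi0 - 2*pi*f*tau, so tau follows from the
    slope of the unwrapped phase as a function of frequency.
    freqs_hz   : response frequencies of the analysed components
    phases_rad : measured (wrapped) phases at those frequencies
    """
    f = np.asarray(freqs_hz, dtype=float)
    order = np.argsort(f)
    phases = np.unwrap(np.asarray(phases_rad, dtype=float)[order])
    slope, _ = np.polyfit(f[order], phases, 1)
    return -slope / (2.0 * np.pi)   # latency in seconds
```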


To program a goal-directed response in the presence of acoustic reflections, the audio-motor system should suppress the detection of time-delayed sources. We examined the effects of spatial separation and interstimulus delay on the ability of human listeners to localize a pair of broadband sounds in the horizontal plane. Participants indicated how many sounds were heard and where these were perceived by making one or two head-orienting localization responses.


Several studies have demonstrated the advantages of bilateral vs. unilateral cochlear implantation in listeners with bilateral severe-to-profound hearing loss. However, it remains unclear to what extent bilaterally implanted listeners have access to binaural cues, e.g., interaural differences in sound level and arrival time.


Congenital unilateral conductive hearing loss (UCHL) jeopardizes directional hearing and speech perception in noisy conditions. Potentially, children with congenital UCHL can benefit from fitting a hearing device, such as a bone-conduction device (BCD). However, the literature reports limited benefit from fitting a BCD and, surprisingly, often relatively good sound localization in the unaided condition.


We assessed how synchronous speech listening and lipreading affect speech recognition in acoustic noise. In simple audiovisual perceptual tasks, inverse effectiveness is often observed: the weaker the unimodal stimuli, or the poorer their signal-to-noise ratio, the stronger the audiovisual benefit. So far, however, inverse effectiveness has not been demonstrated for complex audiovisual speech stimuli.


Sound localization in the horizontal plane (azimuth) relies mainly on binaural difference cues in sound level and arrival time. Blocking one ear will perturb these cues, and may strongly affect azimuth performance of the listener. However, single-sided deaf listeners, as well as acutely single-sided plugged normal-hearing subjects, often use a combination of (ambiguous) monaural head-shadow cues, impoverished binaural level-difference cues, and (veridical, but limited) pinna- and head-related spectral cues to estimate source azimuth.
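
For reference, the timing cue mentioned here is often approximated with the classic spherical-head (Woodworth) formula; the sketch below is a textbook illustration of how the interaural time difference grows with source azimuth, not the model used in the study.

```python
import numpy as np

def woodworth_itd(azimuth_deg, head_radius_m=0.0875, c=343.0):
    """Spherical-head (Woodworth) approximation of the interaural time difference
    for a far-field source: ITD ~ (a/c) * (sin(theta) + theta).
    Head radius and speed of sound are typical textbook values.
    """
    theta = np.deg2rad(azimuth_deg)
    return (head_radius_m / c) * (np.sin(theta) + theta)   # seconds

# Example: a source at 90 deg azimuth yields roughly 0.66 ms,
# close to the maximum ITD for an average human head.
```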


Single-unit recordings in head-restrained monkeys indicated that the population of saccade-related cells in the midbrain Superior Colliculus (SC) encodes the kinematics of desired straight saccade trajectories by the cumulative number of spikes. In addition, the nonlinear main sequence of saccades (their amplitude-peak velocity saturation) emerges from a spatial gradient of peak-firing rates of collicular neurons, rather than from neural saturation at brainstem burst generators. We here extend this idea to eye-head gaze shifts and illustrate how the cumulative spike-count in head-unrestrained monkeys relates to the desired gaze trajectory and its kinematics.


The superior colliculus (SC) generates saccades by recruiting a population of cells in its topographically organized motor map. Supra-threshold electrical stimulation in the SC produces a normometric saccade with little effect of the stimulation parameters. Moreover, the kinematics of electrically evoked saccades strongly resemble those of natural, visually evoked saccades.


This study describes the sound localization and speech-recognition-in-noise abilities of a cochlear-implant user with electro-acoustic stimulation (EAS) in one ear and a hearing aid in the contralateral ear. This listener had low-frequency residual hearing (up to 250 Hz) within the normal range in both ears. The objective was to determine how hearing devices affect spatial hearing for an individual with substantial unaided low-frequency residual hearing.


Bilateral cochlear-implant (CI) users and single-sided deaf listeners with a CI are less effective at localizing sounds than normal-hearing (NH) listeners. This performance gap is due to the degradation of binaural and monaural sound localization cues, caused by a combination of device-related and patient-related issues. In this study, we targeted the device-related issues by measuring sound localization performance of 11 NH listeners, listening to free-field stimuli processed by a real-time CI vocoder.
