Publications by authors named "Ayse Pinar Saygin"

Identifying the movements of those around us is fundamental to many daily activities, such as recognizing actions, detecting predators, and interacting with others socially. A key question concerns the neurobiological substrates underlying biological motion perception. Although the ventral "form" visual cortex is reliably activated by biological motion stimuli, whether these activations are functionally critical for biological motion perception or merely epiphenomenal remains unknown.

Previous research indicates that motion-sensitive brain regions are engaged when comprehending motion semantics expressed by words or sentences. Using fMRI, we investigated whether such neural modulation can occur when the linguistic signal itself is visually dynamic and motion semantics is expressed by movements of the hands. Deaf and hearing users of American Sign Language (ASL) were presented with signed sentences that conveyed motion semantics ("The deer walked along the hillside.

We used a novel stimulus set of human and robot actions to explore the role of humanlike appearance and motion in action prediction. Participants viewed videos of familiar actions performed by three agents: a human, an android, and a robot, the former two sharing humanlike appearance and the latter two sharing nonhuman motion. In each trial, the video was occluded for 400 ms.

Using MRI-guided offline TMS, we targeted two areas implicated in biological motion processing: ventral premotor cortex (PMC) and the posterior superior temporal sulcus (pSTS), plus a control site (vertex). Participants performed a detection task on noise-masked point-light displays of human actions and on scrambled versions of the same stimuli. Perceptual thresholds were determined individually.
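
The excerpt does not say how those thresholds were obtained; a common choice for detection tasks of this kind is an adaptive staircase. Below is a minimal Python sketch of a 2-down-1-up staircase, which converges near 70.7% correct; the rule, step size, and stopping criterion are illustrative assumptions, not taken from the paper.

    # Hypothetical 2-down-1-up staircase for a detection threshold, e.g. the
    # noise level at which point-light displays are just detectable.
    def run_staircase(respond, level=0.5, step=0.05, max_reversals=10):
        """respond(level) -> True if the observer detected the stimulus."""
        reversal_levels, streak, last_direction = [], 0, 0
        while len(reversal_levels) < max_reversals:
            if respond(level):
                streak += 1
                if streak < 2:
                    continue                  # same level until 2 in a row
                streak, direction = 0, -1     # two correct: make it harder
            else:
                streak, direction = 0, +1     # one error: make it easier
            if last_direction and direction != last_direction:
                reversal_levels.append(level)     # direction flip = reversal
            last_direction = direction
            level = min(max(level + direction * step, 0.0), 1.0)
        return sum(reversal_levels) / len(reversal_levels)

    # Toy observer whose detection probability grows with signal level:
    import random
    threshold = run_staircase(lambda s: random.random() < 0.5 + 0.5 * s)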

Using functional magnetic resonance imaging (fMRI) repetition suppression, we explored the selectivity of the human action perception system (APS), which comprises temporal, parietal, and frontal areas, for the appearance and/or motion of the perceived agent. Participants watched body movements of a human (biological appearance and movement), a robot (mechanical appearance and movement), or an android (biological appearance, mechanical movement). With the exception of the extrastriate body area, which showed more suppression for humanlike appearance, the APS was not selective for appearance or motion per se.

Using functional MRI, we investigated whether auditory processing of both speech and meaningful non-linguistic environmental sounds in superior and middle temporal cortex relies on a complex and spatially distributed neural system. We found evidence for spatially distributed processing of speech and environmental sounds across a substantial extent of temporal cortex. Most importantly, regions previously reported as selective for speech over environmental sounds also contained distributed information.
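
"Distributed information" here is the kind of signal a multivoxel pattern analysis detects. A minimal sketch of the idea with scikit-learn and made-up data (the study's actual pipeline may differ): if a cross-validated classifier can tell speech from environmental sounds using the pattern of activity across voxels in a region, the region carries information about the distinction even if its average response does not differ between categories.

    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import LinearSVC

    # Toy stand-in for multivoxel patterns: 40 trials x 200 voxels,
    # labels 0 = speech, 1 = environmental sound.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 200))
    y = np.repeat([0, 1], 20)
    X[y == 1, :10] += 0.8        # weak signal spread over a few voxels

    # Above-chance cross-validated accuracy = distributed information.
    acc = cross_val_score(LinearSVC(), X, y, cv=5).mean()
    print(f"decoding accuracy: {acc:.2f}")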

Background: Perception of biological motion is linked to the action perception system in the human brain, abnormalities within which have been suggested to underlie the impairments in social domains observed in autism spectrum conditions (ASC). However, the literature on biological motion perception in ASC is heterogeneous, and it is unclear whether deficits are specific to biological motion or generalize to form-from-motion perception.

Methodology and Principal Findings: We compared psychophysical thresholds for both biological and non-biological form-from-motion perception in adults with ASC and controls.

The human visual system must perform complex visuospatial extrapolations (VSE) across space and time in order to extract shape and form from the retinal projection of a cluttered visual environment characterized by occluded surfaces and moving objects. Even if we exclude the temporal dimension, for instance when judging whether an extended finger is pointing towards one object or another, the mechanisms of VSE remain opaque. Here we investigated the neural correlates of VSE using functional magnetic resonance imaging in sixteen human observers while they judged the relative position of, or saccaded to, a (virtual) target defined by the extrapolated path of a pointer.

To examine how young children recognize the association between two different types of meaningful sounds and their visual referents, we compared 15-, 20-, and 25-month-old infants' looking-time responses to familiar naturalistic environmental sounds (e.g., the sound of a dog barking) and their empirically matched verbal descriptions (e.

Can linguistic semantics affect neural processing in feature-specific visual regions? Specifically, when we hear a sentence describing a situation that includes motion, do we engage neural processes that are part of the visual perception of motion? And what if a motion verb is used figuratively rather than literally? We used fMRI to investigate whether semantic content can "penetrate" and modulate neural populations that are selective to specific visual properties during natural language comprehension. Participants were presented audiovisually with three kinds of sentences: motion sentences ("The wild horse crossed the barren field."), static sentences ("The black horse stood in the barren field.

We report the case of patient M, who suffered unilateral left posterior temporal and parietal damage, brain regions typically associated with language processing. Language function has largely recovered since the infarct, with no measurable speech comprehension impairments. However, the patient exhibited a severe impairment in nonverbal auditory comprehension.

We compared psychophysical thresholds for biological and non-biological motion detection in adults with autism spectrum conditions (ASCs) and controls. Participants watched animations of a biological stimulus (a moving hand) or a non-biological stimulus (a falling tennis ball). The velocity profile of the movement was varied between 100% natural motion (minimum-jerk (MJ) for the hand; gravitational (G) for the ball) and 100% constant velocity (CV).
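
One way to read the morph, sketched below in Python (an assumption made for illustration; the paper may have parameterized the blend differently), is a linear mix of the minimum-jerk velocity profile with a flat constant-velocity profile. Linear blending conveniently preserves the distance travelled.

    import numpy as np

    def morphed_velocity(w, T=1.0, D=1.0, n=200):
        """Velocity profile: w * minimum-jerk + (1 - w) * constant velocity.

        w=1.0 is 100% natural (MJ) motion, w=0.0 is 100% CV; any linear
        blend still covers distance D in time T."""
        tau = np.linspace(0.0, 1.0, n)                     # normalized time
        v_mj = (30.0 * D / T) * tau**2 * (1.0 - tau)**2    # bell-shaped MJ
        v_cv = np.full_like(tau, D / T)                    # flat CV
        return tau * T, w * v_mj + (1.0 - w) * v_cv

    # e.g. a 50% morph, halfway between natural and constant velocity:
    t, v = morphed_velocity(w=0.5)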

Observers judged whether a periodically moving visual display (point-light walker) had the same temporal frequency as a series of auditory beeps that in some cases coincided with the apparent footsteps of the walker. Performance in this multisensory judgment was consistently better for upright point-light walkers than for inverted point-light walkers or scrambled control stimuli, even though the temporal information was the same in the three types of stimuli. The advantage with upright walkers disappeared when the visual "footsteps" were not phase-locked with the auditory events (and instead offset by 50% of the gait cycle).

To clarify how the processing of verbal information differs from the processing of meaningful non-verbal information, the present study characterized developmental changes in neural responses to words and environmental sounds from pre-adolescence (7-9 years) through adolescence (12-14 years) to adulthood (18-25 years). Children's and adults' behavioral and electrophysiological responses (the N400 effect of event-related potentials) were compared during the processing of words and environmental sounds presented in semantically matching and mismatching picture contexts. Behavioral accuracy of picture-sound matching improved until adulthood, while reaction time measures leveled out by age 12.

Novel mapping stimuli composed of biological motion figures were used to study the extent and layout of multiple retinotopic regions in the entire human brain and to examine the independent manipulation of retinotopic responses by visual stimuli and by attention. A number of areas exhibited retinotopic activations, including full or partial visual field representations in occipital cortex, the precuneus, motion-sensitive temporal cortex (extending into the superior temporal sulcus), the intraparietal sulcus, and the vicinity of the frontal eye fields in frontal cortex. Early visual areas showed mainly stimulus-driven retinotopy; parietal and frontal areas were driven primarily by attention; and lateral temporal regions could be driven by both.

We tested biological motion perception in a large group of unilateral stroke patients (N = 60). Both right- and left-hemisphere-lesioned patients were significantly impaired compared with age-matched controls. Voxel-based lesion analyses revealed that lesions in superior temporal and premotor frontal areas had the greatest effect on biological motion perception.
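
The core of a voxel-based lesion analysis can be sketched in a few lines of Python (toy data; real analyses use spatially registered lesion masks and a multiple-comparisons correction): at each voxel, compare the behavioral scores of patients whose lesions include that voxel with the scores of patients whose lesions spare it.

    import numpy as np
    from scipy.stats import ttest_ind

    # Toy inputs: lesions[p, v] = True if patient p's lesion covers voxel v;
    # scores[p] = patient p's biological motion perception score.
    rng = np.random.default_rng(1)
    n_patients, n_voxels = 60, 1000
    lesions = rng.random((n_patients, n_voxels)) < 0.2
    scores = rng.normal(size=n_patients)

    t_map = np.full(n_voxels, np.nan)
    for v in range(n_voxels):
        hit, spared = scores[lesions[:, v]], scores[~lesions[:, v]]
        if hit.size >= 5 and spared.size >= 5:    # skip rarely lesioned voxels
            t_map[v] = ttest_ind(spared, hit).statistic   # > 0: lesion hurts

    # A real analysis then thresholds t_map with a correction for the
    # thousands of voxel-wise tests (e.g., permutation or FDR).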

We assessed brain areas involved in speech production using a recently developed lesion-symptom mapping method (voxel-based lesion-symptom mapping, VLSM) in 50 aphasic patients with left-hemisphere lesions. Conversational speech was collected through a standardized biographical interview and used to determine mean length of utterance in morphemes (MLU), type-token ratio (TTR), and overall tokens spoken for each patient. These metrics were used as indicators of grammatical complexity, semantic variation, and amount of speech, respectively.
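
The three metrics are straightforward to compute once a transcript is segmented; here is a rough Python sketch (morpheme boundaries are pre-annotated with "-", an assumption made for illustration; real morphological coding is more involved):

    def speech_metrics(utterances):
        """utterances: list of strings with morpheme boundaries marked by '-',
        e.g. 'the dog-s walk-ed home'. Returns (MLU, TTR, tokens)."""
        words = [w for u in utterances for w in u.split()]
        tokens = len(words)                                # amount of speech
        ttr = len(set(words)) / tokens                     # semantic variation
        morphemes = sum(1 + w.count("-") for w in words)
        mlu = morphemes / len(utterances)                  # grammatical complexity
        return mlu, ttr, tokens

    mlu, ttr, tokens = speech_metrics(["the dog-s walk-ed home", "he fell"])
    # -> MLU 4.0 (8 morphemes / 2 utterances), TTR 1.0 (6 types / 6 tokens), 6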

We used functional magnetic resonance imaging (fMRI) in conjunction with a voxel-based approach to lesion-symptom mapping to quantitatively evaluate the similarities and differences between brain areas involved in language and environmental sound comprehension. In general, we found that language and environmental sounds recruit highly overlapping cortical regions, with cross-domain differences being graded rather than absolute. Within language-based regions of interest, we found that in the left hemisphere, language and environmental sound stimuli evoked very similar volumes of activation, whereas in the right hemisphere, there was greater activation for environmental sound stimuli.

Cortical surface-based analysis of fMRI data has proven to be a useful method with several advantages over three-dimensional (3D) volumetric analyses. Many of the statistical methods used in 3D analyses can be adapted for use with surface-based analyses. Operating within the framework of the FreeSurfer software package, we have implemented a surface-based version of the cluster size exclusion method used for multiple comparisons correction.
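
The logic of cluster size exclusion carries over to a surface by replacing the 3D voxel neighborhood with mesh adjacency. A minimal Python sketch of that logic, not FreeSurfer's implementation (which builds its null distribution by Monte Carlo simulation of smoothed noise on the surface): threshold the vertex-wise statistic map, find connected clusters over the mesh, and keep only clusters larger than the null distribution's critical size.

    import numpy as np

    def suprathreshold_cluster_sizes(stat, neighbors, thresh):
        """Sizes of connected clusters where stat > thresh; neighbors[v]
        lists the mesh vertices sharing an edge with vertex v."""
        supra = set(np.flatnonzero(stat > thresh).tolist())
        sizes, seen = [], set()
        for seed in supra:
            if seed in seen:
                continue
            stack, size = [seed], 0
            while stack:                      # flood-fill one cluster
                v = stack.pop()
                if v in seen or v not in supra:
                    continue
                seen.add(v)
                size += 1
                stack.extend(neighbors[v])
            sizes.append(size)
        return sizes

    def cluster_size_cutoff(null_maps, neighbors, thresh, alpha=0.05):
        """Exclude clusters smaller than the (1 - alpha) quantile of the
        largest cluster found in each null (simulated/permuted) map."""
        max_sizes = [max(suprathreshold_cluster_sizes(m, neighbors, thresh),
                         default=0) for m in null_maps]
        return np.quantile(max_sizes, 1.0 - alpha)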

Contrasting linguistic and nonlinguistic processing has been of interest to many researchers with different scientific, theoretical, or clinical questions. However, previous work on this type of comparative analysis and experimentation has been limited. In particular, little is known about the differences and similarities between the perceptual, cognitive, and neural processing of nonverbal environmental sounds and that of speech sounds.

The utility of single-case vs. group studies has been debated in neuropsychology for many years. The purpose of the present study is to illustrate an alternative approach to group studies of aphasia, in which the same symptom dimensions that are commonly used to assign patients to classical taxonomies (fluency, naming, repetition, and comprehension) are used as independent and continuous predictors in a multivariate design, without assigning patients to syndromes.
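
In practice, "continuous predictors instead of syndrome labels" amounts to a multiple regression. An illustrative Python sketch with statsmodels and simulated data (variable names and effect sizes are made up): the four symptom dimensions jointly predict an outcome measure, with no patient ever assigned to a categorical syndrome.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 80
    # Columns: fluency, naming, repetition, comprehension (z-scored scores)
    X = rng.normal(size=(n, 4))
    outcome = X @ np.array([0.5, 0.3, 0.1, 0.4]) + rng.normal(scale=0.5, size=n)

    # One coefficient per symptom dimension; no taxonomic grouping needed.
    fit = sm.OLS(outcome, sm.add_constant(X)).fit()
    print(fit.params)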

We tested aphasic patients' comprehension of actions to examine processing deficits in the linguistic and non-linguistic domains and their lesion correlates. Twenty-nine left-hemisphere-injured patients and 18 age-matched control subjects matched pictured actions (with the objects missing) or their linguistic equivalents (printed sentences with the object missing) to one of two visually presented pictures of objects. Aphasic patients performed poorly not only in the linguistic domain but also in the non-linguistic domain.

Motion cues can be surprisingly powerful in defining objects and events. Specifically, a handful of point-lights attached to the joints of a human actor will evoke a vivid percept of action when the body is in motion. The perception of point-light biological motion activates posterior cortical areas of the brain.

To examine the role of motor areas in speech perception, we carried out a functional magnetic resonance imaging (fMRI) study in which subjects listened passively to monosyllables and produced the same speech sounds. Listening to speech activated bilaterally a superior portion of ventral premotor cortex that largely overlapped a speech production motor area centered just posteriorly on the border of Brodmann areas 4a and 6, which we distinguished from a more ventral speech production area centered in area 4p. Our findings support the view that the motor system is recruited in mapping acoustic inputs to a phonetic code.
