Publications by authors named "Chris Brozdowski"

Scientific progress depends on reliable and reproducible results. Progress can also be accelerated when data are shared and re-analyzed to address new questions. Current approaches to storing and analyzing neural data typically involve bespoke formats and software that make replication, as well as the subsequent reuse of data, difficult if not impossible.

For sign languages, transitional movements of the hands are fully visible and may be used to predict upcoming linguistic input. We investigated whether and how deaf signers and hearing nonsigners use transitional information to detect a target item in a string of either pseudosigns or grooming gestures, as well as whether motor imagery ability was related to this skill. Transitional information between items was either intact (Normal videos), digitally altered so that the hands were selectively blurred (Blurred videos), or edited to show only the frame prior to the transition, frozen for the entire transition period, thereby removing all transitional information (Static videos).

In American Sign Language (ASL), spatial relationships are conveyed by the location of the hands in space, whereas English employs prepositional phrases. Using event-related fMRI, we examined comprehension of perspective-dependent (PD) (left, right) and perspective-independent (PI) (in, on) sentences in ASL and audiovisual English (sentence-picture matching task). In contrast to non-spatial control sentences, PD sentences engaged the superior parietal lobule (SPL) bilaterally for ASL and English, consistent with a previous study of written English.

Motor simulation has emerged as a mechanism for both predictive action perception and language comprehension. By deriving a motor command, individuals can predictively represent the outcome of an unfolding action as a forward model. Evidence of simulation can be seen via improved participant performance for stimuli that conform to the participant's individual characteristics (an egocentric bias).

In ASL spatial classifier expressions, the location of the hands in signing space depicts the relative position of described objects. When objects are physically present, the arrangement of the hands maps to the observed position of objects in the world (Shared Space). For non-present objects, interlocutors must perform a mental transformation to take the signer's perspective (Signer Space).
