In the first study using point-light displays (lights corresponding to the joints of the human body) to examine children's understanding of verbs, 3-year-olds were tested to see if they could perceive familiar actions that corresponded to motion verbs (e.g., walking). Experiment 1 showed that children could extend familiar motion verbs (e.g., walking and dancing) to videotaped point-light actions shown in the intermodal preferential looking paradigm. Children watched the action that matched the requested verb significantly more than they watched the action that did not match the verb. In Experiment 2, the findings of Experiment 1 were validated by having children spontaneously produce verbs for these actions. The use of point-light displays may illuminate the factors that contribute to verb learning.

Source: http://dx.doi.org/10.1037//0012-1649.38.4.604

Similar Publications

Differential indexing in Kamang: a viewpoint alternation.

Linguist Vanguard

May 2024

Amsterdam Center for Language and Communication, University of Amsterdam, Amsterdam, The Netherlands.

In Kamang (Alor-Pantar, Indonesia), some verbs alternate between indexing the S or P argument with a prefix (from several different series) and occurring unprefixed; that is, Kamang has differential argument indexing. Through a qualitative study of a spoken-language corpus, this paper investigates the alternation between one of the prefix series and zero-marking. Previously described as indicating increased patientivity on intransitive motion and posture verbs, the alternation is here analysed in terms of a shift in event view: unprefixed verbs express events holistically, while prefixed verbs shift the viewpoint towards the "elaboration phase", the temporal and causal middle and end of an event.

When infants hear sentences containing unfamiliar words, are some language-world links (such as noun-object) more readily formed than others (such as verb-predicate)? We examined English-learning 14- to 15-month-olds' capacity for linking referents in scenes with bisyllabic nonce utterances. Each of the two syllables referred either to the object's identity or to the object's motion. Infants heard the syllables in either Verb-Subject (VS) or Subject-Verb (SV) order.

Comparing Deaf/Hard-of-Hearing Children's Oral Narratives Using Movies and Static Books.

J Deaf Stud Deaf Educ

September 2024

Communication Sciences and Disorders, Florida State University, Tallahassee, FL, USA.

Clinicians use various methods for narrative sampling, including oral assessments such as story generation and retelling, often supported by visual aids. Assessing language skills in deaf/hard-of-hearing (DHH) children requires careful selection of narrative techniques. This comparative observational study investigates the narrative outcomes of story generation and retelling tasks in 21 DHH children, using both book and movie contexts.

Kinematic observation reduces effect of aging on episodic memory performance.

Acta Psychol (Amst)

March 2024

UMR CNRS 7295, Centre de Recherches sur la Cognition et l'Apprentissage, Université de Tours, Université de Poitiers, France; Institut Universitaire de France.

This study investigated the influence of kinematic observation (i.e., observing an action solely from the motion of an actor's main joints) on differences in episodic memory performance between young and older adults.

Vision provides a key source of information about many concepts, including 'living things' and visual events.
