Caregivers use a range of verbal and nonverbal behaviours when responding to their infants. Previous studies have typically focused on the caregiver's role in providing verbal responses, even though communication is inherently multimodal (involving both audio and visual information) and bidirectional (involving an exchange of information between infant and caregiver). In this paper, we present a comprehensive study of caregivers' verbal, nonverbal, and multimodal responses to 10-month-old infants' vocalisations and gestures during free play. A new coding scheme was used to annotate 2036 infant vocalisations and gestures, of which 87.1% received a caregiver response. Most caregiver responses were verbal, but 39.7% of all responses were multimodal. We also examined whether different infant behaviours elicited different responses from caregivers. Infant bimodal behaviours (i.e., vocal-gestural combinations) elicited high rates of both verbal and multimodal responses, whereas infant gestures elicited high rates of nonverbal responses. We also found that the types of verbal and nonverbal responses differed as a function of infant behaviour. The results indicate that infants influence the rates and types of responses they receive from caregivers. When examining caregiver-child interactions, analysing caregivers' verbal responses alone understates the multimodal richness and bidirectionality of early communication.
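
To illustrate how such response rates can be derived from coded interaction data, the sketch below tabulates caregiver response rates by infant behaviour type. This is a minimal, hypothetical example: the table layout, column names (infant_behaviour, response_type), and category labels are illustrative assumptions, not the authors' actual coding scheme or analysis pipeline.

import pandas as pd

# Hypothetical coded events: one row per infant vocalisation, gesture, or
# vocal-gestural combination, with the caregiver response modality (if any).
events = pd.DataFrame({
    "infant_behaviour": ["vocalisation", "gesture", "vocal-gestural", "vocalisation"],
    "response_type":    ["verbal", "nonverbal", "multimodal", None],  # None = no response
})

# Share of infant behaviours that received any caregiver response
# (the study reports 87.1% across 2036 coded behaviours).
overall_rate = events["response_type"].notna().mean()
print(f"Overall response rate: {overall_rate:.1%}")

# Response-modality proportions within each infant behaviour type.
rates_by_behaviour = pd.crosstab(
    events["infant_behaviour"],
    events["response_type"].fillna("no response"),
    normalize="index",
)
print(rates_by_behaviour)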

Source
http://dx.doi.org/10.1016/j.infbeh.2023.101828

Similar Publications

Get Over It: Surgical Residents' Responses to Simulated Harassment. A Multi Method Study.

J Surg Educ

January 2025

Department of Surgery, Faculty of Medicine and Health Sciences, McGill University, 3605 Rue de la Montagne, Montréal, QC, Canada, H3G 2M1; Institute of Health Sciences Education, Faculty of Medicine and Health Sciences, McGill University, 1110 Pine Avenue West, Montréal, QC, Canada, H3A 1A3; Steinberg Centre for Simulation and Interactive Learning, Faculty of Medicine and Health Sciences, McGill University, 3575 Park Ave, Montréal, QC, Canada, H2X 3P9; Research Institute of the McGill University Health Centre, Montreal General Hospital, 1650 Cedar Ave, R1.112, Montreal, QC, H3G 1A4. Electronic address:

Objective: This study examined the response strategies of Surgery residents as bystanders to harassment in a simulated clinical environment, their alignment with the bystander intervention model, and the motivations behind their actions.

Design: Participants watched an educational video on harassment and ways to address it prior to undergoing a simulated clinical scenario where they witnessed a senior resident harassing a medical student. The study used audio-video recordings of the simulations to capture and analyze residents' verbal and nonverbal responses to harassment.

Recent research suggests that performance on Statistical Learning (SL) tasks may be lower in children with dyslexia learning deep orthographies such as English. However, it is debated whether the observed difficulties vary depending on the modality and stimulus of the task, opening a broad discussion about whether SL is a domain-general or domain-specific construct. In addition, little is known about SL in children with dyslexia who learn transparent orthographies, where the transparency of grapheme-phoneme correspondences might reduce the reliance on implicit learning processes.

Purpose: This study aimed to explore pharmacy students' perceptions of remote flipped classrooms in Malaysia, focusing on their learning experiences and identifying areas for potential improvement to inform future educational strategies.

Methods: A qualitative approach was employed, utilizing inductive thematic analysis. Twenty Bachelor of Pharmacy students (18 women, 2 men; age range, 19-24 years) from Monash University participated in 8 focus group discussions over 2 rounds during the coronavirus disease 2019 pandemic (2020-2021).

The dynamic Colombian sign language dataset for basic conversation LSC70.

Data Brief

February 2025

Sistemas dinámicos, instrumentación y control (SIDICO), Departamento de física, Universidad del Cauca, Colombia.

Sign language is a form of non-verbal communication used by people with hearing disabilities. It relies on signs, gestures, facial expressions, and more. Given that the population with hearing impairments in Colombia is around half a million, a database of dynamic, alphanumeric signs and commonly used words was created to support basic conversation.

Impaired semantic control in the logopenic variant of primary progressive aphasia.

Brain Commun

December 2024

Medical Research Council (MRC) Cognition and Brain Sciences Unit, University of Cambridge, Cambridge CB2 7EF, UK.

We investigated semantic cognition in the logopenic variant of primary progressive aphasia, including (i) the status of verbal and non-verbal semantic performance; and (ii) whether the semantic deficit reflects impaired semantic control. Our hypothesis that individuals with logopenic variant of primary progressive aphasia would exhibit semantic control impairments was motivated by the anatomical overlap between the temporoparietal atrophy typically associated with logopenic variant of primary progressive aphasia and lesions associated with post-stroke semantic aphasia and Wernicke's aphasia, which cause heteromodal semantic control impairments. We addressed the presence, type (semantic representation and semantic control; verbal and non-verbal), and progression of semantic deficits in logopenic variant of primary progressive aphasia.
