In recent years, the popularity of tablets has skyrocketed, and there has been explosive growth in apps designed for children. However, many of these apps are released without tests of their effectiveness. This is worrying given that the factors influencing children's learning from touchscreen devices need to be examined in detail. In particular, it has been suggested that children learn less from passive video viewing than from equivalent live interaction, which would have implications for learning from such digital tools. However, this so-called video deficit may be reduced by allowing children greater influence over their learning environment. Across two touchscreen-based experiments, we examined whether 2- to 4-year-olds benefit from actively choosing what to learn more about in a digital word learning task. We designed a tablet study in which "active" participants chose which objects they were taught the labels of, while yoked "passive" participants were presented with the objects chosen by their active peers. We then examined recognition of the learned associations across different tasks. In Experiment 1 (n = 130), children in the passive condition outperformed those in the active condition. While Experiment 2 (n = 32) replicated these findings in a new group of Malay-speaking children, there were no differences in children's learning or recognition of the novel word-object associations using a more implicit looking time measure. These results suggest that there may be performance costs associated with active tasks designed as in the current study and, at the very least, that there may not always be systematic benefits of active learning in touchscreen-based word learning tasks. The current studies add to the evidence that educational apps need to be evaluated before release: While children might benefit from interactive apps under certain conditions, task design and requirements need to consider factors that may detract from successful performance.


Source

PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7707543
PLOS: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0240519

Publication Analysis

Top Keywords

word learning (12), children's learning (8), associated active (8), learning (7), children (6), learning tablet (4), tablet app (4), app toddlers (4), toddlers perform (4), perform better (4)

Similar Publications

Simulating Early Phonetic and Word Learning Without Linguistic Categories.

Dev Sci

March 2025

Laboratoire de Sciences Cognitives et de Psycholinguistique, Département d'Études Cognitives, ENS, EHESS, CNRS, PSL University, Paris, France.

Before they even talk, infants become sensitive to the speech sounds of their native language and recognize the auditory form of an increasing number of words. Traditionally, these early perceptual changes are attributed to an emerging knowledge of linguistic categories such as phonemes or words. However, there is growing skepticism surrounding this interpretation due to limited evidence of category knowledge in infants.


Developmental Language Disorder (DLD) is a common neurodevelopmental condition characterized by significant difficulty with language learning, comprehension, and expression. The neurocognitive bases of DLD are underspecified but are thought to be related, in part, to altered basal ganglia (BG). The BG are known to have a high level of brain iron, which contributes to myelination and dopaminergic pathways among other physiological mechanisms.


Language interventions may yield greater benefits for younger children than for their older counterparts, making it critical to evaluate children's language skills as early as possible. Yet assessing young children's language presents many challenges, such as limited attention spans, low expressive language, and hesitancy to speak with an unfamiliar examiner. To address these challenges, the Quick Interactive Language Screener for Toddlers (QUILS:TOD; for children 24 to 36 months of age) was developed as a quick, tablet-based language screener capable of assessing children's vocabulary, syntax, and word learning skills.


Hand movements frequently occur with speech. The extent to which the memories that guide co-speech hand movements are tied to the speech they occur with is unclear. Here, we paired the acquisition of a new hand movement with speech.


Individual differences elucidate the perceptual benefits associated with robust temporal fine-structure processing.

Proc Natl Acad Sci U S A

January 2025

Department of Communication Science and Disorders, University of Pittsburgh, Pittsburgh, PA 15213.

Article Synopsis
  • The auditory system can precisely track quick changes in sound, but the importance of this ability (temporal fine structure or TFS) for hearing is still debated.
  • Researchers studied 200 participants to see how TFS sensitivity affects speech perception in noisy environments.
  • Results showed that better TFS sensitivity helped more with listening in reverberant spaces and led to quicker responses, suggesting it plays a key role in everyday hearing experiences.
