Research with Western samples has uncovered the rapid development of infants' visual attention. This study evaluated spatial attention in 6- to 9-month-old infants living in rural Malawi (N = 511; = 255, = 427) or suburban California, United States (N = 57, = 29, = 37) in 2018-2019. Using the Infant Orienting With Attention (IOWA) task, results showed that infants were faster and more accurate to fixate a target when a cue validly predicted the target location and were slower and less accurate when the cue was invalid. However, compared to US infants, Malawian infants took longer to fixate the target and were more accurate. These results both provide information about the development of spatial attention in an underrepresented population and demonstrate differences in spatial attention in infants with different lived experiences.
DOI: http://dx.doi.org/10.1111/cdev.14228
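For readers who want to see how cue-validity effects like those reported above are typically summarised, the sketch below computes per-participant mean saccadic reaction time and fixation accuracy for each cue condition. It is a minimal illustration only: the trial-level column names and the pandas-based layout are assumptions, not the authors' analysis code.

```python
# Hypothetical sketch: summarising cue-validity effects from trial-level
# eye-tracking data. Column names (participant, cue_type, rt_ms, correct)
# are assumed for illustration; they are not taken from the published study.
import pandas as pd

def validity_effects(trials: pd.DataFrame) -> pd.DataFrame:
    """Mean saccadic RT and accuracy per participant and cue condition."""
    return (
        trials
        .groupby(["participant", "cue_type"], as_index=False)
        .agg(mean_rt_ms=("rt_ms", "mean"), accuracy=("correct", "mean"))
    )

if __name__ == "__main__":
    demo = pd.DataFrame({
        "participant": [1, 1, 1, 1, 2, 2, 2, 2],
        "cue_type":    ["valid", "invalid", "valid", "invalid"] * 2,
        "rt_ms":       [310, 420, 335, 455, 298, 405, 322, 440],
        "correct":     [1, 0, 1, 1, 1, 1, 1, 0],
    })
    print(validity_effects(demo))
```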
Public Health Rev
February 2025
Population Research Centre, Faculty of Spatial Sciences, University of Groningen, Groningen, Netherlands.
Objective: This scoping review examines health outcome trends in European cross-border regions, identifies available evidence, and highlights research gaps. The European Union's integration efforts aim to harmonise living standards and healthcare access. The removal of border controls and the freedom of movement have enhanced service availability, benefiting border-region populations with access to cross-border healthcare.
Plant Methods
March 2025
College of Information Engineering, Northwest A&F University, Yangling, 712100, Shaanxi, China.
Pronounced inter-class similarity and intra-class variability among tomato leaf diseases seriously limit the accuracy of identification models. To address this issue, DWTFormer, a novel tomato leaf disease identification model based on frequency-spatial feature fusion, was proposed. Firstly, a Bneck-DSM module was designed to extract shallow features, laying the groundwork for deep feature extraction.
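DWTFormer's own architecture is not reproduced in this snippet; as a rough illustration of the frequency-spatial fusion idea, the sketch below pairs a single-level 2-D Haar wavelet branch with a plain downsampled-intensity spatial branch and concatenates the two feature maps. All branch choices and sizes are assumptions for illustration.

```python
# Hypothetical sketch of frequency-spatial feature fusion (not the DWTFormer
# implementation): a single-level 2-D Haar DWT supplies frequency sub-bands,
# 2x2 average pooling of the raw intensities supplies a spatial map, and the
# two are concatenated along the channel axis.
import numpy as np
import pywt

def frequency_branch(gray: np.ndarray) -> np.ndarray:
    """Stack the approximation and detail sub-bands of a Haar DWT."""
    cA, (cH, cV, cD) = pywt.dwt2(gray, "haar")
    return np.stack([cA, cH, cV, cD], axis=0)            # (4, H/2, W/2)

def spatial_branch(gray: np.ndarray) -> np.ndarray:
    """Toy spatial features: 2x2 average pooling of the raw image."""
    h, w = gray.shape
    pooled = (gray[: h - h % 2, : w - w % 2]
              .reshape(h // 2, 2, w // 2, 2)
              .mean(axis=(1, 3)))
    return pooled[None]                                   # (1, H/2, W/2)

def fuse(gray: np.ndarray) -> np.ndarray:
    """Concatenate frequency and spatial feature maps channel-wise."""
    return np.concatenate([frequency_branch(gray), spatial_branch(gray)], axis=0)

if __name__ == "__main__":
    leaf = np.random.rand(64, 64)                         # stand-in grayscale leaf image
    print(fuse(leaf).shape)                               # (5, 32, 32)
```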
BMC Med Imaging
March 2025
School of Electronics Engineering, Vellore Institute of Technology, Vellore, India.
Background: Diabetic retinopathy is a major cause of vision loss worldwide, which underscores the need for early identification and treatment to reduce blindness in a significant proportion of individuals. Microaneurysms, extremely small circular red spots that appear in retinal fundus images, are among the earliest indications of diabetic retinopathy.
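To make the detection problem concrete, the sketch below shows one generic way to highlight small, dark, roughly circular candidates in the green channel of a fundus image (where red lesions appear darkest) using a morphological black-hat transform. It is an assumed baseline, not the method of the cited paper, and the input path and size cutoff are placeholders.

```python
# Hypothetical candidate-detection sketch (not the cited paper's method):
# microaneurysms appear as tiny dark round blobs in the green channel of a
# fundus photograph, so a morphological black-hat transform highlights them.
import cv2
import numpy as np

def microaneurysm_candidates(bgr_image: np.ndarray) -> np.ndarray:
    """Return a binary mask of small dark blob candidates."""
    green = bgr_image[:, :, 1]                        # red lesions contrast best in green
    green = cv2.GaussianBlur(green, (5, 5), 0)        # suppress sensor noise
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
    blackhat = cv2.morphologyEx(green, cv2.MORPH_BLACKHAT, kernel)  # dark-on-bright details
    _, mask = cv2.threshold(blackhat, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Drop large connected components (vessels); keep only tiny candidates.
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    keep = np.zeros_like(mask)
    for i in range(1, n):
        if stats[i, cv2.CC_STAT_AREA] < 50:           # assumed size cutoff in pixels
            keep[labels == i] = 255
    return keep

if __name__ == "__main__":
    fundus = cv2.imread("fundus.jpg")                  # hypothetical input path
    if fundus is not None:
        cv2.imwrite("candidates.png", microaneurysm_candidates(fundus))
```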
Sci Rep
March 2025
College of Computer and Control Engineering, Northeast Forestry University, HeXing Road, Harbin, China.
Traffic flow prediction is a key challenge in intelligent transportation, and the ability to accurately forecast future traffic flow directly affects the efficiency of urban transportation systems. However, existing deep learning-based prediction models suffer from the following issues: First, CNN- or RNN-based models are limited by their architectures and thus unsuitable for modeling long-term sequences. Second, most Transformer-based methods focus solely on the traffic flow data itself during embedding, neglecting the implicit information behind the traffic data.
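As a minimal illustration of embedding traffic flow together with the implicit temporal context mentioned above, the sketch below combines a projected flow value with learned time-of-day and day-of-week embeddings. The dimensions and the five-minute discretisation are assumptions for illustration, not the cited model.

```python
# Hypothetical sketch of a flow embedding that also encodes implicit temporal
# context (time-of-day and day-of-week); all dimensions are illustrative.
import torch
import torch.nn as nn

class ContextAwareFlowEmbedding(nn.Module):
    def __init__(self, d_model: int = 64, steps_per_day: int = 288):
        super().__init__()
        self.value_proj = nn.Linear(1, d_model)               # raw flow value
        self.tod_emb = nn.Embedding(steps_per_day, d_model)   # time-of-day slot
        self.dow_emb = nn.Embedding(7, d_model)                # day-of-week

    def forward(self, flow, tod_idx, dow_idx):
        # flow: (batch, seq, 1); tod_idx, dow_idx: (batch, seq) integer indices
        return self.value_proj(flow) + self.tod_emb(tod_idx) + self.dow_emb(dow_idx)

if __name__ == "__main__":
    emb = ContextAwareFlowEmbedding()
    flow = torch.randn(2, 12, 1)                    # 12 five-minute steps
    tod = torch.randint(0, 288, (2, 12))
    dow = torch.randint(0, 7, (2, 12))
    print(emb(flow, tod, dow).shape)                # torch.Size([2, 12, 64])
```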
Sci Rep
March 2025
School of Civil Engineering and Transportation, Northeast Forestry University, Harbin, China.
In heterogeneous traffic flow environments, it is critical to accurately predict, in real time, the future trajectories of human-driven vehicles around intelligent vehicles. This paper introduces a neural network model that integrates spatial interaction information with the long- and short-term characteristics of the time series. Initially, the historical state information of the target vehicle and its surrounding counterparts, along with their spatial interaction relationships, is fed into a Graph Attention Network (GAT) encoder.
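A minimal sketch of the overall idea follows: a simplified dot-product attention over surrounding vehicles stands in for the GAT encoder, and the resulting interaction feature is concatenated with the target state and passed to an LSTM. The state dimensions, number of neighbours, and attention form are assumptions for illustration, not the paper's model.

```python
# Hypothetical sketch (not the cited model): per-timestep attention over
# surrounding vehicles yields a spatial-interaction feature, which is
# concatenated with the target state and encoded by an LSTM.
import torch
import torch.nn as nn
import torch.nn.functional as F

class InteractionLSTMEncoder(nn.Module):
    def __init__(self, state_dim: int = 4, hidden: int = 64):
        super().__init__()
        self.query = nn.Linear(state_dim, hidden)
        self.key = nn.Linear(state_dim, hidden)
        self.value = nn.Linear(state_dim, hidden)
        self.lstm = nn.LSTM(state_dim + hidden, hidden, batch_first=True)

    def forward(self, target, neighbors):
        # target: (B, T, state_dim); neighbors: (B, T, N, state_dim)
        q = self.query(target).unsqueeze(2)                  # (B, T, 1, H)
        k, v = self.key(neighbors), self.value(neighbors)    # (B, T, N, H)
        scores = (q * k).sum(-1, keepdim=True) / k.shape[-1] ** 0.5
        attn = F.softmax(scores, dim=2)                       # weights over neighbours
        context = (attn * v).sum(dim=2)                       # (B, T, H)
        _, (h_n, _) = self.lstm(torch.cat([target, context], dim=-1))
        return h_n[-1]                                        # (B, H) trajectory encoding

if __name__ == "__main__":
    enc = InteractionLSTMEncoder()
    tgt = torch.randn(2, 20, 4)        # 20 past steps: x, y, vx, vy (assumed)
    nbr = torch.randn(2, 20, 6, 4)     # 6 surrounding vehicles (assumed)
    print(enc(tgt, nbr).shape)         # torch.Size([2, 64])
```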