In the past decade, cultural differences in perceptual judgment and memory have been observed: Westerners attend more to focal objects, whereas East Asians attend more to contextual information. However, the mechanisms underlying these apparent differences in cognitive processing style have remained unknown. In the present study, we examined the possibility that the cultural differences arise from culturally different viewing patterns when people are confronted with a naturalistic scene. We measured the eye movements of American and Chinese participants while they viewed photographs with a focal object on a complex background. The Americans fixated more on focal objects than did the Chinese, and the Americans tended to look at the focal object more quickly. In addition, the Chinese made more saccades to the background than did the Americans. Thus, it appears that differences in judgment and memory may have their origins in differences in what is actually attended as people view a scene.
Source:
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1194960
DOI: http://dx.doi.org/10.1073/pnas.0506162102
Sensors (Basel)
December 2024
Department of Optometry and Vision Science, Faculty of Science and Technology, University of Latvia, Jelgavas Street 1, LV-1004 Riga, Latvia.
Eccentric photorefractometry is widely used to measure eye refraction, accommodation, gaze position, and pupil size. While the individual calibration of refraction and accommodation data has been extensively studied, gaze measurements have received less attention. PowerRef 3 does not incorporate individual calibration for gaze measurements, resulting in a divergent offset between the measured and expected gaze positions.
J Clin Med
December 2024
Département d'ORL, Centre Hospitalier Universitaire de Saint Etienne, 42055 Saint-Etienne, France.
Spontaneous nystagmus during vertigo attacks of Ménière's disease has mainly been described as horizontal, beating ipsilaterally (irritative type) or contralaterally (deficit type) to the hearing loss. Our main objective was to describe the characteristics of nystagmus during vertigo attacks. The second objective was to determine the feasibility of self-recording eye movements on video with a mobile phone.
Brain Sci
December 2024
Department of Neurology, The Carrick Institute, Cape Canaveral, FL 32920, USA.
Background: Eye movement research serves as a critical tool for assessing brain function, diagnosing neurological and psychiatric disorders, and understanding cognition and behavior. Sex differences have largely been underreported or ignored in neurological research. However, eye movement features provide biomarkers that classify neurological diseases with greater accuracy and robustness than previous classifiers.
Brain Sci
December 2024
SensoriMotorLab, Department of Ophthalmology-University of Lausanne, Jules Gonin Eye Hospital-Fondation Asile des Aveugles, 1004 Lausanne, Switzerland.
Many daily activities depend on visual inputs to improve motor accuracy and minimize errors. Reaching tasks present an ecological framework for examining these visuomotor interactions, but our understanding of how different amounts of visual input affect motor output is still limited. The present study fills this gap by exploring how hand-related visual bias affects motor performance in a reaching task (drawing a line between two dots).
Brain Sci
November 2024
ISJPS UMR 8103 CNRS, Université Paris 1 Panthéon Sorbonne, 75005 Paris, France.
Background: The aim of this study is to use an eye tracker to compare the understanding of three forms of implicitness (i.e., presupposition, conversational implicatures, and irony) in 139 pupils from the first to the fifth year of elementary school.