In a newly discovered form of visual masking, a target stimulus is masked by 4 flanking dots if their offset is delayed relative to the target (V. Di Lollo, J. T. Enns, & R. A. Rensink, 2000). In Di Lollo et al. (2000), the dot pattern also cued the relevant target and therefore required deliberate attention. In the present Experiments 2-6, a central arrow cued 1 of 2 letters for an E/F discrimination, with dots flanking both letters. Masking was reduced compared with the mask-cue procedure but was still robust. Delayed-offset dots flanking the nontarget also impaired performance, indicating competition for attention. Masking was unaffected by brightness of the dots relative to the target. Masking was attenuated not only by precuing attention to the target location but also by preview of an uninformative dot mask. Theories of masking by object substitution must therefore accommodate the prior context into which the target stimulus is introduced.