Research has shown that inverting faces significantly disrupts the processing of configural information, leading to a face inversion effect. We recently used a contextual priming technique to show that the presence or absence of the face inversion effect can be determined via the top-down activation of face versus non-face processing systems [Ge, L., Wang, Z., McCleery, J., & Lee, K. (2006). Activation of face expertise and the inversion effect. Psychological Science, 17(1), 12-16]. In the current study, we replicate these findings using the same technique but under different conditions. We then extend these findings through the application of a neural network model of face and Chinese character expertise systems. Results provide support for the hypothesis that a specialized face expertise system develops through extensive training of the visual system with upright faces, and that top-down mechanisms are capable of influencing when this face expertise system is engaged.
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2267768
DOI: http://dx.doi.org/10.1016/j.visres.2007.11.025