Individual recognition is critical for social behavior across species. Whether recognition is mediated by circuits specialized for social information processing has been a matter of debate. Here we examine the neurobiological underpinnings of individual visual facial recognition in paper wasps. Front-facing images of conspecific wasps broadly increase activity across many brain regions relative to other stimuli. Notably, we identify a localized subpopulation of neurons in the protocerebrum that shows specialized selectivity for front-facing wasp images. These neurons encode information about facial patterns, with ensemble activity correlating with facial identity, and are strikingly analogous to face cells in primates, indicating that specialized circuits are likely an adaptive feature of neural architecture supporting visual recognition.
Download full-text PDF | Source
---|---
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11042187 | PMC
http://dx.doi.org/10.1101/2024.04.11.589095 | DOI Listing
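As a purely illustrative aside, the selectivity and identity-coding claims in the abstract above are typically quantified with a stimulus-contrast index and a population decode. The sketch below uses synthetic data throughout; the names (`rates_face`, `centroids`), the index formula, and the nearest-centroid decoder are assumptions chosen for clarity, not the study's published pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated mean firing rates (Hz) across stimulus repeats for one neuron.
rates_face = rng.normal(20.0, 3.0, size=50)   # responses to front-facing wasp images
rates_other = rng.normal(8.0, 3.0, size=50)   # responses to non-face control stimuli

def face_selectivity_index(face, other):
    """Contrast index in [-1, 1]: +1 = responds only to face stimuli,
    0 = no preference, -1 = responds only to non-face stimuli."""
    f, o = float(np.mean(face)), float(np.mean(other))
    return (f - o) / (f + o)

print(f"face-selectivity index: {face_selectivity_index(rates_face, rates_other):.2f}")

# Toy population decode of facial identity from ensemble activity:
# nearest-centroid classification on a synthetic (trials x neurons) matrix.
n_neurons, n_ids, n_trials = 30, 5, 40
centroids = rng.normal(10.0, 4.0, size=(n_ids, n_neurons))  # identity-specific tuning
labels = rng.integers(0, n_ids, size=n_trials)              # true identity per trial
ensemble = centroids[labels] + rng.normal(0.0, 1.0, size=(n_trials, n_neurons))

dists = np.linalg.norm(ensemble[:, None, :] - centroids[None, :, :], axis=2)
pred = dists.argmin(axis=1)
print(f"identity decoding accuracy: {(pred == labels).mean():.2f}")
```

Above-chance decoding accuracy in this kind of analysis is what licenses a claim that "ensemble activity correlates with facial identity"; the contrast index is the standard way selectivity for one stimulus class is summarized per neuron.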
J Family Med Prim Care
December 2024
Faculty of Medicine, King Abdulaziz University Hospital, Jeddah, Saudi Arabia.
Kabuki syndrome (KS) is a rare congenital disease with two types, KS1 and KS2, caused by variants in the epigenetic genes KMT2D and KDM6A, respectively. It is associated with multiple abnormalities, including developmental delay, atypical facial features, cardiac anomalies, minor skeletal anomalies, genitourinary anomalies, and mild to moderate intellectual disability. This syndrome can lead to neonatal hypoglycemia resulting from hyperinsulinemia, as well as electrolyte abnormalities.
Clin Teach
February 2025
AP-HP, Institut du Cerveau - Paris Brain Institute - ICM, Inserm, CNRS, Hôpitaux Universitaires La Pitié Salpêtrière - Charles Foix, DMU Neurosciences, Service de Neurologie 2-Mazarin, Sorbonne Université, Paris, France.
Background: The acquisition of practical skills is a key objective of medical education. Improving knowledge and skills is essential for early diagnosis of patients suffering from neuromuscular (NM) diseases.
Approach: Multimedia tools have proved to be useful and effective for learning clinical skills.
PLoS One
January 2025
Department of Psychology, Tokyo Woman's Christian University, Tokyo, Japan.
We perceive and understand others' emotional states from multisensory information such as facial expressions and vocal cues. However, such cues are not always available or clear: the COVID-19 pandemic, for example, led to the widespread use of face masks, which can obscure some of the facial cues used in emotion perception. Can such partial loss of visual cues affect multisensory emotion perception?
J Trauma Dissociation
January 2025
Division of Child and Adolescent Psychiatry, Department of Psychiatry, Lausanne University Hospital (CHUV), Lausanne, Switzerland.
This pilot study examined the moderating role of context processing (i.e., encoding and memorizing contextual information) when mothers are confronted with threatening stimuli under physiological monitoring, as a possible mechanism favoring the intergenerational transmission of posttraumatic stress.
Background: Recent advances in automatic face recognition have increased the risk that de-identified research imaging data could be re-identified from face imagery in brain scans.
Method: An ADNI committee of independent imaging experts evaluated 11 published face-deidentification ("de-facing") techniques and selected four algorithms (FSL-UK Biobank, HCP/XNAT, mri_reface, and BIC) for formal testing on 183 longitudinal scans of 61 racially and ethnically diverse ADNI participants. Each algorithm was evaluated on its facial-feature removal in 3D rendered surfaces (confirming sufficient privacy protection) and by comparing measurements from ADNI routine image analyses on unmodified vs. de-faced images (confirming negligible side effects on analyses).
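For intuition, the "negligible side effects" criterion amounts to a paired comparison of the same analysis output computed on unmodified and de-faced versions of each scan. Below is a minimal synthetic sketch of that check; the measurement chosen (a regional volume), the summary statistics, and all variable names are assumptions for illustration, not ADNI's actual validation code.

```python
import numpy as np

rng = np.random.default_rng(1)

n_scans = 183  # same count as the scans used in the ADNI evaluation

# Hypothetical regional volume (mm^3) measured on the unmodified images,
# and the same measurement re-run on de-faced versions of the same scans.
vol_unmodified = rng.normal(3500.0, 300.0, size=n_scans)
vol_defaced = vol_unmodified + rng.normal(0.0, 5.0, size=n_scans)  # small perturbation

diff = vol_defaced - vol_unmodified
print(f"mean paired difference: {diff.mean():+.2f} mm^3")
print(f"95th percentile of |difference|: {np.percentile(np.abs(diff), 95):.2f} mm^3")
print(f"unmodified vs. de-faced correlation: {np.corrcoef(vol_unmodified, vol_defaced)[0, 1]:.4f}")
```

A near-zero mean difference, small absolute differences, and a correlation close to 1 between the two measurement sets are the kind of evidence that would support "negligible side effects on analyses."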