Grid-cell firing fields tile the environment with a 6-fold periodicity during both locomotion and visual exploration. Here, we used frequency tagging to test whether movements of covert attention elicit grid-like coding in humans. Participants observed visual trajectories presented sequentially at a fixed rate, allowing different spatial periodicities (e.g., 4-, 6-, and 8-fold) to have corresponding temporal periodicities (e.g., 1, 1.5, and 2 Hz), thus resulting in distinct spectral responses. We found a higher response for the (grid-like) 6-fold periodicity and localized this effect to medial-temporal sources. In a control experiment featuring the same temporal periodicity but lacking spatial structure, the 6-fold effect did not emerge, suggesting its dependency on spatial movements of attention. We report evidence that grid-like signals in the human medial-temporal lobe can be elicited by covert attentional movements, and suggest that attentional coding may provide a suitable mechanism to support the activation of cognitive maps during conceptual navigation.
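The correspondence between spatial and temporal periodicities above follows from the fixed presentation rate: if a full 360° sweep of trajectory directions spans T seconds, a k-fold spatial periodicity produces a spectral peak at k/T Hz. A minimal sketch of this mapping, assuming T = 4 s (inferred from the 4-fold → 1 Hz correspondence, not stated explicitly in the abstract):

```python
# Frequency-tagging logic: a k-fold spatial periodicity, sampled over a
# full 360-degree sweep of directions lasting T seconds, is tagged at
# k / T Hz in the EEG spectrum. T = 4 s is an assumption inferred from
# the abstract's 4-fold -> 1 Hz example.
def tag_frequency(k_fold: int, sweep_seconds: float = 4.0) -> float:
    return k_fold / sweep_seconds

for k in (4, 6, 8):
    print(f"{k}-fold -> {tag_frequency(k)} Hz")  # 1.0, 1.5, 2.0 Hz
```

Because each periodicity maps to a distinct frequency, the grid-like 6-fold response can be isolated as power at its own spectral line.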
DOI: http://dx.doi.org/10.1016/j.celrep.2023.113209
J Insect Sci
January 2025
School of Biological Sciences, University of Aberdeen, King's College, Aberdeen, UK.
Radio frequency identification (RFID) technology and marker recognition algorithms can offer an efficient and non-intrusive means of tracking animal positions. As such, they have become important tools for invertebrate behavioral research. Both approaches require fixing a tag or marker to the study organism, and so it is useful to quantify the effects such procedures have on behavior before proceeding with further research.
Eur J Neurosci
January 2025
Institute of Neuroscience (IONS), UCLouvain, Brussels, Belgium.
Experiencing music often entails the perception of a periodic beat. Despite being a widespread phenomenon across cultures, the nature and neural underpinnings of beat perception remain largely unknown. In the last decade, there has been a growing interest in developing methods to probe these processes, particularly to measure the extent to which beat-related information is contained in behavioral and neural responses.
ACS Earth Space Chem
January 2025
School of Chemistry, Norwich Research Park, University of East Anglia, Norwich NR4 7TJ, U.K.
2-Cyanoindene is one of the few specific aromatic or polycyclic aromatic hydrocarbon (PAH) molecules positively identified in Taurus molecular cloud-1 (TMC-1), a cold, dense molecular cloud considered the nearest star-forming region to Earth. We report cryogenic mid-infrared (550-3200 cm⁻¹) and visible (16,500-20,000 cm⁻¹, over the ← electronic transition) spectra of 2-cyanoindene radical cations (2CNI), measured using messenger-tagging (He and Ne) photodissociation spectroscopy. The infrared spectra reveal the prominence of anharmonic couplings, particularly over the fingerprint region.
FASEB J
January 2025
Laboratory of Molecular Pharmacology, Biosignal Research Center, Kobe University, Kobe, Japan.
DFNA1 (deafness, nonsyndromic autosomal dominant 1), initially identified as nonsyndromic sensorineural hearing loss, has been associated with an additional symptom: macrothrombocytopenia. However, the timing of the onset of hearing loss (HL) and thrombocytopenia has not been investigated, leaving it unclear which occurs earlier. Here, we generated a knock-in (KI) DFNA1 mouse model, diaphanous-related formin 1 (DIA1), in which Aequorea coerulescens green fluorescent protein (AcGFP)-tagged human DIA1(p.
Cortex
December 2024
Institute of Research in Psychology (IPSY) & Institute of Neuroscience (IoNS), Louvain Bionics Center, University of Louvain (UCLouvain), Louvain-la-Neuve, Belgium; School of Health Sciences, HES-SO Valais-Wallis, The Sense Innovation and Research Center, Lausanne & Sion, Switzerland. Electronic address:
Effective social communication depends on the integration of emotional expressions coming from the face and the voice. Although there are consistent reports on how seeing and hearing emotion expressions can be automatically integrated, direct signatures of multisensory integration in the human brain remain elusive. Here we implemented a multi-input electroencephalographic (EEG) frequency tagging paradigm to investigate neural populations integrating facial and vocal fearful expressions.