Fishing boats produce acoustic cues while hauling longlines. Odontocetes are known to use these acoustic signals to detect the fishing activity and to depredate catches. However, very little is known about potential interactions before hauling. This article describes the acoustic signature of the setting activity. Using passive acoustic recorders attached to the buoys of longlines, this work demonstrates an increase in ambient sound levels of ∼6 dB re 1 μPa²/Hz within 2-7 kHz during setting. This signal could likewise serve as an acoustic cue for depredating species, suggesting that predators can detect longlines as soon as they are set.
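The reported ∼6 dB rise can be put in perspective with the standard decibel relation for ratios of acoustic power (or spectral density). A minimal sketch, not taken from the article:

```python
import math

def level_increase_db(power_during, power_before):
    """Decibel difference between two acoustic power (spectral density)
    values: 10 * log10(ratio)."""
    return 10 * math.log10(power_during / power_before)

# A ~6 dB increase corresponds to roughly a fourfold rise in acoustic power:
ratio_for_6db = 10 ** (6 / 10)
print(round(ratio_for_6db, 2))  # ≈ 3.98
```

In other words, the setting activity roughly quadruples the acoustic power in the 2-7 kHz band relative to the background.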
DOI: http://dx.doi.org/10.1121/10.0003191
J Exp Psychol Hum Percept Perform, January 2025. School of Psychology, University of Sussex.
Human listeners have a remarkable capacity to adapt to severe distortions of the speech signal. Previous work indicates that perceptual learning of degraded speech reflects changes to sublexical representations, though the precise format of these representations has not yet been established. Inspired by the neurophysiology of auditory cortex, we hypothesized that perceptual learning involves changes to perceptual representations that are tuned to acoustic modulations of the speech signal.
Nat Commun, January 2025. Cold Spring Harbor Laboratory, Cold Spring Harbor, NY, USA.
While animals readily adjust their behavior to adapt to relevant changes in the environment, the neural pathways enabling these changes remain largely unknown. Here, using multiphoton imaging, we investigate whether feedback from the piriform cortex to the olfactory bulb supports such behavioral flexibility. To this end, we engage head-fixed male mice in a multimodal rule-reversal task guided by olfactory and auditory cues.
Q J Exp Psychol (Hove), January 2025. Department of Otorhinolaryngology / Head and Neck Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands.
This study aims to provide a comprehensive picture of auditory emotion perception in cochlear implant (CI) users by (1) investigating emotion categorization in both vocal (pseudo-speech) and musical domains, and (2) examining how individual differences in residual acoustic hearing, sensitivity to voice cues (voice pitch, vocal tract length), and quality of life (QoL) might be associated with vocal emotion perception and, going a step further, with musical emotion perception. In 28 adult CI users, with or without self-reported acoustic hearing, we showed that sensitivity (d') scores for emotion categorization varied widely across participants, in line with previous research. However, within participants, the d' scores for vocal and musical emotion categorization were significantly correlated, indicating similar processing of auditory emotional cues across the pseudo-speech and music domains and robustness of the tests.
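The d' reported above is the standard signal-detection sensitivity index, z(hit rate) − z(false-alarm rate). A minimal sketch of the usual two-rate computation, with hypothetical rates (the numbers are illustrative, not from the study):

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """Signal-detection sensitivity: z(hits) - z(false alarms)."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(false_alarm_rate)

# Hypothetical listener: 85% hits, 20% false alarms
print(round(d_prime(0.85, 0.20), 2))  # → 1.88
```

Higher d' indicates better discrimination independent of response bias; a d' of 0 corresponds to chance performance.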
This paper explores the perception of two diachronically related and mutually intelligible phonological oppositions: the onset voicing contrast of Northern Raglai and the register contrast of Southern Raglai. It continues a previous acoustic study, which revealed that Northern Raglai onset stops maintain a voicing distinction accompanied by weak formant and voice-quality modulations on following vowels, while Southern Raglai has transphonologized this voicing contrast into a register contrast marked by vowel and voice-quality distinctions. Our findings indicate that the two dialects partially differ in their use of identification cues: Northern Raglai listeners use both voicing and F1 as major cues, while Southern Raglai listeners largely focus on F1.
Cell Rep, January 2025. Zuckerman Mind Brain Behavior Institute, Columbia University, New York, NY 10027, USA; Department of Ecology, Evolution, and Environmental Biology, Columbia University, New York, NY 10027, USA.
Outside of acoustic communication, little is known about how animals coordinate social turn taking and how the brain drives engagement in these social interactions. Using Siamese fighting fish (Betta splendens), we identify dynamic visual features of an opponent, and behavioral sequences, that drive visually guided turn-taking aggressive behavior. Lesions of the telencephalon show that it is unnecessary for coordinating turn taking but is required for persistent participation in aggressive interactions.