To better understand how vocalisations are used during interactions between multiple individuals, studies are increasingly deploying on-board devices with a microphone on each animal. The resulting recordings are extremely challenging to analyse, since microphone clocks drift non-linearly and each microphone records the vocalisations of non-focal individuals as well as noise. Here we address this issue with callsync, an R package designed to align recordings, detect vocalisations and assign them to the caller, trace the fundamental frequency, filter out noise and perform basic analysis on the resulting clips. We present a case study in which the pipeline is applied to a dataset of six captive cockatiels (Nymphicus hollandicus) wearing backpack microphones. Recordings initially drifted by ~2 min, but were aligned to within ~2 s with our package. Using callsync, we detected and assigned 2101 calls across three multi-hour recording sessions. Two sessions included loud beep markers in the background to aid manual alignment; the third contained no obvious markers, demonstrating that markers are not necessary to obtain good alignment. We then used a function that traces the fundamental frequency and applied spectrographic cross-correlation to illustrate a possible analytical pipeline in which vocal similarity is assessed visually. The callsync package can be used to go from raw recordings to a clean dataset of features. The package is modular, allowing users to swap out individual functions as needed. We also discuss the challenges that may arise at each step and how the available literature provides alternatives.
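The alignment step hinges on finding the time lag at which two microphones' recordings best match. callsync implements this in R; the sketch below is an illustrative, language-agnostic version of the underlying idea only (cross-correlation peak finding), not the package's actual code, and all names in it are hypothetical.

```python
import numpy as np

def align_offset(reference, recording, sr):
    """Estimate the clock offset (seconds) of `recording` relative to
    `reference` as the lag that maximises their cross-correlation.
    Negative offset means the same event occurs earlier in `recording`."""
    # Full cross-correlation over all possible lags.
    corr = np.correlate(recording, reference, mode="full")
    # In "full" mode, zero lag sits at index len(reference) - 1.
    lag = np.argmax(corr) - (len(reference) - 1)
    return lag / sr

# Toy example: a shared beep marker appears in both recordings, but the
# second microphone's clock is shifted by 0.5 s.
sr = 1000                                     # samples per second
t = np.arange(0, 0.1, 1 / sr)
marker = np.sin(2 * np.pi * 50 * t)           # short 50 Hz beep
reference = np.zeros(3000)
recording = np.zeros(3000)
reference[1000:1000 + len(marker)] += marker  # beep at 1.0 s
recording[500:500 + len(marker)] += marker    # same beep at 0.5 s

offset = align_offset(reference, recording, sr)
print(offset)  # -0.5: the beep occurs half a second earlier in `recording`
```

Because clock drift is non-linear in real recordings, a pipeline like callsync's cannot apply a single offset per file; instead, an estimate of this kind has to be computed repeatedly over short windows along the recording.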


Source

PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11116754
DOI: http://dx.doi.org/10.1002/ece3.11384

