Plants, animals, and fungi display a rich tapestry of colors. Animals, in particular, use colors in dynamic displays performed in spatially complex environments. Although current approaches for studying colors are objective and repeatable, they miss the temporal variation of color signals entirely. Here, we introduce hardware and software that give ecologists and filmmakers the ability to accurately record animal-perceived colors in motion. Specifically, our Python code transforms photos or videos into perceivable units (quantum catches) for animals of known photoreceptor sensitivity. The plans and code necessary for end users to capture animal-view videos are open source and publicly available, to encourage continual community development. The camera system and the associated software package will allow ecologists to investigate how animals use colors in dynamic behavioral displays, the ways natural illumination alters perceived colors, and other questions that remained unaddressed until now due to a lack of suitable tools. Finally, it provides scientists and filmmakers with a new, empirically grounded approach for depicting the perceptual worlds of nonhuman animals.
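The quantum-catch transformation mentioned in the abstract is, in essence, an integral of a stimulus spectrum weighted by a photoreceptor's spectral sensitivity and the illuminant. The sketch below is a minimal, illustrative version of that calculation on a discrete wavelength grid; the Gaussian sensitivity curve, flat illuminant, and reflectance ramp are hypothetical placeholders, not spectra or code from the paper.

```python
import numpy as np

def quantum_catch(reflectance, sensitivity, illuminant, wavelengths):
    """Approximate Q_i = integral of R(l) * S_i(l) * I(l) dl over wavelength,
    using the trapezoidal rule on a discrete grid (all arrays same length)."""
    y = reflectance * sensitivity * illuminant
    dx = np.diff(wavelengths)
    return float(np.sum(dx * (y[:-1] + y[1:]) / 2.0))

# Hypothetical example on a 300-700 nm grid at 1 nm steps.
wl = np.arange(300.0, 701.0, 1.0)
illum = np.ones_like(wl)                          # flat (equal-energy) illuminant
refl = np.clip((wl - 300.0) / 400.0, 0.0, 1.0)    # reflectance rising toward long wavelengths
# Gaussian sensitivity peaking at 560 nm (stand-in for a long-wavelength receptor)
sens = np.exp(-0.5 * ((wl - 560.0) / 40.0) ** 2)

q = quantum_catch(refl, sens, illum, wl)
```

In practice one catch is computed per receptor class (e.g., UV, short-, medium-, and long-wavelength), giving the perceivable units into which each image pixel is transformed.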
PLoS Biol, January 2024
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10805291
DOI: http://dx.doi.org/10.1371/journal.pbio.3002444
Department of Biology, George Mason University, Fairfax, Virginia, United States of America.