We studied how the interactions among animals in a collective allow for the transfer of information. We performed laboratory experiments to study how zebrafish in a collective follow a subset of trained animals that move towards a light when it turns on because they expect food at that location. We built deep learning tools to distinguish the trained from the naïve animals in video and to detect when each animal reacts to the light turning on. These tools gave us the data to build a model of interactions designed to balance transparency and accuracy. The model finds a low-dimensional function that describes how a naïve animal weights neighbours depending on focal and neighbour variables. According to this low-dimensional function, neighbour speed plays an important role in the interactions. Specifically, a naïve animal gives more weight to a neighbour in front than to one at its sides or behind, and more so the faster the neighbour is moving; and if the neighbour moves fast enough, the differences coming from the neighbour's relative position largely disappear. Through the lens of decision-making, neighbour speed acts as a confidence measure about where to go. This article is part of a discussion meeting issue 'Collective behaviour through time'.
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9939271
DOI: http://dx.doi.org/10.1098/rstb.2022.0073
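The abstract describes a learned low-dimensional function in which the weight a naïve fish gives to a neighbour depends on the neighbour's position relative to the focal fish and on the neighbour's speed, with the front-back anisotropy washing out when the neighbour moves fast. The sketch below is a minimal toy illustration of that qualitative behaviour, not the function learned in the paper: the functional form, the parameter names (`k_angle`, `v_scale`) and the units are assumptions made purely for illustration.

```python
import numpy as np

# Toy sketch of the kind of low-dimensional neighbour-weighting function the
# abstract describes. The functional form and parameters are illustrative
# assumptions, not the function learned in the paper.

def neighbour_weight(rel_angle, neighbour_speed, k_angle=1.0, v_scale=2.0):
    """Weight a naïve (focal) fish gives to one neighbour.

    rel_angle:        angle of the neighbour relative to the focal fish's
                      heading (0 = directly in front), in radians.
    neighbour_speed:  speed of the neighbour (toy units).

    Two qualitative properties from the abstract are encoded:
      * a neighbour in front gets more weight than one to the sides or behind;
      * the faster the neighbour, the larger its weight, and the angular
        dependence flattens out at high speeds.
    """
    # Angular anisotropy: largest in front (cos 0 = 1), smallest behind.
    anisotropy = 1.0 + k_angle * np.cos(rel_angle)
    # Speed gain: grows with neighbour speed.
    speed_gain = neighbour_speed / v_scale
    # At high speed the anisotropic term is swamped by the speed term, so the
    # neighbour's relative position matters less, as described in the abstract.
    return speed_gain + anisotropy / (1.0 + speed_gain)


def predicted_heading(focal_heading, neighbour_angles,
                      neighbour_headings, neighbour_speeds):
    """Predict a naïve fish's heading as the weight-averaged neighbour heading."""
    rel_angles = np.asarray(neighbour_angles) - focal_heading
    w = np.array([neighbour_weight(a, v)
                  for a, v in zip(rel_angles, neighbour_speeds)])
    # Average directions on the circle via weighted unit vectors.
    vec = (w[:, None] * np.column_stack([np.cos(neighbour_headings),
                                         np.sin(neighbour_headings)])).sum(axis=0)
    return np.arctan2(vec[1], vec[0])


if __name__ == "__main__":
    # A fast neighbour behind the focal fish can carry as much weight as a
    # slow neighbour directly in front of it.
    print(neighbour_weight(rel_angle=0.0, neighbour_speed=0.5))    # slow, in front
    print(neighbour_weight(rel_angle=np.pi, neighbour_speed=5.0))  # fast, behind
```

With these toy parameters, the fast neighbour behind receives a weight comparable to (here slightly larger than) the slow neighbour in front, reproducing the qualitative pattern in which neighbour speed acts as a confidence signal that overrides relative position.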