With recent advances in high-throughput, automated microscopy, there has been an increased demand for effective computational strategies to analyze large-scale, image-based data. To this end, computer vision approaches have been applied to cell segmentation and feature extraction, whereas machine-learning approaches have been developed to aid in phenotypic classification and clustering of data acquired from biological images. Here, we provide an overview of the commonly used computer vision and machine-learning methods for generating and categorizing phenotypic profiles, highlighting the general biological utility of each approach.
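The pipeline the overview describes (cell segmentation, per-cell feature extraction, then phenotypic classification or clustering) can be sketched in miniature. The example below is an illustrative toy, not code from the article: it thresholds a synthetic image, labels connected components as "cells," and extracts a simple phenotypic profile (area and mean intensity) per cell; all names and threshold values are assumptions.

```python
import numpy as np

def segment(image, thresh):
    """Threshold, then label connected foreground components (4-connectivity)."""
    mask = image > thresh
    labels = np.zeros(image.shape, dtype=int)
    n = 0
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue
        n += 1
        stack = [seed]
        while stack:  # iterative flood fill
            r, c = stack.pop()
            if (0 <= r < mask.shape[0] and 0 <= c < mask.shape[1]
                    and mask[r, c] and not labels[r, c]):
                labels[r, c] = n
                stack += [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]
    return labels, n

def extract_features(image, labels, n):
    """Per-cell phenotypic profile: [area, mean intensity]."""
    return np.array([[np.sum(labels == i), image[labels == i].mean()]
                     for i in range(1, n + 1)])

# Toy image: a small bright "cell" and a larger, dimmer one.
img = np.zeros((8, 8))
img[1:3, 1:3] = 0.9
img[4:8, 4:8] = 0.6
labels, n = segment(img, 0.5)
profiles = extract_features(img, labels, n)  # one feature row per cell
```

The resulting `profiles` matrix is the kind of input a downstream machine-learning step would classify or cluster; in practice, libraries such as scikit-image and scikit-learn replace the hand-rolled steps shown here.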

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5223612
DOI: http://dx.doi.org/10.1083/jcb.201610026
