Single-cell technology has opened new avenues for delineating cellular states at single-cell resolution and has become an essential tool for studying human diseases. Multiplexing enables cost-effective experiments by combining multiple samples and effectively mitigates batch effects. Each sample is first given a unique tag, and the samples are then pooled for library preparation and sequencing. After sequencing, samples are demultiplexed based on tag detection: cells belonging to one sample are expected to carry a higher amount of the corresponding tag than cells from other samples. In practice, however, demultiplexing is not straightforward because of noise and contamination from various sources, and its success depends on efficiently removing that contamination. Here, we perform a systematic benchmark combining different normalization methods and demultiplexing approaches using real-world and simulated datasets. We show that accounting for sequencing-depth variability increases the separability between tagged and untagged cells, and that a clustering-based approach outperforms existing tools. The clustering-based workflow is available as an R package from https://github.com/hwlim/hashDemux.
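The tag-based assignment and depth normalization described above can be illustrated with a minimal sketch. This is not the hashDemux clustering workflow itself, only a toy example of why per-cell depth normalization matters before assigning cells to samples; the counts and the log-CPM-style normalization are illustrative assumptions.

```python
import numpy as np

# Toy hashtag-count matrix: 6 cells x 3 sample tags.
# Each cell carries a dominant tag plus ambient contamination,
# and cells differ widely in total tag depth.
counts = np.array([
    [120,   5,   8],   # cell from sample 0, deep
    [ 12,   1,   0],   # cell from sample 0, shallow
    [  6, 300,  10],   # sample 1
    [  4,  40,   2],   # sample 1
    [  9,   7, 250],   # sample 2
    [  1,   2,  30],   # sample 2
])

# Depth normalization: divide by each cell's total tag count so that
# deep and shallow cells become comparable, then log-transform.
depth = counts.sum(axis=1, keepdims=True)
norm = np.log1p(counts / depth * 1e4)

# Naive assignment: each cell gets the tag with the largest
# normalized signal (real tools model noise and doublets too).
calls = norm.argmax(axis=1)
print(calls)  # [0 0 1 1 2 2]
```

Without the division by per-cell depth, a deep cell's contamination counts could rival a shallow cell's true signal; normalizing first is what makes tagged and untagged cells separable.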
DOI: http://dx.doi.org/10.1093/bfgp/elae039
BMC Bioinformatics, January 2025
College of Artificial Intelligence, Nanjing Agricultural University, Weigang No.1, Nanjing, 210095, Jiangsu, China.
Antimicrobial peptides (AMPs) are widely recognized as a promising solution to the antimicrobial resistance of microorganisms driven by the increasing overuse of antibiotics in medicine and agriculture worldwide. In this study, we propose UniAMP, a systematic prediction framework for discovering AMPs. We observe that the feature vectors used in various existing studies, constructed from peptide information such as sequence, composition, and structure, can be augmented and even replaced by representations inferred by deep learning models.
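The idea of augmenting handcrafted peptide features with learned representations can be sketched as follows. This is a generic illustration, not UniAMP's actual pipeline: `composition_features` is the classic 20-dimensional amino-acid composition, and `embed` is a random-vector stand-in for a pretrained protein encoder (a hypothetical placeholder, not a real model call).

```python
import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def composition_features(seq: str) -> np.ndarray:
    """Handcrafted feature: amino-acid composition (20-dim, sums to 1)."""
    seq = seq.upper()
    return np.array([seq.count(a) / len(seq) for a in AMINO_ACIDS])

def embed(seq: str, dim: int = 8) -> np.ndarray:
    """Stand-in for a learned embedding; a real pipeline would call a
    pretrained deep encoder here instead of drawing random numbers."""
    rng = np.random.default_rng(len(seq))  # deterministic toy seed
    return rng.standard_normal(dim)

# Toy peptide sequence; the combined vector concatenates both views.
seq = "GIGKFLHSAKKFGKAFVGEIMNS"
features = np.concatenate([composition_features(seq), embed(seq)])
print(features.shape)  # (28,)
```

A downstream classifier can then be trained on the concatenated vector, or, as the abstract suggests, on the learned embedding alone.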
Sensors (Basel)
January 2025
National Research Council of Italy, Institute for Microelectronics and Microsystems, 73100 Lecce, Italy.
The medical field encompasses several very different movement disorders, such as tremor, Parkinson's disease, and Huntington's disease, characterized by a wide range of motor and non-motor symptoms. Meanwhile, smart wrist devices, such as smartwatches, wristbands, and smart bracelets, are spreading among all categories of people.
Sensors (Basel)
January 2025
NUS-ISS, National University of Singapore, Singapore 119615, Singapore.
Recognizing the action of taking a plastic bag in CCTV video footage is a highly specialized and niche challenge within the broader domain of action video classification. To address it, this paper introduces a novel benchmark video dataset specifically curated for identifying the action of grabbing a plastic bag. Additionally, we propose and evaluate three distinct baseline approaches.
Sensors (Basel)
January 2025
School of Mechanical and Electrical Engineering, China University of Mining and Technology (Beijing), Beijing 100083, China.
Unsupervised Domain Adaptation for Object Detection (UDA-OD) aims to adapt a model trained on a labeled source domain to an unlabeled target domain, addressing the challenges posed by domain shift. Existing methods, however, often struggle to detect small objects and over-rely on classification confidence for pseudo-label selection, which frequently leads to inaccurate bounding-box localization. To address these issues, we propose a novel UDA-OD framework that leverages Scale Consistency (SC) and Temporal Ensemble Pseudo-Label Selection (TEPLS) to enhance cross-domain robustness and detection performance.
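The pseudo-label selection problem the abstract describes can be sketched generically. The paper's TEPLS method is not detailed here; this toy example only shows the underlying idea of ensembling confidence over time rather than trusting a single score, with all numbers and thresholds being illustrative assumptions.

```python
import numpy as np

# Hypothetical detection confidences for 5 candidate pseudo-labels
# across 4 teacher checkpoints (a temporal ensemble).
conf_history = np.array([
    [0.92, 0.90, 0.93, 0.91],   # consistently confident
    [0.95, 0.20, 0.90, 0.15],   # unstable across checkpoints
    [0.40, 0.42, 0.38, 0.41],   # consistently low
    [0.85, 0.88, 0.83, 0.86],   # consistently confident
    [0.60, 0.95, 0.55, 0.98],   # unstable across checkpoints
])

mean_conf = conf_history.mean(axis=1)
std_conf = conf_history.std(axis=1)

# Keep pseudo-labels that are both confident on average and stable
# over time, instead of relying on one classification score.
keep = (mean_conf > 0.8) & (std_conf < 0.05)
print(np.flatnonzero(keep))  # [0 3]
```

A single high-confidence snapshot (rows 1 and 4) is rejected here, which is the failure mode of single-score selection that temporal ensembling is meant to avoid.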