
Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4450636
DOI: http://dx.doi.org/10.3324/haematol.2014.114447

Publication Analysis

Top Keywords (frequency)

fine tuning (4)
tuning surface (4)
surface crlf2 (4)
crlf2 expression (4)
expression associated (4)
associated signaling (4)
signaling profile (4)
profile childhood (4)
childhood b-cell (4)
b-cell precursor (4)

Similar Publications

Norepinephrine in vertebrates and its invertebrate analog, octopamine, regulate the activity of neural circuits. We find that, when hungry, larvae switch activity in type II octopaminergic motor neurons (MNs) to high-frequency bursts, which coincide with locomotion-driving bursts in type I glutamatergic MNs that converge on the same muscles. Optical quantal analysis across hundreds of synapses simultaneously reveals that octopamine potentiates glutamate release by tonic type Ib MNs, but not phasic type Is MNs, and that this potentiation occurs via the G protein-coupled octopamine receptor (OAMB).


Learning the language of antibody hypervariability.

Proc Natl Acad Sci U S A

January 2025

Computer Science and Artificial Intelligence Laboratory, Massachusetts Institute of Technology, Cambridge, MA 02139.

Protein language models (PLMs) have demonstrated impressive success in modeling proteins. However, general-purpose "foundational" PLMs have limited performance in modeling antibodies due to the latter's hypervariable regions, which do not conform to the evolutionary conservation principles that such models rely on. In this study, we propose a transfer learning framework called Antibody Mutagenesis-Augmented Processing (AbMAP), which fine-tunes foundational models for antibody-sequence inputs by supervising on antibody structure and binding specificity examples.
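The snippet above only names the approach; as a rough, hypothetical sketch of the transfer-learning pattern it describes (a frozen foundational PLM supplying per-residue embeddings, with a small trainable head supervised on antibody structure and binding-specificity labels), a PyTorch-style example might look like the following. All class names, dimensions, and the toy data are assumptions, not AbMAP's actual API.

```python
import torch
import torch.nn as nn

class AntibodyAdapterHead(nn.Module):
    """Illustrative trainable head placed on top of embeddings from a
    frozen foundational protein language model (not AbMAP's real code)."""
    def __init__(self, embed_dim: int, out_dim: int):
        super().__init__()
        self.proj = nn.Sequential(
            nn.Linear(embed_dim, 256),
            nn.ReLU(),
            nn.Linear(256, out_dim),
        )

    def forward(self, plm_embeddings: torch.Tensor) -> torch.Tensor:
        # plm_embeddings: (batch, seq_len, embed_dim) per-residue vectors.
        # Mean-pool over the sequence, then project to the supervision
        # target (e.g. a binding-specificity score).
        pooled = plm_embeddings.mean(dim=1)
        return self.proj(pooled)

head = AntibodyAdapterHead(embed_dim=1280, out_dim=1)
optimizer = torch.optim.Adam(head.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

# Toy batch standing in for frozen-backbone embeddings and labels drawn
# from antibody structure / binding-specificity examples.
embeddings = torch.randn(8, 120, 1280)
labels = torch.randn(8)
loss = loss_fn(head(embeddings).squeeze(-1), labels)
loss.backward()
optimizer.step()
```

Only the head is updated here; the foundational PLM's weights stay frozen, which is what makes this a transfer-learning setup rather than training from scratch.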


The role of chromatin state in intron retention: A case study in leveraging large scale deep learning models.

PLoS Comput Biol

January 2025

Department of Computer Science, Colorado State University, Fort Collins, Colorado, United States of America.

Complex deep learning models trained on very large datasets have become key enabling tools for current research in natural language processing and computer vision. By providing pre-trained models that can be fine-tuned for specific applications, they enable researchers to create accurate models with minimal effort and computational resources. Large-scale genomics deep learning models come in two flavors: the first comprises large language models of DNA sequences trained in a self-supervised fashion, similar to the corresponding natural language models; the second comprises supervised learning models that leverage large-scale genomics datasets from ENCODE and other sources.
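For the first flavor mentioned above, self-supervised DNA language models are typically trained with a masked-token objective analogous to masked language modeling in NLP. The sketch below is purely illustrative (toy vocabulary, tiny model, random sequences) and does not correspond to any specific published genomics model.

```python
import torch
import torch.nn as nn

VOCAB = {"A": 0, "C": 1, "G": 2, "T": 3, "[MASK]": 4}

class TinyDnaLM(nn.Module):
    """Toy transformer encoder over DNA tokens with a masked-base head."""
    def __init__(self, d_model: int = 64, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        self.embed = nn.Embedding(len(VOCAB), d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.lm_head = nn.Linear(d_model, len(VOCAB))

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        return self.lm_head(self.encoder(self.embed(tokens)))

def mask_tokens(tokens: torch.Tensor, mask_prob: float = 0.15):
    # Randomly hide a fraction of bases; the model is trained to recover
    # the original base at each hidden position.
    masked = tokens.clone()
    mask = torch.rand(tokens.shape) < mask_prob
    masked[mask] = VOCAB["[MASK]"]
    return masked, mask

model = TinyDnaLM()
loss_fn = nn.CrossEntropyLoss()
tokens = torch.randint(0, 4, (8, 128))      # toy batch of DNA sequences
masked, mask = mask_tokens(tokens)
logits = model(masked)                       # (batch, seq_len, vocab)
loss = loss_fn(logits[mask], tokens[mask])   # score only masked positions
loss.backward()
```

Once pre-trained this way, such a backbone can be fine-tuned on a labeled downstream task, which is the workflow the paragraph describes.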


Semantical text understanding holds significant importance in natural language processing (NLP). Numerous datasets, such as Quora Question Pairs (QQP), have been devised for this purpose. In our previous study, we developed a Siamese Convolutional Neural Network (S-CNN) that achieved an F1 score of 82.
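The abstract only names the architecture; as a minimal, hypothetical sketch of a Siamese CNN for question-pair similarity of this kind, the example below shares one convolutional encoder across both inputs and scores the pair with cosine similarity. The vocabulary size, filter counts, and toy data are illustrative assumptions, not the authors' exact S-CNN.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SiameseCNN(nn.Module):
    """Toy Siamese CNN: both questions pass through the same encoder."""
    def __init__(self, vocab_size: int = 10000, embed_dim: int = 128, n_filters: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.conv = nn.Conv1d(embed_dim, n_filters, kernel_size=3, padding=1)

    def encode(self, tokens: torch.Tensor) -> torch.Tensor:
        x = self.embed(tokens).transpose(1, 2)   # (batch, embed_dim, seq_len)
        x = F.relu(self.conv(x))
        return x.max(dim=2).values               # max-over-time pooling

    def forward(self, q1: torch.Tensor, q2: torch.Tensor) -> torch.Tensor:
        # Weight sharing is the defining Siamese property: one encoder,
        # two inputs, one similarity score per pair.
        return F.cosine_similarity(self.encode(q1), self.encode(q2), dim=1)

model = SiameseCNN()
q1 = torch.randint(0, 10000, (4, 20))   # toy batch: 4 question pairs, 20 tokens each
q2 = torch.randint(0, 10000, (4, 20))
scores = model(q1, q2)                   # in [-1, 1]; threshold to call a pair a duplicate
```

On datasets like QQP, pairs scoring above a chosen similarity threshold would be predicted as duplicates, and metrics such as F1 are computed against the labeled pairs.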


Continuous-wave perovskite polariton lasers.

Sci Adv

January 2025

State Key Laboratory of Extreme Photonics and Instrumentation, College of Optical Science and Engineering, International Research Center for Advanced Photonics, Zhejiang University, Hangzhou 310027, China.

Solution-processed semiconductor lasers are next-generation light sources for large-scale, bio-compatible and integrated photonics. However, overcoming their performance-cost trade-off to rival III-V laser functionalities is a long-standing challenge. Here, we demonstrate room-temperature continuous-wave perovskite polariton lasers exhibiting remarkably low thresholds of ~0.

