Motivation: Nanobodies are a subclass of immunoglobulins whose binding site consists of only one peptide chain, bestowing favorable biophysical properties. Recently, the first nanobody therapy was approved, paving the way for further clinical applications of this antibody format. Further development of nanobody-based therapeutics could be streamlined by computational methods. One such method is infilling: positional prediction of biologically feasible mutations in nanobodies. Being able to identify possible positional substitutions based on sequence context facilitates functional design of such molecules.

Results: Here we present nanoBERT, a nanobody-specific transformer that predicts the amino acids at a given position in a query sequence. We demonstrate the need for such a machine learning-based protocol, as opposed to gene-specific positional statistics, since an appropriate genetic reference is not available. We benchmark nanoBERT against human-based language models and ESM-2, demonstrating the benefit of domain-specific language models. We also demonstrate the benefit of employing nanobody-specific predictions for fine-tuning on an experimentally measured thermostability dataset. We hope that nanoBERT will help engineers in a range of predictive tasks for designing therapeutic nanobodies.

Availability And Implementation: https://huggingface.co/NaturalAntibody/.
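The core infilling task described above amounts to ranking the 20 amino acids at a masked position by the model's predicted probabilities. A minimal sketch of that ranking step is below; the logits here are hypothetical stand-ins (in practice they would come from nanoBERT via the Hugging Face model hub linked above), and the function names are illustrative, not part of the released package.

```python
import math

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def rank_substitutions(logits):
    """Turn per-residue logits at one masked position into a
    probability-ranked list of candidate amino acids (softmax)."""
    m = max(logits.values())  # subtract max for numerical stability
    exps = {aa: math.exp(v - m) for aa, v in logits.items()}
    total = sum(exps.values())
    probs = {aa: e / total for aa, e in exps.items()}
    return sorted(probs.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical logits for one masked position in a nanobody sequence;
# a real run would obtain these from the language model instead.
toy_logits = {aa: 0.0 for aa in AMINO_ACIDS}
toy_logits["Y"] = 2.0  # suppose the model strongly prefers tyrosine here
toy_logits["W"] = 1.0

ranked = rank_substitutions(toy_logits)
print(ranked[0][0])  # top-ranked substitution at this position
```

A designer would scan such ranked lists position by position to shortlist substitutions that are plausible given the surrounding sequence context.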

Source:
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10978573
DOI: http://dx.doi.org/10.1093/bioadv/vbae033
