Infer global, predict local: Quantity-relevance trade-off in protein fitness predictions from sequence data.

PLoS Comput Biol

Laboratory of Physics of the Ecole Normale Supérieure, CNRS UMR8023 & PSL Research, Sorbonne Université, Paris, France.

Published: October 2023

Predicting the effects of mutations on protein function is an important issue in evolutionary biology and biomedical applications. Computational approaches, ranging from graphical models to deep-learning architectures, can capture the statistical properties of sequence data and predict the outcome of high-throughput mutagenesis experiments probing the fitness landscape around some wild-type protein. However, how the complexity of the models and the characteristics of the data combine to determine the predictive performance remains unclear. Here, based on a theoretical analysis of the prediction error, we propose descriptors of the sequence data, characterizing their quantity and relevance relative to the model. Our theoretical framework identifies a trade-off between these two quantities, and determines the optimal subset of data for the prediction task, showing that simple models can outperform complex ones when inferred from adequately-selected sequences. We also show how repeated subsampling of the sequence data is informative about how much epistasis in the fitness landscape is not captured by the computational model. Our approach is illustrated on several protein families, as well as on in silico solvable protein models.
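The "quantity" side of the trade-off described in the abstract can be illustrated with a toy experiment. The sketch below is not the paper's actual pipeline; the additive (epistasis-free) landscape, the binary sequences, and the independent-site "profile" model are all invented for illustration. It fits a simple model on repeated random subsamples of increasing size and measures how well the inferred fields rank sequences by fitness:

```python
import numpy as np

rng = np.random.default_rng(0)
L = 20  # toy binary "protein" of length 20 sites

# Hypothetical ground-truth additive landscape (assumption, not from the paper)
fields = rng.normal(size=L)

# Sample training/test sequences from the independent-site equilibrium
p_site = 1.0 / (1.0 + np.exp(-fields))
data = (rng.random((2000, L)) < p_site).astype(int)
test = (rng.random((500, L)) < p_site).astype(int)
y_true = test @ fields  # true additive fitness of test sequences

def fit_profile(subset):
    """Independent-site model: estimate fields from site frequencies."""
    f1 = subset.mean(axis=0).clip(1e-3, 1 - 1e-3)  # avoid log(0)
    return np.log(f1 / (1.0 - f1))

# Repeated subsampling: prediction quality vs. number of sequences used
mean_corr = {}
for n in (10, 100, 1000):
    corrs = []
    for _ in range(20):
        idx = rng.choice(len(data), size=n, replace=False)
        y_pred = test @ fit_profile(data[idx])
        corrs.append(np.corrcoef(y_true, y_pred)[0, 1])
    mean_corr[n] = float(np.mean(corrs))
    print(f"n = {n:4d}  mean Pearson r = {mean_corr[n]:.3f}")
```

Because this toy landscape has no epistasis and the data are perfectly "relevant" to the model, the prediction quality grows monotonically with sample size; in the paper's setting, mismatched or distant sequences would make the curve non-monotonic, which is the trade-off the abstract refers to.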

Source:
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10645369
DOI: http://dx.doi.org/10.1371/journal.pcbi.1011521
