Model reduction for the Chemical Master Equation: An information-theoretic approach.

J Chem Phys

School of Biological Sciences, University of Edinburgh, Edinburgh EH9 3JH, United Kingdom.

Published: March 2023

The complexity of mathematical models in biology has rendered model reduction an essential tool in the quantitative biologist's toolkit. For stochastic reaction networks described using the Chemical Master Equation, commonly used methods include time-scale separation, Linear Mapping Approximation, and state-space lumping. Despite the success of these techniques, they appear to be rather disparate, and at present, no general-purpose approach to model reduction for stochastic reaction networks is known. In this paper, we show that most common model reduction approaches for the Chemical Master Equation can be seen as minimizing a well-known information-theoretic quantity between the full model and its reduction, the Kullback-Leibler divergence defined on the space of trajectories. This allows us to recast the task of model reduction as a variational problem that can be tackled using standard numerical optimization approaches. In addition, we derive general expressions for propensities of a reduced system that generalize those found using classical methods. We show that the Kullback-Leibler divergence is a useful metric to assess model discrepancy and to compare different model reduction techniques using three examples from the literature: an autoregulatory feedback loop, the Michaelis-Menten enzyme system, and a genetic oscillator.
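
To make the variational formulation mentioned above concrete, here is a minimal sketch in assumed notation (not taken from the paper): if the full network has propensities a_j(x) and the reduced model is parametrized by propensities b_j^θ(x) on a (possibly lumped) state space, a path-space Kullback-Leibler objective is typically written as

\hat{\theta} \;=\; \arg\min_{\theta}\; D_{\mathrm{KL}}\!\left(\mathbb{P}_{\mathrm{full}} \,\middle\|\, \mathbb{P}^{\theta}_{\mathrm{red}}\right),
\qquad
D_{\mathrm{KL}}\!\left(\mathbb{P} \,\middle\|\, \mathbb{Q}\right) \;=\; \mathbb{E}_{\mathbb{P}}\!\left[\log\frac{\mathrm{d}\mathbb{P}}{\mathrm{d}\mathbb{Q}}\right],

where, for Markov jump processes whose path measures are mutually absolutely continuous, the log-likelihood ratio along a trajectory x_{[0,T]} with reaction events j_k at times t_k takes the standard Girsanov form

\log\frac{\mathrm{d}\mathbb{P}}{\mathrm{d}\mathbb{Q}}\!\left(x_{[0,T]}\right)
\;=\; \sum_{k}\log\frac{a_{j_k}\!\left(x_{t_k^-}\right)}{b^{\theta}_{j_k}\!\left(x_{t_k^-}\right)}
\;-\;\int_{0}^{T}\!\left[\sum_j a_{j}(x_t) \;-\; \sum_j b^{\theta}_{j}(x_t)\right]\mathrm{d}t.

The expectation over full-model trajectories can be estimated from stochastic simulations, which is what makes the minimization tractable with standard numerical optimization, as the abstract describes. The paper's exact definitions, in particular how time-scale separation, the Linear Mapping Approximation, or state-space lumping enter this objective, may differ from this sketch.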

Source: http://dx.doi.org/10.1063/5.0131445

