Cardiac pump function arises from a series of highly orchestrated events across multiple scales. Computational electromechanics can encode these events in physics-constrained models. However, the large number of parameters in these models has made it challenging to systematically study how cellular-, tissue-, and organ-scale parameters relate to whole-heart physiology.
Background: Personalised computer models are increasingly used to diagnose cardiac arrhythmias and tailor treatment. Patient-specific models of the left atrium are often derived from pre-procedural imaging of anatomy and fibrosis. These images contain noise that can affect simulation predictions.
Models of electrical excitation and recovery in the heart have become increasingly detailed, but have yet to be used routinely in the clinical setting to guide personalized intervention in patients. One of the main challenges is calibrating models from the limited measurements that can be made in a patient during a standard clinical procedure. In this work, we propose a novel framework for the probabilistic calibration of electrophysiology parameters on the left atrium of the heart using local measurements of cardiac excitability.
Calibration of cardiac electrophysiology models is a fundamental aspect of model personalization for predicting the outcomes of cardiac therapies, for simulation testing of device performance across a range of phenotypes, and for fundamental research into cardiac function. Restitution curves provide information on tissue function and can be measured using clinically feasible measurement protocols. We introduce novel "restitution curve emulators" as probabilistic models for performing model exploration, sensitivity analysis, and Bayesian calibration to noisy data.
In patients with atrial fibrillation, local activation time (LAT) maps are routinely used for characterizing patient pathophysiology. The gradient of LAT maps can be used to calculate conduction velocity (CV), which directly relates to material conductivity and may provide an important measure of atrial substrate properties. Including uncertainty in CV calculations would help with interpreting the reliability of these measurements.
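As a minimal illustration of the relationship this abstract describes (not the paper's actual pipeline, which also quantifies uncertainty), conduction speed can be estimated from a LAT map as the reciprocal of the gradient magnitude. The grid, spacing, and wave speed below are hypothetical:

```python
import numpy as np

# Hypothetical LAT map: a plane wave crossing a 2-D tissue patch at
# 0.8 mm/ms, sampled on a uniform grid with 0.5 mm spacing.
dx = dy = 0.5                      # mm
xs = np.arange(0.0, 20.0, dx)      # mm
ys = np.arange(0.0, 10.0, dy)
X, Y = np.meshgrid(xs, ys)
lat = X / 0.8                      # ms; wavefront travels in +x

# CV = 1 / |grad(LAT)|: a steep activation-time gradient means slow conduction.
g_y, g_x = np.gradient(lat, dy, dx)
cv = 1.0 / np.hypot(g_x, g_y)      # mm/ms (numerically equal to m/s)
```

For this noise-free plane wave the estimate recovers the true speed everywhere; with real, noisy LAT observations the gradient amplifies noise, which is why propagating uncertainty into CV matters.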
View Article and Find Full Text PDFPhilos Trans A Math Phys Eng Sci
June 2020
Uncertainty quantification (UQ) is a vital step in using mathematical models and simulations to make decisions. The field of cardiac simulation has begun to explore and adopt UQ methods to characterize uncertainty in model inputs and how that propagates through to outputs or predictions; examples of this can be seen in the papers of this issue. In this review and perspective piece, we draw attention to an important and under-addressed source of uncertainty in our predictions: that of uncertainty in the model structure or the equations themselves.
Objective: Local activation time (LAT) mapping of the atria is important for targeted treatment of atrial arrhythmias, but current methods do not interpolate on the atrial manifold and neglect uncertainties associated with LAT observations. In this paper, we describe novel methods to, first, quantify uncertainties in LAT arising from bipolar electrogram analysis and assignment of electrode recordings to the anatomical mesh, second, interpolate uncertain LAT measurements directly on left atrial manifolds to obtain complete probabilistic activation maps, and finally, interpolate LAT jointly across both the manifold and different S1-S2 pacing protocols.
Methods: A modified center of mass approach was used to process bipolar electrograms, yielding a LAT estimate and error distribution from the electrogram morphology.
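A bare-bones sketch of a plain (unmodified) centre-of-mass LAT estimate, to make the idea concrete: the activation time is taken as the amplitude-weighted mean time of the rectified electrogram. The signal below is synthetic, and the paper's modified approach additionally yields an error distribution, which this sketch omits:

```python
import numpy as np

def com_lat(t, egm):
    """Plain centre-of-mass LAT estimate: the amplitude-weighted
    mean time of the rectified electrogram deflection."""
    w = np.abs(egm)
    return np.sum(t * w) / np.sum(w)

# Synthetic bipolar electrogram: an oscillatory deflection in a
# Gaussian envelope centred at t = 4 ms.
t = np.linspace(0.0, 10.0, 1001)   # ms
egm = np.exp(-0.5 * ((t - 4.0) / 0.5) ** 2) * np.sin(40.0 * (t - 4.0))

lat_estimate = com_lat(t, egm)     # close to 4 ms for this symmetric deflection
```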
Three active learning schemes are used to generate training data for Gaussian process interpolation of intermolecular potential energy surfaces. These schemes aim to achieve the lowest predictive error using the fewest points and therefore act as an alternative to the status quo methods involving grid-based sampling or space-filling designs like Latin hypercubes (LHC). Results are presented for three molecular systems: CO-Ne, CO-H, and Ar.
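To illustrate the general idea of active learning for Gaussian process surrogates (a generic greedy uncertainty-sampling loop, not the specific schemes studied in the paper), the sketch below fits a toy one-dimensional GP to the current design and repeatedly adds the candidate point with the highest posterior variance. The kernel, length-scale, and candidate grid are all illustrative choices:

```python
import numpy as np

def rbf(a, b, ls=0.3):
    """Squared-exponential kernel on 1-D inputs (unit signal variance)."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def posterior_var(X, Xstar, noise=1e-6):
    """GP posterior variance at Xstar given training inputs X
    (for a GP, this does not depend on the observed values)."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xstar)
    return 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)

# Greedy uncertainty sampling: start from two points and repeatedly
# add the candidate where the surrogate is least certain.
X = np.array([0.1, 0.9])
cand = np.linspace(0.0, 1.0, 101)
for _ in range(5):
    X = np.append(X, cand[np.argmax(posterior_var(X, cand))])
```

Each added point collapses the predictive variance in its neighbourhood, so the design spreads into the regions the surrogate knows least about, rather than following a fixed grid or space-filling plan.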
A procedure is proposed to produce intermolecular potential energy surfaces from limited data. The procedure involves generation of geometrical configurations using a Latin hypercube design, with a maximin criterion, based on inverse internuclear distances. Gaussian processes are used to interpolate the data, using over-specified inverse molecular distances as covariates, greatly improving the interpolation.
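One common way to realise a maximin Latin hypercube in practice (a generic sketch on the unit cube, not the paper's exact construction, which works in inverse internuclear distances) is to draw several random Latin hypercube designs and keep the one whose closest pair of points is farthest apart:

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n, d, rng):
    """Random LHC on [0, 1)^d: exactly one point in each of n
    equal-width bins per dimension."""
    cols = [(rng.permutation(n) + rng.uniform(size=n)) / n for _ in range(d)]
    return np.column_stack(cols)

def min_pairwise_dist(X):
    """Smallest distance between any two design points."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    return D[np.triu_indices(len(X), k=1)].min()

# Maximin criterion: among candidate LHCs, keep the design that
# maximises the minimum pairwise distance.
design = max((latin_hypercube(20, 3, rng) for _ in range(50)),
             key=min_pairwise_dist)
```

The stratification guarantees good one-dimensional coverage, and the maximin selection discourages clustered points in the full space.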
Impurities from the carbon capture and storage (CCS) chain can greatly influence the physical properties of CO2. This has important design, safety and cost implications for the compression, transport and storage of CO2. There is an urgent need to understand and predict the properties of impure CO2 to assist with CCS implementation.
Optimal sex allocation theory is one of the most intricately developed areas of evolutionary ecology. Under a range of conditions, particularly under population sub-division, selection favours sex being allocated to offspring non-randomly, generating non-binomial variances of offspring group sex ratios. Detecting non-binomial sex allocation is complicated by stochastic developmental mortality, as offspring sex can often only be identified on maturity with the sex of non-maturing offspring remaining unknown.
Background: Cathepsin S has been implicated in a variety of malignancies with genetic ablation studies demonstrating a key role in tumor invasion and neo-angiogenesis. Thus, the application of cathepsin S inhibitors may have clinical utility in the treatment of cancer. In this investigation, we applied a cell-permeable dipeptidyl nitrile inhibitor of cathepsin S, originally developed to target cathepsin S in inflammatory diseases, in both in vitro and in vivo tumor models.
Cathepsin S (CatS) has been implicated in numerous tumourigenic processes and here we document for the first time its involvement in CCL2 regulation within the tumour microenvironment. Analysis of syngeneic tumours highlighted reduced infiltrating macrophages in CatS depleted tumours. Interrogation of tumours and serum revealed genetic ablation of CatS leads to the depletion of several pro-inflammatory chemokines, most notably, CCL2.
Cathepsin S is a member of the cysteine cathepsin protease family. It is a lysosomal protease which can promote degradation of damaged or unwanted proteins in the endo-lysosomal pathway. Additionally, it has more specific roles such as MHC class II antigen presentation, where it is important in the degradation of the invariant chain.
Lobsters are a ubiquitous and economically important group of decapod crustaceans that include the infraorders Polychelida, Glypheidea, Astacidea and Achelata. They include familiar forms such as the spiny, slipper, clawed lobsters and crayfish and unfamiliar forms such as the deep-sea and "living fossil" species. The high degree of morphological diversity among these infraorders has led to a dynamic classification and conflicting hypotheses of evolutionary relationships.
View Article and Find Full Text PDFStat Appl Genet Mol Biol
May 2013
Approximate Bayesian computation (ABC) or likelihood-free inference algorithms are used to find approximations to posterior distributions without making explicit use of the likelihood function, depending instead on simulation of sample data sets from the model. In this paper we show that under the assumption of the existence of a uniform additive model error term, ABC algorithms give exact results when sufficient summaries are used. This interpretation allows the approximation made in many previous application papers to be understood, and should guide the choice of metric and tolerance in future work.
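The basic rejection form of ABC that this result concerns can be sketched as follows. The model, prior, summary statistic, and tolerance here are hypothetical stand-ins (a normal mean with a normal prior, summarised by the sample mean, which is sufficient for this model):

```python
import numpy as np

rng = np.random.default_rng(1)

s_obs = 2.0   # observed summary statistic (sample mean of the data)
eps = 0.1     # tolerance of the uniform (accept/reject) kernel

accepted = []
while len(accepted) < 200:
    theta = rng.normal(0.0, np.sqrt(10.0))   # draw a parameter from the prior
    x = rng.normal(theta, 1.0, size=20)      # simulate a data set from the model
    if abs(x.mean() - s_obs) <= eps:         # keep theta if summaries are close
        accepted.append(theta)

# The accepted draws approximate the posterior; with a sufficient summary
# and eps -> 0 this targets the exact posterior for the assumed model.
post_mean = float(np.mean(accepted))
```

The uniform acceptance rule is exactly the "uniform additive model error" interpretation: accepted draws are exact posterior samples for the model augmented with that error term.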
Estimation of divergence times is usually done using either the fossil record or sequence data from modern species. We provide an integrated analysis of palaeontological and molecular data to give estimates of primate divergence times that utilize both sources of information. The number of preserved primate species discovered in the fossil record, along with their geological age distribution, is combined with the number of extant primate species to provide initial estimates of the primate and anthropoid divergence times.
The fossil record provides a lower bound on the primate divergence time of 54.8 million years ago, but does not provide an explicit estimate for the divergence time itself. We show how the pattern of diversification through the Cenozoic can be combined with a model for speciation to give a distribution for the age of the primates.