We provide an algorithm for the construction and analysis of autocorrelation (information) functions of gene nucleotide sequences. As a measure of correlation between discrete random variables, we use normalized mutual information. The information functions are indicative of the degree of structuredness of gene sequences.
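A minimal Python sketch of this type of information function is given below. It illustrates the general approach rather than the published algorithm: the normalization by the smaller marginal entropy and the `information_function` helper are assumptions made here for illustration.

```python
from collections import Counter
from math import log2
import random

def entropy(counts):
    """Shannon entropy (in bits) of a frequency table."""
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values() if c)

def normalized_mi(pairs):
    """Normalized mutual information of a list of (x, y) symbol pairs."""
    joint = Counter(pairs)
    x_marg = Counter(x for x, _ in pairs)
    y_marg = Counter(y for _, y in pairs)
    h_x, h_y = entropy(x_marg), entropy(y_marg)
    mi = h_x + h_y - entropy(joint)
    denom = min(h_x, h_y)          # one of several possible normalizations
    return mi / denom if denom > 0 else 0.0

def information_function(sequence, max_lag):
    """NMI between nucleotides at positions i and i + k, for k = 1..max_lag."""
    return [normalized_mi(list(zip(sequence, sequence[k:])))
            for k in range(1, max_lag + 1)]

# A perfectly periodic sequence is predictable at every lag (values near 1),
# while a random sequence carries almost no information between positions.
random.seed(0)
rand_seq = "".join(random.choice("ACGT") for _ in range(200))
print(information_function("ACGT" * 50, 4))
print(information_function(rand_seq, 4))
```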
In view of the frequent presence of several aging-related diseases in geriatric patients, there is a need for analytical methodologies that can perform diagnostic evaluation of several diseases at once, using individual or combined evaluation parameters, and select the most informative parameters or parameter combinations. So far, no formal methods have been established to enable such capabilities. We develop a new formal method for the evaluation of multiple age-related diseases: we calculate the informative value (normalized mutual information) of particular parameters or parameter combinations for particular diseases, and then combine the ranks of these informative values to provide an overall estimation across several diseases at once.
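A compact sketch of this rank-combination scheme is shown below, under the assumption of already-discretized parameter values and binary disease labels; the `combined_ranks` helper and the data layout are illustrative choices, not the paper's exact procedure.

```python
from collections import Counter
from itertools import combinations
from math import log2

def entropy(counts):
    """Shannon entropy (in bits) of a frequency table."""
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values() if c)

def normalized_mi(xs, ys):
    """Normalized mutual information between two discrete sequences."""
    h_x, h_y = entropy(Counter(xs)), entropy(Counter(ys))
    mi = h_x + h_y - entropy(Counter(zip(xs, ys)))
    denom = min(h_x, h_y)
    return mi / denom if denom > 0 else 0.0

def combined_ranks(parameters, diseases):
    """Score single parameters and parameter pairs by their informative value
    (NMI) for each disease, then order candidates by the sum of their
    per-disease ranks (rank 1 = most informative).

    parameters: {name: list of discretized values, one per patient}
    diseases:   {name: list of 0/1 labels, one per patient}
    """
    names = list(parameters)
    candidates = [(n,) for n in names] + list(combinations(names, 2))
    scores = {c: {d: normalized_mi(list(zip(*(parameters[n] for n in c))),
                                   diseases[d])
                  for d in diseases}
              for c in candidates}
    rank_sum = {c: 0 for c in candidates}
    for d in diseases:
        ordered = sorted(candidates, key=lambda c: -scores[c][d])
        for rank, c in enumerate(ordered, start=1):
            rank_sum[c] += rank
    return sorted(candidates, key=lambda c: rank_sum[c])

# Example with made-up values:
params = {"grip_strength": [0, 1, 2, 2, 1, 0], "bmi_band": [1, 1, 0, 2, 2, 0]}
labels = {"disease_A": [0, 1, 1, 1, 0, 0], "disease_B": [0, 0, 1, 1, 1, 0]}
print(combined_ranks(params, labels))   # candidates ordered best to worst
```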
Elderly patients are commonly characterized by the presence of several chronic aging-related diseases at once, or old-age "multimorbidity," with critical implications for diagnosis and therapy. However, at present there is no agreed formal method to diagnose, or even define, "multimorbidity." There is also no formal quantitative method to evaluate the effects of individual or combined diagnostic parameters and therapeutic interventions on multimorbidity.
The present work explores the application of information-theoretical measures, such as entropy and normalized mutual information, to the research of biomarkers of aging. The use of information theory affords unique methodological advantages for the study of aging processes: it allows the evaluation of non-linear relations between biological parameters and provides a precise quantitative measure of the strength of those relations, both for individual parameters and for parameter combinations, revealing cumulative or synergistic effects. Here we illustrate these capabilities using a dataset on heart disease, including diagnostic parameters routinely available to physicians.
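The comparison of individual and combined parameters can be sketched as follows; the values and names (`age_band`, `chol_band`) are made up for illustration and are not taken from the heart disease dataset used in the work.

```python
from sklearn.metrics import mutual_info_score

# Toy, pre-discretized records; values and column names are purely illustrative.
age_band  = [0, 1, 2, 2, 1, 0, 2, 1]
chol_band = [1, 0, 2, 1, 2, 0, 2, 1]
diagnosis = [0, 0, 1, 1, 1, 0, 1, 0]

mi_age  = mutual_info_score(diagnosis, age_band)
mi_chol = mutual_info_score(diagnosis, chol_band)
# Combined parameter: encode each (age, cholesterol) pair as one joint label.
joint = [f"{a}|{c}" for a, c in zip(age_band, chol_band)]
mi_both = mutual_info_score(diagnosis, joint)

print(f"I(diagnosis; age)       = {mi_age:.3f} nats")
print(f"I(diagnosis; chol)      = {mi_chol:.3f} nats")
print(f"I(diagnosis; age, chol) = {mi_both:.3f} nats")  # never below either single MI
```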
Prog Neurobiol, October 2017
This article reviews the application of information-theoretical analysis, employing measures of entropy and mutual information, to the study of aging and aging-related diseases. This research area is particularly suitable for information theory methods: aging processes and related diseases are multi-parametric, with continuous and discrete parameters coexisting, and the relations between parameters are, as a rule, non-linear. Information theory provides unique analytical capabilities for such problems, with clear advantages over common linear biostatistical methods.
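As a brief illustration of that point (a toy example, not drawn from the reviewed studies, with equal-width binning chosen here for simplicity): for a symmetric non-linear dependence, the Pearson correlation is near zero, whereas mutual information computed on the binned values clearly detects the relation.

```python
import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 10_000)
y = x**2 + rng.normal(0.0, 0.05, x.size)   # non-linear, non-monotonic relation

pearson = np.corrcoef(x, y)[0, 1]
# Discretize the continuous parameters into equal-width bins before computing MI.
x_bins = np.digitize(x, np.linspace(-1.0, 1.0, 11))
y_bins = np.digitize(y, np.linspace(0.0, 1.1, 11))
mi = mutual_info_score(x_bins, y_bins)

print(f"Pearson r: {pearson:.3f}")          # close to zero
print(f"Mutual information: {mi:.3f} nats") # clearly positive
```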