We show how to estimate the Kolmogorov-Sinai entropy rate for chaotic systems using the mutual information function, easily obtainable from experimental time series. We state the conditions under which the relationship is exact, and explore the usefulness of the approach for both maps and flows. We also explore refinements of the method, and study its convergence properties as a function of time series length.
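The abstract does not spell out the estimator itself, so the following is only a minimal sketch of how such an estimate is commonly set up: for a coarse-grained chaotic map, the time-delayed mutual information I(τ) of the observed signal initially decreases roughly linearly in τ with slope −h_KS, so a linear fit over the first few lags yields an entropy-rate estimate. The histogram estimator, bin count, fitting range, and logistic-map test signal below are illustrative choices, not the paper's specific procedure.

```python
# Minimal sketch: estimate an entropy rate from the initial decay of the
# time-delayed mutual information I(tau), assuming I(tau) ~ H - h*tau
# over the first few lags of a coarse-grained chaotic signal.
import numpy as np

def mutual_information(x, y, bins=64):
    """Histogram (plug-in) estimate of I(X;Y) in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px * py)[nz])))

def delayed_mi(x, lags, bins=64):
    """I(x_t ; x_{t+tau}) for each lag tau."""
    return np.array([mutual_information(x[:-tau], x[tau:], bins) for tau in lags])

# Synthetic test signal: logistic map x_{n+1} = 4 x_n (1 - x_n), whose
# KS entropy equals its Lyapunov exponent, ln 2 ≈ 0.693 nats per iterate.
N = 500_000
x = np.empty(N)
x[0] = 0.123456
for n in range(N - 1):
    x[n + 1] = 4.0 * x[n] * (1.0 - x[n])

lags = np.arange(1, 7)
mi = delayed_mi(x, lags)

# Fit the initial, roughly linear decay; the estimator's bias is nearly
# lag-independent, so it largely cancels in the slope.
slope, intercept = np.polyfit(lags[:4], mi[:4], 1)
print(f"entropy-rate estimate: {-slope:.3f} nats/iterate  (ln 2 = {np.log(2):.3f})")
```

For flows or noisy experimental data, the usable fitting range and the partition (bin) size have to be chosen more carefully, which is where the refinements and convergence studies mentioned above come in.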
DOI: http://dx.doi.org/10.1103/PhysRevE.84.046204