Abstract
The problem of defining and studying the complexity of a time series has interested researchers for years. In the context of dynamical systems, Grassberger suggested that a slow approach of the entropy to its extensive asymptotic limit is a sign of complexity. We investigate this idea further using information-theoretic and statistical-mechanics techniques and show that these arguments can be made precise, and that they generalize many previous approaches to complexity. In particular, they unify ideas from the physics literature with ideas from learning and coding theory; there are even connections between this statistical approach and algorithmic or Kolmogorov complexity. Moreover, a set of simple axioms, similar to those used by Shannon in his development of information theory, allows us to prove that the divergent part of the subextensive component of the entropy is a unique complexity measure. We classify time series by their complexities and demonstrate that, beyond the "logarithmic" complexity classes widely anticipated in the literature, there are qualitatively more complex "power-law" classes that deserve more attention.
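The decomposition behind this argument writes the entropy of N consecutive observations as S(N) = S_0 N + S_1(N), where the extensive term S_0 N grows linearly and the subextensive part S_1(N) carries the complexity: S_1 is bounded for simple processes, while growth like log N or N^alpha marks the complexity classes named above. The following is a minimal sketch of how one might probe S_1(N) numerically with a plug-in entropy estimator; the function name `block_entropy`, the toy processes, and the parameter `p_flip` are illustrative assumptions, not constructions from the paper.

```python
import numpy as np
from collections import Counter

def block_entropy(seq, n):
    """Plug-in Shannon entropy (in bits) of length-n blocks of a symbol sequence."""
    blocks = Counter(tuple(seq[i:i + n]) for i in range(len(seq) - n + 1))
    counts = np.array(list(blocks.values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(0)
T = 200_000

# Toy process 1: i.i.d. fair coin flips; the subextensive part should stay near zero.
iid = rng.integers(0, 2, size=T)

# Toy process 2: a "sticky" two-state Markov chain (hypothetical parameter p_flip);
# its subextensive part settles to a nonzero constant, still a finite complexity class.
p_flip = 0.05
flips = rng.random(T) < p_flip
markov = np.cumsum(flips) % 2  # state toggles at each flip event

for name, seq in (("iid", iid), ("markov", markov)):
    ns = range(1, 11)
    H = [block_entropy(seq, n) for n in ns]
    rate = H[-1] - H[-2]                        # entropy-rate estimate, i.e. S_0
    S1 = [h - n * rate for n, h in zip(ns, H)]  # subextensive remainder S_1(N)
    print(f"{name}: S1(N) = " + ", ".join(f"{s:.3f}" for s in S1))
```

For processes in the divergent classes (for example, sequences whose statistics depend on an unknown continuous parameter), the remainder S_1(N) would instead grow like log N or a power of N; distinguishing such growth from estimator bias at large N is the practical difficulty this kind of experiment runs into.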
| Original language | English (US) |
|---|---|
| Pages (from-to) | 89-99 |
| Number of pages | 11 |
| Journal | Physica A: Statistical Mechanics and its Applications |
| Volume | 302 |
| Issue number | 1-4 |
| DOIs | |
| State | Published - Dec 15 2001 |
| Event | International Workshop on Frontiers in the Physics of Complex Systems, Ramat-Gan, Israel (Duration: Mar 25 2001 → Mar 28 2001) |
All Science Journal Classification (ASJC) codes
- Statistics and Probability
- Condensed Matter Physics