James P. Crutchfield, Christopher J. Ellison, and John R. Mahoney
ABSTRACT: We show why the amount of information communicated between the past and future—the excess entropy—is not in general the amount of information stored in the present—the statistical complexity. This is a long-standing puzzle, since the latter is what is required for optimal prediction, while the former describes observed behavior. We lay out a classification scheme for dynamical systems and stochastic processes that determines when these two quantities are the same or different. We do this by developing closed-form expressions for the excess entropy in terms of optimal causal predictors and retrodictors—the ε-machines of computational mechanics. A process's causal irreversibility and crypticity are key determining properties.
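To make the gap between the two quantities concrete, the following sketch (not from the paper; the Golden Mean Process and its two-state ε-machine are a standard illustrative example from computational mechanics) estimates the excess entropy E from block entropies and compares it with the statistical complexity C_μ, the entropy over causal states. The transition matrices and the stationary distribution below encode the Golden Mean Process, which forbids consecutive 0s.

```python
import itertools
import math

import numpy as np

# Golden Mean Process (no consecutive 0s), illustrative example only.
# Two causal states: A (may emit 0 or 1) and B (must emit 1).
T = {
    0: np.array([[0.0, 0.5], [0.0, 0.0]]),  # emit 0: A -> B w.p. 1/2
    1: np.array([[0.5, 0.0], [1.0, 0.0]]),  # emit 1: A -> A w.p. 1/2, B -> A w.p. 1
}
pi = np.array([2 / 3, 1 / 3])  # stationary distribution over causal states

def block_entropy(L):
    """H(L) = -sum over length-L words w of P(w) log2 P(w)."""
    H = 0.0
    for w in itertools.product((0, 1), repeat=L):
        p = pi.copy()
        for s in w:
            p = p @ T[s]  # propagate word probability through the machine
        pw = p.sum()
        if pw > 0:
            H -= pw * math.log2(pw)
    return H

L = 10
h_mu = block_entropy(L) - block_entropy(L - 1)  # entropy-rate estimate
E = block_entropy(L) - L * h_mu                 # excess-entropy estimate
C_mu = -sum(p * math.log2(p) for p in pi)       # statistical complexity

print(f"h_mu ~ {h_mu:.4f}  E ~ {E:.4f}  C_mu ~ {C_mu:.4f}")
```

For this process E ≈ 0.2516 bits while C_μ ≈ 0.9183 bits: the process stores strictly more information in the present than it communicates from past to future, which is exactly the E ≠ C_μ phenomenon the abstract describes.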