Alexandra M. Jurgens and James P. Crutchfield
ABSTRACT: The ε-machine is a stochastic process' optimal model—maximally predictive and minimal in size. It often happens that to optimally predict even simply-defined processes, probabilistic models—including the ε-machine—must employ an uncountably-infinite set of features. To constructively work with these infinite sets we map the ε-machine to a place-dependent iterated function system (IFS)—a stochastic dynamical system. We then introduce the ambiguity rate that, in conjunction with a process' Shannon entropy rate, determines the rate at which this set of predictive features must grow to maintain maximal predictive power. We demonstrate, as an ancillary technical result which stands on its own, that the ambiguity rate is the (until now missing) correction to the Lyapunov dimension of an IFS's attractor. For a broad class of complex processes and for the first time, this then allows calculating their statistical complexity dimension—the information dimension of the minimal set of predictive features.
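The place-dependent IFS named in the abstract is a stochastic dynamical system in which one of several contracting maps is applied at each step, chosen with a probability that depends on the current point. The following sketch is illustrative only and is not taken from the paper: the two affine maps, the probability function `p0`, and all parameter values are hypothetical, chosen to show the mechanics of iterating such a system and estimating its Lyapunov exponent.

```python
import random
import math

# Hypothetical place-dependent IFS on [0, 1]: two contracting affine maps,
# with the probability of applying f0 depending on the current point x.
def f0(x):
    return 0.5 * x          # contracts toward 0 (slope 1/2)

def f1(x):
    return 0.5 * x + 0.5    # contracts toward 1 (slope 1/2)

def p0(x):
    # Place-dependent probability of applying f0; any continuous map
    # into (0, 1) would do. This particular form is an assumption.
    return 0.25 + 0.5 * x

def iterate_ifs(x0, n, seed=0):
    """Iterate the IFS for n steps from x0.

    Returns the orbit and an estimate of the Lyapunov exponent,
    computed as the time average of log|f_i'(x)| over applied maps.
    """
    rng = random.Random(seed)
    x = x0
    orbit = [x]
    log_derivs = []
    for _ in range(n):
        if rng.random() < p0(x):
            x, deriv = f0(x), 0.5
        else:
            x, deriv = f1(x), 0.5
        orbit.append(x)
        log_derivs.append(math.log(deriv))
    return orbit, sum(log_derivs) / len(log_derivs)

orbit, lyap = iterate_ifs(0.3, 10_000)
# Both maps have slope 1/2, so the Lyapunov exponent here is exactly log(1/2);
# in general it is an orbit average over the invariant measure.
```

In the paper's setting, the points of such an IFS correspond to predictive (mixed) states of the ε-machine, and the attractor's dimension is what the ambiguity-rate correction to the Lyapunov dimension is meant to capture.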