Synchronization and Control in Intrinsic and Designed Computation:
An Information-Theoretic Analysis of Competing Models of Stochastic Computation

James P. Crutchfield, Christopher J. Ellison, Ryan G. James, and John R. Mahoney

Complexity Sciences Center and Physics Department
Department of Computer Science
University of California at Davis
Davis, CA 95616

ABSTRACT: We adapt tools from information theory to analyze how an observer comes to synchronize with the hidden states of a finitary, stationary stochastic process. We show that synchronization is determined both by the process's internal organization and by an observer's model of it. We analyze these components using the convergence of state-block and block-state entropies, comparing them to the previously known convergence properties of the Shannon block entropy. Along the way, we introduce a hierarchy of information quantifiers as derivatives and integrals of these entropies, which parallels a similar hierarchy introduced for block entropy. We also draw out the duality between synchronization properties and a process's controllability. The tools lead to a new classification of a process's alternative representations in terms of minimality, synchronizability, and unifilarity.
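The block-entropy convergence mentioned in the abstract can be illustrated numerically. The following is a minimal sketch, not code from the paper: it estimates the Shannon block entropy H(L) from a sampled sequence and forms the discrete derivative h(L) = H(L) - H(L-1), the length-L entropy-rate estimate. The example process is the Golden Mean process (a standard example in this literature: binary sequences with no consecutive 1s), whose true entropy rate is 2/3 bit per symbol.

```python
import math
import random
from collections import Counter

def block_entropy(seq, L):
    """Shannon entropy H(L), in bits, of the length-L blocks occurring in seq."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Sample the Golden Mean process: after a 1 the next symbol must be 0;
# after a 0 the next symbol is 0 or 1 with equal probability.
random.seed(0)
seq, prev = [], 0
for _ in range(200_000):
    bit = 0 if prev == 1 else random.randint(0, 1)
    seq.append(bit)
    prev = bit

# Discrete derivative of the block entropy: h(L) = H(L) - H(L-1).
# For this order-1 Markov process it converges quickly to 2/3 bit/symbol.
H = [0.0] + [block_entropy(seq, L) for L in range(1, 9)]
h = [H[L] - H[L - 1] for L in range(1, 9)]
print([round(x, 3) for x in h])
```

The gap between h(L) and its limit, summed over L, gives the excess entropy, one of the "integrals" in the hierarchy of quantifiers the abstract refers to.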


James P. Crutchfield, Christopher J. Ellison, Ryan G. James, and John R. Mahoney, "Synchronization and Control in Intrinsic and Designed Computation: An Information-Theoretic Analysis of Competing Models of Stochastic Computation", CHAOS 20:3 (2010) 037105.
Santa Fe Institute Working Paper 10-08-015.
arXiv:1007.5354 [cond-mat.stat-mech].