### Information and Its Metric

James P. Crutchfield

Physics Department

University of California

Berkeley, California 94720, USA

**ABSTRACT:** Information is taken as the primary
physical entity from which probabilities can be derived. The information
produced by a source is defined via the class of sources that are
recoding-equivalent to it. Shannon entropy is one of a family of formal Rényi
information measures on the space of unique sources. Each of these
measures quantifies the volume of the source's recoding-equivalence
class. A space of information sources is constructed whose elements are
the classes of recoding-equivalent, but otherwise unique,
sources. The norm in this space is the source entropy. A measure of
distance between information sources is derived from an algebra of
measurements. With this, the space of information sources is shown to be
a metric space whose logic is described by a metric lattice.
Applications of the information metric to quantum informational
uncertainty and to information densities in multicomponent dynamical
systems are outlined.
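The abstract notes that Shannon entropy is one member of the family of Rényi information measures. As a minimal illustration of that relationship (the example distribution and code are my own sketch, not taken from the paper), the order-α Rényi entropy H_α(p) = log₂(Σ pᵢ^α)/(1−α) recovers the Shannon entropy in the limit α → 1:

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha, in bits, for a probability distribution p.

    For alpha == 1 this returns the Shannon entropy, which is the
    alpha -> 1 limit of the general formula.
    """
    if abs(alpha - 1.0) < 1e-12:
        # Shannon entropy: H = -sum p_i log2 p_i
        return -sum(x * math.log2(x) for x in p if x > 0)
    return math.log2(sum(x ** alpha for x in p)) / (1.0 - alpha)

# Illustrative distribution (hypothetical, chosen for simple values)
p = [0.5, 0.25, 0.25]
for a in (0.5, 1.0, 2.0):
    print(f"H_{a}(p) = {renyi_entropy(p, a):.4f} bits")
```

For this distribution the Shannon (α = 1) entropy is exactly 1.5 bits, and the Rényi entropies decrease monotonically as α increases, consistent with the general ordering of the Rényi family.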

J. P. Crutchfield, "Information and Its Metric",
in **Nonlinear Structures in Physical Systems -- Pattern
Formation, Chaos, and Waves**, L. Lam and H. C. Morris,
editors, Springer-Verlag, Berlin (1990) 119 - 130.

**NOTE:** Appears in the proceedings of the Second Woodward Conference,
San Jose State University, San Jose, California, 17 - 18 November 1989.