ABSTRACT: Information is taken as the primary physical entity from which probabilities can be derived. The information produced by a source is defined as the class of sources that are recoding-equivalent to it. Shannon entropy is one of a family of formal Rényi information measures on the space of unique sources; each of these measures quantifies the volume of the source's recoding-equivalence class. A space of information sources is constructed whose elements are classes of recoding-equivalent but otherwise unique sources, and the norm in this space is the source entropy. A measure of distance between information sources is derived from an algebra of measurements. With this, the space of information sources is shown to be a metric space whose logic is described by a metric lattice. Applications of the information metric to quantum informational uncertainty and to information densities in multicomponent dynamical systems are outlined.
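As background to the family of measures named in the abstract, the following minimal sketch illustrates the standard Rényi entropy H_α(p) = log(Σᵢ pᵢ^α)/(1−α) and its α → 1 limit, which recovers Shannon entropy. This is general information theory, not the paper's equivalence-class construction; the function names are illustrative only.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i, in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) = log(sum p_i^alpha) / (1 - alpha), in nats.

    At alpha == 1 the formula is singular; its limit is the Shannon entropy.
    """
    if abs(alpha - 1.0) < 1e-12:
        return shannon_entropy(p)
    return math.log(sum(pi ** alpha for pi in p if pi > 0)) / (1.0 - alpha)

p = [0.5, 0.25, 0.25]
# The Rényi entropies are non-increasing in alpha and approach the
# Shannon entropy as alpha tends to 1.
values = {a: renyi_entropy(p, a) for a in (0.5, 0.999, 1.0, 2.0)}
```

For this distribution, H_α at α ≈ 1 agrees with the Shannon value to within numerical precision, while α = 2 (collision entropy) is strictly smaller.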