Modes of Information Flow

Ryan G. James, Blanca Daniella Mansante Ayala, and James P. Crutchfield

Complexity Sciences Center
Physics Department
University of California at Davis
Davis, CA 95616

and

Bahti Zakirov
Department of Engineering Science and Physics, College of Staten Island
The City University of New York, 2800 Victory Blvd., Staten Island, NY 10314

ABSTRACT: Information flow between components of a system takes many forms and is key to understanding the organization and functioning of large-scale, complex systems. We demonstrate three modalities of information flow from time series X to time series Y. Intrinsic information flow exists when the past of X is individually predictive of the present of Y, independent of Y's past; this is the modality most commonly considered to be information flow. Shared information flow exists when X's past is predictive of Y's present in the same manner as Y's past; this occurs due to synchronization or common driving, for example. Finally, synergistic information flow occurs when neither X's nor Y's past is predictive of Y's present on its own, but taken together they are. The two most broadly employed information-theoretic methods of quantifying information flow—time-delayed mutual information and transfer entropy—are each sensitive to a pair of these modalities: time-delayed mutual information to both intrinsic and shared flow, and transfer entropy to both intrinsic and synergistic flow. To quantify each mode individually we introduce our cryptographic flow ansatz, positing that intrinsic flow is synonymous with secret key agreement between X and Y. Based on this, we employ an easily-computed secret-key-agreement bound—intrinsic mutual information—to quantify the three flow modalities in a variety of systems including asymmetric flows and financial markets.
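To illustrate the two standard measures named in the abstract, here is a minimal plug-in estimator sketch for binary time series. The estimator, variable names, and the copy-channel example are illustrative assumptions, not taken from the paper; with history length one, time-delayed mutual information is I[X_{t-1} : Y_t] and transfer entropy is I[X_{t-1} : Y_t | Y_{t-1}].

```python
from collections import Counter
from math import log2
import random

def entropy(symbols):
    """Plug-in Shannon entropy (bits) of a sequence of hashable symbols."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum(c / n * log2(c / n) for c in counts.values())

def time_delayed_mi(x, y):
    """I[X_{t-1} : Y_t] -- sensitive to intrinsic and shared flow."""
    xp, yf = x[:-1], y[1:]
    return entropy(xp) + entropy(yf) - entropy(list(zip(xp, yf)))

def transfer_entropy(x, y):
    """T_{X->Y} = I[X_{t-1} : Y_t | Y_{t-1}] -- sensitive to intrinsic
    and synergistic flow.  Expanded into joint entropies:
    H(Xp,Yp) + H(Yp,Yf) - H(Yp) - H(Xp,Yp,Yf)."""
    xp, yp, yf = x[:-1], y[:-1], y[1:]
    return (entropy(list(zip(xp, yp))) + entropy(list(zip(yp, yf)))
            - entropy(yp) - entropy(list(zip(xp, yp, yf))))

# Toy example: Y copies X with a one-step delay, a purely intrinsic flow.
# Both measures then report the full one bit carried by X's past.
random.seed(0)
x = [random.randint(0, 1) for _ in range(10000)]
y = [0] + x[:-1]  # y_t = x_{t-1}
print(round(time_delayed_mi(x, y), 2))   # ≈ 1.0 bit
print(round(transfer_entropy(x, y), 2))  # ≈ 1.0 bit
```

For a purely shared flow (e.g. X and Y synchronized by a common driver), time-delayed mutual information would remain large while transfer entropy vanishes, which is the asymmetry the paper's three-mode decomposition is built on.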


Ryan G. James, Blanca Daniella Mansante Ayala, Bahti Zakirov, and James P. Crutchfield, “Modes of Information Flow”, (2018).
doi:XXXX/XXXXXX.

arXiv:1808.06723.