Multivariate Dependence Beyond Shannon Information

R. G. James and J. P. Crutchfield

Complexity Sciences Center
Physics and Mathematics Departments
University of California at Davis
Davis, CA 95616


Accurately determining dependency structure is critical to discovering a system's causal organization. We recently showed that the transfer entropy fails in a key aspect of this task, measuring information flow, because it conflates dyadic and polyadic relationships. Here we extend that observation, demonstrating that all such Shannon information measures share this failing when used to analyze multivariate dependencies. This has broad implications, particularly when employing information to express the organization and mechanisms embedded in complex systems, including the burgeoning efforts to combine complex network theory with information theory. We do not suggest that any aspect of information theory is wrong. Rather, the vast majority of its informational measures are simply inadequate for determining the meaningful dependency structure within joint probability distributions, and therefore inadequate for discovering intrinsic causal relations. We close by demonstrating that joint distributions which confound these measures exist over an arbitrary number of variables.
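The kind of confound at issue can be sketched in a few lines: below are two joint distributions over three variables, one built purely from pairwise (dyadic) shared bits and one built from a three-way (triadic) parity constraint plus a bit shared by all three. They agree on every marginal Shannon entropy, and hence on every Shannon information measure derived from those entropies, while their dependency structures differ. This is a minimal illustration in the spirit of the paper's dyadic/triadic examples, not necessarily the exact distributions used there; the variable encoding (two bits packed into one symbol) is an assumption of this sketch.

```python
from itertools import product
from math import log2

def entropy(dist, vars_):
    """Shannon entropy (bits) of the marginal over the given variable indices."""
    marginal = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in vars_)
        marginal[key] = marginal.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marginal.values() if p > 0)

# Dyadic: three independent fair bits a, b, c; each variable packs two of
# them, so every pair of variables shares exactly one bit.
dyadic = {}
for a, b, c in product((0, 1), repeat=3):
    dyadic[(2 * a + b, 2 * b + c, 2 * c + a)] = 1 / 8

# Triadic: bits satisfying the parity a ^ b ^ c == 0, plus one extra bit d
# shared by all three variables -- an irreducibly three-way structure.
triadic = {}
for a, b, d in product((0, 1), repeat=3):
    c = a ^ b
    triadic[(2 * a + d, 2 * b + d, 2 * c + d)] = 1 / 8

# Every marginal entropy agrees, so every Shannon measure (mutual
# information, co-information, etc.) is blind to the difference.
subsets = [(0,), (1,), (2,), (0, 1), (0, 2), (1, 2), (0, 1, 2)]
for vars_ in subsets:
    assert abs(entropy(dyadic, vars_) - entropy(triadic, vars_)) < 1e-12
```

Since all Shannon measures are signed sums of such marginal entropies, no measure of this form can distinguish the two distributions, despite one encoding only pairwise dependencies and the other an intrinsically three-way one.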

R. G. James and J. P. Crutchfield, "Multivariate Dependence Beyond Shannon Information", Entropy (2017), to appear.
[pdf] 490 KB
Santa Fe Institute Working Paper 16-09-017. arXiv [math.IT].
Jupyter Notebook: mdbsi.ipynb.