Information Measures

There are many more measures of information than are typically presented in an information theory text. Below is a list of many of them; for further details and implementations of most, see the dit documentation. In the following, alternative names for a measure are given in square brackets.

There are a variety of measures directly based on Shannon's original measures, being sums and differences of entropies (a small worked sketch follows the list):

  • Entropy
  • Mutual Information
  • Multivariate Mutual Information [Co-Information]
  • Total Correlation [Multi-Information, Integration]
  • Binding Information [Dual Total Correlation]
  • Residual Entropy [Erasure Entropy]

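As a rough illustration of how these fit together, the following sketch (plain NumPy, not dit's implementation; helper names are illustrative) computes several of the measures above for the three-bit XOR distribution directly from their entropy-based definitions:

```python
# Illustrative sketch: Shannon-type measures computed from a joint pmf.
from itertools import combinations
import numpy as np

# Joint pmf over (X, Y, Z): X, Y uniform bits, Z = X xor Y.
pmf = {(0, 0, 0): 0.25, (0, 1, 1): 0.25, (1, 0, 1): 0.25, (1, 1, 0): 0.25}

def marginal(pmf, vars_):
    """Marginal pmf over the variables indexed by `vars_`."""
    out = {}
    for outcome, p in pmf.items():
        key = tuple(outcome[i] for i in vars_)
        out[key] = out.get(key, 0.0) + p
    return out

def H(pmf, vars_):
    """Shannon entropy (in bits) of the marginal over `vars_`."""
    ps = np.array(list(marginal(pmf, vars_).values()))
    ps = ps[ps > 0]
    return float(-np.sum(ps * np.log2(ps)))

n = 3
everything = tuple(range(n))
joint = H(pmf, everything)                              # H(X,Y,Z) = 2 bits
mi_xy = H(pmf, (0,)) + H(pmf, (1,)) - H(pmf, (0, 1))    # I(X;Y) = 0 bits

# Co-information: alternating sum of entropies over all non-empty subsets
# of the variables; it is -1 bit for the XOR distribution.
coinfo = sum((-1) ** (len(s) + 1) * H(pmf, s)
             for r in range(1, n + 1)
             for s in combinations(everything, r))

total_corr = sum(H(pmf, (i,)) for i in range(n)) - joint   # 1 bit
resid = sum(joint - H(pmf, tuple(j for j in everything if j != i))
            for i in range(n))                             # sum of H(X_i | rest) = 0 bits
dual_total_corr = joint - resid                            # 2 bits

print(joint, mi_xy, coinfo, total_corr, dual_total_corr, resid)
```
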
There are other measures that are not directly representable on I-diagrams, but are instead entropies or mutual informations of auxiliary variables; two representative definitions are sketched after the list:

  • Gács-Körner Common Information [Zero-Error Information]
  • Wyner Common Information
  • Minimal Markov Chain Information
  • Minimal Functional Markov Chain Information
  • Joint Minimal Sufficient Statistic
  • Intrinsic Information [Intrinsic Conditional Mutual Information]
  • Reduced Intrinsic Information

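These quantities are defined as optimizations over an auxiliary variable rather than as closed-form sums of entropies. As a rough illustration, the two best-known members of this family can be written as follows (standard textbook forms, not quoted from the dit documentation):

```latex
% Gács-Körner common information: the largest entropy achievable by a
% variable that is a deterministic function of X and also of Y.
K(X;Y) = \max_{V \,:\, H(V \mid X) = H(V \mid Y) = 0} H(V)

% Wyner common information: the least information an auxiliary variable
% must carry in order to render X and Y conditionally independent.
C(X;Y) = \min_{V \,:\, X \perp Y \mid V} I(X,Y;V)

% These bracket the mutual information: K(X;Y) <= I(X;Y) <= C(X;Y).
```
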
There are other measures that are not exactly entropies, but are closely related:

  • Interaction Information
  • TSE Complexity

There are a variety of measures of divergence or distance between distributions (a small worked sketch follows the list):

  • Jensen-Shannon Divergence
  • Relative Entropy [Kullback-Leibler Divergence]
  • Cross Entropy
  • Jensen-Rényi Divergence
  • Jensen-Tsallis Divergence

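A minimal sketch of the first three of these, assuming two pmfs on a shared support and working in bits (plain NumPy, not dit's implementation; the function names are illustrative):

```python
# Illustrative sketch: divergence measures between two pmfs on the same support.
import numpy as np

p = np.array([0.5, 0.25, 0.25])
q = np.array([0.25, 0.25, 0.5])

def entropy(p):
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def cross_entropy(p, q):
    """H(p, q) = -sum_x p(x) log2 q(x); assumes q > 0 wherever p > 0."""
    return float(-np.sum(p * np.log2(q)))

def kl_divergence(p, q):
    """Relative entropy D(p || q) = H(p, q) - H(p)."""
    return cross_entropy(p, q) - entropy(p)

def jensen_shannon_divergence(p, q):
    """JSD(p, q) = H((p+q)/2) - (H(p) + H(q)) / 2; symmetric, at most 1 bit."""
    m = (p + q) / 2
    return entropy(m) - (entropy(p) + entropy(q)) / 2

print(kl_divergence(p, q), cross_entropy(p, q), jensen_shannon_divergence(p, q))
```
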
Lastly, there are a number of alternative information measures (a small worked sketch follows the list):

  • Cumulative Residual Entropy
  • Extropy
  • Perplexity
  • Rényi Entropy
  • Tsallis Entropy

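A minimal sketch of several of these, again working from a single pmf in bits (plain NumPy, not dit's implementation; function names are illustrative):

```python
# Illustrative sketch: a few alternative information measures of a single pmf.
import numpy as np

p = np.array([0.5, 0.25, 0.125, 0.125])

def shannon_entropy(p):
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def renyi_entropy(p, alpha):
    """H_alpha(p) = log2(sum_i p_i^alpha) / (1 - alpha), for alpha != 1."""
    return float(np.log2(np.sum(p ** alpha)) / (1 - alpha))

def tsallis_entropy(p, q):
    """S_q(p) = (1 - sum_i p_i^q) / (q - 1), for q != 1."""
    return float((1 - np.sum(p ** q)) / (q - 1))

def perplexity(p):
    """2 raised to the Shannon entropy: the 'effective number' of outcomes."""
    return float(2 ** shannon_entropy(p))

def extropy(p):
    """J(p) = -sum_i (1 - p_i) log2(1 - p_i), the complementary dual of entropy."""
    r = 1 - p
    r = r[r > 0]
    return float(-np.sum(r * np.log2(r)))

print(renyi_entropy(p, 2), tsallis_entropy(p, 2), perplexity(p), extropy(p))
```
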
Special Topics

  • Directionality
    • Transinformation
    • Transfer Entropy
    • Directed Information
  • Permutation-based Measures
    • Permutation Entropy (a small sketch follows this list)
    • Transcriptions
  • Partial Information Decomposition
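
As an example of a permutation-based measure, here is a rough plug-in sketch of permutation entropy in the style of Bandt and Pompe, assuming a one-dimensional series and ignoring tie-breaking refinements (not dit's implementation; names are illustrative):

```python
# Illustrative sketch: permutation entropy as the Shannon entropy (in bits)
# of the distribution of ordinal patterns observed in a time series.
from collections import Counter
import numpy as np

def permutation_entropy(series, order=3, delay=1):
    """Plug-in estimate of the ordinal-pattern entropy of `series`."""
    series = np.asarray(series)
    n_windows = len(series) - (order - 1) * delay
    patterns = Counter(
        tuple(np.argsort(series[i:i + order * delay:delay]))
        for i in range(n_windows)
    )
    probs = np.array(list(patterns.values()), dtype=float)
    probs /= probs.sum()
    return float(-np.sum(probs * np.log2(probs)))

rng = np.random.default_rng(0)
noise = rng.normal(size=1000)           # white noise: close to log2(3!) bits
ramp = np.arange(1000, dtype=float)     # monotone series: exactly 0 bits
print(permutation_entropy(noise), permutation_entropy(ramp))
```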