Learning, Information Theory, and Nonequilibrium Statistical Mechanics

 
 

Berkeley workgroup on

       Learning, Information Theory, & Nonequilibrium Thermodynamics

 

Coordinates

EMAIL  lineq (at) lists.berkeley.edu

LOCATION  560 Evans Hall, UC Berkeley

TIME  3:30 PM every other Friday (approximately)

WEB https://calmail.berkeley.edu/manage/list/listinfo/lineq@lists.berkeley.edu

Ryan James (UC Davis): Information Anatomy

18 September 2015

Recently, a more nuanced view of the information dynamics of a stochastic process has arisen. This view decomposes the Shannon entropy rate (ubiquitous throughout science, and also known as the Kolmogorov-Sinai entropy, the metric entropy, and, via Pesin's relation, the sum of the positive Lyapunov exponents) into two components: the bound information and the ephemeral information. The bound information quantifies the generated information that plays a role in the future behavior of the system, while the ephemeral information does not. With this decomposition in hand, we use it to study how two representative chaotic systems, the tent map and the logistic map, generate information.
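
The decomposition can be illustrated numerically. The following is a minimal NumPy sketch, not code from the talk, that assumes the definitions used by James, Ellison, and Crutchfield in "Anatomy of a Bit": the ephemeral information r_mu = H[X_0 | past, future] and the bound information b_mu = I[X_0 ; future | past], so that the entropy rate satisfies h_mu = r_mu + b_mu. It estimates these quantities with finite past/future windows of length L from the binary symbolic dynamics of the logistic map at r = 4 (partition at x = 1/2); the function names and the choices of L and sequence length are illustrative.

import numpy as np
from collections import Counter


def logistic_symbols(n, x0=0.3, r=4.0, burn=1000):
    """Iterate the logistic map x -> r*x*(1-x) and binarize at x = 1/2."""
    x = x0
    for _ in range(burn):          # discard the transient
        x = r * x * (1.0 - x)
    symbols = np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)
        symbols[i] = 1 if x >= 0.5 else 0
    return symbols


def joint_entropy(symbols, offsets):
    """Empirical Shannon entropy (bits) of the symbols at the given relative offsets."""
    offsets = sorted(offsets)
    base = np.array(offsets) - offsets[0]
    span = offsets[-1] - offsets[0] + 1
    blocks = Counter(tuple(symbols[i + base]) for i in range(len(symbols) - span + 1))
    p = np.array(list(blocks.values()), dtype=float)
    p /= p.sum()
    return float(-(p * np.log2(p)).sum())


if __name__ == "__main__":
    s = logistic_symbols(200_000)
    for L in (2, 4, 6):            # finite past/future window lengths
        past = list(range(-L, 0))
        future = list(range(1, L + 1))
        # h_L = H[X_0 | past],  r_L = H[X_0 | past, future],  b_L = h_L - r_L
        h = joint_entropy(s, past + [0]) - joint_entropy(s, past)
        r = joint_entropy(s, past + [0] + future) - joint_entropy(s, past + future)
        b = h - r
        print(f"L={L}: h ~ {h:.3f}, ephemeral r_mu ~ {r:.3f}, bound b_mu ~ {b:.3f} bits/symbol")

At this parameter and partition the symbolic process is close to a fair coin, so most of the 1 bit/symbol entropy rate should appear as ephemeral information; the tent map at slope 2 is avoided here only because iterating it directly in double precision collapses to zero after a few dozen steps.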