Learning, Information Theory, and Nonequilibrium Statistical Mechanics
Berkeley workgroup on Learning, Information Theory, & Nonequilibrium Thermodynamics
Coordinates
EMAIL lineq (at) lists.berkeley.edu
LOCATION 560 Evans Hall, UC Berkeley
TIME 3:30 PM every other Friday (kinda)
WEB https://calmail.berkeley.edu/manage/list/listinfo/lineq@lists.berkeley.edu
Three Short Talks on the Drazin Inverse
23 October 2015
Paul Riechers: Drazin Inverse from a Spectral Perspective
Abstract: Within the last few years, several groups – coming at quite different problems with quite different techniques – have converged on the Drazin inverse as a crucial object in analyzing complex systems.
For complex processes that can be modeled by a (discrete-time or continuous-time) hidden Markov model (HMM), we have found the first closed-form expressions for many complexity measures in terms of the Drazin inverse of the transition dynamic (or functions of the transition dynamic) of the HMM. The expressions reveal unexpected relationships among purportedly different questions, and suggest useful generalizations of long-used measures.
In the setting of nonequilibrium thermodynamics, we have again found the Drazin inverse. This time, the Drazin inverse appears in exact results for the excess heat dissipated during arbitrarily fast transitions among nonequilibrium steady states, so long as the joint driving-input--system-state dynamic can be modeled as an HMM over some effective state space. Specifically, all moments of the excess heat distribution involve the Drazin inverse of the rate matrix over the hidden states of the joint environmental-input--system-state dynamic.
I will discuss the Drazin inverse from a spectral perspective, and will try to explain both its mathematical innards and its ubiquity in exact results for complex systems. At a minimum, I will show: (1) the general spectral decomposition of the Drazin inverse for nondiagonalizable linear operators, and (2) several exact closed-form expressions in which it appears.
Reference: “Exact Complexity: The Spectral Decomposition of Intrinsic Computation”, James P. Crutchfield, Christopher J. Ellison, and Paul M. Riechers. http://arxiv.org/abs/1309.3792.
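To make the spectral picture concrete, here is a minimal Python sketch (not from the talk; the 3-state transition matrix and tolerances are arbitrary illustrative choices). It computes the Drazin (group) inverse of A = I - T for an ergodic Markov chain via Meyer's formula (A + e pi)^{-1} - e pi, checks the defining properties, and cross-checks against the spectral decomposition over the nonzero eigenvalues.

import numpy as np

# Hypothetical 3-state transition matrix (rows sum to 1); the numbers are
# illustrative, not taken from the talk.
T = np.array([[0.80, 0.15, 0.05],
              [0.10, 0.70, 0.20],
              [0.25, 0.25, 0.50]])
n = T.shape[0]

# Stationary distribution pi: left eigenvector of T for eigenvalue 1.
evals, evecs = np.linalg.eig(T.T)
pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
pi /= pi.sum()

ones = np.ones((n, 1))
Pi = ones @ pi[None, :]            # eigenprojector onto the stationary subspace
A = np.eye(n) - T                  # singular: A @ ones = 0

# Group (Drazin) inverse for the index-1 case, via Meyer's formula:
#   A^D = (A + Pi)^{-1} - Pi
A_D = np.linalg.inv(A + Pi) - Pi

# Defining properties of the Drazin/group inverse.
assert np.allclose(A @ A_D @ A, A)
assert np.allclose(A_D @ A @ A_D, A_D)
assert np.allclose(A @ A_D, A_D @ A)

# Spectral cross-check (valid when A is diagonalizable): A^D inverts A on the
# nonzero part of the spectrum and annihilates the kernel.
lam, R = np.linalg.eig(A)
L = np.linalg.inv(R)
recon = sum((1.0 / l) * np.outer(R[:, i], L[i, :])
            for i, l in enumerate(lam) if abs(l) > 1e-12)
assert np.allclose(A_D, np.real(recon))
print(A_D)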
Dr. Subhaneil Lahiri: Learning and memory with complex synaptic plasticity
Abstract: Theorists frequently model synapses as a single number – the synaptic weight. In reality, there is a complex dynamical system underlying plasticity. This turns out to be crucial for the capacity of synaptic memory. We study the space of all possible Markov processes that could implement complex synaptic plasticity, to understand how different structures are suited to storing memories over different timescales.
Reference: “A memory frontier for complex synapses”, Subhaneil Lahiri and Surya Ganguli, http://ganguli-gang.stanford.edu/pdf/Synapse.NIPS14.pdf. Supplement: http://ganguli-gang.stanford.edu/pdf/Synapse.NIPS14.Supp.pdf
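For a flavor of the Markov-process framing of complex synapses, here is a small Python sketch in the spirit of the paper; the serial-chain topology, the +/-1 weight readout, and all parameter values are illustrative assumptions rather than the authors' construction. It builds potentiation and depression transition matrices, forms the generator of ongoing plasticity, and prints a memory signal that decays as the chain relaxes back to its steady state.

import numpy as np
from scipy.linalg import expm

M = 6                                                  # number of internal states
w = np.array([-1.0] * (M // 2) + [+1.0] * (M // 2))    # weight read out from each state

def shift_matrix(up, q=1.0):
    """Stochastic matrix that moves the state one step up (potentiation)
    or down (depression) the chain, with reflecting boundaries."""
    P = np.eye(M)
    for i in range(M):
        j = min(i + 1, M - 1) if up else max(i - 1, 0)
        P[i, i] -= q
        P[i, j] += q
    return P

M_pot, M_dep = shift_matrix(up=True), shift_matrix(up=False)

r = 1.0                   # rate of candidate plasticity events
f_pot = f_dep = 0.5       # fractions of potentiating / depressing events
W = r * (f_pot * M_pot + f_dep * M_dep - np.eye(M))    # generator of ongoing plasticity

# Steady-state distribution of the ongoing plasticity dynamics.
evals, evecs = np.linalg.eig(W.T)
p_inf = np.real(evecs[:, np.argmin(np.abs(evals))])
p_inf /= p_inf.sum()

# Memory curve: trace left by one tracked potentiation event at t = 0,
# read out through the weights as the chain relaxes back to p_inf.
for t in [0.1, 1.0, 10.0, 100.0]:
    signal = p_inf @ (M_pot - M_dep) @ expm(W * t) @ w
    print(f"t = {t:6.1f}   memory signal = {signal:.4f}")

Different internal topologies (serial chains, cascades, etc.) change how quickly this signal decays, which is the trade-off the paper's memory frontier quantifies.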
Dr. Dibyendu Mandal: Drazin Inverse and Steady State Thermodynamics
Abstract: Transitions between nonequilibrium steady states obey a generalized Clausius inequality, which becomes an equality in the quasistatic limit. For slow but finite transitions, we show that the behavior of the system is described by a response matrix whose elements are given by a far-from-equilibrium Green-Kubo formula, involving the decay of correlations evaluated in the nonequilibrium steady state. The response matrix is most naturally expressed in terms of the Drazin inverse of the generator of the dynamics. Furthermore, our results extend – to nonequilibrium steady states – the thermodynamic metric structure introduced by Sivak and Crooks for analyzing minimal-dissipation protocols.
Reference: “Analysis of slow transitions between nonequilibrium steady states”, Dibyendu Mandal and Christopher Jarzynski, http://arxiv.org/abs/1507.06269.
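To see how the Drazin inverse enters such a response matrix, here is a hedged Python sketch for a single control parameter; the 3-state rate matrix, the conjugate observable X, and the integration range are arbitrary illustrative choices, not taken from the paper. The friction coefficient is written as an integrated force-fluctuation correlation in the steady state, which collapses to a contraction with the Drazin (group) inverse of the generator, and is cross-checked by direct numerical integration.

import numpy as np
from scipy.linalg import expm

W = np.array([[-2.0,  1.0,  0.5],
              [ 1.5, -2.0,  1.0],
              [ 0.5,  1.0, -1.5]])      # columns sum to 0: dp/dt = W p
X = np.array([0.0, 1.0, 3.0])           # observable conjugate to the control parameter

# Steady state: right null vector of W.
evals, evecs = np.linalg.eig(W)
pi = np.real(evecs[:, np.argmin(np.abs(evals))])
pi /= pi.sum()

Pi = np.outer(pi, np.ones(3))           # eigenprojector onto the steady state
W_D = np.linalg.inv(W + Pi) - Pi        # Drazin/group inverse (index-1 case)
dX = X - X @ pi                         # steady-state fluctuation of X

# Green-Kubo-type coefficient: zeta = integral_0^inf <dX(t) dX(0)>_ss dt.
# It reduces to a Drazin-inverse contraction because
# integral_0^inf (exp(W t) - Pi) dt = -W_D.
zeta_drazin = dX @ (-W_D) @ (pi * dX)

# Cross-check by integrating the correlation function numerically.
ts = np.linspace(0.0, 40.0, 4001)
corr = np.array([dX @ expm(W * t) @ (pi * dX) for t in ts])
zeta_numeric = np.sum(corr) * (ts[1] - ts[0])

print(zeta_drazin, zeta_numeric)        # should agree closely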
Note: A gentle introduction to generalized inverses for master operators is given in Dr. Jordan M. Horowitz's thesis: http://drum.lib.umd.edu/handle/1903/10296. Section 1.1.2 is especially relevant for our discussion.