Thermodynamic Machine Learning through Maximum Work Production

Alexander B. Boyd and Mile Gu

Complexity Institute, Nanyang Technological University, Singapore
School of Physical and Mathematical Sciences, Nanyang Technological University, Singapore and Centre for Quantum
Technologies, National University of Singapore, Singapore

James P. Crutchfield

Complexity Sciences Center
Physics Department
University of California at Davis
Davis, CA 95616

ABSTRACT: Adaptive thermodynamic systems—such as a biological organism attempting to gain survival advantage, an autonomous robot performing a functional task, or a motor protein transporting intracellular nutrients—can improve their performance by effectively modeling the regularities and stochasticity in their environments. Analogously, but in a purely computational realm, machine learning algorithms seek to estimate models that capture predictable structure and identify irrelevant noise in training data by optimizing performance measures, such as a model's log-likelihood of having generated the data. Is there a sense in which these computational models are physically preferred? For adaptive physical systems we introduce the organizing principle that thermodynamic work is the most relevant performance measure of advantageously modeling an environment. Specifically, a physical agent's model determines how much useful work it can harvest from an environment. We show that when such agents maximize work production they also maximize their environmental model's log-likelihood, establishing an equivalence between thermodynamics and learning. In this way, work maximization appears as an organizing principle that underlies learning in adaptive thermodynamic systems.
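The claimed equivalence can be illustrated with a toy numerical sketch. Here we *assume*, purely for illustration, that an agent's harvested work from a data string is an increasing affine function of its model's log-likelihood, W(model) = kT · ln P(data | model) + C with C model-independent (the paper derives the precise relation; this form is our simplifying assumption). Under that assumption, the work-maximizing model and the maximum-likelihood model necessarily coincide. The Bernoulli candidate models and the example bit string below are hypothetical.

```python
import math

kT = 1.0  # temperature in units where k_B T = 1 (illustrative choice)
data = [1, 1, 0, 1, 1, 1, 0, 1]  # hypothetical training data

def log_likelihood(p, bits):
    """Log-likelihood of an IID Bernoulli(p) model for a bit string."""
    return sum(math.log(p if b else 1.0 - p) for b in bits)

def work(p, bits, C=0.0):
    """Work production under the ASSUMED affine work/likelihood relation
    W = kT * ln P(data | model) + C, with C model-independent."""
    return kT * log_likelihood(p, bits) + C

# Sweep candidate models and optimize each objective separately.
candidates = [i / 100 for i in range(1, 100)]
best_by_work = max(candidates, key=lambda p: work(p, data))
best_by_ll = max(candidates, key=lambda p: log_likelihood(p, data))

# The two optimizations select the same model: the empirical bit
# frequency 6/8 = 0.75, which is the maximum-likelihood estimate.
assert best_by_work == best_by_ll
print(best_by_work)  # → 0.75
```

Because the assumed relation is monotone in the log-likelihood, the agreement here is immediate; the substance of the paper lies in deriving that thermodynamic work production actually takes such a form for physical agents interacting with their environment.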

Alexander B. Boyd, James P. Crutchfield, and Mile Gu, “Thermodynamic Machine Learning through Maximum Work Production”, arXiv preprint [cond-mat.stat-mech] (2020).