Thermodynamics of Correlations and Structure in Information Engines

Alexander Blades Boyd
Complexity Sciences Center
Physics Department
University of California at Davis
Davis, CA 95616 USA

ABSTRACT: Understanding structured information and computation in thermodynamic systems is crucial to progress in diverse fields, from biology at the molecular level to designed nanoscale information processors. Landauer's Principle puts a bound on the energy cost of erasing a bit of information. This suggests that devices that exchange energy and information with their environment, which we call information engines, can use information as a thermodynamic fuel to extract work from a heat reservoir, or dissipate work to erase information. However, Landauer's Principle on its own neglects the detailed dynamics of physical information processing: the mechanics and structure between the start and end of a computation. Our work deepens our understanding of these nonequilibrium dynamics, leading to new principles of efficient thermodynamic control. We explore a particular type of information engine called an information ratchet, which processes a symbol string sequentially, transducing its input string to an output string. We derive a general energetic framework for these ratchets as they operate out of equilibrium, allowing us to calculate work and heat production exactly. We show that this very general form of computation must obey a Landauer-like bound, the Information Processing Second Law (IPSL), which shows that any form of temporal correlation is a potential thermodynamic fuel. We show that, in order to leverage that fuel, an autonomous information ratchet must have internal states that match the predictive states of the information reservoir. This leads to a thermodynamic principle of requisite complexity, much like Ashby's law of requisite variety in cybernetics. This principle is a consequence of the modularity of information transducers. We derive the modularity dissipation, an energetic cost beyond Landauer's bound that predicts the structural energy costs of different implementations of the same computation. Applying the modularity dissipation to information ratchets establishes design principles for thermodynamically efficient autonomous information processors. These principles prescribe the ratchet's structure so that the computation saturates the bound set by the IPSL and thus achieves maximum thermodynamic efficiency.
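For orientation, the two bounds referenced in the abstract can be sketched as follows, using standard notation from this literature; the dissertation's own conventions and derivations may differ in detail. Landauer's Principle bounds the work needed to erase one bit at temperature T, while the IPSL bounds the asymptotic work extracted per processed symbol by the difference between the Shannon entropy rates of the output and input symbol sequences:

\[
W_{\mathrm{erase}} \;\ge\; k_B T \ln 2,
\qquad
\langle W \rangle \;\le\; k_B T \ln 2 \,\bigl( h'_\mu - h_\mu \bigr),
\]

where \(k_B\) is Boltzmann's constant and \(h_\mu\) and \(h'_\mu\) denote the input and output entropy rates, respectively. In this reading, temporal correlations in the input lower \(h_\mu\) below the single-symbol entropy, leaving room for a well-matched ratchet to raise the entropy rate of the output and extract the corresponding work.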


A. B. Boyd, “Thermodynamics of Correlations and Structure in Information Engines”, Ph.D. dissertation, Physics Department, University of California at Davis, 2017. [pdf] 4 MB.