James P. Crutchfield


The unifying theme of my research is patterns—what they are, how nature produces them, and how we discover new ones. The origins of this interest date back to the 1970s, when the advent of powerful and interactive computers stimulated much work on nonlinear dynamics—deterministic chaos and bifurcations between distinct behaviors. This early work raised a number of questions on how the properties of nonlinear systems bear on the foundations of statistical mechanics, including the existence of nonequilibrium states and how one distinguishes molecular chaos—required to derive macroscopic properties from microscopic dynamics—from the mechanisms of deterministic chaos.

Progress during the 1980s in analyzing increasingly more complex nonlinear systems eventually showed that these foundational questions were special cases of broader issues: How is it that nature spontaneously generates macroscopic order and structure? What mechanisms support the production of structure? How does nature balance randomness and order as structure emerges? And, perhaps most important of all, what do we mean by structure, pattern, order, and regularity? Can there be a theory that allows us to measure patterns as concretely and workably as we measure randomness using thermodynamic entropy and temperature?

This focus on patterns led to an even more central question, How do we (or any agent moving through the natural world) discover patterns in the first place? I call this pattern discovery to distinguish it from pattern recognition—familiar in engineering, where one designs systems with a built-in palette of templates, and familiar in the natural sciences, where one analyzes data in terms of a hypothesized representation, such as with Fourier transforms. In these cases, a pattern is recognized when data most closely matches one of the stored templates. Pattern recognition, however, begs the question of discovery, Where do these representations come from in the first place?

Answering these questions led me to develop a generalization of statistical mechanics that explicitly defines structure and connects structure in natural systems to how they store and process information. In short, one asks, How does nature compute? The theory—unsurprisingly called computational mechanics—attempts to answer three quantitative questions: (i) how much historical information does a system store, (ii) where is that information stored, and (iii) how is it processed to produce future behavior? These computational properties complement the questions we typically ask in physics: How much energy is stored, in what form is it stored, and how is it transformed over time and space?

In its approach to patterns, computational mechanics uses the basic paradigm of statistical mechanics to synthesize nonlinear dynamics with information and computation theories. Over the last decade it has been used in a number of domains, some well outside physics—in learning theory, evolutionary biology, and neuroscience, for example. My current research focuses on applying computational mechanics to structure in disordered materials, distributed coordination in collectives of intelligent agents, pre-biotic evolution, quantum computation, and the dynamics of learning itself.

About Jim


EMAIL  chaos (at) ucdavis (dot) edu

OFFICE  197 Physics

PHONE  530-752-0600

WEB http://csc.ucdavis.edu/~chaos/

MAIL  Physics Department, University of California at Davis, One Shields Avenue, Davis, CA 95616


  1. Collective Cognition

  2. Nonlinear Dynamics

  3. Pattern Formation

  4. Evolutionary Dynamics

  5. Quantum Information, Dynamics, and Computing


  1. Complexity Sciences Center: Director

  2. Physics Department, UC Davis: Professor

  3. Art & Science Laboratory: Co-Founder, Vice President

  4. Graduate Group in Applied Mathematics

  5. Graduate Group in Computer Science

  6. Santa Fe Institute: External Professor