Nihat Ay, Associate Professor of Mathematics, University of Leipzig


Nihat studied mathematics and physics at the Ruhr University Bochum and received his Ph.D. in mathematics from the University of Leipzig in 2001. In 2003 and 2004 he was a postdoctoral fellow at the Santa Fe Institute and at the Redwood Neuroscience Institute (now the Redwood Center for Theoretical Neuroscience at UC Berkeley). After his postdoctoral stay in the USA he joined the Mathematical Institute of the Friedrich Alexander University in Erlangen at the assistant professor level. Since September 2005 he has worked as a Max Planck Research Group Leader at the Max Planck Institute for Mathematics in the Sciences in Leipzig, where he heads the Information Theory of Cognitive Systems group. As an external professor at the Santa Fe Institute he has been involved in research on complexity and robustness theory. Since September 2009 he has been affiliated with the University of Leipzig as an associate professor (Privatdozent) of mathematics.

Anthony Bell, Research Scientist, Redwood Center for Theoretical Neuroscience, UC Berkeley


Tony's long-term scientific goal is to work out how the brain learns (self-organizes). This has taken him in the direction of information theory and probability theory for neural networks. This provides a hopelessly crude and impoverished model (called redundancy reduction) of what the brain does and how it lives in its world. Unfortunately, it's the best we have at the moment. We have to do some new mathematics before we reach self-organizational principles that will apply to the physical substrate of the brain, which is molecular: ion channels, enzyme complexes, gene expression networks. We have to think about dynamics, loops, open systems, and how open dynamical systems can encode and effect the spatio-temporal trajectories of their perturbing inputs.

Luis Bettencourt, External Professor, Santa Fe Institute; Staff Researcher, Theoretical Division, LANL


Luís carries out research on the structure and dynamics of several complex systems, with an emphasis on dynamical problems in biology and society. Currently, he works on information processing in neural systems, information-theoretic optimization in collective behavior, urban organization and dynamics, and the development of science and technology. Luís obtained his PhD from Imperial College, University of London, for work on critical phenomena in the early Universe and associated mathematical techniques of Statistical Physics, Field Theory and Non-linear Dynamics. He held postdoctoral positions at the University of Heidelberg, Germany, as a Director’s Fellow in the Theoretical Division at LANL, and at the Center for Theoretical Physics at MIT. In 2000 he was awarded the distinguished Slansky Fellowship at Los Alamos National Laboratory for excellence in interdisciplinary research. He has been a scientist at LANL since the spring of 2003, first in the Computer and Computational Sciences Division (CCS), and since September 2005 in the Theoretical Division (T-5: Mathematical Modeling and Analysis). He is also an External Professor at the Santa Fe Institute.

Gregory Chaitin, IBM Thomas J. Watson Research Center and Computer Science, University of Maine


Greg is at the IBM Watson Research Center in New York. In the mid-1960s, when he was a teenager, he created algorithmic information theory (AIT), which combines, among other elements, Shannon's information theory and Turing's theory of computability. In the four decades since then he has been the principal architect of the theory. Among his contributions are the definition of a random sequence via algorithmic incompressibility, his information-theoretic approach to Gödel's incompleteness theorem, and the celebrated number Omega. His work on Hilbert's 10th problem has shown that in a sense there is randomness in arithmetic; in other words, God not only plays dice in quantum mechanics and nonlinear dynamics, but even in elementary number theory. His most recent achievements are the transformation of AIT into a theory about the size of real computer programs, programs that you can actually run, and his discovery that Leibniz anticipated AIT (1686). He is the author of nine books: Algorithmic Information Theory, published by Cambridge University Press; Information, Randomness & Incompleteness and Information-Theoretic Incompleteness, both published by World Scientific; The Limits of Mathematics, The Unknowable, Exploring Randomness and Conversations with a Mathematician, all published by Springer-Verlag; From Philosophy to Program Size, published by the Tallinn Institute of Cybernetics; and Meta Math!, published by Pantheon Books. In 1995 he was given the degree of doctor of science honoris causa by the University of Maine. In 2002 he was given the title of honorary professor by the University of Buenos Aires. In 2004 he was elected a corresponding member of the Académie Internationale de Philosophie des Sciences. He is also a visiting professor in the Computer Science Department of the University of Auckland, and serves on the international committee of the Valparaíso Complex Systems Institute.

Lukasz Debowski, Research Scientist, Institute of Computer Science, Polish Academy of Sciences


Lukasz's research interests revolve around probability, language, information, and learning. Lukasz works at the IPI PAN, with the Statistical Analysis and Modeling group and partly with the Linguistic Engineering group. Seeking big intellectual adventures, he first studied at the Faculty of Physics, University of Warsaw. Later, he also visited the UFAL, the Santa Fe Institute, the CSE UNSW, and the CWI. Many interesting people showed him strikingly different ideas about what is worth doing in alpha and beta sciences, in engineering, and in general. “I am slowly realizing what I should, and can, do best myself.”

James P. Crutchfield, Professor of Physics and Director, Complexity Sciences Center, Physics Department, University of California at Davis


Jim is Professor of Physics at the University of California, Davis, and Director of the Complexity Sciences Center, a new research and graduate program. Prior to this he was Research Professor at the Santa Fe Institute for many years, where he led its Dynamics of Learning Group and Network Dynamics Program. In parallel, he was Adjunct Professor of Physics in the Physics Department, University of New Mexico, Albuquerque. From 1985 until coming to SFI in 1997, he was a Research Physicist in the Physics Department at the University of California, Berkeley. He received his B.A. summa cum laude in Physics and Mathematics from the University of California, Santa Cruz, in 1979 and his Ph.D. in Physics there in 1983. He has been a Visiting Research Professor at the Sloan Center for Theoretical Neurobiology, University of California, San Francisco; a Post-doctoral Fellow of the Miller Institute for Basic Research in Science at UCB; a UCB Physics Department IBM Post-Doctoral Fellow in Condensed Matter Physics; a Distinguished Visiting Research Professor of the Beckman Institute at the University of Illinois, Urbana-Champaign; and a Bernard Osher Fellow at the San Francisco Exploratorium. He is co-founder and Vice President of the Art and Science Laboratory in Santa Fe [1].

Over the last three decades Jim has worked in the areas of nonlinear dynamics, solid-state physics, astrophysics, fluid mechanics, critical phenomena and phase transitions, chaos, and pattern formation. His current research interests center on computational mechanics, the physics of complexity, statistical inference for nonlinear processes, genetic algorithms, evolutionary theory, machine learning, quantum dynamics, and distributed intelligence. He has published over 110 papers in these areas; most are available from his website [2].

Andreas Trabesinger, Senior Editor, Nature Physics


Throughout his doctoral and post-doctoral studies, Andreas focused on various aspects of nuclear magnetic resonance, including applications to monitoring brain metabolism and NMR at very low magnetic fields. After graduating from the physics department of ETH Zürich in 2000, he conducted research at the Institute of Biomedical Engineering and in the Laboratory of Physical Chemistry at ETH, as well as at the Department of Chemistry at Berkeley, where he collaborated with the condensed-matter and atomic physics groups.

David Feldman, Professor, Physics and Astronomy, College of the Atlantic; Co-Director, SFI Complex Systems Summer School, Beijing

Dave's research training is in theoretical physics and mathematics, and his research interests lie in the fields of statistical mechanics and nonlinear dynamics. In particular, his research has examined how one might measure "complexity" or pattern in a mathematical system, and how such complexity is related to disorder. This work can be loosely categorized as belonging to the constellation of research topics often referred to as "chaos and complex systems." In his research, Dave uses both analytic and computational techniques. Dave has authored research papers in journals including Physical Review E, Chaos, Physics Letters A, and Advances in Complex Systems.
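
One generic illustration of this kind of measurement, sketched below for a simple binary sequence, is to estimate block entropies H(L): their growth rate gives an entropy-rate estimate (disorder), and the portion of H(L) not accounted for by that rate gives an excess-entropy estimate (a simple stand-in for structure). This is a minimal sketch for illustration only, not necessarily the measures used in Dave's own work, and the finite-L estimator is deliberately crude.

    import math
    import random
    from collections import Counter

    def block_entropy(seq, L):
        # Shannon entropy (in bits) of the distribution of length-L blocks.
        counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
        total = sum(counts.values())
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    def entropy_rate_and_excess(seq, max_L=8):
        # Crude finite-L estimates: entropy rate h ~ H(max_L) - H(max_L - 1);
        # excess entropy E ~ sum over L of [H(L) - H(L-1) - h], with H(0) = 0.
        H = [0.0] + [block_entropy(seq, L) for L in range(1, max_L + 1)]
        h = H[max_L] - H[max_L - 1]
        E = sum((H[L] - H[L - 1]) - h for L in range(1, max_L + 1))
        return h, E

    random.seed(0)
    random_seq = [random.randint(0, 1) for _ in range(20000)]   # disordered, unstructured
    periodic_seq = [0, 1] * 10000                               # ordered, 1 bit of structure

    for name, s in [("random", random_seq), ("period-2", periodic_seq)]:
        h, E = entropy_rate_and_excess(s)
        print(f"{name}: entropy rate = {h:.3f} bits/symbol, excess entropy = {E:.3f} bits")

On these toy sequences the sketch gives an entropy rate near 1 bit per symbol and excess entropy near 0 for the random sequence, and an entropy rate near 0 with excess entropy near 1 bit for the period-2 sequence, separating disorder from structure in the intended way.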

As a graduate student at UC Davis, Dave received several awards in recognition of both teaching and scholarship: the Dissertation Year Fellowship and the Chancellor's Teaching Fellowship, and he was nominated for the Outstanding Graduate Student Teaching Award. Dave joined the faculty at College of the Atlantic in 1998, where he teaches a wide range of physics and math courses. He also teaches classes that explore connections between science and politics, such as Making the Bomb (about the Manhattan Project and atomic weapons) and Gender and Science.

Jon Machta, Professor of Physics, University of Massachusetts at Amherst


Jon's research is in the area of theoretical condensed matter and statistical physics. His current research involves theoretical and computational studies of spin systems and applications of computational complexity theory to statistical physics.

John Mahoney, Post-doctoral Researcher, School of Natural Sciences, University of California at Merced

Melanie Mitchell, Professor, Computer Science, Portland State University; External Professor and Science Board member, Santa Fe Institute


Melanie received a Ph.D. in Computer Science from the University of Michigan in 1990. Since then she has held faculty or professional positions at the University of Michigan, the Santa Fe Institute, Los Alamos National Laboratory, the OGI School of Science and Engineering, and Portland State University.


Melanie has served as Director of the Santa Fe Institute’s Complex Systems Summer School; at Portland State University she teaches, among other courses, Exploring Complexity in Science and Technology.


Her major work is in the areas of analogical reasoning, complex systems, genetic algorithms, and cellular automata, and her publications in those fields are frequently cited. She is the author of An Introduction to Genetic Algorithms, a widely known introductory book published by MIT Press in 1996. Her most recent book is Complexity: A Guided Tour, named by Amazon.com as one of the 10 best science books of 2009.

Cris Moore, Professor of Computer Science, University of New Mexico


Cris is a Professor in the Computer Science Department at the University of New Mexico, with a joint appointment in the Department of Physics and Astronomy. He is also a Professor at the Santa Fe Institute. Cris studies interesting things like quantum computation (especially post-quantum cryptography and the possibility of algorithms for Graph Isomorphism), phase transitions in NP-complete problems (e.g. the colorability of random graphs, or the satisfiability of random formulas) and social networks (in particular, automated techniques for identifying important structural features of large networks).
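
The phase transition in random formulas that Cris mentions can be illustrated directly: the fraction of random 3-SAT formulas that are satisfiable drops sharply as the clause-to-variable ratio crosses a threshold (about 4.27 in the large-size limit). The sketch below is illustrative only; it uses a brute-force satisfiability check, which is feasible only for very small instances, and the particular sizes and random seed are arbitrary assumptions.

    import random
    from itertools import product

    def random_3sat(n_vars, n_clauses):
        # Each clause: 3 distinct variables, each with a randomly chosen required value.
        return [[(v, random.choice([True, False]))
                 for v in random.sample(range(n_vars), 3)]
                for _ in range(n_clauses)]

    def satisfiable(formula, n_vars):
        # Brute-force check over all 2**n_vars assignments (tiny n only).
        for assignment in product([False, True], repeat=n_vars):
            if all(any(assignment[v] == want for v, want in clause) for clause in formula):
                return True
        return False

    random.seed(1)
    n_vars, trials = 12, 50
    for ratio in (2.0, 3.0, 4.0, 4.3, 5.0, 6.0):
        n_clauses = int(ratio * n_vars)
        n_sat = sum(satisfiable(random_3sat(n_vars, n_clauses), n_vars) for _ in range(trials))
        print(f"clauses/variables = {ratio:.1f}: fraction satisfiable = {n_sat / trials:.2f}")

At these small sizes the transition is smeared out by finite-size effects, but the qualitative drop from almost always satisfiable to almost never satisfiable is already visible.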

Rob Shaw, Research Scientist, ProtoLife, Inc., Venezia, Italy

"Dominos, Ergodic Flows": We present a model, developed with Norman Packard, of a simple discrete open flow system. Dimers are created at one edge of a two-dimensional lattice, diffuse across, and are removed at the opposite side. A steady-state flow is established, under various kinetic rules. In the equilibrium case, the system reduces to the classical monomer-dimer tiling problem, whose entropy as a function of density is known. This entropy density is reproduced locally in the flow system, as shown by statistics over local templates. The goal is to clarify informational aspects of a flowing pattern.

Susanne Still, Associate Professor of Computer Science, Department of Information and Computer Sciences, University of Hawaii at Manoa


Most research in learning theory deals with passive learning. However, many real-world learning problems are interactive, and so is animal learning. The theoretical foundations for interactive learning and behavior are much less developed than those for passive learning. A theoretical understanding of behavioral learning lies at the heart of a new generation of machine intelligence, and is also at the core of many interesting questions about adaptation and learning in biology.

Rui Vilela-Mendes, Professor of Mathematics, Instituto Superior Tecnico, Lisboa, Portugal


Rui received an Electrical Engineering degree from the Technical University of Lisbon (IST), a Ph.D. in Physics from the University of Texas at Austin, and a Habilitation in Mathematics from the University of Lisbon. He is currently a member of the Center for Mathematics and Applications (CMAF-UL) and of the Institute for Plasmas and Nuclear Fusion (IPFN-IST), as well as a member of the Lisbon Academy of Sciences. He has been a visiting researcher at CERN, CNRS (Marseille), IHES (Bures), and the University of Bielefeld, and a co-organizer of and collaborator on several international research projects on Theoretical Physics and the Sciences of Complexity.


Over the last few decades Rui has worked in the areas of mathematical economics, nonlinear dynamics and control, stochastic processes, and quantum theory. His current research interests center on mathematical economics, the physics of complexity, control, and quantum computing.

Karoline Wiesner, Assistant Professor, School of Mathematics and Centre for Complexity Sciences, University of Bristol


Bateson defines information as “a difference that makes a difference”. Complexity is “when quantitative differences become qualitative differences.” We need information theory to identify this difference. Key to my work is coming up with good measures of complexity for classical (biological) and quantum systems. The goal is to build a tool set for identifying and measuring structure. Part of this tool set is a hierarchy of classical and quantum computational architectures. How difficult it is to generate a given structure determines how high up in this architectural hierarchy its representation is found.

Randomness, Structure, and Causality: Measures of Complexity from Theory to Applications