Randomness, Structure and Causality - Bios
From Santa Fe Institute Events Wiki
Nihat Ay, Associate Professor of Mathematics, University of Leipzig
Nihat studied mathematics and physics at the Ruhr University Bochum and received his Ph.D. in mathematics from the University of Leipzig in 2001. In 2003 and 2004 he was a postdoctoral fellow at the Santa Fe Institute and at the Redwood Neuroscience Institute (now the Redwood Center for Theoretical Neuroscience at UC Berkeley). After his postdoctoral stay in the USA he became a member of the Mathematical Institute of the Friedrich Alexander University in Erlangen at the assistant professor level. Since September 2005 he has worked as a Max Planck Research Group Leader at the Max Planck Institute for Mathematics in the Sciences in Leipzig, where he heads the group Information Theory of Cognitive Systems. As an external professor of the Santa Fe Institute he has been involved in research on complexity and robustness theory. Since September 2009 he has been affiliated with the University of Leipzig as associate professor (Privatdozent) for mathematics.
Anthony Bell, Research Scientist, Redwood Center for Theoretical Neuroscience, UC Berkeley
Tony's long-term scientific goal is to work out how the brain learns (self-organises). This has taken him in directions of Information Theory and probability theory for neural networks. This provides a hopelessly crude and impoverished model (called redundancy reduction) of what the brain does and how it lives in its world. Unfortunately, it's the best we have at the moment. We have to do some new mathematics before we reach self-organisational principles that will apply to the physical substrate of the brain, which is molecular: ion channels, enzyme complexes, gene expression networks. We have to think about dynamics, loops, open systems, how open dynamical systems can encode and effect the spatio-temporal trajectories of their perturbing inputs.
Carl T. Bergstrom, Professor, Department of Biology, University of Washington
Carl uses mathematical models and computer simulations to study a range of problems in population biology, animal behavior, and evolutionary theory. His current research efforts are concentrated in two areas:
- Information in biological systems. How do living organisms acquire, store, and make use of information? How and why does communication evolve? How does information flow through biological or social networks?
- The ecology and evolution of infectious disease. How do pathogens evolve and spread through populations? How do populations evolve in response to pathogen challenge?
Gregory Chaitin, IBM Thomas J. Watson Research Center and Computer Science, University of Maine
Greg is at the IBM Watson Research Center in New York. In the mid-1960s, when he was a teenager, he created algorithmic information theory (AIT), which combines, among other elements, Shannon's information theory and Turing's theory of computability. In the four decades since then he has been the principal architect of the theory. Among his contributions are the definition of a random sequence via algorithmic incompressibility, his information-theoretic approach to Gödel's incompleteness theorem, and the celebrated halting probability Ω. His work on Hilbert's 10th problem has shown that in a sense there is randomness in arithmetic; in other words, that God not only plays dice in quantum mechanics and nonlinear dynamics, but even in elementary number theory. His latest achievements have been to transform AIT into a theory about the size of real computer programs, programs that you can actually run, and his recent discovery that Leibniz anticipated AIT (1686). He is the author of nine books: Algorithmic Information Theory, published by Cambridge University Press; Information, Randomness & Incompleteness and Information-Theoretic Incompleteness, both published by World Scientific; The Limits of Mathematics, The Unknowable, Exploring Randomness and Conversations with a Mathematician, all published by Springer-Verlag; From Philosophy to Program Size, published by the Tallinn Institute of Cybernetics; and Meta Math!, published by Pantheon Books. In 1995 he was given the degree of doctor of science honoris causa by the University of Maine. In 2002 he was given the title of honorary professor by the University of Buenos Aires. In 2004 he was elected a corresponding member of the Académie Internationale de Philosophie des Sciences. He is also a visiting professor at the Computer Science Department of the University of Auckland, and on the international committee of the Valparaíso Complex Systems Institute.
James P. Crutchfield, Professor of Physics and Director, Complexity Sciences Center, Physics Department, University of California at Davis
Jim is Professor of Physics at the University of California, Davis, and Director of the Complexity Sciences Center, a new research and graduate program. Prior to this he was Research Professor at the Santa Fe Institute for many years, where he led its Dynamics of Learning Group and Network Dynamics Program. In parallel, he was Adjunct Professor of Physics in the Physics Department, University of New Mexico, Albuquerque. From 1985 until coming to SFI in 1997, he was a Research Physicist in the Physics Department at the University of California, Berkeley. He received his B.A. summa cum laude in Physics and Mathematics from the University of California, Santa Cruz, in 1979 and his Ph.D. in Physics there in 1983. He has been a Visiting Research Professor at the Sloan Center for Theoretical Neurobiology, University of California, San Francisco; a Post-doctoral Fellow of the Miller Institute for Basic Research in Science at UCB; a UCB Physics Department IBM Post-Doctoral Fellow in Condensed Matter Physics; a Distinguished Visiting Research Professor of the Beckman Institute at the University of Illinois, Urbana-Champaign; and a Bernard Osher Fellow at the San Francisco Exploratorium. He is co-founder and Vice President of the Art and Science Laboratory in Santa Fe.
Over the last three decades Jim has worked in the areas of nonlinear dynamics, solid-state physics, astrophysics, fluid mechanics, critical phenomena and phase transitions, chaos, and pattern formation. His current research interests center on computational mechanics, the physics of complexity, statistical inference for nonlinear processes, genetic algorithms, evolutionary theory, machine learning, quantum dynamics, and distributed intelligence. He has published over 110 papers in these areas, most of which are available from his website.
Lukasz Debowski, Research Scientist, Institute of Computer Science, Polish Academy of Sciences
Lukasz's research interests revolve around probability, language, information, and learning.
Lukasz works at IPI PAN, in the Statistical Analysis and Modeling group and partly in the Linguistic Engineering group. Seeking big intellectual adventures, he first studied at the Faculty of Physics, University of Warsaw. Later, he also visited UFAL, the Santa Fe Institute, CSE UNSW, and CWI. Many interesting people have shown him strikingly different ideas about what is worth doing in the alpha and beta sciences, in engineering, and in general. "I am slowly realizing what I should and can do best myself."
David Feldman, Professor, Physics and Astronomy, College of the Atlantic; Co-Director, SFI Complex Systems Summer School, Beijing
Dave's research training is in theoretical physics and mathematics, and his research interests lie in the fields of statistical mechanics and nonlinear dynamics. In particular, his research has examined how one might measure "complexity" or pattern in a mathematical system, and how such complexity is related to disorder. This work can be loosely categorized as belonging to the constellation of research topics often referred to as "chaos and complex systems." In his research, Dave uses both analytic and computational techniques. Dave has authored research papers in journals including Physical Review E, Chaos, Physics Letters A, and Advances in Complex Systems.
As a graduate student at UC-Davis, Dave received several awards in recognition of both teaching and scholarship: The Dissertation Year Fellowship; The Chancellor's Teaching Fellowship; and he was nominated for the Outstanding Graduate Student Teaching Award. Dave joined the faculty at College of the Atlantic in 1998, where he teaches a wide range of physics and math courses. He also teaches classes that explore connections between science and politics, such as Making the Bomb (about the Manhattan project and atomic weapons), and Gender and Science.
Jon Machta, Professor of Physics, University of Massachusetts at Amherst
Jon's research is in the area of theoretical condensed matter and statistical physics. His current research involves theoretical and computational studies of spin systems and applications of computational complexity theory to statistical physics.
John Mahoney, Post-doctoral Researcher, School of Natural Sciences, University of California at Merced
Melanie Mitchell, Professor, Computer Science, Portland State University; External Professor and Science Board member, Santa Fe Institute
Melanie Mitchell received a Ph.D. in Computer Science from the University of Michigan in 1990. Since then she has held faculty or professional positions at the University of Michigan, the Santa Fe Institute, Los Alamos National Laboratory, the OGI School of Science and Engineering, and Portland State University.
Melanie has served as Director of the Santa Fe Institute’s Complex Systems Summer School; at Portland State University she teaches, among other courses, Exploring Complexity in Science and Technology.
Her major work is in the areas of analogical reasoning, complex systems, genetic algorithms, and cellular automata, and her publications in those fields are frequently cited. She is the author of An Introduction to Genetic Algorithms, a widely known introductory book published by MIT Press in 1996. Her most recent book is Complexity: A Guided Tour, named by Amazon.com as one of the 10 best science books of 2009.
Cris Moore, Professor of Computer Science, University of New Mexico
Cris is a Professor in the Computer Science Department at the University of New Mexico, with a joint appointment in the Department of Physics and Astronomy. He is also a Professor at the Santa Fe Institute. Cris studies interesting things like quantum computation (especially post-quantum cryptography and the possibility of algorithms for Graph Isomorphism), phase transitions in NP-complete problems (e.g. the colorability of random graphs, or the satisfiability of random formulas) and social networks (in particular, automated techniques for identifying important structural features of large networks).
Susanne Still, Professor of Computer Science, Department of Information and Computer Sciences, University of Hawaii at Manoa.
Giulio Tononi, Professor, School of Medicine, Department of Psychiatry, University of Wisconsin - Madison
Giulio is a psychiatrist and neuroscientist who has held faculty positions in Pisa, New York, San Diego and Madison, Wisconsin, where he is Professor of Psychiatry. Along with his collaborators he pioneered several complementary approaches to study sleep. These include genomics, proteomics, fruit fly models, rodent models employing multiunit / local field potential recordings in behaving animals, in vivo voltammetry and microscopy, high-density EEG recordings and transcranial magnetic stimulation in humans, and large-scale computer models of sleep and wakefulness. This research has led to a comprehensive hypothesis on the function of sleep, the synaptic homeostasis hypothesis. According to the hypothesis, wakefulness leads to a net increase in synaptic strength, and sleep is necessary to reestablish synaptic homeostasis. The hypothesis has implications for understanding the effects of sleep deprivation and for developing novel diagnostic and therapeutic approaches to sleep disorders and neuropsychiatric disorders. Another focus of Giulio's work is the integrated information theory of consciousness: a scientific theory of what consciousness is, how it can be measured, how it is realized in the brain and, of course, why it fades when we fall into dreamless sleep and returns when we dream. The theory is being tested with neuroimaging, transcranial magnetic stimulation, and computer models. In 2005, Giulio received the NIH Director's Pioneer Award for his work on sleep mechanism and function; in 2008 he was appointed to the David P. White Chair in Sleep Medicine, and he holds a Distinguished Chair in Consciousness Science.
Rui Vilela-Mendes, Professor of Mathematics, Instituto Superior Tecnico, Lisboa, Portugal
Rui received an Electrical Engineering degree from the Technical University of Lisbon (IST), a Ph.D. in Physics from the University of Texas at Austin, and a Habilitation in Mathematics from the University of Lisbon. He is currently a member of the Center for Mathematics and Applications (CMAF-UL) and of the Institute for Plasmas and Nuclear Fusion (IPFN-IST), as well as a member of the Lisbon Academy of Sciences. He was a visiting researcher at CERN, CNRS (Marseille), IHES (Bures), and the University of Bielefeld, and a co-organizer of and collaborator on several international research projects on Theoretical Physics and the Sciences of Complexity.
Over the last few decades Rui has worked in the areas of mathematical economics, nonlinear dynamics and control, stochastic processes, and quantum theory. His current research interests center on mathematical economics, the physics of complexity, control, and quantum computing.
Karoline Wiesner, Assistant Professor, School of Mathematics and Centre for Complexity Sciences, University of Bristol
Bateson defines information as “a difference that makes a difference”. Complexity is “when quantitative differences become qualitative differences.” We need information theory to identify this difference. Key to my work is coming up with good measures of complexity for classical (biological) and quantum systems. The goal is to build a tool set for identifying and measuring structure. Part of this tool set is a hierarchy of classical and quantum computational architectures. How difficult it is to generate a given structure determines how high up in this architectural hierarchy its representation is found.