The Santa Fe Institute is an independent, nonprofit theoretical research institute located in Santa Fe and dedicated to the multidisciplinary study of the fundamental principles of complex adaptive systems, including physical, computational, biological, and social systems. The Institute consists of a small number of resident faculty, a large group of "external" faculty whose primary appointments are at other institutions, and a number of visiting scholars. The Institute is advised by a group of eminent scholars, including several Nobel Prize–winning scientists. Although theoretical scientific research is the Institute's primary focus, it also runs several popular summer schools on complex systems, along with other educational and outreach programs aimed at students ranging from middle school through graduate school. The Institute's annual funding comes from a combination of private donors, grant-making foundations, government science agencies, and companies affiliated with its business network. The 2011 budget was just over $10 million. (Source: Wikipedia)
West G.B., Santa Fe Institute
The Lancet | Year: 2012
The study and practice of medicine could benefit from an enhanced engagement with the new perspectives provided by the emerging areas of complexity science and systems biology. A more integrated, systemic approach is needed to fully understand the processes of health, disease, and dysfunction, and the many challenges in medical research and education. Integral to this approach is the search for a quantitative, predictive, multilevel, theoretical conceptual framework that both complements the present approaches and stimulates a more integrated research agenda that will lead to novel questions and experimental programmes. As examples, the importance of network structures and scaling laws are discussed for the development of a broad, quantitative, mathematical understanding of issues that are important in health, including ageing and mortality, sleep, growth, circulatory systems, and drug doses. A common theme is the importance of understanding the quantifiable determinants of the baseline scale of life, and developing corresponding parameters that define the average, idealised, healthy individual.
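The scaling laws this abstract points to can be made concrete with a short numerical sketch. The 3/4 exponent is the classic Kleiber allometric exponent; the normalization constant and the dose-scaling rule below are illustrative assumptions, not values taken from the article.

```python
# Illustrative sketch of quarter-power (Kleiber) allometric scaling:
# basal metabolic rate B scales with body mass M as B = B0 * M**(3/4).
# B0 = 3.4 is a hypothetical normalization constant, chosen for illustration.

def metabolic_rate(mass_kg, b0=3.4):
    """Kleiber-style scaling: B = B0 * M^(3/4) (illustrative units)."""
    return b0 * mass_kg ** 0.75

def scale_dose(reference_dose, reference_mass, target_mass):
    """Scale a drug dose with metabolic rate rather than linearly with mass."""
    return reference_dose * (target_mass / reference_mass) ** 0.75

# Doubling body mass raises metabolic rate by 2^(3/4) ~ 1.68, not 2.
ratio = metabolic_rate(140.0) / metabolic_rate(70.0)
```

The dose function illustrates the abstract's point about drug doses: scaling by metabolic rate rather than linearly with mass gives a heavier patient proportionally less per kilogram.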
Smith E., Santa Fe Institute
Reports on Progress in Physics | Year: 2011
The meaning of thermodynamic descriptions is found in large-deviations scaling (Ellis 1985 Entropy, Large Deviations, and Statistical Mechanics (New York: Springer); Touchette 2009 Phys. Rep. 478 1-69) of the probabilities for fluctuations of averaged quantities. The central function expressing large-deviations scaling is the entropy, which is the basis both for fluctuation theorems and for characterizing the thermodynamic interactions of systems. Freidlin-Wentzell theory (Freidlin and Wentzell 1998 Random Perturbations of Dynamical Systems 2nd edn (New York: Springer)) provides a quite general formulation of large-deviations scaling for non-equilibrium stochastic processes, through a remarkable representation in terms of a Hamiltonian dynamical system. A number of related methods now exist to construct the Freidlin-Wentzell Hamiltonian for many kinds of stochastic processes; one method due to Doi (1976 J. Phys. A: Math. Gen. 9 1465-78; 1976 J. Phys. A: Math. Gen. 9 1479) and Peliti (1985 J. Physique 46 1469; 1986 J. Phys. A: Math. Gen. 19 L365), appropriate to integer counting statistics, is widely used in reaction-diffusion theory. Using these tools together with a path-entropy method due to Jaynes (1980 Annu. Rev. Phys. Chem. 31 579-601), this review shows how to construct entropy functions that both express large-deviations scaling of fluctuations, and describe system-environment interactions, for discrete stochastic processes either at or away from equilibrium. A collection of variational methods familiar within quantum field theory, but less commonly applied to the Doi-Peliti construction, is used to define a 'stochastic effective action', which is the large-deviations rate function for arbitrary non-equilibrium paths. We show how common principles of entropy maximization, applied to different ensembles of states or of histories, lead to different entropy functions and different sets of thermodynamic state variables.
Yet the relations among all these levels of description may be constructed explicitly and understood in terms of information conditions. Although the example systems considered are limited, they are meant to provide a self-contained introduction to methods that may be used to systematically construct descriptions with all the features familiar from equilibrium thermodynamics, for a much wider range of systems describable by stochastic processes. © 2011 IOP Publishing Ltd.
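The large-deviations scaling at the heart of this review can be illustrated with the simplest textbook example: the empirical mean of fair coin flips, whose fluctuation probabilities decay as exp(-N I(x)) with the rate function I given by a relative entropy. This toy sketch is our illustration under standard assumptions, not code from the review.

```python
import math

def rate_function(x, p=0.5):
    """Large-deviations rate function for the mean of Bernoulli(p) trials:
    I(x) = x ln(x/p) + (1-x) ln((1-x)/(1-p)), a relative entropy."""
    return x * math.log(x / p) + (1 - x) * math.log((1 - x) / (1 - p))

def log_prob_mean_at_least(x, n, p=0.5):
    """Exact log-probability that the empirical mean of n flips is >= x."""
    total = sum(math.comb(n, k) * p ** k * (1 - p) ** (n - k)
                for k in range(math.ceil(x * n), n + 1))
    return math.log(total)

# Large-deviations scaling: -(1/n) log P(mean >= x) approaches I(x) as n grows.
n, x = 400, 0.7
estimate = -log_prob_mean_at_least(x, n) / n
exact_rate = rate_function(x)
```

By the Chernoff bound the finite-n estimate always sits at or above I(x), and the gap closes as n grows, which is exactly the sense in which the entropy governs fluctuation probabilities.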
Agency: NSF | Branch: Standard Grant | Program: | Phase: Interdiscp Behav&SocSci IBSS | Award Amount: 770.00K | Year: 2016
This interdisciplinary research project will explain why different social groups exhibit particular forms of organization by providing a rigorous way to determine the extent to which a social group's structure is shaped by external pressures, such as demands and opportunities afforded by the environment, rather than internal pressures, such as competition and misaligned incentives among group members. The project will constitute a new research thrust for the social and behavioral sciences regarding the ways that the structure of organizations constrains their ability to process environmental information, and it will provide new insights as to how organizations internally route information subject to restrictions on actors' capabilities. This project can help unite social and behavioral scientists ranging from those focusing on primate groups and forager societies to those studying the structure of multinational corporations. It will also provide tangible returns by suggesting how to best organize modern firms and governmental bureaucracies. The project will develop and make available databases on a variety of past and present social organizations which will be of broad scholarly value.
Social groups ranging from prehistoric societies to business firms to military organizations are organized in dramatically different ways, from egalitarian horizontal societies to deep vertical hierarchies and from sets of decentralized, modular teams to centralized command-and-control assemblies. Do social groups exhibit the organizations they do because these structures optimize information exchange, or are groups simply channeled by local historical precedent? This project will address this question by using recently developed mathematical techniques to formalize the notions of collective problem solving and cognitive constraints. These formalizations will be employed to investigate which organizations perform best under different constraints and group sizes as well as which are most robust in continuing to function when members or inter-member connections are removed or the pressures on the group change. The investigators will then determine whether such optimal organizations reflect the way real groups are (or have been) organized. The problem of optimal internal organization has received attention in various fields, from anthropology to economics. However, there has not been a unifying mathematical framework for studying the relationship between different kinds of organizations and the pressures and constraints on organizations. This project will develop such a framework by combining techniques from graph theory and information theory. Graph theory studies the organization of social networks and provides a rich vocabulary for quantifying organizational properties like hierarchy, centralization, and modularity. Methods from information theory can quantify the bits of information processed or communicated between group members, conceptualizing a social group as a telecommunication network where network nodes correspond to group members and bandwidth limits on the network channels and nodes correspond to cognitive constraints. 
The researchers will identify network structures that best process information under different sets of bandwidth limits and compare the results to empirical data on past and present human organizations. This project is supported through the NSF Interdisciplinary Behavioral and Social Sciences Research (IBSS) competition.
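The graph-theoretic vocabulary invoked above (hierarchy, centralization, modularity) can be sketched in a few lines. As one illustrative example, Freeman degree centralization separates a star-shaped command-and-control structure from an egalitarian ring; the example graphs and the choice of measure are our assumptions, not the project's.

```python
def degree_centralization(adjacency):
    """Freeman degree centralization: 0 for a perfectly egalitarian
    structure, 1 for a star with one maximally central member."""
    n = len(adjacency)
    degrees = [sum(row) for row in adjacency]
    d_max = max(degrees)
    observed = sum(d_max - d for d in degrees)
    theoretical_max = (n - 1) * (n - 2)  # achieved by a star graph
    return observed / theoretical_max

def star(n):
    """Node 0 connected to all others (command-and-control)."""
    a = [[0] * n for _ in range(n)]
    for i in range(1, n):
        a[0][i] = a[i][0] = 1
    return a

def ring(n):
    """Each node connected to two neighbours (egalitarian)."""
    a = [[0] * n for _ in range(n)]
    for i in range(n):
        a[i][(i + 1) % n] = a[(i + 1) % n][i] = 1
    return a
```

In the project's telecommunication-network framing, such structural scores would then be related to how many bits each member must process under given bandwidth limits.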
Agency: NSF | Branch: Standard Grant | Program: | Phase: COMPUTATIONAL MATHEMATICS | Award Amount: 111.58K | Year: 2016
Symmetry reduces large complex systems to manageable quantities of information. Identifying those symmetries and understanding their structure helps to solve a wide range of problems, from improving engineering tasks to disrupting the mechanisms of disease. The century-old problem of deciding whether two sets of symmetries have the same structure is known today as the Group Isomorphism Problem. This problem is fundamental to both computational algebra and computational complexity, and has implications for fields as diverse as material science, particle physics, and chemistry. The primary goal of this project is to develop significantly better approaches to testing isomorphism of finite groups of symmetries. It supports a new multidisciplinary collaboration between researchers at four universities, including students and early-career mathematicians and computer scientists.
The Group Isomorphism Problem asks for an algorithm to decide whether two finite groups are equivalent. Both the problem itself, and the techniques designed to improve upon it, have implications for other computational problems, including the better-known problems of Graph Isomorphism and P versus NP. Our team's approach goes beyond existing static recursions such as working sequentially down a derived or lower central series. Using a new dynamic strategy we prioritize the optimal stages of the problem, thereby improving the performance of later stages. To achieve this we are investigating the use of nonassociative rings, spectral sequences, modular representation theory, and p-local cohomology. We are also inspecting recently developed data structures in computational algebra that seem well-suited to our approach, as well as investigating applications to geometric complexity theory.
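The problem can be stated concretely: given the Cayley (multiplication) tables of two finite groups, decide whether some relabelling of elements turns one table into the other. The brute-force sketch below, on two standard order-4 groups, runs in factorial time, which is exactly why better approaches like those proposed here are sought; it is our illustration, not the project's algorithm.

```python
from itertools import permutations

def is_isomorphic(table_a, table_b):
    """Brute-force isomorphism test on Cayley tables: does some bijection f
    satisfy f(x * y) = f(x) * f(y) for all x, y?  O(n! * n^2) time."""
    n = len(table_a)
    if len(table_b) != n:
        return False
    for f in permutations(range(n)):
        if all(f[table_a[x][y]] == table_b[f[x]][f[y]]
               for x in range(n) for y in range(n)):
            return True
    return False

# Two non-isomorphic groups of order 4:
z4 = [[(i + j) % 4 for j in range(4)] for i in range(4)]  # cyclic group Z4
klein = [[i ^ j for j in range(4)] for i in range(4)]     # Klein four-group (XOR)
```

Z4 has an element of order 4 while the Klein four-group does not, so no relabelling can work; the brute-force search confirms this by exhausting all 24 bijections.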
Agency: NSF | Branch: Standard Grant | Program: | Phase: CDS&E-MSS | Award Amount: 125.00K | Year: 2016
Big data increasingly means big networks, because such data either directly concerns relational structures as in human and animal mobility patterns, gene interaction networks, or is influenced by an underlying relational structure as in epidemiological studies, urban studies, and cultural diffusion. Most applications of networks rely crucially on comparing networks, for example to detect changes in one network across time, to categorize or classify multiple networks of similar types, or to build analogies across fields by comparing networks of different origins. The question of how to compare two networks in a principled way, without relying on the ad hoc choice of statistics used by many current comparison methods, is key to the foundations of network science. This project will bring to bear new ideas from mathematics, computer science, and statistical physics on the problem of principled, structural comparison of networks. Through pre-existing collaborations, the PIs will leverage these new comparison methods to address questions in several different areas, for example: how food webs change across gradients like latitude, altitude, and temperature; morphological growth patterns of bacterial colonies; the evolution of human culture and communities; and links between socio-economic indicators and epidemiology.
Our project will develop new rigorous and principled methods of comparing the structure of complex networks. The methods to be pursued aim to get away from single-scale summary statistics; to break new ground, we must think in terms of structural distance rather than statistical inference. In combination with tools from machine learning, such structural comparison methods are an important step towards defining the space of real-world networks, which could serve as a more rigorous basis for a theory of complex networks. These methods have four advantageous features: (1) they systematically consider multiple scales of network organization, (2) they do not depend on an identification of the nodes of the two networks beforehand, (3) they can compare networks of different sizes, and (4) they are not dependent on any particular generative model of network growth. Very few, if any, of the existing network comparison methods have all of these features, and those that do exist have not been extensively developed. These features enable many new applications in a range of areas, including ecology, microbiology, cultural evolution, and epidemiology.
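One family of comparison methods with several of these features is spectral: the eigenvalues of the normalized Laplacian are independent of node labelling, lie in a fixed interval, and are defined for networks of any size. The sketch below, comparing eigenvalue histograms, is an illustrative stand-in for such methods, not the project's actual approach.

```python
import numpy as np

def normalized_laplacian_spectrum(adjacency):
    """Sorted eigenvalues of L = I - D^(-1/2) A D^(-1/2); all lie in [0, 2]
    and are invariant under node relabelling, so no node matching is needed."""
    a = np.asarray(adjacency, dtype=float)
    degrees = a.sum(axis=1)
    safe = np.where(degrees > 0, degrees, 1.0)
    inv_sqrt = np.where(degrees > 0, 1.0 / np.sqrt(safe), 0.0)
    lap = np.eye(len(a)) - inv_sqrt[:, None] * a * inv_sqrt[None, :]
    return np.sort(np.linalg.eigvalsh(lap))

def spectral_distance(adj_a, adj_b, bins=20):
    """L1 distance between eigenvalue histograms on [0, 2]; the two
    networks may have different numbers of nodes."""
    ha, _ = np.histogram(normalized_laplacian_spectrum(adj_a),
                         bins=bins, range=(0.0, 2.0), density=True)
    hb, _ = np.histogram(normalized_laplacian_spectrum(adj_b),
                         bins=bins, range=(0.0, 2.0), density=True)
    return float(np.abs(ha - hb).sum() / bins)
```

Because the comparison happens in eigenvalue space rather than node space, this satisfies features (2) and (3) above; whether a given spectral statistic captures multiple scales of organization, feature (1), depends on how the spectrum is summarized.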
Agency: NSF | Branch: Continuing grant | Program: | Phase: EVOLUTION OF DEVELOP MECHANISM | Award Amount: 44.95K | Year: 2017
How did complex multicellular life arise from the single-celled microorganisms that defined life on Earth for billions of years? The rise of multicellular organisms drove a profound diversification of life that fundamentally changed Earth's ecology, yet little is known about how this major evolutionary transition occurred. This is largely due to the fact that most multicellular lineages are ancient, and early steps in this transition have been obscured by extinction. The PIs have overcome this limitation by developing a novel laboratory system, experimentally evolving simple multicellularity in baker's yeast. Over thousands of generations of laboratory evolution, these cluster-forming snowflake yeast evolve a suite of multicellular adaptations, including larger cluster size, an elevated rate of programmed cell death, and a more hydrodynamic profile. The PIs will use this model system to examine how novel multicellular traits arise in evolution, and will determine whether a fundamental developmental mechanism, the single-celled bottleneck that most multicellular organisms pass through (e.g., a single fertilized egg), improves the ability of natural selection to act on these emergent multicellular traits. This interdisciplinary work utilizes cutting-edge techniques in experimental evolution, next generation sequencing, molecular genetics, confocal microscopy, image analysis, and mathematical modeling. By illuminating the earliest steps in the evolutionary transition to multicellularity, the proposed research will help resolve an outstanding problem of central importance to biology. The evolution of biological complexity remains challenging to teach, particularly at the high school level. This award will support the development of a novel lab module, suitable for both high school and college classes, in which students use snowflake yeast and computer simulations to examine the evolution of multicellularity.
The PIs will hold summer on-campus workshops to train teachers in the Atlanta area to use this curriculum.
Recent experiments have shown that unicellular organisms readily evolve to form multi-celled clusters, but little is known about how cellular clusters subsequently evolve greater multicellular complexity. The proposed research addresses this subject directly by asking the following questions: How do multicellular traits arise in evolution when mutation and recombination can only directly affect cell-level phenotype? Can early developmental mechanisms evolve to facilitate selection on these emergent multicellular traits? How are genes recruited for developmental functionality, and how are novel multicellular adaptations integrated into overall organism form and function? Answering the above questions will provide fundamental insight into the evolutionary origins of multicellular complexity, and will provide a theoretical foundation for similar investigations in other major evolutionary transitions. Snowflake yeast are an ideal experimental system for this work: the PIs have already evolved substantial morphological novelty, the snowflake growth form is mathematically tractable and has already been modeled by the PIs, and the extensive bioinformatic and molecular genetic tools developed for yeast allow for synthetic construction of multicellular strains with precisely defined properties.
Agency: NSF | Branch: Standard Grant | Program: | Phase: APPLIED MATHEMATICS | Award Amount: 157.50K | Year: 2016
Advances in science come from the collective and linked efforts of thousands of researchers working within and across disciplines. This project creates rigorous models of the composition, dynamics, and network structure of the United States' scientific workforce across heterogeneous independent institutions with different strengths and emphases. The systematic influence of these institutional and individual characteristics on scientific advances across disciplines is investigated. The results of this project will generate new insights into the composition of the scientific workforce and scientific productivity across fields. In addition, this project trains new graduate and undergraduate students in cutting-edge computational and statistical research techniques, and will develop and disseminate new large-scale open data sets on the composition of the United States' scientific workforce and provide new software for collecting structured data automatically from open unstructured sources.
This project uses state-of-the-art computational and statistical techniques from network science, machine learning, and social modeling to create a new technology platform for automatically and systematically collecting high-quality structured data on the composition, dynamics, and output of the scientific workforce. These data will be combined with social survey results of individual researchers and with rigorous network methods to model the relationship between workforce composition, productivity, and observable differences at the individual and institutional levels within and between scientific fields. Mathematical models of the short- and long-term evolution of the workforce are developed in order to evaluate the likely outcomes of certain types of interventions and policies.
Agency: NSF | Branch: Standard Grant | Program: | Phase: CONDENSED MATTER & MAT THEORY | Award Amount: 381.00K | Year: 2016
This award supports theoretical research and education to investigate emergent phenomena in condensed-matter, ecological, and biological systems, with a specific emphasis on the brain. Training in these fundamental topics will also help advance the careers of junior researchers in the physical, mathematical, and biological sciences. The overarching goal is to understand phenomena that emerge in systems with many particles or components that interact with each other. These phenomena are a reflection of the components acting in concert and are distinct from the properties associated with an individual particle or component of the system.
In ferromagnetic materials, with the prototypical example being a bar magnet, magnetism arises at low temperatures, where the tendency for magnetism overwhelms thermal fluctuations. However, if such a material is suddenly cooled to low temperature, barriers to ferromagnetism arise, leading to the formation of a glassy state rather than the perfect alignment of microscopic magnets across the material which leads to the ferromagnetic state. A goal of the research is to determine the conditions under which magnetic or glassy behavior arises. A major focus of the research is the study of dense networks, in which the number of links is much larger than the number of nodes. An important example is the human brain, which typically has 100 billion neurons and 100 trillion connections. The connectivity patterns of these neurons contain a rich spectrum of local motifs that may underlie the wondrous functionality of the brain. An important goal is to elucidate these fascinating structures. Another focus is to understand the ecological interplay between depletion of an environment by foraging, the nourishment of the forager by resource consumption, and environmental replenishment by resource growth. An important aim is to determine the conditions under which the forager and resource densities remain in balance and when boom and bust cycles arise.
This award also supports the PI's efforts to develop a massive open online course on topics related to statistical physics.
This award supports theoretical research and education that involve applying the techniques of non-equilibrium statistical physics to emergent phenomena in condensed-matter, ecological, and biological systems, with a focus on the brain. While ostensibly disparate, these projects all rely on common investigative tools, including analysis of master equations, scaling theories, and large-scale numerical simulations. Training in using these essential tools will also help advance the careers of junior researchers in the physical, mathematical, and biological sciences.
The first project is to understand the dynamics of kinetic ferromagnetic systems that do not conform to conventional power-law coarsening. Such systems may get stuck in complex metastable states that consist of multiple breathing domains. Long-time properties are controlled by domain merging, occurring either as isolated events or as part of a macroscopic cascade. The resulting ultraslow dynamics resembles that of glassy materials and should provide new insights into glassy behavior.
A second focus is dense networks in which the average node degree increases with the number of nodes N. An important example is the brain. Human brains typically have 100 billion neurons, each of which is connected to roughly 1000 other neurons. The structural connectivity of the brain reveals a rich spectrum of motifs in which small sets of nodes are densely interconnected; such structures may underlie the wondrous functionality of the brain. These and related features, such as multiple phase transitions in the density of fixed-size cliques, will be elucidated by the master equation applied to dense networks.
Finally, a principled model of foraging, which is based on the starving random walk model, will be investigated. Here the forager consumes food upon encountering it, thereby depleting the resource locally. Moreover, the forager starves if it wanders for too long without encountering food. When regeneration and reproduction are also incorporated, an even richer phenomenology arises - the dynamics can be steady or oscillatory, with a large-scale spatial organization of foragers and resources. These features will be elucidated by exploiting first-passage and stochastic processes and by large-scale simulations.
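A minimal one-dimensional version of the starving random walk can be simulated in a few lines. The lattice size, metabolic capacity, and absence of regeneration in this sketch are illustrative assumptions; the project's model is richer.

```python
import random

def starving_forager(size=200, metabolic_capacity=30, seed=1):
    """1D starving random walk on a ring of `size` food-bearing sites.
    The forager eats the food at any full site it reaches; it starves after
    `metabolic_capacity` consecutive steps without finding food."""
    rng = random.Random(seed)
    food = [True] * size        # every site initially holds one unit of food
    position = 0
    food[0] = False             # forager consumes the food at its start site
    hunger = 0                  # steps since the last meal
    steps = 0
    while hunger < metabolic_capacity:
        position = (position + rng.choice((-1, 1))) % size
        steps += 1
        if food[position]:
            food[position] = False
            hunger = 0          # a meal resets the starvation clock
        else:
            hunger += 1
    meals = size - sum(food)    # sites stripped of food before starvation
    return steps, meals
```

Because eaten sites stay empty here, the forager carves out a growing desert around itself and eventually starves; adding the regeneration and reproduction described above is what opens the door to steady or oscillatory dynamics.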
Agency: NSF | Branch: Continuing grant | Program: | Phase: DISCOVERY RESEARCH K-12 | Award Amount: 826.85K | Year: 2015
The Discovery Research K-12 program (DRK-12) seeks to significantly enhance the learning and teaching of science, technology, engineering and mathematics (STEM) by preK-12 students and teachers, through research and development of innovative resources, models and tools (RMTs). Projects in the DRK-12 program build on fundamental research in STEM education and prior research and development efforts that provide theoretical and empirical justification for proposed projects.
This project addresses the need for a computationally-enabled STEM workforce by equipping teachers with the skills necessary to prepare students for future endeavors as computationally-enabled scientists and citizens, and by investigating the most effective ways to provide this instruction to teachers. The project also addresses the immediate challenge presented by the Next Generation Science Standards to prepare middle school science teachers to implement rich computational thinking (CT) experiences, such as the use, creation and analysis of computer models and simulations, within science classes.
The project, a partnership between the Santa Fe Institute and the Santa Fe Public School District, directly addresses middle school teachers' understanding, practice, and teaching of modern scientific practice. Using the Project GUTS program and professional development model as a foundation, this project will design and develop a set of Resources, Models, and Tools (RMTs) that collectively form the basis for a comprehensive professional development (PD) program, then study teachers' experiences with the RMTs and assess how well the RMTs prepared teachers to implement the curriculum. The PD program includes: an online PD network; workshops; webinars and conferences; practicum and facilitator support; and curricular and program guides. The overall approach to the project is design-based implementation research (DBIR). Methods used for the implementation research include: unobtrusive measures such as self-assessment sliders and web analytics; the knowledge and skills survey (KS-CT); interviews with teachers and facilitators; analysis of teacher-modified and teacher-created models; and observations of practicum and classroom implementations. Data collection and analysis in the implementation research serve two purposes: a) design refinement and b) case study development. The implementation research employs a mixed-method, nonequivalent group design with embedded case studies.
Agency: NSF | Branch: Standard Grant | Program: | Phase: OFFICE OF MULTIDISCIPLINARY AC | Award Amount: 999.95K | Year: 2016
This INSPIRE project is jointly funded by the Chemistry of Life Processes Program in the Division of Chemistry in the Directorate for Mathematical and Physical Sciences, the Physics of Living Systems and Computational Physics Programs in the Division of Physics in the Directorate for Mathematical and Physical Sciences, the Systems and Synthetic Biology Cluster in the Division of Molecular and Cellular Biosciences in the Directorate for Biological Sciences, the INSPIRE Program and the Office of International Science and Engineering in the Office of Integrative Activities.
This award is funding Dr. David Wolpert from the Santa Fe Institute, Dr. Seth Lloyd from the Massachusetts Institute of Technology, and Dr. Sebastian Deffner from the University of Maryland, to exploit powerful new tools from non-equilibrium statistical physics to analyze the energy tradeoffs inherent in all computation. Energy (thermodynamic) costs of computational systems play a role in everything from human-engineered computers (which release heat energy equal to about 5% of the annual energy expenditure in the US) to biological systems (which must harvest enough energy from their environment to fulfill their needs to think and move). To investigate these energy costs, this project pursues three synergistic research thrusts. First, the research analyzes the fundamental energy tradeoffs faced by intracellular biochemical networks. Second, the project analyzes the energy tradeoffs that engineers face when designing new computer technologies. Finally, the research integrates the energy tradeoffs of computational processes into the theory of computation. This project allows graduate students and postdoctoral fellows to acquire the skills needed for expanding what is known about the energy requirements of computation. In addition, the workshops held under the auspices of this project are building intellectual bridges connecting the multiple scientific disciplines that involve computational systems. Such bridges are crucial to the development of a broadly applicable theory of thermodynamics of computation.
This research project is undertaken to quantitatively analyze the tradeoffs relating the minimal free energy requirements and dissipation of a computer on the one hand, to several high-level properties of that computer on the other. This analysis considers the speed of the computer; the number of hidden internal states the computer can use as buffers; the variability of the inputs to the computer; and the degree and type of noise in the computer. In addition to theoretical aspects of these issues and their consequences for computation theory, this research into the thermodynamics of computation is conducted in several domains, including chromatin computers, the computation performed during RNA folding, the computation performed by biochemical networks, and post-Moore computers that exploit hybrid computation involving both quantum and classical components. It is expected that, in addition to providing major insight into the role of the thermodynamics of computation in all those domains, this project may lay the foundation for an overarching theory of thermodynamics of computation in general.
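The minimal free-energy requirement referred to above is bounded below by Landauer's principle: erasing one bit at temperature T dissipates at least kT ln 2 of energy. A small sketch, with an arbitrary example bit count:

```python
import math

BOLTZMANN_K = 1.380649e-23  # J/K, exact under the 2019 SI definition

def landauer_limit_joules(bits, temperature_kelvin=300.0):
    """Minimum energy to erase `bits` bits at temperature T:
    E >= bits * k * T * ln(2) (Landauer's principle)."""
    return bits * BOLTZMANN_K * temperature_kelvin * math.log(2)

# Erasing a gigabit at room temperature costs at least ~2.9e-12 J,
# many orders of magnitude below what present hardware dissipates.
e_min = landauer_limit_joules(1e9)
```

The gap between this bound and real dissipation is what makes the tradeoffs above (speed, hidden states, input variability, noise) interesting: the bound itself says nothing about how those properties force a computer away from it.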