News Article | May 15, 2017
Site: www.sciencedaily.com

Despite the grand diversity among living organisms, the molecule used to store and transmit energy within aerobic, or oxygen-using, cells is remarkably the same. From bacteria to fungi, plants, and animals, adenosine triphosphate (ATP) serves as the universal energy currency of life, fueling the processes cells need to survive and function. Over the course of a day, an individual will typically use the equivalent of his or her bodyweight in ATP; however, the human body carries only a small amount of the molecule at any one time. That means cells must constantly recycle or replenish their limited capacity, relying on a highly efficient molecular motor called ATP synthase to do the job. As part of a project dedicated to modeling how single-celled purple bacteria turn light into food, a team of computational scientists from the University of Illinois at Urbana-Champaign (UIUC) simulated a complete ATP synthase in all-atom detail. The work builds on the project's first phase -- a 100-million atom photosynthetic organelle called a chromatophore -- and gives scientists an unprecedented glimpse into a biological machine whose energy efficiency far surpasses that of any artificial system. First proposed under the leadership of the late Klaus Schulten, a pioneer in the field of computational biophysics and the founder of the Theoretical and Computational Biophysics Group at UIUC, the research has progressed under the stewardship of Abhishek Singharoy, co-principal investigator and a National Science Foundation postdoctoral fellow with UIUC's Center for the Physics of Living Cells. In addition to Singharoy, the team includes members from the groups of UIUC professors Emad Tajkhorshid, Zaida Luthey-Schulten and Aleksei Aksimentiev; research scientist Melih Sener; and developers Barry Isralewitz, Jim Phillips, and John Stone. Experimental collaborator Neil Hunter of the University of Sheffield in England also took part in the project. 
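The bodyweight-in-ATP figure implies a striking recycling rate. A rough back-of-the-envelope check, assuming a 70 kg adult and a standing ATP pool of roughly 250 g (both commonly cited illustrative figures; neither number appears in the article):

```python
# Turnover arithmetic behind the "bodyweight in ATP per day" claim.
# The 70 kg body mass and ~250 g standing ATP pool are assumed,
# commonly cited figures, not numbers given in the article.

ATP_MOLAR_MASS_G = 507.18   # g/mol for ATP (free acid)
BODY_MASS_KG = 70           # assumed adult body mass
STANDING_POOL_G = 250       # assumed ATP present in the body at any instant

daily_use_g = BODY_MASS_KG * 1_000              # ~bodyweight of ATP per day
daily_use_mol = daily_use_g / ATP_MOLAR_MASS_G  # ~138 mol
recycles_per_day = daily_use_g / STANDING_POOL_G

print(f"daily ATP turnover: ~{daily_use_mol:.0f} mol")
print(f"each ATP molecule is recycled ~{recycles_per_day:.0f} times per day")
```

Each molecule cycling between ADP and ATP hundreds of times a day is precisely the recycling burden the article attributes to ATP synthase.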
The UIUC-led team built and tested its mega-model under a multiyear allocation awarded through the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program on the Titan supercomputer, a Cray XK7 managed by the US Department of Energy's (DOE's) Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility located at DOE's Oak Ridge National Laboratory. Using Titan, the team produced a virtual tool that can predict in exacting detail the chemical energy output of a photosynthetic system based on the amount of sunlight absorbed. The research could one day contribute to advanced clean energy technology that incorporates biological concepts. "Nature has designed the chromatophore in such a way that it can generate enough ATPs for these bacteria to survive in low-light environments such as the bottom of ponds and lakes," Singharoy said. "Our work captured this energy conversion process in all-atom detail and allowed us to predict its efficiency."

Often referred to as the power plant of the cell, ATP synthase is a complex enzyme that speeds up the synthesis of ATP from its molecular precursors, adenosine diphosphate (ADP) and inorganic phosphate. Spanning the chromatophore membrane, the enzymatic motor consists of three major parts -- an ion-powered rotor, a central stalk, and a protein ring. Like a waterwheel turned by the force of a flowing stream, the ATP synthase rotor harnesses the electrochemically driven movement of ions, such as protons or sodium, from high to low concentration across the membrane. The resulting mechanical energy transfers to the central stalk, which assists the protein ring in synthesizing ATP. Remarkably, the process works just as well in reverse: when too many ions build up on the outer side of the chromatophore, the ATP synthase protein ring breaks ATP down into ADP, a process called hydrolysis, and ions flow back to the inner side.
"Normally, you would expect a lot of energy loss during this process, like in any man-made motor, but it turns out ATP synthase has very little waste," Singharoy said. "How this motor is designed to minimize energy loss is the question we started asking."

Like a tinkerer disassembling an engine to better understand how it works, Singharoy's team broke down the 300,000-atom enzyme into its constituent parts. Drawing on decades of research into ATP synthase, past models, and new experimental data supplied by a Japanese team led by Takeshi Murata of the RIKEN Center for Life Science Technologies, the team constructed and simulated the pieces of the ATP synthase puzzle independently and together on Titan. To capture important processes that play out over millisecond time scales, Singharoy, in collaboration with Christophe Chipot of the French National Center for Scientific Research and Mahmoud Moradi of the University of Arkansas, deployed the molecular dynamics code NAMD strategically. The team executed an ensemble strategy, tracking the motion of around 1,000 replicas of ATP synthase simultaneously with time steps of 2 femtoseconds (2 × 10⁻¹⁵ seconds). In total, the team accumulated 65 microseconds (65 millionths of a second) of simulation time, using this information to extrapolate motions that occur over the course of a millisecond (1 thousandth of a second). As a result, the team identified previously undocumented swiveling motions in the protein ring that help explain the molecular motor's efficiency. The team's simulations also captured the rubber-band-like elasticity of the enzyme's central stalk; Singharoy's team estimated that, when paired with the protein ring, the stalk absorbs about 75 percent of the energy released during hydrolysis. Additionally, simulations of the protein ring by itself revealed a unit that can function independently, a finding previously reported in experiments but not resolved in computational detail.
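The scale of the ensemble strategy is easy to check with arithmetic. The replica count, timestep, and aggregate sampling below come from the article; the per-replica average is a derived estimate, not a reported number:

```python
# Bookkeeping for the ensemble molecular dynamics strategy described above.
# Replica count, timestep, and aggregate sampling come from the article;
# the per-replica average is a derived estimate.

TIMESTEP_FS = 2        # femtoseconds per integration step
N_REPLICAS = 1_000     # independent copies of ATP synthase
AGGREGATE_US = 65      # total accumulated simulation time, microseconds

aggregate_fs = AGGREGATE_US * 1e9                 # 1 microsecond = 1e9 fs
total_steps = aggregate_fs / TIMESTEP_FS          # ensemble-wide step count
per_replica_ns = AGGREGATE_US * 1e3 / N_REPLICAS  # average per replica

print(f"total integration steps: {total_steps:.2e}")        # ~3.25e10
print(f"average per-replica sampling: {per_replica_ns:.0f} ns")
```

Tens of billions of 2 fs steps to accumulate even 65 μs is why the team extrapolated to millisecond-scale motions rather than simulating them directly.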
"Even in the absence of the center stalk, the protein ring itself is capable of ATP hydrolysis. It's not very efficient, but it has the capability," Singharoy said. After simulating its complete ATP synthase model, the UIUC team incorporated the enzyme into its previously constructed chromatophore model to gain the most comprehensive picture of a photosynthetic system to date. With this virtual biological solar panel, the team could measure each step of the energy conversion process -- from light harvesting, to electron and proton transfer, to ATP synthesis -- and better understand its mechanical underpinnings. Nature's chromatophore is designed for low-light intensity, only absorbing between 3 and 5 percent of sunlight on a typical day. The team, through the efforts of Sener, found this absorption rate translates to around 300 ATPs per second, which is the amount a bacterium needs to stay alive. Having studied nature's design, the team now wanted to see if it could improve upon it. Assuming the same amount of light intensity, the team designed an artificial chromatophore with a decidedly unnatural protein composition, boosting the presence of two types of specialized proteins. Analysis of the new design predicted a tripling of the photosynthetic system's ATP production, opening up the possibility for the chromatophore's human-guided optimization. "You could potentially genetically modify a chromatophore or change its concentration of proteins," Singharoy said. "These predictions promise to bring forth new developments in artificial photosynthesis." Under its latest INCITE allocation, the UIUC team is pivoting to energy conversion in a different lifeform: animals. Taking what it has learned from modeling photosynthesis in purple bacteria, the team is modeling cellular respiration, the process animal cells use to convert nutrients to ATP. 
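A quick sketch of the chromatophore numbers quoted above: roughly 3 to 5 percent light absorption sustains about 300 ATP per second, and the redesigned protein composition is predicted to triple that output. The linear scaling and the per-day total below are illustrative arithmetic, not part of the published model:

```python
# Toy scaling of the chromatophore output figures quoted in the article.
# The linear-scaling assumption is illustrative only.

NATURAL_ATP_PER_S = 300   # ~survival budget at natural low-light absorption
PREDICTED_BOOST = 3       # tripling predicted for the artificial design
SECONDS_PER_DAY = 86_400

redesigned = NATURAL_ATP_PER_S * PREDICTED_BOOST      # predicted ATP/s
daily_natural = NATURAL_ATP_PER_S * SECONDS_PER_DAY   # ATP per day, natural

print(f"redesigned output (predicted): {redesigned} ATP/s")
print(f"natural daily budget: {daily_natural:.2e} ATP per bacterium")
```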
"You have at least two proteins in common between respiration and photosynthesis," said Singharoy, who is continuing his involvement with the project as an assistant professor at Arizona State University. "The question is what design principles carry over into higher organisms?" Simulation of the chromatophore -- complete with ATP synthase -- marks an ongoing shift in computational biophysics from analyzing individual cell parts (e.g., single proteins and hundreds of atoms) to analyzing entire cell systems (e.g., hundreds of proteins and millions of atoms). Schulten, who passed away in October 2016, understood better than most people the significance of using computers to simulate nature. In an interview in 2015, he laid out his rationale for modeling the chromatophore. "The motivation is to understand a very key step of life on Earth on which all life depends today. Energy-wise 95 percent of life on Earth depends on photosynthesis, including humans," he said. Schulten also understood the milestone a specialized organelle represented on the road to simulating a complete single-celled organism. "We don't have anything smaller than a cell that we would call alive," he said. "It's the smallest living entity, and we want to understand it." With next-generation supercomputers, including the OLCF's Summit, set to come online in 2018, the research group Schulten founded in 1989 is preparing to take on the grand challenge of simulating a cell. Under the leadership of Tajkhorshid, the team plans to simulate the first billion-atom cell, including the basic components a cell needs to survive and grow. Improvements to NAMD and work being done under the OLCF's Center for Accelerated Application Readiness program are helping to make the vision of Schulten and others a reality. "We keep moving forward," Singharoy said. "Our exhaustive study of a complete organelle in all-atom detail has opened the door for a full cell in all-atom detail."


More information: Abhishek Singharoy et al., "Chemomechanical Coupling in Hexameric Protein–Protein Interfaces Harnesses Energy within V-Type ATPases," Journal of the American Chemical Society (2017). DOI: 10.1021/jacs.6b10744; Abhishek Singharoy et al., "Binding Site Recognition and Docking Dynamics of a Single Electron Transport Protein: Cytochrome," Journal of the American Chemical Society (2016). DOI: 10.1021/jacs.6b01193; Melih Sener et al., "Overall energy conversion efficiency of a photosynthetic vesicle," eLife (2016). DOI: 10.7554/eLife.09541


News Article | May 23, 2017
Site: www.eurekalert.org

Organisms in nature adapt and evolve in complex environments. For example, when subjected to changes in nutrients, antibiotics, and predation, microbes in the wild face the challenge of adapting multiple traits at the same time. But how does evolution unfold when, for survival, multiple traits must be improved simultaneously? While heritable genetic mutations can alter phenotypic traits and enable populations to adapt to their environment, adaptation is frequently limited by trade-offs: a mutation advantageous to one trait might be detrimental to another. Because of the interplay between the selection pressures present in complex environments and the trade-offs constraining phenotypes, predicting evolutionary dynamics is difficult. Researchers at the University of Illinois at Urbana-Champaign have shown how evolutionary dynamics proceed when selection acts on two traits governed by a trade-off. The results move the life sciences a step closer to understanding the full complexity of evolution at the cellular level. Seppe Kuehn, an assistant professor of physics and member of the Center for the Physics of Living Cells at the U. of I., led the research. The team studied populations of the bacterium Escherichia coli, which can undergo hundreds of generations in a single week, providing ample opportunity to study mutations and their impact on heritable traits. The team selected populations of E. coli for faster migration through a porous environment. A quantitative model revealed that populations could achieve the fastest migration by improving two traits at once -- swimming speed and growth rate (cell division). Kuehn explains, "This study sheds new light on how evolution proceeds when performance depends on two traits that are restricted by a trade-off. 
Though a mathematical model suggests that the fastest migrating populations should be composed of cells that swim fast and reproduce quickly, what we found was that populations achieve faster migration through two divergent evolutionary paths that are mutually exclusive: in other words, these populations improved in either swimming speed or reproduction rate, but not both." David T. Fraebel, a U. of I. graduate student in Kuehn's lab group, is lead author on the study. He comments, "Most experiments apply selection pressure to optimize a single trait, and trade-offs are observed in this context due to decay of traits that aren't being selected rather than due to compromise between multiple pressures. We selected for swimming and growth simultaneously, yet E. coli was not able to optimize both traits at once." The selection environment created by the team determined which evolutionary trajectory the populations followed. In a nutrient-rich medium, faster swimming meant slower reproduction; in a nutrient-poor environment, however, slower swimming and faster reproduction led to the same desired outcome: faster migration through the porous environment. By sequencing the DNA of the evolved populations, the team identified the mutations responsible for adaptation in each condition. When they genetically engineered these mutations into the founding strain, these cells demonstrated faster migration and the same phenotypic trade-off as the evolved strains. "Our results support the idea that evolution takes the direction that's genetically easy," says Kuehn. "In a nutrient-rich environment, it's easy to find a mutation that enables the cells to swim faster. In a nutrient-poor environment, it's easy to find a mutation that makes cell division faster. In both cases, the mutations are disrupting negative regulatory genes whose function it is to reduce gene expression or protein levels." 
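The team's quantitative migration model is not reproduced in the article, but the textbook Fisher-KPP front speed, v = 2√(D·r), captures the intuition: an effective motility D (set by swimming speed) and a growth rate r jointly determine how fast a population front advances through its environment. The sketch below uses this standard relation with invented numbers; it is offered for intuition and is not the study's actual model:

```python
import math

def front_speed(D, r):
    """Fisher-KPP expansion speed of a diffusing, growing population."""
    return 2 * math.sqrt(D * r)

baseline = front_speed(D=1.0, r=1.0)
swim_path = front_speed(D=2.0, r=1.0)    # double effective motility
growth_path = front_speed(D=1.0, r=2.0)  # double growth rate instead

# Doubling either trait yields the same ~41% gain in front speed, so two
# divergent evolutionary paths can produce equally faster migration.
print(baseline, swim_path, growth_path)
```

Under this relation, improving either trait alone speeds up migration by the same factor, consistent with the observation that populations took one of two mutually exclusive paths rather than optimizing both traits.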
"Other recent studies have shown that microevolution is dominated by changes in negative regulatory elements. The reason: it's statistically easy to find a mutation that breaks things versus one that builds new function or parts. When selection acts on two traits restricted by a trade-off, the phenotype evolves in the direction of breaking negative regulatory elements, because it's an easy path statistically. It relates to the availability of useful mutations." Kuehn summarizes the finding's value: "Improving predictive modeling of evolution will involve understanding how mutations alter the regulation of cellular processes and how these processes are related to trade-offs that constrain traits. Uncovering the general principles that define the relationship between regulation and trade-offs could enable us to predict evolutionary outcomes." These findings are published in eLife (David T. Fraebel et al., "Environment determines evolutionary trajectory in a constrained phenotypic space," eLife (2017). DOI: 10.7554/eLife.24669).


News Article | May 25, 2017
Site: www.sciencedaily.com

Organisms in nature adapt and evolve in complex environments. For example, when subjected to changes in nutrients, antibiotics, and predation, microbes in the wild face the challenge of adapting multiple traits at the same time. But how does evolution unfold when, for survival, multiple traits must be improved simultaneously? While heritable genetic mutations can alter phenotypic traits and enable populations to adapt to their environment, adaptation is frequently limited by trade-offs: a mutation advantageous to one trait might be detrimental to another. Because of the interplay between the selection pressures present in complex environments and the trade-offs constraining phenotypes, predicting evolutionary dynamics is difficult. Researchers at the University of Illinois at Urbana-Champaign have shown how evolutionary dynamics proceed when selection acts on two traits governed by a trade-off. The results move the life sciences a step closer to understanding the full complexity of evolution at the cellular level. Seppe Kuehn, an assistant professor of physics and member of the Center for the Physics of Living Cells at the U. of I., led the research. The team studied populations of the bacterium Escherichia coli, which can undergo hundreds of generations in a single week, providing ample opportunity to study mutations and their impact on heritable traits. The team selected populations of E. coli for faster migration through a porous environment. A quantitative model revealed that populations could achieve the fastest migration by improving two traits at once -- swimming speed and growth rate (cell division). Kuehn explains, "This study sheds new light on how evolution proceeds when performance depends on two traits that are restricted by a trade-off. 
Though a mathematical model suggests that the fastest migrating populations should be composed of cells that swim fast and reproduce quickly, what we found was that populations achieve faster migration through two divergent evolutionary paths that are mutually exclusive: in other words, these populations improved in either swimming speed or reproduction rate, but not both." David T. Fraebel, a U. of I. graduate student in Kuehn's lab group, is lead author on the study. He comments, "Most experiments apply selection pressure to optimize a single trait, and trade-offs are observed in this context due to decay of traits that aren't being selected rather than due to compromise between multiple pressures. We selected for swimming and growth simultaneously, yet E. coli was not able to optimize both traits at once." The selection environment created by the team determined which evolutionary trajectory the populations followed. In a nutrient-rich medium, faster swimming meant slower reproduction; in a nutrient-poor environment, however, slower swimming and faster reproduction led to the same desired outcome: faster migration through the porous environment. By sequencing the DNA of the evolved populations, the team identified the mutations responsible for adaptation in each condition. When they genetically engineered these mutations into the founding strain, these cells demonstrated faster migration and the same phenotypic trade-off as the evolved strains. "Our results support the idea that evolution takes the direction that's genetically easy," says Kuehn. "In a nutrient-rich environment, it's easy to find a mutation that enables the cells to swim faster. In a nutrient-poor environment, it's easy to find a mutation that makes cell division faster. In both cases, the mutations are disrupting negative regulatory genes whose function it is to reduce gene expression or protein levels." 
"Other recent studies have shown that microevolution is dominated by changes in negative regulatory elements. The reason: it's statistically easy to find a mutation that breaks things versus one that builds new function or parts. When selection acts on two traits restricted by a trade-off, the phenotype evolves in the direction of breaking negative regulatory elements, because it's an easy path statistically. It relates to the availability of useful mutations." Kuehn summarizes the finding's value: "Improving predictive modeling of evolution will involve understanding how mutations alter the regulation of cellular processes and how these processes are related to trade-offs that constrain traits. Uncovering the general principles that define the relationship between regulation and trade-offs could enable us to predict evolutionary outcomes." These findings are published in the online journal eLife.
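The two-trait migration model invoked above can be sketched as a Fisher-type reaction-diffusion front, whose speed rises with both the cells' effective diffusivity (set by swimming) and their growth rate, so two divergent adaptations can yield the same migration speed. The function and parameter values below are illustrative assumptions, not the study's actual model or numbers:

```python
import math

def fisher_wave_speed(diffusivity, growth_rate):
    """Front speed of a Fisher-KPP population expansion: v = 2 * sqrt(D * r).

    diffusivity: effective diffusion coefficient from run-and-tumble swimming
                 (e.g. mm^2/h); growth_rate: cell division rate (1/h).
    """
    return 2.0 * math.sqrt(diffusivity * growth_rate)

# Two hypothetical evolutionary paths under a trade-off (made-up values):
fast_swimmer = fisher_wave_speed(diffusivity=0.16, growth_rate=0.25)
fast_grower = fisher_wave_speed(diffusivity=0.04, growth_rate=1.00)

# Both paths reach the same front speed, 2 * sqrt(0.04) = 0.4 mm/h.
print(fast_swimmer, fast_grower)
```

The square root makes the two traits interchangeable in this picture: quadrupling growth rate while quartering diffusivity leaves the migration speed unchanged, which is consistent with the two mutually exclusive adaptive paths described above.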


Ha T. (Center for the Physics of Living Cells; University of Illinois at Urbana-Champaign) | Kozlov A.G. (University of Washington) | Lohman T.M. (University of Washington)
Annual Review of Biophysics | Year: 2012

The advent of new technologies allowing the study of single biological molecules continues to have a major impact on studies of interacting systems as well as enzyme reactions. These approaches (fluorescence, optical tweezers, and magnetic tweezers), in combination with ensemble methods, have been particularly useful for mechanistic studies of protein-nucleic acid interactions and enzymes that function on nucleic acids. We review progress in the use of single-molecule methods to observe and perturb the activities of proteins and enzymes that function on flexible single-stranded DNA. These include single-stranded DNA binding proteins, recombinases (RecA/Rad51), and helicases/translocases that operate as motor proteins and play central roles in genome maintenance. We emphasize methods that have been used to detect and study the movement of these proteins (both ATP-dependent directional and random movement) along the single-stranded DNA and the mechanistic and functional information that can result from detailed analysis of such movement. © 2012 by Annual Reviews. All rights reserved.
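A standard way to separate the two kinds of movement the review highlights (ATP-dependent directional translocation versus random diffusion along ssDNA) is mean-squared-displacement analysis: for 1D motion, MSD(t) ≈ v²t² + 2Dt, so a quadratically growing MSD signals a motor while linear growth signals diffusion. The sketch below uses synthetic trajectories with invented parameters, purely to illustrate the analysis:

```python
import random

def msd(traj):
    """Mean-squared displacement of a 1D position trace, one value per lag time."""
    n = len(traj)
    return [sum((traj[i + lag] - traj[i]) ** 2 for i in range(n - lag)) / (n - lag)
            for lag in range(1, n)]

def trajectory(steps, drift, noise, seed=0):
    """1D trace: constant drift (directional translocation) plus Gaussian diffusion."""
    rng = random.Random(seed)
    pos, out = 0.0, [0.0]
    for _ in range(steps):
        pos += drift + rng.gauss(0.0, noise)
        out.append(pos)
    return out

# Hypothetical traces in arbitrary units (e.g. nucleotides per frame):
motor = msd(trajectory(500, drift=1.0, noise=0.5))   # ATP-driven translocase
walker = msd(trajectory(500, drift=0.0, noise=0.5))  # purely diffusive protein

# Directed motion gives MSD ~ (v*t)^2; diffusion gives MSD ~ 2*D*t.
# Comparing MSD growth between lag 10 and lag 100 separates the two regimes:
print(motor[99] / motor[9], walker[99] / walker[9])
```

For the drifting trace the ratio approaches the quadratic value (~100), while the diffusive trace stays near the linear value (~10), which is how single-molecule trajectories are commonly classified.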


News Article | March 2, 2017
Site: www.scientificcomputing.com

MIT chemical engineers have developed an extremely sensitive detector that can track single cells’ secretion of dopamine, a brain chemical responsible for carrying messages involved in reward-motivated behavior, learning, and memory. Using arrays of up to 20,000 tiny sensors, the researchers can monitor dopamine secretion of single neurons, allowing them to explore critical questions about dopamine dynamics. Until now, that has been very difficult to do. “Now, in real-time, and with good spatial resolution, we can see exactly where dopamine is being released,” says Michael Strano, the Carbon P. Dubbs Professor of Chemical Engineering and the senior author of a paper describing the research, which appears in the Proceedings of the National Academy of Sciences the week of Feb. 6. Strano and his colleagues have already demonstrated that dopamine release occurs differently than scientists expected in a type of neural progenitor cell, helping to shed light on how dopamine may exert its effects in the brain. The paper’s lead author is Sebastian Kruss, a former MIT postdoc who is now at Göttingen University, in Germany. Other authors are Daniel Salem and Barbara Lima, both MIT graduate students; Edward Boyden, an associate professor of biological engineering and brain and cognitive sciences, as well as a member of the MIT Media Lab and the McGovern Institute for Brain Research; Lela Vukovic, an assistant professor of chemistry at the University of Texas at El Paso; and Emma Vander Ende, a graduate student at Northwestern University. Dopamine is a neurotransmitter that plays important roles in learning, memory, and feelings of reward, which reinforce positive experiences. Neurotransmitters allow neurons to relay messages to nearby neurons through connections known as synapses. 
However, unlike most other neurotransmitters, dopamine can exert its effects beyond the synapse: Not all dopamine released into a synapse is taken up by the target cell, allowing some of the chemical to diffuse away and affect other nearby cells. “It has a local effect, which controls the signaling through the neurons, but also it has a global effect,” Strano says. “If dopamine is in the region, it influences all the neurons nearby.” Tracking this dopamine diffusion in the brain has proven difficult. Neuroscientists have tried using electrodes that are specialized to detect dopamine, but even using the smallest electrodes available, they can place only about 20 near any given cell. “We’re at the infancy of really understanding how these packets of chemicals move and their directionality,” says Strano, who decided to take a different approach. Strano’s lab has previously developed sensors made from arrays of carbon nanotubes — hollow, nanometer-thick cylinders made of carbon, which naturally fluoresce when exposed to laser light. By wrapping these tubes in different proteins or DNA strands, scientists can customize them to bind to different types of molecules. The carbon nanotube sensors used in this study are coated with a DNA sequence that makes the sensors interact with dopamine. When dopamine binds to the carbon nanotubes, they fluoresce more brightly, allowing the researchers to see exactly where the dopamine was released. The researchers deposited more than 20,000 of these nanotubes on a glass slide, creating an array that detects any dopamine secreted by a cell placed on the slide. In the new PNAS study, the researchers used these dopamine sensors to explore a longstanding question about dopamine release in the brain: From which part of the cell is dopamine secreted? To help answer that question, the researchers placed individual neural progenitor cells known as PC-12 cells onto the sensor arrays. 
PC-12 cells, which develop into neuron-like cells under the right conditions, have a starfish-like shape with several protrusions that resemble axons, which form synapses with other cells. After stimulating the cells to release dopamine, the researchers found that certain dopamine sensors near the cells lit up immediately, while those farther away turned on later as the dopamine diffused away. Tracking those patterns over many seconds allowed the researchers to trace how dopamine spreads away from the cells. Strano says one might expect to see that most of the dopamine would be released from the tips of the arms extending out from the cells. However, the researchers found that in fact more dopamine came from the sides of the arms. “We have falsified the notion that dopamine should only be released at these regions that will eventually become the synapses,” Strano says. “This observation is counterintuitive, and it’s a new piece of information you can only obtain with a nanosensor array like this one.” The team also showed that most of the dopamine traveled away from the cell, through protrusions extending in opposite directions. “Even though dopamine is not necessarily being released only at the tip of these protrusions, the direction of release is associated with them,” Salem says. Other questions that could be explored using these sensors include how dopamine release is affected by the direction of input to the cell, and how the presence of nearby cells influences each cell’s dopamine release. The research was funded by the National Science Foundation, the National Institutes of Health, a University of Illinois Center for the Physics of Living Cells Postdoctoral Fellowship, the German Research Foundation, and a Liebig Fellowship.
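The arrival-time pattern the array records (nearby sensors firing immediately, distant ones later) is what simple diffusion predicts: a molecule released at the cell takes on the order of t ≈ r²/4D to reach a sensor a distance r away. A back-of-the-envelope sketch, using an assumed dopamine diffusivity rather than a value from the paper:

```python
def diffusion_lag_seconds(distance_um, d_um2_per_s):
    """Characteristic 2D diffusion time t = r^2 / (4 * D) to reach distance r."""
    return distance_um ** 2 / (4.0 * d_um2_per_s)

# Hypothetical small-molecule diffusivity in solution, ~400 um^2/s:
for r in (1, 5, 20):  # sensor distances in micrometers
    print(f"sensor at {r:2d} um lights up after ~{diffusion_lag_seconds(r, 400):.4f} s")
```

Because the lag grows as the square of distance, sensors a few micrometers apart fire milliseconds apart while those tens of micrometers away lag by a substantial fraction of a second, which is why tracking the array over many seconds reveals the direction of release.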



News Article | November 14, 2016
Site: phys.org

Nature is full of parasites -- organisms that flourish and proliferate at the expense of another species. Surprisingly, these same competing roles of parasite and host can be found in the microscopic molecular world of the cell. A new study by two Illinois researchers has demonstrated that dynamic elements within the human genome interact with each other in a way that strongly resembles the patterns seen in populations of predators and prey. The findings, published in Physical Review Letters by physicists Chi Xue and Nigel Goldenfeld, are an important step toward understanding the complex ways that genomes change over the lifetime of individual organisms, and how they evolve over generations. "These are genes that are active and are doing genome editing in real time in living cells, and this is a start of trying to really understand them in much more detail than has been done before," said Goldenfeld, who leads the Biocomplexity research theme at the Carl R. Woese Institute for Universal Biology (IGB). "This is helping us understand the evolution of complexity and the evolution of genomes." The study was supported by the Center for the Physics of Living Cells, a Physics Frontiers Center at Illinois supported by the National Science Foundation, and by the NASA Astrobiology Institute for Universal Biology at Illinois, which Goldenfeld directs. Goldenfeld and Xue embarked on this work because of their interest in transposons, small regions of DNA that can move themselves from one part of the genome to another during the lifetime of a cell -- a capability that has earned them the name "jumping genes." Collectively, various types of transposons make up almost half of the human genome. When they move around, they may create mutations in or alter the activity of a functional gene; transposons can therefore create new genetic profiles in a population for natural selection to act on, in either a positive or negative way. The Illinois researchers wanted to learn more about how evolution works on this level, the level of whole organisms, by looking at the metaphorical ecosystem of the human genome. In this view, the physical structure of the DNA that makes up the genome acts like an environment, in which two types of transposons, long interspersed nuclear elements (LINEs) and short interspersed nuclear elements (SINEs), have a competitive relationship with one another. 
In order to replicate, SINEs steal the molecular machinery that LINEs use to copy themselves, somewhat like a cuckoo bird tricks other birds into raising her chicks for her while neglecting their own. With help from Oleg Simakov, a researcher at the Okinawa Institute of Science and Technology, Xue and Goldenfeld focused on the biology of L1 elements and Alu elements, respectively common types of LINEs and SINEs in the human genome. The researchers adopted methods from modern statistical physics and modeled the interaction between Alu and L1 elements mathematically as a stochastic process -- a process created from chance interactions. This method has been successfully applied in ecology to describe predator-prey interactions; Xue and Goldenfeld simulated the movements of transposons within the human genome with the same mathematical method. Their models included a detailed accounting for how Alu elements steal the molecular machinery L1 elements use to copy themselves. Xue and Goldenfeld's results predicted that populations of LINE and SINE elements in the genome are expected to oscillate the way those of, for example, wolves and rabbits might. "We realized that the transposons' interaction actually was pretty much like the predator-prey interaction in ecology," said Xue. "We came up with the idea, why don't we apply the same idea of predator-prey dynamics ... we expected to see the oscillations we see in the predator-prey model. So we first did the simulation and we saw the oscillations we expected, and we got really excited." In other words, when SINEs become too numerous, the LINEs start to suffer, and soon there are not enough LINEs for all the SINEs to exploit. The SINEs then start to suffer, and the LINEs make a comeback. 
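The boom-and-bust cycle described here can be illustrated with the deterministic limit of such a model, the classic Lotka-Volterra equations, with LINEs as prey (they replicate autonomously) and SINEs as predators (they parasitize LINE machinery). The rate constants below are invented for illustration; the actual study used a stochastic simulation with genome-specific rates:

```python
def lotka_volterra(lines, sines, steps=20000, dt=0.001,
                   birth=1.0, predation=0.5, conversion=0.5, death=0.8):
    """Euler integration of dL/dt = birth*L - predation*L*S and
    dS/dt = conversion*L*S - death*S (L = LINE copies, S = SINE copies)."""
    history = []
    for _ in range(steps):
        dl = (birth * lines - predation * lines * sines) * dt
        ds = (conversion * lines * sines - death * sines) * dt
        lines, sines = lines + dl, sines + ds
        history.append((lines, sines))
    return history

traj = lotka_volterra(lines=2.0, sines=1.0)
line_counts = [l for l, _ in traj]

# The LINE population repeatedly overshoots and undershoots its equilibrium
# (death/conversion = 1.6 here) instead of settling down -- the oscillation.
print(min(line_counts), max(line_counts))
```

In the full stochastic treatment these cycles acquire noise-driven fluctuations, but the basic rise-and-fall of each element type against the other is already visible in this toy version.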
Xue and Goldenfeld's model made the surprising prediction that these oscillations occur over a timescale that is longer than the human lifespan—waves of Alu elements and L1 elements pushing and pulling at each other in slow motion across generations of the human genomes that carry them. "The most enlightening aspect of the study for me was the fact that we could really compute the timescales, and see that it is possible that we could observe these things," said Goldenfeld. "We have a prediction for what happens in single cells, and we may be able to actually do an experiment to observe these things, though the period is longer than the lifetime of a single cell." In a related study, Goldenfeld's laboratory has collaborated with the laboratory of fellow physicist and IGB Biocomplexity research theme member Thomas Kuhlman to visualize the movements of transposons within the genomes of living cells. Using this type of innovative technology, and by studying the history of molecular evolution in other species, Goldenfeld and Xue hope to test some of the predictions made by their model and continue to gain insight into the dynamic world of the genome. More information: Chi Xue et al, Stochastic Predator-Prey Dynamics of Transposons in the Human Genome, Physical Review Letters (2016). DOI: 10.1103/PhysRevLett.117.208101

