Energy Biosciences Institute
News Article | July 24, 2017
CHAMPAIGN, Ill. -- In the summer of 2012, two undergraduate students tackled a problem that plant ecology experts had overlooked for 30 years. The students demonstrated that different plant species vary in how they take in carbon dioxide and emit water through stomata, the pores in their leaves. The data boosted the accuracy of mathematical models of carbon and water fluxes through plant leaves by 30 to 60 percent. The researchers, based at the University of Illinois, report their findings in the journal Nature Ecology & Evolution.

In hindsight, the discovery might seem obvious, said U. of I. plant biology professor Andrew Leakey, who mentored the students and is a co-author on the study. "If I were to go to a conference of plant physiologists and say, 'Hey, is there diversity in the way that plant stomata behave?' every one of them would say, 'yes,'" Leakey said. "And yet, for most of the last 30 years, our community has failed to describe that diversity in terms of the math."

This oversight stems in part from the fact that few plant biologists know how, or are naturally inclined, to convert their biological insights into the mathematical equations that modelers need to improve the accuracy of their work, Leakey said. "As a result, modelers have been forced to assume that the stomata of all species open and close in response to environmental conditions in the same way," he said.

This assumption was based on the work of a team led by Joseph Berry of the Carnegie Institution for Science. The group discovered that the behavior of stomata could be described by a single, simple equation. But Berry and his colleagues made their initial breakthrough by measuring soybean. Since then, very few plant scientists have questioned whether the equation for soybean also works in other species. As a result, modelers were stuck with one version of the equation, Leakey said.
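The "single, simple equation" that grew out of the Berry group's soybean work is widely known as the Ball–Berry model. As a sketch of its usual textbook form (the article itself does not spell it out):

```latex
g_s = g_0 + g_1 \, \frac{A \, h_s}{c_s}
```

Here $g_s$ is stomatal conductance, $A$ is the net CO2 assimilation rate, $h_s$ is the relative humidity at the leaf surface, $c_s$ is the CO2 concentration at the leaf surface, and $g_0$ and $g_1$ are empirically fitted constants. The study's central point is that these fitted constants, long treated as if one set of values worked for every species, actually vary from species to species.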
"This was an oversimplification that likely led to errors in model predictions of how well crops and forests grow in different times and places," he said.

"It's impossible to measure every plant everywhere through time across the globe," said Kevin Wolz, who conducted the new research with Mark Abordo when both were undergrads. "So, we instead measure a few things experimentally and then represent that with some math, which is a model."

Modeling is a useful tool for making predictions about how various biological systems will function over time, Wolz said. Models can help determine which crops will do well in specific geographic locations and whether they will produce enough food or biomass to make their cultivation profitable. They also help predict how plants will respond to pollution, drought or future climate conditions, giving policymakers insight into the potential harms or benefits associated with specific land use decisions.

At the time of the study, Wolz was majoring in biology and civil and environmental engineering. This gave him insight into both the complexity of the natural world and the simplicity and power of mathematical models. He and Abordo, a mathematics major at the time, jumped at the chance to study how plants adjust their stomata in response to different atmospheric conditions. "It was a nice change from working on chalkboards all the time to doing lab experiments and working out in the fields," Abordo said.

The two got up before dawn every weekday over the summer to collect leaves from 15 tree species and take them back to the lab, where they used gas exchange equipment to measure how the leaves responded to different light and atmospheric conditions. Each leaf was put through its paces with tests lasting roughly six hours. "It's a bit like going to the doctor and having a cardio test where they put you on a treadmill," Leakey said.
"Essentially, that's what Kevin and Mark were doing; they were taking leaves and running them under different scenarios to learn how the leaves responded."

Their findings were not surprising, Wolz said. "We demonstrated that not every plant is alike," he said. The team found a significant amount of variation in the way that different tree species responded to things like light, heat, carbon dioxide concentration and humidity. Altering standard models with the new data dramatically improved the models' accuracy, the researchers found. "We saw a 30 to 60 percent reduction in error," Leakey said.

"This research shows that training people like Kevin in an interdisciplinary way allows us to break down communication barriers in science, between modelers and plant scientists, for example," Leakey said. "This is only one of a long list of problems that would benefit from such an approach."

More work is needed to extend the new approach to other plant species, and to broaden the effort to include models that look at dynamics at the ecosystem scale, the researchers said.

Andrew Leakey is an affiliate of the Carl R. Woese Institute for Genomic Biology at the U. of I. The National Science Foundation and the Energy Biosciences Institute supported this research.

To reach Kevin Wolz, email email@example.com To reach Mark Abordo, email firstname.lastname@example.org To reach Andrew Leakey, call 217-766-9155; email email@example.com

The paper "Diversity in stomatal function is integral to modeling plant carbon and water fluxes" is available online and from the U. of I. News Bureau.
News Article | July 24, 2017
U.S. Environmental Protection Agency (EPA) Administrator Scott Pruitt is considering a former official in President Barack Obama's Energy Department to lead the agency's debate on mainstream climate science, according to a former leader of the Trump administration's EPA transition effort. Steve Koonin, a physicist and director of the Center for Urban Science and Progress at New York University, is being eyed to lead EPA's "red team, blue team" review of climate science, said Myron Ebell, a senior fellow at the Competitive Enterprise Institute and a Trump transition leader.

"It makes sense because he has positioned himself as an honest broker," Ebell said. "He doesn't think that the consensus is what some of the alarmists claim it is, and there's a lot that needs to be discussed."

When reached by phone, Koonin declined to comment on whether he was in talks with the administration about the climate job. But he added, "I think it would be a good idea if that kind of exercise took place." EPA has also consulted with groups like the free-market Heartland Institute for input on which scientists to include in the effort, but the agency didn't immediately respond to a request for comment about Koonin or its outreach.

Koonin served as DOE's undersecretary for science from 2009 to 2011 under President Obama, overseeing activities tied to science, energy and security. He also led DOE's first Quadrennial Technology Review for energy, according to his online bio. Before joining DOE, Koonin was a professor of theoretical physics and provost at the California Institute of Technology, and he was a member of the National Academy of Sciences. Koonin also spent five years as a chief scientist for BP PLC, where he helped establish the Energy Biosciences Institute, according to his online bio. He has a bachelor's degree in physics from Caltech and a Ph.D. in theoretical physics from the Massachusetts Institute of Technology.
Koonin in the past has called for a debate on mainstream climate science, and even pitched the "red team, blue team" concept in an op-ed in The Wall Street Journal in April. In an interview with the Journal in April, Koonin said the science isn't settled and skepticism is muted in policy-informing communities, where people don't like to discuss uncertainties. "One of the biggest debates is how can we separate human influence from natural variability," he said. "That's very important because if we can detect human influences, then we can start to project their impact going forward."

Koonin said scientists who question mainstream climate science are often shunned by colleagues and can lose federal funding. When asked about scientists who have also questioned climate science, Koonin pointed to prominent climate skeptic Richard Lindzen of MIT; Judith Curry, who recently retired from Georgia Tech; and Freeman Dyson, a retired professor of physics at the Institute for Advanced Study in Princeton, N.J.

Koonin is no stranger to such team debates or the controversy they trigger. In January 2014, Koonin oversaw a daylong symposium to discuss the American Physical Society's statement on climate change, where the debate became testy. Koonin at the time was leading an APS subcommittee reviewing the society's position (Climatewire, April 14, 2015). In a transcript of the event, held in Brooklyn, N.Y., Koonin said the panels would review both consensus views on climate change and scientists who "credibly take significant issue with several aspects of the consensus picture."

Nine months later, Koonin resigned from his APS post to "promote his personal opinions on climate science in the public arena," according to the group. Ebell said the debate was productive but accused APS of not publicizing the event and the media of failing to pay attention.
"That was a good example of how you can come to some deeper understanding by making people confront opposing arguments and then seeing where they lead," Ebell said. "That was a very useful exercise, but it never got much publicity or the media just didn't pay attention to it."

Jim Lakely, the Heartland Institute's spokesman, said in an email that the White House and EPA had reached out to help identify scientists for a "red team" and called the debate "long overdue." The group has long called for a team approach to debating mainstream climate science and is sponsoring the publication of the Nongovernmental International Panel on Climate Change, or NIPCC. Lakely also applauded EPA for examining "alarmist dogma," adding that climate scientists who have dominated debates and products of the Intergovernmental Panel on Climate Change have gone unchallenged.

Reprinted from Greenwire with permission from E&E News. Copyright 2017. E&E provides essential news for energy and environment professionals at www.eenews.net
News Article | June 27, 2017
Farmers earn more profits when there is demand for corn for biofuel instead of for food only. This can lead some to convert grasslands and forests to cropland. This conversion, also called indirect land use change, can have large-scale environmental consequences, including releasing stored carbon into the atmosphere. To penalize the carbon emissions from this indirect land use change, the U.S. EPA and the California Air Resources Board include an indirect land use change factor when assessing the carbon savings of biofuels for compliance with the federal Renewable Fuel Standard or California's Low Carbon Fuel Standard.

"Biofuel policies like the Low-Carbon Fuel Standard in California are trying to minimize the indirect land use change related emissions by accounting for the indirect land use change factor as part of the carbon emissions per gallon of biofuels. We examine the costs and benefits of using this approach at a national level," says University of Illinois agricultural economist Madhu Khanna.

A research paper on the subject by Khanna and her colleagues appears today in Nature Communications, in which they ask: By how much would carbon emissions be reduced as a result of regulating indirect land use change, as California is attempting to do? At what cost? And who bears those costs?

Khanna says a low-carbon fuel standard creates incentives to switch to low-carbon advanced biofuels, but including the indirect effect makes compliance more costly and fuel more expensive for consumers. Evan DeLucia, a U of I professor of plant biology and a co-author on the study, explains that biofuels differ in the carbon emissions they generate per gallon and in their effect on land use. Cellulosic biofuels, particularly those made from crop residues or from energy crops like miscanthus and switchgrass produced on low-quality marginal land, lead to less indirect land use change than corn ethanol.
"Inclusion of the indirect land use change factor makes it much more costly to achieve the Low Carbon Fuel Standard," Khanna says. "It penalizes all biofuels and increases their carbon emissions per gallon. It imposes a hidden tax on all fuels that is borne by fuel consumers and blenders."

"What we find is the inclusion of this indirect land use change factor leads to a relatively small reduction in emissions, and this reduction comes at a very large cost to fuel consumers and fuel blenders," Khanna says. "The economic cost of reducing these carbon emissions is much higher than the value of the damages caused by those emissions, as measured by the social cost of carbon. What our findings suggest is that it's not optimal to regulate indirect land use change in the manner currently done in California, or to extend that approach to other parts of the country."

The social cost of carbon, Khanna says, is $50 per ton of carbon dioxide on average. The economic cost of reducing carbon emissions by including California's indirect land use change factor at a national level is $61 per ton of carbon dioxide. Applying California's indirect land use change factors nationally would thus make the cost of reducing a ton of carbon roughly 20 percent higher than the avoided damages from those emissions.

"We find that it is just not worth reducing these indirect land use emissions using California's approach. It imposes a cost that is passed on to the consumer in the form of a higher cost for fuel," Khanna says. "These costs for fuel consumers could range from $15 billion to $131 billion nationally over a decade, depending on the indirect land use change factors applied."

"We need to think of better ways to prevent indirect land use change that would be more cost-effective," Khanna says. Currently, there is no national low-carbon fuel standard. California has one, Oregon recently established one, and other states are considering it.
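As a quick check of the figures cited above: the $61-per-ton abatement cost exceeds the $50-per-ton social cost of carbon by about 22 percent, which the article rounds down to its "20 percent" figure.

```python
social_cost_of_carbon = 50.0  # $/ton CO2: value of the avoided climate damages
abatement_cost = 61.0         # $/ton CO2: cost of cutting emissions via the
                              # indirect land use change factor, applied nationally

# Fractional excess of the abatement cost over the avoided damages
excess = (abatement_cost - social_cost_of_carbon) / social_cost_of_carbon
print(f"{excess:.0%}")  # prints "22%"
```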
Khanna says this study provides useful information as states move forward to determine whether they should continue the policy of including an indirect land use change factor when they implement a low-carbon fuel standard. "A lot of effort has been made, and continues to be made, to calculate indirect land use change factors so they can be included in implementing low-carbon fuel policies," Khanna says. "The presence of indirect land use change due to biofuels has in fact dominated the whole debate about the climate benefits of biofuels. We may be more productive if we focus on the direct carbon savings with biofuels, and incorporate those in trying to encourage the move toward lower-carbon biofuels, rather than regulating the indirect effects. Estimates of the indirect effects of biofuels have also become much smaller over time, and it's time to re-evaluate the benefits of continuing the policy of regulating indirect emissions."

The paper, "The social inefficiency of regulating indirect land use change due to biofuels," is written by Madhu Khanna, Weiwei Wang, Tara W. Hudiburg, and Evan H. DeLucia and is published in Nature Communications. Funding for the work was provided by the Energy Foundation; the Energy Biosciences Institute, University of California, Berkeley; and the Department of Energy Sun Grant.

Khanna is the ACES Distinguished Professor for Environmental Economics in the Department of Agricultural and Consumer Economics in the College of Agricultural, Consumer and Environmental Sciences at the University of Illinois.
News Article | June 26, 2017
The Deepwater Horizon oil spill in the Gulf of Mexico in 2010 is one of the most studied spills in history, yet scientists haven't agreed on the role of microbes in eating up the oil. Now a research team at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) has identified all of the principal oil-degrading bacteria, as well as their mechanisms for chewing up the many different components that make up the released crude oil.

The team, led by Berkeley Lab microbial ecologist Gary Andersen, is the first to simulate the conditions that occurred in the aftermath of the spill. Their study, "Simulation of Deepwater Horizon oil plume reveals substrate specialization within a complex community of hydrocarbon-degraders," was just published in the Proceedings of the National Academy of Sciences.

"This provides the most complete account yet of what was happening in the hydrocarbon plumes in the deep ocean during the event," said Andersen. Berkeley Lab's Ping Hu, the lead author of the study, added: "We simulated the conditions of the Gulf of Mexico oil spill in the lab and were able to understand the mechanisms for oil degradation from all of the principal oil-degrading bacteria that were observed in the original oil spill."

The spill was the largest marine oil spill in history, releasing 4.1 million barrels of crude oil, as well as large amounts of natural gas, from a mile below the surface of the ocean. After the initial explosion and uncontained release of oil, researchers observed a phenomenon that had not been seen before: More than 40 percent of the oil, combined with an introduced chemical dispersant, was retained in a plume nearly 100 miles long at this great depth. Yet because of the difficulty in collecting samples from so far below the ocean surface, and because of the large area impacted by the spill, a number of gaps remained in understanding the fate of the oil over time.
Andersen and his team returned to the spill location four years later to collect water at depth. With the assistance of co-authors Piero Gardinali of Florida International University and Ron Atlas of the University of Louisville, a suspension of small, insoluble oil droplets was evenly distributed in bottles, along with the more soluble oil fractions and chemical dispersant, to mimic the conditions of the oil plume. Over the next 64 days, the composition of the microbes and the crude oil was intensively studied.

The researchers witnessed an initial rapid growth of a microbe that had previously been observed to be the dominant bacterium in the early stages of the oil release, but which had eluded subsequent attempts by others to recreate the conditions of the Gulf of Mexico oil plume. Through DNA sequencing of its genome, they were able to identify its mechanism for degrading oil. They gave this newly discovered bacterium the tentative name Bermanella macondoprimitus, based on its relatedness to other deep-sea microbes and the location where it was discovered.

"Our study demonstrated the importance of using dispersants in producing neutrally buoyant, tiny oil droplets, which kept much of the oil from reaching the ocean surface," Andersen said. "Naturally occurring microbes at this depth are highly specialized in growing by using specific components of the oil for their food source. So the oil droplets provided a large surface area for the microbes to chew up the oil."

Working with Berkeley Lab scientist Jill Banfield, a study co-author and also a professor in UC Berkeley's Department of Earth and Planetary Sciences, the team used newly developed DNA-based methods to identify all of the genomes of the microbes that used the introduced oil for growth, along with the specific genes responsible for oil degradation.
Many of the bacteria identified were similar to oil-degrading bacteria found on the ocean surface, but had considerably streamlined sets of genes for oil degradation. Early work on microbial activity after the oil spill was led by Berkeley Lab's Terry Hazen (now primarily associated with the University of Tennessee), which provided the first-ever data on microbial activity from a deepwater dispersed oil plume. While Hazen's work revealed a variety of hydrocarbon degraders, this latest study identified the mechanisms the bacteria used to degrade oil, and the relationship of the organisms involved in the spill to previously characterized hydrocarbon-degrading organisms.

"We now have the capability to identify the specific organisms that would naturally degrade the oil if spills occurred in other regions, and to calculate the rates of oil degradation to figure out how long it would take to consume the spilled oil at depth," Andersen said.

Andersen noted that it is not clear whether the degradation of oil at these depths would occur in other offshore oil-producing regions. "The Gulf of Mexico is home to one of the largest concentrations of underwater hydrocarbon seeps, and it has been speculated that this helped in the selection of the oil-degrading microbes that were observed in the underwater plumes," he said.

Although the well drilled by the Deepwater Horizon rig was one of the deepest of its time, new oil exploration offshore of Brazil, Uruguay, and India has now exceeded 2 miles below the ocean surface. By capturing water from these areas and subjecting it to the same test, it may be possible in the future to understand in greater detail the consequences of an uncontrolled release of oil in these regions.

"Our greatest hope would be that there were no oil spills in the future," Andersen said. "But having the ability to manipulate conditions in the laboratory could potentially allow us to develop new insights for mitigating their impact."
This research was funded by the Energy Biosciences Institute, a partnership led by UC Berkeley that includes Berkeley Lab and the University of Illinois at Urbana-Champaign. Other study co-authors were Eric Dubinsky, Jian Wang, Lauren Tom, and Christian Sieber of Berkeley Lab, and Alexander Probst of UC Berkeley.

Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab has had its scientific expertise recognized with 13 Nobel Prizes. The University of California manages Berkeley Lab for the U.S. Department of Energy's Office of Science. DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.
News Article | August 15, 2017
In babies, sugars known as human milk oligosaccharides, or HMOs, play a key role in health. A California-based startup is one of a handful of companies trying to manufacture the sugars outside the human body, both to produce healthier infant formula for babies who can't breastfeed and to potentially improve health in adults as well.

Unlike typical sugar found in food, human milk sugars can't be digested by people and don't make food sweet; instead, they feed beneficial bacteria in the gut. "If we consume them, the population of the good gut bacteria increases, out-populating the bad bacteria that could also reside in your body," says Chaeyoung Shin, one of the cofounders of Sugarlogix, a startup making a particular type of human milk sugar called 2′-fucosyllactose, or 2′-FL. "This leads to a healthier digestive system, healthier gut, which will then help in boosting your immune system as well."

Shin and cofounder Kulika Chomvong met at the University of California-Berkeley's Energy Biosciences Institute while working on a different problem: producing biofuel. Chomvong, a microbiologist, engineered yeast that could produce fuel from cellulose, and Shin, a chemical engineer, worked on improving the biofuel fermentation process. But after completing their PhD programs, they decided to shift course. "After the crash in fuel price, it didn't seem like a good idea anymore, nor did it seem viable in the next 20 to 50 years," says Shin. "So then we decided to look for a higher-end product."

Human milk sugars, which can be cultured through yeast fermentation, a process similar to making biofuel or brewing beer, seemed like a good fit. In breastfed infants, the human milk sugars help build up bifidobacterium in the gut, one of a few bacteria that can digest the complex sugars. The bacteria help make the gut more acidic, which "prevents E. coli and bugs like that from getting an early foothold," says David Mills, a professor of food science and technology at the University of California-Davis who studies the oligosaccharides. "It's telling the immune system what's, in a sense, a good bug and a bad bug."

In adults, low levels of bifidobacterium have been found in patients with diabetes and other diseases such as irritable bowel syndrome. An unbalanced gut microbiome, the ecosystem of trillions of bacteria living in the intestinal tract, has also been linked to cancer risk, Parkinson's disease, and anxiety and depression, among other conditions. Eating junk food quickly affects the gut microbiome; in one study, a group of rural Africans who temporarily shifted from a healthy diet to burgers and fries showed both a marked change in gut microbes and an increase in biomarkers of cancer risk after only two weeks.

In addition to shifting to a healthier diet, the scientists say that probiotics, such as bifidobacterium, which is added to some foods like yogurt, might help. And though there's little research to back it up so far, the theory is that a supplement of human milk sugars, part of a class of ingredients known as prebiotics, could help those probiotics work better.
News Article | June 23, 2017
As farmers survey their fields this summer, several questions come to mind: How many plants germinated per acre? How does altering row spacing affect my yields? Does it make a difference if I plant my rows north to south or east to west?

Now a computer model can answer these questions by comparing billions of virtual fields with different planting densities, row spacings, and orientations. The University of Illinois and the Partner Institute for Computational Biology in Shanghai developed this computer model to predict the yield of different crop cultivars in a multitude of planting conditions. Published in BioEnergy Research, the model depicts the growth of 3D plants, incorporating models of the biochemical and biophysical processes that underlie productivity.

Teaming up with the University of Sao Paulo in Brazil, the researchers used the model to address a question for sugarcane producers: How much yield might be sacrificed to take advantage of a possible conservation planting technique?

"Current sugarcane harvesters cut a single row at a time, which is time-consuming and leads to damage of the crop stands," said author Steve Long, Gutgsell Endowed Professor of Plant Biology and Crop Sciences at the Carl R. Woese Institute for Genomic Biology. "This could be solved if the crop was planted in double rows with gaps between the double rows. But plants in double rows will shade each other more, causing a potential loss of profitability."

The model found that double-row spacing costs about 10% of productivity compared to traditional row spacing; however, this loss can be reduced to just 2% by choosing cultivars with more horizontal leaves planted in a north-south orientation.

"This model could be applied to other crops to predict optimal planting designs for specific environments," said Yu Wang, a postdoctoral researcher at Illinois who led the study. "It could also be used in reverse to predict the potential outcome for a field."
The authors predict this model will be especially useful when robotic planting becomes more commonplace, which will allow for many more planting permutations.

This research was supported by the IGB, the Energy Biosciences Institute, the Realizing Increased Photosynthetic Efficiency (RIPE) project, and the Chinese Academy of Sciences. The paper "Development of a Three-Dimensional Ray-Tracing Model of Sugarcane Canopy Photosynthesis and Its Application in Assessing Impacts of Varied Row Spacing" is published in BioEnergy Research (DOI: 10.1007/s12155-017-9823-x). Co-authors include Yu Wang, Qingfeng Song, Deepak Jaiswal, Amanda P. de Souza, and Xin-Guang Zhu.

The Carl R. Woese Institute for Genomic Biology (IGB) advances life sciences research through interdisciplinary collaborations within a state-of-the-art genomic research facility at the University of Illinois. The Energy Biosciences Institute (EBI) is a public-private collaboration working to help solve the global energy challenge. Realizing Increased Photosynthetic Efficiency (RIPE) is an international research project, funded by the Bill & Melinda Gates Foundation, to engineer plants that more efficiently turn the sun's energy into food in order to sustainably increase worldwide food productivity.
News Article | January 26, 2016
The new study, published in Plant, Cell & Environment, addresses a central challenge of transgenic plant development: how to reliably evaluate whether genetic material has been successfully introduced. Researchers at the University of Illinois, the Polish Academy of Sciences, the University of Nebraska-Lincoln and the University of California, Berkeley compared the traditional method to several new ones that have emerged from advances in genomic technology, and identified one that is much faster than the standard approach, yet equally reliable. The study was led by Illinois postdoctoral fellows Kasia Glowacka and Johannes Kromdijk.

"For plants with long life cycles, such as our food crops, this will greatly speed the time between genetic transformation or DNA editing and the development of pure breeding lines," said Stephen Long, Gutgsell Endowed Professor of Crop Sciences and Plant Biology and the principal investigator for the study. Long is also a member of the Genomic Ecology of Global Change and Biosystems Design research themes and the Energy Biosciences Institute at the Carl R. Woese Institute for Genomic Biology.

To meet the food and fuel needs of an ever-growing global population, researchers use transgenic technologies to develop crops with higher yields and greater resiliency to environmental challenges. None of the technologies used to introduce new genetic material into plants works with 100 percent efficiency. Plants and their offspring must be screened to identify those in which gene transfer was successful. Traditionally, this was done in part by testing successive generations of plants to see if the desired traits are present and breed true over time.

In addition, plant scientists can use one of several molecular methods to determine if a gene or genes have actually been successfully introduced into the plant genome. The "tried and true" method, the Southern blot, yields precise data but is slow and unwieldy.
It requires isolating relatively large amounts of plant DNA, using fluorescent or radioactive dye to detect the gene of interest, and performing a week's worth of lab work for results from just a few samples at a time.

The team compared the Southern blot technique with several that use variations of a chemical process called the polymerase chain reaction (PCR). This process allows researchers to quantify specific pieces of the introduced DNA sequences by making many additional copies of them and then estimating the number of copies, somewhat like estimating the amount of bacteria present in a sample by spreading it on a petri dish and letting colonies grow until they are visible. These methods are much faster than Southern blotting, but if the DNA in each sample does not "grow" at exactly the same rate, the resulting data will be imprecise: size won't be a perfect indicator of the starting quantity.

One method examined by Long's group, droplet digital PCR (ddPCR), is designed to overcome this weakness. Rather than using the PCR process to amplify all the DNA in a sample together, this method first separates each individual fragment of DNA into its own tiny reaction, much like giving each bacterium its own tiny petri dish to grow in. PCR then amplifies each fragment until there are enough copies to be easily detected, and the total number of positive tiny reactions is counted. Because this method, unlike the others, separates the growth-like step from the quantification step, it can be very precise even when the reaction isn't perfect. Results can be obtained in less than two days, and many samples can be processed simultaneously.

Long hopes that his group's demonstration that ddPCR is a "reliable, fast and high throughput" technique will help it become the new standard for those developing transgenic crops. "I believe it will become widely adopted," he said.
Although ddPCR is currently more expensive than the other methods, Long said the cost would likely drop quickly, as have the costs of other genomic technologies.
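The droplet-counting arithmetic behind ddPCR can be sketched in a few lines. Because DNA fragments are distributed into droplets at random, the fraction of droplets that test positive follows a Poisson distribution, and the mean number of copies per droplet is recovered as lambda = -ln(1 - p). The droplet counts and the roughly 0.85 nL droplet volume below are illustrative assumptions typical of commercial systems, not figures from the study.

```python
import math

def ddpcr_copies(positive, total, droplet_volume_nl=0.85):
    """Estimate template copies per microliter from droplet counts,
    using the Poisson correction standard in digital PCR."""
    p = positive / total                  # fraction of positive droplets
    lam = -math.log(1.0 - p)              # mean copies per droplet (Poisson)
    return lam / (droplet_volume_nl * 1e-3)   # copies per microliter

# Illustrative droplet counts, not data from the study:
print(round(ddpcr_copies(4000, 20000), 1))    # -> 262.5
```

The Poisson correction is what lets the count stay accurate even when some droplets start with two or more copies; tallying positives alone would undercount those.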
Kim I.J., Korea University |
Ko H.-J., Korea University |
Kim T.-W., Energy Biosciences Institute |
Choi I.-G., Korea University |
Kim K.H., Korea University
Biotechnology and Bioengineering | Year: 2013
Plant expansin proteins induce plant cell wall extension and have the ability to loosen and disrupt cellulose. In addition, these proteins show synergistic activity with cellulases during cellulose hydrolysis. BsEXLX1, originating from Bacillus subtilis, is a structural homolog of a β-expansin produced by Zea mays (ZmEXPB1). The Langmuir isotherm for binding of BsEXLX1 to microcrystalline cellulose (i.e., Avicel) revealed that the equilibrium binding constant of BsEXLX1 to Avicel was similar to those of other Type A surface-binding carbohydrate-binding modules (CBMs) to microcrystalline cellulose, and the maximum number of binding sites on Avicel for BsEXLX1 was also comparable to those on microcrystalline cellulose for other Type A CBMs. BsEXLX1 did not bind to cellooligosaccharides, which is consistent with the typical binding behavior of Type A CBMs. Unlike the plant expansin ZmEXPB1, which binds preferentially to xylan over cellulose, BsEXLX1 exhibited no such preference. In addition, the binding capacities of cellulose and xylan for BsEXLX1 were much lower than those for CtCBD3. © 2012 Wiley Periodicals, Inc.
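The Langmuir analysis in the abstract fits binding data to the isotherm B = B_max * K * F / (1 + K * F), where F is the free protein concentration, K the equilibrium binding constant, and B_max the maximum binding capacity. A minimal sketch with made-up parameter values (the abstract reports the comparisons between proteins, not these numbers):

```python
def langmuir(free, b_max, k_eq):
    """Langmuir isotherm: protein bound per gram of substrate as a
    function of free protein concentration, with equilibrium binding
    constant k_eq and maximum binding capacity b_max."""
    return b_max * k_eq * free / (1.0 + k_eq * free)

# Hypothetical parameters for illustration only:
b_max = 0.5   # umol BsEXLX1 per g Avicel (assumed)
k_eq = 2.0    # L per umol (assumed)
for free in (0.1, 0.5, 2.0, 10.0):
    print(free, round(langmuir(free, b_max, k_eq), 3))
```

At low concentrations binding rises almost linearly with slope b_max * k_eq; at high concentrations it saturates at b_max, which is why the two fitted parameters can be compared directly across proteins such as BsEXLX1 and other Type A CBMs.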
Lee M.E., University of California at Berkeley |
Lee M.E., Energy Biosciences Institute |
Aswani A., University of California at Berkeley |
Han A.S., University of California at Berkeley |
And 3 more authors.
Nucleic Acids Research | Year: 2013
Engineered metabolic pathways often suffer from flux imbalances that can overburden the cell and accumulate intermediate metabolites, resulting in reduced product titers. One way to alleviate such imbalances is to adjust the expression levels of the constituent enzymes using a combinatorial expression library. Typically, this approach requires high-throughput assays, which are unfortunately unavailable for the vast majority of desirable target compounds. To address this, we applied regression modeling to enable expression optimization using only a small number of measurements. We characterized a set of constitutive promoters in Saccharomyces cerevisiae that spanned a wide range of expression and maintained their relative strengths irrespective of the coding sequence. We used a standardized assembly strategy to construct a combinatorial library and express for the first time in yeast the five-enzyme violacein biosynthetic pathway. We trained a regression model on a random sample comprising 3% of the total library, and then used that model to predict genotypes that would preferentially produce each of the products in this highly branched pathway. This generalizable method should prove useful in engineering new pathways for the sustainable production of small molecules. © 2013 The Author(s).
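The workflow the abstract describes (characterize promoter strengths, build a combinatorial library, measure a small random sample, then train a regression model to rank the untested genotypes) can be mocked up as follows. This is a deliberately simplified linear-regression stand-in with invented promoter strengths and a made-up titer function; the study's actual promoters, measurements, and model differ.

```python
import itertools, random
import numpy as np

random.seed(0)

# Hypothetical relative promoter strengths (assumed values); the study
# characterized a set of constitutive S. cerevisiae promoters.
promoters = {"pA": 0.2, "pB": 1.0, "pC": 2.5}
genes = 5                        # five enzymes in the violacein pathway

# Full combinatorial library: one promoter choice per gene (3^5 = 243).
library = list(itertools.product(promoters, repeat=genes))

def measured_titer(genotype):
    """Stand-in for an experimentally measured product titer (made up):
    more expression helps, but overexpressing enzyme 3 creates a
    flux imbalance that hurts."""
    x = [promoters[p] for p in genotype]
    return sum(x) - 0.4 * x[2] ** 2

# "Measure" a random ~3% sample of the library, as in the study.
sample = random.sample(library, max(6, int(0.03 * len(library))))
X = np.array([[promoters[p] for p in g] + [1.0] for g in sample])
y = np.array([measured_titer(g) for g in sample])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Rank every genotype in the full library with the trained model.
X_all = np.array([[promoters[p] for p in g] + [1.0] for g in library])
best = library[int(np.argmax(X_all @ coef))]
print(best)
```

The payoff of this design is in the sample sizes: the model is fit to 7 measured constructs but scores all 243, which is what makes the approach viable for products with no high-throughput assay.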
News Article | April 1, 2016
"We evaluated germination and plant growth for prairie cordgrass accessions and switchgrass cultivars in a greenhouse study," says crop scientist D.K. Lee. In crop production, too much salt in the soil can interfere with the plant's ability to absorb water. Water moves into plant roots by osmosis: when solutes inside root cells are more concentrated than in the soil, water moves into the root. In salt-affected soil, the difference in solute concentration inside and outside of the root is not as great, meaning that water may not move in. So, even where soil is moist, plants experience drought-like conditions when too much salt is present. Certain mineral salts are also toxic to plants; when they are taken up along with soil water, plant tissue damage can occur. "Saline soils are characterized by high concentrations of soluble salts, such as sodium chloride, calcium chloride, or magnesium sulfate, whereas sodic soils are solely characterized by their high sodium concentrations," Lee explains. "Many soils are both saline and sodic." The researchers subjected six prairie cordgrass accessions and three switchgrass cultivars to different levels of sodicity and salinity over two years of growth. The team conducted a similar experiment in an earlier study, but only looked at one cordgrass ('Red River') and one switchgrass ('Cave-In-Rock') cultivar, over only one growing season. "In that study, we found that 'Cave-In-Rock' switchgrass was not good at all in terms of salt tolerance. 'Red River' cordgrass was far superior," Lee recalls. The expanded study showed that prairie cordgrass had, on average, much higher germination rates than switchgrass in saline and sodic conditions. Dry biomass production was not as clearly split between the two species in salty conditions, however. Three prairie cordgrasses, pc17-102, pc17-109, and 'Red River', and one switchgrass, EG-1102, produced equivalent amounts of dry biomass when subjected to high-salt conditions. 
However, they produced approximately 70 to 80 percent less biomass in salty conditions than they did with no added salt. In contrast, the salt-susceptible switchgrass cultivar, EG-2012, produced approximately 99.5 percent less biomass in high-salt treatments than it did without added salt. The next step for the researchers is to bring this work out of the greenhouse, where climate is controlled and water is unlimited, to real-world scenarios. Preliminary field research has shown that prairie cordgrass is very successful in salt-affected areas in Illinois and South Dakota. "Even in highly saline soils, prairie cordgrass can do very well. Unlike switchgrass, it can take up salt dissolved in water without getting sick because it can excrete it through specialized salt glands. Then, once the plants grow deep roots, they can access less salty water," Lee explains. More research and agronomic improvements are needed before prairie cordgrass can be recommended widely as a biomass crop, but Lee sees a lot of potential in this species. "Prairie cordgrass is an interesting species," he says. "As a warm season grass, I think it is unique in being able to handle low temperatures, and it is also well adapted to poorly drained soils and lands with frequent flooding. And even in high-salt conditions in the field, we're getting pretty good yields: up to 8 or 9 tons per acre." The article, "Determining effects of sodicity and salinity on switchgrass and prairie cordgrass germination and plant growth," is published in Industrial Crops and Products. Lee's co-authors, Eric Anderson, Tom Voigt, and Sumin Kim, are also from the U. of I. The project was funded by the Energy Biosciences Institute. More information: Eric K. Anderson et al. Determining effects of sodicity and salinity on switchgrass and prairie cordgrass germination and plant growth, Industrial Crops and Products (2015). DOI: 10.1016/j.indcrop.2014.11.016
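The osmosis argument earlier in the article can be made quantitative with the van 't Hoff relation, Psi_s = -i * M * R * T, which estimates the osmotic potential of a salt solution. Water moves toward the more negative water potential, so when salty soil water is more negative than the root interior, uptake stalls even in moist soil. The concentrations below are purely illustrative, not measurements from the study.

```python
# Van 't Hoff estimate of osmotic potential, Psi_s = -i * M * R * T,
# illustrating why salty soil water resists uptake by roots.
R = 0.008314   # MPa * L / (mol * K)
T = 298.0      # K (about 25 C)

def osmotic_potential(molarity, ions=2):
    """Osmotic potential (MPa) of a salt solution; ions is the
    van 't Hoff factor (2 for fully dissociated NaCl)."""
    return -ions * molarity * R * T

# Illustrative concentrations, not measurements from the study:
soil = osmotic_potential(0.10)   # saline soil solution, 100 mM NaCl
root = osmotic_potential(0.05)   # solutes in root cell sap
print(round(soil, 2), round(root, 2))   # -> -0.5 -0.25
```

With these made-up numbers the soil solution (about -0.5 MPa) is more negative than the root sap (about -0.25 MPa), so the gradient that normally pulls water into the root is reversed, which is the "drought-like conditions in moist soil" effect Lee describes.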