Urbana, IL, United States

News Article | June 23, 2017
Site: www.eurekalert.org

As farmers survey their fields this summer, several questions come to mind: How many plants germinated per acre? How does altering row spacing affect my yields? Does it make a difference if I plant my rows north to south or east to west? Now a computer model can answer these questions by comparing billions of virtual fields with different planting densities, row spacings, and orientations.

The University of Illinois and the Partner Institute for Computational Biology in Shanghai developed this computer model to predict the yield of different crop cultivars in a multitude of planting conditions. Published in BioEnergy Research, the model depicts the growth of 3D plants, incorporating models of the biochemical and biophysical processes that underlie productivity.

Teaming up with the University of Sao Paulo in Brazil, they used the model to address a question for sugarcane producers: How much yield might be sacrificed to take advantage of a possible conservation planting technique?

"Current sugarcane harvesters cut a single row at a time, which is time-consuming and leads to damage of the crop stands," said author Steve Long, Gutgsell Endowed Professor of Plant Biology and Crop Sciences at the Carl R. Woese Institute for Genomic Biology. "This could be solved if the crop was planted in double rows with gaps between the double rows. But plants in double rows will shade each other more, causing a potential loss of profitability."

The model found that double-row spacing costs about 10% of productivity compared to traditional row spacing; however, this loss can be reduced to just 2% by choosing cultivars with more horizontal leaves planted in a north-south orientation.

"This model could be applied to other crops to predict optimal planting designs for specific environments," said Yu Wang, a postdoctoral researcher at Illinois who led the study. "It could also be used in reverse to predict the potential outcome for a field."

The authors predict this model will be especially useful when robotic planting becomes more commonplace, which will allow for many more planting permutations.

This research was supported by the IGB, Energy Biosciences Institute, Realizing Increased Photosynthetic Efficiency (RIPE) project, and the Chinese Academy of Sciences. The paper "Development of a Three-Dimensional Ray-Tracing Model of Sugarcane Canopy Photosynthesis and Its Application in Assessing Impacts of Varied Row Spacing" is published in BioEnergy Research (DOI: 10.1007/s12155-017-9823-x). Co-authors include: Yu Wang, Qingfeng Song, Deepak Jaiswal, Amanda P. de Souza, and Xin-Guang Zhu.

The Carl R. Woese Institute for Genomic Biology (IGB) advances life sciences research through interdisciplinary collaborations within a state-of-the-art genomic research facility at the University of Illinois. The Energy Biosciences Institute (EBI) is a public-private collaboration to help solve the global energy challenge. Realizing Increased Photosynthetic Efficiency (RIPE) is an international research project funded by the Bill & Melinda Gates Foundation to engineer plants to more efficiently turn the sun's energy into food to sustainably increase worldwide food productivity.
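To give a feel for the kind of search such a model enables, the sketch below enumerates candidate planting designs and ranks them with a toy scoring function written in Python. The scoring function, its weights, and the parameter values are illustrative stand-ins, not the published ray-tracing model.

```python
# Illustrative sketch only: the published model is a 3D ray-tracing canopy
# simulation; a toy scoring function stands in for it here so the sweep runs.
from itertools import product

def toy_yield_score(density, row_spacing_cm, orientation, leaf_angle_deg):
    # Crude stand-in: denser stands and narrower rows intercept more light;
    # N-S rows and flatter leaves reduce mutual shading (toy weights).
    score = density * 10 - row_spacing_cm * 0.05 - leaf_angle_deg * 0.1
    if orientation == "N-S":
        score += 2.0
    return score

designs = product([8, 10, 12],          # plants per m^2 (illustrative)
                  [75, 150],            # row spacing in cm
                  ["N-S", "E-W"],       # row orientation
                  [20, 45, 70])         # leaf angle from horizontal, degrees
best = max(designs, key=lambda d: toy_yield_score(*d))
print("best toy design:", best)
```

In the real model, the scoring step is replaced by ray tracing through a 3D canopy plus biochemical models of photosynthesis, but the outer loop over planting designs works the same way.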


News Article | June 26, 2017
Site: www.eurekalert.org

The Deepwater Horizon oil spill in the Gulf of Mexico in 2010 is one of the most studied spills in history, yet scientists haven't agreed on the role of microbes in eating up the oil. Now a research team at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) has identified all of the principal oil-degrading bacteria as well as their mechanisms for chewing up the many different components that make up the released crude oil.

The team, led by Berkeley Lab microbial ecologist Gary Andersen, is the first to simulate the conditions that occurred in the aftermath of the spill. Their study, "Simulation of Deepwater Horizon oil plume reveals substrate specialization within a complex community of hydrocarbon-degraders," was just published in the Proceedings of the National Academy of Sciences.

"This provides the most complete account yet of what was happening in the hydrocarbon plumes in the deep ocean during the event," said Andersen. Berkeley Lab's Ping Hu, the lead author of the study, added: "We simulated the conditions of the Gulf of Mexico oil spill in the lab and were able to understand the mechanisms for oil degradation from all of the principal oil-degrading bacteria that were observed in the original oil spill."

This oil spill was the largest in history, with the release of 4.1 million barrels of crude oil as well as large amounts of natural gas from a mile below the surface of the ocean. After the initial explosion and uncontained release of oil, researchers observed a phenomenon that had not been seen before: More than 40 percent of the oil, combined with an introduced chemical dispersant, was retained in a plume nearly 100 miles long at this great depth. Yet because of the difficulty in collecting samples from so far below the ocean surface, and because of the large area that was impacted by the spill, a number of gaps in understanding the fate of the oil over time remained.

Andersen and his team returned to the spill location four years later to collect water at depth. With the assistance of co-authors Piero Gardinali of Florida International University and Ron Atlas of the University of Louisville, a suspension of small, insoluble oil droplets was evenly distributed in bottles, along with the more soluble oil fractions and chemical dispersant to mimic the conditions of the oil plume. Over the next 64 days the composition of the microbes and the crude oil was intensively studied.

The researchers witnessed an initial rapid growth of a microbe that had been previously observed to be the dominant bacterium in the early stages of the oil release but which had eluded subsequent attempts by others to recreate the conditions of the Gulf of Mexico oil plume. Through DNA sequencing of its genome they were able to identify its mechanism for degrading oil. They gave this newly discovered bacterium the tentative name of Bermanella macondoprimitus based on its relatedness to other deep-sea microbes and the location where it was discovered.

"Our study demonstrated the importance of using dispersants in producing neutrally buoyant, tiny oil droplets, which kept much of the oil from reaching the ocean surface," Andersen said. "Naturally occurring microbes at this depth are highly specialized in growing by using specific components of the oil for their food source. So the oil droplets provided a large surface area for the microbes to chew up the oil."

Working with Berkeley Lab scientist Jill Banfield, a study co-author and also a professor in UC Berkeley's Department of Earth and Planetary Sciences, the team used newly developed DNA-based methods to identify all of the genomes of the microbes that used the introduced oil for growth, along with their specific genes that were responsible for oil degradation. Many of the bacteria that were identified were similar to oil-degrading bacteria found on the ocean surface but had considerably streamlined sets of genes for oil degradation.

Early work on microbial activity after the oil spill, led by Berkeley Lab's Terry Hazen (now primarily associated with the University of Tennessee), provided the first data ever on microbial activity from a deepwater dispersed oil plume. While Hazen's work revealed a variety of hydrocarbon degraders, this latest study identified the mechanisms the bacteria used to degrade oil and the relationship of these organisms involved in the spill to previously characterized hydrocarbon-degrading organisms.

"We now have the capability to identify the specific organisms that would naturally degrade the oil if spills occurred in other regions and to calculate the rates of the oil degradation to figure out how long it would take to consume the spilled oil at depth," Andersen said.

Andersen noted that it is not clear if the degradation of oil at these depths would have occurred in other offshore oil-producing regions. "The Gulf of Mexico is home to one of the largest concentrations of underwater hydrocarbon seeps, and it has been speculated that this helped in the selection of oil-degrading microbes that were observed in the underwater plumes," he said.

Although the well drilled by the Deepwater Horizon rig was one of the deepest of its time, new oil exploration offshore of Brazil, Uruguay, and India has now exceeded 2 miles below the ocean surface. By capturing water from these areas and subjecting it to the same test, it may be possible in the future to understand the consequences of an uncontrolled release of oil in these areas in greater detail.

"Our greatest hope would be that there were no oil spills in the future," Andersen said. "But having the ability to manipulate conditions in the laboratory could potentially allow us to develop new insights for mitigating their impact."

This research was funded by the Energy Biosciences Institute, a partnership led by UC Berkeley that includes Berkeley Lab and the University of Illinois at Urbana-Champaign. Other study co-authors were Eric Dubinsky, Jian Wang, Lauren Tom, and Christian Sieber of Berkeley Lab, and Alexander Probst of UC Berkeley.

Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel Prizes. The University of California manages Berkeley Lab for the U.S. Department of Energy's Office of Science. DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.
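Andersen's point about using measured rates to estimate how long consumption of the oil would take can be made concrete with a simple first-order decay estimate. The sketch below is generic and uses an assumed rate constant, not a value measured in the study.

```python
# Hedged sketch: one common way to turn a measured degradation rate into a
# time-to-consumption estimate is first-order decay, C(t) = C0 * exp(-k*t).
# The rate constant below is an assumed placeholder, not a value from the study.
import math

k_per_day = 0.1            # assumed first-order rate constant (1/day)
fraction_remaining = 0.05  # ask: when is 95% of this hydrocarbon fraction gone?

half_life_days = math.log(2) / k_per_day
t_95 = -math.log(fraction_remaining) / k_per_day
print(f"half-life ~ {half_life_days:.1f} days; 95% consumed after ~ {t_95:.0f} days")
```

Fractions of the crude oil degrade at different rates, so in practice a separate rate constant would be fit for each major hydrocarbon class.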


News Article | June 27, 2017
Site: www.eurekalert.org

Farmers earn more profits when there is demand for corn for biofuel instead of for food only. This can lead some to convert grasslands and forests to cropland. This conversion, also called indirect land use change, can have large-scale environmental consequences, including releasing stored carbon into the atmosphere. To penalize the carbon emissions from this so-called indirect land use change, the U.S. EPA and the California Air Resources Board include an indirect land use change factor when considering the carbon savings with biofuels for their compliance with the federal Renewable Fuel Standard or California's Low Carbon Fuel Standard.

"Biofuel policies like the Low Carbon Fuel Standard in California are trying to minimize the indirect land use change related emissions by accounting for the indirect land use change factor as part of the carbon emissions per gallon of biofuels. We examine the costs and benefits of using this approach at a national level," says University of Illinois agricultural economist Madhu Khanna.

A research paper on the subject by Khanna and her colleagues appears today in Nature Communications, in which they ask: By how much would carbon emissions be reduced as a result of regulating indirect land use change like they are attempting to do in California? At what cost? And who bears those costs?

Khanna says a low-carbon fuel standard creates incentives to switch to low-carbon advanced biofuels, but including the indirect effect makes compliance more costly and fuel more expensive for consumers.

Evan DeLucia, a U of I professor of plant biology and a co-author on the study, explains that biofuels differ in the carbon emissions they generate per gallon and in their effect on the use of land. Cellulosic biofuels, particularly those made from crop residues or from energy crops like miscanthus and switchgrass grown on low-quality marginal land, lead to lower indirect land use change than corn ethanol.

"Inclusion of the indirect land use change factor makes it much more costly to achieve the Low Carbon Fuel Standard," Khanna says. "It penalizes all biofuels and increases their carbon emissions per gallon. It imposes a hidden tax on all fuels that is borne by fuel consumers and blenders."

"What we find is the inclusion of this indirect land use change factor leads to a relatively small reduction in emissions and this reduction comes at a very large cost to fuel consumers and fuel blenders," Khanna says. "The economic cost of reducing these carbon emissions is much higher than the value of the damages caused by those emissions, as measured by the social cost of carbon. What our findings suggest is that it's not optimal to regulate indirect land use change in the manner that is currently done in California, or to extend that approach to other parts of the country."

The social cost of carbon, Khanna says, is $50 per ton of carbon dioxide on average. The economic cost of reducing carbon emissions by including California's indirect land use change factor at a national level is $61 per ton of carbon dioxide. The use of California's indirect land use change factors applied nationally would imply that the cost of reducing a ton of carbon is about 20 percent higher than the avoided damages from those emissions.

"We find that it is just not worth reducing these indirect land use emissions using California's approach. It imposes a cost that is passed on to the consumer in the form of a higher cost for fuel," Khanna says. "These costs for fuel consumers could range from $15 billion to $131 billion nationally over a decade, depending on the indirect land use change factors applied."

"We need to think of better ways to prevent indirect land use change that would be more cost-effective," Khanna says.

Currently, there is no national low-carbon fuel standard. California has one, Oregon recently established a low-carbon fuel standard, and other states are considering it. Khanna says this study provides useful information as states move forward to determine whether or not they should continue this policy of including an indirect land use change factor when they implement a low-carbon fuel standard.

"A lot of effort has been made and continues to be made to calculate indirect land use change factors so they can be included in implementing low-carbon fuel policies," Khanna says. "The presence of indirect land use change due to biofuels has in fact dominated the whole debate about the climate benefits of biofuels. We may be more productive if we focus more on the direct carbon savings with biofuels and incorporate those in trying to encourage the move toward lower carbon biofuels rather than regulating the indirect effects. Estimates of the indirect effects of biofuels have also become much smaller over time, and it's time to re-evaluate the benefits of continuing the policy of regulating indirect emissions," Khanna says.

The paper, "The social inefficiency of regulating indirect land use change due to biofuels," is written by Madhu Khanna, Weiwei Wang, Tara W. Hudiburg, and Evan H. DeLucia and is published in Nature Communications. Funding for the work was provided by the Energy Foundation, the Energy Biosciences Institute, University of California, Berkeley, and the Department of Energy Sun Grant. Khanna is the ACES Distinguished Professor for Environmental Economics in the Department of Agricultural and Consumer Economics in the College of Agricultural, Consumer and Environmental Sciences at the University of Illinois.

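The cost comparison at the heart of the study comes down to simple arithmetic; the check below uses only the per-ton figures quoted above.

```python
# Worked check using the figures quoted in the article.
social_cost_per_ton = 50.0      # $/ton CO2, average social cost of carbon
abatement_cost_per_ton = 61.0   # $/ton CO2, cost of cutting emissions via the ILUC factor

premium = abatement_cost_per_ton / social_cost_per_ton - 1.0
print(f"abatement cost exceeds avoided damages by ~{premium:.0%}")
# prints roughly 22%, i.e. the "about 20 percent higher" cited in the article
```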

News Article | January 26, 2016
Site: phys.org

The new study, published in Plant, Cell and Environment, addresses a central challenge of transgenic plant development: how to reliably evaluate whether genetic material has been successfully introduced. Researchers at the University of Illinois, the Polish Academy of Sciences, the University of Nebraska-Lincoln and the University of California, Berkeley compared the traditional method to several new ones that have emerged from advances in genomic technology and identified one that is much faster than the standard approach, yet equally reliable. The study was led by Illinois postdoctoral fellows Kasia Glowacka and Johannes Kromdijk.

"For plants with long life cycles, such as our food crops, this will greatly speed the time between genetic transformation or DNA editing, and development of pure breeding lines," said Steve Long, Gutgsell Endowed Professor of Crop Sciences and Plant Biology and the principal investigator for the study. Long is also a member of the Genomic Ecology of Global Change and Biosystems Design research themes and the Energy Biosciences Institute at the Carl R. Woese Institute for Genomic Biology.

To meet the food and fuel needs of an ever-growing global population, researchers benefit from transgenic technologies to develop crops with higher yields and greater resiliency to environmental challenges. None of the technologies used to introduce new genetic material into plants work with 100 percent efficiency. Plants and their offspring must be screened to identify those in which gene transfer was successful. Traditionally, this was done in part by testing successive generations of plants to see if the desired traits are present and breed true over time. In addition, plant scientists can use one of several molecular methods to determine if a gene or genes have actually been successfully introduced into the plant genome.

The "tried and true" method, the Southern blot, yields precise data but is slow and unwieldy. It requires isolating relatively large amounts of plant DNA, using fluorescent or radioactive dye to detect the gene of interest, and performing a week's worth of lab work for results from just a few samples at a time.

The team compared the Southern blot technique with several that use variations of a chemical process called the polymerase chain reaction (PCR). This process allows researchers to quantify specific pieces of the introduced DNA sequences by making many additional copies of them and then estimating the number of copies, somewhat like estimating the amount of bacteria present in a sample by spreading it on a petri dish and letting colonies grow until they are visible. These methods are much faster than Southern blotting, but if the DNA in each sample does not "grow" at exactly the same rate, the resulting data will be imprecise: size won't be a perfect indicator of the starting quantity.

One method examined by Long's group, droplet digital PCR (ddPCR), is designed to overcome this weakness. Rather than using the PCR process to amplify all the DNA in a sample, this method first separates each individual fragment of DNA into its own tiny reaction, much like giving each bacterium its own tiny petri dish to grow in. PCR then amplifies each fragment until there are enough copies to be easily detected, and the tiny reactions in which the target is detected are counted. Because this method, unlike others, separates the growth-like step from the quantification step, it can be very precise even when the reaction isn't perfect.

Results can be obtained in less than two days, and many samples can be processed simultaneously. Long hopes that his group's demonstration that ddPCR is a "reliable, fast and high throughput" technique will help it to become the new standard for those developing transgenic crops. "I believe it will become widely adopted," he said. Although ddPCR is currently more expensive than the other methods, Long said the cost would likely drop quickly, as have the costs of other genomic technologies.
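The counting step described above is what makes digital PCR quantitative: positive partitions are converted into target copies with a Poisson correction. The sketch below illustrates that standard calculation; the droplet counts and droplet volume are assumed, illustrative values, not numbers from the study.

```python
# Hedged sketch of absolute quantification in droplet digital PCR:
# each droplet is scored positive/negative and a Poisson correction converts
# the positive fraction into a mean copies-per-droplet estimate.
import math

total_droplets = 15000
positive_droplets = 4200
droplet_volume_ul = 0.00085   # assumed ~0.85 nL per droplet (illustrative)

p = positive_droplets / total_droplets
copies_per_droplet = -math.log(1.0 - p)          # Poisson correction
copies_per_ul = copies_per_droplet / droplet_volume_ul
print(f"~{copies_per_ul:.0f} target copies per microliter of reaction")
```

Because only the presence or absence of amplification in each droplet matters, small differences in amplification efficiency do not bias the estimate, which is the precision advantage described above.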


Kim I.J.,Korea University | Ko H.-J.,Korea University | Kim T.-W.,Energy Biosciences Institute | Nam K.H.,Cornell University | And 2 more authors.
Applied Microbiology and Biotechnology | Year: 2013

BsEXLX1 from Bacillus subtilis was the first bacterial expansin discovered as a structural homolog of a plant expansin, and in a previous study it exhibited synergism with cellulase in cellulose hydrolysis. In this study, the binding characteristics of BsEXLX1 were investigated using pretreated and untreated Miscanthus x giganteus, in comparison with those of CtCBD3, a cellulose-binding domain from Clostridium thermocellum. The amounts of BsEXLX1 bound to cellulose-rich substrates were significantly lower than those of CtCBD3. However, the amounts of BsEXLX1 bound to lignin-rich substrates were much higher than those of CtCBD3. A binding competition assay between BsEXLX1 and CtCBD3 revealed that binding of BsEXLX1 to alkali lignin was not affected by the presence of CtCBD3. This preferential binding of BsEXLX1 to lignin could be related to root colonization in plants by bacteria, and the bacterial expansin could be used as a lignin blocker in the enzymatic hydrolysis of lignocellulose. © 2012 Springer-Verlag Berlin Heidelberg.


Crago C.L.,Energy Biosciences Institute | Khanna M.,University of Illinois | Barton J.,University of British Columbia | Giuliani E.,Partners at Venture | Amaral W.,University of Sao Paulo
Energy Policy | Year: 2010

Corn ethanol produced in the US and sugarcane ethanol produced in Brazil are the world's leading sources of biofuel. Current US biofuel policies create both incentives and constraints for the import of ethanol from Brazil and, together with the cost competitiveness and greenhouse gas intensity of sugarcane ethanol compared to corn ethanol, will determine the extent of these imports. This study analyzes the supply-side determinants of cost competitiveness and compares the greenhouse gas intensity of corn ethanol and sugarcane ethanol delivered to US ports. We find that while the cost of sugarcane ethanol production in Brazil is lower than that of corn ethanol in the US, the inclusion of transportation costs for the former and co-product credits for the latter changes their relative competitiveness. We also find that the relative cost of ethanol in the US and Brazil is highly sensitive to the prevailing exchange rate and prices of feedstocks. At an exchange rate of US$1=R$2.15, the cost of corn ethanol is 15% lower than the delivered cost of sugarcane ethanol at a US port. Sugarcane ethanol has lower GHG emissions than corn ethanol, but a price of over $113 per ton of CO2 is needed to affect competitiveness. © 2010 Elsevier Ltd.
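The exchange-rate sensitivity described in the abstract can be illustrated with a simple delivered-cost comparison. All cost figures below are assumed placeholders; only the US$1=R$2.15 exchange rate is taken from the abstract, so this is a sketch of the mechanism rather than the paper's analysis.

```python
# Illustrative sketch of exchange-rate sensitivity for delivered ethanol cost.
# Cost figures are assumed placeholders, not numbers from the paper.
def delivered_sugarcane_cost_usd(cost_brl_per_l, fx_brl_per_usd, transport_usd_per_l):
    # Convert the Brazilian production cost to US$ and add delivery to a US port.
    return cost_brl_per_l / fx_brl_per_usd + transport_usd_per_l

corn_cost_usd_per_l = 0.45          # assumed US corn ethanol cost
sugarcane_cost_brl_per_l = 0.80     # assumed Brazilian production cost
transport_usd_per_l = 0.12          # assumed shipping/handling to a US port

for fx in (1.8, 2.15, 2.5):
    delivered = delivered_sugarcane_cost_usd(sugarcane_cost_brl_per_l, fx, transport_usd_per_l)
    print(f"R${fx}/US$: delivered sugarcane ${delivered:.2f}/L vs corn ${corn_cost_usd_per_l:.2f}/L")
```

A weaker real (more R$ per US$) lowers the delivered cost of sugarcane ethanol in dollar terms, which is why the relative competitiveness flips with the exchange rate.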


Kim I.J.,Korea University | Ko H.-J.,Korea University | Kim T.-W.,Energy Biosciences Institute | Choi I.-G.,Korea University | Kim K.H.,Korea University
Biotechnology and Bioengineering | Year: 2013

Plant expansin proteins induce plant cell wall extension and have the ability to extend and disrupt cellulose. In addition, these proteins show synergistic activity with cellulases during cellulose hydrolysis. BsEXLX1 originating from Bacillus subtilis is a structural homolog of a β-expansin produced by Zea mays (ZmEXPB1). The Langmuir isotherm for binding of BsEXLX1 to microcrystalline cellulose (i.e., Avicel) revealed that the equilibrium binding constant of BsEXLX1 to Avicel was similar to those of other Type A surface-binding carbohydrate-binding modules (CBMs) to microcrystalline cellulose, and the maximum number of binding sites on Avicel for BsEXLX1 was also comparable to those on microcrystalline cellulose for other Type A CBMs. BsEXLX1 did not bind to cellooligosaccharides, which is consistent with the typical binding behavior of Type A CBMs. The preferential binding pattern of a plant expansin, ZmEXPB1, to xylan, compared to cellulose was not exhibited by BsEXLX1. In addition, the binding capacities of cellulose and xylan for BsEXLX1 were much lower than those for CtCBD3. © 2012 Wiley Periodicals, Inc.
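The Langmuir analysis referred to above can be outlined with a standard curve fit. The binding data below are synthetic placeholders chosen only so the example runs; they are not measurements from the paper.

```python
# Hedged sketch of fitting a Langmuir binding isotherm,
# bound = Bmax * [free] / (Kd + [free]), to estimate Bmax and Kd.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(free_uM, bmax_umol_per_g, kd_uM):
    return bmax_umol_per_g * free_uM / (kd_uM + free_uM)

# Synthetic placeholder data (free protein in uM, bound protein in umol/g cellulose).
free = np.array([0.5, 1, 2, 4, 8, 16, 32])
bound = np.array([0.09, 0.17, 0.28, 0.42, 0.55, 0.64, 0.70])

(bmax, kd), _ = curve_fit(langmuir, free, bound, p0=(1.0, 5.0))
print(f"Bmax ~ {bmax:.2f} umol/g, Kd ~ {kd:.1f} uM")
```

Bmax corresponds to the maximum number of binding sites and Kd (the inverse of the equilibrium binding constant) to the affinity, the two quantities compared against Type A CBMs in the abstract.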


Lee M.E.,University of California at Berkeley | Lee M.E.,Energy Biosciences Institute | Aswani A.,University of California at Berkeley | Han A.S.,University of California at Berkeley | And 3 more authors.
Nucleic Acids Research | Year: 2013

Engineered metabolic pathways often suffer from flux imbalances that can overburden the cell and accumulate intermediate metabolites, resulting in reduced product titers. One way to alleviate such imbalances is to adjust the expression levels of the constituent enzymes using a combinatorial expression library. Typically, this approach requires high-throughput assays, which are unfortunately unavailable for the vast majority of desirable target compounds. To address this, we applied regression modeling to enable expression optimization using only a small number of measurements. We characterized a set of constitutive promoters in Saccharomyces cerevisiae that spanned a wide range of expression and maintained their relative strengths irrespective of the coding sequence. We used a standardized assembly strategy to construct a combinatorial library and express for the first time in yeast the five-enzyme violacein biosynthetic pathway. We trained a regression model on a random sample comprising 3% of the total library, and then used that model to predict genotypes that would preferentially produce each of the products in this highly branched pathway. This generalizable method should prove useful in engineering new pathways for the sustainable production of small molecules. © 2013 The Author(s).
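As a rough illustration of the strategy described in the abstract, the sketch below trains a regression on a small random sample of a combinatorial promoter library and ranks the unmeasured genotypes by predicted titer. The promoter strengths and "titers" are synthetic; none of the numbers come from the study.

```python
# Hedged sketch: regress product titer on promoter strengths for a sampled
# subset (~3%) of a combinatorial library, then rank the rest by prediction.
import numpy as np
from itertools import product
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
promoter_strengths = [0.1, 0.4, 1.0, 2.5]                    # assumed relative strengths
genotypes = np.array(list(product(promoter_strengths, repeat=5)))  # 5-enzyme pathway

# Pretend-measure ~3% of the library with synthetic "titers" for illustration.
sample_idx = rng.choice(len(genotypes), size=int(0.03 * len(genotypes)), replace=False)
true_weights = rng.normal(size=5)
titers = genotypes[sample_idx] @ true_weights + rng.normal(0, 0.1, size=len(sample_idx))

model = LinearRegression().fit(genotypes[sample_idx], titers)
ranked = np.argsort(model.predict(genotypes))[::-1]
print("top predicted genotypes (indices):", ranked[:5])
```

The published work targeted different products within the branched violacein pathway by training separate models; the same sample-then-predict loop applies, with the regression form chosen to fit the measured data.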


Enslow K.R.,Energy Biosciences Institute | Enslow K.R.,University of California at Berkeley | Bell A.T.,Energy Biosciences Institute | Bell A.T.,University of California at Berkeley
Catalysis Science and Technology | Year: 2015

A number of Lewis acid catalysts were screened for their effectiveness in converting both xylose and glucose in aqueous media to furfural and 5-HMF, respectively. While other catalysts were found to be more active, SnCl4 was identified as the most selective Lewis acid. Hydrolysis of SnCl4 was observed at various concentrations and temperatures resulting in the production of Brønsted acidic protons in a 3.5:1 ratio to Sn4+ at all SnCl4 concentrations above 60°C. As a consequence, there was no need to add a Brønsted acid in order to promote the dehydration of either xylose or glucose. SnCl4-promoted isomerization/dehydration of xylose and glucose at 140°C in water resulted in conversions of 55% and 33%, respectively, after 2 h of reaction, and furfural and 5-HMF selectivities of up to 58% and 27%, respectively. Significant conversion of sugars to humins was observed in both cases, and in the case of glucose, degradation of 5-HMF to levulinic and formic acids was also noted. The effects of secondary reactions could be greatly suppressed by extraction of the furanic product as it was produced. Using n-butanol as the extracting agent, xylose and glucose conversions of 90% and 75%, respectively, were observed after 5 h of reaction, and the selectivities to furfural and 5-HMF increased to 85% and 69%, respectively. Small additional increases in the furfural and 5-HMF selectivities were obtained by adding LiCl to the aqueous phase without much effect on the conversion of either sugar. In this case, the selectivities to furfural and 5-HMF were 88% and 72%, respectively, after 5 h of reaction at 140°C. © The Royal Society of Chemistry 2015.
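For readers keeping track of the numbers, product yield is the product of conversion and selectivity. The short check below uses the biphasic (n-butanol extraction) figures reported in the abstract.

```python
# Worked example: yield = conversion x selectivity, using the reported
# figures for the 5 h biphasic (n-butanol extraction) runs.
cases = {
    "furfural from xylose": (0.90, 0.85),
    "5-HMF from glucose":   (0.75, 0.69),
}
for product_name, (conversion, selectivity) in cases.items():
    print(f"{product_name}: yield ~ {conversion * selectivity:.1%}")
# roughly 77% furfural and 52% 5-HMF on a molar basis
```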


News Article | April 1, 2016
Site: www.phys.org

"We evaluated germination and plant growth for prairie cordgrass accessions and switchgrass cultivars in a greenhouse study," says crop scientist D.K. Lee.

In crop production, too much salt in the soil can interfere with the plant's ability to absorb water. Water moves into plant roots by osmosis, and when solutes inside root cells are more concentrated than in soil, water moves into the root. In salt-affected soil, the difference in solute concentration inside and outside of the root is not as great, meaning that water may not move in. So, even where soil is moist, plants experience drought-like conditions when too much salt is present. Certain mineral salts are also toxic to plants. When they are taken up along with soil water, plant tissue damage can occur.

"Saline soils are characterized by high concentrations of soluble salts, such as sodium chloride, calcium chloride, or magnesium sulfate, whereas sodic soils are solely characterized by their high sodium concentrations," Lee explains. "Many soils are both saline and sodic."

The researchers subjected six prairie cordgrass accessions and three switchgrass cultivars to different levels of sodicity and salinity over two years of growth. The team conducted a similar experiment in an earlier study, but only looked at one cordgrass ('Red River') and one switchgrass ('Cave-In-Rock') cultivar, over only one growing season.

"In that study, we found that 'Cave-In-Rock' switchgrass was not good at all in terms of salt tolerance. 'Red River' cordgrass was far superior," Lee recalls.

The expanded study showed that prairie cordgrass had, on average, much higher germination rates than switchgrass in saline and sodic conditions. Dry biomass production was not as clearly split between the two species in salty conditions, however. Three prairie cordgrasses, pc17-102, pc17-109, and 'Red River', and one switchgrass, EG-1102, produced equivalent amounts of dry biomass when subjected to high-salt conditions. However, they produced approximately 70 to 80 percent less biomass in salty conditions than they did with no added salt. In contrast, the salt-susceptible switchgrass cultivar, EG-2012, produced approximately 99.5 percent less biomass in high-salt treatments than it did without added salt.

The next step for the researchers is to bring this work out of the greenhouse, where climate is controlled and water is unlimited, to real-world scenarios. Preliminary field research has shown that prairie cordgrass is very successful in salt-affected areas in Illinois and South Dakota.

"Even in highly saline soils, prairie cordgrass can do very well. Unlike switchgrass, it can take up salt dissolved in water without getting sick because it can excrete it out through specialized salt glands. Then, once the plants grow deep roots, they can access less salty water," Lee explains.

More research and agronomic improvements are needed before prairie cordgrass can be recommended widely as a biomass crop, but Lee sees a lot of potential in this species. "Prairie cordgrass is an interesting species," he says. "As a warm season grass, I think it is unique in being able to handle low temperatures, and it is also well adapted to poorly drained soils and lands with frequent flooding. And even in high-salt conditions in the field, we're getting pretty good yields: up to 8 or 9 tons per acre."

The article, "Determining effects of sodicity and salinity on switchgrass and prairie cordgrass germination and plant growth," is published in Industrial Crops and Products. Lee's co-authors, Eric Anderson, Tom Voigt, and Sumin Kim, are also from the U of I. The project was funded by the Energy Biosciences Institute.

More information: Eric K. Anderson et al. Determining effects of sodicity and salinity on switchgrass and prairie cordgrass germination and plant growth, Industrial Crops and Products (2015). DOI: 10.1016/j.indcrop.2014.11.016
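The biomass comparisons quoted above are expressed relative to each cultivar's own no-salt control. The minimal sketch below shows that calculation with placeholder values.

```python
# Minimal sketch of the comparison reported above: percent biomass reduction
# under a salt treatment relative to the no-salt control. Values are placeholders.
def pct_reduction(control_g, treated_g):
    return (control_g - treated_g) / control_g

print(f"{pct_reduction(100.0, 25.0):.0%} less biomass than control")   # e.g. 75%
print(f"{pct_reduction(100.0, 0.5):.1%} less biomass than control")    # e.g. 99.5%
```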
