Natural Resource Conservation Service
News Article | April 18, 2017
An international team of scientists, including one from Lawrence Livermore National Laboratory, has found that up to 20 percent of the loss in the annual maximum amount of water contained in the Western United States' mountain snowpack over the last three decades is due to human influences. Peak runoff in streams and rivers of the Western U.S. is strongly influenced by melting of accumulated mountain snowpack, so a significant decline in this resource directly affects streamflow, with substantial economic and societal impacts.

The team showed that the observed snowpack loss between the 1980s and 2000s is consistent with results from climate simulations that combine changes in natural factors (such as solar irradiance and volcanic aerosols) with human influences (such as greenhouse gases, aerosols, ozone and land use). The observed loss was inconsistent with simulations that considered natural influences only.

Based on the current state of the snowpack, the researchers estimate a further loss of up to 60 percent within the next 30 years. "The projected losses have serious implications for the hydropower, municipal and agricultural sectors in the region," said John Fyfe, a senior research scientist at the Canadian Centre for Climate Modelling and Analysis of Environment and Climate Change Canada and lead author of a paper appearing in the April 18 edition of the journal Nature Communications.

Using observations, land surface reanalysis products and climate model simulations, the team characterized the combined influences of decadal variability and external influences on recently observed and near-term projected changes in snowpack over the Western U.S.

"These results add to the evidence of a human influence on climate that will have severe impacts on our water supply," said Benjamin Santer, an LLNL climate scientist and a co-author of the paper.
Observations of snow water equivalent (SWE) were acquired from the United States Natural Resource Conservation Service Snow Telemetry (SnoTel) network of automated snow pillow measurements across alpine sites. The team focused on the post-1981 period, since the number of observations before this time was too low for calculations of reliable regional averages of SWE. The researchers used monthly (January-May) SnoTel observations that were continuously available from 1982 to 2016 at 354 stations with elevations greater than 1,500 meters.

The data showed that 307 of the 354 stations (or about 87 percent of all stations) have a negative trend in annual maximum snow water. The maximum loss typically occurs in April.

"These reductions in snowpack water storage have broad implications for future forest productivity and carbon storage, forest vulnerability to fire, as well as streamflow and water supply," Fyfe said. "Such sensitivities should be carefully considered in mitigating climate risks, particularly in the context of water resource and land management in the western United States."

The LLNL portion of the work is supported by the DOE Office of Science. Founded in 1952, Lawrence Livermore National Laboratory provides solutions to our nation's most important national security challenges through innovative science, engineering and technology. Lawrence Livermore National Laboratory is managed by Lawrence Livermore National Security, LLC for the U.S. Department of Energy's National Nuclear Security Administration.
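The station-trend count behind the 87 percent figure can be sketched as follows: fit a linear trend to each station's annual-maximum SWE series and count stations with a negative slope. This is an illustrative sketch only; the two station series below are hypothetical, and the paper's actual trend methodology may differ.

```python
# Minimal sketch of the station-trend count: for each station's annual-maximum
# SWE series, fit a least-squares linear trend and count negative slopes.
# The station data here are hypothetical; the study used 354 SnoTel stations
# observed from 1982 to 2016.
import numpy as np

def negative_trend_fraction(station_series):
    """Fraction of stations whose annual-max SWE series has a negative linear trend."""
    negatives = sum(
        np.polyfit(np.arange(len(swe)), swe, 1)[0] < 0
        for swe in station_series.values()
    )
    return negatives / len(station_series)

# Hypothetical stations: "A" declines steadily, "B" is roughly flat to rising.
stations = {
    "A": np.array([500.0, 480.0, 470.0, 455.0, 440.0]),
    "B": np.array([300.0, 305.0, 298.0, 310.0, 312.0]),
}
print(negative_trend_fraction(stations))  # 0.5
```

Applied to the real network, the same logic over 354 station series would yield the reported 307/354 ≈ 87 percent.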
Liu K.,University of Florida |
Sollenberger L.E.,University of Florida |
Newman Y.C.,University of Florida |
Vendramini J.M.B.,Research and Education Center |
And 2 more authors.
Crop Science | Year: 2011
'Tifton 85' bermudagrass (Cynodon spp.) is an important forage in the southern United States, but its responses to the interaction of grazing frequency and intensity have not been studied. Sward persistence, herbage accumulation, and nutritive value were measured during 3 yr. Treatments were all combinations of three postgraze stubble heights (SH; 8, 16, and 24 cm) and three regrowth intervals (RI; 14, 21, and 28 d). Short SH (8 cm) with long RI (28 d) or tall SH (24 cm) with short RI (14 d) produced greatest herbage accumulation (11-15 Mg ha-1 yr-1). Lowest or nearly lowest herbage accumulation occurred with 14-d RI and 8-cm SH or 28-d RI with 24-cm SH (7.4-12 Mg ha-1 yr-1). Intermediate levels of RI (21 d) or SH (16 cm) produced consistent herbage accumulation regardless of level of the other factor. Nutritive value was primarily affected by RI, and P (3.1 to 2.8 g kg-1), crude protein (CP; 150 to 108 g kg-1), and in vitro digestible organic matter (IVDOM; 602 to 582 g kg-1) concentrations decreased as RI increased. Organic matter and nutrient mass of storage organs increased with increasing SH, but the 24-cm SH treatment exhibited greater reduction in percentage cover (~43% units) than the other SH treatments (~22% units) after 3 yr of grazing. These data indicate that intermediate levels of SH (16 cm) and RI (21 d) provided relatively high Tifton 85 herbage accumulation and nutritive value while minimizing negative impacts on persistence-related responses. © Crop Science Society of America.
White-Leech R.,Natural Resource Conservation Service |
Liu K.,China Agricultural University |
Sollenberger L.E.,University of Florida |
Woodard K.R.,University of Florida |
Interrante S.M.,The Noble Foundation
Crop Science | Year: 2013
Forage dry matter harvested (DMH) and nutritive value (NV) are affected by livestock excreta. Efforts to model nutrient cycling in grazed grasslands would benefit from increased understanding of the duration and spatial pattern of excreta effects on grassland patches. These responses were measured on 'Pensacola' bahiagrass (Paspalum notatum Flüggé) swards treated with two excreta types (dung and urine) from two excreta source pastures (Average and High management intensities based on N fertilizer and stocking rates) applied at four frequencies (0, 1, 2, 3 per year) during 2 yr. Forage DMH under dung pats decreased, but DMH surrounding the pat was not affected by application frequency. Suppression of DMH by dung was ≥112 d and extent of suppression increased as application frequency increased. In contrast, DMH under urine increased linearly (2950 to 6250 kg ha-1 for the Average management intensity and 3480 to 6450 kg ha-1 for the High management intensity) as application frequency increased and effects were observed 15 cm beyond the deposit's edge. Forage NV was not affected by dung, but it increased with increasing urine application frequency and for distances up to 30 cm from the edge of the urine deposit. Urine increased DMH for ≥84 d and increased crude protein for ≥28 d following a single urine application and ≥84 d after multiple applications. Data show that duration and spatial patterns of forage response to dung and urine differ, but effects of both can be long lived and are increased by multiple deposits to a patch. © Crop Science Society of America.
Woodard K.R.,University of Florida |
Liu K.,China Agricultural University |
White-Leech U.R.,Natural Resource Conservation Service |
Sollenberger L.E.,University of Florida
Journal of Environmental Quality | Year: 2013
Research is limited for cow-calf operations as a potential nonpoint source of P within Florida's central highlands region (CHR). The study was conducted in a bahiagrass (Paspalum notatum Flüggé) pasture. The soil is an excessively drained 'Candler' sand. In dung-designated plots, 2 kg of fresh cattle dung was deposited across the surface of a 15-cm-radius circular zone (Zone 1 [Z1]) centered within 3 × 3 m plots. In urine plots, 1 L of urine was deposited on Z1 and 1 L on Zone 2 (Z2), an area extending outward from Z1 to 30 cm from plot center. In dung and urine plots, Zone 3 (Z3) extended from Z2 to 45 cm from plot center and Zone 4 (Z4) from Z3 to 60 cm. Excreta deposition frequencies (DFs) were 0, 1, 2, and 3 times per year during 2006 and 2007. Total apparent remaining P (ARP = [fertilizer P + excreta P] - forage P removal) for Z1 of dung plots was 21, 447, 905, and 1249 kg ha-1 for DF0, DF1, DF2, and DF3, respectively. In 2008, soil was incrementally sampled to a depth of 120 cm in all zones. Urine deposition did not increase soil P. Soil P levels and the degree of P saturation percentages increased with DF but only in the upper 10 cm of topsoil beneath Z1 of dung plots. Our results suggest that the risk of dung P reaching groundwater is low due to a considerable P retention capacity within the rooting zone of the Candler soil. © American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America.
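The apparent-remaining-P budget in the abstract is simple arithmetic and can be expressed directly. The formula comes from the text; the component values in the example below are hypothetical, since the abstract reports only the resulting ARP totals.

```python
# Apparent remaining P for a treatment zone, per the abstract's definition:
#   ARP = (fertilizer P + excreta P) - forage P removal   [kg ha-1]
# The formula is from the text; the input values below are hypothetical.

def apparent_remaining_p(fertilizer_p, excreta_p, forage_p_removal):
    """Return apparent remaining P (kg ha-1) for a treatment zone."""
    return fertilizer_p + excreta_p - forage_p_removal

# Hypothetical example: 40 kg/ha fertilizer P, 450 kg/ha dung-derived P,
# and 60 kg/ha removed in harvested forage.
print(apparent_remaining_p(40.0, 450.0, 60.0))  # 430.0
```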
Fleming K.S.,Mississippi State University |
Kaminski R.M.,Mississippi State University |
Tietjen T.E.,Southern Nevada Water Authority |
Schummer M.L.,Mississippi State University |
And 2 more authors.
Wetlands | Year: 2012
The Wetland Reserve Program (WRP) prescribes management of vegetation in moist-soil wetlands for waterfowl and other wildlife. This study used a block design on 18 sites in the Mississippi Alluvial Valley (MAV) in Mississippi to evaluate effectiveness of management prescriptions. Objectives were to determine appropriate timing of vegetation surveys and whether vegetation community metrics on private lands differed among management strategies (2008-2009): 1) active (e.g., annual soil disturbance), early drawdown of standing water (i.e., by 15 June), 2) active, late drawdown (≥3 weeks after early drawdown), and 3) passive, natural evaporation. A Vegetative Forage Quality Index (VFQI) was developed to assess quality of plant communities as forage for waterfowl. The study examined VFQI, plant community diversity and richness, percent (%) occurrence of grass, % woody species, and mean number of plant life-forms among management categories (α=0.10). Plant community metrics were measured June-October but only October metrics revealed differences in both years (p≤0.09). Active-early had the greatest VFQI, diversity, mean number of plant life-forms, and percent abundance of grasses in October 2008 and 2009 (p≤0.07). Results suggest that quality forage for waterfowl may be achieved through active management with early drawdown. © Society of Wetland Scientists 2012.
Olmstead V.G.,Arkansas Tech University |
Webb E.B.,Arkansas Tech University |
Johnson R.W.,Natural Resource Conservation Service
Wetlands | Year: 2013
To better understand the contribution of private lands enrolled in conservation easement programs to wintering waterfowl habitat and energetics, we evaluated effects of management strategies on seed biomass, species richness, and presence of beneficial (i.e., considered to have nutritional value to waterfowl) and introduced species on 32 Wetland Reserve Program easements in Arkansas and Mississippi. We collected soil core samples from seasonal wetlands with active and passive management strategies in 2008-2009. Overall mean (±SE) biomass for all seeds was 527.8 (±28.5) kg/ha, whereas mean biomass of beneficial seeds was 263.5 (±18.5) kg/ha. Actively managed sites in Mississippi had greater beneficial seed mass than passively managed sites, whereas management had no effect on beneficial seed mass in Arkansas in fall 2008; in fall 2009, passive sites had greater beneficial seed mass than active sites. Our estimate of beneficial seed biomass on WRP easements represents a 47% reduction in estimated food availability for waterfowl on privately owned wetlands. Lower estimates of food availability on privately owned, seasonal wetlands in the lower Mississippi Alluvial Valley may warrant increased conservation efforts for seasonal wetlands or additional emphasis on management techniques to increase moist-soil seed biomass on privately owned seasonal wetlands. © US Government 2013.
Schuh M.C.,Natural Resource Conservation Service |
Casey F.X.M.,North Dakota State University |
Hakk H.,U.S. Department of Agriculture |
DeSutter T.M.,North Dakota State University |
And 3 more authors.
Journal of Hazardous Materials | Year: 2011
The occurrence of the manure-borne estrogen, 17β-estradiol (E2), was investigated in laboratory and field soils. In the laboratory, E2 was applied to soil to simulate concentrations found in swine (Sus scrofa domestica) manure (5000 ng L-1). The aqueous-extracted E2 dissipated in the soil by 98% within 1 h and was not significantly different from background concentrations (18 ng L-1) for the duration of the experiment (64 h). In the field study, soil cores were taken before and on several dates after swine manure application. Equivalent porewater concentrations of water-extractable E2 were determined in 0.15-m increments down to the water table (0.70-2.00 m deep). The average frequency of detection for 168 samples was 38% (average = 40 ng L-1 porewater equivalents). Eleven days after manure application there was no significant effect on E2 detection frequency or concentration. However, E2 concentrations significantly increased by 6 months after manure application, and appeared to be related to precipitation. Concentrations then returned to original levels by 17 months after manure application. Manure did not have an immediate effect on E2 occurrence due to the capacity of the soil to rapidly sorb E2. However, it appears that soil may act as a long-term reservoir for E2 in the environment, which may be periodically released through desorption. © 2011 Elsevier B.V.
News Article | November 21, 2016
The authors recommend developing a network that will create a more comprehensive and integrated platform to support evidence-based conservation and archive program results to better assess effectiveness.

Dr. David Briske, T.M. O'Connor Professor in the department of ecosystem science and management at Texas A&M University in College Station, recently authored the paper with experts from the U.S. Department of Agriculture-Agricultural Research Service, Utah State University and the University of Wyoming. The paper, titled Assessment of USDA-NRCS Rangeland Conservation Programs: Recommendation for an Evidence-based Conservation Platform, examines the effectiveness of conservation practices on U.S. rangelands.

Briske, who conducts rangeland research through Texas A&M AgriLife Research, also served as academic coordinator and editor of an earlier study, Conservation Benefits of Rangeland Practices: Assessment, Recommendations and Knowledge Gaps, published in 2011. The 2011 study resulted from a request by the Office of Management and Budget for the USDA Natural Resource Conservation Service to document the societal benefits anticipated from a major increase in conservation funding authorized by the 2002 Farm Bill. Conservation funding in the Environmental Quality Incentive Program, or EQIP, the primary program funding conservation practices, increased from $200 million in 1996 to $1.3 billion in the 2002 Farm Security and Rural Investment Act, with a goal to maximize the environmental benefits of conservation funding, he said.

Briske said the Conservation Effects Assessment Project, or CEAP, was created at that time to assess these future conservation benefits. CEAP produced an unprecedented assessment of rangeland conservation practices conducted by a team of 40 scientists interacting with 30 NRCS partners.
They assessed the effectiveness of seven major conservation practices – prescribed grazing, prescribed burning, brush management, range planting, riparian herbaceous cover, upland wildlife habitat management and invasive plant management. "These are the primary conservation practices on rangelands and have been implemented for decades, both with and without federal cost-share funding," he said. "Surprisingly, this comprehensive assessment of rangeland conservation practices was unable to determine if benefits had occurred because practice outcomes were seldom documented." He said the paper recently published in Ecological Applications examines the underlying causes contributing to minimal documentation of the outcomes of federally funded conservation practices on U.S. rangelands as described in the initial assessment. The authors concluded that existing conservation programs are insufficiently designed to support efficient, cost-effective and accountable conservation investments on rangelands. They further stated that modification of the standards used to implement these conservation practices alone will not achieve the goals explicitly requested by CEAP. The problem, he said, is the practice standards are not sufficiently grounded in scientific evidence, relevant USDA databases or knowledge of production and environmental outcomes originating from conservation practices. "There is no capacity to learn from the results of previously implemented practices so that this knowledge can be applied to future conservation activities," Briske said. "We recommend that these conservation programs be restructured to establish a Conservation Programs Assessment Network to provide a more comprehensive and integrated platform to support evidence-based conservation," he said. 
The paper outlines the general structure of this conservation network, which would be based on collaborative monitoring of conservation practice outcomes among landowners, agency personnel and scientists to establish the missing information feedback loops between conservation practices and their agricultural and environmental outcomes. "Monitoring would be selectively conducted on the most important conservation practices and in the major ecoregions where they are applied," Briske said.

He said the team concluded that restructuring conservation programs as recommended will directly address two major challenges confronting USDA-NRCS conservation programs. The first is the need for collaborative management to provide site-specific information, learning and accountability as requested by CEAP, Briske said. Secondly, it will further advance efforts to balance delivery of agricultural production and environmental quality goals by documenting the tradeoffs that exist among them in conservation programs. The goal, he said, is to archive evidence-based conservation information in this network so it can be made available to guide other related conservation programs in appropriate ecoregions.

More information: D. D. Briske et al. Assessment of USDA-NRCS Rangeland Conservation Programs: Recommendation for an Evidence-based Conservation Platform, Ecological Applications (2016). DOI: 10.1002/eap.1414
News Article | October 24, 2016
Andy Johnson works with the soil. When younger, he served in the Peace Corps in Central America for three years, working on conservation practices. He then worked for the Natural Resource Conservation Service for years, the same agency that his father, Paul Johnson, headed by appointment from Bill Clinton in 1993. After moving back to northeast Iowa in 2007, he started farming Christmas trees and grass-fed beef cows while thinking about how the concept of conservation applied to his community's energy use and economy. When stimulus money became available in 2009, Andy used his knowledge of soil and water conservation districts to promote the idea of using those funds toward an energy district around Decorah, IA. The idea became more popular than he imagined, as he and an assembled board of directors won a federal grant.

In the follow-up from our community-owned renewable energy report, Beyond Sharing: How Communities Can Take Ownership of Renewable Power, we released the second in a series of Local Energy Rules podcasts that informed the writing of this report. In December 2015, Johnson talked with ILSR's John Farrell about the Winneshiek Energy District and how it promotes energy efficiency and renewable energy for all county residents, as well as what the future holds for distributed energy everywhere.

Soil and water conservation districts grew out of the Dust Bowl era, when poor farming practices exacerbated dust storms and extended droughts. Championed by President Franklin Roosevelt, these districts were local-state-federal partnerships that provided farmers with technical and financial assistance in sustainable land use. The Winneshiek Energy District, the first of its kind in the nation, works much the same way, only instead of keeping the soil on the ground, it keeps the money spent on energy within the county. The organization estimates that roughly $100 million leaves the county every year to pay out-of-state energy interests.
The District's primary work is home energy assessments, helping owners implement energy efficiency improvements as well as renewable energy. They've helped 600 houses so far, saving participants an estimated $3 million. Their conversion rate, or the percentage of customers that go on to make suggested improvements, ranges from 50 to 95 percent, says Johnson. In all, Winneshiek County has more than 190 watts of solar per person, according to numbers provided by Johnson. The county would rank second among the solar-friendliest cities in the United States.

Johnson understands that investor-owned utilities such as Alliant Energy have to make a profit for investors. But those profits don't stay in the community. "It's a tough question," he says about working with Alliant. The utility hasn't been much of an ally, and has stalled on the idea of a community solar array in Decorah. "It's been a tough uphill battle."

The Winneshiek Energy District works with the utility, but mostly below it, with the utility's ratepayers. It's different from other energy districts, which are governmentally formed zones, driven by legislation in the past decade, that create funding opportunities for renewable energy and energy efficiency; Connecticut and Ohio support some of the best examples here. Winneshiek, on the other hand, is a nonprofit, and supports more than just a municipality or a specific subset of people, says Johnson. That enabled it to offer free energy efficiency services to more than 500 households in recent years, while working with the utility on offering more general measures such as community solar gardens.

Winneshiek has met with Luther College, local governments, and other large electric customers to put pressure on the utility. Another route is participating in the proceedings of the state regulatory agency, the Iowa Utilities Board, something Johnson is doing now to defend net metering and the value of solar, or talking to state lawmakers.
Local control of energy, at the very least, can be helped most by the state's municipal or cooperative utilities, or even by encouraging a town to form its own utility. Johnson predicts that most municipal utilities and co-ops will be working sooner or later toward 100% renewable energy. "Hopefully the utilities, in the rapidly changing energy world, will increasingly and more and more rapidly work with their customers," he says. "But if not we have to do what we can with our communities."

Thanks largely to the Winneshiek Energy District, renewable energy is on a tear in Decorah, representing what a little bit of local energy planning can do for people. Now Johnson is talking to people in other states who want to do the same thing. "Energy districts are essentially meant to be the boots on the ground as well as the preachers of what can happen."

This is the 35th edition of Local Energy Rules, an ILSR podcast with Director of Democratic Energy John Farrell that shares powerful stories of successful local renewable energy and exposes the policy and practical barriers to its expansion. Other than his immediate family, the audience is primarily researchers, grassroots organizers, and grasstops policy wonks who want vivid examples of how local renewable energy can power local economies. It is published intermittently on ilsr.org, and you can subscribe to the podcast via iTunes or RSS/XML. For timely updates, follow John Farrell on Twitter or get the Energy Democracy weekly update.
John Farrell directs the Democratic Energy program at ILSR, focusing on energy policy developments that best expand the benefits of local ownership and dispersed generation of renewable energy. His seminal paper, Democratizing the Electricity System, describes how to blast the roadblocks to distributed renewable energy generation, and how such small-scale renewable energy projects are the key to the biggest strides in renewable energy development. Farrell also authored the landmark report Energy Self-Reliant States, which serves as the definitive energy atlas for the United States, detailing the state-by-state renewable electricity generation potential. Farrell regularly provides discussion and analysis of distributed renewable energy policy on his blog, Energy Self-Reliant States (energyselfreliantstates.org), and his articles are regularly syndicated on Grist and Renewable Energy World. John Farrell can also be found on Twitter @johnffarrell, or at firstname.lastname@example.org.