Berkeley National Laboratory

Berkeley, CA, United States

News Article | May 15, 2017
Site: www.eurekalert.org

Simulations run at the U.S. Department of Energy's Lawrence Berkeley National Laboratory, as part of a unique collaboration with Lawrence Livermore National Laboratory and an industry consortium, could help U.S. paper manufacturers significantly reduce production costs and increase energy efficiency. The project is one of the seedling projects of the DOE's HPC for Manufacturing (HPC4Mfg) initiative, a multi-lab effort to use high performance computing to address complex challenges in U.S. manufacturing.

Through HPC4Mfg, Berkeley Lab and LLNL are partnering with the Agenda 2020 Technology Alliance, a group of paper manufacturing companies with a roadmap to reduce the industry's energy use by 20 percent by 2020. The papermaking industry ranks third among the country's largest energy users, behind only petroleum refining and chemical production, according to the U.S. Energy Information Administration.

To address this issue, the LLNL and Berkeley Lab researchers are using advanced supercomputer modeling techniques to identify ways that paper manufacturers could reduce energy and water consumption during the papermaking process. The first phase of the project targeted "wet pressing," an energy-intensive step in which mechanical pressure squeezes water out of the wood pulp and into press felts, which absorb it like a sponge before the paper is sent through a drying stage.

"The major purpose is to leverage our advanced simulation capabilities, high performance computing resources and industry paper press data to help develop integrated models to accurately simulate the water papering process," said Yue Hao, an LLNL scientist and a co-principal investigator on the project. "If we can increase the paper dryness after pressing and before the drying (stage), that would provide the opportunity for the paper industry to save energy."

If manufacturers could increase the paper's dryness by 10 to 15 percent, he added, it would save up to 20 percent of the energy used in the drying stage (as much as 80 trillion BTUs, or British thermal units, per year) and as much as $400 million for the industry annually.

For the HPC4Mfg project, the researchers used a computer simulation framework developed at LLNL that integrates mechanical deformation and two-phase flow models, together with a full-scale microscale flow model developed at Berkeley Lab to resolve the complex pore structures in the press felts.

Berkeley Lab's contribution centered on a flow and transport solver for complex geometries developed by David Trebotich, a computational scientist in the Computational Research Division at Berkeley Lab and co-PI on the project. The solver is based on the Chombo software libraries developed in the lab's Applied Numerical Algorithms Group and is the basis for other application codes, including Chombo-Crunch, a subsurface flow and reactive transport code that has been used to study reactive transport processes associated with carbon sequestration and fracture evolution. This suite of simulation tools has been shown to run at scale on supercomputers at Berkeley Lab's National Energy Research Scientific Computing Center (NERSC), a DOE Office of Science User Facility.

"I used the flow and transport solvers in Chombo-Crunch to model flow in paper press felt, which is used in the drying process," Trebotich explained. "The team at LLNL has an approach that can capture the larger scale pressing, or deformation, as well as the flow in bulk terms. However, not all of the intricacies of the felt and the paper are captured by this model, just the bulk properties of the flow and deformation. My job was to improve their modeling at the continuum scale by providing them with an upscaled permeability-to-porosity ratio from pore scale simulation data."

Trebotich ran a series of production runs on NERSC's Edison system and provided his LLNL colleagues with numbers from these microscale simulations for both compressed and uncompressed felt, which improved their model, he added.

"This was true 'HPC for manufacturing,'" Trebotich said, noting that the team recently released its final report on the first phase of the pilot project. "We used 50,000 to 60,000 cores at NERSC to do these simulations. It's one thing to take a research code and tune it for a specific application, but it's another thing to make it effective for industry purposes. Through this project we have been able to help engineering-scale models be more accurate by informing better parameterizations from micro-scale data."

Going forward, the researchers say that to create a more accurate and reliable computational model and develop a better understanding of these complex phenomena, they need more complete data from industry, such as paper material properties, high-resolution micro-CT images of paper and experimental data from scientifically controlled dewatering tests.

"The scientific challenge is that we need to develop a fundamental understanding of how water flows and migrates," Hao said. "All the physical phenomena involved make this problem a tough one because the dewatering process isn't fully understood due to lack of sufficient data."

This study was conducted with funding from the DOE's Advanced Manufacturing Office within the Office of Energy Efficiency and Renewable Energy.

The Lawrence Berkeley National Laboratory (Berkeley Lab) Computing Sciences organization provides the computing and networking resources and expertise critical to advancing the Department of Energy's research missions: developing new energy sources, improving energy efficiency, developing new materials and increasing our understanding of ourselves, our world and our universe. ESnet, the Energy Sciences Network, provides the high-bandwidth, reliable connections that link scientists at 40 DOE research sites to each other and to experimental facilities and supercomputing centers around the country. The National Energy Research Scientific Computing Center (NERSC) powers the discoveries of 6,000 scientists at national laboratories and universities, including those at Berkeley Lab's Computational Research Division (CRD). CRD conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation. NERSC and ESnet are DOE Office of Science User Facilities.

Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the DOE's Office of Science.

DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.
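The pore-scale-to-continuum handoff Trebotich describes, supplying an upscaled permeability-to-porosity ratio to a bulk flow model, can be illustrated with a minimal sketch. This is not the Chombo-Crunch workflow; it assumes a standard Darcy-law estimate of effective permeability, and every name and number below is a hypothetical placeholder.

# Illustrative sketch only (not the Chombo-Crunch workflow): estimating an
# effective permeability-to-porosity ratio from pore-scale flow results
# via Darcy's law, k = mu * q / |grad p|. All values are hypothetical.

def upscale_permeability(darcy_flux, pressure_drop, sample_length, viscosity):
    """Effective permeability [m^2] from volume-averaged (Darcy) flux [m/s],
    pressure drop [Pa] over sample_length [m], and dynamic viscosity [Pa*s]."""
    pressure_gradient = pressure_drop / sample_length
    return viscosity * darcy_flux / pressure_gradient

def permeability_to_porosity_ratio(k, porosity):
    """Ratio that would be handed to a continuum-scale (bulk) flow model."""
    return k / porosity

if __name__ == "__main__":
    mu = 1.0e-3  # water viscosity, Pa*s
    # Hypothetical pore-scale results for uncompressed vs. compressed felt samples.
    samples = {
        "uncompressed": dict(q=2.0e-4, dp=5.0e3, L=1.0e-3, phi=0.55),
        "compressed":   dict(q=5.0e-5, dp=5.0e3, L=6.0e-4, phi=0.35),
    }
    for name, s in samples.items():
        k = upscale_permeability(s["q"], s["dp"], s["L"], mu)
        ratio = permeability_to_porosity_ratio(k, s["phi"])
        print(f"{name}: k = {k:.3e} m^2, k/phi = {ratio:.3e} m^2")

In this kind of workflow, the two felt states would each yield one upscaled parameter set, which the continuum model then uses in place of bulk properties that cannot resolve the felt's pore structure.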


News Article | April 20, 2017
Site: phys.org

With the help of an automated supernova-hunting pipeline and a galaxy sitting 2 billion light years from Earth that is acting as a "magnifying glass," astronomers have captured multiple images of a Type Ia supernova, the brilliant explosion of a star, appearing in four different locations on the sky. So far it is the only Type Ia supernova discovered that exhibits this effect.

The phenomenon, called gravitational lensing, is an effect of Einstein's general theory of relativity: mass bends light. The gravitational field of a massive object, such as a galaxy, can bend light rays that pass nearby and refocus them somewhere else, causing background objects to appear brighter and sometimes in multiple locations. Astrophysicists believe that if they can find more of these magnified Type Ia supernovae, they may be able to measure the rate of the Universe's expansion to unprecedented accuracy and shed some light on the distribution of matter in the cosmos.

By taking a closer look at the properties of this rare event, two Lawrence Berkeley National Laboratory (Berkeley Lab) researchers have come up with a method, a pipeline, for identifying more of these so-called "strongly lensed Type Ia supernovae" in existing and future wide-field surveys. A paper describing their approach was recently published in the Astrophysical Journal Letters. Meanwhile, a paper detailing the discovery and observations of the Type Ia supernova iPTF16geu, which exploded some 4 billion years ago, was published in Science on April 21.

"It is extremely difficult to find a gravitationally lensed supernova, let alone a lensed Type Ia. Statistically, we suspect that there may be approximately one of these in every 50,000 supernovae that we identify," says Peter Nugent, an astrophysicist in Berkeley Lab's Computational Research Division (CRD) and an author on both papers. "But since the discovery of iPTF16geu, we now have some thoughts on how to improve our pipeline to identify more of these events."

For many years, the transient nature of supernovae made them extremely difficult to detect. Thirty years ago, the discovery rate was about two per month. But thanks to the Intermediate Palomar Transient Factory (iPTF), a new survey with an innovative pipeline, these events are being detected daily, some within hours of when their initial explosions appear.

The process of identifying transient events like supernovae begins every night at the Palomar Observatory in Southern California, where a wide-field camera mounted on the robotic Samuel Oschin Telescope scans the sky. As soon as observations are taken, the data travel more than 400 miles to the Department of Energy's (DOE's) National Energy Research Scientific Computing Center (NERSC), located at Berkeley Lab. At NERSC, machine learning algorithms running on the facility's supercomputers sift through the data in real time and identify transients for researchers to follow up on.

On September 5, 2016, the pipeline identified iPTF16geu as a supernova candidate. At first glance, the event didn't look particularly out of the ordinary. Nugent notes that many astronomers thought it was just a typical Type Ia supernova sitting about 1 billion light years from Earth. Like most supernovae that are discovered relatively early on, the event got brighter with time. Shortly after it reached peak brightness (19th magnitude), Ariel Goobar, a professor of experimental particle astrophysics at Stockholm University, decided to take a spectrum, or detailed light study, of the object. The results confirmed that the object was indeed a Type Ia supernova, but they also showed that, surprisingly, it was located 4 billion light years away.

A second spectrum, taken with the OSIRIS instrument on the Keck telescope on Mauna Kea, Hawaii, showed beyond doubt that the supernova was 4 billion light years away. It also revealed the supernova's host galaxy and another galaxy, located about 2 billion light years away, that was acting as a gravitational lens, amplifying the brightness of the supernova and causing it to appear in four different places on the sky.

"I've been looking for a lensed supernova for about 15 years. I looked in every possible survey, I've tried a variety of techniques to do this and essentially gave up, so this result came as a huge surprise," says Goobar, who is lead author of the Science paper. "One of the reasons I'm interested in studying gravitational lensing is that it allows you to measure the structure of matter, both visible and dark matter, at scales that are very hard to get."

According to Goobar, the survey at Palomar was set up to look at objects in the nearby Universe, about 1 billion light years away. But finding a distant Type Ia supernova in this survey allowed researchers to follow up with even more powerful telescopes that resolved small-scale structures in the supernova's host galaxy, as well as in the lens galaxy that is magnifying it.

"There are billions of galaxies in the observable universe and it takes a tremendous effort to look in a very small patch of the sky to find these kind of events. It would be impossible to find an event like this without a magnified supernova directing you where to look," says Goobar. "We got very lucky with this discovery because we can see the small scale structures in these galaxies, but we won't know how lucky we are until we find more of these events and confirm that what we are seeing isn't an anomaly."

Another benefit of finding more of these events is that they can be used as tools to precisely measure the expansion rate of the Universe. One of the keys to this is gravitational lensing. When a strong gravitational lens produces multiple images of a background object, each image's light travels a slightly different path around the lens on its way to Earth. The paths have different lengths, so light from each image takes a different amount of time to arrive at Earth.

"If you measure the arrival times of the different images, that turns out to be a good way to measure the expansion rate of the Universe," says Goobar. "When people measure the expansion rate of the Universe now locally using supernovae or Cepheid stars they get a different number from those looking at early universe observations and the cosmic microwave background. There is tension out there and it would be neat if we could contribute to resolving that quest."

According to Danny Goldstein, a UC Berkeley astronomy graduate student and an author of the Astrophysical Journal Letters paper, only a few gravitationally lensed supernovae of any type have ever been discovered, including iPTF16geu, and they have all been discovered by chance.

"By figuring out how to systematically find strongly lensed Type Ia supernovae like iPTF16geu, we hope to pave the way for large-scale lensed supernova searches, which will unlock the potential of these objects as tools for precision cosmology," says Goldstein, who worked with Nugent to devise a method for finding them in existing and upcoming wide-field surveys.

The key idea of their technique is to use the fact that Type Ia supernovae are "standard candles," objects with the same intrinsic brightness, to identify ones that are magnified by lensing. They suggest starting with supernovae that appear to go off in red galaxies that have stopped forming stars. These galaxies host only Type Ia supernovae and make up the bulk of gravitational lenses. If a supernova candidate that appears to be hosted in such a galaxy is brighter than the "standard" brightness of a Type Ia supernova, Goldstein and Nugent argue, there is a strong chance the supernova does not actually reside in that galaxy, but is instead a background supernova lensed by the apparent host.

"One of the innovations of this method is that we don't have to detect multiple images to infer that a supernova is lensed," says Goldstein. "This is a huge advantage that should enable us to find more of these events than previously thought possible."

Using this method, Nugent and Goldstein predict that the upcoming Large Synoptic Survey Telescope should be able to detect about 500 strongly lensed Type Ia supernovae over the course of 10 years, about 10 times more than previous estimates. Meanwhile, the Zwicky Transient Facility, which begins taking data in August 2017 at Palomar, should find approximately 10 of these events in a three-year search. Ongoing studies show that each lensed Type Ia supernova image has the potential to yield a measurement of the expansion rate of the universe to four percent or better. If realized, this would add a very powerful tool for probing and measuring cosmological parameters.

"We are just now getting to the point where our transient surveys are big enough, our pipelines are efficient enough, and our external data sets are rich enough that we can weave through the data and get at these rare events," adds Goldstein. "It's an exciting time to be working in this field."

iPTF is a scientific collaboration between Caltech; Los Alamos National Laboratory; the University of Wisconsin, Milwaukee; the Oskar Klein Centre in Sweden; the Weizmann Institute of Science in Israel; the TANGO Program of the University System of Taiwan; and the Kavli Institute for the Physics and Mathematics of the Universe in Japan. NERSC is a DOE Office of Science User Facility.

Daniel A. Goldstein et al., "How to Find Gravitationally Lensed Type Ia Supernovae," The Astrophysical Journal Letters (2016). DOI: 10.3847/2041-8213/834/1/L5
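The brightness test Goldstein and Nugent describe lends itself to a compact illustration. The sketch below is not their pipeline code: it assumes a flat Lambda-CDM cosmology, a fiducial Type Ia peak absolute magnitude of about -19.1, and a hypothetical brightness-excess threshold, and it flags a candidate apparently hosted by a passive (red, elliptical) galaxy as a possible lensed background event when it is markedly brighter than a normal Type Ia at the host's redshift.

# Illustrative sketch of the "standard candle" lensing test (not the authors' pipeline).
import numpy as np

C_KM_S = 299792.458   # speed of light, km/s
M_IA = -19.1          # assumed fiducial Type Ia peak absolute magnitude

def distance_modulus(z, h0=70.0, om=0.3, n=10000):
    """Distance modulus for a flat Lambda-CDM cosmology via trapezoidal integration."""
    zs = np.linspace(0.0, z, n)
    inv_e = 1.0 / np.sqrt(om * (1.0 + zs) ** 3 + (1.0 - om))
    comoving_mpc = (C_KM_S / h0) * np.sum(0.5 * (inv_e[:-1] + inv_e[1:]) * np.diff(zs))
    lum_dist_mpc = (1.0 + z) * comoving_mpc
    return 5.0 * np.log10(lum_dist_mpc * 1.0e6 / 10.0)   # mu = 5 log10(d_L / 10 pc)

def lensing_candidate(observed_peak_mag, host_z, passive_host, threshold_mag=0.7):
    """Flag a supernova in a passive apparent host that is much brighter than a
    normal Type Ia at that redshift as a possible lensed background event."""
    if not passive_host:
        return False
    expected_mag = M_IA + distance_modulus(host_z)
    brightness_excess = expected_mag - observed_peak_mag  # positive means brighter than expected
    return brightness_excess > threshold_mag

# Hypothetical candidate: apparent elliptical host at z ~ 0.2, observed peak at 19th magnitude.
print(lensing_candidate(observed_peak_mag=19.0, host_z=0.2, passive_host=True))

The practical appeal of this test, as the article notes, is that it needs only a photometric brightness comparison against the apparent host's redshift, not resolved multiple images.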



The Trump administration has prioritized repealing the Clean Power Plan (CPP), a set of rules by the U.S. EPA aimed at limiting pollution from power plants. New analysis shows that repealing the rule would cost the U.S. economy hundreds of billions of dollars, add more than a billion tons of greenhouse gases to the atmosphere and cause more than 100,000 premature deaths due to inhaled particulate pollution.

Energy Innovation utilized the Energy Policy Simulator (EPS) to analyze the effects of repealing the CPP. The EPS is an open-source computer model developed to estimate the economic and emissions effects of various combinations of energy and environmental policies using non-partisan, published data from the U.S. Energy Information Administration (EIA), U.S. EPA, Argonne National Laboratory, U.S. Forest Service, and U.S. Bureau of Transportation Statistics, among others. The EPS has been peer reviewed by experts at MIT, Stanford University, Argonne National Laboratory, Berkeley National Laboratory and the National Renewable Energy Laboratory. It is freely available for public use through a user-friendly web interface or by downloading the full model and input dataset.

Our analysis compared a business-as-usual (BAU) scenario (based on existing policies as of mid-to-late 2016, not including the Clean Power Plan) to a scenario that includes a set of policies that narrowly achieve the Clean Power Plan's mass-based emissions targets. Three important notes:

We find that repealing the CPP would result in an increase of carbon dioxide equivalent (CO2e) emissions of more than 500 million metric tons (MMT) in 2030 and 1,200 MMT in 2050, contributing to global warming and severe weather events such as hurricanes, floods and droughts.

Cumulative net costs to the U.S. economy (in increased capital, fuel, and operations and maintenance (O&M) expenditures) would exceed $100 billion by 2030 and would reach nearly $600 billion by 2050. It may seem ironic that removing regulations can result in increased costs to the economy, but regulations can help to overcome market barriers and similar problems that prevent certain economically ideal outcomes from being achieved in a free market (for instance, under-investment in energy efficiency technologies).

Although the CPP's focus is on reducing carbon emissions, the same policies also reduce particulate pollution, which is responsible for thousands of heart attacks and respiratory diseases each year. Repealing the CPP would increase particulate emissions, causing more than 40,000 premature deaths in 2030 and more than 120,000 premature deaths in 2050.


Redshaw C.,University of Hull | Rowe O.,University of East Anglia | Elsegood M.R.J.,Loughborough University | Horsburgh L.,Loughborough University | Teat S.J.,Berkeley National Laboratory
Crystal Growth and Design | Year: 2014

Solvothermal reactions of the lower-rim functionalized diacid calix[4]arene 25,27-bis(methoxycarboxylic acid)-26,28-dihydroxy-4-tert-butylcalix[4]arene (LH2) with Zn(NO3)2·6H2O and the dipyridyl ligands 4,4′-bipyridyl (4,4′-bipy), 1,2-di(4-pyridyl)ethylene (DPE), or 4,4′-azopyridine (4,4′-azopy) afforded a series of two-dimensional structures of the formulas {[Zn(4,4′-bipy)(L)]·2.25DEF}n (1), {[Zn2(L)(DPE)]·DEF}n (2), and {[Zn(OH2)2(L)(4,4′-azopy)]·DEF}n (3) (DEF = diethylformamide). © 2013 American Chemical Society.


Williams P.T.,Berkeley National Laboratory
Hypertension | Year: 2013

To test prospectively in hypertensives whether moderate and vigorous exercise produce equivalent reductions in mortality, Cox proportional hazards analyses were applied to energy expenditure (metabolic equivalent hours per day [METh/d]) in 6973 walkers and 3907 runners who used hypertensive medications at baseline. A total of 1121 died during 10.2-year follow-up: 695 cardiovascular disease (International Classification of Diseases, Tenth Revision [ICD10] I00-99; 465 underlying cause and 230 contributing cause), 124 cerebrovascular disease, 353 ischemic heart disease (ICD10 I20-25; 257 underlying and 96 contributing), 122 heart failure (ICD10 I50; 24 underlying and 98 contributing), and 260 dysrhythmias (ICD10 I46-49; 24 underlying and 236 contributing). Relative to <1.07 METh/d, running or walking 1.8 to 3.6 METh/d produced significantly lower all-cause (29% reduction; 95% confidence interval [CI], 17%-39%; P=0.0001), cardiovascular disease (34% reduction; 95% CI, 20%-46%; P=0.0001), cerebrovascular disease (55% reduction; 95% CI, 27%-73%; P=0.001), dysrhythmia (47% reduction; 95% CI, 27%-62%; P=0.0001), and heart failure mortality (51% reduction; 95% CI, 21%-70%; P=0.003), as did ≥3.6 METh/d with all-cause (22% reduction; 95% CI, 6%-35%; P=0.005), cardiovascular disease (36% reduction; 95% CI, 19%-50%; P=0.0002), cerebrovascular disease (47% reduction; 95% CI, 6%-71%; P=0.03), and dysrhythmia mortality (43% reduction; 95% CI, 16%-62%; P=0.004). Diabetes mellitus and chronic kidney disease mortality also decreased significantly with METh/d. All results remained significant when adjusted for body mass index. Merely meeting guideline levels (1.07-1.8 METh/d) did not significantly reduce mortality. The dose-response was significantly nonlinear for all end points except diabetes mellitus, cerebrovascular disease, and chronic kidney disease. Results did not differ between running and walking. Thus, walking and running produce similar reductions in mortality in hypertensives. © 2013 American Heart Association, Inc.
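The kind of analysis the abstract describes, Cox proportional hazards regression of mortality on exercise-dose categories with adjustment for body mass index, can be sketched as follows. This is an illustrative setup only, using the open-source lifelines package and entirely synthetic data; the column names, category cut points, and effect sizes below are hypothetical, not the study's.

# Illustrative Cox proportional-hazards setup on synthetic data (not the study's analysis).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 5000
met_h_per_day = rng.gamma(shape=2.0, scale=1.2, size=n)   # synthetic exercise dose, METh/d
bmi = rng.normal(27.0, 4.0, size=n)

# Dose categories roughly mirroring the abstract's bins (METh/d); "<1.07" is the reference.
dose = pd.Series(pd.cut(met_h_per_day, bins=[0, 1.07, 1.8, 3.6, np.inf],
                        labels=["<1.07", "1.07-1.8", "1.8-3.6", ">=3.6"]))

# Synthetic follow-up times and death indicator with a modest protective exercise effect.
hazard = 0.02 * np.exp(-0.15 * np.minimum(met_h_per_day, 4.0) + 0.03 * (bmi - 27.0))
follow_up = rng.exponential(1.0 / hazard)
censor_at = 10.2                                           # years of follow-up
time = np.minimum(follow_up, censor_at)
died = (follow_up <= censor_at).astype(int)

df = pd.concat([pd.get_dummies(dose, prefix="dose", drop_first=True).astype(float),
                pd.DataFrame({"bmi": bmi, "time": time, "died": died})], axis=1)

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="died")
cph.print_summary()   # hazard ratios for each dose category relative to <1.07 METh/d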


Quinn N.W.T.,Berkeley National Laboratory
Proceedings - 7th International Congress on Environmental Modelling and Software: Bold Visions for Environmental Modeling, iEMSs 2014 | Year: 2014

A promising measure for mitigating climate change is to store large volumes of CO2, captured from large point-source carbon emitters, in deep saline aquifers. In vulnerable systems, the water resources impacts of large-scale CO2 storage need to be evaluated and assessed before industrial-size storage projects get under way. In California's southern San Joaquin Basin, land surface uplift caused by large CO2 injection projects could potentially create reverse flow along certain canal reaches, or reduce canal deliveries to agricultural land and managed wetlands. The impact of CO2 storage on shallow water resources was compared to the expected stresses on the groundwater and surface water systems from ongoing pumping, using a version of the Central Valley Hydrologic Model (CVHM) extended vertically to capture the reservoir geology. Results of the simulations demonstrated that pumping-related deformations in the area might be one order of magnitude larger than those from CO2 injection. In the basin, the low-permeability geological layers between the shallow aquifer system and the deep storage formation effectively limit pressure changes from migrating far in the vertical direction, either downward or upward.


Pollutant trading schemes are market-based strategies that can provide cost-effective and flexible environmental compliance in large river basins. The aim of this paper is to contrast two innovative adaptive salinity management strategies that have been developed in the Hunter River Basin, New South Wales, Australia, and in the San Joaquin River Basin, California, USA. In both instances, web-based dissemination of information to stakeholders has been key to achieving a high level of stakeholder involvement and to the formulation of effective decision support tools for salinity management. A common element in the implementation of salinity management strategies in both basins has been the concept of river assimilative capacity as a guide for controlling export salt loading, together with the establishment of a framework for trading the right to discharge salt load to the Hunter River and San Joaquin River, respectively. Both rivers provide basin drainage and the means of exporting salt load to the ocean. The paper compares the opportunities and constraints governing salinity management in the two basins, as well as the use of monitoring, modeling and information technology to achieve environmental compliance and sustain irrigated agriculture in an equitable, socially and politically acceptable manner. The paper concludes by placing into broader context some of the issues raised by the comparison of the two approaches to basin salinity management. © 2010 Elsevier B.V.
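The assimilative-capacity concept referenced above can be illustrated with a minimal sketch: how much additional salt load a river reach can accept before a downstream salinity objective is exceeded. This is not the basins' decision support tooling; the unit conversion and EC-to-TDS factor are standard approximations, and the example numbers are hypothetical, not basin data.

# Minimal sketch of a real-time assimilative-capacity estimate (illustrative only).
LB_PER_DAY_PER_CFS_MGL = 5.39   # 1 cfs * 1 mg/L is approximately 5.39 lb/day
TDS_PER_EC = 0.64               # common approximation: TDS (mg/L) ~ 0.64 * EC (uS/cm)

def assimilative_capacity_lb_day(flow_cfs, ec_current_us_cm, ec_objective_us_cm):
    """Additional salt load (lb/day) that keeps the reach at or below the EC objective."""
    tds_current = TDS_PER_EC * ec_current_us_cm
    tds_objective = TDS_PER_EC * ec_objective_us_cm
    headroom_mg_l = max(tds_objective - tds_current, 0.0)
    return flow_cfs * headroom_mg_l * LB_PER_DAY_PER_CFS_MGL

# Hypothetical reach: 1,500 cfs, current EC 700 uS/cm, objective 1,000 uS/cm.
print(f"{assimilative_capacity_lb_day(1500, 700, 1000):,.0f} lb/day of salt load headroom")

In a trading framework of the kind the paper describes, this headroom would be the quantity apportioned among dischargers holding rights to export salt load to the river.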


News Article | October 13, 2016
Site: news.yahoo.com

Delegates attend the official opening of the 28th meeting of the Parties to the Montreal Protocol in Kigali on October 13, 2016 (AFP Photo/Cyril Ndegeya)

Kigali (AFP) - Rwanda's President Paul Kagame urged world leaders to rid the world of potent greenhouse gases used in refrigerators and air conditioners as he opened a high-level meeting in Kigali on Thursday.

Envoys from nearly 200 nations are in the Rwandan capital to thrash out an agreement to phase out hydrofluorocarbons (HFCs), which were introduced in the 1990s to save the ozone layer but turned out to be catastrophic for global warming. Halting the use of HFCs, which are also found in aerosols and foam insulation, is crucial to meeting the goals to curb the rise of global temperatures agreed in the historic accord drafted in Paris last year.

"We should not allow ourselves to be satisfied with making a little bit of good progress when it is within our power to actually solve the problem," Kagame told the meeting, attended by representatives of 197 countries. US Secretary of State John Kerry is among the 40 ministers expected. Kagame, whose small east African nation has put the environment at the heart of its development strategy, said that eradicating HFCs "will make our world safer and more prosperous".

Maxime Beaugrand of the Institute for Governance and Sustainable Development was positive that there would be an agreement Friday to phase out HFCs. "Negotiations are moving in the right direction. I think we can expect an amendment tomorrow in Kigali and I think it will be sufficiently ambitious," she told AFP.

HFCs' predecessors, chlorofluorocarbons (CFCs), were discontinued under the ozone-protecting Montreal Protocol when scientists realised the compounds were responsible for the growing hole in the ozone layer, which protects Earth from the Sun's dangerous ultraviolet rays. However, it emerged that HFCs, while safe for the now-healing ozone layer, are thousands of times worse for trapping heat in the atmosphere than carbon dioxide, the main greenhouse gas.

"(HFCs) are increasing at a rate of 10-15 percent a year," Greenpeace global strategist Paula Carbajal told AFP. "That makes them the fastest-growing greenhouse gas."

According to a study by the Berkeley National Laboratory, residential air conditioning is the cause of the largest growth in HFCs, and the world is likely to have another 700 million air conditioners by 2030. "The world room air conditioner market is growing fast with increasing urbanisation, electrification, rising incomes and falling air conditioner prices in many developing economies," the study notes.

Beaugrand said alternatives to HFCs exist in all refrigeration sectors. These alternatives "either have less of a warming potential than HFCs or they are natural like ammonia". Other alternatives are water and gases called hydrofluoroolefins (HFOs), which are a form of HFCs; however, some, like Greenpeace, believe these are still too dangerous.

Carbajal said HFCs could add as much as 0.1 degrees Celsius (0.18 Fahrenheit) to average global temperatures by mid-century, and 0.5 degrees Celsius (0.9 F) by 2100. The Paris climate agreement aims to keep global warming below two degrees Celsius compared with pre-industrial levels, and continued use of HFCs could prove a serious stumbling block to attaining that goal. "If HFC growth is not stopped, it becomes virtually impossible to meet the Paris goals," said David Doniger of the Natural Resources Defense Council, an environmental advocacy group.

HFCs, though they are greenhouse gases like carbon dioxide, methane and nitrous oxide, are not dealt with under the Paris Agreement but under the Montreal Protocol, adopted in 1987. Negotiators are weighing various proposals for amending the protocol to freeze HFC production and use, with possible dates for such moves ranging from almost immediately to as late as 2031.

India, which is a major HFC producer along with China, backs the later date, while countries in very hot parts of the world, where HFC-using air conditioners are in high demand, want temporary exemptions. Last month, a group of developed countries and companies offered $80 million (72 million euros) to help developing countries make the switch away from HFCs.

"No one, frankly, will forgive you nor me if we cannot find a compromise at this conference, because this is one of the cheapest, one of the easiest, one of the lowest hanging fruits in the entire household of climate mitigation," Erik Solheim, head of the UN Environment Programme, told delegates.


News Article | October 14, 2016
Site: news.yahoo.com

Air conditioning is the cause of the largest growth in hydrofluorocarbons and the world is likely to have another 700 million air conditioners by 2030 (AFP Photo/Philippe Huguen)

Kigali (AFP) - World envoys were on Friday putting the final touches to a deal in Rwanda to phase out potent greenhouse gases used in refrigerators and air conditioners, a major step in curbing global warming.

Tough negotiations have seen major developing nations such as India put up a fight over the timeline to phase out the use of hydrofluorocarbons (HFCs) and over the financing of the transition. However, some delegates were already praising an early agreement in principle, and the final details were being hammered out in late-night sessions. "Country representatives are now negotiating the final details of the amendment," read a tweet from Rwanda's natural resources ministry.

"This is a huge win for the climate. We have taken a major concrete step in delivering on the promises we made in Paris last December," said Miguel Arias Canete, a commissioner with the European Union, in a statement ahead of the adoption of the agreement. "The global phase-down we have agreed today could knock off up to half a degree of warming by the end of the century." Thrashing out the nitty-gritty of the deal, however, could take talks into the early hours of Saturday.

HFCs were introduced in the 1990s to replace chemicals that had been found to erode the ozone layer, but turned out to be catastrophic for global warming. Swapping HFCs for alternatives such as ammonia, water or gases called hydrofluoroolefins could prove costly for developing countries with sweltering summer temperatures, such as India. These countries want a later date for the phase-down to begin.

"There are issues of cost, there are issues of technology, there are issues of finances," said Ajay Narayan Jha of India's environment and climate change ministry. "We would like to emphasise that any agreement will have to be flexible from all sides concerned. It can't be flexible from one side and not from the other."

Last month, a group of developed countries and companies offered $80 million (72 million euros) to help developing countries make the switch away from HFCs. US Secretary of State John Kerry earlier acknowledged that while the US would be ready to begin phasing out HFCs by 2021, other countries might move at a slower pace. "But no country has a right to turn their back on this effort," he told delegates, warning that the world already faces droughts, flooding, agricultural disasters and waves of climate refugees. "And if we're going to give this amendment the teeth it needs to prevent as much as a half-degree of warming, then we need to make sure we're pushing for the most far-reaching amendment we can adopt," he declared.

Erik Solheim, head of the UN Environment Programme, said that if the agreement was adopted, "it will be one of the most important global meetings in this year."

HFCs' predecessors, chlorofluorocarbons (CFCs), were discontinued under the 1987 Montreal Protocol when scientists realised they were destroying the ozone layer, the blanket of gas in the upper stratosphere that protects Earth from the Sun's dangerous ultraviolet rays. But it emerged that HFCs, while safe for the now-healing ozone layer, are thousands of times worse for trapping heat than carbon dioxide, the main greenhouse gas.

According to the Berkeley National Laboratory, air conditioning is the cause of the largest growth in HFCs, and the world is likely to have another 700 million air conditioners by 2030. Last year's Paris climate agreement aims to keep global warming below two degrees Celsius compared with pre-industrial levels, but continued use of HFCs could prove a serious stumbling block to attaining that goal. HFCs, though they are greenhouse gases like carbon dioxide, methane and nitrous oxide, are not dealt with under the Paris Agreement but under the Montreal Protocol. Any amendment to the protocol will be legally binding.
