News Article | May 11, 2017
BOULDER, Colo. -- Since the mid-1980s, the percentage of precipitation that becomes streamflow in the Upper Rio Grande watershed has fallen more steeply than at any point in at least 445 years, according to a new study led by the National Center for Atmospheric Research (NCAR). While this decline was driven in part by the transition from an unusually wet period to an unusually dry period, rising temperatures deepened the trend, the researchers said.

The study paints a detailed picture of how temperature has affected the runoff ratio -- the amount of snow and rain that actually makes it into the river -- over time, and the findings could help improve water supply forecasts for the Rio Grande, which is a source of water for an estimated 5 million people. The study results also suggest that runoff ratios in the Upper Rio Grande and other neighboring snow-fed watersheds, such as the Colorado River Basin, could decline further as the climate continues to warm.

"The most important variable for predicting streamflow is how much it has rained or snowed," said NCAR scientist Flavio Lehner, lead author of the study. "But when we looked back hundreds of years, we found that temperature has also had an important influence -- which is not currently factored into water supply forecasts. We believe that incorporating temperature in future forecasts will increase their accuracy, not only in general but also in the face of climate change."

The study, published in the journal Geophysical Research Letters, was funded by the Bureau of Reclamation, Army Corps of Engineers, National Oceanic and Atmospheric Administration (NOAA), and National Science Foundation, which is NCAR's sponsor. Co-authors of the paper are Eugene Wahl, of NOAA; Andrew Wood, of NCAR; and Douglas Blatchford and Dagmar Llewellyn, both of the Bureau of Reclamation.
Born in the Rocky Mountains of southern Colorado, the Rio Grande cuts south across New Mexico before hooking east and forming the border between Texas and Mexico. Snow piles up on the peaks surrounding the headwaters throughout the winter, and in spring the snowpack begins to melt and feed the river. The resulting streamflow is used by both farmers and cities, including Albuquerque, New Mexico, and El Paso, Texas, and water users depend on the annual water supply forecasts to determine who gets how much of the river. The forecast is also used to determine whether additional water needs to be imported from the San Juan River, on the other side of the Continental Divide, or pumped from groundwater.

Current operational streamflow forecasts depend on estimates of the amount of snow and rain that have fallen in the basin, and they assume that a particular amount of precipitation and snowpack will always yield a particular amount of streamflow. In recent years, those forecasts have tended to over-predict how much water will be available, leading to over-allocation of the river. In an effort to understand this changing dynamic, Lehner and his colleagues investigated how the relationship between precipitation and streamflow, known as the runoff ratio, has evolved over time.

Precipitation vs. streamflow: Tree rings tell a new story

The scientists used tree ring-derived streamflow data from outside of the Upper Rio Grande basin to reconstruct estimates of precipitation within the watershed stretching back to 1571. Then they combined this information with a separate streamflow reconstruction within the basin for the same period. Because these two reconstructions were independent, the research team could also estimate the runoff ratio for each year: the higher the ratio, the greater the share of precipitation that was actually converted into streamflow.
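The runoff ratio itself is simple arithmetic once annual precipitation and streamflow estimates are in hand. A minimal sketch of the idea, using made-up example values rather than the study's reconstructions:

```python
# Illustration of the runoff-ratio concept described above.
# The numbers below are invented examples, not data from the study.

def runoff_ratio(streamflow, precipitation):
    """Fraction of each year's precipitation that became streamflow."""
    return [q / p for q, p in zip(streamflow, precipitation)]

# Hypothetical annual totals (arbitrary units): a wet, an average, and a dry year.
precip = [100.0, 80.0, 50.0]
flow = [30.0, 20.0, 8.0]

ratios = runoff_ratio(flow, precip)
print(ratios)  # [0.3, 0.25, 0.16]
```

Note how in this toy example the dry year loses on both counts, just as the article describes: less precipitation falls, and a smaller fraction of it reaches the river.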
"For the first time, we were able to take these two quantities and use them to reconstruct runoff ratios over the past 445 years," Wahl said. They found that the runoff ratio varies significantly from year to year and even decade to decade. The biggest factor associated with this variation was precipitation. When it snows less over the mountains in the headwaters of the Rio Grande, not only is less water available to become streamflow, but the runoff ratio also decreases. In other words, a smaller percentage of the snowpack becomes streamflow during drier years. But the scientists also found that another factor affected the runoff ratio: temperature. Over the last few centuries, the runoff ratio was reduced when temperatures were warmer. And the influence of temperature strengthened during drier years: When the snowpack was shallow, warm temperatures reduced the runoff ratio more than when the snowpack was deep, further exacerbating drought conditions. The low runoff ratios seen in dry years were two and a half to three times more likely when temperatures were also warmer. "The effect of temperature on runoff ratio is relatively small compared to precipitation," Lehner said. "But because its greatest impact is when conditions are dry, a warmer year can make an already bad situation much worse." A number of factors may explain the influence of temperature on runoff ratio. When it's warmer, plants take up more water from the soil and more water can evaporate directly into the air. Additionally, warmer temperatures can lead snow to melt earlier in the season, when the days are shorter and the angle of the sun is lower. This causes the snow to melt more slowly, allowing the meltwater to linger in the soil and giving plants added opportunity to use it. The extensive reconstruction of historical runoff ratio in the Upper Rio Grande also revealed that the decline in runoff ratio over the last three decades is unprecedented in the historical record. 
The 1980s were an unusually wet period for the Upper Rio Grande, while the 2000s and 2010s have been unusually dry. Pair that with an increase in temperatures over the same period, and the decline in runoff ratio between 1986 and 2015 was unlike any other stretch of that length in the last 445 years. This new understanding of how temperature influences runoff ratio could help improve water supply forecasts, which do not currently consider whether the upcoming months are expected to be hotter or cooler than average. The authors are now assessing the value of incorporating seasonal temperature forecasts into water supply forecasts to account for these temperature influences. The study complements a multi-year NCAR project funded by the Bureau of Reclamation and the Army Corps of Engineers that is evaluating prospects for enhancing seasonal streamflow forecasts for reservoir management. "Forecast users and stakeholders are increasingly raising questions about the reliability of forecasting techniques if climate is changing our hydrology," said Wood, who led the effort. "This study helps us think about ways to upgrade one of our oldest approaches -- statistical water supply forecasting -- to respond to recent trends in temperature. Our current challenge is to find ways to make sure the lessons of this work can benefit operational streamflow forecasts." Because the existing forecasting models were calibrated on conditions in the late 1980s and 1990s, it's not surprising that they over-predicted streamflow in the drier period since 2000, Lehner said. "These statistical models often assume that the climate is stable," Lehner said. "It's an assumption that sometimes works, but statistical forecasting techniques will struggle with any strong changes in hydroclimatology from decade to decade, such as the one we have just experienced." 
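The over-prediction problem Lehner describes can be seen in a toy version of a purely precipitation-based statistical forecast. Everything below is invented for illustration; the operational models are far more sophisticated, but the stationarity assumption works the same way:

```python
# Toy sketch (synthetic numbers, not the operational model) of how a
# precipitation-only statistical forecast over-predicts after a climate shift.

# Calibration period (think wet, cooler 1980s-90s): streamflow = 0.30 * precipitation.
calib_precip = [90.0, 100.0, 110.0, 95.0, 105.0]
calib_flow = [p * 0.30 for p in calib_precip]

# Fit the slope of a zero-intercept regression: flow = b * precip.
b = sum(f * p for f, p in zip(calib_flow, calib_precip)) / sum(p * p for p in calib_precip)

# A warmer, drier year: suppose the true runoff ratio has fallen to 0.24.
new_precip = 70.0
actual_flow = new_precip * 0.24
forecast_flow = b * new_precip  # the model still assumes the old ratio

print(round(forecast_flow, 1), round(actual_flow, 1))  # 21.0 16.8
```

The fitted model keeps applying the calibration-era runoff ratio, so it over-predicts every warm, dry year in the same direction, which is exactly the systematic over-allocation problem the article describes.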
Lehner is a fellow of the Postdoc Applying Climate Expertise (PACE) program, part of the Cooperative Programs for the Advancement of Earth System Science (CPAESS), a community program of the University Corporation for Atmospheric Research (UCAR). UCAR manages NCAR under sponsorship by the National Science Foundation. Any opinions, findings and conclusions, or recommendations expressed in this publication are those of the authors and do not necessarily reflect the views of the National Science Foundation.

Title: Assessing recent declines in Upper Rio Grande River runoff efficiency from a paleoclimate perspective
Authors: Flavio Lehner, Eugene R. Wahl, Andrew W. Wood, Douglas B. Blatchford, and Dagmar Llewellyn
Journal: Geophysical Research Letters, DOI: 10.1002/2017GL073253
News Article | March 14, 2017
Global warming has been a source of concern for some time and is believed to be behind the recent cracking of an Antarctic ice shelf, which contributes to rising sea levels. A new study now suggests that the Earth's oceans may be heating up at a much faster rate than environmental scientists previously believed. The study, led by Lijing Cheng, was performed with a team of researchers from the National Center for Atmospheric Research (NCAR) and examined changes in ocean temperatures since 1960.

Recent technological advances made the study possible. Earlier studies of this kind depended mainly on ships traveling to different parts of the ocean and taking temperature readings along the way, so measurements were limited to areas where ship travel was viable. From 2000 onward, however, scientists began using specialized floats, dubbed Argo, to record temperatures at many locations across the ocean. The Argo devices can measure temperatures down to about 6,562 feet (2,000 meters) below the ocean's surface. By 2005, these devices had allowed scientists to map the temperature of almost all the world's oceans.

The main difficulty lay in comparing the recent measurements with data from the 1960s onward, as limited information was available from that time. The scientists used statistical analysis to solve this issue: they first used the float data from limited areas to mimic the sparse coverage of the 20th century, created a global map of ocean temperatures from that restricted data, and found that it matched the scarce original temperature recordings from that era. The results of the study indicate that the oceans gained 337 zettajoules of heat between 1960 and 2005.
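To give the 337-zettajoule figure some intuition, here is a back-of-envelope conversion into an equivalent temperature change; this calculation is my own illustration using approximate textbook constants, not something from the study:

```python
# Rough sense of scale for 337 zettajoules of ocean heat gain.
# Constants are approximate round numbers, not values from the study.
heat_gain_j = 337e21        # 337 zettajoules, in joules
ocean_mass_kg = 1.4e21      # approximate total mass of the global ocean
specific_heat = 3900.0      # approximate specific heat of seawater, J/(kg K)

# If that heat were spread evenly through the entire ocean:
delta_t = heat_gain_j / (ocean_mass_kg * specific_heat)
print(round(delta_t, 3))  # ~0.062 kelvin, averaged over the full ocean depth
```

The full-depth average looks tiny because the ocean is enormous; in reality the warming is concentrated in the upper layers, so temperature changes there are considerably larger than this whole-ocean figure.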
It also showed that the change in temperature was relatively small until 1980, after which the ocean began warming rapidly, and that since 1990 this heat has increasingly been transferred deeper below the surface. "This work is an example of how advances in technology have enabled an improved understanding of past changes in the ocean, where variability has always been a bit of an enigma due to its vastness and depth," said John Fasullo, NCAR scientist and co-author of the research. He also stated that this research was not only a study of past oceanic temperature changes, but may also provide valuable insight into how temperatures may change in the future. The study has been published in the journal Science Advances. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.
News Article | March 27, 2017
[Image: the global circulation pattern on a normal day (left) and during extreme weather (right); the extreme pattern shows winds swinging far to the north and south, while the normal pattern shows moderate winds in both directions.]

[Photo: Flood survivors negotiate a flooded road at Muzaffargarh, in central Pakistan, on Aug. 19, 2010. The floods that hit Pakistan in the summer of 2010 took 2,000 lives and affected 20 million people.]

Whether a specific extreme weather event can be linked to climate change rarely gets a straightforward answer from climate scientists or meteorologists. It's complicated, they'll say, but that doesn't mean there isn't a relationship. "Climate scientists have been willing to link the general increase in certain types of weather extremes (heat waves, droughts, and floods) to climate change in a generic sense," says Michael Mann, an atmospheric scientist at Pennsylvania State University. Rising global temperatures and other climate forces can certainly change the conditions that underlie weather, which climate scientists have said can lead to a change in the frequency of a type of weather event. But Dr. Mann and colleagues report that there is a more direct way that climate change is impacting weather extremes: by altering the movement of the jet stream. "Our work shows that climate change isn’t just leading to more extreme weather through the usual mechanisms that have been described in the literature (warmer temperatures means more heat waves, hotter summers mean worse drought, warmer atmosphere holds more moisture so when it rains or snows we tend to see greater amounts of precipitation)," Mann writes in an email to The Christian Science Monitor. "We show that, in addition to those effects, climate change is changing the behavior of the jet stream in a way that favors more extreme persistent weather anomalies."
And, in a paper published Monday in the journal Scientific Reports, Mann and his colleagues suggest that this climate change-driven shift in the jet stream influenced the 2003 heat wave in Europe, flooding in Pakistan and a heat wave in Russia simultaneously in 2010, and the Texas heat wave in 2011. "This study convincingly demonstrates a mechanism connecting climate change with extreme weather during summer over Northern Hemisphere continents, affecting billions of people," Jennifer Francis, a climate scientist at Rutgers University in New Brunswick, N.J., who was not involved in the research, writes in an email to the Monitor. The jet stream, which Mann describes as a "ribbon-like air current that travels eastward … in the lower part of the atmosphere where weather happens," exists because of the difference in air temperatures between the subtropic and subarctic regions. The eastward-flowing air isn't one completely steady band. Slow-moving waves that travel from north to south often appear across the ribbon of air. It is those particularly large undulations, called Rossby waves, that researchers link with extreme weather, because of the intensely low or high pressure systems they bring with them. In a warming world, these Rossby waves are getting stuck in one place for a long period of time, Mann and his colleagues say. And this means that a region under the low pressure part of these waves will experience intense, prolonged rainfall resulting in flooding. The regions under the high pressure systems will be stuck with hot, dry conditions conducive to drought and wildfires. What's making these waves static, or at least slowing them down? Mann points to the warming Arctic. The tropics aren't warming as much as the Arctic, so the temperature gradient from north-to-south is less extreme. As the temperature differences that create the jet stream decline, the jet stream's dynamics change. This relationship was previously proposed by German climate researchers in 2013. 
But Mann and his colleagues have built on this idea by identifying a fingerprint related to these static, or particularly slow-moving, waves. The researchers found that the pattern of these fingerprints in the real-world data for recent extreme weather events matched simulations of the influence of anthropogenic greenhouse gases, indicating that climate change is indeed influential in altering the jet stream dynamics. Kevin Trenberth, senior scientist in the Climate Analysis Section at the National Center for Atmospheric Research (NCAR) who was not involved in the research, cautions that teasing out the climate change signal from the noise of natural variability in a particular weather system can be tricky. "Weather systems occur naturally, in terms of the storms themselves, the phenomena, etc.," he writes in an email to the Monitor. "But their impacts are undoubtedly altered by climate change: higher temperatures, heat waves, wildfires; stronger rains (and snows), more intense droughts, and further, the storms may be more intense." Although the impact of climate change on temperature and rainfall intensity is not in doubt, Dr. Trenberth says, the causal relationship is less clear when considering the dynamics (the storms themselves and atmospheric waves) of a weather system. "There is no doubt that there are relationships, but it is not so clear what is the cause; i.e. the change in waves, the Arctic and so forth are all part of the same thing: and the change in Arctic is likely more a result not a cause," he writes. Climate scientists and meteorologists have been hesitant to draw a direct link between extreme storms and climate change, although they say unusual weather is consistent with models of a changing climate. But Dr.
Francis and David Easterling, chief of the Scientific Services Division of the National Oceanic and Atmospheric Administration’s (NOAA) National Centers for Environmental Information, who also was not involved in the study, say this research is eroding that hesitation. "This work adds a substantial layer to the pile of research suggesting that climate change is already causing an increase in certain types of extreme weather events. Moreover, as society continues along the present path of unabated fossil fuel burning, weather will become even more extreme," Francis says. "I think it certainly provides more evidence towards it," Dr. Easterling agrees in a phone interview with the Monitor. "Is it a definitive answer? No, not necessarily.... But it's beginning to draw that link." "This is beginning to give you a dynamical meteorology reason why we may see more of these events," Easterling says. This research gives us an idea about the mechanism behind these changes, rather than just statistics, he says. Mann and his team have focused their work on identifying the link in historical data, but the same technique could have applications in predicting future extreme weather from climate trends. "Indeed, that’s precisely the analysis we are doing now, performing a similar analysis but using instead the climate model projections for the next century," Mann says. "Stay tuned." It's not just about long-term climate trends, Easterling adds. "If meteorologists that are actually doing forecasting can look at this and begin to see these patterns, they can do a better job of forecasting heatwaves and/or extended wet periods" in the nearer future as well. Regardless of timescale, these sorts of predictions could help save lives and expense, Francis says. Extreme weather affects insurance costs, food security, and political stability, among other things, she says.
"Knowing the reason for the increased frequency of extreme summer weather events – such as heat waves, droughts, and floods – and knowing these events will become only more frequent and intense in the future, will inform decisionmakers and help leaders of governments and businesses to prepare for them," Francis says. "This knowledge, if acted upon, could save lives and [prevent] suffering."
News Article | May 4, 2017
BOULDER, Colo. -- Expanding its work in renewable energy, the National Center for Atmospheric Research (NCAR) is launching a three-year project to develop specialized forecasts for a major wind and solar energy facility in Kuwait. "We're putting our expertise and technology to work around the world," said NCAR Senior Scientist Sue Ellen Haupt, the principal investigator on the project. "This landmark project meets our mission of science in service to society." The $5.1 million project will focus on developing a system to provide detailed forecasts of wind and solar irradiance at Kuwait's planned 2-gigawatt Shagaya renewable energy plant. After NCAR develops the system, the technology will be transferred to the Kuwait Institute for Scientific Research (KISR) for day-to-day operations. The forecasts will help Kuwait reach its goal of generating 15 percent of its energy from renewable sources by 2030. With the ability to anticipate the amount of electricity that sun and wind will produce hours to days in advance, energy operators will be able to power up or down traditional plants as needed to meet demand. "This technology will provide us with important benefits," said Salem Al-Hajraf, manager of KISR's Renewable Energy Program. "We are providing green energy to the grid using abundant sources of energy, which are sun and wind." When electric utilities integrate power from intermittent sources such as wind or solar into the grid, they temporarily reduce or shut off traditional sources such as oil or natural gas. But if weather conditions fail to come together as expected, the utility may not be able to power up traditional plants in time to meet their customer needs. To help utility managers anticipate renewable wind energy more reliably, NCAR has designed and is constantly improving a wind energy prediction system for Xcel Energy that has saved tens of millions of dollars for the utility's customers in Colorado and nearby states. 
The specialized system relies on a suite of tools, including highly detailed observations of atmospheric conditions, advanced computer modeling, and artificial intelligence techniques that enable Xcel Energy to issue high-resolution forecasts for wind farm sites. With funding from the U.S. Department of Energy, NCAR has also led a national team of scientists who have developed a cutting-edge forecasting system with the potential to save the solar energy industry hundreds of millions of dollars in the United States alone through improved forecasts. The new Sun4Cast™ system, unveiled last year, greatly improves predictions of clouds and other atmospheric conditions that influence the amount of energy generated by solar arrays. In Kuwait, the NCAR team will build on these technologies to develop both wind and solar energy forecasts. The scientists will customize the system to predict dust storms that can blot out sunlight and damage wind turbines. They will also incorporate the influence of nearby mountain ranges and the Persian Gulf on local weather patterns. "This is a great opportunity to do research into dust and other particulates, which we haven't previously needed to focus on to this extent for wind and solar energy prediction," Haupt said. "This kind of work will pay multiple dividends for energy forecasting as well as better understanding and predicting of weather in certain desert environments." Haupt and her team will collaborate with researchers at Pennsylvania State University and Solar Consulting Services in Florida, as well as with KISR. "This is an exciting international partnership that will both generate significant economic benefits and advance our understanding of the atmosphere," said Antonio J. Busalacchi, president of the University Corporation for Atmospheric Research. "In addition to reducing energy costs for our partners in Kuwait, the knowledge that we gain will help us further improve weather prediction skills here in the United States." 
The University Corporation for Atmospheric Research is a nonprofit consortium of 110 North American colleges and universities that manages the National Center for Atmospheric Research under sponsorship by the National Science Foundation. KISR leads and partners internationally to develop, deploy, and exploit the best science, technology, knowledge, and innovation for public and private sector clients, for the benefit of Kuwait and others facing similar challenges and opportunities.
News Article | February 15, 2017
The National Center for Atmospheric Research (NCAR) is launching operations this month of one of the world's most powerful and energy-efficient supercomputers, providing the nation with a major new tool to advance understanding of the atmospheric and related Earth system sciences. Named "Cheyenne," the 5.34-petaflop system is capable of more than triple the amount of scientific computing performed by the previous NCAR supercomputer, Yellowstone. It also is three times more energy efficient. Scientists across the country will use Cheyenne to study phenomena ranging from wildfires and seismic activity to gusts that generate power at wind farms. Their findings will lay the groundwork for better protecting society from natural disasters, lead to more detailed projections of seasonal and longer-term weather and climate variability and change, and improve weather and water forecasts that are needed by economic sectors from agriculture and energy to transportation and tourism. "Cheyenne will help us advance the knowledge needed for saving lives, protecting property, and enabling U.S. businesses to better compete in the global marketplace," said Antonio J. Busalacchi, president of the University Corporation for Atmospheric Research. "This system is turbocharging our science." UCAR manages NCAR on behalf of the National Science Foundation (NSF). Cheyenne currently ranks as the 20th fastest supercomputer in the world and the fastest in the Mountain West, although such rankings change as new and more powerful machines begin operations. It is funded by NSF as well as by the state of Wyoming through an appropriation to the University of Wyoming. Cheyenne is housed in the NCAR-Wyoming Supercomputing Center (NWSC), one of the nation's premier supercomputing facilities for research. Since the NWSC opened in 2012, more than 2,200 scientists from more than 300 universities and federal labs have used its resources. 
"Through our work at the NWSC, we have a better understanding of such important processes as surface and subsurface hydrology, physics of flow in reservoir rock, and weather modification and precipitation stimulation," said William Gern, vice president of research and economic development at the University of Wyoming. "Importantly, we are also introducing Wyoming’s school-age students to the significance and power of computing." The NWSC is located in Cheyenne, and the name of the new system was chosen to honor the support the center has received from the people of that city. The name also commemorates the upcoming 150th anniversary of the city, which was founded in 1867 and named for the American Indian Cheyenne Nation. Cheyenne was built by Silicon Graphics International, or SGI (now part of Hewlett Packard Enterprise Co.), with DataDirect Networks (DDN) providing centralized file system and data storage components. Cheyenne is capable of 5.34 quadrillion calculations per second (5.34 petaflops; a petaflop is one quadrillion floating-point operations per second). The new system has a peak computation rate of more than 3 billion calculations per second for every watt of energy consumed. That is three times more energy efficient than the Yellowstone supercomputer, which is also highly efficient. The data storage system for Cheyenne provides an initial capacity of 20 petabytes, expandable to 40 petabytes with the addition of extra drives. The new DDN system also transfers data at the rate of 220 gigabytes per second, which is more than twice as fast as the previous file system’s rate of 90 gigabytes per second. Cheyenne is the latest in a long and successful history of supercomputers supported by the NSF and NCAR to advance the atmospheric and related sciences. “We’re excited to provide the research community with more supercomputing power,” said Anke Kamrath, interim director of NCAR’s Computational and Information Systems Laboratory, which oversees operations at the NWSC.
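Taken together, the stated peak speed and efficiency figures imply a rough upper bound on the machine's power draw at peak. This is an inference from the article's numbers, not a published specification:

```python
# Implied power draw from the article's figures (an inference, not a stated spec).
peak_flops = 5.34e15      # 5.34 petaflops
flops_per_watt = 3e9      # "more than 3 billion calculations per second" per watt

implied_power_w = peak_flops / flops_per_watt
print(round(implied_power_w / 1e6, 2))  # ~1.78 megawatts, at most, at peak
```

Since the efficiency is quoted as "more than" 3 billion calculations per second per watt, the actual peak power draw would be somewhat below this figure.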
“Scientists have access to increasingly large amounts of data about our planet. The enhanced capabilities of the NWSC will enable them to tackle problems that used to be out of reach and obtain results at far greater speeds than ever.” High-performance computers such as Cheyenne allow researchers to run increasingly detailed models that simulate complex events and predict how they might unfold in the future. With more supercomputing power, scientists can capture additional processes, run their models at a higher resolution, and conduct an ensemble of modeling runs that provide a fuller picture of the same time period. "Providing next-generation supercomputing is vital to better understanding the Earth system that affects us all, " said NCAR Director James W. Hurrell. "We're delighted that this powerful resource is now available to the nation's scientists, and we're looking forward to new discoveries in climate, weather, space weather, renewable energy, and other critical areas of research." Some of the initial projects on Cheyenne include: Long-range, seasonal to decadal forecasting: Several studies led by George Mason University, the University of Miami, and NCAR aim to improve prediction of weather patterns months to years in advance. Researchers will use Cheyenne's capabilities to generate more comprehensive simulations of finer-scale processes in the ocean, atmosphere, and sea ice. This research will help scientists refine computer models for improved long-term predictions, including how year-to-year changes in Arctic sea ice extent may affect the likelihood of extreme weather events thousands of miles away. Wind energy: Projecting electricity output at a wind farm is extraordinarily challenging as it involves predicting variable gusts and complex wind eddies at the height of turbines, which are hundreds of feet above the sensors used for weather forecasting. 
University of Wyoming researchers will use Cheyenne to simulate wind conditions on different scales, from across the continent down to the tiny space near a wind turbine blade, as well as the vibrations within an individual turbine itself. In addition, an NCAR-led project will create high-resolution, 3-D simulations of vertical and horizontal drafts to provide more information about winds over complex terrain. This type of research is critical as utilities seek to make wind farms as efficient as possible. Space weather: Scientists are working to better understand solar disturbances that buffet Earth's atmosphere and threaten the operation of satellites, communications, and power grids. New projects led by the University of Delaware and NCAR are using Cheyenne to gain more insight into how solar activity leads to damaging geomagnetic storms. The scientists plan to develop detailed simulations of the emergence of the magnetic field from the subsurface of the Sun into its atmosphere, as well as gain a three-dimensional view of plasma turbulence and magnetic reconnection in space that lead to plasma heating. Extreme weather: One of the leading questions about climate change is how it could affect the frequency and severity of major storms and other types of severe weather. An NCAR-led project will explore how climate interacts with the land surface and hydrology over the United States, and how extreme weather events can be expected to change in the future. It will use advanced modeling approaches at high resolution (down to just a few miles) in ways that can help scientists configure future climate models to better simulate extreme events. Climate engineering: To counter the effects of heat-trapping greenhouse gases, some experts have proposed artificially cooling the planet by injecting sulfates into the stratosphere, which would mimic the effects of a major volcanic eruption. 
But if society ever tried to engage in such climate engineering, or geoengineering, the results could alter the world's climate in unintended ways. An NCAR-led project is using Cheyenne's computing power to run an ensemble of climate engineering simulations to show how hypothetical sulfate injections could affect regional temperatures and precipitation. Smoke and global climate: A study led by the University of Wyoming will look into emissions from wildfires and how they affect stratocumulus clouds over the southeastern Atlantic Ocean. This research is needed for a better understanding of the global climate system, as stratocumulus clouds, which cover 23 percent of Earth's surface, play a key role in reflecting sunlight back into space. The work will help reveal the extent to which particles emitted during biomass burning influence cloud processes in ways that affect global temperatures.
News Article | March 2, 2017
Earth's magnetic field and atmosphere protect us on the ground from most of the harmful effects of space weather, but astronauts in low-Earth orbit—or even, one day, in interplanetary space—are more exposed to space weather, including bursts of fast-moving particles called solar energetic particles, or SEPs. "Robotic spacecraft are usually radiation-hardened to protect against these kinds of events," said Chris St. Cyr, a space scientist at NASA's Goddard Space Flight Center in Greenbelt, Maryland, and lead author on the study. "But humans are still susceptible." So NASA wants to help improve systems that would provide future astronauts with advance warning of incoming SEPs. In the recent paper, scientists showed that tracking an associated kind of solar explosion—fast-moving clouds of magnetic solar material, called coronal mass ejections—can help. Scientists observe coronal mass ejections using a type of instrument called a coronagraph, in which a solid disk blocks the sun's bright face, revealing the sun's tenuous atmosphere, called the corona. Space-based coronagraphs are more widely used in space weather research because of their wide-field solar views that are not interrupted by cloud cover or Earth's rotation. But ground-based coronagraphs have their own advantages: while they can observe the sun only during daytime and in clear weather, they can return data almost instantly, and at a much higher time resolution than satellite instruments. This speed of data return could make a significant difference, because SEPs can move at nearly the speed of light, so their total travel time can be less than an hour from the time they're accelerated near the sun to when they reach Earth. "With space-based coronagraphs, we get images back every 20-30 minutes," said St. Cyr. "You'll see the CME in one frame, and by the time you get the next frame—which contains the information we need to tell how fast it's moving—the energetic particles have already arrived." 
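The cadence argument in the quote above comes down to simple travel-time arithmetic. A minimal sketch (the particle speeds below are illustrative assumptions, not values from the study):

```python
AU_M = 1.496e11   # mean Sun-Earth distance in meters
C_M_S = 2.998e8   # speed of light in meters per second

def sun_to_earth_minutes(fraction_of_c):
    """Sun-to-Earth travel time, in minutes, at a given fraction of light speed."""
    return AU_M / (fraction_of_c * C_M_S) / 60.0

# Light needs only ~8.3 minutes to reach Earth, and the fastest SEPs are not
# far behind, so a 20-30 minute imaging cadence can easily miss the one frame
# that reveals the CME's speed before the particles arrive.
for f in (1.0, 0.5, 0.1):
    print(f"{f:.0%} of c: {sun_to_earth_minutes(f):.1f} min")
```

Even at a tenth of light speed, the trip takes under an hour and a half, which is why a near-real-time ground-based instrument can add meaningful warning time.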
For this study, scientists worked backwards to see whether they could use a ground-based coronagraph to get that key information on the CME's speed fast enough to lengthen the warning time. They selected an SEP event and then went back to check whether the data was available from a coronagraph called K-Cor, which is part of NCAR's High Altitude Observatory and sits on top of the Mauna Loa volcano in Hawaii. Their search confirmed that the information needed to predict the arrival of the energetic particles was available about 45 minutes before the particles arrived at Earth—tens of minutes before they left the sun's inner atmosphere. The next step is to repeat this study over and over—using both archived data and future observations—to see whether the early signatures of these energetic particles can be reliably detected in K-Cor's images. This confirmation, along with planned improvements that would put K-Cor's images online even faster, could make it possible for this technique to become a tool in space weather forecasting, such as that provided for the nation by the U.S. National Oceanic and Atmospheric Administration. "Currently, processed images from K-Cor are available on the internet in less than 15 minutes after they're taken," said Joan Burkepile, an author on the study based at NCAR and principal investigator for the K-Cor instrument. "We're installing a more powerful computer at the observatory in Hawaii to process the images seconds after they are acquired and provide the data on the internet within a minute or two of acquisition." More information: O. C. St. Cyr et al, Solar energetic particle warnings from a coronagraph, Space Weather (2017). DOI: 10.1002/2016SW001545
Agency: GTR | Branch: NERC | Phase: Research Grant | Award Amount: 250.09K | Year: 2015
The hydroxyl radical (OH) is the dominant oxidizing agent in the troposphere; as such, its concentration controls the abundances and lifetimes of most atmospheric pollutants, including the important greenhouse gas methane (CH4). Ozone (O3) is also an important oxidant and is itself a greenhouse gas. The concentrations of OH and O3 are interdependent, both being determined by a complex series of reactions involving CH4, carbon monoxide (CO), non-methane volatile organic compounds (NMVOCs) and nitrogen oxides (NOX = NO + NO2). As emissions of these compounds have changed substantially since pre-industrial times, the tropospheric budgets of OH and O3 will also have changed. However, there are large uncertainties associated with current understanding of these past changes, and consequently very large uncertainties in projected future changes and associated climate impacts. Most of this uncertainty in past trends comes from a lack of observations to constrain studies. Whilst there are a few direct observational data sets which indicate how O3 concentrations changed through the 20th century, there are none for OH. Direct observational data sets of CH4, NMVOCs, CO and NOX extend, at best, from the 1980s. These time series can be extended backward in time through the analysis of air trapped in firn (unconsolidated snow). Whilst such historic time series have been available for CH4 for some time, only recently have they become available for CO and for some NMVOCs, in particular alkanes. Furthermore, we have also recently determined, from firn analysis, historic time series of alkyl nitrates. Alkyl nitrates are products of the chemistry involving NOX and as such can be used as a diagnostic of the changes in NOX. These new (and, in the case of the alkyl nitrates, unique) historic time series provide an exciting opportunity to investigate the changing OH and O3 budgets of the northern hemisphere troposphere since 1950 with observational constraints never available before. 
Very interestingly, the simple analyses carried out on these time series to date suggest that substantial changes in the atmospheric chemistry have occurred. To exploit the full value of these time series a detailed study is required with a comprehensive chemistry-climate model. Here we propose the first such study. The outcomes of this study will be: 1) a better understanding of the impact of changing anthropogenic emissions on the OH and O3 budgets of the northern hemisphere troposphere; 2) an improved modelling capability with which to project future changes and better inform climate policy. This proposal brings together experts in firn air data interpretation with experts in chemistry-climate modelling. Both groups also have considerable expertise in organic (including alkyl) nitrate chemistry. This proposal specifically builds on past NERC-funded work on the trends of alkanes and alkyl nitrates in firn air using simple relationships and models.
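The abstract's central point, that OH concentration sets the lifetime of pollutants such as methane, follows from the standard pseudo-first-order loss relation tau = 1 / (k * [OH]). A back-of-envelope sketch (the rate constant and OH level are textbook-style assumptions, not values from the proposal):

```python
# Pseudo-first-order lifetime of CH4 against loss by OH: tau = 1 / (k * [OH])
K_CH4_OH = 6.4e-15   # cm^3 molecule^-1 s^-1; approximate room-temperature rate constant (assumed)
OH_MEAN = 1.0e6      # molecules cm^-3; rough global-mean tropospheric OH (assumed)

tau_s = 1.0 / (K_CH4_OH * OH_MEAN)
tau_years = tau_s / (365.25 * 24 * 3600)
print(f"CH4 lifetime against OH: {tau_years:.1f} years")
```

This room-temperature estimate of roughly five years is shorter than the commonly quoted lifetime of about nine years, because the rate constant falls at the colder temperatures of the real troposphere; the point of the sketch is simply the inverse dependence: halve [OH] and the methane lifetime doubles, which is why reconstructing past OH trends matters so much for the methane budget.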
Journal of Climate | Year: 2011
One tool for studying uncertainties in simulations of future climate is to consider ensembles of general circulation models in which parameterizations have been sampled within their physical range of plausibility. This study examines simulations from two such ensembles: a subset of the climateprediction.net ensemble using the Met Office Hadley Centre Atmosphere Model, version 3.0, and the new "CAMcube" ensemble using the Community Atmosphere Model, version 3.5. The study determines that the distribution of climate sensitivity in the two ensembles is very different: the climateprediction.net ensemble subset range is 1.7-9.9 K, while the CAMcube ensemble range is 2.2-3.2 K. On a regional level, however, both ensembles show a similarly diverse range in their mean climatology. Model radiative flux changes suggest that the major difference between the ranges of climate sensitivity in the two ensembles lies in their clear-sky longwave responses. Large clear-sky feedbacks present only in the climateprediction.net ensemble are found to be proportional to significant biases in upper-tropospheric water vapor concentrations, which are not observed in the CAMcube ensemble. Both ensembles have a similar range of shortwave cloud feedback, making it unlikely that shortwave cloud feedbacks are causing the larger climate sensitivities in climateprediction.net. In both cases, increased negative shortwave cloud feedbacks at high latitudes are generally compensated by increased positive feedbacks at lower latitudes. © 2011 American Meteorological Society.
Surveys in Geophysics | Year: 2012
The transition between the middle atmosphere and the thermosphere is known as the MLT region (for mesosphere and lower thermosphere). This area has some characteristics that set it apart from other regions of the atmosphere. Most notably, it is the altitude region with the lowest overall temperature and has the unique characteristic that the temperature is much lower in summer than in winter. The summer-to-winter-temperature gradient is the result of adiabatic cooling and warming associated with a vigorous circulation driven primarily by gravity waves. Tides and planetary waves also contribute to the circulation and to the large dynamical variability in the MLT. The past decade has seen much progress in describing and understanding the dynamics of the MLT and the interactions of dynamics with chemistry and radiation. This review describes recent observations and numerical modeling as they relate to understanding the dynamical processes that control the MLT and its variability. Results from the Whole Atmosphere Community Climate Model (WACCM), which is a comprehensive high-top general circulation model with interactive chemistry, are used to illustrate the dynamical processes. Selected observations from the Sounding the Atmosphere with Broadband Emission Radiometry (SABER) instrument are shown for comparison. WACCM simulations of MLT dynamics have some differences with observations. These differences and other questions and discrepancies described in recent papers point to a number of ongoing uncertainties about the MLT dynamical system. © 2012 Springer Science+Business Media B.V.
News Article | February 27, 2017
As the world warms, mountain snowpack will not only melt earlier, it will also melt more slowly, according to a new study by scientists at the National Center for Atmospheric Research (NCAR). The counterintuitive finding, published today in the journal Nature Climate Change, could have widespread implications for water supplies, ecosystem health, and flood risk. "When snowmelt shifts earlier in the year, the snow is no longer melting under the high sun angles of late spring and early summer," said NCAR postdoctoral researcher Keith Musselman, lead author of the paper. "The Sun just isn't providing enough energy at that time of year to drive high snowmelt rates." The study was funded by the National Science Foundation, NCAR's sponsor. The findings could explain recent research that suggests the average streamflow in watersheds encompassing snowy mountains may decline as the climate warms -- even if the total amount of precipitation in the watershed remains unchanged. That's because the snowmelt rate can directly affect streamflow. When snowpack melts more slowly, the resulting water lingers in the soil, giving plants more opportunity to take up the moisture. Water absorbed by plants is water that doesn't make it into the stream, potentially reducing flows. Musselman first became interested in how snowmelt rates might change in the future when he was doing research in the Sierra Nevada. He noticed that shallower, lower-elevation snowpack melted earlier and more slowly than thicker, higher-elevation snowpack. The snow at cooler, higher elevations tended to stick around until early summer -- when the Sun was relatively high in the sky and the days had grown longer -- so when it finally started to melt, the melt was rapid. Musselman wondered if the same phenomenon would unfold in a future climate, when warmer temperatures are expected to transform higher-elevation snowpack into something that looks much more like today's lower-elevation snowpack. 
If so, the result would be more snow melting slowly and less snow melting quickly. To investigate the question, Musselman first confirmed what he'd noticed in the Sierra by analyzing a decade's worth of snowpack observations from 979 stations in the United States and Canada. He and his co-authors -- NCAR scientists Martyn Clark, Changhai Liu, Kyoko Ikeda, and Roy Rasmussen -- then simulated snowpack over the same decade using the NCAR-based Weather Research and Forecasting (WRF) model. Once they determined that the output from WRF tracked with the observations, they used simulations from the model to investigate how snowmelt rates might change in North America around the end of the century if climate change continues unabated. "We found a decrease in the total volume of meltwater -- which makes sense given that we expect there to be less snow overall in the future," Musselman said. "But even with this decrease, we found an increase in the amount of water produced at low melt rates and, on the flip side, a decrease in the amount of water produced at high melt rates." While the study did not investigate the range of implications that could come from the findings, Musselman said the impacts could be far-reaching. For example, a reduction in high melt rates could mean fewer spring floods, which could lower the risk of infrastructure damage but also negatively affect riparian ecosystems. Changes in the timing and amount of snowmelt runoff could also cause warmer stream temperatures, which would affect trout and other fish species, and the expected decrease in streamflow could cause shortages in urban water supplies. "We hope this study motivates scientists from many other disciplines to dig into our research so we can better understand the vast implications of this projected shift in hydrologic patterns," Musselman said. 
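The sun-angle reasoning above can be made concrete with a standard top-of-atmosphere insolation estimate. A minimal sketch (the latitude, dates, and circular-orbit declination formula are illustrative assumptions, not inputs from the study):

```python
import math

def daily_mean_toa_insolation(lat_deg, day_of_year, s0=1361.0):
    """Daily-mean top-of-atmosphere insolation (W/m^2), circular-orbit approximation."""
    lat = math.radians(lat_deg)
    # Approximate solar declination (Cooper's formula)
    decl = math.radians(23.44) * math.sin(2.0 * math.pi * (day_of_year - 81) / 365.0)
    # Sunrise/sunset hour angle, clamped to handle polar day and polar night
    cos_h0 = max(-1.0, min(1.0, -math.tan(lat) * math.tan(decl)))
    h0 = math.acos(cos_h0)
    return (s0 / math.pi) * (h0 * math.sin(lat) * math.sin(decl)
                             + math.cos(lat) * math.cos(decl) * math.sin(h0))

# A mid-latitude mountain site: early-spring melt runs on much less solar
# energy than melt near the solstice, before clouds and albedo even enter.
april = daily_mean_toa_insolation(39.0, 91)    # ~April 1
june = daily_mean_toa_insolation(39.0, 172)    # ~June 21
print(f"April 1: {april:.0f} W/m^2, June 21: {june:.0f} W/m^2")
```

At 39°N the solstice-time value comes out roughly a third larger than the early-spring value, combining both the higher sun angle and the longer days; that gap is the sense in which snowmelt that shifts earlier in the year proceeds more slowly.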
The University Corporation for Atmospheric Research manages the National Center for Atmospheric Research under sponsorship by the National Science Foundation. Any opinions, findings and conclusions, or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.