Earth's magnetic field and atmosphere protect us on the ground from most of the harmful effects of space weather, but astronauts in low-Earth orbit—or even, one day, in interplanetary space—are more exposed to space weather, including bursts of fast-moving particles called solar energetic particles, or SEPs. "Robotic spacecraft are usually radiation-hardened to protect against these kinds of events," said Chris St. Cyr, a space scientist at NASA's Goddard Space Flight Center in Greenbelt, Maryland, and lead author on the study. "But humans are still susceptible." So NASA wants to help improve systems that would provide future astronauts with advance warning of incoming SEPs. In the recent paper, scientists showed that tracking an associated kind of solar explosion—fast-moving clouds of magnetic solar material, called coronal mass ejections—can help. Scientists observe coronal mass ejections using a type of instrument called a coronagraph, in which a solid disk blocks the sun's bright face, revealing the sun's tenuous atmosphere, called the corona. Space-based coronagraphs are more widely used in space weather research because of their wide-field solar views that are not interrupted by cloud cover or Earth's rotation. But ground-based coronagraphs have their own advantages—while they can only observe the sun in the day during clear weather, they can return data almost instantly, and at a much higher time resolution than satellite instruments. This speed of data return could make a significant difference, given that SEPs can move at nearly the speed of light—so their total travel time can be less than an hour from the time they're accelerated near the sun to when they reach Earth. "With space-based coronagraphs, we get images back every 20-30 minutes," said St. Cyr. "You'll see the CME in one frame, and by the time you get the next frame—which contains the information we need to tell how fast it's moving—the energetic particles have already arrived." 
For this study, scientists worked backwards to see whether they could use a ground-based coronagraph to get that key information on the CME's speed fast enough to lengthen the warning time. They selected an SEP event and then went back to check if the data was available from a coronagraph called K-Cor, which is part of NCAR's High Altitude Observatory and sits on top of the Mauna Loa volcano in Hawaii. Their search confirmed that the necessary information to predict the arrival of the energetic particles was available about 45 minutes before the particles arrived at Earth—tens of minutes before they left the sun's inner atmosphere. The next step is to repeat this study over and over—using both archived data and future observations—in order to see if the early signatures of these energetic particles can be reliably detected in K-Cor's images. This confirmation, along with planned improvements that would put K-Cor's images online even faster, could make it possible for this technique to become a tool in space weather forecasting, such as the forecasts provided for the nation by the U.S. National Oceanic and Atmospheric Administration. "Currently, processed images from K-Cor are available on the internet in less than 15 minutes after they're taken," said Joan Burkepile, an author on the study based at NCAR and principal investigator for the K-Cor instrument. "We're installing a more powerful computer at the observatory in Hawaii to process the images seconds after they are acquired and provide the data on the internet within a minute or two of acquisition." More information: O. C. St. Cyr et al., "Solar energetic particle warnings from a coronagraph," Space Weather (2017). DOI: 10.1002/2016SW001545
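The timing argument above can be restated as a back-of-envelope calculation. The sketch below is purely illustrative: the CME heights, speed, and the 15-second ground-based cadence are assumed values, not numbers from the St. Cyr et al. study.

```python
# Illustrative sketch of the warning-time argument. All cadences and CME
# numbers here are assumptions for demonstration, not values from the paper.

AU_KM = 1.496e8      # mean sun-Earth distance, km
C_KM_S = 3.0e5       # speed of light, km/s

def cme_speed(height1_km, height2_km, cadence_s):
    """Front speed of a CME estimated from its height in two successive frames."""
    return (height2_km - height1_km) / cadence_s

# A fast CME front moving ~2000 km/s, observed at an assumed rapid
# ground-based cadence of 15 seconds (vs. 20-30 minutes from space).
fast_cadence_s = 15
speed_km_s = cme_speed(2.0e6, 2.0e6 + 2000 * fast_cadence_s, fast_cadence_s)

# Even light takes only ~8.3 minutes to cross 1 AU, so near-light-speed SEPs
# can arrive well inside a single 30-minute space-based frame interval.
light_travel_min = AU_KM / C_KM_S / 60

print(f"Estimated CME speed: {speed_km_s:.0f} km/s")
print(f"Sun-to-Earth light travel time: {light_travel_min:.1f} minutes")
```

The point of the arithmetic: a speed estimate needs two frames, so a 30-minute cadence costs more time than the fastest particles take to arrive, while a seconds-scale cadence does not.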


News Article | February 15, 2017
Site: www.scientificcomputing.com

The National Center for Atmospheric Research (NCAR) is launching operations this month of one of the world's most powerful and energy-efficient supercomputers, providing the nation with a major new tool to advance understanding of the atmospheric and related Earth system sciences. Named "Cheyenne," the 5.34-petaflop system is capable of more than triple the amount of scientific computing performed by the previous NCAR supercomputer, Yellowstone. It also is three times more energy efficient. Scientists across the country will use Cheyenne to study phenomena ranging from wildfires and seismic activity to gusts that generate power at wind farms. Their findings will lay the groundwork for better protecting society from natural disasters, lead to more detailed projections of seasonal and longer-term weather and climate variability and change, and improve weather and water forecasts that are needed by economic sectors from agriculture and energy to transportation and tourism. "Cheyenne will help us advance the knowledge needed for saving lives, protecting property, and enabling U.S. businesses to better compete in the global marketplace," said Antonio J. Busalacchi, president of the University Corporation for Atmospheric Research. "This system is turbocharging our science." UCAR manages NCAR on behalf of the National Science Foundation (NSF). Cheyenne currently ranks as the 20th fastest supercomputer in the world and the fastest in the Mountain West, although such rankings change as new and more powerful machines begin operations. It is funded by NSF as well as by the state of Wyoming through an appropriation to the University of Wyoming. Cheyenne is housed in the NCAR-Wyoming Supercomputing Center (NWSC), one of the nation's premier supercomputing facilities for research. Since the NWSC opened in 2012, more than 2,200 scientists from more than 300 universities and federal labs have used its resources. 
"Through our work at the NWSC, we have a better understanding of such important processes as surface and subsurface hydrology, physics of flow in reservoir rock, and weather modification and precipitation stimulation," said William Gern, vice president of research and economic development at the University of Wyoming. "Importantly, we are also introducing Wyoming’s school-age students to the significance and power of computing." The NWSC is located in Cheyenne, and the name of the new system was chosen to honor the support the center has received from the people of that city. The name also commemorates the upcoming 150th anniversary of the city, which was founded in 1867 and named for the American Indian Cheyenne Nation. Cheyenne was built by Silicon Graphics International, or SGI (now part of Hewlett Packard Enterprise Co.), with DataDirect Networks (DDN) providing centralized file system and data storage components. Cheyenne is capable of 5.34 quadrillion calculations per second (5.34 petaflops; a petaflop is one quadrillion floating-point operations per second). The new system has a peak computation rate of more than 3 billion calculations per second for every watt of energy consumed, making it three times more energy efficient than the Yellowstone supercomputer, which is itself highly efficient. The data storage system for Cheyenne provides an initial capacity of 20 petabytes, expandable to 40 petabytes with the addition of extra drives. The new DDN system also transfers data at 220 gigabytes per second, more than twice as fast as the previous file system’s rate of 90 gigabytes per second. Cheyenne is the latest in a long and successful history of supercomputers supported by the NSF and NCAR to advance the atmospheric and related sciences. “We’re excited to provide the research community with more supercomputing power,” said Anke Kamrath, interim director of NCAR’s Computational and Information Systems Laboratory, which oversees operations at the NWSC.
“Scientists have access to increasingly large amounts of data about our planet. The enhanced capabilities of the NWSC will enable them to tackle problems that used to be out of reach and obtain results at far greater speeds than ever.” High-performance computers such as Cheyenne allow researchers to run increasingly detailed models that simulate complex events and predict how they might unfold in the future. With more supercomputing power, scientists can capture additional processes, run their models at a higher resolution, and conduct an ensemble of modeling runs that provide a fuller picture of the same time period. "Providing next-generation supercomputing is vital to better understanding the Earth system that affects us all, " said NCAR Director James W. Hurrell. "We're delighted that this powerful resource is now available to the nation's scientists, and we're looking forward to new discoveries in climate, weather, space weather, renewable energy, and other critical areas of research." Some of the initial projects on Cheyenne include: Long-range, seasonal to decadal forecasting: Several studies led by George Mason University, the University of Miami, and NCAR aim to improve prediction of weather patterns months to years in advance. Researchers will use Cheyenne's capabilities to generate more comprehensive simulations of finer-scale processes in the ocean, atmosphere, and sea ice. This research will help scientists refine computer models for improved long-term predictions, including how year-to-year changes in Arctic sea ice extent may affect the likelihood of extreme weather events thousands of miles away. Wind energy: Projecting electricity output at a wind farm is extraordinarily challenging as it involves predicting variable gusts and complex wind eddies at the height of turbines, which are hundreds of feet above the sensors used for weather forecasting. 
University of Wyoming researchers will use Cheyenne to simulate wind conditions on different scales, from across the continent down to the tiny space near a wind turbine blade, as well as the vibrations within an individual turbine itself. In addition, an NCAR-led project will create high-resolution, 3-D simulations of vertical and horizontal drafts to provide more information about winds over complex terrain. This type of research is critical as utilities seek to make wind farms as efficient as possible. Space weather: Scientists are working to better understand solar disturbances that buffet Earth's atmosphere and threaten the operation of satellites, communications, and power grids. New projects led by the University of Delaware and NCAR are using Cheyenne to gain more insight into how solar activity leads to damaging geomagnetic storms. The scientists plan to develop detailed simulations of the emergence of the magnetic field from the subsurface of the Sun into its atmosphere, as well as gain a three-dimensional view of plasma turbulence and magnetic reconnection in space that lead to plasma heating. Extreme weather: One of the leading questions about climate change is how it could affect the frequency and severity of major storms and other types of severe weather. An NCAR-led project will explore how climate interacts with the land surface and hydrology over the United States, and how extreme weather events can be expected to change in the future. It will use advanced modeling approaches at high resolution (down to just a few miles) in ways that can help scientists configure future climate models to better simulate extreme events. Climate engineering: To counter the effects of heat-trapping greenhouse gases, some experts have proposed artificially cooling the planet by injecting sulfates into the stratosphere, which would mimic the effects of a major volcanic eruption. 
But if society ever tried to engage in such climate engineering, or geoengineering, the results could alter the world's climate in unintended ways. An NCAR-led project is using Cheyenne's computing power to run an ensemble of climate engineering simulations to show how hypothetical sulfate injections could affect regional temperatures and precipitation. Smoke and global climate: A study led by the University of Wyoming will look into emissions from wildfires and how they affect stratocumulus clouds over the southeastern Atlantic Ocean. This research is needed for a better understanding of the global climate system, as stratocumulus clouds, which cover 23 percent of Earth's surface, play a key role in reflecting sunlight back into space. The work will help reveal the extent to which particles emitted during biomass burning influence cloud processes in ways that affect global temperatures.
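The performance and efficiency figures quoted earlier (5.34 petaflops at roughly 3 billion calculations per second per watt) can be cross-checked with a one-line calculation. This is a sketch of the implied arithmetic, not an official NCAR power figure.

```python
# Cross-check of the Cheyenne figures quoted above: a 5.34-petaflop peak
# at roughly 3 gigaflops per watt implies a power draw just under 2 MW.

peak_flops = 5.34e15        # 5.34 quadrillion calculations per second
flops_per_watt = 3.0e9      # "more than 3 billion calculations per second per watt"

implied_power_mw = peak_flops / flops_per_watt / 1e6
print(f"Implied peak power draw: {implied_power_mw:.2f} MW")
```

The result is consistent with the roughly 1.5-megawatt operating draw reported for the machine, since peak computation (and thus peak power) is rarely sustained.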


News Article | February 17, 2017
Site: hosted2.ap.org

(AP) — A new supercomputer in the top coal-mining state has begun critical climate-change research with support from even some global warming doubters, but scientists worry President Donald Trump could cut funding for such programs. The $30 million, house-sized supercomputer named Cheyenne belongs to a federally funded research center. It got to work a few weeks ago crunching numbers for several ambitious projects, from modeling air currents at wind farms to figuring out how to better predict weather months to years in advance. It's the fastest computer in the Rocky Mountain West — three times faster than the 4-year-old supercomputer named Yellowstone it is replacing and 20th-fastest in the world. Capable of 5.34 quadrillion calculations per second, Cheyenne is 240,000 times faster than a new, high-end laptop. Located in a windy business park near the city of Cheyenne, the National Center for Atmospheric Research-Wyoming Supercomputing Center that houses the water-cooled machine continues to enjoy support even from Wyoming's coal cheerleaders who doubt humankind is warming the Earth. "Before we start making policy decisions on this, the science has got to be good," said Travis Deti, executive director of the Wyoming Mining Association. The vast majority of peer-reviewed studies, science organizations and climate scientists have found the Earth is warming and that the warming is man-made and a problem, but Wyoming's relationship with climate science is complicated at best. The University of Wyoming in 2012 removed a campus artwork made of charred logs after the fossil fuel industry objected to the piece's climate-change-awareness message. The state also has vacillated on whether and how K-12 students should learn about climate change. Gov. Matt Mead, who is suing to block Obama administration efforts to limit carbon emissions from power plants and other sources, calls himself a climate-change skeptic. 
Still, he supports the supercomputer's role in driving Wyoming's small technology sector, spokesman David Bush said. Even so, scientists worry Trump, who has called climate change a hoax perpetrated by the Chinese to harm U.S. economic interests, could cut such projects. About 70 percent of the supercomputer's cost comes from the National Science Foundation, an independent federal agency with a $7.5 billion budget. Traditionally the foundation has had bipartisan support, but some Republicans have suggested redirecting the agency away from the earth sciences — and from climate change research in particular. In December some 800 U.S. scientists, including 23 affiliated with the University of Wyoming and three at the organization that runs the supercomputer, signed an open letter urging Trump to take climate change seriously. "To be ignorant doesn't really prevent it from happening," said Shane Murphy, a University of Wyoming assistant professor and climate researcher who signed. The White House didn't immediately respond to an Associated Press request for comment on Trump's plans for funding the science foundation. Like its predecessor Yellowstone, Cheyenne will help better predict weather and, over the long term, climate change. "We believe that doing better predictions of those things have apolitical benefits — saving lives and saving money, and improving outcomes for businesses and farmers," said Rich Loft, a National Center for Atmospheric Research supercomputing specialist. The center moved its supercomputing to Cheyenne from Boulder, Colorado, in 2012, enticed by a $40 million state incentive package. These days, Wyoming doesn't have money to lure a supercomputer or much of anything else. Downturns in coal, oil and natural gas extraction — in part because of competition from renewable energy — have pinched state revenue. 
Wyoming continues to supply close to 40 percent of the nation's coal, however, and in 2014 the state put $15 million toward a $20 million facility to study carbon capture at a power plant near Gillette. The Yellowstone and Cheyenne supercomputers each use about 1.5 megawatts of electricity, about as much as 750 average-sized homes draw at any given time. Some of the electricity comes from a wind farm 7 miles up the road. The supercomputers will work side by side until Yellowstone is decommissioned later this year. The center's scientists are aware of the political winds surrounding their work but plan to stay the course on projects planned since long before Trump's election. "I really don't think that anybody at NCAR is talking about changing our science mission," Loft said.
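Two of the comparisons in the story can be sanity-checked with simple arithmetic. The laptop speed below is an implied figure derived from the quoted ratio, not a benchmark result.

```python
# Sanity check on two figures in the AP story: Cheyenne is said to be
# 240,000 times faster than a new, high-end laptop, and its ~1.5 MW draw
# is equated to the demand of 750 average-sized homes.

cheyenne_flops = 5.34e15
implied_laptop_gflops = cheyenne_flops / 240_000 / 1e9   # ~22 gigaflops
watts_per_home = 1.5e6 / 750                             # 2000 W per home

print(f"Implied laptop speed: {implied_laptop_gflops:.1f} gigaflops")
print(f"Implied draw per home: {watts_per_home:.0f} W")
```

Both implied values are plausible: tens of gigaflops for a high-end laptop of that era, and a couple of kilowatts of instantaneous household demand.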


News Article | February 17, 2017
Site: news.yahoo.com

This Oct. 26, 2016 photo provided by the University Corporation for Atmospheric Research shows the new supercomputer named Cheyenne at the NCAR-Wyoming Supercomputing Center in Cheyenne, Wyo. Wyoming officials including Gov. Matt Mead say they support the NCAR-Wyoming Supercomputing Center even as they describe themselves as climate skeptics. Scientists nationwide are nonetheless concerned that President Donald Trump, who has called climate change a hoax, might not take climate change research seriously. (Carlye Calvin/University Corporation for Atmospheric Research via AP)


Our constantly-changing sun sometimes erupts with bursts of light, solar material, or ultra-fast energized particles -- collectively, these events contribute to space weather. In a study published Jan. 30, 2017, in Space Weather, scientists from NASA and the National Center for Atmospheric Research, or NCAR, in Boulder, Colorado, have shown that the warning signs of one type of space weather event can be detected tens of minutes earlier than with current forecasting techniques - critical extra time that could help protect astronauts in space.


Grant
Agency: GTR | Branch: NERC | Program: | Phase: Research Grant | Award Amount: 250.09K | Year: 2015

The hydroxyl radical (OH) is the dominant oxidizing agent in the troposphere, as such its concentration controls the abundances and lifetimes of most atmospheric pollutants, including the important greenhouse gas methane (CH4). Ozone (O3) is also an important oxidant and is itself a greenhouse gas. The concentrations of OH and O3 are interdependent, both being determined by a complex series of reactions involving CH4, carbon monoxide (CO), non-methane volatile organic compounds (NMVOCs) and nitrogen oxides (NOX = NO + NO2). As emissions of these compounds have changed substantially since pre-industrial times, the tropospheric budgets of OH and O3 will also have changed. However, there are large uncertainties associated with current understanding of these past changes and consequently very large uncertainties in projected future changes and associated climate impacts. Most of this uncertainty in past trends comes from lack of observations to constrain studies. Whilst there are a few direct observational data sets which indicate how O3 concentrations changed through the 20th century, there are none for OH. Direct observational data sets of CH4, NMVOCs, CO and NOX, extend, at best, from the 1980s. These time series can be extended backward in time through the analysis of air trapped in firn (unconsolidated snow). Whilst such historic time series have been available for CH4 for some time, only recently have they become available for CO and for some NMVOCs, in particular alkanes. Furthermore, we have also recently determined, from firn analysis, historic time series of alkyl nitrates. Alkyl nitrates are products of the chemistry involving NOX and as such can be used as a diagnostic of the changes in NOX. These new (and in the case of the alkyl nitrates, unique), historic time series provide an exciting opportunity to investigate the changing OH and O3 budgets of the northern hemisphere troposphere since 1950 with observational constraints never available before. 
Very interestingly, the simple analyses carried out on these time series to date suggest that substantial changes in the atmospheric chemistry have occurred. To exploit the full value of these time series a detailed study is required with a comprehensive chemistry-climate model. Here we propose the first such study. The outcomes of this study will be: 1) a better understanding of the impact of changing anthropogenic emissions on the OH and O3 budgets of the northern hemisphere troposphere; 2) an improved modelling capability with which to project future changes and better inform climate policy. This proposal brings together experts in firn air data interpretation with experts in chemistry-climate modelling. Both groups also have considerable expertise in organic (including alkyl) nitrate chemistry. This proposal specifically builds on past NERC-funded work on the trends of alkanes and alkyl nitrates in firn air using simple relationships and models.


News Article | March 30, 2016
Site: www.techtimes.com

The development of a distinct sea surface temperature pattern in the North Pacific Ocean can help meteorologists predict increased occurrences of heat waves up to 50 days ahead, at least in the United States' eastern half. The pattern shows the collision between cooler-than-average and warmer-than-average waters. Depending on how distinct the building pattern is - with the warmer waters hitting the cooler waters - the chances of an extreme heat wave in a particular day or week can increase threefold. The research was conducted by the National Center for Atmospheric Research (NCAR). Summer heat waves are among the most lethal weather events, affecting not just people but also farming and energy use. According to lead author and NCAR postdoctoral researcher Karen McKinnon, giving farmers and city planners advance warning of an upcoming heat wave can help avoid worst-case consequences. In the study, the researchers divided the United States into regions that are most likely to experience severe heat waves at the same time. Of the resulting heat wave regions, they focused on the area that stretches across most of the Midwest all the way to the East Coast. This region comprises both heavily populated cities and agricultural areas. The team investigated whether there is a connection between severe heat waves in the country's eastern half and irregularities in worldwide sea surface temperatures. They discovered a pattern in the middle of the North Pacific Ocean and named it the Pacific Extreme Pattern. They learned that the pattern is present not only during heat waves but well ahead of them. "Whatever mechanisms ultimately lead to the heat wave also leaves a fingerprint of sea surface temperature anomalies behind," said McKinnon. The team wanted to see how well the Pacific Extreme Pattern could predict upcoming heat waves retrospectively.
They analyzed data from 1,613 weather stations in the eastern U.S. from 1982 to 2015 and compared it with sea surface temperature data from the same period. Fifty days before a heat wave event, they were able to detect an increase in the chances - from one in six to one in four - that a heat wave would strike in a specific week. Thirty days out or closer, the pattern was able to hindcast the odds - about one in two - that a heat wave would strike on a specific day. The new technique could help improve current seasonal forecasts, and the ability to make long-range predictions for individual heat wave events can help society prepare better than relying on current forecasts alone. The National Science Foundation funded the research, which was published in the journal Nature Geoscience on March 28. The study team included researchers from the University of Washington, Pennsylvania State University and Harvard University.
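The odds quoted in the article imply simple risk ratios. The probabilities below are the article's; the helper function is just illustrative arithmetic restating them.

```python
# Risk ratios implied by the odds quoted above: a baseline one-in-six
# chance of a heat wave in a given week rises to one in four with a
# 50-day lead, and to about one in two at 30 days.

def risk_ratio(p_with_pattern, p_baseline):
    """How many times more likely the event is when the pattern is present."""
    return p_with_pattern / p_baseline

baseline = 1 / 6
at_50_days = 1 / 4   # one in four
at_30_days = 1 / 2   # about one in two

print(f"50-day lead: {risk_ratio(at_50_days, baseline):.1f}x baseline")
print(f"30-day lead: {risk_ratio(at_30_days, baseline):.1f}x baseline")
```

The 30-day ratio works out to three times the baseline, matching the threefold increase in heat wave odds described earlier in the article.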


One tool for studying uncertainties in simulations of future climate is to consider ensembles of general circulation models in which parameterizations have been sampled within their physically plausible ranges. This study examines simulations from two such ensembles: a subset of the climateprediction.net ensemble using the Met Office Hadley Centre Atmosphere Model, version 3.0, and the new "CAMcube" ensemble using the Community Atmosphere Model, version 3.5. The distribution of climate sensitivity differs markedly between the two ensembles: the climateprediction.net ensemble subset spans 1.7-9.9 K, while the CAMcube ensemble spans 2.2-3.2 K. On a regional level, however, both ensembles show a similarly diverse range in their mean climatology. Model radiative flux changes suggest that the major difference between the two ensembles' ranges of climate sensitivity lies in their clear-sky longwave responses. Large clear-sky feedbacks present only in the climateprediction.net ensemble are found to be proportional to significant biases in upper-tropospheric water vapor concentrations, which are not observed in the CAMcube ensemble. Both ensembles have a similar range of shortwave cloud feedback, making it unlikely that shortwave cloud feedbacks drive the larger climate sensitivities in climateprediction.net. In both cases, increased negative shortwave cloud feedbacks at high latitudes are generally compensated by increased positive feedbacks at lower latitudes. © 2011 American Meteorological Society.
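The asymmetry between the two sensitivity ranges follows from the inverse relation between climate sensitivity and the net feedback parameter: a modest weakening of the net (stabilizing) feedback inflates sensitivity dramatically. A minimal sketch of the standard energy-balance relation ECS = F_2x / lambda_net, with illustrative feedback values that are assumptions, not numbers from either ensemble:

```python
# Illustrative energy-balance sketch (not from the paper): equilibrium
# climate sensitivity ECS = F_2x / lambda_net, where F_2x ~ 3.7 W m^-2 is
# the canonical CO2-doubling forcing and lambda_net is the net climate
# feedback parameter in W m^-2 K^-1.

F_2X = 3.7  # W m^-2, CO2-doubling radiative forcing

def ecs(lambda_net):
    """Equilibrium climate sensitivity (K) for a given net feedback."""
    return F_2X / lambda_net

# Because of the inverse relation, shaving the net feedback from
# ~1.7 down to ~0.4 W m^-2 K^-1 (e.g., via an extra positive clear-sky
# longwave term) stretches ECS from ~2 K to nearly 10 K.
for lam in (1.7, 1.2, 0.8, 0.4):
    print(f"lambda = {lam:.1f} W m^-2 K^-1 -> ECS = {ecs(lam):.1f} K")
```

This is why a clear-sky longwave feedback present in only one ensemble can widen its sensitivity range far more at the top end than at the bottom.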


Smith A.K., NCAR
Surveys in Geophysics | Year: 2012

The transition region between the middle atmosphere and the thermosphere is known as the MLT (for mesosphere and lower thermosphere). This region has several characteristics that set it apart from other parts of the atmosphere. Most notably, it is the altitude region with the lowest overall temperature, and it has the unique property that temperatures are much lower in summer than in winter. The summer-to-winter temperature gradient is the result of adiabatic cooling and warming associated with a vigorous circulation driven primarily by gravity waves. Tides and planetary waves also contribute to the circulation and to the large dynamical variability of the MLT. The past decade has seen much progress in describing and understanding the dynamics of the MLT and the interactions of dynamics with chemistry and radiation. This review describes recent observations and numerical modeling as they relate to understanding the dynamical processes that control the MLT and its variability. Results from the Whole Atmosphere Community Climate Model (WACCM), a comprehensive high-top general circulation model with interactive chemistry, are used to illustrate these dynamical processes. Selected observations from the Sounding of the Atmosphere using Broadband Emission Radiometry (SABER) instrument are shown for comparison. WACCM simulations of MLT dynamics differ from observations in some respects; these differences, together with other questions and discrepancies described in recent papers, point to a number of ongoing uncertainties about the MLT dynamical system. © 2012 Springer Science+Business Media B.V.


News Article | February 27, 2017
Site: www.eurekalert.org

As the world warms, mountain snowpack will not only melt earlier, it will also melt more slowly, according to a new study by scientists at the National Center for Atmospheric Research (NCAR). The counterintuitive finding, published today in the journal Nature Climate Change, could have widespread implications for water supplies, ecosystem health, and flood risk. "When snowmelt shifts earlier in the year, the snow is no longer melting under the high sun angles of late spring and early summer," said NCAR postdoctoral researcher Keith Musselman, lead author of the paper. "The Sun just isn't providing enough energy at that time of year to drive high snowmelt rates." The study was funded by the National Science Foundation, NCAR's sponsor. The findings could explain recent research that suggests the average streamflow in watersheds encompassing snowy mountains may decline as the climate warms -- even if the total amount of precipitation in the watershed remains unchanged. That's because the snowmelt rate can directly affect streamflow. When snowpack melts more slowly, the resulting water lingers in the soil, giving plants more opportunity to take up the moisture. Water absorbed by plants is water that doesn't make it into the stream, potentially reducing flows. Musselman first became interested in how snowmelt rates might change in the future when he was doing research in the Sierra Nevada. He noticed that shallower, lower-elevation snowpack melted earlier and more slowly than thicker, higher-elevation snowpack. The snow at cooler, higher elevations tended to stick around until early summer -- when the Sun was relatively high in the sky and the days had grown longer -- so when it finally started to melt, the melt was rapid. Musselman wondered if the same phenomenon would unfold in a future climate, when warmer temperatures are expected to transform higher-elevation snowpack into something that looks much more like today's lower-elevation snowpack. 
If so, the result would be more snow melting slowly and less snow melting quickly. To investigate the question, Musselman first confirmed what he'd noticed in the Sierra by analyzing a decade's worth of snowpack observations from 979 stations in the United States and Canada. He and his co-authors -- NCAR scientists Martyn Clark, Changhai Liu, Kyoko Ikeda, and Roy Rasmussen -- then simulated snowpack over the same decade using the NCAR-based Weather Research and Forecasting (WRF) model. Once they determined that the output from WRF tracked with the observations, they used simulations from the model to investigate how snowmelt rates might change in North America around the end of the century if climate change continues unabated. "We found a decrease in the total volume of meltwater -- which makes sense given that we expect there to be less snow overall in the future," Musselman said. "But even with this decrease, we found an increase in the amount of water produced at low melt rates and, on the flip side, a decrease in the amount of water produced at high melt rates." While the study did not investigate the range of implications that could come from the findings, Musselman said the impacts could be far-reaching. For example, a reduction in high melt rates could mean fewer spring floods, which could lower the risk of infrastructure damage but also negatively affect riparian ecosystems. Changes in the timing and amount of snowmelt runoff could also cause warmer stream temperatures, which would affect trout and other fish species, and the expected decrease in streamflow could cause shortages in urban water supplies. "We hope this study motivates scientists from many other disciplines to dig into our research so we can better understand the vast implications of this projected shift in hydrologic patterns," Musselman said. 
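The core result - more total water produced at low melt rates and less at high rates, even as total melt volume shrinks - comes down to simple bookkeeping over a snow water equivalent (SWE) time series. A minimal sketch with invented toy SWE series and an illustrative rate threshold, not the study's WRF output or its actual melt-rate bins:

```python
# Toy sketch (illustrative data, not the study's) of partitioning total
# meltwater by melt rate: derive daily melt from SWE losses, then split
# the total into water produced at low vs. high daily melt rates.

RATE_THRESHOLD_MM_DAY = 10.0  # illustrative split between slow and fast melt

def partition_melt(swe_mm):
    """Return (slow-melt total, fast-melt total) in mm from a daily SWE series."""
    slow = fast = 0.0
    for today, tomorrow in zip(swe_mm, swe_mm[1:]):
        melt = max(today - tomorrow, 0.0)  # SWE loss; ignore accumulation days
        if melt <= RATE_THRESHOLD_MM_DAY:
            slow += melt
        else:
            fast += melt
    return slow, fast

# Warmer scenario: melt starts earlier under weaker sun, so the snowpack
# is depleted in many small daily increments.
swe_early = [120, 114, 108, 101, 95, 88, 80, 72, 63, 55, 46, 38, 29, 20, 11, 3, 0]
# Cooler scenario: snow lingers until early summer, then melts rapidly.
swe_late  = [120, 120, 119, 118, 118, 117, 100, 80, 58, 35, 12, 0, 0, 0, 0, 0, 0]

for name, series in (("early/slow", swe_early), ("late/fast", swe_late)):
    slow, fast = partition_melt(series)
    print(f"{name}: {slow:.0f} mm at low rates, {fast:.0f} mm at high rates")
```

The same 120 mm of snowpack ends up almost entirely in the low-rate bin when melt begins early, and almost entirely in the high-rate bin when it holds out until early summer, mirroring the shift the study projects.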
The University Corporation for Atmospheric Research manages the National Center for Atmospheric Research under sponsorship by the National Science Foundation. Any opinions, findings and conclusions, or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
