News Article | March 1, 2017
PRINCETON, N.J. -- An influx of pollution from Asia into the western United States and more frequent heat waves in the eastern U.S. are responsible for the persistence of smog in these regions over the past quarter century despite laws curtailing the emission of smog-forming chemicals from tailpipes and factories, according to a new study. Led by researchers at Princeton University and the National Oceanic and Atmospheric Administration's Geophysical Fluid Dynamics Laboratory (GFDL), the study highlights the importance of maintaining domestic emission controls on motor vehicles, power plants and other industries at a time when pollution is increasingly global. Published March 1 in the journal Atmospheric Chemistry and Physics, the study looked at the sources of smog, also known as ground-level ozone, across a period ranging from the 1980s to today. Ground-level ozone, which is distinct from the ozone in the upper atmosphere that protects the planet from ultraviolet radiation, is harmful to human health, exacerbating asthma attacks and causing difficulty breathing. It also harms sensitive trees and crops. Despite a 50 percent cut in smog-forming chemicals such as nitrogen oxides, commonly known as "NOx", over the past 25 years, ozone levels measured in rural areas of the west have actually climbed. And while ozone in the eastern U.S. has decreased overall, the levels can spike during heat waves. The study traced the increase of ozone in the west to the influx of pollution from Asian countries, including China, North and South Korea, Japan, India, and other South Asian countries. Collectively, the region has tripled its emissions of NOx since 1990. In the eastern U.S., meanwhile, heat waves -- which have become more frequent in the past few decades -- trap polluted air in place, leading to temporary escalations in locally produced ozone. 
The study explains why springtime ozone levels measured in Yellowstone National Park and other western parks far from urban areas have climbed over the past quarter century. According to the study, springtime ozone levels in the national parks rose during that period by 5 to 10 parts per billion (ppb), which is significant given that the federal ozone standard is 70 ppb. The influx of pollution from Asia could make it difficult for these areas to comply with the federal ozone standards, according to the study's authors. "Increasing background ozone from rising Asian emissions leaves less room for local production of ozone before the federal standard is violated," said lead author Meiyun Lin, a research scholar in the Program in Atmospheric and Oceanic Sciences at Princeton University and a scientist at GFDL. Lin's co-authors were Larry Horowitz, also of GFDL; Richard Payton and Gail Tonnesen of the U.S. Environmental Protection Agency; and Arlene Fiore of the Lamont-Doherty Earth Observatory and Department of Earth and Environmental Sciences at Columbia University. Using ozone measurements combined with climate models developed at GFDL, the authors identified pollution from Asia as driving the climb in ozone in western U.S. national parks in the spring, when wind and weather patterns push Asian pollution across the Pacific Ocean. In the summer, when these weather patterns subside, ozone levels in national parks are still above what would be expected given U.S. reductions in ozone precursors. While it has been known for over a decade that Asian pollution contributes to ozone levels in the United States, this study is one of the first to quantify the extent to which rising Asian emissions contribute to U.S. ozone, according to Lin. In the eastern United States, where Asian pollution is a minor contributor to smog, NOx emission controls have been successful at reducing ozone levels. 
However, periods of extreme heat and drought can trap pollution in the region, making bad ozone days worse. Regional NOx emission reductions alleviated the ozone buildup during the recent heat waves of 2011 and 2012, compared to earlier heat waves such as in 1988 and 1999. As heat waves appear to be on the rise due to global climate change, smog in the eastern U.S. is likely to worsen, according to the study. Climate models such as those developed at GFDL can help researchers predict future levels of smog, enabling cost-benefit analyses for costly pollution control measures. The researchers compared results from a model called GFDL-AM3 to ozone measurements from monitoring stations over the course of the last 35 years, from 1980 to 2014. Prior studies using global models poorly matched the ozone increases measured in western national parks. Lin and co-authors were able to match the measurements by narrowing their analysis to days when the airflow is predominantly from the Pacific Ocean. Modeling the sources of air pollution can help explain where the ozone measured in the national parks is coming from, explained Lin. "The model allows us to divide the observed air pollution into components driven by different sources," she said. The team also looked at other contributors to ground-level ozone, such as global methane from livestock and wildfires. Wildfire emissions contributed less than 10 percent and methane about 15 percent of the western U.S. ozone increase, whereas Asian air pollution contributed as much as 65 percent. These new findings suggest that a global perspective is necessary when designing a strategy to meet U.S. ozone air quality objectives, said Lin. 
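The source-division Lin describes can be illustrated with a toy "zero-out" experiment: run a model with all emission sources active, re-run it with one source switched off, and credit the difference to that source. This is only a sketch; the linear stand-in model and every coefficient below are illustrative assumptions, not GFDL-AM3 chemistry.

```python
# Toy zero-out attribution: a source's contribution is the change in
# simulated ozone when that source is switched off. The coefficients are
# made up for illustration; a real chemistry-climate model is nonlinear,
# so contributions would not sum this neatly.
def ozone_ppb(asian_nox=1.0, methane=1.0, wildfire=1.0, domestic=1.0):
    base = 30.0  # ppb from background processes not varied here
    return base + 12.0 * asian_nox + 3.0 * methane + 1.5 * wildfire + 20.0 * domestic

full = ozone_ppb()  # all sources on
for source in ("asian_nox", "methane", "wildfire", "domestic"):
    contribution = full - ozone_ppb(**{source: 0.0})
    print(f"{source}: {contribution:.1f} ppb attributed")
```

The study applies this same general idea with a full chemistry-climate model, which is how it arrives at the roughly 65 / 15 / 10 percent split quoted above.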
The negative effect of imported pollution on the United States' ability to achieve its air quality goals is not wholly unexpected, according to Owen Cooper, a senior research scientist at the University of Colorado and the NOAA Earth System Research Laboratory, who is familiar with the current study but not directly involved. "Twenty years ago, scientists first speculated that rising Asian emissions would one day offset some of the United States' domestic ozone reductions," Cooper said. "This study takes advantage of more than 25 years of observations and detailed model hindcasts to comprehensively demonstrate that these early predictions were right."
News Article | November 30, 2016
The average American’s carbon dioxide emissions are responsible for shrinking Arctic sea ice by nearly 50 square meters each year. That’s the implication of a new study that finds that each additional metric ton of CO₂ released into the atmosphere directly results in a 3-square-meter loss of sea ice cover at summer’s end — comparable to losing a chunk of ice with a footprint a bit smaller than a two-seat Smart car. “For the first time now, it is possible to grasp how each one of us contributes to tangible consequences for the global climate system,” says study coauthor Dirk Notz, a climate scientist at the Max Planck Institute for Meteorology in Hamburg. Globally, humans are responsible for the release of some 36 billion metric tons of CO₂ each year. With another trillion metric tons, the Arctic Ocean will have a completely iceless summer — possibly the first in 125,000 years. That threshold could be crossed before 2050, Notz and Julienne Stroeve of University College London estimate online November 3 in Science. Many previous studies projected that summertime ice would stick around for years longer (SN Online: 8/3/15). “Sea ice feels so substantial when you’re standing on ice that can hold your own weight, that you can land an airplane on,” says Cecilia Bitz, an atmospheric scientist at the University of Washington in Seattle who was not involved in the study. The new work “makes it feel very fragile.” Dwindling ice at the top of the world threatens Arctic species (SN Online: 5/14/08), can spread pollution (SN: 1/23/16, p. 9) and could open the region to transpolar shipping. In 2012, Arctic sea ice hit a record low since satellite observations began: just 3.39 million square kilometers, far below the average of 6.22 million square kilometers set from 1981 through 2010. How quickly the ice will continue to disappear remains unclear. For their estimate, Notz and Stroeve analyzed records of Arctic sea surface temperature and minimum sea ice extent since 1953. 
The average extent of September sea ice declined in lockstep with the rising total amount of CO₂ released from human sources, the researchers found. This simple relationship between emissions and ice loss stems from one similarly straightforward mechanism, the researchers propose. As CO₂ concentrates in the atmosphere, it strengthens the greenhouse effect, sending some heat back to Earth that would otherwise escape into space. This increases the amount of ice-warming infrared radiation hitting the Arctic, causing the outermost edge of the sea ice to retreat northward, where less sunlight hits the planet, and reducing total ice coverage. Climate simulations underestimate this effect and don’t accurately re-create the sensitivity of Arctic sea ice to rising CO₂ levels, the researchers argue. Other factors linked to sea ice loss, such as changes in ocean heat flowing from the Atlantic Ocean and in the region’s reflectiveness, were minor over the studied period compared with the increased radiative heating, Notz says. Downplaying the role of ocean heating is a mistake, says Rong Zhang, an oceanographer at the National Oceanic and Atmospheric Administration’s Geophysical Fluid Dynamics Laboratory in Princeton, N.J. Sea ice coverage peaks in area during winter, when little light shines on the Arctic and the greenhouse effect is less important. But like the minimum, the maximum extent of Arctic sea ice has also declined over recent decades, reaching a record low in March (SN Online: 3/28/16). More observations are needed to determine whether warming from below or above the ice plays a larger role, Zhang says. “There’s not just one explanation,” she says.
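The article's headline figures can be cross-checked with simple arithmetic. In this sketch, the 3-square-meter-per-ton sensitivity, the 36-billion-ton annual global total, and the one-trillion-ton threshold are the figures reported above; the 16-ton U.S. per-capita emission rate is an assumed round value for illustration, not a number from the study.

```python
# Back-of-envelope check of the reported sea-ice numbers.
ICE_LOSS_PER_TON_M2 = 3.0     # m^2 of September sea ice lost per metric ton of CO2
GLOBAL_EMISSIONS_T = 36e9     # metric tons of CO2 emitted worldwide per year
REMAINING_BUDGET_T = 1e12     # additional tons until an essentially ice-free summer

per_capita_us_t = 16.0        # assumed U.S. per-capita emissions, t CO2 per year
ice_lost_per_american_m2 = per_capita_us_t * ICE_LOSS_PER_TON_M2   # ~48 m^2
years_to_threshold = REMAINING_BUDGET_T / GLOBAL_EMISSIONS_T       # ~28 years

print(f"Sea ice lost per average American: ~{ice_lost_per_american_m2:.0f} m^2/yr")
print(f"Years of current emissions to the trillion-ton mark: ~{years_to_threshold:.0f}")
```

The ~48 m² result matches the article's "nearly 50 square meters," and roughly 28 more years of current emissions lands in the mid-2040s, consistent with the before-2050 estimate.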
News Article | September 19, 2016
A new press release has revealed that an ExxonMobil and Princeton University research partnership has finalized its decisions on the 5 projects that will be receiving funding through the joint initiative. Generally, the winners are focusing on organic photovoltaics, battery lifetime and second-life use, Arctic sea-ice modeling, the ocean’s role in future atmospheric carbon dioxide levels, and plasma physics. Most people reading this will probably interpret all of this as greenwashing and nothing else, and you’re probably right — after all, remember the contrast between the newly released emails between the UK government and Shell and those between the UK government and ExxonMobil? Those interested in reading more, though, may appreciate these detailed descriptions of the 5 projects getting access (over a 5-year period) to the $5 million in funding promised by ExxonMobil: Organic Photovoltaics: The objective is to study how new photovoltaic materials, particularly those polymeric in nature, can be applied in forms of coatings and building materials. The project will be led by Lynn Loo, director of the Andlinger Center for Energy and the Environment, the Theodora D. ’78 and William H. Walton III ’74 Professor in Engineering, and professor of chemical and biological engineering. Extending Battery Lifetime and Cycle Efficiency: The project will use diagnostic tools recently developed at Princeton to study degradation pathways of electric-vehicle batteries, and how they might impact follow-on use in applications on the power grid, known as “second life” applications. Research will be led by Daniel Steingart, assistant professor of mechanical and aerospace engineering and the Andlinger Center for Energy and the Environment. Arctic Sea-Ice Modeling: The focus of the project is to advance sea-ice models used for understanding the factors controlling Arctic sea-ice cycles and, consequently, the ability to make reliable seasonal and long-range forecasts for sea-ice formation and melting. 
Research will be conducted at Princeton’s Geophysical Fluid Dynamics Laboratory, a premier institution that has been developing state-of-the-art sea-ice modeling tools for decades. The project will be led by Alistair Adcroft, research oceanographer, and Olga Sergienko, research glaciologist, at the Princeton University Atmospheric and Oceanic Sciences Program/NOAA-Geophysical Fluid Dynamics Laboratory. Role of the Ocean in the Future of Atmospheric Carbon Dioxide Levels: The project’s objective is to gain insight into the future of carbon dioxide uptake by the ocean by reconstructing ocean carbon cycle changes during past periods of warming. Research will be led by Daniel Sigman, Dusenbury Professor of Geological and Geophysical Sciences. Plasma Physics: The project will take advantage of Princeton’s world-leading facilities for studying plasma physics. It will explore low-energy plasmas’ effectiveness in enhancing or controlling energy-related chemical processes, such as converting natural gas to larger molecules for producing liquid fuels or chemical feedstocks. Egemen Kolemen, assistant professor of mechanical and aerospace engineering and the Andlinger Center for Energy and the Environment and the Princeton Plasma Physics Laboratory, is leading this research with Yiguang Ju, Robert Porter Patterson Professor of Mechanical and Aerospace Engineering. This news follows on last month’s news of a similar partnership with the University of Texas at Austin Energy Institute that will see $15 million invested to support the pursuit of “technologies to help meet growing energy demand while reducing environmental impacts and the risk of climate change.” Or so ExxonMobil reps say. It should be remembered that $5 million and $15 million are drops in the bucket for companies like ExxonMobil. Not serious investments. 
News Article | February 15, 2017
Last spring, MIT research scientist C. Adam Schlosser, who serves as deputy director of the MIT Joint Program on the Science and Policy of Global Change, and colleagues published a paper in the journal PLOS One that projected a “high risk of severe water stress” in much of Asia by midcentury. Attributing the projection to rising demands driven by population and economic growth and exacerbated by climate change, they estimated that within 35 years, 1 billion more people in the area would be affected. The region in question is home to about half of the global population, so this finding matters. News outlets from the Christian Science Monitor to TIME picked up the story, disseminating it to millions of potential readers. “The response to this study illustrates the kind of scientific finding that makes people — including decision-makers and other stakeholders — listen and react,” says Schlosser. “We presented not only the science but also its potential impact on people’s lives. That’s a hallmark of the Joint Program.” So, too, is a methodology that underlies not only the Asia water-stress study but much of Schlosser’s research: the practice of running a computer model multiple times under varying assumptions (e.g., about the climate, population growth, or economic growth) to produce an exhaustive range of plausible future scenarios for a particular aspect of global change — such as water availability — and qualify each scenario with a level of uncertainty. In the vernacular, this is known as the application of Monte Carlo methods. By “rolling the dice” hundreds to thousands of times under different assumptions about Earth and human systems, Schlosser and colleagues can determine the odds of outcomes that policymakers are either targeting or trying to prevent. This information can then help guide decision-makers on how best to “weight the dice” to minimize risk to lives and infrastructure. 
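The "rolling the dice" procedure described above can be sketched in a few lines: sample the uncertain drivers, run the model once per sample, and report the fraction of runs that cross a policy-relevant threshold. The toy water-stress function and all parameter ranges below are illustrative assumptions, not values from the Joint Program's IGSM framework.

```python
import random

# Minimal Monte Carlo sketch: each iteration draws one plausible set of
# assumptions (population growth, economic growth, climate response),
# evaluates a toy stand-in "model", and tallies threshold exceedances.
random.seed(42)

def water_stress_index(pop_growth, econ_growth, climate_shift):
    # Toy model: stress rises with demand drivers and with a drier climate.
    return 0.4 + 0.3 * pop_growth + 0.2 * econ_growth + 0.3 * climate_shift

N = 10_000
exceedances = 0
for _ in range(N):
    pop = random.uniform(0.0, 1.0)    # sampled population-growth assumption
    econ = random.uniform(0.0, 1.0)   # sampled economic-growth assumption
    clim = random.gauss(0.5, 0.2)     # sampled climate response
    if water_stress_index(pop, econ, clim) > 1.0:  # "severe stress" threshold
        exceedances += 1

print(f"Estimated odds of severe stress: {exceedances / N:.1%}")
```

The real studies replace the one-line toy with the full IGSM framework, so each "roll" is an expensive simulation, but the logic of turning an ensemble of runs into odds is the same.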
“The challenge with addressing and quantifying risks is to identify the bounds of your knowledge and everything in between, and then to simulate that environment with computer models,” notes Schlosser. “That demands that we not only use models in creative ways but also bring to bear observations that can help us isolate meaningful signals in the results we obtain from those models.” Applying Monte Carlo methods to the Joint Program’s Integrated Global System Modeling (IGSM) framework to simulate the response of Earth and human systems to global change and assess risks that may lie ahead in the coming decades, Schlosser is now working to identify potential threats to regional water supplies and ecosystems, optimal locations for renewable energy generation around the globe, and trends in extreme events and their potential impact on the built environment.

Charting the future of water supplies, renewable energy, and the grid

Having recently upgraded the Water Resource Systems (WRS) model used in the Asia water-stress study — an extension of the IGSM framework — to more precisely represent water-demand sectors (regional watersheds) and the quality of water within them, Schlosser aims to simulate a large number of plausible futures for the U.S. water supply. The goal of his research team is to pinpoint any significant threats to the water system and project when water availability may become severely stressed by changes in the agricultural, energy, industrial, and other sectors of the economy. Over the next two years, he plans to explore the range of risks that different climate pathways pose for the U.S. water system, and how those risks may be avoided through mitigation or adaptation measures, such as efficiency improvements in water use (e.g. irrigation) and transport. He also aims to account for the uncertainty in runoff changes that occur under climate change, and their impact on risks to water demand sectors. 
Another key research objective of Schlosser’s is to determine how regional patterns of precipitation and temperature will impact the deployment of renewable energy technologies such as wind turbines and photovoltaics. As the world shifts away from fossil fuels and toward lower-carbon energy sources, it will become increasingly important to identify the prime locations where wind and solar power can thrive. By enhancing the IGSM framework to generate multiple simulations of wind and clouds on a regional basis, Schlosser aims to provide policymakers with more precise estimates of the times and locations at which wind and solar energy resources will be plentiful and reliable. “In a world where wind and solar farm installations are ubiquitous, it would be very beneficial if the science of climate predictability could tell when and where those fundamentally intermittent resources are the most reliable without constantly relying on backup technologies which are the very same greenhouse gas-emitting technologies we’re trying to avoid in the first place,” he says. Schlosser is also applying Monte Carlo methods to assess the risk to infrastructure posed by extreme weather events that range from storms to heatwaves. He and colleagues first developed a technique that draws upon the Joint Program’s climate model and those used by the institutions that have participated in the Intergovernmental Panel on Climate Change (IPCC) to explore how precipitation extremes shift under various climate policies — and which policies are likely to minimize the likelihood of shifts in extreme precipitation events that threaten infrastructure and livelihoods. In a pilot project conducted in collaboration with the MIT Lincoln Laboratory, they next looked at how human-induced changes in climate affect the occurrence of heatwaves that could damage expensive transformers that are critical to the functioning of the electric power grid in the U.S. Northeast. 
The next step is to expand this analysis and evaluate the grid more comprehensively, so as to provide actionable information for how to make the grid more stable, reliable, and environmentally responsible. “Our approach shrinks down the range of possible outcomes,” says Schlosser. “We’ll never be able to completely eliminate all uncertainty, but there are opportunities to constrain the uncertainty and give people an outlook of the future that we can act upon.” Schlosser came to this work out of a love for snow. Growing up in Rhode Island, he lived for snow days, when he could trade reading, writing, and arithmetic for sledding, skating, and skiing. Over the years, as climate change emerged as a global threat, his affinity for winter storms and activities fueled a growing concern about how winter would change on a warmer planet. That led to an interest in hydrology: Studies of hydrology in graduate school at the University of Maryland, where he received a PhD in meteorology, deepened his focus on winter processes and raised his awareness about the challenges in representing hydrology in climate or earth system models. After completing postgraduate work in climate predictability at NOAA’s Geophysical Fluid Dynamics Laboratory and further research at the Center for Ocean-Land-Atmosphere Studies, he served as a research scientist at the NASA Goddard Space Flight Center, where he developed an ongoing program, the NASA Energy and Water Cycle Study, that uses multiple observations to generate a comprehensive picture of the global water and energy cycle. While Schlosser’s work at Goddard nurtured his scientific curiosity, there was something missing that he would find in his next position at the Joint Program, something that has kept him there for 12 years and counting. 
“Throughout my career, my research has been personally compelling from a scientific discovery standpoint, but there’s nothing like advancing science that can make a substantive contribution to decision-making, strategic planning, and policy formation concerning critical global challenges,” he says. “I never had an appreciation for that until I came here.” A version of this article originally appeared in the Fall 2016 issue of Global Changes, a triennial publication of the MIT Joint Program on the Science and Policy of Global Change.
News Article | February 1, 2016
New research published Monday adds to a body of evidence suggesting that a warming climate may have particularly marked effects for some citizens of the country most responsible for global warming in the first place — namely, U.S. East Coasters. Writing in Nature Geoscience, John Krasting and three colleagues from the Geophysical Fluid Dynamics Laboratory of the National Oceanic and Atmospheric Administration find that “Atlantic coastal areas may be particularly vulnerable to near-future sea-level rise from present-day high greenhouse gas emission rates.” The research adds to recent studies that have found strong warming of ocean waters in the U.S. Gulf of Maine, a phenomenon that is not only upending fisheries but could be worsening the risk of extreme weather in storms like Winter Storm Jonas. “When carbon emission rates are at present day levels and higher, we see greater basin average sea level rise in the Atlantic relative to the Pacific,” says Krasting. “This also means that single global average measures of sea level rise become less representative of the regional scale changes that we show in the study.” In the new research, the scientists used a high-powered climate model based at the Geophysical Fluid Dynamics Laboratory in Princeton, N.J., that simulates the ocean, the atmosphere and the cycling of carbon throughout the Earth system. The goal was to determine how much sea level rise would occur in the Atlantic, versus the Pacific, under a variety of global carbon emissions scenarios. And the simulation found that at high emissions scenarios similar to current rates, the Atlantic sea levels rise considerably faster than the Pacific, with particularly noteworthy impacts for the U.S. East Coast. (Other recent research by scientists with the U.S. 
Geological Survey has suggested this increased rate of sea level rise is already happening — finding sea level rise rates “~ 3–4 times higher than the global average” along a large stretch of the U.S. East Coast, which the researchers dubbed a sea level rise “hotspot.”) The reason for the difference, the researchers say, is that the Atlantic, more than the Pacific, is characterized by a strong “overturning” ocean circulation — technically known as the Atlantic Meridional Overturning Circulation, or AMOC — that spans the north-south length of the globe and ultimately connects waters off New York with those at the tip of Antarctica. This means that waters circulate through the entire Atlantic much faster than they do throughout the Pacific: A “parcel” of water that sinks beneath the surface in the Atlantic will generally make it back to the surface again in 200 to 300 years, versus about three times as long for the Pacific, Krasting explains. For this reason, scientists sometimes say that Atlantic waters are “younger” than Pacific waters. Another way of putting it is that the Atlantic waters “ventilate” more, plunging from the surface to great depths before eventually making their way back to the surface again. But if this circulation slows due to climate change, the study finds, less cold water will dive to ocean depths in the North and far South Atlantic (technically called “deep water formation”), leading to warmer water pooling below the surface and, ultimately, greater warming overall. “The average temperature of the basin actually goes up, because you’re not bringing that cool water,” says Krasting. Warm water expands, and that’s the cause of the sea level rise expected in the study. 
Indeed, the research finds that with global emissions rates of carbon greater than 5 gigatons (or billion metric tons) per year — current emissions from industry are about 10 gigatons per year — the Atlantic circulation would slow down considerably (and sea level rise would increase there more than in the Pacific). That could be bad news for the U.S. “The patterns of sea level rise that we show tend to show enhanced sea level rise along the U.S. east coast, and that is associated with, or is consistent with the weakening of the overturning circulation that we demonstrate in the study,” says Krasting. It’s important to note that sea level rise shows regional variations based on topography and other factors – so the change won’t be uniform even within this particular geographical area. “Local effects such as land subsidence, changes in offshore winds, and ocean circulation changes (that is, AMOC) lead to sea levels that are rising faster than the global average along the US East Coast,” the study says. “Our results suggest that higher carbon emission rates also contribute to increased [sea level rise] in this region compared to the global average.” This isn’t the first study to point to reasons why the Atlantic off the U.S. East Coast could be a particular climate change “hotspot,” as another NOAA researcher, Vincent Saba, put it to me recently. A recent study by Saba and a group of NOAA researchers found that global warming, by slowing down the Atlantic ocean’s circulation, would lead to warmer waters off the East Coast. This change in circulation appears to shift the warm waters of the Gulf Stream northward, allowing them to enter the Gulf of Maine, which in recent years has seen dramatic warming. And it’s not just warmer seas, but higher ones as well. For instance, one recent study found that in 2009-2010, there was a sudden “extreme” sea level spike off the U.S. East Coast. 
Seas rose 4 inches all of a sudden thanks to an apparently abrupt change in the Atlantic’s circulation. (Independent NASA research using satellite measurements also found that the circulation slowed during this time period). In other words, it’s a consistent picture — slowing down the ocean circulation means more warmth and more expansive Atlantic waters near the U.S. And as if that’s not enough, there’s another factor that could also punish the U.S. with extra sea level rise: Gravity. Right now, Antarctica is so massive that it draws the ocean toward it. Sea level slopes upward toward this largest ice mass on Earth, meaning that there is less ocean elsewhere, and particularly off of U.S. shores, than there might be otherwise. But there are fears that Antarctica could be on course to lose a sizeable chunk of ice — roughly 3.3 meters or 11 feet worth, measured in globally averaged sea level rise terms, from West Antarctica. If that happens, a substantial part of the ocean would flow back toward the rest of the globe, due to the lessened gravitational pull. And researchers have shown that this would produce a disproportionately high sea level rise for the U.S. East Coast — and indeed, in this case, for the West Coast as well. Instead of a mere 11 feet of sea level rise (assuming a total loss of West Antarctica, of course), the east might get more like 14. So in sum: Researchers are homing in on one regional consequence of climate change, occurring at the intersection of changes in the ocean and the atmosphere. And the upshot is that right off of the U.S. East Coast is where some of the most consequential changes could occur.
News Article | November 21, 2016
BOULDER, Colo. -- If society continues to pump greenhouse gases into the atmosphere at the current rate, Americans later this century will have to endure, on average, about 15 daily maximum temperature records for every time that the mercury notches a record low, new research indicates. That ratio of record highs to record lows may turn out to be much higher if the pace of emissions increases and produces even more warming, according to the study led by scientists at the National Center for Atmospheric Research (NCAR). "More and more frequently, climate change will affect Americans with record-setting heat," said NCAR senior scientist Gerald Meehl, lead author of the new paper. "An increase in average temperatures of a few degrees may not seem like much, but it correlates with a noticeable increase in days that are hotter than any in the record, and nights that will remain warmer than we've ever experienced in the past." The 15-to-1 ratio of record highs to lows is based on temperatures across the continental United States increasing by slightly more than 3 degrees Celsius (5.4 degrees Fahrenheit) above recent years, which is about the amount of warming expected to occur with the current pace of greenhouse gas emissions. Over the last decade, in contrast, the ratio of record high temperatures to record lows has averaged about two to one. The new research appears next week in the "Proceedings of the National Academy of Sciences." It was funded by the Department of Energy (DOE) and the National Science Foundation (NSF), which is NCAR's sponsor. The study was co-authored by NCAR scientist Claudia Tebaldi and by Dennis Adams-Smith, a scientist previously at Climate Central and now at the National Oceanic and Atmospheric Administration's Geophysical Fluid Dynamics Laboratory. 
In a 2009 study, Meehl and colleagues found that the ratio of record daily high temperatures to record daily low temperatures has steadily increased since the 1970s as average temperatures over the United States have warmed. Computer models at that time indicated that the ratio could continue to increase during this century, although the research team looked into just one scenario of future emissions. The scientists also found that the models were overstating the ratio of record highs to record lows in recent years compared to observations. By digging further into the issue and analyzing why the models differed from observations, Meehl and his co-authors have now produced a better calibrated projection of future record-breaking daily highs across the U.S. They based their projections on the average temperature increase over the continental United States, rather than on a particular scenario of future emissions. By about 2065, for example, U.S. temperatures will rise by an average of slightly more than 3 degrees C (5.4 degrees F) if society maintains a "business as usual" increase in the emission of greenhouse gases. Under such a scenario, the ratio of record daily high temperatures to record daily lows will likely be about 15 to 1, although it could range anywhere from 7 to 1 up to 22 to 1, the study found. If temperatures increase even more this century, the ratio of record highs to record lows will jump substantially. For example, if temperatures climb more than 4 degrees C (7.2 degrees F), Americans could experience about 38 record highs for every record low. Such an outcome could occur if society does not make any efforts to mitigate the production of greenhouse gases. "Every degree of warming makes a substantial amount of difference, with the ratio of record highs to record lows becoming much greater," Meehl said. 
"Even with much warmer temperatures on average, we will still have winter and we will still get record cold temperatures, but the numbers of those will be really small compared to record high maximums." If temperatures were not warming, Meehl said, the ratio of record highs to record lows would average out to about one to one. Instead, record high temperatures have already become a common occurrence in much of the country. The ratio of record highs to lows has averaged about 2 to 1 over the first decade of the 21st century, but there is considerable year-to-year variation. The ratio was about 5 to 1 in 2012, dropping to about 1 to 1 in 2013 and 2014, then almost 3 to 1 in 2015. The unusual warmth of 2016, resulting from both climate change and natural patterns such as El Niño, has led to 24,519 record daily maximums vs. 3,970 record daily minimums -- a ratio of about 6 to 1. A key part of the study involved pinpointing why the models in the 2009 study were simulating somewhat more daily record high maximum temperatures compared with recent observations, while there was good agreement between the models and the observed decreases in record low minimums. The authors focused on two sets of simulations conducted on the NCAR-based Community Climate System Model (version 4), which is funded by DOE and NSF and developed by climate scientists across the country. Their analysis uncovered two reasons for the disparity between the computer models and observations. First, the models tended to underestimate precipitation. Because the air is cooled by precipitation and resulting evapotranspiration -- the release of moisture from the land and plants back to the atmosphere -- the tendency of the computer models to create an overly dry environment led to more record high temperatures. Second, the original study in 2009 only went back to the 1950s. 
For the new study, the research team also analyzed temperatures in the 1930s and 1940s, which is as far back as accurate recordkeeping will allow. Because the Dust Bowl days of the 1930s were unusually warm, with many record-setting high temperatures, the scientists found that it was more difficult in subsequent years to break those records even as temperatures warmed. However, even taking the warm 1930s into account, the model-simulated and observed ratio of record highs to record lows has been increasing. "The steady increase in the record ratio is an immediate and stark reminder of how our temperatures have been shifting and continue to do so, reaching unprecedented highs and fewer record lows," said Tebaldi. "These changes pose adaptation challenges to both human and natural systems. Only a substantial mitigation of greenhouse gas emissions may stop this increase, or at least slow down its pace." Journal: Proceedings of the National Academy of Sciences, DOI: 10.1073/pnas.1606117113 The University Corporation for Atmospheric Research manages the National Center for Atmospheric Research under sponsorship by the National Science Foundation. Any opinions, findings and conclusions, or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
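The one-to-one baseline for a stationary climate, and the way a warming trend tilts it, can be illustrated with a toy simulation. This is not the study's method (the authors used CCSM4 simulations and station observations); the station count, noise level, and trend size below are arbitrary assumptions chosen only to show the record-counting logic.

```python
# Toy illustration: counting new all-time highs and lows in synthetic annual
# temperature series. A stationary climate gives a ratio near 1:1; adding a
# warming trend pushes the ratio of record highs to record lows well above 1.
import numpy as np

rng = np.random.default_rng(42)

def record_counts(series):
    """Count new all-time highs and lows set after the first year."""
    highs = lows = 0
    running_max = running_min = series[0]
    for t in series[1:]:
        if t > running_max:
            highs, running_max = highs + 1, t
        if t < running_min:
            lows, running_min = lows + 1, t
    return highs, lows

n_stations, n_years = 2000, 100
noise = rng.normal(0.0, 1.0, size=(n_stations, n_years))  # stationary climate
trend = np.linspace(0.0, 2.0, n_years)                    # warming: +2 sigma per century

flat_counts = [record_counts(row) for row in noise]
warm_counts = [record_counts(row) for row in noise + trend]

ratio_flat = sum(h for h, _ in flat_counts) / sum(l for _, l in flat_counts)
ratio_warm = sum(h for h, _ in warm_counts) / sum(l for _, l in warm_counts)

print(f"stationary ratio: {ratio_flat:.2f}, warming ratio: {ratio_warm:.2f}")
```

The stationary case hovers near 1:1 because, absent a trend, any given year is equally likely to set a high or a low record (and records of either kind become rarer as the series lengthens, at roughly a 1/n rate).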
Seager R., Lamont-Doherty Earth Observatory |
Naik N., Lamont-Doherty Earth Observatory |
Vecchi G.A., Geophysical Fluid Dynamics Laboratory
Journal of Climate | Year: 2010
The mechanisms of changes in the large-scale hydrological cycle projected by 15 models participating in the Coupled Model Intercomparison Project phase 3 and used for the Intergovernmental Panel on Climate Change's Fourth Assessment Report are analyzed by computing differences between the periods 2046–2065 and 1961–2000. The contributions to changes in precipitation minus evaporation, P-E, caused thermodynamically by changes in specific humidity, dynamically by changes in circulation, and by changes in moisture transports by transient eddies are evaluated. The thermodynamic and dynamic contributions are further separated into advective and divergent components. The nonthermodynamic contributions are then related to changes in the mean and transient circulation. The projected change in P-E involves an intensification of the existing pattern of P-E with wet areas [the intertropical convergence zone (ITCZ) and mid- to high latitudes] getting wetter and arid and semiarid regions of the subtropics getting drier. In addition, the subtropical dry zones expand poleward. The accentuation of the twentieth-century pattern of P-E is in part explained by increases in specific humidity via both advection and divergence terms. Weakening of the tropical divergent circulation partially opposes the thermodynamic contribution by creating a tendency to decreased P-E in the ITCZ and to increased P-E in the descending branches of the Walker and Hadley cells. The changing mean circulation also causes decreased P-E on the poleward flanks of the subtropics because the descending branch of the Hadley cell expands and the midlatitude meridional circulation cell shifts poleward. Subtropical drying and poleward moistening are also contributed to by an increase in poleward moisture transport by transient eddies. The thermodynamic contribution to changing P-E, arising from increased specific humidity, is almost entirely accounted for by atmospheric warming under fixed relative humidity. 
© 2010 American Meteorological Society.
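For readers unfamiliar with this kind of decomposition, the vertically integrated moisture budget behind it can be sketched as follows. The notation is the common one for such analyses, paraphrased here rather than copied from the paper, which further splits the first two terms into advective and divergent parts:

```latex
% Change in P - E between two periods, split into a thermodynamic term
% (humidity change, circulation fixed), a mean-circulation dynamic term
% (circulation change, humidity fixed), and a transient-eddy term:
\rho_w g\,\delta(P - E) \approx
  -\nabla\cdot\int_0^{p_s} \bar{\mathbf{u}}\,\delta\bar{q}\,dp      % thermodynamic
  -\nabla\cdot\int_0^{p_s} \delta\bar{\mathbf{u}}\,\bar{q}\,dp      % dynamic
  -\nabla\cdot\int_0^{p_s} \delta\overline{\mathbf{u}'q'}\,dp       % transient eddies

% The closing statement of the abstract corresponds to Clausius-Clapeyron
% scaling of the thermodynamic term at fixed relative humidity:
\delta\bar{q} \approx \alpha\,\bar{q}\,\delta T,
\qquad \alpha = \frac{L_v}{R_v T^2} \approx 0.07\ \mathrm{K^{-1}}
```

Here $\bar{\mathbf{u}}$ and $\bar{q}$ are the time-mean wind and specific humidity, primes denote transient departures, and $\rho_w$ is the density of water; the $\approx 7\%$ per kelvin humidity increase is what makes wet regions wetter and dry regions drier when the circulation is held fixed.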
News Article | March 7, 2016
Using records of ships wrecked by Atlantic hurricanes dating as far back as the days of Christopher Columbus, researchers have extended the hurricane record by hundreds of years. The work reveals that hurricane frequency plummeted 75 percent between 1645 and 1715, a time called the Maunder Minimum when the sun dimmed to its lowest recorded brightness. “We didn’t go looking for the Maunder Minimum; it just popped out of the data,” says study coauthor Valerie Trouet, a paleoclimate scientist at the University of Arizona in Tucson. The findings should help scientists better predict how hurricanes will behave under climate change, the researchers report in a paper to appear in the Proceedings of the National Academy of Sciences. Detailed hurricane observational records go back to 1851. Scouring an Atlantic shipwreck catalog, Trouet and colleagues identified more than 650 Spanish ships sunk by hurricanes from 1495 through 1825. The researchers bridged the shipwreck and observational records using tree rings from slash pines (Pinus elliottii) collected along the Florida coast and dating to as early as 1707. Hurricane damage stunts tree growth, narrowing the annual rings. All three records agreed, allowing the researchers to stitch together one long hurricane frequency record. The number of hurricane-caused shipwrecks during the Maunder Minimum, which makes up a large portion of a period nicknamed the “Little Ice Age,” was less than a third the number of wrecks in the preceding decades. A hurricane slowdown during the solar dim period makes sense, Trouet says. Warm seawater fuels hurricanes. As temperatures dropped around the Maunder Minimum, less heat was available to power storms. The finding doesn’t mean that global warming will increase hurricane frequency, says Gabriel Vecchi, an oceanographer at the National Oceanic and Atmospheric Administration’s Geophysical Fluid Dynamics Laboratory in Princeton, N.J. 
While both solar brightness and heat-trapping greenhouse gases cause warming, their effects on hurricanes “aren’t perfect analogs,” he says. Still, the new data can provide a test for climate simulations, Vecchi says. “We can ask a model, ‘When we give you less sun, what do you do?’ If it doesn’t give us fewer hurricanes, we can then ask why. This gives us something to aim at.”
News Article | September 7, 2016
The 7.1 trillion gallons of torrential rainfall that a storm dumped on Louisiana in August, leading to a flooding disaster that killed at least 13 people and caused more than $1 billion in damage, had a direct link to human-caused global warming, a new study finds. In short, while these events are still rare — on the order of about once every 1,000 years at the local level — they used to be far more infrequent, the study, released Wednesday, concludes. Scientists studying the storm using readings from rain gauges as well as powerful computer models have determined just how unlikely the rains were, teasing out the component of the event that was likely human-caused. Like many other studies examining extreme events in recent years, the new research, which has been submitted to the online journal Hydrology and Earth System Sciences Discussions, points to the conclusion that global warming is an increasingly influential player when it comes to tipping the odds in favor of extreme events. This is particularly the case when it comes to extreme heat and heavy precipitation events, the researchers say. So much rain fell during the three-day deluge from August 12 to 14 — 25.5 inches in Livingston, Louisiana, near Baton Rouge — that rivers swelled to several feet above their all-time historical markers. The Amite River in Magnolia, Louisiana, for example, rose to 58.56 feet, more than 6 feet above the old record set in 1977. "We find that the probability of a comparable event to occur anywhere in the region has increased over time and that this increase is caused by climate change. The increase of odds is at least 40 percent, most likely a doubling (100 percent)," Karin van der Wiel, lead author of the study, told Mashable in an email. 
"That means that what is a 1-in-30 year event now, used to be a 1-in-50 year event in preindustrial times (1900)," added van der Wiel, who is a research scientist at the NOAA Geophysical Fluid Dynamics Laboratory in Princeton, New Jersey. The new study is part of a "rapid climate attribution" push by many climate scientists to speed up the time it takes to inform the public and policymakers about the role that global warming may have played in a newsworthy extreme weather event. The World Weather Attribution project is an international effort, and this study enlisted the help of several scientists, such as van der Wiel, who are not formally part of this group. The study involved performing a statistical analysis of rainfall observations and using high-resolution climate models to simulate the odds of such a heavy rainfall event with and without global warming. Specifically, the study examined how the probabilities have changed between the early part of the 20th century and the early part of the current century. Bolstering the scientists' conclusions, the modeling data and the statistical analysis showed similar results. The odds of torrential rainfall amounting to more than 2 feet within three days in a specific town such as Livingston is closer to 1-in-450 years or possibly over 1-in-1,000 years, van der Wiel said. However, the study looked closely at the changing regional risk of an event like this, since climate models are not capable of simulating conditions over such small areas. Studies like these may not provide comfort or answers to people still cleaning up in the aftermath of these types of storms. However, the changing odds for these events will affect insurance companies and the rates they set, as well as local and state government policies in the future as climate change increases the chances that storms such as these will occur. 
The study did not analyze the weather system that caused the event itself to see how common such slow-moving hybrid tropical low pressure areas are, and whether the odds of such systems are changing. That, van der Wiel says, would be a far more complicated task. "We haven't separated large rainfall events into those related to tropical cyclones, frontal systems or stationary lows as in August 2016. That would be a very interesting topic for further study," she told Mashable. The study will undergo an online, open peer review process on the journal's website.
News Article | April 8, 2016
2006: Freshwater flowing into the North Atlantic could shut down the ocean conveyor belt that shuttles warm water toward Western Europe. 2016: The ocean conveyor belt may already be slowing, but it’s not much of a conveyor belt at that. Last year may have been Earth’s hottest on record (SN: 2/20/16, p. 13). But for one small corner of the globe, 2015 was one of the coldest. Surface temperatures in the subpolar North Atlantic have chilled in recent years and, oddly enough, some research suggests global warming is partly responsible. An influx of freshwater from melting glaciers and increasing rainfall can slow — and possibly even shut down — the ocean currents that ferry warm water from the tropics to the North Atlantic. About 10 years ago, scientists warned of a possible abrupt shutdown of this “ocean conveyor belt.” After years of closely monitoring Earth’s flowing oceans, researchers say a sudden slowdown isn’t in the cards. Some researchers report that they may now be seeing a more gradual slowing of the ocean currents. Others, meanwhile, have discovered that Earth’s ocean conveyor belt may be less of a sea superhighway and more of a twisted network of side roads. The consequences of a sea current slowdown won’t be anywhere near as catastrophic as the over-the-top weather disasters envisioned in the 2004 film The Day After Tomorrow, says Stephen Griffies, a physical oceanographer at NOAA’s Geophysical Fluid Dynamics Laboratory. “The doomsday scenario is overblown, but the possibility of a slowing down of the circulation is real and will have important impacts on Atlantic climates,” Griffies says. The Atlantic mixing that feeds the currents is powered by differences in the density of seawater. In the simple ocean conveyor-belt model, warm, less-dense surface water flows northward into the North Atlantic. Off Greenland, cold, denser water sinks into the deep ocean and flows southward. 
This heat exchange, known as the Atlantic overturning circulation, helps keep European cities warmer than their counterparts elsewhere in the world. Ten years ago, scientists knew from past changes in Earth’s climate that temperature shifts can disrupt this density balance. Freshwater from the shrinking Greenland ice sheet and increased rainfall make the North Atlantic waters less dense and therefore less likely to sink. Investigations into Earth’s ancient climates show that the overturning circulation weakened around 12,800 years ago, probably causing cooling in Europe and sea level rise along North America’s East Coast, as piled-up water in the north sloshed southward. Tracking sea surface temperatures, researchers reported last year that the Atlantic overturning circulation significantly slowed during the 20th century, particularly after 1970. Comparing the recent slowdown with past events, the researchers reported in March in Nature Climate Change that the rapid weakening of the circulation is unprecedented in the last 1,000 years. That result isn’t the final word, though, says Duke University physical oceanographer Susan Lozier. Scientists have directly measured the speed of the ocean circulation only since the deployment of a network of ocean sensors in 2004. Earlier Atlantic circulation speed changes have to be gleaned from less reliable indirect sources such as sea surface temperature changes. “If you look at the most recent results, there’s a decline, yes,” she says. “But we can’t say that’s part of a long-term trend right now.” And effects on Europe’s climate could be masked by other factors. Another challenge is that over the last 10 years, “the ocean conveyor-belt model broke,” Lozier said in February at the American Geophysical Union’s Ocean Sciences Meeting in New Orleans. From 2003 through 2005, she and colleagues tracked the movements of 76 floating markers dropped into the North Atlantic and pulled around by ocean waters. 
If the model were right, these markers should have traveled along the southward-flowing part of the conveyor belt. Instead, the markers moved every which way, the researchers reported in 2009 in Nature. "We went from this simple ribbon of a conveyor belt to a complex flow field with multiple pathways," Lozier says. Determining past and possible future effects of climate change on ocean currents will require more measurements and a better understanding of how the ocean truly flows, she says. Even if the overturning circulation cuts out completely, the resulting cooling effect will probably be short-lived, Griffies says. "At some point, even if the circulation collapses, it would only be 10 or 20 years before the global warming signal would overwhelm that cooling" in Europe, he says. "This is not going to save us from a warmer planet."