Swingedouw D.,French National Center for Scientific Research |
Rodehacke C.B.,Danish Meteorological Institute (DMI) |
Rodehacke C.B.,Max Planck Institute for Meteorology |
Olsen S.M.,Danish Meteorological Institute (DMI) |
And 5 more authors.
Climate Dynamics | Year: 2015
Large uncertainties exist concerning the impact of Greenland ice sheet melting on the Atlantic meridional overturning circulation (AMOC) in the future, partly due to the differing sensitivity of the AMOC to freshwater input in the North Atlantic across climate models. Here we analyse five projections from different coupled ocean–atmosphere models with an additional 0.1 Sv (1 Sv = 10⁶ m³/s) of freshwater released around Greenland between 2050 and 2089. We find on average a further weakening of the AMOC at 26°N of 1.1 ± 0.6 Sv, representing an additional 27 ± 14 % weakening in 2080–2089 on top of the weakening relative to 2006–2015 caused by the external forcing alone. This weakening is lower than what has been found with the same ensemble of models in an identical experimental set-up but under recent historical climate conditions. This lower sensitivity in a warmer world is explained by two main factors. First, a tendency toward decoupling between the surface and the deep ocean, caused by increased thermal stratification in the North Atlantic under global warming. This induces a shoaling of deep-ocean ventilation through convection, so that only intermediate levels are ventilated. The second important effect concerns the so-called Canary Current freshwater leakage: a process by which freshwater additionally released in the North Atlantic leaks along the Canary Current and escapes the convection zones towards the subtropical area. This leakage increases in a warming climate, as a consequence of decreasing gyre asymmetry due to changes in Ekman pumping. We suggest that these modifications are related to the northward shift of the jet stream in a warmer world. For these two reasons the AMOC is less susceptible to freshwater perturbations (near the deep water formation sites) in the North Atlantic than under recent historical climate conditions. 
Finally, we propose a bilinear model that accounts for the two former processes and gives a conceptual explanation of the decreasing AMOC sensitivity to freshwater input. Within the limits of this bilinear model, we find that 62 ± 8 % of the reduction in sensitivity is related to the changes in gyre asymmetry and freshwater leakage and 38 ± 8 % is due to the reduction in deep ocean ventilation associated with the increased stratification in the North Atlantic. © 2014, Springer-Verlag Berlin Heidelberg.
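The bilinear idea above can be illustrated with a minimal numerical sketch. This is not the authors' actual formulation: the variable names, the multiplicative form, and all the numbers below are illustrative assumptions only, chosen to show how two independently weakening factors combine into a reduced overall sensitivity.

```python
# Conceptual sketch of a bilinear AMOC-sensitivity model: the AMOC response
# to a freshwater perturbation is taken to scale with (a) the fraction of the
# perturbation that actually reaches the convection zones (i.e. is not lost
# via Canary Current leakage) and (b) the efficiency with which surface
# anomalies are communicated to the deep ocean (ventilation depth).
# All numbers are illustrative, not taken from the paper.

def amoc_sensitivity(retained_fraction, ventilation_efficiency, k=1.0):
    """Sv of AMOC weakening per Sv of freshwater added (illustrative)."""
    return k * retained_fraction * ventilation_efficiency

# "Historical" state: little leakage, deep convection
s_hist = amoc_sensitivity(retained_fraction=0.9, ventilation_efficiency=0.9)
# "Warm" state: more leakage, shallower ventilation
s_warm = amoc_sensitivity(retained_fraction=0.6, ventilation_efficiency=0.7)

reduction = 1 - s_warm / s_hist  # fractional loss of sensitivity, ~48% here
```

Because the model is multiplicative, the total reduction can then be partitioned between the leakage term and the ventilation term, which is the kind of attribution (62 % vs. 38 % in the paper) the abstract reports.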
News Article | October 30, 2016
There is an interesting news article ($) in Science this week by Paul Voosen on the increasing amount of transparency on climate model tuning. (Full disclosure: I spoke to him a couple of times for this article and I’m working on a tuning description paper for the US climate modeling centers). The main points of the article are worth highlighting here, even if a few of the characterizations are slightly off. The basic thrust of the article is that climate modeling groups are making significant efforts to increase the transparency and availability of model tuning processes for the next round of intercomparisons (CMIP6). This partly stems from a paper from the MPI-Hamburg group (Mauritsen et al, 2012), which was perhaps the first article to concentrate solely on the tuning process and the impact that it has on important behaviour of the model (such as its sensitivity to increasing CO2). That isn’t to say that details of tunings were not discussed previously, but the tendency was to describe them briefly in the model description papers (such as Schmidt et al. (2006) for the GISS model). Some discussion has appeared in IPCC reports too (h/t Gareth Jones), but not in much depth. Thus useful information was hard to collate and compare across all model groups, and it turns out that this matters. For instance, if an analysis of the model ensemble tries to weight models based on some measure of their skill compared to observations, it is obviously important to know whether a model group tuned their model to achieve a good result or whether it arose naturally from the basic physics. In a more general sense this relates to whether “data accommodation” improves a model’s predictive skill or not. 
This is quite subtle though – weather forecast models obviously do better if they have initial conditions that are closer to the observations, and one might argue that for particular climate model predictions that are strongly dependent on the base climatology (such as for Arctic sea ice) tuning to the climatology will be worthwhile. The nature of the tuning also matters: allowing an uncertain parameter to vary within reasonable bounds and picking the value that gives the best result is quite different to inserting completely artificial fluxes to correct for biases. Both have been done historically, but the latter is now much rarer. A recent summary paper in BAMS (Hourdin et al., 2016) discussed current practices and gave results from a survey of the modeling groups. In that survey, it was almost universal that groups tuned for radiation balance at the top of the atmosphere (usually by adjusting uncertain cloud parameters), but there is a split on practices like using flux corrections (2/3rds of groups disagreed with that). This figure gives some more details: The Science article though does make some claims that I don’t think are correct. I assume these are statements that are paraphrases from scientists that the writer talked to, but they would have been better as quotes, as opposed to generalisations. For instance, the article claims that This is, I think, demonstrably untrue, since tuning has been discussed widely in papers including here on RealClimate. Perhaps it does reflect some people’s opinion, but it is not true generally. The targets for tuning vary across groups, and again, it matters which you pick. Tuning to the seasonal cycle, or to the climatological average, or to the variance of some field – which can be well characterised from observations – is different to tuning to a transient change over time – which is often less well known. 
Indeed, many groups specifically leave transient changes out of their tuning procedures in order to maintain those trends for out-of-sample evaluation of the model (approximately half the groups according to the Hourdin et al survey). The article says something a little ambiguous on this: Does that mean the global mean surface temperature trends over the 20th Century, or just that some 20th Century data is used? And what does ‘precisely’ mean in this context? The spread of 20th Century trends (1900-1999) in the CMIP5 simulations [0.25, 1.17] °C is clearly too broad to be the result of precisely tuning anything! On a similar issue, the article contains an example of the MPI-Hamburg model being tuned to avoid a 7 °C sensitivity. That is probably justified since there is plenty of evidence to rule out such a high value, but tuning to a specific value (albeit within the nominal range of 2 to 4.5 °C) is not justified. My experience is that most groups do not ‘precisely’ tune their models to 20th Century trends or climate sensitivity, but given this example and the Hourdin results, more clarity on exactly what is done (whether explicitly or implicitly) is needed. It would be worrying if the centers didn’t discuss tuning in the science literature through fear of commercial rivals, and I don’t think this really characterises the Hadley Centre position. Some groups’ code (incl. the Hadley Centre) is, however, restricted for various reasons, though I personally see that as an unsustainable position in the long term if groups want to partake in international model intercomparisons that will be used for public policy. The article ends on an interesting note: I think this is exactly right. We should be using alternate tunings to expand the representation of structural uncertainty in the ensemble, and I hope many of the groups will take this opportunity to do so.
Mahowald N.,Cornell University |
Lindsay K.,U.S. National Center for Atmospheric Research |
Rothenberg D.,Cornell University |
Doney S.C.,Woods Hole Oceanographic Institution |
And 4 more authors.
Biogeosciences | Year: 2011
Coupled carbon-climate simulations are an essential tool for predicting the impact of human activity on climate and biogeochemistry. Here we incorporate prognostic desert dust and anthropogenic aerosols into the CCSM3.1 coupled carbon-climate model and explore the resulting interactions with climate and biogeochemical dynamics through a series of transient anthropogenic simulations (20th and 21st centuries) and sensitivity studies. The inclusion of prognostic aerosols into this model has a small net global cooling effect on climate but does not significantly impact the globally averaged carbon cycle; we argue that this is likely because the CCSM3.1 model has a small climate feedback onto the carbon cycle. We propose a mechanism for including desert dust and anthropogenic aerosols into a simple carbon-climate feedback analysis to explain the results of our and previous studies. Inclusion of aerosols has statistically significant impacts on regional climate and biogeochemistry, in particular through the effects on the ocean nitrogen cycle and primary productivity of altered iron inputs from desert dust deposition. © 2011 Author(s).
Kottayil A.,Lulea University of Technology |
John V.O.,Met Office Hadley Centre |
Buehler S.A.,Lulea University of Technology
Journal of Geophysical Research: Atmospheres | Year: 2013
Microwave humidity measurements from polar orbiting satellites are affected by diurnal sampling biases which are caused by changes in the local observation time of the satellites. The long-term data records available from these satellites thus have spurious trends, which must be corrected. Diurnal cycles of the microwave measurements have been constructed by combining data over the period 2001-2010 from five different satellite platforms (NOAA-15, -16, -17, -18, and MetOp-A). This climatological diurnal cycle has been used to deduce and correct the diurnal sampling bias in Advanced Microwave Sounding Unit-B and microwave humidity sounder measurements. Diurnal amplitudes for channels which are sensitive to surface temperature variations show a sharp land-sea contrast, with amplitudes exceeding 10 K for land regions but less than 1 K for oceanic regions. The humidity channels sensitive to the upper and middle troposphere exhibit a seasonal variation with large diurnal amplitudes over convective land regions (often above 3 K) in comparison to oceanic regions. The diurnal peak times of these channels over land occur in the early mornings. The diurnal sampling bias correction has a greater impact over land regions than over oceanic regions due to the large diurnal amplitudes over land. The diurnal cycle of humidity generated as part of this study could be used to evaluate diurnal cycles in climate models. Key Points: a climatological diurnal cycle of microwave humidity measurements; a greater impact of the diurnal correction over land regions than over ocean; a plausible trend in surface temperature after diurnal correction. © 2012 American Geophysical Union. All Rights Reserved.
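The kind of correction the abstract describes can be sketched numerically. This is a simplified illustration, not the paper's method: a single-harmonic diurnal cycle and the drift rate, amplitude, and peak time below are all assumed for demonstration.

```python
import numpy as np

# Illustrative sketch of a diurnal-sampling-bias correction: given a known
# climatological diurnal cycle (here one cosine harmonic, an assumption for
# simplicity), remove the part of an apparent trend that is due only to the
# satellite's drifting local observation time.

def diurnal_cycle(local_hour, amplitude, peak_hour):
    """Brightness-temperature anomaly (K) at a given local solar hour."""
    return amplitude * np.cos(2 * np.pi * (local_hour - peak_hour) / 24.0)

# A satellite whose local observation time drifts from 14:00 to ~17:00
# over ten years samples different phases of the diurnal cycle, creating
# a spurious trend even if the climate is unchanged:
years = np.arange(10)
obs_hour = 14.0 + 0.3 * years                    # drifting observation time
sampled = diurnal_cycle(obs_hour, amplitude=5.0, peak_hour=15.0)

# With the climatological cycle known, every measurement can be referenced
# to a fixed local time (here 14:00), removing the sampling bias:
corrected = sampled - diurnal_cycle(obs_hour, 5.0, 15.0) \
                    + diurnal_cycle(14.0, 5.0, 15.0)
```

After correction, the series is constant, as it should be for an unchanged climate; any residual trend in real data would then be a candidate climate signal rather than a sampling artifact.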
Morgenstern O.,NIWA - National Institute of Water and Atmospheric Research |
Zeng G.,NIWA - National Institute of Water and Atmospheric Research |
Abraham N.L.,University of Cambridge |
Telford P.J.,University of Cambridge |
And 5 more authors.
Journal of Geophysical Research: Atmospheres | Year: 2013
Using a stratosphere-troposphere chemistry-climate model, we compare the impacts of climate change, stratospheric ozone recovery, and methane increases on surface ozone and the tropospheric oxidizing capacity by 2050. Methane increases lead to decreasing OH, particularly in the northern subtropics during summer. Stratospheric ozone recovery causes small increases of surface OH driven by increased stratosphere-troposphere exchange, occurring during parts of the year in the southern extratropics. Tropospheric warming is also associated with increasing OH, maximizing in the Northern Hemisphere in northern summer. In combination, OH is anticipated to decrease by approximately 8% in the tropospheric average by 2050 in the scenario considered here. In conjunction with these changes to OH, we model substantial changes in surface ozone in both hemispheres. Methane increases alone will lead to increasing surface ozone by up to 2-3 ppbv in the zonal mean, maximizing around 30°N. This increase is exacerbated during austral winter when increased stratosphere-troposphere flux of ozone causes an increase in surface ozone in the southern extratropics. Both increases are partially offset by decreases in surface ozone of up to 2 ppbv in the zonal mean, with substantial zonal asymmetries, due to global warming. We model substantial changes in the methane lifetime caused by the three factors. In the Arctic during summer, disappearing sea ice, in an ice-albedo feedback, causes substantially reduced surface ozone. Of the three factors considered here, methane increases are found to exert the strongest influence on surface ozone. © 2012. American Geophysical Union.
Agency: GTR | Branch: NERC | Program: | Phase: Research Grant | Award Amount: 1.07M | Year: 2011
Dust is an important part of the Earth's land-atmosphere-ocean-biosphere system, affecting climate, the fertility of oceans, plant communities on land, and human health. Wind is able to move vast amounts of dust over the Earth's surface and into the atmosphere. North Africa alone emits 500-1000 million tons of dust a year. To predict future weather and climate it is crucial that numerical models, our key tool for such prediction, represent the relationships important to the emission, transport and deposition of dust. Excluding dust from models leads to large local and global errors. Accurate modelling of dust begins with the correct simulation of emission. This is vital because source area simulation errors lead to errors in local climate dynamics and incorrect dust transport. However, many of the major dust source regions of the world are in extremely remote places for which there is no ground-based data on dust emission or its controls. Although recent advances have been made in identifying major dust sources, for example from satellite data, many models of dust emission are still very simple and are not constrained by real observed data. In the drive to predict weather and climate at spatial scales useful for planning decisions, numerical models have increased their resolution, so that some global models run at near 1 degree and many regional models at better than 0.2 degree resolution. The few observed data sets characterising dust source areas and behaviour that do exist simply do not support the scale at which these models are being run. It is therefore extremely difficult to either evaluate or improve the dust emission component of models as things stand. Simulation of dust source areas is consequently very inaccurate and is set to remain so. 
We propose to address this problem by developing the first model dust emission scheme which is based on purpose-built observed data sets that have been deliberately constructed to exactly match the scale of regional climate models. We propose to do this by first using high-resolution satellite data to identify key sources of dust within field areas that are characteristic of dust source areas found in many parts of the world. We will then use state-of-the-art field equipment to systematically investigate the real processes that control dust emission at the model grid box scale, measuring background conditions over a long period as well as the important processes that occur during dust storms. We will therefore measure and monitor both the factors that control the availability of dust to the wind on the ground (erodibility) and the ability of the wind to move that sediment (erosivity), to create this definitive data set on source regions that can be used in model development for years to come. We will be able to determine for the first time what kinds of dust source data (e.g. surface roughness, soil moisture, wind gustiness) lead to the largest improvement in the observationally-constrained model emission scheme. We will also be able to say what errors result in simulations if no field data are collected and only remotely sensed data are used as inputs to the models. This will provide important guidance on how and where to spend time and money in the improvement of climate models in the future, and also provide direction on what kind of field data are most important to collect. The Met Office does not have the capacity to undertake the extensive fieldwork required to deliver the observational data that are critical to model development. Our proposal cuts across the traditional barriers between field work, Earth Observation and numerical modelling. It is only by doing so that breakthroughs in dust numerical modelling will be achieved. 
Our unprecedented field observations which are tailored to numerical model needs will be a significant step towards a new generation of model schemes.
Williams R.G.,University of Liverpool |
Roussenov V.,University of Liverpool |
Smith D.,Met Office Hadley Centre |
Lozier M.S.,Duke University
Journal of Climate | Year: 2014
Basin-scale thermal anomalies in the North Atlantic, extending to depths of 1–2 km, are more pronounced than the background warming over the last 60 years. A dynamical analysis based on reanalyses of historical data from 1965 to 2000 suggests that these thermal anomalies are formed by ocean heat convergences, augmented by the poorly known air-sea fluxes. The heat convergence is separated into contributions from the horizontal circulation and the meridional overturning circulation (MOC), the latter further separated into Ekman and MOC transport minus Ekman transport (MOC-Ekman) cells. The subtropical thermal anomalies are mainly controlled by wind-induced changes in the Ekman heat convergence, while the subpolar thermal anomalies are controlled by the MOC-Ekman heat convergence; the horizontal heat convergence is generally weaker, only becoming significant within the subpolar gyre. These thermal anomalies often have an opposing sign between the subtropical and subpolar gyres, associated with opposing changes in the meridional volume transport driving the Ekman and MOC-Ekman heat convergences. These changes in gyre-scale convergences in heat transport are probably induced by the winds, as they correlate with the zonal wind stress at gyre boundaries. © 2014 American Meteorological Society.
Agency: GTR | Branch: NERC | Program: | Phase: Research Grant | Award Amount: 241.73K | Year: 2014
There is widespread concern about how climate is responding to the ongoing rise in atmospheric CO2 from carbon emissions and land use changes. In our view, the climate response can be divided into the following stages:
1. Past and ongoing increases in atmospheric CO2 are leading to a global warming of up to 0.6 °C over the last 50 years. The regional variability is, though, much larger than this global signal.
2. Continuing emissions are increasing atmospheric CO2 and driving a heat flux into the ocean, leading to ocean warming and steric sea level rise. The amount of warming is sensitive to the carbon emission scenario, as well as the rate of carbon uptake by the ocean and terrestrial system.
3. The regional distribution of warming and steric sea level rise is sensitive to how the ocean interior takes up heat, involving the transfer of surface properties into the thermocline and deep ocean.
4. After emissions cease, there will be a thermal adjustment of the lower atmosphere, the net heat flux into the ocean will cease, and so ocean warming and steric sea level rise will eventually cease likewise.
5. As well as a thermal equilibrium being reached, the atmosphere and ocean approach a carbon equilibrium after emissions cease, on a timescale of perhaps several hundred to a thousand years. At this equilibrium, the final atmospheric CO2 and the amount of climate warming are related to cumulative carbon emissions, based on our idealised theory. 
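The cumulative-emissions relationship invoked in the final stage can be sketched in its simplest linear form. The proportionality constant below (roughly 1.5 °C per 1000 GtC, a mid-range value often quoted for the transient climate response to cumulative emissions) is an assumption for illustration, not a number from this proposal.

```python
# Minimal sketch of an idealised linear relationship between cumulative
# carbon emissions and warming. Real Earth-system behaviour is more
# complicated; this only illustrates the first-order scaling the proposal
# refers to.

def warming_from_cumulative_emissions(cumulative_gtc, degc_per_1000gtc=1.5):
    """Approximate warming (°C) for a given cumulative emission (GtC)."""
    return degc_per_1000gtc * cumulative_gtc / 1000.0

# Under this assumed scaling, emitting 1000 GtC in total yields ~1.5 °C,
# and the warming doubles if cumulative emissions double:
w1 = warming_from_cumulative_emissions(1000.0)  # 1.5 °C
w2 = warming_from_cumulative_emissions(2000.0)  # 3.0 °C
```

The key property, and the reason cumulative budgets are useful for policy, is that in this idealised picture the final warming depends on the total carbon emitted, not on when it is emitted.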
The climate warming and steric sea level rise will be investigated using diagnostics of (i) present day temperature and salinity observations, allowing the steric sea level to be diagnosed; (ii) thought experiments with a range of ocean and climate models on timescales of centuries to several thousand years, designed to explore how the ocean warming spreads from the sea surface into the ocean interior, which ultimately determines the steric sea level rise; (iii) comparison with diagnostics of state-of-the-art climate models, integrated for a century; (iv) comparison with idealised theory, relevant for when emissions cease; and (v) finally a downscaling to provide bounds on the steric sea level response on a regional scale. This combination of theory and Earth System models of intermediate complexity will allow a wide parameter space to be explored for a range of emission scenarios, much broader than that usually employed within IPCC assessments for the next 100 years. The study has the potential to provide accessible bounds for steric sea level rise, relevant for policy makers interested in different energy policies, and a link to end users is provided via the collaboration with the Hadley Centre.
Agency: GTR | Branch: NERC | Program: | Phase: Research Grant | Award Amount: 364.61K | Year: 2011
The climate system is widely accepted as warming. Much of the extra heat provided to the climate system is estimated to have been taken up by the oceans. However, this warming of the oceans is not happening uniformly. For the North Atlantic, the most well observed basin, there has been warming in the tropics and mid latitudes, but cooling at high latitudes over the last 50 years. These changes in heat content are associated with changes in atmospheric forcing from winds and surface heat fluxes. As well as the oceans changing their temperature, there are salinity changes, with a general freshening at high latitudes and an increase in salinity at low latitudes, perhaps associated with a strengthening in the atmospheric water cycle. The strong gyre-scale contrast in these ocean properties suggests that the wind forcing and gyre dynamics are playing an important role. The small residual density changes over the basin are reflected in changes in the dynamical signals for overturning and sea level. The ocean overturning response to these water-mass changes appears to be surprising: based on our historical analyses, there is a slight weakening over the subtropical gyre and a slight strengthening over the subpolar gyre during the last 50 years. These overturning changes might reflect the effect of the wind forcing, where gyre-scale property changes feed back onto changes in the overturning. The effect of the winds also directly affects sea level and the interpretation of the tide gauge record: there are large-scale correlations between the interannual variations in air pressure over the central part of the ocean basins and eastern boundary sea level. In our study, we plan to test hypotheses as to how the climate variability in the North Atlantic is controlled. 
Our aim is (i) to extend our analyses and assimilation of the historical data; (ii) conduct model experiments designed to reveal the effect of changing winds on the gyre contrasts in temperature and salinity, on how heat content and overturning are related, and on the relationship with sea level; and (iii) assess how tide gauge records for sea level are affected by gyre dynamics and overturning in order to interpret changes in the long historical records of sea level rise in the North Atlantic.
News Article | December 9, 2015
PARIS — With only three days left, tensions here are rising as countries race to resolve outstanding differences and forge an agreement that — hopefully — will set the planet on a path to avoiding the worst consequences of climate change. The goal is an agreement that would set the world on a path to limit warming to below 2 degrees Celsius, or perhaps even 1.5 degrees Celsius, above pre-industrial levels. But at a news conference here at the Le Bourget conference center Wednesday morning, scientists pointed out a factor that could make hitting these targets quite a lot harder. As the planet warms, this frozen northern soil is going to continue to thaw — and as it thaws, it’s going to release carbon dioxide and methane into the air. A lot of it, it turns out. Potentially enough to really throw off the carbon budgets that have been calculated in order to determine the maximum emissions that we can release and still have a good chance of keeping warming to 2 °C or below. [U.S. pledges more aid to poorer countries as Paris climate talks enter delicate stage] In particular, Susan Natali of the Woods Hole Research Center explained Wednesday that with a very high level of warming, permafrost emissions this century could be quite large indeed. Natali used numbers from the 2013 report of the United Nations’ Intergovernmental Panel on Climate Change, which found that humans can only emit about 275 more gigatons, or billion tons, of carbon (about 1,000 gigatons of carbon dioxide, which has a greater molecular weight) to have a greater than 66 percent chance of limiting warming to 2 °C. But out of that limited budget, she said, permafrost emissions could take up some 150 of those gigatons (or about 550 gigatons of carbon dioxide). “That’s on par with current U.S. rates of emission,” Natali said, which are about 1.4 gigatons of carbon per year. 
“So we’re talking about another emitting region that’s currently not included in our emissions scenarios.” Fortunately, even though they’re not considered to be strong enough, the current national pledges to limit global warming appear to have taken the world off a truly high emissions path. These pledges, or “intended nationally determined contributions,” could potentially limit warming to 2.7 degrees Celsius, according to the United Nations. But in an interview, Natali and her Woods Hole colleague and fellow permafrost expert Max Holmes explained that even for lower warming scenarios like this, permafrost could emit 50 gigatons of carbon (or about 180 gigatons of carbon dioxide) in this century. This is because under lower warming scenarios, only about 30 percent, rather than about 70 percent, of surface layer permafrost is expected to thaw. Another 50 gigatons out of a 275 gigaton carbon budget — or, another 180 gigatons out of a 1,000 gigaton carbon dioxide budget — would significantly constrain how much the world could emit and still have a strong chance of keeping warming below 2 degrees Celsius. Another prominent research institute, the U.K. Met Office’s Hadley Centre, also recently released an assessment of how potential permafrost emissions could complicate attempts to limit global warming, and came up with numbers that are, if anything, potentially even worse. As the centre put it: The feedbacks from wetlands and permafrost regions can be combined with other known processes to determine their greenhouse gas input into the atmosphere. For a global average temperature rise of 2 °C, this reduces the cumulative emissions that can be released by human actions by around 100 GtC (360 GtCO2) in the most pessimistic simulation. This corresponds to about 10 years of anthropogenic emissions at the current rate. These numbers can’t be directly compared with the Woods Hole numbers, however, due to the inclusion of wetlands above. 
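The unit conversions running through these figures are easy to check. A minimal sketch, using only the numbers quoted in the article; the 44/12 factor is the molecular-weight ratio of CO2 to carbon implied by the article's "greater molecular weight" remark:

```python
# Carbon-budget arithmetic from the article. Quantities in GtC are converted
# to GtCO2 by the ratio of molecular weights (CO2 = 44, C = 12).

C_TO_CO2 = 44.0 / 12.0  # ≈ 3.67

def gtc_to_gtco2(gtc):
    """Convert gigatons of carbon to gigatons of carbon dioxide."""
    return gtc * C_TO_CO2

budget_gtc = 275.0           # remaining budget for >66% chance of 2 °C
permafrost_high_gtc = 150.0  # high-warming permafrost release this century
permafrost_low_gtc = 50.0    # lower-warming (~2.7 °C pledges) scenario

print(round(gtc_to_gtco2(budget_gtc)))           # 1008, i.e. "about 1,000" GtCO2
print(round(gtc_to_gtco2(permafrost_high_gtc)))  # 550 GtCO2
print(round(gtc_to_gtco2(permafrost_low_gtc)))   # 183, i.e. "about 180" GtCO2
print(budget_gtc - permafrost_high_gtc)          # 125 GtC left for human emissions
```

The same factor connects the Hadley Centre figures: 100 GtC × 44/12 ≈ 367, matching the quoted "around 100GtC (360 GtCO2)".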
And Natali and Holmes also noted that permafrost emissions don’t end at 2100 — they are expected to continue after that and even get worse. “Most of the release will happen after 2100,” said Natali. That’s a big problem because the global carbon budget is fixed, and after it is exceeded there can be zero further emissions. Because carbon dioxide lasts so long in the atmosphere, you don’t get to start with a fresh budget in the next century. So permafrost emissions beyond 2100 would also have to be taken into account, and would restrict the budget even further. Permafrost is a potential carbon bomb because over thousands of years, dead plant life has been slowly swallowed up by the soil but has not decomposed. Plants pull carbon out of the atmosphere as they grow, but release it again when they die and decompose. As permafrost warms and thaws, microbes will have more ability to break down the plant life it contains, which is what will trigger a steady stream of emissions. “It’s just like you put celery in your freezer and then you turn your freezer into a refrigerator, and it starts to rot,” says Woods Hole’s Max Holmes. Many people are confused about permafrost, and think when they first hear about it that it is going to release methane, not carbon dioxide, in gigantic explosions. Actually, that’s confusing frozen subsea methane hydrates — which may or may not be destabilized by global warming, but in any case are a separate issue — with permafrost on land. The latter will lose carbon slowly, as thaw enables microbial processes that lead to decomposition. This will release both carbon dioxide and also some methane. There won’t be any explosion, says Natali — but as the numbers above show, it could still be dramatically significant to the total global carbon picture. The news about permafrost has been building in recent years, but it is still a relatively new area of scientific inquiry and one where there is much uncertainty. 
Thus, even as negotiators in Paris appear to be amping up their ambition and are even talking more about trying to limit global warming to 1.5 degrees C, there may be another wild card they have to contend with. On Tuesday, Secretary of State John F. Kerry called the Paris climate meeting the “demarcation point where we begin to get the job done to save the planet.” Alas, scientists are learning that the planet itself may not cooperate. The one thing that really doesn’t make sense about the climate debate in Paris For more, you can sign up for our weekly newsletter here, and follow us on Twitter here.