News Article | May 3, 2017
A recent upsurge in planet-warming methane may not be caused by increasing emissions, as previously thought, but by methane lingering longer in the atmosphere. That’s the conclusion of two independent studies that indirectly tracked concentrations of hydroxyl, a highly reactive chemical that rips methane molecules apart. Hydroxyl levels in the atmosphere decreased roughly 7 or 8 percent starting in the early 2000s, the studies estimate. The two teams propose that the hydroxyl decline slowed the breakdown of atmospheric methane, boosting levels of the greenhouse gas. Concentrations in the atmosphere have crept up since 2007, but during the same period, methane emissions from human activities and natural sources have remained stable or even fallen slightly, both studies suggest. The research groups report their findings online April 17 in Proceedings of the National Academy of Sciences. “If hydroxyl were to decline long-term, then it would be bad news,” says Matt Rigby, an atmospheric scientist at the University of Bristol in England who coauthored one of the studies. Less methane would be removed from the atmosphere, he says, so the gas would hang around longer and cause more warming. The stability of methane emissions might also vindicate previous studies that found no rise in emissions. The Environmental Protection Agency, for instance, has reported that U.S. emissions remained largely unchanged from 2004 to 2014 (SN Online: 4/14/16). Methane enters the atmosphere from a range of sources, from decomposing biological material in wetlands to leaks in natural gas pipelines. Ton for ton, that methane causes 28 to 36 times as much warming as carbon dioxide over a century. Since the start of the Industrial Revolution, atmospheric methane concentrations have more than doubled. By the early 2000s, though, levels of the greenhouse gas inexplicably flatlined. In 2007, methane levels just as mysteriously began rising again. 
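The logic behind that worry is simple box-model arithmetic: at steady state, methane's concentration is proportional to emissions times its atmospheric lifetime, and the lifetime scales inversely with hydroxyl. A minimal sketch, using round numbers rather than the studies' estimates:

```python
# Steady-state one-box picture: concentration C = E * tau,
# where the lifetime tau scales as 1/[OH].
E = 1.0            # emissions, arbitrary units (held constant)
tau0 = 10.0        # methane lifetime in years at baseline hydroxyl
oh_drop = 0.075    # ~7-8 percent hydroxyl decline, per the studies

tau_new = tau0 / (1 - oh_drop)          # less OH -> longer lifetime
c0, c_new = E * tau0, E * tau_new
rise = (c_new - c0) / c0 * 100
print(f"steady-state methane rise: {rise:.1f}% with flat emissions")
```

A 7 to 8 percent hydroxyl drop thus raises steady-state methane by roughly the same fraction even with emissions held flat, which is the mechanism the two teams propose.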
The lull and subsequent upswing puzzled scientists, with explanations ranging from the abundance of methane-producing microbes to the collapse of the Soviet Union. Those proposals didn’t account for what happens once methane enters the atmosphere. Most methane molecules in the air last around a decade before being broken apart during chemical reactions with hydroxyl. Monitoring methane-destroying hydroxyl is tricky, though, because the molecules are so reactive that they survive for less than a second after formation before undergoing a chemical reaction. Neither study can show conclusively that hydroxyl levels changed, notes Stefan Schwietzke, an atmospheric scientist at the National Oceanic and Atmospheric Administration’s Earth System Research Laboratory in Boulder, Colo. The papers nevertheless add a new twist in explaining the mysterious methane rise, he says. “Basically these studies are opening a new can of worms, and there was no shortage of worms.” Despite being conducted by two separate teams — one headed by Rigby and the other by atmospheric scientist Alex Turner of Harvard University — the new studies used the same roundabout approach to tracking hydroxyl concentrations over time. Both teams followed methyl chloroform, an ozone-depleting substance used as a solvent before being banned by the Montreal Protocol. Like methane, methyl chloroform also breaks apart in reactions with hydroxyl. Unlike methane, though, emission rates of methyl chloroform are fairly easy to track because the chemical is entirely human-made. Examining methyl chloroform measurements gathered since the 1980s revealed that hydroxyl concentrations have probably wobbled over time, contributing to the odd pause and rise in atmospheric methane concentrations. But to know for sure whether hydroxyl levels varied or remained steady, scientists will need to take a more detailed look at regional emissions of methane and methyl chloroform, Rigby says. 
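The methyl chloroform trick can be sketched with a one-box model: once emissions stop, the compound's concentration decays exponentially at a rate set by hydroxyl, so slower-than-expected decay implies less hydroxyl. The numbers below are synthetic, not the teams' data:

```python
import math

# One-box model for methyl chloroform (MCF): dC/dt = E(t) - k_oh * C.
# After the Montreal Protocol ban, E ~ 0, so C decays as exp(-k_oh * t),
# and the observed decay rate pins down the hydroxyl level.
k_oh = 1.0 / 5.0        # assumed baseline loss frequency (1/yr)
c0 = 100.0              # concentration when emissions stop (arb. units)

def concentration(t, oh_scale=1.0):
    """MCF concentration t years after emissions cease."""
    return c0 * math.exp(-k_oh * oh_scale * t)

# With hydroxyl 7.5% lower, MCF lingers -- the kind of signal both
# teams searched for in the post-ban measurement record.
t = 10.0
print(concentration(t, 1.0))    # baseline hydroxyl
print(concentration(t, 0.925))  # reduced hydroxyl: more MCF remains
```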
Why hydroxyl levels might have fallen also remains unclear. Turner and colleagues note that the ban on ozone-depleting substances like methyl chloroform might be the cause. The now-recovering ozone layer (SN: 12/24/16, p. 28) blocks some ultraviolet light, an important ingredient in the formation of hydroxyl. Identifying the cause of the hydroxyl changes could help climate scientists better predict how methane levels will behave in the future.
News Article | January 29, 2016
Wind and solar power coursing across a national system of high-voltage direct current transmission lines could significantly cut power sector carbon dioxide emissions without increasing the cost of energy in the U.S., according to a new study published in Nature Climate Change. Researchers at NOAA’s Earth System Research Laboratory and the University of Colorado, Boulder looked at three scenarios in which wind, solar, hydropower and nuclear were paired with natural gas at different cost points. They also included the cost of building a national high-voltage direct current (HVDC) transmission system on top of the existing power system. “The average variability of weather decreases as size increases; if wind or solar power are not available in a small area, they are more likely to be available somewhere in a larger area,” wrote lead author Alexander MacDonald, director of NOAA’s Earth System Research Laboratory. Although energy storage can also provide stability to the grid, HVDC can do it at a lower cost, the study contends. The results showed that with mid-cost renewables and mid-cost natural gas, electric power CO2 emissions could be cut by about 60 percent by 2030, a figure that rises to nearly 80 percent if natural gas costs rise while renewables decline by 2030. In all scenarios, the cost of power in 2030 would be cheaper than the International Energy Agency’s estimate of an average $0.115 per kilowatt-hour for the levelized cost of electricity in the U.S. in 2030. The model used weather data with high temporal and spatial resolution and assumed co-optimized dispatch of renewables across the modern HVDC grid. “We integrate complex weather data over continental-scale geography while still handling the salient features of an electrical power system,” the authors wrote in the paper. 
The authors modeled 3 gigawatts of HVDC transmission to carry 523 gigawatts of wind power, 371 gigawatts of solar PV, 471 gigawatts of natural gas, 100 gigawatts of nuclear and 74 gigawatts of hydroelectricity, an increase of 30 percent over 2012 installed capacity. The HVDC transmission network assumes a cost of about $700 per megawatt-mile and another $182,000 for each substation. The authors note that economies of scale allow for that price for the HVDC line, which becomes substantially cheaper once the lines are longer than about 300 miles. Costs for renewables were fairly conservative, with medium cost-assumption estimates nearly in line with today’s costs for wind and solar. A benefit of HVDC, besides connecting generation to load centers more efficiently than high-voltage alternating current, is that it reduces the need for frequency regulation that comes with a high penetration of renewables. One limitation of the study is that it only used hourly data for wind and solar, although fluctuations within the hour can be highly variable. Hourly data was used because there was not more granular electricity demand data and/or detailed weather data available across the large geographic scales the researchers used. To realize the scenario laid out in this study, the U.S. power sector would have to embrace HVDC in a way that it has not previously. There is very little HVDC in the U.S., although it is being used more frequently in Europe and China. One of the largest projects in the U.S. is the 1,000-megawatt Clean Power Link in the Northeast that was recently green-lighted. But large-scale transmission projects are difficult to site and often challenging to get approval for, especially as they move across state lines. Another benefit of HVDC, which PowerLink took advantage of, is that it can often be buried along existing rights of way, eliminating many of the battles that traditional transmission faces. 
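The capacity figures sum directly, and the quoted unit costs support a back-of-envelope corridor estimate. The line rating, length, and substation count below are assumptions for illustration, not values from the study:

```python
# Generation capacity in the modeled system, in gigawatts (per the article).
capacity_gw = {"wind": 523, "solar_pv": 371, "natural_gas": 471,
               "nuclear": 100, "hydro": 74}
total_gw = sum(capacity_gw.values())
print(f"total modeled capacity: {total_gw} GW")

# Illustrative corridor cost at ~$700 per MW-mile plus $182,000 per
# substation; the 3,000-MW rating and 800-mile length are assumed.
rating_mw, length_miles, substations = 3000, 800, 2
line_cost = 700 * rating_mw * length_miles + 182_000 * substations
print(f"corridor cost: ${line_cost / 1e9:.2f} billion")
```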
The authors acknowledged the challenges, and the political will that would have to be mustered for a project of this scope. They concluded that building out a national HVDC power system over the existing one would be similar to the challenge and opportunity of building the transcontinental railroads or the interstate highway system.
News Article | January 25, 2016
The study used a sophisticated mathematical model to evaluate future cost, demand, generation and transmission scenarios. It found that with improvements in transmission infrastructure, weather-driven renewable resources could supply most of the nation's electricity at costs similar to today's. "Our research shows a transition to a reliable, low-carbon, electrical generation and transmission system can be accomplished with commercially available technology and within 15 years," said Alexander MacDonald, co-lead author and recently retired director of NOAA's Earth System Research Laboratory (ESRL) in Boulder. The paper is published online today in the journal Nature Climate Change. Although improvements in wind and solar generation have continued to ratchet down the cost of producing renewable energy, these energy resources are inherently intermittent. As a result, utilities have invested in surplus generation capacity to back up renewable energy generation with natural gas-fired generators and other reserves. "In the future, they may not need to," said co-lead author Christopher Clack, a physicist and mathematician with the Cooperative Institute for Research in Environmental Sciences at the University of Colorado Boulder. Since the sun is shining or winds are blowing somewhere across the United States all of the time, MacDonald theorized that the key to resolving the dilemma of intermittent renewable generation might be to scale up the renewable energy generation system to match the scale of weather systems. So MacDonald, who has studied weather and worked to improve forecasts for more than 40 years, assembled a team of four other NOAA scientists to explore the idea. Using NOAA's high-resolution meteorological data, they built a model to evaluate the cost of integrating different sources of electricity into a national energy system. 
The model estimates renewable resource potential, energy demand, emissions of carbon dioxide (CO2) and the costs of expanding and operating electricity generation and transmission systems to meet future needs. The model allowed researchers to evaluate the affordability, reliability, and greenhouse gas emissions of various energy mixes, including coal. It showed that low-cost and low-emissions are not mutually exclusive. "The model relentlessly seeks the lowest-cost energy, whatever constraints are applied," Clack said. "And it always installs more renewable energy on the grid than exists today." Even in a scenario where renewable energy costs more than experts predict, the model produced a system that cuts CO2 emissions 33 percent below 1990 levels by 2030, and delivered electricity at about 8.6 cents per kilowatt hour. By comparison, electricity cost 9.4 cents per kWh in 2012. If renewable energy costs were lower and natural gas costs higher, as is expected in the future, the modeled system sliced CO2 emissions by 78 percent from 1990 levels and delivered electricity at 10 cents per kWh. The year 1990 is a standard scientific benchmark for greenhouse gas analysis. A scenario that included coal yielded lower cost (8.5 cents per kWh), but the highest emissions. At the recent Paris climate summit, the United States pledged to cut greenhouse emissions from all sectors up to 28 percent below 2005 levels by 2025. The new paper suggests the United States could cut total CO2 emissions 31 percent below 2005 levels by 2030 by making changes only within the electric sector, even though the electrical sector represents just 38 percent of the national CO2 budget. These changes would include rapidly expanding renewable energy generation and improving transmission infrastructure. 
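The study's co-optimization is far more sophisticated, but its core behavior — "relentlessly" filling demand with the cheapest available energy — can be sketched as a simple merit-order dispatch. Every cost and capacity below is invented for illustration:

```python
# Toy merit-order dispatch: fill demand from the cheapest source first,
# up to each source's available capacity. A crude stand-in for the
# study's cost optimization; the numbers are illustrative only.
def dispatch(demand_mw, sources):
    """sources: list of (name, cost_per_mwh, available_mw), any order."""
    plan, remaining = {}, demand_mw
    for name, cost, avail in sorted(sources, key=lambda s: s[1]):
        used = min(avail, remaining)
        if used > 0:
            plan[name] = used
            remaining -= used
    return plan, remaining  # remaining > 0 means unserved demand

sources = [("gas", 50.0, 400), ("wind", 20.0, 300), ("solar", 25.0, 200)]
plan, unserved = dispatch(600, sources)
print(plan)  # cheap wind and solar run first; gas covers the rest
```

When renewables are the cheapest entry in the merit order, the optimizer installs and runs them first, which is why the model "always installs more renewable energy on the grid than exists today."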
In identifying low-cost solutions, researchers enabled the model to build and pay for transmission infrastructure improvements—specifically a new, high-voltage direct-current transmission grid (HVDC) to supplement the current electrical grid. HVDC lines, which are in use around the world, reduce energy losses during long-distance transmission. The model chose to use those lines extensively, and the study found that investing in efficient, long-distance transmission was key to keeping costs low. MacDonald compared the idea of an HVDC grid to the interstate highway system, which transformed the U.S. economy in the 1950s. "With an 'interstate for electrons', renewable energy could be delivered anywhere in the country while emissions plummet," he said. "An HVDC grid would create a national electricity market in which all types of generation, including low-carbon sources, compete on a cost basis. The surprise was how dominant wind and solar could be." The new model is drawing interest from other experts in the field. "This study pushes the envelope," said Stanford University's Mark Jacobson, who commented on the findings in an editorial he wrote for the journal Nature Climate Change. "It shows that intermittent renewables plus transmission can eliminate most fossil-fuel electricity while matching power demand at lower cost than a fossil fuel-based grid - even before storage is considered."
More information: Alexander E. MacDonald et al. Future cost-competitive electricity systems and their impact on US CO2 emissions, Nature Climate Change (2016). DOI: 10.1038/NCLIMATE2921
News Article | November 7, 2016
Where the world emits is more important than how much it emits, suggesting that the southward shift of emissions toward the equator is driving the increase in total ozone.
Since the 1980s, air pollution has increased worldwide, but it has increased at a much faster pace in regions close to the equator. Research from the University of North Carolina at Chapel Hill now reveals that this changing global emissions map is creating more total ozone worldwide compared to the amount of pollution being emitted, signaling an effect that could be difficult to rein in without strategic policy planning. "Emissions are growing in places where there is a much greater effect on the formation of ozone," said Jason West, who led the research at UNC-Chapel Hill with former graduate student and first author Yuqiang Zhang. "A ton of emissions in a region close to the equator, where there is a lot of sunlight and intense heat, produces more ozone than a ton of emissions in a region farther from it." The work, to appear in the Nov. 7 advance online issue of Nature Geoscience, provides a much-needed path forward on where in the world to strategically reduce emissions of pollutants that form ozone, which when present in the lower atmosphere, or troposphere, is one of the primary causes of air pollution-related respiratory problems and heart disease. (In the upper atmosphere, or stratosphere, ozone helps protect against the sun's ultraviolet rays.) To drive home the point, West explained that China's emissions increased more than India's and Southeast Asia's from 1980 to 2010, but Southeast Asia and India, despite their lower growth in emissions during this period, appear to have contributed more to the total global ozone increase due to their proximity to the equator. The reason is that ozone, a greenhouse gas and toxic air pollutant, is not emitted but forms when ultraviolet light hits nitrogen oxides (basically combustion exhaust from cars and other sources).
When these pollutants interact with more intense sunlight and higher temperatures, the interplay speeds up the chemical reactions that form ozone. Higher temperatures near the equator also increase the vertical motion of air, transporting ozone-forming chemicals higher in the troposphere, where they can live longer and form more ozone. "The findings were surprising," said West. "We thought that location was going to be important, but we didn't suspect it would be the most important factor contributing to total ozone levels worldwide. Our findings suggest that where the world emits is more important than how much it emits." Zhang, West and colleagues, including Owen Cooper and Audrey Gaudel, from the University of Colorado Boulder and NOAA's Earth System Research Laboratory, used a computer model to simulate the total amount of ozone in the troposphere, the part of the atmosphere where ozone is harmful to humans and agriculture, between 1980 and 2010. Since emissions have shifted south during this period, they wanted to know which contributed more to the increased production of ozone worldwide: the changing magnitude of emissions, or their location? To find out, the team used a unique European data set of ozone observations from commercial aircraft to confirm the strong increases in ozone above Asia. Then they superimposed a map of how much pollution the world was emitting in 1980 onto where the world was emitting it in 2010, and vice versa, in addition to another scenario of the growth of methane gas, to determine what is driving the world's increase in ozone production. "Location, by far," said West, associate professor of environmental sciences in the UNC Gillings School of Global Public Health. The findings point to several strategies for reducing ground-level ozone across the world, such as decreasing emissions of ozone precursors in regions close to the equator, particularly those with the fastest growth of emissions. However, concerns exist for policy makers.
"A more challenging scenario is that even if there is a net reduction in global emissions, ozone levels may not decrease if emissions continue to shift toward the equator," said Cooper. "But continuing aircraft and satellite observations of ozone across the tropics can monitor the situation and model forecasts can guide decision making for controlling global ozone pollution."
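The superimposing step described above has a simple structure: recompute total ozone with one era's emission magnitudes placed at the other era's locations, then compare. A toy two-region version, with invented ozone-production efficiencies and emissions, shows how location can dominate:

```python
# Toy factor separation: total ozone = sum over regions of
# emissions * ozone-production efficiency (higher near the equator).
# Efficiencies and emissions are invented, not the study's values.
eff = {"tropics": 3.0, "midlat": 1.0}        # ozone per unit emission
e_1980 = {"tropics": 10.0, "midlat": 40.0}   # regional emissions, 1980
e_2010 = {"tropics": 30.0, "midlat": 35.0}   # emissions shifted south, 2010

def total_ozone(emis):
    return sum(emis[r] * eff[r] for r in emis)

o_1980 = total_ozone(e_1980)
o_2010 = total_ozone(e_2010)
# Counterfactual: 2010's total magnitude in 1980's regional proportions,
# isolating the magnitude effect from the location effect.
scale = sum(e_2010.values()) / sum(e_1980.values())
o_mag_only = total_ozone({r: e_1980[r] * scale for r in e_1980})
print(o_1980, o_2010, o_mag_only)
```

In this toy setup the location shift contributes more of the ozone increase than the growth in total emissions, mirroring the study's "location, by far" conclusion.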
News Article | January 25, 2016
The U.S. Could Make a Fast, Cheap Switch to Clean Energy
Coal-fired power plants are the biggest emitters of greenhouse gases in the United States, but new research finds that existing technology could cheaply slash the nation’s carbon spew nearly 80 percent by 2030. How? By transporting renewable energy from where the sun is shining and the wind is blowing to where it is not, according to the study, which was published on Monday in the journal Nature Climate Change by scientists from the National Oceanic and Atmospheric Administration and the University of Colorado Boulder. NOAA’s highly detailed weather data shows there’s nearly always someplace in the 48 contiguous states where electricity can be generated by solar power stations and wind farms, even if it happens to be hundreds or thousands of miles away from where it’s needed. The quandary: How to move electricity generated by that sun or wind over long distances without losing too much of it in the process. The solution: A proven technology, called high-voltage direct current, already exists and can carry power across long distances more efficiently than alternating current, the standard power transmission mode in the U.S. Utilities could add direct-current infrastructure to alternating-current transmission lines over the next 15 years as part of planned updates and upgrades without breaking the bank, said study coauthor Alexander MacDonald, who recently retired as director of NOAA’s Earth System Research Laboratory. “Almost everybody believes that if we go to wind and solar energy it will be more expensive, or won’t be ready unless we have a big technological breakthrough” in battery storage technology, MacDonald said.
“Our study says that with existing transmission technology and use of the whole 48 states with this ‘interstate for electrons,’ we’re ready right now to have a national system that has the same electric costs as today, with as much as 80 percent less carbon, and just as reliable.” The greater reliance on wind and solar power would also cut water use for energy by 65 percent, the study found. That’s because fossil fuel plants, which generate 40 percent of the nation’s carbon emissions, need large volumes of water for cooling. “Our study assumed that the existing U.S. power system, with all of its AC distribution and usage, stays the same,” said MacDonald. “Power can be taken off the HVDC network for use, and put on by generation. To a power provider, let’s say a utility, instead of building a coal plant, they build a connection to the HVDC network. Everything else stays the same.” To test ideas about the most cost-effective means of generating power, MacDonald and his colleagues conducted a complex mathematical analysis that combined finely detailed data on continent-wide weather patterns from 2006 to 2008 with equally detailed data on power demand for the same period. “NOAA folks have known for some time how big weather is,” said mathematician and physicist Christopher Clack of the Cooperative Institute for Research in Environmental Sciences, a collaboration between NOAA and the University of Colorado Boulder. “We built and ran a very sophisticated model that was able to take advantage of [NOAA’s] exceptionally good-quality weather data to look at the situation of the grid, and see if there’s any way of running the grid that would incorporate a really cheap system.” The model was not designed to prioritize low carbon emissions, he said. “We tried to be completely agnostic on which technologies were picked.
It turned out the most effective combination we saw was full U.S., 48-state transmission, backed up by gas when solar and wind wasn’t enough.” Using the U.S. Energy Information Administration’s estimate of a 0.7 percent increase in power demand annually between 2015 and 2030, the researchers found that scenarios combining wind, solar, and natural gas power with a nationwide transmission grid cut greenhouse gas emissions from 33 to 78 percent below 1990 levels. If gas was cheaper than solar and wind, the emissions were higher; when renewables beat gas on price, emissions went down. The cost to ratepayers was between $0.086 and $0.10 per kilowatt-hour—comparable to the actual average nationwide cost of $0.094 per kilowatt-hour in 2015 and potentially saving power customers $47.2 billion a year.
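Two pieces of arithmetic above are easy to check: the EIA demand assumption compounds over 15 years, and a per-kilowatt-hour price gap scales up to nationwide savings. The consumption figure below is a round assumption, so the result only shows the scale; the study's $47.2 billion estimate rests on its own demand and price assumptions:

```python
# Compound the EIA's assumed 0.7%/yr demand growth from 2015 to 2030.
growth = 1.007 ** (2030 - 2015)
print(f"demand growth by 2030: {(growth - 1) * 100:.1f}%")

# Savings from the $0.008/kWh gap ($0.094 vs $0.086), at an assumed
# ~4 trillion kWh of annual U.S. electricity consumption.
gap = 0.094 - 0.086
consumption_kwh = 4.0e12
savings = gap * consumption_kwh
print(f"annual savings on the order of ${savings / 1e9:.0f} billion")
```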
News Article | March 10, 2016
This story has been updated. Atmospheric carbon dioxide concentrations have spiked more in the period from February 2015 to February 2016 than in any other comparable period dating back to 1959, according to a scientist with the National Oceanic and Atmospheric Administration’s Earth System Research Laboratory. The change in average concentrations from February of last year to February of this year was 3.76 parts per million at the storied Mauna Loa Observatory in Hawaii, leaving the concentration at 404.02 parts per million for February, based on preliminary data. Pieter Tans, lead scientist of NOAA’s Global Greenhouse Gas Reference Network, confirmed that the increase, reported previously by New Scientist, represented a record year-over-year growth for Mauna Loa. He also said that in addition to the stark rise in carbon dioxide levels over the past year, researchers have now observed four straight years of increases of more than 2 parts per million in the atmosphere. “We’ve never seen that,” Tans said. “That’s unprecedented.” Indeed, the average annual increase during 2015, of 3.05 parts per million of carbon dioxide at Mauna Loa, was also the highest in the record, according to NOAA — exceeding the previous record of 2.93 parts per million in 1998, which was also a strong El Nino year. Pre-industrial levels of carbon dioxide were just 280 parts per million, rather than over 400 right now — and when the measurement record began at Mauna Loa in the late 1950s, were below 320 parts per million. So we have come a very long way, and very fast. Tans said the reason is very clear: Rates of fossil fuel burning remain at historically high levels, releasing 10 billion metric tons of carbon into the atmosphere annually. “The emissions are at a record high, therefore the growth rate of atmospheric CO2 is also at a record high,” he said. However, there also appears to be a role for the El Nino phenomenon in the records this year. 
“CO2 tends to rise much faster during and just following El Niño events,” wrote Ralph Keeling, director of the Scripps Institution of Oceanography carbon dioxide program and son of Charles David Keeling (after whom the iconic graph of rising greenhouse gas concentrations is named), last October. At the time, Keeling forecast that because of the current El Niño event, we would probably never see CO2 levels decline below 400 again “in our lifetimes.” In that post, Keeling also explained why CO2 goes up so much during El Niño. It’s because of the way the phenomenon tends to drive droughts across the tropics, which in turn leads forests, like those in Indonesia, to lose carbon in wildfires — which happened at a massive scale in 2015. Drought also stunts forest growth, which leads to less carbon dioxide removal from the atmosphere, Keeling wrote. “The loss of carbon from tropical forests in El Niño years is temporary as the forests tend to regrow in normal years, building back their biomass and sucking CO2 out of the air in the process,” Keeling concluded. “But the eventual recovery from this El Niño won’t bring us back below 400 ppm, because its impact will be dwarfed by the global consumption of fossil fuels, pushing CO2 levels ever higher.” Of late, the growth rate for carbon dioxide concentrations in the atmosphere has been around 2.2 parts per million per year. Greenhouse gas concentrations in the atmosphere fluctuate over the course of each year, forming a classic “saw-toothed curve,” due to the way that some parts of the Earth’s system (like trees and plants) pull more carbon out of the air during the northern hemisphere spring. That means that the level in February of this year, 404 parts per million as measured at Mauna Loa, will decline somewhat over the coming months. But overall, despite these fluctuations, the trend has been steadily upward. “Why should people be troubled by that?” Tans said.
“There is no scientific doubt that higher CO2 in the atmosphere causes the inner heat balance of the Earth to change … This will have impacts on the climate.” He said the precise impacts can be difficult to pinpoint and predict, but they are unmistakable nonetheless. “What is very certain is that we do have an impact on the Earth’s climate that is risky,” he said. “We have no other home.”
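At the roughly 2.2 parts per million per year quoted above, the trajectory is easy to extrapolate. A linear projection from the February level, ignoring the seasonal saw-tooth and any acceleration, and using 450 ppm purely as an illustrative threshold:

```python
# Linear projection of Mauna Loa CO2 at the recent average growth rate.
# Ignores the seasonal cycle and any acceleration, so this is a
# conservative extrapolation if emissions keep growing.
level = 404.0      # ppm, roughly the February 2016 reading
rate = 2.2         # ppm per year, recent average growth
target = 450.0     # an illustrative threshold (assumed here)

years = (target - level) / rate
print(f"about {years:.0f} years to reach {target:.0f} ppm at current growth")
```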
News Article | March 11, 2016
The level of carbon dioxide in the Earth's atmosphere is one for the books, having reached a record level in 2015. In a recent report by the National Oceanic and Atmospheric Administration (NOAA), experts from the agency's Earth System Research Laboratory and the Scripps Institution of Oceanography independently measured the amount of carbon dioxide in the atmosphere. The results set a new record, raising concerns about the greenhouse gas and the impacts of global warming. Lead investigator Pieter Tans of the agency's Global Greenhouse Gas Reference Network says that levels of carbon dioxide are rising more swiftly than they have in hundreds of thousands of years. "It's explosive compared to natural processes," he says. Data on the year-to-year rate of carbon dioxide increase were collected at NOAA's Mauna Loa Observatory in Hawaii. The numbers show that carbon dioxide increased by 3.05 parts per million in 2015, the biggest annual increase documented in 56 years of measurements. Tans also notes that 2015 was the fourth successive year that carbon dioxide rose by more than two parts per million, another first. Before 1800, the concentration of carbon dioxide in the Earth's atmosphere averaged 280 parts per million; in February 2016, it stood at 402.59 parts per million. The last comparable natural change came between 17,000 and 11,000 years ago, when atmospheric carbon dioxide rose by about 80 parts per million; at present, Tans says, the rise is some 200 times faster. NOAA says one reason for the massive leap in carbon dioxide levels is the current El Niño, which drives wildlife, forests and other natural systems to respond to changes in weather, drought and precipitation. The biggest previous rise in carbon dioxide levels came in 1998, also a strong El Niño year.
El Niño impairs the ability of trees and other natural systems to absorb carbon dioxide as they adjust to disrupted weather patterns, so more of the greenhouse gas remains in the atmosphere. But while weather plays a significant role in year-to-year carbon dioxide increases, the main driver remains the burning of fossil fuels such as oil, natural gas and coal. Global greenhouse gas emissions were slightly lower in 2015 than in previous years, but not low enough to offset the carbon dioxide already accumulating. Truly reversing the trend would require so-called "negative emissions," in which the planet absorbs more carbon dioxide than humans emit.
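The "negative emissions" condition comes down to a simple budget: the atmospheric carbon dioxide stock changes by emissions minus uptake, so it falls only when uptake exceeds emissions. A tiny sketch with illustrative numbers (not from the article):

```python
# Atmospheric CO2 budget in one line: positive result = CO2 accumulates,
# negative result = "negative emissions". Numbers here are illustrative only.

def atmospheric_change(emissions_gt, uptake_gt):
    """Net change in atmospheric carbon (GtC/yr)."""
    return emissions_gt - uptake_gt

print(atmospheric_change(10.0, 5.0))   # 5.0  -> CO2 still rising
print(atmospheric_change(10.0, 11.0))  # -1.0 -> negative-emissions territory
```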
News Article | December 20, 2016
Carbon dioxide (CO2) is not the only greenhouse gas on the rise. Since 2007, methane—which molecule-for-molecule has 30 times the warming effect of CO2—has risen by more than 3%. Befuddled scientists have tried to pin the growth on increased natural gas drilling, rising rice cultivation, and a surge in bovine belches. But none of these explanations has stuck. Now, two more processes have gained ground as possible culprits, according to new work presented here last week at a meeting of the American Geophysical Union (AGU). In one scenario, methane’s rise may come in part from a drop in hydroxyl, a chemical that acts as an atmospheric detergent; in the other, the gas is emanating from tropical wetlands flooded by heavy rains in recent years.
Because climate change is expected to increase tropical rainfall, this methane could be a new signal that “the tropics are changing fast,” says Euan Nisbet, a climate researcher at Royal Holloway, University of London—and a warning that methane may continue to rise as the world warms in a positive climate feedback. “Methane may be a tropical parallel to Arctic sea ice,” Nisbet says.
Methane is normally held in check by the hydroxyl radical (OH), which scrubs nearly all of it out of the atmosphere within a decade. Formed in the presence of sunlight by water vapor and pollutants like ozone and nitrogen oxides, hydroxyl is hard to measure, because it persists for just a second in the air before it reacts away. Instead, scientists gauge OH abundance by looking to proxies—chemicals that react with hydroxyl. One proxy is methyl chloroform, banned years ago for contributing to the ozone hole. A steady decline of that compound in the wake of regulations would suggest that hydroxyl is relatively constant, an assumption baked into most models of the methane rise. 
“OH tends to get ignored a bit in discussions even in the science community,” says Michael Newland, an atmospheric chemist who recently completed a postdoc at the University of East Anglia in Norwich, U.K. But a close look at the methyl chloroform trend shows that the decline isn’t as steady or certain as many have assumed. Instead it reveals bumps that could indicate a loss of OH, according to research presented at the AGU conference by Alexander Turner, a graduate student in atmospheric chemistry at Harvard University. The overall oxidative capacity of the atmosphere could be declining, allowing methane to linger longer, he says. Newland says that a fall in nitric oxide and nitrogen dioxide from clean air regulations could have slowed the production of OH. But Turner cautions that the case for declining OH is far from closed: With different assumptions, his sparse data could just as easily chart a rise in methane emissions rather than a hydroxyl decline.
*Correction, 23 December, 11:45 a.m.: A previous version of this story incorrectly described the hydroxyl radical as OH-, which refers to the hydroxide ion. Moreover, the article misstated a pollutant. In fact, nitrogen oxides, not nitrous oxide, are the pollutants that play an important role in hydroxyl chemistry.
And so scientists continue to look for unrecognized new sources. Cows are unlikely, as their numbers saw their steepest increase between 2000 and 2006, when methane levels were flat. There’s little evidence of thawing Arctic permafrost pumping out more methane. Expanded rice growing could play a role. But Ed Dlugokencky, an atmospheric chemist at the National Oceanic and Atmospheric Administration’s Earth System Research Laboratory in Boulder, Colorado, sees another potential source: the heavy rains that washed over the tropics from 2008 to 2014, creating a surge in wetlands and methane-spewing microbes. “This is all very anecdotal,” says Dlugokencky, who gave a talk at the AGU meeting. 
“But I think it paints a consistent picture of what’s going on.” In recent years, researchers have noticed another clue to the puzzle: The carbon atoms in atmospheric methane molecules have shifted toward lighter isotopes. Because life prefers lighter carbon, the isotopes suggest to some scientists that the atmospheric rise must be due to extra microbial production, and not a boost due to leaked gas from fracking operations, which has a heavier isotopic signature. But the lighter carbon could also signal that hydroxyl levels are falling, says Joe McNorton, a climate scientist at the University of Leeds in the United Kingdom. Because OH prefers to react with lighter carbon, having less of it around would allow more of the light, microbial methane to linger in the atmosphere. It’s clear that methane scientists will need to come together to resolve this debate, as several factors are likely playing a role, says Eric Kort, an atmospheric chemist at the University of Michigan in Ann Arbor. “Trying to use any one single source or sink, or single measurement technique, to define exactly what’s happening is perhaps too simple.” The payoff will be a clearer sense of the future: whether the methane rise is a long-term trend, driven by climate change, or a blip that could reverse next year.
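The degeneracy Turner describes, in which rising emissions and falling hydroxyl fit the observations about equally well, can be illustrated with a one-box budget sketch (my own illustration with made-up numbers, not either team's model). Methane concentration C obeys dC/dt = E - C/tau, where emissions E feed the atmosphere and the hydroxyl-set lifetime tau of about a decade drains it; nudging E up or tau up by the same few percent produces nearly identical rises:

```python
# One-box methane budget: dC/dt = E - C/tau, integrated with yearly Euler steps.
# Illustrative units only; tau ~ 10 yr is the decade-scale lifetime from the article.

def simulate(E, tau, years, C):
    """Return concentration after `years` annual steps of dC/dt = E - C/tau."""
    for _ in range(years):
        C += E - C / tau
    return C

tau0 = 10.0        # methane lifetime set by hydroxyl (~a decade)
E0 = 180.0         # arbitrary emissions chosen so the steady state is 1800
C0 = E0 * tau0     # start in steady state: the flat pre-2007 plateau

rise_A = simulate(E0 * 1.02, tau0, 10, C0)   # scenario A: emissions up 2%
rise_B = simulate(E0, tau0 / 0.98, 10, C0)   # scenario B: OH down 2%, tau up ~2%

print(round(rise_A), round(rise_B))  # both climb above 1800, nearly identically
```

From concentration data alone the two scenarios are almost indistinguishable, which is why both teams had to reach for the methyl chloroform proxy rather than methane measurements themselves.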
News Article | December 20, 2016
The Bureau of Reclamation is launching a new prize challenge, the Sub-Seasonal Climate Forecast Rodeo, asking solvers to improve existing sub-seasonal forecasts and to develop systems that perform demonstrably better than the existing baseline forecast for predicting temperature and precipitation over a 15- to 42-day time frame. Winners of the challenge will share up to $800,000 in prize money. Solvers will have three months to develop their systems, at which point they are asked to provide forecasts every two weeks over a 13-month period, with the first month serving as a "pre-season" to become familiar with the submission and evaluation processes.
Improved sub-seasonal forecasts for temperature and precipitation, with lead times ranging from 15 to 45 days and beyond, would allow water managers to better prepare for shifts in hydrologic regimes, such as the onset of drought or the occurrence of wet weather extremes. Skillful sub-seasonal forecasting 15 to 45 days out has proven difficult because it bridges short-term forecasting, where initial conditions primarily determine upcoming weather, and long-term forecasting, in which slowly varying factors such as sea surface temperatures and soil moisture become more important.
Reclamation is collaborating with the National Oceanic and Atmospheric Administration's Earth System Research Laboratory in Boulder, Colorado, and NOAA's Climate Prediction Center in College Park, Maryland, to design and judge the challenge. In addition, the U.S. Geological Survey and the U.S. Army Corps of Engineers contributed subject matter experts to review and assist with the design of the challenge. To register and learn more about this prize challenge, visit http://www.challenge.gov. To learn more about Reclamation's Water Prize Challenge Center, visit https://www.usbr.gov/research/challenges/.
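Beating a baseline forecast is commonly quantified with a skill score; one standard form, shown here as a hedged sketch with toy numbers (the Rodeo's actual scoring rules are defined on challenge.gov, not here), is skill = 1 - MSE(forecast) / MSE(baseline), where a score above zero means the new system beats the baseline and 1.0 would be a perfect forecast:

```python
# Skill score of a forecast relative to a baseline, using mean squared error.
# Toy temperature anomalies (deg C); the baseline plays the role of climatology.

def mse(pred, obs):
    """Mean squared error between predictions and observations."""
    return sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)

def skill_score(forecast, baseline, observed):
    """1 - MSE(forecast)/MSE(baseline): > 0 beats the baseline, 1.0 is perfect."""
    return 1.0 - mse(forecast, observed) / mse(baseline, observed)

observed = [1.2, -0.4, 0.8, 0.1]
baseline = [0.0, 0.0, 0.0, 0.0]   # e.g. "forecast the long-term average"
forecast = [0.9, -0.2, 0.5, 0.3]  # a candidate system's predictions

print(round(skill_score(forecast, baseline, observed), 2))  # 0.88
```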