Seattle, WA, United States

The National Oceanic and Atmospheric Administration is a scientific agency within the United States Department of Commerce focused on the conditions of the oceans and the atmosphere. NOAA warns of dangerous weather, charts seas and skies, guides the use and protection of ocean and coastal resources, and conducts research to improve understanding and stewardship of the environment. In addition to its roughly 12,000 civilian employees (as of 2012), NOAA's research and operations are supported by 300 uniformed service members who make up the NOAA Commissioned Officer Corps. The current Under Secretary of Commerce for Oceans and Atmosphere and agency administrator is Kathryn D. Sullivan, who was nominated on February 28, 2013, and confirmed on March 6, 2014. (Source: Wikipedia)



Patent
National Oceanic and Atmospheric Administration | Date: 2016-08-27

A system for expressing an ion path in a time-of-flight (TOF) mass spectrometer. The present invention uses two successive curved sectors, with the second one reversed, to form an S-shaped configuration such that the output ion beam is parallel to the input ion beam, the ions make two identical but opposed turns, and the geometry of the entire system folds into a very compact volume. The geometry of a TOF mass spectrometer system in accordance with embodiments of the present invention further includes straight drift regions positioned before and after the S-shaped configuration and, optionally, a short straight region positioned between the two curved sectors with a total length approximately equal to the combined length of the central arcs of both curved sectors.
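The flight-path arithmetic implied by this geometry is straightforward. Below is a minimal sketch of it; the parameter names and example values are illustrative assumptions, since the patent summary gives no numbers.

```python
# Illustrative sketch (not the patent's implementation): total ion flight-path
# length for the S-shaped TOF geometry described above -- two identical, opposed
# circular sectors plus straight drift regions before and after, and an optional
# straight region between the sectors roughly equal to both central arcs combined.
import math

def s_path_length(radius_m, turn_deg, drift_in_m, drift_out_m, with_mid_section=True):
    """Return the flight-path length in meters (all values assumed)."""
    arc = radius_m * math.radians(turn_deg)       # central-arc length of one sector
    mid = 2.0 * arc if with_mid_section else 0.0  # optional straight region
    return drift_in_m + arc + mid + arc + drift_out_m

# Example: two 180-degree sectors of 0.25 m radius with 0.3 m drift regions.
print(f"{s_path_length(0.25, 180.0, 0.3, 0.3):.3f} m")  # ~3.7 m folded path
```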


Minello T.J.,National Oceanic and Atmospheric Administration
Journal of Experimental Marine Biology and Ecology | Year: 2017

Laboratory experiments were conducted under simulated daytime conditions to examine the effects of salinity, sediment texture, size, density, and hunger on burrowing behavior of juvenile brown shrimp Farfantepenaeus aztecus and white shrimp Litopenaeus setiferus. Over all experimental conditions (20,929 observations of 2411 individual shrimp), 77.5% of brown shrimp and 21.4% of white shrimp were observed burrowed with more than half of their body beneath the substrate. The tendency of burrowed shrimp to emerge from burrows when disturbed also was tested. When burrowing rates were examined in combination with this tendency to emerge upon disturbance, only 46.7% of brown shrimp would be susceptible to capture in towed nets, while almost all (97%) white shrimp would be susceptible. All environmental factors examined in this study, except salinity for white shrimp, significantly affected burrowing of these species. When these environmental effects on burrowing were combined with the likelihood of emergence, however, the effects of salinity and substrate type on brown shrimp behavior appeared most likely to affect capture by towed nets. Estuarine abundance indices from resource surveys using towed nets could be adjusted using such vulnerability estimates. © 2016
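The abstract's capture figures combine two probabilities: a shrimp escapes a towed net only if it is burrowed and stays burrowed when disturbed. A minimal sketch of that bookkeeping follows; the emergence probabilities are back-calculated from the published percentages rather than reported directly, so treat them as assumptions.

```python
# Vulnerability to a towed net = fraction not burrowed, plus fraction burrowed
# but emerging when disturbed. Emergence probabilities are inferred, not published.
def net_vulnerability(p_burrowed, p_emerge_if_disturbed):
    return (1.0 - p_burrowed) + p_burrowed * p_emerge_if_disturbed

brown = net_vulnerability(0.775, 0.312)  # ~46.7%, matching the abstract
white = net_vulnerability(0.214, 0.860)  # ~97%, matching the abstract
print(f"brown shrimp: {brown:.1%}, white shrimp: {white:.1%}")
```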


Eberhard W.L.,National Oceanic and Atmospheric Administration
Applied Optics | Year: 2017

The maximum likelihood estimator (MLE) is derived for retrieving the extinction coefficient and zero-range intercept in the lidar slope method in the presence of random and independent Gaussian noise. Least-squares fitting, weighted by the inverse of the noise variance, is equivalent to the MLE. Monte Carlo simulations demonstrate that two traditional least-squares fitting schemes, which use different weights, are less accurate. Alternative fitting schemes that have some positive attributes are introduced and evaluated. The principal factors governing accuracy of all these schemes are elucidated. Applying these schemes to data with Poisson rather than Gaussian noise alters accuracy little, even when the signal-to-noise ratio is low. Methods to estimate optimum weighting factors in actual data are presented. Even when the weighting estimates are coarse, retrieval accuracy declines only modestly. Mathematical tools are described for predicting retrieval accuracy. Least-squares fitting with inverse variance weighting has optimum accuracy for retrieval of parameters from single-wavelength lidar measurements when noise, errors, and uncertainties are Gaussian distributed, or close to optimum when only approximately Gaussian. © 2017 Optical Society of America.
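For readers unfamiliar with the slope method: taking the logarithm of the range-corrected signal turns extinction retrieval into a straight-line fit, and the paper's result is that weighting that fit by the inverse noise variance is equivalent to the MLE. A minimal sketch on synthetic data follows; the extinction value, lidar constant, and noise level are assumptions.

```python
# Slope method: ln(S(r) * r^2) = ln(C) - 2*sigma*r, so the fitted slope gives
# the extinction coefficient sigma and the intercept the zero-range value.
import numpy as np

rng = np.random.default_rng(0)
r = np.linspace(200.0, 2000.0, 100)        # range gates [m]
sigma_true, c_true = 1e-3, 1e9             # assumed extinction [1/m] and constant
signal = c_true * np.exp(-2.0 * sigma_true * r) / r**2
noise_std = 0.05 * signal                  # assumed Gaussian noise level
obs = signal + rng.normal(0.0, noise_std)

y = np.log(obs * r**2)
y_std = noise_std / signal                 # propagated std of ln(S * r^2)
# np.polyfit weights multiply the residuals, so w = 1/std realizes
# inverse-variance weighted least squares (the MLE-equivalent scheme).
slope, _intercept = np.polyfit(r, y, 1, w=1.0 / y_std)
print(f"retrieved extinction: {-slope / 2:.2e} 1/m (true {sigma_true:.1e})")
```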


Rogers J.W.,National Oceanic and Atmospheric Administration | Cohen A.E.,National Oceanic and Atmospheric Administration | Carlaw L.B.,National Weather Service Forecast Office
Weather and Forecasting | Year: 2017

This comprehensive analysis of convective environments associated with thunderstorms affecting portions of central and southern Arizona during the North American monsoon focuses on both observed soundings and mesoanalysis parameters relative to lightning flash counts and severe-thunderstorm reports. Analysis of observed sounding data from Phoenix and Tucson, Arizona, highlights several moisture and instability parameters exhibiting moderate correlations with 24-h, domain-total lightning and severe thunderstorm counts, with accompanying plots of the precipitable water, surface-based lifted index, and 0-3-km layer mixing ratio highlighting the relationship to the domain-total lightning count. Statistical techniques, including stepwise, multiple linear regression and logistic regression, are applied to sounding and gridded mesoanalysis data to predict the domain-total lightning count and individual gridbox 3-h-long lightning probability, respectively. Applications of these forecast models to an independent dataset from 2013 suggest some utility in probabilistic lightning forecasts from the regression analyses. Implementation of this technique into an operational forecast setting to supplement short-term lightning forecast guidance is discussed and demonstrated. Severe-thunderstorm-report predictive models are found to be less skillful, which may partially be due to substantial population biases noted in storm reports over central and southern Arizona. © 2017 American Meteorological Society.
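To make the gridbox probability idea concrete, here is a hedged sketch of a logistic regression on the kinds of predictors the abstract names. The data are synthetic stand-ins, not the paper's mesoanalysis dataset, and the coefficients are invented.

```python
# Logistic regression for a 3-h gridbox lightning probability, in the spirit
# of the approach described above. All data here are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 5000
pwat = rng.uniform(10.0, 50.0, n)   # precipitable water [mm]
sbli = rng.uniform(-8.0, 6.0, n)    # surface-based lifted index [K]
mixr = rng.uniform(4.0, 14.0, n)    # 0-3-km layer mixing ratio [g/kg]
# Synthetic truth: moist, unstable grid boxes see lightning more often.
logit = 0.08 * pwat - 0.5 * sbli + 0.2 * mixr - 5.0
lightning = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

model = LogisticRegression().fit(np.column_stack([pwat, sbli, mixr]), lightning)
# Probability for one grid box: PW = 40 mm, LI = -4 K, mixing ratio = 10 g/kg.
print(model.predict_proba([[40.0, -4.0, 10.0]])[0, 1])
```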


Carlaw L.B.,National Weather Service Forecast Office | Cohen A.E.,National Oceanic and Atmospheric Administration | Rogers J.W.,National Oceanic and Atmospheric Administration
Weather and Forecasting | Year: 2017

This paper comprehensively analyzes the synoptic and mesoscale environment associated with North American monsoon-related thunderstorms affecting central and southern Arizona. Analyses of thunderstorm environments are presented using reanalysis data, severe thunderstorm reports, and cloud-to-ground lightning information from 2003 to 2013, which serves as a springboard for lightning-prediction models provided in a companion paper. Spatial and temporal analyses of lightning strikes indicate thunderstorm frequencies maximize between 2100 and 0000 UTC, when the greatest frequencies are concentrated over higher terrain. Severe thunderstorm reports typically occur later in the day (between 2300 and 0100 UTC), while reports are maximized in the Tucson and Phoenix metropolitan areas. Composite analyses of the synoptic-scale patterns associated with severe thunderstorm days and nonthunderstorm days during the summer using the North American Regional Reanalysis dataset are presented. Severe thunderstorm cases tend to be associated with a stronger midlevel anticyclone and deep-layer moisture over portions of the southwestern United States. By September, severe weather patterns tend to be associated with a midlevel trough along the Pacific coast. Specific parameters associated with severe thunderstorms are analyzed across the Tucson and Phoenix areas, where severe weather reporting is more consistent. Greater convective available potential energy, low-level lapse rates, and downdraft convective available potential energy are associated with severe thunderstorm (especially severe wind) environments compared to those with nonsevere thunderstorms, while stronger effective bulk wind differences (at least 15-20 kt, where 1 kt = 0.51 m s⁻¹) can be used to distinguish severe hail environments. © 2017 American Meteorological Society.
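The quoted hail discriminator is a simple threshold with a unit conversion attached. The sketch below just encodes that arithmetic; the function name and single-threshold framing are illustrative, not the paper's method.

```python
# Effective bulk wind difference (EBWD) threshold for severe-hail environments,
# per the 15-20 kt range quoted above, with 1 kt = 0.51 m/s.
KT_TO_MS = 0.51

def hail_favorable(ebwd_kt, threshold_kt=15.0):
    """Flag environments whose EBWD meets the quoted lower bound (assumed)."""
    return ebwd_kt >= threshold_kt

for ebwd in (12.0, 18.0):
    print(f"EBWD {ebwd:.0f} kt = {ebwd * KT_TO_MS:.1f} m/s -> {hail_favorable(ebwd)}")
```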


Dong L.,National Oceanic and Atmospheric Administration | McPhaden M.J.,National Oceanic and Atmospheric Administration
Journal of Climate | Year: 2017

Both the Indian and Pacific Oceans exhibit prominent decadal time scale variations in sea surface temperature (SST), linked dynamically via atmospheric and oceanic processes. However, the relationship between SST in these two basins underwent a dramatic transformation beginning around 1985. Prior to that, SST variations associated with the Indian Ocean basin mode (IOB) and the interdecadal Pacific oscillation (IPO) were positively correlated, whereas afterward they were much less clearly synchronized. Evidence is presented from both observations and coupled state-of-the-art climate models that enhanced external forcing, particularly from increased anthropogenic greenhouse gases, was the principal cause of this changed relationship. Using coupled climate model experiments, it is shown that without external forcing, the evolution of the IOB would be strongly forced by variations in the IPO. However, with strong external forcing, the dynamical linkage between the IOB and the IPO weakens so that the negative phase IPO after 2000 is unable to force a negative phase IOB-induced cooling of the Indian Ocean. This changed relationship in the IOB and IPO led to unique SST patterns in the Indo-Pacific region after 2000, which favored exceptionally strong easterly trade winds over the tropical Pacific Ocean and a pronounced global warming hiatus in the first decade of the twenty-first century. © 2017 American Meteorological Society.
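One way to visualize the changed IOB-IPO relationship the abstract describes is a sliding-window correlation between the two indices, which would be positive before the mid-1980s and weaker afterward. The sketch below uses synthetic index series as placeholders for the paper's data.

```python
# Sliding-window Pearson correlation between two decadal SST indices.
# The IPO and IOB series here are synthetic illustrations, not observations.
import numpy as np

def rolling_corr(x, y, window):
    """Correlation of x and y in overlapping windows of the given length."""
    return np.array([np.corrcoef(x[i:i + window], y[i:i + window])[0, 1]
                     for i in range(len(x) - window + 1)])

years = np.arange(1950, 2015)
rng = np.random.default_rng(1)
ipo = np.sin(2 * np.pi * (years - 1950) / 50.0) + 0.3 * rng.standard_normal(years.size)
# Before ~1985 the IOB tracks the IPO; afterward external forcing decouples them.
iob = np.where(years < 1985, ipo, 0.02 * (years - 1985)) + 0.3 * rng.standard_normal(years.size)

corr = rolling_corr(ipo, iob, window=15)
print(f"corr 1950-64: {corr[0]:.2f}, corr 2000-14: {corr[-1]:.2f}")
```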


Chiodi A.M.,University of Washington | Chiodi A.M.,National Oceanic and Atmospheric Administration | Harrison D.E.,University of Washington | Harrison D.E.,National Oceanic and Atmospheric Administration
Journal of Climate | Year: 2017

The fundamental importance of near-equatorial zonal wind stress in the evolution of the tropical Pacific Ocean's seasonal cycle and El Niño-Southern Oscillation (ENSO) events is well known. It has been two decades since the TAO/TRITON buoy array was deployed, in part to provide accurate surface wind observations across the Pacific waveguide. It is timely to revisit the impact of TAO/TRITON winds on our ability to simulate and thereby understand the evolution of sea surface temperature (SST) in this region. This work shows that forced ocean model simulations of SST anomalies (SSTAs) during the periods with a reasonably high buoy data return rate can reproduce the major elements of SSTA variability during ENSO events using a wind stress field computed from TAO/TRITON observations only. This demonstrates that the buoy array usefully fulfills its waveguide-wind-measurement purpose. Comparison of several reanalysis wind fields commonly used in recent ENSO studies with the TAO/TRITON observations reveals substantial biases in the reanalyses that cause substantial errors in the variability and trends of the reanalysis-forced SST simulations. In particular, the negative trend in ERA-Interim is much larger and the NCEP-NCAR Reanalysis-1 and NCEP-DOE Reanalysis-2 variability much less than seen in the TAO/TRITON wind observations. There are also mean biases. Thus, even with the TAO/TRITON observations available for assimilation into these wind products, there remain oceanically important differences. The reanalyses would be much more useful for ENSO and tropical Pacific climate change study if they would more effectively assimilate the TAO/TRITON observations.
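The study's core input is a wind stress field computed from buoy winds. As background, surface stress is usually obtained from 10-m winds with a bulk formula; the sketch below shows that standard calculation with an assumed constant drag coefficient, a simplification of whatever algorithm the authors actually used.

```python
# Standard bulk formula for zonal wind stress: tau_x = rho_air * Cd * |U| * u.
# The constant drag coefficient is an assumption for illustration.
import numpy as np

RHO_AIR = 1.2   # air density [kg/m^3]
CD = 1.2e-3     # neutral 10-m drag coefficient (assumed constant)

def zonal_wind_stress(u, v):
    """Zonal stress [N/m^2] from 10-m wind components [m/s]."""
    return RHO_AIR * CD * np.hypot(u, v) * u

# Easterly trades of ~6 m/s exert a westward (negative) stress on the ocean.
print(f"{zonal_wind_stress(-6.0, 1.0):.4f} N/m^2")
```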


A statement in this recently published paper makes a point that is largely at odds with the main conclusion of the paper it cites. Stating that higher air temperatures lead to greater evapotranspiration is an oversimplification; the true story is more complex. Although this is by no means central to the conclusions of the paper being commented on, we have demonstrated the danger of taking too literally the idea that air temperature determines potential evapotranspiration. © 2017, by the authors; licensee MDPI, Basel, Switzerland.


News Article | May 3, 2017
Site: www.nature.com

For more than 80 years, the US media and political scholars have gauged a new president’s potential on the basis of his administration’s performance in its first 100 days. The time-honoured — if increasingly tiresome — tradition began when Franklin Delano Roosevelt took office in 1933. He was able to act fast, being blessed with a Congress controlled by his own political party. And he needed to do so, his presidency being cursed with the Great Depression, a financial emergency that demanded quick and decisive action. The 100-days benchmark is by its nature arbitrary, and projections based on it can be superficial. Many historic presidential achievements — such as Barack Obama’s reform of the US health-care system — happened in the subsequent days (all 1,361 of them) of a typical four‑year term. But the first few months of an administration are crucial to filling staff vacancies. The White House Transition Project (WHTP), a non-partisan effort to ease the transfer of power by providing information to the new administration’s staff, has found that posts take longer to fill as time drags on: the longer a president waits to nominate candidates, the slower the US Senate is to confirm them. When Trump passes the 100-day mark this weekend, he will become the slowest president to stock an administration in four decades, the WHTP says. He has yet to nominate candidates for hundreds of empty seats — some of them vitally important to the country’s science policy and research direction. On 12 April, Republicans in Congress wrote to Trump to urge him to fill two vacancies on the Nuclear Regulatory Commission, an independent panel that oversees civilian use of radioactive materials in power plants and other applications. Without those appointments, the commission will not have the number of people mandated by law for it to make decisions when its chair’s term ends on 1 July. The Federal Energy Regulatory Commission fell below that level in February. Trump has blamed some delays on the Senate, which must confirm nearly 1,000 of his appointments. But he has also suggested that his inaction is a deliberate strategy to pare down the size of government. “What do all of these people do?” he said in one interview. “You don’t need all those jobs.” The United States certainly needs some of them. Whether owing to impairment, intention or inexperience, Trump’s dithering over key scientific positions puts the country’s research community and the broader public at risk. For many researchers, the main concern has been the lack of a science adviser to head the Office of Science and Technology Policy. The absence of a voice for science in the administration may have contributed to the draconian cuts to the US National Institutes of Health and Environmental Protection Agency proposed in Trump’s 2018 budget blueprint. (The proposal didn’t mention the National Science Foundation at all.) A science adviser could also have informed the administration of the damaging consequences for science of proposed immigration policies. And he or she could offer counsel when scientific crises — the next Zika virus or oil spill, for example — arise. And arise they will. Biomedical researchers, meanwhile, are waiting to see how long Francis Collins will continue to serve as director of the US National Institutes of Health. The National Cancer Institute, the head of which is also appointed by the president, has been led by its deputy director since April 2015. 
And some of the major science agencies, including NASA and the National Oceanic and Atmospheric Administration, lack a leader. The uncertainty makes it difficult for agencies to plan ahead, negotiate for resources and launch initiatives. And the patchwork of vacancies will debilitate efforts to deal with emerging crises, which often require a coordinated response across agencies. In 1933, Roosevelt passed 76 laws in his first 100 days as he laboured to reshape the nation’s economy. He set a high bar for efficiency: no president has measured up to his achievement since. (Eight years into his presidency, Roosevelt appointed the nation’s first science adviser, engineer Vannevar Bush.) With any new president comes uncertainty, and no administration completes its full roster of appointments by the end of its first year. But Trump is lagging well behind his predecessors, and is fostering a damaging sense of uncertainty by suggesting that he will leave these chairs empty.


News Article | April 13, 2017
Site: www.techtimes.com

Right whales are listed as an endangered species. Researchers estimate that only about 500 North Atlantic right whales remain, and the future of these marine animals looks grim. Scientists revealed that right whales gave birth to the fewest calves last winter in 17 years, raising concerns over the declining population of the already endangered species. Each winter, right whales migrate to the warmer Atlantic waters off Florida and Georgia to give birth. The trained spotters who observe mother-and-calf pairs during aerial surveys said that there were very few sightings this year. They spotted only three newborn whales swimming alongside their mothers. The number is the lowest reported since 2000, when spotters sighted only one calf, and far below the yearly average of 17. One bad year does not necessarily mean that right whale reproduction is in trouble, since birth numbers vary from year to year, but researchers have observed below-average births since the 2012 calving season. This year's poor birth turnout can also affect the species' capacity to reproduce a decade from now, when the newborns would be sexually mature enough to have calves of their own. This year likewise marks the first time since 2001 that researchers did not find first-time mothers among the whales that gave birth. Mother whales are identified using distinct markings on their heads. Scientists from the National Oceanic and Atmospheric Administration (NOAA) have developed facial-recognition software for the whales in a bid to save the species. The three whales that were sighted swimming with their calves were not first-time mothers, but none had calved in seven to eight years, far longer than the typical interval between births; right whales normally give birth once every three years or so. Wildlife biologist Clay George from the Georgia Department of Natural Resources, who is involved in right whale surveys, said it may be too early to be gloomy about the flat or declining population of the whales, citing how the population of the species turned around in the 2000s. Right whale births bottomed out in 2000, but the population rebounded with a baby boom of 31 newborns the following year. Some signs indicate that births may improve next year: researchers observed more whales feeding in the Bay of Fundy off Nova Scotia last summer than they had seen in several years, and the right whales returning to Cape Cod look more robust than in past spring seasons. Researchers also said there may still be an addition to this year's calf count, as they were trying on Wednesday, April 12, to confirm a report of a fourth mother-newborn pair sighted in Cape Cod. "I'm somewhat hopeful next year will be better," said whale researcher Philip Hamilton, from the New England Aquarium in Boston. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.


News Article | March 30, 2017
Site: www.scientificcomputing.com

Model focuses on triggers for seasonal ice breakup that could improve accuracy of sea-level rise predictions. Projections of how much the melting of ice sheets will contribute to sea-level rise can vary by several meters based on the rate of iceberg calving at the edges of those ice sheets. To provide climate scientists with models that make more accurate forecasts, a postdoctoral researcher at Caltech has created a computer simulation of one of the key processes controlling glacial calving. Glaciers are moving slabs of ice that slowly grind downhill. Where they end in the sea, chunks break off, forming icebergs in a process known as calving. When temperatures plummet in the winter, those icebergs can freeze together and create a traffic jam that prevents further icebergs from breaking off from the glacier. During the winter, the glacier loses much less ice to the sea. The eventual spring breakup of what is known as the mélange—that frozen iceberg logjam—occurs suddenly, and is the focus of research by Caltech's Alexander Robel. "I developed a computer model that simulates how the first iceberg calving of the warm season creates a shock wave that travels through the jammed mélange, breaking it up," says Robel, a National Oceanic and Atmospheric Administration (NOAA) Postdoctoral Scholar and a Stanback Postdoctoral Scholar at Caltech. His new model was featured in Nature Communications on February 28. The mélange is a frozen granular material, so Robel adapted an open-source computer simulation called the Discrete-Element Bonded-Particle Sea Ice Model to show how icebergs freeze together in the winter and then transmit the shock of the first iceberg calving in the summer. That first calving is made possible by the thinning of sea ice in warmer water, which reduces the ability of the mélange to act as a bulwark against the glacier. Robel tailored his modeled glaciers to resemble fjords in Greenland. Those fjords are narrow channels of water that are prone to trapping mélange. Robel was able to show that the threshold at which spring sea-ice breakup is likely to occur depends in part on the thickness of sea ice within the mélange, but also on the shape of the channel within which the mélange is trapped. Robel, who is a researcher in Caltech's Division of Geological and Planetary Sciences, home to the Seismological Laboratory, says his work was inspired in part by seismological studies of the way fractures propagate through elastic materials—drawing a connection between earthquakes and iceberg calving.


News Article | April 25, 2017
Site: motherboard.vice.com

"While their global warming agenda continues to lose support, it's ironic that radical environmentalists are at it again, less than a month after NOAA (National Oceanic and Atmospheric Administration), announced the Great Lakes had the most widespread ice coverage in over 35 years. Thirty years ago liberals were using global cooling to push new radical regulations. Then they shifted their focus to global warming in an effort to prop up wave after wave of job-killing regulations that are leading to skyrocketing food and energy costs." [The Times-Picayune] "Louisiana's coast is eroding… but not because of oil companies." [Captain Higgins Congress] Voted in support of S.J. Res. 24, a "resolution of disapproval" under the Congressional Review Act that would nullify the Environmental Protection Agency's Clean Power Plan—the first nation-wide limit on greenhouse gas emissions from power plants, and key climate change policy. [League of Conservation Voters] "Global temperatures have not risen in 15 years." [Washington Examiner] Call him: (225) 929-7711 | Email him "The problem has to be addressed with market-based solutions rather than government. I support energy conservation, nuclear energy and encouraging technology to burn clean burning coal. I don't support a cap and trade policy. I don't support an energy tax. If it's such a swell idea, let China go first." [The Advocate]


Animal rescuers in Orange County in Southern California are looking out for a gray whale that became entangled in what appears to be a metal fishing-gear frame. The whale was spotted in this condition at around 3:30 p.m. on Saturday, April 1, off Dana Point. As of the morning of April 4, rescuers keeping a steady watch had not spotted the entangled whale again. Gray whales are known for the long distances they travel to mate and give birth to calves. The species grows up to 49 feet long and can weigh up to 79,366 pounds. These whales derive their name from the gray patches and mottling that adorn their dark skin. The distressed gray whale was first spotted by Capt. Frank Brennan, about 2 miles off Dana Point, while he was conducting a whale-watching tour. Initially, because of the many boats around it, the whale was evasive. However, Brennan saw the whale again off Laguna Beach the same day. It appeared that the whale's head had somehow gotten stuck in the metal frame of a piece of fishing gear, which had fishing lines attached. Brennan followed the whale from Laguna Beach to Main Beach before Capt. Dave Anderson took over at 6 p.m. Anderson runs Captain Dave's Dolphin Safari and also leads Orange County's whale disentanglement team. Brennan also alerted Justin Viezbicke, network coordinator at the National Oceanic and Atmospheric Administration's marine mammal stranding network. The entangled gray whale was last spotted outside Newport Beach Harbor at sunset. Anderson avoided sending his tracking buoy after the whale because he was not sure how the whale's head was trapped in the metal bar. "We felt very uncertain about what damage it would cause to the whale with this unusual entanglement. With darkness closing in, we thought it was best to document the last location and hope for the best tomorrow," said Anderson. Viezbicke, along with other rescue groups, is trying to create a plan to remove the metal frame from the whale's head. The rescuers are trying to determine whether the metal frame is a piece of fishing gear or some other machinery. They urge anyone with information on the entangled gray whale to contact NOAA's entanglement reporting hotline at (877) 767-9425. Anderson expressed his worry, stating that this is the fourth entangled whale he has spotted in the past two months. In 2016, 71 cases of whale entanglement were reported along the coasts of California, Washington, and Oregon, with crabbing gear accounting for the largest share of West Coast entanglements. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.


News Article | April 19, 2017
Site: www.theenergycollective.com

EIA’s April 2017 Short-Term Energy Outlook (STEO) expects that electricity generation fueled by natural gas this summer (June, July, and August) will be lower than last summer, but it will continue to exceed that of any other fuel, including coal-fired generation, for the third summer in a row. The projected share of total U.S. generation for natural gas is expected to average 34%, which is down from 37% last summer but still exceeds coal’s generation share of 32%. Based on data from the National Oceanic and Atmospheric Administration (NOAA), EIA estimates that average U.S. population-weighted cooling degree days in the summer of 2016 reached the highest level on record. NOAA projections for this summer indicate cooling degree days will be 11% lower than last year. These milder expected temperatures lead to forecast U.S. summer electricity generation of 1.16 billion megawatthours, which would be 2.4% lower than generation last summer. Generation fueled by natural gas typically peaks in the summer when power plant operators use natural gas-fired combustion turbines during the hottest part of the day to meet electricity demand for air conditioning. Natural gas first exceeded coal as the nation’s primary electricity fuel on a monthly basis in April 2015 and on an annual basis in 2016. During the summer of 2016, at a time when natural gas prices were relatively low, 37% of U.S. electricity generation came from natural gas and 33% came from coal. A decade before that, in summer 2006, 25% came from natural gas and 46% from coal. The use of natural gas in the power sector is sensitive to natural gas prices. As natural gas prices have risen, the natural gas share of the electricity generation mix has fallen slightly. Over the first three months of 2017, the Henry Hub natural gas price averaged $3.01 per million Btu (MMBtu) compared with $2.00/MMBtu during the same period of 2016. As a result, the natural gas share of the U.S. electricity mix fell from 32% in the first quarter of 2016 to 29% in the first quarter of this year, while coal’s share of generation rose from 29% to 31% over that same period. EIA expects that the Henry Hub price will continue to average slightly more than $3.00/MMBtu through the summer. At the national level, shares of both natural gas and coal are expected to be lower than last summer, as output from hydroelectric and other renewable generators is expected to increase. The Midwest region is the only area of the country in which coal fuels more than half of summer electricity generation (54%). In other regions, no single fuel provides the majority of electricity generation during the summer. The Northeast and South regions are close to this level, with natural gas accounting for an expected 44% and 43% of total summer generation, respectively. Western states have a more diverse mix of energy sources for electricity generation, including access to some of the largest sources of hydropower in the United States. After enduring an exceptional drought, California experienced record levels of precipitation and snowpack this past winter, and hydroelectricity’s share of generation in the West is expected to rise from 20% last summer to 27% this summer. This increase, along with increased solar capacity because of new solar additions, should reduce the need for natural gas-fired generation in the West, where the forecast generation share falls from 34% to 27%.
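EIA's comparison rests on cooling degree days, a simple proxy for air-conditioning demand: each day contributes max(0, mean temperature - 65 deg F), and the national figure weights regions by population. A minimal sketch of that arithmetic, with invented region populations and temperatures:

```python
# Population-weighted cooling degree days (CDD). Base temperature 65 F is the
# conventional choice; the regions, populations, and temperatures are invented.
def cooling_degree_days(daily_mean_f, base_f=65.0):
    return max(0.0, daily_mean_f - base_f)

regions = [      # (population, daily mean temperature [deg F])
    (8_000_000, 82.0),
    (4_000_000, 74.0),
    (2_000_000, 63.0),   # below the base temperature: contributes zero CDDs
]
total_pop = sum(pop for pop, _ in regions)
weighted = sum(pop * cooling_degree_days(t) for pop, t in regions) / total_pop
print(f"population-weighted CDD for the day: {weighted:.1f}")
```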


News Article | April 28, 2017
Site: motherboard.vice.com

What would you think if your government didn't believe in gravity? If your senator alleged that, because they couldn't see it, perhaps it didn't exist. To many, this might seem absurd—the science is enough to know that it's real. Like gravity, climate change isn't always obvious, but its forces on Earth are increasingly clear. Yet, more than half of America's 115th Congress are climate change deniers, according to a Motherboard survey of their personal testimonies and voting records. The majority of climate scientists—at least 97 percent—agree that climate change is happening, and is a consequence of human activity. Government and independent climate scientists alike have published abundant evidence showing our impact on Earth's climate. Meanwhile, task forces like the United Nations' Intergovernmental Panel on Climate Change (IPCC) have underscored the necessity of significantly reducing our global emissions. Almost 30 years ago, a NASA scientist named James Hansen pleaded with Congress, under the Reagan Administration, to accept the evidence and do something about it. "It is already happening now," Hansen said before a Congressional committee in 1988. Fast-forward three decades, and the United States is facing one of its most anti-science Congresses in history. Many members of the Senate and House of Representatives have gone on-record to denounce climate change as a hoax. Others have proven through their votes that regulating greenhouse gas emissions is not a priority. And still, some state representatives claim to believe in human-made climate change, but consistently support policies that would erode initiatives to combat it. In the Senate, 53 out of 100 members are climate change deniers. In the House of Representatives, 232 out of 435 members are climate change deniers. For the purpose of this survey, we defined climate change deniers as those who deny the existence of anthropogenic, or human-made, climate change. Senators and representatives who called themselves "skeptics" were also included, because enough empirical evidence exists for them to make an informed decision on whether people are influencing the climate. To the argument that voting against climate change bills is not the same as denying it exists: the many species, ecosystems, and people already seeing its effects can no longer wait for Congress to debate the merits of addressing climate change right now. Both groups include Republicans and Democrats, though GOP members largely outnumber their counterparts. Curiously, states that are most vulnerable to climate change are not immune to a leadership of denial. In Florida, for example, where sea levels are carving away parts of its coastline, 14 out of 27 Representatives are climate change deniers. Even in California, where climate change is linked to, or at least exacerbates, periods of extreme drought, 15 out of 53 Representatives are climate change deniers. Deniers tend to use the same (scientifically debunk-able) reasoning for their beliefs. Explanations for recent climatic shifts include solar activity, corruption among scientists, Al Gore, and the discerning will of God. But the excuse most frequently touted was that Earth's climate has always been changing. It's partly true; the geological record tells us our planet has gone through several glacial and interglacial periods, most recently between 120,000 and 11,500 years ago.
Experts at the National Oceanic and Atmospheric Administration, for example, believe that fluctuations in solar radiation due to Earth's varied orbit are one cause for these changes. To be clear, the mechanisms behind these cycles are established science. And instead of entering the gradual cooling period that natural cycles would predict right now, we're actually getting warmer, due to human activity. In regard to life on Earth, ancient widespread die-offs, such as the Permian-Triassic extinction that killed 70 percent of terrestrial species, have been linked to peaks in greenhouse gases and extreme warming. Scientists aren't saying that climate change definitely caused these extinction events, but they could act as much-needed harbingers for current times. What's different now—and what renders the original argument untrue—is that we know climate change is happening now because of human activity, and it's happening faster than ever. Climate scientists have compiled a thorough record of Earth's climate cycles over 800,000 years. As a result, they're able to compare historic warming rates with current ones. And, according to NASA's Earth Observatory, our modern climate is heating up ten times faster than the shifts that brought ice ages to an end. So how do we know that climate change is happening because of humans, and not volcanoes or solar activity? A report released by the IPCC stated, with 90 percent certainty, that most "of the observed increase in globally averaged temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations." Atmospheric carbon dioxide has risen from 280 parts per million to 400 parts per million over the last 150 years, and its effects on global temperature are well-studied. From a scientist's perspective, the indifference to evidence and consensus has been frustrating, to say the least. A report from the American Association for the Advancement of Science and Pew Research Center last year found massive divides between public and scientific opinion. Approximately 37 percent of those surveyed did not think that climate scientists agreed on global warming. Many researchers and educators have vowed to organize and protest the Trump Administration as a result of climate inaction. Katharine Hayhoe, an atmospheric scientist at Texas Tech University, once said: "As a scientist, you don't just jump to conclusions. You do the tests. You say, 'OK, could it be a natural cycle this time? Could it be the sun? Could it be volcanoes? Could it be orbital cycles and ice ages?'" "We run those tests and we see if it could be any of those things that caused the climate to change naturally in the past. And in this case, we've run those tests and the answer to all those questions is, 'no.' In fact, if our temperature were controlled by natural causes right now, we'd be getting cooler, not warmer," she added. As we enter four years of climate ambiguity at best, and reckless environmental abuse at worst, Motherboard encourages you to know your Congressional representatives, and where they stand on the most important global issue of our generation. We've included contact information for the office of each Senator and Representative on this list so you may contact them.


News Article | March 3, 2017
Site: www.techtimes.com

A somewhat eerie-looking cosmic jellyfish, along with other biological curiosities, has been discovered by biologists with the U.S. National Oceanic and Atmospheric Administration (NOAA) in their latest dive in the remote American Samoa region of the Pacific. Onboard the NOAA research ship Okeanos Explorer, the team had been conducting research at the Utu Seamount, with the mission to hold one of the first expansive explorations of the 13,581-square-mile marine sanctuary where the seamount belongs. Dubbed the 2017 American Samoa Expedition, the three-year campaign will gather crucial scientific data in and around the protected areas. American Samoa stands out as a biodiversity hotspot, with its three marine sanctuaries protecting massive coral and deepwater reefs and even archeological artifacts. “Much remains unknown about the deep-sea habitats and geology in and around these protected places,” NOAA team member and molecular ecologist Santiago Herrera said in a Gizmodo report. Much also remains to be learned about the Samoa Islands and seamounts, which form an “age-progressive volcanic hotspot.” Volcanoes on the east are young while those in the west are progressively older, and it remains unclear how Samoan volcanoes evolve over time. “This expedition will sample various volcanoes at different stages in their development, including the young active volcano, Vailulu’u, and the older Samoan volcanic feature that defines the island of Tutuila,” the group stated in its mission plan. The sea dives, conducted from Feb. 16 to 26, used a remotely operated vehicle (ROV) to unravel a number of new biological finds, including the cosmic jellyfish, a Venus flytrap anemone, and a range of mollusks. The translucent, UFO-like jellyfish, imaged during the first dive, appeared to have rows of tentacles that face up and down and were likely useful in catching prey. It hovered through the deep, dark ocean, its digestive system showing red and its reproductive organs yellow. Further investigation will determine whether the cosmic jellyfish is a new species. A hydroid was also spotted at Leoso Seamount, which straddles the boundary between the Exclusive Economic Zone (EEZ) of American Samoa and the Cook Islands EEZ. Also alien-like in appearance, it is a close relative of many jellyfish species and attaches itself to rock while snatching floating food with its two tiers of tentacles. The researchers uncovered at least a dozen potentially new species of sponges, sea stars, corals, and other creatures, which they sampled. The collections, according to Herrera, will support new species designations if needed, as well as enable DNA analyses for greater insight into biological relationships and evolution among similar species. NOAA presents photos and videos of the team's amazing finds online for the public to see and relish. Remember the glowing purple blob discovered by E/V Nautilus ship scientists deep beneath California waters last year? The googly-eyed squid, first thought to be a cuttlefish, was actually a stubby squid (Rossia pacifica) and looked like a cross between a squid and an octopus. As for its strange eyes, scientists explained that the animal dons a sticky mucus jacket as it burrows deep into the sediment to camouflage itself, leaving its eyes poking out to spot its prey, which includes small fish and shrimp. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.


News Article | April 28, 2017
Site: www.techtimes.com

Last year saw an unusually high number of humpback whale deaths all along the East Coast. As a result, the National Oceanic and Atmospheric Administration (NOAA) is conducting an investigation into the cause of the unusual die-off. Since last year, NOAA has seen a spike in mysterious humpback whale deaths spanning from Maine through North Carolina. In a typical year, the number of humpback whale deaths in the region averages 14, but last year saw a whopping 41, and 2017 had recorded 15 as of April 24. Due to this spike in mortality, NOAA Fisheries declared the deaths along the East Coast an "unusual mortality event," prompting immediate action and expert investigation into their cause. The last unusual mortality event was declared in 2006. Among the 41 humpback whales that died last year, 20 have been examined and 10 showed evidence of blunt force trauma. However, though vessel strikes have been recorded in various locations in the area, such as New York and New Hampshire, there was no significant spike in ship traffic. Greg Silber of the NOAA Fisheries Office of Protected Resources said a number of factors may have caused the whales to move closer to shipping routes; perhaps they were following their prey. A recent study found that 14.7 percent of whales off the Maine and New Hampshire coasts have at least one vessel strike injury. The current investigation will collect data that can point to the major cause of death for these animals, including environmental and human-caused threats. Humpback whales were removed from the endangered species list last September, but they are still covered by the U.S. Marine Mammal Protection Act, which protects all marine mammals, including seals, manatees, dolphins, porpoises, sea otters, polar bears, and other whale species. Under the Marine Mammal Protection Act of 1972, as amended in 1994, removing marine mammals from their habitats without a permit is illegal, and harassing, killing, or collecting a marine mammal or any part of one is prohibited. Further, the act underpins the Marine Mammal Health and Stranding Response Program, which improves response rates for strandings as well as unusual mortality events. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.


Until recently, it seemed that we would be able to manage global warming-induced sea level rise through the end of the century. It would be problematic, of course, but manageable, particularly in industrialized nations like the U.S. However, troubling indications from the Greenland and Antarctic Ice Sheets show that melting is taking place faster than previously thought and that entire glaciers — if not portions of the ice sheets themselves — are destabilizing. This has scientists increasingly worried that the consensus sea level rise estimates are too conservative. With sea level rise, as with other climate impacts, the uncertainties tend to skew toward the more severe end of the scale. So, it's time to consider some worst-case scenarios. Recently, the National Oceanic and Atmospheric Administration (NOAA) published an extreme high-end sea level rise scenario, showing 10 to 12 feet of sea level rise by 2100 around the U.S., compared to the previously published global average — which is closer to 8 feet — in that time period. The research and journalism group Climate Central took this projection and plotted out the stark ramifications in painstaking, and terrifying, detail. "By the end of the century, oceans could submerge land [that's] home to more than 12 million Americans and $2 trillion in property," according to Ben Strauss, who leads the sea level rise program at Climate Central. Climate Central's maps show what major cities would look like with that much sea level rise. In an online report, Climate Central states that the impacts of such a high amount of sea level rise "would be devastating." For example, Cape Canaveral, which is a crown jewel for NASA and now the private sector space industry, would be swallowed up by the Atlantic. Major universities, including MIT, would be underwater, as would President Trump's "southern White House" of Mar-a-Lago. In the West, San Francisco would be hard-hit, with San Francisco International Airport completely submerged. "More than 99 percent of today’s population in 252 coastal towns and cities would have their homes submerged, and property of more than half the population in 479 additional communities would also be underwater," the analysis, which has not been peer-reviewed, found. In New York City, the average high tide would be a staggering 2 feet higher than the flood level experienced during Hurricane Sandy. More than 800,000 people would be flooded out of New York City alone. Although the findings pertain to sea level rise through the end of the century, in reality sea levels would keep rising long after that, with a total increase of about 30 feet by 2200 for all coastal states, Climate Central found. As for how likely this extreme scenario really is, here's what the report says: "The extreme scenario is considered unlikely, but it is plausible. NOAA’s report and Antarctic research suggest that deep and rapid cuts to heat-trapping pollution would greatly reduce its chances." More specifically, the NOAA projection says this high-end outlook has just a 0.1 percent chance of occurring under a scenario in which we keep emitting greenhouse gases at about the current rate. While a 1-in-1,000 chance outcome might seem nearly impossible, recent events suggest otherwise. For example, Hurricane Sandy slammed into the Mid-Atlantic in 2012 while following a track that was virtually unprecedented in storm history.
In addition, California is estimated to have had just a 1 percent chance of climbing out of its deep drought in a one to two-year period, and it did just that this winter.  Robert Kopp, a sea level rise researcher at Rutgers University, whose projections formed the basis of the NOAA scenarios, said it's difficult to put exact odds on the extreme scenario.  "I would say that our knowledge about marine ice-sheet instability is too deeply uncertain for us to answer that question right now," Kopp said in an email. "We can come up with a physically plausible pathway that gets us to 2.5 meters [or 8.2 feet], we know it is more likely under higher emissions, but we don't have a good way of putting a probability on it." A paper published in the journal Nature in March found that if emissions of global warming pollutants peak in the next few years and are then reduced quickly thereafter, then there is a good chance that the melting of the Antarctic Ice Sheet would be drastically curtailed.  However, with the U.S., which is the second-largest emitter of greenhouse gases, backing away from making significant cuts under the Paris Climate Agreement, adhering to such an ambitious timetable is looking less realistic.  In order for NOAA's extreme scenario, and therefore Climate Central's maps, to turn into reality, there would need to be decades more of sustained high emissions of greenhouse gases plus more melting from Antarctica than is currently anticipated.  However, recent studies have raised questions about Antarctica's stability, as mild ocean waters eat away at floating ice shelves from below, freeing up glaciers well inland to flow faster into the sea.  "What's new is that we used to think 6- to 7 feet was the max *plausible* or *possible* sea level rise this century, and now we've roughly doubled that," Strauss said in an email. "The new Antarctic science says it's plausible."  "If you were to survey ice sheet experts today, instead of something like 5 to 10 years ago, I suspect you'd get a significantly higher probability than 0.1 percent," he said.  A study published in the journal Nature Climate Change last week found that sea level rise could prompt a wave of internal migration within the U.S., especially as people move from the hardest-hit states such as Florida, Louisiana and New York. It's long been known that Florida is ground zero for sea level rise impacts, but the Climate Central projections are even more pessimistic. The report shows that a whopping 5.6 million Floridians would be at risk before the end of the century under an extreme sea level rise scenario, about double the amount simulated in the study last week.


News Article | April 29, 2017
Site: www.washingtonpost.com

As tens of thousands of people descend on the Mall to speak out on climate change and policy, here are a few important global warming articles from The Washington Post staff. How much will the oceans rise? Scientists keep increasing their projections A report by a leading research body monitoring the Arctic has found that previous projections of global sea level rise for the end of the century could be too low, due in part to the pace of ice loss of Arctic glaciers and the vast ice sheet of Greenland. Read more. Without action, say bye-bye to polar bears In a final plan to save an animal that greatly depends on ice to catch prey, the U.S. Fish and Wildlife Service identified the rapid decline of sea ice as “the primary threat to polar bears” and said “the single most important achievement for polar bear conservation is decisive action to address Arctic warming” driven by the human emission of greenhouse gases into the atmosphere. Read more. The nation is immersed in its warmest period in recorded history The U.S. is enduring a stretch of abnormally warm weather unsurpassed in the record books, and it shows no immediate sign of ending. The latest one-, two-, three-, four- and five-year periods — ending in March — rank as the warmest in 122 years of record-keeping for the Lower 48 states, according to data from the National Oceanic and Atmospheric Administration. Read more. Carbon dioxide levels could reach their highest point in 50 million years by the end of the century Continuing to burn fossil fuels at the current rate could bring atmospheric carbon dioxide to its highest concentration in 50 million years, jumping from about 400 parts per million now to more than 900 parts per million by the end of this century, a new study warns. And if greenhouse gas emissions continue unabated beyond that point, the climate could reach a warming state that hasn’t been seen in the past 420 million years. Read more. Scott Pruitt causes an uproar — and contradicts the EPA’s own website on climate change “I think that measuring with precision human activity on the climate is something very challenging to do and there’s tremendous disagreement about the degree of impact, so no, I would not agree that it’s a primary contributor to the global warming that we see,” Pruitt, the newly installed EPA administrator, said on the CNBC program “Squawk Box.” Read more. While Trump promotes coal, other countries turn to the sun On the solar farms of the Atacama Desert, the workers dress like astronauts. The sun is so intense and the air so dry that seemingly nothing survives. It’s Mars, with better cellphone reception. It is also the world’s best place to produce solar energy, with the most potent sun power on the planet. So powerful, in fact, that something extraordinary happened last year when the Chilean government invited utility companies to bid on public contracts. Read more. By 2030, half the world’s oceans could be reeling from climate change More than half the world’s oceans could suffer multiple symptoms of climate change over the next 15 years, including rising temperatures, acidification, lower oxygen levels and decreasing food supplies, new research suggests. By midcentury, without significant efforts to reduce warming, more than 80 percent could be ailing — and the fragile Arctic, already among the most rapidly warming parts of the planet, may be one of the regions most severely hit. Read more. 
Top Trump officials are feuding over whether the United States should stay in the historic Paris climate agreement. The president, who promised to “cancel” Paris during the election campaign, has faced calls from oil, gas and even some coal companies for the United States to remain a party to an accord endorsed by nearly 200 countries. But many conservatives and climate-change doubters have continued to urge Trump to keep his election pledge and quit the agreement. Read more.


News Article | April 17, 2017
Site: www.scientificamerican.com

More evidence has emerged to place European eels among other animals that use the Earth’s magnetic field to guide their migration, but the research methods used are causing controversy. A study published last week in Current Biology suggests eels generate a “magnetic map” of their surroundings that lets them read their location based on the field intensity, and use this map to find the Gulf Stream and catch a free ride to Europe. “We’re moving toward [magnetic sense] being the default hypothesis for how marine animals achieve their long-distance migrations,” says co-author Nathan Putman, a Florida-based biologist with the National Oceanic and Atmospheric Administration. “What was maybe viewed as an anomaly with sea turtles and salmon is now the hypothesis to beat.” Some scientists took issue with the study’s approach, however. Eels migrate twice in their lifetime: once as four-inch-long larvae from the Sargasso Sea to coastal waters and rivers of Europe and Northern Africa, and again after maturing for a decade or more, when they return to the Sargasso to spawn and die. Since the 1970s, researchers have suspected eels could sense magnetic fields, and a 2013 study proposed the animals used a “magnetic compass” to orient using the North Pole. The new study suggests the eels have a more complete command of their location and direction. To test their hypothesis, researchers placed eels in a round, freshwater tank with 12 small chambers arranged like spokes around it, and a magnetic coil surrounding the whole apparatus. They simulated how it would have felt—magnetically speaking—for the eels to have been in four different locations along their larval migration route from the Sargasso to Europe. The eels chose which direction to swim by slithering over the barrier into one of the segments surrounding the central tank. Researchers tallied their choices and applied the data to a computer model of oceanic circulation. Based on the directions the eels would have gone in the Sargasso Sea, the researchers concluded the majority of the eels seemed to be swimming in the direction of what would have been the Gulf Stream from various points on their migration route. Some eel researchers have criticized the study because the authors didn’t use larval eels, but rather juveniles captured in an estuary in the U.K., which had already completed their Europe-bound migration. “You could compare it to doing experiments on butterflies and expecting them to behave like caterpillars,” says Caroline Durif, a scientist at Norway’s Institute of Marine Research, who was not involved in the work but co-authored the 2013 study, which used adult eels captured as they started their return migration to the Sargasso Sea. Other scientists point out the impracticality of attempting such a test on larval eels, however, because of the difficulty of capturing them. Michael Hansen, a biologist at Aarhus University in Denmark who was not involved in the study, agrees the age of the eels is the largest potential source of error, but says the results make sense biologically, which convinces him it was not a major issue. “It would be almost impossible to do a study like this on eel larvae,” he says, simply because tiny, transparent eels are tough to gather in large numbers in the open ocean. “I think it would be unrealistic to do it any other way than they did.” Karin Limburg, a professor of environmental and forest biology at SUNY College of Environmental Science and Forestry who was also not involved in the work, agrees.
The authors are “taking advantage of the fact that [with] eels, when they enter and colonize an estuary, you have a concentration of them,” she says. “If you were to try to sift through the Sargasso Sea for larval eels, you’d be out there till the cows come home. Odds of finding what you’re looking for are very low. I don’t see how they could have done it any other way.” Putman defends his study on similar grounds. “We picked that life stage of eel because it was what we could get our hands on: eels that had just moved from ocean habitats into estuaries,” he says. He and his colleagues say they were posing a question to the eels: “‘If you felt displaced back into the North Atlantic, where would you feel like you should go?’ We got lucky that they still had some remnant of their migratory behavior.” Durif also points out that simulating an oceanic migration in freshwater creates more potential for error. Putman argues that transferring eels caught in freshwater back to saltwater would add another variable to what was intended as a magnetic displacement test. “Only a change in the magnetic map conditions is necessary to elicit orientation responses from the eels,” he says.
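A note on method for readers who want to reproduce this kind of analysis: orientation-arena results like the 12-chamber test above are conventionally summarized with circular statistics, a mean heading plus a resultant length that measures how strongly the animals agree. The short Python sketch below is illustrative only (it is not the study's code), and the chamber bearings in it are hypothetical.

    import math

    def mean_heading(choices_deg):
        # choices_deg: compass bearing, in degrees, of the chamber each eel entered
        x = sum(math.cos(math.radians(d)) for d in choices_deg)
        y = sum(math.sin(math.radians(d)) for d in choices_deg)
        r = math.hypot(x, y) / len(choices_deg)   # 0 = no agreement, 1 = unanimous
        theta = math.degrees(math.atan2(y, x)) % 360
        return theta, r

    # Hypothetical choices: chambers sit every 30 degrees; most eels pick ~60 degrees.
    heading, r = mean_heading([60, 60, 90, 30, 60, 90, 60, 330, 60, 120])
    print(f"mean heading {heading:.0f} deg, resultant length {r:.2f}")

A resultant length near 1, with a heading pointing "downstream" of the simulated location, is the kind of signal the tallies described above would show.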


News Article | April 21, 2017
Site: www.gizmag.com

One month after crude oil began gushing into the Gulf of Mexico in April 2010, the US National Oceanic and Atmospheric Administration commissioned research to gauge the environmental cost of the BP Deepwater Horizon spill, which would ask American households what they'd pay to prevent similar damage in the future. The results are now in, with scientists placing a US$17.2 billion price tag on the damage to natural resources. Plenty has been said about the monumental costs of the BP oil spill, which spewed 134 million gallons of oil into the ocean. Last year a judge approved a $20 billion settlement over the 2010 catastrophe, which would be paid out to the Gulf states and local governments over a 16-year period. BP concluded its own analysis soon after, placing the total costs at around $62 billion, a figure that includes payouts, cleanup and legal costs. But what about the damage costs to the environment specifically? How do you put a value on beaches, marshes and coral? Seeking answers to these questions, the US government-commissioned researchers took a forward-thinking – and somewhat unusual – approach to determining the cost of the spill's damage to natural resources. This meant placing American adults in a scenario where they could pay money to stop damage of the same magnitude occurring in the future, in an effort to attach a dollar value to individual natural resources. The first three years of the study were dedicated to developing a survey to that effect, and the remainder was spent carrying out face-to-face interviews with trained interviewers and completing further surveys via mail. Participants were informed about the conditions in the Gulf of Mexico before and after the spill, including descriptions of damaged beaches, animals and fish, along with what caused the spill in the first place. The prevention program, they were told, would be 100 percent effective in stopping damages from a future oil spill, which would occur sometime in the next 15 years. They were then asked to vote yes or no on the prevention program, which would incur a one-time tax on their households. The results of the study show that on average each household was willing to pay $152 for such a program. The researchers then multiplied this figure by the number of US households to arrive at the $17.2 billion total. "Our estimate can guide policymakers and the oil industry in determining not only how much should be spent on restoration efforts for the Deepwater spill, but also how much should be invested to protect against damages that could result from future oil spills," said Kevin Boyle, a professor of agricultural and applied economics at Virginia Tech and one of the study's authors. "People value our natural resources, so it's worth taking major actions to prevent future catastrophes and correct past mistakes." The research was published in the journal Science.
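The scaling step is simple enough to sanity-check. The snippet below is a back-of-the-envelope reconstruction, not the study's code; the household count is an assumption of roughly 113 million U.S. households, in line with Census estimates for the period.

    mean_wtp = 152.0                # dollars per household, as reported above
    us_households = 113_000_000     # assumed national household count
    total = mean_wtp * us_households
    print(f"${total / 1e9:.1f} billion")   # prints: $17.2 billion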


News Article | March 12, 2017
Site: www.techtimes.com

Constituents calling their local congressmen to voice their sentiments about a particular policy seems normal. But a cabinet official? That's new. Following his controversial comments on climate change, the office of newly appointed Environmental Protection Agency Chief Scott Pruitt has reportedly been mobbed by phone calls from irate constituents. The public uproar was prompted by the EPA chief's ill-informed answers on CNBC's morning news and talk program Squawk Box on March 9. On the show, co-anchor Joe Kernen asked Pruitt if he believes carbon dioxide (CO2) is the primary control knob for climate change. "I think that measuring with precision human activity on the climate is something very challenging to do, and there's tremendous disagreement about the degree of impact, so no, I would not agree that it's a primary contributor to the global warming that we see," he responded. EPA employees who wished to remain anonymous revealed to the Washington Post that the agency was forced to set up an impromptu call center to attend to the sudden influx of incoming calls on March 10, even deploying its interns for backup. Critics viewed Pruitt's position on climate change as unbecoming of a leading environmental official. More importantly, it also contradicts sound scientific evidence from reputable sources, which the EPA, the agency Pruitt now heads, has previously acknowledged. According to the latest assessment by the U.N. Intergovernmental Panel on Climate Change, it is extremely likely that human influence has been the dominant cause of the observed warming since the mid-20th century. On March 10, the National Oceanic and Atmospheric Administration released an update on carbon dioxide levels in the atmosphere. Based on measurements at NOAA's Mauna Loa Baseline Atmospheric Observatory, CO2 levels increased by 3 parts per million to 405.1 parts per million in 2016, a record pace of growth for the second consecutive year. The White House under President Donald Trump is looking into serious budget cuts for the EPA this fiscal year, from $8.2 billion a year to $6.1 billion. Experts believe the cuts would leave the agency unable to carry out its core functions. The latest budget proposal also involves a massive staff reduction, from today's 15,000 to 12,000, and state grants will be slashed by 30 percent. The ambitious Chesapeake Bay cleanup program, for example, which currently receives $73 million, will get only $5 million in the coming fiscal year. In addition, at least 38 key programs will be completely abolished. "This budget is a fantasy if the administration believes it will preserve EPA's mission to protect public health," former EPA Chief Gina McCarthy said in a statement. Trump doesn't believe that climate change is happening. He was highly criticized during the campaign when he said climate change is a hoax perpetrated by China to sabotage the U.S. economy. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.


News Article | April 17, 2017
Site: www.nature.com

Physicist Andrew Zwicker was the underdog in a New Jersey state-assembly race in late 2015. But the head of science education at the Princeton Plasma Physics Laboratory took a step familiar to scientists — he used data to inform his strategy. Zwicker's campaign team mapped the registered voters in the district and created a model to identify those who would be most likely to respond to his message of 'evidence-based decision-making'. His team carefully crafted every piece of communication to draw that group to the polls to vote for him. Meanwhile, political pundits sneered. “The party insiders ignored me; the pollsters said I had no chance,” he says. Yet he was elected — with a margin of just 78 votes — to represent one of the state's legislative districts. The first physicist in the history of the state's legislature now straddles two worlds: half-time assemblyman, half-time academic. It wasn't Zwicker's first election: in 2014, he ran in a primary race for a Democratic congressional seat in New Jersey. He believed that policymaking could benefit from more-analytical thinking to combat the increase in the use of 'alternative facts' — purposeful confusion tactics — along with attacks on science. In the wake of the election of a US president who has questioned whether climate change is real and backed the debunked notion that vaccines cause autism, US scientists are increasingly exploring ways to get politically involved. On 22 April, at least 428 cities in 44 countries will host a March for Science. High-profile scientists such as Jon Foley, executive director of the California Academy of Sciences in San Francisco, are calling on researchers to forgo their long-standing reluctance to engage in political discussions and to stand up for facts. More than 3,000 scientists have now expressed interest in exploring the world of politics — and 150 of them will attend a training event this month on the basics of running for office. The event, which will be posted online later as a webinar, is organized by 314 Action, a non-profit political action committee that formed in Washington DC last year to encourage scientists to run for office at state and national levels, and to support them in their endeavours. At least three scientists are intending to announce this month that they will be running for congressional office in 2018. Candidates have a lot to consider before mounting a political campaign. It's a risky endeavour from a financial standpoint, and high salaries and career flexibility might explain why lawyers have tended to dominate US congressional positions. But the tide is slowly turning: in 2015, lawyers made up around one-third of the US Congress, down from 80% in 1850. Educators hold 12% of the posts and medical professionals and agriculturalists collectively hold 10%. Scientists have long shied away from politics. Many fear that they will lose their credibility if they defend science that has become politicized, observed Harvard University science historian Naomi Oreskes in her plenary talk at the annual meeting of the American Association for the Advancement of Science (AAAS) in February. Foley argues that a 'war on science' is under way, and that scientists are the ones best placed to fight this war, by demonstrating how science affects daily life and by questioning sceptics' motivations. But, he cautions, there is a stark difference between engaging in political discourse and becoming a partisan candidate.
“If scientists want to run for office, they had better be prepared to leave their scientific careers behind,” he says. And leaving the bench has clear knock-on effects for the numerous students and staff researchers their labs support, so it helps to prepare for that eventuality. If that's too big a step, however, there may be more leeway for balancing political and scientific careers at the state and local level. Even if scientists end up deciding that running for a political position isn't an option, they can still influence politics by forging relationships as trusted advisers to politicians, for example, or working for a non-governmental organization (NGO). Many would-be politicians get their feet wet at the local level — often in part-time posts such as on a city council or a school board. It's a way to keep a day job while building up name recognition in political circles. But a national run can be tempting for those who want to add their scientific voice to issues of national importance — such as energy or public health. Nuclear engineer Brian Johnson contacted 314 Action, keen to have a voice on issues that could be tackled only at the congressional level, such as climate change and net neutrality, the principle that Internet service providers should treat all online data the same. But he realized that he would probably need to give up his job as head of risk assessment at TerraPower, a nuclear-reactor design company in Bellevue, Washington, to put together a solid candidacy. He asked himself a series of questions to assess his readiness for a campaign and whether it would be worth the risk to his career, which required a narrowly defined skill set in a nascent sector. Ultimately, he decided against it. He's not alone. Leaving behind a scientific career is a significant concern for many of the scientists who have contacted 314 Action. One of the first questions that people ask is, “Can I work full-time and run?”, says executive director Josh Morrow. The answer, he says, is no — at least not in election year. And in the run-up to election year, potential candidates should determine whether selling themselves as a scientist would be a net positive. “We polled that question carefully,” says Bill Foster (Democrat, Illinois), the only PhD physicist in Congress, “because I was worried that 'scientist' would give an elitist impression among the electorate.” That wasn't the case for him, but it's a region-specific question that potential candidates should explore. “Scientists need to realize that science doesn't dictate all policy and it never will,” says Jane Lubchenco, a marine biologist who has served as president of the AAAS and was administrator of the National Oceanic and Atmospheric Administration (NOAA) under former president Barack Obama. “But,” she adds, “we're all better off if it's at the table.” Academics such as palaeoecologist Jacquelyn Gill at the University of Maine in Orono find that timing is a hurdle. Gill considered running for office, but decided to shelve those pursuits for now. She may reconsider once she has progressed beyond processing grants and mentoring graduate students. “For most of us on the tenure track, it is not a very flexible timeline,” she says. The typical model for getting into politics looked like this, she says: do good science, become a strong communicator, get tapped to serve as a US National Science Foundation (NSF) officer.
There's just one problem with that model, she adds: “When there is an explicitly anti-science administration, you won't get tapped.” Morrow points out that entering politics can open career doors through the expansion of networks. 314 Action, for example, has tens of thousands of scientists in its network, including Nobel prizewinners. “Scientists will form relationships that might further their career, even if they are not successful in a run for office,” he says. 314 Action has established more than 50 campus chapters and 25 state coordinators to help organize people to advocate on science-specific issues. For would-be candidates who are eager to get their scientific message out, it is crucial to listen, says Foster. “Spend a while listening to people in your district to make sure you understand how they are served well in government,” he says, “and how they could be served better.” Zwicker is a good example of someone who is successfully managing to combine scientific and political careers. He still has a lab, although he is rarely there and he stopped doing straight research in 2003. His half-time split means that he can continue in science education, but it's not an easy transition. Politics is “the hardest thing” he's ever done, he says. “Instead of teaching around 100 people, I represent around 155,000 people,” he says. He's sponsored or co-sponsored more than 100 bills in his first year, but fundraising is different and needs strong communication skills. Scientists, he says, typically stick to facts and figures — a strategy that failed to resonate with constituents early in his political career. Perhaps the biggest difference between science and politics, says Lubchenco, is that data and facts aren't the only factors in political decision-making. The most effective individuals have relationships across the political spectrum, not just with obvious allies. “Developing and cultivating those relationships is much of the way politics happens,” she notes. There are, of course, other ways to be politically active that don't involve running for office. Natalia Sanchez, who immigrated to the United States from Colombia when she was 14, has forged that path herself. She arrived in the United States with one goal: to become a rocket scientist. Since 2008, she has worked at NASA's Jet Propulsion Lab (JPL) in Pasadena, California, where she has served on teams that sent spacecraft to both Mars and Jupiter and has been involved in planetary-science research on Earth. After last year's presidential election, she felt that the stakes were high enough to alter her career path. She contemplated running for a school-board position in California, but ended up becoming a field director for Tracy Van Houten, a fellow JPL rocket scientist who is one of 23 candidates running to represent California's 34th congressional district. Sanchez helps Van Houten to engage her voters and to shape her platform on immigration. “Whether or not I eventually run, I've made the switch to politics,” she says. “I can help solve problems.” Scientists can also offer advice to established politicians. Having led three different scientific societies, Lubchenco knew more than 30 members of Congress quite well and had testified in Congress multiple times before she led NOAA. She encourages scientists to offer their expertise to US representatives and senators. 
For example, when the gene-editing tool CRISPR–Cas9 became a scientific reality a few years ago, Foster started receiving urgent requests for meetings from high-profile scientists who wanted Congress to begin grappling with the societal impacts of human genetic engineering — such as the ethical considerations of designer babies. The best way to offer advice, Foster says, is to set up an in-person meeting in your home state. “You will not be mistaken for a random lobbyist, you will be a constituent,” he points out. And a home meeting precludes the possibility that any group of scientists coming to speak to Congress would be seen as just another special-interest group, he adds. Lubchenco says that scientists can also consider doing sabbaticals in which they work with members of Congress, federal agencies or the White House. And another option is serving on an advisory committee or board of directors for a foundation or NGO. “Many NGOs are politically very savvy,” she says, but “they often need help with the science”. Foster notes that scientists should consider serving in the government's scientific management operations, such as the NSF, US Department of Energy or in oversight of military research. Key budget decisions are often made in private meetings, and it's essential to have the best scientific expertise there, he says. Scientists may find they already have skills they didn't realize would be applicable to politics. “When I went to NOAA, I would joke with students that I was ready for the political fray because I already knew how to swim with sharks,” Lubchenco says. They laughed, she adds, but there was truth to that — animal behavioural science is about reading body language accurately so that you can tell whether a shark is going to pass by or is about to eat you. “The same,” she says, “is true in politics.”


News Article | March 15, 2017
Site: www.techtimes.com

The carbon dioxide in the atmosphere has risen to an all-time high, even as the United Kingdom reported a record low in its CO2 emissions. Is this the classic case of bad news and good news? Or could it be an argument to bolster the need for action to curb carbon emissions? The U.S. National Oceanic and Atmospheric Administration said CO2 levels have increased by three parts per million per year for the last two years, bringing the total concentration to 405 parts per million. Observation stations worldwide have recorded the rise of CO2 concentration in the atmosphere. The average rate has been 2.4 parts per million per year over the last decade, Dr. Pieter Tans of the Carbon Cycle Greenhouse Gases group at NOAA said. Two years ago, climate scientists recorded the monthly average exceeding 400 parts per million for the first time, a symbolic threshold underscoring the finding that human activity is the major cause of global warming. Before the industrial revolution, the CO2 level was about 280 parts per million. The level of CO2 in the atmosphere has kept rising since it breached the 400 mark in 2015; even in September of last year, when CO2 levels reached their seasonal low, the monthly average remained above the threshold. The new NOAA data indicate that atmospheric CO2 levels are still growing. Reports that human-caused carbon emissions have held steady for the last three years do little to change the overall situation, as billions of tons of greenhouse gases are still being poured into the atmosphere annually. In the UK, meanwhile, the climate science and policy website Carbon Brief released new data indicating that CO2 output has fallen to a new low as coal usage has declined, leaving emissions almost 40 percent below 1990 levels. UK carbon dioxide pollution in 2016 stood at 381 million tons, according to Carbon Brief, and UK scientists said this figure is the lowest since the end of the 19th century. Analysts have identified several factors that have driven coal usage down 74 percent over the past decade. These include lower gas prices, carbon taxes, and the expansion of renewable energy sources. An overall drop in energy demand and the closure of big manufacturing operations also contributed to the fall. Of all the factors, the analysis concluded, the one with the most far-reaching effect is the carbon tax, which was doubled to £18 ($22) per ton of CO2 in 2015. A carbon tax, sometimes called a carbon pollution tax, makes fossil fuel users pay for the damage their emissions cause to the climate, and is intended to deter companies from relying on fossil fuels. Taxing CO2 emissions typically involves imposing the tax when the fuel is extracted or imported. The hope, as the UK experience suggests, is that reducing the use of fossil fuels will ultimately reduce atmospheric carbon dioxide. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.
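To see why a per-ton tax falls hardest on coal, consider the rough sketch below. The emission factors are common ballpark assumptions (coal near 0.9 tons of CO2 per megawatt-hour, gas near 0.4), not numbers from the Carbon Brief analysis, so treat the output as illustrative.

    TAX_PER_TON = 18.0   # pounds per ton of CO2, the doubled 2015 UK rate

    def carbon_cost_per_mwh(emission_factor):
        # emission_factor: tons of CO2 emitted per megawatt-hour generated
        return emission_factor * TAX_PER_TON

    for fuel, factor in [("coal", 0.9), ("gas", 0.4)]:
        print(f"{fuel}: tax adds about {carbon_cost_per_mwh(factor):.2f} pounds per MWh")
    # coal pays roughly 16 pounds per MWh against gas's 7, so every increase
    # in the tax widens coal's cost disadvantage.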


News Article | April 19, 2017
Site: news.yahoo.com

SEATTLE (Reuters) - Rising sea levels caused by climate change may drive U.S. coastal residents to areas far from the seaboard, not just to adjacent inland regions, according to a study published online in the journal Nature Climate Change. Even landlocked states such as Arizona and Wyoming could see significant increases in population because of coastal migration by 2100, and may be unprepared to handle the surge, said the analysis from a University of Georgia researcher. "We typically think about sea-level rise as being a coastal challenge or a coastal issue," Mathew Hauer, author of the study and head of the Applied Demography program at the University of Georgia, said in an interview on Tuesday. "But if people have to move, they go somewhere." The U.S. National Oceanic and Atmospheric Administration predicted in January a 1-to-8-foot (0.3-2.5 meter) increase in sea levels by the year 2100. Previous research by Hauer and others has put the number of Americans displaced by rising seas over the same period as high as 13.1 million. While a movement of residents from low-lying coastal regions to adjacent inland communities will likely occur, Hauer said that according to his model, even landlocked states such as Nevada, Arizona and Wyoming will see an influx. Nevada's Clark County, home to Las Vegas, is projected to see an influx of up to 117,000 climate migrants by the end of the century, and nearly every county in Wyoming is predicted to see some increase, as are many counties in western Montana, central Colorado and northern Utah, the study found. Hauer said previous studies had shown that people permanently leaving their homes often choose destinations where they have family connections or better job prospects, even if those locations are far away. “A lot of these places, although they might seem like they’re very far (from the coast), people may have kin ties or economic ties or economic reasons for moving,” he said. “People could go to school in an area and they come back years later, maybe that’s closer to family.” Although municipalities typically are not considering climate migrants in their long-term planning, Hauer said, they should start to do so because the effects of sea-level rise were already being felt. “It’s not like we go from zero feet of sea-level rise to 6 feet right at the end of the century - it’s an incremental process,” he said.


News Article | March 17, 2017
Site: www.techtimes.com

A mysterious occurrence has left the scientific community spellbound, and researchers are trying to find out the reason behind it. Huge groups of humpback whales have been gathering off the coast of South Africa to feed, with such gatherings recorded in 2011, 2014 and 2015. Scientists are astonished because these whales are usually solitary creatures that travel in groups of two or three at most. During these gatherings, however, scientists have counted more than 200 humpback whales at once; normally, given the size of these creatures, 10 or 20 would be considered a huge group. The researchers are trying to find out why so many whales are gathering in one place to feed, and how one area of the ocean can provide enough food to satiate so many, especially considering that each whale weighs around 65,000 pounds. According to the National Oceanic and Atmospheric Administration (NOAA), an average-sized humpback whale consumes a ton of krill, shrimp and small fish per day. "Indeed, aggregations of whales of this size have seldom been reported in the literature, with 'large' groups often numbering in the range of 10 to 20 or less," remarked Ken Findlay, the lead author of the study, in an interview with Time. Another puzzling factor is that humpback whales usually migrate to waters near Antarctica for the summer, and are found in other regions only in winter. These groups, however, gathered off the coast of South Africa in summer. Researchers are also trying to determine whether the age of the whales has some connection to this strange behavior, as most of the humpback whales in the groups were particularly young. Researchers have not been able to pinpoint the reason behind this perplexing occurrence yet, and they state that more research is needed to find a conclusive answer. The scientists do posit that it may be a result of the recent surge in the global humpback whale population, which may have caused the whales to change their migration and feeding patterns altogether. "Future areas of investigation should include identifying migration links and the population identity of participating whales," the researchers write. The U.S. government has even removed humpback whales from its endangered species list, given the substantial increase in their population. Some reports indicated that the increase in number was as much as 10 percent within the last 15 years. The complete study has been published in PLOS One. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.


News Article | April 20, 2017
Site: www.eurekalert.org

In the first ecosystem-wide study of changing sea depths at five large coral reef tracts in Florida, the Caribbean and Hawai'i, researchers found the sea floor is eroding in all five places, and the reefs cannot keep pace with sea level rise. As a result, coastal communities protected by the reefs are facing increased risks from storms, waves and erosion. The study, by the US Geological Survey (USGS), is published today in Biogeosciences, a journal of the European Geosciences Union. At two sites in the Florida Keys, two in the US Virgin Islands, and in waters surrounding the Hawaiian island of Maui, coral reef degradation has caused sea floor depths to increase and sand and other sea floor materials to erode over the past few decades, the Biogeosciences study found. In the waters around Maui, the sea floor losses amounted to 81 million cubic meters of sand, rock and other material - about what it would take to fill up the Empire State Building 81 times, or an Olympic swimming pool about 32,000 times, the USGS researchers calculated. As sea levels rise worldwide due to climate change, each of these ecologically and economically important reef ecosystems is projected to be affected by increasing water depths. The question of whether coral colonies can grow fast enough to keep up with rising seas is the subject of intense scientific research. But the USGS study, published on April 20, 2017 in the journal Biogeosciences, found the combined effect of rising seas and sea floor erosion has already increased water depths more than what most scientists expected to occur many decades from now. Other studies that do not factor in sea floor erosion have predicted seas will rise by between 0.5 and 1 metre by 2100. "Our measurements show that seafloor erosion has already caused water depths to increase to levels not predicted to occur until near the year 2100," said biogeochemist Kimberly Yates of the USGS' St. Petersburg Coastal and Marine Science Center, the study's lead author. "At current rates, by 2100 sea floor erosion could increase water depths by two to eight times more than what has been predicted from sea level rise alone." The study did not determine specific causes for the sea floor erosion in these coral reef ecosystems. But the authors pointed out that coral reefs worldwide are declining due to a combination of forces, including natural processes, coastal development, overfishing, pollution, coral bleaching, diseases and ocean acidification (a change in seawater chemistry linked to the oceans' absorption of more carbon dioxide from the atmosphere). For each of the five coral reef ecosystems, the team gathered detailed sea floor measurements from the National Oceanic and Atmospheric Administration taken between 1934 and 1982, and also used surveys done from the late 1990s to the 2000s by the USGS Lidar Program and the US Army Corps of Engineers. Until about the 1960s sea floor measurements were done by hand, using lead-weighted lines or sounding poles with depth markings. From approximately the 1960s on, most measurements were based on the time it takes an acoustic pulse to reach the sea floor and return. The USGS researchers converted the old measurements to a format comparable with recent lidar data. They compared the old and new sets of measurements to find the mean elevation changes at each site. The method has been used by the US Army Corps of Engineers to track other kinds of sea floor changes, such as shifts in shipping channels. 
This is the first time it has been applied to whole coral reef ecosystems. Next the researchers developed a computer model that used the elevation changes to calculate the volume of sea floor material lost. They found that, overall, sea floor elevation has decreased at all five sites, in amounts ranging from 0.09 metres to 0.8 metres. All five reef tracts also lost large amounts of coral, sand, and other sea floor materials to erosion. "We saw lower rates of erosion--and even some localised increases in seafloor elevation--in areas that were protected, near refuges, or distant from human population centers," Yates said. "But these were not significant enough to offset the ecosystem-wide pattern of erosion at each of our study sites." Worldwide, more than 200 million people live in coastal communities protected by coral reefs, which serve as natural barriers against storms, waves and erosion. These ecosystems also support jobs, provide about one-quarter of all fish harvests in the tropical oceans, and are important recreation and tourism sites. "Coral reef systems have long been recognised for their important economic and ecological value," said John Haines, Program Coordinator of the USGS Coastal and Marine Geology Program. "This study tells us that they have a critical role in building and sustaining the physical structure of the coastal seafloor, which supports healthy ecosystems and protects coastal communities. These important ecosystem services may be lost by the end of this century, and nearby communities may need to find ways to compensate for these losses." The study brought together ecosystem scientists and coastal engineers, who plan to use the results to assess the risks to coastal communities that rely on coral reefs for protection from storms and other hazards.
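As a rough illustration of the differencing method described above (compare old and new bathymetry, average the change, convert it to a volume), here is a minimal numpy sketch. It is not the USGS model; the grid values and the 10-meter cell size are invented for the example.

    import numpy as np

    cell_area = 10.0 * 10.0   # assumed 10 m x 10 m grid cells, in square meters

    # Elevations in meters relative to a common datum; negative = below sea level.
    old_elev = np.array([[-3.0, -3.2], [-2.8, -3.1]])   # historical survey
    new_elev = np.array([[-3.4, -3.5], [-3.0, -3.6]])   # modern lidar survey

    change = new_elev - old_elev          # negative where the sea floor deepened
    print(f"mean elevation change: {change.mean():.2f} m")
    eroded = -(change[change < 0].sum()) * cell_area
    print(f"material lost: {eroded:.0f} cubic meters")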


News Article | April 14, 2017
Site: www.sciencenews.org

Earthquake-powered shifts along the seafloor that push water forward, not just up, could help supersize tsunamis. By combining laboratory experiments, computer simulations and real-world observations, researchers discovered that the horizontal movement of sloped seafloor during an underwater earthquake can give tsunamis a critical boost. Scientists previously assumed that vertical movement alone contributed most of a tsunami’s energy. More than half of the energy for the unexpectedly large tsunami that devastated Japan in 2011 (SN Online: 6/16/11) originated from the horizontal movement of the seafloor, the researchers estimate. Accounting for this lateral motion could explain why some earthquakes generate large tsunamis while others don’t, the researchers report in a paper to be published in the Journal of Geophysical Research: Oceans. “For the last 30 years, we’ve been moving in the wrong direction to do a good job predicting tsunamis,” says study coauthor Tony Song, an oceanographer at NASA's Jet Propulsion Laboratory in Pasadena, Calif. “This new theory will lead to a better predictive approach than we have now.” The largest tsunamis form following earthquakes that occur along tectonic boundaries where an oceanic plate sinks below a continental plate. That movement isn’t always smooth; sections of the two plates can stick together. As the bottom oceanic plate sinks, it bends the top continental plate downward like a weighed-down diving board. Eventually, the pent-up stress becomes too much and the plates abruptly unstick, causing the overlying plate to snap upward and triggering an earthquake. That upward movement lifts the seafloor, displacing huge volumes of water that pile up on the sea surface and spread outward as a tsunami. These deep-sea earthquakes shift the seafloor sideways, too. The earthquake off the coast of Japan in 2011, for instance, not only lifted the ocean floor three to five meters; it also caused up to 58 meters of horizontal movement. Such lateral motion, however big, is mostly ignored in tsunami science, largely because of a 1982 laboratory study that found no connection between horizontal ground motion and wave height. The experiment used in that study, Song argues, wasn’t a properly scaled-down model of the seafloor and overlying ocean. If lateral motion takes place on a sloped segment of the seafloor, he thought, then the shift can push large volumes of water sideways and add momentum to the budding tsunami. Using two wave-making machines at Oregon State University in Corvallis, Song and colleagues revisited the decades-old experiment. Oarlike paddles pushed water upward and outward in some tests and just upward in others. Adding horizontal motion caused higher waves than vertical motion alone, the researchers found. By combining the experimental results with a new tsunami computer simulation that incorporates lateral movement, the researchers could account for the unusual size of the 2004 Indian Ocean tsunami. That tsunami, one of the worst natural disasters on record, was bigger than uplift alone can explain. Using GPS sensors to measure the horizontal movement of the seafloor during an earthquake will enable more accurate tsunami forecasts before the wave is spotted by ocean buoys, Song proposes. The new work makes a convincing case that horizontal motion contributes to tsunami generation, says Eddie Bernard, a tsunami scientist emeritus at the National Oceanic and Atmospheric Administration’s Center for Tsunami Research in Seattle.
But just how much that movement contributes to a tsunami’s overall height is unclear. It could be much less than Song and colleagues predict, he says. Other seafloor events that can follow a large earthquake — such as huge numbers of water-displacing landslides — could also boost a tsunami’s size. Until all of the factors are known, Bernard says, tsunami forecasters will probably be best off doing what they do now: waiting for a tsunami to form after an earthquake before predicting the wave’s size and trajectory.
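For readers who want the gist of the physics: a long-standing way to fold horizontal motion into a tsunami source, in the spirit of the well-known Tanioka and Satake correction rather than Song's specific model, is to add the lateral slip multiplied by the local seafloor slope to the vertical uplift. A hedged sketch, with invented but 2011-scale numbers:

    import math

    def effective_uplift(u_z, u_h, slope_deg):
        # u_z: vertical seafloor displacement (m)
        # u_h: horizontal displacement directed down-slope (m)
        # slope_deg: local seafloor slope in degrees
        return u_z + u_h * math.tan(math.radians(slope_deg))

    # 4 m of uplift plus 50 m of lateral slip on a 5-degree slope:
    print(f"{effective_uplift(4.0, 50.0, 5.0):.1f} m effective uplift")   # ~8.4 m

On numbers like these the lateral term rivals the vertical one, which is the paper's central claim.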


Astrophysicist Neil deGrasse Tyson has launched a barrage of tweets warning that science and health budget cuts will make America "sick," "weak," and "stupid." In a blueprint of his 2018 budget requests, President Donald Trump proposed $54 billion in cuts to significant sections of the federal government as well as popular science and education programs. The proposed budget, for instance, is not exactly good news for a number of planned and existing NASA missions, including earth sciences and the NASA Office of Education. "The fastest way to Make America Weak Again: Cut science funds to our agencies that support it," deGrasse Tyson wrote on his Twitter account March 20. The fastest way to build a "sick" country, he added, is to "cut funding to the National Institutes of Health," while the fastest way to "Make America Stupid" is to cut funds to programs supporting education. "The fastest way to thwart Earth's life-support systems for us all: Turn EPA into EDA – the Environmental Destruction Agency," the scientist tweeted to his 7.03 million followers, adding that humans can easily melt glaciers and flood Earth's coastal areas by ignoring scientists and doing nothing to stop the rise in carbon emissions. The famed astronomer is known for controversial pronouncements, such as his proposal last year that the entire universe we see around us may be nothing more than a simulation. While the messages do not explicitly point to Trump or the proposed 2018 budget, the tweets follow deGrasse Tyson's promise to refrain from making public criticisms until specific policy proposals had been revealed. The areas his tweets touched on are among those facing severe funding cuts for 2018: education faces a looming 13.5 percent decrease, and health and human services a likely 16.2 percent drop. The Environmental Protection Agency (EPA), too, faces a reduction of more than 31 percent in proposed funding. Trump's proposed budget does not bode well for a number of NASA initiatives, with the space agency itself receiving a slightly lower budget of $19.1 billion, down from $19.285 billion in fiscal year 2016. The biggest chunk of NASA funding would be reserved for the human exploration division, including the Orion spacecraft and the Space Launch System jumbo rocket poised for Mars. On the chopping block, meanwhile, are the ARM program for flying a robotic space vehicle to a near-Earth object, the earth sciences program, and NASA's entire education office. The proposal offers no budget for the Europa lander mission that would explore the Jupiter moon, and it "restructures" the satellite servicing initiative RESTORE-L, which it considers "duplicative" and in need of cuts. In the area of health, NIH funding would suffer a $5.8 billion drop, or nearly 20 percent. As one of the world's primary research centers tasked with making important discoveries in health, the NIH allocates about 80 percent of its budget to universities and medical centers. The budget cut is expected to affect not only graduate but also undergraduate programs, and to lead to the reorganization of many of its study centers. The blueprint further shows that the National Oceanic and Atmospheric Administration (NOAA) will receive a 50 percent cut. In early March, Tech Times reported how planned cuts to the climate science agency can put lives at risk. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.


News Article | March 18, 2017
Site: www.techtimes.com

The world's temperature sweltered last month. It was the second warmest February in almost 140 years of records, according to data released this week. Data from NASA's Goddard Institute for Space Studies revealed that the global temperature in February was 1.1 degrees Celsius above average. February 2016, at 1.32 degrees Celsius above average, remains the warmest February in the 137-year record. The Japan Meteorological Agency has also recorded that last month was the second warmest since 1891, and in the National Oceanic and Atmospheric Administration's dataset, February 2017 likewise ranked second to last year's February, trailing by a couple of tenths of a degree. Such an anomaly may not seem significant locally, but it is substantial on a global scale, where changes are measured in hundredths and tenths of a degree. Earth's average temperature is measured over both land and sea. The latest data are not surprising, coming after three years of record warmth: 2014, 2015, and 2016 each set a global temperature record, NASA said. NASA's dataset, covering 1,629 months and dating back to 1880, shows that no single month before October 2015 had a temperature anomaly of 1 degree Celsius or more. Since October 2015, eight months have exceeded that mark, seven of them in succession from October 2015 to April 2016. The unusually warm February was experienced in central and eastern parts of Asia, Canada's central and southern regions, and in 16 states of the central and eastern United States. The central and eastern parts of Russia also had their warmest February last month, and the warmth was likewise felt in Mexico and the northern polar region. Arctic sea ice shrank to a record low for the month, with an extent about 455,600 square miles below the long-term average and roughly 15,400 square miles below last year's previous record, according to the National Snow and Ice Data Center. Cooler-than-average temperatures were recorded in some places, the NASA/GISS records show: the equatorial Pacific Ocean, southwestern Canada, Baffin Island and Baffin Bay, the Middle East, northeast Africa, and western Australia. While February 2017 was only the second warmest February worldwide, it also marked the 379th consecutive month of warmer-than-average global temperatures; the last colder-than-average month was July 1985. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.


News Article | April 25, 2017
Site: motherboard.vice.com

In 1997, while searching for underwater volcanoes off the coast of South America, scientists recorded something they couldn't explain: a strange, exceptionally loud noise. They called it "the bloop." The bloop was one of the loudest underwater sounds ever recorded: hydrophones (underwater microphones) more than three thousand miles apart all captured the same noise. And researchers at the National Oceanic and Atmospheric Administration, which first recorded the bloop, couldn't figure out what had caused it. But they knew it was something special. "It's unusual when a sound is recorded on all of the sensors we have deployed," said Bob Dziak, the manager of the acoustics program for NOAA. "If it's a ship, or a whale, when it makes a sound in the ocean, it isn't big enough to be recorded all the way across the Pacific. But this sound was recorded on many hydrophones so it stood out in our mind as being something unique." The bloop captured the imaginations of people around the world. Theories began to emerge that this was the call of an aquatic dinosaur, megalodon, or an undiscovered sea creature. These theories gained even more steam when NOAA announced that the sound wasn't man-made. It was "possibly biological." Clearly, they had discovered a monster. "I hesitate to say these things because I don't think it's very helpful in the science discussion, but it was considered possibly of animal origin and one idea that was floated out there was the idea that it was a giant squid," Dziak said. Others were convinced it was not a giant squid, but a monster with a squid face: Cthulhu, the mythical creature from H.P. Lovecraft's Call of Cthulhu. Interestingly, the bloop was recorded just 1,500 kilometers from the place where, in Lovecraft's short story, Cthulhu first emerged. This only added more fuel to the fan theory that Cthulhu was calling out from the deep. It wasn't until 2005 that scientists were able to definitively explain what the bloop was, and where it came from—and they discovered the truth by accident. In our first episode of Science Solved It, Motherboard's new podcast, we explore the lore of the bloop and the scientific process that led to the solution. Editor's Note: This headline originally said the mystery took 20 years to solve, but it was actually closer to one decade.


The administration is firm in its stand on climate change, as made clear by its latest move on the U.S. Department of State's website. The change in wording is not subtle, and it sends a clear message about how the United States will handle climate change under the administration, or at least under the department's new secretary, Rex Tillerson. Shifts in the U.S. position as the leading country in the fight against climate change are no longer surprising, and the department's website makes the latest one apparent. During the Obama era, the Global Change webpage on the site clearly stated how the United States was taking the lead on actions against climate change both in and out of the country. Though the new version does not entirely dismiss the country's position on the matter, in its place now is a generic statement that simply describes what the office is responsible for. While the change in wording may read as simple paraphrasing to some, it connotes a more passive America when it comes to climate change and its cooperation with the UN. Essentially, it is not a surprising move from the administration and the department's new head. Even before he was officially announced as the department's secretary, Rex Tillerson had already raised eyebrows over his connections with the oil industry, and he is not the only one whose appointment had many questioning the administration's intentions. The Environmental Protection Agency's new head, Scott Pruitt, recently landed in hot water over statements expressing his uncertainty about climate change. Little of this is surprising anymore, especially since the president was vocal about his position on climate change even before he was elected. The lack of surprise, however, does not equate to a lack of damage. His latest environmental policies and orders have caused such concern that even prominent theoretical physicist Stephen Hawking has spoken about the possible damage these moves could do to America. A change in wording may merely restate the country's official position, but the administration's concrete actions, such as cutting the budget of the science agency National Oceanic and Atmospheric Administration by 26 percent, could have more direct adverse effects on Americans. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.


News Article | April 20, 2017
Site: www.eurekalert.org

The 2010 BP Deepwater Horizon oil spill did $17.2 billion in damage to the natural resources in the Gulf of Mexico, a team of scientists recently found after a six-year study of the impact of the largest oil spill in U.S. history. This is the first comprehensive appraisal of the financial value of the natural resources damaged by the 134-million-gallon spill. "This is proof that our natural resources have an immense monetary value to citizens of the United States who visit the Gulf and to those who simply care that this valuable resource is not damaged," said Kevin Boyle, a professor of agricultural and applied economics in the Virginia Tech College of Agriculture and Life Sciences and one of the authors on the paper. Findings from the study are published in the issue of Science released Friday, April 21. The scientists developed a survey to put a dollar value on the natural resources damaged by the BP Deepwater spill by determining household willingness to pay for measures that would prevent similar damages should a spill of the same magnitude happen in the future. Survey information included descriptions of damaged beaches, marshes, animals, fish, and coral. Beyond estimating the impact of the spill, the $17.2 billion also represents the benefit to the public of protecting against damages that could result from a future oil spill in the Gulf of a similar magnitude. In May 2010, one month after the spill, the U.S. National Oceanic and Atmospheric Administration commissioned a group of 18 researchers to put a dollar value on the natural resources damaged by the BP Deepwater spill. To estimate Gulf Coast resource values, researchers created a scenario in which people were told that they could have a role in mitigating future damages by effectively paying for a prevention program. Final analysis showed that the average household was willing to pay $153 for a prevention program. This rate was then scaled up to the number of U.S. households to reach the final valuation of $17.2 billion. "The results were eye-opening in that we could tell how much people really value marine resources and ecosystems," said Boyle. "And even more meaningful because we did additional analysis that proved the legitimacy of oft-criticized values for environmental resources." The project team administered surveys to a large random sample of American adults nationwide after three years of survey development. The first round of surveys was administered face-to-face by trained interviewers, while the remaining surveys were completed via mail. Survey participants were informed of pre- and post-spill conditions in the Gulf of Mexico and what caused the oil spill. They were then told about a prevention program, which can be viewed as 100 percent effective insurance against future spill damages, and that another spill would occur in the next 15 years. With this information, participants were asked to vote for or against the program, which would impose a one-time tax on their household. "Our estimate can guide policy makers and the oil industry in determining not only how much should be spent on restoration efforts for the Deepwater spill, but also how much should be invested to protect against damages that could result from future oil spills," said Boyle. "People value our natural resources, so it's worth taking major actions to prevent future catastrophes and correct past mistakes."


News Article | April 19, 2017
Site: news.yahoo.com

Earth Day 2017 will go down in history. On April 22nd, thousands of scientists and their supporters will march in Washington, DC and in more than 420 cities around the globe in what is expected to be the world’s largest public demonstration in the name of science. Meanwhile, President Donald Trump continues to roll back environmental protections, scrub climate change data from government websites, and slash budgets from the Environmental Protection Agency and the National Oceanic and Atmospheric Administration. According to LiveScience, the March for Science started as an idea on Reddit, grew across Facebook and Twitter, and eventually garnered support from over 170 organizations, like the Center for Biological Diversity and the Society for Neuroscience. And there are some pretty badass women leading the charge: Female scientists Caroline Weinberg and Valorie V. Aquino are two of the organizers behind the global protest, while microbiologist Heidi Arjes is knitting her own science “resistor” hats, inspired by the pink pussy hats worn at the Women’s March in January. (Arjes posted her knitting patterns for free online.) There’s even a March for Science knitting group on Facebook with over 2,000 members. But science, and the love of it, doesn't begin and end with Earth Day. There are some great causes you can donate to in support of the march's purpose, and here are some fun, apolitical things you can buy right now to let your geek flag fly all year round. An LED Sweater That Reflects Your Mood The high-tech fashion company Sensoree has officially opened the waiting list for its new bioresponsive mood sweater. This soft cowl-neck sweater uses LED lights to reflect the wearer’s emotion through colors, ranging from excited pink to calm azure. The fabric is made from recycled plastic bottles, and the sweater has a rechargeable battery that charges via micro USB. This is like if your favorite mood ring from the ‘90s and a high-tech body sensor had an ethereal-yet-comfy sweater lovechild. Dresses for Astronomy Nerds and Women Who Love to Code California Etsy entrepreneur Holly Renee launched her brand Shenova with a line of unique, science-themed dresses. Two that work great for the office are the Dark Matter, for astronomy geeks, and the little black Code Poetry dress, for chic programmers.


News Article | April 29, 2017
Site: www.theguardian.com

Scientists are finalising plans to make a last-ditch attempt to save the world’s most endangered marine mammal, the vaquita porpoise. They believe there are now fewer than 30 of these distinctive cetaceans left in the Gulf of California. Only by catching the remaining creatures and protecting them in a sanctuary can the vaquita be saved, it is argued. The $4m rescue plan will involve conservationists patrolling the gulf with the help of dolphins trained by the US navy to pinpoint other cetaceans. The idea is that the animals will then be captured and transported to a sanctuary in San Felipe, Mexico. But the attempt carries risks. No one has ever tried to capture, transport or care for a vaquita before and scientists do not know how they will react. “Some porpoises, like the harbour porpoise, don’t seem to mind too much when captured, but others, such as the Dall’s porpoise, go into shock,” said Barbara Taylor, of the US National Oceanic and Atmospheric Administration. “We don’t know which it is going to be. It is a nerve-racking prospect.” However, scientists insist they now have no choice. “Vaquita numbers are so low it is clear that if we do nothing it will go extinct very soon,” said Lorenzo Rojas-Bracho, of Mexico’s National Institute of Ecology and Climate Change. “However, if we capture the last few and try to protect them we have a chance to save the species.” The vaquita, Phocoena sinus, is small, reaching a maximum length of only 5ft, and has a grey back and white belly. Its home territory – a 900-square-mile section of the northern Gulf of California – is the smallest occupied by any cetacean species. Twenty years ago, there were about 600 in the region. However, the population has since crashed as a result of illegal fishing of a species called the totoaba. Flesh from its swim bladder can fetch prices of more than $100,000 a kilo in China, where it is prized for its medicinal properties. “Quite simply, it commands a higher price than cocaine,” said Rojas-Bracho. The gill nets designed to catch totoaba are also the perfect size for capturing vaquitas, which get tangled and drown. The Mexican government has recently tightened its laws against illegal fishing, but the rewards for totoaba catches are so high there has been little respite and vaquita numbers have continued to plummet. “The population dropped to 30 last year, but there have been more deaths so I expect we’ll lose about half of this number this year,” said Taylor, a member of the International Committee for the Recovery of the Vaquita. “At this rate it will not last much longer. That is why our task is so urgent.” As part of the rescue project – which has received $3m backing from the Mexican government and $1m from the US Association of Zoos and Aquariums – researchers will use acoustic sensors over the next few months to find the vaquitas and then, in October, they will try to catch individual specimens in nets. “We plan to use a couple of trained dolphins to help us,” said Taylor. “It remains to be seen how effective they will be.” Once the vaquitas are caught, they will be carried to floating pens and – if they respond to the ordeal in a relatively stress-free manner – they will be taken to a sanctuary being built in San Felipe. “Ultimately, we would like to begin a captive breeding programme with the aim of restoring numbers and finally returning vaquitas to the wild, although we obviously cannot do that until we have dealt with the problems that are causing them to be wiped out at present,” said Taylor.
Ten years ago Taylor was involved in an attempt to survey numbers of a similar cetacean, the Yangtze river dolphin – also known as the baiji. Its population was known to be threatened by the illegal laying of fishing nets. What Taylor’s team found turned out to be far worse. “We didn’t see a single baiji or hear one whistle,” she told the Observer. “We were too late.” The baiji is now officially listed as extinct. “I resolved then that the vaquita would not suffer a similar fate,” Taylor said – although she accepts the recent dramatic decline in its numbers puts it in a very perilous position. “It is always risky taking an animal into captivity, especially one with which we have no previous experience and who are made up of the last few individuals of that species. But we have to do this.” In the past, other species have been pulled back from the brink of extinction, Rojas-Bracho said. Hunting had reduced numbers of the northern elephant seal to a few dozen in the 19th century. Today, protected by law, there are more than 170,000 of them. “A similar story concerns the southern sea otter, which was reduced in number to around 50 but which has since bounced back to around 2,500 creatures,” he said. “This sort of thing can be done. “Certainly, we are not where we would want to be when it comes to saving the vaquita – but we have to do our best or it will be lost to the planet for ever.”


News Article | April 17, 2017
Site: www.cemag.us

In March, NOAA's Geostationary Operational Environmental Satellite-S (GOES-S) was lifted into a thermal vacuum chamber to test its ability to function in the cold void of space, where it will orbit 22,300 miles above the Earth. The most complicated and challenging test is thermal vacuum, in which a satellite experiences four cycles from extreme cold to extreme heat inside a giant vacuum chamber. To simulate the environment of space, the chamber is cooled to below minus 100 degrees Celsius (minus 148 degrees Fahrenheit) and the air is pumped out. The test simulates the temperature changes GOES-S will encounter in space, as well as worst-case scenarios testing whether the instruments can come back to life after a shutdown that exposes them to even colder temperatures. On March 8, the GOES-S satellite was lowered into the giant vacuum chamber at Lockheed Martin Space Systems in Denver, Colo. GOES-S will be in the thermal vacuum chamber for 45 days; as of March 30, two of the four thermal cycles were complete. GOES-S is the second satellite in the GOES-R series. The GOES-R program is a collaborative development and acquisition effort between the National Oceanic and Atmospheric Administration and NASA. The GOES-R series of satellites will help meteorologists observe and predict local weather events, including thunderstorms, tornadoes, fog, flash floods, and other severe weather. In addition, the satellites will monitor hazards such as aerosols, dust storms, volcanic eruptions, and forest fires, and will also be used for space weather, oceanography, climate monitoring, in-situ data collection, and search and rescue.
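The cycling protocol described above is easy to picture as a simple schedule. The sketch below is a toy illustration only: the minus 100 degree Celsius cold endpoint and the four-cycle count come from the article, while the hot-side endpoint and the plain cold/hot soak structure are our assumptions, since the actual GOES-S test plan is not given here.

```python
# Toy sketch of a four-cycle thermal-vacuum schedule like the one described.
# The -100 C cold endpoint is from the article; the +60 C hot endpoint and
# the simple soak structure are illustrative assumptions.
COLD_C, HOT_C = -100.0, 60.0
CYCLES = 4

schedule = []
for _ in range(CYCLES):
    schedule += [("cold soak", COLD_C), ("hot soak", HOT_C)]

for step, (phase, temp_c) in enumerate(schedule, 1):
    temp_f = temp_c * 9 / 5 + 32  # same conversion behind the article's figures
    print(f"step {step}: {phase} at {temp_c:.0f} C ({temp_f:.0f} F)")
```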


News Article | May 3, 2017
Site: www.sciencenews.org

A recent upsurge in planet-warming methane may not be caused by increasing emissions, as previously thought, but by methane lingering longer in the atmosphere. That’s the conclusion of two independent studies that indirectly tracked concentrations of hydroxyl, a highly reactive chemical that rips methane molecules apart. Hydroxyl levels in the atmosphere decreased roughly 7 or 8 percent starting in the early 2000s, the studies estimate. The two teams propose that the hydroxyl decline slowed the breakdown of atmospheric methane, boosting levels of the greenhouse gas. Concentrations in the atmosphere have crept up since 2007, but during the same period, methane emissions from human activities and natural sources have remained stable or even fallen slightly, both studies suggest. The research groups report their findings online April 17 in Proceedings of the National Academy of Sciences. “If hydroxyl were to decline long-term, then it would be bad news,” says Matt Rigby, an atmospheric scientist at the University of Bristol in England who coauthored one of the studies. Less methane would be removed from the atmosphere, he says, so the gas would hang around longer and cause more warming. The stability of methane emissions might also vindicate previous studies that found no rise in emissions. The Environmental Protection Agency, for instance, has reported that U.S. emissions remained largely unchanged from 2004 to 2014 (SN Online: 4/14/16). Methane enters the atmosphere from a range of sources, from decomposing biological material in wetlands to leaks in natural gas pipelines. Ton for ton, that methane causes 28 to 36 times as much warming as carbon dioxide over a century. Since the start of the Industrial Revolution, atmospheric methane concentrations have more than doubled. By the early 2000s, though, levels of the greenhouse gas inexplicably flatlined. In 2007, methane levels just as mysteriously began rising again. The lull and subsequent upswing puzzled scientists, with explanations ranging from the abundance of methane-producing microbes to the collapse of the Soviet Union. Those proposals didn’t account for what happens once methane enters the atmosphere. Most methane molecules in the air last around a decade before being broken apart during chemical reactions with hydroxyl. Monitoring methane-destroying hydroxyl is tricky, though, because the molecules are so reactive that they survive for less than a second after formation before undergoing a chemical reaction. Neither study can show conclusively that hydroxyl levels changed, notes Stefan Schwietzke, an atmospheric scientist at the National Oceanic and Atmospheric Administration’s Earth System Research Laboratory in Boulder, Colo. The papers nevertheless add a new twist in explaining the mysterious methane rise, he says. “Basically these studies are opening a new can of worms, and there was no shortage of worms.” Despite being conducted by two separate teams — one headed by Rigby and the other by atmospheric scientist Alex Turner of Harvard University — the new studies used the same roundabout approach to tracking hydroxyl concentrations over time. Both teams followed methyl chloroform, an ozone-depleting substance used as a solvent before being banned by the Montreal Protocol. Like methane, methyl chloroform also breaks apart in reactions with hydroxyl. Unlike methane, though, emission rates of methyl chloroform are fairly easy to track because the chemical is entirely human-made. 
Examining methyl chloroform measurements gathered since the 1980s revealed that hydroxyl concentrations have probably wobbled over time, contributing to the odd pause and rise in atmospheric methane concentrations. But to know for sure whether hydroxyl levels varied or remained steady, scientists will need to take a more detailed look at regional emissions of methane and methyl chloroform, Rigby says. Why hydroxyl levels might have fallen also remains unclear. Turner and colleagues note that the ban on ozone-depleting substances like methyl chloroform might be the cause. The now-recovering ozone layer (SN: 12/24/16, p. 28) blocks some ultraviolet light, an important ingredient in the formation of hydroxyl. Identifying the cause of the hydroxyl changes could help climate scientists better predict how methane levels will behave in the future.
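The budget logic both teams rely on can be sanity-checked with a one-box model of atmospheric methane. The sketch below is our own illustration with round placeholder numbers, not either study's model: it shows why a roughly 7 percent drop in hydroxyl raises the steady-state methane burden by about the same fraction even when emissions stay flat.

```python
# One-box methane budget: dB/dt = E - B / tau, where the lifetime tau is set
# by OH (the loss rate is proportional to OH concentration).
# Numbers are illustrative placeholders, not values from the studies.

emissions = 550.0      # Tg CH4 per year, held constant
tau_baseline = 9.0     # years, approximate CH4 lifetime against OH

def steady_state_burden(e, tau):
    """At steady state dB/dt = 0, so the burden is B = E * tau."""
    return e * tau

oh_drop = 0.07                                # ~7 percent decline in OH
tau_less_oh = tau_baseline / (1 - oh_drop)    # weaker sink -> longer lifetime

b0 = steady_state_burden(emissions, tau_baseline)
b1 = steady_state_burden(emissions, tau_less_oh)
print(f"baseline burden: {b0:.0f} Tg")
print(f"with 7% less OH: {b1:.0f} Tg ({100 * (b1 / b0 - 1):+.1f}%)")
# The burden rises about 7.5 percent with no change in emissions, which is
# the mechanism both papers propose for the post-2007 methane rise.
```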


News Article | April 19, 2017
Site: www.nature.com

When physicist Michael Stopa decided to run for the Massachusetts state senate in 2010, he didn’t expect much encouragement from his “overwhelmingly liberal” colleagues at Harvard University in Cambridge. He was acutely aware of his minority status as a conservative Republican on campus, and avoided talk of politics in his role as a staff scientist in a nanotechnology lab. Then the university newspaper wrote about Stopa’s campaign — and closeted Republicans around campus began to reveal themselves with quiet messages of support. “A lot of people snuck over and said, ‘Hey, I hear you’re a Tea Party guy. I am too,’” says Stopa, who lost the election and eventually left academia, but has stayed active in Republican politics. He rejects the idea that his party is anti-science, arguing that “you can find rubes and lunatics on either side” of the US political divide. But that idea has become a hard sell on many US university campuses, putting Republican researchers in an uncomfortable position, despite their party’s history of strong support for science. Between 1976 and 2013, one study found, US government research and development spending was highest under Republican presidents (S. Kushi J. Sci. Pol. Gov. 7, 2015). Yet during that period, party leaders rejected mainstream climate science, opposed environmental protections and sought to ease regulation of medicines. Republicans’ anti-science reputation seems to have deepened under President Donald Trump, who has embraced ‘alternative facts’ and proposed steep spending cuts for the National Institutes of Health (NIH) and Environmental Protection Agency (EPA), among others. On 22 April, thousands of protesters are expected to attend the March for Science in Washington DC. Organizers describe the event — one of more than 500 planned for around the globe — as non-partisan, but it has sparked concern that it could politicize science and alienate Republican politicians. Many Republican scientists who spoke to Nature say that they don’t talk politics in the lab, because they are afraid that discussing economic policy or the role of government could damage their friendships or even their careers. And some worry that by supporting the party of Trump, who has been accused of racism and misogyny, they risk being tarred by association. “It’s become increasingly difficult over the past few decades for scientists to call themselves Republican as part of their core identity,” says Katharine Hayhoe, an atmospheric scientist at Texas Tech University in Lubbock — and an evangelical Christian who has spoken about her experiences as a member of a minority group in science. “The Republican party has been moving further and further away from recognizing the neutrality of science,” she says. “Now it’s reached a peak.” The small body of research on US scientists’ political affiliations suggests that Republicans are indeed a minority in universities. A 2009 poll of the American Association for the Advancement of Science’s members, who are mostly academics, found that only 6% identified as Republican. Fifty-five per cent of respondents said they were Democrats and 32% were independents. And a 2014 survey by researchers at the University of California, Los Angeles, found that outside of mathematics, economics and engineering, academic scientists in the United States overwhelmingly identified as liberal. The causes of this ideological divide are murky. 
Politically conservative scholars may drop out of academia because they feel unwelcome, or just because they are drawn to jobs with better pay and shorter training periods, says Richard Alley, a Republican and a geologist at Pennsylvania State University in University Park. What is clear is that conservative and liberal scientists have trouble engaging with each other, says a biologist at a public university in the midwestern United States, who asked to remain anonymous to protect her career. Only one person at her university has ever asked her why she is a Republican, she says. “For most of my colleagues, anyone who is a conservative must fall somewhere on the continuum between stupid and evil,” the biologist says. The situation has worsened since Trump took office in January, she adds. “What you believe has come to be a stand-in for whether you are a good person.” Then there is John Tellis, a chemist at the biotechnology company Genentech in South San Francisco, California. When Tellis was studying for his PhD at the University of Pennsylvania, he tried not to reveal his Republican views. Now, ironically, the election of a Republican president has made it easier for him to talk about politics with his co-workers, he says, because they share his distaste for Trump. Encouraging political diversity among scientists could improve research by helping people to see beyond their own views and prejudices, says Richard Freeman, a labour economist at Harvard who studies gender and racial diversity in science. He notes that Republicans are not alone in staking out political positions contrary to mainstream science. Surveys show that Democrats tend to be more sceptical than Republicans about the safety of genetically modified organisms and nuclear power, even though many studies have concluded that the technologies are safe. The parties’ ideological differences translate into different priorities for government science funding. When Republicans control the government, they tend to increase the military’s research and development budget — which includes programmes that support academic scientists in a broad range of disciplines. By contrast, Democrats tend to increase the budgets of the EPA, NASA and the Department of Commerce, which includes the National Oceanic and Atmospheric Administration. Scientists could build bridges with conservative politicians by improving their relationship with the military, which Freeman says is “100% pro-science”. Although many of its political boosters are sceptical of global warming, the military has spent billions of dollars on green-energy technology over the past decade. The Pentagon has also warned that climate change could cause water and food shortages and unrest in unstable regions of the world. Others say that people who dismiss all Republicans as anti-science should look deeper into the party’s history. Republican president Richard Nixon created the EPA in 1970, Alley notes. And Republican congressman Newt Gingrich led an effort in the late 1990s to double the NIH budget. “There’s a long tradition of support,” he says, “even if it’s the other way right now.”


Everyone knows the Himalayas, the Rockies, the Alps – the towering geographical features that take up entire horizons and tower over even the tallest cities. Less known, though, are their underwater counterparts that rise from the ocean floor: the seamounts. A seamount is, essentially, a mountain under the sea. Often remnants of extinct volcanoes, these underwater mountains can form ranges or stand alone. They rise from the ocean floor and must stand at least 3,280 feet (1,000 meters) above the surrounding seafloor to be classified as a seamount, according to the National Oceanic and Atmospheric Administration. Their submerged peaks do not reach the surface of the water – those that do are classified as islands. These underwater features cover more area than any single land-based habitat on the planet: about 11 million square miles. Scientists estimate that there are more than 100,000 seamounts in the oceans of the world, with more than 30,000 of them in the Pacific Ocean alone. They can be found in all of Earth’s major ocean basins and are often home to a greater variety of marine species – including rare and endangered species – than the surrounding seafloor. Only an estimated one-tenth of one percent (0.1 percent) have been explored, but it is already clear to scientists that these seamounts offer critical marine habitats. For that reason, there are widespread efforts to protect these underwater spots. In 2016, President Obama designated almost 5,000 square miles off the New England coast as a new marine national monument – the first in the Atlantic Ocean – called the Northeast Canyons and Seamounts Marine National Monument. This area of ocean, 130 miles off the southeast coast of Cape Cod, Massachusetts, is home to several species of deep-sea coral, sharks, sea turtles, beaked whales, and more. Its seamounts rise as much as 7,700 feet above the ocean floor, and the new designation means that the area – and its ecological resources and species – is protected from commercial fishing and activities like seabed mining. An act that would do the same for the seamounts off the coast of California has been introduced in the House of Representatives. The California Seamounts and Ridges National Marine Conservation Area Designation and Management Act seeks to protect these underwater peaks, 45 to 185 miles off the coast, as National Marine Conservation Areas. This would protect them from oil drilling, mining, and other industries while preserving recreational and charter boat fishing opportunities and other activities that don’t damage the fragile ecosystem, which is home to 500-year-old corals, deep-water sponges, and endangered sperm whales and sea turtles. Visit CaliforniaSeamounts.org for more information on efforts to pass this act, championed by Wildcoast, the Surfrider Foundation, and the Marine Conservation Institute with other grassroots supporters. Currently, only one percent of California’s deep-sea habitats are permanently protected, and efforts to protect the state’s seamounts seek to change that before they’re damaged beyond repair.
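The NOAA definition quoted above reduces to two thresholds, which the toy function below encodes. This is our illustration only; the function name and the "smaller rise" label for sub-threshold features are not from the article.

```python
def classify_feature(rise_ft: float, summit_depth_ft: float) -> str:
    """Toy classifier for the definition quoted above.

    rise_ft: height above the surrounding seafloor, in feet.
    summit_depth_ft: depth of the peak below the sea surface;
    zero or negative means the peak breaks the surface.
    """
    if summit_depth_ft <= 0:
        return "island"      # peaks that reach the surface are islands
    if rise_ft >= 3280:      # NOAA's 3,280 ft (1,000 m) threshold
        return "seamount"
    return "smaller rise"    # below the threshold it is not a seamount

# A Northeast Canyons-scale feature: rises 7,700 ft, summit well below surface.
print(classify_feature(7700, 4000))   # -> "seamount"
```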


The people of Peru have suffered from what many consider to be the worst flooding in nearly three decades. Days of relentless torrential downpour, carrying 10 times more rainfall than normal in a short amount of time, have sent catastrophic flash floods and mudslides over the usually drought-stricken areas of the South American country, destroying public roads and highways, ravaging countless homes, and claiming many lives. According to the latest reports, at least 67 people have died and more than 115,000 locals have been left without a home. The government of Peru has declared a state of emergency across half of the Andean nation so that relief operations can be sent immediately to the most affected areas. "We've never seen anything like this before. From one moment to the next, sea temperatures rose and winds that keep precipitation from reaching land subsided," said Jorge Chavez, a general acting as a spokesperson for the government. Experts from NASA are using advanced data from the Global Precipitation Measurement (GPM) mission to understand the storm systems in Peru. GPM is a joint mission between NASA and the Japanese space agency JAXA. Images of the newest bouts of storms that hit Peru on March 20 were captured by GPM's core observatory satellite. Based on data gathered by GPM's Microwave Imager and Dual-Frequency Precipitation Radar instruments during one of the satellite's flybys over Peru, rain was falling at an extremely high rate of 137 millimeters (5.4 inches) per hour in that area. The satellite data was made into an animation at NASA's Goddard Space Flight Center in Greenbelt, Maryland, showing real-time IMERG rainfall estimates based on data collected from March 14 to 21, 2017. The phenomenon, dubbed "coastal El Niño" by Peruvian meteorologist Abraham Levy, had not been seen since 1925, almost a century ago. Extremely warm water along Peru's western coast is said to be responsible for stimulating the rise of these storms, while equatorial sea surface temperatures are about average in neighboring parts of the central and east-central Pacific. Unfortunately, the worst is not yet over for Peru. Dimitri Gutierrez, a scientist working at Peru's El Niño committee, forecasts that the local El Niño will last along Peru's northern coast at least through April, triggering either flooding or drought. A report by the National Oceanic and Atmospheric Administration showed that, in 2016, there were 15 weather and climate disaster events with losses exceeding $1 billion each across the United States. These included a drought, four instances of major flooding, eight severe storms, a tropical cyclone, and a wildfire. Overall, these weather- and climate-related calamities resulted in the deaths of 138 people and had significant economic impacts on the affected areas.
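For readers who work with gridded rain-rate fields like the IMERG estimates mentioned above, the sketch below shows the kind of thresholding involved in flagging extreme precipitation. The synthetic array, the grid shape, and the 50 mm/hr threshold are all stand-ins we made up; real IMERG granules are HDF5 files with their own layout.

```python
import numpy as np

# Stand-in for an IMERG-style snapshot of rain rates in mm/hr
# (synthetic data; real granules are HDF5 with a different layout).
rng = np.random.default_rng(0)
rain_rate = rng.gamma(shape=0.5, scale=4.0, size=(180, 360))
rain_rate[100, 250] = 137.0        # the extreme rate reported over Peru

EXTREME_MM_PER_HR = 50.0           # illustrative threshold, not an official one
rows, cols = np.where(rain_rate > EXTREME_MM_PER_HR)
print(f"{rows.size} grid cells exceed {EXTREME_MM_PER_HR} mm/hr")
print(f"peak rate: {rain_rate.max():.1f} mm/hr "
      f"= {rain_rate.max() / 25.4:.1f} inches/hr")  # matches the 5.4 in/hr figure
```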


News Article | April 21, 2017
Site: news.yahoo.com

Paris (AFP) - The gutting of US-funded climate science would cripple research agendas worldwide and hamper the global fight against climate change, say scientists outside the United States, some of whom will take to the streets Saturday to make that point. US President Donald Trump has called for drastic cutbacks across multiple federal agencies that track and analyse climate by gathering data from satellites, the deepest ocean trenches, and everything in between. Tens of thousands of scientists are set to converge on Washington DC in protest, with hundreds of smaller marches planned in cities around the world. "An unprecedented attack on science, scientists and evidence-based policymaking is underway," said Kenneth Kimmell, president of the Union of Concerned Scientists, a Washington-based policy institute. "And nowhere is the attack more ferocious than on the issue of global warming." Indeed, proposed cuts to research budgets in the Department of Energy, the Environmental Protection Agency, NASA and the National Oceanic and Atmospheric Administration -- totalling billions of dollars and thousands of jobs -- are concentrated on climate science, which Trump has notoriously dismissed as a "hoax" perpetrated by the Chinese. Scientists in Europe, Asia and Australia express alarm not just at the slowdown in US research, but at the knock-on consequences for their own work. "The impacts may range from troublesome to disastrous," Bjorn Samset, research director at the Center for International Climate Research in Oslo, told AFP. "We use US climate-related data -- particularly from satellites -- on a daily basis." The United States, driven by its big federal agencies, "has become THE global provider of high quality, long-term datasets," he added. Some of the programmes targeted for axing, for example, are crucial for tracking how much carbon is vented into the atmosphere, or how the distribution of clouds -- one of the key uncertainties in projections of future climate change -- might evolve over time. "This would impair our ability in the future to keep our observations, and understanding, up to speed," said Joeri Rogelj, a research scholar at the International Institute for Applied Systems Analysis in Vienna, one of the world's leading centres for climate modelling. For Myles Allen, head of the University of Oxford's Climate Research Group, the damage from a US pullback would go well beyond raw data. "If we lose that intellectual firepower, it is obviously going to make dealing with the problem that much harder," he said in an interview. "We need American technology and innovation to find solutions." Allen noted that the European Union and China are "stepping up their game" in monitoring climate, but said Washington may not see that in a positive light. "Does the US want to rely on observations made by overseas agencies in measuring the impact of Chinese emissions on the US weather?" he wondered. Three of the six major international platforms shared by climate modellers -- who calculate the risks of future climate change -- are maintained and operated in the United States, and could be in peril. "If we lose one or two of these data distribution centres in the US, it could collapse the entire coordinated system for sharing these simulations of future climate," said Valérie Masson-Delmotte, research director at France's Alternative Energies and Atomic Energy Commission and a lead scientist of the UN's climate science panel. New visa and travel restrictions in the United States likewise threaten future collaboration, said Samset, noting that almost all important climate research crosses national boundaries. "This has already gotten harder to arrange within the US, or abroad with US participation," he said. Shun Chi-ming, director of the Hong Kong Observatory, said he was "highly concerned" that impending US cuts in climate research could also affect "weather and disaster monitoring". When it comes to taking their concerns into the street with a slogan on a placard, Allen, Rogelj and other researchers are clearly torn. "Demonstrations and protests are usually far outside the comfort zone of scientists," said Samset. But Trump's disregard for scientific consensus -- seen in the appointment of outright climate deniers to key administration posts -- has forced many to reconsider the boundary between their role as scientist and citizen.


News Article | April 19, 2017
Site: www.techtimes.com

Earth Day 2017 will be celebrated this weekend with a show of strength by the scientific community. On April 22, thousands of biologists, climate researchers, and environmental advocates are expected to flood Washington, DC in the much-anticipated March for Science. Partly modeled after the Women's March organized in January, this weekend's event promises to become the largest-ever protest by science advocates. Scientists have resolved to step out of the lab and take to the streets to protest the Trump administration's science and environmental policies, which are seen as detrimental to evidence-based thinking and scientific professions. To this effect, Earth Day Network and the March for Science are co-organizing a rally and teach-in, to be held at the National Mall in Washington. Described by the organizers as "a celebration of our passion for science and the many ways science serves our communities and our world," the activity comes as a response to the presidency's proposed budget cuts affecting science programs, medical research, and climate change and pollution management initiatives. "The March for Science is an unprecedented global gathering of scientists and science enthusiasts joining together to acknowledge the vital role science plays in our lives," states the Earth Day website. Furthermore, the organizers emphasize "the need to respect and encourage research that gives us insight into the world." The initiative has gathered more than a million supporters on Facebook and Twitter, and is also backed by more than 100 organizations, including the American Association for the Advancement of Science and the American Geophysical Union. Many scientists at federal agencies are concerned their work may be sidelined or censored for political purposes. Faced with what they see as threats to their profession, younger scientists in particular have decided to take action, according to Andrew Rosenberg, formerly of the National Oceanic and Atmospheric Administration and currently a member of the Union of Concerned Scientists. On the morning of Earth Day, protesters will assemble at 8:00 a.m., and the teach-in is scheduled to begin at 9:00 a.m. "There will be plenty of ridiculous signs, it will be a lot of fun with serious moments too," says Ayana Johnson, a marine biologist and one of the organizers. The March for Science is expected to be a witty protest with a highly intellectual tone, wavering between pro-science and anti-Trump. Members of the scientific community will be handing out copies of The Lorax and holding up signs such as "Make America Smart Again" and "What do we want? Evidence-based policy. When do we want it? After peer review." Some sources even suggest brain-like knitted hats will make an appearance. Moreover, Bill Nye has been named honorary co-chair of the event, together with Mona Hanna-Attisha, the Michigan pediatrician noted for exposing lead contamination in Flint's water. Those who want to join the initiative but can't make it to Washington can register for one of the 517 satellite marches taking place all over the world. The March for Science is coordinated with a similar event, the People's Climate March, which is scheduled a week later.


News Article | March 28, 2017
Site: www.techtimes.com

The country’s top heart doctors and health researchers fear that the Trump administration’s proposed 20 percent, or $5.8 billion, cut to the U.S. National Institutes of Health’s budget will radically impact work that produces life-saving heart medications. With fewer heart drugs in development, as the industry’s focus shifts to in-demand areas such as cancer, NIH-funded studies may be more critical than ever in the field, according to experts attending a recent meeting of the American College of Cardiology. "There are trials that we have to do that will never be funded by drug companies. We rely on NIH," said Cleveland Clinic’s head of preventive cardiology Dr. Leslie Cho, as reported by Reuters. Heart disease remains the top killer in the country, the experts noted, pointing out that scientists instrumental to the creation of today’s leading heart drugs had obtained funding from the NIH. The government-funded Sprint trial, for instance, discovered that more aggressive therapy using generic blood pressure drugs significantly slashed heart failure and death risk in people ages 50 and above. Dr. Clyde Yancy, former president of the American Heart Association, dubbed the planned budget cuts “a landmine waiting to explode.” He foresees a grim scenario: laboratories closed down, personnel axed, and research proposals abandoned. Cleveland Clinic’s cardiology head Dr. Steven Nissen recalled the work that produced statins, calling for funding for other scientists who could turn out to be like Michael Brown and Joseph Goldstein, whose cholesterol research paved the way for those drugs. The hope now rests with the U.S. Congress, which could shoot down the proposed cuts. As one of the world’s primary research centers, the NIH is likely to be greatly affected by the budget proposal from the White House, which not only outlined a funding reduction but also eliminated money for research initiatives in various federal agencies. About 80 percent of NIH funding is channeled into the country’s medical centers and universities, according to Joanne Carney of the American Association for the Advancement of Science. And it’s not just graduate students who are potentially at risk; undergraduates who depend on those resources are as well. A blueprint of the proposed White House budget reflected cuts in other science agencies, such as the National Oceanic and Atmospheric Administration, the Energy Department’s Office of Science, and the Environmental Protection Agency. NOAA budget cuts, for one, are feared to weaken the major climate science agency’s plans and programs, such as weather satellite systems, research into coastal communities and ocean science, and innovations in climate studies. Health advocates appear to be among those most discouraged by current developments. “[Trump’s budget] doesn’t reflect the priorities of a nation committed to protecting and improving the health and well-being of its citizens,” Mary Woolley, president of the nonprofit Research!America, said in a New York Times report. Astrophysicist Neil deGrasse Tyson himself tweeted that the fastest way to build a “sick” country is to cut NIH funding. And that is without even getting to the NASA-led science missions that are on the losing end of the proposed budget.


News Article | April 24, 2017
Site: www.chromatographytechniques.com

The 2010 BP Deepwater Horizon oil spill did $17.2 billion in damage to the natural resources of the Gulf of Mexico, a team of scientists recently found after a six-year study of the impact of the largest oil spill in U.S. history. This is the first comprehensive appraisal of the financial value of the natural resources damaged by the 134-million-gallon spill. “This is proof that our natural resources have an immense monetary value to citizens of the United States who visit the Gulf and to those who simply care that this valuable resource is not damaged,” said Kevin Boyle, a professor of agricultural and applied economics in the Virginia Tech College of Agriculture and Life Sciences and one of the authors of the paper. Findings from the study were published in the issue of Science released on April 20. The scientists developed a survey to put a dollar value on the natural resources damaged by the BP Deepwater spill by determining household willingness to pay for measures that would prevent similar damages should a spill of the same magnitude happen in the future. Survey information included descriptions of damaged beaches, marshes, animals, fish, and coral. Beyond estimating the impact of the spill, the $17.2 billion figure represents the benefit to the public of protecting against damages that could result from a future Gulf oil spill of similar magnitude. In May 2010, one month after the spill, the U.S. National Oceanic and Atmospheric Administration commissioned a group of 18 researchers to put a dollar value on the natural resources damaged by the BP Deepwater spill. To estimate Gulf Coast resource values, the researchers created a scenario in which people were told that they could have a role in mitigating future damages by effectively paying for a prevention program. The final analysis showed that the average household was willing to pay $153 for a prevention program. This rate was then scaled up by the number of U.S. households to arrive at the final valuation of $17.2 billion. “The results were eye-opening in that we could tell how much people really value marine resources and ecosystems,” said Boyle. “And even more meaningful because we did additional analysis that proved the legitimacy of oft-criticized values for environmental resources.” The project team administered surveys to a large random sample of American adults nationwide after three years of survey development. The first round of surveys was administered face-to-face by trained interviewers, while the remaining surveys were completed by mail. Survey participants were informed of pre- and post-spill conditions in the Gulf of Mexico and what caused the oil spill. They were then told about a prevention program, which can be viewed as 100 percent effective insurance against future spill damages, and that another spill would occur in the next 15 years. With this information, participants were asked to vote for or against the program, which would impose a one-time tax on their household. “Our estimate can guide policymakers and the oil industry in determining not only how much should be spent on restoration efforts for the Deepwater spill, but also how much should be invested to protect against damages that could result from future oil spills,” said Boyle. “People value our natural resources, so it’s worth taking major actions to prevent future catastrophes and correct past mistakes.”
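The per-household figure and the headline total are consistent with a simple back-of-envelope check. This is our arithmetic, not the study's exact procedure:

```python
# Back-of-envelope check on the valuation described above.
wtp_per_household = 153    # dollars, average willingness to pay
total_value = 17.2e9       # dollars, headline damage estimate

implied_households = total_value / wtp_per_household
print(f"implied households: {implied_households / 1e6:.0f} million")
# ~112 million, on the order of the number of U.S. households, which is
# how a $153 per-household value scales up to $17.2 billion nationally.
```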


News Article | April 18, 2017
Site: www.theguardian.com

On Saturday, thousands of scientists are set to abandon the cloistered neutrality of their laboratories to plunge into the political fray against Donald Trump in what will likely be the largest-ever protest by science advocates. The March for Science, a demonstration modeled in part on January’s huge Women’s March, will inundate Washington DC’s National Mall with a jumble of marine biologists, birdwatchers, climate researchers and others enraged by what they see as an assault by Trump’s administration upon evidence-based thinking and scientists themselves. The march is a visceral response to a presidency that has set about the evisceration of the Environmental Protection Agency (EPA) and many of its science-based rules, the dismissal of basic climate change tenets by the president and his appointees, and a proposed budget that would remove around $7bn from science programs, ranging from cancer research to oceanography to Nasa’s monitoring of the Earth. Many scientists at federal agencies, concerned their work may be sidelined or censored for political purposes, will take the unusual step of publicly damning the administration. “It’s important for scientists to get out of the lab and talk about what’s important,” said Andrew Rosenberg, who spent a decade at the National Oceanic and Atmospheric Administration and is now at the Union of Concerned Scientists. “You don’t check your citizenship at the door when you get a PhD. No one would tell an architect they can’t have a view on HUD [the Department of Housing and Urban Development]. That would be nonsense.” Rosenberg said younger scientists, in particular, are increasingly rejecting a stance of studied silence when faced with what they see as threats to their profession. “They don’t accept that they have to wait until tenure, comfortable in a lab to maybe then speak out,” he said. “Academia is less appealing to many of them these days, so they want to know how they can have an impact now. They aren’t content that people will just read their papers in academic journals. I think retreating to your lab and hoping it will all go away is not going to be the best strategy.” The idea to march was first tossed around on a Reddit thread in January. One of those on the discussion, University of Texas postdoctoral fellow Jonathan Berman, decided to put the idea into motion. A day or two after being set up, a Facebook page promoting the march had attracted more than 300,000 members. The effort now has dozens of people grappling with the logistics of the DC march and more than 500 companion events around the world. More than 100 organizations have lent their support, including the institutional heft of the American Association for the Advancement of Science, the world’s largest general scientific organization, and the American Geophysical Union. In March, Bill Nye, the bow-tied embodiment of science for many Americans, and Mona Hanna-Attisha, a pediatrician who alerted the world to soaring levels of lead in the blood of children in Flint, Michigan, were named as honorary co-chairs. Organizers won’t commit to an expected number of protesters but are downplaying expectations that it will be anywhere near the scale of the Women’s March. The tone is expected to waver between pro-science and anti-Trump. The march will dovetail with the People’s Climate March, which will take place a week later. Signs reading “Make America Smart Again” and “What do we want? Evidence-based policy. When do we want it?
After peer review” are expected to make an appearance at the science march. Copies of the Lorax will be handed out. There may well be a sea of brain-like knitted hats. “There will be plenty of ridiculous signs, it will be a lot of fun with serious moments too,” said Ayana Johnson, a marine biologist who became an organizer after seeing fellow scientists downloading climate data in case the administration removed it from public view. “I found that horrifying. That for me was the real alarm, but everyone has their own story.” The satellite marches around the world suggest Trump isn’t the sole cause of scientists’ unease. Globally, there is a “trend of anti-intellectualism”, said Johnson, where politicians play to voters’ base emotions rather than provide evidence-based policy. “We have gotten ourselves into this situation because the public doesn’t understand how science benefits us in our everyday lives,” Johnson admitted. “We haven’t done a good job communicating the value of the work we do.” Some scientists, while sharing much of the anguish of the marchers, have questioned whether a protest in the heart of DC will in fact be counterproductive. Trump is probably more likely to respond to the march with an angry tweet than rethink cuts to cancer research, while Republicans who believe scientists are merely green-tinged activists with fancy titles will feel vindicated. “The march won’t change any minds in the Trump administration and it won’t convince rural and working class America that science is relevant to their lives,” said Robert Young, an expert in coastal geology at Western Carolina University. “The march is on Earth Day, which plays into conservative and climate skeptic thinking that scientists are just environmentalists. Just watch how it will be covered by Fox News and conservative bloggers.” Young said he doesn’t think scientists should just “sit on their hands” and is similarly troubled that, for example, EPA administrator Scott Pruitt doesn’t accept the widespread understanding that carbon dioxide is a primary driver of global warming. “But we’ve convinced all the people we are going to convince,” Young said. “We can march and shout our heads off, but that won’t engage with people who have not bought the message. “We need more face-to-face interaction in local communities. We should do AM radio talkshows. That can be quite a challenge, but that’s the radio that my family and my wife’s family listen to and they are regular working-class Americans. We need to meet these folks where they live.” While the public largely tells pollsters that it supports scientists and their work, there is underlying friction. Innovations in technology have helped drive automation of some jobs, while our ever-improving understanding of our environment has led to restrictions on some polluting industries. Trump tapped into this simmering angst and scientists’ challenge may well be explaining how their breakthroughs can help all of us. The March for Science “will exacerbate rather than address these tensions” according to Jason Lloyd, a program manager for the Consortium for Science, Policy, & Outcomes at Arizona State University. “The biggest issue confronting science is not a malicious and incompetent executive,” Lloyd wrote for Slate. “The critical challenge ... 
is figuring out how scientists can build an enduring relationship with all segments of the American public, so that discounting, defunding or vilifying scientists’ important work is politically intolerable.” Even some of the march’s supporters concede that the event won’t change administration thinking overnight. But even people who specialize in cool, rational thinking occasionally need to wail their frustration. “Scientists are very worried that we are losing science from the public sphere,” said Rosenberg. “I don’t think these events will prove a turning point but in Congress and in the states this will matter. Our representatives need to know that voters care about science.”


News Article | April 27, 2017
Site: www.treehugger.com

Touted as a great way to destress, turn off the thinking mind and exercise a bit of easy creativity, adult colouring books (yes, there really is such a thing) have recently become quite popular. But a new crowdfunded project is seeking to use the medium of the colouring book as a way to effectively communicate the hard science behind climate change. The Climate Change Colouring Book is the brainchild of Brian Foo, a 31-year-old data artist and computer scientist based in New York City. The book presents published data from official sources like NASA, NOAA (the National Oceanic and Atmospheric Administration) and the EPA in a visually organized way, inviting people to read the research and take their time colouring in the data-visualization diagrams, allowing them to slowly absorb the information. The 40-page book, which has already raised twice its original funding goal, will be printed on recycled paper using plant-based ink. The book is intended for both kids and adults (we definitely think it would be a great supplement for science classes everywhere) and offers over 20 activities that are educational and based on real scientific observations of the effects of climate change. For instance, one can colour the extent of Arctic sea ice loss during the last few decades, or the projected loss of shoreline if sea levels rise, or how many football fields of global forest we are losing every minute. For some, this may seem like a depressing activity. But these diagrams are based in the reality currently being observed around us, and the threat of climate change is real and something we must confront. Even more urgent are the apparent apathy and misinformation surrounding these very real issues, says Foo: Climate change is one of the most significant issues that uniquely affects everyone around the globe. There currently is a significantly large gap between scientific consensus (97%) and public perception of climate change (48%). Since public perception influences government and business policies around environmental issues, it is important to ensure enough unbiased and reliable information about the issues is available to the public. This book is not political, but a celebration of information, learning, and research. As we've said time and time again, numbers don't mean much to us unless we are able to parse them in another way, whether that's through thought-provoking art, changes that we can observe in our own lives, or, yes, even a simple colouring book. And sometimes, the simplest way is the most effective. Pricing starts at US$15 per book (cheaper if you buy more). There's also a teacher's set that includes 10 books and printable PDFs for handouts for $90. To find out more or to pledge, visit Kickstarter.


News Article | April 27, 2017
Site: phys.org

A total of 41 humpback whales (Megaptera novaeangliae) have died in the waters off Maine to North Carolina since the beginning of last year, said the National Oceanic and Atmospheric Administration. Twenty-six died last year, far higher than the average annual number of humpback whale deaths for the area, which is just 14. So far this year, 15 more have washed up dead. "The increased numbers of mortalities have triggered the declaration of an unusual mortality event, or UME, for humpback whales along the Atlantic Coast," said Mendy Garron, stranding coordinator at the NOAA Fisheries Greater Atlantic Region. A UME is declared whenever a stranding is "unexpected, involves a significant die-off of any marine mammal population, and demands immediate response," she told reporters. Animal autopsies, known as necropsies, have been performed on 20 whales. Ten of the marine mammals showed acute signs of blunt-force trauma or large propeller cuts from colliding with ships or boats, suggesting this was the likely cause of death. The other 10 had no such obvious signs, and researchers are continuing tests to find out what other factors could have contributed to their demise. "Whales tested to date have had no evidence of infectious disease," said Garron. Researchers stressed they have yet to uncover the cause of the unusual spike in deaths. "The answer is really unknown," said Greg Silber, coordinator of recovery activities for large whales with the NOAA Fisheries Office of Protected Resources. There is no known spike in vessel traffic in the area, but humpback whales move around in search of prey, which could bring them closer to shore, he added. Humpback whales grow to between 48 and 62 feet (15-19 meters), can weigh around 40 tons, and are known for their haunting songs that travel great distances underwater. Most humpback whales are no longer considered endangered, after that designation was lifted in 2016 due to a rebounding population. There are more than 10,000 humpback whales in the North Atlantic Ocean. But there are 14 distinct populations of the whales, five of which are still endangered, including those in the Arabian Sea, the Cape Verde Islands, northwestern Africa, the western North Pacific and Central America. An international moratorium on hunting them was established in 1982 and remains in place. Unusual die-offs of humpback whales in the Atlantic Ocean were also reported in 2003, 2005 and 2006, NOAA said. The cause of those deaths was undetermined.


News Article | April 17, 2017
Site: www.forbes.com

The Great Barrier Reef is perhaps on its deathbed as it suffers from back-to-back mass bleaching events over the past two years. Australia's Great Barrier Reef is one of the great natural wonders of the world, a tremendous display of beauty and biodiversity, and a fragile ecosystem. Now, the reef faces mass destruction as a result of warming seas caused by climate change. Last year the reef suffered a widespread bleaching event that damaged 95 percent of the reef's northernmost third. This year, with hardly enough time to recover, the middle third of the reef has suffered widespread bleaching, as documented by a recent aerial survey by the ARC Center of Excellence for Coral Reef Studies. During this expedition, the center found hundreds of miles of coral ecosystems dead because sea temperatures had warmed above the habitable range for corals. The study, recently published in Nature, confirmed what ecologists and marine biologists have been fearing may be true: the terminal decline of the Great Barrier Reef. Since the 1980s, scientists have consistently surveyed the reef to document its vitality and changes through time. Over those decades, they have noticed an increasing occurrence and severity of bleaching events, caused largely by warming ocean temperatures and pollutants from agriculture and industrialization. The Great Barrier Reef is a victim of global climate change, and its decline is a direct result of a warming planet. The recent aerial survey covered over 5,000 miles of reef and documented over 800 individual coral reefs. Of those 5,000 miles, the surveyors found that 932 miles, close to 20 percent, had been bleached. To make things worse, this came just a year after approximately 93 percent of the reef suffered damage due to warming. As if that weren't enough, the reef was recently struck by Tropical Cyclone Debbie, which turned reef into rubble along an area that had previously escaped bleaching. The bleaching of the Great Barrier Reef is caused primarily by climate-change-driven global warming and secondarily by pollution and runoff from land. When sea temperatures get too warm, the symbiotic relationship between corals and zooxanthellae becomes strained. Zooxanthellae are algae that live within corals and provide nutrients and produce oxygen for the coral; in turn, the coral provides a safe environment and nutrients for photosynthesis. When temperatures get too warm, the algae produce toxins that force the coral to eject the zooxanthellae, a response that leaves the corals on life support if waters don't cool. In 2016, scientists expected large-scale bleaching due to the strong El Niño, which drives abnormal warming of Australian waters. However, the lack of an El Niño this year, coinciding with another year of bleaching, provides evidence that a larger driver is at play. Since measurements began, the Great Barrier Reef has experienced bleaching events in 1998, 2002, 2016, and now 2017. Unfortunately, it takes over a decade for reefs to recover, and back-to-back bleaching severely limits the chances of significant recovery. With Donald Trump's rollback of promises in the Paris Agreement, tackling climate change appears to have taken a step backward. In addition, the Australian government, which relies heavily on coal mining and exports, is unwilling to make the changes required to jumpstart a long-lasting revitalization of the reef.
It's not too late to seriously start combating climate change, the phenomenon that has been killing the Great Barrier Reef for decades. "I don’t think the Great Barrier Reef will ever again be as great as it used to be — at least not in our lifetimes," said C. Mark Eakin of the National Oceanic and Atmospheric Administration.


News Article | May 7, 2017
Site: phys.org

A biologist shines his dim red headlamp and uses an ultrasound to scan the belly of an anesthetized sablefish about the length of his forearm to tell if it's female and has eggs to collect. He gently squeezes out hundreds of tiny, translucent eggs into a glass beaker. After the eggs are fertilized externally, they'll grow in large indoor tanks and some in floating net pens in Washington state's Puget Sound to be used for research. At this federal marine research station near Seattle, scientists are studying sablefish genetics and investigating ways to make it easier and more efficient to commercially grow the fish. It is part of a larger effort by the National Oceanic and Atmospheric Administration to support marine aquaculture as a solution to feed a growing demand worldwide for seafood. People are consuming more fish than in previous decades, with average worldwide per capita consumption hitting 43 pounds (20 kilograms) a year, according to the Food and Agriculture Organization of the United Nations. Fish consumption is expected to grow even more in coming years. NOAA says aquaculture can relieve pressure on fishing populations and promote economic growth. Fishermen along the U.S. West Coast, mostly in Alaska, catch millions of pounds of wild sablefish each year but no commercial sablefish net-pen farming exists in the U.S. Sablefish, also known as black cod or butterfish, are long-lived species that is native to the northeast Pacific Ocean and highly valued in Asia for its beneficial nutrients and delicate flavor. The fish are grilled, smoked, poached, roasted or served as sushi. Michael Rubino, who directs the NOAA aquaculture program, noted that practices for farming fish in the U.S. meet very strict environmental regulations. But some critics worry large-scale farms could harm wild fish stocks and ocean health, and some commercial fishermen worry about potential competition. "This would be a big threat for us," said Robert Alverson, executive director of the Fishing Vessel Owners' Association, a Seattle-based group that represents about 95 commercial fishermen in Alaska, Oregon, Washington and California. In 2015, fisherman harvested about 35 million pounds (16 million kilograms) of sablefish worth $113 million in the United States, all along the U.S. West Coast. Of that, nearly two-thirds, or about 23 million pounds (10 million kilograms), were caught in Alaska, with smaller amounts in Oregon, Washington and California. Nearly half of the sablefish caught in the United States is exported, with a majority going to Japan. "Our fear is that science isn't going to stay in the U.S., and it will be exported to a Third World country where people work for a few bucks a day," Alverson said. "They'll raise it with low-valued labor and use our science to undercut our commercial fishery and coastal communities." Rubino and others say wild harvests and aquaculture can complement each other, particularly during months when there are lower catch limits for wild sablefish. "You always have this yin-yang problem between fisheries and aquaculture," Rick Goetz, who leads the marine fish and shellfish biology program at the Manchester Research Station, across Puget Sound from Seattle. "The big problem is allaying the fears of people that you can have both. You can have both of those things working, particularly because this fish is such a high-value product." In recent years, NOAA Fisheries scientists have worked to reduce potential barriers to sablefish aquaculture. 
They have developed techniques to produce all-female stocks of sablefish that grow faster and much bigger than males in about 24 months. Ideal market size is roughly 5 ½ pounds (2 ½ kilograms). They've also studied different ways to reduce the costs of feeding juvenile fish, increase larval survival rates and decrease deformities. One research project is replacing more expensive algae with clay, which is used to help sablefish larvae better find their prey. Another looked for the optimal temperature to increase larval growth. Wild fish are caught off the Washington coast and used to develop captive brood stocks, or mature fish that are used for breeding. At the facility, the fertilized eggs grow in silos in dark, cold rooms before being moved to other indoor tanks, where they're fed a steady diet of brine shrimp and other food. Large circular tanks hold fish in different growth stages. The facility produces about 10,000 all-female fingerlings, or juveniles about an inch (25 millimeters) long, each year. It has sent some fish to a Texas company that uses land-based recirculation tanks to grow fish, as well as to others interested in sablefish aquaculture. NOAA Fisheries also is working with a Native American tribe in Washington state to launch a pilot project growing sablefish in net pens outside the research facility at Manchester. The tribe and others have applied for a federal grant. Kurt Grinnell, aquaculture manager for the Jamestown S'Klallam Tribe, said the tribe is very interested in sablefish aquaculture for many reasons. "It's a native fish to our area. It's a very robust fish. It's very sought-after. It's got great market value," he said. "Over time, our country and other countries will have to get their protein source somewhere, and we believe this is one way to meet that demand." [Photo caption: In this photo taken March 28, 2017, Ken Massee, right, a biologist with the National Oceanic and Atmospheric Administration, uses an ultrasound device to scan the belly of an anesthetized sablefish to tell if it's female and has eggs to collect at a research facility in Manchester, Wash. The silvery-black fish prized for their buttery flavor live deep in the ocean, so researchers keep their labs cold and dark to simulate ideal conditions for sablefish larvae. (AP Photo/Ted S. Warren)]
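The quantities quoted above convert consistently between units; here is a quick sanity check, a sketch using the standard pound-to-kilogram factor (the article's figures are rounded, so small discrepancies are expected):

```python
# Cross-check of the harvest and market-size figures quoted above.
LB_PER_KG = 2.20462

total_lb = 35e6    # 2015 U.S. sablefish harvest, pounds
alaska_lb = 23e6   # Alaska's share, pounds

print(total_lb / LB_PER_KG / 1e6)   # ~15.9 -> "16 million kilograms"
print(alaska_lb / LB_PER_KG / 1e6)  # ~10.4 -> "10 million kilograms"
print(alaska_lb / total_lb)         # ~0.66 -> "nearly two-thirds"
print(5.5 / LB_PER_KG)              # ~2.49 -> market size of "2 1/2 kilograms"
```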


Last week, the first working version of the Information System of the Global Record of Fishing Vessels, Refrigerated Transport Vessels and Supply Vessels (Global Record) was released to member countries so they can begin entering their data. The information system is a comprehensive, regularly updated online repository of vessels involved in fishing operations that will serve as a single access point of information to combat illegal, unreported and unregulated (IUU) fishing, which is estimated to cost US$10-23 billion annually. An event held following an informal meeting of the FAO Committee on Fisheries (COFI) showcased this new tool, designed to enable State authorities and regional fisheries management organizations to work together to make it more difficult for vessels to operate outside the law. The tool is expected to serve inspectors, port State administrations, flag State administrations, non-governmental organizations and the general public. Certified data are compiled, disseminated and provided by the official State authorities responsible for this information. The first working version of the Global Record Information System is currently open exclusively to authorized data providers to enter official data covering, among other things, their country's fishing fleet, including vessel details; flag, vessel and owner history records; authorization details; and other relevant information. Once content is entered by the authorities responsible for it, the tool will be made accessible to the general public. The foundations of the Global Record Programme were laid in 2005, when the Rome Declaration on Illegal, Unreported and Unregulated Fishing was adopted by the FAO Ministerial Meeting on Fisheries. The tool ties into a framework of several available legal instruments, including the Port State Measures Agreement (PSMA). The Global Record of Fishing Vessels, Refrigerated Transport Vessels and Supply Vessels has been developed thanks to the financial support of the European Commission, the Icelandic Ministry of Industries and Innovation, the Spanish Ministerio de Agricultura y Pesca, Alimentación y Medio Ambiente and the National Oceanic and Atmospheric Administration of the United States Department of Commerce.
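To make the repository's structure concrete, the sketch below models a single vessel entry with its flag history and authorizations. This is a hypothetical illustration; the field names are assumptions for exposition, not the actual FAO schema:

```python
# Hypothetical sketch of a Global Record vessel entry. Field names are
# illustrative only; the real FAO data model is more elaborate.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class FlagRecord:
    flag_state: str                  # flag State, e.g. an ISO country code
    valid_from: str                  # ISO 8601 date
    valid_to: Optional[str] = None   # None means this is the current flag

@dataclass
class Authorization:
    issuing_authority: str
    fishery: str
    valid_from: str
    valid_to: str

@dataclass
class VesselRecord:
    uvi: str                         # unique vessel identifier
    name: str
    vessel_type: str                 # fishing / refrigerated transport / supply
    flag_history: List[FlagRecord] = field(default_factory=list)
    authorizations: List[Authorization] = field(default_factory=list)

    def current_flag(self) -> Optional[str]:
        """Return the flag State of the record with no end date, if any."""
        for rec in self.flag_history:
            if rec.valid_to is None:
                return rec.flag_state
        return None
```

A history-preserving design of this kind matters for IUU enforcement: reflagging is a common evasion tactic, so inspectors need a vessel's past flags, not just its current one.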


News Article | May 4, 2017
Site: www.chromatographytechniques.com

A new analysis of decades of data on oceans across the globe has revealed that the amount of dissolved oxygen contained in the water - an important measure of ocean health - has been declining for more than 20 years. Researchers at Georgia Institute of Technology looked at a historical dataset of ocean information stretching back more than 50 years and searched for long-term trends and patterns. They found that oxygen levels started dropping in the 1980s as ocean temperatures began to climb. "The oxygen in oceans has dynamic properties, and its concentration can change with natural climate variability," said Taka Ito, an associate professor in Georgia Tech's School of Earth and Atmospheric Sciences who led the research. "The important aspect of our result is that the rate of global oxygen loss appears to be exceeding the level of nature's random variability." The study, which was published in April in Geophysical Research Letters, was sponsored by the National Science Foundation and the National Oceanic and Atmospheric Administration. The team included researchers from the National Center for Atmospheric Research, the University of Washington-Seattle, and Hokkaido University in Japan. Falling oxygen levels in water have the potential to impact the habitat of marine organisms worldwide, and in recent years they have led to more frequent "hypoxic events" that killed or displaced populations of fish, crabs and many other organisms. Researchers have for years anticipated that rising water temperatures would affect the amount of oxygen in the oceans, since warmer water is capable of holding less dissolved gas than colder water. But the data showed that ocean oxygen was falling more rapidly than the corresponding rise in water temperature could explain. "The trend of oxygen falling is about two to three times faster than what we predicted from the decrease of solubility associated with the ocean warming," Ito said. "This is most likely due to the changes in ocean circulation and mixing associated with the heating of the near-surface waters and melting of polar ice." The majority of the oxygen in the ocean is absorbed from the atmosphere at the surface or created by photosynthesizing phytoplankton. Ocean currents then mix that more highly oxygenated water with subsurface water. But rising ocean water temperatures near the surface have made the surface layer more buoyant and harder to mix downward with the cooler subsurface waters. Melting polar ice has added more freshwater to the ocean surface - another factor that hampers the natural mixing and leads to increased ocean stratification. "After the mid-2000s, this trend became apparent, consistent and statistically significant—beyond the envelope of year-to-year fluctuations," Ito said. "The trends are particularly strong in the tropics, eastern margins of each basin and the subpolar North Pacific." In an earlier study, Ito and other researchers explored why oxygen depletion was more pronounced in tropical waters in the Pacific Ocean. They found that air pollution drifting from East Asia out over the world's largest ocean contributed to oxygen levels falling in tropical waters thousands of miles away. Once ocean currents carried the iron and nitrogen pollution to the tropics, photosynthesizing phytoplankton went into overdrive consuming the excess nutrients. But rather than increasing oxygen, the net result of the chain reaction was the depletion of oxygen in subsurface water. That, too, is likely a contributing factor in waters across the globe, Ito said.
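The solubility comparison can be made concrete with a back-of-envelope calculation. This sketch assumes a van 't Hoff-style temperature dependence for dissolved O2 with a commonly tabulated coefficient of roughly 1500 K and an illustrative warming figure; it is not the study's method, only a way to see why warming alone accounts for just a fraction of the observed decline:

```python
import math

# Van 't Hoff temperature dependence of O2 solubility (Henry's law).
# The ~1500 K coefficient and the warming figure are assumptions for
# illustration; real seawater solubility fits are more elaborate.
VANT_HOFF_K = 1500.0   # d(ln kH)/d(1/T) for O2 in water, kelvin
T0 = 288.15            # reference surface temperature (15 C), kelvin

def solubility_fraction(delta_t_c: float) -> float:
    """Fraction of original O2 solubility after warming by delta_t_c."""
    t = T0 + delta_t_c
    return math.exp(VANT_HOFF_K * (1.0 / t - 1.0 / T0))

warming = 0.5  # assumed surface warming, degrees C
loss = 1.0 - solubility_fraction(warming)
print(f"solubility-driven O2 loss for {warming} C warming: {loss:.1%}")
# ~0.9%. If the observed decline is two to three times this, the rest
# must come from reduced mixing and circulation changes, as Ito argues.
```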


It takes a very special person to label the photographed, documented, filmed and studied phenomenon of mass coral bleaching on the Great Barrier Reef “fake news”. You need lashings of chutzpah, blinkers the size of Donald Trump’s hairspray bill and more hubris than you can shake a branch of dead coral at. It also helps if you can hide inside the bubble of the hyper-partisan Breitbart media outlet, whose former boss is the US president’s chief strategist. So our special person is the British journalist James Delingpole who, when he’s not denying the impacts of coral bleaching, is denying the science of human-caused climate change, which he says is “the biggest scam in the history of the world”. Delingpole was offended this week by an editorial in the Washington Post that read: “Humans are killing the Great Barrier Reef, one of the world’s greatest natural wonders, and there’s nothing Australians on their own can do about it. We are all responsible.” Now before we go on, let’s deal with some language here. When we talk about the reef dying, what we are talking about are the corals that form the reef’s structure – the things that when in a good state of health can be splendorous enough to support about 69,000 jobs in Queensland and add about $6bn to Australia’s economy every year. The Great Barrier Reef has suffered mass coral bleaching three times – in 1998, 2002 and 2016 – with a fourth episode now unfolding. The cause is increasing ocean temperatures. “Is the Great Barrier Reef dying due to climate change caused by man’s selfishness and greed?” asks Delingpole, before giving a long list of people and groups who he thinks will answer yes, including “the Guardian” and “any marine biologist”. “Have they been out there personally – as I have – to check? No, of course not,” says Delingpole. Yes. James Delingpole has been out there “personally” to check, but all those other people haven’t. He doesn’t say when he went, but he has written about one trip before. It was back in late April 2012. Everything was fine, he said, based on that one visit. I can’t find any mention of another trip since. So here’s the rhetorical question – one that I can barely believe I’m asking, even rhetorically. Why should there not be equivalence between Delingpole’s single trip to the reef (apparently taken 10 years after a previous severe case of bleaching and four years before the one that followed) at one spot on a reef system that spans the size of Italy [takes breath] and the observations of scientists from multiple institutions diving at 150 different locations to verify observations taken by even more scientists in low-flying aircraft traversing the entire length of the reef? I mean, come on? Why can those two things – Delingpole making a boat trip with mates and a coordinated and exhaustive scientific monitoring and data-gathering exercise – not be the same? So it seems we are now at a stage where absolutely nothing is real unless you have seen it for yourself, so you can dismiss all of the photographs and video footage of bleached and dead coral, the testimony of countless marine biologists (who, we apparently also have to point out, have been to the reef) and the observations made by the government agency that manages the reef. Senator Pauline Hanson and her One Nation climate science-denying colleagues tried to pull a similar stunt last year by taking a dive on a part of the reef that had escaped bleaching and then claiming this as proof that everything was OK everywhere else.
This is like trying to disprove to a doctor that you have two broken legs by showing him an MRI scan of your head (which may or may not reveal the presence of a brain), and then being annoyed when he doesn’t accept your evidence. Last year’s bleaching on the reef was the worst episode recorded to date. The current fourth mass bleaching has scientists again taking to the field. This month a study published in Nature, and co-authored by 46 scientists, found these three episodes had impacted reefs “across almost the entire Great Barrier Reef marine park”. Only southern offshore reefs had escaped. Corals bleach when they are exposed to abnormally high ocean temperatures for too long. Under stress, the corals expel the algae that give them their colour and most of their nutrients. Corals can recover but, as the study explains, even the fastest growing and most vigorous colonisers in the coral family need between 10 and 15 years to recover. After the 2016 bleaching, a quarter of all corals on the reef, mostly located in the once “pristine” northern section, died before there was a chance for recovery. In a further blow, the study looked at factors such as improving water quality or reducing fishing pressure and asked if these had helped corals to resist bleaching. In each case, they found they did not (although they do give reefs that survive a better chance to recover). Essentially, the study found the only measure that would give corals on the reef a fighting chance was to rapidly reduce greenhouse gas emissions. The lead author of the study, Prof Terry Hughes of James Cook University (who is this week carrying out aerial surveys of the current bleaching episode), discussed the findings on my Positive Feedback podcast. Some commentators have suggested a key cause of the 2016 bleaching was the El Niño weather pattern that tends to deliver warmer global temperatures. But Hughes says that before 1998, the Great Barrier Reef went through countless El Niños without suffering the extensive mass bleaching episodes that are being seen, photographed, filmed and documented now. Dr Mark Eakin, head of Coral Reef Watch at the US National Oceanic and Atmospheric Administration, said the cause of the modern-day mass bleaching episodes on reefs across the world was the rise in ocean temperatures. This, says Eakin, is “being driven largely by humans and our burning of fossil fuels”. Government ministers at federal and state levels, of both political stripes, claim they want to protect the reef. They are running this protection racket, somehow, by continuing to support plans for a coalmine that will be the biggest in the country’s history. That’s some more hubris right there.


News Article | May 7, 2017
Site: hosted2.ap.org

(AP) — Federal scientists at a research facility near Seattle are studying ways to make it easier and more efficient to commercially grow a fish prized for its buttery flavor. The project to grow sablefish is part of a larger effort by the National Oceanic and Atmospheric Administration to support marine aquaculture to feed growing demand worldwide for seafood. The sablefish is also known as black cod or butterfish. It is a finfish native to the northeast Pacific Ocean, and it's highly valued in Asia for its delicate flavor. U.S. West Coast fishermen, mostly in Alaska, catch millions of pounds of wild sablefish each year. Some see a potential opportunity to farm the sablefish. NOAA Fisheries researchers are developing new techniques that could help make farming the fish more viable.


News Article | May 4, 2017
Site: www.undercurrentnews.com

The US National Marine Fisheries Service (NMFS) is scouting for a new head, and three officials are in the running to lead it: a former Louisiana official, an Alaskan fishery manager and a Sea Grant program director. NMFS, housed under the National Oceanic and Atmospheric Administration, oversees fishing regulations, endangered species listings and fisheries research. It is headed by an assistant administrator for fisheries, a position that commerce secretary Wilbur Ross can fill without Senate confirmation. The three contenders are Robert Barham, who served as wildlife and fisheries secretary under former Louisiana governor Bobby Jindal; Chris Oliver, longtime executive director of the North Pacific Fishery Management Council; and LaDon Swann, who directs the Mississippi-Alabama Sea Grant Consortium -- one of the 33 Sea Grant programs that President Trump has proposed eliminating.


News Article | May 5, 2017
Site: cleantechnica.com

Just 4 years after atmospheric carbon dioxide levels passed the 400 parts per million mark, levels have now surged past the 410 parts per million (ppm) mark — a good example of the great speed at which atmospheric levels are now increasing. To be clear, atmospheric carbon dioxide levels on Earth haven’t been that high in many millions of years, since back when the planet had a very different climate than it does now. What this means is that, when the lag time required for high atmospheric carbon dioxide levels to have their full climate forcing is factored in, we are now very likely headed towards: a world without polar ice caps, with much higher sea levels, with much higher average temperatures, with greatly changed weather and oceanic circulation patterns, and with a greatly reduced capacity for high-intensity agriculture. While the passing of the 410 ppm mark doesn’t mean much on its own, it does more or less show that the world is still on track for “worst case” scenarios as regards anthropogenic climate change. “It’s pretty depressing that it’s only a couple of years since the 400 ppm milestone was toppled,” Gavin Foster, a paleoclimate researcher at the University of Southampton, commented to Climate Central recently. “These milestones are just numbers, but they give us an opportunity to pause and take stock and act as useful yardsticks for comparisons to the geological record.” “The rate of increase will go down when emissions decrease,” commented Pieter Tans, an atmospheric scientist at the National Oceanic and Atmospheric Administration. “But carbon dioxide will still be going up, albeit more slowly. Only when emissions are cut in half will atmospheric carbon dioxide level off initially.” I’ll note here that I’m a bit skeptical of that claim — much depends on the positive feedback loops that kick in by the time emissions decrease (which I don’t think will ever happen willingly, but rather as a result of population reduction via war, agricultural failure, water scarcity, increasing drug/antibiotic resistance amongst microbes, etc.). The feedback loops that I’m talking about — permafrost melting, increasing aridity and rates of wildfires, widespread desertification, methane clathrate release, reduced Arctic and Antarctic albedo as the result of disappearing ice, etc. — already seem to be kicking in at this point, to some degree or another. With that in mind, it’s been interesting to observe the degree of sheer denial going on collectively/culturally. Not that that’s anything new — a look back at earlier civilizations shows something similar. The pull of collective narratives seems to hold most people in a state of inaction even as the walls literally come down on them. People generally stick with the stories they know until the day they die.
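Tans's arithmetic follows from simple carbon bookkeeping: land and ocean sinks currently absorb roughly half of what is emitted, so atmospheric growth stops only when emissions fall to about the sink strength. A minimal one-box sketch, assuming a constant sink (itself a simplification, given the feedbacks discussed above):

```python
# One-box illustration of why halving emissions would (initially)
# level off atmospheric CO2. All numbers are rough assumptions:
# ~2.12 GtC of airborne carbon corresponds to 1 ppm of CO2, and land
# and ocean sinks currently take up very roughly 5 GtC per year.
PPM_PER_GTC = 1.0 / 2.12

def co2_growth_ppm_per_year(emissions_gtc: float, sink_gtc: float = 5.0) -> float:
    """Annual atmospheric CO2 growth for given fossil emissions (GtC/yr)."""
    return max(emissions_gtc - sink_gtc, 0.0) * PPM_PER_GTC

print(co2_growth_ppm_per_year(10.0))  # ~2.4 ppm/yr, near current emissions
print(co2_growth_ppm_per_year(5.0))   # ~0 ppm/yr once emissions are halved
```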


News Article | May 8, 2017
Site: www.sej.org

"An environmental group is suing the Trump administration, demanding it protect a species of shark. Oceana argues that a rule issued last month by the National Marine Fisheries Service (NMFS) does not go far enough to protect dusky sharks from overfishing. The lawsuit was filed against the Department of Commerce, the National Oceanic and Atmospheric Administration (NOAA) and the NMFS." Tim Devaney reports for The Hill May 4, 2017.


News Article | May 4, 2017
Site: www.eurekalert.org

[VIDEO: The University of Delaware was part of a multinational team that used autonomous underwater vehicles to map deep sea reefs, situated 100 to over 500 feet (30 to over 150 meters) below the ocean surface.] A study authored by University of Delaware Professor Art Trembanis and colleagues reveals new details about deep sea reefs -- known as mesophotic reefs -- near the island of Bonaire in the Dutch Caribbean. While coral reefs worldwide are in decline, the waters surrounding Bonaire comprise a marine park known as a scuba "diver's paradise" because it contains some of the most well-preserved coral reefs in the Caribbean basin. Trembanis and colleagues used autonomous underwater vehicles (AUVs) to map these deep sea reefs, situated 100 to over 500 feet (30 to over 150 meters) below the ocean surface, which are considered a lifeline for shallow reef recovery in the face of stressors like warming (bleaching), ocean acidification, overfishing and other pressures. These deep reefs can be a substantial part of the coral reef ecosystem of any island, yet they remain largely unexplored because they generally are located beyond the capabilities of divers and are too expansive to be studied using submersibles. The researchers hope the mapping effort, and the associated data, will help local conservation efforts. "It's hard to manage what you don't see," said Trembanis, an associate professor in the College of Earth, Ocean, and Environment's School of Marine Science and Policy. Using an AUV called a Teledyne Gavia, equipped with remote sensing, acoustic sonar systems and cameras, Trembanis and colleagues mapped nearly two square kilometers of seafloor around the leeward (downwind) side of Bonaire. The multinational field project was part of a major National Oceanic and Atmospheric Administration (NOAA) Ocean Exploration campaign. More than 20 scientists and engineers from two continents and half a dozen countries participated in the project, including UD undergraduate students on study abroad. The researchers first focused on identifying where these mesophotic reefs were located, then analyzed the data collected to characterize the depth, slope and surface roughness of the seafloor, creating an index of specific bottom types associated with these deeper reefs. "So you might be able to see, hey, the slope is low but there is a big bump there, giving you the physical properties of the area. Then the backscatter from the sonar might tell you something about the nature of the seabed, like whether it is sandy or comprised of hard coral," explained Trembanis. These tools also allow the researchers to classify ecologically important reefs that might be worthy of further investigation. According to Trembanis, some of the reefs that were mapped may have originated when mean sea levels were lower. As sea levels have risen, Trembanis theorizes that these underwater communities have evolved to include only species tolerant of deeper water, providing a refuge in spawning or settling sites not just for the corals, but also for fish. "If we want to try to study refugia connections between shallow and deep reefs, first we want to know where the deep reefs are located," he said. For example, if there are shallow water species in a deep water setting that was disturbed, it might suggest that these species didn't grow in situ but may have been transported there, perhaps by a significant wave event.
"We know that places in Bonaire and the Caribbean are routinely hit by hurricanes and subject to tsunami events from volcanic collapses or eruptions within the basin. We know through other work that the signature of a storm can be seen on the seafloor offshore," he said. "The details of these maps allow us to look for anomalies on the seabed." Two sites showed evidence of reef-like structures at depths greater than 550 feet (approximately 168 meters) with no other associated shallow water reefs nearby, suggesting a submerged reef, rather than a reef created from a collapsed fragment from the shallow water reef above. In another location, a large pile of coral rubble discovered at 550 feet was deposited in such a way that suggests the coral originated in shallower waters, which may indicate that a significant wave event (either storm or tsunami) had an impact on the area. Over 50 percent of the observed reef structures in the study were found outside the marine protected area (MPA), raising important questions on whether an extension of the MPA should be considered to safeguard important sources of shallow coral larvae and photosynthetic algae called zooxanthellae. One interesting piece of information to emerge from the data was the discovery of several slump features likely related either to tsunami events or major hurricanes. According to Trembanis, the data could aid in hazard risk management throughout the Caribbean. "Neighboring islands of Aruba, Curacao and throughout the Dutch West Indies/Caribbean share a common approach to marine management and it's likely that the distributions we are seeing in Bonaire are present in these other islands. This opens the door to future projects to create baseline maps of where deep reefs are located and in what condition," he said. The study, published in the open-source journal Frontiers in Marine Science, was supported by a NOAA Office of Ocean Exploration and Research Award #NA07OAR4600291. Co-authors on the paper include Alexander L. Forrest (University of California-Davis), Bryan M. Keller (UD) and Mark R. Patterson (Northeastern University).


News Article | May 8, 2017
Site: www.eurekalert.org

Our ever-changing sun continuously shoots solar material into space. The grandest such events are massive clouds that erupt from the sun, called coronal mass ejections, or CMEs. These solar storms often come with some kind of warning -- the bright flash of a flare, a burst of heat or a flurry of solar energetic particles. But another kind of storm has puzzled scientists for its lack of typical warning signs: they seem to come from nowhere, and scientists call them stealth CMEs. Now, an international team of scientists, led by the Space Sciences Laboratory at the University of California, Berkeley, and funded in part by NASA, has developed a model that simulates the evolution of these stealthy solar storms. The scientists relied upon the NASA missions STEREO and SOHO for this work, fine-tuning their model until the simulations matched the space-based observations. Their work shows how a slow, quiet process can unexpectedly create a twisted mass of magnetic fields on the sun, which then pinches off and speeds out into space -- all without any advance warning. Compared to typical CMEs, which erupt from the sun as fast as 1,800 miles per second, stealth CMEs move at a rambling gait -- between 250 and 435 miles per second. That's roughly the speed of the more common solar wind, the constant stream of charged particles that flows from the sun. At that speed, stealth CMEs aren't typically powerful enough to drive major space weather events, but because of their internal magnetic structure they can still cause minor to moderate disturbances to Earth's magnetic field. To uncover the origins of stealth CMEs, the scientists developed a model of the sun's magnetic fields, simulating their strength and movement in the sun's atmosphere. Central to the model was the sun's differential rotation, meaning different points on the sun rotate at different speeds. Unlike Earth, which rotates as a solid body, the sun rotates faster at the equator than it does at its poles. The model showed that differential rotation causes the sun's magnetic fields to stretch and spread at different rates. The scientists demonstrated that this constant process generates enough energy to form stealth CMEs over the course of roughly two weeks. The sun's rotation increasingly stresses magnetic field lines over time, eventually warping them into a strained coil of energy. When enough tension builds, the coil expands and pinches off into a massive bubble of twisted magnetic fields -- and without warning -- the stealth CME quietly leaves the sun. Such computer models can help researchers better understand how the sun affects near-Earth space, and potentially improve our ability to predict space weather, as is done for the nation by the U.S. National Oceanic and Atmospheric Administration. A paper published in the Journal of Geophysical Research on Nov. 5, 2016, summarizes this work.
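The two-week buildup is plausible on the back of an envelope. This sketch uses an approximate Snodgrass-type differential rotation profile (the coefficients are assumed from the standard fit; the article gives no numbers) to estimate how far apart two magnetic footpoints, one at the equator and one at 30 degrees latitude, drift in 14 days:

```python
import math

def omega_deg_per_day(lat_deg: float) -> float:
    """Approximate solar rotation rate vs. latitude (Snodgrass-type fit;
    coefficients ~14.71, -2.39, -1.79 deg/day are assumed, not exact)."""
    s2 = math.sin(math.radians(lat_deg)) ** 2
    return 14.71 - 2.39 * s2 - 1.79 * s2 ** 2

days = 14
shear_deg = (omega_deg_per_day(0.0) - omega_deg_per_day(30.0)) * days
print(f"longitude shear after {days} days: {shear_deg:.1f} deg")
# ~10 degrees of accumulated shear: a slow, steady stressing of the
# field, consistent with the quiet two-week build-up the model describes.
```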


News Article | May 4, 2017
Site: www.fao.org

Over 150 fisheries scientists, managers, policy-makers and fishers gathered in Rome to share and discuss ways to generate better information about the world's fisheries and the fish stocks on which they depend. Studying fish and the fisheries that exploit them is complex and challenging. "Fish stocks are invisible to normal means of observation because they live underwater and are often highly mobile," observed Bill Karp, Director of the NOAA* Fisheries Northeast Fisheries Science Center in the United States. "In recent decades many of the world's fish stocks have been heavily fished or even overfished. Sustainable management of these stocks in an ecosystem context requires new and different types of information, which often necessitates effective collaboration among government, academic and fishing sectors. Trust and respect among managers, policy-makers and fishers is essential to this process." Data and information gathered from fishers while fishing, and through collaborative research, are considered a relatively untapped source of information about the world's fish stocks and the consequences of human interactions with them. "The Rome meeting highlighted innovative approaches for capturing and using such information and emphasized the importance of involving fishers and other stakeholders in the collection of the data and in fisheries management and related policy making," explains FAO Senior Fisheries Officer Gabriella Bianchi. The conference provided a unique opportunity for resource managers, scientists and players from the fishing sector to share and discuss better ways to collect, use and interpret information about fisheries in the context of an ecosystem approach to fisheries management. During the closing session, several important findings were noted. Changes in public policy requiring more comprehensive documentation of fishing activities and their impacts on ecosystems are powerful drivers of change, and effective solutions for implementing these policies require multiparty collaboration and empowerment of fishers. An environment for collaboration and participatory science and management, built on a foundation of trust and respect, is essential to successful fisheries management. Social scientists should be encouraged to participate in these processes because they play an essential role in improving our understanding of interactions between humans and marine ecosystems, bring scientific method to understanding resource management economics, and bring professional insight that is useful in breaking down communication barriers. For more information, contact info@fisherydependentdata.com. * National Oceanic and Atmospheric Administration


News Article | April 22, 2017
Site: www.businesswire.com

WASHINGTON--(BUSINESS WIRE)--GEICO takes the effects of hail damage very seriously and offers five reminders to help you be prepared for the next hail storm, which could come at almost any time. There is no confined hail season, but spring activity is the highest. More than 5,400 major hail storms hit the U.S. annually, an average of 15 hail storms a day somewhere in the U.S. Those 15 cause an average of $2 million in losses on a daily basis, or roughly $730 million each year (*National Oceanic and Atmospheric Administration's Severe Storm database). So hail storms have to be taken seriously. Hail is caused when a thunderstorm's wind is severe enough to push raindrops upward into the atmosphere. The extremely cold air supercools the water and causes it to freeze into spheres of ice. This can occur several times, with balls of ice falling and then being lifted by updrafts, collecting condensation as they go. Where and when are hail storms more likely? States that typically have the highest hail risk include Colorado, Iowa, Kansas, Minnesota, Missouri, Nebraska, Oklahoma, South Dakota, Texas, and Wyoming. Peak months for high hail activity are historically March, April, May, and June. During the past five years, claims related to wind and hail damage on a national basis accounted for almost 40 percent of all insured losses, a figure that is growing each year. GEICO (Government Employees Insurance Company) is a member of the Berkshire Hathaway family of companies and is the second-largest private passenger auto insurance company in the United States. GEICO, which was founded in 1936, provides millions of auto insurance quotes to U.S. drivers annually. The company is pleased to serve more than 15 million private passenger customers, insuring more than 24 million vehicles (auto & cycle). Using GEICO's online service center, policyholders can purchase policies, make policy changes, report claims and print insurance ID cards. Policyholders can also connect to GEICO through the GEICO App, reach a representative over the phone or visit a GEICO local agent. GEICO also provides insurance quotes on motorcycles, all-terrain vehicles (ATVs), boats, travel trailers and motorhomes (RVs). Coverage for life, homes and apartments is written by non-affiliated insurance companies and is secured through the GEICO Insurance Agency, Inc. Commercial auto insurance and personal umbrella protection are also available. For more information, go to www.geico.com.


News Article | March 5, 2017
Site: www.techtimes.com

A leaked memo reveals that U.S. President Donald Trump is planning to slash the budget of a major climate science agency by nearly a fifth, a move that experts fear could cost lives worldwide. The White House document, a memo from the Office of Management and Budget, detailed the proposed budget cuts for the National Oceanic and Atmospheric Administration, which undertakes climate change research. The plan also involves measures such as reducing funding for programs enabling U.S. coastal areas to survive extreme weather. The Office of Oceanic and Atmospheric Research could see its budget reduced by 26 percent, or $126 million, while the satellite department could lose 22 percent, or $513 million. "Cutting NOAA's satellite budget will compromise NOAA's mission of keeping Americans safe from extreme weather and providing forecasts that allow businesses and citizens to make smart plans," former NOAA administrator Jane Lubchenco told the Washington Post, which obtained the budget memo on March 3. Trump earlier expressed plans to increase U.S. military spending by $54 billion. This would partly entail cutting environmental initiatives, including those from the Environmental Protection Agency. NOAA leads the country's weather forecasting, weather satellite program, fisheries and ocean services, as well as climate monitoring. But how exactly would you bear the brunt of a reduced climate science budget? Here are some ways, as enumerated by Forbes. Poised for elimination in the White House proposal is the Sea Grant program, which offers research, education, and legal programs to coastal communities for responsible use of oceans, coastal areas, and Great Lakes resources. At least 33 states benefit from the program, which addresses practical issues such as "sunny day flooding" or saltwater intruding into human drinking water. The potential budget cuts involve eliminating a portion of the National Environmental Satellite, Data and Information Service, which also maintains important climate data at the National Centers for Environmental Information. Weather satellites are critical for the public, industry, and military alike, acting like "smoke detectors" and including a fleet of low-Earth and geosynchronous orbiting satellites. Large satellite programs, Forbes reminded, need sustained, consistent research, development, and support, unless one accepts a modern version of the 1900 hurricane that slammed into Galveston, Texas, killing up to 12,000 people. Christian Science Monitor also noted that in practice, NOAA works in collaboration with NASA, pooling funds and combining expertise. The cuts could thus also endanger work being done by the space agency's Earth Science Division, or the operation of next-gen satellites such as JPSS-1. Advances such as smartphones, precision agriculture, GPS, and life-saving medicine stem from sustained R&D, just like advanced weather forecasting. Current capabilities have been borne out of research around satellite systems and models (including a recently announced NOAA upgrade of its main weather modeling system), along with headway in ocean science. Even a one- to four-year lag in research could cause long-term damage, experts fear, especially in the face of a changing climate and steady warming trends in the United States and elsewhere around the world.


News Article | April 19, 2017
Site: www.prlog.org

A local team of innovators and inventors has been chosen to advance to the next round in the HeroX Big Ocean Button Challenge. Tampa Deep Sea Xplorers has been selected to compete for up to $100,000 in prize money in an international competition sponsored by HeroX. The focus of the competition is to design and write mobile applications that help the public utilize ocean data from a variety of sources, such as the National Weather Service, National Oceanic and Atmospheric Administration (NOAA) and U.S. Coast Guard, as well as local public and private sources. The current phase of the competition will continue through August 31, at which time the teams will submit their applications for judging. Tampa Deep Sea Xplorers has two applications in development for the competition. One application is designed to improve small boat navigation in rough seas and poor visibility conditions. The other is an application to provide port conditions to commercial vessels. Ed Larson, founder of Tampa Deep-Sea X-plorers, said, "We are truly honored to have been selected by the HeroX Big Ocean Button Challenge. We were very surprised when both of our entries were chosen to advance to the next stage." Additional information regarding the HeroX Big Ocean Button Challenge can be found by visiting https://herox.com/bigoceanbutton. You can watch a short video that describes the competition here: https://www.youtube.com/watch?v=PxYoCRoLkVI Tampa Deep Sea Xplorers is a Florida-registered LLC which was formed for the purpose of competing for the Shell Ocean Discovery XPRIZE. Information for this competition can be found by visiting https://oceandiscovery.xprize.org The team also competes in local competitions such as the Openwerx Challenge sponsored by Sofwerx in Ybor City. Additional information regarding the Openwerx Challenge can be found by visiting the Sofwerx website at www.sofwerx.org
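Applications like these typically draw on NOAA's public data services. As one hedged illustration (not the team's actual code), the Python sketch below queries NOAA's CO-OPS Tides & Currents API for the latest observed water level at a single station; the station ID is just an example, and the parameters should be checked against NOAA's current API documentation:

import json
import urllib.request

# NOAA CO-OPS data API endpoint (public; no API key required).
BASE = "https://api.tidesandcurrents.noaa.gov/api/prod/datagetter"

def latest_water_level(station):
    """Fetch the most recent observed water level for a NOAA station."""
    params = (
        f"?station={station}"
        "&product=water_level"
        "&date=latest"
        "&datum=MLLW"      # Mean Lower Low Water reference datum
        "&units=english"
        "&time_zone=gmt"
        "&format=json"
    )
    with urllib.request.urlopen(BASE + params) as resp:
        return json.load(resp)

# Example: station 8726520 is the St. Petersburg, Florida gauge.
reading = latest_water_level("8726520")["data"][0]
print(f"Water level at {reading['t']} GMT: {reading['v']} ft above MLLW")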


The international hops shortage has been driven by a combination of increased demand and drought. "It will be our job to monitor the situation over the next few years, as craft beer demand knows no borders — movement isn't slowing down from local and international purchasers. Bulk order exports have increased from China and Russia," said Fraser Valley Hop Farms Inc spokesperson Shane Douglas Toews. Canada saw growth in both the number of acres planted and the yields produced for nearly all hop varieties, despite hot and dry growing climates internationally. Craft breweries continue to pop up across North America, and despite the increase in supply, growers just can't meet the demand for hops, creating a five-year shortage of the flower buds. "Big brewers have had to change the way they're managing supply resources and cost, as access to hops are becoming more and more limited," said Alex Blackwell. "The more acres we control, the more power we have in the brewing game." Several breweries located in British Columbia, Canada, haven't experienced any shortages when ordering hops. This is largely due to companies like Chilliwack Hop Farms Inc, which have secured long-term purchase contracts, sometimes two years in advance. According to the National Oceanic and Atmospheric Administration, the majority of hop farms in the USA have endured average temperature highs not seen in the last 121 years. Both Chilliwack Hop Farms Inc and Fraser Valley Hop Farms Inc, however, are located on the 49th parallel north. The USA-Canada border was designed to follow the 49th parallel, where the sun is above the horizon for 16 hours during the summer season and 8 hours during the winter, making for ideal hop-farming conditions. A lack of capital on hand poses the biggest threat to microbreweries, preventing bulk hop contract orders. Bigger breweries have strategically leveraged purchase contracts, sometimes years in advance. As a result, microbreweries are forced to create one-off beer brews from hop spot lists, made up of whatever hops become available, whenever they become available. Contracts that breweries cannot fulfill, and must subsequently sell, have been a source of hops for cash-strapped microbreweries. Established breweries are unlikely to default on their hops purchase contracts. Million-dollar contracts have been linked to bigger breweries buying up available supply. The shortage has inspired some creativity among brewers, who have been forced to focus on malt-forward beers with a unique, pungent taste, putting familiar beer flavors to the wayside. The biggest deterrent for would-be hop farmers is the high initial cost of setting up. Start-up requirements can include the purchase of a processing facility, storage, tractors, land treatment, crop treatment, human resources and more. However, an alternative to building your own farm is the use of farm partnerships in British Columbia. Companies like Fraser Valley Hop Farms Inc, recently featured on BC Beer News, allow farming partners to participate for annual returns. By dividing a 130-acre farm into 10-acre lots, those with little to no personal hop-farming experience can participate in the industry with a comparatively modest amount of capital. The 10-acre lots have attracted the attention of independent farm-share partners, including BC business owners.
Fraser Valley Hop Farms Inc has recently purchased a C-can storage container, imported from China and delivered to the farm's leased property on Seabird Island First Nation land in Agassiz, British Columbia. The unit will be used for equipment and tool storage. Marketing executives at multinational brewers have also diversified their branding to take a piece of the craft beer market, offering unique flavors in busy commercial storefronts that allow for social interaction. With adequate funding to design, decorate, purchase equipment, source hops, handle payroll and purchase (or lease) real estate, multinational brewers clearly have an advantage. The 2016 growing year in the U.S. was heavily impacted by weather: last summer, parts of the U.S. marked levels of severe or extreme drought approaching records set in 1958, and water shortages are frequent in many of those areas. As Canada has an abundance of water combined with enough sun to propel hop growth, hop farming in the country has accelerated. Hops grow on strings held up by elevated posts or telephone poles. Once the plant gets started, it naturally climbs the string, which guides its growth upward. Only modest maintenance is required from farmers, who water, weed and fertilize on a repeating schedule for the duration of the season. And with an average plant lifespan of 25 years, hops can be considered a largely low-maintenance crop and an easy product to sell.


News Article | May 4, 2017
Site: www.enr.com

Iowa
Wesley Life is planning to develop a skilled-nursing and memory-care facility. Located in Johnston, the three-story facility will total 176,259 sq ft, including 27,211 sq ft devoted to underground parking; it also will include 36 assisted-living units, 50 independent-living units and 36 skilled-nursing units. Pope Associates Inc. is the designer. The project is valued at $30 million. Wesley Life, Attn: Rob Kretzinger, President, 5508 N.W. 88th St., Johnston, 50131. DR#16-00592599.

Kentucky
AppHarvest is planning to construct a 2,000,000-sq-ft greenhouse in Pikeville. The project is valued at $50 million. AppHarvest, Attn: Scott Emerson, Design Manager, P.O. Box 000, Pikeville, 41501. DR#17-00593647.

Louisiana
Our Lady of Holy Cross College is planning to carry out the selective demolition, remediation and renovation of the school's old administration building and the construction of two 60-ft-tall riverfront residences. The project will result in three seven-story buildings, containing 123 units, and the transformation of the administration building into a mixed-use space. Perez APC is the designer of the project, which has been valued at $15 million. Our Lady of Holy Cross College, Attn: Ronald Ambrosetti, President, 4123 Woodland Dr., New Orleans, 70131. DR#13-00467189.

South Carolina
Berry & Cos. is planning to develop Pikeview Place in Regent Park Townhomes, at 5055 Regent Pkwy. in Fort Mill. On a 10-acre site at a golf course, the project will include 40 townhouses and possibly one four-unit building and four six-unit buildings. The two-story homes each will be approximately 1,200 sq ft. True Homes is the designer and general contractor. The project's value has been estimated at $10 million. Berry & Cos., Attn: Tony Berry, 114 E. Main St., Rock Hill, 29730. DR#15-00695049.

Virginia
The City of Virginia Beach is planning to build the Virginia Aquarium Marine Animal Care Center, in Virginia Beach. The project entails constructing two buildings, totaling 18,000 sq ft, plus parking, staging areas and a seawater collection system. Waller Todd & Sadler Architects Inc. is the designer of the project, which has been valued at between $10 million and $15 million. City of Virginia Beach, Attn: Mitch Frazier, Contracting Agent, 2388 Liberty Way, Virginia Beach, 23456. DR#13-00531303.

Washington
IS Property Investments LLC is planning to construct the Highland Village Townhomes at 600 146th Ave., N.E., in Bellevue. The project will consist of 87 three-story townhouse units in 19 buildings, ranging from "three-plex" to "six-plex" configurations, with underground parking. Milbrandt Architects is the designer of the project, which has been valued at between $15 million and $25 million. IS Property Investments LLC, 419 Occidental Ave., Seattle, 98104. DR#16-00522647.

New York
Invenergy LLC has started to redevelop the 149.8-acre Tallgrass Golf Course in Shoreham into a 24.9-MW solar farm. The facility will consist of ground-mounted, stationary and nontracking solar arrays, with 110,000 72-cell polycrystalline modules. The clubhouse will be converted into a community center, and the cart path will be repurposed as a multi-use recreational trail. The project is valued at $99.6 million. Invenergy LLC, Attn: Mary Ryan, Public Relations, One S. Wacker Dr., Chicago, Ill. 60606. DR#16-00558119.

Vermont
Engelberth Construction Inc., serving as construction manager-at-risk, has begun constructing Mack Hall, a new academic building, for Norwich University, in Northfield. The three-story, 51,000-sq-ft building will be located at 158 Harmon Dr. Construction is expected to be completed by August 2018, and the project has been valued at $18 million. Engelberth Construction Inc., Attn: Chris Yandow, Project Manager, 463 Mountain View Dr., Colchester, 05446. DR#15-00647377.

Indiana 5/18
The Western Wayne Regional Sewer District is seeking bidders to carry out an expansion of the district's wastewater treatment plant in Cambridge City. The construction team will install a new influent grinder, raw-sewage pumps, an influent screening and grit-removal structure, vertical loop reactors, secondary clarifiers, ultraviolet disinfection, aerobic digesters, sludge dewatering and storage buildings, a chemical feed system, and a maintenance garage. The project has been valued at $12.8 million. Western Wayne Regional Sewer District, Attn: Darlene Druley, Superintendent, 200 S. Plum St., Cambridge City, 47327. DR#15-00444142.

Oregon 7/14
The U.S. Army Corps of Engineers is seeking bidders to carry out the replacement of turbine units at the McNary Lock and Dam Powerhouse on the Columbia River at Umatilla. The project will entail supplying and installing 14 new hydroelectric turbine runners, rewinding three main-unit generators, and rehabilitating or replacing other critical hydraulic passage and powertrain equipment. The contractor will collaborate in the design process with engineers and biologists from the Corps, the Bonneville Power Administration and the National Oceanic and Atmospheric Administration. The project is valued at between $250 million and $500 million. U.S. Army Corps of Engineers, Attn: Phyllis Buerstatte, Contract Manager, 201 N. Third Ave., Walla Walla, 99362. DR#17-00592577.

Much information for Pulse is derived from Dodge Data & Analytics, the premier project information source in the construction industry. For more information on a project that has a Dodge Report (DR) number or for general information on Dodge products and services, call 1-800-393-6343 or visit the website at www.dodgeleadcenter.com.


News Article | April 22, 2017
Site: news.yahoo.com

Washington (AFP) - Scientists and their supporters across the globe are expected to march in the thousands Saturday amid growing anxiety over what many see as a mounting political assault on facts and evidence. Anchored in Washington, with satellite marches planned in more than 600 cities worldwide, the first-ever March for Science was described by organizers as a rallying call for the importance of science in all aspects of daily life. "The march has generated a great deal of conversation around whether or not scientists should involve themselves in politics," said a statement on the official website, MarchforScience.com. "In the face of an alarming trend toward discrediting scientific consensus and restricting scientific discovery, we might ask instead: can we afford not to speak out in its defense?" Organizers say the march is non-partisan and is not aimed against US President Donald Trump or any politician or party, though the Republican US leader's administration has certainly "catalyzed" the movement, according to honorary national co-chair Lydia Villa-Komaroff, a molecular and cellular biologist. "There seems to have become this disconnect between what science is and its value to society," she told reporters this week. "Fundamental basic science really underlies all of modern life these days. We have taken it so for granted." Trump has vowed to slash budgets for research at top US agencies, including the National Institutes of Health, NASA, the National Oceanic and Atmospheric Administration, and the Environmental Protection Agency, which could lose one-third of its staff if Congress approves the proposal. He also named as head of the EPA Oklahoma lawyer Scott Pruitt, who claimed last month that carbon dioxide is not the main driver of global warming, a position starkly at odds with the international scientific consensus on the matter. "In the response to this absurdity lies cause for hope," Paul Hanle, chief executive officer of Climate Central, an independent organization of scientists and journalists, wrote in an op-ed this week. "Seeing the assault on fact-based thinking, scientists are energized." The US capital rally begins Saturday at 8:00 am (1200 GMT), and will be capped with a march from the National Mall to the Capitol at 2:00 pm. Hundreds of satellite marches are planned across the United States and worldwide -- with more than 600 listed as of Friday -- including in Australia, Brazil, Canada, many nations in Europe, Japan, Mexico, Nepal, Nigeria and South Korea. At a time when the Earth has marked three consecutive years of record-breaking heat, and ice is melting at an unprecedented rate at the poles, risking massive sea level rise in the decades ahead, some marchers say it is more important than ever for scientists to communicate and work toward solutions to curb fossil fuel emissions. "I will be marching in London on Saturday not so much to fly the flag for science -- though I believe it is something worth celebrating -- but because I think that in these fractious political times, when we are facing challenges that are truly global, it has never been more important for scientists to go public," said Stephen Curry, vice-chair of Science is Vital and professor of structural biology at Imperial College London.
David Reay, a professor of carbon management at the University of Edinburgh, said scientists "are not famous for their camaraderie. We are trained to question, criticize and, where needed, contest each other's work. That we are now marching together is testament to just how threatened our disparate community feels."


News Article | April 22, 2017
Site: news.yahoo.com

Scientists and their supporters across the globe are expected to march in the thousands Saturday amid growing anxiety over what many see as a mounting political assault on facts and evidence. Anchored in Washington, with satellite marches planned in more than 600 cities worldwide, the first-ever March for Science was described by organizers as a rallying call for the importance of science in all aspects of daily life. "The march has generated a great deal of conversation around whether or not scientists should involve themselves in politics," said a statement on the official website, MarchforScience.com. "In the face of an alarming trend toward discrediting scientific consensus and restricting scientific discovery, we might ask instead: can we afford not to speak out in its defense?" Organizers say the march is non-partisan and is not aimed against US President Donald Trump or any politician or party, though the Republican US leader's administration has certainly "catalyzed" the movement, according to honorary national co-chair Lydia Villa-Komaroff, a molecular and cellular biologist. She spoke of a growing "disconnect between what science is and its value to society." "Fundamental basic science really underlies all of modern life these days. We have taken it so for granted," Villa-Komaroff told reporters this week. Trump has vowed to slash budgets for research at top US agencies, including the National Institutes of Health, NASA, the National Oceanic and Atmospheric Administration, and the Environmental Protection Agency, which could lose a third of its staff if Congress approves the proposal. He also named as head of the EPA Oklahoma lawyer Scott Pruitt, who claimed last month that carbon dioxide is not the main driver of global warming, a position starkly at odds with the international scientific consensus on the matter. "In the response to this absurdity lies cause for hope," Paul Hanle, chief executive officer of Climate Central, an independent group of scientists and journalists, wrote in an op-ed this week. "Seeing the assault on fact-based thinking, scientists are energized." The US capital rally begins Saturday at 8:00 am (1200 GMT), and will be capped with a march from the National Mall to the Capitol at 2:00 pm. Hundreds of satellite marches are planned across the United States and worldwide -- with more than 600 listed as of Friday -- including in Australia, Brazil, Canada, many nations in Europe, Japan, Mexico, Nepal, Nigeria and South Korea. At a time when the Earth has marked three consecutive years of record-breaking heat, and ice is melting at an unprecedented rate at the poles, risking massive sea level rise in the decades ahead, some marchers say it is more important than ever for scientists to communicate and work toward solutions to curb fossil fuel emissions. "I will be marching in London on Saturday not so much to fly the flag for science -- though I believe it is something worth celebrating -- but because I think that in these fractious political times, when we are facing challenges that are truly global, it has never been more important for scientists to go public," said Stephen Curry, vice-chair of Science is Vital and professor of structural biology at Imperial College London. Some scientists, however, expressed concern that the march might increase polarization.
"The right will say the demonstration is the tool of the political left," Robert Young, a geologist at Western Carolina University, told AFP. "That is why a march is a problem, it's the wrong way to try to communicate." Despite his concerns, Young said he planned to join the march. David Reay, a professor of carbon management at the University of Edinburgh, said scientists "are not famous for their camaraderie. We are trained to question, criticize and, where needed, contest each other's work." "That we are now marching together is testament to just how threatened our disparate community feels," he added.


News Article | March 11, 2017
Site: www.techtimes.com

The greenhouse gas carbon dioxide has long been considered a primary contributor to global warming, but Environmental Protection Agency administrator Scott Pruitt does not think so. On Thursday, March 9, Pruitt made a statement on CNBC's Squawk Box that appears to contradict the public stance of the agency he leads. While the EPA says that carbon dioxide is the primary greenhouse gas that contributes to climate change, Pruitt expressed his doubts about the impact of carbon emissions on global warming. "I think that measuring with precision human activity on the climate is something very challenging to do and there's tremendous disagreement about the degree of impact, so no, I would not agree that it's a primary contributor to the global warming that we see," Pruitt said. The EPA is a crucial agency when it comes to issues and actions linked to climate change. In 2007, the Supreme Court ruled that the EPA has authority to regulate the heat-trapping gases produced by vehicles. Seven years after that, it determined that the agency may also regulate some stationary sources of greenhouse gases, such as power plants. In response to Pruitt's statement, Sen. Brian Schatz, D-Hawaii, co-chair of the Senate Climate Action Task Force, described the EPA chief's statement as extreme and irresponsible, saying that anyone who denies basic facts and a century's worth of established science is not fit to be the EPA's administrator. Democrats and environmentalists were not in favor of Pruitt leading the EPA because of his close association with fossil fuel companies and his history of casting doubt on man-made climate change. A study published in PNAS has shown that the majority of scientists who have published papers on climate science are convinced of man-made climate change. "We use an extensive dataset of 1,372 climate researchers and their publication and citation data to show that (i) 97-98 percent of the climate researchers most actively publishing in the field surveyed here support the tenets of anthropogenic climate change (ACC) outlined by the Intergovernmental Panel on Climate Change," the researchers wrote in the study. Other U.S. agencies, notably the National Oceanic and Atmospheric Administration and NASA, also support the idea of carbon dioxide being a primary contributor to global warming. In January this year, the U.S. space agency and NOAA said that the average surface temperature of the planet has increased by about 2.0 degrees Fahrenheit since the late 19th century, a change they attribute largely to increased levels of carbon dioxide and other man-made emissions. Scientists have been raising alarms over rising temperatures and attribute potentially damaging phenomena such as massive coral bleaching, extinction of animals, and evolution of species to a warming planet. In 2015, the United States was among more than 190 nations that approved the landmark climate agreement that aims to reduce greenhouse gas emissions. Reducing carbon dioxide emissions into the atmosphere is a particular concern for many nations, and minimizing the release of the greenhouse gas is considered key to the enforcement of the Paris treaty.


News Article | April 17, 2017
Site: hosted2.ap.org

(AP) — Researchers say preliminary findings show a North Atlantic right whale may have been struck by a ship before the animal was found dead in Massachusetts waters. Officials with the National Oceanic and Atmospheric Administration say bruising consistent with blunt trauma could be evidence of a ship strike. North Atlantic right whales are an endangered species; the World Wildlife Fund says only about 350 are still living. NOAA is urging vessels to keep watch for right whales, which often swim just below the water's surface and can be hard to see. The 27-foot-long, 1-year-old female was found dead in Cape Cod Bay on Thursday and towed to a harbor where it could be placed on a flatbed for transport. A final analysis is expected to take weeks.


News Article | April 17, 2017
Site: hosted2.ap.org

(AP) — Researchers say preliminary findings show a North Atlantic right whale may have been struck by a ship before the animal was found dead in Massachusetts waters. Officials with the National Oceanic and Atmospheric Administration say bruising consistent with blunt trauma could be evidence of a ship strike. North Atlantic right whales are critically endangered; only a few hundred still exist in the world. NOAA is urging vessels to keep a close watch for right whales, which often swim just below the water's surface and can be hard to see. The 27-foot-long, 1-year-old female was found dead in Cape Cod Bay on Thursday and towed to a harbor where it could be placed on a flatbed for transport. A final analysis is expected to take weeks.


News Article | April 28, 2017
Site: www.chromatographytechniques.com

Government scientists launched an investigation Thursday into an unusually large number of humpback whale deaths from North Carolina to Maine, the first such "unusual mortality event" declaration in a decade. Forty-one whales have died in the region in 2016 and so far in 2017, far exceeding the average of about 14 per year, said Deborah Fauquier, a veterinary medical officer with the National Oceanic and Atmospheric Administration's National Marine Fisheries Service. Ten of the 20 whales that have been examined so far were killed by collisions with boats, something scientists are currently at a loss to explain because there's been no corresponding spike in ship traffic. The investigation will focus on possible common threads like toxins and illness, prey movement that could bring whales into shipping lanes, or other factors, officials said. Humpbacks can grow to 60 feet long and are found in oceans around the world. They're popular with whale watchers because of the dramatic way they breach the ocean's surface, then flop back into the water. "The humpback is generally people's favorite because they're so animated. They're the ones that like to jump out of the ocean completely," said Zack Klyver, a naturalist with Bar Harbor Whale Watch Company. The humpback whale population that feeds in North Atlantic waters each summer was removed from the Endangered Species Act last year when NOAA divided humpback populations into 14 distinct population segments around the world. There are currently about 10,500 in the population that visits North Atlantic waters, scientists say. While they're not threatened, federal scientists are nonetheless keeping close tabs on the whales, said NOAA spokeswoman Kate Brogan. The humpback whale deaths that prompted the "unusual mortality event" designation break down to 26 last year and 15 to date this year. NOAA also declared "unusual mortality events" involving humpbacks in 2003, 2005 and 2006, Fauquier said. No conclusive cause of the deaths was determined in those investigations, she said. The 10 confirmed fatal boat strikes far exceed the annual average of fewer than two per year attributed to boat collisions, officials said. Whales tend to be somewhat oblivious to boats when they're feeding or socializing, said Gregory Silber, coordinator of recovery activities for large whales in NOAA's Office of Protected Resources. "A vessel of any size can harm a whale. In smaller vessels they tend to be propeller strikes. And in larger vessels they appear to be in the form of blunt trauma, hemorrhaging or broken bones," he said. Klyver said any whale death is upsetting. Scientists and whale watchers know many of the whales that visit each summer. "Each whale has its own personality," he said. "We are connected to so many of them as individuals that we hate to see any of them perish."
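To see why 26 deaths in one year stands out against a long-run average of about 14, one can ask how likely such a count would be if deaths simply arrived at random at the historical rate. The Poisson calculation below is purely illustrative and is not NOAA's actual criterion for declaring an unusual mortality event:

from scipy.stats import poisson

AVERAGE_DEATHS_PER_YEAR = 14   # long-run average cited above
OBSERVED_2016 = 26             # humpback deaths recorded in 2016

# Probability of 26 or more deaths in a year if deaths occurred
# randomly at the historical average rate.
p_tail = poisson.sf(OBSERVED_2016 - 1, AVERAGE_DEATHS_PER_YEAR)
print(f"P(deaths >= {OBSERVED_2016}) = {p_tail:.4f}")  # well under 1 percent

A count that improbable under the historical rate is the intuition behind calling the cluster unusual, although the formal designation rests on expert review rather than any single statistical test.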


News Article | April 17, 2017
Site: phys.org

Officials with the National Oceanic and Atmospheric Administration say bruising consistent with blunt trauma could be evidence of a ship strike. North Atlantic right whales are an endangered species; the World Wildlife Fund says only about 350 are still living. NOAA is urging vessels to keep watch for right whales, which often swim just below the water's surface and can be hard to see. The 27-foot-long, 1-year-old female was found dead in Cape Cod Bay on Thursday and towed to a harbor where it could be placed on a flatbed for transport. A final analysis is expected to take weeks.


News Article | April 27, 2017
Site: hosted2.ap.org

(AP) — Government scientists launched an investigation Thursday into an unusually large number of humpback whale deaths from North Carolina to Maine, the first such "unusual mortality event" declaration in a decade. Forty-one whales have died in the region in 2016 and so far in 2017, far exceeding the average of about 14 per year, said Deborah Fauquier, a veterinary medical officer with the National Oceanic and Atmospheric Administration's National Marine Fisheries Service. Ten of the 20 whales that have been examined so far were killed by collisions with boats, something scientists are currently at a loss to explain because there's been no corresponding spike in ship traffic. The investigation will focus on possible common threads like toxins and illness, prey movement that could bring whales into shipping lanes, or other factors, officials said. Humpbacks can grow to 60 feet long and are found in oceans around the world. They're popular with whale watchers because of the dramatic way they breach the ocean's surface, then flop back into the water. "The humpback is generally people's favorite because they're so animated. They're the ones that like to jump out of the ocean completely," said Zack Klyver, a naturalist with Bar Harbor Whale Watch Company. The humpback whale population that feeds in North Atlantic waters each summer was removed from the Endangered Species Act last year when NOAA divided humpback populations into 14 distinct population segments around the world. There are currently about 10,500 in the population that visits North Atlantic waters, scientists say. While they're not threatened, federal scientists are nonetheless keeping close tabs on the whales, said NOAA spokeswoman Kate Brogan. The humpback whale deaths that prompted the "unusual mortality event" designation break down to 26 last year and 15 to date this year. NOAA also declared "unusual mortality events" involving humpbacks in 2003, 2005 and 2006, Fauquier said. No conclusive cause of the deaths was determined in those investigations, she said. The 10 confirmed fatal boat strikes far exceed the annual average of fewer than two per year attributed to boat collisions, officials said. Whales tend to be somewhat oblivious to boats when they're feeding or socializing, said Gregory Silber, coordinator of recovery activities for large whales in NOAA's Office of Protected Resources. "A vessel of any size can harm a whale. In smaller vessels they tend to be propeller strikes. And in larger vessels they appear to be in the form of blunt trauma, hemorrhaging or broken bones," he said. Klyver said any whale death is upsetting. Scientists and whale watchers know many of the whales that visit each summer. "Each whale has its own personality," he said. "We are connected to so many of them as individuals that we hate to see any of them perish." This story has been corrected to attribute a comment about past mortality events to Deborah Fauquier, a veterinary medical officer with NOAA, instead of Mendy Garron, NOAA's regional stranding coordinator.


News Article | May 4, 2017
Site: www.prnewswire.com

Net income and earnings before interest, taxes, depreciation and amortization ("EBITDA") for the second quarters of fiscal 2017 and 2016 included losses on debt extinguishment of $1.6 million and $0.3 million, respectively. Excluding the effects of the foregoing items and unrealized (non-cash) mark-to-market adjustments on derivative instruments in both periods, Adjusted EBITDA (as defined below) amounted to $138.0 million for the second quarter of fiscal 2017, compared to $145.1 million in the prior year second quarter. In announcing these results, President and Chief Executive Officer Michael A. Stivala said, "The propane industry as a whole has just endured an unprecedented stretch of back-to-back record warm winters. The fiscal 2017 heating season was a bit more challenging than the prior year, with four out of six months reported as record warm, and two separate three-week stretches of cold weather toward the end of each of the first and second quarters. Nonetheless, we are proud of the way our operating personnel managed through the challenges presented by this sustained warm weather trend. We remain focused on the things we can control – delivering exceptional customer satisfaction in every market we serve, driving further operating efficiencies, controlling costs and capital spending levels, managing margins in a challenging commodity price environment and executing on our customer base growth and retention initiatives. During the short bursts of cold weather that did arrive, our volumes and earnings responded as expected, which is a testament to the strength and readiness of our platform." Mr. Stivala continued, "As a result, we are pleased to report an improvement of $10 million, or nearly 5%, in Adjusted EBITDA for the first half of fiscal 2017 compared to the prior year. We also took proactive steps to lower our interest costs and enhance liquidity. During the second quarter of fiscal 2017, we successfully refinanced our previous 7 3/8% Senior Notes due 2021 with the issuance of 5 7/8% Senior Notes due 2027, effectively extending maturities on $350.0 million of our debt by six years at a very attractive interest rate, and reducing our cash interest requirement by nearly $5.0 million annually." On May 1, 2017, the Partnership secured an amendment to its revolving credit facility whereby the maximum consolidated leverage ratio covenant has been increased from 5.50 to 5.95 starting with the fiscal quarter ending June 2017 and continuing through each of the fiscal quarters ending June 2018, stepping down to 5.75 for the quarter ending September 2018 and then returning to the pre-amendment level of 5.50 for the fiscal quarter ending December 2018, which is the Partnership's first quarter of fiscal 2019. Discussing the amendment, Mr. Stivala said, "Our decision to secure this amendment was to proactively enhance our liquidity position in light of the recent warm weather trends. The amendment provides added cushion under our consolidated leverage test, which we believe is a prudent measure that gives us added flexibility to support our cash needs and continue to invest in our long-term, strategic growth initiatives. We received strong support from our bank group for this amendment, which is a testament to their confidence in our operating philosophy and business fundamentals." Retail propane gallons sold in the second quarter of fiscal 2017 of 153.9 million gallons decreased 7.7 million gallons, or 4.8%, compared to the prior year second quarter.
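The arithmetic behind an Adjusted EBITDA reconciliation is simple addition, which a short sketch can make concrete. In the Python below, the loss on debt extinguishment and the unrealized mark-to-market loss are the second-quarter fiscal 2017 figures quoted in this release, while net income and income taxes are hypothetical placeholders, so the output deliberately will not match the Partnership's reported $138.0 million:

# Adjusted EBITDA reconciliation using the standard definitions.
# Figures in $ millions; placeholders are marked as such.
net_income = 85.0                    # hypothetical placeholder
interest_expense = 17.5              # net interest expense, per the release
income_taxes = 0.5                   # hypothetical placeholder
depreciation_amortization = 32.7     # per the release

ebitda = (net_income + interest_expense + income_taxes
          + depreciation_amortization)

# Items the release excludes to arrive at Adjusted EBITDA:
loss_on_debt_extinguishment = 1.6    # per the release
unrealized_mtm_loss = 2.5            # non-cash mark-to-market, per the release

adjusted_ebitda = ebitda + loss_on_debt_extinguishment + unrealized_mtm_loss
print(f"EBITDA: ${ebitda:.1f}M  Adjusted EBITDA: ${adjusted_ebitda:.1f}M")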
Sales of fuel oil and other refined fuels decreased 0.3 million gallons, or 2.3%, compared to the prior year. According to the National Oceanic and Atmospheric Administration, average temperatures (as measured by heating degree days) across all of the Partnership's service territories for the second quarter of fiscal 2017 were 15% warmer than normal, and 2% warmer than the prior year second quarter. In certain markets, principally the Partnership's Midwest and Southeast service territories, average temperatures were approximately 30% warmer than normal and 16% warmer than the prior year second quarter. In the Partnership's Northeast and West Coast regions, the Partnership experienced cooler weather compared to the prior year, which helped contribute to an increase in volumes sold in those markets. Revenues in the second quarter of fiscal 2017 of $450.6 million increased $46.4 million, or 11.5%, compared to the prior year second quarter, primarily due to higher retail selling prices associated with higher wholesale product costs, offset to an extent by lower volumes sold. Average posted propane prices (basis Mont Belvieu, Texas) and fuel oil prices were 84.6% and 48.9% higher than the prior year, respectively. Cost of products sold for the second quarter of fiscal 2017 of $192.5 million increased $55.5 million, or 40.5%, compared to $137.0 million in the prior year, primarily due to higher wholesale product costs. Cost of products sold included a $2.5 million unrealized (non-cash) loss attributable to the mark-to-market adjustment for derivative instruments used in risk management activities, compared to a $0.7 million unrealized (non-cash) loss in the prior year second quarter. These unrealized losses are excluded from Adjusted EBITDA for both periods. Combined operating and general and administrative expenses of $122.6 million for the second quarter of fiscal 2017 were essentially flat with the prior year second quarter. Savings in fixed costs due to steps taken in the prior fiscal year to streamline operations and achieve operating efficiencies were offset by higher general insurance expenses, vehicle fuel costs, and higher provisions for potential uncollectible accounts as a result of higher commodity prices. Depreciation and amortization expense of $32.7 million decreased $0.5 million, or 1.5%, compared to the prior year second quarter. Net interest expense of $17.5 million decreased $1.4 million compared to the prior year second quarter, primarily due to savings from the refinancing of the Partnership's previously outstanding 2021 Senior Notes. During the second quarter of fiscal 2017, the Partnership funded working capital, capital expenditures and costs associated with the repurchase of its 2021 Senior Notes from operating cash flow and $26.4 million of incremental borrowings under its revolving credit facility. As previously announced on April 20, 2017, the Partnership's Board of Supervisors has declared a quarterly distribution of $0.8875 per Common Unit for the three months ended March 25, 2017. On an annualized basis, this distribution rate equates to $3.55 per Common Unit. The distribution is payable on May 9, 2017 to Common Unitholders of record as of May 2, 2017. Suburban Propane Partners, L.P. is a publicly traded master limited partnership listed on the New York Stock Exchange. Headquartered in Whippany, New Jersey, Suburban has been in the customer service business since 1928.
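Heating degree days, the temperature measure cited above, accumulate the number of degrees by which each day's mean temperature falls below a base of 65 degrees Fahrenheit, so a warmer-than-normal quarter produces fewer degree days and, for a propane distributor, less heating demand. A minimal sketch, with invented sample temperatures (the 65°F base is the standard convention):

# Heating degree days: sum of max(0, 65 - daily mean temperature in F).
BASE_TEMP_F = 65.0

def heating_degree_days(daily_mean_temps_f):
    return sum(max(0.0, BASE_TEMP_F - t) for t in daily_mean_temps_f)

# Two hypothetical weeks of daily mean temperatures (degrees F):
normal_week = [30, 28, 35, 40, 33, 29, 31]
warm_week = [45, 50, 48, 55, 52, 47, 49]

print(heating_degree_days(normal_week))  # 229.0
print(heating_degree_days(warm_week))    # 109.0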
The Partnership serves the energy needs of approximately 1.1 million residential, commercial, industrial and agricultural customers through 675 locations in 41 states. This press release contains certain forward-looking statements relating to future business expectations and financial condition and results of operations of the Partnership, based on management's current good faith expectations and beliefs concerning future developments. These forward-looking statements are subject to certain risks and uncertainties that could cause actual results to differ materially from those discussed or implied in such forward-looking statements. Some of these risks and uncertainties are discussed in more detail in the Partnership's Annual Report on Form 10-K for its fiscal year ended September 24, 2016 and other periodic reports filed with the SEC. Readers are cautioned not to place undue reliance on forward-looking statements, which reflect management's view only as of the date made. The Partnership undertakes no obligation to update any forward-looking statement, except as otherwise required by law. EBITDA and Adjusted EBITDA are not recognized terms under accounting principles generally accepted in the United States of America ("US GAAP") and should not be considered as an alternative to net income or net cash provided by operating activities determined in accordance with US GAAP. Because EBITDA and Adjusted EBITDA as determined by us exclude some, but not all, items that affect net income, they may not be comparable to EBITDA and Adjusted EBITDA or similarly titled measures used by other companies. The unaudited financial information included in this document is intended only as a summary provided for your convenience, and should be read in conjunction with the complete consolidated financial statements of the Partnership (including the Notes thereto, which set forth important information) contained in its Quarterly Report on Form 10-Q to be filed by the Partnership with the United States Securities and Exchange Commission ("SEC"). Such report, once filed, will be available on the public EDGAR electronic filing system maintained by the SEC. To view the original version on PR Newswire, visit: http://www.prnewswire.com/news-releases/suburban-propane-partners-lp-announces-second-quarter-earnings-300451179.html


News Article | March 21, 2017
Site: www.huffingtonpost.com

President Donald Trump signed a bill into law on Tuesday that authorizes $19.5 billion in NASA funding for the 2018 budget year and adds human exploration of Mars as an agency objective, The Associated Press reports. Trump signed the NASA Transition Authorization Act as astronauts and the bill's sponsors, including Sens. Ted Cruz (R-Texas) and Marco Rubio (R-Fla.), looked on. The new law allows NASA to redouble its deep space exploration efforts and develop a manned mission to Mars. The Trump administration released a preliminary budget last week that proposed a $19.1 billion budget for the space agency next year, down $200 million from the current budget of $19.3 billion. However, Tuesday's bill reversed course, boosting NASA's budget to $19.5 billion for the 2018 fiscal year, which begins October 1. "I'm delighted to sign this bill," Trump said. "It's been a long time since a bill like this has been signed, reaffirming our national commitment to the core mission of NASA, human space exploration, space science and technology." "With this legislation, we support NASA's scientists, engineers and astronauts and their pursuit of discovery," he continued, adding that the new legislation would support jobs. "This bill will make sure that NASA's most important and effective programs are sustained." Sen. Bill Nelson (D-Fla.), who spent six days orbiting the Earth aboard the space shuttle Columbia in 1986, stood beside Trump during the signing ceremony. "It puts us on the dual track," Nelson said. "We have the commercial companies going to and from the International Space Station and we have NASA going out and exploring the heavens. And we're going to Mars." While NASA was largely spared from major funding cuts, Trump's preliminary budget also proposes to slash funding for other environmental and science agencies. The Environmental Protection Agency faces a 31.4 percent funding cut under the new budget proposal, which would also eliminate spending on the U.S. Agency for International Development's Global Climate Change Initiative and scale back the National Oceanic and Atmospheric Administration's use of satellites to monitor polar icecap melt.


News Article | May 1, 2017
Site: phys.org

Charles "Stormy" Mayo, director of right whale ecology at the federally funded Center for Coastal Studies in Provincetown, Massachusetts, says ominous signs suggest the global population of 500 animals is slowly declining—not incrementally rebounding as experts had hoped a year ago. And the whales, it turns out, can be pretty ornery. The Associated Press asked Mayo about how the whales—some of the rarest creatures on the planet—are really faring: Q: Right whales are back in Cape Cod Bay for the second spring in a row. That must make you pretty happy? A: Not entirely. The whole story on right whales is very simple arithmetic: How many die and how many are born? The birth rate this year is extraordinarily low. We've only seen four whales born in the North Atlantic. And the mortality rate is up. Q: How many have died? A: There have been at least four deaths, but those are whales whose carcasses have been found. We know more whales have died offshore and haven't been found. The result is a decline in the population. It's troubling because we can't seem to control the human causes—entanglements and ship strikes—and we're not sure why more calves aren't being born. Q: And yet so many of these animals are being seen. Isn't that good news? A: These are extraordinarily rare animals whose habitat stretches theoretically all the way to Spain, so yes. During a single eight-hour spotting flight, we've seen 200 whales. That's 40 percent of the estimated population. But we have to ask ourselves why they're feeding here now. It may indicate that the places they used to feed are failing. When they make these radical changes, it's a little worrisome. Q: What's the biggest threat to these whales? A: Vessel strikes seem to have dropped through a collective effort by the National Oceanic and Atmospheric Administration and the U.S. Coast Guard to reduce speed in certain shipping channels. So now, if there are collisions, they're less severe. But entanglements in fishing gear are a big problem. Eighty percent of the population bears entanglement scars. As we speak, somewhere offshore there's a whale that's dragging gear. Q: What happens to whales when they're entangled? A: Many manage to free themselves. But others drag gear for months, and we've seen individuals that have dragged it for years. There's some evidence that female right whales, if they've been entangled, are weakened and have a diminished capacity to reproduce. That may help explain the low calving rate. But we can't get to every animal, and it's not easy to cut the ropes off. Right whales are one of the most ornery creatures I've ever dealt with. Q: Really? They look so gentle. A: Well, let me wrap some rope around your neck. You can't get it off. Then down comes an alien with weird hooks poking at you. Trust me: When they want to unload, they are not easy animals to deal with. Q: Is the public getting lulled into thinking the whales are OK just because we're seeing them every spring? A: I think so. People think, 'Oh, this is good stuff.' But we're looking at animals that spend a lot of time in waters where there's a lot of ship traffic, places that have a lot of fishing gear. Look at their fate during an entire year and it doesn't look so good. Explore further: Spate of whale entanglements could inform regulations


News Article | May 7, 2017
Site: www.thefishsite.com

Alaska’s salmon season officially gets underway in less than two weeks. The first fishery for sockeye and king salmon is set for May 18 at Copper River and the town of Cordova is buzzing, said Christa Hoover, executive director of the Copper River/Prince William Sound Marketing Association. “The mood changes at the start of May with all the folks back in town and boats going in and out of the water,” she said. Enthusiasm among the fleet of more than 500 drift gillnetters has not been dampened by a reduced harvest projection. Fishery managers expect a Copper River salmon catch this season of just 889,000 sockeyes, 4,000 kings and 207,000 coho salmon. “Regardless of the forecast from one year to the next, fishermen just want to have their nets in the water. It’s what they do and they are ready to go,” Hoover said. The marketing group, which is funded and operated by local salmon fishermen, is again working with Alaska Airlines to whisk away the first catches to awaiting retailers and restaurants in Seattle. Every year, images of airline pilots carrying the famous “first fish” off the plane make headlines around the world and add to the media hoopla surrounding the Copper River catches. The salmon are first hand-delivered to three chefs who have a cook-off on the Sea-Tac airport tarmac. The dishes are served to airline guests who select a winner. The Cordova group also uses the opportunity to promote the fact that Copper River salmon isn’t just a “May event,” Hoover said. “We do a lot of outreach to help people understand that there are five months of wild Alaska salmon coming out of Cordova, especially with cohos into the fall,” she explained, adding that they also are broadening their salmon messages to build more awareness and appeal for the entire Prince William Sound fishery. Alaska’s total salmon catch for 2017 is pegged at 204 million fish, roughly 90 million more than were taken last year. The breakdown for the five species calls for a sockeye salmon harvest of nearly 41 million, a decrease of 12 million reds from last year. Coho catches should increase slightly to nearly 5 million; for chums, a catch of nearly 17 million is an increase of more than one million fish. The projected statewide take of pink salmon is 142 million, an increase of nearly 103 million humpies over last year. For Chinook salmon, the forecast calls for a catch of 80,000 in regions outside of Southeast Alaska, where the harvest is determined by a treaty with Canada. The all-gear Chinook catch for Southeast in 2017 is 209,700 fish, 146,000 fewer than last year. Alaska salmon fishermen hoping for relief funds from last year’s failed pink salmon fishery appear to be out of luck. The pink fishery, the worst in over 40 years, was officially declared a failure in January by former US Commerce Secretary Penny Pritzker, setting the stage for fishermen and other stakeholders at Kodiak, Prince William Sound and Lower Cook Inlet to seek disaster assistance from the federal government. The monetary assistance, however, was not included in last week’s huge $1 trillion-plus spending bill approved by Congress to keep the government operating through September. The bill also did not include disaster relief funds for West Coast salmon and crab fisheries. Congress could choose to appropriate the money separately, but chances of that happening are slim. For 20 years, the movement to use the “power of the purse” to promote and reward sustainably managed fisheries has set a global standard for seafood purchases.
Today, it’s nearly impossible for a company to do business without being officially certified as a source for earth-friendly seafood. This month another global effort was launched that uses the same strategy to promote new standards for the use of antibiotics in seafood and other animal products. The Michigan-based National Sanitation Foundation International has tested food products for health and safety since 1944. Its new Raised Without Antibiotics certification program will provide independent verification of claims made on food packages that they are antibiotic-free, including seafood, meats, dairy, eggs, even leather and certain supplements. The campaign follows an NSF survey last year that showed nearly 60 percent of consumers prefer products that are free from antibiotics. That’s backed up by the NPD Group, a market tracker that operates in 20 countries, interviews 12 million consumers each year and monitors purchase data from more than 165,000 stores. The group said that consumers are demanding “free from” foods with fewer additives, especially antibiotics, growth hormones and tweaked genes, and they are reading labels like never before. Antibiotics are widely used in the farmed fish industry, most notably in Chile (the largest exporter of farmed salmon to the US), which has come under fire for using more than one million pounds of antibiotics to ward off a fish virus, according to the National Service of Fisheries and Aquaculture. What’s worse, Intrafish reported that 50 Chilean salmon companies refused to disclose the amount and type of antibiotics they used, saying “such disclosure would threaten their business competitiveness.” In contrast, Norway, the world’s biggest farmed salmon producer, uses about 2,100 pounds of antibiotics, mostly to combat fish lice. Sea lice are the farmed Atlantic salmon industry’s most expensive problem, costing around $550 million in lost output each year. “‘Free from’ food labeling requirements and guidelines generally apply to products raised in a controlled environment,” said Jeremy Woodrow, Communications Director for the Alaska Seafood Marketing Institute. “Salmon in Alaska hatcheries may also receive antibiotics on occasion, but there have been no detectable levels of antibiotics found by the time the salmon are harvested in the ocean.” NSF International is now seeking companies to sign on to its Raised Without Antibiotics campaign, saying: “Without an independent protocol and certification process, customers have not been able to verify claims made by marketers – until now.” Gulf of Alaska groundfish are at the forefront for “innovation” grants from the National Fish and Wildlife Foundation’s Fisheries Innovation Fund. The Fund is a partnership with the National Oceanic and Atmospheric Administration and the Walton Family Foundation. The grants, totaling $650,000, aim to support projects that help sustain fishermen and coastal communities, promote safety, and support fishery conservation and management. While the Gulf is selected as a target area, the Innovation Fund will consider proposals in all US fisheries, both commercial and recreational. Successful projects should include approaches that promote full utilization of catches and minimize bycatch, develop markets, research and training, and “improve the quality, quantity and timeliness of fisheries-dependent data used for science, management and fishermen’s business purposes,” according to an NFWF statement. Alaska groups and communities have obtained several Innovation grants in recent years.
They include Sitka’s Fisheries Trust Network that aims to acquire and keep catch quotas local, the Alaska Marine Conservation Council’s “Every Halibut Counts” project that promotes gentle release methods, and the Southeast Alaska Guides Organization for its sport sector catch share project. The Kenai Peninsula Fishermen’s Association and the Alaska Longline Fishermen’s Association also have received grants to test electronic monitoring systems. Pre-proposals are due May 25 and invitations for full proposals will be sent on June 29. Full proposals are due on August 31 and the NFWF will announce award winners by November 17. More information and application materials are available from the NFWF.


Pfeiffer L.,National Oceanic and Atmospheric Administration | Lin C.-Y.C.,University of California at Davis
Journal of Environmental Economics and Management | Year: 2014

Encouraging the use of more efficient irrigation technology is often viewed as an effective, politically feasible method to reduce the consumptive use of water for agricultural production. Despite its pervasive recommendation, it is not clear that increasing irrigation efficiency will lead to water conservation in practice. In this paper, we evaluate the effect of a widespread conversion from traditional center pivot irrigation systems to higher efficiency dropped-nozzle center pivot systems that has occurred in western Kansas. State and national cost-share programs subsidized the conversion. On average, the intended reduction in groundwater use did not occur; the shift to more efficient irrigation technology has increased groundwater extraction, in part due to shifting crop patterns. © 2013 Elsevier Inc.
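To illustrate the rebound mechanism the abstract describes, here is a stylized back-of-the-envelope sketch in Python. The acreages and pumping rates are invented for illustration and are not the paper's Kansas estimates.

```python
# Stylized rebound arithmetic for irrigation efficiency
# (illustrative numbers only; not the paper's estimates).

# Before conversion: traditional center pivots on 100 acres,
# pumping 1.5 acre-feet of groundwater per acre.
acres_before, pumping_per_acre_before = 100, 1.5
extraction_before = acres_before * pumping_per_acre_before   # 150 acre-feet

# After conversion: dropped nozzles cut per-acre pumping to 1.3
# acre-feet, but the efficiency gain makes irrigation more profitable,
# so irrigated acreage expands and crops shift toward thirstier ones.
acres_after, pumping_per_acre_after = 130, 1.3
extraction_after = acres_after * pumping_per_acre_after      # 169 acre-feet

# Total extraction rises even though each acre uses less water.
print(f"before: {extraction_before:.0f} ac-ft, after: {extraction_after:.0f} ac-ft")
```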


Waples R.S.,National Oceanic and Atmospheric Administration | Luikart G.,University of Montana
Genetics | Year: 2014

Use of single-sample genetic methods to estimate effective population size has skyrocketed in recent years. Although the underlying models assume discrete generations, they are widely applied to age-structured species. We simulated genetic data for 21 iteroparous animal and plant species to evaluate two untested hypotheses regarding performance of the single-sample method based on linkage disequilibrium (LD): (1) estimates based on single-cohort samples reflect the effective number of breeders in one reproductive cycle (Nb), and (2) mixed-age samples reflect the effective size per generation (Ne). We calculated true Ne and Nb, using the model species' vital rates, and verified these with individual-based simulations. We show that single-cohort samples should be equally influenced by Nb and Ne and confirm this with simulated results: N̂b was a linear (r² = 0.98) function of the harmonic mean of Ne and Nb. We provide a quantitative bias correction for raw N̂b based on the ratio Nb/Ne, which can be estimated from two or three simple life history traits. Bias-adjusted estimates were within 5% of true Nb for all 21 study species and proved robust when challenged with new data. Mixed-age adult samples produced downwardly biased estimates in all species, which we attribute to a two-locus Wahlund effect (mixture LD) caused by combining parents from different cohorts in a single sample. Results from this study will facilitate interpretation of rapidly accumulating genetic estimates in terms of both Ne (which influences long-term evolutionary processes) and Nb (which is more important for understanding eco-evolutionary dynamics and mating systems). © 2014 by the Genetics Society of America.
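A minimal numerical sketch of the harmonic-mean relationship reported above: the values of Ne and Nb below are invented, and the paper's fitted bias-correction coefficients are not reproduced here.

```python
def harmonic_mean(x: float, y: float) -> float:
    """Harmonic mean of two positive numbers."""
    return 2.0 / (1.0 / x + 1.0 / y)

# Illustrative values only (not one of the 21 study species):
# effective size per generation Ne = 500, effective number of
# breeders per reproductive cycle Nb = 200.
ne, nb = 500.0, 200.0

# Per the result above, a raw single-cohort LD estimate is expected to
# track the harmonic mean of Ne and Nb (~286 here), so without a
# bias adjustment it overstates Nb and understates Ne.
raw_ld_estimate = harmonic_mean(ne, nb)
print(f"harmonic mean of Ne and Nb: {raw_ld_estimate:.1f}")
```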


Baskett M.L.,University of California at Davis | Waples R.S.,National Oceanic and Atmospheric Administration
Conservation Biology | Year: 2013

Artificial propagation strategies often incur selection in captivity that leads to traits that are maladaptive in the wild. For propagation programs focused on production rather than demographic contribution to wild populations, effects on wild populations can occur through unintentional escapement or the need to release individuals into natural environments for part of their life cycle. In this case, 2 alternative management strategies might reduce unintended fitness consequences on natural populations: (1) reduce selection in captivity as much as possible to reduce fitness load (keep them similar), or (2) breed a separate population to reduce captive-wild interactions as much as possible (make them different). We quantitatively evaluate these 2 strategies with a coupled demographic-genetic model based on Pacific salmon hatcheries that incorporates a variety of relevant processes and dynamics: selection in the hatchery relative to the wild, assortative mating based on the trait under selection, and different life cycle arrangements in terms of hatchery release, density dependence, natural selection, and reproduction. Model results indicate that, if natural selection only occurs between reproduction and captive release, the similar strategy performs better. However, if natural selection occurs between captive release and reproduction, the different and similar strategies present viable alternatives to reducing unintended fitness consequences because of the greater opportunity to purge maladaptive individuals. In this case, the appropriate approach depends on the feasibility of each strategy and the demographic goal (e.g., increasing natural abundance, or ensuring that a high proportion of natural spawners are naturally produced). In addition, the fitness effects of hatchery release are much greater if hatchery release occurs before (vs. after) density-dependent interactions. Given the logistical challenges to achieving both the similar and different strategies, evaluation of not just the preferred strategy but also the consequences of failing to achieve the desired target is critical. © 2012 Society for Conservation Biology.
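The trade-off can be made concrete with a toy calculation. The sketch below is not the authors' coupled demographic-genetic model; it is a one-trait caricature with Gaussian stabilizing selection and invented optima, showing why the "similar" strategy limits the fitness load escapees carry into the wild while the "different" strategy must instead rely on limiting captive-wild interactions.

```python
import math

def gaussian_fitness(z: float, optimum: float, width: float = 10.0) -> float:
    """Relative fitness of trait value z under stabilizing selection."""
    return math.exp(-((z - optimum) ** 2) / (2.0 * width ** 2))

THETA_WILD, THETA_CAPTIVE = 0.0, 30.0   # invented trait optima

# "Similar" strategy: captive selection is minimized, so the hatchery
# trait stays near the wild optimum and escapees fit the wild regime.
z_similar = 5.0
# "Different" strategy: the captive line adapts toward its own optimum;
# interbreeding with wild fish would then be costly, so managers try to
# keep captive-wild interactions to a minimum instead.
z_different = 25.0

for label, z in [("similar", z_similar), ("different", z_different)]:
    print(f"{label}: wild fitness = {gaussian_fitness(z, THETA_WILD):.2f}")
# similar ~0.88, different ~0.04 under these invented numbers
```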


Lengaigne M.,Institut de Recherche pour le Développement, Laboratoire d'Océanographie et de Climatologie | Vecchi G.A.,National Oceanic and Atmospheric Administration
Climate Dynamics | Year: 2010

As in the observed record, the termination of El Niño in the coupled IPCC-AR4 climate models involves meridional processes tied to the seasonal cycle. These meridional processes both precondition the termination of El Niño events in general and lead to a peculiar termination of extreme El Niño events (such as those of 1982-83 and 1997-98), in which the eastern equatorial Pacific warm sea surface temperature anomalies (SSTA) persist well into boreal spring/early-summer. The mechanisms controlling the peculiar termination of extreme El Niño events, which involve the development of an equatorially centred intertropical convergence zone, are consistent across the four models that exhibit extreme El Niños and the observational record, suggesting that this peculiar termination represents a general feature of extreme El Niños. Further, due to their unusual termination, extreme El Niños exhibit an apparent eastward propagation of their SSTA, which can strongly influence estimates of the apparent propagation of ENSO over multi-decadal periods. Interpreting these propagation changes as evidence of changes in the underlying dynamical feedbacks behind El Niño could therefore be misleading, given the strong influence of a single extreme event. © 2009 Springer-Verlag.


Crozier L.G.,National Oceanic and Atmospheric Administration | Hutchings J.A.,Dalhousie University | Hutchings J.A.,University of Oslo
Evolutionary Applications | Year: 2014

The physical and ecological 'fingerprints' of anthropogenic climate change over the past century are now well documented in many environments and taxa. We reviewed the evidence for phenotypic responses to recent climate change in fish. Changes in the timing of migration and reproduction, age at maturity, age at juvenile migration, growth, survival and fecundity were associated primarily with changes in temperature. Although these traits can evolve rapidly, only two studies attributed phenotypic changes formally to evolutionary mechanisms. The correlation-based methods most frequently employed point largely to 'fine-grained' population responses to environmental variability (i.e. rapid phenotypic changes relative to generation time), consistent with plastic mechanisms. Ultimately, many species will likely adapt to long-term warming trends overlaid on natural climate oscillations. Considering the strong plasticity in all traits studied, we recommend development and expanded use of methods capable of detecting evolutionary change, such as the long term study of selection coefficients and temporal shifts in reaction norms, and increased attention to forecasting adaptive change in response to the synergistic interactions of the multiple selection pressures likely to be associated with climate change. © 2013 The Authors. Evolutionary Applications published by John Wiley & Sons Ltd.


Grant
Agency: GTR | Branch: NERC | Program: | Phase: Training Grant | Award Amount: 73.77K | Year: 2011

The rate of increase in atmospheric carbon dioxide (CO2) since the industrial revolution has been unprecedented. Over the last two decades, only half of the CO2 released by anthropogenic activities has remained in the atmosphere and about one third has been taken up by the oceans. Knowledge about the impact of recent rises in atmospheric CO2 on our oceans is growing through recently funded national and international programmes on Ocean Acidification (OA) but is by no means complete. The proposed project will provide insights into the impact of such changes during the past 25 years on marine calcifying plankton (foraminifera) using sediment trap time series samples available from the Ocean Flux Program (OFP, 1984-2008) off Bermuda (31° 50' N, 64° 10' W). The results will be compared with seafloor surface sediments at nearby locations collected during the Challenger (1872-76) and Discovery (1901-04) expeditions. These comparisons will provide a pre-industrial baseline with which to evaluate the impact of recent OA. In addition, the student will have an opportunity to carry out additional sampling of time-integrated seafloor sediment at the same location for comparative study during a planned internship. The time series sediment trap and seafloor sediment samples will collectively form a unique sample set with which to quantify the ongoing impact of anthropogenic CO2 on surface ocean carbonate chemistry and plankton calcification. The project will focus on answering two major questions: (1) How has the recent rise in anthropogenic CO2 impacted bio-calcification (e.g., in this case planktonic foraminifera)? This question will be addressed by studying changes in size-normalized shell mass and dimensions (diameter, area and maximum and minimum length of surface-dwelling species) along with species composition and size abundance patterns within a species. These shell calcification parameters from a seasonally resolved time series will be compared against depth-stratified (0-500 m) seawater carbonate chemistry datasets at the sampling location after reconstructing species habitat depths (using Mg/Ca & oxygen isotopes to calculate calcification temperatures). (2) How has seawater carbonate ion concentration ([CO32-]) changed as a consequence of OA? A better understanding of this issue will be gained using existing proxies of [CO32-] such as the boron/calcium ratio of foraminiferal shells and shell mass. The project will generate annually resolved records of [CO32-] using the unique sample set (time series and seafloor surface sediments). The fortnightly samples from an annual seasonal cycle of the time series samples, combined with measured depth-stratified carbonate ion concentration (calculated from measured total alkalinity and dissolved inorganic carbon) and hydrography data (from the Bermuda Atlantic Time Series Study (31° 43' N, 64° 10' W) and Hydrostation S (32° 50' N, 64° 10' W)), will provide additional empirical calibration for the foraminiferal B/Ca proxy. We will exploit our refined calibration of B/Ca vs. [CO32-] to infer pre- and post-industrial changes in [CO32-] and, consequently, the scale of recent OA. The sediment trap samples will be obtained through project partner (Dr. Conte) at Bermuda Institute of Ocean Sciences (BIOS)/Marine Biological Laboratory (MBL), Woods Hole, USA and the sea floor surface sediment samples from the Challenger and Discovery expeditions will be obtained through collaborating institution (Natural History Museum, London (support letter available)). The student will benefit from training in shipboard sample collection off Bermuda through an internship, training at University of Cambridge (UK) for trace element/Ca work and co-supervision from Professor Jelle Bijma at the Alfred Wegener Institute (Germany), core-member of the European Project on OCean Acidification (EPOCA) and Biological Impacts of Ocean ACIDification (BIOACID) programmes.
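For readers unfamiliar with how carbonate ion concentration falls out of alkalinity and DIC measurements, the following is a minimal sketch using the textbook approximation rather than the full carbonate-system solver a project like this would actually use; the input values are typical subtropical surface numbers, not BATS data.

```python
def carbonate_ion_approx(carb_alk: float, dic: float) -> float:
    """
    Approximate surface-seawater carbonate ion concentration (umol/kg).

    With DIC = [CO2*] + [HCO3-] + [CO3^2-] and carbonate alkalinity
    CA = [HCO3-] + 2[CO3^2-], subtracting gives
    CA - DIC = [CO3^2-] - [CO2*]; neglecting the small surface [CO2*]
    term (~10 umol/kg) leaves [CO3^2-] ~= CA - DIC.
    """
    return carb_alk - dic

# Typical subtropical surface values (illustrative only):
# carbonate alkalinity ~2300 umol/kg, DIC ~2050 umol/kg.
print(carbonate_ion_approx(2300.0, 2050.0))  # ~250 umol/kg
```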


News Article | December 15, 2016
Site: motherboard.vice.com

In a scathing speech Wednesday in front of some of the most important climate scientists in the world, California Gov. Jerry Brown vowed to fight Donald Trump's anti-environmental policies every step of the way. One audacious promise particularly stood out: Brown said that if Trump turns off NASA's climate-monitoring satellites, the state "is going to launch its own damn satellites." Trump's advisors have indeed said he will crack down on "politicized science," and Trump campaign advisor Bob Walker noted that this would include NASA's Earth Sciences Division, which operates several Earth-monitoring satellites. No one knows yet if Trump will actually have NASA turn off satellites that are much more expensive to make and launch than they are to operate, but for the sake of preparedness, I decided to look into whether or not California could actually keep Brown's promise. I spoke to several space lawyers in an attempt to suss out how, logistically and legally, a California Space Agency would work. The legal issues will of course depend on the specifics of California's program—if the state pursued a public-private partnership, it could simply buy data from a commercial satellite company that secures launch permits from the federal government. But let's presume for a moment that California wants to start its own honest-to-goodness space agency, or, at the very least, wants to handle the launch and monitoring of its satellites. There are two main questions: Would such a plan be feasible? And can the state legally do so? Though no state has a robust satellite operations program, several states do have space authorities that are mainly tasked with incentivizing and promoting commercial space activity within their respective states. Space Florida, the Oklahoma Space Industry Development Authority, and the New Mexico Spaceport Authority are all currently operational. The California Space Authority operated between 1996 and 2011, when it was shut down. The idea of a state launching its own satellites today is much more plausible than it was back in 1987, when Joanne Gabrynowicz, the editor-in-chief emerita of the Journal of Space Law, was approached by the staff of longtime New Jersey Sen. Bill Bradley. Bradley's plan was for the state of New Jersey to launch its own satellite to monitor the Jersey coast, which had to close its beaches when the "syringe tide" washed a significant amount of medical waste, syringes, and human body parts ashore. Bradley thought that a satellite would help catch illegal dumping off the coast. "In 1987, when I spoke to Sen. Bradley's legislative aides, a state having its own satellite was unrealistic," Gabrynowicz told me. "In 2016, there is so much change in technology that it is more realistic to raise the question of what's possible, however it would still require considerable scientific, economic, and legal research." In fact, Brown himself proposed an environmental satellite program for California in 1978, which he said Wednesday helped earn him the moniker "Governor Moonbeam." Past fantasies aside, the price of cubesats and other small satellites with powerful sensors has come down significantly in the last decade, though land-sensing satellites have thus far been quite expensive. Landsat 8, launched in 2013 as a joint project between NASA and the US Geological Survey, has a program cost of $850 million.
Experts believe now that the same satellite could be built and launched for $650 million. Depending on the capabilities California would want, it could launch a satellite for much cheaper. Startup Skybox Imaging, which was rebranded as Terra Bella after Google purchased it, launched its first imaging cubesat for less than $50 million. For context, California's government spends in the neighborhood of $100 billion per year. It is currently $400 billion in debt, but has a balanced budget and has begun the slow process of paying off those debts. There are many models for how a California Space Agency could potentially work, and if the state were to seriously pursue this goal, it could partner with or offer tax incentives to many aerospace companies that are headquartered or have large presences in the state. SpaceX, Northrop Grumman, Aerojet Rocketdyne, Lockheed Martin, ViaSat, Boeing, and Virgin Galactic all have operations in southern California, and the state has both the Mojave Air and Space Port and the California Spaceport—which is located on Vandenberg Air Force Base but is commercially operated—that it could use for launches. Finally, it's worth noting that some states do have satellites in space, in a way. Several public universities, which fall under the purview of state governments, have launched research satellites over the years. "As far as I'm aware, there's no state that has a significant satellite program. It sounds like Brown has larger ideas than what the universities have done," Matthew Schaefer, a space law professor at the University of Nebraska, Lincoln, told me. "It's going to be a question of bang for the buck. What capabilities are you looking for and how much would that cost? It's also a question of looking at what's out there—they may be able to get the data they want from a commercial satellite that's already in orbit." Motherboard reached representatives for Gov. Brown, who said they do not currently have anything to add to Brown's remarks. There are no obvious legal roadblocks that would instantly doom a California Space Agency to failure, according to four space law experts I spoke to. That said, California absolutely could not start its own space agency without some federal cooperation. The rationale behind killing NASA's Earth-monitoring satellites would seem to be economic in nature, and the team Trump has surrounded himself with are all champions of states' rights. So if California wants to spend its own money on a space program, maybe his agencies will tell the state to go for it. "I see no impediment," Rosanna Satler, a space lawyer with the Posternak Blankstein & Lund law firm in Boston, told me in an email. "California can launch its own remote sensing satellites just like the private sector can launch commercial satellites. I do not believe that any state as an entity or political subdivision has actually launched satellites on its own, but I see no reason why California cannot do so." No lawyer I spoke with was aware of any specific federal law that would supersede a state resolution from California to create a space agency. Internationally speaking, California itself is what's considered a "non-governmental entity," meaning the state is considered to be part of the United States (which, barring a Calexit, is exactly what it is).
"The US is responsible and liable for any relevant categories of US-based space activities, which would include hypothetical ones of any administrative unit within the US," Frans von der Dunk, a space law professor at the University of Nebraska-Lincoln, told me in an email. "At least formally, space activities are only legitimate under international law if they are condoned by [the United States]." The federal government gives permission to other non-visible entities all the time—SpaceX and Blue Origin fall into the same category. To launch a rocket, California would need permission from the Federal Aviation Administration, which handles all launch permits. It would also need to secure communications bandwidth from the Federal Communications Commission. Depending on the sensors used by the state's hypothetical satellites, it may also need permission from the National Oceanic and Atmospheric Administration's licensing programs for remote sensing satellites. The FAA has also given itself the authority to license private and state-operated spaceports that are not run by the federal government, suggesting that the agency believes the Commercial Space Launch Act of 1984 allows it to oversee space launch facilities in the United States. "California could do it," Schaefer said. "They would be the sixth or seventh largest nation in the world, they have a large budget and are a huge global actor."


News Article | December 18, 2016
Site: www.techtimes.com

Will the U.S. winter be harsh, warm or maybe even rainy? Because of La Niña and a few other factors, parts of the United States may be in for a colder than normal winter. However, several parts of the country may experience warmer than usual temperatures and even an above average amount of rain. A colder than normal winter is expected across the northern parts of the country, especially in the Pacific Northwest, the western Great Lakes and parts of Alaska. It is because of a weak La Niña weather pattern that there is more high pressure in the atmosphere in the North Pacific, disrupting the west-to-east flow of air and thereby sending cold air into the U.S. In contrast to the below average temperatures projected in the northern parts of the country, warmer than normal temperatures are expected on the East Coast as well as in the southern parts of the country and western and northern Alaska. It should still be noted that despite warmer than normal temperatures, surges of cold weather and significant winter storms are still probable in these areas. The Climate Prediction Center expects that this year's winter may also bring an unusual amount of rain to areas in the Pacific Northwest, the northern area of the Great Basin, the Rockies and the High Plains. Other places such as Ohio and the Tennessee Valley are also expected to experience a higher amount of rain. However, parts of California and areas more strongly affected by the drought are expected to experience less than the average amount of rain. Some parts of the country that are affected by drought are fortunately expected to experience a little rain this winter. However, despite improvements in drought-ridden regions such as Oregon and Washington state, the projected amount of rain is still not expected to have any long-term effects in areas that are more severely hit by the drought, such as central and southern California. "The Historical Probability of a White Christmas" by the National Oceanic and Atmospheric Administration (NOAA) maps the climatological probability of at least 1 inch of snow being on the ground on Dec. 25. Most of Idaho, Upstate New York, the Rockies and Colorado are among the places the NOAA predicts to have a high probability of seeing snow on Christmas Day. The projection is based on averages of climatological measurements from the 1981-2010 Climate Normals.


News Article | February 23, 2017
Site: hosted2.ap.org

(AP) — Global warming is already shrinking the Colorado River, the most important waterway in the American Southwest, and it could reduce the flow by more than a third by the end of the century, two scientists say. The river's volume has dropped more than 19 percent during a drought gripping the region since 2000, and a shortage of rain and snow can account for only about two-thirds of that decline, according to hydrology researchers Brad Udall of Colorado State University and Jonathan Overpeck of the University of Arizona. In a study published last week in the journal Water Resources Research, they concluded that the rest of the decline is due to a warming atmosphere induced by climate change, which is drawing more moisture out of the Colorado River Basin's waterways, snowbanks, plants and soil by evaporation and other means. Their projections could signal big problems for cities and farmers across the 246,000-square-mile basin, which spans parts of seven states and Mexico. The river supplies water to about 40 million people and 6,300 square miles of farmland. "Fifteen years into the 21st century, the emerging reality is that climate change is already depleting the Colorado River water supplies at the upper end of the range suggested by previously published projections," the researchers wrote. "Record-setting temperatures are an important and underappreciated component of the flow reductions now being observed." The Colorado River and its two major reservoirs, Lake Mead and Lake Powell, are already overtaxed. Water storage at Mead was at 42 percent of capacity Wednesday, and Powell was at 46 percent. Water managers have said that Mead could drop low enough to trigger cuts next year in water deliveries to Arizona, which would be the first state affected by shortages under the multistate agreements and rules governing the system. But heavy snow in the West this winter may keep the cuts at bay. Snowpack in the Wyoming and Colorado mountains that provide much of the Colorado River's water ranged from 120 to 216 percent of normal Thursday. For their study, Udall and Overpeck analyzed temperature, precipitation and water volume in the basin from 2000 to 2014 and compared it with historical data, including a 1953-1967 drought. Temperature and precipitation records date to 1896 and river flow records to 1906. Temperatures in the 2000-2014 period were a record 1.6 degrees Fahrenheit above the historical average, while precipitation was about 4.6 percent below, they said. Using existing climate models, the researchers said that such a decline in precipitation should have produced a reduction of about 11.4 percent in the river flow, not the 19.3 percent that occurred. They concluded that the rest was due to higher temperatures, which increased evaporation from water and soil, sucked more moisture from snow and sent more water from plant leaves into the atmosphere. Martin Hoerling, a meteorologist at the National Oceanic and Atmospheric Administration who was not involved in the study, questioned whether the temperature rise from 2000 to 2014 was entirely due to global warming. Some was likely caused by drought, he said. Udall said warming caused by climate change in this century will dwarf any warming caused by drought. He noted that during the 1953-1967 drought, the temperature was less than a half degree warmer than the historical average, compared with 1.6 degrees during the 2000-2014 period.
Udall said climate scientists can predict temperatures with more certainty than they can precipitation, so studying their individual effects on river flow can help water managers. Rain and snowfall in the Colorado River Basin would have to increase 14 percent over the historical average through the rest of the century to offset the effect of rising temperatures, he said. "We can't say with any certainty that precipitation is going to increase and come to our rescue," Udall said.
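The attribution arithmetic implied by those figures is easy to reproduce. The sketch below simply recombines the numbers quoted in this article; it is not the study's own hydrological analysis.

```python
# Back-of-the-envelope attribution from the figures quoted above.
total_decline = 19.3   # % drop in Colorado River flow, 2000-2014
precip_driven = 11.4   # % drop expected from the precipitation deficit alone
warming_f = 1.6        # deg F above the historical average

temp_driven = total_decline - precip_driven    # ~7.9 percentage points
share_precip = precip_driven / total_decline   # ~0.59, the "about
                                               # two-thirds" cited above
sensitivity = temp_driven / warming_f          # ~4.9% of flow per deg F

print(f"warming-driven: {temp_driven:.1f} points "
      f"({1 - share_precip:.0%} of the decline), "
      f"~{sensitivity:.1f}% of flow per deg F")
```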


News Article | December 8, 2016
Site: phys.org

Things took another turn on Monday (Dec. 5th) as Trump met with former Vice President and environmental activist Al Gore to discuss his administration's policy. This meeting was the latest in a series of gestures that suggest that the President-elect might be softening his stances on the environment. However, there is little reason to suspect that this meeting could mean any changes in policy. The meeting was apparently arranged by the President-elect's daughter, Ivanka Trump, to coincide with the former VP's attendance of a conference in New York on Monday. Said conference was the 24-hour live broadcast titled "24 Hours of Reality", an event being put on by the Climate Reality Project – a non-profit organization founded by Gore to educate the public on climate change and policy. The meeting lasted 90 minutes, after which Gore spoke to reporters about the discussion he and the President-elect had. As he was quoted as saying by The Washington Post: "I had a lengthy and very productive session with the president-elect. It was a sincere search for areas of common ground. I had a meeting beforehand with Ivanka Trump. The bulk of the time was with the president-elect, Donald Trump. I found it an extremely interesting conversation, and to be continued, and I'm just going to leave it at that." While this meeting has led to speculation that Trump's administration might be softening its stance on environmental issues, many are unconvinced. Given his past statements – which include the claim that climate change is a "hoax invented by the Chinese" – and his more recent picks for his cabinet, there are those who continue to express concern for the future of NASA programs that are centered on Earth sciences and the environment. For instance, after weeks of remaining mute on the subject of NASA's future, the Trump campaign announced that it had appointed Bob Walker – a former Pennsylvania Congressman and the chair of the House Science Committee from 1995 to 1997 – as a space policy adviser. A fierce conservative, Walker was recently quoted as saying that NASA should cease its climate research and focus solely on space exploration. "My guess is that it would be difficult to stop all ongoing Nasa programs but future programs should definitely be placed with other agencies," he said in an interview with the Guardian in late November. "I believe that climate research is necessary but it has been heavily politicized, which has undermined a lot of the work that researchers have been doing. Mr Trump's decisions will be based upon solid science, not politicized science." From statements such as these, plus things said during the campaign that emphasized NASA's important role in space exploration, the general consensus has been that a Trump administration will likely slash funding to NASA's Earth Science Directorate while leaving long-term exploration programs unaffected. According to David Titley, who recently wrote an op-ed piece for The Conversation, this would be a terrible mistake. Titley is a Professor of Meteorology at Pennsylvania State University and the founding director of their Center for Solutions to Weather and Climate Risk. In addition to being a Rear Admiral in the US Navy (retired), he was also the Chief Operating Officer of the National Oceanic and Atmospheric Administration from 2012–2013 and has been a Fellow of the American Meteorological Society since 2009.
As he noted in his piece, NASA's Earth science and Earth observation efforts are vital, and the shared missions they have with organizations like the NOAA have numerous benefits. As he explained: "There's a reason why space is called 'the ultimate high ground' and our country spends billions of dollars each year on space-based assets to support our national intelligence community. In addition to national security, NASA missions contribute vital information to many other users, including emergency managers and the Federal Emergency Management Agency (FEMA), farmers, fishermen and the aviation industry." In the past, NASA's Earth Science Directorate has contributed vital information on how rising temperatures could affect water tables and farmlands (such as the ongoing drought in California), and how changes in oceanic systems would affect fisheries. On top of that, FEMA has been working with NASA in recent years in order to develop a disaster-readiness program to address the fallout from a possible asteroid impact. This has included three tabletop exercises where the two agencies worked through asteroid impact scenarios and simulated how information would be exchanged between NASA scientists and FEMA emergency managers. As Melissa Weihenstroer – a Presidential Management Fellow in FEMA's Office of External Affairs who works with NASA's Planetary Defense Coordination Office – recently wrote about this inter-agency cooperation: "Since FEMA doesn't have direct experience with asteroids or their impacts, we've turned to some people who do: our partners at the National Aeronautics and Space Administration (NASA). While FEMA will be the agency in charge of the U.S. government efforts in preparing for and responding to any anticipated asteroid-related event here on Earth, NASA is responsible for finding, tracking, and characterizing potentially hazardous asteroids and comets while they are still in space." Whenever a transition occurs between one presidential administration and the next, there is always some level of concern about the impact it will have on federal organizations. However, when an administration is unclear about its policies, and has made statements to the effect that federal agencies should cease conducting certain types of research, NASA can be forgiven for getting a little nervous. In the coming years, it will be interesting to see how the budget environment changes for Earth science research. One can only hope that a Trump administration will not see fit to make sweeping cuts without first considering the potential consequences.


News Article | September 19, 2016
Site: www.huffingtonpost.com

When my daughter Lotus recently became interested in astronomy, and we began watching shows and documentaries about space and the universe, we discovered Neil deGrasse Tyson, the dynamic host of the National Geographic talk show "StarTalk." While Neil is a world-renowned and brilliant astrophysicist and author, and director of the prestigious Hayden Planetarium, he is also an immensely likable, jolly, and warm personality - one who exudes joie de vivre, as well as a contagious, passionate energy for expanding frontiers of science, human thought and innovation. Which is perhaps why, during my interview with him at his office at The Museum of Natural History, wearing his trademark tie of planets, he took such personal interest in my very outdated, essentially obsolete yet beloved RCA microcassette recorder, picking it up as if it were a fossil from prehistoric times asking incredulously, "What is this?!" When I explained that I used it as a back-up, a relic from my days 20 years ago as a magazine reporter, he couldn't control himself. He proceeded to take over my iPhone, setting the interview up to record on the Voice Memo application, beginning the recording with his trademark booming radio announcer voice: "And now, I am Neil deGrasse Tyson for HuffPo interview..." before handing my iPhone back to me, all ready to go. That anecdote is a sampling of his humor and how he sees his role, as a teacher and provocateur, educating people in entertaining ways about what they should know -- about the universe, about science, and about ourselves and the world around us. He has done this through a variety of mediums: as the host of StarTalk, the popular podcast and National Geographic Channel talk show, as well as the miniseries Cosmos: A Spacetime Odyssey, as the author of 10 books, including the best-selling Death by Black Hole: And Other Cosmic Quandaries, as well as cultivating a social media following with his thought-provoking and informative tweets. He is also the recipient of the NASA Distinguished Public Service Medal, the highest award given by NASA to a nongovernment citizen. StarTalk, his two-time Emmy-nominated late-night talk show, is an outgrowth of his popular radio and podcast series of the same name and seeks to bridge the world of pop culture and science through intimate interviews and lively discussions with an eclectic roster of guests hewn from pop culture, politics or news discussing how science and technology have affected their lives and careers. The third season of StarTalk will premiere on Monday, September 19th, with actress and comedian Whoopi Goldberg, followed by a diverse lineup of other high profile guests this season including astronaut Buzz Aldrin, actress and neuroscientist Mayim Bialik, U.S. Secretary of Defense Ash Carter, comedian Jay Leno, Olympic Gold Medalist Hope Solo, actor Ben Stiller and more. Natgeo Books has also just released an illustrated companion book to the series, STARTALK: Everything You Ever Need to Know About Space Travel, Sci-Fi, the Human Race, the Universe, and Beyond which curates the best of "StarTalk" and dives deeper into some of the most intriguing discussions from the show.

Neil deGrasse Tyson: It's a perfect question, because -- when we think of science, we think of this subject that's taught in a classroom, and you're either good at it or you're not, or you're interested in it, or you're not.
And you can just walk by it and say, "Oh, I don't like science but I like history," and you go learn history, as though science is this compartmentalized topic that you can climb over, walk around, dig under, or ignore. And it occurred to me that, "No, of course, science is not that. Science is everywhere." And it especially touches pop culture. So if we create a radio show that pivots on a pop culture foundation, then we can -- think of it as a scaffold. It's a pop culture scaffold that walks into every episode of StarTalk, and we then clad it with science that fits that pop culture topic. I don't need to train you on that scaffold -- pop culture by definition means everybody knows what it is! I have a guest, and you know who that guest is because that person is a famous politician, or singer, performer, artist, whatever. And so I already have you halfway. It's the rest of the way that is the celebration of science on the program. So I think it's just a more honest account of what role science plays in our lives. The popularity of StarTalk, I think, is evidence of how well that resonates with people.

NDT: It changes everything. You're absolutely right. And it's a cosmic perspective. The embarrassing part of this is that I and every one of my colleagues carry this perspective with us all the time. And what's embarrassing about that is we sometimes take that for granted, and we have to say, "Oh, my gosh, these other people I'm in this cocktail party with do not live with this awareness. So let me put in a little extra effort to alert you of this, of what's going on in the universe, to put our existence in a context that will either have you re-balance what you think may be important in your life or rethink how you're allocating your energies in the service of yourself versus the environment versus the planet as a whole." That's what a cosmic perspective does. So yes, it's extremely important. Let me tell you how important it is. I can assert without hesitation that the modern environmental movement where we're caring for earth was born while we were landing on the moon, 1968, '69, and '70. 1968 was our first mission to the moon, Apollo 8. People don't think about it much, because they didn't land. They just went, and orbited and came back. Apollo 9, 10, and 11 would take us into 1969. You go to the moon, and what happened? We looked back, took a picture of earth, and discovered earth for the first time. There it was adrift in space, the dark void of space. And we saw it not as -- it's hard to even go back and remember this or think that it could have ever been this way -- but we saw earth not as earth appeared in social studies class with color coded countries, no. We saw earth as only space can reveal it to you. And there were oceans, land and clouds. So like I said, we go to the moon, and we discover earth for the first time. After that photo was published, Earthrise and then the Blue Marble -- in that interval of time, we changed as a culture. They say what is it worth to go into space? We changed as a culture. By the way, we had plenty of other priorities on our plate -- there were assassinations, and a cold war, and a hot war, and a civil rights movement, and campus unrest. Yet, we found the time to create the Environmental Protection Agency! That was 1970. Oh, my God. Why didn't we create it in 1960? No one was thinking about it! 1950? Nobody was thinking about it. 1980 could be too late! Environmental Protection Agency under a Republican president, I might add. What else happened?
No one had thought of the atmosphere of earth as part of earth much before that. Ask someone to draw earth before 1968. They'll just draw earth and the continents - no clouds, no atmosphere. So this juxtaposition of ocean, land, and atmosphere became administratively codified in the founding of the National Oceanic and Atmospheric Administration. NOAA was founded in 1970. Comprehensive Clean Air Act, Clean Water Act, Protection of Species Act -- some of these existed in sort of skeletal forms earlier. They came of age in the versions that were ratified in 1970. What else happened in 1970? The first Earth Day. Why didn't we have an Earth Day in 1965? 1967? No, 1970. And then 1972, leaded gas would be banned. Catalytic converter would be introduced. So this is my long way to answer your question with a yes. Knowing space matters -- maybe more because it recalibrates what it is to know ourselves.

NDT: No, I would change [the school systems] - I mean I'm not there yet, but I plan to work on that as one of my projects: What can we do with the future of the school systems? Is the portfolio of topics the right ones? Should you get an entire class on a cosmic perspective? What kind of math should you be taught if at all? There are people who say they don't need their math. I think that's a mistake, because it implies that what you learn is what you then apply. Whereas, in the best scenario what you learn -- the act of having learned it -- has established a new form of wiring in your brain that empowers you to have thoughts you've never had before. And by studying math, and by struggling to do a math problem set, and with every new problem you solve, it's a new wiring of your brain. And if you only think of school as, "I want to learn this so that I could apply it here," you'll be ill equipped to navigate the rapidly changing frontier of what actually matters in society. What is it now -- the average time someone spends in a job is three years or five years before you move? Whereas the generation before us, they would spend 30 years working in the factory or working for their company. So you're a freshman in college saying, "What should I major in to have the best chance of having a good job?" Well, major in "how to think," and then you can apply that to any job that arises that at this moment is unforeseen, because you're not getting out for another four years and if you go to graduate school, it's another seven or eight.

NDT: Can I say something? I didn't know in my heart how this would turn out, because it's organizationally ambitious. And I'd see the PDFs when they'd come about, but then when the book arrived in my box two days ago, I said, "Oh, my gosh, this is fun." And I'd say that even if I had nothing to do with the project. I mean I'm pleased to hear that you like it, because it's -- any page you turn, "I want to know about that! Oh, my gosh, this is cool too!" And it is the soul of StarTalk.

NDT: So, philosophers historically like speaking of truth and beauty going together. And I don't think of beauty, because that value judges what you're looking at. Is the sunset beautiful but the underbelly of a tarantula not? Yet they're both part of nature. Is a squirrel cute but a rat ugly when they're both rodents? One just happens to have a furry tail, and the other doesn't. So to say something is beautiful is value judging nature, and I refuse to do that. I refuse to value judge what is beautiful and what is not. I accept it all as being part of nature.
And is a virus bad because it's trying to survive and it happens to give you a fatal disease? It's just being its own thing. We have the power now to render certain strains of mosquitoes extinct by disrupting their reproductive cycle. Should we? Well, the mosquitoes are bad for us. Bats eat them -- so I'm not going to say, "We're in charge. We don't like the mosquitoes. Fine. Get rid of them." I'm not going to say they're being bad mosquitoes. It's just we're making a decision, sort of a rational decision for our own survival as any creature would do in the interest of its own survival. The lion actually doesn't care if zebras go extinct. It's not thinking that. It just wants to eat the zebra. Okay. Now, if zebras do go extinct, then they'll eat the gazelles. They don't care, right? So for me, my awe comes not so much from the universe, but from our ignorance of the universe. If you add dark matter and dark energy together, which are two unknowns in our modern science -- we know they're out there, we measure their existence, but we don't know what causes it, we don't know what it's made of. If you add them together it's 95, 96 percent of all that drives the universe. I'm in awe of that. I'm in awe of how much we understand and have used to shape our civilization and to think of how much we have yet to understand and what that could mean for the future of civilization. For more about StarTalk, visit www.channel.nationalgeographic.com/startalk/


News Article | March 23, 2016
Site: www.nature.com

In a decision hailed by animal-rights groups, the US marine-park company SeaWorld Entertainment announced last week that it will no longer breed killer whales. But whether captivity harms the planet’s biggest predator is an area of active scientific debate. The latest arguments centre on two 2015 studies that drew dramatically different conclusions about the lifespans of captive killer whales (Orcinus orca), relative to those of wild populations. Although many factors affect well-being, an apparent discrepancy between the survival of captive and wild animals has long been cited by activists as evidence of the poor welfare of captive killer whales. One of the studies [1] is authored by a team largely made up of researchers at SeaWorld, which is headquartered in Orlando, Florida, and owns several animal parks that keep killer whales; the other [2] is by two former killer-whale trainers at the company who feature in the 2013 documentary film Blackfish, which is critical of SeaWorld. In letters published last week [3, 4], authors from each paper accuse the others of cherry-picking data to support positions on whether the animals should be captive — charges that each team in turn rejects. Although SeaWorld’s captive-killer-whale programme now has an expiration date, the company’s existing 23 animals will remain in parks for the rest of their lives, and its pregnant female Takara will give birth in captivity. Another 33 animals are held in other marine parks around the world. Robust studies of killer whales’ longevity are needed to improve the well-being of the remaining captive animals, says Douglas DeMaster, science director at the US National Oceanic and Atmospheric Administration’s Alaska Fisheries Science Center in Seattle, Washington. But the annals of research on captive killer whales are slim. Before 2015, the last major published study [5] dates to 1995, when US government scientists calculated that the annual survival rate of captive killer whales was several per cent lower than that of a wild population off the coast of Washington state called southern resident killer whales. In one of the 2015 studies [2], the former trainers — John Jett, a biologist at Stetson University in DeLand, Florida, and Jeffrey Ventre, a physician at Lakeview Campus Medical Facility in Yakima, Washington — attempted to measure how captive whales have fared since conditions were improved in the 1980s. They pooled data from between 1961 and 2013 on 201 captive killer whales in institutions around the world, including SeaWorld. They concluded that survival rates in captivity have improved since 1985, but that even the most recent survival rates are below those of animals in the wild. In the other 2015 study [1], researchers led by SeaWorld veterinary surgeon Todd Robeck came to a very different conclusion: that animals now in captivity at SeaWorld’s US parks live just as long as wild populations. The researchers looked only at animals held at those parks after 2000, and produced a survival rate that is higher than a rate that they calculated for southern resident killer whales — and equivalent to that of another wild population that lives in the waters off British Columbia, Canada. Now, each lead author has taken aim at the work of the other. In a letter published in Marine Mammal Science [3], Robeck and three colleagues note that Jett and Ventre included in their 2015 study stranded animals, which might have arrived in captivity in poor health, and newborns, which are at particularly high risk of death.
This pushes down the apparent survival rate of captive animals, say the researchers. In the same journal, Jett responds [4] to that critique, and accuses Robeck’s 2015 study of bias because, for instance, it compares captive whales to the southern resident population, which is endangered and exposed to pollutants and shipping traffic, and whose numbers have waxed and waned over the past four decades. Jett says that his and Ventre’s study was intended to take a wide look at captive-killer-whale survival, so they included as many data as possible. But Robeck stands by his critique. “They can include all the animals they want,” he says. “The conclusions they made were not based on the evidence they showed.” DeMaster notes that the comparison that Robeck and his colleagues made between captive killer whales and a disturbed wild population is not useful. He adds that it is also difficult to compare the approaches taken by the two teams, because they analyse different animals over different periods. On 8 March, a further group of researchers entered the fray, criticizing the 2015 Robeck study on another front. In the Journal of Mammalogy [6], the group charges that Robeck’s study implied that evidence for a long post-reproductive lifespan in killer whales is an artefact stemming from overestimated ages of adults in the early days of research on captive killer whales. “People started looking at killer whales in the early 1970s and they weren’t immediately experts,” says Robeck, who has also published a response [7] to that critique. The authors of the critique say that the evidence for the post-reproductive lifespan, a rare evolutionary adaptation otherwise seen only in humans and in pilot whales, is robust. “There are whales still alive now that were around in the 70s that haven’t had a calf,” says one of the authors, Darren Croft, a behavioural ecologist at the University of Exeter, UK. It will take more observation time to put firm numbers on the post-reproductive lifespan of killer whales, says Andrew Foote, an evolutionary ecologist at the University of Bern and another of the co-authors. The only way to resolve the dispute over the longevity of captive killer whales is for different teams to analyse the same data in the same manner, says DeMaster. Such studies could improve the well-being of captive animals by, for instance, identifying the facilities and husbandry practices that most benefit them.
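For context, annual survival rates in comparisons like these are commonly computed from pooled animal-years of observation. The sketch below uses that general method with hypothetical counts, not figures from either study, to show how the pooling choices at the heart of the dispute move the estimate.

```python
def annual_survival_rate(deaths: int, animal_years: float) -> float:
    """
    Crude annual survival rate: the fraction of pooled animal-years of
    observation that did not end in a death. Which animals and periods
    get pooled (stranded animals, newborns, pre- vs. post-2000 records)
    changes both inputs, and therefore the estimate.
    """
    return 1.0 - deaths / animal_years

# Hypothetical counts for illustration only (not from either study):
# 40 deaths over 1,000 pooled animal-years -> 0.960 annual survival.
print(annual_survival_rate(40, 1000.0))
# Excluding 100 high-risk animal-years that account for 15 of those
# deaths raises the estimate to ~0.972, a substantial shift.
print(annual_survival_rate(25, 900.0))
```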


News Article | November 29, 2016
Site: www.csmonitor.com

High tide floods a street in Portland, Maine, on Nov. 18, 2016, when the closest full moon since 1948 caused unusually high tides in the low-elevation city.

Last week, policy advisors to President-elect Donald Trump announced that the administration would curb NASA’s role in climate research. As it happens, many of Maine’s scientists aren’t happy about that.

“We see NASA in an exploration role, in deep space research,” Bob Walker, space policy adviser for the Trump campaign, told the Guardian last week. “Earth-centric science is better placed at other agencies where it is their prime mission.”

But that’s not necessarily true. Earth monitoring would likely be delegated to agencies such as the National Oceanic and Atmospheric Administration (NOAA), which have less experience in space-based research and tighter budgets. And that presents a problem for the Pine Tree State, which is a national nerve center for earth science and climate research. Many Maine institutions – the University of Maine, the Gulf of Maine Research Institute, the Bigelow Laboratory for Ocean Sciences – rely heavily on NASA’s satellite data.

“The key measurements we use to discern change in organisms in the oceans – sea surface temperature, salinity, ocean color – they all pretty much exclusively fall under NASA,” Barney Balch, a biological oceanographer with the Bigelow Laboratory, told the Portland Press Herald. “Other countries use NASA’s expertise in these areas because they are so good at it.”

Without NASA’s work, Maine scientists say, it could be difficult to track environmental changes in the region and beyond. Large-scale research initiatives, from marine algae monitoring to analysis of melting sea ice, might struggle to secure funding without sound data sets.

“We’re talking people losing jobs and grad students who can’t show up because there isn’t funding to take them on,” Andrew Thomas, a professor of oceanography at the University of Maine, told the Portland Press Herald. “There are millions of dollars that go into the local economy that won’t happen.”

Meanwhile, Maine is among the states most at risk from anthropogenic climate change. With countless tiny inlets and coves, Maine has as much coastline as the rest of the eastern seaboard. Rising sea levels present clear hazards for coastal towns, and warming waters could kill or displace the near-shore ecosystems whose fish and crustaceans power much of Maine's economy. Last month, the Arctic Council convened in Portland to discuss climatological changes in the region. Some of Maine’s coastal industries actually stood to benefit from recent ice melt, which has opened up new marine shipping routes north of the state, but that perk may soon wear thin.

Like Mr. Trump, then-President George W. Bush also refocused NASA’s role in earth sciences – at one point, appointees eliminated the phrase “To understand and protect our home planet” from the agency’s mission statement. But at that time, climate change hadn’t yet been politicized to the degree it is now, and space-based Earth monitoring was a nonpartisan cornerstone of the national strategy. Under Bush, NASA’s 2006 Strategic Plan included three presidential initiatives: the Climate Change Research Initiative, Global Earth Observation, and the Oceans Action Plan.


News Article | January 17, 2016
Site: phys.org

However, the primary mission of the launch from Vandenberg Air Force Base in California went as planned, propelling into orbit a $180 million US-French satellite called Jason-3 to study sea level rise.

"Well, at least the pieces were bigger this time!" Elon Musk, the CEO of the California-based company, wrote on Twitter.

SpaceX is trying to land its rockets back on Earth in order to re-use the parts in future flights, making spaceflight cheaper and more sustainable. The firm succeeded in landing its Falcon 9 first stage—the long towering portion of the rocket—on solid ground at Cape Canaveral, Florida in December.

Even though an ocean landing is more difficult, SpaceX wants to perfect the technique because ship landings "are needed for high velocity missions," Musk tweeted. "Definitely harder to land on a ship," he added after the latest mishap. "Similar to an aircraft carrier vs land: much smaller target area, that's also translating and rotating."

Currently, expensive rocket components are jettisoned into the ocean after launch, wasting hundreds of millions of dollars. Competitor Blue Origin, headed by Amazon founder Jeff Bezos, succeeded in landing a suborbital rocket in November. However, no other company has attempted the ocean landing that SpaceX is trying to achieve.

In the end, the problem on Sunday was not due to high speed or a turbulent ocean, but came down to a leg on the rocket that did not lock out as anticipated. "So it tipped over after landing," Musk said. SpaceX said the rocket landed within 1.3 meters (1.4 yards) of the droneship's center.

There was no hitch in the launch itself: the 10:42 am (1842 GMT) blastoff of the rocket and satellite went flawlessly. The satellite aims to offer a more precise look at how global warming and sea level rise affect wind speeds and currents as close as 0.6 miles (one kilometer) from shore, whereas past satellites were limited to about 10 times that distance from the coast. The technology will monitor global sea surface heights and tropical cyclones, and will help support seasonal and coastal forecasts. During a five-year mission, its data will also be used to aid fisheries management and research into human impacts on the world's oceans.

The satellite is the fruit of a four-way partnership between the National Oceanic and Atmospheric Administration (NOAA), the US space agency NASA, the French space agency CNES (Centre National d'Etudes Spatiales) and the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT).
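The measurement behind those numbers is conceptually simple. The satellite's radar measures the distance between itself and the sea surface; subtracting that range, plus atmospheric and sea-state corrections, from the satellite's precisely tracked orbital height gives the sea surface height, and subtracting a long-term mean turns that into a sea level anomaly. Below is a minimal sketch of the arithmetic with invented numbers; real Jason-3 processing involves many more correction terms and precise orbit determination.

# Simplified sketch of the radar-altimetry arithmetic behind sea level
# measurements like Jason-3's. All values are illustrative, not telemetry.

altitude_m = 1_336_000.0   # satellite height above the reference ellipsoid
range_m = 1_335_973.2      # radar distance from satellite to the sea surface
corrections_m = 2.1        # ionosphere, troposphere, tides, sea state...

sea_surface_height = altitude_m - (range_m + corrections_m)

mean_sea_surface = 24.3    # long-term mean height at this location (m)
sea_level_anomaly = sea_surface_height - mean_sea_surface

print(f"SSH: {sea_surface_height:.2f} m, anomaly: {sea_level_anomaly:+.2f} m")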


Even deadly hurricanes appreciate irony once in a while. Lying directly in the path of fearsome Hurricane Matthew, which could become the first Category 3 or stronger hurricane to strike the U.S. since 2005, is America's next-generation, $1.2 billion weather satellite.

The spacecraft, known as GOES-R (GOES stands for "geostationary operational environmental satellite"), is sitting in a clean room at a building in Titusville, Florida, on the state's northeast coast, across the Intracoastal Waterway from Kennedy Space Center.

Some computer models are projecting that the storm may make landfall directly over Titusville and nearby Cape Canaveral, and a hurricane warning is in effect there. It is within the realm of forecast scenarios that the storm could be a high-end Category 4 or even a Category 5 if and when it hits that area.

The satellite, the first in a $10.9 billion series of four, is scheduled to launch into space on Nov. 4, with NASA and the National Oceanic and Atmospheric Administration (NOAA) heralding the event as a milestone in U.S. weather forecasting efforts. NOAA in particular is counting on GOES-R to stave off a dangerous gap in America's weather satellite coverage, given the country's aging platforms now in orbit, as well as to boost the nation's capabilities for next-generation weather and climate data gathering.

According to NOAA spokesman John Leslie, the satellite is at relatively low risk of being damaged by the storm, provided it does not exceed Category 4 intensity.

"In advance of Hurricane Matthew’s potential path to Florida’s east coast, the team preparing NOAA’s GOES-R spacecraft for launch has taken appropriate safety measures to secure the satellite at its present location — Astrotech Space Operations in Titusville, Fla.," Leslie said in a statement to Mashable. "GOES-R is contained in a building that can withstand strong (Category 4) hurricane conditions. After the effects of Hurricane Matthew subside, NOAA and NASA will carefully assess the spacecraft and provide an update on its status."

NASA's facilities at the Kennedy Space Center were similarly built to withstand a direct hit from a hurricane, though studies show the facilities are vulnerable to sea level rise and storm surge flooding, which is a threat with Hurricane Matthew. As of Wednesday at 5 p.m. EDT, the National Weather Service was forecasting the risk of 5 to 8 feet of inundation above ground level at Cape Canaveral if the storm were to hit at high tide.

The facility housing the GOES-R satellite, which has not yet been mounted to the rocket that will deliver it to orbit, does not sit at the water's edge and therefore has a lower storm surge risk. Several of the older NASA buildings, which are of more historical value than critical infrastructure for satellite launches, are rated to withstand only a Category 3 storm.

Should the storm move farther northeast than currently anticipated and affect the NASA facility in Wallops Island, Virginia, then Hurricane Matthew will have pulled off a rare feat: hitting a satellite before it is deployed, and then striking the ground control system that will communicate with the satellite and receive data from it while it orbits the planet. The satellite dishes and other equipment at Wallops Island were designed to withstand winds of up to 150 miles per hour, according to a spokesperson for the Harris Corporation, which built the equipment.
Assuming it is successfully deployed, the new satellite will provide unprecedented monitoring capabilities for global lightning activity and allow faster, more frequent imaging of storm systems, which could significantly improve severe thunderstorm and even hurricane forecasts, among other advances.


RESTON, Va.--(BUSINESS WIRE)--Ligado Networks and George Mason University will partner to provide the public with real-time access to critical weather and atmospheric data, helping students and scientists better research, track and predict weather. Prior to this arrangement, certain National Oceanic and Atmospheric Administration (NOAA) weather data was available only to a small group of users at significant expense. The partnership between Ligado and Mason will demonstrate the feasibility of delivering NOAA real-time weather data to more users across the country at a lower cost, using a cloud-based network.

“It’s an honor to collaborate with George Mason University on such an important, cutting-edge project,” said Doug Smith, Ligado Networks’ president and chief executive officer. “The network we’ve developed will give the university unprecedented access to real-time public weather data, making it possible for the school’s weather research programs to better study our atmosphere and develop useful tools that will benefit the broader American public.”

“This is a great opportunity that speaks to our commitment to building strong partnerships,” said Dr. Ángel Cabrera, George Mason University president. “We look forward to the advancements and advantages that this partnership with Ligado will provide for students, faculty and the community at large.”

George Mason University is Virginia’s largest public research university. Thanks to its pursuit of research of consequence, Mason was named to the elite group of 115 tier-one research institutions as ranked by the Carnegie Classification of Institutions of Higher Education.

“Extreme weather events have a huge impact on people, including their families, homes and businesses,” said Deborah Crawford, Mason’s vice president for research. “Faster and more accurate climate modeling and weather prediction will help people and organizations – including emergency responders – better prepare for and respond more quickly to weather-related events such as tornadoes, floods and wildfires, saving lives and livelihoods.”

“Mason’s climate researchers have increased our understanding of how changes in the oceans and in the atmosphere affect the weather we and others experience,” Crawford added. “Their work is helping communities become more resilient to extreme weather conditions. Our partnership with Ligado assures that the Mason community will continue to develop more accurate climate models and prepare the next generation of climate researchers, including students in Mason’s new bachelor’s program in atmospheric sciences.”

Under the partnership, Mason and Ligado will compare the delivery of the weather data from existing satellite systems with the new cloud-based content delivery network. This will include measuring the speed and reliability of data delivery to users across the country (one way such a benchmark could work is sketched at the end of this release). The initiative also includes reviewing and improving the accuracy of weather forecasting models and advance detection of meteorological conditions, such as the formation of tornadoes and dense ground fog. Additionally, the new information extraction tools will be made available to the public for free.

“This type of network could also be expanded so schools, libraries and the general public have access to NOAA data, which will go a long way to advancing science, technology, engineering and mathematics education,” Smith said.
“It’s hard to imagine all that may be possible by opening up access to this data, and together with Mason, we look forward to exploring those possibilities over the coming years.”

Mason’s College of Science includes the Department of Atmospheric, Oceanic and Earth Sciences (AOES) and the Center for Ocean-Land-Atmosphere Studies (COLA). Both AOES and COLA offer research programs and undergraduate and advanced degrees in atmospheric science, ocean and estuarine science, and paleontology.

Ligado provides highly dependable and secure communications throughout North America via strategic partnerships with technology and service providers. It is readying plans to deploy an advanced satellite and ground-based network that would further position the U.S. as a leader in wireless technology and infrastructure by delivering unprecedented performance and enabling the emerging 5G and Internet of Things markets.

Ligado Networks is readying an advanced satellite-terrestrial network unlike any that currently exists in North America, providing pervasive, highly secure and ultra-reliable connectivity to critical industries anywhere, all the time. Utilizing one-of-a-kind technologies and capabilities and working with our leading technology partners, Ligado Networks is making stronger connections. For more information, visit www.Ligado.com or follow Ligado on Twitter.

Located near Washington, D.C., Mason enrolls 35,000 students from 130 countries and all 50 states. Mason has grown rapidly over the past half-century and is recognized for its innovation and entrepreneurship, remarkable diversity and commitment to accessibility. For more information visit gmu.edu.

This release contains forward-looking statements and information regarding Ligado Networks and its business. Such statements are based on the current expectations and certain assumptions of the company’s management and are, therefore, subject to certain risks and uncertainties. The forward-looking statements expressed herein relate only to information as of the date of this release. Ligado Networks has no obligation to update these forward-looking statements to reflect events or circumstances after the date of this release, nor is there any assurance that the plans or strategies discussed in this release will not change.
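The release does not spell out how delivery speed and reliability would be measured, but the comparison it describes comes down to timing repeated fetches of the same weather product over each delivery path and counting failures. Here is a minimal sketch of such a benchmark; the endpoints are placeholders invented for illustration, not real Ligado, Mason or NOAA URLs.

# Minimal sketch of a two-path delivery benchmark. Endpoint URLs are
# placeholders, not real services.
import time
import urllib.request

ENDPOINTS = {
    "satellite-relay": "https://example.org/satellite/latest_product.bin",
    "cloud-cdn": "https://example.org/cdn/latest_product.bin",
}

def fetch_stats(url, trials=5):
    """Return (mean seconds per successful fetch, success rate)."""
    times, successes = [], 0
    for _ in range(trials):
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=30) as resp:
                resp.read()
            successes += 1
            times.append(time.monotonic() - start)
        except OSError:
            pass  # a timeout or connection error counts against reliability
    mean = sum(times) / len(times) if times else float("inf")
    return mean, successes / trials

for name, url in ENDPOINTS.items():
    mean_s, reliability = fetch_stats(url)
    print(f"{name}: {mean_s:.2f} s mean fetch, {reliability:.0%} success")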


News Article | April 4, 2016
Site: news.yahoo.com

In this undated photo provided by SeaWorld San Diego, trainer Kristi Burtis obtains a milk sample from Kalia, an orca whale. There's one last orca birth to come at SeaWorld, and it probably will be the last chance for a research biologist to study up close how female killer whales pass toxins to their calves through their milk. SeaWorld's decision to end its orca breeding and to phase out by 2019 its theatrical killer whale performances, the foundation of its brand, followed years of public protests. (Mike Aguilera, SeaWorld San Diego via AP)

ORLANDO, Fla. (AP) — There's one last orca birth to come at SeaWorld, and it will probably be the last chance for research biologist Dawn Noren to study up close how female killer whales pass toxins to their calves through their milk.

While SeaWorld's decision last month to end its orca breeding program delighted animal rights activists, it disappointed many marine scientists, who say they will gradually lose vital opportunities to learn things that could help killer whales in the wild.

Noren got to observe only one mother-and-calf pair at a SeaWorld park before the end of the breeding program was announced. "It's really difficult to publish with one. I really was hoping for a couple more, but that is what it is," said Noren, who works at the National Marine Fisheries Service's Northwest Fisheries Science Center in Seattle.

SeaWorld's 29 orcas at its parks in Orlando, San Diego and San Antonio could remain on display for decades to come and will continue to be available for study by outside scientists, as they generally have been for many years. The whales are 1 to 51 years old. But as SeaWorld's orca population dwindles, researchers will lose chances to collect health data and make other observations, such as drawing blood, measuring heart rates and lung capacity, and documenting diets and growth. As the animals age, scientists say, research will be limited to geriatric orcas.

No other marine park or aquarium in the world has SeaWorld's experience in maintaining or breeding orcas in captivity. SeaWorld parks hold all but one of the orcas in captivity in the U.S., and they have housed more than half of all captive killer whales in the world tracked by the National Oceanic and Atmospheric Administration over the past 50 years. Orcas held in Canada, Japan and Europe have not been as accessible to researchers.

SeaWorld will continue to support research projects underway on hearing, heart rates and blood, said Chris Dold, SeaWorld's chief zoological officer. "There won't be an immediate crunch," he said. But he acknowledged: "Over time, yeah, there's a loss of this resource to society and science."

SeaWorld's critics, including People for the Ethical Treatment of Animals and WDC/Whale and Dolphin Conservation, sidestepped questions of whether outside researchers will suffer. But they said SeaWorld's own research has been unhelpful to orcas in the wild. "SeaWorld has had the largest population of orcas and has had the opportunity to do useful research and had done none of that," said Jared Goodman, PETA's director of animal law.

Researchers outside SeaWorld argue they need its facilities and 1,500 employees in animal care to answer questions about wild orca behavior.
"If you want to interact with them and conduct research, the combination of talent you have to have is a scientist with a research question, animals that are healthy so that you're looking at normal physiological rates, and in between that are the trainers — and I think people miss that," said Terrie Williams, who runs the Center for Marine Mammal Research and Conservation at University of California, Santa Cruz. SeaWorld's decision to end orca breeding and phase out its world-famous killer whale performances by 2019 followed years of protests and a drop in ticket sales at its parks. The backlash intensified after the 2013 release of "Blackfish," a documentary that was critical of SeaWorld's orca care and focused on an animal that killed a trainer during a performance in Orlando in 2010. In the wake of SeaWorld's announcement, some researchers fear that lawmakers on Capitol Hill and in states such as Washington and California will ban breeding or keeping of killer whales altogether. Similar bans targeting other species would have stymied the captive breeding that revived the California condor, said Grey Stafford, incoming president of the International Marine Animal Trainers' Association. "Those bills can have unforeseen and unintended consequences if and when the next species has a population crash in the wild. It ties the hands of state agencies and sanctuaries and places like SeaWorld to act," Stafford said. Kay contributed to this story from Miami. This story has been corrected to give the full name of the International Marine Animal Trainers' Association.


News Article | February 21, 2017
Site: www.techtimes.com

Built as a backup, the duplicate copy of the Lightning Imaging Sensor (LIS) is now off on a two-year mission in space to measure the "amount, rate, and optical characteristics of lightning over Earth." A sequel to the success of the original LIS instrument launched in 1997, the backup is on its way to the International Space Station as a payload on the 10th SpaceX cargo resupply mission, launched on Feb. 18.

The original LIS, part of the Tropical Rainfall Measuring Mission (TRMM), was shut down after 17 years of collecting lightning data. The follow-on mission will "sample lightning over a wider geographical area," said Richard Blakeslee, science lead for the LIS at NASA's Marshall Space Flight Center.

The original LIS was carried by the TRMM satellite, which orbited over locations on Earth between 35 degrees north latitude and 35 degrees south latitude. It generated data for the tropics, but not for the more temperate zones, including the densely populated areas away from the equator. Unlike the first LIS, the spare will be mounted on the exterior of the orbiting lab, whose orbital inclination will allow the instrument to observe areas farther into the Northern and Southern Hemispheres, 24 hours a day.

Mounting LIS on the International Space Station will also make real-time lightning data available. This data can be used in weather forecasting, advisories, and warnings, and is accessible to interested users worldwide in partnership with NASA's Short Term Prediction Research and Transition Center in Huntsville.

Atmospheric scientists were convened by NASA in 1979 to explore the possibility of using space-based lightning observations as a tool to study weather and climate. That work paved the way for the development of LIS and the launch of TRMM as the first mission documenting the global lightning climatology from space.

The LIS data will also be used in conjunction with other space-based weather instruments, such as the Geostationary Lightning Mapper (GLM), to generate data that will increase present knowledge of severe weather formation as well as changes in lightning distributions. GLM was recently launched on the National Oceanic and Atmospheric Administration's GOES-16 satellite.

"The space-based vantage point allows us to observe all forms of lightning over land and sea, 24 hours a day," said Blakeslee. The data from LIS will help explain the link between lightning and severe weather conditions. Weather scientists believe that a proper understanding of the relationship between lightning and the accompanying severe weather holds the key to enhanced weather forecasting that will save lives and property, not only in the U.S. but also around the world.
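At its core, a lightning climatology of the kind LIS pioneered is built by binning detected flashes into latitude-longitude cells and normalizing by observation time. The sketch below shows only that gridding step, with invented flash locations; real LIS processing first groups raw optical events into flashes and normalizes by how long the sensor actually viewed each cell.

# Toy sketch of turning flash detections into a gridded flash-rate map.
# Flash locations and the observation period are invented.
from collections import Counter

flashes = [(9.5, -75.2), (9.7, -75.1), (34.8, -84.3), (-2.1, 23.4), (-2.3, 23.9)]

CELL = 5.0  # grid resolution in degrees

def cell_of(lat, lon):
    """Index of the 5-degree grid cell containing a flash."""
    return int(lat // CELL), int(lon // CELL)

counts = Counter(cell_of(lat, lon) for lat, lon in flashes)

observation_days = 30.0
for (i, j), n in sorted(counts.items()):
    lat0, lon0 = i * CELL, j * CELL
    print(f"lat [{lat0:+.0f}, {lat0 + CELL:+.0f}), "
          f"lon [{lon0:+.0f}, {lon0 + CELL:+.0f}): "
          f"{n / observation_days:.2f} flashes per day")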


News Article | December 6, 2016
Site: www.theenergycollective.com

Donald Trump’s election is generating much speculation about how his administration may or may not reshape the federal government. On space issues, a senior Trump advisor, former Pennsylvania Rep. Bob Walker, has called for ending NASA earth science research, including work related to climate change. Walker contends that NASA’s proper role is deep-space research and exploration, not “politically correct environmental monitoring.”

This proposal has caused deep concern for many in the climate science community, including people who work directly for NASA and others who rely heavily on NASA-produced data for their research. Elections have consequences, and it is an executive branch prerogative to set priorities and propose budgets for federal agencies. However, President-elect Trump and his team should think very carefully before they recommend canceling or defunding any of NASA’s current Earth-observing missions.

We can measure the Earth as an entire system only from space. It’s not perfect — you often need to look through clouds and the atmosphere — but there is no substitute for monitoring the planet from pole to pole over land and water. These data are vital to maintaining our economy, ensuring our safety both at home and abroad, and quite literally being an “eye in the sky” that gives us early warning of changes to come. To paraphrase Milton Friedman, there’s no free lunch. If NASA is not funded to support these missions, additional dollars will need to flow into NOAA and other agencies to fill the gap.

The National Aeronautics and Space Act of 1958, which created NASA, explicitly listed “the expansion of human knowledge of phenomena in the atmosphere and space” as one of the new agency’s prime objectives. Other federal agencies have overlapping missions, which is normal, since there are few neatly defined stovepipes in the real world. The National Oceanic and Atmospheric Administration, which is part of the Department of Commerce, works to “understand and predict changes in climate, weather, oceans, and coasts.” And the U.S. Geological Survey, a bureau of the Interior Department, is charged with “characterizing and understanding complex Earth and biological systems.”

These primary earth science agencies have a pretty clear division of labor. NOAA and USGS fund and operate a constellation of weather- and land-observing satellites, while NASA develops, prototypes and flies higher-risk, cutting-edge science missions. When these technologies have been proven, and Congress funds them, NASA transfers them to the other two agencies.

For example, in the NOAA–NASA partnership to develop the next generation of operational weather-observing satellites, NASA took the lead in prototyping and reducing risk by building the Suomi NPP satellite. That satellite, now five years old, is improving our daily weather forecasts by sending terabytes of data every day to supercomputers at NOAA. Its images also help with tasks as diverse as navigating in the Arctic through the Northwest Passage and monitoring the tragic wildfires near Gatlinburg, Tennessee. The experience NASA gained by developing the new technologies is now incorporated into NOAA’s Joint Polar Satellite System, whose first launch is scheduled for next year.

When I served as NOAA’s chief operating officer, I met regularly with my NASA counterpart to ensure that we were not duplicating efforts. Sometimes these relationships are even more complex.
As oceanographer of the Navy, I worked with NOAA, NASA and the government of France to ensure joint funding and mission continuity for the JASON-3 ocean surface altimeter system. The JASON satellites measure the height of the ocean’s surface, track sea level rise and help the National Weather Service (which sits within NOAA) forecast tropical cyclones that threaten U.S. coastlines.

It is vital for these agencies to coordinate, but each plays an important individual role, and they all need funding. NOAA does not have enough resources to build and operate a number of NASA’s long-term space-based Earth observing missions. For its part, NASA focuses on new techniques and innovations, but is not funded to maintain legacy operational spacecraft while simultaneously pushing the envelope by developing new technologies.

To many members of the earth science community, organizational issues between NASA and NOAA are secondary to the real problem: lack of sufficient and sustained funding. NASA and NOAA are working jointly to patch together a space-based Earth observing system, but do not receive sufficient resources to fully meet the mission. An administration that truly wanted to improve this situation could do so by developing a comprehensive Earth observing strategy and asking Congress for enough money to execute it. That would include maintaining NASA’s annual Earth science budget at around US$2 billion and increasing NOAA’s annual satellite budget by $1-2 billion.

There’s a reason why space is called “the ultimate high ground” and our country spends billions of dollars each year on space-based assets to support our national intelligence community. In addition to national security, NASA missions contribute vital information to many other users, including emergency managers and the Federal Emergency Management Agency (FEMA), farmers, fishermen and the aviation industry. While NASA’s Earth observation satellites support numerous research scientists in government labs and universities, they also provide constant real-time data on the state of space weather, the atmosphere and the oceans – information that is critical to U.S. Navy and Department of Defense operations worldwide.

Six years ago, while I was serving as oceanographer of the Navy, I was asked to estimate how much more money the Navy would need to spend if we did not have our NASA and NOAA partners. The answer was, very conservatively, $2 billion per year just to maintain the capability that we had. That figure has almost certainly increased. If the Trump administration cuts NASA’s earth science funding, that capability will need to come from some other set of agencies. Has the new team thought seriously about which agencies should have their budgets increased to make up this gap?

Finally, a few thoughts about the elephant in the room: climate change. Mr. Walker has said that “we need good science to tell us what the reality is,” a statement virtually everyone would agree with. The way to have good science is to fund a sustained observation system and ensure the scientific community has free and full access to the data that these satellites produce. Not funding observation systems, or restricting access to their data, will not change the facts on the ground. Ice will continue to melt, and our atmosphere and oceans will continue to warm. Such a policy would greatly increase risks to our economy, and even to many Americans’ lives. In the business world, this stance would be considered gross negligence. In government the stakes are even higher.
David Titley is Professor of Practice in Meteorology, Director of the Center for Solutions to Weather and Climate Risk, and Adjunct Senior Fellow with the Center for New American Security, Pennsylvania State University. Main image: Sea surface temperatures, October 2016, based on NASA satellite data. Sea surface temperatures affect weather, including hurricanes, and animal and plant life in the oceans. Credit: NASA Earth Observations


LOS ANGELES, CA--(Marketwired - November 10, 2016) - HeroX today launched the Big Ocean Button Challenge -- a crowdsourcing competition to find breakthroughs that transform big ocean data into useful mobile applications. Sponsored by XPRIZE and a follow-on to the Wendy Schmidt Ocean Health XPRIZE, the Big Ocean Button Challenge, announced at BlueTech Week in San Diego hosted by the Maritime Alliance, utilizes the HeroX Challenge Platform to seek new solutions for organizing and broadcasting ocean data, with application innovations in five categories: Fishing, Shipping and Trade, Ocean Acidification, Public Safety and Exploration.

"Terabytes of data are collected everyday by the ocean community, but only a small fraction are shared with the public through useful apps and services," said Matt Mulrennan, manager of XPRIZE's Ocean Initiative. "With billions of people around the world relying on ocean resources from seafood to shipping to recreation, unlocking this wealth of data to catalyze a multibillion-dollar blue economy in ocean data products and services will radically transform how we connect with and protect our precious seas."

"We are on the edge of an exciting emergence of a new blue economy, one that's based on information and environmental intelligence," said Dr. Rick Spinrad, Chief Scientist, National Oceanic and Atmospheric Administration (NOAA). "As with the growth of commercial weather services, the development of value-added information about the ocean is bound to become a fruitful economic sector, supporting such diverse communities as public health, transportation, emergency services, and energy."

"There is a tremendous gap between available ocean data and the market for products that can make use of that vital information," said HeroX CEO, Christian Cotichini. "The Big Ocean Button Challenge has so many potential and important business and social outcomes. HeroX will apply its powerful crowdsourcing strategy: Using social media to mobilize bold thinkers, empower their collaboration and drive incredible solutions. This challenge could lead to new industries that tackle some of the world's most daunting problems."

$15,000 will be awarded to winning Big Ocean Button Challenge competitors in each of the five ocean data application prize categories, and bonus prizes will also be awarded.

This challenge was first announced at the gala dinner of BlueTech Week in San Diego, hosted by the Maritime Alliance, an event that brings together leaders in the blue economy to highlight case studies of collaboration in blue technology clusters from around the world. The Big Ocean Button Challenge is open to competitors worldwide beginning today through March 31, 2017. Winners will be announced by early 2018. For more information about the Big Ocean Button Challenge, please visit https://herox.com/bigoceanbutton. To learn more about XPRIZE's Ocean Initiative, please visit: http://www.xprize.org/oceaninitiative.

Founded in 2013, HeroX exists at the intersection of crowdsourcing, competition and collaboration, using each to drive positive change. The HeroX incentive prize platform, a suite of tools and services to help spark and build awareness for new solutions to social and economic challenges, connects funding companies and individuals with problem solvers. XPRIZE is the global leader in designing and implementing innovative competition models to solve the world's grandest challenges.
XPRIZE utilizes a unique combination of gamification, crowd-sourcing, incentive prize theory, and exponential technologies as a formula to make 10x (vs. 10%) impact in the grand challenge domains facing our world. XPRIZE's philosophy is that -- under the right circumstances -- igniting rapid experimentation from a variety of diverse lenses is the most efficient and effective method of driving exponential impact and solutions to grand challenges.

Active competitions include the $30M Google Lunar XPRIZE, the $20M NRG COSIA Carbon XPRIZE, the $15M Global Learning XPRIZE, the $10M Qualcomm Tricorder XPRIZE, the $7M Shell Ocean Discovery XPRIZE, the $7M Barbara Bush Foundation Adult Literacy XPRIZE, the $5M IBM Watson AI XPRIZE, the $1.75M Water Abundance XPRIZE and the $1M Anu and Naveen Jain XPRIZE. For more information, visit www.xprize.org.


News Article | February 18, 2017
Site: www.washingtonpost.com

BOSTON — A group of scientists and their supporters are set to march Sunday in Boston’s Copley Square in an event they’ve dubbed “a rally to stand up for science” in the Trump years.

Inside a large nearby convention center, meanwhile, the annual meeting of the American Association for the Advancement of Science (AAAS), the United States’ largest general scientific society, has featured speeches and panel sessions further underscoring the sense that under President Trump, scientists could face wide-ranging political conflicts and challenges, and will have to decide how to meet them.

At the opening plenary, the chair of the board of the AAAS criticized Trump’s executive order on immigration; the next night, a prominent historian suggested to scientists that there’s nothing wrong with taking political stands. Einstein did it, after all, over the atomic bomb.

“We live in a world where many people are trying to silence facts,” said Harvard scholar Naomi Oreskes. She told a vast hall of hundreds of scientists that history does not support the idea that “taking a public position on an urgent issue undermines the credibility of science.”

And yet the challenges for scientists during the Trump administration could be not only bigger, but also potentially more diverse, than those seen during George W. Bush’s administration — a key reference point in the research community for thinking about problems at the intersection between science and politics.

During the Bush years, a number of science controversies arose related to suppression of scientific information or interference with its dissemination, as numerous government scientists and experts charged they’d been blocked from speaking to the media, or that scientific documents had been politically edited. In those days, the threat of deep cuts to research funding didn’t loom as large as it does now. And today’s science world has also mobilized over Trump’s immigration executive order; more than 100 scientific societies and universities registered their concern in a recent letter to the president.

The anticipation of a multi-pronged battle is shared by the marchers, organized by the Natural History Museum, ClimateTruth, the Union of Concerned Scientists, and numerous other groups. “It’s so many different fronts — subtle, not so subtle, things that can affect directly or indirectly the health, the environment, the economy, all of these things,” said Astrid Caldas, a climate scientist with the Union of Concerned Scientists who was set to speak at the march. “Depending on if we are talking about an executive order, or the gutting of EPA. So it’s a complex situation, and it’s a unique situation.”

The organizers of the march — a smaller-scale version of a major March for Science planned for Earth Day — charge that “from the muzzling of scientists and government agencies, to the immigration ban, the deletion of scientific data, and the de-funding of public science, the erosion of our institutions of science is a dangerous direction for our country.”

Inside the conference Friday, however, more cautious science policy experts warned that Bush-style problems over the suppression of and interference with science have not yet clearly emerged under Trump.
They stressed that temporary communication freezes during a government transition are not abnormal, and that there are new protections in place for scientists, such as federal scientific integrity directives, that didn’t exist in the Bush years.

“It’s too early to say that there is going to be some across-the-board freeze on the ability of scientists to communicate,” said Joanne Carney, director of the office of government relations at the AAAS, at the Friday science policy panel. “We have [government] scientists attending the conference here. I think it’s a little too early to say that scientists are going to be inhibited and incapable of speaking or publishing their research. But we are monitoring it.”

Granted, that doesn’t mean that federal researchers aren’t already self-censoring out of concern for what they may face. “Fear is higher,” said Robert Cook-Deegan, a science and health policy expert at Arizona State University, at the Friday session. “If you’re a federal employee, I think there’s going to be a level of self-scrutiny that is higher than it has been in past administrations.”

However, the speakers underscored that scientists could simultaneously face a vast new challenge over securing federal research funding and maintaining it at current levels. Citing federal budget trends, with an expected tax cut and infrastructure spending program as well as a possible dismantling of the Affordable Care Act, William Bonvillian, director of MIT’s Washington office, said that discretionary federal spending is set to be squeezed. That, in turn, can be expected to hit scientific research budgets, he said, and in turn, the federal and university-based research community.

“There is going to be a challenge” to research and development programs, Bonvillian said. “We’re going to need to tell the story, that R&D is actually a key part of the solution, it’s part of growth. But the challenge this time in telling that story is going to be even greater than usual.”

That is a contrast with the Bush years, when the president’s science adviser, John Marburger, extolled the administration’s commitment to scientific research funding. And sure enough, based on data from the AAAS, science funding fared relatively well in the Bush years, especially when it came to defense-related research and development. (Research spending has declined as a percentage of U.S. GDP, but that’s part of a long-term trend that has persisted across many administrations and many years.)

There is particular concern right now about cuts to the Environmental Protection Agency and how those would affect its science. The conservative Heritage Foundation, influential with the Trump administration, has also suggested cutting an entire Energy Department office, the Office of Energy Efficiency and Renewable Energy, which would be a major blow to renewable energy research.

In contrast, medical research would appear to face fewer threats. Former president Barack Obama’s top medical research priorities, such as the cancer “moonshot,” appear far more secure than programs at the EPA, the National Oceanic and Atmospheric Administration, and the U.S. Department of Agriculture, Cook-Deegan said. “There are a lot of people in those programs that are quite worried,” he said.
“I think all we have right now is question marks about those.”

And then there’s the subject of immigration restrictions, and how those affect the scientific community, which has traditionally viewed itself as an international enterprise of knowledge and welcomed contributions from a global talent pool. Academics and scientists have already played a central role in the immigration debate. In Washington state’s thus far successful lawsuit over Trump’s immigration executive order, the University of Washington attested that it had numerous students and professors who were citizens of the seven countries targeted. That included one professor who had to shelve plans to attend a conference outside the United States for fear of not being able to get back in under the order, and one graduate teaching assistant who was traveling outside of the country when it was signed.

“Science depends on openness, transparency, and the free flows of ideas and people,” said Geraldine Richmond, chair of the board of AAAS and a professor at the University of Oregon, at a plenary session Thursday. “Limitations on the ability of scientists to communicate with their peers and with the public through participation at meetings such as this one will harm the scientific enterprise. We must, and we will, continue to speak out publicly on these issues that are so critical for science to flourish and serve society.”

Richmond said some international researchers may not have been able to attend the meeting, or had made a decision not to come, because of Trump’s executive order.


News Article | November 16, 2016
Site: www.nature.com

Joellen Russell wasn’t prepared for the 10-metre waves that pounded her research vessel during an expedition south of New Zealand. “It felt like the ship would be crushed each time we rolled into a mountain of water,” recalls Russell, an ocean modeller at the University of Arizona in Tucson. At one point, she was nearly carried overboard by a rogue wave. But what really startled her was the stream of data from sensors analysing the seawater. As the ship pitched and groaned, she realized that the ocean surface was low in oxygen, high in carbon and extremely acidic — surprising signs that nutrient-rich water typically found in the deep sea had reached the surface. As it turned out, Russell was riding waves of ancient water that had not been exposed to the atmosphere for centuries.

Although controversial when she encountered it back in 1994, this powerful upwelling is now recognized as a hallmark of the Southern Ocean, a mysterious beast that swirls around Antarctica, driven by the world’s strongest sustained winds. The Southern Ocean absorbs copious amounts of carbon dioxide and heat from the atmosphere, which has slowed the rate of global warming. And its powerful currents drive much of the global ocean circulation.

The hostile conditions have kept oceanographers at bay for decades, but a new era of science is now under way. Researchers from around the world are converging on the region with floats, moorings, ships, gliders, satellites, computer models and even seals fitted with sensors. The goal is to plug enormous data gaps and bolster understanding of how the Southern Ocean — and the global climate — functions. Doing so could be key to improving predictions of how quickly the world will warm, how long the Antarctic ice sheet will survive and how fast sea levels will rise.

“It’s been amazing to see this explosion of information,” says Arnold Gordon, an oceanographer at Lamont-Doherty Earth Observatory in Palisades, New York, who led some of the early Southern Ocean surveys in the 1960s. “New technologies are allowing us access to these remote areas, and we are far less dependent on driving a ship through the sea ice.”

Already, initial data from an array of ocean floats suggest that upwelling waters could be limiting how much CO2 the Southern Ocean absorbs each year. This raises new questions about how effective these waters will be as a brake on global warming in decades to come. “The Southern Ocean is doing us a big climate favour at the moment, but it’s not necessarily the case that it will continue doing so in the future,” says Michael Meredith, an oceanographer with the British Antarctic Survey in Cambridge, UK. Meredith is heading a series of expeditions over the next five years to help document the uptake of heat and carbon. “It really is the key place for studying these things.”

The mysteries of the Southern Ocean have beckoned explorers for centuries, but the unique geography of the region makes it a perilous place for ships. There are no landmasses to tame the winds and waves that race around the planet at 60° S. And the ice surrounding Antarctica is notorious for engulfing wayward vessels, including Ernest Shackleton's Endurance in 1915.

Scientists only started to realize how important the region is for controlling global climate in the 1980s, when several groups were trying to explain what had caused atmospheric CO2 concentrations to drop by about one-third during the last ice age and then later rise.
Oceanographer Jorge Sarmiento at Princeton University in New Jersey realized that changes in circulation and biology in the Southern Ocean could help to cool and warm the planet [1]. Three decades later, Sarmiento is leading an effort to gather the first real-time data on the chemical and biological processes that govern carbon in the Southern Ocean. The US$21-million Southern Ocean Carbon and Climate Observations and Modeling Project (SOCCOM) has already deployed 51 of a planned 200 robotic floats that bob up and down in the upper 2,000 metres of the Southern Ocean. Building on the global Argo array, which consists of more than 3,700 floats collecting temperature and salinity data, the SOCCOM floats also measure oxygen, carbon and nutrients. With the new data, Sarmiento and his team can test their models and refine estimates of how CO2 moves between the seas and the sky.

Indirect evidence suggests that the Southern Ocean is a net carbon sink and has absorbed as much as 15% of the carbon emitted by humanity since the industrial revolution. But at some times of year and in specific places in this region, carbon-rich surface waters release CO2 into the atmosphere. Now, researchers are getting some of their first glimpses in near-real time of what happens in the Southern Ocean, particularly in winter. “Right off the bat, we are seeing CO2 fluxes into the atmosphere that are much greater than we had estimated before,” Sarmiento says. “It’s just revolutionary.”

The unpublished analysis is based on just 13 floats that have been in the water for at least a year, so the question now is whether the higher CO2 emissions during winter represent larger trends across the entire Southern Ocean. “It’s pretty tantalizing,” says Alison Gray, a postdoctoral researcher at Princeton who is leading the study. “It would imply that potentially there is a much weaker carbon sink in the Southern Ocean than has been estimated.”

Hints of something similar have been seen before. In 2007, a team led by Corinne Le Quéré, now director of the Tyndall Centre for Climate Change Research in Norwich, UK, published a study in Science [2] indicating that the rate of carbon uptake by the Southern Ocean decreased between 1981 and 2004. The authors blamed the changes on the winds that encircle the Antarctic continent. The speed of those winds had increased during that time, probably as a result of the hole in the stratospheric ozone layer over Antarctica and possibly because of global warming. Stronger winds are better able to pull up deep, ancient water, which releases CO2 when it reaches the surface. That would have caused a net weakening of the carbon sink. If that trend were to continue, atmospheric CO2 levels would rise even faster in the future.

However, a study in Science [3] last year found that the carbon sink started to strengthen in the early 2000s. Le Quéré says it’s unclear whether that rise in CO2 absorption is a return to normal or a deviation from the long-term weakening of the sink. Regardless, she says, it’s now clear that the Southern Ocean might be much more fickle than scientists thought.

SOCCOM floats will probably help researchers to answer these questions, but it could be years before they can say anything concrete about trends. Nor is Le Quéré convinced that the new network of floats will provide enough detail. In a paper published in July [4], she found that models of carbon uptake by the Southern Ocean depend strongly on assumptions about the structure of the food web there.
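The flux numbers the floats feed into are not measured directly: they come from a bulk formula in which the air-sea CO2 flux is the product of a gas transfer velocity (usually parameterized from wind speed), the solubility of CO2 in seawater, and the difference in CO2 partial pressure across the sea surface. The sketch below uses a common Wanninkhof-style quadratic wind-speed parameterization; the coefficient and all inputs are illustrative stand-ins, not SOCCOM values.

# Simplified bulk-formula sketch of an air-sea CO2 flux estimate.
# Constants and inputs are illustrative only.

def gas_transfer_velocity(wind_speed_ms, schmidt=660.0):
    """Wanninkhof-style quadratic parameterization, in cm per hour."""
    return 0.251 * wind_speed_ms**2 * (660.0 / schmidt) ** 0.5

pco2_sea = 420.0e-6   # surface-water pCO2 (atm); upwelled water runs high
pco2_air = 400.0e-6   # atmospheric pCO2 (atm)
solubility = 35.0     # CO2 solubility in cold seawater (mol per m^3 per atm)
wind = 12.0           # wind speed (m/s); Southern Ocean winds are strong

k_m_per_day = gas_transfer_velocity(wind) * 24.0 / 100.0

# Positive flux means outgassing from sea to air.
flux = k_m_per_day * solubility * (pco2_sea - pco2_air)
print(f"k = {k_m_per_day:.2f} m/day, flux = {flux:+.4f} mol CO2 per m^2 per day")

In this invented case the flux works out to roughly two moles of CO2 per square metre per year of outgassing; because the result swings with both the wind and the sign of the pCO2 difference, a handful of wintertime float profiles can move basin-wide estimates substantially.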
Le Quéré says that climate scientists need to improve their understanding of the type and timing of phytoplankton and zooplankton blooms if they are going to get their climate projections right. “In my view, that's the next frontier,” she says.

Carbon is only part of the story in the Southern Ocean. Scientists are also beginning to pin down what happens to all the heat that gets absorbed there. The Southern Ocean is the starting point for a network of currents that carry water, heat and nutrients throughout the ocean basins. Near Antarctica, surface waters normally grow cold and dense enough to sink to the bottom of the ocean, forming abyssal currents that hug the sea floor as they flow north into the Pacific, Atlantic and Indian oceans.

Much of what scientists know about these currents comes from ship surveys conducted every decade or so since the early 1990s. In 2010, when researchers analysed data from the surveys, they found a pronounced warming trend in abyssal waters, which were somehow absorbing about 10% of the excess heat arising from global warming [5].

The level of warming in the deep ocean came as a surprise, and researchers have proposed several explanations that centre on the Southern Ocean. One factor could be that surface waters around Antarctica have become less salty, in part because of an increase in summer rainfall over the ocean. Fresher surface water is less dense, so that change would choke the supply of cold water sinking to the sea floor to feed the bottom currents. “The deep water warms up because it’s not getting as much cold-water replenishment,” says Gregory Johnson, an oceanographer with the National Oceanic and Atmospheric Administration (NOAA) in Seattle, Washington, who co-authored the 2010 analysis.

An as-yet-unpublished analysis, based on initial data from the third round of ship surveys, finds similar trends, but researchers have longed for more frequent measurements to provide a fuller picture. That could happen if a proposed international project moves forward. Called Deep Argo, this would be an array of floats that regularly dive all the way to the bottom of the ocean. Johnson is involved in a US consortium that is testing 13 floats in a basin off the coast of New Zealand, and another nine south of Australia.

Others are using moorings to monitor deep water flows. Since 1999, Gordon has maintained an array of moorings in the Weddell Sea, one of the main areas where cold surface waters sink to form ocean bottom currents. He has seen the deep water growing less salty in some areas, but the long-term trends are not clear [6]. “We are really only scratching the surface of how bottom waters are changing, and how that is impacting the large-scale global ocean circulation,” he says.

In January 2015, oceanographers aboard the Australian icebreaker Aurora Australis were cruising off the coast of Antarctica when they were presented with a unique opportunity. Following a crack in the sea ice, they were able to reach the edge of the Totten Glacier, one of the biggest drainage points for the East Antarctic ice sheet. No other expedition had reached within 50 kilometres of the glacier. The team deployed floats and gliders into the waters around and underneath the glacier, which is 200 metres thick at its front edge. What they found came as a shock. The water at the front of the glacier was 3 °C warmer than the freezing point at the base of the glacier.
“We always thought Totten was too far away from warm water to be susceptible, but we found warm water all over the shelf there,” says Steve Rintoul, an oceanographer at the Antarctic Climate and Ecosystems Cooperative Research Centre in Hobart, Australia.

Scientists had already shown [7, 8] that warm-water currents are undercutting the West Antarctic ice sheet in many areas along the peninsula where the glaciers extend into the ocean. But Rintoul says that this expedition provided some of the first hard evidence that these same processes are affecting East Antarctica, raising new questions about the longevity of the mammoth ice sheets that blanket the continent.

There is no clear answer yet for what is driving the warming of these near-surface currents. Some explanations invoke changes in the winds over the Southern Ocean and the upwelling of warm waters. Others focus on fresher surface waters and an expansion of sea ice in some areas. The combination of extra sea ice and fresher surface waters could create a kind of cap on the ocean that funnels some of the warmer upwelling water towards the coast. “Every scientist, including me, has their favourite explanation,” Gordon says. “But that’s how science works: the more you observe, the more complicated it gets.”

Finding the answers may require recruiting some of Antarctica’s permanent residents. Meredith’s team at the British Antarctic Survey plans to equip Weddell seals with sensors so that the animals can collect water measurements as they forage below the sea ice along the continental shelf. This zone has particular importance because it is precisely where cold water begins its descent into the abyss. “The processes that happen in that shelf region are very important on a global scale, but measuring them is very difficult,” Meredith says. “The seals sort of transcend that barrier.”

The Weddell seals are just one component of the expedition’s arsenal. The team will also send autonomous gliders under the sea ice on preprogrammed routes to collect temperature and salinity data down to depths of 1,000 metres. Measurements taken from ships will help fill in the picture of what happens in this crucial region around Antarctica — and how it relates to the rest of the global ocean circulation.

Getting the data is only half the challenge. Ultimately, scientists need to improve their models of how currents transport heat, CO2 and nutrients around the globe. Even armed with better measurements, results suggest that modellers have a way to go. An analysis of data from the ship surveys suggests that upwelling ocean water does not rise in a simple pattern near Antarctica. Rather, it swirls around the continent one and a half times before reaching the surface. And Sarmiento’s team at Princeton found that only the highest-resolution models could accurately capture that behaviour.

Sarmiento says that it could be a while before the models can simulate what really happens in this region, but he is confident that day will eventually arrive. For Russell, it’s as if scientists are at last lifting the veil on the Southern Ocean. After she returned from her maiden voyage in 1994, she turned to modelling because there wasn’t enough data at the time to quantify the effects of the upwelling she encountered. Today she has it both ways. Russell is heading the modelling component of the SOCCOM project, and she is getting more data than she ever dreamt of.
“It’s just a wonderful time to be an oceanographer,” she says, “even as we are carrying out this really scary geophysical experiment on our planet.”


News Article | January 19, 2017
Site: www.theguardian.com

Small island species, confined to limited terrain, are always vulnerable, particularly to invasive species, burgeoning human populations, and new diseases. On Hawaii, climate change intersects with these three factors to imperil its unique birds, including six species of honeycreeper.

The small, often brightly coloured honeycreepers tend to survive at higher altitudes where their forest habitat is less likely to be destroyed by humans. Higher elevations are also cooler, and less attractive to mosquitoes, which were first carried to Hawaii in the 19th century, long after the birds had evolved there. Outbreaks of mosquito-borne diseases such as avian malaria and avian pox began soon afterwards. As the world warms, so mosquitoes move into higher elevations – and there is nowhere for the honeycreepers to escape to.

The birds are particularly susceptible to avian malaria. Last year, a scientific study noted that the prevalence of avian malaria has more than doubled since the 1990s in the upper regions of the Hawaiian island of Kauai. For years, naturalists working in the Kauai mountains searched for mosquitoes without ever encountering them; only in the last six years or so have the insects become commonplace.

As well as mosquitoes, climate change is also assisting non-native competitors and invasive weeds, which may hasten the native birds’ demise. Eben Paxton, of the US Geological Survey’s Pacific Island Ecosystems research centre, fears that two honeycreepers, the ‘akikiki and the ‘akeke’e, will fall extinct in the next decade “without major intervention”. This means action unfamiliar to many conservationists: removing standing water to reduce mosquito populations, and even releasing genetically modified mosquitoes to reduce populations over time, as undertaken in Brazil to combat the Zika virus.

The Baird’s sandpiper (Calidris bairdii) is not likely to become extinct any time soon. It is still listed as a species of “least concern” on the International Union for Conservation of Nature’s (IUCN) Red List. But the challenge posed by climate change for this elegant little wading bird is one experienced by many other species: it’s a problem of phenology and synchronicity.

Phenology, the study of the timing of natural events in relation to weather and climate, is increasingly complex and important in an era of rapid climate change. Changes in phenology may be a positive sign, demonstrating that species are adapting to climatic conditions and migrating earlier, or flowering sooner, or having offspring earlier in the spring to coincide with food supplies that are changing with the season. But many species are struggling to adapt quickly enough.

Increasing temperatures in the high Arctic are encouraging shore birds such as the Baird’s sandpiper to breed earlier in the season. This means that more chicks are emerging before the peak abundance of the insects that they feed on. Studies show that chicks raised outside the period of peak abundance grow much more slowly, which means they are less likely to survive into adulthood. A similar mismatch between chick emergence and peak food has also been shown to occur with the European pied flycatcher in the Netherlands.

Increasing temperatures are posing a challenge for all kinds of montane species. They may retreat to higher altitudes but, eventually, they will run out of mountain.
Mountainous regions are also likely to experience particularly extreme temperature changes: while the Intergovernmental Panel on Climate Change estimates that 21st-century climate warming is likely to exceed 2C in many scenarios, the rate of temperature increase in mountainous areas is predicted to be much higher – possibly three times the increase recorded over the 20th century.

The giant mountain lobelia (Lobelia rhynchopetalum) is a native of Ethiopia, a spectacular-looking tropical alpine plant that resembles a spiky tropical palm but then shoots up a huge woolly protuberance, sometimes more than 10 metres tall. Implausibly large in arid mountainous terrain, the lobelia family remarkably predates the formation of the tall mountains of eastern Africa to which it has adapted.

They are not finding it so easy to adapt to rapid anthropogenic climate change. A scientific study of the plant’s prospects last year concluded it “will suffer massive reduction in range” under warmer climes, with just 3.4% of its habitat still suitable by 2080. By then, it is predicted to be confined to just four suitable mountain-top habitats “which may be too small to sustain viable populations”.

There’s a further problem. As alpine species such as the giant mountain lobelia are confined to isolated mountaintops, their genetic diversity will narrow dramatically – by 82% – further increasing the likelihood of extinction.

The travails of this mountain giant are matched by mountain plant species around the world, including high-altitude species in Britain. Botanist Trevor Dines, of the charity Plantlife, says: “It’s already clear that some of our rarest Arctic-alpine plants, such as Highland saxifrage, are at risk. As the climate warms, they’re already moving to higher altitudes to find cooler, damper conditions. At some point, they’ll run out of mountain to climb and we’ll be facing the extinction of some of our most enigmatic and wonderful flora.”

For many creatures, climate change is the most vicious component of a perfect storm driving them towards extinction. For some, extinction is quite literally caused by storms and rising seas. Anthropogenic climate change has almost certainly driven the first mammal species to extinction. The Bramble Cay melomys (Melomys rubicola), or mosaic-tailed rat, lived the unobtrusive life of a small rodent in the eastern Torres Strait. It was first discovered – and killed – on the tiny vegetated coral island of Bramble Cay by Europeans in 1845. Several hundred lived there as recently as 1978.

But the highest point of Bramble Cay is three metres above sea level and around the Torres Strait the sea level rose at almost twice the global average rate between 1993 and 2014. Since 1998, the area of Bramble Cay above high tide has shrunk from 4 hectares to 2.5 hectares. The melomys has lost 97% of its habitat and was last seen by a fisherman in 2009.

Scientists laid traps in 2011 and twice in 2014 to catch the little rodent and start a captive breeding programme to save it from extinction. But they were too late: they couldn’t find any trace of the animal. There’s a small chance an as-yet-undiscovered population may survive in Papua New Guinea but the scientists have judged it is almost certainly extinct.

The Sierra Nevada blue (Polyommatus golgus) is a small butterfly that is both brilliant blue (the male) and dark black-brown (the female) and is one of four endangered species unique to Spain.
It is found only in the peaks of the Sierra Nevada and in another small mountainous area further north. It has already lost habitat to overgrazing by animals, a ski resort, and the trampling of vegetation by people on roads and footpaths. But its biggest threat is climate change, according to a species recovery plan drawn up by the researcher Miguel Munguira for Butterfly Conservation Europe. Drought, increased temperatures and reduced snow coverage are set to displace the species to higher areas where the habitat might not be suitable. “For the populations living on the highest areas of the mountains these changes would mean their extinction,” says Munguira.

Of the 482 species of butterfly in Europe, 149 are restricted to such small areas that it is difficult for scientists to assess how the changing climate will affect them. Isolation in such small pockets of land makes these rare insects hugely vulnerable – wild habitat is too fragmented for even winged creatures to easily find a suitable alternative. And those that can only live in northern Europe, or on the tops of mountain ranges, will be the first to go.

“The scale of threat to the species of Europe is massive,” says Nigel Bourn, conservation director of Butterfly Conservation in Britain. “I don’t really think policymakers have even begun to come to terms with that.” The disappearance of a few butterflies may not move the hardest of human hearts but these are the most closely monitored insects: the impact of climate change on hundreds of butterflies will be replicated in other less-known pollinators and insect populations – from bees to hoverflies – and the very fabric of life on earth will start to fray.

Rising seas and stormy weather will affect turtle species in the most direct of ways, eroding or destroying many of the beaches where they lay their eggs. But scientists have discovered that hotter sands also cause greater numbers of sea turtles to be born female. In the short term, over the next 20 or 30 years, this will increase turtle numbers. But a study published in Nature Climate Change examining the loggerhead turtles of Cape Verde in the Atlantic warns that significantly warmer sands in the next 150 years could cause such a preponderance of females that the species becomes extinct. Hotter sand can also cause complete nest failure.

Turtles are facing more problems than most animals: warming ocean temperatures will alter currents and shift the distribution and abundance of prey species. Species such as the hawksbill turtle are dependent on coral reefs, which are bleaching and dying with climate change.

The Adélie penguin is one of just two true Antarctic penguins, having survived on the ice-bound continent for 45,000 years. Now its survival is being questioned by scientists puzzling over the precise cause of sharp declines that correlate with a rapidly changing climate. Colonies of this little penguin on the West Antarctic Peninsula have declined by at least 80% since the 1970s, and this is an area with more years of warmer-than-average sea surface temperatures than other regions.

Changes in sea temperature and sea ice affect the availability of food, and where fish populations have fallen the penguins eat more krill, which is less nutritious. Nest sites may also become unsuitable: warming is creating premature melt and puddles on the ground, and eggs cannot survive lying in a pool of water. Most importantly, the Adélie penguin cannot survive without sea ice.
In a paper published last year, researchers predicted that 60% of the present habitat would be unsuitable for the penguins by 2099. But Adélie populations in the southernmost parts of Antarctica, where there have been fewer climatic and environmental changes, are much more stable. The Adélie has refugia, but for how long?

The polar bear may be the poster-creature of climate change victims but this equally attractive – and rather more timid – white furry mammal is much closer to the edge of extinction. This arboreal marsupial lives on the wooded slopes of Mount Lewis in the Daintree rainforest in Queensland, Australia, where scientists have judged it already “ecologically extinct”. The white lemuroid ringtail possum (Hemibelideus lemuroides) lives off leaf moisture, is found only in the high-altitude cloud forest and cannot survive temperatures above 30C for more than a few hours.

Mount Lewis, at less than 3,000 metres high, has a rapidly changing climate. A severe heatwave in 2005 killed off most of these cool-loving creatures. In July 2014, scientists observed four or five adults during 10 surveys. Even if the population bounces back, soon it will have nowhere left to go.

Genetic studies have never been carried out to determine whether the white possums are a separate species or simply colour variations of the brown-furred lemuroid ringtail possums, which appear to be able to survive higher temperatures. But Prof Bill Laurance of James Cook University has argued that the white form is “a unique evolutionary unit and therefore worthy of conservation”. It is also just one furry symbol of the “ecological catastrophe” that scientists warn will soon befall thousands of species that will find Australia’s tropical rainforests offer them no shelter in an era of warming.

The most commonly pictured victim of climate change is the polar bear clinging to a rapidly diminishing iceberg. But there is another vulnerable Arctic mammal that is just as photogenic and even more dependent upon Arctic sea ice for its survival. Climate change is driving polar bears from the safety of sea ice and on to hazardous dry land, and into more conflict with humans. But the ringed seal, the smallest Arctic seal species, cannot adapt to dry land so easily.

Ringed seals rest on sea ice, conceive beneath it, and give birth upon it, excavating snow dens on the surface of the sea ice to shelter their newborns. These dens keep the young warm but depend upon sufficient annual snowfall. Warmer spring temperatures cause snow dens to collapse and the ice to break up early, separating young – just 60cm long when born – from their mothers, and exposing them to the cold, predators and pathogens. Ringed seal reproductive rates are already showing declines associated with climate change.

Hundreds of pups are usually born each year on the fjords along the west coast of Svalbard but pups were “virtually non-existent” in 2006 and 2007, when many fjords did not freeze for the first time in recorded history. If ringed seal populations slump, there will be another victim, too: they are the prime food source of the polar bear.

Coral is not merely a living species; it’s a miraculous ecosystem engineer, building elaborate and beautiful underwater structures that provide food and shelter for so many other forms of life on Earth. Coral reefs are hailed as the “rainforest of the sea” but such analogies underplay their significance: they house a greater diversity of animal and plant life than rainforests.
Coral is being killed by climate change, and its extinction is coming sooner than that of many other creatures imperilled by it. Staghorn coral (Acropora cervicornis) is experiencing disastrous declines in its range in the southern Gulf of Mexico, Florida and the Bahamas, declining by up to 98% in parts of the Caribbean since the 1980s. It is listed as “critically endangered” on the IUCN red list.

Since 2005, the Caribbean region has lost 50% of its corals, largely because of rising sea temperatures and mass bleaching incidents, which have killed coral around the world. Species such as the orange-spotted filefish are completely dependent on coral reefs, and highly sensitive to warmer water.

Across the world, coral reefs are bleaching and dying: Japan’s government this year reported that almost three-quarters of its biggest coral reef has died, blaming rising sea temperatures caused by global warming. Australia’s Great Barrier Reef experienced the worst bleaching ever recorded by scientists in 2016.

Researchers at the US National Oceanic and Atmospheric Administration have predicted that by 2050 more than 98% of coral reefs around the world will be afflicted by “bleaching-level thermal stress” each year. They conclude that reefs, including the Great Barrier Reef, are unlikely to survive such events.

Homo sapiens is not dependent on coral reefs, but their loss would be a devastating and demoralising indictment of our era, and of the destructiveness of our species. “We’ll lose more species of plants and animals between 2000 and 2065 than we’ve lost in the last 65 million years,” environmentalist Paul Watson, the founder of Sea Shepherd, has pointed out. “If we don’t find answers to these problems, we’re gonna be victims of this extinction event that we’re at fault for.”


News Article | September 7, 2016
Site: news.mit.edu

Oceanographers from MIT and Woods Hole Oceanographic Institution report that the northeast Pacific Ocean has absorbed an increasing amount of anthropogenic carbon dioxide over the last decade, at a rate that mirrors the increase of carbon dioxide emissions pumped into the atmosphere.

The scientists, led by graduate student Sophie Chu, in MIT’s Department of Earth, Atmospheric, and Planetary Sciences, found that most of the anthropogenic carbon (carbon arising from human activity) in the northeast Pacific has lingered in the upper layers, changing the chemistry of the ocean as a result. In the past 10 years, the region’s average pH has dropped by 0.002 pH units per year, leading to more acidic waters. The increased uptake of carbon dioxide has also decreased the availability of aragonite — an essential mineral for many marine species’ shells.

Overall, the researchers found that the northeast Pacific’s capacity to store carbon is similar to that of the rest of the Pacific. However, this carbon capacity is significantly lower than at similar latitudes in the Atlantic.

“The ocean has been the only true sink for anthropogenic emissions since the industrial revolution,” Chu says. “Right now, it stores about 1/4 to 1/3 of the anthropogenic emissions from the atmosphere. We’re expecting at some point the storage will slow down. When it does, more carbon dioxide will stay in the atmosphere, which means more warming. So it’s really important that we continue to monitor this.” Chu and her colleagues have published their results in the Journal of Geophysical Research: Oceans.

The northeast Pacific, consisting of waters that flow from Alaska’s Aleutian Islands to the tip of southern California, is considered something of a climate canary — sensitive to changes in ocean chemistry, and carbon dioxide in particular. The region sits at the end of the world’s ocean circulation system, where it has collected some of the oldest waters on Earth and accumulated with them a large amount of dissolved inorganic carbon, which is naturally occurring carbon that has been respired by marine organisms over thousands of years. “This puts the Pacific at this already heightened state of high carbon and low pH,” Chu says.

Add enough atmospheric carbon dioxide into the mix, and the scales could tip toward an increasingly acidic ocean, which could have an effect first on sea snails called pteropods, which depend on aragonite (a form of calcium carbonate) to make their protective shells. More acidic waters can make carbonate less available to pteropods. “These species are really sensitive to ocean acidification,” Chu says. “It’s harder for them to get enough carbonate to build their shells, and they end up with weaker shells, and have reduced growth rates.”

Chu and her colleagues originally set out to study the effects of ocean acidification on pteropods, rather than the ocean’s capacity to store carbon. In 2012, the team embarked on a scientific cruise to the northeast Pacific, where they followed the same route as a similar cruise in 2001. During the month-long journey, the scientists collected samples of pteropods, as well as seawater, which they measured for temperature, salinity, and pH. Upon their return, Chu realized that the data they collected could also be used to gauge changes in the ocean’s anthropogenic carbon storage. Ordinarily, it’s extremely difficult to tease out anthropogenic carbon in the ocean from carbon that naturally arises from breathing marine organisms.
Both types of carbon are classified as dissolved inorganic carbon, and anthropogenic carbon in the ocean is minuscule compared with the vast amount of carbon that has accumulated naturally over millions of years.

To isolate anthropogenic carbon in the ocean and observe how it has changed through time, Chu used a modeling technique known as extended multiple linear regression — a statistical method that models the relationships between given variables, based on observed data. The data she collected came from both the 2012 cruise and the previous 2001 cruise in the same region. She ran a model for each year, plugging in water temperature, salinity, apparent oxygen utilization, and silicate. The models then estimated the natural variability in dissolved inorganic carbon for each year. That is, the models calculated the amount of carbon that should vary from 2001 to 2012 based only on natural processes such as organic respiration. Chu then subtracted the 2001 estimate from the 2012 estimate — a difference that accounts for sources of carbon that are not naturally occurring, and are instead anthropogenic.

The researchers found that since 2001, the northeast Pacific has stored 11 micromoles per kilogram of anthropogenic carbon, an increase comparable to the rate at which carbon dioxide has been emitted into the atmosphere. Most of this carbon is stored in surface waters. In the northern part of the region in particular, anthropogenic carbon tends to linger in shallower waters, within the upper 300 meters of the ocean. The southern region of the northeast Pacific stores carbon a bit deeper, within the top 600 meters. Chu says this shallow storage is likely due to a subpolar gyre, or rotating current, that pushes water up from the deep, preventing surface waters from sinking. In contrast, others have observed that similar latitudes in the Atlantic have stored carbon much deeper, due to evaporation and mixing, leading to increased salinity and density, which causes carbon to sink.

The team calculated that the increase in anthropogenic carbon in the upper ocean caused a decrease in the region’s average pH, making the ocean more acidic as a result. This acidification also had an effect on the region’s aragonite, decreasing its saturation state over the last decade.

Richard Feely, a senior scientist at the National Oceanic and Atmospheric Administration, says that the group’s results show that this particular part of the ocean is “highly sensitive to ocean acidification.” “Our own work with pteropods, and that of others, indicate that some marine organisms are already being impacted by ocean acidification processes in this region,” says Feely, who did not contribute to the study. “Laboratory studies indicate that many species of corals, shellfish, and some fish species will be impacted in the near future. As this study, and others, have shown, the region will soon become undersaturated with respect to aragonite later this century.”

While the total amount of anthropogenic carbon appears to be increasing with each year, Chu says the rate at which the northeast Pacific has been storing carbon has remained relatively constant since 2001. That means that the region could still have a good amount of “room” to store carbon, at least for the foreseeable future. But already, her team and others are seeing in the acidification trends the ocean’s negative response to the current rate of carbon storage.
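The eMLR differencing Chu describes is simple enough to sketch in code. Below is a minimal, illustrative Python version (not the study's actual code): it fits dissolved inorganic carbon (DIC) against the four predictors named above for each survey year, then differences the two fits. All numbers here are synthetic, with an 11-micromole-per-kilogram anthropogenic offset built in so the recovered signal can be checked.

import numpy as np

rng = np.random.default_rng(0)

def design(temp, sal, aou, sil):
    # Predictors named in the article: temperature, salinity,
    # apparent oxygen utilization (AOU) and silicate, plus an intercept.
    return np.column_stack([np.ones_like(temp), temp, sal, aou, sil])

def fit_dic(temp, sal, aou, sil, dic):
    # Ordinary least squares: DIC ~ a0 + a1*T + a2*S + a3*AOU + a4*Si
    coef, *_ = np.linalg.lstsq(design(temp, sal, aou, sil), dic, rcond=None)
    return coef

# Synthetic stand-ins for the two survey years (illustrative only).
n = 500
temp = rng.uniform(2, 15, n)
sal = rng.uniform(32, 35, n)
aou = rng.uniform(0, 250, n)
sil = rng.uniform(0, 150, n)
natural = 1950 + 2.0 * temp + 8.0 * sal + 0.6 * aou + 0.1 * sil
dic_2001 = natural + rng.normal(0, 3, n)
dic_2012 = natural + 11.0 + rng.normal(0, 3, n)  # +11 umol/kg anthropogenic

coef_2001 = fit_dic(temp, sal, aou, sil, dic_2001)
coef_2012 = fit_dic(temp, sal, aou, sil, dic_2012)

# Evaluating both fits on the same water properties and differencing
# removes the naturally varying part, leaving the anthropogenic signal.
delta = design(temp, sal, aou, sil) @ (coef_2012 - coef_2001)
print(f"estimated anthropogenic DIC change: {delta.mean():.1f} umol/kg")

Run on these synthetic inputs, the difference recovers roughly the built-in 11 micromoles per kilogram; with real hydrographic data, the same subtraction is what isolates the anthropogenic term from natural variability.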
“It would take hundreds of thousands of years for the ocean to absorb the majority of CO2 that humans have released into the atmosphere,” Chu says. “But at the rate we’re going, it’s just way faster than anything can keep up with.”

This research was supported in part by the National Science Foundation Ocean Acidification Program, the National Institute of Standards and Technology, and the National Science Foundation Graduate Research Fellowship Program.


News Article | November 22, 2016
Site: news.yahoo.com

Coal mines have the canary, endangered species have the panda bear, melting ice has the polar bear, and now sea level rise has … the octopus?

Climate change's impact on sea levels has made tidal flooding in Miami more severe, according to scientists. After the "supermoon" earlier this month triggered high tides, parts of Miami flooded and at least one sea creature was left far from home: an octopus that became stranded in a flooded parking garage, reported the Miami Herald. Miami resident Richard Conlin discovered the octopus, and shared images of the displaced sea creature on Facebook. According to Conlin, the octopus was brought home by building security officers, who returned the animal to the ocean in a bucket of water.

Marine biologist Kathleen Sullivan Sealey, from the University of Miami, told the Miami Herald that the cyclical "king tides" — a period of especially high tides caused by the alignment of the sun, Earth and moon's gravitational forces — were intensified by the supermoon and likely washed the octopus out of pipes underneath the garage. "When that much sea water comes in, the octopus is like 'What's this?' and goes to explore and ends up in a bad place," Sealey told the Miami Herald after examining the photos. She said the marooned octopus was either a small Caribbean reef octopus or a large Atlantic pygmy octopus.

Though the building's drainage pipes were designed safely above high-water marks, Sealey said rising sea levels have left some of the pipes partially submerged during very high tides, such as the king tide. These submerged pipes combine two of an octopus's favorite things, Sealey said: a cramped, dark space with fish to eat.

In his Facebook posts, Conlin noted that his building has been flooding more frequently. "This flooding to this extreme is new and gets worse each moon," he wrote. "In the past the floor of the garage would be ‘damp’ but this extreme flooding is new." Conlin added that every day for the past six months there has been "some type of water seepage in the garage."

Florida is especially at risk of flooding due to climate change. A recent study by the National Oceanic and Atmospheric Administration (NOAA) determined that about 13 million Americans could be affected by rising seas caused by climate change, and nearly half of them live in Florida. In Miami alone, a third of the county could be forced to relocate, according to the NOAA study. And sea creatures that wash ashore may become a more common occurrence, Sealey said, because ocean waters will be pushed deeper onto land more frequently due to rising seas. "The sea is moving in, so we have to share the space," Sealey said.


News Article | February 16, 2017
Site: phys.org

Two new instruments are scheduled to make their way to the station Feb. 18 on the SpaceX Dragon capsule. The Stratospheric Aerosol and Gas Experiment (SAGE) III instrument will monitor the condition of the ozone layer, which covers an area in the stratosphere 10 to 30 miles above Earth and protects the planet from the sun's harmful ultraviolet radiation. Its predecessors, SAGE I and SAGE II, which were mounted to satellites, helped scientists understand the causes and effects of the Antarctic ozone hole. The Montreal Protocol of 1987 led to an eventual ban on ozone-destroying gases and to the ozone layer's recovery; SAGE III, designed to operate for no less than three years, will allow scientists to continue monitoring its recovery.

The Lightning Imaging Sensor (LIS), first launched as an instrument on the Tropical Rainfall Measuring Mission in 1997, records the time, energy output and location of lightning events around the world, day and night. From its perch on the ISS, the new LIS will improve coverage of lightning events over the oceans and also in the northern hemisphere during its summer months. Because lightning is both a factor in and a gauge of a number of atmospheric processes, NASA as well as other agencies will use the new LIS lightning data for many applications, from weather forecasting to climate modeling and air quality studies.

While SAGE III and LIS are the latest Earth science instruments slated for operation aboard the ISS, they are not the first or the last. For two years, beginning in September 2014, the Rapid Scatterometer, or RapidScat, collected near-real-time data on ocean wind speed and direction. The instrument was designed as a low-cost replacement for the Quick Scatterometer, or QuikScat satellite, which experienced an age-related failure in 2009. In addition to addressing such questions as how changing winds affect sea surface temperatures during an El Niño season, the National Oceanic and Atmospheric Administration and the U.S. Navy relied on RapidScat data for improved tracking of marine weather, leading to better ship routing and hazard avoidance.

The Cloud Aerosol Transport System (CATS) was mounted to the exterior of the space station in January 2015 and is in the midst of a three-year mission to measure aerosols, such as dust plumes, wildfires and volcanic ash, around the world. Built to demonstrate a low-cost, streamlined approach to ISS science payloads, the laser instrument is providing data for air quality studies, climate models and hazard warning capabilities.

Over the next several years, NASA is planning to send to the space station several more instruments trained toward Earth. Total and Spectral solar Irradiance Sensor (TSIS-1) will measure total solar irradiance and spectral solar irradiance, or the total solar radiation at the top of Earth's atmosphere and the spectral distribution of that solar radiation, respectively. The data are critical for climate modeling and atmospheric studies. TSIS-1 will continue the work of NASA's Solar Radiation and Climate Experiment satellite, which has been taking those measurements since 2003.

NASA's Earth System Science Pathfinder program is supporting the following instruments that are currently in development. The program is managed by NASA's Langley Research Center in Hampton, Virginia. The Orbiting Carbon Observatory-3 (OCO-3) instrument will monitor carbon dioxide distribution around the globe.
Assembled with spare parts from the Orbiting Carbon Observatory-2 satellite, OCO-3 will provide insights into the greenhouse gas's role as it relates to growing urban areas and changes in fossil fuel combustion. The instrument will also measure the "glow" from growing plants (solar-induced fluorescence).

Homing in on tropical and temperate forests is the Global Ecosystem Dynamics Investigation (GEDI). The lidar instrument will provide the first high-resolution observations of forest vertical structure in an effort to answer how much carbon is stored in these ecosystems and also what impacts deforestation and reforestation have on habitat diversity, the global carbon cycle and climate change.

The ECOsystem Spaceborne Thermal Radiometer Experiment (ECOSTRESS) will also focus on vegetation by providing high-frequency, high-resolution measurements of plant temperature and plant water use. Among the data's numerous uses will be indicating regions of plant heat and water stress and improving drought forecasting for the benefit of farmers and water managers. Researchers will also use ECOSTRESS in concert with other data to calculate water use efficiency among plants and identify drought-resistant species and varieties.

Also on the horizon is the Climate Absolute Radiance and Refractivity Observatory (CLARREO) Pathfinder, comprising two instruments for measuring solar irradiance: a reflected solar spectrometer and an infrared spectrometer. CLARREO will collect highly accurate climate records to test climate projections in order to improve models.


News Article | April 25, 2016
Site: www.rdmag.com

Winter sports areas around the world employ some microscopic help when Mother Nature is being finicky with snow production. Pseudomonas syringae, an ice-active bacterium, boasts a high ice-nucleating ability. An inactive form of the bacterium is used in the commercial snow inducer product—Snomax.

Publishing in Science Advances, researchers from the Max Planck Institutes for Chemistry and for Polymer Research zeroed in on the molecular mechanism that gives this bacterium the ability to form ice. “With their ability to induce ice formation at temperatures just below the ice melting point, bacteria such as Pseudomonas syringae attack plants through frost damage using specialized ice-nucleating proteins,” the researchers wrote. “Besides the impact on agriculture and microbial ecology, airborne P. syringae can affect atmospheric glaciation processes, with consequence for cloud evolution, precipitation, and climate.”

Using a technique called sum frequency generation spectroscopy, the researchers glimpsed how the bacteria exerted influence over a nearby water network. “The interactions of specific amino-acid sequences of the protein molecules generate water domains with increased order and stronger hydrogen bonds,” according to the Max Planck Institutes. “Additionally, the proteins remove thermal energy from the water into the bacteria. As a result, water molecules can aggregate into ice crystals more easily.”

The bacterium is capable of inducing freezing in water droplets at negative 2 C. Mineral dust, or atmospheric aerosols, trigger ice formation only at temperatures below negative 15 C. Tobias Weidner, one of the study’s authors, told Popular Mechanics that pure-water droplets in the atmosphere sometimes won’t freeze until reaching negative 40 C.

According to The Verge, P. syringae has been found in snowfall around the world. The bacterium’s properties have led scientists to believe it is integral to cloud formation and rainfall. It is believed the bacteria are blown from the ground to the sky. However, their role in causing precipitation has never been established, Russ Schnell, of the National Oceanic and Atmospheric Administration, told the media outlet. The researchers said they hope to replicate the bacterial ice-nucleating mechanism and use it for other applications.


News Article | February 13, 2017
Site: www.theguardian.com

Scientists have discovered “extraordinary” levels of toxic pollution in the most remote and inaccessible place on the planet – the 10km-deep Mariana trench in the Pacific Ocean. Small crustaceans that live in the pitch-black waters of the trench, captured by a robotic submarine, were contaminated with 50 times more toxic chemicals than crabs that survive in heavily polluted rivers in China.

“We still think of the deep ocean as being this remote and pristine realm, safe from human impact, but our research shows that, sadly, this could not be further from the truth,” said Alan Jamieson of Newcastle University in the UK, who led the research. “The fact that we found such extraordinary levels of these pollutants really brings home the long-term, devastating impact that mankind is having on the planet,” he said.

Jamieson’s team identified two key types of severely toxic industrial chemicals that were banned in the late 1970s but do not break down in the environment, known as persistent organic pollutants (POPs). These chemicals have previously been found at high levels in Inuit people in the Canadian Arctic and in killer whales and dolphins in western Europe.

The research, published in the journal Nature Ecology and Evolution, suggests that the POPs infiltrate the deepest parts of the oceans as dead animals and particles of plastic fall downwards. POPs accumulate in fat and are therefore concentrated in creatures up the food chain. They are also water-repellent and so stick to plastic waste.

“The very bottom of the deep trenches like the Mariana are inhabited by incredibly efficient scavenging animals, like the 2cm-long amphipods we sampled, so any little bit of organic material that falls down, these guys turn up in huge numbers and devour it,” said Jamieson. He said it was not unexpected that some POPs would be found in the deepest parts of the oceans: “When it gets down into the trenches, there is nowhere else for it to go. The surprise was just how high the levels were – the contamination in the animals was sky high.”

The level of one type of POP, called polychlorinated biphenyls (PCBs), was equalled in the northwest Pacific only in Suruga Bay in Japan, an infamous pollution blackspot. The researchers also found severe contamination in amphipods collected in the Kermadec trench, which is 7,000km from the Mariana trench. The pollution was ubiquitous, found “in all samples across all species at all depths in both trenches”, the scientists said.

PCBs were manufactured from the 1930s to the 1970s, when their appalling impact on people and wildlife was realised. About a third of the 1.3m tonnes produced has already leaked into coastal sediments and the open oceans, with a steady stream still thought to be coming from poorly protected landfill sites.

An expedition conducted by the US National Oceanic and Atmospheric Administration last year also found various manmade items on the slopes leading to the Sirena Deep, part of the Mariana trench, and the nearby Enigma Seamount. They included a tin of Spam, a can of Budweiser beer and several plastic bags.

The results are both significant and disturbing, said Katherine Dafforn, a marine ecologist at the University of New South Wales in Australia who was not part of the research team: “The trenches are many miles away from any industrial source and suggests that the delivery of these pollutants occurs over long distances despite regulation since the 1970s.
“We still know more about the surface of the moon than that of the ocean floor,” Dafforn said. She said the new research showed that the deep ocean trenches are not as isolated as people imagine. “Jamieson’s team has provided clear evidence that the deep ocean, rather than being remote, is highly connected to surface waters. Their findings are crucial for future monitoring and management of these unique environments.” POPs cause a wide range of damage to life, particularly harming reproductive success. Jamieson is now assessing the impact on the hardy trench creatures, which survive water pressures equivalent to balancing a tonne weight on a fingertip and temperatures of just 1C. He is also examining the deep sea animals for evidence of plastic pollution, feared to be widespread in the oceans, which has been the focus of much recent attention, leading to bans on plastic microbeads in cosmetics in the UK and US. “I reckon it will be there,” he said. Jamieson said it had been positive that the dangers of POPs had been identified and their use ended but that plastic pollution presented a new concern for contamination of the oceans. “We’ve just done it again,” he said.


News Article | November 14, 2016
Site: www.theguardian.com

2016 will very likely be the hottest year on record and a new high for the third year in a row, according to the UN. It means 16 of the 17 hottest years on record will have been this century. The scorching temperatures around the world, and the extreme weather they drive, mean the impacts of climate change on people are coming sooner and with more ferocity than expected, according to scientists.

The World Meteorological Organization (WMO) report, published on Monday at the global climate summit in Morocco, found the global temperature in 2016 is running 1.2C above pre-industrial levels. This is perilously close to the 1.5C target included as an aim of the Paris climate agreement last December. The El Niño weather phenomenon helped push temperatures even higher in early 2016 but the global warming caused by the greenhouse gas emissions from human activities remains the strongest factor.

“Another year. Another record,” said WMO secretary-general, Petteri Taalas. “The extra heat from the powerful El Niño event has disappeared. The heat from global warming will continue.” “Because of climate change, the occurrence and impact of extreme events has risen,” he said. “‘Once in a generation’ heatwaves and flooding are becoming more regular.” The WMO said human-induced global warming had contributed to at least half the extreme weather events studied in recent years, with the risk of extreme heat increasing by 10 times in some cases.

“It is almost as if mother nature is making a statement,” said climate scientist Michael Mann, at Penn State University in the US. “Just as one of the planet’s two largest emitters of carbon has elected a climate change denier [Donald Trump] - who has threatened to pull out of the Paris accord - to the highest office, she reminds us that she has the final word.” “Climate change is not like other issues that can be postponed from one year to the next,” he said. “The US and world are already behind; speed is of the essence, because climate change and its impacts are coming sooner and with greater ferocity than anticipated.”

The record-smashing heat led to searing heatwaves across the year: a new high of 42.7C was recorded in Pretoria, South Africa in January; Mae Hong Son in Thailand saw 44.6C on 28 April; Phalodi in India reached 51.0C in May and Mitribah in Kuwait recorded 54.0C in July. Parts of Arctic Russia also saw extreme warming - 6C to 7C above average.

Arctic ice reached its equal second-lowest extent in the satellite record in September while warm oceans saw coral mortality of up to 50% in parts of Australia’s Great Barrier Reef. Extreme weather and climate-related events have damaged farming and food security, affecting more than 60 million people, according to the UN Food and Agriculture Organization. The level of CO2 in the atmosphere has also broken records in 2016, with May seeing the highest monthly value yet - 407.7 ppm - at Mauna Loa, in Hawaii.

The forecast for 2017 is another very hot year, but probably not a record breaker. “As the El Niño wanes, we don’t anticipate that 2017 will be another record-breaking year,” said Dr Peter Stott at the UK’s Met Office. “But 2017 is likely to be warmer than any year prior to the last two decades because of the underlying extent of [human-caused] warming due to the increasing atmospheric concentration of greenhouse gases.”

However, another analysis released at the UN summit in Morocco showed that global carbon emissions have barely grown in the last three years, following decades of strong growth.
The main reason is that China is burning less coal. Professor Corinne Le Quéré, of the University of East Anglia in the UK, who led the analysis, said: “This third year of almost no growth in emissions is unprecedented at a time of strong economic growth. This is a great help for tackling climate change but it is not enough. Global emissions now need to decrease rapidly, not just stop growing.”

The WMO’s temperature analysis combines the three main records, from the Met Office, Nasa and the National Oceanic and Atmospheric Administration, and stretches back to 1880.


News Article | March 4, 2016
Site: www.washingtonpost.com

You could be forgiven for not being able to keep up with whether scientists do, or don’t, think global warming “paused” during the early 2000s.

First, the U.N.’s Intergovernmental Panel on Climate Change told us, in a definitive 2013 report, that there had been a real slowdown of global warming over the past 15 years. It noted that the rate of warming during the period from 1998 through 2012 was “smaller than the rate calculated since 1951,” although the body also cautioned that “Due to natural variability, trends based on short records are very sensitive to the beginning and end dates and do not in general reflect long-term climate trends.” Nonetheless, this idea of a global warming slowdown or “pause” was endlessly cited by climate change skeptics and deniers circa 2013.

However, more recently, scientific reports have begun to come out challenging the notion. A dataset update from the National Oceanic and Atmospheric Administration, aimed at removing biases in the data, wiped out the “pause” — to much fanfare and controversy. “Newly corrected and updated global surface temperature data … do not support the notion of a global warming ‘hiatus,’” the study found. Other recent research, meanwhile, has suggested that the notion of a pause may represent a bias among scientists themselves, has been defined in curiously inconsistent ways by researchers – and in any event, always seems to go away if you simply analyze a long enough time period.

That’s what makes it so striking to find that this debate is very much not over — a group of top scientists has just published a paper in Nature Climate Change robustly defending the idea that, as they put it, “The observed rate of global surface warming since the turn of this century has been considerably less than the average simulated rate” produced by climate change models. The authors include noted climate researchers Gerald Meehl of the National Center for Atmospheric Research, Benjamin Santer of Lawrence Livermore National Laboratory, and Michael Mann of Penn State University. The research was led by John Fyfe of the Canadian Centre for Climate Modelling and Analysis at the University of Victoria.

The authors also argue that a large body of research into the causes of the apparent slowdown — which tended to target natural fluctuations, and especially the behavior of the Pacific Ocean — represents valuable work that advances our understanding of “a basic science question that has been studied for at least twenty years: what are the signatures of (and the interactions between) internal decadal variability and the responses to external forcings, such as increasing GHGs or aerosols from volcanic eruptions?”

To be sure, the researchers behind the current paper absolutely do not think that global warming is over or anything of the sort — rather, the argument is that there was a real slowdown that’s scientifically interesting, even if it was brief and is now probably over. After all, even if they paused, temperatures now seem to be rising again, with 2014 and 2015 setting back-to-back global temperature records.
Or as Ed Hawkins, one of the researchers and a scientist at the National Centre for Atmospheric Science at the University of Reading, put it on his blog when the paper emerged: “climate scientists agree that global warming has not ‘stopped’ – global surface temperatures and ocean heat content have continued to increase, sea levels are still rising, and the planet is retaining ~0.5 days of the sun’s incoming energy per year.”

Stephan Lewandowsky, a psychologist at the University of Bristol in the UK who has co-authored a series of papers questioning the pause, sent a comment in response to the new paper by John Fyfe and his colleagues. Lewandowsky said one reason matters are confusing is that there are multiple issues swirling here, and the main one he has taken issue with is the idea that global warming has stopped. A brief slowdown, if it existed, doesn’t show that. In contrast, Lewandowsky argues, the new paper underscores something different — it is investigating whether temperatures in the 2000s did or did not match what climate change models had expected. And the new study says they did not.

“As far as we are concerned,” Lewandowsky said by email, “there is no discrepancy between us and Fyfe et al. as we address two distinct scholarly questions–and they agree with us about the ‘warming didn’t stop part,’ which is the only thing we addressed.”

The new study set off a flurry of tweets by leading climate scientist Gavin Schmidt, director of NASA’s Goddard Institute for Space Studies, who argued that there is really no long-term letup in warming, but that nonetheless, it could be scientifically interesting to try to understand the reasons for a short-term one.

Still, there’s a giant elephant in the room. Researchers can draw nuances and distinctions about what they mean and don’t mean, but the problem here has really always been the un-nuanced way that “pause” research has been cited in the broader debate, in order to undermine the idea of human-caused climate change — even though none of these researchers think that it actually does that.

Indeed, the new publication is sure to re-awaken the highly politicized debate over the “pause” — so high profile that it was at the center of a recent congressional investigation, by Rep. Lamar Smith (R-Tex.) of the House Committee on Science, into the NOAA researchers who challenged the pause’s validity in 2015. A leading presidential candidate, Ted Cruz, also continues to make a pause-like argument to undermine the idea of climate change, although he takes a somewhat different tack, citing satellite data that also, he says, appear to suggest a “pause.”

Thus, in the end, scientists may well reconcile over the “pause” — acknowledging that it was real, brief, and interesting, but no reason not to worry about the planet — but the fact remains that for the rest of society, this debate was anything but purely scientific.
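The IPCC caveat quoted in this article, that trends based on short records are very sensitive to the beginning and end dates, is easy to demonstrate numerically. The following minimal Python sketch uses synthetic temperatures (a steady warming plus noise and an artificial 1998-style warm spike), not real observations, to show how shifting the start year of a short window changes the fitted trend:

import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1951, 2013)
# Steady 0.012 C/yr warming plus weather noise, with an artificial
# El Nino-like spike placed at 1998 (all values invented).
temps = 0.012 * (years - years[0]) + rng.normal(0.0, 0.08, years.size)
temps[years == 1998] += 0.2

def trend(y0, y1):
    # Ordinary least-squares trend over the window [y0, y1], in C/decade.
    m = (years >= y0) & (years <= y1)
    return 10.0 * np.polyfit(years[m], temps[m], 1)[0]

print(f"1951-2012: {trend(1951, 2012):+.3f} C/decade")  # long record
print(f"1998-2012: {trend(1998, 2012):+.3f} C/decade")  # starts on the spike
print(f"1996-2012: {trend(1996, 2012):+.3f} C/decade")  # start moved 2 years

Starting the window on the warm spike flattens the short-term trend while the long-window trend barely moves; this is the same effect that made 1998 such a popular start year in "pause" arguments.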


News Article | March 4, 2016
Site: news.yahoo.com

Damaging, deadly tornado clusters are becoming more common, a new study finds. Tornado clusters are outbreaks of twisters that span several days. One terrifying example is the April 25-28 outbreak in 2011, when some 350 tornadoes ripped across the south-central United States, killing more than 300 people. Outbreaks are responsible for 79 percent of tornado-related fatalities, said Michael Tippett, a climate and weather researcher at the School of Applied Science and Engineering and the Data Science Institute, both at Columbia University in New York.

Tippett's new research shows the number of tornadoes per outbreak is increasing. The analysis also discovered a fourfold increase in the chance of extreme outbreaks — when hundreds of tornadoes spawn in storms.

The researchers analyzed National Oceanic and Atmospheric Administration (NOAA) tornado records from 1954 to 2014. Outbreaks were counted when six or more EF-1 tornadoes started within six hours of each other, no matter the location. The scientists calculated the average number of tornadoes per outbreak, as well as variability — swings between high and low numbers of twisters — which relates to the chance of extreme outbreaks. The findings were published Feb. 29 in the journal Nature Communications. The study was coauthored by Joel Cohen, a mathematical population biologist and head of the Laboratory of Populations at Rockefeller University in New York and Columbia's Earth Institute. "These discoveries suggest that the risks from tornado outbreaks are rising far faster than previously recognized," Cohen told Live Science in an email interview.

The total number of tornadoes (rated EF-1 and above) per year remained steady since the 1950s, the study reported. The Enhanced Fujita scale, or EF scale, ranks tornadoes based on wind speeds and damage. A tornado with wind speeds between 86 and 110 mph (138 and 177 km/h) is usually rated an EF-1. The highest rating is an EF-5.

However, the average number of tornadoes per outbreak rose from about 10 in the 1950s to about 15 in the past decade. The variability around that average rose four times faster. This statistical link between the mean and the variance, known as Taylor's power law, has been observed in other fields but never before with severe weather, Tippett told Live Science in an email interview.

The new findings are consistent with several recent studies that suggest U.S. tornadoes are becoming more likely to strike in clusters. A NOAA study, published in October 2014 in the journal Science, showed a rise in the number of days with multiple reported tornadoes. Another study, published in July 2014 in the journal Climate Dynamics, found a similar clustering of tornadoes.

The researchers said they can't blame climate change for the uptick in tornado outbreaks. However, the warming planet could be shifting weather patterns across the United States and triggering more tornadoes. For instance, extreme weather systems that spawn storms are now more likely to get stuck in one place for several days.
Increasing warmth may also boost tornado outbreaks by sparking unstable weather earlier in the year. "We want to know what in the climate system is driving these changes. Some have implicated climate change. We think such a conclusion is premature, and further study is needed," Tippett said.
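The outbreak definition the study uses lends itself to a simple sequential-clustering rule. Here is a minimal Python sketch under one plausible reading of that definition: successive EF-1+ tornadoes no more than six hours apart belong to the same outbreak, and an outbreak needs at least six of them. The input format and the clustering rule are assumptions for illustration, not NOAA's or the authors' actual processing.

from datetime import datetime, timedelta

def count_outbreaks(start_times, min_size=6, gap=timedelta(hours=6)):
    """Cluster tornado start times sequentially; a new outbreak begins
    whenever the gap to the previous tornado exceeds `gap`. Returns the
    sizes of clusters with at least `min_size` tornadoes."""
    clusters = []
    current = []
    for t in sorted(start_times):
        if current and t - current[-1] > gap:
            clusters.append(current)
            current = []
        current.append(t)
    if current:
        clusters.append(current)
    return [len(c) for c in clusters if len(c) >= min_size]

# Toy example: eight EF-1+ tornadoes in quick succession, then a lull.
base = datetime(2011, 4, 27, 6, 0)
reports = [base + timedelta(hours=h) for h in (0, 1, 2, 3, 5, 8, 9, 10)]
reports.append(base + timedelta(days=3))  # isolated tornado, not an outbreak
print(count_outbreaks(reports))  # -> [8]

From the outbreak sizes produced this way, the mean and variance per period can then be compared on log-log axes to test the Taylor's power law scaling the study describes.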


News Article | December 21, 2016
Site: www.nature.com

A physicist helped to catch the first direct signs of long-sought gravitational waves. By Davide Castelvecchi

A year ago, Gabriela Gonzalez was struggling to contain the biggest secret of her life. Two giant detectors in the United States had picked up signs of gravitational waves — wrinkles in space-time imagined by Albert Einstein but never before directly witnessed. It was Gonzalez’s job to help lead more than 1,000 scientists in their careful efforts to verify the discovery before announcing it to the public.

News like that doesn’t stay under wraps for long, but the discovery was so momentous that the research team took nearly five months to analyse data from the two Laser Interferometer Gravitational-Wave Observatory (LIGO) detectors in Washington state and Louisiana. As spokesperson for the LIGO Scientific Collaboration, Gonzalez was one of the key people coordinating the analysis by groups scattered around the world, including researchers at the Virgo interferometer near Pisa, Italy, which pools its data with LIGO.

The role of shepherding this massive effort made use of Gonzalez’s multidimensional talents. Most physicists know early on whether they will be a theorist or an experimentalist. But Gonzalez started her graduate studies as a theoretical physicist and only later switched to experimental work, when she showed uncommon aptitude. “It was the thing that set her up as a first-class scientist,” says Rainer Weiss, a physicist at the Massachusetts Institute of Technology in Cambridge and one of the founders of LIGO.

Throughout her career, Gonzalez has done “a bit of everything” at LIGO, she says. For a while, she took on the crucial task of diagnosing the performance of the interferometers to make sure that they achieved unparalleled sensitivity — which is now enough to detect length changes in the 4-kilometre-long arms of the interferometers to within one part in 10²¹, roughly equivalent to the width of DNA compared with the orbit of Saturn. She has helped to lead the teams that analyse the data. And she nudged gravitational-wave researchers and dozens of their colleagues in conventional astronomy into signing pacts of cooperation. Together, they will look for phenomena that emit both gravitational and electromagnetic waves, in what has been called the coming age of multimessenger astronomy.

In the hectic months before announcing the LIGO discovery, Gonzalez and her colleagues struggled to make sure that they had iron-clad evidence. They knew that history had not been kind to those who had previously reported gravitational waves. Most recently, in early 2015, an international collaboration had to retract its claims that a telescope at the South Pole had discovered indirect signs of the long-sought vibrations.

To add to the pressure on the LIGO team, rumours of a discovery began to leak within a week of the initial finding, and reporters started to call. Throughout the long analysis period, Gonzalez says, she never made an important decision without consulting colleagues. But others laud her leadership. “What Gaby did is, she managed to get us through this period,” Weiss says.

Gonzalez is based at Louisiana State University in Baton Rouge, close to the LIGO interferometer in Livingston. In 2008, she became the first woman to receive a full professorship in her department. She says that she has never experienced outright sexual harassment or discrimination during her career, but “I had to prove myself perhaps more than other people”.
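A quick back-of-envelope check of that sensitivity figure, using assumed round numbers (DNA roughly 2 nanometres across; Saturn's orbital radius roughly 1.4 × 10¹² metres):

\[
\Delta L \approx 4\,\mathrm{km} \times 10^{-21} = 4 \times 10^{3}\,\mathrm{m} \times 10^{-21} = 4 \times 10^{-18}\,\mathrm{m},
\qquad
\frac{2 \times 10^{-9}\,\mathrm{m}}{1.4 \times 10^{12}\,\mathrm{m}} \approx 1.4 \times 10^{-21},
\]

so the DNA-to-Saturn's-orbit comparison is indeed of the same order as one part in 10²¹.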
Gonzalez has said that after her current term as LIGO spokesperson ends in March 2017, she will not run again. She plans to go back to full-time research. The field of science she helped to create — gravitational-wave astronomy — has just seen its dawn. “It has always been a fun ride. And now it’s even better.”

An AI developer beat one of the best at Go. Next up, solve global problems. By Elizabeth Gibney

For veteran gamer Demis Hassabis, March brought the toughest match of his life — and he wasn’t even playing. Hassabis had to watch from the sidelines as his team’s creation, the computer program AlphaGo, took on Lee Sedol, a top-ranked champion in the strategy game Go. The computer won, marking a huge victory for the field of artificial intelligence (AI) and another in a series of triumphs for Hassabis.

As co-founder of DeepMind, the London-based firm that developed AlphaGo, Hassabis was both elated and relieved. “It felt like our moonshot, and it was successful,” he says. But the win was about much more than Go. Hassabis wanted to show the world the power of machine-learning techniques, which he hopes to someday harness in a human-like, general AI capable of solving complex global problems.

Hassabis had sketched this vision out as a precocious youth. A chess prodigy, he began designing innovative, multimillion-selling video games while in his teens and started his own company in his early 20s. After completing a PhD in cognitive neuroscience, he founded DeepMind in 2010. Google bought the company 4 years later for a reported £400 million (more than US$650 million at the time).

At the firm, researchers apply inspiration from neuroscience to eye-catching AI tasks, from synthesizing speech to navigating the London Underground. Each algorithm builds complexity on to the last, says Hassabis, and weaves in capabilities that have historically been developed separately in AI. DeepMind AIs have gone from learning how to see, and acting on that vision, to using it to plan and reason. In terms of real-world problem-solving, the team used machine learning to cut power usage in Google’s data centres by 15%, something that Hassabis hopes to apply on a much grander scale.

Although the company’s researchers do publish, their work-in-progress is kept under wraps, which irks some academics. And some data-privacy advocates have concerns over Google DeepMind’s plans to collaborate with the UK National Health Service. Scientists, however, have been flocking to work at the company.

In person, Hassabis is unassuming but eager. He has a knack for swaying others to his passion, says Eleanor Maguire, his former PhD supervisor at University College London. “Once he gets talking about something he’s interested in, it’s infectious,” she says. Fitting research alongside running the company now means saving science for the small hours of the morning, something Hassabis says he doesn’t mind. “It’s a very important mission that we’re on, and I think it’s worth the sacrifice.”

A coral researcher sounded the alarm over massive bleaching at the Great Barrier Reef. By Daniel Cressey

When Terry Hughes flew over the Great Barrier Reef in March, his heart sank at the sight of telltale pale patches just below the surface, where corals were dead or dying. Hughes, director of the Australian Research Council’s (ARC’s) Centre of Excellence for Coral Reef Studies in Townsville, says that he and his students wept after looking at the aerial surveys of the damage.
The bleaching hit nearly all of the reef, with initial surveys showing 81% of the northern section suffering severely. It was the most devastating bleaching ever documented on the Great Barrier Reef — and part of a wider event that was harming corals across the Pacific. The trigger for this year's coral troubles in the Pacific was a strong El Niño warming pattern in the tropical part of that ocean. Abnormally high water temperatures prompt corals to expel the symbiotic zooxanthellae algae that provide them with much of their food — and their colour. Some corals can recover after bleaching, but others die. Follow-up studies in October and November found that 67% of shallow-water corals in the 700-kilometre northern section of the Great Barrier Reef had died. When the massive El Niño reared up in the Pacific in 2015, Australian researchers feared that the country's reefs could be in danger. So Hughes, one of the world's leading coral researchers, assembled a task force ready to survey the reef if bleaching occurred. The group eventually expanded to 300 scientists. "We put together a very detailed research plan, hoping of course that it wouldn't happen," he says. Hughes is based close to the central portion of the Great Barrier Reef. After leading the initial surveys, he became the de facto spokesperson on the catastrophe. At the height of media interest in the bleaching, Hughes did 35 interviews in one day. "In Australia, even people who have never been to the Great Barrier Reef and might never go there regard it as an icon," says Bob Pressey, a fellow researcher at the ARC centre. The crisis on the reef defied some rules. Conventional thinking on bleaching events, says Hughes, is that corals die slowly from starvation after their zooxanthellae leave. But this year, water temperatures were so high that "we saw a lot of corals die before the starvation kicked in. They actually cooked." Corals throughout the world have struggled in the past couple of years, as global temperatures have repeatedly hit record highs. In October 2015, the US National Oceanic and Atmospheric Administration declared that a global bleaching event was happening as coral reefs in Hawaii, Papua New Guinea and the Maldives began to succumb. This year, the bleaching spread to Australia, Japan and other parts of the Pacific. Researchers say that, as climate change drives up baseline temperatures, bleaching will afflict reefs more frequently. Under some scenarios, this could happen so often that most corals can no longer survive. Hughes is not ready to give up on the Great Barrier Reef just yet. But the recent bleaching has left corals in a weakened state, prone to attacks from pathogens and predators. Another bleaching event in the near future could bring further damage. "The message to people," he says, "should be we've got a closing window of opportunity to deal with climate change." An atmospheric chemist laid the foundation for an international climate agreement. By Jeff Tollefson It isn't often that atmospheric chemists get to help save the world, but Guus Velders had his chance in October. He was attending international negotiations in Kigali, Rwanda, that were seeking to phase out production and use of hydrofluorocarbons (HFCs), extremely potent greenhouse gases commonly used in air conditioners. Most nations had agreed on an aggressive timetable to begin eliminating the compounds, but India and a handful of other countries wanted an extra four years.
After plugging the numbers into a model on his laptop computer, Velders informed negotiators that this particular concession would have little impact on the planet. That and his earlier work helped to smooth the way for a widely hailed global accord, which was signed on 15 October. Velders, a soft-spoken researcher at the National Institute for Public Health and the Environment in Bilthoven, the Netherlands, is proud of the part he played. "I've never been involved in a process that leads to a global agreement on climate before," he says. It was no coincidence, however. Colleagues say that Velders has become the world's expert on HFC emissions, and that nobody else could have provided such rapid analysis in Kigali. He is part of a community of scientists that has helped to refashion the 1987 Montreal Protocol — an international agreement designed to protect the stratospheric ozone layer — into a tool with which to fight global warming. The refrigerants that fall within the scope of the protocol are also powerful greenhouse gases, and Velders' team showed that the Montreal agreement actually did more to control global temperatures than did the 1997 Kyoto Protocol climate treaty. More recently, the team projected how much warming HFCs were likely to cause over the twenty-first century. That helped to set the stage for the agreement on HFCs, which was reached as an amendment to the Montreal Protocol. "The Velders team always answered the right questions at the right time," says Durwood Zaelke, president of the Institute for Governance & Sustainable Development, an advocacy group in Washington DC. "It's safe to say that we wouldn't have this agreement without them." Now it's back to the drawing board for Velders' team. Their scenario about how HFC emissions would grow over time was rendered obsolete by the new agreement to ban them. That's the kind of intellectual setback that Velders heartily accepts. A physician raced to make sense of a medical mystery in northeast Brazil. By Declan Butler Fears about the Zika virus spread across the globe in 2016, and the epicentre of concern was Brazil, where the epidemic first appeared in the Americas. Some researchers even called for postponing the Olympic Games scheduled for Rio de Janeiro in August that year. But away from the media frenzy, Celina Maria Turchi Martelli battled on the front lines in northeast Brazil to make sense of the medical mystery there. Turchi, a physician and infectious-disease expert, has had her life turned upside down by Zika since September 2015. That's when the ministry of health asked her to investigate a sharp rise in reports of babies born with abnormally small heads and brains, a condition known as microcephaly, in her home state of Pernambuco. She quickly became convinced that the country was facing a public-health emergency. "Not even in my worst nightmare as an epidemiologist had I imagined a microcephaly neonate epidemic," she says. Turchi, who is based at the Aggeu Magalhães Research Center in Recife, immediately contacted scientists across the globe for help. She formed a networked task force of epidemiologists, infectious-diseases experts, paediatricians, neurologists and reproductive biologists. The challenges were formidable, says Turchi: there were no reliable lab tests for Zika, and there was no consensus on a case definition of microcephaly.
But the intense networking paid off, and Turchi and her colleagues eventually generated enough evidence to demonstrate a link between the condition and infection with Zika in the first trimester of pregnancy. Still, the mysteries are far from solved, says Turchi. Although Zika has spread across the Americas, the expected explosion in the number of microcephaly cases outside northeast Brazil has not materialized. Turchi and her task force are now trying to work out why. When she started going into the hospitals of Recife to investigate the outbreak, Turchi says, she had to innovate. "There was no book to follow." Now, she and her colleagues are writing that book. The founder of an illegal hub for paywalled papers has attracted litigation and acclaim. By Richard Van Noorden It took Alexandra Elbakyan just a few years to go from information-technology student to famous fugitive. In 2009, when she was a graduate student working on her final-year research project in Almaty, Kazakhstan, Elbakyan became frustrated at being unable to read many scholarly papers because she couldn't afford them. So she learnt how to circumvent publishers' paywalls. Her skills were soon in demand. Elbakyan saw scientists on web forums asking for papers they couldn't access — and she was happy to oblige. "I got thanked many times for sending paywalled papers," she says. In 2011, she decided to automate the process and founded Sci-Hub, a pirate website that grabs copies of research papers from behind paywalls and serves them up to anyone who asks. This year, interest in Sci-Hub exploded as mainstream media cottoned on to it and usage soared. According to Elbakyan's figures, the site now hosts around 60 million papers and is likely to serve up more than 75 million downloads in 2016 — up from 42 million last year and, by one estimate, encompassing around 3% of all downloads from science publishers worldwide. It is copyright-breaking on a grand scale — and has brought Elbakyan praise, criticism and a lawsuit. Few people support the fact that she acted illegally, but many see Sci-Hub as advancing the cause of the open-access movement, which holds that papers should be made (legally) free to read and reuse. "What she did is nothing short of awesome," says Michael Eisen, a biologist and open-access supporter at the University of California, Berkeley. "Lack of access to the scientific literature is a massive injustice, and she fixed it with one fell swoop." For the first few years of its existence, the site flew under the radar — but eventually it grew too big for subscription publishers to ignore. In 2015, the Dutch company Elsevier, supported by the wider publishing industry, brought a US lawsuit against Elbakyan on the basis of copyright infringement and hacking. If Elbakyan loses, she risks having to pay many millions of dollars in damages, and potentially spending time in jail. (For that reason, Elbakyan does not disclose her current location and she was interviewed for this article by encrypted e-mail and messaging.) In 2015, a US judge ordered Sci-Hub to be shut down, but the site popped up on other domains. It's most popular in China, India and Iran, she says, but a good 5% or so of its users come from the United States. Elbakyan has found her name splashed across newspapers, and says she typically gets a hundred supportive messages a week, some with financial donations. She says she feels a moral responsibility to keep her website afloat because of the users who need it to continue their work.
“Is there anything wrong or shameful in running a research-access website such as Sci-Hub? I think no, therefore I can be open about my activities,” she says. Critics and supporters alike think that the site will have a lasting impact, even if it does not last. “The future is universal open access,” says Heather Piwowar, a co-founder of Impactstory, a non-profit firm incorporated in Carrboro, North Carolina, which helps scientists track the impact of their online output. “But we suspect and hope that Sci-Hub is currently filling toll-access publishers with roaring, existential panic. Because in many cases that’s the only thing that’s going to make them actually do the right thing and move to open-access models.” Whether or not that’s true, Elbakyan says she will keep building Sci-Hub — in particular, to expand its corpus of older manuscripts — while studying for a master’s degree in the history of science. “I maintain the website myself, but if I’m prevented, somebody else can take over the job,” she says. Shock, anger, scepticism and congratulations. Those were some of the reactions that fertility specialist John Zhang triggered in the scientific community in September, when he announced that a controversial technique that mixes DNA from three people had been used to produce a healthy baby boy. This kind of technique is intended to prevent children from inheriting disorders involving mitochondria — the cellular structures that produce energy. But ethical and safety concerns have prompted the United States to ban such procedures without a permit. Zhang, who works at New Hope Fertility Center in New York City, performed the technique at the company’s clinic in Mexico. Critics saw this as an attempt to evade regulation, and complained that he had announced the work at a conference rather than in a publication. But Zhang brushes aside those objections. “The most important is to have a live-birth baby, not to tell the whole world,” he says. Zhang has a habit of pushing scientific and ethical boundaries. In the 1990s, he worked with reproductive endocrinologist Jamie Grifo at the New York University Langone Medical Center to develop a version of the technique that Zhang used this year. The approach was designed to help older women to become pregnant by replacing their ageing mitochondria with those from younger eggs. No successful pregnancies resulted. When US regulators began restricting this technique in 2001, Zhang and his collaborators in China took over the work. In 2003, Zhang’s team created and implanted multiple embryos into a woman. After all the fetuses were miscarried, China banned the technique as well. Grifo and some others applaud Zhang’s latest work. “I think it’s a great thing it was finally done,” says Grifo. But others have criticized the New Hope team. “A lot of things they did were completely unsafe,” such as infusing the donor’s egg with a drug that could cause chromosomal abnormalities, says Shoukhrat Mitalipov, a stem-cell scientist at Oregon Health & Science University in Portland. Zhang is undeterred. He says that plenty of other families at risk of mitochondrial disease have expressed interest in his procedure, and he hopes to perform it in other countries. “Five to ten years from today, people will look at it and say, ‘Why were we all so stupid, why were we against it?’” he says. “I think you have to show the benefit to mankind.” It was a trip to the Galapagos Islands at the age of ten that first whetted Kevin Esvelt’s appetite for tinkering with evolution. 
As he stood marvelling at the iguanas, birds and sheer diversity of the place that had inspired Charles Darwin, Esvelt vowed to understand evolution — and improve on it. "I wanted to learn more about how these creatures came to be," he says. "And, frankly, I wanted to make more of my own." Today, Esvelt is still a precocious biologist. Less than a year after launching his lab at the Massachusetts Institute of Technology Media Lab in Cambridge, he has already made a name for himself as one of the pioneers of a controversial technique called a gene drive. His method harnesses CRISPR–Cas9 gene editing to circumvent evolution, forcing a gene to spread rapidly through a population. It could be used to wipe out mosquito-borne diseases such as malaria or eradicate invasive species. But it could also set off unintended ecological chain reactions, or be used to create a biological weapon. The idea of CRISPR gene drives hit Esvelt when he was tinkering with the Cas9 enzyme in 2013. "I had one day of absolute, ecstatic glee: this is what's going to let us get rid of malaria," says Esvelt. "And then I thought, 'Wait a minute.'" Following that thought, Esvelt has worked to ensure that ethics comes before experiments. He first sounded the alarm in 2014, calling for public discussion about gene drives even before he had demonstrated that a CRISPR–Cas9 gene drive could work (K. A. Oye et al. Science 345, 626–628; 2014; K. M. Esvelt et al. eLife 3, e03401; 2014). Since then, he and his colleagues have shown how gene drives might be made safer, and how they could be reversed (J. E. DiCarlo et al. Nature Biotechnol. 33, 1250–1255; 2015). This year, his advocacy has begun to bear fruit. Researchers and policymakers worldwide have been discussing the technology, and a report from the US National Academies of Sciences, Engineering, and Medicine urged that gene-drive research proceed, but cautiously. Omar Akbari, who studies gene drives at the University of California, Riverside, believes Esvelt's outreach has focused public attention — and attracted funding — for a nascent technology at just the right time. "I attribute that to Kevin," says Akbari. "It's difficult for a scientist to do what he's done." An astronomer detected the nearest known planet outside the Solar System. By Alexandra Witze Guillem Anglada-Escudé wasn't surprised early this year when evidence of an alien world rippled across his computer screen. He had been almost certain that an Earth-sized planet orbited Proxima Centauri, the star nearest the Sun at just 1.3 parsecs (4.2 light years) away. To Anglada, an astronomer at Queen Mary University of London, the discovery came as more of a relief than a shock. He and his colleagues had been working feverishly to stake their claim in the competitive world of planet hunting, and the Proxima find confirmed that they were on the right path. "We made it," he says. To the rest of the world, the discovery of the closest known exoplanet to Earth stoked the public imagination. It raised questions about whether life might exist in our cosmic backyard, and whether astronomers might be able to detect it. These are the kinds of question that got Anglada into planet hunting in the first place. A science-fiction fan while growing up near Barcelona, Spain, he got his astronomical start doing data simulations for Gaia, a European Space Agency mission to map 1 billion stars. Later, he turned his data-crunching skills to exoplanets.
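For a rough sense of the scales involved in this kind of planet hunting, Kepler's third law converts an orbital period into an orbital distance. A minimal sketch in Python, using the 11.2-day period for Proxima b reported below and an assumed stellar mass for Proxima Centauri of about 0.12 solar masses (the mass is our assumption; it does not appear in the article):

```python
import math

# Kepler's third law: a^3 = G * M_star * P^2 / (4 * pi^2)
G = 6.674e-11             # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30          # solar mass, kg
AU = 1.496e11             # astronomical unit, m

M_star = 0.12 * M_SUN     # assumed mass of Proxima Centauri (not given in the article)
P = 11.2 * 86_400         # orbital period of Proxima b in seconds, as reported below

a = (G * M_star * P**2 / (4 * math.pi**2)) ** (1 / 3)
print(f"Orbital distance: {a / AU:.3f} AU")   # ~0.05 AU, far inside Mercury's orbit
```

Such a tight orbit can still sit in the habitable zone only because Proxima is so much dimmer than the Sun.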
He developed a method for extracting faint planetary signals from data gathered by the world’s premier ground-based planet-hunting instrument, the High Accuracy Radial velocity Planet Searcher (HARPS) at the European Southern Observatory in La Silla, Chile. “Guillem has a natural talent of seeing the big picture where others see details,” says Mikko Tuomi, an astronomer at the University of Hertfordshire in Hatfield, UK, and a collaborator of Anglada’s. But Anglada soon ran straight into high academic drama, tussling with other researchers over who deserved credit for discovering a planet bigger than Earth and smaller than Neptune orbiting the star Gliese 667C. “I could have left the field and done something else,” he says. “But I took the decision of following it very aggressively.” He dived into HARPS data, publishing paper after paper on the planetary signals he discovered amid the background noise in the data. And then, as if to push back on all the secrecy and competition, Anglada launched a very public hunt for a planet orbiting Proxima. He put together a team and got observing time on HARPS, as well as other telescopes that could double-check whether any promising evidence that they found was caused by stellar activity, which can mimic the signs of a planet (a problem that plagues many exoplanet claims). The researchers put nearly all their details on an outreach website and social-media accounts. Being so transparent “didn’t seem dangerous at all”, Anglada says. “We had a feeling nobody else would do this.” Within days, they confirmed that the planet was there; within weeks, they submitted a manuscript detailing their discovery. The planet, called Proxima b, is at least 1.3 times the mass of Earth and orbits Proxima every 11.2 days. Although it is close to its star, the world is within the ‘habitable zone’, where liquid water could exist on its surface. That makes it not only the closest known exoplanet of the 3,500-plus confirmed so far, but also a place where otherworldly life could thrive — a double bonus for researchers and science-fiction fans alike. Just before the paper was published in Nature in August (G. Anglada-Escudé et al. Nature 536, 437–440; 2016), Anglada e-mailed British sci-fi writer Stephen Baxter, author of the novel Proxima (Gollancz, 2013). They corresponded about what life might be like on a world with one hemisphere permanently facing a flaring star, as happens at Proxima. People could eventually get a close-up look at Proxima b. The Breakthrough Starshot initiative aims to send fleets of tiny laser-propelled spacecraft to a nearby star, and it may target Proxima as its closest and best option. Anglada’s next step is to see whether Proxima b transits, or passes across the face of its star as seen from Earth. The chances are low, but if it does, then much more science can be gleaned when Proxima’s light passes through the planet’s atmosphere, if it has one. And if the transit does not happen? Then Anglada may be off, to tease out some other signal of another world. A transgender physicist paved the way for greater acceptance of minority groups. By Elizabeth Gibney Physicists can be open to seeing the world in new ways, but they need to see the data first. This posed a problem for Elena Long, a nuclear physicist who has fought for her field to be more inclusive of people from sexual and gender minorities. “We didn’t have any data, because people considered it too offensive to ask if we exist. 
It was a catch-22." Long was one of the architects of a first-of-its-kind survey run by the American Physical Society (APS), charting the experiences of physicists who are lesbian, gay, bisexual, transgender or from another sexual or gender minority (LGBT). The findings, presented to a packed room at the APS March meeting this year, were stark. Of the 324 scientists who responded, more than one in five reported having been excluded, intimidated or harassed at work in the previous year. Transgender physicists reported the highest incidence of discrimination. Long, who is transgender herself, was unsurprised. In 2009, she began work for her PhD at the Thomas Jefferson National Accelerator Facility in Newport News, Virginia, which lacked trans-inclusive employment protections and health-care benefits. She felt isolated without LGBT support networks. "I loved the work I was doing, and I loved the research. But it was rough," she says. So she founded the LGBT+ Physicists support group and began pushing for greater recognition at the APS, which eventually created a committee to collect data on LGBT discrimination. Many physicists, she says, could not even understand the need for such a study. Thanks to Long and her colleagues, physics is emerging as exemplary in its approach to these issues, says Samuel Brinton, a board member of the society Out in Science, Technology, Engineering and Mathematics. "We are literally using their work to start changes for the better in multiple fields," he says. The APS accepted the recommendations made in the March report. And in August, a major APS division voted to move its 2018 meeting out of Charlotte, North Carolina, in response to a state law that forces people to use public toilets that match the gender they were assigned at birth. Long has meanwhile won two young-scientist awards offered by her lab and become a co-leader on two new accelerator experiments. "I've known a lot of postdocs who've done voluntary work, and usually it compromises their science," says Karl Slifer, Long's postdoctoral supervisor at the University of New Hampshire in Durham. "I've never seen that in Elena." (Long attributes her strict time management to a computer program she designed that charts every hour of her day.) Now Long is helping to set up an APS membership group focusing on diversity and inclusion, which she hopes will make it easier for scientists in other minority groups to flourish. "I'm sure there are other people facing problems in the field I never thought about," she says. "I don't want them to wait seven years to get to a place where they can have a voice." Bargmann is steering the research operations of a US$3-billion effort by the philanthropic Chan Zuckerberg Initiative to cure, prevent or manage all disease by 2100. As the new head of the European XFEL, the world's most powerful X-ray free-electron laser, Feidenhans'l will guide the €1.2-billion (US$1.3-billion) facility during its ramp-up to becoming fully operational by mid-year. Boeke is a director of an ambitious effort that is seeking to synthesize the human genome. He and others are already close to making a yeast genome. China's plans call for launching the Chang'e-5 mission in the latter half of 2017 to collect the first lunar rock samples to be brought back to Earth since the 1970s. With her experience leading the US Geological Survey during President Barack Obama's administration, McNutt, now president of the US National Academy of Sciences, will have a central role in representing US science during Donald Trump's presidency.


News Article | April 5, 2016
Site: www.biosciencetechnology.com

There's one last orca birth to come at SeaWorld, and it will probably be the last chance for research biologist Dawn Noren to study up close how female killer whales pass toxins to their calves through their milk. While SeaWorld's decision last month to end its orca breeding program delighted animal rights activists, it disappointed many marine scientists, who say they will gradually lose vital opportunities to learn things that could help killer whales in the wild. Noren got to observe only one mother-and-calf pair at a SeaWorld park before the end of the breeding program was announced. "It's really difficult to publish with one. I really was hoping for a couple more, but that is what it is," said Noren, who works at the National Marine Fisheries Service's Northwest Fisheries Science Center in Seattle. SeaWorld's 29 orcas at its parks in Orlando, San Diego and San Antonio could remain on display for decades to come and will continue to be available for study by outside scientists, as they generally have been for many years. The whales are 1 to 51 years old. But as SeaWorld's orca population dwindles, researchers will lose chances to collect health data and make other observations, such as drawing blood, measuring their heart rates and lung capacity, and documenting their diets and their growth. As the animals age, scientists say, research will be limited to geriatric orcas. No other marine park or aquarium in the world has SeaWorld's experience in maintaining or breeding orcas in captivity. SeaWorld parks hold all but one of the orcas in captivity in the U.S., and they have housed more than half of all captive killer whales in the world tracked by the National Oceanic and Atmospheric Administration over the past 50 years. Orcas held in Canada, Japan and Europe have not been as accessible to researchers. SeaWorld will continue to support research projects underway on hearing, heart rates and blood, said Chris Dold, SeaWorld's chief zoological officer. "There won't be an immediate crunch," he said. But he acknowledged: "Over time, yeah, there's a loss of this resource to society and science." SeaWorld's critics, including People for the Ethical Treatment of Animals and WDC/Whale and Dolphin Conservation, sidestepped questions of whether outside researchers will suffer. But they said SeaWorld's own research has been unhelpful to orcas in the wild. "SeaWorld has had the largest population of orcas and has had the opportunity to do useful research and had done none of that," said Jared Goodman, PETA's director of animal law. Researchers outside SeaWorld argue they need its facilities and 1,500 employees in animal care to answer questions about wild orca behavior. "If you want to interact with them and conduct research, the combination of talent you have to have is a scientist with a research question, animals that are healthy so that you're looking at normal physiological rates, and in between that are the trainers - and I think people miss that," said Terrie Williams, who runs the Center for Marine Mammal Research and Conservation at University of California, Santa Cruz. SeaWorld's decision to end orca breeding and phase out its world-famous killer whale performances by 2019 followed years of protests and a drop in ticket sales at its parks. The backlash intensified after the 2013 release of "Blackfish," a documentary that was critical of SeaWorld's orca care and focused on an animal that killed a trainer during a performance in Orlando in 2010.
In the wake of SeaWorld's announcement, some researchers fear that lawmakers on Capitol Hill and in states such as Washington and California will ban breeding or keeping of killer whales altogether. Similar bans targeting other species would have stymied the captive breeding that revived the California condor, said Grey Stafford, incoming president of the International Marine Animal Trainers' Association. "Those bills can have unforeseen and unintended consequences if and when the next species has a population crash in the wild. It ties the hands of state agencies and sanctuaries and places like SeaWorld to act," Stafford said.


News Article | March 21, 2016
Site: news.yahoo.com

[Image caption: The sun is captured in a "starburst" mode over Earth's horizon by one of the Expedition 36 crew members aboard the International Space Station, as the orbital outpost was above a point in southwestern Minnesota on May 21, 2013.] Rush Holt is CEO of the American Association for the Advancement of Science (AAAS) and executive publisher of Science and its family of journals. Chris Field is director of the Carnegie Institution's Department of Global Ecology and a professor for interdisciplinary environmental studies at Stanford University. The authors contributed this article to Live Science's Expert Voices: Op-Ed & Insights. Multiple lines of well-established evidence point to the reality of human-caused climate change. The impacts are now apparent — and range from rising sea levels to increased weather extremes, including more severe storms, droughts, heat waves and wildfires. In response, the world's nations came together late last year at the U.N. Climate Change Conference in Paris with a commitment to fix the problem. Yet, back in the United States, Rep. Lamar Smith, R-Texas — as Chairman of the Science, Space, and Technology Committee — continues to call for "all documents and communications" related to research by a team from the National Oceanic and Atmospheric Administration (NOAA) that seemed to debunk the notion of a global warming slowdown, or "pause." Such efforts, which came up again when NOAA Administrator Kathryn Sullivan testified March 16 before the House Subcommittee on Environment, are little more than a red herring. In other words, they distract Americans from the primary point: that climate change is real, it's happening now and it's caused mostly by human activities such as fossil-fuel burning and deforestation. This is not the first time climate researchers have had to cope with ill-considered requests for emails and other documents. When climate scientist Michael Mann, now at Pennsylvania State University, was at the University of Virginia, he withstood then-Virginia Attorney General Ken Cuccinelli's sweeping demand for documents regarding his climate research. The Supreme Court of Virginia eventually ruled in Mann's favor. There also was controversy when Rep. Raul Grijalva, D-Ariz., sent letters to seven universities, seeking information on funding for several scientists who have been skeptical of, or have made controversial remarks about, climate change. He later acknowledged that he was overreaching in requesting the scientists' communications. The science on climate change is convincing. In its Fifth Assessment Report, published in 2013, the Intergovernmental Panel on Climate Change (IPCC) concluded that warming between 1998 and 2012 was "around one-third to one-half" less rapid than over the period from 1951 to 2012. Those who choose to ignore the overwhelming evidence of climate change have used that statement to argue that global warming has stopped, that something other than greenhouse gases is at work or that climate scientists have a poor understanding of their subject. The IPCC was careful to acknowledge, however, that any trend inferred from only a few years of observations is tenuous, largely because natural variations like El Niño can have an outsize influence. Indeed, selecting 1998 as a starting year automatically makes trends for the next few years look small because 1998 was an unusually warm El Niño year.
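That start-year effect is easy to demonstrate on synthetic data: fit a linear trend to a noisy warming series that contains an artificial spike in its "1998", once from the full record and once from the spike year onward. A minimal sketch in Python; every number is invented for illustration, not real temperature data:

```python
import numpy as np

# Synthetic series: a steady warming trend plus noise, with an artificial
# spike in "1998" standing in for the unusually warm El Nino year.
rng = np.random.default_rng(0)
years = np.arange(1980, 2016)
temps = 0.015 * (years - 1980) + rng.normal(0.0, 0.04, years.size)
temps[years == 1998] += 0.3    # the unusually warm starting point

def trend_per_decade(start_year):
    sel = years >= start_year
    return 10 * np.polyfit(years[sel], temps[sel], 1)[0]   # slope in degrees per decade

print(f"Trend fitted from 1980: {trend_per_decade(1980):+.3f} C per decade")
print(f"Trend fitted from 1998: {trend_per_decade(1998):+.3f} C per decade")
# Starting at the warm spike usually yields a noticeably lower fitted trend,
# even though the underlying warming rate is identical in both fits.
```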
Still, the IPCC was frank in making the best available interpretation of the data available — data that have been examined, analyzed and validated by research teams around the world. But based on newly available information, one of the teams that analyze global temperature data realized that some of the temperatures could be made just slightly more accurate. The refinements to the temperature record are subtle but important, like adding the final buff to a freshly waxed car. However, an understanding of our planet and the way it is changing improves with each refinement, even if it is small. Consistent with their responsibility as scientists, the team that developed the refined temperature time series — Thomas Karl and colleagues at NOAA — described their results in a paper in the journal Science last June and argued that the improved temperature record no longer shows evidence of a slowdown in global warming. Such revisions are part of normal scientific discourse, and the government-funded scientists who pursued them should not be subjected to legislative subpoenas. The Science paper was part of a large effort by Karl and others at NOAA's National Centers for Environmental Information, as well as climate analytics specialist James McMahon of LMI Consulting, to develop the most accurate possible record of the Earth's surface temperature, based on thermometers. Developing an accurate record involves many refinements, as Karl's team has done, to adjust for factors like the growth of cities around weather stations, increases in the number of stations on land, and changes in the techniques for measuring ocean temperatures. Those techniques have ranged from buckets thrown overboard (where measurements were very spotty), to engine intakes (which tended to report temperatures a bit too high), to automated buoys (with greatly expanded coverage and accuracy). Since the publication of the paper by Karl and colleagues, additional groups have examined the data. Bala Rajaratnam and colleagues at Stanford, writing in the journal Climatic Change last September, took a sophisticated statistical approach. Looking at the same data set as the NOAA team, the Stanford researchers found even stronger evidence against a global warming pause. And in February, a team led by climate modeler John Fyfe, of the University of Victoria in Canada, again considered the same data set. In the journal Nature Climate Change, Fyfe and colleagues noted that recent warming, while clearly continuing, has been slower than many models have predicted. So, working independently, several research teams have converged on almost identical results for warming over the past century at the global scale, but with periodic fine-tuning as additional information becomes available. This is the way science is supposed to work. Asking tough questions and re-examining evidence make up the essence of the scientific method. Scholarly research papers undergo multiple rounds of scrutiny by independent peer reviewers, and the Karl paper was no exception. The more recent papers provide a classic illustration of the way science progresses. Successive studies take new perspectives and use new techniques to reanalyze data and refine interpretations. Making the newly corrected and updated global surface temperature data readily accessible to other scientists, as NOAA did, is a critical step in that process.
Rather than subjecting the NOAA scientists to the threat of a "compulsory process," policymakers should applaud them for advancing scientific knowledge and promoting transparency in research publication. Don't be fooled by red herrings. Human-caused climate change is real. Attacking the integrity of scientists will not further our understanding of what's happening to our planet. Similarly, efforts to undermine research findings for ideological reasons are a confusing disservice to the public. Policymakers certainly have a responsibility to exercise appropriate oversight, but thinly veiled political attempts to discredit researchers can have a chilling effect on the scientific discovery that is our best hope for improving people's lives.


News Article | December 19, 2016
Site: www.theguardian.com

The extreme warmth of 2016 has changed so much for the people of the Arctic that even their language is becoming unmoored from the conditions in which they now live. The Yupik, an indigenous people of western Alaska, have dozens of words for the vagaries of sea ice, which is not surprising given the crucial role it plays in subsistence hunting and transportation. But researchers have noted that some of these words, such as "tagneghneq" (thick, dark, weathered ice), are becoming obsolete. After thousands of years of use, words are vanishing as quickly as the ice they describe due to climate change. The native inhabitants are also in peril – there are 31 Alaskan towns and cities at imminent risk from the melting ice and coastal erosion. Many will have to relocate or somehow adapt. "In December, we normally have waters covered in ice but right now we have open water out there," said Vera Metcalf, director of the Eskimo Walrus Commission, which represents 19 native communities stretching along Alaska's western coast. "We are so dependent upon sea ice conditions. It's our life, our culture." Arctic sea ice extent slumped to a record low in November, winnowed away by the warming air, warming seas and unhelpful wind patterns. The region's 2016 temperature has been 3.5C warmer than a century ago. In some locations the divergence from the long-term average has been an eye-watering 20C. On 21 November, sea ice extent was 888,000 sq miles (2.3m sq km) below the long-term average for that day – a deficit covering an area roughly 10 times the size of the UK. "Almost every year now we look at the record of sea ice and say 'wow', but this year it was like 'three times wow'," said Tad Pfeffer, a geophysicist at the University of Colorado. "This year has been a big exaggeration on the trends we've already been seeing." These numbers have resonance for people who require dependable rhythms in the environment in order to survive. In remote Alaskan communities, the stores sell goods priced to reflect their journey – $20 for a pizza, $15 for a gallon of milk. If you can't butcher a 1,000lb walrus because there is no sea ice to support both of you, then you might well be left hungry. "The window of opportunity for hunting continues to shrink," Metcalf said. "The communities are worried about this because food insecurity is something we are now having to tackle every single day." Metcalf grew up on St Lawrence island, a far-flung piece of the US that sits just 36 miles from Russia in the Bering Sea. The island is thought to be one of the last exposed fragments of a land bridge that connected North America to Asia during the last ice age. In 2013, the island's two main communities managed to catch just a third of the walruses they normally do. Last year, Gambell, the largest settlement, snared just 36 – down from the 600 it could expect just a few years ago. Sea ice is further out from land than it once was and is becoming treacherously thin for hunters to traverse. Walruses, which require sea ice for resting and giving birth, often have to resort to heaving themselves on to crowded strips of land. These grand tusked beasts can trample each other to death in such conditions. "It's not like the walrus populations are changing, it's that the climate is changing the conditions," Metcalf said. "We are trying to plan better but we can't go out every day and hunt. We can try to adapt and hunt caribou or moose but it's not easy.
It comes at a cost to us." The Arctic is warming at twice the rate of the rest of the world and there are "early signs" that this temperature increase is speeding up, according to Jeremy Mathis, director of the Arctic program at the National Oceanic and Atmospheric Administration. Mathis moved to Fairbanks, Alaska, in 2007 and even in that time he has seen startling changes – the -40C winters he endured in the first few years have almost completely disappeared. "For people who live in the Arctic, there is no debate over whether their environment is changing," he said. "We are seeing a destabilization of the environment in the Arctic. The ice is melting earlier and earlier and coming back later and later in the year. For people here that means a clear impact upon food security and their way of life." Frost locked deep in the soils is melting, causing buildings to subside. Communities are seeing their coastlines erode and are increasingly exposed to lashing storms without the protective barrier of sea ice. Several Alaskan towns and villages are wrestling over whether to fight these changes or retreat to relative safety. Two coastal villages, Shishmaref and Kivalina, have voted to relocate while a third, Newtok, has taken the first tentative steps to do so. The warmth of 2016 – almost certain to be a global record – has added to the sense of haste. The regrowth of sea ice as Alaska enters winter has been so painfully slow that many communities will be left without a buffer to storms next year. Should a large storm hit, it could prove disastrous. Such a calamity would at least free up money from the Federal Emergency Management Agency (Fema). The cost of relocating a village of just a few hundred people is around $200m – a bill that neither the federal nor Alaskan government is keen to pick up. Some people in remote communities note, darkly, that a ruinous storm would at least be followed by federal dollars that would allow them to fortify or move. "These communities need to be moved as soon as possible before a large storm hits," said Victoria Herrmann, managing director of the Arctic Institute. "There hasn't been much guidance as to whether they can move or who will pay for it. There are around 230 villages affected by sea level rise and they will all need a plan over the next few years as sea ice continues to retreat." It takes a certain stoic hardiness to live in a place of such frigid cold. But Herrmann said that even those who have had to adapt to changes in the past have found the unravelling of 2016 "very scary". She added: "What we are seeing is incredible. It's quite frightening in terms of what it means for the future." A solution doesn't appear imminent. The US has no national sea level rise plan, no system to deal with displaced people. Even as the country's first climate change refugees emerge from within its own borders, the issue is very much on the sidelines. The incoming president isn't sure what the fuss is about, vacillating between calling climate change a "hoax" concocted by the Chinese and simply claiming that "nobody really knows" if it exists. While the politics plays out, wrenching decisions will have to be made. "Having to move elsewhere is unimaginable," said Metcalf. "As an elder told me the other day, we are not going anywhere. We've been here for centuries. But we may have to consider it, for the sake of our children and grandchildren."
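A quick arithmetic check of the sea-ice deficit quoted above (the UK land area used for the comparison is an added reference value, not a figure from the article):

```python
# Unit check of the 21 November sea-ice deficit quoted above.
SQ_MILE_TO_SQ_KM = 2.58999
UK_AREA_SQ_KM = 243_610      # approximate UK land area; our added reference value

deficit_sq_km = 888_000 * SQ_MILE_TO_SQ_KM
print(f"{deficit_sq_km / 1e6:.1f} million sq km")                      # ~2.3m sq km, as reported
print(f"{deficit_sq_km / UK_AREA_SQ_KM:.0f} times the area of the UK") # ~9, i.e. roughly 10 times
```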




News Article | September 21, 2016
Site: www.nature.com

In the fight to protect Earth from solar storms, the battle lines are drawn in space at a point 1.6 million kilometres away. There, a US National Oceanic and Atmospheric Administration (NOAA) satellite waits for electrons and protons to wash over it, a sign that the Sun has burped a flood of charged particles in our direction. As early as the end of this month, NOAA should have a much better idea of just how dangerous those electromagnetic storms are. The agency will begin releasing forecasts that use a more sophisticated model to predict how incoming solar storms could fry electrical power grids. It will be the clearest guide yet as to which utility operators, in what parts of the world, need to worry. "This is the first time we will get short-term forecasts of what the changes at the surface of the Earth will be," says Bob Rutledge, lead forecaster at NOAA's Space Weather Prediction Center in Boulder, Colorado. "We can tell a power-grid customer not only that it will be a bad day, but give them some heads-up on what exactly they will be facing." Powerful solar storms can knock out radio communications and satellite operations, but some of their most devastating effects are on electrical power grids. In 1989, a solar storm wiped out Canada's entire Hydro-Québec grid for hours, leaving several million people in the dark. In 2003, storm-induced surges fried transformers in South Africa and overheated others at a nuclear power plant in Sweden. But if a power company knows that a solar storm is coming, officials can shunt power from threatened areas of the network to safer ones or take other precautions. Until now, NOAA had warned of solar activity using the planetary K-index, a scale that ranks the current geomagnetic threat to the entire Earth. The new 'geospace' forecast, which draws on more than two decades of research, comes in the form of a map showing which areas are likely to be hit hardest (G. Tóth et al. J. Geophys. Res. Space Phys. 110, A12226; 2005). Knowing that Canada, for instance, will be hit harder than northern Europe helps grid operators, says Tamas Gombosi, a space physicist at the University of Michigan in Ann Arbor who helped to develop the model. He compares it to having a hurricane forecast that says a storm will hit Florida, rather than just somewhere on the planet. Space-weather forecasting is as rudimentary as conventional weather forecasting was three or four decades ago, says Catherine Burnett, space-weather programme manager at the UK Met Office in Exeter. Researchers have developed different models to describe various portions of the Sun–Earth system, but linking them into a coherent framework has been difficult. The Michigan approach combines 15 models that collectively describe the solar atmosphere through interplanetary space and into Earth's magnetic realm. The NOAA forecast incorporates three of those: one model describing Earth's entire magnetosphere, another focusing on the inner magnetosphere and one for electrical activity in the upper atmosphere. The inner magnetosphere chunk is crucial to the model's overall success, says developer Gábor Tóth at the University of Michigan. It describes how energetic particles flow and interact as they approach Earth's poles, and how the particles affect magnetism at the planet's surface. Alerts can provide roughly 20 minutes to one hour of warning.
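For readers unfamiliar with the K-index mentioned above: each observatory maps the largest swing in the local horizontal magnetic field over a three-hour window onto a quasi-logarithmic 0-9 scale. A minimal sketch in Python; the nanotesla thresholds vary from station to station, and the values below are the commonly quoted Niemegk limits, used here as an assumption rather than a detail from the article:

```python
import bisect

# Lower bounds (in nanotesla) of the fluctuation range for K = 1..9.
# These are the often-cited Niemegk observatory values; other stations
# use scaled thresholds, so treat this table as illustrative.
K_THRESHOLDS_NT = [5, 10, 20, 40, 70, 120, 200, 330, 500]

def k_index(deviations_nt):
    """K value for a 3-hour series of horizontal-field deviations in nT."""
    fluctuation_range = max(deviations_nt) - min(deviations_nt)
    return bisect.bisect_right(K_THRESHOLDS_NT, fluctuation_range)

print(k_index([0, 12, -6, 3]))      # range 18 nT  -> K = 2 (quiet conditions)
print(k_index([0, 300, -220, 80]))  # range 520 nT -> K = 9 (severe storm)
```

The planetary Kp value NOAA has relied on is essentially an average of such station indices, which is why it says nothing about where on the globe the storm will bite hardest.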
NOAA’s improved forecasts are part of a push by US agencies to implement a national space-weather strategy issued last year by the White House. Regulators will also soon require power-grid operators to produce hazard assessments that include the threat of solar storms. “Without those two pieces, we wouldn’t have remotely the interest we have now,” says Antti Pulkkinen, a space-weather researcher at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “It really has changed the game.” NOAA plans to continue refining its forecasts as new research rolls in. The possible improvements include incorporating how the geology beneath power grids affects the intensity of a solar storm. Fluctuating magnetic fields can induce electrical currents to flow in the ground, which sets up further problems for transmission lines. “All of this is terrifically complicated,” says Jeffrey Love, a geomagnetics researcher at the US Geological Survey in Golden, Colorado. In their latest paper, Love, Pulkkinen and their colleagues describe the most detailed map of these ‘geoelectric hazards’ across part of the United States (J. J. Love et al. Geophys. Res. Lett. http://doi.org/bqpm; 2016). Of the areas surveyed so far, those at the highest risk are the upper Midwestern states of Minnesota and Wisconsin, where complex geology induces strong electrical currents. Adding in 3D models of these ground currents will improve the next generation of NOAA forecasts, Rutledge says. “This is by no means the end.”
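The ground-induction effect described above can be roughed out with the standard plane-wave approximation for a uniformly conducting Earth, in which a magnetic fluctuation of amplitude B at angular frequency w induces a surface electric field of roughly |E| = |B| * sqrt(w / (mu0 * sigma)). A minimal sketch in Python; the amplitude, period and ground conductivity are illustrative assumptions, not values from the article:

```python
import math

MU0 = 4 * math.pi * 1e-7   # vacuum permeability, H/m

def geoelectric_field(b_tesla, period_s, sigma_s_per_m):
    """Plane-wave estimate of the surface geoelectric field over uniform ground."""
    w = 2 * math.pi / period_s
    return b_tesla * math.sqrt(w / (MU0 * sigma_s_per_m))

# Illustrative storm fluctuation: 100 nT swinging with a 10-minute period
# over resistive ground (0.001 S/m), the kind of geology that raises hazard.
e_v_per_m = geoelectric_field(100e-9, 600, 1e-3)
print(f"{e_v_per_m * 1000:.2f} V/km")   # a few tenths of a volt per kilometre
```

Fields of this size, integrated along hundreds of kilometres of transmission line, are what drive the damaging quasi-DC currents through transformers, and lower-conductivity geology (as in Minnesota and Wisconsin) pushes the estimate higher.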


News Article | April 28, 2016
Site: news.yahoo.com

The surface area of the Earth covered by leafy green vegetation has increased dramatically over the last several decades, thanks to excess carbon emissions. But the green shoots aren't necessarily a good thing; they are harbingers of more worrisome changes. The excess carbon dioxide in the atmosphere has created a greener planet, a new NASA study shows. Around the world, areas that were once icebound, barren or sandy are now covered in green foliage. All told, carbon emissions have fueled greening in an area about twice the size of the continental United States between 1982 and 2009, according to the study. While lush forests and verdant fields may sound like a good thing, the landscape transformation could have long-term, unforeseen consequences, the researchers say. The radical greening "has the ability to fundamentally change the cycling of water and carbon in the climate system," lead author Zaichun Zhu, a researcher from Peking University in Beijing, said in a statement. Green leafy flora make up 32 percent of Earth's surface area. All of those plants use carbon dioxide and sunlight to make sugars to grow — a process called photosynthesis. Past studies have shown that carbon dioxide increases plant growth by increasing the rate of photosynthesis. Other research has shown that plants are one of the main absorbers of atmospheric carbon dioxide. Human activities, such as driving cars and burning coal for energy, account for about 10 billion tons of carbon emissions per year, roughly half of which is soaked up by plants and the oceans. "While our study did not address the connection between greening and carbon storage in plants, other studies have reported an increasing carbon sink on land since the 1980s, which is entirely consistent with the idea of a greening Earth," said study co-author Shilong Piao, of the College of Urban and Environmental Sciences at Peking University. However, it wasn't clear whether the greening seen in satellite data over recent years could be explained by the sky-high CO2 concentrations in the atmosphere (the highest the planet has seen in 500,000 years). After all, rainfall, sunlight, nitrogen in the soil and land-use changes also affect how well plants grow. To isolate the causes of planetary greening, researchers from around the world analyzed satellite data collected by NASA's Moderate Resolution Imaging Spectroradiometer and the National Oceanic and Atmospheric Administration's Advanced Very High Resolution Radiometer instruments. They then created mathematical models and computer simulations to isolate how each of these variables would be predicted to influence greening. By comparing the models and the satellite data, the team concluded that about 70 percent of the greening could be attributed to atmospheric carbon dioxide concentrations, the researchers reported Monday (April 25) in the journal Nature Climate Change. "The second most important driver is nitrogen, at 9 percent. So we see what an outsized role CO2 plays in this process," said study co-author Ranga Myneni, an earth and environmental scientist at Boston University. While green shoots may be good, excess CO2 emissions also bring a host of more worrisome consequences, such as global warming, melting glaciers, rising sea levels and more dangerous weather, according to accumulating research. What's more, the greening may be a temporary change.
"Studies have shown that plants acclimatize, or adjust, to rising carbon dioxide concentration and the fertilization effect diminishes over time," said Philippe Ciais, associate director of the Laboratory of Climate and Environmental Sciences in Gif-sur-Yvette, France. Copyright 2016 LiveScience, a Purch company. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.


News Article | February 15, 2017
Site: phys.org

The event titled "At The Crossroads: Global Shipping Lanes and Whale Conservation" is part of the 2017 IUCN/WCS Knowledge Dialogue Series that will promote discussions among various stakeholders on international sustainable development challenges. This preparatory conference will feed into important decisions made by delegates on oceans and marine issues at the upcoming UN Oceans Conference on June 5-9, 2017. "Most species of great whale are affected by shipping activities in the form of potential ship strikes and underwater noise," said WCS President and CEO Dr. Cristián Samper, who provided welcoming remarks for the event. "The challenge of finding solutions on how best to protect these marine mammals in busy waterways is a global one, and international collaboration is the key to formulating effective solutions. Today's discussions on this issue are timely and will help pave the way for a formal call to action by UN delegates in June." Samper was joined by His Excellency Peter Thomson of Fiji, President of the 71st Session of the UN General Assembly and a panel of experts from government agencies, scientific organizations, and the shipping industry. The event was organized by the Government of France, the International Union for Conservation of Nature (IUCN), and WCS (Wildlife Conservation Society). Most whale species are still recovering from the impacts of centuries of commercial whaling and, although largely protected by a global commercial whaling ban, are now threatened by a host of new dangers, including collisions with ocean-going vessels, ocean noise, entanglement in fishing gear, and other factors. Moderated by Dr. Greg Silber of the U.S. National Oceanic and Atmospheric Administration, the panel discussed the scope and scale of impacts of the shipping industry on whales, focusing specifically on the threats of collisions and increasing low-frequency noise levels from commercial ships. The participants then reviewed a number of case studies from regions around the world—Africa's Gulf of Guinea, Sri Lanka, Chile, Arctic waters, and seascapes along the Atlantic and Pacific coasts of the United States—as a means of assessing the current state of knowledge on the overlap between shipping networks and biologically important areas for whales. Panel members also discussed new technologies, emerging research and management needs, and the importance of identifying best practices for balancing the needs of shipping and whale conservation objectives. "We have a real opportunity on the global stage this week and in the coming months to work with governments, industry, and conservation organizations to secure concrete actions that will benefit whales and the marine environment," said Dr. Howard Rosenbaum, Director of WCS's Ocean Giants Program and a panel participant. "Collectively we have been evaluating impacts from ship-strikes and noise for several decades, with some clear strides made in reducing impacts," said Dr. Brandon Southall, President and Senior Scientist for SEA Inc. "But now is the time to push forward using powerful new monitoring and mitigation technologies, and building new international partnerships like those forged here in New York."


News Article | November 23, 2016
Site: news.yahoo.com

Coal mines have the canary, endangered species have the panda bear, melting ice has the polar bear, and now sea level rise has … the octopus? Climate change's impact on sea levels has made tidal flooding in Miami more severe, according to scientists. After the "supermoon" earlier this month triggered high tides, parts of Miami flooded and at least one sea creature was left far from home: an octopus that became stranded in a flooded parking garage, reported the Miami Herald. Miami resident Richard Conlin discovered the octopus, and shared images of the displaced sea creature on Facebook. According to Conlin, the octopus was brought home by building security officers, who returned the animal to the ocean in a bucket of water. Marine biologist Kathleen Sullivan Sealey, from the University of Miami, told the Miami Herald that the cyclical "king tides" — a period of especially high tides caused by the alignment of the sun, Earth and moon — were intensified by the supermoon and likely washed the octopus out of pipes underneath the garage. "When that much sea water comes in, the octopus is like 'What's this?' and goes to explore and ends up in a bad place," Sealey told the Miami Herald after examining the photos. She said the marooned octopus was either a small Caribbean reef octopus or a large Atlantic pygmy octopus. Though the building's drainage pipes were designed safely above high-water marks, Sealey said rising sea levels have left some of the pipes partially submerged during very high tides, such as the king tide. These submerged pipes combine two of an octopus' favorite things, Sealey said: a cramped, dark space with fish to eat. In his Facebook posts, Conlin noted that his building has been flooding more frequently. "This flooding to this extreme is new and gets worse each moon," he wrote. "In the past the floor of the garage would be 'damp' but this extreme flooding is new." Conlin added that every day for the past six months there has been "some type of water seepage in the garage." Florida is especially at risk of flooding due to climate change. A recent study by the National Oceanic and Atmospheric Administration (NOAA) determined that about 13 million Americans could be affected by rising seas caused by climate change, and nearly half of them live in Florida. In Miami alone, a third of the county could be forced to relocate, according to the NOAA study. And sea creatures that wash ashore may become a more common occurrence, Sealey said, because ocean waters will be pushed deeper onto land more frequently due to rising seas. "The sea is moving in, so we have to share the space," Sealey said.


News Article | November 18, 2015
Site: www.reuters.com

The decision ends two years of litigation by the aquarium to get federal approval to bring the whales to Atlanta and to other facilities in the United States that had hoped to acquire them, including SeaWorld parks. Also known as white whales, belugas are common in the Arctic Ocean's coastal waters and also found in subarctic waters, according to the National Geographic website. According to the aquarium, they are classified as endangered in some areas and as “near threatened” worldwide. The Atlanta-based aquarium issued a statement criticizing the ruling by U.S. District Judge Amy Totenberg in September that denied importation of the animals. “We firmly disagree with the judge’s decision, but the extended appeal process would add to an already lengthy series of legal proceedings, which would not be in the best interest of the animals in Russia,” aquarium officials said. The aquarium sued the government in September 2013 for the right to acquire the whales, captured in 2006 off the coast of northern Russia in the Sea of Okhotsk. They are currently in the care of Russian scientists. Totenberg said in her strongly worded decision that the aquarium's legal arguments amounted to “smoke and mirrors.” She noted that the organization had accused a division of the U.S. National Oceanic and Atmospheric Administration (NOAA) fisheries service of "'cooking the books' to fabricate its rationale" for denying the permit. The Animal Welfare Institute, which was one of several environmental groups that joined U.S. regulators in opposing the importation, applauded the aquarium's decision.



News Article | November 15, 2016
Site: www.marketwired.com

Open GIS Leader Adds Boundless Desktop and Boundless Connect to Deliver the Most Complete Platform for Geospatial Data NEW YORK, NY--(Marketwired - Nov 15, 2016) - Boundless, the leader in open GIS, today introduced the world's first open GIS ecosystem to unlock the business intelligence of location-based data. In response to market demand for more open and scalable GIS solutions, the company extended its proven GIS platform with Boundless Connect, a subscription service to the most comprehensive repository of GIS data, and Boundless Desktop, a full-featured, professional desktop GIS, bringing a powerful ecosystem of geospatial knowledge, tools and resources to the enterprise. "With the launch of Boundless Connect and Boundless Desktop, we have taken a major step forward in delivering the most complete, commercially supported open GIS platform," said Andy Dearing, CEO of Boundless. "As the need for an alternative to costly, closed GIS systems grows, Boundless is proud to partner with the open source community to provide new tools and open solutions that foster growth of the largest repository of the world's geospatial knowledge and resources." Boundless Delivers the Ultimate Open GIS Ecosystem Boundless offers an open GIS ecosystem through a unique combination of technology, products and experts that gives enterprises deeper intelligence and insights using location-based data. The Boundless platform is built upon open source technology and open APIs that generate actionable location intelligence across third-party apps, content services and plugins for enterprise applications. Eighty percent of today's data includes a location component. Unlike proprietary, licensed solutions that are prohibitively expensive for the growing volume of geospatial data, Boundless wants to make the world of geospatial data available to any user. Boundless is open by design, immediately scalable and license-free, making it easy for developers, GIS and business analysts to access location-based data in a cloud-based GIS platform. Boundless is the leading open GIS platform for government and commercial mapping and analytics. The Boundless platform is a complete, open GIS solution, including Exchange, Suite, and now Desktop and Connect. The experts at Boundless reduce the cost and time of deploying and managing geospatial software with packaging, support, maintenance, professional services, and expert training. Boundless is currently operational in government and commercial environments, with customers including the National Oceanic and Atmospheric Administration (NOAA), the Port of Seattle, the Louisiana Department of Health and Hospitals, TriMet and many more. Global Open GIS Community to Take the Enterprise By Storm According to P&S Market Research, the global geographic information system (GIS) market is expected to reach $14.6 billion by 2020, growing at a CAGR of 11.4%. Additionally, the U.S. Department of Labor named geospatial technology one of the top three most important high-growth industries in the 21st century, along with nanotechnology and biotechnology. Helping to advance open GIS, Boundless is an active leader in the open source community, working with partners, customers, and developers worldwide. Unlike proprietary software vendors, Boundless gives customers direct access to the open source community for the ability to provide feedback and influence future direction.
The company is committed to open source, and all of the projects it works with are published under Open Source Initiative-approved licenses. Availability and Pricing The Boundless open GIS platform includes Exchange, Suite, Desktop and Connect. Announced today, Boundless Desktop and Boundless Connect are available immediately. Subscription pricing for commercial support and maintenance of the Boundless product stack starts as low as $5 per month. For more information and additional pricing details, email contact@boundlessgeo.com. Supporting Quotes "Leveraging open source technology supplemented with enterprise support from Boundless, we have secured the assurance we needed to proceed with our development ambitions while minimizing risk for the enterprise. Not only does the platform perform exceptionally well, but the ability to work in a hybrid environment and embrace open source software alongside our existing investments has also allowed for the scalability, flexibility and agility our organization needs." -- Eric Drenckpohl, Enterprise GIS Manager, Port of Seattle "As many government agencies across the country experience, funding for new technology can be challenging to secure. In Louisiana, which experiences more than its share of disasters, mapping of facilities and threats is a critical deliverable from our emergency preparedness group, with demand for such information spiking during actual emergencies. With Boundless, we've been able to cost-effectively leverage our existing databases with the ease and power of a web interface based on Boundless Exchange -- achieving real-time visibility during the entire lifecycle of an emergency. The platform has been reliable and scalable and we consider this application suite to be the 'hidden jewel' in a market crowded with complex, expensive GIS products." -- Henry Yennie, Program Monitor, Emergency Preparedness, Louisiana Department of Health About Boundless Boundless is the leader in open GIS (geographic information systems). Unlike proprietary, licensed solutions, Boundless opens the world of geospatial data to any user. The experts at Boundless reduce the cost and time of deploying and managing geospatial software with a scalable, open GIS platform -- including Exchange, Suite, Desktop and Connect -- and a powerful ecosystem of geospatial knowledge, tools and resources. Learn more at boundlessgeo.com. Join the GIS community at https://connect.boundlessgeo.com/. Follow Boundless @boundlessgeo.


News Article | August 22, 2016
Site: www.washingtonpost.com

Scientists have repeatedly faulted Republican presidential contender Donald Trump for continuing to question the strong scientific consensus that humans are causing climate change. But more recently, some have also challenged the Green Party candidate and medical doctor Jill Stein on the same topic. The contention is that Stein, who has been criticized previously for statements on scientific topics like vaccines and genetically modified organisms, is presenting climate science in an unduly alarmist way. The problem traces to a tweet by Stein from last week warning of a 9-foot sea level rise by 2050. Several climate scientists on Twitter quickly faulted the statement, among them Jacquelyn Gill, a researcher at the University of Maine, and Chris Colose, a climate science Ph.D. candidate at the University at Albany. So is there any way to defend Stein's statement about a 9-foot sea level rise by 2050 (in 34 years), which would sweep inland in many coastal cities around the world and cause major damage and displacement? Perhaps the best argument in Stein's favor would cite a recent, controversial study by James Hansen, an extremely famous climate scientist who retired several years ago from NASA, and who published the work with a long and influential list of colleagues. That paper, which outlined a series of feedback processes which could drive a particularly catastrophic version of climate change, contemplated the idea that the rate of ice loss from Greenland and Antarctica could double, and continue to double, over time periods ranging from every five years to every 40 years. "Doubling times of 10, 20 or 40 years yield multi-meter sea level rise in about 50, 100 or 200 years," the paper noted. "Recent ice melt doubling times are near the lower end of the 10–40-year range, but the record is too short to confirm the nature of the response." When asked if this study was indeed a source of the 9-foot figure, Stein's press director Meleiza Figueroa commented, "James Hansen has said that we could see several meters of sea level rise as soon as the next 50 years. Considering that the effects of climate change we've seen in real life have consistently met, or even exceeded, what were previously considered worst case scenarios, we need to take Dr. Hansen's alarming findings very seriously." Figueroa also pointed to an April article in the Insurance Journal, which paraphrased the National Oceanic and Atmospheric Administration's Margaret Davidson, a coastal sciences advisor, as follows: "Davidson said recent data that has been collected but has yet to be made official indicates sea levels could rise by roughly 3 meters or 9 feet by 2050-2060, far higher and quicker than current projections." So let's weigh all of this, starting with the Hansen study. What that study asserts is in effect an "if, then" statement: If the 10-year doubling time is actually correct, then yes, around 2065 you could get "multi-meter" sea level rise, because the rate of ice loss by then would have doubled five times. Thus, taking Greenland as an example, it might have gone from its current 281 billion tons of annual ice loss, which is not quite enough to raise seas by 1 millimeter, to close to 9,000 billion tons, which would raise seas by about 25 millimeters per year, or roughly an inch. (In reality, Hansen thinks Antarctica, not Greenland, will be the bigger ice loser.)
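
To make the compounding concrete, here is a minimal Python sketch of the doubling arithmetic described above. The 281-gigaton starting point and the 10-year doubling time come from the article; the conversion of roughly 362 gigatons of melted ice per millimeter of sea level rise is a standard figure, not one the article states.

```python
# A back-of-the-envelope sketch of ice-loss doubling, assuming a 10-year
# doubling time (the fast end of Hansen's 10-40 year range).
GT_PER_MM_SEA_LEVEL = 362.0   # assumed: ~362 Gt of ice loss ~ 1 mm of sea level rise
initial_loss_gt = 281.0       # Greenland's current annual ice loss (from the article)
doubling_time_years = 10

for year in range(0, 51, 10):
    loss_gt = initial_loss_gt * 2 ** (year / doubling_time_years)
    mm_per_year = loss_gt / GT_PER_MM_SEA_LEVEL
    print(f"year +{year:2d}: {loss_gt:7.0f} Gt/yr ~ {mm_per_year:4.1f} mm/yr")

# After five doublings (~50 years) the annual loss approaches 9,000 Gt,
# or about 25 mm (roughly an inch) of sea level rise per year.
```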
But the paper did not actually say 10 years is the right number for the doubling time — rather, it said that current empirical evidence suggests a number "near the lower end" of the range between 10 and 40 years, adding that "the record is too short to confirm the nature of the response." In effect, the paper is testing out different time frames for doubling of ice loss in order to assess the likely response, and to provide a range of possibilities. Hansen elaborated on these scenarios in a widely watched video discussing the research. But not every scientist agrees with Hansen, or about whether this thought experiment is anything more than just that. The study has drawn both praise and criticism, as our coverage of it has made clear. One of the most important comments on the study came from Penn State glaciologist Richard Alley, who said the work "usefully reminds us that large and rapid changes are possible." But as Alley added, "the paper does not include enough ice-sheet physics to tell us how much how rapidly is how likely." Certainly, we should not discount Hansen outright. He's too celebrated and important a scientist for that. Still, the consensus projection of sea level rise, for the moment, remains that of the U.N.'s Intergovernmental Panel on Climate Change (IPCC), cited by Colose, which predicts up to about a meter (3.28 feet) by 2100 at the high end. Increasingly, though, researchers are suspecting this could be too low, and one major study earlier this year suggested Antarctica alone could contribute close to a meter by that year. But that's still not close to 9 feet by 2050. So in sum, the Hansen study suggests a possibility of multi-meter sea level rise this century, but certainly does not establish a firm prediction that this is actually going to happen — and does not represent a scientific consensus position right now. As for the article quoting NOAA's Margaret Davidson, she appears to have subsequently clarified her words in an email to Slate's Eric Holthaus. There, Davidson cited concerns about rapid loss of West Antarctica and said she had "actually said my personal opinion was increasingly leaning towards 2-3 meters in next 50 years (that 2100 was not a useful frame for most people)." The punch line is just about the same: An official has at least floated the possibility of super-rapid sea level rise, but this does not mean it's a scientifically accepted or consensus projection right now. It also isn't clear where Stein got the figure of 12.3 million Americans seeing their homes inundated by this much sea level rise. Ben Strauss, a sea level rise expert with Climate Central, which has often performed mapping projections to determine how much U.S. land would be flooded by different levels of rising seas, said he wasn't sure of the source of the number. Strauss said that some of his research suggests that by 2050, we could lock in 3 meters of sea level rise that would affect 13.5 million Americans, but we wouldn't actually see all of that sea level rise by 2050. Any presidential candidate, from Trump to Clinton to Stein, has every right to dig in and explain all of this. Moreover, that candidate could easily justify the conclusion that we have good reason to worry that sea level rise by 2100 could be considerably worse than the IPCC suggests — if we don't get our acts together. That is the way the sea level rise story is trending these days.
But what’s more questionable is to cite only a worst case scenario, without explaining the state of the evidence or scientific opinion overall. Stein said “could,” which is a bit of a hedge, but it’s not really enough. Granted, at least Stein is indeed listening to scientists and drawing on scientific evidence and opinion (although not including adequate context). Compare that with Trump, who just says he’s “not a big believer” in the overwhelmingly accepted scientific idea of climate change.


News Article | February 24, 2017
Site: news.yahoo.com

FILE – In this April 16, 2013 file photo, a "bathtub ring" marks the high water mark as a recreational boat approaches Hoover Dam along Black Canyon on Lake Mead, the largest Colorado River reservoir, near Boulder City, Nev. Scientists say global warming may already be shrinking the Colorado River and could reduce its flow by more than a third by the end of the century. (AP Photo/Julie Jacobson, File) DENVER (AP) — Global warming is already shrinking the Colorado River, the most important waterway in the American Southwest, and it could reduce the flow by more than a third by the end of the century, two scientists say. The river's volume has dropped more than 19 percent during a drought gripping the region since 2000, and a shortage of rain and snow can account for only about two-thirds of that decline, according to hydrology researchers Brad Udall of Colorado State University and Jonathan Overpeck of the University of Arizona. In a study published last week in the journal Water Resources Research, they concluded that the rest of the decline is due to a warming atmosphere induced by climate change, which is drawing more moisture out of the Colorado River Basin's waterways, snowbanks, plants and soil by evaporation and other means. Their projections could signal big problems for cities and farmers across the 246,000-square-mile basin, which spans parts of seven states and Mexico. The river supplies water to about 40 million people and 6,300 square miles of farmland. "Fifteen years into the 21st century, the emerging reality is that climate change is already depleting the Colorado River water supplies at the upper end of the range suggested by previously published projections," the researchers wrote. "Record-setting temperatures are an important and underappreciated component of the flow reductions now being observed." The Colorado River and its two major reservoirs, Lake Mead and Lake Powell, are already overtaxed. Water storage at Mead was at 42 percent of capacity Wednesday, and Powell was at 46 percent. Water managers have said that Mead could drop low enough to trigger cuts next year in water deliveries to Arizona and Nevada, which would be the first states affected by shortages under the multistate agreements and rules governing the system. But heavy snow in the West this winter may keep the cuts at bay. Snowpack in the Wyoming and Colorado mountains that provide much of the Colorado River's water ranged from 120 to 216 percent of normal Thursday. For their study, Udall and Overpeck analyzed temperature, precipitation and water volume in the basin from 2000 to 2014 and compared it with historical data, including a 1953-1967 drought. Temperature and precipitation records date to 1896 and river flow records to 1906. Temperatures in the 2000-2014 period were a record 1.6 degrees Fahrenheit above the historical average, while precipitation was about 4.6 percent below, they said. Using existing climate models, the researchers said that much decline in precipitation should have produced a reduction of about 11.4 percent in the river flow, not the 19.3 percent that occurred. They concluded that the rest was due to higher temperatures, which increased evaporation from water and soil, sucked more moisture from snow and sent more water from plant leaves into the atmosphere. Martin Hoerling, a meteorologist at the National Oceanic and Atmospheric Administration who was not involved in the study, questioned whether the temperature rise from 2000 to 2014 was entirely due to global warming. 
Some was likely caused by drought, he said. Udall said warming caused by climate change in this century will dwarf any warming caused by drought. He noted that during the 1953-1967 drought, the temperature was less than a half degree warmer than the historical average, compared with 1.6 degrees during the 2000-2014 period. Udall said climate scientists can predict temperatures with more certainty than they can precipitation, so studying their individual effects on river flow can help water managers. Rain and snowfall in the Colorado River Basin would have to increase 14 percent over the historical average through the rest of the century to offset the effect of rising temperatures, he said. "We can't say with any certainty that precipitation is going to increase and come to our rescue," Udall said.
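
For readers following the numbers, this short Python sketch reproduces the attribution arithmetic using only the percentages quoted in the article; the study itself relies on far more detailed hydrology than this simple subtraction.

```python
# Rough attribution of the 2000-2014 Colorado River flow decline,
# using the round figures quoted in the article.
observed_decline = 19.3   # percent drop in river flow
precip_expected = 11.4    # percent drop expected from the precipitation deficit alone

warming_driven = observed_decline - precip_expected
print(f"Attributed to warming: {warming_driven:.1f} percentage points "
      f"({warming_driven / observed_decline:.0%} of the decline)")
# -> about 7.9 points, or roughly 40 percent of the observed decline;
#    the precipitation shortfall accounts for the remaining ~60 percent,
#    which the article rounds to "about two-thirds."
```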


News Article | February 16, 2017
Site: www.eurekalert.org

The number of instruments on the International Space Station dedicated to observing Earth to increase our understanding of our home planet continues to grow. Two new instruments are scheduled to make their way to the station Feb. 18 on the SpaceX Dragon capsule. The Stratospheric Aerosol and Gas Experiment (SAGE) III instrument will monitor the condition of the ozone layer, which covers an area in the stratosphere 10 to 30 miles above Earth and protects the planet from the sun's harmful ultraviolet radiation. Its predecessors, SAGE I and SAGE II, which were mounted to satellites, helped scientists understand the causes and effects of the Antarctic ozone hole. The Montreal Protocol of 1987 led to an eventual ban on ozone-destroying gases and to the ozone layer's recovery; SAGE III, designed to operate for no less than three years, will allow scientists to continue monitoring its recovery. The Lightning Imaging Sensor (LIS), first launched as an instrument on the Tropical Rainfall Measuring Mission in 1997, records the time, energy output and location of lightning events around the world, day and night. From its perch on the ISS, the new LIS will improve coverage of lightning events over the oceans and also in the northern hemisphere during its summer months. Because lightning is both a factor and a gauge for a number of atmospheric processes, NASA as well as other agencies will use the new LIS lightning data for many applications, from weather forecasting to climate modeling and air quality studies. While SAGE III and LIS are the latest Earth science instruments slated for operation aboard the ISS, they are not the first or the last. For two years, beginning in September 2014, the Rapid Scatterometer, or RapidScat, collected near-real-time data on ocean wind speed and direction. The instrument was designed as a low-cost replacement for the Quick Scatterometer, or QuikScat satellite, which experienced an age-related failure in 2009. In addition to addressing such questions as how changing winds affect sea surface temperatures during an El Niño season, RapidScat supplied data that the National Oceanic and Atmospheric Administration and the U.S. Navy relied on for improved tracking of marine weather, leading to more optimal ship routing and hazard avoidance. The Cloud Aerosol Transport System (CATS) was mounted to the exterior of the space station in January 2015 and is in the midst of a three-year mission to measure aerosols, such as dust plumes, wildfires and volcanic ash, around the world. Built to demonstrate a low-cost, streamlined approach to ISS science payloads, the laser instrument is providing data for air quality studies, climate models and hazard warning capabilities. Over the next several years, NASA is planning to send to the space station several more instruments trained toward Earth. The Total and Spectral solar Irradiance Sensor (TSIS-1) will measure total solar irradiance and spectral solar irradiance, or the total solar radiation at the top of Earth's atmosphere and the spectral distribution of that solar radiation, respectively. The data are critical for climate modeling and atmospheric studies. TSIS-1 will continue the work of NASA's Solar Radiation and Climate Experiment satellite, which has been taking those measurements since 2003. NASA's Earth System Science Pathfinder program is supporting the following instruments that are currently in development. The program is managed by NASA's Langley Research Center in Hampton, Virginia.
The Orbiting Carbon Observatory-3 (OCO-3) instrument will monitor carbon dioxide distribution around the globe. Assembled with spare parts from the Orbiting Carbon Observatory-2 satellite, OCO-3 will provide insights into the greenhouse gas's role as it relates to growing urban areas and changes in fossil fuel combustion. The instrument will also measure the "glow" from growing plants (solar-induced fluorescence). Homing in on tropical and temperate forests is the Global Ecosystem Dynamics Investigation (GEDI). The lidar instrument will provide the first high-resolution observations of forest vertical structure in an effort to answer how much carbon is stored in these ecosystems and also what impacts deforestation and reforestation have on habitat diversity, the global carbon cycle and climate change. The ECOsystem Spaceborne Thermal Radiometer Experiment (ECOSTRESS) will also focus on vegetation by providing high-frequency, high-resolution measurements of plant temperature and plant water use. Among the data's numerous uses will be to indicate regions of plant heat and water stress and also improve drought forecasting for the benefit of farmers and water managers. Researchers will also use ECOSTRESS in concert with other data to calculate water use efficiency among plants and identify drought-resistant species and varieties. Also on the horizon is the Climate Absolute Radiance and Refractivity Observatory (CLARREO) Pathfinder, comprising two instruments for measuring solar irradiance: a reflected solar spectrometer and an infrared spectrometer. CLARREO will collect highly accurate climate records to test climate projections in order to improve models. NASA collects data from space to increase our understanding of our home planet, improve lives and safeguard our future.


News Article | January 17, 2016
Site: news.yahoo.com

The SpaceX Falcon 9 rocket is seen after launch from Space Launch Complex 4 East at Vandenberg Air Force Base in California with the Jason-3 spacecraft onboard, January 17, 2016 (AFP Photo/Bill Ingalls) Los Angeles (AFP) - SpaceX's unmanned Falcon 9 rocket broke apart Sunday as it tried to land on a floating platform in the Pacific, marking the fourth such failure in the company's bid to recycle rockets. However, the primary mission of the launch from Vandenberg Air Force Base in California went as planned, propelling into orbit a $180 million US-French satellite called Jason-3 to study sea level rise. "Well, at least the pieces were bigger this time!" Elon Musk, the CEO of the California-based company, wrote on Twitter. SpaceX is trying to land its rockets back on Earth in order to re-use the parts in the future, trying to make spaceflight cheaper and more sustainable than before. The firm succeeded in landing its Falcon 9 first stage -- the long towering portion of the rocket -- on solid ground at Cape Canaveral, Florida in December. Even though an ocean landing is more difficult, SpaceX wants to perfect the technique because ship landings "are needed for high velocity missions," Musk tweeted. "Definitely harder to land on a ship," he added after the latest mishap. "Similar to an aircraft carrier vs land: much smaller target area, that's also translating and rotating." Currently, expensive rocket components are jettisoned into the ocean after launch, wasting hundreds of millions of dollars. Competitor Blue Origin, headed by Amazon founder Jeff Bezos, succeeded in landing a suborbital rocket in November. However, no other company has attempted the ocean landing that SpaceX is trying to achieve. In the end, the problem on Sunday was not due to high speed or a turbulent ocean, but came down to a leg on the rocket that did not lock out as anticipated. "So it tipped over after landing," Musk said. SpaceX said the rocket landed within 1.3 meters (about 1.4 yards) of the droneship's center. There was no hitch in the launch itself, and the blastoff of the rocket and satellite at 10:42 am (1842 GMT) went flawlessly. The satellite aims to offer a more precise look at how global warming and sea level rise affect wind speeds and currents as close as 0.6 miles (one kilometer) from shore, whereas past satellites were limited to about 10 times that distance from the coast. The technology will monitor global sea surface heights, tropical cyclones and help support seasonal and coastal forecasts. During a five-year mission, its data will also be used to aid fisheries management and research into human impacts on the world's oceans. The satellite is the fruit of a four-way partnership between the National Oceanic and Atmospheric Administration (NOAA), the US space agency NASA, the French space agency CNES (Centre National d'Etudes Spatiales) and the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT).


News Article | December 21, 2016
Site: www.csmonitor.com

Rebecca Smith poses for a photograph during winter solstice with her Irish Wolfhound dog called Amazing Grace at the 5000 year old stone age tomb of Newgrange (not in view) in the Boyne Valley at sunrise in Newgrange, Ireland, December 21, 2016. Daylight lovers in half the world can rejoice! While winter days in the Northern Hemisphere will get only colder in the coming months, sunset will occur later and later each day after Wednesday morning's solstice. Although the winter solstice is merely a date on the calendar to most modern humans, even those who are happy to begin marking time towards the long, warm days of summer, this date's historical celestial significance makes it remarkable in itself. Astronomically, the December solstice occurs as the sun reaches its southernmost point in the sky, which happened this year at 5:44 a.m., Eastern Standard Time, on Wednesday. Yet while this day marks the beginning of astronomical winter, scientists say that nothing much will change, weather-wise, at least in the short term. In fact, it is not until late January that residents of the Northern Hemisphere will likely experience the coldest day of the season. "There's not a good answer for why people say that December 21 is the beginning of winter," the National Oceanic and Atmospheric Administration's Anthony Arguez said in 2015, according to The Christian Science Monitor. "There's nothing magical that says that winter has to happen after the solstice." Wednesday is a remarkably short day by any terms. Washington, D.C., will experience fewer than nine and a half hours of sunlight, and more northern cities such as New York and Montreal will see even less. The Boston Globe reports that while the number of daylight hours lost between the summer and winter solstices varies by region, residents of some of the largest US cities, including Chicago and New York, can expect to lose six to seven hours of daylight from summer to winter. But for those north of the equator, things will only get brighter from here. From now on, the Northern Hemisphere will slowly tilt back towards the sun, until the longest day of the year occurs on Wednesday, June 21, 2017, also known as the summer solstice in the Northern Hemisphere. Because of that same axial tilt, the Southern Hemisphere is currently angled towards the sun, making Wednesday the Southern Hemisphere's summer solstice. From now on, days will get marginally shorter for residents of São Paulo or Sydney. While some individuals still celebrate both summer and winter solstices, as well as their vernal and autumnal equinox cousins, the solstice was far more important to our ancient ancestors. Some of the world's most awe-inspiring ancient sites were constructed with the celestial calendar in mind, from England's Stonehenge to the Goseck Circle in Germany to pyramids in Mexico and other parts of Central America. Ancient humans used these sites to track the seasons and the rhythms of daily life. On Wednesday, however, modern humans look forward, not to begin planting or breeding their cows, but rather to more daylight and commuting before nightfall.
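
The daylight figures cited above can be approximated with the standard sunrise equation; the Python sketch below is an illustration using that textbook formula, not a calculation taken from the article.

```python
import math

def day_length_hours(latitude_deg, solar_declination_deg):
    """Approximate daylight duration from the standard sunrise equation,
    ignoring atmospheric refraction and the size of the solar disk."""
    lat = math.radians(latitude_deg)
    decl = math.radians(solar_declination_deg)
    # cosine of the hour angle at sunrise/sunset; clamp for polar day/night
    cos_h = max(-1.0, min(1.0, -math.tan(lat) * math.tan(decl)))
    return 2 * math.degrees(math.acos(cos_h)) / 15.0  # the sun moves 15 degrees per hour

# Washington, D.C. (latitude ~38.9 N) on the December solstice,
# when the solar declination is about -23.44 degrees:
print(f"{day_length_hours(38.9, -23.44):.2f} hours")  # ~9.3, under nine and a half
```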


News Article | November 22, 2016
Site: www.eurekalert.org

The Cooperative Institute for Mesoscale Meteorological Studies at the University of Oklahoma collaborates with the National Oceanic and Atmospheric Administration on weather and climate research under the terms of a five-year, $95.3 million agreement with NOAA. CIMMS, the largest and second oldest research center at OU, supports NOAA with two of its next generation, long-term planning initiatives: Weather Ready Nation and Climate Adaptation and Mitigation. "The university is very excited by this new five-year agreement totaling over $95 million to support important weather and climate research on our campus in cooperation with the federal government," said OU President David L. Boren. "It underlines the importance of what is happening at our university. We are proud to be national leaders in this effort." CIMMS contributes to NOAA's enterprise-wide capabilities in science and technology, engagement and organization and administration in the following research areas: weather radar research and development; stormscale and mesoscale modeling research and development; forecast and warning improvements research and development; impacts of climate change related to extreme weather events; and societal and socioeconomic impacts of high impact weather systems. "CIMMS research improves our understanding of stormscale meteorological phenomena, weather radar and regional climate variations," said Interim Director Randy Peppler. "Our ultimate goal is to help NOAA produce better forecasts and warnings that save lives and protect property." CIMMS research affiliates or associates include: the Oceanic and Atmospheric Research National Severe Storms Laboratory; the Oceanic and Atmospheric Research Air Resources Laboratory; the National Weather Service Radar Operations Center for the WSR-88D (NEXRAD) Program; the National Weather Service/National Centers for Environmental Prediction Storm Prediction Center; the National Weather Service Warning Decision Training Division; the National Weather Service Norman Weather Forecast Office; and the National Weather Service Training Center in Kansas City. CIMMS was established in 1978 through a memorandum of agreement between OU and NOAA. As a NOAA cooperative research institute, CIMMS supports scientists, engineers and students who conduct research, training and outreach in mesoscale weather, weather radar and regional-scale climate processes. For more information, contact cimms@nwc.ou.edu or visit http://cimms. .


News Article | March 1, 2017
Site: www.eurekalert.org

PRINCETON, N.J. -- An influx of pollution from Asia in the western United States and more frequent heat waves in the eastern U.S. are responsible for the persistence of smog in these regions over the past quarter century despite laws curtailing the emission of smog-forming chemicals from tailpipes and factories. The study, led by researchers at Princeton University and the National Oceanic and Atmospheric Administration's Geophysical Fluid Dynamics Laboratory (GFDL), highlights the importance of maintaining domestic emission controls on motor vehicles, power plants and other industries at a time when pollution is increasingly global. Published March 1 in the journal Atmospheric Chemistry and Physics, the study looked at the sources of smog, also known as ground-level ozone, across a period ranging from the 1980s to today. Ground-level ozone, which is distinct from the ozone in the upper atmosphere that protects the planet from ultraviolet radiation, is harmful to human health, exacerbating asthma attacks and causing difficulty breathing. It also harms sensitive trees and crops. Despite a 50 percent cut in smog-forming chemicals such as nitrogen oxides, commonly known as "NOx", over the past 25 years, ozone levels measured in rural areas of the west have actually climbed. And while ozone in the eastern U.S. has decreased overall, the levels can spike during heat waves. The study traced the increase of ozone in the west to the influx of pollution from Asian countries, including China, North and South Korea, Japan, India, and other South Asian countries. Collectively, the region has tripled its emissions of NOx since 1990. In the eastern U.S., meanwhile, heat waves -- which have become more frequent in the past few decades -- trap polluted air in place, leading to temporary escalations in locally produced ozone. The study explains why springtime ozone levels measured in Yellowstone National Park and other western parks far from urban areas have climbed over the past quarter century. According to the study, springtime ozone levels in the national parks rose during that period by 5 to 10 parts per billion (ppb), which is significant given that the federal ozone standard is 70 ppb. The influx of pollution from Asia could make it difficult for these areas to comply with the federal ozone standards, according to the study's authors. "Increasing background ozone from rising Asian emissions leaves less room for local production of ozone before the federal standard is violated," said lead author Meiyun Lin, a research scholar in the Program in Atmospheric and Oceanic Sciences at Princeton University and a scientist at GFDL. Lin's co-authors were Larry Horowitz, also of GFDL; Richard Payton and Gail Tonnesen of the U.S. Environmental Protection Agency; and Arlene Fiore of the Lamont-Doherty Earth-Observatory and Department of Earth and Environmental Sciences at Columbia University. Using ozone measurements combined with climate models developed at GFDL, the authors identified pollution from Asia as driving the climb in ozone in western U.S. national parks in the spring, when wind and weather patterns push Asian pollution across the Pacific Ocean. In the summer, when these weather patterns subside, ozone levels in national parks are still above what would be expected given U.S. reductions in ozone-precursors. 
While it has been known for over a decade that Asian pollution contributes to ozone levels in the United States, this study is one of the first to quantify the extent to which rising Asian emissions contribute to U.S. ozone, according to Lin. In the eastern United States, where Asian pollution is a minor contributor to smog, NOx emission controls have been successful at reducing ozone levels. However, periods of extreme heat and drought can trap pollution in the region, making bad ozone days worse. Regional NOx emission reductions alleviated the ozone buildup during the recent heat waves of 2011 and 2012, compared to earlier heat waves such as in 1988 and 1999. As heat waves appear to be on the rise due to global climate change, smog in the eastern U.S. is likely to worsen, according to the study. Climate models such as those developed at GFDL can help researchers predict future levels of smog, enabling cost-benefit analyses for costly pollution control measures. The researchers compared results from a model called GFDL-AM3 to ozone measurements from monitoring stations over the course of the last 35 years, from 1980 to 2014. Prior studies using global models poorly matched the ozone increases measured in western national parks. Lin and co-authors were able to match the measurements by narrowing their analysis to days when the airflow is predominantly from the Pacific Ocean. Modeling the sources of air pollution can help explain where the ozone measured in the national parks is coming from, explained Lin. "The model allows us to divide the observed air pollution into components driven by different sources," she said. The team also looked at other contributors to ground-level ozone, such as global methane from livestock and wildfires. Wildfire emissions contributed less than 10 percent and methane about 15 percent of the western U.S. ozone increase, whereas Asian air pollution contributed as much as 65 percent. These new findings suggest that a global perspective is necessary when designing a strategy to meet U.S. ozone air quality objectives, said Lin. The negative effect of imported pollution on the United States' ability to achieve its air quality goals is not wholly unexpected, according to Owen Cooper, a senior research scientist at the University of Colorado and the NOAA Earth System Research Laboratory, who is familiar with the current study but not directly involved. "Twenty years ago, scientists first speculated that rising Asian emissions would one day offset some of the United States' domestic ozone reductions," Cooper said. "This study takes advantage of more than 25 years of observations and detailed model hindcasts to comprehensively demonstrate that these early predictions were right."
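
As a rough illustration of how those source shares translate into concentrations, the sketch below applies the article's approximate percentages to the quoted 5-10 ppb increase; the study's actual apportionment comes from model experiments, not this simple scaling.

```python
# Scaling the article's approximate source shares against the observed
# 5-10 ppb springtime ozone rise in western U.S. national parks.
low, high = 5.0, 10.0  # ppb increase since ~1990 (from the article)
shares = {
    "Asian air pollution": 0.65,   # "as much as 65 percent"
    "global methane":      0.15,   # "about 15 percent"
    "wildfire emissions":  0.10,   # "less than 10 percent"
}

for source, share in shares.items():
    print(f"{source:>20}: {share * low:3.1f}-{share * high:4.1f} ppb of the increase")
```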


News Article | January 17, 2016
Site: www.fastcompany.com

Update 2:08 p.m.: SpaceX reports that the rocket first stage successfully navigated to the drone ship, but it was a hard landing that broke one of the rocket's landing legs. Update 10:34 p.m.: SpaceX CEO Elon Musk posted video this evening showing the rocket's landing attempt. In a tweet, he wrote, "Falcon lands on drone ship, but the lockout collet doesn't latch on one [of] the four legs," causing it to topple and explode. Update 3:30 p.m.: In two tweets afterwards, SpaceX CEO Elon Musk wrote that it's "Definitely harder to land on a ship. Similar to an aircraft carrier vs land: much smaller target area, that's also translating & rotating. However, that was not what prevented it being good. Touchdown speed was ok, but a leg lockout didn't latch, so it tipped over after landing." Today, at 1:42 p.m. EST, SpaceX will make its fourth attempt to launch, and then land, its Falcon 9 rocket on an at-sea platform. The launch will take place at Vandenberg Air Force Base, California. The goal is to land the rocket's first stage on a so-called "drone ship" known as "Just Read the Instructions." Each of the company's three previous planned attempts to land the rocket at sea had failed, some in spectacular explosions, some in oh-so-close misses, and one when the rocket exploded before it could be brought back for landing. Last month, for the first time, SpaceX successfully landed the Falcon 9 after launch, returning it to terra firma. About 10 minutes after liftoff, the rocket will attempt to land itself upright on the deck of a 100-foot-by-300-foot, unmanned floating platform, about 200 miles from Vandenberg, off the coast of Southern California. The rocket is meant to guide itself to the barge using GPS. Today's landing could be complicated by 12-foot to 15-foot waves. For Musk's company, successfully reusing a rocket—and demonstrating that last month's performance wasn't a fluke—is a key element of a future of affordable launches. "SpaceX believes a fully and rapidly reusable rocket is the pivotal breakthrough needed to substantially reduce the cost of space access," the company says on its website. "The majority of the launch cost comes from building the rocket, which flies only once. Compare that to a commercial airliner—each new plane costs about the same as Falcon 9, but can fly multiple times per day, and conduct tens of thousands of flights over its lifetime. Following the commercial model, a rapidly reusable space launch vehicle could reduce the cost of traveling to space by a hundredfold." Today's mission also had a scientific purpose beyond returning the rocket home. The launch was meant to "deliver the Jason-3 satellite to low-Earth orbit for the U.S. National Oceanic and Atmospheric Administration (NOAA), National Aeronautics and Space Administration (NASA), French space agency Centre National d'Etudes Spatiales (CNES) and the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT)," SpaceX said on its site (PDF). "If all goes as planned, the Jason-3 satellite will be deployed approximately an hour after launch." We will update this article with news of the ocean landing attempt.


News Article | February 24, 2017
Site: www.futurity.org

Warming in the 21st century has reduced Colorado River flows by at least 0.5 million acre-feet—about the amount of water used by 2 million people for one year, a new study warns. "This paper is the first to show the large role that warming temperatures are playing in reducing the flows of the Colorado River," says Jonathan Overpeck, professor of geosciences and of hydrology and atmospheric sciences at the University of Arizona. From 2000-2014, the river's flows declined to only 81 percent of the 20th-century average, a reduction of about 2.9 million acre-feet of water per year. One acre-foot of water will serve a family of four for one year, according to the US Bureau of Reclamation. From one-sixth to one-half of the 21st-century reduction in flow can be attributed to the higher temperatures since 2000. The new analysis shows that as temperatures continue to rise, Colorado River flows will continue to decline. Current climate change models indicate temperatures will increase as long as humans continue to emit greenhouse gases into the atmosphere, but the projections of future precipitation are far less certain. Forty million people rely on the Colorado River for water, according to the US Bureau of Reclamation. The river supplies water to seven US Western states plus the Mexican states of Sonora and Baja California. "The future of Colorado River is far less rosy than other recent assessments have portrayed," says Bradley Udall, a senior water and climate scientist/scholar at Colorado State University's Colorado Water Institute. "A clear message to water managers is that they need to plan for significantly lower river flows." The study's findings "provide a sobering look at future Colorado River flows." The Colorado River Basin has been in a drought since 2000. Previous research has shown the region's risk of a megadrought—one lasting more than 20 years—rises as temperatures increase. "We're the first to make the case that warming alone could cause Colorado River flow declines of 30 percent by midcentury and over 50 percent by the end of the century if greenhouse gas emissions continue unabated," Overpeck says. The researchers began their study, published in the journal Water Resources Research, because Udall learned that recent Colorado flows were lower than managers expected given the amount of precipitation. The team wanted to provide water managers with insight into how future projections of temperature and precipitation for the Colorado River Basin would affect the river's flows. They began by looking at the drought years of 2000-2014. About 85 percent of the river's flow originates as precipitation in the Upper Basin—the part of the river that drains portions of Wyoming, Utah, Colorado, and New Mexico. The team found that during 2000-2014, temperatures in the river's Upper Basin were 1.6 degrees Fahrenheit (0.9 degree Celsius) higher than the average for the previous 105 years. To see how increased temperatures might contribute to the reductions in the river's flow that have been observed since 2000, they reviewed and synthesized 25 years of research about how climate and climate change have and will affect the region and how temperature and precipitation affect the river's flows. Water loss increases as temperatures rise because plants use more water, and higher temperatures increase evaporative loss from the soil and from the water surface and lengthen the growing season.
In previous studies, researchers have shown that current climate models simulate 20th-century conditions well, but the models cannot simulate the 20- to 60-year megadroughts known to have occurred in the past. Moreover, many of those models did not reproduce the current drought. Those researchers and others suggest the risk of a multidecadal drought in the Southwest in the 21st century is much higher than climate models indicate and that as temperatures increase, the risk of such a drought increases. "A megadrought in this century will throw all our operating rules out the window," Udall says. The findings show that all current climate models agree that temperatures in the Colorado River Basin will continue rising if the emission of greenhouse gases is not curbed. However, the models' predictions of future precipitation in the basin have much more uncertainty. "Even if the precipitation does increase, our work indicates that there are likely to be drought periods as long as several decades when precipitation will still fall below normal," Overpeck says. The new study suggests Colorado River flows will continue to decline. "I was surprised at the extent to which the uncertain precipitation aspects of the current projections hid the temperature-induced flow declines," Udall says. The US Bureau of Reclamation lumps temperature and precipitation together in its projections of Colorado River flow, he says. "Current planning understates the challenge that climate change poses to the water supplies in the American Southwest. My goal is to help water managers incorporate this information into their long-term planning efforts." The Colorado Water Institute, National Science Foundation, the National Oceanic and Atmospheric Administration, and the US Geological Survey funded the work.
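
A quick check of the fractions quoted above, in million acre-feet (maf), is sketched below; the one-half upper bound reflects the study's stated range, not this arithmetic.

```python
# Lower-bound share of the flow reduction attributable to warming,
# using the figures quoted above.
total_reduction_maf = 2.9     # average annual shortfall vs. the 20th-century mean
warming_reduction_maf = 0.5   # reduction attributed to higher temperatures (at least)

print(f"Warming share: {warming_reduction_maf / total_reduction_maf:.0%}")
# -> about 17 percent, matching the "one-sixth" lower bound quoted above.
```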


News Article | November 18, 2016
Site: www.newscientist.com

In a ghost town of dead coral off a remote Pacific island, scientists have found a little more life. In excursions a year ago and then last April, scientists examined the normally stunning coral reefs around the island of Kiritimati and pronounced them mostly a boneyard of dead coral. About 85 per cent of the coral was dead, 10 per cent was sick and bleached but still technically alive, and only 5 per cent was doing okay. The same scientists returned this month and found that 6 to 7 per cent of the coral is alive and not bleached, says Julia Baum, a coral reef scientist from the University of Victoria in Canada. "We left with a sense of dread and came back with a renewed purpose because there are some corals that literally came back from the brink," says Kim Cobb, a climate scientist from Georgia Tech in the US, who returned from the expedition earlier. "It's the best we could have hoped for." Many of the fish that rely on the reef and had been absent seem to be back, Cobb says. Hot water – mostly from El Niño, the natural occasional warming of the Pacific that changes weather worldwide, and man-made global warming – had made the area one of the worst-hit coral spots in the world. Later, nearby Jarvis Island was even more damaged. And the death of 85 per cent of the coral of the better-known and much larger Great Barrier Reef has been reported, says Mark Eakin, Coral Reef Watch coordinator for the National Oceanic and Atmospheric Administration. "But despite this mass mortality, there are a few small signs of hope," Baum says. "It's clear that coral reefs have great resilience and the coral here is trying to recover." Not only has some of the bleached coral recovered, she says, but "there are coral babies that have settled on the reef some time in the last year to a year and a half and these are the reef's best hope for recovery". A study published on Thursday in the journal Current Biology goes back more than a million years and finds that even during mass die-offs, coral species are able to rebound. Eakin points to Scott Reef off Western Australia where, 12 years after the damaging 1998 El Niño coral die-off, nearly half the original reef had revived. But it was damaged again by the recent El Niño. Even after the recovery seen at Kiritimati, Baum is wary. "It's like having a patient who is very sick and instead of letting them recover we keep infecting them with more and more illnesses," Baum says. "There's only so much that any person – or any natural system – can take."


News Article | February 19, 2017
Site: www.theguardian.com

The embattled Great Barrier Reef could face yet more severe coral bleaching in the coming month, with areas badly hit by last year’s event at risk of death. Images taken by local divers last week and shared exclusively with the Guardian by the Australian Marine Conservation Society show newly bleached corals discovered near Palm Island. Most of the Great Barrier Reef has been placed on red alert for coral bleaching for the coming month by the US National Oceanic and Atmospheric Administration (NOAA). Its satellite thermal maps have projected unusually warm waters off eastern Australia after an extreme heatwave just over a week ago saw land temperatures reach above 47C in parts of the country. According to the Great Barrier Reef Marine Park Authority, sea surface temperatures from Cape Tribulation to Townsville have been up to 2C higher than normal for the time of year for more than a month. The NOAA Coral Reef Watch’s forecast for the next four weeks has placed an even higher-level alert on parts of the far northern, northern and central reef, indicating mortality is likely. Corals south of Cairns, in the Whitsundays and parts of the far northern reef that were badly hit by last year’s mass bleaching event are at fatal risk. Imogen Zethoven, the AMCS’s Great Barrier Reef campaign director, said the projections for the next four weeks, plus evidence of new coral bleaching, were “extremely concerning”. The bleaching that occurred over eight to nine months of last year was the worst ever recorded for the Great Barrier Reef, with as much as 85% of coral between Cape York and Lizard Island dying. Twenty-two per cent of corals over the entire reef are dead. Zethoven pointed to projections by NOAA that severe bleaching of the Great Barrier Reef would occur annually by 2043 if nothing was done to reduce emissions. “The reef will be gone before annual severe bleaching,” she said. “It won’t survive even biennial bleaching.” The $1bn reef fund announced by the prime minister, Malcolm Turnbull, in June last year was a “cynical rebadging exercise” undercut by its support for fossil fuel initiatives such as Adani’s Carmichael coalmine “that will spell catastrophe for the reef”, Zethoven said. “There’s no doubt about that anymore,” she said. “They know what they are doing and they should come clean with the Australian public that they have no interest in the long-term survival of the Great Barrier Reef. “To the average person on the street, that’s what it looks like. And if the government thinks that’s not the case, they’re out of touch.” In December last year the government’s Northern Australia Infrastructure Fund granted Adani “conditional approval” for a $1bn loan for its Carmichael coalmine and rail project in central Queensland, which could produce 60m tons of coal annually for 60 years. Warmer ocean temperatures brought about by climate change are a key factor in coral bleaching. Polling suggests that more than two-thirds of Australians believe the reef’s condition should be declared a national emergency. Zethoven said the government had made “a very deliberate decision to go down the coal road”, despite it jeopardising the reef’s future prospects as well as the 70,000 jobs in regional Queensland that depend on it. John Rumney, a diving operator based in Port Douglas, said the “commercial advantage” to saving the reef went beyond jobs. Much of coastal Queensland was “majorly invested” in reef tourism, he said.
The federal government’s measures to save the reef were hypocrisy and lip service, he said, when it was simultaneously “actively supporting the cause of the cancer – the worst cause”. “It’s immoral that those of us who are making our living from a healthy environment are paying taxes to subsidise infrastructure that’s going to cause climate change in a major way for the next 50 years,” he said. “If this all goes ahead, we’re basically dooming our tourism industry.” Rumney said he had seen new and extensive bleaching of corals from Cairns to Townsville. “There are definite large areas of mortality. It’s just the next depressing moment. Before, the reef has bleached and recovered but now we’re talking about how often is it bleaching and what percentage is left.” Areas that suffered in last year’s event were now less resilient and there seemed to be less coral strong enough to spawn. Climate change-induced mass bleaching increasingly resembled a catastrophe the reef would be unable to recover from, he said. “It’s weaker, just like humans,” Rumney said. “If you’re already down and out with a cold or cancer, you’re less resilient – the next thing that comes along is going to knock you back more. “It’s the continual onslaught that will eventually kill the reef.”


What Rick Perry told senators vetting him for Energy secretary varied from what he wrote in his 2010 book, "Fed Up! Our Fight to Save America from Washington." Nowhere in his testimony before the Senate Energy and Natural Resources Committee in January did the former Republican governor of Texas reiterate his written belief of a masterful cover-up of data proving "global cooling." The climate, he testified instead, is changing. Also missing was his insistence that the bloated federal government, overspent and underappreciative of states' rights, get out of the way. Perry instead offered to defend DOE if confirmed. "He went before the committee and it was like he never wrote that book," said Jeff Navin, co-founder and partner at Boundary Stone Partners and a deputy chief of staff at DOE during the Obama administration. With the Senate voting today 62-37 to confirm Perry to lead DOE, his shifting positions have raised questions about the fate of the sprawling department, its budget and research at 17 national labs. Speaking on the condition of anonymity, DOE workers said it's unclear whether Perry will pursue the agency's "all of the above" mission or advocate for wind power as he did as governor of Texas. Whether Perry will push back on rumored budget cuts, continue DOE's climate-related research and how he'll respond to a congressional review of agency spending are also looming questions. Another DOE worker said it's not Perry that concerns employees, but the possible effect of a host of conservative policy shops and think tanks on the agency's $30 billion budget and mission (Greenwire, Feb. 17). Perry at his confirmation hearing carefully stepped back from his past call to dismantle DOE. In a sharp pivot, Perry said he regretted his famous 2011 blunder in a nationally televised Republican presidential primary debate when he forgot the name of the federal agency he wanted to dismantle. Like his book, that statement came during the heated months of debate leading up to the 2012 presidential election. While many a politician has penned a tome to boost support from their base ahead of tight elections, some observers have suggested Perry's book is a departure from past, poorly written political memoirs. "Fed Up!," some say, provides insight into the former governor's mindset, one that takes aim at the political culture of Washington, analyzes states' rights and admonishes regulations on everything from energy production to how much salt Americans can put on their food. "This is not a boring book," Washington Post reporter Ezra Klein wrote on Aug. 15, 2011. "More to the point, it's not even a book about Rick Perry. It's a book about Rick Perry's ideas. And his big idea is that most everything the federal government does is unconstitutional." But other observers say the book isn't indicative of how Perry will lead the agency under the Trump administration, but rather a reflection of a candidate running in a presidential election long past. Frank Maisano of Bracewell LLP said he believes Perry will have a significant say in how the agency is reorganized and what's a priority, within the context of Trump's overall vision for the administration. Perry will be an advocate for DOE, Maisano said, one who shapes the budget in cooperation with Congress and the White House.
"Bottom line, his book is a different Rick Perry from Governor Rick Perry," Maisano said. "I expect that Governor Rick Perry is probably more likely to emerge as a manager of DOE and a leader of DOE than the guy who wrote the book." Perry's past statements on federal spending and climate science are already coloring his relationship with Congress and will likely shape the outcome of his confirmation vote — at least among Democrats. Sen. Al Franken (D-Minn.) in January pushed Perry on his assertion in the book of a global cooling trend. Franken asked whether Perry believed the changing climate is linked to human activity. "Far from me to be sitting before you today and claiming to be a climate scientist," said Perry. "I will not do that." "I don't think you're ever going to be a climate scientist," Franken responded. "But you're going to be the head of the Department of Energy." Sen. Maria Cantwell (D-Wash.), ranking member of the Energy and Natural Resources Committee, pursued a similar line of questioning. Cantwell said the National Oceanic and Atmospheric Administration had recently found the planet's 2016 surface temperatures were the warmest since modern record keeping began in 1880, and melting of sea ice was at an all-time high. "How do we know all this?" Cantwell asked Perry. "We know this because the Department of Energy does the research." Cantwell said she hoped Perry would understand and lead the federal science mission — be it in DOE labs or partnering universities — and quell anxieties over the Trump administration's plans to scrap or starve programs tied to climate change, efficiency and clean energy. Perry at the hearing said some climate change is naturally occurring and that some is due to human activity. He also vowed to make decisions based on "sound science" and touted his work as Texas governor in bringing down carbon output and other air pollutants while growing the economy. But a quick look at "Fed Up!" shows Perry's past views were quite different. Perry in the book blasted former Vice President Al Gore for raising concerns about melting icebergs and "undersized" polar bears and accused the left of embracing the fantasy of climate change. "Hollywood toasted him as their hero. The Nobel committee gave him a peace prize," Perry wrote. "He won an Oscar. And it's all one contrived phony mess that is falling apart under his own weight. Al Gore is a prophet all right, a false prophet of a secular carbon cult, and now even moderate Democrats aren't buying it." Perry also touched on the need to revisit federal spending in his book, writing, "I think we should have a legitimate, honest, national discussion about Washington's continuing to spend money we don't have on programs that we don't need.” Perry appeared to be blindsided at his hearing about possible budget cuts that Trump's transition team had considered, including cuts to nuclear physics and advanced scientific computing research, efficiency, renewables, and fossil research. When asked about the reports, Perry quickly pivoted to his 2011 blunder (Greenwire, Jan. 19). "Well, senator," Perry told Sen. Mazie Hirono (D-Hawaii), "maybe [the Trump administration will] have the same experience I had and forget they said that." Reprinted from Greenwire with permission from E&E News. Copyright 2017. E&E provides essential news for energy and environment professionals at www.eenews.net


News Article | April 4, 2016
Site: news.yahoo.com

An aerial view of a section of the Great Barrier Reef, with bleached corals visible in the water. The northern part of the world's largest coral reef ecosystem is experiencing "the worst mass bleaching event in its history," according to a statement released Tuesday (March 29) by the Australian Research Council. In aerial surveys documented by the National Coral Bleaching Taskforce (NCBT), observations of more than 500 coral reefs spanning 2,485 miles (4,000 kilometers) showed that the majority of reefs were undergoing extensive and severe bleaching. "Almost without exception, every reef we flew across showed consistently high levels of bleaching, from the reef slope right up onto the top of the reef," said Terry Hughes of the NCBT, calling the surveys "the saddest research trip of my life." Bleaching happens when corals are exposed to stresses such as warmer-than-average waters for prolonged periods of time. The corals respond to the stress by expelling the algae that provide them with their color, which makes the corals look like they've been bleached white. Bleaching can be fatal for corals if the stress is too intense, or if it continues for too long and the algae are unable to recolonize them. Australia's Great Barrier Reef (GBR) covers 134,364 square miles (348,000 square kilometers), making it larger than the U.K., Switzerland and the Netherlands combined, according to the Great Barrier Reef Marine Park Authority. Recognized as a World Heritage Area in 1981, the reef contains 400 types of coral and hosts 1,500 types of fish and 4,000 mollusk species, as well as other marine life such as large green turtles and dugongs ("sea cows"). The GBR experienced bleaching events in 1998 and in 2002, but the current mass bleaching is much more severe, experts say. Rebecca Albright, a marine biologist with the Carnegie Institution for Science in Washington, D.C., has studied the GBR since 2011. Albright told Live Science that 95 percent of the GBR's northern reefs are currently showing signs of extreme bleaching, compared with 18 percent that experienced bleaching in 2002. Even the more robust corals are affected, Albright said, another sign that this event is particularly serious. She cautioned that it's still too early to assess the long-term impacts of bleaching on the corals, though estimates of coral mortality anticipate losses of about 50 percent. Two factors are responsible for stressing the corals, Albright said: climate change, which is driving ocean temperatures upward, and a strong El Niño — a cyclical climate event associated with warmer-than-average sea surface temperatures in the tropical Pacific. And with El Niño conditions expected to extend through 2016, that doesn't bode well for the corals' recovery. "Corals are sensitive to not only the anomaly in temperature — how high it goes — but also the duration of that exposure," Albright told Live Science. "This kind of perfect storm of all these factors coming together makes this a catastrophic scenario right now." But what's happening to the GBR is only part of the picture. A global bleaching event prolonged by El Niño is currently underway — "the longest coral die-off on record," according to a statement released by the National Oceanic and Atmospheric Administration (NOAA) on Feb. 23.
Mark Eakin, coordinator of the NOAA Coral Reef Watch program, told Live Science that the event, which began in 2014 in the Pacific, could linger through 2017. "We consider it a global bleaching event if it's widespread in all three of the major ocean basins — Indian, Atlantic and Pacific," he said. Eakin described current reports of bleaching that extend over half of the Southern Hemisphere, with severe bleaching in New Caledonia, Fiji and southern Indonesia, as well as in the GBR. Even fast-growing corals take decades to develop, so damaged reefs will need time before they're restored to their former level of health, Eakin said. And recovery time may be in short supply. Global bleaching events have been expanding their reach and increasing in severity since the first event was documented in 1998, Eakin told Live Science.
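Albright's point that both the size of the temperature anomaly and its duration matter is essentially what NOAA's Coral Reef Watch formalizes in its Degree Heating Week (DHW) metric, which accumulates how far sea surface temperature exceeds a location's usual summertime maximum over a rolling 12-week window. The sketch below is a simplified illustration of that idea, not Coral Reef Watch's operational code; the thresholds follow commonly published rules of thumb, and the temperature series and baseline are invented.

```python
# Simplified sketch of a Degree Heating Weeks (DHW) calculation in the
# spirit of NOAA Coral Reef Watch. Not the operational algorithm: the
# synthetic SST series and the baseline value below are assumptions.

def degree_heating_weeks(sst_weekly_c, max_monthly_mean_c):
    """Sum weekly SST excesses of at least 1 C above the maximum
    monthly mean, over the most recent 12 weeks (units: C-weeks)."""
    hotspots = [t - max_monthly_mean_c for t in sst_weekly_c[-12:]]
    return sum(h for h in hotspots if h >= 1.0)

mmm = 29.0                         # assumed local max monthly mean, deg C
sst = [mmm + 1.5] * 12             # 12 weeks running 1.5 C too warm
print(degree_heating_weeks(sst, mmm))   # 18.0 C-weeks
# Common rules of thumb: ~4 C-weeks, significant bleaching likely;
# ~8 C-weeks, widespread bleaching and some mortality expected.
```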


News Article | April 27, 2016
Site: www.rdmag.com

From a quarter to half of Earth's vegetated lands have shown significant greening over the last 35 years largely due to rising levels of atmospheric carbon dioxide, according to a new study published in the journal Nature Climate Change on April 25. An international team of 32 authors from 24 institutions in eight countries led the effort, which involved using satellite data from NASA's Moderate Resolution Imaging Spectroradiometer and the National Oceanic and Atmospheric Administration's Advanced Very High Resolution Radiometer instruments to help determine the leaf area index, or amount of leaf cover, over the planet's vegetated regions. The greening represents an increase in leaves on plants and trees equivalent in area to two times the continental United States. Green leaves use energy from sunlight through photosynthesis to chemically combine carbon dioxide drawn in from the air with water and nutrients tapped from the ground to produce sugars, which are the main source of food, fiber and fuel for life on Earth. Studies have shown that increased concentrations of carbon dioxide increase photosynthesis, spurring plant growth. However, carbon dioxide fertilization isn't the only cause of increased plant growth--nitrogen, land cover change and climate change by way of global temperature, precipitation and sunlight changes all contribute to the greening effect. To determine the extent of carbon dioxide's contribution, researchers ran the data for carbon dioxide and each of the other variables in isolation through several computer models that mimic the plant growth observed in the satellite data. Results showed that carbon dioxide fertilization explains 70 percent of the greening effect, said co-author Ranga Myneni, a professor in the Department of Earth and Environment at Boston University. "The second most important driver is nitrogen, at 9 percent. So we see what an outsized role CO2 plays in this process." About 85 percent of Earth's ice-free lands are covered by vegetation. The area covered by all the green leaves on Earth is equal to, on average, 32 percent of Earth's total surface area - oceans, lands and permanent ice sheets combined. The extent of the greening over the past 35 years "has the ability to fundamentally change the cycling of water and carbon in the climate system," said lead author Zaichun Zhu, a researcher from Peking University, China, who did the first half of this study with Myneni as a visiting scholar at Boston University. Every year, about half of the 10 billion tons of carbon emitted into the atmosphere from human activities remains temporarily stored, in about equal parts, in the oceans and plants. "While our study did not address the connection between greening and carbon storage in plants, other studies have reported an increasing carbon sink on land since the 1980s, which is entirely consistent with the idea of a greening Earth," said co-author Shilong Piao of the College of Urban and Environmental Sciences at Peking University. While rising carbon dioxide concentrations in the air can be beneficial for plants, carbon dioxide is also the chief culprit of climate change. The gas, which traps heat in Earth's atmosphere, has been increasing since the industrial age due to the burning of oil, gas, coal and wood for energy and is continuing to reach concentrations not seen in at least 500,000 years. The impacts of climate change include global warming, rising sea levels, melting glaciers and sea ice as well as more severe weather events.
The beneficial impacts of carbon dioxide on plants may also be limited, said co-author Dr. Philippe Ciais, associate director of the Laboratory of Climate and Environmental Sciences, Gif-sur-Yvette, France. "Studies have shown that plants acclimatize, or adjust, to rising carbon dioxide concentration and the fertilization effect diminishes over time." "While the detection of greening is based on data, the attribution to various drivers is based on models," said co-author Josep Canadell of the Oceans and Atmosphere Division in the Commonwealth Scientific and Industrial Research Organisation in Canberra, Australia. Canadell added that while the models represent the best possible simulation of Earth system components, they are continually being improved.
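To make the "variables in isolation" step concrete, here is a toy sketch of that attribution logic. The "model" is an invented linear response and its coefficients are placeholders, chosen only so the output loosely echoes the shares reported above; the real study drove full ecosystem models with observed forcings.

```python
# Toy factor-isolation attribution in the spirit of the study's method.
# The linear "model" and its coefficients are invented placeholders;
# the actual work used full ecosystem models and observed forcings.
import numpy as np

t = np.arange(1982, 2016) - 1982
forcings = {                 # invented trends, arbitrary units per year
    "co2": 1.8 * t,
    "nitrogen": 0.25 * t,
    "climate": 0.4 * t,
    "land_cover": 0.1 * t,
}

def toy_leaf_area(active):
    """Leaf-area response with only the `active` drivers varying."""
    return sum(forcings[name] for name in active)

full = np.polyfit(t, toy_leaf_area(forcings), 1)[0]  # trend, all drivers on
for name in forcings:
    alone = np.polyfit(t, toy_leaf_area([name]), 1)[0]
    print(f"{name:10s} ~ {100 * alone / full:4.1f}% of the greening trend")
```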


News Article | February 21, 2017
Site: www.eurekalert.org

Warming in the 21st century reduced Colorado River flows by at least 0.5 million acre-feet, about the amount of water used by 2 million people for one year, according to new research from the University of Arizona and Colorado State University. The research is the first to quantify the different effects of temperature and precipitation on recent Colorado River flow, said authors Bradley Udall of CSU and Jonathan Overpeck of the UA. "This paper is the first to show the large role that warming temperatures are playing in reducing the flows of the Colorado River," said Overpeck, UA Regents' Professor of Geosciences and of Hydrology and Atmospheric Sciences and director of the UA Institute of the Environment. From 2000 to 2014, the river's flows declined to only 81 percent of the 20th-century average, a reduction of about 2.9 million acre-feet of water per year. One acre-foot of water will serve a family of four for one year, according to the U.S. Bureau of Reclamation. From one-sixth to one-half of the 21st-century reduction in flow can be attributed to the higher temperatures since 2000, report Udall and Overpeck. Their analysis shows that as temperature continues to increase with climate change, Colorado River flows will continue to decline. Current climate change models indicate temperatures will increase as long as humans continue to emit greenhouse gases into the atmosphere, but the projections of future precipitation are far less certain. Forty million people rely on the Colorado River for water, according to the U.S. Bureau of Reclamation. The river supplies water to seven U.S. Western states plus the Mexican states of Sonora and Baja California. Udall, a senior water and climate scientist/scholar at CSU's Colorado Water Institute, said, "The future of the Colorado River is far less rosy than other recent assessments have portrayed. A clear message to water managers is that they need to plan for significantly lower river flows." The study's findings, he said, "provide a sobering look at future Colorado River flows." The Colorado River Basin has been in a drought since 2000. Previous research has shown the region's risk of a megadrought--one lasting more than 20 years--rises as temperatures increase. Overpeck said, "We're the first to make the case that warming alone could cause Colorado River flow declines of 30 percent by midcentury and over 50 percent by the end of the century if greenhouse gas emissions continue unabated." The paper by Udall and Overpeck, "The 21st Century Colorado River Hot Drought and Implications for the Future," went online Feb. 17 in the American Geophysical Union journal Water Resources Research. The Colorado Water Institute, National Science Foundation, the National Oceanic and Atmospheric Administration and the U.S. Geological Survey funded the research. The team began its investigation because Udall learned that recent Colorado flows were lower than managers expected given the amount of precipitation. The two researchers wanted to provide water managers with insight into how future projections of temperature and precipitation for the Colorado River Basin would affect the river's flows. Udall and Overpeck began by looking at the drought years of 2000-2014. About 85 percent of the river's flow originates as precipitation in the Upper Basin--the part of the river that drains portions of Wyoming, Utah, Colorado and New Mexico.
The team found that during 2000-2014, temperatures in the river's Upper Basin were 1.6 degrees F (0.9 C) higher than the average for the previous 105 years. To see how increased temperatures might contribute to the reductions in the river's flow that have been observed since 2000, Udall and Overpeck reviewed and synthesized 25 years of research about how climate and climate change have affected and will affect the region and how temperature and precipitation affect the river's flows. Water loss increases as temperatures rise because plants use more water, and higher temperatures increase evaporative loss from the soil and from the water surface and lengthen the growing season. In previous research, Overpeck and other colleagues showed current climate models simulated 20th-century conditions well, but the models cannot simulate the 20- to 60-year megadroughts known to have occurred in the past. Moreover, many of those models did not reproduce the current drought. Those researchers and others suggest the risk of a multidecadal drought in the Southwest in the 21st century is much higher than climate models indicate and that as temperatures increase, the risk of such a drought increases. Udall said, "A megadrought in this century will throw all our operating rules out the window." Udall and Overpeck found all current climate models agree that temperatures in the Colorado River Basin will continue rising if the emission of greenhouse gases is not curbed. However, the models' predictions of future precipitation in the Basin have much more uncertainty. Overpeck said, "Even if the precipitation does increase, our work indicates that there are likely to be drought periods as long as several decades when precipitation will still fall below normal." The new study suggests Colorado River flows will continue to decline. Udall said, "I was surprised at the extent to which the uncertain precipitation aspects of the current projections hid the temperature-induced flow declines." The U.S. Bureau of Reclamation lumps temperature and precipitation together in its projections of Colorado River flow, he said. "Current planning understates the challenge that climate change poses to the water supplies in the American Southwest," Udall said. "My goal is to help water managers incorporate this information into their long-term planning efforts."
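The article's volume figures hang together, and it is worth seeing why. A quick back-of-the-envelope sketch, using only numbers quoted above (including Reclamation's rule of thumb that one acre-foot serves a family of four for a year); the implied 20th-century average in the last line is an inference from the two quoted figures, not a number stated in the article:

```python
# Back-of-the-envelope check of the figures quoted above, using the
# Bureau of Reclamation rule of thumb: 1 acre-foot ~ a family of four
# (4 people) for one year.
PEOPLE_PER_ACRE_FOOT = 4

warming_loss_maf = 0.5          # million acre-feet/yr attributed to warming
people_served = warming_loss_maf * 1e6 * PEOPLE_PER_ACRE_FOOT
print(f"{people_served / 1e6:.0f} million people")   # 2 million, as stated

# If 2000-2014 flow was 81% of the 20th-century average and the
# shortfall was ~2.9 million acre-feet/yr, the implied average is:
avg_flow_maf = 2.9 / (1 - 0.81)
print(f"implied 20th-century average ~ {avg_flow_maf:.1f} Maf/yr")  # ~15.3
```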


News Article | June 8, 2016
Site: www.techtimes.com

A proposal seeks to ban the import of American lobsters by the 28 member-nations of the European Union (EU). The move comes after Sweden found 32 of these lobsters in its waters, deeming them invasive species that are likely to overtake the native lobster populations and spread diseases. American fishery officials expressed disagreement with the plan, with National Oceanic and Atmospheric Administration (NOAA) assistant administrator Eileen Sobeck penning a letter to EU officials and highlighting the proposal’s lack of scientific basis. The letter includes NOAA's and the Canadian Department of Fisheries and Oceans’ data analysis and a paper from University of Maine scientist Robert Steneck — both arguing against the Swedish findings. But why do American lobsters suddenly have a bad rap in Europe? “American lobster can carry diseases and parasites that can spread to the European lobster and cause extremely high mortality rate,” said Swedish climate and environment minister Asa Romson in a statement, adding that it will also compete for local food and habitat. The 85-page report on American lobsters posing a “very high” risk to native species came after Sweden discovered 32 American lobsters in its waters over an eight-year period, including 26 in 2014. These foreign lobsters have also been seen off the coasts of Denmark, Ireland, Norway, and the United Kingdom, the report noted. Four of the American lobsters found were said to carry eggs, including eggs with traits from both American and Swedish lobsters. It will be “impossible to eradicate” the established species due to natural dispersal capability, the report stated. According to Romson, Scandinavian lobsters are quite small and delicate, and any hybrid formed with their American counterparts can result in “negative genetic effects” and threaten the European variety. The Swedish marine and water management agency listed a deadly bacterial shell disease on its website as a potential consequence, although adding that evidence behind the link remains low. The country also cited parasites eating American lobsters’ eggs as a potential threat to crabs and other seafood.
Not Backed By "Best Available Science"
U.S. and Canadian authorities, however, were quick to question the report's claims. “Our initial findings suggest that these conclusions are not supported by the best available science,” wrote Sobeck, not dismissing the idea that the proposal could be violating international trade rules. Robert Bayer, executive director of the Lobster Institute at the University of Maine, also disputed many of the report's claims. Bacterial disease has remained dormant for at least a decade now, while shell disease is not contagious, he said. Europe imports around 13,000 metric tons of American lobsters annually. The goods are delivered alive in order to preserve their freshness. The possible ban is expected to deliver a hard blow to the American and Canadian lobster market, which exports $200 million of fresh products to Europe every year. NOAA refused to comment on whether retaliation will follow in the form of banning European seafood imports. Steven Wilson, deputy director of its Office of International Affairs and Seafood Inspection, said the agency is not in a position to elaborate on trade matters — it is tasked only with making the scientific case that American lobsters will not thrive in those overseas waters or overcome the native species.


News Article | March 17, 2016
Site: phys.org

The map was generated from the first 10 days of data collected once Jason-3 reached its operational orbit of 1,336 kilometers on Feb. 12. It shows the continuing evolution of the ongoing El Niño event that began early last year. After peaking in January, the high sea levels in the eastern Pacific are now beginning to shrink. Launched Jan. 17 from California's Vandenberg Air Force Base, Jason-3 is operated by the National Oceanic and Atmospheric Administration (NOAA) in partnership with NASA, the French Space Agency Centre National d'Etudes Spatiales (CNES) and the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT). Its nominal three-year mission will continue a nearly quarter-century record of monitoring changes in global sea level. These measurements of ocean surface topography are used by scientists to help calculate the speed and direction of ocean surface currents and to gauge the distribution of solar energy stored in the ocean. Information from Jason-3 will be used to monitor climate change and track phenomena like El Niño. It will also enable more accurate weather, ocean and climate forecasts, including helping global weather and environmental agencies more accurately forecast the strength of tropical cyclones. Jason-3 data will also be used for other scientific, commercial and operational applications, including monitoring of deep-ocean waves; forecasts of surface waves for offshore operators; forecasts of currents for commercial shipping and ship routing; coastal forecasts to respond to environmental challenges like oil spills and harmful algal blooms; and coastal modeling crucial for marine mammal and coral reef research. "We are very happy to have been able to deploy the JASON-3 satellite on its orbit so quickly, just behind JASON-2, allowing us to begin the mission product comparison with JASON-2 so easily," said Gérard Zaouche, CNES project manager. "The performances of this new mission are already very promising. Thanks to the good behavior of the instruments, the satellite and all the elements of the system, users will be able to benefit soon from this new high-accuracy mission." That record began with the 1992 launch of the NASA/CNES TOPEX/Poseidon mission (1992-2006) and was continued by Jason-1 (2001-2013) and Jason-2, launched in 2008 and still in operation. Data from Jason-3's predecessor missions show that mean sea level has been rising by about 0.12 inches (3 millimeters) a year since 1993. Over the past several weeks, mission controllers activated and checked out Jason-3's systems, instruments and ground segment, all of which are functioning properly. They also maneuvered Jason-3 into its operational orbit, where it now flies in formation with Jason-2 in the same orbit, approximately 80 seconds apart. The two satellites will make nearly simultaneous measurements over the mission's six-month checkout phase to allow scientists to precisely calibrate Jason-3's instruments. "Jason-3 is continuing the climate data record of sea level change as measured by altimeters going back to 1992," said Remko Scharroo, remote sensing scientist at EUMETSAT. "The Jason missions have become the reference for all satellite altimeters. "Until the summer, Jason-2 and Jason-3 overfly the same spot of ocean just 80 seconds apart. This allows us to cross-calibrate those missions with extreme precision of less than one millimeter of sea level, thus ensuring a consistent time series.
"With the Sentinel-3 just launched as well, one of our first efforts during the commissioning of the Sentinel-3 SRAL altimeter will be to calibrate it against the Jason-2 and -3 missions. "Taken together, these missions will help us not only to monitor the large-scale changes of the ocean but also those at smaller scales. "The myriad of benefits of Jason-3 include near real-time applications such as hurricane forecasting, monitoring of El Niño, and modeling of ocean currents. And also societal benefits for the long term, such as the monitoring of sea level rise." Once Jason-3 is fully calibrated and validated, it will begin full science operations, precisely measuring the height of 95 percent of the world's ice-free ocean every 10 days and providing oceanographic products to users around the world. Jason-2 will then be moved into a new orbit, with ground tracks that lie halfway between those of Jason-3. This move will double coverage of the global ocean and improve data resolution for both missions. This tandem mission will improve our understanding of ocean currents and eddies and provide better information for forecasting them throughout the global oceans. EUMETSAT, CNES and NOAA will process data from Jason-3, with EUMETSAT being responsible for data services to users of the EUMETSAT and EU Member States, on behalf of the EU Copernicus Programme. Data access in Europe will be secured via the multi-mission infrastructure available at EUMETSAT and CNES, including EUMETSAT's EUMETCast real-time data dissemination system, Earth Observation Portal and archives, as well as the CNES/AVISO data system. Jason-3 is the result of an international partnership between EUMETSAT, the French Space Agency (CNES), the US National Oceanic and Atmospheric Administration (NOAA), the US National Aeronautics and Space Administration (NASA), and the European Union, which funds European contributions to Jason-3 operations as part of its Copernicus Programme. Within Copernicus, Jason-3 is the reference mission for cross-calibrating Sentinel-3 observations of sea surface height and the precursor to the future cooperative Sentinel-6/Jason-CS mission also implemented in partnership between Europe and the United States.


News Article | December 19, 2016
Site: www.realclimate.org

The Norwegian Meteorological institute has celebrated its 150th anniversary this year. It was founded to provide weather data and tentative warnings to farmers, sailors, and fishermen. The inception of Norwegian climatology in the mid-1800s started with studies of geographical climatic variations to adapt important infrastructure to the ambient climate. The purpose of the meteorology and climatology was to protect lives and properties. The journey from the early history of meteorology and climatology to the present weather forecasts and climate research is one of mankind’s great success stories. In the early days, there was a belief that the weather was influenced by sunspots and northern lights, but this notion lost its traction as meteorology became more and more successful in forecasting. Modern meteorology started in 1904 with a landmark theory proposed by Vilhelm Bjerknes that made it possible to compute how the atmospheric state changes over time, based on a set of key variables and differential equations. The progress was built on science and painstaking efforts, as described by Paul Edwards in his book “A Vast Machine”. Today, weather forecasts ensure safety over a wider range of dimensions depending on where you are. Some of the most important sectors for Met Norway include roads, rail, aviation, and maritime operations. However, the general public and businesses are also important recipients, and they benefit from an open data policy and the popular weather portal Yr.no. In the USA, the National Oceanic and Atmospheric Administration (NOAA) and National Aeronautics and Space Administration (NASA) provide a wide range of services. One example is the early warning systems for phenomena such as hurricanes, which in the past took a large number of lives. Why do meteorological services include both weather and climate, and how are they connected? Climate can be defined as the statistical description of weather, or the “typical weather” if you like. Long records of weather data are an essential part of climatology, and you need a sufficiently large sample in order to get an accurate statistical description. The statistical character of weather includes the means, the range of variations, and the extremes. It is determined by the presence of physical processes, such as solar inclination (latitude), air mass above (elevation), prevailing winds, and how these are modified by the proximity of oceans and mountains. I use the term climatology for the science-based knowledge about our climate, and this is built on climate research over time. Climatology puts us in a better position to be prepared for a range of natural hazards and is a key element in risk handling. Risk R is often defined as the product of the consequence C of an event and the probability P that it takes place: R = C × P. Area planning is a typical example of risk management, where the purpose is to avoid building in floodplains where floods are likely and to make sure that excess water drains efficiently. We also want to live where we are not killed by rockslides. In mountainous Norway, we are particularly exposed to avalanche and rockslide hazards that may affect roads, rail, and buildings. Other types of exposure include wind (bridges) and storm surges (built environment). Operational meteorological services collaborate closely with homeland security and water authorities.
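A worked instance of that R = C × P definition, with invented numbers, shows why area planning must weigh very different hazards on the same scale: a rare but severe rockslide and a frequent but modest flood can carry the same risk score.

```python
# Worked example of the risk definition R = C * P given above.
# The consequence and probability numbers are invented for illustration.
def risk(consequence, probability):
    """Expected annual loss: consequence (cost) times yearly probability."""
    return consequence * probability

flood = risk(consequence=1e6, probability=0.1)       # common, modest damage
rockslide = risk(consequence=1e8, probability=1e-3)  # rare, severe damage

print(flood, rockslide)       # 100000.0 100000.0 -- equal risk scores
```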
Weather data typically involve daily observations, and long series are essential to map the risks and quantify the typical frequency of extreme events. They are also essential for the evaluation of weather/climate model skill and for keeping an eye on the state of our climate. Climate change means a shift in the weather statistics: weather that was typical in the past may no longer be so common, and we start to see new types of weather events. Trend analysis and record-breaking occurrences can tell us if the probability of a particular type of event is changing. The climate changes because the physical conditions change, either from one location to the next, or over time as the greenhouse effect is increased. Meteorology and climatology have been like two twins that have followed each other for a long time through our history. They have grown into modern sciences and are now a foundation for our safety. Weather forecasts and early warning systems represent the last line of climate change adaptation. Recently, there have been some loud voices from people who find facts to be inconvenient and then try to make scientists look like villains. It is therefore important to remind ourselves that meteorology and climatology are your friends. At the same time, I would like to take the opportunity to share this video that combines this message with a Merry Christmas greeting.


News Article | October 10, 2016
Site: motherboard.vice.com

On New Year's Day in 1995, hurricane-force winds were blowing in the North Sea, off the coast of Norway. Twelve-meter-high waves pummelled the Draupner oil platform, where workers were stationed, but the platform was designed to withstand this sort of punishment, and workers were sheltered for safety. Suddenly, a freak monster wave hammered the platform, seemingly out of nowhere. The rig was unharmed, but its instruments took a measurement that caused scientists' jaws to drop: the wave it recorded had a height of roughly 26 meters (85 feet). Rogue waves have long been part of sailors' lore. Until the Draupner Event, as it's now called, some scientists had a hard time believing they were real. Why would a skyscraper-size wave erupt out of the ocean, only to disappear shortly after? "That got scientists interested in these waves," Amin Chabchoub, assistant professor of hydrodynamics at Aalto University, told me over the phone from Helsinki. 1995 was the first time that a decent measurement of one had been taken. Now nobody doubts they exist. We still can't predict when one will appear, but scientists think they're getting closer to forecasting when a rogue wave will strike. To better study this, Chabchoub, who recently published a new paper in Physical Review Letters, built a mini rogue wave in the lab. Rogue waves, according to Johannes Gemmrich of the University of Victoria, don't have to be absolute monsters—they can be any height. "It's a relatively easy definition," he told me: "An individual wave which is large compared to surrounding waves." Tsunamis, by contrast, are often caused by displacements at the bottom of the ocean, and can travel long distances, including in shallow waters close to shore. Chabchoub is collaborating with Themistoklis P. Sapsis of the Massachusetts Institute of Technology (MIT) to get better at predicting them. "He's providing us with two things," Sapsis told me. "One, measurements of the wave field before the rogue wave occurs. And then giving us confirmation about the exact location of where it is in his wave tank." They've used the rogue waves to swamp mini boats, to watch how it all works. It's hard to say for sure what causes a rogue wave to form, which makes it even harder to predict them. According to Sapsis, there are two theories: one is that ocean swells, travelling at different speeds and in different directions, "superimpose with the right phase," he explained, creating an abnormally huge wave. Some vanish in less than a minute after they spike up. The other is that waves all travelling in the same direction eventually mash and join together, forming a giant one. These tend to be longer-lived, according to the National Oceanic and Atmospheric Administration (NOAA) Ocean Service. Chabchoub and Sapsis aren't the only ones trying to find a way to predict rogue waves. Francesco Fedele of the Georgia Institute of Technology has developed a method using mathematical models of underlying wave energy and other factors. (The NOAA is implementing an approach based on his work.) "You cannot tell that a rogue wave will occur for sure," Fedele told me, but it could still be an early warning system that the probability is high, giving "advance notice to ships." Ultimately, ocean waves might just be too chaotic for even the best algorithms to parse and understand. A rogue wave forecast "will be like what you get for thunderstorms," Arun Chawla of the National Weather Service told me.
"You don't know exactly when or where [the rogue wave will appear], but you know the conditions are right." Get six of our favorite Motherboard stories every day by signing up for our newsletter.


News Article | March 28, 2016
Site: motherboard.vice.com

Known for its vast size, rich marine ecosystem, and brilliant coloration, the Great Barrier Reef is probably the most famous coral reef in the world. As of today, a vast swath of it has turned a sickly skeletal white. It’s fallen victim to coral bleaching, a phenomenon that results when the ocean waters that reefs call home get too hot—weakening or killing off the coral altogether. A new survey by Professor Terry Hughes of James Cook University, who used charter planes and helicopters to survey the reef from the air, found that as much as 95 percent of the northern section of the Reef—an area spanning 100,000 square kilometers (38,600 square miles)—was “severely” bleached. Hughes flew over the reef, tweeting out pictures of the bone-white reefs alongside calls to action. It’s “the worst bleaching ever seen on what was the healthiest part of the Great Barrier Reef,” Dr. Mark Eakin, the Coordinator of the National Oceanic and Atmospheric Administration’s Coral Reef Watch, told me. “It’s quite sad.” Australia, he says, “may lose half of their healthiest corals.” He adds: “This won’t be the end of the GBR but it is a huge amount of damage. The problem is that it can take decades for reefs to recover from bleaching this bad and severe bleaching is becoming much more frequent and more severe.” Richard Vevers, the CEO of the Ocean Agency, who’s currently conducting a high-res underwater survey of the world’s ailing reefs in conjunction with Google, says he’s seen the damage firsthand, and it’s bad. “The bleaching on the Great Barrier Reef is devastating,” he tells me. “It’s the worst bleaching ever to hit the most pristine part of the reef.” The devastation is part of what oceanographers and marine biologists are calling the third global coral bleaching event—the longest ever recorded, according to NOAA. Record-warm waters, heated up by climate change and El Niño, are expelling the algae that corals depend on for survival. This renders them a ghostly shade of grey, and sometimes kills them altogether. And while the Great Barrier Reef is its most high-profile victim, few corals will escape undamaged from the epidemic. “For the rest of the world, this event is nowhere near over,” Eakin tells me. “Many places have been hit by bleaching for two years in a row and Florida may be hit for a third year in a row.” And the bleaching is only rolling on. “Right now there is bleaching across half the southern hemisphere—literally! It spans from the coast of Tanzania in the west (40° east) to French Polynesia in the east (140° west). Reports are coming in this week of bleaching throughout Indonesia.” Eakin adds that while the plight of the GBR is making headlines, there are other reefs that are currently faring even worse. The water was so warm in Fiji that it wasn’t just bleaching that grew severe—fish were straight-up dying off. The inshore reefs of New Caledonia are also in deep trouble, as are those in Kiribati, where Eakin’s colleagues are currently returning from an expedition with what he expects will be extraordinarily dire news. All of which is to say: the world’s reefs, one of our most colorful, vibrant ecosystems, are going white. Many are dying off. Some experts, as I’ve noted before, fear that this is the beginning of the end of coral itself. The world’s most famous reef isn’t out of hot water yet, either.
“We expect it to last through the rest of this year and could even continue into 2017 if bleaching returns to the GBR as it sometimes does after the end of an El Niño,” Eakin says. “What we are most concerned about is that the reef won’t have time to recover before next bleaching event hits,” Vevers told me. “We are already committed to decades of continued ocean warming which will make bleaching events like this more common in the years to come.” When I asked Eakin just how dire the situation was, he responded with a two-word sentence. “Very dire.”


News Article | December 21, 2015
Site: www.greentechmedia.com

Energy production has picked up at the Ivanpah Solar Electric Generating System in the Mojave Desert, but not enough to allow the plant’s owners -- which include Google and Oakland-based BrightSource Energy -- to avoid the risk of defaulting on their contracts to deliver electricity to Pacific Gas & Electric. Majority owner and plant manager NRG Energy said in its most recent quarterly report that it won’t be able to deliver the electricity promised in its power-purchase agreements with PG&E. The agreements cover output from two of Ivanpah’s three units.

Almost all of Costa Rica's electricity came from renewable sources this year, making it one of a few countries in the world to eschew fossil fuels in energy generation, the state electricity agency said Friday. The Costa Rican Electricity Institute said in a statement it achieved "99 percent renewable electricity generation" this year. It also said for 285 days this year, the country managed to power its grid on 100 percent renewable sources.

Federal officials are turning over scores of emails and documents related to climate science research in response to a subpoena from a House committee. A National Oceanic and Atmospheric Administration (NOAA) official said Thursday that the agency provided about 100 documents to the House Science Committee this week. The documents relate to an investigation kicked off by the committee's chairman, Rep. Lamar Smith (R-Texas), into NOAA’s climate science research, specifically a study that concluded there has not been a 15-year “pause” in global warming.

Energy management software startup BuildingIQ raised $20 million in its IPO on Wednesday and began trading on the Australian Securities Exchange. The company offered almost 20 million securities listed at $1 each with an indicative market capitalization of approximately $85 million (Australian dollars) on a fully diluted basis. The securities closed on the first day of trading at the same price, with about 2.8 million securities changing hands, according to Australia’s SBS News. Prior to its IPO, the company had raised $23.4 million from investors such as Aster Capital, Paladin Capital Group and Siemens Venture Capital.

Japanese cities are entering the renewable-energy business, the latest phase in a shake-up of the nation’s power sector in the aftermath of the 2011 Fukushima nuclear crisis. So far, about 14 cities have formed companies to generate clean energy from local resources and sell it to area businesses and homes. With full deregulation of the nation’s electricity markets set to begin next year, the government aims to have 1,000 such city-operated companies up and running by 2021 in a direct challenge to regional power monopolies.


News Article | February 15, 2017
Site: www.scientificamerican.com

Members of a House of Representatives committee hammered the Environmental Protection Agency on Tuesday at a hearing titled “Making EPA Great Again,” accusing it of basing its regulations on biased, politicized science, and calling for reforms in the EPA’s rule-making process. But a number of scientific organizations call this an attempt to covertly strip the agency’s power—and ultimately to interfere with the scientific process itself. In his opening statement at the House Committee on Science, Space and Technology hearing, Chair Lamar Smith—a Republican from Texas—excoriated the EPA over what he has called its “secret science.” In setting past environmental regulations the EPA has “routinely relied on questionable science, based on nonpublic information, that could not be reproduced…and deliberately used its regulatory power to undercut American industries and advance a misguided political agenda that has minimal environmental benefit,” Smith said. With Pres. Donald Trump’s administration newly in charge, Smith added that he now sees a chance to rein in an agency he thinks has run amok. “There is now an opportunity to right the ship at the EPA, and steer the agency in the right direction,” he said. Many believe that means Smith plans to revive legislation called the Secret Science Reform Act, which he co-sponsored in 2014 and introduced again in 2015—but which Pres. Barack Obama vowed to veto. The bill would prohibit the EPA from creating regulations based on science that is “not transparent or reproducible.” Scientific organizations say this would make it more difficult for the EPA to create rules at all, and craft them based on the best available science. For example, if the bill requires the EPA only use studies that can be identically reproduced, that would impose an unreasonable demand on scientists, according to Rush Holt, who testified at the hearing as CEO of the American Association for the Advancement of Science. “Many studies cannot be repeated in exactly the same way—the populations have changed, those people [in the studies] have grown up or moved away or the forest you’re studying has been overtaken by an invasive [species],” Holt explained. “The Secret Science Act has been based on a misunderstanding of how science works—the gold standard is to find other approaches to come up with the same conclusions. Rarely can you repeat an experiment in exactly the same way.” Critics also worry the legislation could keep the EPA from using important multiyear studies—say, for example, a 10-year study examining air pollution’s effect on human health—in the agency’s rule-making process. Those critical long-term studies are extremely difficult to replicate because they require so much time and money. Because of this, they may not fall under the definition of “reproducible.” Although the bill’s supporters might argue long-term studies would not be excluded, the law’s language would likely leave the term “reproducibility” open to interpretation. For instance, someone could potentially sue the EPA for using one of those long-term studies in its rule-making, leaving it to the courts to determine the definition of “reproducibility.” All of this means the bill could limit the number of studies the EPA might consider, if either the courts decide a study is not “reproducible” or if the EPA refrains from using a multiyear study because it believes the research will not meet the bill’s “reproducibility” demand. 
In other words, the agency may not be able to use the best available science to make its rules. “I think [the Secret Science bill] is fundamentally substituting a politically originated revision of the process for the scientific process,” Holt said in the hearing. The Secret Science Reform Act would also require the EPA use only studies for which data is publicly available online—or that the agency makes publicly available—in the name of transparency. But critics of this approach note that scientific studies often include private data, including individual health information, or industry records that cannot be made public for competitive, ethical or legal reasons. During the hearing the representative from the American Chemistry Council (ACC), an industry group, asked that confidential commercial data be protected in the bill. “That was another great illustration that the bill is not about transparency—it’s about what is politically expedient to move industry’s agenda forward,” says Yogin Kothari, a representative with the Center for Science and Democracy at the Union of Concerned Scientists. As for medical data, supporters of the bill say names and other private information could be scrubbed—but that would likely be expensive and time-intensive, and thus another factor limiting the number of studies the EPA could use to make its environmental protection rules. “You don’t need access to the raw data to figure out what information the EPA is relying on,” Kothari wrote in an e-mail. “The idea of secret science is based on a false premise.” The Congressional Budget Office estimates that implementing the latest version of the Secret Science bill (the 2015 version) would cost the EPA $250 million annually over the next few years. The bill, however, allots the EPA only $1 million per fiscal year to carry out its new requirements. “The goal [of the bill] is really to throw a wrench in the rule-making process at the agency,” Kothari says. Smith’s office referred queries to the House Science Committee, whose spokesperson was not immediately available for comment. Industry groups including the ACC have supported the latest version of the bill. “A more transparent EPA helps to foster the kind of regulatory environment that gives our members the confidence and certainty they need to continue to invest in the U.S. economy and develop transformational, innovative products,” an ACC spokesperson wrote to Scientific American in an e-mail after the hearing. Other industry groups that supported the latest version of the bill declined to comment. The House panel also focused on reforming the EPA’s Science Advisory Board, which some committee members and industry groups say does not represent a balanced view of science. In 2015 Smith co-sponsored a bill called the “EPA Science Advisory Board Reform Act,” which never became law—it is widely believed Smith will revive that legislation this year, along with the Secret Science bill. Opponents say the Advisory Board act would make it possible to stack the board with members who favor industry. “[The board] will not function better by having fewer scientists on it,” Holt said at the hearing. Committee members also devoted a significant portion of the hearing to a recent controversial article about climate change research, recently published in the Daily Mail, a London tabloid newspaper.
A whistleblower at the National Oceanic and Atmospheric Administration (NOAA) reportedly told the newspaper the agency violated scientific integrity and rushed to publish a landmark scientific paper, which showed no pause in global warming, for political reasons. Smith referenced the story in his opening statement at Tuesday’s hearing, saying, “Recent news stories report that NOAA tried to deceive the American people by falsifying data to justify a partisan agenda.” The whistleblower, John Bates, told another publication on Tuesday, however, that the agency had broken protocol when it rushed to publication—but that the data had not been manipulated. The points Bates complained about made no difference to the scientific paper’s overall conclusions, according to Zeke Hausfather, a climate scientist and an energy systems analyst at the University of California, Berkeley. Hausfather noted that other studies, including one of his own, have independently verified the NOAA paper’s results. “I would strongly recommend,” he added, “that if Congress wants to assess matters of science, they should rely on peer-reviewed publications rather than tabloid articles.”


News Article | December 1, 2016
Site: news.yahoo.com

The House of Representatives' Committee on Science, Space and Technology sent a tweet on Thursday linking to an article on the conservative media outlet Breitbart, saying that Earth's temperatures are in a "plunge." Judging from reactions on Twitter — one of which was a stinging burn tweeted by Sen. Bernie Sanders of Vermont — many are finding it deeply and sadly ironic that the Science Committee doesn't recognize the overwhelming scientific consensus that climate change is real and influenced by human activity. As Mashable reported today, for example, November was so warm in the U.S. that record daily highs outnumbered record lows by about 51-to-1, and this year is likely to go down as the hottest year on record. Yet the tweet only barely touches on just how anti-science the Science Committee actually is. The committee's chairman, Rep. Lamar Smith of Texas, has spent much of the past two years defending one of his donors — oil giant ExxonMobil Corp. — from allegations that it misled investors about the risks that global warming poses to its business. Smith and other Republican members of the committee have turned what used to be a quiet committee assignment dealing with weighty and geeky subjects, like NASA's space exploration plans, into another investigative panel within Congress, a role unprecedented in the panel's history. In a statement on Nov. 4, Smith reacted to the enactment of the Paris Climate Agreement by referring to climate change as "science fiction." In 2015, Smith issued a wide-ranging subpoena to the National Oceanic and Atmospheric Administration (NOAA), asking for all correspondence regarding particular climate research the agency produced. The committee spent more than a year locked in a battle with NOAA's administrator, Kathryn Sullivan, a former NASA astronaut, over this subpoena. Smith has received a total of about $700,000 in campaign contributions from the fossil fuel industry since 2008, of which at least $19,500 came from Exxon. The Twitter account that came to people's attention on Thursday often reads more like that of a think tank or activist organization, spreading frequently misleading information about climate trends. The Breitbart link is particularly interesting, considering that the former leader of that media company, Steve Bannon, is now President-elect Donald J. Trump's chief strategist. Given Trump's harsh view of mainstream climate science findings, it's possible that Rep. Smith will be even more influential in the next few years. In other words, look for more tweets like this one, and more subpoenas.


News Article | October 28, 2016
Site: www.sciencemag.org

When world leaders reached a deal last month in Kigali to curb the use of hydrofluorocarbons (HFCs)—planet-warming chemicals widely used in air conditioners and refrigerators—many boasted the move would prevent nearly 0.5°C in warming by 2100. That is a big number, given that the Paris climate agreement aims to keep total warming to less than 2°C. If the HFC number is correct, it will make it easier for nations to achieve the Paris goal. But there’s a bit more scientific uncertainty surrounding that half-degree claim than the politicians let on. The figure has its origins in a 2006 dinner held by five scientists in a village in the Swiss Alps. The U.S. and European researchers, who work for government and industry, were part of a group that advises policymakers on the Montreal Protocol, the 1987 pact that curbed the use of chemicals that harm the ozone layer. The researchers found that the protocol also helped reduce global warming, because some of the regulated chemicals were potent greenhouse gases. But they realized the pact had a warming downside, too, says David Fahey, a physicist at the National Oceanic and Atmospheric Administration’s Earth System Research Laboratory in Boulder, Colorado. That’s because some of the newer, ozone-friendlier chemicals that the protocol thrust into use, such as HFCs, trap heat thousands of times more effectively than carbon dioxide. Soon, the researchers were trying to figure out what that meant for the planet. The half-degree estimate was first floated in a 2013 paper co-authored by one of the dinner guests, physicist Guus Velders of the National Institute for Public Health and the Environment in Bilthoven, the Netherlands. The study forecast that rising HFC use in the developing world would push global temperatures up by 0.35°C to 0.5°C by 2100. Those numbers caused a stir, because they were substantially higher than HFC warming forecasts made by other climate models, including those underpinning the massive reports of the Intergovernmental Panel on Climate Change (IPCC). Ultimately, however, they helped galvanize support for the Kigali agreement, which aims to cut HFC use by 80% to 85% by 2047. And advocates and negotiators tended to cite the higher, 0.5°C estimate in their public remarks. That didn’t sit well with Andrew Jones, co-director of Climate Interactive, a prominent climate analysis group. A day after nations announced the 15 October Kigali deal, Jones wrote a blog post hailing it as “excellent news for the climate.” But he cautioned against counting on the full 0.5°C benefit. One reason, he wrote, is that he considers the 2013 paper to be an outlier, because it projects HFC warming that is roughly four times greater than that projected by a model cited by IPCC. “I’m not really buying it,” says Jones, who is in Asheville, North Carolina. Velders says his team came up with higher warming estimates than IPCC because their model accounts for trends that others don’t, such as the faster-than-expected adoption of HFCs driven by the Montreal Protocol, and an air-conditioning boom in the developing world. Still, he concedes that forecasting HFC use is difficult. If warming prompts greater demand for air conditioners in India, for instance, future HFC impacts could be even greater. His team was careful to clarify the uncertainty, he notes, by presenting a range of forecasts, with 0.5°C at the high end.
More sophisticated models that offer a range of possible futures, such as different patterns of economic growth, could improve such estimates, says climate modeling specialist Steven Smith of the Department of Energy’s Pacific Northwest National Laboratory, who is based in College Park, Maryland. “No slight to their work,” he says of the Velders group’s HFC projection. “They’ve clearly done the best work on this to date.” But although Velders and other scientists routinely acknowledge the uncertainty in their forecasts, “that’s not what politicians do,” says Durwood Zaelke, president of the Institute for Governance & Sustainable Development in Washington, D.C., which backed aggressive HFC reductions. For Velders, the Kigali agreement, and the role his team’s work played in it, is a point of pride; the half-degree estimate was the offshoot of a series of papers the scientists published over a decade, research that proved pivotal in the evolving understanding of the issue. “Of course it feels great that people are using your work,” he says. Now, Velders is offering a single new number: 0.06°C. That is his new estimate of how much warming HFCs will cause by 2100 if the Kigali deal hits its targets.
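The leverage HFCs exert per tonne is captured by a gas's global warming potential (GWP): emissions are converted to CO2-equivalents by multiplying mass by GWP. A minimal Python sketch of that arithmetic follows; the GWP-100 values in it are assumed, representative figures of the kind the IPCC publishes, not numbers drawn from the article or the Velders papers.

    # CO2-equivalence arithmetic behind statements like "HFCs trap heat
    # thousands of times more effectively than carbon dioxide".
    # GWP-100 values are assumed, representative (roughly IPCC AR4-era)
    # figures, included only for illustration.
    GWP_100 = {
        "CO2": 1,
        "HFC-134a": 1430,   # common refrigerant in car air conditioners
        "HFC-23": 14800,    # potent by-product of HCFC-22 manufacture
    }

    def co2_equivalent_tonnes(gas, tonnes):
        """Convert tonnes of a gas to tonnes of CO2-equivalent via its GWP."""
        return tonnes * GWP_100[gas]

    # One tonne of HFC-134a warms like ~1,430 tonnes of CO2 over a century.
    print(co2_equivalent_tonnes("HFC-134a", 1.0))  # 1430.0

Run on one tonne of HFC-134a, the conversion returns 1,430 tonnes of CO2-equivalent, which is why phasing down even modest HFC volumes can buy a measurable slice of avoided warming.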


News Article | August 31, 2016
Site: www.scientificamerican.com

The world’s most mathematically perfect marine species moved a little bit closer to protection last week when the National Oceanic and Atmospheric Administration agreed to consider listing the chambered nautilus (Nautilus pompilius) under the Endangered Species Act. The move comes after several years of hard work on the part of conservationists and federal agencies to understand the massive scope of the nautilus trade and how it impacts wild populations. According to that research, nearly 1.7 million of these mollusk shells—often held up as a natural embodiment of the Fibonacci spiral, though the shell is more precisely a logarithmic spiral—have been imported into the U.S. alone over the past 16 years, where they’re sold for anywhere between $15 and $200. The trade in nautilus shells is so heavy that it has all but depleted many populations of these ancient animals. Further imports as well as interstate trade would become illegal if the species does gain Endangered Species Act protection. Of course, that’s a long process that could take several years. Right now NOAA is conducting a formal status review for chambered nautiluses, which will collect all available information on the species. Data and public comments will be collected through October 25. “If, after reviewing the best available scientific and commercial information and after taking into account conservation efforts to protect the species, we determine that listing the species under the ESA is warranted, we will publish a proposed rule proposing to list the species as threatened or endangered and solicit public comments,” says Kate Brogan, public affairs officer with NOAA’s National Marine Fisheries Service. International action could actually come much faster than that. The U.S., Fiji, India and the island nation of Palau have teamed up to request that nautiluses also be protected under the Convention on International Trade in Endangered Species (CITES). That proposal, along with many others, will be discussed next month at CoP17, the meeting of the treaty’s signatories. If passed, nautiluses would be added to CITES Appendix II, which would regulate but not outright ban trade on the international market. The CITES proposal would actually go further than the ESA proposal, as it would cover all six nautilus species, not just the chambered nautilus. Abel Valdivia, a marine scientist with the Center for Biological Diversity, the conservation organization that petitioned for nautilus protection, says the ESA proposal will still help dramatically, as it would protect the most heavily traded species. “Most of the international shell trade of nautilus is of chambered nautilus because it is the most common species,” he says. “Over 90% of complete shells imported to the U.S. in the past 16 years are classified as chambered nautilus.” Nautilus shells are admittedly one of nature’s great wonders, and it’s easy to see why people would want to own and admire them. Hopefully these two important steps towards protecting these ancient creatures will allow us to continue enjoying them for many years to come.


News Article | November 20, 2016
Site: www.csmonitor.com

The GOES-R satellite is a major upgrade in the information available about Earth and the Sun, improving how we respond to weather threats across the United States. A United Launch Alliance (ULA) Atlas V rocket carrying the GOES-R spacecraft for NASA and NOAA lifted off from Space Launch Complex-41 at Cape Canaveral Air Force Station, Fla., at 6:42 p.m. EST on Saturday, Nov. 19, 2016. GOES-R, the most advanced weather satellite ever built, rocketed into orbit Saturday evening as part of an $11 billion effort to revolutionize forecasting and save lives; it took about 12 minutes for the rocket to boost the high-tech piece of equipment into orbit. The satellite is the first of a new generation operated by the National Oceanic and Atmospheric Administration (NOAA) that is expected to improve weather forecasting across the entire Western Hemisphere. GOES-R (the name comes from the Geostationary Operational Environmental Satellite program) is the 16th in the series; once it reaches its final orbit in two weeks, it will be renamed GOES-16 to reflect its place in that line. But the super-advanced GOES-R is a far cry from its ancestors, the first of which was launched in 1975. "For weather forecasters, GOES-R will be similar to going from a black-and-white TV to super-high-definition TV," Stephen Volz, assistant administrator for NOAA's Satellite and Information Services division, said during a pre-launch news conference on Thursday. "For the American public, that will mean faster, more accurate weather forecasts and warnings. That also will mean more lives saved and better environmental intelligence for state and local officials and all decision makers." As the Monitor previously reported, GOES-R's final orbit will be geosynchronous, meaning that it will remain at the same relative point in space above the Earth, matching our planet's daily rotation, at a height of about 22,000 miles. It has enough fuel to remain there for up to 18 years, though currently it is expected to operate only for a decade. "We’ll be able to [image] the whole hemisphere every five minutes or better yet, for a hurricane or a big thunderstorm, we'll be able to actually focus in and do updates every 30 seconds," Greg Mandt, NOAA’s GOES-R program director, told CBS. "And we get the data to the forecasters within seconds or minutes. So in a sense, it's like watching it with a camera in real time so they can really watch what's going on, how it's unfolding and therefore make much more precise warnings of the significant weather events that are coming on." Because of its high precision and fast imaging rate, the satellite's Advanced Baseline Imager (ABI) camera is expected to help save lives in the event of destructive weather: the ABI can take pictures five times faster than, and with four times the resolution of, current GOES cameras. GOES-R is also equipped with a magnetometer to measure the magnetic field at its orbit, a UV imager to monitor the sun, sensors to measure charged solar particles, and a lightning mapper that will take infrared photos of areas with active electrical disturbances 200 times a second.
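That roughly 22,000-mile figure is not arbitrary: a geosynchronous orbit's altitude follows from Kepler's third law once the orbital period is pinned to Earth's sidereal day. Here is a minimal Python check of that number, using standard textbook constants rather than anything from the article:

    import math

    MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
    R_EARTH = 6378137.0         # Earth's equatorial radius, m
    SIDEREAL_DAY = 86164.0905   # one Earth rotation, seconds

    # Kepler's third law for a circular orbit: T = 2*pi*sqrt(a**3 / mu).
    # Solve for the orbital radius a given the sidereal-day period.
    a = (MU_EARTH * (SIDEREAL_DAY / (2 * math.pi)) ** 2) ** (1.0 / 3.0)
    altitude_km = (a - R_EARTH) / 1000.0
    altitude_miles = (a - R_EARTH) / 1609.344

    print(f"{altitude_km:.0f} km, about {altitude_miles:,.0f} miles")
    # ~35,786 km, i.e. the roughly 22,000 miles cited for GOES-R.

The calculation lands at about 35,786 km (22,236 miles) above the equator, the one altitude at which a satellite can keep pace with Earth's rotation and stare at a fixed hemisphere.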
"The launch of GOES-R represents a major step forward in terms of our ability to provide more timely and accurate information that is critical for life-saving weather forecasts and warnings," said Thomas Zurbuchen, associate administrator for NASA’s Science Mission Directorate in Washington, in a statement from NASA. "It also continues a decades-long partnership between NASA and NOAA to successfully build and launch geostationary environmental satellites." The launch of GOES-R had been rescheduled due to damage from Hurricane Matthew, and was delayed by another hour on the day of the launch because of technical glitches. But the flight itself seemed to go off without a hitch, marking the 100th launch for the US Air Force's Evolved Expendable Launch Vehicle program, which was created in the 1990s to create a more reliable and affordable means of launching cargo into orbit. This milestone is matched only by the technological significance of GOES-R, which is the first of four next-generation satellites that will monitor weather patterns through 2036. "For 40 years, we've sort of had the same simple pictures," Mr. Mandt told Spaceflight Now. "Meteorologists are calling this a game-changer from their ability to watch what's going on and warn the nation."


News Article | November 30, 2016
Site: www.sciencemag.org

The White House science office hasn’t been very productive under President Barack Obama, says the chairman of a key congressional research spending panel. And Representative John Culberson (R–TX) says he’d like to see it downsized. Culberson, whose House of Representatives subcommittee oversees the budgets for NASA, the National Science Foundation (NSF), the National Oceanic and Atmospheric Administration (NOAA), and the National Institute of Standards and Technology, has never been a fan of John Holdren, Obama’s science adviser. And his latest comments are likely to further heighten anxiety among scientific leaders about how the U.S. research enterprise will fare under President-elect Donald Trump. The commerce, justice, and science (CJS) appropriations subcommittee that Culberson chairs also oversees the White House Office of Science and Technology Policy (OSTP), which Congress created in 1976. The office has traditionally been led by the president’s science adviser, and Holdren also co-chairs the President’s Council of Advisors on Science and Technology (PCAST), an eminent group of outsiders. But Culberson says that arrangement isn’t working well. “I’d be hard-pressed to identify any tangible, specific accomplishments or achievements by that office,” he told ScienceInsider yesterday in a phone interview from his Capitol Hill office in Washington, D.C. His unhappiness with the current scientific bureaucracy reflects his overall governing philosophy, in which smaller is better. “The president needs a science adviser to keep him posted and give him guidance on all of those areas,” he says. “But I don’t know that [OSTP] needs to have a large staff or be a big operation. … In my mind there are already too many termite mounds in Washington[, D.C]. We need to shrink the size of government in any way we can.” OSTP is a tiny stitch in the fabric of federal science. Its $5.5 million budget would be a rounding error for most federal agencies, and about half of its 120-person staff is on loan from elsewhere in the government. At the same time, the Obama administration has given it a prominent role in coordinating science and technology across all agencies. (Congress has done likewise in various pieces of legislation over the years.) Last week the leaders of 29 scientific and academic organizations (including AAAS, which publishes Science) implored Trump to maintain the office’s high profile, starting with the prompt selection of a successor to Holdren who can help fill other science jobs in his administration. “We urge that you quickly appoint a science advisor with the title of Assistant to the President for Science and Technology,” they wrote in a letter to the president-elect. “This senior-level advisor can assist you in determining effective ways to use science and technology to address major national challenges. Moreover, this individual can coordinate relevant science and technology policy and personnel decisions within the executive branch of government.” CJS is one of 12 panels in the House (there are also a dozen in the Senate) that oversee federal spending. Since becoming CJS chairman in January 2015, Culberson has used his position as a “cardinal” to advocate for his scientific priorities, starting with a multi–billion-dollar NASA mission to a Jovian moon that some scientists believe may harbor life.
He will be trying to protect those priorities—which include the Federal Bureau of Investigation (FBI) and border security—as Congress spends the next 2 weeks cobbling together another short-term spending bill to keep the government open. The current agreement, called a continuing resolution (CR), runs out on 9 December. And although a CR is intended to hold agencies to their current budget, legislators have the authority to make exceptions. Culberson discussed the CR and many other issues with Insider. Here is a transcript of that conversation, edited for length and clarity. Q: Does it matter when President-elect Trump chooses a science adviser? A: It’s always helpful for a new administration to act early in choosing people for key positions. And the science adviser and NASA administrator are a very important part of preserving American leadership in scientific research and space exploration. Q: Have you spoken with the transition team? A: Yes, I’ve been in contact with them and made a number of suggestions. I’m trying to do everything I can to make sure the CJS bill is in good shape for the new administration. Q: Do you know who is handling science and space issues during the transition? A: I’ve been in touch with several people on the team, and [Vice President–elect] Mike Pence is a friend and I’ve spoken with him, too. I’m confident the new president will appoint the best person to head NASA and to be his science adviser. Q: Would you like to see any changes to the White House Office of Science and Technology Policy? A: I’d be hard-pressed to identify any tangible, specific accomplishments or achievements of the office. It’s important that we make government as efficient as possible. I think the president obviously needs a science adviser. But there are other places where he can get good advice. [NSF] is the nation’s leader in scientific research, and NASA is responsible for preserving America’s leadership in space exploration. NOAA is responsible for making sure that weather forecasting and climate data is the best that it can be, and so on. The president needs a science adviser to keep him posted on new developments and to give him guidance in all of those areas. But I don’t know that [OSTP] needs a large staff or a big operation. Q: Is there a role for PCAST? A: Well, that’s up to the new president. But in my mind there are already too many termite mounds in Washington. We need to shrink the size of government in any way we can. And we have too many advisory committees advising advisory committees, in my opinion. Q: The 21st Century Cures Act (a medical research bill now before Congress) would create a Research Policy Board to try to reduce excessive regulations affecting academic research. It’s based on a report by the National Academies of Science, Engineering, and Medicine. Do you think that’s a good idea? A: I’ve got a lot of faith in the National Academies, and I look to them for guidance on the work that I do in the CJS bill. My admiration for the academy is one reason that I put into the bill the requirement that NASA follow the recommendations of the decadal studies in all the areas they cover. They do great work, and I would always be inclined to follow their recommendations. Q: Who would you like to see as NASA administrator, and what qualities and experience should they have? A: I’ve got a superb candidate in mind. But I’m going to leave that up to the new president. Q: Would you accept the job if asked? 
A: I love representing the people of west Houston[, Texas], and I can’t think of a better job than helping the Department of Justice, [NSF], NASA, and the commerce department be the best they can be. This is a job I’ve always dreamed of doing, and it’s been everything I hoped for and more. I’ve been able to map out a strategy for NASA to follow the decadal surveys and ensure it is on track to discover life in the oceans of Europa and then launch the first interstellar mission to Alpha Centauri by 2069, through the work of my subcommittee. Q: Have you talked to the Trump transition team about Europa? A: Right now they are focused on filling the top positions within the administration. But I’ve had good conversations with them on the need to make sure that the U.S. space program is the best on earth and that it will be an American spacecraft that discovers life on another world and achieves interstellar travel and explores an Earth-like planet around a nearby star. There’s lots of support for that in the incoming Congress and in the new administration, and I’m looking forward to making those dreams come true. Q: Some have suggested reviving the National Space Council. Would that be useful? A: I’d have to see what the new administration proposes. But I think there are too many layers of government and advisory committees. A simplified and unified chain of command at NASA that is less political would help the agency immensely. And I will continue to try to make the NASA administrator more like the FBI director [in serving a 10-year term], so it can focus on its mission and worry less about changes in administration. The agency needs stability and certainty and adequate funding to accomplish everything on its plate. Q: There’s been talk of moving earth sciences out of NASA. A: At this point that is very speculative. There’s strong support in Congress for keeping a close eye on planet Earth and understanding our complex planet. And the future level of funding and who’s responsible for earth science will be an ongoing debate with the new administration and the incoming Congress. I’m quite confident there will continue to be strong support for the earth sciences as well as planetary sciences and the human space flight program throughout Congress and in the new administration. Q: Would that be within NASA, or somewhere else in government? A: It will continue to be a topic of ongoing discussion. But nobody in the earth sciences community should be concerned in the least. All of us in Congress are strong supporters of keeping a close eye on planet Earth. Q: Would you like science and cyber facilities to be part of a bill to rebuild U.S. roads, ports, and other infrastructure? A: I think it’s important that the United States maintains its leadership in particle physics, and I think it’s unfortunate that we have not. And we have also fallen behind in building the world’s biggest and fastest supercomputers. We need to maintain American leadership in astronomy. I’d like to see NSF more deeply involved in the construction and design of the Giant Magellan Telescope, for example. And the radio telescopes in [Arecibo,] Puerto Rico and at Green Bank[, West Virginia] are getting a little elderly. In order to preserve American leadership in critical areas of scientific discovery and technological achievement, we will need to make the necessary investments in scientific infrastructure. And that will require strong support for NSF and NASA.
Q: Would that be part of an NSF and NASA bill, or part of an infrastructure bill? And would it need to be paid for? A: It’ll all be part of the debate we’ll be having in the months ahead. But I’m very concerned that America may be slipping in the areas I’ve just m