Weather Research

De Bilt, Netherlands

News Article | May 25, 2017
Site: grist.org

Less than a year after Hurricane Matthew raked the East Coast, killing 34 people and causing $10 billion in damage in the U.S. alone, coastal areas are once again preparing for the onset of the Atlantic hurricane season. This year, forecasters with the National Oceanic and Atmospheric Administration are expecting to see above-average storm numbers in the Atlantic, despite the uncertainty of whether an El Niño will develop over the summer. The forecast is currently for 11 to 17 named storms to form, of which five to nine are expected to become hurricanes, and two to four major hurricanes (Category 3 or above). The forecast, though, “does not predict when, where, and how these storms might hit,” Ben Friedman, the acting NOAA administrator, said during a press conference, as he and other officials urged coastal residents to begin their preparations. During Thursday’s press conference, officials also touted the updated models and tools they have to produce better forecasts for individual storms, part of a concerted effort that has greatly improved hurricane forecasts over the past couple of decades. Those comments, though, come just a few days after the release of President Trump’s budget request, which calls for reductions to some of those very programs. The 2017 hurricane season got off to an early start, with Tropical Storm Arlene forming in April, only the second April storm in the satellite era. Early storms, however, are not necessarily indicators of how active a given season will be. To gauge the hurricane season, forecasters use various climatological clues, such as the state of the El Niño cycle, as well as expected trends in ocean temperatures and a measure called wind shear, which can cut off storm formation. El Niño is a key factor in making seasonal hurricane forecasts because the changes in atmospheric patterns over the tropical Pacific that it ushers in have a domino effect on patterns over the Atlantic, tending to suppress hurricane formation. Whether an El Niño will develop is currently something of a question mark, though, with the odds about even for El Niño or neutral conditions this summer and fall. Also uncertain is whether any El Niño that does materialize will be strong enough to influence the Atlantic. But sea surface temperatures across swaths of the Atlantic are currently above average and are expected to stay that way, and wind shear is also expected to stay low, both of which would tend to support more storm formation. So given the signals that forecasters have to work with, they expect a 45 percent chance of above-average storm numbers, a 35 percent chance of near normal, and only a 20 percent chance of below-normal activity. Those percentages translate to the ranges of numbers of storms expected at different strengths. The 11 to 17 named storms include those that reach tropical storm status or higher, defined as a storm with wind speeds of 39 mph or higher. Five to nine of those storms would be expected to strengthen into hurricanes, with winds of 74 mph or more. And then two to four of those hurricanes would be expected to reach major hurricane status, defined as Category 3 or above on the Saffir-Simpson scale of hurricane strength, or winds above 111 mph. An average Atlantic season has 12 named storms, six hurricanes, and three major hurricanes.
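The strength categories NOAA uses map directly onto sustained wind speed thresholds. A minimal Python sketch, illustrative only and not NOAA's code, that classifies a storm by the thresholds cited above:

```python
def classify_storm(wind_mph: float) -> str:
    """Classify a storm by sustained wind speed (mph), using the thresholds
    cited in the article: tropical storm >= 39 mph, hurricane >= 74 mph,
    major hurricane (Category 3 or above) >= 111 mph."""
    if wind_mph >= 111:
        return "major hurricane (Category 3+)"
    if wind_mph >= 74:
        return "hurricane"
    if wind_mph >= 39:
        return "tropical storm"
    return "tropical depression or weaker"

print(classify_storm(75))   # hurricane
print(classify_storm(120))  # major hurricane (Category 3+)
```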
NOAA evaluates the accuracy of its seasonal forecasts each year, with the aim of seeing the number of storms fall in the given ranges at least 70 percent of the time, which they do consistently, Gerry Bell, lead seasonal hurricane forecaster with NOAA’s Climate Prediction Center, said. It has been a record-setting 12 years since a major hurricane made landfall on the U.S. coast; the last to do so was Hurricane Wilma during the blockbuster 2005 season. “While some may think that’s lucky … in fact, tropical storms and lesser hurricanes can be just as damaging and just as deadly,” Friedman said, citing Matthew as a prime example. Matthew, which was for a time the first Category 5 hurricane to form in the Atlantic since Hurricane Felix in 2007, had weakened to a Category 1 storm by the time it made landfall in South Carolina last October. The punishing storm surge that pushed ashore from Florida to the Carolinas and the torrential rains it dropped inland still made it the 10th costliest storm recorded in the Atlantic basin, according to the reinsurance firm Aon Benfield. Storm surge and heavier downpours are two areas where climate change is exerting an influence on the damage produced by hurricanes. As global temperatures rise, sea level rises too, meaning hurricane surges can reach further inland. Rising temperatures also concentrate moisture in the atmosphere, providing more fuel for heavy rains. The impact of climate change on hurricanes themselves is an active area of research; the general consensus is that there may be fewer storms overall in a warmer world, but a higher proportion of them will be major hurricanes. Major hurricanes have already increased in the Atlantic since 1970. Some research has also suggested that the hurricane season could become longer, meaning more pre-season storms like Arlene. During the press conference, Mary Erickson, deputy director of the National Weather Service, touted the increased accuracy of hurricane forecasts resulting from investments in improved models. In the 25 years since Hurricane Andrew devastated southeastern Florida, the three-day track forecast for hurricanes has improved by 65 percent, she said. Two new models coming online this season could improve forecasts even more. One, the Hurricane Weather Research and Forecasting model, includes better resolution of storms, advanced ways of feeding data into the model, and more accurate atmospheric physics, all of which could improve intensity forecasts for storms by up to 10 percent and track forecasts by up to 7 percent, Erickson said. Another model replaces the retiring Geophysical Fluid Dynamics Laboratory Hurricane Model after 22 years, and it also improves track and intensity forecasts. Many of the improvements to those models have come as part of a concerted effort called the Hurricane Forecast Improvement Program. That program was established by NOAA in 2009, in part as a response to the pummeling the U.S. received from a number of hurricanes during the early years of that decade and the relative lack of progress made in improving forecasts up to that point. Trump’s 2018 budget request currently includes a $5 million reduction in funding “to slow the transition of advanced modeling research into operations for improved warnings and forecasts,” including the HFIP.
That budget provision doesn’t jibe with the bipartisan-supported Weather Research and Forecasting Innovation Act of 2017, which the president signed into law last month and which states that “NOAA must plan and maintain a project to improve hurricane forecasting.” “I don’t think Congress will take his proposal seriously at all … so it can probably be ignored in favor of the legislation that has actually passed,” Brian McNoldy, a hurricane researcher at the University of Miami, said in an email. “But supposing Congress did pass his budget as-is, yes, it would be devastating to weather prediction across the board, including hurricanes.” Forecasters will also be able to use the improved observations of the GOES-16 satellite, which has four times the resolution and updates five times faster than its predecessors. In particular, its lightning mapper will help forecasters better understand how a storm is developing, as lightning often accompanies rapid storm development. When it becomes operational later this year, GOES-16 will move into orbit over the East Coast, in prime hurricane-watching position, Friedman said. NOAA is also making its previously experimental storm surge watches and warnings operational this year, in an effort to better prepare coastal areas under threat of flooding. Hurricane graphics will also include an experimental visualization of how far damaging winds extend out from the center of a storm. NOAA will update its forecast in early August, just before the typical peak of the hurricane season.


News Article | April 20, 2017
Site: www.eurekalert.org

The Yellow River basin, situated in arid and semiarid regions, is highly sensitive to global climate change and also plays an important role in food production. However, using water resources in the basin is challenging due to drought and increasing water consumption. "Large-scale agricultural irrigation is an important process in the utilization of water resources in the Yellow River basin," said Dr. CHEN Liang, an assistant researcher at the Institute of Atmospheric Physics and first author of a paper recently published in Atmospheric and Oceanic Science Letters. "Agricultural irrigation affects the regional climate mainly by changing surface water processes. There have been studies on the climate effects of changes in soil moisture, but the role of irrigation has not been sufficiently depicted in those studies," he said. CHEN and his team developed a new irrigation scheme based on the Noah land surface model and coupled it with the Weather Research and Forecasting regional climate model. Two simulations (with and without irrigation) were conducted over the Yellow River basin for April to October of 2000-2010. The results indicated that when irrigation was included, the mean surface air temperature decreased, with a corresponding increase (decrease) in latent (sensible) heat flux over the irrigated areas. The cooling effect was consistent with the changes in evapotranspiration and heat fluxes due to irrigation. Agricultural irrigation also led to a greater probability of cloud formation, which in turn affected the spatial distribution of surface air temperature and precipitation. "These studies will provide scientific advice for the sustainable use of water in the Yellow River basin," said CHEN. "In the future, we would like to use different irrigation methods to assess the impact of irrigation on regional climate and hydrologic cycles."
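The experimental design here, paired runs differing only in the irrigation scheme, is the standard way to isolate a single process in a regional climate model. A minimal sketch of that differencing, assuming hypothetical output file names and an irrigation mask variable (T2, LH and HFX are standard WRF output names for 2-m temperature and latent and sensible heat flux):

```python
import xarray as xr  # file names and the IRRIGATED flag below are hypothetical

ctl = xr.open_dataset("wrf_no_irrigation.nc")  # control run, no irrigation
irr = xr.open_dataset("wrf_irrigation.nc")     # run with the new irrigation scheme

mask = irr["IRRIGATED"] > 0  # hypothetical flag marking irrigated grid cells

# Irrigation signal = (irrigated run) - (control run) over irrigated areas
for var in ["T2", "LH", "HFX"]:  # 2-m temperature, latent flux, sensible flux
    delta = (irr[var] - ctl[var]).where(mask).mean()
    print(var, float(delta))
```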


News Article | December 5, 2016
Site: www.eurekalert.org

At century's end, the number of summertime storms that produce extreme downpours could increase by more than 400 percent across parts of the United States -- including sections of the Gulf Coast, Atlantic Coast, and the Southwest -- according to a new study by scientists at the National Center for Atmospheric Research (NCAR). The study, published today in the journal Nature Climate Change, also finds that the intensity of individual extreme rainfall events could increase by as much as 70 percent in some areas. That would mean that a storm that drops about 2 inches of rainfall today would be likely to drop nearly 3.5 inches in the future. "These are huge increases," said NCAR scientist Andreas Prein, lead author of the study. "Imagine the most intense thunderstorm you typically experience in a single season. Our study finds that, in the future, parts of the U.S. could expect to experience five of those storms in a season, each with an intensity as strong or stronger than current storms." The study was funded by the National Science Foundation (NSF), NCAR's sponsor, and the Research Partnership to Secure Energy for America. "Extreme precipitation events affect our infrastructure through flooding, landslides and debris flows," said Anjuli Bamzai, program director in NSF's Directorate for Geosciences, which funded the research. "We need to better understand how these extreme events are changing. By supporting this research, NSF is working to foster a safer environment for all of us." An increase in extreme precipitation is one of the expected impacts of climate change because scientists know that as the atmosphere warms, it can hold more water, and a wetter atmosphere can produce heavier rain. In fact, an increase in precipitation intensity has already been measured across all regions of the U.S. However, climate models are generally not able to simulate these downpours because of their coarse resolution, which has made it difficult for researchers to assess future changes in storm frequency and intensity. For the new study, the research team used a new dataset that was created when NCAR scientists and study co-authors Roy Rasmussen, Changhai Liu, and Kyoko Ikeda ran the NCAR-based Weather Research and Forecasting (WRF) model at a resolution of 4 kilometers, fine enough to simulate individual storms. The simulations, which required a year to run, were performed on the Yellowstone system at the NCAR-Wyoming Supercomputing Center. Prein and his co-authors used the new dataset to investigate changes in downpours over North America in detail. The researchers looked at how storms that occurred between 2000 and 2013 might change if they occurred instead in a climate that was 5 degrees Celsius (9 degrees Fahrenheit) warmer -- the temperature increase expected by the end of the century if greenhouse gas emissions continue unabated. Prein cautioned that this approach is a simplified way of comparing present and future climate. It doesn't reflect possible changes to storm tracks or weather systems associated with climate change. The advantage, however, is that scientists can more easily isolate the impact of additional heat and associated moisture on future storm formation. "The ability to simulate realistic downpours is a quantum leap in climate modeling. This enables us to investigate changes in hourly rainfall extremes that are related to flash flooding for the very first time," Prein said. "To do this took a tremendous amount of computational resources." 
The study found that the number of summertime storms producing extreme precipitation is expected to increase across the entire country, though the amount varies by region. The Midwest, for example, sees an increase of zero to about 100 percent across swaths of Nebraska, the Dakotas, Minnesota, and Iowa. But the Gulf Coast, Alabama, Louisiana, Texas, New Mexico, Arizona, and Mexico all see increases ranging from 200 percent to more than 400 percent. The study also found that the intensity of extreme rainfall events in the summer could increase across nearly the entire country, with some regions, including the Northeast and parts of the Southwest, seeing particularly large increases, in some cases of more than 70 percent. A surprising result of the study is that extreme downpours will also increase in areas that are getting drier on average, especially in the Midwest. This is because moderate rainfall events that are the major source of moisture in this region during the summertime are expected to decrease significantly while extreme events increase in frequency and intensity. This shift from moderate to intense rainfall increases the potential for flash floods and mudslides, and can have negative impacts on agriculture. The study also investigated how the environmental conditions that produce the most severe downpours might change in the future. In today's climate, the storms with the highest hourly rainfall intensities form when the daily average temperature is somewhere between 20 and 25 degrees C (68 to 77 degrees F) and with high atmospheric moisture. When the temperature gets too hot, rainstorms become weaker or don't occur at all because the increase in atmospheric moisture cannot keep pace with the increase in temperature. This relative drying of the air robs the atmosphere of one of the essential ingredients needed to form a storm. In the new study, the NCAR scientists found that storms may continue to intensify up to temperatures of 30 degrees C because of a more humid atmosphere. The result would be much more intense storms. "Understanding how climate change may affect the environments that produce the most intense storms is essential because of the significant impacts that these kinds of storms have on society," Prein said. The University Corporation for Atmospheric Research manages the National Center for Atmospheric Research under sponsorship by the National Science Foundation. Any opinions, findings and conclusions, or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
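The "warmer rerun" approach described above is often called pseudo-global-warming: observed storms are re-simulated after a uniform temperature perturbation, with a matching moisture increase, is applied to the model inputs. A schematic sketch of the perturbation step, assuming the textbook Clausius-Clapeyron rate of roughly 7 percent more saturation vapor pressure per degree Celsius (a standard figure, not one from the study), plus the article's own intensity arithmetic:

```python
import numpy as np

CC_RATE = 0.07   # ~7% more water vapor capacity per degree C (Clausius-Clapeyron)
WARMING = 5.0    # end-of-century warming applied in the study, degrees C

def pseudo_global_warming(temperature_c: np.ndarray, vapor: np.ndarray):
    """Perturb model inputs: warm uniformly, scale moisture accordingly,
    then re-run the same storms in the perturbed environment."""
    t_future = temperature_c + WARMING
    q_future = vapor * (1.0 + CC_RATE) ** WARMING  # ~40% more moisture
    return t_future, q_future

# The article's example: a 70% intensity increase turns a 2-inch storm
# into nearly 3.5 inches.
print(2.0 * 1.7)  # 3.4
```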


News Article | August 22, 2016
Site: www.scientificcomputing.com

As the National Oceanic and Atmospheric Administration (NOAA) this month launches a comprehensive system for forecasting water resources in the United States, it is turning to technology developed by the National Center for Atmospheric Research (NCAR) and its university and agency collaborators. WRF-Hydro, a powerful NCAR-based computer model, is the first nationwide operational system to provide continuous predictions of water levels and potential flooding in rivers and streams from coast to coast. NOAA's new Office of Water Prediction selected it last year as the core of the agency's new National Water Model. "WRF-Hydro gives us a continuous picture of all of the waterways in the contiguous United States," said NCAR scientist David Gochis, who helped lead its development. "By generating detailed forecast guidance that is hours to weeks ahead, it will help officials make more informed decisions about reservoir levels and river navigation, as well as alerting them to dangerous events like flash floods." WRF-Hydro (WRF stands for Weather Research and Forecasting) is part of a major Office of Water Prediction initiative to bolster U.S. capabilities in predicting and managing water resources. By teaming with NCAR and the research community, NOAA's National Water Center is developing a new national water intelligence capability, enabling better impacts-based forecasts for management and decision making. Unlike past streamflow models, which provided forecasts every few hours and only for specific points along major river systems, WRF-Hydro provides continuous forecasts of millions of points along rivers, streams, and their tributaries across the contiguous United States. To accomplish this, it simulates the entire hydrologic system — including snowpack, soil moisture, local ponded water, and evapotranspiration — and rapidly generates output on some of the nation's most powerful supercomputers. WRF-Hydro was developed in collaboration with NOAA and university and agency scientists through the Consortium of Universities for the Advancement of Hydrologic Science, the U.S. Geological Survey, Israel Hydrologic Service, and Baron Advanced Meteorological Services. Funding came from NOAA, NASA, and the National Science Foundation, which is NCAR's sponsor. "WRF-Hydro is a perfect example of the transition from research to operations," said Antonio (Tony) J. Busalacchi, president of the University Corporation for Atmospheric Research, which manages NCAR on behalf of the National Science Foundation (NSF). "It builds on the NSF investment in basic research in partnership with other agencies, helps to accelerate collaboration with the larger research community, and culminates in support of a mission agency such as NOAA. The use of WRF-Hydro in an operational setting will also allow for feedback from operations to research. In the end this is a win-win situation for all parties involved, chief among them the U.S. taxpayers." "Through our partnership with NCAR and the academic and federal water community, we are bringing the state of the science in water forecasting and prediction to bear operationally," said Thomas Graziano, director of NOAA’s new Office of Water Prediction at the National Weather Service. The continental United States has more than 3 million miles of rivers and streams, from major navigable waterways such as the Mississippi and Columbia to the remote mountain brooks flowing from the high Adirondacks into the Hudson River. 
The levels and flow rates of these watercourses have far-reaching implications for water availability, water quality, and public safety. Until now, however, it has not been possible to predict conditions at all points in the nation's waterways. Instead, computer models have produced a limited picture by incorporating observations from about 4,000 gauges, generally on the country's bigger rivers. Smaller streams and channels are largely left out of these forecast models, and stretches of major rivers for tens of miles are often not predicted — meaning that schools, bridges, and even entire towns can be vulnerable to unexpected changes in river levels. To fill in the picture, NCAR scientists have worked for the past several years with their colleagues within NOAA, other federal agencies, and universities to combine a range of atmospheric, hydrologic, and soil data into a single forecasting system. The resulting National Water Model, based on WRF-Hydro, simulates current and future conditions on rivers and streams along points two miles apart across the contiguous United States. Along with an hourly analysis of current hydrologic conditions, the National Water Model generates three predictions: an hourly 0- to 15-hour short-range forecast, a daily 0- to 10-day medium-range forecast, and a daily 0- to 30-day long-range water resource forecast. The National Water Model predictions using WRF-Hydro offer a wide array of benefits for society. They will help local, state, and federal officials better manage reservoirs, improve navigation along major rivers, plan for droughts, anticipate water quality problems caused by lower flows, and monitor ecosystems for issues such as whether conditions are favorable for fish spawning. By providing a national view, this will also help the Federal Emergency Management Agency deploy resources more effectively in cases of simultaneous emergencies, such as a hurricane in the Gulf Coast and flooding in California. "We've never had such a comprehensive system before," Gochis said. "In some ways, the value of this is a blank page yet to be written." WRF-Hydro is a powerful forecasting system that incorporates advanced meteorological and streamflow observations, including data from nearly 8,000 U.S. Geological Survey streamflow gauges across the country. Using advanced mathematical techniques, the model then simulates current and future conditions for millions of points on every significant river, stream, tributary, and catchment in the United States. In time, scientists will add additional observations to the model, including snowpack conditions, lake and reservoir levels, subsurface flows, soil moisture, and land-atmosphere interactions such as evapotranspiration, the process by which water in soil, plants, and other land surfaces evaporates into the atmosphere. Scientists over the last year have demonstrated the accuracy of WRF-Hydro by comparing its simulations to observations of streamflow, snowpack, and other variables. They will continue to assess and expand the system as the National Water Model begins operational forecasts. NCAR scientists maintain and update the open-source code of WRF-Hydro, which is available to the academic community and others. WRF-Hydro is widely used by researchers, both to better understand water resources and floods in the United States and other countries such as Norway, Germany, Romania, Turkey, and Israel, and to project the possible impacts of climate change.
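The product suite described above reduces to a small configuration table. A sketch of the cycles and horizons exactly as the article lists them (the names are descriptive labels, not the model's internal identifiers):

```python
from dataclasses import dataclass

@dataclass
class ForecastProduct:
    name: str
    cycle: str          # how often a new forecast is issued
    horizon_hours: int  # how far ahead it looks

# The National Water Model product suite as described in the article.
NWM_PRODUCTS = [
    ForecastProduct("analysis", "hourly", 0),
    ForecastProduct("short-range", "hourly", 15),
    ForecastProduct("medium-range", "daily", 10 * 24),
    ForecastProduct("long-range", "daily", 30 * 24),
]

for p in NWM_PRODUCTS:
    print(f"{p.name}: issued {p.cycle}, out to {p.horizon_hours} h")
```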
"At any point in time, forecasts from the new National Water Model have the potential to impact 300 million people," Gochis said. "What NOAA and its collaborator community are doing is trying to usher in a new era of bringing in better physics and better data into forecast models for improving situational awareness and hydrologic decision making."


News Article | January 20, 2016
Site: www.scientificcomputing.com

As South West Western Australia (WA) residents count the cost of last week’s devastating bushfires, Murdoch University scientists are working on a model to help predict future bushfire threats in the region. The work’s aim is to inform preparation and help assess the risks of catastrophes, such as the Yarloop tragedy in which two elderly men died and 143 properties were razed. “Using state-of-the-art regional climate models, we are investigating future changes in fire weather by focusing on the key contributing climate factors, which are temperature, rainfall, wind speed and relative humidity,” researcher Alyce Sala-Tenna says. The end result will be bushfire risk ratings projected over time, consistent with the McArthur Forest Fire Danger Index — the same index WA’s Department of Fire and Emergency Services (DFES) currently uses to produce daily Fire Danger Ratings in regional WA. “We’re aiming to show how bushfires are going to change and to get a better understanding of where more frequent and more intense fires, longer fire seasons and other shifts are likely to occur,” Sala-Tenna says. To develop the models, the Murdoch team is drawing on the Weather Research and Forecasting (WRF) model, developed in the United States. Through WRF, they’ve downscaled global climate models from their original 100- to 250-kilometer resolutions to five-kilometer grids of WA’s South West, each grid cell covering an area slightly larger than Kings Park. The grid model incorporates data from the Intergovernmental Panel on Climate Change (IPCC)’s A2 scenario. The resulting 30-year climate simulations will assess future changes in fire weather, including the impact of flammable fuel loads. To make the whole project work, the Murdoch team is relying on the power of the Pawsey Supercomputing Centre’s petascale machine Magnus, the most powerful supercomputer in the southern hemisphere, with processing power equivalent to six million iPads. “Our simulations require significant computing power and data storage, so without Pawsey, the research would not be possible,” Sala-Tenna says. The research could also assist in developing effective policies on bushfire management into the future and provide agencies and the public with a better understanding of how climate change affects the natural environment. “You can never prevent bushfires — they’re part of the natural cycle and are part of how Australia has developed — but we can be better prepared,” Sala-Tenna says. Individuals concerned about bushfires in their area can consult the DFES Map of Bush Fire Prone Areas. This article was originally published on ScienceNetwork WA.
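The McArthur index the team is targeting combines exactly the four climate factors Sala-Tenna names. A sketch using the widely cited Noble et al. (1980) curve fit for the Mark 5 Forest Fire Danger Index, on the assumption that this is the form the projected ratings would follow:

```python
import math

def ffdi(temp_c: float, rel_humidity: float, wind_kmh: float,
         drought_factor: float) -> float:
    """McArthur Mark 5 Forest Fire Danger Index via the Noble et al. (1980)
    approximation. Inputs: temperature (C), relative humidity (%),
    10-m wind speed (km/h), drought factor (0-10, > 0 for this fit)."""
    return 2.0 * math.exp(
        -0.450
        + 0.987 * math.log(drought_factor)
        - 0.0345 * rel_humidity
        + 0.0338 * temp_c
        + 0.0234 * wind_kmh
    )

# A hot, dry, windy day with fully cured fuels (drought factor 10):
print(round(ffdi(40, 10, 40, 10)))  # ~86, in the "extreme" rating band
```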


News Article | February 27, 2017
Site: www.eurekalert.org

As the world warms, mountain snowpack will not only melt earlier, it will also melt more slowly, according to a new study by scientists at the National Center for Atmospheric Research (NCAR). The counterintuitive finding, published today in the journal Nature Climate Change, could have widespread implications for water supplies, ecosystem health, and flood risk. "When snowmelt shifts earlier in the year, the snow is no longer melting under the high sun angles of late spring and early summer," said NCAR postdoctoral researcher Keith Musselman, lead author of the paper. "The Sun just isn't providing enough energy at that time of year to drive high snowmelt rates." The study was funded by the National Science Foundation, NCAR's sponsor. The findings could explain recent research that suggests the average streamflow in watersheds encompassing snowy mountains may decline as the climate warms -- even if the total amount of precipitation in the watershed remains unchanged. That's because the snowmelt rate can directly affect streamflow. When snowpack melts more slowly, the resulting water lingers in the soil, giving plants more opportunity to take up the moisture. Water absorbed by plants is water that doesn't make it into the stream, potentially reducing flows. Musselman first became interested in how snowmelt rates might change in the future when he was doing research in the Sierra Nevada. He noticed that shallower, lower-elevation snowpack melted earlier and more slowly than thicker, higher-elevation snowpack. The snow at cooler, higher elevations tended to stick around until early summer -- when the Sun was relatively high in the sky and the days had grown longer -- so when it finally started to melt, the melt was rapid. Musselman wondered if the same phenomenon would unfold in a future climate, when warmer temperatures are expected to transform higher-elevation snowpack into something that looks much more like today's lower-elevation snowpack. If so, the result would be more snow melting slowly and less snow melting quickly. To investigate the question, Musselman first confirmed what he'd noticed in the Sierra by analyzing a decade's worth of snowpack observations from 979 stations in the United States and Canada. He and his co-authors -- NCAR scientists Martyn Clark, Changhai Liu, Kyoko Ikeda, and Roy Rasmussen -- then simulated snowpack over the same decade using the NCAR-based Weather Research and Forecasting (WRF) model. Once they determined that the output from WRF tracked with the observations, they used simulations from the model to investigate how snowmelt rates might change in North America around the end of the century if climate change continues unabated. "We found a decrease in the total volume of meltwater -- which makes sense given that we expect there to be less snow overall in the future," Musselman said. "But even with this decrease, we found an increase in the amount of water produced at low melt rates and, on the flip side, a decrease in the amount of water produced at high melt rates." While the study did not investigate the range of implications that could come from the findings, Musselman said the impacts could be far-reaching. For example, a reduction in high melt rates could mean fewer spring floods, which could lower the risk of infrastructure damage but also negatively affect riparian ecosystems. 
Changes in the timing and amount of snowmelt runoff could also cause warmer stream temperatures, which would affect trout and other fish species, and the expected decrease in streamflow could cause shortages in urban water supplies. "We hope this study motivates scientists from many other disciplines to dig into our research so we can better understand the vast implications of this projected shift in hydrologic patterns," Musselman said. The University Corporation for Atmospheric Research manages the National Center for Atmospheric Research under sponsorship by the National Science Foundation. Any opinions, findings and conclusions, or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
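The headline result, the same meltwater arriving at lower rates, can be pictured as a re-binning of daily melt into slow and fast classes. A toy sketch with made-up numbers and a hypothetical 10 mm/day threshold, not a value from the paper:

```python
import numpy as np

def melt_volume_by_rate(daily_melt_mm: np.ndarray, threshold: float = 10.0):
    """Split total meltwater into 'slow' and 'fast' fractions using a
    hypothetical 10 mm/day threshold (not a value from the paper)."""
    slow = daily_melt_mm[daily_melt_mm < threshold].sum()
    fast = daily_melt_mm[daily_melt_mm >= threshold].sum()
    return slow, fast

current = np.array([2, 5, 8, 20, 35, 12, 0])  # illustrative melt season, mm/day
future = np.array([4, 6, 7, 9, 8, 5, 3])      # earlier, slower melt under warming

print(melt_volume_by_rate(current))  # volume dominated by fast melt
print(melt_volume_by_rate(future))   # total volume drops and shifts to slow melt
```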


News Article | December 27, 2016
Site: www.eurekalert.org

Controls engineers at UC San Diego have developed practical strategies for building and coordinating scores of sensor-laden balloons within hurricanes. Using onboard GPS and cellphone-grade sensors, each drifting balloon becomes part of a "swarm" of robotic vehicles, which can periodically report, via satellite uplink, their position, the local temperature, pressure, humidity and wind velocity. This new, comparatively low-cost sensing strategy promises to provide much-needed in situ sampling of environmental conditions over longer periods of time and from many vantage points within developing hurricanes. This has the potential to greatly improve efforts to estimate and forecast the intensity and track of future hurricanes in real time. Current two- to five-day forecasts of many hurricanes deviate significantly from each other, and from the truth. For example, as Hurricane Matthew churned toward the eastern seaboard in early October of 2016, various news outlets reported "forecasts" like "Hurricane Matthew will probably make landfall somewhere between Charleston and Boston, so everyone brace yourselves." "Guidance like this is entirely inadequate for evacuation and emergency response preparations," said Thomas Bewley, a professor at the Jacobs School of Engineering at UC San Diego and the paper's senior author. Improved forecasts, to be greatly facilitated by improved in situ environmental sampling, are essential to protect property and save lives from such extreme environmental threats, he added. Key challenges in this effort include the design of small, robust, buoyancy-controlled balloons that won't accumulate ice; the efficient coordination of the motion of these balloons to keep them moving within the hurricane, between altitudes of 0 and 8 kilometers (about 5 miles); and keeping them well distributed over the dynamically significant regions within the hurricane, for up to a week at a time. Bewley and UC San Diego post-doctoral researcher Gianluca Meneghello detail various aspects of their work on this problem in the October 2016 issue of Physical Review Fluids, building upon work they published in the proceedings of the eighth International Symposium on Stratified Flows (ISSF) in San Diego (Sept. 1, 2016). They plan to expand on their work at the forthcoming IEEE Aerospace Conference in Big Sky, Mont. (March 6, 2017). The model for large-scale coordination of balloon swarms within hurricanes, as discussed in the Physical Review Fluids article, uses a clever model predictive control strategy that leverages the cutting-edge Weather Research and Forecasting code developed by the National Center for Atmospheric Research, the National Oceanic and Atmospheric Administration and the Air Force Weather Agency (AFWA). Multiple simulations indicate the remarkable effectiveness of this approach, including a simulation based on the evolution of Hurricane Katrina as it moved across the Gulf of Mexico, as summarized in the video available at http://flowcontrol.ucsd.edu/katrina.mp4. "The key idea of our large-scale balloon coordination strategy," said Bewley, "is to 'go with the flow,' commanding small vertical movements of the balloons and leveraging the strong vertical stratification of the horizontal winds within the hurricane to distribute the balloons in the desired fashion horizontally." Intermediate-scale and small-scale fluctuations in the violent turbulent flow of a hurricane, which are unresolved by forecasting codes like WRF, are quite substantial. The researchers' strategy?
"We simply ride out the smaller-scale fluctuations of the flow," said Meneghello. "The smaller-scale flowfield fluctuations induce something of a random walk in the balloon motion. We model these fluctuations statistically, and respond with corrections only if a balloon deviates too far from its desired location in the formation." As summarized in their ISSF paper, the researchers' strategy for applying such corrections, dubbed Three Level Control (and endearingly abbreviated TLC), applies a finite shift to the vertical location of the displaced balloon for a short period of time, again leveraging the strong vertical stratification of the horizontal winds to return the balloon to its nominal desired location. A third essential ingredient of the project, summarized in the researchers' IEEE paper, is the design of small (about 3 kg or 6.5 lbs.), robust, energetically-efficient, buoyancy-controlled balloons that can survive, without significant accumulation of ice, in the cold, wet, turbulent, electrically active environment of a hurricane. The balloons can operate effectively for up to a week at a time on a battery charge not much larger than that of a handful of iPhones. "Cellphone-grade technologies, for both environmental sensors as well as low-energy radios and microprocessors, coupled with new space-grade balloon technology developed by Thin Red Line Aerospace, are on the cusp of making this ambitious robotic sensing mission feasible," said Bewley. In addition to robotics, Bewley's team specializes in the field of control theory, which is the essential "hidden technology" in many engineering applications, such as cruise control and adaptive suspension systems in cars, stability augmentation systems in high-performance aircraft and adaptive noise cancellation in telecommunication. Control theory made it possible for SpaceX rockets to land on barges at sea. Though the math and numerical methods involved are sophisticated, the fundamental principle is straightforward: sensors take measurements of the physical environment, then a computer uses these measurements in real time to coordinate appropriate responses by the system (in this case, the buoyancy of the balloons) to achieve the desired effect. Bewley, Meneghello and colleagues are now working towards testing the balloons and algorithms designed in this study in the real world. With sensor balloon swarms and the special TLC coming out of their lab, fire and safety officials may soon have a crucial extra couple of days to move people out of harm's way, and to prepare emergency responses, when the next Katrina or Sandy threatens.


Vatvani D., Deltares | Zweers N.C., Weather Research | Van Ormondt M., Deltares | Smale A.J., Deltares | And 2 more authors.
Natural Hazards and Earth System Science | Year: 2012

To simulate winds and water levels, numerical weather prediction (NWP) and storm surge models generally use the traditional bulk relation for wind stress, which is characterized by a wind drag coefficient. A drag coefficient still commonly used in those models, some of which were developed decades ago, is based on a relation in which the magnitude of the coefficient is either constant or increases monotonically with increasing surface wind speed (Bender, 2007; Kim et al., 2008; Kohno and Higaki, 2006). The NWP and surge models are often tuned independently of each other in order to obtain good results. Observations have indicated that the magnitude of the drag coefficient levels off at a wind speed of about 30 m s⁻¹, and then decreases with further increase of the wind speed. Above a wind speed of approximately 30 m s⁻¹, the stress above the air-sea interface starts to saturate. To represent the reduction and levelling off of the drag coefficient, the original Charnock drag formulation has been extended with a correction term. In line with the above, the Delft3D storm surge model is tested using both Charnock's and Makin's improved wind drag parameterization to evaluate the improvements to the storm surge model results, with and without inclusion of wave effects. The effect of waves on storm surge is included by simultaneously simulating waves with the SWAN model on identical model grids in a coupled mode. However, the results presented here focus on the storm surge results that include the wave effects. The runs were carried out in the Gulf of Mexico for the Katrina and Ivan hurricane events. The storm surge model was initially forced with H*wind data (Powell et al., 2010) to test the effect of Makin's wind drag parameterization on the storm surge model separately. The computed wind, water levels and waves are subsequently compared with observation data. Based on the good results obtained, we conclude that, for a good reproduction of storm surges under hurricane conditions, Makin's new drag parameterization is preferable to the traditional Charnock relation. Furthermore, we are encouraged by these results to continue the studies and establish the effect of Makin's improved wind drag parameterization in the wave model. The results from this study will be used to evaluate the relevance of extending the present approach towards implementation of a similar wind drag parameterization in the SWAN wave model, in line with our aim to apply a consistent wind drag formulation throughout the entire storm surge modelling approach. © 2012 Author(s). CC Attribution 3.0 License.
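For readers unfamiliar with bulk wind stress formulas, the behavior the abstract describes, a drag coefficient that grows with wind speed, saturates near 30 m s⁻¹ and then falls, can be sketched schematically. The coefficients below are illustrative only, patterned on classic linear bulk relations, and are not Makin's parameterization:

```python
def drag_coefficient(u10: float) -> float:
    """Schematic 10-m wind drag coefficient: grows roughly linearly with wind
    speed (as in classic bulk relations), saturates near 30 m/s, then
    decreases. Illustrative coefficients, not Makin's formulation."""
    cd = 1.0e-3 * (0.8 + 0.065 * u10)  # monotonic bulk relation
    if u10 <= 30.0:
        return cd
    cd_peak = 1.0e-3 * (0.8 + 0.065 * 30.0)  # stress saturates above ~30 m/s
    return max(cd_peak * (1.0 - 0.01 * (u10 - 30.0)), 1.0e-3)

for u in (10, 20, 30, 40, 50):
    print(u, round(drag_coefficient(u) * 1e3, 2))  # Cd x 10^3
```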


Chakraborty A., Atmospheric and Oceanic Science Group | Kumar R., Atmospheric and Oceanic Science Group | Stoffelen A., Weather Research
Remote Sensing Letters | Year: 2013

Ocean surface winds from the OCEANSAT-2 scatterometer (OSCAT) were validated against equivalent neutral wind observations from 87 global buoys and winds from the European Centre for Medium-Range Weather Forecasts (ECMWF) Numerical Weather Prediction (NWP) model using triple collocation over a period of 9 months. Functional relationship analysis (FRA) employing the error-in-variables method is found to be more 'exact' than classical linear regression analysis for the validation of the OSCAT data. Moreover, using the wind component domain for validation and error assessment, rather than the speed and direction domain, is confirmed to be favourable. The FRA method applied to the triple-collocated wind components shows that the error standard deviations of the OSCAT and buoy winds are quite similar. The calibration trends and biases for OSCAT, buoys and ECMWF are found to be close to unity and zero, respectively. © 2012 Taylor & Francis.
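Triple collocation, the core technique here, recovers each measurement system's error variance from the pairwise covariances of three collocated data sets with mutually independent errors. A minimal sketch of the classic covariance formulation for one wind component, checked against synthetic data:

```python
import numpy as np

def triple_collocation(x, y, z):
    """Error standard deviations of three collocated systems (e.g., OSCAT,
    buoys, ECMWF winds), assuming independent errors: the classic
    covariance-based triple collocation estimates."""
    c = np.cov(np.vstack([x, y, z]))
    ex = c[0, 0] - c[0, 1] * c[0, 2] / c[1, 2]
    ey = c[1, 1] - c[0, 1] * c[1, 2] / c[0, 2]
    ez = c[2, 2] - c[0, 2] * c[1, 2] / c[0, 1]
    return np.sqrt([max(v, 0.0) for v in (ex, ey, ez)])

# Synthetic check: a common truth plus independent noise of known size.
rng = np.random.default_rng(0)
truth = rng.normal(0, 3, 100_000)  # "true" wind component, m/s
osc, buoy, nwp = (truth + rng.normal(0, s, truth.size) for s in (0.8, 0.9, 1.2))
print(triple_collocation(osc, buoy, nwp))  # approximately [0.8, 0.9, 1.2]
```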
