Center for Analysis and Prediction of Storms

Norman, OK, United States


Snyder J.C., University of Oklahoma | Bluestein H.B., University of Oklahoma | Dawson D.T., Purdue University | Jung Y., Center for Analysis and Prediction of Storms
Journal of Applied Meteorology and Climatology | Year: 2017

A high-resolution numerical model and polarimetric forward operator allow one to examine simulated convective storms from the perspective of observable polarimetric radar quantities, enabling a better comparison of modeled and observed deep moist convection. Part I of this two-part study described the model and forward operator used for all simulations and examined the structure and evolution of rings of reduced copolar cross-correlation coefficient (i.e., ρhv rings). The microphysical structure of upward extensions of enhanced differential reflectivity (ZDR columns and ZDR rings) and enhanced specific differential phase (KDP columns) near and within the updrafts of convective storms serves as the focus of this paper. In general, simulated ZDR columns are located immediately west of the midlevel updraft maximum and are associated with rainwater lofted above the 0°C level and wet hail/graupel, whereas ZDR rings are associated with wet hail located near and immediately east of the midlevel updraft maximum. The deepest areas of ZDR > 1 dB aloft are associated with supercells in the highest-shear environments and those that have the most intense updrafts; the upper extent of the ZDR signatures is found to be positively correlated with the amount and mean-mass diameter of large hail aloft, likely as a by-product of the shared correlations with updraft intensity and wind shear. Large quantities of rain compose the KDP columns, with the size and depth of the KDP columns directly proportional to the size and intensity of the updrafts. © 2017 American Meteorological Society.
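For readers unfamiliar with the polarimetric variables referenced in this two-part study, the standard textbook definitions are sketched below. The notation is generic and is not reproduced from the papers themselves.

```latex
% Standard definitions of the polarimetric radar variables referenced above
% (generic textbook forms, not taken from the papers).
\begin{align*}
  Z_{DR} &= 10\,\log_{10}\!\left(\frac{Z_{hh}}{Z_{vv}}\right)\ [\mathrm{dB}]
    && \text{differential reflectivity}\\
  \rho_{hv} &= \frac{\left|\langle S_{vv} S_{hh}^{*}\rangle\right|}
    {\sqrt{\langle |S_{hh}|^{2}\rangle\,\langle |S_{vv}|^{2}\rangle}}
    && \text{copolar cross-correlation coefficient}\\
  K_{DP} &= \frac{1}{2}\,\frac{d\Phi_{DP}}{dr}\ [^{\circ}\,\mathrm{km^{-1}}]
    && \text{specific differential phase}
\end{align*}
```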


Snyder J.C., University of Oklahoma | Bluestein H.B., University of Oklahoma | Dawson D.T., Purdue University | Jung Y., Center for Analysis and Prediction of Storms
Journal of Applied Meteorology and Climatology | Year: 2017

With the development of multimoment bulk microphysical schemes and polarimetric radar forward operators, one can better examine convective storms simulated in high-resolution numerical models from a simulated polarimetric radar perspective. In turn, relationships between observable and unobservable quantities can be examined that may provide useful information about storm intensity and organization that would otherwise be difficult to obtain. This paper, Part I of a two-part sequence, describes the bulk microphysics scheme, polarimetric radar forward operator, and numerical model configuration used to simulate supercells in eight idealized, horizontally homogeneous environments with different wind profiles. The microphysical structure and evolution of copolar cross-correlation coefficient (ρhv) rings associated with simulated supercells are examined in Part I, whereas Part II examines ZDR columns, ZDR rings, and KDP columns. In both papers, some systematic differences between the signatures seen at X and S bands are discussed. The presence of hail is found to affect ρhv much more at X band than at S band (and is found to affect ZDR more at S band than at X band), which corroborates observations. The ρhv half ring is found to be associated with the presence of large, sometimes wet, hail aloft, with an ~20-min time lag between increases in the size of the ρhv ring aloft and the occurrence of a large amount of hail near the ground in some simulations. © 2017 American Meteorological Society.


Dawson D.T., Center for Analysis and Prediction of Storms | Xue M., Center for Analysis and Prediction of Storms | Xue M., University of Oklahoma | Milbrandt J.A., Environment Canada | Shapiro A., University of Oklahoma
Monthly Weather Review | Year: 2015

Numerical predictions of the 3 May 1999 Oklahoma City, Oklahoma, tornadic supercell are performed within a real-data framework utilizing telescoping nested grids of 3-km, 1-km, and 250-m horizontal spacing. Radar reflectivity and radial velocity from the Oklahoma City WSR-88D are assimilated using a cloud analysis procedure coupled with a cycled 3DVAR system to analyze storms on the 1-km grid for subsequent forecast periods. Single-, double-, and triple-moment configurations of a multimoment bulk microphysics scheme are used in several experiments on the 1-km and 250-m grids to assess the impact of varying the complexity of the microphysics scheme on the storm structure, behavior, and tornadic activity (on the 250-m grid). This appears to be the first study of its type to investigate single- versus multimoment microphysics within a real-data context. It is found that the triple-moment scheme overall performs the best, producing the smallest track errors for the mesocyclone on the 1-km grid, and stronger and longer-lived tornado-like vortices (TLVs) on the 250-m grid, closest to the observed tornado. In contrast, the single-moment scheme with the default Marshall-Palmer rain intercept parameter performs poorly, producing a cold pool that is too strong, and only weak and short-lived TLVs. The results are discussed in the context of differences in latent cooling from evaporation and melting among the schemes, as well as the implications for numerical prediction of tornadoes. More generally, the feedbacks to storm thermodynamics and dynamics from increasing the prognostic detail of the hydrometeor size distributions are found to be important for improving the simulation and prediction of tornadic thunderstorms. © 2015 American Meteorological Society.
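As a rough guide to what "single-", "double-", and "triple-moment" mean in this context, the sketch below gives the generic gamma particle size distribution used in bulk schemes of this family and its moments; the notation is schematic and not lifted from the specific scheme used in the paper.

```latex
% Generic gamma particle size distribution (PSD) and its moments; which
% moments are prognosed distinguishes single-, double-, and triple-moment
% bulk schemes (schematic notation only).
\begin{align*}
  N(D) &= N_{0}\, D^{\alpha}\, e^{-\lambda D},
  \qquad
  M_{p} = \int_{0}^{\infty} D^{p}\, N(D)\, dD
        = \frac{N_{0}\,\Gamma(\alpha + p + 1)}{\lambda^{\alpha + p + 1}}
\end{align*}
```

A single-moment configuration predicts only the hydrometeor mass (proportional to M3 for spherical particles) and must fix the intercept N0, for example the Marshall-Palmer rain intercept mentioned above; a double-moment configuration also predicts the total number concentration Nt = M0, so N0 and λ can evolve; a triple-moment configuration additionally predicts a reflectivity-like sixth moment (proportional to M6), allowing the shape parameter α to vary as well.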


Clark A.J., University of Oklahoma | Kain J.S., National Oceanic and Atmospheric Administration | Marsh P.T., University of Oklahoma | Marsh P.T., National Oceanic and Atmospheric Administration | And 5 more authors.
Weather and Forecasting | Year: 2012

A three-dimensional (in space and time) object identification algorithm is applied to high-resolution forecasts of hourly maximum updraft helicity (UH), a diagnostic that identifies simulated rotating storms, with the goal of diagnosing the relationship between forecast UH objects and observed tornado pathlengths. UH objects are contiguous swaths of UH exceeding a specified threshold. Including time allows tracks to span multiple hours and entire life cycles of simulated rotating storms. The object algorithm is applied to 3 yr of 36-h forecasts initialized daily from a 4-km grid-spacing version of the Weather Research and Forecasting Model (WRF) run in real time at the National Severe Storms Laboratory (NSSL), and to forecasts from the Storm Scale Ensemble Forecast (SSEF) system run by the Center for Analysis and Prediction of Storms for the 2010 NOAA Hazardous Weather Testbed Spring Forecasting Experiment. Methods for visualizing UH object attributes are presented, and the relationship between pathlengths of UH objects and tornadoes for corresponding 18- or 24-h periods is examined. For deterministic NSSL-WRF UH forecasts, the relationship of UH pathlengths to tornadoes was much stronger during spring (March-May) than in summer (June-August). Filtering UH track segments produced by high-based and/or elevated storms improved the UH-tornado pathlength correlations. The best ensemble results were obtained after filtering high-based and/or elevated UH track segments for the 20 cases in April-May 2010, during which correlation coefficients were as high as 0.91. The results indicate that forecast UH pathlengths during spring could be a very skillful predictor of the severity of tornado outbreaks as measured by total pathlength. © 2012 American Meteorological Society.
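The object-identification step described in this abstract amounts to connected-component labeling of the space-time UH field. Below is a minimal sketch assuming a regular (time, y, x) grid; the threshold value and the pathlength definition are illustrative assumptions, not the paper's exact configuration.

```python
# Hypothetical sketch of 3D (time, y, x) object identification for hourly-max
# updraft helicity (UH), in the spirit of the algorithm described above.
import numpy as np
from scipy import ndimage

def uh_objects(uh, threshold=75.0, dx_km=4.0):
    """Label contiguous space-time swaths of UH >= threshold and return a
    rough pathlength (km) for each object.

    uh : array of shape (n_hours, ny, nx) of hourly maximum UH (m^2 s^-2)
    """
    mask = uh >= threshold
    # 3x3x3 connectivity lets a swath continue across adjacent hours, so a
    # single object can span the entire life cycle of a simulated storm.
    structure = np.ones((3, 3, 3), dtype=bool)
    labels, n_objects = ndimage.label(mask, structure=structure)

    pathlengths = {}
    for obj in range(1, n_objects + 1):
        t_idx, y_idx, x_idx = np.nonzero(labels == obj)
        # Illustrative pathlength: distance between the object's centroid at
        # its first and last hours, converted to km by the grid spacing.
        t0, t1 = t_idx.min(), t_idx.max()
        c0 = np.array([y_idx[t_idx == t0].mean(), x_idx[t_idx == t0].mean()])
        c1 = np.array([y_idx[t_idx == t1].mean(), x_idx[t_idx == t1].mean()])
        pathlengths[obj] = float(np.linalg.norm(c1 - c0)) * dx_km
    return labels, pathlengths
```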


Johnson A., University of Oklahoma | Wang X., University of Oklahoma | Kong F., Center for Analysis and Prediction of Storms | Xue M., University of Oklahoma
Monthly Weather Review | Year: 2011

Convection-allowing ensemble forecasts with perturbations to model physics, dynamics, and initial conditions (IC) and lateral boundary conditions (LBC), generated by the Center for Analysis and Prediction of Storms for the NOAA Hazardous Weather Testbed (HWT) Spring Experiments, provide a unique opportunity to understand the relative impact of different sources of perturbation on convection-allowing ensemble diversity. Such impacts are explored in this two-part study through an object-oriented hierarchical cluster analysis (HCA) technique. In this paper, an object-oriented HCA algorithm, in which the dissimilarity of precipitation forecasts is quantified with a nontraditional object-based threat score (OTS), is developed. The advantages of OTS-based HCA relative to HCA using traditional Euclidean distance and neighborhood probability-based Euclidean distance (NED) as dissimilarity measures are illustrated using hourly accumulated precipitation ensemble forecasts from a representative severe weather event. Clusters based on OTS and NED are more consistent with subjective evaluation than clusters based on traditional Euclidean distance because of the sensitivity of Euclidean distance to small spatial displacements. OTS further improves the clustering relative to NED. Only OTS accounts for important features of precipitation areas, such as shape, size, and orientation, and OTS is less sensitive than NED to precise spatial location and precipitation amount. OTS is further improved by using a fuzzy matching method. Application of OTS-based HCA to regional subdomains is also introduced. Part II uses the HCA method developed in this paper to explore systematic clustering of the convection-allowing ensemble during the full 2009 HWT Spring Experiment period. © 2011 American Meteorological Society.
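Once a pairwise dissimilarity between ensemble members has been defined (for example, 1 - OTS as described above), the hierarchical clustering itself is standard. The sketch below assumes such a precomputed dissimilarity matrix; the matrix values, cluster count, and linkage method are placeholders, and the OTS computation itself is outside this sketch.

```python
# Minimal sketch of hierarchical cluster analysis (HCA) over ensemble
# precipitation forecasts, given a precomputed pairwise dissimilarity matrix
# (e.g., 1 - OTS); all numbers below are placeholders.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def cluster_members(dissimilarity, n_clusters=4, method="average"):
    """Group ensemble members given an (n_members x n_members) symmetric
    dissimilarity matrix with zeros on the diagonal."""
    condensed = squareform(dissimilarity, checks=False)  # condensed form for linkage
    tree = linkage(condensed, method=method)
    return fcluster(tree, t=n_clusters, criterion="maxclust")

# Example with a placeholder 5-member dissimilarity matrix:
d = np.array([[0.0, 0.2, 0.8, 0.7, 0.9],
              [0.2, 0.0, 0.7, 0.6, 0.8],
              [0.8, 0.7, 0.0, 0.1, 0.5],
              [0.7, 0.6, 0.1, 0.0, 0.4],
              [0.9, 0.8, 0.5, 0.4, 0.0]])
print(cluster_members(d, n_clusters=2))  # e.g., members {0, 1} vs. {2, 3, 4}
```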


Johnson A., University of Oklahoma | Wang X., University of Oklahoma | Xue M., University of Oklahoma | Kong F., Center for Analysis and Prediction of Storms
Monthly Weather Review | Year: 2011

Twenty-member real-time convection-allowing storm-scale ensemble forecasts with perturbations to model physics, dynamics, initial conditions (IC), and lateral boundary conditions (LBC) during the NOAA Hazardous Weather Testbed Spring Experiment provide a unique opportunity to study the relative impact of different sources of perturbation on convection-allowing ensemble diversity. In Part II of this two-part study, systematic similarities and dissimilarities of hourly precipitation forecasts among ensemble members from the spring season of 2009 are identified using hierarchical cluster analysis (HCA) with a fuzzy object-based threat score (OTS), developed in Part I. In addition to precipitation, HCA is also performed on ensemble forecasts using the traditional Euclidean distance for wind speed at 10 m and 850 hPa, and temperature at 500 hPa. At early lead times (3 h, valid at 0300 UTC), precipitation forecasts cluster primarily by data assimilation and model dynamical core, indicating a dominant impact of the model itself, with secondary clustering by microphysics scheme. There is an increasing impact of the planetary boundary layer (PBL) scheme on clustering relative to the microphysics scheme at later lead times. Forecasts of 10-m wind speed cluster primarily by the PBL scheme at early lead times, with an increasing impact of LBC at later lead times. Forecasts of midtropospheric variables cluster primarily by IC at early lead times and LBC at later lead times. Radar and Mesonet data assimilation (DA) shows its impact through the 12-h lead time (valid at 1200 UTC) for both precipitation and nonprecipitation variables, with members without DA forming a distinct cluster. The implications for optimal ensemble design for storm-scale forecasts are also discussed. © 2011 American Meteorological Society.


Schenkman A.D., University of Oklahoma | Xue M., University of Oklahoma | Shapiro A., University of Oklahoma | Brewster K., Center for Analysis and Prediction of Storms | Gao J., National Severe Storms Laboratory
Monthly Weather Review | Year: 2011

The impact of radar and Oklahoma Mesonet data assimilation on the prediction of mesovortices in a tornadic mesoscale convective system (MCS) is examined. The radar data come from the operational Weather Surveillance Radar-1988 Doppler (WSR-88D) and from the IP-1 radar network of the Engineering Research Center for Collaborative Adaptive Sensing of the Atmosphere (CASA). The Advanced Regional Prediction System (ARPS) model is employed to perform high-resolution predictions of an MCS and the associated cyclonic line-end vortex that spawned several tornadoes in central Oklahoma on 8-9 May 2007, while the ARPS three-dimensional variational data assimilation (3DVAR) system in combination with a complex cloud analysis package is used for the data analysis. A set of data assimilation and prediction experiments is performed on a 400-m resolution grid nested inside a 2-km grid to examine the impact of radar data on the prediction of meso-γ-scale vortices (mesovortices). An 80-min assimilation window is used in the radar data assimilation experiments. An additional set of experiments examines the impact of assimilating 5-min data from the Oklahoma Mesonet in addition to the radar data. Qualitative comparison with observations shows that highly accurate forecasts of mesovortices, up to 80 min in advance of their genesis, are obtained when the low-level shear ahead of the gust front is effectively analyzed. Accurate analysis of the low-level shear profile relies on assimilating high-resolution low-level wind information. The most accurate analysis (and resulting prediction) is obtained in experiments that assimilate low-level radial velocity data from the CASA radars. Assimilation of 5-min observations from the Oklahoma Mesonet has a substantial positive impact on the analysis and forecast when high-resolution low-level wind observations from CASA are absent; when the low-level CASA wind data are assimilated, the impact of the Mesonet data is smaller. Experiments that do not assimilate low-level wind data from CASA radars are unable to accurately resolve the low-level shear profile and gust front structure, precluding accurate prediction of mesovortex development. © 2011 American Meteorological Society.


Wang X., University of Oklahoma | Wang X., Center for Analysis and Prediction of Storms | Lei T., University of Oklahoma | Lei T., Center for Analysis and Prediction of Storms
Monthly Weather Review | Year: 2014

A four-dimensional (4D) ensemble-variational data assimilation (DA) system (4DEnsVar) was developed, building upon the infrastructure of the gridpoint statistical interpolation (GSI)-based hybrid DA system. 4DEnsVar used ensemble perturbations valid at multiple time levels throughout the DA window to estimate 4D error covariances during the variational minimization, avoiding the tangent linear and adjoint of the forecast model. The formulation of its implementation in GSI was described. The performance of the system was investigated by evaluating the global forecasts and hurricane track forecasts produced by the NCEP Global Forecast System (GFS) over a 5-week summer period while assimilating operational conventional and satellite data. The newly developed system was used to address a few questions regarding 4DEnsVar. 4DEnsVar in general improved upon its 3D counterpart, 3DEnsVar. At short lead times, the improvement over the Northern Hemisphere extratropics was similar to that over the Southern Hemisphere extratropics. At longer lead times, 4DEnsVar showed more improvement in the Southern Hemisphere than in the Northern Hemisphere. 4DEnsVar showed less impact over the tropics. The track forecasts of 16 tropical cyclones initialized by 4DEnsVar were more accurate than those initialized by 3DEnsVar beyond 1-day forecast lead times. The analysis generated by 4DEnsVar was more balanced than that generated by 3DEnsVar. Case studies showed that increments from 4DEnsVar using more frequent ensemble perturbations approximated the increments from direct, nonlinear model propagation better than those using less frequent ensemble perturbations. Consistently, the performance of 4DEnsVar, including both the forecast accuracy and the balance of the analyses, was in general degraded when less frequent ensemble perturbations were used. The tangent linear normal-mode constraint had a positive impact on global forecasts but a negative impact on tropical cyclone track forecasts. © 2014 American Meteorological Society.
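Schematically, and in generic notation rather than the paper's exact GSI formulation, a hybrid 4DEnsVar increment and cost function take the form below; because the ensemble perturbations are already valid at each time level in the window, no tangent linear or adjoint model is needed to propagate increments in time, which is the point the abstract makes.

```latex
% Schematic (generic-notation) hybrid 4DEnsVar increment and cost function;
% a textbook EnVar construction, not necessarily the exact implementation
% evaluated in the paper.
\begin{align*}
  \delta \mathbf{x}_{t_l}
    &= \delta \mathbf{x}_{1}
       + \sum_{k=1}^{K} \mathbf{a}_{k} \circ \mathbf{x}^{e}_{k,t_l},\\
  J(\delta \mathbf{x}_{1}, \mathbf{a})
    &= \tfrac{1}{2}\,\beta_{1}\,\delta \mathbf{x}_{1}^{\mathrm T}
         \mathbf{B}_{1}^{-1}\,\delta \mathbf{x}_{1}
       + \tfrac{1}{2}\,\beta_{2}\,\mathbf{a}^{\mathrm T}\mathbf{A}^{-1}\mathbf{a}
       + \tfrac{1}{2}\sum_{l=1}^{L}
         \bigl(\mathbf{y}'_{l}-\mathbf{H}_{l}\,\delta \mathbf{x}_{t_l}\bigr)^{\mathrm T}
         \mathbf{R}_{l}^{-1}
         \bigl(\mathbf{y}'_{l}-\mathbf{H}_{l}\,\delta \mathbf{x}_{t_l}\bigr)
\end{align*}
```

Here x^e_(k,t_l) is the kth ensemble perturbation valid at time level t_l, the a_k are extended control variables held constant across the window (the Schur product with them applies covariance localization through A), δx_1 is the static-covariance part of the hybrid increment, and H_l, R_l, and y'_l are the observation operator, observation-error covariance, and innovation at time level t_l.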


News Article | March 23, 2016
Site: www.rdmag.com

When a hail storm moved through Fort Worth, Texas, on May 5, 1995, it battered the highly populated area with hail up to 4 inches in diameter and struck a local outdoor festival known as the Fort Worth Mayfest. The Mayfest storm was one of the costliest hailstorms in U.S. history, causing more than $2 billion in damage and injuring at least 100 people.

Scientists know that storms with a rotating updraft on their southwestern sides -- which are particularly common in the spring on the U.S. southern plains -- are associated with the biggest, most severe tornadoes and also produce a lot of large hail. However, clear ideas on how they form and how to predict these events in advance have proven elusive.

A team based at the University of Oklahoma (OU) working on the Severe Hail Analysis, Representation and Prediction (SHARP) project works to solve that mystery, with support from the National Science Foundation (NSF). Performing experimental weather forecasts using the Stampede supercomputer at the Texas Advanced Computing Center, researchers have gained a better understanding of the conditions that cause severe hail to form, and are producing predictions with far greater accuracy than those currently used operationally.

To predict hail storms, or weather in general, scientists have developed mathematically based physics models of the atmosphere and the complex processes within, and computer codes that represent these physical processes on a grid consisting of millions of points. Numerical models in the form of computer codes are integrated forward in time starting from the observed current conditions to determine how a weather system will evolve and whether a serious storm will form. Because of the wide range of spatial and temporal scales that numerical weather predictions must cover and the fast turnaround required, they are almost always run on powerful supercomputers.

The finer the resolution of the grid used to simulate the phenomena, the more accurate the forecast; but the more accurate the forecast, the more computation required. The National Weather Service's highest-resolution official forecasts have grid spacing of one point for every three kilometers. The model the Oklahoma team is using in the SHARP project, on the other hand, uses one grid point for every 500 meters -- six times more resolved in the horizontal directions. "This lets us simulate the storms with a lot higher accuracy," said Nathan Snook, an OU research scientist. "But the trade-off is, to do that, we need a lot of computing power -- more than 100 times that of three-kilometer simulations. Which is why we need Stampede."

Stampede is currently one of the most powerful supercomputers in the U.S. for open science research and serves as an important part of NSF's portfolio of advanced cyberinfrastructure resources, enabling cutting-edge computational and data-intensive science and engineering research nationwide.

According to Snook, there's a major effort underway to move to a "warning on forecast" paradigm -- that is, to use computer-model-based, short-term forecasts to predict what will happen over the next several hours and use those predictions to warn the public, as opposed to warning only when storms form and are observed. "How do we get the models good enough that we can warn the public based on them?" Snook asks. "That's the ultimate goal of what we want to do -- get to the point where we can make hail forecasts two hours in advance. 'A storm is likely to move into downtown Dallas, now is a good time to act.'"

With such a system in place, it might be possible to prevent injuries to vulnerable people, divert or move planes into hangars and protect cars and other property.

Looking at past storms to predict future ones

To study the problem, the team first reviews the previous season's storms to identify the best cases to study. They then perform numerical experiments to see if their models can predict these storms better than the original forecasts using new, improved techniques. The idea is to ultimately transition the higher-resolution models they are testing into operation in the future.

Now in the third year of their hail forecasting project, the researchers are getting promising results. Studying the storms that produced the May 20, 2013, Moore, Oklahoma, tornado that led to 24 deaths, destroyed 1,150 homes and resulted in an estimated $2 billion in damage, they developed zero- to 90-minute hail forecasts that captured the storm's impact better than the National Weather Service forecasts produced at the time. "The storms in the model move faster than the actual storms," Snook said. "But the model accurately predicted which three storms would produce strong hail and the path they would take."

The models required Stampede to solve multiple fluid dynamics equations at millions of grid points and also incorporate the physics of precipitation, turbulence, radiation from the sun and energy changes from the ground. Moreover, the researchers had to simulate the storm multiple times -- as an ensemble -- to estimate and reduce the uncertainty in the data and in the physics of the weather phenomena themselves. "Performing all of these calculations on millions of points, multiple times every second, requires a massive amount of computing resources," Snook said.

The team used more than a million computing hours on Stampede for the experiments and additional time on the Darter system at the National Institute for Computational Science for more recent forecasts. The resources were provided through the NSF-supported Extreme Science and Engineering Discovery Environment (XSEDE) program, which acts as a single virtual system that scientists can use to interactively share computing resources, data and expertise.

Though the ultimate impacts of the numerical experiments will take some time to realize, their potential motivates Snook and the severe hail prediction team. "This has the potential to change the way people look at severe weather predictions," Snook said. "Five or 10 years down the road, when we have a system that can tell you that there's a severe hail storm coming hours in advance, and to be able to trust that -- it will change how we see severe weather. Instead of running for shelter, you'll know there's a storm coming and can schedule your afternoon."

Ming Xue, the leader of the project and director of the Center for Analysis and Prediction of Storms (CAPS) at OU, gave a similar assessment. "Given the promise shown by the research and the ever-increasing computing power, numerical prediction of hailstorms and warnings issued based on the model forecasts, with a couple of hours of lead time, may indeed be realized operationally in a not-too-distant future, and the forecasts will also be accompanied by information on how certain the forecasts are."
The team published its results in the proceedings of the 20th Conference on Integrated Observing and Assimilation Systems for Atmosphere, Oceans and Land Surface (IOAS-AOLS); they will also be published in an upcoming issue of the American Meteorological Society journal Weather and Forecasting. "Severe hail events can have significant economic and safety impacts," said Nicholas F. Anderson, program officer in NSF's Division of Atmospheric and Geospace Sciences. "The work being done by SHARP project scientists is a step towards improving forecasts and providing better warnings for the public."
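The computational scaling Snook cites ("more than 100 times that of three-kilometer simulations") follows from a simple back-of-the-envelope estimate: refining the horizontal grid from 3 km to 500 m multiplies the number of horizontal grid points by 6 x 6 = 36, and a CFL-type constraint forces the time step down by roughly the same factor of 6, giving a cost increase of roughly 36 x 6, or about 216. The sketch below just encodes that arithmetic; the exact factor depends on vertical resolution and model details not given in the article.

```python
# Back-of-the-envelope cost scaling for refining horizontal grid spacing,
# illustrating the "more than 100 times" figure quoted in the article.
# Assumes vertical resolution is unchanged and the time step shrinks in
# proportion to the grid spacing (CFL-type constraint); real models differ.
def cost_ratio(dx_coarse_km, dx_fine_km):
    refinement = dx_coarse_km / dx_fine_km      # e.g., 3 km -> 0.5 km gives 6
    horizontal_points = refinement ** 2         # more points in both x and y
    time_steps = refinement                     # smaller dt, so more steps
    return horizontal_points * time_steps

print(cost_ratio(3.0, 0.5))   # ~216, i.e., "more than 100 times"
```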

