Goodman S.J., NASA | Gurka J., NASA | DeMaria M., The Center for Satellite Applications and Research | Schmit T.J., The Center for Satellite Applications and Research | And 9 more authors.
Bulletin of the American Meteorological Society | Year: 2012

The Geostationary Operational Environmental Satellite R series (GOES-R) Proving Ground (PG) is an initiative to accelerate user readiness for the next generation of US geostationary environmental satellites. The GOES-R PG program enables the transition from research to operations, with principal emphasis on the National Oceanic and Atmospheric Administration (NOAA) operational forecast office environment. The GOES-R PG's principal collaboration for severe convective weather occurs within NOAA's Hazardous Weather Testbed (HWT) and Storm Prediction Center (SPC) in Norman, Oklahoma. The Satellite-Based Convection Analysis and Tracking (SATCAST) system serves as a proxy for the Algorithm Working Group (AWG) version of the GOES-R convective initiation (CI) algorithm. Simulated GOES-R Advanced Baseline Imager (ABI) imagery and band differences generated from the NSSL WRF 0000 UTC 4-km model run are provided by CIMSS and CIRA for display within the HWT National Advanced Weather Information Processing System (NAWIPS).


Mishra S., Cooperative Institute for Mesoscale Meteorological Studies | Mishra S., Desert Research Institute | Mitchell D.L., Desert Research Institute | Turner D.D., National Oceanic and Atmospheric Administration | Lawson R.P., SPEC, Inc.
Journal of Geophysical Research: Atmospheres | Year: 2014

The climate sensitivity predicted in general circulation models can be sensitive to the treatment of the ice particle fall velocity. In this study, the mass-weighted ice fall speed (Vm) and the number concentration ice fall speed (Vn) in midlatitude cirrus clouds are computed from in situ measurements of ice particle area and number concentration made by the two-dimensional stereo probe during the Small Particles In Cirrus field campaign. For single-moment ice microphysical schemes, Vm and the ice particle size distribution effective diameter De were parameterized in terms of cloud temperature (T) and ice water content (IWC). For two-moment schemes, Vm and Vn were related to De and the mean maximum dimension D̄, respectively. For single-moment schemes, although the correlations of Vm and De with T were higher than the correlations of Vm and De with IWC, it is demonstrated that Vm and De are better predicted by using both T and IWC. The parameterization relating Vm to T and IWC is compared with another scheme relating Vm to T and IWC, with the latter based on millimeter cloud radar measurements. Regarding two-moment ice microphysical schemes, a strong correlation was found between De and Vm and between D̄ and Vn, owing to their similar weightings by ice particle mass and number concentration, respectively. Estimating Vm from De makes Vm a function of IWC and projected area, realistically coupling Vm with both the cloud microphysics and radiative properties. Key Points: (1) cirrus ice fall speeds are computed from in situ data; (2) the mass-weighted ice fall speed is parameterized for GCMs; (3) the number-weighted ice fall speed is parameterized for GCMs. © 2014. American Geophysical Union. All Rights Reserved.
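As a rough illustration of what a single-moment parameterization of this kind involves, the sketch below fits log(Vm) against T and log(IWC) by least squares. The power-law functional form, units, and variable names are assumptions for illustration, not the authors' actual scheme.

```python
# A minimal sketch (not the paper's actual scheme) of fitting a
# mass-weighted fall speed parameterization of the form
#   log(Vm) = a + b*T + c*log(IWC)
# from in situ samples. Functional form and units are assumed.
import numpy as np

def fit_vm_parameterization(T, iwc, vm):
    """Least-squares fit of log(Vm) against T and log(IWC).

    T   : cloud temperature (degC), shape (n,)
    iwc : ice water content (g m^-3), shape (n,)
    vm  : measured mass-weighted fall speed (cm s^-1), shape (n,)
    """
    X = np.column_stack([np.ones_like(T), T, np.log(iwc)])
    coeffs, *_ = np.linalg.lstsq(X, np.log(vm), rcond=None)
    return coeffs  # (a, b, c)

def predict_vm(coeffs, T, iwc):
    a, b, c = coeffs
    return np.exp(a + b * T + c * np.log(iwc))
```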


Zhang G., University of Oklahoma | Luchs S., University of Oklahoma | Ryzhkov A., Cooperative Institute for Mesoscale Meteorological Studies | Xue M., University of Oklahoma | And 2 more authors.
Journal of Applied Meteorology and Climatology | Year: 2011

The study of precipitation in different phases is important for understanding the physical processes that occur in storms, as well as for improving their representation in numerical weather prediction models. A 2D video disdrometer was deployed about 30 km from a polarimetric weather radar in Norman, Oklahoma (KOUN), to observe winter precipitation events during the 2006/07 winter season. These events contained periods of rain, snow, and mixed-phase precipitation. Five-minute particle size distributions were generated from the disdrometer data and fitted to a gamma distribution; polarimetric radar variables were also calculated for comparison with KOUN data. It is found that snow density adjustment improves the comparison substantially, indicating the importance of accounting for density variability when representing model microphysics. © 2011 American Meteorological Society.
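For context, a common way to fit a gamma distribution N(D) = N0 D^μ exp(−ΛD) to binned disdrometer spectra is the method of moments; the sketch below uses the 2nd, 4th, and 6th moments, one standard choice. The paper does not state which estimator was used, so treat this as illustrative.

```python
# Moment-method fit of a gamma PSD, N(D) = N0 * D**mu * exp(-lam * D),
# from a binned particle size distribution, using moments 2, 4, and 6.
import numpy as np
from scipy.special import gamma as G

def fit_gamma_psd(D, N, dD):
    """D: bin centers (mm); N: concentration density (m^-3 mm^-1); dD: bin widths (mm)."""
    M = lambda n: np.sum(N * D**n * dD)          # n-th moment of the PSD
    M2, M4, M6 = M(2), M(4), M(6)
    eta = M4**2 / (M2 * M6)                      # = (mu+3)(mu+4)/((mu+5)(mu+6))
    mu = ((7.0 - 11.0 * eta)
          - np.sqrt(eta**2 + 14.0 * eta + 1.0)) / (2.0 * (eta - 1.0))
    lam = np.sqrt((mu + 3.0) * (mu + 4.0) * M2 / M4)   # slope (mm^-1)
    N0 = M2 * lam**(mu + 3.0) / G(mu + 3.0)            # intercept
    return N0, mu, lam
```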


Corfidi S.J., Cooperative Institute for Mesoscale Meteorological Studies | Dean A.R., National Oceanic and Atmospheric Administration
Weather and Forecasting | Year: 2011

Recent literature has identified several supercell/tornado forecast parameters in common use that are operationally beneficial in assessing environments supportive of supercell tornadoes. These parameters are utilized in the computation of tornado forecast guidance such as the significant tornado parameter (STP), a dimensionless parameter developed at the Storm Prediction Center (SPC) that applies a subjectively chosen scale. The goal of this research is to determine whether useful logistic regression equations can be developed to estimate the conditional probability of supercell tornadoes rated category 2 or stronger on the enhanced Fujita (EF) scale when a similar set of environmental background parameters is used as predictor variables. A large database of Rapid Update Cycle (RUC) analysis soundings in proximity to a representative sample of tornadic and nontornadic supercells over the central and eastern United States, a number of which were associated with EF2 or stronger tornadoes, was used to compute supercell tornado forecast parameters similar to those in the original version of STP. Three logistic regression equations were developed from this database, two of which are described and analyzed in detail. Statistical verification for both equations was accomplished using independent data from 2008 in proximity to supercell storms identified by staff at SPC. A recent version of the STP was utilized as a comparison diagnostic to accomplish part of the statistical verification. The results of this research suggest that output from both logistic regression equations can provide valuable guidance in a probabilistic sense when adjustments are made for the ongoing convective mode. Case studies presented also suggest that this guidance can provide information complementary to STP in severe weather situations with potential for supercell tornadoes. © 2011 American Meteorological Society.
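As a minimal sketch of the kind of logistic regression guidance described here, the example below fits a conditional probability of an EF2+ tornado from a few STP-like environmental predictors. The predictor set and the toy data are placeholders, not the study's RUC-derived database.

```python
# A hedged sketch of a logistic regression equation for the conditional
# probability of an EF2+ tornado given a supercell, from STP-like
# environmental parameters. Data and predictor names are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical proximity-sounding predictors (one row per supercell):
# MLCAPE (J/kg), 0-1 km SRH (m^2/s^2), 0-6 km bulk shear (m/s), MLLCL (m)
X = np.array([[1500., 250., 28., 700.],
              [ 800.,  90., 18., 1400.],
              [2400., 380., 32., 600.],
              [ 500.,  60., 14., 1800.]])
y = np.array([1, 0, 1, 0])   # 1 = EF2+ tornado observed, 0 = nontornadic

model = LogisticRegression(max_iter=1000).fit(X, y)
# P(EF2+ | supercell) for a new environment:
p = model.predict_proba([[1800., 300., 30., 650.]])[0, 1]
print(f"Conditional probability of an EF2+ tornado: {p:.2f}")
```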


Johnson A., Cooperative Institute for Mesoscale Meteorological Studies | Wang X., University of Oklahoma
Monthly Weather Review | Year: 2016

The impacts of multiscale flow-dependent initial condition (IC) perturbations for storm-scale ensemble forecasts of midlatitude convection are investigated using perfect-model observing system simulation experiments. Several diverse cases are used to quantitatively and qualitatively understand the impacts of different IC perturbations on ensemble forecast skill. Scale dependence of the results is assessed by evaluating 2-h storm-scale reflectivity forecasts separately from hourly accumulated mesoscale precipitation forecasts. Forecasts are initialized with different IC ensembles, including an ensemble of multiscale perturbations produced by a multiscale data assimilation system, mesoscale perturbations produced at a coarser resolution, and filtered multiscale perturbations. Mesoscale precipitation forecasts initialized with the multiscale perturbations are more skillful than the forecasts initialized with the mesoscale perturbations at several lead times. This multiscale advantage is due to greater consistency between the IC perturbations and IC uncertainty. This advantage also affects the short-term, smaller-scale forecasts. Reflectivity forecasts on very small scales and very short lead times are more skillful with the multiscale perturbations as a direct result of the smaller-scale IC perturbation energy. The small-scale IC perturbations also contribute to some improvements to the mesoscale precipitation forecasts after the ~5-h lead time. Altogether, these results suggest that the multiscale IC perturbations provided by ensemble data assimilation on the convection-permitting grid can improve storm-scale ensemble forecasts by improving the sampling of IC uncertainty, compared to downscaling of IC perturbations from a coarser-resolution ensemble. © 2016 American Meteorological Society.
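A small sketch of what producing "filtered" IC perturbations could look like: low-pass filtering a 2-D perturbation field in spectral space to remove energy below a cutoff wavelength. This is an illustrative stand-in, not the study's actual filtering procedure.

```python
# Low-pass filter a 2-D initial-condition perturbation field so that
# only scales longer than a cutoff wavelength remain (illustrative).
import numpy as np

def lowpass_perturbation(pert, dx, cutoff_km):
    """pert      : 2-D perturbation field (e.g., potential temperature)
       dx        : grid spacing in km
       cutoff_km : smallest wavelength to retain, in km"""
    ny, nx = pert.shape
    kx = np.fft.fftfreq(nx, d=dx)              # cycles per km
    ky = np.fft.fftfreq(ny, d=dx)
    KX, KY = np.meshgrid(kx, ky)
    wavenumber = np.sqrt(KX**2 + KY**2)
    mask = wavenumber <= 1.0 / cutoff_km       # keep only long wavelengths
    return np.real(np.fft.ifft2(np.fft.fft2(pert) * mask))

# Example: retain only perturbation scales longer than ~40 km
rng = np.random.default_rng(0)
filtered = lowpass_perturbation(rng.standard_normal((128, 128)),
                                dx=3.0, cutoff_km=40.0)
```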


Tanamachi R.L., University of Oklahoma | Tanamachi R.L., Cooperative Institute for Mesoscale Meteorological Studies | Wicker L.J., National Severe Storms Laboratory | Dowell D.C., National Oceanic and Atmospheric Administration | And 4 more authors.
Monthly Weather Review | Year: 2013

Mobile Doppler radar data, along with observations from a nearby Weather Surveillance Radar-1988 Doppler (WSR-88D), are assimilated with an ensemble Kalman filter (EnKF) technique into a nonhydrostatic, compressible numerical weather prediction model to analyze the evolution of the 4 May 2007 Greensburg, Kansas, tornadic supercell. The storm is simulated via assimilation of reflectivity and velocity data in an initially horizontally homogeneous environment whose parameters are believed to be a close approximation to those of the Greensburg supercell inflow sector. Experiments are conducted to test analysis sensitivity to mobile radar data availability and to the mean environmental near-surface wind profile, which was changing rapidly during the simulation period. In all experiments, a supercell with location and evolution similar to the observed storm is analyzed, but the simulated storm's characteristics differ markedly. The assimilation of mobile Doppler radar data has a much greater impact on the resulting analyses, particularly at low altitudes (≤2 km), than modifications to the near-surface environmental wind profile. Differences in the analyzed updrafts, vortices, cold pool structure, rear-flank gust front structure, and observation-space diagnostics are documented. An analyzed vortex corresponding to the enhanced Fujita scale 5 (EF5) Greensburg tornado is stronger and deeper in experiments in which mobile (higher resolution) Doppler radar data are included in the assimilation. This difference is linked to stronger analyzed horizontal convergence, which in turn is associated with increased stretching of vertical vorticity. Changing the near-surface wind profile appears to impact primarily the updraft strength, the availability of streamwise vorticity for tilting into the vertical, and low-level vortex strength and longevity. © 2013 American Meteorological Society.
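To make the assimilation step concrete, here is a minimal perturbed-observation EnKF update for a single scalar observation, such as one radial velocity datum. The real system assimilates many radar observations into a full NWP model with covariance localization; this sketch only shows the core Kalman update.

```python
# Minimal perturbed-observation EnKF analysis update for one scalar
# observation (illustrative; no localization, no model integration).
import numpy as np

def enkf_update_scalar(X, hx, y_ob, ob_var, rng):
    """X     : state ensemble, shape (n_state, n_members)
       hx    : observation-space ensemble H(x), shape (n_members,)
       y_ob  : observed value
       ob_var: observation error variance"""
    n = X.shape[1]
    x_pert = X - X.mean(axis=1, keepdims=True)
    hx_pert = hx - hx.mean()
    cov_xh = x_pert @ hx_pert / (n - 1)          # Cov(x, H(x))
    var_h = hx_pert @ hx_pert / (n - 1)          # Var(H(x))
    K = cov_xh / (var_h + ob_var)                # Kalman gain
    y_perturbed = y_ob + rng.normal(0.0, np.sqrt(ob_var), n)
    return X + np.outer(K, y_perturbed - hx)     # analysis ensemble

rng = np.random.default_rng(1)
X = rng.standard_normal((10, 40))                # 10 state vars, 40 members
analysis = enkf_update_scalar(X, X[3] * 0.9, y_ob=1.2, ob_var=4.0, rng=rng)
```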


Jones T.A., Earth System Science Center | Jones T.A., Cooperative Institute for Mesoscale Meteorological Studies | Christopher S.A., Earth System Science Center
Atmospheric Chemistry and Physics | Year: 2011

Using daily Goddard Chemistry Aerosol Radiation and Transport (GOCART) model simulations and columnar retrievals of 0.55 μm aerosol optical thickness (AOT) and fine mode fraction (FMF) from the Moderate Resolution Imaging Spectroradiometer (MODIS), we estimate the contributions of black carbon (BC), organic carbon (OC), dust (DU), sea salt (SS), and sulfate (SU) components to satellite-derived aerosol properties over the global oceans between June 2006 and May 2007. Using Aqua-MODIS aerosol properties embedded in the CERES-SSF product, we find that the mean MODIS FMF values for each aerosol type are SS: 0.31 ± 0.09, DU: 0.49 ± 0.13, SU: 0.77 ± 0.16, and (BC + OC): 0.80 ± 0.16. We further combine information from the ultraviolet spectrum using the Ozone Monitoring Instrument (OMI) onboard the Aura satellite to improve the classification process, since dust and carbonaceous aerosols have positive Aerosol Index (AI) values >0.5 while other aerosol types have near-zero values. By combining MODIS and OMI datasets, we were able to identify and remove data in the SU, OC, and BC regions that were not associated with those aerosol types. The same methods used to estimate aerosol size characteristics from MODIS data within the CERES-SSF product were applied to Level 2 (L2) MODIS aerosol data from both the Terra and Aqua satellites for the same time period. As expected, FMF estimates from L2 Aqua data agreed well with the CERES-SSF dataset from Aqua. However, the FMF estimate for DU from Terra data was significantly lower (0.37 vs. 0.49), indicating that sensor calibration, sampling differences, and/or diurnal changes in DU aerosol size characteristics were occurring. Differences for other aerosol types were generally smaller. Sensitivity studies show that a difference of 0.1 in the estimate of the anthropogenic component of FMF produces a corresponding change of 0.2 in the anthropogenic component of AOT (assuming a unit value of AOT). This uncertainty would then be passed along to any satellite-derived estimates of anthropogenic aerosol radiative effects. © 2011 Author(s).
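The classification logic described above can be sketched as a simple per-pixel threshold test combining MODIS FMF with OMI AI. The FMF cutoffs below are illustrative assumptions chosen to separate the mean values quoted above, not the paper's exact screening criteria.

```python
# Crude per-pixel aerosol typing from FMF (0-1) and OMI AI, using the
# AI > 0.5 absorbing-aerosol test described above. FMF cutoffs (0.65,
# 0.45) are assumptions for illustration only.
def classify_aerosol(fmf, ai):
    if ai > 0.5:                               # absorbing: dust or smoke
        return "dust" if fmf < 0.65 else "carbonaceous (BC+OC)"
    return "sea salt" if fmf < 0.45 else "sulfate"

pixels = [(0.49, 1.2), (0.82, 1.0), (0.31, 0.0), (0.78, 0.1)]
for fmf, ai in pixels:
    print(f"FMF={fmf:.2f}, AI={ai:.1f} -> {classify_aerosol(fmf, ai)}")
```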


Argyle E.M., Cooperative Institute for Mesoscale Meteorological Studies | Ling C., University of Akron | Gourley J.J., National Oceanic and Atmospheric Administration
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2015

Flash flooding can be difficult to predict using traditional, rainfall threshold-based approaches. New initiatives like the Flooded Locations and Simulated Hydrographs (FLASH) project provide real-time information by using rainfall observations to force distributed hydrologic models that predict flash flooding events. However, to address the goal of creating a Weather-Ready Nation, system designers must not only possess tools that relay useful information; such tools must also communicate environmental threats to stakeholders in a clear, easy-to-use interface. Where previous research has addressed the performance of forecasting models, the present study uses a human factors approach to enhance FLASH's ability to present information to decision-makers (i.e., forecasters). © Springer International Publishing Switzerland 2015.
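For readers unfamiliar with forcing a hydrologic model with rainfall, a toy example follows: a single linear-reservoir "bucket" stepped forward with radar rainfall rates. FLASH runs distributed models over full grids; this scalar version, with an assumed rate constant k, only illustrates the forcing concept.

```python
# Toy linear-reservoir "bucket" model forced by a rainfall-rate series.
# k and dt are assumed values; not the FLASH model.
def run_bucket(rain_mm_per_hr, dt_hr=5 / 60, k=0.3):
    """k: outflow rate constant (per hour); returns discharge series (mm/hr)."""
    storage, discharge = 0.0, []
    for r in rain_mm_per_hr:          # one radar rainfall rate per time step
        storage += r * dt_hr          # add rainfall depth over the step
        q = k * storage               # linear-reservoir outflow
        storage -= q * dt_hr
        discharge.append(q)
    return discharge

print(run_bucket([0, 20, 60, 40, 5, 0, 0])[-1])
```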


Kirstetter P.-E., University of Oklahoma | Kirstetter P.-E., National Oceanic and Atmospheric Administration | Kirstetter P.-E., U.S. National Center for Atmospheric Research | Hong Y., University of Oklahoma | And 11 more authors.
Journal of Hydrometeorology | Year: 2012

Characterization of the error associated with satellite rainfall estimates is a necessary component of deterministic and probabilistic frameworks involving spaceborne passive and active microwave measurements for applications ranging from water budget studies to forecasting natural hazards related to extreme rainfall events. The authors focus here on the error structure of NASA's Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar (PR) quantitative precipitation estimation (QPE) at the ground. The problem is addressed by comparison of PR QPEs with reference values derived from ground-based measurements using the NOAA/NSSL ground radar-based National Mosaic and QPE system (NMQ/Q2). A preliminary investigation of this subject has been carried out at the PR estimation scale (instantaneous and 5 km) using a 3-month data sample over the southern part of the United States. The primary contribution of this study is the presentation of the detailed steps required to derive a trustworthy reference rainfall dataset from Q2 at the PR pixel resolution. It relies on a bias correction and a radar quality index, both of which provide a basis to filter out the less trustworthy Q2 values. Several aspects of PR error are revealed and quantified, including sensitivity to the processing steps used to derive the reference rainfall, comparisons of rainfall detectability and rainfall-rate distributions, spatial representativeness of error, and separation of systematic biases and random errors. The methodology and framework developed herein apply more generally to rainfall-rate estimates from other sensors onboard low-Earth-orbiting satellites, such as the microwave imagers and dual-wavelength radars of the Global Precipitation Measurement (GPM) mission. © 2012 American Meteorological Society.
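A hedged sketch of the two screening steps described above, with an assumed quality-index cutoff and a simple mean multiplicative bias estimator; the study's actual procedure is more elaborate.

```python
# Build a reference rainfall set: discard Q2 pixels with a low radar
# quality index, then remove the mean multiplicative bias against
# trusted gauge values. Threshold and estimator are illustrative.
import numpy as np

def build_reference(q2_rain, quality_index, gauge_rain, q_min=0.8):
    good = quality_index >= q_min                    # keep trustworthy pixels
    q2 = q2_rain[good]
    bias = np.sum(gauge_rain[good]) / np.sum(q2)     # mean multiplicative bias
    return q2 * bias                                 # bias-corrected reference

rng = np.random.default_rng(2)
q2 = rng.gamma(2.0, 2.0, 500)                        # synthetic Q2 rates
qi = rng.uniform(0.0, 1.0, 500)                      # synthetic quality index
gauges = q2 * 1.15 + rng.normal(0.0, 0.2, 500)       # synthetic gauge truth
ref = build_reference(q2, qi, gauges)
```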


Amy McGovern, a computer scientist at the University of Oklahoma, has been studying tornadoes, nature's most violent storms, for eight years. She uses computational thinking to help understand and solve these scientific problems. Computational thinking is a way of solving problems, designing systems, and understanding human behavior that draws on concepts fundamental to computer science. In science and engineering, computational thinking is an essential part of the way people think about and understand the world.

"Since 2008, we've been trying to understand the formation of tornadoes, what causes tornadoes, and why some storms generate tornadoes and other storms don't," McGovern said. "Weather is a real-world application where we can really make a difference to people."

She wants to find solutions that are useful. Specifically, she is trying to identify precursors of tornadoes in supercell simulations by generating high-resolution simulations of these thunderstorms. Supercell storms, sometimes referred to as rotating thunderstorms, are a special kind of single-cell thunderstorm that can persist for many hours. They are responsible for nearly all of the significant tornadoes produced in the U.S. and for most of the hailstones larger than golf balls. McGovern would like to generate as many as 100 different supercell simulations during this project. In addition to high-resolution simulations, McGovern is also using a combination of data mining and visualization techniques as she explores the factors that separate tornado formation from tornado failure.

Studying tornadoes and violent weather comes with a high learning curve, as it requires the application of science and technology to predict the state of the atmosphere for a given location. When McGovern first started the research with a National Science Foundation (NSF) CAREER grant, she had to attend several classes so that she would understand more about meteorology, the interdisciplinary scientific study of the atmosphere. She worked closely with meteorology students who taught her about the atmosphere, and she, in turn, taught them about computer science. They went back and forth until they understood each other. The early research generated by the NSF CAREER grant resulted in the development of data mining software and initial techniques on lower-resolution simulations.

"Now, we're trying to make high-resolution simulations of supercell storms, or tornadoes," McGovern said. "What we get with the simulations are the fundamental variables at whatever our resolution is; we've been doing 100 meter x 100 meter cubes, and there's no way you can get that kind of data without doing simulations. We're getting the fundamental variables like pressure, temperature, and wind, and we're doing that for a lot of storms, some of which will generate tornadoes and some that won't. The idea is to do data mining and visualization to figure out what the difference is between the two."

Corey Potvin, a research scientist with the OU Cooperative Institute for Mesoscale Meteorological Studies and the NOAA National Severe Storms Laboratory, said: "I knew nothing about data mining until I started working with Amy on this project. I've enjoyed learning about the data mining techniques from Amy, and in turn teaching her about current understanding of tornadogenesis. It's a very fun and rewarding process. Both topics are so deep that you really need experts in both fields to tackle this problem successfully."
The research process requires five steps: 1) running the simulations; 2) post-processing the simulations to merge the data; 3) visualizing the data (to ensure the simulation was valid); 4) extracting the metadata; and 5) data mining (discovering patterns or new knowledge in very large data sets).

McGovern's research is related to the National Oceanic and Atmospheric Administration's (NOAA) Warn-on-Forecast program, tasked with increasing tornado, severe thunderstorm, and flash flood warning lead times to reduce loss of life, injury, and damage to the economy. NOAA believes the current yearly-averaged tornado lead times are reaching a plateau, and a new approach is needed. "My ideal goal would be to find something that no one has thought of...discovering new science," McGovern said.

According to the National Weather Service, on average, nearly three out of every four tornado warnings issued are false alarms. How do we reduce the false alarm rate for tornado prediction and increase warning lead time? This is a question that McGovern asks on a continual basis. Right now the lead time is about 15 minutes on average for every tornado, but McGovern and her team want to be able to predict tornadoes further in advance. They're trying to do this by issuing warnings based on probabilities from the weather forecast rather than issuing warnings based on weather that is already about to form. "Once the weather is already starting to form, you won't get a two-hour lead time," McGovern said.

How Is the Extreme Science and Engineering Discovery Environment (XSEDE) Helping?

When asked about XSEDE, McGovern replied: "XSEDE is fabulous. We've been using XSEDE resources for years. I started out with the resources at my university and then quickly outgrew what they had. They pointed me to XSEDE. I started out at NICS using Darter, and when that went away, I started using Stampede at TACC. These resources are fundamental...you can't do this kind of data mining on your PC."

Stampede is one of the largest, most capable high-performance computing (HPC) systems in the world. McGovern says she is one of the few people using HPC and data mining for severe weather. In addition to using XSEDE's Stampede for high-resolution simulations, McGovern is taking advantage of XSEDE's Extended Collaborative Support Services (ECSS) program. McGovern has been working with Greg Foss at the Texas Advanced Computing Center (TACC) for visualization expertise. ECSS experts like Foss are available for collaborations lasting months to a year to help researchers fundamentally advance their use of XSEDE resources. Foss is an expert in scientific visualization, a field devoted to graphically illustrating scientific data to enable scientists and non-scientists to understand, illustrate, and glean insight from their data.

"Greg comes at the problem from a completely different perspective, and provides new ways of looking at the data that you wouldn't have thought of in the beginning," McGovern noted. "Once you get into a domain, it's easy to think, 'This is the only way to look at it,' but then someone else comes along and asks, 'Why are you doing it like that?'"

Serving as a bridge between science research and the lay person, Foss says he enjoys working through XSEDE and highlighting the value and validity of the program. "I believe in our mission and I believe in the visualization field. It's quite a sense of accomplishment to help our users and even be a part of the science."
For this project, Foss says that he's learned more about all of the aspects of weather than in any of his six past weather projects. "Ultimately, we're trying to discover if a 3-D visualization approach will be a useful data mining tool for violent weather testbeds," Foss said.

Foss, McGovern, and Potvin are working together to find storm features (objects) in the tornado simulations. Weather simulations must capture as much of the state of the atmosphere at each time step as possible, and this results in a tremendous amount of data. For example, in one of their first simulated data sets, McGovern had to sort through 352 million data points per time step.

"Since you can't save all of this data or mine it, you try to find the high-level features, because you don't want to do that for 100 simulations by hand, which is the traditional method of studying storms in meteorology. There's no way you can take traditional analysis techniques to that set and find an answer. Data mining is designed to help us sift through that data set and find statistical patterns that are causing tornadoes. The simulations are run in Fortran, the post-processing is in Python, and the data mining is in Java or Python," she said.

For Foss, this project is unique. "Instead of designing an interesting way to present the storm event, we're applying visualization techniques, ideas, and training to data mining. We're using the process to explore ways to identify individual storm features and characteristics that wouldn't be found with other data mining methods," Foss said.

McGovern built the first dataset using Darter (decommissioned as of June 2016) at the National Institute for Computational Sciences. The beginning of the process was writing out different variables and transferring this huge dataset (approx. 5.7 TB from one simulation) to TACC. Then, Foss recruited another TACC visualization specialist, Greg Abram, to program a custom data reader for ParaView, and the visualization work could commence. "ParaView is the software I've been using the most at TACC," Foss said. "Once I get the data and can read it correctly, my goal is to build scenes from different variables, looking for critical values and viewpoints that the researchers confirm are accurate and useful, and hopefully end up with something visually interesting."

There are variables expected in a storm simulation, such as velocity (direction, distance). "This is the first project where I used velocity's vertical component to model strong updrafts, key indicators of violent weather. The visualization process isn't just building models; it's allowing a viewer to see the data, so it's important that a scene communicates accurately and doesn't mislead or confuse," Foss said.

"Visualizing these datasets is important because they are extremely complex, making it difficult to pull out the most physically important information, in this case, processes contributing to tornadogenesis," Potvin said. "The graphics will be used to help develop the storm feature (object) definitions for the data mining, to ensure automatically extracted objects match those identified visually and subjectively, and to develop definitions for new objects."

Potvin continued, "Using the visualizations to guide our object definitions is critical. We need the data mining to 'know' about the storm features that most matter to whether a tornado forms so that it can tell us about the relationships between those features and tornadogenesis."
"My primary role in the project is to use my familiarity with current conceptual models of these storms to help isolate features known to be important to tornadoes, and to guide identification of features that aren't typically examined in simulations but that may actually play an important role in tornadogenesis," Potvin added.

So far, Foss's visualization work has identified six weather features, data mining 'objects' that can potentially be used to learn about tornadoes and violent weather: hook echoes, bounded weak echo regions, updrafts, cold pools, helicity with regions of strong vorticity, and vertical pressure perturbation gradients. The results came from Foss exploring variables by experimenting with different values and various models, and looking for consistent patterns and interesting structures over the life of the simulated storm.

The goal is to compare these simulated storms with real storms. In real weather, you can't actually see these objects; an updraft, for example, is invisible. The simulated data sets turn the tornado or storm into what would be actual objects.

"Greg's visualizations are of much higher quality than what meteorologists typically use, since most of us lack the skills and computational resources," Potvin said. "The four-dimensionality and high resolution provide a much fuller perspective on how storms and tornadoes evolve. I've been studying these storms for over a decade, and these visualizations have changed the way I think about them."

In summary, the XSEDE ECSS team investigated computationally intensive datasets using 3-D visualization techniques and an interactive user interface with data mining to identify tornadogenesis precursors in supercell thunderstorm simulations. The results will assist in defining the storm features extracted and input to the data mining, ensuring that automatically extracted objects match visually identified ones. "We strongly believe that using data science methods will enable us to discover new knowledge about the formation of tornadoes," McGovern concluded.
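As a closing illustration of what the automated object-extraction step might look like, the sketch below thresholds the vertical wind component of a simulated storm volume and labels connected regions as candidate updraft objects. The threshold, array layout, and random test data are assumptions; the project's real object definitions are developed from the visualizations.

```python
# Extract "updraft" objects from a simulated storm volume by
# thresholding vertical velocity and labeling connected regions.
# Threshold and test data are illustrative assumptions.
import numpy as np
from scipy import ndimage

def extract_updraft_objects(w, threshold=10.0):
    """w: 3-D vertical velocity (m/s), shape (nz, ny, nx)."""
    mask = w > threshold                         # strong-updraft voxels
    labels, n_objects = ndimage.label(mask)      # connected-component labeling
    sizes = ndimage.sum(mask, labels, index=range(1, n_objects + 1))
    return labels, sizes                         # object map and voxel counts

rng = np.random.default_rng(3)
labels, sizes = extract_updraft_objects(rng.normal(2.0, 6.0, (40, 100, 100)))
print(f"found {len(sizes)} candidate updraft objects")
```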
