DWD
Lindenberg, Germany

Yano J.-I., Meteo-France | Soares P.M.M., University of Lisbon | Kohler M., DWD | Deluca A., Max Planck Institute for the Physics of Complex Systems
Bulletin of the American Meteorological Society | Year: 2015

Forty-three scientists from 13 European countries, Israel, and the United States met to synthesize the breadth of convective parameterization issues at the Workshop on Concepts for Convective Parameterizations in Large-Scale Models VII: Operations and Fundamentals. Numerical weather forecasts, in which convective parameterization is an indispensable component, are performed with the end users' needs in mind. However, better satisfaction of the end users does not necessarily ensure the overall quality of the forecasts: to make the variables that users need truly reliable, a reliable prediction of surface pressure must first be ensured. Though tuning may easily improve the forecast skill for such secondary variables, it often comes at the cost of deterioration in other aspects of the forecast, especially the climatological state of the model. A qualitatively different measure must be introduced in order to better quantify this aspect of forecast error.


Yano J.-I., Meteo-France | MacHulskaya E., DWD | Bechtold P., ECMWF | Plant R.S., University of Reading
Bulletin of the American Meteorological Society | Year: 2013

The fifth workshop in the annual series entitled 'Concepts for Convective Parameterizations in Large-Scale Models' was held in 2012. The purpose of the workshop series has been to discuss the fundamental theoretical issues of convection parameterization with a small number of European scientists, and it was funded by the European Cooperation in Science and Technology (COST) Action ES0905. The theme of the 2012 workshop was decided from a main conclusion of the earlier workshop: it focused on the convective organization problem, seeking means for implementing such effects in convection parameterizations. The participants were informed that the inclusion of mid-troposphere humidity sensitivities into entrainment and detrainment formulations had contributed substantially to model improvements, in particular to a significantly better prediction of the Madden-Julian oscillation (MJO).
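One schematic way in which such a humidity sensitivity is often expressed, shown here purely as an illustration and not as the specific formulation referred to above, is a fractional entrainment rate that increases as the mid-tropospheric environment dries:

    \varepsilon(z) = \varepsilon_0 \left[ 1 + \alpha \left( 1 - \mathrm{RH}(z) \right) \right]

where \varepsilon_0 is a reference entrainment rate, RH(z) is the environmental relative humidity, and \alpha is a tunable sensitivity parameter; detrainment formulations can be given an analogous humidity dependence.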


News Article
Site: www.scientificcomputing.com

The supercomputer at the Deutsches Klimarechenzentrum (DKRZ), the German Climate Computing Center, is ranked among the largest systems employed for scientific computing. On October 5, 2015, Germany enhanced its leadership in climate research with the inauguration of Mistral, a state-of-the-art HPC system and one of the world's most efficient supercomputers. The Mistral HPC system is 20 times faster than its predecessor and features a high-capacity storage system to house the extensive climate simulation data archive managed by DKRZ. Using the Mistral system and high performance computing (HPC) tools will keep DKRZ at the forefront in supporting scientific and climate modeling research.

Scientists conduct leading-edge climate research and are able to simulate anthropogenic influences on the climate system, which includes cloud research. Clouds and precipitation strongly influence atmospheric radiation and are critical for life on Earth. The scale of clouds spans from a micrometer, the size of a single cloud particle, to hundreds of kilometers, the dimension of a frontal system. Researchers have to resolve this entire range of scales, which makes exact modeling of clouds and precipitation physically difficult and extremely resource-consuming in terms of computer time and storage space. Climate modeling research therefore requires supercomputers that combine the power of thousands of computers with HPC tools to simulate complex climate models and research problems.

Mistral is used in the High Definition Clouds and Precipitation for Climate Prediction (HD(CP)2) project, which integrates cloud formation and precipitation processes into atmospheric simulations to better understand and research clouds and cloud-related processes. The project uses cloud-resolving modeling to determine cloud formations in central Europe. According to Professor Thomas Ludwig, Director of the German Climate Computing Center, "The unique characteristic of HD(CP)2 is to develop a cloud resolving LES version (Large Eddy Simulation) of the ICON model (Icosahedral non-hydrostatic general circulation model, a joint development of the German Weather Service DWD and the Max Planck Institute for Meteorology, see e.g. www.mpimet.mpg.de/en/science/models/icon.html) in order to explicitly simulate cloud and precipitation processes. The model region is centered on Germany using a grid with a resolution of 10,000 x 10,000 x 400 grid elements and a grid spacing of 100 m (www.hdcp2.eu). Such simulations are computationally very intensive and the necessary computing power can be found only on massively parallel computing platforms. In order to achieve this, DKRZ performed a major refactoring of the ICON model."

Figure 2 shows a visualization of the simulated cloud water content for one time step with about 3.5 billion cells per time step (22.5 million cells per slice on 150 levels). The data is resampled on the fly from the unstructured ICON grid onto a regular Cartesian grid with a downsampling of 1/10. The ICON simulation was performed using over 400 nodes of Mistral, while the visualization was done using the Vapor software on a single GPU node of the system, consuming over 200 GB of main memory.
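To put these numbers in perspective, the following back-of-envelope sketch (plain Python) uses the cell counts quoted above; the assumption of one 8-byte double-precision value per cell is illustrative and not taken from the ICON or Vapor configuration.

```python
# Back-of-envelope estimate of the HD(CP)2 ICON-LES grid size and the memory
# footprint of a single 3D field. The grid (10,000 x 10,000 x 400 cells) and
# the visualization grid (22.5 million cells per slice on 150 levels) are
# quoted from the article; everything else is an illustrative assumption.

BYTES_PER_VALUE = 8  # assume one double-precision (64-bit) value per cell

# Full model grid as described by Ludwig
full_cells = 10_000 * 10_000 * 400           # = 4.0e10 cells
full_field_bytes = full_cells * BYTES_PER_VALUE

# Visualization grid: 22.5 million cells per slice on 150 levels
vis_cells = 22_500_000 * 150                 # ~3.4e9 cells ("about 3.5 billion")
vis_field_bytes = vis_cells * BYTES_PER_VALUE

def fmt(num_bytes):
    """Format a byte count in binary giga-/terabytes for readability."""
    gib = num_bytes / 2**30
    return f"{gib:,.0f} GiB" if gib < 1024 else f"{gib / 1024:,.1f} TiB"

print(f"full grid:          {full_cells:.2e} cells, one field ~ {fmt(full_field_bytes)}")
print(f"visualization grid: {vis_cells:.2e} cells, one field ~ {fmt(vis_field_bytes)}")
# One field on the visualization grid is roughly 25 GiB, which is consistent
# with a handful of variables plus working copies filling the ~200 GB reported
# for the GPU visualization node.
```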
"The ability to conduct this level of cloud and atmospheric research requires the use of a state-of-the-art HPC system. Using Mistral and HPC tools allows DKRZ to run new processes and ensemble members as well as see clouds or local climate at a higher resolution. Per core, we see a performance improvement of our models between 1.8 and 2.6 using one Intel Xeon processor core, as compared to one 4.7 GHz IBM Power6 core. In times where scientists expect performance gains only through scaling, this is a welcome advancement," Ludwig states.

Mistral, the new High Performance Computer System for Earth System Research (HLRE-3), is from the French company Bull, which was purchased by Atos in 2014. Mistral replaces the IBM Power6 system named Blizzard, which had been in operation at DKRZ since 2009. The Mistral supercomputer is being installed in two stages: Phase 1 began in June 2015, and the second stage of the expansion is scheduled for summer 2016. In parallel with the Phase 1 installation, DKRZ users had access to a small test system with 432 Intel Xeon processor cores and a 300 TByte Lustre file system from Xyratex/Seagate for the purpose of preparing climate models for the new architecture. During the testing, DKRZ provided training classes on how to use the new system in the areas of debugging, machine usage and visualization tools.

The Mistral system consists of computer components by Bull, a disk storage system by Xyratex/Seagate and high performance network switches by Mellanox. These components are distributed over 41 racks, each weighing a metric ton or more, which are connected by bundles of fiber fabric. The Phase 1 Mistral supercomputer has about 1,500 compute nodes based on the bullx 700 DLC system, each with two 12-core Intel Xeon E5-2680 v3 processors (for a total of 36,000 cores); the system and racks deploy hot-liquid direct cooling. The Mistral Intel processor-based system allows an inlet cooling liquid temperature of 40 degrees centigrade. The hot liquid heats up to 50 degrees centigrade and is piped to the roof for cooling by fans only. "This means that all the racks that have the hot liquid cooling do not require additional expensive chillers, as the temperature on the roof in Hamburg almost never exceeds 40 degrees," states Ludwig.
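A quick sanity check of the core count above, plus a rough peak-performance estimate; the clock rate and per-core FLOP rate used here are assumptions typical for this processor generation, not figures from the article.

```python
# Sanity check of the Mistral Phase 1 core count and a rough peak-performance
# estimate. Node and core counts are quoted from the article; the clock rate
# (2.5 GHz nominal for the Xeon E5-2680 v3) and 16 double-precision FLOPs per
# cycle per core (AVX2 with FMA) are assumptions, not article figures.

nodes = 1_500                 # "about 1,500 compute nodes"
sockets_per_node = 2
cores_per_socket = 12

total_cores = nodes * sockets_per_node * cores_per_socket
print(f"total cores: {total_cores:,}")        # 36,000, matching the article

clock_hz = 2.5e9              # assumed nominal clock
flops_per_cycle = 16          # assumed: 4 doubles (AVX2) x 2 (FMA) x 2 units

peak_flops = total_cores * clock_hz * flops_per_cycle
print(f"estimated peak: {peak_flops / 1e15:.2f} PFlops")
# ~1.4 PFlops for Phase 1, consistent with the article's statement that the
# 2016 expansion roughly doubles compute capacity toward a 3 PFlops peak.
```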
Mistral provides 24 high-end visualization nodes equipped with powerful graphics processors and 100 further nodes for pre- and post-processing and analysis of data. All components are connected with each other via optical cables and can directly access the shared file system. This means that results computed on the supercomputer can be analyzed directly on the data visualization nodes.

DKRZ does not conduct climate research itself but supports climate modeling and related scientific research. Ludwig indicates, "We participate in various infrastructure and research projects with the aim to support the climate scientists in all aspects of their work in our HPC environment. DKRZ departments support scientists in model parallelization and optimization of the code, data management, storage, data compression, analysis and visualization, help with libraries, improving I/O as well as quality assurance and archiving of data."

DKRZ uses the Allinea DDT debugging tool, Vampir and Intel VTune as performance tuning tools, and Vapor as the visualization tool. The DKRZ staff creates customized in-house tools to help with issues such as data compression, scalability and visualization in the parallel climate simulations and research models. There is close cooperation with Ludwig's research group at the University of Hamburg; in fact, his chair for Scientific Computing has its offices in the DKRZ building. A group of 10 researchers focuses on file system and storage issues and on energy efficiency for HPC.

In the Mistral Phase 1 system, DKRZ uses a 20 PByte Lustre file system based on a Xyratex/Seagate CS9000 system with a bandwidth in excess of 150 GB/s and metadata performance that outperforms its competitors. The Lustre file system will be expanded to 50 PBytes and 430 GB/s in 2016 as the Mistral system expands. According to Ludwig, "In addition to supporting our users to efficiently utilize the supercomputer, we engage in joint projects to enable new science on the current and future systems. Since our users run a large diversity of different models on our system, DKRZ also develops universally usable libraries to facilitate scalable parallel models (YAXT) and make better use of the available storage capacity through data compression (libAEC)."

In addition to these services, DKRZ manages the world's largest climate simulation data archive, which is used by researchers worldwide. The archive currently contains more than 40 PBytes of data and is projected to grow by 75 PBytes annually over the next five years. "There is a growing gap between the ability of HPC systems to generate large amounts of data and the cost of the storage needed to hold this data. DKRZ estimates we are currently spending 25 percent of our investment budget, as well as the electricity expenses, on storage, and we expect this gap to increase for the climate modeling data created in the future. The widening gap between compute capabilities and storage means we need to shift some focus to how to maximize storage if we want to keep the balance in the ability to store all the data being generated."

"Lustre as a file system gets constantly increasing support from major vendors and from the computer science community," Ludwig said. "We are confident that emerging requirements will be picked up quickly and solutions can be provided promptly."
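The quoted capacities and bandwidths allow a simple illustrative calculation; the assumption of sustained writing at the nominal file-system bandwidth is purely for illustration.

```python
# Illustrative arithmetic on the storage figures quoted in the article:
# 20 PB Lustre at >150 GB/s (Phase 1), 50 PB at 430 GB/s (2016 expansion),
# and an archive of 40 PB growing by 75 PB per year for five years.
# PB and GB are treated as decimal units (1 PB = 1e15 bytes, 1 GB = 1e9 bytes).

SECONDS_PER_DAY = 86_400

def days_to_fill(capacity_pb, bandwidth_gbs):
    """Days of sustained writing at the given bandwidth to fill the capacity."""
    return capacity_pb * 1e15 / (bandwidth_gbs * 1e9) / SECONDS_PER_DAY

print(f"Phase 1 Lustre (20 PB at 150 GB/s):  ~{days_to_fill(20, 150):.1f} days to fill")
print(f"Expanded Lustre (50 PB at 430 GB/s): ~{days_to_fill(50, 430):.1f} days to fill")

# Archive growth: 40 PB today plus 75 PB per year over five years.
archive_pb = 40 + 75 * 5
print(f"Projected archive size after five years: ~{archive_pb} PB")
```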
DKRZ supports CMIPs and the Intergovernmental Panel on Climate Change (IPCC) Research

DKRZ performs simulations for the research community, such as the Coupled Model Intercomparison Projects (CMIPs), which build the foundation for the findings presented in the IPCC reports. Climate modelers in Germany worked on the IPCC project, performing calculations on the DKRZ computer with an Earth system model from the Max Planck Institute for Meteorology that also simulated the carbon cycle. Ludwig indicates, "We stored approximately 2 PBytes of CMIP5 data on the Mistral machine from DKRZ and international centers. Planning is going on for how much data DKRZ will receive for the next CMIP6 project; it is expected there will be 20 to 50 times more data. DKRZ expects to begin computations for the next IPCC report in 2016. The German climate model data contribution for publications that will be included in the next IPCC assessment report will start to be released in 2016 and will be computed exclusively on the expanded Mistral machine. The DKRZ Center has extensive experience in computations and data dissemination for the IPCC report, and with the new Mistral system, we have a powerful computer and storage system to host at least all the computations that will be conducted on the German side, and probably more." Installation of the expanded Mistral system is predicted to start in February 2016 and will also use the Bull direct hot liquid cooling employed in the Mistral Phase 1 system.

According to Helena Liebelt, Intel Business Development Manager, "The second phase of the Mistral HLRE-3 system is planned to be available in summer 2016. The expanded Mistral system will have more than 3,000 computing nodes and more than 68,000 cores. This extension will roughly double computing and disk storage capacity. With a peak performance of 3 PFlops and a 50 PByte parallel file system, scientists can improve the regional resolution, account for more processes in the Earth system models or reduce uncertainties in climate projections." Ludwig estimates the expanded Mistral system will place in the top 100 of the June 2016 TOP500 list and among the top five in file system capabilities, making Mistral one of the top HPC systems worldwide for storage. While DKRZ uses Seagate Lustre in the production environment of Mistral, Ludwig's research group became a member of Intel's Parallel Processing Center for Lustre (IPCC-L) and will conduct research on data compression mechanisms. The group is also using Intel Xeon Phi coprocessors and graphics processing units (GPUs) in its test environment.

How HPC will aid climate modeling in the future

Professor Ludwig indicates that computer scientists face a number of challenges in climate modeling, including "the growing number of cores and the fact that parallelization is becoming more complicated due to multiple runs of climate simulations which are mathematically non-linear. Memory bandwidth is always a problem, because climate modeling applications are memory intensive. The ability to modify code to take advantage of HPC parallelization and optimization is a problem because of legacy code and not enough software engineers to adapt codes. Growing energy requirements may become a limitation in providing more computational power to future climate models. In addition, I/O bandwidth and storage capacity growth may be even harder to maintain. Science is looking to computer scientists to develop software that can handle the huge number of computing elements."

As supercomputers such as Mistral and HPC tools advance, it will be possible to create finer grids and more grid cells, which will provide higher-resolution climate information. The German government has funded a project called PalMod that takes the opposite approach and uses a coarse grid for a very long simulated time period. It seeks to apply today's climate models to 135,000 years of data going back to the ice age. The hope is that this will allow researchers to recompute climate data to see how well current climate models reproduce past climate changes, and thereby to better predict climate changes in the future. DKRZ will be involved in supporting PalMod. However, today's many- and multi-core processor architectures will probably not be sufficient to achieve the desired performance: a challenge to be addressed jointly by DKRZ and industry.
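Two illustrative calculations based on the figures above: scaling the quoted 2 PBytes of CMIP5 data by the expected factor of 20 to 50 for CMIP6, and a purely hypothetical throughput figure (simulated years per wall-clock day) to convey why a 135,000-year simulation such as PalMod forces a coarse grid. The throughput value is an assumption for illustration only, not a PalMod or DKRZ number.

```python
# Illustrative estimates only.
# (1) CMIP6 data volume at DKRZ, scaling the ~2 PB of CMIP5 data by the
#     factor of 20-50 expected in the article.
# (2) Wall-clock time for a 135,000-year simulation (PalMod) at an assumed
#     model throughput; 100 simulated years per day is purely hypothetical
#     and only illustrates why coarse grids are required.

cmip5_pb = 2
low, high = cmip5_pb * 20, cmip5_pb * 50
print(f"Expected CMIP6 data volume: roughly {low}-{high} PB")

simulated_years = 135_000
assumed_years_per_day = 100          # hypothetical throughput (SYPD)
wallclock_days = simulated_years / assumed_years_per_day
print(f"At {assumed_years_per_day} simulated years/day: "
      f"~{wallclock_days:,.0f} days (~{wallclock_days / 365:.1f} years) of computing")
```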
"DKRZ is the link between hardware vendors, solution providers and the climate research community. Its vision is to make the potential of accelerating technical progress reliably accessible to climate research. We closely follow technological trends and are in permanent contact with companies such as processor producers. At the same time, we participate in climate research projects to learn about the future resources that will be necessary for new insights. We translate between these communities and communicate scientific requirement specifications and technical product characteristics. An efficient usage of HPC adds optimal value to the science of climate researchers," states Ludwig.

Linda Barney is the founder and owner of Barney and Associates, a technical/marketing writing, training and web design firm in Beaverton, OR.


Baehr J., University of Hamburg | Frohlich K., DWD | Botzet M., Max Planck Institute for Meteorology | Domeisen D.I.V., University of Hamburg | And 6 more authors.
Climate Dynamics | Year: 2015

A seasonal forecast system is presented, based on the global coupled climate model MPI-ESM as used for CMIP5 simulations. We describe the initialisation of the system and analyse its predictive skill for surface temperature. The presented system is initialised in the atmospheric, oceanic, and sea ice component of the model from reanalysis/observations with full field nudging in all three components. For the initialisation of the ensemble, bred vectors with a vertically varying norm are implemented in the ocean component to generate initial perturbations. In a set of ensemble hindcast simulations, starting each May and November between 1982 and 2010, we analyse the predictive skill. Bias-corrected ensemble forecasts for each start date reproduce the observed surface temperature anomalies at 2–4 months lead time, particularly in the tropics. Niño3.4 sea surface temperature anomalies show a small root-mean-square error and predictive skill up to 6 months. Away from the tropics, predictive skill is mostly limited to the ocean, and to regions which are strongly influenced by ENSO teleconnections. In summary, the presented seasonal prediction system based on a coupled climate model shows predictive skill for surface temperature at seasonal time scales comparable to other seasonal prediction systems using different underlying models and initialisation strategies. As the same model underlying our seasonal prediction system—with a different initialisation—is presently also used for decadal predictions, this is an important step towards seamless seasonal-to-decadal climate predictions. © 2014, Springer-Verlag Berlin Heidelberg.
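The abstract mentions bred vectors for generating ensemble initial perturbations. As a rough illustration of the breeding idea (integrate a perturbed and an unperturbed run, rescale their difference to a fixed amplitude, repeat), here is a toy sketch on the Lorenz-63 system; the model, norm and rescaling amplitude are illustrative choices and are not the MPI-ESM ocean implementation with its vertically varying norm.

```python
# Toy illustration of the bred-vector idea on the Lorenz-63 system.
# This is NOT the MPI-ESM implementation; it only sketches the generic
# breeding cycle: run control and perturbed states forward, then rescale
# their difference back to a fixed amplitude.

import numpy as np

def lorenz63_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 equations (illustrative only)."""
    x, y, z = state
    dxdt = sigma * (y - x)
    dydt = x * (rho - z) - y
    dzdt = x * y - beta * z
    return state + dt * np.array([dxdt, dydt, dzdt])

def breed(initial_state, amplitude=0.1, cycles=20, steps_per_cycle=50, seed=0):
    """Return a bred vector after a number of breeding cycles."""
    rng = np.random.default_rng(seed)
    control = initial_state.astype(float)
    perturbed = control + amplitude * rng.standard_normal(3)
    for _ in range(cycles):
        for _ in range(steps_per_cycle):          # integrate both trajectories
            control = lorenz63_step(control)
            perturbed = lorenz63_step(perturbed)
        diff = perturbed - control
        diff *= amplitude / np.linalg.norm(diff)  # rescale to fixed amplitude
        perturbed = control + diff                # restart the perturbed run
    return diff

bv = breed(np.array([1.0, 1.0, 1.0]))
print("bred vector (toy example):", bv)
```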


Lock S.-J., University of Leeds | Bitzer H.-W., DWD | Coals A., University of Leeds | Gadian A., University of Leeds | Mobbs S., University of Leeds
Monthly Weather Review | Year: 2012

Advances in computing are enabling atmospheric models to operate at increasingly fine resolution, giving rise to more variations in the underlying orography being captured by the model grid. Consequently, high-resolution models must overcome the problems associated with traditional terrain-following approaches of spurious winds and instabilities generated in the vicinity of steep and complex terrain. Cut-cell representations of orography present atmospheric models with an alternative to terrain-following vertical coordinates. This work explores the capabilities of a cut-cell representation of orography for idealized orographically forced flows. The orographic surface is represented within the model by continuous piecewise bilinear surfaces that intersect the regular Cartesian grid creating cut cells. An approximate finite-volume method for use with advection-form governing equations is implemented to solve flows through the resulting irregularly shaped grid boxes. Comparison with a benchmark orographic test case for nonhydrostatic flow shows very good results. Further tests demonstrate the cut-cell method for flow around 3D isolated hills and stably resolving flows over very steep orography. © 2012 American Meteorological Society.
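As a rough illustration of the cut-cell idea (a terrain surface slicing through a regular Cartesian grid leaves partially open cells whose open fractions feed a finite-volume scheme), here is a small 2D sketch that computes open area fractions under a smooth hill; the grid, terrain shape and sub-sampling approach are illustrative choices, not the scheme of Lock et al.

```python
# Illustrative 2D cut-cell computation: for a regular x-z grid and a terrain
# profile h(x), estimate the open (above-terrain) area fraction of every cell.
# Cells with fraction 1 are ordinary, 0 are fully underground, and values in
# between are "cut cells". This is a simplified 2D stand-in for the 3D
# piecewise-bilinear surfaces used in the paper, not the authors' method.

import numpy as np

def terrain(x, height=600.0, half_width=2000.0, center=5000.0):
    """Bell-shaped hill, a common idealized orography profile (illustrative)."""
    return height / (1.0 + ((x - center) / half_width) ** 2)

def open_fractions(nx=20, nz=10, lx=10_000.0, lz=3_000.0, subsamples=50):
    """Open-area fraction of each cell of a regular nx-by-nz grid."""
    dx, dz = lx / nx, lz / nz
    frac = np.zeros((nx, nz))
    for i in range(nx):
        # Sub-sample the terrain height across the cell's width and average
        # the open column height, clipped to the cell's vertical extent.
        xs = np.linspace(i * dx, (i + 1) * dx, subsamples)
        h = terrain(xs)
        for k in range(nz):
            z_bot, z_top = k * dz, (k + 1) * dz
            open_height = np.clip(z_top - np.maximum(h, z_bot), 0.0, dz)
            frac[i, k] = open_height.mean() / dz
    return frac

frac = open_fractions()
cut = np.sum((frac > 0.0) & (frac < 1.0))
print(f"cut cells: {cut} of {frac.size}; sample column fractions: "
      f"{np.round(frac[10], 2)}")
```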
