Hamburg, Germany


Heinze R., Max Planck Institute for Meteorology | Heinze R., Leibniz University of Hanover | Dipankar A., Max Planck Institute for Meteorology | Dipankar A., Center for Climate Research Singapore | and 62 more authors.
Quarterly Journal of the Royal Meteorological Society | Year: 2017

Large-eddy simulations (LES) with the new ICOsahedral Non-hydrostatic atmosphere model (ICON) covering Germany are evaluated for four days in spring 2013 using observational data from various sources. Reference simulations with the established Consortium for Small-scale Modelling (COSMO) numerical weather prediction model and with further standard LES codes are performed for comparison. This comprehensive evaluation approach covers multiple parameters and scales, focusing on boundary-layer variables, clouds and precipitation. The evaluation points to the need to work on parametrizations influencing the surface energy balance, and possibly on ice cloud microphysics. The central purpose for the development and application of ICON in the LES configuration is the use of simulation results to improve the understanding of moist processes, as well as their parametrization in climate models. The evaluation thus aims at building confidence in the model's ability to simulate small- to mesoscale variability in turbulence, clouds and precipitation. The results are encouraging: the high-resolution model matches the observed variability at small to mesoscales much better than the coarser-resolved reference model. At its highest grid resolution, the simulated turbulence profiles are realistic and column water vapour matches the observed temporal variability at short time-scales. Despite being somewhat too large and too frequent, small cumulus clouds are well represented in comparison with satellite data, as is the shape of the cloud size spectrum. Variability of cloud water matches the satellite observations much better in ICON than in the reference model. It is concluded that the model is fit for the purpose of using its output for parametrization development, despite the potential to further improve some important aspects of processes that are also parametrized in the high-resolution model. © 2016 The Authors. Quarterly Journal of the Royal Meteorological Society published by John Wiley & Sons Ltd on behalf of the Royal Meteorological Society.
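One of the diagnostics named above, the cloud size spectrum, can be computed from a binary cloud mask by labelling connected cloudy regions and histogramming their areas. The following minimal sketch shows the idea in Python; it is not the paper's evaluation code, and the function name and toy mask are illustrative assumptions:

```python
# Sketch: cloud size spectrum from a 2-D binary cloud mask (cloudy = 1).
import numpy as np
from scipy import ndimage

def cloud_size_spectrum(cloud_mask, pixel_area_km2, bins):
    """Histogram of cloud areas from a binary cloud mask."""
    labels, n_clouds = ndimage.label(cloud_mask)          # connected cloudy regions
    sizes = ndimage.sum(cloud_mask, labels, range(1, n_clouds + 1))
    areas_km2 = np.asarray(sizes) * pixel_area_km2        # pixel counts -> areas
    counts, edges = np.histogram(areas_km2, bins=bins)
    return counts, edges

# Toy random field standing in for a model or satellite cloud mask.
rng = np.random.default_rng(0)
mask = (rng.random((512, 512)) > 0.7).astype(int)
counts, edges = cloud_size_spectrum(mask, pixel_area_km2=1.0,
                                    bins=np.logspace(0, 4, 20))
```

Comparing such spectra between model output and satellite retrievals on a common grid is what the abstract's cloud-size evaluation amounts to.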


Eyring V., German Aerospace Center | Righi M., German Aerospace Center | Lauer A., German Aerospace Center | Evaldsson M., Swedish Meteorological and Hydrological Institute | and 34 more authors.
Geoscientific Model Development | Year: 2016

A community diagnostics and performance metrics tool for the evaluation of Earth system models (ESMs) has been developed that allows for routine comparison of single or multiple models, either against predecessor versions or against observations. The priority of the effort so far has been to target specific scientific themes focusing on selected essential climate variables (ECVs); a range of known systematic biases common to ESMs, such as coupled tropical climate variability, monsoons, Southern Ocean processes, continental dry biases, and soil hydrology-climate interactions; as well as atmospheric CO2 budgets, tropospheric and stratospheric ozone, and tropospheric aerosols. The tool is being developed in such a way that additional analyses can easily be added. A set of standard namelists for each scientific topic reproduces specific sets of diagnostics or performance metrics that have demonstrated their importance in ESM evaluation in the peer-reviewed literature. The Earth System Model Evaluation Tool (ESMValTool) is a community effort, open to both users and developers, that encourages the open exchange of diagnostic source code and evaluation results from the Coupled Model Intercomparison Project (CMIP) ensemble. This will facilitate and improve ESM evaluation beyond the current state of the art and support such activities within CMIP and at individual modelling centres. Ultimately, we envisage running the ESMValTool alongside the Earth System Grid Federation (ESGF) as part of a more routine evaluation of CMIP model simulations while utilizing observations available in standard formats (obs4MIPs) or provided by the user. © 2016 Author(s).
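A typical performance metric of the kind such a tool computes is a latitude-weighted RMSE of a model field against an observational reference. The sketch below is illustrative only; the function and data are assumptions and are not part of the ESMValTool code base or its namelist interface:

```python
# Sketch: area-weighted RMSE of a model field against observations.
import numpy as np

def area_weighted_rmse(model, obs, lat):
    """RMSE over a (lat, lon) grid, weighted by cos(latitude)."""
    w = np.cos(np.deg2rad(lat))[:, None] * np.ones_like(model)
    err2 = (model - obs) ** 2
    return float(np.sqrt(np.sum(w * err2) / np.sum(w)))

# Toy fields standing in for a model variable and its obs4MIPs reference.
lat = np.linspace(-89.5, 89.5, 180)
model = np.random.default_rng(1).normal(288.0, 10.0, (180, 360))
obs = model + np.random.default_rng(2).normal(0.0, 1.0, (180, 360))
print(f"area-weighted RMSE: {area_weighted_rmse(model, obs, lat):.3f} K")
```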


Riedel M., Juelich Supercomputing Center | Wittenburg P., Max Planck Institute for Meteorology | Reetz J., Rechenzentrum Garching | van de Sanden M., Stichting Academisch Rekencentrum Amsterdam | and 24 more authors.
Journal of Internet Services and Applications | Year: 2013

Scientific user communities have worked with data for many years and thus already have a wide variety of data infrastructures in production today. The aim of this paper is therefore not to create one new general data architecture that would fail to be adopted by each individual user community. Instead, this contribution designs a reference model with abstract entities that can federate existing concrete infrastructures under one umbrella. A reference model is an abstract framework for understanding significant entities and the relationships between them, and thus helps in comparing existing data infrastructures in terms of functionality, services, and boundary conditions. An architecture derived from such a reference model can then be used to create a federated architecture that builds on the existing infrastructures and aligns them with a major common vision. This common vision, named 'ScienceTube' in this contribution, determines the high-level goal that the reference model aims to support. The paper describes how a well-focused use case around data replication and its related activities in the EUDAT project provides a first step towards this vision. Concrete stakeholder requirements from scientific end users, such as those of the European Strategy Forum on Research Infrastructures (ESFRI) projects, underpin this contribution with clear evidence that the EUDAT activities are bottom-up, providing real solutions to the often only vaguely described 'high-level big data challenges'. The federated approach, which takes advantage of community centres and data centres with large computational resources, further shows how data replication services enable data-intensive computing on terabytes or even petabytes of data emerging from ESFRI projects. © 2013 Riedel et al.; licensee Springer.
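The core safety check behind any such replication service is verifying that a replica is bit-identical to its source, typically via checksums. A minimal sketch follows; the paths and function names are placeholders, and this is not the EUDAT service code:

```python
# Sketch: verify a replica against its source with streaming SHA-256 checksums.
import hashlib

def sha256sum(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 so multi-GB files fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_replica(source_path, replica_path):
    """True if source and replica are bit-identical."""
    return sha256sum(source_path) == sha256sum(replica_path)
```

At petabyte scale such checks are run incrementally and per object, which is why replication services record checksums as metadata rather than recomputing both sides on every verification.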


Lawrence B.N., University of Reading | Lawrence B.N., Rutherford Appleton Laboratory | Lawrence B.N., Natural Environment Research Council | Balaji V., Princeton University | and 25 more authors.
Geoscientific Model Development | Year: 2012

The Metafor project has developed a common information model (CIM), using the ISO19100 series formalism, to describe the numerical experiments carried out by the Earth system modelling community, the models they use, and the simulations that result. Here we describe the mechanism by which the CIM was developed, and its key properties. We introduce the conceptual and application versions and the controlled vocabularies developed in the context of supporting the fifth Coupled Model Intercomparison Project (CMIP5). We show how the CIM has been used in experiments to describe model coupling properties and outline the expected near-term evolution of the CIM. © 2012 Author(s).
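To make the idea concrete: a common information model captures structured records of an experiment, the model used, and the resulting simulation, with fields constrained by controlled vocabularies. The classes and vocabulary below are invented for illustration and do not follow the actual CIM schema:

```python
# Sketch: a CIM-like record of experiment, model, and simulation.
from dataclasses import dataclass, field

CALENDARS = {"gregorian", "360_day", "noleap"}  # stand-in controlled vocabulary

@dataclass
class Model:
    name: str
    components: list[str] = field(default_factory=list)

@dataclass
class Simulation:
    experiment: str        # e.g. a CMIP5 experiment name
    model: Model
    calendar: str

    def __post_init__(self):
        # Controlled vocabularies reject values outside the agreed set.
        if self.calendar not in CALENDARS:
            raise ValueError(f"unknown calendar: {self.calendar}")

sim = Simulation("historical",
                 Model("EXAMPLE-ESM", ["atmosphere", "ocean"]),
                 "gregorian")
```

The validation in `__post_init__` mirrors what controlled vocabularies buy a metadata standard: documents that disagree on terminology are rejected at creation time rather than discovered downstream.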


Ziegenhein P., German Cancer Research Center | Kamerling C.P., German Cancer Research Center | Bangert M., German Cancer Research Center | Kunkel J., Deutsches Klimarechenzentrum | Oelfke U., German Cancer Research Center
Physics in Medicine and Biology | Year: 2013

Intensity-modulated treatment plan optimization is a computationally expensive task. The feasibility of advanced applications in intensity-modulated radiation therapy, such as everyday treatment planning, frequent re-planning for adaptive radiation therapy, and large-scale planning research, depends heavily on the runtime of the plan optimization implementation. Modern computational systems are built as parallel architectures to yield high performance. The use of GPUs, as one class of parallel systems, has become very popular in the field of medical physics. In contrast, we utilize the multi-core central processing unit (CPU), which is the heart of every modern computer and does not have to be purchased additionally. In this work we present an ultra-fast, high-precision implementation of the inverse plan optimization problem using a quasi-Newton method on pre-calculated dose influence data sets. We redefined the classical optimization algorithm to achieve minimal runtime and high scalability on CPUs. Using the methods proposed in this work, a complete plan optimization can be carried out in only a few seconds on a low-cost CPU-based desktop computer at clinical resolution and quality. We have shown that our implementation uses the CPU hardware resources efficiently, with runtimes comparable to GPU implementations at lower cost. © 2013 Institute of Physics and Engineering in Medicine.
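The structure of the problem described, quasi-Newton minimization of a dose objective over a pre-calculated dose influence matrix with non-negative beamlet weights, can be sketched as follows. The matrix sizes and random data are toy stand-ins rather than clinical data, and this is not the authors' implementation:

```python
# Sketch: quasi-Newton (L-BFGS-B) fluence optimization on a pre-calculated
# dose influence matrix D, minimizing 0.5 * ||D w - d_presc||^2 with w >= 0.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_voxels, n_beamlets = 2000, 400
D = rng.random((n_voxels, n_beamlets)) * 0.01   # dose influence matrix
d_presc = np.ones(n_voxels)                     # prescribed dose per voxel

def objective(w):
    r = D @ w - d_presc                         # dose residual per voxel
    return 0.5 * (r @ r), D.T @ r               # objective value and gradient

res = minimize(objective, x0=np.zeros(n_beamlets), jac=True,
               method="L-BFGS-B",
               bounds=[(0.0, None)] * n_beamlets)  # physical weights: w >= 0
print(f"objective: {res.fun:.4f}, iterations: {res.nit}")
```

The expensive step per iteration is the pair of matrix-vector products with D and its transpose, which is exactly where a cache-aware multi-core CPU implementation of the kind the paper describes spends its effort.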
