Hamburg, Germany

Chunpir H.I.,University of Hamburg | Chunpir H.I.,DKRZ Inc | Ludwig T.,University of Hamburg | Ludwig T.,DKRZ Inc
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2017

Research shows that people construct mental models of concepts, situations and things; the theory of mental models is thus well established in science. When people are given a particular task, they construct a mental model to solve it, for instance a task to draw spatial objects, e.g. two polygons intersecting each other in a two-dimensional environment. Yet there are few studies that describe scientific software which helps to capture the preferred and alternative mental models of people during tasks of drawing spatial objects. The major contribution of this work is the foundation of an experimental environment, a software application that can recognise the mental models of people drawing spatial objects based on the spatial relations of the drawn objects. The software serves as an experimental environment to find the preferred mental models based on the spatial relations amongst drawn objects, i.e. the preferred way of performing tasks to draw spatial objects. The Regional Connectivity Calculus is used as the underlying spatial scheme to extract the preferred mental models of people who create drawings with the software. The software also records the time to perform each task, including the time to draw and the time spent understanding the drawing task. © Springer International Publishing AG 2017.
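The spatial scheme described corresponds to the Region Connection Calculus (RCC-8) family of relations. As a rough illustration, not the paper's implementation, a minimal classifier for the eight RCC-8 relations between two axis-aligned rectangles might look like this; the rectangle encoding and function name are assumptions:

```python
# Sketch: classify the RCC-8 relation between two axis-aligned
# rectangles given as (x1, y1, x2, y2). Hypothetical helper, not the
# software described in the abstract.

def rcc8(a, b):
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    if a == b:
        return "EQ"                       # identical regions
    # Disconnected: no contact at all.
    if ax2 < bx1 or bx2 < ax1 or ay2 < by1 or by2 < ay1:
        return "DC"
    # Externally connected: boundaries touch, interiors disjoint.
    if ax2 == bx1 or bx2 == ax1 or ay2 == by1 or by2 == ay1:
        return "EC"
    a_in_b = bx1 <= ax1 and ax2 <= bx2 and by1 <= ay1 and ay2 <= by2
    b_in_a = ax1 <= bx1 and bx2 <= ax2 and ay1 <= by1 and by2 <= ay2
    # Tangential proper part if any edge coincides, else non-tangential.
    touches = ax1 == bx1 or ax2 == bx2 or ay1 == by1 or ay2 == by2
    if a_in_b:
        return "TPP" if touches else "NTPP"
    if b_in_a:
        return "TPPi" if touches else "NTPPi"
    return "PO"                           # partially overlapping

print(rcc8((0, 0, 2, 2), (1, 0, 3, 2)))  # prints "PO"
```

A drawing tool could evaluate this predicate on each pair of drawn objects to record which relation a participant produced.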

Stockhause M.,DKRZ Inc | Lautenschlager M.,DKRZ Inc
Data Science Journal | Year: 2017

Data citations have become widely accepted. Technical infrastructures as well as principles and recommendations for data citation are in place, but best practices or guidelines for their implementation are not yet available. At the same time, the scientific climate community requests early citations of evolving data for credit, e.g. for CMIP6 (Coupled Model Intercomparison Project Phase 6). The data citation concept for CMIP6 is presented. The main challenges lie in limited resources, a strict project timeline, and the dependency on changes to the data dissemination infrastructure ESGF (Earth System Grid Federation) to meet the data citation requirements. A pragmatic, flexible and extendible approach for the CMIP6 data citation service was therefore developed, consisting of a citation for the full evolving data superset and a data cart approach for citing the concrete data subset used. This two-citation approach can be implemented according to the RDA recommendations for evolving data. Because of resource constraints and missing project policies, the implementation of the second part of the citation concept is postponed to CMIP7. © 2017 The Author(s).

Chunpir H.I.,DKRZ Inc | Chunpir H.I.,University of Hamburg | Badewi A.A.,Cranfield University | Ludwig T.,DKRZ Inc | Ludwig T.,University of Hamburg
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2014

e-Science infrastructures have changed the process of research. Researchers can now access distributed data around the globe with the help of e-infrastructures, a particularly important development for developing countries. User support services play an important role in providing researchers with the information they need to accomplish their research goals with the help of e-infrastructures. However, current user-support practices in e-infrastructures in the climate domain are followed on an intuitive basis, over-burdening infrastructure development staff who partly act as human support agents. The main contribution of this paper is to present the environmental complexity within the contemporary user support practices of the climate science e-infrastructure known as the Earth System Grid Federation (ESGF). ESGF is a leading distributed peer-to-peer (P2P) data-grid system in Earth System Modelling (ESM) with around 25,000 users distributed all over the world. © 2014 Springer International Publishing Switzerland.

Stevens B.,Max Planck Institute for Meteorology | Giorgetta M.,Max Planck Institute for Meteorology | Esch M.,Max Planck Institute for Meteorology | Mauritsen T.,Max Planck Institute for Meteorology | And 15 more authors.
Journal of Advances in Modeling Earth Systems | Year: 2013

ECHAM6, the sixth generation of the atmospheric general circulation model ECHAM, is described. Major changes with respect to its predecessor affect the representation of shortwave radiative transfer and the height of the model top. Minor changes have been made to model tuning and convective triggering. Several model configurations, differing in horizontal and vertical resolution, are compared. As horizontal resolution is increased beyond T63, the simulated climate improves but changes are incremental; major biases appear to be limited by the parameterization of small-scale physical processes, such as clouds and convection. Higher vertical resolution in the middle atmosphere leads to a systematic reduction in temperature biases in the upper troposphere and a better representation of the middle atmosphere and its modes of variability. ECHAM6 represents the present climate as well as, or better than, its predecessor. The most marked improvements are evident in the circulation of the extratropics. ECHAM6 continues to have a good representation of tropical variability. A number of biases, however, remain. These include a poor representation of low-level clouds, systematic shifts in major precipitation features, biases in the partitioning of precipitation between land and sea (particularly in the tropics), and midlatitude jets that appear to be insufficiently poleward. The response of ECHAM6 to increasing concentrations of greenhouse gases is similar to that of ECHAM5. The equilibrium climate sensitivity of the mixed-resolution (T63L95) configuration is between 2.9 and 3.4 K and is somewhat larger for the 47-level model. Cloud feedbacks and adjustments contribute positively to warming from increasing greenhouse gases. © 2013 American Geophysical Union. All Rights Reserved.
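Equilibrium climate sensitivity figures such as the 2.9–3.4 K quoted above are commonly estimated with a Gregory-style regression of top-of-atmosphere radiative imbalance against surface warming; whether that exact method was used for ECHAM6 is not stated here. A minimal sketch on synthetic data:

```python
# Sketch of Gregory regression: fit N (TOA imbalance, W/m^2) against
# dT (warming, K) from an abrupt CO2-doubling run; the equilibrium
# climate sensitivity is the dT at which the fitted line reaches N = 0.
# The data below are synthetic, for illustration only.

def gregory_ecs(dT, N):
    n = len(dT)
    mT, mN = sum(dT) / n, sum(N) / n
    slope = (sum((t - mT) * (f - mN) for t, f in zip(dT, N))
             / sum((t - mT) ** 2 for t in dT))   # feedback (W/m^2/K)
    intercept = mN - slope * mT                  # forcing at dT = 0
    return -intercept / slope                    # dT where N = 0

# Synthetic run: forcing 3.7 W/m^2, feedback -1.2 W/m^2/K -> ECS ~ 3.08 K
dT = [0.5, 1.0, 1.5, 2.0, 2.5]
N = [3.7 - 1.2 * t for t in dT]
print(round(gregory_ecs(dT, N), 2))
```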

Claussen M.,Max Planck Institute for Meteorology | Selent K.,Max Planck Institute for Meteorology | Selent K.,University of Hamburg | Selent K.,DKRZ Inc | And 3 more authors.
Biogeosciences | Year: 2013

The factor separation of Stein and Alpert (1993) is applied to simulations with the MPI Earth system model to determine the factors that cause the differences between vegetation patterns in glacial and pre-industrial climate. The factors include, firstly, differences in climate, caused by a strong increase in ice masses and the radiative effect of lower greenhouse gas concentrations; secondly, differences in the ecophysiological effect of lower glacial atmospheric CO2 concentrations; and thirdly, the synergy between the pure climate effect and the pure effect of changing physiologically available CO2. It has been shown that the synergy can be interpreted as a measure of the sensitivity of the ecophysiological CO2 effect to climate. The pure climate effect mainly leads to a contraction or a shift in vegetation patterns when comparing simulated glacial and pre-industrial vegetation patterns. Raingreen shrubs benefit from the colder and drier climate. The pure ecophysiological effect of CO2 appears to be stronger than the pure climate effect for many plant functional types, in line with previous simulations. The pure ecophysiological effect of lower CO2 mainly yields a reduction in fractional coverage, a thinning of vegetation and a strong reduction in net primary production. The synergy appears to be as strong as each of the pure contributions locally, but weak on global average for most plant functional types. For tropical evergreen trees, however, the synergy is strong on global average. It diminishes the difference between glacial and pre-industrial coverage of tropical evergreen trees, due to the pure climate effect and the pure ecophysiological CO2 effect, by approximately 50 per cent. © 2013 Author(s).
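The factor separation of Stein and Alpert (1993) itself is a simple linear decomposition over four simulations; the function name and example values below are illustrative, not taken from the paper:

```python
# Sketch of Stein-Alpert factor separation. Given a diagnostic from
# four runs -- control (f0), factor 1 switched on (f1), factor 2
# switched on (f2), and both (f12) -- the pure contributions and their
# synergy (the non-additive interaction) are:

def factor_separation(f0, f1, f2, f12):
    pure_1 = f1 - f0                 # pure effect of factor 1
    pure_2 = f2 - f0                 # pure effect of factor 2
    synergy = f12 - f1 - f2 + f0     # interaction beyond the pure sum
    return pure_1, pure_2, synergy

# Invented example values (e.g. fractional vegetation cover):
p1, p2, syn = factor_separation(f0=1.0, f1=0.8, f2=0.5, f12=0.2)
print(p1, p2, syn)
```

The synergy term is what the abstract reports to be locally strong but globally weak for most plant functional types.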

Stockhause M.,DKRZ Inc | Stockhause M.,Max Planck Institute for Meteorology | Hock H.,DKRZ Inc | Toussaint F.,DKRZ Inc | Lautenschlager M.,DKRZ Inc
Geoscientific Model Development | Year: 2012

The preservation of data in a high state of quality, suitable for interdisciplinary use, is one of the most pressing and challenging current issues in long-term archiving. For high-volume data such as climate model data, the data and data replicas are no longer stored centrally but distributed over several local data repositories, e.g. the data of the Climate Model Intercomparison Project Phase 5 (CMIP5). The most important part of the data is to be archived, assigned a DOI, and published according to the World Data Center for Climate's (WDCC) application of the DataCite regulations. The data quality assessment, an integrated part of WDCC's data publication process, was adapted to the requirements of a federated data infrastructure. A concept of a distributed and federated quality assessment procedure was developed, in which the workload and responsibility for quality control are shared between the three primary CMIP5 data centers: the Program for Climate Model Diagnosis and Intercomparison (PCMDI), the British Atmospheric Data Centre (BADC), and WDCC. This distributed quality control concept, its pilot implementation for CMIP5, and first experiences are presented. The distributed quality control approach is capable of identifying data inconsistencies and of making quality results immediately available to data creators, data users and data infrastructure managers. Continuous publication of new data versions and slow data replication prevent the quality control checks from being completed. This, together with ongoing developments of the data and metadata infrastructure, requires adaptations in the code and concept of the distributed quality control approach.

Luttgau J.,University of Hamburg | Kunkel J.M.,DKRZ Inc
Proceedings of PDSW 2014: 9th Parallel Data Storage Workshop - Held in Conjunction with SC 2014: The International Conference for High Performance Computing, Networking, Storage and Analysis | Year: 2014

Evaluating the I/O performance of an application across different systems is a daunting task because it requires preparation of the software dependencies and the required input data. Feign aims to be an extensible trace replay solution for parallel applications that supports arbitrary software and library layers. The tool abstracts and streamlines the replay process while allowing plug-ins to provide, manipulate and interpret trace data. In this way, the application's behavior can be evaluated without potentially proprietary or confidential software and input data. Even more interesting is the potential of Feign as a virtual laboratory for I/O research: by manipulating trace data, experiments can be conducted; for example, it becomes possible to evaluate the benefit of optimization strategies. Since a plug-in can determine 'future' activities, this enables us to develop optimal strategies as baselines for any run-time heuristics, but also eases testing of a developed strategy on many applications without modifying them. The paper proposes and evaluates a workflow to automatically apply optimization candidates to application traces and approximate potential performance gains. By using Feign's reporting facilities, an automatic optimization engine can then independently conduct experiments by feeding in traces and strategies and comparing the results. © 2014 IEEE.
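Feign's plug-in API is not shown in the abstract; purely as a hypothetical illustration of the idea, a plug-in that rewrites a trace to coalesce adjacent small writes (one kind of optimization candidate) could look like this:

```python
# Hypothetical trace-rewriting plug-in (not Feign's actual API): merge
# write activities that continue exactly where the previous write ended,
# so replaying the mutated trace approximates the gain of write
# coalescing. A trace is a list of (operation, offset, size) tuples.

def merge_adjacent_writes(trace):
    out = []
    for op, offset, size in trace:
        if (out and op == "write" and out[-1][0] == "write"
                and out[-1][1] + out[-1][2] == offset):
            # Extend the previous write instead of appending a new one.
            _, prev_off, prev_size = out.pop()
            out.append(("write", prev_off, prev_size + size))
        else:
            out.append((op, offset, size))
    return out

trace = [("write", 0, 4), ("write", 4, 4), ("write", 8, 4), ("read", 0, 12)]
print(merge_adjacent_writes(trace))  # three writes collapse into one
```

Replaying both the original and the mutated trace and comparing runtimes is exactly the kind of automated experiment the workflow in the paper proposes.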

Kunkel J.,DKRZ Inc | Zimmer M.,University of Hamburg | Betke E.,University of Hamburg
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2015

Data sieving in ROMIO promises to optimize individual noncontiguous I/O. However, making the right choice and parameterizing its buffer size accordingly are non-trivial tasks, because predicting the resulting performance is difficult. Since many performance factors are not taken into account by data sieving, extracting the optimal performance for a given access pattern and system is often not possible. Additionally, in Lustre, settings such as the stripe size and the number of servers are tunable, but again, identifying suitable rules for a data centre proves challenging. In this paper, we (1) discuss limitations of data sieving, (2) apply machine learning techniques to build a performance predictor, and (3) learn and extract best practices for the settings from the data. We used decision trees, as these models can capture non-linear behavior, are easy to understand, and allow the rules used to be extracted. Even in this initial research, with sparse training data, the algorithm predicts many cases sufficiently well. Compared to a standard setting, the decision trees created are able to improve performance significantly, and we can derive expert knowledge by extracting rules from the learned tree. Applying the scheme to a set of experimental data improved the average throughput by 25–50% of the best parametrization's gain. Additionally, we demonstrate the versatility of this approach by applying it to the porting system of DKRZ's next generation supercomputer and discuss achievable performance gains. © Springer International Publishing Switzerland 2015.
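A regression tree is built from splits like the one below; this pure-Python decision stump on invented buffer-size data merely illustrates how a split yields both a prediction and a readable rule, and is not the paper's model:

```python
# Sketch: a one-level regression tree (decision stump). Stacked
# recursively, such splits form the decision trees used to predict I/O
# throughput from settings like the buffer size, and each split reads
# directly as a rule. Data below are invented for illustration.

def best_stump(xs, ys):
    """Find the threshold t minimizing within-leaf squared error."""
    def sse(vals):
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals)

    best = None
    for t in sorted(set(xs))[1:]:            # candidate thresholds
        left = [y for x, y in zip(xs, ys) if x < t]
        right = [y for x, y in zip(xs, ys) if x >= t]
        err = sse(left) + sse(right)
        if best is None or err < best[0]:
            best = (err, t, sum(left) / len(left), sum(right) / len(right))
    _, t, mean_l, mean_r = best
    return t, mean_l, mean_r

# Toy data: throughput (MiB/s) jumps once the buffer reaches 4 MiB.
bufs = [1, 2, 4, 8, 16]
tput = [80, 85, 300, 310, 305]
t, lo, hi = best_stump(bufs, tput)
print(f"rule: buffer < {t} MiB -> {lo:.1f} MiB/s, else {hi:.1f} MiB/s")
```

The extracted rule ("use a buffer of at least 4 MiB") is the kind of expert knowledge the paper derives from its learned trees.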

Ortega P.,Complutense University of Madrid | Ortega P.,DKRZ Inc | Montoya M.,Complutense University of Madrid | Montoya M.,DKRZ Inc | And 4 more authors.
Climate Dynamics | Year: 2012

The variability of the Atlantic meridional overturning circulation (AMOC) is investigated in several climate simulations with the ECHO-G atmosphere-ocean general circulation model, including two forced integrations of the last millennium, one millennial-long control run, and two future scenario simulations of the twenty-first century. This constitutes a new framework in which the AMOC response to future climate change conditions is addressed in the context of both its past evolution and its natural variability. The main mechanisms responsible for the AMOC variability at interannual and multidecadal time scales are described. At high frequencies, the AMOC responds directly to local changes in the Ekman transport, associated with three modes of climate variability: El Niño-Southern Oscillation (ENSO), the North Atlantic Oscillation (NAO), and the East Atlantic (EA) pattern. At low frequencies, the AMOC is largely controlled by convection activity south of Greenland. Again, the atmosphere is found to play a leading role in these variations. Positive anomalies of convection are preceded in 1 year by intensified zonal winds, associated in the forced runs with a positive NAO-like pattern. Finally, the sensitivity of the AMOC to three different forcing factors is investigated. The major impact is associated with increasing greenhouse gases, given their strong and persistent radiative forcing. Starting in the Industrial Era and continuing in the future scenarios, the AMOC experiences a final decrease of up to 40% with respect to the preindustrial average. Also, a weak but significant AMOC strengthening is found in response to the major volcanic eruptions, which produce colder and saltier surface conditions over the main convection regions. In contrast, no meaningful impact of the solar forcing on the AMOC is observed. Indeed, solar irradiance only affects convection in the Nordic Seas, with a marginal contribution to the AMOC variability in the ECHO-G runs. © 2011 Springer-Verlag.
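Findings such as "convection anomalies are preceded by one year by intensified zonal winds" typically come from lagged correlation analysis; the sketch below, on invented annual series, shows the basic computation (the names and data are not from the paper):

```python
# Sketch: lagged Pearson correlation between two annual time series,
# the kind of diagnostic used to detect that one variable leads another
# by a fixed number of years. Synthetic data for illustration.

def lag_corr(x, y, lag):
    """Correlate x(t) with y(t + lag); positive lag means x leads y."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    elif lag < 0:
        x, y = x[-lag:], y[:lag]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

wind = [0.1, 1.0, -0.5, 0.8, -1.0, 0.4, 1.2, -0.7]
conv = [0.0] + wind[:-1]     # convection = wind shifted by one year
print(round(lag_corr(wind, conv, 1), 3))  # prints 1.0: wind leads by 1
```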

Chunpir H.I.,Federal University of São Carlos | Chunpir H.I.,University of Hamburg | Chunpir H.I.,DKRZ Inc
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2016

Service desks have been widely deployed to provide user support within organisations. However, in the field of e-Research only a few studies have been conducted to enhance user-support services. Little has been done to improve the motivation of employees of e-Science infrastructures to service incoming user requests, known as incidents. In this paper, the User-Support-Worker's Activity Model (USWAM) is presented, which enhances the interaction of the employees of cyber-infrastructures with incidents. Furthermore, the model enhances not only the handling of incoming user requests but also the management of the core activities assigned to employees, via visualization queues and matrices in the UI. USWAM thereby helps employees remain interested in supporting users, similar to playing a game: accomplished tasks can be rewarded in the form of money, gifts or recognition. Finally, USWAM can be transferred to other service-oriented domains where prioritization or management of tasks is required. © Springer International Publishing Switzerland 2016.
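USWAM's internals are not given in the abstract; purely as an illustration of the queue-and-reward idea, a toy incident queue with a points tally might look like this (all names, priorities and reward values are invented):

```python
# Toy sketch (not USWAM itself): incidents are held in a priority
# queue, and resolving one earns points, giving a game-like incentive
# to work through the queue.

import heapq

class IncidentQueue:
    def __init__(self):
        self._heap, self._count, self.points = [], 0, 0

    def submit(self, priority, incident):
        # Lower number = more urgent; the counter keeps FIFO order
        # among incidents of equal priority.
        heapq.heappush(self._heap, (priority, self._count, incident))
        self._count += 1

    def resolve_next(self):
        priority, _, incident = heapq.heappop(self._heap)
        self.points += {1: 30, 2: 20}.get(priority, 10)  # reward urgency
        return incident

q = IncidentQueue()
q.submit(2, "login failure")
q.submit(1, "data node down")
q.submit(3, "feature request")
print(q.resolve_next(), q.points)  # most urgent incident comes first
```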
