Jocham S.,University of Innsbruck |
Dobler W.,Airborne Hydro Mapping GmbH |
Baran R.,Airborne Hydro Mapping GmbH |
Aufleger M.,University of Innsbruck |
Steinbacher F.,Airborne Hydro Mapping GmbH
Österreichische Wasser- und Abfallwirtschaft | Year: 2014
Bathymetric surveys are an essential basis for investigations in river hydraulics. In this context, ecohydraulic studies represent an interpretation of the hydraulic situation in running waters with regard to the living conditions for their flora and fauna. With increasing computational performance and powerful models for describing the hydraulic situation, the expectations for basic surveys are rising. Against this backdrop, and further reinforced by the stipulations of the European Water Framework Directive, the technology of Airborne Hydromapping (surveying with a water-penetrating laser system) was developed. With this technology it is possible to survey water bodies and riparian strips comprehensively and in high resolution (10-40 points/m²). The data generated can in turn be used to create detailed, high-resolution calculation meshes and therefore to accurately describe the hydraulic conditions in large river reaches. The technology can be used in both small-scale and large-scale contexts (e.g. habitat modeling or structural analysis), and also opens new avenues for monitoring applications. © 2014 Springer-Verlag Wien.
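The point densities quoted in the abstract (10-40 points/m²) are the basis for judging how fine a calculation mesh the survey can support. As a minimal sketch of that kind of check, the following binning routine counts survey points per grid cell; the function name, cell size, and sample data are illustrative assumptions, not part of the Airborne Hydromapping toolchain.

```python
# Sketch (illustrative, not the paper's software): estimating the point
# density of a laser survey on a regular grid, as a basis for deciding
# the achievable resolution of a hydraulic calculation mesh.
from collections import defaultdict

def point_density(points, cell_size=1.0):
    """Map each (x, y) point in metres to a grid cell and return
    points per m^2 for every occupied cell."""
    counts = defaultdict(int)
    for x, y in points:
        counts[(int(x // cell_size), int(y // cell_size))] += 1
    area = cell_size * cell_size
    return {cell: n / area for cell, n in counts.items()}

# Example: 25 synthetic points inside one 1 m x 1 m cell -> 25 points/m^2,
# within the 10-40 points/m^2 range reported for the technology.
pts = [((0.2 * i) % 1.0, (0.04 * i) % 1.0) for i in range(25)]
density = point_density(pts, cell_size=1.0)
```

Cells whose density falls below the target resolution would then be flagged before mesh generation.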
Gschwandtner P.,University of Innsbruck |
Chalios C.,Queen's University Belfast |
Nikolopoulos D.S.,Queen's University Belfast |
Vandierendonck H.,Queen's University Belfast |
Fahringer T.,University of Innsbruck
Computer Science - Research and Development | Year: 2015
Dynamic voltage and frequency scaling (DVFS) exhibits fundamental limitations as a method to reduce energy consumption in computing systems. In the HPC domain, where performance is of highest priority and codes are heavily optimized to minimize idle time, DVFS has limited opportunity to achieve substantial energy savings. This paper explores whether operating processors near the transistor threshold voltage (NTV) is a better alternative to DVFS for breaking the power wall in HPC. NTV presents challenges, since it compromises both performance and reliability to reduce power consumption. We present a first-of-its-kind study of a significance-driven execution paradigm that selectively uses NTV and algorithmic error tolerance to reduce energy consumption in performance-constrained HPC environments. Using an iterative algorithm as a use case, we present an adaptive execution scheme that switches between near-threshold execution on many cores and above-threshold execution on one core, as the computational significance of iterations in the algorithm evolves over time. Using this scheme on state-of-the-art hardware, we demonstrate energy savings ranging between 35 % and 67 %, while compromising neither correctness nor performance. © 2014, Springer-Verlag Berlin Heidelberg.
Perez-Castillo R.,University of Castilla-La Mancha |
De Guzman I.G.-R.,University of Castilla-La Mancha |
Piattini M.,University of Castilla-La Mancha |
Weber B.,University of Innsbruck |
Places A.S.,University of La Coruña
Proceedings of the ACM Symposium on Applied Computing | Year: 2011
Legacy information systems age over time as a consequence of uncontrolled maintenance and need to be modernized. Process mining allows the discovery of the business processes embedded in legacy information systems, which is necessary to preserve the legacy business knowledge and to align it with the new, modernized information systems. There are two main approaches to mining business processes from legacy information systems: (i) the static approach, which considers only elements of the legacy source code from a syntactic viewpoint; and (ii) the dynamic approach, which also considers information derived from system execution. Unfortunately, there is a lack of empirical evidence facilitating the selection between them. This paper provides a formal comparison of the static and dynamic approaches through a case study. This study shows that the static approach provides better performance, while the dynamic approach discovers more accurate business processes. © 2011 ACM.
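The contrast between the two approaches can be made concrete with a toy example: a static pass scans the source text and reports every activity it can see syntactically, across all branches, while a dynamic pass reads an event log from an actual execution and therefore captures only the activities (and their order) that really occurred. The sample "legacy" function, the regex, and the event-log format below are illustrative assumptions, not the paper's tooling.

```python
# Sketch (illustrative): static vs. dynamic business-process mining
# on a toy legacy system.
import re

LEGACY_SOURCE = """
def handle_order(order):
    validate(order)
    if order.express:
        ship_express(order)
    else:
        ship_standard(order)
"""

def mine_static(source):
    """Static approach: every call site visible in the code,
    covering all branches, with no execution required."""
    return sorted(set(re.findall(r"(\w+)\(order\)", source)) - {"handle_order"})

def mine_dynamic(event_log):
    """Dynamic approach: activities actually observed, in order,
    in a recorded execution trace."""
    return [event["activity"] for event in event_log]

# One recorded execution took the non-express branch.
trace = [{"activity": "validate"}, {"activity": "ship_standard"}]
static_result = mine_static(LEGACY_SOURCE)
dynamic_result = mine_dynamic(trace)
```

The static result covers both shipping branches cheaply; the dynamic result reflects one real control-flow path, which is why the paper finds the static approach faster but the dynamic approach more accurate about actual behavior.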