Software Competence Center Hagenberg

Hagenberg, Austria


Felderer M.,University of Innsbruck | Ramler R.,Software Competence Center Hagenberg
Software Quality Journal | Year: 2014

Risk-based testing has high potential to improve the software development and test process, as it helps to optimize the allocation of resources and provides decision support for management. For many organizations, however, its integration into an existing test process is a challenging task. In this article, we provide a comprehensive overview of existing work and present a generic testing methodology that enhances an established test process to address risks. On this basis, we develop a procedure for how risk-based testing can be introduced into a test process and derive a stage model for its integration. We then evaluate our approach for introducing risk-based testing by means of an industrial study and discuss the benefits, prerequisites and challenges of introducing it. Potential benefits of risk-based testing identified in the studied project are faster detection of defects resulting in an earlier release, a more reliable release quality statement, as well as the involved test-process optimization. As necessary prerequisites for risk-based testing, we identified an inhomogeneous distribution of risks associated with the various parts of the tested software system as well as consolidated technical and business views on it. Finally, the identified challenges of introducing risk-based testing are reliable risk assessment in the context of complex systems, the availability of experts for risk assessment, and established tool support for test management. © 2013, Springer Science+Business Media New York.
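The stage model itself is a process description rather than code, but the underlying prioritization idea can be sketched. The following is a minimal, hypothetical illustration, not taken from the article: the scoring function, weights and test names are all assumptions. Risk is estimated as failure likelihood times impact, and test cases are ordered accordingly.

```python
def risk_score(likelihood, impact):
    """Classic risk exposure: probability of failure times cost of failure."""
    return likelihood * impact

def prioritize(test_cases):
    """Order test cases by descending risk of the component they cover."""
    return sorted(
        test_cases,
        key=lambda tc: risk_score(tc["likelihood"], tc["impact"]),
        reverse=True,
    )

# Hypothetical test inventory; likelihood in [0, 1], impact on an ordinal scale.
tests = [
    {"name": "test_billing", "likelihood": 0.7, "impact": 9},
    {"name": "test_login", "likelihood": 0.4, "impact": 8},
    {"name": "test_help_page", "likelihood": 0.2, "impact": 1},
]

ordered = prioritize(tests)
print([t["name"] for t in ordered])  # ['test_billing', 'test_login', 'test_help_page']
```

With a fixed test budget, executing the list from the top spends effort on the highest-risk parts of the system first, which is the optimization the abstract refers to.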


Martinez-Gil J.,Software Competence Center Hagenberg
Computer Science Review | Year: 2015

A fundamental challenge at the intersection of Artificial Intelligence and Databases consists of developing methods to automatically manage Knowledge Bases (KBs), which can serve as a knowledge source for computer systems trying to replicate the decision-making ability of human experts. Although most of the tasks involved in building, exploiting and maintaining KBs are far from trivial, significant progress has been made in recent years. However, a number of challenges remain open; in fact, several issues must still be addressed before the technology behind systems of this kind can be empirically proven mature and reliable. © 2015 Elsevier Inc.


Pirklbauer G.,Software Competence Center Hagenberg
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2010

Change impact analysis plays an important role in the maintenance and enhancement of software systems, and many approaches exist to support it. In recent years researchers have tried to utilize data in software repositories to gain findings supporting miscellaneous aspects of software engineering, e.g. software evolution analysis or change impact analysis. In the context of change impact analysis, approaches (=strategies) try to detect logical dependencies among artifacts based on the version histories of files in the concurrent versioning system (e.g. CVS): they infer logical couplings of files (artifacts) from co-changes (files that are frequently changed together). Based on these findings, we want to contribute insights from a deeper investigation of historical information in concurrent versioning systems in general. In this paper we identify and describe existing strategies for detecting logical change couplings and illustrate them with practical use cases. We empirically evaluated these strategies based on the versioning-system repositories of two industrial projects. The analysis shows the absolute and relative contribution of dependency results per strategy. Furthermore, we show overlaps of dependency results. © 2010 Springer-Verlag Berlin Heidelberg.
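As a rough illustration of the co-change strategy the abstract describes (not the paper's implementation; the commit data and the support threshold are invented), logical couplings can be mined by counting how often each file pair appears in the same commit:

```python
from collections import Counter
from itertools import combinations

# Hypothetical commit history: each commit is the set of files changed together.
commits = [
    {"parser.c", "parser.h"},
    {"parser.c", "parser.h", "main.c"},
    {"parser.c", "parser.h"},
    {"main.c", "util.c"},
]

def logical_couplings(commits, min_support=2):
    """Count how often each file pair changes in the same commit and keep
    pairs that co-change at least `min_support` times."""
    pair_counts = Counter()
    for files in commits:
        for pair in combinations(sorted(files), 2):
            pair_counts[pair] += 1
    return {pair: n for pair, n in pair_counts.items() if n >= min_support}

print(logical_couplings(commits))  # {('parser.c', 'parser.h'): 3}
```

A coupling such as parser.c/parser.h would then be flagged during change impact analysis: editing one file suggests the other is likely affected as well.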


Traxler P.,Software Competence Center Hagenberg
Algorithmica | Year: 2016

We study the exponential time complexity of approximate counting satisfying assignments of CNFs. We reduce the problem to deciding satisfiability of a CNF. Our reduction preserves the number of variables of the input formula and thus also preserves the exponential complexity of approximate counting. Our algorithm is also similar to an algorithm which works particularly well in practice and for which no approximation guarantee is known. © 2016 Springer Science+Business Media New York


Moser B.A.,Software Competence Center Hagenberg
Discrete and Computational Geometry | Year: 2012

Weyl's discrepancy measure induces a norm on ℝⁿ which shows a monotonicity and a Lipschitz property when applied to differences of index-shifted sequences. It turns out that its n-dimensional unit ball is a zonotope resulting from a multiple sheared projection of the (n+1)-dimensional hypercube, a projection which can be interpreted as a discrete differentiation. This characterization reveals that this norm is the canonical metric between sequences of differences of values from the unit interval, in the sense that the n-dimensional unit ball of the discrepancy norm equals the space of such sequences. © 2012 Springer Science+Business Media, LLC.
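For finite sequences, the discrepancy norm is the maximum absolute sum over all consecutive index ranges, which reduces to the spread of the prefix sums. A small sketch (our own illustration, not code from the paper):

```python
from itertools import accumulate

def discrepancy_norm(x):
    """Weyl's discrepancy norm of a finite sequence: the maximum absolute
    sum over all consecutive index ranges, computed via prefix sums in O(n)."""
    prefix = [0, *accumulate(x)]      # S_0 = 0, S_k = x_1 + ... + x_k
    return max(prefix) - min(prefix)  # = max over all intervals of |sum|

x = [1, -2, 3, -1]
print(discrepancy_norm(x))                 # → 3 (attained by the interval [3])
print(discrepancy_norm([-v for v in x]))   # → 3 (symmetry: ||-x|| = ||x||)
```

The prefix-sum form makes the norm axioms easy to check numerically, e.g. homogeneity: doubling every entry doubles the norm.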


Moser B.A.,Software Competence Center Hagenberg
Electronic Journal of Combinatorics | Year: 2014

Two different elementary approaches for deriving an explicit formula for the distribution of the range of a simple random walk on ℤ of length n are presented. Both rely on Hermann Weyl's discrepancy norm, which equals the maximal partial sum of the elements of a sequence. In this way the original combinatorial problem on ℤ can be turned into a known path-enumeration problem on a bounded lattice. The first approach provides the solution by means of the adjacency matrix Q_d of the walk on a bounded lattice {0, 1, …, d}. The second approach is algebraic in nature and also starts with the adjacency matrix Q_d. The powers of the adjacency matrix are expanded in terms of products of non-commutative left and right shift matrices. The representation of such products by means of the discrepancy norm reveals the solution directly. © 2014, Australian National University. All rights reserved.
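The explicit formula itself is not reproduced here, but the quantity in question is easy to cross-check by brute force: the range of a walk is the spread of its partial-sum path. A small enumeration sketch (illustrative only, exponential in n):

```python
from itertools import product

def range_distribution(n):
    """Brute-force distribution of the range (max minus min of the
    partial-sum path, including the start at 0) over all 2**n simple
    random walks of length n on the integers."""
    counts = {}
    for steps in product((-1, 1), repeat=n):
        s, lo, hi = 0, 0, 0
        for step in steps:
            s += step
            lo, hi = min(lo, s), max(hi, s)
        r = hi - lo
        counts[r] = counts.get(r, 0) + 1
    return counts

print(range_distribution(3))  # {3: 2, 2: 4, 1: 2} over the 8 walks of length 3
```

Such an enumeration is how a derived closed-form distribution can be validated on small n before trusting it for large ones.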


Pichler J.,Software Competence Center Hagenberg
Proceedings - Working Conference on Reverse Engineering, WCRE | Year: 2013

Technical software systems contain extensive and complex computations that are frequently implemented in an optimized and unstructured way. Computations are, therefore, hard to comprehend from source code. If no other documentation exists, it is a tedious endeavor to understand which input data impact a particular computation and how a program achieves a particular result. We apply symbolic execution to automatically extract computations from source code. Symbolic execution makes it possible to identify the input and output data, the actual computation, as well as the constraints of a particular computation, independently of encountered optimizations and unstructured program elements. The proposed technique may be used to improve maintenance and reengineering activities concerning legacy code in scientific and engineering domains. © 2013 IEEE.
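As a toy illustration of the idea (not the authors' tool chain; all names are hypothetical), running a straight-line computation on symbolic values instead of numbers yields the extracted formula and shows which inputs influence the result:

```python
class Sym:
    """A symbolic value that records the expression built from it."""
    def __init__(self, expr):
        self.expr = expr
    def __add__(self, other):
        return Sym(f"({self.expr} + {_e(other)})")
    def __mul__(self, other):
        return Sym(f"({self.expr} * {_e(other)})")
    def __repr__(self):
        return self.expr

def _e(v):
    return v.expr if isinstance(v, Sym) else str(v)

def compute(a, b):
    # Imagine this buried in optimized legacy code with no documentation.
    t = a * 2
    return t + b * b

# Concrete execution gives one number; symbolic execution gives the formula.
print(compute(3, 4))                  # → 22
print(compute(Sym("a"), Sym("b")))    # → ((a * 2) + (b * b))
```

The symbolic run makes explicit that both inputs feed the result and in what form, which is the kind of information the extraction technique recovers. Real symbolic executors additionally track path constraints at branches, which this straight-line toy omits.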


Martinez-Gil J.,Software Competence Center Hagenberg
Cognitive Systems Research | Year: 2016

Semantic similarity measurement aims to determine the likeness between two text expressions that use different lexicographies to represent the same real object or idea. Many semantic similarity measures address this problem; however, the best results have been achieved by aggregating a number of simple similarity measures. This means that after the various similarity values have been calculated, the overall similarity for a pair of text expressions is computed using an aggregation function over these individual semantic similarity values. This aggregation is often computed by means of statistical functions. In this work, we present CoTO (Consensus or Trade-Off), a solution based on fuzzy logic that is able to outperform these traditional approaches. © 2016 Elsevier B.V.
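The CoTO operator itself is not specified in the abstract; as a generic stand-in, the sketch below aggregates individual similarity scores with an Ordered Weighted Averaging (OWA) operator, a standard fuzzy aggregation technique. The scores and weights are invented for illustration:

```python
def owa(scores, weights):
    """Ordered Weighted Averaging: sort the individual similarity scores
    in descending order, then take the weighted sum. Weights must sum to 1
    and are attached to ranks, not to particular measures."""
    assert len(scores) == len(weights)
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * s for w, s in zip(weights, sorted(scores, reverse=True)))

# Three hypothetical measures scoring the pair ("car", "automobile").
scores = [0.9, 0.7, 0.8]

# Front-loaded weights lean toward the most optimistic measures.
print(owa(scores, [0.5, 0.3, 0.2]))
```

Shifting the weight mass toward the top ranks makes the aggregation optimistic (closer to max), toward the bottom ranks pessimistic (closer to min); a plain average is the special case of uniform weights.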


Martinez-Gil J.,Software Competence Center Hagenberg
International Journal of Uncertainty, Fuzziness and Knowlege-Based Systems | Year: 2016

Semantic similarity measurement of biomedical nomenclature aims to determine the likeness between two biomedical expressions that use different lexicographies to represent the same real biomedical concept. Many semantic similarity measures try to address this issue, many of which represent an incremental improvement over their predecessors. In this work, we present yet another incremental solution that is able to outperform existing approaches by using a sophisticated aggregation method based on fuzzy logic. Results show that our strategy consistently beats existing approaches on well-known biomedical benchmark data sets. © 2016 World Scientific Publishing Company.


Buchgeher G.,Software Competence Center Hagenberg | Weinreich R.,Johannes Kepler University
Proceedings - 9th Working IEEE/IFIP Conference on Software Architecture, WICSA 2011 | Year: 2011

Traceability requires explicitly capturing the relations between software artifacts like requirements, architecture and implementation. Manual discovery and recovery of tracing information by studying documents, architecture documentation and implementation is time-intensive, costly, and may miss important information not found in the analyzed artifacts. Approaches for explicitly capturing traces exist, but they either require manual capturing or lack comprehensive tracing to both architecture and implementation. In this paper we present an approach for (semi-)automatically capturing traceability relationships from requirements and design decisions to architecture and implementation. Traces are captured in a non-intrusive way during architecture design and implementation. The captured traces are integrated with a semi-formally defined architecture description model and serve as the basis for different kinds of architecture-related activities. © 2011 IEEE.
