Moser B.A., Software Competence Center Hagenberg
Proceedings of 1st International Conference on Event-Based Control, Communication and Signal Processing, EBCCSP 2015 | Year: 2015

A novel approach for matching event sequences that result from threshold-based sampling is introduced. The approach relies on Hermann Weyl's discrepancy norm, which plays a central role in the stability analysis of threshold-based sampling. This metric is based on a maximal principle that evaluates intervals of maximal partial sums. It is shown that minimal-length intervals of maximal discrepancy can be exploited to efficiently cluster spikes by means of approximating step functions. In contrast to ordinary spikes, these spike clusters can not only be shifted, deleted or inserted, but also stretched and shrunk, which allows more flexibility in the matching process. A dynamic programming approach is applied to minimize an energy functional of such deformation manipulations. Simulations based on integrate-and-fire sampling show the potential of the approach, above all regarding its robustness. © 2015 IEEE.
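The discrepancy norm at the core of this abstract has a simple linear-time characterization via prefix sums. The sketch below is an illustrative assumption about the basic quantity, not the paper's clustering or matching algorithm; it returns one maximizing interval, which need not be the minimal-length one the paper exploits.

```python
# Hermann Weyl's discrepancy norm of a finite sequence x:
#   D(x) = max over all intervals [i, j] of |x_i + ... + x_j|.
# With prefix sums S_0 = 0, S_k = x_1 + ... + x_k, this equals
# max_k S_k - min_k S_k, so it is computable in one pass.

def discrepancy_norm(x):
    """Return (D, (i, j)) where D is the discrepancy norm of x and
    x[i:j] is one interval whose partial sum attains |sum| == D."""
    prefix = [0]
    for v in x:
        prefix.append(prefix[-1] + v)
    hi = max(range(len(prefix)), key=lambda k: prefix[k])
    lo = min(range(len(prefix)), key=lambda k: prefix[k])
    d = prefix[hi] - prefix[lo]
    i, j = min(lo, hi), max(lo, hi)  # interval in sequence indices
    return d, (i, j)
```

For a spike train encoded as a ±1 sequence, `discrepancy_norm` measures how far the running imbalance between the two event polarities ever drifts.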

Traxler P., Software Competence Center Hagenberg
Algorithmica | Year: 2016

We study the exponential time complexity of approximately counting the satisfying assignments of CNFs. We reduce the problem to deciding the satisfiability of a CNF. Our reduction preserves the number of variables of the input formula and thus also preserves the exponential complexity of approximate counting. Our algorithm is also similar to an algorithm that works particularly well in practice and for which no approximation guarantee is known. © 2016 Springer Science+Business Media New York
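To make the quantity being approximated concrete, here is an exact model counter by exhaustive enumeration. This is emphatically not the paper's variable-preserving reduction to SAT; it is only a brute-force reference for #SAT(F), feasible for small formulas, using a DIMACS-style clause encoding as an illustrative assumption.

```python
from itertools import product

# A CNF is a list of clauses; a clause is a list of nonzero ints,
# where k means variable k and -k its negation (DIMACS-style).

def count_models(cnf, n_vars):
    """Exact number of satisfying assignments of cnf over n_vars
    variables, by trying all 2^n_vars assignments."""
    count = 0
    for bits in product([False, True], repeat=n_vars):
        # Variable k (1-indexed) takes the value bits[k - 1].
        if all(any(bits[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in cnf):
            count += 1
    return count
```

An approximate counter trades this exponential enumeration for a bounded number of satisfiability queries; the abstract's point is that such a reduction can be done without increasing the number of variables.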

Pirklbauer G., Software Competence Center Hagenberg
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2010

Change impact analysis plays an important role in the maintenance and enhancement of software systems, and many approaches exist to support it. In recent years, researchers have tried to exploit the data in software repositories to support miscellaneous aspects of software engineering, e.g. software evolution analysis or change impact analysis. In the context of change impact analysis, such approaches (strategies) try to detect logical dependencies among artifacts based on the version histories of files in the concurrent versioning system (e.g. CVS). They infer logical couplings of files (artifacts) from co-changes, i.e. files that are frequently changed together. Building on these findings, we contribute insights from a deeper investigation of the historical information in concurrent versioning systems in general. In this paper we identify and describe existing strategies for detecting logical change couplings and illustrate them with practical use cases. We empirically evaluate these strategies on the versioning system repositories of two industrial projects. The analysis shows the absolute and relative contribution of dependency results per strategy; furthermore, we show the overlaps of the dependency results. © 2010 Springer-Verlag Berlin Heidelberg.
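The basic co-change heuristic described in this abstract can be sketched in a few lines. The commit representation and the support threshold below are illustrative assumptions, not the specific strategies the paper evaluates.

```python
from collections import Counter
from itertools import combinations

def co_change_couplings(commits, min_support=2):
    """commits: iterable of sets of file names changed together in
    one commit. Returns {frozenset({a, b}): co-change count} for all
    file pairs co-changed at least min_support times."""
    pair_counts = Counter()
    for files in commits:
        for a, b in combinations(sorted(files), 2):
            pair_counts[frozenset((a, b))] += 1
    return {pair: n for pair, n in pair_counts.items()
            if n >= min_support}
```

Pairs that clear the threshold are candidate logical couplings: when one file of the pair changes, the other is likely impacted even without any static dependency between them.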

Moser B.A., Software Competence Center Hagenberg
Electronic Journal of Combinatorics | Year: 2014

Two different elementary approaches for deriving an explicit formula for the distribution of the range of a simple random walk on ℤ of length n are presented. Both rely on Hermann Weyl's discrepancy norm, which equals the maximal partial sum of the elements of a sequence. In this way the original combinatorial problem on ℤ can be turned into a known path-enumeration problem on a bounded lattice. The solution is provided by means of the adjacency matrix Qd of the walk on the bounded lattice {0, 1, …, d}. The second approach is algebraic in nature and starts with the adjacency matrix Qd. The powers of the adjacency matrix are expanded in terms of products of non-commutative left and right shift matrices. The representation of such products by means of the discrepancy norm reveals the solution directly. © 2014, Australian National University. All rights reserved.
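The distribution in question can be checked by brute force for small n, which is useful when sanity-testing a closed-form expression. The sketch below enumerates all 2^n step sequences; defining the range as max minus min of the partial sums (including the start at 0) is an assumption on my part, and this is of course not the paper's closed-form derivation via Qd.

```python
from collections import Counter
from itertools import product

def range_distribution(n):
    """dist[r] = number of +/-1 walks of length n whose range
    (max position - min position, start counted as position 0) is r."""
    dist = Counter()
    for steps in product([-1, 1], repeat=n):
        pos, lo, hi = 0, 0, 0
        for s in steps:
            pos += s
            lo, hi = min(lo, pos), max(hi, pos)
        dist[hi - lo] += 1
    return dist
```

For n = 2, the four walks split evenly: the two monotone walks have range 2 and the two turning walks have range 1.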

Felderer M., University of Innsbruck | Ramler R., Software Competence Center Hagenberg
Software Quality Journal | Year: 2014

Risk-based testing has a high potential to improve the software development and test process, as it helps to optimize the allocation of resources and provides decision support for management. But for many organizations, its integration into an existing test process is a challenging task. In this article, we provide a comprehensive overview of existing work and present a generic testing methodology that enhances an established test process to address risks. On this basis, we develop a procedure for how risk-based testing can be introduced into a test process and derive a stage model for its integration. We then evaluate our approach for introducing risk-based testing by means of an industrial study and discuss the benefits, prerequisites and challenges of introducing it. Potential benefits of risk-based testing identified in the studied project are faster detection of defects resulting in an earlier release, a more reliable release quality statement, as well as the test-process optimization involved. As necessary prerequisites for risk-based testing, we identified an inhomogeneous distribution of risks across the various parts of the tested software system as well as consolidated technical and business views on it. Finally, the identified challenges of introducing risk-based testing are reliable risk assessment in the context of complex systems, the availability of experts for risk assessment, and established tool support for test management. © 2013, Springer Science+Business Media New York.
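A minimal sketch of the prioritization idea underlying risk-based testing: each test target receives a risk exposure (failure probability times damage impact) and test effort is ordered by decreasing exposure. The scoring scales and tuple layout are illustrative assumptions, not the stage model or methodology defined in the article.

```python
def prioritize(items):
    """items: list of (name, failure_probability, damage_impact).
    Returns names sorted by decreasing risk exposure p * i, so the
    riskiest targets are tested first."""
    return [name for name, p, i in
            sorted(items, key=lambda t: t[1] * t[2], reverse=True)]
```

In practice the probability and impact factors come from consolidated technical and business assessments, which is exactly the prerequisite the abstract highlights: without an inhomogeneous risk distribution and agreed-upon views, every target scores alike and the ordering is meaningless.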
