Embedded Systems Institute, Eindhoven, Netherlands

Tretmans J., Embedded Systems Institute | Tretmans J., Radboud University Nijmegen
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2011

Model-based testing is one of the promising technologies to increase the efficiency and effectiveness of software testing. In model-based testing, a model specifies the required behaviour of a system, and test cases are algorithmically generated from this model. Obtaining a valid model, however, is often difficult if the system is complex, contains legacy or third-party components, or if documentation is incomplete. Test-based modelling, also called automata learning, turns model-based testing around: it aims at automatically generating a model from test observations. This paper first gives an overview of formal, model-based testing in general, and of model-based testing for labelled transition system models in particular. Then the practice of model-based testing, the difficulty of obtaining models, and the role of learning are discussed. It is shown that model-based testing and learning are strongly related, and that learning can be fully expressed in the concepts of model-based testing. In particular, test coverage in model-based testing and precision of learned models turn out to be two sides of the same coin. © 2011 Springer-Verlag.
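The core loop the abstract describes, deriving test cases from a behavioural model and executing them against an implementation, can be illustrated in miniature. The sketch below is not the paper's algorithm: the coffee-machine model, the random-walk generator and the CoffeeMachine implementation are hypothetical, and a real ioco-based tool would also handle nondeterminism, quiescence and coverage-driven test selection.

```python
# Minimal sketch: generate a test run from a labelled transition system model
# and check an implementation against it. The model maps
# (state, input) -> (output, next_state).
import random

MODEL = {  # hypothetical coffee-machine model
    ("idle", "coin"):   ("ok", "paid"),
    ("paid", "button"): ("coffee", "idle"),
}

def generate_test(model, start="idle", length=4):
    """Random walk over the model, yielding (input, expected_output) pairs."""
    state, test = start, []
    for _ in range(length):
        enabled = [(inp, out, nxt) for (s, inp), (out, nxt) in model.items() if s == state]
        if not enabled:
            break
        inp, out, nxt = random.choice(enabled)
        test.append((inp, out))
        state = nxt
    return test

def run_test(implementation, test):
    """Execute the test on an implementation (a callable from input to output)."""
    for inp, expected in test:
        if implementation(inp) != expected:
            return "fail"
    return "pass"

class CoffeeMachine:  # a toy implementation under test
    def __init__(self):
        self.paid = False
    def step(self, inp):
        if inp == "coin":
            self.paid = True
            return "ok"
        if inp == "button" and self.paid:
            self.paid = False
            return "coffee"
        return "error"

sut = CoffeeMachine()
print(run_test(sut.step, generate_test(MODEL)))  # expected: pass
```

Test-based modelling inverts this loop: instead of walking a given model, the observations gathered from such test runs are used to infer the model in the first place.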


Stuijk S., TU Eindhoven | Geilen M., TU Eindhoven | Basten T., TU Eindhoven | Basten T., Embedded Systems Institute
Proceedings - 13th Euromicro Conference on Digital System Design: Architectures, Methods and Tools, DSD 2010 | Year: 2010

The design of new embedded systems is getting more and more complex as more functionality is integrated into these systems. To deal with the design complexity, a predictable design flow is needed. The result should be a system that guarantees that an application can perform its own tasks within strict timing deadlines, independent of other applications running on the system. Synchronous Dataflow Graphs (SDFGs) provide predictability and are often used to model time-constrained streaming applications that are mapped onto a multiprocessor platform. However, the model abstracts from the dynamic application behaviour, which may lead to a large overestimation of its resource requirements. We present a design flow that takes the dynamic behaviour of applications into account when mapping them onto a multiprocessor platform. The design flow provides throughput guarantees for each application independent of the other applications while taking into account the available processing capacity, memory and communication bandwidth. The design flow generates a set of mappings that provide a trade-off in their resource usage. This trade-off can be used by a run-time mechanism to adapt the mapping in different use-cases to the available resources. The experimental results show that our design flow reduces the resource requirements of an MPEG-4 decoder by 66% compared to a state-of-the-art design flow based on SDFGs. © 2010 IEEE.
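One concrete outcome of the flow described above is a set of mappings that trade off resource usage against guaranteed throughput, from which a run-time mechanism selects. As a rough illustration of that idea only (the mapping names, loads and throughput figures below are hypothetical, and this is not the paper's design flow), the following sketch computes the Pareto-optimal subset of a set of candidate mappings.

```python
# Minimal sketch: keep only the Pareto-optimal mappings, i.e. those not
# dominated in processor load, memory and guaranteed throughput.
from typing import NamedTuple

class Mapping(NamedTuple):
    name: str
    proc_load: float   # fraction of processing capacity used
    memory_kb: int     # memory footprint
    throughput: float  # guaranteed iterations per second

def dominates(a: Mapping, b: Mapping) -> bool:
    """a dominates b if it is no worse in every objective and better in at least one."""
    no_worse = (a.proc_load <= b.proc_load and a.memory_kb <= b.memory_kb
                and a.throughput >= b.throughput)
    better = (a.proc_load < b.proc_load or a.memory_kb < b.memory_kb
              or a.throughput > b.throughput)
    return no_worse and better

def pareto_front(candidates):
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates)]

candidates = [
    Mapping("two_tiles", 0.8, 512, 30.0),
    Mapping("one_tile",  0.5, 256, 25.0),
    Mapping("wasteful",  0.9, 512, 25.0),   # dominated by "two_tiles"
]
print([m.name for m in pareto_front(candidates)])  # ['two_tiles', 'one_tile']
```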


Theelen B., Embedded Systems Institute | Katoen J.-P., RWTH Aachen | Wu H., RWTH Aachen
Proceedings -Design, Automation and Test in Europe, DATE | Year: 2012

Various dataflow formalisms have been used for capturing the potential parallelism in streaming applications to realise distributed (multi-core) implementations as well as for analysing key properties like absence of deadlock, throughput and buffer occupancies. The recently introduced formalism of Scenario-Aware Dataflow (SADF) advances these abilities by appropriately capturing the dynamism in modern streaming applications like MPEG-4 video decoding. This paper reports on the application of Interactive Markov Chains (IMC) to capture SADF and to formally verify functional and performance properties. To this end, we propose a compositional IMC semantics for SADF based on which the Construction and Analysis of Distributed Processes (CADP) tool suite enables model checking various properties. Encountered challenges included dealing with probabilistic choice and potentially unbounded buffers, neither of which is natively supported, as well as a fundamental difference in the underlying time models of SADF and IMC. Application of our approach to an MPEG-4 decoder revealed state space reduction factors of up to about 21, but also some limitations in terms of scalability and the performance properties that could be analysed. © 2012 EDAA.
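To give a flavour of why scenario awareness matters for performance figures such as long-run throughput, the sketch below models scenario switching as a simple Markov chain with per-scenario execution times and estimates the average time per iteration by simulation. This is only an illustration: it is not the paper's compositional IMC semantics, it ignores interactive transitions and buffers entirely, and the I-frame/P-frame scenarios and their numbers are hypothetical.

```python
# Minimal sketch: a scenario-aware view of a streaming application as a
# Markov chain over scenarios, each with its own execution time. We estimate
# the long-run average time per iteration by simulation.
import random

SCENARIOS = {  # scenario -> (execution time, transition probabilities)
    "I_frame": (6.0, {"I_frame": 0.1, "P_frame": 0.9}),
    "P_frame": (2.0, {"I_frame": 0.2, "P_frame": 0.8}),
}

def simulate(steps=100_000, start="I_frame", seed=0):
    rng = random.Random(seed)
    scenario, total = start, 0.0
    for _ in range(steps):
        time, probs = SCENARIOS[scenario]
        total += time
        scenario = rng.choices(list(probs), weights=list(probs.values()))[0]
    return total / steps  # average execution time per iteration

print(f"average time per iteration ~ {simulate():.2f}")
```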


Mooij A.J., Embedded Systems Institute
Information and Software Technology | Year: 2013

Context: Large software systems are usually developed by integrating several smaller systems, which may have been developed independently. The integration of such systems often requires the development of a custom adapter (sometimes called mediator or glue logic) for bridging any technical incompatibilities between the systems. Adapter development often focuses on how to respond to events from the external interfaces, e.g., by applying data conversions and performing events on (other) external interfaces. Such an operational focus is closely related to an implementation of the adapter, but it makes it complicated to reason about complex adapters. For example, it requires a lot of effort to understand the relation that the adapter establishes between the systems to be integrated, and to avoid any internal inconsistencies.

Objective: This article investigates a way to develop adapters in terms of a more abstract, model-based specification. Experts from the application domain should be able to reason about the specification, and the specification should contain enough details to generate an implementation.

Method: Based on a few industrial adapters from the domain of Maritime Safety and Security, we study ways to specify them conveniently, and ways to generate them efficiently. On this basis, we identify an approach for developing adapters. In turn, this approach is validated using an additional set of adapters.

Results: After abstracting from the details of the interface technologies, the studied adapters could be generated using techniques for incremental view maintenance. This requires an adapter specification in terms of database views to relate the main semantic concepts in the application domain.

Conclusion: For developing adapters, it can be useful to model all interface technologies as database operations. Thus adapters can be specified using database views, which improve the conceptual understanding of the adapters. Publish/subscribe-style adapters can then be generated using incremental view maintenance. © 2012 Elsevier B.V. All rights reserved.
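A minimal sketch of the resulting idea, an adapter maintained as an incrementally updated join view that forwards only delta rows to a subscriber, is given below. The ship and position tables, field names and callback are hypothetical and merely echo the maritime domain mentioned above; the paper's actual specification language and generator are not reproduced here.

```python
# Minimal sketch: an adapter kept as an incrementally maintained join view.
# When one system publishes a new record, only the delta rows of the view are
# computed and forwarded to the subscribing system.
from collections import defaultdict

class JoinViewAdapter:
    """Maintains the view: SELECT * FROM ships JOIN positions USING (ship_id)."""
    def __init__(self, publish):
        self.ships = {}                     # ship_id -> name
        self.positions = defaultdict(list)  # ship_id -> [(lat, lon), ...]
        self.publish = publish              # callback to the target system

    def on_ship(self, ship_id, name):
        self.ships[ship_id] = name
        # delta: join the new ship row with already-known positions
        for lat, lon in self.positions[ship_id]:
            self.publish({"ship": name, "lat": lat, "lon": lon})

    def on_position(self, ship_id, lat, lon):
        self.positions[ship_id].append((lat, lon))
        # delta: join the new position row with an already-known ship
        if ship_id in self.ships:
            self.publish({"ship": self.ships[ship_id], "lat": lat, "lon": lon})

adapter = JoinViewAdapter(publish=print)
adapter.on_position("NL001", 51.44, 5.47)  # no output yet: ship unknown
adapter.on_ship("NL001", "MS Eindhoven")   # emits the joined row
adapter.on_position("NL001", 51.45, 5.48)  # emits incrementally
```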


Tretmans J., Embedded Systems Institute
Electronic Notes in Theoretical Computer Science | Year: 2010

Model-based testing is one of the promising technologies to meet the challenges imposed on software testing. In model-based testing, test cases are generated from a model that describes the required behaviour of the implementation under test. First, we will argue that model-based testing is more than just the generation of some test cases from a model. A well-defined and sound theory of model-based testing is feasible and necessary, and it brings many benefits, including practical ones. As an example we will consider the ioco-testing theory, where models are expressed as labelled transition systems and correctness is defined with the ioco implementation relation. Second, we consider component-based testing as an application area of model-based testing, triggered by the popularity of component-based development. The strong and weak points of the ioco-testing theory for component-based testing will be discussed, and an ioco variant called eco (environmental conformance) is presented that allows model-based testing of both the provided and required interfaces of a component. © 2010 Elsevier B.V.
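As a strongly simplified illustration of the ioco idea, correctness as inclusion of the implementation's possible outputs in those the specification allows after common traces, the sketch below checks that condition for deterministic, finite transition systems. Quiescence, nondeterminism and the eco variant are deliberately left out, and both machines are hypothetical.

```python
# Minimal sketch of an ioco-like check for deterministic, finite labelled
# transition systems: after every trace of the specification, the outputs the
# implementation may produce must be among those the specification allows.
# Actions prefixed "?" are inputs, "!" are outputs.

SPEC = {  # state -> {action: next_state}
    "s0": {"?coin": "s1"},
    "s1": {"!coffee": "s0", "!tea": "s0"},
}
IMPL = {
    "i0": {"?coin": "i1"},
    "i1": {"!coffee": "i0"},  # produces a subset of SPEC's outputs: allowed
}

def outputs(lts, state):
    return {a for a in lts[state] if a.startswith("!")}

def conforms(impl, spec, i_state="i0", s_state="s0", seen=None):
    """Walk the spec's traces in lock-step with the implementation and compare output sets."""
    seen = seen or set()
    if (i_state, s_state) in seen:
        return True
    seen.add((i_state, s_state))
    if not outputs(impl, i_state) <= outputs(spec, s_state):
        return False
    for action, s_next in spec[s_state].items():
        if action in impl[i_state]:  # follow actions both machines can do
            if not conforms(impl, spec, impl[i_state][action], s_next, seen):
                return False
    return True

print(conforms(IMPL, SPEC))  # True: implementing only "!coffee" is allowed
```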
