Fraunhofer ISST

Berlin, Germany


Liagouris J.,Information Systems Management Institute | Athanasiou S.,Information Systems Management Institute | Efentakis A.,RA Computer Technology Institute | Pfennigschmidt S.,Fraunhofer ISST | And 4 more authors.
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2011

Mobile devices are a promising platform for content delivery considering the (i) variety of attached sensors, (ii) widespread availability of wireless networks, and (iii) ever-increasing screen real estate and hardware capabilities. What has been missing so far is the adequate coupling of content to those devices and their users' actions. This is especially apparent in the area of Location-based Services (LBS), which, with few exceptions (e.g., navigation), have not lived up to their predicted commercial success in mobile environments for the following reasons: (i) content in typical LBS applications is still narrow and static, (ii) available methods and interfaces in mobile handsets for the discovery of available content are at best cumbersome (e.g., keyword-type search), and (iii) existing structured content available in LBS applications is hard to reuse. In this work, we propose the concept of task computing to complement and extend LBS as a means to enable the intuitive and efficient repurposing, discovery, and delivery of rich content according to the user's needs. Further, we establish the theoretical foundations of task computing and its application in the LBS domain. We also present a fully functional prototype iPhone application structured around the concept of task computing. © 2011 Springer-Verlag.

Otto B.,TU Dortmund | Barenfanger R.,University of St. Gallen | Steinbuss S.,Fraunhofer ISST
28th Bled eConference: #eWellbeing - Proceedings | Year: 2015

Digitization is affecting almost all areas of business and society. It brings about opportunities for enterprises to design a digital business model. While a significant amount of research exists examining the conceptual foundation of business models in general, no comprehensive approach is available that helps enterprises design a digital business model. This paper addresses this gap and proposes Digital Business Engineering as a method for digital business model design. The method's activities are structured into six phases, namely End-to-End Customer Design, Business Ecosystem Design, Digital Product/Service Design, Digital Capability Design, Data Mapping, and Digital Technology Architecture Design. The method development follows principles of design-oriented research. Five case studies are used to analyse method requirements and to evaluate the method within its natural context.

Beckers K.,University of Duisburg - Essen | Schmidt H.,University of Duisburg - Essen | Kuster J.-C.,Fraunhofer ISST | Fassbender S.,TU Dortmund
Proceedings of the 2011 6th International Conference on Availability, Reliability and Security, ARES 2011 | Year: 2011

The ISO 27000 series is a well-established set of information security standards. The scope for applying these standards can be an organisation as a whole, single business processes, or even an IT application or IT infrastructure. The context establishment and the asset identification are among the first steps to be performed. The quality of the results produced in these steps has a crucial influence on subsequent steps such as identifying possible losses, vulnerabilities, and attacks, and defining countermeasures. Thus, a context analysis that gathers all necessary information in the initial steps is important, but such support is not offered in the standard. In this paper, we focus on the scope of cloud computing systems and present a way to support the context establishment and the asset identification described in ISO 27005. A cloud system analysis pattern and different kinds of stakeholder templates serve to understand and describe a given cloud development problem, i.e. the envisaged IT systems and the relevant parts of the operational environment. We illustrate our support using an online banking cloud scenario. © 2011 IEEE.
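The stakeholder-template idea above can be sketched as a structured record whose instantiation for a concrete scenario yields a first asset list. This is a minimal illustration, assuming invented field names (`name`, `role`, `assets`) and an invented helper `identify_assets`; it is not the paper's actual pattern notation.

```python
from dataclasses import dataclass, field

@dataclass
class Stakeholder:
    """A filled-in stakeholder template for the context establishment."""
    name: str
    role: str
    assets: list = field(default_factory=list)  # assets this stakeholder values

def identify_assets(stakeholders):
    """Asset identification: union of all assets named across the
    instantiated templates, preserving first-mention order."""
    found = []
    for s in stakeholders:
        for a in s.assets:
            if a not in found:
                found.append(a)
    return found

# Online banking cloud scenario, as in the paper's illustration:
bank = Stakeholder("bank", "cloud customer", ["customer records", "transactions"])
provider = Stakeholder("provider", "cloud provider", ["infrastructure", "transactions"])
assert identify_assets([bank, provider]) == [
    "customer records", "transactions", "infrastructure"]
```

Duplicate mentions (here, "transactions") collapse to one asset entry, which is the point of consolidating templates before the risk-analysis steps.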

Burger J.,TU Dortmund | Jurjens J.,TU Dortmund | Wenzel S.,Fraunhofer ISST
International Journal on Software Tools for Technology Transfer | Year: 2015

Security certification of complex systems requires a high amount of effort. As a particular challenge, today's systems are increasingly long-living and subject to continuous change. After each change of some part of the system, the whole system needs to be re-certified from scratch (since security properties are in general not modular), which is usually far too much effort. When models for software get changed, this can lead to security weaknesses that carry over into the software system derived from those models. Hence, it is important to check the models with respect to security properties and correct them accordingly. To address this challenge, we present an approach which not only finds security weaknesses but can also correct them in a tool-supported way. Over time, diverse changing requirements, both security-related and non-security-related, lead to an evolving system that met its security requirements at design time but may contain vulnerabilities with respect to meanwhile updated security knowledge. Supported by patterns, we can describe and detect potential flaws that may arise in models, such as inconsistencies in security requirements. Potential violations can be formalized in the patterns, as can the correction alternatives to fix them. The approach is based on graph transformation and can be applied to different types of models and violations. For flaw detection, the patterns are used as the left-hand sides of graph transformation rules. Using graph transformation, we can further correct the models and establish that they no longer violate the security requirements under investigation. The approach is supported by a tool which can check whether these patterns arise in models and assist the user in correcting the security vulnerabilities. © 2014, Springer-Verlag Berlin Heidelberg.
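The detect-then-correct idea can be sketched in miniature: a model as a labeled graph, a flaw pattern as the left-hand side of a rule, and the correction as its right-hand side. All names (`find_flaws`, the `transmits`/`encrypted` labels) are invented for illustration and do not come from the paper's tool.

```python
# A model element maps to a set of (edge_label, target) pairs.
Model = dict

def find_flaws(model: Model) -> list:
    """Left-hand side of the rule: a 'transmits' edge without a matching
    'encrypted' annotation counts as a security-requirement violation."""
    flaws = []
    for node, edges in model.items():
        for label, target in edges:
            if label == "transmits" and ("encrypted", target) not in edges:
                flaws.append((node, target))
    return flaws

def correct(model: Model) -> Model:
    """Right-hand side of the rule: add the missing 'encrypted'
    annotation so the model no longer matches the flaw pattern."""
    fixed = {n: set(e) for n, e in model.items()}
    for node, target in find_flaws(model):
        fixed[node].add(("encrypted", target))
    return fixed

model = {"client": {("transmits", "server")}, "server": set()}
assert find_flaws(model) == [("client", "server")]
assert find_flaws(correct(model)) == []
```

The final assertion mirrors the paper's claim: after applying the correction, re-running detection establishes that the model no longer violates the requirement.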

Wenzel S.,Fraunhofer ISST | Poggenpohl D.,Fraunhofer ISST | Jurjens J.,TU Dortmund | Ochoa M.,TU Munich
Computer Standards and Interfaces | Year: 2014

In model-based development, quality properties such as consistency of security requirements are often verified prior to code generation. Changed models have to be re-verified before re-generation. If several alternative evolutions of a model are possible, each alternative has to be modeled and verified to find the best model for further development. We present a verification strategy to analyze whether evolution preserves given security properties. The UMLchange profile is used for specifying potential evolutions of a given model simultaneously. We present a tool that reads these annotations and computes a delta containing all possible evolution paths. The paths can be verified with respect to security properties, and for each successfully verified path a new model version is generated automatically. © 2014 Elsevier B.V.
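The delta computation can be sketched as enumerating the cross product of the alternatives each annotation offers, yielding one candidate model version per evolution path. This is a simplified stand-in, assuming flat key-value models and an invented `evolution_paths` helper; the actual tool operates on annotated UML models.

```python
from itertools import product

def evolution_paths(base: dict, annotations: dict) -> list:
    """annotations maps a model element to its list of alternative
    evolutions; the returned 'delta' holds every combination."""
    keys = list(annotations)
    paths = []
    for combo in product(*(annotations[k] for k in keys)):
        version = dict(base)
        version.update(zip(keys, combo))
        paths.append(version)
    return paths

base = {"auth": "password", "transport": "http"}
ann = {"transport": ["https", "vpn"], "auth": ["password", "2fa"]}
versions = evolution_paths(base, ann)
assert len(versions) == 4  # 2 transport alternatives x 2 auth alternatives

# "Verification" step: keep only paths satisfying a security property.
secure = [v for v in versions if v["transport"] == "https" and v["auth"] == "2fa"]
assert len(secure) == 1
```

Each surviving path would then, as in the paper, be materialized as a new model version automatically.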

Wenzel S.,TU Dortmund | Wessel C.,TU Dortmund | Humberg T.,Fraunhofer ISST | Jurjens J.,TU Dortmund
CLOSER 2012 - Proceedings of the 2nd International Conference on Cloud Computing and Services Science | Year: 2012

Cloud computing is one of the leading developments in web technologies. It offers a convenient way of using shared and easily accessible resources, in both a web-based and a demand-oriented sense. However, cloud computing brings inherent risks, e.g. the risk of private data becoming publicly available. Outsourcing services into a cloud computing environment raises numerous compliance and security problems for the potential customer: legal as well as business requirements have to be met after migration to a cloud environment, and compliance with laws, industry-specific regulations, and other rules has to be maintained. In this paper we present the research project SecureClouds and our ongoing research on the security and compliance analysis of processes that are to be outsourced into the cloud. We further show a first prototype of an analytic tool environment that allows us to examine whether outsourcing of a business process is possible while keeping all security and compliance requirements.

Humberg T.,Fraunhofer ISST | Wessel C.,TU Dortmund | Poggenpohl D.,Fraunhofer ISST | Wenzel S.,TU Dortmund | And 2 more authors.
CLOSER 2013 - Proceedings of the 3rd International Conference on Cloud Computing and Services Science | Year: 2013

Despite its significant potential benefits, the concept of Cloud Computing is still regarded with skepticism in most companies. One of the main obstacles is posed by concerns about the systems' security and compliance issues. Examining system and process models for compliance manually is time-consuming and error-prone, in particular due to the sheer extent of potentially relevant sources of security and compliance concerns that have to be considered. This paper proposes techniques to ease these problems by providing support in identifying relevant aspects, as well as suggesting possible methods (from an existing pool) to actually check a given model. We developed a two-step approach: first, we build an ontology to formalize rules from relevant standards, augmented with additional semantic information. This ontology is then utilized in the analysis of an actual model of a system or a business process in order to detect possible compliance obligations.
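The two-step approach can be sketched with a toy "ontology" that maps concepts from standards to obligations, which annotated process elements then inherit. The concepts and obligation strings below are invented examples, not actual clauses of any standard, and the real ontology carries richer semantic information.

```python
# Step 1 (offline): formalized rules -- concept -> compliance obligations.
ONTOLOGY = {
    "personal_data": ["requires:encryption_at_rest", "requires:access_log"],
    "payment_data": ["requires:pci_dss_check"],
}

def obligations_for(process: list) -> dict:
    """Step 2 (analysis): collect obligations for each activity based on
    the concepts its data objects are annotated with."""
    result = {}
    for activity in process:
        obs = []
        for concept in activity.get("concepts", []):
            obs.extend(ONTOLOGY.get(concept, []))
        if obs:
            result[activity["name"]] = obs
    return result

process = [
    {"name": "store_customer", "concepts": ["personal_data"]},
    {"name": "send_newsletter", "concepts": []},
]
found = obligations_for(process)
assert "store_customer" in found
assert "send_newsletter" not in found
```

Activities without compliance-relevant annotations simply produce no obligations, which keeps the analyst's attention on the elements that matter.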

Apfelbeck C.,Custom Solution Development | Fritz M.,Custom Solution Development | Jurjens J.,Fraunhofer ISST | Jurjens J.,TU Dortmund | Zweihoff J.,TU Dortmund
Proceedings - International Computer Software and Applications Conference | Year: 2015

In this paper, we develop an approach, based on Petri nets, to preserve the validity of executable batch-job specifications during changes at run-time. The approach in particular supports changing batch-job specifications while they are being executed, which makes it particularly important to ensure that a change preserves the critical properties. The approach supports verification of the batch-job specifications that are subject to change against these properties, as well as correction of those batch-job specifications that become invalid through the change. The developed approach was implemented and validated in an industrial application context. © 2015 IEEE.
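A minimal Petri-net sketch of the validity idea: a batch-job specification is a net, and a change is treated as "valid" here if the final place stays reachable from the initial marking. The data structures and the reachability bound are invented for illustration; the paper's property checks are more elaborate.

```python
def enabled(marking, pre):
    """A transition is enabled if every input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Fire a transition: consume input tokens, produce output tokens."""
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

def reachable(net, marking, goal_place, depth=10):
    """Bounded search: can any reachable marking put a token in goal_place?"""
    frontier = [marking]
    for _ in range(depth):
        nxt = []
        for m in frontier:
            if m.get(goal_place, 0) > 0:
                return True
            for pre, post in net:
                if enabled(m, pre):
                    nxt.append(fire(m, pre, post))
        frontier = nxt
    return False

# Batch job: start -> extract -> load -> done
net = [({"start": 1}, {"extract": 1}),
       ({"extract": 1}, {"load": 1}),
       ({"load": 1}, {"done": 1})]
assert reachable(net, {"start": 1}, "done")

# A change that drops the middle step invalidates the specification.
broken = [net[0], net[2]]
assert not reachable(broken, {"start": 1}, "done")
```

A run-time change would be accepted only if the modified net still passes such a check; otherwise the correction step kicks in.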

Klafft M.,Fraunhofer ISST | Meissen U.,Fraunhofer ISST
8th International Conference on Information Systems for Crisis Response and Management: From Early-Warning Systems to Preparedness and Training, ISCRAM 2011 | Year: 2011

As of today, investments into early warning systems are, to a large extent, politically motivated and "disaster-driven". This means that investments tend to increase significantly when a disaster strikes, but are often quickly reduced in the following disaster-free years. Such investment patterns make the continuous operation, maintenance, and development of the early warning infrastructure a challenging task and may lead to sub-optimal investment decisions. The paper presented here proposes an economic assessment model for the tangible economic impact of early warning systems. The model places a focus on the false-alert problem and goes beyond previous approaches by incorporating some socio-cultural factors (qualitatively estimated for now). In doing so, it supports policymakers (but also private investors) in their investment decisions related to early warning applications.
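The core trade-off (expected avoided damage versus operating and false-alert costs) can be written as a back-of-the-envelope calculation. The formula and all figures below are illustrative assumptions, not the paper's actual model, which additionally weighs socio-cultural factors.

```python
def expected_annual_net_benefit(p_event, avoided_damage,
                                operating_cost,
                                false_alerts_per_year, cost_per_false_alert):
    """Tangible net benefit of running the warning system for one year:
    expected avoided damage minus operating and false-alert costs."""
    benefit = p_event * avoided_damage
    false_alert_cost = false_alerts_per_year * cost_per_false_alert
    return benefit - operating_cost - false_alert_cost

# A system that averts 800k in damages in 25% of years, costs 50k to run,
# and triggers 3 false alerts a year at 10k each:
net = expected_annual_net_benefit(0.25, 800_000, 50_000, 3, 10_000)
assert net == 120_000  # 200k expected benefit - 50k - 30k
```

The false-alert term makes the investment argument sensitive to alert quality, which is exactly why the paper singles it out.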

Meister S.,Fraunhofer ISST | Koch O.,Fraunhofer ISST
IFAC Proceedings Volumes (IFAC-PapersOnline) | Year: 2012

Today's central issues in healthcare provision make it imperative to develop new concepts that reduce emerging costs and ensure high quality standards. Applying information and communication technology (ICT), and especially telemedicine - technologies that offer the chance to optimize medical data transfer - is regarded as a promising strategy when developing cost-saving concepts. As a result, physicians, as recipients of medical data, are confronted with a growing amount of information, known as information overload. Therefore, information has to be transported according to the principles of information logistics (ILOG). This paper presents preliminary results of a new approach that uses complex event processing (CEP) as a vehicle for information-logistics processing, termed the Telemedical ILOG Listener (TIL), which uses context to specify the user's information need. Every telemedical value, such as blood pressure, has to be described as a telemedical event on the basis of HL7 V3, a widespread international standard for data exchange in the healthcare sector. We define a message type which is able to include the medical data, the data necessary for CEP, the context, and finally the data to represent the ILOG dimensions, so that it can be processed by a TIL. © 2012 IFAC.
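The listener idea can be sketched as a single CEP-style rule: each telemedical event carries a measured value plus context, and the listener forwards only events matching the recipient's information need. The event fields and the `til_filter` function are simplified stand-ins for the HL7 V3 message type the paper defines.

```python
def til_filter(events, rule):
    """Forward only events whose type matches the rule and whose value
    crosses the rule's threshold -- the physician sees relevant readings
    instead of the full stream (information-logistics principle)."""
    return [e for e in events
            if e["type"] == rule["type"] and e["value"] >= rule["threshold"]]

events = [
    {"type": "blood_pressure_systolic", "value": 120, "patient": "p1"},
    {"type": "blood_pressure_systolic", "value": 185, "patient": "p1"},
    {"type": "heart_rate", "value": 90, "patient": "p1"},
]
rule = {"type": "blood_pressure_systolic", "threshold": 160}

alerts = til_filter(events, rule)
assert len(alerts) == 1 and alerts[0]["value"] == 185
```

Of three incoming events, only the critical blood-pressure reading passes the filter; a real TIL would additionally evaluate context (e.g. patient history, recipient role) rather than a single threshold.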
