Kolb S., University of Bamberg |
Rock C., Senacor Technologies AG
Proceedings - 2016 IEEE World Congress on Services, SERVICES 2016 | Year: 2016
Since its early stages, cloud computing has evolved from a mere source of computing resources into a fully fledged alternative for rapid application deployment. The Platform as a Service model in particular facilitates the hosting of scalable applications in the cloud by providing managed and highly automated application environments. Although most offerings are conceptually comparable, the interfaces for application deployment and management vary greatly between vendors. Despite providing similar functionality, technically different workflows and commands provoke vendor lock-in and hinder portability as well as interoperability. To that end, we present a unified interface for application deployment and management across cloud platforms. We validate our proposal with a reference implementation targeting four leading cloud platforms. The results show the feasibility of our approach and promote the possibility of portable DevOps scenarios in PaaS environments. © 2016 IEEE.
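The unified-interface idea can be illustrated with a minimal sketch: a common client delegates deployment calls to provider-specific adapters, so vendor differences stay behind one interface. All class and method names here (`PaasAdapter`, `UnifiedPaasClient`, the in-memory stand-in) are hypothetical illustrations, not taken from the paper's reference implementation.

```python
from abc import ABC, abstractmethod


class PaasAdapter(ABC):
    """Provider-specific adapter behind a unified deployment interface."""

    @abstractmethod
    def deploy(self, app_name: str, archive: str) -> str: ...

    @abstractmethod
    def scale(self, app_name: str, instances: int) -> int: ...


class InMemoryAdapter(PaasAdapter):
    """Stand-in adapter that records calls instead of talking to a real platform."""

    def __init__(self):
        self.apps = {}

    def deploy(self, app_name, archive):
        self.apps[app_name] = {"archive": archive, "instances": 1}
        return f"{app_name} deployed"

    def scale(self, app_name, instances):
        self.apps[app_name]["instances"] = instances
        return instances


class UnifiedPaasClient:
    """Single entry point; vendor-specific workflows are hidden in the adapter."""

    def __init__(self, adapter: PaasAdapter):
        self.adapter = adapter

    def deploy(self, app_name, archive):
        return self.adapter.deploy(app_name, archive)

    def scale(self, app_name, instances):
        return self.adapter.scale(app_name, instances)
```

Swapping one adapter for another would leave application code untouched, which is the portability property the paper targets.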
Schonberger A., University of Bamberg |
Schwalb J., Senacor Technologies AG |
Wirtz G., University of Bamberg
Proceedings - 2011 IEEE 9th International Conference on Web Services, ICWS 2011 | Year: 2011
Recently, the Web Services Interoperability Organization (WS-I) announced the completion of its interoperability standards work. The latest deliverables include the so-called "Basic Security Profile" and the "Reliable Secure Profile". This raises the question of whether Web Services adopters can rely on the interoperability of Web Services stacks, in particular with respect to security and reliability features. To answer this question, we thoroughly analyze two important Web Services stacks for interoperability of their WS-Security and WS-ReliableMessaging features. Our analysis shows that security and reliability features are far from being implemented in an interoperable manner. Additionally, we reveal that some of these interoperability problems are not even covered by WS-I profiles and therefore conclude that WS-I's work has not yet resulted in Web Services interoperability. © 2011 IEEE.
Zimmermann S., University of Innsbruck |
Katzmarzik A., Senacor Technologies AG |
Kundisch D., University of Paderborn
Data Base for Advances in Information Systems | Year: 2012
Global sourcing of Information Technology (IT) work has become a widely accepted practice among transnational corporations. Most of the big IT Services Providers (ITSPs) maintain a portfolio of globally distributed delivery centers and have to decide on the assignment of specific software development projects to their available sites. ITSPs have to consider expected costs, risks, and interdependencies between projects and sites when making value-based sourcing decisions. However, value-based decision approaches that are both well founded in theory and applicable in practice have so far been missing from the Information Systems literature. As decision making with respect to the construction of portfolios of risky financial assets exhibits characteristics similar to value-based sourcing decision making, we base our approach on Modern Portfolio Theory. This paper makes two contributions in this context: (1) It provides a conceptual foundation for the application of Modern Portfolio Theory within the scope of global sourcing of software development projects by ITSPs. To this end, we adapt Modern Portfolio Theory to ensure an optimal and full allocation of given software development projects to available sites. Our model considers site/project combinations as risky assets, assumes discrete portfolio shares, and factors in transaction costs as well as dependencies between both projects and sites. (2) It is the first to actually apply Modern Portfolio Theory to a real-world business case. We illustrate that using our model leads to considerably different project allocations across the available delivery centers of our case company as well as to substantially lower costs of the sourcing portfolio.
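The mean-variance idea behind such an allocation can be sketched in a few lines. The cost and variance figures below are purely illustrative, and the sketch deliberately ignores covariances, transaction costs, and project/site dependencies, all of which the paper's full model accounts for.

```python
from itertools import product

# Hypothetical data: expected cost and cost variance (risk) of running
# each project at each delivery site. Numbers are illustrative only.
projects = ["P1", "P2"]
sites = ["SiteA", "SiteB"]
exp_cost = {("P1", "SiteA"): 100, ("P1", "SiteB"): 120,
            ("P2", "SiteA"): 90,  ("P2", "SiteB"): 80}
variance = {("P1", "SiteA"): 400, ("P1", "SiteB"): 100,
            ("P2", "SiteA"): 350, ("P2", "SiteB"): 120}
risk_aversion = 0.05  # weight of risk relative to expected cost


def portfolio_value(assignment):
    """Mean-variance objective: expected cost plus a risk penalty."""
    cost = sum(exp_cost[(p, s)] for p, s in assignment.items())
    risk = sum(variance[(p, s)] for p, s in assignment.items())
    return cost + risk_aversion * risk


def best_assignment():
    """Enumerate all discrete allocations (every project to exactly one site)
    and return the one minimizing the mean-variance objective."""
    best = None
    for combo in product(sites, repeat=len(projects)):
        assignment = dict(zip(projects, combo))
        value = portfolio_value(assignment)
        if best is None or value < best[1]:
            best = (assignment, value)
    return best
```

With these figures, the cheapest site per project in pure expected cost terms is not always chosen once risk enters the objective, which is exactly the point of a value-based (rather than cost-only) allocation.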
Roser S., Senacor Technologies AG |
Muller J.P., Clausthal University of Technology |
Bauer B., University of Augsburg
Information Systems and e-Business Management | Year: 2011
Our work aims at supporting the decision-making processes involved in the model-driven development of information technology (IT) solutions for cross-organizational business process (CBP) coordination and automation. The objective of the work described in this paper is to provide enterprise IT architects with an evaluation and decision model that enables the principled assessment and selection of an effective IT architecture paradigm (e.g. central broker, federated brokers, peer-to-peer) for a given cross-organizational business process coordination task. Our approach follows the principles of design science; the contribution of this paper is threefold: First, we present three common architectural patterns for (service-oriented) CBP coordination. Second, the core contribution is a new method for decision support that enables IT architects to derive and evaluate an appropriate architecture paradigm for a given use case or application domain. Third, the method is accompanied by a set of representative scenario descriptions that allow the evaluation and selection of appropriate IT system coordination architecture paradigms for CBP enactment, as well as a set of guidelines for how different contingencies influence IT system coordination architecture. © 2010 Springer-Verlag.
Gorz Q., University of Augsburg |
Kaiser M., Senacor Technologies AG
Lecture Notes in Business Information Processing | Year: 2012
Since insufficient data quality usually leads to wrong decisions and high costs, managing data quality is a prerequisite for the successful execution of business and decision processes. An economics-driven management of data quality requires efficient measurement procedures that allow for a predominantly automated identification of poor data quality. Against this background, the paper investigates how metrics for the data quality (DQ) dimensions completeness, validity, and currency can be aggregated to derive an indicator for accuracy. To this end, existing approaches for measuring these dimensions are analyzed in order to make explicit which metric addresses which aspect of data quality. Based on this analysis, an indicator function is designed that returns a measure for accuracy at different levels of a data resource. The indicator function's applicability is demonstrated using a customer database example. © 2012 Springer-Verlag Berlin Heidelberg.
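One simple way to combine the three dimension metrics into an accuracy indicator can be sketched as follows. The exponential currency model and the multiplicative aggregation are illustrative assumptions, not the paper's exact indicator function.

```python
import math


def completeness(values):
    """Share of non-missing values in an attribute."""
    return sum(v is not None for v in values) / len(values)


def validity(values, is_valid):
    """Share of the non-missing values that satisfy a validity predicate."""
    present = [v for v in values if v is not None]
    return sum(map(is_valid, present)) / len(present) if present else 0.0


def currency(ages, decline_rate):
    """Exponential decay model: estimated probability that a value of the
    given age (e.g. in years) is still up to date."""
    return sum(math.exp(-decline_rate * a) for a in ages) / len(ages)


def accuracy_indicator(c, v, t):
    """Multiplicative aggregation: a value is treated as accurate only if it
    is present, valid, and still current."""
    return c * v * t
```

On a customer table, for instance, the indicator could be computed per attribute (e.g. postal address) and then aggregated to tuple or relation level, matching the "different levels of a data resource" mentioned in the abstract.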
Emmersberger C., University of Regensburg |
Springer F., Senacor Technologies AG
DEBS 2013 - Proceedings of the 7th ACM International Conference on Distributed Event-Based Systems | Year: 2013
"Interesting applications rarely live in isolation." (, xxix) With this sentence G. Hohpe and B. Woolf start the introduction to their book Enterprise Integration Pattern: Designing, Building, and Deploying Messaging Solutions. While the statement is valid now for more than ten years, Gartner estimates today the cost increase targeting integration aspects for midsize to large companies at about 33% within the next three years (cf. ). The expected increase will be mainly driven by the integration of cloud services and mobile devices. Since event processing addresses clearly problems arising with the growth of computational distribution, particularly with the increasing number of mobile devices or cloud services, integration is a topic that needs to be addressed by event processing functionalities. One of the frameworks within the integration domain is Apache Camel. Since it's initial release in 2007, the framework has gained quite some attention - not only within the open-source arena. Apache Camel has a strong focus on enterprise application integration since it implements well known Enterprise Integration Patterns (EIP's) (cf. ). This work reveals the event processing capabilities of Apache Camel alongside a logistics parcel delivery process. The delivery process facilitates the scenario descriptions to exemplify the event processing functionalities within a real-world context. All coding examples, supporting the functionality demonstration, are setup around the shipment of parcels. Copyright © 2013 ACM.
Sebastian S., University of Mannheim |
Stefan B., Senacor Technologies AG
Proceedings - Pacific Asia Conference on Information Systems, PACIS 2012 | Year: 2012
In recent years, Software-as-a-Service has gained growing attention from software vendors as well as software customers. In this distribution and pricing model, software vendors are responsible for the operation and maintenance of solutions, and customers pay for the service in the form of continuous usage-based subscription fees. This study analyses the key characteristics of the Software-as-a-Service concept and evaluates their influence on the development processes of software vendors. Based on the literature, two defining characteristics (vendor-hosted, pay-per-usage) and five supportive characteristics (standardization, web technologies, multi-tenancy, fine granularity, continuous evolution) are identified. Two vendors of complex business applications, both applying a mix of deployment models, are analyzed to identify the impact of Software-as-a-Service on development processes as well as the driving forces behind the change. The results indicate that the concept especially affects the requirements engineering and operations phases. The combination of the defining characteristics results in increased cost as well as innovation pressure. Vendors are challenged to optimize and streamline existing practices to reduce internal costs. At the same time, vendors need to innovate more quickly. The combination of these aspects calls for a closer integration of internal activities such as development and operations, as well as increased customer orientation and integration.
Kramer T., University of Mannheim |
Eschweiler M., Senacor Technologies AG
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2013
This paper addresses the decision-making problem in software development outsourcing scenarios in which a project manager has to decide which software components will be outsourced and which ones will be developed internally. To this end, we propose a methodology and tool support that leverage the classification of a project's software components by means of a graph-based model of the components' requirements and their corresponding clustering. In the course of our design-oriented research approach, a prototypical implementation of the methodology has been developed and evaluated. It illustrates the practical applicability of the proposed method. We thereby contribute to the location selection problem in distributed software projects and give guidance for in-house or external software production. The theoretical contribution consists of an improved processing methodology for assessing software requirements and increasing the outsourcing success of a software project. Our contribution to practice is an implemented prototype for project leads of distributed teams. © 2013 Springer-Verlag.
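The clustering step of such a graph-based model can be sketched as finding connected components of a requirement-dependency graph: requirements that depend on each other end up in the same cluster, and each cluster is a candidate component for the make-or-buy decision. This pure-stdlib version is an illustrative simplification, not the paper's actual algorithm.

```python
from collections import defaultdict, deque


def clusters(edges, nodes):
    """Return the connected components of an undirected requirement graph.

    edges: pairs of requirement IDs that depend on each other
    nodes: all requirement IDs (so isolated requirements form singletons)
    """
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)

    seen, result = set(), []
    for start in nodes:
        if start in seen:
            continue
        # Breadth-first search collects one component.
        component, queue = [], deque([start])
        seen.add(start)
        while queue:
            n = queue.popleft()
            component.append(n)
            for m in adj[n]:
                if m not in seen:
                    seen.add(m)
                    queue.append(m)
        result.append(sorted(component))
    return result
```

A project manager could then score each resulting cluster (e.g. by coupling to in-house systems or required domain knowledge) to decide which clusters are safe to outsource.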
Kaiser M., Senacor Technologies AG |
Ullrich C., Research Center
ECIS 2014 Proceedings - 22nd European Conference on Information Systems | Year: 2014
Information systems (IS) projects are notorious for severe cost overruns, which are often caused, among other things, by inaccurate ex-ante cost estimates. Against this background, this article presents a descriptive case study of an IS transformation program at a major German financial services provider. In this case study, a multi-stage cost estimation process, which was applied to 79 IS projects, is described, and the accuracy of the cost estimates of all IS projects is determined using different estimation accuracy measures: Estimating Quality Factor, Forecast Error, and Mean Absolute Percentage Error. Depending on the concrete accuracy measure used for the evaluation, the overall estimation quality of the program turns out to be good or at least average, which seems to contradict most studies in the scientific literature. The results thus reveal that the judgement of estimation accuracy depends on the accuracy measure chosen for the evaluation. These differing judgements are discussed from a management perspective.
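The three accuracy measures can be stated compactly. The EQF form below is one common point-estimate reading of DeMarco's Estimating Quality Factor (total actual cost over total absolute estimation error, higher is better) and may differ from the exact variant used in the case study.

```python
def forecast_error(estimate, actual):
    """Relative deviation of the estimate from the actual cost;
    negative means the project was underestimated."""
    return (estimate - actual) / actual


def mape(estimates, actuals):
    """Mean Absolute Percentage Error across a set of projects."""
    return sum(abs(e - a) / a for e, a in zip(estimates, actuals)) / len(actuals)


def eqf(estimates, actuals):
    """Estimating Quality Factor, point-estimate form: total actual cost
    divided by total absolute estimation error (higher is better)."""
    error = sum(abs(e - a) for e, a in zip(estimates, actuals))
    return float("inf") if error == 0 else sum(actuals) / error
```

Because MAPE normalizes each project's error individually while this EQF form normalizes the summed error by the summed actuals, the two measures can rank the same estimation process differently, which is one mechanism behind the "differing judgements" the abstract reports.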
Kaiser M., Senacor Technologies AG
Proceedings of the European, Mediterranean and Middle Eastern Conference on Information Systems: Global Information Systems Challenges in Management, EMCIS 2010 | Year: 2010
This paper analyses whether, and if so which, logical connections can be established between metrics for measuring different dimensions of Data Quality. The purpose is to infer, from the result of the metric for one dimension, the value ranges of the metrics for the other dimensions. To achieve this, we summarize and refine existing definitions and metrics for the three Data Quality dimensions completeness, consistency, and accuracy. Afterwards, we present logical considerations concerning the results of these metrics and their effects on the results of the metrics for the other dimensions. The result of this analysis is an order in which metrics for the three dimensions should be applied to a data resource. Finally, we state some limitations and discuss directions for further research.
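Under one set of illustrative definitions (not necessarily the paper's), such a logical connection can be made concrete: a missing value can be neither consistent nor accurate, and an accurate value also satisfies the consistency rules, so accuracy ≤ consistency ≤ completeness when all three are measured over the same resource. Measuring completeness first therefore bounds the possible results of the other two metrics.

```python
def completeness(values):
    """Share of values that are present (not missing)."""
    return sum(v is not None for v in values) / len(values)


def consistency(values, rules):
    """Share of values that are present AND satisfy every consistency rule."""
    ok = sum(v is not None and all(r(v) for r in rules) for v in values)
    return ok / len(values)


def accuracy(values, truth):
    """Share of values that match the (assumed known) real-world value."""
    return sum(v == t for v, t in zip(values, truth)) / len(values)
```

With these definitions the ordering accuracy ≤ consistency ≤ completeness holds whenever the true values themselves satisfy the rules, which is the kind of value-range implication the paper derives.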