Schreiber M., Research Center for Information Technology | Knoppel C., Daimler AG | Franke U., Daimler AG
IEEE Intelligent Vehicles Symposium, Proceedings | Year: 2013

Precise and robust localization in real-world traffic scenarios is a new challenge arising in the context of autonomous driving and future driver assistance systems. The required precision is in the range of a few centimeters. In urban areas this precision cannot be achieved by standard global navigation satellite systems (GNSS). Our novel approach achieves this requirement using a stereo camera system and a highly accurate map containing curbs and road markings. The maps are created beforehand using an extended sensor setup. The GNSS position is used for initialization only and is not required during the localization process. In this paper we present the localization process and provide an evaluation on a test track under known conditions as well as a long-term evaluation on approximately 50 km of rural roads, where centimeter-range precision is achieved. © 2013 IEEE.
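The abstract gives no implementation detail, but one core ingredient of map-relative localization is aligning road features detected in the camera images with a prior map. The following is a minimal, illustrative Python/NumPy sketch of such a 2D point-to-map alignment (an ICP-style refinement of an initial pose guess); the function name, parameters, and toy data are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def align_to_map(detections, map_points, iters=20):
    """Illustrative 2D ICP-style alignment of detected road-feature points
    to points from a prior map, starting from an initial pose guess."""
    R_total, t_total = np.eye(2), np.zeros(2)
    src = detections.astype(float).copy()
    for _ in range(iters):
        # 1. associate each detection with its nearest map point
        d = np.linalg.norm(src[:, None, :] - map_points[None, :, :], axis=2)
        nn = map_points[d.argmin(axis=1)]
        # 2. closed-form rigid alignment of the matched pairs (Kabsch)
        mu_s, mu_m = src.mean(axis=0), nn.mean(axis=0)
        H = (src - mu_s).T @ (nn - mu_m)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:      # guard against a reflection solution
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_m - R @ mu_s
        # 3. apply and accumulate the incremental correction
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total  # correction to the initial pose estimate

# Toy example: detections offset from the map by a small translation
map_pts = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.1], [3.0, 0.1]])
dets = map_pts + np.array([0.3, -0.2])
print(align_to_map(dets, map_pts))
```

With a good initial guess (the GNSS-based initialization mentioned in the abstract), only such a local refinement is needed; the paper's actual pipeline additionally relies on stereo detection of curbs and road markings against a highly accurate prior map.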


Filipova-Neumann L., Research Center for Information Technology | Welzel P., University of Augsburg
Telematics and Informatics | Year: 2010

Monitoring and recording driving behavior has recently become technologically feasible, which allows inference of drivers' risk types. We examine the effects of such technologies in automobile insurance markets with adverse selection, for both perfect competition and monopoly. Specifically, we assume that insurers can offer a contract with access to recorded information ex post, i.e., after an accident, in addition to the usual second-best contracts. We find that this leads to a Pareto improvement of social welfare, except when high risks initially received an information rent. Regulation can be used to establish a Pareto improvement in these cases as well. Explicit consideration of the privacy concerns of insurees does not alter our positive welfare results. © 2010 Elsevier Ltd. All rights reserved.


Muller L., Research Center for Information Technology
UbiComp 2013 Adjunct - Adjunct Publication of the 2013 ACM Conference on Ubiquitous Computing | Year: 2013

Reflection on daily work practices can support informal learning and the continuous improvement of work practices. This dissertation aims at supporting reflection by employing sensors and corresponding data visualizations to make employees ask the right questions about their work. Two tools have been developed, and initial studies have been conducted to evaluate the impact of psychophysiological sensors and proximity sensing for employees in the healthcare domain. The main contribution of this work is the connection of reflective learning and wearable sensors with the goal of persuading employees to reflect. The resulting tools will be evaluated in real work settings.


Traverso-Ribon I., Research Center for Information Technology
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2015

Precisely determining semantic similarity between entities has become a building block for data mining tasks, and existing approaches tackle this problem mainly by considering ontology-based annotations to decide relatedness. Nevertheless, because semantic similarity measures usually rely on the ontology class hierarchy and blindly treat ontology facts, they may erroneously assign high values of similarity to dissimilar entities. We propose ColorSim, a similarity measure that considers the semantics of OWL 2 annotations, e.g., relationship types, as well as implicit facts and how they are inferred, to accurately compute the relatedness of two ontology-annotated entities. We compare ColorSim with state-of-the-art approaches and report on preliminary experimental results that suggest the benefits of exploiting knowledge encoded in the ontologies to measure similarity. © Springer International Publishing Switzerland 2015.
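ColorSim itself is not specified in the abstract. As a generic illustration of the underlying idea that relatedness should depend on the relationship types through which entities are annotated, rather than treating all facts alike, the following Python sketch computes a weighted Jaccard similarity over typed annotation sets; the relationship types, weights, and example entities are invented assumptions, not the paper's measure.

```python
# Hypothetical relationship-type weights; not taken from the paper.
TYPE_WEIGHTS = {"subClassOf": 1.0, "partOf": 0.8, "associatedWith": 0.4}

def typed_similarity(a, b, weights=TYPE_WEIGHTS, default_weight=0.2):
    """Weighted Jaccard similarity over typed annotation sets.

    a, b: dicts mapping a relationship type to the set of ontology classes
    an entity is annotated with via that type. Annotations shared through
    'stronger' relationship types contribute more to the score.
    """
    num = den = 0.0
    for rel in set(a) | set(b):
        w = weights.get(rel, default_weight)
        sa, sb = a.get(rel, set()), b.get(rel, set())
        num += w * len(sa & sb)
        den += w * len(sa | sb)
    return num / den if den else 0.0

# Two entities annotated with overlapping classes via different relationship types
e1 = {"subClassOf": {"Kinase", "Enzyme"}, "partOf": {"Cytoplasm"}}
e2 = {"subClassOf": {"Kinase"}, "associatedWith": {"Cytoplasm"}}
print(typed_similarity(e1, e2))
```

A measure of this shape assigns lower similarity to entities that share annotations only through weak relationship types, the kind of distinction the abstract argues is lost when measures rely on the class hierarchy alone and treat all facts blindly.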


Bock J., Research Center for Information Technology | Hettenhausen J., Griffith University
Information Sciences | Year: 2012

Particle swarm optimisation (PSO) is a biologically inspired, population-based optimisation technique that has been successfully applied to various problems in science and engineering. In the context of semantic technologies, optimisation problems also occur but have rarely been treated as such. This work addresses the problem of ontology alignment, which is the identification of overlaps in the heterogeneous knowledge bases backing semantic applications. To this end, the ontology alignment problem is recast as an optimisation problem. A discrete particle swarm optimisation algorithm is designed to solve this optimisation problem and compute an alignment of two ontologies. A number of characteristics of traditional PSO algorithms are partially relaxed in this article, such as the fixed dimensionality of particles. A complex fitness function based on similarity measures of ontological entities, as well as a tailored particle update procedure, are presented. This approach brings several benefits for solving the ontology alignment problem, such as inherent parallelisation, anytime behaviour, and flexibility according to the characteristics of particular ontologies. The presented algorithm has been implemented under the name MapPSO (ontology mapping using particle swarm optimisation). Experiments demonstrate that applying PSO in the context of ontology alignment is a feasible approach. © 2010 Elsevier Inc. All rights reserved.
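As a self-contained illustration of the general idea (particles as variable-size sets of candidate correspondences, updated by probabilistically adopting correspondences from personal and global bests), the following Python sketch runs a toy discrete PSO over class labels. It is not the MapPSO algorithm: the fitness here is a simple average of label similarities rather than the article's combined similarity measures, and all names and parameters are illustrative assumptions.

```python
import random
from difflib import SequenceMatcher

def label_sim(a, b):
    """Toy string similarity standing in for richer entity similarity measures."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def fitness(alignment, labels_a, labels_b):
    """Average similarity of the selected correspondences (toy fitness)."""
    if not alignment:
        return 0.0
    return sum(label_sim(labels_a[i], labels_b[j]) for i, j in alignment) / len(alignment)

def discrete_pso_alignment(labels_a, labels_b, n_particles=10, iters=50, seed=0):
    rng = random.Random(seed)
    candidates = [(i, j) for i in range(len(labels_a)) for j in range(len(labels_b))]
    # each particle is a variable-size set of candidate correspondences
    particles = [set(rng.sample(candidates, k=min(len(labels_a), len(labels_b))))
                 for _ in range(n_particles)]
    pbest = [set(p) for p in particles]
    gbest = max(pbest, key=lambda p: fitness(p, labels_a, labels_b))
    for _ in range(iters):
        for k, p in enumerate(particles):
            new = set(p)
            # "velocity": probabilistically adopt correspondences from the bests ...
            for guide in (pbest[k], gbest):
                for corr in guide:
                    if rng.random() < 0.3:
                        new.add(corr)
            # ... and occasionally drop one for exploration
            if new and rng.random() < 0.3:
                new.discard(rng.choice(sorted(new)))
            particles[k] = new
            if fitness(new, labels_a, labels_b) > fitness(pbest[k], labels_a, labels_b):
                pbest[k] = set(new)
                if fitness(new, labels_a, labels_b) > fitness(gbest, labels_a, labels_b):
                    gbest = set(new)
    return gbest

# Tiny example: class labels from two ontologies
onto_a = ["Person", "Publication", "Conference"]
onto_b = ["Human", "Paper", "ConferenceEvent"]
print(discrete_pso_alignment(onto_a, onto_b))
```

Because each particle carries its own set of correspondences, the number of correspondences (the particle's dimensionality) can differ between particles and iterations, mirroring the relaxation of fixed dimensionality mentioned in the abstract; the best alignment found so far is always available, which is what gives the approach its anytime behaviour.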
