Laboratoire d'Informatique de Grenoble

Sainte-Foy-lès-Lyon, France


Schaefer G., Loughborough University | Budnik M., Laboratoire d'Informatique de Grenoble | Krawczyk B., Wroclaw University of Technology
Proceedings of the 11th International Conference on Ubiquitous Information Management and Communication, IMCOM 2017 | Year: 2017

In this paper, we present an immersive image database navigation system. Images are visualised in a spherical visualisation space and arranged on a grid by colour, so that images of similar colour are located close to each other, while access to large image sets is provided through a hierarchical browsing structure. The user wears a 3-D head-mounted display (HMD) and is immersed inside the image sphere. Navigation is performed by head movement, using a 6-degree-of-freedom tracker integrated in the HMD in conjunction with a Wiimote remote control. © 2017 ACM.
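
The colour-based grid arrangement can be illustrated with a small sketch. The following Python fragment is an illustrative toy, not the system described above: the random stand-in images, the mean-hue sort key and the grid-to-sphere mapping are all assumptions.

    import colorsys
    import numpy as np

    def mean_hue(image_rgb):
        # Average hue of an (H, W, 3) RGB image with values in [0, 1].
        r, g, b = image_rgb.reshape(-1, 3).mean(axis=0)
        return colorsys.rgb_to_hsv(r, g, b)[0]

    def layout_on_grid(images, cols):
        # Sort by hue and assign (row, col) cells; rows and columns could then be
        # mapped to latitude/longitude on the visualisation sphere.
        order = sorted(range(len(images)), key=lambda i: mean_hue(images[i]))
        return {idx: divmod(pos, cols) for pos, idx in enumerate(order)}

    images = [np.random.rand(16, 16, 3) for _ in range(12)]  # stand-in thumbnails
    print(layout_on_grid(images, cols=4))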


Martinet J., Laboratoire d'Informatique Fondamentale de Lille | Chiaramella Y., Laboratoire d'Informatique de Grenoble | Mulhem P., Laboratoire d'Informatique de Grenoble
Information Processing and Management | Year: 2011

In this paper, we lay out a relational approach for indexing and retrieving photographs from a collection. The proliferation of digital image acquisition devices, combined with the growth of the World Wide Web, requires the development of information retrieval (IR) models and systems that provide fast access to images searched for by users in databases. The aim of our work is to develop an IR model suited to images, integrating rich semantics for representing this visual data and user queries, which can also be applied to large corpora. Our proposal merges the vector space model of IR - widely tested in textual IR - with the conceptual graph (CG) formalism, based on the use of star graphs (i.e. elementary CGs made up of a single relation connected to some concepts representing image objects). A novel weighting scheme for star graphs, based on image object size, position, and image heterogeneity, is outlined. We show that integrating relations into the vector space model through star graphs increases the system's precision, that the results are comparable to those of graph projection systems, and that query processing time is shortened. © 2010 Elsevier Ltd. All rights reserved.
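
As a rough illustration of the idea (not the paper's actual weighting scheme; the weight function and the relation triples below are invented), star graphs can be treated as index terms whose weight grows with the size and centrality of the image objects they involve, and then matched with the usual cosine measure:

    import math

    def star_weight(relative_area, distance_to_centre):
        # Hypothetical weight: larger and more central objects count more.
        return relative_area * (1.0 - 0.5 * distance_to_centre)

    def cosine(query_vec, image_vec):
        dot = sum(w * image_vec.get(t, 0.0) for t, w in query_vec.items())
        nq = math.sqrt(sum(w * w for w in query_vec.values()))
        ni = math.sqrt(sum(w * w for w in image_vec.values()))
        return dot / (nq * ni) if nq and ni else 0.0

    image = {("sky", "above", "sea"): star_weight(0.40, 0.2),   # star graphs as index terms
             ("boat", "on", "sea"): star_weight(0.10, 0.1)}
    query = {("boat", "on", "sea"): 1.0}
    print(round(cosine(query, image), 3))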


Kosmyna N., Laboratoire d'Informatique de Grenoble | Tarpin-Bernard F., Laboratoire d'Informatique de Grenoble | Rivet B., Grenoble Institute of Technology
ACM Transactions on Computer-Human Interaction | Year: 2015

Using Brain-Computer Interfaces (BCIs) as a control modality for games is popular. However, BCIs require training before play, which is detrimental to immersion and player experience in the game. For this reason, we propose explicitly integrating the training protocol into the game by modifying the game environment so as to enforce synchronicity with the BCI system and to provide appropriate instructions to the user. We then conceal this synchronicity within the game mechanics, using priming to mask the training instructions (implicit stimuli). We evaluate the effects on game experience, compared to standard BCI training, with 36 subjects, using the Game Experience Questionnaire (GEQ) coupled with reliability analysis (Cronbach's alpha). The integration does not change the feeling of competence (3/4). However, flow and immersion increase considerably with explicit training integration (2.78/4 and 2.67/4, up from 1.79/4 and 1.52/4) and even more with implicit training integration (3.27/4 and 3.12/4). © 2015 ACM 1073-0516/2015/10-ART26 $15.00.
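
For reference, the reliability measure mentioned above, Cronbach's alpha, can be computed from per-item answers of a questionnaire scale as in the short sketch below; the toy answer matrix is invented, and this is the standard formula, not code from the study.

    import numpy as np

    def cronbach_alpha(scores):
        # scores: (n_subjects, n_items) answers for one questionnaire dimension.
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]
        item_var = scores.var(axis=0, ddof=1).sum()    # sum of item variances
        total_var = scores.sum(axis=1).var(ddof=1)     # variance of the total scores
        return k / (k - 1) * (1.0 - item_var / total_var)

    answers = [[3, 4, 3], [2, 3, 2], [4, 4, 3], [3, 3, 4]]   # fake answers on a 0-4 scale
    print(round(cronbach_alpha(answers), 2))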


Ohliger M., Free University of Berlin | Ohliger M., University of Potsdam | Nesme V., Free University of Berlin | Nesme V., Laboratoire d'Informatique de Grenoble | Eisert J., Free University of Berlin
New Journal of Physics | Year: 2013

We present a novel method for performing quantum state tomography of many-particle systems, which is particularly suitable for estimating states in lattice systems such as ultra-cold atoms in optical lattices. We show that the need to measure a tomographically complete set of observables can be overcome by letting the state evolve under suitably chosen random circuits, followed by the measurement of a single observable. We generalize known results about the approximation of unitary two-designs, i.e. certain classes of random unitary matrices, by random quantum circuits, and connect our findings to the theory of quantum compressed sensing. We show that for ultra-cold atoms in optical lattices, established experimental techniques such as optical super-lattices, laser speckles and time-of-flight measurements are sufficient to perform fully certified, assumption-free tomography. This is possible without the need to address single sites in any step of the procedure. Combining our approach with tensor network methods - in particular, the theory of matrix product states - we identify situations where the effort of reconstruction is even constant in the number of lattice sites, allowing, in principle, tomography to be performed on large-scale systems readily available in present experiments. © IOP Publishing and Deutsche Physikalische Gesellschaft.
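
The measurement scheme - evolve under a random circuit of two-qubit gates, then read out a single fixed observable - can be mimicked numerically on a few qubits. The toy below (plain numpy/scipy, with an invented circuit depth and observable) only shows the mechanics and is nowhere near the certified protocol of the paper.

    import numpy as np
    from scipy.stats import unitary_group

    n = 4                                    # qubits / lattice sites
    psi = np.zeros(2 ** n, dtype=complex)    # "unknown" state, here |0...0>
    psi[0] = 1.0

    def apply_two_qubit(state, u, i):
        # Apply a 4x4 unitary u to neighbouring qubits (i, i+1) of an n-qubit state.
        t = np.moveaxis(state.reshape([2] * n), [i, i + 1], [0, 1]).reshape(4, -1)
        t = (u @ t).reshape([2, 2] + [2] * (n - 2))
        return np.moveaxis(t, [0, 1], [i, i + 1]).reshape(-1)

    for _ in range(5):                       # a few layers of random nearest-neighbour gates
        for i in range(n - 1):
            psi = apply_two_qubit(psi, unitary_group.rvs(4), i)

    z0 = np.kron(np.diag([1.0, -1.0]), np.eye(2 ** (n - 1)))   # single observable: Z on site 0
    print(np.real(psi.conj() @ z0 @ psi))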


Pellier D., University of Paris Descartes | Fiorino H., Laboratoire d'Informatique de Grenoble | Metivier M., University of Paris Descartes
12th International Conference on Autonomous Agents and Multiagent Systems 2013, AAMAS 2013 | Year: 2013

Devising intelligent robots or agents that interact with humans is a major challenge for artificial intelligence. In such contexts, agents must constantly adapt their decisions to human activities and modify their goals. In this extended abstract, we present a novel continual planning approach, called Moving Goal Planning (MGP), to adapt plans to goal evolution. This approach draws inspiration from Moving Target Search (MTS) algorithms. In order to limit the number of search iterations and to improve its efficiency, MGP delays as much as possible the start of new searches when the goal changes over time. For this purpose, MGP uses two strategies: Open Check (OC), which checks whether the new goal is still in the current search tree, and Plan Follow (PF), which estimates whether executing actions of the current plan brings MGP closer to the new goal. Copyright © 2013, International Foundation for Autonomous Agents and Multiagent Systems (www.ifaamas.org). All rights reserved.
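
A minimal runnable caricature of the two strategies is sketched below on a one-dimensional world, where states are integers, the "search tree" is simply the set of states the last search expanded, and the heuristic is the distance to the goal; all of this is invented for illustration and is not the authors' implementation.

    def search(state, goal):
        # Trivial planner: straight-line plan plus the set of states it expanded.
        step = 1 if goal >= state else -1
        plan = list(range(state + step, goal + step, step))
        return plan, set(plan) | {state}

    def mgp(start, observed_goals):
        state, goal = start, observed_goals[0]
        plan, tree = search(state, goal)
        trace = [state]
        for new_goal in observed_goals:
            if new_goal != goal:
                if new_goal in tree:                                  # Open Check: the old search already
                    plan, _ = search(state, new_goal)                 # covers the new goal, so extract a plan
                elif plan and abs(plan[0] - new_goal) < abs(state - new_goal):
                    pass                                              # Plan Follow: current plan still helps
                else:
                    plan, tree = search(state, new_goal)              # otherwise launch a new search
                goal = new_goal
            if not plan and state != goal:
                plan, tree = search(state, goal)
            if plan:
                state = plan.pop(0)                                   # execute the next action
                trace.append(state)
        return trace

    print(mgp(0, [5, 5, 7, 7, 3, 3, 3]))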


Moreno-Garcia D., Laboratoire d'Informatique de Grenoble | Estublier J., Laboratoire d'Informatique de Grenoble
Proceedings - 2012 IEEE 9th International Conference on Services Computing, SCC 2012 | Year: 2012

Service-based software applications, such as pervasive and ubiquitous ones, are increasingly embedded in our daily lives, integrating smart communicating devices. Usually, changes in the execution context of these applications occur unpredictably over time, such as dynamic variations in the availability of the services and devices used, or in the user's location and needs. This unpredictable variability in execution contexts makes it impossible to know at design time the exact conditions under which these applications will be used and the services that will be most suitable at a given time. Therefore, the architecture of such applications cannot be fully defined at design time. These applications must be defined in abstract and flexible ways, allowing incremental composition and dynamic adaptation to their execution context at runtime. In this paper, we present a model-driven approach for designing, developing, executing and managing service-based applications. At design time, an application is mainly defined by its requirements and goals. The application definition can be extended to add specific functional or non-functional concerns, such as dynamic adaptation, deployment or distribution. At development time, the application can be automatically and incrementally composed, ensuring its consistency with respect to its definition. At runtime, the application execution is supported and controlled by our runtime environment. © 2012 IEEE.
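
A minimal sketch of the underlying idea - the application is defined only by the capabilities it requires, and concrete services are (re)selected at runtime as availability changes - is given below; the class names and the trivial selection policy are placeholders, not the authors' framework.

    class Service:
        def __init__(self, name, capability):
            self.name, self.capability = name, capability

    class Application:
        def __init__(self, required_capabilities):
            self.required = required_capabilities     # abstract, design-time definition (goals)
            self.bound = {}                           # capability -> concrete service, set at runtime

        def resolve(self, available_services):
            # (Re)bind each required capability to whatever service is currently available.
            for cap in self.required:
                candidates = [s for s in available_services if s.capability == cap]
                self.bound[cap] = candidates[0] if candidates else None
            return {c: (s.name if s else None) for c, s in self.bound.items()}

    app = Application(["display", "location"])
    print(app.resolve([Service("WallScreen", "display"), Service("GPS", "location")]))
    print(app.resolve([Service("PhoneScreen", "display")]))   # context change: GPS no longer available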


Saint-Marc C., Laboratoire d'Informatique de Grenoble | Davoine P.-A., Laboratoire d'Informatique de Grenoble | Villanova-Oliver M., Laboratoire d'Informatique de Grenoble
Journal of Maps | Year: 2014

This article describes a number of issues encountered when developing maps of past natural phenomena in the field of volcanology. In order to enable experts to exploit geographical data related to this topic, maps showing the temporal chronology of such events are required. Developing useful maps is made more complex by the coexistence of numerous phenomena in the same geographic space over time, which entails managing spatial overlays, and by the difficulty of integrating temporal information in static maps. In this article, we present our approach to overlaying and temporally ordering natural phenomena presented as information in maps, using the example of lava flow data. These data are derived from a case study of volcanic hazard affecting La Réunion Island in the Indian Ocean. We explore different methods for mapping evolution over time, for instance 'map collections' or 'small multiple maps', the use of colour (hue and saturation) to represent the dates of events, and the use of semi-transparency to preserve the representation of past events overlaid one on the other. Legibility and effectiveness of the map were prime concerns in this exploratory analysis. © 2014 Cécile Saint-Marc.
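
The "colour-by-date plus semi-transparency" idea can be prototyped in a few lines; the rectangles below are synthetic stand-ins for lava-flow outlines, and the years, colour map and layout are invented for illustration rather than taken from the published maps.

    import matplotlib.pyplot as plt
    import matplotlib.patches as patches
    from matplotlib import cm

    flows = [(1977, (0.0, 0.0, 3.0, 2.0)),     # (year, (x, y, width, height)) fake outlines
             (1998, (1.0, 0.5, 3.0, 2.0)),
             (2007, (2.0, 1.0, 3.0, 2.0))]
    norm = plt.Normalize(min(y for y, _ in flows), max(y for y, _ in flows))

    fig, ax = plt.subplots()
    for year, (x, y, w, h) in flows:           # older flows are drawn first, newer ones on top
        ax.add_patch(patches.Rectangle((x, y), w, h,
                                       facecolor=cm.viridis(norm(year)),
                                       alpha=0.6,          # semi-transparency keeps older flows legible
                                       edgecolor="black", label=str(year)))
    ax.set_xlim(-0.5, 5.5)
    ax.set_ylim(-0.5, 3.5)
    ax.legend(title="Eruption year")
    plt.show()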


Chaabani M., University of Boumerdès | Echahed R., Laboratoire d'Informatique de Grenoble | Strecker M., Toulouse 1 University Capitole
CEUR Workshop Proceedings | Year: 2013

This paper is about transformations of knowledge bases by means of an imperative programming language that is non-standard in the sense that the conditions in its loops and selection statements are description logic (DL) formulas, and that it features a non-deterministic assignment statement (a choice operator given by a DL formula). We sketch an operational semantics of the proposed programming language and then develop a matching Hoare calculus whose pre- and post-conditions are again DL formulas. A major difficulty resides in showing that the formulas generated when calculating weakest preconditions remain within the chosen DL fragment. In particular, this concerns substitutions whose results are not directly representable. We therefore explicitly add substitution as a constructor of the logic and show how it can be eliminated by interleaving with the rules of a traditional tableau calculus.
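
To make the substitution issue concrete, a textbook-style (demonic) weakest-precondition reading of such statements has the following shape; this is given as context, not necessarily the exact formulation of the paper's calculus. The substitution produced by the assignment rule is precisely what may fall outside the chosen DL fragment unless substitution is added as an explicit constructor.

    wp(x :\in C,\ \varphi) \;=\; \forall y.\ C(y) \rightarrow \varphi[x \mapsto y]
    wp(\mathtt{if}\ C\ \mathtt{then}\ S_1\ \mathtt{else}\ S_2,\ \varphi) \;=\; (C \rightarrow wp(S_1, \varphi)) \wedge (\lnot C \rightarrow wp(S_2, \varphi))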


Negrevergne B., Laboratoire d'Informatique de Grenoble | Termier A., Laboratoire d'Informatique de Grenoble | Mehaut J.-F., Laboratoire d'Informatique de Grenoble | Uno T., National Institute of Informatics
Proceedings of the 2010 International Conference on High Performance Computing and Simulation, HPCS 2010 | Year: 2010

The problem of closed frequent itemset discovery is a fundamental problem of data mining, with applications in numerous domains. It is thus very important to have efficient parallel algorithms to solve this problem, capable of efficiently harnessing the power of the multicore processors that exist in our computers (notebooks as well as desktops). In this paper we present PLCM QS, a parallel algorithm based on the LCM algorithm, recognized as the most efficient algorithm for sequential discovery of closed frequent itemsets. We also present a simple yet powerful parallelism interface based on the concept of Tuple Space, which allows efficient dynamic sharing of the work. Through a detailed experimental study, we show that PLCM QS is efficient on both sparse and dense databases. © 2010 IEEE.
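
The Tuple Space style of dynamic work sharing can be imitated with a shared queue from which idle workers take pending sub-tasks and into which newly expanded sub-tasks are pushed back. The toy below enumerates all subsets of four items in place of real closed-itemset mining; it is a generic threading sketch, not the PLCM QS code.

    import queue
    import threading

    items = ["a", "b", "c", "d"]
    space = queue.Queue()                     # the shared "tuple space" of pending tasks
    space.put((frozenset(), 0))               # root task: empty itemset, next item index
    results, lock = [], threading.Lock()

    def worker():
        while True:
            try:
                itemset, start = space.get(timeout=0.2)
            except queue.Empty:
                return                        # no work left (toy termination rule)
            with lock:
                results.append(itemset)
            for i in range(start, len(items)):            # children become shared tasks
                space.put((itemset | {items[i]}, i + 1))

    threads = [threading.Thread(target=worker) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(len(results))                       # 16: every subset enumerated exactly once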


Bricon-Souf N., CNRS Toulouse Institute in Information Technology | Verdier C., Laboratoire d'Informatique de Grenoble | Flory A., INSA Lyon | Jaulent M.C., French Institute of Health and Medical Research
IRBM | Year: 2013

This paper presents the activities of theme C, "medical information systems and databases", of the GDR Stic Santé. Six one-day workshops were organized during the period 2011-2012. They were devoted to 1) sharing anatomical and physiological object models for the simulation of clinical medical images, 2) advantages and limitations of data warehouses for biological data, 3) medical information engineering, 4) systems for sharing medical images for research, 5) knowledge engineering for semantic interoperability in e-health applications, and 6) the use of context in health. In the future, our activities will continue with a specific interest in information systems for translational medicine and the role of electronic healthcare reports in decision-making. Workshops with other research groups will be organized, in particular with the e-health research group. © 2013 Elsevier Masson SAS.
