CNRS Lille Research Center in Informatics, Signal and Automatic control
El Ahmar Y.,CEA Saclay Nuclear Research Center |
Le Pallec X.,CNRS Lille Research Center in Informatics, Signal and Automatic control |
Gerard S.,CEA Saclay Nuclear Research Center
CEUR Workshop Proceedings | Year: 2016
Recent empirical studies of UML have shown that software practitioners often use it to communicate. When they use diagrams during a meeting with clients or users, or during an informal discussion with their architect, they may want to highlight some elements to synchronize the visual support with their discourse. To that end, they are free to use color, size, brightness, grain and/or orientation. This freedom is due to the lack of a formal specification of their use in the UML standard, and corresponds to what the Cognitive Dimensions framework calls the secondary notation. According to the Semiology of Graphics (SoG), one of the main references in cartography, each means of visual annotation is characterized by its perceptual properties. Being under the modeler's control, the five means of visual annotation can be applied to UML graphic components in different ways: to the border, the text, the background, and to related graphic nodes. In this context, the goal of this research is to study the effective implementations that maintain the perceptual properties of visual variations, in particular size. Size was chosen because it is considered the "strongest" of the visual means, possessing all the perceptual properties. The present proposal consists of a quantitative methodology using an experiment as the strategy of inquiry. The participants will be the ∼20 attendees of the HuFaMo workshop, who are expected to be modeling experts familiar with UML. The treatment is the reading and visual extraction of information from a set of UML sequence diagrams, provided via a web application. The dependent variables we study are the participants' responses and response times, which will be validated based on the SoG principles. © 2016, CEUR-WS. All rights reserved.
Gendron B.,University of Montréal |
Khuong P.-V.,University of Montréal |
Semet F.,CNRS Lille Research Center in Informatics, Signal and Automatic control
Computers and Operations Research | Year: 2017
We consider the two-level uncapacitated facility location problem with single assignment constraints (TUFLP-S), an extension of the uncapacitated facility location problem. We present six mixed-integer programming models for the TUFLP-S based on reformulation techniques and on the relaxation of the integrality of some of the variables associated with location decisions. We compare the models by carrying out extensive computational experiments on large, hard, artificial instances, as well as on instances derived from an industrial application in freight transportation. © 2017 Elsevier Ltd
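To make the problem concrete, here is a minimal sketch of the TUFLP-S decision structure on a made-up toy instance, solved by brute-force enumeration rather than by any of the paper's six MIP formulations; all costs and sizes below are illustrative assumptions, not data from the paper.

```python
from itertools import product

# Toy TUFLP-S instance (illustrative values): 2 depots, 3 satellites,
# 3 customers. Fixed costs are paid once per opened facility.
depot_cost = [10, 12]          # fixed cost of opening each depot
sat_cost = [4, 6, 5]           # fixed cost of opening each satellite
assign_cost = [                # assign_cost[customer][depot][satellite]
    [[3, 7, 5], [6, 2, 4]],
    [[8, 3, 6], [2, 9, 7]],
    [[5, 4, 8], [7, 6, 1]],
]

def solve_by_enumeration():
    n_cust, n_dep, n_sat = 3, 2, 3
    best_cost, best_plan = float("inf"), None
    # Single assignment: each open satellite is supplied by exactly one
    # depot, and each customer is served through exactly one satellite.
    for sat_to_dep in product(range(n_dep), repeat=n_sat):
        for cust_to_sat in product(range(n_sat), repeat=n_cust):
            open_sats = set(cust_to_sat)
            open_deps = {sat_to_dep[s] for s in open_sats}
            cost = (sum(depot_cost[d] for d in open_deps)
                    + sum(sat_cost[s] for s in open_sats)
                    + sum(assign_cost[c][sat_to_dep[s]][s]
                          for c, s in enumerate(cust_to_sat)))
            if cost < best_cost:
                best_cost, best_plan = cost, (sat_to_dep, cust_to_sat)
    return best_cost, best_plan

cost, plan = solve_by_enumeration()
print(cost, plan)  # optimal cost of the toy instance is 28
```

Enumeration is only viable on toy sizes; the exponential blow-up in depots, satellites and customers is exactly why the paper studies MIP reformulations and relaxations instead.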
Yaseen A.A.,CNRS Lille Research Center in Informatics, Signal and Automatic control |
Bayart M.,CNRS Lille Research Center in Informatics, Signal and Automatic control
Journal of Physics: Conference Series | Year: 2017
In this work, a new approach is introduced as a development of the attack-tolerant scheme for Networked Control Systems (NCS). The objective is to be able to detect an attack such as the Stuxnet case, where the controller is reprogrammed and hijacked. Besides its ability to detect a stealthy controller hijacking attack, the advantage of this approach is that no a priori mathematical model of the controller is needed. In order to implement the proposed scheme, a specific detector for the controller hijacking attack is designed. The performance of the scheme is evaluated by connecting the detector to an NCS with basic security elements such as the Data Encryption Standard (DES), the MD5 message digest, and timestamps. The detector is tested along with a networked PI controller under a stealthy hijacking attack. The test results show that the hijacking of the controller can be reliably detected and the controller recovered. © Published under licence by IOP Publishing Ltd.
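As a rough illustration of the "basic security elements" mentioned above (MD5 digest plus timestamp), here is a hedged sketch, not the paper's detector: a control packet carries a timestamp and an MD5 digest over the payload, timestamp and an assumed pre-shared secret, so a tampered command from a hijacked controller, or a stale replay, fails the check. The secret, field names and freshness window are all invented for the example.

```python
import hashlib
import time

SECRET = b"shared-key"   # assumed pre-shared between controller and plant
MAX_AGE = 1.0            # seconds before a packet is considered stale

def sign(payload: bytes, ts: float) -> str:
    # MD5 is used here only because the reviewed scheme does; it is weak
    # and an HMAC over a modern hash would be preferable in practice.
    return hashlib.md5(payload + repr(ts).encode() + SECRET).hexdigest()

def make_packet(payload: bytes) -> dict:
    ts = time.time()
    return {"payload": payload, "ts": ts, "digest": sign(payload, ts)}

def accept(packet: dict, now=None) -> bool:
    now = time.time() if now is None else now
    fresh = 0.0 <= now - packet["ts"] <= MAX_AGE      # timestamp check
    authentic = sign(packet["payload"], packet["ts"]) == packet["digest"]
    return fresh and authentic

pkt = make_packet(b"u=0.75")               # legitimate control command
tampered = dict(pkt, payload=b"u=9.99")    # hijacked controller rewrites it
print(accept(pkt), accept(tampered))       # True False
```

Note that such integrity checks alone cannot catch a hijacked controller that signs its own malicious commands correctly, which is why the paper adds a dedicated behavioral detector on top of them.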
Kiran B.R.,CNRS Lille Research Center in Informatics, Signal and Automatic control |
Serra J.,University Paris Est Creteil
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2017
Random forests perform bootstrap aggregation by sampling the training examples with replacement. This enables the evaluation of the out-of-bag error, which serves as an internal cross-validation mechanism. Our motivation lies in using the unsampled training examples to improve the ensemble of decision trees. In this paper we study the effect of using the out-of-bag samples to improve the generalization error, first of the individual decision trees and then of the random forest, by post-pruning. A preliminary empirical study on four UCI repository datasets shows a consistent decrease in the size of the forests without considerable loss in accuracy. © Springer International Publishing AG 2017.
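The bootstrap / out-of-bag split the abstract relies on can be sketched in a few lines; this is a generic illustration of the mechanism, not the authors' pruning procedure. Each tree trains on a bootstrap sample, and the examples it never saw (its out-of-bag set) form a held-out validation set that post-pruning can use for free.

```python
import random

def bootstrap_and_oob(n_samples: int, rng: random.Random):
    # Sample n indices with replacement (the in-bag training set for one
    # tree); everything never drawn is that tree's out-of-bag set.
    in_bag = [rng.randrange(n_samples) for _ in range(n_samples)]
    oob = sorted(set(range(n_samples)) - set(in_bag))
    return in_bag, oob

rng = random.Random(42)
in_bag, oob = bootstrap_and_oob(10, rng)
print(len(in_bag), oob)
# On average about 36.8% of examples end up out-of-bag, since
# (1 - 1/n)^n -> 1/e as n grows.
```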
Noe L.,CNRS Lille Research Center in Informatics, Signal and Automatic control
Algorithms for Molecular Biology | Year: 2017
Background: Spaced seeds, also named gapped q-grams, gapped k-mers, or spaced q-grams, have been proven to be more sensitive than contiguous seeds (contiguous q-grams, contiguous k-mers) in nucleic acid and amino acid sequence analysis. Initially proposed to detect sequence similarities and to anchor sequence alignments, spaced seeds have more recently been applied in several alignment-free methods. Unfortunately, spaced seeds need to be designed beforehand. This task is known to be time-consuming due to the number of spaced seed candidates. Moreover, it can be altered by a set of arbitrarily chosen parameters of the probabilistic alignment models used. In this general context, dominant seeds were introduced by Mak and Benson (Bioinformatics 25:302-308, 2009) for the Bernoulli model, in order to reduce the number of spaced seed candidates that are further processed in a parameter-free calculation of the sensitivity. Results: We expand the scope of the work of Mak and Benson on single and multiple seeds by considering the Hit Integration model of Chung and Park (BMC Bioinform 11:31, 2010); we demonstrate that the same dominance definition can be applied and that a parameter-free study can be performed without any significant additional cost. We also consider two new discrete models, namely the Heaviside and the Dirac models, in which lossless seeds can be integrated. From a theoretical standpoint, we establish a generic framework for all the proposed models, applying a counting semi-ring to quickly compute the large polynomial coefficients needed by the dominance filter. From a practical standpoint, we confirm that dominant seeds reduce the set of either single seeds to analyse thoroughly or multiple seeds to store.
Moreover, at http://bioinfo.cristal.univ-lille.fr/yass/iedera_dominance, we provide a full list of spaced seeds computed on the four aforementioned models, with one (continuous) parameter left free for each model and with several (discrete) alignment lengths. © 2017 The Author(s).
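For readers unfamiliar with the objects whose design the paper studies, here is a toy illustration of spaced-seed matching itself (not of the dominance computation). A seed such as "1101" reads the positions marked '1' and ignores the '0' ("don't care") position, so a mismatch under a '0' does not break a hit; the sequences below are invented examples.

```python
def spaced_kmers(seq: str, seed: str):
    # All gapped k-mers of seq under the seed, with their start positions.
    span = len(seed)
    keep = [i for i, c in enumerate(seed) if c == "1"]
    return [(i, "".join(seq[i + j] for j in keep))
            for i in range(len(seq) - span + 1)]

def shared_hits(s1: str, s2: str, seed: str):
    # Position pairs where both sequences yield the same gapped k-mer
    # (for repeated k-mers this toy keeps only the last position in s2).
    k2 = {km: i for i, km in spaced_kmers(s2, seed)}
    return [(i, k2[km]) for i, km in spaced_kmers(s1, seed) if km in k2]

# The two sequences differ at position 2 (G vs C), which falls under the
# seed's '0' position, so the spaced seed still hits...
print(shared_hits("ACGTA", "ACCTA", "1101"))   # [(0, 0)]
# ...while the contiguous seed of the same span finds nothing:
print(shared_hits("ACGTA", "ACCTA", "1111"))   # []
```

This tolerance to mismatches under '0' positions is the intuition behind the sensitivity advantage of spaced seeds that the paper quantifies.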
Petreczky M.,CNRS Lille Research Center in Informatics, Signal and Automatic control |
Automatica | Year: 2017
In this paper we propose a unified geometric framework for representing all solutions of a Linear Time Invariant Differential–Algebraic Equation (DAE-LTI) as outputs of classical Linear Time Invariant Ordinary Differential Equations (ODE-LTI). The proposed framework is then used to solve an LQ optimal control problem for DAE-LTIs with rectangular matrices. © 2017 Elsevier Ltd
Olivier P.,CEA Grenoble |
Bourasseau C.,CEA Grenoble |
Bouamama P.B.,CNRS Lille Research Center in Informatics, Signal and Automatic control
Renewable and Sustainable Energy Reviews | Year: 2017
This review provides an exhaustive and comprehensive analysis of existing modelling work on low-temperature electrolysis systems, covering the alkaline and proton exchange membrane (PEM) technologies. To carry out this review, a classification was built, based on criteria such as the physical domains involved or the modelling approaches used. The proposed methodology both exposes an overview of the electrolysis system modelling field and provides a deep analysis of each reviewed model. Current strengths, weaknesses and gaps in this research field are pointed out, and the analysis provides ideas for future research in this area. © 2017
Amor B.B.,CNRS Lille Research Center in Informatics, Signal and Automatic control |
Su J.,Texas Tech University |
Srivastava A.,Florida State University
IEEE Transactions on Pattern Analysis and Machine Intelligence | Year: 2016
We study the problem of classifying actions of human subjects using depth movies generated by Kinect or other depth sensors. Representing the human body as a dynamical skeleton, we study the evolution of its shape as a trajectory on Kendall's shape manifold. Action data is typically corrupted by large variability in execution rates within and across subjects, which causes major problems in statistical analyses. To address this issue, we adapt a recently developed framework of Su et al. to this problem domain. Here, the variable execution rates correspond to re-parameterizations of trajectories, and one uses a parameterization-invariant metric for aligning, comparing, averaging, and modeling trajectories. This metric is based on a combination of transported square-root vector fields (TSRVFs) of trajectories and the standard Euclidean norm, which allows computational efficiency. We develop a comprehensive suite of computational tools for this application domain: smoothing and denoising skeleton trajectories using median filtering, up- and down-sampling actions in the time domain, simultaneous temporal registration of multiple actions, and extraction of invertible Euclidean representations of actions. Due to their invertibility, these Euclidean representations allow both discriminative and generative models for statistical analysis. For instance, they can be used in an SVM-based classification of the original actions, as demonstrated here using the MSR Action-3D, MSR Daily Activity and 3D Action Pairs datasets. Using only the skeletal information, we achieve state-of-the-art classification results on these datasets. © 1979-2012 IEEE.
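The square-root velocity idea underlying TSRVFs can be sketched in the simplified Euclidean case; the paper actually transports vector fields on Kendall's shape manifold, so the sketch below, with an invented 2-D "trajectory", only illustrates the flavor of the representation. The transform maps a curve to its velocity scaled by the square root of its speed, which turns a natural rate-sensitive metric into a plain L2 distance.

```python
import math

def srvf(traj, dt=1.0):
    # Square-root velocity of a sampled curve in R^2: q = v / sqrt(|v|).
    q = []
    for (x0, y0), (x1, y1) in zip(traj, traj[1:]):
        vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
        speed = math.hypot(vx, vy)
        if speed == 0.0:
            q.append((0.0, 0.0))
        else:
            s = math.sqrt(speed)
            q.append((vx / s, vy / s))
    return q

def l2_dist(q1, q2, dt=1.0):
    # Plain L2 distance between the transformed curves; a rate-invariant
    # comparison would additionally optimize over reparameterizations
    # (typically by dynamic programming), which this sketch omits.
    return math.sqrt(sum(((a - c) ** 2 + (b - d) ** 2) * dt
                         for (a, b), (c, d) in zip(q1, q2)))

walk = [(0, 0), (1, 0), (2, 0), (3, 0)]
fast_walk = [(0, 0), (2, 0), (3, 0), (3, 0)]  # same path, different rate
print(l2_dist(srvf(walk), srvf(fast_walk)))
```

Minimizing this distance over reparameterizations of one trajectory is what makes the framework invariant to the execution-rate variability described above.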
Devlaminck V.,CNRS Lille Research Center in Informatics, Signal and Automatic control
Journal of the Optical Society of America A: Optics and Image Science, and Vision | Year: 2015
In this paper, we address the issue of the existence of a solution of the depolarizing differential Mueller matrix for a homogeneous medium. Such a medium is characterized by linear changes of its differential optical properties with the thickness z of the medium. We show that, under a short-correlation-distance assumption, it is possible to derive such a linear solution, and we make this solution explicit in the particular case where the random fluctuation processes associated with the optical properties are Gaussian-white-noise-like. A solution to the problem of noncommutativity in a previously proposed model [J. Opt. Soc. Am. 30, 2196 (2013)] is given by assuming a random permutation of the order of the layers and by averaging all the differential matrices resulting from these permutations. It is shown that the underlying assumption in this case is exactly the Gaussian white noise assumption. Finally, a recently proposed approach [Opt. Lett. 39, 4470 (2014)] for the analysis of the statistical properties related to changes in optical properties is revisited, and the experimental conditions under which these results apply are specified. © 2015 Optical Society of America.
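The layer-permutation idea can be shown with a small numeric toy, entirely separate from the optical model: for non-commuting layer matrices the overall product depends on the order of the layers, and the fix described above averages over all orderings. The 2x2 matrices below are invented purely for illustration.

```python
import itertools

def matmul(a, b):
    # 2x2 matrix product, kept explicit to stay dependency-free.
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def product_over(order, layers):
    # Multiply the layer matrices in the given order, starting from identity.
    out = [[1.0, 0.0], [0.0, 1.0]]
    for idx in order:
        out = matmul(out, layers[idx])
    return out

def permutation_average(layers):
    # Average the products over every ordering of the layers.
    perms = list(itertools.permutations(range(len(layers))))
    prods = [product_over(p, layers) for p in perms]
    n = len(prods)
    return [[sum(m[i][j] for m in prods) / n for j in range(2)]
            for i in range(2)]

A = [[1.0, 1.0], [0.0, 1.0]]
B = [[1.0, 0.0], [1.0, 1.0]]
print(product_over((0, 1), [A, B]))   # AB = [[2.0, 1.0], [1.0, 1.0]]
print(product_over((1, 0), [A, B]))   # BA = [[1.0, 1.0], [1.0, 2.0]]
print(permutation_average([A, B]))    # the order-dependence averages out
```

The averaged product is symmetric in the layers by construction, which is the property the random-permutation argument exploits.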
Baudry B.,French Institute for Research in Computer Science and Automation |
Monperrus M.,CNRS Lille Research Center in Informatics, Signal and Automatic control
ACM Computing Surveys | Year: 2015
Early experiments with software diversity in the mid-1970s investigated N-version programming and recovery blocks to increase the reliability of embedded systems. Four decades later, the literature on software diversity has expanded in multiple directions: goals (fault tolerance, security, software engineering), means (managed or automated diversity), and analytical studies (quantification of diversity and of its impact). Our article contributes to the field of software diversity as the first work that adopts an inclusive vision of the area, with an emphasis on the most recent advances in the field. This survey includes classical work on design and data diversity for fault tolerance, as well as the cybersecurity literature that investigates randomization at different system levels. It broadens this standard scope of diversity to include the study and exploitation of natural diversity and the management of diverse software products. Our survey covers the most recent works, with an emphasis on work from 2000 to the present. The targeted audience is researchers and practitioners in one of the surveyed fields who miss the big picture of software diversity. Assembling the multiple facets of this fascinating topic sheds new light on the field. © 2015 ACM.