Reis R.,ISEP |
ACM International Conference Proceeding Series | Year: 2014
Virtual worlds can be used for different purposes according to the intended use. The design of games for learning on virtual world platforms has been an important research field for several years. However, research in this field has shown that in most cases the environments lack appropriate technical characteristics. The goal of developing games for learning on virtual world platforms is to produce environments that encourage users to achieve effective learning. In this sense, the current study presents a development model for implementing games for learning on virtual world platforms. This model is based on software engineering techniques and methods and is supported by a spiral cycle that allows us to develop applications. The process is divided into a set of activities carried out throughout each cycle, producing several work products, with the aim of providing each team member with the guidelines and tools necessary to make informed decisions about what they do. The model includes five steps, namely: Conception, Analysis, Design, Implementation and Evaluation. Each step contains a set of diagrams to support the development team in its tasks. With this model, applications are developed in a series of incremental releases; that is, the final system is constructed from the refined prototype. These steps include activities that make it possible to quantify the quality of games for learning. Quality assessment is based on the Quantitative Evaluation Framework developed by Escudeiro, which allows a degree of freedom in the selection of quality criteria. Thus, we can obtain a single quantitative quality value for any domain under analysis; i.e., the framework can be adapted to any domain and valence. Copyright 2014 ACM.
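The idea of collapsing several weighted quality criteria into a single quantitative value can be sketched as a weighted mean. This is only an illustrative reading of the approach; the criterion names, weights and scoring scale below are assumptions, not taken from the Quantitative Evaluation Framework itself.

```python
# Hypothetical sketch: combine weighted quality criteria into one score.
# Criterion names, weights and the 0-100 scale are illustrative assumptions.
def quality_score(criteria):
    """criteria: dict mapping name -> (weight, score in [0, 100])."""
    total_weight = sum(w for w, _ in criteria.values())
    if total_weight == 0:
        raise ValueError("at least one criterion must have positive weight")
    return sum(w * s for w, s in criteria.values()) / total_weight

criteria = {
    "usability":   (0.4, 80.0),
    "pedagogy":    (0.4, 70.0),
    "performance": (0.2, 90.0),
}
print(quality_score(criteria))  # weighted mean of the criterion scores
```

Because the weights are free parameters, the same scoring machinery can be re-targeted to a different domain simply by swapping the criteria dictionary, which matches the "degree of freedom in the selection of quality criteria" described above.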
Boespflug M.,McGill University |
Carbonneaux Q.,French Institute for Research in Computer Science and Automation |
CEUR Workshop Proceedings | Year: 2012
The λΠ-calculus forms one of the vertices of Barendregt's λ-cube and has been used as the core language for a number of logical frameworks. Following earlier extensions of natural deduction, Cousineau and Dowek generalize the definitional equality of this well-studied calculus to an arbitrary congruence generated by rewrite rules, which allows for more faithful encodings of foreign logics. This paper motivates the resulting language, the λΠ-calculus modulo, as a universal proof language, capable of expressing proofs from many other systems without losing their computational properties. We further show how to very simply and efficiently check proofs in this language. We have implemented this scheme in a proof checker called DEDUKTI.
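The core idea of checking equality modulo a congruence generated by rewrite rules can be illustrated on plain first-order terms: normalize both sides with the rules, then compare normal forms. This is a toy sketch, not Dedukti's algorithm; terms are nested tuples and rules are encoded as Python functions, both illustrative simplifications.

```python
# Toy sketch of definitional equality modulo rewrite rules (not Dedukti's
# implementation): normalize both terms, then compare syntactically.
# Terms are nested tuples ("f", arg1, ...); variables are plain strings.

def rewrite_head(term, rules):
    """Apply the first rule matching at the root, or return None."""
    for rule in rules:
        result = rule(term)
        if result is not None:
            return result
    return None

def normalize(term, rules):
    if isinstance(term, str):
        return term
    head, *args = term
    term = (head, *(normalize(a, rules) for a in args))
    reduced = rewrite_head(term, rules)
    return normalize(reduced, rules) if reduced is not None else term

def equal_modulo(t1, t2, rules):
    return normalize(t1, rules) == normalize(t2, rules)

# Illustrative rules for Peano addition:
#   plus(0, n)    -> n
#   plus(s(m), n) -> s(plus(m, n))
def plus_rules(term):
    if term[0] == "plus":
        m, n = term[1], term[2]
        if m == ("zero",):
            return n
        if isinstance(m, tuple) and m[0] == "s":
            return ("s", ("plus", m[1], n))
    return None

one = ("s", ("zero",))
two = ("s", ("s", ("zero",)))
print(equal_modulo(("plus", one, one), two, [plus_rules]))  # True
```

With such rules, `1 + 1` and `2` become definitionally equal, so a proof of a statement about `2` also checks against `1 + 1`; this is the sense in which rewriting lets encodings keep their computational properties.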
Chabchoub Y.,ISEP |
Chiky R.,ISEP |
Eurasip Journal on Information Security | Year: 2014
IP networks are constantly targeted by new denial-of-service attack techniques (SYN flooding, port scans, UDP flooding, etc.), causing service disruption and considerable financial damage. On-line detection of DoS attacks in today's high-bit-rate IP traffic is a major challenge. We propose in this paper an on-line algorithm for port scan detection. It is composed of two complementary parts: first, a probabilistic counting part, where the number of distinct destination ports is estimated by adapting a method called 'sliding HyperLogLog' to the context of port scans in IP traffic; second, a decision mechanism applied to the estimated number of destination ports in order to detect in real time any behaviour that could correspond to malicious traffic. This latter part is mainly based on the exponentially weighted moving average (EWMA) algorithm, which we adapted to on-line analysis by adding a learning step (assumed attack-free) and improving its update mechanism. The resulting port scan detection method was tested against real IP traffic containing some attacks. It detects all the port scan attacks within a very short response time (about 30 s) and without any false positives. The algorithm uses a very small total memory of less than 22 kb and achieves very good accuracy in estimating the number of destination ports (a relative error of about 3.25%), in agreement with the theoretical bounds provided by the sliding HyperLogLog algorithm. © 2014 Chabchoub et al.
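The decision part can be sketched as an EWMA baseline with a separate learning phase: initialize the mean and variance of the distinct-port count on attack-free traffic, then flag observations that deviate strongly from the smoothed baseline. The threshold factor and the rule of updating only on non-alarm samples are illustrative assumptions; the paper's exact update mechanism may differ.

```python
# Sketch of an EWMA-based detector on the estimated number of distinct
# destination ports per time slot. Threshold factor and the "update only
# on normal traffic" rule are assumptions, not the paper's exact scheme.
class EwmaDetector:
    def __init__(self, alpha=0.2, threshold_factor=3.0):
        self.alpha = alpha                    # EWMA smoothing factor
        self.threshold_factor = threshold_factor
        self.mean = None
        self.var = 0.0

    def learn(self, samples):
        """Learning step on traffic assumed attack-free: set the baseline."""
        n = len(samples)
        self.mean = sum(samples) / n
        self.var = sum((x - self.mean) ** 2 for x in samples) / n

    def observe(self, x):
        """Return True if x deviates from the baseline (possible port scan)."""
        dev = x - self.mean
        alarm = abs(dev) > self.threshold_factor * (self.var ** 0.5)
        if not alarm:                         # track normal traffic only
            self.mean += self.alpha * dev
            self.var = (1 - self.alpha) * (self.var + self.alpha * dev * dev)
        return alarm

det = EwmaDetector()
det.learn([10, 11, 12, 10, 11])               # attack-free learning window
print(det.observe(11), det.observe(500))      # normal slot, then a spike
```

Freezing the baseline during alarms keeps a sustained scan from being absorbed into the "normal" estimate, which is one way to make EWMA usable for on-line detection rather than plain smoothing.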
Nunes J.P.,University of Minho |
Materials Science Forum | Year: 2013
This work reviews work carried out in recent years to produce thermoplastic matrix towpregs for highly demanding and more cost-effective commercial applications, using a powder coating technology developed in Portugal by the Minho and Porto Universities. Different thermoplastic matrices and continuous fibre reinforcements were used in the towpregs produced for highly demanding markets (e.g., carbon fibre reinforced Primospire® towpreg) and for more commercial applications (e.g., glass fibre reinforced polypropylene and polyvinyl chloride towpregs). The relevant processing parameters, such as fibre pull-speed and furnace temperature, were varied to determine their influence on the polymer mass fraction obtained in the studied raw materials. Several technologies were also developed and used (compression moulding, pultrusion and filament winding) to process composite parts with adequate properties for the envisaged markets at compatible production rates. The obtained results lead us to conclude that the studied thermoplastic matrix towpregs and their processed composite parts are well suited for application in both highly advanced and cost-effective markets. © (2013) Trans Tech Publications, Switzerland.
Chabchoub Y.,ISEP |
Fricker C.,French Institute for Research in Computer Science and Automation
2014 International Workshop on Computational Intelligence for Multimedia Understanding, IWCIM 2014 | Year: 2014
The Bike Sharing System (BSS) has become a more and more popular means of transport in Paris and in many other cities around the world. It is also generating an increasingly huge amount of data describing users' trips. Such datasets may be very useful to the data mining community for improving the global performance of the BSS. In this paper, we focus on resource availability (free docks and available bikes) in the Parisian BSS, called Velib'. We analyze a Velib' trip dataset in order to separate the Velib' stations into three categories (balanced, overloaded and underloaded clusters), according to the ratio between arrivals and departures at each station over the whole day. For this purpose, we use the well-known K-means clustering algorithm, along with the Dynamic Time Warping (DTW) metric to measure similarity between the daily time series. We choose to update the cluster centers using the efficient DTW Barycenter Averaging (DBA) method. © 2014 IEEE.
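The DTW metric at the heart of the clustering can be sketched with the standard dynamic-programming recurrence: each cell holds the cheapest alignment cost of the two prefixes, so series with the same shape but shifted in time still match closely. This is a plain textbook sketch of DTW, not the paper's implementation; the example profiles are invented.

```python
# Classic DTW distance between two daily profiles (e.g. hourly
# arrival/departure ratios for a station). Textbook recurrence,
# not the paper's code; the sample series below are illustrative.
def dtw(a, b):
    n, m = len(a), len(b)
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

# Same peak shape, shifted by one slot: DTW still aligns them perfectly.
print(dtw([0, 1, 2, 1, 0], [0, 0, 1, 2, 1, 0]))  # 0.0
```

This shift-tolerance is why DTW is a better fit than Euclidean distance for station profiles whose rush-hour peaks occur at slightly different times, and why center updates need a DTW-aware averaging scheme such as DBA rather than a point-wise mean.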
Ghorbel I.,FOVEA Pharmaceuticals |
Ghorbel I.,Telecom ParisTech |
Rossant F.,ISEP |
Bloch I.,Telecom ParisTech |
Paques M.,Center Hospitalier National des Quinze Vingts
Proceedings - International Conference on Image Processing, ICIP | Year: 2011
Parametric deformable models are an important technique for image segmentation. To improve the robustness of the model, it may be interesting to incorporate a priori information about the shape of the objects to be segmented. In this paper, we propose to add a parallelism constraint. Such a model is relevant in many applications where elongated structures have to be detected. One main advantage of our formulation is that it requires only a few parameters to be adjusted in addition to those of traditional snakes. The proposed model has been applied to the segmentation of OCT images of the retina and to the segmentation of retinal vessels. Experimental results, obtained on 25 OCT images and 30 eye fundus images, demonstrate the robustness, flexibility and wide potential applicability of this new formulation. The accuracy of the method has been assessed by comparing manual segmentations, made by experts, with the automatic ones. © 2011 IEEE.
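One simple way to read a parallelism constraint is as an extra energy term that penalizes variation of the point-wise spacing between two coupled curves (constant spacing = parallel). The function below is a hypothetical illustration of such a term, not the energy actually used in the paper; the curve representation and weighting are assumptions.

```python
# Hypothetical parallelism term that could augment a classical snake energy:
# penalize the variance of the point-wise distance between two polylines.
# Representation (matched indices) and the term itself are assumptions,
# not the paper's formulation.
import math

def parallelism_energy(curve_a, curve_b):
    """Curves: lists of (x, y) points with matched indices."""
    dists = [math.dist(p, q) for p, q in zip(curve_a, curve_b)]
    mean = sum(dists) / len(dists)
    return sum((d - mean) ** 2 for d in dists) / len(dists)

# Two horizontal polylines at constant vertical spacing: zero penalty.
a = [(x, 0.0) for x in range(5)]
b = [(x, 2.0) for x in range(5)]
print(parallelism_energy(a, b))  # 0.0
```

In a full model, such a term would be weighted against the usual image and smoothness energies, so the only extra parameter to tune is that weight, in line with the "few additional parameters" advantage claimed above.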
Martins F.,ISEP |
Costa C.A.V.,University of Porto
Computers and Chemical Engineering | Year: 2010
Screening of topologies developed by hierarchical heuristic procedures can be carried out by comparing their optimal performance. In this work we explore mono-objective process optimization using two algorithms, simulated annealing and tabu search, and four different objective functions: two of the net present value type, one of them including environmental costs, and two of the global potential impact type. The hydrodealkylation of toluene to produce benzene was used as a case study, considering five topologies of different complexity, mainly obtained by including or excluding liquid recycling and heat integration. The performance of the algorithms together with the objective functions was observed, analyzed and discussed from various perspectives: average deviation of results for each algorithm, capacity for producing a high-purity product, screening of topologies, robustness of the objective functions in screening topologies, trade-offs between economic and environmental objective functions, and variability of the optimum solutions. © 2009 Elsevier Ltd. All rights reserved.
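Simulated annealing, one of the two algorithms compared, can be sketched generically: accept any improving move, accept worsening moves with a Boltzmann probability that shrinks as the temperature cools. The cooling schedule, neighbourhood move and toy objective below are illustrative choices, not the settings used in the study.

```python
# Generic simulated-annealing sketch for a mono-objective function.
# Schedule, neighbourhood and the toy quadratic objective are illustrative,
# not the paper's process-optimization setup.
import math
import random

def simulated_annealing(objective, x0, step, t0=1.0, cooling=0.95,
                        iters=2000, seed=0):
    rng = random.Random(seed)
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        y = x + rng.uniform(-step, step)          # random neighbour move
        fy = objective(y)
        # Always accept improvements; accept worse moves with
        # probability exp(-(fy - fx) / t) (Metropolis criterion).
        if fy < fx or rng.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling                               # geometric cooling
    return best, fbest

best, fbest = simulated_annealing(lambda v: (v - 3.0) ** 2, x0=0.0, step=0.5)
print(best, fbest)  # close to the minimum at v = 3
```

Tabu search differs mainly in the acceptance logic: instead of a temperature, it keeps a short memory of recently visited moves and forbids revisiting them, which is why the two heuristics can behave differently on the same objective landscape.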
Taconet C.,Institute Telecom |
Journal of Digital Information Management | Year: 2010
The design process followed to produce traditional applications needs to be enhanced to cope with the requirements of new context-aware ubiquitous applications. With the popularity of ubiquitous computing, context-aware applications have become clearly necessary. This new kind of application allows mobile users to universally access services with respect to any context, including their computing environment. Challenges in the design of such applications are to easily define context collection requirements, context analysis, and adaptations of the application to changes in its environment. To face these issues, we propose in this article a generic and extensible way to model the context-awareness of any application using the model-driven engineering (MDE) approach. For this purpose, we add a context-awareness aspect to application model views. We illustrate our solution by modeling a context-aware e-commerce application. The addition of a context-awareness aspect should ease the definition of mobile applications. Furthermore, context-awareness models open the way to automating the production of context-awareness code.
Costa A.M.,ISEP |
Machado J.T.,ISEP |
Quelhas M.D.,National Health Institute
Bioinformatics | Year: 2011
Motivation: We describe a novel approach to explore DNA nucleotide sequence data, aiming to produce high-level categorical and structural information about the underlying chromosomes, genomes and species. Results: The article starts by analyzing chromosomal data through histograms of fixed-length DNA sequences. After creating the DNA-related histograms, the correlation between pairs of histograms is computed, producing a global correlation matrix. These data are then used as input to several data processing methods for information extraction and tabular/graphical output generation. A set of 18 species is processed, and the extensive results reveal that the proposed method is able to generate significant and diversified outputs, in good accordance with current scientific knowledge in domains such as genomics and phylogenetics. © The Author 2011. Published by Oxford University Press.
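The histogram-then-correlation pipeline can be illustrated on toy sequences: count fixed-length words (here k = 2) in each sequence, then correlate the histograms pairwise to get a correlation matrix. The word length, the Pearson correlation and the toy sequences are assumptions for illustration; the paper works on whole chromosomes and applies further processing to the matrix.

```python
# Toy sketch of the histogram/correlation pipeline: fixed-length word
# histograms per sequence, then a pairwise Pearson correlation matrix.
# k, the sequences and the correlation choice are illustrative assumptions.
from collections import Counter
from itertools import product

def word_histogram(seq, k=2):
    """Relative frequencies of all length-k words over the ACGT alphabet."""
    words = ["".join(p) for p in product("ACGT", repeat=k)]
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = max(sum(counts[w] for w in words), 1)
    return [counts[w] / total for w in words]

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

seqs = {"s1": "ACGTACGTACGT", "s2": "ACGTACGAACGT", "s3": "GGGGCCCCGGGG"}
names = list(seqs)
matrix = [[pearson(word_histogram(seqs[a]), word_histogram(seqs[b]))
           for b in names] for a in names]
for name, row in zip(names, matrix):
    print(name, [round(v, 2) for v in row])
```

On these toy inputs the near-identical sequences correlate strongly while the GC-rich one stands apart, which is the kind of structure the global correlation matrix is meant to expose at genome scale.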
Cousineau D.,French Institute for Research in Computer Science and Automation |
Leibniz International Proceedings in Informatics, LIPIcs | Year: 2012
Two main lines have been adopted to prove the cut elimination theorem: the syntactic one, which studies the process of reducing cuts, and the semantic one, which consists in interpreting a sequent in some algebra and extracting from this interpretation a cut-free proof of that very sequent. A link between these two methods was exhibited by studying, in a semantic way, the syntactic tools that allow one to prove (strong) normalization of proof-terms, namely reducibility candidates. In the case of deduction modulo, a framework combining deduction and rewriting rules in which theories such as Zermelo set theory and higher-order logic can be expressed, this is obtained by constructing a model valued in reducibility candidates. The existence of such a pre-model for a theory entails strong normalization of its proof-terms and, by the usual syntactic argument, the cut elimination property. In this paper, we strengthen this bridge between syntactic and semantic methods by providing a fully semantic proof that the existence of a pre-model entails the cut elimination property for the considered theory in deduction modulo. We first define a new simplified variant of reducibility candidates à la Girard that is sufficient to prove weak normalization of proof-terms (and therefore the cut elimination property). Then we build, from a model valued on the pre-Heyting algebra of those WN reducibility candidates, a regular model valued on a Heyting algebra, to which we apply the usual soundness/strong completeness argument. Finally, we discuss further extensions of this new method towards normalization-by-evaluation techniques, which commonly use Kripke semantics. © Denis Cousineau and Olivier Hermant.