Porto, Portugal

Nunes J.P., University of Minho | Silva J.F., ISEP
Materials Science Forum | Year: 2013

This work reviews efforts made in recent years to produce thermoplastic matrix towpregs for highly demanding and for more cost-effective commercial applications, using a powder coating technology developed in Portugal by the Universities of Minho and Porto. Different thermoplastic matrices and continuous fibre reinforcements were used in the towpregs produced for highly demanding markets (e.g., carbon fibre reinforced Primospire® towpreg) and for more commercial applications (e.g., glass fibre reinforced polypropylene and polyvinyl chloride towpregs). The relevant processing parameters, such as fibre pull-speed and furnace temperature, were varied to determine their influence on the polymer mass fraction obtained in the studied raw materials. Several technologies were also developed and used (compression moulding, pultrusion and filament winding) to process composite parts with properties adequate for the envisaged markets at compatible production rates. The results obtained lead us to conclude that the studied thermoplastic matrix towpregs, and the composite parts processed from them, are well suited for application in both highly advanced and cost-effective markets. © (2013) Trans Tech Publications, Switzerland.


Boespflug M., McGill University | Carbonneaux Q., French Institute for Research in Computer Science and Automation | Hermant O., ISEP
CEUR Workshop Proceedings | Year: 2012

The λΠ-calculus forms one of the vertices of Barendregt's λ-cube and has been used as the core language for a number of logical frameworks. Following earlier extensions of natural deduction [14], Cousineau and Dowek [11] generalize the definitional equality of this well-studied calculus to an arbitrary congruence generated by rewrite rules, which allows for more faithful encodings of foreign logics. This paper motivates the resulting language, the λΠ-calculus modulo, as a universal proof language, capable of expressing proofs from many other systems without losing their computational properties. We further show how proofs in this language can be checked simply and efficiently. We have implemented this scheme in a proof checker called DEDUKTI.
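
The core of such a checker is conversion: two terms are considered definitionally equal when they normalize to the same normal form under the rewrite rules. The following Python sketch illustrates this idea for plain first-order terms; the term encoding, the matching strategy, and the Peano-addition rules are illustrative assumptions, not DEDUKTI's actual data structures or algorithm.

```python
# Minimal sketch of conversion checking modulo rewrite rules.
# Terms are strings (constants) or tuples ('app', f, a); rule
# variables are strings starting with '$'. All names are made up
# for illustration; DEDUKTI's real term language is richer.

def match(pattern, term, subst):
    """First-order matching: extend subst so pattern instantiates to term."""
    if isinstance(pattern, str) and pattern.startswith('$'):
        if pattern in subst:
            return subst if subst[pattern] == term else None
        return {**subst, pattern: term}
    if isinstance(pattern, str):
        return subst if pattern == term else None
    if isinstance(term, tuple) and len(pattern) == len(term):
        for p, t in zip(pattern, term):
            subst = match(p, t, subst)
            if subst is None:
                return None
        return subst
    return None

def instantiate(term, subst):
    if isinstance(term, str):
        return subst.get(term, term)
    return tuple(instantiate(t, subst) for t in term)

def rewrite_once(term, rules):
    """Apply one rule somewhere in the term (subterms first), or return None."""
    if isinstance(term, tuple):
        for i, sub in enumerate(term):
            new = rewrite_once(sub, rules)
            if new is not None:
                return term[:i] + (new,) + term[i + 1:]
    for lhs, rhs in rules:
        subst = match(lhs, term, {})
        if subst is not None:
            return instantiate(rhs, subst)
    return None

def normalize(term, rules):
    while True:
        new = rewrite_once(term, rules)
        if new is None:
            return term
        term = new

def convertible(t, u, rules):
    """Definitional equality modulo the congruence generated by the rules."""
    return normalize(t, rules) == normalize(u, rules)

# Peano addition: plus 0 n --> n ;  plus (S m) n --> S (plus m n)
rules = [
    (('app', ('app', 'plus', '0'), '$n'), '$n'),
    (('app', ('app', 'plus', ('app', 'S', '$m')), '$n'),
     ('app', 'S', ('app', ('app', 'plus', '$m'), '$n'))),
]
one = ('app', 'S', '0')
two = ('app', 'S', one)
print(convertible(('app', ('app', 'plus', one), one), two, rules))  # True
```

Because equality is decided by normalization rather than by replaying a proof of it, encodings of foreign logics stay small: the computational steps are absorbed into the congruence.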


Chabchoub Y., ISEP | Fricker C., French Institute for Research in Computer Science and Automation
2014 International Workshop on Computational Intelligence for Multimedia Understanding, IWCIM 2014 | Year: 2014

The Bike Sharing System (BSS) has become an increasingly popular means of transport in Paris and in many other cities around the world. It also generates a huge and growing amount of data describing users' trips. Such datasets may be very useful to the data mining community for improving the global performance of a BSS. In this paper, we focus on resource availability (free docks and available bikes) in the Parisian BSS, called Velib'. We analyze a Velib' trip dataset in order to separate the Velib' stations into three categories (balanced, overloaded and underloaded clusters), according to the ratio between arrivals and departures at each station over the whole day. For this purpose, we use the well-known K-means clustering algorithm, with the Dynamic Time Warping (DTW) metric as the similarity measure, and we update the cluster centers using the efficient DTW Barycenter Averaging (DBA) method. © 2014 IEEE.
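
As a rough illustration of the pipeline the abstract describes, the following Python sketch implements the DTW distance, a DBA-style center update, and a K-means loop over station time series; the synthetic series and parameters such as k = 3 are illustrative assumptions, not the paper's dataset or settings.

```python
import numpy as np

def dtw_matrix(a, b):
    """Accumulated-cost matrix for Dynamic Time Warping of two 1-D series."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D

def dtw(a, b):
    """DTW distance between two series."""
    return dtw_matrix(a, b)[-1, -1]

def dtw_path(a, b):
    """Optimal warping path as (i, j) index pairs into a and b."""
    D = dtw_matrix(a, b)
    path, (i, j) = [], (len(a), len(b))
    while (i, j) != (0, 0):
        path.append((i - 1, j - 1))
        i, j = min([(i - 1, j), (i, j - 1), (i - 1, j - 1)],
                   key=lambda p: D[p])
    return path[::-1]

def dba_update(center, members):
    """One DBA step: each center point becomes the mean of the values
    it is aligned with across all member series."""
    buckets = [[] for _ in center]
    for s in members:
        for i, j in dtw_path(center, s):
            buckets[i].append(s[j])
    return np.array([np.mean(b) if b else c
                     for b, c in zip(buckets, center)])

def kmeans_dtw(series, k=3, iters=10, seed=0):
    """K-means with DTW assignment and DBA center updates."""
    rng = np.random.default_rng(seed)
    centers = [series[i] for i in rng.choice(len(series), k, replace=False)]
    for _ in range(iters):
        labels = [int(np.argmin([dtw(s, c) for c in centers])) for s in series]
        centers = [dba_update(centers[c],
                              [s for s, l in zip(series, labels) if l == c])
                   for c in range(k)]
    return labels, centers

# Synthetic stand-in for per-station daily arrival/departure ratios:
# three groups of stations shifted around a common daily profile.
rng = np.random.default_rng(1)
base = np.sin(np.linspace(0, 2 * np.pi, 24))
series = [shift + base + rng.normal(0, 0.1, 24)
          for shift in (0, 0, 0, 1, 1, 1, -1, -1, -1)]
labels, _ = kmeans_dtw(series, k=3)
print(labels)
```

DBA is used instead of a pointwise mean so that the center update is consistent with the DTW alignment used for assignment: each center coordinate averages the values it was actually warped onto.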


Martins F., ISEP | Costa C.A.V., University of Porto
Computers and Chemical Engineering | Year: 2010

Screening of topologies developed by hierarchical heuristic procedures can be carried out by comparing their optimal performance. In this work we explore mono-objective process optimization using two algorithms, simulated annealing and tabu search, and four different objective functions: two of the net present value type, one of which includes environmental costs, and two of the global potential impact type. The hydrodealkylation of toluene to produce benzene was used as a case study, considering five topologies of different complexity, obtained mainly by including or excluding liquid recycling and heat integration. The performance of the algorithms together with the objective functions was observed, analyzed and discussed from various perspectives: average deviation of results for each algorithm, capacity for producing high-purity product, screening of topologies, robustness of the objective functions in screening topologies, trade-offs between economic and environmental objective functions, and variability of the optimum solutions. © 2009 Elsevier Ltd. All rights reserved.
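
For readers unfamiliar with the first of these algorithms, the following Python sketch shows a generic simulated-annealing loop of the kind used for such screening; the toy objective, the neighborhood move, and the geometric cooling schedule are illustrative assumptions, not the paper's flowsheet models or objective functions.

```python
import math
import random

def simulated_annealing(objective, x0, neighbor, t0=1.0, cooling=0.95,
                        steps_per_temp=50, t_min=1e-3, seed=0):
    """Maximize `objective`: always accept uphill moves, accept downhill
    moves with probability exp(delta / T), and cool T geometrically."""
    rng = random.Random(seed)
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    t = t0
    while t > t_min:
        for _ in range(steps_per_temp):
            y = neighbor(x, rng)
            fy = objective(y)
            if fy >= fx or rng.random() < math.exp((fy - fx) / t):
                x, fx = y, fy
                if fx > fbest:
                    best, fbest = x, fx
        t *= cooling
    return best, fbest

# Toy stand-in for an NPV-type objective over two design variables
# (e.g. a conversion and a recycle ratio), each bounded to [0, 1].
def npv_like(v):
    x, y = v
    return -(x - 0.7) ** 2 - (y - 0.3) ** 2 + 0.1 * math.sin(8 * x)

def perturb(v, rng, step=0.05):
    """Random bounded move in the neighborhood of the current point."""
    return tuple(min(1.0, max(0.0, c + rng.uniform(-step, step))) for c in v)

best, value = simulated_annealing(npv_like, (0.5, 0.5), perturb)
print(best, value)
```

The occasional acceptance of worsening moves at high temperature is what lets the search escape local optima; tabu search instead forbids recently visited moves, but both are drop-in candidates for the same screening loop.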


Cousineau D., French Institute for Research in Computer Science and Automation | Hermant O., ISEP
Leibniz International Proceedings in Informatics, LIPIcs | Year: 2012

Two main lines have been followed to prove the cut elimination theorem: the syntactic one, which studies the process of reducing cuts, and the semantic one, which consists in interpreting a sequent in some algebra and extracting from this interpretation a cut-free proof of that very sequent. A link between the two methods was exhibited by studying, in a semantic way, the syntactic tools that allow one to prove (strong) normalization of proof-terms, namely reducibility candidates. In the case of deduction modulo, a framework combining deduction and rewrite rules in which theories such as Zermelo set theory and higher-order logic can be expressed, this is obtained by constructing a reducibility-candidates-valued model. The existence of such a pre-model for a theory entails strong normalization of its proof-terms and, by the usual syntactic argument, the cut elimination property. In this paper, we strengthen this bridge between syntactic and semantic methods by providing a fully semantic proof that the existence of a pre-model entails the cut elimination property for the considered theory in deduction modulo. We first define a new, simplified variant of reducibility candidates à la Girard that is sufficient to prove weak normalization of proof-terms (and therefore the cut elimination property). Then we build, from a model valued on the pre-Heyting algebra of those WN reducibility candidates, a regular model valued on a Heyting algebra, to which we apply the usual soundness/strong-completeness argument. Finally, we discuss further extensions of this new method towards normalization-by-evaluation techniques, which commonly use Kripke semantics. © Denis Cousineau and Olivier Hermant.
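
In outline, the semantic argument the abstract describes can be summarized by the following chain of implications (a paraphrase in our own notation, not the paper's statement):

```latex
% Paraphrase of the semantic route to cut elimination described above;
% the notation \mathcal{T} for the theory is ours, not the paper's.
\[
  \text{pre-model of } \mathcal{T}
  \;\Longrightarrow\;
  \text{Heyting-algebra-valued model of } \mathcal{T}
  \;\Longrightarrow\;
  \text{cut elimination for } \mathcal{T}
\]
% The last implication is the usual argument: soundness gives
% "provable implies valid", strong completeness gives "valid implies
% cut-free provable", so every provable sequent has a cut-free proof.
```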
