Phimeca Engineering

Pérignat-lès-Sarliève, France


Armand P.,CEA DAM Ile-de-France | Brocheton F.,NUMTECH | Poulet D.,NUMTECH | Vendel F.,Sillages Environnement | And 2 more authors.
Atmospheric Environment | Year: 2014

This paper is an original contribution to uncertainty quantification in atmospheric transport & dispersion (AT&D) at the local scale (1-10 km). It proposes to account for the imprecise knowledge of the meteorological and release conditions in the case of an accidental hazardous atmospheric emission. The aim is to produce probabilistic risk maps instead of a deterministic toxic-load map, in order to help stakeholders make their decisions. Given the urgency of such situations, the proposed methodology is able to produce such maps in a limited amount of time. It relies on a Lagrangian particle dispersion model (LPDM) using wind fields interpolated from a pre-established database that collects the results of a computational fluid dynamics (CFD) model. This decouples the CFD simulations from the dispersion analysis and thus saves considerable computational time. To make the Monte-Carlo-sampling-based estimation of the probability field even faster, it is also proposed to use a vector Gaussian process surrogate model together with high-performance computing (HPC) resources. The Gaussian process (GP) surrogate modelling technique is coupled with a probabilistic principal component analysis (PCA) to reduce the number of GP predictors to fit, store and evaluate. The design of experiments (DOE) from which the surrogate model is built is run over a cluster of PCs to keep the total production time as short as possible. The use of GP predictors is validated by comparing the results produced by this technique with those obtained by crude Monte Carlo sampling. © 2014 Elsevier Ltd.
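As a rough illustration of the surrogate strategy described above (not the paper's actual models or data), the sketch below couples a PCA reduction of a field output with one scalar Gaussian-process predictor per retained principal component, using scikit-learn; the toy concentration maps, kernel choice and number of components are assumptions.

```python
# Sketch: PCA-reduced vector Gaussian-process surrogate for a field output.
# The toy "concentration map" below stands in for LPDM/CFD results.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 4))              # DOE over meteorological/release inputs
grid = np.linspace(0.0, 1.0, 500)           # toy 1-D map of 500 grid cells
Y = np.array([np.exp(-((grid - x[0]) ** 2) / (0.05 + 0.1 * x[1])) for x in X])

pca = PCA(n_components=5)                   # 1) reduce the 500-D output
Z = pca.fit_transform(Y)                    # (200, 5) component scores

kernel = ConstantKernel() * RBF(length_scale=np.ones(4))
gps = [GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, Z[:, j])
       for j in range(Z.shape[1])]          # 2) one scalar GP per component

x_new = rng.uniform(size=(1, 4))            # 3) predict scores, map back to grid
z_new = np.column_stack([gp.predict(x_new) for gp in gps])
y_new = pca.inverse_transform(z_new)        # surrogate concentration map
print(y_new.shape)                          # (1, 500)
```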


Decatoire R.,PHIMECA Engineering | Decatoire R.,CNRS Foton Laboratory | Decatoire R.,University of Bordeaux 1 | De Larrard T.,INSA Toulouse | And 3 more authors.
7th European Workshop on Structural Health Monitoring, EWSHM 2014 - 2nd European Conference of the Prognostics and Health Management (PHM) Society | Year: 2014

In the scope of inspection plan optimization for civil engineering structures subject to carbonation (carbon dioxide penetration), several analytical models help predict the evolution of the degradation. The degradation model used here comes from the European project DuraCrete. This paper investigates its predictive capability, using the a priori distributions of its inputs given in the literature, but also after updating its predictions on the basis of measurements simulated with a finite element code. Copyright © Inria (2014).
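As a rough sketch of the kind of prediction the abstract discusses, the snippet below propagates input uncertainty through a generic square-root-of-time carbonation law; this simple law is a stand-in for the DuraCrete model, whose full parameterization is not reproduced here, and all distributions are illustrative assumptions.

```python
# Sketch: Monte Carlo propagation through a simplified carbonation law
# x(t) = K * sqrt(t), a stand-in for the DuraCrete degradation model.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
K = rng.lognormal(mean=np.log(6.0), sigma=0.3, size=n)  # carbonation coefficient [mm/yr^0.5]
cover = rng.normal(50.0, 5.0, size=n)                   # concrete cover depth [mm]
t = 50.0                                                # service time [years]

depth = K * np.sqrt(t)                                  # carbonation front position [mm]
print(f"P(front reaches reinforcement) ~ {np.mean(depth >= cover):.3f}")
```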


Aguirre Martinez F.,Phimeca Engineering | Sallak M.,CNRS Heuristic and Diagnostic Methods for Complex Systems | Schon W.,CNRS Heuristic and Diagnostic Methods for Complex Systems
IEEE Transactions on Reliability | Year: 2015

We present an efficient method based on the inclusion-exclusion principle to compute the reliability of systems in the presence of epistemic uncertainty. A known drawback of belief functions and other imprecise probability theories is that their manipulation is computationally demanding. We therefore investigate conditions under which the measures of belief function theory are additive; when this property holds, the application of belief functions is computationally more efficient. It is shown that these conditions hold for minimal cuts and paths in reliability theory. A direct implication of this result is that the credal state (state of beliefs) about the failing (working) behavior of the components does not affect the credal state about the working (failing) behavior of the system. The result is proven using a reliability analysis approach based on belief function theory, and it implies that the bounding interval of the system's reliability can be obtained with two simple calculations, using methods similar to those of classical probabilistic approaches. A discussion of the applicability of these theorems to non-coherent systems is also provided. © 1963-2012 IEEE.
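The belief-function machinery itself is not reproduced below; the sketch only illustrates the practical consequence the abstract describes: for a coherent system given by its minimal cut sets with independent components, the reliability bounding interval follows from two ordinary inclusion-exclusion evaluations, one at each end of the component failure-probability intervals. The cut sets and intervals are illustrative assumptions.

```python
# Sketch: system failure-probability bounds from minimal cut sets.
from itertools import combinations

def p_union_of_cuts(cuts, p):
    """P(system fails) by inclusion-exclusion over minimal cut sets,
    assuming independent component failures with probabilities p[c]."""
    total = 0.0
    for k in range(1, len(cuts) + 1):
        for combo in combinations(cuts, k):
            comps = frozenset().union(*combo)
            term = 1.0
            for comp in comps:
                term *= p[comp]
            total += (-1) ** (k + 1) * term
    return total

cuts = [frozenset({1, 2}), frozenset({1, 3}), frozenset({2, 3})]  # 2-out-of-3 system
p_lo = {1: 0.01, 2: 0.02, 3: 0.01}   # lower component failure probabilities
p_hi = {1: 0.03, 2: 0.05, 3: 0.02}   # upper component failure probabilities
print(f"P(failure) in [{p_union_of_cuts(cuts, p_lo):.5f}, {p_union_of_cuts(cuts, p_hi):.5f}]")
```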


Keller M.,Électricité de France | Pasanisi A.,Électricité de France | Marcilhac M.,Phimeca Engineering | Yalamas T.,Phimeca Engineering | And 2 more authors.
Quality and Reliability Engineering International | Year: 2014

Seismic hazard curves provide the rate (or probability) of exceedance of different levels of a ground motion parameter (e.g., the peak ground acceleration, PGA) at a given geographical point and for a given time frame. Hence, to evaluate seismic hazard curves, one needs an occurrence model of earthquakes and an attenuation law describing how the ground motion decays with distance. Generally, the input data needed to define the occurrence model consist of magnitude values, either directly observed or, in the case of ancient earthquakes, indirectly inferred from historically recorded damage. In this paper, we sketch a full Bayesian methodology for estimating the parameters characterizing the seismic activity in pre-determined seismotectonic zones, given such a catalog of recorded magnitudes. The statistical model, following the peaks-over-threshold formalism, consists of the distribution of the annual number of earthquakes exceeding a given magnitude, coupled with the probability density of the magnitudes, given that they exceed the threshold. Then, as an example of the possible applications of the proposed methodology, the PGA is evaluated at several sites of interest, while taking into account the uncertainty affecting the parameters of the magnitude distribution in several seismotectonic zones and the attenuation law. Finally, some perspectives are sketched. Copyright © 2014 John Wiley & Sons, Ltd.
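As a rough numerical illustration of the hazard-curve mechanics (not the paper's Bayesian estimation), the sketch below combines a Poisson occurrence rate above a magnitude threshold, exponentially distributed magnitude exceedances of the Gutenberg-Richter type, and a toy attenuation law with lognormal residuals; every rate, coefficient and distribution is an illustrative assumption.

```python
# Sketch: annual PGA exceedance rates under a peaks-over-threshold model.
import numpy as np

rng = np.random.default_rng(2)
lam0, m_th, beta = 0.2, 4.0, 1.8    # annual rate above m_th; exceedance exponent
r_km = 30.0                         # source-to-site distance [km]
n = 200_000

m = m_th + rng.exponential(1.0 / beta, size=n)      # magnitudes above threshold
eps = rng.normal(0.0, 0.5, size=n)                  # attenuation-law residual
ln_pga = -3.5 + 1.0 * m - 1.2 * np.log(r_km) + eps  # toy attenuation law [g]
pga = np.exp(ln_pga)

for a in (0.05, 0.1, 0.2):
    rate = lam0 * np.mean(pga > a)                  # annual exceedance rate
    print(f"PGA > {a:.2f} g: {rate:.4f} / yr")
```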


Heitner B.,Phimeca Engineering | Heitner B.,University College Dublin | Obrien E.J.,University College Dublin | Schoefs F.,University of Nantes | And 3 more authors.
Procedia Engineering | Year: 2016

This paper introduces the various aspects of bridge safety models. It combines different models of load and resistance, involving both deterministic and stochastic variables. The actual safety, i.e. the probability of failure, is calculated using Monte Carlo simulation, accounting for localized damage to the bridge. A possible damage indicator is also presented in the paper, and the usefulness of updating the developed bridge safety model with regard to the damage indicator is examined. © 2016 The Authors. Published by Elsevier Ltd.
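A minimal sketch of a Monte Carlo failure-probability estimate of the kind the abstract refers to: failure occurs when the load effect exceeds a resistance reduced by a localized damage factor. Distributions, parameters and the damage factor are illustrative assumptions, not the paper's bridge model.

```python
# Sketch: P(failure) = P(damaged resistance <= load effect) by Monte Carlo.
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000
R = rng.lognormal(mean=np.log(1500.0), sigma=0.10, size=n)  # resistance [kN*m]
S = rng.gumbel(loc=900.0, scale=80.0, size=n)               # load effect [kN*m]
damage = 0.85                                               # localized strength loss factor

pf = np.mean(damage * R <= S)
print(f"Estimated probability of failure: {pf:.2e}")
```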


Noret E.,Phimeca Engineering | Prod'Homme G.,INERIS | Yalamas T.,Phimeca Engineering | Reimeringer M.,INERIS | And 3 more authors.
European Journal of Environmental and Civil Engineering | Year: 2012

The occurrence of a chain reaction (domino effect) triggered by blast loading on atmospheric storage tanks in oil and chemical facilities is hard to predict. Current French practice for SEVESO facilities ignores projectiles and assumes a critical peak overpressure value derived from accident data. This method can lead to assessments that are either overly conservative or dangerous. This study presents several simple mechanical models that enable quick and effective risk assessment, the results of which are compared with current practice. The damage modes are based on experience from the most recent accidents in France. Uncertainty propagation methods are used to evaluate the sensitivity and the failure probability of global tank models for a selection of overpressure signatures. The present work uses these evaluations to demonstrate the importance of a dynamic analysis when studying domino effects in accidents. © 2012 Phimeca Engineering.
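As a rough illustration of why a dynamic analysis matters (not the paper's tank models), the sketch below integrates a single-degree-of-freedom oscillator driven by a triangular overpressure pulse; the peak displacement can then be compared against a failure criterion. All masses, stiffnesses and pulse parameters are assumptions.

```python
# Sketch: SDOF dynamic response to a triangular blast overpressure pulse.
import numpy as np
from scipy.integrate import solve_ivp

m, k, c = 500.0, 2.0e6, 2.0e3            # mass [kg], stiffness [N/m], damping [N*s/m]
p_peak, t_d, area = 20_000.0, 0.05, 3.0  # peak overpressure [Pa], duration [s], area [m^2]

def pulse(t):
    return p_peak * max(0.0, 1.0 - t / t_d) * area  # triangular force pulse [N]

def rhs(t, y):
    x, v = y
    return [v, (pulse(t) - c * v - k * x) / m]

sol = solve_ivp(rhs, (0.0, 0.5), [0.0, 0.0], max_step=1e-4)
x_max = np.max(np.abs(sol.y[0]))
print(f"Peak displacement: {x_max * 1e3:.1f} mm")   # compare to a failure criterion
```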


Baussaron J.,PHIMECA Engineering | Yalamas T.,PHIMECA Engineering
Bridge Maintenance, Safety, Management, Resilience and Sustainability - Proceedings of the Sixth International Conference on Bridge Maintenance, Safety and Management | Year: 2012

In structures subjected to repeated variable loads, the initiation and growth of cracks in the material are the most common causes of structural deterioration. When it happens after a long time, this failure mode is called fatigue. Under variable-amplitude repeated loads, local cracks initiate at critical points of the structure. The fatigue phenomenon has caused several well-known accidents over the last hundred years, and this complicated phenomenon has been discussed in many articles and books seeking to observe, explain and model it. All fields of engineering have been confronted with fatigue failures of structures. Many of the parameters used to predict times to failure are variable, and their variations have a strong influence on the real lifetime. This paper focuses on a global methodology for taking the main sources of variability into account in fatigue life prediction. The first step of this methodology is to determine the variability of each parameter. Loading is one of the most important sources of variability: rainflow matrices are used to transform a time-varying signal into constant-amplitude cycles, and a fatigue-equivalent load is then defined that produces the same damage as the initial signal. Another important source of variability is the strength of the structure, for which probabilistic Wöhler curves are fitted to represent the failure probability. An accurate reliability assessment of the structure can be performed once the main variability sources are considered. Reliability methods also make it possible to rank the influence of each variable and to optimize parameters so as to reach a product reliability target. © 2012 Taylor & Francis Group.
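A minimal sketch of the fatigue-equivalent-load step mentioned above: given a rainflow cycle histogram, Miner's rule combined with a Basquin-type S-N curve yields the constant amplitude that produces the same damage over a reference cycle count. The histogram and the exponent are illustrative assumptions.

```python
# Sketch: fatigue-equivalent amplitude from a rainflow cycle histogram.
import numpy as np

amplitudes = np.array([40.0, 60.0, 80.0, 120.0])   # stress amplitudes [MPa]
counts = np.array([50_000, 20_000, 5_000, 500])    # rainflow cycle counts
m_basquin = 5.0                                    # S-N curve slope exponent
n_eq = 1_000_000                                   # reference cycle count

# With Basquin's law N = C * S^-m, Miner damage is proportional to n * S^m,
# so equating total damage gives the equivalent constant amplitude:
s_eq = (np.sum(counts * amplitudes ** m_basquin) / n_eq) ** (1.0 / m_basquin)
print(f"Equivalent amplitude over {n_eq:.0e} cycles: {s_eq:.1f} MPa")
```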


Julien B.,Phimeca Engineering | Bertrand F.,Phimeca Engineering | Thierry Y.,Phimeca Engineering
Procedia Engineering | Year: 2013

Industrial components subjected to random vibration fatigue (loading defined by a power spectral density, PSD, or a sinusoidal spectrum) are frequently designed with the help of numerical simulations used to estimate the lifetime and verify the behaviour of the mechanical system. Finite element simulations require a significant number of input parameters (material properties such as Young's modulus and density, stiffness of joints between components, element type, etc.). Some of these parameters are identified from tests with unavoidable uncertainty, and others are subject to inherent variability. It is important to know the influence of this variability/uncertainty on the model responses. Innovative methods have been developed to determine the sensitivity of the model responses to uncertain parameters in order to study the robustness of the design. The identification of the most influential parameters, together with the study of their variations, then makes it possible to determine the dispersion of results for the modal analysis (eigenfrequencies, mode shapes), the random vibration calculation (RMS stresses) and the damage calculation. This methodology provides an estimate of the probability of system failure instead of the binary result obtained by a deterministic design method. © 2013 The Authors. Published by Elsevier Ltd.
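As a rough illustration of propagating parameter variability to a modal response (not the paper's finite element workflow), the sketch below pushes uncertain Young's modulus and density through the closed-form first bending frequency of a cantilever beam; the geometry and dispersions are assumptions.

```python
# Sketch: dispersion of a first eigenfrequency under material variability.
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
E = rng.normal(210e9, 10e9, size=n)        # Young's modulus [Pa]
rho = rng.normal(7850.0, 150.0, size=n)    # density [kg/m^3]
L, b, h = 0.5, 0.02, 0.005                 # cantilever length, width, thickness [m]
I, A = b * h ** 3 / 12.0, b * h            # section inertia and area

# First bending frequency of a cantilever: (beta1*L)^2 = 1.8751^2.
f1 = (1.8751 ** 2 / (2.0 * np.pi)) * np.sqrt(E * I / (rho * A * L ** 4))
print(f"f1 = {f1.mean():.1f} Hz +/- {f1.std():.1f} Hz (1 sigma)")
```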


Argence S.,NUMTECH | Armand P.,CEA DAM Ile-de-France | Brocheton F.,NUMTECH | Yalamas T.,PHIMECA Engineering | And 2 more authors.
HARMO 2011 - Proceedings of the 14th International Conference on Harmonisation within Atmospheric Dispersion Modelling for Regulatory Purposes | Year: 2011

Long-term impact assessment (LIA) studies still rely on deterministic dispersion modelling systems. Such a methodology does not allow the uncertainties in the results of LIA studies to be quantified, even though this kind of information is increasingly requested by regulatory institutions. For three years, NUMTECH, CEA and PHIMECA have been collaborating on a comprehensive tool for quantifying the uncertainties related to LIA results. This tool is now functional and covers all aspects of a standard uncertainty study, from the definition of input uncertainties to the statistical processing of concentration outputs (mean, confidence intervals, probabilities of threshold exceedance, etc.). The paper explains how this tool was developed and gives examples of its use for a practical case study (with a single source, flat terrain and no plume rise) and a Gaussian plume dispersion model.
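A minimal sketch of the kind of uncertainty study the tool automates, for the simple Gaussian-plume case mentioned above: sample uncertain inputs, evaluate the plume model, and post-process the concentration output into a mean, a confidence interval and a threshold-exceedance probability. The dispersion coefficients, distributions and threshold are illustrative assumptions.

```python
# Sketch: Monte Carlo uncertainty study around a Gaussian plume model.
import numpy as np

rng = np.random.default_rng(5)
n = 50_000
Q = rng.lognormal(np.log(1.0), 0.3, size=n)     # emission rate [g/s]
u = rng.lognormal(np.log(3.0), 0.2, size=n)     # wind speed [m/s]
y, z, H = 0.0, 0.0, 20.0                        # receptor offsets and stack height [m]
sig_y, sig_z = 35.0, 18.0                       # dispersion coefficients at the receptor [m]

# Ground-level Gaussian plume concentration with ground reflection.
c = (Q / (2.0 * np.pi * u * sig_y * sig_z)
     * np.exp(-y**2 / (2 * sig_y**2))
     * (np.exp(-(z - H)**2 / (2 * sig_z**2)) + np.exp(-(z + H)**2 / (2 * sig_z**2))))

lo, hi = np.quantile(c, 0.025), np.quantile(c, 0.975)
print(f"mean = {c.mean():.2e} g/m^3, 95% interval = [{lo:.2e}, {hi:.2e}]")
print(f"P(c > 1e-4 g/m^3) = {np.mean(c > 1e-4):.3f}")
```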


Dominguez N.,Airbus | Feuillard V.,Airbus | Jenson F.,CEA Saclay Nuclear Research Center | Willaume P.,PHIMECA Engineering
AIP Conference Proceedings | Year: 2012

The concept of Probability of Detection (POD) is generally used to quantitatively assess the performance and reliability of NDT operations for in-service inspections related to damage-tolerant designs. Applying the POD approach as a metric for manufacturing NDT assessment would also be relevant, but the very high cost of such campaigns generally prevents it. However, the increasing capability and maturity of NDT simulation open the field to simulation-supported POD demonstrations for manufacturing NDT. This paper presents the example of an automated phased-array ultrasonic testing procedure for Electron Beam Welding on rotating parts, as part of the PICASSO European project. The POD is calculated using the uncertainty propagation approach in CIVA. The peculiarities of uncertainties in automated NDT compared with in-service manual operations are discussed, raising questions about the appropriate statistics for this kind of data. Alternative estimation techniques such as the Box-Cox transform or quantile regression are proposed and evaluated. © 2012 American Institute of Physics.
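As a rough illustration of signal-response POD estimation, the sketch below fits the baseline Berens-type linear model of log-signal versus log-size to simulated data and derives POD from a detection threshold; this is the standard model that the Box-Cox and quantile-regression alternatives modify, not the PICASSO study's CIVA workflow, and all data and the threshold are assumptions.

```python
# Sketch: POD(a) from a Berens-type "ahat versus a" regression on log axes.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
a = rng.uniform(0.2, 3.0, size=200)                                  # flaw sizes [mm]
ln_ahat = 0.5 + 1.2 * np.log(a) + rng.normal(0.0, 0.3, size=a.size)  # log signal amplitude

res = stats.linregress(np.log(a), ln_ahat)               # linear model on log-log axes
resid = ln_ahat - (res.intercept + res.slope * np.log(a))
sigma = np.std(resid, ddof=2)                            # residual scatter
ln_th = np.log(1.5)                                      # detection threshold on the signal

def pod(size):
    """POD(a) = P(ln ahat > ln threshold) under the fitted model."""
    mu = res.intercept + res.slope * np.log(size)
    return stats.norm.cdf((mu - ln_th) / sigma)

for s in (0.5, 1.0, 2.0):
    print(f"POD({s:.1f} mm) = {pod(s):.3f}")
```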
