Center d'Affaires du Zenith

Pérignat-lès-Sarliève, France


Dubourg V.,Center d'Affaires du Zenith | Dubourg V.,French Institute for Advanced Mechanics | Sudret B.,ETH Zurich | Deheeger F.,Center d'Affaires du Zenith
Probabilistic Engineering Mechanics | Year: 2013

Structural reliability methods aim at computing the probability of failure of systems with respect to some prescribed performance functions. In modern engineering such functions usually involve running an expensive-to-evaluate computational model (e.g. a finite element model). In this respect, simulation methods, which may require 10³ to 10⁶ runs, cannot be used directly. Surrogate models such as quadratic response surfaces, polynomial chaos expansions or Kriging (which are built from a limited number of runs of the original model) are then introduced as substitutes for the original model to cope with the computational cost. In practice, though, it is almost impossible to quantify the error made by this substitution. In this paper we propose to use a Kriging surrogate of the performance function as a means to build a quasi-optimal importance sampling density. The probability of failure is eventually obtained as the product of an augmented probability, computed by substituting the metamodel for the original performance function, and a correction term which ensures that the estimation is unbiased even if the metamodel is not fully accurate. The approach is applied to analytical and finite element reliability problems and proves efficient up to 100 basic random variables. © 2013 Elsevier Ltd. All rights reserved.
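The decomposition just described lends itself to a compact numerical sketch. The following is a minimal, illustrative Python implementation of the idea (not the authors' code): a toy limit state g(x) = 3 − (x1 + x2)/√2 with standard normal inputs stands in for the expensive model, scikit-learn's Gaussian process plays the role of the Kriging surrogate, and weighted resampling approximates draws from the quasi-optimal instrumental density.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def g(x):
    # Toy stand-in for the expensive performance function; exact
    # Pf = Phi(-3) since (x1 + x2)/sqrt(2) ~ N(0, 1).
    return 3.0 - (x[:, 0] + x[:, 1]) / np.sqrt(2.0)

# 1. Fit a Kriging surrogate on a small design of experiments.
X_doe = 2.0 * rng.standard_normal((40, 2))
gp = GaussianProcessRegressor(kernel=RBF(1.0), normalize_y=True)
gp.fit(X_doe, g(X_doe))

def pi(x):
    # Probability that the surrogate classifies x as failed.
    mu, sd = gp.predict(x, return_std=True)
    return norm.cdf(-mu / np.maximum(sd, 1e-12))

# 2. Augmented probability P_eps = E_f[pi(X)], metamodel calls only.
X_mc = rng.standard_normal((100_000, 2))
pi_mc = pi(X_mc)
P_eps = pi_mc.mean()

# 3. Correction factor alpha = E_h[1{g(X)<=0} / pi(X)] with h ~ pi*f:
#    weighted resampling approximates draws from h; only these few
#    points require calls to the true performance function g.
idx = rng.choice(len(X_mc), size=200, p=pi_mc / pi_mc.sum())
X_h = X_mc[idx]
alpha = np.mean((g(X_h) <= 0) / pi(X_h))

print(f"Pf ~ {P_eps * alpha:.3e}  (exact {norm.cdf(-3):.3e})")
```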


Venkovic N.,Laval University | Sorelli L.,Laval University | Sudret B.,ETH Zurich | Yalamas T.,Center d'Affaires du Zenith | Gagne R.,Université de Sherbrooke
Probabilistic Engineering Mechanics | Year: 2013

The durability of concrete materials with regard to early-age volume changes and cracking phenomena depends on the evolution of the poroelastic properties of cement paste. The ability of engineers to control the uncertainty of the percolation threshold and of the evolution of the elastic modulus, the Biot-Willis parameter and the skeleton Biot modulus is key to minimizing the vulnerability of concrete structures at early age. This work presents original results on the uncertainty propagation and the sensitivity analysis of a multiscale poromechanics-hydration model applied to cement pastes with water-to-cement ratios of 0.40, 0.50 and 0.60. Notably, the proposed approach provides the poroelastic properties required to model the behavior of partially saturated aging cement pastes (e.g. autogenous shrinkage), and it predicts the percolation threshold and undrained elastic modulus in good agreement with experimental data. The development of a stochastic metamodel using polynomial chaos expansions makes it possible to propagate the uncertainties of the kinetic parameters of hydration, the cement phase composition, the elastic moduli and the morphological parameters of the microstructure. The presented results show that the propagation does not magnify the uncertainty of the individual poroelastic properties, although their correlation may amplify the variability of the estimates obtained from poroelastic state equations. In order to reduce the uncertainty of the percolation threshold and that of the poroelastic properties at early age, engineers need to assess more accurately the apparent activation energy of calcium aluminate and, later on, the elastic modulus of low-density calcium-silicate-hydrate. © 2012 Elsevier Ltd.
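A one-line variance identity underlies the remark about correlation: for a quantity Z = aX + bY derived from two poroelastic properties, Var(Z) = a²Var(X) + b²Var(Y) + 2ab Cov(X, Y), so positively correlated inputs inflate the variability of Z even when their individual uncertainties are unchanged. A minimal numpy illustration (the coefficients and standard deviations are made up for the demo, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
a, b, sx, sy = 1.0, 1.0, 0.05, 0.05   # illustrative coefficients / std devs

for rho in (0.0, 0.8):
    cov = [[sx**2, rho * sx * sy], [rho * sx * sy, sy**2]]
    X, Y = rng.multivariate_normal([1.0, 1.0], cov, size=200_000).T
    # Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y)
    print(f"rho = {rho:.1f}: std(aX + bY) = {np.std(a * X + b * Y):.4f}")
```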


Dubourg V.,Center d'Affaires du Zenith | Sudret B.,ETH Zurich
Structural Safety | Year: 2014

Reliability sensitivity analysis aims at studying the influence of the parameters of the probabilistic model on the probability of failure of a given system. Such an influence may either be quantified over a given range of values of the parameters of interest using a parametric analysis, or only locally by means of its partial derivatives. This paper is concerned with the latter approach when the limit-state function involves the output of an expensive-to-evaluate computational model. In order to reduce the computational cost, it is proposed to compute the failure probability by means of the recently proposed meta-model-based importance sampling method. This method resorts to the adaptive construction of a Kriging meta-model which emulates the limit-state function. Then, instead of using this meta-model as a surrogate for computing the probability of failure, its probabilistic nature is used to build a quasi-optimal instrumental density function for accurately computing the actual failure probability through importance sampling. The proposed estimator of the failure probability is recast as a product of two terms. The augmented failure probability is estimated using the emulator only, while the correction factor is estimated using both the actual limit-state function and its emulator in order to quantify the substitution error. This estimator is then differentiated by means of the score function approach, which enables the estimation of the gradient of the failure probability without any additional call to the limit-state function (nor to its Kriging emulator). The approach is validated on three structural reliability examples. © 2013 Elsevier Ltd.
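The score-function trick mentioned above can be stated compactly: for an input X with density f(x; θ), ∂Pf/∂θ = E[1{g(X) ≤ 0} ∂ln f(X; θ)/∂θ], so the gradient reuses exactly the samples (and g-evaluations) already drawn for the Pf estimate. A minimal sketch, assuming a single Gaussian input with parameter θ = μ and a toy limit state (plain Monte Carlo here, rather than the paper's meta-model-based importance sampling):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
mu, sigma = 0.0, 1.0
x = rng.normal(mu, sigma, size=1_000_000)

fail = (2.5 - x <= 0).astype(float)   # toy limit state g(x) = 2.5 - x
score = (x - mu) / sigma**2           # d ln f(x; mu) / d mu for a Gaussian

Pf = fail.mean()                      # crude Monte Carlo estimate
dPf_dmu = (fail * score).mean()       # same samples, no extra g calls
print(f"Pf ~ {Pf:.3e} (exact {norm.sf(2.5):.3e}), "
      f"dPf/dmu ~ {dPf_dmu:.3e} (exact {norm.pdf(2.5):.3e})")
```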


Blatman G.,French Institute for Advanced Mechanics | Blatman G.,Électricité de France | Sudret B.,French Institute for Advanced Mechanics | Sudret B.,Center d'Affaires du Zenith
Journal of Computational Physics | Year: 2011

Polynomial chaos (PC) expansions are used in stochastic finite element analysis to represent the random model response by a set of coefficients in a suitable (so-called polynomial chaos) basis. The number of terms to be computed grows dramatically with the size of the input random vector, which makes the computational cost of classical solution schemes (be they intrusive (i.e. of Galerkin type) or non-intrusive) unaffordable when the deterministic finite element model is expensive to evaluate. To address such problems, the paper describes a non-intrusive method that builds a sparse PC expansion. First, an original strategy for truncating the PC expansions, based on hyperbolic index sets, is proposed. Then an adaptive algorithm based on least angle regression (LAR) is devised for automatically detecting the significant coefficients of the PC expansion. Besides the sparsity of the basis, the experimental design used at each step of the algorithm is systematically complemented in order to avoid the overfitting phenomenon. The accuracy of the PC metamodel is checked using an estimate inspired by statistical learning theory, namely the corrected leave-one-out error. As a consequence, a rather small number of PC terms is eventually retained (sparse representation), which may be obtained at a reduced computational cost compared to the classical "full" PC approximation. The convergence of the algorithm is shown on an analytical function. Then the method is illustrated on three stochastic finite element problems. The first model features 10 input random variables, whereas the two others involve an input random field, which is discretized into 38 and 30 to 500 random variables, respectively. © 2010 Elsevier Inc.
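The two ingredients named above, hyperbolic truncation and LAR-based coefficient selection, can be prototyped in a few lines. The sketch below is illustrative only: two standard normal inputs, probabilists' Hermite polynomials, and scikit-learn's cross-validated LARS standing in for the paper's corrected leave-one-out criterion.

```python
import itertools
import numpy as np
from numpy.polynomial.hermite_e import hermeval
from sklearn.linear_model import LarsCV

rng = np.random.default_rng(3)
d, p, q = 2, 5, 0.75   # input dimension, max degree, hyperbolic q-norm

# Hyperbolic index set: multi-indices with (sum_i alpha_i^q)^(1/q) <= p.
alphas = [a for a in itertools.product(range(p + 1), repeat=d)
          if sum(ai**q for ai in a) ** (1.0 / q) <= p]

def design_matrix(X):
    # One column per multi-index: the product of probabilists' Hermite
    # polynomials He_{alpha_i}(x_i) over the input dimensions.
    cols = []
    for a in alphas:
        col = np.ones(len(X))
        for i, ai in enumerate(a):
            c = np.zeros(ai + 1)
            c[ai] = 1.0
            col *= hermeval(X[:, i], c)
        cols.append(col)
    return np.column_stack(cols)

def model(X):   # toy model to be surrogated
    return X[:, 0]**3 + X[:, 0] * X[:, 1] + 0.5 * X[:, 1]**2

X = rng.standard_normal((200, d))
lars = LarsCV(fit_intercept=False).fit(design_matrix(X), model(X))
kept = np.count_nonzero(np.abs(lars.coef_) > 1e-8)
print(f"{kept} of {len(alphas)} candidate PC terms retained")
```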


Pou J.-M.,Center d'Affaires du Zenith | Leblond L.,PSA Peugeot Citroën
International Journal of Metrology and Quality Engineering | Year: 2015

Beyond their evaluation, measurement uncertainties raise many questions about their use in the context of the declaration of conformity of "products and services". While different approaches have been developed over the past years, including the "capability approach", 2012 saw the publication of JCGM document 106. This document was adopted as international standard ISO/IEC Guide 98-4 by ISO that very same year, and was taken up in 2013 into the collection of French standards (NF ISO/IEC Guide 98-4). This approach differs markedly from traditional approaches in that it introduces Bayesian concepts into the world of metrology, which was hitherto relatively impermeable to them. With this new approach, metrologists discover that measurement is not a science of discovery, but a science of confirmation (or denial) of an "a priori". © 2015 EDP Sciences.
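In the simplest Gaussian case, the Bayesian mechanics behind JCGM 106 reduce to a conjugate update of the prior on the measurand followed by an integration of the posterior over the tolerance interval. A minimal sketch with made-up numbers (the prior, measurement and tolerance limits below are purely illustrative):

```python
from scipy.stats import norm

m0, s0 = 10.0, 0.5   # Gaussian prior on the measurand (the "a priori")
y, u = 10.2, 0.2     # measurement result and its standard uncertainty
TL, TU = 9.5, 10.5   # tolerance limits for the declaration of conformity

# Conjugate Gaussian update: precisions (inverse variances) add up.
s_post = (1.0 / s0**2 + 1.0 / u**2) ** -0.5
m_post = s_post**2 * (m0 / s0**2 + y / u**2)

p_conform = norm.cdf(TU, m_post, s_post) - norm.cdf(TL, m_post, s_post)
print(f"posterior N({m_post:.3f}, {s_post:.3f}^2); P(conform) = {p_conform:.3f}")
```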


Dubourg V.,Center d'Affaires du Zenith | Dubourg V.,French Institute for Advanced Mechanics | Sudret B.,Center d'Affaires du Zenith | Sudret B.,French Institute for Advanced Mechanics | Bourinet J.-M.,French Institute for Advanced Mechanics
Structural and Multidisciplinary Optimization | Year: 2011

The aim of the present paper is to develop a strategy for solving reliability-based design optimization (RBDO) problems that remains applicable when the performance models are expensive to evaluate. Starting with the premise that simulation-based approaches are not affordable for such problems, and that most-probable-failure-point-based approaches do not make it possible to quantify the error on the estimation of the failure probability, an approach based on both metamodels and advanced simulation techniques is explored. The Kriging metamodeling technique is chosen to surrogate the performance functions because it allows one to genuinely quantify the surrogate error. The surrogate error on the limit-state surfaces is propagated to the failure probability estimates in order to provide an empirical error measure. This error is then sequentially reduced by means of a population-based adaptive refinement technique until the Kriging surrogates are accurate enough for reliability analysis. This original refinement strategy makes it possible to add several observations to the design of experiments at the same time. Reliability and reliability sensitivity analyses are performed by means of the subset simulation technique for the sake of numerical efficiency. The adaptive surrogate-based strategy for reliability estimation is finally embedded in a classical gradient-based optimization algorithm in order to solve the RBDO problem. The Kriging surrogates are built in a so-called augmented reliability space, thus making them reusable from one nested RBDO iteration to the next. The strategy is compared to other approaches available in the literature on three academic examples in the field of structural mechanics. © Springer-Verlag 2011.
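The population-based refinement idea can be illustrated compactly: candidate points whose sign the Kriging surrogate cannot classify confidently (|μ(x)| ≤ kσ(x)) delimit a margin around the predicted limit-state surface, and clustering that margin yields several enrichment points per iteration. A hedged sketch (k = 1.96 and K-means clustering are assumptions chosen for the demo, not a transcription of the paper's criterion):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(4)

def g(x):   # toy performance function standing in for the FE model
    return x[:, 0]**2 + x[:, 1] - 2.0

X_doe = rng.uniform(-3, 3, size=(25, 2))
gp = GaussianProcessRegressor(kernel=RBF(1.0), normalize_y=True)
gp.fit(X_doe, g(X_doe))

# Margin: candidates whose sign of g is ambiguous under the surrogate.
candidates = rng.uniform(-3, 3, size=(10_000, 2))
mu, sd = gp.predict(candidates, return_std=True)
margin = candidates[np.abs(mu) <= 1.96 * sd]

# One enrichment point per K-means cluster: a batch of well-spread
# observations that can be evaluated on the true model in parallel.
new_pts = KMeans(n_clusters=5, n_init=10).fit(margin).cluster_centers_
print(f"{len(margin)} ambiguous candidates -> {len(new_pts)} new DoE points")
```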


Martin C.,National Engineering School of Tarbes | Micol A.,Center d'Affaires du Zenith | Peres F.,National Engineering School of Tarbes
Structural and Multidisciplinary Optimization | Year: 2016

In this paper a method is proposed that introduces an adaptive response surface which makes it possible, on the one hand, to minimize the number of calls to finite element codes for assessing the parameters of the response surface and, on the other hand, to refine the solution around the design point by iterating through the procedure. This method is implemented in a parallel environment in order to optimize the computation time with respect to the architecture of a computational cluster. © 2016 Springer-Verlag Berlin Heidelberg
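A bare-bones version of such an iteration might look as follows (illustrative only, with a cheap analytical stand-in for the finite element code): fit a quadratic response surface on a local design of experiments, locate the design point on the surrogate, and re-center the next design of experiments there.

```python
import numpy as np
from scipy.optimize import minimize

def g(x):   # cheap analytical stand-in for a finite element code call
    return 3.0 - x[0] - 0.5 * x[1]**2

def quad_features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

rng = np.random.default_rng(5)
center = np.zeros(2)
for it in range(3):
    # Local design of experiments around the current design-point estimate.
    X = center + rng.normal(scale=1.0, size=(20, 2))
    y = np.array([g(x) for x in X])
    beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)
    g_hat = lambda u: quad_features(u[None, :]) @ beta
    # Design point: nearest point to the origin on the surrogate surface.
    res = minimize(lambda u: u @ u, center, method="SLSQP",
                   constraints={"type": "eq", "fun": lambda u: g_hat(u)[0]})
    center = res.x
    print(f"iter {it}: design point ~ {center.round(3)}, "
          f"beta_HL ~ {np.linalg.norm(center):.3f}")
```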


Caniou Y.,French Institute for Advanced Mechanics | Sudret B.,Center d'Affaires du Zenith
Applications of Statistics and Probability in Civil Engineering -Proceedings of the 11th International Conference on Applications of Statistics and Probability in Civil Engineering | Year: 2011

Modern engineering based on virtual testing platforms involves multiple models with dependent parameters. In this work, a methodology to address global sensitivity analysis for this kind of problem is introduced. A moment-independent sensitivity index that suits problems with dependent parameters is reviewed. A metamodeling technique, namely the generalized polynomial chaos expansion, is used to process massive simulations at low cost. The copula theory is briefly presented; it allows one to describe the dependence structure of the parameters more precisely than a traditional correlation coefficient. The methodology is applied to the sensitivity analysis of a mechanical example, namely a composite beam under dead weight. © 2011 Taylor & Francis Group, London.
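The copula idea is easy to demonstrate: dependence is modelled on uniform ranks, independently of the marginal distributions, which a single linear correlation coefficient cannot fully capture. A minimal Gaussian-copula sketch with arbitrary marginals (the correlation level and the marginal distributions are illustrative choices):

```python
import numpy as np
from scipy.stats import norm, lognorm, gumbel_r, spearmanr

rng = np.random.default_rng(6)
rho = 0.7
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))

z = rng.standard_normal((50_000, 2)) @ L.T   # correlated standard Gaussians
u = norm.cdf(z)                              # Gaussian copula samples in [0,1]^2

# Arbitrary marginals are imposed through their quantile functions.
x1 = lognorm.ppf(u[:, 0], s=0.25, scale=np.exp(1.0))
x2 = gumbel_r.ppf(u[:, 1], loc=5.0, scale=0.8)

# The rank dependence survives the change of marginals.
print(f"Spearman rho(x1, x2) ~ {spearmanr(x1, x2)[0]:.2f}")
```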


Blatman G.,Électricité de France | Sudret B.,French Institute for Advanced Mechanics | Sudret B.,Center d'Affaires du Zenith
Applications of Statistics and Probability in Civil Engineering -Proceedings of the 11th International Conference on Applications of Statistics and Probability in Civil Engineering | Year: 2011

Polynomial chaos (PC) expansions allow an explicit representation of the random response of a mechanical system whose input parameters are modelled by random variables. Recently, an iterative procedure based on least angle regression (LAR) has been devised in order to build up sparse PC approximations (i.e. PC representations containing a small number of significant coefficients) by means of a low number of model evaluations. This approach was, however, dedicated to scalar model responses. In contrast, a vector-valued response is considered in this paper. A two-step strategy is proposed in order to approximate all the response components by means of a few PC representations. First, a principal component analysis (PCA) of the vector random response is carried out, making it possible to capture the main stochastic features of the response by means of a small number of (non-physical) variables compared to the original number of output components. Then the LAR procedure is applied to each non-physical variable. The method is finally applied to the study of the displacement field of a truss structure involving 10 random variables. © 2011 Taylor & Francis Group, London.
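A compact sketch of the two-step strategy, with scikit-learn's PCA and cross-validated LARS over an ordinary polynomial basis standing in for the sparse PC machinery (the toy vector response and all settings are illustrative):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LarsCV
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(7)

def model(X):   # toy vector-valued response with 20 output components
    t = np.linspace(0.0, 1.0, 20)
    return np.outer(X[:, 0], t) + np.outer(X[:, 1]**2, t**2)

X = rng.standard_normal((300, 3))
Y = model(X)

# Step 1: PCA compresses the 20 outputs into a few "non-physical" scores.
pca = PCA(n_components=0.999)
scores = pca.fit_transform(Y)

# Step 2: one sparse regression per retained component.
Phi = PolynomialFeatures(degree=3).fit_transform(X)
fits = [LarsCV(fit_intercept=False).fit(Phi, scores[:, j])
        for j in range(scores.shape[1])]

# Reconstruct the full response field from the few surrogated components.
Y_hat = pca.inverse_transform(
    np.column_stack([f.predict(Phi) for f in fits]))
print(f"{scores.shape[1]} components kept; relative error "
      f"{np.linalg.norm(Y - Y_hat) / np.linalg.norm(Y):.2e}")
```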


Dubourg V.,French Institute for Advanced Mechanics | Dubourg V.,Center d'Affaires du Zenith | Deheeger F.,Center d'Affaires du Zenith | Sudret B.,French Institute for Advanced Mechanics | Sudret B.,Center d'Affaires du Zenith
Applications of Statistics and Probability in Civil Engineering -Proceedings of the 11th International Conference on Applications of Statistics and Probability in Civil Engineering | Year: 2011

In the field of structural reliability, the Monte Carlo estimator is considered the reference probability estimator. However, it is still intractable for real engineering cases since it requires a high number of runs of the model. In order to reduce the number of computer experiments, many other approaches, known as reliability methods, have been proposed. One such approach consists in replacing the original experiment by a surrogate which is much faster to evaluate. Nevertheless, it is often difficult (or even impossible) to quantify the error made by this substitution. In this paper an alternative approach is developed. It takes advantage of Kriging metamodeling and importance sampling techniques. The proposed alternative estimator is finally applied to a finite element based structural reliability analysis. © 2011 Taylor & Francis Group, London.
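In the notation suggested by this abstract and the 2013 journal version above, the estimator rests on the identity below, where π(x) = P[Ĝ(x) ≤ 0] is the probabilistic classification function of the Kriging surrogate, f the input density, and h ∝ πf the quasi-optimal instrumental density (a reconstruction from the abstracts, not a quotation):

```latex
% pi(x) = P[ \hat{G}(x) \le 0 ] : Kriging-based probabilistic classifier
% f : input density,  h \propto \pi f : quasi-optimal instrumental density
\hat{P}_f
  = \underbrace{\int \pi(x)\, f(x)\, \mathrm{d}x}_{P_{f\varepsilon}
      \text{ (augmented probability, metamodel only)}}
  \times
  \underbrace{\mathbb{E}_h\!\left[\frac{\mathbf{1}\{g(X) \le 0\}}{\pi(X)}\right]}_{\alpha_{\mathrm{corr}}
      \text{ (correction factor, true model)}}
```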
