Aven T., University of Stavanger | Nokland T.E., IRIS - International Research Institute of Stavanger
Reliability Engineering and System Safety | Year: 2010

This paper discusses the use of uncertainty importance measures in reliability and risk analysis. Such measures are used to rank the importance of components (activities) of complex systems. The measures reflect to what degree uncertainties at the component level influence uncertainties at the system level. An example of such a measure is the change in the variance of the system reliability when the uncertainties in a component's reliability are ignored. The measures are traditionally based on a Bayesian perspective, as knowledge-based (subjective) probabilities express the epistemic uncertainties about the reliability and risk parameters introduced. In this paper we carry out a rethinking of the rationale for such measures. What information do they provide compared to traditional importance measures such as the improvement potential and the Birnbaum measure? To discuss these issues we distinguish between two situations: (A) the key quantities of interest are observable quantities, such as the occurrence of a system failure and the number of failures, and (B) the key quantities of interest are fictional parameters constructed to reflect the aleatory uncertainties. A new type of combined measure is introduced, based on the integration of a traditional measure and a related uncertainty importance measure. A simple reliability example is used to illustrate the analysis and findings. © 2009 Elsevier Ltd. All rights reserved.
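
As a rough illustration of the variance-based measure described above (this toy setup is not from the paper): for a hypothetical series system of three components whose reliabilities carry epistemic uncertainty modelled by Beta distributions, the uncertainty importance of a component can be estimated by Monte Carlo as the reduction in the variance of the system reliability when that component's reliability is fixed at its mean.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Hypothetical epistemic uncertainty on three component reliabilities:
# p_i ~ Beta(alpha_i, beta_i)  (illustrative parameter choices)
alphas = np.array([40.0, 20.0, 10.0])
betas = np.array([2.0, 2.0, 2.0])

p = rng.beta(alphas, betas, size=(N, 3))  # sampled component reliabilities
R = p.prod(axis=1)                        # series-system reliability
base_var = R.var()

# Uncertainty importance of component i: variance reduction when the
# epistemic uncertainty in p_i is ignored (p_i fixed at its mean).
means = alphas / (alphas + betas)
for i in range(3):
    q = p.copy()
    q[:, i] = means[i]
    reduction = base_var - q.prod(axis=1).var()
    print(f"component {i}: variance reduction {reduction:.2e}")
```

With these parameters the third component, whose Beta distribution is the widest, contributes most of the system-level variance and ranks highest.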

Blomgren A., IRIS - International Research Institute of Stavanger
Corporate Social Responsibility and Environmental Management | Year: 2011

The literature on the business case for Corporate Social Responsibility (CSR) has identified possible mechanisms for CSR to increase profits. Other parts of the CSR literature argue that any strategy yielding above-average profits will be imitated by competitors and profits driven down to the industry average; consequently, CSR may only be a strategy for achieving average profits. The empirical research on the relationship between CSR and profits, consisting mostly of quantitative studies, has thus far proved inconclusive, and many studies have been challenged on methodological grounds. This paper presents the results of interviews with senior executives of 15 of the largest textile companies on the Norwegian market. The aim of the paper is to investigate the relationship between CSR and profits while avoiding the most important methodological pitfalls of the quantitative research and acknowledging the distinction between CSR as a strategy for achieving average profits and as a strategy for achieving above-average profits. © 2010 John Wiley & Sons, Ltd. and ERP Environment.

Sydnes M.O., IRIS - International Research Institute of Stavanger
Current Organic Synthesis | Year: 2011

As the focus on conducting chemistry in a greener fashion increases, chemists need to find new methods for synthetic transformations that are more efficient or that open up the possibility of recycling; by such means, less waste is generated. Catalysts on solid supports are one of many avenues being explored to reach this goal, and for these reasons palladium on solid supports is gaining momentum in synthesis. More and more reaction conditions are being developed that enable the use of heterogeneous catalysts, and a range of solid supports are being explored. The use of a heterogeneous catalyst, which can easily be removed after the synthesis, greatly simplifies the work-up of the reaction mixture. It also reduces the amount of palladium remaining in the product; the latter benefit is particularly important when the product is a pharmaceutical. Another benefit of having the catalyst on a solid support is that it offers the opportunity to recycle the catalyst relatively easily. This review presents some of the latest developments in solid supports for palladium and highlights the use of palladium on solid supports in organic synthesis. © 2011 Bentham Science Publishers.

Chen Y., IRIS - International Research Institute of Stavanger | Oliver D.S., University of Bergen
Computational Geosciences | Year: 2013

The use of the ensemble smoother (ES) instead of the ensemble Kalman filter increases the nonlinearity of the update step during data assimilation and the need for iterative assimilation methods. A previous version of the iterative ensemble smoother, based on a Gauss-Newton formulation, was able to match data relatively well but only after a large number of iterations. A multiple data assimilation method (MDA) was generally more efficient for large problems but lacked the ability to continue "iterating" if the data mismatch was too large. In this paper, we develop an efficient, iterative ensemble smoother algorithm based on the Levenberg-Marquardt (LM) method of regularizing the update direction and choosing the step length. The incorporation of the LM damping parameter reduces the tendency to add model roughness at early iterations when the update step is highly nonlinear, as it often is when all data are assimilated simultaneously. In addition, the ensemble approximation of the Hessian is modified in a way that simplifies computation and increases stability. We also report on a simplified algorithm in which the model mismatch term in the updating equation is neglected. We thoroughly evaluated the new algorithm based on the modified LM method, LM-ensemble randomized maximum likelihood (LM-EnRML), and the simplified version of the algorithm, LM-EnRML (approx), on three test cases. The first is a highly nonlinear single-variable problem for which results can be compared against the true conditional pdf. The second test case is a one-dimensional two-phase flow problem in which the permeability of 31 grid cells is uncertain. In this case, Markov chain Monte Carlo results are available for comparison with ensemble-based results. The third test case is the Brugge benchmark case with both 10 and 20 years of history.
The efficiency and quality of results of the new algorithms were compared with the standard ES (without iteration), the ensemble-based Gauss-Newton formulation, the standard ensemble-based LM formulation, and the MDA. Because of the high level of nonlinearity, the standard ES performed poorly on all test cases. The MDA often performed well, especially at early iterations where the reduction in data mismatch was quite rapid. The best results, however, were always achieved with the new iterative ensemble smoother algorithms, LM-EnRML and LM-EnRML (approx). © 2013 Springer Science+Business Media Dordrecht.
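
The flavour of an LM-damped ensemble update can be sketched on a hypothetical linear toy problem (not one of the paper's test cases): the gain uses the ensemble cross-covariances, with the observation-error covariance scaled by (1 + λ) as the damping term, and λ is decreased when a step reduces the data mismatch and increased otherwise. This follows the general structure of the simplified update in which the model mismatch term is neglected; it is a minimal sketch, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical linear forward model g(m) = G @ m standing in for a simulator.
G = np.array([[1.0, 2.0], [0.5, -1.0], [2.0, 1.0]])
m_true = np.array([1.0, -0.5])
C_D = 0.01 * np.eye(3)                         # observation-error covariance
d_obs = G @ m_true + rng.multivariate_normal(np.zeros(3), C_D)

Ne = 100
M = rng.normal(0.0, 1.0, size=(2, Ne))         # prior ensemble of model vectors
# Perturbed observations, one realization per ensemble member.
D_obs = d_obs[:, None] + rng.multivariate_normal(np.zeros(3), C_D, Ne).T

def mismatch(M):
    """Mean squared data mismatch, weighted by C_D, over the ensemble."""
    res = G @ M - D_obs
    return np.mean(np.einsum('ij,ij->j', res, np.linalg.solve(C_D, res)))

lam = 1.0
for it in range(10):
    Dp = G @ M                                 # predicted data for each member
    Am = M - M.mean(axis=1, keepdims=True)     # model anomalies
    Ad = Dp - Dp.mean(axis=1, keepdims=True)   # predicted-data anomalies
    C_md = Am @ Ad.T / (Ne - 1)
    C_dd = Ad @ Ad.T / (Ne - 1)
    # LM-damped gain: damping enters through the (1 + lam) factor on C_D.
    K = C_md @ np.linalg.inv((1 + lam) * C_D + C_dd)
    M_new = M - K @ (Dp - D_obs)
    if mismatch(M_new) < mismatch(M):
        M, lam = M_new, lam / 10               # accept step, reduce damping
    else:
        lam *= 10                              # reject step, increase damping

print("posterior mean:", M.mean(axis=1))       # close to m_true
```

For this linear problem the damped iteration converges quickly; the point of the damping is to tame early steps in the strongly nonlinear cases the paper targets.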

Luo X., IRIS - International Research Institute of Stavanger | Hoteit I., King Abdullah University of Science and Technology
Monthly Weather Review | Year: 2013

This article examines the influence of covariance inflation on the distance between the measured observation and the simulated (or predicted) observation with respect to the state estimate. In order for the aforementioned distance to be bounded in a certain interval, some sufficient conditions are derived, indicating that the covariance inflation factor should be bounded in a certain interval, and that the inflation bounds are related to the maximum and minimum eigenvalues of certain matrices. Implications of these analytic results are discussed, and a numerical experiment is presented to verify the validity of the analysis conducted. © 2013 American Meteorological Society.
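
The qualitative effect at the heart of this analysis can be illustrated numerically (with a hypothetical linear-Gaussian setup, not the article's experiment): in a Kalman update with forecast covariance inflated by a factor λ, the residual between the observation and the updated predicted observation shrinks monotonically as λ grows, so bounding that distance in a target interval amounts to bounding λ.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical setup: state in R^4, the first two components are observed.
n, p = 4, 2
H = np.eye(p, n)                        # observation operator
P = np.diag([1.0, 0.5, 2.0, 1.5])       # forecast error covariance
R = 0.2 * np.eye(p)                     # observation error covariance
x_f = rng.normal(size=n)                # forecast state estimate
d = H @ x_f + rng.normal(0.0, 1.0, p)   # measured observation

def post_distance(lam):
    """||d - H x_a|| after a Kalman update with inflated covariance lam * P."""
    Pl = lam * P
    K = Pl @ H.T @ np.linalg.inv(H @ Pl @ H.T + R)
    x_a = x_f + K @ (d - H @ x_f)
    return np.linalg.norm(d - H @ x_a)

for lam in (0.5, 1.0, 2.0, 5.0):
    print(f"inflation {lam:>4}: residual {post_distance(lam):.4f}")
```

Too little inflation leaves the analysis far from the data; too much pulls it onto the (noisy) observation, which is why the admissible inflation factor lives in a bounded interval.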
