EMANUELE BORGONOVO

Publications

Author of more than 100 works. See the full list in the "Full publication list" folder.


 


Sampling strategies in density-based sensitivity analysis (30/04/2012)


W. Castaings, E. Borgonovo, M.D. Morris, S. Tarantola
Environmental Modelling & Software

Decision- and policy-makers benefit from the utilization of computer codes in an increasing number of areas and applications. Several authorities and agencies recommend the utilization of proper sensitivity analysis methods in order to confidently rely on model results. In this respect, density-based techniques have recently attracted interest among academicians and practitioners for their ability to characterize uncertainty in terms of the entire distribution of an output variable. However, their estimation is a challenging task and, without a proper methodical approach, errors in the estimates can lead to misleading conclusions. In this work, we propose sampling plans for reducing the computational burden of sensitivity estimates while improving and controlling the accuracy of the estimation. We compare designs based on column substitutions and designs based on permutations. We investigate their behaviour in terms of type I and type II errors. We apply the methods to the Level E model, a computational tool developed by the Nuclear Energy Agency of the OECD for the assessment of nuclear waste disposal sites. Results show that application of the proposed sampling plans allows one to obtain confidence in the sensitivity estimates with a number of model runs several orders of magnitude lower than a brute-force approach requires. This assessment, based upon the entire distribution of the model output, provides us with ways to effectively reduce uncertainty in the model output, either by prioritizing the model factors that need to be better known or by prioritizing the areas where additional modelling efforts are needed.
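For context, the density-based measure at the core of this line of work (introduced in "A New Uncertainty Importance Measure", listed below) is commonly written as follows; the notation is a standard rendering, not copied from this paper:

$$ \delta_i = \frac{1}{2} \, E_{X_i}\left[ \int \left| f_Y(y) - f_{Y|X_i}(y) \right| \, dy \right] $$

Here $f_Y$ is the unconditional density of the model output and $f_{Y|X_i}$ the density conditional on fixing input $X_i$. A brute-force estimate requires an inner conditional sample for every value of every input, which is the computational burden the proposed sampling plans are designed to reduce.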

 

Environmental Modelling & Software, 38, December 2012, pp. 13-26.

 



Last change 12/06/2012

Composite Multilinearity, Epistemic Uncertainty and Risk Achievement Worth (03/05/2012)


Borgonovo E. and Smith C.L.
European Journal of Operational Research, forthcoming.
Risk Achievement Worth (RAW) is one of the most widely utilized importance measures. RAW is defined as the ratio of the risk metric value attained when a component has failed over the base case value of the risk metric. Traditionally, both the numerator and denominator are point estimates. The relevant literature has shown that the inclusion of epistemic uncertainty i) induces notable variability in the point estimate ranking and ii) causes the expected value of the risk metric to differ from its nominal value. We investigate the conditions under which the equality of the nominal and expected values of a reliability risk metric holds. We then study how the presence of epistemic uncertainty affects RAW and the associated ranking. We propose an extension of RAW (called ERAW) which allows one to obtain a ranking robust to epistemic uncertainty. We discuss the properties of ERAW and the conditions under which it coincides with RAW. We apply our findings to a probabilistic risk assessment model developed for the safety analysis of NASA lunar space missions.
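For reference, the point-estimate definition recalled in the abstract reads, in standard PSA notation:

$$ \mathrm{RAW}_i = \frac{R(q_i = 1)}{R} $$

where $R$ is the base case value of the risk metric and $R(q_i = 1)$ its value when the failure probability of component $i$ is set to one. ERAW, as we read the abstract, replaces these point values with quantities that account for the epistemic distribution of the parameters; the exact form is given in the paper.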


Last change 04/05/2012

A Study of Interactions in the Risk Assessment of Complex Engineering Systems: An Application to Space PSA (10/12/2011)


E. Borgonovo and C.L. Smith
Operations Research
Risk-managers are often confronted with the evaluation of operational policies in which two or more system components are simultaneously affected by a change. In these instances, the decision-making process should be informed by the relevance of interactions. However, because of system and model complexity, a rigorous study for determining whether and how interactions quantitatively impact operational choices has not yet been developed. In light of the central role played by the multilinearity of the decision-support models, we first investigate the presence of interactions in multilinear functions. We identify interactions that can be excluded a priori from the analysis. We introduce sensitivity measures that apportion the model output change to the factors' individual and interaction contributions in an exact fashion. The sensitivity measures are linked to graphical representation methods such as tornado diagrams and Pareto charts, and a systematic way of inferring managerial insights is presented. We then specialize the findings to reliability and probabilistic safety assessment (PSA) problems. We set forth a procedure for determining the magnitude of changes that make interactions relevant in the analysis. Quantitative results are discussed by application to a PSA model developed at NASA to support decision-making in space mission planning and design. Numerical findings show that sub-optimal decisions concerning the components on which to focus managerial attention can be made if the decision-making process is not informed by the relevance of interactions.
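For readers unfamiliar with the term, a multilinear function of $n$ variables has the generic form (an illustration, not the paper's notation):

$$ f(x_1, \dots, x_n) = \sum_{S \subseteq \{1, \dots, n\}} c_S \prod_{i \in S} x_i $$

It is linear in each variable taken separately, so all interaction effects are carried by the coefficients $c_S$ with $|S| \geq 2$; this is what makes an exact apportionment of a finite output change possible.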


Last change 25/04/2012

Emulators in Moment Independent Sensitivity Analysis: An Application to Environmental Modelling (01/03/2012)


E. Borgonovo, W. Castaings and S. Tarantola
Environmental Modelling and Software
Moment-independent sensitivity methods are attracting increasing attention among practitioners, since they provide a thorough way of investigating the sensitivity of model output under uncertainty. However, their estimation is challenging, especially in the presence of computationally intensive models. We argue that replacement of the original model by a metamodel can contribute to lowering the computational burden. A numerical estimation procedure is set forth. The procedure is first tested on analytical cases of increasing structural complexity. We utilize the emulator proposed by Ratto and Pagano (2010). Results show that the emulator allows an accurate estimation of density-based sensitivity measures when the main structural features of the original model are captured. However, performance deteriorates for a model with interactions of order higher than 2. For this test case, a kriging emulator is also investigated, but no gain in performance is registered. However, an accurate estimation is obtained by applying a logarithmic transformation of the model output for both the kriging and the Ratto and Pagano (2010) emulators. These findings are then applied to the investigation of a benchmark environmental case study, the Level E model. Results show that use of the metamodel allows an efficient estimation of moment-independent sensitivity measures while leading to a notable reduction in computational burden.


Last change 25/04/2012

Sensitivity Analysis with Finite Changes: an Application to Modified EOQ Models (01/06/2010)


E. Borgonovo
European Journal of Operational Research
In this work, we introduce a new method for the sensitivity analysis of model output in the presence of finite changes in one or more of the exogenous variables. We define sensitivity measures that do not rest on differentiability. We relate the sensitivity measures to classical differential and comparative statics indicators. We prove a result that allows us to obtain the sensitivity measures at the same cost as one-variable-at-a-time methods, thus making their estimation feasible also for computationally intensive models. We discuss in detail the derivation of managerial insights, formulating a procedure based on the concept of "Settings". The method is applied to the sensitivity analysis of a discrete change in optimal order quantity following a jump in the exogenous variables of a nonlinear programming inventory model.
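A minimal sketch of the finite-change logic, in generic notation (the paper's own symbols may differ): the total change $\Delta f = f(x^1) - f(x^0)$ decomposes exactly, by inclusion-exclusion, into first-order and interaction terms,

$$ \Delta f = \sum_i \Delta_i f + \sum_{i < j} \Delta_{ij} f + \cdots, \qquad \Delta_i f = f(x_i^1, x_{-i}^0) - f(x^0) $$

where $\Delta_{ij} f$ is the joint effect of shifting $x_i$ and $x_j$ together, net of their individual effects. The result mentioned in the abstract makes these quantities accessible at one-variable-at-a-time cost.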

European Journal of Operational Research, 200, pp. 127–138.


Last change 25/04/2012

Function decomposition, monotonicity and ultramodularity: applications to multi-attribute utility theory (01/09/2011)


Beccacece F. and Borgonovo E.
European Journal of Operational Research
Utility function properties such as monotonicity and concavity play a fundamental role in reflecting a decision-maker's preference structure. These properties are usually characterized via partial derivatives. However, elicitation methods do not necessarily lead to twice-differentiable utility functions. Furthermore, while in a single-attribute context concavity fully reflects risk aversion, in multiattribute problems such correspondence is not one-to-one. We show that Tsetlin and Winkler's multivariate risk attitudes imply ultramodularity of the utility function. We demonstrate that the geometric properties of a multivariate utility function can be successfully studied by utilizing an integral function expansion (functional ANOVA). The necessary and sufficient conditions under which monotonicity and/or ultramodularity of single-attribute functions imply the monotonicity and/or ultramodularity of the corresponding multiattribute function under additive, preferential and mutual utility independence are then established without reliance on the differentiability of the utility function. We also investigate the relationship between the presence of interactions among the attributes of a multiattribute utility function and the decision-maker's multivariate risk attitudes.
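For reference, a function $u$ is ultramodular when increments are worth more from higher base points (the standard definition, not quoted from the paper):

$$ u(x + h + k) - u(x + k) \geq u(x + h) - u(x) \qquad \text{for all } h, k \geq 0 $$

For twice-differentiable functions this amounts to all second-order partial derivatives being nonnegative, but, as the abstract stresses, the definition itself requires no differentiability.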

European Journal of Operational Research, 210, 326-335.

Last change 25/04/2012

The Reliability Importance of Components and Prime Implicants in Coherent and Non-Coherent Systems Including Total-Order Interactions (01/09/2010)


E. Borgonovo
European Journal of Operational Research
In the management of complex systems, knowledge of how components contribute to system performance is essential to the correct allocation of resources. Recent works have renewed interest in the properties of the joint ($J$) and differential ($D$) reliability importance measures. However, a common background for these importance measures has not yet been developed. In this work, we build a unified framework for the utilization of $J$ and $D$ in both coherent and non-coherent systems. We show that the reliability function of any system is multilinear and that its Taylor expansion is exact at order $T$. We then introduce a total order importance measure ($D^{T}$) that coincides with the exact portion of the change in system reliability associated with any (finite or infinitesimal) change in component reliabilities. We show that $D^{T}$ synthesizes the Birnbaum, joint and differential importance of all orders in one unique indicator. We propose an algorithm that enables the numerical estimation of $D^{T}$ by varying one probability at a time, making it suitable for the analysis of complex systems. Findings demonstrate that the simultaneous utilization of $D^{T}$ and $J$ provides reliability analysts with a complete dissection of system performance.
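One plausible way to write the total order measure, hedged as our reading of the abstract rather than the paper's own formula: collecting the terms of the exact Taylor (finite-change) expansion that involve component $i$,

$$ D_i^{T} = \frac{\Delta_i R + \sum_{j \neq i} \Delta_{ij} R + \cdots}{\Delta R} $$

i.e. the fraction of the exact change in system reliability attributable to component $i$ individually and through all its interactions.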

European Journal of Operational Research, 204, pp. 485–495.

Last change 25/04/2012

Finite Change Comparative Statics for Risk Coherent Inventories (01/01/2011)


Borgonovo E. and Peccati L.
International Journal of Production Economics, 2011, 131(1), 52-62
This work introduces a comprehensive approach to the sensitivity analysis (SA) of risk-coherent inventory models. We address the issues posed by i) the piecewise-defined nature of risk-coherent objective functions and ii) the need for multiple model evaluations. The solution to these issues is found by introducing the extended finite change sensitivity indices (FCSIs). We obtain properties and invariance conditions for the sensitivity of risk-coherent optimization problems. An inventory management case study involving risk-neutral and conditional value-at-risk (CVaR) objective functions illustrates our methodology. Three SA settings are formulated to obtain managerial insights. Numerical findings show that risk-neutral decision-makers are more exposed to variations in exogenous variables than CVaR decision-makers.
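For reference, the CVaR objective mentioned above is commonly written in the Rockafellar-Uryasev form (standard in the literature, not taken from this paper):

$$ \mathrm{CVaR}_{\alpha}(L) = \min_{t} \left\{ t + \frac{1}{1 - \alpha} E\left[ (L - t)^{+} \right] \right\} $$

where $L$ is the loss and $\alpha$ the confidence level. The $(\cdot)^{+}$ term is one source of the piecewise-defined structure that the extended FCSIs are designed to handle.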

Last change 25/04/2012

Moment Independent Importance Measures: New Results and Analytical Test Cases (05/06/2010)


Borgonovo E., Castaings W. and Tarantola S.
Risk Analysis
Moment-independent methods for the sensitivity analysis of model output are attracting growing attention among both academicians and practitioners. However, the lack of benchmarks against which to compare numerical strategies forces one to rely on ad-hoc experiments in estimating the sensitivity measures. This paper introduces a methodology that allows one to obtain moment-independent sensitivity measures analytically. We illustrate the procedure by implementing four test cases with different model structures and model input distributions. Numerical experiments are performed at increasing sample size to check convergence of the sensitivity estimates to the analytical values.

Risk Analysis, 2010, 31 (3), pp. 404-428.

Last change 25/04/2012

A Methodology for Determining Interactions in PSA models by Varying One Parameter At-a-Time (01/06/2010)


E. Borgonovo
Risk Analysis
In risk analysis problems, the decision-making process is supported by the utilization of quantitative models. Assessing the relevance of interactions is essential in the interpretation of model results. With such knowledge, analysts and decision-makers are able to understand whether risk is apportioned by individual factor contributions or by their joint action. However, models are oftentimes large, requiring a high number of input parameters, and complex, with individual model runs being time consuming. Computational complexity leads analysts to utilize one-parameter-at-a-time sensitivity methods, which prevent one from assessing interactions. In this work, we illustrate a methodology to quantify interactions in probabilistic safety assessment (PSA) models by varying one parameter at a time. The method is based on a property of the functional ANOVA decomposition of a finite change that allows one to determine exactly the relevance of factors when considered individually or together with their interactions with all other factors. A set of test cases illustrates the technique. We apply the methodology to the analysis of the Core Damage Frequency of the Large Loss of Coolant Accident of a nuclear reactor. Numerical results reveal the non-additive model structure, allow one to quantify the relevance of interactions, and identify the direction of change (increase or decrease in risk) implied by individual factor variations and by their cooperation.
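A minimal sketch of the one-parameter-at-a-time property, hedged as our reading of the abstract: once the model has been evaluated at the base point $x^0$ and at the changed point $x^1$, the total effect of parameter $i$ (its individual contribution plus all its interactions with the other parameters) follows from resetting $x_i$ alone,

$$ \varphi_i^{T} = f(x^1) - f(x_i^0, x_{-i}^1) $$

so that all $n$ total effects are obtained with $n + 2$ model runs in total.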

Risk Analysis, 2010, 30 (3), pp. 385

Last change 25/04/2012

What Drives Value Creation? An Application of Sensitivity Analysis to Project Finance Transactions (05/05/2010)


E. Borgonovo, S. Gatti and L. Peccati
European Journal of Operational Research
Evaluating the economic attractiveness of large projects often requires the development of large and complex financial models. Model complexity can prevent management from obtaining crucial information, with the risk of a suboptimal exploitation of the modelling efforts. We propose a methodology based on the so-called "differential importance measure ($D$)" to enhance the managerial insights obtained from financial models. We illustrate our methodology by applying it to a project finance case study. We show that the additivity property of $D$ grants analysts and managers full flexibility in combining parameters into any group and at the desired aggregation level. We analyze investment criteria related to both the investors' and lenders' perspectives. Results indicate that exogenous factors affect investors (sponsors and lenders) in different ways, whether exogenous variables are considered individually or in groups.
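The additivity property invoked here can be stated compactly ($D$ is defined in the 2001 paper listed below; the notation is standard):

$$ D_S = \sum_{i \in S} D_i, \qquad \sum_{i=1}^{n} D_i = 1 $$

so the importance of any group of parameters is the sum of the individual importances, at any aggregation level, with no recomputation.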

European Journal of Operational Research, 205 (1), pp. 227-236


Last change 25/04/2012

Moment Calculations for Piecewise-Defined Functions: An Application to Stochastic Optimization with Coherent Risk Measures (05/08/2010)


Borgonovo E. and Peccati L.
Annals of Operations Research
This work introduces a new analytical approach to the formulation of optimization problems with piecewise-defined (PD) objective functions. First, we introduce a new definition of multivariate PD functions and derive formal results for their continuity and differentiability. Then, we obtain closed-form expressions for the calculation of their moments. We apply these findings to three classes of optimization problems involving coherent risk measures. The method enables one to obtain insights on problem structure and on sensitivity to imprecision at the problem formulation stage, eliminating reliance on ad-hoc post-optimality numerical calculations.
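A one-line sketch of the moment calculation, in generic notation (our illustration: pieces $g_k$ defined on a partition $\{R_k\}$ of the support):

$$ E\left[ g(X)^m \right] = \sum_k \int_{R_k} g_k(x)^m f_X(x) \, dx $$

Once each piece and its region are known explicitly, each integral can be evaluated separately, which is the structure behind the closed-form expressions.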


Last change 25/04/2012

Managerial Insights from Service Industry Models: a new scenario decomposition method (03/06/2011)


Borgonovo E. and Peccati L.
Annals of Operations Research
The service industry literature has recently witnessed the development of several new decision-support models. The new models have often been corroborated via scenario analysis. We introduce a new approach to obtaining managerial insights in scenario analysis. The method is based on the decomposition of model results across sub-scenarios generated according to high dimensional model representation theory. The new method allows analysts to quantify the effects of factors and their synergies, and to identify the key drivers of scenario results. The method is applied to the scenario analysis of the workforce allocation model of Corominas et al. (2004).
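The high dimensional model representation referred to above decomposes a model into terms of increasing dimensionality (standard form):

$$ f(x) = f_0 + \sum_i f_i(x_i) + \sum_{i < j} f_{ij}(x_i, x_j) + \cdots $$

The sub-scenarios of the proposed method correspond to these terms: overall level, main effects of individual factors, pairwise synergies, and so on.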

Annals of Operations Research, 2011, 185(1), pp. 161-179.

Last change 25/04/2012

Cancer cell reprogramming: stem cell differentiation stage factors and an agent based model to optimize cancer treatment (01/01/2011)


Biava P.M., Basevi M., Biggiero L., Borgonovo A., Borgonovo E. and Burigana F.
Current Pharmaceutical Biotechnology, 2011, 12(2), pp. 231-242
Recent tumor research has led scientists to recognize the central role played by cancer stem cells in sustaining malignancy and chemoresistance. A model of cancer presented by [44] describes the mechanisms that give rise to the different kinds of cancer stem-like cells and the role of these cells in cancer diseases. The model implies a shift in the conceptualization of the disease from reductionism to complexity theory. By exploiting the link between the agent-based simulation technique and the theory of complexity, the medical view is here translated into a corresponding computational model. Two main categories of agents characterize the model: 1) cancer stem cells and 2) differentiation factors. Cancer cell agents are then distinguished based on the differentiation stage associated with their malignancy. Differentiation factors interact with cancer cells and can, with varying degrees of fitness, induce differentiation or cause apoptosis. The model inputs are then fitted to experimental data and numerical simulations are carried out. By performing virtual experiments on the model's choice variables, a decision-maker (physician) can obtain insights on the progression of the disease and on the effects of a choice of administration frequency and/or dose. The model also paves the way to future research, whose perspectives are discussed.


Last change 25/04/2012

Sensitivity Analysis of Model Output with Input Constraints: A Generalized Rationale for Local Methods (01/06/2008)


Borgonovo E.
Risk Analysis, 28 (3) (June 2008), pp. 667-680
In this work, we introduce a generalized rationale for local sensitivity analysis methods that allows one to solve the problems connected with input constraints. Several models in use in the risk analysis field are characterized by the presence of deterministic relationships among the input parameters. However, sensitivity analysis issues related to the presence of constraints have mainly been dealt with in a heuristic fashion. We start with a systematic analysis of the effects of constraints. The findings can be summarized in the following three effects. (i) Constraints make it impossible to vary one parameter while keeping all others fixed. (ii) The model output becomes insensitive to a parameter if a constraint is solved for that parameter. (iii) Sensitivity analysis results depend on which parameter is selected as dependent. The explanation of these effects is found by proposing a result that leads to a natural extension of the local sensitivity analysis rationale introduced in Helton (1993). We then extend the definitions of the Birnbaum, Criticality and Differential importance measures to the constrained case. In addition, we introduce a procedure that allows one to obtain constrained sensitivity results at the same cost as in the absence of constraints. The application to a non-binary event tree concludes the paper, providing a numerical illustration of the above findings.
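A minimal illustration of effects (ii) and (iii), under the assumption that a single constraint is solved for $x_n = h(x_1, \dots, x_{n-1})$ (our notation, not the paper's): the sensitivity of $y = f(x)$ to $x_i$ becomes

$$ \frac{dy}{dx_i} = \frac{\partial f}{\partial x_i} + \frac{\partial f}{\partial x_n} \frac{\partial h}{\partial x_i} $$

so the output is trivially insensitive to the dependent parameter $x_n$ itself, and the results change when a different parameter is selected as the dependent one.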


Last change 25/04/2012

A note on the sensitivity analysis of the internal rate of return (forthcoming)


M. Percoco and E. Borgonovo
International Journal of Production Economics, forthcoming
In this note we discuss the local sensitivity analysis of the internal rate of return (IRR). We show that the use of partial derivatives can be misleading in the identification of the key drivers of an investment project's performance. To remedy this shortcoming, we propose the use of an alternative sensitivity measure called the Differential Importance Measure. The analysis shows that, even if the theoretical conditions for using the Net Present Value or the IRR as valuation criteria apply, the sensitivity analysis results for the two indicators may differ.
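For intuition, since the IRR is defined implicitly by $\mathrm{NPV}(a, r) = 0$, its local sensitivity to a cash flow $a_t$ follows from the implicit function theorem (a standard result, not quoted from the note):

$$ \frac{\partial \, \mathrm{IRR}}{\partial a_t} = - \left. \frac{\partial \mathrm{NPV} / \partial a_t}{\partial \mathrm{NPV} / \partial r} \right|_{r = \mathrm{IRR}} $$

The numerator is the discount factor evaluated at $r = \mathrm{IRR}$ rather than at the cost of capital used for the NPV, which is one reason the sensitivity results for the two criteria can differ.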


Last change 25/04/2012

Epistemic Uncertainty in the Ranking and Categorization of Probabilistic Safety Assessment Model Elements: Issues and Findings (06/06/2008)


Borgonovo E.
Risk Analysis, 2008, 28 (4), pp. 983 - 1001.
In this work, we study the effect of epistemic uncertainty on the ranking and categorization of elements of probabilistic safety assessment (PSA) models. We show that, while in a deterministic setting a PSA element belongs to a given category univocally, in the presence of epistemic uncertainty a PSA element belongs to a given category only with a certain probability. We propose an approach to estimate these probabilities, showing that their knowledge allows one to appreciate "the sensitivity of component categorizations to uncertainties in the parameter values" [US NRC Regulatory Guide 1.174]. We investigate the meaning and utilization of an assignment method based on the expected value of importance measures. We discuss the problem of evaluating changes in quality assurance, maintenance activities prioritization, etc., in the presence of epistemic uncertainty. We show that the inclusion of epistemic uncertainty in the evaluation makes it necessary to evaluate changes through their effect on PSA model parameters. We propose a categorization of parameters based on the Fussell-Vesely and Differential Importance (DIM) measures. In addition, issues in the calculation of the expected value of the joint importance measure arise when evaluating changes affecting groups of components. We illustrate that the problem can be solved using DIM. A numerical application to a case study concludes the work.
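The first finding can be stated compactly, in our notation rather than the paper's: with epistemic density $\pi(\theta)$ over the model parameters, the probability that element $e$ falls in importance category $C_k$ is

$$ p_{e,k} = \int \mathbf{1}\{ I_e(\theta) \in C_k \} \, \pi(\theta) \, d\theta $$

where $I_e$ is the chosen importance measure (e.g., Fussell-Vesely or DIM) evaluated at parameter value $\theta$.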


Last change 25/04/2012

Financial Management in Inventory Problems: Risk Averse vs Risk Neutral Policies (01/01/2009)


Borgonovo E. and L. Peccati
International Journal of Production Economics, 2009, 118 (1), pp. 233-242.
In this work, we discuss the effect of risk measure selection in the determination of inventory policies. We consider an inventory system characterized by the loss function of Luciano et al. (2003). We derive the optimization problems faced by risk-neutral, quadratic utility, mean-absolute and CVaR decision makers. Results show that, while the global nature of the optimal policy is assured for risk-coherent and risk-neutral decision makers, the convexity of the quadratic utility problem depends on the stochastic properties of demand. We investigate the economic and stochastic determinants of the different policies. This allows us to establish the conditions under which each type of decision maker is indifferent to imprecision in the distribution families. Finally, we discuss the numerical impact of the choice of the risk measure by means of a multi-item inventory. The introduction of an approach based on Savage Scores allows us to offer a quantitative measurement of the similarity/discrepancy of policies reflecting different risk attitudes.


Last change 25/04/2012

Differential Importance and Comparative Statics: An Application to Inventory Management (01/01/2008)


Borgonovo E.
International Journal of Production Economics, 111 (1) (January 2008), pp. 170-179
In this work, we develop a framework for establishing which of the input parameters influences an inventory management policy the most. To do so, a new sensitivity measure ($\Gamma$) is introduced by relating the differential importance ($D$) and comparative statics (CS) techniques. We discuss the properties of the new indicator, and show that it shares the additivity property. We provide the expression of $\Gamma$ for inventory management models in the form of both unconstrained and constrained optimization. Numerical results are offered for the sensitivity analysis of the Luciano and Peccati (1999) inventory management model.

Last change 25/04/2012

Decision Making During Nuclear Power Plant Incidents: A New Approach to the Evaluation of Precursor Events (01/01/2007)


C. Smith and Borgonovo E.
Risk Analysis, 2007, 27 (4), pp.1027-1042.
Renewed interest in precursor analysis has shown that the evaluation of near misses is an interdisciplinary effort, fundamental within the life of an organization for reducing operational risks and enabling accident prevention. The practice of precursor analysis has been a part of nuclear power plant regulation in the US for over twenty-five years. During this time, the models utilized in the analysis have evolved from simple risk equations to quite complex probabilistic risk assessments. However, one item that has remained constant over this time is that the focus of the analysis has been on modeling the scenario using the risk model (regardless of the model's sophistication) and then utilizing the results of the model to determine the severity of the precursor incident. We believe that evaluating precursors in this fashion could be a shortcoming, since decision making during the incident is not formally investigated. Consequently, we present the idea for an evaluation procedure that enables one to integrate current practice with the evaluation of decisions made during the precursor event. The methodology borrows from technologies in both the risk analysis and the decision analysis realms. We demonstrate this new methodology via an evaluation of a US precursor incident. Specifically, the course of the incident is represented by the integration of a probabilistic risk assessment model (i.e., the risk analysis tool) with an influence diagram and the corresponding decision tree (i.e., the decision analysis tools). The results and insights from the application of this new methodology are discussed.


Last change 25/04/2012

A New Uncertainty Importance Measure (06/06/2007)


Borgonovo E., 2007
Reliability Engineering and System Safety, 2007, 92, pp. 771-784
Uncertainty in parameters is present in many risk assessment problems and leads to uncertainty in model predictions. In this work, we introduce a global sensitivity indicator which looks at the influence of input uncertainty on the entire output distribution without reference to a specific moment of the output (moment independence) and which can be defined also in the presence of correlations among the parameters. We discuss its mathematical properties and highlight the differences between the present indicator, variance-based uncertainty importance measures and a moment independent sensitivity indicator previously introduced in the literature. Numerical results are discussed with application to the probabilistic risk assessment model on which Iman (1987) first introduced uncertainty importance measures.
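Two properties of this indicator worth recalling, in the notation of the Castaings et al. entry above (as generally established in this line of work):

$$ 0 \leq \delta_i \leq 1, \qquad \delta_i = 0 \iff Y \text{ is independent of } X_i $$

Unlike variance-based indices, the definition does not require independent inputs, which is the basis of the claim that it can be defined in the presence of correlations.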


Last change 25/04/2012

Measuring Uncertainty Importance: Investigation and Comparison of Alternative Approaches (06/06/2006)


Borgonovo E.
Risk Analysis, 2006, 26 (5), pp. 1349-1362
Uncertainty importance measures are quantitative tools aiming at identifying the contribution of uncertain inputs to output uncertainty. Their application ranges from food safety [Frey and Patil (2002)] to hurricane losses [Iman et al. (2005)]. The results and indications an analyst derives depend on the method selected for the study. In this work, we investigate the assumptions at the basis of various indicator families to discuss the information they convey to the analyst/decision maker. We start with nonparametric techniques, and then present variance-based methods. By means of an example, we show that output variance does not always reflect a decision maker's state of knowledge of the inputs. We then examine the use of moment-independent approaches to global sensitivity analysis, i.e. techniques that look at the entire output distribution without a specific reference to its moments. Numerical results demonstrate that both moment-independent and variance-based indicators agree in identifying non-influential parameters. However, differences in the ranking of the most relevant factors show that inputs that influence variance the most are not necessarily the ones that influence the output uncertainty distribution the most.
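For reference, the variance-based indicator around which the comparison revolves is the first-order Sobol' index (standard definition):

$$ S_i = \frac{V\left[ E(Y \mid X_i) \right]}{V(Y)} $$

the expected fraction of output variance removed by learning $X_i$. The paper's point is that variance, and hence $S_i$, is only one summary of the output distribution.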


Last change 25/04/2012

Global Sensitivity Analysis in Inventory Management (07/07/2007)


Borgonovo E. and Peccati L.
International Journal of Production Economics, 2007, 108 (1-2), pp. 302-313.
This paper deals with the sensitivity analysis (SA) of inventory management models when uncertainty in the input parameters is given full consideration. We make use of the Sobol' variance decomposition method for determining the parameters most influential on the model output. We first illustrate the method by means of an analytical example. We provide the expression of the global importance of demand, holding costs and order costs in the Harris EOQ formula. We then present the global SA of the inventory management model developed by Luciano and Peccati (1999) for the economic order quantity estimation in the context of the temporary sale problem. We show that by performing global SA in parallel to the modeling process an analyst derives insights not only on the EOQ structure when its expression is not analytically known, but also on the relevance of modeling choices, such as the inclusion of financing policies and special orders.
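The Harris EOQ formula mentioned above is the classical

$$ Q^{*} = \sqrt{\frac{2 K D}{h}} $$

with $D$ the demand rate, $K$ the fixed ordering cost and $h$ the unit holding cost; the global importance of each factor then follows from propagating its uncertainty through $Q^{*}$.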


Last change 25/04/2012

Differential, Criticality and Birnbaum Importance Measures: an Application to Basic Event, Groups and SSCs in Event Trees and Binary Decision Diagrams (10/10/2007)


Borgonovo E.
Reliability Engineering & System Safety, 2007, 92,10, pp. 1458-1467
Recent works [Epstein and Rauzy (2005)] have questioned the validity of the traditional fault tree/event tree (FTET) representation of probabilistic risk assessment problems. Regardless of whether the risk model is solved through FTET or binary decision diagrams (BDDs), importance measures need to be calculated to provide risk managers with information on the risk/safety significance of system structures and components (SSCs). In this work, we discuss the computation of the Fussell-Vesely (FV), Criticality, Birnbaum, Risk Achievement Worth (RAW) and Differential Importance Measure (DIM) for individual basic events, basic event groups and components. For individual basic events, we show that these importance measures are linked by simple relations and that this enables one to compute basic event DIMs for both FTET and BDD codes without additional model runs. We then investigate whether and how importance measures can be extended to basic event groups and components. Findings show that the estimation of a group Birnbaum or Criticality importance is not possible. On the other hand, we show that the DIM of a group or of a component is exactly equal to the sum of the DIMs of the corresponding basic events and can therefore be found with no additional model runs. The above findings hold for both the FTET and the BDD methods.


Last change 04/05/2012

Uncertainty and Global Sensitivity Analysis in the Evaluation of Investment Projects (06/06/2006)


Borgonovo E. and L. Peccati
International Journal of Production Economics, 2006, 104 (1), pp. 62-73
This paper discusses the use of global sensitivity analysis (SA) techniques in investment decisions. Global SA is a branch of statistics that complements and improves uncertainty analysis (UA) by providing the analyst/decision-maker with information on how uncertainty is apportioned by the uncertain factors. In this work, we introduce global SA in the investment project evaluation realm. We then need to deal with two aspects: 1) the identification of the appropriate global SA method to be used and 2) the interpretation of its results from an investment uncertainty point of view. For task 1), we compare the performance of two families of techniques: non-parametric and variance decomposition based. For task 2), we explore the determination of the cash flow global importance (GI) for valuation criteria utilized in investment project evaluation. For the Net Present Value (NPV), we show that it is possible to derive an analytical expression of the cash flow GI, which is the same for all the techniques. This knowledge enables us to: 1) offer a direct way to compute cash flow GI; 2) illustrate the practical impact of global SA on the information collection process. For the Internal Rate of Return (IRR), we show that the same conclusions cannot be drawn. In particular, a) one has to utilize a numerical approach for the computation of the cash flow influence, since an analytical expression cannot be found, and b) different techniques can produce different rankings. These observations are illustrated by means of the application to a discounted cash flow model utilized in the energy sector for the evaluation of projects under survival risk. The quantitative comparison of cash flow rankings with respect to the NPV and IRR concludes the paper, illustrating that information gained on the NPV through global SA cannot be transferred to the IRR.
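A hedged sketch of why the NPV admits an analytical expression: the NPV, $\mathrm{NPV} = \sum_t a_t (1 + r)^{-t}$, is linear in the cash flows, so for independent cash flows the first-order variance-based importance of $a_t$ reduces to

$$ S_t = \frac{(1 + r)^{-2t} \, V(a_t)}{V(\mathrm{NPV})} $$

This is our illustration of the mechanism, not necessarily the paper's exact expression; no such linearity holds for the IRR, whose cash-flow importance must therefore be computed numerically.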


Last change 25/04/2012

The Importance of Assumptions in Investment Evaluation (06/06/2006)


Borgonovo E. and L. Peccati
International Journal of Production Economics, 2006, 101 (2), pp. 298-311.
This work illustrates a new method for estimating the importance of assumptions in investment evaluation. The most widely used sensitivity analysis schemes present limitations when used to assess the importance of individual parameters and cannot be employed to estimate the importance of groups of assumptions. However, such problems can be solved by making use of the Differential Importance Measure (DIM). We set forth the framework for the application of DIM at the parameter level of investment valuation models. We study the relationship between DIM and investment marginal behavior. We analyze the link between the importance of a parameter and the risk associated with it. We discuss general results for a sample valuation model. The numerical application to the valuation of an energy sector investment project follows. We rank the factors based on their importance and determine the project risk profile. We discuss the importance of groups of assumptions. Results show that assumptions relating to revenues are the most influential ones, followed by discounting and operating cost assumptions. We discuss numerically the relationship between importance and risk, analyzing the effect of hedging variable costs by comparing the project risk profile in the presence and in the absence of such hedging.


Last change 25/04/2012

Sensitivity Analysis in Investment Project Evaluation (04/04/2004)


Borgonovo E. and L. Peccati
International Journal of Production Economics, 2004, 70, pp. 17-25
This paper discusses the sensitivity analysis of valuation equations used in investment decisions. Since financial decisions are commonly supported via a point value of some criterion of economic relevance (net present value, economic value added, internal rate of return, etc.), we focus on local sensitivity analysis. In particular, we present the differential importance measure (DIM) and discuss its relation to elasticity and other local sensitivity analysis techniques in the context of discounted cash flow valuation models. We present general results on the sensitivity of the net present value and internal rate of return to changes in the cash flows. Specific results are obtained for a valuation model of projects under severe survival risk used in the power generation industry.
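The relation to elasticity can be sketched as follows (generic notation, as generally shown in the DIM literature): for uniform relative changes in the parameters, DIM reduces to normalized elasticity,

$$ \mathrm{DIM}_i = \frac{E_i}{\sum_j E_j}, \qquad E_i = \frac{\partial f}{\partial x_i} \frac{x_i}{f(x)} $$

so the two coincide up to normalization under that change assumption, but not in general.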



Last change 25/04/2012

Comparison of Global Sensitivity Analysis Techniques and Importance Measures in PSA (03/03/2003)


Borgonovo E., G.E. Apostolakis, S. Tarantola and A. Saltelli
Reliability Engineering and System Safety, 2003, 79, pp. 175-185.
This paper discusses application and results of global sensitivity analysis techniques to probabilistic safety assessment (PSA) models, and their comparison to importance measures. This comparison allows one to understand whether PSA elements that are important to the risk, as revealed by importance measures, are also important contributors to the model uncertainty, as revealed by global sensitivity analysis. We show that, due to epistemic dependence, uncertainty and global sensitivity analysis of PSA models must be performed at the parameter level. A difficulty arises, since standard codes produce the calculations at the basic event level. We discuss both the indirect comparison through importance measures computed for basic events, and the direct comparison performed using the differential importance measure and the Fussell–Vesely importance at the parameter level. Results are discussed for the large LLOCA sequence of the advanced test reactor PSA.


Last change 25/04/2012

A new importance measure for risk-informed decision making (01/01/2001)


Borgonovo E. and G.E. Apostolakis
Reliability Engineering & System Safety, 2001, 72 (2), pp. 193-212
 In this paper, we introduce a new importance measure, the differential importance measure (DIM), for probabilistic safety assessment (PSA). DIM responds to the need of the analyst/decision maker to get information about the importance of proposed changes that affect component properties and multiple basic events. DIM is directly applicable to both the basic events and the parameters of the PSA model. Unlike the Fussell–Vesely (FV), risk achievement worth (RAW), Birnbaum, and criticality importance measures, DIM is additive, i.e. the DIM of groups of basic events or parameters is the sum of the individual DIMs. We discuss the difference between DIM and other local sensitivity measures that are based on normalized partial derivatives. An example is used to demonstrate the evaluation of DIM at both the basic event and the parameter level. To compare the results obtained with DIM at the parameter level, an extension of the definitions of FV and RAW is necessary. We discuss possible extensions and compare the results of the three measures for a more realistic example.
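The definition, in the notation that has since become standard for this measure: for a change $dx = (dx_1, \dots, dx_n)$ in the inputs of the risk metric $R$,

$$ \mathrm{DIM}_i = \frac{\dfrac{\partial R}{\partial x_i} \, dx_i}{\sum_j \dfrac{\partial R}{\partial x_j} \, dx_j} $$

the fraction of the total first-order change in $R$ attributable to $x_i$. Additivity over groups follows immediately from the common denominator.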


Last change 25/04/2012

A Monte Carlo methodological approach to plant availability modeling with maintenance, aging and obsolescence (01/01/2000)


Borgonovo E., M. Marseguerra and E. Zio
Reliability Engineering & System Safety, 2000, 67 (1), pp. 61-73
In this paper we present a Monte Carlo approach for the evaluation of plant maintenance strategies and operating procedures under economic constraints. The proposed Monte Carlo simulation model provides a flexible tool which enables one to describe many of the relevant aspects for plant management and operation such as aging, repair, obsolescence, renovation, which are not easily captured by analytical models. The maintenance periods are varied with the age of the components. Aging is described by means of a modified Brown–Proschan model of imperfect (deteriorating) repair which accounts for the increased proneness to failure of a component after it has been repaired. A model of obsolescence is introduced to evaluate the convenience of substituting a failed component with a new, improved one. The economic constraint is formalized in terms of an energy, or cost, function; optimization studies are then performed using the maintenance period as the control parameter.
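A minimal, self-contained sketch of the Brown-Proschan repair idea used in the simulation (illustrative only: the Weibull failure law, the parameter values and the repair-effectiveness probability are our assumptions, not the paper's; repairs are taken as instantaneous):

import math
import random

# Brown-Proschan imperfect repair: with probability P_PERFECT a repair is
# perfect (virtual age reset to zero, "as good as new"); otherwise it is
# minimal (virtual age unchanged, "as bad as old").
P_PERFECT = 0.3           # repair effectiveness (assumed)
BETA, ETA = 2.0, 1000.0   # Weibull shape/scale of the failure law (assumed, hours)
HORIZON = 10_000.0        # simulated operating horizon (hours)
N_HISTORIES = 10_000      # number of Monte Carlo histories

def time_to_failure(age: float) -> float:
    """Sample the residual life given the virtual age, by inverting the
    conditional Weibull survival function S(t) = exp(-(t/eta)**beta)."""
    u = 1.0 - random.random()  # uniform in (0, 1], avoids log(0)
    return ETA * ((age / ETA) ** BETA - math.log(u)) ** (1.0 / BETA) - age

def simulate_history() -> int:
    """Count the failures of one component over the horizon."""
    clock, age, failures = 0.0, 0.0, 0
    while True:
        ttf = time_to_failure(age)
        clock += ttf
        if clock > HORIZON:
            return failures
        failures += 1
        age = 0.0 if random.random() < P_PERFECT else age + ttf

mean_failures = sum(simulate_history() for _ in range(N_HISTORIES)) / N_HISTORIES
print(f"Mean number of failures over the horizon: {mean_failures:.2f}")

Wrapping such a history generator in a cost function of the maintenance period is then what allows optimization studies of the kind described in the abstract.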



Last change 25/04/2012
