Similar Literature (20 results)
1.
This article presents a new importance analysis framework, called the parametric moment ratio function, for measuring the reduction of model output uncertainty when the distribution parameters of the inputs are changed; the emphasis is placed on the mean and variance ratio functions with respect to the variances of the model inputs. The proposed concepts efficiently guide the analyst toward a targeted reduction of the model output mean and variance by operating on the variances of the model inputs. Unbiased and asymptotically unbiased Monte Carlo estimators are derived for the parametric mean and variance ratio functions, respectively. Only a single set of samples is needed to implement the proposed importance analysis with these estimators, so the computational cost is independent of the input dimensionality. An analytical test example with highly nonlinear behavior is introduced to illustrate the engineering significance of the proposed importance analysis technique and to verify the efficiency and convergence of the derived Monte Carlo estimators. Finally, the moment ratio function is applied to a planar 10-bar structure to achieve a targeted 50% reduction of the model output variance.
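A minimal brute-force sketch of the parametric variance ratio idea, assuming a hypothetical three-input nonlinear model: the output variance is re-estimated after the variance of one input is scaled by a factor lambda and divided by the nominal output variance. Unlike the single-sample-set estimators derived in the article, this version simply re-draws samples for each lambda, which is simpler but costlier.

```python
import numpy as np

def g(x):
    """Hypothetical nonlinear test model with three inputs."""
    return x[:, 0] ** 2 + np.sin(x[:, 1]) * x[:, 2]

def variance_ratio(i, lam, n=100_000, sigma=(1.0, 1.0, 1.0), seed=0):
    """V(lam) / V: output variance after scaling Var(X_i) by lam,
    relative to the nominal output variance (brute-force version)."""
    rng = np.random.default_rng(seed)
    s = np.array(sigma, dtype=float)
    base = rng.normal(0.0, s, size=(n, 3))
    v0 = g(base).var()
    scaled = base.copy()
    scaled[:, i] *= np.sqrt(lam)        # zero-mean input: scales Var(X_i) by lam
    return g(scaled).var() / v0

for lam in (0.25, 0.5, 1.0):
    print(lam, [round(variance_ratio(i, lam), 3) for i in range(3)])
```

Reading off which input drives the ratio furthest below one for lambda < 1 identifies where a targeted variance reduction is cheapest, which is the decision the ratio function is meant to support.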

2.
Jan F. Van Impe, Risk Analysis, 2011, 31(8): 1295-1307
The aim of quantitative microbiological risk assessment is to estimate the risk of illness caused by the presence of a pathogen in a food type, and to study the impact of interventions. Because of inherent variability and uncertainty, risk assessments are generally conducted stochastically, and where possible it is advised to characterize variability separately from uncertainty. Sensitivity analysis makes it possible to identify the input variables to which the outcome of a quantitative microbiological risk assessment is most sensitive. Although a number of methods exist for applying sensitivity analysis to a risk assessment with probabilistic input variables (such as contamination, storage temperature, storage duration, etc.), it is challenging to perform sensitivity analysis when a risk assessment includes a separate characterization of the variability and uncertainty of input variables. A procedure is proposed that focuses on the relation between risk estimates obtained by Monte Carlo simulation and the location of pseudo-randomly sampled input variables within the uncertainty and variability distributions. Within this procedure, two methods are used, namely an ANOVA-like model and Sobol sensitivity indices, to obtain and compare the impact of the variability and of the uncertainty of all input variables, and of model uncertainty and scenario uncertainty. As a case study, this methodology is applied to a risk assessment estimating the risk of contracting listeriosis from consumption of deli meats.
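As a sketch of the Sobol part of the procedure, the pick-freeze estimator below computes first-order sensitivity indices for an illustrative model with independent uniform inputs; the model and sample size are assumptions, not taken from the listeriosis case study.

```python
import numpy as np

def model(x):
    """Illustrative stand-in for a risk model with three inputs."""
    return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 2]

def first_order_sobol(model, d, n=200_000, seed=1):
    rng = np.random.default_rng(seed)
    a = rng.uniform(size=(n, d))
    b = rng.uniform(size=(n, d))
    ya = model(a)
    var_y = ya.var()
    s = np.empty(d)
    for i in range(d):
        ab = b.copy()
        ab[:, i] = a[:, i]          # "freeze" column i from sample A
        # E[f(A) f(AB_i)] - E[f(A) f(B)] -> Var(E[Y | X_i])
        s[i] = np.mean(ya * (model(ab) - model(b))) / var_y
    return s

print(first_order_sobol(model, 3))
```

In the article's setting, the same estimator is applied separately over the variability and the uncertainty dimensions of each input, so that their contributions can be compared.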

3.
In this note I reply to the comments by Haimes et al. on my paper on the sensitivity analysis of the inoperability input-output model. I make the case for a moment-independent sensitivity analysis.

4.
A wide range of uncertainties is inevitably introduced during the safety assessment of engineering systems. The impact of all these uncertainties must be addressed if the analysis is to serve as a tool in the decision-making process. Uncertainties present in the components of the model (input parameters or basic events) are propagated to quantify their impact on the final results. Several methods are available in the literature, namely the method of moments, discrete probability analysis, Monte Carlo simulation, fuzzy arithmetic, and Dempster-Shafer theory. These methods differ both in how they characterize uncertainty at the component level and in how they propagate it to the system level, and each has desirable and undesirable features that make it more or less useful in different situations. In the most widely used probabilistic framework, probability distributions are used to characterize uncertainty. However, in situations in which one cannot specify (1) parameter values for input distributions, (2) precise probability distributions (shapes), or (3) dependencies between input parameters, these methods have limitations and are found to be ineffective. To address some of these limitations, the article presents uncertainty analysis in the context of level-1 probabilistic safety assessment (PSA) based on a probability bounds (PB) approach. PB analysis combines probability theory and interval arithmetic to produce probability boxes (p-boxes), structures that allow comprehensive and rigorous propagation through calculations. A practical case study is also carried out with a code developed on the PB approach, and the results are compared with two-phase Monte Carlo simulation results.
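A minimal sketch of the p-box idea, assuming the uncertain quantity is bounded by two quantile functions and that the system function is monotone increasing, so the envelopes map through directly; the lognormal bounds and the three-train OR gate below are illustrative, not from the case study.

```python
from scipy.stats import lognorm

class PBox:
    """A p-box represented by its two bounding quantile functions."""
    def __init__(self, q_lo, q_hi):
        self.q_lo, self.q_hi = q_lo, q_hi   # left/right envelope quantiles

    def propagate(self, f):
        """Push the box through a monotone increasing function f."""
        return PBox(lambda p: f(self.q_lo(p)), lambda p: f(self.q_hi(p)))

# Basic-event failure probability known only to lie between two lognormal bounds
box = PBox(lognorm(0.5, scale=1e-4).ppf, lognorm(0.5, scale=5e-4).ppf)

# System: at least one of three redundant trains fails (monotone in x)
out = box.propagate(lambda x: 1 - (1 - x) ** 3)
print([(p, out.q_lo(p), out.q_hi(p)) for p in (0.5, 0.95)])
```

The gap between the two output envelopes is the honest statement of what the data cannot pin down, which is exactly what a single "best estimate" distribution hides.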

5.
Marco Percoco, Risk Analysis, 2011, 31(7): 1038-1042
Natural and man‐made disasters are currently a source of major concern for contemporary societies. In order to understand their economic impacts, the inoperability input‐output model has recently gained recognition among scholars. In a recent paper, Percoco (2006) has proposed an extension of the model to map the technologically most important sectors through so‐called fields of influence. In the present note we aim to show that this importance measure also has a clear connection with local sensitivity analysis theory.
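The connection can be made concrete in a few lines: for the Leontief inverse L = (I - A)^-1, the local sensitivity dL/da_ij is the outer product of column i and row j of L, which is precisely the first-order field of influence. The 3-sector coefficient matrix below is illustrative.

```python
import numpy as np

A = np.array([[0.10, 0.20, 0.05],     # illustrative 3-sector technical coefficients
              [0.15, 0.05, 0.10],
              [0.05, 0.10, 0.20]])
L = np.linalg.inv(np.eye(3) - A)      # Leontief inverse

def field_of_influence(L, i, j):
    """dL / da_ij = (column i of L) outer (row j of L)."""
    return np.outer(L[:, i], L[j, :])

# Verify against a finite-difference local sensitivity
i, j, eps = 0, 1, 1e-7
A2 = A.copy(); A2[i, j] += eps
L2 = np.linalg.inv(np.eye(3) - A2)
print(np.allclose((L2 - L) / eps, field_of_influence(L, i, j), atol=1e-4))
```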

6.
Regional flood risk caused by intensive rainfall under extreme climate conditions has increasingly attracted global attention. Mapping and evaluation of flood hazard are vital parts of flood risk assessment. This study develops an integrated framework for estimating the spatial likelihood of flood hazard by coupling weighted naïve Bayes (WNB), a geographic information system, and remote sensing. The northern part of the Fitzroy River Basin in Queensland, Australia, was selected as a case study site. The environmental indices, including extreme rainfall, evapotranspiration, net-water index, soil water retention, elevation, slope, and drainage proximity and density, were generated from spatial data representing climate, soil, vegetation, hydrology, and topography. These indices were weighted using the statistics-based entropy method. The weighted indices were input into the WNB-based model to delineate a regional flood risk map indicating the likelihood of flood occurrence. The resulting map was validated against the maximum inundation extent extracted from moderate resolution imaging spectroradiometer (MODIS) imagery. The evaluation results, including the mapping and evaluation of the distribution of flood hazard, are helpful in guiding flood inundation disaster responses for the region. The novel approach presented consists of weighted grid data, image-based sampling and validation, cell-by-cell probability inference, and spatial mapping. It is superior to an existing spatial naive Bayes (NB) method for regional flood hazard assessment and can be extended to other likelihood-related environmental hazard studies.
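A sketch of the entropy weighting step, assuming the indices are positive grid-cell values: columns with more dispersion (lower entropy) carry more discriminating information and receive larger weights. The data are random stand-ins for the eight environmental indices.

```python
import numpy as np

def entropy_weights(X):
    """X: (n_cells, n_indices) matrix of non-negative environmental indices."""
    P = X / X.sum(axis=0)                       # column-wise proportions
    P = np.where(P > 0, P, 1e-12)               # guard log(0)
    e = -(P * np.log(P)).sum(axis=0) / np.log(len(X))   # normalized entropy
    d = 1.0 - e                                 # degree of diversification
    return d / d.sum()                          # weights summing to 1

rng = np.random.default_rng(42)
X = rng.uniform(0.1, 1.0, size=(1000, 8))       # 8 indices: rainfall, slope, ...
print(entropy_weights(X).round(3))
```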

7.
Moment independent methods for the sensitivity analysis of model output are attracting growing attention among both academics and practitioners. However, the lack of benchmarks against which to compare numerical strategies forces one to rely on ad hoc experiments in estimating the sensitivity measures. This article introduces a methodology that allows one to obtain moment independent sensitivity measures analytically. We illustrate the procedure by implementing four test cases with different model structures and model input distributions. Numerical experiments are performed at increasing sample size to check convergence of the sensitivity estimates to the analytical values.
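For readers who want to reproduce such convergence checks, the sketch below estimates the delta moment-independent measure, delta_i = (1/2) E[ integral |f_Y(y) - f_{Y|X_i}(y)| dy ], by conditioning on quantile bins of X_i and using kernel density estimates; the additive test model is an assumption for illustration, not one of the article's four cases.

```python
import numpy as np
from scipy.stats import gaussian_kde

def delta_measure(x, y, n_bins=20, grid=512):
    """Crude binned-KDE estimator of the delta importance of input x for output y."""
    ys = np.linspace(y.min(), y.max(), grid)
    dy = ys[1] - ys[0]
    f_y = gaussian_kde(y)(ys)                   # unconditional output density
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))
    shift = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        sel = (x >= lo) & (x <= hi)
        f_cond = gaussian_kde(y[sel])(ys)       # density conditional on the bin
        shift += sel.mean() * np.abs(f_y - f_cond).sum() * dy
    return 0.5 * shift

rng = np.random.default_rng(0)
x1, x2 = rng.normal(size=(2, 20_000))
y = x1 + 0.3 * x2                               # simple additive test model
print(delta_measure(x1, y), delta_measure(x2, y))
```

Repeating the estimate at increasing sample size against a known analytical value is exactly the kind of benchmark study the article enables.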

8.
Uncertainty appears to jump up after major shocks like the Cuban missile crisis, the assassination of JFK, the OPEC I oil-price shock, and the 9/11 terrorist attacks. This paper offers a structural framework to analyze the impact of these uncertainty shocks. I build a model with a time-varying second moment, which is numerically solved and estimated using firm-level data. The parameterized model is then used to simulate a macro uncertainty shock, which produces a rapid drop and rebound in aggregate output and employment. This occurs because higher uncertainty causes firms to temporarily pause their investment and hiring. Productivity growth also falls because this pause in activity freezes reallocation across units. In the medium term the increased volatility from the shock induces an overshoot in output, employment, and productivity. Thus, uncertainty shocks generate short sharp recessions and recoveries. This simulated impact of an uncertainty shock is compared to vector autoregression estimations on actual data, showing a good match in both magnitude and timing. The paper also jointly estimates labor and capital adjustment costs (both convex and nonconvex). Ignoring capital adjustment costs is shown to lead to substantial bias, while ignoring labor adjustment costs does not.

9.
Food‐borne infection is caused by intake of foods or beverages contaminated with microbial pathogens. Dose‐response modeling is used to estimate exposure levels of pathogens associated with specific risks of infection or illness. When a single dose‐response model is used and confidence limits on infectious doses are calculated, only data uncertainty is captured. We propose a method to estimate the lower confidence limit on an infectious dose by including model uncertainty and separating it from data uncertainty. The infectious dose is estimated by a weighted average of effective dose estimates from a set of dose‐response models via a Kullback information criterion. The confidence interval for the infectious dose is constructed by the delta method, where data uncertainty is addressed by a bootstrap method. To evaluate the actual coverage probabilities of the lower confidence limit, a Monte Carlo simulation study is conducted under sublinear, linear, and superlinear dose‐response shapes that can be commonly found in real data sets. Our model‐averaging method achieves coverage close to nominal in almost all cases, thus providing a useful and efficient tool for accurate calculation of lower confidence limits on infectious doses.
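A compressed sketch of the model-averaging step, with two candidate models (exponential and approximate beta-Poisson) fitted by maximum likelihood and Akaike weights standing in for the Kullback-criterion weights used in the article; the grouped dose-response data are hypothetical, and the bootstrap/delta-method confidence step is omitted.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical grouped data: dose, subjects, infected
dose = np.array([1e1, 1e2, 1e3, 1e4, 1e5])
n    = np.array([20, 20, 20, 20, 20])
inf  = np.array([1, 3, 8, 14, 19])

def nll(p_of_d, theta):
    """Binomial negative log-likelihood for a dose-response model."""
    p = np.clip(p_of_d(dose, theta), 1e-12, 1 - 1e-12)
    return -(inf * np.log(p) + (n - inf) * np.log(1 - p)).sum()

exp_p = lambda d, t: 1 - np.exp(-np.exp(t[0]) * d)                  # r = e^t0
bp_p  = lambda d, t: 1 - (1 + d / np.exp(t[1])) ** (-np.exp(t[0]))  # alpha, beta

fits = {}
for name, model, x0 in [("exp", exp_p, [-7.0]), ("bp", bp_p, [-1.0, 5.0])]:
    res = minimize(lambda t: nll(model, t), x0, method="Nelder-Mead")
    fits[name] = (res.x, 2 * len(x0) + 2 * res.fun)                 # (theta, AIC)

# ED10 per model, then information-criterion weights
ed10 = {
    "exp": -np.log(0.9) / np.exp(fits["exp"][0][0]),
    "bp":  np.exp(fits["bp"][0][1]) * (0.9 ** (-1 / np.exp(fits["bp"][0][0])) - 1),
}
aic = np.array([fits[k][1] for k in ed10])
w = np.exp(-(aic - aic.min()) / 2); w /= w.sum()
print(dict(zip(ed10, w.round(3))),
      "ED10_avg =", (w * np.array(list(ed10.values()))).sum())
```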

10.
Risk Analysis, 2018, 38(8): 1576-1584
Fault trees are used in reliability modeling to create logical models of fault combinations that can lead to undesirable events. The output of a fault tree analysis (the top event probability) is expressed in terms of the failure probabilities of basic events that are input to the model. Typically, the basic event probabilities are not known exactly, but are modeled as probability distributions; therefore, the top event probability is also represented as an uncertainty distribution. Monte Carlo methods are generally used for evaluating the uncertainty distribution, but such calculations are computationally intensive and do not readily reveal the dominant contributors to the uncertainty. In this article, a closed-form approximation for the fault tree top event uncertainty distribution is developed, which is applicable when the uncertainties in the basic events of the model are lognormally distributed. The results of the approximate method are compared with results from two sampling-based methods: namely, the Monte Carlo method and the Wilks method based on order statistics. It is shown that the closed-form expression can provide a reasonable approximation to results obtained by Monte Carlo sampling, without incurring the computational expense. The Wilks method is found to be a useful means of providing an upper bound for the percentiles of the uncertainty distribution while being computationally inexpensive compared with full Monte Carlo sampling. The lognormal approximation method and Wilks's method appear to be attractive, practical alternatives for the evaluation of uncertainty in the output of fault trees and similar multilinear models.
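A sketch of the three approaches on a toy fault tree with two minimal cut sets (P1*P2 + P3) and lognormal basic events: each product of lognormals is exactly lognormal, the rare-event sum is moment-matched to a lognormal, and the result is compared with Monte Carlo and with a Wilks 95/95 bound from 59 samples. All parameters are illustrative, and the moment-matching below is a generic stand-in for the article's closed form.

```python
import numpy as np

rng = np.random.default_rng(7)
mu  = np.log([1e-3, 5e-4, 2e-3])          # lognormal medians of basic events
sd  = np.array([0.8, 1.0, 0.6])           # lognormal sigmas

def top(p):                               # rare-event OR: P(top) ~ sum of cut sets
    return p[..., 0] * p[..., 1] + p[..., 2]

# Moment matching: mean/variance of each lognormal, then of the top event
m = np.exp(mu + sd**2 / 2)
v = (np.exp(sd**2) - 1) * m**2
mean_top = m[0] * m[1] + m[2]
var_top  = (v[0] + m[0]**2) * (v[1] + m[1]**2) - (m[0] * m[1])**2 + v[2]
s2 = np.log(1 + var_top / mean_top**2)    # fit a lognormal to (mean, var)
mu_t, sd_t = np.log(mean_top) - s2 / 2, np.sqrt(s2)
print("lognormal approx 95th pct:", np.exp(mu_t + 1.645 * sd_t))

samples = top(np.exp(rng.normal(mu, sd, size=(100_000, 3))))
print("Monte Carlo 95th pct:     ", np.quantile(samples, 0.95))

# Wilks: the max of 59 samples is a one-sided 95/95 bound on the 95th percentile
print("Wilks 95/95 upper bound:  ", top(np.exp(rng.normal(mu, sd, size=(59, 3)))).max())
```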

11.
Models for the assessment of the risk of complex engineering systems are affected by uncertainties due to the randomness of several phenomena involved and the incomplete knowledge about some of the characteristics of the system. The objective of this article is to provide operative guidelines to handle some conceptual and technical issues related to the treatment of uncertainty in risk assessment for engineering practice. In particular, the following issues are addressed: (1) quantitative modeling and representation of uncertainty coherently with the information available on the system of interest; (2) propagation of the uncertainty from the input(s) to the output(s) of the system model; (3) (Bayesian) updating as new information on the system becomes available; and (4) modeling and representation of dependences among the input variables and parameters of the system model. Different approaches and methods are recommended for efficiently tackling each of issues (1)-(4) above; the tools considered are derived from both classical probability theory and alternative, not fully probabilistic uncertainty representation frameworks (e.g., possibility theory). The recommendations drawn are supported by the results obtained in illustrative applications from the literature.

12.
Sensitivity analysis (SA) methods are a valuable tool for identifying critical control points (CCPs), one of the important steps in the hazard analysis and critical control point (HACCP) approach used to ensure safe food. There are many SA methods used across various disciplines. Furthermore, food safety process risk models pose challenges because they are often highly nonlinear, contain thresholds, and have discrete inputs. Therefore, it is useful to compare and evaluate SA methods through application to an example food safety risk model. Ten SA methods were applied to a draft Vibrio parahaemolyticus (Vp) risk assessment model developed by the Food and Drug Administration. The model was modified so that all inputs were independent. Rankings of key inputs from the different methods were compared. Inputs such as water temperature, number of oysters per meal, and the distributional assumption for the unrefrigerated time were the most important, whereas time on water, fraction of pathogenic Vp, and the distributional assumption for the weight of oysters were the least important. Most of the methods gave a similar ranking of key inputs even though they differed in being graphical, mathematical, or statistical, in accounting for individual effects or the joint effect of inputs, and in being model dependent or model independent. A key recommendation is that methods be further compared through application to different and more complex food safety models. Model-independent methods, such as ANOVA, the mutual information index, and scatter plots, are expected to be more robust than the others evaluated.
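As an illustration of one of the model-independent methods, the sketch below ranks inputs with a histogram-based mutual information index on a toy dose model that mimics the nonlinear and threshold behavior noted above; the inputs, model form, and coefficients are invented for the example, not taken from the Vp assessment.

```python
import numpy as np

def mutual_info(x, y, bins=30):
    """I(X; Y) estimated from a 2-D histogram (nats)."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return (pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum()

rng = np.random.default_rng(5)
temp    = rng.uniform(15, 30, 50_000)          # water temperature (degC)
hours   = rng.uniform(0, 10, 50_000)           # unrefrigerated time
oysters = rng.poisson(12, 50_000).astype(float)  # oysters per meal

# Toy model: growth only above a time threshold (nonlinear, discontinuous)
log_dose = 0.2 * temp + np.where(hours > 5, 0.5 * hours, 0.0) + 0.05 * oysters

for name, x in [("temp", temp), ("hours", hours), ("oysters", oysters)]:
    print(name, round(mutual_info(x, log_dose), 3))
```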

13.
Land subsidence risk assessment (LSRA) is a multi‐attribute decision analysis (MADA) problem and is often characterized by both quantitative and qualitative attributes with various types of uncertainty. Therefore, the problem needs to be modeled and analyzed using methods that can handle uncertainty. In this article, we propose an integrated assessment model based on the evidential reasoning (ER) algorithm and fuzzy set theory. The assessment model is structured as a hierarchical framework that regards land subsidence risk as a composite of two key factors: hazard and vulnerability. These factors can be described by a set of basic indicators defined by assessment grades with attributes for transforming both numerical data and subjective judgments into a belief structure. The factor‐level attributes of hazard and vulnerability are combined using the ER algorithm, which is based on the information from a belief structure calculated by the Dempster‐Shafer (D‐S) theory, and a distributed fuzzy belief structure calculated by fuzzy set theory. The results from the combined algorithms yield distributed assessment grade matrices. The application of the model to the Xixi‐Chengnan area, China, illustrates its usefulness and validity for LSRA. The model utilizes a combination of all types of evidence, including all assessment information—quantitative or qualitative, complete or incomplete, and precise or imprecise—to provide assessment grades that define risk assessment on the basis of hazard and vulnerability. The results will enable risk managers to apply different risk prevention measures and mitigation planning based on the calculated risk states.
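A minimal sketch of the D-S combination step at the factor level: Dempster's rule merges two belief structures over assessment grades, renormalizing away the conflicting mass. The grade sets and masses are illustrative; the article's full model additionally uses fuzzy belief structures and the ER algorithm.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: m1, m2 map frozensets of grades to masses."""
    out, conflict = {}, 0.0
    for (a, pa), (b, pb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            out[inter] = out.get(inter, 0.0) + pa * pb
        else:
            conflict += pa * pb          # mass assigned to the empty set
    return {k: v / (1 - conflict) for k, v in out.items()}

G = frozenset
m_hazard = {G({"high"}): 0.6, G({"high", "medium"}): 0.3,
            G({"low", "medium", "high"}): 0.1}
m_vuln   = {G({"medium"}): 0.5, G({"high"}): 0.4,
            G({"low", "medium", "high"}): 0.1}

for grades, mass in combine(m_hazard, m_vuln).items():
    print(set(grades), round(mass, 3))
```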

14.
This article presents a qualitative risk assessment of the acquisition of meticillin-resistant Staphylococcus aureus (MRSA) in pet dogs, representing an important first step in the exploration of the risk of bidirectional MRSA transfer between dogs and humans. A conceptual model of the seven potential pathways for MRSA acquisition in a dog in any given 24-hour period was developed, and the data available to populate that model were considered qualitatively. Humans were found to represent the most important source of MRSA for dogs in both community and veterinary hospital settings. The environment was found to be secondary to humans in terms of importance, and other dogs less important still. This study highlights some important methodological limitations of a technique that is heavily relied upon for qualitative risk assessments, and it applies a novel process, relative risk ranking, to enable the generation of a defensible output using a matrix combination approach. Given the limitations of the prescribed methods as applied to the problem under consideration, further validation, or repudiation, of the findings contained herein is called for through a subsequent quantitative assessment.

15.
In the nuclear power industry, Level 3 probabilistic risk assessment (PRA) is used to estimate damage to public health and the environment if a severe accident leads to a large radiological release. Current Level 3 PRA does not explicitly include social factors, and it is therefore not possible to perform importance ranking of social factors for risk-informing emergency preparedness, planning, and response (EPPR). This article offers a methodology for adapting the concept of social vulnerability, commonly used in natural hazard research, to the context of a severe nuclear power plant accident. The methodology has four steps: (1) calculating a hazard-independent social vulnerability index for the local population; (2) developing a location-specific representation of the maximum radiological hazard estimated from current Level 3 PRA, in a geographic information system (GIS) environment; (3) developing a GIS-based socio-technical risk map by combining the social vulnerability index and the location-specific radiological hazard; and (4) conducting a risk importance measure analysis to rank the criticality of social factors based on their contribution to the socio-technical risk. The methodology is applied using results from the 2012 Surry Power Station state-of-the-art reactor consequence analysis. A radiological hazard model is generated from the MELCOR Accident Consequence Code System, translated into a GIS environment, and combined with the Centers for Disease Control and Prevention (CDC) social vulnerability index (SVI). This research creates an opportunity to explicitly consider and rank the criticality of location-specific SVI themes based on their influence on risk, providing input for EPPR.
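A schematic sketch of steps (3) and (4), assuming a simple multiplicative combination of a theme-weighted SVI with a normalized hazard surface and a leave-one-theme-out importance measure; the grid values, theme names, weights, and combination rule are illustrative stand-ins for the GIS workflow and the PRA-based importance analysis.

```python
import numpy as np

rng = np.random.default_rng(9)
cells = 10_000                                   # grid cells around the site

# Hypothetical SVI themes and weights (stand-ins for the CDC SVI themes)
themes = {"socioeconomic": 0.4, "household": 0.2,
          "minority": 0.2, "housing_transport": 0.2}
theme_vals = {k: rng.uniform(size=cells) for k in themes}
svi = sum(w * theme_vals[k] for k, w in themes.items())

hazard = rng.lognormal(mean=-2.0, sigma=1.0, size=cells)  # dose surrogate
hazard /= hazard.max()                                    # normalize to [0, 1]

risk_total = (svi * hazard).sum()                # socio-technical risk map, summed
for k, w in themes.items():
    reduced = svi - w * theme_vals[k]            # remove one theme's contribution
    print(k, "importance:", round(1 - (reduced * hazard).sum() / risk_total, 3))
```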

16.
Dose‐response models are the essential link between exposure assessment and computed risk values in quantitative microbial risk assessment, yet the uncertainty that is inherent to computed risks because the dose‐response model parameters are estimated using limited epidemiological data is rarely quantified. Second‐order risk characterization approaches incorporating uncertainty in dose‐response model parameters can provide more complete information to decisionmakers by separating variability and uncertainty to quantify the uncertainty in computed risks. Therefore, the objective of this work is to develop procedures to sample from posterior distributions describing uncertainty in the parameters of exponential and beta‐Poisson dose‐response models using Bayes's theorem and Markov Chain Monte Carlo (in OpenBUGS). The theoretical origins of the beta‐Poisson dose‐response model are used to identify a decomposed version of the model that enables Bayesian analysis without the need to evaluate Kummer confluent hypergeometric functions. Herein, it is also established that the beta distribution in the beta‐Poisson dose‐response model cannot address variation among individual pathogens, criteria to validate use of the conventional approximation to the beta‐Poisson model are proposed, and simple algorithms to evaluate actual beta‐Poisson probabilities of infection are investigated.  The developed MCMC procedures are applied to analysis of a case study data set, and it is demonstrated that an important region of the posterior distribution of the beta‐Poisson dose‐response model parameters is attributable to the absence of low‐dose data. This region includes beta‐Poisson models for which the conventional approximation is especially invalid and in which many beta distributions have an extreme shape with questionable plausibility.
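A sketch of the contrast at the heart of the validity criteria: the exact beta-Poisson probability of infection requires the Kummer confluent hypergeometric function, while the conventional approximation does not, and for small beta the two can diverge. The parameter values below are illustrative, not from the case study.

```python
from scipy.special import hyp1f1

def beta_poisson_exact(dose, alpha, beta):
    """Exact beta-Poisson: P(inf | d) = 1 - 1F1(alpha, alpha + beta, -d)."""
    return 1.0 - hyp1f1(alpha, alpha + beta, -dose)

def beta_poisson_approx(dose, alpha, beta):
    """Conventional approximation, valid roughly when beta >> 1 and beta >> alpha."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

alpha, beta = 0.25, 10.0        # small beta: approximation is dubious here
for d in (0.1, 1.0, 10.0, 100.0):
    print(d, beta_poisson_exact(d, alpha, beta),
             beta_poisson_approx(d, alpha, beta))
```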

17.
We consider a firm that procures an input commodity to produce an output commodity to sell to the end retailer. The retailer's demand for the output commodity is negatively correlated with the price of the output commodity. The firm can sell the output commodity to the retailer through a spot, forward or an index‐based contract. Input and output commodity prices are also correlated and follow a joint stochastic price process. The firm maximizes shareholder value by jointly determining optimal procurement and hedging policies. We show that partial hedging dominates both perfect hedging and no‐hedging when input price, output price, and demand are correlated. We characterize the optimal financial hedging and procurement policies as a function of the term structure of the commodity prices, the correlation between the input and output prices, and the firm's operating characteristics. In addition, our analysis illustrates that hedging is most beneficial when output price volatility is high and input price volatility is low. Our model is tested on futures price data for corn and ethanol from the Chicago Mercantile Exchange.

18.
Y. Roll, A. Sachish, Omega, 1981, 9(1): 37-42
A system of productivity indices suitable for the plant level is developed. The productivity of each input factor is measured separately against standard input-per-unit-of-output figures. This system enables the decomposition of comparative indices into components, whereby the contribution of each component to productivity differentials can be determined. A model is proposed for aggregating the single-factor (physical) productivities into an overall (economic) index.
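A minimal worked sketch of the index system, with invented figures: each single-factor (physical) productivity is the standard input-per-unit-of-output divided by the actual figure, and the overall (economic) index aggregates them, here using cost shares as the assumed weights.

```python
# Standard vs. actual input per unit of output (hypothetical plant data)
standard   = {"labor_h": 2.0, "energy_kwh": 5.0}
actual     = {"labor_h": 1.8, "energy_kwh": 5.5}
cost_share = {"labor_h": 0.7, "energy_kwh": 0.3}   # weights for aggregation

# Single-factor (physical) productivity: > 1 means better than standard
single = {k: standard[k] / actual[k] for k in standard}

# Overall (economic) index: cost-share weighted aggregate
overall = sum(cost_share[k] * single[k] for k in single)
print(single, round(overall, 3))
```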

19.
Risk Analysis, 2018, 38(1): 163-176
The U.S. Environmental Protection Agency (EPA) uses health risk assessment to help inform its decisions in setting national ambient air quality standards (NAAQS). EPA's standard approach is to derive epidemiologically based risk estimates from a single statistical model selected from the scientific literature, called the “core” model. The uncertainty presented for “core” risk estimates reflects only the statistical uncertainty associated with that one model's concentration-response function parameter estimate(s). However, epidemiologically based risk estimates are also subject to “model uncertainty,” which is a lack of knowledge about which of many plausible model specifications and data sets best reflects the true relationship between health and ambient pollutant concentrations. In 2002, a National Academies of Sciences (NAS) committee recommended that model uncertainty be integrated into EPA's standard risk analysis approach. This article discusses how model uncertainty can be taken into account with an integrated uncertainty analysis (IUA) of health risk estimates. It provides an illustrative numerical example based on the risk of premature death from respiratory mortality due to long-term exposures to ambient ozone, a health risk considered in the 2015 ozone NAAQS decision. This example demonstrates that using IUA to quantitatively incorporate key model uncertainties into risk estimates produces a substantially altered understanding of the potential public health gain of a NAAQS policy decision, and that IUA can also produce more helpful insights to guide that decision, such as evidence of decreasing incremental health gains from progressive tightening of a NAAQS.
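A numerical sketch of the IUA idea: rather than propagating only the statistical error of one “core” slope, sample over several plausible concentration-response specifications as well as their within-model errors. All slopes, weights, and exposure numbers below are invented for illustration and are not the article's ozone results.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Hypothetical C-R models: (log-linear slope per ppb, standard error, weight)
models = np.array([[0.0040, 0.0010, 0.5],
                   [0.0020, 0.0008, 0.3],
                   [0.0055, 0.0015, 0.2]])

pick = rng.choice(len(models), size=n, p=models[:, 2])   # model uncertainty
beta = rng.normal(models[pick, 0], models[pick, 1])      # within-model error

delta_c = 10.0                     # hypothetical ppb reduction from the policy
baseline_deaths = 5000.0           # hypothetical baseline respiratory deaths
avoided = baseline_deaths * (1 - np.exp(-beta * delta_c))  # log-linear HIF

print("mean avoided deaths:", avoided.mean().round(1))
print("90% interval:", np.quantile(avoided, [0.05, 0.95]).round(1))
```

The resulting interval is wider, and often differently centered, than the single-model statistical interval, which is exactly the altered understanding the article describes.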

20.
Refinements of methods for life cycle impact assessment (LCIA) are directed at removing unjustified simplifications and at quantifying and reducing uncertainties in results. The amount of uncertainty reduction actually achieved through LCIA method refinement depends on the structure of the life cycle inventory model. We investigate the general structure of inventory models using an economic input/output (I/O) life cycle assessment model of the U.S. economy. In particular, we study the results of applying a streamlining algorithm to the I/O LCA model. The streamlining algorithm retains only those "branches" of the process tree that are jointly required to account for a specified fraction of the total impacts upstream of each point in the tree. We examine the implications of these "tree pruning" results for site-informed LCIA. Percentiles are presented, for U.S. commodities and several important pollutants, for the share of total upstream emissions contributed by the set of processes in each supply tier, that is, each set of processes that directly supply inputs to another set of processes. Capturing at least 90% of the total direct plus upstream emissions for criteria air pollutants and toxic releases, for at least 75% of the commodities in the U.S. economy, requires full modeling of direct emissions plus the first five supply tiers. The requirements for capturing a high percentage (e.g., >80%) of total emissions vary widely across products or commodities. To capture more than 60% of total emissions for more than half of all commodities requires models with more than 4,000 process instances. To characterize the total impacts of products well, life cycle impact assessment methods must characterize foreground process impacts in a site-informed way and the mean impacts of far-removed processes in an unbiased way.
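The tier arithmetic can be sketched directly from the power-series expansion of the Leontief inverse: total emissions b(I - A)^-1 f equal b(f + Af + A^2 f + ...), so the share captured by the first k supply tiers is a partial sum. A small random economy below stands in for the U.S. benchmark I/O model.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 50                                                    # sectors
A = rng.uniform(size=(n, n)); A *= 0.6 / A.sum(axis=0)   # column sums < 1
b = rng.uniform(size=n)                                   # emissions per $ output
f = np.zeros(n); f[0] = 1.0                               # demand for one commodity

total = b @ np.linalg.solve(np.eye(n) - A, f)             # exact total emissions

tier_demand, captured = f.copy(), 0.0
for k in range(8):
    captured += b @ tier_demand            # emissions of supply tier k
    print(f"tiers 0..{k}: {captured / total:.1%} of total emissions")
    tier_demand = A @ tier_demand          # output required from the next tier
```

How quickly the partial sums approach 100% is exactly the property the article's percentile tables summarize across commodities and pollutants.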
