Similar Literature
20 similar documents were retrieved.
1.
The safety and security of straits and canals play an important role in maritime transportation. The disruption of a strait or canal will lead to increased transportation costs and world trade problems. Therefore, an advanced approach incorporating fuzzy logic and an evidential reasoning (ER) algorithm is developed in this paper to conduct vulnerability assessment of straits and canals. A hierarchical structure is first developed that takes into account both qualitative and quantitative factors. A fuzzy rule-based transformation technique is applied to convert quantitative factors into qualitative ones, which enables a fuzzy ER method to synthesize all the information from the bottom to the top of the developed hierarchical structure. The intelligent decision system (IDS) software is used to facilitate the vulnerability assessment process. The developed framework is then validated and demonstrated in a case study on vulnerability prioritization, which decision-makers can use as a reference to ensure the safety and security of straits and canals.
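As a rough illustration of the kind of bottom-up evidential reasoning aggregation described above, the sketch below combines distributed assessments of lower-level factors into one parent assessment, in the spirit of the recursive ER algorithm. The grades, factor names, weights, and belief degrees are invented for illustration; this is not the paper's IDS implementation.

```python
# Minimal sketch of evidential reasoning (ER) aggregation over qualitative
# grades. Assessments are assumed complete (belief degrees sum to 1), so the
# only unassigned mass comes from the attribute weights.
GRADES = ["very low", "low", "medium", "high", "very high"]

def er_combine(assessments, weights):
    """Combine per-attribute belief degrees over GRADES (weights sum to 1)
    into one distributed assessment for the parent attribute."""
    n = len(GRADES)
    m = [weights[0] * b for b in assessments[0]]   # basic probability masses
    m_h = 1.0 - weights[0]                         # mass left unassigned by the weight
    for a, w in zip(assessments[1:], weights[1:]):
        mi = [w * b for b in a]
        mi_h = 1.0 - w
        # normalizing factor removes conflict between different grades
        conflict = sum(m[t] * mi[j] for t in range(n) for j in range(n) if t != j)
        k = 1.0 / (1.0 - conflict)
        m = [k * (m[t] * mi[t] + m[t] * mi_h + m_h * mi[t]) for t in range(n)]
        m_h = k * m_h * mi_h
    # scale out the residual unassigned mass to obtain belief degrees
    return [mt / (1.0 - m_h) for mt in m]

if __name__ == "__main__":
    beliefs = [
        [0.0, 0.1, 0.5, 0.4, 0.0],   # e.g. traffic density (illustrative)
        [0.2, 0.5, 0.3, 0.0, 0.0],   # e.g. navigational difficulty
        [0.0, 0.0, 0.2, 0.5, 0.3],   # e.g. geopolitical exposure
    ]
    weights = [0.5, 0.3, 0.2]        # must sum to 1
    for g, b in zip(GRADES, er_combine(beliefs, weights)):
        print(f"{g:>10}: {b:.3f}")
```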

2.
To address the uncertainty of basic attribute weights and the inconsistency between the assessment grade sets of basic attributes and the generalized attribute, this paper proposes an uncertain multi-attribute decision-making method based on evidential reasoning, extending the evidential reasoning algorithm to a more general decision environment. Attribute weights are derived objectively from the information entropy of the decision matrix. Where the assessment grades of a basic attribute differ from the generalized grade set, the distributed assessments of the basic attribute are fuzzified and fuzzy-transformed into the unified form of a generalized distributed assessment. Finally, the evidential reasoning algorithm is applied to rank the whole set of alternatives. A worked example shows that the method is feasible and effective.
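A minimal sketch of the entropy-weight step mentioned above, deriving objective attribute weights from the information entropy of a decision matrix. The matrix values are illustrative and assumed positive and benefit-type.

```python
# Entropy-weight method: attributes over which alternatives differ more
# (lower entropy) receive larger weights.
import math

def entropy_weights(matrix):
    m, n = len(matrix), len(matrix[0])
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    # proportion of each alternative under each attribute
    p = [[row[j] / col_sums[j] for j in range(n)] for row in matrix]
    k = 1.0 / math.log(m)
    entropy = [
        -k * sum(p[i][j] * math.log(p[i][j]) for i in range(m) if p[i][j] > 0)
        for j in range(n)
    ]
    divergence = [1.0 - e for e in entropy]      # more divergence -> more weight
    total = sum(divergence)
    return [d / total for d in divergence]

if __name__ == "__main__":
    decision_matrix = [
        [0.7, 120.0, 3.0],
        [0.5,  95.0, 4.0],
        [0.9, 110.0, 2.0],
    ]
    print([round(w, 3) for w in entropy_weights(decision_matrix)])
```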

3.
In this article, a classification model based on the majority rule sorting (MR-Sort) method is employed to evaluate the vulnerability of safety-critical systems with respect to malevolent intentional acts. The model is built on the basis of a (limited-size) set of data representing (a priori known) vulnerability classification examples. The empirical construction of the classification model introduces a source of uncertainty into the vulnerability analysis process: a quantitative assessment of the performance of the classification model (in terms of accuracy and confidence in the assignments) is thus in order. Three different approaches are considered here for this aim: (i) a model-retrieval-based approach, (ii) the bootstrap method, and (iii) the leave-one-out cross-validation technique. The analyses are presented with reference to an exemplificative case study involving the vulnerability assessment of nuclear power plants.
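The following sketch shows a generic MR-Sort assignment rule of the kind referred to above: an alternative is placed in the highest vulnerability class whose lower boundary profile it outranks by a weighted majority of criteria. The classes, profiles, weights, and threshold are hypothetical, not the model fitted in the article.

```python
# Generic MR-Sort assignment (simplified ELECTRE TRI with majority rule).
def outranks(alternative, profile, weights, majority_threshold):
    """True if the weighted coalition of criteria on which the alternative is
    at least as good as the profile reaches the majority threshold."""
    support = sum(w for a, p, w in zip(alternative, profile, weights) if a >= p)
    return support >= majority_threshold

def mr_sort(alternative, profiles, weights, majority_threshold, classes):
    """Assign to the highest class whose lower profile the alternative outranks.
    `profiles` are lower boundaries, ordered from the lowest class upward."""
    assigned = classes[0]
    for profile, cls in zip(profiles, classes[1:]):
        if outranks(alternative, profile, weights, majority_threshold):
            assigned = cls
    return assigned

if __name__ == "__main__":
    classes = ["low", "medium", "high"]        # vulnerability classes, worst last
    profiles = [                               # lower boundary of "medium", then "high"
        [0.3, 0.4, 0.2],
        [0.6, 0.7, 0.5],
    ]
    weights = [0.4, 0.35, 0.25]                # sum to 1
    lam = 0.6                                  # majority threshold
    print(mr_sort([0.7, 0.5, 0.6], profiles, weights, lam, classes))
```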

4.
Model uncertainty is a primary source of uncertainty in the assessment of the performance of repositories for the disposal of nuclear wastes, due to the complexity of the system and the large spatial and temporal scales involved. This work considers multiple assumptions on the system behavior and corresponding alternative plausible modeling hypotheses. To characterize the uncertainty in the correctness of the different hypotheses, the opinions of different experts are treated probabilistically or, alternatively, by the belief and plausibility functions of Dempster-Shafer theory. A comparison is made with reference to a flow model for the evaluation of the hydraulic head distributions present at a radioactive waste repository site. Three experts are assumed available for the evaluation of the uncertainties associated with the hydrogeological properties of the repository and the groundwater flow mechanisms.
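To make the Dempster-Shafer treatment of expert opinions concrete, the sketch below combines two illustrative mass assignments over a small frame of alternative model hypotheses using Dempster's rule, then reads off belief and plausibility for each hypothesis. The hypotheses and masses are invented; the article's groundwater-flow application is more elaborate.

```python
# Dempster-Shafer combination of expert opinions over alternative hypotheses.
HYPOTHESES = ("H1", "H2", "H3")    # e.g. alternative groundwater-flow models

def combine(m1, m2):
    """Dempster's rule: combine two mass assignments over the same frame."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

def belief(m, subset):
    return sum(v for s, v in m.items() if s <= subset)

def plausibility(m, subset):
    return sum(v for s, v in m.items() if s & subset)

if __name__ == "__main__":
    # two experts, each spreading mass over single hypotheses and their unions
    expert_a = {frozenset({"H1"}): 0.5, frozenset({"H1", "H2"}): 0.3,
                frozenset(HYPOTHESES): 0.2}
    expert_b = {frozenset({"H2"}): 0.4, frozenset({"H1", "H3"}): 0.4,
                frozenset(HYPOTHESES): 0.2}
    m = combine(expert_a, expert_b)
    for h in HYPOTHESES:
        target = frozenset({h})
        print(h, round(belief(m, target), 3), round(plausibility(m, target), 3))
```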

5.
Vulnerability of human beings exposed to a catastrophic disaster is affected by multiple factors that include hazard intensity, environment, and individual characteristics. The traditional approach to vulnerability assessment, based on the aggregate-area method and unsupervised learning, cannot incorporate spatial information; thus, vulnerability can be only roughly assessed. In this article, we propose Bayesian network (BN) and spatial analysis techniques to mine spatial data sets to evaluate the vulnerability of human beings. In our approach, spatial analysis is leveraged to preprocess the data; for example, kernel density analysis (KDA) and accumulative road cost surface modeling (ARCSM) are employed to quantify the influence of geofeatures on vulnerability and relate such influence to spatial distance. The knowledge- and data-based BN provides a consistent platform to integrate a variety of factors, including those extracted by KDA and ARCSM, to model vulnerability uncertainty. We also consider model uncertainty, using Bayesian model averaging and Occam's window to average the multiple models obtained by our approach for robust prediction of risk and vulnerability. We compare our approach with other probabilistic models in a case study of seismic risk and conclude that our approach is a good means of mining spatial data sets to evaluate vulnerability.

6.
Groundwater leakage into subsurface constructions can cause reduction of pore pressure and subsidence in clay deposits, even at large distances from the location of the construction. The potential cost of damage is substantial, particularly in urban areas. The large-scale process also implies heterogeneous soil conditions that cannot be described in complete detail, which creates a need to estimate the uncertainty of subsidence with probabilistic methods. In this study, the risk of subsidence is estimated by coupling two probabilistic models: a geostatistics-based soil stratification model and a subsidence model. Statistical analyses of stratification and soil properties are inputs into the models. The results include spatially explicit probabilistic estimates of subsidence magnitude and sensitivities of the included model parameters. From these, areas with significant risk of subsidence are distinguished from low-risk areas. The efficiency and usefulness of this modeling approach as a tool for communication to stakeholders, decision support for prioritization of risk-reducing measures, and identification of the need for further investigations and monitoring are demonstrated with a case study of a planned tunnel in Stockholm.

7.
Point source pollution is one of the main threats to regional environmental health. Based on a water quality model, a methodology to assess the regional risk of point source pollution is proposed. The assessment procedure includes five parts: (1) identifying risk source units and estimating source emissions using Monte Carlo algorithms; (2) observing hydrological and water quality data of the assessed area, and evaluating the selected water quality model; (3) screening out the assessment endpoints and analyzing receptor vulnerability with the Choquet fuzzy integral algorithm; (4) using the water quality model introduced in the second step to predict pollutant concentrations for various source emission scenarios and analyzing hazards of risk sources; and finally, (5) using the source hazard values and receptor vulnerability scores to estimate overall regional risk. The proposed method, based on the Water Quality Analysis Simulation Program (WASP), was applied in the region of the Taipu River, which is in the Taihu Basin, China. Results of source hazard and receptor vulnerability analysis allowed us to describe aquatic ecological, human health, and socioeconomic risks individually, and also integrated risks in the Taipu region, from a series of risk curves. Risk contributions of sources to receptors were ranked, and the spatial distribution of risk levels was presented. By changing the input conditions, we were able to estimate risks for a range of scenarios. Thus, the proposed procedure may also be used by decisionmakers for long-term dynamic risk prediction.
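Step (3) names the Choquet fuzzy integral. The sketch below shows a discrete Choquet integral over three hypothetical receptor criteria with an explicitly enumerated fuzzy measure; it is a generic illustration of the aggregation, not the article's WASP-based implementation.

```python
# Discrete Choquet integral with respect to a fuzzy measure defined on all
# subsets of the criteria. Criteria, scores, and measure values are illustrative.
def choquet_integral(scores, measure):
    """scores: {criterion: value in [0, 1]}; measure: {frozenset: weight},
    with measure[frozenset()] = 0 and measure[all criteria] = 1."""
    items = sorted(scores.items(), key=lambda kv: kv[1])   # ascending by score
    total, previous = 0.0, 0.0
    remaining = set(scores)
    for criterion, value in items:
        total += (value - previous) * measure[frozenset(remaining)]
        previous = value
        remaining.remove(criterion)
    return total

if __name__ == "__main__":
    criteria = {"ecological", "health", "socioeconomic"}
    scores = {"ecological": 0.6, "health": 0.8, "socioeconomic": 0.3}
    measure = {
        frozenset(): 0.0,
        frozenset({"ecological"}): 0.35, frozenset({"health"}): 0.4,
        frozenset({"socioeconomic"}): 0.25,
        frozenset({"ecological", "health"}): 0.8,
        frozenset({"ecological", "socioeconomic"}): 0.55,
        frozenset({"health", "socioeconomic"}): 0.6,
        frozenset(criteria): 1.0,
    }
    print(round(choquet_integral(scores, measure), 3))
```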

8.
Management of invasive species depends on developing prevention and control strategies through comprehensive risk assessment frameworks, which require a thorough analysis of exposure to invasive species. However, accurate exposure analysis of invasive species can be a daunting task because of the inherent uncertainty in invasion processes. Risk assessment of invasive species under uncertainty requires integrating expert judgment with empirical information, which often can be incomplete, imprecise, and fragmentary. The representation of knowledge in classical risk models depends on the formulation of a precise probabilistic value or a well-defined joint distribution of unknown parameters. However, expert knowledge and judgments are often represented in value-laden terms or preference-ordered criteria. We offer a novel approach to risk assessment that uses a dominance-based rough set approach to account for preference order in the domains of attributes over the set of risk classes. The model is illustrated with an example showing how a knowledge-centric risk model can be integrated with the dominance-based principle of rough sets to derive minimal covering "if ..., then ..." decision rules for reasoning over a set of possible invasion scenarios. The inconsistency and ambiguity in the data set are modeled using the rough set concept of a boundary region adjoining the lower and upper approximations of the risk classes. Finally, we present an extension of the rough set approach to an evidence-theoretic interpretation of risk measures of invasive species in a spatial context. In this approach, multispecies interactions in an invasion risk are approximated with imprecise probability measures through a combination of spatial neighborhood information of risk estimation, expressed in terms of belief and plausibility.

9.
Risk-benefit analyses are introduced as a new paradigm for old problems. However, in many cases it is not necessary to perform a full, comprehensive, and expensive quantitative risk-benefit assessment to solve the problem, nor is it always possible, given the lack of the required data. The choice to continue from a more qualitative to a fully quantitative risk-benefit assessment can be made using a tiered approach. In this article, this tiered approach for risk-benefit assessment is addressed using a decision tree. The tiered approach described uses the same four steps as the risk assessment paradigm: hazard and benefit identification, hazard and benefit characterization, exposure assessment, and risk-benefit characterization, albeit in a different order. For the purpose of this approach, the exposure assessment has been moved up and the dose-response modeling (part of hazard and benefit characterization) is moved to a later stage. The decision tree includes several stopping points, depending on whether the information gathered is sufficient to answer the initial risk-benefit question. The approach has been tested on two food ingredients. The decision tree presented in this article is useful for assisting a risk-benefit assessor and policymaker, on a case-by-case basis, in making informed choices about when to stop or continue with a risk-benefit assessment.

10.
Terje Aven. Risk Analysis, 2011, 31(4): 515-522.
Recently, considerable attention has been paid to a systems-based approach to risk, vulnerability, and resilience analysis. It is argued that risk, vulnerability, and resilience are inherently and fundamentally functions of the states of the system and its environment. Vulnerability is defined as the manifestation of the inherent states of the system that can be subjected to a natural hazard or be exploited to adversely affect that system, whereas resilience is defined as the ability of the system to withstand a major disruption within acceptable degradation parameters and to recover within acceptable time and composite costs and risks. Risk, on the other hand, is probability based, defined by the probability and severity of adverse effects (i.e., the consequences). In this article, we look more closely into this approach. It is observed that the key concepts are inconsistent in the sense that the uncertainty (probability) dimension is included in the risk definition but not in those of vulnerability and resilience. In the article, we question the rationale for this inconsistency. The suggested approach is compared with an alternative framework that provides a logically defined structure for risk, vulnerability, and resilience, in which all three concepts incorporate the uncertainty (probability) dimension.

11.
In expected utility theory, risk attitudes are modeled entirely in terms of utility. In the rank-dependent theories, a new dimension is added: chance attitude, modeled in terms of nonadditive measures or nonlinear probability transformations that are independent of utility. Most empirical studies of chance attitude assume probabilities given and adopt parametric fitting for estimating the probability transformation. Only a few qualitative conditions have been proposed or tested as yet, usually quasi-concavity or quasi-convexity in the case of given probabilities. This paper presents a general method of studying qualitative properties of chance attitude such as optimism, pessimism, and the "inverse-S shape" pattern, both for risk and for uncertainty. These qualitative properties can be characterized by permitting appropriate, relatively simple violations of the sure-thing principle. In particular, this paper solves a hitherto open problem: the preference axiomatization of convex ("pessimistic" or "uncertainty averse") nonadditive measures under uncertainty. The axioms of this paper preserve the central feature of rank-dependent theories, i.e., the separation of chance attitude and utility.
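Although the paper works axiomatically, the rank-dependent machinery it builds on is easy to illustrate numerically. The sketch below evaluates a small lottery under a parametric probability-weighting function (the Tversky-Kahneman form is assumed here purely for illustration), showing how an inverse-S transformation (gamma < 1) reshapes decision weights relative to linear weighting.

```python
# Rank-dependent evaluation of a lottery with a nonlinear probability
# transformation. The weighting form and parameters are illustrative only;
# the paper itself proceeds axiomatically rather than parametrically.
def tk_weight(p, gamma):
    """Inverse-S shaped for gamma < 1; reduces to the identity at gamma = 1."""
    return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

def rank_dependent_value(outcomes, probs, utility, gamma):
    """Order outcomes from best to worst and weight each by the increment of
    the transformed decumulative probability."""
    ranked = sorted(zip(outcomes, probs), key=lambda t: t[0], reverse=True)
    value, cum = 0.0, 0.0
    for x, p in ranked:
        value += (tk_weight(cum + p, gamma) - tk_weight(cum, gamma)) * utility(x)
        cum += p
    return value

if __name__ == "__main__":
    outcomes = [0.0, 50.0, 100.0]
    probs = [0.1, 0.6, 0.3]
    u = lambda x: x ** 0.5                  # concave utility
    for g in (1.0, 0.61):                   # linear vs. inverse-S weighting
        print(g, round(rank_dependent_value(outcomes, probs, u, g), 3))
```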

12.
In the nuclear power industry, Level 3 probabilistic risk assessment (PRA) is used to estimate damage to public health and the environment if a severe accident leads to a large radiological release. Current Level 3 PRA does not explicitly include social factors and, therefore, it is not possible to perform importance ranking of social factors for risk-informing emergency preparedness, planning, and response (EPPR). This article offers a methodology for adapting the concept of social vulnerability, commonly used in natural hazard research, to the context of a severe nuclear power plant accident. The methodology has four steps: (1) calculating a hazard-independent social vulnerability index for the local population; (2) developing a location-specific representation of the maximum radiological hazard estimated from current Level 3 PRA, in a geographic information system (GIS) environment; (3) developing a GIS-based socio-technical risk map by combining the social vulnerability index and the location-specific radiological hazard; and (4) conducting a risk importance measure analysis to rank the criticality of social factors based on their contribution to the socio-technical risk. The methodology is applied using results from the 2012 Surry Power Station state-of-the-art reactor consequence analysis. A radiological hazard model is generated from the MELCOR Accident Consequence Code System, translated into a GIS environment, and combined with the Centers for Disease Control and Prevention social vulnerability index (SVI). This research creates an opportunity to explicitly consider and rank the criticality of location-specific SVI themes based on their influence on risk, providing input for EPPR.

13.
Between 1996 and 1999, five mining subsidence events occurred in the iron-ore field in Lorraine, France, and damaged several hundred buildings. Because of the thousands of hectares of undermined areas, an assessment of the vulnerability of buildings and land is necessary for risk management. Risk assessment methods have evolved from the initial risk management decisions taken immediately after the mining subsidence events to the risk assessment studies that are currently under consideration. These changes reveal much about the complexity of the vulnerability concept and about the difficulties of developing simple and relevant methods for its assessment. The objective of this article is to present this process, suggest improvements on the basis of theoretical definitions of vulnerability, and give an operational example of vulnerability assessment from the seismic field. Vulnerability is divided into three components: weakness, stakes value, and resilience. The final improvements take these three components into account and constitute an original method for assessing the vulnerability of a city to subsidence.

14.
Dose-response models are the essential link between exposure assessment and computed risk values in quantitative microbial risk assessment, yet the uncertainty that is inherent to computed risks because the dose-response model parameters are estimated using limited epidemiological data is rarely quantified. Second-order risk characterization approaches incorporating uncertainty in dose-response model parameters can provide more complete information to decisionmakers by separating variability and uncertainty to quantify the uncertainty in computed risks. Therefore, the objective of this work is to develop procedures to sample from posterior distributions describing uncertainty in the parameters of exponential and beta-Poisson dose-response models using Bayes's theorem and Markov Chain Monte Carlo (in OpenBUGS). The theoretical origins of the beta-Poisson dose-response model are used to identify a decomposed version of the model that enables Bayesian analysis without the need to evaluate Kummer confluent hypergeometric functions. Herein, it is also established that the beta distribution in the beta-Poisson dose-response model cannot address variation among individual pathogens, criteria to validate use of the conventional approximation to the beta-Poisson model are proposed, and simple algorithms to evaluate actual beta-Poisson probabilities of infection are investigated. The developed MCMC procedures are applied to analysis of a case study data set, and it is demonstrated that an important region of the posterior distribution of the beta-Poisson dose-response model parameters is attributable to the absence of low-dose data. This region includes beta-Poisson models for which the conventional approximation is especially invalid and in which many beta distributions have an extreme shape with questionable plausibility.
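For readers unfamiliar with the two model families discussed, the sketch below evaluates the exponential model, the conventional beta-Poisson approximation, and (via SciPy's Kummer confluent hypergeometric function) the exact beta-Poisson probability of infection. Parameter values are illustrative, not the estimates from the case study, and SciPy is assumed to be available.

```python
# Exponential and beta-Poisson dose-response curves.
import math
from scipy.special import hyp1f1

def exponential_dr(dose, r):
    return 1.0 - math.exp(-r * dose)

def beta_poisson_approx(dose, alpha, beta):
    # conventional approximation, adequate only when beta >> 1 and beta >> alpha
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

def beta_poisson_exact(dose, alpha, beta):
    # exact form: 1 - 1F1(alpha, alpha + beta, -dose)
    return 1.0 - hyp1f1(alpha, alpha + beta, -dose)

if __name__ == "__main__":
    alpha, beta = 0.25, 40.0            # illustrative parameters
    for dose in (1.0, 10.0, 100.0):
        print(dose,
              round(exponential_dr(dose, r=0.005), 4),
              round(beta_poisson_approx(dose, alpha, beta), 4),
              round(beta_poisson_exact(dose, alpha, beta), 4))
```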

15.
The Monte Carlo (MC) simulation approach is traditionally used in food safety risk assessment to study quantitative microbial risk assessment (QMRA) models. When experimental data are available, performing Bayesian inference is a good alternative approach that allows backward calculation in a stochastic QMRA model to update the experts' knowledge about the microbial dynamics of a given food-borne pathogen. In this article, we propose a complex example where Bayesian inference is applied to a high-dimensional second-order QMRA model. The case study is a farm-to-fork QMRA model considering genetic diversity of Bacillus cereus in a cooked, pasteurized, and chilled courgette purée. Experimental data are Bacillus cereus concentrations measured in packages of courgette purées stored at different time-temperature profiles after pasteurization. To perform a Bayesian inference, we first built an augmented Bayesian network by linking a second-order QMRA model to the available contamination data. We then ran a Markov chain Monte Carlo (MCMC) algorithm to update all the unknown concentrations and unknown quantities of the augmented model. About 25% of the prior beliefs are strongly updated, leading to a reduction in uncertainty. Some updates interestingly question the QMRA model.

16.
Multicriteria decision analysis (MCDA) has been applied to various energy problems to incorporate a variety of qualitative and quantitative criteria, usually spanning environmental, social, engineering, and economic fields. MCDA and associated methods such as life-cycle assessments and cost-benefit analysis can also include risk analysis to address uncertainties in criteria estimates. One technology now being assessed to help mitigate climate change is carbon capture and storage (CCS). CCS is a new process that captures CO2 emissions from fossil-fueled power plants and injects them into geological reservoirs for storage. It presents a unique challenge to decisionmakers (DMs) due to its technical complexity, range of environmental, social, and economic impacts, variety of stakeholders, and long time spans. The authors have developed a risk assessment model using an MCDA approach for CCS decisions such as selecting between CO2 storage locations and choosing among different mitigation actions for reducing risks. The model includes uncertainty measures for several factors, utility curve representations of all variables, Monte Carlo simulation, and sensitivity analysis. This article uses a CCS scenario example to demonstrate the development and application of the model, based on data derived from published articles and publicly available sources. The model allows high-level DMs to better understand project risks and the tradeoffs inherent in modern, complex energy decisions.

17.
A fuzzy binomial tree option pricing model under Knightian uncertainty
In a market with Knightian uncertainty, the factors that drive option prices are not only random but also fuzzy in nature. We therefore assume that the stock price follows a fuzzy stochastic process and price European options with a binomial tree model based on parabolic fuzzy numbers, obtaining the risk-neutral probabilities and option prices as weighted intervals. In an empirical study using Chinese warrant data, the model parameters are determined by quadratic programming and the fuzzy option prices are defuzzified; compared with the traditional binomial tree model, the fuzzy binomial tree model yields more accurate predictions of market prices. Investors can choose an acceptable confidence level and obtain a fuzzy price interval to guide their investment strategy. In addition, the model yields a measure of the fuzziness of the option price, which indicates the magnitude of the Knightian uncertainty.
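A drastically simplified illustration of the underlying idea: pricing a European call on a CRR binomial tree at the lower and upper bounds of a volatility interval yields an option-price interval, standing in for the paper's parabolic-fuzzy-number treatment. All market parameters are invented.

```python
# CRR binomial tree for a European call, evaluated at two volatilities to
# produce a price interval (a stand-in for a fuzzy price).
import math

def crr_european_call(s0, strike, rate, sigma, maturity, steps):
    dt = maturity / steps
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    q = (math.exp(rate * dt) - d) / (u - d)       # risk-neutral probability
    disc = math.exp(-rate * dt)
    # terminal payoffs, then backward induction through the tree
    values = [max(s0 * u**j * d**(steps - j) - strike, 0.0) for j in range(steps + 1)]
    for _ in range(steps):
        values = [disc * (q * values[j + 1] + (1 - q) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

if __name__ == "__main__":
    lo = crr_european_call(s0=100, strike=100, rate=0.03, sigma=0.18, maturity=0.5, steps=200)
    hi = crr_european_call(s0=100, strike=100, rate=0.03, sigma=0.26, maturity=0.5, steps=200)
    print(f"price interval under volatility in [0.18, 0.26]: [{lo:.2f}, {hi:.2f}]")
```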

18.
Underlying information about failure, including observations made in free text, can be a good source for understanding, analyzing, and extracting meaningful information for determining causation. The unstructured nature of natural language expression demands advanced methodology to identify its underlying features. There is no available solution for utilizing unstructured data for risk assessment purposes. Because relevant data are scarce, textual data can be a vital learning source for developing a risk assessment methodology. This work addresses the knowledge gap in extracting relevant features from textual data to develop cause-effect scenarios with minimal manual interpretation. This study applies natural language processing and text-mining techniques to extract features from past accident reports. The extracted features are transformed into parametric form with the help of fuzzy set theory and used in Bayesian networks as prior probabilities for risk assessment. An application of the proposed methodology is shown for microbiologically influenced corrosion-related incident reports available from the Pipeline and Hazardous Materials Safety Administration database. In addition, the trained named entity recognition (NER) model is verified on eight incidents, showing a promising preliminary result for identifying all relevant features from textual data and demonstrating the robustness and applicability of the NER method. The proposed methodology can be used in domain-specific risk assessment to analyze, predict, and prevent future mishaps, improving overall process safety.

19.
Detailed spatial representation of socioeconomic exposure and the related vulnerability to natural hazards has the potential to improve the quality and reliability of risk assessment outputs. We apply a spatially weighted dasymetric approach based on multiple ancillary data sets to downscale important socioeconomic variables and produce a grid data set for Italy that contains multilayered information about physical exposure, population, gross domestic product, and social vulnerability. We test the performance of our dasymetric approach against other spatial interpolation methods. Next, we combine the grid data set with flood hazard estimates to exemplify an application for risk assessment purposes.
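A minimal sketch of dasymetric downscaling as described above: a zonal total is redistributed to grid cells in proportion to weights derived from ancillary data. Land-cover classes, class weights, and totals are illustrative.

```python
# Dasymetric redistribution of a zonal total over grid cells using ancillary weights.
def dasymetric_downscale(zone_total, cells, class_weights):
    """cells: list of (cell_id, land_cover_class, ancillary_intensity)."""
    raw = {cid: class_weights[lc] * intensity for cid, lc, intensity in cells}
    norm = sum(raw.values())
    if norm == 0:                       # fall back to a uniform split
        return {cid: zone_total / len(cells) for cid, _, _ in cells}
    return {cid: zone_total * w / norm for cid, w in raw.items()}

if __name__ == "__main__":
    cells = [("c1", "residential", 0.9), ("c2", "residential", 0.4),
             ("c3", "industrial", 0.7), ("c4", "water", 1.0)]
    class_weights = {"residential": 1.0, "industrial": 0.2, "water": 0.0}
    allocation = dasymetric_downscale(12500, cells, class_weights)   # e.g. municipal population
    print({k: round(v) for k, v in allocation.items()})
```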

20.
Helicobacter pylori is a microaerophilic, gram-negative bacterium that is linked to adverse health effects including ulcers and gastrointestinal cancers. The goal of this analysis is to develop the necessary inputs for a quantitative microbial risk assessment (QMRA) needed to develop a potential guideline for drinking water at the point of ingestion (e.g., a maximum contaminant level, or MCL) that would be protective of human health to an acceptable level of risk while considering sources of uncertainty. Using infection and gastric cancer as two discrete endpoints, and calculating dose-response relationships from experimental data on humans and monkeys, we perform both a forward and a reverse risk assessment to determine the risk from currently reported surface water concentrations of H. pylori and an acceptable concentration of H. pylori at the point of ingestion. This approach represents a synthesis of available information on human exposure to H. pylori via drinking water. A lifetime cancer risk model suggests that an MCL be set at <1 organism/L, given 5-log removal by treatment, because we cannot exclude the possibility that current levels of H. pylori in environmental source waters pose a potential public health risk. Research gaps include pathogen occurrence in source and finished water, treatment removal rates, and determination of H. pylori risks from other water sources such as groundwater and recreational water.
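As a rough illustration of the reverse calculation described above (assuming an exponential dose-response model and placeholder parameters, not the article's fitted values), the sketch below back-calculates an acceptable concentration at the point of ingestion from a risk target, and the corresponding source-water concentration for a given log-removal by treatment.

```python
# Reverse QMRA sketch: target risk -> acceptable dose -> tap and source concentrations.
import math

def acceptable_dose(target_risk, r):
    """Invert P = 1 - exp(-r * dose) for the dose giving the target risk."""
    return -math.log(1.0 - target_risk) / r

def acceptable_concentrations(target_risk, r, litres_per_day, log_removal):
    dose = acceptable_dose(target_risk, r)            # organisms per day
    tap_conc = dose / litres_per_day                  # organisms per litre at ingestion
    source_conc = tap_conc * 10 ** log_removal        # allowed upstream of treatment
    return tap_conc, source_conc

if __name__ == "__main__":
    # e.g. a daily risk target derived from an annual target of 1e-4 (illustrative)
    daily_target = 1.0 - (1.0 - 1e-4) ** (1.0 / 365.0)
    tap, source = acceptable_concentrations(
        target_risk=daily_target, r=0.002, litres_per_day=2.0, log_removal=5.0)
    print(f"tap: {tap:.3e} org/L, source water: {source:.3e} org/L")
```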
