Similar Documents
20 similar documents found (search time: 31 ms)
1.
Land subsidence risk assessment (LSRA) is a multi-attribute decision analysis (MADA) problem, often characterized by both quantitative and qualitative attributes with various types of uncertainty. The problem therefore needs to be modeled and analyzed with methods that can handle uncertainty. In this article, we propose an integrated assessment model based on the evidential reasoning (ER) algorithm and fuzzy set theory. The assessment model is structured as a hierarchical framework that regards land subsidence risk as a composite of two key factors: hazard and vulnerability. These factors can be described by a set of basic indicators defined by assessment grades, with attributes for transforming both numerical data and subjective judgments into a belief structure. The factor-level attributes of hazard and vulnerability are combined using the ER algorithm, which draws both on a belief structure calculated by Dempster-Shafer (D-S) theory and on a distributed fuzzy belief structure calculated by fuzzy set theory. The combined algorithms yield distributed assessment grade matrices. The application of the model to the Xixi-Chengnan area, China, illustrates its usefulness and validity for LSRA. The model utilizes all types of evidence, including all assessment information (quantitative or qualitative, complete or incomplete, precise or imprecise), to provide assessment grades that characterize risk in terms of hazard and vulnerability. The results enable risk managers to apply different risk prevention measures and mitigation planning based on the calculated risk states.
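As an illustration of the evidence-combination step, the following minimal sketch applies Dempster's rule of combination to two belief structures defined over a common set of assessment grades. The grade labels, masses, and the use of plain Dempster combination (rather than the full weighted ER recursion of the article) are illustrative assumptions.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic belief assignments (BBAs) with Dempster's rule;
    keys are frozensets of assessment grades (focal elements)."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two indicator-level assessments over grades {low, medium, high};
# mass on the whole frame expresses residual ignorance.
frame = frozenset({"low", "medium", "high"})
m_hazard = {frozenset({"low"}): 0.5, frozenset({"medium"}): 0.3, frame: 0.2}
m_vuln   = {frozenset({"medium"}): 0.6, frozenset({"high"}): 0.2, frame: 0.2}
print(dempster_combine(m_hazard, m_vuln))
```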

2.
An empirical classification model based on the Majority Rule Sorting (MR-Sort) method has been previously proposed by the authors to evaluate the vulnerability of safety-critical systems (in particular, nuclear power plants [NPPs]) with respect to malevolent intentional acts. In this article, the model serves as the basis for an analysis aimed at determining a set of protective actions to be taken (e.g., increasing the number of monitoring devices, reducing the number of accesses to the safety-critical system) in order to effectively reduce the level of vulnerability of the safety-critical systems under consideration. In particular, the problem is here tackled within an optimization framework: the set of protective actions to implement is chosen as the one minimizing the overall level of vulnerability of a group of safety-critical systems. In this context, three different optimization approaches have been explored: (i) one single classification model is built to evaluate and minimize system vulnerability; (ii) an ensemble of compatible classification models, generated by the bootstrap method, is employed to perform a “robust” optimization, taking as reference the “worst-case” scenario over the group of models; (iii) finally, a distribution of classification models, still obtained by bootstrap, is considered to address vulnerability reduction in a “probabilistic” fashion (i.e., by minimizing the “expected” vulnerability of a fleet of systems). The results are presented and compared with reference to a fictitious example considering NPPs as the safety-critical systems of interest.
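A minimal sketch of the MR-Sort assignment rule at the heart of the model may help: an alternative belongs at or above a category boundary when the weighted coalition of criteria on which it meets the boundary profile reaches a majority threshold. The criteria names, weights, profiles, and threshold below are hypothetical.

```python
def mr_sort(alternative, profiles, weights, lam):
    """Assign an alternative (dict criterion -> value, higher = more
    vulnerable) to an ordered category index (0 = lowest). Profiles are
    ordered from the lowest to the highest category boundary."""
    category = 0
    for level, profile in enumerate(profiles, start=1):
        support = sum(weights[c] for c in alternative
                      if alternative[c] >= profile[c])
        if support >= lam:
            category = level  # outranks this boundary: move up
        else:
            break
    return category

weights = {"accesses": 0.3, "monitoring": 0.4, "staff": 0.3}  # sum to 1
profiles = [  # boundary low/medium, then medium/high (illustrative)
    {"accesses": 0.3, "monitoring": 0.2, "staff": 0.3},
    {"accesses": 0.7, "monitoring": 0.6, "staff": 0.7},
]
plant = {"accesses": 0.8, "monitoring": 0.5, "staff": 0.4}
print(mr_sort(plant, profiles, weights, lam=0.6))  # -> 1 ("medium")
```

An optimizer would then search over combinations of protective actions (each shifting the criteria values) for the assignment minimizing overall vulnerability.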

3.
Vulnerability of human beings exposed to a catastrophic disaster is affected by multiple factors that include hazard intensity, environment, and individual characteristics. The traditional approach to vulnerability assessment, based on the aggregate-area method and unsupervised learning, cannot incorporate spatial information; thus, vulnerability can be only roughly assessed. In this article, we propose Bayesian network (BN) and spatial analysis techniques to mine spatial data sets to evaluate the vulnerability of human beings. In our approach, spatial analysis is leveraged to preprocess the data; for example, kernel density analysis (KDA) and accumulative road cost surface modeling (ARCSM) are employed to quantify the influence of geofeatures on vulnerability and relate such influence to spatial distance. The knowledge- and data-based BN provides a consistent platform to integrate a variety of factors, including those extracted by KDA and ARCSM, to model vulnerability uncertainty. We also consider the model's uncertainty and use the Bayesian model average and Occam's Window to average the multiple models obtained by our approach for robust prediction of risk and vulnerability. We compare our approach with other probabilistic models in a case study of seismic risk and conclude that our approach is a good means of mining spatial data sets to evaluate vulnerability.
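The kernel density analysis step can be illustrated in a few lines: estimate the spatial density of hazard-related geofeatures and evaluate it at the sites being assessed. The coordinates, bandwidth, and interpretation are hypothetical stand-ins; the article's KDA operates on real geofeature layers in a GIS.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Hypothetical coordinates of hazard-related geofeatures (e.g., sampled
# fault-trace points) and of populated sites to be assessed.
features = rng.uniform(0, 10, size=(2, 200))   # shape (2, n): x and y rows
sites = np.array([[2.0, 5.0, 8.5],
                  [3.0, 5.0, 1.0]])            # three sites, same layout

kde = gaussian_kde(features, bw_method=0.3)    # kernel density estimate
influence = kde(sites)                         # density at each site
print(influence)  # higher density -> stronger geofeature influence
```

Each site's density value would then enter the BN as one (discretized) input node alongside the ARCSM-derived accessibility measures.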

4.
In the nuclear power industry, Level 3 probabilistic risk assessment (PRA) is used to estimate damage to public health and the environment if a severe accident leads to large radiological release. Current Level 3 PRA does not explicitly include social factors and, therefore, it is not possible to perform importance ranking of social factors for risk-informing emergency preparedness, planning, and response (EPPR). This article offers a methodology for adapting the concept of social vulnerability, commonly used in natural hazard research, to the context of a severe nuclear power plant accident. The methodology has four steps: (1) calculating a hazard-independent social vulnerability index for the local population; (2) developing a location-specific representation of the maximum radiological hazard estimated from current Level 3 PRA, in a geographic information system (GIS) environment; (3) developing a GIS-based socio-technical risk map by combining the social vulnerability index and the location-specific radiological hazard; and (4) conducting a risk importance measure analysis to rank the criticality of social factors based on their contribution to the socio-technical risk. The methodology is applied using results from the 2012 Surry Power Station state-of-the-art reactor consequence analysis. A radiological hazard model is generated from the MELCOR Accident Consequence Code System, translated into a GIS environment, and combined with the Centers for Disease Control and Prevention social vulnerability index (SVI). This research creates an opportunity to explicitly consider and rank the criticality of location-specific SVI themes based on their influence on risk, providing input for EPPR.
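Step (3), combining the two layers, can be sketched for two gridded layers on a common raster. The min-max normalization and multiplicative composition below are assumptions for illustration, not necessarily the article's exact combination rule.

```python
import numpy as np

def socio_technical_risk(hazard, svi):
    """Combine a gridded radiological hazard field with a social
    vulnerability index defined on the same grid. Both layers are
    min-max normalized to [0, 1] (assumes non-constant layers);
    the cellwise product is one simple composition rule."""
    norm = lambda a: (a - a.min()) / (a.max() - a.min())
    return norm(hazard) * norm(svi)

hazard = np.array([[1.0, 4.0], [2.0, 8.0]])  # e.g., max dose per cell
svi    = np.array([[0.2, 0.9], [0.5, 0.4]])  # SVI score per cell
print(socio_technical_risk(hazard, svi))
```

Step (4) would then perturb individual SVI themes and re-evaluate this map to rank each theme's contribution to risk.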

5.
Yacov Y. Haimes, Risk Analysis, 2011, 31(8): 1175-1186
This article highlights the complexity of the quantification of the multidimensional risk function, develops five systems-based premises on quantifying the risk of terrorism to a threatened system, and advocates the quantification of vulnerability and resilience through the states of the system. The five premises are: (i) There exists interdependence between a specific threat to a system by terrorist networks and the states of the targeted system, as represented through the system's vulnerability, resilience, and criticality-impact. (ii) A specific threat, its probability, its timing, the states of the targeted system, and the probability of consequences can be interdependent. (iii) The two questions in the risk assessment process, “What is the likelihood?” and “What are the consequences?”, can be interdependent. (iv) Risk management policy options can reduce both the likelihood of a threat to a targeted system and the associated likelihood of consequences by changing the states (including both vulnerability and resilience) of the system. (v) The quantification of risk to a vulnerable system from a specific threat must be built on a systemic and repeatable modeling process, recognizing that the states of the system constitute an essential step in constructing quantitative metrics of the consequences based on intelligence gathering, expert evidence, and other qualitative information. Because the states of all systems are functions of time (among other variables), the time frame is pivotal in each component of the process of risk assessment, management, and communication. Thus, the risk to a system caused by an initiating event (e.g., a threat) is a multidimensional function of the specific threat, its probability and time frame, the states of the system (representing vulnerability and resilience), and the probabilistic multidimensional consequences.

6.
Point source pollution is one of the main threats to regional environmental health. Based on a water quality model, a methodology to assess the regional risk of point source pollution is proposed. The assessment procedure includes five parts: (1) identifying risk source units and estimating source emissions using Monte Carlo algorithms; (2) gathering hydrological and water quality data for the assessed area and evaluating the selected water quality model; (3) screening the assessment endpoints and analyzing receptor vulnerability with the Choquet fuzzy integral algorithm; (4) using the water quality model introduced in the second step to predict pollutant concentrations under various source emission scenarios and analyzing the hazards of risk sources; and finally, (5) using the source hazard values and receptor vulnerability scores to estimate overall regional risk. The proposed method, based on the Water Quality Analysis Simulation Program (WASP), was applied in the region of the Taipu River, which is in the Taihu Basin, China. Results of the source hazard and receptor vulnerability analysis allowed us to describe aquatic ecological, human health, and socioeconomic risks individually, as well as integrated risks in the Taipu region, from a series of risk curves. Risk contributions of sources to receptors were ranked, and the spatial distribution of risk levels was presented. By changing the input conditions, we were able to estimate risks for a range of scenarios. Thus, the proposed procedure may also be used by decision makers for long-term dynamic risk prediction.
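Steps (1), (4), and (5) can be caricatured in a few lines: draw emissions from assumed distributions, push them through a stand-in transfer model (in place of a full WASP run), and combine the resulting exceedance probabilities with a vulnerability score. All source names, distributions, coefficients, and thresholds here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # Monte Carlo draws

# Hypothetical emission distributions for two point sources (kg/day).
emissions = {
    "plant_A": rng.lognormal(mean=2.0, sigma=0.5, size=N),
    "plant_B": rng.lognormal(mean=1.5, sigma=0.8, size=N),
}

# Stand-in for the water quality model: a linear transfer coefficient
# mapping emission to concentration at the receptor (WASP would be run here).
transfer = {"plant_A": 0.012, "plant_B": 0.030}
threshold = 0.1          # concentration above which the receptor is harmed
vulnerability = 0.6      # receptor score from the Choquet fuzzy integral

# Hazard: probability that a source pushes the receptor over the threshold.
hazard = {s: np.mean(transfer[s] * e > threshold) for s, e in emissions.items()}
risk = {s: h * vulnerability for s, h in hazard.items()}
print(hazard)
print(risk)
```

Repeating this over many receptors and emission scenarios yields the risk curves and source rankings the abstract describes.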

7.
Major accident risks posed by chemical hazards have raised major social concerns in today's China. Land-use planning has been adopted by many countries as one of the essential elements of accident prevention. This article proposes a method to assess major accident risks in support of land-use planning in the vicinity of chemical installations. The method is based on the definition of risk from the Accidental Risk Assessment Methodology for IndustrieS (ARAMIS) project and extends the application of its severity and vulnerability assessment tools. The severity and vulnerability indexes from the ARAMIS methodology are employed to assess the severity and vulnerability levels, respectively. A risk matrix is devised to support risk ranking and compatibility checking. The method consists of four main steps, and its results are presented as geographical information system (GIS)-based maps. As an illustration, the proposed method is applied to the Dagushan Peninsula, China. The case study indicated that the method could not only aid risk regulation of existing land-use plans, but also support future land-use planning by offering alternatives or influencing plans at the development stage, and thus further enhance the role and influence of land-use planning in accident prevention activities in China.
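A compatibility-checking risk matrix of the kind described reduces to a simple lookup. The matrix size, index levels, and the compatibility scale below are placeholders, not the ARAMIS values.

```python
# Rows: severity level (1 = low .. 4 = high); columns: vulnerability level.
# Cell values on an assumed compatibility scale: 0 = compatible land use,
# 1 = conditionally compatible, 2 = incompatible.
RISK_MATRIX = [
    [0, 0, 0, 1],
    [0, 0, 1, 1],
    [0, 1, 1, 2],
    [1, 1, 2, 2],
]

def compatibility(severity: int, vulnerability: int) -> int:
    """Look up the risk-matrix cell for 1-based index levels."""
    return RISK_MATRIX[severity - 1][vulnerability - 1]

# A planned residential zone near an installation:
print(compatibility(severity=3, vulnerability=4))  # -> 2: incompatible
```

Evaluating this lookup cell by cell over the severity and vulnerability rasters produces the GIS compatibility map.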

8.
Millions of low‐income people of diverse ethnicities inhabit stressful old urban industrial neighborhoods. Yet we know little about the health impacts of built‐environment stressors and risk perceptions in such settings; we lack even basic health profiles. Difficult access is one reason (it took us 30 months to survey 80 households); the lack of multifaceted survey tools is another. We designed and implemented a pilot vulnerability assessment tool in Worcester, Massachusetts. We answer: (1) How can we assess vulnerability to multiple stressors? (2) What is the nature of complex vulnerability—including risk perceptions and health profiles? (3) How can findings be used by our wider community, and what lessons did we learn? (4) What implications arise for science and policy? We sought a holistic picture of neighborhood life. A reasonably representative sample of 80 respondents captured data for 254 people about: demographics, community concerns and resources, time‐activity patterns, health information, risk/stress perceptions, and resources/capacities for coping. Our key findings derive partly from the survey data and partly from our experience in obtaining those data. Data strongly suggest complex vulnerability dominated by psychosocial stress. Unexpected significant gender and ethnic disease disparities emerged: notably, females have twice the disease burden of males, and white females twice the burden of females of color (p < 0.01). Self‐reported depression differentiated by gender and age is illustrative. Community based participatory research (CBPR) approaches require active engagement with marginalized populations, including representatives as funded partners. Complex vulnerability necessitates holistic, participatory approaches to improve scientific understanding and societal responses.

9.
Potential climate-change-related impacts to agriculture in the upper Midwest pose serious economic and ecological risks to the U.S. and the global economy. On a local level, farmers are at the forefront of responding to the impacts of climate change. Hence, it is important to understand how farmers and their farm operations may be more or less vulnerable to changes in the climate. A vulnerability index is a tool commonly used by researchers and practitioners to represent the geographical distribution of vulnerability in response to global change. Most vulnerability assessments measure objective adaptive capacity using secondary data collected by governmental agencies. However, other scholarship on human behavior has noted that sociocultural and cognitive factors, such as risk perceptions and perceived capacity, are consequential for modulating people's actual vulnerability. Thus, traditional assessments can potentially overlook people's subjective perceptions of changes in climate and extreme weather events, and the extent to which people feel prepared to take the necessary steps to cope with and respond to the negative effects of climate change. This article addresses this knowledge gap by: (1) incorporating perceived adaptive capacity into a vulnerability assessment; (2) using spatial smoothing to aggregate individual-level vulnerabilities to the county level; and (3) evaluating the relationships among different dimensions of adaptive capacity to examine whether perceived capacity should be integrated into vulnerability assessments. The results suggest that vulnerability assessments relying only on objective measures might miss important sociocognitive dimensions of capacity. The vulnerability indices and maps presented in this article can inform engagement strategies for improving environmental sustainability in the region.
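Step (2), the spatial smoothing, might look like the following kernel-weighted aggregation of survey-derived scores to county centroids. The coordinates, bandwidth, and choice of a Gaussian kernel are illustrative assumptions, not the article's exact smoother.

```python
import numpy as np

def smooth_to_county(farm_xy, farm_vuln, county_xy, bandwidth=25.0):
    """Aggregate individual (farm-level) vulnerability scores to county
    centroids with a Gaussian kernel-weighted average, so that nearby
    responses count more. Coordinates in km; bandwidth is illustrative."""
    county_scores = []
    for cx, cy in county_xy:
        d2 = (farm_xy[:, 0] - cx) ** 2 + (farm_xy[:, 1] - cy) ** 2
        w = np.exp(-d2 / (2 * bandwidth ** 2))
        county_scores.append(np.sum(w * farm_vuln) / np.sum(w))
    return np.array(county_scores)

rng = np.random.default_rng(1)
farm_xy = rng.uniform(0, 100, size=(300, 2))   # surveyed farm locations
farm_vuln = rng.uniform(0, 1, size=300)        # per-farm vulnerability index
county_xy = [(25, 25), (75, 75)]               # county centroids
print(smooth_to_county(farm_xy, farm_vuln, county_xy))
```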

10.
Risk Analysis for Critical Asset Protection
This article proposes a quantitative risk assessment and management framework that supports strategic asset-level resource allocation decision making for critical infrastructure and key resource protection. The proposed framework consists of five phases: scenario identification, consequence and criticality assessment, security vulnerability assessment, threat likelihood assessment, and benefit-cost analysis. Key innovations in this methodology include its initial focus on fundamental asset characteristics to generate an exhaustive set of plausible threat scenarios based on a target susceptibility matrix (which we refer to as asset-driven analysis) and an approach to threat likelihood assessment that captures adversary tendencies to shift their preferences in response to security investments based on the expected utilities of alternative attack profiles assessed from the adversary perspective. A notional example is provided to demonstrate an application of the proposed framework. Extensions of this model to support strategic portfolio-level analysis and tactical risk analysis are suggested.
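The adversary-response idea in the threat likelihood phase can be sketched with a simple choice model: compute the expected utility of each attack profile from the adversary's perspective, then map utilities to relative likelihoods. The softmax choice rule, success probabilities, and payoffs below are assumptions for illustration, not the framework's actual elicitation.

```python
import numpy as np

def attack_probabilities(success_prob, payoff, temperature=1.0):
    """Turn adversary expected utilities of alternative attack profiles
    into relative threat likelihoods with a softmax (logit) choice rule --
    one common way to model adversaries shifting preferences as security
    investments change their expected utilities."""
    eu = np.asarray(success_prob, dtype=float) * np.asarray(payoff, dtype=float)
    z = np.exp(eu / temperature)
    return z / z.sum()

# Before hardening asset 0: profiles target (asset 0, asset 1, asset 2).
p_before = attack_probabilities([0.8, 0.5, 0.3], [10, 8, 9])
# After an investment that cuts the success probability on asset 0:
p_after = attack_probabilities([0.3, 0.5, 0.3], [10, 8, 9])
print(p_before.round(3))
print(p_after.round(3))  # likelihood shifts toward the other targets
```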

11.
Shiga-toxin producing Escherichia coli (STEC) strains may cause human infections ranging from simple diarrhea to haemolytic uremic syndrome (HUS). The five main pathogenic serotypes of STEC (MPS-STEC) identified thus far in Europe are O157:H7, O26:H11, O103:H2, O111:H8, and O145:H28. Because STEC strains can survive or grow during cheese making, particularly in soft cheeses, a stochastic quantitative microbial risk assessment model was developed to assess the risk of HUS associated with the five MPS-STEC in raw milk soft cheeses. A baseline scenario represents a theoretical worst-case scenario in which no intervention is considered throughout the farm-to-fork continuum; the risk level assessed with this baseline scenario is the risk-based level. The impact of seven preharvest scenarios (vaccines, probiotic, milk farm sorting) on the risk-based level was expressed in terms of risk reduction. The impact of the preharvest interventions ranges from 76% to 98% risk reduction, with the highest values predicted for scenarios combining a decrease in the number of cows shedding STEC with a decrease in the STEC concentration in feces. The impact of postharvest interventions on the risk-based level was also tested by applying five microbiological criteria (MC) at the end of ripening. The five MCs differ in terms of sample size, the number of samples that may yield a value larger than the microbiological limit, and the analysis methods. The predicted risk reduction varies from 25% to 96% when MCs are applied without preharvest interventions, and from 1% to 96% with combinations of pre- and postharvest interventions.

12.
Dose-response models in microbial risk assessment consider two steps in the process ultimately leading to illness: from exposure to (asymptomatic) infection, and from infection to (symptomatic) illness. Most data and theoretical approaches are available for the exposure-infection step; the infection-illness step has received less attention. Furthermore, current microbial risk assessment models do not account for acquired immunity. These limitations may lead to biased risk estimates. We consider the effects on risk estimates of both the dose dependency of the conditional probability of illness given infection and acquired immunity, and demonstrate these effects in a case study on exposure to Campylobacter jejuni. To account for acquired immunity in risk estimates, an inflation factor is proposed; the inflation factor depends on the rate of loss of protection relative to the rate of exposure. The conditional probability of illness given infection is based on a previously published model accounting for the within-host dynamics of illness. We find that at low (average) doses, the infection-illness model has the greatest impact on risk estimates, whereas at higher (average) doses and/or increased exposure frequencies, the acquired immunity model has the greatest impact. The proposed models are strongly nonlinear: reducing exposure is not expected to lead to a proportional decrease in risk and, under certain conditions, may even lead to an increase in risk. The impact of different dose-response models on risk estimates is particularly pronounced when heterogeneity is introduced into the population exposure distribution.
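To see how these pieces interact, here is a toy composition of an exponential infection model, a dose-dependent conditional illness probability, and a crude immunity correction. Every functional form and parameter below is a placeholder (including the direction and size of the immunity adjustment), chosen only to reproduce the qualitative nonlinearity the abstract describes, not the article's fitted Campylobacter model.

```python
import numpy as np

def p_inf(dose, r=0.005):
    # Exponential dose-response for infection (parameter illustrative).
    return 1.0 - np.exp(-r * dose)

def p_ill_given_inf(dose, a=1.0, b=0.05):
    # Dose-dependent conditional illness probability: a decreasing
    # placeholder form, not the published within-host model.
    return (1.0 + b * dose) ** -a

def risk_per_exposure(dose, immunity_correction=2.0):
    # The constant correction stands in for the article's inflation
    # factor, which depends on the rate of loss of protection relative
    # to the rate of exposure.
    return p_inf(dose) * p_ill_given_inf(dose) / immunity_correction

for d in (10, 100, 1000):
    print(d, round(risk_per_exposure(d), 4))
# Note the nonmonotonic dose-risk pattern: lowering a high dose can
# *increase* the per-exposure risk, as the abstract warns.
```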

13.
Between 1996 and 1999, five mining subsidence events occurred in the iron-ore field in Lorraine, France, and damaged several hundred buildings. Because thousands of hectares are undermined, an assessment of the vulnerability of buildings and land is necessary for risk management. Risk assessment methods have evolved from the initial risk management decisions taken immediately after the mining subsidence to the risk assessment studies currently under consideration. These changes reveal much about the complexity of the vulnerability concept and about the difficulties in developing simple and relevant methods for its assessment. The objective of this article is to present this process, suggest improvements on the basis of theoretical definitions of vulnerability, and give an operational example of vulnerability assessment in the seismic field. Vulnerability is divided into three components: weakness, stakes value, and resilience. The final improvements take these three components into account and constitute an original method for assessing the vulnerability of a city to subsidence.

14.
In nonlinear panel data models, the incidental parameter problem remains a challenge to econometricians. Available solutions are often based on ingenious, model‐specific methods. In this paper, we propose a systematic approach to construct moment restrictions on common parameters that are free from the individual fixed effects. This is done by an orthogonal projection that differences out the unknown distribution function of individual effects. Our method applies generally in likelihood models with continuous dependent variables where a condition of non‐surjectivity holds. The resulting method‐of‐moments estimators are root‐N consistent (for fixed T) and asymptotically normal, under regularity conditions that we spell out. Several examples and a small‐scale simulation exercise complete the paper.

15.
Toxoplasma gondii is a protozoan parasite that is responsible for approximately 24% of deaths attributed to foodborne pathogens in the United States. It is thought that a substantial portion of human T. gondii infections is acquired through the consumption of meats. The dose-response relationship for human exposure to T. gondii-infected meat is unknown because no human data are available. The goal of this study was to develop and validate dose-response models based on animal studies, and to compute scaling factors so that animal-derived models can predict T. gondii infection in humans. Relevant studies in the literature were collected, and appropriate studies were selected based on animal species, stage, genotype of T. gondii, and route of infection. Data were pooled and fitted to four sigmoidal-shaped mathematical models, and model parameters were estimated using maximum likelihood estimation. Data from a mouse study were selected to develop the dose-response relationship. The exponential and beta-Poisson models, which predicted similar responses, were selected as reasonable dose-response models based on their simplicity, biological plausibility, and goodness of fit. A confidence interval for the parameter was determined by constructing 10,000 bootstrap samples. Scaling factors were computed by matching the predicted infection cases with the epidemiological data. Mouse-derived models were validated against data for the dose-infection relationship in rats. A human dose-response model was developed as P(d) = 1 − exp(−0.0015 × 0.005 × d) or P(d) = 1 − (1 + d × 0.003/582.414)^(−1.479). Both models predict the human response after consuming T. gondii-infected meats, and provide an enhanced risk characterization in a quantitative microbial risk assessment model for this pathogen.
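The two fitted human models quoted above translate directly into code. The constants are copied verbatim from the abstract's formulas; the example doses are arbitrary and serve only to compare the two curves.

```python
import numpy as np

def p_exponential(dose):
    # Exponential form quoted in the abstract:
    # P(d) = 1 - exp(-0.0015 * 0.005 * d)
    return 1.0 - np.exp(-0.0015 * 0.005 * dose)

def p_beta_poisson(dose):
    # Approximate beta-Poisson form quoted in the abstract:
    # P(d) = 1 - (1 + d * 0.003 / 582.414) ** (-1.479)
    return 1.0 - (1.0 + dose * 0.003 / 582.414) ** (-1.479)

for d in (1e2, 1e4, 1e6):  # example doses (arbitrary, for comparison only)
    print(f"{d:>9.0f}  exponential: {p_exponential(d):.5f}  "
          f"beta-Poisson: {p_beta_poisson(d):.5f}")
```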

16.
The Monte Carlo (MC) simulation approach is traditionally used in food safety risk assessment to study quantitative microbial risk assessment (QMRA) models. When experimental data are available, performing Bayesian inference is a good alternative approach that allows backward calculation in a stochastic QMRA model to update the experts' knowledge about the microbial dynamics of a given food-borne pathogen. In this article, we propose a complex example where Bayesian inference is applied to a high-dimensional second-order QMRA model. The case study is a farm-to-fork QMRA model considering the genetic diversity of Bacillus cereus in a cooked, pasteurized, and chilled courgette purée. The experimental data are Bacillus cereus concentrations measured in packages of courgette purée stored under different time-temperature profiles after pasteurization. To perform Bayesian inference, we first built an augmented Bayesian network by linking a second-order QMRA model to the available contamination data. We then ran a Markov chain Monte Carlo (MCMC) algorithm to update all the unknown concentrations and unknown quantities of the augmented model. About 25% of the prior beliefs are strongly updated, leading to a reduction in uncertainty. Interestingly, some of the updates call aspects of the QMRA model into question.
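The mechanics of the MCMC updating step can be shown on a single unknown. The data, forward growth model, prior, and random-walk Metropolis sampler below are all illustrative stand-ins for the article's augmented-network inference, which updates many nodes at once.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative observed log10 B. cereus concentrations, and a toy forward
# model: log-concentration after storage = initial level + mu * time.
obs = np.array([2.1, 2.9, 3.8, 4.6])
times = np.array([1.0, 2.0, 3.0, 4.0])
log_n0, sigma = 1.0, 0.3            # assumed initial level, measurement sd

def log_post(mu):
    pred = log_n0 + mu * times
    loglik = -0.5 * np.sum(((obs - pred) / sigma) ** 2)
    logprior = -0.5 * ((mu - 0.5) / 0.5) ** 2  # expert prior: mu ~ N(0.5, 0.5)
    return loglik + logprior

# Random-walk Metropolis over the growth rate mu.
mu, chain = 0.5, []
for _ in range(20_000):
    prop = mu + rng.normal(0, 0.05)
    if np.log(rng.uniform()) < log_post(prop) - log_post(mu):
        mu = prop
    chain.append(mu)
post = np.array(chain[5_000:])      # discard burn-in
print(post.mean(), post.std())      # updated belief about mu
```

The posterior mean moves away from the prior toward the value the data support, which is exactly the kind of "strong update" the abstract reports for about a quarter of the prior beliefs.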

17.
This article studies a general type of initiating event in critical infrastructures, called spatially localized failures (SLFs), defined as the failure of a set of infrastructure components distributed in a spatially localized area due to damage sustained, while components outside the area do not directly fail. These failures can be regarded as a special type of intentional attack, such as a bomb or explosive assault, or as a generalized model of the impact of localized natural hazards on large-scale systems. This article introduces three SLFs models: node-centered SLFs, district-based SLFs, and circle-shaped SLFs, and proposes an SLFs-induced vulnerability analysis method covering three aspects: identification of critical locations; comparison of infrastructure vulnerability to random failures, topologically localized failures, and SLFs; and quantification of infrastructure information value. The proposed SLFs-induced vulnerability analysis method is finally applied to the Chinese railway system, and can also be easily adapted to analyze other critical infrastructures for valuable protection suggestions.
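A circle-shaped SLF is easy to prototype on a spatial network: remove every node inside a circle around the failure center and measure what survives. The toy lattice, the radius, and the giant-component metric below are assumptions; the article works with the real railway topology and its own vulnerability measures.

```python
import math
import networkx as nx

def circle_shaped_slf(graph, pos, center, radius):
    """Remove every node whose coordinates fall inside a circle of the
    given radius around the failure center, then report the surviving
    share of the largest connected component as a vulnerability proxy."""
    failed = [n for n, (x, y) in pos.items()
              if math.hypot(x - center[0], y - center[1]) <= radius]
    g = graph.copy()
    g.remove_nodes_from(failed)
    if g.number_of_nodes() == 0:
        return failed, 0.0
    giant = max(nx.connected_components(g), key=len)
    return failed, len(giant) / graph.number_of_nodes()

# Toy railway-like grid: nodes on a lattice with nearest-neighbor links.
G = nx.grid_2d_graph(10, 10)
pos = {n: n for n in G.nodes}    # use the lattice coordinates directly
failed, surviving = circle_shaped_slf(G, pos, center=(5, 5), radius=2.5)
print(len(failed), surviving)
```

Sweeping the center over the map and recording the damage at each position is one way to identify the critical locations the first analysis aspect refers to.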

18.
Dependence assessment among human errors in human reliability analysis (HRA) is an important issue. Many of the dependence assessment methods in HRA rely heavily on expert opinion and are thus subjective and sometimes inconsistent. In this article, we propose a computational model based on the Dempster-Shafer evidence theory (DSET) and the analytic hierarchy process (AHP) method to handle dependence in HRA. First, the factors influencing dependence among human tasks are identified, and the weights of the factors are determined by experts using the AHP method. Second, a judgment on each factor is given by the analyst with reference to anchors and linguistic labels. Third, the judgments are represented as basic belief assignments (BBAs) and are integrated into a fused BBA by weighted average combination in DSET. Finally, the conditional human error probability (CHEP) is calculated from the fused BBA. The proposed model can deal with ambiguity and the degree of confidence in the judgments, and is able to reduce subjectivity and improve consistency in the evaluation process.
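The third step, the weighted average combination, is straightforward to sketch. The factor names, linguistic frame, masses, and AHP weights below are invented for illustration; only the fusion rule itself follows the abstract.

```python
def weighted_average_bba(bbas, weights):
    """Fuse factor-level basic belief assignments by the AHP-weighted
    average: m(A) = sum_i w_i * m_i(A). Keys are frozensets over the
    dependence-level frame; weights are assumed to sum to 1."""
    fused = {}
    for m, w in zip(bbas, weights):
        for focal, mass in m.items():
            fused[focal] = fused.get(focal, 0.0) + w * mass
    return fused

frame = frozenset({"low", "medium", "high"})
# Analyst judgments on two hypothetical dependence factors, with residual
# ignorance assigned to the whole frame:
m_time = {frozenset({"high"}): 0.7, frame: 0.3}  # closeness in time
m_crew = {frozenset({"low"}): 0.5, frozenset({"medium"}): 0.3, frame: 0.2}
weights = [0.6, 0.4]                             # from the AHP comparisons
fused = weighted_average_bba([m_time, m_crew], weights)
print(fused)  # the CHEP is then derived from this fused BBA
```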

19.
Cluster-based segmentation usually involves two sets of variables: (i) the needs-based variables (referred to as the bases variables), which are used to develop the original segments that identify value, and (ii) the classification or background variables, which are used to profile or target the customers. The managers' goal is to utilize these two sets of variables in the most efficient manner. Pragmatic managerial interests recognize the need to shift away from methodologies that obtain highly precise value-based segments but may be of limited practical use because they provide less targetable segments. Consequently, the imperative is to move toward newer segmentation approaches that focus on targetable segments while maintaining homogeneity. This requires dual-objective segmentation, which is a combinatorially difficult problem. Hence, we propose and examine a new evolutionary methodology based on genetic algorithms to address this problem. We show, based on a large-scale Monte Carlo simulation and a case study, that the proposed approach consistently outperforms existing methods for a wide variety of problem instances. We obtain statistically significant and managerially important improvements in targetability with little diminution in the identifiability of value-based segments. Moreover, the proposed methodology provides a set of good solutions, unlike existing methodologies that provide a single solution. We also show how these good solutions can be used to plot an efficient Pareto frontier. Finally, we present useful insights that will help managers implement the proposed solution approach effectively.
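The dual-objective evaluation inside such a genetic algorithm can be sketched as a fitness function scoring one candidate segmentation (one chromosome) on both identifiability and targetability. The scalarization, the within-sum-of-squares proxies, and the synthetic data are stand-ins for the article's actual GA internals.

```python
import numpy as np

def dual_objective_fitness(labels, bases, background, alpha=0.5):
    """Score a candidate segmentation on (i) identifiability: within-segment
    homogeneity of the needs-based (bases) variables, and (ii) targetability:
    homogeneity of the classification/background variables within segments.
    alpha trades the two off; sweeping alpha over the set of good solutions
    traces an approximate Pareto frontier."""
    def homogeneity(X):
        within = sum(np.sum((X[labels == k] - X[labels == k].mean(axis=0)) ** 2)
                     for k in np.unique(labels))
        total = np.sum((X - X.mean(axis=0)) ** 2)
        return 1.0 - within / total        # 1 = perfectly tight segments
    ident, target = homogeneity(bases), homogeneity(background)
    return alpha * ident + (1 - alpha) * target, ident, target

rng = np.random.default_rng(3)
bases = rng.normal(size=(200, 4))       # needs-based variables
background = rng.normal(size=(200, 3))  # demographic/background variables
labels = rng.integers(0, 3, size=200)   # one chromosome's segment assignment
print(dual_objective_fitness(labels, bases, background))
```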

20.
This article presents a regression-tree-based meta-analysis of rodent pulmonary toxicity studies of uncoated, nonfunctionalized carbon nanotube (CNT) exposure. The resulting analysis provides quantitative estimates of the contribution of CNT attributes (impurities, physical dimensions, and aggregation) to pulmonary toxicity indicators in bronchoalveolar lavage fluid: neutrophil and macrophage counts, and lactate dehydrogenase and total protein concentrations. The method employs classification and regression tree (CART) models, techniques that are relatively insensitive to the data defects that impair other types of regression analysis: high dimensionality, nonlinearity, correlated variables, and significant quantities of missing values. Three types of analysis are presented: the regression tree (RT), the random forest (RF), and a random-forest-based dose-response model. The RT shows the best single model supported by all the data and typically contains a small number of variables. The RF shows how much variance reduction is associated with every variable in the data set. The dose-response model is used to isolate the effects of CNT attributes from the CNT dose, showing the shift in the dose-response relationship caused by each attribute across the measured range of CNT doses. The CNT attributes contributing most to pulmonary toxicity were metallic impurities (cobalt significantly increased observed toxicity, while other impurities had mixed effects), CNT length (negatively correlated with most toxicity indicators), CNT diameter (significantly positively associated with toxicity), and aggregate size (negatively correlated with cell damage indicators and positively correlated with immune response indicators). Increasing CNT N2-BET-specific surface area decreased the toxicity indicators.
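A compact version of the RT/RF workflow on synthetic data might look as follows. The attribute names, the data-generating process, and the dose grid are invented for illustration, and scikit-learn stands in for whatever implementation the authors used.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(5)
n = 300
# Hypothetical study attributes: dose, cobalt impurity (%), length (um),
# diameter (nm), aggregate size (um).
X = np.column_stack([
    rng.uniform(0, 100, n), rng.uniform(0, 5, n),
    rng.uniform(0.1, 20, n), rng.uniform(1, 80, n), rng.uniform(0.1, 5, n),
])
# Synthetic toxicity indicator (e.g., neutrophil count), illustration only:
y = 0.05 * X[:, 0] + 0.8 * X[:, 1] - 0.1 * X[:, 2] + rng.normal(0, 1, n)

tree = DecisionTreeRegressor(max_depth=3).fit(X, y)      # the "RT" analysis
forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

names = ["dose", "cobalt", "length", "diameter", "aggregate"]
print(dict(zip(names, forest.feature_importances_.round(3))))  # "RF" analysis

# Dose-response isolation: vary dose on a grid with the other attributes
# fixed, and read off the shift caused by changing one attribute.
grid = np.linspace(0, 100, 5)
base = [np.full(5, 10.0), np.full(5, 40.0), np.full(5, 1.0)]
lo = forest.predict(np.column_stack([grid, np.full(5, 0.5), *base]))
hi = forest.predict(np.column_stack([grid, np.full(5, 4.0), *base]))
print((hi - lo).round(2))   # shift attributable to cobalt impurity
```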
