Similar Documents
20 similar documents found (search time: 328 ms)
1.
Hybrid Approach for Addressing Uncertainty in Risk Assessments   (Cited by: 3; self-citations: 0; external citations: 3)
Parameter uncertainty is a major aspect of the model-based estimation of the risk of human exposure to pollutants. The Monte Carlo method, which applies probability theory to address model parameter uncertainty, relies on a statistical representation of available information. In recent years, other uncertainty theories have been proposed as alternative approaches to address model parameter uncertainty in situations where available information is insufficient to identify statistically representative probability distributions, due in particular to data scarcity. The simplest such theory is possibility theory, which uses so-called fuzzy numbers to represent model parameter uncertainty. In practice, it may occur that certain model parameters can be reasonably represented by probability distributions, because there are sufficient data available to substantiate such distributions by statistical analysis, while others are better represented by fuzzy numbers (due to data scarcity). The question then arises as to how these two modes of representation of model parameter uncertainty can be combined for the purpose of estimating the risk of exposure. This paper proposes an approach (termed a hybrid approach) which combines Monte Carlo random sampling of probability distribution functions with fuzzy calculus. The approach is applied to a real case of estimation of human exposure, via vegetable consumption, to cadmium present in the surficial soils of an industrial site located in the north of France. The application illustrates the potential of the proposed approach, which allows the uncertainty affecting model parameters to be represented in a way that is consistent with the information at hand. Also, because the hybrid approach takes advantage of the “rich” information provided by probability distributions, while retaining the conservative character of fuzzy calculus, it is believed to hold value in terms of a “reasonable” application of the precautionary principle.
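To make the hybrid scheme concrete, here is a minimal Python sketch, not the authors' implementation: the exposure model and all parameter values (lognormal soil concentration, normal body weight, triangular fuzzy intake) are illustrative assumptions. Probabilistic inputs are sampled by Monte Carlo, while the fuzzy input is propagated through alpha-cuts, so each random draw yields a fuzzy interval of exposure rather than a single number.

```python
import numpy as np

rng = np.random.default_rng(42)

def exposure(conc, intake, bw):
    # Toy dose model: soil concentration x vegetable intake / body weight.
    return conc * intake / bw

def tri_alpha_cut(a, m, b, alpha):
    # Interval of a triangular fuzzy number (a, m, b) at membership level alpha.
    return a + alpha * (m - a), b - alpha * (b - m)

n_mc, alphas = 5000, np.linspace(0.0, 1.0, 11)
lo = np.empty((n_mc, alphas.size))
hi = np.empty((n_mc, alphas.size))

for i in range(n_mc):
    # Probabilistic parameters: enough data to fit distributions (values assumed).
    conc = rng.lognormal(mean=0.0, sigma=0.5)   # soil Cd, mg/kg
    bw = rng.normal(70.0, 10.0)                 # body weight, kg
    for j, alpha in enumerate(alphas):
        # Fuzzy parameter: intake known only as "around 0.3, within [0.1, 0.6]".
        i_lo, i_hi = tri_alpha_cut(0.1, 0.3, 0.6, alpha)
        # Exposure is increasing in intake, so endpoints map to endpoints.
        lo[i, j] = exposure(conc, i_lo, bw)
        hi[i, j] = exposure(conc, i_hi, bw)

# Conservative summary at membership level 0: 95th percentile of the upper bounds.
print("95th pct of upper exposure bound:", np.percentile(hi[:, 0], 95))
```

The conservative flavor of fuzzy calculus shows up in the final line: reporting a high percentile of the upper bounds at membership level zero retains the worst case over the fuzzy parameter.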

2.
A Bayesian statistical approach for determining the parameter uncertainty of a storm-water treatment model is reported. The storm-water treatment technologies included a sand filter and a subsurface gravel wetland. The two field systems were loaded and monitored in a side-by-side fashion over a two-year period. The loading to each system was storm-water runoff generated by ambient rainfall on a commuter parking lot. Contaminant transport is simulated by using a one-dimensional advection-dispersion model. The unknown parameters of the model are the contaminant deposition rate and the hydrodynamic dispersion. The following contaminants are considered in the study: total suspended solids, total petroleum hydrocarbons–diesel range hydrocarbons, and zinc. Parameter uncertainties are addressed by estimating the posterior probability distributions through a conventional Metropolis-Hastings algorithm. Results indicate that the posterior distributions are unimodal and, in some instances, exhibit some level of skewness. The Bayesian approach allowed the estimation of the 10th, 25th, 50th, 75th, and 95th percentiles of the posterior probability distributions. The prediction capabilities of the model were explored by performing a Monte Carlo simulation using the calculated posterior distributions and two rainfall-runoff events not considered during the calibration phase. The objective is to estimate effluent concentrations from the treatment systems under different scenarios of flow and contaminant loads. In general, estimated effluent concentrations and the total estimated mass fell within the defined uncertainty limits.
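A compact sketch of the calibration step follows. The forward model is a toy transfer function standing in for the one-dimensional advection-dispersion solution, and the noise level, flat priors, and synthetic data are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(theta, t):
    # Toy stand-in for the advection-dispersion solution: effluent fraction
    # decays with deposition rate k and spreads with dispersion d.
    k, d = theta
    return np.exp(-k * t) * (1.0 + d * np.sqrt(t))

t_obs = np.linspace(0.5, 10, 20)
y_obs = model((0.3, 0.1), t_obs) + rng.normal(0, 0.02, t_obs.size)  # synthetic data

def log_post(theta):
    k, d = theta
    if k <= 0 or d <= 0:                 # flat priors on (0, inf)
        return -np.inf
    resid = y_obs - model(theta, t_obs)
    return -0.5 * np.sum((resid / 0.02) ** 2)

theta = np.array([0.5, 0.5])
lp = log_post(theta)
chain = []
for _ in range(20000):
    prop = theta + rng.normal(0, 0.02, 2)       # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:    # Metropolis accept/reject
        theta, lp = prop, lp_prop
    chain.append(theta)
chain = np.array(chain[5000:])                  # discard burn-in

# Posterior percentiles for (deposition rate, dispersion), as in the study.
print(np.percentile(chain, [10, 25, 50, 75, 95], axis=0))
```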

3.
Definition of a simplified model of scatter which can be incorporated in maximum likelihood reconstruction for single-photon emission tomography (SPET) continues to be appealing; however, implementation must be efficient for it to be clinically applicable. In this paper an efficient algorithm for scatter estimation is described in which the spatial scatter distribution is implemented as a spatially invariant convolution for points of constant depth in tissue. The scatter estimate is weighted by a space-dependent build-up factor based on the measured attenuation in tissue. Monte Carlo simulation of a realistic thorax phantom was used to validate this approach. Further efficiency was introduced by estimating scatter once, after a small number of iterations of the ordered subsets expectation maximisation (OSEM) reconstruction algorithm. The scatter estimate was incorporated as a constant term in subsequent iterations rather than being modified at each iteration. Monte Carlo simulation was used to demonstrate that the scatter estimate does not change significantly provided at least two iterations of OSEM reconstruction (subset size 8) are used. Complete scatter-corrected reconstruction of 64 projections of 40 × 128 pixels was achieved in 38 min using a Sun Sparc20 computer.
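The key algorithmic point, keeping the scatter estimate fixed as an additive term in the forward projection, can be shown with a toy maximum-likelihood (MLEM) loop. The system matrix, phantom, and scatter level below are illustrative stand-ins; a real OSEM implementation would additionally update over ordered subsets of projections.

```python
import numpy as np

rng = np.random.default_rng(1)

n_pix, n_proj = 32, 48
A = rng.uniform(0, 1, (n_proj, n_pix))   # toy system matrix (stand-in for SPET geometry)
x_true = np.zeros(n_pix)
x_true[10:20] = 1.0                      # simple activity distribution
scatter = 0.1 * np.ones(n_proj)          # scatter estimate, held constant across iterations
y = rng.poisson(A @ x_true + scatter)    # measured projections (Poisson counts)

x = np.ones(n_pix)
sens = A.sum(axis=0)                     # sensitivity image, A^T 1
for _ in range(20):
    fwd = A @ x + scatter                # forward projection plus additive scatter term
    x *= (A.T @ (y / fwd)) / sens        # MLEM multiplicative update
print(np.round(x, 2))
```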

4.
Total maximum daily load (TMDL) approaches that rely mostly on deterministic modeling have inherent problems with the consideration of a margin of safety and with estimating probabilities of excursions of water quality standards expressed in terms of magnitude, duration, and frequency. A tiered probabilistic TMDL approach is proposed in this paper. A simple data-based Tier I TMDL that uses statistical principles has been proposed for watersheds that have water quality databases adequate for statistical evaluation. Studies have shown that for many pollutants, event mean concentrations in runoff, wastewater loads, and concentrations in the receiving waters follow the log-normal probability distribution. Other probability distributions are also applicable. Tier II Monte Carlo simulation, using a simpler deterministic or black-box water quality model as a transfer function, can then be used to generate time series of data, which fills the data gaps and allows estimation of probabilities of excursions of chronic standards that are averaged over periods of 4 or 30 days. Statistical approaches, including Monte Carlo, allow replacement of an arbitrary margin of safety with a quantitative estimate of uncertainty and enable linking the model results to standards defined in terms of magnitude, frequency, and duration.
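A Tier II calculation of excursion probabilities reduces to a few lines once the distribution is fixed. The sketch below assumes log-normal daily concentrations with invented parameters and estimates the probability of exceeding a 4-day-average chronic standard; it is a schematic of the approach, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(2)

mu, sigma = np.log(8.0), 0.6        # log-space parameters (illustrative, e.g. TSS in mg/L)
standard, window = 20.0, 4          # chronic standard averaged over 4 days
n_sim, n_days = 2000, 365           # simulated years, days per year

daily = rng.lognormal(mu, sigma, (n_sim, n_days))
kernel = np.ones(window) / window

excursions = 0
for year in daily:
    avg4 = np.convolve(year, kernel, mode="valid")   # all 4-day moving averages
    excursions += np.any(avg4 > standard)            # any excursion this year?

print("P(at least one 4-day excursion per year) ≈", excursions / n_sim)
```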

5.
It is quite common to see experimental data analysed according to a variety of models of ligand-receptor interaction. Often, parameters derived from such models are compared statistically. The most commonly employed statistical analyses contain explicit assumptions about the underlying distributions of the model parameters being compared, yet the validity of these assumptions is not often ascertained. In this article, Arthur Christopoulos describes a general approach to Monte Carlo simulation of data, and outlines how the analysis of such simulated data may be used to address the question of the distribution of model parameters. The results of such an exercise can guide the researcher to the appropriate choice of statistical test or data transform.
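The procedure described can be sketched as follows: simulate noisy data from a known model, refit the model many times, and inspect the distribution of the fitted parameters. The one-site binding model, noise level, and concentrations below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import shapiro

rng = np.random.default_rng(3)

def binding(L, bmax, kd):
    # One-site saturation binding: B = Bmax * L / (Kd + L).
    return bmax * L / (kd + L)

L = np.array([0.1, 0.3, 1, 3, 10, 30, 100])   # ligand concentrations (nM)
bmax_true, kd_true, noise = 100.0, 3.0, 4.0

kd_hat = []
for _ in range(1000):
    y = binding(L, bmax_true, kd_true) + rng.normal(0, noise, L.size)
    popt, _ = curve_fit(binding, L, y, p0=(80, 1), bounds=(0, np.inf))
    kd_hat.append(popt[1])
kd_hat = np.array(kd_hat)

# Is Kd approximately normal, or is log(Kd) the better-behaved quantity?
print("Shapiro p (Kd):    ", shapiro(kd_hat).pvalue)
print("Shapiro p (log Kd):", shapiro(np.log(kd_hat)).pvalue)
```

In this setup log(Kd) is typically closer to normal than Kd itself, which is exactly the kind of finding that should guide the choice of statistical test or data transform.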

6.
This work is devoted to some recent developments in uncertainty analysis of environmental models in the presence of incomplete knowledge. The classical uncertainty methodology based on probabilistic modeling provides direct estimates of the relevant statistical measures quantifying the uncertainty on the model responses, by combining Monte Carlo simulation with efficient statistical treatments. However, this approach may lead to unrealistic results when not enough information is available to specify the probability distribution functions (pdfs) of the input parameters. For example, if a fixed but unknown variable lies between a and b (i.e., its pdf is a Dirac distribution at an unknown location), the proper way to model this knowledge is to consider the set of δc distributions (a δc distribution assigns probability 1 to the parameter being equal to c and 0 elsewhere) for c belonging to [a,b]. This is quite different from assuming an equidistribution. Thus, to respect the real state of knowledge in industrial applications, a new modeling approach based on the theory of evidence is introduced. It extends classical Monte Carlo simulation by relaxing the assumptions related to the choice of probability distribution functions and to possible dependencies between uncertain parameters. To illustrate the principle of this modeling, a comparison with probabilistic modeling is given for the case of the transfer of a radionuclide in the environment.
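The contrast with an equidistribution assumption can be made concrete: instead of drawing the ill-known parameter from an assumed pdf, every value c in [a,b] is kept, which for a monotone model means propagating the interval endpoints. The toy transfer model and all numbers below are illustrative; the resulting lower and upper probabilities play the role of belief and plausibility in evidence theory.

```python
import numpy as np

rng = np.random.default_rng(4)

def transfer(k_sorption, rainfall):
    # Toy radionuclide transfer model (illustrative, not the paper's code).
    return rainfall / (1.0 + k_sorption)

a, b = 0.5, 4.0          # k_sorption known only to lie in [a, b]; no pdf assumed
threshold = 2.0
n_mc = 20000

bel = pl = 0
for _ in range(n_mc):
    rain = rng.lognormal(1.0, 0.4)   # probabilistic input with a justified pdf
    # Transfer is decreasing in k_sorption, so extremes occur at the endpoints;
    # for non-monotone models, optimize over [a, b] instead.
    lo, hi = transfer(b, rain), transfer(a, rain)
    bel += lo > threshold            # belief: exceeds for every c in [a, b]
    pl += hi > threshold             # plausibility: exceeds for some c in [a, b]

print(f"Belief:       P(dose > {threshold}) >= {bel / n_mc:.3f}")
print(f"Plausibility: P(dose > {threshold}) <= {pl / n_mc:.3f}")
```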

7.
The development of simple, accurate, and efficient methods for estimating the extreme response of dynamical systems subjected to random excitations is discussed in the present paper. The key quantity for calculating the statistical distribution of extreme response is the mean level upcrossing rate function. By exploiting the regularity of the tail behavior of this function, an efficient simulation-based methodology for estimating the extreme response distribution function is developed. This makes it possible to avoid the commonly adopted assumption that the extreme value data follow an appropriate asymptotic extreme value distribution, which would be a Gumbel distribution for the models considered in this paper. It is demonstrated that the commonly quoted obstacle to using the standard Monte Carlo method for estimating extreme responses, i.e., excessive CPU time, can be circumvented, bringing the computational effort down to quite acceptable levels.
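A minimal version of the tail-extrapolation idea is sketched below, with an AR(1) process standing in for the dynamic response and an assumed Gaussian-like tail shape (log rate linear in the squared level) for the upcrossing rate function.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate a stationary response process (AR(1) stand-in for the dynamic response).
n, phi = 200000, 0.95
x = np.empty(n)
x[0] = 0.0
eps = rng.normal(0, np.sqrt(1 - phi**2), n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]

def upcrossing_rate(x, level):
    # Fraction of time steps at which the process crosses `level` from below.
    return np.mean((x[:-1] < level) & (x[1:] >= level))

# Estimate the rate at moderate levels, where plain counting is still reliable.
levels = np.linspace(1.0, 2.5, 8)
nu = np.array([upcrossing_rate(x, l) for l in levels])

# Exploit the regular tail: assume log nu is linear in level^2 (Gaussian-like tail).
c1, c0 = np.polyfit(levels**2, np.log(nu), 1)

# Extrapolate to an extreme level and convert via P(max over T <= xi) ~ exp(-nu(xi) T),
# avoiding any asymptotic (e.g. Gumbel) assumption on the extreme value data.
xi, T = 4.0, 10000
nu_xi = np.exp(c0 + c1 * xi**2)
print("P(max over T <= 4.0) ≈", np.exp(-nu_xi * T))
```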

8.
Although there has been nearly complete agreement in the scientific community that Monte Carlo techniques represent a significant improvement in the exposure assessment process, virtually all state and federal risk assessments still rely on the traditional point estimate approach. One of the rate-determining steps to a timely implementation of Monte Carlo techniques to regulatory decision making is the development of "standard" data distributions that are considered applicable to any setting. For many exposure variables, there is no need to wait any longer to adopt Monte Carlo techniques into regulatory policy since there is a wealth of data from which a robust distribution can be developed and ample evidence to indicate that the variable is not significantly influenced by site-specific conditions. In this paper, we propose several distributions that can be considered standard and customary for most settings. Age-specific distributions for soil ingestion rates, inhalation rates, body weights, skin surface area, tapwater and fish consumption, residential occupancy and occupational tenure, and soil-on-skin adherence were developed. For each distribution offered in this paper, we discuss the adequacy of the database, derivation of the distribution, and applicability of the distribution to various settings and conditions.

9.
Distinguishing between discrete and continuous latent variable distributions has become increasingly important in numerous domains of behavioral science. Here, the authors explore an information-theoretic approach to latent distribution modeling, in which the ability of latent distribution models to represent statistical information in observed data is emphasized. The authors conclude that loss of statistical information with a decrease in the number of latent values provides an attractive basis for comparing discrete and continuous latent variable models. Theoretical considerations as well as the results of 2 Monte Carlo simulations indicate that information theory provides a sound basis for modeling latent distributions and distinguishing between discrete and continuous latent variable models in particular.

10.
Proton pencil beams in water, in a format suitable for treatment planning algorithms and covering the radiotherapy energy range (50-250 MeV), have been calculated using a modified version of the Monte Carlo code PTRAN. A simple analytical model, in excellent agreement with the Monte Carlo calculations, has also been developed for calculating proton broad-beam dose distributions. Radial dose distributions are also calculated analytically and narrow proton pencil-beam dose distributions derived. The physical approximations in the Monte Carlo code and in the analytical model, together with their limitations, are discussed. Examples showing the use of the calculated set of proton pencil beams as input to an existing photon treatment planning algorithm based on biological optimization are given for fully 3D scanned proton pencil beams; these include intensity modulated beams with range shift and scanning in the transverse plane.

11.
Although parking revenue is a principal source of airport income, the supply of parking infrastructure at airports is based largely on expected needs. Although that is a rational basis, high investment costs and management fees require developers and financiers to analyze investment risks carefully. This paper focuses on the sources of investment risk in airport parking infrastructure development and discusses the application of Monte Carlo simulation to estimate and understand the impacts of cash flow uncertainties on project feasibility. It is shown that cost overruns, which are common in construction project development, have the most significant impact on return risk.
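A bare-bones version of such a cash-flow simulation is sketched below; all monetary figures, distributions, and the cost-overrun range are invented for illustration of the structure, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

n_sim, years, rate = 10000, 20, 0.08
base_capex = 50.0                                   # initial investment, $M (assumed)
disc = (1 + rate) ** -np.arange(1, years + 1)       # discount factors

npv = np.empty(n_sim)
for i in range(n_sim):
    overrun = rng.triangular(0.0, 0.15, 0.60)          # construction cost overrun fraction
    capex = base_capex * (1 + overrun)
    revenue = rng.normal(7.0, 1.5, years).clip(min=0)  # annual parking revenue, $M
    opex = rng.normal(2.0, 0.3, years)                 # management fees / operations, $M
    npv[i] = np.sum((revenue - opex) * disc) - capex

print("P(NPV < 0) =", np.mean(npv < 0))
print("5th/50th/95th pct NPV:", np.percentile(npv, [5, 50, 95]))
```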

12.
This study examined various factors that affect statistical power in randomized intervention studies with noncompliance. On the basis of Monte Carlo simulations, this study demonstrates how statistical power changes depending on compliance rate, study design, outcome distributions, and covariate information. It also examines how these factors influence power in different methods of estimating intervention effects. Intent-to-treat analysis and complier average causal effect estimation are compared as 2 alternative ways of estimating intervention effects under noncompliance. The results of this investigation provide practical implications in designing and evaluating intervention studies taking into account noncompliance.
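The core of such a power study is short. The sketch below computes intent-to-treat power as a function of the compliance rate under an assumed effect size, sample size, and normal outcomes; the complier average causal effect could be estimated from the same draws as the ITT effect divided by the compliance rate.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(7)

def power_itt(compliance, delta=0.4, n_per_arm=100, n_rep=2000, alpha=0.05):
    # Fraction of simulated trials in which the ITT test rejects at level alpha.
    rejections = 0
    for _ in range(n_rep):
        control = rng.normal(0, 1, n_per_arm)
        complier = rng.uniform(size=n_per_arm) < compliance
        # Noncompliers in the treatment arm receive no intervention effect.
        treated = rng.normal(0, 1, n_per_arm) + delta * complier
        rejections += ttest_ind(treated, control).pvalue < alpha
    return rejections / n_rep

for c in (1.0, 0.8, 0.6, 0.4):
    print(f"compliance={c:.1f}  ITT power ≈ {power_itt(c):.2f}")
```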

13.
A probabilistic approach for failure analysis is presented in this paper, which investigates the probable scenarios that occur in case of failure of engineering systems with uncertainties. Failure analysis can be carried out by studying the statistics of system behavior corresponding to random samples of the uncertain parameters distributed according to the conditional distribution given that the failure event has occurred. This necessitates the efficient generation of conditional samples, which is in general a highly nontrivial task. A simulation method based on Markov chain Monte Carlo simulation is proposed to efficiently generate the conditional samples. It makes use of the samples generated from importance sampling simulation when the performance reliability is computed. The conditional samples can be used for statistical averaging to yield unbiased and consistent estimates of the conditional expectations of interest for failure analysis. Examples are given to illustrate the application of the proposed simulation method to probabilistic failure analysis of static and dynamic structural systems.
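A minimal sketch of generating conditional samples with a Metropolis chain restricted to the failure domain follows; the limit-state function and proposal scale are illustrative, and the seed sample would in practice come from the importance-sampling run mentioned above.

```python
import numpy as np

rng = np.random.default_rng(8)

def g(x):
    # Toy limit-state function: failure when g(x) < 0 (illustrative static system).
    return 3.5 - (x[0] + x[1])

def log_prior(x):
    # Independent standard normal uncertain parameters.
    return -0.5 * np.dot(x, x)

# Seed with a failure sample (in practice taken from the importance-sampling run).
x = np.array([2.0, 2.0])
assert g(x) < 0

samples = []
for _ in range(50000):
    prop = x + rng.normal(0, 0.5, 2)
    # Metropolis step targeting the prior *restricted to the failure domain*:
    # any proposal that leaves the failure region is rejected outright.
    if g(prop) < 0 and np.log(rng.uniform()) < log_prior(prop) - log_prior(x):
        x = prop
    samples.append(x)
samples = np.array(samples[10000:])   # discard burn-in

# Statistical averaging over conditional samples, e.g. E[x | failure].
print("E[x | failure] ≈", samples.mean(axis=0))
```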

14.
The dependency structure matrix (DSM) has been identified as an apt tool for representing information flows between activities. Using this representation, information dependency attributes can be organized and analyzed in a structured manner to identify activity groups and sequences for concurrent execution. The current DSM methodology requires significant effort from experts to estimate information dependency attributes, and it could be more widely used if this estimating effort were reduced. This paper proposes two concepts for reducing the estimating effort required by the DSM methodology: the first reduces the number of information dependency ratings required, and the second reduces the effort needed to estimate each rating. The proposed concepts are structured into a procedure, which is applied to the design phase of an induced draft cooling tower (IDCT) project and discussed. The paper also discusses the shortcomings and future directions of the approach and concludes that it is applicable to IDCT projects and can be extended to other types of projects.

15.
It is very common to find meta-analyses in which some of the studies compare 2 groups on continuous dependent variables and others compare groups on dichotomized variables. Integrating all of them in a meta-analysis requires an effect-size index in the same metric that can be applied to both types of outcomes. In this article, the performance in terms of bias and sampling variance of 7 different effect-size indices for estimating the population standardized mean difference from a 2 × 2 table is examined by Monte Carlo simulation, assuming normal and nonnormal distributions. The results show good performance for 2 indices, one based on the probit transformation and the other based on the logistic distribution.
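The two well-performing indices are easy to state: the probit index is the difference of inverse-normal-transformed success proportions, and the logistic index rescales the log odds ratio by sqrt(3)/pi. Below is a small simulation in the spirit of the article, with an illustrative sample size, cutoff, and true effect.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(9)

delta, n, cutoff = 0.5, 50, 0.0       # true standardized mean difference (assumed)
est_probit, est_logit = [], []
for _ in range(5000):
    # Continuous scores, dichotomized at a fixed cutoff (as in the primary studies).
    g1 = rng.normal(delta, 1, n) > cutoff
    g2 = rng.normal(0.0, 1, n) > cutoff
    p1 = (g1.sum() + 0.5) / (n + 1)   # continuity-corrected proportions
    p2 = (g2.sum() + 0.5) / (n + 1)
    # Probit index: difference of inverse-normal-transformed proportions.
    est_probit.append(norm.ppf(p1) - norm.ppf(p2))
    # Logistic index: log odds ratio rescaled by sqrt(3)/pi.
    lor = np.log(p1 / (1 - p1)) - np.log(p2 / (1 - p2))
    est_logit.append(lor * np.sqrt(3) / np.pi)

for name, est in (("probit", est_probit), ("logistic", est_logit)):
    est = np.array(est)
    print(f"{name:8s} bias={est.mean() - delta:+.3f}  var={est.var():.3f}")
```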

16.
Quantitative risk assessment (QRA) is rapidly gaining recognition as the most practical method for assessing the risks associated with microbial contamination of foodstuffs. These risk analyses are most commonly developed in commercial computer spreadsheet applications, combined with Monte Carlo simulation add-ins that enable probability distributions to be inserted into a spreadsheet. If a suitable model structure can be defined and all of the variables within that model reasonably quantified, a QRA will demonstrate the sensitivity of the severity of the risk to each stage in the risk-assessment model. It can therefore provide guidance for the selection of appropriate risk-reduction measures and a quantitative assessment of the benefits and costs of these proposed measures. However, very few reports explaining QRA models have been submitted for publication in this area. There is, therefore, little guidance available to those who intend to embark on a full microbial QRA. This paper looks at a number of modeling techniques that can help produce more realistic and accurate Monte Carlo simulation models. The use and limitations of several distributions important to microbial risk assessment are explained. Some simple techniques specific to Monte Carlo simulation modeling of microbial risks using spreadsheets are also offered, which will help the analyst more realistically reflect the uncertain nature of the scenarios being modeled.
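A spreadsheet-style microbial QRA chain translates directly into a few vectorized draws. The sketch below uses an exponential dose-response model and invented distribution parameters purely to illustrate the farm-to-fork structure; it is not a model from the paper.

```python
import numpy as np

rng = np.random.default_rng(10)

n_sim = 50000
# Stage 1: contamination at retail, CFU/g (log-normal; parameters illustrative).
conc = rng.lognormal(mean=np.log(0.1), sigma=1.0, size=n_sim)
# Stage 2: growth during storage, log10 increase (triangular min/mode/max).
growth = rng.triangular(0.0, 0.5, 2.0, size=n_sim)
# Stage 3: cooking reduction, log10 decrease.
reduction = rng.triangular(2.0, 4.0, 6.0, size=n_sim)
# Stage 4: serving size, grams.
serving = rng.normal(100, 20, size=n_sim).clip(min=10)

dose = conc * 10.0 ** (growth - reduction) * serving   # ingested CFU per serving
r = 0.002                                              # exponential dose-response parameter
p_ill = 1.0 - np.exp(-r * dose)                        # P(illness | dose)

print("Mean risk per serving:", p_ill.mean())
print("95th percentile risk: ", np.percentile(p_ill, 95))
```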

17.
The reliability of surface-wave tests for the evaluation of VS,30 in seismic site characterization is assessed with respect to both uncertainty and accuracy. The discussion of uncertainty is mainly focused on the implications of solution nonuniqueness in inverse problems; only the inversion uncertainty is considered within this work, omitting other possible issues such as nontrivial geological settings (e.g., lateral variations) or the influence of different processing procedures. A Monte Carlo approach has been used to select, through a statistical test, a set of shear-wave velocity models that can be considered equivalent with respect to fitting the experimental dispersion curve according to the information content (dispersion velocities and frequency range) and the experimental uncertainties. This set of equivalent solutions is then used to evaluate the uncertainty in the determination of VS,30. Moreover, comparisons between the results obtained by surface-wave tests and invasive seismic methods are reported to assess the accuracy of VS,30 evaluation by using surface-wave methods. It is shown that, given an adequate investigation depth, the solution nonuniqueness is not a major concern and that the results are comparable in most situations with the results of invasive tests providing an accurate estimate of VS,30, even with simplified approaches.
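The equivalent-model logic can be illustrated with a toy example. The forward dispersion model below is a crude stand-in for a real surface-wave solver, and the thicknesses, velocities, noise, and tolerance are invented, so only the selection-and-spread mechanics should be read from it.

```python
import numpy as np

rng = np.random.default_rng(11)

freqs = np.linspace(5, 30, 12)

def dispersion(h, vs):
    # Toy forward model: phase velocity as a depth-weighted average of Vs, with
    # shallower layers dominating at higher frequencies. A real study would call
    # a proper surface-wave forward solver here.
    depths = np.cumsum(h) - h / 2
    c = np.empty(freqs.size)
    for i, f in enumerate(freqs):
        w = np.exp(-depths * f / 200.0)
        c[i] = 0.9 * np.sum(w * vs) / np.sum(w)
    return c

def vs30(h, vs):
    # Travel-time average over the top 30 m.
    return 30.0 / np.sum(h / vs)

h_true = np.array([5.0, 10.0, 15.0])            # layer thicknesses, m (sum = 30)
vs_true = np.array([180.0, 300.0, 500.0])       # shear-wave velocities, m/s
c_obs = dispersion(h_true, vs_true) * rng.normal(1.0, 0.03, freqs.size)

accepted = []
for _ in range(20000):
    vs = vs_true * rng.uniform(0.6, 1.4, 3)     # random trial profiles
    misfit = np.sqrt(np.mean((dispersion(h_true, vs) / c_obs - 1) ** 2))
    if misfit < 0.05:                           # statistically equivalent fit
        accepted.append(vs30(h_true, vs))

accepted = np.array(accepted)
print(f"{accepted.size} equivalent models; "
      f"VS,30 = {accepted.mean():.0f} ± {accepted.std():.0f} m/s")
```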

18.
Project managers implement the concept of time contingency to consider uncertainty in duration estimates and prevent project completion delays. Some project managers also build a distribution of the project time contingency into the project activities to create a more manageable schedule. Generally, both the estimation and distribution of the project time contingency are conducted by using subjective approaches. Because the project schedule feasibility mainly depends on the variable behavior of the project activities, the estimate of project time contingency and its allocation at the activity level should be obtained by considering the performance variability of each activity rather than by relying on human judgment. In this paper, the stochastic allocation of project allowances method, which is based on Monte Carlo simulation, is proposed to estimate the project time contingency and allocate it among the project activities. The application of this method to a three-span bridge project results in a fair allocation of the project time contingency and provides practical means to control time contingencies at the activity level.
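A stripped-down version of the simulation-based estimate and allocation is shown below, for a serial chain of activities with assumed triangular durations; the activity names and numbers are invented, and the variance-proportional allocation rule is one plausible choice, not necessarily the paper's.

```python
import numpy as np

rng = np.random.default_rng(12)

# Serial activities of a small bridge schedule (triangular durations, days; assumed).
acts = {"foundations": (20, 25, 40), "piers": (15, 18, 30), "deck": (25, 30, 50)}
n_sim = 20000

draws = {a: rng.triangular(*p, n_sim) for a, p in acts.items()}
total = sum(draws.values())                        # serial network: durations add

deterministic = sum(p[1] for p in acts.values())   # most-likely-duration schedule
contingency = np.percentile(total, 80) - deterministic
print(f"Project time contingency at 80% confidence: {contingency:.1f} days")

# Allocate the contingency to activities in proportion to their variability,
# so each activity's buffer reflects its own performance uncertainty.
spreads = {a: d.std() for a, d in draws.items()}
s = sum(spreads.values())
for a in acts:
    print(f"  {a}: {contingency * spreads[a] / s:.1f} days")
```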

19.
The standard Pearson correlation coefficient is a biased estimator of the true population correlation, ρ, when the predictor and the criterion are range restricted. To correct the bias, the correlation corrected for range restriction, rc, has been recommended, and a standard formula based on asymptotic results for estimating its standard error is also available. In the present study, the bootstrap standard-error estimate is proposed as an alternative. Monte Carlo simulation studies involving both normal and nonnormal data were conducted to examine the empirical performance of the proposed procedure under different levels of ρ, selection ratio, sample size, and truncation types. Results indicated that, with normal data, the bootstrap standard-error estimate is more accurate than the traditional estimate, particularly with small sample size. With nonnormal data, performance of both estimates depends critically on the distribution type. Furthermore, the bootstrap bias-corrected and accelerated interval consistently provided the most accurate coverage probability for ρ.
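The bootstrap procedure is straightforward: resample the range-restricted sample with replacement, recompute the corrected correlation each time, and take the standard deviation of the replicates. The sketch below uses the Thorndike Case II correction for direct restriction on the predictor; the population correlation, selection rule, and sample size are illustrative.

```python
import numpy as np

rng = np.random.default_rng(13)

def corrected_r(x, y, sd_unrestricted):
    # Thorndike Case II correction for direct range restriction on x.
    r = np.corrcoef(x, y)[0, 1]
    u = sd_unrestricted / x.std(ddof=1)
    return r * u / np.sqrt(1 + r**2 * (u**2 - 1))

# Population: bivariate normal with rho = .5; selection keeps the top 30% on x.
rho, n = 0.5, 2000
x = rng.normal(size=n)
y = rho * x + np.sqrt(1 - rho**2) * rng.normal(size=n)
keep = x > np.quantile(x, 0.7)
xs, ys = x[keep], y[keep]

rc = corrected_r(xs, ys, sd_unrestricted=1.0)

# Bootstrap standard error of rc: resample the restricted sample with replacement.
idx = np.arange(xs.size)
boot = []
for _ in range(2000):
    b = rng.choice(idx, idx.size, replace=True)
    boot.append(corrected_r(xs[b], ys[b], sd_unrestricted=1.0))

print(f"rc = {rc:.3f}, bootstrap SE = {np.std(boot, ddof=1):.3f}")
```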

20.
During the planning and execution of construction projects, it often becomes necessary to shorten the duration of the project. A widely used technique for reducing the duration of a project is commonly referred to as least-cost scheduling. This procedure is based on deterministically arriving at the shortest project duration for the minimum cost possible. There is, however, one major problem with the typical application of this technique. It does not address the variability inherent in the duration and cost of the project activities. Thus, the resulting compressed schedule value cannot be applied with any stated level of statistical confidence. This paper presents a new procedure that addresses some of the major shortcomings of least-cost scheduling. It does so by accounting for the variability inherent in the duration and cost of the scheduled activities by simultaneously applying range estimating and probabilistic scheduling to the historical data. The resulting data set is then analyzed to provide a compressed schedule duration and cost estimate that have a higher overall confidence of being achieved.
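The combination of range estimating and probabilistic scheduling can be sketched as follows, with invented duration and cost ranges for a compressed serial chain; the point is that the compressed duration and cost are reported at stated confidence levels rather than as a single deterministic least-cost value.

```python
import numpy as np

rng = np.random.default_rng(14)

# Compressed schedule for a serial chain: each activity carries a range estimate
# of its crashed duration (days) and crash cost ($k); parameters illustrative.
activities = [
    {"dur": (8, 10, 15),  "cost": (40, 50, 70)},
    {"dur": (5, 6, 10),   "cost": (20, 25, 40)},
    {"dur": (12, 14, 20), "cost": (60, 70, 95)},
]
n_sim = 20000

dur = np.zeros(n_sim)
cost = np.zeros(n_sim)
for a in activities:
    dur += rng.triangular(*a["dur"], n_sim)     # probabilistic scheduling
    cost += rng.triangular(*a["cost"], n_sim)   # range estimating

# Compressed duration and cost with a stated level of statistical confidence.
for q in (50, 80, 95):
    print(f"{q}% confidence: duration <= {np.percentile(dur, q):.1f} d, "
          f"cost <= {np.percentile(cost, q):.0f} $k")
```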
