Similar Documents
A total of 20 similar documents were found (search time: 765 ms).
1.
In the case of low-dose exposure to a substance, its concentration in cells is likely to be stochastic. Assessing the consequences of this stochasticity in toxicological risk assessment requires coupling macroscopic dynamics models describing whole-body kinetics with microscopic tools designed to simulate stochasticity. In this article, we propose an approach to approximating the stochastic concentration of butadiene in the cells of diverse organs. We adapted the dynamics equations of a physiologically based pharmacokinetic (PBPK) model and used a stochastic simulator for the system of equations that we derived. We then coupled the kinetics simulations with a deterministic hockey stick model of carcinogenicity. Stochasticity induced substantial modifications of the dose-response curve compared with the deterministic case. In particular, the response was nonlinear and the stochastic apparent threshold was lower than the deterministic one. The approach we developed could easily be extended to other biological studies to assess the macroscopic-scale influence of stochasticity in compound dynamics at the cell level.
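A minimal sketch of the coupling described above, under invented parameters: a generic birth-death (Gillespie-type) simulation stands in for the stochastic cell-level kinetics, and a hockey-stick (threshold-linear) dose-response is applied along the fluctuating trajectory rather than to the deterministic mean. This is not the authors' PBPK model; the rates, threshold, and slope are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def time_averaged_risk(k_in, k_out, t_end, threshold, slope, n0=0):
    """Gillespie simulation of molecules entering (rate k_in) and being cleared
    (first-order rate k_out) from one cell; returns the time-averaged
    hockey-stick response along the stochastic trajectory."""
    t, n, acc = 0.0, n0, 0.0
    while t < t_end:
        rate = k_in + k_out * n
        dt = min(rng.exponential(1.0 / rate), t_end - t)
        acc += max(0.0, slope * (n - threshold)) * dt   # hockey stick on the current count
        t += dt
        if t >= t_end:
            break
        n += 1 if rng.random() < k_in / rate else -1    # birth or death event
    return acc / t_end

k_in, k_out, threshold, slope = 15.0, 1.0, 20.0, 0.002
deterministic_conc = k_in / k_out                        # steady-state mean of 15 molecules
deterministic_risk = max(0.0, slope * (deterministic_conc - threshold))   # 0: below threshold
stochastic_risk = np.mean([time_averaged_risk(k_in, k_out, 100.0, threshold, slope)
                           for _ in range(200)])
print(deterministic_risk, stochastic_risk)               # fluctuations yield a nonzero risk
```

Because the hockey-stick response is convex, averaging it over fluctuations yields a positive risk even though the mean concentration sits below the deterministic threshold, which is the qualitative effect reported in the abstract.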

2.
An ecological risk assessment framework for low-altitude aircraft overflights was developed, with special emphasis on military applications. The problem formulation and exposure analysis phases are presented in this article; an analysis of effects and risk characterization is presented in a companion article. The intent of this article is threefold: (1) to illustrate the development of a generic framework for the ecological risk assessment of an activity, (2) to show how the U.S. Environmental Protection Agency's ecological risk assessment paradigm can be applied to an activity other than the release of a chemical, and (3) to provide guidance for the assessment of ecological risks from low-altitude aircraft overflights. The key stressor for low-altitude aircraft overflights is usually sound, although visual and physical (collision) stressors may also be significant. Susceptible and regulated wildlife populations are the major assessment endpoint entities, although plant communities may be impacted by takeoffs and landings. The exposure analysis utilizes measurements of wildlife locations, measurements of sound levels at the wildlife locations, measurements of slant distances from aircraft to wildlife, models that extrapolate sound from the source aircraft to the ground, and bird-strike probability models. Some of the challenges to conducting a risk assessment for aircraft overflights include prioritizing potential stressors and endpoints, choosing exposure metrics that relate to wildlife responses, obtaining good estimates of sound or distance, and estimating wildlife locations.

3.
Many large organizations accomplish their various functions through interactions across their major components. Components here refers to functional entities within a large, complex organization, such as business sectors, academic departments, or regional divisions. The dependencies between the various components can cause risk to propagate through the overall system. This article presents a risk assessment framework that integrates risk across a diverse set of components into the overall organizational functions. The project addresses three major challenges: aggregating risk, estimating component interdependencies (including cycles of dependencies), and propagating risk across components. The framework aggregates risk assessments through a value function for severity that is evaluated at the expected outcome of accomplishing planned goals in terms of performance, schedule, and resources. The value function, which represents risk tolerance, scales between defined points corresponding to failure and success. Different risk assessments may be aggregated together. This article presents a novel approach to establishing relationships between the various components, and develops and compares three network risk propagation models that characterize the overall organizational risk. The U.S. Air Force has applied this risk framework to evaluate success in hypothetical future wars, and the analysts employing it have informed billions of dollars of strategic investment decisions.
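As one concrete, much-simplified reading of risk propagation with cyclic dependencies, the sketch below treats component risks as the fixed point of r = r0 + Dr over a dependency matrix D. It is not one of the three propagation models developed in the article; the components, weights, and aggregation rule are all invented for illustration.

```python
import numpy as np

# Hypothetical organization with four components; all numbers are illustrative.
components = ["sector A", "sector B", "logistics", "IT"]
intrinsic = np.array([0.10, 0.30, 0.05, 0.20])     # stand-alone risk of each component

# D[i, j] = fraction of component j's risk that propagates to component i.
# Cycles are allowed (sector A <-> IT below).
D = np.array([[0.0, 0.2, 0.1, 0.3],
              [0.0, 0.0, 0.4, 0.1],
              [0.1, 0.0, 0.0, 0.0],
              [0.2, 0.0, 0.0, 0.0]])

# Total risk solves r = intrinsic + D @ r; with the spectral radius of D below 1,
# the fixed point is r = (I - D)^(-1) @ intrinsic, so cycles are handled exactly.
r = np.linalg.solve(np.eye(len(components)) - D, intrinsic)

# One simple aggregation to an organization-level figure (probabilistic "or").
overall = 1.0 - np.prod(1.0 - np.clip(r, 0.0, 1.0))
for name, ri in zip(components, r):
    print(f"{name:10s} propagated risk: {ri:.3f}")
print("aggregated organizational risk:", round(overall, 3))
```

Solving the linear system handles dependency cycles exactly, whereas a naive single pass over the dependency graph would undercount feedback between, say, sector A and IT.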

4.
Louis Anthony Cox, Jr. Risk Analysis, 2006, 26(6): 1581-1599
This article introduces an approach to estimating the uncertain potential effects on lung cancer risk of removing a particular constituent, cadmium (Cd), from cigarette smoke, given the useful but incomplete scientific information available about its modes of action. The approach considers normal cell proliferation; DNA repair inhibition in normal cells affected by initiating events; proliferation, promotion, and progression of initiated cells; and death or sparing of initiated and malignant cells as they are further transformed to become fully tumorigenic. Rather than estimating unmeasured model parameters by curve fitting to epidemiological or animal experimental tumor data, we attempt rough estimates of parameters based on their biological interpretations and comparison to corresponding genetic polymorphism data. The resulting parameter estimates are admittedly uncertain and approximate, but they suggest a portfolio approach to estimating impacts of removing Cd that gives usefully robust conclusions. This approach views Cd as creating a portfolio of uncertain health impacts that can be expressed as biologically independent relative risk factors having clear mechanistic interpretations. Because Cd can act through many distinct biological mechanisms, it appears likely (subjective probability greater than 40%) that removing Cd from cigarette smoke would reduce smoker risks of lung cancer by at least 10%, although it is possible (consistent with what is known) that the true effect could be much larger or smaller. Conservative estimates and assumptions made in this calculation suggest that the true impact could be greater for some smokers. This conclusion appears to be robust to many scientific uncertainties about Cd and smoking effects.
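A rough sketch of the portfolio idea, with every distribution invented rather than taken from the article: if each Cd mechanism contributes an independent relative-risk factor, the factors multiply, and Monte Carlo sampling of their uncertainty yields a subjective probability that removing Cd reduces risk by at least a given amount.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical relative-risk factors for three independent Cd mechanisms
# (e.g., DNA-repair inhibition, promotion of initiated cells, sparing of transformed cells).
# Lognormal uncertainty around modest central values; every number here is invented.
rr_repair    = rng.lognormal(mean=np.log(1.08), sigma=0.10, size=n)
rr_promotion = rng.lognormal(mean=np.log(1.05), sigma=0.12, size=n)
rr_sparing   = rng.lognormal(mean=np.log(1.04), sigma=0.08, size=n)

# Biologically independent mechanisms combine multiplicatively into the Cd-attributable RR.
rr_cd = rr_repair * rr_promotion * rr_sparing

# Fractional reduction in smoker lung-cancer risk if Cd (and only Cd) is removed.
reduction = 1.0 - 1.0 / rr_cd
print("median reduction:", round(float(np.median(reduction)), 3))
print("P(reduction >= 10%):", round(float(np.mean(reduction >= 0.10)), 3))
```

The last line is the kind of statement quoted in the abstract: the probability, over the uncertainty in the mechanism-specific factors, that the risk reduction reaches at least 10%.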

5.
The management of microbial risk in food products requires the ability to predict the growth kinetics of pathogenic microorganisms in the event of contamination and growth initiation. Useful data for assessing these issues may be found in the literature or obtained from experimental results. However, the large number and variety of data make further development difficult. Statistical techniques, such as meta-analysis, are then useful to synthesize a set of distinct but similar experiments. Moreover, predictive modeling tools can be employed to complete the analysis and help the food safety manager interpret the data. In this article, a protocol for performing a meta-analysis of the output of a relational database, associated with quantitative microbiology models, is presented. The methodology is illustrated with the effect of temperature on pathogenic Escherichia coli and Listeria monocytogenes growing in culture medium, beef meat, and milk products. Using a database and predictive models, simulations of growth in a given product subjected to various temperature scenarios can be produced. It is then possible to compare food products for a given microorganism, according to its growth ability in these products, and to compare the behavior of bacteria in a given foodstuff. These results can assist decisions on a variety of food safety questions.

6.
Developmental anomalies induced by toxic chemicals may be identified using laboratory experiments with rats, mice, or rabbits. Multinomial responses of fetuses from the same mother are often positively correlated, resulting in overdispersion relative to multinomial variation. In this article, a simple data transformation based on the concept of generalized design effects due to Rao and Scott is proposed for dose-response modeling of developmental toxicity. After scaling the original multinomial data by the average design effect, standard methods for the analysis of uncorrelated multinomial data can be applied. Benchmark doses derived using this approach are comparable to those obtained using generalized estimating equations with an extended Dirichlet-trinomial covariance function to describe the dispersion of the original data. This empirical agreement, coupled with a large-sample theoretical justification of the Rao-Scott transformation, confirms the applicability of the statistical methods proposed in this article for developmental toxicity risk assessment.
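A small sketch of the design-effect scaling step, simplified to a single binary endpoint (malformed vs. normal) instead of the article's trinomial outcome; the litter data and the particular cluster-variance estimator used here are illustrative assumptions.

```python
import numpy as np

def design_effect(litter_sizes, litter_responses):
    """Rao-Scott-style design effect for one dose group with clustered binary
    outcomes: ratio of the between-litter (cluster) variance of the group
    proportion to the variance expected under independent binomial sampling."""
    n = litter_sizes.sum()
    m = len(litter_sizes)
    p = litter_responses.sum() / n
    if p == 0.0 or p == 1.0:
        return 1.0
    resid = litter_responses - p * litter_sizes              # litter-level residuals
    var_cluster = m / (m - 1) * np.sum(resid ** 2) / n ** 2  # clustered variance of p-hat
    var_binomial = p * (1.0 - p) / n
    return max(var_cluster / var_binomial, 1.0)

# Hypothetical dose groups: litter sizes and malformation counts (invented data).
groups = {
    0.0: (np.array([12, 10, 11, 13, 9]), np.array([0, 1, 0, 1, 0])),
    0.5: (np.array([11, 12, 10, 9, 12]), np.array([2, 3, 1, 1, 4])),
    1.0: (np.array([10, 11, 12, 10, 9]), np.array([5, 6, 4, 7, 3])),
}

for dose, (sizes, resp) in groups.items():
    d = design_effect(sizes, resp)
    # Scale the group totals by the design effect; the scaled data can then be
    # analyzed with standard dose-response methods for independent observations.
    n_eff, x_eff = sizes.sum() / d, resp.sum() / d
    print(f"dose {dose}: design effect {d:.2f}, effective n {n_eff:.1f}, effective x {x_eff:.1f}")
```

After dividing the group totals by the design effect, an ordinary dose-response model for independent observations (e.g., a logistic fit to the effective counts) can be applied, which is the point of the transformation.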

7.
A simple procedure is proposed to quantify the tradeoff between a loss suffered from an illness due to exposure to a microbial pathogen and a loss due to a toxic effect, perhaps a different illness, induced by the disinfectant employed to reduce the microbial exposure. Estimates of these two types of risk as a function of disinfectant dose, together with their associated relative losses, provide the information needed to estimate the optimum dose of disinfectant that minimizes the total expected loss. The estimates of the optimum dose and expected relative total loss were similar regardless of whether the beta-Poisson, log-logistic, or extreme value function was used to model the risk of illness due to exposure to a microbial pathogen. This is because the optimum dose of the disinfectant and the resulting expected minimum loss depend on the estimated slope (first derivative) of the models at low levels of risk, which appears to be similar for all three models. Similarly, the choice among these three models does not appear critical for estimating the slope at low levels of risk for the toxic effect induced by the use of a disinfectant. For the proposed procedure to estimate the optimum disinfectant dose, absolute values for the losses due to microbial-induced or disinfectant-induced illness are not necessary; only relative losses are required. All aspects of the problem are amenable to sensitivity analysis. The issue of risk/benefit tradeoffs, more appropriately called risk/risk tradeoffs, does not appear to be an insurmountable problem.
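The tradeoff can be written as a one-line objective, total expected relative loss(d) = w_microbial * P_illness(d) + w_toxic * P_toxic(d), minimized over the disinfectant dose d. The sketch below uses a beta-Poisson model for the microbial risk and a simple exponential model for the toxic effect; these functional forms are stand-ins for those discussed in the abstract and every parameter value is invented.

```python
import numpy as np

# All parameter values below are invented for illustration.
N0, k       = 100.0, 0.15      # mean pathogen exposure without disinfectant; kill rate per unit dose
alpha, beta = 0.25, 40.0       # beta-Poisson dose-response parameters for the microbial illness
q           = 2.0e-4           # low-dose slope of the disinfectant's toxic effect
w_microbial, w_toxic = 1.0, 5.0   # only the *relative* losses per illness are needed

def p_microbial(d):
    """Beta-Poisson risk of illness from pathogens surviving disinfectant dose d."""
    surviving = N0 * np.exp(-k * d)
    return 1.0 - (1.0 + surviving / beta) ** (-alpha)

def p_toxic(d):
    """Risk of the disinfectant-induced toxic effect (simple exponential model)."""
    return 1.0 - np.exp(-q * d)

doses = np.linspace(0.0, 100.0, 2001)
total_loss = w_microbial * p_microbial(doses) + w_toxic * p_toxic(doses)
best = doses[np.argmin(total_loss)]
print(f"optimum dose ~ {best:.1f}; expected relative total loss ~ {total_loss.min():.4f}")
```

Only the ratio of the two loss weights affects the location of the minimum, which reflects the abstract's point that relative rather than absolute losses suffice.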

8.
All major world economies are exhibiting a shift from products to services in terms of relative share of GNP and employment. A well-accepted explanation for this shift to services has been the lower productivity growth in services relative to manufacturing. A second trend visible in the United States and other advanced economies is that from material‐intensive to information‐intensive sectors, with the latter growing relative to the former. There does not seem to be a generally accepted explanation for this shift; in fact, here it would appear that productivity in information‐intensive sectors is increasing. We construct a model of an economy with endogenous production and consumption decisions by utility-maximizing individuals. We show that differential productivity changes can result in either relative growth or decline of a sector. A second factor affecting the direction of change is the degree to which consumption of sector outputs approaches satiation. When the marginal utility of additional consumption drops sufficiently low, productivity increases can lead to declines in the relative size and share of, and employment in, the sector. Concurrently, increases in productivity increase average wealth as expected, but income inequality can either increase or decrease.

9.
Campylobacter bacteria are an important cause of foodborne infections. We estimated the potential costs and benefits of a large number of possible interventions to decrease human exposure to Campylobacter through consumption of chicken meat, which accounts for 20-40% of all cases of human campylobacteriosis in the Netherlands. For this purpose, a farm-to-fork risk assessment model was combined with economic analysis and epidemiological data. Reduction of contamination at broiler farms could be efficient in theory; however, it is unclear which hygienic measures need to be taken, and the costs can be very high. The experimental treatment of colonized broiler flocks with bacteriophages has proven to be effective and could also be cost efficient, if confirmed in practice. Since a major decrease of infections at the broiler farm is not expected in the short term, additional measures in the processing plant were also considered. At this moment, guaranteed Campylobacter-free chicken meat at the retail level is not realistic. The most promising interventions in the processing plant are limiting fecal leakage during processing and separating contaminated and noncontaminated flocks (scheduling), followed by decontamination of the contaminated flocks. New (faster and more sensitive) test methods to detect Campylobacter colonization in broiler flocks are a prerequisite for successful scheduling scenarios. Other methods to decrease the contamination of meat from colonized flocks, such as freezing and heat treatment, are more expensive and/or less effective than chemical decontamination.

10.
This article presents a Listeria monocytogenes growth model in milk at the farm bulk tank stage. The main objective was to judge the feasibility and value to risk assessors of introducing a complex model, including a complete thermal model, within a microbial quantitative risk assessment scheme. Predictive microbiology models are used under varying temperature conditions to predict bacterial growth. Input distributions are estimated from data in the literature when available; if not, reasonable assumptions are made for the context considered. Previously published results based on a Bayesian analysis of growth parameters are used. A Monte Carlo simulation that forecasts bacterial growth is the focus of this study. Three scenarios that take account of the variability and uncertainty of growth parameters are compared. The effect of a sophisticated thermal model accounting for continuous variations in milk temperature was tested by comparison with a simplified model in which milk temperature was considered constant. Limited multiplication of bacteria within the farm bulk tank was modeled. The two principal factors influencing bacterial growth were found to be tank thermostat regulation and bacterial population growth parameters. The dilution phenomenon due to the introduction of new milk was the main factor affecting the final bacterial concentration. The results show that a model that assumes constant environmental conditions at an average temperature should be acceptable for this process. This work may constitute a first step toward exposure assessment for L. monocytogenes in milk. In addition, this partly conceptual work provides guidelines for other risk assessments where continuous variation of a parameter needs to be taken into account.
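A toy version of the comparison discussed above, assuming a square-root (Ratkowsky-type) secondary model and invented parameter values: exponential growth is integrated over a continuously varying tank-temperature profile and over its constant average, inside a Monte Carlo loop over thermostat set points.

```python
import numpy as np

rng = np.random.default_rng(2)

# Square-root secondary model: sqrt(mu) = b * (T - Tmin); values are illustrative.
b, Tmin = 0.023, -2.0     # 1/(h^0.5 * deg C) and minimum growth temperature (deg C)

def growth_rate(T):
    """Specific growth rate (1/h) at temperature T (deg C)."""
    return np.maximum(b * (T - Tmin), 0.0) ** 2

def log10_increase(temps, dt=0.5):
    """log10 increase after integrating exponential growth over a temperature profile."""
    return np.sum(growth_rate(temps) * dt) / np.log(10.0)

hours = np.arange(0.0, 48.0, 0.5)          # two days of bulk-tank storage, half-hour steps
n_iter = 5000
varying, constant = np.empty(n_iter), np.empty(n_iter)
for i in range(n_iter):
    setpoint = rng.normal(4.0, 1.0)        # thermostat regulation of the tank
    # continuous temperature variation: a warm spike each time fresh milk is added (every 12 h)
    profile = setpoint + 3.0 * np.exp(-(hours % 12.0) / 1.5)
    varying[i] = log10_increase(profile)
    constant[i] = log10_increase(np.full_like(hours, profile.mean()))

print("median log10 increase, varying temperature :", round(float(np.median(varying)), 3))
print("median log10 increase, constant temperature:", round(float(np.median(constant)), 3))
```

Because the growth rate is only mildly nonlinear in temperature over this range, the two summaries stay close, which is the kind of evidence behind the conclusion that a constant-temperature simplification can be acceptable for this process.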

11.
The BMD (benchmark dose) method used in risk assessment of chemical compounds was introduced by Crump (1984) and is based on dose-response modeling. To take uncertainty in the data and model fitting into account, the lower confidence bound of the BMD estimate (BMDL) is suggested as the point of departure in health risk assessments. In this article, we study how to design optimum experiments for applying the BMD method to continuous data. We exemplify our approach by considering the class of Hill models. The main aim is to study whether an increased number of dose groups, together with a decreased number of animals in each dose group, improves conditions for estimating the benchmark dose. Since Hill models are nonlinear, the optimum design depends on the values of the unknown parameters. That is why we consider Bayesian designs and assume that the parameter vector has a prior distribution. A natural design criterion is to minimize the expected variance of the BMD estimator. We present an example in which we calculate the value of the design criterion for several designs and examine how the number of dose groups, the number of animals in the dose groups, and the choice of doses affect this value for different Hill curves. It follows from our calculations that, to avoid the risk of unfavorable dose placements, it is good to use designs with more than four dose groups. We can also conclude that any additional information about the expected dose-response curve, e.g., information obtained from earlier studies, should be taken into account when planning a study because it can improve the design.
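One way to evaluate such a design criterion without refitting every simulated data set is the asymptotic route sketched below: for each prior draw of the Hill parameters, approximate the covariance of the nonlinear least-squares estimator by sigma^2 (J'J)^-1 and carry it to the BMD with the delta method, then average over the prior. The prior, the candidate designs, and the benchmark definition (a fixed additive change delta in the mean response) are invented for illustration and are not the article's choices.

```python
import numpy as np

rng = np.random.default_rng(3)

def hill(d, theta):
    """Hill model for a continuous endpoint; theta = (y0, ymax, k, eta)."""
    y0, ymax, k, eta = theta
    return y0 + ymax * d**eta / (k**eta + d**eta)

def bmd(theta, delta=0.1):
    """Dose at which the mean response rises by delta above background."""
    _, ymax, k, eta = theta
    return k * (delta / (ymax - delta)) ** (1.0 / eta)

def numgrad(f, theta, eps=1e-5):
    """Forward-difference Jacobian/gradient of f with respect to theta."""
    f0 = np.asarray(f(theta))
    cols = []
    for i in range(len(theta)):
        t = np.array(theta, dtype=float)
        t[i] += eps
        cols.append((np.asarray(f(t)) - f0) / eps)
    return np.stack(cols, axis=-1)

def bmd_variance(doses, n_per_group, theta, sigma=0.15):
    """Asymptotic variance of the BMD estimator for one design and one parameter set:
    delta method on top of the nonlinear-least-squares covariance sigma^2 (J'J)^-1."""
    x = np.repeat(doses, n_per_group)
    J = numgrad(lambda t: hill(x, t), theta)      # n x 4 Jacobian of the mean function
    cov = sigma**2 * np.linalg.inv(J.T @ J)
    g = numgrad(bmd, theta)                       # 4-vector: dBMD/dtheta
    return float(g @ cov @ g)

def design_criterion(doses, n_per_group, n_prior=1000):
    """Bayesian criterion: expected BMD variance over a prior on the Hill parameters."""
    vals = []
    for _ in range(n_prior):
        theta = (rng.normal(1.0, 0.05), rng.uniform(0.5, 1.5),
                 rng.uniform(1.0, 4.0), rng.uniform(1.0, 3.0))
        vals.append(bmd_variance(doses, n_per_group, theta))
    return float(np.mean(vals))

# Two designs with the same total of 60 animals (dose levels are illustrative).
design_4 = (np.array([0.0, 1.0, 3.0, 9.0]), 15)
design_6 = (np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0]), 10)
for name, (d, n) in {"4 dose groups": design_4, "6 dose groups": design_6}.items():
    print(name, "expected Var(BMD):", round(design_criterion(d, n), 4))
```

Designs whose dose placements happen to be unfavorable for a given parameter draw show up as very large variances, so averaging over the prior penalizes them; this is the rationale for spreading the animals over more dose groups.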

12.
13.
The economic approach to determining the optimal control limits of control charts requires estimating the gradient of the expected cost function. Simulation is a very general methodology for estimating the expected costs, but for estimating the gradient, straightforward finite difference estimators can be inefficient. We demonstrate an alternative approach based on smoothed perturbation analysis (SPA), also known as conditional Monte Carlo. Numerical results and consequent design insights are obtained in determining the optimal control limits for exponentially weighted moving average and Bayes charts. The results indicate that the SPA gradient estimators can be significantly more efficient than finite difference estimators, and that a simulation approach using these estimators provides a viable alternative to other numerical solution techniques for the economic design problem.

14.
Risk Analysis for Critical Asset Protection (cited by 2: 0 self-citations, 2 citations by others)
This article proposes a quantitative risk assessment and management framework that supports strategic asset-level resource allocation decision making for critical infrastructure and key resource protection. The proposed framework consists of five phases: scenario identification, consequence and criticality assessment, security vulnerability assessment, threat likelihood assessment, and benefit-cost analysis. Key innovations in this methodology include its initial focus on fundamental asset characteristics to generate an exhaustive set of plausible threat scenarios based on a target susceptibility matrix (which we refer to as asset-driven analysis) and an approach to threat likelihood assessment that captures adversary tendencies to shift their preferences in response to security investments based on the expected utilities of alternative attack profiles assessed from the adversary perspective. A notional example is provided to demonstrate an application of the proposed framework. Extensions of this model to support strategic portfolio-level analysis and tactical risk analysis are suggested.

15.
The threat of so‐called rapid or abrupt climate change has generated considerable public interest because of its potentially significant impacts. The collapse of the North Atlantic Thermohaline Circulation or the West Antarctic Ice Sheet, for example, would have potentially catastrophic effects on temperatures and sea level, respectively. But how likely are such extreme climatic changes? Is it possible actually to estimate likelihoods? This article reviews the societal demand for the likelihoods of rapid or abrupt climate change, and different methods for estimating likelihoods: past experience, model simulation, or through the elicitation of expert judgments. The article describes a survey to estimate the likelihoods of two characterizations of rapid climate change, and explores the issues associated with such surveys and the value of information produced. The surveys were based on key scientists chosen for their expertise in the climate science of abrupt climate change. Most survey respondents ascribed low likelihoods to rapid climate change, due either to the collapse of the Thermohaline Circulation or increased positive feedbacks. In each case one assessment was an order of magnitude higher than the others. We explore a high rate of refusal to participate in this expert survey: many scientists prefer to rely on output from future climate model simulations.

16.
Point source pollution is one of the main threats to regional environmental health. Based on a water quality model, a methodology to assess the regional risk of point source pollution is proposed. The assessment procedure includes five parts: (1) identifying risk source units and estimating source emissions using Monte Carlo algorithms; (2) observing hydrological and water quality data of the assessed area, and evaluating the selected water quality model; (3) screening out the assessment endpoints and analyzing receptor vulnerability with the Choquet fuzzy integral algorithm; (4) using the water quality model introduced in the second step to predict pollutant concentrations for various source emission scenarios and analyzing hazards of risk sources; and finally, (5) using the source hazard values and receptor vulnerability scores to estimate overall regional risk. The proposed method, based on the Water Quality Analysis Simulation Program (WASP), was applied in the region of the Taipu River, which is in the Taihu Basin, China. Results of source hazard and receptor vulnerability analysis allowed us to describe aquatic ecological, human health, and socioeconomic risks individually, and also integrated risks in the Taipu region, from a series of risk curves. Risk contributions of sources to receptors were ranked, and the spatial distribution of risk levels was presented. By changing the input conditions, we were able to estimate risks for a range of scenarios. Thus, the proposed procedure may also be used by decisionmakers for long‐term dynamic risk prediction.

17.
Multiannual periods of consecutive above-median or below-median growth rates in operating performance, called runs, have a substantial influence on firm valuations. For estimating the probability of an above-median or below-median run and utilizing information efficiently, we employ a stepwise regression to automatically identify the parsimonious indicator-specific set of economically and empirically meaningful variables. Our novel approach uses logit models to distinguish firms that will persistently grow above or below the median over a period of up to 6 years. The predictive power for sales growth rates is highest to discriminate between above-median and below-median growth rates, while the future behaviour of operating income and net income growth rates can partially be explained for below-median growth rates.

18.
Multistage models have become the basic paradigm for modeling carcinogenesis. One model, the two-stage model of carcinogenesis, is now routinely used in the analysis of cancer risks from exposure to environmental chemicals. In its most general form, this model has two states, an initiated state and a neoplastic state, which allow for growth of cells via a simple linear birth-death process. In all analyses done with this model, researchers have assumed that tumor incidence is equivalent to the formation of a single neoplastic cell, and the growth kinetics in the neoplastic state have been ignored. Some researchers have discussed the impact of this assumption on their analyses, but no formal methods were available for a more rigorous application of the birth-death process. In this paper, an approximation is introduced which allows for the application of growth kinetics in the neoplastic state. The adequacy of the approximation is evaluated against simulated data, and methods are developed for implementing the approximation using data on the number and size of neoplastic clones.

19.
This article provides a data‐driven assessment of economic and environmental aspects of remanufacturing for product + service firms. A critical component of such an assessment is the issue of demand cannibalization. We therefore present an analytical model and a behavioral study which together incorporate demand cannibalization from multiple customer segments across the firm's product line. We then perform a series of numerical simulations with realistic problem parameters obtained from both the literature and discussions with industry executives. Our findings show that remanufacturing frequently aligns firms' economic and environmental goals by increasing profits and decreasing the total environmental impact. We show that in some cases, an introduction of a remanufactured product leads to no changes in the new products' prices (positioning within the product line), implying a positive demand cannibalization and a decrease in the environmental impact; this provides support for a heuristic approach commonly used in practice. Yet in other cases, the firm can increase profits by decreasing the new product's prices and increasing sales—a negative effective cannibalization. With negative cannibalization the firm's total environmental impact often increases due to the growth in new production. However, we illustrate that this growth is nearly always sustainable, as the relative environmental impacts per unit and per dollar rarely increase.

20.
A novel extension of traditional growth models for exposure assessment of food-borne microbial pathogens was developed to address the complex interactions of competing microbial populations in foods. Scenarios were designed for baseline refrigeration and mild abuse of servings of chicken broiler and ground beef. Our approach employed high-quality data for the microbiology of foods at production, refrigerated storage temperatures, and growth kinetics of microbial populations in culture media. Simple parallel models were developed for exponential growth of multiple pathogens and of the abundant and ubiquitous nonpathogenic indigenous microbiota. Monte Carlo simulations were run for unconstrained growth and for growth with a density-dependent constraint based on the "Jameson effect," inhibition of pathogen growth when the indigenous microbiota reached 10^9 counts per serving. The modes for unconstrained growth of the indigenous microbiota were 10^8, 10^10, and 10^11 counts per serving for chicken broilers, and 10^7, 10^9, and 10^11 counts per serving for ground beef at the respective sites of backroom, meat case, and home refrigeration. Contamination rates and likelihoods of reaching temperatures supporting growth of the pathogens in the baseline refrigeration scenario were rare events. The unconstrained exponential growth models appeared to overestimate L. monocytogenes growth maxima for the baseline refrigeration scenario by 1500-7233% (10^6-10^7 counts/serving) when the inhibitory effects of the indigenous microbiota are ignored. The extreme tails of the distributions for the constrained models appeared to overestimate growth maxima by 110% (10^4-10^5 counts/serving) for Salmonella spp. and 108% (6 x 10^3 counts/serving) for E. coli O157:H7 relative to the extremes of the unconstrained models. The approach of incorporating parallel models for pathogens and the indigenous microbiota into exposure assessment modeling motivates the design of validation studies to test the modeling assumptions, consistent with the analytical-deliberative process of risk analysis.
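A compact sketch of the parallel-growth idea with the Jameson-effect constraint, using invented initial levels and growth rates rather than the data behind the abstract: the pathogen grows exponentially either for the full storage time or only until the indigenous microbiota reaches 10^9 counts per serving, whichever comes first.

```python
import numpy as np

rng = np.random.default_rng(4)

HOURS = 72.0          # retail-to-home storage window (illustrative)
JAMESON_CAP = 1e9     # microbiota level (counts/serving) at which pathogen growth stops
n = 20_000

# Initial contamination and growth rates per serving; all distributions are invented.
n_pathogen  = 10 ** rng.uniform(0.0, 2.0, n)                   # pathogen counts at retail
n_flora     = 10 ** rng.normal(4.0, 0.5, n)                    # indigenous microbiota at retail
mu_pathogen = np.clip(rng.normal(0.08, 0.02, n), 0.0, None)    # 1/h during mild abuse
mu_flora    = np.clip(rng.normal(0.25, 0.05, n), 1e-6, None)   # 1/h during mild abuse

# Unconstrained parallel exponential growth of the pathogen.
unconstrained = n_pathogen * np.exp(mu_pathogen * HOURS)

# Jameson effect: pathogen growth stops when the microbiota reaches the cap.
t_stop = np.clip(np.log(JAMESON_CAP / n_flora) / mu_flora, 0.0, HOURS)
constrained = n_pathogen * np.exp(mu_pathogen * t_stop)

print(f"99th percentile, unconstrained: {np.percentile(unconstrained, 99):.2e} counts/serving")
print(f"99th percentile, constrained:   {np.percentile(constrained, 99):.2e} counts/serving")
```

Capping growth at the time the competing flora reaches the threshold pulls in the upper tail of the pathogen distribution, mirroring the overestimation the abstract attributes to ignoring the indigenous microbiota.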
