Similar Literature
1.
With the increased availability of toxicological hazard information arising from multiple experimental sources, risk assessors are often confronted with the challenge of synthesizing all available scientific information into an analysis. This analysis is further complicated because significant between-source heterogeneity (lab-to-lab variability) is often evident. We estimate benchmark doses using hierarchical models to account for the observed heterogeneity. These models are used to construct source-specific and population-average estimates of the benchmark dose (BMD). This is illustrated with an analysis of the U.S. EPA Region IX reference toxicity database on the effects of sodium chloride on reproduction in Ceriodaphnia dubia. Results show that such models may effectively account for the lab-source heterogeneity while producing BMD estimates that more accurately reflect the variability of the system under study. Failing to account for such heterogeneity may result in estimates with confidence intervals that are overly narrow.
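A minimal two-stage sketch of the idea in Python: fit a dose-response curve per lab, derive a lab-specific BMD from each, then pool the estimates with a crude random-effects step. The data, the linear response model, and the assumed within-lab variance are all illustrative placeholders, not the EPA Region IX data or the authors' full hierarchical model.

```python
import numpy as np

# Hypothetical multi-lab data: dose (g/L NaCl) vs. mean offspring count.
# Labs differ in baseline and slope -- the between-source heterogeneity
# the abstract describes. Values are illustrative only.
rng = np.random.default_rng(0)
doses = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
labs = {f"lab{i}": 25 + rng.normal(0, 3)
                 - (8 + rng.normal(0, 1.5)) * doses
                 + rng.normal(0, 1, doses.size)
        for i in range(5)}

bmr = 0.10  # benchmark response: a 10% drop from the control mean

# Stage 1: per-lab linear fit; the BMD is the dose giving a 10% decline.
bmds = []
for name, y in labs.items():
    slope, intercept = np.polyfit(doses, y, 1)
    bmd = -bmr * intercept / slope   # solves intercept + slope*d = (1-bmr)*intercept
    bmds.append(bmd)
bmds = np.array(bmds)

# Stage 2: crude random-effects pooling (method-of-moments between-lab
# variance), standing in for the full hierarchical model in the abstract.
tau2 = max(bmds.var(ddof=1) - 0.01, 0.0)  # 0.01 ~ assumed within-lab variance
print("lab-specific BMDs:", np.round(bmds, 3))
print("population-average BMD:", round(bmds.mean(), 3),
      " between-lab variance:", round(tau2, 4))
```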

2.
Dose Response for Infection by Escherichia coli O157:H7 from Outbreak Data
In 1996, an outbreak of E. coli O157:H7-associated illness occurred in an elementary school in Japan. This outbreak has been studied in unusual detail, making it an important case for quantitative risk assessment. The availability of stored samples of the contaminated food allowed reliable estimation of exposure to the pathogens. Collection of fecal samples allowed assessment of the numbers infected, including asymptomatic cases. Comparison with other published dose-response studies for E. coli O157:H7 shows that the strain that caused the outbreak studied here must have been considerably more infectious. We use this well-documented incident as an example to demonstrate how such information on the response to a single dose can be used for dose-response assessment. In particular, we demonstrate how the high infectivity limits the uncertainty in the low-dose region.
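A sketch of single-dose dose-response fitting in Python, using the exponential model common in quantitative microbial risk assessment. The dose and attack-rate numbers below are placeholders rather than the outbreak's actual values, and the exponential form is one standard choice, not necessarily the model the authors used.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import binom

# Single-dose outbreak setting: everyone exposed to roughly the same dose,
# infection status known from fecal sampling. All numbers are hypothetical.
dose = 31.0        # CFU per serving, as if estimated from stored food samples
n_exposed = 700
n_infected = 500

# Exponential dose-response model, standard in QMRA:
#   P(infection | dose) = 1 - exp(-r * dose)
def neg_log_lik(log_r):
    p = 1.0 - np.exp(-np.exp(log_r) * dose)
    return -binom.logpmf(n_infected, n_exposed, p)

fit = minimize_scalar(neg_log_lik, bounds=(-10, 2), method="bounded")
r_hat = np.exp(fit.x)
print(f"r = {r_hat:.4f}; P(infection) at 1 CFU = {1 - np.exp(-r_hat):.3f}")
```

With a single observed dose, the binomial likelihood pins down the single parameter r tightly, which is the mechanism behind the abstract's point that high infectivity limits low-dose uncertainty.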

3.
In this article, we develop statistical models to predict the number and geographic distribution of fires caused by earthquake ground motion and tsunami inundation in Japan. Using new, uniquely large, and consistent data sets from the 2011 Tōhoku earthquake and tsunami, we fitted three types of models: generalized linear models (GLMs), generalized additive models (GAMs), and boosted regression trees (BRTs). This is the first time the latter two have been used in this application. A simple conceptual framework guided identification of candidate covariates. Models were then compared based on their out-of-sample predictive power, goodness of fit to the data, ease of implementation, and relative importance of the framework concepts. For the ground motion data set, we recommend a Poisson GAM; for the tsunami data set, a negative binomial (NB) GLM or NB GAM. The best models generate out-of-sample predictions of the total number of ignitions in the region that are accurate to within one or two ignitions. Prefecture-level prediction errors average approximately three. All models demonstrate predictive power far superior to that of four models from the literature that were also tested. A nonlinear relationship is apparent between ignitions and ground motion, so for GLMs, which assume a linear response-covariate relationship, instrumental intensity was the preferred ground motion covariate because it captures part of that nonlinearity. Measures of commercial exposure were preferred over measures of residential exposure for both ground motion and tsunami ignition models. This may vary in other regions, but it nevertheless highlights the value of testing alternative measures for each concept. Models with the best predictive power included two or three covariates.
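A minimal sketch of the count-model comparison in Python with statsmodels, fitting a Poisson GLM and an NB GLM to simulated grid-cell data. The covariates, coefficients, and NB dispersion value are assumptions for illustration; the paper's GAM variants would add smooth terms instead of the linear ones used here.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical grid-cell data: instrumental intensity and commercial
# exposure as covariates, ignition counts as the response.
rng = np.random.default_rng(1)
n = 200
intensity = rng.uniform(4, 9, n)           # instrumental intensity (assumed scale)
commercial = rng.lognormal(0, 1, n)        # commercial exposure (assumed units)
mu = np.exp(-6 + 0.6 * intensity + 0.3 * np.log1p(commercial))
ignitions = rng.poisson(mu)

X = sm.add_constant(np.column_stack([intensity, np.log1p(commercial)]))

# Poisson GLM (a GAM would replace the linear terms with smooths).
poisson_fit = sm.GLM(ignitions, X, family=sm.families.Poisson()).fit()
# Negative binomial GLM, the family recommended for the tsunami data set.
nb_fit = sm.GLM(ignitions, X,
                family=sm.families.NegativeBinomial(alpha=1.0)).fit()

print("Poisson AIC:", round(poisson_fit.aic, 1),
      " NB AIC:", round(nb_fit.aic, 1))
```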

4.
Slob, W., Pieters, M. N. Risk Analysis, 1998, 18(6): 787-798
The use of uncertainty factors in the standard method for deriving acceptable intake or exposure limits for humans, such as the Reference Dose (RfD), may be viewed as a conservative method of taking various uncertainties into account. As an obvious alternative, the use of uncertainty distributions instead of uncertainty factors is gaining attention. This paper presents a comprehensive discussion of a general framework that quantifies both the uncertainties in the no-adverse-effect level in the animal (using a benchmark-like approach) and the uncertainties in the various extrapolation steps involved (using uncertainty distributions). This approach results in an uncertainty distribution for the no-adverse-effect level in the sensitive human subpopulation, reflecting the overall scientific uncertainty associated with that level. A lower percentile of this distribution may be regarded as an acceptable exposure limit (e.g., RfD) that takes account of the various uncertainties in a nonconservative fashion. The same methodology may also be used as a tool to derive a distribution for possible human health effects at a given exposure level. We argue that in a probabilistic approach the uncertainty in the estimated no-adverse-effect level in the animal should be explicitly taken into account. Not only is this source of uncertainty too large to be ignored, it also has repercussions for the quantification of the other uncertainty distributions.
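A Monte Carlo sketch of the framework in Python: propagate an uncertainty distribution for the animal no-adverse-effect level through lognormal uncertainty distributions for each extrapolation step, then read off a lower percentile. All distribution parameters are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Benchmark-style uncertainty in the animal no-adverse-effect level (mg/kg/day).
nael_animal = rng.lognormal(mean=np.log(10.0), sigma=0.4, size=n)

# Extrapolation steps as uncertainty *distributions*, replacing the
# conventional fixed 10-fold factors (medians and spreads assumed).
interspecies = rng.lognormal(np.log(4.0), 0.6, n)   # animal -> average human
intraspecies = rng.lognormal(np.log(3.0), 0.5, n)   # average -> sensitive human

nael_sensitive_human = nael_animal / (interspecies * intraspecies)

# A lower percentile of the resulting distribution can serve as the limit.
print("5th percentile (RfD-like limit):",
      round(np.percentile(nael_sensitive_human, 5), 3), "mg/kg/day")
```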

5.
Determining the least distance to the efficient frontier for estimating technical inefficiency, with the consequent determination of closest targets, has been one of the relevant issues in the recent Data Envelopment Analysis literature. This new paradigm contrasts with traditional approaches, which yield furthest targets. In this respect, several techniques have been proposed to implement the new paradigm. One group of these techniques is based on identifying all the efficient faces of the polyhedral production possibility set and is therefore associated with solving an NP-hard problem. A second group proposes different models and particular algorithms that avoid the explicit identification of all these faces. These techniques have been applied more or less successfully. Nonetheless, the new paradigm remains unsatisfactory and incomplete to a certain extent. One open challenge concerns measuring technical inefficiency in the context of oriented models, i.e., models that aim at changing inputs or outputs but not both. In this paper, we show that existing techniques for determining the least distance without explicitly identifying the frontier structure, developed for graph measures that change inputs and outputs at the same time, do not work for oriented models. Consequently, a new methodology for handling these situations satisfactorily is proposed. Finally, the new approach is empirically tested using a recent PISA database consisting of 902 schools.
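For context, a minimal Python sketch of the traditional oriented model the paper contrasts with: the radial input-oriented (CCR) envelopment program, solved as a linear program. The school-style data are invented, and this baseline yields furthest targets; the paper's contribution, least-distance targets for oriented models, is not implemented here.

```python
import numpy as np
from scipy.optimize import linprog

# Made-up data, loosely school-flavored:
# inputs = teachers, budget; output = a PISA-like score. Columns are DMUs.
X = np.array([[8, 6, 10, 9],
              [5, 4,  7, 8]], float)        # inputs  (m x n)
Y = np.array([[500, 480, 520, 470]], float)  # outputs (s x n)
m, n = X.shape
s = Y.shape[0]

def ccr_input_oriented(k):
    """Efficiency of DMU k: min theta s.t. X@lam <= theta*x_k, Y@lam >= y_k, lam >= 0."""
    c = np.r_[1.0, np.zeros(n)]              # decision vars: [theta, lam_1..lam_n]
    A_in = np.hstack([-X[:, [k]], X])        # X@lam - theta*x_k <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])  # -Y@lam <= -y_k
    A = np.vstack([A_in, A_out])
    b = np.r_[np.zeros(m), -Y[:, k]]
    res = linprog(c, A_ub=A, b_ub=b,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

for k in range(n):
    print(f"DMU {k}: theta = {ccr_input_oriented(k):.3f}")
```

The radial projection theta*x_k is, in general, the furthest efficient target in the input direction; the least-distance literature replaces it with the closest point of the efficient frontier.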

6.
Risk Analysis, 2018, 38(7): 1490-1501
Several epidemiological studies have demonstrated an association between occupational benzene exposure and increased leukemia risk, in particular acute myeloid leukemia (AML). However, there is still uncertainty about the risk to the general population from exposure to lower environmental levels of benzene. To estimate the excess risk of leukemia from low-dose benzene exposure, various methods for incorporating epidemiological data in quantitative risk assessment were utilized. Tobacco smoke was identified as one of the main potential sources of benzene exposure and was the focus of this exposure assessment, allowing further investigation of the role of benzene in smoking-induced leukemia. Potency estimates for benzene were generated from individual occupational studies and meta-analysis data, and an exposure assessment for two smoking subgroups (light and heavy smokers) was carried out. Various techniques, including life-table analysis, were then used to evaluate both the excess lifetime risk and the contribution of benzene to smoking-induced leukemia and AML. The excess lifetime risk for smokers was estimated at between two and six additional leukemia deaths per 10,000 and one to three additional AML deaths per 10,000. The contribution of benzene to smoking-induced leukemia was estimated at between 9% and 24% (upper CL 14–31%). For AML, this contribution was estimated as 11–30% (upper CL 22–60%). From the assessments carried out here, it appears there is an increased risk of leukemia from low-level exposure to benzene and that benzene may contribute up to a third of smoking-induced leukemia. Methods with varying degrees of complexity generated comparable results.
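A simplified life-table sketch in Python of how an excess lifetime risk can be computed: apply a relative risk to the cause-specific hazard during the exposed years and difference the exposed and unexposed lifetime risks. The hazards, relative risk, and exposure window below are crude illustrative assumptions, not the study's inputs.

```python
import numpy as np

ages = np.arange(0, 86)
h_all = 0.0002 * np.exp(0.085 * ages)      # assumed all-cause mortality hazard
h_leuk = 3e-5 * np.ones_like(ages, float)  # assumed leukemia mortality hazard
rr = 1.4                                   # assumed RR from benzene in smoke
exposed = (ages >= 20) & (ages < 65)       # assumed smoking years

def lifetime_leukemia_deaths(rr_leuk):
    surv, risk = 1.0, 0.0
    for i in range(ages.size):
        h_l = h_leuk[i] * (rr_leuk if exposed[i] else 1.0)
        h_tot = h_all[i] - h_leuk[i] + h_l
        risk += surv * h_l                 # leukemia deaths accrued this year
        surv *= np.exp(-h_tot)             # survive all causes to next year
    return risk

excess = lifetime_leukemia_deaths(rr) - lifetime_leukemia_deaths(1.0)
print(f"excess lifetime leukemia deaths per 10,000: {excess * 1e4:.1f}")
```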

7.
This article proposes a methodology for incorporating electrical component failure data into the human error assessment and reduction technique (HEART) for estimating human error probabilities (HEPs). The existing HEART method contains factors known as error-producing conditions (EPCs) that adjust a generic HEP to the specific situation being assessed. The selection and proportioning of these EPCs are at the discretion of an assessor and are therefore subject to the assessor's experience and potential bias. This dependence on expert opinion is prevalent in similar HEP assessment techniques used in numerous industrial areas. The proposed method incorporates factors based on observed trends in electrical component failures to produce a revised HEP that can trigger risk mitigation actions more effectively, based on the presence of component categories or other hazardous conditions that have a history of failure due to human error. The data used for the additional factors come from an analysis of failures of electronic components experienced during system integration and testing at NASA Goddard Space Flight Center. The analysis includes the determination of root failure mechanisms and trend analysis. The major causes of these defects were attributed to electrostatic damage, electrical overstress, mechanical overstress, or thermal overstress. These factors, representing user-induced defects, are quantified and incorporated into specific hardware factors based on the system's electrical parts list. This proposed methodology is demonstrated with an example comparing the original HEART method and the proposed modified technique.
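A minimal Python sketch of a HEART calculation, extended with a hypothetical hardware factor of the kind the article proposes. The generic error probability (GEP) and EPC adjustment follow the standard HEART scheme; the hardware factor value is an illustrative assumption, not the article's calibrated number.

```python
# Standard HEART adjustment: HEP = GEP * prod[(EPC_max - 1) * APOA + 1],
# where APOA is the assessor's proportion-of-affect weight for each EPC.
def heart_hep(gep, epcs, hardware_factor=1.0):
    """epcs: list of (max_multiplier, assessed_proportion_of_affect) pairs."""
    hep = gep
    for max_mult, apoa in epcs:
        hep *= (max_mult - 1.0) * apoa + 1.0
    return min(hep * hardware_factor, 1.0)   # cap the probability at 1

# Example: a routine, highly practised task (GEP ~ 0.02), with time shortage
# (EPC x11 weighted 0.4) and operator inexperience (x3 weighted 0.3), on a
# build containing ESD-sensitive parts (assumed hardware factor of 1.3).
hep = heart_hep(0.02, [(11, 0.4), (3, 0.3)], hardware_factor=1.3)
print(f"assessed HEP = {hep:.3f}")
```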

8.
The problem of extrapolating effects of reproductive toxins on experimental animals to predict the doses that would produce infertility in human males is discussed using published data on the effects of testosterone and estradiol on sperm production in the rat, rabbit, rhesus monkey, ram, stallion, and human. This analysis indicates that the dose of testosterone that reduces human sperm counts by a given percentage is best calculated using the dose administered to laboratory animals expressed on the basis of body weight, as opposed to some other parameter such as body surface area. A survey of the available data in the literature indicates the incompleteness of the data set and the specific information needed to improve the basis for extrapolation. Nevertheless, we can predict from studies on laboratory animals the dose of testosterone necessary to reduce sperm counts in humans to within a factor of 2.
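A short Python sketch of the two scaling rules the abstract compares. The abstract favors body-weight (mg/kg) scaling; the usual alternative is body-surface-area scaling, where total dose scales as body weight to the 2/3 power. Body weights are typical values and the rat dose is a placeholder, not data from the cited studies.

```python
bw_rat, bw_human = 0.35, 70.0   # typical body weights, kg
dose_rat = 2.0                  # mg/kg/day in the rat (illustrative)

# Body-weight scaling: the same mg/kg dose applies across species.
dose_human_bw = dose_rat

# Surface-area scaling: total dose ~ BW^(2/3), so the per-kg dose scales
# by (BW_rat / BW_human)^(1/3).
dose_human_sa = dose_rat * (bw_rat / bw_human) ** (1.0 / 3.0)

print(f"human dose, body-weight scaling:  {dose_human_bw:.2f} mg/kg/day")
print(f"human dose, surface-area scaling: {dose_human_sa:.2f} mg/kg/day")
```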

9.
Using data for 1990-2008, this article is the first to examine the direction and the long- and short-run mechanisms of human capital's influence on the trade competitiveness of China's producer service industries, with a comparative analysis across industries. The results show that, in the long run, human capital significantly improves the trade competitiveness of the transport, insurance, finance, computer and information, and royalties and license fees industries, with the largest improvement in finance. In the short run, human capital is positively correlated with the trade competitiveness of the insurance and finance industries and negatively correlated with that of the transport, computer and information, and royalties and license fees industries; the positive effect is again largest for finance, while the negative effect is most pronounced for computer and information services. The article concludes with policy recommendations tailored to the characteristics of each industry.
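The abstract reports separate long-run and short-run effects, which is the signature of a cointegration/error-correction analysis, but it does not name its estimator. Purely as an assumption, here is a generic Engle-Granger two-step error-correction sketch in Python on simulated series, not a reconstruction of the authors' model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
T = 60
hc = np.cumsum(rng.normal(0.5, 1, T))   # simulated "human capital" index, I(1)
tc = 0.8 * hc + rng.normal(0, 1, T)     # simulated trade-competitiveness index

# Step 1: long-run (levels) regression; residuals measure disequilibrium.
long_run = sm.OLS(tc, sm.add_constant(hc)).fit()
ect = long_run.resid

# Step 2: short-run dynamics in differences, with the lagged error-correction term.
d_tc, d_hc = np.diff(tc), np.diff(hc)
X = sm.add_constant(np.column_stack([d_hc, ect[:-1]]))
ecm = sm.OLS(d_tc, X).fit()

print("long-run effect: ", round(long_run.params[1], 3))
print("short-run effect:", round(ecm.params[1], 3),
      " adjustment speed:", round(ecm.params[2], 3))
```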
