Similar Documents
20 similar documents found
1.
For noncancer effects, the degree of human interindividual variability plays a central role in determining the risk that can be expected at low exposures. This discussion reviews available data on observations of interindividual variability in (a) breathing rates, based on observations in British coal miners; (b) systemic pharmacokinetic parameters, based on studies of a number of drugs; (c) susceptibility to neurological effects from fetal exposure to methyl mercury, based on observations of the incidence of effects in relation to hair mercury levels; and (d) chronic lung function changes in relation to long-term exposure to cigarette smoke. The quantitative ranges of predictions that follow from uncertainties in estimates of interindividual variability in susceptibility are illustrated.

2.
Part of the explanation for the persistent epidemiological findings of associations between mortality and morbidity with relatively modest ambient exposures to airborne particles may be that some people are much more susceptible to particle-induced responses than others. This study assembled a database of quantitative observations of interindividual variability in pharmacokinetic and pharmacodynamic parameters likely to affect particle response. The pharmacodynamic responses studied included data drawn from epidemiologic studies of doses of methacholine, flour dust, and other agents that induce acute changes in lung function. In general, the amount of interindividual variability in several of these pharmacodynamic response parameters was greater than the variability in pharmacokinetic (breathing rate, deposition, and clearance) parameters. Quantitatively, the results indicated that human interindividual variability of breathing rates and major pharmacokinetic parameters (total deposition and tracheobronchial clearance) was in the region of Log(GSD) = 0.1 to 0.2 (corresponding to geometric standard deviations of 10^0.1 to 10^0.2, or 1.26 to 1.58). Deposition to the deep lung (alveolar region) appeared to be somewhat more variable: Log(GSD) of about 0.3 (GSD of about 2). Among pharmacodynamic parameters, changes in FEV1 in response to ozone and metabisulfite (an agent that is said to act primarily on neural receptors in the lung) were in the region of Log(GSD) of 0.2 to 0.4. However, similar responses to methacholine, an agent that acts on smooth muscle, seemed to have still more variability (0.4 to somewhat over 1.0, depending on the type of population studied). Similarly high values were suggested for particulate allergens. Central estimates of this kind of variability, and the close correspondence of the data to lognormal distributions, indicate that 99.9th percentile individuals are likely to respond at doses that are 150- to 450-fold less than would be needed in median individuals. It seems plausible that acute responses with this amount of variability could form part of the mechanistic basis for epidemiological observations of enhanced mortality in relation to ambient exposures to fine particles.
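As a worked illustration of the percentile arithmetic behind the 150- to 450-fold statement: for a lognormal distribution, the 99.9th-percentile individual lies about 3.09 standard deviations from the median, so the fold-difference in dose is 10 raised to (3.09 × Log10(GSD)). The Log(GSD) values below are illustrative choices spanning the ranges quoted above, not values taken from the study's database.

```python
from scipy.stats import norm

z_999 = norm.ppf(0.999)  # ~3.09 standard deviations above the median

# Illustrative Log10(GSD) values spanning the pharmacodynamic range quoted above
for log_gsd in (0.2, 0.4, 0.7, 0.86):
    # For a lognormal distribution, the 99.9th-percentile individual responds
    # at a dose lower than the median individual's by this factor:
    fold = 10 ** (z_999 * log_gsd)
    print(f"Log10(GSD) = {log_gsd:>4}: responds at ~{fold:5.0f}-fold lower dose than the median")
```

With Log10(GSD) around 0.7 to 0.86, the computed fold-differences fall roughly in the 150 to 450 range cited in the abstract.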

3.
An analysis of the uncertainty in guidelines for the ingestion of methylmercury (MeHg) due to human pharmacokinetic variability was conducted using a physiologically based pharmacokinetic (PBPK) model that describes MeHg kinetics in the pregnant human and fetus. Two alternative derivations of an ingestion guideline for MeHg were considered: the U.S. Environmental Protection Agency reference dose (RfD) of 0.1 μg/kg/day derived from studies of an Iraqi grain poisoning episode, and the Agency for Toxic Substances and Disease Registry chronic oral minimal risk level (MRL) of 0.5 μg/kg/day based on studies of a fish-eating population in the Seychelles Islands. Calculation of an ingestion guideline for MeHg from either of these epidemiological studies requires calculation of a dose conversion factor (DCF) relating a hair mercury concentration to a chronic MeHg ingestion rate. To evaluate the uncertainty in this DCF across the population of U.S. women of child-bearing age, Monte Carlo analyses were performed in which distributions for each of the parameters in the PBPK model were randomly sampled 1000 times. The 1st and 5th percentiles of the resulting distribution of DCFs were a factor of 1.8 and 1.5 below the median, respectively. This estimate of variability is consistent with, but somewhat less than, previous analyses performed with empirical, one-compartment pharmacokinetic models. The use of a consistent factor in both guidelines of 1.5 for pharmacokinetic variability in the DCF, and keeping all other aspects of the derivations unchanged, would result in an RfD of 0.2 μg/kg/day and an MRL of 0.3 μg/kg/day.
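The Monte Carlo step can be sketched with a one-compartment steady-state approximation rather than the paper's pregnancy PBPK model. Every distribution and parameter value below is an illustrative assumption, chosen only so the median dose conversion factor comes out near the familiar ~0.1 μg/kg/day of intake per μg/g hair; it is not the authors' parameterization.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000  # number of simulated women of child-bearing age

# Assumed lognormal parameter distributions for a one-compartment steady-state
# approximation -- NOT the PBPK model of the paper.
b  = rng.lognormal(np.log(0.014), 0.15, n)  # elimination rate constant (1/day)
V  = rng.lognormal(np.log(5.0),   0.10, n)  # blood volume (L)
R  = rng.lognormal(np.log(0.25),  0.10, n)  # hair:blood partition ((ug/g) per (ug/L))
f  = rng.lognormal(np.log(0.05),  0.10, n)  # fraction of absorbed dose in blood
bw = rng.lognormal(np.log(60.0),  0.15, n)  # body weight (kg)
A  = 0.95                                   # fraction of ingested MeHg absorbed

# Dose conversion factor: chronic intake (ug/kg/day) per 1 ug/g hair mercury
dcf = (1.0 / R) * b * V / (A * f * bw)

median = np.median(dcf)
p05, p01 = np.percentile(dcf, [5, 1])
print(f"median DCF = {median:.3g} ug/kg/day per ug/g hair")
print(f"median / 5th pct = {median / p05:.2f}, median / 1st pct = {median / p01:.2f}")
```

With these assumed spreads the ratios of the median to the lower percentiles come out in the neighborhood of the 1.5 and 1.8 factors reported in the abstract.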

4.
Environmental tobacco smoke (ETS) is a major contributor to indoor human exposures to fine particulate matter of 2.5 μm or smaller (PM2.5). The Stochastic Human Exposure and Dose Simulation for Particulate Matter (SHEDS-PM) Model developed by the U.S. Environmental Protection Agency estimates distributions of outdoor and indoor PM2.5 exposure for a specified population based on ambient concentrations and indoor emissions sources. A critical assessment was conducted of the methodology and data used in SHEDS-PM for estimation of indoor exposure to ETS. For the residential microenvironment, SHEDS uses a mass-balance approach, which is comparable to best practices. The default inputs in SHEDS-PM were reviewed and more recent and extensive data sources were identified. Sensitivity analysis was used to determine which inputs should be prioritized for updating. Data regarding the proportion of smokers and "other smokers" and cigarette emission rate were found to be important. SHEDS-PM does not currently account for in-vehicle ETS exposure; however, in-vehicle ETS-related PM2.5 levels can exceed those in residential microenvironments by a factor of 10 or more. Therefore, a mass-balance-based methodology for estimating in-vehicle ETS PM2.5 concentration is evaluated. Recommendations are made regarding updating of input data and algorithms related to ETS exposure in the SHEDS-PM model. Interindividual variability for ETS exposure was quantified. Geographic variability in ETS exposure was quantified based on the varying prevalence of smokers in five selected locations in the United States.
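At steady state, the residential mass-balance approach mentioned above reduces to a single expression for the indoor concentration. The sketch below uses that simplified form; the penetration factor, air exchange rate, deposition rate, house volume, and per-cigarette emission rate are assumed example values, not SHEDS-PM defaults, and SHEDS-PM itself solves the balance dynamically.

```python
def indoor_pm25_steady_state(c_out, ach, penetration, deposition, emission_ug_hr, volume_m3):
    """Steady-state indoor PM2.5 (ug/m3) for a single well-mixed residence.

    c_out          -- outdoor PM2.5 concentration (ug/m3)
    ach            -- air exchange rate (1/hr)
    penetration    -- fraction of outdoor PM2.5 penetrating the building envelope
    deposition     -- indoor deposition (decay) rate (1/hr)
    emission_ug_hr -- indoor source strength, e.g. cigarette ETS emissions (ug/hr)
    volume_m3      -- house volume (m3)
    """
    return (penetration * ach * c_out + emission_ug_hr / volume_m3) / (ach + deposition)

# Assumed example: roughly 14 mg of PM2.5 emitted per cigarette, one cigarette
# smoked per hour in a 350 m3 home (values are illustrative only).
c = indoor_pm25_steady_state(c_out=12.0, ach=0.5, penetration=0.8,
                             deposition=0.2, emission_ug_hr=14_000.0, volume_m3=350.0)
print(f"indoor PM2.5 with one cigarette per hour: {c:.0f} ug/m3")
```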

5.
In earlier work we assembled a database of classical pharmacokinetic parameters (e.g., elimination half-lives; volumes of distribution) in children and adults. These data were then analyzed to define mean differences between adults and children of various age groups. In this article, we first analyze the variability in half-life observations where individual data exist. The major findings are as follows. The age groups defined in the earlier analysis of arithmetic mean data (0-1 week premature; 0-1 week full term; 1 week to 2 months; 2-6 months; 6 months to 2 years; 2-12 years; and 12-18 years) are reasonable for depicting child/adult pharmacokinetic differences, but data for some of the earliest age groups are highly variable. The fraction of individual children's half-lives observed to exceed the adult mean half-life by more than the 3.2-fold uncertainty factor commonly attributed to interindividual pharmacokinetic variability is 27% (16/59) for the 0-1 week age group, and 19% (5/26) in the 1 week to 2 month age group, compared to 0/87 for all the other age groups combined between 2 months and 18 years. Children within specific age groups appear to differ from adults with respect to the amount of variability and the form of the distribution of half-lives across the population. The data indicate departure from simple unimodal distributions, particularly in the 1 week to 2 month age group, suggesting that key developmental steps affecting drug removal tend to occur in that period. Finally, in preparation for age-dependent physiologically-based pharmacokinetic modeling, nationally representative NHANES III data are analyzed for distributions of body size and fat content. The data from about age 3 to age 10 reveal important departures from simple unimodal distributional forms, in the direction suggesting a subpopulation of children that are markedly heavier than those in the major mode. For risk assessment modeling, this means that analysts will need to consider "mixed" distributions (e.g., two or more normal or log-normal modes) in which the proportions of children falling within the major versus high-weight/fat modes in the mixture change as a function of age. Biologically, the most natural interpretation of this is that these subpopulations represent children who have or have not yet received particular signals for change in growth pattern. These apparently distinct subpopulations would be expected to exhibit different disposition of xenobiotics, particularly those that are highly lipophilic and poorly metabolized.
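A generative sketch of the "mixed" distribution idea in the last paragraph: body weight is drawn from a two-component lognormal mixture whose heavier-mode proportion grows with age. All geometric means, spreads, and mixing proportions below are invented placeholders for illustration, not NHANES III estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_body_weight(age_years, n, rng):
    """Two-mode lognormal mixture for children's body weight (kg).

    The fraction of children in the heavier mode grows with age; the geometric
    means and geometric standard deviations are illustrative placeholders only.
    """
    p_heavy = min(0.05 + 0.02 * age_years, 0.35)        # mixing proportion (assumed)
    gm_main, gsd_main = 16.0 + 2.5 * age_years, 1.15     # main mode (assumed)
    gm_heavy, gsd_heavy = 1.4 * gm_main, 1.20            # high-weight/fat mode (assumed)
    heavy = rng.random(n) < p_heavy
    gm = np.where(heavy, gm_heavy, gm_main)
    gsd = np.where(heavy, gsd_heavy, gsd_main)
    return rng.lognormal(np.log(gm), np.log(gsd), n)

weights = sample_body_weight(age_years=8, n=10_000, rng=rng)
print(f"age 8: median {np.median(weights):.1f} kg, 95th pct {np.percentile(weights, 95):.1f} kg")
```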

6.
Human variability is a very important factor considered in human health risk assessment for protecting sensitive populations from chemical exposure. Traditionally, to account for this variability, an interhuman uncertainty factor is applied to lower the exposure limit. However, a fixed uncertainty factor cannot readily support the probabilistic risk assessment advocated by a number of researchers; new methods are needed to quantify human population variability probabilistically. We propose a Bayesian hierarchical model to quantify variability among different populations. This approach jointly characterizes the distribution of risk at background exposure and the sensitivity of response to exposure, which are commonly represented by model parameters. We demonstrate, through both an application to real data and a simulation study, that using the proposed hierarchical structure adequately characterizes variability across different populations.
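One way to see what the hierarchical structure buys is to simulate from it: population-specific background risks and dose sensitivities are drawn from shared hyper-distributions, and the spread of those draws is precisely the across-population variability being quantified. The logistic dose-response form and all hyper-parameters below are illustrative assumptions, not the authors' model or priors.

```python
import numpy as np
from scipy.special import expit  # inverse-logit

rng = np.random.default_rng(42)

# Hyper-parameters (assumed): across-population distributions of the logit
# background risk (alpha) and the sensitivity of response to dose (beta).
mu_alpha, sd_alpha = -4.0, 0.8
mu_beta,  sd_beta  =  0.5, 0.3

n_pop, n_per_pop = 12, 200
dose = rng.uniform(0.0, 5.0, size=(n_pop, n_per_pop))

# Population-level parameters drawn from the hyper-distributions
alpha = rng.normal(mu_alpha, sd_alpha, n_pop)
beta  = rng.normal(mu_beta,  sd_beta,  n_pop)

# Individual binary responses within each population (logistic dose-response)
p = expit(alpha[:, None] + beta[:, None] * dose)
y = rng.binomial(1, p)

# Variability across populations in the risk at background exposure (dose = 0)
background_risk = expit(alpha)
print("background risk by population:", np.round(background_risk, 3))
print(f"fold range across populations: {background_risk.max() / background_risk.min():.1f}")
```

Fitting the model reverses this direction: the data `y` and `dose` are observed, and the hyper-parameters and population-level parameters are estimated jointly.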

7.
Typical exposures to lead often involve a mix of long-term exposures to relatively constant exposure levels (e.g., residential yard soil and indoor dust) and highly intermittent exposures at other locations (e.g., seasonal recreational visits to a park). These types of exposures can be expected to result in blood lead concentrations that vary on a temporal scale with the intermittent exposure pattern. Prediction of short-term (or seasonal) blood lead concentrations arising from highly variable intermittent exposures requires a model that can reliably simulate lead exposures and biokinetics on a temporal scale that matches that of the exposure events of interest. If exposure model averaging times (EMATs) of the model exceed the shortest exposure duration that characterizes the intermittent exposure, uncertainties will be introduced into risk estimates because the exposure concentration used as input to the model must be time averaged to account for the intermittent nature of the exposure. We have used simulation as a means of determining the potential magnitude of these uncertainties. Simulations using models having various EMATs have allowed exploration of the strengths and weaknesses of various approaches to time averaging of exposures and impact on risk estimates associated with intermittent exposures to lead in soil. The International Commission on Radiological Protection (ICRP) model of lead pharmacokinetics in humans simulates lead intakes that can vary in intensity over time spans as small as one day, allowing for the simulation of intermittent exposures to lead as a series of discrete daily exposure events. The ICRP model was used to compare the outcomes (blood lead concentration) of various time-averaging adjustments for approximating the time-averaged intake of lead associated with various intermittent exposure patterns. Results of these analyses suggest that standard approaches to time averaging (e.g., U.S. EPA) that estimate the long-term daily exposure concentration can, in some cases, result in substantial underprediction of short-term variations in blood lead concentrations when used in models that operate with EMATs exceeding the shortest exposure duration that characterizes the intermittent exposure. Alternative time-averaging approaches recommended for use in lead risk assessment more reliably predict short-term periodic (e.g., seasonal) elevations in blood lead concentration that might result from intermittent exposures. In general, risk estimates will be improved by simulation on shorter time scales that more closely approximate the actual temporal dynamics of the exposure.
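A toy first-order model (not the ICRP biokinetic model) illustrates the averaging-time point: simulating the intake day by day preserves the short-term peak in body burden that an annually averaged intake smooths away. The half-life, intake levels, and exposure pattern below are arbitrary illustrative choices.

```python
import numpy as np

half_life_days = 28.0                     # illustrative elimination half-life
k = np.log(2) / half_life_days            # first-order elimination rate (1/day)
days = np.arange(365)

# Intermittent pattern: a 30-day summer visit with 10x the baseline intake
baseline, visit = 1.0, 10.0               # arbitrary intake units/day
intake_daily = np.where((days >= 180) & (days < 210), visit, baseline)
intake_annual_avg = np.full(len(days), intake_daily.mean())

def one_compartment(intake, k):
    """Body burden from daily intakes with first-order elimination (toy model)."""
    burden = np.zeros(len(intake))
    for t in range(1, len(intake)):
        burden[t] = burden[t - 1] * np.exp(-k) + intake[t]
    return burden

peak_daily = one_compartment(intake_daily, k).max()
peak_avg = one_compartment(intake_annual_avg, k).max()
print(f"peak burden with daily EMAT: {peak_daily:.1f}")
print(f"peak burden with annually averaged intake: {peak_avg:.1f}")
```

The annually averaged run reproduces the long-term average burden but misses the seasonal peak by severalfold, which is the underprediction the abstract describes.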

8.
An integrated, quantitative approach to incorporating both uncertainty and interindividual variability into risk prediction models is described. Individual risk R is treated as a variable distributed in both an uncertainty dimension and a variability dimension, whereas population risk I (the number of additional cases caused by R) is purely uncertain. I is shown to follow a compound Poisson-binomial distribution, which in low-level risk contexts can often be approximated well by a corresponding compound Poisson distribution. The proposed analytic framework is illustrated with an application to cancer risk assessment for a California population exposed to 1,2-dibromo-3-chloropropane from ground water.
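A small simulation sketch of the framework: individual risk R is sampled in a variability dimension (across people) nested inside an uncertainty dimension (across draws of an uncertain population median), and the population case count I is binomial given the sampled risks, which at these low risks is close to the compound Poisson approximation. All distributions and population sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
n_people = 20_000
n_uncertainty_draws = 1_000

cases = np.empty(n_uncertainty_draws)
for i in range(n_uncertainty_draws):
    # Uncertainty dimension: the population-median individual risk is uncertain
    median_risk = rng.lognormal(np.log(1e-6), 0.5)
    # Variability dimension: individual risks spread lognormally around that median
    r = np.minimum(rng.lognormal(np.log(median_risk), 1.0, n_people), 1.0)
    # Population risk I: additional cases given this draw (binomial; ~Poisson at low risk)
    cases[i] = rng.binomial(1, r).sum()

print(f"expected additional cases: {cases.mean():.3f}")
print(f"95th percentile of the uncertainty distribution of I: {np.percentile(cases, 95):.0f}")
```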

9.
Adam M. Finkel, Risk Analysis, 2014, 34(10): 1785-1794
If exposed to an identical concentration of a carcinogen, every human being would face a different level of risk, determined by his or her genetic, environmental, medical, and other uniquely individual characteristics. Various lines of evidence indicate that this susceptibility variable is distributed rather broadly in the human population, with perhaps a factor of 25- to 50-fold between the center of this distribution and either of its tails, but cancer risk assessment at the EPA and elsewhere has always treated every (adult) human as identically susceptible. The National Academy of Sciences "Silver Book" concluded that EPA and the other agencies should fundamentally correct their mis-computation of carcinogenic risk in two ways: (1) adjust individual risk estimates upward to provide information about the upper tail; and (2) adjust population risk estimates upward (by about sevenfold) to correct an underestimation due to a mathematical property of the interindividual distribution of human susceptibility, in which the susceptibility averaged over the entire (right-skewed) population exceeds the median value for the typical human. In this issue of Risk Analysis, Kenneth Bogen disputes the second adjustment and endorses the first, though he also relegates the problem of underestimated individual risks to the realm of "equity concerns" that he says should have little if any bearing on risk management policy. In this article, I show why the basis for the population risk adjustment that the NAS recommended is correct—that current population cancer risk estimates, whether they are derived from animal bioassays or from human epidemiologic studies, likely provide estimates of the median with respect to human variation, which in turn must be an underestimate of the mean. If cancer risk estimates have larger "conservative" biases embedded in them, a premise I have disputed in many previous writings, such a defect would not excuse ignoring this additional bias in the direction of underestimation. I also demonstrate that sensible, legally appropriate, and ethical risk policy must not only inform the public when the tail of the individual risk distribution extends into the "high-risk" range, but must alter benefit-cost balancing to account for the need to try to reduce these tail risks preferentially.
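The mathematical property at issue is simply that a right-skewed susceptibility distribution has a mean above its median; for a lognormal distribution the ratio is exp((ln GSD)^2 / 2). The GSD values in this sketch are illustrative choices consistent with a roughly 25- to 50-fold spread between the median and a tail defined here as ~2 geometric standard deviations out, and they show how a correction on the order of sevenfold can arise.

```python
import numpy as np

# Susceptibility assumed lognormal; the "tail" is taken here as ~2 geometric SDs
# from the median (an illustrative convention, not the NAS definition).
for gsd in (5.0, 7.0):
    sigma = np.log(gsd)
    tail_fold = gsd ** 2                       # ~97.7th-percentile person vs. median person
    mean_over_median = np.exp(sigma ** 2 / 2)  # population mean vs. median susceptibility
    print(f"GSD {gsd}: tail/median ~ {tail_fold:.0f}-fold, "
          f"mean exceeds median by {mean_over_median:.1f}-fold")
```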

10.
In the days following the collapse of the World Trade Center (WTC) towers on September 11, 2001 (9/11), the U.S. Environmental Protection Agency (EPA) initiated numerous air monitoring activities to better understand the ongoing impact of emissions from that disaster. Using these data, EPA conducted an inhalation exposure and human health risk assessment to the general population. This assessment does not address exposures and potential impacts that could have occurred to rescue workers, firefighters, and other site workers, nor does it address exposures that could have occurred in the indoor environment. Contaminants evaluated include particulate matter (PM), metals, polychlorinated biphenyls, dioxins, asbestos, volatile organic compounds, particle-bound polycyclic aromatic hydrocarbons, silica, and synthetic vitreous fibers (SVFs). This evaluation yielded three principal findings. (1) Persons exposed to extremely high levels of ambient PM and its components, SVFs, and other contaminants during the collapse of the WTC towers, and for several hours afterward, were likely to be at risk for acute and potentially chronic respiratory effects. (2) Available data suggest that contaminant concentrations within and near ground zero (GZ) remained significantly elevated above background levels for a few days after 9/11. Because only limited data on these critical few days were available, exposures and potential health impacts could not be evaluated with certainty for this time period. (3) Except for inhalation exposures that may have occurred on 9/11 and a few days afterward, the ambient air concentration data suggest that persons in the general population were unlikely to suffer short-term or long-term adverse health effects caused by inhalation exposures. While this analysis by EPA evaluated the potential for health impacts based on measured air concentrations, epidemiological studies conducted by organizations other than EPA have attempted to identify actual impacts. Such studies have identified respiratory effects in worker and general populations, and developmental effects in newborns whose mothers were near GZ on 9/11 or shortly thereafter. While researchers are not able to identify specific times and even exactly which contaminants are the cause of these effects, they have nonetheless concluded that exposure to WTC contaminants (and/or maternal stress, in the case of developmental effects) resulted in these effects, and have identified the time period including 9/11 itself and the days and few weeks afterward as a period of most concern based on high concentrations of key pollutants in the air and dust.

11.
We investigate, through modeling, the impact of interindividual heterogeneity in the metabolism of 4-aminobiphenyl (ABP) and in physiological factors on human cancer risk. A physiological pharmacokinetic model was used to quantify the time course of the formation of the proximate carcinogen, N-hydroxy-4-ABP and the DNA-binding of the active species in the bladder. The metabolic and physiologic model parameters were randomly varied, via Monte Carlo simulations, to reproduce interindividual variability. The sampling means for most parameters were scaled from values developed by Kadlubar et al. (Cancer Res., 51:4371, 1991) for dogs; variances were obtained primarily from published human data (e.g., measurements of ABP N-oxidation, and arylamine N-acetylation in human liver tissue). In 500 simulations, theoretically representing 500 humans, DNA-adduct levels in the bladder of the most susceptible individuals are ten thousand times higher than for the least susceptible, and the 5th and 95th percentiles differ by a factor of 160. DNA binding for the most susceptible individual (with low urine pH, low N-acetylation and high N-oxidation activities) is theoretically one million-fold higher than for the least susceptible (with high urine pH, high N-acetylation and low N-oxidation activities). The simulations also suggest that the four factors contributing most significantly to interindividual differences in DNA-binding of ABP in human bladder are urine pH, ABP N-oxidation, ABP N-acetylation and urination frequency.
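The "which factors contribute most" question is typically answered by correlating each sampled input with the simulated output across the Monte Carlo individuals. The sketch below does this with Spearman rank correlations, using a made-up surrogate response in place of the physiological pharmacokinetic model; all input distributions and the response form are assumptions for illustration only.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
n = 500  # 500 simulated individuals, mirroring the study design

# Illustrative input distributions (placeholders, not the study's values)
inputs = {
    "urine_pH":            rng.normal(6.1, 0.5, n),
    "ABP_N_oxidation":     rng.lognormal(0.0, 0.6, n),
    "ABP_N_acetylation":   rng.lognormal(0.0, 0.6, n),
    "urination_frequency": rng.lognormal(np.log(5.0), 0.3, n),
}

# Surrogate response standing in for simulated DNA-adduct levels (assumed form):
# binding rises with N-oxidation and falls with N-acetylation, urine pH, and voiding.
adducts = (inputs["ABP_N_oxidation"] / inputs["ABP_N_acetylation"]
           * np.exp(-0.8 * (inputs["urine_pH"] - 6.1))
           / inputs["urination_frequency"])

for name, values in inputs.items():
    rho, _ = spearmanr(values, adducts)
    print(f"{name:>20}: Spearman rho = {rho:+.2f}")
```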

12.
Biomagnification of organochlorine and other persistent organic contaminants by higher trophic level organisms represents one of the most significant sources of uncertainty and variability in evaluating potential risks associated with disposal of dredged materials. While it is important to distinguish between population variability (e.g., true population heterogeneity in fish weight, and lipid content) and uncertainty (e.g., measurement error), they can be operationally difficult to define separately in probabilistic estimates of human health and ecological risk. We propose a disaggregation of uncertain and variable parameters based on: (1) availability of supporting data; (2) the specific management and regulatory context (in this case, of the U.S. Army Corps of Engineers/U.S. Environmental Protection Agency tiered approach to dredged material management); and (3) professional judgment and experience in conducting probabilistic risk assessments. We describe and quantitatively evaluate several sources of uncertainty and variability in estimating risk to human health from trophic transfer of polychlorinated biphenyls (PCBs) using a case study of sediments obtained from the New York-New Jersey Harbor and being evaluated for disposal at an open water off-shore disposal site within the northeast region. The estimates of PCB concentrations in fish and dietary doses of PCBs to humans ingesting fish are expressed as distributions of values, of which the arithmetic mean or mode represents a particular fractile. The distribution of risk values is obtained using a food chain biomagnification model developed by Gobas by specifying distributions for input parameters disaggregated to represent either uncertainty or variability. Only those sources of uncertainty that could be quantified were included in the analysis. Results for several different two-dimensional Latin Hypercube analyses are provided to evaluate the influence of the uncertain versus variable disaggregation of model parameters. The analysis suggests that variability in human exposure parameters is greater than the uncertainty bounds on any particular fractile, given the described assumptions.
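The uncertainty/variability disaggregation is implemented as a nested (two-dimensional) analysis: an outer loop over uncertain parameters and an inner loop over truly heterogeneous ones, so each outer draw yields a whole exposure distribution. The sketch below uses plain random sampling and a deliberately simplified exposure equation rather than the Gobas food chain model or Latin Hypercube sampling; all distributions and values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(11)
n_uncertain, n_variable = 200, 2000

medians, p95s = np.empty(n_uncertain), np.empty(n_uncertain)
for i in range(n_uncertain):
    # Outer (uncertainty) loop: one draw of an uncertain biomagnification factor
    bmf = rng.lognormal(np.log(2.0e5), 0.3)                      # L water / kg lipid (assumed)

    # Inner (variability) loop: true heterogeneity across fish and fish consumers
    lipid = rng.lognormal(np.log(0.05), 0.3, n_variable)          # fish lipid fraction
    ingestion = rng.lognormal(np.log(0.020), 0.8, n_variable)     # kg fish / day
    bw = rng.lognormal(np.log(70.0), 0.2, n_variable)             # kg body weight
    water = 1.0e-9                                                 # g PCB / L (fixed, assumed)

    fish_conc = bmf * lipid * water                # g PCB per kg fish
    dose = fish_conc * 1000.0 * ingestion / bw     # mg PCB per kg body weight per day
    medians[i], p95s[i] = np.median(dose), np.percentile(dose, 95)

variability_span = np.median(p95s / medians)                       # within one "world"
uncertainty_span = np.percentile(medians, 97.5) / np.percentile(medians, 2.5)
print(f"variability (95th/50th within a draw): ~{variability_span:.1f}-fold")
print(f"uncertainty bounds on the median dose:  ~{uncertainty_span:.1f}-fold")
```

With these assumed spreads the within-draw variability exceeds the uncertainty bounds on the median, mirroring the qualitative conclusion of the abstract.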

13.
Physical property values are used in environmental risk assessments to estimate media and risk-based concentrations. Recently, however, considerable variability has been reported with such values. To evaluate potential variability in physical parameter values supporting a variety of regulatory programs, eight data sources were chosen for evaluation, and chemicals appearing in at least four sources were selected. There were 755 chemicals chosen. In addition, chemicals in seven environmentally important subgroups were also identified for evaluation. Nine parameters were selected for analysis: molecular weight (MolWt), melting point (MeltPt), boiling point (BoilPt), vapor pressure (VP), water solubility (AqSOL), Henry's law constant (HLC), octanol-water partition coefficient (Kow), and diffusion coefficients in air (Dair) and water (Dwater). Results show that while 71% of constituents had equal MolWts across data sources, <3% of the constituents had equivalent parameter values across data sources for AqSOL, VP, or HLC. Considerable dissimilarity between certain sources was also observed. Furthermore, measures of dispersion showed considerable variation in data sets for Kow, VP, AqSOL, and HLC compared to measures for MolWt, MeltPt, BoilPt, or Dwater. The magnitude of the observed variability was also noteworthy. For example, the 95th percentile ratio of maximum/minimum parameter values ranged from 1.0 for MolWt to well over 1.0 × 10^6 for VP and HLC. Risk and exposure metrics also varied by similar magnitudes. Results with environmentally important subgroups were similar. These results show that there is considerable variability in physical parameter values from standard sources, and that the observed variability could affect potential risk estimates and perhaps risk management decisions.

14.
15.
Whether and to what extent contaminated sites harm ecologic and human health are topics of considerable interest, but also considerable uncertainty. Several federal and state agencies have approved the use of some or many aspects of probabilistic risk assessment (PRA), but its site-specific application has often been limited to high-profile sites and large projects. Nonetheless, times are changing: newly developed software tools, and recent federal and state guidance documents formalizing PRA procedures, now make PRA a readily available method of analysis for even small-scale projects. This article presents and discusses a broad review of PRA literature published since 2000.

16.
The tenfold "uncertainty" factor traditionally used to guard against human interindividual differences in susceptibility to toxicity is not based on human observations. To begin to build a basis for quantifying an important component of overall variability in susceptibility to toxicity, a database has been constructed of individual measurements of key pharmacokinetic parameters for specific substances (mostly drugs) in groups of at least five healthy adults. Of the 101 data sets studied, 72 were positively skewed, indicating that the distributions are generally closer to expectations for log-normal distributions than for normal distributions. Measurements of interindividual variability in elimination half-lives, maximal blood concentrations, and AUC (area under the curve of blood concentration by time) have median values of log10 geometric standard deviations in the range of 0.11-0.145. For the median chemical, therefore, a tenfold difference in these pharmacokinetic parameters would correspond to 7-9 standard deviations in populations of normal healthy adults. For one relatively lipophilic chemical, however, interindividual variability in maximal blood concentration and AUC was 0.4, implying that a tenfold difference would correspond to only about 2.5 standard deviations for those parameters in the human population. The parameters studied to date are only components of overall susceptibility to toxic agents, and do not include contributions from variability in exposure- and response-determining parameters. The current study also implicitly excludes most human interindividual variability from age and illness. When these other sources of variability are included in an overall analysis of variability in susceptibility, it is likely that a tenfold difference will correspond to fewer standard deviations in the overall population, and correspondingly greater numbers of people at risk of toxicity.
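The "7-9 standard deviations" arithmetic follows directly from the reported spreads: a tenfold difference is 1.0 on the log10 scale, so it spans 1 divided by the log10 geometric standard deviation in standard-deviation units.

```python
# How many geometric standard deviations does a tenfold difference span?
# 0.11 and 0.145 are the median Log10(GSD) values quoted above; 0.4 is the
# lipophilic-chemical example.
for log10_gsd in (0.11, 0.145, 0.4):
    n_sd = 1.0 / log10_gsd
    print(f"Log10(GSD) = {log10_gsd}: a 10-fold difference spans {n_sd:.1f} standard deviations")
```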

17.
Health Risks of Energy Systems
Health risks from fossil, renewable and nuclear reference energy systems are estimated following a detailed impact pathway approach. Using a set of appropriate air quality models and exposure-effect functions derived from the recent epidemiological literature, a methodological framework for risk assessment has been established and consistently applied across the different energy systems, including the analysis of consequences from a major nuclear accident. A wide range of health impacts resulting from increased air pollution and ionizing radiation is quantified, and the transferability of results derived from specific power plants to a more general context is discussed.
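In its simplest linear form, the impact-pathway calculation multiplies a population-weighted concentration increment (from the dispersion modelling) by an exposure-effect slope and a baseline rate. The sketch below shows that final step only; the numbers are placeholders, not the study's air-quality results or exposure-effect functions.

```python
def impact_pathway_cases(delta_conc_ug_m3, exposed_population,
                         baseline_rate_per_person_yr, relative_risk_per_10_ug_m3):
    """Annual attributable cases for a linear exposure-effect function (simplified sketch)."""
    excess_relative_risk = (relative_risk_per_10_ug_m3 - 1.0) * delta_conc_ug_m3 / 10.0
    return excess_relative_risk * baseline_rate_per_person_yr * exposed_population

# Placeholder example: a 0.5 ug/m3 PM increment over 5 million people,
# baseline mortality 0.009 per person-year, relative risk 1.06 per 10 ug/m3.
cases = impact_pathway_cases(0.5, 5_000_000, 0.009, 1.06)
print(f"estimated attributable deaths per year: {cases:.0f}")
```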

18.
Risk Analysis, 1996, 16(6): 841-848
Currently, risk assessments of the potential human health effects associated with exposure to pathogens are utilizing the conceptual framework that was developed to assess risks associated with chemical exposures. However, the applicability of the chemical framework is problematic due to many issues that are unique to assessing risks associated with pathogens. These include, but are not limited to, an assessment of pathogen/host interactions, consideration of secondary spread, consideration of short- and long-term immunity, and an assessment of conditions that allow the microorganism to propagate. To address this concern, a working group was convened to develop a conceptual framework to assess the risks of human disease associated with exposure to pathogenic microorganisms. The framework that was developed consists of three phases: problem formulation, analysis (which includes characterization of exposure and human health effects), and risk characterization. The framework emphasizes the dynamic and iterative nature of the risk assessment process, and allows wide latitude for planning and conducting risk assessments in diverse situations, each based on the common principles discussed in the framework.

19.
We review approaches for characterizing "peak" exposures in epidemiologic studies and methods for incorporating peak exposure metrics in dose–response assessments that contribute to risk assessment. The focus was on potential etiologic relations between environmental chemical exposures and cancer risks. We searched the epidemiologic literature on environmental chemicals classified as carcinogens in which cancer risks were described in relation to "peak" exposures. These articles were evaluated to identify some of the challenges associated with defining and describing cancer risks in relation to peak exposures. We found that definitions of peak exposure varied considerably across studies. Of nine chemical agents included in our review of peak exposure, six had epidemiologic data used by the U.S. Environmental Protection Agency (US EPA) in dose–response assessments to derive inhalation unit risk values. These were benzene, formaldehyde, styrene, trichloroethylene, acrylonitrile, and ethylene oxide. All derived unit risks relied on cumulative exposure for dose–response estimation and none, to our knowledge, considered peak exposure metrics. This is not surprising, given the historical linear no-threshold default model (generally based on cumulative exposure) used in regulatory risk assessments. With newly proposed US EPA rule language, fuller consideration of alternative exposure and dose–response metrics will be supported. "Peak" exposure has not been consistently defined and rarely has been evaluated in epidemiologic studies of cancer risks. We recommend developing uniform definitions of "peak" exposure to facilitate fuller evaluation of dose response for environmental chemicals and cancer risks, especially where mechanistic understanding indicates that the dose response is unlikely linear and that short-term high-intensity exposures increase risk.
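Because "peak" has no single definition, it helps to see how different metrics are computed from the same exposure history. The sketch below contrasts cumulative exposure with two simple peak-style metrics for an invented 40-year occupational history; the distribution and the 5 ppm threshold are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(5)

# Illustrative 40-year occupational exposure history, one annual average per year (ppm)
exposure = rng.lognormal(np.log(0.5), 1.0, 40)

cumulative = exposure.sum()                 # ppm-years: the metric typically used in unit risks
peak_intensity = exposure.max()             # highest annual average intensity
years_above_5ppm = (exposure > 5.0).sum()   # duration-above-threshold style "peak" metric

print(f"cumulative exposure: {cumulative:.1f} ppm-years")
print(f"peak annual intensity: {peak_intensity:.1f} ppm; years above 5 ppm: {years_above_5ppm}")
```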

20.
Upperbound lifetime excess cancer risks were calculated for activities associated with asbestos abatement using a risk assessment framework developed for EPA's Superfund program. It was found that removals were associated with cancer risks to workers which were often greater than the commonly accepted cancer risk of 1 × 10^-6, although lower than occupational exposure limits associated with risks of 1 × 10^-3. Removals had little effect in reducing risk to school populations. Risks to teachers and students in school buildings containing asbestos were approximately the same as risks associated with exposure to ambient asbestos by the general public and were below the levels typically of concern to regulatory agencies. During abatement, however, there were increased risks to both workers and nearby individuals. Careless, everyday building maintenance generated the greatest risk to workers followed by removals and encapsulation. If asbestos abatement was judged by the risk criteria applied to EPA's Superfund program, the no-action alternative would likely be selected in preference to removal in a majority of cases. These conclusions should only be interpreted within the context of an overall asbestos risk management program, which includes consideration of specific fiber types and sizes, sampling and analytical limitations, physical condition of asbestos-containing material, episodic peak exposures, and the number of people potentially exposed.
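A back-of-the-envelope version of the Superfund-style calculation referred to above: lifetime excess risk is the time-weighted lifetime average fiber concentration multiplied by an inhalation unit risk, compared against the 1 × 10^-6 benchmark. The unit risk and the exposure scenario below are assumed for illustration only, not values from the study.

```python
def lifetime_excess_cancer_risk(conc_f_cc, hours_per_day, days_per_year, years,
                                unit_risk_per_f_cc):
    """Upperbound lifetime excess cancer risk from a linear unit-risk model (sketch)."""
    # Fraction of a 70-year lifetime spent at this airborne fiber concentration
    time_fraction = (hours_per_day / 24.0) * (days_per_year / 365.0) * (years / 70.0)
    lifetime_avg_conc = conc_f_cc * time_fraction
    return lifetime_avg_conc * unit_risk_per_f_cc

# Assumed example: a student in a building at 0.0005 fibers/cc, 6 h/day, 180 days/yr
# for 6 years, with an illustrative unit risk of 0.2 per f/cc of continuous exposure.
risk = lifetime_excess_cancer_risk(0.0005, 6, 180, 6, 0.2)
print(f"lifetime excess cancer risk ~ {risk:.1e} (benchmark: 1e-6)")
```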


