Similar Literature
20 similar documents found.
1.
    
A quantitative assessment of the exposure to Listeria monocytogenes from cold-smoked salmon (CSS) consumption in France is developed. The general framework is a second-order (or two-dimensional) Monte Carlo simulation, which characterizes the uncertainty and variability of the exposure estimate. The model takes into account the competitive bacterial growth between L. monocytogenes and the background competitive flora from the end of the production line to the consumer phase. An original algorithm is proposed to integrate this growth under conditions of varying temperature. As part of a more general project led by the French Food Safety Agency (Afssa), specific data were acquired and modeled for this quantitative exposure assessment model, particularly time-temperature profiles, prevalence data, and contamination-level data. The sensitivity analysis identifies the mean temperature in household refrigerators and the prevalence of contaminated CSS as the main influences on the exposure level. The outputs of this model can be used as inputs for further risk assessment.
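A minimal numerical sketch of this kind of competitive-growth integration is given below. It is not the authors' algorithm: the square-root secondary-model parameters, the maximum densities, and the simple rule that growth stops once the background flora reaches its ceiling (a Jameson-effect assumption) are placeholders chosen only for illustration.

import numpy as np

def mu_max(temp_c, b, t_min):
    """Ratkowsky square-root secondary model; returns mu_max in ln/h."""
    return (b * max(temp_c - t_min, 0.0)) ** 2

def competitive_growth(log_lm, log_flora, profile, dt=0.5):
    """Integrate log10 counts over a piecewise-constant time-temperature profile.

    Growth of both populations stops once the background flora reaches its
    ceiling (simplified Jameson effect). All parameter values are placeholders.
    """
    log_max_flora, log_max_lm = 9.0, 8.0        # assumed maximum densities, log10 CFU/g
    for duration_h, temp_c in profile:
        t = 0.0
        while t < duration_h:
            if log_flora < log_max_flora:
                log_lm = min(log_lm + mu_max(temp_c, 0.03, -2.9) * dt / np.log(10), log_max_lm)
                log_flora = min(log_flora + mu_max(temp_c, 0.04, -5.0) * dt / np.log(10), log_max_flora)
            t += dt
    return log_lm

# Example: 48 h at 4 C in retail, then 120 h at 8 C in a domestic refrigerator
print(round(competitive_growth(1.0, 3.0, [(48.0, 4.0), (120.0, 8.0)]), 2))

Embedding such a routine inside nested sampling loops (uncertain parameters outside, variable time-temperature profiles inside) gives the second-order Monte Carlo structure the abstract describes.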

2.
    
Lag phase research in predictive microbiology has recently focused more on individual cell variability, especially for pathogenic microorganisms that typically occur at very low contamination levels, such as Listeria monocytogenes. In this study, the effect of this individual cell lag phase variability was introduced in an exposure assessment study for L. monocytogenes in a liver pâté. A basic framework was designed to estimate the contamination level of pâté at the time of consumption, taking into account the frequency of contamination and the initial contamination levels of pâté at retail. Growth was calculated on pâté units of 150 g, comparing an individual-based approach with a classical population-based approach. The two protocols were compared using simulations. If only the individual cell lag variability was taken into account, important differences in cell density at the time of consumption were observed between the individual-based approach and the classical approach, especially at low inoculum levels, resulting in high variability under the individual-based approach. However, when all variable factors were taken into account, no significant differences were observed between the two approaches, leading to the conclusion that the individual cell lag phase variability was overruled by the global variability of the exposure assessment framework. Even in more extreme conditions, such as a low inoculum level or a low water activity, no differences in cell density at the time of consumption appeared between the individual-based approach and the classical approach. This means that the individual cell lag phase variability of L. monocytogenes has important consequences when studying specific growth cases, especially when the applied inoculum levels are low, but when performing more general exposure assessment studies, the variability between individual cell lag phases is too limited to have a major impact on the total exposure assessment.
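To make the contrast concrete, here is a small, hypothetical Python sketch of the two protocols compared above: a population-based calculation with a single lag time versus an individual-based calculation in which each cell draws its own lag from a distribution. The growth rate, storage time, and gamma-distributed lag parameters are illustrative assumptions, not values from the study.

import numpy as np

rng = np.random.default_rng(1)
MU = 0.2          # maximum specific growth rate, ln/h (assumed)
T_CONS = 72.0     # time between inoculation and consumption, h (assumed)

def population_growth(n0, lag_mean):
    """Classical approach: a single lag time applies to the whole inoculum."""
    return n0 * np.exp(MU * max(T_CONS - lag_mean, 0.0))

def individual_growth(n0, lag_mean, lag_sd):
    """Individual-based approach: each cell draws its own lag time."""
    shape = (lag_mean / lag_sd) ** 2
    scale = lag_sd ** 2 / lag_mean
    lags = rng.gamma(shape, scale, size=n0)
    return float(np.sum(np.exp(MU * np.clip(T_CONS - lags, 0.0, None))))

for n0 in (1, 10, 1000):      # low to higher inoculum levels
    classical = population_growth(n0, lag_mean=20.0)
    individual = np.mean([individual_growth(n0, 20.0, 10.0) for _ in range(200)])
    print(f"N0 = {n0:4d}   population-based: {classical:10.1f}   individual-based (mean of 200 runs): {individual:10.1f}")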

3.
In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times—mutually dependent in successive steps in the chain—cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
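As an illustration of how queue mechanics shape storage-time distributions, the toy sketch below simulates a retail shelf with a FIFO ordering policy and stochastic daily demand, and records the realised retail storage time of each unit sold. It is not the authors' model: the replenishment rule, demand range, and shelf life are arbitrary placeholders, and a real implementation would also track the preceding and following links in the chain.

import random
from collections import deque

random.seed(0)
SHELF_LIFE_D = 7          # assumed retail shelf life, days
TARGET_STOCK = 40         # assumed order-up-to level
DAYS = 365

shelf = deque()           # FIFO shelf: each entry is the day the unit was stocked
storage_times = []        # realised retail storage time of each unit sold

for day in range(DAYS):
    # Ordering mechanism: replenish up to the target stock each morning
    while len(shelf) < TARGET_STOCK:
        shelf.append(day)
    # Stochastic daily demand; oldest units are sold first (FIFO)
    demand = random.randint(10, 40)
    for _ in range(min(demand, len(shelf))):
        storage_times.append(day - shelf.popleft())
    # Units past their shelf life are discarded at the end of the day
    while shelf and day - shelf[0] >= SHELF_LIFE_D:
        shelf.popleft()

storage_times.sort()
n = len(storage_times)
print("mean retail storage time (d):", round(sum(storage_times) / n, 2))
print("95th percentile (d):", storage_times[int(0.95 * n)])

The point of such a sketch is that the tail of the storage-time distribution emerges from the ordering policy and demand pattern rather than being assumed directly, which is exactly the feature the abstract argues matters for the tails of risk distributions.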

4.
    
Currently, there is a growing preference for convenience food products, such as ready-to-eat (RTE) foods, which are associated with long refrigerated shelf-lives and do not require a heat treatment prior to consumption. Because Listeria monocytogenes is able to grow at refrigeration temperatures, inconsistent temperatures during production, distribution, and in the consumer's household may allow the pathogen to thrive and reach unsafe levels. L. monocytogenes is the causative agent of listeriosis, a rare but severe human illness with high fatality rates, transmitted almost exclusively by food consumption. With the aim of assessing the quantitative microbial risk of L. monocytogenes in RTE chicken salads, a challenge test was performed. Salads were inoculated with a three-strain mixture of cold-adapted L. monocytogenes and stored at 4, 12, and 16 °C for eight days. Results revealed that the salad was able to support L. monocytogenes growth, even at refrigeration temperatures. The Baranyi primary model was fitted to the microbiological data to estimate the pathogen's growth kinetic parameters. The temperature effect on the maximum specific growth rate (μmax) was modeled using a square-root-type model. Storage temperature significantly influenced μmax of L. monocytogenes (p < 0.05). The resulting growth models for L. monocytogenes were subsequently used to develop a quantitative microbial risk assessment, which estimated a median of 0.00008726 listeriosis cases per year linked to the consumption of these RTE salads. Sensitivity analysis considering different time-temperature scenarios indicated a very low median risk per portion (< -7 log), even if the assessed RTE chicken salad was kept under abuse storage conditions.
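The secondary-model step can be sketched as follows: fit a square-root (Ratkowsky-type) model to μmax estimates obtained from the Baranyi fits at the three storage temperatures, then predict μmax at other temperatures. The μmax values below are placeholders for illustration, not the study's estimates.

import numpy as np
from scipy.optimize import curve_fit

# Placeholder mu_max estimates (1/h) from Baranyi fits at each storage temperature
temps = np.array([4.0, 12.0, 16.0])          # deg C
mu_obs = np.array([0.02, 0.07, 0.11])        # illustrative values, not the study's data

def square_root_model(temp_c, b, t_min):
    """Square-root secondary model: sqrt(mu_max) = b * (T - Tmin), returned as mu_max."""
    return (b * (temp_c - t_min)) ** 2

(b, t_min), _ = curve_fit(square_root_model, temps, mu_obs, p0=[0.02, -3.0])
print(f"b = {b:.4f}, Tmin = {t_min:.1f} C")
print(f"predicted mu_max at 8 C: {square_root_model(8.0, b, t_min):.3f} 1/h")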

5.
    
To assess the impact of the manufacturing process on the fate of Listeria monocytogenes, we built a generic probabilistic model intended to simulate the successive steps in the process. Contamination evolution was modeled in the appropriate units (breasts, dice, and then packaging units) through the successive steps of the process. To calibrate the model, parameter values were estimated from industrial data, from the literature, and based on expert opinion. By means of simulations, the model was explored using a baseline calibration and alternative scenarios, in order to assess the impact of changes in the process and of accidental events. The results are reported as contamination distributions and as the probability that the product will be acceptable with regard to the European regulatory safety criterion. Our results are consistent with data provided by industrial partners and highlight that tumbling is a key step for the distribution of contamination at the end of the process. Process chain models could provide important added value for risk assessment models that consider only the outputs of the process in their risk mitigation strategies. Moreover, a model calibrated to correspond to a specific plant could be used to optimize surveillance.

6.
    
Evaluations of Listeria monocytogenes dose-response relationships are crucially important for risk assessment and risk management, but are complicated by considerable variability across population subgroups and L. monocytogenes strains. Despite difficulties associated with the collection of adequate data from outbreak investigations or sporadic cases, the limitations of currently available animal models, and the inability to conduct human volunteer studies, some of the available data now allow refinements of the well-established exponential L. monocytogenes dose response to more adequately represent extremely susceptible population subgroups and highly virulent L. monocytogenes strains. Here, a model incorporating adjustments for variability in L. monocytogenes strain virulence and host susceptibility was derived for 11 population subgroups with similar underlying comorbidities using data from multiple sources, including human surveillance and food survey data. In light of the unique inherent properties of L. monocytogenes dose response, a lognormal-Poisson dose-response model was chosen, and proved able to reconcile dose-response relationships developed based on surveillance data with outbreak data. This model was compared to a classical beta-Poisson dose-response model, which was insufficiently flexible for modeling the specific case of L. monocytogenes dose-response relationships, especially in outbreak situations. Overall, the modeling results suggest that most listeriosis cases are linked to the ingestion of food contaminated with medium to high concentrations of L. monocytogenes. While additional data are needed to refine the derived model and to better characterize and quantify the variability in L. monocytogenes strain virulence and individual host susceptibility, the framework derived here represents a promising approach to more adequately characterize the risk of listeriosis in highly susceptible population subgroups.
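For intuition, the sketch below contrasts the classical exponential dose-response, with a single infection probability r per ingested cell, against a lognormal-Poisson style formulation in which log10 r varies across strain virulence and host susceptibility combinations and the marginal illness probability is obtained by averaging. The r values and lognormal spread are arbitrary illustrative assumptions, not the parameters derived in the article.

import numpy as np

rng = np.random.default_rng(0)

def p_ill_exponential(dose, r):
    """Classical exponential dose-response with a fixed single-cell probability r."""
    return 1.0 - np.exp(-r * dose)

def p_ill_lognormal(dose, mean_log10_r, sd_log10_r, n=200_000):
    """Marginal illness probability when log10(r) varies (normally) across
    strain virulence / host susceptibility combinations."""
    r = 10.0 ** rng.normal(mean_log10_r, sd_log10_r, n)
    return float(np.mean(1.0 - np.exp(-r * dose)))

print(" log10 dose   exponential     variable-r")
for log10_dose in (2, 4, 6, 8):
    dose = 10.0 ** log10_dose
    print(f"{log10_dose:10d}   {p_ill_exponential(dose, 1e-12):.3e}   "
          f"{p_ill_lognormal(dose, -12.0, 3.0):.3e}")

The heavy upper tail of the variable-r formulation is what allows a single model to produce both the very low per-serving risks seen in surveillance data and the much higher attack rates seen in some outbreaks.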

7.
    
Increasing evidence suggests that persistence of Listeria monocytogenes in food processing plants has been the underlying cause of a number of human listeriosis outbreaks. This study extracts criteria used by food safety experts in determining bacterial persistence in the environment, using retail delicatessen operations as a model. Using the Delphi method, we conducted an expert elicitation with 10 food safety experts from academia, industry, and government to classify L. monocytogenes persistence based on environmental sampling results collected over six months for 30 retail delicatessen stores. The results were modeled using variations of random forest, support vector machine, logistic regression, and linear regression; variable importance values of the random forest and support vector machine models were consolidated to rank important variables in the experts' classifications. The duration of subtype isolation ranked most important across all expert categories. Sampling site category also ranked high in importance, and validation errors doubled when this covariate was removed. Support vector machine and random forest models successfully classified the data with average validation errors of 3.1% and 2.2% (n = 144), respectively. Our findings indicate that (i) the frequency of isolations over time and sampling site information are critical factors for experts determining subtype persistence, (ii) food safety experts from different sectors may not use the same criteria in determining persistence, and (iii) machine learning models have potential for future use in environmental surveillance and risk management programs. Future work is necessary to validate the accuracy of expert and machine classification against biological measurement of L. monocytogenes persistence.
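A schematic version of the classification step, using scikit-learn, is shown below on synthetic data. The feature names and the rule generating the labels are hypothetical stand-ins for the expert-elicited data set; only the general workflow (cross-validated random forest and support vector machine, plus random forest variable importances) mirrors what the abstract describes.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for the expert-labelled data set (one row per subtype/store);
# the feature names below are hypothetical, not the study's covariates.
feature_names = ["duration_of_isolation", "n_positive_samples", "site_category",
                 "n_sites_positive", "time_since_last_positive"]
X = rng.random((144, len(feature_names)))
y = (X[:, 0] + 0.3 * X[:, 1] + 0.1 * rng.random(144) > 0.8).astype(int)   # 1 = persistent

rf = RandomForestClassifier(n_estimators=500, random_state=0)
svm = SVC(kernel="rbf", C=1.0)
print("random forest CV accuracy:", round(cross_val_score(rf, X, y, cv=5).mean(), 3))
print("SVM CV accuracy:          ", round(cross_val_score(svm, X, y, cv=5).mean(), 3))

rf.fit(X, y)
for name, importance in sorted(zip(feature_names, rf.feature_importances_),
                               key=lambda t: -t[1]):
    print(f"{name:25s} {importance:.3f}")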

8.
    
Listeria monocytogenes is a leading cause of hospitalization, fetal loss, and death due to foodborne illnesses in the United States. A quantitative assessment of the relative risk of listeriosis associated with the consumption of 23 selected categories of ready-to-eat foods, published by the U.S. Department of Health and Human Services and the U.S. Department of Agriculture in 2003, has been instrumental in identifying the food products and practices that pose the greatest listeriosis risk and has guided the evaluation of potential intervention strategies. Dose-response models, which quantify the relationship between an exposure dose and the probability of adverse health outcomes, were essential components of the risk assessment. However, because of data gaps and limitations in the available data and modeling approaches, considerable uncertainty existed. Since publication of the risk assessment, new data have become available for modeling L. monocytogenes dose-response. At the same time, recent advances in the understanding of L. monocytogenes pathophysiology and strain diversity have warranted a critical reevaluation of the published dose-response models. To discuss strategies for modeling L. monocytogenes dose-response, the Interagency Risk Assessment Consortium (IRAC) and the Joint Institute for Food Safety and Applied Nutrition (JIFSAN) held a scientific workshop in 2011 (details available at http://foodrisk.org/irac/events/). The main findings of the workshop and the most current and relevant data identified during the workshop are summarized and presented in the context of L. monocytogenes dose-response. This article also discusses new insights on dose-response modeling for L. monocytogenes and research opportunities to meet future needs.

9.
    
We used an agent-based modeling (ABM) framework and developed a mathematical model to explain the complex dynamics of microbial persistence and spread within a food facility and to aid risk managers in identifying effective mitigation options. The model explicitly considered personal hygiene practices by food handlers as well as their activities and simulated a spatially explicit dynamic system representing complex interaction patterns among food handlers, facility environment, and foods. To demonstrate the utility of the model in a decision-making context, we created a hypothetical case study and used it to compare different risk mitigation strategies for reducing contamination and spread of Listeria monocytogenes in a food facility. Model results indicated that areas with no direct contact with foods (e.g., loading dock and restroom) can serve as contamination niches and recontaminate areas that have direct contact with food products. Furthermore, food handlers' behaviors, including, for example, hygiene and sanitation practices, can impact the persistence of microbial contamination in the facility environment and the spread of contamination to prepared foods. Using this case study, we also demonstrated benefits of an ABM framework for addressing food safety in a complex system in which emergent system-level responses are predicted using a bottom-up approach that observes individual agents (e.g., food handlers) and their behaviors. Our model can be applied to a wide variety of pathogens, food commodities, and activity patterns to evaluate efficacy of food-safety management practices and quantify contamination reductions associated with proposed mitigation strategies in food facilities.
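The agent-based idea can be illustrated with a deliberately small toy model: a few food handler agents move among facility zones, exchange contamination with surfaces, occasionally practice hand hygiene, and transfer part of their load to product when working in the preparation area. The zones, transfer fractions, and hygiene probabilities are arbitrary placeholders, not parameters from the study.

import random

random.seed(2)
ZONES = ["loading_dock", "restroom", "prep_area", "packaging"]
TRANSFER = 0.3       # fraction of the surface/handler difference exchanged per visit (assumed)
CLEAN_PROB = 0.6     # probability of effective hand hygiene after high-risk zones (assumed)

zone_load = {z: 0.0 for z in ZONES}
zone_load["loading_dock"] = 100.0        # initial contamination niche, arbitrary units
handlers = [{"load": 0.0} for _ in range(3)]
food_contamination = 0.0

for step in range(500):                  # discrete time steps over a working period
    for handler in handlers:
        zone = random.choice(ZONES)
        # bidirectional transfer between the surface and the handler's hands/gloves
        exchange = TRANSFER * (zone_load[zone] - handler["load"])
        zone_load[zone] -= exchange
        handler["load"] += exchange
        # hygiene behaviour removes most of the handler's load
        if zone in ("restroom", "loading_dock") and random.random() < CLEAN_PROB:
            handler["load"] *= 0.1
        # contact with product in the preparation area transfers load to food
        if zone == "prep_area":
            food_contamination += 0.05 * handler["load"]
            handler["load"] *= 0.95

print("final surface loads:", {z: round(v, 1) for z, v in zone_load.items()})
print("cumulative transfer to food:", round(food_contamination, 1))

Even in this toy form, the facility-level outcome (how much contamination reaches food) emerges from individual movement and hygiene behaviours, which is the bottom-up property the abstract highlights.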

10.
    
A model for the assessment of exposure to Listeria monocytogenes from cold-smoked salmon consumption in France was presented in the first of this pair of articles (Pouillot et al., 2007, Risk Analysis, 27:683-700). In the present study, the exposure model output was combined with an internationally accepted hazard characterization model, adapted to the French situation, to assess the risk of invasive listeriosis from cold-smoked salmon consumption in France in a second-order Monte Carlo simulation framework. The annual number of cases of invasive listeriosis due to cold-smoked salmon consumption in France is estimated to be 307, with a very large credible interval ([10; 12,453]), reflecting data uncertainty. This uncertainty is mainly associated with the dose-response model. Despite the significant uncertainty associated with the predictions, this model provides a scientific base for risk managers and food business operators to manage the risk linked to cold-smoked salmon contaminated with L. monocytogenes. Under the modeling assumptions, risk would be efficiently reduced through a decrease in the prevalence of L. monocytogenes or better control of the last steps of the cold chain (shorter and/or colder storage during the consumer step), whereas reduction of the initial contamination levels of the contaminated products and improvement in the first steps of the cold chain do not seem to be promising strategies. An attempt to apply the recent risk-based concept of FSO (food safety objective) on this example underlines the ambiguity in practical implementation of the risk management metrics and the need for further elaboration on these concepts.
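The risk characterization step, combining an exposure distribution with a dose-response model inside a second-order (nested) Monte Carlo loop, can be sketched as below: the outer loop samples uncertain quantities (here a dose-response parameter and a prevalence), the inner loop samples serving-to-serving variability, and the output is a credible interval on the annual number of cases. Every distribution and constant in the sketch is an assumption for illustration; none reproduces the published model or its results.

import numpy as np

rng = np.random.default_rng(3)
SERVINGS_PER_YEAR = 50_000_000   # assumed annual number of CSS servings
N_UNCERTAINTY = 200              # outer loop: uncertainty
N_VARIABILITY = 20_000           # inner loop: serving-to-serving variability

annual_cases = []
for _ in range(N_UNCERTAINTY):
    # uncertain quantities, sampled once per outer iteration
    log10_r = rng.normal(-12.0, 1.0)        # exponential dose-response parameter (assumed)
    prevalence = rng.beta(5, 95)            # fraction of contaminated servings (assumed)
    # variable quantities, sampled per serving in the inner loop
    log10_dose = rng.normal(2.5, 1.5, N_VARIABILITY)      # log10 CFU per contaminated serving
    p_ill = 1.0 - np.exp(-(10.0 ** log10_r) * 10.0 ** log10_dose)
    annual_cases.append(prevalence * p_ill.mean() * SERVINGS_PER_YEAR)

lo, med, hi = np.percentile(annual_cases, [2.5, 50, 97.5])
print(f"median annual cases: {med:.1f}   95% credible interval: [{lo:.1f}; {hi:.1f}]")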

11.
In this study, a variance-based global sensitivity analysis method was first applied to a contamination assessment model of Listeria monocytogenes in cold-smoked, vacuum-packed salmon at consumption. The impact of the choice of the modeling approach (populational or cellular) for the primary and secondary models, as well as the effect of their associated input factors on the final contamination level, was investigated. Results provided a subset of important factors, including the food water activity, its storage temperature, and its storage duration in the domestic refrigerator. A refined sensitivity analysis was then performed to rank these important factors, tested over narrower ranges of variation corresponding to their current distributions, using three techniques: ANOVA, the Spearman correlation coefficient, and partial least squares regression.
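One of the three refinement techniques named above, ranking by Spearman correlation, is easy to sketch against a simplified surrogate of the contamination model. The growth surrogate and the input ranges below are placeholders; the point is only the workflow of sampling the factors, propagating them through the model, and ranking them by rank correlation with the output.

import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(4)
N = 20_000

# Illustrative input factors over plausible ranges (placeholders, not the model's inputs)
water_activity = rng.uniform(0.95, 0.99, N)
fridge_temp = rng.normal(7.0, 3.0, N)          # deg C
storage_days = rng.uniform(1.0, 28.0, N)

# Simplified growth response standing in for the full contamination model
mu = np.clip(0.002 * (fridge_temp + 3.0) ** 2 * (water_activity - 0.94) / 0.05, 0.0, None)
log10_increase = mu * storage_days * 24.0 / np.log(10)

for name, factor in [("water activity", water_activity),
                     ("storage temperature", fridge_temp),
                     ("storage duration", storage_days)]:
    rho, _ = spearmanr(factor, log10_increase)
    print(f"{name:20s} Spearman rho = {rho:+.2f}")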

12.
    
Food safety objectives (FSOs) are established in order to minimize the risk of foodborne illnesses to consumers, but they have not yet been incorporated into regulatory policy. An FSO states the maximum frequency and/or concentration of a microbiological hazard in a food at the time of consumption that provides an acceptable level of protection to the public, and it leads to a performance criterion for industry. However, in order to be implemented as a regulation, this criterion has to be achievable by the affected industry. In order to determine an FSO, the steps used to produce and store that food need to be known, especially where they have an impact on contamination, growth, and destruction. This article uses existing models for the growth of Listeria monocytogenes in conjunction with FSO calculations to approximate the outcome of more than one introduction of the foodborne organism along the food-processing path from the farm to the consumer. Most models for the growth and reduction of foodborne pathogens are logarithmic, which fits the growth of microorganisms across many orders of magnitude. However, these logarithmic models are normally limited to a single introduction step and a single reduction step. The model presented as part of this research addresses more than one introduction of food contamination, each of which can be separated by a substantial amount of time. The advantage of treating the problem this way is the accommodation of multiple introductions of foodborne pathogens over a range of time durations and conditions.
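In the spirit of the familiar log10 bookkeeping (initial level H0, minus the sum of reductions, plus the sum of increases, compared against the FSO), the sketch below walks a contamination level through a hypothetical process path that includes a second introduction. This is only the naive additive version that the abstract says is normally used; the paper's model handles multiple, time-separated introductions more carefully, and every number here is purely illustrative.

FSO = 2.0    # food safety objective at consumption, log10 CFU/g (example value)

# Process path as (description, change in log10 CFU/g) pairs; positive values are
# introductions or growth, negative values are reductions. All numbers illustrative.
steps = [
    ("initial contamination of raw material (H0)",        -1.0),   # 0.1 CFU/g
    ("growth during transport and storage",               +0.5),
    ("listericidal processing step",                      -4.0),
    ("recontamination at slicing (second introduction)",  +0.8),
    ("growth during retail and home storage",             +2.5),
]

level = 0.0
for label, delta in steps:
    level += delta
    print(f"{label:52s} -> {level:+5.1f} log10 CFU/g")

print("FSO met" if level <= FSO else "FSO exceeded", f"(FSO = {FSO} log10 CFU/g)")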

13.
    
Risk Analysis, 2018, 38(4): 638-652
The objective of this research was to analyze the impact of different cooking procedures (i.e., gas hob and traditional static oven) and levels of cooking (i.e., rare, medium, and well-done) on the inactivation of Listeria monocytogenes and Salmonella in pork loin chops. Moreover, the consumer's exposure to both microorganisms after simulation of meat leftover storage at home was assessed. The results showed that well-done cooking in a static oven was the only treatment able to inactivate the tested pathogens. The other cooking combinations always brought the product to temperatures of at least 73.6 °C, reducing both pathogens by 6 to 7 log10 cfu/g. However, according to the simulation results, the few cells surviving the cooking treatments can multiply during consumer storage by up to 1 log10 cfu/g, with probabilities of 0.059 (gas hob) and 0.035 (static oven) for L. monocytogenes and 0.049 (gas hob) and 0.031 (static oven) for Salmonella. The key factors affecting consumer exposure in relation to storage practices were the probability of pathogen occurrence after cooking, the degree of doneness, the time of storage, and the time of storage at room temperature. The results of this study can be combined with prevalence data and dose-response models in risk assessment models and included in guidelines for consumers on practices to be followed when cooking pork meat at home.

14.
    
One-third of the annual cases of listeriosis in the United States occur during pregnancy and can lead to miscarriage or stillbirth, premature delivery, or infection of the newborn. Previous risk assessments completed by the Food and Drug Administration, the Food Safety and Inspection Service of the U.S. Department of Agriculture, and the Centers for Disease Control and Prevention (FDA/USDA/CDC)(1) and by the Food and Agriculture Organization and the World Health Organization (FAO/WHO)(2) were based on dose-response data from mice. Recent animal studies using nonhuman primates(3,4) and guinea pigs(5) have both estimated LD50s of approximately 10^7 Listeria monocytogenes colony forming units (cfu). The FAO/WHO(2) estimated a human LD50 of 1.9 × 10^6 cfu based on data from a pregnant woman consuming contaminated soft cheese. We reevaluated risk based on dose-response curves from pregnant rhesus monkeys and guinea pigs. Using standard risk assessment methodology, including hazard identification, exposure assessment, hazard characterization, and risk characterization, risk was calculated based on the new dose-response information. To compare models, we looked at the mortality rate per serving at predicted doses ranging from 10^-4 to 10^12 L. monocytogenes cfu. Based on a serving of 10^6 L. monocytogenes cfu, the primate model predicts a death rate of 5.9 × 10^-1, compared to the FDA/USDA/CDC (fig. IV-12)(1) predicted rate of 1.3 × 10^-7. Based on the guinea pig and primate models, the mortality rate calculated by the FDA/USDA/CDC(1) is underestimated for this susceptible population.

15.
The management of microbial risk in food products requires the ability to predict the growth kinetics of pathogenic microorganisms in the event of contamination and growth initiation. Useful data for assessing these issues may be found in the literature or obtained from experimental results. However, the large number and variety of data make further development difficult. Statistical techniques, such as meta-analysis, are then useful to synthesize a set of distinct but similar experiments. Moreover, predictive modeling tools can be employed to complete the analysis and help the food safety manager interpret the data. In this article, a protocol to perform a meta-analysis of the outcome of a relational database, associated with quantitative microbiology models, is presented. The methodology is illustrated with the effect of temperature on pathogenic Escherichia coli and Listeria monocytogenes growing in culture medium, beef meat, and milk products. Using a database and predictive models, simulations of growth in a given product subjected to various temperature scenarios can be produced. It is then possible to compare food products for a given microorganism, according to its growth ability in these products, and to compare the behavior of bacteria in a given foodstuff. These results can assist decisions on a variety of food safety questions.

16.
The foodborne disease risk associated with the pathogen Listeria monocytogenes has been the subject of recent efforts in quantitative microbial risk assessment. Building upon one of these efforts, undertaken jointly by the U.S. Food and Drug Administration and the U.S. Department of Agriculture (USDA), the purpose of this work was to expand on the consumer phase of the risk assessment to focus on handling practices in the home. One-dimensional Monte Carlo simulation was used to model variability in growth and cross-contamination of L. monocytogenes during food storage and preparation of deli meats. Simulations estimated that 0.3% of servings were contaminated with >10^4 CFU/g of L. monocytogenes at the time of consumption. The estimated mean risk associated with the consumption of deli meats for the intermediate-age population was approximately 7 deaths per 10^11 servings. Food handling in homes increased the estimated mean mortality by 10^6-fold. Of all the home food-handling practices modeled, inadequate storage, particularly refrigeration temperatures, provided the greatest contribution to increased risk. The impact of cross-contamination in the home was considerably less. Adherence to USDA Food Safety and Inspection Service recommendations for consumer handling of ready-to-eat foods substantially reduces the risk of listeriosis.
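A compressed, hypothetical version of such a one-dimensional consumer-phase simulation is sketched below: sample home-storage conditions and initial contamination, apply a capped growth model, and summarize exposure and risk. The input distributions, growth parameters, serving size, and dose-response parameter are all assumptions for illustration and do not reproduce the published estimates.

import numpy as np

rng = np.random.default_rng(5)
N = 1_000_000                                        # simulated contaminated servings

# Illustrative home-storage inputs (not the distributions used in the risk assessment)
fridge_temp = np.clip(rng.normal(5.5, 2.5, N), 0.0, 15.0)   # deg C
storage_days = rng.uniform(0.0, 14.0, N)
log10_n0 = rng.normal(-2.0, 1.5, N)                  # initial log10 CFU/g

# Square-root growth model with a stationary-phase ceiling (placeholder parameters)
mu = (0.03 * np.clip(fridge_temp + 2.9, 0.0, None)) ** 2          # ln/h
log10_n = np.minimum(log10_n0 + mu * storage_days * 24.0 / np.log(10), 8.0)

print(f"servings above 10^4 CFU/g at consumption: {100 * np.mean(log10_n > 4.0):.2f}%")

# Exponential dose-response with an assumed r and an assumed 50 g serving size
dose = 10.0 ** log10_n * 50.0
print(f"mean mortality risk per contaminated serving: {np.mean(1.0 - np.exp(-5e-14 * dose)):.2e}")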

17.
Knowledge representation and solution of a disruption management model for logistics distribution under demand variation
Mathematical models for disruption management in logistics distribution under demand variation have difficulty supporting real-time modeling and real-time solution. To address this shortcoming, this study analyzes in depth the known knowledge, modeling knowledge, and solution knowledge of the demand-variation disruption management problem, and introduces knowledge representation theories and modeling methods from artificial intelligence and knowledge engineering to establish a BRGISC knowledge representation of the problem. The modeling and solution process of logistics distribution disruption management covering multiple types of demand-variation events is expressed as knowledge, and on this basis a knowledge-based solution method is proposed: a knowledge base and inference rules are designed so that this class of problems can be modeled and solved in real time. The approach was applied to the daily disruption management of demand variation for 0# diesel in a small urban distribution zone of the PetroChina Dalian Sales Branch. Results from the case run and data experiments show that the method can respond in real time to multiple types of demand-variation events and generate disruption management decision plans in real time.

18.
The World Trade Organization introduced the concept of appropriate level of protection (ALOP) as a public health target. For this public health objective to be interpretable by the actors in the food chain, the concept of the food safety objective (FSO) was proposed by the International Commission on Microbiological Specifications for Foods and later adopted by the Codex Alimentarius Food Hygiene Committee. The way to translate an ALOP into an FSO is still under debate. The purpose of this article is to develop a methodological tool to derive an FSO from an ALOP expressed as a maximal annual marginal risk. We explore the different models relating the annual marginal risk to the parameters of the FSO, depending on whether the variability in the survival probability and in the concentration of the pathogen is considered or not. If it is not, determination of the FSO is straightforward. If it is, we propose to use stochastic Monte Carlo simulation models and logistic discriminant analysis in order to determine which sets of parameters are compatible with the ALOP. The logistic discriminant function was chosen such that the kappa coefficient is maximized. We illustrate this method with the example of the risks of listeriosis and salmonellosis in one type of soft cheese. We conclude that the definition of the FSO should integrate three dimensions: the prevalence of contamination, the average concentration per contaminated typical serving, and the dispersion of the concentration among those servings.
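The workflow, simulating the annual marginal risk over candidate FSO parameter sets (prevalence, mean log10 concentration, dispersion) and then fitting a logistic discriminant to separate ALOP-compatible sets, can be sketched with scikit-learn as below. The ALOP value, serving scheme, dose-response parameter, and sampling ranges are illustrative assumptions, and the discriminant here is an ordinary logistic fit rather than one explicitly optimized for the kappa coefficient as in the article.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(6)
ALOP = 1e-6                     # maximal annual marginal risk (example value)
SERVINGS = 20                   # servings of the cheese per person per year (assumed)

def annual_risk(prev, mean_log10, sd_log10, n=20_000):
    """Annual marginal risk for one parameter set, using an exponential
    dose-response with an assumed r and an assumed 25 g serving size."""
    dose = 10.0 ** rng.normal(mean_log10, sd_log10, n) * 25.0
    p_serving = prev * np.mean(1.0 - np.exp(-1e-12 * dose))
    return 1.0 - (1.0 - p_serving) ** SERVINGS

# Candidate FSO parameter sets: prevalence, mean and dispersion of log10 CFU/g
params = np.column_stack([
    10.0 ** rng.uniform(-4.0, -0.5, 2000),   # prevalence of contaminated servings
    rng.uniform(-3.0, 3.0, 2000),            # mean log10 concentration
    rng.uniform(0.2, 2.0, 2000),             # sd of log10 concentration
])
compatible = np.array([annual_risk(*p) <= ALOP for p in params]).astype(int)

# Logistic discriminant separating ALOP-compatible parameter sets
X = np.column_stack([np.log10(params[:, 0]), params[:, 1], params[:, 2]])
clf = LogisticRegression().fit(X, compatible)
print("kappa of the discriminant:",
      round(cohen_kappa_score(compatible, clf.predict(X)), 3))
print("discriminant coefficients:", clf.coef_.round(2), clf.intercept_.round(2))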

19.
    
Next-generation sequencing (NGS) data present an untapped potential to improve microbial risk assessment (MRA) through increased specificity and redefinition of the hazard. Most of the MRA models do not account for differences in survivability and virulence among strains. The potential of machine learning algorithms for predicting the risk/health burden at the population level while inputting large and complex NGS data was explored with Listeria monocytogenes as a case study. Listeria data consisted of a percentage similarity matrix from genome assemblies of 38 and 207 strains of clinical and food origin, respectively. The Basic Local Alignment Search Tool (BLAST) was used to align the assemblies against a database of 136 virulence and stress resistance genes. The outcome variable was frequency of illness, which is the percentage of reported cases associated with each strain. These frequency data were discretized into seven ordinal outcome categories and used for supervised machine learning and model selection from five ensemble algorithms. There was no significant difference in accuracy between the models, and support vector machine with linear kernel was chosen for further inference (accuracy of 89% [95% CI: 68%, 97%]). The virulence genes FAM002725, FAM002728, FAM002729, InlF, InlJ, Inlk, IisY, IisD, IisX, IisH, IisB, lmo2026, and FAM003296 were important predictors of higher frequency of illness. InlF was uniquely truncated in the sequence type 121 strains. Most important risk predictor genes occurred at highest prevalence among strains from ready-to-eat, dairy, and composite foods. We foresee that the findings and approaches described offer the potential for rethinking the current approaches in MRA.

20.
    
This article presents a Listeria monocytogenes growth model in milk at the farm bulk tank stage. The main objective was to judge the feasibility, and the value to risk assessors, of introducing a complex model, including a complete thermal model, within a microbial quantitative risk assessment scheme. Predictive microbiology models are used under varying temperature conditions to predict bacterial growth. Input distributions are estimated from data in the literature where available; if not, reasonable assumptions are made for the considered context. Previously published results based on a Bayesian analysis of growth parameters are used. A Monte Carlo simulation that forecasts bacterial growth is the focus of this study. Three scenarios that take account of the variability and uncertainty of growth parameters are compared. The effect of a sophisticated thermal model taking account of continuous variations in milk temperature was tested by comparison with a simplified model in which milk temperature was considered constant. Limited multiplication of bacteria within the farm bulk tank was modeled. The two principal factors influencing bacterial growth were found to be the tank thermostat regulation and the bacterial population growth parameters. The dilution phenomenon due to the introduction of new milk was the main factor affecting the final bacterial concentration. The results show that a model assuming constant environmental conditions at an average temperature should be acceptable for this process. This work may constitute a first step toward exposure assessment for L. monocytogenes in milk. In addition, this partly conceptual work provides guidelines for other risk assessments where continuous variation of a parameter needs to be taken into account.
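A toy version of the comparison described above, a varying-temperature run with thermostat cycling and dilution by successive milkings versus a constant-temperature run, is sketched below. The thermal behaviour, growth parameters, tank volumes, and incoming-milk contamination are all placeholder assumptions, not values from the article.

import numpy as np

def mu_max(temp_c):
    """Square-root-type growth rate for L. monocytogenes in milk (placeholder parameters)."""
    return (0.027 * max(temp_c + 2.0, 0.0)) ** 2        # ln/h

def simulate(hours=48.0, dt=0.1, constant_temp=None):
    """Bulk-tank growth with a thermostat cycling between 3 and 5 C and 500 L of fresh
    milk (1 CFU/mL) added every 12 h, which dilutes the tank concentration."""
    conc, volume, temp, cooling = 1.0, 500.0, 5.0, True
    steps_per_milking = round(12.0 / dt)
    n_steps = round(hours / dt)
    for step in range(1, n_steps + 1):
        if constant_temp is None:
            temp += (-0.5 if cooling else 0.3) * dt      # crude thermal model
            cooling = temp > 3.0 if cooling else temp >= 5.0
        else:
            temp = constant_temp
        conc *= np.exp(mu_max(temp) * dt)
        if step % steps_per_milking == 0 and step < n_steps:
            conc = (conc * volume + 1.0 * 500.0) / (volume + 500.0)   # dilution by new milk
            volume += 500.0
    return conc

print("varying tank temperature:", round(simulate(), 3), "CFU/mL")
print("constant 4 C            :", round(simulate(constant_temp=4.0), 3), "CFU/mL")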
