Similar Articles
A total of 20 similar articles were found (search time: 515 ms)
1.
2.
The comparative cost-effectiveness of interventions is a fundamental consideration of health technology assessment (HTA) in the UK.(1) The use of modelling to extrapolate benefits to patients and costs over a specified time period is a common technique in cost-effectiveness analyses. All modelling techniques, by their nature, are subject to different levels of uncertainty. Assessment and understanding of the level and impact of this uncertainty is a fundamental part of the decision-making process. In this article, we build on our previous article, "A guide to health economic evaluations", and discuss different modelling approaches to cost-effectiveness analysis and the importance of uncertainty.(2)
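As an illustration of the extrapolation step described in this abstract, the following Python sketch runs a minimal two-state Markov cohort model over a fixed time horizon. All transition probabilities, costs, utilities and the discount rate are invented for illustration and are not taken from the article.

```python
import numpy as np

# Minimal two-state ("alive"/"dead") Markov cohort model that extrapolates
# costs and QALYs over a fixed time horizon. All inputs are illustrative.
cycles = 20                    # annual cycles (time horizon in years)
p_death = 0.05                 # annual probability of death (assumed)
annual_cost = 2_000.0          # cost per year alive (assumed)
utility = 0.75                 # quality-of-life weight per year alive (assumed)
discount = 0.035               # annual discount rate (assumed)

alive = 1.0                    # start with the whole cohort alive
total_cost = 0.0
total_qaly = 0.0
for t in range(cycles):
    df = 1.0 / (1.0 + discount) ** t       # discount factor for cycle t
    total_cost += alive * annual_cost * df
    total_qaly += alive * utility * df
    alive *= (1.0 - p_death)               # transition to the next cycle

print(f"Discounted cost: {total_cost:,.0f}, discounted QALYs: {total_qaly:.2f}")
```

In a real analysis each input above would itself carry a distribution, which is exactly the uncertainty the abstract argues must be assessed.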

3.
Economic analyses have become increasingly important in healthcare in general and with respect to pharmaceuticals in particular. If economic analyses are to play an important and useful role in the allocation of scarce healthcare resources, then such analyses must be performed properly and with care. This article outlines some of the basic principles of pharmacoeconomic analysis. Every analysis should have an explicitly stated perspective, which, unless otherwise justified, should be a societal perspective. Cost minimisation, cost-effectiveness, cost-utility and cost-benefit analyses are a family of techniques used in economic analyses. Cost minimisation analysis is appropriate when alternative therapies have identical outcomes, but differ in costs. Cost-effectiveness analysis is appropriate when alternative therapies differ in clinical effectiveness but can be examined from the same dimension of health outcome. Cost-utility analysis can be used when alternative therapies may be examined using multiple dimensions of health outcome, such as morbidity and mortality. Cost-benefit analysis requires the benefits of therapy to be described in monetary units and is not usually the technique of choice. The technique used in an analysis should be described and explicitly defended according to the problem being examined. For each technique, the method of determining costs is the same; direct, indirect, and intangible costs can be considered. The specific costs to be used depend on the analytical perspective; a societal perspective implies the use of both direct and indirect economic costs. A modelling framework such as a decision tree, influence diagram, Markov chain, or network simulation must be used to structure the analysis explicitly. Regardless of the choice of framework, all modelling assumptions should be described. The mechanism of data collection for model inputs must be detailed and defended. Models must undergo careful verification and validation procedures. Following baseline analysis of the model, further analyses should examine the role of uncertainty in model assumptions and data.
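For cost-effectiveness analysis, the family of techniques described above ultimately reduces to comparing incremental costs and outcomes. A minimal sketch (all figures hypothetical) of how the incremental cost-effectiveness ratio shifts with the analytical perspective once indirect costs are included:

```python
# Illustrative incremental cost-effectiveness ratio (ICER) calculation.
# Under a societal perspective both direct and indirect costs are counted;
# all figures are hypothetical.
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost per additional unit of health outcome (e.g. per QALY)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Direct and indirect costs and QALYs for a new therapy vs. standard care
new = {"direct": 12_000, "indirect": 3_000, "qalys": 6.2}
old = {"direct": 8_000, "indirect": 5_000, "qalys": 5.8}

societal = icer(new["direct"] + new["indirect"], new["qalys"],
                old["direct"] + old["indirect"], old["qalys"])
payer = icer(new["direct"], new["qalys"], old["direct"], old["qalys"])
print(f"ICER (societal perspective): {societal:,.0f} per QALY")
print(f"ICER (payer perspective):    {payer:,.0f} per QALY")
```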

4.
Nestorov I. Toxicology Letters 2001; 120(1-3): 411-420.
Two important methodological issues within the framework of the variability and uncertainty analysis of toxicokinetic and pharmacokinetic systems are discussed: (i) modelling and simulation of the existing physiologic variability in a population; and (ii) modelling and simulation of variability and uncertainty when there is insufficient or not well defined (e.g. small sample, semiquantitative, qualitative and vague) information available. Physiologically based pharmacokinetic models are especially suited for separating and characterising the physiologic variability from the overall variability and uncertainty in the system. Monte Carlo sampling should draw from multivariate distributions, which reflect all levels of existing dependencies in the intact organism. The population characteristics should be taken into account. A fuzzy simulation approach is proposed to model variability and uncertainty when there is semiquantitative, qualitative and vague information about the model parameters and their statistical distributions cannot be defined reliably.
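A small sketch of the sampling point made above: drawing parameters from a joint (correlated) distribution preserves physiological dependencies that independent sampling of the marginals loses. The parameter names, values and covariance below are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative correlated physiological parameters (e.g. cardiac output and
# liver blood flow, arbitrary units). Sampling from the joint distribution
# preserves the dependency; independent marginal sampling does not.
mean = np.array([6.0, 1.5])
cov = np.array([[0.50, 0.20],
                [0.20, 0.10]])          # positive covariance between the two

joint = rng.multivariate_normal(mean, cov, size=10_000)
indep = np.column_stack([rng.normal(mean[i], np.sqrt(cov[i, i]), 10_000)
                         for i in range(2)])

print("correlation, joint sampling:      ", np.corrcoef(joint.T)[0, 1].round(2))
print("correlation, independent sampling:", np.corrcoef(indep.T)[0, 1].round(2))
```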

5.
Green C. PharmacoEconomics 2007; 25(9): 735-750.
The literature reporting economic evaluations related to the treatment of Alzheimer's disease (AD) has developed over the last decade. Most analyses have used economic models to estimate the cost effectiveness of drugs for the treatment of AD. This review considers the range of methods used in the published cost-effectiveness literature to model AD progression and the effect of interventions on the progression of AD. The review builds on and updates an earlier systematic review of cost-effectiveness studies on drugs for AD. Systematic and rigorous methods were used to search the literature for economic evaluations estimating the cost effectiveness of donepezil, rivastigmine, galantamine or memantine in AD. The literature search covered a wide range of electronic databases (e.g. MEDLINE, EMBASE), and included literature from the inception of databases up to the end of 2005. The search identified 22 published economic evaluations. An outline and brief critical review of the identified studies is provided, and thereafter the methods used to model disease progression were considered in more detail. The review employs recent guidance on good practice in decision-analytic modelling in HTA to critically review the modelling methods used. Using this guidance, the models are assessed against the broad criteria of model structure, data inputs and assessment of uncertainty and consistency. Concerns were noted over the model structure employed in all models. The reliance on cognitive scores to model AD, the progression of the disease, and the effect of treatment on costs and consequences is regarded as a serious limitation in almost all of the studies identified. There are also limitations over the data used to populate published models, especially around the failure of studies to document and establish the basis for the modelling of treatment effects. It is also clear that studies modelling AD progression, and subsequently the cost effectiveness of treatment, have not addressed uncertainty or consistency (internal and/or external) in sufficient detail. Further research is required on more appropriate methods for the modelling of AD progression. In the meantime, future economic evaluations of treatment need to be more explicit on the methods used to model AD, and the data used to populate models.

6.
Over the last decade or so, a number of healthcare systems have used economic evaluations as a formal input into decisions about the coverage or reimbursement of new healthcare interventions. This change in the policy landscape has placed some important demands on the design and characteristics of economic evaluation and these are increasingly evident in studies being presented to decision makers. One challenge has been to make studies specific to the context in which the decision is being taken. This is because of the inevitable geographical variation in many of the parameters within an analysis. There has been a series of important contributions to the published literature in recent years on how to quantify geographical heterogeneity within economic analyses based on randomised controlled trials. However, there are good reasons for economic evaluation for decision making to be undertaken using methods of evidence synthesis and decision analytical modelling, but issues of geographical variation still need to be handled appropriately. The key requirements of economic evaluations for decision making within healthcare systems can be defined as follows: (i) a design that meets the objectives and constraints of the healthcare system; (ii) coherent and complete specification of the decision problem; (iii) inclusion of all relevant evidence; and (iv) recognition and appropriate handling of uncertainty. In satisfying these requirements, it is important to be aware of variation between jurisdictions, and this imposes some important analytical requirements on economic studies. While many agencies have produced guidelines on preferred methods for healthcare economic evaluation, these exhibit considerable variation. Some of this variation can be justified by genuine differences between systems in clinical practice, objectives and constraints, while some of the variation relates to differences of opinion about appropriate analysis given methodological uncertainty. However, some of the variation in guidance is difficult to justify and is inconsistent with the aims and objectives of the systems the analyses are seeking to inform. Decision makers and analysts need to work together to streamline and where possible harmonise guidelines on methods for economic evaluations, whilst recognising legitimate variation in the needs of different healthcare systems. Otherwise, there is the risk that scarce resources will be wasted in producing country-specific analyses in situations where these are not justified. Expected value of information analyses are also emerging as a tool that could be considered by decision makers to guide their policy on the acceptance or non-acceptance of data from other jurisdictions.
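The closing point about expected value of information can be illustrated with a short sketch. Given hypothetical probabilistic sensitivity analysis samples for two strategies, the per-decision expected value of perfect information (EVPI) is the mean of the best attainable net benefit minus the net benefit of the strategy that is best on average; all distributions and the willingness-to-pay threshold below are invented.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000
wtp = 20_000                      # willingness to pay per QALY (assumed)

# Hypothetical probabilistic sensitivity analysis samples for two strategies
cost = np.column_stack([rng.normal(5_000, 500, n), rng.normal(9_000, 1_500, n)])
qaly = np.column_stack([rng.normal(4.0, 0.2, n), rng.normal(4.3, 0.3, n)])
nb = wtp * qaly - cost            # net monetary benefit per sample and strategy

# EVPI = E[max over strategies of NB] - max over strategies of E[NB]
evpi = nb.max(axis=1).mean() - nb.mean(axis=0).max()
print(f"Per-decision EVPI: {evpi:,.0f}")
```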

7.
Weinstein MC. PharmacoEconomics 2006; 24(11): 1043-1053.
The past few years have seen rapid changes in the methods of decision-analytic modelling of healthcare programmes for the purposes of economic evaluation. This paper focuses on four developments in modelling that have emerged over the past few years or have become more widely used. First, no one optimal method for extrapolating outcomes from clinical trials has yet been established. Modellers may draw from a set of varied assumptions about survival extrapolation that encompass a range of possibilities from highly optimistic to extremely cautious. Secondly, the practicality and appeal of microsimulation as a method for analysing healthcare decision problems has increased dramatically with the speed of computing technology. Individual instantiations of a system are generated by using a random process to draw from probability distributions a large number of times (also known as Monte Carlo or probabilistic simulation). Microsimulation is moving in new directions, such as discrete-event simulations that simulate sequences of events by drawing directly from probability distributions of event times; this approach is now being broadly applied to model situations where populations of patients interact with healthcare delivery systems. Microsimulation modelling of transmission systems at the population level is also rapidly developing. Thirdly, model calibration is emerging as a new tool that may offer health scientists a means of generating important fundamental knowledge about disease processes. Model calibration allows evidence synthesis in which observations on observable quantities are used to draw inferences about unobservable quantities. The methodology of model calibration has advanced considerably, drawing on theories of numerical analysis and mathematical programming such as gradient methods, intelligent grid search algorithms, and many more. As a fourth issue, an area of extraordinary activity is in the use of transmission models to analyse interventions for infectious diseases, including population-wide effects of vaccination. Transmission models use differential equations to simulate, deterministically for the most part, transitions among infection-related health states. Only recently have modelling methodologies been combined so that cost-effectiveness analyses can consider explicitly not only the patient-level benefits of interventions but also the secondary benefits through transmission dynamics. Advances in technology allow more realistic and complex healthcare models to be simulated more rapidly. However, decision makers will not readily accept results from models unless they can understand them intuitively and explain them to others in relatively simple terms. The challenge for the next generation of modellers is not only to harness the power available from these newly accessible methods, but also to extract from the new generation of models the insights that will have the power to influence decision makers.
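A minimal sketch of the discrete-event idea described above: event times are drawn directly from probability distributions and the earliest event determines each simulated patient's pathway. The event rates and horizon are illustrative only, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
horizon = 5.0                       # years of follow-up (assumed)
n_patients = 10_000

# Minimal discrete-event microsimulation: for each simulated patient, event
# times are drawn directly from (illustrative) exponential distributions.
t_relapse = rng.exponential(scale=1 / 0.30, size=n_patients)   # rate 0.30/yr
t_death = rng.exponential(scale=1 / 0.05, size=n_patients)     # rate 0.05/yr

first_event = np.minimum(t_relapse, t_death)
relapse_first = (t_relapse < t_death) & (t_relapse < horizon)
death_first = (t_death <= t_relapse) & (t_death < horizon)

print(f"Relapse first within {horizon:.0f} y: {relapse_first.mean():.1%}")
print(f"Death first within {horizon:.0f} y:   {death_first.mean():.1%}")
print(f"Mean time to first event (capped):    "
      f"{np.minimum(first_event, horizon).mean():.2f} y")
```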

8.
The aim of this work is to implement a conservative prior that safeguards against population non-exchangeability of prior and data likelihood, in the framework of population pharmacokinetic/pharmacodynamic analysis, incorporating multi-level hierarchical modelling. Three different exercises were performed: (i) we investigated the use of parametric priors in the multilevel hierarchical modelling framework; (ii) we assessed the average performance of the multilevel hierarchical model compared to the standard mixed effect model, considering also some interesting extreme cases; and (iii) we implemented an application with a small Proof of Principle (PoP) study, which demonstrates the propagation of information across PD studies using multilevel modelling. Fitting with the 4-level model and informative parametric priors performed similarly to a meta-analysis of the test datasets combined with the datasets from which the priors were derived, demonstrating that parametric priors can be used as an alternative to meta-analysis. Further, the 4-level model gave posterior distributions which had larger uncertainty but at the same time were unbiased, compared to the 3-level model, and therefore implements a more conservative prior in a formal way, which is appropriate when the prior and the test populations are not exchangeable. For the application with the PoP study, the statistical power to detect a difference in potency between two drugs in the presence of inter-study variability was greater when an extra level was added to the hierarchical model to account for it. In conclusion, by applying the prior one hierarchical level above the level of the parameters of interest, we implemented a more conservative prior, compared to applying the prior directly on the parameters of interest. The approach is equivalent to Bayesian individualization, offers a safeguard against bias from the prior and also avoids the danger of the data being overwhelmed by a strong prior.

9.
The National Institute for Health and Clinical Excellence (NICE) recently issued updated guidance on the use of cholinesterase inhibitors in the treatment of Alzheimer's disease. NICE initially recommended that cholinesterase inhibitors no longer be used, but final guidance restricted treatment to patients with disease of a moderately severe stage. This decision was based largely on results from a heavily criticised economic evaluation that used an adaptation of the Assessment of Health Economics in Alzheimer's Disease (AHEAD) model. As the developers of the AHEAD model, we examined the appropriateness of NICE's economic analyses and presentation of results. We attempted to replicate NICE's results by modifying the original AHEAD model. Sensitivity analyses were then run using the modified AHEAD model to evaluate the extent of uncertainty in predictions. The AHEAD(NICE) analyses resulted in an incremental cost-effectiveness ratio for galantamine of £82,000 per QALY gained (year 2003 values) from the perspective of the UK NHS and Personal Social Services. This was later revised to £46,000 per QALY, compared with < £9000 per discounted QALY gained (year 2001 values) in the original AHEAD model. Using our modified AHEAD with effectiveness estimates matching those of AHEAD(NICE), we show that NICE's choice and presentation of sensitivity analyses obscured the instability of their estimates. In the final NICE evaluation, the recommendation to delay treatment with cholinesterase inhibitors until patients have moderately severe disease was based on critical assumptions in the economic analyses that had little evidence to support them. The case of NICE's guidance on cholinesterase inhibitors highlights the importance of transparent and valid economic evaluations and the dangers of using inappropriate modelling technologies, basing analyses on a limited subset of the available data, and insufficiently reflecting the uncertainty in estimates that are intended to inform decision makers.

10.
Information theoretic methods are often used to design studies that aim to learn about pharmacokinetic and linked pharmacokinetic-pharmacodynamic systems. These design techniques, such as D-optimality, provide the optimum experimental conditions. The performance of the optimum design will depend on the ability of the investigator to comply with the proposed study conditions. However, in clinical settings it is not possible to comply exactly with the optimum design and hence some degree of unplanned suboptimality occurs due to error in the execution of the study. In addition, due to the nonlinear relationship of the parameters of these models to the data, the designs are also locally dependent on an arbitrary choice of a nominal set of parameter values. A design that is robust to both study conditions and uncertainty in the nominal set of parameter values is likely to be of use clinically. We propose an adaptive design strategy to account for both execution error and uncertainty in the parameter values. In this study we investigate designs for a one-compartment first-order pharmacokinetic model. We do this in a Bayesian framework using Markov-chain Monte Carlo (MCMC) methods. We consider log-normal prior distributions on the parameters and investigate several prior distributions on the sampling times. An adaptive design was used to find the sampling window for the current sampling time conditional on the actual times of all previous samples.
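The local dependence on nominal parameter values mentioned above can be made concrete with a small sketch of a plain (non-adaptive) locally D-optimal design for a one-compartment bolus model; the paper's adaptive Bayesian/MCMC strategy is considerably more involved, and the model form, nominal clearance and volume used here are assumptions for illustration only.

```python
import numpy as np
from itertools import combinations

# Sketch of a locally D-optimal design for a one-compartment bolus model
# C(t) = (dose / V) * exp(-(CL / V) * t), with assumed nominal CL and V.
dose, CL, V = 100.0, 2.0, 10.0          # illustrative nominal values

def conc(t, cl, v):
    return (dose / v) * np.exp(-(cl / v) * t)

def d_criterion(times):
    """det(J'J) for the sensitivities of C(t) w.r.t. (CL, V), by finite differences."""
    eps = 1e-4
    J = np.column_stack([
        (conc(times, CL + eps, V) - conc(times, CL - eps, V)) / (2 * eps),
        (conc(times, CL, V + eps) - conc(times, CL, V - eps)) / (2 * eps),
    ])
    return np.linalg.det(J.T @ J)

candidates = np.arange(0.25, 24.25, 0.25)           # candidate sampling times (h)
best = max(combinations(candidates, 2), key=lambda ts: d_criterion(np.array(ts)))
print("Locally D-optimal 2-point design (h):", best)
```

Because the criterion is evaluated at the assumed nominal (CL, V), the chosen sampling times shift if those nominal values change, which is precisely the sensitivity the adaptive strategy is intended to mitigate.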

12.
13.
In a recent leading article in PharmacoEconomics, Nuijten described some methods for incorporating uncertainty into health economic models and for utilising the information on uncertainty regarding the cost effectiveness of a therapy in resource allocation decision-making. His proposals are found to suffer from serious flaws in statistical and health economic reasoning. Nuijten's suggestions for incorporating uncertainty: (a) wrongly interpret the p-value as the probability that the null hypothesis is true; (b) represent this probability wrongly by truncating the input distribution; and (c) in the specific example of an antiparkinsonian drug, use a completely inappropriate p-value of 0.05 when the null hypothesis would, in reality, be emphatically disproved by the data. His suggestions regarding minimum important differences in cost effectiveness: (a) introduce areas of indifference that suggest inappropriate reliance on cost minimisation while failing to recognise that decisions should be based on expected costs versus benefits; and (b) offer no guidance on how the probabilities associated with these areas could be used in decision-making. Furthermore, Nuijten's model for Parkinson's disease is over-simplified to the point of providing a bad example of modelling practice, which may mislead the readers of PharmacoEconomics. The rationale for this paper is to ensure that readers do not apply inappropriate analyses as a result of following the proposals contained in Nuijten's paper. In addition to a detailed critique of Nuijten's proposals, we provide brief summaries of the currently accepted best practice in cost-effectiveness decision-making under uncertainty.

14.
Schizophrenia is a chronic disease characterized by periods of relative stability interrupted by acute episodes (or relapses). The course of the disease may vary considerably between patients. Patient histories show considerable inter- and even intra-individual variability. We provide a critical assessment of the advantages and disadvantages of three modelling techniques that have been used in schizophrenia: decision trees, (cohort and microsimulation) Markov models and discrete event simulation models. These modelling techniques are compared in terms of building time, data requirements, medico-scientific experience, simulation time, clinical representation, and their ability to deal with patient heterogeneity, the timing of events, prior events, patient interaction, interaction between co-variates and variability (first-order uncertainty). We note that, depending on the research question, the optimal modelling approach should be selected based on the expected differences between the comparators, the number of co-variates, the number of patient subgroups, the interactions between co-variates, and simulation time. Finally, it is argued that, where microsimulation is required for the cost-effectiveness analysis of schizophrenia treatments, a discrete event simulation model is best suited to accurately capture all of the relevant interdependencies in this chronic, highly heterogeneous disease with limited long-term follow-up data.
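For contrast with the simulation-based approaches, the simplest of the three structures, a decision tree, reduces to an expected-value calculation over branch probabilities. A toy sketch with invented strategy names, probabilities and costs:

```python
# Minimal decision-tree expected-value calculation for two hypothetical
# antipsychotic strategies; probabilities and costs are illustrative only.
strategies = {
    "drug_A": {"p_relapse": 0.30, "drug_cost": 2_500},
    "drug_B": {"p_relapse": 0.40, "drug_cost": 1_200},
}
relapse_cost = 8_000          # cost of managing an acute episode (assumed)

for name, s in strategies.items():
    expected_cost = s["drug_cost"] + s["p_relapse"] * relapse_cost
    print(f"{name}: expected 1-year cost = {expected_cost:,.0f}")
```

A tree like this cannot represent the timing of events, prior events or patient heterogeneity, which is the abstract's argument for discrete event simulation in a chronic, relapsing disease such as schizophrenia.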

15.
Currently the extrapolation of evidence from studies of non-human species to the setting of environmental exposure standards for humans includes the imposition of a variety of uncertainty factors reflecting unknown aspects of the procedure, including the relevance of evidence from one species to impacts in another. This paper develops and explores more flexible modelling of aspects of this extrapolation, using models proposed by DuMouchel [DuMouchel, W.H., Harris, J.E., 1983. Bayes methods for combining the results of cancer studies in humans and other species (with comment). J. Am. Statist. Assoc. 78, 293–308.] The approaches are based on Bayesian meta-analysis methods involving explicit modelling of relevance in the prior distributions, estimated using Markov chain Monte Carlo (MCMC) methods. The methods are applied to evidence relating chlorinated by-products exposure to adverse reproductive health effects. The relative merits of various approaches are discussed, and developments and next steps are outlined.

16.
Experiments with relatively high doses are often used to predict risks at appreciably lower doses. A point of departure (PoD) can be calculated as the dose associated with a specified moderate response level that is often in the range of experimental doses considered. A linear extrapolation to lower doses often follows. An alternative to the PoD method is to develop a model that accounts for the model uncertainty in the dose–response relationship and to use this model to estimate the risk at low doses. Two such approaches that account for model uncertainty are model averaging (MA) and semi-parametric methods. We use these methods, along with the PoD approach, in the context of a large animal bioassay (40,000+ animals) that exhibited sub-linearity. When models are fit to high dose data and risks at low doses are predicted, the methods that account for model uncertainty produce dose estimates associated with an excess risk that are closer to the observed risk than the PoD linearization. This comparison provides empirical support to accompany previous simulation studies that suggest methods that incorporate model uncertainty provide viable, and arguably preferred, alternatives to linear extrapolation from a PoD.
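A sketch of the contrast drawn above between linear extrapolation from a point of departure and model averaging. The dose-response forms, "fitted" parameters and AIC values below are hypothetical; with sub-linear models the averaged low-dose risk falls below the PoD linearisation, which illustrates (but does not reproduce) the qualitative pattern described.

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative comparison of low-dose risk from (a) linear extrapolation from
# a point of departure (BMD10) and (b) model averaging with AIC weights.
def extra_risk_onehit(d, b=0.02):
    return 1.0 - np.exp(-b * d)

def extra_risk_weibull(d, b=0.004, k=1.5):
    return 1.0 - np.exp(-b * d ** k)

models = [(extra_risk_onehit, 210.0), (extra_risk_weibull, 205.0)]   # (fn, hypothetical AIC)
aics = np.array([aic for _, aic in models])
weights = np.exp(-0.5 * (aics - aics.min()))
weights /= weights.sum()

# Point of departure: dose giving 10% extra risk under the better-fitting model
bmd10 = brentq(lambda d: extra_risk_weibull(d) - 0.10, 1e-6, 1_000.0)

low_dose = 1.0
linear = 0.10 * low_dose / bmd10                                  # PoD linearisation
averaged = sum(w * fn(low_dose) for (fn, _), w in zip(models, weights))
print(f"BMD10 = {bmd10:.1f}; linear extrapolation risk at dose 1: {linear:.2e}")
print(f"Model-averaged risk at dose 1: {averaged:.2e}")
```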

17.
The use of decision-analytic modelling for the purpose of health technology assessment (HTA) has increased dramatically in recent years. Several guidelines for best practice have emerged in the literature; however, there is no agreed standard for what constitutes a 'good model' or how models should be formally assessed. The objective of this paper is to identify, review and consolidate existing guidelines on the use of decision-analytic modelling for the purpose of HTA and to develop a consistent framework against which the quality of models may be assessed. The review and resultant framework are summarised under the three key themes of Structure, Data and Consistency. 'Structural' aspects relate to the scope and mathematical structure of the model including the strategies under evaluation. Issues covered under the general heading of 'Data' include data identification methods and how uncertainty should be addressed. 'Consistency' relates to the overall quality of the model. The review of existing guidelines showed that although authors may provide a consistent message regarding some aspects of modelling, such as the need for transparency, they are contradictory in other areas. Particular areas of disagreement are how data should be incorporated into models and how uncertainty should be assessed. For the purpose of evaluation, the resultant framework is applied to a decision-analytic model developed as part of an appraisal for the National Institute for Health and Clinical Excellence (NICE) in the UK. As a further assessment, the review based on the framework is compared with an assessment provided by an independent experienced modeller not using the framework. It is hoped that the framework developed here may form part of the appraisals process for assessment bodies such as NICE and decision models submitted to peer review journals. However, given the speed with which decision-modelling methodology advances, there is a need for its continual update.

18.
In economic evaluation, mathematical models have a central role as a way of integrating all the relevant information about a disease and health interventions, in order to estimate costs and consequences over an extended time horizon. Models are based on scientific knowledge of disease (which is likely to change over time), simplifying assumptions and input parameters with different levels of uncertainty; therefore, it is sensible to explore the consistency of model predictions with observational data. Calibration is a useful tool for estimating uncertain parameters, as well as more accurately defining model uncertainty (particularly with respect to the representation of correlations between parameters). Calibration involves the comparison of model outputs (e.g. disease prevalence rates) with empirical data, leading to the identification of model parameter values that achieve a good fit. This article provides guidance on the theoretical underpinnings of different calibration methods. The calibration process is divided into seven steps and different potential methods at each step are discussed, focusing on the particular features of disease models in economic evaluation. The seven steps are (i) Which parameters should be varied in the calibration process? (ii) Which calibration targets should be used? (iii) What measure of goodness of fit should be used? (iv) What parameter search strategy should be used? (v) What determines acceptable goodness-of-fit parameter sets (convergence criteria)? (vi) What determines the termination of the calibration process (stopping rule)? (vii) How should the model calibration results and economic parameters be integrated? The lack of standards in calibrating disease models in economic evaluation can undermine the credibility of calibration methods. In order to avoid the scepticism regarding calibration, we ought to unify the way we approach the problems and report the methods used, and continue to investigate different methods.
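A toy version of the calibration loop outlined in steps (i)-(v) above: one uncertain progression rate is varied, predicted prevalence is compared with invented calibration targets via a sum-of-squares goodness-of-fit measure, and a simple grid search with a crude acceptance rule identifies well-fitting values. The disease model and all numbers are illustrative assumptions.

```python
import numpy as np

# Minimal calibration sketch: search an uncertain progression rate so that
# model-predicted prevalence matches (illustrative) observed calibration targets.
target_years = np.array([5, 10, 15])
target_prev = np.array([0.08, 0.15, 0.21])        # hypothetical observed prevalence

def predicted_prevalence(rate, years):
    """Toy disease model: prevalence approaches 0.30 at an uncertain rate."""
    return 0.30 * (1.0 - np.exp(-rate * years))

def goodness_of_fit(rate):
    return np.sum((predicted_prevalence(rate, target_years) - target_prev) ** 2)

grid = np.linspace(0.01, 0.50, 500)               # parameter search: simple grid
fits = np.array([goodness_of_fit(r) for r in grid])
best = grid[fits.argmin()]
accepted = grid[fits < 2 * fits.min()]            # crude acceptance rule

print(f"Best-fitting rate: {best:.3f}; accepted range: "
      f"{accepted.min():.3f}-{accepted.max():.3f}")
```

In a full analysis the accepted parameter sets, rather than a single best fit, would be carried forward into the economic model so that calibration uncertainty propagates into the results (step vii).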

19.
A review of graphical and test-based methods for evaluating assumptions underlying the use of least squares analysis with the general linear model is presented along with some discussion of robustness. Alternative analyses are described for situations where there is evidence that the assumptions are not reasonable. Evaluation of the assumptions is illustrated through the use of an example from a clinical trial used for US registration purposes. It is recommended that: (1) most assumptions required for the least squares analysis of data using the general linear model can be judged using residuals graphically without the need for formal testing, (2) it is more important to normalize data or to use nonparametric methods when there is heterogeneous variance between treatment groups, and (3) nonparametric analyses can be used to demonstrate robustness of results and that it is best to specify these analyses prior to unblinding.
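A small Python sketch of the kind of residual checking described; the paper favours graphical judgement (residuals vs fitted values, normal QQ plots), so the two formal tests below merely stand in for those plots, and all data are simulated.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated parallel-group trial data (illustrative): response depends on a
# baseline covariate and treatment, with larger variance in one group.
n = 60
treat = np.repeat([0, 1], n // 2)
baseline = rng.normal(50, 10, n)
y = 5 + 0.4 * baseline + 3 * treat + rng.normal(0, 4 + 4 * treat, n)

# General linear model fit by least squares: intercept + baseline + treatment
X = np.column_stack([np.ones(n), baseline, treat])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Numerical stand-ins for the recommended graphical checks
_, p_norm = stats.shapiro(resid)                            # normality of residuals
_, p_var = stats.levene(resid[treat == 0], resid[treat == 1])  # equal variances
print(f"Shapiro-Wilk p (normality of residuals): {p_norm:.3f}")
print(f"Levene p (equal variance across groups): {p_var:.3f}")
```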

20.
A major problem in risk assessment is the quantification of uncertainties. A probabilistic model was developed to consider uncertainties in the effect assessment of hazardous substances at the workplace. Distributions for extrapolation factors (time extrapolation, inter- and intraspecies extrapolation) were determined on the basis of appropriate empirical data. Together with the distribution for the benchmark dose obtained from substance-specific dose-response modelling for the example substances 2,4,4-trimethylpentene (TMP) and aniline, they represent the input distributions for probabilistic modelling. These distributions were combined by Monte Carlo simulation. The resulting target distribution describes the probability that a desired protection level for workers is achieved at a certain dose and the uncertainty associated with the assessment. In the case of aniline, substance-specific data on differences in susceptibility (between species; among humans due to genetic polymorphisms of N-acetyltransferase) were integrated in the model. Medians of the obtained target distributions of the basic models for TMP and aniline, but not of the specific aniline model, are similar to deterministically derived reference values. Differences of more than one order of magnitude between the medians and the 5th percentile of the target distributions indicate substantial uncertainty associated with the effect assessment of these substances. The probabilistic effect assessment model proves to be a practical tool to integrate quantitative information on uncertainty and variability in hazard characterisation.
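A minimal sketch of the Monte Carlo combination step: a benchmark-dose distribution is divided by sampled extrapolation factors to give a target distribution, from which the median and 5th percentile can be read off. All distributional assumptions below are invented and are not the paper's substance-specific inputs.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Illustrative probabilistic effect assessment: a benchmark-dose distribution
# is combined with lognormal distributions for the extrapolation factors by
# Monte Carlo simulation. All distribution parameters are hypothetical.
bmd = rng.lognormal(mean=np.log(50.0), sigma=0.3, size=n)   # mg/kg/day
ef_time = rng.lognormal(np.log(2.0), 0.4, n)                # time extrapolation
ef_inter = rng.lognormal(np.log(4.0), 0.5, n)               # interspecies
ef_intra = rng.lognormal(np.log(3.0), 0.5, n)               # intraspecies

target = bmd / (ef_time * ef_inter * ef_intra)              # target distribution

median, p5 = np.percentile(target, [50, 5])
print(f"Median: {median:.2f} mg/kg/day, 5th percentile: {p5:.2f} mg/kg/day")
print(f"Ratio median / 5th percentile: {median / p5:.1f}")
```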
