Similar Literature
20 similar records found (search time: 15 ms)
1.
An asymmetric multivariate generalization of the recently proposed class of normal mixture GARCH models is developed. Issues of parametrization and estimation are discussed. Conditions for covariance stationarity and the existence of the fourth moment are derived, and expressions for the dynamic correlation structure of the process are provided. In an application to stock market returns, it is shown that the disaggregation of the conditional (co)variance process generated by the model provides substantial intuition. Moreover, the model exhibits strong performance in calculating out-of-sample Value-at-Risk measures.

2.
This paper examines the effectiveness of futures contracts as hedging instruments under: (1) alternative volatility models for estimating conditional variances and covariances; (2) alternative currencies; and (3) alternative maturities of futures contracts. For this purpose, daily data on futures and spot exchange rates of three major international currencies, the Euro, the British pound and the Japanese yen, against the US dollar are used to analyze the hedge ratios and hedging effectiveness that result from using contracts of two different maturities, the near-month and the next-to-near-month contract. We estimate four multivariate volatility models (namely CCC, VARMA-AGARCH, DCC and BEKK), and calculate optimal portfolio weights and optimal hedge ratios to identify appropriate currency hedging strategies. The hedging effectiveness index suggests that the best results in terms of reducing portfolio variance are obtained for the USD/GBP exchange rate. The empirical results show that futures hedging strategies are slightly more effective when the near-month futures contract is used for the USD/GBP and USD/JPY currencies. Moreover, the CCC and AGARCH models provide similar hedging effectiveness, which suggests that dynamic asymmetry may not be crucial empirically, although some differences appear when the DCC and BEKK models are used.
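The minimum-variance hedge ratio and the hedging effectiveness index mentioned in this abstract can be illustrated with a static (OLS-style) sketch; the paper itself derives time-varying ratios from multivariate GARCH conditional covariances, which this simplified version does not attempt. All data below are simulated for illustration.

```python
import numpy as np

def optimal_hedge_ratio(spot_returns, futures_returns):
    """Static minimum-variance hedge ratio h* = Cov(s, f) / Var(f).
    A simplified stand-in for the conditional ratios obtained from
    multivariate GARCH covariance estimates."""
    cov = np.cov(spot_returns, futures_returns, ddof=1)
    return cov[0, 1] / cov[1, 1]

def hedging_effectiveness(spot_returns, futures_returns, h):
    """HE = 1 - Var(hedged portfolio) / Var(unhedged spot position)."""
    hedged = spot_returns - h * futures_returns
    return 1.0 - np.var(hedged, ddof=1) / np.var(spot_returns, ddof=1)

rng = np.random.default_rng(0)
f = rng.normal(0, 1, 5000)                 # futures returns
s = 0.8 * f + rng.normal(0, 0.3, 5000)     # spot closely tracks futures
h = optimal_hedge_ratio(s, f)
he = hedging_effectiveness(s, f, h)
```

With the spot generated as 0.8 times the futures return plus noise, the estimated ratio is close to 0.8 and the hedge removes most of the spot variance.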

3.
In this study, a discrete-time robust nonlinear filtering algorithm is proposed to deal with contaminated Gaussian measurement noise, based on a robust modification of the derivative-free Kalman filter. By interpreting the Kalman-type filter (KTF) as a recursive Bayesian approximation, the innovation is reformulated using Huber's M-estimation methodology. The proposed algorithm achieves not only the robustness of M-estimation but also the accuracy and flexibility of the derivative-free Kalman filter for nonlinear problems. The reliability and accuracy of the proposed algorithm are tested on the univariate nonstationary growth model.
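The core idea of Huberizing the innovation can be sketched in a scalar linear setting: downweight large standardized innovations before the measurement update. This is an illustrative stand-in, not the paper's derivative-free filter, and the threshold 1.345 is just the conventional Huber tuning constant.

```python
import math

def huber_weight(r, k=1.345):
    """Huber weight: 1 inside the threshold, k/|r| outside."""
    a = abs(r)
    return 1.0 if a <= k else k / a

def robust_kalman_update(x_pred, P_pred, z, R):
    """Scalar Kalman measurement update with a Huber-type reweighting
    of the innovation: suspect measurements get an inflated effective
    noise variance, so they move the estimate less."""
    S = P_pred + R                       # innovation variance
    r = (z - x_pred) / math.sqrt(S)      # standardized innovation
    w = huber_weight(r)                  # downweight outlying innovations
    R_eff = R / w                        # inflate noise for outliers
    K = P_pred / (P_pred + R_eff)
    x_new = x_pred + K * (z - x_pred)
    P_new = (1 - K) * P_pred
    return x_new, P_new

# A gross outlier moves the robust estimate far less than the classic update.
x_r, _ = robust_kalman_update(0.0, 1.0, 50.0, 1.0)
x_c = 0.0 + (1.0 / (1.0 + 1.0)) * (50.0 - 0.0)   # classic Kalman estimate
```

The classic update is pulled halfway to the outlier (25.0), while the Huberized update stays near the prediction.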

4.
5.
The class of fractionally integrated generalised autoregressive conditional heteroskedastic (FIGARCH) models is extended for modelling the periodic long-range dependence typically shown by volatility of most intra-daily financial returns. The proposed class of models introduces generalised periodic long-memory filters, based on Gegenbauer polynomials, into the equation describing the time-varying volatility of standard GARCH models. A fitting procedure is illustrated and its performance is evaluated by means of Monte Carlo simulations. The effectiveness of these models in describing periodic long-memory volatility patterns is shown through an empirical application to the Euro-Dollar intra-daily exchange rate.

6.
Multivariate time series may contain outliers of different types. In the presence of such outliers, applying standard multivariate time series techniques becomes unreliable. A robust version of multivariate exponential smoothing is proposed. The method is affine equivariant, and involves the selection of a smoothing parameter matrix by minimizing a robust loss function. It is shown that the robust method results in much better forecasts than the classic approach in the presence of outliers, and performs similarly when the data contain no outliers. Moreover, the robust procedure yields an estimator of the smoothing parameter less subject to downward bias. As a byproduct, a cleaned version of the time series is obtained, as is illustrated by means of a real data example.
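A univariate sketch conveys the mechanism: clip the one-step forecast error before updating, and record the clipped observations as the "cleaned" series the abstract mentions as a byproduct. The fixed clipping bound `k` is a simplification; the paper's multivariate method estimates scale robustly and selects the smoothing matrix by minimizing a robust loss.

```python
def robust_exp_smoothing(y, lam=0.3, k=2.0):
    """Robust exponential smoothing, univariate sketch: the one-step
    error is Huber-clipped before the update, bounding the influence
    of outliers; clipped values form a cleaned series."""
    smooth, cleaned = [y[0]], [y[0]]
    for t in range(1, len(y)):
        pred = smooth[-1]                 # one-step-ahead forecast
        e = y[t] - pred
        e_clip = max(-k, min(k, e))       # bound the outlier's influence
        cleaned.append(pred + e_clip)
        smooth.append(pred + lam * e_clip)
    return smooth, cleaned

y = [10.0, 10.2, 9.9, 50.0, 10.1]         # one gross outlier at t = 3
smooth, cleaned = robust_exp_smoothing(y)
```

The outlier at t = 3 is replaced in the cleaned series by a value near the forecast, and the smoothed level barely moves.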

7.
An extension of iterative closest point (ICP) matching based on M-estimation is proposed to achieve robustness to non-overlapping or outlying data in two sets of contour data or depth images of rigid bodies. An objective function with independent residual components for each of the x, y and z coordinates is defined to evaluate the fit while simultaneously handling a distribution of outlying gross noise. The procedure uses modified M-estimation iterations with biweight coefficients to select corresponding points when estimating the matching transforms. The transforms are represented by quaternions, which avoid the redundancy of representing the rotational degrees of freedom by linear matrices. The optimization steps are performed with the simplex method, since it requires no derivative computations. Fundamental experiments on real 2D and 3D measurement data show the effectiveness of the proposed method: given reasonable initial positions, a unique solution for the position is obtained despite surplus point data in the objects, and the outlying data are filtered out from the normal ones.

8.
The financial econometrics literature includes several multivariate GARCH models in which the model parameter matrices depend on a clustering of financial assets. These classes may be defined a priori or derived from the data. When the latter approach is followed, one method for deriving asset groups is the use of clustering methods. In this paper, we analyze in detail one of those clustering approaches, the Gaussian mixture GARCH. This method is designed to identify groups based on the conditional variance dynamic parameters. The clustering algorithm, based on a Gaussian mixture model, has been recently proposed and is here generalized with the introduction of a correction for the presence of correlation across assets. Finally, we introduce a benchmark estimator used to assess the performance of simpler and faster estimators. Simulation experiments show evidence of the improvements given by the correction for asset correlation.

9.
Most empirical investigations of the business cycles in the United States have excluded the dimension of asymmetric conditional volatility. This paper analyses the volatility dynamics of the US business cycle by comparing the performance of various multivariate generalised autoregressive conditional heteroskedasticity (GARCH) models. In particular, we propose two bivariate GARCH models to examine the evidence of volatility asymmetry and time-varying correlations concurrently, and then apply the proposed models to five sectors of Industrial Production of the United States. Our findings provide strong evidence of asymmetric conditional volatility in all sectors, and some support for time-varying correlations in various sectoral pairs. This has important policy implications for governments considering effective countercyclical measures during recessions.

10.
A sequential Monte Carlo method for estimating GARCH models subject to an unknown number of structural breaks is proposed. Particle filtering techniques allow for fast and efficient updates of posterior quantities and forecasts in real time. The method conveniently deals with the path dependence problem that arises in these types of models. The method is shown to work well on simulated data. Applied to daily NASDAQ returns, the evidence favors a partial structural break specification, in which only the intercept of the conditional variance equation has breaks, over the full structural break specification, in which all parameters are subject to change. The empirical application underscores the importance of model assumptions when investigating breaks. A model with normal return innovations results in strong evidence of breaks, while more flexible return distributions, such as t-innovations or a GARCH-jump mixture model, still favor breaks but indicate much more uncertainty regarding their timing and impact.
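One building block of any particle filter is the resampling step. The sketch below shows systematic resampling, a common low-variance scheme; the abstract does not specify which scheme the paper uses, so this is purely illustrative. The uniform offset `u0` is passed explicitly here to keep the example deterministic.

```python
def systematic_resampling(weights, u0):
    """Systematic resampling for a particle filter: a single uniform
    draw u0 in [0, 1/n) generates n evenly spaced pointers into the
    cumulative weight distribution (low variance, O(n))."""
    n = len(weights)
    total = sum(weights)
    cum, c = [], 0.0
    for w in weights:                 # normalized cumulative weights
        c += w / total
        cum.append(c)
    indices, j = [], 0
    for i in range(n):
        u = u0 + i / n                # evenly spaced pointers
        while cum[j] < u:
            j += 1
        indices.append(j)
    return indices

# The particle carrying 70% of the weight is replicated most often.
idx = systematic_resampling([0.1, 0.7, 0.1, 0.1], u0=0.05)
```

With pointers at 0.05, 0.30, 0.55 and 0.80 against cumulative weights [0.1, 0.8, 0.9, 1.0], particle 1 is selected three times.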

11.
Hotelling's T2 procedure is used to test the equality of means in two-group multivariate designs when covariances are homogeneous. A number of alternatives to T2, which are robust to covariance heterogeneity, have been proposed in the literature. However, all are sensitive to departures from multivariate normality. We demonstrate how to obtain multivariate tests that are robust to covariance heterogeneity and non-normality with estimators of location and scale based on trimming and Winsorizing. The performance of six alternatives to T2 was examined via Monte Carlo methods when characteristics of the research design, degree of covariance heterogeneity, and degree of non-normality were manipulated. We have recently developed a program written in the SAS/IML language that can be used to implement these robust multivariate tests. Recommendations are provided on the specific data-analytic conditions under which these tests should be adopted.
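The trimming and Winsorizing the abstract relies on can be shown in a univariate sketch (the paper's tests apply these ideas per margin in a multivariate design). The 20% trimming proportion below is a conventional choice, not necessarily the one the paper recommends.

```python
def trimmed_mean(data, prop=0.2):
    """Symmetric trimmed mean: drop the g = floor(prop * n) smallest
    and largest observations, then average the remainder."""
    x = sorted(data)
    g = int(prop * len(x))
    kept = x[g:len(x) - g]
    return sum(kept) / len(kept)

def winsorized(data, prop=0.2):
    """Winsorizing replaces the g extreme values on each side by the
    nearest retained value instead of discarding them, which is what
    the matching Winsorized variance estimator is built on."""
    x = sorted(data)
    g = int(prop * len(x))
    return [x[g]] * g + x[g:len(x) - g] + [x[len(x) - g - 1]] * g

sample = [1.0, 2.0, 3.0, 4.0, 100.0]   # one extreme observation
tm = trimmed_mean(sample)
w = winsorized(sample)
```

The extreme value 100.0 neither enters the trimmed mean nor survives Winsorizing, so both estimators stay near the bulk of the data.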

12.
Extreme value methods are widely used in financial applications such as risk analysis, forecasting and pricing models. One of the challenges with their application in finance is accounting for the temporal dependence between the observations, for example the stylised fact that financial time series exhibit volatility clustering. Various approaches have been proposed to capture this dependence. Commonly a two-stage approach is taken, where the volatility dependence is removed using a volatility model like a GARCH (or one of its many incarnations), followed by the application of standard extreme value models to the assumed independent residual innovations. This study examines an alternative one-stage approach, which makes parameter estimation and accounting for the associated uncertainties more straightforward than in the two-stage approach. The location and scale parameters of the extreme value distribution are defined to follow a conditional autoregressive heteroscedasticity process. Essentially, the model implements GARCH volatility via the extreme value model parameters. Bayesian inference, implemented via Markov chain Monte Carlo, is used so that all sources of uncertainty can be accounted for. The model is applied to both simulated and empirical data to demonstrate performance in extrapolating the extreme quantiles and quantifying the associated uncertainty.
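The first stage of the two-stage approach mentioned above is a standard volatility filter. A minimal GARCH(1,1) recursion, initialized at the unconditional variance, looks like this; the parameter values are illustrative only.

```python
def garch11_sigma2(returns, omega, alpha, beta):
    """GARCH(1,1) conditional variance recursion:
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}.
    Starts at the unconditional variance omega / (1 - alpha - beta);
    requires alpha + beta < 1 for that variance to exist."""
    s2 = [omega / (1.0 - alpha - beta)]
    for r in returns[:-1]:
        s2.append(omega + alpha * r * r + beta * s2[-1])
    return s2

r = [0.0, 2.0, 0.0, 0.0]                  # a single large shock at t = 1
s2 = garch11_sigma2(r, omega=0.1, alpha=0.1, beta=0.8)
```

The shock at t = 1 raises the conditional variance at t = 2 (0.1 + 0.1·4 + 0.8·0.9 = 1.22), after which it decays geometrically; the standardized residuals r_t / sqrt(sigma2_t) are what the second-stage extreme value model would be fitted to.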

13.
The Asymmetric Power GARCH (APGARCH) model allows a wider class of power transformations than simply taking the absolute value or squaring the data as in classical heteroscedastic models. A dynamic estimation is used to compare the three GARCH families and examine their forecasting performances in a value-at-risk setting. The results suggest that the optimal power transformation obtained with the APGARCH model is virtually never statistically different from 1 or 2. Moreover, although some indices switch between these two values over time, the measures of accuracy and efficiency used to assess the performance of VaR forecasts indicate that the additional flexibility brought by the APGARCH model provides little, if any, improvement for risk management.
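The APGARCH recursion operates on the power-transformed scale, sigma_t^delta = omega + alpha(|r| - gamma·r)^delta + beta·sigma^delta; delta = 2 with gamma = 0 recovers standard GARCH. The sketch below uses illustrative parameter values, with delta deliberately set between the 1 and 2 the abstract discusses.

```python
def apgarch_sigma(returns, omega, alpha, gamma, beta, delta, s0):
    """APGARCH conditional volatility recursion on the delta-power
    scale. gamma != 0 introduces the leverage asymmetry: negative
    returns raise volatility more than positive ones of equal size."""
    sd = [s0 ** delta]
    for r in returns[:-1]:
        shock = (abs(r) - gamma * r) ** delta
        sd.append(omega + alpha * shock + beta * sd[-1])
    return [s ** (1.0 / delta) for s in sd]   # back to sigma units

# Same shock magnitude, opposite signs: with gamma > 0 the negative
# return produces the larger next-period volatility.
neg = apgarch_sigma([-1.0, 0.0], 0.1, 0.1, 0.5, 0.8, 1.5, 1.0)
pos = apgarch_sigma([1.0, 0.0], 0.1, 0.1, 0.5, 0.8, 1.5, 1.0)
```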

14.
Multivariate GARCH models are in principle able to accommodate the features of dynamic conditional covariances; nonetheless, the interaction between the parametrization of the second conditional moment and the conditional density of asset returns adopted in estimation determines how well such models fit the observed dynamics of the data. Alternative MGARCH specifications and probability distributions are compared on the basis of forecasting performance by means of Monte Carlo simulations, using both statistical and financial forecasting loss functions.

15.
The multinomial probit model has emerged as a useful framework for modeling nominal categorical data, but extending such models to multivariate measures presents computational challenges. Following a Bayesian paradigm, we use a Markov chain Monte Carlo (MCMC) method to analyze multivariate nominal measures through multivariate multinomial probit models. As with a univariate version of the model, identification of model parameters requires restrictions on the covariance matrix of the latent variables that are introduced to define the probit specification. To sample the covariance matrix with restrictions within the MCMC procedure, we use a parameter-extended Metropolis-Hastings algorithm that incorporates artificial variance parameters to transform the problem into a set of simpler tasks including sampling an unrestricted covariance matrix. The parameter-extended algorithm also allows for flexible prior distributions on covariance matrices. The prior specification in the method described here generalizes earlier approaches to analyzing univariate nominal data, and the multivariate correlation structure in the method described here generalizes the autoregressive structure proposed in previous multiperiod multinomial probit models. Our methodology is illustrated through a simulated example and an application to a cancer-control study aiming to achieve early detection of breast cancer.
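The latent-variable construction behind the probit specification can be sketched as a data-generating step: draw correlated Gaussian utilities u = Xβ + Le and observe the argmax category. Here `L` is a Cholesky factor of the latent covariance; the identity factor below gives independent errors and is only a placeholder for the restricted correlated structure the paper samples.

```python
import random

def mnp_choice(xb, chol_rows, rng):
    """One multinomial probit draw: latent utilities u = xb + L @ e
    with e ~ N(0, I); the observed category is argmax(u)."""
    e = [rng.gauss(0.0, 1.0) for _ in xb]
    u = [xb[j] + sum(chol_rows[j][k] * e[k] for k in range(len(e)))
         for j in range(len(xb))]
    return max(range(len(u)), key=lambda j: u[j])

rng = random.Random(1)
# Identity Cholesky factor => independent latent errors; a
# non-diagonal factor would induce the correlation structure.
L = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
draws = [mnp_choice([2.0, 0.0, 0.0], L, rng) for _ in range(2000)]
share0 = draws.count(0) / len(draws)
```

With a mean-utility advantage of 2 for category 0, it is chosen in the large majority of draws.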

16.
Robust regression-based online filters for multivariate time series are proposed and their performance in real time signal extraction settings is discussed. The focus is on methods that can deal with time series exhibiting trends, level changes, outliers and a high level of noise as well as periods of a comparatively steady state. The new filter is based on a robust two-step online procedure, and it recognises that the data are often measured on a discrete scale. The relevant properties and the performance of this new filter are discussed and investigated by means of simulations and by a medical application.

17.
In this paper we use Markov chain Monte Carlo (MCMC) methods in order to estimate and compare GARCH models from a Bayesian perspective. We allow for possibly heavy tailed and asymmetric distributions in the error term. We use a general method proposed in the literature to introduce skewness into a continuous unimodal and symmetric distribution. For each model we compute an approximation to the marginal likelihood, based on the MCMC output. From these approximations we compute Bayes factors and posterior model probabilities.
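The final step, turning approximate marginal likelihoods into Bayes factors and posterior model probabilities, is simple arithmetic; the sketch below works in log units with the usual max-subtraction for numerical stability, and the values are made up for illustration.

```python
import math

def posterior_model_probs(log_ml, prior=None):
    """Posterior model probabilities from log marginal likelihoods,
    computed stably by subtracting the maximum before exponentiating.
    Equal prior model probabilities are assumed by default."""
    n = len(log_ml)
    prior = prior or [1.0 / n] * n
    m = max(log_ml)
    unnorm = [math.exp(l - m) * p for l, p in zip(log_ml, prior)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

log_ml = [-100.0, -102.0, -105.0]          # three competing GARCH models
log_bf_01 = log_ml[0] - log_ml[1]          # log Bayes factor, model 0 vs 1
probs = posterior_model_probs(log_ml)
```

Under equal priors, the posterior model probability is just the normalized marginal likelihood, so the model with the highest marginal likelihood dominates.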

18.
We develop a Bayesian approach for the selection of skew in multivariate skew t distributions constructed through hidden conditioning in the manners suggested by either Azzalini and Capitanio (2003) or Sahu et al. (2003). We show that the skew coefficients for each margin are the same for the standardized versions of both distributions. We introduce binary indicators to denote whether there is symmetry, or skew, in each dimension. We adopt a proper beta prior on each non-zero skew coefficient, and derive the corresponding prior on the skew parameters. In both distributions we show that as the degrees of freedom increases, the prior smoothly bounds the non-zero skew parameters away from zero and identifies the posterior. We estimate the model using Markov chain Monte Carlo (MCMC) methods by exploiting the conditionally Gaussian representation of the skew t distributions. This allows for the search through the posterior space of all possible combinations of skew and symmetry in each dimension. We show that the proposed method works well in a simulation setting, and employ it in two multivariate econometric examples. The first involves the modeling of foreign exchange rates and the second is a vector autoregression for intra-day electricity spot prices. The approach selects skew along the original coordinates of the data, which proves insightful in both examples.

19.
A new class of flexible threshold normal mixture GARCH models is proposed for the analysis and modelling of the stylized facts observed in many financial time series. A Bayesian stochastic method is developed and presented for the analysis of the proposed model, allowing for automatic model determination and estimation of the thresholds and their unknown number. A computationally feasible algorithm that explores the posterior distribution of the threshold models is designed using Markov chain Monte Carlo stochastic search methods. A simulation study is conducted to assess the performance of the proposed method, and an empirical application of the proposed model is illustrated using real data.

20.
Mixture cure models (MCMs) have been widely used to analyze survival data with a cure fraction. The MCMs postulate that a fraction of the patients are cured of the disease and that the failure time for the uncured patients follows a proper survival distribution, referred to as the latency distribution. The MCMs have been extended to bivariate survival data by modeling the marginal distributions. In this paper, the marginal MCM is extended to multivariate survival data. The new model is applicable to survival data with varying cluster sizes and interval censoring. The proposed model allows covariates to be incorporated into both the cure fraction and the latency distribution for the uncured patients. The primary interest is to estimate the marginal parameters in the mean structure, where the correlation structure is treated as nuisance parameters. The marginal parameters are estimated consistently by treating the observations within the cluster as independent. The variances of the parameters are estimated by the one-step jackknife method. The proposed method does not depend on the specification of the correlation structure. Simulation studies show that the new method works well when the marginal model is correct. The performance of the MCM is also examined when the clustered survival times share a common random effect. The MCM is applied to data from a smoking cessation study.
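The defining feature of a mixture cure model is the population survival function S(t) = π + (1 − π)·S_u(t), which plateaus at the cure fraction π instead of decaying to zero. The sketch below uses an exponential latency distribution purely for illustration; the paper's model lets covariates enter both π and S_u.

```python
import math

def mcm_survival(t, pi, scale):
    """Population survival under a mixture cure model:
    S(t) = pi + (1 - pi) * S_u(t), with pi the cured fraction and
    S_u(t) = exp(-t / scale) an illustrative exponential latency
    distribution for the uncured patients."""
    s_uncured = math.exp(-t / scale)
    return pi + (1.0 - pi) * s_uncured

# Survival plateaus at the cure fraction instead of dropping to zero.
s_early = mcm_survival(1.0, pi=0.3, scale=2.0)
s_late = mcm_survival(100.0, pi=0.3, scale=2.0)
```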
