Similar Documents
Found 20 similar documents (search time: 15 ms)
1.
We investigate learning dynamics in the formation of household inflation expectations in the six largest euro area countries. Our findings reveal heterogeneity in the learning rules that European households use to forecast inflation. We also find pronounced heterogeneity in the way consumers process new data. These differences vary not only across countries but also over time, suggesting that the learning behavior of households is state dependent.
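
One common way to formalize such household learning rules is constant-gain (adaptive) least squares, in which agents recursively update a perceived law of motion for inflation each period. The sketch below is purely illustrative and is not taken from the paper; the AR(1) perceived law of motion and the gain values are assumptions.

```python
import numpy as np

def constant_gain_learning(inflation, gain=0.02):
    """Constant-gain recursive least squares for a perceived AR(1) law of motion
    pi_t = c + rho * pi_{t-1}; returns the sequence of one-step-ahead expectations."""
    theta = np.zeros(2)              # beliefs [c, rho]
    R = np.eye(2)                    # moment matrix
    expectations = []
    for t in range(1, len(inflation)):
        x = np.array([1.0, inflation[t - 1]])      # regressors known at t-1
        expectations.append(x @ theta)             # expectation formed at t-1 for t
        # constant-gain updating: recent data get a fixed, non-vanishing weight
        R += gain * (np.outer(x, x) - R)
        theta += gain * np.linalg.solve(R, x) * (inflation[t] - x @ theta)
    return np.array(expectations)

# toy example with simulated inflation data
rng = np.random.default_rng(0)
pi = 2 + np.cumsum(rng.normal(0, 0.1, 200)) * 0.05 + rng.normal(0, 0.3, 200)
print(constant_gain_learning(pi, gain=0.05)[-5:])
```

Heterogeneity across households or countries would correspond to different gain values (or different perceived laws of motion) in this framework.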

2.
Economic behaviour as well as economic resources of individuals vary with age. Swedish time series show that the age structure contains information correlated to medium‐term trends in growth and inflation. GDP gaps estimated by age structure regressions are closely related to conventional measures. Monetary policy is believed to affect inflation with a lag of 1 or 2 years. Projections of the population's age structure are comparatively reliable several years ahead and provide additional information to improve on 3–5‐year‐ahead forecasts of potential GDP and inflation. Thus there is potential scope for using age‐structure‐based forecasts as an aid to monetary policy formation. Copyright © 2004 John Wiley & Sons, Ltd.

3.
This paper undertakes a comprehensive examination of 10 measures of core inflation and evaluates which measure produces the best forecast of headline inflation out‐of‐sample. We use the Personal Consumption Expenditure Price Index as our measure of inflation. We use two sets of components (17 and 50) of the Personal Consumption Expenditure Price Index to construct these core inflation measures and evaluate these measures at the three time horizons (6, 12 and 24 months) most relevant for monetary policy decisions. The best measure of core inflation for both sets of components and over all time horizons uses weights based on the first principal component of the disaggregated (component‐level) prices. Interestingly, the results vary by the number of components used; when more components are used, the weights based on the persistence of each component are statistically equivalent to the weights generated by the first principal component. However, those forecasts using the persistence of 50 components are statistically worse than those generated using the first principal component of 17 components. The statistical superiority of the principal component method is due to the fact that it extracts (in the first principal component) the common source of variation in the component level prices that accurately describes trend inflation over the next 6–24 months.
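
The principal-component weighting described above can be illustrated with a short sketch: extract the first principal component of the disaggregated inflation rates and use its loadings, rescaled to sum to one, as component weights. This is a generic illustration on simulated data, not the authors' exact procedure; the non-negativity sign convention is an assumption.

```python
import numpy as np

def pca_core_inflation(component_inflation):
    """First-principal-component core inflation.
    component_inflation: T x K array of component-level inflation rates."""
    X = component_inflation - component_inflation.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    loadings = eigvecs[:, -1]              # eigenvector with the largest eigenvalue
    loadings = np.abs(loadings)            # sign convention (assumption): non-negative weights
    weights = loadings / loadings.sum()    # rescale to sum to one
    return component_inflation @ weights, weights

# toy example: 120 months, 17 components sharing a common trend
rng = np.random.default_rng(1)
trend = np.cumsum(rng.normal(0, 0.1, 120))
components = trend[:, None] + rng.normal(0, 0.5, (120, 17))
core, w = pca_core_inflation(components)
print(w.round(3))
```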

4.
We use real‐time macroeconomic variables and combination forecasts with both time‐varying weights and equal weights to forecast inflation in the USA. The combination forecasts compare three sets of commonly used time‐varying coefficient autoregressive models: Gaussian distributed errors, errors with stochastic volatility, and errors with moving average stochastic volatility. Both point forecasts and density forecasts suggest that models combined by equal weights do not produce worse forecasts than those with time‐varying weights. We also find that variable selection, allowing for a time‐varying lag length, and the stochastic volatility specification significantly improve forecast performance over standard benchmarks. Finally, when compared with the Survey of Professional Forecasters, the results of the best combination model are found to be highly competitive during the 2007/08 financial crisis.
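
A minimal sketch of the two combination schemes contrasted above: equal weights versus time-varying weights based on recent forecast performance (here, inverse mean squared error over a rolling window). The individual model forecasts below are simulated placeholders, and the inverse-MSE rule is a simple stand-in for the paper's weighting scheme.

```python
import numpy as np

def combine_forecasts(forecasts, actuals, window=12):
    """forecasts: T x M matrix of model forecasts for the same target.
    Returns equal-weight and rolling inverse-MSE weighted combinations."""
    T, M = forecasts.shape
    equal = forecasts.mean(axis=1)
    weighted = np.full(T, np.nan)
    for t in range(window, T):
        errs = forecasts[t - window:t] - actuals[t - window:t, None]
        inv_mse = 1.0 / (errs ** 2).mean(axis=0)
        w = inv_mse / inv_mse.sum()          # time-varying weights used at date t
        weighted[t] = forecasts[t] @ w
    return equal, weighted

rng = np.random.default_rng(2)
actual = rng.normal(2, 1, 100)
models = actual[:, None] + rng.normal(0, [0.5, 1.0, 1.5], (100, 3))
eq, tv = combine_forecasts(models, actual)
mask = ~np.isnan(tv)                          # compare only where both schemes forecast
print(((eq[mask] - actual[mask]) ** 2).mean(), ((tv[mask] - actual[mask]) ** 2).mean())
```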

5.
This paper examines the problem of forecasting macro‐variables which are observed monthly (or quarterly) and result from geographical and sectoral aggregation. The aim is to formulate a methodology whereby all relevant information gathered in this context could provide more accurate forecasts, be frequently updated, and include a disaggregated explanation as useful information for decision‐making. The appropriate treatment of the resulting disaggregated data set requires vector modelling, which captures the long‐run restrictions between the different time series and the short‐term correlations existing between their stationary transformations. Frequently, due to a lack of degrees of freedom, the vector model must be restricted to a block‐diagonal vector model. This methodology is applied in this paper to inflation in the euro area, and shows that disaggregated models with cointegration restrictions improve accuracy in forecasting aggregate macro‐variables. Copyright © 2007 John Wiley & Sons, Ltd.

6.
Based on a vector error correction model we produce conditional euro area inflation forecasts. We use real‐time data on M3 and HICP, and include real GDP, the 3‐month EURIBOR and the 10‐year government bond yield as control variables. Real money growth and the term spread enter the system as stationary linear combinations. Missing and outlying values are substituted by model‐based estimates using all available data information. In general, the conditional inflation forecasts are consistent with the European Central Bank's assessment of liquidity conditions for future inflation prospects. The evaluation of inflation forecasts under different monetary scenarios reveals the importance of keeping track of the money growth rate, in particular at the end of 2005. Copyright © 2009 John Wiley & Sons, Ltd.
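
To give a flavour of the kind of system used here, the sketch below fits a small vector error correction model and produces unconditional multi-step forecasts with statsmodels. The variable set, lag order, cointegration rank and deterministic term are placeholder assumptions, and the sketch does not implement the paper's conditional-scenario or missing-data machinery.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM

# simulated stand-ins for (log) HICP, (log) M3 and real GDP, cointegrated by construction
rng = np.random.default_rng(3)
common = np.cumsum(rng.normal(0, 1, 200))
data = pd.DataFrame({
    "hicp": common + rng.normal(0, 0.5, 200),
    "m3":   common + rng.normal(0, 0.5, 200),
    "gdp":  common + rng.normal(0, 0.5, 200),
})

# one cointegration relation, one lag in differences, constant outside the relation
model = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="co")
res = model.fit()
print(res.predict(steps=8))   # 8-step-ahead point forecasts for all three variables
```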

7.
This paper investigates the relationship between forecast accuracy and effort, where effort is defined as the number of times the model used to generate forecasts is recursively estimated over the full sample period. More specifically, within a framework of costly effort, optimal effort strategies are derived under the assumption that the dynamics of the variable of interest follow an autoregressive‐type process. Results indicate that the strategies are fairly robust over a wide range of linear and nonlinear processes (including structural break processes), and deliver forecasts of transitory, core and total inflation that require less effort to generate and are as accurate as (that is, are insignificantly different from) those produced with maximum effort. Copyright © 2010 John Wiley & Sons, Ltd.
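
The accuracy-versus-effort trade-off studied here can be illustrated by comparing a forecaster who re-estimates an AR model every period with one who re-estimates only every k periods and reuses the stale coefficients in between. The AR(1) data-generating process and window settings below are illustrative assumptions, not the paper's design.

```python
import numpy as np

def ar1_ols(y):
    """OLS estimates (intercept, slope) of an AR(1) fitted to the series y."""
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    return np.linalg.lstsq(X, y[1:], rcond=None)[0]

def rolling_forecasts(y, start, reestimate_every):
    """One-step-ahead forecasts; coefficients refreshed every `reestimate_every` periods."""
    preds, coefs = [], None
    for t in range(start, len(y) - 1):
        if coefs is None or (t - start) % reestimate_every == 0:
            coefs = ar1_ols(y[:t + 1])         # effort: re-estimate on data up to t
        preds.append(coefs[0] + coefs[1] * y[t])
    return np.array(preds)

rng = np.random.default_rng(4)
y = np.zeros(400)
for t in range(1, 400):
    y[t] = 0.5 + 0.7 * y[t - 1] + rng.normal()

actual = y[201:]
for k in (1, 4, 12):   # maximum effort vs. progressively less frequent re-estimation
    mse = ((rolling_forecasts(y, 200, k) - actual) ** 2).mean()
    print(f"re-estimate every {k:2d} periods: MSE = {mse:.3f}")
```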

8.
Prior studies use a linear adaptive expectations model to describe how analysts revise their forecasts of future earnings in response to current forecast errors. However, research shows that extreme forecast errors are less likely than small forecast errors to persist in future years. If analysts recognize this property, their marginal forecast revisions should decrease with the forecast error's magnitude. Therefore, a linear model is likely to be unsatisfactory at describing analysts' forecast revisions. We find that a non‐linear model better describes the relation between analysts' forecast revisions and their forecast errors, and provides a richer theoretical framework for explaining analysts' forecasting behaviour. Our results are consistent with analysts' recognizing the permanent and temporary nature of forecast errors of differing magnitudes. Copyright © 2000 John Wiley & Sons, Ltd.
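
One simple way to capture the idea that marginal revisions shrink as forecast errors grow is a bounded response such as revision = a·tanh(b·error), compared against a linear fit. The tanh functional form and the simulated data are illustrative assumptions, not the specification estimated in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(5)
errors = rng.normal(0, 2, 500)                          # current forecast errors
# simulated revisions: damped response to extreme errors, plus noise
revisions = 1.5 * np.tanh(0.6 * errors) + rng.normal(0, 0.3, 500)

# linear adaptive-expectations-style fit
slope, intercept = np.polyfit(errors, revisions, 1)
linear_pred = intercept + slope * errors

# bounded nonlinear fit
nonlinear = lambda e, a, b: a * np.tanh(b * e)
(a, b), _ = curve_fit(nonlinear, errors, revisions, p0=[1.0, 0.5])
nonlinear_pred = nonlinear(errors, a, b)

print("linear SSE:   ", ((revisions - linear_pred) ** 2).sum().round(1))
print("nonlinear SSE:", ((revisions - nonlinear_pred) ** 2).sum().round(1))
```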

9.
We develop a semi‐structural model for forecasting inflation in the UK in which the New Keynesian Phillips curve (NKPC) is augmented with a time series model for marginal cost. By combining structural and time series elements we hope to reap the benefits of both approaches, namely the relatively better forecasting performance of time series models in the short run and a theory‐consistent economic interpretation of the forecast coming from the structural model. In our model we consider the hybrid version of the NKPC and use an open‐economy measure of marginal cost. The results suggest that our semi‐structural model performs better than a random‐walk forecast and most of the competing models (conventional time series models and strictly structural models) only in the short run (one quarter ahead) but it is outperformed by some of the competing models at medium and long forecast horizons (four and eight quarters ahead). In addition, the open‐economy specification of our semi‐structural model delivers more accurate forecasts than its closed‐economy alternative at all horizons. Copyright © 2014 John Wiley & Sons, Ltd.
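
For reference, the hybrid New Keynesian Phillips curve has the standard textbook form below, with inflation driven by its own lag, expected future inflation and real marginal cost; this is a generic rendering, not necessarily the exact specification estimated in the paper, where marginal cost is an open-economy measure modelled with its own time series process.

```latex
\pi_t = \gamma_b \, \pi_{t-1} + \gamma_f \, E_t[\pi_{t+1}] + \lambda \, \widehat{mc}_t + \varepsilon_t
```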

10.
In the light of the still topical nature of ‘bananas and petrol’ being blamed for driving much of the inflationary pressures in Australia in recent times, the ‘headline’ and ‘underlying’ rates of inflation are scrutinised in terms of forecasting accuracy. A general structural time‐series modelling strategy is applied to estimate models for alternative types of Consumer Price Index (CPI) measures. From this, out‐of‐sample forecasts are generated from the various models. The underlying forecasts are subsequently adjusted to facilitate comparison. The Ashley, Granger and Schmalensee (1980) test is then performed to determine whether there is a statistically significant difference between the root mean square errors of the models. The results lend weight to the recent findings of Song (2005) that forecasting models using underlying rates are not systematically inferior to those based on the headline rate. In fact, strong evidence is found that underlying measures produce superior forecasts. Copyright © 2009 John Wiley & Sons, Ltd.

11.
This paper presents a comparative analysis of the sources of error in forecasts for the UK economy published over a recent four-year period by four independent groups. This analysis rests on the archiving at the ESRC Macroeconomic Modelling Bureau of the original forecasts together with all their accompanying assumptions and adjustments. A method of decomposing observed forecast errors so as to distinguish the contributions of forecaster and model is set out; the impact of future expectations treated in a ‘model-consistent’ or ‘rational’ manner is specifically considered. The results show that the forecaster's adjustments make a substantial contribution to forecast performance, a good part of which comes from adjustments that bring the model on track at the start of the forecast period. The published ex-ante forecasts are usually superior to pure model-based ex-post forecasts, whose performance indicates some misspecification of the underlying models.

12.
We present a mixed‐frequency model for daily forecasts of euro area inflation. The model combines a monthly index of core inflation with daily data from financial markets; estimates are carried out with the MIDAS regression approach. The forecasting ability of the model in real time is compared with that of standard VARs and of daily quotes of economic derivatives on euro area inflation. We find that the inclusion of daily variables helps to reduce forecast errors with respect to models that consider only monthly variables. The mixed‐frequency model also displays superior predictive performance with respect to forecasts solely based on economic derivatives. Copyright © 2012 John Wiley & Sons, Ltd.
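
The mixed-frequency idea can be sketched as a MIDAS regression in which a month's daily financial observations are aggregated with exponential Almon weights before entering a monthly inflation equation. The weighting scheme, lag length and simulated data below are illustrative assumptions; the paper's actual model and estimation are richer than this.

```python
import numpy as np
from scipy.optimize import minimize

def exp_almon(theta1, theta2, n_lags):
    """Exponential Almon weights over n_lags daily lags (normalized to sum to one)."""
    j = np.arange(1, n_lags + 1)
    w = np.exp(theta1 * j + theta2 * j ** 2)
    return w / w.sum()

def midas_sse(params, y, daily_lags):
    """Sum of squared errors of y_t = b0 + b1 * sum_j w_j(theta) * x_{t,j}."""
    b0, b1, t1, t2 = params
    agg = daily_lags @ exp_almon(t1, t2, daily_lags.shape[1])
    return ((y - b0 - b1 * agg) ** 2).sum()

# simulated data: 120 months of inflation and 22 daily indicator lags per month
rng = np.random.default_rng(6)
daily = rng.normal(0, 1, (120, 22))
true_w = exp_almon(0.1, -0.02, 22)
y = 2.0 + 0.8 * (daily @ true_w) + rng.normal(0, 0.2, 120)

res = minimize(midas_sse, x0=[0.0, 0.5, 0.0, -0.01], args=(y, daily), method="Nelder-Mead")
print(res.x)   # estimated intercept, slope and Almon parameters
```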

13.
This paper utilizes for the first time age‐structured human capital data for economic growth forecasting. We concentrate on pooled cross‐country data of 65 countries over six 5‐year periods (1970–2000) and consider specifications chosen by model selection criteria, Bayesian model averaging methodologies based on in‐sample and out‐of‐sample goodness of fit and on adaptive regression by mixing. The results indicate that forecast averaging and exploiting the demographic dimension of education data improve economic growth forecasts systematically. In particular, the results are very promising for improving economic growth predictions in developing countries. Copyright © 2009 John Wiley & Sons, Ltd.
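
A common way to operationalize this kind of model averaging is to weight candidate growth regressions by exp(-BIC/2), which approximates posterior model probabilities. The candidate regressor sets and simulated panel below are purely illustrative and are not the paper's data or exact averaging scheme.

```python
import numpy as np
import statsmodels.api as sm
from itertools import combinations

rng = np.random.default_rng(12)
n = 65 * 6                                        # country-period observations
X_full = rng.normal(size=(n, 4))                  # stand-ins for age-structured education shares, etc.
growth = 1 + 0.5 * X_full[:, 0] + 0.3 * X_full[:, 1] + rng.normal(0, 1, n)

models, bics, preds = [], [], []
for k in range(1, 5):
    for cols in combinations(range(4), k):        # every non-empty subset of regressors
        X = sm.add_constant(X_full[:, cols])
        res = sm.OLS(growth, X).fit()
        models.append(cols)
        bics.append(res.bic)
        preds.append(res.fittedvalues)

bics = np.array(bics)
weights = np.exp(-0.5 * (bics - bics.min()))      # BIC-based approximate posterior weights
weights /= weights.sum()
bma_fit = np.average(np.column_stack(preds), axis=1, weights=weights)
print(models[int(weights.argmax())], weights.max().round(3))
```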

14.
Initial applications of prediction markets (PMs) indicate that they provide good forecasting instruments in many settings, such as elections, the box office, or product sales. One particular characteristic of these ‘first‐generation’ (G1) PMs is that they link the payoff value of a stock's share to the outcome of an event. Recently, ‘second‐generation’ (G2) PMs have introduced alternative mechanisms to determine payoff values which allow them to be used as preference markets for determining preferences for product concepts or as idea markets for generating and evaluating new product ideas. Three different G2 payoff mechanisms appear in the existing literature, but they have never been compared. This study conceptually and empirically compares the forecasting accuracy of the three G2 payoff mechanisms and investigates their influence on participants' trading behavior. We find that G2 payoff mechanisms perform almost as well as their G1 counterpart, and trading behavior is very similar in both markets (i.e. trading prices and trading volume), except during the very last trading hours of the market. These results indicate that G2 PMs are valid instruments and support their applicability shown in previous studies for developing new product ideas or evaluating new product concepts. Copyright © 2011 John Wiley & Sons, Ltd.

15.
This paper assesses the informational content of alternative realized volatility estimators, daily range and implied volatility in multi‐period out‐of‐sample Value‐at‐Risk (VaR) predictions. We use the recently proposed Realized GARCH model combined with the skewed Student's t distribution for the innovations process and a Monte Carlo simulation approach in order to produce the multi‐period VaR estimates. Our empirical findings, based on the S&P 500 stock index, indicate that almost all realized and implied volatility measures can produce statistically and regulatory precise VaR forecasts across forecasting horizons, with the implied volatility being especially accurate in monthly VaR forecasts. The daily range produces inferior forecasting results in terms of regulatory accuracy and Basel II compliance. However, robust realized volatility measures, which are immune against microstructure noise bias or price jumps, generate superior VaR estimates in terms of capital efficiency, as they minimize the opportunity cost of capital and the Basel II regulatory capital. Copyright © 2013 John Wiley & Sons, Ltd.
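
The multi-period Monte Carlo VaR construction can be sketched by hand: simulate many h-day return paths from a volatility recursion with heavy-tailed shocks and read the VaR off the loss quantile of cumulative returns. The sketch below uses a plain GARCH(1,1) with symmetric Student-t innovations and made-up parameter values as a simplified stand-in for the paper's Realized GARCH with a skewed Student-t distribution.

```python
import numpy as np

def mc_var(omega, alpha, beta, nu, last_sigma2, last_eps, horizon=10,
           n_sims=100_000, level=0.99, seed=7):
    """Monte Carlo h-period Value-at-Risk from a GARCH(1,1) with Student-t shocks."""
    rng = np.random.default_rng(seed)
    sigma2 = np.full(n_sims, omega + alpha * last_eps ** 2 + beta * last_sigma2)
    cum_ret = np.zeros(n_sims)
    scale = np.sqrt((nu - 2) / nu)            # rescale t draws to unit variance
    for _ in range(horizon):
        z = rng.standard_t(nu, n_sims) * scale
        eps = np.sqrt(sigma2) * z              # simulated daily return
        cum_ret += eps
        sigma2 = omega + alpha * eps ** 2 + beta * sigma2   # GARCH(1,1) variance update
    return -np.quantile(cum_ret, 1 - level)    # VaR reported as a positive loss

# illustrative parameter values (daily returns in percent)
print(mc_var(omega=0.02, alpha=0.08, beta=0.90, nu=7,
             last_sigma2=1.0, last_eps=-1.5, horizon=10))
```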

16.
This paper examines the information content of implied volatility for crude oil options as it relates to future realized volatility. Using data for the period 1996 to 2011 we find that implied volatility is an effective predictor of the month‐ahead realized volatility. We show that implied volatility subsumes the information content of contemporaneous volatility, and it contains incremental information on future volatility after controlling for contemporaneous volatility. Furthermore, incorporating risk‐neutral skewness, and especially kurtosis, improves the forecasting of realized volatility. Overall, the association between implied volatility and month‐ahead realized volatility is consistent with evidence documented for other asset classes, leading us to conclude that implied volatility serves as a reasonable proxy for expected volatility. Copyright © 2015 John Wiley & Sons, Ltd.
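
The encompassing-style evidence described above boils down to regressing month-ahead realized volatility on current implied volatility and current realized volatility and seeing which coefficient survives. A minimal sketch with simulated series (not the paper's crude oil data) and statsmodels OLS:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
T = 180                                               # months
expected_vol = 15 + 0.3 * np.cumsum(rng.normal(0, 0.5, T))
iv = expected_vol + rng.normal(0, 1.0, T)             # implied volatility, forward looking
rv = np.empty(T)
rv[0] = expected_vol[0]
rv[1:] = expected_vol[:-1] + rng.normal(0, 2.0, T - 1)  # realized volatility one month later

# regress month-ahead RV on current IV and current RV
y = rv[1:]
X = sm.add_constant(np.column_stack([iv[:-1], rv[:-1]]))
res = sm.OLS(y, X).fit()
print(res.params)   # [const, beta_IV, beta_RV]; a large beta_IV and small beta_RV
                    # mirror the "IV subsumes contemporaneous volatility" finding
```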

17.
This paper concerns long‐term forecasts for cointegrated processes. First, it considers the case where the parameters of the model are known. The paper analytically shows that neither the cointegration nor the integration constraint matters in long‐term forecasts. This is an alternative implication of long‐term forecasts for cointegrated processes, extending the results of previous influential studies. An appropriate Monte Carlo experiment supports our analytical result. Secondly, and more importantly, it considers the case where the parameters of the model are estimated. The paper shows that the accuracy of the estimation of the drift term is crucial in long‐term forecasts. Namely, the relative accuracy of various long‐term forecasts depends upon the relative magnitude of the variances of the estimators of the drift term. It further shows experimentally that in finite samples the univariate ARIMA forecast, whose drift term is estimated by the simple time average of differenced data, is better than the cointegrated system forecast, whose parameters are estimated by the well‐known Johansen ML method. Based upon finite‐sample experiments, it recommends the univariate ARIMA forecast rather than the conventional cointegrated system forecast for its practical usefulness and robustness against model misspecification. Copyright © 2011 John Wiley & Sons, Ltd.
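
The role of the drift estimate can be seen in a back-of-the-envelope way: for a unit-root series with drift, the h-step-ahead point forecast is essentially the last observation plus h times the estimated drift, so the drift estimator's variance dominates forecast error at long horizons. The sketch below covers only the univariate ARIMA-with-drift side on simulated data; the Johansen system comparison is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(9)
mu, T, h = 0.1, 200, 40                    # true drift, estimation sample, forecast horizon
y = np.cumsum(mu + rng.normal(0, 1, T + h))

sample, future = y[:T], y[T:]
drift_hat = np.diff(sample).mean()          # drift estimated as the time average of differences
steps = np.arange(1, h + 1)
forecast = sample[-1] + steps * drift_hat   # ARIMA(0,1,0)-with-drift long-horizon forecast

rmse = np.sqrt(((forecast - future) ** 2).mean())
print(f"estimated drift {drift_hat:.3f} (true {mu}), long-horizon RMSE {rmse:.2f}")
```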

18.
Leibniz’s and Whitehead’s analyses of motion are at the heart of their metaphysical schemes. These schemes are to be considered as two blueprints of a similar metaphysical intuition that emerged during two breakthrough eras, that is, the 17th century and the beginning of the 20th century, and retained the Aristotelian idea that existence requires an active principle. The two philosophers’ attempts to elucidate this idea in the context of their analyses of motion still interact with central, longstanding questions in philosophy, in particular that concerning the ontological status of change. For both thinkers, the phenomenon of motion is an example par excellence of the metaphysically fundamental principle of action that is required for change in the world. I focus on Leibniz’s and Whitehead’s similar understanding of the concept of transition that is inserted as an essential constitutive component of motion and ensures its status as something real.

19.
This paper compares the in‐sample fitting and the out‐of‐sample forecasting performances of four distinct Nelson–Siegel class models: Nelson–Siegel, Bliss, Svensson, and a five‐factor model we propose in order to enhance the fitting flexibility. The introduction of the fifth factor resulted in superior adjustment to the data. For the forecasting exercise the paper contrasts the performances of the term structure models in association with the following econometric methods: quantile autoregression evaluated at the median, VAR, AR, and a random walk. As a pattern, the quantile procedure delivered the best results for longer forecasting horizons. Copyright © 2011 John Wiley & Sons, Ltd.
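
For reference, the baseline three-factor Nelson–Siegel curve that these extensions build on can be fitted to a cross-section of yields as below; the Bliss, Svensson and proposed five-factor variants add further slope and curvature terms and are not reproduced here. The maturities and yields are made-up illustrative numbers.

```python
import numpy as np
from scipy.optimize import curve_fit

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Three-factor Nelson-Siegel yield curve: level, slope and curvature."""
    x = (1 - np.exp(-lam * tau)) / (lam * tau)
    return beta0 + beta1 * x + beta2 * (x - np.exp(-lam * tau))

# illustrative cross-section of zero-coupon yields (maturities in years, yields in %)
maturities = np.array([0.25, 0.5, 1, 2, 3, 5, 7, 10])
yields = np.array([4.2, 4.4, 4.7, 5.1, 5.3, 5.6, 5.7, 5.8])

params, _ = curve_fit(nelson_siegel, maturities, yields,
                      p0=[5.5, -1.5, 1.0, 0.5], maxfev=10_000)
print(dict(zip(["beta0", "beta1", "beta2", "lambda"], params.round(3))))
```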

20.
Forecasting prices in electricity markets is a crucial activity for both risk management and asset optimization. Intra‐day power prices have a fine structure and are driven by an interaction of fundamental, behavioural and stochastic factors. Furthermore, there are reasons to expect the functional forms of price formation to be nonlinear in these factors and therefore specifying forecasting models that perform well out‐of‐sample is methodologically challenging. Markov regime switching has been widely advocated to capture some aspects of the nonlinearity, but it may suffer from overfitting and unobservability in the underlying states. In this paper we compare several extensions and alternative regime‐switching formulations, including logistic specifications of the underlying states, logistic smooth transition and finite mixture regression. The finite mixture approach to regime switching performs well in an extensive, out‐of‐sample forecasting comparison. Copyright © 2014 John Wiley & Sons, Ltd.
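
The finite mixture regression favoured in this comparison can be sketched as a two-component mixture of linear price equations estimated by EM: each observation gets a posterior probability of belonging to a "normal" or "spike" regime, and the component regressions are re-fitted with those probabilities as weights. Everything below (two components, a single fundamental regressor, simulated prices) is an illustrative assumption rather than the paper's specification.

```python
import numpy as np
from scipy.stats import norm

def mixture_regression(X, y, n_iter=200, seed=10):
    """Two-component finite mixture of linear regressions fitted by EM."""
    rng = np.random.default_rng(seed)
    n = len(y)
    resp = rng.uniform(size=n)                    # random initial responsibilities
    resp = np.column_stack([resp, 1 - resp])      # (initialization matters in practice)
    for _ in range(n_iter):
        # M-step: weighted least squares and residual scale per component
        betas, sigmas, pis = [], [], resp.mean(axis=0)
        for k in range(2):
            w = resp[:, k]
            W = np.sqrt(w)[:, None]
            beta = np.linalg.lstsq(X * W, y * W[:, 0], rcond=None)[0]
            resid = y - X @ beta
            sigmas.append(np.sqrt((w * resid ** 2).sum() / w.sum()))
            betas.append(beta)
        # E-step: posterior probability of each component given the data
        dens = np.column_stack([pis[k] * norm.pdf(y, X @ betas[k], sigmas[k]) for k in range(2)])
        resp = dens / dens.sum(axis=1, keepdims=True)
    return betas, sigmas, pis

# simulated intra-day prices: a normal regime and a spike regime driven by demand
rng = np.random.default_rng(11)
demand = rng.uniform(20, 80, 500)
spike = rng.uniform(size=500) < 0.2
price = np.where(spike, 10 + 2.5 * demand, 5 + 0.8 * demand) + rng.normal(0, 3, 500)
X = np.column_stack([np.ones(500), demand])
betas, sigmas, pis = mixture_regression(X, price)
print([b.round(2) for b in betas], np.round(pis, 2))
```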
