Robert,
thank you for the rapid answer. I am fully aware of the optimization issues regarding model congruency, about which I will have to make some simplifying assumptions for this "exercise". Serial correlation should be handled, though.
Seasonality can be controlled by centered seasonal dummies.
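For concreteness, a minimal sketch of how such centered dummies could be built and passed to -vec- in Stata; the quarterly frequency, the variables y1 and y2, and the running index time = _n (as in the code quoted further down) are assumptions for illustration only:

generate time = _n
tsset time
forvalues q = 1/3 {
    * 3/4 in quarter q and -1/4 otherwise, so each dummy is centered
    generate csd`q' = (mod(time - 1, 4) + 1 == `q') - 1/4
}
* -vec-'s sindicators() option expects centered dummies of this kind
vec y1 y2, rank(1) lags(2) sindicators(csd1 csd2 csd3)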
A vecm (reduced-rank VAR) is expressed in first differences (stationary if the levels are I(1)) plus a stationary linear combination of lagged levels, so all terms in the vecm are stationary. Of course, if the variables are not cointegrated, then a VAR in first differences is appropriate. The variables in question here are cointegrated (rank tests on the entire sample and on subsamples). Also, eigenvalue stability tests show that the largest eigenvalue is fairly stable, justifying the use of only one cointegrating vector in the vecm (see also below).
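In symbols, this is the usual reduced-rank representation (D_t collects the centered seasonal dummies mentioned above, and \mu is the unrestricted constant discussed further down):

\Delta y_t = \alpha\beta' y_{t-1} + \sum_{i=1}^{p-1} \Gamma_i \Delta y_{t-i} + \mu + \Phi D_t + \varepsilon_t

where \Delta y_t and \beta' y_{t-1} are both stationary when the levels are I(1) and cointegrated.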
Structural breaks: Chow tests indicate that they are absent for this system (and the variables react fairly similarly to stochastic shocks and share the same deterministic trend).
Granger causality: not recommended in its usual form for vars and vecms (see Phillips, 1991).
Exogeneity: long-run weak exogeneity is captured by the alpha (adjustment) coefficient, meaning that the variable is determined outside the system but still belongs to the cointegrating equation. A very small (significant) value of alpha yields a "practically" weakly exogenous variable, in that the adjustment process is very slow. One could also implement a procedure that sets alpha to zero if its absolute t-value is below a certain level, say 1.5. Some variables are obviously exogenous, such as a US interest rate in combination with Norwegian data.
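A sketch of how such an alpha restriction could be imposed in Stata, with y1 y2 y3 as placeholder variables and y2 as the candidate weakly exogenous one; the [D_y2]L._ce1 coefficient name follows -vec-'s equation naming and should be checked against the actual output:

vec y1 y2 y3, rank(1) lags(2)
* if the adjustment coefficient of y2 has |t| < 1.5, re-estimate
* with that alpha restricted to zero
constraint define 1 [D_y2]L._ce1 = 0
vec y1 y2 y3, rank(1) lags(2) aconstraints(1)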
IRFs are not an issue here; only forecasting from a reduced-form model is of interest, and structural models will not be considered. The forecasts are compared to single-equation models with different restrictions to see if the system approach adds quality to the forecasts.
As a start, I intend to estimate vecms with a fixed number of lags and cointegrating vectors, disregarding rank tests for each subsample. One argument for doing so can be found in Johansen and Juselius: the vector with the largest eigenvalue is the most useful. Also, information criteria are very often ambiguous regarding the "optimal" lag length, so in practice one would apply general-to-specific modelling and choose the most parsimonious model (the smallest number of lags that yields non-serially correlated errors). As for trends, quadratic trends are highly unlikely in economic data in general, but there is often a deterministic trend in the levels (and in the cointegrating equation). However, if the variables in question share the same deterministic trend, it cancels in the cointegrating relation (see Lütkepohl). Hence, trending behaviour is captured by an unrestricted constant in the vecm.
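In Stata terms, the lag search and the unrestricted-constant specification might look like the sketch below (y1 and y2 are placeholder variables; -varsoc- reports the FPE, AIC, HQIC and SBIC, and -veclmar- tests the vecm residuals for autocorrelation):

varsoc y1 y2, maxlag(8)
* general to specific: start long and shorten the lag length as long
* as the LM test shows no residual autocorrelation
vec y1 y2, rank(1) lags(4) trend(constant)   // unrestricted constant
veclmar, mlag(4)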
Therefore, I was looking for a procedure that could perform the estimations and predictions under the assumptions mentioned above. A comparison of models with different lag lengths could then be undertaken, to see which information criterion (e.g. AIC, SC, FPE, HQ) comes closest to the model with the lowest mean squared forecast error. Suggestions?
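One possible adaptation of the -rolling- setup from the original post further down, with the rank and lag length fixed as described; y1 and y2 are placeholder variable names and the whole thing is an untested sketch rather than a finished procedure:

capture program drop rvecfcast
program rvecfcast, rclass
    syntax [if]
    * fixed rank and lag length, unrestricted constant, as discussed above
    vec y1 y2 `if', rank(1) lags(2) trend(constant)
    summarize time if e(sample), meanonly
    local last = r(max)
    * one-step-ahead forecast for the period just after the estimation sample
    capture drop f_*
    fcast compute f_, step(1)
    return scalar forecast = f_y1[`last' + 1]
    return scalar actual   = y1[`last' + 1]
end

rolling actual=r(actual) forecast=r(forecast), recursive ///
    window(149) saving(vecrolling, replace): rvecfcast

use vecrolling, clear
generate sqerr = (actual - forecast)^2
summarize sqerr    // the mean is the mean squared forecast error

Repeating this for each candidate lag length and tabulating the mean of sqerr would then give the MSFE ranking to set against the AIC/SC/FPE/HQ choices from -varsoc-.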
>>> [email protected] 08.12.2006 15:58 >>>
Svein,
Automation is a tall order here. You first have to transform your
variables to attain stationarity. Logging may work if you have variance
instability, but then the errors become multiplicative. You will have
to look for seasonality and possibly deseasonalize your variables to
attain stationarity, unless you use deterministic seasonal dummies
later. Then there is the matter of graphing the series and looking for
structural breaks. Then there is the matter of running the stationarity
tests and transforming the series to stationarity. Regraphing follows.
Then there is the issue of lag determination. Granger causality tests
should be run to ascertain whether any variables are exogenous. You
might want to run a trace test to determine the number of cointegrating
vectors. Modeling the error correction mechanism might be necessary at
this juncture. You might have to identify the VECM form, determining
whether deterministic terms--such as drift, linear or even quadratic
terms--might be in order. There may be parameter restrictions required.
The process would have to be iterated till the model is optimized. Then
there is the structural VAR to run with the impulse response functions.
Once all this is done, you might want to forecast, plotting the
forecast error variance decomposition. Automation of this process
beyond what Stata has done in the VAR and VECM procedures is daunting.
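For orientation only, a rough mapping of those steps onto Stata commands (y1 and y2 are placeholder variable names, the lag and rank choices are arbitrary, and this is a sketch rather than a complete recipe):

dfuller y1, trend                 // unit-root test on the level; repeat on D.y1
vecrank y1 y2, lags(4)            // Johansen trace test for the cointegration rank
var y1 y2, lags(1/4)              // unrestricted VAR if there is no cointegration
vargranger                        // Granger causality Wald tests after -var-
vec y1 y2, rank(1) lags(4)        // VECM with the chosen rank and lag length
irf create base, set(myirfs, replace)   // impulse responses and FEVDs
irf table fevd
fcast compute f_, step(8)         // dynamic forecasts
fcast graph f_y1 f_y2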
Good luck,
Robert
Robert A. Yaffee, Ph.D.
Research Professor
Shirley M. Ehrenkranz
School of Social Work
New York University
home address:
Apt 19-W
2100 Linwood Ave.
Fort Lee, NJ
07024-3171
Phone: 201-242-3824
Fax: 201-242-3825
[email protected]
----- Original Message -----
From: "Svein.Oskar Lauvsnes" <[email protected]>
Date: Friday, December 8, 2006 9:36 am
Subject: SV: Re: st: Time series: VECM with recursive window forecasts
> Robert,
> thanks for the rapid answer. I agree that in general arima models
> (perhaps applying general to specific modeling) would be more
> suitable than ols for time series in order to get a "congruent
> model", i.e. no serial correlation or heteroscedasticity and
> normality in the residuals. However, in this case I first intend to
> compare a simple AR(1) model (close to a pure random walk) with an
> extended model (also single equation) including some macrovariables
> regarding predictive abilities. For this purpose I might as well use
> ols, regressing the change in the log of the dependent variable on
> its 1st lag instead of formulating an AR(1), which would be
> equivalent. I also intend to compare my results with those of
> Rapach et al. (2005).
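> As a minimal illustration of that equivalence (ldose is a placeholder
> for the log of the dependent variable, not the actual name): the ols
> regression of the differenced series on its own first lag is the
> conditional least squares fit of an AR(1), while -arima- uses full
> maximum likelihood, so the two sets of estimates will be close rather
> than identical.
>
> regress D.ldose LD.ldose
> arima D.ldose, ar(1)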
>
> The second step in this exercise is to estimate a vecm system and
> again compare predictive abilities (see e.g. McRae et al., 2002). It
> is in this step that I need some help to automate the estimation
> and forecasting process. Here too, congruency is not considered; I
> intend to estimate a vecm with a fixed number of lags and
> cointegrating vectors for each estimation. Of course, I will check
> subsamples to see if they differ greatly regarding these
> assumptions. When estimating the vecm on the entire sample, an
> eigenvalue stability test shows that the eigenvalues are fairly
> stable throughout. Also, there are arguments for using only the
> cointegrating vector with the largest eigenvalue (see Johansen and
> Juselius).
>
> So, comparing predictive ability by increasing the informational
> content in a parsimonious model is the main topic. What do you
> think about this? Any programming suggestions would be great.
>
> Regards,
>
> Svein.
>
> >>> [email protected] 08.12.2006 14:57 >>>
> Svein,
> Should you not consider using tssmooth exponential, arima, or
> prais rather than ols reg, unless you have a theoretical reason for
> showing the defects of not controlling for autocorrelation in the
> series?
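> A rough sketch of how those alternatives might be called (y and x
> are placeholder variable names): the first smooths and forecasts the
> series, while -arima- and -prais- model the autocorrelation that
> plain ols leaves in the errors.
>
> tssmooth exponential smy = y, forecast(1)
> arima y, arima(1,1,0)
> prais y x, corc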
> Regards,
> Robert
>
>
> Robert A. Yaffee, Ph.D.
> Research Professor
> Shirley M. Ehrenkranz
> School of Social Work
> New York University
>
> home address:
> Apt 19-W
> 2100 Linwood Ave.
> Fort Lee, NJ
> 07024-3171
> Phone: 201-242-3824
> Fax: 201-242-3825
> [email protected]
>
> ----- Original Message -----
> From: "Svein.Oskar Lauvsnes" <[email protected]>
> Date: Friday, December 8, 2006 3:03 am
> Subject: st: Time series: VECM with recursive window forecasts
>
> > Hi,
> > I am investigating the predictive abilities of macrovariables on
> > stock market returns. So far I have made 1-step ahead predictions
> > from single equation models, keeping the starting point fixed and
> > for each new regression extending the dataset by one observation.
> > I would like to compare the single equation forecasts with
> > forecasts from a system of equations such as a vector error
> > correction model and a VAR. I have used the following program for
> > my forecasts:
> >
> > gen time = _n
> > tsset time
> >
> >
> >
> > capture program drop rforecast
> > program rforecast, rclass
> > syntax [if]
> > regress dose l.dose dnib `if'
> > summ time if e(sample)
> > local last = r(max)
> > local fcast = _b[_cons] + _b[L.dose]*dose[`last'] ///
> > + _b[dnib]*dnib[`last'+1]
> >
> > return scalar forecast = `fcast'
> > return scalar actual = dose[`last' +1]
> > end
> >
> > rolling actual=r(actual) forecast=r(forecast), recursive ///
> > window(149) saving(myrolling, replace): rforecast
> >
> > use myrolling, clear
> > list in 1/100
> >
> > Hopefully, the program will work on a VECM by substituting the
> > regression and forecast lines above. How should I modify my
> > program to do rolling window estimation/forecasting using a VECM?
> > I suppose the number of cointegrating vectors and lags would have
> > to be fixed.
> >
> > Sincerely
> >
> > Svein Lauvsnes
> > Bodoe Graduate School of Business, Norway
> >
*
* For searches and help try:
* http://www.stata.com/support/faqs/res/findit.html
* http://www.stata.com/support/statalist/faq
* http://www.ats.ucla.edu/stat/stata/