From: Stefano Lombardi <ste85@fastwebnet.it>
To: statalist@hsphsun2.harvard.edu
Subject: st: Perfect fit, perfect collinearity and small OLS standard errors
Date: Sun, 27 Mar 2011 23:07:13 +0200
Dear all,

First of all, please do not blame the model I am going to introduce: I would never have fitted a model like this myself (it is a famous "case study" on which I have to write a report).
I have just estimated a (log-linear) wage equation in order to predict the consumption of a good. With 6 covariates (plus the constant), the fitted model has an adjusted R^2 of 0.98 (I already know that the model is "useless", since ex ante the dependent variable is almost a linear combination of the regressors).
In addition, I have found (almost) perfect collinearity between each regressor and all the others (in some cases the auxiliary R^2_k is close to 0.99).
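Just to fix notation (this is standard OLS algebra, nothing specific to my data): with R^2_k denoting the R^2 of the auxiliary regression of the k-th regressor on all the others,

Var(b_k) = sigma^2 / [ (1 - R^2_k) * sum_i (x_ik - xbar_k)^2 ] = sigma^2 * VIF_k / sum_i (x_ik - xbar_k)^2,  where VIF_k = 1 / (1 - R^2_k),

so R^2_k close to 0.99 means VIF_k close to 100, i.e. the standard error of b_k is inflated by a factor of roughly 10 compared with the case of orthogonal regressors.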
My question concerns the partial coefficient estimates and, in particular, their standard errors. None of the coefficient standard errors is even greater than 1, although we would expect them to blow up because of the collinearity. I interpreted the result this way: the collinearity effect is "masked" by the very small MSE of the model (0.001), which is of course small because of the (almost) perfect fit.
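To convince myself of this "masking" story I put together a tiny simulation; it is in Python/numpy rather than Stata, and the numbers (two regressors, made-up coefficients, error s.d. 0.03) are invented purely for illustration, not my actual data. The regressors are almost perfectly collinear (auxiliary R^2 around 0.99, VIF around 100), yet the reported standard errors stay well below 1 because the residual variance is tiny:

import numpy as np

rng = np.random.default_rng(0)
n = 200

# two regressors that are almost a linear combination of each other
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(scale=0.05, size=n)   # auxiliary R^2 around 0.99

# tiny error variance -> (almost) perfect fit, very small MSE
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(scale=0.03, size=n)

X = np.column_stack([np.ones(n), x1, x2])
b = np.linalg.solve(X.T @ X, X.T @ y)                  # OLS coefficients
resid = y - X @ b
mse = resid @ resid / (n - X.shape[1])                 # residual MSE
se = np.sqrt(mse * np.diag(np.linalg.inv(X.T @ X)))    # conventional OLS standard errors

def vif(X, k):
    # 1 / (1 - R^2) from regressing column k on the remaining columns
    others = np.delete(X, k, axis=1)
    fit = others @ np.linalg.lstsq(others, X[:, k], rcond=None)[0]
    r2 = 1 - np.sum((X[:, k] - fit) ** 2) / np.sum((X[:, k] - X[:, k].mean()) ** 2)
    return 1 / (1 - r2)

print("MSE:", mse)                            # of the order of 0.001
print("slope std. errors:", se[1:])           # both well below 1
print("VIFs:", [vif(X, k) for k in (1, 2)])   # both around 100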
Is that sensible? Does this kind of problem (the "masked" collinearity effect) have a specific name in the literature?
Thank you very much for your attention,
Stefano Lombardi

*
* For searches and help try:
* http://www.stata.com/help.cgi?search
* http://www.stata.com/support/statalist/faq
* http://www.ats.ucla.edu/stat/stata/