I think you have forgotten to account for the correlation of
the two independent variables. There are many algebraically
equivalent formulae for the SE of a given coefficient, but
all of them account, in one way or another, for the correlation
structure of the X's. In the two-independent-variable case,
this amounts to including the term (1 - R-squared) from the
regression of X1 on X2 in the denominator. The formula you show
below does not do this.
When I apply the correct formula I get the same result
as Stata.
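For illustration, here is a sketch of that calculation in Stata,
using the auto example from your message below (the auxiliary
R-squared comes from regressing mpg on foreign):

*-------begin example------
sysuse auto, clear
quietly regress price mpg foreign
scalar rmse = e(rmse)
* auxiliary regression of X1 (mpg) on the other regressor (foreign)
quietly regress mpg foreign
scalar r2aux = e(r2)
quietly summarize mpg
* SE(b_mpg) = rmse / sqrt( (N-1) * Var(mpg) * (1 - R2_aux) )
di rmse / sqrt( (r(N)-1) * r(Var) * (1 - r2aux) )
* compare with the standard error reported by -regress-
quietly regress price mpg foreign
di _se[mpg]
*------end example-------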
At 11:15 AM 8/15/2006 Tuesday, you wrote:
I am a bit puzzled. I always thought (and just checked it in
several books) that in OLS regression the standard error of
the coefficient for explanatory variable x is:
root mean squared error / { sqrt(N) * standard deviation of x }
However, if I try that in Stata and compare it with Stata output
(see example below) I get numbers that are close, but differ
enough for me to suspect that the cause of the difference is
more than just numerical / rounding errors in my calculation.
What am I missing?
Thanks,
Maarten
*-------begin example------
sysuse auto, clear
reg price mpg foreign
sum mpg
* my attempt at the SE of the mpg coefficient: rmse / (sqrt(N) * sd(mpg))
di e(rmse)/(sqrt(r(N))*r(sd))
*------end example-------
-----------------------------------------
Maarten L. Buis
Department of Social Research Methodology
Vrije Universiteit Amsterdam
Boelelaan 1081
1081 HV Amsterdam
The Netherlands
visiting address:
Buitenveldertselaan 3 (Metropolitan), room Z434
+31 20 5986715
http://home.fsw.vu.nl/m.buis/
-----------------------------------------
Richard T. Campbell
Professor
Division of Epidemiology and Biostatistics
School of Public Health &
Institute for Health Research and Policy
University of Illinois at Chicago