Notice: On April 23, 2014, Statalist moved from an email list to a forum, based at statalist.org.
Re: st: about residuals and coefficients
From: David Hoaglin <[email protected]>
To: [email protected]
Subject: Re: st: about residuals and coefficients
Date: Thu, 5 Sep 2013 13:27:17 -0400
Yuval,
I did not suggest a difference between the theorem described by Filoso
and the one in the book by Greene. The logic of the "formal proof" in
Filoso's article, however, leaves something to be desired, especially
Step 2. Also, it's not clear why the proof uses expected values of
residuals. One needs only the fact that the residuals from a
least-squares regression are orthogonal to (i.e., have zero
correlation in the data with) each of the predictor variables
(including the constant).
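(Not part of the original message: a minimal numerical sketch of the orthogonality fact cited above, using simulated data and NumPy least squares. It checks that X'e = 0, i.e., the residuals have zero dot product with every column of the design matrix, including the constant, which implies zero sample correlation with each predictor.)

```python
import numpy as np

# Simulated data (illustrative only; any data would do)
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)

# Design matrix with a constant column
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# X'e = 0: residuals are orthogonal to each predictor, constant included
print(np.allclose(X.T @ resid, 0.0, atol=1e-8))  # True
```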
What "bothers" me is that many textbooks (not including the book by
Greene) interpret coefficients in multiple regression in a way that
involves holding the other predictors constant. The theorem shows
that that is not how multiple regression works.
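(Not part of the original message: a sketch of the theorem under discussion, the Frisch-Waugh-Lovell / regression-anatomy result behind -reganat-. The coefficient on x1 in the full multiple regression equals the slope from regressing y on the part of x1 left over after adjusting x1 for the other predictors; the simulated data are illustrative only.)

```python
import numpy as np

# Simulated data with deliberately correlated predictors
rng = np.random.default_rng(1)
n = 300
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)

# Full multiple regression of y on a constant, x1, and x2
X = np.column_stack([np.ones(n), x1, x2])
b_full, *_ = np.linalg.lstsq(X, y, rcond=None)

# Adjust x1 for the other predictors (constant and x2)
Z = np.column_stack([np.ones(n), x2])
g, *_ = np.linalg.lstsq(Z, x1, rcond=None)
x1_resid = x1 - Z @ g

# Slope of y on the adjusted x1 equals the multiple-regression coefficient
b_partial = (x1_resid @ y) / (x1_resid @ x1_resid)
print(np.isclose(b_full[1], b_partial))  # True
```

The point of the comparison: the coefficient reflects x1 after adjusting for the other predictors in the data at hand, not "holding the other predictors constant."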
David Hoaglin
On Thu, Sep 5, 2013 at 12:28 PM, Yuval Arbel <[email protected]> wrote:
> David,
>
> I don't see any difference between the theorem described by Valerio
> Filoso and by William Greene (except for the fact that Greene
> generalizes the theorem to matrix notation).
>
> Maybe what is bothering you is the fact that Greene (or other textbook
> writers) does not show a practical application of this theorem. But
> this is done very nicely by Valerio Filoso in the Stata article, which
> provides a formal proof, shows that the theorem works, and describes
> the -reganat- command.