Jade - I agree with you. Ridge regression effectively adds a positive
constant to the diagonal of the cross-product matrix, so even if that
matrix is rank-deficient to begin with, it should be full rank after the
ridge parameter is added. Of course, you could always do ridge regression
"by hand" by appending p extra observations consisting of the square root
of the ridge parameter times an identity matrix (with p zeros appended to
the response), where p is the number of variables in the regression, and
then doing OLS.
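For concreteness, here is a minimal Mata sketch of both approaches; the
variable names x1-x3 and y, and the value of lambda, are placeholders:

    mata:
        lambda = 0.5                              // hypothetical ridge parameter
        X = st_data(., ("x1", "x2", "x3"))        // placeholder regressors
        y = st_data(., "y")                       // placeholder response
        p = cols(X)
        // direct ridge: b = (X'X + lambda*I)^-1 X'y
        b_ridge = invsym(cross(X, X) + lambda*I(p)) * cross(X, y)
        // "by hand": append sqrt(lambda)*I to X and p zeros to y, then OLS
        Xa = X \ sqrt(lambda)*I(p)
        ya = y \ J(p, 1, 0)
        b_ols = invsym(cross(Xa, Xa)) * cross(Xa, ya)
        b_ridge, b_ols                            // the two columns should agree
    end

Note that the square root matters: appending sqrt(lambda)*I makes the
augmented cross-product Xa'Xa equal X'X + lambda*I, which is exactly the
ridge normal-equations matrix.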
-----Original Message-----
From: [email protected]
[mailto:[email protected]]On Behalf Of Jade Redoblado
Sent: Wednesday, December 08, 2004 2:09 AM
To: [email protected]
Subject: st: problem with ridge regression output
Hi! I tried running generalized ridge regression using Stata code by
Bob Obenchain, but Stata still continued to drop variables in the
regression. I checked and found out that Stata drops variables only if
there are linear dependencies in the data. How can this be when I'm
already doing ridge regression? Isn't it the case that even with the
most inappropriate ridge parameter values, I still shouldn't see signs
of linear dependencies in my runs?
Jade
*
* For searches and help try:
* http://www.stata.com/support/faqs/res/findit.html
* http://www.stata.com/support/statalist/faq
* http://www.ats.ucla.edu/stat/stata/
*