> Logic dictates that the explanatory power of a model will decrease as
> the correlation among predictors increases, because less independent
> information is being added to the system. Compare the R^2 from a
> three-variable equation in which Y is correlated with X1 at .2 and with
> X2 at .2, and X1 and X2 are correlated at .9, to the same equation with
> the correlation between X1 and X2 now at .2. The R^2 for the latter
> equation (~.067) is much larger than the R^2 for the former (~.042).
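For anyone who wants to check those quoted figures, they follow from the standard two-predictor formula R^2 = (RY1^2 + RY2^2 - 2*RY1*RY2*R12) / (1 - R12^2); a quick sketch:

```python
def r_squared(ry1, ry2, r12):
    """R^2 for two standardized predictors, computed from the
    pairwise correlations RY1, RY2, and R12."""
    return (ry1**2 + ry2**2 - 2 * ry1 * ry2 * r12) / (1 - r12**2)

# Y correlates .2 with each predictor; compare R12 = .9 vs R12 = .2
print(round(r_squared(0.2, 0.2, 0.9), 3))  # 0.042
print(round(r_squared(0.2, 0.2, 0.2), 3))  # 0.067
```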
A counter-example arises when suppressor effects are present: X1 is
negatively correlated with Y, X2 is positively correlated with Y, and X1
and X2 are positively correlated with each other. For example, with R12 =
.1, RY1 = -.4, and RY2 = .4, R^2 = .356. Now increase R12 to .4, and
R^2 rises to .533.
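The suppressor numbers check out under the same two-predictor formula; a minimal verification:

```python
def r_squared(ry1, ry2, r12):
    """R^2 for two standardized predictors, computed from the
    pairwise correlations RY1, RY2, and R12."""
    return (ry1**2 + ry2**2 - 2 * ry1 * ry2 * r12) / (1 - r12**2)

# Suppressor case: RY1 = -.4, RY2 = .4, predictors positively correlated.
# Raising R12 from .1 to .4 *increases* R^2.
print(round(r_squared(-0.4, 0.4, 0.1), 3))  # 0.356
print(round(r_squared(-0.4, 0.4, 0.4), 3))  # 0.533
```

The negative cross-product term -2*RY1*RY2*R12 becomes positive here (RY1 and RY2 have opposite signs), so increasing R12 adds to the numerator instead of subtracting from it.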