RE: st: question on zero inflated regression
From: Nick Cox <[email protected]>
To: "'[email protected]'" <[email protected]>
Subject: RE: st: question on zero inflated regression
Date: Thu, 17 Feb 2011 13:24:45 +0000
I don't think you meant that.
Likelihood and log-likelihood increase together. That is why maximising likelihood corresponds to maximising its logarithm. We maximise its logarithm because in most cases the likelihood would be very small, as the product of possibly very many, very small numbers, and handling the numbers would cause underflow. Even when the likelihood is not very small, taking the logarithm does no harm.
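A minimal illustration of the underflow point, with made-up numbers
(each 0.001 standing in for one observation's likelihood):

. display 0.001^1000         // raw product of 1000 small terms: underflows to 0
. display 1000 * ln(0.001)   // the log-likelihood is a routine sum, about -6907.8

The product 0.001^1000 is 1e-3000, far below what a double can hold,
while the sum of logs is perfectly ordinary.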
Nick
[email protected]
rachel grant
Sorry I meant to say the log-likelihood increases (likelihood
decreases) - I am new to all this so please bear with me! Rachel
On 17 February 2011 10:33, Maarten buis <[email protected]> wrote:
> --- On Thu, 17/2/11, rachel grant wrote:
>> In my case, two of seven variables were significant
>> predictors of zeroes (both temperature), and that makes
>> sense because at low temperatures amphibians cannot move.
>
>> 1. If I change the order of the variables, sometimes the p
>> value of each variable changes, although the overall LR
>> and P for the model remain similar. Why does this happen
>> and what can I do about it?
>
> Sounds like your two temperature variables are highly
> collinear. In those cases there is just very little
> information in your data that can be used to distinguish
> between the effects of these two variables. You could
> take a look at -orthog- (see: -help orthog-) to transform
> these variables so that they are less correlated.
> Alternatively, you could take the position that if they
> are that correlated, any one of them will contain most of
> the relevant information, and you can just leave the other
> out.
>
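A sketch of the -orthog- route, with temp1 and temp2 as hypothetical
names for the two temperature variables:

. correlate temp1 temp2
. orthog temp1 temp2, generate(otemp1 otemp2)

You would then refit the model with otemp1 and otemp2 in place of
the original pair.
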
>> 2. When I try to get a better-fitting model by removing
>> non-significant variables, the log-likelihood decreases
>> slightly. I am not sure why this happens or what to do.
>
> That is exactly what should happen. The fact that an effect
> is non-significant does not mean that the effect is really
> 0. In fact, it is highly unlikely that the effect will be
> exactly 0. The fact that you included it in your model
> suggests that you thought that it could affect the outcome.
> Such variables will probably all affect the outcome; the
> question that significance tests answer is whether you
> collected enough information to detect that effect. I
> admit that this is a rather cynical interpretation of
> statistical testing, but it is not wrong. It is good to
> keep in mind that we usually test a hypothesis that we
> already know cannot be true. For your case that means that
> the likelihood should be slightly influenced when you
> leave out these variables.
>
> Typically I would leave them in my model. If I thought it
> was worthwhile to look at them, then I should tell my
> audience that I did so and show them that the effects were
> non-significant. The easiest way to do that is to just
> leave them in my model.
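On question 2, the drop in log-likelihood can be tested formally with
a likelihood-ratio test. A sketch with hypothetical names: y is the
count outcome, temp1, temp2, and x3-x7 are the seven predictors, and
-zip- stands in for whichever zero-inflated command was used:

. zip y temp1 temp2 x3 x4 x5 x6 x7, inflate(temp1 temp2)
. estimates store full
. zip y temp1 temp2 x3 x4 x5, inflate(temp1 temp2)
. estimates store reduced
. lrtest full reduced

A non-significant result says the dropped variables do lower the
log-likelihood, but by no more than chance alone would explain.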
*
* For searches and help try:
* http://www.stata.com/help.cgi?search
* http://www.stata.com/support/statalist/faq
* http://www.ats.ucla.edu/stat/stata/