Yes, it does. It prevents -ml- from taking derivatives properly, and -ml-
may then decide it has hit a flat region of the likelihood, which would
affect convergence... so God only knows where that thing really converges
once you start interfering with it like that. One thing the authors of the
[ML] book should have done is give more technical details, such as the
meaning of at least the most important $ML globals, and then warn readers
NEVER to try to change the parameter vector inside the likelihood
evaluator, neither the copy passed into the program nor the $ML_b vector.
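Just to illustrate the intended pattern, here is a minimal method-lf
evaluator of the textbook probit form (a sketch on the auto data, not your
model): -ml- hands the evaluator the name of a temp variable holding the
current linear prediction, and the evaluator's only job is to fill in
`lnf' -- it never writes to $ML_b or to the variables it is given.

    program define myprobit_lf
            version 9
            args lnf xb
            * `xb' names a temp variable with the current linear
            * prediction; $ML_y1 holds the dependent variable's name.
            * Only `lnf' is ever written to.
            quietly replace `lnf' = ln(normal( `xb')) if $ML_y1 == 1
            quietly replace `lnf' = ln(normal(-`xb')) if $ML_y1 == 0
    end

    sysuse auto, clear
    ml model lf myprobit_lf (foreign = mpg weight)
    ml maximize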
Note also that rho12 is the name of a local, and what it should contain,
to my understanding of -ml-, is the name of a variable in your data set
holding a column of identical values, each equal to the current value of
that parameter. It might be OK to reset that local to zero, since the
creation and deletion of that variable is handled by -ml-, but in other
cases this could leave a variable with a name like __0000D3 sitting in
your data set...
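For what it's worth, the usual way such a rho parameter is handled in a
method-lf evaluator looks roughly like this (a bivariate-probit-style
sketch with made-up names, not your actual code): the `rho12' local holds
the name of that -ml--managed variable, and the evaluator just reads it.

    program define mybiprob_lf
            version 9
            args lnf xb1 xb2 rho12
            * `rho12' names a temp variable that -ml- creates, fills with
            * the current value of the parameter, and later deletes;
            * read it, do not reset the local or replace the variable.
            tempvar s1 s2 r
            quietly gen double `s1' = cond($ML_y1 == 1, `xb1', -`xb1')
            quietly gen double `s2' = cond($ML_y2 == 1, `xb2', -`xb2')
            quietly gen double `r'  = cond($ML_y1 == $ML_y2, `rho12', -`rho12')
            quietly replace `lnf' = ln(binormal(`s1', `s2', `r'))
    end

In a real model one would typically estimate atanh(rho) and transform it
inside the evaluator to keep the correlation in (-1, 1), but that is
beside the point here.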
Unfortunately, I don't think I can help much with the substantive part. If
the code is correct, it might be some sort of identification issue in the
model (can you demonstrate that it is identified?); you might also want to
try better starting values; etc.
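If you want to experiment with starting values, something along these
lines is the usual route (the evaluator, equation, and variable names
below are only placeholders for whatever your model actually is):

    ml model lf mylike_lf (eq1: y1 = x1 x2) (eq2: y2 = x2 x3) /rho1u /rho2u /rho12
    ml init rho1u:_cons=0.3 rho2u:_cons=0.3 rho12:_cons=0
    ml maximize
    * or let -ml- hunt for feasible starting values itself:
    * ml search, repeat(50)

If different starting values send the rhos to wildly different points,
that in itself would be a hint of an identification problem.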
On 3/2/07, Geraldo Cerqueiro <[email protected]> wrote:
It was just an attempt to see where the other coefficients would converge
if I restricted rho12 to zero. It does not change things too much: both
rho1u and rho2u converge to values close to -1 (when I know they should be positive)
--
Stas Kolenikov
http://stas.kolenikov.name