You have the -robust- option in one of your runs, but not in the other.
This changes how the estimate of the VCE is computed, and I believe it
also affects the running estimate of the Hessian. The initial steps up
to iteration 0 are taken without regard to the curvature of the
likelihood being maximized, so the two procedures start from the same
point. After that, however, they are working with somewhat different
estimates of the second-derivative matrix, I guess, and that is where
the discrepancies come from. By the way, neither of the estimation runs
converged. With 40 observations there is no way to get reasonable
results for 28 parameters. No way. And you can see that in some of the
parameters shooting off to infinity.
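
If you want to see this on one machine, something along the following
lines should make the difference visible. This is only a sketch, using
the variable names from your output; -estimates store- and -estimates
table- are official Stata 8 commands, and e(converged) is the
convergence flag saved by -ml--based estimators (look at -ereturn list-
if your copy does not report it):

* fit the model without -robust- and note whether it converged
xi: mlogit incentive2 DClined seri_mud avgDCnumgate_2 avghhplot wstress age_vill edu_vill pthsch ptnonag ptmig i.county, b(1)
display "converged = " e(converged)
estimates store plain

* refit with -robust-; the likelihood is the same, but -robust- changes
* the VCE and, per the point above, perhaps the optimizer's working Hessian
xi: mlogit incentive2 DClined seri_mud avgDCnumgate_2 avghhplot wstress age_vill edu_vill pthsch ptnonag ptmig i.county, b(1) robust
display "converged = " e(converged)
estimates store rob

* side-by-side coefficients and standard errors; with 40 observations and
* 28 parameters, expect both columns to be full of huge, meaningless numbers
estimates table plain rob, b(%9.3f) se
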
On 2/9/06, Qiuqiong Huang <[email protected]> wrote:
> Hi,
> I found that when I run mlogit on different machines, I get different
> results. The data and variables used in the regression are exactly the
> same, but one machine ran 13 iterations while the other ran only 11
> iterations and did not converge. The results (magnitudes and standard
> errors of the coefficients) are totally different too. On both machines I
> used Stata 8.2.
>
> Is there any way I can find out why I got different results? Below are
> the two sets of results.
>
> Thanks a lot.
> Qiuqiong
> ***********************************************************************
> Machine 1:
> . xi: mlogit incentive2 DClined seri_mud avgDCnumgate_2 avghhplot wstress age_vill edu_vill pthsch ptnonag ptmig i.county, b(1) robust tr
> i.county          _Icounty_31-35      (naturally coded; _Icounty_31 omitted)
>
> Iteration 0: log pseudolikelihood = -41.286362
> 0 0 0 0 0
> 0 0 0 0 0
> 0 0 0 0 -.9444616
> 0 0 0 0 0
> 0 0 0 0 0
> 0 0 0 0 -.1823216
>
> Iteration 1: log pseudolikelihood = -23.942025
> -.1535105 -1.862865 -.0126249 -.1058805 -.0903134
> -.0083592 .0718334 .0156798 -.0209434 -.0644558
> 3.273693 .7684405 1.147216 -.7303947 1.996602
> .7346775 .4799768 -.0084356 .3432856 .8750169
> .0167751 -.1539001 .0164632 -.0378387 .0005579
> -2.041441 -.3045206 -.4144813 -1.28777 -4.461722
>
> Iteration 2: log pseudolikelihood = -20.07641
> .......Omitted here
>
> Iteration 10: log pseudolikelihood = -4.2511694
> 216.9474 -83.40591 -.0279035 6.761672 64.32783
> 22.57524 27.4512 8.661309 -10.73628 -3.635502
> 222.3915 41.19815 30.15323 -13.08641 -1635.229
> 200.3627 -3.597955 .0447571 41.53482 87.61121
> 15.90677 27.2541 8.738907 -8.765639 -.4245701
> -83.01943 -71.07202 -171.0882 -218.1327 -1636.855
>
> Iteration 11: log pseudolikelihood = -3.7138854
> 264.8277 -99.28156 -.0282477 7.7863 81.29004
> 27.57892 32.82099 10.73473 -13.09572 -4.473515
> 267.4728 44.75714 35.35282 -20.16417 -2002.28
> 246.2304 -4.792918 .0514819 51.2159 110.3305
> 19.22811 32.69608 10.69847 -10.85353 -.3073647
> -110.3294 -89.29089 -215.4751 -269.2497 -1999.169
>
> Iteration 12: log pseudolikelihood = -3.5259999
> 290.1185 -106.7525 -.0283225 9.239672 89.29666
> 30.11146 35.97972 11.79514 -14.29023 -4.862053
> 286.5799 45.57917 34.45153 -27.19174 -2195.181
> 270.1947 -5.378952 .0567416 56.18665 121.0073
> 21.10159 35.89417 11.73846 -11.91466 -.3277298
> -120.984 -97.89167 -236.4222 -295.3328 -2193.674
>
> Iteration 13: log pseudolikelihood = -3.510283
> 296.3671 -108.5992 -.0283358 9.596246 91.27622
> 30.73784 36.75933 12.05715 -14.58576 -4.958122
> 291.3211 45.79315 34.24537 -28.90659 -2242.855
> 276.1129 -5.525269 .0580142 57.41468 123.6471
> 21.56427 36.68407 11.99532 -12.17688 -.3327286
> -123.6217 -100.0199 -241.603 -301.7788 -2241.715
>
>
> Multinomial logistic regression                   Number of obs   =         40
>                                                   Wald chi2(24)   =          .
>                                                   Prob > chi2     =          .
> Log pseudolikelihood = -3.5107012                 Pseudo R2       =     0.9150
> ***********************************************************************
>
>
> Machine 2:
>
>
> . xi: mlogit incentive2 DClined seri_mud avgDCnumgate_2 avghhplot wstress age_vill edu_vill pthsch ptnonag ptmig i.county, b(1) tr
> i.county          _Icounty_31-35      (naturally coded; _Icounty_31 omitted)
>
> Iteration 0: log likelihood = -41.286362
> 0 0 0 0 0
> 0 0 0 0 0
> 0 0 0 0 -.9444616
> 0 0 0 0 0
> 0 0 0 0 0
> 0 0 0 0 -.1823216
>
> Iteration 1: log likelihood = -23.942025
> -.1535105 -1.862865 -.0126249 -.1058805 -.0903134
> -.0083592 .0718334 .0156798 -.0209434 -.0644558
> 3.273693 .7684405 1.147216 -.7303947 1.996602
> .7346775 .4799768 -.0084356 .3432856 .8750169
> .0167751 -.1539001 .0164632 -.0378387 .0005579
> -2.041441 -.3045206 -.4144813 -1.28777 -4.461722
>
> Iteration 2: log likelihood = -20.07641
> .7064876 -2.962282 -.0142221 -.1936033 .08562
> ........Omitted.......
>
> Iteration 10: log likelihood = -4.2511694
> 216.9474 -83.40591 -.0279035 6.761672 64.32783
> 22.57524 27.4512 8.661309 -10.73628 -3.635502
> 222.3915 41.19815 30.15323 -13.08641 -1635.229
> 200.3627 -3.597955 .0447571 41.53482 87.61121
> 15.90677 27.2541 8.738907 -8.765639 -.4245701
> -83.01943 -71.07202 -171.0882 -218.1327 -1636.855
>
> Iteration 11: log likelihood = .
> 312.7081 -115.1572 -.0285919 8.810928 98.25225
> 32.5826 38.19077 12.80816 -15.45516 -5.311528
> 312.554 48.31613 40.5524 -27.24193 -2369.331
> 292.098 -5.987881 .0582066 60.89697 133.0497
> 22.54945 38.13805 12.65803 -12.94142 -.1901592
> -137.6394 -107.5098 -259.862 -320.3667 -2361.483
>
>
> Multinomial logistic regression                   Number of obs   =         40
>                                                   LR chi2(28)     =          .
> Log likelihood = .                                Prob > chi2     =          .
>
--
Stas Kolenikov
http://stas.kolenikov.name
*
* For searches and help try:
* http://www.stata.com/support/faqs/res/findit.html
* http://www.stata.com/support/statalist/faq
* http://www.ats.ucla.edu/stat/stata/