Notice: On April 23, 2014, Statalist moved from an email list to a forum, based at statalist.org.
Re: st: xtnbreg - robustness check and model relevance
From: Mário Marques <[email protected]>
To: [email protected]
Subject: Re: st: xtnbreg - robustness check and model relevance
Date: Thu, 17 Jan 2013 18:18:13 +0000
Dear All,
I have a count-data panel and have modelled the number of new foreign
firms on a set of independent variables with -xtnbreg-, using
country-pair fixed effects. A referee recently suggested that I include
year dummies to account for time trends. The results before and after
including these variables are as follows:
Before the inclusion of year dummies:
. xtnbreg nsubs x1 x2 x3 x4 x5 x6 x7, i(country_pairs) fe nolog
note: 345 groups (3450 obs) dropped because of all zero outcomes
Conditional FE negative binomial regression     Number of obs      =      8070
Group variable: country_pairs                   Number of groups   =       807

                                                Wald chi2(7)       =    411.66
Log likelihood = -12046.814                     Prob > chi2        =    0.0000

------------------------------------------------------------------------------
       nsubs |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
          x1 |  -1.225259   .2267174    -5.40   0.000    -1.669617   -.7809005
          x2 |   .2410233   .1037429     2.32   0.020      .037691    .4443555
          x3 |   3.413019   .2943037    11.60   0.000     2.836194    3.989844
          x4 |   .0858284   .0386097     2.22   0.026     .0101549     .161502
          x5 |   .3058368   .3212767     0.95   0.341    -.3238541    .9355276
          x6 |  -.0083512   .0027709    -3.01   0.003     -.013782   -.0029203
          x7 |    .583563   .0996365     5.86   0.000     .3882791    .7788468
       _cons |  -3.364678   1.033477    -3.26   0.001    -5.390256     -1.3391
------------------------------------------------------------------------------
After including i.year:
. xtnbreg nsubs x1 x2 x3 x4 x5 x6 x7 i.year, i(country_pairs) fe nolog
note: 345 groups (3450 obs) dropped because of all zero outcomes
Conditional FE negative binomial regression     Number of obs      =      8070
Group variable: country_pairs                   Number of groups   =       807

                                                Wald chi2(16)      =    765.50
Log likelihood = -11917.252                     Prob > chi2        =    0.0000

------------------------------------------------------------------------------
       nsubs |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
          x1 |  -.2916239   .2756239    -1.06   0.290    -.8318369     .248589
          x2 |   .1921068   .1011274     1.90   0.057    -.0060993    .3903128
          x3 |   .1776646    .416702     0.43   0.670    -.6390562    .9943854
          x4 |   .0992966   .0377979     2.63   0.009      .025214    .1733792
          x5 |   -1.45519   .3305684    -4.40   0.000    -2.103092   -.8072876
          x6 |  -.0011443   .0027679    -0.41   0.679    -.0065692    .0042806
          x7 |   .0635129   .1092125     0.58   0.561    -.1505397    .2775654
       year1 |  ...
       year2 |  ...
         ...
------------------------------------------------------------------------------
My variables of interest are x1 and x2, whose estimated coefficients
and significance levels drop markedly once the year dummies are
included. Is there any way to test whether this new specification (with
year dummies) fits the data better than the previous one? Also,
regarding the estimated standard errors, is there a way to estimate
them robustly?
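In case it helps the discussion, here is a minimal sketch of how I would
think of comparing the two specifications. Since both models appear to be
fit on the same estimation sample (both drop the same 345 all-zero groups
and report 8070 observations), they look nested, so a likelihood-ratio
test of the year dummies, or a Wald test with -testparm-, seems natural.
The stored-estimate names and the bootstrap replication count below are
only illustrative assumptions on my part:

* model without year dummies
xtnbreg nsubs x1 x2 x3 x4 x5 x6 x7, i(country_pairs) fe nolog
estimates store noyear

* model with year dummies
xtnbreg nsubs x1 x2 x3 x4 x5 x6 x7 i.year, i(country_pairs) fe nolog
estimates store withyear

* LR test of the restriction that all year-dummy coefficients are zero
lrtest withyear noyear

* equivalent Wald test of the year dummies from the larger model
estimates restore withyear
testparm i.year

* xtnbreg does not (as far as I know) accept vce(robust) or vce(cluster);
* a panel bootstrap is one route to less model-dependent standard errors
* (200 replications here is purely illustrative)
xtnbreg nsubs x1 x2 x3 x4 x5 x6 x7 i.year, i(country_pairs) fe nolog ///
    vce(bootstrap, reps(200) seed(12345))

If vce(bootstrap) or vce(jackknife) turn out not to be feasible here,
perhaps someone on the list can suggest an alternative.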
Any help would be much appreciated.
M.Marques
*
* For searches and help try:
* http://www.stata.com/help.cgi?search
* http://www.stata.com/support/faqs/resources/statalist-faq/
* http://www.ats.ucla.edu/stat/stata/