st: Re: Bootstrapping to get Standard Errors for Regression Discontinuity Estimators
From
nshephard <[email protected]>
To
[email protected]
Subject
st: Re: Bootstrapping to get Standard Errors for Regression Discontinuity Estimators
Date
Thu, 23 Sep 2010 04:21:26 -0700 (PDT)
Jen Zhen wrote:
>
> Dear listers,
>
> When bootstrapping Austin Nichol's rd command:
>
> bs, reps(100): rd outcome assignment, mbw(100)
>
> I find that the resulting p-value often tells me the estimate is not
> statistically significant at conventional levels, even when visual
> inspection and more basic methods, such as a simple OLS regression on a
> treatment dummy, assignment, and assignment squared, suggest strong
> statistical significance.
>
> That makes me wonder whether this bootstrapping method might somehow
> understate the true statistical significance of the effect in question.
> Or can and should I fully trust these results and conclude that the
> estimate is not statistically significant at conventional levels?
What do you mean by "conventional levels [of significance]"?
You should set your threshold for declaring statistical significance in the
context of your study. Using p < 0.05 to declare something statistically
significant is often inappropriate.
Often of greater interest is the estimate of the effect size (and its
associated confidence interval); what do these tell you?
See, e.g., Gardner & Altman (1986):
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1339793/pdf/bmjcred00225-0036.pdf
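As a sketch (reusing the outcome and assignment variables from your own
command), after the -bs- run you can list the point estimate together with
the different flavours of bootstrap confidence interval:

    * normal-based, percentile, and bias-corrected bootstrap CIs
    * for the RD estimate stored by the -bs- run above
    estat bootstrap, all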
Try more replications for your bootstrap too; 100 isn't that many,
so try at least 1,000.
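Something along these lines (the seed() is illustrative, just to make the
run reproducible; mbw(100) is carried over from your command):

    * re-run with more bootstrap replications and a fixed seed
    bs, reps(1000) seed(12345): rd outcome assignment, mbw(100)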
Neil