That's my only real idea here.
I did think of getting a larger
dataset (real or concocted) and
reducing it by random deletion,
but there's always a question
of how relevant that would be to
your real problem.
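For readers of the archive: the two suggestions in this thread (bootstrap the quantile-regression standard errors with -bsqreg-, and study how the standard error varies with sample size by fitting on random subsamples of a larger dataset) can be sketched roughly as below. The variable names (wage, pwllabcon, age, educ) are placeholders, not a tested do-file.

```stata
* Bootstrapped standard errors for a median regression.
* 1000 replications; outcome and covariates are placeholders.
bsqreg wage pwllabcon age educ, quantile(.50) reps(1000)

* Simulation idea: starting from a larger dataset, fit qreg
* on random subsamples of increasing size to see how the
* standard error on pwllabcon depends on sample size.
forvalues n = 100(100)1000 {
    preserve
    sample `n', count          // keep a random n observations
    qreg wage pwllabcon age educ, quantile(.50)
    display "n = `n'  se = " _se[pwllabcon]
    restore
}
```

Note that -sample- with the count option keeps exactly `n' observations; without it the argument is a percentage.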
Nick
[email protected]
> -----Original Message-----
> From: [email protected]
> [mailto:[email protected]]On Behalf Of Jeannette
> Wicks-Lim
> Sent: 07 February 2005 21:44
> To: [email protected]
> Subject: st: Re: RE: Fw: Sample size and QREG
>
>
> To clarify: do you mean simply use bsqreg and see what the
> standard errors
> look like for my pwllabcon variable?
>
> Thanks!
> J
> > If you're in search of some rule that things
> > will be OK once the sample size reaches some threshold,
> > that would seem a poor way to summarize, given the more
> > likely continuous dependence of error on sample size.
> >
> > Better to get an idea of quite
> > how unreliable your results are by bootstrapping?
> >
> > Nick
>
*
* For searches and help try:
* http://www.stata.com/support/faqs/res/findit.html
* http://www.stata.com/support/statalist/faq
* http://www.ats.ucla.edu/stat/stata/