From: Steve Samuels <[email protected]>
To: [email protected]
Subject: Re: st: Calculating standard deviations used to approximate beta distributions
Date: Wed, 26 Mar 2014 21:55:06 -0400
I've been bitten by Emily's unannounced cross-posting. I didn't pay any
attention to this thread, but I answered Emily's question on
Stackoverflow
(http://stackoverflow.com/questions/22108153/calculating-standard-
deviations-in-stata-to-approximate-beta-distributions/22583428#22583428)
by proposing a -bootstrap- solution. I made a mistake in transcribing
her formula, and I also erroneously assumed that -nlcom- would not
work, having missed Austin's post.
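A minimal sketch of that -bootstrap- idea follows, using the shipped
cancer data rather than Emily's; the cycle values t = 180 and u = 90,
reps(), seed(), and the name tp are illustrative choices, and the
expression mirrors Austin's formula but with the corrected _b[_cons]
reference discussed below:
************************************
* Sketch only: bootstrap the standard error of one cycle's transition
* probability, written as a nonlinear function of the Weibull coefficients.
* The cancer data stand in for Emily's data; t = 180, u = 90, reps(),
* seed(), and the name tp are illustrative choices.
webuse cancer, clear
stset studytime, fail(died)
bootstrap tp = (1 - exp([ln_p]_cons*(180-90)^exp(_b[_cons]) ///
    - [ln_p]_cons*180^exp(_b[_cons]))), reps(200) seed(12345): ///
    streg, nohr dist(weibull)
************************************
The replication standard error that -bootstrap- reports for tp is the
quantity to compare with the -nlcom- standard error further down.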
I've corrected my answer and added -nlcom- to show that the bootstrapped
and -nlcom- standard errors are close. I made one correction to Austin's
formula: It turns out that "_cons" and _b[_cons] are not the same, as
the following code shows:
************************************
webuse cancer, clear
stset studytime, fail(died)
streg , nohr dist(weibull)
di _cons
di _b[_cons]
*************************************
I don't know where the value of the bare _cons comes from.
_b[_cons] can also be referenced as _b[_t:_cons] or [_t]_cons. Here is the output:
Weibull regression -- log relative-hazard form

No. of subjects =           48                 Number of obs   =        48
No. of failures =           31
Time at risk    =          744
                                               LR chi2(0)      =     -0.00
Log likelihood  =   -60.624022                 Prob > chi2     =         .

------------------------------------------------------------------------------
          _t |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
       _cons |  -3.798627   .5789797    -6.56   0.000    -4.933406   -2.663848
-------------+----------------------------------------------------------------
       /ln_p |   .1882428   .1506025     1.25   0.211    -.1069326    .4834183
-------------+----------------------------------------------------------------
           p |   1.207127   .1817962                      .8985863    1.621608
         1/p |   .8284135   .1247611                      .6166718    1.112859
------------------------------------------------------------------------------
. di _cons
1
. di _b[_cons]
-3.7986271
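For completeness, here is a sketch of Austin's -nlcom- call with
_b[_cons] substituted for the bare _cons, to be run after the -streg-
above; 180 and 90 again stand in for the locals `t' and `u', and the
loop simply repeats the calculation for each of the 40 cycles of 90
days in Emily's Markov model (the matrix and display choices are
arbitrary):
************************************
* Austin's expression with _b[_cons] in place of the bare _cons;
* 180 and 90 are illustrative stand-ins for `t' and `u'
nlcom (1 - exp([ln_p]_cons*(180-90)^exp(_b[_cons]) ///
    - [ln_p]_cons*180^exp(_b[_cons])))

* Sketch: repeat the calculation for each of 40 cycles of 90 days and
* report the point estimate and standard error for every cycle
forvalues c = 1/40 {
    local t = 90*`c'
    quietly nlcom (1 - exp([ln_p]_cons*(`t'-90)^exp(_b[_cons]) ///
        - [ln_p]_cons*`t'^exp(_b[_cons])))
    matrix b = r(b)
    matrix V = r(V)
    display as txt "cycle " %2.0f `c' as res "  tp = " %9.6f b[1,1] ///
        "  se = " %9.6f sqrt(V[1,1])
}
************************************
Comparing these standard errors with the bootstrap replication
standard error in the sketch above is the check mentioned at the start
of this message.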
Steve
[email protected]
On Mar 1, 2014, at 8:38 AM, Austin Nichols <[email protected]> wrote:
Emily McPherson <[email protected]>
I'm not sure what you are after here, but for any values `t' and `u'
you want to substitute, you can calculate directly by substituting for
your gamma and lambda:
nlcom 1-(exp(([ln_p]_cons*((`t'-`u')^(exp(_cons))))-([ln_p]_cons*(`t'^(exp(_cons))))))
On Fri, Feb 28, 2014 at 2:49 PM, Emily McPherson <[email protected]> wrote:
> Hello all, my name is Emily McPherson. I'm a health economist at the Canadian Centre for Applied Research in Cancer Control.
>
> My question relates to calculating the standard deviation (SD) of transition probabilities derived from coefficients estimated through Weibull regression in Stata.
>
> The transition probabilities are being used to model disease progression of leukemia patients over 40 cycles of 90 days (about 10 years). I need the SDs of the probabilities (which change over the run of the Markov model) to create beta distributions whose parameters can be approximated using the corresponding Markov cycle probability and its SD. These distributions are then used for probabilistic sensitivity analysis, i.e., they are substituted for the simple probabilities (one for each cycle), and random draws from them are used to evaluate the robustness of the model's cost-effectiveness results.
>
> Anyway, using time-to-event survival data, I've used regression analysis to estimate coefficients that can be plugged into an equation to generate transition probabilities. For example...
>
>
> . streg, nohr dist(weibull)
>
> failure _d: event
> analysis time _t: time
>
> Fitting constant-only model:
>
> Iteration 0: log likelihood = -171.82384
> Iteration 1: log likelihood = -158.78902
> Iteration 2: log likelihood = -158.64499
> Iteration 3: log likelihood = -158.64497
> Iteration 4: log likelihood = -158.64497
>
> Fitting full model:
> Iteration 0: log likelihood = -158.64497
>
> Weibull regression -- log relative-hazard form
>
> No. of subjects =           93                 Number of obs   =        93
> No. of failures =           62
> Time at risk    =        60250
>                                                LR chi2(0)      =     -0.00
> Log likelihood  =   -158.64497                 Prob > chi2     =         .
>
> ------------------------------------------------------------------------------
>           _t |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
> -------------+----------------------------------------------------------------
>        _cons |  -4.307123   .4483219    -9.61   0.000    -5.185818   -3.428429
> -------------+----------------------------------------------------------------
>        /ln_p |  -.4638212   .1020754    -4.54   0.000    -.6638854    -.263757
> -------------+----------------------------------------------------------------
>            p |    .628876   .0641928                      .5148471    .7681602
>          1/p |   1.590139   .1623141                      1.301812    1.942324
> ------------------------------------------------------------------------------
>
>
> We then create the probabilities with an equation that uses p and _cons, as well as t for time (i.e., Markov cycle number) and u for cycle length (usually a year; mine is 90 days, since I'm working with leukemia patients who are very likely to have an event, i.e., relapse or die).
>
> So where lambda = p, gamma = (exp(_cons))
>
> gen result = (exp((lambda*((t-u)^ (gamma)))-(lambda*(t^(gamma)))))
>
> gen transitions = 1-result
>
> Turning to the variability, I first calculate the standard errors for the exponentiated coefficients:
>
>
> . nlcom (exp(_b[_cons])) (exp(_b[/ln_p]))
>
> _nl_1: exp(_b[_cons])
> _nl_2: exp(_b[/ln_p])
>
> ------------------------------------------------------------------------------
>           _t |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
> -------------+----------------------------------------------------------------
>        _nl_1 |   .0116539   .0044932     2.59   0.009     .0028474    .0204604
>        _nl_2 |   .6153864    .054186    11.36   0.000     .5091838     .721589
> ------------------------------------------------------------------------------
>
>
> But what I'm really after is the standard errors of the transitions values themselves, e.g.,
>
> nlcom (_b[transitions])
>
> But this doesn't work. Any feedback on how to get closer to this idea would be much appreciated. Thanks in advance!
>
> Emily
*
* For searches and help try:
* http://www.stata.com/help.cgi?search
* http://www.stata.com/support/faqs/resources/statalist-faq/
* http://www.ats.ucla.edu/stat/stata/