One of my faculty members brought this discrepancy to my attention. Why is
the two-sided test of significance (.8004) not equal to twice the one-sided
test (e.g., .3749 * 2 = .7498)? She tells me that SAS gives a two-sided
p-value of .7498.
Thanks,
Paul
. sdtesti 9 . 4.849227 16 . 4.475247

Variance ratio test
------------------------------------------------------------------------------
         |     Obs        Mean    Std. Err.   Std. Dev.   [95% Conf. Interval]
---------+--------------------------------------------------------------------
       x |       9           .    1.616409    4.849227           .           .
       y |      16           .    1.118812    4.475247           .           .
---------+--------------------------------------------------------------------
combined |      25           .           .           .           .           .
------------------------------------------------------------------------------
Ho: sd(x) = sd(y)

    F(8,15) observed   = F_obs         =  1.174
    F(8,15) lower tail = F_L = 1/F_obs  =  0.852
    F(8,15) upper tail = F_U = F_obs    =  1.174

  Ha: sd(x) < sd(y)       Ha: sd(x) != sd(y)            Ha: sd(x) > sd(y)
  P < F_obs = 0.6251      P < F_L + P > F_U = 0.8004    P > F_obs = 0.3749
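
For reference, both figures can be reproduced from the standard deviations in
the output using Stata's F() and Ftail() distribution functions. The lines
below are a minimal sketch (not part of the sdtesti output above): the .8004
is the sum of the two tail areas P(F(8,15) < 1/F_obs) and P(F(8,15) > F_obs)
shown in the table, while .7498 is twice the upper tail; the two tails differ
because the F(8,15) distribution is not symmetric about 1 when the numerator
and denominator degrees of freedom are unequal.

  * sketch only: recompute the tail areas reported by -sdtesti-
  scalar F_obs = (4.849227/4.475247)^2
  display "F_obs                      = " scalar(F_obs)
  display "upper tail P(F > F_obs)    = " Ftail(8, 15, scalar(F_obs))
  display "lower tail P(F < 1/F_obs)  = " F(8, 15, 1/scalar(F_obs))
  display "sum of both tails (Stata)  = " F(8, 15, 1/scalar(F_obs)) + Ftail(8, 15, scalar(F_obs))
  display "2 * smaller tail (SAS)     = " 2*Ftail(8, 15, scalar(F_obs))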
______________________________
Paul F. Visintainer, PhD
Professor and Program Director
Health Quantitative Sciences
School of Public Health
New York Medical College