RE: st: Dfactor - Optimization Terminated
From: "Degas Wright" <[email protected]>
To: <[email protected]>
Subject: RE: st: Dfactor - Optimization Terminated
Date: Wed, 20 Oct 2010 16:50:23 -0400
Richard,

Thank you for your comments. I am new to time-series analysis. As it
relates to scaling of the data, I have typically standardized data in
cross-sectional analysis, but with time series I did not see the value of
standardization. The _dfactor_ model uses the first difference of the
data. Are there other scaling approaches that are beneficial in
time-series analysis?

I am concerned with the number of observations (236 weeks of data), so
what is a good rule of thumb related to the number of parameters?

Also, your suggestion of using iterate(#) works fine, but I am having to
stop it at 6 iterations to get it to go through the whole data set.

Thank you for your assistance.
Degas A. Wright, CFA
Chief Investment Officer
Decatur Capital Management, Inc.
250 East Ponce De Leon Avenue, Suite 325
Decatur, Georgia 30030
Voice: 404.270.9838
Fax: 404.270.9840
Website: www.decaturcapital.com
-----Original Message-----
From: [email protected]
[mailto:[email protected]] On Behalf Of Richard Gates
Sent: Wednesday, October 20, 2010 4:09 PM
To: [email protected]
Subject: Re: st: Dfactor - Optimization Terminated
Degas <[email protected]> is experiencing a non-convergent
dynamic
factor model using -dfactor-.
> Iteration 21: log likelihood = 134.87566 (backed up)
> Iteration 22: log likelihood = 134.87566 (backed up)
> Iteration 23: log likelihood = 134.87566 (backed up)
> optimization terminated because of numerical instability: Hessian is
not
> negative semidefinite
> r(430);
As with any optimization problem, convergence is never guaranteed. The
example provided by Degas is characteristic of an unidentified problem.
When this occurs, the optimization search continues indefinitely, or the
Hessian becomes numerically singular. An example of the latter is when
the estimate of a variance component goes to zero. The strategy here is
to put an iteration limit on the search so that it terminates before the
failure occurs, -iterate(20)- say. The coefficient table will probably
reveal the problem.

Perhaps the data needs to be scaled. Is there enough data to identify
the parameters (7 variance components, 2 AR parameters, and 7 regression
coefficients)?
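
For instance, one scaling approach is to standardize each observed
series and refit with a small iteration limit. A minimal sketch (the
variable names are those in Degas's -dfactor- call quoted below; the
std() transform and the z-prefixed names are only illustrative, not his
actual setup, and assume the data are already -tsset- or -xtset-):

    * standardize each observed series; this scales over the whole
    * sample, so per-ticker scaling would need a by-group calculation
    foreach v in r ep mom qer fsr bm np {
        egen double z`v' = std(`v')
    }

    * refit on the rescaled series, stopping early to inspect the results
    dfactor (D.(zr zep zmom zqer zfsr zbm znp) = , noconstant) ///
        (f = , ar(1/2)), iterate(20)

A variance estimate that has collapsed toward zero in the resulting
coefficient table is a good sign of an identification problem.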
Degas states:

> I have created a loop that will perform _dfactor_ for each ticker in my
> universe of 1000 stocks. ...
When running 1000 optimization problems using -dfactor-, Degas should
expect some non-convergent examples. Perhaps he can use the Stata
exception handler -capture- with -noisily- to trap any optimization
failures and record them for further investigation. For example,
    capture noisily dfactor (D.(r ep mom qer fsr bm np) = , noconstant) ///
        (f = , ar(1/2)) if xticker == `x', iterate(50)
    if c(rc) {
        local failures `failures' `x'
        continue
    }
Note that I have put a limit of 50 iterations in the code snippet above.
The system maximum is 1500, which could cause problems when executing
-dfactor- 1000 times in a loop.
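
To make the bookkeeping concrete, the snippet might sit inside a loop
over tickers along the following lines. This is only my guess at how
Degas's loop is set up; the -levelsof- call, the foreach structure, and
the final -display- are not from his code, and I am assuming xticker is
numeric (say, an encoded ticker id):

    local failures
    levelsof xticker, local(tickers)
    foreach x of local tickers {
        capture noisily dfactor (D.(r ep mom qer fsr bm np) = , noconstant) ///
            (f = , ar(1/2)) if xticker == `x', iterate(50)
        if c(rc) {
            * remember the ticker and move on to the next one
            local failures `failures' `x'
            continue
        }
    }
    display "tickers that failed to converge: `failures'"

The local -failures- then holds the list of tickers to revisit after the
loop finishes.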
I have speculated a lot on the source of the trouble. If Degas can
provide me
with the data for his non-convergent example, I would be happy to
investigate
the source of the failure.
-Richard Gates
[email protected]
*
* For searches and help try:
* http://www.stata.com/help.cgi?search
* http://www.stata.com/support/statalist/faq
* http://www.ats.ucla.edu/stat/stata/