Notice: On April 23, 2014, Statalist moved from an email list to a forum, based at statalist.org.
st: Mata performance monitoring
From
Gordon Hughes <[email protected]>
To
[email protected]
Subject
st: Mata performance monitoring
Date
Mon, 07 Feb 2011 17:15:59 +0000
I am experiencing severe performance bottlenecks in implementing a
maximum likelihood estimator for spatial panel data using Mata. This
is not surprising given the size of the dataset (~200 panel units and
20 time periods) and the complexity of the likelihood function, which
has to be evaluated using numerical derivatives.
However, I would like to find out where the pay-off would be greatest
for improving the code and reducing memory usage. Some systems offer
profiling tools that let the user track how much time and memory each
call to a function or sub-program consumes, so that one can compile
statistics on which operations are potential bottlenecks. In any ML
program the calls to the likelihood function account for most of the
time, but I would like to find out whether significant time and/or
memory is also being spent in other sub-programs.
I am not aware of similar tools for Mata, or of ways in which the
information could be compiled. Can anyone suggest how I might do
this? As a corollary, is there any conventional wisdom about which
of the optimisation methods is best under such circumstances?
Conventional wisdom in the past would have pointed to
Newton-Raphson (nr), but that assumed one could program an
analytical gradient and Hessian. Which method is most efficient for
large datasets with numerical rather than analytical derivatives?
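For reference, in Mata's optimize() suite the method is chosen with optimize_init_technique(); the documented alternatives to "nr" include "bfgs" and "dfp", which build up curvature information instead of recomputing a full (numerical) Hessian, and "bhhh", which requires an observation-level evaluator. A minimal sketch with a trivial stand-in objective (my own likelihood would replace myeval()):

```
mata:
// Trivial stand-in objective: maximise -(p - 3)^2, optimum at p = 3.
void myeval(real scalar todo, real scalar p,
            real scalar v, real rowvector g, real matrix H)
{
    v = -(p - 3)^2
}

S = optimize_init()
optimize_init_evaluator(S, &myeval())
optimize_init_evaluatortype(S, "d0")  // value only: derivatives numerical
optimize_init_technique(S, "bfgs")    // compare "nr", "dfp"; "bhhh"
                                      // needs a gf-type evaluator
optimize_init_params(S, (0))
p = optimize(S)                       // p should end up near 3
end
```

Swapping the string in optimize_init_technique() is all that is needed to compare methods on the same problem, so timing each variant on the real likelihood may be the quickest way to settle the question empirically.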
Gordon Hughes
[email protected]
*
* For searches and help try:
* http://www.stata.com/help.cgi?search
* http://www.stata.com/support/statalist/faq
* http://www.ats.ucla.edu/stat/stata/