I am using the Arellano and Bond estimator, XTABOND, with a relatively large
data set (800,000+ observations times 3 years). My machine has 2GB of RAM
and I can set memory equal to 1250MB in Stata. With this memory allotment
I can load my data set and run XTABOND with a dozen or so variables in my
model. However, when I increase the number of explanatory variables in my
model I get the message that there is not sufficient space for new
variables. I presume that in the process of running XTABOND there are
differenced variables that are created and the new variables are expanding
the data array past the limit of my available memory. Is this right? Is
there any way, other than trial and error, to determine just how much
memory will be needed to run XTABOND, given the number of observations and
the number of variables in the model?
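For what it's worth, a crude back-of-envelope estimate can be made by sizing the raw data array and allowing headroom for the temporary differenced/lagged copies. The sketch below assumes 8 bytes per variable and a workspace multiplier of 2 for the extra variables XTABOND creates; both numbers are guesses for illustration, not documented Stata internals:

```python
# Rough memory estimate for a Stata panel dataset (a sketch; the
# workspace_factor covering xtabond's temporary differenced/lagged
# variables is an assumed multiplier, not a documented figure).

def stata_memory_mb(n_obs, n_vars, bytes_per_var=8, workspace_factor=2.0):
    """Approximate MB needed: raw data plus room for temporary variables."""
    raw_bytes = n_obs * n_vars * bytes_per_var    # data held in memory
    total_bytes = raw_bytes * workspace_factor    # headroom for temp copies
    return total_bytes / (1024 ** 2)

# 800,000 panels x 3 years, 20 variables in the model:
print(f"{stata_memory_mb(800_000 * 3, 20):,.0f} MB")  # -> 732 MB
```

On these assumptions a 20-variable model fits within a 1250MB allocation, but the estimate grows linearly with the variable count, which would be consistent with running out of space as more regressors are added.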
Tim
Tim R. Sass
Professor, Department of Economics
Florida State University
Tallahassee, FL 32306-2180
Voice: (850)644-7087   Fax: (850)644-4535
E-mail: [email protected]   Internet: http://garnet.acns.fsu.edu/~tsass