In the course of reshaping a large data set (from wide to long, though I
think the same problem can arise going the other way), Stata complains that
it has run out of memory. The allocated memory is large enough for the
data set going in, and, as it turns out, also large enough for the final
result. Apparently the data goes through some intermediate state which is
larger. It is easy enough to reset the memory to a larger amount and
re-run the job, but reshaping a large data set is very time consuming.
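For what it's worth, the workaround I'm using looks like this (the dataset
name, variable stub, and i/j variables below are made up for illustration;
-set memory- is the pre-Stata 12 syntax):

```stata
* Sketch of the workaround: allocate more memory up front, then re-run.
* "mydata", the stub "inc", and the variables "id"/"year" are hypothetical.
clear
set memory 500m                  // guess generously (Stata 11 and earlier)
use mydata, clear
reshape long inc, i(id) j(year)  // wide -> long
memory                           // report how much memory was actually used
```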
I have also encountered a similar problem using -collapse-.
So my question is: does anybody know how to calculate the memory
requirements for running -reshape- and -collapse- so memory can be properly
set in advance? I couldn't find anything about this in the manuals.
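To be clear about what I can and can't compute: the sizes of the data set
going in and coming out are easy back-of-the-envelope arithmetic, as in the
sketch below (illustrative only, with assumed widths of 8 bytes per numeric
variable and 4 bytes for the j variable; it deliberately does NOT model
whatever larger intermediate state -reshape- builds, which is the part I'm
asking about).

```python
# Back-of-the-envelope sizes for wide vs. long layouts of the same data.
# Assumed example: n_obs subjects, k_years years, n_vars year-indexed
# variables, plus an id. Widths are illustrative, not Stata's actual storage.

def wide_bytes(n_obs, k_years, n_vars, var_width=8, id_width=8):
    """Wide layout: one row per subject, k_years * n_vars stubbed variables."""
    row_width = id_width + k_years * n_vars * var_width
    return n_obs * row_width

def long_bytes(n_obs, k_years, n_vars, var_width=8, id_width=8, j_width=4):
    """Long layout: k_years rows per subject, plus a j (year) variable."""
    row_width = id_width + j_width + n_vars * var_width
    return n_obs * k_years * row_width

if __name__ == "__main__":
    n, k, v = 10_000, 5, 20
    print(f"wide: {wide_bytes(n, k, v):,} bytes")
    print(f"long: {long_bytes(n, k, v):,} bytes")
```

Arithmetic like this bounds the endpoints, but not the peak memory during
the reshape itself.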
Thanks for any help.
Clyde Schechter
Dept. of Family Medicine & Community Health
Albert Einstein College of Medicine
Bronx, NY, USA
*
* For searches and help try:
* http://www.stata.com/support/faqs/res/findit.html
* http://www.stata.com/support/statalist/faq
* http://www.ats.ucla.edu/stat/stata/