When running a memory hog like -nnmatch-, you need a large amount of RAM
installed to keep Stata from spilling into virtual memory, which is very
slow because the hard drive is used as extra memory. Specifying a larger
memory size does not help here; it merely instructs the computer to augment
the existing RAM with virtual memory.
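For instance (the figures here are only illustrative): on a machine with
512 MB of physical RAM,

  set memory 1400m
  memory

may succeed, but everything beyond the installed 512 MB can only come from
disk-backed virtual memory, which is where the slowdown comes from.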
For a problem of your size, you need at least 1 GB of RAM, and even then it
will take several hours (512 MB will not finish the same problem even after
2 days). The tradeoff between speed and RAM is nonlinear, because an
entirely different process (paging to disk) kicks in once a certain
threshold is passed. If you want to find that threshold, run it on a
smaller sample first, as in the sketch below.
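For example, something along these lines will time -nnmatch- on a 10%
random subsample (a sketch with hypothetical names: y is the outcome, w the
treatment indicator, and x1-x3 the covariates; substitute your own):

  * find the RAM threshold by timing nnmatch on a random subsample
  set seed 12345
  preserve
  sample 10                   // keep a 10% random sample
  timer clear 1
  timer on 1
  nnmatch y w x1 x2 x3, tc(att) m(1)
  timer off 1
  timer list 1                // elapsed time in seconds
  restore

If the subsample runs quickly but the full sample does not, you have
crossed into virtual memory somewhere in between.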
Roy Wada
>Date: Tue, 26 Jul 2005 14:49:19 -0400
>From: Carlos Osorio
>Subject: st: Problems with nnmatch
>
>Hi all,
>
>I'm using -nnmatch- to analyze average treatment effects, but I have not
>been able to get a result after running it for 5+ hours. I'm using a
>sample with about 30,000 observations, and I set memory to 1400 megs,
>without luck. I've also tried a smaller (random) subsample, with no luck
>either.
>
>Has anyone had an experience like this? If so, can you give some hints on
>how to solve it?
>
>Many thanks!
>
>- Carlos
>
>
>Carlos A. Osorio
>===================
>Ph.D. Candidate
>MIT Engineering Systems Division
>http://esd.mit.edu/
>
>Communications Futures Program
>http://cfp.mit.edu/
>Massachusetts Institute of Technology
>
*
* For searches and help try:
* http://www.stata.com/support/faqs/res/findit.html
* http://www.stata.com/support/statalist/faq
* http://www.ats.ucla.edu/stat/stata/