From: v van kervel <v.l.vankervel@uvt.nl>
To: statalist@hsphsun2.harvard.edu
Subject: st: Clustered errors without saving data in pagefile
Date: Sat, 24 Sep 2011 12:18:33 -0700 (PDT)
Dear Statalist,

I have a 25 GB dataset. A simple regression (-regress-) takes about 15 seconds to run. However, when I specify clustered (or robust / Newey-West) standard errors, it takes 1.5 hours. I believe the reason is that the data in memory are first copied to the hard disk in a temporary pagefile, then the robust standard errors are calculated and saved, and finally the dataset is read back into memory. This is extremely time consuming.

My question: can I obtain clustered standard errors without copying the data in memory to the hard disk? I have 40 GB of RAM, so I expect this is enough space to do the calculations in RAM only.

Thank you,
V van Kervel
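[For readers unfamiliar with the syntax in question: the clustered and robust standard errors described above are requested through the -vce()- option of -regress-. A minimal sketch, where y, x1, x2 and the cluster variable firmid are hypothetical placeholder names, not from the original post:

```
* cluster-robust standard errors, clustering on the
* hypothetical identifier firmid
regress y x1 x2, vce(cluster firmid)

* heteroskedasticity-robust (Huber/White) standard errors
regress y x1 x2, vce(robust)
```

The estimation step is identical to plain -regress-; only the variance-covariance matrix differs, which is why the large slowdown reported above points to memory handling rather than to the computation itself.]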