Notice: On April 23, 2014, Statalist moved from an email list to a forum, based at statalist.org.
From | Abdel Rahmen El Lahga <rahmen.lahga@gmail.com> |
To | statalist@hsphsun2.harvard.edu |
Subject | Re: st: U.S. Census Data |
Date | Thu, 6 May 2010 16:48:47 +0200 |
This is basically a memory problem; Stata can handle bigger data sets. You say nothing about your OS or the maximum RAM of your computer. On my iMac with 4 GB of RAM the following code works fine:

. clear

. set mem 3g
(3145728k)

. set obs 30000000
obs was 0, now 30000000

. foreach i of numlist 1/15 {
  2.     gen x`i' = rnormal()
  3. }

end of do-file

Abdel

2010/5/6 Nate Breznau <nbreznau@bigsss.uni-bremen.de>:
> I want to end my usage of SPSS, and in general have successfully done
> so; however, in a project working with U.S. Census data I need to use a
> data file that has over 30 million cases and 15 variables. This is the
> smallest version I can use for my purposes. Is there any way to alter Stata
> to work with such a monster file? I've pushed it to its maximum allowed
> memory and it's not enough.
>
> I thank anyone kindly for any advice, no matter how dismal.
>
> -Nate
>
> --
> PhD Fellow
> Social Integration and the Welfare State
> BIGSSS
> FVG W1150
> 0421 218 66384

--
AbdelRahmen El Lahga

*
*   For searches and help try:
*   http://www.stata.com/help.cgi?search
*   http://www.stata.com/support/statalist/faq
*   http://www.ats.ucla.edu/stat/stata/
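[Editor's note] As a rough sanity check on why `set mem 3g` is enough for this dataset, here is a back-of-envelope calculation (not part of the original thread), assuming Stata's default 4-byte `float` storage type for all 15 variables:

```python
# Back-of-envelope memory estimate for the dataset described in the thread:
# 30 million observations x 15 variables, assuming Stata's default
# 4-byte float storage type (an assumption; actual types may differ).
obs = 30_000_000
num_vars = 15
bytes_per_float = 4

total_bytes = obs * num_vars * bytes_per_float
total_gib = total_bytes / 2**30
print(f"{total_gib:.2f} GiB")  # about 1.68 GiB, so 3 GB leaves headroom
```

So the raw data needs roughly 1.7 GiB, which fits comfortably under the 3 GB allocation above; using smaller storage types (e.g. `byte` or `int` where possible) would shrink it further.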