From: Austin Nichols <[email protected]>
To: [email protected]
Subject: Re: st: data set too large
Date: Thu, 7 Apr 2011 13:23:29 -0400
Susan L Averett <[email protected]>:
set mem 4000m
If you can't set memory high enough (but consider upgrading your OS or
RAM as necessary), you can edit the .dct file that the ECLS extraction
program spits out to remove any extraneous variables; e.g., did you want
several hundred weight vars (all doubles)? You can also downgrade the
storage type of many variables to byte, but that requires knowing
which variables are suitable. Make sure you -compress- before you
-save- in any case.
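A minimal sketch of that workflow, with hypothetical file and variable
names (the real .dct the extraction program writes will list many
hundreds of variables):

* ecls.dct, edited by hand -- keep only the lines for variables you
* need; the _column() pointers let Stata skip whatever you delete:
*
*   dictionary using ecls.dat {
*       _column(1)  long CHILDID %8f "Child identifier"
*       _column(9)  byte GENDER  %1f "Child sex"
*   }

clear
set mem 4000m                  // memory can only be reset with no data loaded
infile using ecls.dct, clear   // read the raw data via the dictionary
compress                       // demote storage types where the values allow
save ecls_subset.dta, replace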
On Wed, Apr 6, 2011 at 10:55 PM, Averett, Susan L
<[email protected]> wrote:
> Hi all: I am reading in a large data set, the ECLS, and it comes with a Stata dictionary file, so it was obviously meant to be read into Stata. I have Stata 11 SE but I cannot get it to load. It is too big.
> The error message (below) tells me to compress the data, but how can I compress the data when it is not even read into Stata yet?
> Likewise, how can I drop observations? I am reading in a dictionary file. I've got -set mem- set as large as it will go for me:
> 1000m, and the same with -maxvar-; it is set to 30,000.
> Help!