Thanks for the reply, Sergiy; it is quite interesting to learn that allocating
more memory to Stata via -set mem- can make things worse. The same message
was conveyed to me by Stata support staff.
I had my profile.do allocate 800m at startup, and then the erratic behavior
described in the initial post would happen randomly, with no apparent
pattern... What's worth noting, though, is that it was not only graphing
commands that triggered the "op. sys. refuses to provide memory" error.
Routine tasks like -mean- would also trigger it. What's more, I sometimes
repeat tasks via double-click in the Review window, and the same task that had
executed perfectly well a couple of minutes earlier would suddenly overwhelm
Stata. It was this property of the problem that prompted my initial message.
As for the data, there was nothing special about them. Most of the time, I
check whether the data occupy all the memory available to them, which is
easily done via -describe-, and I usually find ample space, e.g. 95% of memory
free.
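(For what it's worth, the quick check I mean is something like the following,
using the auto data and a made-up 50m allocation just for illustration:)
set mem 50m          // made-up allocation, only for illustration
sysuse auto, clear
describe, short      // reports observations, variables, and dataset size
memory               // shows how much of the allocation the data actually use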
With the new information from the replies in mind (kudos to all who
replied), I will cut the memory back in the future and change my profile.do
accordingly. I still think that this whole area of Stata is a source of
much user frustration, some of which might be alleviated through a column in
the Stata Journal...
Martin
_________________________________________________________________
Diplom-Kaufmann Martin Weiss
Mohlstrasse 36
Room 415
72074 Tuebingen
Germany
Phone: 0049-7071-2978184
Home: http://www.wiwi.uni-tuebingen.de/cms/index.php?id=1130
Publications: http://www.wiwi.uni-tuebingen.de/cms/index.php?id=1131
SSRN: http://papers.ssrn.com/sol3/cf_dev/AbsByAuth.cfm?per_id=669945
1. Martin, perhaps you could tell us something more specific about your
data.
2. The error message you quote usually appears after the -set mem- command.
AFAIK graphing commands do not set memory. The only possibility I see
is that the graphing subsystem could not allocate resources for
creating a new window and the graphical objects. Don't forget that each
point on a scatter will be a new object in Windows, with its own
pointer (4 bytes), color (4 bytes), size (N bytes), type (M bytes),
etc. So if you are doing a scatter with 800 million observations, you
can be sure to run out of resources. By increasing the memory given to
Stata you are making things WORSE. Stata is allocated its 2GB by
Windows automatically when it is launched. What you adjust via -set
mem- is the amount of memory available for storing the DATA, and NOT
for the other objects (graphs, matrices, anything you see on the
screen, including the command line, output window, etc.).
To see this:
set mem 100k
sysuse auto
expand 26
memory
// you have ~4kb free
matrix A=J(100,100,1.23456789)
// matrix A requires 100*100*8 bytes ~ 80kb
// no error:
matrix list A
This is because matrix A was stored in the RESIDUAL memory: the 2GB
minus whatever Stata's own code requires, minus the 100kb allocated to the
data. Usually we do not care, since the dataset is our "heaviest" object.
But things become more complicated when you start working with other
"heavy" objects, like plugins, which require memory for their own work;
in my case that can be 600MB and more. Then you start considering carefully
whether you need to give Stata that much memory (see my previous
inquiries regarding how to estimate the amount needed for a particular
dataset).
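(A rough sketch of the kind of estimate I mean, using the saved results of
-describe-; the headroom factor is only a guess:)
clear
sysuse auto
describe, short
display "approx. data size: " r(N)*r(width) " bytes"
// size the allocation from this figure plus some headroom,
// e.g. -set mem- to roughly 1.5 times the data size,
// rather than grabbing the maximum the operating system will give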
3. Finally, there is the concept of a "serset" in Stata. When you save a
graph, all the data required to build the graph are saved with it to
disk. (To see this, just save your graph as a .gph file and browse it
with any hex viewer: you will find <BeginSersetData> and
<EndSersetData> with binary values in between.) This allows Stata to
replay the graph later. These sersets must also reside somewhere,
so just having enough room to store your data does not necessarily
mean you can plot the data. In the worst case, you might need twice as
much memory and good balancing of memory between the different
allocations. Here "twice" is optimistic, since Stata (I suspect) also
remembers the sequence of graphical commands needed to repaint the
graph, a la a metafile, which can turn "twice" into "triple" in the worst
case. The worst case (as I see it) is a very long, narrow dataset, say
x-y pairs for a scatter, that occupies hundreds of megabytes
(see #1).
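(A small sketch to see the serset machinery at work; the graph and file names
are just examples:)
sysuse auto, clear
scatter price mpg, name(g1, replace)
serset dir                       // should list the serset(s) holding the graph's data in memory
graph save g1 myscatter.gph, replace
// browsing myscatter.gph with a hex viewer shows the <BeginSersetData> ...
// <EndSersetData> block that lets Stata replay the graph later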
4. From what I know, Vista should do better than XP in giving Stata
more memory. A typical limit for Stata 10 would be about 700-800 MB on
32-bit XP and about 1500 MB on 32-bit Vista.
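(To see the limit on a given machine, one can simply probe it; the numbers
below are only examples, and the larger requests are expected to fail on
32-bit XP:)
clear
set mem 700m     // usually fine on 32-bit XP
set mem 1200m    // may already draw "op. sys. refuses to provide memory" on XP
set mem 1500m    // more likely to succeed on 32-bit Vista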
Best regards,
Sergiy Radyakin