Hi Benjamin:
I've had similar issues on Windows machines, especially when I've got
other programs running in the background. You should try watching the
memory allocation in your task manager when you run these commands.
After a frustrating experience trying to 'step up' my memory to its
maximum in a way similar to what you describe, I wrote this program,
which keeps requesting memory in steps until it hits the machine's
limit...you might try it out:
*********************
clear
discard
program define givemem
    syntax [, st(integer 100) max(integer 12000) stepby(integer 1000)]
    clear
    set matsize 800
    qui set virtual on  //-->you may want to turn this off
    qui query mem
    local x = floor(r(memory))
    *display as txt " MEM is currently " as result (`x') " kilobytes(K)"
    local e = `x'/1000
    local e2 = `e'/1000
    display as txt "... MEM is currently " as result "`e' MB or `e2' G"
    //FIND START RANGE: current allocation in MB, or st(), whichever is larger
    local start = max(floor(`e'), `st')
    di as txt "GIVEMEM will maximize mem starting at: `start' increasing in steps of: `stepby'"
    //NOW, USE LOOPING TO FIND THE MAX
    //-capture- swallows the r(909) error when -set memory- is finally refused,
    //leaving memory at the last allocation that succeeded
    capture {
        forvalues giveme = `start'(`stepby')`max' {
            set memory `giveme'm
            quietly query mem
            display as txt "`r(memory)'"
        }
    }
    query mem
end
** **
//EXAMPLE TO RUN
givemem , st(200) max(6400) stepby(500)
***********************
There are probably ways to improve this code, but it seems to work
pretty well for me.
I just save it in my personal folder as givemem.ado. (Be
careful: because virtual memory on a Mac is unlimited when you are
on a network, it may try to grab all the free hard drive space on the
network as virtual memory. I ran it once and it grabbed 4.6 TB of
virtual memory from the free resources available on the department's
network!)
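If that network behavior worries you, one option (my suggestion, not
something I have benchmarked) is to edit the -set virtual on- line
inside givemem.ado and keep max() modest, so the search cannot spill
over onto network storage:
*********************
* inside givemem.ado, change:
*     qui set virtual on
* to:
*     qui set virtual off
* then run with a conservative ceiling:
givemem , st(200) max(2000) stepby(250)
*********************
With virtual memory off, the loop simply stops with r(909) captured
once physical memory is exhausted.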
Best,
Eric
__
Eric A. Booth
Public Policy Research Institute
Texas A&M University
[email protected]
Office: +979.845.6754
On Jul 20, 2009, at 12:25 PM, BENJAMIN SCHWAB wrote:
I'm encountering an issue with setting memory that I have not seen
covered by the large database FAQ or in the Statalist archives.
When I open Stata and use 'set mem xxxm' to request a large memory
allocation (it seems to be anything over 500m), I sometimes get
the r(909) error saying "op. sys. refuses to provide memory". The
strange part, however, is that if I set the memory to a much smaller
number, e.g. set mem 250m, and then gradually raise the memory
allocation, I can get up to and past the allocation that was
originally refused.
For example, today I did the following:
set mem 800m
op. sys. refuses to provide memory
r(909)
set mem 250m
set mem 500m
set mem 800m
op. sys. refuses to provide memory
r(909)
set mem 700m
set mem 800m
set mem 850m
I can't figure out why this occurs. This can be a real pain for
programming purposes, since it's difficult to predict what exact
memory allocations will be accepted from day to day (I'm working
with large datasets).
I'm running Stata/SE 10.1 with XP SP3. I'd love to hear the input
of those more familiar with the workings of Stata. Right now, I'm
stumped.
*
* For searches and help try:
* http://www.stata.com/help.cgi?search
* http://www.stata.com/support/statalist/faq
* http://www.ats.ucla.edu/stat/stata/
*