How to resolve Arcpy ExecuteError: ERROR 010005: Unable to allocate memory. Failed to Execute (ExtractByMask) - out-of-memory

I am running an arcpy script from the Spyder IDE. The script uses a for loop that performs a number of geoprocessing operations in each iteration as it works through a table of inputs. I am running it on a Surface Pro laptop with 16 GB of memory.
After a few hundred iterations, I get ExecuteError: ERROR 010005: Unable to allocate memory. Failed to execute (ExtractByMask). The error seems to be raised by an ExtractByMask step each time.
When I first run the script in Spyder, I get through about 1,000 iterations with no problems. But if I run it again straight afterwards, it terminates with the error after a few hundred more iterations at most. I could restart Spyder each time and work in smaller batches (of, say, 1,000), but this is quite tedious when I am trying to get through about 20,000 iterations.
When I check my laptop's system memory while the arcpy script is running, it only appears to be at about 30% usage, so I don't think system memory is the problem. I have read a few posts suggesting it could be an environment setting associated with ArcGIS. Has anyone encountered this and found a solution?
The only workaround so far is to quit Spyder, reopen it, and run the loop on smaller batches of data.

Related

"Cannot allocate vector of size xxx mb" error, nothing seems to fix

I'm running RStudio x64 on Windows 10 with 16GB of RAM. RStudio seems to be running out of memory for allocating large vectors, in this case a 265MB one. I've gone through multiple tests and checks to identify the problem:
Memory limit checks via memory.limit() and memory.size(). Memory limit is ~16GB and size of objects stored in environment is ~5.6GB.
Garbage collection via gc(). This removes some 100s of MBs.
Upped priority of rsession.exe and rstudio.exe via Task Manager to real-time.
Ran chkdsk and RAM diagnostics on system restart. Both returned no errors.
But the problem persists. It seems to me that R can access 16 GB of RAM (and Resource Monitor shows 16 GB committed), but it is somehow still unable to allocate a large vector. My main confusion is this: the problem only appears if I run code on multiple datasets consecutively, without restarting RStudio in between. If I do restart RStudio, the problem goes away, at least for the next few runs.
The error should be replicable with any large R vector allocation (see e.g. the code here). I'm guessing the fault is software, some kind of memory leak, but I'm not sure where or how, or how to automate a fix.
Any thoughts? What have I missed?
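For reference, here is a minimal sketch of the consecutive-run pattern described above, including the explicit cleanup between datasets that was already tried; the file names and the per-dataset computation are placeholders, not from the original post:
datasets <- c("data1.rds", "data2.rds", "data3.rds")   # placeholder file names
for (f in datasets) {
  dat <- readRDS(f)                    # load one large dataset
  res <- colMeans(dat)                 # stand-in for the real per-dataset analysis
  saveRDS(res, paste0("result_", f))   # write the result out
  rm(dat, res)                         # drop references to the large objects
  gc()                                 # ask R to release the freed memory
}
Even with the rm()/gc() calls between runs, the "cannot allocate vector" error can still reappear on later datasets, which matches the behaviour described above.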

Stop submitted lines of code from running

I'm running a long R script, which takes 2 or 3 days to finish. I accidentally ran another script which, if R behaves as it usually does, will go into some queue and be run as soon as the first script is over. I need to stop that, as it would compromise the results from the first script. Is there a visible queue, or any other way to stop R from running that code?
I'm working in an interactive session in RStudio, on Windows 10.
Thanks a lot for any help!
Assuming you're running in the console (or an interactive session in RStudio; that isn't clear from your question), and that what you did was source a script or paste code and then, while it was running, paste another chunk of code:
What is happening is that you pushed data into the R process's input stream. The input is buffered, so each line will run once the previous line's call has finished and freed the process.
There's no easy way to manipulate that input buffer; it belongs to R's internal input/output system, and for the most part it is the operating system that holds this information in its cache for now.
Asking R itself is not possible, as it already has this buffer to read; any new command would simply be queued after it.
As a last resort: if you can spot the second chunk of code starting to run in your console, you can try pressing the Esc key to stop it.
You could try messing with the process buffers using Process Explorer (procexp), but there's a fair chance you'll just make your R session segfault anyway.
To avoid this in the future, keep your code in scripts and run them separately from the command line with Rscript (which is present in R's bin directory on Windows too, even though the link points to a Linux manpage).
This creates one session per script and lets you kill them independently; an example is sketched below. That said, if both scripts write to the same place (a database, say; a plain file would raise an error if accessed by two processes), this won't prevent data corruption.
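For example, assuming the two jobs live in files called long_analysis.R and other_task.R (hypothetical names), each call below starts a separate R session that you can kill on its own:
# run these from a command prompt or shell, not from inside R
Rscript long_analysis.R
Rscript other_task.R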
I am guessing the OP has the problem below:
# my big code, running for a long time
Sys.sleep(10); print("hello 1")
# other big code I dropped in console while R was still busy with above code
print("hello 2")
If this is the case, I don't think it is possible to stop the second chunk of code from running.

Long R system call hangs

I would greatly appreciate some help with the following:
I am simply running 3 instances of a standalone app from R in foreach in parallel. Please see the mock code below:
require("foreach")
require("doMC")
registerDoMC(cores=3)
foreach(sample=1:9) %dopar%{
system2(command="app", args=c("some","args"),
stdout = NULL, stderr = NULL)
}
Regardless of whether I use system or system2, in the task manager I can see:
rsession (parent)
3x rsessions (from foreach)
3x app (processes)
The problem is that the parent process uses a lot of CPU and keeps consuming RAM until the machine falls into swap. This happens for any app and any number of workers in foreach; essentially, any system call from R that runs longer than some negligible time hangs and never returns.
My machine has Debian Jessie and R 3.2.3.
Thank you in advance!
OK, I have found what was causing the problem: RStudio. The parent rsession was likely scanning for file changes and stalling somewhere until the child process finished. Running R from the console shows no such problem.

Consistent unknown and fatal error in R / RStudio

Is there any way I can find out what happened that resulted in the crash?
I have tried running the script multiple times now, and it seems to crash after a random amount of work.
All I get from RStudio is a generic fatal error message, which isn't of any help.
As for memory: I am trying to construct a matrix with 34 million entries.
I monitored the memory usage during execution and concluded that there wasn't a big increase in memory; usage was roughly constant at about 580 MB.
The crash only occurs in RStudio, not in the plain R interpreter.
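As a rough sanity check on the memory side, a dense numeric matrix with 34 million double-precision entries occupies only about 260 MB, which is consistent with the roughly 580 MB observed and suggests the matrix itself is not what exhausts memory; a quick way to confirm the figure:
# approximate size of a 34-million-entry double-precision matrix, in MB
n_entries <- 34e6
bytes_per_double <- 8
n_entries * bytes_per_double / 1024^2    # ~259 MB
# or allocate one and measure it directly:
print(object.size(matrix(0, nrow = 34000, ncol = 1000)), units = "MB")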

IntelliJ hangs on generating a large number of graphs

I'm running Java code in IntelliJ which calls an R script that is supposed to generate about 2,000 graphs. But when I run the code it generates about 200 and then just hangs; it doesn't do anything further.
I initially thought it had something to do with the memory allocated to the application, so I increased the allocated memory following this link: http://sohu.io/questions/870094/intellij-increase-scalatest-heap-space
But it still doesn't make any difference.
I also tried running the same code from the command line, but it generates the same number of graphs and hangs again.
What could be the issue?
