java.lang.OutOfMemoryError using bartMachine package in R

I ran a BART model with 11,000 samples and 20 features (half of them categorical variables). My Mac has 8 GB of RAM. At first I set the memory to 5000 MB via the function set_bart_machine_memory(5000).
I can then fit a model through the bartMachine function once. If I try to run another model, R returns an error like this:
Exception in thread "pool-10-thread-1" Exception in thread "pool-10-thread-3"
java.lang.OutOfMemoryError: Java heap space
java.lang.OutOfMemoryError: Java heap space
Exception in thread "pool-10-thread-2" java.lang.OutOfMemoryError: Java heap space
Exception in thread "pool-10-thread-4" java.lang.OutOfMemoryError: Java heap space
Error in .jcall(bart_machine$java_bart_machine, "Z", "isDestroyed") :
java.lang.OutOfMemoryError: Java heap space
I figured that having two bartMachine objects in memory might not be a good idea, so I killed the first model via destroy_bart_machine(); after that, the second model runs fine.
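For illustration, a minimal sketch of that workaround, using the same functions the question names (X and y stand in for hypothetical training data):
set_bart_machine_memory(5000)   # as above; set before any model is built
bm1 <- bartMachine(X, y)        # first model fits fine
destroy_bart_machine(bm1)       # frees the Java heap held by the first model
bm2 <- bartMachine(X, y)        # the second model now fits as well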
The main problem is with bartMachineCV(). By default there are about 20 models to fit, and a memory error like the one above hits me while R is running the model with the second set of parameters (that is: bartMachine CV try: k: 2, nu/q: 3/0.9, m: 200).
I'm not familiar with Java; is there some way to run bartMachineCV() on an 8 GB RAM computer? Thanks.

I'm the maintainer of the bartMachine package. Make sure you download the new version and pay attention to the message that appears after you load the library:
> library(bartMachine)
...
Welcome to bartMachine v1.2.0! You have 0.48GB memory available.
If the message reports a low amount of RAM, something is wrong with your JVM setup. A 64-bit JVM is a must. Use
options(java.parameters = "-Xmx2500m")
before calling library(bartMachine) to attempt to set more.
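For example, on the 8 GB machine from the question, something like the following should work (the exact heap value is my assumption; leave headroom for the OS and for R itself):
options(java.parameters = "-Xmx5000m")  # request a ~5 GB Java heap before the JVM starts
library(bartMachine)                    # the welcome banner should now report ~5 GB available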

You'll need to run a 64-bit JVM; a 32-bit JVM only gives you a maximum heap of about 1.8 GB. I'd recommend JDK 7 or higher; that's Oracle's production release these days.
Once you have that, you can set JVM memory options as documented here:
http://www.oracle.com/technetwork/java/javase/tech/vmoptions-jsp-140102.html
You'll want to set -Xmx1024m or something like that.

Related

In R Studio I am getting Java Out of Memory (for RWeka)

OK, this looks familiar from the Java world. Where/how can I allow more memory for RWeka in RStudio?
Error in .jcall("RWekaInterfaces", "[S", "tokenize", .jcast(tokenizer, :
java.lang.OutOfMemoryError: GC overhead limit exceeded
I'm not sure how R interfaces with Java, or whether I can allow more heap space.
Thanks in advance,
Gary
Yes, you can increase the heap space.
You have to do it before loading any rJava-dependent package, in this case RWeka.
To increase the default (512 MB) to 1024 MB:
options(java.parameters = "-Xmx1024m")
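A minimal end-to-end sketch; the rJava calls at the end are a common idiom for verifying that the heap actually grew, not part of RWeka itself:
options(java.parameters = "-Xmx1024m")  # must run before any rJava-backed package loads
library(RWeka)
library(rJava)
rt <- .jcall("java/lang/Runtime", "Ljava/lang/Runtime;", "getRuntime")
.jcall(rt, "J", "maxMemory") / 1024^2   # max heap in MB; should be close to 1024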

How to initialise JVM with larger maximum heap size using rJava

I am attempting to make use of the Stanford NLP tools (Java) in R, using the rJava package. When attempting to create a StanfordCoreNLP object I get the following error:
Error in .jnew("edu/stanford/nlp/pipeline/StanfordCoreNLP", props) : java.lang.OutOfMemoryError: Java heap space
To resolve this, I have attempted to initialise the JVM with a larger maximum heap size, using variations of the following code:
.jinit(parameters=c("-Xms1g","-Xmx4g"))
When the maximum heap is set to 1 GB with -Xmx1g, the JVM loads but I continue to get the OutOfMemoryError. When the maximum heap is set to 2 or 3 GB (-Xmx2g or -Xmx3g), R stops responding. When it is set to 4 GB or higher (-Xmx4g), I get the following message:
Error in .jinit(parameters = c("-Xms1g", "-Xmx4g"), force.init = TRUE) : Cannot create Java virtual machine (-6)
How do you successfully initialise the JVM using rJava with values larger than 1 GB? I am using 32-bit versions of Java (v8 u51) and R (v3.2.0).
I am using 32bit versions of Java (v8 u51) and R (v3.2.0)
That's your problem right there. Switch to the 64-bit versions.
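A quick way to confirm which Java you're on from within R (assuming java is on your PATH), plus the question's original call, which should succeed once 64-bit versions are installed:
system("java -version")                      # output should mention a 64-Bit VM
library(rJava)
.jinit(parameters = c("-Xms1g", "-Xmx4g"))   # works on a 64-bit JVM with enough free RAM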

Git-svn out of memory

I'm trying to clone a reasonably big SVN repository with git-svn, and at a certain point I get an error message:
Failure loading plugin: APR: Can't create a character converter from 'UTF-8' to native encoding: Cannot allocate memory at /usr/libexec/git-core/git-svn line 5061
And sometimes a
Cannot allocate memory: zlib (compress2): out of memory: Compression of svndiff data failed at /usr/libexec/git-core/git-svn line 5061
error message. I still have ~3 GB of RAM free. What should I do so git-svn can use it?
(I'm doing this on Red Hat Enterprise Linux 6.5, if that makes any difference.)
From:
This error message is about the memory git is trying to allocate --
it's more than what is free. This is most likely caused by a large
file having been checked into SVN. Unfortunately, there's no easy way
to fix it (apart from buying more memory) -- you would have to remove
the large file and the commit adding it from SVN.
However, try the following:
Increase swap space
Increase the ulimit

R: Cannot allocate memory greater than x MB

I have a main function in R which calls other files to run my program. I call the main file through a bat file (.exe). When I run the code line by line it runs without a memory error, but when I call the bat file, it halts and gives me the following error:
Cannot allocate memory greater than 51 MB.
How can I avoid this?
Memory limitations in R such as this are a recurring nightmare for a lot of us.
Very often the problem is a limit imposed by your OS (which can usually be changed from a Bash or PowerShell command line), your architecture (32- vs. 64-bit), or the availability of contiguous free RAM, regardless of the overall available memory.
It's hard to say why something would not cause a memory issue when run line by line, but would hit the memory limit when run as a .bat.
What version of R are you running? Do you have both installed? Is a 32-bit build being called by Rscript when you run your .bat file, whereas you run a 64-bit build line by line? You can check which version of R is running with R.Version().
You can test this by running memory.limit() both in your R IDE/terminal and in your .bat file (be sure to print or save the result in the .bat run). You might also try setting memory.limit() in your .bat file, as it may just have a smaller default, perhaps due to a difference in the R profile invoked by your IDE or terminal versus the .bat file.
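For example, a check you could drop into both environments and compare (note memory.limit() is Windows-only and was removed in later versions of R; the 4000 MB figure is just an illustration):
print(R.Version()$arch)       # "x86_64" for 64-bit R, "i386" for 32-bit
print(memory.limit())         # current limit in MB
memory.limit(size = 4000)     # attempt to raise the limit to ~4 GB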
If architecture isn't the cause of your memory error, then you have several more troubleshooting steps to try:
Check memory usage in both environments (in R directly and via your .bat process) using this:
sort(sapply(ls(), function(x) object.size(get(x))))
Run the garbage collector explicitly in your scripts; that's the gc() command
Check all object sizes to make sure there are no unexpected results in your .bat process; to get human-readable units, convert after sorting (sorting the formatted strings would order them alphabetically, not numerically):
sizes <- sort(sapply(ls(), function(x) object.size(get(x))))  # bytes, ascending
round(sizes / 1024^2, 1)                                      # sizes in Mb
Try memory profiling:
Rprof(tf <- "rprof.log", memory.profiling = TRUE)
# ... run the code you want to profile here ...
Rprof(NULL)
summaryRprof(tf)
While this is a RAM issue, for good measure you might want to check that the compute power available is both sufficient and not varying between these two ways of running your code: parallel::detectCores()
Examine your performance with Hadley Wickham's lineprof tool (warning: it requires devtools and doesn't profile lines that call into C code); see the sketch below.
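A minimal lineprof sketch; the package lives on GitHub rather than CRAN, and slow_fn() is a hypothetical stand-in for your own code:
devtools::install_github("hadley/lineprof")
library(lineprof)
# the profiled function generally needs to be source()d so line references exist
prof <- lineprof(slow_fn())   # line-by-line timing and memory allocation
shine(prof)                   # interactive viewer for the results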
References: while I'm pulling these snippets out of my own code, most of them originally came from other related StackOverflow posts, such as:
Reaching memory allocation in R
R Memory Allocation "Error: cannot allocate vector of size 75.1 Mb"
R memory limit warning vs "unable to allocate..."
How to compute the size of the allocated memory for a general type
R : Any other solution to "cannot allocate vector size n mb" in R?
Yes, you should be using 64-bit R if you can.
See this question, and this from the R docs.

SBT runs out of memory

I am using SBT 0.12.3 to test some code, and I often get this error message while testing interactively with the ~test command.
8. Waiting for source changes... (press enter to interrupt)
[info] Compiling 1 Scala source to C:\Users\t\scala-projects\scala test\target\scala-2.10\classes...
sbt appears to be exiting abnormally.
The log file for this session is at C:\Users\t\AppData\Local\Temp\sbt5663259053150896045.log
java.lang.OutOfMemoryError: PermGen space
at java.util.concurrent.FutureTask$Sync.innerGet(Unknown Source)
at java.util.concurrent.FutureTask.get(Unknown Source)
at sbt.ConcurrentRestrictions$$anon$4.take(ConcurrentRestrictions.scala:196)
at sbt.Execute.next$1(Execute.scala:85)
at sbt.Execute.processAll(Execute.scala:88)
at sbt.Execute.runKeep(Execute.scala:68)
at sbt.EvaluateTask$.run$1(EvaluateTask.scala:162)
at sbt.EvaluateTask$.runTask(EvaluateTask.scala:177)
at sbt.Aggregation$$anonfun$4.apply(Aggregation.scala:46)
at sbt.Aggregation$$anonfun$4.apply(Aggregation.scala:44)
at sbt.EvaluateTask$.withStreams(EvaluateTask.scala:137)
at sbt.Aggregation$.runTasksWithResult(Aggregation.scala:44)
at sbt.Aggregation$.runTasks(Aggregation.scala:59)
at sbt.Aggregation$$anonfun$applyTasks$1.apply(Aggregation.scala:31)
at sbt.Aggregation$$anonfun$applyTasks$1.apply(Aggregation.scala:30)
at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:62)
at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:62)
at sbt.Command$.process(Command.scala:90)
at sbt.MainLoop$$anonfun$next$1$$anonfun$apply$1.apply(MainLoop.scala:71)
at sbt.MainLoop$$anonfun$next$1$$anonfun$apply$1.apply(MainLoop.scala:71)
at sbt.State$$anon$2.process(State.scala:170)
at sbt.MainLoop$$anonfun$next$1.apply(MainLoop.scala:71)
at sbt.MainLoop$$anonfun$next$1.apply(MainLoop.scala:71)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
at sbt.MainLoop$.next(MainLoop.scala:71)
at sbt.MainLoop$.run(MainLoop.scala:64)
at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:53)
at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:50)
at sbt.Using.apply(Using.scala:25)
at sbt.MainLoop$.runWithNewLog(MainLoop.scala:50)
at sbt.MainLoop$.runAndClearLast(MainLoop.scala:33)
at sbt.MainLoop$.runLoggedLoop(MainLoop.scala:17)
Error during sbt execution: java.lang.OutOfMemoryError: PermGen space
The error is clear: I could increase the heap size and it might stop throwing that error. But the thing is that sbt shuts down after some number (I don't know how many) of test iterations following a minimal change in the code. Would a simple increase in the heap solve the problem, or do I have to do additional work to avoid running out of memory?
Thanks in advance.
If you haven't already, try giving sbt more PermGen space in your sbt.bat. I don't run sbt on Windows, but I pass java -Xmx1512M -XX:MaxPermSize=512M.
Another thing to try may be to fork during testing: https://www.scala-sbt.org/1.x/docs/Testing.html#Forking+tests
Test / fork := true
specifies that all tests will be executed in a single external JVM.
