How to increase the memory limit in RStudio?

If I type
memory.limit()
in the console of Rstudio, I get "1e+13". Then, if I try to increase this memory limit by doing
memory.limit(1e+20)
or
memory.limit(size=1e+20)
I get a warning: "Warning message: In memory.limit(size = 1e+20) : cannot decrease memory limit: ignored". This makes no sense to me, since 1e+20 is larger than the current limit. How can I increase this memory limit? (I am running Windows 10 with 24GB of RAM.)
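For context, a minimal sketch of how memory.limit() is normally used: it is Windows-only, it reports and accepts sizes in megabytes, and values far beyond physical RAM may not be honored. (Note that in R 4.2.0 and later the function is defunct and the limit can no longer be set this way; the calls below assume an older R on Windows.)

```r
# memory.limit() works only on Windows and uses megabytes (Mb).
# On a 24 GB machine, request up to the physical RAM explicitly:
memory.limit()              # current limit in Mb
memory.limit(size = 24000)  # request ~24 GB; must exceed the current limit
```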

Related

"Error: cannot allocate vector of size x kb" when loading R workspace

I am encountering an issue when attempting to load a 1.8GB workspace into RStudio. I receive a string of messages such as:
Error: cannot allocate vector of size 8 Kb
Error: cannot allocate vector of size 64 Kb
Error: cannot allocate vector of size 16 Kb
Error: cannot allocate vector of size 256 Kb
Error: cannot allocate vector of size 32 Kb
etc.
The objects appear in my Global Environment but attempting to call them yields further errors such as those above.
In addition, my PC should have ample (>10GB) RAM for this workspace, but in Task Manager RAM usage is shown as ~80%.
I have tried:
Restarting everything
gc()
Using 64 bit version
I couldn't find any other solutions.
Many thanks
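One diagnostic sketch (not from the thread above): load the workspace into its own environment and rank the objects by size, which shows what is consuming memory without flooding the global environment. A tiny throwaway .RData file stands in here for the real 1.8GB workspace.

```r
# Create a small stand-in workspace file (placeholder for the real one).
f <- tempfile(fileext = ".RData")
x <- runif(1e4); y <- letters
save(x, y, file = f)

# Load into a separate environment instead of globalenv().
e <- new.env()
load(f, envir = e)

# Rank the loaded objects by size, biggest first.
sizes <- sapply(ls(e), function(n) object.size(get(n, envir = e)))
sort(sizes, decreasing = TRUE)
```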

Managing memory.size() with foreach parallel threads in R

When I run memory.size(max=NA) I get: [1] 16264
But memory.size(max=T) gives me [1] 336.88
When I look at Task Manager, the 4 threads are using a total of ~1,000 MB (1/16 of my 16GB of available RAM) but they are using 100% of my CPU. While running, all processes combined are only using 50% of my 16GB of available RAM.
Whenever I try to increase memory allocation with memory.size(max=1000), I get the warning message:
Warning message:
In memory.size(max = 1000) : cannot decrease memory limit: ignored
What is going on here?
1) Is my CPU just slow given the amount of RAM I have? (Intel i7-6500U 2.5 GHz)
2) Does memory allocation require additional steps when using parallel threading? (e.g. doParallel)
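On question 2, a relevant point: each doParallel worker is a separate R process with its own address space, so memory.size()/memory.limit() in the master session do not govern the workers. A minimal sketch, assuming the doParallel package is installed:

```r
# Minimal doParallel sketch: workers are separate R processes, so the
# master's memory.size()/memory.limit() values do not apply to them.
library(doParallel)

cl <- makeCluster(2)        # two worker processes
registerDoParallel(cl)
res <- foreach(i = 1:4, .combine = c) %dopar% i^2
stopCluster(cl)

res  # 1 4 9 16
```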

How to extend Allowed memory size in PHPExcel Library?

I want to extend the allowed memory size. Currently my Excel downloads fail, and the library throws the following error:
Fatal error: Allowed memory size of 268435456 bytes exhausted (tried
to allocate 32 bytes) in application/libraries/Excel/PHPExcel/Cell.php
on line 839
Increase the memory limit in your PHP script before PHPExcel runs:
ini_set('memory_limit', '1024M');
After this the memory limit will be 1 GB. (Alternatively, set memory_limit = 1024M in php.ini to change it globally.)

In R Studio I am getting Java Out of Memory (for RWeka)

Ok, this looks familiar from the Java world. Where/how can I allow more memory for RWeka in RStudio?
Error in .jcall("RWekaInterfaces", "[S", "tokenize", .jcast(tokenizer, :
java.lang.OutOfMemoryError: GC overhead limit exceeded
Not sure how R interfaces to Java and if I can allow more heap space.
Thanks in advance
Gary
Yes, you can increase the heap space, but you must set the option before loading any rJava-dependent package (in this case, RWeka), because the JVM heap size is fixed once the JVM starts.
To increase the default (512 MB) to 1024 MB:
options(java.parameters = "-Xmx1024m")
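A sketch for verifying the setting actually took effect, assuming the rJava package is installed (the option must be set before rJava initializes the JVM, so this only works in a fresh session):

```r
# Set the heap ceiling, then ask the JVM what it actually got.
options(java.parameters = "-Xmx1024m")
library(rJava)
.jinit()
rt <- .jcall("java/lang/Runtime", "Ljava/lang/Runtime;", "getRuntime")
.jcall(rt, "J", "maxMemory") / 1024^2   # max heap in MB, roughly 1024
```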

Is there a 2GB memory usage limit when R boots?

I have the following code for loading some data in my .Rprofile (which is a R script in my project folder running automatically when I switch to the project with Rstudio).
data_files <- list.files(pattern = "\\.(RData|rda)$")
if ("data.rda" %in% data_files) {
  attach(what = "data.rda", pos = 2)
  cat("The file 'data.rda' was attached to the search path under 'file:data.rda'.\n\n")
}
The data being loaded is relatively big:
Name                    Type        Size (bytes)  PrettySize  Rows     Columns
individual_viewings_26  data.frame  1547911120    1.4 Gb      3685312  63
viewing_statements_all  data.table  892316088     851 Mb      3431935  38
weights                 data.frame  373135464     355.8 Mb    3331538  14
pet                     data.table  63926168      61 Mb       227384   34
But I have 16 GB and I can allocate them:
> memory.limit()
[1] 16289
When my data was not as big, I did not have any issue. I recently saved some more data frames in data.rda, and now my R session fails at start-up (when I switch to the project in RStudio and .Rprofile is executed):
Error: cannot allocate vector of size 26.2 Mb
In addition: Warning messages:
1: Reached total allocation of 2047Mb: see help(memory.size)
2: Reached total allocation of 2047Mb: see help(memory.size)
3: Reached total allocation of 2047Mb: see help(memory.size)
4: Reached total allocation of 2047Mb: see help(memory.size)
I suspect that for some reason, the memory limit is set at 2GB at boot? Any way I can change that?
Edit: Added OS and software version
> sessionInfo()
R version 3.2.2 (2015-08-14)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows 7 x64 (build 7601) Service Pack 1
Edit2: Just to clarify, I am able to load the data myself by running the code; I have plenty of available memory, and the R process commonly uses up to 10GB during my daily work. The problem is, there is apparently a 2GB memory limit when R boots and executes the .Rprofile...
Yes, there is a 2GB limit when R starts, at least while the user profile (.Rprofile files and .First() functions) is being executed.
Proof:
Content of .Rprofile:
message("Available memory when .Rprofile is sourced: ", memory.limit())
.First <- function() {
  message("Available memory when .First() is called: ", memory.limit())
}
Output at startup
Available memory when .Rprofile is sourced: 2047
Available memory when .First() is called: 2047
Output of memory.limit once R has started
> memory.limit()
[1] 16289
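Given the proof above, one possible workaround sketch is to defer the heavy attach until the session has finished starting, when the full limit is in place. This relies on RStudio's "rstudio.sessionInit" startup hook (available in RStudio 1.1 and later); the file name comes from the question.

```r
# Workaround sketch for .Rprofile: postpone the attach until RStudio has
# finished initializing the session, when the 2GB boot limit no longer applies.
setHook("rstudio.sessionInit", function(newSession) {
  if (file.exists("data.rda")) {
    attach(what = "data.rda", pos = 2)
    cat("The file 'data.rda' was attached to the search path.\n")
  }
}, action = "append")
```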