Memory limit in RStudio on a Windows Server 2012 VM

I am using RStudio on a Windows Server 2012 virtual machine.
I get the memory allocation error "Error: cannot allocate vector of size 6.9 GB" when I try to merge two tables.
I have tried all the steps I found in other Stack Overflow answers, with no luck:
I used the gc() function.
I increased the memory limit with memory.limit(size=500000).
I checked that the version of R in use is 64-bit.
I checked that the OS is 64-bit.
I have played around with different VM settings, going up to 128 GB of RAM.
Is there a hard limit in Windows Server that cannot be changed?
Thanks a lot.
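For reference, a minimal sketch of the checks described above (table1, table2 and the merge key are placeholders, not the actual data):
# Confirm the session is 64-bit R: should print 8 (bytes per pointer)
.Machine$sizeof.pointer
# Free unused memory before the merge
gc()
# Raise the Windows memory limit (value in MB); memory.limit() is Windows-only and defunct from R 4.2.0
memory.limit(size = 500000)
# The merge that triggers the allocation error
merged <- merge(table1, table2, by = "id")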

Related

Memory Limit in R

I'm relatively new to R and seem to be having a memory limit issue with a new laptop. When I run models on a large dataset of about 50K survey respondents and over 500 variables, I receive the error message: Error: cannot allocate vector of size 413 Kb
I got around this issue on my old laptop by increasing the memory limit with memory.limit(size = 10000). Everything worked fine there, but on my new laptop, which is faster and more powerful, the memory limit fills up very quickly and R crashes even at size 27000 after I run about 7 models.
I have tried closing all unnecessary programs, removing all unneeded objects in R, and running garbage collection with gc(). I was on the latest version of R, 4.1.x, and have now gone back to 4.0.4, which worked fine on my old PC, but none of these really help.
I am running the 64-bit version of R on a 64-bit PC that has 8 GB of RAM.
Does anyone know why this might be occurring on a brand-new laptop that runs faster, when my four-year-old PC was slower but at least worked?
Also, how high can you set the memory limit, given that the manual says R can handle up to 8 TB? And how do you set a memory limit permanently?
Thanks
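One way to make a limit stick across sessions is to set it from a startup file; a minimal sketch, assuming a pre-4.2 Windows build of R (memory.limit() was made defunct in R 4.2.0) and using the 27000 MB figure above purely as an example:
# In ~/.Rprofile or R_HOME/etc/Rprofile.site, executed at every startup
if (.Platform$OS.type == "windows") {
  memory.limit(size = 27000)  # value in MB; documented ceiling on 64-bit Windows is 8 TB
}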

How To Limit the Amount of RAM RServer/RSession Uses on Windows

I need to limit the amount of RAM RServer/RSession uses on our Windows 2012 server.
I am a system admin with two users running RStudio on our server. Each RStudio instance spawns one or more RSession.exe processes. Sometimes one of the RSession.exe processes takes up too much RAM. Yesterday, one grew to about 60 GB, bringing almost everything else on the machine to a halt. (We have 72 GB available, but 48 GB of that is allocated to SQL Server.) I need to make sure that using R doesn't bring down my server or affect other applications running on it.
Info:
sessionInfo()
R version 3.4.3 (2017-11-30)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows Server >= 2012 x64 (build 9200)
As suggested in some forums, I tried adding --max-mem-size=4000M to the shortcut they use to start R. This works with RGui.exe, but doesn't work with RStudio.exe. When I run memory.limit() from RGui, it returns 4000, as expected. When I run memory.limit() from RStudio, it returns 71999, regardless of the startup parameters. And, in any case, I'm not certain this is the right place to set limits. Would a command-line switch like this for RStudio affect the RSession processes it spawns?
I also tried running memory.limit(4000) from within RStudio, but got the following message:
“Warning message: In memory.limit(4000) : cannot decrease memory limit: ignored”
I've also read something about setting memory limits in user profiles, but all of the instructions for doing so are Unix-oriented and refer to files and directories that don't exist on our Windows server. (Sorry, I'm not an R person, just a poor sysadmin trying to keep his server alive…)
One post noted that support.rstudio.com said on 2014/06/10: "We've got it on our list of things to investigate and hope to have a solution soon" (How to set memory limit in RStudio (desktop version)?). Is this a bug? If so, what work-arounds are there? If not, how can I limit RSession memory usage so it doesn't bring down my server?
Thank You,
Robbie
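One workaround sometimes suggested (an assumption on my part, not an RStudio-documented fix) is to set the limit through the R_MAX_MEM_SIZE environment variable in a site-wide Renviron file, which every R session on the machine reads at startup, including the RSession processes RStudio spawns; whether RStudio's session honours it the way RGui does would still need to be verified. A minimal sketch:
# In R_HOME/etc/Renviron.site (the 4000M value is illustrative and uses the same
# size syntax as the --max-mem-size command-line flag)
R_MAX_MEM_SIZE=4000M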

Increasing memory size in R on Linux

I am using Linux through a virtual machine (I need to in order to run R code that uses Linux-specific commands). I am using R version 3.3.1 x86_64-pc-linux-gnu on my virtual machine. I want to allocate a larger memory size for R, since my code fails to finish due to memory size issues. I know that in R on Windows you can use memory.limit(size=specify_size) to increase the allocated memory; how would I do the same on Linux in a straightforward fashion?
Compiled from comments:
On Linux, R will use everything it can; unlike Windows, Linux does not cap an application's memory allowance. If the code requires more memory than the VM has available, more memory should be allocated when setting up the virtual machine.
The problem was solved by increasing the Base Memory of the Virtual Machine as well as increasing the CPU from 1 to 4. The code is running well now.
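A minimal sketch of confirming from within R on Linux that the resized VM actually exposes the extra memory and CPUs:
# Total and available memory as reported by the kernel
system("free -h")
# Or read it straight from /proc/meminfo
meminfo <- readLines("/proc/meminfo")
meminfo[grepl("MemTotal|MemAvailable", meminfo)]
# Number of cores the VM now exposes (the CPU count was raised from 1 to 4)
parallel::detectCores()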

How to override the 2GB memory limit when R starts

When R starts, the memory limit (as returned by memory.limit()) is set to 2 GB, regardless of the memory available on the computer (I found that out recently). I imagine that at some point in the startup process, this limit is raised to the actually available memory.
This can be seen by printing memory.limit() in the .Rprofile file, which is sourced at startup: it prints "2047". On the other hand, once R has finished booting and I type memory.limit() in the console, I get "16289".
I use a custom .Rprofile file and I need access to more than 2 GB during startup.
How can I override this limit?
My current workaround is to set the limit myself in the .Rprofile with memory.limit(size=16289), but then I have to edit this every time I work on a computer with a different amount of RAM, which happens fairly often.
Is there an option I can change, a .ini file I can edit, or anything I can do about it?
OS and R version:
> sessionInfo()
R version 3.2.2 (2015-08-14)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows 7 x64 (build 7601) Service Pack 1
Edit: this is not a duplicate, at least not a duplicate of the proposed question. It is not about managing available memory! I have 16GB of memory and memory.limit() shows that my limit is indeed 16GB.
It all started when I got the warning that I had "reached 2GB memory allocation" (implying that I had a 2 GB memory limit). After investigation, it appears that R does indeed limit the memory to 2 GB during the startup process.
I want to load my data automatically when R starts; for this I have a small loading script in the .Rprofile. I load more than 2 GB of data, hence I need access to my full 16 GB. My question is about achieving this. It has nothing at all in common with the proposed duplicate, except keywords...
I'm interpreting this as you wanting memory.limit(size=16289) in your .Rprofile file, without having to hard-code the specific number every time you change to a computer with a different amount of memory. Why not just pull the memory dynamically? On Windows:
# Total physical memory visible to the OS, reported by wmic in KB; convert to MB
TOT_MEM <- as.numeric(gsub("\r", "", gsub("TotalVisibleMemorySize=", "",
  system('wmic OS get TotalVisibleMemorySize /Value', intern = TRUE)[3]))) / 1024
memory.limit(size = TOT_MEM)
which would set the memory limit to the total memory of the system, or
# Physical memory currently free, reported by wmic in KB; convert to MB
FREE_MEM <- as.numeric(gsub("\r", "", gsub("FreePhysicalMemory=", "",
  system('wmic OS get FreePhysicalMemory /Value', intern = TRUE)[3]))) / 1024
memory.limit(size = FREE_MEM)
which would set memory.limit to the total available memory on boot.
Place this in your .Rprofile, above the point where you load your data.

In 64-bit R, what should my memory.limit() be set to?

I intend to work with some large vectors in R.
memory.limit currently gives:
memory.limit()
[1] 4095
But I think this is the default for 32-bit R, whilst my installation is 64-bit.
What should my memory.limit() be set to in 64bit R?
memory.limit() is used to cap R's memory usage. It can be set to any value between 0 and the amount of RAM available on the machine.
The "correct" value for working with large data sets is the full amount of RAM available. So, for example, with 4 GB of RAM, 4095 MB is the appropriate value: memory.limit(4095).
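A minimal sketch of querying and raising the limit on 64-bit Windows R (memory.limit() is Windows-only and was made defunct in R 4.2.0):
# Current limit in MB (4095 here corresponds to roughly 4 GB of RAM)
memory.limit()
# Raise the limit to the full 4 GB, as the answer recommends
memory.limit(size = 4095)
# Maximum memory actually obtained from the OS so far in this session
memory.size(max = TRUE)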
