How To Limit the Amount of RAM RServer/RSession Uses on Windows

I need to limit the amount of RAM RServer/RSession uses on our Windows 2012 server.
I am a system admin who has two users running RStudio on our server. Each RStudio instance spawns one or more RSession.exe processes. Sometimes one of the RSession.exe processes takes up too much RAM. Yesterday, it grew to about 60GB, bringing almost everything else running on the server to a halt. (We have 72 GB available, but 48 GB of that is allocated to SQL Server.) I need to make sure that using R doesn't bring down my server or affect other applications running on it.
Info:
sessionInfo()
R version 3.4.3 (2017-11-30)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows Server >= 2012 x64 (build 9200)
As suggested in some forums, I tried adding --max-mem-size=4000M to the shortcut they use to start R. This works with RGui.exe, but doesn't work with RStudio.exe. When I run memory.limit() from RGui, it returns 4000, as expected. When I run memory.limit() from RStudio, it returns 71999, regardless of the start parameters. And, in any case, I'm not certain this is the right place to set limits. Would a command-line switch like this for RStudio affect the RSession processes it spawns?
I also tried running memory.limit(4000) from within RStudio, but got the following message:
“Warning message: In memory.limit(4000) : cannot decrease memory limit: ignored”
I’ve also read something about setting memory limits in user profiles, but all of the instructions for doing so are Unix-oriented and refer to files and directories that don’t exist on our Windows server. (Sorry, I’m not an R person, just a poor sysadmin trying to keep his server alive…)
One post noted that support.rstudio.com said on 2014/06/10, "We've got it on our list of things to investigate and hope to have a solution soon" (How to set memory limit in RStudio (desktop version)?). Is this a bug? If so, what workarounds are there? If not, how can I limit RSession memory usage so it doesn’t bring down my server?
Thank You,
Robbie
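One avenue to try, offered only as a sketch (I have not verified how RStudio's rsession.exe behaves here, and the question above reports RStudio ignoring --max-mem-size): base R on Windows also takes its initial limit from the R_MAX_MEM_SIZE environment variable, and the site-wide startup files under R_HOME\etc are read by every R session, including those RStudio spawns. The paths and the 4000 MB figure below are assumptions for illustration, and users can still raise memory.limit() from their own sessions, so this is a soft cap at best.
# Option 1 (assumed path): in C:\Program Files\R\R-3.4.3\etc\Renviron.site add the line
#   R_MAX_MEM_SIZE=4000M
# Option 2 (assumed path): in C:\Program Files\R\R-3.4.3\etc\Rprofile.site cap each new session:
if (.Platform$OS.type == "windows") {
  try(memory.limit(size = 4000), silent = TRUE)  # 4000 MB; raising the startup limit works, lowering an existing one is ignored
}
# After restarting RStudio, memory.limit() in a fresh session should report the cap if rsession honoured it.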

Related

Why would R Shiny app run indefinitely in R Studio?

I wanted to learn how to build Shiny apps in R, so I started this beginner-level tutorial. However, when I run the app on my desktop (Windows 10 x64, 16GB RAM, 500 GB SSD, i5-3470 CPU, dual display) in RStudio 1.3.1093 using R 4.0.3, it loads indefinitely with no error output. I even tried running the basic built-in examples (which you can find here) and they also failed to load. The exact same scripts and examples run on my laptop (Windows 10 x64, 8GB RAM, 250 GB SSD; R & RStudio versions the same) without issue. I've reinstalled the shiny package, reinstalled R and RStudio, and changed whether the app runs internally or externally, with no success. I did find this post which seems to have encountered the same issue, but with no solution.
I know it's not much to go on, but I'm at a loss as to the next thing I should check. Any suggestions?
I figured out from this mostly unrelated post that there was a file at the path C:/Users/.../Documents/R/win-library/4.0/ called 00LOCK which was giving R trouble downloading and updating new packages. I'm not sure how it got there or why R was not telling me there were issues in updating the packages, but the shiny app seems to work perfectly fine now.
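For reference, a minimal sketch of clearing a stale 00LOCK from within R itself; the library path is whatever .libPaths() reports on your machine, and the package name in the last line is only an example:
lib  <- .libPaths()[1]                                      # library where the 00LOCK entry appeared
lock <- list.files(lib, pattern = "^00LOCK", full.names = TRUE)
if (length(lock) > 0) unlink(lock, recursive = TRUE)        # remove leftover lock(s) from a failed install
# install.packages("shiny", INSTALL_opts = "--no-lock")     # optional: retry the install without locking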

R will not run after latest windows 10 updates

I have updated Windows and now R cannot run, and hence neither can RStudio. When I run R GUI it just freezes and is unresponsive. I have allowed a Chromium exemption through the firewall.
I am on the Windows Insider program and have just updated to
Windows 10 Home, Insider Preview
Evaluation Copy. Build 20190.rs_prerelease.200807-1609
Note that R GUI freezes and then shuts down on its own, so maybe the problem is R GUI and not R Studio.
I get the following errors on R studio.
This site can’t be reached
127.0.0.1 refused to connect.
Try:
Checking the connection
Checking the proxy and the firewall
ERR_CONNECTION_REFUSED
Cannot Connect to R
RStudio can't establish a connection to R. This usually indicates one of the following:
The R session is taking an unusually long time to start, perhaps because of slow operations in startup scripts or slow network drive access.
RStudio is unable to communicate with R over a local network port, possibly because of firewall restrictions or anti-virus software.
Please try the following:
If you've customized R session creation by creating an R profile (e.g. located at ~/.Rprofile), consider temporarily removing it.
If you are using a firewall or antivirus software which guards access to local network ports, add an exclusion for the RStudio and rsession executables.
Run RGui, R.app, or R in a terminal to ensure that R itself starts up correctly.
Further troubleshooting help can be found on our website:
Troubleshooting RStudio Startup
This has been fixed with Windows 10 Insider Preview Build 20201 (released on August 26, 2020 in the Dev channel). The previous two builds were missing 64-bit APIs required by the prebuilt version of R.
Same issue here.
Rolling back to the previous build solves the problem.
I think it is related to the update of the graphics features of Windows.
Here is what Microsoft said in the build 20190 changelog:
Improved Graphics Settings experience
While this isn’t a new feature all together, we have made significant changes based on customer feedback that will benefit our customers’ Graphics Settings experience. We have made the following improvements:
We’ve updated the Graphics Settings to allow users to specify a default high performance GPU.
We’ve updated the Graphics Settings to allow users to pick a specific GPU on a per application basis.

"Cannot allocate vector of size xxx mb" error, nothing seems to fix

I'm running RStudio x64 on Windows 10 with 16GB of RAM. RStudio seems to be running out of memory for allocating large vectors, in this case a 265MB one. I've gone through multiple tests and checks to identify the problem:
Memory limit checks via memory.limit() and memory.size(). Memory limit is ~16GB and size of objects stored in environment is ~5.6GB.
Garbage collection via gc(). This removes some 100s of MBs.
Upped priority of rsession.exe and rstudio.exe via Task Manager to real-time.
Ran chkdsk and RAM diagnostics on system restart. Both returned no errors.
But the problem persists. It seems to me that R can access 16GB of RAM (and shows 16GB committed in Resource Monitor), but is somehow still unable to allocate a large vector. My main confusion is this: the problem only appears if I run code on multiple datasets consecutively, without restarting RStudio in between. If I do restart RStudio, the problem doesn't show up again, at least not for a few runs.
The error should be replicable with any large R vector allocation (see e.g. the code here). I'm guessing the fault is software, some kind of memory leak, but I'm not sure where or how, or how to automate a fix.
Any thoughts? What have I missed?
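A small sketch of the kind of between-run cleanup that sometimes helps when the failure looks like address-space fragmentation rather than a genuine lack of RAM; the object names below are placeholders for whatever each run creates:
rm(big_dataset, fit, intermediate_results)  # placeholder names: drop the previous run's large objects
invisible(gc())                             # force a collection so the freed memory can be reused
memory.size()                               # MB currently used by this R session (Windows only), to confirm the drop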

Increasing memory size in R on Linux

I am using Linux through a Virtual Machine (I need to in order to run this R code, which uses Linux-specific commands). I am using R version 3.3.1 x86_64-pc-linux-gnu on my Virtual Machine. I want to allocate a larger memory size for R, since my code fails to finish due to memory size issues. I know that in R on Windows you can use memory.limit(size=specify_size) to increase the amount of memory allocated; how would I do so on Linux in a straightforward fashion?
Compiled from comments:
R will use everything it can in a Linux environment, which does not limit an application's memory allowance the way Windows does. If the code requires more memory than the VM has available, more memory should be allocated when setting up the virtual machine.
The problem was solved by increasing the Base Memory of the Virtual Machine as well as increasing the CPU from 1 to 4. The code is running well now.
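As a quick sanity check that the enlarged VM is what R actually sees, something like the following can be run from the Linux R session (it just reads /proc/meminfo, so it is Linux-only and purely diagnostic):
mem_kb <- as.numeric(system("awk '/MemTotal/ {print $2}' /proc/meminfo", intern = TRUE))
mem_kb / 1024   # total RAM visible to the VM, in MB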

how to override the 2GB memory limit when R starts

When R boots, the memory limit (as returned by memory.limit) is set to 2GB, regardless of the available memory on the computer. (I found that out recently). I imagine that at some point in the booting process, this limit is set to the actually available memory.
This can be seen by printing memory.limit() in the .Rprofile file which is sourced at startup. It prints "2047". On the other hand, when R has booted and I type memory.limit() in the console, I get "16289".
I use a custom .Rprofile file and I need to have access to more than 2GB during bootup.
How can I override this limit?
My current workaround is to set the limit myself in the .Rprofile using memory.limit(size=16289), but then I have to edit this every time I work on a computer with a different amount of RAM, which happens fairly often.
Is there an option I can change, a .ini file I can edit, or anything I can do about it?
OS and R version:
> sessionInfo()
R version 3.2.2 (2015-08-14)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows 7 x64 (build 7601) Service Pack 1
Edit: this is not a duplicate, at least not a duplicate of the proposed question. It is not about managing available memory! I have 16GB of memory and memory.limit() shows that my limit is indeed 16GB.
It all started when I got the warning that I had "reached 2GB memory allocation" (implying that I had a 2GB memory limit). After investigating, it appears that R does indeed limit the memory to 2GB during the startup process.
I want to load my data automatically when R starts, for this I have a small loading script in the .Rprofile. I load more than 2GB data hence I need to have access to my 16GB. My question is about achieving this. This has nothing at all in common with the proposed duplicate, except keywords...
I'm interpreting this as you wanting memory.limit(size=16289) in your .Rprofile file, but not wanting to set the specific number every time you switch to a computer with a different amount of memory. Why not just pull the machine's memory dynamically? On Windows:
# total visible memory reported by wmic (KB), converted to MB
TOT_MEM <- as.numeric(gsub("\r", "", gsub("TotalVisibleMemorySize=", "",
  system('wmic OS get TotalVisibleMemorySize /Value', intern = TRUE)[3]))) / 1024
memory.limit(size = TOT_MEM)
which would set the memory limit to the total memory of the system, or
# free physical memory reported by wmic (KB), converted to MB
FREE_MEM <- as.numeric(gsub("\r", "", gsub("FreePhysicalMemory=", "",
  system('wmic OS get FreePhysicalMemory /Value', intern = TRUE)[3]))) / 1024
memory.limit(size = FREE_MEM)
which would set the memory limit to the physical memory that is free at startup.
Place this in your .Rprofile, above where you load your data.
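A lightly guarded variant of the same idea, still based on the wmic call above, in case the .Rprofile is shared between Windows and non-Windows machines (the grep-based parsing is just a defensive alternative to indexing the third output line):
if (.Platform$OS.type == "windows") {
  out <- system('wmic OS get TotalVisibleMemorySize /Value', intern = TRUE)
  kb  <- sub("TotalVisibleMemorySize=", "", grep("=", out, value = TRUE))
  memory.limit(size = as.numeric(gsub("\r", "", kb)) / 1024)  # wmic reports KB; memory.limit() wants MB
}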
