In 64-bit R, what should my memory.limit() be set to?

I intend to work with some large vectors in R.
memory.limit currently gives:
memory.limit()
[1] 4095
But I think that this is the default for 32-bit R, whilst my installation is 64-bit.
What should my memory.limit() be set to in 64-bit R?

memory.limit() is used to cap R's memory usage on Windows. It can be set to any value between 0 and the amount of RAM available on the machine.
The "correct" value for working with large data sets is the full amount of available RAM. So, for example, with 4GB of RAM, 4095 is the appropriate value: memory.limit(4095).

Related

Memory limit in RStudio on a Windows Server 2012 VM

I am using RStudio on a Windows Server 2012 virtual machine.
I get the memory allocation error "Error: cannot allocate vector of size 6.9 GB" when I try to merge 2 tables.
I have tried all the steps I found in other Stack Overflow answers, with no luck.
I used the gc() function
I increased the memory limit with memory.limit(size=500000)
I checked the version of R in use is 64 bit
I checked the OS is 64 bit
I have played around with different VM settings, going up to 128GB RAM.
Is there a hard limit in Windows Server that cannot be changed?
Thanks a lot.
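For reference, the checks listed in the question can all be run from the R console; a minimal sketch (assuming 64-bit R on Windows, where memory.limit() is available):
gc()                        # force a garbage collection
memory.limit(size = 500000) # request a roughly 500GB ceiling (size is in MB; Windows-only)
R.version$arch              # "x86_64" indicates 64-bit R
Sys.info()[["machine"]]     # typically "x86-64" on a 64-bit OS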

How to increase the allocated memory size for RStudio on a Mac?

I tried the functions memory.limit() and memory.size(), but those are Windows-specific.
I want a similar function for the Mac with which I can increase the memory available to RStudio.

Error: vector memory exhausted (limit reached?)

I previously saved a 2.8G RData file and now I'm trying to load it so I can work on it again, but weirdly, I can't. It's giving the error
Error: vector memory exhausted (limit reached?)
This is weird since I was working with it fine before. One thing that changed, though, is that I upgraded to the latest version of R, 3.5.0. I saw a previous post with the same error, but it wasn't resolved. I was hopeful about the solution of increasing memory.limit(), but unfortunately that's only available for Windows.
Can anyone help? I don't really understand what the problem is here, since I was able to work with my dataset before the update, so it shouldn't be throwing this error.
Did the update somehow decrease the RAM allocated to R? Can we manually increase the memory limit on a Mac to solve this error?
This change was necessary to deal with operating-system memory over-commit issues on macOS. From the NEWS file:
The environment variable R_MAX_VSIZE can now be used to specify the maximal vector heap size. On macOS, unless specified by this environment variable, the maximal vector heap size is set to the maximum of 16GB and the available physical memory. This is to avoid having the R process killed when macOS over-commits memory.
Set the environment variable R_MAX_VSIZE to an appropriate value for your system before starting R and you should be able to read your file.
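For example, a minimal sketch assuming a machine with enough memory to back a 32GB heap: add the line below to ~/.Renviron (which R reads at startup), or export it in the shell before launching R.
R_MAX_VSIZE=32Gb
You can confirm the setting took effect by running Sys.getenv("R_MAX_VSIZE") from the R console.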

Increasing memory size in R on Linux

I am using Linux through a virtual machine (I need to in order to run this R code, which uses Linux-specific commands). I am using R version 3.3.1 x86_64-pc-linux-gnu on my virtual machine. I want to allocate a larger memory size for R, since my code fails to finish due to memory limitations. I know that in R on Windows you can use memory.limit(size=specify_size) to increase the memory allocated; how would I do so on Linux in a straightforward fashion?
Compiled from comments:
R will use everything it can in a Linux environment, which, unlike Windows, does not limit an application's memory allowance. If the code requires more memory than the VM has available, more memory should be allocated when setting up the virtual machine.
The problem was solved by increasing the base memory of the virtual machine and increasing the CPU count from 1 to 4. The code is running well now.
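To confirm from within R how much memory the VM actually exposes, a quick sketch (assuming a standard Linux /proc filesystem):
# Read the MemTotal line from /proc/meminfo and convert KB to MB
mem_total_kb <- as.numeric(gsub("\\D", "", readLines("/proc/meminfo")[1]))
mem_total_kb / 1024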

How to override the 2GB memory limit when R starts

When R boots, the memory limit (as returned by memory.limit) is set to 2GB, regardless of the memory available on the computer (I found that out recently). I imagine that at some point in the boot process, this limit is raised to the memory actually available.
This can be seen by printing memory.limit() in the .Rprofile file which is sourced at startup. It prints "2047". On the other hand, when R has booted and I type memory.limit() in the console, I get "16289".
I use a custom .Rprofile file and I need to have access to more than 2GB during bootup.
How can I override this limit?
My current workaround is to set the limit myself in the .Rprofile using memory.limit(size=16289), but then I have to edit this every time I work on a computer with a different amount of RAM, which happens fairly often.
Is there an option I can change, a .ini file I can edit, or anything I can do about it?
OS and R version:
> sessionInfo()
R version 3.2.2 (2015-08-14)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows 7 x64 (build 7601) Service Pack 1
Edit: this is not a duplicate, at least not a duplicate of the proposed question. It is not about managing available memory! I have 16GB of memory and memory.limit() shows that my limit is indeed 16GB.
It all started when I got the warning that I had "reached 2GB memory allocation" (implying that I had a 2GB memory limit). After investigation, it appears that R does indeed limit the memory to 2GB during the startup process.
I want to load my data automatically when R starts; for this I have a small loading script in the .Rprofile. I load more than 2GB of data, hence I need access to my 16GB. My question is about achieving this. It has nothing at all in common with the proposed duplicate, except keywords...
I'm interpreting this as you wanting memory.limit(size=16289) in your .Rprofile file, but not wanting to set the specific number every time you change to a computer with different memory. Why not just dynamically pull the memory you need? On Windows:
# WMIC reports TotalVisibleMemorySize in KB; element [3] of the output is the "TotalVisibleMemorySize=<value>" line
TOT_MEM <- as.numeric(gsub("\r", "", gsub("TotalVisibleMemorySize=", "", system('wmic OS get TotalVisibleMemorySize /Value', intern = TRUE)[3]))) / 1024
memory.limit(size = TOT_MEM)  # memory.limit() expects MB
which would set the memory limit to the total memory of the system, or
# WMIC reports FreePhysicalMemory in KB; converted to MB as above
FREE_MEM <- as.numeric(gsub("\r", "", gsub("FreePhysicalMemory=", "", system('wmic OS get FreePhysicalMemory /Value', intern = TRUE)[3]))) / 1024
memory.limit(size = FREE_MEM)
which would set memory.limit() to the memory that is actually free when R starts.
Place this in your .Rprofile, above where you load your data.
