R: how to deal with memory limitations

I am trying to deal with memory limitations in R. I was running code that generated monthly data output. It was all going fine: R saved all the monthly CSV files to disk as expected, but the console appeared frozen (even though the code ran to completion). When I restarted the machine, it did not boot as expected, and I had to wipe everything and reinstall Windows. I then downloaded the latest version of R (4.2.1), and now my code no longer runs because of a memory limitation. I get the error message below.
Error: cannot allocate vector of size 44.2 Gb
I tried increasing the memory with memory.limit() as I did before, but it seems it is no longer supported by R (see the related question "memory.limit() bug: 'memory.limit() is no longer supported'. Increasing memory").
How can I deal with this?
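Since R 4.2, the Windows-only memory.limit() is defunct and memory is managed by the operating system, so there is no limit left to raise: a request for a single 44.2 Gb vector simply exceeds what the machine can provide. The practical fix is to lower the peak allocation, for example by generating and writing one month at a time and freeing it before the next. A minimal sketch, where generate_month() and the file names are hypothetical stand-ins for the actual code:

    for (m in 1:12) {
      monthly <- generate_month(m)   # build one month's output (hypothetical)
      write.csv(monthly, sprintf("month_%02d.csv", m), row.names = FALSE)
      rm(monthly)                    # drop the large object before the next month
      gc()                           # give R a chance to return the memory
    }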

Related

Memory Limit in R

I'm relatively new to R and seem to be having a memory limit issue on a new laptop. When I run a large dataset of about 50K survey respondents and over 500 variables, I receive the error message: Error: cannot allocate vector of size 413 Kb
I got around this issue fine on my old laptop by increasing the memory limit via memory.limit(size = 10000). Everything worked, but on my new laptop, which is faster and more powerful, the memory limit fills up very quickly and crashes at size 27000 after I run about 7 models.
I have tried closing all unnecessary programs, removing all unneeded objects in R, and running the garbage collector with gc(). I was using the latest version of R (4.1.4) and have now gone back to 4.0.4, which worked fine on my old PC, but none of this really helps.
I am running the 64-bit version of R on a 64-bit PC with 8 GB of RAM.
Does anyone know why this might be occurring on a brand-new, faster laptop when it at least worked on my slower, four-year-old PC?
Also, how high can you set the memory limit, given that the manual says R can handle 8 TB? And how do you set a memory limit permanently?
Thanks
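For what it's worth, a crash after about 7 models suggests that each fitted model is being kept in memory, and freeing those large objects between runs usually matters more than the limit itself. A sketch, where fit_model(), model_specs, and survey_data are hypothetical placeholders:

    results <- vector("list", length(model_specs))
    for (i in seq_along(model_specs)) {
      fit <- fit_model(model_specs[[i]], survey_data)  # hypothetical model fit
      results[[i]] <- summary(fit)   # keep only the small summary object
      rm(fit)                        # release the full fitted model
      gc()
    }

As for persistence: on R versions before 4.2, a memory.limit(size = ...) call can go in your .Rprofile so the higher limit applies to every session, and the documented ceiling on 64-bit Windows is 8 TB of address space, though only RAM plus the page file is actually usable.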

"Cannot allocate vector of size xxx mb" error, nothing seems to fix

I'm running RStudio x64 on Windows 10 with 16 GB of RAM. RStudio seems to be running out of memory when allocating large vectors, in this case a 265 MB one. I've gone through multiple tests and checks to identify the problem:

1. Memory limit checks via memory.limit() and memory.size(). The memory limit is ~16 GB, and the objects stored in the environment total ~5.6 GB.
2. Garbage collection via gc(). This frees a few hundred MB.
3. Raising the priority of rsession.exe and rstudio.exe to real-time via Task Manager.
4. Running chkdsk and RAM diagnostics on system restart. Both returned no errors.
But the problem persists. It seems to me that R can access the 16 GB of RAM (Resource Monitor shows 16 GB committed), yet it is somehow still unable to allocate a large vector. My main confusion is this: the problem only appears if I run code on multiple datasets consecutively, without restarting RStudio in between. If I do restart RStudio, the problem doesn't show up, at least not for a few runs.
The error should be replicable with any large R vector allocation (see e.g. the code here). I'm guessing the fault is in software, some kind of memory leak, but I'm not sure where it is, how it arises, or how to automate a fix.
Any thoughts? What have I missed?
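The symptom (fine after a restart, failing after several consecutive runs) is consistent with old objects accumulating and the address space becoming fragmented. Short of a restart, clearing the workspace and forcing a full collection between runs is the closest approximation; a sketch, not a guaranteed fix:

    rm(list = ls())             # drop everything in the global environment
    invisible(gc(full = TRUE))  # force a full garbage collection (R >= 3.5.0)

If that isn't enough, rstudioapi::restartSession() can script the RStudio restart that reliably clears the problem.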

Error: vector memory exhausted (limit reached?)

I previously saved a 2.8 GB RData file, and now I'm trying to load it so I can work on it again, but weirdly, I can't. It gives the error
Error: vector memory exhausted (limit reached?)
This is weird, since I was working with it fine before. One thing that changed, though, is that I upgraded to the latest version of R, 3.5.0. I saw a previous post with the same error, but it wasn't resolved. I was hopeful about a solution that increases memory.limit(), but unfortunately it is only available for Windows.
Can anyone help? I don't really understand what the problem is, since I was able to work with my dataset before the update, so it shouldn't be throwing this error.
Did the update somehow decrease the RAM allocated to R? Can we manually increase memory.limit() on a Mac to solve this error?
This change was necessary to deal with operating-system memory over-commit issues on macOS. From the NEWS file:
The environment variable R_MAX_VSIZE can now be used to specify the maximal vector heap size. On macOS, unless specified by this environment variable, the maximal vector heap size is set to the maximum of 16GB and the available physical memory. This is to avoid having the R process killed when macOS over-commits memory.
Set the environment variable R_MAX_VSIZE to an appropriate value for your system before starting R, and you should be able to read your file.
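Because R_MAX_VSIZE is read at startup, it cannot be changed from inside a running session; on macOS the usual place to set it is ~/.Renviron. A sketch, where 32Gb is an arbitrary example value rather than a recommendation:

    # Add to ~/.Renviron (read when R starts):
    #   R_MAX_VSIZE=32Gb
    # Then restart R and confirm it was picked up:
    Sys.getenv("R_MAX_VSIZE")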

Consistent unknown and fatal error in R / RStudio

Is there any way I can find out what happened that resulted in a crash?
I have tried running a script multiple times now, and it seems to crash after a random amount of work has been done.
This is what I get (RStudio):
[screenshot of RStudio's generic fatal-error dialog]
As you can see, this isn't of any help.
As for memory: I am trying to construct a matrix with 34 million entries.
I monitored the memory usage during execution and concluded that there wasn't a big increase in memory; usage was roughly constant at about 580 MB.
The crash only occurs in RStudio, not in the plain R interpreter.
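For scale, the reported numbers are plausible: assuming double-precision entries, the matrix itself is modest, which supports the suspicion that this is not a simple out-of-memory crash:

    34e6 * 8 / 1024^2   # a 34-million-entry double matrix is only ~259 MB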

Memory allocation error in R

I am executing an SQL query in R using the sqldf package to create a data frame, but it throws an error:
Error: cannot allocate vector of size 3.9 Gb
I have gone through various threads with similar issues, but I could not find a suitable answer.
Can anyone please help me out with this?
I am using R 2.15.1 on a 64-bit Linux machine with 32 GB of RAM.
The error is often misunderstood. It means that R is unable to allocate an additional chunk of 3.9 Gb of memory. If you looked at the R process, you would see that it was already using a very large amount of the available RAM before it issued the error; the 3.9 Gb is on top of that.
You will have to expand on this in another question and explain what you are trying to do: if you can't read the data into R with 32 GB of RAM available, you will probably need to look at incremental processing of that data. For that, we need details of what you are trying to achieve.
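Since the question already uses sqldf, one form of incremental processing is to let read.csv.sql() stage the file in an on-disk SQLite database and run the query there, so only the (presumably small) result ever enters R. A sketch; the file name and query are hypothetical placeholders:

    library(sqldf)
    result <- read.csv.sql(
      "big.csv",                 # hypothetical input file
      sql = "select region, avg(value) as mean_value from file group by region",
      dbname = tempfile()        # on-disk SQLite database instead of in-memory
    )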
It may just be that the memory limit in R is too low. First try memory.size(), then use memory.limit() to check the current limit and set a new one. I'm not sure if it will help; just let us all know.
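Note that these functions were Windows-only (on Linux, as here, they return Inf with a warning) and are defunct from R 4.2 onward; the calls below merely illustrate the suggestion on an older Windows build:

    memory.size()               # MB currently used by R (Windows, R < 4.2)
    memory.limit()              # current limit in MB
    memory.limit(size = 16000)  # raise the limit to ~16 GB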
