I tried to decrease the memory available to RStudio and set a limit, but it didn't work.
I've read this article and this question.
My computer is running Windows 8 64-bit with 8 GB of RAM. I just want to limit RStudio to 4 GB of memory.
The easiest would be to just use:
memory.limit()           # query the current limit, in MB
[1] 8157
memory.limit(size=4000)  # cap this session at roughly 4 GB
If you are running a server version of RStudio, it will be a bit different: you will have to edit the file /etc/rstudio/rserver.conf and add rsession-memory-limit-mb=4000 to it.
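For reference, the added line in that file would look like this (a sketch; restarting the RStudio Server service afterwards is presumably needed for it to take effect):

# /etc/rstudio/rserver.conf
rsession-memory-limit-mb=4000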
If you do find that RStudio resets the memory limit with every new instance, you could try adding memory.limit(size=4000) to your .Rprofile file so the limit is set at every start.
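A minimal sketch of such a .Rprofile entry, guarded so it only runs on Windows (memory.limit() is Windows-specific, and note it is no longer functional in recent versions of R):

if (.Platform$OS.type == "windows") {
  utils::memory.limit(size = 4000)  # cap each new session at ~4 GB
}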
Related
The command memory.limit() is used to query the memory limit in R on Windows.
What is the corresponding way to get the memory limit in R on Linux?
You can use ulimit -v to query or set the virtual memory limit for a single process.
Refer to the link.
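As an illustration (my own sketch, not part of the original answer): from within R on Linux you can read /proc/meminfo instead, since memory.limit() is Windows-only. The MemAvailable field requires a reasonably recent kernel.

meminfo_mb <- function(fields = c("MemTotal", "MemAvailable")) {
  lines <- readLines("/proc/meminfo")              # e.g. "MemTotal:  8046508 kB"
  lines <- lines[sub(":.*", "", lines) %in% fields]
  vals  <- as.numeric(gsub("[^0-9]", "", lines))   # extract the kB figure
  setNames(vals / 1024, sub(":.*", "", lines))     # convert kB to MB
}
meminfo_mb()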
I try to knit an .Rdata file in RStudio and get an error that a vector of 1.8 MB cannot be allocated.
I understand that I have a memory issue, so I tried to use memory.limit() as suggested in another post, but I got an error:
memory.size() is Windows-specific.
I also added extra swap space using a USB drive and changed the swappiness setting, but nothing happened. I'm using Ubuntu 14.04 32-bit, R version 3.3.0 32-bit, and I also have Windows installed.
The problem is that you're on a 32-bit installation of R and you've used up the (very small) amount of memory it can handle. If you just switch to 64-bit you won't get this error.
The error message is because you invoked a Windows-specific command on Ubuntu, but that's really not relevant because with 32-bit R there's a hard limit on memory which you've already hit.
I know this is confusing because it's complaining about a very small vector (1.8 MB), but that just means that the remaining amount of memory 32-bit R can handle is less than that.
If you were on Windows you might need to set the memory limit in addition to using 64-bit R, but if you're using Ubuntu then just using 64-bit R should solve the problem.
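To confirm which build a session is actually running, you can check from the R console:

.Machine$sizeof.pointer   # 8 on 64-bit R, 4 on 32-bit R
R.version$arch            # e.g. "x86_64" (64-bit) vs "i386"/"i686" (32-bit)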
RStudio Instructions
I'm basing this on my version of RStudio; yours might be slightly different, but it should be very close.
Click Tools
Click Global Options...
With the "General" (default) screen selected, click Change where it says "R version:"
Select Use your machine's default version of R64 (64-bit)
Click OK
Allow RStudio to restart
I'm working on a software project that requires the portable version of the R platform. My intention is to use R on any version of Windows and on any compatible computer.
Problem: In Windows 7, R works fine without any worries, even in the portable version. However, in Windows 10 (and probably also in Windows 8), R does not start when the whole folder is placed inside a directory containing whitespace (e.g. "C:/Users/Main/Documents/My Folder/RVersion").
In Windows 10, in the absence of spaces, R runs fine. In the presence of spaces, every executable (Rscript.exe, R.exe, etc.) except Rgui.exe just opens a console and closes instantly. The problem is: I really need R to work in any folder (this is an important part of the project).
Additional information:
I found that R does not work well in directories without the 8dot3 (short name) format, and I think Windows 10 dropped this property, which was present in Windows 7. The problem also becomes clear when I run Rgui.exe from a whitespace-containing directory and call system("R.exe", intern=TRUE): it throws an error indicating that only the part of the directory name before the first space was taken into account. Here is the message:
> system("R.exe", intern=TRUE)
[1] "'C:\\Users\\Main\\DOCUME~1\\My' não é reconhecido como um comando interno"
[2] "ou externo, um programa operável ou um arquivo em lotes."
attr(,"status")
[1] 1
Warning message:
running command 'R.exe' had status 1
Translation of messages [1] and [2]: "'C:\...\My' is not recognized as an internal or external command, operable program or batch file."
The same occurs with the non-portable version of R, as I have already tested.
When I run it via a .bat file with the corrected (quoted) directory as input, R.exe runs, but in a dysfunctional form that looks like cmd.exe (no R command works).
I have no idea how to change variables such as R_HOME to a usable form before the R prompt starts.
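As an aside (my own workaround sketch, not from the question): the system() failure above comes from the unquoted space, so quoting the path with shQuote(), or mapping it to its 8.3 short form with shortPathName(), may work around it from within a running session:

r_exe <- file.path(R.home("bin"), "R.exe")    # path to the running R's R.exe
system(shQuote(r_exe), intern = TRUE)         # quoted, so the space survives
system(shortPathName(r_exe), intern = TRUE)   # or the 8.3 short path, if available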
System/Resources:
Windows 10 Home 64-bit with the latest update.
Dell notebook with Intel i7-5500U 2.40 GHz (not so relevant, I think).
R and R Portable 3.3 (the latest version as of this post), downloaded here:
[https://sourceforge.net/projects/rportable/]
I believe that, with the popularity of Windows 10, many other users could face this problem (especially those who depend on R's portability).
Thanks in advance!
In the end, I found a plausible solution on my own.
I realized that every time R.exe is executed from a directory without spaces, execution is automatically redirected to Rterm.exe. I don't know whether they fixed the bug specifically in Rterm.exe in the latest version, but this suggests that the Rterm executable is what actually opens the R console, and when I tried to execute it directly from a space-containing directory, it worked fine (at least in the latest R version).
So, in summary, if someone else has this problem, just run Rterm.exe directly.
This resolution is useful enough for those who just depend on R's portability, although Rscript.exe and R.exe still do not work under these conditions. Rscript.exe is useful for executing scripts directly without having to manually start a new session in a new window, and R.exe redirects to Rterm.exe according to whether your system is 32-bit or 64-bit (at least as far as I observed).
Anyway, I have already informed the CRAN project about this bug, and I hope they check it and fix the issue in the next version. If someone else finds an alternative solution, feel free to comment.
I have read through this SO question and answers (R parallel computing and zombie processes) but it doesn't seem to quite address my situation.
I have a 4-core MacBook Pro running Mac OS X 10.10.3, R 3.2.0, and RStudio 0.99.441.
Yesterday, I was trying out the packages "foreach" and "doParallel" (I want to use them in a package I am working on). I did this:
library(doParallel)  # attaches foreach, iterators, and parallel as dependencies
cl <- makeCluster(14)
registerDoParallel(cl)
a <- 0
ls <- foreach(icount(100)) %dopar% {
  b <- a + 1
}
It is clear to me that it doesn't make sense to have 14 processes on my 4-core machine, but the software will actually be run on a 16-core machine. At this point my computer ground to a halt. I opened Activity Monitor and found 16 (or more, maybe?) R processes. I tried to force-quit them from Activity Monitor -- no luck. I closed RStudio, and that killed all the R processes. I reopened RStudio, and that restarted all the R processes. I restarted the computer and restarted RStudio, and that restarted all the R processes.
How can I start RStudio without restarting all those processes?
EDIT: I forgot to mention that I also rebuilt the package I was working on at the time (all the processes may have been running during the build)
EDIT2: Also, I can't stopCluster(cl) because cl is not in the environment anymore...I closed that R session.
EDIT3: When I open R.app (The R GUI provided with R) or open R in the terminal, no such problem occurs. So I think it must be RStudio-related.
EDIT4: There appears to be a random delay between opening RStudio and the starting of all these undesired processes. Between 15s and 2 mins.
EDIT5: It seems the processes only start after I open the project from which they were started.
EDIT6: I have been picking through the .Rproj.user files looking for things to delete. I deleted all the files (but not the directories) in ctx, pcs, and sdb. Problem persists.
EDIT7: When I run "killall R" at the command line it kills all these processes, but when I restart RStudio and reopen the project, all the processes start again.
EDIT8: I used "killall -s R | wc -l" to find that the number of R processes grows and grows while the project is open. It got up to 358 and then I ran "killall R" because my computer was making scary sounds.
EDIT9: RStudio is completely unusable currently. Every time I "killall R", it restarts all the processes within 15 seconds.
EDIT10: When I initiate a build, that also starts up tons of R processes -- 109 at last check. These processes all get started when the build says "preparing package for lazy loading". At this point the computer grinds to a near-halt.
EDIT11: I deleted the .Rproj file (actually just moved it as a backup) and the .Rproj.user directory. I used "create project from directory" in RStudio. When I open that NEW project, I still get the same behavior. What is RStudio doing when I open a project that isn't contained anywhere in the .Rproj file or the .Rproj.user directory!? I've spent the whole day on this one problem....:(
Best guess -- the newest version of RStudio tries to do some work behind the scenes to build an autocompletion database, based on library() and require() calls it detects within open files in your project. To do this, it launches new R processes, loads those packages (with library()), and then returns the set of all objects made available by each package.
By any chance, are you loading certain packages that have complex .onLoad() actions? It's possible that this engine in RStudio is running those in R processes behind the scenes, but getting stuck for some reason and leaving you with these (maybe stale or busy) R processes.
For reference, a somewhat similar issue was reported here.
Here's what ended up fixing it:
Delete the package I built (the binary, I believe...I clicked the "x" to the right of its name in the "Packages" pane of RStudio).
Rebuild it, with
library(parallel)
commented out.
unloadNamespace("doParallel")
will kill the unnamed workers started by registerDoParallel.
If you still have a handle to the cluster, you can use:
stopCluster(cl)
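A minimal sketch contrasting the two cleanup paths (the worker counts are arbitrary):

library(doParallel)

cl <- makeCluster(2)           # named cluster: we keep a handle
registerDoParallel(cl)
stopCluster(cl)                # so it can be shut down explicitly

registerDoParallel(2)          # unnamed backend: no handle to stop
unloadNamespace("doParallel")  # tears down its implicit workers instead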
When I run PartCover from the console on a 32-bit machine, I get a 5 MB file generated as output. But when I run the same thing from a 64-bit machine's command prompt, I get a 1 KB file as the output.
I have used:
CorFlags.exe PartCover.exe /32BIT+ /Force
CorFlags.exe PartCover.Browser.exe /32BIT+ /Force
But I am not able to generate proper output. I am using NUnit 2.5 and PartCover 2.3.
I've successfully used PartCover with NUnit on a 64 bit machine, but it often requires some fiddling to get working. One of the key things for me was to ensure that the tests were being run using nunit-console-x86.exe.
I recommend using the most recent version of PartCover, which is a fork on GitHub - PartCover.NET 4 - and reading this and this.
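As an illustration only (the paths, test assembly, and include filter below are hypothetical placeholders, not from the original answer), a PartCover run over the x86 NUnit console looks roughly like this:

PartCover.exe --target="C:\NUnit\bin\nunit-console-x86.exe" --target-args="MyTests.dll" --include=[MyAssembly]* --output=partcover-report.xml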