I'm trying to clone a reasonably large SVN repository with git-svn, and at a certain point I get this error message:
Failure loading plugin: APR: Can't create a character converter from 'UTF-8' to native encoding: Cannot allocate memory at /usr/libexec/git-core/git-svn line 5061
And sometimes this one:
Cannot allocate memory: zlib (compress2): out of memory: Compression of svndiff data failed at /usr/libexec/git-core/git-svn line 5061
I still have ~3 GB of RAM free. What should I do so git-svn can use it?
(I'm doing this on Red Hat Enterprise Linux 6.5, if that makes any difference.)
This error means git is trying to allocate more memory than is currently available. It is most likely caused by a large file having been checked into SVN. Unfortunately, there's no easy way to fix it (apart from buying more memory): you would have to remove the large file, and the commit adding it, from SVN.
However, you can try the following (a rough sketch of both is below):
Increase swap space
Increase the ulimit
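For example (a minimal sketch; the 2 GB size and the /swapfile path are placeholders, so adjust them for your system):
$ sudo dd if=/dev/zero of=/swapfile bs=1M count=2048
$ sudo mkswap /swapfile
$ sudo swapon /swapfile
$ ulimit -v unlimited    # lift the per-process virtual-memory cap for this shell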
OK, this looks familiar from the Java world. Where/how can I allow more memory for RWeka in RStudio?
Error in .jcall("RWekaInterfaces", "[S", "tokenize", .jcast(tokenizer, :
java.lang.OutOfMemoryError: GC overhead limit exceeded
I'm not sure how R interfaces to Java, or whether I can allow more heap space.
Thanks in advance
Gary
Yes, you can increase the heap space.
You need to do it before loading any rJava-dependent package (RWeka, in this case), because the parameter only takes effect before the JVM is initialised.
To increase the default (512 MB) to 1024 MB:
options(java.parameters = "-Xmx1024m")
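For example, in a fresh R session:
options(java.parameters = "-Xmx1024m")
library(RWeka)    # the JVM now starts with the 1024 MB heap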
I am attempting to use the Stanford NLP tools (Java) in R via the rJava package. When I create a StanfordCoreNLP object I get the following error:
Error in .jnew("edu/stanford/nlp/pipeline/StanfordCoreNLP", props) : java.lang.OutOfMemoryError: Java heap space
To resolve this, I have attempted to initialise the JVM with a larger maximum heap size, using variations of the following code:
.jinit(parameters=c("-Xms1g","-Xmx4g"))
When the maximum heap is set to 1 GB using -Xmx1g, the JVM loads but I continue to get the OutOfMemoryError. When the maximum heap size is set to 2 or 3 GB (-Xmx2g or -Xmx3g), R stops responding. When it's set to 4 GB or higher (-Xmx4g), I get the following message:
Error in .jinit(parameters = c("-Xms1g", "-Xmx4g"), force.init = TRUE) : Cannot create Java virtual machine (-6)
How do you successfully initialise the JVM through rJava with values larger than 1 GB? I am using 32-bit versions of Java (v8 u51) and R (v3.2.0).
I am using 32-bit versions of Java (v8 u51) and R (v3.2.0)
That's your problem right there. A 32-bit process only has 2 to 4 GB of address space, and the JVM needs a contiguous block for its heap, so in practice a 32-bit JVM can't go much beyond 1 to 1.5 GB; that matches the behaviour you're seeing. Switch to 64-bit versions.
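If you want to confirm which builds you're running, a quick check from R (4 means a 32-bit build, 8 means 64-bit):
.Machine$sizeof.pointer
And once rJava is loaded, the JVM's word size can be read from a standard system property:
library(rJava)
.jinit()
.jcall("java/lang/System", "S", "getProperty", "sun.arch.data.model")    # "32" or "64"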
I get this same error as others when running php ~/composer.phar update:
The following exception is caused by a lack of memory and not having swap configured
Check https://getcomposer.org/doc/articles/troubleshooting.md#proc-open-fork-failed-errors for details
Fatal error: Uncaught exception 'ErrorException' with message 'proc_open(): fork failed - Cannot allocate memory' in phar:///home/tea/composer.phar/vendor/symfony/console/Symfony/Component/Console/Application.php:974
Stack trace:
#0 [internal function]: Composer\Util\ErrorHandler::handle(2, 'proc_open(): fo...', 'phar:///home/te...', 974, Array)
#1 phar:///home/tea/composer.phar/vendor/symfony/console/Symfony/Component/Console/Application.php(974): proc_open('stty -a | grep ...', Array, NULL, NULL, NULL, Array)
#2 phar:///home/tea/composer.phar/vendor/symfony/console/Symfony/Component/Console/Application.php(784): Symfony\Component\Console\Application->getSttyColumns()
#3 phar:///home/tea/composer.phar/vendor/symfony/console/Symfony/Component/Console/Application.php(745): Symfony\Component\Console\Application->getTerminalDimensions()
#4 phar:///home/tea/composer.phar/vendor/symfony/console/Symfony/Component/Console/Application.php(675): Symfony\Component\Console\Application->getTerminalWidth()
#5 phar:///home/tea/composer in phar:///home/tea/composer.phar/vendor/symfony/console/Symfony/Component/Console/Application.php on line 974
...but with a large instance: 4 GB RAM and 4 GB swap. The free RAM is never exhausted, let alone the available/cached RAM, and the swap isn't touched!
              total        used        free      shared  buff/cache   available
Mem:           3788         885        1908           9         993        2692
Swap:          3967           0        3967
It's the first time running composer update on this new machine, CentOS/CloudLinux 7.1 (with cPanel).
In desperation, I've tried
# php -dmemory_limit=1G ../composer.phar update --no-scripts --prefer-dist
and I've tried removing the composer.lock and vendor files, and even tried adding a temporary swap file, but it really doesn't seem to be a memory problem. Could the error be misleading?
proc_open is not disabled, and I also tried with shell fork-bomb protection disabled, but no joy.
Would love a heads-up.
N.B. I'm aware of the advice to commit the composer.lock file and do a composer install, but this instance is being used for dev (as was the previous CentOS/CloudLinux 6.x machine, which had smaller resource specs), so we need to use the same methods we were using previously.
OK, so it was CloudLinux limiting the user's memory to 1024 MB; it works when the limit is doubled to 2048 MB.
That's the same setting as on our previous server (CentOS/CloudLinux 6.x), but it seems each new version of CentOS is more memory-hungry than the last.
What's weird is that running composer with --profile shows the most it uses is 482 MB. Even if that doubles when forking (as has been suggested), it's still below the 1024 MB limit.
I ran into the same problem. My system had 1.5 GB of RAM free and it was not enough... Composer was eating too much memory very fast.
My only solution was to clear the cache and update to the latest version (1.4.2):
composer clear-cache
sudo composer selfupdate
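If it's PHP's own memory_limit that is biting, the Composer troubleshooting guide linked above also suggests lifting it entirely for a single run:
$ php -d memory_limit=-1 ~/composer.phar update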
That happens when you have low memory and swap is not enabled. I had the same problem and fixed it with the few commands below; alternatively, you can create a swap partition or file. Just make sure you activate the swap.
$ /bin/dd if=/dev/zero of=/var/swap.1 bs=1M count=1024
$ /sbin/mkswap /var/swap.1
$ /sbin/swapon /var/swap.1
I hope it works for you...
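If you go the swap-file route, two follow-ups are worth doing (the path matches the commands above):
$ chmod 600 /var/swap.1    # swapon warns about world-readable swap files
$ echo '/var/swap.1 none swap sw 0 0' >> /etc/fstab    # keep the swap across reboots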
I have a main function in R which calls other files to run my program. I call the main file through a batch (.bat) file. When I run it line by line it runs without a memory error, but when I call the bat file to run it, it halts and gives me the following error:
Cannot allocate memory greater than 51 MB.
How can I avoid this?
Memory limitations in R such as this are a recurring nightmare for a lot of us.
Very often the problem is a limit imposed by your OS (which can usually be changed from a Bash or PowerShell command line), your architecture (32- vs. 64-bit), or the availability of contiguous free RAM, regardless of the overall memory available.
It's hard to say why something would not cause a memory issue when run line by line, but would hit the memory limit when run as a .bat.
What version of R are you running? Do you have both 32- and 64-bit builds installed? Is the 32-bit build being called by Rscript when you run your .bat file, while you run a 64-bit build line by line? You can check the version of R that's being run with R.Version().
You can test this by running memory.limit() in both your R IDE/terminal and in your .bat file (be sure to print or save the result in the .bat run). You might also try setting memory.limit() in your .bat file, as it may just have a smaller default, perhaps due to differences in the .Rprofile that's invoked in your IDE or terminal versus in the .bat file.
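For example, in the script your .bat file runs (note that memory.limit() is Windows-only, and the 4000 MB value is just an example):
cat("Memory limit (MB):", memory.limit(), "\n")    # report the current cap
memory.limit(size = 4000)                          # request a larger cap (~4 GB)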
If architecture isn't the cause of your memory error, then you have several more troubleshooting steps to try:
Check memory usage in both environments (in R directly and via your .bat process) using this:
sort( sapply(ls(),function(x){object.size(get(x))}))
Run the garbage collector explicitly in your scripts; that's the gc() command.
Check all object sizes to make sure there are no unexpected results in your .bat process: sort( sapply(ls(),function(x){format(object.size(get(x)), units = "Mb")}))
Try memory profiling:
Rprof(tf <- "rprof.log", memory.profiling=TRUE)
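# ... run the code you want to profile here ...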
Rprof(NULL)
summaryRprof(tf, memory = "both")    # include the memory statistics in the summary
While this is a RAM issue, for good measure you might want to check that the compute power available is both sufficient and not varying between these two ways of running your code: parallel::detectCores()
Examine your performance with Prof. Hadley Wickham's lineprof tool (warning: it requires devtools and doesn't work on lines of code that call into C); a rough sketch follows.
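A rough sketch of lineprof in use (f() stands in for whatever function your script runs; the package lives on GitHub and has since been superseded by profvis):
devtools::install_github("hadley/lineprof")
library(lineprof)
l <- lineprof(f())    # collect line-by-line memory and time data for f()
shine(l)              # open the interactive viewer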
References
While I'm pulling these snippets out of my own code, most of them originally came from other, related Stack Overflow posts, such as:
Reaching memory allocation in R
R Memory Allocation "Error: cannot allocate vector of size 75.1 Mb"
R memory limit warning vs "unable to allocate..."
How to compute the size of the allocated memory for a general type
R : Any other solution to "cannot allocate vector size n mb" in R?
Yes, you should be using 64-bit R, if you can.
See this question, and this from the R docs.
I am currently trying to use pkgmk on a Solaris10-x86 machine. When I run the command
pkgmk -o -b $(HOME)/solbuild/pkg_solaris
it returns this error:
## Building pkgmap from package prototype file.
pkgmk: ERROR: memory allocation failure
## Packaging was not successful.
My first thought was that this is an out-of-memory error, but I am not sure that it could be. I have close to a gigabyte free in the / partition and 12 gigabytes free in the $(HOME) partition.
Any help would be greatly appreciated.
I saw this error when /var was full.
Deleting some files from /var resolved the problem.
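A quick way to check whether that's the cause here (pkgmk spools its output under /var/spool/pkg by default):
$ df -h /var              # how full is the filesystem?
$ du -sk /var/spool/pkg   # size, in KB, of pkgmk's default spool area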