Memory Allocation Error using pkgmk on Solaris 10?

I am currently trying to use pkgmk on a Solaris 10 x86 machine. When I run the command
pkgmk -o -b $HOME/solbuild/pkg_solaris
it returns this error:
## Building pkgmap from package prototype file.
pkgmk: ERROR: memory allocation failure
## Packaging was not successful.
My first thought was that this was an out-of-memory error, but I am not sure it can be: I have close to a gigabyte free in the / partition and 12 gigabytes free in the $HOME partition.
Any help would be greatly appreciated.

I saw this error when /var was full. Deleting some files from /var resolved the problem.
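If you hit this, it is worth checking /var first, since pkgmk builds its package under /var/spool/pkg by default. A minimal sketch (the du pattern is just one way to find what is eating the space):
df -h /var                 # pkgmk spools to /var/spool/pkg by default
du -sk /var/* | sort -n    # directory sizes in KB, largest last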

Related

DBD::SQLite::db commit failed: disk I/O error

I have a system writing data to an SQLite file. Everything was operational under CentOS 8; after upgrading the system to Rocky Linux 9, I see this error when running a commit command: DBD::SQLite::db commit failed: disk I/O error
I have checked file permissions, disk space, SMART readings, everything disk-related that I can think of, but without success.
Has anyone encountered this error before? What could I try to fix it?
The problem turned out to be a missing Perl module (LWP::Protocol::https) that was preventing DBD::SQLite from getting the data it wanted. Apparently, DBD::SQLite reports 'disk I/O error' in that case.
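A quick way to check for the module (a sketch; the dnf package name is my assumption for Rocky Linux 9, and cpan works anywhere):
perl -MLWP::Protocol::https -e 'print "module present\n"'   # dies with a 'Can't locate ...' error if missing
sudo dnf install perl-LWP-Protocol-https                    # or: cpan LWP::Protocol::https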

Tau2Slog2 not able to process 6 GB tau.trc files

I am profiling my code with the TAU profiler, using tau_exec at runtime. It generates trace files, some of which are gigabytes in size. tau_treemerge.pl merges them into a tau.trc that is 6 GB. tau2slog2 then fails, complaining about heap space.
It would be helpful if anybody can show how to reduce the size of trace files.
Following is how I am running the code:
mpirun -n 64 tau_exec ./a.out
tau_treemerge.pl
tau2slog2 tau.trc tau.edf -o tau.slog2
I was able to solve the issue by increasing the heap size of the JVM.
java -Xmx50000m -Xms32000m -cp /tau/x86_64/lib/TAU_tf.jar:/tau/x86_64/lib/traceTOslog2.jar:/tau/x86_64/lib/tau2slog2.jar edu/uoregon/tau/Tau2Slog2 tau.trc tau.edf -o tau.slog2
Obviously this is a workaround rather than an elegant solution, so to reduce the tau.trc file size I added more filtering parameters during instrumentation.
I also first profiled the code (export TAU_PROFILE=1), ran pprof to find out which MPI functions were called an enormous number of times, and then throttled those functions to further reduce the file size.
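For reference, TAU's runtime throttling is controlled by environment variables; a minimal sketch (the thresholds shown are the documented defaults, so tune them for your workload):
export TAU_THROTTLE=1                 # throttle frequently called, lightweight routines
export TAU_THROTTLE_NUMCALLS=100000   # routines called more than this many times...
export TAU_THROTTLE_PERCALL=10        # ...with under 10 microseconds per call get disabled
mpirun -n 64 tau_exec ./a.out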

Composer proc_open(): fork failed - Cannot allocate memory

I am getting the same error as others when running php ~/composer.phar update:
The following exception is caused by a lack of memory and not having swap configured
Check https://getcomposer.org/doc/articles/troubleshooting.md#proc-open-fork-failed-errors for details
Fatal error: Uncaught exception 'ErrorException' with message 'proc_open(): fork failed - Cannot allocate memory' in phar:///home/tea/composer.phar/vendor/symfony/console/Symfony/Component/Console/Application.php:974
Stack trace:
#0 [internal function]: Composer\Util\ErrorHandler::handle(2, 'proc_open(): fo...', 'phar:///home/te...', 974, Array)
#1 phar:///home/tea/composer.phar/vendor/symfony/console/Symfony/Component/Console/Application.php(974): proc_open('stty -a | grep ...', Array, NULL, NULL, NULL, Array)
#2 phar:///home/tea/composer.phar/vendor/symfony/console/Symfony/Component/Console/Application.php(784): Symfony\Component\Console\Application->getSttyColumns()
#3 phar:///home/tea/composer.phar/vendor/symfony/console/Symfony/Component/Console/Application.php(745): Symfony\Component\Console\Application->getTerminalDimensions()
#4 phar:///home/tea/composer.phar/vendor/symfony/console/Symfony/Component/Console/Application.php(675): Symfony\Component\Console\Application->getTerminalWidth()
#5 phar:///home/tea/composer in phar:///home/tea/composer.phar/vendor/symfony/console/Symfony/Component/Console/Application.php on line 974
...but with a large instance: 4 GB RAM and 4 GB swap. The free RAM is never exhausted, let alone the available/cached RAM, and the swap isn't touched!
          total   used   free   shared   buff/cache   available
Mem:       3788    885   1908        9          993        2692
Swap:      3967      0   3967
It's the first time running composer update on this new machine, CentOS/CloudLinux 7.1 (with cPanel).
In desperation, I've tried
# php -dmemory_limit=1G ../composer.phar update --no-scripts --prefer-dist
and I've tried removing the composer.lock file and the vendor directory, and even adding a temporary swap file, but it really doesn't seem to be a memory problem. Could the error be misleading?
proc_open is not disabled, and I also tried with shell fork-bomb protection disabled, but no luck.
Would love a heads up.
N.B. I'm aware of the advice to commit the composer.lock file and do a composer install, but this instance is being used for dev (as was the previous CentOS/CloudLinux 6.x machine, which had smaller resource specs), so we need to use the same methods we were using previously.
OK, so it was CloudLinux limiting the user's memory to 1024 MB; it works when the limit is doubled to 2048 MB.
That's the same setting as on our previous server (CentOS/CloudLinux 6.x), but it looks like each new version of CentOS is more memory-hungry than the last.
What's weird is that running composer with --profile shows the most it uses is 482 MB. Even if that doubles when forking (as has been suggested), it's still below the 1024 MB limit.
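For anyone debugging a similar case: fork() has to account for a full copy of the parent's address space at the moment it is called, so a per-user virtual-memory cap (such as a CloudLinux LVE limit) can fail the fork even when free shows plenty of headroom. Two hedged checks, assuming a Linux shell on the affected account:
ulimit -v                      # per-process virtual memory cap in kB ('unlimited' if none)
grep -i commit /proc/meminfo   # compare Committed_AS against CommitLimit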
I ran into the same problem. My system had 1.5 GB of RAM free and it was not enough; Composer was eating a lot of memory very fast.
My only solution was to clear the cache and update to the latest version (1.4.2):
composer clear-cache
sudo composer selfupdate
That happens when you have little memory and no swap enabled. I had the same problem and fixed it with the few commands below; alternatively, you can create a swap partition or file, just make sure that you activate the swap.
$ /bin/dd if=/dev/zero of=/var/swap.1 bs=1M count=1024
$ /sbin/mkswap /var/swap.1
$ /sbin/swapon /var/swap.1
I hope it works for you...
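If you go the swap-file route, you may also want the swap to survive a reboot. A short sketch (the fstab line is the standard pattern, not something from the post above):
$ chmod 600 /var/swap.1                                     # swap files should not be world-readable
$ echo '/var/swap.1 swap swap defaults 0 0' >> /etc/fstab   # mount it automatically at boot
$ swapon -s                                                 # confirm the swap is active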

Git-svn out of memory

I'm trying to clone a reasonably big SVN repository with git-svn, and at a certain point I get an error message:
Failure loading plugin: APR: Can't create a character converter from 'UTF-8' to native encoding: Cannot allocate memory at /usr/libexec/git-core/git-svn line 5061
And sometimes a
Cannot allocate memory: zlib (compress2): out of memory: Compression of svndiff data failed at /usr/libexec/git-core/git-svn line 5061
error message. I still have ~3 GB of RAM free. What should I do so that git-svn can use it?
(I'm doing this on Red Hat Enterprise Linux 6.5, if that makes any difference.)
From:
This error message is about the memory git is trying to allocate: it's more than what is free. This is most likely caused by a large file having been checked into SVN. Unfortunately, there's no easy way to fix it (apart from buying more memory); you would have to remove the large file, and the commit adding it, from SVN.
However, try the following:
- Increase swap space
- Increase the ulimit (a sketch follows below)
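A minimal sketch of the second suggestion (the 'unlimited' values are illustrative; your shell may already be unrestricted):
ulimit -v unlimited   # lift any virtual-memory cap for this shell
ulimit -d unlimited   # lift the data-segment cap
git svn fetch         # re-run inside the partial clone; git-svn resumes where it stopped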

SVN - SQLite - disk I/O error

When trying to commit to my SVN repository, I got the following error:
Working copy 'Z:\prace-pj\projects\other\CopyRT' locked.
So I ran the cleanup command, and then the commit succeeded, but at the end of the response message there was the following error:
Error bumping revisions post-commit (details follow):
disk I/O error, executing statement 'RELEASE s11'
Now when I try to, for example, update the working copy, it says that it is still locked. When I clean up and try to update again, I get an error like this:
disk I/O error, executing statement 'RELEASE s2'
sqlite: disk I/O error
What should I do to fix this?
For others' reference: I just had this same error and found that one of my log files was taking up all my disk space, so nothing could be written to the disk because there was no free space.
Run the following to make sure you have enough disk space:
df -h
Then I just needed to run:
svn cleanup
This resolved the error for me.
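If df does show a full filesystem, one way to find the offending files (a sketch; /var/log is only an assumed location for logs like the ones mentioned above):
df -h                                     # look for a filesystem at 100% use
du -ah /var/log | sort -rh | head -n 20   # the 20 largest entries under /var/log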
Have you tried:
svn unlock --force path/to/workingcopy
? It seems it can be pointed at a URL if the problem is in the repository itself. I've only used an unlock operation via the TortoiseSVN GUI before, but I assume it just wraps the svn command anyway.
Hope that helps.
