Compress or copy a folder to stdout

I can't compress a folder recursively to stdout. Zip and 7-Zip can write a single file to stdout, but not a folder. Rar can't compress to stdout at all, by design. Tar, gzip, bzip2, etc. are the obvious candidates. Please help me find a solution...
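For the record, tar alone solves this: it recurses into directories and can write the archive to stdout, which you can then pipe through any compressor. A minimal sketch (the directory name and file contents are placeholders):

```shell
# set up a placeholder directory to archive
mkdir -p mydir/sub
echo hello > mydir/sub/file.txt

# "f -" tells tar to write the archive to stdout;
# pipe it through gzip (or xz, bzip2, zstd, ...)
tar cf - mydir | gzip > mydir.tar.gz

# GNU/BSD tar can also do the gzip step itself with -z
tar czf - mydir > mydir-builtin.tar.gz
```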

Related

On CircuitPython, how to decompress files from a tar or gzip file?

I can use utarfile.py to untar a file on MicroPython.
But on CircuitPython, even the latest 8.0.0 beta 2, there is no CircuitPython version of utarfile, gzip, or any similar library.
If I try to use utarfile.py, there is also no uctypes module on CircuitPython.
Could anybody tell me how to decompress a tar or gzip file on CircuitPython?
Thank you.

Unzip specific files in an ARR file within a tar.gz

I'm currently trying to unzip some specific files within an ARR file. This ARR file is inside a tar.gz file.
Is it possible to unzip these files without an intermediate step, i.e. as a one-liner? It's important that the outer tar.gz is not unpacked.
Thanks!
you can try something like:
gzip -dc input_file.tar.gz|tar xf - path/to/file/you/want/to/extract
This decompresses and untars the archive as a stream, with no intermediate file, and has the advantage of running faster.
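With a reasonably modern tar you can even drop the explicit gzip call, since -z decompresses in-stream; a sketch with placeholder archive and member names:

```shell
# build a sample tar.gz to extract from (placeholder content)
mkdir -p demo/path/to
echo data > demo/path/to/wanted.file
tar czf input_file.tar.gz demo
rm -r demo

# -z decompresses the gzip layer on the fly; only the named
# member is written out, the rest of the archive stays packed
tar xzf input_file.tar.gz demo/path/to/wanted.file
```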

Compress a dir into many small tar and uncompress it back to same parent directory

I have a huge (90 GB) directory which I want to compress into multiple .tgz files in such a way that when I untar all these small .tgz files, they all extract into the same main directory.
For example:
Parent_dir/ containing files 1-1000000
Compress it into multiple small subsets: subset1.tgz, subset2.tgz, subset3.tgz, etc.
Now when I uncompress them, they should all expand into the same Parent_dir.
I have used tar czvf name/of/tar files/to/compress.
How can I achieve this?
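One possible sketch (the chunk size and all names here are placeholders): split the file list into chunks, build one .tgz per chunk with tar's -T option, and extract them all in the same place. Because each archive stores paths starting with Parent_dir/, every subset expands back into the same directory:

```shell
# toy stand-in for the 90GB directory
mkdir -p Parent_dir
for i in 1 2 3 4; do echo "data $i" > "Parent_dir/file$i"; done

# split the file list into chunks of 2 names each
find Parent_dir -type f | split -l 2 - chunk_

# one archive per chunk; -T reads the member list from a file
n=0
for c in chunk_*; do
  n=$((n+1))
  tar czf "subset$n.tgz" -T "$c"
done

# extracting all subsets in one place rebuilds Parent_dir
mkdir restore && cd restore
for t in ../subset*.tgz; do tar xzf "$t"; done
```

For the real 90 GB tree you would raise the chunk size (e.g. `split -l 100000`) so each .tgz holds a reasonable number of files.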

An appendable compressed archive

I have a requirement to maintain a compressed archive of log files. The log filenames are unique and the archive, once expanded, is simply one directory containing all the log files.
The current solution isn't scaling well, since it involves a gzipped tar file. Every time a log file is added, the entire archive must first be decompressed, the file added, and the whole thing re-gzipped.
Is there a Unix archive tool that can add to a compressed archive without completely expanding and re-compressing? Or can gzip perform this, given the right combination of arguments?
I'm using zip -Zb for that (appending text logs incrementally to a compressed archive):

- fast append (the index is at the end of the archive, so it's efficient to update)
- -Zb uses the bzip2 compression method instead of deflate. In 2018 this seems safe to use (you'll need a reasonably modern unzip -- note that some tools assume deflate when they see a zip file, so YMMV)

7z was a good candidate: the compression ratio is vastly better than zip when you compress all files in one operation. But when you append files one by one (incremental appending), the compression ratio is only marginally better than standard zip, and similar to zip -Zb. So for now I'm sticking with zip -Zb.
To clarify what happens, and why having the index at the end is useful for an "appendable" archive format with individually compressed entries:
Before:
############## ########### ################# #
[foo1.png    ] [foo2.png ] [foo3.png       ] ^
                                             |
                                         index
After:
############## ########### ################# ########### #
[foo1.png    ] [foo2.png ] [foo3.png       ] [foo4.png ] ^
                                                         |
                                                 new index
So this is not fopen in append mode, but presumably fopen in write mode, then fseek, then write (that's my mental model of it, someone let me know if this is wrong). I'm not 100% certain that it would be so simple in reality, it might depend on OS and file system (e.g. a file system with snapshots might have a very different opinion about how to deal with small writes at the end of a file… huge "YMMV" here 🤷🏻‍♂️)
It's rather easy to have an appendable archive of compressed files (not same as appendable compressed archive, though).
tar has an option to append files to the end of an archive (assuming that you have GNU tar):
-r, --append
append files to the end of an archive
You can gzip the log files before adding to the archive and can continue to update (append) the archive with newer files.
$ ls -l
foo-20130101.log
foo-20130102.log
foo-20130103.log
$ gzip foo*
$ ls -l
foo-20130101.log.gz
foo-20130102.log.gz
foo-20130103.log.gz
$ tar cvf backup.tar foo*gz
Now you have another log file to add to the archive:
$ ls -l
foo-20130104.log
$ gzip foo-20130104.log
$ tar rvf backup.tar foo-20130104.log
$ tar tf backup.tar
foo-20130101.log.gz
foo-20130102.log.gz
foo-20130103.log.gz
foo-20130104.log.gz
If you don't need to use tar, I suggest 7-Zip. It has an 'add' command, which I believe does what you want.
See related SO question: Is there a way to add a folder to existing 7za archive?
Also, the 7-Zip documentation: https://sevenzip.osdn.jp/chm/cmdline/commands/add.htm

unzip all files and save all contents in a single folder - unix

I have a single directory containing multiple zip files that contain .jpg files.
I need to unzip all of them and save their contents (the .jpg files) into a single folder.
Any suggestions on a unix command that does that?
Please note that some of the contents (jpgs) may exist with the same name in multiple zip files; I need to keep all of the jpgs.
thanks
unzip '*.zip' -o -B
Note that these utilities are not installed by default;
see http://www.cyberciti.biz/tips/how-can-i-zipping-and-unzipping-files-under-linux.html regarding installation.
Read about the -B flag to understand its limitations.