I have a file (reviews_dataset.tar.gz) that contains many files which contain data. I am required to extract the files in this archive and then perform some basic commands on them. So far I have created a directory named CW and found a command, tar zxvf fileNameHere.tgz, but when I run this it of course cannot find my file, as I have not "downloaded" it into my directory yet. How do I get this file into my directory so that I can then extract it? Sorry if this is poorly worded, I am extremely new to this.
You must either run the command from the directory your file exists in, or provide a relative or absolute path to the file. Let's do the latter:
cd /home/jsmith
mkdir cw
cd cw
tar zxvf /home/jsmith/Downloads/fileNameHere.tgz
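If you would rather move the archive into your new directory first and extract it there, that works too (the Downloads path below is just an assumption about where your browser saved the file):
cd /home/jsmith/cw
cp /home/jsmith/Downloads/reviews_dataset.tar.gz .
tar zxvf reviews_dataset.tar.gz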
You should use the command with the options preceded by a dash, like this:
tar -zxvf filename.tar.gz
If you want to specify the directory in which to extract all the files, use -C:
tar -zxf filename.tar.gz -C /root/Desktop/folder
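Note that the directory given to -C has to exist already; tar will not create it for you. A minimal sketch (the folder path is just the example from above):
mkdir -p /root/Desktop/folder
tar -zxf filename.tar.gz -C /root/Desktop/folder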
I am using below command to untar
tar -xvf <filename.tar>
I want to make sure the extracted files' modified time is updated to the current time.
GNU tar has an option for this:
-m, --touch
don't extract file modified time
With this option, extracted files have the time of extraction as their last-modified.
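For example, combined with the usual extraction flags (same placeholder archive name as above):
tar -xvmf <filename>.tar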
Anyway, I found an answer. I looped through all the files in the tar and used the touch command to change the modified time.
tar -xvf <filename>.tar
for f in $(tar -tf <filename>.tar); do touch "$f"; done
I'm trying to tar a folder with subdirectories, but I want to exclude all folders with the name "log".
I have searched and seen that the tar command has an --exclude option; the problem is that this option seems to want a specific folder rather than a dynamic pattern.
Is there any other way?
So far the command I have is:
tar czf ROOT/backup/servers/20150504.tar.gz ./servers --exclude=".*log.*"
If you want to exclude all folders with the name "log", probably using -X is more convenient. Here is an example:
$ find ./servers -type d -name "*log*" > excludefiles
$ tar czf ROOT/backup/servers/20150504.tar.gz -X excludefiles ./servers
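For what it's worth, GNU tar's --exclude option does accept shell-style wildcards, so a pattern similar to the one you tried should also work if written as a glob rather than a regex (a sketch assuming GNU tar):
$ tar czf ROOT/backup/servers/20150504.tar.gz --exclude="*log*" ./servers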
I have some files that are .py and others that are .txt. Instead of
cp *.py myDir/
cp *.txt myDir/
is there a way to perform this in one line on the command line?
Thanks
Try this:
cp *.{py,txt} myDir/
You can find more info about *nix wildcards here.
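If you want to see what the shell actually runs, prefixing the command with echo is a harmless way to preview the expansion (myDir/ is the same example target as above):
echo cp *.{py,txt} myDir/
The braces are expanded first into *.py and *.txt, and each glob is then matched against the files in the current directory.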
I've been stuck on a little unix command line problem.
I have a website folder (4 GB) I need to grab a copy of, but just the .php, .html, .js and .css files (which total only a couple hundred KB).
I'm thinking ideally, there is a way to zip or tar a whole folder but only grabbing certain file extensions, while retaining subfolder structures. Is this possible and if so, how?
I did try doing a whole zip, then going through and excluding certain files but it seemed a bit excessive.
I'm kinda new to unix.
Any ideas would be greatly appreciated.
Switch into the website folder, then run
zip -R foo '*.php' '*.html' '*.js' '*.css'
You can also run this from outside the website folder:
zip -r foo website_folder -i '*.php' '*.html' '*.js' '*.css'
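Either way, you can double-check what ended up in the archive by listing its contents (foo.zip is the archive created above):
unzip -l foo.zip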
You can use find and grep to generate the file list, then pipe that into zip
e.g.
find . | egrep "\.(html|css|js|php)$" | zip -@ test.zip
(-@ tells zip to read the file list from stdin)
This is how I managed to do it, but I also like ghostdog74's version.
tar -czvf archive.tgz `find test/ | egrep ".*\.html|.*\.php"`
You can add extra extensions by adding them to the regex.
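If any of the paths could contain spaces, a null-delimited variant is safer; this is a sketch assuming GNU tar and the same test/ directory:
find test/ -type f \( -name '*.html' -o -name '*.php' \) -print0 | tar -czvf archive.tgz --null -T -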
I liked Nick's answer, but, since this is a programming site, why not use Ant to do this? :)
Then you can put in a parameter so that different types of files can be zipped up.
http://ant.apache.org/manual/Tasks/zip.html
You may want to use find (GNU) to find all your php, html, etc. files, then tar them up:
find /path -type f \( -iname "*.php" -o -iname "*.css" -o -iname "*.js" -o -iname "*.html" \) -exec tar -r --file=test.tar "{}" +
After that you can gzip it.
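For example, a sketch of that last step using the test.tar built above:
gzip test.tar   # produces test.tar.gz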
You could write a shell script to copy files based on a pattern/expression into a new folder, zip the contents and then delete the folder. Now, as for the actual syntax of it, I'll leave that to you :D.
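For completeness, here is one rough sketch of what such a script could look like; every path and pattern in it is an assumption, so adapt as needed (it also relies on GNU cp for --parents):
#!/bin/sh
SRC=website_folder            # assumed source folder
STAGE=$(mktemp -d)            # temporary staging folder
# copy matching files into the staging folder, keeping the directory structure
find "$SRC" -type f \( -name '*.php' -o -name '*.html' -o -name '*.js' -o -name '*.css' \) -exec cp --parents {} "$STAGE" \;
# zip the staged copy, then throw the staging folder away
(cd "$STAGE" && zip -r /tmp/site_code.zip .)
rm -rf "$STAGE"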