I have a set of zip files, each containing multiple levels of directories. I want to search for some content in a text file that lives in one of those directories, and the file could be in any of the zip files. If the files were unzipped, I would use the following:
grep -r 'pattern' path
I tried using zgrep but it said that the option -r isn't supported. Is there a way to grep through the zipped files?
Thanks in advance.
Try the find command, like this:
find mydir -type f -name "*log.gz" -exec zgrep "pattern" {} \;
The above command searches for the pattern in files named "*log.gz" residing in mydir or in any subdirectory of mydir.
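Note that zgrep only understands gzip-compressed files, not zip archives. For actual .zip files, a sketch along these lines may work (unzip -p streams an archive's contents to standard output; mydir and pattern are the placeholders from the question):
find mydir -type f -name '*.zip' -exec sh -c 'unzip -p "$1" | grep -q "pattern" && echo "$1"' sh {} \;
This prints the name of each archive whose contents match the pattern.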
Related
I have a parent folder with files and many sub-folders with files. I need to copy only the files (not the folder structure) from the parent and its sub-folders into an OutputFolder. Below is the folder structure.
ParentFolder: Parent_1.txt, Parent_2.txt
SubFolder1: Folder1_1.txt, Folder1_2.txt
SubFolder2: Folder2_1.txt, Folder2_2.txt
OutputFolder:
Parent_1.txt, Parent_2.txt, Folder1_1.txt, Folder1_2.txt, Folder2_1.txt, Folder2_2.txt
I tried the code below, but it copies all the files from the sub-folders to the parent folder and only then moves them to the OutputFolder. Also, when I run "sh Filename.sh", I get:
missing argument to `-exec'
cp: cannot stat '20190105'$'\r''/*': No such file or directory
Today=$(date +%Y%m%d -d "today")
mkdir $Today
Yesterday=$(date +%Y%m%d -d "yesterday")
find $Yesterday -iname "*.txt" -exec cp {} $Yesterday \;
cp $Yesterday/* $Today/
I'd appreciate your help with this!
I need to copy files alone from parent and sub-folders to an OutputFolder.
I tried below code, but it copies all the files from sub-folders to parent folder
In order to copy the files directly to the OutputFolder $Today, just specify $Today rather than $Yesterday after -exec cp {}.
I get missing argument to `-exec' cp: cannot stat '20190105'$'\r''/*': No such file or directory.
The \r is a sign of Windows line endings in your script - remove the CRs or save it in Unix format.
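For example, either of these strips the CRs (the first assumes dos2unix is installed):
dos2unix Filename.sh
tr -d '\r' < Filename.sh > Filename.unix.sh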
Use this:
find . -maxdepth 1 -iname "*.txt" -exec cp "{}" $Yesterday \;
to limit the search depth to the current directory. Mind the quotation marks around the curly braces.
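Putting the pieces together, a corrected version of the script could look like this (a sketch; it assumes the CRs have already been removed):
#!/bin/sh
# Copy yesterday's .txt files, including those in sub-folders,
# straight into today's folder.
Today=$(date +%Y%m%d -d "today")
Yesterday=$(date +%Y%m%d -d "yesterday")
mkdir -p "$Today"
find "$Yesterday" -iname "*.txt" -exec cp "{}" "$Today" \;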
We know how to combine find and tar cvf.
How can I combine find's -exec with a command like jar -xvf that takes further arguments after the file name?
The use case is, I need to find specific jar files (e.g. -type f -name 'foo*.jar') in a folder and then extract specific entries from each jar file that find finds: jar -xvf <file> META-INF/services
The general case seems to be that the user wants to exec a command cmd for each file that is found when cmd takes argument(s) after the file.
find -exec lets you substitute the file name anywhere in the command. As in the linked question, you can do this by moving {} to the desired location.
find /path -name '*.jar' -exec jar -xvf {} META-INF/services \;
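If the file name has to appear more than once, or the command needs shell features, a small sh -c wrapper is a common variation (a sketch, not part of the original answer):
find /path -name '*.jar' -exec sh -c 'jar -xvf "$1" META-INF/services && echo "done: $1"' sh {} \;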
I'm trying to rsync .txt files from one folder to another, but the trouble is that many of these files have names containing spaces, so I keep getting an error. What exactly do I need to do to have the script treat the spaces as part of the name, without spelling out the name of each file that has spaces? FWIW, this is on macOS.
do shell script "rsync `find /Volumes/HD/Users/123/Desktop/Test/ -type f -mtime -3` --include='*.txt' --exclude='*' --ignore-existing -raz --progress /Volumes/HD/Users/123/Desktop/TXT/"
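One whitespace-safe approach (a sketch; it assumes the same file selection as above and an rsync with --files-from support, which recent versions have) is to let find emit a NUL-separated list and have rsync read it from standard input:
cd /Volumes/HD/Users/123/Desktop/Test/
find . -type f -mtime -3 -name '*.txt' -print0 | rsync -az --progress --ignore-existing --files-from=- --from0 . /Volumes/HD/Users/123/Desktop/TXT/
Because the list is NUL-separated (--from0), spaces in file names are harmless.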
I have a directory containing folders and subfolders. These subfolders contain, for example, .xml files.
I'd like to copy all of the .xml files into a separate folder. My UNIX is rusty; any and all help would be greatly appreciated. Thanks, Adam
Do you mean copy all .xml files from all subfolders without having to specify the subfolder names?
find . -name \*.xml -exec /bin/cp {} /dest/dir/ \;
Try this command (with the needed changes, of course), e.g.:
cp source_dir/project1/*.xml dest_dir/new_project2/summer2012
Note that you don't have to specify the filenames at the destination when they stay the same.
For more information see the cp man page
find . -iname '*.xml' -exec cp {} /dest/directory \;
I've been stuck on a little unix command line problem.
I have a website folder (4gb) I need to grab a copy of, but just the .php, .html, .js and .css files (which is only a couple hundred kb).
I'm thinking ideally, there is a way to zip or tar a whole folder but only grabbing certain file extensions, while retaining subfolder structures. Is this possible and if so, how?
I did try doing a whole zip, then going through and excluding certain files but it seemed a bit excessive.
I'm kinda new to unix.
Any ideas would be greatly appreciated.
Switch into the website folder, then run
zip -R foo '*.php' '*.html' '*.js' '*.css'
You can also run this from outside the website folder:
zip -r foo website_folder -i '*.php' '*.html' '*.js' '*.css'
You can use find and grep to generate the file list, then pipe that into zip
e.g.
find . | egrep "\.(html|css|js|php)$" | zip -@ test.zip
(-@ tells zip to read the file list from standard input)
This is how I managed to do it, but I also like ghostdog74's version.
tar -czvf archive.tgz `find test/ | egrep ".*\.html|.*\.php"`
You can add extra extensions by adding them to the regex. (Note that the backtick substitution will break on file names containing spaces.)
I liked Nick's answer, but, since this is a programming site, why not use Ant to do this? :)
Then you can put in a parameter so that different types of files can be zipped up.
http://ant.apache.org/manual/Tasks/zip.html
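A minimal build file sketch (the project name, target name and destfile are illustrative, not from the answer):
<project name="site-backup" default="zip-src">
  <target name="zip-src">
    <!-- Zip only the wanted extensions, keeping the folder structure -->
    <zip destfile="site.zip" basedir="website_folder"
         includes="**/*.php, **/*.html, **/*.js, **/*.css"/>
  </target>
</project>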
You may want to use GNU find to locate all your .php, .html, etc. files, then tar them up:
find /path -type f \( -iname "*.php" -o -iname "*.css" -o -iname "*.js" -o -iname "*.html" \) -exec tar -rf test.tar {} +
After that you can compress it, e.g. with gzip.
You could write a shell script to copy files based on a pattern/expression into a new folder, zip the contents and then delete the folder. Now, as for the actual syntax of it, I'll leave that to you :D.
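For what it's worth, a minimal sketch of that idea (the staging folder name and extension list are illustrative, and cp --parents is GNU-only):
#!/bin/sh
# Copy matching files into a staging folder (keeping their paths),
# zip it up, then delete the staging folder.
mkdir staging
find . -path ./staging -prune -o -type f \( -name '*.php' -o -name '*.html' -o -name '*.js' -o -name '*.css' \) -exec cp --parents {} staging/ \;
zip -r site.zip staging
rm -rf staging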