We know how to combine find and tar cvf.
How can I use find's -exec to run a command like jar -xvf on each file it finds?
The use case: I need to find specific jar files (e.g. -type f -name 'foo*.jar') in a folder and then extract specific entries from each jar file that find finds: jar -xvf <file> META-INF/services
The general case seems to be that the user wants to exec a command cmd for each file found, where cmd takes argument(s) after the file name.
find -exec lets you substitute the file name anywhere in the command. As in the linked question, you can do this by moving {} to the desired location:
find /path -name '*.jar' -exec jar -xvf {} META-INF/services \;
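Note that jar extracts into the current working directory, so every archive's entries land wherever you run the command from. If you instead want each jar's entries extracted next to the jar itself, one option (a sketch, not part of the original answer) is to wrap the command in a shell:
find /path -name '*.jar' -exec sh -c 'cd "$(dirname "$1")" && jar -xvf "$(basename "$1")" META-INF/services' sh {} \;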
I have a parent folder with files and many sub-folders with files. I need to copy only the files (not the folders) from the parent and sub-folders to an OutputFolder. Below is the folder structure.
ParentFolder: Parent_1.txt, Parent_2.txt
SubFolder1: Folder1_1.txt, Folder1_2.txt
SubFolder2: Folder2_1.txt, Folder2_2.txt
OutputFolder:
Parent_1.txt, Parent_2.txt, Folder1_1.txt, Folder1_2.txt, Folder2_1.txt, Folder2_2.txt
I tried the code below, but it copies all the files from the sub-folders into the parent folder and only then moves them to the OutputFolder. Also, when I run it with "sh Filename.sh", I get:
missing argument to `-exec'
cp: cannot stat '20190105'$'\r''/*': No such file or directory
Today=$(date +%Y%m%d -d "today")
mkdir $Today
Yesterday=$(date +%Y%m%d -d "yesterday")
find $Yesterday -iname "*.txt" -exec cp {} $Yesterday \;
cp $Yesterday/* $Today/
Any help would be appreciated!
I need to copy only the files from the parent and sub-folders to an OutputFolder.
I tried the code below, but it copies all the files from the sub-folders into the parent folder
In order to copy the files directly to the OutputFolder $Today, just specify $Today rather than $Yesterday after -exec cp {}.
I get missing argument to `-exec' cp: cannot stat '20190105'$'\r''/*': No such file or directory.
The \r is a sign of Windows line endings in your script - remove the CRs or save it in Unix format.
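For example, either of these commands strips the CRs (assuming the script is named Filename.sh, as in the question):
dos2unix Filename.sh
sed -i 's/\r$//' Filename.sh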
Use this:
find . -maxdepth 1 -iname "*.txt" -exec cp "{}" $Yesterday \;
to limit the search to the current directory only. Mind the quotation marks around the curly brackets.
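Putting the fixes together, the script might look like this (a sketch; it assumes the goal is to copy yesterday's .txt files, including those in sub-folders, straight into today's folder):
#!/bin/sh
Today=$(date +%Y%m%d -d "today")
Yesterday=$(date +%Y%m%d -d "yesterday")
mkdir -p "$Today"
# Copy regular files only, from $Yesterday and all its sub-folders,
# directly into $Today (no intermediate copy into the parent folder).
find "$Yesterday" -type f -iname "*.txt" -exec cp {} "$Today" \;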
I have a set of zip files with multiple levels of directories in them. I want to find some content from a text file in one of those directories, which could be in any of the zip files. If the files were unzipped, I would use the following:
grep -r 'pattern' path
I tried using zgrep, but it said that the option -r isn't supported. Is there a way to grep through the zipped files?
Thanks in advance.
Try the find command, like this:
find mydir -type f -name "*log.gz" -exec zgrep "pattern" {} \;
The above command will search for the pattern in files named *log.gz in mydir or any of its subdirectories.
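Note that zgrep handles gzip-compressed files. If your archives are actual .zip files, zipgrep (shipped with Info-ZIP's unzip) searches an archive's members instead, and it combines with find the same way; a sketch:
find mydir -type f -name '*.zip' -exec zipgrep 'pattern' {} \;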
Good evening fellow computational authors,
I'm trying to run the following script:
find . -name '*.php' -exec /search_replace.sh {} \;
so that it runs search_replace.sh on all the .php files in a folder and its sub-folders. I keep getting the error:
find: /search_replace.sh: No such file or directory
Any assistance?
Change
/search_replace.sh
to
./search_replace.sh
or whatever the full path to the script is...
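A separate thing worth checking (an assumption, not part of the original answer): the script must also be executable, otherwise find reports a permission error instead:
chmod +x ./search_replace.sh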
I've been stuck on a little unix command line problem.
I have a website folder (4 GB) I need to grab a copy of, but I only need the .php, .html, .js and .css files (which add up to only a couple hundred KB).
I'm thinking that, ideally, there is a way to zip or tar a whole folder while grabbing only certain file extensions and retaining the subfolder structure. Is this possible, and if so, how?
I did try doing a whole zip, then going through and excluding certain files, but it seemed a bit excessive.
I'm kinda new to unix.
Any ideas would be greatly appreciated.
Switch into the website folder, then run
zip -R foo '*.php' '*.html' '*.js' '*.css'
You can also run this from outside the website folder:
zip -r foo website_folder -i '*.php' '*.html' '*.js' '*.css'
You can use find and grep to generate the file list, then pipe that into zip.
e.g.
find . | egrep "\.(html|css|js|php)$" | zip -@ test.zip
(-@ tells zip to read the list of file names from stdin)
This is how I managed to do it, but I also like ghostdog74's version.
tar -czvf archive.tgz `find test/ | egrep ".*\.html|.*\.php"`
You can add extra extensions by adding them to the regex.
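Note that the backtick version splits file names on whitespace, so it breaks on names containing spaces. A more robust variant (a sketch, assuming GNU find and GNU tar) pipes a NUL-separated list straight into tar:
find test/ -type f \( -name '*.html' -o -name '*.php' \) -print0 | tar -czvf archive.tgz --null -T -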
I liked Nick's answer, but, since this is a programming site, why not use Ant to do this? :)
Then you can put in a parameter so that different types of files can be zipped up.
http://ant.apache.org/manual/Tasks/zip.html
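A minimal build.xml along those lines might look like this (the file and folder names are assumptions; the includes list is the parameter you would vary):
<project name="zip-src" default="zip">
  <target name="zip">
    <!-- Zip only the listed file types, keeping the folder structure -->
    <zip destfile="site.zip" basedir="website_folder"
         includes="**/*.php,**/*.html,**/*.js,**/*.css"/>
  </target>
</project>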
You may want to use GNU find to find all your .php, .html, etc. files, then tar them up:
find /path -type f \( -iname "*.php" -o -iname "*.css" -o -iname "*.js" -o -iname "*.html" \) -exec tar -rf test.tar {} +
After that you can compress it, e.g. with gzip test.tar.
You could write a shell script to copy files matching a pattern/expression into a new folder, zip the contents, and then delete the folder. As for the actual syntax, I'll leave that to you :D (though see the sketch below).
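A minimal sketch of that idea (the folder and archive names are assumptions; cp --parents, which recreates the subfolder structure, is GNU coreutils):
#!/bin/sh
mkdir staging
# Copy matching files into staging/, preserving their relative paths
find website_folder -type f \( -name '*.php' -o -name '*.html' -o -name '*.js' -o -name '*.css' \) -exec cp --parents {} staging \;
zip -r site.zip staging
rm -rf staging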
I have extended globbing (extglob) enabled in my Bash via
shopt -s extglob
It may be useful in solving the problem.
I ran the following unsuccessfully, since it also moves directories:
$ mv `find . -maxdepth 1` django-tes/
I am trying to find all files, excluding directories, and move them to a directory called django-tes/.
How can I move all files, but not the directories, from one folder to another in the terminal?
Try using find . -maxdepth 1 -type f (GNU find wants -maxdepth before tests such as -type).
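Combined with mv, that might look like this (a sketch; -exec handles file names containing spaces, and mv -t assumes GNU coreutils):
find . -maxdepth 1 -type f -exec mv -t django-tes/ {} +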