Renaming a file with the parameter of another unix - unix

Hey all, I am attempting to rename all files that match a certain pattern in the Bourne shell, but I am stuck on the syntax of the mv command to rename each file.
I am finding all the files like this, and I know I have to pipe the output of this command into the mv command, but I just can't figure it out. Here is the find command, with a placeholder where the rename step should go:
find . -iname "f????.a" -print0 | (some command that renames the files that have been found)
Any help on this is greatly appreciated.

find . -iname "f????.a" -exec mv {} {}.img \;
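GNU find substitutes {} even when it is embedded in a larger argument such as {}.img, but not every find implementation does, so a more portable sketch (my addition, assuming a POSIX sh is available) hands the rename off to a small shell command:
find . -iname "f????.a" -exec sh -c 'mv "$1" "$1.img"' _ {} \;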

Related

How to have "make clean" ignore folders when deleting

I was given a Makefile for an assignment that gives me a make clean command. With the way the repository is set up, it deletes everything in the /bin and /out folders, except for a file called .gitignore. This is what the command looks like:
clean:
	find out/ ! -name .gitignore -type f -delete && \
	find bin/ ! -name .gitignore -type f -delete
Now that I'm doing my project, I need to store things in folders called /bin/fonts and /bin/word_lists. I'm trying to modify the command so that it ignores these two folders. The only problem is, I don't know what language these commands are written in, so I don't even know where to start looking at the syntax. Could somebody point me in the right direction? I tried something like this:
clean:
	find out/ ! -name .gitignore -type f -delete && \
	find bin/ ! -name .gitignore ! -name fonts/FreeSans.ttf -type f -delete
But it still deletes everything in fonts, and even if it did work the way I wanted, that doesn't really solve the problem of saving every single font in the folder.
I also tried this:
clean:
	find out/ ! -name .gitignore -type f -delete && \
	find ./bin -mindepth 1 ! -name .gitignore ! -regex '^./fonts/\(/.*\)?' ! -regex '^./word_lists/\(/.*\)?' -delete
following this post, but it instead deleted everything INCLUDING the folders bin/fonts as well as bin/word_lists.
Any help would be greatly appreciated!
-name does not examine the full file path; it only matches against the file name (so -name FreeSans.ttf would match, but it would match that file name in any directory).
The predicate you are looking for is -path, but then you need to specify a pattern for the entire path.
clean:
	find out/ bin/ ! -name .gitignore ! -path 'bin/fonts/*' \
	! -path 'bin/word_lists/*' -type f -delete
(Notice also how I condensed the find to traverse two directories at the same time. I assume you mean bin not /bin; perhaps see also Difference between ./ and ~/)
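Before running anything with -delete it can be worth dry-running the same expression with -print, just to see which files would go (my suggestion, not part of the original answer):
find out/ bin/ ! -name .gitignore ! -path 'bin/fonts/*' ! -path 'bin/word_lists/*' -type f -print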

Combine find and jar -xvf?

We know how to combine find and tar cvf.
How do I use -exec on find to run a command like jar -xvf on each file?
The use case is: I need to find specific jar files (e.g. -type f foo*.jar) in a folder and then extract specific entries from each jar file that find finds: jar -xvf <file> META-INF/services
The general case seems to be that the user wants to exec a command cmd for each file that is found when cmd takes argument(s) after the file.
find -exec lets you substitute the file name anywhere in the command. As in the linked question, you can do this by moving {} to the desired location.
find /path -name '*.jar' -exec jar -xvf {} META-INF/services \;
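To match the asker's scenario more closely (foo*.jar is their hypothetical name pattern), the test can be narrowed before the -exec:
find /path -type f -name 'foo*.jar' -exec jar -xvf {} META-INF/services \;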

Problem redirecting output of find to a file

I am trying to put the result of a find command into a text file from a unix bash shell.
Using:
find ~/* -name "*.txt" -print > list_of_txt_files.list
However, list_of_txt_files.list stays empty and I have to kill the find to get the command prompt back. I do have many txt files in my home directory.
Alternatively, how do I save the result of a find command to a text file from the command line? I thought that the above should work.
The first thing I would do is use single quotes (some shells will expand the wildcards, though I don't think bash does, at least by default). Also, the first argument to find should be a directory, not a list of files:
find ~ -name '*.txt' -print > list_of_txt_files.list
Beyond that, it may just be taking a long time, though I can't imagine anyone having that many text files (you say you have a lot but it would have to be pretty massive to slow down find). Try it first without the redirection and see what it outputs:
find ~ -name '*.txt' -print
You can redirect output to a file and console together by using tee.
find ~ -name '*.txt' -print | tee result.log
This will send the output to the console and to a file, so you don't have to guess whether the command is actually executing.
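If you would rather keep results from earlier runs (my addition, not part of the original answer), tee -a appends to the file instead of overwriting it:
find ~ -name '*.txt' -print | tee -a result.log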
Here is what worked for me:
find . -name '*.zip' -exec echo {} \; > zips.out
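For what it's worth, plain -print gives the same list without spawning echo for every match (my note, not the original poster's):
find . -name '*.zip' -print > zips.out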

Find command in unix

I want to perform a find command in a directory, and exclude from the set of results all files that are .gif, .jpeg, and .class.
I was wondering if someone could help me out. I've been trying to play with the regex option, but clearly I'm not doing it properly.
Something like:
find . \! -name '*.gif' \! -name '*.jpeg' \! -name '*.class'
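If differently-capitalized extensions such as .JPEG or .GIF should be excluded too, -iname can stand in for -name (my addition; -iname is available in GNU and BSD find):
find . \! -iname '*.gif' \! -iname '*.jpeg' \! -iname '*.class'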

how do I zip a whole folder tree in unix, but only certain files?

I've been stuck on a little unix command line problem.
I have a website folder (4gb) I need to grab a copy of, but just the .php, .html, .js and .css files (which is only a couple hundred kb).
I'm thinking ideally, there is a way to zip or tar a whole folder but only grabbing certain file extensions, while retaining subfolder structures. Is this possible and if so, how?
I did try doing a whole zip, then going through and excluding certain files but it seemed a bit excessive.
I'm kinda new to unix.
Any ideas would be greatly appreciated.
Switch into the website folder, then run
zip -R foo '*.php' '*.html' '*.js' '*.css'
You can also run this from outside the website folder:
zip -r foo website_folder -i '*.php' '*.html' '*.js' '*.css'
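Either way, the resulting archive can be inspected afterwards to confirm the subfolder structure was kept (my suggestion):
unzip -l foo.zip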
You can use find and grep to generate the file list, then pipe that into zip
e.g.
find . | egrep "\.(html|css|js|php)$" | zip -@ test.zip
(-@ tells zip to read a file list from stdin)
This is how I managed to do it, but I also like ghostdog74's version.
tar -czvf archive.tgz `find test/ | egrep ".*\.html|.*\.php"`
You can add extra extensions by adding them to the regex.
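Extended to cover all four extensions from the question, it could look like this (a sketch; note that the backtick substitution will break on file names containing spaces):
tar -czvf archive.tgz `find test/ | egrep "\.(html|php|js|css)$"`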
I liked Nick's answer, but since this is a programming site, why not use Ant to do this? :)
Then you can put in a parameter so that different types of files can be zipped up.
http://ant.apache.org/manual/Tasks/zip.html
You may want to use find (GNU) to find all your php, html, etc. files, then tar them up:
find /path -type f \( -iname "*.php" -o -iname "*.css" -o -iname "*.js" -o -iname "*.ext" \) -exec tar -r --file=test.tar "{}" +
After that you can gzip it up.
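For example (my addition; this leaves test.tar.gz, the same format as a .tgz):
gzip test.tar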
You could write a shell script to copy files based on a pattern/expression into a new folder, zip the contents and then delete the folder. Now, as for the actual syntax of it, I'll leave that to you :D.
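A minimal sketch of that idea, assuming GNU cp (for --parents) and a scratch directory of my own choosing:
# copy matching files into a scratch folder, preserving the directory layout
mkdir /tmp/site_copy
cd website_folder
find . -type f \( -name '*.php' -o -name '*.html' -o -name '*.js' -o -name '*.css' \) -exec cp --parents {} /tmp/site_copy \;
# zip the scratch folder from inside it, then remove it
(cd /tmp/site_copy && zip -r ../site.zip .)
rm -rf /tmp/site_copy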
