unix: count the number of jpeg files recursively, except for one subfolder in every folder?

I think the code to count all the jpeg files recursively in a folder is,
find . -type f -name "*.jpeg" | wc -l
but I now realize I need to exclude some subfolders...
For instance, my folder consists of 5 subfolders, and in each subfolder there is a sub-subfolder named "meh" containing jpeg files that I don't want included in my count... Could anyone let me know how to do that?
Thanks so much for your guidance.

You can do this with find's -prune or -regex options.
find . -name meh -prune -o -name '*.jpeg' -print | wc -l
find . -not -regex '.*/meh/.*' -a -name '*.jpeg' -print | wc -l
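With explicit grouping the -prune version reads as below; the extra -type f is my addition, just so a directory that happens to be named something.jpeg would not be counted:
find . \( -name meh -prune \) -o \( -type f -name '*.jpeg' -print \) | wc -l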

Since find includes the relative path of each file, you could do this:
find . -type f -name "*.jpeg" | grep -vc /meh/

Use any grep variant to filter the output of find.
While you're at it, use grep's count option, -c.
The -v option inverts the match: only lines that do not match the given pattern are listed (and counted).
find . -type f -name "*.jpeg" | egrep -c -v "/meh/"
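One caveat that applies to all of these counts: a filename containing an embedded newline would be counted more than once by wc -l or grep -c. If that matters and GNU find is available, a newline-safe sketch prints one character per match instead of one line:
find . -name meh -prune -o -type f -name '*.jpeg' -printf '.' | wc -c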

Related

Searching for particular files in a directory non-recursively using find (AIX)

I have a script which has the following command. I am trying to edit it so that it only searches the files in the directory of the given path without going into its subdirectories, i.e. a non-recursive search.
find {Path} -name "cor*" -type f -exec ls -l {} \;
Example: the command should give cor123.log only and not cor456.log. Currently it gives both:
<Path>
  cor123.log
  <directory>
    cor456.log
I tried using -maxdepth but it's not supported in AIX. -prune and -depth didn't help either.
Will appreciate any help. Thanks
You can use
find . -name . -o -prune
to find files and directories non-recursively.
So in your case this one will work:
find . -name . -o -prune -name 'cor*' -type f -exec ls -l {} \;
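To see why this works, here is the same command with the implicit grouping written out (just a sketch for illustration): the starting directory matches -name ., every other entry matches -prune, so find never descends below the first level, and the remaining tests apply only to those first-level entries:
find . -name . -o \( -prune -name 'cor*' -type f -exec ls -l {} \; \)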
Do you need find for selecting files only?
When you know that all files starting with cor are regular files, you can use
ls -l ${Path}/cor*
or
ls -l ${Path}/cor*.log
When you do need -type f, you can filter the results instead.
A filename cannot contain a /, so drop every match that still has a / somewhere after the Path.
We do not know the last character of ${Path}: it might itself be a /, which would make a filter like grep -Ev "${Path}/.*/" useless, so at least one character must be required after the Path before looking for the next / (e.g. with Path=/logs, the pattern /logs..*/ matches /logs/sub/cor456.log but not /logs/cor123.log).
find "${Path}" -name "cor*" -type f 2>/dev/null| grep -Ev "${Path}..*/" | xargs -ls
Late answer, but it may save someone some time. On AIX:
find /some/directory/* -prune -type f -name "*.log"
In other words, make your path end with a slash and a wildcard (/*), then -prune:
find /some/directory/* -prune -name "cor*" -type f -exec ls -l {} \;
Tested.

unix find command in terminal does not work

I need to write a command that will search the current directory and all of its sub-directories for files ending with ~, and/or files that start or end with #. The command must show and erase all files found. Only one command is allowed: no ';' or '&&' or other shenanigans.
here is my command:
find . -name "#*" -o -name "*#" -o -name "*~" -print -delete
but it erases only the files ending in ~
You forgot to enclose the conditions in parentheses (). Without them, only the last condition triggers the actions -print and -delete.
The default operator is AND (-a), which binds more tightly than -o and does not need parentheses; that's why most simple find commands, such as find -type f -name "pattern" -print, work without them.
You should try:
find . \( -name "#*" -o -name "*#" -o -name "*~" \) -print -delete
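For comparison, this is how find parses the original command once the implicit -a operators are made explicit (a sketch): -print and -delete attach only to the last -name test, which is why only the ~ files were erased:
find . -name "#*" -o -name "*#" -o \( -name "*~" -a -print -a -delete \)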
How about using the -print0 primary in conjunction with xargs -0, like this:
find . -type f -print0 | xargs -0 rm
za:temp za$ ls
file.txt file.txt~
za:temp za$ find . -name "*~" -print0 | xargs -0 rm
za:temp za$ ls
file.txt
Or with xargs -I {} plus your command, which does the same thing:
# xargs -I {} substitutes each path produced by find
find . -iname "*something*" | xargs -I {} rm {}
Edit: if you can't see the files that start with # when running find ., then the file names contain spaces, like # file.txt. You will need to find files with spaces using something like find . -name "* *" and then remove the spaces.
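If you prefer the xargs route for all three patterns at once, a null-delimited sketch (assuming GNU find and xargs; the -type f is my addition to restrict it to regular files) would be:
find . \( -name "#*" -o -name "*#" -o -name "*~" \) -type f -print0 | xargs -0 rm -f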

How to move or copy files listed by 'find' command in unix?

I have a list of certain files that I see using the command below, but how can I copy those files listed into another folder, say ~/test?
find . -mtime 1 -exec du -hc {} +
Adding to Eric Jablow's answer, here is a possible solution (it worked for me on Linux Mint 14 / Nadia):
find /path/to/search/ -type f -name "glob-to-find-files" | xargs cp -t /target/path/
You can refer to "How can I use xargs to copy files that have spaces and quotes in their names?" as well.
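If the file names may contain spaces or quotes, a null-delimited version of the same pipeline is safer (a sketch assuming GNU find, xargs and cp, since -print0, -0 and -t are extensions):
find /path/to/search/ -type f -name "glob-to-find-files" -print0 | xargs -0 cp -t /target/path/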
Actually, you can feed the find command's output to a copy command in two ways:
If the find command's output doesn't contain any spaces, i.e. if the filenames don't contain spaces, then you can use:
Syntax:
find <Path> <Conditions> | xargs cp -t <copy file path>
Example:
find -mtime -1 -type f | xargs cp -t inner/
But production data files might well contain spaces, so most of the time this form is the effective one:
Syntax:
find <path> <condition> -exec cp '{}' <copy path> \;
Example:
find -mtime -1 -type f -exec cp '{}' inner/ \;
In the second example, the trailing semicolon is part of the find command itself and has to be escaped (\;) so that the shell does not consume it. Otherwise you will get an error something like:
find: missing argument to `-exec'
find /PATH/TO/YOUR/FILES -name NAME.EXT -exec cp -rfp {} /DST_DIR \;
If you're using GNU find,
find . -mtime 1 -exec cp -t ~/test/ {} +
This works as well as piping the output into xargs while avoiding the pitfalls of doing so (it handles embedded spaces and newlines without having to use find ... -print0 | xargs -0 ...).
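On systems whose cp lacks the GNU -t option, the same batching can still be had by letting a small shell wrapper put the destination last (a portable sketch):
find . -mtime 1 -exec sh -c 'cp "$@" ~/test/' sh {} +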
This is the best way for me:
cat filename.tsv |
while IFS= read -r FILENAME
do
    sudo find /PATH_FROM/ -maxdepth 4 -name "$FILENAME" -exec cp '{}' /PATH_TO/ \;
done

Unix find directory with missing file

How can I use find to identify those directories that do not contain a file with a specified name? I want to create a file that contains a list of all directories missing a certain file.
Find directories:
find -type d > DIRS
Find directories with the file:
find -type f -name 'SpecificName' | sed 's!/[^/]*$!!' > FILEDIRS
Find the difference:
grep -vf FILEDIRS DIRS
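Note that grep treats each FILEDIRS line as a substring regular expression, so a subdirectory such as ./a/b/c would also be filtered out merely because ./a/b contains the file, and regex metacharacters in directory names would be interpreted. Whole-line fixed-string matching avoids both (a sketch):
grep -vxFf FILEDIRS DIRS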
There are quite a few different ways you could go about this - here's the approach I would take (bash assumed):
while IFS= read -r d
do
    [[ -r "${d}/SpecificFile.txt" ]] || echo "${d}"
done < <(find . -type d -print)
If your target directories only exist at a certain depth, there are other options you could add to find to limit the number of directories to check...
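A single find command can also do the check directly, without temporary files (a sketch; SpecificFile.txt is just the placeholder name from above, and substituting {} inside a longer argument relies on GNU or BSD find rather than strict POSIX): it prints every directory for which test -e fails on the expected file:
find . -type d ! -exec test -e '{}/SpecificFile.txt' \; -print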
This is an example for a file named main.cpp: it finds all directories that do not contain main.cpp. First I get a list of all folders, then a list of the folders that do contain main.cpp, and then run diff to get the difference between the two lists:
diff \
<(find . -exec readlink -f {} \; | sed 's/\(.*\)\/.*$/\1/' | sort | uniq) \
<(find . -name main.cpp -exec readlink -f {} \; | sed 's/\(.*\)\/.*$/\1/' | sort | uniq) \
| sed -n 's/< \(.*\)/\1/p'

Unix one-liner to extract substring from all files in a directory and subdirectories

I'm looking for a UNIX one-liner that will output to a file all occurrences of NSLocalizedString (from that word to the end of the line) in all files in the current directory and all subdirectories. I've googled, but haven't found a solution.
find . -type f -exec fgrep NSLocalizedString {} \+ | \
sed -e 's/^.*\(NSLocalizedString.*\)$/\1/' > ../your_output_file
find <directory> -type f -print | xargs grep NSLocalizedString | tee <outputfile> should do what you're looking for, if I understand the question right...
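If GNU grep is available, the whole task can also be a single recursive grep (a sketch, writing to the parent directory as in the first answer so the output file is not itself searched): -r recurses, -h drops the file name prefixes, and -o prints only the matching part of each line:
grep -rho 'NSLocalizedString.*' . > ../your_output_file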
