I have a directory structure as below
output/a/1/multipleFiles
output/a/2/multipleFiles
output/a/3/multipleFiles
output/b/1/multipleFiles
output/b/2/multipleFiles
output/b/3/multipleFiles
I want to know how many lines each directory contains, i.e. line counts aggregated at the innermost directory level rather than per file. The innermost directories 1, 2 and 3 are different kinds of output we generate for our analytics, and each contains multiple Hadoop part-xxxx files.
I moved to the output directory and tried the command below.
find . -maxdepth 2 -type d -name '*' | awk -F "/" 'NF==3' | awk '{print $0"/*"}' | xargs wc -l
But I am getting errors:
wc: ./a/1/*: No such file or directory
wc: ./a/2/*: No such file or directory
wc: ./a/3/*: No such file or directory
but if I try
wc -l ./a/1/*
I get the correct output for that specific folder.
What am I missing here?
EDIT:
I updated my command as below to remove unnecessary awk commands.
find . -mindepth 2 -maxdepth 2 -type d -name '*' | xargs wc -l
This again results in an error:
wc: ./a/1: Is a directory
wc: ./a/2: Is a directory
wc: ./a/3: Is a directory
Give -execdir a try, for example:
find . -maxdepth 2 -type f -execdir wc -l {} \;
This runs the command wc -l {} from within the directory where each file was found. From the man page:
-execdir The -execdir primary is identical to the -exec primary with
the exception that utility will be executed from the
directory that holds the current file.
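Note that -execdir still reports one count per file. If you want a single total per innermost directory, as the question asks, one sketch (assuming the two-level layout from the question; file names and contents below are hypothetical) is to loop over the innermost directories and count once over all their files combined:

```shell
# Recreate a small version of the layout (contents hypothetical)
mkdir -p output/a/1 output/a/2
printf 'x\ny\n' > output/a/1/part-0000   # 2 lines
printf 'z\n'    > output/a/1/part-0001   # 1 line
printf 'q\n'    > output/a/2/part-0000   # 1 line

# One total per innermost directory: concatenate its files, count once
for d in output/*/*/; do
  printf '%s\t%s\n' "$(cat "$d"/* | wc -l)" "$d"
done
```

This prints one line per directory, e.g. a total of 3 for output/a/1/ and 1 for output/a/2/.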
Related
I have a directory named .poco that has subdirectories at different levels.
Some have *.css files in them. Some don't. The following script fails on
line 4 (the second for loop) if the current directory has no .css files
in it. How do I keep the script running if the current directory doesn't happen to have a match to *.css?
#!/bin/zsh
for dir in ~/pococms/poco/.poco/*; do
if [ -d "$dir" ]; then
for file in $dir/*.css # Fails if directory has no .CSS files
do
if [ -f $file ]; then
v "${file}"
fi
done
fi
done
That happens because of shell globbing. Your shell tries to replace a pattern like *.css with the list of matching files. When nothing matches, zsh's default behaviour is to abort with a "no matches found" error (bash, by contrast, passes the pattern through literally by default).
You might want to use find:
find ~/pococms/poco/.poco -mindepth 2 -maxdepth 2 -type f -name '*.css'
and then pipe through xargs to your program (in this case echo), like:
find ~/pococms/poco/.poco \
    -mindepth 2 \
    -maxdepth 2 \
    -type f \
    -name '*.css' \
    -print0 \
| xargs -0 -n1 echo
-n1 passes the files to your program one at a time; remove it if you want the program to receive the whole list of files as arguments in a single invocation.
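If you'd rather keep the glob loop, zsh also has a built-in fix: the (N) glob qualifier, or setopt null_glob, makes a non-matching pattern expand to nothing instead of raising an error, so `for file in $dir/*.css(N)` simply skips directories with no CSS files. Bash has an equivalent shopt; here is a minimal sketch (directory names hypothetical):

```shell
#!/bin/bash
# bash analogue of zsh's null_glob: an unmatched glob expands to nothing
shopt -s nullglob

mkdir -p glob-demo/empty glob-demo/full
touch glob-demo/full/site.css

for dir in glob-demo/*/; do
  for file in "$dir"*.css; do   # with nullglob the loop simply skips glob-demo/empty
    echo "found: $file"
  done
done
```

Without nullglob, the inner loop would run once for glob-demo/empty with the literal string "glob-demo/empty/*.css" in bash, or abort the script in zsh.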
I have a directory structure like this
/home
/dir-1
some-file.php
/dir-2
sibling.php
target-file.php
/dir-3
/dir-4
other-sibling.php
sibling.php
target-file.php
/dir-5
target-file.php
I need to target all directories containing the file "target-file.php" and remove those directories with its contents. In my structure, the final result wanted is:
/home
/dir-1
some-file.php
/dir-3
I am trying:
rm -rf /home/*/target-file.php
But it is only removing that file (target-file.php) and not the siblings or the parent directory.
Please help
Use this:
#!/bin/bash
find . -type f -name target-file.php -print0 | while IFS= read -r -d '' line
do
echo "$line"
/bin/rm -fr "$(dirname "$line")"
done
Using find with while like this ensures it works with all filenames (see https://mywiki.wooledge.org/BashFAQ/001).
You can run find . -type f -name target-file.php -print to see the list of files.
dirname removes the filename so you are left with only the directory names.
/bin/rm -fr deletes the directories.
You can comment out the echo line; it is only there to show the files and directories being processed.
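If your find is GNU find, a shorter sketch uses -printf '%h' to print each match's parent directory directly, NUL-terminated so unusual names are safe (the demo tree below is a hypothetical fragment of the question's layout):

```shell
# Recreate a fragment of the question's tree
mkdir -p rm-demo/dir-1 rm-demo/dir-2 rm-demo/dir-3
touch rm-demo/dir-1/some-file.php
touch rm-demo/dir-2/sibling.php rm-demo/dir-2/target-file.php

# GNU find: %h prints the directory holding each match
find rm-demo -type f -name target-file.php -printf '%h\0' | xargs -0 rm -rf

ls rm-demo   # dir-1 and dir-3 survive; dir-2 is gone with its contents
```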
I'm trying to use Unix's find command to count the number of executable files of a certain type in a directory, namely Solaris (MSB) executables. I know I can easily get the count of all executable files in this directory with
find . -type f -perm -u+rx | wc -l
however this doesn't count Solaris (MSB) executable files exclusively. I thought I could remedy this by adding a -name flag, something like this:
find . -name "sparc*" -type f -perm -u+rx | wc -l
This correctly returns 6, but only if I remove the part of the command that specifies that the files must be executable; if I keep that part, it returns a count of 0, which is wrong. Looking at the ls -l output below, I can see that these files are executable, I think? Or that they point to an executable? This might be the root of the problem.
ls -l
lrwxrwxrwx 1 root other 57 Jul 15 2005 sparc-sun-solaris2.9-c++ -> /usr/local/gnu/pkg/gcc-3.3.6/bin/sparc-sun-solaris2.9-c++*
Any insight is appreciated.
Try
find -L . -type f -perm -u+rx | wc -l
or
find -L . -name "sparc*" -type f -perm -u+rx | wc -l
or whatever conditions you need.
Option -L instructs find to follow symbolic links instead of processing the link itself. (see e.g. https://www.unix.com/man-page/posix/1p/find/)
For example with the symbolic link
sparc-sun-solaris2.9-c++ -> /usr/local/gnu/pkg/gcc-3.3.6/bin/sparc-sun-solaris2.9-c++*
find should behave as if the file /usr/local/gnu/pkg/gcc-3.3.6/bin/sparc-sun-solaris2.9-c++ were directly located at sparc-sun-solaris2.9-c++.
If your find doesn't support option -L you can try -follow like this:
find . -follow -name "sparc*" -type f -perm -u+rx | wc -l
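A quick way to see the difference (paths and the sparc-tool name below are hypothetical):

```shell
mkdir -p link-demo/bin link-demo/links
printf '#!/bin/sh\n' > link-demo/bin/sparc-tool
chmod u+rx link-demo/bin/sparc-tool
# Relative link target resolves from inside link-demo/links/
ln -s ../bin/sparc-tool link-demo/links/sparc-tool

# The symlink itself is not a regular file, so plain find counts nothing
find link-demo/links -type f -perm -u+rx | wc -l     # prints 0
# With -L find tests the link's target instead of the link itself
find -L link-demo/links -type f -perm -u+rx | wc -l  # prints 1
```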
How can I find files with a specific pattern in the parent and child directories of my present working directory using a single command?
The filename is test.txt, and the file contains the pattern "nslookup".
This file is present in 3 directories and they are /home, /home/1 and /home/1/2
I am currently at /home/1. I have tried below commands :
find ../ -type f -name "test.txt"
Output :
../test.txt
../home/1/test.txt
../home/1/2/test.txt
I was able to find the files, so I tried the command below:
$ find ../ -type f -exec grep "nslookup" {} \;
nslookup
nslookup
nslookup
This doesn't display the file names.
Command :
find . -type f -name "test.txt" | xargs grep "nslookup" ==> gives me the files in the
pwd and child directories:
./1/test.txt:nslookup
./test.txt:nslookup
but when I try to search in the parent directory as shown below the results are erroneous :
find ../ -type f -name "test.txt" | xargs grep "nslookup
User#User-PC ~/test
$ uname -a
CYGWIN_NT-6.1 User-PC 2.5.2(0.297/5/3) 2016-06-23 14:29 x86_64 Cygwin
How about this:
grep -r -l nslookup .. | grep test.txt
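With GNU grep you can also restrict the recursive search to the filename in a single command via --include, which avoids the second grep on the path. A sketch, recreating a hypothetical copy of the question's layout:

```shell
# Recreate the layout: test.txt at three levels, each containing "nslookup"
mkdir -p grep-demo/1/2
for f in grep-demo/test.txt grep-demo/1/test.txt grep-demo/1/2/test.txt; do
  printf 'nslookup\n' > "$f"
done

# GNU grep: -r recurse, -l list matching filenames, --include limits by name
grep -rl --include='test.txt' nslookup grep-demo
```

This prints the three matching test.txt paths and ignores any other files in the tree.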
I want to copy my directory structure excluding the files. Is there any option in the tar to ignore all files and copy only the Directories recursively.
You can use find to get the directories and then tar them:
find . -type d -print0 | xargs -0 tar cf dirstructure.tar --no-recursion
If you have more than about 10000 directories use the following to work around xargs limits:
find . -type d -print0 | tar cf dirstructure.tar --no-recursion --null --files-from -
Directory names that contain spaces or other special characters may require extra attention. For example:
$ mkdir -p "backup/My Documents/stuff"
$ find backup/ -type d | xargs tar cf directory-structure.tar --no-recursion
tar: backup/My: Cannot stat: No such file or directory
tar: Documents: Cannot stat: No such file or directory
tar: backup/My: Cannot stat: No such file or directory
tar: Documents/stuff: Cannot stat: No such file or directory
tar: Exiting with failure status due to previous errors
Here are some variations to handle these cases of "unusual" directory names:
$ find backup/ -type d -print0 | xargs -0 tar cf directory-structure.tar --no-recursion
Using -print0 with find will emit filenames as null-terminated strings; with -0 xargs will interpret arguments that same way. Using null as a terminator helps ensure that even filenames with spaces and newlines will be interpreted correctly.
It's also possible to pipe results straight from find to tar:
$ find backup/ -type d | tar cf directory-structure.tar -T - --no-recursion
Invoking tar with -T - (or --files-from -) will cause it to read filenames from stdin, expecting each filename to be separated by a line break.
For maximum effect this can be combined with options for null-terminated strings:
$ find . -type d -print0 | tar cf directory-structure.tar --null --files-from - --no-recursion
Of these I consider the last version the most robust, because it both supports unusual filenames and (unlike xargs) is not inherently limited by system command-line sizes (see xargs --show-limits).
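To confirm the archive really contains only directory entries, list it with tar tf (GNU tar assumed, matching the --null and --files-from options used above; the demo tree is hypothetical):

```shell
# A tree with a space in one directory name
mkdir -p "tar-demo/My Documents/stuff"

# Archive the directory structure only, no recursion into contents
find tar-demo -type d -print0 \
  | tar cf structure-only.tar --null --files-from - --no-recursion

tar tf structure-only.tar   # lists the three directory entries
```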
find . -type d | while read -r i; do mkdir -p "/tmp/tar_root/${i#./}"; done
pushd /tmp/tar_root
tar cf tarfile.tar *
popd
# rm -fr /tmp/tar_root
Go into the folder you want to start at (that's why we use find with a dot).
Save the tar file somewhere else; I think I got an error leaving it inside the tree being archived.
Use tar r, not c: xargs may invoke tar several times when the list of directories is long, and with cf each invocation recreates the archive, so you only keep the last batch of subdirectories. tar r appends to the archive instead.
Use --no-recursion because find is already supplying the whole list of directories, so you don't want tar to recurse.
find . -type d | xargs tar rf /somewhereelse/whatever-dirsonly.tar --no-recursion
Check what you got with tar tvf /somewhereelse/whatever-dirsonly.tar | more
For AIX:
tar cvfD some-tarball.tar `find /dir_to_start_from -type d -print`