Size of files in subdirectories - unix

I would like to know if there is an easy way to compute the total size of files in subdirectories on Unix. I am interested in all the .js files in a folder and its subdirectories, and I have tried combining du -ah with grep *.js, but it does not work. Any help is appreciated. Thanks.

I don't think there is a way with du alone, but you can use awk to sum the size column (the seventh field of find -ls output):
find . -iname "*.js" -ls | awk '{sum += $7} END {print sum}'

This is for all Java files:
> find . -name "*.java" | xargs du -a | awk '{sum+=$1} END {print sum}'
2774
so you can modify it to:
find . -name "*.js" | xargs du -a | awk '{sum+=$1} END {print sum}'
Note that du reports disk usage in blocks (1K by default on most systems) rather than exact byte counts, and that xargs will mishandle file names containing spaces.

Try the command below; it will print the total at the end:
find . -name '*.js' -exec du {} \; | awk '{sum += $1} END {print sum " total"}'

Or, using stat to get exact byte counts:
find . -name '*.js' -exec stat -c %s '{}' + | awk '{ sum += $0 } END { print sum }'
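If your find supports -printf (GNU find), you can get exact byte counts without stat at all. A minimal sketch, assuming GNU find and any POSIX awk:

find . -name '*.js' -printf '%s\n' | awk '{ sum += $1 } END { printf "%.1f MiB\n", sum / 1024 / 1024 }'

This sums the sizes in bytes and prints a human-readable total, and it avoids the word-splitting problems of piping file names through xargs.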

Related

Speedup find command

I am getting a file count and a total size using the same find criteria, but currently I am running find twice, as shown below. How can I perform both operations in one pass and eliminate one find?
file_cnt[$i]=$(find $dir_name -type f -ctime +$ctime1 -ctime -$ctime2 | wc -l)
file_size[$i]=$(find $dir_name -type f -ctime +$ctime1 -ctime -$ctime2 | xargs --no-run-if-empty --max-procs=2 du -s | awk '{sum += $1} ; END {printf "%.2f", sum/1024**2}')
Try something like this:
read "file_cnt[$i]" "file_size[$i]" << EOF
$(find $dir_name -type f -ctime +$ctime1 -ctime -$ctime2 | xargs --no-run-if-empty --max-procs=2 du -s | awk '{count++;sum += $1} ; END {printf "%d %.2f", count, sum/1024**2}')
EOF
I would suggest using find to print the file sizes and awk to do the sum and the file count:
$ find $dir_name -type f -ctime +$ctime1 -ctime -$ctime2 -printf "%s\n" | awk '{s += $1} END {print NR, s}'
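Combining both suggestions, here is a sketch that fills both array entries in one pass; it assumes GNU find for -printf and reuses the variable names from the question. Note the unit differs from the original: %s is bytes, so dividing by 1024**2 gives MiB, whereas the du -s version summed 1K blocks.

read -r "file_cnt[$i]" "file_size[$i]" << EOF
$(find "$dir_name" -type f -ctime +"$ctime1" -ctime -"$ctime2" -printf '%s\n' | awk '{ n++; sum += $1 } END { printf "%d %.2f", n, sum / 1024 / 1024 }')
EOF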

How to calculate total size of all css, js and html pages separately using shell script?

I have the following code in a shell script:
find . -name "*.css" -printf '%16f Size: %6s\n'
This gives me the file size of every css file. How do I modify this to get the sum of all the file sizes?
You could use awk:
find . -name "*.css" -type f -printf '%s\n' | awk '{ tot+=$0 } END { print tot }'
Or in pure bash:
total=0
while read -r s; do
    total=$(( total + s ))
done < <(find . -name "*.css" -type f -printf '%s\n')
echo "$total"
In 2 steps:
1) ls -l *.css | tr -s " " > filename.txt
2) awk 'BEGIN {x=0} {x+=$5} END {print x}' filename.txt
Note that this only covers the current directory, not subdirectories, since ls -l does not recurse.
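To get the totals for css, js and html separately, a small loop sketch (again assuming GNU find for -printf):

for ext in css js html; do
    total=$(find . -name "*.${ext}" -type f -printf '%s\n' | awk '{ sum += $1 } END { print sum + 0 }')
    echo "${ext}: ${total} bytes"
done

The sum + 0 makes awk print 0 instead of an empty string when no files of that type exist.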

Unix script to recursively search a directory and sub directories to grep and print content between 2 patterns in file

I have some files in a directory and its subdirectories. I need to search all the files, and print the file name and the content between 2 matching patterns in each file.
For example, let's say my files look like below.
File1.txt:
Pattern1
ABCDEFGHI
Pattern2
dafoaf
fafaf
dfadf
afadf
File2.txt
Pattern1
XXXXXXXXX
Pattern2
kdfaf
adfdaf
fdafad
I need to get following output
File1.txt:
ABCDEFGHI
File2.txt:
XXXXXXXXX
and so on for all the files under the directory and its subdirectories, separated by new lines.
This might work for you:
find . \
-type f \
-exec awk 'FNR==1 {print FILENAME ":"} /Pattern1/ { p=1 ; next } /Pattern2/ {p=0} p==1 {print $0} END {print ""}' {} \;
Note, this prints the FILENAME, even if Pattern1 was not found!
This will work for you:
Create this shell script as my_grep.sh:
#!/bin/sh
# record the line numbers of the two pattern matches in this file
grep -nH "Pattern" "$1" > temp
if [ `grep -c "$1" temp` -eq 2 ]; then
    limits=`grep "$1" temp | cut -f2 -d:`
    lower_limit=`echo $limits | cut -f1 -d" "`
    upper_limit=`echo $limits | cut -f2 -d" "`
    echo "$1:"
    # print the lines strictly between the two matches
    head -`expr $upper_limit - 1` "$1" | tail -`expr $upper_limit - $lower_limit - 1`
fi
Use the find command to search for files and run this shell script on each one:
$ find ./test -type f -exec ./my_grep.sh {} \;
./test/File1.txt:
ABCDEFGHI
./test/File2.txt:
XXXXXXXXX
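A shorter alternative sketch using sed instead of a helper script (assuming GNU sed; Pattern1 and Pattern2 as in the question):

find ./test -type f | while IFS= read -r f; do
    echo "$f:"
    sed -n '/Pattern1/,/Pattern2/{ /Pattern1/d; /Pattern2/d; p; }' "$f"
done

The range address selects everything from Pattern1 through Pattern2, and the two d commands drop the pattern lines themselves, leaving only the content in between.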

KSH sort filenames

I'm searching through a number of directories for "searchstring", and then running a script on each $file:
for file in `find $dir -name ${searchstring}'*'`;
do
echo $file >> $debug
script.sh $file >> $output
done
My $debug file yields the following:
/root/0007_searchstring/out/filename_20120105_020000.log
/root/0006_searchstring/out/filename_20120105_010000.log
/root/0005_searchstring/out/filename_20120105_013000.log
(filename is _yyyymmdd_hhmmss.log)
...
Is there a way to get find to order by filename or by mktime? Should I pipe find to sort first? Make an array then sort it as per this question?
If you want to ignore the directory path and sort by the file name alone, you should be able to sort on the /-separated field that holds the file name, which is the fifth field in your example paths:
for file in `find $dir -name ${searchstring}'*' | sort --field-separator=/ --key=5`;
Use ls -t if you need to regenerate the list ordered by timestamp, or sort -n if the list is fairly static.
To sort by modification time, you can use stat with find:
$ find . -exec stat -c '%Y %n' {} \; | sort -n | cut -d ' ' -f 2-
You can pipe the output of find through sort to order by path name; since the embedded timestamp is zero-padded yyyymmdd_hhmmss, names within the same directory also come out in chronological order:
find $dir -name "${searchstring}*" | sort | while IFS= read -r file
do
echo "$file" >> $debug
script.sh "$file" >> $output
done
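If you have GNU find, you can also sort by modification time without invoking stat once per file, using %T@ (seconds since the epoch) and stripping that column after sorting:

find "$dir" -name "${searchstring}*" -printf '%T@ %p\n' | sort -n | cut -d ' ' -f 2-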

How to find file extension using UNIX?

I need to find the file extension of each file to be processed using Unix. The two file extensions I will be handling are '.dat' and '.csv'. Please let me know how this can be done.
Please let me know how this can be done.
find . -name "*.dat" -o -name "*.csv"
Finds in the current directory and recursively down, all files that end in those two extensions.
So, my stab at this:
filename=file.dat
extension=$(echo "${filename}" | awk -F. '{print $NF}')
if [ "${extension}" = "dat" ]; then
your code here
fi
Echo the variable ${filename} and pipe the output to awk. With awk, set the field separator to a dot, then pick up the last field (the print $NF part), which also copes with names containing more than one dot.
Is this what you want?
find . -name "*.dat"
find . -name "*.csv"
find /path -type f \( -name "*.dat" -o -name "*.csv" \) | while read -r file
do
echo "Do something with $file"
done
If you have the filename in a variable:
filename=test.csv
then just use this to get the "csv" part:
echo ${filename##*.}
This works in bash, and it works in ksh too, since ${var##pattern} is POSIX parameter expansion.
edit:
filename=test.csv
fileext=${filename##*.}
if [ "$fileext" = "csv" ]; then
    echo "file is csv, do something"
elif [ "$fileext" = "dat" ]; then
    echo "file is dat, do something"
else
    echo "mhh what now?"
fi
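As a follow-up, a case statement is the idiomatic shell way to branch on an extension, and it avoids extracting the extension into a variable at all. A small sketch (the process function and its messages are just illustrative names):

process() {
    case "$1" in
        *.csv) echo "file is csv, do something with $1" ;;
        *.dat) echo "file is dat, do something with $1" ;;
        *)     echo "mhh what now? ($1)" ;;
    esac
}
process test.csv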
