Command help on UNIX

I need to know whether there is a command on UNIX such that:
it gives me all the files in the current directory that were updated after some time t.

You can use find with a time test. -mtime counts in whole 24-hour units, so for sub-day precision use -mmin (minutes); for example, files modified within the last 90 minutes:
find . -maxdepth 1 -mmin -90

You can use the find command for this.
Touch a file with your specific date and then use that file with the -newer test of find.
# To find all files modified after 00:00 on 10 Dec:
touch -t 12100000 foo
# the -t format here is MMDDhhmm
find ./ -maxdepth 1 -type f -newer foo
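If the cutoff time t needs a year and seconds, touch accepts the longer [[CC]YY]MMDDhhmm[.SS] form; a minimal sketch (the marker path and timestamp are just examples):
# files in the current directory modified after 2024-01-15 10:30:00
touch -t 202401151030.00 /tmp/marker
find . -maxdepth 1 -type f -newer /tmp/marker
rm -f /tmp/marker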

Use the find command with appropriate arguments; the relevant tests are summarised below.

#find files by modification time
-------------------------------
find . -mtime 1 # modified between 24 and 48 hours ago
find . -mtime -7 # last 7 days
find . -mtime -7 -type f # just files
find . -mtime -7 -type d # just dirs
find with a time string: this works on Mac OS X
--------------------------------------
find / -newerct '1 minute ago' -print
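GNU find offers the same -newerXY family; with Y set to t the reference is a timestamp string rather than a file, so an absolute cutoff can be given directly (a sketch; assumes a reasonably recent GNU findutils or a BSD find):
find . -maxdepth 1 -newermt '2014-02-01 18:00' -print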

Related

How to list directories older than 30 days, non-recursively

I want to list the folders in UNIX which are older than a month. I have tried the following options:
find . -type d -mtime +30
This lists matching directories recursively; however, I want a non-recursive listing.
Add the maxdepth option:
find . -maxdepth 1 -type d -mtime +30

Find files by extension

How can I modify the command below to find all files modified in the last day that have a .log extension?
Here is the command so far:
find . -mtime -1 -print
find . -name \*.log -mtime -1 -print
find . -mtime -1 -iname '*.log'
Note: quote or escape the pattern (as above); if you leave *.log unquoted, the shell may expand the wildcard itself before find sees it whenever matching files exist in the current directory.
Use the -name option to find files by name:
find . -mtime -1 -name "*.log" -print
Notice the use of the wildcard character * to match all files ending in .log.
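For the quoting question above, all of the following keep the shell from expanding the pattern itself, so find receives a literal *.log in each case (backslash escaping, single quotes and double quotes are equivalent here):
find . -mtime -1 -name '*.log'
find . -mtime -1 -name "*.log"
find . -mtime -1 -name \*.log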

GNU find: Search in current directory first

How can I tell find to look in the current folder first and then continue searching in subfolders? I have the following:
$ find . -iname '.note'
folder/1/.note
folder/2/.note
folder/3/.note
folder/.note
What I want is this:
$ find . -iname '.note'
folder/.note
folder/1/.note
folder/2/.note
folder/3/.note
Any ideas?
find's algorithm is as follows:
For each path given on the command line, let the current entry be that path, then:
1. Match the current entry against the expression.
2. If the current entry is a directory, then perform steps 1 and 2 for every entry in that directory, in an unspecified order.
With the -depth primary, steps 1 and 2 are executed in the opposite order.
What you're asking find to do is to consider files before directories in step 2. But find has no option to do that.
In your example, all names of matching files come before names of subdirectories in the same directory, so find . -iname '.note' | sort would work. But that obviously doesn't generalize well.
One way to process non-directories before directories is to run a find command to iterate over directories, and a separate command (possibly find again) to print matching files in that directory.
find -type d -exec print-matching-files-in {} \;
If you want to use a find expression to match files, here's a general structure for the second find command to iterate only over non-directories in the specified directory, non-recursively (GNU find required):
find -type d -exec find {} -maxdepth 1 \! -type d … \;
For example:
find -type d -exec find {} -maxdepth 1 \! -type d -iname '.note' \;
In zsh, you can write
print -l **/(#i).note(Od)
**/ recurses into subdirectories; (#i) (a globbing flag) interprets what follows as a case-insensitive pattern, and (Od) (a glob qualifier) orders the outcome of recursive traversals so that files in a directory are considered before subdirectories. With (Odon), the output is sorted lexicographically within the constraint laid out by Od (i.e. the primary sort criterion comes first).
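A usage sketch of the sorted variant mentioned above; note that the (#i) globbing flag requires the extendedglob option, which I am assuming is not already set:
setopt extendedglob
print -l **/(#i).note(Odon)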
A workaround would be find . -iname '.note' | sort -r:
$ find . -iname '.note' | sort -r
folder/.note
folder/3/.note
folder/2/.note
folder/1/.note
But here, the output is just sorted in reverse order and that does not change find's behaviour.
For me with GNU find on Linux I get both orderings with different test runs.
Testcase:
rm -rf /tmp/depthtest ; mkdir -p /tmp/depthtest ; cd /tmp/depthtest ; for dir in 1 2 3 . ; do mkdir -p $dir ; touch $dir/.note ; done ; find . -iname '.note'
With this test I get the poster's first result. Note the ordering 1 2 3 . in the loop. If I alter this ordering to . 1 2 3
rm -rf /tmp/depthtest ; mkdir -p /tmp/depthtest ; cd /tmp/depthtest ; for dir in . 1 2 3 ; do mkdir -p $dir ; touch $dir/.note ; done ; find . -iname '.note'
I get the poster's second result.
In either case adding -depth to find does nothing.
EDIT:
I wrote a Perl one-liner to look into this further:
perl -e 'opendir(DH,".") ; print join("\n", readdir(DH)),"\n" ; closedir(DH)'
And I ran this against /tmp/depthtest after running testcase 1 with these results:
.
..
1
2
3
.note
I ran it again after testcase 2 with these results:
.
..
.note
1
2
3
This confirms that the results are in directory order.
The -depth option to find only controls whether, e.g., ./1/.note is processed before or after ./1/; it does not control whether ./.note or ./1/ comes first, so the order of the results is purely directory order (which is mostly creation order).
It might be helpful to look at How do I recursively list all directories at a location, breadth-first? to learn how to work around this problem.
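Another pragmatic workaround with GNU find is to print each match's depth alongside its path, sort numerically on the depth, and strip it again; find's traversal is unchanged, only the output is reordered (a sketch relying on GNU find's -printf):
find . -iname '.note' -printf '%d %p\n' | sort -n -k1,1 | cut -d' ' -f2-
Matches at depth 1 then come before matches at depth 2, which gives the ordering the question asks for.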
Doesn't find -s . -iname '.note' help? Or find . -iname '.note' | sort?
Find in the current folder only:
find ./in_save/ -maxdepth 1 -type f | more
(73 matches in this example.)

find without recursion

Is it possible to use the find command in some way that it will not recurse into the sub-directories? For example,
DirsRoot
|-->SubDir1
| |-OtherFile1
|-->SubDir2
| |-OtherFile2
|-File1
|-File2
And the result of something like find DirsRoot --do-not-recurse -type f will be only File1, File2?
I think you'll get what you want with the -maxdepth 1 option, based on your current command structure. If not, you can try looking at the man page for find.
Relevant entry (for convenience's sake):
-maxdepth levels
Descend at most levels (a non-negative integer) levels of directories below the command line arguments. `-maxdepth 0' means only apply the tests and actions to the command line arguments.
Your options basically are:
# Do NOT show hidden files (beginning with ".", i.e., .*):
find DirsRoot/* -maxdepth 0 -type f
Or:
# DO show hidden files:
find DirsRoot/ -maxdepth 1 -type f
I believe you are looking for -maxdepth 1.
If you are looking for a POSIX-compliant solution:
cd DirsRoot && find . -type f -print -o -name . -o -prune
-maxdepth is not a POSIX-compliant option.
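Spelled out, the expression works like this (same command, annotated as a sketch; evaluation is left to right and stops at the first alternative that matches):
# regular files       : -type f matches, so they are printed
# the start point "." : -name . matches, so it is not pruned and its
#                       contents are still examined
# any other directory : falls through to -prune, so find never descends into it
cd DirsRoot && find . -type f -print -o -name . -o -prune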
Yes, it is possible by using the -maxdepth option of find:
find /DirsRoot/* -maxdepth 0 -type f
From the manual
man find
-maxdepth levels
Descend at most levels (a non-negative integer) levels of directories below the starting-points.
-maxdepth 0 means only apply the tests and actions to the starting-points themselves.

Shell Script — Get all files modified after <date>

I'd rather not do this in PHP, so I'm hoping someone decent at shell scripting can help.
I need a script that runs through a directory recursively and finds all files whose last-modified date is later than some given date. Then it will tar and zip the file(s), keeping the path information.
As simple as:
find . -mtime -1 | xargs tar --no-recursion -czf myfile.tgz
where find . -mtime -1 selects (recursively) all files in the current directory modified within the last day. You can use fractions, for example:
find . -mtime -1.5 | xargs tar --no-recursion -czf myfile.tgz
If you have GNU find, then there are a legion of relevant options. The only snag is that the interface to them is less than stellar:
-mmin n (modification time in minutes)
-mtime n (modification time in days)
-newer file (modification time newer than modification time of file)
-daystart (adjust start time from current time to start of day)
Plus alternatives for access time and 'change' or 'create' time.
The hard part is determining the number of minutes since a time.
One option worth considering: use touch to create a file with the required modification time stamp; then use find with -newer.
touch -t 200901031231.43 /tmp/wotsit
find . -newer /tmp/wotsit -print
rm -f /tmp/wotsit
This looks for files newer than 2009-01-03T12:31:43. Clearly, in a script, /tmp/wotsit would be a name with the PID or other value to make it unique; and there'd be a trap to ensure it gets removed even if the user interrupts, and so on and so forth.
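A minimal sketch of that wrapper, assuming mktemp for the unique name the answer alludes to (the timestamp is the same example as above):
#!/bin/sh
marker=$(mktemp) || exit 1
trap 'rm -f "$marker"' EXIT INT TERM
touch -t 200901031231.43 "$marker"
find . -newer "$marker" -print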
You can do this directly with tar, and even better:
tar -N '2014-02-01 18:00:00' -jcvf archive.tar.bz2 files
This instructs tar to archive and compress files newer than 1 February 2014, 18:00:00.
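One caveat worth noting: GNU tar's -N/--newer also considers inode status changes (ctime), so a file whose permissions changed can sneak in; if you strictly want modification time, --newer-mtime is the closer match to find -mtime:
tar --newer-mtime '2014-02-01 18:00:00' -jcvf archive.tar.bz2 files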
This works as long as the file list is not huge. You want to include -print0 and xargs -0 in case any of the paths have spaces in them. This example looks for files modified in the last 7 days; to find those modified more than 7 days ago, use +7.
find . -mtime -7 -print0 | xargs -0 tar -cjf /foo/archive.tar.bz2
Be warned that xargs can cause the tar command to be executed multiple times if there are a lot of arguments, and the -c flag would then make each invocation overwrite the previous archive. In that case, you would want this:
find . -mtime -7 -print0 | xargs -0 tar -rf /foo/archive.tar
You can't update a zipped tar archive with tar, so you would have to bzip2 or gzip it in a second step.
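The second step could then be, for example:
gzip /foo/archive.tar     # leaves /foo/archive.tar.gz
or bzip2 /foo/archive.tar for a .bz2.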
This should show all files modified within the last 7 days.
find . -type f -mtime -7 -print
Pipe that into tar/zip, and you should be good.
I would simply do the following to back up all files newer than 7 days ago:
tar --newer $(date -d'7 days ago' +"%d-%b") -zcf thisweek.tgz .
Note that you can also replace '7 days ago' with anything that suits your needs.
It can be: date -d'yesterday' +"%d-%b"
Or even: date -d'first Sunday last month' +"%d-%b"
Well, under Linux, try reading the man page of the find command:
man find
Something like this should work (GNU tar reads the file list from stdin with -T -, with --null pairing with find's -print0; the archive name is just an example):
find . -type f -mtime -7 -print0 | tar --null -T - -cf - | gzip -9 > lastweek.tar.gz
and you have it.
You can get a list of files last modified later than x days ago with:
find . -mtime -x
Then you just have to tar and zip files in the resulting list, e.g.:
tar czvf mytarfile.tgz `find . -mtime -30`
for all files modified during the last month.
This script finds files whose modification time falls within two minutes before or after the given date (you can change the values in the conditions to suit your requirement):
PATH_SRC="/home/celvas/Documents/Imp_Task/"
PATH_DST="/home/celvas/Downloads/zeeshan/"

cd "$PATH_SRC" || exit 1

TODAY=$(date -d "$(date +%F)" +%s)        # midnight today, as epoch seconds
TODAY_TIME=$(date -d "$(date +%T)" +%s)   # current time, as epoch seconds

for f in *
do
    # echo "File -> $f"
    MOD_DATE=$(stat -c %y "$f")
    MOD_DATE=${MOD_DATE% *}               # strip the timezone field
    # echo MOD_DATE: $MOD_DATE
    MOD_DATE1=$(date -d "$MOD_DATE" +%s)  # modification time as epoch seconds
    # echo MOD_DATE1: $MOD_DATE1

    DIFF_IN_DATE=$(( MOD_DATE1 - TODAY ))
    DIFF_IN_DATE1=$(( MOD_DATE1 - TODAY_TIME ))
    # echo DIFF: $DIFF_IN_DATE
    # echo DIFF1: $DIFF_IN_DATE1

    if [[ ($DIFF_IN_DATE -ge -120) && ($DIFF_IN_DATE1 -le 120) && ($DIFF_IN_DATE1 -ge -120) ]]
    then
        echo "File lies in the two-minute window: $f"
        echo "MOD_DATE: $MOD_DATE"
        # mv "$PATH_SRC/$f" "$PATH_DST/$f"
    fi
done
For example, if you want only files whose modification date is before the given date, change 120 to 0 in the $DIFF_IN_DATE condition and discard the conditions on $DIFF_IN_DATE1.
Similarly, if you want files modified within one hour before or after the given date, just replace 120 with 3600 in the if condition.
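When the reference date is simply "now", a rough find-only equivalent of the loop above (non-recursive, files modified within the last two minutes; it will not catch future-dated files) is:
find . -maxdepth 1 -type f -mmin -2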
