finding files in unix based on creation/modification date

How do I find files on a Unix server which were created/modified in the previous month?
For example, if the current month is Jul, then the files which were created/modified in Jun should be listed.

One way is to execute this command:
ls -laR | grep monthName
where monthName could be Jan, Feb, Mar, and so on (remember to change the working directory to the directory you're interested in; also notice that this method is recursive, so all sub-directories will be inspected).
With this you also retrieve all the file permissions and so on.
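For instance, to list June entries (note that grep matches the string anywhere in the line, so a filename containing "Jun" would match too):
cd /directory/of/interest
ls -laR | grep ' Jun '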
I'm sure there are better ways (if they come to mind, I'll edit this post), but since I'm on a coffee break, this is the fastest one I could find.

In order to find files modified in the previous month, you will need to use find with a set range, for example:
cd / (if you want to start from the root)
find . -type f -mtime +26 -mtime -56 -print
GNU find takes plain day counts for -mtime; BSD find also accepts unit suffixes such as +26d. You should adjust the range to cover the dates you wish to include.
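A sketch of computing that range automatically, assuming GNU date is available; -mtime rounds to 24-hour units, so the month boundaries are approximate and should be checked as noted above:
monthStart=$(date -d "$(date +%Y-%m-01) -1 month" +%s)
monthEnd=$(date -d "$(date +%Y-%m-01)" +%s)
now=$(date +%s)
find . -type f -mtime -$(( (now - monthStart) / 86400 )) -mtime +$(( (now - monthEnd) / 86400 )) -print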
All the best to you!

monthToFind=`date -d "1 month ago" "+%Y-%m"`
find . -printf "%TY-%Tm %p\n" | egrep "^$monthToFind " | sed "s/^$monthToFind //g"
This will be slower than using a time range in find. But the time range is hard to determine, and quickly becomes invalid, possibly even while the command is executing.
Unfortunately this will miss files modified last month when they were also modified this month. I don't know of a way to determine these files.
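The drifting-range problem, at least, can be avoided with GNU find's -newermt predicate (an assumption: GNU find is available), which takes explicit dates instead of day offsets:
monthStart=$(date -d "$(date +%Y-%m-01) -1 month" +%Y-%m-%d)
monthEnd=$(date +%Y-%m-01)
find . -type f -newermt "$monthStart" ! -newermt "$monthEnd" -print
The miss described above still applies, though: a file's mtime only records its latest modification.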

Related

Script Issues with find -> tar/gzip

I am currently working on a script to store/back up our old files, so that we have more space on our server. This script will be used as a cronjob to back up the files every week. My script currently looks like this:
#!/bin/bash
currentDate=$(date '+%Y%m%d%T' | sed -e 's/://g')
find /Directory1/ -type f -mtime +90 | xargs tar cvf - | gzip > /Directory2/Backup$currentDate.tar.gz
find /Directory1/ -type f -mtime +90 -exec rm {} \;
The script is at first saving the current Date + Timestamp(without ":") as a variable. Afterwards it searches for files older than 90 days, tars them and finally makes a gzip out of them, which has the name "Backup$currentDate.tar.gz".
Then it's supposed to find the files again and remove them.
I do however have some issues here:
Directory1 consists of multiple directories. It does find the files and creates the gz file, but while some files are zipped properly (for instance /DirName1/DirName2/DirName3/File), others appear directly in the "root" dir. What could be the issue here?
Is there a way to tell the Script, to only create the gz file, if files are found? Because currently, we get gz files, even if there was nothing found, leading to empty directories.
Can I somehow use the find output later on (store it in a variable?), so that the remove at the end really only targets the files found in the step before? Because if the third step took, let's say, an hour and the last step were executed after it finished, it could potentially remove files that weren't older than 90 days before but are now, so they would never be backed up, but then deleted (highly unlikely, but not impossible).
If there's anything else you need to know, feel free to ask ^^
Best regards
I've "rephrased" your original code a bit. I don't have an AIX machine to test anything, so DO NOT cut and paste this. Using this code, you should be able to address your issues. To wit:
It makes a record of what files it intends to operate on ($BFILES).
This record can be used to check for empty tar files.
This record can be used to see why your find is producing "funny" output. It wouldn't surprise me to find that xargs hit a space character.
This record can be used to delete exactly the files archived.
As a child, I had a serious accident with xargs and have avoided it ever since. Maybe there is a safe version out there.
#!/bin/bash
# I don't have an AIX machine to test this, so exit immediately until
# someone can proof this code.
exit 1
currentDate=$(date '+%Y%m%d%T' | sed -e 's/://g')
BFILES=/tmp/Backup$currentDate.files
find /Directory1 -type f -mtime +90 -print > "$BFILES"
# Here is the time to proofread the file list, $BFILES
# The AIX page I read lists the '-L' option to take filenames from an
# input file. I've found xargs to be sketchy unless you are very
# careful about quoting.
#tar -c -v -L "$BFILES" -f - | gzip -9 > "/Directory2/Backup$currentDate.tar.gz"
# I would rather loop over the input file one well-quoted line at a
# time than use the faster, less safe xargs. But here it is.
#xargs rm < "$BFILES"
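If GNU find and GNU tar are available (an assumption; the original targets AIX), NUL-delimited filenames sidestep the quoting hazards entirely. A minimal sketch, using the same hypothetical paths as above:
#!/bin/bash
currentDate=$(date '+%Y%m%d%H%M%S')
BFILES=/tmp/Backup$currentDate.files
# NUL-delimited list, so spaces and newlines in names are safe
find /Directory1 -type f -mtime +90 -print0 > "$BFILES"
# only archive and delete when something was actually found
if [ -s "$BFILES" ]; then
    tar -czf "/Directory2/Backup$currentDate.tar.gz" --null -T "$BFILES"
    xargs -0 rm -- < "$BFILES"
fi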

How to check if a file is older than 1 week and then move to archive folder

How to check if a file is older than 1 week and then move it to an archive folder? Using some if condition, we should check whether a file exists that is more than one week old, then move it to the archive.
This should work:
find <path> -atime +1w -exec mv {} <dest_dir> \;
The -atime argument tests the last access time; you may want ctime (last inode/metadata change, not creation) or mtime (last modification timestamp) instead. +1w means more than one week; the unit suffix is BSD find syntax, so with GNU find use -atime +7 (plain days).
The -exec argument lets find run a command on each file that meets the criterion. {} is a placeholder filled with the path of the file found.
Read the manual for more criteria.
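Since the question asks for an explicit if condition, a minimal sketch assuming GNU find, keeping the same <path> and <dest_dir> placeholders:
if find <path> -type f -mtime +7 -print -quit | grep -q .; then
    find <path> -type f -mtime +7 -exec mv {} <dest_dir> \;
fi
The first find exits at the first match (-quit), so the existence check stays cheap even on large trees.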

how to delete older carbon data automatically?

Is there any configuration to delete older carbon data automatically after a certain period of time?
I tried searching it but could not find anything about it.
Thanks in advance for any suggestion and answer.
Graphite as such doesn't support deletion yet. I would advise decreasing the retention in your storage-schemas configuration to store data only as long as you need it, so as to really solve this 'problem'.
Still, you can run a cron job at regular intervals to do so. The following would delete any .wsp file that hasn't been touched in over a day.
Using GNU find:
find /opt/graphite/storage/whisper -name '*.wsp' -mtime +1 -delete
Is this for whisperfiles that are no longer being written to? Or is this just for older data in an existing metric?
if the former you can run something like
find <whisperpath> -iname "*.wsp" -mtime +<number of days lower limit>
that will list the files that fit the criteria.
If you want to delete them in the same command, append -delete:
find <whisperpath> -iname "*.wsp" -mtime +<number of days lower limit> -delete
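A hypothetical crontab entry tying this together (the 30-day cutoff is an example value, not a recommendation):
# run nightly at 03:00; delete whisper files untouched for more than 30 days
0 3 * * * find /opt/graphite/storage/whisper -name '*.wsp' -mtime +30 -delete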

How to find the files that are created in the last hour in unix

How to find the files that are created in the last hour in unix
If the dir to search is srch_dir then either
$ find srch_dir -cmin -60 # change time
or
$ find srch_dir -mmin -60 # modification time
or
$ find srch_dir -amin -60 # access time
shows files whose metadata has been changed (ctime), the file contents itself have been modified (mtime), or accessed (atime) in the last hour, respectively.
ctime is not intuitive and warrants further explanation:
ctime:
Unlike mtime, which relates only to the contents of a file, the change timestamp indicates the last time some metadata of a file was changed.
For example, if the permission settings of a file were modified, ctime will indicate it.
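A quick demonstration of the difference, assuming GNU coreutils stat (%y is mtime, %z is ctime):
touch demo.txt
stat -c 'mtime: %y  ctime: %z' demo.txt
chmod 600 demo.txt
stat -c 'mtime: %y  ctime: %z' demo.txt   # ctime updated, mtime unchanged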
UNIX filesystems (generally) don't store creation times. Instead, there are only access time, (data) modification time, and (inode) change time.
That being said, find has -atime -mtime -ctime predicates:
$ man 1 find
...
-ctime n
The primary shall evaluate as true if the time of last change of
file status information subtracted from the initialization time,
divided by 86400 (with any remainder discarded), is n.
...
Thus find -ctime 0 finds everything for which the inode has changed (e.g. it includes file creation, but also counts link-count, permission, and file-size changes) less than a day ago.
Another approach is to create a reference file with a known timestamp and let find compare against it with -newer. The basic code is:
#create a temp. file
echo "hi" > t.tmp
# set the file's timestamp to a point in the past (here 2004-05-12 11:20;
# with GNU touch you could write: touch -d '2 hours ago' t.tmp)
touch -t 200405121120 t.tmp
# then check for files newer than it
find /admin/dump -type f -newer t.tmp -print -exec ls -lt {} \; | pg
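With GNU find the temporary file can be skipped entirely, assuming the -newermt predicate is available (it accepts date strings):
find /admin/dump -type f -newermt '1 hour ago' -print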
find ./ -ctime -1 -type f   (status changed within the last day)
OR
find ./ -cmin -60 -type f   (status changed within the last hour)
sudo find / -Bmin -60
From the man page:
-Bmin n
True if the difference between the time of a file's inode creation and
the time find was started, rounded up to the next full minute, is n
minutes.
Obviously, you may want to set this up a bit differently, but this primary seems the best solution for searching for any file created in the last N minutes. Note that -Bmin tests the inode birth time and is available in BSD/macOS find; the leading minus in -Bmin -60 means "less than", i.e. within the last 60 minutes. Check the find man page for more details.
To find files accessed in the last hour in the current directory, you can use -amin:
find . -amin -60 -type f
This will find files whose access time is within the last hour. Note that -amin tracks access, not creation.

Locating most recently updated file recursively in UNIX

For a website I'm working on I want to be able to automatically update the "This page was last modified:" section in the footer as I'm doing my nightly git commit. Essentially I plan on writing a shell script to run at midnight each night which will do all of my general server maintenance. Most of these tasks I already know how to automate, but I have a file (footer.php) which is included in every page and displays the date the site was last updated. I want to be able to recursively look through my website and check the timestamp on every file, then if any of these were edited after the date in footer.php I want to update this date.
All I need is a UNIX command that will recursively iterate through my files and return ONLY the date of the last modification. I don't need file names or what changes were made, I just need to know a single day (and hopefully time) that the most recently updated file was changed.
I know using "ls -l" and "cut" I could iterate through every folder to do this, but I was hoping for a quicker-running and easier command. Preferably a single-line shell command (possibly with a -R parameter)
The find outputs all the modification times in Unix epoch format, then sort and take the biggest.
Converting into whatever date format is wanted is left as an exercise for the reader:
find /path -type f -iname "*.php" -printf "%T@\n" | sort -n | tail -1
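Since the conversion is "left as an exercise", here is a sketch assuming GNU date, stripping the fractional seconds that %T@ prints:
newest=$(find /path -type f -iname "*.php" -printf "%T@\n" | sort -n | tail -1)
date -d "@${newest%.*}" '+%Y-%m-%d %H:%M:%S'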
GNU find
find /path -type f -iname "*.php" -printf "%T+\n" | sort | tail -1
Check the find man page to play with other -printf specifiers.
You might want to look at an inotify script that updates the footer every time any other file is modified, instead of scanning the whole file system for new updates.
