How to delete logs greater than 100 MB in Unix - unix

I want to delete logs which are greater than 100 MB, and it should not return or delete the current month's logs.

Using (GNU) find & date:
find /path -type f -size +100M -mtime +$(date --date=yesterday +%d) -delete
where /path is where your logs are located. -mtime +$(date --date=yesterday +%d) means any file last modified more than N days ago, where N is yesterday's day of the month; in other words, anything last modified before the start of the current month.
Make sure you test this before you use it, say by replacing -delete with -ls to print the files instead of deleting them, or by prompting before each deletion with -exec rm -i {} \;.
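For instance, the two test variants would look like this (a direct substitution into the command above; /path is the same placeholder):
# dry run: print matching files instead of deleting them
find /path -type f -size +100M -mtime +$(date --date=yesterday +%d) -ls
# or ask for confirmation before each deletion
find /path -type f -size +100M -mtime +$(date --date=yesterday +%d) -exec rm -i {} \;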

With find you can use age and size filters and then rm:
find /path/to/files -mtime +30 -size +100M -exec rm {} \;
I haven't tested it.

Related

Modify permission of a moved file in unix

I am moving a file from my server storage area to NAS (network attached storage).
The move command is in my script and is executed daily, as:
find . \ -mtime +0 -exec mv {} target \ ;
From my knowledge, the above command is supposed to move files last modified between 24 and 48 hours ago. By default the moved file gets a permission of 64. Now I wanted to change the permission to 644 for these moved files. Note that the file is moved from the server to the NAS. I added the below command in the script:
find target \ -mtime 0 -exec chmod 644 {} \ ;
Note the 0 and not +0 in the above command, but this does not seem to work. Is it because -mtime +0 means last modified between 24 and 48 hours ago, and that is retained when the file is moved from the server to the NAS? What would be the appropriate mtime value for me to use in this case? I know a simple way out may be to give -mtime -3, which will modify permissions for all files modified less than 72 hours ago, but I do not want this command to run over all 3 days' worth of moved files, just the ones moved via my command
find . \ -mtime +0 -exec mv {} target \ ;
I do not understand the backslashes.
Your find moves all files to one target folder.
When you are OK with that (and have write permission in all dirs), you can do
find . -mtime +0 -exec chmod 644 {} \; -exec mv {} target \;
You have more control for simple dirs/files (without spaces/other special chars) with
find . -mtime +0 | while read file; do
  filename=${file##*/}   # strip the directory part, keep the basename
  mv ${file} ${target}
  chmod 644 ${target}/${filename}
done
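If the names may contain spaces or other special characters after all, a null-delimited loop is safer (a sketch, assuming GNU find and bash; target is the same destination variable as in the loop above):
find . -mtime +0 -type f -print0 | while IFS= read -r -d '' file; do
  mv "$file" "$target"                  # move first, then fix the mode at the destination
  chmod 644 "$target/${file##*/}"
done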

Delete directory based on date

I am writing a zsh script in which I have to get the date of the 90th previous day from the current date, i.e. I have to subtract 90 days from the current date. Then I have to check the folders, which have dates as their names. I have to compare each directory's date with the subtracted date and, if the directory date is older than the subtracted date, I have to delete the directory.
For example:
Let us say the current_date = 20131130 (yyyymmdd)
subtracted_date=current_date - 90 days
let's say there is a folder 20130621
Now this folder name should be compared with the subtracted date. If it is older than subtracted_date, then I have to delete the directory.
find path -type d -ctime +90 -exec rm -rf {} \;
should find all directories older than 90 days and use rm -rf on them
Be careful with that command, though; you will probably want to test it first with this:
find path -type d -ctime +90 -exec echo {} \;
in order to keep certain folders, consider -mtime instead of -ctime and touch the folder every so often (a sketch follows the command list below)
replace path above with the actual path you want to scan and delete
Explanation:
find is the command
path is the root directory you want to scan
-type d means look for directories only
-ctime +90 means inode change time (ctime) more than 90 days ago; note that ctime is status-change time, not creation time
-exec rm -rf {} \; means remove recursively and force delete of the items which were found
-mtime is modified time
The second command will list all the folders which would be deleted, so it is much safer to run while you are testing.
List directories before delete
find . -type d -ctime +60 -ls
List files before delete
find . -type f -ctime +60 -ls
Delete directories in current directory
find . -type d -ctime +60 -exec rm -rf {} \;
Delete files in current directory
find . -type f -ctime +60 -exec rm -rf {} \;
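For the -mtime plus touch approach mentioned above, a minimal sketch (path and keepme are placeholder names):
# refresh the folder's modification time so -mtime +90 skips it
touch path/keepme
# a dry run of deletion by modification time now leaves the touched folder alone
find path -type d -mtime +90 -exec echo rm -rf {} \;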
You can use the date command to find the date 90 days earlier to the current one. The following script should give you a list of directories that need to be deleted:
del=$(date --date="90 days ago" +%Y%m%d)
for i in `find . -type d -name "2*"`; do
(($del > $(basename $i))) && echo "delete $i" || echo "dont delete $i"
done
To perform the actual deletion of directories, you can replace the third line with the following:
(($del > $(basename $i))) && rm -rf $i
For example, if your current directory contains the following folders:
$ ls -1F
20120102/
20130104/
20130302/
20130402/
20130502/
20130602/
20130702/
Executing the above script would give:
$ bash cleanup
delete ./20130302
delete ./20130104
delete ./20120102
delete ./20130402
dont delete ./20130702
dont delete ./20130502
dont delete ./20130602
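The same comparison also works with a shell glob instead of find, which avoids the word-splitting risks of the backtick loop (a sketch; assumes the directory names follow the yyyymmdd pattern and GNU date is available):
del=$(date --date="90 days ago" +%Y%m%d)
for dir in 2*/; do
  [[ -d $dir ]] || continue                 # skip if the glob matched nothing
  dir=${dir%/}                              # strip the trailing slash left by the glob
  (($del > $dir)) && echo "delete $dir" || echo "dont delete $dir"
done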
You can also use the following command to delete all files in the current directory by date:
[jamshi@cpanel ~]$ rm -rf `ls -l | grep 'Jul 19 15:32' | tr -s ' ' | cut -d ' ' -f9`
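Parsing ls output like this is fragile; GNU find can match essentially the same one-minute window directly (a sketch using the same timestamp; do a dry run with -ls before switching to -delete):
find . -maxdepth 1 -type f -newermt 'Jul 19 15:32' ! -newermt 'Jul 19 15:33' -ls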

find and mtime operation excluding folder mtime

Hi there, I have a backup shell script executed through crontab, but I have a rather large problem.
This is the particular line that scans my drive:
find $E -mtime -1 -exec cp -r --parents {} $B/$T \;
where E and B are variables holding directory paths and T holds the current date. It checks for all files that have been edited within the past day and copies them to the new directory. The folder structure is kept intact due to the --parents argument. The problem I have is that this seems to also check the mtime of all folders, meaning that if I were to change a single file in a very large folder, the entire folder would be copied across during backup, taking up an unnecessary amount of disk space. Is there any way I could remove folder mtime from the equation? I guess it might be possible to exclude folders themselves (not their contents) from the search as long as the --parents argument still takes effect.
I'm guessing you want to apply this only to regular files -
find $E -type f -mtime -1 -exec cp -r --parents {} $B/$T \;
otherwise
find $E ! -type d -mtime -1 -exec cp -r --parents {} $B/$T \;
to get other types of files as well, skipping the evaluation of age on directories.
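Filled in with example values, the fixed command might look like this (E, B and T are the asker's variables; the concrete paths here are hypothetical):
E=/home/user/data                  # tree to scan (hypothetical)
B=/mnt/backup                      # backup root (hypothetical)
T=$(date +%Y-%m-%d)                # dated subdirectory for today's run
mkdir -p "$B/$T"
find "$E" -type f -mtime -1 -exec cp -r --parents {} "$B/$T" \;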

find command not working from other directory

My dir structure looks like
x
/log
/bin
I am running this command from dir x/bin:
find ../log -type f -name \*.log -mtime +90 -exec ls -l {} \;
(to find and display the list of files older than 90 days) and it doesn't display anything.
Whereas if I execute the same command in dir x/log
find . -type f -name \*.log -mtime +90 -exec ls -l {} \;
it gives me a list of files older than 90 days.
Can you please help?
Recall that paths are relative.
If you have a dir structure that looks like
x
/log
/bin
AND you're in x/bin, then you need to give the relative path to x/log, i.e.
pwd
x/bin
find ../log -type f -name \*.log -mtime +90 -exec ls -l {} \;
I hope this helps.
Two suggestions.
First, escape the * using \*. If you have any log files in the current dir, they will get expanded before the command is executed.
Second, double-check the relative path: from x/bin, x/log is reached as ../log, not ../x/log.
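A quick way to reproduce the working setup (a sketch; assumes GNU touch for the -d flag):
mkdir -p x/log x/bin
touch -d '100 days ago' x/log/old.log   # a log file safely older than 90 days
cd x/bin
find ../log -type f -name \*.log -mtime +90 -exec ls -l {} \;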

How to purge old directories programmatically in unix/shell?

Does anyone have a good shell line for this?
I want to check the age of a directory. Say I create multiple directories on a weekly basis and I want to purge/delete them 7 days later, for example.
How would I do that?
This will let you do a dry run; remove the echo if you like the output:
find /path/to/toplevel -type d -mtime +7 -exec echo rm -rf {} +
Update
If you have an older version of find that doesn't comply with POSIX 2004 then use this instead:
find /path/to/toplevel -type d -mtime +7 -exec echo rm -rf {} \;
or
find /path/to/toplevel -type d -mtime +7 -print0 | xargs -0 echo rm -rf
The former, terminated by \;, will call rm once per directory it finds; the latter, with xargs, will call rm as few times as possible by passing multiple directories to a single call to rm, and will thus be much faster. The xargs form also has identical behavior to the earlier command terminated with +.
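You can watch the difference in invocation counts with a harmless stand-in for rm (a sketch; /path/to/toplevel is the same placeholder as above):
# \; form: one invocation per directory
find /path/to/toplevel -type d -mtime +7 -exec sh -c 'echo "rm called with $# dir(s)"' sh {} \;
# + form: directories batched into as few invocations as possible
find /path/to/toplevel -type d -mtime +7 -exec sh -c 'echo "rm called with $# dir(s)"' sh {} +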
