I want to transfer files using rsync to an FTP server at the end of every day.
My current rsync script:
rsync -avz /var/spool/asterisk/monitorDONE/MP3 pbciftp:/home/voicefiles/ftp/`date +%Y.%m.%d`
The issue
I want rsync to transfer files that have today's date in their file name; a file might for example be called 20130527_agent_number_campaign.mp3.
So I need rsync to find all files whose file name starts with 20130527 and transfer them.
The most flexible way is probably with find. Something like:
# -printf '%P\0' (GNU find) prints each match relative to the starting
# directory, which is what --files-from expects relative to the source dir
find /var/spool/asterisk/monitorDONE/MP3 -name "*`date +%Y%m%d`*" -printf '%P\0' | \
rsync -avz --files-from=- --from0 \
/var/spool/asterisk/monitorDONE/MP3 pbciftp:/home/voicefiles/ftp/`date +%Y.%m.%d`
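To run this at the end of every day, the pipeline can be wrapped in a small script and driven by cron; the script path and log file below are hypothetical:

# crontab entry (assumes the find | rsync pipeline above is saved as
# /usr/local/bin/sync_recordings.sh); runs daily at 23:55
55 23 * * * /usr/local/bin/sync_recordings.sh >> /var/log/sync_recordings.log 2>&1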
It's a simple question that I can't seem to figure out. I'm on a Mac running Big Sur with all the latest updates, and I'm running these commands in Terminal. If there's a better way, please let me know.
This is, in basic terms, what I'm trying to do: I want rsync to recursively go through a source directory (which in this case would ideally be an entire drive), find any files modified within the last 24 hours, and copy those to another drive while preserving the folder structure. So if I have:
/Volumes/Drive1/Folder1/File1.file
/Volumes/Drive1/Folder1/File2.file
/Volumes/Drive1/Folder1/File3.file
and File1 has been modified in the last 24 hours but the other two haven't, I want it to copy just that file, so that on the second drive I wind up with:
/Volumes/Drive2/Folder1/File1.file
But without copying File2 and File3.
I've tried a lot of different solutions and strings, but I'm running into problems. The closest I've been able to get is this:
find /Volumes/Drive1/ -type f -mtime -1 -exec cp -a "{}" /Volumes/Drive2/ \;
The problem is that while this one does go through Drive1 and find all the files newer than a day like I want, when it copies them it just dumps them all into the root of Drive2.
This one also seems to come close:
rsync --progress --files-from=<(find /Volumes/Drive1/ -mtime -1 -type f -exec basename {} \;) /Volumes/Drive1/ /Volumes/Drive2/
This one also identifies all the files modified in the last 24 hours, but instead of copying them it gives an error, "link_stat (filename and path) failed: no such file or directory (2)."
I've spent several days trying to figure out what I'm doing wrong but I can't figure it out. Help please!
I think this'll work:
srcDir=/Volumes/Drive1
destDir=/Volumes/Drive2
(cd "$srcDir" && find . -type f -mtime -1 -print0) |
while IFS= read -r -d $'\0' filepath; do
mkdir -p "$(dirname "$destDir/$filepath")"
cp -a "$srcDir/$filepath" "$destDir/$filepath"
done
Explanation:
Using cd "$srcDir"; find . -whatever will generate relative paths (starting with "./") from the source directory to the found files; that means appending the results to $srcDir and $destDir will give the full source and destination paths for each file.
Putting it in parentheses makes it run in a subshell, so the cd won't affect other commands. Coupling cd and find with && means that if cd fails, it won't run find (which would run in the wrong place, generate a list of the wrong file file, and generally cause trouble).
Using -print0 and while IFS= read -r -d $'\0' is a standard weird-filename-safe way of iterating over found files (see BashFAQ #20). Note that if anything in the loop reads from standard input (e.g. cp -i asking for confirmation), it'll steal part of the file list; if this is a worry, use this variant (instead of the pipe) to send the file list over file descriptor #3 instead of standard input:
while IFS= read -r -d $'\0' filepath <&3; do
    ...
done 3< <(cd "$srcDir" && find . -type f -mtime -1 -print0)
Finally, mkdir -p is used to make sure the destination directory exists, and then cp to copy the file.
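For what it's worth, rsync can do the same job directly; a minimal sketch, assuming the same $srcDir and $destDir, and relying on --files-from reading paths relative to the source argument:

# find emits paths relative to $srcDir (e.g. ./Folder1/File1.file);
# rsync recreates the needed directories at the destination
(cd "$srcDir" && find . -type f -mtime -1 -print0) |
    rsync -av --from0 --files-from=- "$srcDir"/ "$destDir"/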
I want to write a shell command to sync the current directory to a backup directory, with some requirements. The command I'm using is:
rsync -ptvHS --progress --delete-after --exclude /backup "$PWD" ~/backup
I want the directory timestamps to be ignored, even though I use -t to preserve the file timestamps.
Any idea?
Thank you in advance.
From the man page:
-t, --times              preserve modification times
-O, --omit-dir-times     omit directories from --times
-J, --omit-link-times    omit symlinks from --times
Seems like you need to add -O to your command.
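With -O added, the command from the question becomes:

rsync -ptvHSO --progress --delete-after --exclude /backup "$PWD" ~/backup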
This is from rsync 3.1.2; you might find your version is too old.
When running rsync with the --backup --delete-during and --backup-dir=PATH options, only files that are deleted are backed up, but directories are not if those directories were empty at the time they were deleted. I can't see an option that specifies directories should not be pruned from backup when being deleted.
Example:
mkdir /tmp/test_rsync_delete
cd /tmp/test_rsync_delete
mkdir -p a/a/a/a/a
ln -s . a/b
mkdir -p b/a/a
ln -s a/a b/a
touch b/a/a/a
mkdir c
mkdir backup
rsync -avi --delete-during --backup --backup-dir=backup a/ c/
find backup/ -exec ls -ldi {} \;
# Should be empty
rsync -avi --delete-during --backup --backup-dir=backup b/ c/
find backup/ -exec ls -ldi {} \;
# Will be missing the directory that was deleted to make way for the file.
Update
As per the above example, when you run it you will notice that the empty directories were pruned/removed by the --delete option, but the same directories were not backed up in the directory specified by --backup-dir. It's not necessarily the directories themselves that matter, but their permissions and ownership. If rsync fails when running in batch mode (--read-batch), you need to be able to roll back by restoring the system to its previous state. If directories are not being backed up, then it's not really creating a reliable point to restore to: it will potentially be missing some directories.
So why does the --backup family of options not backup empty directories when they are going to be pruned by the --delete family of options?
This is not an answer to the specific question, but it is probably what others who end up here are searching for; it's what I was looking for when I found this question:
rsync -av --delete-after src dest
-av The "-a" means archive. This will preserve symlinks, permissions, timestamps, group/owners, and will be recursive. The "v" makes the job verbose. This won't be necessary, but you can see what's happening with the rsync so you know if you've done something wrong.
--delete-after: tells rsync to compare the destination against the source and delete any extraneous files after the transfer has completed. This is a dangerous option, so use it with caution.
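Because of that, it's worth previewing with -n (--dry-run) first, which lists what would be transferred and deleted without changing anything:

# preview only: nothing is copied or deleted
rsync -avn --delete-after src dest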
I need a script to back up (zip, tar, or gz) old log files on our Unix server (they are causing a space problem). Could you please help me create a zip or gz file for each log file in the current directory and its sub-directories?
I found a command that creates a gz file for the older files, but it creates only one gz file for all of them. I need an individual gz file for each log file.
find /tmp/log/ -mtime +180 | xargs tar -czvPf /tmp/older_log_$(date +%F).tar.gz
Thanks in advance.
The best way is:
find . -mtime +3 -print -exec gzip {} \;
where +3 means gzip all files modified more than 3 days ago.
Thanks a lot for your reply.
I got it.
# $days is assumed to be set earlier; note that this word-splits
# and breaks on filenames containing spaces
files=($(find /tmp/mallik3/ -type f -mtime +"$days"))
for file in "${files[@]}"
do
    echo "$file"
    zip "${file}-$(date --date="-${days} days" +%F).zip" "$file"
    # tar czvf "${file}_$(date --date='-6 months' +%F).tar.gz" "$file"
    # rm "$file"
done
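A whitespace-safe variant of the same loop (a sketch, keeping the question's /tmp/mallik3 path and $days variable):

# -print0 plus read -d '' handles filenames with spaces or newlines
find /tmp/mallik3/ -type f -mtime +"$days" -print0 |
    while IFS= read -r -d '' f; do
        zip "${f}-$(date --date="-${days} days" +%F).zip" "$f"
    done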
First, the -mtime argument does not get you files that are "older" than a certain amount. Rather, it checks the last time the file was modified. The creation date of files is not kept in most file systems. Often, the last modified time is sufficient, but it is not the same as the age of the file.
If you want to create a separate tar file for each log file, use -exec instead of passing the list to xargs:
find /tmp/log/ -mtime +180 -type f -exec sh -c \
    'tar -czvPf "/tmp/older_log_$(basename "$0")_$(date +%F).tar.gz" "$0"' {} \;
I have a text file that contains the list of files and directories I want to copy (one per line). Now I want rsync to take its input from this text file and sync the listed items to the destination I provide.
I've tried playing around with the "--include-from=FILE" and "--files-from=FILE" options of rsync, but it is just not working.
I also tried prefixing "+" on each line in my file, but it is still not working.
I have tried coming up with various filter PATTERNs as outlined in the rsync man page, but it is still not working.
Could someone provide the correct syntax for this use case? I've tried the above on Fedora 15, RHEL 6.2, and Ubuntu 10.04, and none worked, so I am definitely missing something.
Many thanks.
There is more than one way to answer this question depending on how you want to copy these files. If your intent is to copy the file list with absolute paths, then it might look something like:
rsync -av --files-from=/path/to/files.txt / /destination/path/
This expects the paths in the list to be relative to the source location / and retains the entire absolute structure under the destination.
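For example, with a hypothetical /path/to/files.txt containing:

/etc/hosts
/var/log/syslog

rsync treats these entries as relative to the source argument /, so they end up as /destination/path/etc/hosts and /destination/path/var/log/syslog, with the hierarchy preserved.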
If your goal is to copy all of those files in the list to the destination, without preserving any kind of path hierarchy (just a collection of files), then you could try one of the following:
# note: this method might break if your file list is too long
# and exceeds the maximum argument length
rsync -av `cat /path/to/file` /destination/

# or get fancy with xargs to batch 200 items at a time
# across multiple calls to rsync (-J is BSD xargs syntax)
cat /path/to/file | xargs -n 200 -J % rsync -av % /destination/
Or a for-loop and copy:
# bash shell
for f in `cat /path/to/files.txt`; do cp $f /dest/; done
Given a file listing $HOME/GET/bringemback containing
need/A
alsoneed/B
shouldget/C
cd $HOME/GET
run
rsync -av --files-from=./bringemback me@theremote:. $HOME/GET/collect
would get the files and drop them into $HOME/GET/collect
$HOME/GET/
  collect/
    need/A
    alsoneed/B
    shouldget/C
or so I believe.
rsync supports this natively:
rsync --recursive -av --files-from=/path/to/files.txt / /destination/path/
Note that --files-from disables the recursion normally implied by -a, which is why --recursive is given explicitly here; it's needed if the list contains directories whose contents you want copied.