rsync not copying my file from remote server

I have two VMs: dev and prod.
I want to use rsync to copy a dump file from prod and then restore it on dev. I'm using this command to copy it:
rsync -rave user@ip:/home/user/dumps /home/anotheruser/workspace/someapp/dumps
The same command successfully copies static files (.html, .css) from another directory, but in this case only the folder itself is created, without the file:
/home/anotheruser/workspace/someapp/dumps
but I'm expecting:
/home/anotheruser/workspace/someapp/dumps/dumpfile
What is going wrong? The dumpfile does exist on the remote at user@ip:/home/user/dumps/dumpfile.

The command you want is probably this:
rsync -av user@ip:/home/user/dumps/ /home/anotheruser/workspace/someapp/dumps/
I've:
removed the r because it's implied by the a anyway.
removed the e because that was probably your problem; it requires a parameter that you haven't given.
added the / at the end of the pathnames to make sure they're treated as directories (see the example below).
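To make the trailing-slash point concrete, here's a quick sketch using the same paths as the question:
rsync -av user@ip:/home/user/dumps/ /home/anotheruser/workspace/someapp/dumps/
copies the contents of the remote dumps directory, so the file ends up at /home/anotheruser/workspace/someapp/dumps/dumpfile, whereas
rsync -av user@ip:/home/user/dumps /home/anotheruser/workspace/someapp/dumps/
copies the directory itself and would leave you with /home/anotheruser/workspace/someapp/dumps/dumps/dumpfile.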

Related

WinSCP script to synchronize directories, but exclude several subdirectories

I need to write a script that synchronizes local files with a remote machine.
My file structure is:
ProjectFolder/
    .git/
    input/
    output/
    classes/
    main.py
    readme.md
I need to synchronize everything, but:
completely ignore .git folder
ignore files in input and output folders, but copy the folder
So far my code is:
open sftp://me:password@server -hostkey="XXXXXXXX"
option batch abort
option confirm off
synchronize remote "C:\Users\MYNAME\Documents\MY FOLDER\Python Projects\ProjectFolder" "/home/MYNAME/py_proj/ProjectFolder" -filemask="|C:\Users\MYNAME\Documents\MY FOLDER\Python Projects\ProjectFolder\.git"
close
exit
First question: it doesn't seem to work.
Second question: how do I add masks for the input and output folders when there are spaces in the file paths?
Thanks to all in advance.
Masks for directories have to end with a slash.
To exclude files in a specific folder, use something like */folder/*
-filemask="|.git\;*/input/*;*/output/*"
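Put together with the script from the question, it might look like this (a sketch reusing the same credentials, host key placeholder, and paths):
open sftp://me:password@server -hostkey="XXXXXXXX"
option batch abort
option confirm off
synchronize remote "C:\Users\MYNAME\Documents\MY FOLDER\Python Projects\ProjectFolder" "/home/MYNAME/py_proj/ProjectFolder" -filemask="|.git\;*/input/*;*/output/*"
close
exit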

Rsync Specific Listed Files from specified Server Directory

I'd like to rsync specified files from a specific server folder to my local directory (in which I am running the command).
However, I'm getting the error failed: No such file or directory (2). There seems to be something wrong with my syntax, and I'm not sure it's picking up the source directory properly.
This is my command...
rsync -az . remoteSite.com::remoteFolder/remoteSubFolder/ --files-from=filelist.txt
filelist.txt, which it seems to be finding, contains filenames within remoteSubFolder
file1.xml
file2.xml
etc
What am I doing wrong?
Thanks to @Gordon Davisson.
I now understand that the full stop represents the local directory and goes after the remote host and directory.
rsync -az --files-from=filelist.txt remoteSite.com::remoteFolder/remoteSubFolder/ .
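As a sketch of what that resolves to (assuming the filelist above): each name in filelist.txt is read relative to the source directory, so the command copies
remoteSite.com::remoteFolder/remoteSubFolder/file1.xml -> ./file1.xml
remoteSite.com::remoteFolder/remoteSubFolder/file2.xml -> ./file2.xml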

Rsync copy "unsafe" symlinks but don't update modification time on the symlink targets

Is it possible to have rsync copy "unsafe" symlinks (that is, those that refer to files/dirs outside of the copied tree, see docs here) but not update the times on them?
I'm using rsync -a --delete --omit-dir-times to copy a bunch of files from /home/somebody/foo/bar to a destination machine, but running into the following error: rsync: failed to set times on "/home/somebody/foo/bar/symlink": Operation not permitted (1), where /home/somebody/foo/bar/symlink refers to something in /usr/lib/ that is owned by root at the destination, so the rsync user lacks permission to update it.
Essentially rsync tries to update the time on the symlink like all other files it copies, but gets blocked by permissions because it's not root at the destination.
What I'd like to do is copy the link, but not touch the symlink target at all during the copy. I just want the link. I could change permissions on the target file, but I'd like to avoid that.
Is this achievable? Is this a terrible idea and I'd be abusing rsync? Suggestions for alternative approaches in the latter case?
There is another rsync option, --omit-link-times, which will probably do what you are looking for. See the man page at:
http://manpages.ubuntu.com/manpages/bionic/man1/rsync.1.html
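For instance, something along these lines (a sketch based on the command in the question; the destination host and path are placeholders):
rsync -a --delete --omit-dir-times --omit-link-times /home/somebody/foo/bar/ user@destination:/home/somebody/foo/bar/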

Transferring a new folder to remote using rsync?

If I create a completely new folder locally, I want to be able to rsync it to a remote SFTP server. How can I achieve this?
I have tried:
rsync Documents/SomeFolder username@host:/home/Documents/RemoteFolder
Meaning SomeFolder must go into RemoteFolder, but this doesn't work; instead it creates a file called SomeFolder.
Would appreciate some help on this
If you use the -r (recurse into directories) option, that should make it work. The -d option (transfer directories without recursing) will also work. You should use -r if the folder will sometimes not be empty and you want to copy its contents. Use either as shown here:
rsync -r Documents/SomeFolder username@host:/home/Documents/RemoteFolder
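And the non-recursive variant (a sketch with the same placeholder paths):
rsync -d Documents/SomeFolder username@host:/home/Documents/RemoteFolder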

synchronise local directories over ssh

The following command works great for me for a single file:
scp your_username@remotehost.edu:foobar.txt /some/local/directory
What I want to do is do it recursively (i.e. for all subdirectories and files of a given path on the server), merge folders and overwrite files that already exist locally, and finally download only those files on the server that are smaller than a certain size (e.g. 10 MB).
How could I do that?
Use rsync.
Your command is likely to look like this:
rsync -az --max-size=10m your_username@remotehost.edu:foobar.txt /some/local/directory
-a (archive mode - the sync is recursive, transfers ownership, attributes, symlinks among other things)
-z (compresses transfer)
--max-size (only copies files up to a certain size)
There are many more flags which may be suitable. Check out the docs for more details - http://linux.die.net/man/1/rsync
First option: use rsync.
Second option: it's not going to be a one-liner, but it can be done in three or four lines:
Create a tar archive on the remote system using ssh.
Copy the tar from remote system with scp.
Untar the archive locally.
If the creation of the archive gets a bit complicated and involves using find and/or tar with several options, it is quite practical to write a script locally that does that, upload it to the server with scp, and only then execute it remotely with ssh.
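A rough sketch of those three steps (the host is the one from the question; the remote path and archive name are made-up placeholders):
ssh your_username@remotehost.edu 'tar czf /tmp/transfer.tar.gz -C /path/on/server .'
scp your_username@remotehost.edu:/tmp/transfer.tar.gz /some/local/directory/
tar xzf /some/local/directory/transfer.tar.gz -C /some/local/directory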
