How to preserve links on receiver - rsync

I've got a directory per environment on the receiver, with symlinks to some files, because some files are shared between environments.
What I would like is that, when I rsync from my remote host to the receiver, the retrieved files are placed by following those links. Currently, rsync replaces my local links with the retrieved files.
Is there a way to tell rsync to follow links on the receiver host?

If those links point to directories, you can use the -K (--keep-dirlinks) option, which works flawlessly.
Things are different when the links on the receiver point to files (not directories).
I'm afraid there is currently no simple way to preserve those links in the destination and update the files they point to.
You might be interested in the -L (--copy-links) option if the source contains symlinks and you want to copy the contents they point to rather than the links themselves. However, this would also replace the corresponding links in the destination and, as mentioned above, not just update the files they point to.
Check out https://serverfault.com/questions/245774/perform-rsync-while-following-sym-links for more information.
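To illustrate the directory case: a minimal sketch, assuming a hypothetical /var/deploy/env1 on the receiver in which one subdirectory is a symlink to a directory shared with other environments:

rsync -avK user@remote:/srv/app/ /var/deploy/env1/

With -K (--keep-dirlinks), rsync treats a symlinked directory on the receiver as a real directory and writes the incoming files through the link; without it, the link itself would be replaced by a plain directory.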

Related

Omit hidden files from directory listing with FTP

I need to get a file from a remote server, and I am using the ls -lA command to list the files inside the FTP block. However, I see the "." and ".." entries being listed as well. Is there any way to omit them and list only the files that are not hidden?
The FTP protocol has no way to control what files the server includes in the listing.
That said, many servers do support a non-standard -a switch to show hidden files. By default, most FTP servers show neither hidden files nor the . and .. entries; you have to ask for them explicitly with -a.
But if your server does show the hidden files, I'm afraid there's no way to force it not to from the client side. There may be a server-side configuration option for this, but we don't know which FTP server you are using.
Generally, if you need to do any kind of filtering, you have to do it locally, after retrieving a complete directory listing.
For example:
grep -v '^\.' listing.txt
(this assumes listing.txt holds one filename per line and drops the entries that start with a dot)
Presumably, by "the files that are not hidden" you mean the entries not starting with a dot; to list only those, just omit the A and use ls -l.
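As a rough sketch of the local-filtering approach, assuming the classic command-line ftp client (whose nlist command can save a plain name-per-line listing to a local file):

ftp example.com
ftp> nlist . listing.txt
ftp> bye
grep -v '^\.' listing.txt

The nlist output contains only file names, one per line, so the grep simply discards the hidden entries.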

download directory and sub directories off a virtual repository

Wondering if there is a way to download the root folder plus a bunch of sub folders (and sub folders of those folders) with all the files and keep them in their respective folders.
I've tried some Firefox plugins like flashgot and download-them-all, but they grab the actual web files in addition to the files in the repository, and only if those are visible. For example, if I don't expand all the folders and expose the files in the repository, the plugins won't detect them.
I would just expand all the folders and expose the files, but these plugins won't recognize the folders... they just download as "foldername".html... and all the files end up mixed together in one folder.
I've also tried visualWget and allowed recursive downloads but again, this only grabs the actual website files, not the files in the repository.
If anyone could help it'd be greatly appreciated. I've been copying them manually but there are literally thousands of files and folders so I'm looking for a quicker solution.
As a client, you can only download what's accessible. You either need to know the list of files or crawl the pages for the links, which is what the Firefox plugins do.
There's no way to get a list of files on the server without access beyond HTTP (unless the server exposes WebDAV or some other API).
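If the repository is served as plain HTTP directory pages, a recursive wget can do that crawl; a rough sketch, where the URL and the --cut-dirs depth are placeholders you would adjust:

wget -r -np -nH --cut-dirs=2 -R "index.html*" https://example.com/path/to/repository/

-r recurses, -np keeps it from climbing to the parent directory, -nH and --cut-dirs trim the host and leading path components from the local folder structure, and -R skips the generated index pages. Whether this reaches the repository files depends on them being linked from those pages, which is the same limitation the plugins have.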
I ended up getting it to work. I used the following command in Terminal:
scp -r username@hostaddress:/file/path/to/directory /path/to/my/computer/directory
-r is for recursive, so it downloads all files, directories and subdirectories.
If you try this, be sure to run the command from your local terminal. I made the mistake of running it from the SSH connection to the server (no negative effects, just frustrating).

Maintaining Images when switching servers

I just changed the server one of my sites is hosted on. In doing so, I lost all the images. The CCK file upload fields show "ghost" data but contain no actual image data as they did before the site transfer.
All my data is fine, however.
Is there a way to prevent this so all my images are maintained?
Thanks
You can transfer the files with rsync directly, or use drush to rsync. You'll need SSH access to the servers for this to work.
Here's some info on setting up your drush aliases:
http://www.leveltendesign.com/blog/dustin-currie/synchronize-one-drupal-site-to-another
http://drupal.org/project/drush
If you're performing this task only once, you could also use scp to copy the files over:
http://www.go2linux.org/scp-linux-command-line-copy-files-over-ssh
If you're on a shared hosting platform, just FTP the files over old school style.
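A hedged sketch of the drush route, assuming @old is a site alias you have defined for the old server and you run the command from the new site's docroot (@self refers to the site you're in):

drush rsync @old:%files @self:%files

%files expands to the site's public files directory (typically sites/default/files), so this pulls the uploaded images from the old server into the new site over SSH.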

Nautilus script: $NAUTILUS_SCRIPT_SELECTED_FILE_PATHS empty for WebDAV folders

When writing a Nautilus script, $NAUTILUS_SCRIPT_SELECTED_FILE_PATHS gives the path to the file whose context menu has been clicked, for instance /home/nico/test.txt.
But when the file is within a WebDAV share, the variable is empty.
Is it a bug?
How to get the path for a WebDAV file?
My script is intended to be used for files on WebDAV shares.
I have just found this list of variables:
https://help.ubuntu.com/community/NautilusScriptsHowto
The one I was looking for is $NAUTILUS_SCRIPT_SELECTED_URIS; it works on WebDAV too, returning for instance dav://admin@localhost:8080/alfresco/webdav/User%20Homes/leo/test.txt
Nautilus' $NAUTILUS_SCRIPT_SELECTED_FILE_PATHS is only set for LOCAL (mounted) files, and by design is blank for remote files, as are $1, $2, ...
For REMOTE files, like WebDAV or Samba network shares, FTP servers, or any other location where $NAUTILUS_SCRIPT_CURRENT_URI does not start with file://, use $NAUTILUS_SCRIPT_SELECTED_URIS.
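A minimal script sketch using the URI variable, assuming the usual newline-delimited format and using a zenity dialog as a placeholder action:

#!/bin/sh
# NAUTILUS_SCRIPT_SELECTED_URIS holds one URI per line,
# e.g. dav://..., smb://..., or file://... for local selections.
printf '%s\n' "$NAUTILUS_SCRIPT_SELECTED_URIS" | while IFS= read -r uri; do
    [ -n "$uri" ] || continue
    zenity --info --text "Selected: $uri"
done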

UNIX ftp command get, change default destination directory

I know VERY little about UNIX commands, so I'll do my best to explain what I want in plain English.
Through my Mac's Terminal, I am connected to an FTP account by:
ftp example.com
When I do:
get file.php
I download it to the user's directory.
So now, two related questions:
1) How can I choose the download directory for this specific download?
and
2) How can I choose the default destination directory for future downloads?
Use a second argument to get, naming the local file; e.g. get file.php path/to/dir/file.php.
Use a command that's usually called lcd (for "local change directory"). Not all clients have this command, but most do (for example, lftp, a powerful FTP client, does). Once you have changed the local directory with lcd, subsequent downloads land there.
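Put together, a short session sketch (the paths are placeholders):

ftp example.com
ftp> get file.php /path/to/dir/file.php
ftp> lcd /path/to/downloads
ftp> get file.php
ftp> bye

The first get names the local destination explicitly; after lcd, the second get lands in /path/to/downloads.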
