When writing a Nautilus script, $NAUTILUS_SCRIPT_SELECTED_FILE_PATHS gives the path to the file whose context menu has been clicked, for instance /home/nico/test.txt.
But when the file is within a WebDAV share, the variable is empty.
Is this a bug?
How can I get the path of a WebDAV file?
My script is intended to be used for files on WebDAV shares.
I have just found this list of variables:
https://help.ubuntu.com/community/NautilusScriptsHowto
The one I was looking for is $NAUTILUS_SCRIPT_SELECTED_URIS. It works on WebDAV too, returning for instance dav://admin@localhost:8080/alfresco/webdav/User%20Homes/leo/test.txt
Nautilus' $NAUTILUS_SCRIPT_SELECTED_FILE_PATHS is only set for LOCAL (mounted) files; by design it is blank for remote files, as are the positional parameters $1, $2...
For REMOTE files (WebDAV, Samba network shares, FTP servers, or any other location where $NAUTILUS_SCRIPT_CURRENT_URI does not start with file://), use $NAUTILUS_SCRIPT_SELECTED_URIS
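For example, a minimal script sketch (the variable is newline-delimited; zenity is just a hypothetical stand-in for whatever your script actually does):
#!/bin/bash
# $NAUTILUS_SCRIPT_SELECTED_URIS holds one URI per line,
# e.g. dav://... for WebDAV shares or file://... for local files.
while IFS= read -r uri; do
    [ -n "$uri" ] || continue
    zenity --info --text "Selected: $uri"
done <<< "$NAUTILUS_SCRIPT_SELECTED_URIS"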
I have a slightly complicated situation here:
I need to download files from an SFTP server daily. I connect to the SFTP server with a username and an SSH key; the key has a passphrase.
The SFTP server holds no actual file content: every file on it is listed as 0 bytes, and the server dynamically generates the content when it receives a "get" command.
So when I connect to the SFTP server with WinSCP, everything works perfectly.
But I have to do it in Synapse.
I managed to connect in a pipeline with a Copy activity, and I managed to download all the files, but with no data content inside.
Does anyone know how I can download the files with content?
If the files in your SFTP location actually have content, then they should also be copied automatically by the pipeline in your Synapse. If you only want to copy the files that have content and ignore empty files, you will have to check each file's size with a Get Metadata activity (i.e., > 0 bytes) and copy only those files to your desired destination. The childItems field gives you the name and type of each child item; to get each file's size, run a second Get Metadata activity per file inside a ForEach, and use these properties in the subsequent Copy activity to copy only the filtered files to your destination.
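For comparison outside Synapse: a plain SFTP get, which is what WinSCP issues under the hood, does trigger the server-side generation. A minimal sketch with a hypothetical host, key, and path (the key passphrase would need to come from ssh-agent, since batch mode cannot prompt):
# hypothetical host and paths; -b - reads the batch commands from stdin
sftp -b - -i ~/.ssh/id_rsa user@sftp.example.com <<'EOF'
get /outbound/report.csv /tmp/report.csv
EOF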
I need to get a file from a remote server, and I am using the ls -lA command to list the files inside the FTP block. However, I see the "." and ".." entries being listed as well. Is there any way to omit them and list only the files that are not hidden?
The FTP protocol has no way to control which files the server includes in the listing.
That said, many servers do support a non-standard -a switch to show hidden files. Indeed, most FTP servers do not show hidden files, nor . and .., by default; you have to request them explicitly with -a.
But if your server does show hidden files, I'm afraid there's no way to force it not to from the client side. There may be a server-side configuration option for this, but we do not know which FTP server you are using.
Generally, if you need to do any kind of filtering, you have to do it locally after retrieving a complete directory listing.
For example, to drop the entries whose names start with a dot from a saved listing:
grep -v ' \.' listing.txt
Presumably, by "the files that are not hidden" you mean the entries whose names do not start with a dot; to list only those, just omit the A and try ls -l.
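An end-to-end sketch with lftp and a hypothetical host and credentials: save the complete listing locally, then filter it:
# cls is lftp's client-side listing command; its output can be redirected
lftp -u user,pass -e 'cls -l; bye' ftp.example.com > listing.txt
# keep only entries whose name (the last field) does not start with a dot;
# filenames containing spaces would need more careful parsing
awk '$NF !~ /^\./' listing.txt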
Normally a WebDAV URL shows up as a network link alongside the root directory trees (c:/, d:/). I would like the WebDAV URL to be accessible from a regular folder, e.g. c:/user/download.
How do I link the URL manually and/or using the MSDN WebDAV API for script configuration?
Thanks!
I don't know of any Windows clients that can mount a WebDAV share into the local file system. A couple of options:
Mount a drive as normal and then use a linked folder (OK, not a great option; see the sketch below)
Use a file sync client to sync a local folder to the WebDAV server
There are a handful of sync clients around. Here's a new one that looks nice, although I haven't used it: https://www.syncany.org/
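For option 1, a minimal sketch from an elevated Command Prompt, with a hypothetical server and local path (the WebClient service must be running, and mklink needs administrator rights):
:: map the WebDAV share to a drive letter, then link a regular folder to it
net use Z: https://server.example.com/dav /user:alice
mklink /D C:\user\download\dav Z:\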
Wondering if there is a way to download the root folder plus a bunch of subfolders (and subfolders of those folders) with all the files, keeping them in their respective folders.
I've tried some Firefox plugins like FlashGot and DownThemAll, but they grab the actual web files in addition to the files in the repository, and only if the files are visible. For example, if I don't expand all the folders and expose the files in the repository, the plugins won't detect them.
I would just expand all the folders and expose the files, but these plugins won't recognize the folders... they just download as "foldername".html... and all the files end up mixed together in one folder.
I've also tried visualWget and allowed recursive downloads but again, this only grabs the actual website files, not the files in the repository.
If anyone could help it'd be greatly appreciated. I've been copying them manually but there are literally thousands of files and folders so I'm looking for a quicker solution.
As a client you can only download what's accessible. You either need to know the list of files or crawl the pages for the links, which is what the Firefox plugins do.
There's no way to get a list of files on the server beyond what it exposes over HTTP (unless the server has WebDAV enabled or offers some other API).
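If the repository is served as plain HTTP directory listings, wget can do the crawling; a hedged sketch with a hypothetical URL:
# -r recurses the linked pages, -np stays below the start folder,
# -nH and --cut-dirs trim the host and leading path segments,
# -R drops the generated index pages themselves
wget -r -np -nH --cut-dirs=1 -R "index.html*" https://example.com/repo/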
I ended up getting it to work. I used the following command in Terminal.
scp -r username@hostaddress:/file/path/to/directory /path/to/my/computer/directory
-r is for recursive, so it downloads all files, directories, and subdirectories.
If you try this, be sure to run the command from your local terminal. I made the mistake of running it from the SSH session on the server (no negative effects, just frustrating).
When I log in to remote servers with FTP access, I usually cannot see the .htaccess file because it is hidden.
Since I don't have shell access, I usually perform the following steps to edit the file:
I change the settings on my Mac (from the terminal) to show invisible files
I open the .htaccess file of a standard Drupal installation and edit it
I upload it to the remote server, overwriting the existing one
I disable hidden files on my Mac again
I was wondering if there is a faster solution
thanks
I often have a separate file in my Drupal root called production.htaccess or something along those lines. Not only does this expose the file in Finder without revealing every single .DS_Store on my system, it also allows me to set separate .htaccess directives for different environments. Then, I just rename production.htaccess to .htaccess after I upload it to the server.
More often than not, the two .htaccess files are identical, but even in that case, I still use this method for the sake of convenience.
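The upload-and-rename step can even be done in one shot with curl, assuming a hypothetical host and credentials (a leading dash on --quote/-Q makes curl send the command after the transfer):
# upload production.htaccess, then rename it on the server via raw FTP commands
curl -T production.htaccess -u alice:secret ftp://ftp.example.com/public_html/ \
  -Q "-RNFR production.htaccess" -Q "-RNTO .htaccess"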
Your FTP application should have an option to show hidden files; that option is normally available in FTP clients for Mac OS X.