Download only new files with WinSCP - sftp

I am currently writing a WinSCP script whose aim is to retrieve all the files from an SFTP server and then put them on a specified location in a destination server (on which the script is located, FYI).
Is there any way to check whether a file has already been transferred to the destination server? Is it overwritten when it has, and is that really a bad thing? If the file already exists on the destination server, I would like nothing to happen; if it doesn't exist, I'd like the transfer to proceed.
You will find the code written so far enclosed below.
# Automatically abort script on errors
option batch abort
# Disable overwrite confirmations that conflict with the previous
option confirm off
# Connect using a password
open sftp://SERVER@IP_ADDRESS:PORT -privatekey="PRIVATE_KEY" -hostkey="HOSTKEY" -passive=off
# Change remote directory
cd in
cd DIRECTORY
# Force binary mode transfer
option transfer binary
# Get ALL files from the directory specified
get /*.csv* \\DIRECTORY
# Remove all .csv files
rm /*.csv
# Exit WinSCP
bye
Thank you very much in advance for your help. I hope this was clear enough; otherwise, please let me know if I can provide further information.

The easiest solution is to add the -neweronly switch to your get command:
get -neweronly /*.csv \\DIRECTORY
For very similar results, you can also use synchronize command:
synchronize local \\DIRECTORY / -filemask=*.csv
See also WinSCP article on Downloading the most recent file.
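For illustration, a minimal sketch of your script with the switch applied might look like this (same placeholders as in your original; the rm step is left out here since you only want to skip files that already exist on the destination):
option batch abort
option confirm off
open sftp://SERVER@IP_ADDRESS:PORT -privatekey="PRIVATE_KEY" -hostkey="HOSTKEY" -passive=off
cd in
cd DIRECTORY
option transfer binary
# -neweronly transfers only files that do not exist locally or are newer than the local copy
get -neweronly /*.csv \\DIRECTORY
bye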

Related

Open a file in a temp directory in Max/MSP standalone

I have a Max/MSP standalone that looks for an external folder when it opens (it contains JSON files generated by R), which it does with loadbang -> prefix ~/folder_name.
This works OK, but I don't want to store the folder in home.
What I really want is to use Terminal to tell my standalone where to look, something like:
open -a standalone.app /var/folders/whatevercrazytempdirname/folder_name
But this doesn't work. Maybe I could establish a pipe between the program that generates the folder (R) and my standalone, but I don't know that this is possible with a Max/MSP standalone, and I haven't been able to find anyone who has done this.
Thanks for any suggestions!
I don't think there is a way to acquire runtime arguments in a Max standalone.
Perhaps you can have R write a text file to a standard directory (for example ~/Library/Application Support/MaxAppName/) that contains the whatevercrazytempdir path? The Max app could take it from there.
Alternatively, if R were to support UDP networking, you could send Max an OSC message with the path.
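As a rough sketch of the first idea (the MaxAppName folder and file name here are just placeholders), whatever creates the temp folder could drop its path into a fixed location that the standalone reads on loadbang; shown as shell commands, though R's writeLines would do the same:
mkdir -p ~/Library/Application\ Support/MaxAppName
echo "/var/folders/whatevercrazytempdirname/folder_name" > ~/Library/Application\ Support/MaxAppName/current_folder.txt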

How do I scp a file to a Unix host so that a file polling service won't see it before the copy is complete?

I am trying to transfer a file to a remote Unix server using scp. On that server, there is a service which polls the target directory to detect incoming files for processing. I would like to ensure that the polling service does not pick up new files before the copy is complete. Is there a way of doing that?
My file transfer process is a simple scp command embedded in a larger Java program. Ideally, a solution which did not involve changing the Java would be best (for reasons involving change control processes).
You can scp the file to a different directory (e.g. /tmp) and move the file via ssh after the transfer is complete (see the sketch after this list). The staging directory needs to be on the same partition as the final destination directory, otherwise there will be a copy operation and you'll face a similar problem. Another service on the destination machine can also perform this move.
You can copy the file as hidden (prefix the filename with a .), then rename it once the copy is complete.
If you can modify the polling service, you can check active scp processes and ignore files matching their arguments.
You can check for open files with lsof +d $directory and ignore them in the polling service.
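A rough sketch of the first option, with user, host, and paths as placeholders:
# Copy into a staging directory on the same partition as the target directory
scp datafile.csv user@remotehost:/data/.staging/
# mv within one partition is a rename, so the poller never sees a partial file
ssh user@remotehost 'mv /data/.staging/datafile.csv /data/incoming/datafile.csv'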
I suggest copying the file using rsync instead of scp. rsync already copies new files to temporary filenames, and has many other useful features for file synchronization as well.
$ rsync -a source/path/ remotehost:/target/path/
Of course, you can also copy file-by-file if that's your preference.
If rsync's temporary filenames are sufficient to avoid being picked up by your polling service, then you could simply replace your scp command with a shell script that acts as a wrapper for rsync, eliminating the need to change your Java program.
You would need to know the precise format that your Java program uses to call the scp command, to make sure that the options you feed to rsync do what you expect.
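For illustration only, assuming the Java code invokes plain scp localfile host:remotepath, the wrapper could be as small as this (rsync already writes to a temporary dot-file and renames it when the transfer completes):
#!/bin/sh
# Pass the scp-style source and destination arguments straight through to rsync
exec rsync -a "$@"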
You would also need to figure out how your Java program calls scp. If it does so by full pathname (i.e. /usr/bin/scp), then this solution might put other things at risk on your system that depend on scp (like you, for example, expecting scp to behave as it usually does instead of as a wrapper). Changing a package-installed binary like /usr/bin/scp may also "break" your package registration, making it difficult to install future security updates because a binary has changed to a shell script. And of course, there might be security implications to any change you make.
All in all, I suspect you're better off changing your Java program to make it do precisely what you want, even if that is to launch a shell script to handle aspects of automation that you want to be able to change in the future without modifying your Java.
Good luck!

Deleting multiple files in remote box via sh

Requirement
Several files on remote machines ought to be deleted via sh. The names of the files to be deleted are known.
Approach
1) A script was written with ftp (which requires credentials) and the delete command. File names were passed as an array and iterated via a for loop, with the ftp and delete commands enclosed within the loop. Files were not getting deleted by this approach.
2) Another approach attempted was to pass a temp.ftp file (which contains the delete commands) to the ftp command and then rm the temp.ftp file, e.g. ftp < temp.ftp
Request
Require pointers on how to delete multiple files on a remote machine via shell script.
I recommend using ssh instead of ftp for interfacing with the remote unix machine.
SSH allows you to run remote commands easily and securely.
Read this article for more info.
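A minimal sketch of that approach, with user, host, and paths as placeholders (it assumes SSH key authentication is set up, so no interactive password prompt):
#!/bin/sh
# Names of the files to delete on the remote machine
files="report1.csv report2.csv report3.csv"
for f in $files; do
    ssh user@remotehost "rm -f /path/to/dir/$f"
done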

Script to repeatedly look for a file on ftp server until it is found

I just want to transfer a file from an FTP server to a Unix folder -- this is straightforward.
If the file doesn't exist on the FTP server, then the script needs to run repeatedly until it finds the file. Please let me know how I can get that file.
Please remember the script has to run on the FTP server.
Thanks
CK
I'd mount the FTP server with curlftpfs (http://curlftpfs.sourceforge.net) and then use it as if it were a local file system, for example by running find(1).
You need to write a program to automate your FTP session. You can either write your own custom FTP client, not that hard if you know a few things about network programming, or write a script to automate a session for an existing client. For the latter approach, I suggest using Expect if you are proficient with TCL, or PyExpect if you prefer Python. Expect is a library designed to automate interactive tasks like downloading a file with FTP.
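For instance, a rough sketch of the curlftpfs approach (mount point, credentials, remote path, and file name are all placeholders):
# Mount the FTP server as a local file system
mkdir -p /mnt/ftp
curlftpfs ftp://user:password@ftphost /mnt/ftp
# Poll until the wanted file appears, then copy it to the local folder
until [ -e /mnt/ftp/some/dir/wanted_file.csv ]; do
    sleep 60
done
cp /mnt/ftp/some/dir/wanted_file.csv /home/user/incoming/
fusermount -u /mnt/ftp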

rsync list of specific local files in 1 step

I'm working on a web application where a user uploads a list of files, which should then be immediately rsynced to a remote server. I have a list of all the local files that need to be rsynced, but they will be mixed in with other files that I do not want rsynced every time. I know rsync will only send the changed files, but this directory structure and contents will grow very large over time and the delay would not be acceptable.
I know that doing a remote rsync, I can specify a list of remote files, i.e...
rsync "host:/path/to/file1 /path/to/file2 /path/to/file3"
... but that does not work once I remove "host:" and try to specify the files locally.
I also know I can use --files-from, but that would require me to create a file ahead of time with a list of files that I want to rsync (and then delete it afterwards). I think it'd be cleaner to just effectively say "rsync these 4 specific files to this remote server", but I can't seem to get that to work.
Is there any way to do what I'm trying to accomplish, or do I have to resort to creating a tmp file with a list in it?
Thanks!
You should be able to list the files similar to the example you gave. I did this on my machine to copy 2 specific files from a directory with many other files present.
rsync test.sql test2.cpp myUser@myHost:path/to/files/synced/
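So for your case, listing the specific local files followed by the remote destination should do it (paths, user, and host below are placeholders):
rsync -av /path/to/file1 /path/to/file2 /path/to/file3 /path/to/file4 user@remotehost:/remote/target/dir/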
