I'm looking for a way to upload a file/directory structure from one server to another.
The only option in my case is SFTP upload. Is there any easy way to upload it, using a script or something, without making an archive of the files/directories I want to recreate on the remote server?
Thank you!
Perhaps a solution could be found using recursive scp (scp -r)? Or are you strictly limited to SFTP?
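If scp is allowed, a recursive copy of the whole tree is a one-liner (a sketch; the host and paths are placeholders):
scp -r /local/dir user@remote.example.com:/remote/path/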
There's also a client named lftp which has SFTP and scripting support - much like a batch file, I would imagine: a list of FTP commands. (http://lftp.yar.ru/lftp-man.html)
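For example, pushing an entire local directory to an SFTP server can be done with lftp's reverse mirror (a sketch; the credentials, host and paths are placeholders):
lftp -u user,password sftp://remote.example.com -e "mirror -R /local/dir /remote/dir; quit"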
You may want to consider Syncplify.me FTP Script! as a solution. It allows you to write very simple scripts to achieve your goal.
For example, uploading an entire directory to a remote SFTP server would actually be a single line of code added to one of the ready-made templates.
http://www.syncplify.me/products/ftp-script/
edtFTPj/PRO is a Java SFTP client that has a comprehensive scripting engine. Being Java you can run it on any platform where Java is supported.
Here are some more details on the scripting support. It has an 'mput' command that uploads all the files in the current directory to the remote directory.
Recursive transfers aren't yet supported, but could easily be added if required - email support if you are interested.
When developing for ASP.NET using Visual Studio for Web, it is really convenient that when you deploy your website to the web server, it cleverly checks which files have been changed and only uploads those files. In addition, if you deleted some files from your source, it detects that too and deletes them from the web server since they are no longer needed.
I started developing with the LAMP stack and am wondering how you can deploy to a web server in a similar way.
I tried using Filezilla and on copy/pasting the source files to the web server, you have these options if there are similar files:
-Overwrite
-Overwrite if source is newer
-Overwrite if different size
-Overwrite if different size or source newer
"Overwrite if source is newer" works, kind of, but it only checks the date modified, not the content of the file. Also, the above method does not delete files from the web server that were deleted from the source.
Is there a better way to do this with Filezilla? (or maybe use some other program?)
Thanks.
You can use rsync to accomplish this.
When you want to push out changes you would do something like this from your production server:
rsync -av user@<developmentIp>:/web/root/* /production/web/root/
The pattern is rsync --flags [user@host:]/source/dir [user@host:]/destination/dir
You only need the user@host part for remote hosts. The user must have SSH access to the host.
A couple of small suggestions:
The command can be run from the source or the destination. I find it better to run the command from the destination, for permissions reasons (i.e. you're reading from the remote and writing to the local).
Do some tests first; I always mix up the directory syntax - do I need the trailing slash, should I use the star, ...
Read the man page; there are a lot of available options that may be helpful (--delete, --exclude, -a) - see the sketch below.
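To illustrate those flags (a sketch; the host and paths are placeholders, and the -n flag makes it a dry run so nothing is actually copied or deleted until you remove it):
rsync -avn --delete --exclude '.git/' user@devhost:/web/root/ /production/web/root/
The trailing slash on the source means "the contents of /web/root", and --delete removes files from the destination that no longer exist in the source - which covers the "deleted files" part of the original question.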
I have a weird problem. I'm using pscp.exe from within a C# program (with Process.Start) to upload files to an SFTP server. Now I have set up a new server with the same program, where I upload to the same SFTP server as before, but it runs incredibly slowly on the new server.
The weird thing is that when I try uploading the files manually via FileZilla, the upload goes as fast as expected, but not when using the program.
Can anyone explain this? Am I missing something obvious like a windows setting or something?
SSH supports what we call pipelining - sending multiple SSH packets without waiting for a response to each packet. OpenSSH supports this functionality, while PuTTY doesn't (at least it didn't until recently). That's what you observe. Another reason is the choice of algorithms: if AES is negotiated, it's faster than the DES and 3DES used by default by older applications.
I ended up rewriting the SFTP transfer to use the .NET wrapper for WinSCP instead. The solution was fast, and so was the file transfer. Here's a link to the documentation.
Uploading files using WinSCP is like 10 times faster.
To do that from the command line, first you have to add the winscp.com file to your %PATH%. It's not a top-level domain, but an executable .com file, which is located in your WinSCP installation directory.
Then just issue a simple command and your file will be uploaded much faster than PuTTY ever could:
WinSCP.com /command "open sftp://username:password@example.com:22" "put your_large_file.zip /var/www/somedirectory/" "exit"
And make sure you check the synchronize folders feature, which is basically what rsync does, so you won't ever want to use pscp.exe again.
WinSCP.com /command "help synchronize"
Filezilla can use multiple concurrent connections and reuse open connections. I believe PSCP is a relatively simple application.
A library like SFTP.NET will probably yield better results than running a child pscp process.
It would also help to use the ZipPackage to compress the files when sending them.
I need to write a script to automate file transfer from one server to another using only SFTP. Can you provide an example of an automated SFTP script, or some code that would help me?
Check out this SO post for some suggestions. Additional details about your server environment would be helpful (Linux, Windows, etc.).
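For a Linux environment, one common approach is the stock OpenSSH sftp client's batch mode (a sketch; it assumes key-based authentication is already set up, and the host and paths are placeholders):
echo "put /local/path/report.csv /remote/incoming/" > batch.txt
echo "bye" >> batch.txt
sftp -b batch.txt user@remote.example.com
Put those lines in a cron job or wrapper script and the transfer runs with no interactive prompts.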
I have developed a web-based application in ASP.NET and C# where users have the facility to upload files to the server through this application. I want the application to scan the uploaded files for viruses before saving them on the server, much like when we attach files to our email in Yahoo. Please guide me on how I can achieve this functionality: any API which can be integrated into an ASP.NET application, or any other way you can suggest. We can purchase a licensed version of a product which can achieve this. I have googled but did not find specific results.
Thanks in advance!
First of all, the file must be saved onto the server before you can scan it. If you notice, Yahoo will upload the file first - but not allow the attachment to be sent until it has been scanned.
Then you can use an antivirus with a command-line interface or some other kind of API. Both of these can be called via C# and should provide the functionality you require. Perhaps write a wrapper class that takes a file and returns true or false depending on whether a virus was detected.
Other applications that provide you with a command line interface:
Microsoft Security Essentials
ClamAV
I believe MS AV provides better results.
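For the ClamAV route, the wrapper class could simply run clamscan against the uploaded file and check the exit code (a sketch; it assumes ClamAV is installed on the server and the file path is a placeholder):
clamscan --infected --no-summary /uploads/incoming/somefile.docx
clamscan exits with 0 when the file is clean, 1 when a virus is found, and 2 on error, so the wrapper only has to inspect that code.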
Just purchase antivirus software that has a command-line interface (several popular packages include this). Once the file has been uploaded, run the scan.
I would think, in order to upload and scan at the same time, you might need to implement your own antivirus software as I'm not familiar with any package that would provide that sort of interface.
I run a shareware site. It doesn't work as you described, but I download each file to my local computer and run a scan on them. You would be doing something similar.
I just want to transfer a file from an FTP server to a Unix folder -- this is straightforward.
If the file doesn't exist on the FTP server, then the script needs to run recursively until it finds the file. Please let me know how I can get that file.
Please remember the script has to run on the FTP server.
Thanks
CK
I'd mount the FTP server with curlftpfs http://curlftpfs.sourceforge.net and then use it as if it were a local file system, for example by running find(1).
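Roughly like this (a sketch; the host, credentials, filename and paths are placeholders, and it assumes curlftpfs and FUSE are installed):
mkdir -p /mnt/ftp
curlftpfs ftp://user:password@ftp.example.com /mnt/ftp
find /mnt/ftp -name 'wanted-file.txt' -exec cp {} /local/unix/folder/ \;
fusermount -u /mnt/ftp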
You need to write a program to automate your FTP session. You can either write your own custom FTP client, not that hard if you know a few things about network programming, or write a script to automate a session for an existing client. For the latter approach, I suggest using Expect if you are proficient with TCL, or PyExpect if you prefer Python. Expect is a library designed to automate interactive tasks like downloading a file with FTP.
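If Expect feels like overkill, the stock command-line ftp client can also be driven from a plain shell script with a here-document (a sketch; the host, credentials, filename and paths are placeholders):
ftp -inv ftp.example.com <<'EOF'
user myuser mypassword
cd /remote/dir
get wanted-file.txt /local/unix/folder/wanted-file.txt
bye
EOF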