ASP, need to use SFTP - asp-classic

This is classic ASP, not .NET. We need a way to SFTP into a server to upload and download a couple of files, kicked off by a user.
What have other people used to do SFTP in classic ASP? We're not necessarily opposed to purchasing a control.

If you have the ability to use WScript.Shell then you can just execute pscp.exe from the PuTTY package. Obviously this is less than ideal, but it will get the job done and let you use SCP/SFTP from classic ASP.
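For example, a minimal sketch from a classic ASP page (the pscp path, file names, host and credentials below are all placeholders):
<%
' Minimal sketch: shell out to PuTTY's pscp.exe from classic ASP.
' The pscp path, local file, host and credentials are placeholders.
Dim shell, cmd, exitCode
Set shell = Server.CreateObject("WScript.Shell")
cmd = """C:\putty\pscp.exe"" -sftp -batch -pw secret " & _
      "C:\uploads\report.csv user@example.com:/home/user/report.csv"
exitCode = shell.Run(cmd, 0, True)  ' 0 = hidden window, True = wait for exit
If exitCode <> 0 Then Response.Write "Transfer failed with exit code " & exitCode
Set shell = Nothing
%>
Note that the IIS application pool identity needs rights to run pscp.exe, and -batch makes pscp fail rather than hang if the host key isn't already cached for that account (see the fingerprint caveat in the next answer).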

The way I have done this is to create a command script file and pass it on the command line via the -b switch to psftp.exe. I have also tried this in Perl and have yet to find a neater way of doing it.
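Concretely, from classic ASP it looks something like this (a sketch only; the paths, host and credentials are placeholders):
<%
' Sketch: write a psftp command script, then run psftp.exe against it.
' All paths, the host name and the credentials are placeholders.
Dim fso, script, shell, exitCode
Set fso = Server.CreateObject("Scripting.FileSystemObject")
Set script = fso.CreateTextFile("C:\temp\sftp_commands.txt", True)
script.WriteLine "cd /incoming"
script.WriteLine "put C:\uploads\report.csv"
script.WriteLine "get summary.csv C:\downloads\summary.csv"
script.WriteLine "quit"
script.Close

' -batch disables interactive prompts; -b names the command script
Set shell = Server.CreateObject("WScript.Shell")
exitCode = shell.Run("""C:\putty\psftp.exe"" -pw secret -batch " & _
                     "-b C:\temp\sftp_commands.txt user@example.com", 0, True)
Set shell = Nothing
Set fso = Nothing
%>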
There is an issue with this method: the RSA fingerprint already has to have been accepted. If it hasn't, the script will either wait for user input to accept it or, if you are running in full batch mode, skip over it and fail. Also, if the server changes so that its RSA fingerprint changes (e.g. a cluster), then you need to re-accept the fingerprint.
Not an ideal method, but the only one I know.
I shall be watching this question in case anyone knows another way.

I've previously used a component from here: www.weonlydo.com. I didn't find it the easiest piece of kit to develop against but it got the job done in a hurry.

I used to do that with plain FTP on Windows (create a file of commands and shell out to FTP.exe).
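For reference, the same pattern with the built-in Windows client, feeding it a command file via the -s switch (host, credentials and file names are hypothetical):
rem Hypothetical contents of C:\temp\ftp_commands.txt:
rem   user myname mypassword
rem   put C:\uploads\report.csv
rem   bye
ftp -n -s:C:\temp\ftp_commands.txt ftp.example.com
The -n switch suppresses auto-login so the script's own user command performs it.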

December 2020:
Classic ASP is dead; it was superseded by ASP.NET 18 years ago.
At this time, the most common way to use SFTP in .NET is the SSH.NET NuGet package.
Maybe this question should be closed?

Related

Is there a way to automatically make a copy of a file each time it is updated in Unix?

I have an application that updates some files on a Unix server. Since I cannot modify this application, is there any way I can make sure these files are copied before each update, so I can have a history of the changes?
Is there a way/tool in Unix to do that?
If you are on Linux (specifically), you could use the inotify(7) facilities (perhaps via incrontab...), as in the sketch below.
Alternatively, you might periodically run (through some crontab(5) entry) a script doing some make with your particular Makefile (since GNU make is designed to care about timestamps), managing e.g. backups. Or you could periodically run some rsync command.
However, it smells like you need revision control (also known as a version control system). I strongly recommend git; you could use it before and after running your application (e.g. write some wrapping shell script doing that).
But there is probably no universal solution (e.g. what if the monitored application keeps a file descriptor open for a long time, and writes the file little by little...). You should explain in much more detail what is happening and what you want.
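As a rough illustration of the inotify approach (the paths are hypothetical, and inotifywait comes from the inotify-tools package):
#!/bin/sh
# Hypothetical watcher: every time /srv/app/data.txt is closed after a
# write, keep a timestamped copy of it in /srv/backups.
inotifywait -m -e close_write /srv/app/data.txt |
while read -r _event; do
    cp /srv/app/data.txt "/srv/backups/data.txt.$(date +%Y%m%d%H%M%S)"
done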

How can I write code that always runs on my web server - ASP.NET

I need to write code that updates my database at 7 a.m. and 7 p.m. every day. I think this code would have to run continuously on my web server and update my database. How can I make this code run all the time?
Can anyone help me with this problem?
Thanks.
You'll most likely have to achieve that by writing a script in the database, if it's the database itself you want to update. If so, you'll also have to give more details, including which DBMS you're using and the context behind this update.
Use a cron script (on a Linux web server) or, on Windows, schedule a task. Either way, you will need a command-line script (or batch file) that starts the task for you twice a day. On Windows, that would probably be a VBScript which then runs the procedure that updates the database, or calls the program that updates it.
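A minimal sketch of such a VBScript, assuming a SQL Server database and a hypothetical stored procedure named usp_DailyUpdate (the connection string and all names are placeholders):
' Hypothetical scheduled-task script: run it via Windows Task Scheduler
' (e.g. cscript //B update.vbs) with triggers at 07:00 and 19:00.
Option Explicit
Dim conn
Set conn = CreateObject("ADODB.Connection")
conn.Open "Provider=SQLOLEDB;Data Source=DBSERVER;" & _
          "Initial Catalog=MyDatabase;Integrated Security=SSPI;"
conn.Execute "EXEC usp_DailyUpdate"
conn.Close
Set conn = Nothing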
I think one of the good suggestions for this is in Jeff Atwood's article:
https://blog.stackoverflow.com/2008/07/easy-background-tasks-in-aspnet/
Check it out; it might suit you as well.

Using pscp.exe for SFTP transfer is very slow compared to FileZilla

I have a weird problem. I'm using pscp.exe from within a C# program (with Process.Start) to upload files to an SFTP server. Now I have set up a new server running the same program, uploading to the same SFTP server as before, but it runs incredibly slowly on the new server.
The weird thing is that when I try uploading the files manually via FileZilla, the upload goes as fast as expected, but not when using the program.
Can anyone explain this? Am I missing something obvious like a windows setting or something?
SSH supports what we call pipelining - sending multiple SSH packets without waiting for a response to each packet. OpenSSH supports this functionality, while PuTTY doesn't (or at least didn't until recently). That's what you're observing. Another reason is the choice of algorithms: if AES is negotiated, it's faster than the DES and 3DES used by default by older applications.
I ended up rewriting the SFTP transfer to use the .NET wrapper for WinSCP instead. The solution was fast, and so was the file transfer. Here's a link to the documentation.
Uploading files using WinSCP is like 10 times faster.
To do that from the command line, first you have to add the winscp.com file to your %PATH%. It's not a top-level domain, but an executable .com file, which is located in your WinSCP installation directory.
Then just issue a simple command and your file will be uploaded much faster than PuTTY ever could:
WinSCP.com /command "open sftp://username:password@example.com:22" "put your_large_file.zip /var/www/somedirectory/" "exit"
And make sure you check out the synchronize folders feature, which is basically what rsync does, so you won't ever want to use pscp.exe again:
WinSCP.com /command "help synchronize"
Filezilla can use multiple concurrent connections and reuse open connections. I believe PSCP is a relatively simple application.
A library like SFTP.NET will probably yield better results than running a child pscp process.
It would also help to use the ZipPackage to compress the files when sending them.

automate sftp upload process

I'm looking for a way to upload a file/directory structure from one server to another.
The only way that's possible in my case is SFTP upload. Is there any easy way to upload it, using a script or something, without making an archive of the files/directories I want to recreate on the remote server?
Thank you!
Perhaps a solution could be found using recursive scp (scp -r)? Or are you explicitly limited to SFTP?
There's also a client named lftp which has SFTP and scripting support - much like a batch file, I would imagine: a list of FTP commands. (http://lftp.yar.ru/lftp-man.html)
You may want to consider Syncplify.me FTP Script! as a solution. It allows you to write very simple scripts to achieve your goal.
For example, uploading an entire directory to a remote SFTP server would actually be a single line of code added to one of the ready-made templates.
http://www.syncplify.me/products/ftp-script/
edtFTPj/PRO is a Java SFTP client that has a comprehensive scripting engine. Being Java you can run it on any platform where Java is supported.
Here are some more details on the scripting support. It has an 'mput' command that uploads all the files in the current directory to the remote directory.
Recursive transfers aren't yet supported, but could easily be added if required - email support if you are interested.

From ASP.NET, How to Download Two Excel Files and Invoke Batch Command on Them?

I need to download two Excel files onto the client, and then run a (diff) executable against them. I know how to download a single Excel file, from here. But how do I download a second one automatically in succession? And then how do I run a batch command on them? Is this even realistic? Any guidance or pointers would be greatly appreciated.
Thanks,
Mike
To download multiple files at once you have two main options:
1) Just open multiple windows to your page-generation script to download multiple files, as per http://www.webdeveloper.com/forum/showpost.php?s=b4f6b25edeb6b7ea55434c4685a675fe&p=950225&postcount=6
2) Archive the files into a package (zip/arj/7z etc.) and send the archive to the client,
e.g. http://www.motobit.com/tips/detpg_multiple-files-one-request/
As for doing the diff client-side, that is a lot more tricky, as Shhnap has already mentioned. If you are doing this for a controlled client base, you may be able to get them to allow permissions for an ActiveX script that runs something client-side (or fires off a console application) - but if you don't have fine control over the client environment then I can't think of a way to do it.
As Shhnap suggested, can you not just do the comparison server-side (and then send the result to the client as a third file)?
Well, just some pointers, because I'm not sure I completely understand the problem. You want a user to be given two downloads at the same time and then to run a diff command against those two files? On the server or the client, I'm not sure. You'll have a lot of problems automating the client-side version, because forcing people to run client-side code is usually frowned upon by virus protection software.
The server-side diff sounds exactly like a CGI moment to me: http://www.cs.tut.fi/~jkorpela/perl/cgi.html. That will allow you to generate a web page that shows the diff between the two. CGI allows you to run programs on your server and display their output in a web page; that's the simple explanation.
If that was not quite what you wanted then feel free to leave me a comment and I'll try to edit my answer to correct it.
