I would like to print a folder/file list (recursive). I can connect to the FTP server using ssh and can download using wget, but I would like to print the entire directory structure before downloading everything from the server.
I have tried https://superuser.com/questions/790253/how-to-redirect-the-output-of-a-ftp-recursive-listing-to-a-local-file-with-windo but it didn't work.
Please suggest a workaround.
Thank you
Jessy
I have found a workaround. What I did: using FileZilla, I loaded all the files into the transfer queue and later exported that queue to retrieve the paths.
Thanks for your help
Jessy
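For anyone who lands here later: since the question mentions ssh access to the same server, the listing can also be captured from the command line. A minimal sketch, with user, host, and path as placeholders:

# run ls recursively on the server and save the output locally
ssh user@ftp.example.com 'ls -lR /path/to/files' > listing.txt

This writes the full recursive listing to a local file, which can be reviewed before starting the wget download.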
I have been given access to a bunch of files on a server and have been tasked with putting these files on some sort of website where people can visit and access them. I have been able to SSH to the server using terminal commands and am able to view all of the files in the directory. I have tried using scp_download; however, that downloads all of the files, which is an extremely large transfer and would ideally be avoided. Is there a way I can take the files on the server and get them uploaded directly to a website using Shiny (the R package)? If not, is there another way?
It looks like you already discovered the R package ssh. You could use ssh::ssh_exec_internal() to execute a command on the server that uploads/copies the files to the website server. How you do that depends on the type of access you have to the website server.
If you have ssh access to the website server as well, your command could look something like this:
# assuming an open session to the file server, e.g.:
ssh_session <- ssh::ssh_connect("user@file_server_ip")
ssh::ssh_exec_internal(ssh_session,
  "scp file remote_username@website_server_ip:/directory")
When developing for ASP.NET using Visual Studio for Web, it is really convenient that when you deploy your website to the web server, it cleverly checks which files have changed and uploads only those files. In addition, if you deleted some files from your source, it detects that too and deletes those files from the web server, since they are no longer needed.
I started developing with the LAMP stack and am wondering how you can deploy to a web server in a similar way.
I tried using FileZilla; when copy/pasting the source files to the web server, you have these options if files already exist:
- Overwrite
- Overwrite if source is newer
- Overwrite if different size
- Overwrite if different size or source newer
"Overwrite if source is newer" works, kind of, but it only checks the date modified, not the content of the file. Also, the above method does not delete files from the web server that were deleted from the source.
Is there a better way to do this with FileZilla? (Or maybe with some other program?)
Thanks.
You can use rsync to accomplish this.
When you want to push out changes, you would do something like this from your production server:
rsync -av user@<developmentIp>:/web/root/* /production/web/root/
The pattern is rsync --flags [user@host:]/source/dir [user@host:]/destination/dir
You only need the user@host part for remote hosts. The user must have ssh access to the host.
A couple of small suggestions:
The command can be run from the source or the destination. I find it better to run it from the destination for permissions reasons (i.e. you're reading from the remote and writing to the local), as in the sketch after this list.
Do some tests first; I always mix up the directory details: do I need the trailing slash, should I use the star, and so on.
Read the man page; there are a lot of available options that may be helpful (--delete, --exclude, -a).
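To cover the deletion behavior the question asks about, a minimal sketch run from the production (destination) server; the host and paths are placeholders:

# -a archive mode (recursive, preserves permissions and timestamps), -v verbose
# --delete removes destination files that no longer exist in the source
# -n (--dry-run) only prints what would be transferred or deleted
rsync -avn --delete user@dev.example.com:/web/root/ /production/web/root/

The trailing slash on the source means "the contents of /web/root" rather than the directory itself, which avoids creating a nested root/root on the destination. Drop the -n once the preview looks right.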
I have a file on my desktop and I need to get it onto another server, but I have no means of getting it there by email, USB, or anything like that.
The server is on the same network as me.
I have heard there is a way the file can be copied via the command line.
Would anyone have any information on this and if so could you please help me?
Not sure whether you have command-line access to that server or not. If yes, are you accessing it via telnet or via ssh?
If ssh, you should be able to transfer the file via scp (secure copy), since it uses the same ssh connection you use to get your command line. If you want to transfer your file from a Windows environment, you may want to look at WinSCP; otherwise, do a man scp on your Linux or Unix server and, assuming you have it, you'll get the hang of it... it's not complicated.
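For example, a minimal invocation from the desktop might look like this (the user, host address, and file name are placeholders):

# copy a local file to your home directory on the remote server over ssh
scp ~/Desktop/myfile.txt user@192.168.1.50:~/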
If ssh is not an option, then you depend on the server having some service available for you to transfer the file, the most obvious one being FTP.
Does that help?
I have a batch file with an SFTP instruction to download .txt files (run as a cron job), basically: get *.txt
I'm wondering what the best method is to delete those files after they have been downloaded. The only problem is that the directory is constantly being updated with new files, so running rm *.txt afterwards won't work: it could delete files that arrived after the download started.
I've thought of a couple of complex ways of doing this, but no command-line-based methods. So I thought I'd shoot a query out to you guys and see if there's something I haven't thought of yet.
I suggest making a list of all the files that were downloaded and then issuing ftp delete/mdelete commands with the exact file names.
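A sketch of that idea with the OpenSSH sftp client (host, user, and directory are placeholders): snapshot the file names first, then download and delete exactly those names, so anything that arrives mid-transfer is left alone.

#!/bin/sh
HOST=user@sftp.example.com
DIR=/incoming

# Pass 1: snapshot the .txt files present right now.
# sftp echoes each batch command with an "sftp>" prefix, so filter those out.
printf 'cd %s\nls -1 *.txt\n' "$DIR" | sftp -b - "$HOST" \
  | grep -v '^sftp>' | grep '\.txt$' > files.list

# Pass 2: build a batch that downloads and then deletes exactly those names.
{
  printf 'cd %s\n' "$DIR"
  while read -r f; do
    printf 'get %s\nrm %s\n' "$f" "$f"
  done < files.list
} > batch.sftp

sftp -b batch.sftp "$HOST"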
I just want to transfer a file from an FTP server to a Unix folder; this is straightforward.
If the file doesn't exist on the FTP server yet, then the script needs to keep retrying until it finds the file. Please let me know how I can get that file.
Please remember the script has to run on the FTP server.
Thanks
CK
I'd mount the FTP server with curlftpfs http://curlftpfs.sourceforge.net and then use it as if it were a local file system; for example, run find(1).
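A sketch of that approach, assuming curlftpfs and FUSE are installed (credentials, host, mount point, and file name are placeholders):

mkdir -p /mnt/ftp
curlftpfs ftp://user:password@ftp.example.com /mnt/ftp

# ordinary tools now work against the mount, e.g. look for the file:
find /mnt/ftp -name 'wanted_file.txt'

fusermount -u /mnt/ftp    # unmount when done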
You need to write a program to automate your FTP session. You can either write your own custom FTP client, which is not that hard if you know a few things about network programming, or write a script to automate a session for an existing client. For the latter approach, I suggest using Expect if you are proficient with Tcl, or PyExpect if you prefer Python. Expect is a library designed to automate interactive tasks like downloading a file with FTP.
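Whichever client you automate, the retry-until-present logic itself can be a plain shell loop around the transfer. A minimal sketch using wget rather than an interactive client (URL, credentials, and file name are placeholders; adjust the interval to taste):

#!/bin/sh
URL='ftp://user:password@ftp.example.com/path/wanted_file.txt'

# wget exits non-zero while the file is absent, so keep retrying until it succeeds
until wget -q "$URL"; do
    sleep 60    # wait a minute between attempts
done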