How to take files from a server over SSH and, without downloading them locally, have them uploaded to a website? - r

I have been given access to a bunch of files on a server and have been tasked with putting these files on some sort of website where people can visit and access them. I have been able to SSH into the server using terminal commands and can view all of the files in the directory. I have tried using scp_download; however, that downloads all of the files, which are extremely large, so I would ideally like to avoid that. Is there a way I can take the files on the server and have them uploaded directly to a website using Shiny (R package)? If not, is there another way to do this?

It looks like you already discovered the R package ssh. You could use ssh::ssh_exec_internal() to execute a command on the server that uploads/copies the files to the website server. How you do that depends on the type of access you have to the website server.
If you have ssh access to the website server as well, your command could look something like this:
ssh::ssh_exec_internal(ssh_session,
  "scp file remote_username@website_server_ip:/directory")

Related

Unsure where image_write downloads to in shinyapps.io

I'm attempting to make a public shinyapps.io website, and I'm trying to use image_write to write a file to a local directory.
The following code works locally in RStudio:
image_write(im.resized, path = paste0(output_file_directory, file_name), format = "jpg")
When I run the code on the shinyapps.io website, it runs without error, but I'm not sure where it writes the file to. I know that the output_file_directory part isn't the issue, so I'm a little lost. Any help would be much appreciated!
On shinyapps.io it is not possible to store data permanently, because:
"Shinyapps.io is a popular server for hosting Shiny apps. It is designed to distribute your Shiny app across different servers, which means that if a file is saved during one session on some server, then loading the app again later will probably direct you to a different server where the previously saved file doesn’t exist."
See here:
https://shiny.rstudio.com/articles/persistent-data-storage.html
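If the goal is just to hand the image to the visitor rather than keep it on the server, one option that fits shinyapps.io's ephemeral storage is to write the file to the temporary path Shiny provides in a downloadHandler. The sketch below is illustrative only: im.resized stands in for the question's object (here built from ImageMagick's bundled "logo:" demo image) and the output name is made up. For files that must outlive a session, the linked article's suggestions (remote storage such as Amazon S3, Dropbox, or a database) apply.

library(shiny)
library(magick)

ui <- fluidPage(
  downloadButton("download_image", "Download resized image")
)

server <- function(input, output, session) {
  # stand-in for the question's im.resized object
  im.resized <- image_scale(image_read("logo:"), "200")

  output$download_image <- downloadHandler(
    filename = function() "resized.jpg",
    content = function(file) {
      # write to the temporary path Shiny supplies; on shinyapps.io anything
      # written next to the app is ephemeral and vanishes between sessions
      image_write(im.resized, path = file, format = "jpg")
    }
  )
}

shinyApp(ui, server)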

Print recursive folders / files over an SSH connection to an FTP server

I would like to print a recursive folder/file list. I can connect to the FTP server using SSH and can download with wget, but I would like to print the entire directory structure before downloading everything from the server.
I have tried https://superuser.com/questions/790253/how-to-redirect-the-output-of-a-ftp-recursive-listing-to-a-local-file-with-windo but it didn't work.
Please suggest a workaround.
Thank you
Jessy
I have found a workaround: using FileZilla, I loaded all the files into the queue and later exported that queue to retrieve the paths.
Thanks for your help
Jessy
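Since the asker already has SSH access to the server, another way to get the listing without downloading anything is to run find remotely and save its output. A rough sketch using the same R ssh package as the question at the top of this page; the user, host, and directory are placeholders:

library(ssh)

session <- ssh_connect("user@ftp_server_ip")

# recursively list every file under the remote directory
out <- ssh_exec_internal(session, command = "find /remote/data -type f")

# save the listing locally so it can be reviewed before downloading anything
writeLines(rawToChar(out$stdout), "remote_file_list.txt")

ssh_disconnect(session)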

Deploying source to a web server and deleting files that are no longer needed

When developing for ASP.NET using Visual Studio for Web, it is really convenient that when you deploy your website to the web server, it cleverly checks which files have changed and uploads only those. In addition, if you deleted some files from your source, it detects that too and deletes them from the web server since they are no longer needed.
I started developing with the LAMP stack and am wondering how you can deploy to a web server in a similar way.
I tried using FileZilla, and when copying/pasting the source files to the web server you get these options if matching files already exist:
-Overwrite
-Overwrite if source is newer
-Overwrite if different size
-Overwrite if different size or source newer
"Overwrite if source is newer" works, kind of, but it only checks the date modified, not the content of the file. Also, the above method does not delete files from the web server that were deleted from the source.
Is there a better way to do this with Filezilla? (or maybe use some other program?)
Thanks.
You can use rsync to accomplish this.
When you want to push out changes, you would run something like this from your production server.
rsync -av user@<developmentIp>:/web/root/* /production/web/root/
The pattern is rsync --flags [user@host:]/source/dir [user@host:]/destination/dir
You only need the user@host part for remote hosts. The user must have ssh access to the host.
A couple of small suggestions:
The command can be run from the source or the destination. I find it better to run the command from the destination, for permissions reasons (i.e., you are reading from the remote side and writing to the local side).
Do some tests first; I always mix up the directory details (do I need the trailing slash, should I use the star, ...).
Read the man page; there are a lot of available options that may be helpful (--delete, --exclude, -a).

Best way to download many files from a website

I'm designing a website that will let the user synchronize a local folder with an online folder (kind of like Dropbox).
I'm trying to avoid developing a local tool to do the download and synchronization, and instead do it somehow online.
Is it possible to download multiple files in a certain directory tree?
Can a website have free write access to local directories?
A zip file is not an option, since the file batch could get pretty big.
EDIT: Synchronization shouldn't occur periodically, just when the user logs in to the website.
My recommendation is to use something like WebDAV.

Script to recursively look for a file on an FTP server until it is found

I just want to transfer a file from an FTP server to a Unix folder; this part is straightforward.
If the file doesn't exist on the FTP server yet, the script needs to run recursively until it finds the file. Please let me know how I can get that file.
Please remember the script has to run on the FTP server.
Thanks
CK
I'd mount the FTP server with curlftpfs http://curlftpfs.sourceforge.net and then use it as if it were a local file system, for example by running find(1).
You need to write a program to automate your FTP session. You can either write your own custom FTP client, which is not that hard if you know a few things about network programming, or write a script to automate a session for an existing client. For the latter approach, I suggest using Expect if you are proficient with Tcl, or Pexpect if you prefer Python. Expect is a library designed to automate interactive tasks like downloading a file with FTP.
