Accessing a File on My Desktop and Saving It to Use in Google Cloud Shell [closed] - google-cloud-shell

I have a JSON file saved on my computer's desktop and would like to move it into my Google Cloud Shell home directory.
Any ideas on how to do that?

There is a feature to upload a file to Cloud Shell, as described here.
From the Cloud Shell web interface, you just need to click the three-dot menu and select "Upload File".
If the file you want to upload is large, you can copy it over SCP from a local install of the gcloud CLI by running:
gcloud beta cloud-shell scp localhost:~/data.txt cloudshell:~/data.txt
Note that you have to be properly authenticated with your account and project, which you can do by running gcloud init and following the steps.
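If you haven't done that yet, a minimal local setup might look like the following sketch (the project ID is a placeholder):
# authenticate and point gcloud at your project (my-project-id is hypothetical)
gcloud auth login
gcloud config set project my-project-id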
If the file you want to upload is huge, you can use GCS to store it as an intermediate step (or as a final step, if the file is too big for the Cloud Shell environment). You can upload the file through the web interface by going to a bucket and clicking the "Upload files" button.
If your file is truly huge (GCS objects can be up to 5 TB), the best option is to upload it to GCS using the CLI. After setting up the CLI with gcloud init, you can run this command to upload the file to GCS:
gsutil -m cp ~/data.txt gs://<my-bucket>/data.txt
You can then retrieve the data in your Cloud Shell environment using the gsutil command-line tool, or by mounting the bucket as a volume with gcsfuse.
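As a sketch, retrieving the object inside Cloud Shell could look like this (bucket and file names are placeholders; the second option assumes gcsfuse is available in your Cloud Shell session):
# copy the object from the bucket into the home directory
gsutil cp gs://<my-bucket>/data.txt ~/data.txt
# or mount the whole bucket as a local directory with Cloud Storage FUSE
mkdir -p ~/my-bucket
gcsfuse <my-bucket> ~/my-bucket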
I suggest taking a look at the whole documentation, as Cloud Shell is a really powerful tool. For example, I especially like the editor, a fully browser-based IDE that needs nothing more than a browser.

Related

AWS EC2 RStudio Server Error Occurred During Transmission [closed]

After over a month, I have managed to piece together how to set up an AWS EC2 server. It has been very hard to upload files, as there are very conservative size limits when uploading via the upload button in RStudio Server. The error message when this is attempted is "Unexpected empty response from server".
I am not unique in this respect, e.g. Trouble Uploading Large Files to RStudio using Louis Aslett's AMI on EC2.
I have managed to use the following commands through PuTTY, and this has allowed me to upload files via either FileZilla or WinSCP:
# take ownership of RStudio's home directory and open up its permissions
sudo chown -R ubuntu /home/rstudio
sudo chmod -R 755 /home/rstudio
Once I use these commands and log out, I can no longer access RStudio on the instance on future logins. I can still log in to my instance via my browser, but I get the error message:
Error Occurred During Transmission
Everything is fine except that, once I use PuTTY, I lose browser access to my instances.
I think this is because the commands change ownership or something similar. Should I be using different commands?
If I don't use the commands, I cannot connect between FileZilla/WinSCP and the instance.
If anyone is thinking of posting a comment that this should be closed as a hardware issue: I don't have a problem with hardware. I am interested in the correct commands to use.
Thank you :)
OK, so eventually I realised what was going on here. The default home directory size for this AMI is only 8-10 GB, regardless of the size of your instance. As the upload was going to the home directory, there was not enough room. An experienced Linux user would not have fallen into this trap, but hopefully other Windows users who are new to this and come across the problem will see this.
If you upload to a different volume on the instance, the problem is solved. As the Louis Aslett RStudio AMI keeps the home directory in this 8-10 GB space, you have to set your working directory outside it. This is not intuitively apparent from the RStudio Server interface.
Whilst this is an advanced forum and this is a rookie error, I am hoping no one deletes this question, as I spent months on this and I think someone else will too.
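For anyone hitting the same wall, a quick way to confirm that disk space is the problem is to check the filesystems from a shell on the instance:
# shows each filesystem's size and usage; on this AMI the volume
# holding /home/rstudio is the small 8-10 GB one
df -h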
Don't change the rights of /home/rstudio unless you know what you are doing; it may cause unexpected issues (and it actually does cause issues in your case). Instead, copy the files with FileZilla or WinSCP to a temporary directory (say /tmp), then SSH to your instance with PuTTY and move the file to the rstudio directory with sudo (e.g. sudo mv /tmp/myfile /home/rstudio).
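Put together, that workflow looks roughly like this sketch (the host address and file name are placeholders):
# from your local machine: copy into /tmp, which any user can write to
scp myfile.csv ubuntu@<your-ec2-address>:/tmp/
# then, from an SSH session on the instance: move the file into place
# and hand it to the rstudio user instead of loosening /home/rstudio
sudo mv /tmp/myfile.csv /home/rstudio/
sudo chown rstudio:rstudio /home/rstudio/myfile.csv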

Uploading a file via website to SFTP server [closed]

I'm building a website. I have the SFTP login credentials for the server.
I'm trying to make it so that a user can select a file on their hard drive and upload it to a remote computer through SFTP.
Is this possible? How would I do this?
I assume you use (or can use) PHP, since you didn't specify what technology you are using.
Start with reading:
POST method uploads - for transferring a file from the client's machine to your website
PHP SFTP Simple File Upload or How do i use phpseclib to upload file from my php server -> someOther server? - for transferring a file from your website to the SFTP
Combining those gets you code like this:
// phpseclib 1.x (the old Net_SFTP API)
include('Net/SFTP.php');

// temporary path of the file the client uploaded via the HTML form
$uploaded_file = $_FILES["attachment"]["tmp_name"];

$sftp = new Net_SFTP("example.com");
if (!$sftp->login('username', 'password'))
{
    die("Connection failed");
}

// store the file on the SFTP server under its original client-side name
$sftp->put(
    "/remote/path/".$_FILES["attachment"]["name"],
    file_get_contents($uploaded_file));
This is very simplified code, lacking lots of validation and error checking.
The code uses the phpseclib library.
If you are on Windows, you can use an SFTP client like WinSCP... If you are on Linux, use the scp command (it will prompt for the password):
scp /home/me/myfile.dat user@remoteserver:/remotedir

How do I access data from my personal computer on my AMI instance running RStudio server [closed]

I have recently set up RStudio on an AWS EC2 instance using the process generously laid out by Louis Aslett on his website. But in an embarrassing turn of events, I can't access the data I need because it resides on my personal computer. I am new to cloud computing and have zero functional knowledge of Linux, but I do know SQL and R well. Any help or suggestions would be greatly appreciated.
Have you tried the "Upload" button in the "Files" pane of RStudio?
Use scp in a terminal.
To put files onto your remote server
Example: if the files are located locally in ~/mylocalfolder and you want to put them in /home/rstudio/mydata, you would execute in a terminal:
scp ~/mylocalfolder/*.csv ubuntu@<your address>:/home/rstudio/mydata/
Note that if you want to access them under a different user, e.g. rstudio, you need to change the owner of the files. Use chown.
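For example, a minimal sketch (user name and path taken from the example above):
# make the rstudio user the owner of the uploaded files
sudo chown -R rstudio:rstudio /home/rstudio/mydata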
To grab data from your remote server
Example: if the files are located in /home/rstudio/mydata and you want to put them locally in ~/mylocalfolder, you would use (the quotes stop your local shell from expanding the glob):
scp "ubuntu@<your address>:/home/rstudio/mydata/*.Rda" ~/mylocalfolder
I use the RStudio AMI all the time, and what works for me is to use Dropbox. I can't remember exactly how I did it, but I think I started the shell from within RStudio and installed Dropbox from the command line.
This link has a little more info:
http://www.louisaslett.com/RStudio_AMI/#comment-1041983219

Can I install New Relic on a Bluehost Shared Hosting Plan? [closed]

I want some more insight into my WordPress site, so I signed up for a New Relic account. But they're asking me to install some agent by typing in commands, and I have no idea where to do that or how to access a command line.
I use shared hosting with Bluehost, so I have access to cPanel. I've never typed anything into a console to manage the server - the icons have covered everything I need. So far.
Is it even possible to install this on my shared hosting plan? If so, where do I get the command prompt, and do I need FTP? I've tried to follow the instructions, but there are some strange words like "yum" and "rpm". What are these, and how do I run them?
Is there a WordPress plugin that I can just install and have everything done automatically for me?
If anyone could point me to some clear step by step instructions as to how to go about this, I'd be very grateful...
Thanks!
What you need is shell access. According to their features page at http://www.bluehost.com/cgi/info/hosting_features, they support shell access via Secure Shell (SSH). This is fairly common.
Download an SSH client like PuTTY and connect to yourdomain.com, or if you are on a Mac, open the Terminal app and type "ssh yourdomain.com" (with your web site's domain name). You can then run commands.
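For example (the username is a placeholder; on cPanel hosts the SSH user is typically your cPanel username):
ssh cpaneluser@yourdomain.com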
However, this probably will not get you what you need. You mentioned yum and rpm commands, which are system-level software installation tools. You'll need root access to do that, which you certainly cannot do on a shared hosting account.
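To illustrate why (the package name here is hypothetical; both commands need root and will fail on shared hosting):
sudo yum install some-agent-package
sudo rpm -ivh some-agent-package.rpm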
These types of tools are really intended for application service providers. I'm fairly familiar with New Relic's products. I'm guessing you are trying to install the server monitor tool. You probably don't need to worry about that one since Bluehost should be monitoring their servers for you.

Is there an easy way to copy files between servers using aptana? [closed]

I have two Ubuntu servers. One is my development box, the other is a production system. They are not identical, though. Most notably, the MySQL server is on a separate remote machine in production, whereas on the dev system it's on localhost. Basically this means that I can mostly use a clone of the dev system, but if I just sync the filesystem, the production system breaks.
Also, I am using Aptana (highly recommended, BTW). Up until recently, I had a local copy of my development system as a project, but I just had to reformat. I am not using a local project anymore; instead I am connecting to my dev system over SFTP and editing files there directly. Up until now, I have just been SCPing the files that need updating from my local project to the production server, but I kind of like not having one. I can restore it if necessary, though.
My question (short form): Is there an easy way using Aptana to copy files from one remote system to another?
Tim,
Currently you have to use a local project (which could be reconstructed from one of the remotes) to synchronize two remote servers.
To enable auto-upload of local changes, open the context menu in Project Explorer or App Explorer, select Deploy > Deployment Settings, then check "Automatically sync my changes with the remote site".
This is all for Aptana Studio 3.
Cheers,
Max
