I am trying to connect to a MySQL server that is restricted so that it only accepts connections from a particular server. I am trying to connect through this restricting server while not physically connected to it.
From the command line this is doable by creating an SSH connection, after which I can run MySQL commands. For example:
ssh myUsername@Hostname
myUsername@Hostname's password:
[myUsername@Host ~]$ mysql -h mySQLHost -u mySQLUsername -p mySQLPassword
However, I wish to connect to the MySQL database from within R, so I can send queries and read tables into my current R session. Usually I would run an R session inside the command line, but the server does not have R installed on it.
For example, I have this snippet of code that works when I am physically connected to the server (identifying information changed):
myDB <- dbConnect(MySQL(), user="mySQLUsername", password="mySQLPassword", dbname="myDbname", host="mySQLHost")
In essence, I want to run this same command through a pipe, so that the myDB object is a working MySQL connection.
I have been trying to pipe my way into the restricting server from within R, and have been able to read in a csv file. For example:
dat <- read.table(pipe('ssh myUsername@Hostname "cat /path/to/your/file"'))
This prompts me for my password, and the table is read (as suggested here). However, I am unsure how to translate this to a MySQL connection. For example, should I make the pipe part of the host argument? That was my first thought, but I have been unable to make it work.
Any help would be appreciated.
I accomplish a similar task with Postgres using SSH tunneling. Effectively, what you're doing with an SSH tunnel is saying "establish a connection to the remote server, and make a port from that server available as a port on my local machine."
You can set up an SSH tunnel using the following command on your local machine:
ssh -L local_port:localhost:remote_port username@remote_host
Specifically, what you're doing with this command is creating a Local Port Forwarding SSH tunnel, which is taking the port you'd connect to directly on the machine with your database installed (remote_port), and securely sending it to the machine you have R installed on as local_port.
For example, for a database server with the following options:
hostname: 192.168.1.3
username: mysql
server mysql port: 3306
You could use the following command (at the command line, or in R using system2) to create a tunnel to port 9000 on your machine:
ssh -L 9000:localhost:3306 mysql@192.168.1.3
Depending on what your exact DBI connection looks like in R, you may have to edit the connection configuration slightly to make it connect to your newly created tunneling port. The reason why I use a different localhost port is that it prevents conflicts with a local version of the database, if you've got one.
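To make this concrete in R, here is a minimal sketch that assumes the hypothetical host, credentials, and local port 9000 from the example above, and uses the RMySQL package from the question:
# Open the tunnel from R; -f and -N background ssh once authenticated. With password
# authentication it is usually easier to run this same command in a separate terminal.
system2("ssh", c("-f", "-N", "-L", "9000:localhost:3306", "mysql@192.168.1.3"))
# Connect through the local end of the tunnel instead of the remote host.
library(RMySQL)
myDB <- dbConnect(MySQL(), user="mySQLUsername", password="mySQLPassword", dbname="myDbname", host="127.0.0.1", port=9000)
Using 127.0.0.1 rather than localhost forces a TCP connection to the forwarded port; with MySQL, localhost would make the client try a local socket instead.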
I want to run concurrent Jupyter Lab sessions, where the sessions are served by different servers (for example, a local machine and a remote (cloud) server, or two cloud servers).
If a Jupyter Lab instance is running on my local machine and I type jupyter lab on a remote machine and paste the URL into my browser, it asks for a new workspace name, or sometimes offers a localhost address ending in :8889 instead of :8888. But I haven't been able to figure out how to add the remotely hosted notebook to the existing Jupyter Lab instance as a new tab, or to run it side by side in a new browser tab.
I finally figured out how to get two cloud VMs running side-by-side Jupyter Lab sessions in the tabs of one browser.
I had configured SSH so that my cloud VM was forwarded to port 8888. The answer was to add a second SSH configuration for the second VM so that it forwarded to port 8889, and then make matching entries in the jupyter_notebook_config files on the two machines, like this:
Edit .ssh/config on your local machine (replace the contents of <...> and the identity file to adapt this to your situation)
##Override for Azure machine 1 ##
Host <IP address of VM 1>
User <your username on VM1>
IdentityFile ~/.ssh/vm1_rsa #SSH private key
LocalForward 8888 localhost:8888
##Override for Azure machine 2 ##
Host <IP address of VM 2>
User <your username on VM2>
IdentityFile ~/.ssh/vm2_rsa #SSH private key (could be the same as for VM1)
LocalForward 8889 localhost:8889
Edit the .jupyter/jupyter_notebook_config.py file on each machine so that VM1's file includes the line c.NotebookApp.port = 8888 and VM2's file includes the line c.NotebookApp.port = 8889.
I haven't tested it yet for the case where you want to run side-by-side Jupyter Lab sessions from a local machine and a remote machine, but I assume the mechanics would be the same.
I am having trouble creating an SSL connection using RPostgreSQL to an AWS hosted PostgreSQL database.
Here is what I've tried so far:
Created the PostgreSQL database on AWS.
Set the database parameter "rds.force_ssl" to 1.
Downloaded the AWS public key from https://s3.amazonaws.com/rds-downloads/rds-combined-ca-bundle.pem
Tested the connection from a Windows command prompt with psql (it works).
Executed the following in R:
library(RPostgreSQL)
cert <- paste0("C:/Users/johnr/Downloads/", "rds-combined-ca-bundle.pem")
dbname <- paste0("dbname=", "flargnog", " ", "sslrootcert=", cert, " ", "sslmode=verify-full")
host <- "xxxxxx.xxxxx.us-region-2.rds.amazonaws.com"
con <- dbConnect(dbDriver("PostgreSQL"), user="username", host=host, port=5432, dbname=dbname, password="abcd1234!")
I receive an error message after executing the last statement:
Error in postgresqlNewConnection(drv, ...) :
RS-DBI driver: (could not connect username#xxxxxx.xxxxx.us-region-2.rds.amazonaws.com on dbname "flargnog"
If I change the rds.force_ssl setting to 0 (and remove the ssl stuff from dbname) the connection works just fine.
I have looked at other posts on Stack Overflow related to this issue. This and this seem to indicate that an SSL connection is not possible due to issues with RPostgreSQL. However, this post indicates that it is.
Any guidance would be appreciated!
You can try to SSH to the RDS instance using e.g. PuTTY and port-forward your local port 5432 to the remote port 5432. Once the SSH connection is open, in R just connect to localhost:5432...
Here is how to port-forward using PuTTY:
http://www.akadia.com/services/ssh_putty.html
Here is how this works via command-line:
https://gist.github.com/magnetikonline/3d239b82265398568f31
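Assuming the tunnel is up and your local port 5432 is forwarded to the RDS instance's port 5432, a minimal sketch of the R side (reusing the placeholder credentials from the question) would be:
library(RPostgreSQL)
# Point the driver at the local end of the forward instead of the RDS endpoint.
con <- dbConnect(dbDriver("PostgreSQL"), user="username", password="abcd1234!", host="localhost", port=5432, dbname="flargnog")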
P.S.: Make sure your instance is in a security group that accepts SSH connections (port 22).
On my local machine, I have SSH'd into the bastion, from which I can then connect to the remote MySQL server. I know that this is working because the terminal says that I have successfully connected, and when I use an app like SQLPro to connect to the MySQL server with the correct permissions, I am able to log in successfully. Also, the command
mysql -u username -p
works after I ssh.
Now, I am trying to use the RMySQL library to connect to the server, and using
con<-dbConnect(MySQL(), user = "username", password = "pw", host = "127.0.0.1")
I get the following in return:
Error in .local(drv, ...) : Failed to connect to database: Error: Can't connect to MySQL server on '127.0.0.1' (61)
It seems that R cannot tell that I have connected to the bastion. I say this because I have used the line below on the remote server before, and it worked just fine.
con<-dbConnect(MySQL(), user = "username", password = "pw", host = "localhost")
If you have MySQL Workbench, go to Server -> Client Connections and check the host name. Your host name might be incorrect.
I'm running R on Linux.
After a few hours of searching, the following documentation for AWS finally gave me the command I needed to connect to an RDS instance via an AWS bastion host:
https://aws.amazon.com/premiumsupport/knowledge-center/rds-connect-using-bastion-host-linux/
The "syntax 2" at the above link worked for me to set up the tunnel:
ssh -i "Private_key.pem" -f -N -L 3306:RDS_Instance_Endpoint:3306 ec2-user#EC2-Instance_Endpoint -v
This successfully forwarded my local port 127.0.0.1:3306 to the RDS port 3306.
I then connected to the RDS instance from within R with just:
cn = dbConnect(RMariaDB::MariaDB(), user = "myDataBaseUserName", password = "myPassword", host = "127.0.0.1", dbname = "mySchemaName")
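As a quick sanity check (generic DBI calls, nothing specific to this setup), you can list the tables through the tunnel and close the connection when you are done:
dbListTables(cn)   # should list the tables in mySchemaName
dbDisconnect(cn)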
I have a small example script (script_p.r) like the following, which I intend to run in a terminal.
#!/usr/bin/Rscript
sink("output_capture.txt")
mn <- mean(1:10)
# and so on, much longer list of tasks
I want to run this script remotely on another iMac host computer (IP address, e.g., not real: 111.111.111.111) which allows me to log in and work (e.g., not real: username user101, password p12334).
Is there a way to run this script remotely (say using ssh) from another computer, with IP address 222.222.222.222 and username user102?
First, put script_p.r on the remote machine.
Then either just do:
ssh user102@222.222.222.222
user102:~$ ./script_p.r
or ssh user102@222.222.222.222 './script_p.r'
or put it in a script:
runremote.sh :
#!/bin/bash
ssh user102@222.222.222.222 './script_p.r'
and run it locally:
user101:~$ ./runremote.sh
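If you would rather stay inside a local R session, a minimal sketch using system2 (assuming the same hypothetical user and address, and that script_p.r is executable on the remote machine) would be:
# Run the remote script over ssh and capture anything it writes to stdout; note that
# script_p.r as written sinks its output to output_capture.txt on the remote machine.
out <- system2("ssh", c("user102@222.222.222.222", "./script_p.r"), stdout = TRUE)
cat(out, sep = "\n")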
I want to back up folders from my remote server (Ubuntu server) to another remote server (Linux server), but when I run this command from the first server it displays an error message:
rsync -raz --progress firstdirectory root@serverIP:/home
The displayed message is:
ssh: connect to host <serverIP> port 22: Connection timed out
rsync: connection unexpectedly closed (0 bytes received so far) [sender]
rsync error: unexplained error (code 255) at io.c(601) [sender=3.0.7]
But the same command from server 2 to server 1 works fine, and the folder is copied nicely into server 1.
How can I avoid the connection error so that I can copy my folder from server 1 to server 2 through rsync?
It seems like server 2 has no active SSH daemon, while server 1 does.
Try running an SSH daemon, or use the raw rsync protocol with an rsync daemon.
If it's a connection timeout because your SSH server is slow to respond, you can tweak the timeout in rsync:
rsync -e 'ssh -o ConnectTimeout=120'
Otherwise it may be a missing SSH daemon (sshd) on server 2, as stated by @geov, or a closed port on your firewall. You may start by testing an SSH login:
ssh user@serverIP
And see if it works or not. Running nmap serverIP will probably help you too, showing whether SSH is running or not.
And please do NOT use root user for your rsync copy!
If you wait for a long time, the prompt appears.
I think that your server 2's IP is wrong.
For me, this error appeared when attempting to rsync between two AWS EC2 instances where the two instances were not a part of the same security group.
Overview of how to create security groups
How to change the security groups of the instances
Allow instances within the same security group to communicate