Connecting to BigQuery in RShiny - r

I've tried two methods to connect my Shiny app to a BigQuery table as its source data:
Hadley's bigrquery, and
Mark Edmondson's BigQueryR
They're both failing the same way, so it's clearly a DFU error.
In each case, when I execute the appropriate command to establish the authorized connection (gar_auth_service(json_file = "/path/", scope = "https://www.googleapis.com/auth/bigquery") and bq_auth(path = "/path/"), respectively), I get this:
This site can’t be reached: localhost refused to connect.
Try:
Checking the connection
Checking the proxy and the firewall
ERR_CONNECTION_REFUSED
This error comes after what appears to be a normal Google login process in the browser. The error page is hosted at localhost:1410, if that's any help.
In the Console, I have:
Created a VM instance (Ubuntu 19)
Successfully installed R, RStudio, and Shiny
Successfully logged in to RStudio on my GCP instance (from the browser, obviously, using the External IP I reserved in GCP)
I've also already created a BigQuery table in the same project, and successfully connected to it from an R script on my local machine.
I'm trying to get that same R script to run from my Google Compute Engine instance.
Have I provided enough details to ask for help? If not, let me know what else I should provide. I'm walking through teaching myself GCP right now, and I'm quite the novice.
Thanks!

To bypass this issue, try connecting to your Ubuntu 19 instance using Chrome Remote Desktop on your Compute Engine instance as documented here.
Chrome Remote Desktop allows you to remotely access applications with a graphical user interface from a local computer instead of using the External IP. For this approach, you don't need to open firewall ports, and you use your Google Account for authentication and authorization. I've tried it and was able to connect to both Shiny Server and RStudio.
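For reference, here is a minimal sketch of the two auth calls from the question with the paths quoted (the key-file location is a placeholder, not a real path); when pointed at a valid service-account JSON key, both packages can usually authenticate non-interactively, which avoids the localhost OAuth redirect entirely:
library(googleAuthR)
library(bigrquery)

# googleAuthR route: authenticate with a service-account key file
# ("/path/to/key.json" is a placeholder)
gar_auth_service(
  json_file = "/path/to/key.json",
  scope = "https://www.googleapis.com/auth/bigquery"
)

# bigrquery route: same key file; with a service-account key, bq_auth()
# normally does not need the browser-based OAuth flow at all
bq_auth(path = "/path/to/key.json")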

Related

AWS Workspaces with NordVPN - Status Unhealthy

I've got an AWS Workspace (Windows 10) running in N. Virginia and installed NordVPN on it.
I'm trying to use NordVPN to change my location so I can access certain geo-restricted websites in Chrome while on my Workspace.
When I turn on NordVPN and connect to Atlanta (for example), after a few minutes the Workspace status changes to "Unhealthy" and eventually disconnects automatically.
I believe this is because the AWS Workspaces monitoring service can no longer see my Workspace, and thinks it's offline or something.
I also tried to use NordVPN's "Split Tunnelling" feature to only protect traffic going through the Chrome browser application, but as soon as I turn that on, my Workspace disconnects immediately and requires a reboot.
Is there any way I can configure my Workspace to allow the Workspace monitoring service to still reach my Workspace when the Workspace has a VPN connection? Or, has anyone been able to use a Workspace with NordVPN's Split Tunneling on the Chrome Browser application?
I also tried the NordVPN Chrome Extension, but that doesn't work either.
Thanks

Can't connect to Google Cloud SQL from Cloud Run - using R Shiny

I have created an R Shiny app that connects to a Cloud SQL instance. It runs fine on my local server, but when I upload to either shinyapps.io or to Cloud Run via Dockerfile, it is unable to connect.
Here is the code I am using to connect with the RPostgres package:
library(DBI)
library(RPostgres)

conn <- dbConnect(
  drv = RPostgres::Postgres(),
  dbname = 'postgres',
  sslrootcert = 'server-ca.pem',   # path to server-ca.pem
  sslcert = 'client-cert.pem',     # path to client-cert.pem
  sslkey = 'client-key.pem',       # path to client-key.pem
  host = 'xxxxxxxxxxxxxxxxxxx',    # Cloud SQL instance IP
  port = 5432,
  user = 'username',
  password = 'password_string',
  sslmode = 'verify-ca'
)
I've checked the logs in Cloud Run, the error message I am seeing is the following:
Warning: Error in : unable to find an inherited method for function 'dbGetQuery' for signature '"character", "character"'
The dbGetQuery() function is called after the dbConnect function, and since it runs fine on my local server, I am fairly confident that what I am seeing is a connection issue rather than a package namespace issue. But I could be wrong.
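For what it's worth, that error signature suggests conn ends up as a character value rather than a connection object; a minimal check along these lines (assuming conn is the same object later passed to dbGetQuery) would confirm or rule that out:
library(DBI)
# If dbConnect() had failed and conn were a character string, S4 dispatch on
# dbGetQuery(conn, "...") would fail with exactly the signature error above.
stopifnot(inherits(conn, "DBIConnection"))
res <- dbGetQuery(conn, "SELECT 1")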
I have opened up to all IPs by adding 0.0.0.0/0 as an allowed network. The weird thing is that OCCASIONALLY I CAN CONNECT from shinyapps.io, but most of the time it fails. I have not yet got it to work once from Cloud Run. This is leading me to think that it could be a problem with a dynamic IP address or something similar?
Do I need to go through the Cloud Auth proxy to connect directly between Cloud Run and Cloud SQL? Or can I just connect via the dbConnect method above? I figured that 0.0.0.0/0 would also include Cloud Run IPs but I probably don't understand how it works well enough. Any explanations would be greatly appreciated.
Thanks very much!
I have opened up to all IPs by adding 0.0.0.0/0 as an allowed network.
From a security standpoint, this is a terrible, horrible, no good idea. It essentially means the entire world can attempt to connect to your database.
As @john-hanley stated in the comment, the Connecting Cloud Run to Cloud SQL documentation details how to connect. There are two options:
via Public IP (the internet) using the Unix domain socket on /cloudsql/CLOUD_SQL_CONNECTION_NAME
via Private IP, which connects through a VPC using the Serverless VPC Access
If a Unix domain socket is not supported by your library, you'll have to use a different library or choose Option 2 and connect over TCP. Note that the Serverless VPC Access connector has additional costs associated with it.
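For Option 1, here is a minimal sketch of what the dbConnect call could look like from Cloud Run, assuming RPostgres/libpq and that CLOUD_SQL_CONNECTION_NAME is replaced with the instance's project:region:instance string; libpq treats a host value starting with "/" as a Unix-socket directory, so the SSL file arguments are not needed on this path:
library(DBI)
library(RPostgres)

conn <- dbConnect(
  RPostgres::Postgres(),
  dbname = 'postgres',
  host = '/cloudsql/CLOUD_SQL_CONNECTION_NAME',  # socket directory mounted by Cloud Run, not an IP
  user = 'username',
  password = 'password_string'
)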

Running RShiny from a Google cloud platform VM

I've put together an R shiny doc that I want to run on my google cloud platform virtual machine. When I run the script, it works in that it generates a webpage "Listening on http://127.0.0.1:6840". However, when I click on the link generated via the script, I get the error "500. That’s an error."
The script works locally so I don't believe it's an issue with the code.
Help!
Here is what I did to make it work. Full disclosure - I have no idea what any of this is, just following the steps in the docs:
install R
install Shiny
clone https://github.com/rstudio/shiny-examples
cd into any of the examples (I chose 050-kmeans-example)
run R -e "shiny::runApp(host='0.0.0.0', port=8080)" (8080 is just an example and can certainly be different) to start the server
Receive a message in the console saying "Listening on http://0.0.0.0:8080"
Go back to the GCP Console and configure a firewall rule to allow communication to port 8080 from the outside world.
Open a new browser tab with the external IP address of the VM and append the port (e.g. if the VM's external IP is 1.2.3.4, type in http://1.2.3.4:8080)
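For reference, the same thing from an interactive R session instead of the R -e one-liner above (the app directory and port are just the example values used in these steps):
library(shiny)
# host = '0.0.0.0' listens on all interfaces so the app is reachable via the
# VM's external IP; the port must match the firewall rule created above.
runApp('050-kmeans-example', host = '0.0.0.0', port = 8080)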

Unable to access website or SSH or FTP - Google Cloud and wordpress

I have a WordPress site hosted on Google Cloud, and was working very well.
For no apparent reason, it stopped working and I can't access it, neither the front end nor the admin panel.
I can't access it via FTP or the SSH console.
The VM on Google Cloud is still running, as far as I can see.
Errors I get:
When trying to access the website in Google Chrome:
ERR_CONNECTION_TIMED_OUT
When trying to access FTP via FileZilla:
Error: Connection timed out after 20 seconds of inactivity
Error: Could not connect to server
When trying to access SSH:
Connection via Cloud Identity-Aware Proxy Failed
Code: 4003
Reason: failed to connect to backend
You may be able to connect without using the Cloud Identity-Aware Proxy.
I just want to update this issue.
The problem was the memory quota.
I've increased the amount of memory, restarted the VM, and everything went back to working.
Thanks
This page with SSH troubleshooting steps might be able to help you.
The issue could be solved by trying these troubleshooting steps. I think it is likely that the first one might be the cause of your issue since you mentioned it did work before.
Does the instance have a full disk? Try to expand it!
Is the firewall correctly set up? Check your firewall rules and ensure that the default-allow-ssh rule is present.
Check your IAM permissions, do you have the roles required to connect to the VM?
Enable the serial console from your instance settings, connect and review the logs, they might give you some useful insights.

sftp net drive ssh server settings - get error 10058

I am trying out Eldos's SFTP Net Drive to map a drive to a virtual Ubuntu server. It works great when authenticating with a password. I have tried the normal/typical methods for configuring key-based access. However, I receive error 10058. I have searched for clear instructions without success.
Anyone using this with Key-based access? Please share how you have it configured.
Thank you :)
If you're connecting via the command line (e.g. using "open /profile:server" command-line arguments) and you're accessing a recently reinstalled server, the connection will silently fail because of the changed server SSH fingerprint.
Try to reconnect manually (not using the command line), accept the new fingerprint, and the problem will resolve.
This is because the server doesn't accept the key for whatever reason and closes the connection. Usually this is an indicator of a buggy server, which normally should send an error packet in response.
But your question is off-topic here on Stack Overflow.
