I have an R script that runs in a Google Cloud environment; it pulls Google Analytics data and then stores it in Cloud Storage. I call the googleAuthR library in one line of the script, but I keep getting the same error. Has anyone had this problem before, or can anyone help?
I load the libraries like this:
library(googleAuthR)
library(googleCloudStorageR)
and the error text I get is:
Error in library(googleAuthR) : there is no package called ‘googleAuthR’
It looks like your R installation cannot find the package; it is probably not installed where R is looking for it.
To fix it, just open R from a terminal and execute:
install.packages("googleAuthR");
and
install.packages("googleCloudStorageR");
Remember that you will need to pass your Google credentials to work with Cloud Storage (for instance, put them in a .json file and set the GOOGLE_APPLICATION_CREDENTIALS environment variable to point at it - see https://cloud.google.com/docs/authentication/getting-started ).
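For example, with googleCloudStorageR one way to wire this up is to point the GCS_AUTH_FILE environment variable at the key file before loading the package, so authentication happens automatically. This is only a sketch - the file name and project id are placeholders for your own values:
Sys.setenv(GCS_AUTH_FILE = "my-service-account.json")  # placeholder path to your key file
library(googleAuthR)
library(googleCloudStorageR)    # picks up GCS_AUTH_FILE and authenticates on load
gcs_list_buckets("my-project")  # quick sanity check that authentication works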
I'm trying to import a CSV with a scheduled task using cronR. Unfortunately, even though I don't get any error in the log file, I don't see any data frame in the global environment.
Below is the very simple script:
#Test_cronR
require("readr")
setwd("~/projects/my_proj")
df <- read.csv("~/projects/my_proj/df.csv")
I also tried using the absolute path, but it's still not working.
I'm working in RStudio Server on a VM on Google Cloud Platform.
Can somebody help me?
Searching the web, I actually wasn't able to find anything specific about importing this kind of file, only some tips regarding full vs. relative paths.
What I expect is simply to have the CSV in my global environment.
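For context, the job is registered with cronR roughly along these lines; the script file name, schedule, and id here are just illustrative, not the exact values I use:
library(cronR)
cmd <- cron_rscript("~/projects/my_proj/Test_cronR.R")  # assumed script file name
cron_add(cmd, frequency = "daily", at = "07:00", id = "import_df")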
I explored the log4r package and found that it has console and file appenders. I am looking to push the logs to a MongoDB database, but I am unable to find any way to do so. Is there an appender for this?
TIA
You can try the mongoimport tool.
If you do not specify a file (the --file= option), mongoimport reads data from standard input (stdin).
Or write the logs to a temporary file and use mongoimport to import it, as sketched below.
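A minimal sketch of the second approach from R, using a custom log4r layout that writes one newline-delimited JSON document per entry (the format mongoimport reads by default). The json_layout() helper, the database name logs_db, and the collection name r_logs are my own assumptions, and mongoimport is assumed to be on the PATH:
library(log4r)
library(jsonlite)

log_file <- tempfile(fileext = ".json")

# layout that renders each log entry as a single JSON line
json_layout <- function() {
  function(level, ...) {
    msg <- paste0(..., collapse = "")
    paste0(toJSON(list(time = format(Sys.time(), "%Y-%m-%dT%H:%M:%S"),
                       level = as.character(level),
                       message = msg),
                  auto_unbox = TRUE), "\n")
  }
}

my_logger <- logger(threshold = "INFO",
                    appenders = file_appender(log_file, layout = json_layout()))

info(my_logger, "job started")
error(my_logger, "something went wrong")

# once the run is finished, push the file into MongoDB
system2("mongoimport",
        args = c("--db", "logs_db", "--collection", "r_logs", "--file", log_file))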
I'm trying to connect to Spark from an RStudio instance on IBM Watson Studio but I'm getting the following error.
No encoding supplied: defaulting to UTF-8.
Error in force(code) : Failed during initialize_connection: attempt to use zero-length variable name
Log: /tmp/Rtmpdee7QC/file1b33141066_spark.log
---- Output Log ----
hummingbird kernel
http://localhost:8081/apsrstudio/agent/v1/kernel/hb-connect ; Time Diff :1.31352798938751
{"code": "import sparklyr._"} ; Time Diff :0.00552034378051758
Here's the code I'm using to create the connection:
kernels <- load_spark_kernels()
sc <- spark_connect(config = kernels[1])
Any help would be highly appreciated!
I was able to fix this issue! It seems I was missing a project access token. Project access tokens can be created manually as described here, on the Settings page of your project. From the link shared above:
Create an access token on the Settings page of your project. Only project admins can create access tokens. The access token can have viewer or editor access permissions. Only editors can inject the token into a notebook.
After adding a project access token, I could connect to Spark using the code provided in the question with no problems.
kernels <- load_spark_kernels()
sc <- spark_connect(config = kernels[1])
If you are using IBM Watson Studio on Cloud with RStudio, you should use list_spark_kernels() to list the kernels:
kernels <- list_spark_kernels()
Then use spark_connect() to connect to it:
sc <- spark_connect(config = kernels[1])
One more thing: do not upgrade sparklyr, and if you already did, uninstall it. The sparklyr that comes with RStudio on Watson Studio Cloud is customized so that it can connect to the Spark service on IBM Cloud; uninstalling or removing your own version of sparklyr will bring back the original (customized) sparklyr.
Hope it helps.
I have been trying to create a connection between Google Cloud Storage and an RStudio Server (the one I spun up on Google Cloud), so that I can access the files in R and run some analysis on them.
I have found three different ways to do it on the web, but I haven't seen much clarity around them so far:
Access the file by using the public URL specific to the file [This is not an option for me]
Mount the Google Cloud Storage bucket as a disk in the RStudio server and access it like any other file on the server [I saw someone post about this method but could not find any guides or materials showing how it's done]
Using the googleCloudStorageR package to get full access to the Cloud Storage bucket.
Option 3 looks like the standard way to do it, but I get the following error when I run the gcs_auth() command:
Error in gar_auto_auth(required_scopes, new_user = new_user, no_auto = no_auto, :
  Cannot authenticate -
  options(googleAuthR.scopes.selected) needs to be set to include
  https://www.googleapis.com/auth/devstorage.full_control or
  https://www.googleapis.com/auth/devstorage.read_write or
  https://www.googleapis.com/auth/cloud-platform
The guide on how to connect is found at
https://github.com/cloudyr/googleCloudStorageR
but it says this requires a service-auth.json file to set the environment variables, plus various keys and secret keys, without really specifying what these are.
If someone could help me understand how this is actually set up, or point me to a good guide on setting up the environment, I would be very grateful.
Thank you.
Before using any Google Cloud services you have to attach your card for billing.
So, I am assuming that you have created the account. After creating the account, go to the Console; if you have not created a project yet, create one, then in the sidebar find APIs & Services > Credentials.
Then:
1) Create a service account key and save the file as JSON - you can only download it once.
2) Create an OAuth 2.0 client ID: give the app a name, select the type as web application, and download the JSON file.
Now, for Storage, find Storage in the sidebar and click on it.
Create a bucket and give it a name.
I have added a single image to the bucket; you can add one as well for the purposes of the code below.
Let's look at how to download this image from Storage; for everything else you can follow the link that you have given.
First create an environment file named .Renviron and save it in your working directory, so that the JSON files are picked up automatically.
In the .Renviron file, point to the two downloaded JSON files like this:
GCS_AUTH_FILE="serviceaccount.json"
GAR_CLIENT_WEB_JSON="Oauthclient.json"
#R part
library(googleCloudStorageR)
library(googleAuthR)
gcs_auth() # for authentication
#set the scope
gar_set_client(scopes = c("https://www.googleapis.com/auth/devstorage.read_write",
"https://www.googleapis.com/auth/cloud-platform"))
gcs_get_bucket("you_bucket_name") #name of the bucket that you have created
gcs_global_bucket("you_bucket_name") #set it as global bucket
gcs_get_global_bucket() #check that your bucket is set as global; you should get your bucket name
objects <- gcs_list_objects() #list the objects in the bucket
names(objects)
gcs_get_object(objects$name[[1]], saveToDisk = "abc.jpeg") #save the data
Note: if the JSON files do not get loaded, restart the session using .rs.restartR()
and then check with
Sys.getenv("GCS_AUTH_FILE")
Sys.getenv("GAR_CLIENT_WEB_JSON")
#it should show the files
You probably want the FUSE adapter - this will allow you to mount your GCS bucket as a directory on your server.
Install gcsfuse on the R server.
Create a mnt directory.
Run gcsfuse your-bucket /path/to/mnt
Be aware, though, that read/write performance isn't great via FUSE.
Full documentation: https://cloud.google.com/storage/docs/gcs-fuse
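Once mounted, objects in the bucket can be read like ordinary local files from R; the mount point and file name below are placeholders:
df <- read.csv("/path/to/mnt/my_data.csv")  # reads straight out of the mounted bucket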
I'm using Shiny and R to create a little web app that pulls data from Google BigQuery and displays it on the page. It uses the bigrquery package.
When running the script from inside R (source("x.R")) everything runs fine; however, when using Rscript x.R I get the error. I'm trying to set up cron to run the script automatically.
There is a .httr-oauth file in the directory of the script.
This question is similar to another one I answered, where I suggested using a Google service account for server-to-server authentication with the googleAuthR and bigQueryR packages for R. Please refer to that answer (via the above link) for details, including an example R script.
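In outline, that approach looks something like the sketch below; the key file name, project, dataset, and query are placeholders, not part of the original answer:
library(googleAuthR)
library(bigQueryR)

# authenticate non-interactively with a service-account key,
# so the script also works under Rscript / cron (no .httr-oauth needed)
gar_auth_service(json_file = "service-account.json",
                 scope = "https://www.googleapis.com/auth/bigquery")

result <- bqr_query(projectId = "my-project",
                    datasetId = "my_dataset",
                    query = "SELECT * FROM my_table LIMIT 10")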
In the end I've decided to just use Python to pull the data from BigQuery and use it in R.
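If it helps anyone, one way to drive the Python BigQuery client from inside R is via reticulate. This is only a sketch; it assumes the google-cloud-bigquery Python package is installed and GOOGLE_APPLICATION_CREDENTIALS points at a service-account key, and the query is a placeholder:
library(reticulate)

bq <- import("google.cloud.bigquery")
client <- bq$Client()   # picks up GOOGLE_APPLICATION_CREDENTIALS for auth

# run the query and pull the result back as an R data.frame via pandas
df <- client$query("SELECT * FROM `my-project.my_dataset.my_table` LIMIT 10")$to_dataframe()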