Export data from Google Analytics with R library RGA - r

If I have google analytics link like this:
https://www.googleapis.com/analytics/v3/data/ga?ids=ga%5A6389839&start-date=yesterday&end-date=yesterday&metrics=ga%3Apageviews&dimensions=ga%3ApagePath%2Cga%3Adimension4%2Cga%3Adimension
What are the credentials to input in the code below? (ga$getData and the ids, metrics and dimensions arguments are giving me trouble.)
ga$getData(ids, batch = TRUE, start.date, end.date,
           metrics = "ga:visits", dimensions = "ga:date,ga:medium,ga:source",
           sort = "", filters = "", segment = "")
I have tried so many inputs, but every one of them produces an error.
Thanks.
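The query-string parameters in the URL map directly onto getData() arguments once URL-decoded (%3A is ":", %2C is ","). A hedged sketch, assuming the ids parameter is meant to be ga%3A6389839, i.e. the profile ID ga:6389839 (substitute the value from your own URL):

```r
library(rga)

rga.open(instance = "ga")

visits <- ga$getData(ids = "ga:6389839",     # from ids=ga%3A6389839 (placeholder ID)
                     start.date = "yesterday",
                     end.date = "yesterday",
                     metrics = "ga:pageviews",                  # metrics=ga%3Apageviews
                     dimensions = "ga:pagePath,ga:dimension4",  # decoded dimensions list
                     batch = TRUE)
```

Note that ids must keep the "ga:" prefix; passing the bare number is a common cause of the errors described.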

Related

Knitting GA data in RStudio: Error in identical(body, FALSE) : object 'redirect.uri' not found

I am using the rga package to track my Google Analytics data.
When I create a script in R, I successfully retrieve data from GA:
library(rga)
library(curl)

rga.open(instance = "ga")
x <- ga$getProfiles()
id <- x[3, 1]  # here we select the ID of the webpage we are tracking
visits <- ga$getData(id,
                     start.date = as.Date("2016-01-01"),
                     end.date = "today",
                     metrics = "ga:visits, ga:pageviews, ga:organicSearches",
                     dimensions = "ga:date",
                     sort = "", filters = "", segment = "",
                     start = 1, max = 1000,
                     batch = TRUE)
However, when I copy/paste this code into a chunk in an R Markdown file, I get this error when knitting it to HTML/PDF/Word:
Error in identical(body, FALSE) : object 'redirect.uri' not found Calls: ... -> request_build -> body_config -> identical Execution halted
How can I create reports about my GA data using R Markdown? How can I solve this problem?
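Knitting runs in a non-interactive session, so the browser-based OAuth flow that rga.open() starts cannot complete there. A common workaround, sketched below, is to authenticate once interactively and persist the token to a file with rga's `where` argument (the file path is an example):

```r
library(rga)

# Run this once in an interactive R session: the OAuth flow opens a browser
# and the resulting token is saved to the file named by `where`.
rga.open(instance = "ga", where = "~/ga.rga")

# The same call inside the R Markdown chunk then finds the saved file and
# reuses the token instead of attempting a new browser authentication.
```

After the file exists, the rest of the getData() code can stay unchanged in the chunk.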

Collect search occurrences with rscopus in r?

I have to make a lot of queries on Scopus, so I need to automate the process.
I have loaded the "rscopus" package and wrote this code:
test <- generic_elsevier_api(query = "industry",
                             type = "abstract",
                             search_type = "scopus",
                             api_key = myLabel,
                             headers = NULL,
                             content_type = "content",
                             root_http = "https://api.elsevier.com",  # was "http:/api.elsevier.com" (missing slash)
                             http_end = NULL,
                             verbose = TRUE,
                             api_key_error = TRUE)
My goal is to obtain the number of occurrences of a particular query.
In this example, if I search for "industry", I want to obtain the number of search results for the query:
query      occurrence
industry   1789
How could I do this?
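The Scopus Search API reports the total hit count in the `opensearch:totalResults` field of its response, so one possible sketch is to issue a search-type request and read that field from the parsed content (the field path follows the Elsevier Search API response format; myLabel is the API key from the question):

```r
library(rscopus)

res <- generic_elsevier_api(query = "industry",
                            type = "search",
                            search_type = "scopus",
                            api_key = myLabel)

# Total number of matching documents, as reported by the API
n <- res$content$`search-results`$`opensearch:totalResults`
data.frame(query = "industry", occurrence = as.integer(n))
```

Wrapping this in a function over a vector of query strings would give the query/occurrence table shown above.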

Google Analytics Query Explorer returning data for custom dimension but not able to get data via API call using RGoogleAnalytics

If I remove the custom dimensions 15 and 16, the code runs perfectly fine, but with them included it returns nothing.
I checked the query with query explorer and it returns the data perfectly fine.
query.list <- Init(start.date = "2017-02-01",
                   end.date = "2017-02-04",
                   dimensions = "ga:eventAction,ga:eventLabel,ga:pagePath,ga:dimension15,ga:dimension16",
                   #dimensions = paste(toString(paste("ga:dimension", dim, sep="")),"ga:pagePath,ga:eventLabel,ga:eventAction",sep=", "),
                   metrics = "ga:totalEvents",
                   max.results = 10000,
                   # table.id = "ga:XXXXXX"
                   table.id = "ga:XXXXX")
ga.query <- QueryBuilder(query.list)
ga.data <- GetReportData(ga.query, token, split_daywise = TRUE)
It turned out that the team had not yet created those custom dimensions during the date range I was passing.
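Custom dimensions only return data for hits recorded after the dimension was created, so the fix is to query a window that starts on or after the creation date. A sketch (the June dates are placeholders; use the creation date shown in the GA admin screen):

```r
library(RGoogleAnalytics)

# Same query, shifted to a range in which dimension15/16 already existed
query.list <- Init(start.date = "2017-06-01",   # placeholder: on/after dimension creation
                   end.date = "2017-06-04",
                   dimensions = "ga:eventAction,ga:eventLabel,ga:pagePath,ga:dimension15,ga:dimension16",
                   metrics = "ga:totalEvents",
                   max.results = 10000,
                   table.id = "ga:XXXXX")
ga.data <- GetReportData(QueryBuilder(query.list), token, split_daywise = TRUE)
```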

How to Fix Row Limits Export, Google analytics to R

Hi, I'm using googleAnalyticsR to import my data from Google Analytics, but I'm having a problem: it only downloads 1,000 rows out of a total of 1,000,000.
Any advice on how to download all 1,000,000?
Here's my code!
df1 <- google_analytics_4(my_id,
                          date_range = c("2016-05-13", "2017-05-13"),
                          metrics = c("pageviews"),
                          dimensions = c("pagePath"))
By default it fetches 1,000 rows; if you set max = -1 in your call, it fetches everything:
df1 <- google_analytics_4(my_id,
                          date_range = c("2016-05-13", "2017-05-13"),
                          metrics = "pageviews",
                          dimensions = "pagePath",
                          max = -1)

Unsampled GA data in R

I am attempting to extract unsampled data for the past nine months. The website is pretty active, and as such I'm unable to get the data in its entirety (over 3 million rows) unsampled. I'm currently attempting to break up the filtering so that I'm only returning under 10,000 rows at a time (which is the API response limit). Is there a way I can loop over a number of days? I tried using the batch function with no success. I have included my code for reference; I was thinking of writing a loop and doing it in 10-day intervals. I appreciate any input.
Thanks!
library(RGA)
gaData <- get_ga(id, start.date = start_date,
                 end.date = "today", metrics = "ga:sessions",
                 dimensions = "ga:date, ga:medium, ga:country, ga:hour, ga:minute",
                 filters = "ga:country==United States;ga:medium==organic",
                 max.results = NULL,
                 batch = TRUE,
                 sort = "ga:date")
The get_ga function doesn't have a batch parameter (see ?get_ga). Try the fetch.by option instead; you can test different variants: "month", "week", "day".
library(RGA)
authorize()
gaData <- get_ga(id, start.date = start_date,
                 end.date = "today", metrics = "ga:sessions",
                 dimensions = "ga:date, ga:medium, ga:country, ga:hour, ga:minute",
                 filters = "ga:country==United States;ga:medium==organic",
                 sort = "ga:date", fetch.by = "week")
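If fetch.by still returns sampled data for busy periods, the manual 10-day loop the question mentions is also workable: split the date range into windows and bind the results. A sketch under the same id/start_date assumptions as the code above:

```r
library(RGA)
authorize()

# Build consecutive 10-day windows from start_date through today
starts <- seq(as.Date(start_date), Sys.Date(), by = "10 days")
ends   <- pmin(starts + 9, Sys.Date())

# Query each window separately and row-bind the results
gaData <- do.call(rbind, Map(function(s, e) {
  get_ga(id, start.date = s, end.date = e,
         metrics = "ga:sessions",
         dimensions = "ga:date, ga:medium, ga:country, ga:hour, ga:minute",
         filters = "ga:country==United States;ga:medium==organic",
         sort = "ga:date")
}, starts, ends))
```

Shorter windows keep each response under the row limit at the cost of more API calls.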
