Upload offline data in Google Analytics - google-analytics

I am looking for a way to quickly upload offline data into Google Analytics. This is possible using Data Import, which is a feature provided by Google Analytics itself. But doing this on a daily basis is a hectic task. Is there any other functionality available with which I can automatically upload data daily and view the report?

You can automate data imports by using the Management API. Data Import is documented here.
To follow the examples, you first need to install the Google API client for the programming language of your choice. Then you create the custom data source (the same one used for manual uploads) and send data to it via the uploadData method. Run this on a schedule (e.g. via cron) and the task stops being hectic.
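For reference, a minimal sketch of such a scheduled upload in Python with the google-api-python-client library might look like the following; the account, property, and custom data source IDs, the CSV path, and the service-account key file are placeholders you'd replace with your own:

```python
# Minimal sketch of a scheduled offline-data upload via the Management API.
# All IDs, file names, and the service-account key are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

ACCOUNT_ID = "123456"                 # GA account ID (placeholder)
WEB_PROPERTY_ID = "UA-123456-1"       # GA property ID (placeholder)
CUSTOM_DATA_SOURCE_ID = "abcdefgh"    # custom data source created in the GA admin UI (placeholder)
CSV_PATH = "offline_data.csv"         # the file your daily job produces (placeholder)

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/analytics.edit"],
)
analytics = build("analytics", "v3", credentials=credentials)

upload = analytics.management().uploads().uploadData(
    accountId=ACCOUNT_ID,
    webPropertyId=WEB_PROPERTY_ID,
    customDataSourceId=CUSTOM_DATA_SOURCE_ID,
    media_body=MediaFileUpload(CSV_PATH, mimetype="application/octet-stream"),
).execute()
print(upload["status"])
```

Schedule the script with cron (or Cloud Scheduler) and the daily import runs without manual work.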

Related

Firebase - Perform Analytics from database/firestore data

I am using Firebase as my authentication and database platform in my React Native-Expo app. I have not yet decided if I will be using the realtime-database or Firestore database.
I need to perform statistical analysis on daily data gathered from my users, which is stored in the database. For example, users type in their daily protein intake; from it I would like to calculate their weekly average and expected monthly average, and provide suggestions for types of food if protein intake is too low, etc.
What would be the best approach in order to achieve the result wanted in my specific situation?
I am really unfamiliar with this and stepping into uncharted territory regarding how I can accomplish it. I have read that Firebase Analytics generates various basic analytics regarding usage of the app, the number of crash-free users, etc. But can it perform analytics on custom events? Can I create a custom event for Firebase Analytics to keep track of a certain node in my database, and output analytics from that? And if so, does it work with React Native-Expo, or do I need to detach from Expo? In addition, I have read that Firebase Analytics can be combined with Google BigQuery. Would this be an alternative for my case?
Are there any other ways of performing such data analysis on my data stored in the Firebase database? For example, exporting the data and using Python and scikit-learn?
Whatever opinion or advice you may have, I would be grateful if you could share it!
You're not alone - many people building web apps on GCP have this question, and there is no single answer.
I'm not too familiar with Firebase Analytics, but I can answer the question for Firestore and for your custom analytics (e.g. weekly average protein consumption).
The first thing to point out is that Firestore, unlike other NoSQL databases, is storage only. You can't perform aggregations in real time like you can with MongoDB, so the calculations have to be done somewhere else.
The best practice recommended by GCP in this case is indeed to do a regular export of your Firestore data into BQ (BigQuery) and run your analytical calculations there. You could also, when a user inputs some data, send it to Pub/Sub and use one of GCP Dataflow's streaming templates to stream the data into BQ, so that everything is available in near real time.
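As a rough illustration of the streaming variant, publishing each user entry to Pub/Sub from Python could look like this sketch (the project, topic, and field names are made up):

```python
# Sketch: publish a user's daily entry to Pub/Sub so a Dataflow streaming
# template can write it into BigQuery. Project, topic, and fields are hypothetical.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-gcp-project", "protein-entries")

def publish_entry(user_id: str, grams_protein: float, entry_date: str) -> None:
    payload = json.dumps(
        {"user_id": user_id, "grams_protein": grams_protein, "date": entry_date}
    ).encode("utf-8")
    publisher.publish(topic_path, data=payload).result()  # wait for the publish to complete
```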
Here's the issue with that, however: while this solution gives you real-time results and is very scalable, it gets expensive fast, and if you're more used to Python than SQL for running analytics, it can be a steep learning curve. Here's an alternative I've used for smaller web apps, which scales well for <100k users and costs <$20 a month on GCP's current pricing:
Write a Python script that grabs the data from Firestore (using the Firestore Python SDK), generates the analytics you need on it, and writes the results back to a Firestore collection (a minimal sketch of this and the next step follows this list)
Create an endpoint for that function using Flask or Django
Deploy that server application on Cloud Run, preventing unauthenticated invocations (you'll only be calling it from within GCP) - see this article, steps 1 and 2 only. You can also deploy the Python script(s) to GCP's Vertex AI or hosted Jupyter notebooks if you're more comfortable with that
Use Cloud Scheduler to call that function every x minutes - see these docs for authentication
Have your React app query the "analytics results" collection to get the results
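Here is a minimal sketch of steps 1 and 2, assuming Flask and the Firestore Python SDK; the collection names, field names, and the weekly-average calculation are illustrative placeholders:

```python
# Sketch of the Cloud Run service from steps 1-2: aggregate Firestore data and
# write the results back. Collection and field names are illustrative placeholders.
from datetime import datetime, timedelta

from flask import Flask
from google.cloud import firestore

app = Flask(__name__)
db = firestore.Client()

@app.route("/run-analytics", methods=["POST"])
def run_analytics():
    one_week_ago = datetime.utcnow() - timedelta(days=7)
    totals = {}  # user_id -> (sum of grams, number of entries)

    for doc in (
        db.collection("protein_entries").where("timestamp", ">=", one_week_ago).stream()
    ):
        entry = doc.to_dict()
        grams_sum, count = totals.get(entry["user_id"], (0.0, 0))
        totals[entry["user_id"]] = (grams_sum + entry["grams_protein"], count + 1)

    for user_id, (grams_sum, count) in totals.items():
        db.collection("analytics_results").document(user_id).set(
            {"weekly_avg_protein": grams_sum / count, "computed_at": datetime.utcnow()},
            merge=True,
        )
    return ("OK", 200)
```

Cloud Scheduler then just POSTs to that endpoint on whatever interval you need, and the React app only ever reads the precomputed results collection.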
My solution is a Flutter Web-based dashboard that displays relevant data in (near) real time, like the regular Flutter iOS/Android app, along with some aggregated data.
The aggregated data is compiled using a few Node.js-based triggers on the database that do the analytic lifting, and hence it is also near real time. If you study the pricing you will learn that function invocations are pretty cheap unless, of course, you happen to make a 'desphew' :)
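That answer uses Node.js, but a roughly equivalent Firestore trigger written with the Python SDK for Cloud Functions for Firebase might look like this sketch; the document path and field names are assumptions:

```python
# Sketch of an equivalent Firestore trigger in Python (firebase-functions SDK).
# Document path and field names are hypothetical.
from firebase_admin import initialize_app
from firebase_functions import firestore_fn
from google.cloud import firestore

initialize_app()
db = firestore.Client()

@firestore_fn.on_document_created(document="users/{user_id}/protein_entries/{entry_id}")
def update_aggregate(
    event: firestore_fn.Event[firestore_fn.DocumentSnapshot | None],
) -> None:
    if event.data is None:
        return
    grams = event.data.get("grams_protein") or 0
    # Keep a running total per user; averages can be derived when reading.
    db.collection("user_stats").document(event.params["user_id"]).set(
        {
            "total_protein": firestore.Increment(grams),
            "entry_count": firestore.Increment(1),
        },
        merge=True,
    )
```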
I came up with a great solution.
I used the built-in Firebase BigQuery integration. Then I used Cube.js (deployed on GCP, on Cloud Run with Docker) on top of BigQuery.
Cube.js just makes everything so easy. You do need to define your queries manually, but it tries to optimize them. On top of that, it uses caching, so you won't get big bills on GCP. I think this is the best solution I was able to find, and it is highly scalable and essentially real time.
Also, if you are a small startup, it is mostly free with GCP - free tiers on Cloud Run and BigQuery.
Note: this is not affiliated in any way with Cube.js.
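For what it's worth, querying a deployed Cube.js instance from Python goes through its REST API; here is a hedged sketch, where the host, API token, and cube/measure/dimension names are made up:

```python
# Sketch: query a Cube.js deployment's REST API from Python.
# The URL, API token, and cube/measure/dimension names are made up.
import json
import requests

CUBEJS_LOAD_URL = "https://cube.example.com/cubejs-api/v1/load"
API_TOKEN = "YOUR_CUBEJS_JWT"

query = {
    "measures": ["Sessions.count"],
    "dimensions": ["Sessions.country"],
    "timeDimensions": [
        {"dimension": "Sessions.timestamp", "dateRange": "last 7 days", "granularity": "day"}
    ],
}

resp = requests.get(
    CUBEJS_LOAD_URL,
    headers={"Authorization": API_TOKEN},
    params={"query": json.dumps(query)},
)
resp.raise_for_status()
print(resp.json()["data"])
```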

Getting data from BigQuery back into Google Analytics

I have data in BigQuery from my Google Analytics account, along with some extra tables where I have transformed some of this data.
I would like to export some of my transformed data from BigQuery and import it into Google Analytics as a custom dimension.
I have done this manually, by downloading a CSV from my table in BigQuery and importing this using the GA admin UI. I would like to automate the process, but not sure where to start.
What would be the most efficient tool to automate this process? The process being:
Run a SQL query on my BQ data every day and overwrite a table.
Export this table as a file and upload it to a GA account as a query-time import.
Not sure why you'd want to do this, but one (rather clunky) solution that pops into my head is to spin up a small GCE instance and use the gcloud tool and some simple bash to do the following (a sketch of the same flow, using the Python client libraries instead, follows the list):
Run a BigQuery query job (SQL) to truncate your table
Monitor the progress of that query job, i.e. wait
When it's finished, trigger an export job and dump the table to GCS
Monitor the progress of that BigQuery export job i.e. wait
When it's finished, download the file from GCS
Upload the file to GA using the management API (https://developers.google.com/analytics/devguides/config/mgmt/v3/mgmtReference/management/uploads/uploadData)
Schedule a cron job to run the above bash script daily
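The steps above map fairly directly onto the BigQuery and Cloud Storage client libraries; here is a hedged Python sketch of the same flow, where the project, dataset, table, bucket names, and GA IDs are placeholders:

```python
# Sketch of the same flow with the Python client libraries instead of bash/gcloud.
# Project, dataset, table, bucket, and GA IDs are placeholders.
from google.cloud import bigquery, storage
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

PROJECT = "my-project"
TABLE = "my_dataset.ga_custom_dimensions"
BUCKET = "my-export-bucket"
BLOB = "ga_upload/export.csv"
LOCAL_FILE = "/tmp/export.csv"

bq = bigquery.Client(project=PROJECT)

# 1-2. Run the daily SQL, overwrite the destination table, and wait for completion.
job_config = bigquery.QueryJobConfig(
    destination=f"{PROJECT}.{TABLE}",
    write_disposition="WRITE_TRUNCATE",
)
bq.query(
    "SELECT clientId, custom_value FROM `my_dataset.transformed`",  # placeholder SQL
    job_config=job_config,
).result()

# 3-4. Export the table to GCS and wait.
bq.extract_table(f"{PROJECT}.{TABLE}", f"gs://{BUCKET}/{BLOB}").result()

# 5. Download the exported file.
storage.Client(project=PROJECT).bucket(BUCKET).blob(BLOB).download_to_filename(LOCAL_FILE)

# 6. Upload it to GA via the Management API (uses application default credentials).
analytics = build("analytics", "v3")
analytics.management().uploads().uploadData(
    accountId="123456",
    webPropertyId="UA-123456-1",
    customDataSourceId="abcdefgh",
    media_body=MediaFileUpload(LOCAL_FILE, mimetype="application/octet-stream"),
).execute()
```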
A nicer way would be to use Cloud Functions listening on the GCS bucket, but in my opinion, CFs are not designed for performing long-running batch/data workloads. They have, for example, time limits (540s). Also, if GA supported direct loading from GCS it would be much better, but I wasn't able to find support for that.

How to copy Google Analytics data into SQL Server tables

I just started working on Google Analytics stuff and I'm pretty new to this. I have now been granted access to the GA account of my organization's marketing website for several European countries (single login).
My requirement is to copy the GA data for the different European countries into a single table structure in SQL Server. I'm wondering if any of you have done this before? Any suggestions are highly appreciated.
As already written earlier, there are several ways of doing this. I prefer to integrate Google Analytics and SQL Server with no coding, using Skyvia tool: Google Analytics and SQL Server Integration. It allows me to create a copy of Google Analytics report data in SQL Server and keep it up-to-date with little to no configuration efforts. I don’t even need to prepare the schema — Skyvia can automatically create a table for report data. You can load 10000 records per month for free — this is enough for me.
There are a number of ways of doing this. Google Analytics does have the ability to export data as CSV, but it's going to be hard to match up the data properly.
If you are up for a bit of programming, start with the Google Analytics API; it will allow you to extract data from Google Analytics and insert it wherever you like. You can use any programming language that is capable of performing an HTTP POST and HTTP GET. However, I recommend looking into one of Google's client libraries.
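For example, here is a hedged sketch of pulling country-level metrics through the (v3) Core Reporting API with the Python client library and inserting the rows into SQL Server with pyodbc; the view IDs, connection string, and table/column names are placeholders:

```python
# Sketch: pull per-country GA metrics with the v3 Core Reporting API and load
# them into SQL Server with pyodbc. View IDs, connection string, and table
# names are placeholders.
import pyodbc
from google.oauth2 import service_account
from googleapiclient.discovery import build

VIEW_IDS = {"DE": "ga:11111111", "FR": "ga:22222222", "IT": "ga:33333333"}
CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=marketing;UID=user;PWD=pass"
)

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analytics", "v3", credentials=credentials)

conn = pyodbc.connect(CONN_STR)
cursor = conn.cursor()

for country, view_id in VIEW_IDS.items():
    result = analytics.data().ga().get(
        ids=view_id,
        start_date="yesterday",
        end_date="yesterday",
        metrics="ga:sessions,ga:bounceRate",
        dimensions="ga:date",
    ).execute()
    for row in result.get("rows", []):
        cursor.execute(
            "INSERT INTO ga_daily (country, ga_date, sessions, bounce_rate) VALUES (?, ?, ?, ?)",
            country, row[0], int(row[1]), float(row[2]),
        )

conn.commit()
```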
If you have the ability to use SSIS, you can use Targit Google Analytics SSIS; it's a custom connection manager and data reader for extracting data from Google Analytics, and it is free to use. Note: full disclosure, I am the lead developer on that project.

Can I trigger creation of tables on Analytics Export completion?

I have set up Analytics Export to BigQuery. Every time a new ga_sessions_yyyymmdd table gets created, I would like to run some queries aggregating some data for future use.
I can't figure out how to do this. Do I have to create a job and trigger it from outside, or is there a way to trigger this in BigQuery directly (preferably using the Web UI)?
You cannot schedule queries to run via the Web UI. You'll need to write a small piece of software to do this using the BigQuery API and cron(s).
You may also want to check out Cloud Functions - bearing in mind that it's still in Alpha.
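A small hedged sketch of such a piece of software, using the BigQuery Python client to aggregate yesterday's export table into a summary table (the project and dataset names are placeholders), which could then be run from cron:

```python
# Sketch: aggregate yesterday's ga_sessions_YYYYMMDD table into a summary table.
# Project and dataset names are placeholders; run this from cron or a scheduler.
from datetime import date, timedelta
from google.cloud import bigquery

client = bigquery.Client()
suffix = (date.today() - timedelta(days=1)).strftime("%Y%m%d")

sql = f"""
    SELECT date, SUM(totals.visits) AS visits, SUM(totals.pageviews) AS pageviews
    FROM `my_project.my_ga_dataset.ga_sessions_{suffix}`
    GROUP BY date
"""
job_config = bigquery.QueryJobConfig(
    destination=f"my_project.my_ga_dataset.daily_summary_{suffix}",
    write_disposition="WRITE_TRUNCATE",
)
client.query(sql, job_config=job_config).result()  # blocks until the job finishes
```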

Fastest way to get basic information from the Google Analytics API

My GA account has a number (50) of profiles associated with it, and I am trying to build an API which shows me basic information like visits, bounce rates, etc. for each profile.
This query gets me what I want from GA, but for each profile:
URL ="https://www.google.com/analytics/feeds/data?ids=ga:11111&start-date=2011-07-01&end-date=2011-07-02&metrics=ga:visitors&prettyprint=true&alt=json"
The id is the table ID, and the metrics parameter gives me the information I want.
Now the problem is, I want to show all the information together. So every time, I would have to send 50 requests to the API, which just doesn't work out. Is there a way I can get the information for all the profiles associated with me in a single request?
You unfortunately will be required to perform 50 requests if you want metrics for 50 different profiles. You can easily automate this, however, by using a combination of the Management API and the Data Export API.
The Management API allows you to pull information about the account. For example, you can very easily pull all profile IDs and names associated with an Analytics account through this API for use in an automated query.
The Data Export API, which I am sure you already are familiar with, is the only way to pull collected data/statistics for individual profiles.
If you are concerned about speed, you might want to build an automated process that uses both the Management API and the Data Export API. Pull all of the profiles associated with your account with the Management API, then loop through each and pull the basic data you'd like through the Data Export API. Have this run at regular intervals based on your needs and cache it between runs. This way it won't execute every time the page is hit (though you honestly might be fine, depending on your traffic - I've found it to be extremely quick).
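A hedged sketch of that loop using the Python client library and the v3 Management and Core Reporting APIs (the service-account key file is a placeholder):

```python
# Sketch: list all profiles with the Management API, then pull basic metrics
# for each one with the Core Reporting API (v3). Key file is a placeholder.
from google.oauth2 import service_account
from googleapiclient.discovery import build

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analytics", "v3", credentials=credentials)

profiles = analytics.management().profiles().list(
    accountId="~all", webPropertyId="~all"
).execute()

for profile in profiles.get("items", []):
    report = analytics.data().ga().get(
        ids="ga:" + profile["id"],
        start_date="2011-07-01",
        end_date="2011-07-02",
        metrics="ga:visitors,ga:visits,ga:bounceRate",
    ).execute()
    print(profile["name"], report.get("totalsForAllResults"))
```

Run it on an interval and cache the output, as described above, so the 50 requests never happen on a page hit.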
