Google Analytics export setup to BigQuery - google-analytics

I was able to set up the GA360 export to BigQuery and I also got an email that the export is complete. But I don't see anything when I click on the dataset. Will it take time for the tables to show up, or do I need to create tables myself once I set up the dataset in BigQuery?

If you successfully connected GA360, you don't need to do anything else for it to work. Make sure you're using the correct account for the BigQuery project.
It will take time for the current day's data to show up, and a few days for the 13 months of backfilled data to appear.
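If you want to verify that the export has actually landed, you can simply list the tables in the export dataset, e.g. with the BigQuery Python client. A minimal sketch, where the project ID and the dataset name (normally the GA view ID) are placeholders:

    # Minimal sketch: assumes google-cloud-bigquery is installed and you are
    # authenticated as an account with access to the linked project.
    # "my-gcp-project" and "123456789" are placeholders for your own project
    # and the export dataset (named after the GA view ID).
    from google.cloud import bigquery

    client = bigquery.Client(project="my-gcp-project")

    # The GA360 export creates one ga_sessions_YYYYMMDD table per day,
    # plus intraday tables for the current day.
    for table in client.list_tables("123456789"):
        print(table.table_id)

If nothing is listed a day or two after the confirmation email, the link is worth re-checking.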

Related

Explore Free Form report in Google Analytics

I am trying to generate a report in the Google Analytics Explore tab using the Free Form technique. A few weeks ago I could use Message name, Stream name and Time to see every notification name, platform and total number of clicks, and I exported the result to an Excel file.
But today when I tried to generate the same report I couldn't find the "Message Name" dimension. Has this field been removed from the predefined/custom dimensions in GA, or am I doing something wrong?
My main purpose is to get the full list of notifications sent via Firebase.
Any help will be deeply appreciated.
Given that you have excluded the obvious issues, like looking at data that is too fresh, the proper way to debug this is to export the data into a sample BQ table and then run exactly the same analysis you're trying to run in GA4's Explore. From there, if your issue is with Explore's filters, you will see it quickly.
If, however, you're able to see your event properties in BQ but can't get Explore to display them... well, Google likely saved quite a lot of money on GA4. UA was pretty expensive. GA4 now introduces all these amazing features like data retention limits, property value cardinality bugs, odd inconsistencies between Explore reports and the default reports, and so on.
For now, the best way to really access your data, minus all the artificial limitations of GA4, is to ETL it out, either through the Reporting API or by exporting it to BQ.
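For example, a quick way to check whether the parameter you are after exists in the export at all is to query the GA4 events_* tables directly. In this sketch the dataset name and the 'message_name' parameter key are placeholders for whatever your Firebase payload actually contains:

    # Sketch only: "my-project.analytics_123456789" and "message_name" are
    # placeholders for your GA4 export dataset and the event parameter you expect.
    from google.cloud import bigquery

    client = bigquery.Client()

    query = """
    SELECT
      event_name,
      (SELECT value.string_value
       FROM UNNEST(event_params)
       WHERE key = 'message_name') AS message_name,
      COUNT(*) AS events
    FROM `my-project.analytics_123456789.events_*`
    WHERE _TABLE_SUFFIX BETWEEN '20240101' AND '20240107'
    GROUP BY event_name, message_name
    ORDER BY events DESC
    """

    for row in client.query(query).result():
        print(row.event_name, row.message_name, row.events)

If the value shows up here but not in Explore, the problem is on the Explore side (filters, scope or cardinality), not in your tracking.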

Get Real time Analytics Data

I want to get analytics data for each user of my website. I know that Google Analytics provides user-level data, but I am not able to export that data using the API. I want real-time, user-level data with an export feature so I can save it in my own database. I have looked at many platforms, like Mouseflow, but they do not provide data per user. Does anyone have an idea how I can implement this?
Thanks!
Here is an example of how to get data from the Google Analytics Real Time Reporting API (with Google Apps Script):
https://www.appsscript.it/tutorial/utilizzare-le-real-time-reporting-api-di-google-analytics-con-apps-script/
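If you'd rather pull the same numbers from a server-side script instead of Apps Script, here is a minimal sketch against the same (Universal Analytics) Real Time Reporting API v3 in Python. The key file and view ID are placeholders, and note that this API returns aggregated real-time metrics, not per-user records:

    # Sketch of a Real Time Reporting API (v3) call from Python.
    # "key.json" and the view ID "12345678" are placeholders; the service
    # account must be added as a user on the GA view.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    creds = service_account.Credentials.from_service_account_file(
        "key.json",
        scopes=["https://www.googleapis.com/auth/analytics.readonly"],
    )
    analytics = build("analytics", "v3", credentials=creds)

    # Active users broken down by page path, refreshed on each call.
    response = analytics.data().realtime().get(
        ids="ga:12345678",
        metrics="rt:activeUsers",
        dimensions="rt:pagePath",
    ).execute()

    for row in response.get("rows", []):
        print(row)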

Events from Firebase Analytics not showing up in BigQuery

I have a problem after linking Firebase events to BigQuery.
In Firebase Analytics I have many events.
I linked Firebase with BigQuery, but in BigQuery I can't see them.
How can I see them?
I tried unlinking BigQuery and linking it again, and I have followed every tutorial, but my events still don't appear and I don't know why.
[Screenshot: my 25 events in Firebase]
[Screenshot: the events that are apparently exported from Firebase, which have nothing to do with the ones in Firebase]
So if somebody knows how to really export events from Firebase, it would be a pleasure :)
PS: sorry for my English, I hope I haven't made too many wording mistakes.
It appears that data from Firebase is only exported to BigQuery once it exceeds a certain volume; as you can see, only the screenView event, which has a high event count, is being recorded in BigQuery. It takes some time for the data to accumulate and be displayed, as it is not recorded in real time the way it is in Firebase. If the linking has been done properly, the data will start to show up as soon as it passes that threshold.
Analytics events are exported from Firebase to BigQuery from the moment you enable the integration. There is no way to export events from before you enabled the integration.
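Once tables do appear, a quick sanity check is to count the exported events by name and day directly in BigQuery. A sketch with the Python client, where the dataset name (which normally matches your analytics property ID) is a placeholder:

    # Sketch: "my-project.analytics_123456789" is a placeholder for the dataset
    # the Firebase/GA4 link created.
    from google.cloud import bigquery

    client = bigquery.Client()

    query = """
    SELECT event_date, event_name, COUNT(*) AS event_count
    FROM `my-project.analytics_123456789.events_*`
    GROUP BY event_date, event_name
    ORDER BY event_date, event_count DESC
    """

    for row in client.query(query).result():
        print(row.event_date, row.event_name, row.event_count)

This makes it obvious whether only some events (e.g. screenView) are arriving or whether the export is empty altogether.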

Ingesting Google Analytics data into S3 or Redshift

I am looking for options to ingest Google Analytics data (including historical data) into Redshift. Any suggestions regarding tools or APIs are welcome. I searched online and found Stitch as one of the ETL tools; help me understand this option better, and other options if you have them.
Google Analytics has an API (the Core Reporting API). This is good for getting the occasional KPI, but due to API limits it's not great for exporting large amounts of historical data.
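Just to show the shape of such a pull, a minimal request against the reporting API's v4 batchGet endpoint might look roughly like this; the key file and view ID are placeholders:

    # Rough sketch of a Reporting API v4 request (Universal Analytics).
    # "key.json" and the view ID "12345678" are placeholders.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    creds = service_account.Credentials.from_service_account_file(
        "key.json",
        scopes=["https://www.googleapis.com/auth/analytics.readonly"],
    )
    reporting = build("analyticsreporting", "v4", credentials=creds)

    response = reporting.reports().batchGet(
        body={
            "reportRequests": [{
                "viewId": "12345678",
                "dateRanges": [{"startDate": "2023-01-01", "endDate": "2023-01-31"}],
                "metrics": [{"expression": "ga:sessions"}],
                "dimensions": [{"name": "ga:date"}, {"name": "ga:sourceMedium"}],
            }]
        }
    ).execute()

    for row in response["reports"][0]["data"].get("rows", []):
        print(row["dimensions"], row["metrics"][0]["values"])

You can see why this works for KPIs but gets painful for full historical dumps: everything comes back pre-aggregated and paginated.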
For big data dumps it's better to use the link to BigQuery ("link" because I want to avoid the word "integration", which implies a larger level of control than you actually have).
Setting up the link to BigQuery is fairly easy: you create a project in the Google Cloud Console, enable billing (BigQuery comes with a fee, it's not part of the GA360 contract), add your email address as BigQuery Owner in the "IAM & Admin" section, then go to your GA account and enter the BigQuery project ID in the GA Admin section under "Property Settings / Product Linking / All Products / BigQuery Link". The process is described here: https://support.google.com/analytics/answer/3416092
You can select between standard updates and streaming updates; the latter comes with an extra fee but gives you near-realtime data. The former updates the data in BigQuery three times a day, every eight hours.
The exported data is not raw data; it is already sessionized: each row in the export is a session with the individual hits nested inside it, so things like the traffic attribution for a hit are session-based.
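To illustrate what "sessionized" means in practice, here is a sketch query against one daily ga_sessions_ table that flattens the nested hits while keeping the session-level attribution (project, dataset and date are placeholders):

    # Sketch: "my-project.123456789.ga_sessions_20240101" is a placeholder for
    # one daily export table of a GA360 view.
    from google.cloud import bigquery

    client = bigquery.Client()

    query = """
    SELECT
      fullVisitorId,
      visitId,
      trafficSource.source AS session_source,   -- session-scoped, repeated per hit
      trafficSource.medium AS session_medium,
      h.hitNumber,
      h.page.pagePath
    FROM `my-project.123456789.ga_sessions_20240101`,
      UNNEST(hits) AS h
    WHERE h.type = 'PAGE'
    """

    for row in client.query(query).result():
        print(dict(row))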
You will pay three different kinds of fees - one for the export to BigQuery, one for storage, and one for the actual querying. Pricing is documented here: https://cloud.google.com/bigquery/pricing.
Pricing depends on region, among other things. The region where the data is stored might also be important when it comes to legal matters; e.g. if you have to comply with the GDPR, your data should be stored in the EU. Make sure you get the region right, because moving data between regions is cumbersome (you need to export the tables to Google Cloud Storage and re-import them in the proper region) and kind of expensive.
You cannot just delete data and do a new export: on your first export, BigQuery will backfill the data for the last 13 months, but it will do this only once per view. So if you need historical data, get this right the first time, because if you delete data in BQ you won't get it back.
I don't actually know much about Redshift, but as per your comment you want to display the data in Tableau, and Tableau connects directly to BigQuery.
We use custom SQL queries to get the data into Tableau (Google Analytics data is stored in daily tables, and custom SQL seems the easiest way to query across many tables). BigQuery has a per-user cache that lasts 24 hours as long as the query does not change, so you won't pay for the query every time a report is opened. It is still a good idea to keep an eye on the cost: cost is based not on the result size but on the amount of data that has to be scanned to produce the result, so a single query over a long timeframe with a few joins can run into dozens of euros (multiplied by the number of users who run it).
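Since cost depends on bytes scanned rather than rows returned, it can be worth dry-running a candidate query before wiring it into Tableau as custom SQL. A sketch using the BigQuery Python client's dry-run option (table path and dates are placeholders):

    # Sketch: estimate how much data a cross-table query would scan before
    # actually running (and paying for) it.
    from google.cloud import bigquery

    client = bigquery.Client()

    sql = """
    SELECT date, SUM(totals.visits) AS sessions
    FROM `my-project.123456789.ga_sessions_*`
    WHERE _TABLE_SUFFIX BETWEEN '20230101' AND '20231231'
    GROUP BY date
    """

    job = client.query(
        sql,
        job_config=bigquery.QueryJobConfig(dry_run=True, use_query_cache=False),
    )
    print(f"Would scan {job.total_bytes_processed / 1e9:.2f} GB")

Narrowing the _TABLE_SUFFIX range and selecting only the columns you need are the two easiest ways to bring that number down.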
scitylana.com has a service that can deliver Google Analytics Free data to S3.
You can get 3 years or more.
The extraction is done through the API. The schema is hit level and has 100+ dimensions/metrics.
Depending on the amount of data in your view, I think this could be done with GA360 too.
Another option is to use Stitch's own specification, singer.io, and related open-source packages:
https://github.com/singer-io/tap-google-analytics
https://github.com/transferwise/pipelinewise-target-redshift
The way you'd use them is by piping data from one into the other:
tap-google-analytics -c ga.json | target-redshift -c redshift.json
I like the Skyvia tool: https://skyvia.com/data-integration/integrate-google-analytics-redshift. It doesn't require coding. With Skyvia, I can create a copy of Google Analytics report data in Amazon Redshift and keep it up to date with little to no configuration effort. I don't even need to prepare the schema; Skyvia can automatically create a table for the report data. You can load 10,000 records per month for free, which is enough for me.

Manually adding e-commerce transactions in Google Analytics

We've had some problems with the tracking of transactions in Google Analytics on our e-commerce website, and now we've lost a couple of months of data due to a configuration error.
Is it possible to bulk import these transactions in some way, along with their additional parameters (date of the event, transaction code, amount)?
We know when each of these events took place, and we would like to import them back into Analytics to have complete statistics.
Thank you for the help.
If the data is in MySQL, for example:
Run a MySQL query which pulls all the orders from the date you want to start from, limited to the date you want to end at.
With this data, create a MySQL view and call it "import_table".
Create a PHP script which loads all the data from import_table and writes it into an XML feed.
Then consume that XML feed inside your analytics JavaScript loop; it will iterate over the orders and add the data into the past for you.
Note: PHP, XML and MySQL are just one scenario to give you a head start.
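As a rough alternative to the client-side JavaScript loop described above, the same backfill can be done server-side with the Universal Analytics Measurement Protocol, sending one transaction hit per recovered order. This is only a sketch: the tracking ID, client IDs and orders list are placeholders, and be aware that hits sent this way are recorded at the time they are received, not at the original order date.

    # Sketch only: backfilling orders via the UA Measurement Protocol.
    # "UA-XXXXXX-1" is a placeholder tracking ID; in practice the orders list
    # would come from the "import_table" view described above.
    import uuid
    import requests

    GA_ENDPOINT = "https://www.google-analytics.com/collect"
    TRACKING_ID = "UA-XXXXXX-1"

    orders = [
        {"transaction_id": "T-1001", "revenue": "49.90"},  # placeholder row
    ]

    for order in orders:
        payload = {
            "v": "1",                      # protocol version
            "tid": TRACKING_ID,            # property tracking ID
            "cid": str(uuid.uuid4()),      # anonymous client ID
            "t": "transaction",            # hit type
            "ti": order["transaction_id"], # transaction ID
            "tr": order["revenue"],        # transaction revenue
        }
        requests.post(GA_ENDPOINT, data=payload, timeout=10)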
