HERE API's Report Usage is not available - here-api

I'm using the freemium plan, but report usage data is not available. How can I view daily report usage?

You can see the usage details by applying the filter on the right side of the same page from which you took the screenshot above. Once you select a date or a from-to range, the results appear at the bottom; you can also download the results in CSV format to see usage for each day.

Related

Explore Free Form report in Google Analytics

I am trying to generate a report in the Google Analytics Explore tab using the Free Form technique. A few weeks ago I could use Message name, Stream name and Time to see every notification name, platform and total number of clicks, and I exported the results to an Excel file.
But today, when I tried to generate the same report, I couldn't find the "Message Name" dimension. Has this field been removed from the predefined/custom dimensions in GA, or am I doing something wrong?
My main goal is to get the full list of notifications sent via Firebase.
Any help will be deeply appreciated.
Given that you've excluded the obvious issues, like querying data that is too fresh, the proper way to debug this is to export the data into a sample BQ table, then conduct exactly the same analysis you're trying to conduct in GA4's explorer. From there, if your issue is with the explorer's filters, you will see it quickly.
If, however, you're able to see your event properties in BQ but can't get the explorer to display them... Well, Google likely saved quite a lot of money on GA4. UA was pretty expensive. GA4 now introduces all these amazing features like data retention limits, property value cardinality bugs, odd inconsistencies between Explore reports and default reports, and so on.
For now, the best way to really access your data minus all the artificial limitations of GA4 is to ETL it out, either through the reporting API or by exporting it to BQ.
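For the reporting-API route, a minimal sketch using the GA4 Data API's Python client might look like the following. The property ID is a placeholder, and the dimensions/metrics are just examples, not the exact report from the question:

```python
# Minimal sketch: pulling event-level rows out of GA4 via the Data API,
# as an alternative to fighting the Explore UI. Property ID is a placeholder.
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange, Dimension, Metric, RunReportRequest,
)

client = BetaAnalyticsDataClient()  # reads GOOGLE_APPLICATION_CREDENTIALS
request = RunReportRequest(
    property="properties/123456789",  # placeholder GA4 property ID
    dimensions=[Dimension(name="eventName"), Dimension(name="platform")],
    metrics=[Metric(name="eventCount")],
    date_ranges=[DateRange(start_date="30daysAgo", end_date="yesterday")],
)
response = client.run_report(request)
for row in response.rows:
    print([d.value for d in row.dimension_values],
          [m.value for m in row.metric_values])
```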

Ingesting Google Analytics data into S3 or Redshift

I am looking for options to ingest Google Analytics data (historical data as well) into Redshift. Any suggestions regarding tools or APIs are welcome. Searching online I found Stitch as one of the ETL tools; help me understand this option better, along with any others you may have.
Google Analytics has an API (the Core Reporting API). It is good for getting the occasional KPIs, but due to API limits it's not great for exporting large amounts of historical data.
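For reference, a minimal sketch of such an occasional KPI pull via the Core Reporting API v4; the view ID and the key file path are placeholders:

```python
# Sketch: a small KPI report via the Core Reporting API v4.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "key.json",  # placeholder service-account key file
    scopes=["https://www.googleapis.com/auth/analytics.readonly"])
analytics = build("analyticsreporting", "v4", credentials=creds)

response = analytics.reports().batchGet(body={
    "reportRequests": [{
        "viewId": "123456789",  # placeholder view ID
        "dateRanges": [{"startDate": "7daysAgo", "endDate": "yesterday"}],
        "metrics": [{"expression": "ga:sessions"}],
        "dimensions": [{"name": "ga:date"}],
    }]
}).execute()

for row in response["reports"][0]["data"].get("rows", []):
    print(row["dimensions"], row["metrics"][0]["values"])
```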
For big data dumps it's better to use the Link to BigQuery ("Link" because I want to avoid the word "integration" which implies a larger level of control than you actually have).
Setting up the link to BigQuery is fairly easy:
1) Create a project in the Google Cloud Console and enable billing (BigQuery comes with a fee; it's not part of the GA360 contract).
2) Add your email address as a BigQuery Owner in the "IAM & Admin" section.
3) Go to your GA account and enter the BigQuery project ID in the GA Admin section under "Property Settings/Product Linking/All Products/BigQuery Link".
The process is described here: https://support.google.com/analytics/answer/3416092
You can choose between standard updates and streaming updates - the latter comes with an extra fee but gives you near-realtime data. The former updates the data in BigQuery three times a day, every eight hours.
The exported data is not raw data; it is already sessionized (i.e. while you will get one row per hit, things like the traffic attribution for that hit will be session based).
You will pay three different kinds of fees - one for the export to BigQuery, one for storage, and one for the actual querying. Pricing is documented here: https://cloud.google.com/bigquery/pricing.
Pricing depends on region, among other things. The region where the data is stored might also be important when it comes to legal matters - e.g. if you have to comply with the GDPR, your data should be stored in the EU. Make sure you get the region right, because moving data between regions is cumbersome (you need to export the tables to Google Cloud Storage and re-import them in the proper region) and rather expensive.
You cannot simply delete data and do a new export - on your first export BigQuery will backfill the data for the last 13 months, but it will do this only once per view. So if you need historical data, get this right, because if you delete data in BQ you won't get it back.
I don't actually know much about Redshift, but as per your comment you want to display data in Tableau, and Tableau directly connects to BigQuery.
We use custom SQL queries to get the data into Tableau (Google Analytics data is stored in daily tables, and custom SQL seems the easiest way to query data across many tables). BigQuery has a per-user cache that lasts 24 hours as long as the query does not change, so you won't pay for the query every time the report is opened. It is still a good idea to keep an eye on cost - cost is based not on the result size but on the amount of data that has to be scanned to produce the desired result, so if you query over a long timeframe and maybe do a few joins, a single query can run into the dozens of euros (multiplied by the number of users who run it).
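As an illustration of that custom-SQL pattern, here's a sketch of a query over the daily export tables using a table wildcard; the project/dataset names and the date range are placeholders:

```python
# Sketch: querying the GA daily export tables over a date range with a table
# wildcard, so one query spans many ga_sessions_YYYYMMDD tables.
from google.cloud import bigquery

client = bigquery.Client()  # uses application default credentials
query = """
    SELECT date, SUM(totals.visits) AS sessions
    FROM `my-project.my_dataset.ga_sessions_*`  -- placeholder project/dataset
    WHERE _TABLE_SUFFIX BETWEEN '20230101' AND '20230131'
    GROUP BY date
    ORDER BY date
"""
for row in client.query(query).result():
    print(row.date, row.sessions)
```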
scitylana.com has a service that can deliver Google Analytics Free data to S3.
You can get 3 years or more.
The extraction is done through the API. The schema is hit level and has 100+ dimensions/metrics.
Depending on the amount of data in your view, I think this could be done with GA360 too.
Another option is to use Stitch's own specification, singer.io, and the related open-source packages:
https://github.com/singer-io/tap-google-analytics
https://github.com/transferwise/pipelinewise-target-redshift
The way you'd use them is by piping data from one into the other:
tap-google-analytics -c ga.json | target-redshift -c redshift.json
I like the Skyvia tool: https://skyvia.com/data-integration/integrate-google-analytics-redshift. It doesn't require coding. With Skyvia, I can create a copy of Google Analytics report data in Amazon Redshift and keep it up to date with little to no configuration effort. I don't even need to prepare the schema - Skyvia can automatically create a table for the report data. You can load 10,000 records per month for free - this is enough for me.

How to properly use sequences for attribution?

My goal is to see which customers originate from organic search, but convert via a different source later on.
To do this, I defined this segment:
Then I look at the Source/Medium report, but the results seem off. I expected to see zero revenue in the google/organic row (as the segment should show users whose transaction is specifically not coming from google/organic).
Am I using the right tool for what I'm trying to achieve? And if so, what am I doing wrong?
The Top Conversion Paths report in Google Analytics is a better solution here. You can find it under Conversions -> Multi-Channel Funnels -> Top Conversion Paths.
What you're going to see there is every combination of channels that has been used to generate a conversion; see the screenshot below.
If you have a bigger website you're probably going to have >10,000 different conversion paths, depending on the time period you select. What you want to find out is how many conversions and how much conversion value were generated by conversion paths that started with organic but ended with a different channel. Simply apply the filter below in the report to retrieve that data. Please note that in the standard Google Analytics reports all conversions and conversion value are attributed to the last non-direct click.
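If you'd rather pull this out of the UI, the same data is exposed through the Multi-Channel Funnels Reporting API. A rough sketch with the v3 Python client follows; the view ID, key file, and the path-filter regex are all assumptions to adapt:

```python
# Rough sketch: top conversion paths via the Multi-Channel Funnels API (v3).
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "key.json",  # placeholder service-account key file
    scopes=["https://www.googleapis.com/auth/analytics.readonly"])
analytics = build("analytics", "v3", credentials=creds)

result = analytics.data().mcf().get(
    ids="ga:123456789",  # placeholder view ID
    start_date="30daysAgo",
    end_date="yesterday",
    metrics="mcf:totalConversions,mcf:totalConversionValue",
    dimensions="mcf:sourceMediumPath",
    # Assumed regex: keep only paths that start with organic search.
    filters="mcf:sourceMediumPath=~^google / organic",
).execute()

for row in result.get("rows", []):
    print(row)
```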
The solution is to add a condition that includes only sessions, as described in the second step of the sequence. This reduces the population from all users that match the pattern in the sequence to only the sessions that matter. See a detailed explanation here: https://www.napkyn.com/2017/09/07/quick-google-analytics-trick-use-sequence-segments-to-analyse-behavior-over-multiple-sessions/

Tableau Server hourly Google Analytics updates

I'm setting up a dashboard on Tableau Server with a data source extract from Google Analytics.
I need it to refresh hourly.
When publishing the data source I selected Fixed start with incremental refresh, and I can see that the end date is hardcoded to yesterday and I can only schedule updates once a day.
This means that my Dashboard will only have data for the previous day. Is this a known limitation? Is there a way to achieve hourly automated data refreshes?
I thought of three possible solutions:
1) Using the command-line data extract update, but I don't see any parameter that lets me control the date range, so the end date would always be hardcoded to yesterday (?)
2) Using the Tableau API. But again, my extract would only refresh up to yesterday.
3) Setting up a separate process to download the data into a database and then connecting to it with Tableau, as sketched below. I'm not sure whether I could incrementally update the extract in this case, or whether I'd have to reload the entire extract every time.
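To make option 3 concrete, here is a rough sketch of the hourly job I have in mind. fetch_ga_rows is a hypothetical stand-in for the actual GA API pull, and SQLite stands in for the real target database:

```python
# Rough sketch of option 3: an hourly job that re-pulls only the current day
# and upserts it into a database Tableau connects to, keeping past days intact.
import sqlite3
from datetime import date

def fetch_ga_rows(day: date):
    """Hypothetical stand-in for the real GA API pull; should return
    (day, hour, sessions) tuples for the given day."""
    return []  # replace with an actual reporting API / BigQuery query

conn = sqlite3.connect("ga_extract.db")  # Redshift would use its own driver
conn.execute("""CREATE TABLE IF NOT EXISTS ga_hourly (
    day TEXT, hour INTEGER, sessions INTEGER,
    PRIMARY KEY (day, hour))""")

# Upsert today's rows; the primary key makes the hourly re-pull idempotent.
for day, hour, sessions in fetch_ga_rows(date.today()):
    conn.execute("INSERT OR REPLACE INTO ga_hourly VALUES (?, ?, ?)",
                 (str(day), hour, sessions))
conn.commit()
```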
Any help would be much appreciated!!

Can see Real-Time Data, but can't see data in a report

I can see my real-time event data as I post it.
However, when switching to my dashboard or a report, no data appears.
Do you have to wait a day to see the data in a report that is not real-time?
Thank you in advance,
Karl
See Confuzing's response.
It can take up to 48 hours to see data in Analytics, although in my experience it has never taken more than 4 hours. Also make sure you are selecting today's date in the date range, as it doesn't automatically include the current date in the range.
