I have successfully set up the daily GA4 BigQuery export for several BQ projects, and I can see the daily partitioned tables exported in BigQuery.
However, the export stopped around the end of September in one of the projects. The other projects seem fine so far. Does anybody know what could cause this, and how to fix it?
I didn't receive any notification or alert telling me the data size had exceeded the export limit, and I have already set up a billing account.
Any suggestions would be greatly appreciated!
Is it possible to set the default table expiration that is used when enabling the Crashlytics BigQuery integration in Firebase?
We are trying to reduce our monthly Firebase costs (Blaze plan) that come from the data exported automatically into our BigQuery tables. These are the costs that appear in our Firebase billing reports as "non Firebase services".
To reduce the costs, we would like to let the data expire automatically, and to adjust the "time to expire" for all ongoing data exported from Firebase to BigQuery.
Is this possible from within the Firebase console itself? Or can this only be done in BigQuery using the CLI? This page doesn’t seem to give any indication that this is possible from the Firebase Console itself: https://firebase.google.com/docs/crashlytics/bigquery-export
From the BigQuery docs, Table Expiration appears to be what we need to set; our question is essentially how to apply it to all existing and future tables streamed from Firebase Crashlytics (and also Events and Performance) data.
Thanks for any advice!
You can limit data in BigQuery by setting the retention time right in the BigQuery console to whatever length of time you prefer:
Set default table expiration times here
Update a particular table's expiration time here
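The same two operations can also be done from the CLI, which was part of the question above. A sketch using the `bq` command-line tool; the project, dataset, and table names are placeholders, and expiration values are given in seconds:

```shell
# Set the DEFAULT expiration for tables created in the dataset from now on
# (60 days = 60 * 24 * 3600 = 5184000 seconds).
bq update --default_table_expiration 5184000 my_project:analytics_123456789

# Set the expiration of one EXISTING daily export table to 60 days from now.
bq update --expiration 5184000 my_project:analytics_123456789.events_20210101
```

Note that a default table expiration only applies to tables created after it is set; existing tables have to be updated individually, for example by scripting `bq update --expiration` over the table list from `bq ls`.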
The size of the exported data depends heavily on product usage. Moreover, especially for Crashlytics, the size of the stack traces in the data is completely unpredictable.
To get an idea of the cost, you can check the following links:
Schema of the exported table
Columns present regardless of the stack trace
BigQuery Free operations
Additionally, please see the following documentation, which gives clearer insight into exporting data to BigQuery.
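To get a feel for how much storage the export is actually consuming (and therefore part of the cost), one option is to query BigQuery's `__TABLES__` metadata. A sketch, assuming the default `firebase_crashlytics` dataset name and a placeholder project ID:

```shell
# List the Crashlytics export tables by size, largest first.
bq query --use_legacy_sql=false '
SELECT table_id,
       ROUND(size_bytes / POW(10, 9), 2) AS size_gb
FROM `my_project.firebase_crashlytics.__TABLES__`
ORDER BY size_bytes DESC'
```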
I need some help accessing historical Firebase Crashlytics and Events data in BigQuery.
We have linked BigQuery to Firebase, and at the moment we are only able to get the last 2 months of data in BigQuery.
Can you please suggest a way to get the data since the inception of the app?
Firebase doesn't keep the events data indefinitely, which makes this not feasible at the moment.
Currently, your data starts being exported from the moment you enable the BigQuery connection, i.e. you can't access historical data from before the link was created.
If you think this feature would be useful for you and for other people, I encourage you to request it through this link.
I hope this helps.
I'm using Firebase and BigQuery to monitor my mobile app activity.
Over the first few days, everything was smooth and I saw the events_* tables. However, days passed and suddenly Firebase stopped exporting data to BigQuery.
I validated the BigQuery account, and it seems this is not a payment-related problem.
When I check the Firebase reports, everything looks OK and the data is available.
Any idea what is causing the stop?
I am researching a way to regularly sync Firebase data to BigQuery and then display that data in Data Studio. I saw these instructions in the documentation:
https://support.google.com/firebase/answer/6318765?hl=en
According to those instructions, once Firebase is linked to BigQuery, the data from Firebase is streamed to BigQuery in real time.
Let's say I have an initial export of Firebase data to BigQuery (from before linking), and I made a Data Studio visualization out of that initial data; call it Dataset A. Then I link Firebase to BigQuery. I want Dataset A to be in sync with Firebase every 3 hours.
Based on the documentation, does this mean I don't need an external program to synchronize Firebase data to BigQuery every 3 hours, since it is already streaming in real time? After linking, does the streamed data from Firebase automatically go to Dataset A?
I am asking because I don't want to break the visualization if the streaming behaves differently than expected (expected meaning that Firebase streams to BigQuery's Dataset A consistently with the original schema). If it does break the original dataset, or doesn't stream to the original dataset, I might as well write a program that does the syncing.
Once you link your Firebase project to BigQuery, Firebase will continuously export the data to BigQuery, until you unlink the project. As the documentation says, the data is exported to daily tables, and a single fixed intraday table. There is no way for you to control the schedule of the data export beyond enabling/disabling it.
If you're talking about Analytics data, schema changes to the exported data are very rare. So far there's been a schema change once, and there are currently no plans to make any more schema changes. If a schema change ever were to happen again though, all collaborators on the project will be emailed well in advance of the change.
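As a concrete sketch of the table layout described above (project and dataset IDs are placeholders), the daily tables can be queried together with a wildcard and a `_TABLE_SUFFIX` filter; note that the current day's streamed data lives in separate `events_intraday_YYYYMMDD` tables:

```shell
# Count events per name over the last 3 complete days of the export.
bq query --use_legacy_sql=false '
SELECT event_name, COUNT(*) AS n
FROM `my_project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX BETWEEN
      FORMAT_DATE("%Y%m%d", DATE_SUB(CURRENT_DATE(), INTERVAL 3 DAY))
  AND FORMAT_DATE("%Y%m%d", DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
GROUP BY event_name
ORDER BY n DESC'
```

Because a linked report only references the dataset, a Data Studio visualization built on these tables keeps picking up new daily tables as they are exported, without any external sync job.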
I have to develop a program that works on Firebase Crashlytics BigQuery data, but our dev team still hasn't migrated our app from fabric.io to Firebase.
Can somebody share a small dataset (a couple of dozen entries would be fine) of Firebase Crashlytics data that has been exported to BigQuery?
Sure, here you go: https://firebase.google.com/docs/crashlytics/bigquery-export#without_stack_traces
As a heads up, you can view sample public datasets from the BigQuery dashboard, under Resources > Add Data > Explore Public Datasets.
Thanks, but unfortunately I cannot open the link provided. The message I get is:
"Report cannot be viewed at this time or xxx#yyyyyyy.com does not have access."
Also, within the public datasets, I cannot find a Firebase Crashlytics dataset.
Am I overlooking something?