Is it possible to filter the data that is exported to BigQuery? For example, I only want fatal crashes (is_fatal=TRUE) to be exported, not non-fatal exceptions, which take up much more space in my case.
I checked the data transfer options but could not find anything related to filtering or schema customization.
The only configuration options for exporting Crashlytics data to BigQuery are to:
Turn it on or off
Enable streaming of intra-day events (if your project is on the Blaze plan)
It's not possible to control what crash data is exported beyond that.
If you want less data to be stored in BigQuery, you'll have to copy the data you want to keep over to new tables, and delete the ones generated by the integration.
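The copy-then-delete approach above can be sketched as a filtering query. The project, dataset, and table names below are placeholders, not the actual names in your project (Crashlytics exports typically land in a dataset named `firebase_crashlytics`, with one table per app):

```python
# Sketch: build a standard-SQL statement that copies only fatal crashes
# (is_fatal = TRUE) into a new, smaller table. All names are hypothetical.
import textwrap

def fatal_only_query(project: str, dataset: str, table: str) -> str:
    """Return a CREATE TABLE AS SELECT statement keeping only fatal crashes."""
    return textwrap.dedent(f"""\
        CREATE TABLE `{project}.{dataset}.{table}_fatal_only` AS
        SELECT *
        FROM `{project}.{dataset}.{table}`
        WHERE is_fatal = TRUE
    """)

print(fatal_only_query("my-project", "firebase_crashlytics", "com_example_app_ANDROID"))
```

You would run the generated statement in the BigQuery console (or via the `bq` CLI), verify the new table, and then delete the original table to reclaim storage.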
Related
Is it possible to set the default table expiration that is used when enabling the Crashlytics Big Query integration in Firebase?
We are trying to reduce our monthly Firebase costs (Blaze plan) that are due to the amount of data that is exported automatically and now exists in our BigQuery tables. These are the costs that appear in our Firebase billing reports as "non-Firebase services".
To reduce the costs we would like to allow the data to expire automatically and adjust the "time to expire" for all ongoing data exported from Firebase to BigQuery.
Is this possible from within the Firebase console itself? Or can this only be done in BigQuery using the CLI? This page doesn’t seem to give any indication that this is possible from the Firebase Console itself: https://firebase.google.com/docs/crashlytics/bigquery-export
But we can see from the BigQuery docs that Table Expiration appears to be what we need to set; our question is essentially how to apply this to all existing and future tables streamed from Firebase Crashlytics (but also Events and Performance data).
Thanks for any advice!
You can limit data in BigQuery by setting the retention time right in the BigQuery console to whatever length of time you prefer:
Set default table expiration times here
Update a particular table's expiration time here
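To answer the "Firebase console or CLI?" part: this is a BigQuery setting, not a Firebase one, so it is done in the BigQuery console or with the `bq` CLI. A sketch of the two commands, with placeholder dataset and table names; note that a dataset's default expiration only applies to tables created after the change, so tables that already exist need a per-table update:

```python
# Sketch: build the two bq CLI commands for the expiration settings above.
# The dataset/table names are placeholders -- substitute your own export dataset.

def dataset_default_expiration_cmd(dataset: str, days: int) -> str:
    """Default expiration, applied only to tables created AFTER the change."""
    return f"bq update --default_table_expiration {days * 86400} {dataset}"

def table_expiration_cmd(table: str, days: int) -> str:
    """Expiration for a single table that already exists."""
    return f"bq update --expiration {days * 86400} {table}"

print(dataset_default_expiration_cmd("my_project:analytics_123456789", 60))
print(table_expiration_cmd("my_project:analytics_123456789.events_20230101", 60))
```

To cover existing tables you would loop the second command over the output of `bq ls` for the dataset.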
The size of the exported data depends heavily on product usage. Moreover, especially for Crashlytics, the size of the stack traces in the data is completely unpredictable.
To get an idea of the cost, you can check the following links:
Schema of the exported table
Columns that are present regardless of the stack trace
BigQuery free operations
Additionally, see the following documentation, which gives clearer insight into exporting data to BigQuery.
As part of a 'pet project' Flutter app that I am trying to build (in Android Studio) I am looking to add a database of information (possibly with Firebase) for users to use with the app.
My current understanding/capabilities
At the moment, I understand how to build (and have already built) a database in Cloud Firestore, where users can store their own data. A good example of this would be a to-do list app, where the user can create a new item, which is stored in the database with their uid. It remains there until they delete it manually. They are also able to update the entry themselves in the app, such as changing the name of the item.
The aim
I've got a set of data at the moment, which is in Excel format, that has the potential to have up to 1000s of rows. I would like to be able to incorporate this into my app, such that the user is able to query the database, either via multiple dependent drop-down menus, or a search widget.
My question
Is there an easy way to convert a reasonably large set of data, currently in Excel format, into a firebase-type database (such as cloud firestore or realtime database) without having to manually enter all of the data?
For RTDB, you can use some Excel-to-JSON tool and import that JSON into RTDB. However, I doubt that the exported format will be efficient to use in your app, so you might have to do some transformations (in your language of choice).
If your data is very large (1000s of rows, but... how many columns?), you might have to split your import into multiple smaller imports at different paths of your database.
Doing huge RTDB imports in Firebase console has caused my projects to "misbehave" for a little while, but it goes back to normal quickly, so don't freak out if that happens to you too.
For Firestore, which has no direct JSON import AFAIK, take a look at How to import CSV or JSON to firebase cloud firestore for some ideas.
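The Excel-to-JSON step for RTDB can be sketched with nothing but the standard library, assuming you first export the spreadsheet from Excel as CSV. The column names below are made up for illustration; the key idea is that each row becomes one child node keyed by some unique column:

```python
# Minimal sketch: convert a CSV export of the spreadsheet into the keyed-JSON
# shape the RTDB importer expects ({key: {column: value, ...}}).
import csv
import io
import json

def csv_to_rtdb_json(csv_text: str, key_column: str) -> str:
    """Turn CSV rows into a dict keyed by key_column, serialized as JSON."""
    rows = csv.DictReader(io.StringIO(csv_text))
    data = {}
    for row in rows:
        key = row.pop(key_column)  # the key becomes the child node's name
        data[key] = row
    return json.dumps(data, indent=2)

sample = "id,name,category\n1,Apple,fruit\n2,Carrot,vegetable\n"
print(csv_to_rtdb_json(sample, "id"))
```

The resulting JSON file can then be imported at a path of your choosing via the Firebase console; for very large sheets, split the output and import it at several sub-paths as suggested above.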
I am researching a way to regularly sync Firebase data to BigQuery and then display that data in Data Studio. I saw these instructions in the documentation:
https://support.google.com/firebase/answer/6318765?hl=en
According to those instructions, once Firebase is linked to BigQuery, the data from Firebase is streamed to BigQuery in real time.
Let's say I have initial export of Firebase data to BigQuery (before linking) and I made a Data Studio visualization out of that initial data, we call it Dataset A. Then I started linking Firebase to BigQuery. I want Dataset A to be in sync with Firebase every 3 hours.
Based on the documentation, does this mean I don't have to use an external program to synchronize Firebase data to BigQuery every 3 hours, since it is already streaming in real time? After linking, does the streamed data from Firebase automatically go to Dataset A?
I am asking because I don't want to break the visualization if the streaming behaves differently than expected (expected meaning that Firebase streams to BigQuery's Dataset A consistently with the original schema). Because if it does break the original dataset (or it doesn't stream to the original dataset), I might as well write a program that does the syncing.
Once you link your Firebase project to BigQuery, Firebase will continuously export the data to BigQuery, until you unlink the project. As the documentation says, the data is exported to daily tables, and a single fixed intraday table. There is no way for you to control the schedule of the data export beyond enabling/disabling it.
If you're talking about Analytics data, schema changes to the exported data are very rare. So far there's been a schema change once, and there are currently no plans to make any more schema changes. If a schema change ever were to happen again though, all collaborators on the project will be emailed well in advance of the change.
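Since the export lands in daily tables plus an intraday table, a Data Studio visualization usually sits on a query that spans them via a table wildcard rather than on one fixed table. A sketch, with a placeholder dataset name (your own Analytics export dataset will be named `analytics_<property_id>`):

```python
# Sketch: build a query over the daily export tables for a date range.
# The events_* wildcard matches the daily tables; intraday rows live in a
# separate events_intraday_YYYYMMDD table. Names are placeholders.
import textwrap

def daily_events_query(project: str, dataset: str, start: str, end: str) -> str:
    """Count events per event_name across the daily tables in [start, end]."""
    return textwrap.dedent(f"""\
        SELECT event_name, COUNT(*) AS events
        FROM `{project}.{dataset}.events_*`
        WHERE _TABLE_SUFFIX BETWEEN '{start}' AND '{end}'
        GROUP BY event_name
    """)

print(daily_events_query("my-project", "analytics_123456789", "20230101", "20230131"))
```

A view defined this way keeps working as new daily tables appear, so the visualization does not need to be rebuilt as data streams in.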
Every time I close my application, it has to load the data from Firebase again. Is there any way for it to open with the last data fetched already?
If you enable disk persistence, the Firebase client will write any data it sees to a local disk cache. This means that if you restart the app while there's no connectivity, the same data that you saw before is available offline in your app.
For more info, see the Firebase documentation on disk persistence and the call to enable it from Flutter.
You should explain a bit more so we understand your use case. Anyway, I'll try to answer with what I have understood.
You can use sqflite package to cache the data, i.e. it will be stored to the local storage. Get started: https://pub.dev/packages/sqflite
It will be fairly complex, even unnecessary if the size of your data is small.
If you have a huge amount of data that doesn't change frequently, then go for it.
So how it will work is like this:
First you'll check whether the data has changed in Firestore or not.
Case 1: If it didn't then you'll display data from your local sqflite db.
Case 2: If it did change, then, you'll display data from Firestore, and at the same time update your local db with the new data.
Again, this is unnecessary if your data is small or if it changes very frequently.
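The two cases above can be sketched language-agnostically (the real app would use Dart with the sqflite and cloud_firestore packages; all names here are made up for illustration):

```python
# Sketch of the cache-or-fetch decision above, using plain functions as
# stand-ins for Firestore and the local sqflite database. Hypothetical names.

def load_items(remote_version, local_version, fetch_remote, read_local, write_local):
    """Case 1: versions match -> serve the local cache.
    Case 2: remote is newer -> fetch, update the cache, serve fresh data."""
    if remote_version == local_version:
        return read_local()
    items = fetch_remote()
    write_local(items, remote_version)
    return items

# Tiny in-memory stand-ins for the remote store and the local cache:
cache = {"version": 1, "items": ["old"]}
remote = {"version": 2, "items": ["new"]}

result = load_items(
    remote_version=remote["version"],
    local_version=cache["version"],
    fetch_remote=lambda: remote["items"],
    read_local=lambda: cache["items"],
    write_local=lambda items, v: cache.update(version=v, items=items),
)
print(result)  # the remote data, since the remote version is newer
```

The "did the data change?" check is the part you have to design yourself, e.g. a single last-updated timestamp document in Firestore that is cheap to read on every launch.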
I have linked BigQuery to my app through Firebase and it has worked well for more than 3 months. However, a couple of days ago, a random dataset called "analytics_156951304" was created and the app data was then streamed into this dataset instead of the original dataset that BigQuery created for me when I first linked my app, "my_app_name_IOS". The table schema was changed too. I checked Stackdriver Logging and it said an account called "firebase-measurement@system.gserviceaccount.com" created this job at midnight local time. However, I have no clue what happened or how to get my streaming data back to my original dataset, "my_app_name_IOS". Thank you for all the answers!!!
Firebase recently scheduled a huge schema migration - your previous queries won't work anymore.
(AFAIK this was communicated via email)
There's a script that helps migrating your previous datasets to the new schema:
https://support.google.com/analytics/answer/7029846?visit_id=1-636661955728880760-637850888&rd=1#migrationscript
(though the script won't help with modifying your existing queries)
From the email I got:
Currently, the data from each app in a Firebase project are exported to a distinct BigQuery dataset in the corresponding BigQuery project. Going forward, the data from each app in a Firebase project will be exported to a single dataset in that BigQuery project. Additionally, the data in this new dataset will follow a new schema.