How to get historical Firebase Crashlytics and Events data in BigQuery?

Need some help with accessing historical Firebase Crashlytics and Events data in BigQuery.
We have linked our Firebase project to BigQuery, but at the moment we can only see the last 2 months of data in BigQuery.
Can you please suggest a way to get the data since the inception of the app?

Firebase doesn't keep the events data indefinitely, which makes this infeasible at the moment.
Currently, your data starts being exported from the moment you enable the BigQuery connection, i.e. you can't access your historical data.
If you think this feature would be useful for you and for other people, I encourage you to request it through this link.
I hope this helps.

Related

Crashlytics BigQuery integration table expiration

Is it possible to set the default table expiration that is used when enabling the Crashlytics BigQuery integration in Firebase?
We are trying to reduce our monthly Firebase costs (Blaze plan) that are due to the amount of data exported automatically and now stored in our BigQuery tables. These are the costs that appear in our Firebase billing reports as "non Firebase services".
To reduce the costs we would like to allow the data to expire automatically and adjust the "time to expire" shown below for all ongoing data exported from Firebase to BigQuery.
Is this possible from within the Firebase console itself? Or can this only be done in BigQuery using the CLI? This page doesn’t seem to give any indication that this is possible from the Firebase Console itself: https://firebase.google.com/docs/crashlytics/bigquery-export
But we can see from the BigQuery docs that table expiration appears to be what we need to set; our question is essentially how to apply this to all existing and future tables streamed from Firebase Crashlytics (but also Events and Performance).
Thanks for any advice!
You can limit data in BigQuery by setting the retention time right in the BigQuery console to whatever length of time you prefer:
Set default table expiration times here
Update a particular table's expiration time here
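As a rough sketch, assuming the `bq` command-line tool is authenticated against your project and your Crashlytics export lands in a dataset named `firebase_crashlytics` (the dataset and table names below are illustrative; check yours in the BigQuery console), the two settings could look like this:

```shell
# Set the default expiration for NEW tables in the dataset to 60 days.
# The value is in seconds: 60 * 24 * 3600 = 5184000.
bq update --default_table_expiration 5184000 firebase_crashlytics

# The default only applies to tables created afterwards, so existing
# tables must be updated one by one, e.g. an app's Crashlytics table:
bq update --expiration 5184000 firebase_crashlytics.com_example_app_ANDROID
```

Note that changing the dataset default does not retroactively expire tables that already exist, which is why the per-table command is needed as well.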
The size of the exported data depends heavily on product usage. Moreover, especially for Crashlytics, the size of the stack traces in the data is completely unpredictable.
To get an idea of the cost, you can check the following links:
Schema of the exported table
Columns present regardless of the stack trace
BigQuery free operations
Additionally, please see the following documentation, which gives clearer insight into exporting data to BigQuery.

How to see the Firebase Analytics data on BigQuery?

The scenario here is that I have my Firebase project (just Analytics data) linked to BigQuery, but when I check BigQuery the dataset doesn't appear there, and I don't know which name/ID it has.
I highly appreciate your support. Thanks.
According to the documentation, when you link your Firebase project to BigQuery, a corresponding dataset will be created. This dataset can be found in BigQuery under your project ID, and it is named after your app. If you have both iOS and Android versions of your app, two datasets will be created, one per app, as shown in the screenshot in the documentation, here.
Furthermore, in addition to your app_events table, under your app's name you will have an app_events_intraday table, which receives data from Firebase in near real time. In other words, as soon as Firebase receives data from the app, it transfers it to the intraday table in BigQuery, whereas the app_events table is uploaded once per day, link.
Lastly, keep in mind that the data generated by your app can take up to 1 hour to be sent to Firebase, which then sends it to BigQuery nearly instantly. You can read more about the latency here.
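If the dataset still isn't visible in the BigQuery UI, one sanity check (assuming the `bq` CLI is authenticated against the same GCP project as your Firebase project; `my-firebase-project` and the dataset name below are placeholders for your real IDs) is to list the datasets directly:

```shell
# List all datasets in the project linked to Firebase; once the link is
# active, the Analytics export dataset should appear in this list.
bq ls --project_id my-firebase-project

# Then list the tables inside that dataset to see the daily and
# intraday tables described above.
bq ls my-firebase-project:your_analytics_dataset
```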

Refresh Firebase data to BigQuery to display in Data Studio

I am researching a way to regularly sync Firebase data to BigQuery, then display that data in Data Studio. I saw these instructions in the documentation:
https://support.google.com/firebase/answer/6318765?hl=en
According to the above instructions, once Firebase is linked to BigQuery, the data from Firebase is streamed to BigQuery in real time.
Let's say I have an initial export of Firebase data to BigQuery (before linking) and I made a Data Studio visualization out of that initial data; call it Dataset A. Then I start linking Firebase to BigQuery. I want Dataset A to be in sync with Firebase every 3 hours.
Based on the documentation, does this mean I don't have to use an external program to synchronize Firebase data to BigQuery every 3 hours, since it is already streaming in real time? After linking, does the streamed data from Firebase automatically go to Dataset A?
I am asking because I don't want to break the visualization if the streaming behaves differently than expected (expected meaning that Firebase streams to BigQuery's Dataset A consistently with the original schema). If it does break the original dataset, or doesn't stream to the original dataset, I might as well write a program that does the syncing.
Once you link your Firebase project to BigQuery, Firebase will continuously export the data to BigQuery, until you unlink the project. As the documentation says, the data is exported to daily tables, and a single fixed intraday table. There is no way for you to control the schedule of the data export beyond enabling/disabling it.
If you're talking about Analytics data, schema changes to the exported data are very rare. So far there has been one schema change, and there are currently no plans to make any more. If a schema change were ever to happen again, all collaborators on the project would be emailed well in advance of the change.

How to calculate MAUs in Firebase? Do I need BigQuery?

We're using Firebase for analytics in our mobile apps. But Firebase only appears to report active users over 1, 7 and 28-day rolling periods. These are not the industry-standard reporting metrics I'm looking for.
We also have a web app, where we count unique active users in Google Analytics, and we'd like to be able to compare (and combine) MAUs from our apps in Firebase with web MAUs calculated in GA.
Is this possible without BigQuery?
If no, how much will BigQuery cost us?
It seems crazy to have to purchase BigQuery for this purpose alone. Any help is appreciated.
Is [it] possible [to get MAU] without BigQuery?
If the intervals in the analytics reports in the Firebase console don't suit your needs, you will have to roll your own. There is nothing built into Firebase for custom intervals. Most developers use BigQuery for such custom reporting, especially since this is quite easy to do by tweaking the default Data Studio template.
If no, how much will BigQuery cost us?
If you have a look at the BigQuery pricing page, you'll see that pricing is quite involved, making this hard to answer without knowing your exact amount of data. In general: if you store and process more data (i.e. have more users in your app or run more reports), you will pay more. Luckily there is now a BigQuery sandbox, which allows you to process a significant amount of data without paying (even without entering a credit card). This gives you an option to try BigQuery before committing to it.
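To make the "roll your own" option concrete, here is a minimal sketch of a calendar-month MAU calculation. The field names mirror the Analytics BigQuery export (`user_pseudo_id`, `event_date` as `'YYYYMMDD'`), but the logic is shown in plain Python over sample rows, so treat it as an illustration of the computation rather than a ready-made BigQuery query:

```python
from collections import defaultdict

def monthly_active_users(events):
    """Count distinct users per calendar month.

    `events` is an iterable of (user_pseudo_id, event_date) pairs,
    with event_date in the export format 'YYYYMMDD'.
    """
    users_by_month = defaultdict(set)
    for user_id, event_date in events:
        month = event_date[:6]  # 'YYYYMM'
        users_by_month[month].add(user_id)
    return {month: len(users) for month, users in sorted(users_by_month.items())}

events = [
    ("user_a", "20230105"),
    ("user_a", "20230120"),  # same user twice in January: counted once
    ("user_b", "20230131"),
    ("user_a", "20230201"),
]
print(monthly_active_users(events))  # {'202301': 2, '202302': 1}
```

In BigQuery the equivalent would be a `COUNT(DISTINCT ...)` grouped by month over the exported event tables, which is the kind of query the default Data Studio template can be tweaked to produce.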

Firebase - Perform Analytics from database/firestore data

I am using Firebase as my authentication and database platform in my React Native Expo app. I have not yet decided whether I will use the Realtime Database or Cloud Firestore.
I need to perform statistical analysis on daily data gathered from my users, which is stored in the database. For example, users type in their daily protein intake, and from it I would like to calculate their weekly average and expected monthly average, provide suggestions for types of food if protein intake is too low, etc.
What would be the best approach in order to achieve the result wanted in my specific situation?
I am really unfamiliar with this and stepping into uncharted territory regarding how to accomplish it. I have read that Firebase Analytics generates various basic analytics regarding usage of the app, the number of crash-free users, etc. But can it perform analytics on custom events? Can I create a custom event for Firebase Analytics to keep track of a certain node in my database, and output analytics from that? And then of course, if yes, does it work with React Native Expo or do I need to detach from Expo? In addition, I have read that Firebase Analytics can be combined with Google BigQuery. Would this be an alternative for my case?
Are there any other ways of performing such data analysis on my data stored in Firebase database? For example, export the data and use Python and SciKit Learn?
Whatever opinion or advice you may have, I would be grateful if you could share it!
You're not alone - many people building web apps on GCP have this question, and there is no single answer.
I'm not too familiar with Firebase Analytics, but I can answer the question for Firestore and for your custom analytics (e.g. weekly average protein consumption).
The first thing to point out is that Firestore, unlike other NoSQL databases, is storage only. You can't perform aggregations in real time like you can with MongoDB, so the calculations have to be done somewhere else.
The best practice recommended by GCP in this case is indeed to do a regular export of your Firestore data into BQ (BigQuery) and run the analytical calculations there. You could also, when a user inputs some data, send it to Pub/Sub and use one of GCP Dataflow's streaming templates to stream the data into BQ, and have everything in near real time.
Here's the issue with that, however: while this solution gives you real time and is very scalable, it gets expensive fast, and if you're more used to Python than SQL for running analytics it can be a steep learning curve. Here's an alternative I've used for smaller web apps, which scales well for <100k users and costs <$20 a month on GCP's current pricing:
Write a Python script that grabs the data from Firestore (using the Firestore Python SDK), generates the analytics you need on it, and writes the results back to a Firestore collection
Create an endpoint for that function using Flask or Django
Deploy that server application on Cloud Run, preventing unauthenticated invocations (you'll only be calling it from within GCP) - see this article, steps 1 and 2 only. You can also deploy the Python script(s) to GCP's Vertex AI or hosted Jupyter notebooks if you're more comfortable with that
Use Cloud Scheduler to call that function every x minutes - see these docs for authentication
Have your React app query the "analytics results" collection to get the results
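As a sketch of step 1 above, the aggregation itself can be plain Python. The Firestore reads and writes are deliberately left out (in the real script they would use the Firestore Python SDK against your own collection names, which are assumptions about your setup); what's shown is only the weekly-average calculation the script would run:

```python
from collections import defaultdict
from datetime import date

def weekly_average_protein(entries):
    """Average daily protein intake per ISO week.

    `entries` maps an ISO date string ('YYYY-MM-DD') to grams of protein
    logged that day. In the real script these values would be read from a
    Firestore collection, and the returned dict would be written back to
    an "analytics results" collection for the React app to query.
    """
    totals = defaultdict(lambda: [0.0, 0])  # week -> [sum, count]
    for day, grams in entries.items():
        year, week, _ = date.fromisoformat(day).isocalendar()
        key = f"{year}-W{week:02d}"
        totals[key][0] += grams
        totals[key][1] += 1
    return {week: total / count for week, (total, count) in sorted(totals.items())}

entries = {"2023-01-02": 80, "2023-01-03": 100, "2023-01-09": 60}
print(weekly_average_protein(entries))  # {'2023-W01': 90.0, '2023-W02': 60.0}
```

Cloud Scheduler would then hit the Cloud Run endpoint wrapping this function on whatever interval you choose.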
My solution is a Flutter Web based dashboard that displays relevant data in (near) real time, like the regular Flutter iOS/Android app, along with some aggregated data.
The aggregated data is compiled using a few Node.js based triggers in the database that do the analytic lifting, and hence it is also near real time. If you study the pricing you will learn that function invocations are pretty cheap, unless of course you happen to make a 'desphew' :)
I came up with a great solution.
I used the built-in Firebase BigQuery integration. Then I used Cube.js (deployed on GCP, Cloud Run on Docker) on top of BigQuery.
Cube.js makes everything so easy. You don't need to write queries manually, and it tries to optimize the queries it generates. On top of that, it uses caching, so you won't get big bills on GCP. I think this is the best solution I was able to find, and it is infinitely scalable and totally real time.
Also, if you are a small startup then it is mostly free with GCP's free limits on Cloud Run and BigQuery.
Note: this is not affiliated in any way with Cube.js.