I have linked BigQuery to my app through Firebase and it has worked well for more than 3 months. However, a couple of days ago, a random dataset called "analytics_156951304" was created, and the app data started streaming into this dataset instead of the original dataset that BigQuery created for me when I first linked my app, "my_app_name_IOS". The table schema changed too. I checked Stackdriver Logging and it said an account called "firebase-measurement@system.gserviceaccount.com" created this job at midnight local time. However, I have no clue what happened or how to get my streaming data back into my original dataset, "my_app_name_IOS". Thank you for all the answers!
Firebase recently scheduled a major schema migration, so your previous queries won't work anymore.
(AFAIK this was communicated via email)
There's a script that helps you migrate your previous datasets to the new schema:
https://support.google.com/analytics/answer/7029846?visit_id=1-636661955728880760-637850888&rd=1#migrationscript
(though the script won't help with updating your existing queries)
From the email I got:
Currently, the data from each app in a Firebase project are exported to a distinct BigQuery dataset in the corresponding BigQuery project. Going forward, the data from each app in a Firebase project will be exported to a single dataset in that BigQuery project. Additionally, the data in this new dataset will follow a new schema.
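For what it's worth, under the new layout queries go against the single analytics_&lt;property id&gt; dataset and its daily events_YYYYMMDD tables rather than the old per-app dataset. Below is a minimal sketch of such a query using the Python BigQuery client; the project id is a placeholder, and analytics_156951304 is just the dataset name from the question above.

```python
# Hedged sketch: querying the new single-dataset export.
# "my-firebase-project" is a placeholder project id.
from google.cloud import bigquery

client = bigquery.Client(project="my-firebase-project")

query = """
    SELECT event_name, COUNT(*) AS event_count
    FROM `my-firebase-project.analytics_156951304.events_*`
    WHERE _TABLE_SUFFIX BETWEEN '20180701' AND '20180707'
    GROUP BY event_name
    ORDER BY event_count DESC
"""

for row in client.query(query).result():
    print(row.event_name, row.event_count)
```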
Related
Is it possible to filter the data exported to BigQuery? For example, I only want fatal crashes (is_fatal=TRUE) to be exported, not non-fatal exceptions, which take up much more space in my case.
I checked out data transfer options but could not find anything related to filtering or schema customization.
The only configuration options for exporting Crashlytics data to BigQuery are to:
Turn it on or off
Enable streaming of intra-day events (if your project is on the Blaze plan)
It's not possible to control what crash data is exported beyond that.
If you want less data to be stored in BigQuery, you'll have to copy the data you want to keep over to new tables, and delete the ones generated by the integration.
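If you go the copy-and-delete route, a rough sketch with the Python BigQuery client could look like the following. The firebase_crashlytics dataset name, the export table name, and the is_fatal column are assumptions about your export (is_fatal comes from the question above), so verify them against your own project first.

```python
# Rough sketch, not a drop-in tool: keep only fatal crashes in a new table,
# then drop the (much larger) table generated by the integration.
# All project, dataset, and table names below are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-firebase-project")

dest = bigquery.TableReference.from_string(
    "my-firebase-project.firebase_crashlytics.fatal_crashes_only"
)
job_config = bigquery.QueryJobConfig(
    destination=dest,
    write_disposition="WRITE_TRUNCATE",
)
query = """
    SELECT *
    FROM `my-firebase-project.firebase_crashlytics.com_example_app_ANDROID`
    WHERE is_fatal = TRUE
"""
client.query(query, job_config=job_config).result()

# Reclaim the storage used by the full export table.
client.delete_table("my-firebase-project.firebase_crashlytics.com_example_app_ANDROID")
```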
The scenario here is that I have my Firebase project (just Analytics data) linked to BigQuery, but when I check BigQuery for the dataset, it doesn't appear there and I don't know which name/ID it has.
I highly appreciate your support. Thanks
According to the documentation, when you link your Firebase project to BigQuery, a corresponding dataset will be created. This dataset can be found in BigQuery under your project ID, and it will be named after your app. If you have both iOS and Android versions of your app, two datasets will be created, as follows:
The above image was taken from the documentation, here.
Furthermore, in addition to your app_events table, under your app's name you will have an app_events_intraday table, which receives data from Firebase in near real time. In other words, as soon as Firebase receives data from the app, it transfers it to the intraday table in BigQuery. The app_events table, on the other hand, is uploaded once per day, link.
Lastly, keep in mind that the data generated by your app can take up to 1 hour to be sent to Firebase, which then forwards it to BigQuery almost instantly. You can read more about the latency here.
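If the dataset doesn't seem to show up, or you're not sure what it was named (as in the question), you can also list everything programmatically. A minimal sketch with the Python BigQuery client, assuming you're authenticated against the linked project (the project id is a placeholder):

```python
# Minimal sketch: list the datasets and tables the Firebase link created,
# e.g. the per-app dataset(s) and their app_events / app_events_intraday tables.
from google.cloud import bigquery

client = bigquery.Client(project="my-firebase-project")  # placeholder project

for dataset in client.list_datasets():
    print("dataset:", dataset.dataset_id)
    for table in client.list_tables(dataset.dataset_id):
        print("  table:", table.table_id)
```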
As part of a 'pet project' Flutter app that I am trying to build (in Android Studio), I am looking to add a database of information (possibly with Firebase) for users to use with the app.
My current understanding/capabilities
At the moment, I understand how to (and have already) build a database in Cloud Firestore, where the users can store their own data. A good example of this would be a to-do list app, where the user can create a new item, which is stored in the database with their uid. This remains there until they delete it manually. They are also able to update the entry, such as change the name of the item, themselves in the app.
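Roughly, the pattern I mean looks like this (sketched here with the Python Admin SDK for brevity rather than Flutter/Dart; the collection and field names are just examples):

```python
# Minimal sketch of the to-do pattern: each item is a document owned by a uid,
# created, updated, and deleted by that user. Names below are illustrative only.
import firebase_admin
from firebase_admin import credentials, firestore

firebase_admin.initialize_app(credentials.ApplicationDefault())
db = firestore.client()

uid = "some-user-id"  # placeholder

# Create a new to-do item owned by this user.
item_ref = db.collection("todos").document()
item_ref.set({"uid": uid, "name": "Buy milk", "done": False})

# Later, the user renames the item.
item_ref.update({"name": "Buy oat milk"})
```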
The aim
I've got a set of data at the moment, in Excel format, that could run to thousands of rows. I would like to incorporate this into my app so that the user can query the database, either via multiple dependent drop-down menus or a search widget.
My question
Is there an easy way to convert a reasonably large set of data, currently in Excel format, into a Firebase-style database (such as Cloud Firestore or the Realtime Database) without having to manually enter all of the data?
For RTDB, you can use some Excel-to-JSON tool and import that JSON into RTDB. However, I doubt the exported format will be efficient to use in your app, so you might have to do some transformations (in your language of choice).
If your data is very large (1000s of rows, but... how many columns?), you might have to split your import into multiple smaller imports at different paths of your database.
Doing huge RTDB imports in the Firebase console has caused my projects to "misbehave" for a little while, but they go back to normal quickly, so don't freak out if that happens to you too.
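To make that Excel-to-JSON step concrete, here's a small Python sketch (the file name and key scheme are assumptions, and pandas needs openpyxl installed to read .xlsx files):

```python
# Hedged sketch: turn each spreadsheet row into a keyed object so the RTDB
# "Import JSON" feature receives a dictionary rather than a bare array.
import json
import pandas as pd

df = pd.read_excel("data.xlsx").fillna("")  # placeholder file; blank cells become ""

payload = {
    f"item_{i}": row
    for i, row in enumerate(df.to_dict(orient="records"))
}

with open("rtdb_import.json", "w") as f:
    json.dump(payload, f, indent=2, default=str)  # default=str guards non-JSON types
```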
For Firestore, which has no direct JSON import AFAIK, take a look at How to import CSV or JSON to firebase cloud firestore for some ideas.
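For Firestore specifically, a rough script that reads the spreadsheet and writes the rows in batches might look like this (file name, collection name, and field layout are assumptions about your data; Firestore batches are capped at 500 writes):

```python
# Rough sketch: bulk-load spreadsheet rows into a Firestore collection.
import firebase_admin
import pandas as pd
from firebase_admin import credentials, firestore

firebase_admin.initialize_app(credentials.ApplicationDefault())
db = firestore.client()

df = pd.read_excel("data.xlsx").fillna("")  # placeholder file name

batch = db.batch()
for i, record in enumerate(df.to_dict(orient="records"), start=1):
    batch.set(db.collection("items").document(), record)
    if i % 400 == 0:        # stay under the 500-writes-per-batch limit
        batch.commit()
        batch = db.batch()
batch.commit()
```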
I am researching a way to regularly sync Firebase data to BigQuery and then display that data in Data Studio. I saw these instructions in the documentation:
https://support.google.com/firebase/answer/6318765?hl=en
According to the instructions above, once Firebase is linked to BigQuery, data from Firebase is streamed to BigQuery in real time.
Let's say I have an initial export of Firebase data to BigQuery (from before linking) and I made a Data Studio visualization out of that initial data; let's call it Dataset A. Then I link Firebase to BigQuery. I want Dataset A to be in sync with Firebase every 3 hours.
Based on the documentation, does this mean I don't have to use an external program to synchronize Firebase data to BigQuery every 3 hours, since it is already streaming in real time? After linking, does the streamed data from Firebase automatically go into Dataset A?
I am asking because I don't want to break the visualization if the streaming behaves differently than expected (expected meaning that Firebase streams into BigQuery's Dataset A, consistent with the original schema). If it does break the original dataset, or doesn't stream into the original dataset at all, I might as well write a program that does the syncing myself.
Once you link your Firebase project to BigQuery, Firebase will continuously export the data to BigQuery, until you unlink the project. As the documentation says, the data is exported to daily tables, and a single fixed intraday table. There is no way for you to control the schedule of the data export beyond enabling/disabling it.
If you're talking about Analytics data, schema changes to the exported data are very rare. So far there's been a schema change once, and there are currently no plans to make any more schema changes. If a schema change ever were to happen again though, all collaborators on the project will be emailed well in advance of the change.
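If the goal is simply to keep a Data Studio report looking at fresh data without an external sync job, one option is to point the report at a view over the exported tables. A hedged sketch, assuming the current export table names (all project and dataset names are placeholders, and the reporting dataset must already exist):

```python
# Hedged sketch: a view that unions the finalised daily tables with the
# intraday table, so a Data Studio report always sees the latest export.
# All project/dataset names are placeholders; the "reporting" dataset must exist.
from google.cloud import bigquery

client = bigquery.Client(project="my-firebase-project")

view = bigquery.Table("my-firebase-project.reporting.all_events")
view.view_query = """
    SELECT * FROM `my-firebase-project.analytics_123456789.events_*`
    WHERE _TABLE_SUFFIX NOT LIKE 'intraday_%'
    UNION ALL
    SELECT * FROM `my-firebase-project.analytics_123456789.events_intraday_*`
"""
client.create_table(view, exists_ok=True)
```

Data Studio then re-queries the view whenever the report refreshes, so there is no separate 3-hour sync job to maintain.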
I have to develop a program that works with Firebase Crashlytics BigQuery data, but our dev team still hasn't migrated our app from Fabric.io to Firebase.
Can somebody share a small dataset (a couple of dozen entries would be OK) of Firebase Crashlytics data that has been exported to BigQuery?
https://firebase.google.com/docs/crashlytics/bigquery-export#without_stack_traces
Sure, here you go.
As a heads up, you can view sample public datasets from the BigQuery dashboard, under Resources>Add Data>Explore Public Datasets.
Thanks, but unfortunately I cannot open the link provided. The message I get is:
"Report cannot be viewed at this time or xxx#yyyyyyy.com does not have access."
Also, within the public datasets, I cannot find a Firebase Crashlytics dataset.
Am I overlooking something?