Is it possible to use Datastore as an input on a streaming basis, i.e. any time an entity is saved to Datastore, it is streamed to a Dataflow pipeline?
Currently we do not stream out of Datastore automatically, but I've made a note of your interest in it. One approach you can consider is to monitor any relevant source from App Engine and publish its contents to Pub/Sub.
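For example, here is a minimal sketch of the publish side with the Cloud Pub/Sub Python client, assuming a hypothetical datastore-changes topic and a helper you would call wherever your App Engine code writes an entity:

```python
# Minimal sketch: publish a Datastore change to Pub/Sub from App Engine.
# The project ID, topic name, and payload shape are placeholders.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "datastore-changes")

def publish_entity_change(entity_key, entity_data):
    # Call this wherever your App Engine code saves an entity.
    message = json.dumps({"key": entity_key, "data": entity_data}).encode("utf-8")
    future = publisher.publish(topic_path, message)
    future.result()  # block until Pub/Sub acknowledges the message
```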
Is it possible? I'd like to explore using GTM's custom functions to send events to a separate GA4 app data stream. Before I get to that point, I first need a data stream added to a property. Creating an Android data stream in GA4's dashboard also provisions a Google Cloud project. Using the GA4 data stream creation method returns: "API call to analyticsadmin.properties.dataStreams.create failed with error: Only web streams may be created on Admin API. To create app streams, use the Firebase API."
As mentioned, the end goal is to have a single Firebase project send events to a second, unassociated data stream without adding any extra Firebase projects or data streams.
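For context, the one kind of stream the Admin API will create is a web data stream. A hedged sketch with the google-analytics-admin Python client follows (the property ID, URI, and display name are placeholders); Android and iOS streams still have to be provisioned through Firebase:

```python
# Hedged sketch: creating a *web* data stream via the Analytics Admin API.
# Property ID, display name, and default URI are placeholders.
from google.analytics import admin_v1alpha

client = admin_v1alpha.AnalyticsAdminServiceClient()

stream = admin_v1alpha.DataStream(
    display_name="Secondary web stream",
    type_=admin_v1alpha.DataStream.DataStreamType.WEB_DATA_STREAM,
    web_stream_data=admin_v1alpha.DataStream.WebStreamData(
        default_uri="https://example.com"
    ),
)

created = client.create_data_stream(
    parent="properties/123456789",  # placeholder property ID
    data_stream=stream,
)
print(created.name)
```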
I'm teaching myself to use Flutter and I'm making an app that queries The Movie Database API. Currently, I'm having the client query the API on launch, but I'm thinking this is not the most efficient way of doing it, and I would rather have the client query a backend service like Firebase to get the same data.
I would appreciate some guidance on where to start in order to set up a periodic process that queries the API and writes the results as entries into a Firestore DB. I've looked online, but I might be using suboptimal keywords, since I haven't found a good tutorial or example for this.
Thanks.
You can use Firebase Cloud Functions to run code on Google's servers that fills your Firebase database, but note that you can only make HTTP requests to non-Google addresses if you are on a paid plan.
https://firebase.googleblog.com/2017/03/how-to-schedule-cron-jobs-with-cloud.html explains how to invoke periodic tasks with Cloud Functions. It uses Google App Engine for the scheduling because Cloud Functions doesn't provide that out of the box.
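As a rough sketch of the HTTP function that such a cron job could hit, assuming a hypothetical TMDB_API_KEY environment variable, the popular-movies endpoint, and a movies collection in Firestore:

```python
# Minimal sketch: fetch from the TMDB API and upsert the results into Firestore.
# TMDB_API_KEY, the endpoint, and the "movies" collection are placeholders.
import os
import requests
import functions_framework
from google.cloud import firestore

db = firestore.Client()

@functions_framework.http
def refresh_movies(request):
    resp = requests.get(
        "https://api.themoviedb.org/3/movie/popular",
        params={"api_key": os.environ["TMDB_API_KEY"]},
        timeout=10,
    )
    resp.raise_for_status()
    batch = db.batch()
    for movie in resp.json().get("results", []):
        doc_ref = db.collection("movies").document(str(movie["id"]))
        batch.set(doc_ref, movie)  # overwrite with the latest API data
    batch.commit()
    return "ok"
```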
My Current Plan:
I'm currently creating an iOS app that will access/change Java/Python files stored in Google Cloud Storage. Once confirmed, the app will talk to App Engine, which will have a Compute Engine VM receive the files and compile them. Once compiled, the result is returned to the iOS app.
Is there any better or easier method to achieve this? Should I use Firebase or Google Cloud Functions? Would either be any help?
Currently, I'm lost on how to design this and how to have requests sent between so many platforms.
It would also depend on what type of data processing you are doing to the files in Cloud Storage. Ideally, you want to avoid as many "hops" between services as possible. You could do everything via Cloud Functions and listen on Cloud Storage triggers. You can think of Cloud Functions as a pseudo App Engine backend to use for quick request handling.
Use Cloud Functions to respond to events from Cloud Storage or Firebase Storage to process files immediately after upload.
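For example, a minimal sketch of a Storage-triggered function in Python, where the actual processing step is left as a placeholder:

```python
# Minimal sketch: a Cloud Function fired when an object is finalized in a bucket.
# The processing step (e.g. handing the file to a compile job) is a placeholder.
import functions_framework

@functions_framework.cloud_event
def on_file_uploaded(cloud_event):
    data = cloud_event.data
    bucket = data["bucket"]
    name = data["name"]
    # Placeholder: download the object and kick off whatever processing you need.
    print(f"New object gs://{bucket}/{name}, content type {data.get('contentType')}")
```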
If you are already using Firebase, it would be better to stay within its ecosystem as much as possible. If you are doing bigger or more intensive data processing, you might want to look at different options.
With more information about your current pain points, we may be able to offer more insight.
The Cloud Dataflow page implies that this would be possible, but I haven't found a way of observing change events in the Google Cloud Datastore docs. How is it done?
As far as I am aware, the integration of Cloud Datastore with Dataflow is through DatastoreIO (now based on DatastoreV1), which can only be used as a bounded source for batch jobs.
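For reference, a minimal sketch of that bounded read using the Beam Python SDK's equivalent connector (the project ID and entity kind are placeholders):

```python
# Minimal sketch: a batch (bounded) read of Datastore entities with Apache Beam.
# "my-project" and the "Task" kind are placeholders.
import apache_beam as beam
from apache_beam.io.gcp.datastore.v1new.datastoreio import ReadFromDatastore
from apache_beam.io.gcp.datastore.v1new.types import Query

with beam.Pipeline() as p:
    (
        p
        | "ReadEntities" >> ReadFromDatastore(Query(kind="Task", project="my-project"))
        | "LogKeys" >> beam.Map(lambda entity: print(entity.key))
    )
```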
I have been trying to find an alternative solution that would allow you to use Datastore (directly or indirectly) as an unbounded source, for instance by creating a Pub/Sub topic where Datastore changes are published and can be consumed from Dataflow, but I do not think that would be a viable solution given that, as you said, there is no easy way to detect changes (addition of entities, modification of entities, etc.) in Datastore.
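If you did go that route, the Dataflow side would just be an ordinary streaming read from that hypothetical topic; a minimal sketch with the Beam Python SDK:

```python
# Minimal sketch: consume a hypothetical "datastore-changes" topic as an
# unbounded source in a streaming pipeline. Project and topic are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

options = PipelineOptions()
options.view_as(StandardOptions).streaming = True

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadChanges" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/datastore-changes")
        | "Decode" >> beam.Map(lambda msg: msg.decode("utf-8"))
        | "Log" >> beam.Map(print)
    )
```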
For now, I have filed an internal request to improve the documentation: either modify the image so that it does not imply that Cloud Datastore can be used with a streaming pipeline, or clarify this use case.
Actually, my data is several GB, so manually syncing it every time there is an update does not make sense. I want to sync my SQLite database with the Firebase database automatically, whether the app is offline or online, and I want the sync to work both ways: Firebase to SQLite and SQLite to Firebase.
Is it on Android? If you already have functionality to manually sync the data, it should not be a problem to automate it. You can call the same sync functionality from a service, which in turn uses a CountDownTimer. The CountDownTimer can be set to fire every couple of minutes or hourly, as you prefer.
There is no built-in functionality for Firebase to synchronize with SQLite, nor built-in functionality in SQLite to synchronize with Firebase.
But since both have an API, you can write code to do the synchronization for you.
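As an illustration of that approach, here is a minimal one-direction sketch (SQLite to the Realtime Database) using the Admin SDK from a server-side process; the items table, the dirty flag, and the database URL are placeholders, and on Android you would do the equivalent with the platform SDKs:

```python
# Minimal sketch: push locally changed SQLite rows to the Firebase Realtime
# Database. Table name, "dirty" flag, and database URL are placeholders.
import sqlite3
import firebase_admin
from firebase_admin import credentials, db

cred = credentials.Certificate("service-account.json")
firebase_admin.initialize_app(cred, {"databaseURL": "https://my-project.firebaseio.com"})

def push_local_changes(sqlite_path="app.db"):
    conn = sqlite3.connect(sqlite_path)
    ref = db.reference("items")
    rows = conn.execute("SELECT id, payload FROM items WHERE dirty = 1").fetchall()
    for row_id, payload in rows:
        ref.child(str(row_id)).set({"payload": payload})
        conn.execute("UPDATE items SET dirty = 0 WHERE id = ?", (row_id,))
    conn.commit()
    conn.close()
```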
Actually, Firebase uses persistence to keep working while offline, but it can only get the last event dynamically, whatever it is. So we have to write our own sync for that.