My goal is to have a Firebase Cloud Function track the upload of three separate files to the same storage bucket. These uploads are preceded by a write to the Realtime Database, which would preferably be the trigger for the Cloud Function that tracks the uploads.
The context: a user is adding an item to her shopping cart. The data is written to the RTDB, and then a custom 3D model and two images are copied into a storage bucket. If any of these files fails to upload, I need to know that and conduct a rollback of the three files in the storage bucket, and also remove the entry in the database. I could handle this client side, but that isn't ideal, since if the uploads fail it's usually because the connection with the client has failed.
I haven't been able to find any sort of batch add or transaction-style upload for Firebase Storage. Sorry for not having any code to show, but I'm not even really sure how to get started on this. Any suggestions would be much appreciated. Thanks!
There are no transactions that cross products like this, and Cloud Storage doesn't offer transactions either. You're going to have to check for errors and manually undo things previously done, or have some job that checks for orphaned data and deletes it later.
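A minimal sketch of the "cleanup job" approach, assuming the cart entry is written to /carts/{uid}/{itemId} with an uploadedAt timestamp and the three files land under carts/{uid}/{itemId}/ in the default bucket. All paths, file names, and the schedule are hypothetical; adapt them to your structure.

```js
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

const FILE_NAMES = ['model.glb', 'image1.jpg', 'image2.jpg'];
const GRACE_MS = 10 * 60 * 1000; // give the client 10 minutes to finish uploading

exports.cleanupOrphanedCartItems = functions.pubsub
  .schedule('every 15 minutes')
  .onRun(async () => {
    const snap = await admin.database().ref('carts').once('value');
    const bucket = admin.storage().bucket();
    const work = [];

    snap.forEach((userSnap) => {
      userSnap.forEach((itemSnap) => {
        const item = itemSnap.val();
        if (Date.now() - item.uploadedAt < GRACE_MS) return; // may still be uploading
        work.push((async () => {
          const prefix = `carts/${userSnap.key}/${itemSnap.key}`;
          // exists() resolves to a one-element [boolean] array per file.
          const checks = await Promise.all(
            FILE_NAMES.map((name) => bucket.file(`${prefix}/${name}`).exists())
          );
          const allPresent = checks.every(([exists]) => exists);
          if (!allPresent) {
            // Roll back: delete whatever files did make it, then the DB entry.
            await Promise.all(
              FILE_NAMES.map((name) =>
                bucket.file(`${prefix}/${name}`).delete().catch(() => {})
              )
            );
            await itemSnap.ref.remove();
          }
        })());
      });
    });

    await Promise.all(work);
  });
```

The grace period matters: without it, the job could delete an entry whose uploads are simply still in flight.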
Update: Editing the question title/body based on the suggestion.
Firestore makes everything that is publicly readable also accessible to the browser through a script, so nothing stops a user from just calling db.get('collection') and saving all the data as their own.
In a more traditional setup, where the app's frontend pulls data from a backend, a user would at least have to go through the extra trouble of tweaking the UI and then scraping the frontend to pull more and more data (think of Twitter's "load more" button).
My question is whether it is possible to keep users from grabbing the entire database in one click, while still keeping the data publicly readable.
Old:
From what I understand, any user who can see data coming out of a Firebase datastore can also run a query to extract all of that data. That is not desirable when the data itself has any value, and yet Firebase is such an easy-to-use tool that it's great for pretty much everything else.
Is there a way, or a best practice, for structuring the data or the access rules such that users can see the data but can't just run a script to download all of it?
Thanks!
Kato once implemented a simplistic rate limit for writes in Realtime Database security rules: Firebase rate limiting in security rules?. Something similar could be possible in Cloud Firestore rules. But this approach won't work for reads, since you can't update the timestamp at the same time the read is performed.
You can however limit what queries a user can perform on your database. For example, to limit them to reading 50 documents at a time:
allow list: if request.query.limit <= 50;
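On the client, a query that satisfies that rule could look like this (modular JS SDK; the collection name is a placeholder):

```js
import { getFirestore, collection, query, limit, getDocs } from 'firebase/firestore';

const db = getFirestore();

// Request at most 50 documents, matching the rule's request.query.limit check.
const first50 = query(collection(db, 'items'), limit(50));
const snapshot = await getDocs(first50);
snapshot.forEach((doc) => console.log(doc.id, doc.data()));
```

Note that a query without an explicit limit() fails the rule too, since request.query.limit is then unset, so clients are forced to paginate rather than pull the whole collection in one call.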
I want to log the following user actions on my Firebase app:
sign in/out
page in/out
timestamp of action
Right now, I use my own function to log actions to the database location "root > user-logs > [user's id]".
Each action is logged as
[time in milliseconds] : [action]
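In code, the write looks roughly like this (modular JS SDK; the "user-logs" path is from my setup, the action string is an example):

```js
import { getDatabase, ref, set } from 'firebase/database';
import { getAuth } from 'firebase/auth';

const db = getDatabase();
const uid = getAuth().currentUser.uid;

// Key is the millisecond timestamp, value is the action name.
await set(ref(db, `user-logs/${uid}/${Date.now()}`), 'sign-in');
```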
These logs put a lot of data in my database.
However, I won't be accessing the data stored at the user-logs location, so my belief is that this won't slow down read operations at other locations in the database.
Question 1: Is the above belief true?
Question 2: Is there a better way to log customized user actions?
I first thought of creating a CSV file in Cloud Storage and appending user actions to it, but then realized that in order to append to a CSV file I would first have to download it, so I decided that writing to the database would be much faster (and easier).
Thanks.
If you write data to a location that you don't read any data from, then that write operation will not affect operations that read data from somewhere else in your database.
But storing data that you're never going to read is unlikely. Otherwise there probably wouldn't be a reason to store it. More likely you're going to want to read/query this data at some point.
Given the append-only, ever-growing nature of your log data, it is unlikely that Firebase will offer the query capabilities that you need at that point. Therefore I'd recommend storing your data in a system that is more tailored to the use case: storing lots of data and querying it. A perfect example of such a system is Google's BigQuery.
A common way to get the data into BigQuery is to keep doing what you do now from the client: write it into the database. Then create a Cloud Function that triggers on incoming log data from your database, writes that data to BigQuery, and then deletes it from the database.
With this approach you're only using the Firebase Database for transient storage and doing the heavy lifting in BigQuery.
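A sketch of that pipeline, assuming logs live at /user-logs/{uid}/{timestamp} as in the question, and that a BigQuery dataset and table (here "app_logs.user_actions", with uid, action, and timestamp columns) already exist — those names are hypothetical:

```js
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const { BigQuery } = require('@google-cloud/bigquery');
admin.initializeApp();
const bigquery = new BigQuery();

exports.shipLogToBigQuery = functions.database
  .ref('/user-logs/{uid}/{timestamp}')
  .onCreate(async (snapshot, context) => {
    const row = {
      uid: context.params.uid,
      action: snapshot.val(),
      timestamp: Number(context.params.timestamp),
    };
    // Stream the row into BigQuery, then delete it from the database
    // so the RTDB only ever holds logs that haven't shipped yet.
    await bigquery.dataset('app_logs').table('user_actions').insert([row]);
    return snapshot.ref.remove();
  });
```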
I need to get a user profile document, which then needs to access two other documents in separate collections, before it returns. At the moment I have implemented this client side but it takes a while. Should I/Can I run this using Cloud Functions, so that I just call one GET and retrieve everything in one go, rather than calling separate get functions sequentially from within my app?
The database retrieval from separate collections would take a similar amount of time whether it's done from the client or Cloud Function.
Collection queries should be very fast on your indexed fields, so probably your problem is the way you are handling asynchronicity. Are you waiting for the result from the first collection before starting the second query? You could dispatch both queries at the same time to cut your waiting time.
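For example, instead of awaiting each read before starting the next, dispatch the two dependent reads together (modular JS SDK; the collection and field names here are hypothetical):

```js
import { getFirestore, doc, getDoc } from 'firebase/firestore';

const db = getFirestore();

async function loadProfile(uid) {
  const profileSnap = await getDoc(doc(db, 'profiles', uid));
  const profile = profileSnap.data();

  // Both follow-up documents are fetched in parallel, so the total wait
  // is roughly one round trip instead of two.
  const [settingsSnap, statsSnap] = await Promise.all([
    getDoc(doc(db, 'settings', profile.settingsId)),
    getDoc(doc(db, 'stats', profile.statsId)),
  ]);

  return { profile, settings: settingsSnap.data(), stats: statsSnap.data() };
}
```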
You can store all your documents in Firebase Storage, collect the references to the files, and download all the documents at the same time. You can also access them faster afterwards, because you can cache them on the SD card or in internal storage.
Then, if the documents need to be rewritten, there is no problem: when you download them from Storage again they automatically replace the old copies, and the user still has access to the documents. I tell you this because I'm doing something similar and it's working great!
Edit: As Sujil says, first set up authentication between the user and the database structure with Firebase, so that only people who are logged in or authenticated in your app can read/write files.
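A sketch of the parallel download step (modular JS web SDK; the file paths are placeholders — on Android you'd use the equivalent Storage calls and write the bytes to local storage):

```js
import { getStorage, ref, getBytes } from 'firebase/storage';

const storage = getStorage();
const paths = ['docs/terms.pdf', 'docs/privacy.pdf', 'docs/manual.pdf'];

// Fetch all files at once; each result is an ArrayBuffer that can then
// be cached locally for faster access on later opens.
const buffers = await Promise.all(
  paths.map((p) => getBytes(ref(storage, p)))
);
```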
If I try to download or read the file from the Firebase database or Firebase Storage, it will incur unnecessarily large quota costs and be financially unsustainable. And since Firebase only has an in-memory file system with a "tmp" directory, it seems impossible to deploy any files there.
The reason I think my request is very reasonable is that I could accomplish it by literally declaring the entire 80 MB of hashmap data in the code. Maybe I'll write a script that enumerates all those fields in JS and put it inside index.js itself? Then it never has to download from anywhere and won't incur any quotas. However, this seems like a very desperate solution to what seems to be a simple problem.
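For what it's worth, a less desperate variant of that idea: instead of generating 80 MB of object literals inside index.js, serialize the hashmaps to a JSON file next to the function source. Files in the functions directory are deployed with the code, so require() loads the data into RAM once per instance at cold start, with no Storage/Database reads and no download quota. The file name and lookup shape below are assumptions:

```js
const functions = require('firebase-functions');

// Loaded once when the instance starts, then kept in memory
// and reused across all invocations on that instance.
const hashmaps = require('./hashmaps.json');

exports.lookup = functions.https.onRequest((req, res) => {
  // Served straight from memory; keying by req.query.key is an assumption.
  res.json(hashmaps[req.query.key] || null);
});
```

You'd still need to provision the function with enough memory to hold the data (80 MB of JSON expands when parsed), but nothing is downloaded at request time.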
Cross post of https://www.quora.com/How-can-I-keep-objects-stored-in-RAM-with-Cloud-Functions
Is there any way to add Firebase 3 Storage security rules to limit how many files a single authenticated user can upload? For example, 100 files per user.
Or, alternatively, to update a file count in the Firebase Database once someone uploads a file to Storage, and later validate against that count.
I'm trying to solve the problem of how to deal with a user's ability to upload an unlimited amount of data to Storage.
It's not a simple solution, but...
https://medium.com/@felipepastoree/per-user-storage-limit-validation-with-firebase-19ab3341492d
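A sketch of one way to enforce such a limit (in the spirit of the linked article, though not its exact code): a Cloud Function runs when an upload finishes, counts the user's files, and deletes the new file if the limit is exceeded. It assumes uploads land under uploads/{uid}/...; the path layout and limit are hypothetical:

```js
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

const MAX_FILES_PER_USER = 100;

exports.enforceUploadLimit = functions.storage
  .object()
  .onFinalize(async (object) => {
    // Only act on objects under a per-user uploads prefix.
    const match = object.name.match(/^uploads\/([^/]+)\//);
    if (!match) return;
    const uid = match[1];

    const bucket = admin.storage().bucket(object.bucket);
    const [files] = await bucket.getFiles({ prefix: `uploads/${uid}/` });
    if (files.length > MAX_FILES_PER_USER) {
      // Over the limit: remove the file that was just uploaded.
      await bucket.file(object.name).delete();
    }
  });
```

It's enforcement after the fact rather than a rules-level block, so a determined user can still briefly write the 101st file, but it keeps per-user storage bounded without needing the file count in security rules.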