I have a lot of documents in my Firestore database that I want to delete through the Firebase console, but I want to be able to select and delete them in bulk instead of deleting them one by one.
Deleting many documents from your Firestore instance manually can be a hard task, especially if you have hundreds or thousands of them. I think the best option is to do it programmatically, either through an application or a Cloud Function.
In this documentation you can learn how to use the delete() method in many popular languages to delete documents, fields, and collections, and in this document I found a simple way to create a Cloud Function while learning how to delete collections.
In summary, there is no easy way to delete many documents, fields, or collections through the user interface. The best way is to do it in code.
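A minimal sketch of such a programmatic delete, assuming an initialized Admin SDK Firestore instance is passed in as `db` (e.g. the result of `admin.firestore()`); the collection name is a placeholder:

```javascript
// Split an array into groups of at most `size` items. Firestore batched
// writes are limited to 500 operations, so deletions must be chunked.
function chunk(items, size) {
  const groups = [];
  for (let i = 0; i < items.length; i += size) {
    groups.push(items.slice(i, i + size));
  }
  return groups;
}

// Delete a list of document references using batched writes, one commit
// per group of up to 500 deletions.
async function deleteDocuments(db, refs) {
  for (const group of chunk(refs, 500)) {
    const batch = db.batch();
    group.forEach((ref) => batch.delete(ref));
    await batch.commit();
  }
}

// Usage sketch (not run here; 'myCollection' is a placeholder):
//   const snapshot = await db.collection('myCollection').get();
//   await deleteDocuments(db, snapshot.docs.map((d) => d.ref));
```

Note that for very large collections you would fetch and delete in pages rather than loading every document reference at once.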
For the technically savvy people out there: I'm building a Flutter app with a Firestore backend, and I've been doing some research into the best way to structure my models. Tutorials online show different methods, and I can't figure out which one is best, as I want my app to be light but also use Firestore efficiently so it doesn't cost too much.
These are the approaches I've encountered so far:
Have a model for the Flutter object and another one for the Firestore object. Every time I get data from Firestore, I instantiate a Firestore object and map it into a Flutter object (or create a new Flutter object), and then have a listener there to update my whole app.
Have one model for the combined Flutter/Firestore object. Every time I get data from Firestore, I only need to instantiate it once, with no mapping. I have a listener there.
Get the data directly from Firestore without instantiating an object, display the documents there, and use a StreamProvider to get the data.
I'd really appreciate your help in structuring my app/project. Thanks.
I think you should consider all your requirements, both technical and business. The definition of your architecture should reflect those needs. If you rush to choose an architecture that later has to be redefined, it could be more expensive.
Once you have considered this, think about your architecture. Try to share your general workflow.
I want my app to be light but also use Firestore efficiently so it doesn't cost too much.
Do all three options you have shared fit your needs? Is there a step in your solution where any of the three options would fail?
I think you should base the architecture more on your needs than on whichever costs least.
I'm a newbie with Cloud Functions and I have some points of confusion.
Is there any difference between admin.firestore and functions.firestore?
Is admin.database for the Realtime Database?
So if Cloud Functions are basically written in JavaScript or TypeScript in a Node.js environment, then I can write them with JavaScript. But the documentation confused me about that, because whenever I read it, it shows things differently. For example,
the code above, which gets a document, uses document('some/doc'). But this other code uses doc('doc') to achieve the same functionality. If both come from Firestore, why do they differ from each other?
Can someone help me understand these questions? Thank you for all your support.
functions.firestore: is used to set up a trigger. You can set up a listener on a document path so that whenever an event happens on that document (create/update/delete, etc.), the function is executed. See how to create one here.
admin.firestore: is used to create a reference to a Firestore database. You can perform various operations on Firestore collections and documents through that reference using the Admin SDK. See the documentation.
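A sketch contrasting the two, assuming the firebase-functions `functions.firestore` builder and an `admin.firestore()` instance are passed in (as `functionsFirestore` and `db`) so the sketch stays self-contained; the `users/{userId}` path is a placeholder:

```javascript
// functions.firestore declares a trigger that runs inside Cloud Functions
// whenever a matching document changes. Note: trigger paths use .document().
function makeUserTrigger(functionsFirestore) {
  return functionsFirestore
    .document('users/{userId}')
    .onWrite((change, context) => {
      console.log('user changed:', context.params.userId);
    });
}

// admin.firestore() gives a client reference for on-demand reads and writes.
// References use .doc() — db.doc('users/alice') is equivalent to
// db.collection('users').doc('alice').
async function readUser(db, userId) {
  const snap = await db.doc(`users/${userId}`).get();
  return snap.exists ? snap.data() : null;
}
```

This also explains the document() vs doc() difference in the question: document() belongs to the trigger builder, while doc() belongs to Firestore references.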
I've combed through the GC Datastore documentation, but I haven't found anything useful about bulk-updating documents. Does anyone have best practices for bulk updates in Datastore? Is that even a valid use case for Datastore? (I'm thinking of something along the lines of MongoDB's Bulk.find.update or db.Collection.bulkWrite.)
You can use the projects.commit API call to write multiple changes at once. Most client libraries have a way to do this; for example, in Python use put_multi.
In C# the DatastoreDb.Insert method has an overload for adding multiple entities at once.
If this is a one-off, consider using gcloud's import function.
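A hedged sketch of a bulk write from Node.js, assuming a @google-cloud/datastore client is passed in as `datastore` and `entities` is an array of `{ key, data }` objects. Like put_multi in Python, the Node client's save() accepts an array and sends it as a single commit; the grouping size of 500 reflects the commonly cited per-commit mutation limit and may need adjusting:

```javascript
// Split entities into groups small enough for a single commit.
function groupEntities(entities, size) {
  const groups = [];
  for (let i = 0; i < entities.length; i += size) {
    groups.push(entities.slice(i, i + size));
  }
  return groups;
}

// Upsert many entities, issuing one commit per group of up to 500.
async function bulkUpsert(datastore, entities) {
  for (const group of groupEntities(entities, 500)) {
    await datastore.save(group); // one projects.commit call per group
  }
}
```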
I have a Firebase Realtime Database that was architected very poorly by the original developer, and I need to drastically change its structure and move the existing data around. Is it possible to migrate the existing data without literally copying and pasting items? You know, the sort of thing that's trivially easy with an ordinary database...
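One common approach is a one-off migration script. Here is a sketch, assuming an Admin SDK Realtime Database instance is passed in as `db`; the paths (`/usersOld`, `/users`) and the field names inside `transformUser` are placeholders for your own structure:

```javascript
// Pure reshaping step — easy to unit-test before touching live data.
function transformUser(old) {
  return {
    profile: { name: old.name, email: old.email },
    createdAt: old.created || null,
  };
}

// Read everything under the old path, reshape each child, and write the
// new structure with a single multi-path update (applied atomically).
async function migrateUsers(db) {
  const snapshot = await db.ref('/usersOld').once('value');
  const updates = {};
  snapshot.forEach((child) => {
    updates[`/users/${child.key}`] = transformUser(child.val());
  });
  await db.ref().update(updates);
}
```

Keeping the transform a pure function lets you verify the new shape against exported JSON before running the migration against the live database.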
I want to isolate each user's data from all other users. What is the best way to ensure a user can view and modify only her own data? My approach has been to:
Add a userId field on every collection
Configure every published collection to filter on userId.
Use simple-schema with collection2 and add autoValue: function() { return this.userId; } on the userId field of each schema to force the userId during validation.
Is this a good and correct approach? What is best practice?
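The steps above can be sketched as follows, assuming a Meteor server context; `Items` stands for any Mongo.Collection and all names are placeholders:

```javascript
// 1. Publication that only returns the current user's documents.
//    In a real app this would be registered as:
//    Meteor.publish('items', itemsPublication)
function itemsPublication() {
  if (!this.userId) return this.ready(); // no data for logged-out users
  return Items.find({ userId: this.userId });
}

// 2. collection2/simple-schema field definition that stamps the owner's
//    userId on insert, so clients cannot forge another user's id.
const userIdField = {
  type: String,
  autoValue: function () {
    if (this.isInsert) return this.userId;
  },
};
```

The autoValue runs during validation on the server, so the userId is set from the authenticated connection rather than trusted client input.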
#dk. This sounds like a good approach to me and is considered best practice as well (in my experience with Meteor).
Sounds solid. I actually use this approach in a quite large project I'm working on.
I also use composite collections using the reywood:publish-composite package.
As a result, some collections don't have userId keys, but the documents for the current user are selected based on a document from a related collection. I'm also dealing with the case where many documents are shared between users.
This affords some degree of normalization while working really well.