Fine-grained access control with AppSync/Firebase

Is it a good idea to build a completely "serverless" app with AppSync/Firebase when fine-grained access control is necessary?
I tried to build an app with Firebase, and then with AppSync, and it feels like these solutions are kind of crippling me. I started to think that maybe I'm still approaching the problem the "old" way, and that is what is crippling me rather than the tools.
Where I'm struggling is with access control.
Firebase has "Firebase rules" and AppSync has "VTL"(Apache Velocity Template Language), both offer relatively good solutions, "Firebase rules" is easier and cleaner, but VTL is more robust because it is basically a programing language.
The problem is that I'm trying to give the user access to documents in the database based on a "collection/table" of permissions. Each user has a document inside that "collection/table" with fine-grained permissions, and I need to read that document in order to know whether the user has access to the resource they are trying to read/write.
With both Firebase and AppSync I can read the DB, but both have their limits:
Firebase Rules has request limits, and that is problematic if a user has multiple "permission groups".
AppSync is more flexible, but still limited, and I would rather use my language of choice than VTL if I'm going to write some logic. In addition, I would rather have that code inside my project locally than only in the cloud, accessible via the GUI.
So, in the end, it feels like both solutions drive me into adding another layer in front of them in order to do more complex stuff, which can be either functions or an entire app.
But then, why do I need all of their APIs? Having another layer in front of AppSync/Firebase basically forces me to reimplement GraphQL/Firebase's API, and then, why not build it with another tool?
So, am I doing it all wrong? Would it be better to have an app deployed on App Engine or a similar solution (and thus lose the advantages of functions)?
Note: I'm sorry if after all this reading it's still not clear; English is not my first language.

AWS AppSync recently added Pipeline Resolvers, which sound like a perfect fit for your use case. You compose the GraphQL resolver from a chain of resolver functions, and your auth check against the permissions collection/table can be implemented as a reusable function.
Take a look at the Pipeline Resolvers tutorial to see if it meets your needs.
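As a rough illustration, if one step of the pipeline is backed by a Lambda data source, the reusable permission check could look roughly like this (the table name, attribute names, and identity fields are hypothetical, not part of AppSync itself):

// Hypothetical Lambda data source used as one step in an AppSync pipeline resolver.
// It loads the caller's permission document and throws if the required permission
// is missing. Table name, attribute names, and identity fields are made up.
const AWS = require('aws-sdk');
const ddb = new AWS.DynamoDB.DocumentClient();

exports.handler = async (event) => {
  const userId = event.identity.sub;                    // e.g. Cognito user id, forwarded by the request mapping template
  const required = event.arguments.requiredPermission;  // whatever the previous pipeline step decided is needed

  const { Item } = await ddb.get({
    TableName: 'UserPermissions',                       // hypothetical permissions table
    Key: { userId },
  }).promise();

  if (!Item || !(Item.permissions || []).includes(required)) {
    throw new Error('Unauthorized');
  }

  return { authorized: true };                          // the next function in the pipeline does the real read/write
};

The glue between the pipeline steps is still VTL mapping templates, but the actual permission logic lives in your language of choice and in your own repository.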

Related

Update clients after updating Firestore collection name

I have a Firestore collection that I need to rename.
To do that I'll have to do two things: one, rename the collection; two, update my app (only web right now) to use the new collection name.
My problem is that if I just go ahead and do that, any user that has not refreshed the app won't be able to find the renamed collection.
So, my question is: Is there any best practice to handle this scenario?
I can think of a couple of options:
Somehow forcing a reload of the web apps immediately after renaming the collection.
Set a feature flag so that the web apps enter maintenance mode while I update everything, and then reload the web apps once the change is finished. Unfortunately, the currently deployed web app doesn't have a maintenance mode to enable, so this doesn't seem to be a valid solution.
However, I'd like to hear about other options. There might be some best practice that I'm missing. Moreover, I'm aware this problem might be more general than just Firestore, for example when changing a REST API endpoint, so I guess there must be some tried and tested solutions out there.
I tried searching for best practices regarding this and couldn't find any.
Also, if I were consuming a REST API it would be easier to solve, because I could rename the collection in the DB and keep the API unchanged. But given that Firestore gets consumed directly from the web app, I don't have this benefit.
Locking out outdated clients is a common practice, but it leads to a worse user experience. It also requires a mechanism for clients to detect that they're outdated, which you don't seem to have.
The most common practice I know of is to perform dual writes to both the old and the new collection while clients are updating.
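As a rough sketch of that dual-write window with the Firebase web SDK (the collection names are placeholders, and this assumes the modular v9 API):

// Sketch: while outdated clients are still live, every write goes to both the
// old and the new collection in a single atomic batch.
import { initializeApp } from 'firebase/app';
import { getFirestore, writeBatch, doc } from 'firebase/firestore';

const app = initializeApp({ /* your Firebase config */ });
const db = getFirestore(app);

async function saveArticle(id, data) {
  const batch = writeBatch(db);
  batch.set(doc(db, 'articles_old', id), data);  // still read by outdated clients
  batch.set(doc(db, 'articles_new', id), data);  // read by updated clients
  await batch.commit();
}

Once you can see that essentially all clients have picked up the new version, you drop the writes to the old collection and backfill or delete it.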

Use different data for production and development Firebase sites

I have a CI/CD pipeline with Google Cloud Build triggers that deploy my code to different sites depending on which branch I push to. The develop site is a live test, the final check before I merge to master, which triggers a deploy of master to the production site.
Currently, both sites use the same Firebase Firestore DB, and any document changed on the develop site will also be changed on the production site.
What I want to avoid is creating another Firebase project to push the develop code to, with a different database, because that means I need a separate set of credentials and would have to copy the same functions over to the new project every time I change them. That's not maintainable and is a lot of work.
What I would like is some way for the develop site to only have access to part of the firestore database, and the production site to have access to another part.
How do people do this? Is it even possible? Is there a better way? One alternative I can think of is using authentication and creating separate accounts for testing with different access permissions, but this seems like a workaround rather than an ideal solution.
What you're trying to do sounds like a lot more hassle than using multiple projects, which is the documented and strongly preferred solution. Putting everything in one project is a huge anti-pattern in Firebase and Google Cloud, and it will cause you more problems in the long run, in addition to increasing the risk of catastrophic failure if you manage to misconfigure something in that one project.
It's perfectly maintainable to have multiple projects like this, if you apply some scripting to automate the work. This is very common, and I strongly suggest thinking through how this would work for you.
Your CI/CD pipeline could definitely check out your updates from source control and deploy them to whatever other project environments you have set up. It's very common to manage different credentials and configurations for use in CI/CD.
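As one possible sketch of the web side, the app can pick its Firebase config from an environment flag that the build trigger injects, so the same code deploys against either project (the project IDs and the APP_ENV variable below are placeholders):

// Sketch: choose the Firebase project based on a flag injected by the CI/CD
// pipeline (e.g. via the bundler's environment substitution). Both config
// objects are placeholders for the real ones from the Firebase console.
import { initializeApp } from 'firebase/app';

const configs = {
  production: { projectId: 'myapp-prod', apiKey: '...', authDomain: 'myapp-prod.firebaseapp.com' },
  develop:    { projectId: 'myapp-dev',  apiKey: '...', authDomain: 'myapp-dev.firebaseapp.com' },
};

const env = process.env.APP_ENV || 'develop';   // set per branch in the Cloud Build trigger
export const app = initializeApp(configs[env]);

On the functions side, the Firebase CLI's project aliases (firebase use <alias>) let the same pipeline deploy the same functions directory to whichever project the branch maps to.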

Firebase Cloud Firestore "Request/response" documentation

Most of "bigger" project I was working with was using REST API for Frontend->Backend communication. I was using Firebase Cloud Firestore for some small (one-day/hackathon) projects. Now I'm thinking about using Firestore for some bigger project but I'm not sure if this will work.
For "standard", REST api project I had Swagger documentation, where each developer could see list of all endpoints with request/response data structures. How does it work with Firestore? Can I create similar documentation for developers to check data structure, so they will know what can they add and what should they read? Or maybe there is another way?
I'm thinking, maybe there is no tool for this kind of documentation because frontend data structures are defining database structure? But what if I am connecting database from two or more platform (ex. web, mobile and cloud functions)? How can I synchronize knowledge about data structures between all the developers?
I was looking for some answers but couldn't find anything useful expect advice to manually maintain some documentation. How does it work in your projects? Is there some automation? Manually written documentation? Or no documentation - everything "in code"?
I understand your concerns, but unfortunately there is no tool available for Cloud Firestore that generates documentation for the database structure the way Swagger does.
I believe you can do it programmatically.
From Generating Swagger Docs in Firebase Cloud Functions project:
I'm using Express and Node.js in my Firebase Function implementations, and for me, Swagger doc generation can be implemented via the following libraries:
https://github.com/scottie1984/swagger-ui-express
https://github.com/Surnet/swagger-jsdoc
You can find other libraries at:
https://swagger.io/tools/open-source/open-source-integrations
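As an illustration of how the two libraries above are typically wired together inside an Express-based Cloud Function, here is a minimal sketch (the route, API title, and file globs are assumptions):

// Sketch: serve generated Swagger docs from an Express app exported as a
// Firebase Cloud Function. Titles, versions, and file globs are placeholders.
const functions = require('firebase-functions');
const express = require('express');
const swaggerJsdoc = require('swagger-jsdoc');
const swaggerUi = require('swagger-ui-express');

const app = express();

const spec = swaggerJsdoc({
  definition: {
    openapi: '3.0.0',
    info: { title: 'My API', version: '1.0.0' },
  },
  apis: ['./routes/*.js'],   // files containing @swagger JSDoc annotations
});

app.use('/docs', swaggerUi.serve, swaggerUi.setup(spec));

exports.api = functions.https.onRequest(app);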
In addition to the responses there, the following service allows you to access Firestore metadata (click the Explorer tab), which looks promising for your use case, though not necessarily more so than the links above: https://aapi.io/api-directory/Google_CloudFirestore_GoogleCloudFirestoreAPI_v1beta1

DTAP: storing script variables in a simple and fast way

I am setting up a DTAP environment for Google App Maker. Google App Maker supports working in a single file very well; however, there is one use case that I would like to simplify.
For each deployment I need to "know" certain things in the back end script. Things like the ip address of the SQL server, or usernames and passwords. This information needs to be retrieved fast and often, given the stateless nature of google.script.run.
The best solution so far is a settings form, combined with Google Drive tables and caching. This works, but it is not simple, and things could fail easily. The other approach is hard-coding the values and linking them to the deployment URL. This is fast and simple, but it also means that all the credentials are in the source.
I am looking for a better solution. Apps Script used to have script properties. Is there a similar option in App Maker, with a UI to maintain the settings?
There is no built-in UI to manage script properties, but App Maker's runtime (Apps Script) provides an API to perform CRUD operations on them:
PropertiesService.getScriptProperties().setProperty('testKey', 'testValue');
...and you can 'easily' build a UI on top of this API. The answer to this question highlights the major steps to achieve that: Google App Maker how to create Data Source from Google Contacts
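As a small sketch of how that might look in practice, the server script can expose getter/setter functions around script properties and the App Maker client can call them via google.script.run (the key name and widget wiring below are hypothetical):

// Server script (Apps Script): settings live in script properties, so they are
// shared by the deployment but never checked into source.
function getSetting(key) {
  return PropertiesService.getScriptProperties().getProperty(key);
}

function setSetting(key, value) {
  PropertiesService.getScriptProperties().setProperty(key, value);
}

// Client script (hypothetical widget wiring):
// google.script.run
//   .withSuccessHandler(function(value) { app.pages.Settings.descendants.SqlHost.value = value; })
//   .getSetting('SQL_SERVER_IP');

If the lookups happen on every google.script.run call, CacheService can sit in front of the property read to keep it fast.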
Here is a feature request for the first party support. You can up-vote it by giving it a star:
https://issuetracker.google.com/issues/73584947

How to migrate data from Parse.com to Firebase

I have two production apps that are currently using Parse.com. I have no plans to use Parse Server, and I want to switch to Firebase. I was wondering if there is a way to migrate my database from Parse.com to Firebase.
There are differences between Parse and Firebase that make a straight migration not as easy as you would hope.
Parse is based on a relational database, whereas Firebase stores all its data in JSON - thus a "copy and paste" job isn't going to work here.
On top of that the way that the two platforms organise user authentication is completely different.
So unfortunately no easy solution here.
Firebase has an import JSON option, so if you get your data out of Parse.com as JSON, it can be imported.
However, the structure Parse uses to create relationships between data is (probably) going to be different from Firebase's, so it's going to take some planning and coding to make the transition.
Once we had a plan, we found it easiest to just craft an importer App that would take the Parse.com data structure, and massage it to a Firebase format that worked for our app.
In some cases we had to start from scratch as the thought process is different from Parse (objects) to Firebase.
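For a feel of what such an importer can look like, here is a minimal sketch that flattens a Parse class export into a JSON tree that Firebase's import accepts (the class name, fields, and pointer handling are assumptions about your data, not a general-purpose tool):

// Sketch: convert a Parse class export (a "results" array of objects with
// objectId fields) into a JSON tree keyed by id, which Firebase's
// "Import JSON" accepts. The 'owner' pointer handling is just an example
// of flattening a relation.
const fs = require('fs');

const parseExport = JSON.parse(fs.readFileSync('Post.json', 'utf8'));

const firebaseTree = { posts: {} };
for (const obj of parseExport.results) {
  firebaseTree.posts[obj.objectId] = {
    title: obj.title,
    createdAt: obj.createdAt,
    // Parse pointers become plain id references in Firebase's JSON model.
    ownerId: obj.owner ? obj.owner.objectId : null,
  };
}

fs.writeFileSync('firebase-import.json', JSON.stringify(firebaseTree, null, 2));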
The Firebase API is completely different from the Parse one. It means that you have to learn another API, SDKs, etc., and rewrite your frontend code.
There is no easy path to migrate from Parse to Firebase.
Moreover, I think it is not a good decision. The Parse Server community is growing and it is becoming even better than the original Parse. In a short time, Parse Server will become the best framework for backend and API development.
My recommendation is to migrate to a Parse hosting provider. Using this kind of solution you will keep the same Parse APIs and features. It will not require you to learn another technology or rewrite any frontend code.
You can find some options in the Parse Server repository:
https://github.com/ParsePlatform/parse-server#parse-server-sample-application
For full disclosure, I am a co-founder of https://www.back4app.com, which was the first Parse Server mover.
