Use different data for production and develop Firebase sites - firebase

I have a CI/CD pipeline with Google Cloud Build triggers that deploy my code to different sites depending on which branch I push to. The develop site is a live test - the final check before I merge to master, which triggers a deploy of master to the production site.
Currently, both sites use the same Firebase Firestore database, so any document changed on the develop site is also changed on the production site.
What I want to avoid is creating another Firebase project with a different database to push the develop code to, because that would mean a separate set of credentials, and I'd have to copy the same functions over to the new project every time I change them. That's a lot of work and not maintainable.
What I would like is some way for the develop site to have access to only part of the Firestore database, and the production site to have access to another part.
How do people do this? Is it even possible? Is there a better way? One alternative I can think of is using authentication and creating separate test accounts with different access permissions, but this seems like a workaround rather than the ideal solution.

What you're trying to do sounds like a lot more hassle than using multiple projects, which is the documented and strongly preferred solution. Putting everything in one project is a huge anti-pattern in Firebase and Google Cloud, and it will cause you more problems in the long run, in addition to increasing the risk of catastrophic failure if you manage to misconfigure something in that one project.
It's perfectly maintainable to have multiple projects like this, if you apply some scripting to automate the work. This is very common, and I strongly suggest thinking through how this would work for you.
Your CI/CD pipeline could definitely check out your updates from source control and deploy them to whatever other project environments you have set up. It's very common to manage different credentials and configurations for use in CI/CD.
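As a concrete illustration (a minimal sketch, not your exact setup): the app can pick its Firebase config from an environment variable that each Cloud Build trigger injects, while the deploy step only varies the target project. The project IDs, API keys and the `FIREBASE_ENV` variable below are placeholders.

```typescript
// firebase-env.ts -- choose the Firebase web config for the current environment.
// Project IDs, API keys and FIREBASE_ENV are placeholders; a bundler is assumed
// to substitute process.env.FIREBASE_ENV at build time.
import { initializeApp } from "firebase/app";

const configs = {
  production: {
    projectId: "my-app-prod",
    apiKey: "PROD_WEB_API_KEY",
    authDomain: "my-app-prod.firebaseapp.com",
  },
  develop: {
    projectId: "my-app-dev",
    apiKey: "DEV_WEB_API_KEY",
    authDomain: "my-app-dev.firebaseapp.com",
  },
} as const;

const env = (process.env.FIREBASE_ENV ?? "develop") as keyof typeof configs;

// Each site now talks to its own project, and therefore its own Firestore database.
export const app = initializeApp(configs[env]);
```

The deploy command itself can stay identical for both branches and only vary the target, e.g. `firebase deploy --project my-app-dev` from the develop trigger and `firebase deploy --project my-app-prod` from master.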

Related

How to avoid modifying Firebase Firestore data when working on a new version of an app

I have developed an app (my first app) that's currently on the App Store and Play Store. I'm using Firebase/Firestore as a back-end/database. I'm about to start work on the next version of the app, which will require me to modify my database for testing and such. However, I currently have about 100 regular users, and I wouldn't want them to experience any weird data changes in their app while I'm developing.
I'm simply not sure how this is done when you further develop an existing app. I was thinking of creating new collections in Firestore just for testing and hooking my app up to them, but I don't know if that's the best way to go.
How do I work on a new version of an app without my users seeing any changes in their data?
If your data model is (very) simple you could create some specific, temporary collections as you describe, but this is quite an error-prone approach (risk of modifying existing data, wrong or missing security rules, etc.) and is not recommended.
One standard approach is simply to create another Firebase project which is totally isolated from the production project. You will need to change the Firebase configuration in your Flutter app. If you need existing production data in your test Firebase project you could use the export/import mechanism.
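For the export half of that export/import mechanism, the managed Firestore export can also be scripted. A rough sketch using the Node Cloud client (the project ID and bucket name are placeholders; you would then run an import against the test project, e.g. with `gcloud firestore import`):

```typescript
// export-prod-firestore.ts -- start a managed export of the production database.
// "my-app-prod" and the bucket name are placeholders for your own values.
import { v1 } from "@google-cloud/firestore";

async function exportProductionData(): Promise<void> {
  const client = new v1.FirestoreAdminClient();
  const databaseName = client.databasePath("my-app-prod", "(default)");

  // Kicks off a long-running export operation; the data lands in Cloud Storage
  // and can then be imported into the test project's database.
  const [operation] = await client.exportDocuments({
    name: databaseName,
    outputUriPrefix: "gs://my-app-prod-exports",
    collectionIds: [], // empty array = export all collections
  });

  console.log("Export operation started:", operation.name);
}

exportProductionData().catch(console.error);
```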

Generate site when headless CMS modifies database

I've been reading about how Nuxt can generate a static site when a client makes a request to view the website. We are planning to build a headless CMS to populate the database with the data the website needs. This data will only be changed when it is saved in the headless CMS.
My question is: since this data only changes when it is changed in the headless CMS, isn't it possible to just generate the site whenever it is modified from the headless CMS, and then serve that site to the client, to reduce server costs?
Is it possible to do this with Nuxt, or is there some other way to achieve it?
We are planning on using Firebase as a backend.
There's nothing explicitly preventing Nuxt from being rebuilt each time you change an item in your DB. The part that matters is how you tell your app to rebuild itself.
By far the simplest way is using some sort of "build hook". See Netlify's docs here for a quick overview of what they are. However, this only really works if you're using a headless CMS that can send hooks on save, and a build provider that can trigger builds using those hooks.
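With Firebase as the backend, that hook could be sent by a Cloud Function that fires whenever the CMS writes to Firestore. A sketch only; the `pages/{pageId}` path and the build hook URL are placeholders:

```typescript
// functions/src/triggerBuild.ts -- rebuild the static site when CMS content changes.
// BUILD_HOOK_URL and the "pages/{pageId}" path are placeholders; a Node 18+
// runtime is assumed for the global fetch. This uses the first-generation
// firebase-functions Firestore trigger API.
import * as functions from "firebase-functions";

const BUILD_HOOK_URL = "https://api.netlify.com/build_hooks/your-hook-id";

export const triggerBuild = functions.firestore
  .document("pages/{pageId}")
  .onWrite(async () => {
    // Any create, update or delete of a page kicks off a fresh static build.
    const res = await fetch(BUILD_HOOK_URL, { method: "POST" });
    console.log("Build hook responded with status", res.status);
  });
```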
You will absolutely save on server costs using this sort of method, but beware: if you have a lot of user generated content triggering builds, your build cost can easily outweigh the server costs. You also need to be aware that builds generally take a few minutes, so you won't see instant changes on your site.
The other option you have is foregoing static site generation in favour of SSR, which can dynamically load and render your content, completely avoiding the need to build every time a new DB change is made. This is what I'd consider the best alternative if you do indeed have a lot of user generated content.
It's hard to give any further advice without knowing the specifics of the CMS or build provider though.

Is there a recommended way to deal with deploying pages from local dev to prod?

For example, say I am working on a FAQ page locally. I create whatever plugins/templates etc I need. Then, locally, I proceed to add the plugins to the page, debug, modify whatever. Now it comes time for me to deploy this to production.
Am I left with redoing all the work again, copying/pasting the content and rebuilding the FAQ page, or is there an alternative way? Things I have thought of:
Create a data migration representing the structure/content
Sync the production db to the dev db, make my changes and push it all back during a downtime window.
Are there any other solutions around in the Django CMS community for handling this kind of thing?
The data migration seems like the best approach, but I figured I would ask to be sure I wasn't missing anything.
I am not aware of any out-of-the-box solution to this problem. Data migration seems fine, though if you are planning to integrate it into the actual migrations framework, I would be worried about making it too coupled to the state of the database (e.g. if you are inserting the content into a specific page ID).
What we have been doing in our projects is to create a special app that provides additional commands for the management CLI. You can then keep the migrations separate from data population. Once you deploy your plugin structure live, you can simply run a command to populate the database.
After you have seeded the data, you can simply disable or completely remove the temporary app without any effect on your main application. Keeping data population in the migrations framework, by contrast, wastes space and tightly couples the DB migration to your DB contents.

How to separate configurations in ASP.NET?

My team is doing web development (ASP.NET, WCF), and we are at an early stage where everyone needs to make DB changes and use their own sample data.
We use a dedicated DB server, and we want each developer to develop against a separate DB.
What we appear to need is the ability to configure the connection string on a per-developer basis in a source-controlled way. We might also have other configuration settings that need per-developer values, and finally we'll need to maintain a set of configuration settings that are common to all developers.
Can anyone suggest a best practice here?
PS: A similar issue appears when we want to deploy a built application to different environments (test, stage, production) without having to manually tweak configurations (except perhaps setting the environment name).
You can use config transforms for your deployment to different environments. That's easy enough. Scott Hanselman did a pretty awesome video on it here.
For your individual developer db problem, there isn't any particularly elegant solution I can think of. Letting each developer have a unique configuration isn't really a "best practice" to begin with. Once everyone starts integrating their code, you could have a very ugly situation on your hands if everyone wrote their code against a unique db and configuration set. It almost guarantees that code won't perform the same way for two developers.
Here is what I would recommend, and have done in the past.
1. Create a basic framework for your database, on one database on your test DB server.
2. Create a Database Project as part of your solution.
3. Use .NET's built-in Schema Compare to write your existing database to the database project.
4. When someone needs to change the database, they should first get latest on the Database Project, then make their changes, and then repeat step 3 to write their changes back to the project.
Using this method, it is also very easy for developers to deploy a local instance of the database that matches the "main" database, make changes, and write those changes back to the project.
Maybe not the most elegant solution, but we've chosen to read the connection string from a different place when the project is built using the Debug configuration.
We are using the registry, and it has to be maintained manually.
It requires some extra coding, but the code that reads the registry is only compiled in debug builds (#if DEBUG), so there is no performance hit in production.
Hope this helps as well.

How to avoid chaotic ASP.NET web application deployment?

Ok, so here's the thing.
I'm developing an existing web application (it started out as an ASP Classic app, so you can imagine :P) under ASP.NET 4.0 and SQL Server 2005. We are 4 developers using local instances of SQL Server 2005 Express, each with the source code and the Visual Studio database project.
This web app has several "universes" (that's what we call them). Every universe has its own database (currently on the same server), but they all share the same schema (tables, sprocs, etc.) and the same source/site code.
So deploying manually is really annoying, because I have to deploy the source code and then run the SQL scripts by hand on each database. I know that manual deployment can cause problems, so I'm looking for a way to automate it.
We've recently created a Visual Studio Database Project to manage the schema and generate the diff-schema scripts with different targets.
I have no idea how to put the pieces together.
I would like to:
Have a way to make a "sync" deploy to a target server (thankfully I have full RDC access to the servers, so I can install things if required). By "sync" deploy I mean that I don't want to redeploy the whole application, because it has lots of files; I just want to deploy the files that are new or changed.
Generate diff-SQL update scripts for every database target and combine them into a single script. For this I would need a list of the database names somewhere.
Copy the site files and execute the generated SQL script in an easy, automated way.
I've read about MSBuild, MS Web Deploy, NAnt, etc., but I don't really know where to start, and I really want to get rid of this manual deployment.
If there is a better and easier way of doing it than what I enumerated, I'd be pleased to hear it.
I know this is not a very specific question, but I've googled a lot about it and still can't figure out how to do it. I've never used any automation tool for deployment.
Any help will be really appreciated.
Have you heard of the term Multi-Tenancy? It might be worth looking it up to see if it applies to your "Multiverse", especially if one universe is never accessed by another...
See:
http://en.wikipedia.org/wiki/Multitenancy
http://msdn.microsoft.com/en-us/library/aa479086.aspx
UPDATE:
If the application and database are the same for each client (or tenant), I believe there are platforms that may help in providing the same code/DB as a SaaS application, i.e. another application/configuration layer on top that can handle the deployments etc.
I think these are called Platform as a Service (PaaS) applications:
see: http://en.wikipedia.org/wiki/Platform_as_a_service
Multi-Tenancy in your case may be possible, depending on client security requirements, with a bit of work (or a lot of work):
Option 1:
You could use one instance of the application, i.e. deploy the site once and connect to a different database for each client. You would need to differentiate each client by URL to isolate content/data, by setting a connection string for each, etc. (This would reduce your site deployments to one deployment.)
Option 2:
You could use a single instance of the application and a single database. You would need to add a "TenantID" to each table and adjust all your code to accept a TenantID to ensure data security/isolation. Again, you would need to detect/differentiate the tenant based on the URL and set the TenantID on the session used for every database call. (This would reduce your site and database deployments to one of each.)
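The tenant-resolution piece is the same idea in both options: map the incoming host name to a "universe", then use that universe's connection string (Option 1) or TenantID (Option 2) for every data access in the request. A framework-agnostic sketch of that lookup, shown in TypeScript purely for illustration; the host names and connection strings are invented, and in your case this logic would live in the ASP.NET request pipeline:

```typescript
// tenants.ts -- resolve the current "universe" (tenant) from the request host.
// Host names and connection strings are invented for illustration only.
interface Tenant {
  id: number;               // TenantID for Option 2 (single shared database)
  connectionString: string; // per-universe database for Option 1
}

const tenantsByHost: Record<string, Tenant> = {
  "alpha.example.com": { id: 1, connectionString: "Server=db;Database=Universe_Alpha;" },
  "beta.example.com":  { id: 2, connectionString: "Server=db;Database=Universe_Beta;" },
};

// Called once per request, before any database access.
export function resolveTenant(host: string): Tenant {
  const tenant = tenantsByHost[host.toLowerCase()];
  if (!tenant) {
    throw new Error(`Unknown tenant host: ${host}`);
  }
  return tenant;
}
```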
