Is there a recommended way to deal with deploying pages from local dev to prod? - django-cms

For example, say I am working on an FAQ page locally. I create whatever plugins/templates etc. I need. Then, locally, I proceed to add the plugins to the page, debug, and modify whatever needs it. Now it comes time for me to deploy this to production.
Am I left with redoing all the work again (copy/pasting the content and rebuilding the FAQ page), or is there an alternative way? Things I have thought of:
Create a data migration representing the structure/content
Sync the production db to the dev db, make my changes and push it all back during a downtime window.
Are there any other solutions around in the Django CMS community for handling this kind of thing?
The data migration seems like the best approach, but I figured I would ask to be sure I wasn't missing anything.

I am not aware of any out-of-the-box solution to this problem. A data migration seems fine, though if you are planning to integrate it into the actual migrations framework, I would be worried about coupling it too tightly to the state of the database (e.g. if you are inserting the content into a specific page ID).
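For illustration, a data migration along those lines might look like the sketch below, assuming django-cms 3.x; the template name, placeholder slot, plugin type and language are hypothetical, not taken from the question. Note how the content creation gets baked into the migration history, which is exactly the coupling to worry about:

```python
# A minimal sketch of the data-migration option, assuming django-cms 3.x.
# The "faq.html" template, "content" slot and TextPlugin body are placeholders.
from django.db import migrations


def create_faq_page(apps, schema_editor):
    # Importing cms.api directly couples this migration to the installed
    # django-cms version and to the current database state.
    from cms.api import add_plugin, create_page

    page = create_page(title="FAQ", template="faq.html", language="en")
    placeholder = page.placeholders.get(slot="content")
    add_plugin(placeholder, "TextPlugin", "en",
               body="<h2>Frequently asked questions</h2>")


class Migration(migrations.Migration):

    # Hypothetical app and migration name; depend on your latest migration.
    dependencies = [
        ("faq", "0001_initial"),
    ]

    operations = [
        migrations.RunPython(create_faq_page, migrations.RunPython.noop),
    ]
```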
What we have been doing in our projects is to create a special app that provides additional management commands. You can then keep the schema migrations separate from the data population: once you deploy your plugin structure live, you simply run a command to populate the database.
After you have seeded the data, you can disable or completely remove the temporary app without any effect on your main application. Keeping the data population inside the migrations framework, by contrast, wastes space and tightly couples the schema migrations to your database contents.
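As a rough sketch of that seeder-app approach (again assuming django-cms 3.x, with hypothetical app, command, template and field names), the management command might look like this:

```python
# seeder/management/commands/seed_faq.py
# A minimal sketch of a throwaway seeder app; everything named here is
# illustrative, not part of the original question.
from django.core.management.base import BaseCommand

from cms.api import add_plugin, create_page
from cms.models import Page


class Command(BaseCommand):
    help = "Populate the FAQ page structure and content."

    def handle(self, *args, **options):
        # Look the page up by reverse_id rather than a hard-coded primary key,
        # so the command does not depend on the state of a particular database
        # and can be re-run safely.
        if Page.objects.filter(reverse_id="faq").exists():
            self.stdout.write("FAQ page already exists; nothing to do.")
            return

        page = create_page(
            title="FAQ",
            template="faq.html",
            language="en",
            reverse_id="faq",
        )
        placeholder = page.placeholders.get(slot="content")
        add_plugin(placeholder, "TextPlugin", "en",
                   body="<h2>Frequently asked questions</h2>")
        page.publish("en")
        self.stdout.write(self.style.SUCCESS("FAQ page created and published."))
```

Using a reverse_id instead of a primary key keeps the command independent of any particular database, and once it has run in production the whole seeder app can be deleted.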

Related

Use different data for production and develop Firebase sites

I have a CI/CD pipeline with Google Cloud Build triggers that deploys my code to different sites depending on which branch I push to. The develop site is a live test, the final check before I merge to master, which in turn triggers a deploy of master to the production site.
Currently, both sites use the same Firebase Firestore database, so any document changed on the develop site is also changed on the production site.
What I want to avoid is creating another Firebase project to push the develop code to with a different database, because that would mean a separate set of credentials and copying the same functions over to the new project every time I change them. That's not maintainable and is a lot of work.
What I would like is some way for the develop site to have access to only part of the Firestore database, and the production site to have access to another part.
How do people do this? Is it even possible? Is there a better way? One alternative I can think of is using authentication and creating separate test accounts with different access permissions, but that seems like a workaround rather than an ideal solution.
What you're trying to do sounds like a lot more hassle than using multiple projects, which is the documented and strongly preferred solution. Putting everything in one project is a huge anti-pattern in Firebase and Google Cloud, and it will cause you more problems in the long run, in addition to increasing the risk of catastrophic failure if you manage to misconfigure something in that one project.
It's perfectly maintainable to have multiple projects like this, if you apply some scripting to automate the work. This is very common, and I strongly suggest thinking through how this would work for you.
Your CI/CD pipeline could definitely check out your updates from source control and deploy them to whatever other project environments you have set up. It's very common to manage different credentials and configurations for use in CI/CD.
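As a sketch of that idea, here is one way a build step could map the branch being built to a target Firebase project. The project IDs, credential paths and the use of a small Python wrapper (rather than Cloud Build YAML or a shell script) are assumptions for illustration; it also assumes the Firebase CLI is installed and that the CI job authenticates via GOOGLE_APPLICATION_CREDENTIALS.

```python
# deploy_by_branch.py -- hypothetical helper a CI step could invoke.
import os
import subprocess
import sys

# One Firebase project per environment, each with its own Firestore database.
# Project IDs and credential paths below are placeholders.
PROJECTS = {
    "develop": {"project_id": "myapp-dev", "credentials": "/secrets/dev-sa.json"},
    "master": {"project_id": "myapp-prod", "credentials": "/secrets/prod-sa.json"},
}


def deploy(branch: str) -> None:
    env_cfg = PROJECTS.get(branch)
    if env_cfg is None:
        print(f"No deployment configured for branch {branch!r}, skipping.")
        return

    # Same code and functions, different target project (and therefore database).
    env = dict(os.environ, GOOGLE_APPLICATION_CREDENTIALS=env_cfg["credentials"])
    subprocess.run(
        ["firebase", "deploy", "--project", env_cfg["project_id"]],
        check=True,
        env=env,
    )


if __name__ == "__main__":
    deploy(sys.argv[1] if len(sys.argv) > 1 else "develop")
```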

CMS - How to work with multiple environments? Do I really need them?

I've never worked with any CMS before and I simply want to play with one. I come from a .NET background, so I was thinking of choosing Orchard Core CMS.
Let's imagine a very simple scenario: together with a colleague, I'd like to create a blog. As I'm used to working on web-based business systems and applications, it's normal for me to work with a code repository, multiple environments (dev/test/stage/prod), CI/CD, and database changes via migrations or scripts.
Now the question is: do I need all of this when working on our blog with a CMS?
To be more specific, I can ask a few questions:
Should I create the blog locally (on my PC) with the CMS, create a few articles, and then deploy it to the web, or should I create the blog directly on the internet and add articles in the production environment?
How do I synchronize databases between environments (dev/prod)?
I can add that, as I do not expect many visitors on the website, I was thinking of using Orchard Core CMS together with SQLite. I also expect to customize code, add new modules, extend existing ones, etc., not only add content (articles). Please take that into consideration when answering.
So basically my question is: what should the workflow be for someone who wants to create, administer, and maintain a CMS site (say, a blog), either alone or as a team?
Should I work and create content locally, then publish it and somehow synchronize both the application and the database (the database is my main question mark, also in the context of how to do that properly with SQLite)?
Or should all the changes, code and content alike, simply be managed directly on the server, i.e. the production environment?
Excuse me if the question is silly or hard to understand, but I'm looking for any advice, as I really haven't found good examples or information about this; maybe I'm looking in the wrong direction entirely.
Thanks in advance.
Great question, not at all silly ;)
When dealing with a CMS, you need to think about the data/content in very different terms from the code/modules, despite the fact that the boundary between them is not always completely obvious.
For Orchard, the recommendation is not to install modules in production, but to have a dev - staging - production type of environment: install new modules in a dev environment, test them in staging, and then deploy to production when it's safe to do so. Depending on the scale of the project, staging may be skipped for a more agile dev-to-prod setup, but the idea remains the same and is not very different from any modular application.
Then you have the activation and configuration of the settings of the modules you deploy. Because in a CMS like Orchard, those settings are considered data and stored in the database, they should be handled like content. This includes metadata such as the very shape of the content of your site: content types are data.
Data is typically not deployed like code is, through staging and prod environments (although it can be, to a degree; more on that in a moment). One reason for this is that a CMS will often feature user-provided data, such as reviews, ratings, comments or usage stats. Synchronizing all of that two ways is very impractical. Another, even more important reason is that the whole point of using a CMS is to let non-technical owners of the site manage content themselves in a fast and direct manner.
The difference between code and data is also visible in the way you secure their changes: for code, usual source control is still the rule, whereas for the content, you'll setup database backups.
Also important to mention is the structure of the database. You typically don't have to worry about this until you write your own modules: Orchard comes with a rich data migration feature that makes sure the database structure gets updated with the code that uses it. So don't worry about that, the database will just update itself as you deploy code to production.
Finally, I must mention that some CMS sites do need to be able to stage content and test it before exposing it to end users. There are variations of that: in some cases, being able to draft and preview content items is enough, and Orchard supports that out of the box: any content type can be marked draftable. When that is not enough, there is an optional feature called Deployments that enables rich content deployment workflows that can be repeated, scheduled and validated. An important point concerning that module is that the deployment only applies to the subset of the site's content you decide it should apply to (and excludes, obviously, things like user-provided content).
So in summary, treat code and modules as something you deploy in a one-way fashion from the dev box all the way to production, with ordinary source control and deployment methods, and treat data depending on the scenario, from simple direct in production database instances with a good backup policy, to drafts stored in production, and then all the way to complex content deployment rules.

Complete UI Redesign in Rails

I've been tasked with starting a complete UI redesign for an App that is already coded in Ruby on Rails. Honestly, I'm not sure where to start. Would it be easier to start from scratch or go through and modify the existing code?
The issue I see with modifying the existing code is that there are legacy artifacts that could conflict with the new UI code (CSS classes, ids, styling, etc.) which would make the redesign project probably take longer.
The issue I see with starting from scratch is that I'd literally be starting from scratch: there are existing migrations that I'd have to re-run after first clearing out the DB, which I can't do on a production server, and I'd have to re-run all of the install scripts (Devise, Rubber, etc.).
Has anyone gone through this before? Any recommended things to do?
Can I just hook into the existing DB and not have to run any migrations (since I can't do that in production)? Surely people who do this have some trick to getting back up and running smoothly.
Since you mentioned a UI redesign, I wonder whether it affects the DB at all. Generally, though, my approach would be to first do an HTML/CSS design to show how the new UI will look, both with and without data, then break that HTML down into components or partials, and then start replacing old components with the new ones. I'd probably start from the root route and build up.
You shouldn't really need to clear out DBs or anything of the sort; I prefer to use SQLite for local development. If you'd like to test with data, you should probably find a way of seeding some of it or copying data from the main application into your local DB. I'd never advise using your production DB with your development environment.
Just thought to share my 2 cents.

How to separate configurations in ASP.NET?

My team is doing web development (ASP.NET, WCF), and we are at an early stage where everyone needs to make DB changes and use their own sample data.
We use a dedicated DB server, and we want each developer to develop against a separate DB.
What we appear to need is the ability to configure the connection string on a per-developer basis in a source-controlled way. Obviously, we might have other configuration settings that need custom values, and finally we'll need to maintain a set of configuration settings that are common to all developers.
Can anyone suggest a best practice here?
P.S. A similar issue appears when we want to deploy a built application to different environments (test, stage, production) without having to manually tweak configurations (except perhaps the environment name).
You can use config transforms for your deployment to different environments. That's easy enough. Scott Hanselman did a pretty awesome video on it here.
For your individual developer db problem, there isn't any particularly elegant solution I can think of. Letting each developer have a unique configuration isn't really a "best practice" to begin with. Once everyone starts integrating their code, you could have a very ugly situation on your hands if everyone wrote their code against a unique db and configuration set. It almost guarantees that code won't perform the same way for two developers.
Here is what I would recommend, and have done in the past.
Create a basic framework for your database, on one database on your test db server.
Create a Database Project as part of your solution.
Use .NET's built-in Schema Compare to write your existing database out to the Database Project.
When someone needs to change the database, they should first get the latest version of the Database Project, make their changes, and then repeat the Schema Compare step above to add their changes to the project.
Using this method, it is also very easy for developers to deploy a local instance of the database that matches the "main" database, make changes, and write those changes back to the project.
OK, maybe not the most elegant solution, but we've chosen to read the connection string from a different place when the project is built with the Debug configuration.
We are using the registry, and it has to be maintained manually.
It requires some extra coding, but the code that reads the registry is only compiled in debug builds (#if DEBUG), so there is no performance hit in production.
Hope this helps as well.

Best Practice for maintaining a TSQL database creation script for a web application

We have an ASP.NET web application and need to maintain the database creation and initialization script.
Are there any industry best practices for maintaining database creation and initialization scripts? I can think of two main approaches:
Maintain a T-SQL creation script directly by hand.
Maintain a master database and generate the creation script from it, which is then checked into SourceSafe.
The script should also be trackable through source control, i.e. the table order should be controllable.
If possible, it should also include the ability to track initialization data, either in the same script or a separate one.
Currently we generate the script from Management Studio, but the order of the tables seems to be random.
The more automated the solution, the better.
The problem is not maintaining the script, nor maintaining a 'master' copy of the database. The real problem is upgrading existing databases. You make your modifications in the development environment; they are then propagated to the test environment and finally pushed into the production environment. While in the development and test environments it is possible to start from scratch, in production you always have to upgrade the existing deployment.
In my experience the best practice is to use upgrade scripts. This practice is useful even with a single deployed site, but it becomes invaluable with multiple locations that may be at different versions. Even with one operational site, it is still useful to be able to test the upgrade repeatedly (starting from backups of the current version), keep the changes in source control, and have a well-formalized, peer-reviewed change procedure (the upgrade script). Upgrade scripts can also be tailored to the specific needs of the operational site, like handling a large table with special care, dealing with encrypted data, or any of the myriad details that diff-based tools neglect or ignore. The main disadvantage is that the scripts have to be written, which requires real T-SQL knowledge (forget all the 'designers' in your favorite management tool).
You might want to check out RedGate SQL Source Control.
Are you looking for Visual Studio Database Projects?
I use database projects to store all database objects (tables, views, functions, keys, triggers, indexes across schemas) and keep versioning in TFS. You can build the database to ensure that everything is valid. You can deploy to a fresh database, or do a schema comparison with an existing database.
I also keep all reference and setup data in post deployment scripts which are automatically run after deployment.
