I have recently published my MVC 5 application to a Windows server using IIS. Now, I might be overthinking this, because when using Code First it should populate the database with the new additions to my data model. HOWEVER, before I make changes to my data model in my development environment, I just wanted to ask here to make sure I really don't mess anything up.
So after launching my application as usual, my client now wants me to make some changes that involve adding a new data model as well as new controllers. Is it safe for me to add these changes, and once I re-publish the application, will the production database get updated with the Code First additions I've made?
I'm having trouble understanding how my production database is going to recognize the new data and tables that are being added once I make this update.
Do I simply attach my newly changed database from my development machine to the production environment? All the data on my development system is needed and usable on the production side as well.
I hope I've asked this question clearly enough for someone to help me out.
I plan to make weekly or bi-weekly changes to the web app.
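For reference, the setup I'm assuming is the usual EF 6 Code First migrations pattern sketched below; the context, entity, and class names are placeholders rather than my real code:

```csharp
using System.Data.Entity;
using System.Data.Entity.Migrations;

// Placeholder entities: Customer already exists, Invoice is the newly added model.
public class Customer { public int Id { get; set; } public string Name { get; set; } }
public class Invoice  { public int Id { get; set; } public decimal Amount { get; set; } }

// Placeholder for the application's real DbContext.
public class MyAppContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }
    public DbSet<Invoice> Invoices { get; set; }
}

// The class Enable-Migrations generates in the Package Manager Console.
internal sealed class Configuration : DbMigrationsConfiguration<MyAppContext>
{
    public Configuration()
    {
        // With this enabled, pending model changes are applied to the target
        // database automatically the first time the context is used.
        AutomaticMigrationsEnabled = true;
    }
}

// Typically run once at startup (e.g. Application_Start in Global.asax), so the
// production database is upgraded to the latest model when the app starts.
public static class MigrationsBootstrap
{
    public static void Run()
    {
        Database.SetInitializer(
            new MigrateDatabaseToLatestVersion<MyAppContext, Configuration>());
    }
}
```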
Thank you
For example, say I am working on a FAQ page locally. I create whatever plugins/templates etc. I need. Then, locally, I proceed to add the plugins to the page, debug, and modify as needed. Now it comes time for me to deploy this to production.
Am I left with redoing all the work again (copying/pasting the content and rebuilding the FAQ page), or is there an alternative way? Things I have thought of:
Create a data migration representing the structure/content
Sync the production db to the dev db, make my changes and push it all back during a downtime window.
Are there any other solutions around in the Django CMS community for handling this kind of thing?
The data migration seems like the best approach, but I figured I would ask to be sure I wasn't missing anything.
I am not aware of any out-of-the-box solution to this problem. A data migration seems fine, though if you are planning to integrate it into the actual migrations framework, I would be worried about making it too coupled to the state of the database (e.g. if you are inserting the content into a specific page ID).
What we have been doing in our projects is to create a special app that provides additional commands for the management CLI. You can then keep the migrations separate from data population. Once you deploy your plugin structure live, you can simply run a command to populate the database.
After you have seeded the data, you can simply disable or completely remove the temporary app without any effect on your main application. Keeping tightly coupled data population in the migrations framework, by comparison, wastes space and ties the DB migration to your DB contents.
We do rapid development of web applications and we're looking for ways to separate our development and production databases (we currently develop directly on production... it's bad news).
We use ASP.NET Webforms with LINQ2SQL and Dynamic Data for CRUD. How can we do database development locally and then deploy changes to production? I've seen Entity Framework Code-first migrations, but I don't know of any equivalent for LINQ2SQL. We don't want to switch to EF as our CMS is built around LINQ2SQL.
We would also need production data to be available locally (not up to the minute, but recent enough) so we can debug with real data if problems arise.
This is the only idea I've come up with so far but it's far from ideal:
Initial development is done locally, then deployed to production.
Subsequent maintenance is then done on a local replication of the production database. Then we use some kind of 'database diff' tool to determine the changes that were made, and migrate those changes to production.
Is this an acceptable way of doing things? Is there a better way we could use?
Thanks
Develop your data model and procedures in SSDT Database Projects. This keeps a perfect source-controlled copy of what you want the database to look like at any moment in time. Then let the tooling generate the publishing scripts for you.
Developers should always develop on their own local copy of a database. They can check out scripts from the database project and make changes, which they publish locally. They can get latest on the checked-in project, merge their changes, deploy locally again, test it out, and then check in their changes. Only when everything has been tested out do you publish the changes to production.
You end up treating your database schema very much like code source files.
To get production data down to your development server I would take a .bacpac or .dacpac of the production DB and then import it into your local DB. This works well because you need the schema definition along with the data, since it is likely that prod is an older version than what you have in dev.
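For the import itself, a minimal sketch of what that might look like in code, assuming the Microsoft.SqlServer.DacFx package; the file path, connection string, and database name are placeholders:

```csharp
using Microsoft.SqlServer.Dac;

class ImportProductionCopy
{
    static void Main()
    {
        // Placeholder connection string pointing at the local dev SQL Server instance.
        var services = new DacServices("Server=localhost;Database=master;Integrated Security=true;");

        // A .bacpac exported from production carries both the schema and the data.
        using (var package = BacPackage.Load(@"C:\backups\Prod.bacpac"))
        {
            // Imports the package into a new local database for development/debugging.
            services.ImportBacpac(package, "MyApp_DevCopy");
        }
    }
}
```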
Yeah I think you basically hit the nail on the head. Those are the two things I have done.
You develop locally and check in your SQL scripts to source control. Then you run the scripts for a deployment. What I've seen work well is dropping/re-creating all stored procedures (seems scary, but if you trust those scripts it's very helpful), and then having one-off scripts per deployment for schema changes and data migration.
Periodically you will copy down production data and restore it locally. Obviously this sync can only happen easily right after a deployment, since that's when local and production will be the same. At my current job we actually duplex writes and send a copy to the lower environments, so I suppose that's an option. You could also replicate data from production somewhere else and work off of that, or write a tool to bring the data into your local environment.
From what I've seen there are no easy answers.
My team is doing web development (ASP.NET, WCF), and we are at an early stage where everyone needs to make DB changes and use their own sample data.
We use a dedicated DB server, and we want each developer to develop against a separate DB.
What we appear to need is the ability to configure the connection string on a per-developer basis in a source-controlled way. Obviously, we might have other configuration settings that need custom values, and finally, we'll need to maintain a set of configuration settings that are common to all developers.
Can anyone suggest a best practice here?
PS: A similar issue appears when we want to deploy a built application to different environments (test, stage, production) without having to manually tweak configurations (except perhaps setting the environment name).
You can use config transforms for your deployment to different environments. That's easy enough. Scott Hanselman did a pretty awesome video on it here.
For your individual developer db problem, there isn't any particularly elegant solution I can think of. Letting each developer have a unique configuration isn't really a "best practice" to begin with. Once everyone starts integrating their code, you could have a very ugly situation on your hands if everyone wrote their code against a unique db and configuration set. It almost guarantees that code won't perform the same way for two developers.
Here is what I would recommend, and have done in the past.
Create a basic framework for your database, on one database on your test db server.
Create a Database Project as part of your solution.
Use Visual Studio's built-in Schema Compare to write your existing database to the database project.
When someone needs to change the database, they should first get latest on the database project, then make their changes, and then repeat the Schema Compare step to add their changes to the project.
Using this method, it is also very easy for developers to deploy a local instance of the database that matches the "main" database, make changes, and write those changes back to the project.
OK.
Maybe not the most elegant solution, but we've chosen to read the connection string from a different place when the project is built using the Debug configuration.
We are using the registry, and it has to be maintained manually.
It requires some extra coding, but the code that reads the registry is only compiled in Debug builds (#if DEBUG), so there is no performance hit in production.
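A minimal sketch of the idea; the registry key, value name, and connection string name are hypothetical, not our real ones:

```csharp
using Microsoft.Win32;

public static class ConnectionStringProvider
{
    public static string Get()
    {
#if DEBUG
        // Debug builds: read the developer-specific connection string from the registry.
        // Key and value names here are hypothetical.
        using (var key = Registry.CurrentUser.OpenSubKey(@"Software\MyCompany\MyApp"))
        {
            if (key != null)
            {
                var fromRegistry = key.GetValue("ConnectionString") as string;
                if (!string.IsNullOrEmpty(fromRegistry))
                    return fromRegistry;
            }
        }
#endif
        // Release builds (and fallback): use the value from web.config as usual.
        return System.Configuration.ConfigurationManager
            .ConnectionStrings["Default"].ConnectionString;
    }
}
```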
Hope this helps as well.
Cheers
v.
I'm interested in finding out what a good way would be to make changes to the production database and source code of a web application (ASP.NET, SQL Server 2008).
In a little more detail: we develop on local machines, and then we need to transfer the code and database changes to production (a pretty standard story).
At the moment we do it in the evening: we change the database directly from Management Studio on the production server, and then just overwrite the existing ASP.NET code (copy/paste).
You're talking about release management. What you're asking about is a big subject with a LOT of different answers. The best solution for you is not something we can tell you; there are trade-offs to consider.
For example, what you're describing is a very basic release management process that would be considered an "immature" process. It does not take into account rollback plans, versioning, separation of concerns, proper testing, or any of a hundred other factors that a "mature" release management process involves.
A mature process is very good, but if you don't have the resources, it's not feasible.
To get to the point, I don't think your question can be answered fully here. I'd suggest starting to research "change management", "release management", "Application Lifecycle Management", and "Application Development Lifecycle". I'll have a few good starter links for you in a minute.
Just a forewarning, though: you are asking a question that's going to open your eyes and your world in ways you probably haven't considered. There are things like automated builds to consider, and tools to do it for you (high-priced, free, and everything in between).
http://en.wikipedia.org/wiki/Release_management
http://en.wikipedia.org/wiki/Application_lifecycle_management
A few simple options for JUST what you're asking about can be found here:
http://msdn.microsoft.com/en-us/library/7hd4c0x3(VS.80).aspx
Also, since you talked about source code without mentioning which source control you're using, I need to say... if you're not already using source control, you need to. You'll wonder how you ever lived without it once you start using it.
Depends on whether it's the first deployment of a new app, or an update to the app.
For small updates, record all your database changes as SQL scripts. You must strictly enforce that all changes to development are applied as SQL scripts. Put the scripts in source control. Deploy the update by running the scripts on production.
For new apps you may have thousands of scripts. You can't run them individually, and consolidating them into a master script takes too much time (although you still want to check EVERY script into source control). In this case, you reach a milestone in development, then FREEZE the development database and declare it a baseline. Use the database tools to generate a master script (or scripts). Deploy to production by running those scripts. Manually create data scripts for your lookup tables to keep that data separate from junk dev data.
Avoid a database copy. Avoid changing things by hand through the GUI. Scripts are the way. How you go about collecting the scripts, consolidating them into master scripts, generating the scripts, etc. is another story.
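To make "run the scripts on production" concrete, here is a minimal sketch of a script runner, assuming the scripts sit in one folder and sort correctly by file name; the folder path and connection string are placeholders:

```csharp
using System;
using System.Data.SqlClient;
using System.IO;
using System.Linq;

class ScriptRunner
{
    static void Main()
    {
        // Placeholders: point these at the real script folder and target database.
        const string scriptFolder = @"C:\deploy\scripts";
        const string connectionString = "Server=PRODSERVER;Database=MyApp;Integrated Security=true;";

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            // Apply scripts in file-name order, so a numbering convention
            // (001_..., 002_...) controls the order in which they run.
            foreach (var path in Directory.GetFiles(scriptFolder, "*.sql").OrderBy(p => p))
            {
                // Split on GO batch separators, which ADO.NET does not understand.
                var batches = File.ReadAllText(path)
                    .Split(new[] { "\r\nGO\r\n", "\nGO\n" }, StringSplitOptions.RemoveEmptyEntries);

                foreach (var batch in batches)
                {
                    if (string.IsNullOrWhiteSpace(batch)) continue;
                    using (var command = new SqlCommand(batch, connection))
                        command.ExecuteNonQuery();
                }

                Console.WriteLine("Applied " + Path.GetFileName(path));
            }
        }
    }
}
```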
Ok, so here's the thing.
I'm developing an existing web application (it started out as a classic ASP app, so you can imagine :P) under ASP.NET 4.0 and SQL Server 2005. We are 4 developers using local instances of SQL Server 2005 Express, with the source code and the Visual Studio database project.
This web app has several "universes" (that's what we call them). Every universe has its own database (currently on the same server), but they all share the same schema (tables, sprocs, etc.) and the same source/site code.
So deploying manually is really annoying, because I have to deploy the source code and then run the SQL scripts manually on each database. I know that manual deployment can cause problems, so I'm looking for a way of automating it.
We've recently created a Visual Studio Database Project to manage the schema and generate the diff-schema scripts with different targets.
I have no idea how to put the pieces together.
I would like to:
Have a way to make a "sync" deploy to a target server (thankfully I have full RDC access to the servers, so I can install things if required). By "sync" deploy I mean that I don't want to deploy the whole application every time, because it has lots of files; I just want to deploy the files that are new or changed.
Generate diff SQL update scripts for every database target and combine them into just one script. For this I would need a list of the database names somewhere (see the sketch after this list).
Copy the site files and execute the generated SQL script in an easy, automated way.
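To give an idea of the level of automation I'm after, something roughly like the sketch below; the database names and script path are made up for illustration, and in reality the list would live in a config file:

```csharp
using System.Data.SqlClient;
using System.IO;

class MultiUniverseDeploy
{
    static void Main()
    {
        // Illustrative list of "universe" databases sharing the same schema.
        var universeDatabases = new[] { "Universe_Alpha", "Universe_Beta", "Universe_Gamma" };

        // The single diff script generated by the Visual Studio database project.
        // (A real script containing GO separators would also need batch splitting.)
        var diffScript = File.ReadAllText(@"C:\deploy\schema-diff.sql");

        foreach (var database in universeDatabases)
        {
            var connectionString =
                "Server=PRODSERVER;Database=" + database + ";Integrated Security=true;";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(diffScript, connection))
            {
                connection.Open();
                command.ExecuteNonQuery();   // apply the same schema diff to every universe
            }
        }
    }
}
```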
I've read about MSBuild, MS WebDeploy, NAnt, etc., but I don't really know where to start, and I really want to get rid of this manual deployment.
If there is a better and easier way of doing it than what I enumerated, I'll be pleased to read your suggestions.
I know this is not a very specific question, but I've googled a lot about it and can't figure out how to do it. I've never used any automation tool for deployment.
Any help will be really appreciated,
Thank you all,
Regards
Have you heard of the term Multi-Tenancy? It might be worth looking it up to see if it applies to your "multiverse", especially if one universe is never accessed by another...
See:
http://en.wikipedia.org/wiki/Multitenancy
http://msdn.microsoft.com/en-us/library/aa479086.aspx
UPDATE:
If the application and database are the same for each client (or tenant), I believe there are applications that may help in providing the same code/DB as a SaaS offering, i.e. another application/configuration layer on top that can handle the deployments, etc.?
I think these are called Platform as a Service (PaaS) applications:
see: http://en.wikipedia.org/wiki/Platform_as_a_service
Multi-Tenancy in your case may be possible, depending on client security requirements, with a bit of work (or a lot of work):
Option 1:
You could use one instance of the application, i.e. deploy the site once and connect to a different database for each client. You would need to differentiate each client by URL and isolate content/data by setting a connection string for each. (This would reduce your site deployments to one deployment.)
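A rough sketch of that idea, resolving the tenant's connection string from the request host name; the host names, database names, and mapping are made up for illustration:

```csharp
using System.Collections.Generic;
using System.Web;

public static class TenantConnectionStrings
{
    // Illustrative mapping from host name to each tenant's database.
    private static readonly Dictionary<string, string> ByHost =
        new Dictionary<string, string>
        {
            { "alpha.example.com", "Server=DBSERVER;Database=Universe_Alpha;Integrated Security=true;" },
            { "beta.example.com",  "Server=DBSERVER;Database=Universe_Beta;Integrated Security=true;" },
        };

    // Called per request to pick the connection string for the current tenant.
    public static string ForCurrentRequest()
    {
        var host = HttpContext.Current.Request.Url.Host.ToLowerInvariant();
        string connectionString;
        if (!ByHost.TryGetValue(host, out connectionString))
            throw new KeyNotFoundException("Unknown tenant host: " + host);
        return connectionString;
    }
}
```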
Option 2:
You could create a single instance of the application and use a single database. You would need to add a "TenantID" column to each table and adjust all your code to accept a TenantID to ensure data security/isolation. Again, you would need to detect/differentiate the tenant based on the URL to set the TenantID for the session, used for every database call. (This would reduce your site and database deployments to one of each.)
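And a minimal sketch of what data access might look like under this option, with every query filtered by the TenantID column; the table, column, and class names are assumptions:

```csharp
using System.Data.SqlClient;

public class ArticleRepository
{
    private readonly string _connectionString;
    private readonly int _tenantId;   // resolved from the URL at the start of the request

    public ArticleRepository(string connectionString, int tenantId)
    {
        _connectionString = connectionString;
        _tenantId = tenantId;
    }

    public int CountArticles()
    {
        using (var connection = new SqlConnection(_connectionString))
        using (var command = new SqlCommand(
            "SELECT COUNT(*) FROM Articles WHERE TenantID = @tenantId", connection))
        {
            // Every query carries the tenant filter so one tenant never sees another's rows.
            command.Parameters.AddWithValue("@tenantId", _tenantId);
            connection.Open();
            return (int)command.ExecuteScalar();
        }
    }
}
```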