Ok, so here's the thing.
I'm maintaining an existing web application (it started life as a classic ASP app, so you can imagine :P) under ASP.NET 4.0 and SQL Server 2005. We are 4 developers using local instances of SQL Server 2005 Express, sharing the source code and a Visual Studio database project.
This webapp has several "universes" (that's what we call them). Every universe has its own database (currently all on the same server), but they all share the same schema (tables, sprocs, etc.) and the same source/site code.
So deploying manually is really annoying: I have to deploy the source code and then run the SQL scripts by hand against each database. I know manual deployment is error-prone, so I'm looking for a way to automate it.
We've recently created a Visual Studio Database Project to manage the schema and generate the diff-schema scripts for different targets.
I have no idea how to put the pieces together.
I would like to:
Have a way to make a "sync" deploy to a target server (thankfully I have full RDC access to the servers, so I can install things if required). By "sync" deploy I mean that I don't want to redeploy the whole application, because it has lots of files; I just want to deploy the files that are new or changed.
Generate diff-SQL update scripts for every database target and combine them into a single script. For this I'd need a list of the database names somewhere.
Copy the site files and execute the generated SQL script in an easy, automated way (there's a sketch of what I picture after this list).
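To make the database part concrete, here is roughly what I imagine the automated version looking like: a small console tool that runs the generated diff script against every database in a list via sqlcmd, which understands the GO batch separator. This is only a sketch; the server name, file names and list format are assumptions. (For the file side, I gather MSDeploy's sync verb does the kind of incremental copy of new/changed files I'm after.)

```csharp
// Sketch: run one generated diff script against every database listed in
// databases.txt. The server name and file names are placeholders.
using System;
using System.Diagnostics;
using System.IO;

class DeployDatabases
{
    static void Main()
    {
        string server = @".\SQLEXPRESS";    // target SQL Server instance (assumption)
        string script = "diff-update.sql";  // script produced by the database project

        foreach (string db in File.ReadAllLines("databases.txt"))
        {
            // sqlcmd flags: -S server, -d database, -E Windows auth,
            // -b abort on error, -i input file. sqlcmd handles GO separators.
            var psi = new ProcessStartInfo
            {
                FileName = "sqlcmd",
                Arguments = string.Format("-S {0} -d {1} -E -b -i \"{2}\"", server, db, script),
                UseShellExecute = false
            };
            using (Process p = Process.Start(psi))
            {
                p.WaitForExit();
                if (p.ExitCode != 0)
                    throw new Exception("Script failed against database: " + db);
            }
        }
        Console.WriteLine("All databases updated.");
    }
}
```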
I've read about MSBuild, MS WebDeploy, NAnt, etc., but I don't really know where to start, and I really want to get rid of this manual deployment.
If there is a better and easier way of doing it than what I enumerated, I'd be pleased to read your suggestion.
I know this is not a very specific question, but I've googled a lot about it and I can't seem to figure it out. I've never used any automation tool for deployment.
Any help will be really appreciated,
Thank you all,
Regards
Have you heard of the term multi-tenancy? It might be worth looking it up to see if it applies to your "universes", especially if one universe is never accessed by another...
See:
http://en.wikipedia.org/wiki/Multitenancy
http://msdn.microsoft.com/en-us/library/aa479086.aspx
UPDATE:
If the application and database are the same for each client (or tenant), I believe there are platforms that can help you provide the same code/DB as a SaaS application, i.e. another application/configuration layer on top that can handle the deployments etc.
I think these are called Platform as a Service (PaaS) offerings:
see: http://en.wikipedia.org/wiki/Platform_as_a_service
Multi-Tenancy in your case may be possible, depending on client security requirements, with a bit of work (or a lot of work):
Option 1:
You could use one instance of the application, i.e. deploy the site once and connect to a different database for each client. You would need to differentiate each client by URL and isolate content/data by setting a connection string for each, etc. (This would reduce your site deployments to one deployment.)
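A sketch of the lookup, assuming the first label of the host name identifies the universe and matches a connection string name in web.config (both conventions are made up for illustration):

```csharp
// Minimal sketch: pick the tenant's connection string from the request URL.
using System.Configuration;
using System.Web;

public static class TenantConnection
{
    public static string Current()
    {
        // e.g. universe1.example.com -> connection string named "universe1"
        string host = HttpContext.Current.Request.Url.Host;
        string tenant = host.Split('.')[0];

        ConnectionStringSettings entry = ConfigurationManager.ConnectionStrings[tenant];
        if (entry == null)
            throw new ConfigurationErrorsException("Unknown tenant: " + tenant);
        return entry.ConnectionString;
    }
}
```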
Option 2:
You could create a single instance of the application backed by a single database. You would need to add a TenantID column to each table and adjust all your code to accept a TenantID to ensure data security/isolation. Again you would need to detect/differentiate the tenant based on the URL and set the TenantID for the session, to be used on every database call. (This would reduce your site and database deployments to one of each.)
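Data access then has to filter on the tenant everywhere. A minimal sketch, assuming a plain ADO.NET repository and an Orders table (both names are illustrative, not prescribed):

```csharp
// Minimal sketch: every query must filter on TenantID once tables are shared.
using System.Data.SqlClient;

public class OrderRepository
{
    private readonly string _connectionString;
    private readonly int _tenantId; // resolved from the URL at the start of the session

    public OrderRepository(string connectionString, int tenantId)
    {
        _connectionString = connectionString;
        _tenantId = tenantId;
    }

    public int CountOrders()
    {
        using (var conn = new SqlConnection(_connectionString))
        using (var cmd = new SqlCommand(
            "SELECT COUNT(*) FROM Orders WHERE TenantID = @tenantId", conn))
        {
            cmd.Parameters.AddWithValue("@tenantId", _tenantId);
            conn.Open();
            return (int)cmd.ExecuteScalar();
        }
    }
}
```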
We have a large, complex Kentico build which uses Kentico's Continuous Integration locally, and Kentico's Staging module to push Kentico object changes through various environments.
We have a large internal dev team and have found that occasionally (probably due to Git merging issues) certain staging tasks aren't logged. When dealing with large deployments this is often not obvious until something breaks on the target server.
What I'd like is to write a custom module which can pull certain data from a target server (e.g. a collection of serialized web parts). I can then use this to compare with the source server to identify where objects are not correctly synchronized. I'd hoped this might be possible using the web services already exposed by Kentico which handle the staging sync tasks.
I've been hunting through a few namespaces in the Kentico API (CMS.Synchronization, CMS.Synchronization.WSE3 etc.) but it's not clear whether what I'm trying to do is even possible. Has anyone tried anything similar? If so, could you point me in the right direction?
Instead of writing your own code/tool for this, I'd suggest taking advantage of what someone else has already done. This is like Red Gate's SQL Compare for Kentico, but on steroids: it compares database data, schema, AND file system changes on staging and target servers.
Compare for Kentico
We do rapid development of web applications and we're looking for ways to separate our development and production databases (we currently develop directly on production... it's bad news).
We use ASP.NET Webforms with LINQ2SQL and Dynamic Data for CRUD. How can we do database development locally and then deploy changes to production? I've seen Entity Framework Code-first migrations, but I don't know of any equivalent for LINQ2SQL. We don't want to switch to EF as our CMS is built around LINQ2SQL.
We would also need production data to be available locally (not up to the minute, but recent enough) so we can debug with real data if problems arise.
This is the only idea I've come up with so far but it's far from ideal:
Initial development is done locally then deployed to production
Subsequent maintenance is then done on a local replication of the production database. Then we use some kind of 'database diff' tool to determine the changes that were made, and migrate those changes to production.
Is this an acceptable way of doing things? Is there a better way we could use?
Thanks
Develop your data model and procedures in SSDT Database Projects. This keeps a perfect, source-controlled copy of what you want the database to look like at any moment in time. Then let the tooling generate the publishing scripts for you.
Developers should always develop on their own local copy of a database. They can check out scripts from the database project and make changes, which they publish locally. They can get latest on the checked-in project, merge their changes, deploy locally again, test it out, and then check in their changes. Only when everything is tested out do you publish the changes to production.
You end up treating your database schema very much like code source files.
To get production data down to your development server, I would take a .bacpac of the production DB (or a .dacpac extracted with data) and then import it into your local DB. This works well because you need the schema definition along with the data, since prod is likely an older version than what you have in dev.
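A minimal sketch of the import using the DacFx API (the Microsoft.SqlServer.Dac package); the file path, server and database name are placeholders, and the same can be done from the command line with SqlPackage.exe:

```csharp
// Sketch: import a production .bacpac into a fresh local database.
using Microsoft.SqlServer.Dac;

class RefreshLocalDb
{
    static void Main()
    {
        // Connection string to the local dev instance (assumption).
        var services = new DacServices(@"Data Source=.\SQLEXPRESS;Integrated Security=True");

        using (BacPackage bacpac = BacPackage.Load(@"C:\backups\Production.bacpac"))
        {
            // Creates the target database and loads schema + data.
            services.ImportBacpac(bacpac, "MyApp_Dev");
        }
    }
}
```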
Yeah I think you basically hit the nail on the head. Those are the two things I have done.
You develop locally and check in your SQL scripts to source control. Then you run the scripts for a deployment. What I've seen work well is dropping/re-creating all stored procedures (seems scary, but if you trust those scripts it's very helpful), and then having one-off scripts per deployment for schema changes and data migration.
Periodically you will copy production data down and restore it locally. Obviously this sync can only happen easily right after a deployment, since that's when local and production schemas will match. At my current job we actually duplex writes and send a copy to the lower environments, so I suppose that's an option. You could also replicate data from production somewhere else and work off of that, or write a tool to bring the data into your local environment.
From what I've seen there are no easy answers.
My team is doing web development (ASP.NET, WCF), and we are at an early stage where everyone needs to make DB changes and use their own sample data.
We use a dedicated DB server, and we want each developer to develop against separate DB.
What we appear to need is the ability to configure the connection string on a per-developer basis in a source-controlled way. We might also have other configuration settings that need per-developer values, and finally we'll need to maintain a set of configuration settings that are common to all developers.
Can anyone suggest a best practice here?
P.S. A similar issue appears when we want to deploy a built application to different environments (test, stage, production) without having to manually tweak configurations (except perhaps configuring the environment name).
You can use config transforms for your deployment to different environments. That's easy enough. Scott Hanselman did a pretty awesome video on it here.
For your individual developer db problem, there isn't any particularly elegant solution I can think of. Letting each developer have a unique configuration isn't really a "best practice" to begin with. Once everyone starts integrating their code, you could have a very ugly situation on your hands if everyone wrote their code against a unique db and configuration set. It almost guarantees that code won't perform the same way for two developers.
Here is what I would recommend, and have done in the past.
1. Create a basic framework for your database on one database on your test DB server.
2. Create a Database Project as part of your solution.
3. Use .NET's built-in Schema Compare to write your existing database to the database project.
4. When someone needs to change the database, they should first get latest on the Database project, then make their changes, and then repeat step 3 to write their changes back to the project.
Using this method, it is also very easy for developers to deploy a local instance of the database that matches the "main" database, make changes, and write those changes back to the project.
OK.
Maybe not the most elegant solution, but we've chosen to read the connection string from a different place when the project is built in the Debug configuration.
We use the registry, and it has to be maintained manually on each developer's machine.
It requires some extra coding, but the code that reads the registry is only compiled in debug builds (#if DEBUG), so there is no performance hit in production.
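The code looks roughly like this (the registry path and value name are changed for illustration):

```csharp
using System.Configuration;
using Microsoft.Win32;

public static class ConnectionStringProvider
{
    public static string Get()
    {
#if DEBUG
        // Each developer sets this value once on their own machine.
        using (RegistryKey key = Registry.CurrentUser.OpenSubKey(@"Software\MyApp"))
        {
            if (key != null)
            {
                string fromRegistry = key.GetValue("ConnectionString") as string;
                if (fromRegistry != null)
                    return fromRegistry;
            }
        }
#endif
        // Release builds (or a missing registry entry) fall back to web.config.
        return ConfigurationManager.ConnectionStrings["Default"].ConnectionString;
    }
}
```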
Hope this helps as well.
Cheers
v.
We have an ASP.NET web application and need to maintain the database creation and initialization script.
Are there any industry best practices for maintaining database creation and initialization scripts? I can think of two main approaches:
Maintain a T-SQL creation script directly by hand.
Maintain a master database and generate the script, which is then checked into SourceSafe.
The script should also be trackable through source control, i.e. the table order should be controllable.
If possible, it should also include the ability to track initialization data, either in the same or a separate script.
Currently we generate the script from Management Studio, but the order of the tables seems to be random.
The more automated the solution, the better.
The problem is not maintaining the script, nor maintaining a 'master' copy of the database. The real problem is upgrading existing database(s). You make your modifications in the developer environment, which are then propagated to the test environment, and finally pushed into the production environment. While in the developer and test environments it is possible to start from scratch, in production you always have to upgrade the existing deployment.
In my experience the best practice is to use upgrade scripts. This practice is useful even with a single deployed site, but it becomes invaluable with multiple locations that may be at different versions. Even with one operational site, it is still useful to be able to test the upgrade repeatedly (starting from backups of the current version), keep the changes in source control, and have a well-formalized and peer-reviewed change procedure (the upgrade script). Upgrade scripts can also be tailored to the specific needs of the operational site, like handling a large table with special care, or dealing with encrypted data, or any of the myriad details that diff-based tools neglect or ignore. The main disadvantage is that the scripts have to be written, which requires real T-SQL knowledge (forget all the 'designers' in your favorite management tool).
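To make the idea concrete, here is a minimal sketch of an upgrade-script runner; the one-row SchemaVersion table, the zero-padded file naming and the folder layout are my own conventions for illustration, not a prescription:

```csharp
// Sketch: apply numbered upgrade scripts (0001.sql, 0002.sql, ...) that are
// newer than the version recorded in a one-row SchemaVersion table.
using System;
using System.Data.SqlClient;
using System.IO;
using System.Linq;
using System.Text.RegularExpressions;

class UpgradeRunner
{
    static void Main(string[] args)
    {
        string connectionString = args[0];
        string scriptFolder = args[1];

        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            int current;
            using (var cmd = new SqlCommand("SELECT Version FROM SchemaVersion", conn))
                current = (int)cmd.ExecuteScalar();

            foreach (string file in Directory.GetFiles(scriptFolder, "*.sql").OrderBy(f => f))
            {
                int version = int.Parse(Path.GetFileNameWithoutExtension(file));
                if (version <= current) continue; // already applied

                // Split on GO, which is a client-side batch separator.
                foreach (string batch in Regex.Split(File.ReadAllText(file),
                    @"^\s*GO\s*$", RegexOptions.Multiline | RegexOptions.IgnoreCase))
                {
                    if (string.IsNullOrEmpty(batch.Trim())) continue;
                    using (var cmd = new SqlCommand(batch, conn))
                        cmd.ExecuteNonQuery();
                }

                using (var cmd = new SqlCommand("UPDATE SchemaVersion SET Version = @v", conn))
                {
                    cmd.Parameters.AddWithValue("@v", version);
                    cmd.ExecuteNonQuery();
                }
                current = version;
            }
        }
    }
}
```

The important part is not the runner but the discipline: every schema change ships as a numbered, reviewed script, and the deployed version is recorded so the same tool can bring any site, at any version, up to date.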
You might want to check out RedGate SQL Source Control.
Are you looking for Visual Studio Database Projects?
I use database projects to store all database objects (tables, views, functions, keys, triggers, indexes across schemas) and keep versioning in TFS. You can build the database to ensure that everything is valid. You can deploy to a fresh database, or do a schema comparison with an existing database.
I also keep all reference and setup data in post deployment scripts which are automatically run after deployment.
What is the simplest way to distribute an ASP.NET web application? I looked at some of the open source ASP.NET projects out there to see how they distribute their apps and handle updates, and they seem rather complicated (not for me to perform, but for non-technical users). A lot of them entail backing up the entire installed project, deleting specific folders, and saving parts of the web.config. I'm hoping to find a solution that makes the update process as simple as possible.
Thanks.
I am working on a project with a similar requirement now. We decided to use WiX to create an installer that can be run on the server or machine where the site is installed. WiX is incredibly powerful, but takes a bit to get the hang of.
There are plenty of other open source, and paid installer technologies as well. Here is a post with some info on a few.
CommunityServer provides a setup MSI that will create a virtual directory, generate the SQL database, and populate it with default data. Updating for point releases, though, is still a manual process involving an update.sql file and having everyone download and then merge binary and static file changes.
They probably could have created an update msi too, but because so many people customize CommunityServer, it is probably better to let people merge changes themselves.
Do you mean in terms of breaking the functionality into tiers that could be handled on separate machines, e.g. having 3 servers for a 3-tier architecture where one is the DB server, one handles the middleware, and the other handles the requests in ASP.NET? Another consideration here is going from one web server to multiple web servers when scaling up.
Or are you referring to deployment?
It's a web application, man. Serve it publicly, require registration, and move on. Isn't that the point of the web application?