Single website, single database schema, multiple copies - ASP.NET

I know the logic I am thinking of may be a mess, but the scenario is this: I have a website with 8 copies, each with the same code and the same database schema, published to different locations on the same server.
My problem is that I have about 50 stored procedures (per database), so what can I do to make maintenance easier?
Every time I modify a stored procedure, do I have to modify it in 8 places? Do I have to change the web.config file 8 times for every publish operation?
I am thinking about building a simple CMS in which I store the publish directories; on a button click it would publish the whole project to the selected directories. But that still leaves the problem of the stored procedures: on the first publish I am thinking of altering all of the stored procedures in each database.
Any suggestion is welcome. Right now I have only one copy, and you know it is hard to maintain once the decision has been made.
BTW the website is ASP.NET Web Forms but I can port it to any new .NET web technology like MVC or .NET Core.

If you have full control over the hosting, then a "multi-tenant database design", as suggested by Dai, would be the better approach. However, if the circumstances require multiple instances, you can do the following:
Store a version number in your database so you know which scripts have run and which haven't.
Script any schema changes (which includes any stored procedures), potentially any seed data, and include a new database version number in each script.
Store the scripts in a directory under the website.
Have a maintenance routine (manual or automatic) that compares the version numbers and runs any new scripts which haven't been run before (a minimal sketch follows).
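A minimal sketch of that versioning scheme in T-SQL (the table, procedure, and script names are illustrative, not from the original answer):

```sql
-- One-time setup: a table recording which schema version the database is at.
CREATE TABLE dbo.SchemaVersion
(
    VersionNumber int      NOT NULL PRIMARY KEY,
    AppliedOn     datetime NOT NULL DEFAULT (GETDATE())
);
INSERT INTO dbo.SchemaVersion (VersionNumber) VALUES (1);
GO

-- Example upgrade script "0002_alter_proc.sql": it only runs if version 2
-- hasn't been applied yet, so running the full set of scripts is always safe.
IF NOT EXISTS (SELECT 1 FROM dbo.SchemaVersion WHERE VersionNumber = 2)
BEGIN
    -- ALTER PROCEDURE must be the first statement in its batch, hence EXEC().
    -- dbo.GetCustomers is a hypothetical stored procedure.
    EXEC(N'ALTER PROCEDURE dbo.GetCustomers
           AS
           SELECT CustomerId, Name FROM dbo.Customers;');

    INSERT INTO dbo.SchemaVersion (VersionNumber) VALUES (2);
END
GO
```

The maintenance routine then just enumerates the script files in version order and runs them against each of the 8 databases; because every script checks the version table first, re-running the whole set does no harm.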

Related

Is it possible to use Kentico's staging API to pull serialized object information from a target server?

We have a large, complex Kentico build which uses Kentico's Continuous Integration locally, and Kentico's Staging module to push Kentico object changes through various environments.
We have a large internal dev team and have found that occasionally (probably due to Git merging issues) certain staging tasks aren't logged. When dealing with large deployments this is often not obvious until something breaks on the target server.
What I'd like is to write a custom module which can pull certain data from a target server (e.g. a collection of serialized web parts). I can then use this to compare with the source server to identify where objects are not correctly synchronized. I'd hoped this might be possible using the web services already exposed by Kentico which handle the staging sync tasks.
I've been hunting through a few namespaces in the Kentico API (CMS.Synchronization, CMS.Synchronization.WSE3, etc.), but it's not clear if what I'm trying to do is even possible. Has anyone tried anything similar? If so, could you point me in the right direction?
Instead of writing your own code/tool for this, I'd suggest taking advantage of what someone else has already done. This is like Red Gate's SQL Compare for Kentico, BUT on steroids: it compares database data, schema, AND file system changes across staging and target servers.
Compare for Kentico

How to organize a collection of demo web applications

I would like to create and archive a collection of demo ASP.NET Web Forms applications that show projects with certain features, in the sense of "this feature can be implemented like this" -- to be presented to a potential customer.
Before the presentation, I would like to take the selected set of demos and install them easily on a notebook. Each of the demos will be "frozen". The target notebook is not the customer's; it is one of ours that is brought to the customer for the presentation. This way, it can be prepared in advance, in the sense that a named MS SQL instance with a fixed name can be ready, etc.
Can you share some experience with such a situation? (I do not want this question to be marked as opinion-based, so please, if you have some explicit links to related documents or explicit suggestions...)
Here are some other facts and initial ideas:
Each of the demo projects uses two databases: xxx_users (the standard ASP.NET authentication...), and xxx_application (and possibly xxx_external) where xxx is a prefix for the specific project.
The demo application is expected to be compiled (binary only, no sources needed for the presentation).
The Web.config files can use local\SQLINSTANCEFORDEMOS in the connection strings.
The SQL instance has a fixed name, a fixed administrator account (like sa), and a fixed password for logging in to the instance. This way, they can be included in the Web.config files.
The sample data can be fairly big (not extremely tiny).
The application will use its own SQL tables in the xxx_application database.
The outer database that is accessed from the web application will be simulated by the xxx_external database.
This way, I should be able to create and archive SQL backups of the xxx_users, xxx_application, and xxx_external databases, plus an archive of the web app binaries (a restore sketch follows below).
Have you ever encountered this situation? Is the approach reasonable? Could you share some better ideas?
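For what it's worth, restoring the archived backups onto the demo notebook can itself be scripted; here is a minimal sketch for one of the three databases (the prefix "proj1", the paths, and the logical file names are all placeholders, and the same pattern would repeat for xxx_application and xxx_external):

```sql
-- Restore one demo database onto the notebook's fixed instance
-- (local\SQLINSTANCEFORDEMOS). WITH REPLACE overwrites any stale copy
-- left over from a previous presentation.
RESTORE DATABASE proj1_users
FROM DISK = N'C:\Demos\proj1_users.bak'
WITH REPLACE,
     MOVE N'proj1_users'     TO N'C:\Demos\Data\proj1_users.mdf',
     MOVE N'proj1_users_log' TO N'C:\Demos\Data\proj1_users_log.ldf';
```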

Best Practice for maintaining a TSQL database creation script for a web application

We have an ASP.NET web application and need to maintain the database creation and initialization scripts.
Are there any industry best practices for maintaining database creation and initialization scripts? I can think of two main approaches.
Maintain a T-SQL creation script directly by hand.
Maintain a master database and generate the script from it, which is then checked into SourceSafe.
The script should also be trackable through source control, i.e. the table order should be controllable.
If possible, it should also be able to track initialization data, either in the same script or a separate one.
Currently we generate the script from Management Studio, but the order of the tables seems to be random.
And the more automated the solution, the better.
The problem is not maintaining the script, nor maintaining a 'master' copy of the database. The real problem is upgrading existing database(s). You make your modifications in the development environment; they are then propagated to the test environment and finally pushed into the production environment. While in the development and test environments it is possible to start from scratch, in production you always have to upgrade the existing deployment.
In my experience the best practice is to use upgrade scripts. This practice is useful even with a single deployed site, but it becomes invaluable with multiple locations that may be at different versions. Even with one single operational site, it is still useful to be able to test the upgrade repeatedly (starting from backups of the current version), keep the changes in source control, and have a well-formalized, peer-reviewed change procedure (the upgrade script). And upgrade scripts can be tailored to the specific needs of the operational site, like handling a large table with special care, or dealing with encrypted data, or any of the myriad details that diff-based tools neglect or ignore. The main disadvantage is that the scripts have to be written, which requires real T-SQL knowledge (forget all the 'designers' in your favorite management tool).
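As an illustration of the kind of hand-written step that no diff-based tool will generate for you, an upgrade script might add a column to a large table in batches instead of one huge blocking update (the table and column here are made up for the sketch):

```sql
-- Step 1: add the column as NULLable first; this is a quick metadata change.
ALTER TABLE dbo.Orders ADD Region varchar(16) NULL;
GO

-- Step 2: backfill in small batches so the table stays usable while the
-- upgrade runs, instead of holding one giant lock.
DECLARE @rows int;
SET @rows = 1;
WHILE @rows > 0
BEGIN
    UPDATE TOP (5000) dbo.Orders
    SET    Region = 'UNKNOWN'
    WHERE  Region IS NULL;
    SET @rows = @@ROWCOUNT;
END
GO

-- Step 3: only once every row has a value, tighten the constraint.
ALTER TABLE dbo.Orders ALTER COLUMN Region varchar(16) NOT NULL;
GO
```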
You might want to check out Red Gate SQL Source Control.
Are you looking for Visual Studio Database Projects?
I use database projects to store all database objects (tables, views, functions, keys, triggers, indexes across schemas) and keep versioning in TFS. You can build the database to ensure that everything is valid. You can deploy to a fresh database, or do a schema comparison with an existing database.
I also keep all reference and setup data in post deployment scripts which are automatically run after deployment.
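For example, a post-deployment script typically seeds reference data idempotently so it can run after every publish; a small sketch using MERGE (SQL Server 2008+; the status table is hypothetical):

```sql
-- Post-deployment script: runs after every publish, so it must be re-runnable.
-- MERGE inserts missing reference rows and corrects any that have drifted.
MERGE dbo.OrderStatus AS target
USING (VALUES
        (1, 'Pending'),
        (2, 'Shipped'),
        (3, 'Cancelled')
      ) AS source (StatusId, Name)
ON target.StatusId = source.StatusId
WHEN MATCHED THEN
    UPDATE SET Name = source.Name
WHEN NOT MATCHED THEN
    INSERT (StatusId, Name) VALUES (source.StatusId, source.Name);
```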

How to avoid chaotic ASP.NET web application deployment?

Ok, so here's the thing.
I'm developing an existing web application (it started out as a Classic ASP app, so you can imagine :P) under ASP.NET 4.0 and SQL Server 2005. We are 4 developers using local instances of SQL Server 2005 Express, sharing the source code and the Visual Studio database project.
This web app has several "universes" (that's what we call them). Every universe has its own database (currently on the same server), but they all share the same schema (tables, sprocs, etc.) and the same source/site code.
So deploying manually is really annoying, because I have to deploy the source code and then run the SQL scripts by hand on each database. I know that manual deployment can cause problems, so I'm looking for a way to automate it.
We've recently created a Visual Studio Database Project to manage the schema and generate the diff-schema scripts with different targets.
I have no idea how to put the pieces together.
I would like to:
Have a way to make a "sync" deploy to a target server (thankfully I have full RDC access to the servers, so I can install things if required). By "sync" deploy I mean that I don't want to deploy the whole application every time, because it has lots of files; I just want to deploy the files that are new or changed.
Generate diff SQL update scripts for every database target and combine them into a single script. For this I should keep a list of the database names somewhere (see the sketch after this list).
Copy the site files and execute the generated SQL script in an easy, automated way.
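For the database half of this, one plain T-SQL sketch of "run the same change against every universe database" uses the database list plus dynamic execution (the database names and the ALTER statement are placeholders):

```sql
-- The list of universe databases; in practice this would be loaded from
-- the list mentioned above rather than hard-coded.
DECLARE @databases TABLE (Name sysname);
INSERT INTO @databases (Name) VALUES ('Universe1');
INSERT INTO @databases (Name) VALUES ('Universe2');
INSERT INTO @databases (Name) VALUES ('Universe3');

DECLARE @db sysname, @proc nvarchar(300);

DECLARE db_cursor CURSOR FOR SELECT Name FROM @databases;
OPEN db_cursor;
FETCH NEXT FROM db_cursor INTO @db;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Calling DbName.sys.sp_executesql runs the statement in that database's
    -- context, so one script can touch every universe in turn.
    SET @proc = QUOTENAME(@db) + N'.sys.sp_executesql';
    EXEC @proc N'ALTER PROCEDURE dbo.GetCustomers AS SELECT 1;'; -- placeholder change

    FETCH NEXT FROM db_cursor INTO @db;
END

CLOSE db_cursor;
DEALLOCATE db_cursor;
```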
I've read about MSBuild, MS WebDeploy, NAnt, etc., but I don't really know where to start, and I really want to get rid of this manual deployment.
If there is a better and easier way of doing it than what I enumerated, I'll be pleased to read your suggestion.
I know this is not a very specific question, but I've googled a lot about it and I can't seem to figure out how to do it. I've never used any automation tool for deployment.
Any help will be really appreciated,
Thank you all,
Regards
Have you heard of the term Multi-Tenancy? It might be worth looking it up to see if it applies to your "multiverse", especially if one universe is never accessed by another...
See:
http://en.wikipedia.org/wiki/Multitenancy
http://msdn.microsoft.com/en-us/library/aa479086.aspx
UPDATE:
If the application and database are the same for each client (or tenant), I believe there are platforms that may help in providing the same code/DB as a SaaS application, i.e. another application/configuration layer on top that can handle the deployments etc.
I think these are called Platform as a Service (PaaS) applications:
see: http://en.wikipedia.org/wiki/Platform_as_a_service
Multi-Tenancy in your case may be possible, depending on client security requirements, with a bit of work (or a lot of work):
Option 1:
You could use one instance of the application, i.e. deploy the site once and connect to a different database for each client. You would need to differentiate each client by URL and isolate content/data by setting a connection string for each, etc. (This would reduce your site deployments to a single deployment.)
Option 2:
You could create a single instance of the application and use a single database. You would need to add a "TenantID" column to each table and adjust all your code to accept a TenantID to ensure data security/isolation. Again, you would need to detect/differentiate the tenant based on the URL to set the TenantID for the session, used for every database call. (This would reduce your site and database deployments to one of each; a sketch of the schema change follows.)
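A minimal sketch of what Option 2 implies at the schema and query level (the table, columns, and procedure are illustrative):

```sql
-- Stamp every row with the tenant that owns it; existing rows get a default.
ALTER TABLE dbo.Orders ADD TenantID int NOT NULL DEFAULT (0);
GO

-- An index leading with TenantID keeps per-tenant queries cheap.
CREATE INDEX IX_Orders_TenantID ON dbo.Orders (TenantID);
GO

-- Every data-access call takes the tenant resolved from the request URL,
-- and every query filters on it, so isolation is enforced in the database.
CREATE PROCEDURE dbo.GetOrdersForTenant
    @TenantID int
AS
BEGIN
    SELECT OrderId, OrderDate
    FROM   dbo.Orders
    WHERE  TenantID = @TenantID;
END
GO
```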

ASP.NET Web App Distribution

What is the simplest way to distribute an ASP.NET web application? I tried looking at some of the open-source ASP.NET projects out there to see how they distribute their apps and how they do updates, and they seem rather complicated to me (not for me to perform, but for non-technical users). A lot of them entail backing up the entire installed project, deleting specific folders, and saving parts of the web.config. I am hoping to find a solution that makes the update process in particular as simple as possible.
Thanks.
I am working on a project with a similar requirement now. We decided to use WiX to create an installer that can be run on the server or machine where the site is installed. WiX is incredibly powerful, but takes a bit to get the hang of.
There are plenty of other open-source and paid installer technologies as well. Here is a post with some info on a few.
CommunityServer provides a setup MSI that will create a virtual directory, generate the SQL database, and populate it with default data. Updating for point releases, though, is still a manual process involving an update.sql file and having everyone download and then merge binary and static file changes.
They probably could have created an update MSI too, but because so many people customize CommunityServer, it is probably better to let people merge the changes themselves.
Do you mean in terms of breaking up the functionality into tiers that could be handled on separate machines, e.g. having 3 servers for a 3-tier architecture, where one is the DB server, one handles the middleware, and the other handles the requests in ASP.NET? Another point here would be going from one web server to multiple web servers in terms of scaling up.
Or are you referring to deployment?
It's a web application, man. Serve it publicly, require registration, and move on. Isn't that the point of the web application?
