How to organize a collection of demo web applications - ASP.NET

I would like to create and archive a collection of demo ASP.NET Web Forms applications that show projects with certain features, in the sense of "this feature can be implemented like this" -- to be presented to a potential customer.
Before the presentation, I would like to take a selected set of the demos and install them easily on the notebook. Each of the demos will be "frozen". The target notebook is not the customer's; it is one of ours, brought to the customer for the presentation. This way, it can be prepared in advance, in the sense that a named MS SQL Server instance with a fixed name can be ready, etc.
Can you share some experience with such a situation? (I do not want this question to be marked as opinion-based; so please, if you have some explicit links to related documents or explicit suggestions...)
Here are some other facts and initial ideas:
Each of the demo projects uses two databases: xxx_users (the standard ASP.NET authentication...), and xxx_application (and possibly xxx_external) where xxx is a prefix for the specific project.
The demo application is expected to be compiled (binary only, no sources needed for the presentation).
The Web.config files can use local\SQLINSTANCEFORDEMOS in their connection strings.
The SQL instance has a fixed name, a fixed administrator account (like sa), and a fixed password for logging in to the SQL instance. This way, it can be included in the Web.config files.
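If a dedicated login is preferred over sa, a one-time setup sketch for the demo instance could look like this (the login name and password are placeholders, not anything prescribed):

    -- One-time setup on local\SQLINSTANCEFORDEMOS; demo_admin and the password are hypothetical.
    CREATE LOGIN demo_admin WITH PASSWORD = N'Fixed_Demo_Pa55word!', CHECK_POLICY = OFF;

    -- Grant full rights so every demo's Web.config can use the same credentials.
    ALTER SERVER ROLE sysadmin ADD MEMBER demo_admin;  -- SQL Server 2012+; use sp_addsrvrolemember on older versions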
The sample data can be fairly big (not extremely tiny).
The application will use its own SQL tables in the xxx_application database.
The outer database accessed from the web application will be simulated by the xxx_external database.
This way, I should be able to create and archive SQL backups of the xxx_users, xxx_application, and xxx_external databases, plus an archive of the web app binaries.
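The archiving itself would then be plain T-SQL backups; a minimal sketch, with placeholder paths and the xxx prefix standing in for a concrete project:

    -- Archive the databases of one demo project (paths are placeholders).
    BACKUP DATABASE xxx_users       TO DISK = N'D:\DemoArchive\xxx_users.bak'       WITH INIT;
    BACKUP DATABASE xxx_application TO DISK = N'D:\DemoArchive\xxx_application.bak' WITH INIT;
    BACKUP DATABASE xxx_external    TO DISK = N'D:\DemoArchive\xxx_external.bak'    WITH INIT;

    -- Before the presentation, restore into the fixed demo instance;
    -- WITH MOVE may be needed if the notebook's data paths differ.
    RESTORE DATABASE xxx_users
        FROM DISK = N'D:\DemoArchive\xxx_users.bak'
        WITH REPLACE;  -- overwrite any previous copy of this demo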
Have you ever encountered this situation? Is the approach reasonable? Could you share some better ideas?

Related

Single website, single database schema multiple copies

I know the logic I am thinking of may be a mess, but the scenario is that I have a website which has 8 copies -- same website, same database schema -- published at different places on the same server.
My problem is that I have about 50 stored procedures (per database), so what can I do to make maintenance easier?
Every time I modify one stored procedure, must I modify it in 8 places? Must I change the web.config file 8 times for each publish operation?
I am thinking about making a simple CMS in which I will store the publish directories; on a button click it will publish the whole project to the selected directories. But I still have the problem of the stored procedures. I am thinking of altering all the stored procedures on the first publish.
Any suggestion is welcome. For now I have only one copy, and you know it is hard to maintain once the decision has been taken.
BTW the website is ASP.NET Web Forms but I can port it to any new .NET web technology like MVC or .NET Core.
If you have full control over the hosting then a "multi-tenant database design" as suggested by Dai would be a better approach. However if the circumstances require multiple instances then you can do the following:
Store a version number in your database so you know which scripts have run and which haven't.
Script any schema changes (which include any stored procedures), potentially seed data, and include the new database version number.
Store the scripts in a directory under the website.
Have a maintenance routine (manual or automatic) which runs any new scripts that haven't been run before, by comparing the version numbers (a minimal sketch follows below).
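A minimal T-SQL sketch of that approach, assuming a hypothetical SchemaVersion table (the procedure and table names are examples); each numbered script guards itself, so running it twice is harmless:

    -- Created once per database.
    CREATE TABLE dbo.SchemaVersion (
        Version   int       NOT NULL PRIMARY KEY,
        AppliedOn datetime2 NOT NULL DEFAULT SYSUTCDATETIME()
    );

    -- Upgrade script #2: only runs if it hasn't run before.
    IF NOT EXISTS (SELECT 1 FROM dbo.SchemaVersion WHERE Version = 2)
    BEGIN
        -- CREATE OR ALTER needs SQL Server 2016 SP1+; use DROP/CREATE on older versions.
        EXEC(N'CREATE OR ALTER PROCEDURE dbo.GetOrders AS
                   SELECT OrderId, CustomerId FROM dbo.Orders;');
        INSERT INTO dbo.SchemaVersion (Version) VALUES (2);
    END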

CMS - How to work with multiple environments? Do I really need them?

I've never worked with any CMS and I simply wanted to play with one. As I originally come from .NET roots, I was thinking about choosing Orchard Core CMS.
Let's imagine a very simple scenario: together with my colleague I'd like to create a blog. As I'm used to working with web-based systems and applications for business, for me it's kind of normal to work with a code repository, have multiple environments (dev/test/stage/prod), implement CI/CD, and adjust the database via migrations or scripts.
Now the question is: do I need all of this when working on our blog with a CMS?
To be more specific, I can ask a few questions:
Shall I create the blog using the CMS locally (my PC), create a few articles, and then deploy it to the web -- or should I create the blog directly on the internet and add articles in the prod environment?
How do I synchronize databases between environments (dev/prod)?
I can add that, as I do not expect many visitors on the website, I was thinking of using Orchard Core CMS together with SQLite. I also expect to be able to customize code, add new modules, extend existing ones, etc. -- not only add content (articles). You can take that into consideration when answering the question.
So basically my question is: what should the workflow be for a person who wants to create, administer, and maintain a CMS (let it be a blog), whether as a single person or as a team?
Shall I work and create content locally, then publish it and somehow synchronize both the application and the database (the database is my main question mark -- also in the context of how to do that properly using SQLite)?
Or should all the changes -- code + content -- simply be managed directly on a server, let's call it the production environment?
Excuse me if the question is silly or hard to understand, but I'm looking for any advice, as I really didn't find any good examples or information about this -- or maybe I'm searching in totally the wrong direction.
Thanks in advance.
Great question, not at all silly ;)
When dealing with a CMS, you need to think about the data/content in very different terms from the code/modules, despite the fact that the boundary between them is not always completely obvious.
For Orchard, the recommendation is not to install modules in production, but to have a dev - staging - production type of environment: install new modules on a dev environment, test them in staging, and then deploy to production when it's safe to do so. Depending on the scale of the project, the staging may be skipped for a more agile dev to prod setting but the idea remains the same, and is not very different from any modular application.
Then you have the activation and configuration of the settings of the modules you deploy. Because in a CMS like Orchard, those settings are considered data and stored in the database, they should be handled like content. This includes metadata such as the very shape of the content of your site: content types are data.
Data is typically not deployed like code is, with staging and prod environments (although it can be, to a degree; more on that in a moment). One reason for this is that a CMS will often feature user-provided data, such as reviews, ratings, comments or usage stats. Synchronizing all of that in both directions is very impractical. Another, even more important reason is that the very point of using a CMS is to let non-technical owners of the site manage content themselves in a fast and direct manner.
The difference between code and data is also visible in the way you secure their changes: for code, usual source control is still the rule, whereas for content, you'll set up database backups.
Also important to mention is the structure of the database. You typically don't have to worry about this until you write your own modules: Orchard comes with a rich data migration feature that makes sure the database structure gets updated with the code that uses it. So don't worry about that, the database will just update itself as you deploy code to production.
Finally, I must mention that some CMS sites do need to be able to stage content and test it before exposing it to end-users. There are variations of that: in some cases, being able to draft and preview content items is enough. Orchard supports that out of the box: any content type can be marked draftable. When that is not enough, there is an optional feature called Deployments that enables rich content deployment workflows that can be repeated, scheduled and validated. An important point concerning that module is that the deployment only applies to the subset of the site's content you decide it should apply to (and excludes, obviously, things like user-provided content).
So in summary, treat code and modules as something you deploy in a one-way fashion from the dev box all the way to production, with ordinary source control and deployment methods, and treat data depending on the scenario, from simple direct in production database instances with a good backup policy, to drafts stored in production, and then all the way to complex content deployment rules.

Best Practice for maintaining a TSQL database creation script for a web application

We have an ASP.NET web application and need to maintain the database creation and initialization script.
Are there any industry best practices for maintaining database creation and initialization scripts? I can think of two main approaches:
Maintain a T-SQL creation script directly by hand.
Maintain a master database and generate the script from it, which is then checked into SourceSafe.
Also, the script should be trackable through source control, i.e. the table order should be controllable.
If possible, it should also be able to track initialization data, either in the same or a separate script.
Currently we generate the script from Management Studio, but the order of the tables seems to be random.
And the more automated the solution the better.
The problem is not maintaining the script, nor maintaining a 'master' copy of the database. The real problem is upgrading existing database(s). You make your modifications in the developer environment; they are then propagated to the test environment, and finally pushed into the production environment. While at the developer and test stages it is possible to start from scratch, in production you always have to upgrade the existing deployment.
In my experience the best practice is to use upgrade scripts. This practice is useful even with a single deployed site, but it becomes invaluable with multiple locations that may be at different versions. But even with one single operational site, it is still useful to be able to test the upgrade repeatedly (starting from backups of the current version), keep the changes in source control, and have a well-formalized and peer-reviewed change procedure (the upgrade script). Upgrade scripts can also be tailored to the specific needs of the operational site, like handling a large table with special care, or dealing with encrypted data, or any of the myriad details that diff-based tools neglect or ignore. The main disadvantage is that the scripts have to be written, which requires real T-SQL knowledge (forget all the 'designers' in your favorite management tool).
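As an illustration of the hand-written style (the table and column names here are hypothetical, not from the question), an upgrade script might look like this:

    -- Upgrade: add a column and backfill it, atomically.
    SET XACT_ABORT ON;  -- any error rolls the whole upgrade back
    BEGIN TRANSACTION;

    ALTER TABLE dbo.Customers ADD CountryCode char(2) NULL;

    -- DML touching the new column runs in its own batch via EXEC,
    -- because the column does not exist when this script is compiled.
    -- For a large table, this is where you would batch the updates.
    EXEC(N'UPDATE dbo.Customers SET CountryCode = ''US'' WHERE CountryCode IS NULL;');

    COMMIT TRANSACTION;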
You might want to check out RedGate SQL Source Control.
Are you looking for Visual Studio Database Projects?
I use database projects to store all database objects (tables, views, functions, keys, triggers, indexes across schemas) and keep versioning in TFS. You can build the database to ensure that everything is valid. You can deploy to a fresh database, or do a schema comparison with an existing database.
I also keep all reference and setup data in post deployment scripts which are automatically run after deployment.
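As an example of such a post-deployment script, reference data is typically written idempotently so it is safe to run on every deployment (dbo.OrderStatus is a hypothetical table; MERGE needs SQL Server 2008+):

    -- Seed/realign reference data; harmless to run repeatedly.
    MERGE dbo.OrderStatus AS target
    USING (VALUES (1, N'Pending'), (2, N'Shipped'), (3, N'Cancelled'))
          AS source (StatusId, Name)
    ON target.StatusId = source.StatusId
    WHEN MATCHED THEN UPDATE SET Name = source.Name
    WHEN NOT MATCHED THEN INSERT (StatusId, Name) VALUES (source.StatusId, source.Name);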

DB advice and best practices for ASP.NET based web site?

I have a web site I developed for displaying the results of some data analysis work I did. It relied on ASP.NET for the front end and connected to a MySQL back end utilising Entity Framework and LINQ extensively.
I chose MySQL because I personally have used it in the past and like the database, but this resulted in some serious issues when I had to deploy it to a hosting provider (incompatible connectors, access rights, etc.)
I am now getting ready to redevelop and expand the site and I am looking for some advice to avoid the issues I had last time.
The new DB has to serve two roles. The first is to be a data provider for the charts that are the output of the analysis work. The schema is straightforward, almost flat files: about 10 tables. One table has roughly 200k rows of data; the rest have approx. 1,200 rows each. There are few references or queries between the tables, but there are some. This data is updated periodically by a back-end process and does not need to be added to or edited by the user.
The second role of the DB would be as a basic persistent store for a standard user management system. It would need to manage data for adding/removing clients, user names, passwords, access rights, etc. No financial or highly sensitive data is involved.
What database approach would you recommend that would give me easy deployment and management at a web host and still allow me to use both Entity Framework and LINQ effectively?
Second, what tools/frameworks should I consider as I rewrite this system. It is very graphical and data focused. Presentation of charts and information is the key factor in this site. Are there any new technologies or frameworks that would add specific value to what I am doing?
A few notes. I am a one man shop and I maintain the entire system myself so I am less worried about enterprise level frameworks than other people. My focus is on the easy development and deployment of the site. Maintainability is also a key factor.
I am also an experienced C# developer, but new to ASP.NET and the web side of things. The first version of this site was a big learning experience. It was good, but I wasted an enormous amount of time on just understanding new technologies and approaches. I am very open to learning, but I can't afford the time to get my head around a complete paradigm shift.
I am looking forward to your thoughts, thanks.
Doug
The natural choice would be SQL Server. I'd guess from your description that you are way under the maximum space limit of the SQL Server Express edition. It of course supports Entity Framework, and the drivers are part of the .NET Framework, so no problem with third-party assemblies here.
This will also open up the possibility of hosting your app in the cloud (Azure) later on, because SQL Azure is in fact a Microsoft SQL Server, so there is no overhead in supporting it.
Regarding user management: ASP.NET has this all built in (Membership, Role and Profile providers), and also a SQL provider for which default tables are available. So you don't have to design the tables yourself, and it runs very naturally on SQL Server.

How to avoid chaotic ASP.NET web application deployment?

Ok, so here's the thing.
I'm developing an existing web application (it started as a classic ASP app, so you can imagine :P) under ASP.NET 4.0 and SQL Server 2005. We are 4 developers using local instances of SQL Server 2005 Express, sharing the source code and a Visual Studio database project.
This web app has several "universes" (that's what we call them). Every universe has its own database (currently on the same server), but they all share the same schema (tables, sprocs, etc.) and the same source/site code.
So deploying manually is really annoying, because I have to deploy the source code and then run the SQL scripts by hand on each database. I know that manual deployment can cause problems, so I'm looking for a way to automate it.
We've recently created a Visual Studio Database Project to manage the schema and generate the diff-schema scripts with different targets.
I have no idea how to put the pieces together.
I would like to:
Have a way to make a "sync" deploy to a target server (thankfully I have full RDC access to the servers, so I can install things if required). By "sync" deploy I mean that I don't want to fully deploy the whole application every time, because it has lots of files; I just want to deploy those that are new or changed.
Generate diff-SQL update scripts for every database target and combine them into just one script. For this I should have a list of the database names somewhere (a rough sketch of applying one script to all of them follows below).
Copy the site files and execute the generated SQL script in an easy and automated way.
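For the database part, one low-tech approach is a cursor over the universe databases, assuming a hypothetical dbo.Universes table in a central database that lists their names (the procedure being altered is also just an example):

    -- Apply the same change script to every universe database (works on SQL Server 2005).
    DECLARE @db sysname, @sql nvarchar(max);

    DECLARE universes CURSOR FOR
        SELECT DatabaseName FROM dbo.Universes;  -- hypothetical list of targets

    OPEN universes;
    FETCH NEXT FROM universes INTO @db;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- The inner EXEC gives the change its own batch inside the target database.
        SET @sql = N'USE ' + QUOTENAME(@db) + N';
            EXEC(N''ALTER PROCEDURE dbo.GetUsers AS SELECT UserId, Name FROM dbo.Users;'');';
        EXEC (@sql);
        FETCH NEXT FROM universes INTO @db;
    END

    CLOSE universes;
    DEALLOCATE universes;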
I've read about MSBuild, MS WebDeploy, NAnt, etc., but I don't really know where to start, and I really want to get rid of this manual deployment.
If there is a better and easier way of doing it than what I enumerated, I'll be pleased to read your options.
I know this is not a very specific question, but I've googled a lot about it and I still cannot figure out how to do it. I've never used any automation tool for deployment.
Any help will be really appreciated,
Thank you all,
Regards
Have you heard of the term multi-tenancy? It might be worth looking it up to see if it applies to your "multiverse", especially if one universe is never accessed by another...
See:
http://en.wikipedia.org/wiki/Multitenancy
http://msdn.microsoft.com/en-us/library/aa479086.aspx
UPDATE:
If the application and database are the same for each client (or tenant), I believe there are applications that may help in providing the same code/db as a SaaS application, i.e. another application/configuration layer on top that can handle the deployments etc.
I think these are called Platform as a Service (PaaS) applications:
see: http://en.wikipedia.org/wiki/Platform_as_a_service
Multi-Tenancy in your case may be possible, depending on client security requirements, with a bit of work (or a lot of work):
Option 1:
You could use one instance of the application, i.e. deploy the site once and connect to a different database for each client. You would need to differentiate each client by URL to isolate content/data, by setting a connection string for each, etc. (This would reduce your site deployments to one.)
Option 2:
You could create a single instance of the application and use a single database. You would need to add a "TenantID" to each table and adjust all your code to accept a TenantID, to ensure data security/isolation. Again you would need to detect/differentiate the tenant based on the URL, to set the TenantID for the session that is used for every database call. (This would reduce your site and database deployments to one of each.)
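A small T-SQL sketch of what Option 2 implies at the database level (table and column names are hypothetical):

    -- Every tenant-owned table carries a TenantID; the DEFAULT fills existing rows.
    ALTER TABLE dbo.Orders ADD TenantID int NOT NULL
        CONSTRAINT DF_Orders_TenantID DEFAULT (0);

    -- Every query must then filter on the tenant resolved from the URL.
    DECLARE @TenantID int = 42;  -- in practice, supplied by the application per request
    SELECT OrderId, Total
    FROM dbo.Orders
    WHERE TenantID = @TenantID;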
