TFS DLM: Should Database and Application be separate Build definitions?

I am trying to set up an application and database in TFS 2017, for a standard Windows application and a SQL Database Project.
Should I combine the application project and the SQL Database Project into the same solution and build them together on TFS in the same build definition?
I am thinking these should be separate solutions in TFS with separate build definitions; we can include multiple artifact references in a TFS Release to deploy them both simultaneously.
Does this make sense?
Or does it make more sense to combine the solution and build definition into the same units of work?

A few questions to ask yourself:
Do the database changes need to go out with the code changes? Probably combine them.
Is the DB something maintained by another group that you just keep in sync? Maybe split them up.
Is the database build adding a lot of time to your CI? Maybe split them up to improve performance, or cache the DacPac between builds.
If they are in separate solutions, how are you planning to do code reviews or branching?
I've usually combined them, but I have been thinking about something closer to the third question to speed up builds.

A solution is a structure for organizing projects in Visual Studio. Whether to combine the application project and the SQL Database Project into the same solution depends on your project structure.
On the TFS side, it's fine to use one build definition; if you want to build only one project, you can specify that particular project in the build step.

Related

BizTalk common schema

As per best practice I have separated my BizTalk solution into projects based on artifacts (schemas, pipelines, maps, etc.). I've also separated business processes into solution folders. I have created a common project to hold schemas that need to be available to each process, and referenced these when needed... so far so good.
When I deploy, it deploys the common schemas and each reference, resulting in multiple copies of the schemas. If I try to untick the dependency in the project assembly I get the error
This dependency was added by the project and cannot be removed.
Am I missing something?
Visual Studio 2012, BizTalk Server Dev Ed 2013.
Really, same answer as in the other post.
Consider the Visual Studio Solution as one Deployment Unit and build your processes around that. Meaning all the Projects, Schemas, Maps, and Orchestrations would always go out together, even if only one changed.
I try really hard to not share Schemas across Solutions specifically because of the Deployment issues. I do this even if it means duplicate or essentially duplicate Schemas. 99% of the time, the only thing that breaks is the automatic Schema resolution in the Xml Disassembler and that is easily solvable.
"When I deploy it will deploy the common schema and each reference - resulting in multiple schema"
Sorry, this part doesn't seem right. If you have a common Schema project, there shouldn't be duplicates.

Best Practice for maintaining a TSQL database creation script for a web application

We have an ASP.NET web application and need to maintain the database creation and initialization script.
Are there any industry best practices for maintaining database creation and initialization scripts? I can think of two main approaches:
Maintain a T-SQL creation script directly by hand.
Maintain a master database and generate the script from it, which is then checked into SourceSafe.
The script should also be trackable through source control, i.e. the table order should be controllable.
If possible it should also be able to track initialisation data, either in the same script or a separate one.
Currently we generate the script from Management Studio, but the order of the tables seems to be random.
And the more automated the solution the better.
The problem is not maintaining the script, nor maintaining a 'master' copy of the database. The real problem is upgrading existing database(s). You make your modifications in the development environment, and they are then propagated to the test environment and finally pushed into the production environment. While in the development and test environments it is possible to start from scratch, in production you always have to upgrade the existing deployment.
In my experience the best practice is to use upgrade scripts. This practice is useful even with a single deployed site, but it becomes invaluable with multiple locations that may be at different versions. Even with one operational site it is still useful to be able to test the upgrade repeatedly (starting from backups of the current version), keep the changes in source control, and have a well-formalized, peer-reviewed change procedure (the upgrade script). Upgrade scripts can also be tailored to the specific needs of the operational site, like handling a large table with special care, or dealing with encrypted data, or any of the myriad details that diff-based tools neglect or ignore. The main disadvantage is that the scripts have to be written, which requires real T-SQL knowledge (forget all the 'designers' in your favorite management tool).
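To make that concrete, here is a minimal sketch of a versioned upgrade script; the dbo.SchemaVersion table, the version number, and the Customer change are all hypothetical examples, but the guard/apply/record pattern is the core of the practice:

```sql
-- Minimal upgrade-script sketch. dbo.SchemaVersion is a hypothetical
-- bookkeeping table; the guard makes the script safe to re-run.
IF NOT EXISTS (SELECT * FROM dbo.SchemaVersion WHERE Version = 42)
BEGIN
    BEGIN TRANSACTION;

    -- The actual schema change for this revision (example only).
    ALTER TABLE dbo.Customer ADD Email nvarchar(256) NULL;

    -- Record that this upgrade has been applied.
    INSERT INTO dbo.SchemaVersion (Version, AppliedAt)
    VALUES (42, GETUTCDATE());

    COMMIT TRANSACTION;
END
```

Because the whole change runs inside one guarded transaction, the script can be re-run against any environment (dev, test, or a restored production backup) and will only apply itself once.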
You might want to check out RedGate SQL Source Control.
Are you looking for Visual Studio Database Projects?
I use database projects to store all database objects (tables, views, functions, keys, triggers, indexes across schemas) and keep them versioned in TFS. You can build the database to ensure that everything is valid. You can deploy to a fresh database, or do a schema comparison with an existing database.
I also keep all reference and setup data in post deployment scripts which are automatically run after deployment.
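As an illustration of such a post-deployment script, a common pattern is an idempotent MERGE that reconciles the reference data on every deployment; the dbo.OrderStatus table and its rows below are hypothetical, not from any particular project:

```sql
-- Post-deployment script sketch: idempotent seeding of a lookup table.
-- dbo.OrderStatus and its values are hypothetical examples.
MERGE dbo.OrderStatus AS target
USING (VALUES
    (1, N'Pending'),
    (2, N'Shipped'),
    (3, N'Cancelled')
) AS source (StatusID, Name)
ON target.StatusID = source.StatusID
WHEN MATCHED AND target.Name <> source.Name THEN
    UPDATE SET Name = source.Name
WHEN NOT MATCHED BY TARGET THEN
    INSERT (StatusID, Name) VALUES (source.StatusID, source.Name)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;
```

Because the script converges the table to the source list rather than blindly inserting, it is safe to run after every deployment, which is exactly how SSDT invokes it.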

ASP.NET integration Environment

All,
My dev team and I would like to set up a development environment for our ASP.NET projects. By development environment I do not mean Visual Studio. I mean that we have a Database Server, an Application Server, and a Web Server in a 'Development Environment'.
We want to use this as our integration environment, where the developers all work on their parts of the ASP.NET applications and then we can push our new changes up to test them as a whole.
My question is: what is the best way to deploy our code together without stepping on each other's toes?
Thanks.
Team Foundation Server is a good candidate for this.
You need a source code control methodology and with it you'll get the benefits you're searching for. SVN and other solutions in this space offer "conflict resolution" to avoid inadvertent overwriting/toe squashing.
Set up a Subversion repository and get all of the developers up to speed on SVN and using it.
Once you have your source under control you can consider setting up a continuous integration server which can build your code and deploy to your target environment in batch. Organizing your project code properly into trunk, tags and branches per solution will make it very easy to control what is deployed or redeployed to your dev environment at any given time.
There are other options for source code control (git, tfs, and many others) but they all offer close to the same features... SVN is one of the nicer options because it's open source, free and stable.
Another thing to consider is keeping your database schema changes in sync with your code changes. Consider using migrator.net or a similar solution to enable your team to keep everything in sync through revisions, including database state.
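As a rough sketch of the bookkeeping such migration tools rely on (the table and names here are hypothetical, not migrator.net's actual schema), each applied revision is recorded so the runner knows which migrations still need to run:

```sql
-- Hypothetical migration-history table; real tools define their own layout.
CREATE TABLE dbo.MigrationHistory (
    MigrationID bigint   NOT NULL PRIMARY KEY,  -- e.g. a timestamp: 20100115093000
    AppliedAt   datetime NOT NULL DEFAULT (GETUTCDATE())
);
GO

-- The runner applies a migration only if it has not been recorded yet.
IF NOT EXISTS (SELECT * FROM dbo.MigrationHistory WHERE MigrationID = 20100115093000)
BEGIN
    -- ...this revision's schema and data changes go here...
    INSERT INTO dbo.MigrationHistory (MigrationID)
    VALUES (20100115093000);
END
```

This is what lets every developer's local database, and the shared integration database, be brought to the same revision as the code they just pulled.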

How to avoid chaotic ASP.NET web application deployment?

Ok, so here's the thing.
I'm maintaining an existing web application (it started as an ASP classic app, so you can imagine :P) under ASP.NET 4.0 and SQL Server 2005. We are four developers using local instances of SQL Server 2005 Express, sharing the source code and the Visual Studio database project.
This web app has several "universes" (that's what we call them). Every universe has its own database (currently on the same server), but they all share the same schema (tables, sprocs, etc.) and the same source/site code.
So deploying manually is really annoying, because I have to deploy the source code and then run the SQL scripts manually on each database. I know that manual deployment can cause problems, so I'm looking for a way to automate it.
We've recently created a Visual Studio Database Project to manage the schema and generate the diff-schema scripts with different targets.
I have no idea how to put the pieces together.
I would like to:
Have a way to make a "sync" deploy to a target server (thankfully I have full RDC access to the servers, so I can install things if required). By "sync" deploy I mean that I don't want to fully deploy the whole application, because it has lots of files and I just want to deploy those that are new or changed.
Generate SQL diff/update scripts for every database target and combine them into just one script. For this I should have a list of the database names somewhere (see the T-SQL sketch after this list).
Copy the site files and execute the generated SQL script in an easy and automated way.
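One low-tech way to fan a single upgrade script out to every "universe" database is dynamic SQL driven by a naming convention. This is only a sketch; the 'Universe%' prefix is a hypothetical convention standing in for however the database names are actually listed, and the per-database diff scripts themselves would still come from the database project's schema-compare output:

```sql
-- Sketch: run the same upgrade batch against every "universe" database.
-- Assumes (hypothetically) that universe databases share a name prefix.
DECLARE @db  sysname,
        @sql nvarchar(max);

DECLARE universe_dbs CURSOR LOCAL FAST_FORWARD FOR
    SELECT name FROM sys.databases WHERE name LIKE N'Universe%';

OPEN universe_dbs;
FETCH NEXT FROM universe_dbs INTO @db;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- USE inside dynamic SQL scopes the rest of that batch to the database.
    SET @sql = N'USE ' + QUOTENAME(@db) + N';
        -- ...the generated upgrade script goes here...
        PRINT ''Upgraded '' + DB_NAME();';
    EXEC (@sql);

    FETCH NEXT FROM universe_dbs INTO @db;
END

CLOSE universe_dbs;
DEALLOCATE universe_dbs;
```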
I've read about MSBuild, MS Web Deploy, NAnt, etc., but I don't really know where to start, and I really want to get rid of this manual deploy.
If there is a better and easier way of doing it than what I enumerated, I'll be pleased to read your suggestions.
I know this is not a very specific question but I've googled a lot about it and it seems I cannot figure out how to do it. I've never used any automation tool to deploy.
Any help will be really appreciated,
Thank you all,
Regards
Have you heard of the term multi-tenancy? It might be worth looking that up to see if it applies to your "universes", especially if one universe is never accessed by another...
See:
http://en.wikipedia.org/wiki/Multitenancy
http://msdn.microsoft.com/en-us/library/aa479086.aspx
UPDATE:
If the application and database are the same for each client (or tenant), I believe there are platforms that may help in providing the same code/DB as a SaaS application, i.e. another application/configuration layer on top that can handle the deployments etc.
I think these are called Platform as a Service (PaaS) applications:
see: http://en.wikipedia.org/wiki/Platform_as_a_service
Multi-Tenancy in your case may be possible, depending on client security requirements, with a bit of work (or a lot of work):
Option 1:
You could use one instance of the application, i.e. deploy the site once and connect to a different database for each client. You would need to differentiate each client by URL to isolate content/data, by setting a connection string for each, etc. (This would reduce your site deployments to one deployment.)
Option 2:
You could use a single instance of the application and a single database. You would need to add a "TenantID" to each table and adjust all your code to accept a TenantID to ensure data security/isolation. Again you would need to detect/differentiate the tenant based on the URL to set the TenantID for the session, used for every database call. (This would reduce your site and database deployments to one of each.)
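To make Option 2 concrete, here is a minimal sketch of tenant-scoping; dbo.Orders, the column names, and the procedure are all hypothetical examples:

```sql
-- Sketch of Option 2: one shared database, every row tagged with a tenant.
-- dbo.Orders and all names here are hypothetical.
ALTER TABLE dbo.Orders
    ADD TenantID int NOT NULL
        CONSTRAINT DF_Orders_TenantID DEFAULT (0);
GO

-- Every query must then be scoped to the current tenant, which the
-- application resolves from the request URL and passes as a parameter.
CREATE PROCEDURE dbo.GetOrdersForTenant
    @TenantID int
AS
BEGIN
    SET NOCOUNT ON;
    SELECT OrderID, OrderDate, Total
    FROM dbo.Orders
    WHERE TenantID = @TenantID;
END
```

The hard part is not the schema change but auditing every existing query so none of them can leak another tenant's rows, which is why this option is "a lot of work" for an existing codebase.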

.NET automated build with cruisecontrol.net + nant - multiple assembly structure / best practice

I'm doing some work with several shared .NET assemblies and a generic web application that I would like to handle better in our CC.NET/NAnt build environment.
Currently, we have several .NET assemblies (shared common code that we use in client projects) that exist in different .NET solutions within different repositories in our SCM (Vault incidentally). They are all configured under CC.NET separately so we have a decent amount of control over their build and deployment at present.
We have developed a CMS system that uses some of the .NET assemblies and includes a common administration website project and a template website example project. Out of this one solution we have the following elements that need to managed separately:
The admin interface is not tied to .NET, so it is template based; we are currently developing a PHP backend for it.
A CMS shared assembly built on top of our other common, company-wide assemblies.
Control over functionality within each major CMS build/release.
I'd like the build output of this solution to be a Visual Studio template, which we can use to develop other client sites and better manage version changes within the CMS itself, as we add features to the codebase.
I have a rough approach for all this and think it is achievable, however, I wanted to open this topic up for discussion and see what everyone else is doing when it comes to managing the build and deployment of multiple solutions.
Main considerations for us are:
Do we make use of the integration queue functionality in CC.NET to ensure a build order and pull together the assemblies we need for the CMS at build time?
Debugging within a CMS client site i.e. stepping into the shared assemblies' code when the client solution is a version of the base CMS system and therefore separate.
Developing and extending the CMS when it uses shared assemblies i.e. do we add the assembly projects to the trunk solution during development (across source control repositories) and then rely on the build to pull it together or do we use a different approach entirely?
Any other issues people might have experienced that could change our way of thinking?
Hopefully this question isn't too vague and some of you will have dealt with these issues. I look forward to hearing everyone's experiences.
Many thanks!
Tim
I unfortunately cannot answer all of your points, but let me start with this one:
Do we make use of the integration queue functionality in CC.NET to ensure a build order and pull together the assemblies we need for the CMS at build time?
The short answer is: yes, you should. The queue attribute ensures a build order within the running instance of CC.NET and gives you serialization of the builds that depend on each other. For specifying which projects depend on each other, you should use project triggers. Do not rely on queuePriority for this task.
You should most likely pull the pieces you need at build time, unless you have time constraints on your individual builds.
Re:
Developing and extending the CMS when it uses shared assemblies i.e. do we add the assembly projects to the trunk solution during development (across source control repositories) and then rely on the build to pull it together or do we use a different approach entirely?
I'm fundamentally against distributing binaries in the trunk, unless it's a library that does not need to be updated/changed on a frequent basis. If you build the shared assemblies yourself, you should consider pulling them from the artifacts on the build server(s).
