Publishing Databases using DacPac in Visual Studio 2013 - asp.net

I need some clarification on when to register a Database as a Data Tier Application (DAC). I've looked at all the guides but am stuck on a few points. My current process is:
The database is NOT registered
Build Database Project to produce DacPac
Publish the Database Project
Check "Register as a Data Tier Application"
Check "Block publish when database has drifted from registered version"
The first time round this works: it registers the database and the publish succeeds.
However, subsequent publishes fail, saying the DB has drifted and flagging two users which have not changed.
Am I following the correct process? i.e. setting the Publish script to re-register each time?
What is the best practice for making changes? By changing the relevant .sql files in the Database Project and then building? The guides talk a lot about being able to version the DB using the DacPac but it's not clear how. Should I rename each DacPac and commit it to TFS?
My next step is to publish the Database as part of the overall ASP.Net solution. When I try to do that (it works fine when the DB publish is not included), it comes up with the following error:
Web deployment task failed. (The SQL provider cannot run with dacpac option because of a missing dependency. Please make sure that DacFx is installed. Learn more at: http://go.microsoft.com/fwlink/?LinkId=221672#ERROR_DACFX_NEEDED_FOR_SQL_PROVIDER.)
However, I have all the required elements installed on the publishing machine. Do they need to be on the SQL Server or IIS VMs?
Any guidance would be much appreciated.

If you want to deploy your changes to a database using a dacpac, you need to register the database as a DAC. This basically creates a snapshot of the database at that point in time. You do this before making any changes, to create the initial snapshot, and then again after each deployment.
The reason you do this is to detect drift. Let's say you do a deployment and someone then makes a change directly in that database, for instance changing the logic of a stored procedure. You would want to know about that change before making a subsequent deployment, because if you deploy your dacpac and ignore it, their change will be reverted to what's in the dacpac model. This is where drift occurs. You can generate an XML report of what has drifted through the SDK.
You can enable a setting to block the deployment if drift has occurred, so that you can retrofit the changes made directly in the database back into your source code. You would then need to re-register the database as a DAC to create a new snapshot.
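As a rough sketch of how the same thing looks from the command line (assuming SqlPackage.exe is on your path; server, database and file names below are placeholders), you can pull a drift report and publish with the same register/block options:
# Produce an XML report of what has drifted since the database was last registered.
SqlPackage.exe /Action:DriftReport /TargetServerName:MyServer /TargetDatabaseName:MyDatabase /OutputPath:drift-report.xml
# Publish the dacpac, re-registering the DAC and blocking the publish if drift is detected.
SqlPackage.exe /Action:Publish /SourceFile:MyDatabase.dacpac /TargetServerName:MyServer /TargetDatabaseName:MyDatabase /p:RegisterDataTierApplication=True /p:BlockWhenDriftDetected=True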
Am I following the correct process? i.e. setting the Publish script to re-register each time? Yes
What is the best practice for making changes? By changing the relevant .sql files in the Database Project and then building? Yes
The guides talk a lot about being able to version the DB using the DacPac but it's not clear how. Should I rename each DacPac and commit it to TFS? You can set a version within the database project - have a look at the project properties. You shouldn't rename the dacpac.
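If you want to stamp a new version on each build without hand-editing the project, note that the version is stored as an MSBuild property in the .sqlproj (DacVersion, if I remember correctly - treat the property name as an assumption and check your .sqlproj), so a build server can override it. A hedged sketch with a placeholder project name:
# Build the database project with an overridden dacpac version (DacVersion is assumed to be the property SSDT writes to the .sqlproj).
msbuild MyDatabase.sqlproj /p:Configuration=Release /p:DacVersion=1.2.0.0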
About the ASP.Net publish, I would need a bit more detail around the project structure and environment setup.

Related

How can I deploy only a select set of stored procedures in a DACPAC deployment?

I have a Visual Studio project which contains a database project. I create an executable which performs a software update, and part of that update is to update the database. Some of the stored procedures depend on a linked server existing, which also gets created as part of the executable. The problem is that this functionality is optional, and the linked server won't connect on some client machines, so the DACPAC deployment fails because the linked server can't connect. I am using sqlpackage.exe to deploy the .dacpac file.
Is there some way that I can deploy either all or only some of the stored procedures? Or maybe I can set a flag to ignore linked server errors? Or maybe there is an alternative method to using sqlpackage/dacpac?
One option I thought of is to convert the stored procedures that contain the linked server to dynamic SQL.
Having the database in Visual Studio, and therefore in source control, is important.
Yes!
This is fairly easy to do from your database project in Visual Studio. I would recommend removing the problematic stored procs and merging that change back into master. Then take out a feature branch, point it at the DB that has the stored procs, and use schema compare to bring them back in (even the ones that don't work, so that you don't lose them). Push that commit up to the feature branch. Now that you have the problematic stored procs in source control plus the shippable version in master, you can use the database project's "Publish" in Visual Studio to push only the selected objects to the DBs you want.
If you haven't checked anything in to master, you can do the schema compare, select all objects except the problematic ones, update your database project, and merge that to master. If this doesn't make sense, please comment on this answer and I'm happy to give more detail.
Well, I came across this. I'm still working on implementing it to solve my problem, but it might help your cause too.
Download the filter from https://agilesqlclub.codeplex.com/releases/view/610727, put the dll into the same folder as sqlpackage.exe, and add these command line parameters to your deployment:
/p:AdditionalDeploymentContributors=AgileSqlClub.DeploymentFilterContributor
/p:AdditionalDeploymentContributorArguments="SqlPackageFilter=IgnoreSchema(BLAH)"
This will not deploy, drop, or alter anything in the BLAH schema.
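Putting it together, a full deployment command might look something like this (server, database and dacpac names are placeholders):
# Publish the dacpac while the filter dll sitting next to SqlPackage.exe skips everything in the BLAH schema.
SqlPackage.exe /Action:Publish /SourceFile:MyDatabase.dacpac /TargetServerName:MyServer /TargetDatabaseName:MyDatabase /p:AdditionalDeploymentContributors=AgileSqlClub.DeploymentFilterContributor /p:AdditionalDeploymentContributorArguments="SqlPackageFilter=IgnoreSchema(BLAH)"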
More details on https://the.agilesql.club/2015/01/howto-filter-dacpac-deployments/

Schema not updating when publish web app from Visual Studio

I am building an ASP.NET MVC EF app with code-first migrations and hosting it in Azure with Azure SQL DB. The first time I published this, it went fine. But since then my models have changed, and the schema in the Azure DB is not getting updated to match. When I deploy, I do have "Execute Code First Migrations" checked. When that wouldn't work, I deleted my DB and then recreated it in the Azure portal, figuring that would trigger it getting updated. But that didn't work either, so I set AutomaticMigrationsEnabled = True in the migration Configuration. It is STILL not working, so currently my DB in Azure has none of my tables. HOW can I force the DB in Azure to update to match my models so the published site will work? I did try looking to see if there's a way to script the local VS DB to a CREATE script and execute that in SQL Management Studio, but couldn't find how to do that.
If you have made sure that you have selected Update Database in the publish settings and the connection string is correct, and it's still not updating, maybe the following will help:
I sometimes get an issue like this and it is quite frustrating. My publish file is correct and my settings allow SQL updates to occur during publishing, but sometimes the database hasn't been updated and I get a nice "backing context has changed" error; often the culprit is the migration history table not having been updated. Unfortunately, the only sure way to get your databases in sync is to check which migration each of them is at by comparing [dbo].[__MigrationHistory].
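One quick way to do that comparison (a sketch that assumes the SqlServer PowerShell module is installed; server, database and credential values are placeholders) is to list the applied migrations on both databases and see where they diverge:
# List the applied migrations on the local database...
Invoke-Sqlcmd -ServerInstance "localhost" -Database "MyAppDb" -Query "SELECT MigrationId FROM [dbo].[__MigrationHistory] ORDER BY MigrationId"
# ...and on the published database, then compare the two lists.
Invoke-Sqlcmd -ServerInstance "myserver.database.windows.net" -Database "MyAppDb" -Username "myuser" -Password "mypassword" -Query "SELECT MigrationId FROM [dbo].[__MigrationHistory] ORDER BY MigrationId"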
If your published server is missing the latest migrations, you can generate a SQL script for them by typing the following into the Package Manager Console:
Update-Database -Script -SourceMigration [migration name]
'migration name' should be the name of the last migration that your published server has; Visual Studio will generate a SQL script that can be used to bring the database from that migration up to the latest one.
Sometimes (though very rarely; it's only happened once or twice for me) the above doesn't work for whatever reason, usually because migration files have been deleted. If that is the case then it's a good idea to script every migration from scratch and cherry-pick the SQL you need from that:
Update-Database -Script -SourceMigration:0
This will generate a script for every migration; you can then cherry-pick based on the changes you've made. The latest changes will be closer to the bottom of the file. Every migration's changes start with an IF check:
IF @CurrentMigration < '201710160826338_mymigration'
BEGIN
You can use this to pick out the bits that you need. If you do pick out SQL, be sure to include the update to the migration history; it will be at the end of the IF block and look something like this:
INSERT [dbo].[__MigrationHistory]([MigrationId], [ContextKey], [Model], [ProductVersion])
VALUES (N'201710101645265_test', N'API.Core.Configuration', 'Some long checksum')
Including the migration history will ensure that Visual Studio doesn't hit the problem again.
Hope this helps.

Way to Switch Between Development and Live Database with EF?

I have an application written in ASP.NET and using EF. I want to make a copy of the live database, have my application point to the copy, and be able to run the application against it as if it is live...but making updates to the copy.
I know I can manually copy the database and manually update the web.config files...but I'm wondering if there is a better, more automated method for achieving this with VS 2010?
Also, is there anything I need to know about setting up live/dev versions of a database while using EF4?
You can switch the ConnectionString to point to the live or development database.
One way is to have different web.config files for debug and release builds.
Check out Web.config Transformation Syntax for Web Application Project Deployment
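As a quick illustration (project name is a placeholder): keep the dev connection string in Web.config, put the live one in a Web.Release.config transform, and the transform gets applied when you package or publish in the Release configuration, for example:
# Build a deployment package in Release; Web.Release.config is applied to Web.config inside the package.
msbuild MyApp.csproj /p:Configuration=Release /p:DeployOnBuild=true /p:DeployTarget=Package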
From my experience: set up two database connection entries in your config file; name one "backup" and leave the other one's name alone. Point the backup entry at the backup DB, and when you want to run against the backup, just swap the names of the entries so the backup one has the live name and the live one is named something else.
Also, as for setting up different DB versions: I'd just be careful that you don't update live by mistake! I've found it best to track changes to the DB structure in a SQL file so that live can be updated if needed, or the backup rolled back if changes are made.

Looking for a good web application deployment strategy (ASP.NET MVC3)

I’m looking for a good deployment strategy for deploying a ASP.NET MVC3 application. What I imagine is that each deployment would be some kind of commit to a Source Management System in the sense that a deployment tool could automatically do the following:
1) Upon generating a deployment package (a commit), the tool would remember the state of my Web.config file, the state of a folder of auto-generated scripts containing new database changes, the state of a folder of batch files that contain new tasks to be run on the server, the state of files specifying IIS settings changes, etc.
2) When I build a package the next time, the tool would know to only package the new script files, web.config changes, new batch files, and new IIS settings since my last package.
3) Apply the package onto my web application.
I started looking into MS Deploy but it only seems to do number 3. I've been searching around for either an application that does what I imagine or a strategy to combine some source management system with MS Deploy. I'm hoping that someone has already solved the problem I feel I have here. My last resort of course is to build the tool myself, but again, that would be my last resort.
Are you using Team Foundation Server? If so, TFS comes with tools to automate builds (including labeling code, running unit tests, deploying, et cetera.) Take a look at http://msdn.microsoft.com/en-us/library/ms181710(v=vs.80).aspx
TFS is not exactly easy to configure and get going but it's free if you are already using TFS.
If you are not using TFS, look at build/continuous integration tools like NAnt or TeamCity.
Have you used Web Deploy and the "Publish" feature under Build in Visual Studio?
You can set options for things like leaving the previous files on the server.
As for your web.config file: do you mean the main one, or one that already exists elsewhere on the server? Your web.config file should be copied from your project to the server. Or are there settings that differ when running locally vs on the server? If so, look at using transforms to modify web.config.
This is only a partial answer to #1 for you, but we looked for a long time for a migration tool that we liked... We ultimately found Migrator.Net: http://code.google.com/p/migratordotnet/
Doing this, you can turn db migrations into a batch command

How to automate the build process?

How can I automate the web application build process, which includes the following steps:
Change the connection string.
Recreate the database from scripts.
Deploy the web site by FTP.
Copy some additional files to the server alongside the application.
And maybe perform some initialization operations.
Should I write a script or program, use Visual Studio, or use some other tool?
Personally I use a Continuous Integration tool to do this kind of work.
The one I mainly use is TeamCity by JetBrains.
This kind of software can watch your source control repo for new check-ins, perform builds, publish builds to servers, and run pre/post build events.
You have to start learning MSBuild. It is VERY simple and straightforward, so just start and you'll see ;)
In addition to the built-in features it has a Community Tasks pack with many tasty things, so you will be able to:
Replace connection string in config file using regex or replace whole config with predefined connection string (FileUpdate or Copy task)
Execute database scripts (MSBuild.Community.Tasks.SqlServer.ExecuteDDL)
Deploy site using Copy task
And many others...
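As a hedged sketch of how it hangs together: suppose you wrap those steps in a custom MSBuild project - say a hypothetical deploy.proj that imports MSBuild.Community.Tasks.Targets and defines a Deploy target using the FileUpdate, ExecuteDDL and Copy tasks above - then the whole build/deploy becomes a single command you can run by hand or from a CI server:
# Run the hypothetical Deploy target, passing the environment in as a property.
msbuild deploy.proj /t:Deploy /p:TargetEnvironment=Staging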
You can run pre and post build events in Visual Studio. To do this, simply right-click on the project and, in the project properties, navigate to the 'Build Events' options. Here you can specify the pre and post build events (you can also specify when the event runs - on a successful build or otherwise).
Once the project has been successfully built, the post build event can be set up to perform the tasks specified. You can detail the steps either in a separate file or in the Visual Studio project's build events box itself.
More information
Pre/Post Build event command line arguments
How to: Specify Build Events (C#)
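For instance, a post-build event command along these lines (paths are placeholders; $(TargetDir) is a macro Visual Studio expands before running the command) could copy the build output and any extra files out to a staging share:
xcopy "$(TargetDir)*.*" "\\staging-server\MyApp" /E /I /Y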
Much along the continuous integration concept Jamie mentions, we use BuildMaster internally for all of our applications since we develop it :)
Now that we have a version offered for free, I'll share some thoughts on each of your bullet points:
Change connection string
This is something that is handled uniquely by the tool. Each environment would get its own "instance" of a configuration file and in a deployment plan you can use the "deploy configuration files" action to put them in any environment. This means there are no transforms to worry about since the config file is stored and versioned within the tool.
Recreate database by scripts
This is another major feature we have. Object code (stored procs, views, etc.) can be run every time with a DROP/CREATE combo, but changes like adding indexes or dropping columns can only be done once (you can't bring a column's data back without a restore!).
BuildMaster handles these types of change scripts differently - they can only be run at most once against an environment's instance of your database. This makes it super easy to bring any new or existing initialized database schema up-to-date.
Deploy web-site by FTP
Just add an action to your deployment plan, and when you click Create Build or Promote Build, it will do that.
Copy some files to server in addition to application
If the process is repeatable you can do this easily; if need be, use a manual action that will remind you to do it.
And may be perform some initialize operations
This sounds like a "change control" to me, a one-time change when you release. We support these as well but not in the free version unfortunately.
