DACPAC Deployments and RM - how to tokenize DACPAC scripts? - sql-server-data-tools

I have a DACPAC deployment with some scripts that get "baked in" to the database.dacpac file prior to publish. I need to tokenize some of these scripts - which poses a challenge because the DACPAC is compiled before it hits RM.
Is there any way I can have the SQL script reference an external file that is tokenizable - this way when RM actually publishes, the DACPAC references this file?
For example - somehow getting the following into an externally referenced sql file that is NOT baked into the dacpac and sits on disk alongside it:
INSERT INTO #table (Param1,Param2)
VALUES (__Value1__, __Value2__)
Edit
Based on the comments, I'm thinking it may be possible to pass the values into SQLPackage. Does anyone know how I can leverage publish profiles to solve this problem?
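One approach that may work here (a sketch, not something confirmed in the question): SSDT post-deployment scripts support SQLCMD variables, and those are left unresolved inside the compiled dacpac, so the actual values can be supplied at publish time rather than build time. Assuming SQLCMD variables named Value1 and Value2 are declared in the database project, the script baked into the dacpac would look something like this:

-- Post-deployment script inside the database project.
-- Value1 and Value2 are SQLCMD variables declared in the .sqlproj;
-- they stay as placeholders in the dacpac and are only substituted at publish time.
INSERT INTO #table (Param1, Param2)
VALUES ($(Value1), $(Value2));

At publish time the values can then be passed to SqlPackage with /v:Value1=Foo /v:Value2=Bar, or kept as SqlCmdVariable entries in a .publish.xml publish profile, which is a plain file sitting on disk next to the dacpac that RM can tokenize instead of the dacpac itself.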

Related

How can I deploy only a select set of stored procedures in a DACPAC deployment?

I have a visual studio project which contains a database project. I create an executable which performs a software update and part of that update is to update the database. Some of the stored procedures are dependent on a linked server existing which gets created as part of the executable too. The problem is that this functionality is optional and the linked server won't connect on some client machines. But the DACPAC fails because the linked server can't connect. I am using sqlpackage.exe to deploy the .dacpac file.
Is there some way that I can deploy either all or only some of the stored procedures? Or maybe I can set a flag to ignore linked server errors? Or maybe there is an alternative method to using sqlpackage/dacpac?
One option I thought of is to convert the stored procedures that contain the linked server to dynamic SQL.
Having the database in visual studio and therefore source control is important.
Yes!
This is fairly easy to do. You can see your database project in Visual Studio. I would recommend removing the stored procs that are problematic and merging those back into master. Then I would create a feature branch, point again to the DB you have the stored procs on, and use the schema compare to get those back as well (even the ones that don't work, so that you don't lose them). Push the commit up to the feature branch repo. Then, now that you have the problematic stored procs in source control plus the shippable version in master, you can go ahead and, through Visual Studio, publish just the selected objects from the database project into the DBs you want.
If you haven't checked anything in to master, you can do the schema compare, select all objects except those that are problematic, update your database project, and merge that to master. If this doesn't make sense, please comment on this answer and I'm happy to give more detail.
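As for the dynamic SQL option mentioned in the question, a minimal sketch (object and server names are made up) is to hide the four-part name inside a string, so the reference is only resolved when the procedure actually runs rather than when the dacpac is published:

-- Hypothetical procedure: the linked server reference lives inside a string,
-- so publishing the dacpac does not require the linked server to be reachable.
CREATE PROCEDURE dbo.GetRemoteOrders
AS
BEGIN
    EXEC sp_executesql
        N'SELECT OrderId, OrderDate FROM [OptionalLinkedServer].RemoteDb.dbo.Orders;';
END

The trade-off is that you lose compile-time checking of the remote query.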
Well, I came across this. I'm still working on implementing it to solve my problem, but it might help your cause too.
Download the filter from https://agilesqlclub.codeplex.com/releases/view/610727, put the dll into the same folder as sqlpackage.exe, and add these command line parameters to your deployment:
/p:AdditionalDeploymentContributors=AgileSqlClub.DeploymentFilterContributor
/p:AdditionalDeploymentContributorArguments="SqlPackageFilter=IgnoreSchema(BLAH)"
This will not deploy, drop, or alter anything in the BLAH schema.
More details on
https://the.agilesql.club/2015/01/howto-filter-dacpac-deployments/
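For context, a full publish command with the filter loaded would look something like this (server, database, and schema names are placeholders):

sqlpackage.exe /Action:Publish /SourceFile:database.dacpac /TargetServerName:MyServer /TargetDatabaseName:MyDb /p:AdditionalDeploymentContributors=AgileSqlClub.DeploymentFilterContributor /p:AdditionalDeploymentContributorArguments="SqlPackageFilter=IgnoreSchema(BLAH)"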

How can I remove issues with my flyway springboot project?

So while building a new database using our database migration scripts written in a springboot flyway project, we realized we made some mistakes.
Some old scripts need to be changed to ensure that we do not face these issues when we build a new database schema again. The issues are mostly data related: an info table was never populated with entries by the project, and there are scripts that refer to data in the migration project, data which does not exist because we never included a script to insert it.
How can we correct this project? The only way I can think of is to correct the scripts so that all inserts are replaced by insert-if-not-exists and create statements are replaced by create-if-not-exists,
and then delete all entries in the schema version table and re-run the migrations on all the databases which use this schema.
I cannot go back and correct my script because then the migration project will fail because of checksum issues.
You are right, if this project and its scripts are already running against existing databases you cannot modify them, because the checksum validation would fail.
The cleanest way I can think of would be to add a migration called "DB-GENERAL-FIXES" or something like that, where you add all the SQL needed to restore the DB to a stable state. For new installations it will be extra work to first build it wrongly and then clean it up, but if you are sharing the same code with production right now, it is the best option.
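As an illustration only (the file name, table, and column names are made up, and the syntax assumes an engine such as PostgreSQL or SQL Server that accepts INSERT ... SELECT ... WHERE NOT EXISTS), the statements inside such a fixes migration can be written defensively so they are safe to run against both broken and already-correct schemas:

-- V999__db_general_fixes.sql (hypothetical repair migration)
-- Insert the missing info row only if it is not already there.
INSERT INTO info_table (info_key, info_value)
SELECT 'schema_owner', 'app'
WHERE NOT EXISTS (
    SELECT 1 FROM info_table WHERE info_key = 'schema_owner'
);

Because the new file gets its own checksum in the schema history table, it does not disturb the checksums of the existing migrations.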

Publishing Databases using DacPac in Visual Studio 2013

I need some clarification on when to register a Database as a Data Tier Application (DAC). I've looked at all the guides but am stuck on a few points.
The database is NOT registered
Build Database Project to produce DacPac
Publish the Database Project
Check "Register as a Data Tier Application"
Check "Block publish when database has drifted from registered version"
First time round, this works. It registers the database and succeeds.
However, on subsequent publishes it fails, saying the DB has drifted and noting two users which have not changed.
Am I following the correct process? i.e. setting the Publish script to re-register each time?
What is the best practice for making changes? By changing the relevant .sql files in the Database Project and then building? The guides talk a lot about being able to version the DB using the DacPac, but it's not clear how. Should I rename each DacPac and commit it to TFS?
My next step is to publish the Database as part of the overall ASP.Net Solution. When I try to do that (it works fine when the DB publish is not included), it comes up with the following error
Web deployment task failed. (The SQL provider cannot run with dacpac option because of a missing dependency. Please make sure that DacFx is installed. Learn more at: http://go.microsoft.com/fwlink/?LinkId=221672#ERROR_DACFX_NEEDED_FOR_SQL_PROVIDER.)
However, I have all the required elements installed on the publishing machine. Do they need to be on the SQL Server or IIS VMs?
Any guidance would be much appreciated.
If you want to deploy your changes to a database using a dacpac you would need to register the database as a DAC. This basically creates a snapshot of the database at that point in time. You do this before making a change to create the initial snapshot and then after a deployment.
The reason you do this is to detect drift. Let's say you do a deployment and someone then makes a change directly in that database, for instance changing the logic of a stored procedure; you would want to know about that change before making a subsequent deployment. If you deploy your dacpac and ignore this change, it will revert their change to what's in the dacpac model. This is where drift occurs. You can generate an XML report on what has drifted through the SDK.
You can enable a setting to block deployment if drift occurs, so that you can retrofit the changes made directly in the database back into your source code. You would then need to re-register the database as a DAC to create a new snapshot.
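For reference, and assuming a reasonably recent SqlPackage.exe (server and database names below are placeholders), the drift report and the re-registering publish can be driven from the command line along these lines:

sqlpackage.exe /Action:DriftReport /TargetServerName:MyServer /TargetDatabaseName:MyDb /OutputPath:drift-report.xml
sqlpackage.exe /Action:Publish /SourceFile:Database.dacpac /TargetServerName:MyServer /TargetDatabaseName:MyDb /p:RegisterDataTierApplication=True /p:BlockWhenDriftDetected=True

The first command writes the XML drift report mentioned above; the second publishes and re-registers the database, and fails if drift is detected.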
Am I following the correct process? i.e. setting the Publish script to re-register each time? Yes
What is the best practice for making changes? By changing the relevant .sql files in the Database Project and then building? Yes
The guides talk a lot about being able to version the DB using the DacPac but it's not clear how. Should I rename each DacPac and commit it to TFS? You can set a version within the database. Have a look at the properties of the database project. You shouldn't rename the dacpac.
About the ASP.Net publish, I would need a bit more detail around the project structure and environment setup.

Way to Switch Between Development and Live Database with EF?

I have an application written in ASP.NET and using EF. I want to make a copy of the live database, have my application point to the copy, and be able to run the application against it as if it is live...but making updates to the copy.
I know I can manually copy the database and manually update the web.config files...but I'm wondering if there is a better, more automated method for achieving this with VS 2010?
Also, is there anything I need to know about setting up a live/dev version of a database while using EF4?
You can switch the ConnectionString to point to the live or development database.
One way is to have different web.config files for debug and release builds.
Check out Web.config Transformation Syntax for Web Application Project Deployment
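A minimal sketch of such a transform in Web.Release.config (the connection string name and values are made up; for EF database-first models the connection string also carries metadata, so adjust accordingly):

<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <!-- Replaces the development connection string with the live one for Release builds. -->
    <add name="MyAppContext"
         connectionString="Data Source=LiveServer;Initial Catalog=MyAppDb;Integrated Security=True"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
  </connectionStrings>
</configuration>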
From my experience,
Set up two database connection entries in your config file, name one backup and leave the other one's name alone. Point the backup entry to the backup db, and when you want to run against the backup, just swap the names of the entries so the backup one carries the real name and the live one is named anything else.
Also, as for setting up different db versions, I'd just be careful that you don't update live by mistake! I've found it best to track changes to the db structure in a sql file so that live can be updated if needed, or the backup can be rolled back if changes are made.

Using MSBuild to Copy a website into Production while REALLY skipping unchanged files

I am using MSBuild to Publish a web site, then copy the published site to a web server on the same network. I set the copy command to "SkipUnchangedFiles."
The copy itself works swimmingly, but SkipUnchangedFiles has no effect: when I use AspNetCompiler to publish the website, each and every file is "new", its date set to the moment of publishing, so even if the contents of a given file have not changed, the timestamp is different and it's copied over anyway.
Is there a workaround that will prevent files whose contents have not changed from being copied?
Depending on how you're publishing the site, you may be able to do Incremental Build instead of a full build.
There is no existing process for this as the deployment process isn't aware of the deployment target filesystem.
If it were, you could do a diff using a tool like Beyond Compare, then grab only the changed items and copy those across.
To automate this, you are probably going to have to dig into writing MSBuild targets or post-build scripts.
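One hedged alternative, not from the answers above and assuming Web Deploy (msdeploy.exe) is available on the build machine, is to let msdeploy synchronise the folders based on checksums rather than timestamps, which sidesteps the recompiled-timestamp problem entirely (paths below are placeholders):

msdeploy.exe -verb:sync -source:contentPath="C:\build\PublishedSite" -dest:contentPath="\\webserver\wwwroot\MySite" -useCheckSum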
