Is there an efficient way to share scripts between Airflow Deployments? - airflow

We have several Astronomer Airflow deployments in production. Each deployment is based on its own git repo (with some differences between them), but they all use a core set of SQL scripts. Currently, if we need to update one of the core SQL scripts we have to update each and every deployment — a big pain, and prone to copy-paste errors.
Is there a way we can efficiently share these core SQL scripts in such a manner that we only need to update one repo and the changes are propagated to all deployments?

We have a similar requirement, and we deploy the SQL scripts and other metadata files to object storage. You can have a separate CI/CD job that deploys the SQL scripts to object storage such as AWS S3 or Azure Data Lake, and the processing system can then read the SQL files and execute them.
If there are any changes to the SQL scripts, we only need to redeploy the scripts to object storage.
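A minimal sketch of how a DAG could consume such a script at run time. The bucket and key names are hypothetical, and the S3 client is injected so that in production you would pass in a real `boto3` client:

```python
def load_sql_from_s3(bucket, key, s3_client):
    """Fetch a shared SQL script from object storage and return it as text.

    In production, s3_client would be boto3.client("s3"); injecting it
    keeps the helper testable without AWS credentials.
    """
    response = s3_client.get_object(Bucket=bucket, Key=key)
    return response["Body"].read().decode("utf-8")

# Inside a DAG, the result could be handed straight to a SQL operator, e.g.:
#   sql = load_sql_from_s3("shared-sql-scripts", "core/load_facts.sql",
#                          boto3.client("s3"))
#   PostgresOperator(task_id="load_facts", sql=sql, ...)
```

This way each deployment pulls the current script on every run, so updating the one repo (and its CI/CD job that syncs to S3) propagates everywhere.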

Related

How to migrate EF Core to AWS RDS

I have a dockerized ASP.NET web API that I am running on AWS. I am planning on using RDS for the database, and I need to run migrations but am unsure how to go about this. My Docker container only contains the .NET runtime, so I can't just SSH into one of the machines and migrate. The RDS instance is set to only accept traffic from within the VPC, so I can't run them from my machine either. What would be the best way to run EF Core migrations against RDS?
I was thinking of maybe setting up a temporary EC2 machine, installing the dotnet SDK, EF Core and the source code, then running migrations and tearing it down. But I don't know if this is a good idea, or whether there is a better way.
A temporary EC2 instance for performing this sort of thing is fine, and a common practice.
As an alternative, I would suggest building an AWS CodeBuild job (which can run inside the VPC) to perform the migration task. However, you might find your temporary EC2 instance useful for other things, like connecting to the database to perform ad hoc queries.
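A rough sketch of what you would run on the throwaway EC2 instance, assuming Amazon Linux 2023; the package name, repo URL, and connection string are placeholders:

```shell
# Install the .NET SDK and the EF Core CLI tool
sudo dnf install -y dotnet-sdk-8.0
dotnet tool install --global dotnet-ef

# Get the source containing the migrations
git clone https://example.com/your/repo.git
cd repo/src/YourApi

# Apply migrations against RDS from inside the VPC
dotnet ef database update \
    --connection "Server=your-rds-endpoint;Database=app;User Id=admin;Password=yourpassword"
```

Once the migration succeeds, terminate the instance; the same commands translate directly into a CodeBuild buildspec if you go that route.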

How can I implement a local testing database for my Azure Website?

I have a website I'm developing that uses a SQL Azure database. I want to be able to test the site against a database hosted locally on the debugging machine, so that my designer can work on style and content without the overhead of firing up the Azure emulator and talking to an external service — he is often in areas with no connectivity.
I imagine it would be possible to extract the .sql scripts to create the database and execute them on every test run, but this seems tedious. Similarly, I'm not sure of the best way to configure these deployment details so I can switch between development and published configurations. I'm a new web developer cutting my teeth on a rather large project.
Using ASP.NET MVC4 and have MSSQL 2012 installed for what it's worth.
You can export your SQL Azure database in .bacpac format and then import it into your local SQL Server instance. That will create all the tables and fill them with data. You don't need to do it on every test run; do it once and you will have a proper database for debugging.
Switching between Debug and Release configurations (you can rename them if you want, e.g. Local and Production) and using different web.config files (or config transformations) is a good way to work with different settings.
If you want to keep your DB scripts (structure or data) in your VCS, you can use Entity Framework migrations (the new fancy way) or a separate project of the "SQL Server Database" type (old school, but a proven way :) ).
EF migrations also let you easily recreate the DB (with a different name) on each run, for unit-testing purposes. You can then use a SQL Express file instance (keep in mind that this is only for local work; your designer won't be able to access SQL Express instances, afaik).
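The export/import step can be done with SqlPackage, which ships with the SQL Server tooling. Server, database, and credential values below are placeholders:

```shell
REM Export the Azure database to a .bacpac file
SqlPackage.exe /Action:Export ^
    /SourceServerName:yourserver.database.windows.net ^
    /SourceDatabaseName:YourDb ^
    /SourceUser:youruser /SourcePassword:yourpassword ^
    /TargetFile:YourDb.bacpac

REM Import it into the local SQL Server instance for debugging
SqlPackage.exe /Action:Import ^
    /SourceFile:YourDb.bacpac ^
    /TargetServerName:localhost ^
    /TargetDatabaseName:YourDbLocal
```

The same export can also be triggered from the Azure portal if you prefer not to script it.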

ALM - Application Lifecycle Management - Build and deployment challenge

I am stuck with a problem that I could use some feedback on to solve it in the best possible way.
The issue revolves around source control -> automated builds -> deployment. Basically ALM (Application Lifecycle Management).
We have a product: an ASP.NET web application with a MS SQL database. This product runs on hundreds of websites with associated databases, across multiple virtual machines in our production environment. At the moment the web applications and databases run on servers with IIS 7 and SQL Server 2008 R2. The product itself is source controlled in Team Foundation Server 2012.
For years, new versions of the product have been released only once or twice a year. Now we are going to focus on releasing more frequently, and hence we need an ALM strategy for the product.
The deployment strategy now:
In the development period between releases, the SQL update scripts have been created manually: each time a database change was made, the script was updated. When the application is ready to be deployed, it gets compiled on a developer machine. The database with all the changes is backed up into a .BAK file. The web application, the .BAK file, and the update SQL script are packaged (.zip) and uploaded to the production environment, ready for deployment.
Update existing running products:
- Copy/paste the web application into the target website's physical folder.
- Update the web.config file (connection string and application variables).
- Run the update script via SQL Server Management Studio.
This is done for each and every customer, hundreds of times. It is a very tedious and error-prone task, and I don't like it at all!
What I would like to do instead:
- Source control the database as a Database Project in Team Foundation.
- Automatically build the web application with the Team Foundation 2012 Build Server.
- Deploy the output from the Build Server to the multiple websites in the production environment, along with automatically generated SQL update scripts to run against the SQL Server.
I have been googling my ass off, only finding bits and pieces about builds, deployment, automatic SQL update scripts, etc.
What I think is partly the right direction is to source control the database and use the TFS Build Server. But I am very confused about how to do the deployment itself in an easy and controlled way using the output from the TFS Build Server.
Ideally, I would want the TFS Build Server to create a package with the latest version of the web application, the latest version of the database, and a post-deployment script including an auto-generated SQL update script from the previous build to the current build. This could be contained in e.g. a NuGet package. Then I would want to be able to create an additional web application to manage the deployment: target, version, IIS website, SQL Server, web.config connection strings, etc.
Does anyone have any advice on how to achieve this? How do you do this?
You can use a release management tool to do this, no need to create an additional web application.
One such example is Deployment Manager, from Red Gate. (Disclaimer: I work there.) It has built-in deployment actions for ASP.NET apps and SQL Server databases. The RgPublish.exe command line tool can be used to create a package for the web app from TFS Build, as you describe. The same can be done for the database using the sqlCI.exe command line and the associated NAnt/MSBuild scripts.
The same packages can then be deployed to each of your servers, though you may run into scalability issues with hundreds of websites.
The database deployment works by generating the upgrade script automatically at deploy time, though you can change the behaviour to put the upgrade script in the package when the package is first built. These are called the "dynamic" and "static" upgrade methods respectively.

Using MS Deploy to deploy only the databases

Can I use MS Deploy to deploy only the databases and not the web applications in the deploy package?
And if so, how?
The website is running on a different server than the databases (web server vs db server) so I'd rather not have it deploy the package in its entirety to the database server.
MSDeploy does not send the entire package to MsDeploy.axd, so deploying the lot just to deploy the database doesn't really have that much overhead.
If you really don't want the website to even be checked for the purposes of synchronization, you have two choices:
Deploy the package and skip the iisApp and setAcl providers by adding -skip:objectName=iisApp -skip:objectName=setAcl
Split the dbFullSql into another package
Either way, I'd recommend continuing to deploy via the application server rather than directly to the database server.
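A hedged sketch of the first option — syncing the existing package while skipping the site content, so only the database part is applied. The package name, site name, and server URL are placeholders:

```shell
msdeploy.exe -verb:sync ^
    -source:package=MyApp.zip ^
    -dest:auto,computerName="https://webserver:8172/msdeploy.axd?site=MySite" ^
    -skip:objectName=iisApp ^
    -skip:objectName=setAcl
```

Adding `-whatif` first is a cheap way to confirm that only the dbFullSql provider would actually run.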

ASP.NET SQL Server Sessions and deployment

I just deployed my first ASP.NET app to a server, and everything would be fine if not for problems with the database... I have sessions stored in SQL Server as two tables, accessed through stored procedures. When deploying the app I moved the database by making a backup copy of my local DB and restoring it on the production server. This mixed up the object names, and the stored procedures stopped working.
Is there a way to deploy the database with the changed names? Or do I need to run ASP.NET's create-session-in-db tool every time I deploy the application to a server?
It's better to deploy the database using SQL scripts than backup/restore or copy. You can tailor the scripts to your specific needs at the time, e.g. alter or drop/create. You also get a record of your data structures that can be filed alongside your app.
Simon
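For the session-state tables specifically, ASP.NET ships a setup tool (aspnet_regsql.exe, found under the .NET Framework folder) that recreates them on any target server, so they never need to travel in a backup. The server name is a placeholder, and -E assumes Windows authentication:

```shell
REM Create the ASPState database and its stored procedures on the target server
aspnet_regsql.exe -S your-db-server -E -ssadd -sstype p
```

Running this once per environment is usually simpler than trying to carry the session tables across in a restored backup.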
