I have a dockerized ASP.NET Web API that I am running on AWS. I am planning to use RDS for the database, and I need to run migrations but am unsure how to go about this. My Docker container only contains the .NET runtime, so I can't just SSH into one of the machines and migrate. The RDS instance is set to only accept traffic from within the VPC, so I can't run the migrations from my machine. What would be the best way to run EF Core migrations against RDS?
I was thinking of setting up a temporary EC2 instance, installing the .NET SDK, EF Core, and the source code, then running the migrations and tearing it down. But I don't know if this is a good idea, or whether there is a better way.
A temporary EC2 instance is fine for this sort of thing, and common practice.
As an alternative, I would suggest building an AWS CodeBuild job to perform the migration task. That said, you might find a temporary EC2 instance useful for other things, like connecting to the database to run ad hoc queries.
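If you go the CodeBuild route, the project can be attached to the same VPC so it can reach RDS. A minimal buildspec sketch, where the project path, .NET version, and connection-string variable name are assumptions about your setup:

```yaml
version: 0.2

phases:
  install:
    runtime-versions:
      dotnet: 8.0   # must match a version your build image provides
    commands:
      # dotnet-ef ships as a separate global tool, not with the SDK
      - dotnet tool install --global dotnet-ef
      - export PATH="$PATH:$HOME/.dotnet/tools"
  build:
    commands:
      # The connection string would arrive via a CodeBuild environment
      # variable (ideally sourced from Secrets Manager); the variable name
      # ConnectionStrings__Default and the project path are assumptions.
      - dotnet ef database update --project src/MyApi/MyApi.csproj
```

Another option that avoids installing the SDK near the database at all is `dotnet ef migrations script --idempotent`, which emits a SQL file you can apply with any SQL client that can reach the instance.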
Related
I've built a website with Flask using a SQLite database and deployed it via Heroku, but apparently Heroku doesn't support SQLite: the app deploys, but the database is deleted after a while. If I deploy using another service or buy my own domain, will the database be fine?
Heroku does support SQLite databases.
However, Heroku has an ephemeral filesystem, so files created while the program runs are deleted on dyno restart.
Using another service may solve this; it depends on the service.
A good solution would be to use a remote database. Heroku itself has an add-on for PostgreSQL.
Edit: here is an article from the Heroku Dev Center that explains why SQLite doesn't work and shouldn't be used, and also how to create an external database as an add-on.
Here is the article
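If you switch to the PostgreSQL add-on, the app reads its connection string from the `DATABASE_URL` environment variable that Heroku sets. A small sketch for a Flask/SQLAlchemy setup; the local fallback URL is an assumption for development:

```python
import os

def database_url(default="sqlite:///local.db"):
    """Return a SQLAlchemy-ready URL from Heroku's DATABASE_URL env var."""
    url = os.environ.get("DATABASE_URL", default)
    # Heroku historically hands out "postgres://" URLs, which newer
    # SQLAlchemy versions reject; rewrite the scheme to "postgresql://".
    if url.startswith("postgres://"):
        url = "postgresql://" + url[len("postgres://"):]
    return url
```

You would then set `app.config["SQLALCHEMY_DATABASE_URI"] = database_url()` when creating the Flask app.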
We have several Astronomer Airflow deployments in production. Each deployment is based on its own Git repo (there are some differences between them); however, they all use a core set of SQL scripts. Currently, if we need to update one of the core SQL scripts, we need to update each and every Airflow deployment (a big pain, and prone to copy-paste errors).
Is there a way to share these core SQL scripts efficiently, so that we only need to update one repo and the changes are propagated to all deployments?
We have a similar requirement, and we deploy the SQL scripts and other metadata files to object storage. You can have a separate CI/CD job deploy the SQL scripts to object storage such as AWS S3 or Azure Data Lake, and the processing system can read the SQL files and execute them.
If the SQL scripts change, only the object-storage deployment needs to run; the Airflow repos stay untouched.
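A sketch of that pattern in Python; the bucket name, prefix, and use of boto3 are assumptions about one possible setup:

```python
def sql_keys(keys, prefix="core-sql/"):
    """Filter an S3 listing down to the .sql objects under the shared prefix."""
    return sorted(k for k in keys if k.startswith(prefix) and k.endswith(".sql"))

def load_scripts(bucket="shared-etl-assets", prefix="core-sql/"):
    """Fetch every core script from object storage (needs boto3 + credentials)."""
    import boto3  # imported lazily so the pure helper above is testable offline
    s3 = boto3.client("s3")
    listing = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    keys = [obj["Key"] for obj in listing.get("Contents", [])]
    # Return {key: script text} for the DAG/task that executes them.
    return {
        k: s3.get_object(Bucket=bucket, Key=k)["Body"].read().decode()
        for k in sql_keys(keys, prefix)
    }
```

Each deployment then calls `load_scripts()` at run time instead of bundling its own copy of the SQL.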
What is the best process for .NET deployment?
Starting from code check-in, through Jenkins, to the AWS servers.
FYI: we have multiple AWS servers which sync via DFS, so currently we deploy to only one server and DFS syncs it to the others.
A few questions:
Should we recycle the app pool after every deployment? Is it necessary?
What about packages? Should we check them in with the code, or restore them on the build server?
What about T4 templates? Currently we check in the auto-generated code as well, because we can't regenerate T4 templates without installing Visual Studio on the build server.
In a few months we will be using webpack as well.
This deployment regenerates 10,000 existing pages that have output caching enabled. These 10,000 pages are also behind AWS CloudFront. The deployment and the 10,000 pages share the same app pool. What happens to the output cache after deployment? Should we have a separate app pool, and why?
FYI:
This deployment is used mostly by internal staff, so there's not much traffic to it.
I need some advice about continuous deployment with Visual Studio Team Services. To be honest, I am quite new to this area, so forgive the silly question; I can't find any references for AWS, only Azure.
My idea is to deploy an ASP.NET application, built from VSTS source control, to AWS EC2.
My current scenario is:
I have the ASP.NET application code in source control inside VSTS.
I created a build definition which builds the source code and produces an artifact.
I created a release definition which copies the artifact to a remote AWS EC2 instance.
....
I have no idea how to continue from here. Could you advise what I should do next, or suggest a better scenario?
Thank you.
Currently I don't see any tasks that can deploy directly to AWS, so the only way this seems possible is to create your own task, or to use PowerShell or Bash along with the AWS CLI to deploy your artifact. The process would be something like this:
Download the artifact in the release. This happens by default if you link the artifact.
Make sure the agent machine you are using has the AWS Tools for PowerShell, or the AWS CLI if you are using Bash.
You can then write a PowerShell or Bash script that uses the AWS CLI to deploy your artifact to AWS.
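A Bash sketch of that last step, pushing the artifact to S3 and triggering a CodeDeploy deployment. The bucket, application, and deployment-group names are all hypothetical; `DRY_RUN=1` prints the AWS CLI calls instead of executing them, which lets you check the script without credentials:

```shell
#!/usr/bin/env bash
# Sketch of the release-step script described above. Bucket, application,
# and deployment-group names are placeholders, not VSTS or AWS defaults.
set -euo pipefail

# With DRY_RUN=1, print each AWS CLI call instead of executing it.
run() {
  if [ "${DRY_RUN:-0}" = "1" ]; then echo "+ $*"; else "$@"; fi
}

deploy() {
  local artifact="$1"
  local bucket="${2:-my-deploy-bucket}"
  local key
  key=$(basename "$artifact")

  # Upload the build artifact, then point CodeDeploy at the uploaded object.
  run aws s3 cp "$artifact" "s3://$bucket/$key"
  run aws deploy create-deployment \
    --application-name MyWebApp \
    --deployment-group-name Production \
    --s3-location "bucket=$bucket,key=$key,bundleType=zip"
}
```

Calling `deploy drop/webapp.zip` from the release definition (with AWS credentials configured on the agent) would upload and deploy the artifact.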
For anyone else wondering about this in the future, AWS just released the AWS Tools for VSTS to the Visual Studio Marketplace. These tools contain a number of tasks you can use to work with AWS services such as S3, CodeDeploy, Elastic Beanstalk, Lambda, and CloudFormation from within a VSTS or TFS environment.
We also just published a blog post about using the tools to publish ASP.NET and ASP.NET Core applications to AWS from within VSTS.
There are a couple of options for you. Tutorials explaining how to get each one running are given below.
How to Build a CI/CD Pipeline Using AWS CodeDeploy and Microsoft Team Foundation Server (TFS)
(You can use this for hybrid/complex deployments: IIS websites, MSI packages, services, and executables. The beauty of this is that a single deployment can target both on-premises and cloud environments.)
https://www.youtube.com/watch?v=MIE0P3m9eEY
How to Integrate AWS Elastic Beanstalk with Microsoft Team Foundation Server (TFS) or (VSTS)
(You can use this for IIS websites and batch jobs.)
https://www.youtube.com/watch?v=nRLZZefLDqU
How to Integrate AWS Cloudformation with Microsoft Team Foundation Server (TFS)
(Full infrastructure automation, managing infrastructure as code.)
https://www.youtube.com/watch?v=WU93NJT0_3s
Can I use MSDeploy to deploy only the databases, and not the web applications, in the deployment package?
And if so, how?
The website runs on a different server than the databases (web server vs. DB server), so I'd rather not deploy the package in its entirety to the database server.
MSDeploy does not send the entire package to MsDeploy.axd, so deploying the whole thing just to update the database doesn't really have that much overhead.
If you really don't want the website to even be checked for synchronization purposes, you have two choices:
Deploy the package and skip the site content by adding -skip:objectName=iisApp and -skip:objectName=setAcl
Split the dbFullSql provider into another package
Either way, I'd recommend continuing to deploy via the application server rather than directly to the database server.
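For the first option, the invocation would look something like this (package name and endpoint are illustrative; the -skip rules tell MSDeploy to ignore the iisApp and setAcl providers so only the dbFullSql content is synchronized):

```
msdeploy.exe -verb:sync ^
  -source:package="MyApp.zip" ^
  -dest:auto,computerName="https://webserver:8172/msdeploy.axd" ^
  -skip:objectName=iisApp ^
  -skip:objectName=setAcl
```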