Flyway approach for Spring + JPA deployed to Heroku

I'm thinking of using Flyway for my database migration. It seems like it will be simpler than writing my own SQL and Java migration scripts. However, looking at the documentation, there seem to be several ways to use it.
What should I consider when deciding between migrating with (a) application integration, (b) a Maven task, or (c) the command line?
Currently I deploy to Heroku with a simple git push. This builds my app and starts it as specified in the Procfile.
So in this regard it seems like application integration (migrating on startup) would be the simplest, but it also seems like overhead I don't need. I suppose if I go with the Maven task, I would need to ensure that Heroku calls Maven correctly to make this happen.
What are the trade-offs? Is anyone currently using Spring + JPA + Flyway together with a Heroku-hosted application?

You are correct: the application integration is the simplest, and code and DB can never get out of sync.
The overhead is absolutely minimal, especially compared to JPA. The few milliseconds it will cost you on startup are well worth the development and deployment convenience.
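For illustration, here is roughly what the application-integration route looks like with Flyway's fluent Java API (introduced in Flyway 5; older versions use `new Flyway()` with setters). The JDBC URL and credentials are placeholders; on Heroku you would derive them from the DATABASE_URL environment variable:

```java
import org.flywaydb.core.Flyway;

public class Application {

    public static void main(String[] args) {
        // Placeholder connection settings; on Heroku, read these
        // from the DATABASE_URL environment variable instead.
        Flyway flyway = Flyway.configure()
                .dataSource("jdbc:postgresql://localhost:5432/mydb", "user", "password")
                .load();

        // Apply any pending migrations before the rest of the app starts.
        flyway.migrate();

        // ... then bootstrap Spring / JPA as usual.
    }
}
```

With plain Spring you can achieve the same by declaring Flyway as a bean with init-method="migrate" and making the JPA EntityManagerFactory depend on it, so migrations always run first.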

Related

Please advise on a modern architecture for an ASP.NET WebApi application

Please advise on a modern architecture for an ASP.NET Web API application (preferably .NET Framework, not Core) with the latest innovations. For example: the unit of work pattern, controller -> manager -> repository, AutoMapper, xUnit, Serilog or..., a reliable migration mechanism for Oracle (Liquibase or...), and asynchronous execution (async/await).
Requirements: authentication via AD/Windows, DB is Oracle. If there are code examples, that would be ideal.
UPDATE 1
What about the ORM and migration system? In my opinion, a code-first approach with EF migrations is good because the model matches the DB completely, but it is risky. Previously I made a Database Project for MS SQL Server, and all change scripts were generated by comparing the schema of the original DB with the DB changed by EF migrations, like this:
create models
apply migration to local (dev) database
compare the original DB and the changed DB, extract/generate change SQL scripts
create deployment migration for CI/CD (DB project, liquibase or similar)
rollback local DB and test deployment migration
commit and push
It looks strange, but practice shows that using EF's migrations directly can lead to data loss.
For me, an API (which your app exposes to consumers) can be thought of as just another type of presentation layer.
If you assume that's the case, then any sensible .NET architecture would suit. Personally I follow a logical approach like this: 5-Layer Architecture - https://morphological.files.wordpress.com/2011/08/5-layer-architecture-draft.pdf
Not sure if that answers your question though. I think if you get the logical architecture sorted out, then determining which technologies you use to implement it isn't so hard.
The 5-Layer architecture referenced above uses dependency injection, so you can use whatever database technology you like.
Use of sync vs async depends on the nature of the actual functional and technical problems you're trying to solve.
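To make the DI point concrete, here is a minimal sketch of the idea (written in Java for brevity; the shape is identical in C#). The names are illustrative only: the business layer depends on a repository abstraction, so the database technology behind it can be swapped freely:

```java
// The business layer depends only on this abstraction.
interface CustomerRepository {
    Customer findById(long id);
}

// One concrete implementation; could be backed by Oracle, EF, JDBC, anything.
class OracleCustomerRepository implements CustomerRepository {
    @Override
    public Customer findById(long id) {
        // Oracle-specific data access would go here.
        return new Customer(id, "example");
    }
}

// The service receives its repository via constructor injection and
// never learns which database technology sits behind it.
class CustomerService {
    private final CustomerRepository repository;

    CustomerService(CustomerRepository repository) {
        this.repository = repository;
    }

    Customer getCustomer(long id) {
        return repository.findById(id);
    }
}

record Customer(long id, String name) {}
```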

Best practice for creating topics in dotnet core

I've just started working with Kafka together with C# (dotnet core). I have spun up a Kafka environment with Schema Registry using Docker (Confluent images).
I have a hobby project where I try to implement a microservice architecture. I will use Kafka to handle my IntegrationEvents between services.
Right now I have created my Kafka topics through the Confluent UI, but I would really like to have configuration as code - e.g. I already have database migrations using EF Core Migrations and my cloud environment in Terraform.
What would be the best practice for creating topics? Right now I am thinking of creating topics when my application starts up (if the topic already exists, it does nothing). The application responsible for creating a topic would be the one that needs to produce to it.
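(For concreteness, that idempotent create-on-startup idea looks roughly like this with Kafka's Java AdminClient; the Confluent .NET client exposes an equivalent admin client. Topic name and sizing are placeholders.)

```java
import java.util.List;
import java.util.Properties;
import java.util.concurrent.ExecutionException;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;
import org.apache.kafka.common.errors.TopicExistsException;

public class TopicBootstrap {

    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // 3 partitions, replication factor 1 -- placeholder sizing.
            NewTopic topic = new NewTopic("order-integration-events", 3, (short) 1);
            try {
                admin.createTopics(List.of(topic)).all().get();
            } catch (ExecutionException e) {
                // Idempotent: an already-existing topic is fine; rethrow anything else.
                if (!(e.getCause() instanceof TopicExistsException)) {
                    throw e;
                }
            }
        }
    }
}
```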
Any input or ideas on what I can improve, or have I missed something that could potentially cause me a lot of trouble?
Best regards Martin
I have decided to just use a command in my docker-compose file. I still haven't decided what my hosting environment will be, i.e. whether to go with Kubernetes all the way, or use only Azure services, or Confluent Cloud.
So I will postpone that decision, and when the time comes I will adjust the code or the Terraform configuration accordingly.
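For reference, the docker-compose command approach looks something like the following sketch: a one-shot service that creates the topic with --if-not-exists and exits. The image tag, topic name, and the assumed broker service are placeholders:

```yaml
services:
  kafka-setup:
    image: confluentinc/cp-kafka:7.4.0
    depends_on:
      - kafka   # assumes a broker service named "kafka" in the same file
    # One-shot container: create the topic if it is missing, then exit.
    # (A real setup may need to wait for the broker to accept connections.)
    command: >
      kafka-topics --create --if-not-exists
        --bootstrap-server kafka:9092
        --topic order-integration-events
        --partitions 3 --replication-factor 1
```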

Deploying microservice to be tested within the test [duplicate]

This question already has an answer here: Is there a way to run Karate tests as an integration test suite against a pre-booted spring boot server?
Maybe this is not possible to do generically in a test framework, but I would like to be able to deploy the microservice I am testing within the test itself. I have looked at Citrus, RestAssured, and Karate, and listened to countless talks and read countless blogs, but I never see how to do this first stage. It always seems to be assumed that the microservice is pre-deployed.
Honestly, it depends on how your microservice is deployed and which infrastructure you are targeting. I prefer to integrate the deployment into the Maven build, as Maven provides pre- and post-integration-test phases.
In case you can use Kubernetes or Docker, I would recommend integrating the deployment with the fabric8 Maven plugins (fabric8-maven-plugin, docker-maven-plugin). These automatically create/start/stop the Docker container deployment within the Maven build.
In case you use Spring Boot, the official Maven plugin can do the same.
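For the Spring Boot case, the wiring is a small pom.xml fragment along these lines, binding the plugin's start/stop goals to the integration-test phases (version omitted; the surrounding build section is assumed):

```xml
<plugin>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-maven-plugin</artifactId>
  <executions>
    <!-- Boot the app before integration tests, stop it afterwards. -->
    <execution>
      <id>start-app</id>
      <phase>pre-integration-test</phase>
      <goals><goal>start</goal></goals>
    </execution>
    <execution>
      <id>stop-app</id>
      <phase>post-integration-test</phase>
      <goals><goal>stop</goal></goals>
    </execution>
  </executions>
</plugin>
```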
Another possibility would be to use build pipelines, where a continuous build with Jenkins, for example, would deploy the system under test first and then execute the tests.
I personally prefer to always separate deployment and testing tasks. In case you really want to do a deployment within your test, Citrus as a framework is able to start/stop Docker containers and/or Kubernetes pods within the test. Citrus can also integrate with before/after test-suite phases for these deployment tasks.
I found a way to do it using docker-compose.
https://github.com/djangofan/karate-test-prime-example
Basically, make a docker-compose.yml that runs your service container and then also runs e2e tests after a call to wait-for-it.sh.
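The shape of that compose file is roughly the following (service names, ports, and the test command are illustrative, not the repo's exact contents):

```yaml
services:
  my-service:
    build: .
    ports:
      - "8080:8080"

  e2e-tests:
    build: ./e2e
    depends_on:
      - my-service
    # wait-for-it.sh blocks until the service answers on its port,
    # then hands off to the test run.
    command: ["./wait-for-it.sh", "my-service:8080", "--", "mvn", "test"]
```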
2 points:
The karate-demo is a Spring Boot example that is deployed by the JUnit test-runner; the code that starts the server is in the demo project (see the sketch after these two points).
The karate-mock-servlet takes things one step further: you can run HTTP integration tests within your project without booting an app-server. This saves time, and code-coverage reports are easier.
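As promised above, a sketch of the point-1 idea: a JUnit base class boots the Spring Boot app once before any Karate feature runs (class names are placeholders, not the demo's exact code):

```java
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.springframework.boot.SpringApplication;
import org.springframework.context.ConfigurableApplicationContext;

public abstract class TestBase {

    private static ConfigurableApplicationContext context;

    @BeforeClass
    public static void startServer() {
        // Boot the application under test once, before any feature file runs.
        // DemoApplication stands in for the app's @SpringBootApplication class.
        context = SpringApplication.run(DemoApplication.class);
    }

    @AfterClass
    public static void stopServer() {
        context.close();
    }
}
```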
If you have any requirements beyond these, I'd be happy to hear them. One of the things we would like to implement is a built-in server-side mocking framework - think embedded WireMock, but with the ease of Karate's DSL. There is no concrete timeline yet, though.
EDIT: Karate has mocking now: https://github.com/intuit/karate/tree/master/karate-netty

Database deployment in rapid development environment ASP.NET Webforms/SQL Server

We do rapid development of web applications and we're looking for ways to separate our development and production databases (we currently develop directly on production... it's bad news).
We use ASP.NET Webforms with LINQ2SQL and Dynamic Data for CRUD. How can we do database development locally and then deploy changes to production? I've seen Entity Framework Code-first migrations, but I don't know of any equivalent for LINQ2SQL. We don't want to switch to EF as our CMS is built around LINQ2SQL.
We would also need production data to be available locally (not up to the minute, but recent enough) so we can debug with real data if problems arise.
This is the only idea I've come up with so far but it's far from ideal:
Initial development is done locally then deployed to production
Subsequent maintenance is then done on a local replication of the production database. Then we use some kind of 'database diff' tool to determine the changes that were made, and migrate those changes to production.
Is this an acceptable way of doing things? Is there a better way we could use?
Thanks
Develop your data model and procedures in SSDT Database Projects. This keeps a perfect source-controlled copy of what you want the database to look like at any moment in time. Then let the tooling generate the publishing scripts for you.
Developers should always develop on their own local copy of a database. They can check out scripts from the database project and make changes, which they publish locally. They can get the latest on the checked-in project, merge their changes, deploy locally again, test it out, and then check in their changes. Only when everything is tested out do you publish the changes to production.
You end up treating your database schema very much like code source files.
To get production data down to your development server, I would take a .bacpac of the production DB and then import it into your local DB (a .dacpac contains only the schema, while a .bacpac includes the data). This works well because you need the schema definition along with the data, since it is likely that prod is an older version than what you have in dev.
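Concretely, that import is a single SqlPackage command; the file, server, and database names here are placeholders:

```
SqlPackage /Action:Import /SourceFile:"ProdCopy.bacpac" /TargetServerName:"localhost" /TargetDatabaseName:"MyAppDev"
```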
Yeah I think you basically hit the nail on the head. Those are the two things I have done.
You develop locally and check in your SQL scripts to source control. Then you run the scripts for a deployment. What I've seen work well is dropping/re-creating all stored procedures (seems scary, but if you trust those scripts it's very helpful), and then having one-off scripts per deployment for schema changes and data migration.
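The drop/re-create pattern means every stored-procedure script is idempotent and safe to re-run, along these lines (procedure and table names are illustrative):

```sql
IF OBJECT_ID('dbo.GetCustomerById', 'P') IS NOT NULL
    DROP PROCEDURE dbo.GetCustomerById;
GO

CREATE PROCEDURE dbo.GetCustomerById
    @CustomerId INT
AS
BEGIN
    SELECT CustomerId, Name
    FROM dbo.Customer
    WHERE CustomerId = @CustomerId;
END
GO
```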
Periodically you will copy down production data and restore it locally. Obviously this sync can only happen easily right after a deployment, since that's when local and production schemas will be the same. At my current job we actually duplex writes and send a copy to the lower environments, so I suppose that's an option. You could also replicate data from production somewhere else and work off of that, or write a tool to bring the data into local.
From what I've seen there are no easy answers.

Deploying ASP.net MVC Applications to Staging and Production with SQL

We have been a ColdFusion shop for 10 years, and are now switching over to ASP.net MVC. Our target framework is .net 4.0 BETA 2 using VS 2010 BETA 2. We set up two instances of Windows Server 2008 (staging and production), and will be using our existing database server (SQL Server 2008).
None of us really have much experience in ASP.NET itself, though we are all very comfortable in C# and the MVC pattern. The coding itself isn't much of an issue, but the deployment process is. Our goal is a CI setup that will automatically pull down and test our applications in staging on commit, then give us the option to tag, and then switch, the checkouts on our production sites when websites pass QA.
One of the things I'm having issues with here is the concept of an ASP.NET application and how it integrates with SVN. CF, like PHP or RoR, is a scripting language and as such requires no build process (checking out the source into production is very straightforward). But in this case, applications need to be compiled - which is where we start to have problems. Will we need to create another server (or use an existing one) that has some sort of application that pulls down code, compiles it, then somehow pushes it onto the live servers? If so, what is considered the best way to accomplish this? I imagine if we end up using a build tool such as NAnt, adding additional steps to migrate the database would be trivial, but what is the best way to accomplish this as well?
Another, slightly unrelated, problem is how our designers will work with our code. Most of them are on Macs, and using VS isn't much of an option. How will they be able to edit the aspx, css and image files easily? Our goal is to make this as transparent as possible to them.
We have done a lot of shopping around, and ASP.net MVC seems to be the best option as far as our familiarity with the language, and our current platform. We just need to figure out a good build process so everything is as transparent as possible. I understand there are a ton of resources available on this, but I wanted to get the opinions of the people here from first-hand experience.
Microsoft TFS has a wonderful build solution built-in. It's costly, but effective. In addition, you cannot lose by looking at CruiseControl, which is free. TeamCity from JetBrains is also a great option. All of these Continuous Build and Integration solutions would provide a good starting point for your research.
http://msdn.microsoft.com/en-us/teamsystem/dd408382.aspx
http://www.cruisecontrol.com/
http://www.jetbrains.com/teamcity/
Even Draco.net is a good consideration:
http://draconet.sourceforge.net/
We use CruiseControl (http://www.cruisecontrol.com/) running on our SVN/build server. You can configure CC via its own config/script files to pull down the latest source from SVN and then spawn one or more NAnt or MSBuild scripts, which can perform your build and deployment.
We script all of our database changes into change scripts which also go into SVN. We then have a custom command line tool which will deploy the change scripts to SQL Server during the web site deployment. All of that is done in the Nant script.
So each project's Nant script handles the build, web site deployment and SQL change script deployment.
The tricky part is handling rollbacks if/when something goes horribly wrong. I would suggest posting another question for that specific problem.
