We are trying to deploy the Docker images on the Red Hat OpenShift platform and were able to deploy with the default Ubuntu-based images.
However, the Docker files need to align with our production environment, so we need to convert the base image from Ubuntu to Red Hat.
Has anyone deployed API Manager on OpenShift or converted the Docker images to Red Hat?
OpenShift runs Kubernetes underneath, so WSO2 products can be deployed on OpenShift using the Kubernetes membership scheme for clustering.
Converting WSO2 Docker images to Red Hat should not be a problem. You only need to convert the Dockerfile and a few scripts from Ubuntu to Red Hat.
There are existing deployments running wso2 products on OpenShift.
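As a rough sketch, the conversion mostly comes down to swapping the base image and the package-manager calls. Assuming the Dockerfile currently starts from an Ubuntu base, a Red Hat UBI equivalent might look like this (the image tag and package names are illustrative, not the actual WSO2 Dockerfile):

```dockerfile
# Before (Ubuntu base):
# FROM ubuntu:18.04
# RUN apt-get update && apt-get install -y curl unzip

# After (Red Hat UBI base) -- tag and packages are illustrative assumptions:
FROM registry.access.redhat.com/ubi8/ubi

# UBI images use yum/dnf instead of apt-get
RUN yum install -y curl unzip \
    && yum clean all

# The rest of the Dockerfile (COPY, ENTRYPOINT, etc.) usually carries over
# unchanged, apart from any apt-specific helper scripts.
```

Any shell scripts the image runs that call `apt-get` would need the same treatment.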
I have started learning Airflow. As part of my research, I installed Airflow locally through Docker, following the official page: Airflow Install Docker.
I am looking for a (standard) process to deploy Airflow to Azure.
Can I directly use the same docker-compose file for that?
Any help will be appreciated.
Likely the easiest way is to use AKS (Azure Kubernetes Service) and deploy Airflow on it with the official Helm chart from the Apache Airflow community:
https://airflow.apache.org/docs/helm-chart/stable/index.html
I reckon installing Airflow through docker-compose.yml on Azure wouldn't be the right approach; it is encouraged for local installations only.
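A minimal sketch of installing the official chart onto an existing AKS cluster (the resource group, cluster, release, and namespace names below are my own example placeholders):

```shell
# Point kubectl at your AKS cluster (resource group/cluster names are placeholders)
az aks get-credentials --resource-group my-rg --name my-aks-cluster

# Add the official Apache Airflow Helm repository and install the chart
helm repo add apache-airflow https://airflow.apache.org
helm repo update
helm install airflow apache-airflow/airflow \
    --namespace airflow --create-namespace
```

From there, configuration (executor, database, DAG sync) is done through the chart's values file.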
The way we have set this up in Azure is like this:
Database: Azure Database for PostgreSQL server
Webserver + Scheduler: Azure App Service on a Linux App Service Plan
Docker: Azure Container Registry
Build the Docker image (either locally or in a CI/CD pipeline)
Push the Docker image to Azure Container Registry
Set up the Postgres server and create the airflow database
Set up the Azure App Service to pull the Docker image from ACR (you can also set up continuous integration). Remember to configure environment variables.
Obviously, as Jarek mentioned, you could also go with setting up AKS.
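The build-and-push steps above can be sketched with the Azure CLI and Docker (the registry and image names are placeholders, not a prescribed naming scheme):

```shell
# Log in to Azure and to the container registry (names are placeholders)
az login
az acr login --name myregistry

# Build the Airflow image and push it to Azure Container Registry
docker build -t myregistry.azurecr.io/airflow:latest .
docker push myregistry.azurecr.io/airflow:latest
```

The App Service is then pointed at `myregistry.azurecr.io/airflow:latest` in its container settings.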
You can try the preconfigured Kubernetes Airflow images packaged by Bitnami and deploy them in AKS:
bitnami/airflow
I work within a corporation that has very strict security policies.
I am running a single Docker container consisting of an ASP.NET Core 3.1 MVC web app. It is based on the default ASP.NET Debian 10 image provided by Microsoft.
I have only installed Docker Engine (so no Docker Compose) and have not needed any additional setup/configuration; I run the container with a simple command:
docker run -p port-x:port-y imagename:tag
When I navigate to http://host-ip, I can access the web app from my Windows dev machine.
However, the host is RHEL running in a very restricted enterprise network.
Now I need to connect to an external domain from within the container, but that can only happen via an internal proxy.
The proxy team requires the source IP address.
I have two questions:
Would that be the host (RHEL) IP address?
Is there a way to test this? (I cannot/am not allowed to install custom software/libraries on RHEL and only have access to a basic, limited toolset.)
If anyone stumbles upon this: the source IP address is the IP address of the RHEL host where the container is running, because outbound traffic from containers on Docker's default bridge network is NATed (masqueraded) to the host's address.
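One way to verify this with only basic tools is to compare the host's address with the source address the far side reports for a request made from inside the container. This is a sketch: the proxy address is a placeholder, `ifconfig.me` stands in for whatever IP-echo endpoint your network actually permits, and it assumes `curl` exists inside the image:

```shell
# On the RHEL host: note the host's IP address
hostname -I

# From inside the running container, make a request through the proxy and
# check which source address the other end reports (should match the host IP)
docker exec <container-id> \
    curl -s --proxy http://internal-proxy:3128 https://ifconfig.me
```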
I am using on-prem JFrog Artifactory to hold Maven artifacts. I also need to use this Artifactory instance to hold my app's Docker images, so I tried creating a new Docker repository, but it showed the Docker type as disabled. Looking around, people suggest using the JFrog Container Registry.
My question is: can't I create a Docker repository in my existing Artifactory? Does it require a plugin to be downloaded to add this functionality to the existing Artifactory, and how do I enable the Docker repository option when creating a new repository?
It seems you are using the Artifactory OSS version and not Artifactory Pro; Artifactory OSS is limited, as it is an open-source version intended for Maven-based projects. As everyone recommends, you can make use of the JFrog Container Registry, which is a free-to-use application for handling Docker registries.
Otherwise, you can use the free cloud Artifactory to handle all the different repository types in a single Artifactory instance.
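Once you have a registry that supports Docker repositories (JFrog Container Registry or cloud Artifactory), pushing an image is the standard Docker workflow; the hostname and repository name here are placeholders for your own instance:

```shell
# Log in to the registry (hostname is a placeholder for your instance)
docker login myinstance.jfrog.io

# Tag the app image against the registry's Docker repository and push it
docker tag myapp:1.0 myinstance.jfrog.io/docker-local/myapp:1.0
docker push myinstance.jfrog.io/docker-local/myapp:1.0
```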
I have a running dev installation of a dockerized WordPress project (via docker-compose) and was wondering what the best approach would be to deploy it to Azure. Most tutorials assume you are starting from scratch, but I already have an installation, so I am wondering whether I just need to integrate the Azure CLI into the compose setup, or set up Azure and migrate what was already done.
I have an account, but do I provision a Docker VM or a WordPress offering from Azure?
What about the database?
The easiest path is to create a new Azure Container Service instance, which can host Docker Swarm, and deploy there. Install the Azure CLI 2.0 on your machine and follow the tutorial to get started.
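Since you already have a working docker-compose file, the actual deployment step against a Swarm endpoint can be a single stack deploy. This is a sketch assuming your compose file uses the version 3 format and that `DOCKER_HOST` is already pointed at the Swarm manager; the stack name is an example:

```shell
# With DOCKER_HOST pointed at the Swarm manager endpoint,
# deploy the existing compose file as a stack
docker stack deploy -c docker-compose.yml wordpress

# Check that the wordpress and database services came up
docker stack services wordpress
```

For the database, you can either keep the MySQL container from your compose file or migrate it to a managed service such as Azure Database for MySQL.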
I have a MySQL database and a WebAPI project.
After installing Docker for Windows, I can download the MySQL image in Linux containers mode, but when I try
docker pull microsoft/aspnet
it says
unknown blob
On the other hand, in Windows containers mode I can install aspnet, but when I try
docker pull mysql/mysql-server
it says
image for Linux
How can I combine both in a single environment?
At the moment I would recommend using the aspnetcore container, since it uses Linux as a base and can be used with every other Docker container.
The number of Windows-only containers is limited, to say the least.
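In Linux containers mode the two images can then run side by side. A minimal docker-compose sketch (credentials, ports, and tags are placeholder values; in practice you would build your WebAPI image `FROM` the aspnetcore base rather than run the bare runtime image):

```yaml
version: "3"
services:
  db:
    image: mysql/mysql-server:5.7
    environment:
      MYSQL_ROOT_PASSWORD: example   # placeholder credential
      MYSQL_DATABASE: appdb
  web:
    image: microsoft/aspnetcore:2.0  # Linux-based ASP.NET Core base image
    ports:
      - "8080:80"
    depends_on:
      - db
```

The WebAPI container reaches MySQL at the hostname `db` on the compose network.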