As the title states, running helm upgrade is not updating my user pods. I currently have culling and scheduling disabled (via the Helm chart)... but I believed that wouldn't/shouldn't matter. Is there another option I'm missing or misunderstanding?
JupyterHub chart version 0.11.1
More information: I can tell that the hub, proxy, and other deployments update when running helm upgrade, so why wouldn't the user pods as well?
Update: this matters whenever the Docker image the user pods rely on is updated in any way.
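A likely explanation (hedged): unlike the hub and proxy Deployments, user pods are spawned on demand by KubeSpawner rather than managed by a Deployment, so helm upgrade only changes the image that newly spawned servers will use; already-running servers have to be stopped and started again to pick it up. A minimal sketch with the Zero to JupyterHub chart (the release name, values file, and image name/tag below are assumptions, not anything from the original setup):

# config.yaml -- values for the Zero to JupyterHub chart (image name and tag are placeholders)
singleuser:
  image:
    name: myregistry/my-notebook
    tag: "2021-05-01"

# Only servers spawned after this upgrade use the new tag; running servers keep the old one
helm upgrade jhub jupyterhub/jupyterhub --version 0.11.1 --values config.yaml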
Related
I have started learning/trying Airflow. As part of my research, I have installed Airflow locally through Docker. I referred to the official page: Airflow Install Docker.
I am looking for a (standard) process by which I can deploy Airflow to Azure.
Can I directly use the same docker-compose file for that?
Any help will be appreciated.
Likely the easiest way is to use AKS (Azure Kubernetes Service) and deploy Airflow on it with the official Helm chart from the Apache Airflow community:
https://airflow.apache.org/docs/helm-chart/stable/index.html
I reckon installing Airflow through docker-compose.yml on Azure wouldn't be the right approach; it is probably encouraged for local installations only.
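For reference, a minimal sketch of that path once an AKS cluster is available (the release and namespace names are placeholders):

helm repo add apache-airflow https://airflow.apache.org
helm repo update
helm upgrade --install airflow apache-airflow/airflow --namespace airflow --create-namespace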
The way we have set this up in Azure is like this:
Database: Azure Database for PostgreSQL server
Webserver + Scheduler: Azure App Service on a Linux App Service Plan
Docker: Azure Container Registry
Build the Docker image (either locally or in a CI/CD pipeline)
Push the Docker image to Azure Container Registry
Set up the Postgres server and create the airflow database
Set up Azure App Service to pull the Docker image from ACR (you can also set up continuous deployment). Remember to configure environment variables. See the CLI sketch below.
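Roughly, in Azure CLI terms (all resource, registry, and app names below are placeholders, not our actual setup):

# Build and push the image to Azure Container Registry
az acr build --registry myairflowacr --image airflow-custom:1.0 .

# Create the Postgres server and the airflow database
az postgres server create --resource-group airflow-rg --name airflow-pg --admin-user airflow --admin-password '<password>' --sku-name GP_Gen5_2
az postgres db create --resource-group airflow-rg --server-name airflow-pg --name airflow

# Create a Linux App Service plan and a web app running the container from ACR
az appservice plan create --resource-group airflow-rg --name airflow-plan --is-linux --sku P1V2
az webapp create --resource-group airflow-rg --plan airflow-plan --name my-airflow-app --deployment-container-image-name myairflowacr.azurecr.io/airflow-custom:1.0
# (the web app also needs access to ACR, e.g. via admin credentials or a managed identity)

# Configure environment variables, e.g. the metadata DB connection string
az webapp config appsettings set --resource-group airflow-rg --name my-airflow-app --settings AIRFLOW__CORE__SQL_ALCHEMY_CONN='<postgres-connection-string>'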
Obviously, as Jarek mentioned, you could also go with setting up AKS.
You can try the preconfigured Kubernetes Airflow 2 images packaged by Bitnami and deploy them in AKS:
bitnami.airflow
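A hedged sketch of installing that chart on an existing AKS cluster (the release name is an assumption):

helm repo add bitnami https://charts.bitnami.com/bitnami
helm repo update
helm install airflow bitnami/airflow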
I followed these instructions to create a Hello-World WebApp with C#/.NET Core 3.1 with Visual Studio 2019 V16.5.2 and deployed to Azure Kubernetes Services (AKS) and it worked great. In addition to being able to debug/single-step with Visual Studio, I could also use the kubectl run command and edit the deployment to make it a LoadBalancer and see my hello-world web page in the browser.
Then I published the same image of the WebApp to my personal DockerHub account and tried to deploy to Docker for desktop/Kubernetes. When doing kubectl get pods -o wide I saw that the status of the pod is ImagePullBackOff, and kubectl logs <podname> fails to give me a log that might have some hints as to the problem.
(1) How does one diagnose a problem like this with no log files?
(2) Could this be a bug since this image works on AKS? I'd prefer to use Docker Desktop.
Thanks
I am using this:
kubectl describe pod {pod-id}
It shows a whole bunch of information, including errors related to pulling the image.
Also, most probably your ImagePullBackOff is related to an authentication problem between Kubernetes and the image registry.
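For example (a hedged sketch; the secret name and credentials are placeholders, and it assumes the Docker Hub repository is private, which would explain the pull failure):

kubectl describe pod <pod-name>

# If the Events section shows a "pull access denied" / authentication error,
# create a registry credential and reference it from the pod spec:
kubectl create secret docker-registry regcred \
  --docker-server=https://index.docker.io/v1/ \
  --docker-username=<dockerhub-user> \
  --docker-password=<dockerhub-password>

# then in the deployment's pod template:
#   spec:
#     imagePullSecrets:
#       - name: regcred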
How do I get a link for Elastalert in the Kibana console? I have installed Elastalert on the machine with Kibana and it is running fine. However, I am not sure how to get it displayed in the menu bar.
My Kibana Console
The official Elastalert doesn't have any Kibana plugin. However, the folks at Bitsensor have developed their own fork of Elastalert that runs a server (on port 3030) exposing REST APIs for manipulating rules and alerts, and for that they have developed a Kibana plugin. So, to use the Kibana plugin you should use their fork.
Relevant Links:
Bitsensor Elastalert
Kibana Plugin for Bitsensor Elastalert
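Installing a Kibana plugin generally looks like the sketch below; the release URL is a placeholder, so grab the archive matching your Kibana version from the Bitsensor repository's releases page:

# run from the Kibana installation directory
bin/kibana-plugin install https://github.com/bitsensor/elastalert-kibana-plugin/releases/download/<version>/elastalert-kibana-plugin-<version>.zip
# restart Kibana afterwards so the plugin appears in the side menu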
I have a running dev installation of a dockerized WordPress project (via docker-compose) and was wondering what would be the best approach to deploying it to Azure. Most tutorials assume you are starting from scratch; however, I already have an installation and was wondering if I just need to integrate the azure-cli with the compose setup, or set up Azure and migrate what was already done.
I have an account, but do I spin up a Docker VM or a WordPress offering from Azure?
What about the database?
The easiest path is for you to create a new Azure Container Service, which can host Docker Swarm, and deploy it there. Install the Azure CLI 2.0 on your machine and follow the tutorial to get started.
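A rough sketch with the Azure CLI 2.0 of that era (resource names are placeholders; note that Azure Container Service has since been retired in favour of AKS):

az group create --name wordpress-rg --location westeurope
az acs create --resource-group wordpress-rg --name wp-swarm \
  --orchestrator-type Swarm --agent-count 2 --generate-ssh-keys
# then tunnel to the Swarm master and bring up the existing compose project, e.g.:
# docker-compose up -d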
We are trying to deploy the Docker images on the Red Hat OpenShift platform and were able to deploy with the default Ubuntu-based files.
But we need the Dockerfiles to align with the production environment, so we need to convert the base from Ubuntu to Red Hat.
Has anyone deployed API Manager in OpenShift or converted the Docker images to Red Hat?
OpenShift runs Kubernetes underneath, therefore WSO2 products can be deployed in OpenShift with the Kubernetes membership scheme for clustering.
Converting the WSO2 Docker images to Red Hat should not be a problem. You only need to convert the Dockerfile and a few scripts from Ubuntu to Red Hat.
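Very roughly, the kind of change involved (a hedged sketch, not the actual WSO2 Dockerfile; the base image and packages are illustrative only):

# Before (Ubuntu-based)
# FROM ubuntu:18.04
# RUN apt-get update && apt-get install -y curl unzip

# After (Red Hat UBI-based)
FROM registry.access.redhat.com/ubi8/ubi
RUN yum install -y curl unzip && yum clean all
# ...then copy the WSO2 API Manager distribution and scripts as in the original Dockerfile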
There are existing deployments running WSO2 products on OpenShift.