AWS Airflow Add provider - MongoDB - airflow

I'm new to AWS Airflow.
I have an unmanaged Airflow deployment on an AWS EC2 instance that I want to move to the managed one (MWAA).
When I went through the managed Airflow, I saw that the MongoDB provider is missing.
How can I add it to the "connection type" drop-down?
I couldn't find any wizard to guide me through this.

You'll need to install the MongoDB provider package in the MWAA environment: add the package apache-airflow-providers-mongo to your environment's requirements.txt. The MongoDB provider package is not installed by default in MWAA.
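A minimal sketch of that change (the version pin, bucket, and environment names are placeholders; check PyPI for a version compatible with your Airflow version before uploading):

```shell
# Append the MongoDB provider to the requirements.txt your MWAA
# environment reads from S3 (the version pin is illustrative).
echo "apache-airflow-providers-mongo==3.2.0" >> requirements.txt

# Then upload the file and update the environment, e.g.:
#   aws s3 cp requirements.txt s3://<your-mwaa-bucket>/requirements.txt
#   aws mwaa update-environment --name <your-env> --requirements-s3-path requirements.txt
```

Once the environment finishes updating, the Mongo connection type should become available.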
References
apache-airflow-providers-mongo (Airflow)
Installing Python dependencies (MWAA)
Default provider packages (GitHub)

Related

How can I install the NebulaGraph database on AWS without deploying the workbench?

I'm installing a NebulaGraph cluster on AWS, but I don't need the workbench service, which includes Explorer, Dashboard, and so on.
How can I avoid installing the workbench when deploying NebulaGraph with CloudFormation?
To the best of my knowledge, there is currently no way, unless you deploy the cluster manually or download the CloudFormation template and modify its parameters by hand.

In Airflow UI, how to add SQL Server connection type

I set up Airflow on my machine using the Astro CLI, as described here: https://www.astronomer.io/guides/get-started-airflow-2/ . I wanted to use a SQL Server connection in the Airflow UI, but I don't see it anywhere.
Can someone tell me how I can add it, since I need this provider to access SQL Server tables?
I am using Astronomer Runtime 5.0.6, based on Airflow 2.3.3+astro.1.
You need to pip install the SQL Server provider package:
https://pypi.org/project/apache-airflow-providers-microsoft-mssql/
You can edit the Dockerfile created by the Astro CLI so the package is installed when the tool brings up the environment.
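A minimal sketch of that Dockerfile edit (the base image tag should match your runtime version; Astro projects also let you list the package in the project's requirements.txt instead):

```dockerfile
FROM quay.io/astronomer/astro-runtime:5.0.6

# Install the SQL Server provider so the MSSQL connection type shows up
# in the Airflow UI after restarting the environment.
RUN pip install --no-cache-dir apache-airflow-providers-microsoft-mssql
```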

Deploy Airflow 2.0+ on Azure

I have started learning/trying Airflow. As part of my research, I installed Airflow locally through Docker, following the official page Airflow Install Docker.
I am looking for a (standard) process for deploying Airflow to Azure.
Can I directly use the same docker-compose file for that?
Any help will be appreciated.
Likely the easiest way is to use AKS (Azure Kubernetes Service) and use the official Helm Chart of Apache Airflow from the Apache Airflow community to deploy Airflow on it:
https://airflow.apache.org/docs/helm-chart/stable/index.html
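As a sketch, deploying the chart to an existing AKS cluster looks roughly like this (the release and namespace names are illustrative, and kubectl must already be pointed at the cluster):

```shell
helm repo add apache-airflow https://airflow.apache.org
helm repo update
helm install airflow apache-airflow/airflow \
    --namespace airflow --create-namespace
```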
I reckon installing Airflow through docker-compose.yml on Azure wouldn't be the right approach; it is encouraged for local installations only.
The way we have set this up in Azure is like this:
Database: Azure Database for PostgreSQL server
Webserver + Scheduler: Azure App Service on a Linux App Service Plan
Docker: Azure Container Registry
Build the Docker image (either locally or in a CI/CD pipeline)
Push the Docker image to Azure Container Registry
Set up the Postgres server and create the airflow database
Set up the Azure App Service to pull the Docker image from ACR (you can also set up continuous deployment). Remember to configure environment variables.
Obviously, as Jarek mentioned, you could also go with setting up AKS.
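The first two steps above can be sketched as follows (the registry and image names are placeholders):

```shell
# Build the image locally (or in your CI/CD pipeline) and push it to
# Azure Container Registry so App Service can pull it.
az acr login --name myregistry
docker build -t myregistry.azurecr.io/airflow:latest .
docker push myregistry.azurecr.io/airflow:latest
```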
You can try the preconfigured Kubernetes Airflow 2 images packaged by Bitnami and perform the deploy on AKS:
bitnami/airflow

Connection type not added on MWAA's Web UI after python dependencies installed successfully

I have successfully enabled a MWAA environment (Airflow version: 2.0.2) and installed Python dependencies for MSSQL, MySQL, and Oracle. But connection types for these three databases are not added to the drop-down list in the Web UI. Any idea?
I have checked the CloudWatch log for the scheduler's requirements installation, and it says everything installed successfully:
Successfully installed apache-airflow-providers-microsoft-mssql-1.1.0 apache-airflow-providers-mysql-1.1.0 apache-airflow-providers-oracle-1.1.0 cx-Oracle-8.2.1 mysql-connector-python-8.0.22 mysqlclient-2.0.3 protobuf-3.17.3 pymssql-2.2.1
Is it because MWAA is installed under /usr/local/lib/python3.7/site-packages while the Python dependencies are installed under ~/.local/lib/python3.7/site-packages? How do I specify the directory for package installation?
Or is it that the web server did not restart after the dependencies were installed, so the Web UI does not reflect the newly installed packages? How do I restart the web server if that is the case?
[UPDATE]
I have tried to recreate the environment and specify the requirements file from the beginning. The connections are still not showing in the drop down list on Web UI.
I'm afraid some connection types are not supported by MWAA; see the mailing list thread where the maintainers explain their concerns. The issue is that cloud service providers often do not allow installing dependencies on the web server.
Starting from MWAA 2.2.2, dependencies are installed on the web server as well, so the correct connection type can now be selected.
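One way to check which providers (and hence connection types) a given Airflow installation actually sees is the Airflow CLI, for example run locally against the same requirements file:

```shell
# List the provider packages visible to this Airflow installation.
airflow providers list

# List the registered hooks and their connection types, which is what
# the Web UI's connection-type drop-down is built from.
airflow providers hooks
```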

How does Meteor Up work?

I recently created a droplet on Digital Ocean, and then just used Meteor Up to deploy my site to it.
As awesome as it was to not have to mess with all of the details, I'm feeling a little worried and out of the loop about what's happening with my server.
For example, I was using the console management that Digital Ocean provides, and I tried to use the meteor mongo command to investigate what was happening with my database. It just errored, with command not found: meteor.
I know my database works, since records are persistent across accesses, but it seems like Meteor Up accomplished this without retaining any of the testing and development interfaces I grew used to on my own machine.
What does it do? And how can I get a closer look at what is going on behind the scenes?
Meteor Up installs your application to the remote server, but does not install the global meteor command-line utilities.
For those, simply run curl https://install.meteor.com | /bin/sh.
MUP does a few things. Note that MUP is currently under active development and some of this process will likely change soon. The new version will manage deployment via Docker, add support for meteor build options, and other cool stuff. Notes on the development version (mupx) can be found here: https://github.com/arunoda/meteor-up/tree/mupx.
mup setup installs (depending on your mup.json file) Node, PhantomJS, MongoDB, and stud (for SSL support). It also installs the shell script to setup your environment variables, as well as your upstart configuration file.
mup deploy runs meteor build on your local machine to package your meteor app as a bundled and zipped node app for deployment. It then copies the packaged app to the remote server, unbundles it, installs npm modules, and runs as a node app.
Note that meteor build packages your app in production mode rather than the debug mode that runs by default on localhost when you call meteor or meteor run. The next version of MUP will have a buildOptions property in mup.json that you can use to set the debug and mobileSettings options when you deploy.
Also, since your app is running directly via Node (rather than Meteor), meteor mongo won't work. Instead, you need to SSH into the remote server and call mongo appName.
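For example (the hostname and app name are placeholders):

```shell
# On the droplet, connect to the app's database directly; `meteor mongo`
# is unavailable there because the app runs as a plain Node process.
ssh root@your-droplet
mongo appName
```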
From there, @SLaks is right about how it sets things up on the server (from https://github.com/arunoda/meteor-up#server-setup-details):
This is how Meteor Up will configure the server for you based on the given appName or using "meteor" as default appName. This information will help you customize the server for your needs.
your app lives at /opt/<appName>/app
mup uses upstart with a config file at /etc/init/<appName>.conf
you can start and stop the app with upstart: start <appName> and stop <appName>
logs are located at: /var/log/upstart/<appName>.log
MongoDB is installed and bound to the local interface (it cannot be accessed from the outside)
the database is named <appName>