How do we pip3 install the email operator for apache-airflow?
ImportError: cannot import name 'EmailOperator' from 'airflow.operators'
I would imagine it would be similar to:
pip3 install apache-airflow-providers-apache-hive==4.0.0
There is no Email provider.
There used to be a backport of the email provider for Airflow 1.10, but the project decided against that, so EmailOperator was moved back to Airflow core (see PR).
To answer your question: you import EmailOperator directly from Airflow core:
from airflow.operators.email import EmailOperator
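As a minimal sketch of how it can be wired into a DAG (the dag_id, recipient address, and schedule below are illustrative, and it assumes SMTP is configured in the [smtp] section of airflow.cfg):

from datetime import datetime

from airflow import DAG
from airflow.operators.email import EmailOperator

with DAG(
    dag_id="email_example",          # illustrative dag_id
    start_date=datetime(2022, 1, 1),
    schedule_interval=None,          # run on manual trigger only
) as dag:
    notify = EmailOperator(
        task_id="send_email",
        to="someone@example.com",    # placeholder recipient
        subject="Airflow notification",
        html_content="The DAG ran successfully.",
    )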
I am using the Python snowflake.connector library to connect to Snowflake from MWAA Airflow.
The code is working fine in Airflow version 2.0.2.
Now I am upgrading MWAA to Airflow version 2.2.2 and trying to execute the same code to connect to Snowflake. I created a new Airflow environment with version 2.2.2.
My requirements.txt file is as shown below:
apache-airflow==2.2.2
apache-airflow-providers-snowflake==2.3.0
snowflake-connector-python==2.7.0
snowflake-sqlalchemy==1.2.4
pandas==1.4.3
But I am getting an error from the Snowflake module in the Airflow UI, as shown below:
File "/usr/local/airflow/dags/shared/modules/connect_to_snowflake.py", line 11, in <module>
import snowflake.connector as sf
ModuleNotFoundError: No module named 'snowflake'
Also, I am not able to view any AWS CloudWatch logs.
I did not find any details about Snowflake-related changes between Airflow versions 2.0.2 and 2.2.2.
Can anyone help me resolve this issue?
Note: below is my requirements.txt file for MWAA Airflow version 2.0.2:
apache-airflow==2.0.2
snowflake-connector-python
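For context, the failing import is just standard connector usage; a rough sketch of such a module looks like the one below (the account, user, and warehouse values are placeholders, not taken from my actual code):

import snowflake.connector as sf

def get_snowflake_connection():
    # Placeholder credentials; in practice these come from an Airflow
    # connection or a secrets backend rather than being hard-coded.
    return sf.connect(
        account="my_account",
        user="my_user",
        password="my_password",
        warehouse="my_warehouse",
        database="my_database",
    )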
I am trying to get data from a txt file on an FTP server using GCP Composer tasks.
So I imported the SFTPOperator package in my code,
but this error occurred:
ModuleNotFoundError: No module named 'airflow.providers.sftp'
Then I tried a few things:
Getting exception "No module named 'airflow.providers.sftp'"
Installing apache-airflow-providers-sftp via Composer's PyPI packages
but they didn't work.
My GCP Composer Environment is as below:
Image Version: composer-1.17.7-airflow-2.1.4
Python version: 3
Network VPC-native: Enabled
How can I use SFTPOperator?
For this you will have to install the SFTP extra: pip install 'apache-airflow[sftp]'. You can check the built-in and extras packages that Airflow components have when installed (this varies from version to version).
Once you have it installed you should be able to use SFTPOperator by importing the operator inside your DAG:
from airflow import DAG
from airflow.providers.sftp.operators.sftp import SFTPOperation, SFTPOperator

with DAG(...) as dag:
    get_file_op = SFTPOperator(
        task_id="test_sftp",
        ssh_conn_id="ssh_default",
        local_filepath="/tmp/file.txt",
        remote_filepath="/tmp/tmp1/tmp2/file.txt",
        # GET copies remote_filepath from the SFTP server to local_filepath;
        # use SFTPOperation.PUT to upload instead.
        operation=SFTPOperation.GET,
    )
    ...
You can also find mock tests in the Airflow GitHub project that can provide you some guidance; check this link.
UPDATE 17/08/2022: As commented by Diana, Composer has a documented way to install its components, as mentioned in this link. Be advised to pick the guide for the Composer version your project uses, as there are separate guides for version 1 and version 2.
I can see the following in my pip list, but when I try to add a Snowflake connection via the GUI, Snowflake is not an option in the dropdown.
apache-airflow-providers-snowflake 2.1.0
snowflake-connector-python 2.5.1
snowflake-sqlalchemy 1.2.3
Am I missing something?
I have had this issue with MWAA recently.
I find that if I select AWS in the dropdown and provide the correct Snowflake host name etc., it works, though.
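If the dropdown workaround feels brittle, one alternative is to create the connection record programmatically instead of through the UI. This is just a sketch: the conn_id, host, and credentials are placeholders, and it assumes the code runs somewhere with access to the Airflow metadata DB (for example inside a one-off task):

from airflow.models import Connection
from airflow.settings import Session

# Write the connection straight to the metadata DB, bypassing the UI
# dropdown. All values below are placeholders.
conn = Connection(
    conn_id="snowflake_default",
    conn_type="snowflake",
    host="myaccount.snowflakecomputing.com",
    login="my_user",
    password="my_password",
)
session = Session()
session.add(conn)
session.commit()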
I ran into the same issue using the official Helm chart 1.3.0.
But finally I was able to make the Snowflake connection type visible with the following steps:
I uninstalled apache-airflow-providers-google. I am not sure whether this is important, but I want to mention it here; I did this because I got some warnings.
Because SQLAlchemy 1.4 introduced some breaking changes, I made sure that version 1.3.24 gets installed. Based on that I chose matching versions for the Snowflake packages.
So this is my requirements.txt for my custom Airflow container:
apache-airflow-providers-snowflake==2.3.0
pyarrow==5.0.0
snowflake-connector-python==2.5.1
snowflake-sqlalchemy==1.2.5
SQLAlchemy==1.3.24
This is my Dockerfile:
FROM apache/airflow:2.2.1-python3.8
## adding missing python packages
USER airflow
COPY requirements.txt .
RUN pip uninstall apache-airflow-providers-google -y \
&& pip install -r requirements.txt
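As a quick sanity check after building the image, you can verify that the provider actually imports. A small sketch; the image tag and script path in the comment are placeholders, not part of the chart:

# e.g. docker run --rm my-airflow-image python /opt/airflow/check_snowflake.py
from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

print("Snowflake provider importable:", SnowflakeHook.__name__, SnowflakeOperator.__name__)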
I had the same issue: my pip freeze showed apache-airflow-providers-snowflake, yet I did not have the provider in the UI. I had to add the line apache-airflow-providers-snowflake to my requirements.txt file and then restart. After that I was able to see the Snowflake provider and connector in the UI.
Now that Airflow 2.0 is released, we're excited to try out some of the new features.
What's the best way of upgrading from 1.10.11 to Airflow 2.0?
Will my existing code work or will I be required to change my DAGs?
We'll start upgrading in our DEV environment for testing later this week.
Airflow 1.10.11, LocalExecutor, Python 3.
The documentation lacks information on how exactly to upgrade to 1.10.14 when a newer version is already available.
According to the PIP documentation (https://pip.pypa.io/en/stable/user_guide/#installing-packages) this should work:
python -m pip install apache-airflow==1.10.14
This seemed to work for me, but I was not able to start the webserver afterwards.
First, I had to upgrade the DB:
airflow upgradedb
Second, starting the webserver revealed that the "secret_key" setting now has to contain a real secret key.
Execute
openssl rand -hex 30
and add the generated hex key as the secret_key value in the [webserver] section of airflow.cfg.
Then follow the remaining steps (including executing the check script) from the upgrade documentation.
As it is not described there either: the actual upgrade to 2.0 should then work using
pip install -U apache-airflow
Note especially the change in the DB upgrade command (airflow db upgrade instead of airflow upgradedb).
Regards,
HerrB92
We have documented it at https://airflow.apache.org/docs/apache-airflow/stable/upgrading-to-2.html
Step 1: Upgrade to Python 3
Step 2: Upgrade to Airflow 1.10.14 (a.k.a. the Airflow "bridge" release)
Step 3: Install and run the Airflow Upgrade check scripts (https://pypi.org/project/apache-airflow-upgrade-check/)
Step 4: Import Operators from Backport Providers
Step 5: Upgrade Airflow DAGs
Step 6: Upgrade Configuration settings
Step 7: Upgrade to Airflow 2.0
The upgrade-check package should help you in upgrading.
Read https://airflow.apache.org/docs/apache-airflow/stable/upgrading-to-2.html#step-3-install-and-run-the-upgrade-check-scripts
My OS is macOS. I followed the official Airflow installation guide to install it. But when I run the test airflow test tutorial print_date 2015-06-01 from the Airflow testing docs, it doesn't print any output. The result is here. I wonder whether I installed it successfully. I've run the other commands on the official Airflow testing page; they report no errors.
So far I see only WARNING output, which doesn't mean that Airflow isn't running or was installed improperly. You'd have an easier time testing your install with airflow list_dags, and you probably must run airflow initdb before most of the commands (also take a look at the airflow.cfg file).