Airflow operator import doesn't seem to work

I am new to Airflow and need some direction on this one...
I'm creating my first DAG that uses a subdag, and I'm importing the subdag operator:
`from airflow.operators.subdag import SubDagOperator`
However, I keep getting the following error:
"Broken DAG: [/usr/local/airflow/dags/POC_Main_DAG.py] No module named 'airflow.operators.subdag'"
I also tried importing the dummy operator and got the same error.
On the other hand, the operators below import as expected:
from airflow.operators.bash_operator import BashOperator
from airflow.operators.python_operator import PythonOperator
from airflow.operators.mysql_operator import MySqlOperator
I'd appreciate help resolving this issue.
Thanks in advance!

What version of Airflow are you using?
If you are using Airflow 1.10.x, use the following:
from airflow.operators.subdag_operator import SubDagOperator
from airflow.operators.bash_operator import BashOperator
from airflow.operators.python_operator import PythonOperator
In Airflow >=2.0.0, use the following:
from airflow.operators.subdag import SubDagOperator
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator
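If the same DAG file has to run on both major versions during a migration, a common workaround is to try the Airflow 2 import path and fall back on ImportError; a minimal sketch (illustrative only; normally you just pin the imports matching your installed version):

try:
    from airflow.operators.subdag import SubDagOperator    # Airflow >= 2.0
    from airflow.operators.bash import BashOperator
    from airflow.operators.python import PythonOperator
except ImportError:
    # Fall back to the Airflow 1.10.x module paths.
    from airflow.operators.subdag_operator import SubDagOperator
    from airflow.operators.bash_operator import BashOperator
    from airflow.operators.python_operator import PythonOperator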

I am using version 1.10.4. I changed the code the way you suggested and now it works.
Thanks for the help!

Related

How do I import Airflow operators for version 2.2.5?

I have just upgraded my Airflow to 2.2.5 and I can't use the EmptyOperator. It should be as simple as from airflow.operators.empty import EmptyOperator, but I get the error ModuleNotFoundError: No module named 'airflow.operators.empty'. I also tried:
from airflow.operators import empty
from empty.operators import EmptyOperator
The Airflow repo itself shows a structure that suggests
from airflow.operators.empty import EmptyOperator should work, but it doesn't, so I am really puzzled as to what is going on.
EmptyOperator was released in Airflow 2.3.0.
In Airflow 2.3.0, DummyOperator was deprecated in favor of EmptyOperator (see PR).
For Airflow >= 2.3.0 you should use EmptyOperator:
from airflow.operators.empty import EmptyOperator
For Airflow < 2.3.0 you should use DummyOperator:
from airflow.operators.dummy import DummyOperator
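If shared code has to straddle the 2.3.0 boundary, you can also branch on the installed version; a minimal sketch (the NoOpOperator alias is my own name for illustration, not an Airflow API):

import airflow
from packaging.version import Version

# Pick the no-op operator class that matches the installed Airflow version.
if Version(airflow.__version__) >= Version("2.3.0"):
    from airflow.operators.empty import EmptyOperator as NoOpOperator
else:
    from airflow.operators.dummy import DummyOperator as NoOpOperator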

Airflow 2 - ModuleNotFoundError: No module named 'airflow.operators.sensors'

After upgrading to Airflow 2, I got this error in some DAGs:
ModuleNotFoundError: No module named 'airflow.operators.sensors'
I resolved it by changing the import.
Old one:
from airflow.operators.sensors import BaseSensorOperator
The new one that works:
from airflow.sensors import BaseSensorOperator
Note that this doesn't work on newer versions of Airflow; there, the import that works is:
from airflow.sensors.base import BaseSensorOperator
I was trying to import ExternalTaskSensor and my research led me to this post; it turned out to be this class.
The correct import for me was:
from airflow.sensors.external_task import ExternalTaskSensor
Just FYI in case anyone runs into this in the future.
For Airflow 2.1.1, I first installed the Amazon provider:
pip install apache-airflow-providers-amazon
and then imported S3KeySensor:
from airflow.providers.amazon.aws.sensors.s3_key import S3KeySensor
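For context, a minimal usage sketch of that sensor inside a DAG; the DAG id, bucket name, key, and interval are hypothetical placeholders:

from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.sensors.s3_key import S3KeySensor

with DAG(
    dag_id="s3_sensor_example",          # hypothetical DAG id
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
) as dag:
    wait_for_file = S3KeySensor(
        task_id="wait_for_file",
        bucket_name="my-bucket",         # hypothetical bucket
        bucket_key="incoming/data.csv",  # hypothetical key
        poke_interval=60,                # poll S3 every 60 seconds
    )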

Airflow 2 - ImportError: cannot import name 'BashOperator' from 'airflow.operators'

After upgrading to Airflow 2, I got that error in some DAGs:
ImportError: cannot import name 'BashOperator' from 'airflow.operators'
I ran into the same issue recently. The following worked for me:
from airflow.operators.bash import BashOperator
I resolved it by changing the import.
Old one:
from airflow.operators import BashOperator
The new one that works:
from airflow.operators.bash import BashOperator
As of Airflow 2.2, the import should be:
from airflow.operators.bash import BashOperator
More details can be found in the airflow v2-2-stable code; the following import is deprecated in version 2.2, with a deprecation message in the v2.2 source code:
from airflow.operators import BashOperator
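A quick way to confirm which module actually provides the class in your installed version (a small diagnostic sketch; run it in the environment where Airflow is installed):

from airflow.operators.bash import BashOperator

# On Airflow >= 2.0 this prints 'airflow.operators.bash'.
print(BashOperator.__module__)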

Not able to see the DAG in the Web UI

I have created a new DAG using the following code. It calls a Python script.
Code:
from __future__ import print_function
from builtins import range
import airflow
from airflow.operators.python_operator import PythonOperator
from airflow.models import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.dummy_operator import DummyOperator

args = {
    'owner': 'admin'
}

dag = DAG(
    dag_id='workflow_file_upload',
    default_args=args,
    schedule_interval=None)

t1 = BashOperator(
    task_id='testairflow',
    bash_command='python /root/DataLake_Scripts/File_Upload_GCP.py',
    dag=dag)
I have placed it in the $AIRFLOW_HOME/dags folder.
After that I ran:
airflow scheduler
I am trying to see the DAG in the Web UI, but it is not visible there, and no error is shown.
I've met the same issue.
I figured out that the problem is in the initial SQLite db; I suppose it's a quirk of Airflow 1.10.3.
Anyway, I solved the problem by using the PostgreSQL backend.
You'll see your DAG after running the 'airflow webserver' and 'airflow scheduler' commands.
Also note that you should run 'sudo service postgresql restart' right before the 'airflow initdb' command.
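For reference, switching to the PostgreSQL backend mainly means pointing sql_alchemy_conn at your database in airflow.cfg before running 'airflow initdb'; a minimal sketch with hypothetical credentials and database name:

[core]
# hypothetical user/password/host/database; adjust to your Postgres setup
sql_alchemy_conn = postgresql+psycopg2://airflow:airflow@localhost:5432/airflow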

Difference between DAG import in two ways?

I am trying to create a dynamic DAG but seem to be failing at the minute. I came across creating the DAG object in two different ways:
from airflow.models import DAG https://airflow.apache.org/concepts.html#latest-run-only
from airflow import DAG https://airflow.apache.org/tutorial.html
This really confused me because within the same documentation there are two ways of instantiating the DAG object.
Both import the same DAG class; it's just an artifact of how Python imports work.
When you do from airflow.models import DAG, Python imports the models file and binds the name DAG to the DAG class defined in the models file.
When you do from airflow import DAG, Python imports the name DAG defined in airflow's __init__.py, which is in fact just from airflow.models import DAG.
A minimal version being:
models.py
class DAG():
    pass
__init__.py
from airflow.models import DAG
dags/dag_file.py
# import __init__.py, which imports models.py, which contains DAG
from airflow import DAG
# or this, which just imports models.py, which contains DAG
from airflow.models import DAG
All that being said, if your dynamic DAG is failing, I doubt it's related to this import.
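You can confirm in a Python shell that both paths resolve to the same class object; a quick sketch:

from airflow import DAG as DAG_from_package
from airflow.models import DAG as DAG_from_models

# Both names are bound to the very same class object.
assert DAG_from_package is DAG_from_models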
