Airflow does not run DAGs

Context: I successfully installed Airflow on EC2, changed things like executor to LocalExecutor; sql_alchemy_conn to postgresql+psycopg2://postgres@localhost:5432/airflow; max_threads to 10.
My problem: when I create a DAG which I indicate should run every day, everything is fine, but when I create a DAG that should run, say, at 10am on Monday and Wednesday, Airflow does not run it. Does anybody know what I could be doing wrong and what I should do to fix this issue?
DAG for the script which runs fine:
import airflow
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import timedelta

args = {
    'owner': 'arseniyy123',
    'start_date': airflow.utils.dates.days_ago(1),
    'depends_on_past': False,
    'email': ['exam@exam.com'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=1),
}

dag = DAG(
    'daily_script',
    default_args=args,
    description='daily_script',
    schedule_interval="0 10 * * *",
)

t1 = BashOperator(
    task_id='daily',
    bash_command='cd /root/ && python3 DAILY_WORK.py',
    dag=dag)

t1
DAG for the script which should run on Monday and Wednesday, but does not run at all:
import airflow
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import timedelta

args = {
    'owner': 'arseniyy123',
    'start_date': airflow.utils.dates.days_ago(1),
    'depends_on_past': False,
    'email': ['exam@exam.com'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=1),
}

dag = DAG(
    'monday_wednesday',
    default_args=args,
    description='monday_wednesday',
    schedule_interval="0 10 * * 1,3",
)

t1 = BashOperator(
    task_id='monday_wednesday',
    bash_command='cd /root/ && python3 not_daily_work.py',
    dag=dag)

t1
I also have some problems with the scheduler: it tends to die after running for more than 10 hours. Does anybody know why that happens?
Thank you in advance!

Can you try changing the start_date to a static datetime, e.g. datetime(2020, 3, 20), instead of using airflow.utils.dates.days_ago(1)?
Maybe read through the scheduling examples here to understand why your code didn't work. From that documentation:
Let's Repeat That: The scheduler runs your job one schedule_interval AFTER the start date, at the END of the period.
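For illustration, here is a minimal sketch of the Monday/Wednesday DAG with a static start_date applied (the date itself is just an example; email-related arguments are omitted for brevity, everything else is kept from the original file):

from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.bash_operator import BashOperator

args = {
    'owner': 'arseniyy123',
    'start_date': datetime(2020, 3, 20),  # static date instead of days_ago(1)
    'retries': 1,
    'retry_delay': timedelta(minutes=1),
}

dag = DAG(
    'monday_wednesday',
    default_args=args,
    description='monday_wednesday',
    schedule_interval="0 10 * * 1,3",  # 10:00 on Monday and Wednesday
)

t1 = BashOperator(
    task_id='monday_wednesday',
    bash_command='cd /root/ && python3 not_daily_work.py',
    dag=dag)

With this setup, the run for a given Monday or Wednesday is created at the end of that schedule period, i.e. one interval after the static start date.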

Related

Airflow Task triggered manually but remains in queued state

I am using Airflow 2.3.1 and running with the LocalExecutor against MS SQL Server as the metadata db.
I am trying to execute a DAG manually; it shows as queued but nothing happens. There are no other tasks running when this DAG is triggered. When I hover over the task, it says "Not yet started".
I tried restarting the scheduler and webserver, but nothing changed. The code of the DAG is as follows:
from datetime import datetime, timedelta
import pendulum
from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.utils.dates import days_ago

default_args = {
    'owner': 'airflow',
    'start_date': datetime(2022, 5, 27),
    'email': False,
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
}

MIDT_dag = DAG(
    'Dag_1',
    default_args=default_args,
    catchup=False,
    description='Test DAG',
    schedule_interval=timedelta(days=1),
)

task_1 = BashOperator(
    task_id='first_task',
    bash_command=r"/srv/python3_8_13/venv/bin/python /srv/source_code/InputToRawMIDT_Amadeus_Spark_Linux.py",
    dag=MIDT_dag,
)

task_2 = BashOperator(
    task_id='second_task',
    bash_command='echo Testing',
    dag=MIDT_dag,
)

task_1 >> task_2
Appreciate any help.
Thanks
Manoj George
It seems like your DAG is disabled (paused).
Open the UI, choose DAGs in the menu, and enable it.
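If you prefer the command line, unpausing the DAG has the same effect (assuming Airflow 2.x, which matches the 2.3.1 mentioned above, and the DAG id Dag_1 from the code):

airflow dags unpause Dag_1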

Airflow tasks not getting to the running state

I am trying to run a simple BashOperator task in Airflow. When triggered manually, the DAG lists the tasks in the Tree and Graph views, but the tasks are always in the "not started" state.
I have restarted my Airflow scheduler. I am running Airflow on localhost using a kubectl image on Docker Compose.
from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator
from airflow.operators.bash_operator import BashOperator
from datetime import datetime, timedelta

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'email': ['vijayraghunath21@gmail.com'],
    'email_on_success': True,
    'email_on_failure': True,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=2),
}

with DAG(
    dag_id='bash_demo',
    default_args=default_args,
    description='Bash Demo',
    start_date=datetime(2021, 1, 1),
    # schedule_interval='0 2 * * *',
    schedule_interval=None,
    max_active_runs=1,
    catchup=False,
    tags=['bash_demo'],
) as dag:
    dag.doc_md = __doc__
    # Task 1
    dummy_task = DummyOperator(task_id='dummy_task')
    # Task 2
    bash_task = BashOperator(
        task_id='bash_task',
        bash_command="echo 'command executed from BashOperator'")
    dummy_task >> bash_task
DAG Image
As shown in the image you added, the DAG is set to off, so it's not running. You should click the toggle button to set it to on.
This issue can be avoided in two ways:
Global solution - if you set dags_are_paused_at_creation = False in airflow.cfg - this will affect all DAGs in the system.
Local solution - if you use is_paused_upon_creation in the DAG constructor:
with DAG(
    dag_id='bash_demo',
    ...
    is_paused_upon_creation=False,
) as dag:
This parameter specifies whether the DAG is paused when it is created for the first time. If the DAG already exists, the parameter is ignored.
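For the global solution, the setting lives in the [core] section of airflow.cfg (sketch only; the shipped default is True):

[core]
dags_are_paused_at_creation = False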

Apache-Airflow - Task is in the none state when running DAG

I just started with Airflow and wanted to run a simple DAG with a BashOperator that outputs 'Hello' to the console.
I noticed that the status is indefinitely stuck in 'Running'.
When I go to the task details, I get this:
Task is in the 'None' state which is not a valid state for execution. The task must be cleared in order to be run.
Any suggestions or hints are much appreciated.
Dag:
from datetime import timedelta
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.utils.dates import days_ago

default_args = {
    'owner': 'dude_whose_doors_open_like_this_-W-',
    'depends_on_past': False,
    'start_date': days_ago(2),
    'email': ['yessure@gmail.com'],
    'email_on_failure': True,
    'email_on_retry': True,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
}

dag = DAG(
    'Test',
    default_args=default_args,
    description='Test',
    schedule_interval=timedelta(days=1)
)

t1 = BashOperator(
    task_id='ECHO',
    bash_command='echo "Hello"',
    dag=dag
)

t1
I've managed to solve it by adding 'start_date': dt(1970, 1, 1) to the default args object and adding schedule_interval=None to my DAG object.
Could you remove the last line, the bare t1 - it isn't necessary. Also, start_date shouldn't be set dynamically - this can lead to problems with the scheduling.
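Putting both suggestions together, a minimal sketch of the adjusted DAG could look like this (email-related arguments omitted for brevity; the static date is only an example):

from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.bash_operator import BashOperator

default_args = {
    'owner': 'dude_whose_doors_open_like_this_-W-',
    'depends_on_past': False,
    'start_date': datetime(2021, 1, 1),  # static instead of days_ago(2)
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
}

dag = DAG(
    'Test',
    default_args=default_args,
    description='Test',
    schedule_interval=timedelta(days=1),
)

t1 = BashOperator(
    task_id='ECHO',
    bash_command='echo "Hello"',
    dag=dag,
)
# no trailing bare t1 line needed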

Airflow not picking up "start_date" from DAG

I'm registering a new DAG like so:
from airflow import DAG
from airflow.operators.python_operator import PythonOperator
from airflow.operators.bash_operator import BashOperator
from airflow.utils.dates import days_ago
from airflow.hooks.base_hook import BaseHook
from datetime import datetime, timedelta, timezone
import pendulum

local_tz = pendulum.timezone("UTC")

default_args = {
    'owner': 'me',
    'depends_on_past': False,
    'start_date': datetime(2020, 6, 19, 9, 37, 35, tzinfo=local_tz),
    'email': ["blah@blah.com"],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=15)
}

dag = DAG(
    dag_id="some_id",
    default_args=default_args,
    description="Some description",
    schedule_interval="@once"
)

def to_be_executed_py():
    print("I did it, ma!")

with dag:
    t1 = PythonOperator(
        task_id="some_id",
        python_callable=to_be_executed_py)
I want this to run once, and only once, at the time given in start_date. After uploading the DAG (using S3), I'm not seeing the start_date in the details. Instead, I see this in the details (under default_args):
{'owner': 'me',
 'depends_on_past': False,
 'start_date': datetime.datetime(2020, 6, 19, 9, 37, 35, tzinfo=<TimezoneInfo [UTC, GMT, +00:00:00, STD]>),
 'email': ['bleh@blah.com'],
 'email_on_failure': False,
 'email_on_retry': False,
 'retries': 1,
 'retry_delay': datetime.timedelta(0, 900)}
Am I doing something wrong here? Am I right to assume that this should execute at the given start_date? I've looked all over for similar use cases, but not many people set their start_date to include a time.
UPDATE
Currently, the DAG runs immediately upon being unpaused. It is definitely not picking up the start time. None of the resources I've found online have an answer that works here.
Figured out the issue. It was twofold. First, a translator we were using was on a 12-hour clock. As this was in the evening, it was setting the time in the past (causing Airflow to play catch-up).
Secondly, we didn't need the timezone. Also, we weren't setting the dag in the task. So the code should read as follows:
from airflow import DAG
from airflow.operators.python_operator import PythonOperator
from airflow.operators.bash_operator import BashOperator
from airflow.utils.dates import days_ago
from airflow.hooks.base_hook import BaseHook
from datetime import datetime, timedelta, timezone

default_args = {
    'owner': 'me',
    'depends_on_past': False,
    'start_date': datetime(2020, 6, 19, 21, 37, 35),
    'email': ["blah@blah.com"],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=15)
}

dag = DAG(
    dag_id="some_id",
    default_args=default_args,
    description="Some description",
    schedule_interval="@once"
)

def to_be_executed_py(ds, **kwargs):
    print("I did it, ma!")

with dag:
    t1 = PythonOperator(
        dag=dag,
        provide_context=True,
        task_id="some_id",
        python_callable=to_be_executed_py)
With those changes, everything runs at the given time, once and only once.
With a combination of schedule_interval='@daily', ShortCircuitOperator, and Airflow variables you can do a workaround: the DAG runs every day and checks whether today is in the list of days that you entered as an Airflow variable. If so, it proceeds and runs the downstream tasks; if not, it skips the downstream tasks and waits until the next execution tomorrow.
Here is the DAG definition:
import airflow.utils.helpers
from airflow.models import DAG, Variable
from airflow.operators.dummy_operator import DummyOperator
from airflow.operators.python_operator import ShortCircuitOperator

args = {
    'owner': 'airflow',
    'start_date': airflow.utils.dates.days_ago(2)
}

dag = DAG(
    dag_id='run_on_release_day',
    default_args=args,
    schedule_interval='@daily'
)

def check_release_date(**context):
    release_dates = Variable.get('release_dates')
    print(context, release_dates)
    return context['ds'] in release_dates

cond = ShortCircuitOperator(
    task_id='condition',
    python_callable=check_release_date,
    dag=dag,
    provide_context=True,
)

tasks = [DummyOperator(task_id='task_' + str(i), dag=dag) for i in [1, 2]]
airflow.utils.helpers.chain(cond, *tasks)
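The release_dates variable has to exist before the first run; one way to create it is via the CLI (the dates below are only an example; Airflow 1.x uses airflow variables --set, Airflow 2.x uses airflow variables set):

airflow variables set release_dates "2020-06-22,2020-06-24"

Since check_release_date does a substring check against context['ds'], a plain comma-separated string of ISO dates is enough.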

PythonOperator with python_callable set gets executed constantly

import airflow
from airflow import DAG
from airflow.operators.python_operator import PythonOperator
from datetime import datetime, timedelta
from workflow.task import some_task

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'email': ['jimin.park1@aig.com'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 0,
    'retry_delay': timedelta(minutes=1),
    'start_date': airflow.utils.dates.days_ago(0)
    # 'queue': 'bash_queue',
    # 'pool': 'backfill',
    # 'priority_weight': 10,
    # 'end_date': datetime(2016, 1, 1),
}

dag = DAG('JiminTest', default_args=default_args, schedule_interval='*/1 * * * *', catchup=False)

t1 = PythonOperator(
    task_id='Task1',
    provide_context=True,
    python_callable=some_task,
    dag=dag
)
The actual some_task simply appends a timestamp to a file. As you can see in the DAG file above, the task is configured to run every minute.
def some_task(ds, **kwargs):
    current_time = datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    with open("test.txt", "a") as myfile:
        myfile.write(current_time + '\n')
I simply tail -f the output file and started the webserver without the scheduler running. The function was being called and lines were being appended to the file as soon as the webserver started up. When I start the scheduler, the file gets appended to on each execution loop.
What I want is for the function to be executed every minute as intended, not on every execution loop.
The scheduler parses each DAG file on every scheduler loop, which executes all top-level code, including the import statements.
Is there any code at the top level of the file you are importing the function from that runs when the module is imported?
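A hypothetical sketch of workflow/task.py to illustrate the difference (append_timestamp is an invented helper for this example): anything at module level runs every time the file is imported or parsed, while the body of the callable runs only when the task instance executes.

import datetime

def append_timestamp(path="test.txt"):
    # invented helper: append the current time to a file
    with open(path, "a") as f:
        f.write(datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S") + "\n")

# BAD: a call like this at module level would run on every scheduler parse
# (and whenever the webserver imports the file), not once per minute.
# append_timestamp()

def some_task(ds, **kwargs):
    # GOOD: this runs only when the Task1 instance is actually executed
    append_timestamp()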
Try checking the scheduler_heartbeat_sec config parameter in your config file. For your case it should be smaller than 60 seconds.
If you want the scheduler not to catch up on previous runs, set catchup_by_default to False (I am not sure if this is relevant to your question though).
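For reference, that parameter lives in the [scheduler] section of airflow.cfg; a sketch with a value below the one-minute schedule:

[scheduler]
scheduler_heartbeat_sec = 5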
Please indicate which Apache Airflow version you are using.
