Only works with the CeleryExecutor - airflow

I am new to Airflow, and when I click Run with 'Ignore All Deps' on the Task Instance context menu, like this:
[screenshot: Task Instance context menu]
it fails with 'Only works with the CeleryExecutor'.
I tried refreshing the web UI, but that didn't help.
(I use the LocalExecutor and don't want to switch to the CeleryExecutor.)
Why does this happen, and how can I run a single task ignoring all dependencies from the web UI while using the LocalExecutor?

I had a similar problem. The issue was the following:
with the LocalExecutor you cannot run a single task from the UI; you can only run the whole DAG at once (source code).
My DAG was already in the 'success' state.
A possible solution is to change the DAG run's state back to 'running'.
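As a minimal sketch of that solution (assuming Airflow 1.10.x module paths; "my_dag" and the execution date are placeholders), flipping the DagRun back to 'running' through the metadata DB makes the scheduler reconsider its tasks; clearing a task instance has the same side effect:

    # Hedged sketch (Airflow 1.10.x): set a finished DagRun back to
    # "running" in the metadata DB so the scheduler picks up its tasks.
    # "my_dag" and the execution date below are placeholders.
    from datetime import datetime

    from airflow import settings
    from airflow.models import DagRun
    from airflow.utils.state import State

    session = settings.Session()
    dag_run = (
        session.query(DagRun)
        .filter(DagRun.dag_id == "my_dag",
                DagRun.execution_date == datetime(2019, 1, 1))
        .one()
    )
    dag_run.state = State.RUNNING
    session.commit()
    session.close()

In Airflow 1.10's UI the same state change should also be possible under Browse → DAG Runs.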

I worked around this issue by selecting the first task in my DAG and marking all downstream tasks as success.
I would then clear the task I actually wanted to run, and the scheduler would pick it up and run that task for me.

Related

Airflow: Why do DAG tasks run outdated DAG code?

I am running Airflow (1.10.9) through Cloud Composer (1.11.1) on GCP.
Whenever I update a DAG's code, I can see the updated code in the Airflow UI, but for at least 10 minutes the DAG's tasks still run the old code.
A couple of questions:
Why does this delay occur, and can it be decreased?
How can I tell when a task's code has been updated, to make sure nobody runs old code?
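For context, the delay is at least partly governed by how often the scheduler re-parses DAG files (Cloud Composer adds its own lag while syncing the dags folder from GCS). A small sketch, assuming Airflow 1.10.x option names, to inspect the relevant settings:

    # Hedged sketch (Airflow 1.10.x option names): the [scheduler] settings
    # below bound how quickly code changes are picked up.
    from airflow.configuration import conf

    # Seconds between re-parses of an individual DAG file.
    print(conf.getint("scheduler", "min_file_process_interval"))
    # Seconds between re-listings of the dags folder for new files.
    print(conf.getint("scheduler", "dag_dir_list_interval"))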

Airflow: How to take out a task that is now not included in DAG?

I have this problem: I changed the DAG workflow to replace one task with another, but the replaced task still shows up even though it is no longer part of the workflow (please see image). My question is: how do I take out that task?
Any help is much appreciated. Thanks!
I think your best bet would be to turn off the DAG, restart the scheduler (or just start it with airflow scheduler) and wait. Changes to DAGs are usually only picked up after the scheduler has been running for a while.
It can also happen that, while refreshing either the graph view or the tree view of the DAG, you see the task "randomly" appear and disappear until it finally stabilizes at the latest version.
Once a few scheduler cycles have passed and refreshing shows only the new configuration, you can safely turn the DAG back on.
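To confirm what the current code actually defines, as opposed to what the UI still shows, a small sketch (with "my_dag" as a placeholder DAG id):

    # Hedged sketch: parse the dags folder directly and list the tasks the
    # current code defines; the removed task should not appear here.
    from airflow.models import DagBag

    bag = DagBag()  # parses the configured dags_folder
    print(bag.get_dag("my_dag").task_ids)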

Is there anyway to run python script asynchronously in Airflow?

I want to run a Python script in Airflow. To achieve this I am triggering the script with the BashOperator, like below:

    sub_dag = BashOperator(
        task_id='execute_dependent_dag',
        bash_command='python myscript.py',
        dag=dag,
        trigger_rule="all_success")
However, I want it to be triggered asynchronously; currently the task waits for the script to finish. I tried using & as well as nohup to detach it, but that didn't work.
Let me know if there is another way to run it asynchronously.
Thanks in advance.
I believe extending BashOperator to remove the wait() call would make that happen, with the downside that errors would go silently undetected.
Alternatively, if the Python script/code in question can be imported into your Airflow project, you could do the same with the PythonOperator through multiprocessing (or a variety of other ways), as sketched below.
Also, if you want to get your hands dirty, have a look at this.
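A minimal fire-and-forget sketch of the PythonOperator route (assuming Airflow 1.10 import paths and that myscript.py is resolvable from the worker's working directory; as noted above, the child's exit status is never checked, so failures in the script go unnoticed by Airflow):

    # Hedged sketch: launch the script with subprocess.Popen and return
    # without waiting, so the task finishes immediately.
    import subprocess

    from airflow.operators.python_operator import PythonOperator

    def launch_script():
        # start_new_session detaches the child from the task's process
        # group, so it is not killed when the task process exits.
        subprocess.Popen(["python", "myscript.py"], start_new_session=True)

    sub_dag = PythonOperator(
        task_id='execute_dependent_dag',
        python_callable=launch_script,
        dag=dag,
        trigger_rule="all_success")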

Airflow UI pause toggle not showing for tutorial

I am following the tutorial in the Airflow docs. When I visit the UI, I don't see the toggle to turn the DAGs on and off (or pause them).
I tried clicking the Trigger DAG button on the right, but I guess that just runs it once manually, ignoring the scheduler. (A side question: it says it's running now, but it isn't finishing... is it waiting for something?)
So, did I have to do something to schedule the DAG first, and is that why I'm not seeing a pause toggle, because it isn't scheduled? That would surprise me, because surely I should be able to schedule it from the UI?
Lastly, what are all those other example DAGs, and how can I hide them?
It seems to me that some part of your Airflow setup is broken:
either the scheduler is not working, or the DAG files are not deployed.
My suggestion is to check this question as well: Airflow 1.9.0 is queuing but not launching tasks.
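To check the "files are not deployed" half quickly, a sketch that loads the DagBag the same way the webserver and scheduler do:

    # Hedged sanity check: your tutorial DAG id should be listed, and
    # import_errors should be empty.
    from airflow.models import DagBag

    bag = DagBag()             # parses the configured dags_folder
    print(list(bag.dags))      # DAG ids that parsed successfully
    print(bag.import_errors)   # file -> traceback for files that failed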

How to execute test using tasks in nightwatch?

I'm trying to execute a group of tests. The Nightwatch documentation describes groups and tags, and I'm using tags to run specific tests. The problem is that Nightwatch recognizes the test but does not execute it.
Nightwatch returns:

    Starting selenium server in parallel mode... started - PID: 1404
    If you want to upload Screenshots to S3
    please set your AWS Environment Variables (see readme).
    Started child process for: adquirentes\antecipacao
    >> adquirentes\antecipacao finished.
As you can see, the test was started but not executed. How can I fix it?
