Can we integrate ServiceNow with Airflow? I mean, if there is a DAG failure, can we rerun the failed DAG/tasks by submitting a ServiceNow request? Also, can we submit an ad hoc job to run using ServiceNow? Thanks in advance.
Yes, simply use the on_failure_callback parameter of the DAG to call a function that makes a POST request using ServiceNow's REST API (particularly the u_incident table).
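For illustration, here is a minimal sketch of such a callback. The instance URL, credentials, and the u_incident table path are assumptions to adjust for your ServiceNow setup:

```python
import requests

def notify_servicenow(context):
    """on_failure_callback: open a ServiceNow incident for a failed task.

    The instance URL, credentials, and table name below are placeholders.
    """
    ti = context["task_instance"]
    payload = {
        "short_description": f"Airflow failure: {ti.dag_id}.{ti.task_id}",
        "description": f"Run {context['run_id']} failed. Logs: {ti.log_url}",
    }
    # ServiceNow Table API: POST /api/now/table/{table_name}
    requests.post(
        "https://example.service-now.com/api/now/table/u_incident",
        json=payload,
        auth=("api_user", "api_password"),  # assumed basic-auth credentials
        headers={"Accept": "application/json"},
        timeout=30,
    )
```

You can attach it per task via default_args={"on_failure_callback": notify_servicenow} or at the DAG level so any failure triggers it.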
How do I get a list of all unpaused (running) DAGs using the Airflow API?
I tried the GET /dags endpoint but did not find a query string to filter out paused DAGs. Isn't there something like an is_paused query parameter, or perhaps a body parameter?
P.S. I'm currently using Airflow version 2.2.3+.
Currently the Airflow API doesn't support this filter; you have to fetch all the DAGs and filter them locally.
If you really need this filter server-side, you can create an Airflow plugin that exposes a simple API to fetch the unpaused DAGs and return them.
Update: this filter will be available in the Airflow API from 2.6.0 (PR)
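Until then, a minimal sketch of the local filtering, assuming the stable REST API with a basic-auth backend and a webserver at localhost:8080 (both assumptions):

```python
import requests

BASE_URL = "http://localhost:8080/api/v1"  # assumed webserver address
AUTH = ("admin", "admin")                  # assumes the basic-auth API backend

resp = requests.get(f"{BASE_URL}/dags", params={"limit": 100}, auth=AUTH)
resp.raise_for_status()

# each DAG object in the response carries an "is_paused" flag; filter client-side
unpaused = [d["dag_id"] for d in resp.json()["dags"] if not d["is_paused"]]
print(unpaused)
```

If you have more than 100 DAGs, page through the results with the offset parameter.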
Actually there is a plugin made for this. You can fetch the DAGs along with their status. Please explore this plugin; maybe it is what you are looking for.
Airflow API Plugin
Dag Run Endpoints
Or else you can write your own custom Python script/API to fill the DagBag and then filter the list to get the DAGs with the status you want, as in the sketch below.
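A rough sketch of that approach, run somewhere the Airflow metadata database is reachable; reading the paused flag through DagModel is an assumption about how you'd implement it:

```python
from airflow.models import DagBag, DagModel

dagbag = DagBag()  # parses DAG files from the configured dags folder

unpaused = []
for dag_id in dagbag.dags:
    model = DagModel.get_dagmodel(dag_id)  # looks up the metadata database
    if model is not None and not model.is_paused:
        unpaused.append(dag_id)

print(unpaused)
```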
I have an Airflow DAG that is triggered externally via the CLI.
I have a requirement to change the order of execution of tasks based on a Boolean parameter that I would be getting from the CLI.
How do I achieve this?
I understand dag_run.conf can only be used in a template field of an operator.
Thanks in advance.
You cannot change task dependencies with a runtime parameter.
However, you can pass a runtime parameter (via dag_run.conf) and, depending on its value, have tasks be executed or skipped. For that, you need to place operators in your workflow that can handle this logic, for example ShortCircuitOperator or BranchPythonOperator.
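As an illustration, a minimal sketch using BranchPythonOperator; the DAG id, task names, and the fast_path conf key are all made up:

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.dummy import DummyOperator
from airflow.operators.python import BranchPythonOperator

def _choose_branch(**context):
    # "fast_path" is a hypothetical conf key; pass it at trigger time with:
    #   airflow dags trigger branch_on_conf --conf '{"fast_path": true}'
    if context["dag_run"].conf.get("fast_path", False):
        return "light_step"
    return "heavy_step"

with DAG("branch_on_conf", start_date=datetime(2023, 1, 1),
         schedule_interval=None) as dag:
    branch = BranchPythonOperator(task_id="branch", python_callable=_choose_branch)
    heavy = DummyOperator(task_id="heavy_step")
    light = DummyOperator(task_id="light_step")
    # "none_failed" lets the join task run even though one branch was skipped
    join = DummyOperator(task_id="join", trigger_rule="none_failed")
    branch >> [heavy, light] >> join
```

The branch callable returns the task_id to run; every other downstream branch is skipped, which effectively reorders the work without changing the declared dependencies.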
I was looking through the different API endpoints that Airflow offers, but I could not find one that would suit my needs. Essentially, I want to monitor the state of each task within the DAG without having to specify each task I am trying to monitor. Ideally, I would be able to ping the DAG and the response would tell me the state of the tasks and which tasks are running/retrying, etc.
You can use the Airflow REST API, which ships with it: https://airflow.apache.org/docs/apache-airflow/stable/stable-rest-api-ref.html
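In particular, GET /dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances returns every task instance with its state, so you don't have to enumerate tasks yourself. A small sketch, assuming a webserver at localhost:8080 with basic auth and a hypothetical DAG id my_dag:

```python
import requests

BASE = "http://localhost:8080/api/v1"  # assumed webserver address
AUTH = ("admin", "admin")              # assumes the basic-auth API backend
dag_id = "my_dag"                      # hypothetical DAG id

# grab the most recent run of the DAG
runs = requests.get(
    f"{BASE}/dags/{dag_id}/dagRuns",
    params={"order_by": "-execution_date", "limit": 1},
    auth=AUTH,
).json()["dag_runs"]
run_id = runs[0]["dag_run_id"]

# list all task instances in that run along with their states
tis = requests.get(
    f"{BASE}/dags/{dag_id}/dagRuns/{run_id}/taskInstances",
    auth=AUTH,
).json()["task_instances"]
for ti in tis:
    print(ti["task_id"], ti["state"])
```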
I have a task in Airflow that uses the BashOperator and runs continuously.
Is there any way to mark the task instance as success before its next scheduled run? I know how to mark it as success via the API.
Thank you for your replies
If you just want to call the API automatically to mark the previous DAG run as success, you can use SimpleHttpOperator as the first task of your DAG. This operator can call the Airflow REST API to request that the previous DAG run be marked as success.
https://airflow.apache.org/docs/apache-airflow-providers-http/stable/operators.html#simplehttpoperator
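A rough sketch of that first task, assuming an HTTP connection named airflow_api pointing at the webserver's /api/v1 with API-capable credentials, and that the previous run's id is handed in via dag_run.conf (all assumptions):

```python
import json
from airflow.providers.http.operators.http import SimpleHttpOperator

# assumes a connection "airflow_api" whose host is http://localhost:8080/api/v1
mark_previous_success = SimpleHttpOperator(
    task_id="mark_previous_success",
    http_conn_id="airflow_api",
    # PATCH /dags/{dag_id}/dagRuns/{dag_run_id} updates a run's state
    method="PATCH",
    endpoint="dags/my_dag/dagRuns/{{ dag_run.conf.get('previous_run_id', '') }}",
    data=json.dumps({"state": "success"}),
    headers={"Content-Type": "application/json"},
)
```

How you obtain the previous run's dag_run_id is up to you; passing it in conf as above is just one option, and you could equally compute it in a PythonOperator and call the API directly.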
I want to use a console command from this bundle within my controller: http://knpbundles.com/dizda/CloudBackupBundle
The developer proposes cron jobs; however, I want to use the command to back up my database from within my controller.
How would I do that?
I am getting this error message when I simply try to register this command as a service:
You have requested a non-existent service "backupcommandservice".
Thanks for the help!
Commands don't quite work that way. Per the note at http://symfony.com/doc/current/cookbook/console/console_command.html#register-commands-in-the-service-container,
registering a command as a service doesn't do much other than control its location and dependency injection.
If you want to call a command: http://symfony.com/doc/current/components/console/introduction.html#calling-an-existing-command
That being said, you shouldn't call commands from within a controller, since you're basically asking to wait for the command to finish executing before you return a response. You'd be better off sending a request to a queue (for example Beanstalkd) and having a worker perform the job.