Unable to see the dag that I created programmatically - airflow

I am trying to create a dag programmatically, using the instructions here: http://airflow.apache.org/faq.html#how-can-i-create-dags-dynamically
While the code runs and does not throw any errors, I am still unable to find the newly created DAGs.

Related

Airflow dag file is in place but it's not showing up when I do airflow dags list

I placed a DAG file in the dags folder based on a tutorial, with slight modifications, but it doesn't show up in the GUI or when I run airflow dags list.
Answering my own question: check the Python file for exceptions by running it directly. It turns out an exception in the DAG's Python script, caused by a missing import, made the DAG not show up in the list. I note this in case another new user comes across it. The moral of the story is that DAG files should be checked by running them directly with python whenever they are modified, because there won't be an obvious error otherwise; the DAG may simply disappear from the list.

Scheduler not updating package files

I'm developing a DAG on Cloud Composer; my code is separated into a main python file and one package with subfolders, it looks like this:
my_dag1.py
package1/__init__.py
package1/functions.py
package1/package2/__init__.py
package1/package2/more_functions.py
I updated one of the functions in package1/functions.py to take an additional argument (and updated the reference in my_dag1.py). The code ran correctly in my local environment, and I was not getting any errors when running
gcloud beta composer environments run my-airflow-environment list_dags --location europe-west1
But the Web UI raised a python error
TypeError: my_function() got an unexpected keyword argument
'new_argument'
I tried renaming the function, and the error changed to
NameError: name 'my_function' is not defined
I also tried changing the name of the DAG and uploading the files to the dag folder both zipped and unzipped, but nothing worked.
The error disappeared only after I renamed the package folder.
I suspect the issue is related to the scheduler picking up my_dag1.py but not package1/functions.py. The error appeared out of nowhere, as I had made similar updates in previous weeks.
Any idea on how to fix this issue without refactoring the whole code structure?
EDIT-1
Here's the link to related discussion on Google Groups
I've run into a similar issue: the "Broken DAG" error won't dismiss in the web UI. I guess this is a cache bug in the Airflow web server.
Background:
I created a customized operator using Airflow's plugin feature.
After I import the customized operator, the Airflow web UI keeps showing a Broken DAG error saying that it can't find the customized operator.
Why do I think it's a bug in the Airflow web server?
I can manually run the DAG with the command airflow test, so the import must be correct.
Even if I remove the related DAG file from airflow's /dags/ folder, the error is still there.
Here is what I did to resolve the issue:
Restart the airflow webserver service (sometimes this alone resolves the issue).
Make sure no DAG is running, then restart the airflow scheduler service.
Make sure no DAG is running, then restart the airflow worker.
Hopefully this helps someone with the same issue.
Try restarting the webserver with:
gcloud beta composer environments restart-web-server ENVIRONMENT_NAME --location=LOCATION

Modify the task schedule in Airflow

I want to modify the schedule of a task I created in the dags/ folder through the Airflow UI, but I can't find a way to do it there. Can it be done through the UI, or only by modifying the Python script?
The only way to change it is through the code. Since the schedule is part of the DAG definition (like tasks and dependencies), it cannot be changed through the web interface.

Using aotimport on server startup

I'm trying to set up one of my AX environments to have an XPO imported whenever the server process is started up. Because the environment is updated from production on a regular basis, and the code needs to be unique to this (non-production) environment, it seems the best option would be to use the AOTImport command on startup. However, I'm having trouble identifying the exact syntax/setup to get this to work.
Looking through the system code, the syntax seems to be aotimport_[path to file]. Is this correct? The server does not complain about this command, but the file in question does not get imported. I have also tried variations of this command, but none has worked.
I suppose you're trying to execute a command via the SysStartupCmd classes. If so, these methods are fired when the AX client starts, not the AOS. It's documented on this page:
http://msdn.microsoft.com/en-us/library/aa569641(v=ax.50).aspx
If you want to automate this import, it can be done by scheduling an execution of the AX client (ax32.exe) in your build workflow that runs the import (it's recommended to run a full compilation after importing). This is discussed in other questions here on SO.

PLSQL - install works alternatively

I have a SQL script file that I use to install tables, triggers, sequences, and finally a package. The package uses the tables created by the script. The package is specified to run when an event occurs in the application, i.e. it runs when an application trigger is fired.
All the package does is a bulk select-insert into a staging table. That is all it does.
Now, the issue: when I do a clean install the first time, the package is triggered and runs but does not insert data into the staging table. When the next event occurs, however, the package is triggered, data is inserted into the staging table, and it continues to function normally. So initially I thought it could be an initialization error.
However, when I drop all the objects created by the sql script file including the package and rerun the sql script file, the package works just fine when the first event occurs itself and continues to function normally.
So this cannot be an initialization error.
But again (just because I was losing my mind) I dropped everything and reran the script file, and I saw the same behavior as the first time. Then I dropped everything again, reran the script file, and it worked fine the first time.
I have no idea why it works alternatively and this is so weird.
I would guess the code is in an invalid (un-compiled) state: a trigger is created on the table that calls a package which has not yet been created; then the package is created. After the first run, the trigger code is automatically re-compiled by the server. In your creation script, after all the objects are created, run a script to compile invalid objects (ALTER <object> COMPILE).
Verify that this is the case by checking:
SELECT object_type, object_name
FROM all_objects
WHERE status = 'INVALID'
after creation, but before your event triggers.
