Issues installing Airflow locally

I installed Airflow locally because I am testing the SFTP operator in Airflow (2.0.0). When I try running this code:
from airflow.providers.sftp.operators import sftp_operator
from airflow import DAG
import datetime
dag = DAG(
    'test_dag',
    start_date=datetime.datetime(2020, 1, 8, 0, 0, 0),
    schedule_interval='@daily'
)

get_operation = SFTPOperator(
    task_id="operation",
    ssh_conn_id="ssh_default",
    local_filepath="route_to_local_file",
    remote_filepath="remote_route_to_copy",
    operation="get",
    dag=dag
)
get_operation
When I run this Python code, I get this error:
Traceback (most recent call last):
  File "test_dags.py", line 1, in <module>
    from airflow.providers.sftp.operators import sftp_operator
ModuleNotFoundError: No module named 'airflow.providers.sftp'
Can anyone please tell me if I am missing anything in my installation?

Since you don't specify how you installed Airflow, I'm assuming you did something like pip install "apache-airflow>=2.0.0". If you look at the Python dependencies in that environment with pip freeze, you won't see apache-airflow-providers-sftp, because as of version 2 Airflow extracts its functionality into provider packages, the vast majority of which need to be installed manually, e.g. pip install apache-airflow-providers-sftp. Now it should work. Supporting documentation: https://airflow.apache.org/docs/apache-airflow-providers/packages-ref.html#apache-airflow-providers-sftp
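For completeness, a minimal sketch of the working DAG after installing the provider package. Note that in Airflow 2 the operator class lives in airflow.providers.sftp.operators.sftp, so the import in the question needs updating as well:

# pip install apache-airflow-providers-sftp
import datetime

from airflow import DAG
from airflow.providers.sftp.operators.sftp import SFTPOperator

dag = DAG(
    'test_dag',
    start_date=datetime.datetime(2020, 1, 8),
    schedule_interval='@daily',
)

# Download a remote file over SFTP using the "ssh_default" connection.
get_operation = SFTPOperator(
    task_id="operation",
    ssh_conn_id="ssh_default",
    local_filepath="route_to_local_file",
    remote_filepath="remote_route_to_copy",
    operation="get",
    dag=dag,
)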

Related

Failed to execute script due to unhandled exception: No module named ttkwidgets

I created an executable from a .py file that uses ttkwidgets:
import tkinter as tk
import work_logger_database
from ttkwidgets.autocomplete import AutocompleteEntry
root = tk.Tk()
root.geometry('600x300')
items = work_logger_database.show_all()
stored_projects_list = list(set([items[i][2] for i in range(len(items))]))
#or:
#example_list = ['Hello', 'World']
project_entry = AutocompleteEntry(root, completevalues=stored_projects_list)
project_entry.grid(column=0, row=1, sticky='W')
root.mainloop()
and when I open the exe I get the error from the title, ModuleNotFoundError: No module named 'ttkwidgets' (shown as an image in the original post).
I also got a warning:
WARNING: Several hooks defined for module 'numpy'. Please take care they do not conflict.
I'm not sure if that's relevant.
I tried a solution in creating the exe that didn't work:
pyinstaller work_logger.py --onefile -w --hidden-import=ttkwidgets --hidden-import=ttkwidgets.autocomplete
I also tried:
pyinstaller work_logger.py --onefile -w --hidden-import=ttkwidgets
I had the same ModuleNotFoundError: 'ttkwidgets' when creating an executable of one of my scripts.
It worked after (re-)installing ttkwidgets on my system:
pip install ttkwidgets
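If reinstalling alone doesn't help, a quick sanity check (my own suggestion, not part of the original answer) is to confirm that ttkwidgets is importable by the same interpreter you run pyinstaller with, before rebuilding:

# Run this with the same interpreter you invoke pyinstaller with.
import sys
print(sys.executable)       # the interpreter whose site-packages PyInstaller scans

import ttkwidgets
print(ttkwidgets.__file__)  # confirms the package exists in this environment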

Airflow Scheduler fails to execute Windows EXE via WSL

My Windows 10 machine has Airflow 1.10.11 installed within WSL 2 (Ubuntu-20.04).
I have a BashOperator task which calls an .EXE on Windows (via /mnt/c/... or via symlink).
The task fails. Log shows:
[2020-12-16 18:34:11,833] {bash_operator.py:134} INFO - Temporary script location: /tmp/airflowtmp2gz6d79p/download.legacyFilesnihvszli
[2020-12-16 18:34:11,833] {bash_operator.py:146} INFO - Running command: /mnt/c/Windows/py.exe
[2020-12-16 18:34:11,836] {bash_operator.py:153} INFO - Output:
[2020-12-16 18:34:11,840] {bash_operator.py:159} INFO - Command exited with return code 1
[2020-12-16 18:34:11,843] {taskinstance.py:1150} ERROR - Bash command failed
Traceback (most recent call last):
File "/usr/local/lib/python3.8/dist-packages/airflow/models/taskinstance.py", line 984, in _run_raw_task
result = task_copy.execute(context=context)
File "/usr/local/lib/python3.8/dist-packages/airflow/operators/bash_operator.py", line 165, in execute
raise AirflowException("Bash command failed")
airflow.exceptions.AirflowException: Bash command failed
[2020-12-16 18:34:11,844] {taskinstance.py:1187} INFO - Marking task as FAILED. dag_id=test-dag, task_id=download.files, execution_date=20201216T043701, start_date=20201216T073411, end_date=20201216T073411
And that's it. Return code 1 with no further useful info.
Running the very same EXE via bash works perfectly, with no error (I also tried it on my own program which emits something to the console - in bash it emits just fine, but via airflow scheduler it's the same error 1).
Some more data and things I've done to rule out any other issue:
airflow scheduler runs as root. I also confirmed it's running in a root context by putting a whoami command in my BashOperator, which indeed emitted root. (I should also note that all native Linux programs run just fine; only the Windows programs don't.)
The Windows EXE I'm trying to execute and its directory have full 'Everyone' permissions (on my own program of course, wouldn't dare doing it on my Windows folder - that was just an example.)
The failure happens both when accessing via /mnt/c as well as via symlink. In the case of a symlink, the symlink has 777 permissions.
I tried running airflow test on a BashOperator task - it runs perfectly - emits output to the console and returns 0 (success).
Tried with various EXE files - both "native" (e.g. ones that come with Windows) as well as my C#-made programs. Same behavior in all.
Didn't find any similar issue documented in Airflow's GitHub repo nor here in Stack Overflow.
The question is: how is Airflow's use of a Python subprocess (which airflow scheduler uses to run BashOperator tasks) different from a "normal" bash shell, causing the command to exit with code 1?
You can use Python's subprocess and sys libraries together with PowerShell.
In the airflow/dags folder, create two files: Main.py and Caller.py. Main.py calls Caller.py, and Caller.py reaches over to the Windows side to run the files or routines.
This is the process:
Code for Main.py:
# Importing the libraries we are going to use in this example
from airflow import DAG
from datetime import datetime, timedelta
from airflow.operators.bash_operator import BashOperator

# Defining some basic arguments
default_args = {
    'owner': 'your_name_here',
    'depends_on_past': False,
    'start_date': datetime(2019, 1, 1),
    'retries': 0,
}

# Naming the DAG and defining when it will run (you can also use a crontab
# expression if you want the DAG to run, for example, every day at 8 am)
with DAG(
    'Main',
    schedule_interval=timedelta(minutes=1),
    catchup=False,
    default_args=default_args
) as dag:
    # Defining the tasks the DAG will perform, in this case the execution
    # of a Python program, called via a bash command
    t1 = BashOperator(
        task_id='caller',
        bash_command="""
        cd /home/[Your_Users_Name]/airflow/dags/
        python3 Caller.py
        """)
    # copy t1, paste, rename t1 to t2, and call file.py

    # Defining the execution order
    t1
    # comment: t1 executes and calls t2
    # t1 >> t2
Code for Caller.py:
import subprocess, sys

p = subprocess.Popen(
    ["powershell.exe",
     "cd C:\\Users\\[Your_Users_Name]\\Desktop; python file.py"]   # .py file
    # ,"cd C:\\Users\\[Your_Users_Name]\\Desktop; .\file.html"]    # .html file
    # ,"cd C:\\Users\\[Your_Users_Name]\\Desktop; .\file.bat"]     # .bat file
    # ,"cd C:\\Users\\[Your_Users_Name]\\Desktop; .\file.exe"]     # .exe file
    , stdout=sys.stdout
)
p.communicate()
How do you know whether your code will work in Airflow? If it runs this way, it's OK.
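As a variant (a sketch of my own, not from the original answer; the Desktop path is a placeholder as above), Caller.py can also propagate the Windows exit code, so the Airflow task actually fails when the PowerShell command fails:

import subprocess
import sys

# Run the Windows-side command via PowerShell (placeholder path).
result = subprocess.run(
    ["powershell.exe", "cd C:\\Users\\[Your_Users_Name]\\Desktop; .\\file.exe"],
    capture_output=True,
    text=True,
)
print(result.stdout)
print(result.stderr, file=sys.stderr)

# Exit with the child's return code so the BashOperator marks the task
# failed whenever the Windows program fails.
sys.exit(result.returncode)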

Dag Seems to be missing

I have a DAG which checks at a regular interval for new workflows to be generated (dynamic DAGs) and, if found, creates them. (Ref: Dynamic dags not getting added by scheduler)
The above DAG is working and the dynamic DAGs are getting created and listed in the webserver. Two issues here:
When clicking on the DAG in the web UI, it says "DAG seems to be missing".
The DAGs listed there do not appear in the output of the "airflow list_dags" command.
Error:
DAG "app01_user" seems to be missing.
The same goes for all other dynamically generated DAGs. I have compiled the Python script and found no errors.
Edit 1:
I tried clearing all data and running "airflow run". It ran successfully, but no dynamically generated DAGs were added to "airflow list_dags". However, when running "airflow list_dags", it loaded and executed the base DAG (which generates the dynamic DAGs), and the dynamic DAGs were then listed as below:
[root@cmnode dags]# airflow list_dags
sh: warning: setlocale: LC_ALL: cannot change locale (en_US.UTF-8\nLANG=en_US.UTF-8)
sh: warning: setlocale: LC_ALL: cannot change locale (en_US.UTF-8\nLANG=en_US.UTF-8)
[2019-08-13 00:34:31,692] {settings.py:182} INFO - settings.configure_orm(): Using pool settings. pool_size=15, pool_recycle=1800, pid=25386
[2019-08-13 00:34:31,877] {__init__.py:51} INFO - Using executor LocalExecutor
[2019-08-13 00:34:32,113] {__init__.py:305} INFO - Filling up the DagBag from /root/airflow/dags
/usr/lib/python2.7/site-packages/airflow/operators/bash_operator.py:70: PendingDeprecationWarning: Invalid arguments were passed to BashOperator (task_id: tst_dyn_dag). Support for passing such arguments will be dropped in Airflow 2.0. Invalid arguments were:
*args: ()
**kwargs: {'provide_context': True}
super(BashOperator, self).__init__(*args, **kwargs)
-------------------------------------------------------------------
DAGS
-------------------------------------------------------------------
app01_user
app02_user
app03_user
app04_user
testDynDags
Upon running it again, all four of the generated DAGs above disappeared, and only the base DAG, "testDynDags", was displayed.
When I was getting this error, there was an exception showing up in the webserver logs. Once I resolved that error and restarted the webserver, it went through normally.
From what I can see, this is the error thrown when the webserver tries to parse the DAG file and hits an error. In my case it was an error importing a new operator I had added to a plugin.
Usually I check the Airflow UI, since the reason for a broken DAG sometimes appears there. If it doesn't, I run the DAG's .py file directly, and the error (the reason the DAG can't be parsed) will appear.
I never got to work on dynamic DAG generation, but I did face this issue when the DAG was not present on all nodes (scheduler, worker, and webserver). If you have an Airflow cluster, please make sure the DAG is present on all Airflow nodes.
Same error; the reason was that I renamed my dag_id to uppercase, something like "import_myclientname" into "import_MYCLIENTNAME".
I am a little late to the party, but I faced this error today:
In short: try executing airflow dags report and/or airflow dags reserialize
Check out my comment here:
https://stackoverflow.com/a/73880927/4437153
I found that airflow fails to recognize a dag defined in a file that does not have from airflow import DAG in it, even if DAG is not explicitly used in that file.
For example, suppose you have two files, a.py and b.py:
# a.py
from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator
def makedag(dag_id="a"):
    with DAG(dag_id=dag_id) as dag:
        DummyOperator(task_id="nada")
    return dag  # return the DAG so importers (like b.py) can bind it

dag = makedag()
and
# b.py
from a import makedag
dag = makedag(dag_id="b")
Then airflow will only look at a.py. It won't even look at b.py at all, even to notice if there's a syntax error in it! But if you add from airflow import DAG to it and don't change anything else, it will show up.
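A plausible explanation (my inference from Airflow's documented DAG discovery behavior, not something the answer above states): by default the DagBag is filled in "safe mode", which only parses files whose text contains both the strings "airflow" and "dag". Adding the unused import makes b.py pass that filter. A sketch of the two ways around it, assuming safe mode is indeed the cause:

# b.py
# Option 1: keep an (even unused) import so the file contains "airflow" and "DAG".
from airflow import DAG  # noqa: F401  -- marker for DAG discovery
from a import makedag

dag = makedag(dag_id="b")

# Option 2 (instead of the import): disable the heuristic globally, e.g. by
# setting AIRFLOW__CORE__DAG_DISCOVERY_SAFE_MODE=False in the scheduler's
# environment, at the cost of parsing every file in the dags folder.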

Darkflow without GPU on Jupyter-Notebook - Simple Code Required

I am unable to set up and run a simple darkflow program. In fact, I can't even import the darkflow library:
from darkflow.net.build import TFNet
==> ModuleNotFoundError: No module named 'darkflow'
My target is to run the following program:
from darkflow.net.build import TFNet
import cv2
options = {"model": "cfg/yolo.cfg", "load": "bin/yolo.weights", "threshold": 0.1}
tfnet = TFNet(options)
imgcv = cv2.imread("./test/dog.jpg")
result = tfnet.return_predict(imgcv)
print(result)
Please suggest steps so that I can configure darkflow on Jupyter Notebook (with no GPU) and run the above code.
Fixed by creating an .ipynb file in the darkflow directory (downloaded from GitHub) and executing the following from the notebook:
!python3 setup.py build_ext --inplace
!pip install -e .
!pip install .
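One detail worth making explicit (my assumption, since the answer creates the notebook inside the darkflow directory): the import only resolves if the darkflow repo root is on Python's import path. If your notebook lives elsewhere, something like this should work:

import sys

# Hypothetical path; point this at wherever you cloned and built darkflow.
sys.path.insert(0, "/path/to/darkflow")

from darkflow.net.build import TFNet  # should now import without error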

How to resolve: ImportError: cannot import name 'HttpNtlmAuth' in python3 script?

I have installed both requests and requests_ntlm modules using "sudo python3 -m pip install requests" (and requests_ntlm respectively) and both installs were successful.
When I then attempt to do "from requests import HttpNtlmAuth", I get an error stating "cannot import name 'HttpNtlmAuth'". I do not get this error on my "import requests" line.
When I do a "sudo python3 -m pip list", I see both are installed and are the latest versions.
I've not encountered this error before, only "cannot import module", so I'm unfamiliar with how to resolve this.
EDIT 1: Additional information. When I run this script from command line as "sudo", it works. Because I am running my python script from within a PHP file using "exec", I don't particularly want to run this as a root user. Is there a way around this, or possibly running the exec statement with sudo?
The HttpNtlmAuth class is in the requests_ntlm package, not in requests itself, so you'll need to have:
import requests
from requests_ntlm import HttpNtlmAuth
Then you'll be able to instantiate your authentication:
session = requests.Session()
session.auth = HttpNtlmAuth('domain\\username','password')
session.get(url)
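Regarding Edit 1 (the script only working under sudo): that usually means the packages were installed into an environment that only root's interpreter sees (a hedged diagnosis, not something confirmed in the question). A quick way to compare the two environments is to run the snippet below both as the user your PHP exec call runs as and via sudo:

import sys

print(sys.executable)  # which interpreter is actually running
print(sys.path)        # where it looks for packages

try:
    import requests_ntlm
    print("requests_ntlm found at", requests_ntlm.__file__)
except ImportError as exc:
    print("requests_ntlm not importable:", exc)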
