airflow.decorators not being recognized - airflow

I am a new user of Airflow. I successfully set up Airflow locally.
But when I try to import modules such as airflow.decorators or airflow.operators.pythonOperators, I get an "Import could not be resolved" error.
I am receiving the same error on my localhost webserver.
Please help me out, as I am entirely new to apache-airflow.
Edit:
My current version of airflow is already 2.3.4

Airflow decorators were added in version 2.2.0; you just need to upgrade your Airflow version:
AIRFLOW_VERSION=2.3.4 # the version you want to use, it should be >= 2.2.0
PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
pip install "apache-airflow==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"

Related

How do you access Airflow Web Interface?

Hi, I am taking a DataCamp class on how to use Airflow, and it shows how to create DAGs once you have access to an Airflow web interface.
Is there an easy way to create an account in the Airflow web interface? I am very lost on how to do this. Or is this just an enterprise tool where they provide you access once you pay?
You must do this in the terminal. Run these commands:
export AIRFLOW_HOME=~/airflow
AIRFLOW_VERSION=2.2.5
PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
pip install "apache-airflow==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
airflow standalone
The output will show you the username and password that were generated.
Then open your browser and go to:
localhost:8080
and enter that username and password.
Airflow has a web interface by default as well, and the default username/password is airflow/airflow.
You can run it using:
airflow webserver --port 8080
Then open the link: http://localhost:8080
If you want to create a new user, use this command (in Airflow 2.x the equivalent is airflow users create):
airflow create_user [-h] [-r ROLE] [-u USERNAME] [-e EMAIL] [-f FIRSTNAME]
[-l LASTNAME] [-p PASSWORD] [--use_random_password]
Learn more about Running Airflow locally.
You should install it; it is a Python package, not a website to register on.
The easiest way to install Airflow is:
pip install apache-airflow
If you need extra packages with it:
pip install apache-airflow[postgres,gcp]
Finally, run the webserver and the scheduler in separate terminals:
airflow webserver # runs on port 8080 by default
airflow scheduler

Get Access to Airflow Hooks within a jupyter notebook

I have Airflow running with the Postgres backend - all fine. Additionally, I have a Jupyter server running on the same host as Airflow. I thought I could simply access the Airflow hooks from a notebook:
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from airflow.hooks.mysql_hook import MySqlHook
mysql = MySqlHook(mysql_conn_id = 'mysql-x')
sql = "select 1+1"
mysql.get_pandas_df(sql)
But I get this exception message:
OperationalError: (sqlite3.OperationalError) no such table: connection
[SQL: SELECT connection.password AS connection_password, connection.extra AS connection_extra, connection.id AS connection_id, connection.conn_id AS connection_conn_id, connection.conn_type AS connection_conn_type, connection.host AS connection_host, connection.schema AS connection_schema, connection.login AS connection_login, connection.port AS connection_port, connection.is_encrypted AS connection_is_encrypted, connection.is_extra_encrypted AS connection_is_extra_encrypted
FROM connection
WHERE connection.conn_id = ?]
[parameters: ('mysql-x',)]
(Background on this error at: http://sqlalche.me/e/e3q8)
What makes me suspicious is that it not only fails to find the connection_id (which I can clearly see in the Airflow UI), it also says sqlite3.OperationalError - it very much looks like it is not even connected to the same Postgres database. I have checked os.environ["AIRFLOW_HOME"], which seems to be correct.
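A rough way to check what the notebook process actually resolves (an illustrative sketch, assuming the Airflow 1.10-style config API) is to print the configuration before creating any hook:
import os
from airflow.configuration import conf
print(os.environ.get("AIRFLOW_HOME"))        # should match the home the scheduler/webserver use
print(conf.get("core", "sql_alchemy_conn"))  # a sqlite:/// URL here would explain the sqlite3 error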
EDIT 1:
After starting the Jupyter notebook server after Airflow, so that all environment variables are set, I get a different error:
/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/attributes.py in __get__(self, instance, owner)
351
352 def __get__(self, instance, owner):
--> 353 retval = self.descriptor.__get__(instance, owner)
354 # detect if this is a plain Python @property, which just returns
355 # itself for class level access. If so, then return us.
/usr/local/lib/python3.7/site-packages/airflow/models/connection.py in get_password(self)
188 "Can't decrypt encrypted password for login={}, \
189 FERNET_KEY configuration is missing".format(self.login))
--> 190 return fernet.decrypt(bytes(self._password, 'utf-8')).decode()
191 else:
192 return self._password
/usr/local/lib/python3.7/site-packages/cryptography/fernet.py in decrypt(self, msg, ttl)
169 except InvalidToken:
170 pass
--> 171 raise InvalidToken
InvalidToken:
You can use this dockerfile to reproduce it:
FROM apache/airflow
USER root
# install mysql client
RUN apt-get update && apt-get install -y mariadb-client-10.3 unzip
# install mssql client and tools
RUN apt-get install -y curl gnupg libicu-dev libicu63
RUN curl https://packages.microsoft.com/keys/microsoft.asc -o key
RUN apt-key add < key
RUN curl https://packages.microsoft.com/config/ubuntu/16.04/prod.list > /etc/apt/sources.list.d/msprod.list
ENV ACCEPT_EULA=Y
RUN apt-get update && apt-get install -y mssql-tools msodbcsql17 unixodbc unixodbc-dev unzip libunwind8
RUN curl -Lq 'https://go.microsoft.com/fwlink/?linkid=2108814' -o sqlpackage-linux-x64-latest.zip
RUN mkdir /opt/sqlpackage/ && unzip sqlpackage-linux-x64-latest.zip -d /opt/sqlpackage/
RUN chmod a+x /opt/sqlpackage/sqlpackage && ln -sfn /opt/sqlpackage/sqlpackage /usr/bin/sqlpackage
RUN ln -sfn /opt/mssql-tools/bin/sqlcmd /usr/bin/sqlcmd
# install notebooks
RUN pip install jupyterlab pandas numpy scikit-learn matplotlib pymssql
#RUN cat /entrypoint.sh
# start an additional notebook server
# this is a dirty hack, but for the sake of this prototype it is good enough
RUN sed -i -e's/\# Run the command/airflow scheduler \& \njupyter notebook --ip=0.0.0.0 --port=9000 --NotebookApp.token="" --NotebookApp.password="" \& \n/' /entrypoint
EXPOSE 9000
# switch back to airflow user
USER airflow
RUN airflow initdb
RUN alias ll='ls -al'
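One thing that might be worth trying (a hedged sketch; the path, connection string, and key below are placeholders, not values from this setup) is exporting the same Airflow settings into the notebook process before importing anything from airflow, so the hook sees the same metadata DB and Fernet key as the scheduler:
import os
# placeholders: use the values from the actual airflow.cfg / deployment
os.environ["AIRFLOW_HOME"] = "/usr/local/airflow"
os.environ["AIRFLOW__CORE__SQL_ALCHEMY_CONN"] = "postgresql+psycopg2://user:pass@host/airflow"
os.environ["AIRFLOW__CORE__FERNET_KEY"] = "<the fernet_key from airflow.cfg>"
# import airflow only after the environment is set up
from airflow.hooks.mysql_hook import MySqlHook
mysql = MySqlHook(mysql_conn_id="mysql-x")
print(mysql.get_pandas_df("select 1+1"))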

How to resolve "Error: No module named 'airflow.www'" while starting airflow webserver

I am getting the below error while starting the Airflow webserver:
balajee@Balajees-MacBook-Air.local:~$ airflow webserver -p 8080
[2018-12-03 00:29:37,066] {__init__.py:51} INFO - Using executor SequentialExecutor
[2018-12-03 00:29:38,776] {models.py:271} INFO - Filling up the DagBag from /Users/balajee/airflow/dags
Running the Gunicorn Server with:
Workers: 4 sync
Host: 0.0.0.0:8080
Timeout: 120
Logfiles: - -
Error: No module named 'airflow.www'
This fixed it for me:
pip3 uninstall -y gunicorn
pip3 install gunicorn==19.4.0
I got this problem this morning and found a strange solution; maybe it helps you. I think you may just need to change the directory you run the command from.
I installed Airflow's basic dependencies in my virtualenv directory venv with PyCharm's help, and I use PyCharm's built-in Terminal tab to access my venv directly. I ran airflow initdb to initialize the SQLite database that stores all my logs and ops, and then, following the official tutorial, used airflow webserver to start the webserver. But somehow today I used my Mac terminal, activated the virtualenv, started airflow webserver, and ran into this problem:
Running the Gunicorn Server with:
Workers: 4 sync
Host: 0.0.0.0:8080
Timeout: 120
Logfiles: - -
=================================================================
Error: No module named 'airflow.www'
[2019-05-26 07:45:27,130] {cli.py:833} ERROR - No response from gunicorn master within 120 seconds
[2019-05-26 07:45:27,130] {cli.py:834} ERROR - Shutting down webserver
I tried @Evgeniy Sobolev's solution of reinstalling gunicorn and nothing changed, but in my PyCharm Terminal it still runs successfully. I guess the directory where you first init your db and run the webserver is critical. When I use the PyCharm Terminal to init the db and start the webserver, the working directory is the project root, like:
(venv) root@root:~/GitHub/FakeProject$ airflow webserver
But today I went into a subdirectory to activate the virtualenv, so the working directory changed!
root@root:~/GitHub/FakeProject/SubDir$ source venv/bin/activate
(venv) root@root:~/GitHub/FakeProject/SubDir$ airflow webserver
** Error **
Run this way, it hits Error: No module named 'airflow.www'. So I changed the directory back, and the webserver ran successfully, just like in the PyCharm Terminal:
(venv) root@root:~/GitHub/FakeProject/SubDir$ cd ..
(venv) root@root:~/GitHub/FakeProject$ airflow webserver
** It works **
I think Airflow may store some metadata (a path, maybe) the first time you init your Airflow db, so you cannot change the directory you run the command from.
I hope this helps somebody in the future. Just check your directory!
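A quick way to test that guess is to check, from the directory where the webserver fails, which airflow module Python actually picks up (an illustrative sketch):
import sys
import airflow
print(sys.executable)    # the interpreter actually being used
print(airflow.__file__)  # should point into the virtualenv's site-packages, not into a local directory that shadows the package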
Looks like you have a problem with gunicorn.
Try executing these two commands:
sudo -H pip3 uninstall -y gunicorn
sudo -H pip3 install gunicorn
It should resolve your problem, because Airflow shows an unclear error message for gunicorn-related problems.
I did these steps, and then the problem happened:
create a separate virtualenv only for airflow (I use the Anaconda distribution)
activate this env with conda activate
install airflow: pip install apache-airflow
at this moment the error No module named 'airflow.www' showed up for me
To fix it, follow these steps:
Look for where your gunicorn is: whereis gunicorn
gunicorn has to live only in your virtualenv directory: /home/yourname/anaconda3/envs/airflow_env/bin/gunicorn
If it shows up in two directories, keep it only in your airflow environment and remove it from the others.
Another way to verify whether gunicorn is in other directories is to print your PATH variable: echo $PATH. Look for gunicorn in /home/yourname/.local/bin and in other anaconda directories on the PATH, and remove all such references. Remove gunicorn from the conda base env as well: pip uninstall gunicorn.
With these steps, I think your problem will be solved.
I used the Anaconda distribution, but I think the same process can be done without it. I used Airflow 1.10.0 and Python 3.6.
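And a tiny sketch to double-check from Python which gunicorn wins on the PATH:
import shutil
print(shutil.which("gunicorn"))  # the first hit on PATH is the one the webserver will spawn; it should live inside the airflow env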
If you have defined a custom home directory for Airflow other than the default one (~/airflow) during the installation:
You first need to export the custom path:
export AIRFLOW_HOME=/your/custom/path/airflow
Go to the airflow directory and then run the webserver:
airflow webserver -p 8080
Run the scheduler too:
airflow scheduler
Please check whether gunicorn is already installed on the server. For me it was installed in /usr/local/bin and was taking precedence over the gunicorn version installed with Airflow. Uninstall the earlier one or fix the $PATH variable.
I solved this by starting the webserver from the airflow folder itself.
I was previously trying to start the server from the home directory, where the required modules could not be found, which may be the case here.
Late to the party, but this could help others who get here.
I got the same issue using the latest Airflow version, 2.5.0.
Make sure the env variable AIRFLOW_HOME is pointing to the right location.
Thanks all for sharing.
I added sudo and it actually worked just fine.
I got the same error today and sudo did the trick for me.

Problem: Google Colab importing libraries and authorizing Google Drive access takes too long?

I execute this piece of code in Google Colab to authorize myself and get access to my Drive folders:
!apt-get install -y -qq software-properties-common python-software-properties module-init-tools
!add-apt-repository -y ppa:alessandro-strada/ppa 2>&1 > /dev/null
!apt-get update -qq 2>&1 > /dev/null
!apt-get -y install -qq google-drive-ocamlfuse fuse
from google.colab import auth
auth.authenticate_user()
from oauth2client.client import GoogleCredentials
creds = GoogleCredentials.get_application_default()
import getpass
!google-drive-ocamlfuse -headless -id={creds.client_id} -secret={creds.client_secret} < /dev/null 2>&1 | grep URL
vcode = getpass.getpass()
!echo {vcode} | google-drive-ocamlfuse -headless -id={creds.client_id} -secret={creds.client_secret}
But when it comes to the part where I copy and paste the verification code from the given URL, execution never finishes, so I cannot move on to the next block, even though I am still connected. What is the problem?
Following this tutorial, I've been using this for the past few days, and what has worked for me is the following:
1.- In a single cell run:
from google.colab import auth
auth.authenticate_user()
2.- Follow the link, copy and paste the generated password
3.- In another cell run:
from oauth2client.client import GoogleCredentials
creds = GoogleCredentials.get_application_default()
import getpass
prompt = !google-drive-ocamlfuse -headless -id={creds.client_id} -secret={creds.client_secret} < /dev/null 2>&1 | grep URL
vcode = getpass.getpass(prompt[0] + '\n\nEnter verification code: ')
!echo {vcode} | google-drive-ocamlfuse -headless -id={creds.client_id} -secret={creds.client_secret}
4.- Again, follow the link, copy and paste the new generated password
5.- Proceed to mount your drive and work with your files!
This looks like an issue other folks are encountering, discussed here: https://github.com/googlecolab/colabtools/issues/231
Do you also see any 400s or errors in the JS console?
Recently I made a new attempt and it seems to work. To mount your Drive in the Colab notebook, use these lines instead:
from google.colab import drive
drive.mount('/content/gdrive')
Run it, click on the link, accept, copy-paste the given code, and hit enter. Done! Much simpler than the previous way I asked about, which by the way is still not working.
After pasting in your generated code, hit enter on the keyboard. It will continue to buffer until you do.

Airflow installation issue on Windows 7

How do I install Airflow on Windows 7? I am getting the below error while installing it using pip install apache-airflow:
----------------------------------------
Command "c:\users\shrgupta5\appdata\local\programs\python\python36-32\python.exe -u -c "import setuptools, tokenize;__file__='C:\\Users\\SHRGUP~1\\AppData\\Local\\Temp\\pip-build-_yptw7sa\\psutil\\setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record C:\Users\SHRGUP~1\AppData\Local\Temp\pip-_cwm0nu7-record\install-record.txt --single-version-externally-managed --compile" failed with error code 1 in C:\Users\SHRGUP~1\AppData\Local\Temp\pip-build-_yptw7sa\psutil\
I wouldn't bother trying to install Airflow on Windows; even after you install it successfully, you cannot run the airflow script due to a dependency on the Unix-only module pwd.
You can run Airflow on Windows by using the Docker setup from puckel: https://github.com/puckel/docker-airflow.
Use VirtualBox and Docker Toolbox (legacy) and set up a docker-machine on your Windows computer (docker-machine create -d virtualbox --virtualbox-cpu-count "2" --virtualbox-memory "2048" default).
Make sure to fork puckel's git repo underneath c:/Users/yourusername/documents, otherwise mounting of DAGs won't work.
You should now be able to spin up Airflow, e.g. by using the Celery executor setup with docker compose -f docker-compose-CeleryExecutor.yml up -d.
I have set up an environment where I develop the DAGs on Windows, test them within the Docker container, and then push the Docker image to Linux for production. I have added a more detailed tutorial here.
Airflow cannot be installed on Windows within the standard command prompt.
You need to use bash and afterwards change the config; see:
How to run Airflow on Windows
Download the source of airflow from pypi:
https://pypi.org/project/airflow/#files
Unzip it and edit setup.cfg: go to the install_requires section and change the psutil version pin to: 'psutil>=5.4.7',
Finally, run python setup.py install in the source directory.
