Broken DAG: ModuleNotFoundError: No module named 'airflow.providers.snowflake' - airflow

Hi, I have set up my environment for an Airflow run. I want to run a DAG which connects to Snowflake. I have installed the necessary packages below from Cloud Shell:
pip3 install snowflake-connector-python==2.4.5
pip3 install snowflake-sqlalchemy==1.2.4
pip3 install apache-airflow-providers-snowflake==2.3.0
pip3 install apache-airflow-providers-common-sql
I have established the Snowflake connection in Airflow.
Now, while executing the DAG, I have been getting this error for a long time:
Broken DAG: [/home/airflow/gcs/dags/snowflake_connect_mine.py] Traceback (most recent call last):
File "/home/airflow/gcs/dags/snowflake_connect_mine.py", line 6, in <module>
from airflow.contrib.hooks.snowflake_hook import SnowflakeHook
File "/opt/python3.8/lib/python3.8/site-packages/airflow/contrib/hooks/snowflake_hook.py", line 23, in <module>
from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook # noqa
ModuleNotFoundError: No module named 'airflow.providers.snowflake'
Please help me to resolve this issue.
Regards
Sachin Mittal
9560315720
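For reference, once apache-airflow-providers-snowflake is actually available to the scheduler (on Cloud Composer, which the /home/airflow/gcs/dags path suggests, packages installed with pip3 in Cloud Shell do not reach the Airflow workers; they have to be added as PyPI packages on the environment itself), the DAG can import the hook from the provider path instead of the deprecated airflow.contrib shim. A minimal sketch, with the connection id and query as placeholder assumptions:

# Minimal sketch of a DAG using the provider import path.
# Assumes the provider package is installed in the scheduler's environment and
# that a Snowflake connection named "snowflake_default" exists (placeholder id).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook


def query_snowflake():
    hook = SnowflakeHook(snowflake_conn_id="snowflake_default")
    print(hook.get_records("SELECT CURRENT_VERSION()"))


with DAG(
    dag_id="snowflake_connect_example",
    start_date=datetime(2022, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    PythonOperator(task_id="query_snowflake", python_callable=query_snowflake)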

Related

Issue with installing ansible-galaxy azure.azcollection

[root@jenkins-dev playbooks]# ansible-galaxy collection install azure.azcollection
ERROR! Unexpected Exception, this is probably a bug: cannot import name 'CollectionRequirement' from 'ansible.galaxy.collection' (/usr/local/lib/python3.7/site-packages/ansible/galaxy/collection/__init__.py)
the full traceback was:
Traceback (most recent call last):
File "/usr/local/bin/ansible-galaxy", line 92, in <module>
mycli = getattr(__import__("ansible.cli.%s" % sub, fromlist=[myclass]), myclass)
File "/usr/local/lib/python3.7/site-packages/ansible/cli/galaxy.py", line 24, in <module>
from ansible.galaxy.collection import (
ImportError: cannot import name 'CollectionRequirement' from 'ansible.galaxy.collection' (/usr/local/lib/python3.7/site-packages/ansible/galaxy/collection/__init__.py)
This exception indicates you have overlapping, conflicting installations of ansible-core (or ansible-base) and ansible<2.10.
You will need to clean up your installs to resolve the issue, potentially via:
$ sudo pip uninstall -y ansible-base ansible-core
and then install only the distribution you actually want (installing both again would just recreate the overlap), for example:
$ sudo pip install ansible-core
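Before and after the cleanup, it can help to confirm which of the overlapping distributions are actually installed. A minimal sketch, assuming Python 3.8+ so that importlib.metadata is available:

# Lists which ansible distributions pip knows about, to spot overlapping installs.
# Assumes Python 3.8+ (importlib.metadata in the standard library).
import importlib.metadata as metadata

for name in ("ansible", "ansible-base", "ansible-core"):
    try:
        print(f"{name}=={metadata.version(name)}")
    except metadata.PackageNotFoundError:
        print(f"{name}: not installed")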

Apache Airflow initdb command fails due to syntax error

I have created a virtualenv for Python 3 using:
virtualenv -p $(which python3) ENV
Then activated the environment:
source /Users/myusername/ENV/bin/activate
Then installed apache-airflow:
pip install apache-airflow
Then which airflow yields /Users/myusername/ENV/bin/airflow.
But when I try to initialize the database using:
airflow initdb
I get the error below:
{db.py:350} INFO - Creating tables
INFO [alembic.runtime.migration] Context impl SQLiteImpl.
INFO [alembic.runtime.migration] Will assume non-transactional DDL.
WARNI [airflow.utils.log.logging_mixin.LoggingMixin] cryptography not found - values will not be stored encrypted.
ERROR [airflow.models.DagBag] Failed to import: /Library/Python/2.7/site-packages/airflow/example_dags/example_http_operator.py
Traceback (most recent call last):
File "/Library/Python/2.7/site-packages/airflow/models/__init__.py", line 413, in process_file
m = imp.load_source(mod_name, filepath)
File "/Library/Python/2.7/site-packages/airflow/example_dags/example_http_operator.py", line 27, in <module>
from airflow.operators.http_operator import SimpleHttpOperator
File "/Library/Python/2.7/site-packages/airflow/operators/http_operator.py", line 21, in <module>
from airflow.hooks.http_hook import HttpHook
File "/Library/Python/2.7/site-packages/airflow/hooks/http_hook.py", line 23, in <module>
import tenacity
File "/Library/Python/2.7/site-packages/tenacity/__init__.py", line 375, in <module>
from tenacity.tornadoweb import TornadoRetrying
File "/Library/Python/2.7/site-packages/tenacity/tornadoweb.py", line 24, in <module>
from tornado import gen
File "/Library/Python/2.7/site-packages/tornado-6.0.3-py2.7-macosx-10.14-intel.egg/tornado/gen.py", line 126
def _value_from_stopiteration(e: Union[StopIteration, "Return"]) -> Any:
^
SyntaxError: invalid syntax
Done.
(ENV) ---------------------------------------------------------
It seems the example scripts are being loaded with Python 2.7, which cannot parse the Python 3 function definition syntax.
Does the apache-airflow package need to be fixed in the next release, or can I do something to fix this myself?
I tried fixing this by using Python 2.7 instead of Python 3: I installed Airflow on the default Python 2.7 enabled on the Mac, but this throws other errors, such as the package "six" not being compatible.
You need to turn off loading of the example DAGs in the config file to solve this problem.
Anyway, it seems odd that Airflow uses Python 2.7 when you said it was installed into a Python 3 virtual environment.
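For reference, the setting that controls this lives in the [core] section of airflow.cfg; a sketch of the relevant lines, assuming a default config layout (the AIRFLOW__CORE__LOAD_EXAMPLES environment variable overrides it as well):

# airflow.cfg
[core]
# skip loading the bundled example DAGs (avoids importing example_http_operator.py)
load_examples = False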

Raspberry Pi: No module named 'cx_Oracle'

I want to use the Raspberry Pi to send values to an Oracle 11g database, but when I run import cx_Oracle, I get the following error:
Traceback (most recent call last):
File "/home/pi/20190222ex01.py", line 1, in <module>
import cx_Oracle
File "/usr/lib/python3/dist-packages/thonny/backend.py", line 317, in _custom_import
module = self._original_import(*args, **kw)
ImportError: No module named 'cx_Oracle'
How can I solve this problem?
Update: Oracle has released Oracle Instant Client ARM64: https://www-sites.oracle.com/database/technologies/instant-client/linux-arm-aarch64-downloads.html
It means that you have not installed the cx_Oracle module.
First you must install the Oracle driver with pip:
python -m pip install cx_Oracle --upgrade
Hope this helps.
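Once the module imports cleanly, a minimal connection sketch looks like the following; the host, service name, and credentials are placeholders, and the Oracle client libraries must be available on the Pi for the connect call to succeed:

# Minimal cx_Oracle sketch; connection details below are placeholders.
import cx_Oracle

conn = cx_Oracle.connect(
    user="myuser",
    password="mypassword",
    dsn="dbhost.example.com:1521/orcl",  # host:port/service_name
)
cur = conn.cursor()
cur.execute("SELECT 1 FROM dual")
print(cur.fetchone())
cur.close()
conn.close()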

Error in install_keras() in R since Ubuntu update

I have been using the book "Deep Learning with R" for a month now, and it has enabled me to build my first neural networks.
I am using Ubuntu. Until two days ago, everything worked fine. But two days ago I updated my Ubuntu to Ubuntu 18.02, and since then my R code no longer works.
I have redone what is recommended in the book (and what worked a month ago):
$ sudo apt-get update
$ sudo apt-get upgrade
$ sudo apt-get install python-pip python-dev
$ sudo apt-get install build-essential cmake git unzip pkg-config libopenblas-dev liblapack-dev
I had no errors.
Then in R:
install.packages("keras")
library(keras)
install_keras()
This last command is supposed to install the core Keras library along with its dependencies in a Python virtual environment using TensorFlow.
But I got the following error, which I really do not understand:
> install_keras()
Using existing virtualenv at ~/.virtualenvs/r-tensorflow
Upgrading pip ...
Traceback (most recent call last):
File "/home/baragatt/.virtualenvs/r-tensorflow/bin/pip", line 7, in <module>
from pip._internal import main
File "/home/baragatt/.virtualenvs/r-tensorflow/local/lib/python2.7/site-packages/pip/_internal/__init__.py", line 5, in <module>
import logging
File "/usr/lib/python2.7/logging/__init__.py", line 26, in <module>
import sys, os, time, cStringIO, traceback, warnings, weakref, collections
File "/usr/lib/python2.7/weakref.py", line 14, in <module>
from _weakref import (
ImportError: cannot import name _remove_dead_weakref
Erreur : Error 1 occurred installing TensorFlow
I have reinstalled R, Python, and TensorFlow, but I always get the same error. I do not understand this error. Maybe it is a problem with the virtualenv?
Can someone help me, please? It is so frustrating, because two days ago my code was running, and now it is impossible to work...
I am working with Ubuntu 18.02, and the installed versions are Python 2.7.15~rc1-1, R 3.4.4, and TensorFlow 1.10.0.
Thanks a lot for this post. I do not really understand what the commands in that post are supposed to fix, but I have done the following:
cd /home/baragatt/.virtualenvs/r-tensorflow/
Then, as proposed in the post:
virtualenv . --system-site-packages
I obtained the following messages:
Running virtualenv with interpreter /usr/bin/python2
New python executable in /home/baragatt/.virtualenvs/r-tensorflow/bin/python2
Not overwriting existing python script /home/baragatt/.virtualenvs/r-tensorflow/bin/python (you must use /home/baragatt/.virtualenvs/r-tensorflow/bin/python2)
Traceback (most recent call last):
File "/usr/lib/python3/dist-packages/virtualenv.py", line 2375, in <module>
main()
File "/usr/lib/python3/dist-packages/virtualenv.py", line 724, in main
symlink=options.symlink)
File "/usr/lib/python3/dist-packages/virtualenv.py", line 946, in create_environment
site_packages=site_packages, clear=clear, symlink=symlink))
File "/usr/lib/python3/dist-packages/virtualenv.py", line 1417, in install_python
os.symlink(py_executable_base, full_pth)
OSError: [Errno 17] File exists
I also tried:
virtualenv -p /usr/bin/python2.7 .
And I obtained:
Running virtualenv with interpreter /usr/bin/python2.7
New python executable in /home/baragatt/.virtualenvs/r-tensorflow/bin/python2.7
Traceback (most recent call last):
File "/usr/lib/python3/dist-packages/virtualenv.py", line 2375, in <module>
main()
File "/usr/lib/python3/dist-packages/virtualenv.py", line 724, in main
symlink=options.symlink)
File "/usr/lib/python3/dist-packages/virtualenv.py", line 946, in create_environment
site_packages=site_packages, clear=clear, symlink=symlink))
File "/usr/lib/python3/dist-packages/virtualenv.py", line 1278, in install_python
shutil.copyfile(executable, py_executable)
File "/usr/lib/python2.7/shutil.py", line 97, in copyfile
with open(dst, 'wb') as fdst:
IOError: [Errno 40] Too many levels of symbolic links: '/home/baragatt/.virtualenvs/r-tensorflow/bin/python2.7'
I finally found a solution by looking at different forums.
I thought the problem might be caused by the virtual environment that gets created when running the following command in R:
install_keras()
Hence, I deleted the virtual environment(s) by removing the directory in which these environments are located (I imagine):
cd ~/.virtualenvs
rm -r r-tensorflow/
Then I tried the following commands in R:
install.packages("keras")
library(keras)
install_keras()
And it works! Honestly, I still do not understand what the problem was that occurred after my Ubuntu update.

ModuleNotFoundError: No module named 'six'

I am trying to set up a LAMP server on my Fedora 27. Referring to this site, I am following every step, but when I run the command firewall-cmd --permanent --add-service=http, I get the following errors:
Traceback (most recent call last):
File "/usr/bin/firewall-cmd", line 31, in <module>
from firewall.client import FirewallClient, FirewallClientIPSetSettings, \
File "/usr/lib/python3.6/site-packages/firewall/client.py", line 29, in <module>
import slip.dbus
File "/usr/lib/python3.6/site-packages/slip/dbus/__init__.py", line 8, in <module>
from . import service
File "/usr/lib/python3.6/site-packages/slip/dbus/service.py", line 30, in <module>
from six import with_metaclass
ModuleNotFoundError: No module named 'six'
I reinstalled the six package, but I still get the same error message.
You probably don't have the six Python module installed. You can find it on PyPI.
To install it:
$ easy_install six
If you have pip installed, you can run:
$ pip install six
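Note that firewall-cmd runs under the system Python 3 (the traceback shows /usr/lib/python3.6/site-packages), so six has to be installed for that interpreter, not just for Python 2. A small sketch to check, run with the same python3 that firewall-cmd uses:

# Check whether 'six' is importable by this interpreter.
import sys

try:
    import six
    print("six", six.__version__, "found at", six.__file__)
except ImportError:
    print("six is missing for", sys.executable)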
