Recently, I added a sub-directory and a few script files, with an empty __init__.py in the new sub-directory. There is no __init__.py in foo_main_dir.
foo_main_dir
├── main.py
└── foo_sub_dir
    └── foo.py
Sometimes PyInstaller stopped working completely after I made some changes in foo_sub_dir, with an import error at run time of the executable file:
ImportError: No module named foo
This is due to the import in main.py:
from foo_sub_dir.foo import FOO
No .pyc files were generated in either directory.
A workaround is to reinstall PyInstaller, which is quite annoying.
I'd appreciate it if anyone knows a permanent solution.
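One thing that may help (a sketch only, not a confirmed fix for the intermittent failure) is declaring the sub-package explicitly, either with --hidden-import=foo_sub_dir.foo on the command line or via hiddenimports in the spec file; the spec name main.spec is just an assumed name here:

# main.spec (sketch) -- hiddenimports/pathex are standard PyInstaller Analysis
# options; the PYZ/EXE sections generated by pyi-makespec are omitted
a = Analysis(
    ['main.py'],
    pathex=['.'],                       # search foo_main_dir for packages
    hiddenimports=['foo_sub_dir.foo'],  # force the sub-package to be bundled
)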
I'm trying to run some GitHub code and I keep getting ModuleNotFoundError: No module named 'deep_knn' when I run from deep_knn import Deep_KNN. The same thing happens when I try import utils.tensorflow_utils as tf_utils. Here is a picture of the folders:
I'm using Python 3.8.10 and I'm in the DeepNNK_polytope_interpolation directory. Does anyone know how I can fix this issue? Thanks for any help!
Figured it out: I ran !cd '/content/DeepNNK_polytope_interpolation/' but it should have been !cd DeepNNK_polytope_interpolation.
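For anyone hitting the same thing, a quick way to confirm the working directory and make the repo importable (a minimal sketch, using the path and module names from this post):

import os
import sys

print(os.getcwd())  # confirm which directory the notebook process is actually in

repo = '/content/DeepNNK_polytope_interpolation'
os.chdir(repo)            # actually change the working directory
sys.path.insert(0, repo)  # and make the repo importable by absolute path

from deep_knn import Deep_KNN
import utils.tensorflow_utils as tf_utils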
My project structure looks like:
my_project
-dags
-config
However, on the Airflow dashboard I see a Broken DAG error pointing to this line: from config.setup_configs import somemethod
It yields this error:
Broken DAG: [/usr/local/airflow/dags/airflow_foo.py] No module named 'config'
although the directory exists
According to the documentation, Airflow adds three directories to the path by default:
AIRFLOW_HOME/dags
AIRFLOW_HOME/config
AIRFLOW_HOME/plugins
Any other path has to be added to the system path, as described in Airflow module management.
For the sake of simplicity, I added my module mymodule.py to AIRFLOW_HOME/plugins, and I can import it successfully:
from mymodule import my_method
So, in your case, if you rename config to plugins and update the import statement in the DAG to
from setup_configs import somemethod
it should work.
You need to move the config directory into the dags folder and create an empty __init__.py file within the config folder. Then, it should work.
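For illustration, the layout and import that this suggestion describes would look roughly like the following (a sketch; setup_configs, somemethod, and airflow_foo.py are the names from the question):

# dags/
# ├── airflow_foo.py
# └── config/
#     ├── __init__.py        # empty file, marks config as a package
#     └── setup_configs.py   # defines somemethod
#
# dags/airflow_foo.py
from config.setup_configs import somemethod

somemethod()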
I have the following folder structure:
airflow/
|_dag/
As far as I understand, Airflow uses the "airflow" folder as root, i.e. I assume that everything placed under "airflow" can be imported.
Say I have different projects with tasks placed in the following structure:
airflow/
|_ dag/
|  |_ mydag.py
|
|_ myprojects/
   |_ projectone/
   |  |_ tasks/
   |     |_ mytask.py
   |_ projecttwo/
      |_ tasks/
         |_ mytask.py
Then I would assume that, in mydag.py, I should be able to import mytask from a given project like this:
#mydag.py
from myprojects.projectone import tasks
but I get a DAG import error: ModuleNotFoundError: No module named 'myprojects'.
Is this doable, or should I (somehow) change Airflow's PYTHONPATH (and if so, where is that done)?
Note, I have created __init__.py files in the folders.
One option is to set the path just before importing myprojects:
#mydag.py
import sys
# the path must be inserted before `myprojects` is imported
sys.path.insert(0, "..")
from myprojects.projectone import tasks
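A slightly more robust variant of this (a sketch, assuming mydag.py sits in airflow/dag/ as shown in the question) derives the path from the DAG file's own location instead of relying on the scheduler's working directory:

# mydag.py
import os
import sys

# airflow/ is one level above the dag/ folder this file lives in
AIRFLOW_ROOT = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
if AIRFLOW_ROOT not in sys.path:
    sys.path.insert(0, AIRFLOW_ROOT)

from myprojects.projectone import tasks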
The second option is to move myprojects under the dag folder:
airflow
+-- dag
    +-- myprojects
The third option is to move some of the logic into the plugins/ folder.
https://airflow.apache.org/docs/apache-airflow/stable/modules_management.html
What is the correct way to convert a Python script to a binary using PyInstaller and OpenCL? How do I set up the name.spec file? See my PyInstaller settings below.
Usage: pyinstaller --clean --noconfirm --log-level=DEBUG --onefile --paths=/home/testuser/projects/tool --paths=/usr/local/lib/python3.8/dist-packages/pyopencl-2018.2.2-py3.8-linux-x86_64.egg --hidden-import=pyopencl --name=toolexe tool.py
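For reference, a spec file roughly equivalent to that command line might look like the following (a sketch only; regenerate the real skeleton with pyi-makespec --onefile and just check these fields, since the generated boilerplate differs between PyInstaller versions):

# toolexe.spec (sketch) -- only the fields relevant to this question are shown;
# the PYZ/EXE sections should stay as pyi-makespec generates them
a = Analysis(
    ['tool.py'],
    pathex=[
        '/home/testuser/projects/tool',
        '/usr/local/lib/python3.8/dist-packages/pyopencl-2018.2.2-py3.8-linux-x86_64.egg',
    ],
    hiddenimports=['pyopencl'],
)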
From my main script (myscript.py), I am calling "from tool import *", where tool is my package, and several of the tool package scripts call "import pyopencl as cl". When I run the compiled binary version of my scripts, I see "pyopencl/__init__.py" being executed multiple times (see diagram below).
I am wondering if it is the star import of tool that is screwing up the binary version. Note: this code does work when run as a plain Python script.
myscript.py          # "from tool import *"
└── tool
    ├── __init__.py
    ├── common.py    # "import pyopencl as cl"
    ├── engine.py    # "import pyopencl as cl"
    └── traffic.py   # "import pyopencl as cl"
I found two issues with converting my Python scripts to a binary.
First, the import of tool/tool.py was wrong; you can't have a script with the same name as the package directory it lives in.
Second, on Linux there is a documented caveat about multiprocessing start methods when converting to a binary:
https://docs.python.org/3/library/multiprocessing.html
Warning: The 'spawn' and 'forkserver' start methods cannot currently be used with "frozen" executables (i.e., binaries produced by packages like PyInstaller and cx_Freeze) on Unix. The 'fork' start method does work.
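If the tool uses multiprocessing, the usual pattern for a frozen Linux binary (a sketch; these are standard-library calls, not something specific to this tool, and run() is a hypothetical entry function) is to call freeze_support() in the entry point and, if necessary, pin the start method to 'fork':

# myscript.py -- entry point of the frozen binary
import multiprocessing

if __name__ == '__main__':
    multiprocessing.freeze_support()          # harmless outside frozen builds
    multiprocessing.set_start_method('fork')  # 'spawn'/'forkserver' do not work
                                              # in frozen executables on Unix
    from tool import run                      # hypothetical entry function
    run()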
Which paths are added to sys.path when the command is run, and what factors affect it?
My Python version is 3.6.4, and I also tried it on version 3.7.
Directory structure is
.
├── __init__.py
└── src
├── a.py
└── b.py
The code is:
# a.py
class A: pass
# b.py
import sys
print(sys.path)
from src.a import A
a = A()
print(a)
I ran python3 src/b.py on two machines with the same Python version. On one of them no error was reported, while on the other an error occurred.
In the successful run, sys.path contains two relevant directories: the current directory (the project root) and the src directory.
The correct output is:
['/home/work/test/testimport/src', '/home/work/test/testimport',...]
<src.a.A object at 0x7f8b71535ac8>
The wrong result is:
['/home/work/test/testimport/src', ...]
Traceback (most recent call last):
File "src/b.py", line 3, in <module>
from src.a import A
ModuleNotFoundError: No module named 'src'
sys.path contains only the src directory.
Which path will be appended to sys.path when I run python3 src/b.py?
src is indeed not a package (it does not contain __init__.py), and it does not matter whether src itself is on your path or not. In addition, b.py "sees" the directory it is in (src) anyway, so
from a import A
would work no matter where you execute b.py from (python3 /path/to/src/b.py should work). Note that even if you did create
`src/__init__.py`
your b.py would still fail unless the directory containing src is on your path (or on PYTHONPATH, which is the recommended way to add Python modules to your path). That is also why the run succeeded on the machine where /home/work/test/testimport appeared in sys.path: with the project root on the path, src.a is importable (on Python 3 this works even without __init__.py, because src is then treated as a namespace package).
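For completeness, a sketch of the two fixes suggested above, using the paths from the question: either run with the project root on PYTHONPATH (PYTHONPATH=/home/work/test/testimport python3 src/b.py), or prepend it in b.py before the package import:

# b.py -- make `from src.a import A` work regardless of the working directory
import os
import sys

# the project root is the parent of the src/ directory this file lives in
PROJECT_ROOT = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
if PROJECT_ROOT not in sys.path:
    sys.path.insert(0, PROJECT_ROOT)

from src.a import A

print(A())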