FileNotFoundError: [Errno 2] No such file or directory: 'java': 'java' - streamlit

I am facing this error after deploying my Streamlit app on Streamlit sharing. The app runs fine on localhost but not after deployment. I suspect the bash commands in my Streamlit app are not being run on the server.
import os
import subprocess

# Performs the descriptor calculation
bashCommand = "java -Xms2G -Xmx2G -Djava.awt.headless=true -jar ./PaDEL-Descriptor/PaDEL-Descriptor.jar -removesalt -standardizenitro -fingerprints -descriptortypes ./PaDEL-Descriptor/PubchemFingerprinter.xml -dir ./ -file descriptors_output.csv"
process = subprocess.Popen(bashCommand.split(), stdout=subprocess.PIPE)
output, error = process.communicate()
os.remove('molecule.smi')
I am getting this as an error:
FileNotFoundError: [Errno 2] No such file or directory: 'java': 'java'
Traceback:
File "/home/appuser/venv/lib/python3.7/site-packages/streamlit/script_runner.py", line 332, in _run_script
exec(code, module.__dict__)
File "/app/bioactivity-prediction/app.py", line 69, in <module>
desc_calc()
File "/app/bioactivity-prediction/app.py", line 13, in desc_calc
process = subprocess.Popen(bashCommand.split(), stdout=subprocess.PIPE)
File "/usr/local/lib/python3.7/subprocess.py", line 800, in __init__
restore_signals, start_new_session)
File "/usr/local/lib/python3.7/subprocess.py", line 1551, in _execute_child
raise child_exception_type(errno_num, err_msg, err_filename)
Also here is a link to my deployed app:
https://share.streamlit.io/rahul97532/bioactivity-prediction/app.py

To install Java on Streamlit Cloud, you need to create a file packages.txt containing the line default-jre, as demonstrated in this example repository:
https://github.com/randyzwitch/test-java/blob/main/packages.txt
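With the JRE installed, Popen can find java. As an extra safeguard (a sketch, not part of the original app), you can fail early with a clearer message when the binary is missing:
```
import shutil
import subprocess

# Fail early with a readable message instead of Popen's opaque
# FileNotFoundError when no JRE is installed on the host.
if shutil.which("java") is None:
    raise RuntimeError(
        "'java' not found on PATH; on Streamlit Cloud, add a packages.txt "
        "file containing the line 'default-jre' to install it."
    )

bashCommand = "java -version"  # substitute the full PaDEL-Descriptor command
process = subprocess.Popen(bashCommand.split(), stdout=subprocess.PIPE)
output, error = process.communicate()
```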

Related

Script compiled with PyInstaller is missing a .dll file; when the file is manually copied into the program's folder, it just dies

I have a Python script which is basically a graphical interface (PySimpleGUI) to a MySQL database.
I am working in python 3.8; my dependencies are:
PySimpleGUI 4.55.1
sqlalchemy 1.3.20
pymysql 1.0.2
pandas 1.1.3
regex 2020.10.15
pillow 8.0.1
The code works and I'd like to compile it to .exe to distribute it to users in my organization.
I tried to compile it with:
pyinstaller -D .\db_interface_v3.6.1_release.py --debug=imports
However, PyInstaller throws some errors when compiling:
201667 INFO: Building COLLECT COLLECT-00.toc
Traceback (most recent call last):
File "c:\users\spit\anaconda3\lib\runpy.py", line 194, in _run_module_as_main
return _run_code(code, main_globals, None,
File "c:\users\spit\anaconda3\lib\runpy.py", line 87, in _run_code
exec(code, run_globals)
File "C:\Users\Spit\anaconda3\Scripts\pyinstaller.exe\__main__.py", line 7, in <module>
File "c:\users\spit\anaconda3\lib\site-packages\PyInstaller\__main__.py", line 124, in run
run_build(pyi_config, spec_file, **vars(args))
File "c:\users\spit\anaconda3\lib\site-packages\PyInstaller\__main__.py", line 58, in run_build
PyInstaller.building.build_main.main(pyi_config, spec_file, **kwargs)
File "c:\users\spit\anaconda3\lib\site-packages\PyInstaller\building\build_main.py", line 782, in main
build(specfile, kw.get('distpath'), kw.get('workpath'), kw.get('clean_build'))
File "c:\users\spit\anaconda3\lib\site-packages\PyInstaller\building\build_main.py", line 714, in build
exec(code, spec_namespace)
File "C:\Users\Spit\Desktop\DIPEx db parser\db_interface_v3.6.1_release.spec", line 37, in <module>
coll = COLLECT(exe,
File "c:\users\spit\anaconda3\lib\site-packages\PyInstaller\building\api.py", line 818, in __init__
self.__postinit__()
File "c:\users\spit\anaconda3\lib\site-packages\PyInstaller\building\datastruct.py", line 155, in __postinit__
self.assemble()
File "c:\users\spit\anaconda3\lib\site-packages\PyInstaller\building\api.py", line 866, in assemble
shutil.copy(fnm, tofnm)
File "c:\users\spit\anaconda3\lib\shutil.py", line 415, in copy
copyfile(src, dst, follow_symlinks=follow_symlinks)
File "c:\users\spit\anaconda3\lib\shutil.py", line 261, in copyfile
with open(src, 'rb') as fsrc, open(dst, 'wb') as fdst:
FileNotFoundError: [Errno 2] No such file or directory: 'C:\\Users\\Spit\\Desktop\\DIPEx db parser\\dist\\db_interface_v3.6.1_release\\share\\jupyter\\lab\\staging\\node_modules\\.cache\\terser-webpack-plugin\\content-v2\\sha512\\2e\\ba\\cfce62ec1f408830c0335f2b46219d58ee5b068473e7328690e542d2f92f2058865c600d845a2e404e282645529eb0322aa4429a84e189eb6b58c1b97c1a'
If I try to run the compiled exe, I get an error regarding a specific .dll:
INTEL MKL ERROR: The specified module could not be found. mkl_intel_thread.dll.
Intel MKL FATAL ERROR: Cannot load mkl_intel_thread.dll.
If I take the missing .dll from my Anaconda environment and copy it into the program's folder, the .exe just dies without further messages when I run it again:
import 'numpy.ma' # <pyimod03_importers.FrozenImporter object at 0x000001F6A455BEE0>
PS C:\Users\Spit\Desktop\DIPEx db parser\dist\db_interface_v3.6.1_release>
Any idea on how to sort it out?
Thanks!
Sorted out. For future reference, if someone stumbles upon this question: the error is caused by Windows' maximum path length (MAX_PATH) limitation, which prevents PyInstaller from finding all the necessary files.
To disable said limitation: https://learn.microsoft.com/en-us/windows/win32/fileio/maximum-file-path-limitation?tabs=cmd
Kudos to https://github.com/bwoodsend
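The linked page describes the LongPathsEnabled registry value; a minimal sketch of flipping it from Python (Windows only, requires an elevated/administrator process):
```
import winreg

# Enable Win32 long path support, per the Microsoft doc linked above.
# Some applications only pick this up after a restart.
key = winreg.OpenKey(
    winreg.HKEY_LOCAL_MACHINE,
    r"SYSTEM\CurrentControlSet\Control\FileSystem",
    0,
    winreg.KEY_SET_VALUE,
)
winreg.SetValueEx(key, "LongPathsEnabled", 0, winreg.REG_DWORD, 1)
winreg.CloseKey(key)
```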

Is the working directory of the dagster main process different from that of the scheduler processes?

I'm having an issue with the loading of a file from dagster code (setup, not pipelines). Say I have the following project structure:
pipelines/
  app/
    environments/
      schedules.yaml
    repository.py
    repository.yaml
When I run dagit from inside the project folder ($ cd pipelines && dagit -y app/repository.yaml), that folder becomes the working directory, and inside repository.py I can load a file knowing the root is pipelines:
# repository.py
with open('app/environments/schedules.yaml', 'r') as f:
    ...  # do something with the file
However, if I set up a schedule, the pipelines in the project do not run. Checking the cron logs, it seems the open line throws a FileNotFoundError; I was wondering if this happens because the working directory is different when the cron executes.
For context, I'm loading a config file with the cron_schedules parameters for each pipeline. Also, here's the tail of the stacktrace in my case:
File "/home/user/.local/share/virtualenvs/pipelines-mfP13m0c/lib/python3.8/site-packages/dagster/core/definitions/handle.py", line 190, in from_yaml
return LoaderEntrypoint.from_file_target(
File "/home/user/.local/share/virtualenvs/pipelines-mfP13m0c/lib/python3.8/site-packages/dagster/core/definitions/handle.py", line 161, in from_file_target
module = import_module_from_path(module_name, os.path.abspath(python_file))
File "/home/user/.local/share/virtualenvs/pipelines-mfP13m0c/lib/python3.8/site-packages/dagster/seven/__init__.py", line 75, in import_module_from_path
spec.loader.exec_module(module)
File "<frozen importlib._bootstrap_external>", line 783, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/user/pipelines/app/repository.py", line 28, in <module>
schedule_builder = ScheduleBuilder(settings.CRON_PRESET, settings.ENV_DICT)
File "/home/user/pipelines/app/schedules.py", line 12, in __init__
self.cron_schedules = self._load_schedules_yaml()
File "/home/user/pipelines/app/schedules.py", line 16, in _load_schedules_yaml
with open(path) as f:
FileNotFoundError: [Errno 2] No such file or directory: 'app/environments/schedules.yaml'
You could open the file using its absolute path so that it resolves correctly regardless of the working directory.
from dagster.utils import file_relative_path

with open(file_relative_path(__file__, './environments/schedules.yaml'), 'r') as f:
    ...  # do something with the file
All file_relative_path does is the following, so you can call the os.path methods directly if you prefer:
def file_relative_path(dunderfile, relative_path):
    return os.path.join(os.path.dirname(dunderfile), relative_path)
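For instance, a self-contained equivalent (a sketch using the YAML path from the question):
```
import os

# Resolve the config relative to this module instead of the process's
# working directory, so it also works when cron launches the scheduler.
path = os.path.join(os.path.dirname(os.path.abspath(__file__)),
                    "environments", "schedules.yaml")
with open(path) as f:
    ...  # do something with the file
```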

Airflow: ValueError: Unable to configure handler 'processor' - wasb logger

I am trying to configure remote logging with Azure blob.
Airflow version: 1.10.2
Python: 3.6.5
Ubuntu: 18.04
Following are the steps I did:
In $AIRFLOW_HOME/config/log_config.py, I have put REMOTE_BASE_LOG_FOLDER = 'wasb-airflow-logs' (This is a folder inside the container (container name: airflow-logs))
An empty __init__.py is in $AIRFLOW_HOME/config/
$AIRFLOW_HOME/config/ is added in $PYTHONPATH
Renamed DEFAULT_LOGGING_CONFIG to LOGGING_CONFIG everywhere in $AIRFLOW_HOME/config/log_config.py
The user defined in the Airflow blob connection has read/write access to REMOTE_BASE_LOG_FOLDER
$AIRFLOW_HOME/airflow.cfg has remote_logging = True
logging_config_class = log_config.LOGGING_CONFIG
remote_log_conn_id =
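For reference, the handler entry that the error below fails to resolve would look roughly like this in log_config.py (a sketch: only the 'class' dotted path is taken from the traceback; the other keys are illustrative and may differ between Airflow versions):
```
LOGGING_CONFIG = {
    'version': 1,
    'handlers': {
        'processor': {
            # The dotted path the ValueError below cannot resolve:
            'class': 'airflow.utils.log.wasb_task_handler.WasbTaskHandler',
            'formatter': 'airflow',
            'wasb_log_folder': 'wasb-airflow-logs',  # REMOTE_BASE_LOG_FOLDER
            'wasb_container': 'airflow-logs',
        },
    },
}
```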
Following is the error:
Unable to load the config, contains a configuration error.
Traceback (most recent call last):
File "/home/gsingh/anaconda3/lib/python3.6/logging/config.py", line 382, in resolve
found = getattr(found, frag)
AttributeError: module 'airflow.utils.log' has no attribute 'wasb_task_handler'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/gsingh/anaconda3/lib/python3.6/logging/config.py", line 384, in resolve
self.importer(used)
File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/utils/log/wasb_task_handler.py", line 23, in <module>
from airflow.contrib.hooks.wasb_hook import WasbHook
File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/contrib/hooks/wasb_hook.py", line 22, in <module>
from airflow.hooks.base_hook import BaseHook
File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/hooks/base_hook.py", line 28, in <module>
from airflow.models import Connection
File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/models.py", line 86, in <module>
from airflow.utils.dag_processing import list_py_file_paths
File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/utils/dag_processing.py", line 49, in <module>
from airflow.settings import logging_class_path
ImportError: cannot import name 'logging_class_path'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/gsingh/anaconda3/lib/python3.6/logging/config.py", line 558, in configure
handler = self.configure_handler(handlers[name])
File "/home/gsingh/anaconda3/lib/python3.6/logging/config.py", line 708, in configure_handler
klass = self.resolve(cname)
File "/home/gsingh/anaconda3/lib/python3.6/logging/config.py", line 391, in resolve
raise v
File "/home/gsingh/anaconda3/lib/python3.6/logging/config.py", line 384, in resolve
self.importer(used)
File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/utils/log/wasb_task_handler.py", line 23, in <module>
from airflow.contrib.hooks.wasb_hook import WasbHook
File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/contrib/hooks/wasb_hook.py", line 22, in <module>
from airflow.hooks.base_hook import BaseHook
File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/hooks/base_hook.py", line 28, in <module>
from airflow.models import Connection
File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/models.py", line 86, in <module>
from airflow.utils.dag_processing import list_py_file_paths
File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/utils/dag_processing.py", line 49, in <module>
from airflow.settings import logging_class_path
ValueError: Cannot resolve 'airflow.utils.log.wasb_task_handler.WasbTaskHandler': cannot import name 'logging_class_path'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/gsingh/venv/bin/airflow", line 21, in <module>
from airflow import configuration
File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/__init__.py", line 36, in <module>
from airflow import settings, configuration as conf
File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/settings.py", line 262, in <module>
logging_class_path = configure_logging()
File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/logging_config.py", line 73, in configure_logging
raise e
File "/home/gsingh/venv/lib/python3.6/site-packages/airflow/logging_config.py", line 68, in configure_logging
dictConfig(logging_config)
File "/home/gsingh/anaconda3/lib/python3.6/logging/config.py", line 795, in dictConfig
dictConfigClass(config).configure()
File "/home/gsingh/anaconda3/lib/python3.6/logging/config.py", line 566, in configure
'%r: %s' % (name, e))
ValueError: Unable to configure handler 'processor': Cannot resolve 'airflow.utils.log.wasb_task_handler.WasbTaskHandler': cannot import name 'logging_class_path'
I am not sure which configuration I am missing. Has anyone faced the same issue?
You need to install the azure package:
pip install 'apache-airflow[azure_blob_storage,azure_data_lake,azure_cosmos,azure_container_instances]'
As per UPDATING.md, this should now be installed with
pip install 'apache-airflow[azure]'
but this didn't work for me.
In my case, the fix was sudo chown 50000:0 dags logs plugins. I tried running the official docker-compose.yml with all its containers (which depend on these three volume mounts), and also wrapping airflow standalone into a single container for debugging. It turned out the volumes were created with root ownership instead of the airflow user's.
I had the same error; however, when I scrolled up higher I could see another exception thrown before the ValueError, which was a PermissionError:
PermissionError: [Errno 13] Permission denied: '/usr/local/airflow/logs/scheduler'
The reason I got that error is that I didn't create the initial three folders (dags, logs, plugins) before running the Airflow Docker container. Docker seems to have created them automatically, but with the wrong permissions.
Steps to fix:
Stop the current containers:
docker-compose down --volumes --remove-orphans
Delete the folders dags, logs, plugins.
Just in case, destroy the images and volumes already created (in Docker Desktop).
Create the folders again from the command line:
mkdir logs dags plugins
Run Airflow again:
docker-compose up airflow-init
docker-compose up
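If the containers still create these folders owned by root, note that the official quick-start compose file also reads your host user id from an .env file, so files are created with matching ownership (the AIRFLOW_UID variable is what the stock docker-compose.yml expects):
echo -e "AIRFLOW_UID=$(id -u)" > .env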

Jupyter path error when on different server with mapped home directory

I work on two servers, serverA and serverB. On both of them, my home directory is mapped to the same location. Other than the home directory, the servers have independent file systems, including system and application directories. So I created a special .bashrc_serverb file that is sourced if my hostname is serverB; it resets my PATH.
balter#serverB:~$ echo $PATH
/mnt/scratch/miniconda3/bin:/bin:/usr/local/bin:/usr/bin
I first installed conda and jupyter while logged in to serverA. Apparently this created a kernel spec at ~/.local/share/jupyter/kernels/python3. I also installed conda and jupyter on serverB. Now, when I try to run jupyter notebook or jupyter-console on serverB, I get:
```
balter#serverB:~$ jupyter-console
[ZMQTerminalIPythonApp] ERROR | Failed to run command:
['/home/...miniconda3/bin/python', '-m', 'ipykernel', '-f', '/home/users/balter/.local/share/jupyter/runtime/kernel-26741.json']
PATH='/mnt/scratch/miniconda3/bin:/bin:/usr/local/bin:/usr/bin'
with kwargs:
{'stdin': -1, 'cwd': None, 'start_new_session': True, 'stdout': None, 'stderr': None}
Traceback (most recent call last):
File "/mnt/scratch/miniconda3/bin/jupyter-console", line 5, in
app.main()
File "/mnt/scratch/miniconda3/lib/python3.5/site-packages/jupyter_core/application.py", line 267, in launch_instance
return super(JupyterApp, cls).launch_instance(argv=argv, **kwargs)
File "/mnt/scratch/miniconda3/lib/python3.5/site-packages/traitlets/config/application.py", line 657, in launch_instance
app.initialize(argv)
File "", line 2, in initialize
File "/mnt/scratch/miniconda3/lib/python3.5/site-packages/traitlets/config/application.py", line 87, in catch_config_error
return method(app, *args, **kwargs)
File "/mnt/scratch/miniconda3/lib/python3.5/site-packages/jupyter_console/app.py", line 141, in initialize
self.init_shell()
File "/mnt/scratch/miniconda3/lib/python3.5/site-packages/jupyter_console/app.py", line 109, in init_shell
JupyterConsoleApp.initialize(self)
File "/mnt/scratch/miniconda3/lib/python3.5/site-packages/jupyter_client/consoleapp.py", line 334, in initialize
self.init_kernel_manager()
File "/mnt/scratch/miniconda3/lib/python3.5/site-packages/jupyter_client/consoleapp.py", line 288, in init_kernel_manager
self.kernel_manager.start_kernel(**kwargs)
File "/mnt/scratch/miniconda3/lib/python3.5/site-packages/jupyter_client/manager.py", line 243, in start_kernel
**kw)
File "/mnt/scratch/miniconda3/lib/python3.5/site-packages/jupyter_client/manager.py", line 189, in _launch_kernel
return launch_kernel(kernel_cmd, **kw)
File "/mnt/scratch/miniconda3/lib/python3.5/site-packages/jupyter_client/launcher.py", line 123, in launch_kernel
proc = Popen(cmd, **kwargs)
File "/mnt/scratch/miniconda3/lib/python3.5/subprocess.py", line 947, in init
restore_signals, start_new_session)
File "/mnt/scratch/miniconda3/lib/python3.5/subprocess.py", line 1551, in _execute_child
raise child_exception_type(errno_num, err_msg)
FileNotFoundError: [Errno 2] No such file or directory: '/home/...miniconda3/bin/python'
```
The last line is the crucial one. That path is on serverA (full path obscured for security).
What is the fix for this?
Cross-posted as jupyter issue.
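One way to confirm the diagnosis (a sketch, assuming the default per-user kernelspec location mentioned above):
```
import json
import os

# The kernelspec lives under the home directory, which both servers share,
# so it pins the absolute interpreter path that was recorded on serverA.
spec = os.path.expanduser("~/.local/share/jupyter/kernels/python3/kernel.json")
with open(spec) as f:
    print(json.load(f)["argv"][0])  # e.g. /home/.../miniconda3/bin/python
```
Running jupyter kernelspec list shows the same paths from the command line.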

Plone buildout error: non-existent file

I have a Plone 4.3.3 site that is throwing an error on buildout:
Traceback (most recent call last):
File "/usr/local/Proforest4.3/buildout-cache/eggs/zc.buildout-2.2.5-py2.7.egg/zc/buildout/buildout.py", line 1946, in main
getattr(buildout, command)(args)
File "/usr/local/Proforest4.3/buildout-cache/eggs/zc.buildout-2.2.5-py2.7.egg/zc/buildout/buildout.py", line 626, in install
installed_files = self[part]._call(recipe.install)
File "/usr/local/Proforest4.3/buildout-cache/eggs/zc.buildout-2.2.5-py2.7.egg/zc/buildout/buildout.py", line 1370, in _call
return f()
File "/data/usr/local/Proforest4.3/buildout-cache/eggs/plone.recipe.precompiler-0.6-py2.7.egg/plone/recipe/precompiler/__init__.py", line 29, in install
return self._run()
File "/data/usr/local/Proforest4.3/buildout-cache/eggs/plone.recipe.precompiler-0.6-py2.7.egg/plone/recipe/precompiler/__init__.py", line 35, in _run
self._compile_eggs()
File "/data/usr/local/Proforest4.3/buildout-cache/eggs/plone.recipe.precompiler-0.6-py2.7.egg/plone/recipe/precompiler/__init__.py", line 67, in _compile_eggs
py_compile.compile(fn, None, None, True)
File "/usr/lib64/python2.7/py_compile.py", line 123, in compile
with open(cfile, 'wb') as fc:
IOError: [Errno 13] Permission denied: '/data/usr/local/Proforest4.3/test/src/proforest.content/proforest/content/behaviours/accordion.pyc'
accordion.py does exist, but not the .pyc version.
Permissions on accordion.py seem correct (owned by plone_buildout etc.).
How do I resolve this?
Your user is trying to create that file, but the filesystem permissions are wrong.
Check that the Plone effective user is able to write in the buildout folders.
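For example, a quick writability check (a sketch; the directory comes from the traceback above, and it should be run as the Plone effective user):
```
import os

# The precompiler writes .pyc files next to the sources; verify the
# effective user can actually write into that directory.
d = ('/data/usr/local/Proforest4.3/test/src/proforest.content/'
     'proforest/content/behaviours')
print(os.access(d, os.W_OK))
```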
