Jupyter path error when on different server with mapped home directory - jupyter-notebook

I work on two servers, serverA and serverB. On both of them, my home directory is mapped to the same location. Other than the home directory, the servers have independent file systems. This includes system directories and application directories. So, I created a special .bashrc_serverb file that is sourced if my hostname is serverB. This resets my path.
balter#serverB:~$ echo $PATH
/mnt/scratch/miniconda3/bin:/bin:/usr/local/bin:/usr/bin
I first installed conda and jupyter while logged in to serverA. Apparently this created a kernelspec directory at ~/.local/share/jupyter/kernels/python3. I also installed conda and jupyter on serverB. Now when I try to run jupyter notebook or jupyter-console on serverB, I get:
```
balter#serverB:~$ jupyter-console
[ZMQTerminalIPythonApp] ERROR | Failed to run command:
['/home/...miniconda3/bin/python', '-m', 'ipykernel', '-f', '/home/users/balter/.local/share/jupyter/runtime/kernel-26741.json']
PATH='/mnt/scratch/miniconda3/bin:/bin:/usr/local/bin:/usr/bin'
with kwargs:
{'stdin': -1, 'cwd': None, 'start_new_session': True, 'stdout': None, 'stderr': None}
Traceback (most recent call last):
File "/mnt/scratch/miniconda3/bin/jupyter-console", line 5, in
app.main()
File "/mnt/scratch/miniconda3/lib/python3.5/site-packages/jupyter_core/application.py", line 267, in launch_instance
return super(JupyterApp, cls).launch_instance(argv=argv, **kwargs)
File "/mnt/scratch/miniconda3/lib/python3.5/site-packages/traitlets/config/application.py", line 657, in launch_instance
app.initialize(argv)
File "", line 2, in initialize
File "/mnt/scratch/miniconda3/lib/python3.5/site-packages/traitlets/config/application.py", line 87, in catch_config_error
return method(app, *args, **kwargs)
File "/mnt/scratch/miniconda3/lib/python3.5/site-packages/jupyter_console/app.py", line 141, in initialize
self.init_shell()
File "/mnt/scratch/miniconda3/lib/python3.5/site-packages/jupyter_console/app.py", line 109, in init_shell
JupyterConsoleApp.initialize(self)
File "/mnt/scratch/miniconda3/lib/python3.5/site-packages/jupyter_client/consoleapp.py", line 334, in initialize
self.init_kernel_manager()
File "/mnt/scratch/miniconda3/lib/python3.5/site-packages/jupyter_client/consoleapp.py", line 288, in init_kernel_manager
self.kernel_manager.start_kernel(**kwargs)
File "/mnt/scratch/miniconda3/lib/python3.5/site-packages/jupyter_client/manager.py", line 243, in start_kernel
**kw)
File "/mnt/scratch/miniconda3/lib/python3.5/site-packages/jupyter_client/manager.py", line 189, in _launch_kernel
return launch_kernel(kernel_cmd, **kw)
File "/mnt/scratch/miniconda3/lib/python3.5/site-packages/jupyter_client/launcher.py", line 123, in launch_kernel
proc = Popen(cmd, **kwargs)
File "/mnt/scratch/miniconda3/lib/python3.5/subprocess.py", line 947, in __init__
restore_signals, start_new_session)
File "/mnt/scratch/miniconda3/lib/python3.5/subprocess.py", line 1551, in _execute_child
raise child_exception_type(errno_num, err_msg)
FileNotFoundError: [Errno 2] No such file or directory: '/home/...miniconda3/bin/python'
```
The last line is the crucial one: that path only exists on serverA (full path obscured for security).
What is the fix for this?
Cross-posted as jupyter issue.

Related

Script compiled with pyinstaller is missing a .dll file, when the file is manually copied in the program's folder it just dies

I have a Python script which is basically a graphical interface (PySimpleGUI) to a MySQL database.
I am working in Python 3.8; my dependencies are:
PySimpleGUI 4.55.1
sqlalchemy 1.3.20
pymysql 1.0.2
pandas 1.1.3
regex 2020.10.15
pillow 8.0.1
The code works and I'd like to compile it to .exe to distribute it to users in my organization.
I tried to compile it with:
pyinstaller -D .\db_interface_v3.6.1_release.py --debug=imports
However, pyinstaller throws some errors when compiling:
201667 INFO: Building COLLECT COLLECT-00.toc
Traceback (most recent call last):
File "c:\users\spit\anaconda3\lib\runpy.py", line 194, in _run_module_as_main
return _run_code(code, main_globals, None,
File "c:\users\spit\anaconda3\lib\runpy.py", line 87, in _run_code
exec(code, run_globals)
File "C:\Users\Spit\anaconda3\Scripts\pyinstaller.exe\__main__.py", line 7, in <module>
File "c:\users\spit\anaconda3\lib\site-packages\PyInstaller\__main__.py", line 124, in run
run_build(pyi_config, spec_file, **vars(args))
File "c:\users\spit\anaconda3\lib\site-packages\PyInstaller\__main__.py", line 58, in run_build
PyInstaller.building.build_main.main(pyi_config, spec_file, **kwargs)
File "c:\users\spit\anaconda3\lib\site-packages\PyInstaller\building\build_main.py", line 782, in main
build(specfile, kw.get('distpath'), kw.get('workpath'), kw.get('clean_build'))
File "c:\users\spit\anaconda3\lib\site-packages\PyInstaller\building\build_main.py", line 714, in build
exec(code, spec_namespace)
File "C:\Users\Spit\Desktop\DIPEx db parser\db_interface_v3.6.1_release.spec", line 37, in <module>
coll = COLLECT(exe,
File "c:\users\spit\anaconda3\lib\site-packages\PyInstaller\building\api.py", line 818, in __init__
self.__postinit__()
File "c:\users\spit\anaconda3\lib\site-packages\PyInstaller\building\datastruct.py", line 155, in __postinit__
self.assemble()
File "c:\users\spit\anaconda3\lib\site-packages\PyInstaller\building\api.py", line 866, in assemble
shutil.copy(fnm, tofnm)
File "c:\users\spit\anaconda3\lib\shutil.py", line 415, in copy
copyfile(src, dst, follow_symlinks=follow_symlinks)
File "c:\users\spit\anaconda3\lib\shutil.py", line 261, in copyfile
with open(src, 'rb') as fsrc, open(dst, 'wb') as fdst:
FileNotFoundError: [Errno 2] No such file or directory: 'C:\\Users\\Spit\\Desktop\\DIPEx db parser\\dist\\db_interface_v3.6.1_release\\share\\jupyter\\lab\\staging\\node_modules\\.cache\\terser-webpack-plugin\\content-v2\\sha512\\2e\\ba\\cfce62ec1f408830c0335f2b46219d58ee5b068473e7328690e542d2f92f2058865c600d845a2e404e282645529eb0322aa4429a84e189eb6b58c1b97c1a'
If I try to run the compiled exe, I get an error regarding a specific .dll:
INTEL MKL ERROR: Impossibile trovare il modulo specificato ("The specified module could not be found"). mkl_intel_thread.dll.
Intel MKL FATAL ERROR: Cannot load mkl_intel_thread.dll.
If I take this missing .dll from my Anaconda environment and copy it into the program's folder, when I try to run the .exe again it just dies without further messages:
import 'numpy.ma' # <pyimod03_importers.FrozenImporter object at 0x000001F6A455BEE0>
PS C:\Users\Spit\Desktop\DIPEx db parser\dist\db_interface_v3.6.1_release>
Any idea on how to sort it out?
Thanks!
Sorted out. For future reference, if someone stumbles upon this question: the error is caused by Windows' MAX_PATH limitation, which prevents pyinstaller from finding all the necessary files.
To disable said limitation, see: https://learn.microsoft.com/en-us/windows/win32/fileio/maximum-file-path-limitation?tabs=cmd
Kudos to https://github.com/bwoodsend

problem launching Jupyter notebook or Jupyter lab anaconda

I have a problem launching the Jupyter notebook on a remote Windows computer. Running Jupyter from the conda prompt returns WinError 1326, as below:
Traceback (most recent call last):
File "D:\Anaconda3\lib\site-packages\traitlets\traitlets.py", line 535, in get
value = obj._trait_values[self.name]
KeyError: 'runtime_dir'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\Anaconda3\Scripts\jupyter-notebook-script.py", line 10, in <module>
sys.exit(main())
File "D:\Anaconda3\lib\site-packages\jupyter_core\application.py", line 254, in launch_instance
return super(JupyterApp, cls).launch_instance(argv=argv, **kwargs)
File "D:\Anaconda3\lib\site-packages\traitlets\config\application.py", line 844, in launch_instance
app.initialize(argv)
File "D:\Anaconda3\lib\site-packages\traitlets\config\application.py", line 87, in inner
return method(app, *args, **kwargs)
File "D:\Anaconda3\lib\site-packages\notebook\notebookapp.py", line 2127, in initialize
self.init_configurables()
File "D:\Anaconda3\lib\site-packages\notebook\notebookapp.py", line 1650, in init_configurables
connection_dir=self.runtime_dir,
File "D:\Anaconda3\lib\site-packages\traitlets\traitlets.py", line 575, in __get__
return self.get(obj, cls)
File "D:\Anaconda3\lib\site-packages\traitlets\traitlets.py", line 538, in get
default = obj.trait_defaults(self.name)
File "D:\Anaconda3\lib\site-packages\traitlets\traitlets.py", line 1578, in trait_defaults
return self._get_trait_default_generator(names[0])(self)
File "D:\Anaconda3\lib\site-packages\jupyter_core\application.py", line 85, in _runtime_dir_default
ensure_dir_exists(rd, mode=0o700)
File "D:\Anaconda3\lib\site-packages\jupyter_core\utils\__init__.py", line 11, in ensure_dir_exists
os.makedirs(path, mode=mode)
File "D:\Anaconda3\lib\os.py", line 213, in makedirs
makedirs(head, exist_ok=exist_ok)
File "D:\Anaconda3\lib\os.py", line 213, in makedirs
makedirs(head, exist_ok=exist_ok)
File "D:\Anaconda3\lib\os.py", line 213, in makedirs
makedirs(head, exist_ok=exist_ok)
[Previous line repeated 3 more times]
File "D:\Anaconda3\lib\os.py", line 223, in makedirs
mkdir(name, mode)
OSError: [WinError 1326] Username or password is incorrect: 'MY_SERVER_ADDRESS'
I tried changing some values in the config file, but no luck. Please help me fix this problem.
This might help. Steps:
1. Open cmd.
2. Paste the path of your Jupyter folder.
3. Type "jupyter notebook" and press Enter; you will be redirected to Jupyter.
4. If it still isn't working, check whether your Python path is configured correctly, then repeat steps 1-3.
Alternatively, you can create another Jupyter folder, configure Python there, install Jupyter Notebook, and use the steps above.
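A different workaround for the underlying failure (os.makedirs choking on a runtime directory under an unreachable network home): point Jupyter's runtime directory at a local, writable folder via the JUPYTER_RUNTIME_DIR environment variable, which jupyter_core honors. A sketch, assuming the system temp directory is locally writable:

```python
import os
import tempfile

# Point Jupyter's runtime directory at a local, writable location instead of a
# network home directory that os.makedirs cannot reach.
runtime_dir = os.path.join(tempfile.gettempdir(), "jupyter-runtime")
os.makedirs(runtime_dir, exist_ok=True)
os.environ["JUPYTER_RUNTIME_DIR"] = runtime_dir

# Jupyter launched from this process (or from a shell where the variable is
# exported) will create its kernel connection files under runtime_dir.
```

Whether this sidesteps WinError 1326 depends on the runtime directory being the path that makedirs was failing on; the traceback above suggests it is.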

Is the working directory of the dagster main process different of the scheduler processes

I'm having an issue with the loading of a file from dagster code (setup, not pipelines). Say I have the following project structure:
pipelines
-app/
--environments
----schedules.yaml
--repository.py
--repository.yaml
When I run dagit from inside the project folder ($ cd project && dagit -y app/repository.yaml), that folder becomes the working directory, and inside repository.py I can load a file knowing the root is the project folder:
# repository.py
with open('app/environments/schedules.yaml', 'r'):
    # do something with the file
However, if I set up a schedule, the pipelines in the project do not run. Checking the cron logs, it seems the open line throws a FileNotFoundError. I was wondering if this happens because the working directory is different when the cron job executes.
For context, I'm loading a config file with cron_schedules parameters for each pipeline. Also, here's the tail of the stack trace in my case:
File "/home/user/.local/share/virtualenvs/pipelines-mfP13m0c/lib/python3.8/site-packages/dagster/core/definitions/handle.py", line 190, in from_yaml
return LoaderEntrypoint.from_file_target(
File "/home/user/.local/share/virtualenvs/pipelines-mfP13m0c/lib/python3.8/site-packages/dagster/core/definitions/handle.py", line 161, in from_file_target
module = import_module_from_path(module_name, os.path.abspath(python_file))
File "/home/user/.local/share/virtualenvs/pipelines-mfP13m0c/lib/python3.8/site-packages/dagster/seven/__init__.py", line 75, in import_module_from_path
spec.loader.exec_module(module)
File "<frozen importlib._bootstrap_external>", line 783, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/user/pipelines/app/repository.py", line 28, in <module>
schedule_builder = ScheduleBuilder(settings.CRON_PRESET, settings.ENV_DICT)
File "/home/user/pipelines/app/schedules.py", line 12, in __init__
self.cron_schedules = self._load_schedules_yaml()
File "/home/user/pipelines/app/schedules.py", line 16, in _load_schedules_yaml
with open(path) as f:
FileNotFoundError: [Errno 2] No such file or directory: 'app/environments/schedules.yaml'
You could open the file using its absolute path so that it opens correctly:
from dagster.utils import file_relative_path

with open(file_relative_path(__file__, './environments/schedules.yaml'), 'r'):
    # do something with the file
All file_relative_path does is the following, so you can call the os.path methods directly if you prefer:
def file_relative_path(dunderfile, relative_path):
    return os.path.join(os.path.dirname(dunderfile), relative_path)
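A minimal self-contained sketch of the same idea (no dagster dependency), showing why a __file__-anchored path is stable no matter which directory the scheduler process is launched from:

```python
import os

def file_relative_path(dunderfile, relative_path):
    """Resolve relative_path against the directory containing dunderfile."""
    return os.path.join(os.path.dirname(dunderfile), relative_path)

# The resolved path depends only on this file's location, not on os.getcwd(),
# so it keeps working when cron launches the process from another directory.
config_path = file_relative_path(__file__, "environments/schedules.yaml")
```

By contrast, open('app/environments/schedules.yaml') resolves against the current working directory, which is whatever directory cron happened to start the process in.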

Apache Airflow - Was Working Fine Now Says Log File Isn't Local Error & Exceptions are Popping Up

So it looks like my install of Apache Airflow on a Google Compute Engine instance broke down. Everything was working great, and then two days ago all the DAG runs showed up stuck in a running state. I am using the LocalExecutor.
When I try to look at the log I get this error:
* Log file isn't local.
* Fetching here: http://:8793/log/collector/aa_main_combined_collector/2017-12-15T09:00:00
*** Failed to fetch log file from worker.
I didn't touch a setting anywhere. I looked through all the config files, and when I scanned the logs I saw this error:
[2017-12-16 20:08:42,558] {jobs.py:355} DagFileProcessor0 ERROR - Got an exception! Propagating...
Traceback (most recent call last):
File "/usr/local/lib/python3.4/dist-packages/airflow/jobs.py", line 347, in helper
pickle_dags)
File "/usr/local/lib/python3.4/dist-packages/airflow/utils/db.py", line 53, in wrapper
result = func(*args, **kwargs)
File "/usr/local/lib/python3.4/dist-packages/airflow/jobs.py", line 1584, in process_file
self._process_dags(dagbag, dags, ti_keys_to_schedule)
File "/usr/local/lib/python3.4/dist-packages/airflow/jobs.py", line 1173, in _process_dags
dag_run = self.create_dag_run(dag)
File "/usr/local/lib/python3.4/dist-packages/airflow/utils/db.py", line 53, in wrapper
result = func(*args, **kwargs)
File "/usr/local/lib/python3.4/dist-packages/airflow/jobs.py", line 763, in create_dag_run
last_scheduled_run = qry.scalar()
File "/usr/local/lib/python3.4/dist-packages/sqlalchemy/orm/query.py", line 2843, in scalar
ret = self.one()
File "/usr/local/lib/python3.4/dist-packages/sqlalchemy/orm/query.py", line 2814, in one
ret = self.one_or_none()
File "/usr/local/lib/python3.4/dist-packages/sqlalchemy/orm/query.py", line 2784, in one_or_none
ret = list(self)
File "/usr/local/lib/python3.4/dist-packages/sqlalchemy/orm/query.py", line 2855, in __iter__
return self._execute_and_instances(context)
File "/usr/local/lib/python3.4/dist-packages/sqlalchemy/orm/query.py", line 2878, in _execute_and_instances
result = conn.execute(querycontext.statement, self._params)
File "/usr/local/lib/python3.4/dist-packages/sqlalchemy/engine/base.py", line 945, in execute
return meth(self, multiparams, params)
File "/usr/local/lib/python3.4/dist-packages/sqlalchemy/sql/elements.py", line 263, in _execute_on_connection
return connection._execute_clauseelement(self, multiparams, params)
File "/usr/local/lib/python3.4/dist-packages/sqlalchemy/engine/base.py", line 1053, in _execute_clauseelement
compiled_sql, distilled_params
File "/usr/local/lib/python3.4/dist-packages/sqlalchemy/engine/base.py", line 1189, in _execute_context
context)
File "/usr/local/lib/python3.4/dist-packages/sqlalchemy/engine/base.py", line 1405, in _handle_dbapi_exception
util.reraise(*exc_info)
File "/usr/local/lib/python3.4/dist-packages/sqlalchemy/util/compat.py", line 187, in reraise
raise value
File "/usr/local/lib/python3.4/dist-packages/sqlalchemy/engine/base.py", line 1182, in _execute_context
context)
File "/usr/local/lib/python3.4/dist-packages/sqlalchemy/engine/default.py", line 470, in do_execute
cursor.execute(statement, parameters)
File "/usr/local/lib/python3.4/dist-packages/airflow/bin/cli.py", line 69, in sigint_handler
sys.exit(0)
SystemExit: 0
Any thoughts out there?
I solved this problem, though in doing so I discovered another one.
Long and short of it: as soon as I manually started the scheduler, everything worked again. It appears the scheduler did not get restarted correctly after a system reboot.
I have the scheduler running through systemd. The webserver .service works fine. However, I notice that the scheduler .service continually restarts; it appears there is an issue there I still need to resolve. This part of it is solved for now.
Look at the log URL and verify whether it ends with a date containing special characters such as +:
&execution_date=2018-02-23T08:00:00+00:00
This was fixed here.
You can replace the + with -, or percent-encode all special characters; in my case:
&execution_date=2018-02-23T08%3A00%3A00%2B00%3A00
This happens here.
The FileTaskHandler cannot load the log from local disk, and tries to load it from the worker.
Another thing that could cause this error is deletion of the airflow/logs folder or the subfolders inside it.
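The percent-encoded form above can be produced with the standard library; it is just urllib.parse.quote with no characters marked safe:

```python
from urllib.parse import quote

def encode_execution_date(date_str):
    """Percent-encode every reserved character (':' -> %3A, '+' -> %2B)."""
    return quote(date_str, safe="")

print(encode_execution_date("2018-02-23T08:00:00+00:00"))
# -> 2018-02-23T08%3A00%3A00%2B00%3A00
```

Unreserved characters (letters, digits, -) pass through unchanged, which is why only the colons and the plus sign are escaped.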

Plone buildout error: non-existent file

I have a Plone 4.3.3 site that is throwing an error on buildout:
Traceback (most recent call last):
File "/usr/local/Proforest4.3/buildout-cache/eggs/zc.buildout-2.2.5-py2.7.egg/zc/buildout/buildout.py", line 1946, in main
getattr(buildout, command)(args)
File "/usr/local/Proforest4.3/buildout-cache/eggs/zc.buildout-2.2.5-py2.7.egg/zc/buildout/buildout.py", line 626, in install
installed_files = self[part]._call(recipe.install)
File "/usr/local/Proforest4.3/buildout-cache/eggs/zc.buildout-2.2.5-py2.7.egg/zc/buildout/buildout.py", line 1370, in _call
return f()
File "/data/usr/local/Proforest4.3/buildout-cache/eggs/plone.recipe.precompiler-0.6-py2.7.egg/plone/recipe/precompiler/__init__.py", line 29, in install
return self._run()
File "/data/usr/local/Proforest4.3/buildout-cache/eggs/plone.recipe.precompiler-0.6-py2.7.egg/plone/recipe/precompiler/__init__.py", line 35, in _run
self._compile_eggs()
File "/data/usr/local/Proforest4.3/buildout-cache/eggs/plone.recipe.precompiler-0.6-py2.7.egg/plone/recipe/precompiler/__init__.py", line 67, in _compile_eggs
py_compile.compile(fn, None, None, True)
File "/usr/lib64/python2.7/py_compile.py", line 123, in compile
with open(cfile, 'wb') as fc:
IOError: [Errno 13] Permission denied: '/data/usr/local/Proforest4.3/test/src/proforest.content/proforest/content/behaviours/accordion.pyc'
accordion.py does exist, but the .pyc version does not.
Permissions on accordion.py seem correct (owned by plone_buildout etc.).
How do I resolve this?
Your user is trying to create that file, but the filesystem permissions are wrong.
Check that the Plone effective user is able to write in the buildout folders.
