I have several long-running scripts (in Jupyter Notebook) on a remote Google Cloud Compute Engine instance.
If I lose the SSH connection, I cannot reconnect to the (running) notebook without stopping the scripts executing within it.
It seems that closing my MacBook severs my connection to the remote (running) Jupyter notebook. Is there some way to reconnect without stopping the script?
On Google Cloud, Jupyter is still running. I just can't connect to the notebook executing the code without stopping code execution.
I'm sure other Jupyter users have figured this out :)
Thanks in advance
My GCloud Tunneling Script
gcloud compute ssh --zone us-central1-c my-compute-instance -- -N -p 22 -D localhost:5000
Bash Script that Launches Chrome
/Applications/Google\ Chrome.app/Contents/MacOS/Google\ Chrome
"localhost:22"
--proxy-server="socks5://localhost:5000"
--host-resolver-rules="MAP * 0.0.0.0 , EXCLUDE localhost"
--user-data-dir=/tmp/
Nohup that launches Jupyter on Gcloud
nohup jupyter notebook --no-browser > log.txt 2>&1 &
On my macOS Sierra MacBook, no proxy settings (System Preferences) are enabled.
On Google Cloud, I'm NOT using a static ip, just an ephemeral ip.
Much appreciation in advance
What do you mean by "cannot reconnect"? Do you mean you can't see the notebook interface anymore? (In which case this is likely a Google Cloud question.) Or do you mean you can't run code or see previous results?
If the second, this is a known issue the Jupyter team is working on. The way to work around it is to wrap your code in Python Futures, which store intermediate results; re-accessing the Future will not trigger re-computation, but will show you the intermediate results.
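As a minimal sketch of that Futures idea, using Python's standard concurrent.futures (the long_task function and its return value are placeholders for your real computation):

```python
from concurrent.futures import ThreadPoolExecutor
import time

def long_task():
    # stand-in for a long-running computation
    time.sleep(1)
    return 42

executor = ThreadPoolExecutor(max_workers=1)
future = executor.submit(long_task)  # starts running in the background

# Later (e.g. after re-opening the notebook tab), asking for the result
# does NOT re-run the computation; the Future returns its stored value.
print(future.result())  # 42
print(future.result())  # 42 again, without recomputing
```

The key property is that a Future caches its result, so repeated calls to result() are cheap once the task has finished.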
Related
I am unable to access Jupyter notebook after starting my virtual machine on Google Cloud. I type the command below at the shell prompt:
jupyter notebook
This returns some information about the notebook server including:
[I 02:28:31.858 NotebookApp] The Jupyter Notebook is running at:
[I 02:28:31.858 NotebookApp] http://(my-fastai-instance2 or 127.0.0.1):8081/
However, when I try to access Jupyter at this address, the browser just returns a message saying it is unable to establish a connection to that server address.
Resolved by tunneling the notebook port over SSH:
gcloud compute ssh --zone <zone> <instance name> -- -L <port>:localhost:<port>
Thank you for your help.
Maybe you need to try this address in your web browser:
http://localhost:{your port}/tree
Install localtunnel:
localtunnel exposes your localhost to the world for easy testing and sharing! No need to mess with DNS
npm install -g localtunnel
After that, run this command with the port you are using:
lt --port 8000
If the URL doesn't work, set the configuration option mentioned in the message from Jupyter:
c.NotebookApp.allow_remote_access = True
in jupyter_notebook_config.py, e.g. /etc/jupyter/jupyter_notebook_config.py in your user image if using container-based deployment.
Try like this
gcloud compute ssh --zone=YOURZONE jupyter@INSTANCENAME -- -L 8080:localhost:8080
After logging in to the cloud instance with that, open a browser, go to localhost:8080, and you should have Jupyter.
This should also work by tunneling Jupyter via ssh -i ~/.ssh/google_compute_engine -nNT -L 8888:localhost:8888 vm_external_IP and then opening localhost:8888 in your browser.
I made a conda environment in my Deep Learning VM. When I ssh to it (clicking SSH button of my instance in the VM instances page) and type source activate <environment_name> it gets activated correctly in the shell.
I successfully connect to JupyterLab from my local machine as explained in the docs.
How can I use jupyter in a specific conda environment on this VM ?
The accepted way to run jupyter in a specific conda environment seems to be
Activate a conda environment in your terminal using source activate <environment_name> before you run jupyter notebook.
but the Deep Learning VM docs say
A Jupyter Lab session is started when your Deep Learning VM instance is initialized
so I cannot source activate before the JupyterLab session is created.
Any ideas? Should I:
run a standard Jupyter notebook myself instead of using the JupyterLab provided by the VM?
activate the environment in the VM's startup scripts before the JupyterLab session is created?
Please try the steps below:
source activate <env_name>
conda install ipykernel
ipython kernel install --name <env_name> --user
After this, launch your Python code from hub.colfaxresearch.com and select Kernel --> Change Kernel --> <env_name>
The only way we've found to make it see all your environments (conda and new Python environments) is to run a new JupyterLab instance.
When connecting over SSH, forward port 8888 (or any other port) instead of 8080: gcloud compute ssh ... -L 8888:localhost:8888
After connecting run jupyter lab from console. The default port is 8888.
This is one of the ugliest issues I've seen with GCE so far!
I connect to a remote server using ssh -L but if I close the laptop lid or the connection is lost, the jupyter notebook is disconnected.
After I reconnect to the remote server, the "last" session is lost.
What can be done to make it persistent?
Could screen help with it?
On the remote server, you should open your Jupyter in a screen session; it will persist if you lose the connection to the server, and you can resume it.
On your computer: ssh -L xxxx:localhost:yyyy server
On the remote server: screen
On the remote server (inside screen): jupyter notebook --no-browser --port=yyyy
In your browser: localhost:xxxx
To disconnect manually and reconnect:
Exit the screen window: control + a and then d.
Disconnect from the server: control + d
And reconnect: ssh -L xxxx:localhost:yyyy server
Optionally, you can reopen the screen window, though unnecessary, using screen -r.
Go back to your notebook or reopen localhost:xxxx.
The standard way to persist Jupyter server sessions is to use nohup and &; on your remote server with IP address xx.xx.xx.xx:
nohup jupyter notebook --no-browser --ip xx.xx.xx.xx --port yyyy &
Now, even if you switch off your laptop or lose the connection, you will always be able to reconnect by pointing your browser at xx.xx.xx.xx:yyyy.
Adding to @BiBi's answer...
Instead of screen, I'd recommend taking a look at tmux. Especially if you combine tmux with the Tmux Plugin Manager and install Tmux Resurrect, you will be able to get back to your previous tmux sessions even after reboots of your remote server.
Shortcuts for tmux are similar to those of screen, except that control + a is replaced by control + b. Of course, tmux allows you to configure custom shortcuts.
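A sketch of the equivalent tmux workflow (the session name "jupyter" and port 8888 are arbitrary choices, not anything the VM requires):

```shell
# On the remote server: create a named tmux session
tmux new -s jupyter
# Inside the session, launch Jupyter so it survives disconnects
jupyter notebook --no-browser --port=8888
# Detach: press control + b, then d (or just close your laptop lid)
# Later, after reconnecting over ssh, reattach with:
tmux attach -t jupyter
```

As with screen, the notebook server keeps running inside the detached session whether or not you reattach.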
BiBi's answer is correct. But I had cases where my ssh connection terminated unexpectedly and the port forwarding no longer worked when trying to reconnect. Probably there was some dangling process on the remote machine, not sure.
Anyway, in these cases I used socat to proxy between two local ports on the remote machine:
# jupyter notebook/lab running in screen on port yyyy, then your connection dies...
ssh -L xxxx:localhost:zzzz server
socat tcp-listen:zzzz,reuseaddr,fork tcp:localhost:yyyy
This way you can avoid restarting Jupyter on a different port.
Use the nohup command to keep jupyter running even after exiting the shell or terminal. Type the following command in the specified locations.
In the remote server: nohup jupyter notebook --no-browser --port=8085 > my.log 2>&1 < /dev/null &. This runs Jupyter on port 8085 and any stdout will be written to my.log
In local: ssh -NL 8085:localhost:8085 username@xx.xx.xx.xx. If an SSH port needs to be specified, you can use ssh -NL 8085:localhost:8085 -p xxxx username@xx.xx.xx.xx
In browser http://127.0.0.1:8085/
Sometimes port 8085 may be occupied in the remote server, in such cases try it with another port but make sure you use the same port number in the local while tunneling.
I am connected to a Linode terminal via SSH.
I have started a Jupyter Notebook server from the command line.
But I want to do some other tasks on the command line while keeping the notebook server running. I can't work out how to do this without stopping the Jupyter Notebook server. It says:
[I 05:55:05.523 NotebookApp] Use Control-C to stop this server and shut
down all kernels (twice to skip confirmation).
Since you mentioned Linode, I'm assuming it is a Linux/Unix system.
Possible options:
1. Send the process to the background:
send the process to the background: jupyter notebook &
do some work on the terminal
close Jupyter with pkill jupyter (this should have the same effect as a ctrl+c keypress)
WARNING: this will kill all Jupyter instances
2. Use a terminal multiplexer like tmux
3. Just use multiple SSH sessions (not really elegant imo)
Overall, I think the easiest and most flexible option is option 2.
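A sketch of the background-process option that also survives the SSH session ending (the nohup/disown combination and the log filename are my additions, not part of the answer above):

```shell
# Start Jupyter in the background, immune to the hangup signal
# sent when the SSH session closes; output goes to jupyter.log
nohup jupyter notebook --no-browser > jupyter.log 2>&1 &
disown          # remove the job from the shell's job table
# ... do other work on the terminal ...
pkill jupyter   # stop it later (this kills ALL jupyter instances)
```

Without nohup/disown, a plain jupyter notebook & can still be killed by SIGHUP when the terminal goes away, which is exactly the disconnect problem being discussed.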
I am trying to get GNU parallel to split up a processing job between my machine (Win7 running Cygwin) and some remote machines (Linux).
But I can't seem to figure out the syntax to do both from the same command. I have tried using -S localhost,user@server1,user@server2 but localhost does not have sshd running, so this fails (the job does continue running on the remote hosts).
Thanks in advance.
From man parallel:
The sshlogin : is special, it means no ssh and will therefore run on the local computer.
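Based on that man page note, a sketch of the combined invocation (the host names and the gzip command are placeholders for your actual processing job):

```shell
# ':' means "run locally without ssh", so no local sshd is needed;
# jobs are split between the local machine and the two remote hosts
parallel -S :,user@server1,user@server2 'gzip -9 {}' ::: *.log
```

If the machines differ in capacity, GNU parallel also lets you cap the job slots per sshlogin with an N/ prefix, e.g. -S 2/:,8/user@server1.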