Connecting Colab to Local Runtime Jupyter Notebook

I'm trying to connect my Google Colab to a local runtime via Jupyter Notebook. There is one part I can't figure out, which is this:
#Ensure that the notebook server on your machine is running on port 8888 and accepting requests from https://colab.research.google.com.
jupyter notebook \
--NotebookApp.allow_origin='https://colab.research.google.com' \
--port=8888 \
--NotebookApp.port_retries=0
I tried copy-pasting it into my Anaconda Prompt, but only "jupyter notebook " is pasted and executed. How do you get all of that typed into the prompt? Is it some cmd feature that I'm completely oblivious to?

The command as written is meant for a Linux/macOS shell, where the trailing \ is the line-continuation character. If you have a Windows machine, replace the \ at the end of each part of the command with ^. So, your command will be
jupyter notebook ^
--NotebookApp.allow_origin='https://colab.research.google.com' ^
--port=8888 ^
--NotebookApp.port_retries=0
Or, the complete command with all the parameters can be run on a single line like this:
jupyter notebook --NotebookApp.allow_origin='https://colab.research.google.com' --port=8888 --NotebookApp.port_retries=0
I have tested this in the Command Prompt, and the local runtime connection succeeds.
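Note that the ^ continuation only works in cmd.exe. If your Anaconda Prompt is PowerShell-based (an assumption; check which shell your shortcut launches), the continuation character is a backtick instead:
jupyter notebook `
--NotebookApp.allow_origin='https://colab.research.google.com' `
--port=8888 `
--NotebookApp.port_retries=0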

Related

Is it possible to prevent jupyter lab from saving notebooks automatically?

I tend to launch jupyter lab from different locations, open a new notebook to try something out, etc. Jupyter lab creates a notebook and saves it to disk by default. As a result, I end up with notebooks I don't want to keep all over my disk drive. Is there a way to prevent this from happening?
You might find it quickest to use an ephemeral docker container:
docker run --rm -p 8888:8888 \
-e JUPYTER_ENABLE_LAB=yes \
-e JUPYTER_TOKEN=docker \
--name jupyter \
-d jupyter/datascience-notebook:latest
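Once the container is up, browse to the lab with the token set above; and because of --rm, stopping the container also removes it, along with anything saved inside it (a usage note, not part of the original answer):
# the token was set to "docker" via JUPYTER_TOKEN above
# browse to http://localhost:8888/lab?token=docker
docker stop jupyter   # stops and, thanks to --rm, deletes the container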

Run .exe on remote machine with PSEXEC

I am trying to run an .exe that sits on a remote machine with PSEXEC, but it does not run properly.
The exe is a converted python script:
usage: WindowsUpdates3.exe [-h] [-v] [-m] [-i] [-u] [-r] [-s SKIP]
List, download and update Windows clients
optional arguments:
-h, --help show this help message and exit
-v, --version Show program version and exit
-m, --list-missing List missing updates
-i, --list-installed List installed updates
-u, --update List and install missing updates
-r, --reboot Reboot after installing updates if needed
-s SKIP, --skip SKIP Skips these KB numbers
I am able to run the .exe as intended with a PSEXEC console session:
PSEXEC \\<hostname> cmd
Navigating to the exe and running my.exe -i works the same way as executing it on the machine locally.
When I try to execute the file directly, some functionality does not work:
PSEXEC.exe \\<hostname> -h "C:\WindowsUpdates\WindowsUpdates3.exe" -i
...
C:\WindowsUpdates\WindowsUpdates3.exe exited on <hostname> with error code 0.
I am able to get the help-menu (-h) and the version (-v) of the exe with the above command.
The other arguments return nothing but code 0, and the --update argument throws a com_error: -2147024891, which translates to access denied...
How can this be as I have the same privileges as if I spawn a cmd terminal?
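A hedged guess, not from the original thread: the Windows Update COM APIs typically require an elevated or SYSTEM security context, and a direct PsExec invocation may receive a more restricted token than an interactive cmd session. PsExec's -s flag runs the remote process as the SYSTEM account, which is worth trying:
:: -s runs the remote process as SYSTEM; arguments after the exe path go to the program itself
PSEXEC.exe \\<hostname> -s "C:\WindowsUpdates\WindowsUpdates3.exe" -u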

How to run jupyter lab in a conda environment on a google compute engine (Deep Learning VM)?

I made a conda environment in my Deep Learning VM. When I ssh to it (clicking the SSH button of my instance in the VM instances page) and type source activate <environment_name>, it gets activated correctly in the shell.
I successfully connect to jupyter lab from my local machine as explained in the docs.
How can I use jupyter in a specific conda environment on this VM?
The accepted way to run jupyter in a specific conda environment seems to be
Activate a conda environment in your terminal using source activate <environment_name> before you run jupyter notebook.
but the Deep Learning VM docs say
A Jupyter Lab session is started when your Deep Learning VM instance is initialized
so I cannot source activate before the jupyter lab session is created.
Any ideas? Should I:
run a standard jupyter notebook myself instead of using the jupyter lab provided by the VM?
activate the environment in the VM's startup scripts before the jupyter lab session is created?
Please try out the below steps:
source activate <env_name>
conda install ipykernel
ipython kernel install --name <env_name> --user
After this, launch your python code from hub.colfaxresearch.com and select Kernel --> Change Kernel --> <env_name>
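To confirm the new kernel was registered (a quick check, not part of the original answer):
jupyter kernelspec list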
The only way we've found to make it see all your environments (conda and new python environments) is to run a new jupyter lab instance.
When connecting over SSH, map port 8888 (or any other port) instead of 8080: gcloud compute ssh ... -L 8888:localhost:8888
After connecting, run jupyter lab from the console. The default port is 8888.
This is one of the ugliest issues I've seen with GCE so far!
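A fuller sketch of that workflow with placeholder names (my-instance and the zone are assumptions; substitute your own):
# forward local port 8888 to port 8888 on the VM
gcloud compute ssh my-instance --zone=us-central1-a -- -L 8888:localhost:8888
# then, on the VM, with the conda environment activated:
jupyter lab --no-browser --port=8888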

How to run jupyter notebook in the background? No need to keep a terminal open for it

Often we run jupyter notebook to pop up a page in the browser and work in a notebook. However, the terminal that started the server stays occupied. Is there a way to close that terminal while keeping the server running in the background?
You can put the process into the background by using jupyter notebook --no-browser & disown. You can close the terminal afterwards and the process will still be running.
If you're using zsh you can also use a shorter version that does the same: jupyter notebook --no-browser &!.
To kill the process you can use pgrep jupyter to find the PID of the process and then kill 1234, replacing 1234 with the PID you just found.
Explanation
The --no-browser flag stops jupyter from opening the browser automatically; the command also works without this flag.
The & puts it into the background of the currently running shell.
The disown then removes the job from the background of the currently running shell and makes it run independently of the shell so that you may close it.
In the zsh version the &! is a built-in function that does the same as & disown.
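Putting those pieces together (the pgrep pattern is an assumption; adjust it if your process is named differently):
# start the server detached from this shell
jupyter notebook --no-browser & disown
# later, find the server's PID(s) and stop it
kill $(pgrep -f jupyter-notebook)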
tmux is a good option for running your Jupyter Notebook in the background.
I am running my Jupyter Notebook on a Google Cloud Platform VM instance (OS: Ubuntu 16.04), where:
I have to start an SSH terminal.
Then run the command jupyter-notebook --no-browser --port=5000. It starts the Jupyter Notebook on port 5000 on that VM instance.
Then I open my browser and type ip_address_of_my_instance:5000, which opens the Notebook in the browser.
Up to this point all is good. But if the connection with my SSH terminal is terminated, the Jupyter Notebook stops immediately, and I have to re-run it from a restarted or new SSH terminal.
To avoid this, tmux is a very good option.
Use the terminal multiplexer (tmux) to keep the session's processes running in the background after the SSH terminal is closed:
Start the SSH terminal.
Type tmux. It will open a window in the same terminal.
Run the command to start the Jupyter Notebook there. Open the Notebook.
Now, if the SSH terminal is closed or terminated, the notebook keeps running on the instance.
If the connection is terminated:
Reconnect or open a new SSH terminal. To see the Jupyter server that was kept running in the background, type tmux attach.
To terminate the tmux session:
Close the notebook, then type exit in the tmux terminal window.
(If you close the notebook and use tmux detach instead, it exits the tmux window without stopping the tmux session.)
For more details please refer to this article: https://www.tecmint.com/keep-remote-ssh-sessions-running-after-disconnection/
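A compact version of the above with a named session (the name jupyter is an arbitrary choice):
tmux new -s jupyter                         # create a named session
jupyter notebook --no-browser --port=5000   # start the server inside it
# detach with Ctrl-b then d; the server keeps running
tmux attach -t jupyter                      # reattach later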
Under *nix, the best way to keep a program from being terminated when its terminal is closed is to use nohup (no hangup).
To start the server and let it open the browser, use the command:
nohup jupyter notebook &
To start the server without opening the browser, use the command:
nohup jupyter notebook --no-browser &
Note that you can shut down the jupyter server with Quit in the upper right of the jupyter page.
nohup makes the process ignore the "hang up" signal (SIGHUP) sent when the terminal is closed; when the shell exits, the process is re-parented to init. All the output (standard output and standard error) is redirected to the file nohup.out.
nohup is an external program rather than a shell built-in; see its man page for more details.
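Since stdout is redirected, the server's startup messages, including the login-token URL, land in nohup.out, where you can read them later:
tail -f nohup.out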
This works for me when running a jupyter notebook server in the background.
$> nohup jupyter notebook --allow-root > error.log &
Stopping the nohup'ed jupyter notebook is simple.
First, find the pid of jupyter:
$> ps -ef | grep jupyter
e.g. output like:
root 11417 2897 2 16:00 pts/0 00:04:29 /path/to/jupyter-notebook
Then kill the process:
$> kill -9 11417
You can also simplify this by storing the pid with:
$> nohup jupyter notebook --allow-root > error.log & echo $! > pid.txt
i.e., you can stop the notebook with:
$> kill -9 $(cat pid.txt)
An alternative way to stop the jupyter notebook is to use Quit on the notebook page.
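Newer notebook releases also ship a built-in stop command (assuming notebook 5.1 or later; the port must match the running server):
jupyter notebook stop 8888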
You can use screen to run it.
screen -A -m -d -S anyscreenname jupyter notebook --no-browser
This will start jupyter inside a screen session, and you can manage it with the usual screen commands.
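For example, to reattach to the session created above:
screen -r anyscreenname   # reattach; detach again with Ctrl-a then d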
Actually, jupyter notebook & alone is not enough; the backend will still log to your screen.
What you need is (cited from this issue):
jupyter notebook > /path/to/somefileforlogging 2>&1 &
You can start up the notebook in a screen or tmux session. Makes it easy to check error messages, etc.
For remote machines, jupyter notebook & works fine.
However, on a local machine the server dies when you close the terminal.
For local machines, use tmux.
Not really sophisticated, but it gets the job done:
#!/bin/bash
#probably should change this to a case switch
if [ "$1" == "end" ]
then
    echo
    echo "Shutting Down jupyter-notebook"
    killall jupyter-notebook
    echo
    exit 0
fi
if [ "$1" == "-h" ]
then
    echo
    echo "To start  : jnote <port> [default 8888]"
    echo "To end    : jnote end"
    echo "This help : jnote -h"
    echo
    exit 0
fi
#cast from string
PORT=$(($1))
RETURN=0
PID=0
if [ "$PORT" == "0" ] || [ "$PORT" == "" ]; then PORT=8888; fi
echo
echo "Starting jupyter-notebook"
#background and headless, set port, allow colab access, capture log, don't open browser yet
nohup jupyter notebook \
    --NotebookApp.allow_origin='https://colab.research.google.com' \
    --port=$PORT --NotebookApp.port_retries=0 \
    --no-browser >~/jnote.log 2>&1 &
RETURN=$?
PID=$!
#Wait for bg process to complete - add as needed
sleep 2
if [ $RETURN == 0 ]
then
    echo
    echo "Jupyter started on port $PORT and pid $PID"
    echo "Goto $(cat ~/jnote.log | grep localhost: | grep -v NotebookApp)"
    echo
    exit 0
else
    echo
    echo "Jupyter failed to start on port $PORT and pid $PID with return code $RETURN"
    echo "see ~/jnote.log"
    echo
    exit $RETURN
fi
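Assuming you save the script as jnote somewhere on your PATH (the location ~/bin is an assumption) and make it executable, usage follows its built-in help:
chmod +x ~/bin/jnote
jnote 8890     # start on port 8890 (defaults to 8888)
jnote end      # shut the server down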
jupyter notebook & followed by disown:
Put the process into the background by using jupyter notebook &.
Then type disown, or disown <the process id>.
You can close the terminal now.
Detach the Jupyter process from the controlling terminal and send all its input and output data to /dev/null, a special device file that discards any data written to it.
jupyter notebook </dev/null &>/dev/null &
Lazy people like me would prefer to edit ~/.bash_aliases and create an alias:
alias jnote='jupyter notebook </dev/null &>/dev/null &'
Reference: https://www.tecmint.com/run-linux-command-process-in-background-detach-process/
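To pick up the new alias in the current shell (assuming your ~/.bashrc sources ~/.bash_aliases, as it does on Debian-family systems):
source ~/.bash_aliases
jnote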
As suggested by one of the users, using
jupyter notebook &
solves the issue. Regarding the comments stating that it kills the kernel after closing the terminal: you are probably running
jupyter-notebook & instead.
If you are using the iTerm2 software, first set up Homebrew's shell environment with brew shellenv, then start jupyter under nohup:
eval $(/usr/local/bin/brew shellenv)
nohup jupyter notebook &
Here is a command that launches jupyter in the background (&), detached from the terminal process (disown), and without opening the browser (--no-browser).
No log will be shown on the terminal, since output is redirected (&>) to the file jupyter_server.log so it can still be referred to later.
jupyter notebook --no-browser &> jupyter_server.log & disown
If you don't want to store the logs (discouraged):
jupyter notebook --no-browser &> /dev/null & disown
Thanks to the other answers this one is built upon: here and there

How to RE-connect to a Remote Jupyter instance (Running Code)

I have several long running scripts (in Jupyter Notebook) on a remote Google Cloud Compute Instance.
If I lose the ssh connection, I cannot reconnect to the (running) Notebook without stopping the scripts executing within it.
It seems that closing my macbook will sever my connection to the remote (running) jupyter notebook. Is there some way to reconnect without stopping the script?
On Google Cloud, Jupyter is still running. I just can't connect to the notebook executing the code without stopping code execution.
I'm sure other Jupyter users have figured this out :)
Thanks in advance
My GCloud Tunneling Script
gcloud compute ssh --zone us-central1-c my-compute-instance -- -N -p 22 -D localhost:5000
Bash Script that Launches Chrome
/Applications/Google\ Chrome.app/Contents/MacOS/Google\ Chrome \
    "localhost:22" \
    --proxy-server="socks5://localhost:5000" \
    --host-resolver-rules="MAP * 0.0.0.0 , EXCLUDE localhost" \
    --user-data-dir=/tmp/
Nohup that launches Jupyter on Gcloud
nohup jupyter notebook --no-browser > log.txt 2>&1 &
On my macbook (macOS Sierra), no proxy settings (System Preferences) are enabled.
On Google Cloud, I'm NOT using a static ip, just an ephemeral ip.
Much appreciation in advance
What do you mean by "cannot reconnect"? Do you mean you can't see the notebook interface anymore? (In which case this is likely a Google Cloud question.) Or do you mean you can't run code or see previous results?
If the second, this is a known issue that the jupyter team is working on. The way to work around it is to wrap your code in Python futures, which store intermediate results; re-accessing the future will then not trigger re-computation, but will show you the intermediate results.
