Kill the Jupyter notebook kernel from within the notebook

I am looking for a way to automatically kill the notebook's kernel to free up memory after some cells finish running.
Is there a magic command that I can use from within the notebook cells to do that?

Use exit(); the kernel will restart automatically, but all the resources held by the old kernel will be released.
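For example, as the final cell of a notebook (a minimal sketch; only the exit() call itself comes from the answer above):

    # Run this as the last cell once the heavy work is done.
    # exit() asks IPython to terminate the kernel process; Jupyter starts a
    # fresh kernel in its place, so the old objects and their memory are freed.
    exit()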

Related

Unable to start kernel for jupyter notebooks in a specific directory

No problem in other directories. Is there an environment variable or something else I need to erase?
Deleted cache file...
OK, I think I need to be much clearer here.
First, the software:
MacOS Catalina 10.15.6
jupyter notebook 6.0.3
Python 3.8.3
IPython 7.16.1
jupyter notebook is installed and runs fine.
jupyter notebook runs just fine in any user directory on the computer except exactly one.
There is nothing obvious in this directory that shouldn't be there. An 'ls -al' shows nothing but some .py files.
I can create a jupyter notebook in this directory, but the kernel crashes and won't restart. I can rename the directory or rename the notebook, yet the behavior persists through everything I have tried to reset, including a cold restart of the computer. It is reproducible and happens every time.
This behavior is not seen in any other directory.
My question: are there environment variables or caches stored somewhere other than the directory itself (obviously, since nothing is visible there) that are responsible for this incredibly annoying behavior, and how can I reset them?
Problem solved: jupyter notebook apparently treats some local .py file names as reserved when starting up the kernel. So far I've found that "string.py" and "decorator.py" cannot be in the startup directory unless they contain the expected contents (it looks related to some template code). The startup directory is on the kernel's module search path, so files with these names shadow the real string and decorator modules that the kernel imports while starting, and the kernel crashes.
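A quick way to spot this kind of shadowing (my own sketch, not from the original poster) is to list which local .py files hide an installed module:

    # Diagnostic sketch: list local .py files whose names shadow an installed
    # module. The notebook's working directory is on sys.path, so a file such
    # as string.py is imported instead of the standard library's string module
    # and can crash the kernel at startup.
    import importlib.util
    import pathlib
    import sys

    cwd = str(pathlib.Path.cwd())
    for path in sorted(pathlib.Path(".").glob("*.py")):
        saved_path = sys.path[:]
        try:
            # Resolve the name without the current directory to find the "real" module.
            sys.path = [p for p in saved_path if p not in ("", cwd)]
            spec = importlib.util.find_spec(path.stem)
        except (ImportError, ValueError):
            spec = None
        finally:
            sys.path = saved_path
        if spec is not None:
            print(f"{path.name} shadows '{path.stem}' from {spec.origin}")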
To start up a kernel:
First, activate your virtual environment, for instance: conda activate vision
Second, type jupyter notebook
as stated here

Jupyter notebook seems to remember previous path (!pwd) after being moved to a different directory?

I initially had a notebook in one directory in AWS SageMaker JupyterLab, say /A, but then moved it into /A/B. However, when I run !pwd in a notebook cell, I still get /A. This happens even after I press 'restart kernel'. How does the notebook remember this, and is there a way to prevent or reset it?
Thanks
I was actually using AWS SageMaker, and restarting the kernel from the toolbar was not enough. I needed to shut down the kernel session by pressing 'shut down' in the "Running terminals and kernels" panel in the left navigation.
They are currently discussing warning users about the need to restart the kernel when a notebook is moved.
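From inside the notebook you can also check and, if needed, reset the kernel's working directory without shutting the session down (my own note; /A/B is the hypothetical path from the question):

    import os

    print(os.getcwd())   # shows the directory the kernel process was started in
    os.chdir("/A/B")     # point the kernel at the notebook's new location
    print(os.getcwd())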

Run A Specific Jupyter Notebook On Start

I would like to set up a system such that it not only runs jupyter notebook on start, but also starts executing a specific notebook on that jupyter server (running all cells in sequence).
Is this possible? I specifically want to be able to access the notebook web interface and inspect/stop/etc the running notebook at any point.
I know nbconvert can execute a notebook, but it seems to run independently of any existing jupyter servers?
Maybe there is some API I can access so that I can write a shell script to run jupyter notebook and then use the API to open and run a notebook?
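One partial approach (a sketch of my own, not an answer from this thread) is to execute the notebook programmatically with nbclient, the library nbconvert uses under the hood. Note that, as suspected above, this starts its own kernel rather than attaching to the already-running Jupyter server:

    # Execute every cell of a notebook in order and save the output.
    import nbformat
    from nbclient import NotebookClient

    nb = nbformat.read("startup.ipynb", as_version=4)   # hypothetical notebook name
    client = NotebookClient(nb, timeout=600, kernel_name="python3")
    client.execute()                                     # runs all cells in sequence
    nbformat.write(nb, "startup.executed.ipynb")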

Cell ran to completion but kernel is still running

I have a jupyter notebook that does some data extraction. After the cell has executed (no * next to the cell) and the extraction results are returned, the kernel still shows as running (and CPU usage for ipykernel is at 100%).
What could cause this, and how can I find out what is keeping ipykernel at 100% while no cell is running?
This is normally caused by an extension. Check whether you have a variable monitor/watch extension enabled, and if so disable it.
Alternatively, you can try JupyterLab, which has addressed many of the notebook's issues with extensions.
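If you want to see what the kernel itself is busy with, one option (my own sketch, not from the answer above) is to dump the live thread stacks from a fresh cell; a background thread started by an extension will show up here:

    import sys
    import threading
    import traceback

    # Print the current stack of every thread in the ipykernel process.
    frames = sys._current_frames()
    for thread in threading.enumerate():
        print(f"--- {thread.name} ---")
        frame = frames.get(thread.ident)
        if frame is not None:
            traceback.print_stack(frame)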

Memory limit in jupyter notebook

How do I set a maximum memory limit for a jupyter notebook process?
If I use too much RAM the computer freezes and I have to press the power button to restart it manually.
Is there a way of automatically killing a jupyter notebook process as soon as a user-set memory limit is exceeded, or of having it throw a memory error instead? Thanks
Under Linux you can use "cgroups" to limit resources for any software running on your computer.
Install cgroup-tools with apt-get install cgroup-tools
Edit its configuration /etc/cgconfig.conf to make a profile for the particular type of work (e.g. numerical scientific computations):
group app/numwork {
    memory {
        memory.limit_in_bytes = 500000000;
    }
}
Apply that configuration to the process names you care about by listing them in /etc/cgrules.conf (in my case it is all julia executables, which I run through jupyter, but you can use it for any other software too):
*:julia memory app/numwork/
Finally, parse the config and set it as the current active config with the following commands:
~# cgconfigparser -l /etc/cgconfig.conf
~# cgrulesengd
I use this to set limits on processes running on a server that is used by my whole class of students.
This page has some more details and other ways to use cgroups: https://wiki.archlinux.org/index.php/cgroups
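A lighter-weight, per-kernel alternative (my own addition, not part of the cgroups answer above) is to cap the kernel's address space from the first cell of the notebook, so oversized allocations raise MemoryError instead of freezing the machine; this works on Linux:

    import resource

    limit_bytes = 4 * 1024 ** 3  # e.g. 4 GiB; pick a value below your physical RAM
    resource.setrlimit(resource.RLIMIT_AS, (limit_bytes, limit_bytes))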
