Table of Contents disappeared in Jupyter Notebook - jupyter-notebook

Currently, one of my Jupyter notebooks does not show the sidebar table of contents. All the other notebooks do.
In fact, for that notebook the nbextensions config entry under the Edit menu has disappeared, so it looks like that notebook does not load the extension.
My Jupyter version is 4.4.0.
How can I fix that notebook?

I found the solution to this problem. Advice like "close and then re-open", "clear the notebook's output" or "re-install the extensions" won't work. The problem is in the JavaScript load timeouts.
In Firefox press F12 and click the red icon in the top right corner - you will see that you have an error:
Load timeout for modules:
custom/custom,nbextensions/nbextensions_configurator/config_menu/main,
bla-bla-bla...
How to solve:
Close the notebook in Jupyter, open your .ipynb file in any text editor and go to the end of it - you will find the "metadata" section. Add the line (see the sketch after these steps):
"setTimeout": 120
Create the file ~/.jupyter/custom/custom.js if you don't have it (that is its path on Linux; where it lives on Windows I have no idea - google for it) and put these contents into the file:
window.requirejs.config({
    waitSeconds: 90,  // default is 30 seconds
});
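If hand-editing the JSON feels risky, here is a minimal sketch that adds the "setTimeout" key programmatically (the notebook filename is just a placeholder; adjust it to your own file):

import json

# Placeholder filename - use your own notebook's path
path = "my_notebook.ipynb"

with open(path, encoding="utf-8") as f:
    nb = json.load(f)

# Add the timeout described above to the notebook-level metadata
nb["metadata"]["setTimeout"] = 120

with open(path, "w", encoding="utf-8") as f:
    json.dump(nb, f, indent=1)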
This page describes the problem in detail:
https://github.com/ipython-contrib/jupyter_contrib_nbextensions/issues/1195

In my case, disabling my ad-blocker brought the table of contents back.

Apparently, all it needs is to shut down the notebook. After restarting it, everything works fine. I suppose the problem happens when the notebook has been closed improperly, without the proper "Close and Halt" procedure.

My solution in these cases is to make a copy (File > Make a Copy), delete the original notebook and rename the copy (File > Rename).

For me it seems that one of the following steps worked:
Update all packages and jupyter notebook to the latest version using conda
Uninstall then install again the nbextensions configurator
Reboot the computer

For me the cause was 22 emojis in the headers; deleting 5 of them was enough for Firefox to bring the TOC back, with no reloading or kernel operations needed.

I have the problem with Jupyter Notebook version 6.1.4. After installing nbextensions and enabling the extension "Table of Contents (2)" (as "toc" and "toc2" don't work), I cannot obtain the TOC for one notebook (2.8 MB), while new notebooks do have a table of contents.
Reloading, reopening, restarting jupyter does not help, even after a long wait (> 15 min).
I have tried with browsers Mozilla Firefox version 83 and Chromium version 87.
However, a workaround is to create a new notebook (with TOC), then copy all cells from the old notebook to the new notebook. In order to do this, these might be useful:
how to copy cells from notebook to notebook and
how to select all cells.
Actually I could not copy all cells at once; I had to do three partial copies (see the sketch below for a programmatic alternative).
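As a possible alternative to copying cells by hand in the browser, here is a minimal sketch using nbformat to copy every cell from the old notebook into the new one (both filenames are placeholders):

import nbformat

# Placeholder filenames - adjust to your own notebooks
old_nb = nbformat.read("old_notebook.ipynb", as_version=4)
new_nb = nbformat.read("new_notebook.ipynb", as_version=4)

# Move all cells across and save the new notebook
new_nb.cells = old_nb.cells
nbformat.write(new_nb, "new_notebook.ipynb")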
Alas, my method is not only painful but also ineffective in the long term, as the TOC disappeared again.
Restarting the computer sometimes worked and sometimes did not.
Opening a small notebook with a table of contents and then reloading the big notebook also sometimes worked and sometimes did not.
The problem concerns all nbextensions; they are simply not active for the big notebook. The JavaScript console (Ctrl+Shift+J in Google Chrome) shows:
Error: Load timeout for modules:
nbextensions/nbextensions_configurator/config_menu/main,nbextensions/init_cell/main,nbextensions/spellchecker/main,nbextensions/toc2/main,nbextensions/jupyter-js-widgets/extension

Related

Some setting changes in Table of Contents 2 (toc2) in nbextensions not showing up in Jupyter notebook I had open at the time of installation

I recently installed nbextensions in my conda environment to add the ability to add a Table of Contents (toc2) to my notebooks. But I did this while a Jupyter notebook was open, and after installing the extension, changes in the extension settings didn't reflect in the notebook I had open during installation, while they did in the others that weren't open.
I tried shutting down and restarting the kernel, restarting the computer, and uninstalling and re-installing nbextensions again (following these instructions on a github ticket). None of these things rectified the issue with the notebook that was open. A duplicate of the notebook inherits its problems.
An interesting thing to note is that after reinstalling nbextensions, which was my last attempt, the settings from the first install were carried over instead of going back to the defaults (i.e. the color settings in nbextensions were still the distinct colors I had switched them to before the uninstall). I'm not sure whether I fully uninstalled nbextensions, or whether that is even possible. And after the re-install, further changes to the settings didn't reflect in any notebook: for example, after removing the sidebar TOC setting that I had selected the first time, the sidebar still remains in certain notebooks.
I'm not sure what's going on but:
1.) Is there a way to get the settings changes to apply to notebooks universally, particularly having a TOC - especially in the notebook that was open at the time of install?
2.) Is there a way to totally un-install nbextensions like it never existed on my machine so I can try this again?
Step 0 - Is this a problem with the Notebook metadata?
The toc2 module adds some metadata to the notebook.
Just in case this metadata is missing, has wrong values, or the defaults written there prevent the ToC from showing up, try this:
Open the notebook and select "Edit"->"Edit Notebook Metadata".
Remove the "toc":section if it exists an add this to the end of the metadata:
"toc": {
"nav_menu": {},
"number_sections": true,
"sideBar": true,
"skip_h1_title": false,
"base_numbering": 1,
"title_cell": "Table of Contents",
"title_sidebar": "Contents",
"toc_cell": true,
"toc_position": {
"height": "382px",
"width": "256px",
"left": "10px",
"top": "10px"
},
"toc_section_display": true,
"toc_window_display": true
}
Be careful not to remove any comma (,) after the closing } if one exists! The metadata must remain valid JSON.
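If the in-browser metadata editor is awkward, the same change can also be made from Python with nbformat; a minimal sketch (the filename is a placeholder, and only a few of the keys above are shown):

import nbformat

nb = nbformat.read("my_notebook.ipynb", as_version=4)

# Replace the "toc" metadata block; fill in the full dictionary shown above
nb.metadata["toc"] = {
    "number_sections": True,
    "sideBar": True,
    "toc_cell": True,
    "toc_window_display": True,
}

nbformat.write(nb, "my_notebook.ipynb")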
After changing the metadata:
Click the "Edit" button
Save the notebook
Close the browser window
And open the notebook again.
You should see a floating TOC window.
If not, go to the next step.
Step 1 - Is this a Jupyter problem?
Check that jupyter_contrib_nbextensions is installed (pip freeze | grep jupyter_contrib_nbextensions).
Open jupyter and check there is a 'Nbextensions' tab next to the 'Clusters' tab on the home page.
Open the 'Nbextensions' tab and leave the 'disable configuration for nbextensions without explicit compatibility (they may break your notebook environment, but can be useful to show for nbextension development)' box unchecked.
Now look for the 'Table of contents (2)' extension and enable it.
Now you should see the TOC when you open the notebook.
If not, go to the next step.
Step 2 - Is this a browser-related problem?
Looking at the toc2 extension source we can see that it is client-side, based on JavaScript code.
When you load the Jupyter notebook, the module main.js is added to the JavaScript your browser runs, and it modifies some menu commands.
To rule out caching issues, open Jupyter with a 'fresh' browser.
This could be either:
Another browser (Chrome, Firefox, Edge...)
Or your 'normal' browser in incognito/InPrivate mode
With a 'clean' browser you should be able to open the notebook and see the navigation bar from the TOC extension.
What to do now:
If the Clean browser works, but the "normal" browser doesn't
clear the cache in the normal browser and try again.
If the Clean browser does not work and Jupyter is running on your local machine.
We've hit a bug here.
Open your browser developer tools, open a ticket in github, and add the console output from developer tools.
If the Clean browser does not work and Jupyter is running on a remote machine
Do you use a proxy? Or is your ISP using transparent caching for this page or parts of it (the latter is hard to assess)?
Try running Jupyter on another port (with the --port option). That should fix the problem. Consider using https instead of http to avoid caching by intermediate network equipment.

How to change Jupyter notebook default working directory

I'm trying to change the default working directory for Jupyter notebooks (anaconda installation). I've looked up many different answers, most of which focus on changing the jupyter_notebook_config.py file, but none of those work.
Some of the things I've tried:
This medium article
This github issue (bottom reply)
This
This too
No matter what I try, whenever I start jupyter notebook the directory I see is C:\Users\me.
I admit part of the problem is that I'm not great with command line things which makes some of the answers difficult to understand.
If you run Jupyter Notebook from the Windows Start Menu, then you have to change the command that the shortcut runs. Right-click the Jupyter Notebook entry in the Start Menu, open the file location, right-click the Jupyter Notebook shortcut, select Properties, and in the Target textbox change the "%USERPROFILE%" part to the directory you want (or, if you have changed the Jupyter config file, delete this part).
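For reference, if you do go the config-file route instead, the relevant setting in jupyter_notebook_config.py is a single line (the c object is provided by Jupyter when it loads the file, and the path below is only an example):

# In ~/.jupyter/jupyter_notebook_config.py
# (on Windows typically C:\Users\<you>\.jupyter\jupyter_notebook_config.py)
c.NotebookApp.notebook_dir = r'D:\my_notebooks'  # example path - use your own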

jupyter notebook takes forever to open and then pages unresponsive - [MathJax] issue

I'm trying to open a Jupyter notebook, it takes a long time, and I see at the bottom that it's trying to load various [MathJax] extensions; e.g. at the bottom left of the Chrome browser it says:
Loading [MathJax]/extensions/safe.js
Eventually, the notebook loads, but it's frozen and then at the bottom left it keeps showing that it's trying to load other [MathJax] .js files.
Meanwhile, the "Pages unresponsive - do you want to kill them?" popup keeps appearing.
I have no equations or plots in my notebook so I can't understand what is going on. My notebook never did this before.
I googled this and some people said to delete the ipython checkpoints. Where would those be? I'm on Mac OS and using Anaconda.
conda install -c conda-forge nbstripout
nbstripout filename.ipynb
Make sure that there is no whitespace in the filename.
I had a feeling that the program in my Jupyter notebook was stuck trying to produce some output, so I restarted the kernel and cleared output and that seemed to do the trick!
If Jupyter crashes while opening the ipynb file, try "using nbstripout to clear output directly from the .ipynb file via command line"(bndwang). Install with pip install nbstripout
I was having the same problem with Jupyter notebook. My recommendations are as follows:
First, check the size of the .ipynb file you are trying to open. The file size is probably large (several MB). One reason for this might be the output of a dataset for which you previously displayed all rows.
For example:
In order to check the dataset, I sometimes use pd.set_option('display.max_rows', None) instead of the .head() function, and so I view all the rows in the data set.
The large number of outputs increases the file size, making the notebook slower. Try to delete such outputs.
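For instance, rather than displaying every row, you can keep the display setting bounded or restore the default after inspecting the data; a small sketch (the limit of 20 is arbitrary):

import pandas as pd

# Cap how many rows a DataFrame prints, instead of 'display.max_rows' = None
pd.set_option('display.max_rows', 20)

# Once you are done inspecting, restore the default limit
pd.reset_option('display.max_rows')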
I think this will solve your problem.
Here restarting your kernel will not help. Instead, use nbstripout to strip the output from the command line.
Install nbstripout if it is not there already (https://pypi.org/project/nbstripout/), then run this command:
nbstripout FILE.ipynb
It happened to me the time I decided to print a matrix 100000 times. The notebook file became 150 MB and Jupyter (in Chrome) was not able to open it: it reported all the things you experienced and then the page died saying it was "OutOfMemory".
I solved the issue by opening the notebook in Visual Studio Code, which has a "Clear All Output" button; then I saved the notebook again and it was back to a few hundred KB, which I could open normally.
If you don't have Visual Studio Code installed, you can open the notebook with another editor (gedit on Linux or Notepad++ on Windows) and try to delete the output cells by hand. This is trickier, since you have to pay close attention to what you are deleting, otherwise the notebook will stop working.
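If editing the raw file by hand is too risky, here is a minimal sketch that clears all code-cell outputs with nbformat and writes the result to a new file (both filenames are placeholders):

import nbformat

nb = nbformat.read("big_notebook.ipynb", as_version=4)

# Drop the stored outputs of every code cell
for cell in nb.cells:
    if cell.cell_type == "code":
        cell.outputs = []
        cell.execution_count = None

nbformat.write(nb, "big_notebook_stripped.ipynb")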

IOPub data rate exceeded in Jupyter notebook (when viewing image)

I want to view an image in Jupyter notebook. It's a 9.9MB .png file.
from IPython.display import Image
Image(filename='path_to_image/image.png')
I get the below error:
IOPub data rate exceeded.
The notebook server will temporarily stop sending output
to the client in order to avoid crashing it.
A bit surprising and reported elsewhere.
Is this expected and is there a simple solution?
(Error msg suggests changing limit in --NotebookApp.iopub_data_rate_limit.)
Try this:
jupyter notebook --NotebookApp.iopub_data_rate_limit=1.0e10
Or this:
yourTerminal:prompt> jupyter notebook --NotebookApp.iopub_data_rate_limit=1.0e10
I ran into this using networkx and bokeh
This works for me in Windows 7 (taken from here):
To create a jupyter_notebook_config.py file, with all the defaults commented out, you can use the following command line:
$ jupyter notebook --generate-config
Open the file and search for c.NotebookApp.iopub_data_rate_limit
Uncomment the line c.NotebookApp.iopub_data_rate_limit = 1000000 and change it to a higher rate. I used c.NotebookApp.iopub_data_rate_limit = 10000000.
This unforgiving default config is popping up in a lot of places. See these GitHub issues:
jupyter
IOPub data rate exceeded
It looks like it might get resolved with the 5.1 release
Update:
Jupyter notebook is now on release 5.2.2. This problem should have been resolved. Upgrade using conda or pip.
Removing print statements can also fix the problem.
Apart from loading images, this error also happens when your code prints continuously at a high rate, which causes the "IOPub data rate exceeded" error - e.g. if you have a print statement in a for loop that is called over 1000 times.
By typing jupyter notebook --NotebookApp.iopub_data_rate_limit=1.0e10 in the Anaconda PowerShell or prompt, Jupyter notebook will open with the new configuration. Now try to run your query again.
Some additional advice for Windows(10) users:
If you are using Anaconda Prompt/PowerShell for the first time, type "Anaconda" in the search field of your Windows task bar and you will see the suggested software.
Make sure to open the Anaconda prompt as administrator.
Always navigate to your user directory or the directory with your Jupyter Notebook files first before running the command. Otherwise you might end up somewhere in your system files and be confused by an unfamiliar file tree.
The correct way to open Jupyter notebook with new data limit from the Anaconda Prompt on my own Windows 10 PC is:
(base) C:\Users\mobarget\Google Drive\Jupyter Notebook>jupyter notebook --NotebookApp.iopub_data_rate_limit=1.0e10
I have the same problem in my Jupyter NB on Win 10 when querying from a MySQL database.
Removing any print statements solved my problem.
For already running Docker containers, try editing the file ~/.jupyter/jupyter_notebook_config.py:
uncomment the line c.NotebookApp.iopub_data_rate_limit = ...
and set a high number like 1e10.
Restart the container; that should fix the problem.
I ran into this problem running version 6.3.0. When I tried the top-rated solution by Merlin, the PowerShell prompt notified me that iopub_data_rate_limit has moved from NotebookApp to ServerApp. The solution still worked, but I wanted to mention the variation, especially as internal handling of the old config name may become deprecated.
An easy workaround is to loop and print one element at a time; then there won't be any issue. Printing wcc directly would cause the error if the graph is huge. Hence either of the snippets below will work as a workaround.
import networkx as nx

wcc = list(nx.weakly_connected_components(train_graph))
# print only the first few components
for i in range(1, 10):
    print(wcc[i])
# or print one component at a time instead of the whole list at once
for component in wcc:
    print(component)
Like others pointed out, printing at a high rate can cause this. Resolve it by printing only every k-th iteration, using an if statement. Example in Python:
k = 10
for i in range(100000):
    if i % k == 0:
        print("Something")
Increase k if the warning persists.
Using Visual Studio Code, the Jupyter extension is able to handle big outputs; you can launch it from Anaconda Navigator.
In general, trying to print something that is too long will trigger this error. I tried to print a string that was 9221593 characters long (too long), and that triggered the error.

Repair corrupted Jupyter notebook / load previous version?

I had a hardware crash while running a Jupyter notebook. After repairing the system and trying to restart the notebook, I got the following error message:
Error loading notebook
Unreadable Notebook: D:\Eddy\Documents\1604 Udacity\1612 Self-driving car Nanodegree\P4\P4 Eduard van Kleef.ipynb NotJSONError("Notebook does not appear to be JSON: ''...",)
Does anyone know of a way to revert to any of Jupyter's previous 'checkpoints'? Or of a way to at least partially restore a JSON?
If you are lucky, the .ipynb file is corrupted but still there. In that case you can try opening it in a text editor and copying the contents to a new notebook. But check the size of the file first: if it is zero bytes, then there is nothing there!
This actually happened to me when my server ran out of memory and somehow the notebook got completely erased. Totally sucks.
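To see whether (and where) the JSON is actually broken before copying things by hand, a small sketch; the filename is a placeholder:

import json

# Placeholder filename - point this at the unreadable notebook
with open("broken_notebook.ipynb", encoding="utf-8") as f:
    try:
        json.load(f)
        print("The file is valid JSON after all")
    except json.JSONDecodeError as err:
        # Reports the line and column where parsing fails
        print(err)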
Try this
jupyter nbconvert filename.ipynb --clear-output
It worked for me, since my notebook had become corrupted because of Plotly's behavior with some big data.
In the directory that contains your .ipynb file there is a folder called '.ipynb_checkpoints'. This folder does not show up in the Jupyter application, so find it through Windows Explorer.
Inside there will be a file called yourfilenamehere-checkpoint.ipynb.
Copy and paste it into your file directory and open it through the Jupyter application; it should probably work.
If your corrupted file is 0 B, you definitely have to rely on the checkpoints.
Do not create a new notebook with the same name: it will overwrite the checkpoint.
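If you prefer to do the copy from code rather than through Windows Explorer, a minimal sketch (both filenames are placeholders):

import shutil

# Placeholder filenames - adjust to your notebook's name
shutil.copy(".ipynb_checkpoints/my_notebook-checkpoint.ipynb",
            "my_notebook_restored.ipynb")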
