Jupyter Notebook not loading notebook - jupyter-notebook

I have a Jupyter Notebook that I have been working on for a while now to do some data cleanup for a project. I tried to open it again today and am getting a blank loading screen. I have restarted the server multiple times and restarted my computer, but have had no luck getting it to load. Each time I try to open the notebook, my computer goes into overdrive: it gets hot, the fans spin up, etc. I am working with a 2 GB file in Pandas, so this sometimes happens when running a data-heavy command, and I normally take it as expected.
Here is the screen and the output in my terminal. I haven't had this problem before and am wondering what my options are for saving my work. Any help is much appreciated!

I had the same problem. I solved it by deleting cookies and cache.
If the problem persists, please check this link:
https://github.com/jupyter/notebook/issues/3857
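If the notebook file itself has grown very large because of stored cell outputs (which can make the browser hang while rendering it), one option for recovering your work is to strip the outputs from the .ipynb before reopening it. This is only a sketch, assuming a standard nbformat v4 notebook; the file names are illustrative:

# Sketch: strip stored cell outputs from a bloated notebook so it can load again.
# Assumes a standard nbformat v4 .ipynb file; the paths are only examples.
import nbformat

src = "data_cleanup.ipynb"           # the notebook that will not load
dst = "data_cleanup_stripped.ipynb"  # stripped copy; the original is left untouched

nb = nbformat.read(src, as_version=4)
for cell in nb.cells:
    if cell.cell_type == "code":
        cell.outputs = []            # drop the cached output
        cell.execution_count = None  # reset the execution counter
nbformat.write(nb, dst)
print("Wrote stripped copy to", dst)

Opening the stripped copy keeps all the code cells while discarding whatever output was making the file heavy.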

Related

Jupyter Lab, R kernel - Wanted: to prevent Jupyter from opening the R help server

Unfortunately, I made a change to the JupyterLab settings recently. Jupyter had never been able to access the remote R help server. Usually this was not an issue because most help files are displayed inside Jupyter. Help is called on a function or method by typing ?name, where "name" is the name of an R function or method, e.g. ?plot.
Infrequently one of the R packages will not list its help in Jupyter and instead tries to open the remote R help server. This has never worked in the past: the tab for the remote server never opened. That was never really an issue until recently. Recently I decided to "fix" this and spent a good deal of time looking up a solution. I made this "fix" and Jupyter began opening the remote R help server properly. But the fix created a problem that is far more obnoxious.
Each time I type the question mark in Jupyter, Jupyter causes the browser to open a new tab for the remote R help server. It is impossible to type fast enough after typing the ?. Once the ? is typed, the typing is interrupted while the browser opens a new R help server website. As soon as I return to typing, another browser tab opens and interrupts the typing again.
It is impossible to finish typing the name of a function after the ? without multiple interruptions while new windows are opened in the browser at the R help server website. By the time I've finished typing ?ppp, there will be 5 interruptions and 5 new browser tabs opened. If I type slowly, there may be 10 - 12 interruptions and 10 new tabs opened at the R help server.
How can I prevent Jupyter from trying to access the remote R help server every time the ? is typed when looking up the description of a function? How do I get back to the happy place I was in before I made whatever change caused this nightmare?
Solved my own question.
This issue had nothing to do with R. I assumed I had changed a configuration in R, but that was not the case. This was an issue I had caused when I changed a setting in JupyterLab.
I can say with certainty that this issue was caused when I changed the "Contextual Help" setting under the Commands tab in Jupyter. This setting is related to the Contextual Help selection found under the Help tab, and it was causing multiple R help server tabs to open in the browser after I typed ?.
Upon rediscovering this Jupyter setting today, I recalled that I had been down this path before. I selected "Contextual Help", Jupyter restarted automatically, and the issue went away. I can gladly say that I can again search for the description of an R function (e.g. ?plot) and receive the description within Jupyter the same as before. I no longer have the browser opening multiple new tabs at the remote R help server after typing ?.

Jupyter notebook is eating all my memory and then crashes

I have been getting into the Python language, especially through Jupyter Notebook. I think Jupyter is great for prototyping code in a very convenient way. I've been working on code following this tutorial over the past 2 days:
https://medium.com/#omar.ps16/stereo-3d-reconstruction-with-opencv-using-an-iphone-camera-part-iii-95460d3eddf0, and it's been working fine.
However, when I woke up this morning, it seems that a memory issue is causing Jupyter to crash. When I start Jupyter, there is no such memory issue; it only appears when I click on my particular notebook file. Then memory usage gradually increases (as seen in the task manager). Also, the page is unresponsive, so I cannot reach Restart Kernel or any of the other kernel options. After about 30 seconds, the entire Jupyter system crashes due to a memory overflow.
I would greatly appreciate any help with this problem.
Okay, I figured out that I was printing out a huge matrix, which clogged up the system. I had to open the notebook with Notepad++ and remove the data that way, and now everything is running fine. Stupid mistake.
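For anyone hitting the same thing: instead of printing an entire large matrix into a cell (its output gets saved into the .ipynb and reloaded every time the notebook opens), it can help to print only a small summary. A minimal sketch, assuming the data is a NumPy array whose name and size here are only placeholders:

# Sketch: inspect a large matrix without dumping all of it into the notebook output.
# The array name and shape are illustrative, not from the original question.
import numpy as np

disparity = np.random.rand(2000, 2000)  # stand-in for the real matrix

print("shape:", disparity.shape)
print("dtype:", disparity.dtype)
print("min/max:", disparity.min(), disparity.max())
print("top-left corner:\n", disparity[:5, :5])  # peek at a small slice, not the whole thing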

ERR_HTTP2_PROTOCOL_ERROR when opening Notebook in JUPYTERLAB Azure ML Studio

So our team created a new Azure Machine Learning resource, but whenever I try to add a new notebook and edit it using "JUPYTERLAB", I get an ERR_HTTP2_PROTOCOL_ERROR, while the same notebook, when edited using EDIT IN JUPYTER, works perfectly.
This is a blank, clean notebook. I also tried 2 different laptops and multiple browsers per laptop, with the same error. I also tried incognito mode and clearing cookies, but to no avail.
Update: I seem to have accidentally replicated the issue and now know what is causing it. I am using my work laptop and constantly switching VPN connections, and sometimes connecting to the Azure portal outside the VPN. If you have worked on a notebook while inside a VPN, then disconnected, and tried loading the notebook some time later, you will encounter this error.
Have you tried creating a new Azure Machine Learning Compute Instance? Sometimes the VM has a fatal issue and a new one needs to be spun up.
Also try modifying the working Jupyter Notebook url by adding /lab to the end.
This problem had stumped me for hours, but I was finally able to fix it. What I did was open a terminal and rebuild JupyterLab with "jupyter lab build".

vpython stops working for me in Jupyter notebooks every few minutes for no reason

I am having massive trouble using vpython in Jupyter notebooks. I'm creating small animations with vpython. After a couple of minutes, when I try to run a cell it will either not show any output or yield an error message saying "object could not be called".
The only fix I have found for this is to restart or change the kernel. Most times it then works for the next few minutes until it stops again. This is really annoying and prevents real progress.
picture of error message with example code
All the objects used have been imported in another cell before this one.
I am running vpython 7.3.2 and anaconda navigator 1.6.10 on a mac with High Sierra. As a browser I use Chrome.
Thank you in advance for every hint on how to fix this permanently!
Cheers,
Gordon
Try asking your question on the vpython forum and perhaps provide a sample notebook on GitHub with instructions on how to reproduce the problem.
https://groups.google.com/forum/?fromgroups&hl=en#!forum/vpython-users
Here is a link to demo vpython notebooks running in the cloud using the Binder service.
https://mybinder.org/v2/gh/BruceSherwood/vpython-jupyter/master?filepath=Demos
If you provide a notebook on GitHub that demonstrates the problem, then it should be reproducible when running on mybinder.
https://mybinder.org/

IPython notebook hangs when opening a large notebook

I am using the latest (2.2.0) IPython Notebook. When I create a notebook with a loop that writes many lines of output (about 20000 lines), it seems to run forever; I always see the running icon at the top right. Even if I restart the computer and reopen the notebook, it goes into running mode automatically, and then I am almost unable to do anything on the page. I have to copy the code into a new notebook to work around it.
How can I fix this hang when opening such a large notebook? I have tried the kernel "interrupt" and "restart" menu items and they seem to have no effect at all.
The IPython notebook is not intended for tasks with that much computation or that much output; such jobs belong in a standalone program rather than a notebook.
To fix such issues, create a standalone application (script) and run it from the console, then paste only the meaningful results into the IPython notebook.
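A minimal sketch of that approach, with an illustrative loop and file name: the heavy output is written to a log file by a standalone script, and only a one-line summary is printed for pasting back into the notebook.

# Sketch: run the heavy loop as a standalone script and write the bulk output to
# a file, so the notebook never has to store ~20,000 output lines.
# The loop body and file name are placeholders, not from the original question.
def main():
    with open("run_output.log", "w") as log:
        for i in range(20000):
            log.write(f"line {i}: some computed value\n")  # bulk output goes to disk
    # Only this short summary would be pasted into the notebook.
    print("Done: 20000 lines written to run_output.log")

if __name__ == "__main__":
    main()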
