Jupyter notebook is eating all my memory and then crashes - jupyter-notebook

I have been getting into the Python language, especially through Jupyter Notebook. I think Jupyter is great for prototyping code in a very convenient way. I've been working on code following this tutorial over the past 2 days:
https://medium.com/#omar.ps16/stereo-3d-reconstruction-with-opencv-using-an-iphone-camera-part-iii-95460d3eddf0, and it's been working fine.
However, when I woke up this morning, it seems that a memory issue is causing Jupyter to crash. When I start Jupyter there is no memory issue; it only appears when I click on my particular notebook file. Then the memory usage gradually increases (as seen in the Task Manager). The page is also unresponsive, so I cannot reach Restart Kernel or any of the other options in the Kernel menu. After about 30 seconds, the entire Jupyter session crashes due to the memory overflow.
I would greatly appreciate any help with this problem.

Okay, I figured out that I was printing out a huge matrix, which was clogging up the system. I had to open the notebook with Notepad++ and delete the stored output data that way, and now everything is running fine. Stupid mistake.
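In case anyone hits the same thing: instead of hand-editing the .ipynb in Notepad++, the stored outputs can also be stripped with a small script. Here is a minimal sketch using the nbformat library (notebook.ipynb is a placeholder path, not the actual file from the question):

    import nbformat

    path = "notebook.ipynb"  # placeholder; point this at the notebook that won't open

    nb = nbformat.read(path, as_version=4)
    for cell in nb.cells:
        if cell.cell_type == "code":
            cell.outputs = []            # drop the stored (possibly huge) printed output
            cell.execution_count = None
    nbformat.write(nb, path)

After stripping the outputs, the notebook should open normally again, with all of the code cells intact.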

Related

Rstudio potential memory leak / background activity?

I'm having a lot of trouble working with RStudio on a new PC. I could not find a solution searching the web.
When RStudio is running, it constantly eats up memory until it becomes unworkable. If I work on an existing project, it takes half an hour to an hour to become impossible to work with. If I start a new project without loading any objects or packages, just writing scripts without even running them, it takes longer to reach that point, but it still does.
When I first start the program, the Task Manager already shows memory usage of 950-1000 MB (sometimes more), and as I work it climbs to 6000 MB, at which point it is impossible to work with, as every action is delayed and 'stuck'. Just to compare, on my old PC the Task Manager shows 100-150 MB while I work in the program. When I open the "Memory Usage Report" within RStudio, the "used by session" value is very small and "used by system" is almost at its maximum, yet RStudio is the only thing taking up system memory on the PC.
Things I have tried: installing older versions of both R and RStudio, pausing my anti-virus program, changing compatibility mode, and setting the zoom to 100%. It feels like RStudio is continuously running something in the background, as the memory usage keeps growing (and quite quickly). But maybe it is something else entirely.
I am currently using the latest versions of R and RStudio (4.1.2 and 2021.09.0-351) on a PC with an Intel i7 processor (x64), 16 GB of RAM, and Windows 10.
What should I look for at this point?
On Windows, there are several typical memory or CPU issues with RStudio. In this answer, I explain how the RStudio interface itself uses memory and CPU as soon as you open a project (e.g., when RStudio shows you some .Rmd files). The memory/CPU cost of the computation itself is not covered here (i.e., performance issues when executing a line of code are out of scope).
When working on long .Rmd files in RStudio on Windows, CPU and/or memory usage sometimes gets very high and increases progressively (e.g., because of a process named 'QtWebEngineProcess'). To solve the problem caused by long .Rmd files loaded in an RStudio session, you should:
pay attention to which RStudio processes consume memory while it scans your code (i.e., disable or enable features in RStudio's 'Global Options' menu). For example, try disabling inline display (Tools => Global Options => R Markdown => Show equation and image previews => Never). This post put me on the track of considering that memory/CPU leaks are sometimes due to RStudio itself, not the data or the code.
set up a bookdown project in order to split your large .Rmd files into several smaller ones. See here.
As a bonus step, check whether some loaded packages conflict using the command tidyverse_conflicts(), but that is already a 'computing problem' (not covered here).

Jupyter Notebook not loading notebook

I have a Jupyter Notebook that I have been working on for a while now to do some data cleanup for a project. I tried to open it again today and am getting a blank loading screen. I have restarted the server multiple times and have restarted my computer, but have had no luck in getting it to load. Each time I try to open the notebook, my computer goes into overdrive: it gets hot, the fans start up, etc. I am working with a 2 GB file in Pandas, so this sometimes happens when running a data-heavy command, and I take it as normal.
Here is the screen and the output in my terminal. I haven't had this problem before and am wondering what my options are for saving my work. Any help is much appreciated!
I had the same problem. I solved it by deleting cookies and cache.
If the problem persists, please check this link:
https://github.com/jupyter/notebook/issues/3857
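If the notebook still refuses to load and the main goal is to save your work, one option (a sketch, assuming the .ipynb file on disk is intact; the file names below are placeholders) is to pull the code cells out of the raw notebook file with the nbformat library:

    import nbformat

    nb = nbformat.read("data_cleanup.ipynb", as_version=4)  # placeholder file name

    # Copy every code cell into a plain .py file, so the work is safe
    # even if the browser never manages to render the notebook again.
    with open("data_cleanup_recovered.py", "w") as f:
        for cell in nb.cells:
            if cell.cell_type == "code":
                f.write(cell.source + "\n\n")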

vpython stops working for me in Jupyter notebooks every few minutes for no reason

I am having massive trouble using vpython in Jupyter notebooks. I'm creating small animations with vpython. After a couple of minutes, when I try to run a cell, it will either not show any output or will yield an error message "object could not be called".
The only fix I have found for this is to restart or change the kernel. Most of the time it then works for the next few minutes until it stops again. This is really annoying and prevents real progress.
picture of error message with example code
All objects used here were imported in another cell before this one.
I am running vpython 7.3.2 and anaconda navigator 1.6.10 on a mac with High Sierra. As a browser I use Chrome.
Thank you in advance for every hint on how to fix this permanently!
Cheers,
Gordon
Try asking your question on the vpython forum and perhaps provide a sample notebook on GitHub with instructions on how to reproduce the problem.
https://groups.google.com/forum/?fromgroups&hl=en#!forum/vpython-users
Here is a link to demo vpython notebooks running in the cloud using the Binder service.
https://mybinder.org/v2/gh/BruceSherwood/vpython-jupyter/master?filepath=Demos
If you provide a notebook on GitHub that demonstrates the problem, then it should be reproducible when running on mybinder.
https://mybinder.org/

make a jupyter notebook run even if the page is closed

I love notebooks. I love them so much that I have many of them running at the same time, often in different browsers, sometimes on different remote clients. I miss one feature: when I close the tab corresponding to a running notebook, it warns that the corresponding run will be stopped.
My question:
How do I make a Jupyter notebook continue its run even if the page is closed?
such that I can:
re-open the tab in another browser (possibly on a remote computer such as a tablet),
restart a browser when it needs to,
close those with long running times for later inspection.
From what I understand, the client-server architecture could make that possible, but that there may be issues with multiple concurrent runs...
PS: I created an issue on GitHub
In fact, this was answered in the github issue:
takluyver commented on 26 Apr 2017: Anything already running in the notebook will keep running, and the kernel it started for that will stay running - so it won't lose your variables. However, any output produced while the notebook isn't open in a browser tab is lost; there isn't an easy way to change this until we have the notebook server able to track the document state, which has been on the plan for ages.
Thanks!
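As a practical workaround (my own sketch, not part of the linked issue): have long-running cells write their progress to a file instead of printing to the cell, so the output survives even when no browser tab is open. The file name run.log is just a placeholder:

    import time
    from contextlib import redirect_stdout

    # Toy long-running job; the real work would go inside the loop.
    with open("run.log", "a", buffering=1) as log, redirect_stdout(log):
        for step in range(10000):
            print(f"finished step {step}")  # goes to run.log, not to the browser
            time.sleep(1)

Re-opening the notebook later will not show these prints in the cell, but run.log will contain all of them, and the kernel keeps your variables as described above.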

ipython notebook hangs when opening a large notebook

I am using the latest (2.2.0) IPython notebook. When I create a notebook with a loop that writes many lines of output (about 20000 lines), it seems to run forever; I always see the running icon at the top right. Even if I restart the computer and reopen the notebook, it goes into running mode automatically, and then I am almost unable to do anything on the page. I have to copy the code into a new notebook to work around it.
How can I fix this hang when opening too large a notebook? I have tried the kernel "Interrupt" and "Restart" menu items and they seem to have no effect at all.
IPython notebook is not intended for tasks with heavy computation or very large amounts of output; such things are meant for a standalone program rather than a notebook.
To fix such issues, create a standalone script, run it from the console, and then paste the meaningful result into the IPython notebook.
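For example, here is a minimal sketch of that split (the file names and the loop body are placeholders, not taken from the question):

    # standalone_script.py -- run this from the console, not inside a notebook
    with open("results.txt", "w") as f:
        for i in range(20000):
            f.write(f"line {i}\n")  # the 20000 lines go to a file, not to a cell

    # Then, in the notebook, load only a small, meaningful slice of the result:
    with open("results.txt") as f:
        head = [next(f) for _ in range(10)]
    print("".join(head))

This keeps the notebook file small, so opening it no longer triggers the hang.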
