Jupyter Notebook over 100 MB even after clearing all output - jupyter-notebook

I have a Jupyter Notebook that I wish to upload to GitHub. However, even after clearing all output, it's still larger than 100 MB. Stranger still, I created a duplicate and deleted everything in it, yet the duplicate still turned out to have the same size! I tried nbstripout, which didn't help, either.
Why is this happening? Is there a way to "purge" a Jupyter Notebook so that it's significantly smaller? Thank you so much!
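No answer was posted here, but a common culprit when clearing output doesn't shrink a notebook is data that lives outside the outputs: images pasted into markdown cells end up as base64 attachments, and some tools stash large blobs in cell metadata, which output-clearing alone may not touch. A rough way to see where the bytes actually are, sketched with the nbformat library and a hypothetical filename:

import json
import nbformat  # pip install nbformat

nb = nbformat.read("big_notebook.ipynb", as_version=4)  # hypothetical filename

def size_of(fragment):
    # Approximate on-disk size of a notebook fragment, in bytes.
    return len(json.dumps(fragment))

for i, cell in enumerate(nb.cells):
    outputs = size_of(cell.get("outputs", []))
    attachments = size_of(cell.get("attachments", {}))
    if outputs > 100_000 or attachments > 100_000:
        print(f"cell {i}: outputs={outputs} B, attachments={attachments} B")

print("notebook-level metadata:", size_of(nb.metadata), "B")

Whatever this points to (attachments, metadata, or outputs that were never actually cleared) can then be emptied and the notebook rewritten with nbformat.write(nb, path).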

Related

Jupyter notebook is eating all my memory and then crashes

I have been getting into Python, especially through Jupyter Notebook. I think Jupyter is great for prototyping code in a very convenient way. Over the past two days I've been working on code following this tutorial:
https://medium.com/@omar.ps16/stereo-3d-reconstruction-with-opencv-using-an-iphone-camera-part-iii-95460d3eddf0, and it has been working fine.
However, when I woke up this morning, a memory issue seems to be causing Jupyter to crash. When I start Jupyter there is no such issue; it only appears when I open my particular notebook file. The memory then gradually increases (as seen in the task manager). The page also becomes unresponsive, so I cannot reach "Restart kernel" or any of the other Kernel menu options. After about 30 seconds, the entire Jupyter session crashes because it runs out of memory.
I would greatly appreciate any help with this problem.
Okay, I figured out that I was printing a huge matrix, which was choking the system. I had to open the notebook with Notepad++ and delete that output directly from the file, and now everything runs fine. Stupid mistake.
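For anyone hitting the same thing: hand-editing the JSON in Notepad++ works, but the stored outputs can also be cleared with a short script so the browser never has to render the huge matrix again. A minimal sketch using nbformat, with a hypothetical filename:

import nbformat  # pip install nbformat

path = "crashing_notebook.ipynb"  # hypothetical filename
nb = nbformat.read(path, as_version=4)

# Drop every stored output and reset execution counts.
for cell in nb.cells:
    if cell.cell_type == "code":
        cell.outputs = []
        cell.execution_count = None

nbformat.write(nb, path)

The command-line route, jupyter nbconvert --clear-output --inplace notebook.ipynb, does roughly the same thing in recent nbconvert versions.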

Google Colab disk space getting full

I'm new to ML and I am now testing some notebooks in Google Colab (using GPU).
My first notebook has been running for a few hours with no complaints about RAM or disk space. However, when running a second notebook, I soon get warnings that I am already using around 57 GB of my 68 GB of disk space. These warnings only appear in the second notebook, and the disk-space icon has turned yellow only there, not in the first one.
Could someone clarify what happens with this (virtual) disk space? Where are all the heavy files stored, and will it reset automatically so that space is freed again after these notebooks finish running?
I have tried to find answers on the Colab website and forums, but so far without success. Thanks a lot!
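No answer was posted, but two things usually help here: the disk belongs to the runtime's virtual machine and is wiped when that runtime is recycled, and you can inspect it from a cell to find what is eating the space. A rough sketch using only the Python standard library (/content is Colab's default working directory; adjust the path as needed):

import os
import shutil

# Overall usage of the VM's disk.
total, used, free = shutil.disk_usage("/")
print(f"used {used / 1e9:.1f} GB of {total / 1e9:.1f} GB, {free / 1e9:.1f} GB free")

# List the largest files under a directory, e.g. Colab's /content.
sizes = []
for root, _dirs, files in os.walk("/content"):
    for name in files:
        full_path = os.path.join(root, name)
        try:
            sizes.append((os.path.getsize(full_path), full_path))
        except OSError:
            pass  # file disappeared or is unreadable

for size, path in sorted(sizes, reverse=True)[:20]:
    print(f"{size / 1e6:8.1f} MB  {path}")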

Jupyter Notebook not loading notebook

I have a Jupyter Notebook that I have been working on for a while now to do some data cleanup for a project. I tried to open it again today and am getting a blank loading screen. I have restarted the server multiple times and have restarted my computer, but have had no luck getting it to load. Each time I try to open the notebook, my computer goes into overdrive: it gets hot, the fans start up, etc. I am working with a 2 GB file in Pandas, and this sometimes happens when running a data-heavy command, so I take it as normal.
Here is the screen and the output in my terminal. I haven't had this problem before and am wondering what my options are for saving my work. Any help is much appreciated!
I had the same problem. I solved it by deleting cookies and cache.
If the problem persists, please check this link:
https://github.com/jupyter/notebook/issues/3857
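On the "saving my work" question: even if the notebook never renders in the browser again, the .ipynb file is plain JSON, so the code cells can be pulled out of it directly. A small sketch with nbformat and hypothetical filenames:

import nbformat  # pip install nbformat

nb = nbformat.read("data_cleanup.ipynb", as_version=4)  # hypothetical filename

# Dump every code cell into a plain Python file so the work is recoverable
# without ever opening the notebook in the browser.
with open("data_cleanup_recovered.py", "w") as out:
    for cell in nb.cells:
        if cell.cell_type == "code":
            out.write(cell.source + "\n\n")

The command jupyter nbconvert --to script does the same conversion from the command line.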

RStudio is painfully slow

Suddenly, RStudio has become painfully slow, to the point of being unusable: I open it up and there is a lag of several seconds whenever I type anything. I have explored all the options I can come up with:
1. re-installing both R and RStudio (although I am not 100% sure I could remove all components),
2. trying to reset settings: the obvious things such as clearing the workspace and the console.
The size of my data is negligible. I cannot think of anything else. Any ideas?
The only observation I can make that suggests something could be wrong with the configuration is that (sometimes) I see "gctorture false" as a value in the environment.
Just a guess, but ?gctorture says
Provokes garbage collection on (nearly) every memory allocation.
Intended to ferret out memory protection bugs. Also makes R run
_very_ slowly, unfortunately.
which sounds about right for your problem! You could try
gctorture(FALSE)
If that speeds things up, then look for somewhere this might have been set, e.g., in a .Rprofile (the current working directory, your user home directory, or the installation directory of R; see ?.Rprofile). Also make sure that you start R without loading any .Rhistory or .RData files (again, from the working directory, your home directory, etc.).
I had an RStudio project with Git version control, and it became very slow. I solved the problem by removing the Git version control.

IPython Notebook hangs when opening a large notebook

I am using the latest (2.2.0) IPython Notebook. When I create a notebook with a loop that writes many lines of output (about 20,000), it seems to run forever: I always see the busy icon at the top right. Even if I restart the computer and reopen the notebook, it automatically goes back into a running state, and I am almost unable to do anything on the page. I have to copy the code into a new notebook to work around it.
How can I fix this hang when opening an overly large notebook? I have tried the "Interrupt" and "Restart" options in the Kernel menu, and they seem to have no effect at all.
IPython Notebook is not intended for tasks with heavy computation or very large amounts of output; such jobs belong in a standalone program rather than in a notebook.
To avoid these issues, run the heavy work as a standalone script from the console, then paste only the meaningful results into the IPython notebook.
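If the loop has to stay in the notebook, a workaround in the same spirit is to send the bulk of the output to a file instead of printing it, so those 20,000 lines are never stored in the .ipynb or re-rendered every time the notebook opens. A rough sketch (the loop body is a placeholder for the real computation):

# Write the heavy output to a file; keep only a short summary in the cell.
with open("results.txt", "w") as f:
    count = 0
    for i in range(20_000):          # placeholder for the real loop
        f.write(f"line {i}: ...\n")  # placeholder for the real output
        count += 1

print(f"wrote {count} lines to results.txt")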
