I'm new to ML and I am now testing some notebooks in Google Colab (using GPU).
My first notebook has been running for a few hours with no complaints about RAM or disk space. However, when running a second notebook, I soon get warnings that I am already using around 57 GB of my 68 GB of disk space. These warnings only appear in the second notebook, and the disk-space icon has turned yellow only there, not in the first one.
Could someone clarify what happens with this (virtual) disk space? Where are all the heavy files stored, and will the space reset automatically so that it is free again after these notebooks finish running?
I have tried to find answers on the Colab website and in forums, but so far with no success. Thanks a lot!
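Not part of the original question, but here is a minimal sketch of how you could inspect the runtime's disk yourself from a Colab cell. The /content path is an assumption about where downloaded datasets and checkpoints typically end up; adjust it to wherever your notebooks write files.

```python
# A sketch for checking the Colab VM's ephemeral disk from a notebook cell.
# "/content" is an assumed location for downloads/checkpoints, not from the original post.
import os
import shutil

# Overall usage of the VM's disk
total, used, free = shutil.disk_usage("/")
print(f"used {used / 1e9:.1f} GB of {total / 1e9:.1f} GB")

def largest_files(root, top=10):
    """Walk a directory tree and return its largest files by size."""
    sizes = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                sizes.append((os.path.getsize(path), path))
            except OSError:
                pass  # skip files that vanish or are unreadable
    return sorted(sizes, reverse=True)[:top]

for size, path in largest_files("/content"):
    print(f"{size / 1e6:8.1f} MB  {path}")
```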
I’m having a lot of trouble working with Rstudio on a new PC. I could not find a solution searching the web.
When Rstudio is running, it constantly eats up memory until it becomes unworkable. If I work on an existing project, it takes half an hour to an hour to become impossible to work with. If I start a new project without loading any objects or packages, just writing scripts without even running them, it takes longer to reach that point, but it still gets there.
When I first start the program, Task Manager already shows memory usage of 950-1000 MB (sometimes more), and as I work it climbs to around 6000 MB, at which point it is impossible to work with because every action is delayed and 'stuck'. For comparison, on my old PC the Task Manager shows 100-150 MB while working in the program. When I open the "Memory Usage Report" within Rstudio, the "used by session" value is very small and the "used by system" value is almost at the maximum, yet Rstudio is the only thing taking up the system memory on the PC.
Things I have tried: installing older versions of both R and Rstudio, pausing my anti-virus program, changing compatibility mode, and setting zoom to 100%. It feels like Rstudio is continuously running something in the background, since the memory usage keeps growing (and quite quickly). But maybe it is something else entirely.
I am currently using the latest versions of R and Rstudio (4.1.2 and 2021.09.0-351) on a PC with an Intel i7 processor, 64-bit, 16 GB RAM, and Windows 10.
What should I look for at this point?
On Windows, there are several typical memory or CPU issues with Rstudio. In my answer, I explain how the Rstudio interface itself uses memory and CPU as soon as you open a project (e.g., when Rstudio shows you some .Rmd files). The memory/CPU cost of the computation itself is not covered here (i.e., performance issues while executing a line of code are out of scope).
When working on 'long' .Rmd files within Rstudio on Windows, the CPU and/or memory usage sometimes gets very high and increases progressively (e.g., because of a process named 'Qtwebengineprocess'). To solve the problem caused by long Rmd files loaded within an Rstudio session, you should:
Pay attention to which Rstudio process consumes memory while it scans your code (i.e., disable or enable options in Rstudio's 'Global Options' menu). For example, try disabling inline display (Tools => Global Options => R Markdown => Show equation and image preview => Never). This post put me on the track of realizing that memory/CPU leaks are sometimes due to Rstudio itself, not the data or the code.
Set up a bookdown project in order to split your large Rmd file into several smaller Rmd files. See here.
As a bonus step, check whether any of the loaded packages conflict by running tidyverse_conflicts(), but that is already a 'computing problem' (not covered here).
I have written a fairly extensive function, but I still need to debug it. As I've been running the function repeatedly, the available space on my system drive has been slowly decreasing. Over just the last day or two it has taken up 10-20 GB of hard-drive space (with nothing, to my knowledge, being downloaded in the background).
Since finding this problem I've moved my script into an R project on another drive, but the system drive continues to fill up. The code generates a few tables and graphs, with all the results stored in a list variable from which I then display the graphs. I clear the environment and plot windows every time I run the function.
I've checked all the R installation folders but they all look roughly the right size/not too big. Is there anywhere else on a default installation that R could be storing files that is causing this issue?
A similar question was asked here 3 years ago with no solution (I haven't tried reinstalling R or Rstudio yet, though).
Windows x64 and R x64 installation.
I have been getting into the Python language, especially through Jupyter Notebook. I think Jupyter is great for prototyping code in a very convenient way. I've been working on code following this tutorial over the past 2 days:
https://medium.com/@omar.ps16/stereo-3d-reconstruction-with-opencv-using-an-iphone-camera-part-iii-95460d3eddf0, and it's been working fine.
However, when I woke up this morning, it seems that a memory issue is causing Jupyter to crash. When I start Jupyter there is no memory issue; it only happens when I open this particular notebook file. Then the memory usage gradually increases (as seen in Task Manager). The page also becomes unresponsive, so I cannot reach 'Restart kernel' or any of the other options in the Kernel menu. After about 30 seconds, the entire Jupyter session crashes due to the memory overflow.
I would greatly appreciate any help with this problem.
Okay, I figured out that I was printing out a huge matrix, which clogged up the system. I had to open the notebook with Notepad++ and delete that output data by hand, and now everything is running fine. Stupid mistake.
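For anyone hitting the same thing, here is a hedged sketch of doing that cleanup programmatically with nbformat instead of hand-editing the JSON in Notepad++. The filename notebook.ipynb is a placeholder.

```python
# A sketch of clearing stored cell outputs with nbformat instead of editing the
# raw notebook JSON by hand; "notebook.ipynb" is a placeholder filename.
import nbformat

path = "notebook.ipynb"
nb = nbformat.read(path, as_version=4)

for cell in nb.cells:
    if cell.cell_type == "code":
        cell.outputs = []           # drop the huge printed matrix (and all other outputs)
        cell.execution_count = None

nbformat.write(nb, path)            # the notebook should now open normally
```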
I have a Jupyter Notebook that I wish to upload to GitHub. However, even after clearing all output, it is still larger than 100 MB. Stranger still, I created a duplicate and deleted everything in it, yet the duplicate turned out to be the same size! I tried nbstripout, which didn't help either.
Why is this happening? Is there a way to "purge" a Jupyter Notebook so that it's significantly smaller? Thank you so much!
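Not an answer from the thread, but one way to see where the size is actually going is to measure each cell of the notebook JSON: things like image attachments in markdown cells or widget/metadata state may not be removed by clearing outputs. A rough sketch, with notebook.ipynb as a placeholder filename:

```python
# A sketch for finding which cells account for a notebook's file size;
# "notebook.ipynb" is a placeholder. Sizes are of the serialized JSON per item.
import json

with open("notebook.ipynb", encoding="utf-8") as f:
    nb = json.load(f)

def size_of(obj):
    return len(json.dumps(obj))

print("notebook-level metadata:", size_of(nb.get("metadata", {})), "bytes")

# Report the ten largest cells and which of their fields hold the bytes
for cell in sorted(nb["cells"], key=size_of, reverse=True)[:10]:
    parts = {key: size_of(value) for key, value in cell.items() if key != "cell_type"}
    print(cell["cell_type"], parts)
```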
I have a Jupyter Notebook that I have been working on for a while to do some data cleanup for a project. I tried to open it again today and am getting a blank loading screen. I have restarted the server multiple times and restarted my computer, but have had no luck getting it to load. Each time I try to open the notebook, my computer goes into overdrive: it gets hot, the fans spin up, etc. I am working with a 2 GB file in pandas, so this sometimes happens when running a data-heavy command, and I take it as normal.
Here is the screen and the output in my terminal. I haven't had this problem before and am wondering what my options are for saving my work. Any help is much appreciated!
I had the same problem. I solved it by deleting cookies and cache.
If the problem persists, please check this link:
https://github.com/jupyter/notebook/issues/3857
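If the immediate goal is just to save the work, another option not mentioned in the thread is to pull the code out of the notebook file without opening it in the browser. A sketch using nbformat; both filenames are placeholders.

```python
# A sketch for salvaging code from a notebook that won't load in the browser,
# by extracting the source of each code cell into a plain .py file.
# "stuck_notebook.ipynb" and "recovered_code.py" are placeholder filenames.
import nbformat

nb = nbformat.read("stuck_notebook.ipynb", as_version=4)

with open("recovered_code.py", "w", encoding="utf-8") as out:
    for i, cell in enumerate(nb.cells):
        if cell.cell_type == "code":
            out.write(f"# --- cell {i} ---\n")
            out.write(cell.source + "\n\n")
```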