Repair corrupted Jupyter notebook / load previous version? - jupyter-notebook

I had a hardware crash while running a Jupyter notebook. After repairing the system and trying to restart the notebook, I got the following error message:
Error loading notebook
Unreadable Notebook: D:\Eddy\Documents\1604 Udacity\1612 Self-driving car Nanodegree\P4\P4 Eduard van Kleef.ipynb NotJSONError("Notebook does not appear to be JSON: ''...",)
Does anyone know of a way to revert to any of Jupyter's previous 'checkpoints'? Or of a way to at least partially recover the notebook's JSON?

If you are lucky, the ipynb file is corrupted but still there. In that case you can try opening it in a text editor and copying the contents into a new notebook. But check the size of the file first: if it is zero bytes, there is nothing there!
This actually happened to me when my server ran out of memory and somehow the notebook got completely erased. Totally sucks.
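If the file is non-empty but won't load, a quick Python probe can tell you whether the JSON is intact or where it breaks off. A minimal sketch; the filename is a placeholder for your own:

import json
import os

path = "mynotebook.ipynb"  # placeholder: your notebook file
size = os.path.getsize(path)
print(f"File size: {size} bytes")  # 0 bytes means there is nothing to recover

if size > 0:
    raw = open(path, encoding="utf-8").read()
    try:
        nb = json.loads(raw)
        print(f"Valid JSON with {len(nb.get('cells', []))} cells")
    except json.JSONDecodeError as err:
        # Truncated JSON: this shows how far the readable part extends
        print(f"JSON breaks at character {err.pos} of {len(raw)}")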

Try this:
jupyter nbconvert filename.ipynb --clear-output
It worked for me; mine had been corrupted by Plotly output on some big data.

In the directory that contains your ipynb file there is a hidden folder called '.ipynb_checkpoints'. This folder does not show up in the Jupyter application, so find it through Windows Explorer (or your file manager).
Inside there will be a file called yourfilenamehere-checkpoint.ipynb.
Copy it into your file directory and open it through the Jupyter application; it should work.
If your corrupted file is 0 bytes, you definitely have to rely on the checkpoints.
Do not create a new notebook with the same name: it will overwrite the checkpoint.
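If you prefer to script the restore, here is a minimal Python sketch; the notebook name is a placeholder, and it assumes you run it from the folder containing the notebook:

import shutil
from pathlib import Path

name = "yourfilenamehere"  # placeholder: your notebook's name without .ipynb
checkpoint = Path(".ipynb_checkpoints") / f"{name}-checkpoint.ipynb"

if checkpoint.exists() and checkpoint.stat().st_size > 0:
    # copy rather than move, so the checkpoint itself stays intact
    shutil.copy2(checkpoint, f"{name}.ipynb")
    print(f"Restored {name}.ipynb from {checkpoint}")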

Related

From ipynb to py with JupyterLab export

I open JupyterLab and go to File > Export Notebook As > Export Notebook As Executable Script.
I then get a warning from my system saying:
This type of file can harm your computer. Do you want to keep filename.py anyway?
Is there risk involved? Why do I get the warning? Would you recommend that way to convert my .ipynb to .py?
You are warned because your browser noticed that you are trying to download a .py script. A script downloaded from the Internet really can be harmful, but the browser doesn't know that this one is actually your own code. So download it; you'll be fine.
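If you want to skip the browser download entirely, you can also convert on the command line with jupyter nbconvert --to script filename.ipynb, or extract the code cells yourself. Here is a minimal Python sketch using the nbformat package that ships with Jupyter; the filename is a placeholder:

import nbformat

nb = nbformat.read("filename.ipynb", as_version=4)
with open("filename.py", "w", encoding="utf-8") as out:
    for cell in nb.cells:
        if cell.cell_type == "code":  # markdown cells are skipped
            out.write(cell.source + "\n\n")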

Error loading jupyter notebook: permission denied: ipynb

I was working in a Jupyter notebook until it froze. It wouldn't save or shut down, so I restarted my computer. I launched Jupyter notebook from an Anaconda prompt, and my folder directory opens as usual. When I try to open the notebook from before, I get an error loading screen that says permission denied: (name of notebook).ipynb. I hit close and the notebook shuts down.
I checked the folder permissions; I have full control. I can create a new ipynb without any issues. I can open other notebooks in the same folder without any problem. I tried to trust the notebook (jupyter trust) through the Anaconda prompt and it says the notebook is missing.
I need to recover this particular notebook as it has all my work. Help! Any ideas? Thanks in advance.
I work in the anaconda prompt in an environment other than the root, so this answer using sudo chmod doesn't work for me.
I had possibly the same problem. In my case, Jupyter notebook must have crashed or hit a problem while autosaving.
As a result, in the folder where the notebook is saved there is a temporary file called ".~nameofnotebook.ipynb".
This file didn't show up in Jupyter notebook, only in the file explorer. I deleted the notebook file and renamed the temporary file to remove the ".~" prefix. Make sure to save a copy of the notebook file before deleting anything, in case your problem is different.
The renamed temporary file opens fine and none of my data was lost.
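For reference, the same recovery can be scripted. A minimal Python sketch of the steps above; the notebook name is a placeholder, and the backup line is there so nothing is lost if your problem turns out to be different:

import os
import shutil

nb = "nameofnotebook.ipynb"  # placeholder: the notebook that won't open
tmp = ".~" + nb              # the temporary file left behind by the crash

shutil.copy2(nb, nb + ".bak")  # back up the original before deleting anything
os.remove(nb)                  # delete the broken notebook
os.replace(tmp, nb)            # rename the temp file to take its place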
Change the name of the file and you are good to go.

No code displayed when I open the compiled R notebook

I am using RStudio v1.1.423. I create and save R Notebook files, but once I open the .Rmd files in RStudio I only get a completely blank page with no code displayed.
However, the .html files work fine when opened in the browser.
Any suggestions would be really appreciated.
This seems to be a bug in RStudio; it happens every once in a while. It usually helps to delete the R Notebook output (the .nb.html file) and to recompile it from scratch.
You might also have to clear the cache, which can be found (on Unix-like systems) in ~/.rstudio-desktop/notebooks:
Close RStudio.
Find the subfolder corresponding to your notebook and delete it. The subfolder contains a quasi-random hexadecimal prefix, followed by your notebook name; for instance: A88D397F-notebook.
Reopen RStudio and recompile your notebook from scratch.
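If it helps, here is a minimal Python sketch of steps 1-2 for the Unix-like cache location above (close RStudio first; "notebook" stands in for your notebook's name, as in the A88D397F-notebook example):

import shutil
from pathlib import Path

cache = Path.home() / ".rstudio-desktop" / "notebooks"
for folder in cache.glob("*-notebook"):  # e.g. A88D397F-notebook
    print(f"Deleting {folder}")
    shutil.rmtree(folder)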

jupyter notebook takes forever to open and then pages unresponsive - [MathJax] issue

I'm trying to open a Jupyter notebook and it takes a long time. I see at the bottom that it's trying to load various [MathJax] extensions, e.g. at the bottom left of the Chrome browser it says:
Loading [MathJax]/extensions/safe.js
Eventually, the notebook loads, but it's frozen and then at the bottom left it keeps showing that it's trying to load other [MathJax] .js files.
Meanwhile, the "pages unresponsive, do you want to kill them" pop-up keeps appearing.
I have no equations or plots in my notebook so I can't understand what is going on. My notebook never did this before.
I googled this and some people said to delete the IPython checkpoints. Where would those be? I'm on macOS and using Anaconda.
conda install -c conda-forge nbstripout
nbstripout filename.ipynb
Make sure that there is no whitespace in the filename.
I had a feeling that the program in my Jupyter notebook was stuck trying to produce some output, so I restarted the kernel and cleared output and that seemed to do the trick!
If Jupyter crashes while opening the ipynb file, try "using nbstripout to clear output directly from the .ipynb file via command line" (bndwang). Install it with pip install nbstripout.
I was having the same problem with jupyter notebook. My recommendations to you are as follows:
First, check the size of the .ipynb file you are trying to open; it is probably several megabytes. One common cause is the stored output of a dataset whose rows you previously displayed in full.
For example, to inspect a dataset I sometimes use pd.set_option('display.max_rows', None) instead of .head(), which prints every row of the data set into the output.
That much output inflates the file size and makes the notebook slow. Try to delete such outputs.
I think this will solve your problem.
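To keep outputs small in the first place, cap or reset pandas' row display rather than leaving it at None. A minimal sketch:

import pandas as pd

pd.set_option("display.max_rows", None)  # this is what balloons the output
pd.reset_option("display.max_rows")      # back to the default cap (60 rows)
pd.set_option("display.max_rows", 20)    # or pick an explicit small limit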
Here restarting your kernel will not help. Instead, use nbstripout to strip the output from the command line.
Run this command: nbstripout FILE.ipynb
Install nbstripout first if it is not there:
https://pypi.org/project/nbstripout/
It happened to me when I decided to print a matrix 100,000 times. The notebook file grew to 150 MB and Jupyter (in Chrome) was not able to open it: it showed all the things you experienced and then the page died saying it was "OutOfMemory".
I solved the issue by opening the file in Visual Studio Code, which has a "Clear All Output" button; then I saved the notebook again and it was back to a few hundred KB, which I could open normally.
If you don't have Visual Studio Code installed, you can open the notebook with another editor (gedit on Linux or Notepad++ on Windows) and try to delete the output cells by hand. This is trickier, since you have to pay close attention to what you are deleting, otherwise the notebook will stop working.
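A safer route than hand-editing the JSON is to clear the outputs programmatically. A minimal Python sketch using the nbformat package that ships with Jupyter; the filename is a placeholder:

import nbformat

nb = nbformat.read("big_notebook.ipynb", as_version=4)
for cell in nb.cells:
    if cell.cell_type == "code":
        cell.outputs = []            # drop the stored output
        cell.execution_count = None  # reset the execution counter
nbformat.write(nb, "big_notebook.ipynb")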

Issue: Working Directory in R Studio "stuck" on directory containing open R Markdown file

I am using the most recent version of R (3.3.2), running in the most recent version of RStudio (1.0.136) on macOS Sierra (10.12.3). I am running into an issue in which my working directory corresponds to, and is stuck on, the directory that contains the .Rmd file I currently have open in RStudio. Upon opening the file, the working directory is correctly set to the directory holding the .Rproj file. When I go to load a file with a path relative to that directory, however, I get an error that there is no such file in the current working directory, and the error returns the location of the .Rmd file as that working directory.
However, getwd() still reports the directory the working directory is supposed to be, and no matter where I try to set it, I still get the same error message when I try to read in a file. Notably, I do NOT get an error message that the working directory cannot be changed: R tells me that the working directory has been changed, and that directory is allegedly the current working directory... but it's not.
I have tried fully (as far as I am aware) uninstalling R and RStudio and reinstalling them, to no avail. Does anyone have a solution? This is frustrating the heck out of me right now, since in the interim I have to revise all the relative paths defined in my notebooks.
Extra information in case it's relevant: I restored from a Time Machine backup that I suspect may have been corrupted somehow; some contents of my Applications folder were missing that I had to move over manually. Could this be causing the issue? Are there other system files that R depends on when interacting with the filesystem that I might look to? I'm trying to avoid doing a clean OS install or a piecemeal rebuilding of my files, since I don't know if that's actually the issue.
Thanks in advance!
This is a known feature/bug of RStudio notebooks (see the "Working Directory" section of the R Notebooks documentation): notebook chunks are executed in the same directory as the .Rmd file. As @Simon Jackson noted, you can change this using knitr::opts_knit$set(root.dir = normalizePath("path/to/project")) in a setup chunk; note that normalizePath() requires a path argument, so point it at your project root.
