Use the workspace of an RStudio session in a Jupyter notebook - r

As my RAM is scarce, I'd like to avoid replicating data and instead use objects created in an RStudio session inside my Jupyter notebook (running with an R kernel).
Any idea how to do this?
Basically I'd like to use the same workspace in both the RStudio session and the Jupyter notebook session.
Thanks for your help!

One problem I encountered with an R notebook in Jupyter was saving my workspace. In a normal R session I'm used to saving my workspace at the end of the session and coming back to it later to pick up where I left off. However, with the Jupyter notebook I found that I had to rerun all the code to regenerate all the objects! This appears to be an issue for Python notebook users too.
There's a very simple fix for this: just run the standard R command
save.image()
Your workspace will then be saved to the usual hidden .RData file in the same folder as the Jupyter notebook. If you want to share the code and the workspace, you’ll have to make sure that you copy both the notebook file and the .RData file that goes along with it.
Likewise, if you start a notebook in a folder that already has an .RData file, you’ll find that you can access that workspace from the Jupyter notebook – just run ls() to see what’s there.
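Since the answer notes that Python notebook users hit the same issue: for a Python kernel, a comparable save-and-restore workflow is possible with the third-party dill package (an assumption on my part: dill is not in the standard library and has to be installed, e.g. with pip install dill; the filename below is just a placeholder):

import dill

# At the end of a session, pickle the entire workspace, analogous to R's save.image():
dill.dump_session("notebook_workspace.db")

# In a fresh kernel, restore everything, analogous to loading .RData:
dill.load_session("notebook_workspace.db")

As with .RData, you would have to copy the session file alongside the notebook if you want to share both the code and the workspace.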

Related

Where does Jupyter Notebook save files?

So I'm just beginning my programming/coding journey. I've downloaded Anaconda and made a shortcut for Jupyter Notebook on my desktop. I tried using my first file the other day, and I'm not sure where it's being saved. Also, I basically don't want Jupyter to save any notebook I work on once I close it, unless I specifically save it myself; I just use it for 'working out', if you like.
(Screenshot: the Jupyter dashboard showing an untitled.ipynb file.)
Where is that untitled.ipynb file being saved? And how can I adjust my settings in Jupyter Notebook so that these files aren't saved but are discarded automatically, letting me use them just for 'working out' as described?
By default the .ipynb files are stored in your user profile directory:
C:\Users\yourlogin
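You can also confirm the save location from inside a running notebook; this quick check uses only the standard library:

import os

# By default the kernel starts in the notebook's directory,
# so this prints where untitled.ipynb ends up:
print(os.getcwd())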
How to disable autosave has already been described here:
Turn Off Autosave in IPython Notebook
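For reference, the approach described in that answer is the notebook cell magic

%autosave 0

which disables autosave for the current notebook session only.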
I don't recommend doing that.

Unable to start kernel for jupyter notebooks in a specific directory

No problem in other directories. Is there an environment variable or something else I need to erase?
Deleted cache file...
OK, I think I need to be much clearer here.
First, the software:
MacOS Catalina 10.15.6
jupyter notebook 6.0.3
Python 3.8.3
IPython 7.16.1
jupyter notebook is installed and runs just fine in any user directory on the computer except exactly one.
There is nothing obvious in this directory that shouldn't be there. An 'ls -al' shows nothing but some .py files.
I can create a jupyter notebook in this directory, but the kernel crashes and won't restart. I can rename the directory and rename the notebook, but the behavior persists through everything I have been able to reset, including a cold restart of the computer. It is reproducible and happens every time.
This behavior is not seen in any other directory.
My question: are there environment variables or caches that are not visible in the directory (obviously) that are responsible for this incredibly annoying behavior, and how can I reset them?
Problem solved: Jupyter Notebook apparently uses some reserved names for .py files in the local directory when starting up. So far I've found that string.py and decorator.py cannot be in the startup directory unless they contain the expected contents (it looks like they need to be related to some template info).
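If you want to check whether a local .py file is shadowing a standard module without actually importing it (importing is what crashes the kernel), a small diagnostic like this, run from the problem directory, should show where each module would be loaded from:

import importlib.util

for name in ("string", "decorator"):
    spec = importlib.util.find_spec(name)
    # If origin points into the current directory rather than the standard
    # library or site-packages, a local file is shadowing the module.
    print(name, "->", spec.origin if spec else "not found")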
To start up a kernel:
First, activate your virtual environment, for instance: conda activate vision
Second, type jupyter notebook
as stated here
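If the environment's kernel still won't start, explicitly registering the environment as a kernel sometimes helps (this assumes the ipykernel package is installed in that environment):

python -m ipykernel install --user --name vision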

Values not retained within a jupyter notebook / colab notebook after reloading the notebook

If I keep a Jupyter or Colab notebook open for some time without any activity, the notebook somehow "forgets" all the values (e.g. CSV files loaded into DataFrames) and I need to re-run the whole notebook, which is time-consuming.
By the way, I diligently save the notebook with each change.
It also happens when I reload the notebook. It seems odd.
Any ideas how to prevent it?
Thank you

jupyter notebook takes forever to open and then pages unresponsive - [MathJax] issue

I'm trying to open a jupyter notebook and it takes a long time; I see at the bottom that it's trying to load various [MathJax] extensions, e.g. at the bottom left of the Chrome browser it says:
Loading [MathJax]/extensions/safe.js
Eventually, the notebook loads, but it's frozen and then at the bottom left it keeps showing that it's trying to load other [MathJax] .js files.
Meanwhile, the "pages unresponsive, do you want to kill them" pop-up keeps appearing.
I have no equations or plots in my notebook so I can't understand what is going on. My notebook never did this before.
I googled this and some people said to delete the ipython checkpoints. Where would those be? I'm on Mac OS and using Anaconda.
Install nbstripout and use it to strip the saved outputs from the notebook:
conda install -c conda-forge nbstripout
nbstripout filename.ipynb
Make sure that there is no whitespace in the filename.
I had a feeling that the program in my Jupyter notebook was stuck trying to produce some output, so I restarted the kernel and cleared output and that seemed to do the trick!
If Jupyter crashes while opening the ipynb file, try "using nbstripout to clear output directly from the .ipynb file via command line" (as bndwang suggested). Install it with pip install nbstripout.
I was having the same problem with a Jupyter notebook. My recommendations are as follows:
First, check the size of the .ipynb file you are trying to open. The file is probably large, in the MB range. One reason for this might be the output of a dataset whose rows you previously displayed in full.
For example: in order to check a dataset, I sometimes use pd.set_option('display.max_rows', None) instead of the .head() function, and so I view all the rows in the data set.
That large volume of output increases the file size and makes the notebook slower. Try to delete such outputs.
I think this will solve your problem.
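As a minimal sketch of the safer pattern (assuming pandas is installed; the DataFrame here is just a stand-in for your dataset):

import pandas as pd

df = pd.DataFrame({"x": range(100000)})

# Inspect a sample instead of writing every row into the notebook output:
print(df.head())

# If display.max_rows was set to None earlier, restore the default:
pd.reset_option("display.max_rows")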
Restarting your kernel will not help here. Instead, use nbstripout to strip the output from the command line:
nbstripout FILE.ipynb
Install nbstripout if it is not already there:
https://pypi.org/project/nbstripout/
It happened to me the time I decided to print a matrix 100000 times. The notebook file became 150 MB and Jupyter (in Chrome) was not able to open it: it said all the things you experienced, and then the page died saying it was "OutOfMemory".
I solved the issue by opening the notebook in Visual Studio Code, which has a "Clear All Output" button; then I saved the notebook again and it was back to some hundreds of KB, which I could open normally.
If you don't have Visual Studio Code installed, you can open the notebook with another editor (gedit if you use Linux, or Notepad++ on Windows) and try to delete the output cells. This is trickier, since you have to pay close attention to what you are deleting, otherwise the notebook will stop working.
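If you'd rather not hand-edit the JSON, a short script can blank the outputs programmatically. This is only a sketch of what nbstripout and jupyter nbconvert --clear-output do more robustly, and the path is a placeholder:

import json

path = "notebook.ipynb"  # placeholder: point this at the oversized notebook

with open(path, encoding="utf-8") as f:
    nb = json.load(f)

# A .ipynb file is JSON; outputs live in the "outputs" list of each code cell.
for cell in nb["cells"]:
    if cell.get("cell_type") == "code":
        cell["outputs"] = []
        cell["execution_count"] = None

with open(path, "w", encoding="utf-8") as f:
    json.dump(nb, f, indent=1)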

Repair corrupted Jupyter notebook / load previous version?

I had a hardware crash while running a Jupyter notebook. After repairing the system and trying to restart the notebook, I got the following error message:
Error loading notebook
Unreadable Notebook: D:\Eddy\Documents\1604 Udacity\1612 Self-driving car Nanodegree\P4\P4 Eduard van Kleef.ipynb NotJSONError("Notebook does not appear to be JSON: ''...",)
Does anyone know of a way to revert to any of Jupyter's previous 'checkpoints'? Or of a way to at least partially restore a JSON?
If you are lucky, the ipynb file is corrupted but still there. In that case you can try opening it in a text editor and copying the contents to a new notebook. But check the size of the file first: if it is zero bytes, then there is nothing there!
This actually happened to me when my server ran out of memory and somehow the notebook got completely erased. Totally sucks.
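Before copying anything around, it's worth checking whether the file is empty or merely invalid JSON. A quick diagnostic, using only the standard library (the path here is the notebook from the error message):

import json
import os

path = "P4 Eduard van Kleef.ipynb"

print("size:", os.path.getsize(path), "bytes")  # 0 bytes means nothing to recover

try:
    with open(path, encoding="utf-8") as f:
        nb = json.load(f)
    print("valid JSON with", len(nb.get("cells", [])), "cells")
except json.JSONDecodeError as err:
    print("not valid JSON:", err)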
Try this:
jupyter nbconvert filename.ipynb --clear-output
It worked for me when my notebook had become corrupted because of Plotly's behavior with some big data.
In the directory that contains your ipynb file there is a folder called '.ipynb_checkpoints'. This folder does not show up in the Jupyter application, so find it through Windows Explorer.
Inside there will be a file called urfilenamehere-checkpoint.ipynb.
Copy and paste it into your file directory and open it through the Jupyter application; it should work.
If your corrupted file is 0 B, you definitely have to rely on the checkpoints.
Do not create a new notebook with the same name; it will overwrite the checkpoint.
