downloading Jupyter notebooks in R w/ all their resources - jupyter-notebook

In Python, to download a Jupyter notebook with all its resources, I added a code line like this:
!zip -r nb_.zip
Is there an equivalent when the notebook is written in R? Thank you!
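In an R (IRkernel) notebook there is no ! shell escape, but you can shell out with system(). A minimal sketch, assuming the zip command-line tool is available on the host (the target list was cut off in the Python line above, so the current directory is used here for illustration):
system("zip -r nb_.zip .")
A pure-R alternative is utils::zip, which wraps the same external tool:
utils::zip("nb_.zip", files = list.files(recursive = TRUE))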

Related

Running Jupyter notebook (and generating plots) from the command line

I'm trying to use the terminal to run a Jupyter notebook (kernel: Julia v1.6.2), which contains plots generated using Plots.jl, before uploading the notebook to GitHub for viewing on nbviewer.com.
Following this question:
How to run an .ipynb Jupyter Notebook from terminal?
I have been using nbconvert as follows:
jupyter nbconvert --execute --to notebook --inplace
This runs the notebook (if you tweak the timeout limits); however, it does not display plots made with Plots.jl, even when I explicitly call display(plot()) at the end of a cell.
Does anyone have any idea how notebooks can be run remotely in such a manner that plots will be generated and displayed, particularly when using Julia?
I managed to generate Plots.jl plots by getting from IJulia the same configuration it uses to run notebooks (this is probably the surest way when you have many Pythons, etc.).
using Conda, IJulia
Conda.add("nbconvert") # I made sure nbconvert is installed
mycmd = IJulia.find_jupyter_subcommand("nbconvert")
append!(mycmd.exec, ["--ExecutePreprocessor.timeout=600", "--to", "notebook", "--execute", "note1.ipynb"])
Now mycmd has exactly the same environment as seen by IJulia, so we can do run(mycmd):
julia> run(mycmd)
[NbConvertApp] Converting notebook note1.ipynb to notebook
Starting kernel event loops.
[NbConvertApp] Writing 23722 bytes to note1.nbconvert.ipynb
The outcome got saved to note1.nbconvert.ipynb; I opened it with nteract to confirm that the graphs actually got generated.
Launch the notebook with using IJulia and notebook() in the REPL
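For reference, that looks like this (dir is an optional keyword argument of IJulia's notebook(); it defaults to the home directory):
using IJulia
notebook(dir = pwd())  # starts the Jupyter Notebook server in the current directory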

Start jupyter lab in a different folder (Windows)

For jupyter notebooks, I can do:
jupyter notebook --ExtractOutputPreprocessor.enabled=False --notebook-dir C:/Bla
so I tried something similar for jupyter lab:
jupyter lab --app-dir C:/Bla
but get:
JupyterLab Error
JupyterLab application assets not found in "C:/Bla"
Please run `jupyter lab build` or use a different app directory
I did a few Google searches without finding a clear answer. Could someone please enlighten me? Thanks.
Don't use --app-dir; it is for custom deployments of JupyterLab.
--notebook-dir should work fine, as should --ServerApp.root_dir:
jupyter lab --notebook-dir C:/Bla
or
jupyter lab --ServerApp.root_dir C:/Bla
Unless you have a very old version of JupyterLab installed (in that case, upgrade).
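To make the setting persistent instead of passing a flag each time, a sketch assuming JupyterLab 3+ (where the server settings live under ServerApp): run jupyter lab --generate-config once, then set in ~/.jupyter/jupyter_lab_config.py:
c.ServerApp.root_dir = "C:/Bla"  # same effect as --ServerApp.root_dir on the command line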

How do I access a .ipynb file? Do I need to download Jupyter Notebook in order to do this?

I'm doing the SQL course on Coursera.
Yes, you can install Jupyter Notebook as a Python library.
But you can also try opening and running your file in, for example, Google Colab.
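For the first option, a minimal sketch assuming Python and pip are already installed:
pip install notebook   # installs Jupyter Notebook into the current environment
jupyter notebook       # starts the server; open the .ipynb file from the browser UI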

Audit Commands run in Jupyter Notebook

Requirement:
Be able to audit (using logs) all the commands run in a Jupyter Notebook by a user. The Jupyter Notebook is installed on Dataproc.
Is there a way we can log the commands run by the user at the same time?
I have already tried changing Application.log_level to 0 in the Jupyter config file, but no luck.
Looks like there was some discussion about this feature request in the Jupyter community: https://groups.google.com/forum/#!topic/jupyter/sLKCCBwlKEc. You would have to modify the Jupyter kernel to print out all commands to a file.
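One way to approximate this without patching the kernel itself is an IPython startup hook that appends every executed cell to a file. A sketch, assuming the IPython kernel (version 7 or later, where pre_run_cell callbacks receive an ExecutionInfo object); the file name and log path are hypothetical:
# Save as ~/.ipython/profile_default/startup/00-audit.py on the notebook host
import datetime
from IPython import get_ipython

LOG_PATH = "/var/log/jupyter-audit.log"  # hypothetical log location

def log_cell(info):
    # info.raw_cell holds the source of the cell about to execute
    with open(LOG_PATH, "a") as f:
        f.write(datetime.datetime.now().isoformat() + " " + repr(info.raw_cell) + "\n")

get_ipython().events.register("pre_run_cell", log_cell)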

Run IPython notebook with Spark in another directory

I am trying to use IPython Notebook with Spark.
When I am in the spark installation directory, this command works well:
IPYTHON_OPTS="notebook --pylab inline" ./bin/pyspark
However, I can create notebooks only in the Spark installation directory and its subdirectories.
If I launch this command from another directory on my computer:
PYTHON_OPTS="notebook --pylab inline" /Users/poiuytrez/Documents/programs/spark/bin/pyspark
I get only a regular pyspark shell and not a notebook. Do you have any ideas of what could be wrong?
It looks like you might have left off the initial I in IPYTHON_OPTS in your second command?
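The corrected second command would then be:
IPYTHON_OPTS="notebook --pylab inline" /Users/poiuytrez/Documents/programs/spark/bin/pyspark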
