How to reinitialize a notebook from the shell? - jupyter-notebook

Following previous discussions from 2018, I managed to run the latest version of Julia inside Google Colab (see here). However, there is a small quirk in the setup: after installing Julia, a new notebook has to be initialized before Colab will recognize Julia code. I've been doing this by refreshing the web browser, which appears to reinitialize the notebook without killing the runtime environment.
I would like to remove this quirky step and have the notebook reinitialized directly after Julia is installed, without needing to refresh the browser. Is there a simple shell command that will do this? Any ideas are appreciated.
(The Colab notebook I provided in the link above has step-by-step instructions for installing and running Julia, including instructions on when to refresh the browser. It's short.)
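For reference, a commonly suggested way to restart a Colab runtime programmatically is to kill the kernel process from a code cell; Colab then restarts it automatically. Whether the reconnected session picks up the freshly installed Julia kernel the way a browser refresh does is untested here, so treat this as a sketch rather than a confirmed fix.

import os

# Hypothetical sketch, not from the linked notebook: terminate the current
# kernel process so Colab restarts it, without refreshing the browser.
os.kill(os.getpid(), 9)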

Related

How to make sure my Jupyter notebook is runnable on any other computer or in JupyterLab?

I have been given an analytics task to solve in Python and return the result to the technical staff. I was asked to prepare the result in a Jupyter notebook, with the resulting code fully runnable and documented.
Honestly, I have only just started using Jupyter notebooks, and I have generally found them useful and convenient for generating reports that integrate code and figures. But I ran into some difficulty when I wanted to use specific packages like graphviz and dtreeviz, which required more than a simple pip install xxx.
So how can I make sure my code is runnable when I do not know which packages are available in the Jupyter environment of the next person who wants to run it, or when they want to run it in JupyterLab? Especially regarding these particular packages!
One solution to your problem would be to use Docker to develop and deploy your project.
You can define all your dependencies, create your project, and build a Docker image from them. With this image, you can be sure that anyone who uses it will have the same infrastructure as you; a minimal sketch is shown below.
It shouldn't take you long to learn Docker, and it will help you in the future.
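For illustration, a minimal Dockerfile along those lines might look like the following. The base image, the analysis.ipynb filename, and the choice to install graphviz at the system level are assumptions chosen to match the packages mentioned in the question, not something prescribed by this answer.

# Sketch of a reproducible notebook environment; image and file names are illustrative.
FROM jupyter/scipy-notebook:latest

# graphviz needs the system library in addition to the Python binding.
USER root
RUN apt-get update && apt-get install -y --no-install-recommends graphviz \
    && rm -rf /var/lib/apt/lists/*
USER ${NB_UID}

# Python-level dependencies; pin versions as needed.
RUN pip install --no-cache-dir graphviz dtreeviz

# Ship the notebook inside the image so recipients can run it as-is.
COPY analysis.ipynb /home/jovyan/work/

Whoever receives the image can then run something like docker build -t analysis . followed by docker run -p 8888:8888 analysis and get the same environment regardless of what is installed on their machine.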

How to activate a debugger or access logs in Jupyter notebooks?

I am trying to run an R notebook on Microsoft's Azure Notebooks cloud service.
When I try to run all cells, it displays Loading required package: ggplot2 in the last cell, and then the kernel systematically crashes. I get:
The kernel appears to have died. It will restart automatically.
But the kernel does not restart automatically.
How can I get a log describing the encountered issue? Is there a way to activate a debugger?
When you run Jupyter locally, you usually see messages about kernel issues in the standard output of the console you launched it from. In Azure Notebooks this output is redirected to a file at ~/.nb.log. You can open a new terminal by clicking the Jupyter icon, then New->Terminal, and run cat ~/.nb.log. You could also start a new Python notebook for this purpose and run "!cat ~/.nb.log"; but unfortunately you can't do that from an R notebook, since R notebooks don't support the "!" magic commands.
Usually that gives you a good starting point. If it doesn't help much, you could try invoking R directly from the terminal, running the repro steps there, and seeing whether that is more informative.
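If you prefer to stay inside a notebook, a small Python cell along these lines prints the same log; the ~/.nb.log path is taken from the answer above, the rest is plain Python.

from pathlib import Path

# Print the Azure Notebooks kernel log (path per the answer above).
log_path = Path.home() / ".nb.log"
print(log_path.read_text() if log_path.exists() else f"{log_path} not found")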

Fixing pandoc "out of memory" error when running the profvis R package

I'm trying to use the profvis package to do memory profiling of a large job in R (64-bit), run under RStudio on Windows 7. profvis keeps crashing, and I get an error message saying that Pandoc is out of memory. The message is copied below.
My understanding, and please correct me if this is wrong, is that the problem is likely to go away if I can set the /LARGEADDRESSAWARE switch on Pandoc. To do that, I would need to install a linker and so on, and do my own build, after learning how to do all those things. Alternatively, there is a shortcut involving installing MS Visual Studio, running the editbin utility, and setting the switch that way. However, a fresh install of Visual Studio is unhappy on my machine and demands that I fix some unspecified problem with Windows Management Instrumentation before it will go forward.
So my question is this: Is there a way to set the /LARGEADDRESSAWARE switch on Pandoc from inside R?
I had a similar problem and was able to resolve it by following the advice at https://www.techpowerup.com/forums/threads/large-address-aware.112556/. See the post's attached file called laa_2_0_4.zip. I downloaded it and ran the executable it contains. Basic mode was sufficient; I simply navigated to C:/Program Files/RStudio/bin/pandoc/pandoc, turned on the checkbox for the Large Address Aware Flag (step 2), then did Commit Changes (step 3). After this, the profvis-invoked pandoc command ran to completion. I was able to watch pandoc's memory consumption in Task Manager rise to a peak of about 2.7 GB.

Julia execution in Juno gets blocked because the console cannot be accessed

In another question:
Juno IDE for Julia, how to interact?
One of the answers stated:
You can't enter commands into the console in Juno--that's for displaying output. Commands can be submitted from within the editor by setting your cursor in the line to submit and pressing Ctrl+Enter or Shift+Enter. The value will then be displayed in a small popup next to the line and the output will be printed to the console if you have the console visible.
Note that the inability to use the console as you desire is by design. See here for information about the console from the Juno docs.
I am encountering an issue that might make this design decision a bit impractical at times.
First, let me say that I am new to Julia and Juno (not to coding, just trying to learn Julia now), so there might be a way to fix this and I do not know about it.
I am using the RCall package, which allows R code to be embedded in Julia, and I am trying to install some R packages. One of the packages asks me a "y/n:" question that I cannot answer, because I cannot access the console and sending code from the editor to the console with Command+Enter is blocked (since the console line is still executing).
To reproduce this issue (note that R 3.2.0 or above needs to be installed, and I had to restart Juno after installing RCall for it to work):
Pkg.add("RCall")
Pkg.build("RCall")
restart Juno
using RCall
reval("install.packages(\"rgdal\")")
A pop-up will appear asking you to select a mirror; just choose 0 and OK. Then the package asks whether I want to install some dependencies ("y/n:") and the whole execution is blocked.
I guess this could be fixed by just using Julia from the command line (and forgetting about Juno), but I like to use IDEs.
Do you have any ideas about how to circumvent this issue, or suggestions for another IDE for Julia?
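Not part of the original post, but one possible workaround is to configure the R session so install.packages() has nothing to ask before it runs, for example:

using RCall

# Assumed workaround, not from the original question: pin the CRAN repo so
# the mirror pop-up never appears, and request binary packages (on Windows
# and macOS) to avoid the "install from sources?" style prompts that block
# Juno's console.
reval("""options(repos = c(CRAN = "https://cloud.r-project.org"))""")
reval("""install.packages("rgdal", dependencies = TRUE, type = "binary")""")

Whether this covers the exact prompt rgdal triggers depends on the platform, so treat it as a sketch rather than a confirmed fix.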

Jupyter notebook extension loading erratically

I load extensions to IPython notebook (powered by Jupyter) with the following command in ~\.ipython\profile_default\static\custom\custom.js.
IPython.load_extensions("calico-spell-check", "calico-document-tools");
The extensions load correctly in the first notebook I open, but they are typically disabled in the notebooks I open afterwards. When I close and reopen notebooks, they are sometimes loaded (seldom) and sometimes not (often), without any specific message in the console.
Is it a problem of compatibility with Jupyter or rather a bad configuration of mine?
You are hitting a race condition. Most of the instructions on how to activate extensions in custom.js are wrong or outdated when they tell you to copy-paste code directly into custom.js. Please follow the official docs: use requirejs and register the extensions as described there.
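As a sketch of what that looks like for the two Calico extensions above (the extension names are just the ones from the question), defer the load until the notebook application has finished initializing so it no longer races notebook startup:

// ~/.ipython/profile_default/static/custom/custom.js
// Wait for the notebook app to finish initializing before loading the
// extensions, instead of calling IPython.load_extensions at parse time.
require(['base/js/events'], function (events) {
    events.on('app_initialized.NotebookApp', function () {
        IPython.load_extensions('calico-spell-check', 'calico-document-tools');
    });
});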
