How to activate a debugger or access logs in Jupyter notebooks? (R)

I am trying to run an R notebook on Microsoft's Azure Notebooks cloud service.
When I run all cells, the last cell displays Loading required package: ggplot2 and then the kernel systematically crashes. I get:
The kernel appears to have died. It will restart automatically.
But the kernel does not restart automatically.
How can I get a log describing the issue? Is there a way to activate a debugger?

When you run Jupyter yourself, kernel problems usually show up on the console's standard output. In Azure Notebooks this output is redirected to a file at ~/.nb.log. You can open a new terminal by clicking the Jupyter icon, choosing New->Terminal, and running cat ~/.nb.log. You could also start a new Python notebook and run "!cat ~/.nb.log" - but unfortunately you can't do that from an R notebook, since R kernels don't support the "!" shell-escape commands.
That usually gives you a good starting point. If it doesn't help much, try invoking R directly from the terminal and running the repro steps there to see whether the output is more informative.
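If you would rather stay inside the R notebook, you can read the log from R itself instead of using a shell escape. A minimal sketch using only base R, assuming the log lives at ~/.nb.log as described above:

```r
# Read the Azure Notebooks kernel log from within an R notebook.
log_path <- path.expand("~/.nb.log")
if (file.exists(log_path)) {
  # Print only the last 50 lines, where the crash details usually are.
  writeLines(tail(readLines(log_path), 50))
} else {
  message("No log file found at ", log_path)
}
```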

Related

Kubeflow: Notebook server stuck on loading

Whenever I try to create a Kubeflow notebook server to build a pipeline from a Jupyter notebook, it keeps loading forever without displaying any error.
I'm using a Kubeflow dashboard that is already up and running on a server, so I didn't deploy it myself, and I'm not working on a local instance where I could use the terminal.
Any idea what the origin of the problem might be and how to solve it?

How to reinitialize a notebook from the shell?

Following previous discussions from 2018, I managed to run the latest version of Julia inside Google Colab (see here). However, there is a small quirk in the setup: after installing Julia, a new notebook has to be initialized before Colab will recognize Julia code. I've been doing this by refreshing the web browser, which reinitializes the notebook without appearing to kill the runtime environment.
I would like to remove this quirky step, and have it so that the notebook is reinitialized directly after Julia is installed, without the need to refresh the browser. Is there a simple shell command that will do this? Any ideas are appreciated.
(The Colab notebook I provided in the link above has step-by-step instructions for installing and running Julia, including instructions on when to refresh the browser. It's short.)

Can cloud instances of JupyterLab support pop-out interactive windows?

I'm new to Jupyter notebooks/JupyterLab. I've successfully gotten interaction with pop-out windows and buttons using ipywidgets on local instances of JupyterLab, but not in the cloud with notebooks.ai - the code runs without error but doesn't create the expected windows/buttons.
Is there any way to get this working, or is this an inherent restriction of running JupyterLab in the cloud?
I'm wondering whether there are firewall settings that need configuring to make this work.
X11 forwarding is disabled on the Docker machines provided by notebooks.ai, so pop-out interaction cannot be forwarded to your machine. However, you can still see inline plots and buttons in the Jupyter notebook.
If you are interested in this feature, the project has a GitHub repository for feature requests (I have never tried it); it might be a good place to get a more in-depth explanation, a workaround, or a solution.
To verify:
In the launcher tab you can run a notebook, a Python interpreter, or a terminal on the remote Docker machine. If you select the terminal and type echo $DISPLAY, the result is an empty line (if a valid display were attached, you would see something like localhost:11.0).
For more information about using Bash to check whether X11 forwarding is enabled over SSH, see this question.
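You can run the same check from R without leaving the notebook. A minimal sketch using only base R:

```r
# Check whether an X11 display is attached (the R equivalent of `echo $DISPLAY`).
display <- Sys.getenv("DISPLAY")
if (nzchar(display)) {
  cat("X11 display attached:", display, "\n")
} else {
  cat("No X11 display: pop-out windows cannot be forwarded here.\n")
}
# Also reports whether this build of R supports X11 at all.
capabilities("X11")
```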

Fixing pandoc "out of memory" error when running the profvis R package

I'm trying to use the profvis package to do memory profiling of a large job in 64-bit R, running under RStudio on Windows 7. profvis keeps crashing, and I get an error message saying that pandoc is out of memory. The message is copied below.
My understanding, and please correct me if this is wrong, is that the problem is likely to go away if I can set the /LARGEADDRESSAWARE switch on pandoc. To do that, I would need to install a linker and do my own build, after learning how to do all of those things. There is a shortcut involving installing MS Visual Studio and running the editbin utility to set the switch that way, but a fresh install of Visual Studio fails on my machine, demanding that I first fix some unspecified problem with Windows Management Instrumentation.
So my question is this: is there a way to set the /LARGEADDRESSAWARE switch on pandoc from inside R?
I had a similar problem and was able to resolve it by following the advice at https://www.techpowerup.com/forums/threads/large-address-aware.112556/. In that post there is an attached file called laa_2_0_4.zip. I downloaded it and ran the executable it contains. Basic mode was sufficient: I navigated to C:/Program Files/RStudio/bin/pandoc/pandoc, turned on the checkbox for the Large Address Aware flag (step 2), then clicked Commit Changes (step 3). After this, the profvis-invoked pandoc command ran to completion. Watching in Task Manager, pandoc's memory consumption rose to a peak of about 2.7 GB.
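Before patching, it helps to confirm exactly which pandoc binary R and RStudio will invoke, since that is the file the Large Address Aware tool needs to modify. A short sketch, assuming the rmarkdown package is installed (RSTUDIO_PANDOC is the environment variable RStudio uses to point at its bundled pandoc):

```r
# Directory RStudio points pandoc calls at (empty if not running under RStudio).
Sys.getenv("RSTUDIO_PANDOC")

# Full path to the pandoc executable that rmarkdown reports it will use.
rmarkdown::pandoc_exec()

# Confirm the version, to be sure you patched the binary actually in use.
rmarkdown::pandoc_version()
```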

IPython notebook hangs when opening a large notebook

I am using the latest (2.2.0) IPython notebook. When I create a notebook with a loop that writes many lines of output (about 20,000), it appears to run forever: I always see the running icon at the top right. Even if I restart the computer and reopen the notebook, it goes into running mode automatically, and I am almost unable to do anything on the page. I have to copy the code into a new notebook to work around it.
How can I fix this hang when opening a notebook that is too large? I have tried the kernel "interrupt" and "restart" menu items, and they seem to have no effect at all.
The IPython notebook is not intended for tasks with heavy computation or very large output; such jobs are better suited to a standalone program than to a notebook.
To fix this kind of issue, move the work into a standalone script, run it from the console, and paste only the meaningful results into the IPython notebook.
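As an illustration in R, the language of the thread at the top of this page, a minimal sketch of that workflow: persist the bulky results to disk from a standalone script, then load only a summary in the notebook. The file names here are hypothetical.

```r
# heavy_job.R -- standalone script, run from the console with: Rscript heavy_job.R
# Do the heavy loop here and save the results instead of printing 20,000 lines.
results <- vapply(seq_len(20000), function(i) log(i + 1), numeric(1))
saveRDS(results, "results.rds")

# Back in the notebook: load the saved object and display only a summary.
results <- readRDS("results.rds")
summary(results)
```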
