I'm new to Jupyter Notebook/Lab. I've successfully got interaction with pop-out windows, buttons, etc. using ipywidgets on local instances of JupyterLab, but not in the cloud when using notebooks.ai - the code runs without error but doesn't create the expected windows/buttons.
Is there any way to get this working, or is this an inherent restriction of using JupyterLab in the cloud?
I'm wondering whether there are firewall settings that need configuring to get this to work?
X11 forwarding is disabled on the Docker machines provided by notebooks.ai, so any pop-up interaction is not forwarded to your machine. However, you can still see inline plots/buttons in the Jupyter notebook.
If you are interested in this feature, there is a GitHub repository for requesting features (I have never tried it); posting there might get you a more in-depth explanation, workaround, or solution.
Proof:
In the Launcher tab you can run a notebook, a Python interpreter, or a terminal on the remote Docker machine. If you open the terminal and type echo $DISPLAY, you will see that the result is an empty line (if a valid display were attached you would see something like DISPLAY=localhost:11.0).
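For example, the check from that terminal looks like this (the output shown is illustrative):

echo $DISPLAY
# prints an empty line here; with X11 forwarding enabled you would instead see something like localhost:11.0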
For further information about using Bash to check whether X11 forwarding is enabled over SSH, see this question.
Related
I am trying to connect to a remote Jupyter server, which is running inside Docker on a remote machine, from VS Code (local machine) with the help of the Jupyter extension, and to run a newly created notebook.
The problem seems to be kernel creation via this method for the newly created notebook.
My Jupyter server is running on the remote machine, inside a Docker environment, with port forwarding enabled.
I can access it via my browser at http://{remote_machine_ip}:{port}/, and I am able to create a new Jupyter notebook.
However, when I use VS Code to open a local notebook file and connect to the remote Jupyter server, running the cells shows the following error:
Failed to start the Kernel.
'_xsrf' argument missing from POST.
View Jupyter log for further details.
Possible solution
However, if I open a new kernel in the browser and connect to that same kernel from VS Code, the problem goes away.
This issue seems to arise when VS Code sends the request to create a Jupyter kernel to the Jupyter server.
When I tested with an already-running Jupyter kernel in VS Code, it worked fine.
The issue here is the security implementation of the Jupyter server, which doesn't allow cross-site requests to create a kernel, as that would be a security vulnerability.
For more details about the _xsrf token, you can read here; although it doesn't talk specifically about the Jupyter server, it's easy to deduce the logic.
In a JetBrains post, I found the solution: make Jupyter ignore the _xsrf token by adding a new flag when starting the Jupyter server:
--NotebookApp.disable_check_xsrf=True
Or add the corresponding setting to your notebook config file.
Also, to make sure your requests are not blocked, as suggested by the VS Code blog, add the following flag too:
--NotebookApp.allow_origin='*'
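Putting both flags together, the server start looks roughly like this (a sketch; any other options you normally pass are omitted), or the equivalent settings can go in the config file:

jupyter notebook --NotebookApp.disable_check_xsrf=True --NotebookApp.allow_origin='*'

# or in jupyter_notebook_config.py (jupyter_server_config.py for newer servers):
c.NotebookApp.disable_check_xsrf = True
c.NotebookApp.allow_origin = '*'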
I am trying to connect from Google Colab to localhost - it does not work.
Google's troubleshooting advice suggests that I should allow my local Jupyter notebook to accept Colab requests - how do I do that?
The screenshot from localhost confirms that requests from Colab are forbidden.
There should be some config modification to allow such requests, shouldn't there?
Advice from Colab:
How I fixed my connection issues:
If the above commands don't work, what finally did it for me was creating a firewall rule for port 8888.
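For example, on Windows (which the installation notes below suggest) a rule like this opens the port; the rule name is just illustrative:

netsh advfirewall firewall add rule name="Jupyter 8888" dir=in action=allow protocol=TCP localport=8888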
Another big one, which appears in the command-line instructions but is not stated in the provided Google tutorial and which some people will miss, is that you need to open the URL given to you after launching Jupyter in your browser, so that it can create the access cookie and make it valid (so to speak).
From the screenshot you provided, make sure you have "http://localhost:8888/?token=2534..." open in a browser that is logged into the same account accessing Colab before it will allow access (a separate window or tab usually pops up on its own when you run the command, though).
Alternatively, you can also add the --no-browser flag to avoid the need to open it in your browser.
Other solutions that address installation issues which can look like network issues:
There were a number of other troubleshooting steps that were particular to my setup, and I don't know if they will apply to you, but did you have any issues when installing Jupyter?
For me, even though Jupyter would still launch, there were errors when installing on Windows; to fix those I had to replace pip with pipwin and go through the steps that way.
For example
pipwin install jupyterlab
pipwin install jupyter_http_over_ws
And in case it helps anyone else coming across this in the future: if you're using the DOS cmd line and have issues launching Jupyter, replace the "\" with a "^" to indicate line continuation,
i.e.:
jupyter notebook --NotebookApp.allow_origin="https://colab.research.google.com" --port=8888 --NotebookApp.port_retries=0
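As a sketch of that caret continuation in cmd, the same command split across lines would look like:

jupyter notebook ^
  --NotebookApp.allow_origin="https://colab.research.google.com" ^
  --port=8888 ^
  --NotebookApp.port_retries=0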
I am trying to share my Jupyter notebook with my team and want us all to co-author the same notebook. Is that possible?
Yes, JupyterLab 3.1 introduces Real Time Collaboration mode:
Ensure you have JupyterLab 3.1 or newer installed, and for convenience install jupyterlab-link-share:
pip install -U "jupyterlab>=3.1" jupyterlab-link-share
Add the c.LabApp.collaborative = True setting to your config file (which is jupyter_server_config.py or jupyter_notebook_config.py depending on the server you use); see the sketch after these steps.
After restarting JupyterLab, open the new Share menu and choose Share Jupyter Server Link; copy the link and send it to your collaborator. Of course, the server needs to be accessible to them (on the same network, or publicly available).
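A minimal sketch of that config entry, assuming the newer server and a default config location:

# ~/.jupyter/jupyter_server_config.py
c.LabApp.collaborative = True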
You can try it out on Binder (here using this public gist).
JupyterHub users may need to perform additional configuration (as discussed on the discourse here) because there are additional permissions/authentication questions to be dealt with.
I am trying to run a R notebook on Microsoft's Azure notebooks cloud service.
When I try to run all cells, it displays Loading required package: ggplot2 in the last cell, and then the kernel systematically crashes. I get:
The kernel appears to have died. It will restart automatically.
But the Kernel does not restart automatically.
How can I get a log describing the encountered issue? Is there a way to activate a debugger?
When you're running Jupyter you'll usually see messages about kernel issues in the standard I/O of the console you launched it from. In Azure Notebooks this gets redirected to a file at ~/.nb.log. You can open a new terminal by clicking on the Jupyter icon, then New->Terminal, and run cat ~/.nb.log. You could also start a new Python notebook for this purpose and run "!cat ~/.nb.log" - but unfortunately you can't do that from an R notebook, since R notebooks don't support the "!" shell magic.
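For example, either of these (as described above) shows the log:

# from a terminal (New -> Terminal):
cat ~/.nb.log

# or from a cell in a Python (not R) notebook:
!cat ~/.nb.log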
Usually that gives you a good starting point. If it doesn't help much, you could try invoking R directly from the terminal, repeating the repro steps there, and seeing whether the output is more useful.
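A rough sketch of that terminal repro, using the package named in the question (the failing step itself is whatever your notebook runs):

R
# then, inside the R session:
library(ggplot2)
# re-run the code from the crashing cell and watch the console for the underlying error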
Sorry if this question is not as detailed and focused as it should be.
I am a Linux user (so no admin privileges of any sort) and just installed Anaconda3 from here and followed the instructions.
Note that my Linux machine is not connected to the internet, so I had to transfer the file through other means and just run bash Anaconda... as instructed on the Continuum site.
I then successfully launched IPython and tried to plot inline, without problems. However, when I tried to plot in windows, I got this kind of output from the terminal:
X Error: BadDrawable (invalid Pixmap or Window parameter) 9
  Major opcode: 62 (X_CopyArea)
And the created graph window was just blank.
I then tried to start Spyder and basically saw the same behaviour: a lot of those errors reported above, and the Spyder window just popped up blank.
A Google search for the error gives results reported for Qt, which makes sense, since QtAgg is used when plotting "offline" (as opposed to inline).
However, I have no clue where to look for the version of these libs, how to install/compile new ones, or whether that is really the issue at all. I am just too ignorant about Linux.
Can anybody hint at what to look for and how to debug this behaviour?
I had the same error. What worked for me was to add this line to /etc/environment.
sudo nano /etc/environment
Add this line
QT_X11_NO_MITSHM=1
Source : https://github.com/P0cL4bs/WiFi-Pumpkin/issues/53#issuecomment-309120875
Note that in my case the fix didn't take effect until I rebooted my machine.
You may simply run this in the terminal:
export QT_X11_NO_MITSHM=1
I had this same error, so I'll tell you what worked for me.
I think it is a permissions issue, based on the following:
I was logged in through a VNC server window under my own account, but within that VNC session I was setting up a user profile for "user2". In a user2 console I installed Anaconda in user2's directory. When I typed spyder in the user2 console, I got the exact error you describe. I guessed the VNC desktop didn't like user2 trying to open a window on user1's session. I then logged out of my VNC server window, logged into user2's VNC server window, typed spyder in a console, and it opened perfectly.
I think for some reason it is either installed in a directory that you don't have permissions for, or trying to open a window on a display that you don't have access to.
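If the cause really is one user trying to draw on another user's display, a hedged workaround (standard X11 tooling, not from the original answer) is to grant that user access with xhost:

# run as the user who owns the display (user1 in the example above)
xhost +SI:localuser:user2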