I'm running a small scientific cluster in our lab. JupyterHub is installed so that multiple users can run Jupyter notebooks with Python/Julia/R. We are new to Dask.
Dask and the JupyterLab extension work fine if I run them locally on a node and access them through 127.0.0.1.
However, I can't get Dask to play nicely with the nginx proxy we normally use to connect to JupyterHub: the status pages still point to 127.0.0.1 instead of the access node's IP.
Any hints are appreciated.
Our setup:
Nginx <--> JupyterHub on the access node
Slurm scheduler
8 compute nodes
All on the same subnet
Apparently I'm not the only one; see this thread:
https://github.com/dask/dask-labextension/issues/41
However, it is totally unclear to me how to tackle this.
If someone could outline the steps it would be really helpful.
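One approach documented for JupyterHub deployments is to install jupyter-server-proxy in each user environment and tell Dask to build its dashboard links through the hub's proxy instead of 127.0.0.1. A minimal sketch of the Dask config; the exact config file location depends on your deployment:

```yaml
# ~/.config/dask/distributed.yaml (or any file in the Dask config directory)
# Rewrites dashboard URLs so they route through jupyter-server-proxy,
# which JupyterHub (and therefore nginx) can proxy like any other traffic.
distributed:
  dashboard:
    link: "{JUPYTERHUB_SERVICE_PREFIX}proxy/{port}/status"
```

With this, the dask-labextension status pages are served under each user's hub URL (e.g. /user/<name>/proxy/8787/status), so nginx only needs to proxy JupyterHub as before.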
Related
Hey this is my first question on here, so go easy.
I set up a Nextcloud server on my homelab in an Ubuntu Server 20.04 VM using the snap install. I have a separate VM running nginx as a reverse proxy to my Nextcloud instance. Everything works flawlessly as intended, except that when I try to install apps on Nextcloud, I get curl error #7. I've tried using my LAN IP through the web UI, my public domain name through the web UI, and the command line using nextcloud.occ app:install. I always get the same error.

I tried to find the appropriate log file to get more information, but looking in /var/snap/nextcloud/current/log/ I couldn't find any relevant info in any of the logs. Running php -m says PHP is not installed, I guess because PHP is installed via the Nextcloud snap? Obviously PHP is installed somewhere because Nextcloud is running, but I don't know how to see which modules are enabled, or how to install new ones with the snap. Any help on what to do is much appreciated!
Update: I fixed it. I think I had improperly configured my firewall; turning it off (in Proxmox) and making some changes to my /etc/netplan/*.yaml file to properly configure the static IP fixed it. Good luck!
Another cause can be a wrongly configured network. I forgot to set the gateway/proxy for IPv4, so github.com was unreachable. Most other services I use seem to resolve IPv6 first, so I did not have any other problems besides updating Nextcloud apps.
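For reference, a minimal netplan sketch of a static IPv4 configuration with an explicit default gateway; the interface name and all addresses here are hypothetical and need adapting to your network:

```yaml
# /etc/netplan/01-static.yaml (hypothetical example)
# Apply with: sudo netplan apply
network:
  version: 2
  ethernets:
    ens18:                          # your interface name may differ
      addresses: [192.168.1.50/24]  # static address for the VM
      routes:
        - to: default
          via: 192.168.1.1          # the missing gateway was the problem above
      nameservers:
        addresses: [192.168.1.1, 1.1.1.1]
```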
I started working with Airflow recently and I have two questions. I am currently running it in the background on Ubuntu, having created the two following services:
sudo touch /etc/systemd/system/airflow-webserver.service
sudo touch /etc/systemd/system/airflow-scheduler.service
It works well, but my questions are:
Is there a way to use the GUI when not connected to the VM? (It is currently impossible for me.)
The scheduler stops working as soon as I stop the VM's local port forwarding. Is there a way to keep it running 24/7, regardless of whether my computer is on or off?
Any kind of explanation would be appreciated.
Edit:
According to this post, "How do you keep your airflow scheduler running in AWS EC2 while exiting ssh?",
running it as a service seems to be enough to keep the scheduler going even after the SSH session ends. But in my case it is not working. Could it be because of the user name in the .service file? What should the user name in the file be?
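On the user-name question: the User= directive should name the OS account that owns your Airflow installation (the one whose home contains AIRFLOW_HOME), not the account you happen to SSH in as. A hypothetical sketch of the scheduler unit; the user name, paths, and AIRFLOW_HOME below are assumptions to adapt:

```ini
# /etc/systemd/system/airflow-scheduler.service (hypothetical example)
[Unit]
Description=Airflow scheduler
After=network.target

[Service]
# The OS user that owns the Airflow install; "airflow" is an assumption.
User=airflow
Environment="AIRFLOW_HOME=/home/airflow/airflow"
ExecStart=/usr/local/bin/airflow scheduler
Restart=on-failure
RestartSec=5s

[Install]
WantedBy=multi-user.target
```

Enable it with sudo systemctl enable --now airflow-scheduler so it starts at boot; a unit run this way keeps running independently of any SSH session or port forwarding.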
Following this link I installed Graphite with Docker.
Now I should be able to find Graphite at localhost/dashboard.
But there's nothing there. What am I missing? It looked quite straightforward.
When running Docker on macOS (via Docker Toolbox) you need to use docker-machine, which points the Docker client at a VM, e.g. export DOCKER_HOST="tcp://192.168.99.100:2376".
So to access the dashboard you need to browse to http://192.168.99.100/dashboard instead of localhost.
I'm new to Jupyter notebooks/Lab. I've successfully got interaction with pop-out windows, buttons, etc. using ipywidgets on local instances of JupyterLab, but not in the cloud when using notebooks.ai: the code runs without error but doesn't create the appropriate windows/buttons.
Is there any way to get this working, or is this an inherent restriction of using JupyterLab in the cloud?
I'm wondering whether there are firewall settings that need configuring to get this to work?
X11 forwarding is disabled on the Docker machines provided by notebooks.ai, so any pop-up interaction is not forwarded to your machine. However, you can still see any inline plot/button in the Jupyter notebook.
If you are interested in this feature, there is a GitHub repository for requesting features (I have never tried it); it might be worth asking there to receive a more in-depth explanation, workaround, or solution.
Proof:
In the Launcher tab you can run a notebook, a Python interpreter, or a terminal on the remote Docker machine. If you select the terminal and type echo $DISPLAY, you will see that the result is an empty line (if a valid display were attached, you would see something like DISPLAY=localhost:11.0).
For further information about using Bash to check whether X11 forwarding is enabled over SSH, see this question.
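The check described above can be scripted; a minimal POSIX-shell sketch:

```shell
# Report whether an X11 display is attached by inspecting $DISPLAY,
# which SSH sets when X11 forwarding is active.
check_x11() {
  if [ -z "${DISPLAY:-}" ]; then
    echo "no X11 display attached (DISPLAY is empty)"
  else
    echo "X11 display attached: DISPLAY=$DISPLAY"
  fi
}

check_x11
```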
I have installed R and RStudio Server on an Amazon AMI in the cloud, and both are running properly: I can type R to get the R terminal, and I can check that RStudio Server is running with
sudo lsof | grep rstudio
Since port 8787 is not accessible and another port (something like 6970) is open, I created an rserver.conf file in /etc/rstudio/. But when I try to open RStudio Server from my system's web browser I get this error:
"No Data Received"
I am not sure what the issue is; any help would be greatly appreciated.
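For reference, a minimal rserver.conf sketch that moves RStudio Server to a custom port; www-port and www-address are documented rserver.conf options, and 6970 here is just the port mentioned above:

```ini
# /etc/rstudio/rserver.conf
# Listen on port 6970 on all interfaces.
# Apply the change with: sudo rstudio-server restart
www-port=6970
www-address=0.0.0.0
```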
I've been using RStudio on an AWS instance (EC2, Ubuntu) for several months, so I hope I can help you. First, could you clarify this part of your post?
Since Port 8787 is not accessible and there is another port something like 6970 open
I suppose you are using EC2, as it seems to be the common choice for this purpose. You can open specific ports by configuring security groups in the AWS console (Network & Security → Security Groups). Have you added an inbound rule to allow 8787 (or 6970 if you prefer)?
This worked for me.
rstudio-server online