RStudio Server on EC2 - not persistent when closing browser tab

I am running RStudio server on an ec2 instance (using Louis Aslett's AMI) and connect through the browser.
I have some long scripts to run and thought I would be able to leave them running and close the browser tab/turn off my computer.
However, when I do this it seems to interrupt the console. When I log back into the server (by pasting the address into the address bar and logging in again), I am met with an alert telling me that the R session terminated, and my workspace is completely reset (working directory reset, and any data or variables lost).
Note that I am not terminating the instance, I am simply closing the browser tab that RStudio is loaded in.
Am I doing something wrong? Is there a proper way to disconnect safely and prevent this from happening?
Thanks

The author of the AMI implies that it is based on Linux, so you can run screen before launching your RStudio Server session.
The screen package is bundled with most Linux distributions. The author doesn't mention which distro his AMI is based on or list all of the included packages, but if the AMI doesn't have it, you can use a package manager to install it:
sudo apt-get install screen -y
if your package manager is apt. The installation using the yum package manager is similar.
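If screen is available, a minimal workflow looks like the following (a sketch; long_job.R is a placeholder for your script, and on a yum-based distro the install command would be sudo yum install -y screen):
# Start a named screen session on the server
screen -S longjob
# Inside it, run the long script (long_job.R is a hypothetical name)
Rscript long_job.R
# Detach with Ctrl-A then d; the script keeps running after you disconnect
# Reattach later with:
screen -r longjob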

Related

Problem with kernel creation in VS Code while connected to a remote Jupyter server

I am trying to connect from VS Code (local machine), using the Jupyter extension, to a remote Jupyter server that is running inside Docker on a remote machine, and to run a newly created notebook.
The problem seems to be the kernel creation via this method for the newly created notebook.
My Jupyter server is running on the remote machine, inside a Docker environment, with port forwarding enabled.
I can access it via my browser at http://{remote_machine_ip}:{port}/, and I am able to create a new Jupyter notebook there.
However, when I use VS Code to open a local notebook file, connect to the remote Jupyter server, and try to run the cells, it shows the following error:
Failed to start the Kernel.
'_xsrf' argument missing from POST.
View Jupyter log for further details.
Possible solution
However, if I open a new kernel in the browser and connect VS Code to that same kernel, the problem goes away.
This issue seems to arise when VS Code sends the request to create a Jupyter kernel to the Jupyter server: when I tested with an already running Jupyter kernel in VS Code, it worked fine.
The issue here is the security implementation of the Jupyter server, which doesn't allow cross-site requests to create a kernel, as that would be a security vulnerability.
For more details about the _xsrf token, you can read here; it isn't specific to the Jupyter server, but the logic is easy to deduce.
In a JetBrains post I found the solution: make Jupyter ignore the _xsrf token by adding a new flag when starting the Jupyter server:
--NotebookApp.disable_check_xsrf=True
Or add it to your notebook config.
Also, to make sure your requests are not blocked, as suggested by the VS Code blog, add the following flag too:
--NotebookApp.allow_origin='*'
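Put together, a start command with both flags might look like the following (a sketch; the IP and port are placeholders, and both flags relax Jupyter's security, so use them only on a trusted network):
# Disable XSRF checks and allow cross-origin requests so VS Code can create kernels
jupyter notebook --ip=0.0.0.0 --port=8888 \
    --NotebookApp.disable_check_xsrf=True \
    --NotebookApp.allow_origin='*'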

RStudio Server immediately freezes and crashes browser after starting

I have an RStudio Server installation on a Linux Azure VM that seems frozen and also crashes any browser I try to log in with.
I have restarted the VM and separately restarted RStudio Server (similarly to the solution in this post) a number of times, but it does not change the behavior. I didn't think I had it set to reload an environment or operation upon restart, but perhaps I accidentally did. The last thing I was doing with the server was a large sparse matrix operation that turned out to be too much for it.
Since the problem occurs immediately after the app loads, I'm not sure how to reset its state. I tried following these instructions but it didn't work:
I renamed ~/.rstudio to ~/.rstudio.backup while RStudio Server was stopped and it was recreated on the next start, but it still resumed in the same state. I saw mentions of ~/.local/share/rstudio/ and ~/.config/rstudio in the support docs, but they do not seem to exist in this Azure Ubuntu installation.
I also moved and renamed the folder of the last used R project, but it had no effect. There is a .config/R/ folder, but it only contains an rsconnect folder and two empty subfolders.
I finally got it. After shutting down and restarting RStudio Server, it was necessary to run this command immediately after it loaded:
sudo rstudio-server suspend-all
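In sequence, the recovery looked roughly like this (a sketch; rstudio-server also has a force-suspend-all command if plain suspend-all does not help):
# Restart the server, then immediately suspend every R session
# before the problem session can lock things up again
sudo rstudio-server restart
sudo rstudio-server suspend-all
# Harder variant if sessions refuse to suspend:
# sudo rstudio-server force-suspend-all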

Can cloud instances of Jupyter Lab support pop-out interactive windows

I'm new to Jupyter notebooks/Lab. I've successfully got interaction with pop-out windows, buttons, etc. using ipywidgets on local instances of Jupyter Lab, but not in the cloud when using notebooks.ai - the code runs without error but doesn't create the appropriate windows/buttons.
Is there any way to get this working, or is this an inherent restriction of using Jupyter Lab in the cloud?
I'm wondering whether there are firewall settings that need configuring to get this to work?
X11 forwarding is disabled on the Docker machines provided by notebooks.ai. Hence, any pop-up interaction is not forwarded to your machine. However, you can see any inline plot/button in the Jupyter notebook.
If you are interested in this feature, there is a GitHub repository for requesting features (I have never tried it); it might be worth using to get a more in-depth explanation, workaround, or solution.
To verify:
In the launcher tab you can run a notebook, a Python interpreter, or a terminal on the remote Docker machine. If you select the terminal and type echo $DISPLAY, you will see that the result is an empty line (if a valid display were attached, you would see something like localhost:11.0).
For further information about using BASH to check if X11 forwarding is enabled from SSH check this question.
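A quick version of that check, runnable in the remote terminal (a sketch of the test described above):
# An empty DISPLAY means no X11 display is attached, so pop-ups cannot be forwarded
echo "DISPLAY=$DISPLAY"
if [ -z "$DISPLAY" ]; then
    echo "No X11 forwarding: pop-out windows will not appear"
fi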

Running R code via RStudio on a remote server, not via the browser

Is there a way to use a local RStudio installation on my machine that actually runs the code on a remote server, where I can run distributed jobs via SLURM?
Can it be compatible with version control and Docker?
The remoter package does what you want to do very well. You start R on the remote server and run remoter::server(showmsg=TRUE). Then in your local RStudio you run remoter::client(). It works fairly flawlessly.
My main issue is that when you run help it comes from the remote session in the console rather than the help window.
https://cran.r-project.org/web/packages/remoter/vignettes/remoter.pdf
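As a rough sketch of the workflow (the host name your.server.example and the addr argument are illustrative; check the vignette linked above for the exact arguments and defaults):
# On the remote server (e.g. in an SSH session, ideally under screen/tmux),
# start the remoter server:
Rscript -e 'remoter::server(showmsg = TRUE)'
# Then, from the local RStudio console, connect with something like
# (addr is assumed here to take the server's address):
#   remoter::client(addr = "your.server.example")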

Unable to open RStudio-server in web browser

I have installed R and RStudio Server on an Amazon AMI cloud instance, and both are running properly: I can type R to get the R terminal, and I can check that RStudio Server is running with
sudo lsof | grep rstudio
Port 8787 is not accessible, and there is another port (something like 6970) open, so I have created an rserver.conf file in /etc/rstudio/. But when I try to open RStudio Server from my system's web browser, I get this error:
"No Data Received"
I am not sure what the issue is; any help would be greatly appreciated.
I've been using RStudio on an AWS instance (EC2, Ubuntu) for several months, so I hope I can help you. First, could you clarify this part of your post?
Since Port 8787 is not accessible and there is another port something like 6970 open
I suppose you are using EC2, as it seems to be a common choice for this purpose. Then, I believe you can open certain ports by configuring security groups in the AWS console (Network & Security > Security Groups). Have you added a rule to allow 8787 (or 6970 if you prefer) inbound?
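If you prefer the command line over the console, the same inbound rule can be added with the AWS CLI (a sketch; sg-0123456789abcdef0 is a placeholder for your security group ID, and 0.0.0.0/0 opens the port to everyone, so narrow the CIDR in practice):
# Allow inbound TCP traffic on RStudio Server's default port 8787
aws ec2 authorize-security-group-ingress \
    --group-id sg-0123456789abcdef0 \
    --protocol tcp \
    --port 8787 \
    --cidr 0.0.0.0/0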
This worked for me.
rstudio-server online
