Input doesn't pass through into attached container with R session

I was trying to do some debugging in R code that was already running inside a container.
After running docker attach <container-id>, I attach to the running process as expected and can see the browser() prompt. However, I cannot interact with the R session because my input does not pass through to it. Commands that I enter stay in a buffer and only get executed in the local bash session after I detach from the container.
The R session is started through ShinyProxy, which spins up a Docker container with an R instance in which the following script is run:
#!/bin/bash
R -e "shiny::runApp(host='0.0.0.0', port=3838)"
I'm connecting to the machine running Docker from Windows using PuTTY. How can I make my input pass through to the attached R container?

The problem turned out to be caused by PuTTY, which seems to send something on the input stream that closes the Browse prompt.
Using the SSH client that ships with Git for Windows solved the problem.
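For reference, the attach workflow described above can be sketched as follows (the container name shiny-app and the custom detach key are assumptions, not from the thread):

```shell
# Attach to the running container; --detach-keys lets you leave the
# session without stopping the R process (the default is Ctrl-p Ctrl-q).
docker attach --detach-keys="ctrl-x" shiny-app

# ...interact with the browser() prompt in the R session...

# Alternatively, open a separate shell in the container without touching
# the main process's stdin at all:
docker exec -it shiny-app /bin/bash
```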

Related

Preserve environment variables when spawning shiny processes within a container

I have a running Docker container with Shiny Server, built from a slightly modified rocker/shiny image.
The default shiny-server.conf file sets shiny as the user under which apps run:
# Define the user we should use when spawning R Shiny processes
run_as shiny;
This means the server itself runs as root by default, but the worker processes for the Shiny apps run as user shiny.
The apps themselves use a data warehouse connection to a SQL Server instance, initialized via RODBC. Since we did not want to put the full connection string (including the DB host and password) into the codebase, we read the details from environment variables passed to the container at creation time, using the following routine:
library(RODBC)

HOST <- Sys.getenv("host")
DB   <- Sys.getenv("db")
UID  <- Sys.getenv("uid")
PWD  <- Sys.getenv("pwd")
conn <- paste0("driver={ODBC Driver 17 for SQL Server};server=", HOST,
               ";database=", DB, ";uid=", UID, ";pwd=", PWD)
dbhandle <- odbcDriverConnect(conn)
The problem is that those environment variables are empty when the worker process is spawned within the container as the user shiny.
If I run the same code in an interactive R console (as either root or the shiny user), I get the environment variables as expected.
Any input would be much appreciated. Please note I do not intend to use Docker secrets, as I am not running the app within a Docker Swarm cluster, just on a standalone RancherOS host.
EDIT:
While an .Renviron file might be a viable alternative for solving this particular problem, it would entail putting the variables into the codebase, which is what we are trying to avoid.
As suggested by Ralf Stubner, I added the following to the shiny-server.sh start script, which is the Docker container's CMD:
env > /home/shiny/.Renviron
chown shiny:shiny /home/shiny/.Renviron
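Put together, a minimal version of such a start script might look like this (the shiny-server invocation and the exact image layout are assumptions based on the rocker/shiny image):

```shell
#!/bin/bash
# Persist the container's environment for the shiny worker processes,
# which otherwise start with a clean environment and see empty variables.
env > /home/shiny/.Renviron
chown shiny:shiny /home/shiny/.Renviron

# Hand over to Shiny Server as the container's main process
exec shiny-server
```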

run local Rscript on remote R server

I am trying to run some R scripts on a remote server which has a fixed IP address.
The background is the following:
I connected a front-end web app to R and am able to insert different values, which are then passed to R, where different outcomes are evaluated.
Everything works fine when I run it locally.
I want to be able to run the web app from any location, so I set up an R server with a username, password, etc.
What I want to do next is establish a connection between R and the server.
I wasn't able to find anything on that topic yet, so I hope you can help me with this. The steps would be:
Establish a connection to the R server at some IP.
Log in with my username and password.
Tell my R script where to find the Rscript.exe file.
Doing this locally, I just tell my PHP file:
$command = '"C:\Users\Username\Documents\R\R-3.3.1\bin\Rscript.exe" "'.__DIR__.'\rfile.R" '.$json;
So I tell it where to find Rscript.exe and to run my rfile.R with it.
Thanks for your help.
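One common pattern for this (an assumption, not something confirmed in the thread) is to keep the local invocation almost unchanged but wrap it in an SSH call, using key-based authentication so no interactive password prompt is needed. The user, host, and paths below are placeholders:

```shell
# Run the R script on the remote server instead of locally; the JSON
# payload is passed as a single quoted argument, as in the PHP example.
ssh rserver-user@203.0.113.10 \
    "Rscript /srv/app/rfile.R '$JSON_PAYLOAD'"
```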

Background R console and long running session

This is my situation: I usually run R from within Emacs using ESS in a terminal emulator on my local PC. My workplace got a new server running R, so I would like to use the remote server via SSH. I connect via SSH and everything works well. What I would like is to keep the R console alive while I close my laptop and go home, so that from home I can reconnect to the existing R session.
I tried to put the R console in the background using C-q C-z Enter to stop the process, but when I close the SSH connection the process is killed. No luck with bg & either. I also tried mosh, but in that case I got issues related to the UDP traffic across my work's network. screen and tmux are also not very useful due to their bad interaction with Emacs' eshell.
Both the client and server machines run Debian 8 with Xfce.
Is there a way to keep alive the R terminal while closing the ssh connection? Which is your approach to the long R sessions?
EDIT
Finally, here and here I found the solution I was looking for. I tried the same approach as in the links above but using tmux, and I got lots of errors. The holy grail is screen. I tried to follow that procedure step by step, but I got an error from Emacs while trying to attach a screen session from within eshell. So I tried ansi-term instead of eshell, and everything works as expected: I can attach and detach the R session. This way I use the remote server only for the computation, while the R scripts stay on my laptop.
So, this is the workflow:
ssh to the host server
start a screen session
start R
detach the screen session
exit the server, closing the ssh connection
run Emacs as a daemon on your local machine and open an emacsclient instance (running Emacs via emacsclient is not necessary, but I prefer it this way)
open your R script
open an ansi-term (M-x ansi-term)
ssh to the server from the ansi-term
attach the screen session (screen -r)
connect the remote R console to the local R script (M-x ess-remote)
to detach from R from within ansi-term, use Ctrl-q Ctrl-a d Return
That's it. Now I can run a remote R process using a local R script, close the connection, and leave the R console open so I can re-attach to it in the future, even from a different IP.
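The shell side of the workflow above can be sketched as follows (hostname and session name are placeholders):

```shell
# On first connection: start a named screen session and launch R in it
ssh me@r-server
screen -S rwork     # then, inside the session:
R

# Detach with Ctrl-a d and log out; R keeps running on the server.

# Later, from anywhere (including a different IP):
ssh me@r-server
screen -r rwork     # re-attach to the same R console
```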
This is one of my favourite topics :) Here is what I do:
Always start emacs as emacs --daemon so that it runs in the background.
Always launch emacsclient -nw (for textmode) or emacsclient -c (in x11/graphical mode) to access the daemonized emacs in the background. I have these aliased to emt and emx, respectively.
Now you are essentially done. You can ssh to that box and resume from wherever you can launch ssh from, which may be a smartphone or a browser. And ESS of course allows you to have multiple R sessions. After M-x R I often invoke M-x rename-buffer to align the buffer name with the project or idea I am working on.
I combine this further with both
byobu (which is a fancy tmux wrapper available in many distros and on OS X, and originally from Ubuntu) to have shell sessions persist
mosh for places like work and home where my laptop can simply resume
Strictly speaking you do not need byobu or mosh for Emacs to persist (as running the daemon takes care of that), but you may want them for all your other shell sessions.
This setup has been my go-to set of tools for years at work and home.

Amazon EC2 / RStudio : Is there are a way to run a job without maintaining a connection?

I have a long-running job that I'd like to run using EC2 + RStudio. I set up the EC2 instance, then opened RStudio as a page in my web browser. I need to physically move the laptop that holds the connection and runs the web browser throughout the course of the day; my job then gets terminated in RStudio, but the instance is still running on the EC2 dashboard.
Is there a way to keep a job running without maintaining an active connection?
Does it have to be started / controlled via RStudio?
If you make your task a "normal" R script, executed via Rscript or littler, then you can run it from the shell and get to:
use old-school tools like nohup, batch or at to control running in the background
use tools like screen, tmux or byobu to maintain one or multiple sessions in which you launch the jobs, and connect / disconnect / reconnect at leisure.
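The old-school variant can be as simple as the following (job.R and the log file name are placeholders):

```shell
# nohup makes the job ignore SIGHUP, so it survives the terminal or
# browser connection closing; the trailing & puts it in the background.
nohup Rscript job.R > job.log 2>&1 &
echo "launched as PID $!"
```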
RStudio Server works in similar ways but AFAICT limits you to a single session per user per machine, which makes perfect sense for interactive work but is limiting if you have a need for multiple sessions.
FWIW, I like byobu with tmux a lot for this.
My original concern that I needed to maintain a live connection turned out to be incorrect. The error was actually caused by running out of memory; it just coincided with losing the internet connection.
An instance is started from the AWS dashboard and stopped or terminated there as well. As long as it is still running, it can be accessed from an RStudio tab by copying the public DNS into the browser's address bar and logging in again.

Fails in Telnet works in SSH

Hi
I am connecting to a remote Unix machine and running a command there that is supposed to run in the background.
The problem is that when I connect with ssh it works fine, but if I connect with telnet, the program I run stops running after a few seconds.
The program I execute starts another program in the background.
It seems (I'm guessing) that the failure happens when the first program is about to start the other program in the background.
Has anyone ever encountered something like this?
> An interactive shell is one started without non-option arguments, unless -s is specified, without specifying the -c option, and whose input and output are both connected to terminals (as determined by isatty(3)), or one started with the -i option. See section 6.3 Interactive Shells, for more information
Job control isn't available over your telnet connection. This can be:
a deficiency of your telnet client
a missing option to telnet
the result of starting bash in a pipe, e.g.: by default the input/output are then not connected to a terminal (but rather to pipes). Don't do that :)
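If switching to ssh is not an option, one workaround consistent with the quote above (a sketch; start_prog is a placeholder) is to detach the child process completely from the controlling terminal, so the telnet hangup cannot reach it:

```shell
# setsid runs the command in a new session with no controlling terminal;
# redirecting stdin/stdout/stderr severs the last ties to the telnet line.
setsid sh -c './start_prog' < /dev/null > prog.log 2>&1 &
echo "detached"
```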