Failed to display R script on the local OpenCPU single-user server

I set up a local OpenCPU single-user server using RStudio. I also created my own R package (package name: test), which includes only a simple test.R file. The source code is:
f1 <- function(x, y) {x+y}
I started the OpenCPU server by typing library(opencpu) in RStudio's console and got the following output:
Initiating OpenCPU server...
OpenCPU started.
[httpuv] http://localhost:6067/ocpu
OpenCPU single-user server ready.
I was able to call the function by typing curl http://localhost:6067/ocpu/library/test/R/f1 -d "x=33&y=3".
But when I tried to display the R script (test.R) by typing curl http://localhost:6067/ocpu/library/test/R/test.R, it printed:
object 'test.R' not found
In call:
get(reqobject, paste("package", reqpackage, sep = ":"), inherits = FALSE)
In addition, it failed when I tried to run the test.R script by typing curl http://localhost:6067/ocpu/library/test/R/test.R -X POST -d "x=3&y=4". Can I run the script like that?
Could anyone help with this? Thanks.

When you install an R package, the scripts under /R are collated and turned into functions/objects; after installation there is no file named test.R to request, which is why the object lookup fails. You address the function (f1), not the file. To read the source of the function, just do one of these:
curl http://localhost:6067/ocpu/library/test/R/f1/print
curl http://localhost:6067/ocpu/library/test/R/f1/ascii
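For comparison, a quick local check (my sketch, assuming the test package is installed): printing the function in an R session shows the same deparsed source that the /print endpoint serves.
library(test)
print(f1)   # shows the function's source, e.g. function(x, y) {x+y}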

Related

Create a shell alias that R can recognise

I have an alias (actually, a function) in my .bashrc, but R doesn't seem to recognise it:
fun() {
echo "Hello"
}
which runs correctly when I log in via ssh (it's a remote server). However, if I run system("fun"), I get
sh: 1: fun: not found
Warning message:
In system("fun") : error in running command
From a comment on this question, I can get system("bash -i -c fun") to work, although with a weird warning/message:
bash: cannot set terminal process group (27173): Inappropriate ioctl for device
bash: no job control in this shell
Hello
However, this doesn't apply to my case: I'm running external code, so I cannot modify the system() call. I need system("command") to pick up the command command that I defined.
BTW, this is all running on Linux (Debian on the remote server, but I also tried on my machine running elementary OS, with the same result).
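A hedged sketch of one common workaround (my addition, not from the original thread): system() invokes /bin/sh, which never sources ~/.bashrc, so shell functions defined there are invisible to it. Installing fun as a real executable in a directory on the PATH that R inherits sidesteps this without touching the system() call; ~/bin is a hypothetical location.
# From R: write a wrapper script that does what the bash function did.
writeLines(c("#!/bin/bash", "echo Hello"), "~/bin/fun")  # hypothetical path
Sys.chmod("~/bin/fun", "0755")                           # make it executable
system("fun")  # resolves via PATH, provided ~/bin is on the PATH R sees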

R script / package to listen to an endpoint on webpage?

I'm using an R script to query an API and process data, and then run it on a Shiny server. Basically there's an input field, and the script queries that username. However, I'd like to do this from a Discord bot, so I'm wondering if there's a way to listen on www.example.com/endpoint/ and pass the input from this endpoint into the query.
The solution was to use the plumber package. This essentially creates an endpoint for your script. An example:
# plumber.R
#* Echo back the input
#* @param msg The message to echo
#* @get /echo
function(msg=""){
list(msg = paste0("The message is: '", msg, "'"))
}
Then you save that as an R script and run the following:
library(plumber)
r <- plumb("plumber.R") # Where 'plumber.R' is the location of the file shown above
r$run(port=8000)
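Once it's running, a quick test (my example; port 8000 as above) is to hit the endpoint with curl:
curl "http://localhost:8000/echo?msg=hello"
# expected response, roughly: {"msg":["The message is: 'hello'"]}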
I'm running this on an Ubuntu server, where the following sets it up as a systemd service. In that case you don't need the lines above, only the plumber.R file, since the ExecStart line below does essentially the same thing:
sudo nano /etc/systemd/system/plumber-api.service
Content of the file should be:
[Unit]
Description=Plumber API
# After=postgresql
# (or mariadb, mysql, etc if you use a DB with Plumber, otherwise leave this commented)
[Service]
ExecStart=/usr/bin/Rscript -e "api <- plumber::plumb('/your-dir/your-api-script.R'); api$run(port=8080, host='0.0.0.0')"
Restart=on-abnormal
WorkingDirectory=/your-dir/
[Install]
WantedBy=multi-user.target
sudo systemctl start plumber-api # starts the service
Details on hosting the plumber package: https://www.rplumber.io/docs/hosting.html
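A small addition (standard systemd usage, not part of the original answer): reload systemd after creating the unit file, and enable the unit if it should also start on boot.
sudo systemctl daemon-reload # pick up the new unit file
sudo systemctl enable plumber-api # start automatically at boot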

Unable to pull docker image in Rstudio

I have trouble pulling a Docker image from RStudio using the babelwhale library. I tried running the command below:
babelwhale::pull_container("ubuntu:18.04")
But it gave the following error
Got permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Post http://%2Fvar%2Frun%2Fdocker.sock/v1.40/images/create?fromImage=ubuntu&tag=18.04: dial unix /var/run/docker.sock: connect: permission denied
Error in processx::run("docker", c("pull", container_id), echo = TRUE) :
System command error
It seems like you are using RStudio on a Unix system, where connecting to the Docker daemon socket typically requires elevated privileges. You could try launching RStudio with sudo from the terminal:
sudo rstudio
and then rerun the babelwhale call above.
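A commonly used alternative (my addition; standard Docker administration rather than part of this answer) is to grant your user access to the Docker socket instead of running RStudio as root:
sudo usermod -aG docker $USER # add your account to the docker group
Log out and back in for the group change to take effect, then retry:
babelwhale::pull_container("ubuntu:18.04")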

Running Rserve as service

I am currently working on a project which uses R to do a very sophisticated calculation; it is set up to be called by an ASP.NET web application. It uses Rserve as the interface to R and needs the ROracle and DBI libraries too.
The problem is that every time the server restarts I have to sign in and launch Rserve manually. The question is: is there any way to run Rserve automatically whenever the server restarts? I am running it on Windows.
Thanks very much.
As the public documentation of Rserve says, you can start Rserve from a shell script.
Just create a script that executes the following command:
echo 'library(Rserve);Rserve(FALSE,args="--no-save --slave --RS-conf <your_own_path>/rserve.conf")'|<R_bin_path>/R --no-save --slave
For instance, on my macOS computer I can start Rserve by executing this line:
/bin/sh -c "echo 'library(Rserve);Rserve(FALSE,args=\"--slave\")'|/Library/Frameworks/R.framework/Versions/3.2/Resources/bin/exec/R --no-save --slave"
This command outputs something like this:
Starting Rserve:
/Library/Frameworks/R.framework/Resources/bin/R CMD /Library/Frameworks/R.framework/Versions/3.2/Resources/library/Rserve/libs//Rserve --slave
Rserv started in daemon mode.
You can create a shell script (or a Windows script) and tell the OS to execute it during the startup process.
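Since the question is specifically about Windows, a hedged sketch of one way to do it there (my adaptation; the R path is hypothetical): put the launch command in a .bat file and register it to run at startup.
:: start_rserve.bat -- adjust the Rscript path to your R installation
"C:\Program Files\R\R-4.2.0\bin\Rscript.exe" -e "library(Rserve); Rserve(FALSE, args='--no-save --slave')"
A scheduled task with an 'At startup' trigger (via the Task Scheduler GUI or schtasks /Create /SC ONSTART) will then launch Rserve on every reboot.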

How to redirect local output to stdin over ssh to remotely execute a local script?

I am trying to remotely execute a Perl script that takes data from stdin, over ssh.
The tricky part is that I don't want to upload the script itself to the remote server.
The data that the remote script will read from stdin is produced by another Perl script run locally.
Let's assume the following:
my local script producing data is called cron_extract_section.pl
my local script that will be run remotely is called cron_update_section.pl
both scripts take one argument on the command line, a simple word
I manage to execute the script remotely if the script is present on the remote machine:
./cron_extract_section.pl ${SECTION} 2> /dev/null | ssh user@remote ~/path/to/remote/script/cron_update_section.pl ${SECTION}
I also know that I can run a script on a remote server without having to upload it first, using the following syntax:
ssh user@remote "perl - ${SECTION}" < ./cron_update_section.pl
What I can't figure out is how to feed the local script cron_update_section.pl over ssh to perl, AND also pipe the output of the local script cron_extract_section.pl to it.
I tried the following; the Perl script executes fine, but there is nothing to read from stdin:
./cron_extract_section.pl ${SECTION} 2> /dev/null | ssh user@remote perl - ${SECTION} < ./cron_update_section.pl
Do you know if it's possible to do so without modifying the scripts?
Use the DATA file handle. For example:
Local script to be run on the remote machine:
# script.pl -- reads the data appended after __DATA__
while (<DATA>) {
    print "# $_";   # echo each line read from the DATA handle
}
__DATA__
Then, run it as:
(cat script.pl && ./cron_extract_section.pl ${SECTION}) | ssh $host perl
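This works because perl, reading its program from standard input, stops compiling at the __DATA__ token, and everything concatenated after it remains on the same stream to be read via <DATA>. Adapted to the question's script names (my adaptation, assuming cron_update_section.pl ends with __DATA__ and reads <DATA> instead of <STDIN>):
(cat cron_update_section.pl && ./cron_extract_section.pl ${SECTION} 2> /dev/null) | ssh user@remote perl - ${SECTION}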
