RStudio: Rook does not work?

I would like to build a simple web server using Rook; however, I am getting strange errors when trying it in RStudio:
The code
library(Rook)
s <- Rhttpd$new()
s$start()
print(s)
returns the rather useless error
"Error in listenPort > 0 :
comparison (6) is possible only for atomic and list types".
When I try the same code in a plain R console, everything works, so I would like to understand why this happens and how I can fix it.
RStudio is version 0.99.484 and R is 3.2.2.

I've experienced the same thing.
TLDR: This pull request solves the problem: https://github.com/jeffreyhorner/Rook/pull/31
RStudio is treated differently, and Rook's port is taken from the tools:::httpdPort value. The problem is that in the current Rook master, tools:::httpdPort is assigned directly. It is a function, which is why it needs to be called first.
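A small illustration of the distinction (assuming R 3.2 or later, where tools:::httpdPort is a function rather than a plain variable, as the answer describes):
# Referencing the name only gives the function object; comparing it with a number
# fails with "comparison (..) is possible only for atomic and list types",
# which is exactly the error Rook's listenPort > 0 check produces.
port <- tools:::httpdPort
# port > 0   # would error: port is a function, not a number

# Calling the function returns the actual port (0 if the help server is not running),
# which is what the check needs.
port <- tools:::httpdPort()
port > 0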
If you want it fixed right now, without waiting for the merge into master, install devtools and install the package from my fork on GitHub:
install.packages("devtools")
library(devtools)
install_github("filipstachura/Rook")

Related

Why can't Rserve run?

I want to use R with Qlik, and I began by following this process: https://community.qlik.com/servlet/JiveServlet/previewBody/18785-102-1-25264/Installing%20R%20with%20Qlik%20Sense.pdf
I installed R (with the specific install path, but R-3.4.4) and Qlik (September 2018 version).
Then I followed the different steps, up to the 6th.
There, when I run Rserve.exe, I get an error: "Fatal error: unable to open the base package".
However, I checked that the base package is properly installed.
I couldn't find where the error comes from.
-> Is it a problem with the R or Qlik version?
-> Or is there something else to take care of?
Thank you very much for your advice!
I finally found the answer: create an environment variable, call it R_HOME, and set its value to the path to R (C:\R\R-3.4.4).
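If you prefer to set the variable from R rather than through the Windows System Properties dialog, something along these lines should work (setx is the standard Windows command for persisting a user-level variable; a new shell is needed before Rserve.exe sees it, and the path below is simply the one from the answer):
system('setx R_HOME "C:\\R\\R-3.4.4"')   # persists for future processes, not the current one
Sys.setenv(R_HOME = "C:\\R\\R-3.4.4")    # also set it for the current R session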

R Script running successfully on local machine, not on EC2 instance

I have an R script (an R plumber API) that I have deployed to an EC2 instance and am managing with pm2, and I am running into a frustrating issue. I have pinpointed the exact location of the error, and am hoping to understand it a bit better.
When I run the script on my local machine (RStudio on my Mac) it works okay. When I run the script using Rscript myrfile.R from the EC2 instance command line, it breaks.
The line of code that breaks the script on the EC2 instance, along with the error it produces, is:
my_df <- my_df %>%
  dplyr::mutate(AwayScore = ifelse(dplyr::row_number() == 1, 0, AwayScore),
                HomeScore = ifelse(dplyr::row_number() == 1, 0, HomeScore))
# with the following error
<Rcpp::eval_error in mutate_impl(.data, dots): Evaluation error: argument "x" is missing, with no default.>
I am 100% sure that dplyr is installed on the EC2 instance, since my script uses it throughout. I am also 100% sure that the my_df data frame has the columns AwayScore and HomeScore, and that my_df doesn't have any other issues.
I am left to assume that this error is specifically due to the dplyr::row_number() call, which the EC2 instance does not seem to be able to handle, although I am not certain of this.
Any thoughts / help / things I should try / etc. would be greatly appreciated on this, thanks!!
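One quick thing worth checking before digging deeper: a dplyr version mismatch between the Mac and the EC2 instance is a plausible (though unconfirmed here) explanation for row_number() not being resolved inside mutate() on the server.
packageVersion("dplyr")   # run on both machines and compare
sessionInfo()             # shows the R version and every attached package version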
While I appreciate that you have avoided the problem by not requiring the library, at some point you may find you want to run code in a similar way where loading a library is necessary.
I ran into a similar problem using Rscript: it could not find the libraries I had installed. It is possible to use R.exe instead of Rscript.exe, but this causes other headaches. I found that the environment used by Rscript doesn't contain the R_LIBS_USER path.
If you add the following code at the top of your R script, it should work:
p <- "/directory/path/of/local/R/packages"   # the folder where your packages are installed
.libPaths(c(p, .libPaths()))
substituting the folder path where your libraries are found on the machine. This is the path that Sys.getenv("R_LIBS_USER") returns when running R in the GUI.
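To see the difference for yourself, the following can be run both in the RStudio console and via Rscript on the server; the output will typically differ between the two environments:
Sys.getenv("R_LIBS_USER")   # often empty, or missing from the search path, under Rscript/pm2
.libPaths()                 # the library trees this R process will actually search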
It was easy enough for me to simply change my code to the following:
if (is.na(my_df$AwayScore[1])) { my_df$AwayScore[1] <- 0 }
if (is.na(my_df$HomeScore[1])) { my_df$HomeScore[1] <- 0 }
... so I will likely not waste too much more time trying to debug this.

Running R code on linux in parallel on computing cluster

I've recently moved my Windows R code to a Linux installation for running DEoptim on a function. On my Windows system it all worked fine using:
ans <- DEoptim(Calibrate, lower, upper,
               DEoptim.control(trace = TRUE, parallelType = 1, parVar = parVarnames3,
                               packages = c("hydromad", "maptools", "compiler", "tcltk", "raster")))
where the function 'Calibrate' consists of multiple functions. On the Windows system I simply downloaded the packages I needed into the R library. The option parallelType=1 ran the code across several cores.
However, now I want to run this code on a Linux-based computing cluster. The function 'Calibrate' works fine on its own, as does DEoptim if I run the code on one core. However, when I specify parallelType=1, the code fails and returns:
Error in checkForRemoteErrors(lapply(cl, recvResult)) :
7 nodes produced errors; first error: there is no package called ‘raster’
This error is reproduced for whatever package I try to load, even though the
library(raster)
command works fine and 'raster' is clearly listed as installed when I check the available libraries with:
library()
So my gut feeling is that even though all the packages and libraries load fine on their own, I have installed them into a personal library and the packages element of DEoptim.control is looking somewhere else. An example of how the packages were installed is below:
install.packages("/home/antony/R/Pkges/raster_2.4-15.tar.gz", repos = NULL, type = "source", lib = "/home/antony/R/library")
I also set the lib paths option as below:
.libPaths('/home/antony/R/library')
Has anybody any idea of what I am doing wrong, and how to set the 'packages' option in DEoptim.control so I can run DEoptim across multiple cores in parallel?
Many thanks, Antony
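A hedged suggestion rather than a confirmed fix: the worker R sessions spawned by parallelType = 1 are fresh processes, so they only search the default library paths unless told otherwise. One way to make a personal library visible to every node is to declare it in ~/.Renviron, which every new R process reads at startup (this assumes the nodes share that home directory):
# Add this line to ~/.Renviron on the cluster:
#   R_LIBS=/home/antony/R/library
# Afterwards, each worker's library search path should include the personal library,
# so the packages listed in DEoptim.control(packages = ...) can be loaded on all nodes.
.libPaths()   # run in a fresh session / worker to confirm the path is now present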

knitr execution halted because of http_proxy?

I had to mess around with my R setup to get my secure gateway to connect to R mirrors. I did this by adding http_proxy=http://servername both to the Properties tab of the R icon and by doing:
Sys.setenv(http_proxy=http://servername)
in RStudio. I could get this to work in R, but not in RStudio. Anyway, no problem: I can install packages from within R, point RStudio at that R installation, and load the package. Good.
So I try to create a (default) .Rmd file in RStudio, and when I run knitr, I get the following:
Error: 24:17: unexpected '/'
24: http_proxy=http:/
^
Execution halted
I can only imagine I am getting this because I messed around setting up the proxy. Would this make sense?
How do I clear/unset any proxy server in RStudio?
Quote the string:
Sys.setenv(http_proxy = "http://servername")
If you're on Windows, configure your proxy in Internet Explorer and use the setInternet2(TRUE) function in your script to pick up Internet Explorer's proxy settings.
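To address the clear/unset part of the question: the variable can be removed from the current session as below; also delete any http_proxy line that was added to .Renviron or to the shortcut properties so it does not come back on restart.
Sys.unsetenv("http_proxy")
Sys.getenv("http_proxy")   # should now return ""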

R packages inaccessible via "opencpu knitr app"

I started playing with OpenCPU a couple of weeks back and am getting hooked on it. I was able to successfully install the "knitr" and "opencpu.demo" apps. The issue I am running into is when I try to invoke the R functions I packaged in a new R package and call them from within the knitr app: I get a message saying no such package exists. I ran installed.packages(lib.loc="/usr/lib/R/library") from an R shell and from the knitr-app interface, and my package indeed shows up in the former but not in the latter case. No idea what's going on here! I would greatly appreciate it if anyone can answer this.
Comments:
When I run find.package("DummyPkg") through the /R/pub/base/identity/json API, I do get back
[
"/usr/lib/R/library/DummyPkg"
]
However, the same query from within the knitr-app web page returns:
# write R code here
find.package("DummyPkg")
## Error: there is no package called 'DummyPkg'
The default HTML page for the knitr app has links to the opencpu.org server, so the POST was going to the public server instead of my server; no wonder my packages weren't showing up!
