R source file from RConnection (JRI) "could not find function"

I'm trying to source an R script file from an org.rosuda.REngine.Rserve.RConnection and then call a function from that script, but I'm getting "Error: could not find function "main"".
Start Rserve from the terminal:
> require(Rserve)
Loading required package: Rserve
> Rserve()
Starting Rserve...
"C:\Users\slenzi\DOCUME~1\R\WIN-LI~1\3.3\Rserve\libs\x64\Rserve.exe"
> Rserve: Ok, ready to answer queries.
Error: could not find function "main"
External script: rdbcTest1.R
require(RJDBC)

main <- function() {
    # dbDriverClass, dbDriverPath, dbUrl, dbUser, dbPwd and dbQuery are free
    # variables, expected to be assigned in the session before main() is called
    jdbcDriver <- JDBC(driverClass = dbDriverClass, classPath = dbDriverPath)
    jdbcConnection <- dbConnect(jdbcDriver, dbUrl, dbUser, dbPwd)
    dbResult <- dbGetQuery(jdbcConnection, dbQuery)
    dbDisconnect(jdbcConnection)
    return(dbResult)
}
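For reference, the script leaves dbDriverClass, dbDriverPath, dbUrl, dbUser, dbPwd and dbQuery as free variables. Exercised directly in an R session (the working terminal test described below), it would look something like this sketch, with the values copied from the question:

dbDriverClass <- "oracle.jdbc.driver.OracleDriver"
dbDriverPath  <- "C:/temp/ojdbc6.jar"
dbUrl   <- "jdbc:oracle:thin:@myhost:1511:ecogtst"
dbUser  <- "foo"
dbPwd   <- "******"   # redacted in the question
dbQuery <- "SELECT count(*) FROM prs.members"
source("C:/temp/rdbcTest1.R")
main()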
Java Code
import org.rosuda.REngine.REXP;
import org.rosuda.REngine.REXPMismatchException;
import org.rosuda.REngine.REngine;
import org.rosuda.REngine.REngineException;
import org.rosuda.REngine.RList;
import org.rosuda.REngine.Rserve.RConnection;
import org.rosuda.REngine.Rserve.RserveException;
REXP result = null;
RConnection c = null;
try {
    c = new RConnection();
    String driverPath = "C:/temp/ojdbc6.jar";
    String scriptPath = "C:/temp/rdbcTest1.R";
    c.assign("dbUser", "foo");
    c.assign("dbPwd", "******"); // redacted
    c.assign("dbUrl", "jdbc:oracle:thin:@myhost:1511:ecogtst");
    c.assign("dbDriverClass", "oracle.jdbc.driver.OracleDriver");
    c.assign("dbDriverPath", driverPath);
    // assign query to execute
    c.assign("dbQuery", "SELECT count(*) FROM prs.members");
    String evalSource = String.format("try(source(\"%s\", local=TRUE), silent=TRUE)", scriptPath);
    logger.info("evaluating => " + evalSource);
    c.eval("evalSource");
    // debug
    result = c.eval("ls()");
    logger.info("REXP debug => " + result.toDebugString());
    result = c.eval("main()");
} catch (RserveException e) {
    logger.error("error running script test, " + e.getMessage());
} finally {
    if (c != null) {
        c.close();
    }
}
Output
evaluating => try(source("C:/temp/rdbcTest1.R", local=TRUE), silent=TRUE)
REXP debug => org.rosuda.REngine.REXPString#61c6bc05[6]{"dbDriverClass","dbDriverPath","dbPwd","dbQuery","dbUrl","dbUser"}
Error: could not find function "main"
error running oracle script test, eval failed, request status: error code: 127
Running the script directly in an R terminal works fine. When run from RConnection I can see the values of the variables I assign (c.assign(...)), but not the main() function from the sourced script.
What am I missing with regard to sourcing a script? How can I access a function from the sourced script?
Thanks.
** UPDATE **
I got the following to work, but if anyone has any idea why source() doesn't seem to work in my example above, please let me know!
RConnection c = ...
REXP x = c.parseAndEval("try(eval(parse(file=\"C:/temp/rdbcTest3.R\")), silent=TRUE)");
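A hedged note on the likely cause: c.eval("evalSource") sends the literal string "evalSource" to R, which then tries to look up an undefined symbol of that name; the command built with String.format is never executed, so the script is never sourced. The call presumably needs to pass the Java variable itself, c.eval(evalSource). It may also be safer to drop local=TRUE, since the default (local=FALSE) sources the file into the user's workspace (.GlobalEnv), which is where a later c.eval("main()") will look. On the R side the difference looks like this (a sketch; the path is the one from the question):

# what c.eval("evalSource") asks R to evaluate: an undefined symbol
evalSource
# Error: object 'evalSource' not found

# what c.eval(evalSource) asks R to evaluate: the actual command
try(source("C:/temp/rdbcTest1.R"), silent = TRUE)  # default local = FALSE sources into .GlobalEnv
exists("main")   # TRUE: main() is now visible to a later c.eval("main()")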

Related

bus error on usage of rusqlite with spatialite extension

I'm seeing a bus error on cargo run when attempting to load the spatialite extension with rusqlite:
Finished dev [unoptimized + debuginfo] target(s) in 1.19s
Running `target/debug/rust-spatialite-example`
[1] 33253 bus error cargo run --verbose
My suspicion is that there's a mismatch between the SQLite version and SpatiaLite, and that they need to be built together rather than using the bundled feature of rusqlite, though it seems like that would result in a different error?
Here's how things are set up:
Cargo.toml
[package]
name = "rust-spatialite-example"
version = "0.0.0"
edition = "2021"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
rusqlite = { version = "0.28.0", features = ["load_extension", "bundled"] }
init.sql
CREATE TABLE place (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);
SELECT AddGeometryColumn('place', 'geom', 4326, 'POINT', 'XY', 0);
SELECT CreateSpatialIndex('place', 'geom');
main.rs
use rusqlite::{Connection, Result, LoadExtensionGuard};

#[derive(Debug)]
struct Place {
    id: i32,
    name: String,
    geom: String,
}

fn load_spatialite(conn: &Connection) -> Result<()> {
    unsafe {
        let _guard = LoadExtensionGuard::new(conn)?;
        conn.load_extension("/opt/homebrew/Cellar/libspatialite/5.0.1_2/lib/mod_spatialite", None)
    }
}

fn main() -> Result<()> {
    let conn = Connection::open("./geo.db")?;
    load_spatialite(&conn)?;
    // ... sql statements that aren't executed
    Ok(())
}
Running:
cat init.sql | spatialite geo.db
cargo run
The mod_spatialite path is correct (there's an expected SqliteFailure error when that path is wrong). I tried explicitly setting sqlite3_modspatialite_init as the entry point and the behavior stayed the same.

R, httr, and an unstoppable error message

In summary, I'm finding that when running httr::POST against a plumber API inside R.utils::withTimeout, the POST throws an error message which can't be suppressed. The error message is:
Error in .Call(R_curl_fetch_memory, enc2utf8(url), handle, nonblocking) : reached elapsed time limit
Things I've tried:
suppressMessages() / suppressWarnings()
tryCatch with no error response
using sinks
I can't think of any other way of stopping this error. The code carries on running, but it results in users of the API getting spammed with error messages when checking for an available port to call. Any ideas welcome.
Reproducible example (note this uses the RStudio jobs package and a separate plumber file, but I've tried without the jobs package and the issue persists, so it's not connected to that):
plumber.R:
#* quick ping
#* @post /ping
function() {
    list(msg = "you got here!")
}

#* slow 10s call
#* @post /slowfxn
function() {
    Sys.sleep(10)
    1
}
code which calls and produces errors:
# run the plumber file in a separate job:
r <- plumber::plumb("errortest/plumber.R")
job::job({r$run(swagger = F, port = 1111)})

# this function will throw the error if no response comes within 3s:
call_with_timeout = function() {
    R.utils::withTimeout(
        rawToChar(httr::POST("http://127.0.0.1:1111/ping")$content),
        timeout = 3, onTimeout = "silent")
}

# now call the fxn (again in a job) which takes 10s to run:
job::job({httr::POST(url = "http://127.0.0.1:1111/slowfxn")})

# wait a second for that job to be initiated:
Sys.sleep(1)

# while that's running, try to call the quick fxn, with a 3-second timeout,
# wrapped in a tryCatch to suppress:
tryCatch({
    call_with_timeout()
},
error = function(x) {},
TimeoutException = function(x) {},
warning = function(x) {}
)

suppressMessages(suppressWarnings(call_with_timeout()))

# not even a sink can stop it!
sink("delete.txt")
call_with_timeout()
sink(NULL)
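One possible workaround, sketched here rather than taken from the thread: let libcurl enforce the deadline with httr::timeout() instead of interrupting the blocked .Call from the outside with R.utils::withTimeout. curl's own timeout surfaces as an ordinary R error, which tryCatch can swallow cleanly (port and endpoint as in the example above):

call_with_timeout <- function() {
    tryCatch(
        rawToChar(httr::POST("http://127.0.0.1:1111/ping", httr::timeout(3))$content),
        error = function(e) NULL   # curl's "Timeout was reached" error lands here
    )
}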

Quit a Plumber API once a condition is met

I am trying to run a Plumber API inline to receive an input; once the proper input is received and a specified condition is met, the input is returned to the global environment and the API closes itself, so that the script can continue to run.
I've specified a condition within a @get endpoint that calls quit(), stop(), etc., none of which successfully shuts down the API.
I've attempted to run the API in parallel using future, so that the parent script can close the Plumber API.
It appears that there isn't actually a method on the Plumber API class object to close the API, and the API can't be closed from within itself.
I've been through the extended documentation, SO, and the GitHub issues in search of a solution. The only semi-relevant suggestion is to use R.utils::withTimeout to create a time-bounded timeout. However, this method is also unable to close the API.
A simple use case:
Main Script:
library(plumber)
code_api <- plumber::plumb("code.R")
code_api$run(port = 8000)
code.R
#' @get /<code>
function(code) {
    print(code)
    if (nchar(code) == 3) {
        assign("code", code, envir = globalenv())
        quit()
    }
    return(code)
}

#' @get /exit
function(exit) {
    stop()
}
The input is successfully returned to the global environment, but the API does not shut down afterward, nor after calling the /exit endpoint.
Any ideas on how to accomplish this?
You could look at Irène Steve's article Iterative testing with plumber (Dec 23, 2018), which uses:
trml <- rstudioapi::terminalCreate()
rstudioapi::terminalKill(trml)
An excerpt from her article (the 2nd of 3 versions):
.state <- new.env(parent = emptyenv()) # create .state when package is first loaded

start_plumber <- function(path, port) {
    trml <- rstudioapi::terminalCreate(show = FALSE)
    rstudioapi::terminalSend(trml, "R\n")
    Sys.sleep(2)
    cmd <- sprintf('plumber::plumb("%s")$run(port = %s)\n', path, port)
    rstudioapi::terminalSend(trml, cmd)
    .state[["trml"]] <- trml # store terminal name
    invisible(trml)
}

kill_plumber <- function() {
    rstudioapi::terminalKill(.state[["trml"]]) # access terminal name
}
Running Plumber in a terminal might work in some cases, but as I needed access to the R session (for insertText) I had to come up with a different approach. While not ideal, the following solution worked:

# plumber.R

#* Insert
#* @param msg The msg to insert at the cursor location
#* @post /insert
function(msg = "") {
    rstudioapi::insertText(paste0(msg))
    stop_plumber(Sys.getpid())
}

.state <- new.env(parent = emptyenv()) # create .state when package is first loaded

stop_plumber <- function(pid) {
    trml <- rstudioapi::terminalCreate(show = FALSE)
    Sys.sleep(2) # wait for the terminal to initialize
    # Wait a bit for Plumber to flush its buffers, then send a SIGINT to the
    # R session process to terminate Plumber
    cmd <- sprintf("sleep 2 && kill -SIGINT %s\n", pid)
    rstudioapi::terminalSend(trml, cmd)
    .state[["trml"]] <- trml # store terminal name
    invisible(trml)
    Sys.sleep(2) # wait for Plumber to terminate, then kill the terminal
    rstudioapi::terminalKill(.state[["trml"]]) # access terminal name
}
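If the RStudio terminal API isn't an option, a similar shape can be sketched with a background R process via callr (callr::r_bg(), $is_alive() and $kill() are real callr functions, but the file name, port, and stop condition here are illustrative assumptions):

library(callr)

# start the API in a throwaway background R session
px <- r_bg(function() plumber::plumb("code.R")$run(port = 8000))

# ... interact with the API from the main session ...

# once the stop condition is met, kill the worker outright
if (px$is_alive()) px$kill()

Note that a background process cannot assign into the main session's global environment, so with this shape the endpoint would have to hand the code back in its HTTP response instead.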

Call Java from R

I want to execute Java code from R. I used the rJava package and was able to execute simple Java code, such as creating an object or printing to the screen.
require("rJava")
.jinit()
test <- new(J("java.lang.String"), "Hello World!")
However, what I want to do is send a dataframe from R (or a CSV file), execute Java code on it, then return the output file to R. At the same time, it is difficult in my case to call the R code from Java, as I want to process the CSV file first in R, then apply the Java code to it, and return the result to R to complete the analysis.
I'd go the following way here:
1. Process the CSV file inside R.
2. Save the file somewhere and make sure you know its explicit location (e.g. /home/user/some_csv_file.csv).
3. Create an adapter class in Java that has a method String processFile(String file).
4. Inside processFile, read the file, pass it to your Java code, and do the Java-based processing.
5. Store the output file somewhere and return its location.
6. Back in R, get the result of the processFile method and do further processing in R.
At least, that's what I'd do as a first draft of a solution for your problem.
Update
We need a Java file:
// sample/Adapter.java
package sample;

public class Adapter {
    public String processFile(String file) {
        System.out.println("I am processing file: " + file);
        return "new_file_location.csv";
    }

    public static void main(String[] arg) {
        Adapter adp = new Adapter();
        System.out.println("Result: " + adp.processFile("initial_file.csv"));
    }
}
We have to compile it
> mkdir target
> javac -d target sample/Adapter.java
> java -cp target sample.Adapter
I am processing file: initial_file.csv
Result: new_file_location.csv
> export CLASSPATH=`pwd`/target
> R
We have to call it from R
> library(rJava)
> .jinit()
> obj <- .jnew("sample.Adapter")
> s <- .jcall(obj, returnSig="Ljava/lang/String;", method="processFile", 'initial_file')
I am processing file: initial_file
> s
[1] "new_file_location.csv"
And your source directory looks like this:
.
├── sample
│   └── Adapter.java
└── target
    └── sample
        └── Adapter.class
In processFile you can do whatever you like and call your existing Java code.
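To round-trip a dataframe along these lines, the R side could look like the following sketch (the paths and my_df are illustrative assumptions; in rJava's .jcall, "S" is shorthand for a java.lang.String return):

library(rJava)
.jinit(classpath = "target")

# hand the data to Java as a CSV file on disk
write.csv(my_df, "/home/user/input.csv", row.names = FALSE)

adapter <- .jnew("sample.Adapter")
out_path <- .jcall(adapter, "S", "processFile", "/home/user/input.csv")

# read Java's output back into R and continue the analysis
result_df <- read.csv(out_path)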

Deployed R app with Shiny crashes with 'could not find function "httpdPort"'

I have an application built with Shiny (a tutorial, where ui.R and server.R are taken from here: http://shiny.rstudio.com/tutorial/lesson1/).
I have these two files in a shiny-frontend folder, and if I runApp("shiny-frontend") locally in RStudio, everything works great and I see the tutorial in my browser.
Now I want the same app to be put into Bluemix via Cloud Foundry. I'm using this as a tutorial: http://www.ibm.com/developerworks/library/ba-rtwitter-app/ but struggling with an error.
I have a start.r file which I run as R -f ./start.r --gui-none --no-save. I'm using the https://github.com/virtualstaticvoid/heroku-buildpack-r buildpack.
My start.r looks like this (taken from the Bluemix tutorial with very minor modifications):
library(shiny)

if (Sys.getenv('VCAP_APP_PORT') == "") {
    print("Running Shiny")
    runApp("shiny-frontend")
} else {
    # In case we're on Cloud Foundry, run this:
    print('running on CF')
    # Start a Rook server during the CF startup phase; after 60 seconds start the actual Shiny server
    library(Rook)
    myPort <- as.numeric(Sys.getenv('VCAP_APP_PORT'))
    myInterface <- Sys.getenv('VCAP_APP_HOST')
    status <- -1
    # R 2.15.1 uses .Internal, but the next release of R will use a .Call.
    # Either way it starts the web server.
    if (as.integer(R.version[["svn rev"]]) > 59600) {
        status <- .Call(tools:::startHTTPD, myInterface, myPort)
    } else {
        status <- .Internal(startHTTPD(myInterface, myPort))
    }
    if (status == 0) {
        unlockBinding("httpdPort", environment(tools:::startDynamicHelp))
        assign("httpdPort", myPort, environment(tools:::startDynamicHelp))
        s <- Rhttpd$new()
        s$listenAddr <- myInterface
        s$listenPort <- myPort
        s$print()
        Sys.sleep(60)
        s$stop()
    }
    # run the Shiny server
    sink(stderr())
    options(bitmapType = 'cairo')
    getOption("bitmapType")
    print("test")
    write("prints to stderr", stderr())
    write("prints to stdout", stdout())
    write(myPort, stdout())
    runApp('shiny-frontend', port = myPort, host = "0.0.0.0", launch.browser = F)
}
And my init.r looks like this:
install.packages("shiny", clean=T)
install.packages("Rook", clean=T)
Then everything deploys correctly, but when I try to hit the route, I see an error in the log:
* ERR Error in startDynamicHelp(FALSE) : could not find function "httpdPort"
* ERR Calls: <Anonymous> -> startDynamicHelp
* ERR Execution halted
I also noticed that the assigned port is different every time, which is weird, and the route in the Bluemix dashboard does not mention it. But I output the port to the log and use that number.
Also, the way I'm doing it seems a bit too complicated, so if anybody could suggest an easier way, I'd appreciate it.
It took me a while to understand that this error is thrown by R because it cannot find the function (not the value) httpdPort. Instead of binding httpdPort to a function, you are binding it to a value. The line s$stop() is the one causing trouble: it calls startDynamicHelp, which assumes that httpdPort is a function defined in the tools environment.
To fix this issue you can change the block if (status == 0) {...} in your code to:
if (status == 0) {
    # a closure that returns its stored value, and updates it when called with an argument
    getSettable <- function(default) {
        function(obj = NA) {
            if (!is.na(obj)) { default <<- obj }
            default
        }
    }
    myHttpdPort <- getSettable(myPort)
    unlockBinding("httpdPort", environment(tools:::startDynamicHelp))
    assign("httpdPort", myHttpdPort, environment(tools:::startDynamicHelp))
    s <- Rhttpd$new()
    s$listenAddr <- myInterface
    s$listenPort <- myPort
    s$print()
    Sys.sleep(60)
    s$stop()
}
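A minimal standalone illustration of the lookup rule behind the error: when R evaluates a call like httpdPort(FALSE), it searches for a binding that is a function, so a plain value bound to that name still yields "could not find function" (sketch, using a throwaway environment):

e <- new.env()

# bind a value: calling it fails with R's function-lookup error
assign("httpdPort", 8080, envir = e)
try(evalq(httpdPort(FALSE), e))   # Error: could not find function "httpdPort"

# bind a closure instead: the call now succeeds
assign("httpdPort", function(obj = NA) 8080, envir = e)
evalq(httpdPort(FALSE), e)        # [1] 8080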
