How can I test whether an RJDBC connection is working, with a time limit?
Here is a toy example:
library("RJDBC")

testConnection <- function() {
  tryCatch({
    conn <- RJDBC::dbConnect(JDBC("com.mysql.jdbc.Driver", "driverPath"),
                             "connString", "user", "password")
    dbDisconnect(conn)
    return("Connection: OK")
  },
  error = function(e) {
    return("Connection: Failed")
  })
}

testConnection()
But if there is no network connection, I have to wait 30 seconds for the result.
I need to test many connections, so I would like to lower the limit to 2 seconds.
How can I set an evaluation time limit?
I tried setTimeLimit without success.
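A likely reason setTimeLimit() fails here is that it only interrupts evaluation at checkpoints in R code, while the 30-second wait happens inside the Java JDBC driver, outside R's reach. A minimal base-R sketch of the pattern (the same one R.utils::withTimeout uses internally), with that caveat spelled out; a driver-side setting such as appending connectTimeout=2000 to the JDBC URL (if your driver supports it) may be the more reliable fix:

```r
# setTimeLimit() aborts evaluation once the elapsed limit is exceeded, but it
# only checks at interrupt points in R code -- it cannot cut short a blocking
# call inside native or Java code such as a JDBC connect.
withTimeLimit <- function(expr, seconds = 2) {
  on.exit(setTimeLimit(elapsed = Inf), add = TRUE)  # always clear the limit
  setTimeLimit(elapsed = seconds, transient = TRUE)
  tryCatch({
    force(expr)  # evaluate the lazily passed expression under the limit
    "Connection: OK"
  },
  error = function(e) "Connection: Failed")
}

# A pure-R busy loop IS interruptible, so here the limit fires as expected:
res <- withTimeLimit({ repeat sqrt(1:1e5) }, seconds = 1)
```

The function names here are illustrative, not an existing API; wrapping the dbConnect() call in withTimeLimit() will only help if the driver yields control back to R during the wait.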
I have an R script which consumes a feed periodically and then produces signals or actions that need to be sent to a TCP server implemented in another language. I have handled the signal generation part in this fashion:
sendViaSocket <- function(object, socket) {
  byteLen <- as.integer(object$byteLen())
  writeBin(byteLen, socket, endian = "little", useBytes = TRUE)
  writeBin(object$toRaw(), socket, useBytes = TRUE)
}

readFromSocket <- function(socket) {
  packetLen <- readBin(socket, integer(), endian = "little")
  rawVec <- readBin(socket, "raw", n = packetLen)
  return(parse(messages.MyMessage, rawVec))
}
getOHLC <- function() {
  # feed reading function
  ...
}

generateSignal <- function(candleStick) {
  # signal generating function
  ...
}

performAction <- function(message) {
  # Action on server message
}
clientLoop <- function() {
  socket <- socketConnection(
    host = "localhost",
    port = 42000,
    blocking = TRUE,
    server = FALSE,
    open = "a+b"
  )
  while (TRUE) {
    candleStick <- getOHLC()
    signal <- generateSignal(candleStick)
    sendViaSocket(signal, socket)
    # Reading any message from the server
    object <- readFromSocket(socket)
    Sys.sleep(10)
  }
}

clientLoop()
I have deliberately made the socket connection blocking because I want to read all the data available on the connection in one go, and send each message in one go.
Currently I read server notifications from the client loop itself, but if the server does not send anything, the client loop blocks.
What I want is a callback that fires when the socket has received data, so the data can be processed in the callback without blocking the client loop. Something along the following lines:
onServerDataReceived <- function(serverMessage) {
  # Perform related action on receiving server message
}

socket$onDataReceived(callback = onServerDataReceived)
Is there a way to use a client socket connection with a callback that runs when the server actually sends a message, or to do this in a reactive/asynchronous way?
Edit: As a bonus, I would also like to avoid while (TRUE) {...} and instead use an API or package that provides a better way to call the clientLoop() function periodically every n seconds (minimum interval is 30 seconds).
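Base R connections do not support per-connection event callbacks, but socketSelect() provides a non-blocking readiness check, so the loop can poll the socket once per iteration instead of blocking on a read. A sketch, reusing readFromSocket() and performAction() from the question:

```r
# socketSelect() reports whether a connection has data waiting. With
# timeout = 0 it returns immediately, so calling this once per loop
# iteration never stalls the client loop on a silent server.
pollServer <- function(socket) {
  ready <- socketSelect(list(socket), write = FALSE, timeout = 0)
  if (ready) {
    serverMessage <- readFromSocket(socket)  # from the question above
    performAction(serverMessage)             # from the question above
  }
  invisible(ready)
}
```

Inside clientLoop(), the blocking readFromSocket() call could be replaced by pollServer(socket). For the periodic-scheduling bonus, the later package (an extra dependency, not part of the question) offers later::later(callback, delay) to schedule a function without a busy while loop.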
I am using the spotifyr library and want to find audio features for multiple tracks. For example, I can find the audio features of a specific song using its id:
analysis2 <- get_track_audio_features("2xLMifQCjDGFmkHkpNLD9h",
                                      authorization = get_spotify_access_token())
Yesterday I wrote the function below, which takes all the tracks in a dataframe, finds the audio features for each of them, and stores them in a list; it was working fine.
get_analysis <- function(track_id) {
  get_track_audio_features(track_id,
                           authorization = get_spotify_access_token())
}

tracks_list <- lapply(all_tracks$track.id, get_analysis)
Now I am getting an error saying Request failed [503] and Error in get_track_audio_features(track_id, authorization = get_spotify_access_token()) : Service Unavailable (HTTP 503).
I am still able to find the audio features of a specific song so I am not sure which service is unavailable.
I suspect you are reaching a song in your data for which the response is denied by Spotify. You could add an error-catching mechanism to see which one it is:
get_analysis <- function(track_id) {
  analysis <- tryCatch(
    expr = {
      get_track_audio_features(track_id, authorization = get_spotify_access_token())
    },
    error = function(e) {
      print(track_id)  # report the offending track id
    }
  )
  return(analysis)
}

tracks_list <- lapply(all_tracks$track.id, get_analysis)
I looked at the package's source code and didn't see any sneaky rate-limiting issues, and the Web API page lists error 503 as a generic error that resolves after waiting (https://developer.spotify.com/documentation/web-api/). So you could also try adding a 10-minute wait (I couldn't find the exact required duration on Spotify's website):
get_analysis <- function(track_id) {
  tryCatch(
    expr = {
      get_track_audio_features(track_id, authorization = get_spotify_access_token())
    },
    error = function(e) {
      print(track_id)  # report the offending track id
      e                # return the condition so the caller can detect the failure
    }
  )
}

wait.function <- function() {
  Sys.sleep(600)  # wait 10 minutes
}
get_analysis_master <- function(all_tracks) {
  k <- 1
  tracks_list <- list()
  for (track.id in all_tracks$track.id) {
    output <- get_analysis(track.id)
    if (inherits(output, "error")) {
      wait.function()                  # back off, then retry this track once
      output <- get_analysis(track.id)
    }
    if (!inherits(output, "error")) {
      tracks_list[[k]] <- output
      k <- k + 1
    }
  }
  return(tracks_list)
}

tracks_list <- get_analysis_master(all_tracks)
I have a socket server using IO::Socket::Async and Redis::Async for message publishing. Whenever a message is received by the server, the script translates it and generates an acknowledgement message to send back to the sender, so that the sender will send subsequent messages. Since translating the message is quite expensive, the script runs that portion under a 'start' block.
However, I noticed that the MoarVM process is eating my RAM as the script runs. Any thoughts on where I should look to solve this issue? Thanks!
https://pastebin.com/ySsQsMFH
use v6;
use Data::Dump;
use experimental :pack;
use JSON::Tiny;
use Redis::Async;
constant $SOCKET_PORT = 7000;
constant $SOCKET_ADDR = '0.0.0.0';
constant $REDIS_PORT = 6379;
constant $REDIS_ADDR = '127.0.0.1';
constant $REDIS_AUTH = 'xxxxxxxx';
constant $IDLING_PERIOD_MIN = 180 - 2; # 3 minutes - 2 secs
constant $CACHE_EXPIRE_IN = 86400; # 24 hours
# create socket
my $socket = IO::Socket::Async.listen($SOCKET_ADDR, $SOCKET_PORT);
# connect to Redis ...
my $redis;
try {
    my $error-code = "110";
    $redis = Redis::Async.new("$REDIS_ADDR:$REDIS_PORT");
    $redis.auth($REDIS_AUTH);
    CATCH {
        default {
            say "Error $error-code ", .^name, ': Failed to initiate connection to Redis';
            exit;
        }
    }
}
# react whenever there is connection
react {
    whenever $socket -> $conn {
        # do something when the connection wants to talk
        whenever $conn.Supply(:bin) {
            my $data = .decode('utf-8');  # decode once instead of per use
            # only process if data length is either 108 or 116
            if $data.chars == 108 or $data.chars == 116 {
                say "Received --> $data";
                my $ack = generateAck($data); # generate ack based on received data
                if $ack {
                    $conn.print: $ack;
                } else {
                    say "No ack. Received data may be corrupted. Closing connection";
                    $conn.close;
                }
            }
        }
    }
    CATCH {
        default {
            say .^name, ': ', .Str;
            say "handled in $?LINE";
        }
    }
}
### other subroutines down here ###
The issue was Redis::Async. Jonathon Stowe has since fixed the Redis module, so I'm now using the plain Redis module with no issues.
I'm currently running a geocoding function (using the google_places function in the googleway package). The function runs for a while (I have almost 3k locations), then throws the following error:
Error in open.connection(con, "rb") :
schannel: next InitializeSecurityContext failed: SEC_E_ILLEGAL_MESSAGE (0x80090326) - This error usually occurs when a fatal SSL/TLS alert is received (e.g. handshake failed). More detail may be available in the Windows System event log.
Having consulted the system event log, I found the following information:
The machine-default permission settings do not grant Local Activation permission for the COM Server application with CLSID
{9BA05972-F6A8-11CF-A442-00A0C90A8F39}
and APPID
{9BA05972-F6A8-11CF-A442-00A0C90A8F39}
I'm not really sure what to do with this information. From my limited knowledge, it appears this is some sort of security/firewall issue. How should I go about giving R the permissions needed to run this function?
I am running Windows 10 with Windows Defender as antivirus/firewall. For reference, this is the function I am using for geocoding:
metro.locater <- function(lat, lon) {
  library(googleway)
  # putting latitude and longitude into the same vector
  latlon <- c(lat, lon)
  # getting places result
  res <- google_places(location = latlon,
                       place_type = "subway_station",
                       radius = 50000,
                       rankby = "distance",
                       key = "myKey")
  # condition handling
  if (res$status == "OK") {
    closest <- res$results[1:3, ]
    return(closest)
  } else {
    return(res$status)
  }
}
I was able to fix the issue with an adverb I'd used with another geocoding function: it attempts to run the wrapped function up to five times when it fails to produce a result. Given that this worked, the error was likely transient rather than systemic.
The adverb I used:
safely <- function(fn, ..., max_attempts = 5) {
  function(...) {
    this_env <- environment()
    for (i in seq_len(max_attempts)) {
      ok <- tryCatch({
        assign("result", fn(...), envir = this_env)
        TRUE
      },
      error = function(e) {
        FALSE
      })
      if (ok) {
        return(this_env$result)
      }
    }
    msg <- sprintf(
      "%s failed after %d tries; returning NULL.",
      deparse(match.call()),
      max_attempts
    )
    warning(msg)
    NULL
  }
}
Taken from Repeating values in loop until error disappears.
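To see the retry behaviour in isolation, a compact variant of the same adverb can be exercised against a deliberately flaky function (both the variant and the flaky function here are illustrative, not part of the original answer):

```r
# Compact variant of the retry adverb: call fn until it succeeds or
# max_attempts is exhausted, returning NULL after the final failure.
safely <- function(fn, max_attempts = 5) {
  function(...) {
    for (i in seq_len(max_attempts)) {
      result <- tryCatch(fn(...), error = function(e) e)
      if (!inherits(result, "error")) return(result)
    }
    warning(sprintf("failed after %d tries; returning NULL", max_attempts))
    NULL
  }
}

# Hypothetical flaky function: errors on the first two calls, then succeeds.
attempts <- 0
flaky <- function() {
  attempts <<- attempts + 1
  if (attempts < 3) stop("transient failure")
  "ok"
}

safely(flaky)()  # returns "ok" on the third internal attempt
```

Wrapping metro.locater the same way, safely(metro.locater)(lat, lon) retries the geocoding call transparently on transient failures.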
For a web scraping project I am making frequent requests to a particular site. Sometimes the connection times out with an error, and I would like it to retry instead of erroring out. I've written the code below to keep trying, but I don't think it works because I still get the error:
url <- "www.google.com"
while (TRUE) {
  withRestarts(
    tryCatch(
      sourcecode <- getForm(urls[n]),
      finally = Sys.sleep(2)
    ),
    abort = function() {}
  )
}
Error in function (type, msg, asError = TRUE) : couldn't connect to
host
Got it after experimenting:
sourcecode <- character(0)
while (length(sourcecode) == 0) {
  try({
    sourcecode <- getForm(urls[n])
    print(urls[n])
    Sys.sleep(1)
  })
}
try() allows execution to continue after an error occurs. Combined with the loop, it keeps retrying until getForm() succeeds.
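One caveat: the loop above retries forever, so a permanently unreachable site will hang the script. A bounded variant of the same try()-based pattern, with a pause between attempts (the helper name and the flaky stand-in are illustrative):

```r
# Bounded retry: give up after max_tries instead of looping indefinitely,
# sleeping between attempts so the site is not hammered.
fetch_with_retry <- function(fetch, max_tries = 5, wait = 2) {
  for (i in seq_len(max_tries)) {
    result <- try(fetch(), silent = TRUE)
    if (!inherits(result, "try-error")) return(result)
    Sys.sleep(wait)  # pause before the next attempt
  }
  stop(sprintf("still failing after %d attempts", max_tries))
}
```

With the scraping code, this would be called as sourcecode <- fetch_with_retry(function() getForm(urls[n])).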