I'm currently running a geocoding function (using the google_places function in the googleway package). The function will run for a while (I have almost 3k locations), then throw the following error:
Error in open.connection(con, "rb") :
schannel: next InitializeSecurityContext failed: SEC_E_ILLEGAL_MESSAGE (0x80090326) - This error usually occurs when a fatal SSL/TLS alert is received (e.g. handshake failed). More detail may be available in the Windows System event log.
Having consulted the system event log, I found the following information:
The machine-default permission settings do not grant Local Activation permission for the COM Server application with CLSID
{9BA05972-F6A8-11CF-A442-00A0C90A8F39}
and APPID
{9BA05972-F6A8-11CF-A442-00A0C90A8F39}
I'm not really sure what to do with this information. From my limited knowledge, it appears this is some sort of security/firewall issue. How should I go about giving R the permissions needed to run this function?
I am running Windows 10 with Windows Defender as antivirus/firewall. For reference, this is the function I am using for geocoding:
metro.locater <- function(lat, lon){
  library(googleway)
  # putting latitude and longitude into the same vector
  latlon <- c(lat, lon)
  # getting places result
  res <- google_places(location = latlon,
                       place_type = "subway_station",
                       radius = 50000,
                       rankby = "distance",
                       key = "myKey")
  # condition handling
  if(res$status == 'OK'){
    closest <- res$results[1:3, ]
    return(closest)
  } else {
    try(return(res$status))
  }
}
I was able to fix the issue by using an adverb (a function operator) I'd already used with another geocoding function; it retries the wrapped function up to five times when it fails to return a result. Given that this worked, the error was likely transient rather than systemic.
The adverb I used:
safely <- function(fn, ..., max_attempts = 5) {
  function(...) {
    this_env <- environment()
    for (i in seq_len(max_attempts)) {
      ok <- tryCatch({
        assign("result", fn(...), envir = this_env)
        TRUE
      },
      error = function(e) {
        FALSE
      })
      if (ok) {
        return(this_env$result)
      }
    }
    msg <- sprintf(
      "%s failed after %d tries; returning NULL.",
      deparse(match.call()),
      max_attempts
    )
    warning(msg)
    NULL
  }
}
Taken from "Repeating values in loop until error disappears".
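For reference, this is how I combined the adverb with my geocoder (a usage sketch; stations is a hypothetical stand-in for my data frame of coordinates):
# wrap the geocoder so each call is retried up to 5 times
metro.locater_safe <- safely(metro.locater)

# map over all locations; after 5 failed attempts a location yields NULL
results <- Map(metro.locater_safe, stations$lat, stations$lon)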
I am using the spotifyr library and want to find audio features for multiple tracks. For example, I can find the audio features of a specific song using its ID:
analysis2 <- get_track_audio_features("2xLMifQCjDGFmkHkpNLD9h",
authorization = get_spotify_access_token())
Yesterday I wrote the function below, which takes all the tracks in a data frame, finds the audio features for each of them, and stores them in a list. It was working fine.
get_analysis <- function(track_id){
  analysis <- get_track_audio_features(track_id,
                                       authorization = get_spotify_access_token())
}
tracks_list <- lapply(all_tracks$track.id, get_analysis)
Now I am getting errors saying "Request failed [503]" and "Error in get_track_audio_features(track_id, authorization = get_spotify_access_token()): Service Unavailable (HTTP 503)".
I am still able to find the audio features of a specific song so I am not sure which service is unavailable.
I suspect you are hitting a song in your data for which the response is denied by Spotify. You could add an error-catching mechanism to see which one it is:
get_analysis <- function(track_id){
  analysis <- tryCatch(
    expr = {
      get_track_audio_features(track_id, authorization = get_spotify_access_token())
    },
    error = function(e){
      print(track_id)  # on failure, the offending ID is printed (and returned)
    }
  )
  return(analysis)
}
tracks_list <- lapply(all_tracks$track.id, get_analysis)
I looked at the source code for the package and didn't see any sneaky rate-limiting issues, and the Web API page lists error 503 as a generic error that is resolved by waiting (https://developer.spotify.com/documentation/web-api/). So you could also try adding a wait of, say, 10 minutes (I couldn't find the exact interval on Spotify's website):
get_analysis <- function(track_id){
  output <- tryCatch(
    expr = {
      get_track_audio_features(track_id, authorization = get_spotify_access_token())
    },
    error = function(e){
      print(track_id)  # report which track failed
      e                # return the condition so the caller can test for it
    }
  )
  output
}

wait.function <- function(){
  Sys.sleep(600)  # wait 10 minutes
}

get_analysis_master <- function(all_tracks){
  k <- 1
  tracks_list <- list()
  for(track.id in all_tracks$track.id){
    output <- get_analysis(track.id)
    if(!inherits(output, "error")){
      tracks_list[[k]] <- output
      k <- k + 1
    } else {
      wait.function()
    }
  }
  return(tracks_list)
}
tracks_list <- get_analysis_master(all_tracks)
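Note that the loop above moves on to the next track after waiting, so a failed track is skipped. If you would rather wait and then retry the same track, a variant could look like this (a sketch under the same assumptions as above):
get_analysis_master <- function(all_tracks, wait_s = 600){
  tracks_list <- list()
  for(track_id in all_tracks$track.id){
    repeat {
      output <- get_analysis(track_id)
      if(!inherits(output, "error")) break
      # wait, then retry the same track; caution: this loops forever
      # if a track never succeeds, so consider capping the attempts
      Sys.sleep(wait_s)
    }
    tracks_list[[length(tracks_list) + 1]] <- output
  }
  tracks_list
}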
I have a Shiny app where I call a function from a different package. If the input values are not in a certain range, that function throws an error and the app stops.
Example call:
data <- callFunction(cost, input1, input2)
When callFunction throws an error, it stops the app. I don't want the app to stop; I want it to show a message saying the inputs are incorrect and should be modified. How can I do this in a Shiny app? Thank you.
For error handling, R provides tryCatch:
Test <- list(5, 4, "foo", "bar")
res <- sapply(Test, function(el){
  tryCatch({
    # expression that might throw an error
    el + 5
  }, warning = function(w) {
    print("Warning. Minor Problems!")  # warning handling
  }, error = function(e) {
    print("Error!. Major Problems!")   # error handling
  }, finally = {
    # possible clean-up code
  })
})
res
[1] "10" "9" "Error!. Major Problems!" "Error!. Major Problems!
I'm new to Stack Overflow, so please correct me if I make any major mistakes.
As part of a bigger project, I have a function that requests routes from Google and calculates the driving time; I do this with the package ggmap. This worked perfectly fine until I tried to speed up other parts of the project and needed to call the driving-time function within a foreach loop. When I use %dopar% in the loop, it throws this error:
unable to connect to 'maps.googleapis.com' on port 80.
Does anyone know, where this error comes from and how it can be fixed?
I managed to produce a small example that shows the behaviour:
# necessary packages
library(ggmap)
library(doParallel)
library(doSNOW)
library(foreach)

# some lines to test the function in a for and a foreach loop
Origins <- c("Bern", "Biel", "Thun", "Spiez")
Destinations <- c("Biel", "Thun", "Spiez", "Bern")
numRoutes <- length(Origins)

# numCores <- detectCores()
# I use only 1 core in testing to make sure that the debug file is readable
cl <- snow::makeCluster(1, outfile = "debug.txt")
registerDoSNOW(cl)

timesDoPar <- foreach(idx = 1:numRoutes,
                      .packages = c("ggmap")) %dopar% {
  getDrivingTime(Origins[idx], Destinations[idx])
}
timesDo <- foreach(idx = 1:numRoutes,
                   .packages = c("ggmap")) %do% {
  getDrivingTime(Origins[idx], Destinations[idx])
}
stopCluster(cl)
The function (with some extra messages for debugging):
getDrivingTime <- function(from, to){
  if (from == to){
    drivingTimeMin <- 0
  } else {
    route_simple <- tryCatch({
      message("Trying to get route from Google")
      route(from, to, structure = "route", mode = "driving", output = "simple")
    },
    error = function(cond) {
      message("Route throws an error:\nHere's the original error message:")
      message(cond)
      return(data.frame(minutes = 0))
    },
    warning = function(cond) {
      message("Route throws a warning:\nHere's the original warning message:")
      message(cond)
      return(data.frame(minutes = 0))
    },
    finally = {
      message(paste0("\nProcessed route: ", from, "; ", to, "\n\n"))
    })
    drivingTimeMin <- sum(route_simple$minutes, na.rm = TRUE)
  }
  return(drivingTimeMin)
}
I'm aware that in this example it would make absolutely no sense to use parallel programming - especially with using only one core - but in the scope of the full project it is needed.
I couldn't find any useful information on this except for this question, where the asker suggests that the problem might be with their company network. I don't think that is the case for me, since it works with %do%; I haven't been able to test it on another network yet, though.
(I'm working on Windows 7, using a portable version of R (version 3.1.0) and RStudio (version 0.98.501).)
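For completeness, here is a minimal check I could run to see whether a worker can reach the host on port 80 at all (a sketch using base R's socketConnection; a diagnostic, not a fix):
library(doSNOW)
library(foreach)

cl <- snow::makeCluster(1, outfile = "debug.txt")
registerDoSNOW(cl)

# try to open a raw TCP connection to the Maps host from inside a worker
check <- foreach(i = 1) %dopar% {
  tryCatch({
    con <- socketConnection("maps.googleapis.com", port = 80,
                            blocking = TRUE, timeout = 5)
    close(con)
    "port 80 reachable from worker"
  }, error = function(e) conditionMessage(e))
}
stopCluster(cl)
check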
How can I test whether an RJDBC connection is working, with a time limit?
Here is a toy code:
library("RJDBC")
testConnection <- function(){
tryCatch({
conn <- RJDBC::dbConnect(JDBC("com.mysql.jdbc.Driver", "driverPath"), "connString", "user", "password")
dbDisconnect(conn)
return("Connection: OK")
}
, error = function(e) {
return("Connection: Failed")
}
)
}
testConnection()
But if there is no network connection, I have to wait 30 seconds for the result.
I need to test many connections, so I would like to lower the limit to 2 seconds.
How can I set an evaluation time limit? I tried setTimeLimit without success.
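One option to try is R.utils::withTimeout, with the caveat that, like setTimeLimit (which it builds on), it can only interrupt R-level code, so it may not cut short a connection attempt that blocks inside the Java driver. A sketch, reusing the placeholder driver and connection strings from the question:
library(RJDBC)
library(R.utils)

testConnection <- function(timeout_s = 2){
  tryCatch({
    withTimeout({
      conn <- RJDBC::dbConnect(JDBC("com.mysql.jdbc.Driver", "driverPath"),
                               "connString", "user", "password")
      dbDisconnect(conn)
      "Connection: OK"
    }, timeout = timeout_s, onTimeout = "error")
  },
  error = function(e) {
    # both timeouts and connection failures end up here,
    # since TimeoutException inherits from "error"
    "Connection: Failed"
  })
}

testConnection()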
For a web scraping project, I am making frequent requests to a particular site. Sometimes the connection times out with an error, and I would like it to retry instead of erroring out. I've written the code below to keep trying, but I don't think it works, because I still error out.
url = "www.google.com"
while(true){
withRestarts(tryCatch(
sourcecode <- getForm(urls[n]),
finally = Sys.sleep(2),
abort = function(){})
}
Error in function (type, msg, asError = TRUE) : couldn't connect to host
Got it after experimenting:
sourcecode <- NULL
while(length(sourcecode) == 0){
  try({
    sourcecode <- getForm(urls[n])
    print(urls[n])
    Sys.sleep(1)
  })
}
try() allows execution to continue after an error occurs; combined with the loop, it keeps retrying until getForm() succeeds.
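If you want to guard against an infinite loop when the site is truly down, a capped-retry variant could look like this (a sketch; getForm is from RCurl, as in the question, and fetch_with_retry is a hypothetical helper):
library(RCurl)

fetch_with_retry <- function(url, max_attempts = 5, wait_s = 2){
  for(i in seq_len(max_attempts)){
    result <- try(getForm(url), silent = TRUE)
    if(!inherits(result, "try-error")) return(result)
    Sys.sleep(wait_s)  # back off before the next attempt
  }
  stop("all ", max_attempts, " attempts failed for ", url)
}

sourcecode <- fetch_with_retry(urls[n])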