as.h2o ERROR: Unexpected HTTP Status code: 500 Server Error

I'm trying to do something with h2o in RStudio, but I have problems when using as.h2o(). It always returns the following error.
For example:
library(h2o)
localH2O = h2o.init()
finaldata.hex = as.h2o(finaldata)
ERROR: Unexpected HTTP Status code: 500 Server Error (url = http://localhost:54321/3/PostFile?destination_frame=%2Fprivate%2Fvar%2Ffolders%2F8z%2F29h4lb311gbdhg58mj704g580000gn%2FT%2FRtmpr83spR%2Ffile12ab3b8df30c.csv_sid_a24d_3)
Error: lexical error: invalid char in json text.
<html> <head> <meta http-equiv=
(right here) ------^
Would you please help me figure out how to fix this error?
Thanks,

This is super late, but I had the same issue, and I realized that increasing max_mem_size in h2o.init() made the issue go away. I was converting an extremely large data.table to an h2o object and was getting this same error message.
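For reference, a minimal sketch of that fix, assuming a local single-node H2O instance ("8g" is an illustrative value; size it to your data):

library(h2o)

# Note: if an H2O instance is already running, the new heap size only
# takes effect after a restart.
h2o.init(max_mem_size = "8g")
finaldata.hex <- as.h2o(finaldata)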

This problem occurs when you use h2o to convert a data frame to H2O format, as in:
data_h2o <- as.h2o(data)
This is an internal error on your H2O server.
To solve this problem you can restart your server and run it again. I hope this can help you...
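A restart sketch, assuming a local single-node instance:

library(h2o)

h2o.shutdown(prompt = FALSE)  # stop the running server
h2o.init()                    # start a fresh instance
data_h2o <- as.h2o(data)      # retry the conversion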

Related

plotly partial_bundle fails with timeout error

I'm trying to improve the performance of an R Markdown document that has a lot of plotly graphs in it by using the partial_bundle() function. I'm using the code from ?partial_bundle for the reprex.
Unfortunately it fails with the error message:
Error in curl::curl_download(paste0("https://cdn.plot.ly/", bundle_script), :
Timeout was reached: [] Connection timed out after 10012 milliseconds
library(plotly)

plot_ly(z = ~volcano) %>%
  add_heatmap() %>%
  partial_bundle()
Now, I am behind a VPN at work, so I assume that is causing the error, but I'm unsure how to redirect the package if needed. Also, is there a way to download the required "bundle" directly from the CDN or from the npm package, as listed here: https://github.com/plotly/plotly.js/blob/master/dist/README.md?
Many thanks. I appreciate that this isn't reproducible (you'd need to be behind my VPN); I just wondered if anyone else had come across this issue and resolved it!
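One hedged possibility, assuming the network requires an HTTP proxy (the host and port below are placeholders; substitute your organisation's values): partial_bundle() downloads through libcurl, which honours the standard proxy environment variables.

Sys.setenv(
  http_proxy  = "http://proxy.example.com:8080",  # placeholder proxy URL
  https_proxy = "http://proxy.example.com:8080"
)

library(plotly)

plot_ly(z = ~volcano) %>%
  add_heatmap() %>%
  partial_bundle()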

Getting the Googlebot crawl errors via R with the new Search Console

So the problem is that I had code running nicely in an automation that fetched the number of Googlebot crawl errors. I was using the searchConsoleR package for this.
I'm assuming that, due to the recent changes in Search Console, this doesn't work anymore. Has anyone had (and solved) this problem?
The previous code had been working fine for months:
Errors <- crawl_errors(website, category = "all", platform = c("web"), latestCountsOnly = T)
And now I get the following error:
Request failed [404]. Retrying in 1 seconds...
Request failed [404]. Retrying in 2.4 seconds...
2019-05-15 14:41:02> Request Status Code: 404
Error : lexical error: invalid char in json text.
Not Found
(right here) ------^
Not Found
Error: lexical error: invalid char in json text.
Not Found
(right here) ------^
In addition: Warning message:
No JSON content found in request
I tried looking into the package documentation but didn't find any relevant updates yet. If anyone has pointers, they would be greatly appreciated.
Thanks in advance.
I'm afraid this functionality got removed from the Search Console API.
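If your automation still has the call in place, a defensive sketch (assuming you keep it until the pipeline is migrated) is to trap the failure rather than let the job abort:

library(searchConsoleR)

errors <- tryCatch(
  crawl_errors(website, category = "all", platform = c("web"),
               latestCountsOnly = TRUE),
  error = function(e) {
    message("crawl_errors() failed (endpoint removed): ", conditionMessage(e))
    NULL
  }
)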

gtrendsR error HTTP 410

I am new to the gtrendsR package, running R 3.4.1 on Windows 10.
I succeeded with gconnect(), but I get the following error message for any type of query passed to gtrends(), like below.
library(gtrendsR)
gconnect(usr=my_user_name,psw=my_password)
google.trends = gtrends(c("NHL"), geo="US",start_date="2017-01-01")
Error: Not enough search volume. Please change your search terms.
In addition: Warning message:
In request_GET(x, url, ...) : Gone (HTTP 410).
Does anybody have ideas on how to solve this problem?
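HTTP 410 ("Gone") suggests the Google Trends endpoint that this gtrendsR release talks to no longer exists, so updating the package is worth trying first. A sketch assuming a current gtrendsR release, where no gconnect() login is required and the date range is passed via time:

install.packages("gtrendsR")  # update to the latest CRAN release
library(gtrendsR)

google.trends <- gtrends("NHL", geo = "US", time = "2017-01-01 2017-12-31")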

Error in Curl when using geojson_read command in R

My situation is that I wrote an R script on a university Windows computer and emailed the file to myself. The first and second times I ran the script it worked perfectly; however, I started getting the following error on the third run.
I figured out the line causing the problem, and the error message:
mapdt <- geojson_read("http://maperic.clst.org/wupl/Stuff/gz_2010_us_040_00_500k.json",
                      what = "sp")
Error in curl::curl_fetch_disk(url, x$path, handle = handle) :
  Timeout was reached
I have tried the http_proxy method in this link, but it did not work properly. I have the same problem in two different network environments.
Thank you.
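A workaround sketch, assuming the server is merely slow rather than unreachable: raise R's download timeout, fetch the file once, and read the local copy.

library(geojsonio)

options(timeout = 300)  # raise the download timeout from the 60-second default
local_json <- file.path(tempdir(), "gz_2010_us_040_00_500k.json")
download.file("http://maperic.clst.org/wupl/Stuff/gz_2010_us_040_00_500k.json",
              destfile = local_json, mode = "wb")

mapdt <- geojson_read(local_json, what = "sp")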

Getting Error in readRDS(cache_path) : error reading from connection

I am trying to solve Quiz 2 of the Coursera course "Getting and Cleaning Data". I get the error "Error in readRDS(cache_path) : error reading from connection" when I execute the following statement:
github_token <- oauth2.0_token(oauth_endpoints("github"), myapp)
I have searched and seen that there are solutions for readRDS(file), but not for readRDS(cache_path).
I am on Windows 10; the R version is 3.3.0.
Appreciate your help.
Thanks.
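A sketch of the usual fix, assuming the cached token file is the culprit: httr caches OAuth tokens in a hidden .httr-oauth file in the working directory, and readRDS(cache_path) fails when that file is corrupt. Delete it and re-authorise, or skip the cache entirely.

library(httr)

# Remove a possibly corrupt token cache, then request a fresh token.
if (file.exists(".httr-oauth")) file.remove(".httr-oauth")

github_token <- oauth2.0_token(
  oauth_endpoints("github"),
  myapp,         # the oauth_app() object defined earlier in the quiz
  cache = FALSE  # don't write a new cache file
)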
