gtrendsR error HTTP 410

I am new to using the gtrendsR package, running in R 3.4.1 on Windows 10.
I succeeded with gconnect, but I get the following error message for any type of query passed to gtrends, as below.
library(gtrendsR)
gconnect(usr=my_user_name,psw=my_password)
google.trends = gtrends(c("NHL"), geo="US",start_date="2017-01-01")
Error: Not enough search volume. Please change your search terms.
In addition: Warning message:
In request_GET(x, url, ...) : Gone (HTTP 410).
Does anybody have ideas on how to solve this problem?
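One hedged guess: HTTP 410 ("Gone") suggests Google retired the endpoint that the old gtrendsR login flow relied on, rather than a problem with the query itself. The rewritten version of the package on GitHub drops gconnect() entirely and queries Google Trends without a login. A minimal sketch, assuming that newer interface; the date range is an example:
# Assumes the rewritten gtrendsR from GitHub; no gconnect()/login step is needed.
# devtools::install_github("PMassicotte/gtrendsR")
library(gtrendsR)
# In the newer API, time takes a "start end" string; this range is an assumed example.
google.trends <- gtrends(c("NHL"), geo = "US", time = "2017-01-01 2017-06-30")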

Related

Web Scraping with R: error related to reset of the connection with server

I have a problem obtaining data from a specific website: when trying to download raw website data with R 3.6.3 using the following example code:
website_raw <- readLines("https://tge.pl/gaz-rdn?dateShow=09-02-2022")
The result I got is:
Error in file(con, "r") : cannot open the connection In addition: Warning message: In file(con, "r") : InternetOpenUrl failed: 'the connection with the server was reset'
The readLines() method used to work fine on this website, but it has been failing for about a week. I've also tried the download.file() method: at first the result was the same (error, connection reset), but after setting options(download.file.method = "libcurl") the website file starts to download, then suddenly stops with this message:
trying URL 'https://tge.pl/gaz-rdn?dateShow=09-02-2022'
Error in download.file("https://tge.pl/gaz-rdn?dateShow=09-02-2022", "test.html") :
cannot open URL 'https://tge.pl/gaz-rdn?dateShow=09-02-2022'
In addition: Warning message:
In download.file("https://tge.pl/gaz-rdn?dateShow=09-02-2022", "test.html") :
URL 'https://tge.pl/gaz-rdn?dateShow=09-02-2022': status was 'Failure when receiving data from the peer'
I've also tried disabling "Use Internet Explorer library/proxy for HTTP" in the RStudio Global Options, but it didn't help. Another solution I tested was read_html() from the rvest package, which gives the following error:
Error in open.connection(x, "rb") : Send failure: Connection was reset
Downloading data from other websites works fine, though, with all of the methods above.
Is there any way I can download data from this website with R?
Any kind of help or suggestion will be highly appreciated.
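A hedged workaround sketch: servers sometimes reset connections from R's default HTTP clients while accepting requests that carry a browser-like User-Agent header. Whether tge.pl filters this way is an assumption; the sketch below retries the request through httr with such a header:
# Sketch: retry the request with a browser-like User-Agent via httr.
# Whether this server filters on User-Agent is an assumption.
library(httr)
resp <- GET("https://tge.pl/gaz-rdn?dateShow=09-02-2022",
            user_agent("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))
stop_for_status(resp)  # errors if the request still fails
website_raw <- content(resp, as = "text", encoding = "UTF-8")
If this still resets, the block is more likely at the TLS level, in which case updating curl/OpenSSL on the machine is worth trying.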

I am getting a pathing error that I do not understand regarding the "diskImageR" package

I am getting an error when trying to run the diskImageR package, specifically the IJMacro function, about an inability to locate ImageJ; at least that is what I think the error is saying, although I am not sure.
I have already tried changing the path, following the PDF associated with the package, but I still get the same error.
IJMacro("newProject",imageJLoc ="C:\\Users\\user\\Desktop\\ImageJ")
[1] "Searching for application name or filepath: ImageJ"
Error in ij$runScript(paste(script, IJarguments)) :
The imageJ binaries have not been located. Re-initialise the imageJInterface object with the correct location for the imageJ binaries
In addition: Warning message:
In setFilePath(filePath) :
The ImageJ application could not be found in the common install location on your system
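One hedged guess: on Windows, the locator may need the full path to the ImageJ executable rather than the folder that contains it. The exact path below is an assumption; substitute your actual install location:
# Assumption: point imageJLoc at ImageJ.exe itself, not at the containing folder.
IJMacro("newProject",
        imageJLoc = "C:\\Users\\user\\Desktop\\ImageJ\\ImageJ.exe")
If that still fails, verifying that double-clicking ImageJ.exe at that path actually launches ImageJ would rule out a broken install.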

Getting the Googlebot crawl errors via R with the new search console

The problem: I had code running nicely in an automation that retrieved the number of Googlebot crawl errors. I was using the searchConsoleR package for this.
Recently, I assume due to the changes in Search Console, this no longer works. Has anyone had (and solved) this problem?
The previous code had worked fine for months:
Errors <- crawl_errors(website, category = "all", platform = c("web"), latestCountsOnly = T)
And now I get the following error code:
Request failed [404]. Retrying in 1 seconds...
Request failed [404]. Retrying in 2.4 seconds...
2019-05-15 14:41:02> Request Status Code: 404
Error : lexical error: invalid char in json text.
Not Found
(right here) ------^
Not Found
Error: lexical error: invalid char in json text.
Not Found
(right here) ------^
In addition: Warning message:
No JSON content found in request
I tried looking into the documentation of the package but didn't find any relevant updates yet. If anyone has any pointers, they would be greatly appreciated.
Thanks in advance
I'm afraid this functionality was removed from the Search Console API: Google retired the crawl errors reports, along with their API endpoints, when the old Search Console was shut down, which is why the request now returns a 404.

RStudio v1.1.456 Rpubs upload error, no login prompt

Hi, I couldn't find any useful information regarding this one.
When I try to publish to RPubs from RStudio v1.1.456, instead of a login prompt I get the error message below.
[Edited] RStudio v1.1.456, R v3.5.1
Upload Error Occurred
Error in if (result$status == 201) succeeded <- TRUE :
  missing value where TRUE/FALSE needed
Calls:
In addition: Warning message:
In http(protocol, "api.rpubs.com", port, method, path, headers, :
  NAs introduced by coercion
Execution halted
I've already added this line
options(rpubs.upload.method = "internal")
to both the global and the working-directory .Rprofile.
Apparently, there was an issue in the most recent version of rsconnect (0.8.12). Installing the patched branch solves it:
devtools::install_github("rstudio/rsconnect", ref = "bugfix/multi-status-header")

Why is there a database connection issue in the RNCEP package?

I am trying to use the RNCEP package in RStudio. I ran the following code:
install.packages("RNCEP", dependencies=TRUE)
library(RNCEP)
wx.extent <- NCEP.gather(variable = 'air', level = 850, months.minmax = c(8,9),
                         years.minmax = c(2006,2007), lat.southnorth = c(50,55),
                         lon.westeast = c(0,5), reanalysis2 = FALSE, return.units = TRUE)
and got the following error messages:
trying URL 'http://www.esrl.noaa.gov/psd/thredds/dodsC/Datasets/ncep.reanalysis/pressure/air.2006.nc.das'
Content length 660 bytes
Error in NCEP.gather.pressure(variable = variable, months.minmax = months.minmax, :
  There is a problem connecting to the NCEP database with the information provided.
  Try entering http://www.esrl.noaa.gov/psd/thredds/dodsC/Datasets/ncep.reanalysis/pressure/air.2006.nc.das
  into a web browser to obtain an error message.
In addition: Warning messages:
1: In download.file(paste("http://www.esrl.noaa.gov/psd/thredds/dodsC/Datasets/ncep.reanalysis", :
  cannot open URL 'http://www.cfauth.com/?cfru=aHR0cDovL3d3dy5lc3JsLm5vYWEuZ292L3BzZC90aHJlZGRzL2RvZHNDL0RhdGFzZXRzL25jZXAucmVhbmFseXNpcy9wcmVzc3VyZS9haXIuMjAwNi5uYy5kYXM=':
  HTTP status was '401 Unauthorized'
Please suggest the correct syntax to download NCEP data.
Thanks,
Sam
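A hedged observation rather than a syntax fix: the warning shows the request being redirected to www.cfauth.com, which looks like a local proxy or content filter intercepting the connection and answering with 401 Unauthorized, so the NCEP.gather() syntax itself may be fine. A minimal check, using the URL copied from the error output above:
# Sketch: test whether the NCEP server is reachable directly from this network.
# The URL comes from the error output; the proxy interpretation is an assumption.
library(httr)
resp <- GET("http://www.esrl.noaa.gov/psd/thredds/dodsC/Datasets/ncep.reanalysis/pressure/air.2006.nc.das")
status_code(resp)  # 200 means the database is reachable; anything else points at the network
If the browser test suggested in the error message also lands on cfauth.com, the fix lies with the network or proxy configuration rather than with RNCEP.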