Why is there a database connection issue in the RNCEP package? - r

I am trying to use the "RNCEP" package in RStudio. I ran the following code:
install.packages("RNCEP", dependencies=TRUE)
library(RNCEP)
wx.extent <- NCEP.gather(variable= 'air', level=850, months.minmax=c(8,9),
years.minmax=c(2006,2007), lat.southnorth=c(50,55), lon.westeast=c(0,5),
reanalysis2 = FALSE, return.units = TRUE)
and got the following error messages:
trying URL
'http://www.esrl.noaa.gov/psd/thredds/dodsC/Datasets/ncep.reanalysis/pressure/air.2006.nc.das'
Content length 660 bytes
Error in NCEP.gather.pressure(variable = variable, months.minmax =
months.minmax, :
There is a problem connecting to the NCEP database with the
information provided.
Try entering
http://www.esrl.noaa.gov/psd/thredds/dodsC/Datasets/ncep.reanalysis/pressure/air.2006.nc.das
into a web browser to obtain an error message.
In addition: Warning messages:
1: In
download.file(paste("http://www.esrl.noaa.gov/psd/thredds/dodsC/Datasets/ncep.reanalysis",
: cannot open URL
'http://www.cfauth.com/?cfru=aHR0cDovL3d3dy5lc3JsLm5vYWEuZ292L3BzZC90aHJlZGRzL2RvZHNDL0RhdGFzZXRzL25jZXAucmVhbmFseXNpcy9wcmVzc3VyZS9haXIuMjAwNi5uYy5kYXM=':
HTTP status was '401 Unauthorized'
Please suggest the correct syntax for downloading NCEP data.
Thanks,
Sam
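The 401 in the warning does not come from NOAA: the request was redirected to www.cfauth.com, which looks like a local proxy or captive portal intercepting the connection. As a diagnostic, a small sketch (assuming only the path layout shown in the error message) rebuilds the .das URL that NCEP.gather() requests so it can be tested in a browser outside R:

```r
# Rebuild the THREDDS .das URL that NCEP.gather() requests, so it can
# be pasted into a browser. The path layout is taken verbatim from the
# error message above.
ncep_das_url <- function(variable, year, host = "www.esrl.noaa.gov") {
  sprintf(
    "http://%s/psd/thredds/dodsC/Datasets/ncep.reanalysis/pressure/%s.%d.nc.das",
    host, variable, year
  )
}

ncep_das_url("air", 2006)
```

If the browser shows a login or proxy page instead of plain DAS text, the problem is the local network, not the NCEP.gather() syntax. NOAA has also since moved these services to psl.noaa.gov, which the `host` argument allows for.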

Related

Accessing SharePoint in R

My script started throwing errors when accessing SharePoint. It used to work.
sp_con = sp_connection("https://asdf.sharepoint.com/sites/staff",
credentialFile = "H:/SharePoint API/creds.yml", Office365 = T)
The error was:
Error in sp_connection("https://asdf.sharepoint.com/sites/staff", :
Receiving access cookies failed.
In addition: Warning message:
In readLines(file) :
incomplete final line found on 'Y:/Operations/SharePoint API/creds.yml'
I googled the warning message and found solutions that fixed it, but I still get the access-cookies error. Thanks in advance for any ideas!
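The readLines() warning only means creds.yml does not end with a newline character; it is usually harmless, but easy to silence. A minimal sketch, demonstrated on a temporary file rather than the real creds.yml:

```r
# Demonstrated on a throwaway file; point `path` at the real creds.yml
# to apply the same fix there.
path <- tempfile(fileext = ".yml")
cat("user: sam\npass: secret", file = path)   # note: no final newline

# Append a trailing newline if the file lacks one - exactly what the
# "incomplete final line" warning complains about.
fix_final_newline <- function(path) {
  txt <- readChar(path, file.size(path))
  if (!endsWith(txt, "\n")) cat("\n", file = path, append = TRUE)
  invisible(path)
}

suppressWarnings(readLines(path))  # 2 lines, plus the warning
fix_final_newline(path)
readLines(path)                    # same 2 lines, no warning
```

Note that "Receiving access cookies failed." is a separate, authentication-side problem (expired credentials or a changed SharePoint auth policy), so fixing the file ending alone may not resolve it.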

Web Scraping with R: connection with the server was reset

I have a problem obtaining data from a specific website. When I try to download the raw page source with R 3.6.3 using the following example code:
website_raw <- readLines("https://tge.pl/gaz-rdn?dateShow=09-02-2022")
The result I got is:
Error in file(con, "r") : cannot open the connection
In addition: Warning message:
In file(con, "r") : InternetOpenUrl failed: 'the connection with the server was reset'
The readLines() method used to work fine on this website, but since about a week ago it fails. I also tried the download.file() method: at first the result was the same (error, connection reset), but after setting options(download.file.method = "libcurl") the file starts to download and then suddenly stops with:
trying URL 'https://tge.pl/gaz-rdn?dateShow=09-02-2022'
Error in download.file("https://tge.pl/gaz-rdn?dateShow=09-02-2022", "test.html") :
cannot open URL 'https://tge.pl/gaz-rdn?dateShow=09-02-2022'
In addition: Warning message:
In download.file("https://tge.pl/gaz-rdn?dateShow=09-02-2022", "test.html") :
URL 'https://tge.pl/gaz-rdn?dateShow=09-02-2022': status was 'Failure when receiving data from the peer'
I also tried disabling "Use Internet Explorer library/proxy for HTTP" in RStudio's Global Options, but it didn't help. Another solution I tested was read_html() from the rvest package, which gives the following error:
Error in open.connection(x, "rb") : Send failure: Connection was reset
Downloading data from other websites works fine with all of these methods, though.
Is there any way to download data from this website with R?
Any help or suggestions will be highly appreciated.
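Resets like this are often caused by the server rejecting R's default client (no browser-like User-Agent) or by the WinInet TLS stack. A hedged sketch that combines the libcurl method already mentioned with a User-Agent header (available in download.file() since R 3.6.0) and a simple retry loop; the header value and retry count are illustrative:

```r
# Retry a download with the libcurl method and a browser-like
# User-Agent header - two common fixes when a server resets
# connections coming from R's default client.
fetch_page <- function(url, dest, tries = 3) {
  for (i in seq_len(tries)) {
    ok <- tryCatch({
      download.file(url, dest, method = "libcurl", quiet = TRUE,
                    headers = c(`User-Agent` = "Mozilla/5.0"))
      TRUE
    }, error = function(e) FALSE, warning = function(w) FALSE)
    if (ok) return(readLines(dest, warn = FALSE))
    if (i < tries) Sys.sleep(1)  # brief pause before retrying
  }
  stop("all ", tries, " attempts failed for ", url)
}

# fetch_page("https://tge.pl/gaz-rdn?dateShow=09-02-2022", tempfile())
```

If this still resets, the site may be fingerprinting clients more aggressively, in which case no plain HTTP client in R will get through.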

gtrendsR error HTTP 410

I am new to the gtrendsR package, running it in R 3.4.1 on Windows 10.
gconnect() succeeds, but I get the following error message for any type of query passed to gtrends(), like the one below.
library(gtrendsR)
gconnect(usr=my_user_name,psw=my_password)
google.trends = gtrends(c("NHL"), geo="US",start_date="2017-01-01")
Error: Not enough search volume. Please change your search terms.
In addition: Warning message:
In request_GET(x, url, ...) : Gone (HTTP 410).
Does anybody have ideas on how to solve this problem?
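HTTP 410 ("Gone") suggests the endpoint this gtrendsR version talks to has been retired. Current CRAN releases of gtrendsR no longer have gconnect() at all, and gtrends() takes a time argument of the form "YYYY-MM-DD YYYY-MM-DD" instead of start_date. A sketch of building that argument from the old dates (the query itself is left commented, since it needs network access):

```r
# Build the `time` string expected by modern gtrends():
# "YYYY-MM-DD YYYY-MM-DD", defaulting the end date to today.
trends_time <- function(start_date, end_date = Sys.Date()) {
  paste(start_date, end_date)
}

trends_time("2017-01-01", "2017-12-31")
# "2017-01-01 2017-12-31"

# library(gtrendsR)   # after updating to the current CRAN version
# google.trends <- gtrends("NHL", geo = "US",
#                          time = trends_time("2017-01-01", "2017-12-31"))
```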

Unable to download financial data with getFin() in R - HTTP status is '403 Forbidden'

I am trying to download companies' financial data. I have used getFin() a lot without encountering any problems.
Right now I am unable to download any data; when I use e.g. this code (or basically any other valid symbol instead of "AAPL"):
getFin("AAPL")
I get the following error message:
Error in download.file(paste(google.fin, Symbol, sep = ""), quiet = TRUE, :
cannot open URL 'http://finance.google.com/finance?fstype=ii&q=AAPL'
In addition: Warning message:
In download.file(paste(google.fin, Symbol, sep = ""), quiet = TRUE, :
cannot open URL 'http://finance.google.com/finance?fstype=ii&q=AAPL': HTTP status was '403 Forbidden'
However, if I access http://finance.google.com/finance?fstype=ii&q=AAPL in a browser, it loads without problems.
So why am I suddenly unable to download data with getFin() in RStudio?
Have you tried clearing your cache or going incognito and accessing the URL?
Assuming you are on a Linux server and using PHP, you could try updating your PHP version; it should be in the Google Finance API documentation.
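Since the browser loads the page but R gets a 403, the server is most likely rejecting non-browser clients; Google's old finance backend was eventually discontinued altogether, which is consistent with getFin() breaking for every symbol at once. A small base-R diagnostic (offline-safe: it returns NA when the connection itself fails) to confirm what status the server actually sends:

```r
# Extract the numeric status code from an HTTP status line such as
# "HTTP/1.1 403 Forbidden".
parse_status <- function(status_line) {
  as.integer(sub("^\\S+ (\\d{3}).*", "\\1", status_line))
}

# Ask the server for headers only; NA means the connection itself
# failed rather than the server answering with an error status.
http_status <- function(url) {
  headers <- tryCatch(curlGetHeaders(url), error = function(e) NULL)
  if (is.null(headers)) return(NA_integer_)
  parse_status(headers[[1]])
}

# http_status("http://finance.google.com/finance?fstype=ii&q=AAPL")
```

A 403 here confirms the block is server-side, so no change to the getFin() call will help; the data would have to come from another source.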

bigrquery: Error with Google Big Query R interface

I'm using the bigrquery R package to fetch data, but I'm getting the following error. Let me know if anyone knows how to fix it.
"Waiting for authentication in browser...
Authentication complete.
Loading required package: rjson
Error: Invalid access credentials have been reset. Please try again.
In addition: Warning messages:
1: In mapCurlOptNames(names(.els), asNames = TRUE) :
Unrecognized CURL options: token.error
2: In mapCurlOptNames(names(.els), asNames = TRUE) :
Unrecognized CURL options: token.error"
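The rjson and CURL warnings indicate a fairly old bigrquery build, and "Invalid access credentials have been reset" usually points at a stale cached token. Older httr-based versions cache the token in a .httr-oauth file in the working directory; removing it forces a fresh browser authentication on the next query. A sketch (the file name is the httr default; newer bigrquery releases cache via gargle instead, where bq_auth() handles re-authentication):

```r
# Remove a stale cached OAuth token so the next bigrquery call
# triggers a fresh browser authentication. Returns TRUE once no
# cached token remains.
reset_bq_auth <- function(dir = ".") {
  cache <- file.path(dir, ".httr-oauth")
  if (file.exists(cache)) unlink(cache)
  !file.exists(cache)
}
```

Upgrading bigrquery itself is likely the more durable fix, since the RCurl/rjson stack these warnings come from has long been replaced.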
