How to fix an error in the getGEO function? - r

When I try to use the getGEO function to download a GSE dataset with this code:
series <- "GSE85358"
gset <- getGEO(series , GSEMatrix =TRUE, AnnotGPL=TRUE, destdir = "....." )
I encounter this error:
https://ftp.ncbi.nlm.nih.gov/geo/series/GSE2nnn/GSE2553/matrix/ Error
in function (type, msg, asError = TRUE) : error:1407742E:SSL
routines:SSL23_GET_SERVER_HELLO:tlsv1 alert protocol version
How can I fix it?
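The "tlsv1 alert protocol version" message typically means the TLS library behind RCurl is too old for NCBI's servers, which require TLS 1.2. A minimal sketch of a commonly suggested workaround, switching GEOquery's downloads to libcurl (treat the option name as an assumption for your GEOquery version; if it doesn't help, updating R, curl, and the Bioconductor packages is the other path):

```r
# Sketch, assuming the failure comes from an outdated TLS stack behind RCurl:
# ask GEOquery to download via libcurl, which supports TLS 1.2.
options(download.file.method.GEOquery = "libcurl")

library(GEOquery)
series <- "GSE85358"
gset <- getGEO(series, GSEMatrix = TRUE, AnnotGPL = TRUE)
```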

Related

Error using readMapping function of topGO

I am getting the error below when using the readMappings function of topGO in R:
library(topGO)
geneID2GO <- readMappings(file = "All_genes_GO_terms", package = "topGO")
Error in readMappings(file = "All_genes_GO_terms", package = "topGO") :
unused argument (package = "topGO")
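The error message points directly at the fix: readMappings() has no package argument, so the call fails before the file is even read. A minimal sketch with the argument dropped (the file name is taken from the question; the mapping file is expected to hold one gene ID per line followed by its comma-separated GO terms):

```r
library(topGO)
# readMappings() takes the file path (plus optional 'sep' and 'IDsep'
# separators); there is no 'package' argument, so drop it.
geneID2GO <- readMappings(file = "All_genes_GO_terms")
str(head(geneID2GO))
```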

"Error in function (type, msg, asError = TRUE)" in R

I am getting this error when I run the code below to download data:
Error in function (type, msg, asError = TRUE) : Failed to connect to course1.winona.edu port 80: Timed out
Any help would be appreciated
My code
library(RCurl)
urlfile <-'http://course1.winona.edu/bdeppa/Stat%20425/Data/Boston_Housing.csv'
downloaded <- getURL(urlfile, ssl.verifypeer=FALSE)
connection <- textConnection(downloaded)
dataset <- read.csv(connection, header=FALSE)
I don't think you need anything beyond read.csv. This works for me:
urlfile <-'http://course1.winona.edu/bdeppa/Stat%20425/Data/Boston_Housing.csv'
dataset <- read.csv(urlfile)

Error accessing ftp with getURL in R

I want to access files on the internet, but I get the following error message:
Error in function (type, msg, asError = TRUE) : Access denied: 530
In addition: Warning messages:
1: In strsplit(str, "\\\r\\\n") : input string 1 is invalid in this locale
This is my code from this post:
library(RCurl)
url <- 'ftp://ftp.address'
userpwd <- "user:password"
filenames <- getURL(url, userpwd = userpwd,
ftp.use.epsv=FALSE, dirlistonly = TRUE)
Any idea how to solve this?
Thanks a lot for your help!
Try:
library(RCurl)
filenames <- getURL(url = "ftp://user:password@ftp.address", ftp.use.epsv = FALSE, dirlistonly = FALSE)

R XBRL package "404 Not Found" error with Ubuntu

The following R code works fine from my Windows 8 laptop:
> inst<- "https://www.sec.gov/Archives/edgar/data/51143/000104746916010329/ibm-20151231.xml"
> options(stringsAsFactors = FALSE)
> xbrl.vars <- xbrlDoAll(inst, cache.dir = "XBRLcache", prefix.out = NULL, verbose=TRUE)
However, when I attempt to run it from my Ubuntu 16.04 machine, I receive the following output:
Error in fileFromCache(file) :
Error in download.file(file, cached.file, method = "auto", quiet = !verbose) :
cannot download all files
In addition: Warning message:
In download.file(file, cached.file, method = "auto", quiet = !verbose) :
URL 'https://www.sec.gov/Archives/edgar/data/51143/000104746916010329/ibm-20151231.xsd': status was '404 Not Found'
It's finding the initial xml file but then cannot find the referenced schemas. Any help would be appreciated. Thanks in advance.
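One platform difference worth ruling out is the method download.file() picks by default, since that is what xbrlDoAll() uses to fetch the referenced schemas and it can differ between Windows and Ubuntu. A sketch of that check (an assumption, not a confirmed fix; forcing libcurl is the usual first thing to try on Linux):

```r
# Sketch, assuming the Ubuntu failure comes from the default download method:
# force libcurl before xbrlDoAll() fetches the referenced .xsd schemas.
options(download.file.method = "libcurl")

library(XBRL)
inst <- "https://www.sec.gov/Archives/edgar/data/51143/000104746916010329/ibm-20151231.xml"
options(stringsAsFactors = FALSE)
xbrl.vars <- xbrlDoAll(inst, cache.dir = "XBRLcache", prefix.out = NULL, verbose = TRUE)
```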

Unused argument error in epiR

I'm getting an unused argument error in the following code:
epi.tests(dat4, verbose = TRUE)
# Error in epi.tests(dat4, verbose = TRUE) :
# unused argument (verbose = TRUE)
What could be causing this and how could I fix the issue?
Thanks
Look at ?epiR::epi.tests.
The Usage section indicates that the function should be used as:
epi.tests(dat, conf.level = 0.95)
There is no verbose argument to that function: it only accepts dat (to which you are passing dat4) and conf.level (whose default is 0.95).
Try again with epi.tests(dat4).
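For completeness, a self-contained sketch of the corrected call. The 2x2 table here is invented for illustration, since the question's dat4 is not shown; epi.tests() expects counts with rows as test positive/negative and columns as disease positive/negative:

```r
library(epiR)
# Illustrative 2x2 table of diagnostic test results (counts are made up):
# rows = test positive/negative, columns = disease positive/negative.
dat4 <- as.table(matrix(c(670, 202, 74, 640), nrow = 2, byrow = TRUE))
res <- epi.tests(dat4, conf.level = 0.95)  # note: no 'verbose' argument
res
```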
