write_csv Error in stream_delim_(df, path, ...) : invalid connection

I am puzzled that write_csv does not work in the following example while write.csv works properly. In general I use the "dplyr" and "readr" packages because they are fast, and I am not sure why this is happening.
write_csv(df1, path = "C:\\Users\\Sergio\\final\\file1.csv", col_names = TRUE)
I get the following error message:
Error in stream_delim_(df, path, ...) : invalid connection
However, if I write the following code, it works ok.
write.csv(df1, file = "file1.csv", fileEncoding="UTF-8")
I am using R version 3.4.1 (64-bit) and RStudio version 1.0.143; my OS is Windows 10. I would appreciate your comments. Thanks in advance.

You may have misspecified the file path. To check, make sure both commands point to the same directory:
write.csv(df1, file = "C:\\Users\\Sergio\\final\\file1.csv", fileEncoding="UTF-8")
If that also fails, you have misspecified the directory; most likely it does not exist.
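If the directory in the path does not exist, readr fails with exactly this "invalid connection" error, while a relative path like "file1.csv" writes to the working directory and succeeds. A minimal sketch to rule that out, reusing the path from the question:
# Check that the target directory actually exists before writing
dir_path <- "C:\\Users\\Sergio\\final"
if (!dir.exists(dir_path)) {
  dir.create(dir_path, recursive = TRUE)  # create it if it is missing
}
readr::write_csv(df1, file.path(dir_path, "file1.csv"), col_names = TRUE)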

Related

Permission denied when manually installing R package

Hi, I am currently running R 4.1.0 on Ubuntu (WSL). I am trying to install some R packages manually because my internet connection is too slow, but I cannot get permission; the warning messages suggest the probable reason is 'permission denied'. Here are the error and warning messages.
Warning in dir.create(path, showWarnings = TRUE, recursive = TRUE, ...) :
cannot create dir 'Rhdf5lib_1.14.2/Rhdf5lib', reason 'Permission denied'
Error in mydir.create(name) :
failed to create directory ‘Rhdf5lib_1.14.2/Rhdf5lib’
I really hope someone can help me with this problem, thanks a lot!
Try starting your R session with sudo R. Another option is to specify a lib argument, e.g. install.packages("SomePackage", lib = "/home/user/R/...")
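For a manual install, a minimal sketch of that second option, assuming the tarball named in the error message (Rhdf5lib_1.14.2.tar.gz) has already been downloaded into the working directory; ~/R/library is a placeholder for any user-writable directory:
# Point this session at a user-writable library, then install the local tarball
dir.create("~/R/library", recursive = TRUE, showWarnings = FALSE)
.libPaths(c("~/R/library", .libPaths()))
install.packages("Rhdf5lib_1.14.2.tar.gz", repos = NULL, type = "source")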
I tried the following command
export PATH=$PATH:path/to/your/R
and ran R directly; this worked for me.

How to remove a 'permission denied' file from folder within R

I am downloading a large xlsx file as part of a function. The file is removed with file.remove() on Linux and Mac, but on Windows machines I get 'permission denied'. Below is the code for my function.
download.file(
'http://mirtarbase.mbc.nctu.edu.tw/cache/download/7.0/miRTarBase_MTI.xlsx',
'miRTarBase.xlsx', mode = "wb")
readxl::read_excel('miRTarBase.xlsx') -> miRTarBase
write.csv(miRTarBase, 'miRTarBase.csv')
read.csv('miRTarBase.csv', row.names = 1) -> miRTarBase
file.remove("miRTarBase.xlsx")
I get the following error message in my console
Warning message:
In file.remove("miRTarBase.xlsx") :
cannot remove file 'miRTarBase.xlsx', reason 'Permission denied'.
Again, this warning only appears on Windows.
Furthermore, after checking the properties of the file itself, the 'Read-only' attribute is unchecked.
By contrast, the following code works perfectly fine, so I do not think the issue is with the folder either:
file.remove("miRTarBase.csv")
I believe the issue lies in how .xlsx files are treated on Windows.
When I try to delete the .xlsx file while RStudio is still running, I get a 'File in use' warning. After closing the R session, the .xlsx file can be deleted with no hassle.
This has confused me because I am not used to working on Windows. Has anyone had this issue before? I would appreciate any help that can be given. Many thanks.
Have you tried saving it as a temporary file on Windows?
tmp <- tempfile()
download.file(
'http://mirtarbase.mbc.nctu.edu.tw/cache/download/7.0/miRTarBase_MTI.xlsx', tmp, mode = "wb")
readxl::read_excel(tmp) -> miRTarBase
write.csv(miRTarBase, 'miRTarBase.csv')
read.csv('miRTarBase.csv', row.names = 1) -> miRTarBase
file.remove(tmp)
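If you would rather keep the original file name, another workaround I have seen is forcing a garbage collection before the removal: older readxl versions could leave the file handle open on Windows until the reader's objects were collected, and gc() releases it. A hedged sketch:
readxl::read_excel("miRTarBase.xlsx") -> miRTarBase
gc()  # workaround: release any lingering file handle before deleting (Windows)
file.remove("miRTarBase.xlsx")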

Importing a csv into R using read.csv in Ubuntu

I have just installed Ubuntu on my computer and I am re-running some code that previously worked on Windows. I have copied my directories, with all my files, into Ubuntu.
When I run this line of code to import a database into R, I get the following error:
Annot <- read.csv("~/Documents/DATABASES/Functional_Annotations/Salar_2_Annot_light.csv", header = TRUE)
Error in file(file, "rt") : cannot open the connection
In addition: Warning message: In file(file, "rt") : cannot open file
'/home/cd46/Documents/DATABASES/Functional_Annotations/Salar_2_Annot_light.csv':
No such file or directory
The code is right; it hasn't changed since before. In fact, if I run:
setwd("~/Documents/DATABASES/Functional_Annotations")
it works fine and recognizes the directory. And the file is there too.
I am not sure what this can be; does anyone have a suggestion? The only thing I have done is switch over to Ubuntu, so I would imagine the problem lies there.
I installed the readr package in R and then simply wrote this:
df <- read_csv("/your path/file.csv")
This worked for me and solved the problem.
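One thing worth checking when moving from Windows to Ubuntu: Linux file systems are case-sensitive, so a capitalisation mismatch anywhere in the path will fail even though the "same" path worked on Windows. A small sketch to narrow it down, reusing the path from the question:
# Each FALSE points at the level where the path breaks
dir.exists("~/Documents/DATABASES/Functional_Annotations")
file.exists("~/Documents/DATABASES/Functional_Annotations/Salar_2_Annot_light.csv")
# Compare what R actually sees against the expected file name
list.files("~/Documents/DATABASES/Functional_Annotations")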

Importing an Excel file from a URL using read.xls

I'm trying to use read.xls from gdata to import an Excel file directly into R. I'm on a Windows machine running 64-bit R.
I have checked my PATH variable for Perl and it appears to be set correctly, so that doesn't seem to be the problem. Here's my code, with the error attached below. Does anyone have any pointers on how I can get this done?
require(RCurl)
require(gdata)
url <- "https://dl.dropboxusercontent.com/u/27644144/NADAC%2020140101.xls"
test <- read.xls(url)
The error I'm getting is:
Error in xls2sep(xls, sheet, verbose = verbose, ..., method = method, :
Intermediate file 'C:\Users\Me\AppData\Local\Temp\RtmpeoJNxP\file338c26156d7.csv' missing!
In addition: Warning message:
running command '"C:\STRAWB~1\perl\bin\perl.exe" "C:/Users/Me/Documents/R/win-library/3.0/gdata/perl/xls2csv.pl" "https://dl.dropboxusercontent.com/u/27644144/NADAC%2020140101.xls" "C:\Users\Me\AppData\Local\Temp\RtmpeoJNxP\file338c26156d7.csv" "1"' had status 22
Error in file.exists(tfn) : invalid 'file' argument
@G.G is correct that read.xls does not support https. However, if you simply replace the https with http in the URL, you should be able to download the file.
Give this a try:
require(RCurl)
require(gdata)
url <- "http://dl.dropboxusercontent.com/u/27644144/NADAC%2020140101.xls"
test <- read.xls(url)
read.xls supports http and ftp but does not support https. Download it first and then use read.xls with the downloaded file.
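A minimal sketch of that download-first approach, reusing the URL from the question. On a reasonably recent R, download.file() handles https itself; older Windows builds may need method = "wininet":
require(gdata)
url <- "https://dl.dropboxusercontent.com/u/27644144/NADAC%2020140101.xls"
tf <- tempfile(fileext = ".xls")
download.file(url, tf, mode = "wb")  # fetch the file over https first
test <- read.xls(tf)                 # then read the local copy with gdata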

Error when using getGEO() in package GEOquery

I'm running the following code in R:
library(GEOquery)
mypath <- "C:/Users/Farzin/Desktop/BIOC"
GDS1 <- getGEO('GDS1', destdir = mypath)
But I'm getting the following error:
Using locally cached version of GDS1 found here:
C:/Users/Farzin/Desktop/BIOC/GDS1.soft.gz
Error in read.table(con, sep = "\t", header = FALSE, nrows = nseries) :
invalid 'nlines' argument
Could anyone please tell me how I could get rid of this error?
I have had the same error using GEOquery (version 2.23.5) with R and Bioconductor on Ubuntu (12.04), whatever GDS file I queried. Could it be that the GEOquery package is faulty?
In my experience, getGEO is extremely finicky; I commonly experience issues connecting to the GEO server. If the connection drops during download, getGEO leaves a partial file behind. And since that partial file is there, when you try to re-download, getGEO uses the cached, partially downloaded file and runs into the error you see (because it is not the full file).
To solve this, delete the cached SOFT file and retry the download.
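A short sketch of that fix, reusing the path from the question:
library(GEOquery)
mypath <- "C:/Users/Farzin/Desktop/BIOC"
# Delete the partially downloaded cache, then fetch a fresh copy
file.remove(file.path(mypath, "GDS1.soft.gz"))
GDS1 <- getGEO("GDS1", destdir = mypath)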
