I wonder how to convert line-ending characters from Windows to Unix using R.
I saw in another post that it's possible using the write() function, but when I try that it doesn't work (it returns an empty file). Instead I'd like to use the write.table() command, if that's possible.
Let's write some text:
library(readr)
text <- c("line one", "line two")
write_lines(text, file = "text.linux.txt", sep = "\n")
write_lines(text, file = "text.macos.txt", sep = "\r")
write_lines(text, file = "text.windows.txt", sep = "\r\n")
There are similar options, e.g. eol in write.table() and write_csv(), for setting the end-of-line characters.
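As a small sketch of that eol argument: write a toy data frame with Windows line endings, then inspect the raw bytes (readLines() would silently strip the endings, so readBin() is used instead).

```r
# Sketch: force Windows (CRLF) line endings with write.table()'s eol argument
df <- data.frame(a = c("line one", "line two"))
tf <- tempfile(fileext = ".txt")
write.table(df, tf, eol = "\r\n", row.names = FALSE, col.names = FALSE, quote = FALSE)
# Read the raw bytes so the line endings are preserved exactly
txt <- rawToChar(readBin(tf, "raw", n = file.size(tf)))
# txt is "line one\r\nline two\r\n" - each line ends in CR LF
```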
When a file containing a table or data frame with character values is created in R with write.table() or write.csv() and then opened in a text editor, all the values appear in double quotes, for example
"MM","M786"
"MM","M797"
"MM","M801"
Is there a way to save the table or data frame in R so that the double quotes do not show when the file is opened in a text editor? The output should be
MM,M786
MM,M797
MM,M801
Thanks!
Use quote = FALSE:
write.csv(df, 'out.csv', row.names = FALSE, quote = FALSE)
write.table(df, 'out.txt', quote = FALSE, row.names = FALSE)
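To illustrate, a minimal sketch with a toy data frame matching the question's values:

```r
# Sketch: quote = FALSE suppresses the double quotes around character fields
df <- data.frame(code = c("MM", "MM", "MM"), id = c("M786", "M797", "M801"))
tf <- tempfile(fileext = ".csv")
write.csv(df, tf, row.names = FALSE, quote = FALSE)
readLines(tf)
# -> "code,id" "MM,M786" "MM,M797" "MM,M801"
```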
We could use fwrite from data.table, which by default quotes a field only when it actually needs quoting:
library(data.table)
fwrite(df, 'out.csv')
fwrite(df, 'out.txt')
In R, how does one read a file delimited by the vertical bar "|" (ASCII 0x7C), and also convert that delimiter? I need to split on whole numbers inside the file, so strsplit() does not help me.
I have R code that reads a CSV file, but it still retains the vertical bar "|" character. This file uses "|" as the separator between fields. When I try to read it with read.table() I get a comma "," separating every individual character. I also tried tab_spanner_delim(delim = "|") to convert the vertical bars after read.delim("file.csv", sep = "|") read the file, but even that does not work. I'm new to special-character handling in R.
read.table(text = gsub("|", ",", readLines("file.csv")))
dat_csv <- read.delim("file.csv", sep="|")
x <- dat_csv %>% tab_spanner_delim(delim = "|")
dput() output from read.table(text = gsub("|", ",", readLines("file.csv"))):
",\",R,D,|,I,|,7,8,|,0,1,0,|,0,0,1,2,|,8,8,1,0,1,|,1,|,7,|,1,0,5,|,1,1,6,|,1,9,9,9,1,2,2,0,|,0,0,:,0,0,|,|,A,M,|,6,|,|,|,|,|,|,|,|,|,|,|,|,|,\",",
",\",R,D,|,I,|,7,8,|,0,1,0,|,0,0,1,2,|,8,8,1,0,1,|,1,|,7,|,1,0,5,|,1,1,6,|,1,9,9,9,1,2,2,6,|,0,0,:,0,0,|,4,.,9,|,|,6,|,|,|,|,|,|,|,|,|,|,|,|,|,\","
dput() output from dat_csv <- read.delim("file.csv", sep = "|"):
"RD|I|78|010|0012|88101|1|7|105|116|19991220|00:00||AM|6|||||||||||||",
"RD|I|78|010|0012|88101|1|7|105|116|19991226|00:00|4.9||6|||||||||||||"
We can read the data line by line using readLines, remove unwanted characters at each end of every line using trimws, paste the strings into one string with the newline character (\n) as the collapse argument, and pass that string to read.table to read the data as a data frame.
data <- read.table(text = paste0(trimws(readLines('file.csv'),
whitespace = '[", ]'), collapse = '\n'), sep = '|')
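Incidentally, the comma-between-every-character output shown in the question comes from gsub("|", ",", ...): without fixed = TRUE, "|" is treated as regex alternation of two empty patterns and matches the empty string at every position. A minimal sketch:

```r
line <- "RD|I|78|010"
# "|" as a regex matches the empty string between every character:
gsub("|", ",", line)                # ",R,D,|,I,|,7,8,|,0,1,0,"
# fixed = TRUE matches the pipe literally:
gsub("|", ",", line, fixed = TRUE)  # "RD,I,78,010"
```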
How to define "," as the column separator (sep) in read.csv in R?
I have tried read.csv(file = x, header = FALSE, sep = '","'), which doesn't work correctly.
sep can only be a single character, but you can read your file x e.g. with readLines and exchange the "," separator for e.g. \t using gsub:
read.table(text=gsub("\",\"", "\t", readLines("x")))
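A self-contained sketch with a toy file; a second gsub strips the outer quotes that the substitution leaves at the start and end of each line:

```r
# Sketch: treat the three-character sequence "," (quote-comma-quote) as the separator
lines <- c('"MM","M786"', '"MM","M797"')
tf <- tempfile(fileext = ".csv")
writeLines(lines, tf)
# swap "," for a tab, then drop the leading and trailing quotes
cleaned <- gsub('^"|"$', '', gsub('","', '\t', readLines(tf)))
d <- read.table(text = cleaned, sep = '\t')
# d is a two-column data frame: V1 = MM, MM; V2 = M786, M797
```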
When I issue write.csv(dataframe_name, "File_name.csv"), I add the parameters row.names = FALSE and na = " " for ease of reading in MS Excel. Is there a default option in R to always set these parameters?
You can easily write a function that masks write.csv:
write.csv <- function(..., row.names = FALSE, na = ' ') {
  utils::write.csv(..., row.names = row.names, na = na)
}
and place this in your .Rprofile file [or build a simple package which exports this....]
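A quick check that the wrapper picks up the defaults (a sketch; the wrapper body is the same one shown above):

```r
# Sketch: masking write.csv so plain calls get row.names = FALSE, na = ' '
write.csv <- function(..., row.names = FALSE, na = ' ') {
  utils::write.csv(..., row.names = row.names, na = na)
}
df <- data.frame(x = c(1, NA), y = c("a", "b"))
tf <- tempfile(fileext = ".csv")
write.csv(df, tf)  # no explicit row.names/na needed
readLines(tf)      # header, then 1,"a" and the NA written as a space
```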
You can also use the write.table function:
write.table(dataframe_name, file = "file.csv", sep = ",", row.names = FALSE, ...)
That worked for me, at least.