Using read.csv when a data entry is a space (not blank!)

I am having a problem using read.csv() in R. I am trying to import a file that has been saved as a .csv file in Excel. Missing values are blank, but I have a single entry in one column which looks blank but is in fact a space. Using the standard command that I have been using for similar files produces this error:
raw.data <- read.csv("DATA_FILE.csv", header=TRUE, na.strings="", encoding="latin1")
Error in type.convert(data[[i]], as.is = as.is[i], dec = dec, na.strings = character(0L)) :
invalid multibyte string at ' floo'
I have tried a few variations, adding arguments to the read.csv() command such as na.strings=c(""," ") and strip.white=TRUE, but these result in the exact same error.
It is a similar error to what you get when you use the wrong encoding option, but I am pretty sure this shouldn't be a problem here. I have of course tried manually removing the space (in Excel), and this works, but as I'm trying to write generic code for a Shiny tool, this is not really optimal.
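A hedged sketch worth trying: read.csv()'s encoding= argument only marks input strings as latin1, while fileEncoding= actually re-encodes the file as it is read, and combining that with the na.strings and strip.white variations above may catch the lone space as well.
# Sketch, not a confirmed fix: re-encode the file via fileEncoding= and
# treat both empty strings and a lone space as missing values.
raw.data <- read.csv("DATA_FILE.csv", header = TRUE,
                     na.strings = c("", " "),
                     strip.white = TRUE,
                     fileEncoding = "latin1")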

Related

Reading csv file using R and RStudio

I am trying to read a csv file in R, but I am getting some errors.
This is what I have (and I have already set the correct path):
mydata <- read.csv("food_poisioning.csv")
But I am getting this error
Error in make.names(col.names, unique = TRUE) :
invalid multibyte string at '<ff><fe>Y'
In addition: Warning messages:
1: In read.table(file = file, header = header, sep = sep, quote = quote, :
line 1 appears to contain embedded nulls
2: In read.table(file = file, header = header, sep = sep, quote = quote, :
line 2 appears to contain embedded nulls
I believe I am getting this error because my csv file is actually not separated by commas, but by spaces. This is what it looks like:
I tried using sep=" ", but it didn't work.
If you're having difficulty using read.csv() or read.table() (or writing other import commands), try the "Import Dataset" button on the Environment panel in RStudio. It is especially useful when you are not sure how to specify the table format or when the format is complex.
For your .csv file, use "From Text (readr)..."
A window will pop up and allow you to choose a file/URL to upload. You will see a preview of the data table after you select a file/URL. You can click on the column names to change the column class or even "skip" the column(s) you don't need. Use the Import Options to further manage your data.
Here is an example using CreditCard.csv from Vincent Arel-Bundock's GitHub projects:
You can also modify and/or copy and paste the code in Code Preview, or click Import to run the code when you are ready.
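Outside the dialog, a hedged guess at the underlying problem: the '<ff><fe>' bytes in the error look like a UTF-16 little-endian byte-order mark, which would also explain the embedded-null warnings. Declaring that encoding to readr directly may be enough; the space delimiter below is an assumption taken from the question.
library(readr)
# Sketch under the UTF-16LE assumption; adjust delim if the file
# is separated by something other than single spaces.
mydata <- read_delim("food_poisioning.csv", delim = " ",
                     locale = locale(encoding = "UTF-16LE"))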

not able to read file using read.csv in R

I am not able to read a csv file in R. The file I imported needs some cleaning, such as removing text qualifiers (", ', etc.). Still, I am unable to read it; it shows the following error:
currency<-read.csv("02prepared data/scraped data kickstarter/film & video1.csv")
Error in type.convert(data[[i]], as.is = as.is[i], dec = dec, numerals = numerals, :
invalid multibyte string at '45,<30>97'
Here is the link to the file: https://drive.google.com/open?id=1ABXPoYxk8b4WCQuRAu-Hhh2OvpJ76PhH
You can try setting fileEncoding = 'latin1', as suggested in this answer:
https://stackoverflow.com/a/14363274/6304113
I tried the method in the link to read your file, and it works for me.
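Applied to the call above, the suggestion amounts to one extra argument:
# fileEncoding = "latin1" re-encodes the file as it is read, which avoids
# the "invalid multibyte string" failure in type.convert().
currency <- read.csv("02prepared data/scraped data kickstarter/film & video1.csv",
                     fileEncoding = "latin1")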

Importing a huge csv file when fread doesn't work in R

I want to import a big csv file into R (approximately 14 million rows and 13 columns), so I tried to use fread() from the data.table package with the following code:
my_data <- fread(my_file,
                 sep = ";",
                 header = TRUE,
                 na.strings = c("", " ", "NA"),
                 quote = "",
                 fill = TRUE,
                 check.names = FALSE,
                 stringsAsFactors = FALSE)
However, I got the following error:
Error in fread(path_alertes_profil, sep = ";", header = TRUE, na.strings = c("", :
Expecting 13 cols, but line 18533 contains text after processing all cols. Try again with fill=TRUE. Another reason could be that fread's logic in distinguishing one or more fields having embedded sep=';' and/or (unescaped) '\n' characters within unbalanced unescaped quotes has failed. If quote='' doesn't help, please file an issue to figure out if the logic could be improved.
Therefore I tried to import my file with the read_delim() function from the readr package, using the same parameters. It seemed to work, since my file appeared in the global environment (I'm working in RStudio). However, it only read 741629 rows instead of the 14+ million rows.
How can I solve this problem? (I tried to find a solution for the fread() error but didn't find any useful resource.)
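A hedged diagnostic before picking a reader: count the fields on each line and see where the 13-column assumption breaks. A minimal sketch with base R's count.fields(), mirroring the fread() settings above:
# ";" as separator and quoting disabled, as in the fread() call.
n_fields <- count.fields(my_file, sep = ";", quote = "")
table(n_fields)              # distribution of field counts per line
head(which(n_fields != 13))  # first few lines without exactly 13 fields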

Read csv in R with special characters

I am trying to read a data file into R with several delimited columns. Some columns have entries that are special characters (such as an arrow). read.table() comes back with an error:
incomplete final line found by readTableHeader
and does not read the file. I tried the UTF-8 and UTF-16 encoding options, which didn't help either. Here is a small example file.
I am not able to reproduce the arrow in this question box, so I am attaching an image of the Notepad screen of a small file (test1.txt).
Here is what I get when I try to open it.
test <- read.table("test1.TXT", header=T, sep=",", fileEncoding="UTF-8", stringsAsFactor=F)
Warning message: In read.table("test1.TXT", header = T, sep = ",",
fileEncoding = "UTF-8", : incomplete final line found by
readTableHeader on 'test1.TXT'
However, if I remove the second line (with the special character) and try to import the file, R imports it without problem.
test2.txt =
id, ti, comment
1001, 105AB, "All OK"
test <- read.table("test2.TXT", header=T, sep=",", fileEncoding="UTF-8", stringsAsFactor=F)
id ti comment
1 1001 105AB All OK
Although this is a small example, the file I am working with is very large. Is there a way I can import the file to R with those special characters in place?
Thank you.
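One hedged workaround, assuming it is the arrow's byte sequence that trips the decoder: read the lines without converting them, substitute anything unmappable, and hand the cleaned text to read.table().
# Sketch only: from = "" assumes the current locale; sub = "byte" replaces
# unmappable bytes with <xx> escapes instead of aborting the read.
raw_lines <- readLines("test1.TXT", warn = FALSE)  # warn = FALSE also silences the final-line warning
clean <- iconv(raw_lines, from = "", to = "UTF-8", sub = "byte")
test <- read.table(text = clean, header = TRUE, sep = ",", stringsAsFactors = FALSE)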

issues of reading csv files using read.table [duplicate]

This question already has answers here:
'Incomplete final line' warning when trying to read a .csv file into R
(17 answers)
Closed 9 years ago.
I am trying to import CSV files to graph for a project. I'm using R 2.15.2 on Mac OS X.
The first way tried
The script I'm trying to run to import the CSV file is this:
group4 <- read.csv("XXXX.csv", header=T)
But I keep getting this error message:
Error in read.table(file = file, header = header, sep = sep, quote = quote, :
object 'XXXXXX.csv' not found
The second way tried
I tried changing my working directory but got another error saying I can't change my working directory. So I went into the Preferences tab and changed the working directory to the folder that has my CSV files. But I still get the same error as in the first way.
The third way tried
Then I tried this script:
group4 <- read.table(file.choose(), sep="\t", header=T)
And I get this error:
Warning message:
In read.table(file.choose(), sep = "\t", header = T) :
incomplete final line found by readTableHeader on '/Users/xxxxxx/Documents/Programming/R/xxxxxx/xxxxxx.csv'
I've searched on the R site and all over the Internet, and nothing has got me to the point where I can import this simple CSV file into the R console.
The file is not in your working directory; change it, or use an absolute path.
If changing the working directory fails, you are pointing to a non-existent directory, or you do not have the privileges to access it.
The last line of your file is malformed.
As to the missing EOF (i.e. the last line of the file is incomplete)...
A data file should end with a newline character. Perhaps check whether that is the case for your file.
As an alternative, I would suggest trying readLines(). This function reads each line of your data file into a character vector. If you know the format of your input, i.e. the number of columns in the table, you could do this...
number.of.columns <- 5   # the number of columns in your data file
delimiter <- "\t"        # this is what separates the values in your data file
lines <- readLines("path/to/your/file.csv")                 # read all lines
values <- unlist(strsplit(lines, delimiter, fixed = TRUE))  # strsplit is already vectorised
data <- matrix(values, byrow = TRUE, ncol = number.of.columns)
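Note that this keeps every value as character and puts the header row into the matrix as well; drop the first element of lines beforehand, or convert the columns afterwards, if that matters.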
