My input in a csv file was not recognised by R

I previously read a string/factor column containing the multiplication symbol (×) from a csv file into R, and R recognised it. After my recent update, R stopped recognising it. My previous code was
read.csv("Histology4.csv", stringsAsFactors = TRUE)
I would like to keep the multiplication symbol (×) in my dataframe. Does anyone have any suggestions? Thanks in advance!
(Screenshots: the column in the csv file, and the same column after reading into R.)
Regards
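A symbol that stops being recognised after an update often points to a file-encoding mismatch rather than a parsing problem. A minimal diagnostic sketch, assuming the file is UTF-8 ("latin1" is the other common candidate on Windows); the column name below is hypothetical:
# Declare the file's encoding explicitly when importing
df <- read.csv("Histology4.csv", stringsAsFactors = TRUE, fileEncoding = "UTF-8")
# Check whether the × symbol survived (Size is a hypothetical column name)
levels(df$Size)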

Related

R misreading csv files after modifications on Excel

This is more of a curiosity.
Sometimes I modify csv files in Excel rather than in R (say I manage to find a missing piece of info and type it into the csv file), of course maintaining commas and quotes as they were.
Every time I do this, R becomes unable to read the csv file, i.e. it imports everything as a single column exactly as it appears in Excel, rather than separating the values (no option like sep= or quote= changes this).
Does anyone know why this happens?
Thanks a lot
An example
This was readable:
state,"city","county"
AK,"Anchorage",""
AK,"Haines",""
AK,"Juneau","Juneau"
After adding the missing info under "county", R fails to import it as a data frame, reading it instead as a single vector.
state,"city","county"
AK,"Anchorage","Anchorage"
AK,"Haines","Haines"
AK,"Juneau","Juneau"
Edit:
I'm just running the basic read.csv
df <- read.csv("C:/directory/df.csv")
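When Excel re-saves a csv it can silently switch the delimiter (many locales use semicolons) or prepend a byte-order mark, and either change makes read.csv() collapse everything into one column. A diagnostic sketch using the path from the question:
# Look at the raw first lines to see which separator Excel actually wrote
readLines("C:/directory/df.csv", n = 3)
# If the values are now semicolon-separated, read.csv2() expects that format
df <- read.csv2("C:/directory/df.csv")
# If Excel added a byte-order mark, this strips it during import
df <- read.csv("C:/directory/df.csv", fileEncoding = "UTF-8-BOM")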

Read an excel file which has = in a column in R

I have an excel sheet which has formulas in one column, like C=(A-32)/1.8. If I read it using the function read_excel, it shows an "unexpected symbol" error for that column. Need help in reading this.
I think you need to force each column's type with the col_types = argument of the read_excel() function in the readxl package. You can specify the type "text", which reads the cells as they are.
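A minimal sketch of that suggestion, with a hypothetical file name; a single col_types value is recycled across every column:
library(readxl)
# Import every column as text so formulas like C=(A-32)/1.8
# arrive as plain strings instead of being parsed
df <- read_excel("formulas.xlsx", col_types = "text")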

Convert SPSS file to CSV and read to create Data Table

I have an SPSS file from which I am creating a data table using an R script. The problem is that it's taking a while to load the data table. I want to convert the SPSS (.sav) file into a CSV file first and then read the CSV file to create the data table. So far I have tried multiple pieces of code, but none of them worked properly.
Here's the code, which I got from this.
I think the foreign package in R can be used to solve this problem.
library(foreign)
# to.data.frame = TRUE returns a data frame (not a list), which write.table() expects
write.table(read.spss("inFile.sav", to.data.frame = TRUE),
            file = "outFile.csv", quote = TRUE, sep = ",", row.names = FALSE)
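Since the end goal is a data table, a minimal sketch of the second half of the round trip, assuming the data.table package is installed; fread() is usually much faster than read.csv() on large files:
library(data.table)
# Read the exported csv straight into a data.table
dt <- fread("outFile.csv")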

RStudio appears to be importing data incorrectly. Any suggestions?

I'm currently using RStudio to analyze a few sets of data. When I open the data in R, I see the correct data, which should be years (see photo).
However, when I try to view the data in RStudio, the same column contains numbers ranging from 1-60.
I'm specifically looking at MobileMember$Birth in these screenshots.
Any ideas on how this can be fixed? Thanks in advance for your help!
How are you importing the data into R? As a csv or as .txt? Try saving the excel file as a csv and importing it that way:
MobileMember <- read.csv("file path")  # replace with the actual path to the csv
It sounds like R is reading the column as an integer and then ranking them based on the year. Try changing the column to a factor.
You can see how R is reading the column using the str() command
str(MobileMember)
If it is listed as an integer, change the column to a factor using:
MobileMember$Birth <- as.factor(MobileMember$Birth)
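If the 1-60 values turn out to be the factor's internal codes (which would explain a year column displaying small integers), the usual idiom recovers the actual years; a sketch assuming Birth has already been read as a factor:
# Go via character first; as.numeric() on a factor returns its internal codes
MobileMember$Birth <- as.numeric(as.character(MobileMember$Birth))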

Why does R 2.14.1 output certain special characters incorrectly in xls?

Using R 2.14.1, I sometimes output xls files. However, recently I've noticed that hyphens (-) in the data object being written out are converted into a code similar to +AC0 in the actual xls file. This persists when reading back into R. Similarly, underscores are converted to .A+. or something similar.
Example code:
write.table(obj1, file="ex1.xls", sep="\t", row.names=F, na="")
I can't remember this happening in previous versions of R.
Any ideas on solutions?
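Codes like +AC0 are what UTF-7 encoding produces for a hyphen, which suggests the file is being written or re-read with a mismatched encoding. A hedged sketch forcing UTF-8 on both sides of the round trip (obj1 as in the question):
# Write with an explicit encoding instead of the locale default
write.table(obj1, file = "ex1.xls", sep = "\t", row.names = FALSE, na = "",
            fileEncoding = "UTF-8")
# Read it back declaring the same encoding
obj2 <- read.table("ex1.xls", sep = "\t", header = TRUE, fileEncoding = "UTF-8")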
