Importing data from Excel to vector in R

I am a novice in R and I have been having some trouble trying to get R and Excel to cooperate.
I have written code that compares two vectors with each other and determines the differences between them:
data.x <- read.csv(file.choose(), header = TRUE)
data.y <- read.csv(file.choose(), header = TRUE)
# drop any entries matching either pattern
newdata.x <- grep("DAG36|G379", data.x, value = TRUE, invert = TRUE)
newdata.x
newdata.y <- grep("DAG36|G379", data.y, value = TRUE, invert = TRUE)
newdata.y
# entries present in one vector but not the other
setdiff(newdata.x, newdata.y)
setdiff(newdata.y, newdata.x)
The data I want to transfer from Excel to R is a long row of numbers, like so:
“312334-2056”, “457689-0932”, “857384-9857”,….,
There are about 350 of these numbers, each placed in its own cell along a single row.
I used the Excel formula = """" & A1 & """" to put double quotes around every number so that R would read it properly.
At first I tried to simply copy/paste the data directly into a vector in R, but it's as if R won’t read it as a single row of data and therefore splits it up.
I also tried to save the Excel file as a CSV file, but that didn’t work either.
Lastly I tried to open it directly in to R using the command:
data.x <- read.csv(file.choose(), header = TRUE)
But when I type data.x and press Enter, it simply says:
<0 rows> (or 0-length row.names)
I simply can’t figure out what I’m doing wrong. Any help would be greatly appreciated.

It's hard to assess without a reproducible example, but you should be able to transpose the Excel file into a single column. Then import using read_csv from the readr package. Take a look at the tidyverse package, which contains some great tools to import and work with this type of data.
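For instance, a minimal sketch in base R (the file name "ids.csv" is an assumption; it reads the single row without a header and flattens it into one character vector):
# read the row of IDs as character, not as column names or factors
raw <- read.csv("ids.csv", header = FALSE, colClasses = "character")
ids <- unname(unlist(raw[1, ]))  # flatten the row into a character vector
length(ids)  # should be about 350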

I use https://github.com/tidyverse/readxl/. It makes it easy to maintain formatting from Excel into type-safe tibbles.
If you can share some sample data, a working solution can be generated.
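In the meantime, a rough sketch with readxl (the file name "ids.xlsx" and the single-row layout are assumptions based on the question):
library(readxl)
# read the sheet with no header so the IDs stay as data, then take row 1
raw <- read_excel("ids.xlsx", col_names = FALSE)
ids <- as.character(unlist(raw[1, ]))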

Related

How do I get EXCEL to interpret character variable without scientific notation in R using fwrite?

I have a relatively simple issue: when writing out in R with fwrite from the data.table package, a character vector is being interpreted as scientific notation by Excel. You can run the following code to reproduce the issue:
# create example
library(data.table)
samp <- data.table(id = c("7E39", "7G32", "5D99999"))
fwrite(samp, "test.csv", row.names = FALSE)
When you read this back into R you get the values back no problem, as long as scientific notation is disabled. My less code-capable colleagues work with the CSV directly in Excel, and they see the values interpreted as scientific notation (e.g. 7E39 becomes 7E+39). They can attempt to change the variable to text, but Excel then writes out the full number with all the zeros. I want them to see the original "7E39" from the data table created. Any ideas how to avoid this issue?
PS: I'm working with millions of rows so write.csv is not really an option
EDIT:
One workaround I've found is to just create a mock variable with quotes:
samp <- data.table(id = c("7E39", "7G32", "5D99999"))[, id2 := shQuote(id)]
I prefer a tidyr solution (pun intended), as I hate unnecessary columns
EDIT2:
Following R2Evan's solution, I adapted it to data.table with the following (adding another numerical column, to see if any changes occurred):
# create example
samp <- data.table(id = c("7E39", "7G32", "5D99999"))[, second_var := c(1, 2, 3)]
fwrite(samp[, id := sprintf("=%s", shQuote(id))],
       "foo.csv", row.names = FALSE)
It's a kludge, and dang-it for Excel to force this (I've dealt with it before).
write.csv(data.frame(id = sprintf("=%s", shQuote(c("7E39", "7G32", "5D99999")))),
          "foo.csv", row.names = FALSE)
This is forcing Excel to consider that column a formula, and interpret it as such. You'll see that in Excel, it is a literal formula that assigns a static string.
This is obviously not portable and prone to all sorts of problems, but that is Excel's way in this regard.
(BTW: I used write.csv here, but frankly it doesn't matter which function you use, as long as it passes the string through.)
Another option, but one that your consumers will need to do, not you.
If you export the file "as is", meaning the cell content is just "7E39", then an auto-import within Excel will always try to be smart about that cell's content. However, you can manually import the data.
Using Excel 2016 (32bit, on win10_64bit, if it matters):
Open Excel (first), have an (optionally empty) worksheet already open
On the ribbon: Data > Get External Data > From Text
Navigate to the appropriate file (CSV)
Select "Delimited" (file type), click Next, select "Comma" (and optionally deselect any others that may default to selected), Next
Click on the specific column(s) and set the "Default data format" to "Text" (this will need to be done for any/all columns where this is a problem). Multiple columns can be Shift-selected (for a range of columns), but not Ctrl-selected. Finish.
Choose the top-left cell to import/paste the data (or a new worksheet)
Select Properties..., and deselect "Save query definition". Without this step, the data is considered a query into an external data source, which may not be a problem but makes some things a little annoying. (For example, try to highlight all data and delete it ... Excel really wants to make sure you know what you're doing there.)
This method provides a portable solution. It "punishes" the Excel users, but anybody/anything else will still be able to consume the files directly without change. The biggest disadvantage is that you won't know if somebody loads it incorrectly unless/until they get odd results when they try to use the data and some fields have been silently converted.

read data into R

The World Health Organization dataset is available here: http://www.filedropper.com/who
When the data is read using fread (from the data.table package) or read_csv (from the readr package), some values come wrapped in \r characters and are read as character type, like so:
"\r31.1\r".
I checked the dataset in Notepad and indeed it looks odd, as these values are wrapped in quotes (' '). However they are numeric, and when the regular read.csv is used there is no such problem.
What's the reason behind this? How to fix?
The '\r' is a special character: the carriage return, which forms part of the newline delimiter ("\r\n") in files created on Windows.
When using read_csv, setting the argument escape_backslash = TRUE might do the trick (in readr this argument is actually exposed by read_delim rather than read_csv).
Check this for further reading.
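If that doesn't help, a hedged workaround (not from the original answer) is to strip the stray carriage returns after import and let R re-guess the column types ("WHO.csv" is an assumed file name):
library(readr)
who <- read_csv("WHO.csv")
# remove embedded \r characters, then re-convert each column's type
who[] <- lapply(who, function(x) {
  if (is.character(x)) type.convert(gsub("\r", "", x, fixed = TRUE), as.is = TRUE) else x
})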

importing txt into R

I am trying to read an ftp file from the internet ("ftp://ftp.cmegroup.com/pub/settle/stlint") into R, using the following command:
aaa <- read.table("ftp://ftp.cmegroup.com/pub/settle/stlint", sep = "\t", skip = 2, header = FALSE)
The result shows the 8th, 51st, 65th, 71st, 72nd, 73rd, 74th, etc. rows of the resulting dataset with subsequent rows appended onto the end. Basically, instead of returning
{row8}
{row9}
etc
{row162}
{row163}
It returns (adding in the quotes around the \n)
{row8'\n'row9'\n'etc...etc...'\n'row162}
{row163}
If it seems like I'm picking arbitrary numbers, run the code above and take a look at the actual FTP file on the internet (as of mid-day Feb 18) and you'll see I'm not; it really is adding 155 rows onto the end of the 8th row. So what I'm looking for is simply a way to read the file in without the random appending of rows. Thanks, and I apologize in advance; I'm new to R and was not able to find this fix after a while of searching.
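No accepted fix appears in this thread, but one likely culprit is stray quote characters (for example apostrophes) in the text, which read.table treats as string delimiters, swallowing the newlines in between. A sketch of that guess:
# disable quote handling so apostrophes in the data cannot swallow newlines
aaa <- read.table("ftp://ftp.cmegroup.com/pub/settle/stlint",
                  sep = "\t", skip = 2, header = FALSE,
                  quote = "", comment.char = "", fill = TRUE)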

Using R to write a .mat file not giving the right output?

I had a .csv file that I wanted to read into Octave (I originally tried to use csvread). It was taking too long, so I tried to use R as a workaround: How to read large matrix from a csv efficiently in Octave
This is what I did in R:
forest_test <- read.csv('forest_test.csv')
library(R.matlab)
writeMat("forest_test.mat", forest_test_data = forest_test)
and then I went back to Octave and did this:
forest_test = load('forest_test.mat')
This is not giving me a matrix, but a struct. What am I doing wrong?
To answer your exact question, you are using the load function wrong. You must not assign its output to a variable if you just want the variables in the file to be inserted into the workspace. From Octave's load help text:
If invoked with a single output argument, Octave returns data
instead of inserting variables in the symbol table. If the data
file contains only numbers (TAB- or space-delimited columns), a
matrix of values is returned. Otherwise, 'load' returns a
structure with members corresponding to the names of the variables
in the file.
With examples, following our case:
## inserts all variables in the file in the workspace
load ("forest_test.mat");
## each variable in the file becomes a field in the forest_test struct
forest_test = load ("forest_test.mat");
But still, the link you posted about Octave being slow with CSV files makes reference to Octave 3.2.4, which is quite an old version. Have you confirmed this is still the case in a recent version (the last release was 3.8.2)?
There is a function designed to convert data frames to matrices:
?data.matrix
forest_test <- data.matrix(read.csv('forest_test.csv'))
library(R.matlab)
writeMat("forest_test.mat", forest_test_data = forest_test)

Load Excel file to R while setting column equal to a factor

I have an Excel file that I am trying to load into R using the odbcConnectExcel and sqlQuery commands from the RODBC package. One of the columns has numerical values with plus or minus signs, such as '5+ or '3-. However, if I do something like,
conn <- odbcConnectExcel("file.xls")
sqlQuery(conn, "SELECT * FROM `Sheet1$`")
then the column with the plus and minus signs will be returned as a numerical column with those symbols stripped. Is there a way to have this column read in as a factor in which the signs are maintained? I would prefer to not have to convert the file to another format first.
Thanks.
Data like this becomes a factor if you use the xlsReadWrite (http://www.swissr.org/software/xlsreadwrite) package to read the file:
library(xlsReadWrite)
x <- read.xls(file="file.xls")
However, note that you need to do more than just install.packages("xlsReadWrite") to get this package to run; you need to download another file or so, I forget the details.
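If xlsReadWrite proves hard to set up, a rough alternative sketch using the readxl package (not part of this answer) forces every column in as text so the '+'/'-' suffixes survive, then converts the column of interest to a factor:
library(readxl)
# a single col_type is recycled across all columns
x <- read_excel("file.xls", col_types = "text")
x$score <- factor(x$score)  # "score" is a hypothetical column name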
This doesn't directly address your question, but hopefully it will help:
This is the best summary of options for connecting to Excel that I have seen: Export Data Frames To Multi-worksheet Excel File. While it deals generally with exporting, importing is also possible with most of these approaches.
My favorite is actually the RDCOMClient because it provides total control over Excel as an application.
