Keep leading zeroes when exporting data from R to Excel

I am running a SQL query in R, where the MemberID column has 9 or 10 characters. When I manually copy and paste the results into Excel as text format, the MemberID column retains the leading zero for those members who have 9 digits. But when I automate R to run the script and export the data to Excel, it doesn't retain the leading zeroes in the MemberID column. It is a numeric column, and I tried changing it to character, but that doesn't work. Is there a way to keep the leading zeroes in Excel?
library(RODBC)
library(openxlsx)

Data <- sqlQuery(odbcChannel,
                 "Select distinct
                    PlanName,
                    MemberName,
                    MemberID,
                    PlanID
                  From Member
                  where PlanID = 'AR'")
Data$MemberID <- as.character(Data$MemberID)

WB <- createWorkbook()
addWorksheet(WB, sheetName = "Plan_AR")   # the argument is sheetName, not sheetname
writeData(WB, "Plan_AR", Data)
saveWorkbook(WB, file = "MemberByPlan.xlsx", overwrite = TRUE)
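One likely cause: the zeros are already gone by the time as.character() runs, because sqlQuery() returned MemberID as a numeric column. A hedged sketch of two fixes follows; the width of 10 is an assumption based on the 9-vs-10-digit description, and the as.is route only helps if the database column itself is text.

# Option 1: ask RODBC to return every column as character, so the
# zeros are never dropped (works only if the column is text in the DB).
Data <- sqlQuery(odbcChannel,
                 "Select distinct PlanName, MemberName, MemberID, PlanID
                  From Member where PlanID = 'AR'",
                 as.is = TRUE)

# Option 2: pad the numeric IDs back out to a fixed width of 10
# before writing, so a 9-digit ID regains its leading zero.
Data$MemberID <- sprintf("%010.0f", Data$MemberID)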

Related

Data import from Excel to R (all columns are character class)

I'm new to R and really need some help with an assignment I have for school.
I've created an xls file containing returns for companies as decimals, i.e. 0.023 (a 2.3% return).
The data are in 3 columns with some negative values, titles for each column in the first row, and no row names; just 130 observations of returns with the company names (the column names) at the top. All cells are formatted as General.
I converted the xls file to CSV on my Mac, so the file type became CSV-UTF-8 (comma delimited).
When I try to create a dataset in R, I import the CSV using the read.table command:
read.table("filename.csv", header = TRUE, sep = ";", row.names = NULL)
The dataset looks good, with all the individual numbers in the right place, but when I try
sapply(dataset, class)
all columns are character. I've tried as.numeric and it says 'list' object cannot be coerced to type 'double'.
The issue comes from the fact that you imported a dataset with commas, and R cannot interpret these as numeric (it requires a dot as the decimal separator).
Two ways to avoid this:
Import as you did and convert the data frame:
dataset <- apply(apply(dataset, 2, gsub, pattern = ",", replacement = "."), 2, as.numeric)
Or import the dataset directly, interpreting commas as the decimal separator, with read.csv2 (which is base R, so no extra package is needed):
read.csv2("filename.csv", fill = TRUE, header = TRUE)

read_excel() reading numeric type column as date type unless it's specified in col_types

I have a table in Excel with numeric, date, and character type columns. I use the read_excel() function from the readxl package to load the data into R. For most of the columns, read_excel by default does a good job of recognizing the column type.
Problem:
As the number of columns in the table can increase or decrease, I don't want to define col_types in read_excel to load the data.
Two numeric Excel columns, cost and revenue, have a '$' in front of the value, such as $200.0541. The dollar sign '$' seems to cause the function to mistakenly identify the cost and revenue columns as POSIXct type.
Since new numeric columns with '$' might be added later, is it possible to change the column types after loading the data (without a separate df$cost <- as.numeric(df$cost) for each column), through a loop?
Edit: link to sample - https://ethercalc.org/ogiqi9s51o45
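One hedged approach that side-steps the type guessing entirely: read every column as text, then loop over the columns and convert the ones that parse as numbers once '$' and ',' are stripped. "sample.xlsx" is a placeholder for the real workbook, and note that genuine date columns will stay character under this approach and may need as.Date() afterwards:

library(readxl)

# Force every column to text so the "$" columns cannot be guessed as POSIXct.
df <- read_excel("sample.xlsx", col_types = "text")

df[] <- lapply(df, function(x) {
  cleaned <- gsub("[$,]", "", x)                  # strip currency symbol and thousands separators
  nums <- suppressWarnings(as.numeric(cleaned))
  # Convert only when every non-missing value parsed cleanly; otherwise keep the text.
  if (identical(is.na(nums), is.na(x))) nums else x
})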

Writing a CSV including column zero in R

I have a data frame in R full of unique values taken from unique row names that come from different files. So I have a column 0 with the name of the file each row name came from, and a column 1 with the unique row names:
col0        col1
path/file1  name
path/file1  age
path/file2  color
path/file3  tree
path/file3  house
I want to export this as a CSV, but I haven't been able to get write.csv to include column 0.
I have also tried mutating column 0 into a regular column, but that doesn't work either.
Any ideas?
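A hedged guess at what "column 0" is: the data frame's row names, which write.csv() skips only if told to. If that guess is right, either emit them directly or promote them to a real column first; the file and column names below are placeholders:

# Option 1: let write.csv() emit the row names as the first field.
write.csv(df, "out.csv", row.names = TRUE)

# Option 2: make the row names an explicit column, then write without them.
df <- data.frame(col0 = rownames(df), df, row.names = NULL)
write.csv(df, "out.csv", row.names = FALSE)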

Transferring 1 row of numbers from Excel to R

I'm trying to transfer a row of numbers from Excel to R.
The idea is to put the numbers into a vector and compare it with another vector in order to find the differences between them.
I have assembled all the numbers along a single row, each in its own cell. But when I try to copy and paste the row into a vector in R, it doesn't contain all the numbers from the Excel sheet.
The row contains a substantial amount of numbers, so I reckon it has something to do with the capacity of the vector. Is there a different method for successfully transferring my numbers from Excel to R?
Try copying and pasting as a string, then using a function like strsplit() to split the string into a long vector and converting it to numeric. Here is example code with the steps:
Step 1: (In Excel) Remove all commas and non-numeric characters; you can keep decimals and the '-' separators.
Step 2: (In Excel) Copy the entire row.
Step 3: (In R)
number <- readClipboard(format = 1)     # Windows-only clipboard read
number <- strsplit(number, "\t")[[1]]   # one element per Excel cell
number <- strsplit(number, "-")         # split each cell at the '-'
final <- matrix(as.numeric(unlist(number)), nrow = length(number), byrow = TRUE)
You should end up with two columns: column 1 holds the number in each cell preceding the '-', and column 2 the number succeeding it.
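If the clipboard route keeps truncating, one alternative is to skip the clipboard and read the row straight from the workbook with readxl. A sketch, assuming the numbers sit in the first row of the sheet; "numbers.xlsx" is a placeholder file name:

library(readxl)

# Read the first row without treating it as a header, then flatten to a numeric vector.
row_df <- read_excel("numbers.xlsx", col_names = FALSE, n_max = 1)
vec <- as.numeric(unlist(row_df))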

R read.csv how NOT to convert numbers into numerical values but keep them as strings

I have a CSV file with all values in double quotes. One column is the id column, and it contains values such as this:
01100170109835
The problem I am having is that no matter what options I specify (as.is = TRUE, stringsAsFactors = FALSE, or numerals = 'no.loss'), read.csv always reads this id column as numeric and drops the leading 0s. This is such a fundamental operation that I am really baffled I can't find a solution.
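A hedged note on why those options do not help: as.is and stringsAsFactors govern factor conversion, not numeric parsing. The usual fix is colClasses, which pins the column's type before any parsing happens. A minimal sketch, where "file.csv" and the column name "id" stand in for the real ones:

# colClasses keeps the id column as character, so the leading zeros survive.
dat <- read.csv("file.csv", colClasses = c(id = "character"))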
