write_ods keeps writing an empty data frame to the directory? (R)

I am trying to export a data frame to an ODS sheet, but it is not working correctly.
I have tried exporting to several different directories, but all attempts have failed. I have been able to use read_ods correctly. With write_ods I get zero errors, yet the file I write is always empty when I open it.
library(readODS)

# Show the data frame, write it out, then read it back to verify the round trip
print(final)
write_ods(x = final, path = "C:/Users/Administrator/Desktop/SpendingOptimizerStreamlined/CoeffSheet.ods")
temp <- read_ods(path = "C:/Users/Administrator/Desktop/SpendingOptimizerStreamlined/CoeffSheet.ods")
print(temp)
I have printed the final data frame that I want to export, and it contains the correct data.
I then write to the file and get no errors.
To confirm that the previous command worked, I read the exported data back in.
When I print it, I see zero columns and zero rows.
I'm not quite sure why this keeps happening. Is the write_ods command still supported, or am I just doing something wrong? Thank you!
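A minimal sanity check, assuming the readODS package is installed: write to a temporary file and read it straight back, which separates a write_ods problem from a path or folder-sync problem. The final data frame below is a hypothetical stand-in.

library(readODS)

# Hypothetical stand-in for the real `final` data frame
final <- data.frame(channel = c("tv", "radio"), coef = c(0.42, 0.17))

# Write to a throwaway path that R fully controls, then read it straight back
tmp <- tempfile(fileext = ".ods")
write_ods(final, path = tmp)

print(read_ods(path = tmp))   # should show the same two rows
file.size(tmp)                # a tiny size would implicate the write step itself

If this round trip works, the problem is likely the destination folder (permissions, OneDrive/Dropbox syncing) rather than write_ods itself.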

Related

Read a weird txt file as data in R

I'm trying to get R to read data from a txt file, but the file is not properly formed, so it's giving me lots of errors.
Ideally, I'd like to extract a data frame to work with from this, but I truly don't know how.
The files are all in this link.
An example with any of them would work.
Thanks a lot!
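Without seeing the files it's hard to be specific, but a common starting point for malformed text files is to inspect the raw lines first and then let read.table tolerate ragged rows. The file name and delimiter below are assumptions.

# Assumed file name; look at the first few raw lines to learn the actual layout
raw <- readLines("weird.txt", n = 10)
print(raw)

# fill = TRUE pads short rows with NA instead of stopping with an error;
# adjust sep once the real delimiter is known
df <- read.table("weird.txt", sep = "", fill = TRUE,
                 header = FALSE, stringsAsFactors = FALSE)
str(df)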

Trying to merge columns from multiple csv's, but the merged dataframe is coming up NULL

My problem seems to be two-fold. I am using code that has worked before. I re-ran my scripts and got similar outputs, but saved them to a new location, and I have changed all of my setwd lines accordingly. Still, there may be an error with either setwd or the do.call function.
In R, I want to merge 25 CSVs that are located in a folder, keeping only certain columns.
My path is
/Documents/CODE/merge_file/2sp
So, I do:
setwd("/Documents/CODE")
but then I get an error saying it cannot change the working directory (this usually works fine). So I manually set the working directory via the Session menu in RStudio.
The next script seems to run fine:
myMergedData2 <-
  do.call(rbind,
          lapply(list.files(path = "/Documents/CODE/merge_file/2sp"),
                 read.csv))
myMergedData2 ends up in the global environment, but it says it is NULL (empty), though the console makes it look like everything is ok.
I would then like to save just these columns of information but I can't even get to this point.
myMergedData2 <- myMergedData2[c(2:5), c(10:12)]
And then add this
myMergedData2 <- myMergedData2 %>%
  mutate(richness = 2) %>%
  select(richness, everything())
And then I would like to save
setwd("/Documents/CODE/merge_file/allsp")
write.csv(myMergedData2, "/Documents/CODE/merge_file/allsp/2sp.csv")
I am trying to merge these data so I can use ggplot2 to show how my response variables (columns 2-5) vary according to my independent variables (columns 10-12). I have 25 different parameter sets with 50 observations in each CSV.
Ok, so the issue was that my Dropbox didn't have enough space, and I weirdly don't have permissions to do what I was trying on my university's H drive. Bizarre, but an easy fix: increasing the space on Dropbox allowed the CSVs to sync completely.
Sometimes the issue is minor!
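As an aside, there is a separate pitfall in the posted snippet: list.files returns bare file names by default, so read.csv can only find them if the folder happens to be the working directory. A sketch of the same merge with full paths (folder taken from the question):

files <- list.files(path = "/Documents/CODE/merge_file/2sp",
                    pattern = "\\.csv$", full.names = TRUE)
myMergedData2 <- do.call(rbind, lapply(files, read.csv))

With full.names = TRUE the merge no longer depends on what setwd did beforehand.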

How do I get EXCEL to interpret character variable without scientific notation in R using fwrite?

I have a relatively simple issue: when writing out in R with fwrite from the data.table package, I am getting a character vector interpreted as scientific notation by Excel. You can run the following code to reproduce the issue:
# create example
library(data.table)
samp = data.table(id = c("7E39", "7G32", "5D99999"))
fwrite(samp, "test.csv", row.names = FALSE)
When you read this back into R, the values come back fine, provided scientific notation is disabled. My less code-capable colleagues work with the CSV directly in Excel, where an id like 7E39 gets parsed as the number 7×10^39 and shown in scientific notation.
They can attempt to change the variable to text, but Excel then keeps the already-converted number, zeros and all. I want them to see the original "7E39" from the data table created. Any ideas how to avoid this issue?
PS: I'm working with millions of rows so write.csv is not really an option
EDIT:
One workaround I've found is to just create a mock variable with quotes:
samp = data.table(id = c("7E39", "7G32", "5D99999"))[, id2 := shQuote(id)]
I'd prefer a tidyr solution (pun intended), as I hate unnecessary columns.
EDIT2:
Following r2evans's solution, I adapted it to data.table as follows (adding another numeric column, to check whether anything else changed):
# create example
samp = data.table(id = c("7E39", "7G32", "5D99999"))[, second_var := c(1, 2, 3)]
fwrite(samp[, id := sprintf("=%s", shQuote(id))],
       "foo.csv", row.names = FALSE)
It's a kludge, and dang-it for Excel to force this (I've dealt with it before).
write.csv(data.frame(id = sprintf("=%s", shQuote(c("7E39", "7G32", "5D99999")))),
          "foo.csv", row.names = FALSE)
This is forcing Excel to consider that column a formula, and interpret it as such. You'll see that in Excel, it is a literal formula that assigns a static string.
This is obviously not portable and prone to all sorts of problems, but that is Excel's way in this regard.
(BTW: I used write.csv here, but frankly it doesn't matter which function you use, as long as it passes the string through.)
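One caveat: anything that reads the CSV outside Excel will now see the formula wrapper around each id. A small sketch, assuming you also control the reading side in R, that strips the wrapper back off (the quote character depends on what shQuote produced on your platform):

library(data.table)

dat <- fread("foo.csv")
# Strip the leading =" (or =') and the trailing quote added by the formula trick
dat[, id := gsub("^=[\"']|[\"']$", "", id)]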
Another option, but one that your consumers will need to do, not you.
If you export the file "as is", meaning the cell content is just 7E39, then an auto-import within Excel will always try to be smart about that cell's content. However, you can import the data manually.
Using Excel 2016 (32bit, on win10_64bit, if it matters):
1. Open Excel (first), with an (optionally empty) worksheet already open.
2. On the ribbon: Data > Get External Data > From Text.
3. Navigate to the appropriate file (CSV).
4. Select "Delimited" (file type), click Next, select "Comma" (and optionally deselect any others that may default to selected), then Next.
5. Click on the specific column(s) and set the "Default data format" to "Text" (this will need to be done for any/all columns where this is a problem). Multiple columns can be Shift-selected (for a range of columns), but not Ctrl-selected. Finish.
6. Choose the top-left cell to import/paste the data into (or a new worksheet).
7. Select Properties..., and deselect "Save query definition". Without this step, the data is treated as a query against an external data source, which may not be a problem but makes some things a little annoying. (For example, try to highlight all the data and delete it ... Excel really wants to make sure you know what you're doing there.)
This method provides a portable solution. It "punishes" the Excel users, but anybody/anything else will still be able to consume the files directly without change. The biggest disadvantage is that you won't know if somebody loads it incorrectly unless/until they get odd results when they try to use the data and some fields have been silently converted.

write.csv executes with no errors but is not creating the csv

I am attempting to write a simple CSV of some data, but no CSV is being created. There are no error messages when I run the code, though. This code was working fine on Friday, but was not working when I came back today. I have tried using different working directories and tried different file names, but every time I check the folder there is nothing there. Has anybody experienced this before and/or do you have any additional troubleshooting ideas?
write.csv(x = map.data, file = "map.data.csv", row.names = FALSE)
EDIT: I finally figured it out. It's because I was changing the WD inside a Markdown chunk, and I did not know that R automatically resets it after the chunk is run. Once I made the WD a global setting in the Markdown file, the CSV started showing up.
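For reference, a minimal version of that global setting, assuming the document is an R Markdown file rendered with knitr: set root.dir once in the setup chunk instead of calling setwd inside individual chunks (the path here is hypothetical).

# In the setup chunk of the .Rmd; every subsequent chunk then evaluates there
knitr::opts_knit$set(root.dir = "C:/Users/me/project")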
First, I would say saving a file as "map.data.csv" might be causing an issue; try "map_data.csv" instead.
Next, make sure the working directory is the desired location (getwd and setwd).
Finally, make sure map.data is a non-empty data frame.
Otherwise I'm not sure what could be causing the issue with the info provided.
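A compact way to run those checks in one go, with map.data standing in for the real object:

getwd()                                   # is this the folder you expect?
nrow(map.data); ncol(map.data)            # non-empty data frame?
write.csv(map.data, file = "map_data.csv", row.names = FALSE)
file.exists(file.path(getwd(), "map_data.csv"))   # did the file actually land here?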

R-Studio freezes when trying to load data

I was trying to load the dataset from this file https://github.com/WinVector/zmPDSwR/blob/master/Custdata/custdata.tsv
and RStudio freezes and crashes every time. How can I tell whether there is something particular about the data file, or whether RStudio simply cannot handle it? How would I be able to get this data into R?
I will provide you with a possible alternative.
I always get a strange error whenever I try to pull data directly from the web, so what I do instead is download the dataset, keep it in my project folder, and work with that.
As far as the dataset is concerned, it is the standard tsv format.
A code snippet for your reference:
mydata <- read.table("~path/dataset.tsv", sep = "\t", header = TRUE)
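If you want to script the download step as well, base R can fetch the file first. The raw URL below is the assumed raw-content form of the GitHub link in the question.

# Assumed raw URL derived from the GitHub page linked above
url <- "https://raw.githubusercontent.com/WinVector/zmPDSwR/master/Custdata/custdata.tsv"
download.file(url, destfile = "custdata.tsv", mode = "wb")
custdata <- read.table("custdata.tsv", sep = "\t", header = TRUE)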
