reading .csv into EBImage - r

I'm rather new to R and image processing. I have a series of images stored as CSV files that I would like to read into EBImage; I have used ImageJ in the past. I've been using the imageData(y) command, but I keep getting the error "object must be an array" when I try to read in the file.
I think the issue is that I've imported the CSV as a spatially indexed matrix (x, y) and I need to convert it to an array (x, y, value).
test=read.csv('Fe_Kα1.csv', sep=",", header=FALSE)
test1 <- as.matrix(test)
writeImage('test1')
Error in validImage(x) : object must be an array
Thank you
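A hedged sketch of the usual fix, for anyone landing here: read.csv() returns a data.frame, so convert it to a numeric matrix first, which EBImage can then wrap as an Image. The CSV below is a stand-in for the real file, and the EBImage calls (a Bioconductor package) are left commented as assumed usage:

```r
# Simulate the situation: a CSV holding a small grid of pixel intensities
f <- tempfile(fileext = ".csv")
write.table(matrix(runif(12), nrow = 3), f, sep = ",",
            row.names = FALSE, col.names = FALSE)

# read.csv() gives a data.frame; convert it to a plain numeric matrix
test  <- read.csv(f, header = FALSE)
test1 <- as.matrix(test)            # now a 3 x 4 numeric matrix

# Assumed EBImage usage (Bioconductor), shown for illustration:
# library(EBImage)
# img <- Image(test1)               # wraps the matrix as a greyscale Image
# writeImage(img, "test1.png")      # writeImage() takes the Image object, not a string
```

Note that writeImage() in the question was passed the character string 'test1' rather than an image object, which is why validImage() complained.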

Related

R: Writing data frame into excel with large number of rows

I have a data frame (panel data) in R with 194,498 rows and 7 columns. I want to write it to an Excel file (.xlsx) using res <- write.xlsx(df, output), but R hangs (it keeps showing the stop sign at the top left of the console) without making any change to the target file (output). It finally shows the following:
Error in .jcheck(silent = FALSE) :
Java Exception <no description because toString() failed>.jcall(row[[ir]], "Lorg/apache/poi/ss/usermodel/Cell;", "createCell", as.integer(colIndex[ic] - 1))<S4 object of class "jobjRef">
I have loaded the readxl and xlsx packages. Please suggest a fix. Thanks.
Install and load the package 'WriteXLS' and try writing out your R object using the function WriteXLS(). Make sure the name of your R object is passed in quotes, like "data" below.
# Store your data with 194498 rows and 7 columns in a data frame named 'data'
# Install package named WriteXLS
install.packages("WriteXLS")
# Loading package
library(WriteXLS)
# Writing out R object 'data' in an Excel file created namely data.xlsx
WriteXLS("data", ExcelFileName = "data.xlsx", row.names = FALSE, col.names = TRUE)
Hope this helped.
This does not answer your question, but might be a solution to your problem.
You could save the data as a CSV instead, like so:
write.csv(df , "df.csv")
Then open the CSV and save it as an Excel file.
I gave up on trying to import/export Excel files with R because of hassles like this.
In addition to Pete's answer: I wouldn't recommend write.csv because it can take minutes on a file this size. I used fwrite() (from the data.table package) and it did the same thing in about 1-2 seconds.
The post author asked about large files. I dealt with a table about 2.3 million rows long, and neither write.csv nor fwrite output could be viewed in full: Excel displays at most 1,048,576 rows, so the rest of the data appears cut off. Instead use write.table(Data, file = "Data.txt"), open it in Excel, split the single column by your delimiter (set it with the sep argument), and voilà!
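The write.table() route above can be sketched in a few lines of base R; the column names and the tab separator here are illustrative:

```r
# A small data frame stands in for the real panel data
Data <- data.frame(id = 1:5, value = c(2.5, 3.1, 4.0, 1.2, 0.7))

# Tab-separated text file; Excel's "Text to Columns" can later split on the tab
out <- tempfile(fileext = ".txt")
write.table(Data, file = out, sep = "\t", row.names = FALSE, quote = FALSE)

# Round-trip check that the file reads back intact
back <- read.table(out, header = TRUE, sep = "\t")
```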

calculate returns in r given imported data from excel

I'm new to R and I'm trying to get returns from a timeseries of S&P500 prices.
My original file is in csv format. I uploaded it using:
>sp500 <- read.csv2("sp500.csv")
and then, after loading the timeSeries package, tried to run:
>rend <- returns(sp500)
But I get:
>Error in hasTsp(x) : invalid time series parameters specified
It seems to me that R read my file not as an array of numbers but as an array of strings, so mathematical computations don't work.
Can anyone help me and suggest how to solve it?
Thank you very much!
Before calling returns(), you should first transform your data.frame into a timeSeries, for example (also check that the price column is numeric: read.csv2() assumes ';' separators and ',' decimal marks, which is a common reason columns arrive as character):
sp500Series <- as.timeSeries(sp500)
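As a quick base-R sanity check before reaching for timeSeries, coerce the prices to numeric and compute simple returns by hand. The Close column name below is an assumption about the CSV's layout:

```r
# Toy prices standing in for the S&P 500 column, read in as strings
# (as appears to have happened in the question)
sp500 <- data.frame(Close = c("100", "102", "101"), stringsAsFactors = FALSE)

# Coerce to numeric first -- otherwise both returns() and plain arithmetic fail
prices <- as.numeric(sp500$Close)

# Simple returns: (P_t - P_{t-1}) / P_{t-1}
rend <- diff(prices) / head(prices, -1)
```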

Save data.frame objects into .Rds files within a loop

I have data.frame objects with normalized names in my global environment, and I want to save them into .Rda files.
My first question: should I save them into one big .Rda file, or create one file per data frame? (Each df has 14 columns and ~260,000 rows.)
Assuming I save them into different files, I was thinking of a function like this (all my data.frame names begin with "errDatas"):
sapply(ls(pattern = "errDatas"), function(x) save(as.name(x), file = paste0(x, ".Rda")))
But I have this error :
Error in save(as.name(x), file = paste0(x, ".Rda")) :
object 'as.name(x)' not found
It seems save() can't evaluate as.name(x); it takes the expression literally. I also tried eval(parse(text = x)), but got the same error.
Do you have an idea how I can manage to save my data frames within a loop? Thanks.
And a bonus question, to know whether what I'm trying to do is useful and legitimate:
These data frames come from CSV files (one data frame per CSV file, imported with read.csv). Each day I get one new CSV file, and I want to run some analysis across all of them. I realized that reading a CSV is much slower than saving to and loading from an .Rda file. So instead of re-reading every CSV each time I run my program, I want to read each CSV only once, save it as an .Rda file, and load that from then on. Is this a good idea? Are there best practices for this in R?
Use the list= parameter of the save() function. It allows you to specify the name of the object as a character vector rather than passing the object itself. For example:
lapply(ls(pattern = "errDatas"), function(x) {
save(list=x, file = paste0(x, ".Rda"))
})
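On the bonus question: caching each CSV as a serialized file is a common and legitimate pattern. For single objects, saveRDS()/readRDS() are often preferred over save()/load(), because the reader chooses the variable name rather than inheriting whatever name was saved. A minimal sketch, with illustrative file names:

```r
# One of the per-day data frames (name and contents are made up)
errDatas1 <- data.frame(x = 1:3, y = letters[1:3], stringsAsFactors = FALSE)

# One .Rds file per data frame; readRDS() returns the object directly,
# so no name lookup is needed on the way back in
f <- tempfile(fileext = ".Rds")
saveRDS(errDatas1, f)
restored <- readRDS(f)
```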

How to import multiple matlab files into R (Using package R.Matlab)

Thank you in advance for your help. I am using R to analyse some data that is initially created in Matlab. I am using the package R.matlab, and it is fantastic for one file, but I am struggling to import multiple files.
The working script for a single file is as follows...
install.packages("R.matlab")
library(R.matlab)
x<-("folder_of_files")
path <- system.file("/home/ashley/Desktop/Save/2D Stream", package="R.matlab")
pathname <- file.path(x, "Test0000.mat")
data1 <- readMat(pathname)
And this works fantastically. My files are named 'Name_0000.mat', where the name part is constant between files and the 4 digits increase, though not necessarily by 1.
My attempt to load multiple files at once was along these lines...
for (i in 1:length(temp))
data1<-list()
{data1[[i]] <- readMat((get(paste(temp[i]))))}
I also tried multiple variations that included or excluded path and pathname from the loop, all of which give me the same error:
Error in get(paste(temp[i])) :
object 'Test0825.mat' not found
where 0825 is the number of my final file; whatever the loop length, the error always names the last file.
I think the issue is that when it pastes the name, get() looks for an object of that name, which does not yet exist. I need the pasted text to stay a character string (in quotation marks), but I don't know how to do that.
Sorry this was such a long post... many thanks.
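A hedged sketch of the usual pattern: build the vector of file names with list.files() and pass each path (a string) straight to readMat() inside lapply(), which sidesteps get() entirely. The folder and file names below are stand-ins, and the readMat() step is left commented since R.matlab may not be installed:

```r
# Create a folder with dummy files to demonstrate the name-matching step
dir <- tempfile()
dir.create(dir)
file.create(file.path(dir, c("Test0000.mat", "Test0025.mat", "Test0825.mat")))

# Full paths to every file named Test + 4 digits + .mat
paths <- list.files(dir, pattern = "^Test\\d{4}\\.mat$", full.names = TRUE)

# With R.matlab installed, each file is then read by path -- no get() needed:
# library(R.matlab)
# data1 <- lapply(paths, readMat)
# names(data1) <- basename(paths)
```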

Error exporting data.frame as csv

I am exporting a data.frame to .csv with:
write.csv(df, "name.csv")
LogitTV.Rda has 3000 rows and 4 columns.
My code throws an error when identifying the data.frame.
load("~/Home Automation/LogitTV.Rda")
write.csv(LogitTV.Rda, "LogitTV.csv")
Error in is.data.frame(x) : object 'LogitTV.Rda' not found
Checked the following:
1) Cleaned the console of previous history
2) Working Directory set as ~/Home Automation/
Anything else to check for preventing the error?
Thanks
LogitTV.Rda is, confusingly, not the name of the object that gets loaded.
Try:
loadedObj <- load("~/Home Automation/LogitTV.Rda")
write.csv(get(loadedObj), file="LogitTV.csv")
This assumes that the .Rda file contains only a single R object, and that it is a data frame or matrix.
It would be nice if write.csv had a way to accept the name of an object instead of the object itself (so get() was unnecessary), but I don't know of one.
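The load()/get() pattern from the answer can be sketched end to end with a throwaway .Rda file (object and file names here are illustrative):

```r
# Build an .Rda whose internal object name differs from the file name
LogitTV <- data.frame(a = 1:3, b = c(2.5, 3.5, 4.5))
rda <- tempfile(fileext = ".Rda")
save(LogitTV, file = rda)
rm(LogitTV)

# load() restores the object AND returns its name as a character vector;
# get() then fetches the object by that name
loadedObj <- load(rda)
csv <- tempfile(fileext = ".csv")
write.csv(get(loadedObj), file = csv, row.names = FALSE)
```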
