How to use base R to write a table to .xlsx? - r

Apologies but I'm new to R. I'm able to write a data table to a .csv and a .txt file but I'd like to write it to an .xlsx file. Is this possible using base R?
Thanks

The writexl package is quite nice. No 'Java' or 'Excel' required.
library(writexl)
write_xlsx(df, 'filename.xlsx')
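If I recall the writexl API correctly, write_xlsx() also accepts a named list of data frames and writes one sheet per list element (df1 and df2 below are placeholders for your own data frames):
library(writexl)
# each list element becomes a sheet named after the element
write_xlsx(list(Sheet1 = df1, Sheet2 = df2), 'multi_sheet.xlsx')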

For this you need to have the xlsx package installed. Then you can try:
library(xlsx)

# Write each object passed via ... to its own sheet of `file`,
# using the argument names as the sheet names.
xlsx.writeMultipleData <- function(file, ...) {
  require(xlsx, quietly = TRUE)
  objects <- list(...)
  fargs <- as.list(match.call(expand.dots = TRUE))
  objnames <- as.character(fargs)[-c(1, 2)]
  nobjects <- length(objects)
  for (i in 1:nobjects) {
    if (i == 1)
      write.xlsx(objects[[i]], file, sheetName = objnames[i])
    else
      write.xlsx(objects[[i]], file, sheetName = objnames[i], append = TRUE)
  }
}
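Hypothetical usage, with two built-in data frames standing in for your own:
# writes mtcars to a sheet named "mtcars" and iris to a sheet named "iris"
xlsx.writeMultipleData("workbook.xlsx", mtcars, iris)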

Related

Reading files from one format and saving as csv format in R

I am new to R. I would like to read LAS files, perform some operations, and save the results as .csv files with the following piece of code. However, the output files are saved in the input file format (.las). I really appreciate any help.
library(lidR)

files <- list.files(path = "Input_path", pattern = "*.las", full.names = TRUE, recursive = FALSE)

O <- function(x) {
  las <- readLAS(x, select = "xyz", filter = "keep_first -drop_z_below 0")
  data <- as.spatial(las)
  z <- data$Z
  q <- quantile(z, 0.99)
  data1 <- subset(data, data$Z <= q)
  return(data1)
}

for (f in files) {
  print(f)
  data2 <- O(f)
  write.csv(data2, file = paste0("PATH/", basename(f)))
}
Try this; the output file name should end in .csv:
write.csv(data2, file = paste0("PATH/", unlist(strsplit(basename(f), "[.]"))[1], ".csv"))
EDIT
If the file name has a "." in the middle of the name, you can use this method:
for (f in files) {
  print(f)
  data2 <- O(f)
  s <- unlist(strsplit(basename(f), "[.]"))
  # rejoin everything except the final ".las" extension
  filename <- paste0(paste0(s[1:(length(s) - 1)], collapse = "."), ".csv")
  write.csv(data2, file = paste0("PATH/", filename))
}
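A simpler route, not mentioned above, is base R's tools::file_path_sans_ext(), which strips only the final extension no matter how many dots the name contains:
# drop the trailing ".las" and append ".csv"; handles names like "area.1.las"
filename <- paste0(tools::file_path_sans_ext(basename(f)), ".csv")
write.csv(data2, file = file.path("PATH", filename))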

Create R data with a dynamic variable name from function for package?

I am working on a function which is part of a package.
This package contains a template for a new package, plus a function which creates R data for the new package; the data object has to get a dynamic name that is passed to this function.
At the moment I am doing the following:
makedata <- function(schemeName, data) {
  rdsFile <- paste0(schemeName, ".rds")
  varName <- paste0(schemeName)
  saveRDS(
    data,
    file = file.path(".", "data", rdsFile)
  )
  cat(
    paste0(varName, " <- readRDS(\"./", rdsFile, "\")"),
    file = file.path(".", "data", paste0(varName, ".R"))
  )
}
makedata(schemeName = "test", data = letters)
which results in two files in the data directory:
a file test.rds containing letters but which is not loaded by R when the package is loaded (rds is not supported)
a file test.R which has the code test <- readRDS("./test.rds") and which causes, when the package is loaded, the data in test.rds to be loaded into the variable test, which then contains letters.
Now CRAN does not like rds files in the data directory.
Is there another way that I can use the standard formats (preferably RData) to achieve this?
You can try something like this:
makedata <- function(schemeName, data) {
  rdataFile <- paste0(schemeName, ".rda")
  ## Assign data to the name saved in schemeName
  assign(x = schemeName, value = data)
  ## Save as RData file
  save(list = schemeName, file = file.path(".", "data", rdataFile))
}
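Hypothetical usage, run from the package root with a data/ directory already in place:
# creates ./data/test.rda containing a single object named `test`
makedata(schemeName = "test", data = letters)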
A possible alternative with eval/parse, as discussed in the comments:
makedata <- function(schemeName, data) {
  rdaFile <- paste0(schemeName, ".rda")
  fileLocation <- file.path(".", "data", rdaFile)
  varName <- paste0(schemeName)
  assign(varName, data)
  eval(parse(text = sprintf("save(%s, file = '%s')", varName, fileLocation)))
  # unlike readRDS(), load() restores the object under its saved name,
  # so the generated data/<name>.R file only needs to call load()
  cat(sprintf("load(\"./%s\")", rdaFile),
      file = file.path(".", "data", paste0(varName, ".R")))
}
Off topic: since you're developing packages, one convenient option might be system.file() instead of file.path(); system.file('data/test.R', package = 'yourPackage') looks inside your package directory wherever it is installed. I haven't tested your previous solution; it might be working fine too.
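For illustration (the package name is a placeholder):
# returns the installed path of data/test.R, or "" if the file does not exist
system.file("data", "test.R", package = "yourPackage")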

Dynamic output file name in R

I am so close to getting my code to work, but cannot seem to figure out how to get a dynamic file name. Here is what I've got:
require(ncdf)
require(raster)
require(rgdal)
## For multiple files, use a for loop
## Input directory
dir.nc <- 'inputdirectoy'
files.nc <- list.files(dir.nc, full.names = T, recursive = T)
## Output directory
dir.output <- 'outputdirectory'
## For simplicity, I use "i" as the file name, but would like to have a dynamic one
for (i in 1:length(files.nc)) {
  r.nc <- raster(files.nc[i], varname = "precipitation")
  writeRaster(r.nc, paste(dir.output, i, '.tiff', sep = ''), format = 'GTiff', prj = TRUE, overwrite = TRUE)
}
## END
I appreciate any help. So close!!
You can do this in different ways, but I think it is generally easiest to first create all the output filenames (and check if they are correct) and then use these in the loop.
So something like this:
library(raster)
infiles <- list.files('inputpath', full.names=TRUE)
ff <- extension(basename(infiles), '.tif')
outpath <- 'outputpath'
outfiles <- file.path(outpath, ff)
To assure that you are writing to an existing folder, you can create it first.
dir.create(outpath, showWarnings=FALSE, recursive=TRUE)
And then loop over the files
for (i in 1:length(infiles)) {
  r <- raster(infiles[i])
  writeRaster(r, outfiles[i], overwrite = TRUE)
}
You might also use something along these lines
outfiles <- gsub('in', 'out', infiles)
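For example, assuming the input folder is literally named "inputpath" (note this replaces every "in" in the path, so it only works if "in" appears nowhere else):
gsub('in', 'out', 'inputpath/precip_2001.nc')
# [1] "outputpath/precip_2001.nc"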
Here is the code that finally worked:
# Imports
library(raster)
#Set source file
infiles <- list.files('infilepath', full.names=TRUE)
#create dynamic file names and choose outfiles to view list
ff <- extension(basename(infiles), '.tif')
outpath <- 'outfilepath'
outfiles <- file.path(outpath, ff)
#run da loop
for (i in 1:length(infiles)) {
  r <- raster(infiles[i])
  writeRaster(r, outfiles[i], format = 'GTiff', overwrite = TRUE)
}
## END

Passing list of dataframes to R Function

I found a neat function for exporting multiple data frames to an Excel file using the xlsx package:
save.xlsx <- function(file, ...) {
  require(xlsx, quietly = TRUE)
  objects <- list(...)
  fargs <- as.list(match.call(expand.dots = TRUE))
  objnames <- as.character(fargs)[-c(1, 2)]
  nobjects <- length(objects)
  for (i in 1:nobjects) {
    if (i == 1)
      write.xlsx(objects[[i]], file, sheetName = objnames[i])
    else
      write.xlsx(objects[[i]], file, sheetName = objnames[i], append = TRUE)
  }
  print(paste("Workbook", file, "has", nobjects, "worksheets."))
}
It works by putting in the file name and then passing it multiple dataframes like so:
save.xlsx(filename, df1, df2, df3)
but I wanted to pass it a list of dataframes instead of passing them individually, like so:
dataframes <- list(report1, report2, report3)
save.xlsx(filename, dataframes)
but it errors out because I am passing all of the dataframes at once. I am trying to figure out how to unstack them into the function but I haven't been successful yet.
Any help would be appreciated.
Thanks!
David
So I have been trying Richard's suggestion and edited it to the following:
save.xlsx <- function(file, dataframes) {
  require(xlsx, quietly = TRUE)
  objects <- dataframes
  fargs <- as.list(match.call(expand.dots = TRUE))
  objnames <- as.character(fargs)[-c(1, 2)]
  nobjects <- length(objects)
  for (i in 1:nobjects) {
    if (i == 1)
      write.xlsx(objects[[i]], file, sheetName = objnames[i])
    else
      write.xlsx(objects[[i]], file, sheetName = objnames[i], append = TRUE)
  }
  print(paste("Workbook", file, "has", nobjects, "worksheets."))
}
but I get the following error:
Error in .jcall(wb, "Lorg/apache/poi/ss/usermodel/Sheet;", "createSheet", :
java.lang.IllegalArgumentException: sheetName must not be null
It isn't getting the sheet names like it had before, but I am not understanding why.
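The thread ends here, but the error itself hints at the cause: when the whole list is passed as one dataframes argument, match.call() yields only the single name "dataframes", so objnames has length one and sheetName is NA for every sheet after the first. A minimal sketch of one workaround, not from the original thread, is to pass a named list and take the sheet names from names() instead of match.call():
library(xlsx)

# sketch: `dataframes` is a *named* list; its names become the sheet names
save.xlsx <- function(file, dataframes) {
  require(xlsx, quietly = TRUE)
  objnames <- names(dataframes)
  for (i in seq_along(dataframes)) {
    write.xlsx(dataframes[[i]], file, sheetName = objnames[i], append = (i > 1))
  }
  print(paste("Workbook", file, "has", length(dataframes), "worksheets."))
}

# hypothetical usage, reusing the report data frames from the question
dataframes <- list(report1 = report1, report2 = report2, report3 = report3)
save.xlsx(filename, dataframes)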

Download and read shapefile function in R

I would like to expand on this function. As of now, the function downloads and unzips the shape file from the web. I would like to implement 'rgdal' to read the file into R.
library(rgdal)
dlshape <- function(location) {
  temp <- tempfile()
  download.file(location, temp)
  unzip(temp)
}
I found the following code on SO, but I was unsuccessful in adapting it. It appears that the function still looks at the first file unzipped rather than grepping for a file ending with the .shp extension.
read.csv.zip <- function(zipfile, ...) {
  # Create a name for the dir where we'll unzip
  zipdir <- tempfile()
  # Create the dir using that name
  dir.create(zipdir)
  # Unzip the file into the dir
  unzip(zipfile, exdir = zipdir)
  # Get a list of csv files in the dir
  files <- list.files(zipdir)
  files <- files[grep("\\.csv$", files)]
  # Create a list of the imported csv files
  csv.data <- sapply(files, function(f) {
    fp <- file.path(zipdir, f)
    return(read.csv(fp, ...))
  })
  return(csv.data)
}
dlshape <- function(shploc, shpfile) {
  temp <- tempfile()
  download.file(shploc, temp)
  unzip(temp, exdir = temp)
  files <- list.files(temp)
  files <- files[grep("\\.shp$", files)]
  shp.data <- sapply(files, function(f) {
    fp <- file.path(zipdir, f)
    return(ReadOGR(fp, ...))
  })
  return(shp.data)
}
Could someone please help me figure this out? I would greatly appreciate it.
EDIT: Included my adaptation for clarification on the "adapting" part.
Try this.
dlshape <- function(shploc, shpfile) {
  temp <- tempfile()
  download.file(shploc, temp)
  unzip(temp)
  shp.data <- sapply(".", function(f) {
    fp <- file.path(temp, f)
    return(readOGR(".", shpfile))
  })
}
x = dlshape(shploc="http://www.location.com/file_name.zip", "file_name")
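If you do want the behaviour asked about above (unzip into a scratch directory and grep for the .shp files rather than relying on the working directory), here is a minimal sketch; the function name and the layer handling are my own assumptions, not from the thread:
library(rgdal)

# sketch: download a zipped shapefile, unzip to a scratch dir, read every .shp found
dlshape2 <- function(shploc) {
  zipfile <- tempfile(fileext = ".zip")
  download.file(shploc, zipfile, mode = "wb")  # "wb" so the zip survives on Windows
  zipdir <- tempfile()
  dir.create(zipdir)
  unzip(zipfile, exdir = zipdir)
  shps <- list.files(zipdir, pattern = "\\.shp$", recursive = TRUE, full.names = TRUE)
  # readOGR takes the folder as dsn and the file name without extension as layer
  sapply(shps, function(f) {
    readOGR(dsn = dirname(f), layer = tools::file_path_sans_ext(basename(f)))
  })
}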
