R-project filepath from concatenation

I'm working through an R tutorial. Part of the function I'm writing takes an argument and uses it to define a directory in which to find data; it must then load that data.
As it stands the following works:
getmonitor <- function(id, directory) {
  csvfile <- function(id) {
    if (id < 10) {
      paste0(0, 0, id, ".csv")
    } else if (id < 100) {
      paste0(0, id, ".csv")
    } else paste0(id, ".csv")
  }
  foo <- read.csv(csvfile(id))
}
Fine. But now I have to use the "directory" parameter to define the directory the csv file must be read from. I've tried various things here to no avail.
Currently the code only works if the data are in the working directory. I need to say "go to the directory called (directory)" and then read.csv from there.
The directory with all of the data files is called "specdata", so the argument passed for directory is "specdata".
I tried the following:
getmonitor <- function(id, directory) {
  csvfile <- function(id) {
    if (id < 10) {
      paste0(0, 0, id, ".csv")
    } else if (id < 100) {
      paste0(0, id, ".csv")
    } else paste0(id, ".csv")
  }
  filepath <- append(directory, "/", csvfile(id))
  foo <- read.csv(filepath)
}
But then I received the error message "Error in !after : invalid argument type".
I have tried a few other things, but pasting all of that code here would probably add more mess than help.
What would be a logical way to do this? Am I on the right track with append? What else should I use if not? I need to take the parameter "directory" and then load data from that directory.

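append() is the wrong tool here: it inserts elements into a vector, and its third argument, after, must be an integer index, which is why passing a string produces "Error in !after : invalid argument type". For assembling paths, file.path() is the standard tool, and sprintf("%03d", id) replaces the manual zero-padding: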
getmonitor <- function(id, directory = getwd(), ...) {
  csvfile <- sprintf("%03d.csv", id)
  filepath <- file.path(directory, csvfile)
  foo <- read.csv(filepath, ...)
  foo
}
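A quick check, assuming a folder named "specdata" in the working directory containing files like 001.csv:
dat <- getmonitor(1, "specdata") # reads "specdata/001.csv"
head(dat)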

Related

Loading .rds datafile within a function to the environment

I'm new to functions and I would like to load my data with a function.
The function appears to be correct, but the data does not show up as a data frame in the environment, while it does when the same code runs outside the function.
This is my script:
read_testdata <- function(file) {
  Dataset_test <- read_rds(here("foldername", file))
}
read_testdata("filename")
Can someone spot my error?
After some thinking I spotted my problem: the assignment inside the function only creates a variable local to the function's environment, so it disappears when the function returns. The function should return the data, and the assignment belongs at the call site. The correct code is:
library(readr) # read_rds() comes from readr
library(here)

read_testdata <- function(file) {
  read_rds(here("foldername", file))
}
Dataset_test <- read_testdata("filename.rds")
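(If you really wanted the function itself to create the object, something like assign("Dataset_test", read_rds(here("foldername", file)), envir = .GlobalEnv) would work, but returning the value and assigning at the call site, as above, is the idiomatic fix.)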

How can I use error handling when reading xlsx files in R?

I am trying to read 23 Excel files, store each in a list, and then rbind them into one csv. Some of these files are csv and some are xlsx. However, I got the following message:
Error: Can't establish that the input is either xls or xlsx.
So I want to identify which files are causing the error and then add them manually.
My function is the following:
make_df <- function(filename) {
  library(readxl)
  library(stringr) # str_sub() comes from stringr
  library(foreign)
  if (str_sub(filename, -3, -1) == "csv") {
    df <- read.csv(filename, fileEncoding = "latin1")
  } else {
    df <- read_excel(filename)
  }
  return(df)
}
filenames_vector <- list.files() # supply the data directory here
datalist <- list()
for (i in 1:23) {
  datalist[[i]] <- make_df(filenames_vector[i])
}
mega_data <- do.call(rbind, datalist)
How can I add something in make_df to print out the names of the files that are causing the error message? Also, is there another workaround when the error message says it can't establish whether the input is xls or xlsx?
This can be done with a tryCatch block. Without example data it's a little hard to recreate. I'm not sure what you mean in your second question.
Try the code below to catch errors and print out the filename if there's an error, otherwise return the df object.
make_df <- function(filename) {
  library(readxl)
  library(stringr) # str_sub() comes from stringr
  df <- tryCatch(
    { # try block
      if (str_sub(filename, -3, -1) == "csv") {
        read.csv(filename, fileEncoding = "latin1")
      } else {
        read_excel(filename)
      }
    },
    error = function(cond) { filename } # grab the filename if there was an error
  )
  if (is.character(df)) {
    print(df) # the read failed; report which file
  } else {
    return(df)
  }
}
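A minimal usage sketch, assuming make_df as revised above: since the function returns a data frame on success and the offending filename (a character string) on failure, the results can be split afterwards.
datalist <- lapply(filenames_vector, make_df)
failed <- Filter(is.character, datalist) # filenames that errored
good <- Filter(Negate(is.character), datalist) # data frames read successfully
mega_data <- do.call(rbind, good)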

R/Shiny: Download multiple files (zip) from a folder on the server

I would like to create a zip archive (containing several xlsx files) and save it locally. The files are stored in a folder on the server side.
The user selects the files to zip using a checkboxInput.
Here is the code for the checkboxes:
get.files <- reactive({
  list.files("output_file/")
})

obsList <- list()

output$links_list <- renderUI({
  lapply(as.list(1:length(get.files())), function(i) {
    btName <- get.files()[i]
    # record the name only if it isn't already in obsList
    if (is.null(obsList[[btName]])) {
      obsList[[btName]] <<- btName
    }
    fluidRow(checkboxInput(btName, get.files()[i]))
  })
})
The checkboxes are created dynamically by reading the contents of the folder ("output_file/"). Next to each checkbox is the name of the file.
The function for the download is:
output$downloadzip <- downloadHandler(
  filename = function() {
    paste0("Extract.zip")
  },
  content = function(file) {
    files <- NULL
    for (i in 1:length(obsList)) {
      if (input[[obsList[[i]]]])
        files <- c(paste("output_file/", obsList[[i]], sep = ""), files)
    }
    # create the zip file
    zip(file, files)
  },
  contentType = "application/zip"
)
The content function builds a vector of filenames (files) using only the names of the files that have been checked.
I have also created a function that lets me verify that only the right files are chosen:
tempText <- eventReactive({ input$TempTest }, {
  l <- c()
  for (i in 1:length(obsList)) {
    if (input[[obsList[[i]]]])
      l <- c(l, paste("output_file/", obsList[[i]], sep = ""))
  }
  return(paste(l))
}, ignoreInit = TRUE)

output$Temp <- renderPrint({ tempText() })
This function correctly renders the strings with the names of the files.
The error that I get when I try to download the zip file is:
sh: : command not found
Can someone help me to fix this?
I have fixed the problem.
The issue is with the zip function, which for some reason doesn't work properly on my server.
The solution is to call the system2 function directly (zip() uses it internally).
Instead of
zip(file,files)
I have to use:
system2("zip", args = paste(file, files, sep = " "))
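As a side note, one way to avoid shelling out entirely (assuming the zip package can be installed on the server) is zip::zipr(), which ships its own zip implementation and therefore doesn't depend on a zip binary being on the PATH:
zip::zipr(zipfile = file, files = files) # same file/files as in the downloadHandler above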

Clean way to wrap-up and handle RMySQL connections?

I'm fairly new to R, so forgive me if this is an amateur question. I still don't get parts of how the R language works, and I haven't used closures enough to build intuition on how to approach this problem.
I want to wrap up opening and closing a database connection in my R project in a clean way. I have a variety of scripts set aside that all use a common DB connection configuration file (I don't put it in my repo, it's a local file only), all of which need to connect to the same MySQL database.
The end goal is to do something like:
query <- db_open()
out <- query("select * from example limit 10")
db_close()
This is what I wrote so far (all my scripts load these functions from another .R file):
db_open <- function() {
  db_close()
  db_conn <<- dbConnect(MySQL(), user = db_user, password = db_pass, host = db_host)
  query <- function(...) { dbGetQuery(db_conn, ...) }
  return(query)
}

db_close <- function() {
  result <- tryCatch({
    dbDisconnect(db_conn)
  }, warning = function(w) {
    # ignore
  }, error = function(e) {
    return(FALSE)
  })
  return(result)
}
I'm probably thinking of this in an OOP way when I shouldn't be, but sticking db_conn in the global environment feels unnecessary or even wrong.
Is this a reasonable way to accomplish what I want? Is there a better way that I'm missing here?
Any advice is appreciated.
You basically had it; you just need to move the query function out into its own function. Regarding keeping db_conn, there really is no reason not to have it in the global environment.
db_open <- function() {
  db_close()
  db_conn <<- dbConnect(MySQL(), user = 'root', password = 'Use14Characters!', dbname = 'msdb_complex', host = 'localhost')
}

db_close <- function() {
  result <- tryCatch({
    dbDisconnect(db_conn)
  }, warning = function(w) {
    # ignore
  }, error = function(e) {
    return(FALSE)
  })
  return(result)
}

query <- function(x, num = -1) {
  q <- dbSendQuery(db_conn, x)
  fetch(q, num)
}
Then you should be able to do something like:
db_open()
results <- query("SELECT * FROM msenrollmentlog", 10)
db_close()
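If you do want to keep db_conn out of the global environment, a closure-based sketch (using the same DBI/RMySQL calls; db_user, db_pass, and db_host come from your local config file) would look like this. The connection lives only inside make_db()'s environment, and the returned functions share it:
make_db <- function(...) {
  conn <- dbConnect(MySQL(), ...)
  list(
    query = function(sql) dbGetQuery(conn, sql),
    close = function() dbDisconnect(conn)
  )
}

db <- make_db(user = db_user, password = db_pass, host = db_host)
out <- db$query("select * from example limit 10")
db$close()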

How to pass object in nested functions?

I'm trying to override save() in R so that it creates any missing directories before saving an object. I'm having trouble passing an object through one function to another using the ellipsis method.
My example:
save <- function(..., file) { # overridden save()
  target.dir <- dirname(file) # extract the target directory
  if (!file.exists(target.dir)) {
    # create the target directory if it doesn't exist
    dir.create(target.dir, showWarnings = TRUE, recursive = TRUE)
  }
  base::save(..., file = file.path(target.dir, basename(file)))
}

fun1 <- function(obj) {
  obj1 <- obj + 1
  save(obj1, file = "~/test/obj.RData")
}

fun1(obj = 1)
The code above results in this error:
Error in base::save(..., file = file.path(target.dir, basename(file))) :
object ‘obj1’ not found
I realize that the problem is that the object 'obj1' doesn't exist inside my custom save() function, but I haven't yet figured out how to pass it from fun1 to base::save.
I have tried:
base::save(parent.frame()$...,file=file.path(target.dir,basename(file)))
and:
base::save(list=list(...),file=file.path(target.dir,basename(file)))
with no success.
Any suggestions?
You need to point base::save at the parent frame's environment:
save <- function(..., file) { # overridden save()
  target.dir <- dirname(file) # extract the target directory
  if (!file.exists(target.dir)) {
    # create the target directory if it doesn't exist
    dir.create(target.dir, showWarnings = TRUE, recursive = TRUE)
  }
  base::save(..., file = file.path(target.dir, basename(file)), envir = parent.frame())
}
Note the envir parameter added to the base::save call. base::save records only the names of the objects passed through ..., then looks those names up in envir; pointing envir at the calling frame lets it find obj1 where it actually lives.
fun1 <- function(obj) {
  obj1 <- obj + 1
  save(obj1, file = "~/test/obj.RData")
}
In addition, use '=' to specify parameter names:
fun1(obj = 1)
