I have a Shiny app that reads in a GPX track file and buffers it. I then want the user to be able to download that shapefile to a destination of their choice. I have been trying to use the downloadHandler function, but so far I have had no joy.
The shapefile object I have created is called trk_buff.
Within R I can just use:
my_dsn<-"C:\\Documents\\TrackFiles"
writeOGR(obj=trk_buff, dsn=my_dsn, layer="BufferedTracks", driver="ESRI Shapefile")
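For context, the ESRI Shapefile driver writes several component files alongside the .shp, which is why the download approaches below end up zipping a folder; a quick way to see them (the file names shown are assumptions based on the layer name):
# list the component files writeOGR() created for the "BufferedTracks" layer
list.files(my_dsn, pattern = "^BufferedTracks\\.")
# e.g. "BufferedTracks.dbf" "BufferedTracks.prj" "BufferedTracks.shp" "BufferedTracks.shx"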
I have tried to use the downloadHandler thus:
output$downloadData<-downloadHandler(
filename=function(file){trk_buff},
content=function(file){
writeOGR(obj=trk_buff, dsn=file, layer="BufferedTracks", driver="ESRI Shapefile")
})
But I am obviously doing something wrong as nothing happens... :|
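For reference, downloadHandler() expects filename to be a zero-argument function returning a character file name, and content to be a function that writes to the temporary path Shiny passes in; a minimal skeleton of that contract (the .zip name is just an example):
output$downloadData <- downloadHandler(
  filename = function() "BufferedTracks.zip",  # must return a string, not the sp object
  content = function(file) {
    # write (or copy) the zipped shapefile to `file` here
  }
)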
EDIT TO ADD
I can get the behaviour I want if I use a combination of an actionButton and a textInput box.
But that is a little clumsy and involves the user explicitly typing the file path rather than browsing for it, which will probably lead to errors:
e.g.
in the ui.R I have:
textInput("filepath","Filepath to download data"),
actionButton("act1","Download shapefile")
In the server.R I have:
action_btn_code <- eventReactive(input$act1, {
file_path<-input$filepath
writeOGR(obj=trk_buff, dsn=paste(file_path,"/Tracks",sep=""), layer="BufferedTracks",
driver="ESRI Shapefile", overwrite_layer=TRUE)
})
The following works for me. The idea is that you have to zip up all of the shapefile's component files, because downloadHandler can only download a single file.
output$download_shp <- downloadHandler(
filename = "shapefile.zip",
content = function(file) {
data = trk_buff() # I assume this is a reactive object
# create a temp folder for shp files
temp_shp <- tempdir()
# write shp files
writeOGR(data, temp_shp, "trk_buff", "ESRI Shapefile",
overwrite_layer = TRUE)
# zip all the shp files
zip_file <- file.path(temp_shp, "trk_buff_shp.zip")
shp_files <- list.files(temp_shp,
"trk_buff",
full.names = TRUE)
# the following zip method works for me on Linux, but substitute whatever method works on your OS
zip_command <- paste("zip -j",
zip_file,
paste(shp_files, collapse = " "))
system(zip_command)
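# A cross-platform alternative to the shell call above, assuming the zip
# package is installed: zip::zipr(zipfile = zip_file, files = shp_files)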
# copy the zip file to the file argument
file.copy(zip_file, file)
# remove all the files created
file.remove(zip_file, shp_files)
}
)
A cross-platform alternative using sf and the zip package:
library(sf)
library(zip)
output$download_shp <- downloadHandler(
filename = function() {
"Data_shpExport.zip"
},
content = function(file) {
withProgress(message = "Exporting Data", {
incProgress(0.5)
tmp.path <- dirname(file)
name.base <- file.path(tmp.path, "BufferedTracks")
name.glob <- paste0(name.base, ".*")
name.shp <- paste0(name.base, ".shp")
name.zip <- paste0(name.base, ".zip")
if (length(Sys.glob(name.glob)) > 0) file.remove(Sys.glob(name.glob))
sf::st_write(trk_buff, dsn = name.shp, ## layer = "shpExport",
driver = "ESRI Shapefile", quiet = TRUE)
zip::zipr(zipfile = name.zip, files = Sys.glob(name.glob))
req(file.copy(name.zip, file))
if (length(Sys.glob(name.glob)) > 0) file.remove(Sys.glob(name.glob))
incProgress(0.5)
})
}
)
There are a number of different Q&As regarding this topic on SO, but none that I have been able to find fits my use case. I am also very surprised that RStudio / the Shiny developers themselves have not come out with documentation on how to do this. Regardless, take this example application:
library(shiny)
library(glue)
library(tidyverse)
# Define UI for application
ui <- fluidPage(
# Application title
titlePanel("Test Multi-File Download"),
p("I hope this works!"),
downloadButton(
outputId = "download_btn",
label = "Download",
icon = icon("file-download")
)
)
# Define server logic
server <- function(input, output) {
#datasets stored in reactiveValues list
to_download <- reactiveValues(dataset1 = iris, dataset2 = airquality, dataset3 = mtcars, dataset4 = NULL)
blahblah <- iris
output$download_btn <- downloadHandler(
filename = function(){
paste("my_data_", Sys.Date(), ".csv", sep = "")
},
content = function(file){
#works
#readr::write_csv(blahblah, file)
#Attempt 1
# #create some temp directory
# temp_directory <- tempdir()
# browser()
# reactiveValuesToList(to_download) %>%
# #x is data, y is name
# imap(function(x,y){
# #browser()
# #check if data is not null
# if(!is.null(x)){
# #create file name based on name of dataset
# file_name <- glue("{y}_data.csv")
# #write file to temp directory
# readr::write_csv(x, file_name)
# }
# })
# zip::zip(
# zipfile = file,
# files = ls(temp_directory),
# root = temp_directory
# )
}
)
}
# Run the application
shinyApp(ui = ui, server = server)
I have some datasets that are stored in a reactiveValues list, and I would like the user to be able to download them all. Ideally, I'd like for them just to be able to download multiple files all at once, rather than having to zip them up, and then download a .zip file. Another option I would be okay with is to add each dataset to an Excel sheet, then download the multi-sheet Excel file.
My general thought process (on the former) is as follows:
Download button gets pressed
create some temporary directory
write the (non-NULL) datasets contained in the to_download reactiveValues list to this directory
zip the temp directory and download
I feel like I am very close; however, I have not been able to successfully get this to work yet. Any ideas?
Edit 1: I am aware of the proposed answer here, but would like to avoid using setwd() because I believe it is bad practice to mess with working directories from within a Shiny application.
A few things edited and it's working:
using dir() instead of ls() inside the zip::zip() call to list the contents of the temp directory (ls() lists objects in the R environment rather than directory contents)
as a further suggestion: making a new, unique folder inside tempdir() to ensure only the relevant files are added.
library(shiny)
library(glue)
library(tidyverse)
# Define UI for application
ui <- fluidPage(
# Application title
titlePanel("Test Multi-File Download"),
p("I hope this works!"),
downloadButton(
outputId = "download_btn",
label = "Download",
icon = icon("file-download")
)
)
# Define server logic
server <- function(input, output) {
#datasets stored in reactiveValues list
to_download <- reactiveValues(dataset1 = iris, dataset2 = airquality, dataset3 = mtcars, dataset4 = NULL)
blahblah <- iris
output$download_btn <- downloadHandler(
filename = function(){
paste("my_data_", Sys.Date(), ".zip", sep = "")
},
content = function(file){
temp_directory <- file.path(tempdir(), as.integer(Sys.time()))
dir.create(temp_directory)
reactiveValuesToList(to_download) %>%
imap(function(x,y){
if(!is.null(x)){
file_name <- glue("{y}_data.csv")
readr::write_csv(x, file.path(temp_directory, file_name))
}
})
zip::zip(
zipfile = file,
files = dir(temp_directory),
root = temp_directory
)
},
contentType = "application/zip"
)
}
shinyApp(ui = ui, server = server)
In my own Shiny app I used a multi-worksheet approach, as you suggested above. An alternative setup that produces a multi-sheet xlsx workbook using openxlsx could be:
...
output$download_btn <- downloadHandler(
filename = function(){
paste("my_data_", Sys.Date(), ".xlsx", sep = "")
},
content = function(file){
wb <- createWorkbook()
reactiveValuesToList(to_download) %>%
imap(function(x,y){
if(!is.null(x)){
addWorksheet(wb, sheetName = y)
writeData(wb, x, sheet = y)
}
})
saveWorkbook(wb, file = file)
},
contentType = "file/xlsx"
)
...
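If workbook formatting is not needed, a lighter-weight sketch of the same idea with the writexl package (assuming the same to_download reactiveValues list) could be:
library(writexl)
output$download_btn <- downloadHandler(
  filename = function() {
    paste("my_data_", Sys.Date(), ".xlsx", sep = "")
  },
  content = function(file) {
    # drop NULL entries, then write one sheet per remaining data frame
    datasets <- purrr::compact(reactiveValuesToList(to_download))
    writexl::write_xlsx(datasets, path = file)
  }
)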
I am making a Shiny app that allows the user to upload a shapefile using the sf package. When I select the .shp file via the Browse window, I get an error. How can I allow the user to upload a shapefile that then gets read by st_read() or readOGR()? Also, I don't know why st_read is going to C:\Users\Ed\AppData... as this is not the location of the shapefile.
library(shiny)
library(shinydashboard)
library(shinythemes) # for shinytheme()
library(plotly)      # for plotlyOutput()/renderPlotly()
library(leaflet)     # for leaflet()/addPolygons()
library(sf)
UI
ui = navbarPage("Project Eddy", theme = shinytheme("sandstone"),
tabPanel("Location",
sidebarLayout(sidebarPanel(fileInput("shp", "Please choose a Shapefile",
multiple = F,
".shp")),
mainPanel(plotlyOutput(outputId = "Area")))))
Server
server = function(input, output, session) {
myshp.df = reactive({
# input$shp will be NULL initially. After the user selects
# and uploads a file, head of that data file by default,
# or all rows if selected, will be shown.
req(input$shp)
df = st_read(dsn = input$shp$datapath,
             quiet = TRUE)
if(input$disp == "head") {
return(head(df))
}
else {
return(df)
}
})
output$Area = renderPlotly({
req(myshp.df())
a = myshp.df()
c = leaflet(a) %>%
addPolygons(stroke = FALSE, fillOpacity = 0.5, smoothFactor = 0.5) %>%
addProviderTiles('Esri.WorldImagery')
})
}
Error
Warning in CPL_read_ogr(dsn, layer, query, as.character(options), quiet, :
GDAL Error 4: Unable to open C:\Users\Ed\AppData\Local\Temp\RtmpioUU3m\b0cd5b1eb5c4fe4219e6c114\0.shx or C:\Users\Ed\AppData\Local\Temp\RtmpioUU3m\b0cd5b1eb5c4fe4219e6c114\0.SHX. Set SHAPE_RESTORE_SHX config option to YES to restore or create it.
Warning: Error in : Cannot open "C:\Users\Ed\AppData\Local\Temp\RtmpioUU3m\b0cd5b1eb5c4fe4219e6c114\0.shp"; The source could be corrupt or not supported. See `st_drivers()` for a list of supported formats.
ESRI shapefiles are known troublemakers, as they live over multiple files - the single *.shp file is not enough for your shiny app to work with.
Consider a solution proposed by user fiorepalombina on RStudio Community forum: https://community.rstudio.com/t/shinyfiles-and-shapefiles/89099
To read in a shapefile, the user must submit at minimum the mandatory files (.shp, .shx and .dbf). Once the files are uploaded, you can access their locations and names via $datapath and $name.
By default, Shiny stores the uploaded files under temporary names like this:
C:\Users\DWISME~1\AppData\Local\Temp\17\RtmpiOjVGv/6903ae29a41daccceee4b8a5/0.dbf
C:\Users\DWISME~1\AppData\Local\Temp\17\RtmpiOjVGv/6903ae29a41daccceee4b8a5/1.prj
C:\Users\DWISME~1\AppData\Local\Temp\17\RtmpiOjVGv/6903ae29a41daccceee4b8a5/2.sbn
C:\Users\DWISME~1\AppData\Local\Temp\17\RtmpiOjVGv/6903ae29a41daccceee4b8a5/3.sbx
C:\Users\DWISME~1\AppData\Local\Temp\17\RtmpiOjVGv/6903ae29a41daccceee4b8a5/4.shp
C:\Users\DWISME~1\AppData\Local\Temp\17\RtmpiOjVGv/6903ae29a41daccceee4b8a5/5.shx
My approach is to create a function that takes the file input, renames the uploaded files back to their original names inside the temp directory, and then reads the .shp from there:
library(shiny)
library(sf)
library(purrr)
ui <- fluidPage(
br(),
fluidRow(column(6, offset = 3,
fileInput("shp", label = "Input Shapfile (.shp,.dbf,.sbn,.sbx,.shx,.prj)",
width = "100%",
accept = c(".shp",".dbf",".sbn",".sbx",".shx",".prj"), multiple=TRUE))),
br(),
fluidRow(column(8, offset = 2,
p("input$shp$datapath" , style = "font-weight: bold"),
verbatimTextOutput("shp_location", placeholder = T))),
br(),
fluidRow(column(8, offset = 2,
p("input$shp$name" , style = "font-weight: bold"),
verbatimTextOutput("shp_name", placeholder = T)))
)
server <- function(input, output, session) {
# Read-in shapefile function
Read_Shapefile <- function(shp_path) {
infiles <- shp_path$datapath # get the location of files
dir <- unique(dirname(infiles)) # get the directory
outfiles <- file.path(dir, shp_path$name) # create new path name
name <- strsplit(shp_path$name[1], "\\.")[[1]][1] # strip name
purrr::walk2(infiles, outfiles, ~file.rename(.x, .y)) # rename files
x <- read_sf(file.path(dir, paste0(name, ".shp"))) # read-in shapefile
return(x)
}
# Read-shapefile once user submits files
observeEvent(input$shp, {
user_shp <- Read_Shapefile(input$shp)
plot(user_shp) # plot to R console
# Print original file path location and file name to UI
output$shp_location <- renderPrint({
full_path <- strsplit(input$shp$datapath," ")
purrr::walk(full_path, ~cat(.x, "\n"))
})
output$shp_name <- renderPrint({
name_split <- strsplit(input$shp$name," ")
purrr::walk(name_split, ~cat(.x, "\n"))
})
})
}
shinyApp(ui, server)
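As a defensive variant on the same idea, the uploaded parts could be copied into a fresh per-upload folder with their original names instead of renamed in place, which avoids clashes if the same temp directory is reused; a sketch (assumes the user selected at least the .shp, .shx and .dbf parts):
Read_Shapefile2 <- function(shp_path) {
  dest <- file.path(tempdir(), paste0("shp_", as.integer(Sys.time()))) # fresh folder per upload
  dir.create(dest)
  file.copy(shp_path$datapath, file.path(dest, shp_path$name))         # restore original names
  shp_file <- shp_path$name[grepl("\\.shp$", shp_path$name, ignore.case = TRUE)][1]
  sf::read_sf(file.path(dest, shp_file))
}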
I am still relatively new to working in R Shiny, and I am trying to load several Excel files into a Shiny app. Part of the problem is that I need to be able to pull several files from a Dropbox folder without specifying what the data files are called. So I need to be able to tell R to read in all the files from a Dropbox folder. Also, the files I am working with are in .xlsx format and I will need to read them into R as such.
I tried to do this first by using a folder on my computer desktop. I managed to get it to work using my local directory with the code below:
library(readxl)
library(tidyverse)
files <- list.files(path = "~/Desktop/data", pattern = "*.xlsx", full.names = TRUE) #read files from folder on desktop
df <- sapply(files, read_excel, simplify = FALSE) %>% #read files from the path, and bind them together
bind_rows()
I tried to adjust the code above to work with the drop_dir function in rdrop2. The code I tried is below:
library(rdrop2)
library(tidyverse)
library(readxl)
token <- drop_auth()
files <- drop_dir("!dropbox_folder", dtoken = token) #List all files in Dropbox folder MPD_03_Test
f <- files$path_display #list directory to dropbox
df <- sapply(f, read_excel, simplify = FALSE) %>% #runs the read function for all the files that are pulled
bind_rows() # .id = "id" creates a unique ID for each row and then binds them all together based on the ID.
When I run it, the code does not load the data files from Dropbox into R; it just creates an empty object. Any help on where to go to figure this out will be greatly appreciated! Also, I intend to use this to read data into a Shiny app, if that helps frame any suggestions you may have about how to approach my problem.
Thank you!
@MrGumble is correct in his comments. The files need to be downloaded before being read. The drop_dir() function lists file paths on the Dropbox server, and we can only read in data saved locally to our machine. If you have .csv files, this can be done in one step with the drop_read_csv() function. But since you have Excel files, these first need to be downloaded explicitly with drop_download() and then read in with read_excel().
library(rdrop2)
library(tidyverse)
library(readxl)
#install.packages("xlsx")
library(xlsx)
token <- drop_auth()
#make a few excel file with iris dataset, save locally, and upload to dropbox root
iris_filenames <- paste0("iris", 1:3, ".xlsx")
walk(iris_filenames, ~write.xlsx(iris, file = .x, row.names = FALSE))
walk(iris_filenames, drop_upload)
#list all files on dropbox root and filter for only iris ones
iris_files_on_dropbox <- drop_dir(dtoken = token) %>%
filter(str_detect(name, 'iris'))
#make new filenames so we can see that the download worked correctly
#you could do overwrite = TRUE and not pass through new filenames
#see ?drop_download for all options
new_iris_filenames <- paste0("iris", 1:3, "-from-dropbox.xlsx")
#download the files first
walk2(iris_files_on_dropbox$name, new_iris_filenames, ~drop_download(path = .x, local_path = .y))
#then read them all in
df <- bind_rows(map(new_iris_filenames, read_xlsx))
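As a small variation, if you want to record which workbook each row came from, bind_rows() can add an id column (the column name "source_file" is just an example):
df <- new_iris_filenames %>%
  set_names() %>%                  # name each element by its file name
  map(read_xlsx) %>%
  bind_rows(.id = "source_file")   # one id column recording the originating file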
Additionally, we can create our own custom function that does the download and reading in one step, just as drop_read_csv() does, by altering that function's source code. All we need to do is change the read function from read.csv to read_excel, and the reference to the dtoken default get_dropbox_token() to rdrop2:::get_dropbox_token(), which is an unexported function from the rdrop2 package, so we need the three ':::'.
#source for drop_read_csv we can rewrite for excel files
# drop_read_csv <- function(file, dest = tempdir(), dtoken = get_dropbox_token(), ...) {
# localfile = paste0(dest, "/", basename(file))
# drop_download(file, localfile, overwrite = TRUE, dtoken = dtoken)
# utils::read.csv(localfile, ...)
# }
drop_read_excel <- function(file, dest = tempdir(), dtoken = rdrop2:::get_dropbox_token(), ...) {
localfile = paste0(dest, "/", basename(file))
drop_download(file, localfile, overwrite = TRUE, dtoken = dtoken)
readxl::read_excel(localfile, ...)
}
df2 <- bind_rows(map(iris_files_on_dropbox$name, drop_read_excel))
To make this work in a Shiny app, we first need to save the drop_auth() token so we can authenticate while using the app. Save it into your Shiny app directory:
saveRDS(token, file = "token.rds")
Now here is a Shiny app. When the 'go' button is clicked, the iris Excel files are downloaded and shown in the UI. We need to call drop_auth() in the global environment or in global.R, along with the custom drop_read_excel() function, in order to use it.
library(shiny)
library(rdrop2)
library(tidyverse)
#saveRDS(token, file = "token.rds") into shiny app directory
#authenticate in global.R or outside of ui/server
drop_auth(rdstoken = "token.rds")
drop_read_excel <- function(file, dest = tempdir(), dtoken = rdrop2:::get_dropbox_token(), ...) {
localfile = paste0(dest, "/", basename(file))
drop_download(file, localfile, overwrite = TRUE, dtoken = dtoken)
readxl::read_excel(localfile, ...)
}
ui <- fluidPage(
actionButton("go", "go"),
tableOutput("table")
)
server <- function(input, output, session) {
df <- eventReactive(input$go, {
withProgress(message = 'Downloading from dropbox',
detail = 'This may take a while...', value = 0.5, {
iris_files_on_dropbox <- drop_dir() %>%
filter(str_detect(name, 'iris'))
setProgress(value = 0.75)
df <- bind_rows(map(iris_files_on_dropbox$name, drop_read_excel))
setProgress(value = 1)
})
return(df)
})
output$table <- renderTable({
df()
})
}
shinyApp(ui, server)
Hi, I'm trying to download multiple CSV files from a single Excel file. I want to download (using only one downloadButton) the different sheets of the Excel file.
I don't understand why a for() loop doesn't work, and I can't see how to do it.
If anyone knows...
The point is to download the different CSV files, which are in the "wb" list (wb[1], wb[2], ...).
Thanks.
Here is my code, which works for the third sheet, for instance (and sorry for my bad English):
ui :
library(readxl)
library(shiny)
library(XLConnect)
fluidPage(
titlePanel("Export onglets en CSV"),
sidebarLayout(
sidebarPanel(
fileInput('fichier1', 'Choose your Excel file:',
accept = ".xlsx"),
fluidPage(
fluidRow(
column(width = 12,
numericInput("sheet","Indiquez l'onglet à afficher :",min = 1, value = 1),
tags$hr(),
textInput('text', "File name:"),
tags$hr(),
h4("Pour télécharger les fichiers .csv :"),
downloadButton("download","Télécharger")
)
)
)),
mainPanel(
tabsetPanel(
tabPanel('Import',
h4("Source file:"),
dataTableOutput("contents"))
)
)
)
)
Server :
function(input,output){
# Create the data:
data <- reactive({
inFile<- input$fichier1
if (is.null(inFile)){
return(NULL)
}else{
file.rename(inFile$datapath,
paste(inFile$datapath,".xlsx", sep =""))
wb = loadWorkbook(paste(inFile$datapath,".xlsx",sep=""))
lst = readWorksheet(wb,sheet = getSheets(wb))
list(wb = wb, lst = lst)
}
})
# Table output:
output$contents <- renderDataTable({
data()$wb[input$sheet]
},options = list(pageLength = 10))
# Download:
output$download <- downloadHandler(
#for (i in 1:input$sheet){
filename = function(){
paste(input$text,"_0",3,".csv",sep = "")
},
content = function(file){
write.table(data()$wb[3],file,
sep = ';', row.names = F, col.names = T)
}
#}
)
}
As @BigDataScientist pointed out, you could zip all of your CSV files and download the zipped file. Your downloadHandler could look like this:
output$download <- downloadHandler(
filename = function(){
paste0(input$text,".zip")
},
content = function(file){
#go to a temp dir to avoid permission issues
owd <- setwd(tempdir())
on.exit(setwd(owd))
files <- NULL;
#loop through the sheets
for (i in 1:input$sheet){
#write each sheet to a csv file, save the name
fileName <- paste(input$text,"_0",i,".csv",sep = "")
write.table(data()$wb[i],fileName,sep = ';', row.names = F, col.names = T)
files <- c(fileName,files)
}
#create the zip file
zip(file,files)
}
)
This does not download all of the sheets from the Excel file, but the sheets ranging from 1 up to whatever the user has entered in input$sheet.
You could also disable the download button if the user has not added an Excel file or a file name.
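For that last suggestion, a minimal sketch with the shinyjs package (assuming shinyjs::useShinyjs() has been added to the UI):
observe({
  # enable the download button only once a file and a name have been provided
  shinyjs::toggleState(
    id = "download",
    condition = !is.null(input$fichier1) && nzchar(input$text)
  )
})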
Hope you've solved this, MBnn, but in case anyone else is having similar problems, this case is down to RTools not being installed correctly on Windows.
Currently you need to pay close attention while running through the install process, and make sure to tick the checkbox that edits the system path.
Based on your code, this is likely to be the same issue preventing you from saving XLSX workbooks too.
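As a quick way to check whether R can actually see a zip utility after (re)installing RTools, something like this can be run in the console (an empty string from Sys.which() means nothing was found on the PATH):
Sys.which("zip")                       # path to the zip binary, or "" if not found
Sys.getenv("R_ZIPCMD", unset = "zip")  # utils::zip() falls back to this command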
I know this is an old thread, but I had the same issue and the top answer did not work for me. However, a simple tweak, using the archive package, worked.
Reproducible example below:
library(shiny)
library(archive)
shinyApp(
# ui
ui = fluidPage(downloadButton("dl")),
# server
server = function(input, output, session) {
# download handler
output$dl <- downloadHandler(
filename = function() {"myzipfile.zip"},
# content: iris and mtcars
content = function(file) {
# definition of content to download
to_dl <- list(
# names to use in file names
names = list(a = "iris",
b = "mtcars"),
# data
data = list(a = iris,
b = mtcars)
)
# temp dir for the csv's as we can only create
# an archive from existent files and not data from R
twd <- setwd(tempdir())
on.exit(setwd(twd))
files <- NULL
# loop on data to download and write individual csv's
for (i in c("a", "b")) {
fileName <- paste0(to_dl[["names"]][[i]], ".csv") # csv file name
write.csv(to_dl[["data"]][[i]], fileName) # write csv in temp dir
files <- c(files, fileName) # store written file name
}
# create archive from written files
archive_write_files(file, files)
}
)
}
)
This will create the zip file myzipfile.zip which will contain iris.csv and mtcars.csv.
I have a file which I generate in Shiny.
The user clicks a button and the file should download; however, nothing happens.
The function export_report generates the Excel file and saves it to a location. The function then passes the file location back to the downloadHandler so it will download the file. The problem seems to be that it isn't being returned correctly. I have tested the function (export_report) outside of Shiny and it returns everything perfectly, so I'm clearly doing something wrong from the Shiny perspective.
The file itself is created where it is supposed to be on the server, because I can download it within RStudio and see it in the file explorer. Can anyone help?
# UI Section
downloadButton("downloadRpt", "Download Report")
# Server Section
output$downloadRpt <- downloadHandler(
filename = function() {
mydf <- report()
dateRange <- input$dates_report
selection <- input$selection
myfile <- export_report (mydf, selection, dateRange)
},
content = function(file) {
file.copy(myfile, file)
}
)
I have seen other examples (e.g. "R Shiny: Download existing file"), which is what my code is based on.
EDIT 1: Adding the export_report function with some fake data to run it
export_report <- function(mydf,selection,dateRange) {
# Path to where the template Excel file is stored
myoutputTemplate <- '/home/shiny_tutorials/Save to Database/templates/output_template.xlsx'
start_date <- dateRange[1]
end_date <- dateRange[2]
date_range <- paste(start_date ,end_date, sep = " - " )
# Load workbook the template workbook
wb <- loadWorkbook(myoutputTemplate)
# write the data frame to the workbook
writeWorksheet(wb, mydf, sheet="Details",
startRow=8, startCol=2,
header=FALSE)
# add the customer the user selected
writeWorksheet(wb, selection, sheet="Details",
startRow=3, startCol=3,
header=FALSE)
# date
writeWorksheet(wb, date_range, sheet="Details",
startRow=5, startCol=3,
header=FALSE)
# Create The file Name
filename <- paste(selection, Sys.Date(), sep = " - ") %>%
paste(.,"xlsx", sep = ".")
# remove the % sign and extra quotes
filename <- gsub (pattern = '\'|%','', x = filename)
# output directory
myoutput <- paste('/home/shiny_tutorials/Save to Database/output/',
filename, sep = '')
# Save workbook
saveWorkbook(wb, myoutput)
# Return File Path
myoutput
}
To call the function you can use the data below
dateRange <- c("2011-09-23","2016-09-23")
selection = "COMPANY_A"
mydf <- iris
myfile <- export_report(mydf,selection,dateRange)
EDIT 2: I have now managed to get an error out of it. When I cat(myfile) in the filename = function() { section of the code, I get the following error after the correct file path has been returned:
Warning in rep(yes, length.out = length(ans)) :
'x' is NULL so the result will be NULL
Warning: Error in ifelse: replacement has length zero
Stack trace (innermost first):
1: runApp
Error : replacement has length zero
This error is basically because my file path does not get passed to the myfile variable, so
if someone can tell me how to get the file path generated by my function into the server section of the code below, that should fix my problem:
content = function(file) {
file.copy(myfile, file)
}
Thank you to everyone who commented and clarified my thinking a bit on how the downloadHandler works.
In the end, I created a new function which splits up the export function above.
The new function, generate_file_name(), simply returns the file name:
generate_file_name <- function(selection) {
# Create The file Name
filename <- paste(selection, Sys.Date(), sep = " - ") %>%
paste(.,"xlsx", sep = ".")
# remove the % sign and extra quotes
filename <- gsub (pattern = '\'|%','', x = filename)
# output directory
myoutput <- paste('/home/shiny_tutorials/Save to Database/output/',
filename, sep = '')
# Return File Path
myoutput
}
Then on the server side:
output$downloadRpt <- downloadHandler(
filename = function() {
selection <- input$company
generate_file_name(selection)
},
content = function(file) {
mydf <- report()
dateRange <- input$dates_report
selection <- input$company
export_report(mydf,selection,dateRange)
myfile <- generate_file_name(selection)
file.copy(myfile, file)
}
)
This then finds the newly created file and exports it for the user
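If export_report() can be given an explicit output path, a further simplification is to write the workbook straight to the temporary path Shiny provides, avoiding the shared output folder entirely; a sketch (export_report_to() is a hypothetical variant of export_report() that takes the output path as an argument):
output$downloadRpt <- downloadHandler(
  filename = function() {
    basename(generate_file_name(input$company))
  },
  content = function(file) {
    # hypothetical: export_report_to() saves the workbook directly to `file`
    export_report_to(report(), input$company, input$dates_report, path = file)
  }
)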
I just checked your problem with this example code and it worked:
output$downloadData <- downloadHandler(
filename = function() {
data <- mtcars
myfile <- "data.csv"
write.csv(data, myfile)
myfile
},
content = function(file) {
print(file) # prints: [...]\\Local\\Temp\\RtmpEBYDXT\\fileab8c003878.csv
file.copy(file, file)
}
)
myfile is the file name of the downloaded file. You cannot use it as input in file.copy; that variable is out of scope in the content function. It seems that R creates a temp file name (see the print()).
Try to use the filename function to define your path or a custom file name, and do the write.csv in the content part. Example code:
output$downloadData <- downloadHandler(
filename = function() {
paste(<user_name or date for eg>, ".csv", sep="")
},
content = function(file) {
write.csv(data, file)
}
)
I noticed in your comment above that you asked how the application would generate the correct file when used by multiple users. For this part, you need to use the session.
So if your business logic functions were to come from an R file called foo.R, the server code should look something like:
shinyServer(func = function(input, output, session) {
source("./foo.R", local=TRUE)
......
This would separate out the session for each user, thereby generating files specific to each user when downloading. Hope this gives you some pointers.