I'm creating a simple GUI in Shiny for reading in a bunch of csv files and then filtering them by the values present in the 5th column of each csv. However, I'm not sure how to access the correct Shiny environment. For example, within the server function, I first read the files in with the lines:
for (i in all_paths) {
  n <- basename(i)
  temp = list.files(path = i, pattern = "*.csv", full.names = TRUE)
  list2env(
    lapply(setNames(temp, make.names(gsub(".*FRSTseg*", n, temp))),
           read.csv),
    envir = .GlobalEnv)
}
And then filter with:
Pattern1 <- grep("*.csv", names(.GlobalEnv), value = TRUE)
all_data <- do.call("list", mget(Pattern1))
newdfs <- lapply(all_data, function(x) subset(x, x[, 5] > 0))
list2env(newdfs, globalenv())
When I run the app, I get an error message saying it can't find the value of one of my csvs, which turns out to be the first element of the Pattern1 vector. So I'm pretty sure the app fails right after the Pattern1 line.
I think the problem is that the csv files are not being read into the correct environment, so the all_data <- do.call... line does not know where to look. Instead of using .GlobalEnv and globalenv(), what should I be using? Any help is appreciated, thanks!
We can use reactiveValues to store the result of read_csv so that it is available across all observers in the app. I created a small app that reads the 5th column of the different .csv files located in the project directory. All the data will be stored inside an object called columns_read$files that can be used inside any observer or reactive.
app:
library(tidyverse)
library(shiny)
set.seed(15)
# create the data
paste0('iris', 1:5, '.csv') %>%
  map(~ write_csv(x = slice_sample(iris, n = 10), .x))
ui <- fluidPage(
  actionButton('read_files', "Read Files"),
  textOutput('columns_print')
)
server <- function(input, output, session) {
  columns_read <- reactiveValues(files = NULL)
  observeEvent(input$read_files, {
    files <- list.files(pattern = "*.csv", full.names = TRUE)
    columns_read$files <- map(files, ~ read_csv(.x, col_select = 5))
  })
  output$columns_print <- renderPrint({
    req(columns_read$files)
    columns_read$files
  })
}
shinyApp(ui, server)
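The original question also filters each file by the values in its 5th column rather than only reading that column in; here is a minimal sketch of how the observeEvent above could be adapted to do that (the > 0 threshold comes from the question, and naming the list elements by file name is just one convenient option):
observeEvent(input$read_files, {
  files <- list.files(pattern = "*.csv", full.names = TRUE)
  # read each file in full, then keep only the rows whose 5th column is positive
  columns_read$files <- map(files, function(f) {
    df <- read_csv(f)
    df[df[[5]] > 0, ]
  }) %>%
    set_names(basename(files))
})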
I'm pretty stuck here; I have created a simple shiny app with the possibility of uploading multiple files. However, I don't know how to move on from here and access the files directly within the shiny app, for example to get all the uploaded data files into one data.frame to loop over later on.
For example, we have
data_1 <- "data file 1"
data_2 <- "data file 2"
data_3 <- "data file 3"
data_4 <- "data file 4"
dataSet <- data.frame(DATA = c(1, 2, 3, 4),
                      DATAFILE = c(data_1, data_2, data_3, data_4))
Is there any way to do that? I hope I have been able to explain myself thoroughly. I really appreciate any help you can provide.
library(shiny)
options(shiny.maxRequestSize = 30 * 1024^2)
ui <- fluidPage(
  fileInput("upload", NULL, buttonLabel = "Upload...", multiple = TRUE),
  tableOutput("files")
)
server <- function(input, output, session) {
  output$files <- renderTable(input$upload)
}
shinyApp(ui, server)
input$upload is a data.frame with four columns. To read the files we need the datapath column, which holds the temporary path to the uploaded data; in this case the files are csvs. From there we use a function like readr::read_csv() to turn the raw uploaded data into a data frame.
We can construct a reactive that holds a list of all the uploaded files.
# read all the uploaded files
all_files <- reactive({
  req(input$upload)
  purrr::map(input$upload$datapath, read_csv) %>%
    purrr::set_names(input$upload$name)
})
Full app:
library(shiny)
library(tidyverse)
library(DT)
# create some data to upload
write_csv(mtcars, "mtcars.csv")
write_csv(mpg, "mpg.csv")
write_csv(iris, "iris.csv")
options(shiny.maxRequestSize = 30 * 1024^2)
ui <- fluidPage(
  fileInput("upload", NULL, buttonLabel = "Upload...", multiple = TRUE),
  DT::DTOutput("files"),
  tableOutput("selected_file_table")
)
server <- function(input, output, session) {
  output$files <- DT::renderDT({
    DT::datatable(input$upload, selection = c("single"))
  })
  # read all the uploaded files
  all_files <- reactive({
    req(input$upload)
    purrr::map(input$upload$datapath, read_csv) %>%
      purrr::set_names(input$upload$name)
  })
  # select a row in the DT files table and display the corresponding data
  output$selected_file_table <- renderTable({
    req(input$upload)
    req(input$files_rows_selected)
    all_files()[[
      input$upload$name[[input$files_rows_selected]]
    ]]
  })
}
shinyApp(ui, server)
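Since the original goal was to get all the uploaded files into a single data.frame, the all_files() list can also be collapsed with bind_rows(); a minimal sketch, assuming the same all_files reactive as above:
# one data frame with a `file` column identifying which upload each row came from
combined_data <- reactive({
  bind_rows(all_files(), .id = "file")
})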
There are two stages to this:
When you select a file, it gets copied into a temp directory. One of the values returned by the input is the location of the temp file; another is the original file name.
Once you have the file path you can use a function to read the data from that temp file.
The example at the bottom of this should be helpful (although your example needs a little bit more than this one because you have selected multiple files):
https://shiny.rstudio.com/reference/shiny/1.6.0/fileInput.html
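For a single uploaded csv, the two stages might look like the minimal sketch below (assuming a fileInput with inputId "upload" and readr for reading; with multiple files you would loop over input$upload$datapath as in the other answer above):
library(shiny)
library(readr)
ui <- fluidPage(
  fileInput("upload", NULL, accept = ".csv"), # stage 1: the file is copied to a temp location
  tableOutput("head")
)
server <- function(input, output, session) {
  output$head <- renderTable({
    req(input$upload)
    # stage 2: read the data back in from the temp path reported by the input
    head(read_csv(input$upload$datapath))
  })
}
shinyApp(ui, server)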
I am new to Shiny, apologies if this is obvious and has been asked numerous times, but I've been stuck on this for days.
I've been modifying a dashboard to process analytical chemistry data, i.e. it reads in multiple csv files and processes the data (smoothing etc.) with various sliders and functions in Shiny, but it does not save/download the processed data/output, which is what I've been trying to add. I don't seem to be able to access the "output" or processed data, e.g. as a list of matrices, which I could then write out as new .csv files (I get "object of type 'closure' is not subsettable").
I am competent in R and have a script which works well, but making this change in Shiny is proving problematic. How do I access the output data of detectedPeaks or baselineCorrectedSpectra to write it to csv (or zip up the multiple csv files)?
Thank you.
# Just part of the relevant code - a long script
# server
baselineCorrectedSpectra <- reactive({
  if (is.null(input$bc)) {
    method <- "SNIP"
    hws <- 100
  } else {
    method <- input$bc
    hws <- input$bcHws
  }
  return(lapply(smoothedSpectra(), function(y) {
    bl <- estimateBaseline(y, method = method, hws)
    intensity(y) <- intensity(y) - bl[, 2]
    return(y)
  }))
})
detectedPeaks <- reactive({
  return(detectPeaks(baselineCorrectedSpectra(), method = input$pdNoise,
                     halfWindowSize = input$pdHws, SNR = input$pdSNR))
})
datasetInput <- reactive({
  switch(input$dtset,
         "peaks" = detectedPeaks(),
         "centroided" = baselineCorrectedSpectra())
})
output$DownloadZip <- downloadHandler(
  filename = function() {
    paste0("Results", ".zip")
  },
  content = function(con) {
    files <- c()
    tmpdir <- tempdir()
    setwd(tempdir())
    for (i in 1:length(s)) {
      x <- as.matrix(datasetInput[[i]]) # This doesn't work, how do I access this data?
      y <- metaData(s[[i]])
      f <- (paste("processed", y, sep = "_"))
      if (input$downloadType == ".csv")
        write.csv(x, f)
      else write.table(x, f)
      files <- c(x, files)
    }
    zip(zipfile = con, files = files)
  },
  contentType = "application/zip"
)
I added () to datasetInput. It is a reactive expression, so it has to be called like a function before you can subset the list it returns. The way to do it is below:
x <- as.matrix(datasetInput()[[i]])
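Applied to the downloadHandler in the question, the fix might look roughly like the sketch below. It is only an outline, since `s`, metaData() and the rest of the script are not shown, so the output file names here are placeholders; note also that the file names, not the data matrices, are collected into files before zipping:
output$DownloadZip <- downloadHandler(
  filename = function() paste0("Results", ".zip"),
  content = function(con) {
    ds <- datasetInput()          # call the reactive, then subset the returned list
    owd <- setwd(tempdir())       # write the temp files where the original did
    on.exit(setwd(owd))           # restore the working directory afterwards
    files <- character(0)
    for (i in seq_along(ds)) {
      x <- as.matrix(ds[[i]])
      # placeholder file names; the original script derives them from metaData(s[[i]])
      f <- paste0("processed_", i, ".csv")
      write.csv(x, f)
      files <- c(files, f)        # collect the file names, not the data
    }
    zip(zipfile = con, files = files)
  },
  contentType = "application/zip"
)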
I am still relatively new to working in R Shiny and I am trying to load several Excel files into an R Shiny app. Part of the problem is that I need to be able to pull several files from a Dropbox folder without specifying what the data files are called. So I need to be able to tell R to read in all the files from a Dropbox folder. Also, the files I am working with are in .xlsx format and I will need to read them into R as such.
I tried to do this first by using a folder on my computer desktop. I managed to get it to work using my local directory with the code below:
library(readxl)
library(tidyverse)
files <- list.files(path = "~/Desktop/data", pattern = "*.xlsx", full.names = TRUE) #read files from folder on desktop
df <- sapply(files, read_excel, simplify = FALSE) %>% # read files from the path, and bind them together
  bind_rows()
I tried to adjust the code above to work with the drop_dir function in rdrop2. The code I tried is below:
library(rdrop2)
library(tidyverse)
library(readxl)
token <- drop_auth()
files <- drop_dir("!dropbox_folder", dtoken = token) #List all files in Dropbox folder MPD_03_Test
f <- files$path_display #list directory to dropbox
df <- sapply(f, read_excel, simplify = FALSE) %>% # runs the read function for all the files that are pulled
  bind_rows() # .id = "id" creates a unique ID for each row and then binds them all together based on the ID
When I run the Dropbox code, it does not load the data files from Dropbox into R; it just creates an empty object. Any help on where to go to figure this out will be greatly appreciated! Also, I intend to use this as the way I read data into an R Shiny app, if that helps frame any suggestions you may have about how to approach my problem.
Thank You!
#MrGumble is correct in his comments. The files need to be downloaded before being read. The drop_dir() function lists file paths on the Dropbox server, and we can only read in data saved locally to our machine. If you had .csv files this could be done in one step with the drop_read_csv() function. But since you have Excel files, these first need to be downloaded explicitly with drop_download() and then read in with read_excel().
library(rdrop2)
library(tidyverse)
library(readxl)
# install.packages("xlsx")
library(xlsx)
token <- drop_auth()
# make a few excel files with the iris dataset, save them locally, and upload to the dropbox root
iris_filenames <- paste0("iris", 1:3, ".xlsx")
walk(iris_filenames, ~ write.xlsx(iris, file = .x, row.names = FALSE))
walk(iris_filenames, drop_upload)
# list all files on the dropbox root and filter for only the iris ones
iris_files_on_dropbox <- drop_dir(dtoken = token) %>%
  filter(str_detect(name, 'iris'))
# make new filenames so we can see that the download worked correctly
# you could do overwrite = TRUE and not pass through new filenames
# see ?drop_download for all options
new_iris_filenames <- paste0("iris", 1:3, "-from-dropbox.xlsx")
# download the files first
walk2(iris_files_on_dropbox$name, new_iris_filenames, ~ drop_download(path = .x, local_path = .y))
# then read them all in
df <- bind_rows(map(new_iris_filenames, read_xlsx))
Additionally, we can create our own custom function that does the download and the reading in one step, just as drop_read_csv() does, by altering that function's source code. All we need to do is change the read function from read.csv to read_excel, and change the dtoken default get_dropbox_token() to rdrop2:::get_dropbox_token(), which is an un-exported function from the rdrop2 package, hence the three ':::'.
# source for drop_read_csv, which we can rewrite for excel files
# drop_read_csv <- function(file, dest = tempdir(), dtoken = get_dropbox_token(), ...) {
#   localfile = paste0(dest, "/", basename(file))
#   drop_download(file, localfile, overwrite = TRUE, dtoken = dtoken)
#   utils::read.csv(localfile, ...)
# }
drop_read_excel <- function(file, dest = tempdir(), dtoken = rdrop2:::get_dropbox_token(), ...) {
  localfile <- paste0(dest, "/", basename(file))
  drop_download(file, localfile, overwrite = TRUE, dtoken = dtoken)
  readxl::read_excel(localfile, ...)
}
df2 <- bind_rows(map(iris_files_on_dropbox$name, drop_read_excel))
To use this in a shiny app, we first need to save the drop_auth() token so that the app can authenticate. Save it into your shiny app directory:
saveRDS(token, file = "token.rds")
Now here is a shiny app. When the 'go' button is clicked the iris excel files are downloaded and shown in the UI. We need to call drop_auth() in the global environment or global.R along with the custom drop_read_excel() function to use it.
library(shiny)
library(rdrop2)
library(tidyverse)
# saveRDS(token, file = "token.rds") into the shiny app directory
# authenticate in global.R or outside of ui/server
drop_auth(rdstoken = "token.rds")
drop_read_excel <- function(file, dest = tempdir(), dtoken = rdrop2:::get_dropbox_token(), ...) {
  localfile <- paste0(dest, "/", basename(file))
  drop_download(file, localfile, overwrite = TRUE, dtoken = dtoken)
  readxl::read_excel(localfile, ...)
}
ui <- fluidPage(
  actionButton("go", "go"),
  tableOutput("table")
)
server <- function(input, output, session) {
  df <- eventReactive(input$go, {
    withProgress(message = 'Downloading from dropbox',
                 detail = 'This may take a while...', value = 0.5, {
      iris_files_on_dropbox <- drop_dir() %>%
        filter(str_detect(name, 'iris'))
      setProgress(value = 0.75)
      df <- bind_rows(map(iris_files_on_dropbox$name, drop_read_excel))
      setProgress(value = 1)
    })
    return(df)
  })
  output$table <- renderTable({
    df()
  })
}
shinyApp(ui, server)
Context: I have an app transforming data according to user's choices. It creates a few tables and plots in the process.
Objective: to save some objects created in the process into one new folder with one click on a button.
Previous research: the code below saves objects using downloadHandler() and some functions as presented here. It does not seem to allow multiple objects to be passed into downloadHandler(). I am aware it is possible to stack these objects in a list and then save that, but if possible I would like to avoid doing so and instead get multiple files (like .txt or .png, ...).
Here is a reproducible example with very little data, using datasets included in R (mtcars and iris).
library(shiny)
ui <- fluidPage(
  downloadButton("save", "Save") # one click on this button to save df1 AND df2 tables in a new folder
)
server <- function(input, output) {
  # my real app does multiple changes on datasets based on user choices
  df1 <- mtcars[1:10, ]
  df2 <- iris[1:10, ]
  # Now I want to save df1 and df2 objects with 1 click on the "Save" button
  output$save <- downloadHandler(
    filename = function() { paste("example", ".txt", sep = " ") },
    content = function(file) { write.table(df1, file) }
  )
}
# Run the application
shinyApp(ui = ui, server = server)
Many thanks for your help and suggestions!
As noted in the comments of the linked post, it's not typically a good idea to change the working directory (and unnecessary in this case). While inconsequential with a small number of files, the paste0 call to create the path doesn't need to be in the for loop as it is vectorized. This also eliminates the need to dynamically grow the fs vector (also generally a bad practice). Lastly, my zip utility wasn't on my path which caused the utils::zip to fail (you can specify the path in the function call, otherwise it checks for the environment variable R_ZIPCMD and defaults to 'zip' assuming it to be on the path).
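For reference, the zip location can be supplied to utils::zip() either through R_ZIPCMD or through its zip argument; the path and file names below are only examples:
# option 1: set the environment variable that utils::zip() checks by default
Sys.setenv(R_ZIPCMD = "C:/Rtools/bin/zip.exe")      # example location only
# option 2: pass the binary explicitly via the `zip` argument
utils::zip(zipfile = "example.zip",
           files = c("sample_1.txt", "sample_2.txt"), # files written earlier
           zip = "C:/Rtools/bin/zip.exe")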
I generally agree with the accepted answer, but here is an alternative solution using the zip::zipr function instead (and walk() instead of the for loop).
library(shiny)
library(purrr)
library(zip)
ui <- fluidPage(
  downloadButton("save", "Save") # one click on this button to save df1 AND df2 tables in a new folder
)
server <- function(input, output) {
  # my real app does multiple changes on datasets based on user choices
  df1 <- mtcars[1:10, ]
  df2 <- iris[1:10, ]
  # need to name these as the user won't be able to specify them
  fileNames <- paste0("sample_", 1:2, ".txt")
  output$save <- downloadHandler(
    filename = function() { paste0("example", ".zip") },
    content = function(file) {
      newTmpDir <- tempfile()
      if (dir.create(newTmpDir)) {
        # write the data files
        walk2(list(df1, df2), fileNames,
              ~ write.table(.x, file.path(newTmpDir, .y)))
        # create the archive file
        zipr(file, files = list.files(newTmpDir, full.names = TRUE))
      }
    },
    contentType = "application/zip"
  )
}
shinyApp(ui = ui, server = server)
I want to use Shiny within RMarkdown for users to upload data (xlsx file).
Then I want to pass all the worksheets as R data frames (without reactivity) to run the rest of the RMarkdown file.
I mainly want to convert them into data frames so I can use reticulate to run Python code as well.
I've tried this, and it doesn't seem to quite work:
library(dplyr)
library(miniUI)
library(shiny)
library(XLConnect)
launch_shiny <- function() {
  ui <- miniPage(
    gadgetTitleBar("Input Data"),
    miniContentPanel(
      fileInput(inputId = "my.file", label = NULL, multiple = FALSE)
    )
  )
  server <- function(input, output, session) {
    wb <- reactive({
      new.file <- input$my.file
      loadWorkbook(
        filename = new.file$datapath,
        create = FALSE,
        password = NULL
      )
    })
    observeEvent(input$done, {
      stopApp(c(wb()))
    })
  }
  runGadget(ui, server)
}
test <- launch_shiny()
df1 <- readWorksheet(object = test, sheet = "sheet1")
df2 <- readWorksheet(object = test, sheet = "sheet2")
It throws this error:
Error in (function (classes, fdef, mtable) :
unable to find an inherited method for function ‘readWorksheet’ for signature ‘"list", "character"’
I can return one sheet at a time using stopApp(readWorksheet(object = wb(), sheet = "sheet1")), but I can't seem to return an entire workbook or multiple data frames at the same time.
I don't really want to read in the xlsx, save each sheet as a csv in the working directory, and then read those files back in.
Would anyone have a good suggestion on how to get around this?
The documentation of fileInput() states in the details:
datapath
The path to a temp file that contains the data that was uploaded. This file may be deleted if the user performs another upload operation.
This means that the datapath given in the input variable points to a temporary file that is no longer accessible after you close the app, and reading from that file after the app has closed is exactly what readWorksheet would be trying to do.
So you'll have to read the sheets in the server and return the dataframes somehow.
I did that by defining a second reactive, which is basically a list of data frames built by applying lapply over all the sheets in wb; in this case test will be that list of data frames.
There might be other ways to do this (more efficient, or better suited to your purpose), but here it is:
library(dplyr)
library(miniUI)
library(shiny)
library(XLConnect)
launch_shiny <- function() {
  ui <- miniPage(
    gadgetTitleBar("Input Data"),
    miniContentPanel(
      fileInput(inputId = "my.file", label = NULL,
                multiple = FALSE)
    )
  )
  server <- function(input, output, session) {
    wb <- reactive({
      new.file <- input$my.file
      loadWorkbook(
        filename = new.file$datapath,
        create = FALSE,
        password = NULL
      )
    })
    # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    df_lst <- reactive({
      # read all sheets into a list
      lapply(getSheets(wb()),
             function(sheet) {
               readWorksheet(object = wb(), sheet = sheet)
             })
    })
    # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    observeEvent(input$done, {
      # get the list of dfs from the app
      stopApp(c(df_lst()))
    })
  }
  runGadget(ui, server)
}
test <- launch_shiny()
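One small extension: lapply() over getSheets() returns an unnamed list, so the sheets can only be reached by position (test[[1]], test[[2]]). If you want to index by sheet name, as in the original readWorksheet() calls, the reactive can name the list first; a minimal sketch assuming the same wb reactive as above:
df_lst <- reactive({
  sheets <- getSheets(wb())
  # name each data frame after its sheet so the result can be used as test[["sheet1"]]
  setNames(
    lapply(sheets, function(sheet) readWorksheet(object = wb(), sheet = sheet)),
    sheets
  )
})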