I set up a shiny app that checks for a GET string and presents a link if a file matching the id argument exists. Now what I would like to do is have the page redirect straight to the download file if a valid query is detected in the URL. Does anybody know of the syntax to insert, e.g. a <meta http-equiv=...> header from server.R?
Motivation: I want to be able to download files directly into an R console session from a URL pointing at a Shiny app. So, a non-geeky user specifies their preliminary statistical model using Shiny, then a statistician downloads it into their usual working environment and takes it the rest of the way. I need to do this server-side rather than with something like javascript's window.location because javascript won't be supported client-side.
Here is the server.R
shinyServer(function(input, output, clientData) {
  query <- reactive(parseQueryString(clientData$url_search))
  revals <- reactiveValues()

  ## obtain ID from GET string
  observe({ revals$id <- query()$id })

  ## alternatively obtain ID from user input, if any
  observe({
    input$submitid
    if (length(id <- isolate(input$manualid)) > 0) revals$id <- id
  })

  ## update filename, path, and existence flag
  observe({
    revals$filename <- filename <- paste0(id <- revals$id, ".rdata")
    revals$filepath <- filepath <- paste0("backups/", filename)
    revals$filexists <- file.exists(filepath) && length(id) > 0
  })

  ## update download handler
  output$download <- downloadHandler(
    filename = function() revals$filename,
    content = function(oo) if (revals$filexists) system(sprintf('cp %s %s', revals$filepath, oo))
  )

  ## render the download link (or message, or lack thereof)
  output$link <- renderUI({
    cat('writing link widget\n')
    id <- revals$id
    if (length(id) == 0) return(div(""))
    if (revals$filexists) {
      list(span('Session download link:'), downloadLink('download', id))
    } else {
      span(paste0("File for id ", id, " not found"))
    }
  })
})
Here is the ui.R
shinyUI(pageWithSidebar(
  headerPanel(div("Banner Text"), "Page Name"),
  sidebarPanel(),
  mainPanel(
    htmlOutput('link'), br(), br(),
    span(
      textInput('manualid', 'Please type in the ID of the session you wish to retrieve:'),
      actionButton('submitid', 'Retrieve')
    )
  )
))
Update:
In trying @jeff-allen's suggestion, I ran into another problem: how do I find the filesystem path to which the files get copied for downloading and turn it into a valid URL? It's probably possible by screwing around with shell scripts and http config settings on my local host, but how can I do it in a portable way that doesn't require superuser privileges and is as Shiny-native as possible?
Motivation: I want to be able to download files directly into an R console session from a URL pointing at a Shiny app.
...i.e. this amounts to a very roundabout way of trying to serve static content from a Shiny app. It turns out I don't need to redirect or use downloadHandler at all. As this post on the Shiny forum says, any file I create inside the local www directory will be accessible as if it were at the root of my app directory. I.e. if I have my app do save.image(file='www/foo.rdata'), then I will be able to access it from http://www.myhost.com/appname/foo.rdata if the app itself lives at http://www.myhost.com/appname/.
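Concretely, the round trip looks something like the sketch below; the file name, host, and app name are placeholders, not anything from the real app.
# Inside server.R: writing into the app's www/ directory makes the file available as static content
save.image(file = 'www/foo.rdata')
# Later, from a plain R console session, the statistician can pull it straight in
load(url('http://www.myhost.com/appname/foo.rdata'))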
Related
For a project I want to create a shinydashboard that is powered by data provided by various people in a SharePoint and has the ability to automatically update once a file is edited in SharePoint.
Because of firewall issues with the company I am working for it seemed like a better idea to sync the SharePoint folder to my local disk, and upload the files from there.
My code looks like this:
observe({
  invalidateLater(2000, session)
  # Assumes readxl, data.table, and tools are loaded elsewhere in the app
  files <- list.files(path = "../../folder/folder", pattern = "\\.xlsx$", recursive = TRUE)
  for (f in files) {
    # name each data.table after the workbook's base file name
    assign(basename(file_path_sans_ext(f)),
           as.data.table(read_xlsx(paste0("../../folder/folder/", f))))
  }
})
By itself the code works well. It grabs all of the documents and assigns them to data.tables with their file names as names. However, whenever I make a change to one of the documents in SharePoint, the file on my local disk won't update until I open it manually. I can see that because the Preview Pane in File Explorer still shows the unedited document.
My question:
Is there a way in R to force a sync, or in OneDrive to sync automatically once the original file is updated? All files combined are only about 250 kB, so even if automatic syncing might be too heavy in most cases, I think it's fine in mine.
I was wondering if there is a way to upload several files via one single file containing their names. E.g. I have a .txt with 3 rows, "File 1.txt", "File 2.txt", and "File 3.txt", and I want to upload those 3 files in that order (the file with the names is in the same folder as the files to be uploaded).
I don't want to use fileInput(..., multiple = TRUE) because when you upload multiple files they are ordered alphabetically by default. With a few files the user can rename them so they upload in the right order, but with a lot of files that becomes an annoying task.
I've thought that I could paste the name onto the last part of the datapath, but when a file is uploaded the datapath points to the local temp folder and not the actual location of the file, as shown in the figure below.
Thanks!
When a file is uploaded to a Shiny web application, it is assigned a new name and stored in a temp folder on the server. All reads are done against this server-side copy. The server has no privileges to access the client's file system and cannot read any files from the client's PC. So there is no obvious way to solve the stated task in the desired way, and I would argue it shouldn't be done even if a way were found, since it opens the gate to unauthorized file-system access.
I guess if the task cannot be solved in any alternative way, you have to consider building a UI for creating an ordered list of files, packing them client-side, and sending them to the server.
If there is any sort of pattern to the files you want to process, you might still be able to use multiple = TRUE within fileInput for server-side processing.
My example below arranges the uploaded files by size, but you could create a custom order on the name attribute as well (assuming you will be using this list as an index for your processing).
library(shiny)
library(magrittr)

ui <- fluidPage(
  fileInput(inputId = "fileinput",
            label = "select",
            multiple = TRUE)
)

server <- function(input, output, session) {
  observeEvent(input$fileinput, {
    # input$fileinput is a data frame with name, size, type and datapath columns
    print(input$fileinput %>% dplyr::arrange(dplyr::desc(size)))
  })
}

shinyApp(ui, server)
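If the required order really does come from a separate listing file, as in the question above, the same idea extends to reordering by that manifest. The sketch below is a hedged variant: the .txt detection and column names are assumptions based on the standard data frame that fileInput provides.
observeEvent(input$fileinput, {
  files <- input$fileinput                      # columns: name, size, type, datapath
  is_manifest <- grepl("\\.txt$", files$name)   # the uploaded listing file
  wanted <- readLines(files$datapath[is_manifest][1])
  data_files <- files[!is_manifest, ]
  data_files <- data_files[match(wanted, data_files$name), ]
  # data_files$datapath is now in the order given by the manifest
})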
Is it possible to clear the Shiny cache history when reloading an app? Specifically, I have a textInput for adding a title that doesn't clear when the app is restarted. In other words, all of the titles entered into the app on previous runs are "remembered" and show up in a dropdown list when the app is rerun or restarted. I have tried using shinyjs::reset, but that clears the current value and not the ones in the "history". I've also tried updateTextInput with an action button set to " ", but that too clears only the current value and not the ones in the "history". I have also tried setting ui as a function using:
ui <- function(req) {
fluidPage(...)
}
as well as:
I found that the cache of an R Shiny server is refreshed when the creation date of the file "app.R" changes.
So, here is the trick I used:
library(lubridate)  # for now(); Sys.time() would also work

server <- function(input, output, session) {
  # Trick: update the file creation date
  onStop(function() {
    # File name
    p <- paste0(getwd(), "/app.R")
    # Update file 'date creation'
    Sys.setFileTime(p, now())
  })  # onStop
}  # server
The idea is to update the date of "app.R" creation after each session.
Neither of these solutions worked for me. Finally, I found this post.
I also found this post:
I have been struggling with this problem for quite a while, and thought I had tried everything, including putting a js button on the shiny sidebar to manually refresh (unfortunately that did not work either). There are two things that did work for me:
Make sure all code that reads data from files is in the code chunk named global (i.e. NOT in a chunk with any other name), OR
Manually restart the Shiny server when new data is uploaded.
Obviously the first one is much more manageable, and a solution that I wish I had known weeks ago when I started playing with workarounds.
But I am not sure what is involved in implementing either, so if someone could clarify that, that would be great.
Any help would be much appreciated.
I am using R to deploy an app over the web, but fetching data from the URL my app reads from is where it spends most of its time. Is it possible to cache that data?
I tried to install the packages memoise, R.cache, and a few others, but they were all unsupported by the server.
I recommend trying the DataCache package by Jason Bryer. The package is available on GitHub, and I used it successfully today for a Shiny app that I am developing.
The main function from that package is data.cache. You will need to define a function that generates a list of the objects you want to cache, and then pass that function as an argument to data.cache. I also recommend setting the cache.name parameter of data.cache if you intend to cache more than one list of objects in your application.
For example:
DataCache::data.cache(
function(){
list(
normal_random_numbers = rnorm(10),
uniform_random_numbers = runif(10)
)
},
cache.name = 'my_random_numbers'
)
The above code creates two objects in the local environment, normal_random_numbers and uniform_random_numbers, as well as caches these to the file system. When you run this code again, the cached copies of these objects will be used rather than being regenerated - unless of course the cache expires. The frequency parameter of data.cache is used to set the expiry of the cache, which is set to daily by default.
If you are running the application under Windows, use this slightly modified version of the package. It addresses a bug that is apparently due to the cache filename being incompatible with the Windows file system.
This article from RStudio is quite exhaustive and walks you through different ways to achieve that (e.g. diskCache, the storr package, or a Redis instance).
The main logic revolves around rendering a cached element and setting up the logic for cache invalidation:
function(input, output) {
  # assign the cached renderer to an output slot (the name "plot" is illustrative)
  output$plot <- renderCachedPlot(
    {
      # Code for a beautiful plot
    },
    # A change in the input or the data frame will invalidate the cache
    cacheKeyExpr = { list(input$n, df()) }
  )
}
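As a usage note, the same article also covers where the cached plots are stored. A rough sketch of switching to an application-level disk cache is below; the directory name is an assumption, and newer shiny releases replace diskCache() with cachem::cache_disk().
# Share a disk-based cache across sessions instead of the default in-memory cache
shinyOptions(cache = diskCache("./myapp-cache"))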
We are building a Shiny app and plan to share the link to shinyapps.io.
We are wondering if there is any way to collect feedback from users - e.g. is there a way to have a text input field and permanently save the inputs for us?
Many thanks!
There is this project: ShinyChat, which can be used as a starting point for a user-feedback collection system.
Link to app: Live Chat
So in theory you need a global reactiveValues() where you store your log.Rds, and then you add the user input to that log file. You may want to use the R package stringr. Example code:
library(stringr)

log <- reactiveValues()  # This has to be outside shinyServer so that all users can see it

shinyServer(function(input, output, session) {

  addFeedBack <- function(file, string) {
    ...
    return(modifiedFile)
  }

  observe({
    log$logfile <- addFeedBack(log$logfile, input$userFeedback)
  })
})
EDIT:
I did some research, and there is actually a really nice article and example on the official Shiny site: Share data. You will encounter some problems if you plan to host your app on shinyapps.io, and the article gives a solution for that.
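For completeness, here is a minimal sketch of the local file-storage pattern that article describes; the "responses" directory and the helper name are my own illustration, and on shinyapps.io you would swap the local write for one of the remote backends the article lists.
# Append each piece of feedback as its own small text file (assumes a "responses" directory exists)
saveFeedback <- function(text) {
  fileName <- sprintf("%s_%s.txt", as.integer(Sys.time()), digest::digest(text))
  writeLines(text, file.path("responses", fileName))
}
# e.g. inside the server: observeEvent(input$submitFeedback, saveFeedback(input$userFeedback))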