R Shiny: load workspace

I am learning Shiny and need some help please.
I need to load a very large data.frame from a saved workspace (.RData). Once it is loaded I need to perform analyses and output the results to the UI.
I have placed the following code in server.R, but it doesn't load the data and throws an error:
shinyServer(function(input, output) {
  load("c:/temp/ws1.RData")
  output$balance_matrix <- renderTable({
    Transaction_history
  })
})
> Error in func() : object 'Transaction_history' not found
First, what am I doing wrong here?
Secondly, is this the best place to load the workspace?
Thirdly, can I load it outside the function at the top or will it not be available in the function then?
Thanks

Rather than saving and loading the whole workspace, I'd recommend saving the object individually, like
saveRDS(Transaction_history, "C:/temp/ws1.RData")
and then loading it in Shiny like
Transaction_history <- readRDS("C:/temp/ws1.RData")
This approach deals with just that one object. From the ?load documentation:
load() replaces all existing objects with the same names in the
current environment (typically your workspace, .GlobalEnv) and hence
potentially overwrites important data.
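Putting that together with the original server.R, a minimal sketch (the path mirrors the question and the answer above; reading the file above the server function means it happens once when the app starts, and the object is then visible inside the server through normal R scoping):
library(shiny)

# Read the object once at app start-up; readRDS() returns the object,
# so you assign it to whatever name you like.
Transaction_history <- readRDS("C:/temp/ws1.RData")

shinyServer(function(input, output) {
  output$balance_matrix <- renderTable({
    Transaction_history
  })
})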

Related

R Shiny Dashboard - Loading Scripts using source('file.R')

Introduction
I have created an R shiny dashboard app that is quickly getting quite complex. I have over 1300 lines of code all sitting in app.R and it works. I'm using RStudio.
My application has a sidebar and tabs, and rather than using modules I dynamically grab the sidebar and tab IDs to generate a unique identifier when plotting graphs etc.
I'm trying to reorganise it to be more manageable and split it into tasks for other programmers, but I'm running into errors.
Working Code
My original code has a number of library statements and sets the working directory to the code location.
rm(list = ls())
setwd(dirname(rstudioapi::getActiveDocumentContext()$path))
getwd()
I then have a range of functions that sit outside the ui/server functions, so they are loaded only once (not reactive). These are all called from within the server by setting reactive values and calling the functions from within something like a renderPlot. Some of them are nested, so a function in server calls a function defined at the top level of app.R, which in turn calls another one. E.g.
# Start of month calculation
som <- function(x) {
  toReturn <- as.Date(format(x, "%Y-%m-01"))
  return(toReturn)
}
start_fc <- function() {
  fc_start_date <- som(today())
  return(fc_start_date)
}
Then in server I have something like this (code incomplete):
server <- function(input, output, session) {
  RV <- reactiveValues()
  observe({
    RV$selection <- input[[input$sidebar]]
    # cat("Selected:",RV$selection,"\r")
  })
  .......
  cat(paste0("modelType: ", input[[paste0(RV$selection, "-modeltype")]], " \n"))
  vline1 <- decimal_date(start_pred(input[[paste0(RV$selection, "-modeltype")]], input[[paste0(RV$selection, "-modelrange")]][1]))
  vline2 <- decimal_date(start_fc())
  .......
Problem Code
So now, when I take all my functions and put them into different .R files, I get errors indicating the functions haven't been loaded. If I load the source files by highlighting them and running them with Alt-Enter, so they are loaded into memory, and then click Run App, the code works. But if I rely on Run App to load those source files, and the functions within them, the functions can't be found.
source('./functionsGeneral.R')
source('./functionsQuote.R')
source('./functionsNewBusiness.R')
source('./ui.R')
source('./server.R')
shinyApp(ui, server)
where ui.R is
source('./header.R')
source('./sidebar.R')
source('./body.R')
source('./functionsUI.R')
ui <- dashboardPage(
  header,
  sidebar,
  body
)
Finally, the questions
In what order does R Shiny Dashboard run the code? Why does it fail when I put the exact same inline code into another file and reference it with source('./functions.R')? Does it not load into memory during a shiny app session? What am I missing?
Any help on this would be greatly appreciated.
Thanks,
Travis
OK, I've discovered that the easiest way is to create a subfolder called R and place the preload code in that folder. From shiny version 1.5, all code in the R folder is loaded automatically before the app starts.
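For reference, a sketch of the layout that implies (the helper file names are taken from the question; what remains in app.R is an assumption):
# myApp/
#   app.R                    # ui, server and shinyApp(ui, server)
#   R/
#     functionsGeneral.R     # sourced automatically by shiny >= 1.5.0
#     functionsQuote.R
#     functionsNewBusiness.R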

R Shiny - dataset loaded in a first chunk doesn't exist in a second chunk...?

I have a strange error in a shiny app I built with the learnr library: an "object not found" error about an object I have just loaded and visualized (so the object exists, no?).
Although I don't have a reproducible example, maybe some of you will understand what is causing the error:
I have a first chunk {r load} that loads a dataset. There is no error here; I can even visualize the dataset (screenshot below).
Then I have a second chunk where I would like to manipulate the dataset. But it tells me the dataset doesn't exist! How is that possible, when I visualized it just one chunk before?
I don't understand how a dataset can exist in one chunk and not in another. Does it mean the dataset isn't loaded in the global environment? Is it a problem with the learnr library?
Maybe someone will have an idea, or something I could test. Thank you in advance.
EDIT:
The problem is about the environment/workspace. In the first chunk, even though I load the dataset, it is not stored in the environment. I ran ls() in a second chunk, and it tells me there is no object in the workspace. The loaded dataset is not there, and I don't know why...
In my opinion, shiny doesn't store any data. You have to pass it manually from one chunk to the other, as follows (only the code snippet from server is shown):
library(shiny)
library(plotly)

server <- function(input, output, session) {
  render_value <- function(NN) {
    # Here your loaded data is available
    head(NN)
    # You can consider this as chunk 2
  }
  output$heat <- renderPlotly({
    Name <- c("John", "Bob", "Jack")
    Number <- c(3, 3, 5)
    Count <- c(2, 2, 1)
    NN <- data.frame(Name, Number, Count)
    render_value(NN) # You need the function, otherwise data.frame NN is not visible
    # You can consider this as chunk 1
  })
}
shinyApp(ui, server)
You can find the full code here: Subset a dataframe based on plotly click event
OR
Create a global.R file as suggested here and follow this URL: R Shiny - create global data frame at start of app
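For the global.R route, a minimal sketch of what that can look like (the file path and object name here are invented for illustration); objects created in global.R are visible to both the UI and the server:
## global.R
my_dataset <- readRDS("data/my_dataset.rds")   # hypothetical path

## server.R can then use the object directly, e.g.
## output$table <- renderTable(head(my_dataset))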

reactiveFileReader in Shiny for .RData

My current workflow in a shiny application is to run an R script periodically as a cron job to pull various tables from multiple databases as well as download data from some APIs. These are then saved as a .Rdata file in a folder called data.
In my global.R file I load the data by using load("data/workingdata.Rdata"). This results in all the data frames (about 30) loading into the environment. I know I can use the reactiveFileReader() function to refresh the data, but it would obviously have to be used in the server.R file, because the function needs an associated session. Also, I am not sure if load is accepted as a readFunc in reactiveFileReader(). What would be the best strategy for this scenario?
This example uses a reactiveVal object with observe and invalidateLater. The data is loaded into a new environment and assigned to the reactiveVal every 2 seconds.
library(shiny)
ui <- fluidPage(
  actionButton("generate", "Click to generate an Rdata file"),
  tableOutput("table")
)
server <- shinyServer(function(input, output, session) {
  ## Use reactiveVal with observe/invalidateLater to load Rdata
  data <- reactiveVal(value = NULL)
  observe({
    invalidateLater(2000, session)
    n <- new.env()
    print("load data")
    env <- load("workingdata.Rdata", envir = n)
    data(n[[names(n)]])
  })
  ## Click the button to generate a new random data frame and write to file
  observeEvent(input$generate, {
    sample_dataframe <- iris[sample(1:nrow(iris), 10, F), ]
    save(sample_dataframe, file = "workingdata.Rdata")
    rm(sample_dataframe)
  })
  ## Table output
  output$table <- renderTable({
    req(data())
    data()
  })
})
shinyApp(ui = ui, server = server)
A few thoughts on your workflow:
In the end, with your RData approach you are setting up another data source in parallel to your databases / APIs.
When working with files there is always some housekeeping overhead (e.g. is your .RData file complete at the moment you read it?). In my eyes this is (partly) what a DBMS is made for – taking care of the housekeeping. Most of them have sophisticated solutions to ensure that you get what you query very fast; so why reinvent the wheel?
Instead of continuously creating your .RData files and polling data with the reactiveFileReader() function, you could query the DB for changes directly using reactivePoll (see this for an example using sqlite; a rough sketch also follows these points). If your queries are long-running (which I guess is the reason for your workflow), you can wrap them in a future and run them asynchronously (see this post to get some inspiration).
Alternatively, many DBMS provide something like materialized views to avoid long waiting times (assuming the user has the required privileges).
Of course, all of this is based on assumptions, since your ecosystem isn't known to me, but in my experience reducing interfaces means reducing sources of error.
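To illustrate the reactivePoll() idea, a rough sketch (the SQLite file, table and column names are invented for this example, not taken from the post):
library(shiny)
library(DBI)

ui <- fluidPage(tableOutput("table"))

server <- function(input, output, session) {
  con <- DBI::dbConnect(RSQLite::SQLite(), "my_database.sqlite")  # hypothetical database
  session$onSessionEnded(function() DBI::dbDisconnect(con))

  data <- reactivePoll(
    10000, session,
    # checkFunc: a cheap query; whenever its result changes, valueFunc is re-run
    checkFunc = function() DBI::dbGetQuery(con, "SELECT MAX(updated_at) FROM transactions"),
    # valueFunc: the full (possibly expensive) query
    valueFunc = function() DBI::dbGetQuery(con, "SELECT * FROM transactions")
  )

  output$table <- renderTable(data())
}

shinyApp(ui, server)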
You could use load("data/workingdata.Rdata") at the top of server.R. Then, anytime anyone starts a new Shiny session, the data would be the most recent. The possible downsides are that:
there could be a hiccup if the data is being written at the same time a new Shiny session is loading data.
data will be stale if a session is open just before and then after new data is available.
I imagine the first possible problem wouldn't arise often enough to be a problem. The second possible problem is more likely to occur, but unless you are in a super critical situation, I can't see it being a substantial enough problem to worry about.
Does that work for you?
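One way to read "at the top of server.R" is to put the load() call at the top of the server function, so it re-runs for each new session, which is what the per-session freshness described above relies on. A minimal sketch of that reading (the frame name is a placeholder for one of the ~30 objects in the file):
## server.R
shinyServer(function(input, output, session) {
  # Runs for every new session, so each visitor gets the data
  # as of the moment their session started
  load("data/workingdata.Rdata")
  output$table <- renderTable({
    head(some_loaded_frame)   # placeholder name, not from the original post
  })
})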

Passing variables from Shiny to a sourced file via the environment

I'm trying to build a small shiny app that will call a sourced file once an actionButton is pressed. The actionButton observer will capture input$topic and input$num from the ui.R and then call source("downloadTweets.R"), a file that needs the topic and num variables defined in the environment to work properly.
# Entry shiny server function
shinyServer(function(input, output) {
  observeEvent(input$searchButton, {
    topic <- as.character(input$hashtagClass)
    num <- as.numeric(input$numTweetsClass)
    source("downloadTweets_Topic.R")
  })
})
When I try to run it, I get an error saying that the topic value was not found once the source("downloadTweets_Topic.R") call is made. I'm fairly new to Shiny; I have read the scoping documentation and used the reactive() function, but I'm afraid I don't really understand how it works. Is there a way to do this, or should I reimplement the .R file so I can pass these values to a function?
The reason I'm doing it like this is just merely code reusal from a different project in R Studio which is not a Shiny app.
Looks like the input$hashtagClass is missing. Throw a browser() line above that line but inside the observe expression. This'll drop you into a breakpoint when the app is run and this code is triggered. You can likely solve the issue with a req call. Look it up with ?req.
#pork chop's suggestion to add local = TRUE to the source() call is also important. By default source() evaluates the script in the global environment; with local = TRUE it is evaluated in the calling environment (here, the observer), so the script can see the topic and num variables defined there.
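A minimal sketch of the observer with that change (downloadTweets_Topic.R is the asker's script; its contents are unknown):
shinyServer(function(input, output) {
  observeEvent(input$searchButton, {
    topic <- as.character(input$hashtagClass)
    num <- as.numeric(input$numTweetsClass)
    # local = TRUE evaluates the script in this observer's environment,
    # so it can see the topic and num variables defined above
    source("downloadTweets_Topic.R", local = TRUE)
  })
})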

Reading an RData file into Shiny Application

I am working on a shiny app that will read a few RData files in and show tables with their contents. These files are generated by scripts that eventually turn the data into a data frame. They are then saved using the save() function.
Within the shiny application I have three files:
ui.R, server.R, and global.R
I want the files to be read on an interval so the app updates when the files are updated, thus I am using:
reactiveFileReader()
I have followed a few of the instructions I have found online, but I keep getting an error "Error: missing value where TRUE/FALSE is needed". I have tried to simplify this so I am not using:
reactiveFileReader()
functionality and simply loading the file in the server.R (also tried in the global.R file). Again, the
load()
statement is reading in a data frame. I had this working at one point by loading the file, assigning it to a variable and doing an "as.data.table", but that shouldn't matter; this should read in a data frame just fine. I think this is a scoping issue, but I am not sure. Any help? My code is at:
http://pastebin.com/V01Uw0se
Thanks so much!
Here is a possible solution inspired by this post: http://www.r-bloggers.com/safe-loading-of-rdata-files/. The Rdata file is loaded into a new environment, which ensures that it will not have unexpected side effects (overwriting existing variables, etc.). When you click the button, a new random data frame is generated and then saved to a file. The reactiveFileReader then reads the file into a new environment. Lastly, we access the first item in the new environment (assuming that the Rdata file contains only one object, which is a data frame) and render it as a table.
library(shiny)

# This function, borrowed from http://www.r-bloggers.com/safe-loading-of-rdata-files/,
# loads the Rdata into a new environment to avoid side effects
LoadToEnvironment <- function(RData, env = new.env()) {
  load(RData, env)
  return(env)
}

ui <- shinyUI(fluidPage(
  titlePanel("Example"),
  sidebarLayout(
    sidebarPanel(
      actionButton("generate", "Click to generate an Rdata file")
    ),
    mainPanel(
      tableOutput("table")
    )
  )
))

server <- shinyServer(function(input, output, session) {
  # Click the button to generate a new random data frame and write to file
  observeEvent(input$generate, {
    sample_dataframe <- data.frame(a = runif(10), b = rnorm(10))
    save(sample_dataframe, file = "test.Rdata")
    rm(sample_dataframe)
  })
  output$table <- renderTable({
    # Use a reactiveFileReader to read the file on change, and load the content into a new environment
    env <- reactiveFileReader(1000, session, "test.Rdata", LoadToEnvironment)
    # Access the first item in the new environment, assuming that the Rdata
    # contains only one item, which is a data frame
    env()[[names(env())[1]]]
  })
})

shinyApp(ui = ui, server = server)
OK - I figured out how to do what I needed. For my first issue, I wanted the look and feel of renderDataTable, but I wanted to pull in a data frame (renderDataTable / dataTableOutput does not allow this; it must be in a table format). To do this, I found a handy usage of ReportingTools (from Bioconductor) and how they do it. This allows you to use a data frame directly and still have the HTML table with the sorting, search, pagination, etc. The info can be found here:
https://bioconductor.org/packages/release/bioc/html/ReportingTools.html
Now, for my second issue - updating the data and table regularly without restarting the app. This turned out to be simple; it just took me some time to figure out, being new to Shiny. One thing to point out: to keep this example simple, I used renderTable rather than the ReportingTools solution above. The first thing I did was wrap all of my server.R code (within the shinyServer() function) in an observe({}). Then I used invalidateLater() to tell it to refresh every 5 seconds. Here is the code:
## server.R ##
library(shiny)
library(shinydashboard)
library(DT)

shinyServer(function(input, output, session) {
  observe({
    invalidateLater(5000, session)
    output$PRI1LastPeriodTable <- renderTable({
      prioirtyOneIncidentsLastPeriod <- updateILP()
    })
  })
})
Now, originally for the renderTable() portion I was just referring to the object name from the loaded .Rdata file, but I wanted the file to be read each time, so I created a function in my global.R file (this could have been in server.R) to load the file. That code is here:
updateILP <- function() {
  load(file = "W:/Projects/R/Scripts/ITPOD/itpod/data/prioirtyOneIncidentsLastPeriod.RData", envir = .GlobalEnv)
  return(prioirtyOneIncidentsLastPeriod)
}
That's it - nothing else goes in the global.R file. Your ui.R would be however you have it set up: call tableOutput, dataTableOutput, or whatever your rendering method is in the UI. So what happens is that every 5 seconds the renderTable() code is re-run, which in turn invokes the function that actually reads the file. I tested this by making changes to the data file, and the shiny app updated without any interaction from me. Works like a charm.
If this is inelegant or inefficient, please let me know how it can be improved; this was the most straightforward way I could figure out. Thanks to everyone for the help and comments!
