Read a JSON file continuously - R

I want to read a JSON file continuously, e.g. every 1000 ms.
One option may be reactiveFileReader,
reactiveFileReader(intervalMillis, session, filePath, readFunc, ...)
described here.
This function seems to work only with CSV files and not with JSON files:
file_data <- reactiveFileReader(intervalMillis = 1000, NULL, filePath = json_path, readFunc = read.json)
observe({
  View(file_data())
})
Error in View : object read.json not found
With reactivePoll like here:
getJsonData <- reactivePoll(1000, session,
  checkFunc = function() {
    if (file.exists(path))
      file.info(path)$mtime[1]
    else
      ""
  },
  valueFunc = function() {
    read_json(path)
  }
)
I get nearly what I want, but this function is not working in my context. How do I force the program to read the file every second and not only when the content of the file changes?
Are there other possibilities I have not thought of yet?

In your first way, you wrote read.json instead of read_json (the jsonlite function).
With your second way, you could replace file.info(path)$mtime[1] with runif(1, 0, 1e6): since the check value then changes on virtually every poll, valueFunc re-runs every second. You would be very unlucky if runif returned the same number twice in a row.
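For illustration, the modified reactivePoll could look like this (a minimal sketch, assuming path and session are defined in your server function):
getJsonData <- reactivePoll(1000, session,
  checkFunc = function() {
    # a fresh random number on every poll, so the check value (almost) always "changes"
    runif(1, 0, 1e6)
  },
  valueFunc = function() {
    # consequently the file is re-read roughly every 1000 ms
    jsonlite::read_json(path)
  }
)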
Finally, a third way could be:
server <- function(input, output, session){
  autoInvalidate <- reactiveTimer(1000)
  getJsonData <- reactive({
    autoInvalidate()
    read_json("path/to/file.json")
  })
}
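Here reactiveTimer(1000) invalidates getJsonData every 1000 ms, so as long as something in the app reads getJsonData(), the file is re-read every second regardless of whether its content has changed.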

Here is a reprex showing how to use reactiveFileReader with a JSON file.
I used a future to detach the writing process from the Shiny session; you can simply replace this with your own JSON input.
library(shiny)
library(jsonlite)
library(datasets)
library(promises)
library(future)
plan(multisession(workers = 2))
ui <- fluidPage(
  uiOutput("printResult")
)
server <- function(input, output, session) {
  json_path <- tempfile(fileext = ".json")
  write_json(NULL, json_path)
  # async file writing process
  future({
    for(i in seq_len(nrow(iris))){
      Sys.sleep(1)
      write_json(iris[i,], json_path)
    }
  })
  file_data <- reactiveFileReader(intervalMillis = 1000, NULL, filePath = json_path, readFunc = read_json)
  output$printResult <- renderUI({
    req(file_data())
  })
}
shinyApp(ui, server)
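Note that reactiveFileReader polls the file's modification time every intervalMillis and re-runs readFunc only when that time changes; if you need an unconditional re-read every second, use the reactiveTimer or runif-based reactivePoll approaches from the answer above.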

Related

How can I update data from SQL in R?

I have a sample database in SQL and an R Shiny app. I have a connection to the database and can retrieve the data.
I cannot get the Shiny app to update when new data is added to the database. I can see how this works with CSV files, but I am not able to find anything similar for SQL.
This is my code:
library(RODBC)
library(shiny)
dbCon <- odbcConnect("SQL")
df <- sqlFetch(dbCon, "Test")
odbcClose(dbCon)
page_1 <- tabPanel(
  tableOutput('table')
)
ui <- navbarPage(
  page_1
)
server <- function(input, output, session) {
  output$table <- renderTable('table')
  myFile <- Q1
  data <- reactivePoll(1000, session,
    # Returns the time the file was last modified (read that to be SAVED)
    checkFunc = function() {
      if (file.exists(myFile))
        file.info(myFile)$mtime[1]
      else
        shinyalert(title = "file", text = "There is no such file")
    },
    # Get file content
    valueFunc = function() {
      dbCon <- odbcConnect("SQL")
      df <- sqlFetch(dbCon, "Test")
      odbcClose(dbCon)
      output$table <- renderTable('table')
    }
  )
}
shinyApp(ui = ui, server = server)
I believe I have two problems:
Q1. What should the path for 'myFile' be?
Q2. How should I write the code in the checkFunc function to see if the data has been updated?
Thanks
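One possible direction (a rough sketch, not a drop-in solution): with a database there is no file on disk to watch, so checkFunc can instead query something cheap, such as a row count, and valueFunc should fetch and return the data rather than call renderTable. Here the "SQL" DSN and the Test table come from your code, while the COUNT(*) query stands in for whatever inexpensive "has it changed" check fits your schema:
library(RODBC)
library(shiny)

ui <- navbarPage("Demo",
  tabPanel("Data", tableOutput("table"))
)

server <- function(input, output, session) {
  data <- reactivePoll(1000, session,
    # cheap check: valueFunc re-runs only when the row count changes
    checkFunc = function() {
      dbCon <- odbcConnect("SQL")
      on.exit(odbcClose(dbCon))
      sqlQuery(dbCon, "SELECT COUNT(*) FROM Test")[1, 1]
    },
    # expensive value: fetch and return the whole table
    valueFunc = function() {
      dbCon <- odbcConnect("SQL")
      on.exit(odbcClose(dbCon))
      sqlFetch(dbCon, "Test")
    }
  )
  output$table <- renderTable(data())
}

shinyApp(ui, server)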

R Shiny: How to call an API in the background? [duplicate]

I am trying to run a GET request inside a Shiny app, but I don't want to wait for the response, as it would take quite a long time to process and I don't really need the response inside the app; a status code would be nice, but it is not obligatory.
Or is there maybe a function that sends an async request, like wrapping the whole GET inside a future/promise?
Currently I have this observeEvent in my shinyApp:
observeEvent(input$import, {
  httr::GET(url = "https://someurl/that/takes/a/long/time")
})
Is curl_fetch_multi from the curl package suited for that task?
Here is a way to run GET asynchronously and in an intra-session non-blocking manner (the observer returns nothing):
library(shiny)
library(future)
library(promises)
library(future.callr)
library(httr)
plan(callr)
queryGoogle <- function(queryString) {
  myResponse <- httr::GET("http://google.com/", path = "search", query = list(q = queryString))
  return(myResponse)
}
ui <- fluidPage(
  br(),
  textOutput("time_output"),
  br(),
  textInput(inputId = "query_input", label = NULL, value = "", placeholder = "Search google..."),
  actionButton("import", "Query"),
  hr(),
  textOutput("query_output")
)
server <- function(input, output, session) {
  futureData <- reactiveValues(response = NULL)
  observeEvent(input$import, {
    myFuture <- future({
      queryGoogle(isolate(input$query_input))
    })
    then(
      myFuture,
      onFulfilled = function(value) {
        futureData$response <- value
      },
      onRejected = NULL
    )
    return(NULL)
  })
  output$query_output <- renderPrint({
    req(futureData$response)
  })
  time <- reactive({
    invalidateLater(500, session)
    Sys.time()
  })
  output$time_output <- renderText({ paste("Something running in parallel:", time()) })
}
shinyApp(ui, server)
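The return(NULL) at the end of the observeEvent is what makes this non-blocking within the session: if the observer returned the promise chain itself, Shiny would hold up that session's other reactivity until the promise resolved.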
This is a slight modification of my answer here.
Please also read Joe Cheng's related answer here carefully.

Prevent reading a file multiple times from a dynamic fileInput

I've created a dynamic fileInput in Shiny using lapply. When I want to read the files, I've also used lapply in an observer.
The problem with using lapply here is that the observer is triggered every time I upload a new file and thus reads all files again and again whenever a new file is uploaded.
Here I provide a Hello World app. The lapply function depends on an input parameter, which I abstracted away for simplicity.
library(shiny)
ui <- fluidPage(
  titlePanel("Hello World"),
  sidebarLayout(
    sidebarPanel(),
    mainPanel(
      lapply(1:2, function(i) {
        fileInput(
          paste0("file", i),
          label = NULL,
          multiple = F,
          accept = c(
            "text/csv",
            "text/comma-separated-values,text/plain",
            ".csv"
          ),
          buttonLabel = paste("File", i)
        )
      }),
      verbatimTextOutput("list")
    )
  )
)
server <- function(input, output) {
  r <- reactiveValues()
  observe({
    lapply(1:2, function(i) {
      file <- input[[paste0("file", i)]]
      if(is.null(file)) return()
      isolate({
        r$file[[paste(i)]] <- readr::read_csv2(file = file$datapath)
      })
    })
  })
  output$list <- renderPrint(reactiveValuesToList(r))
}
shinyApp(ui = ui, server = server)
How to replace the loop or add a requirement to lapply?
While I started down the road of cache-invalidation in the comments, I think something else may work better for you since you have a fixed number of fileInput fields: swap the lapply and observe lines in your code (plus a couple of other tweaks).
server <- function(input, output) {
  lapply(paste0("file", 1:2), function(nm) {
    observeEvent(input[[ nm ]], {
      req(input[[nm]], file.exists(input[[nm]]$datapath))
      readr::read_csv2(file = input[[nm]]$datapath)
    })
  })
}
Explanation:
I'm creating a list of reactive blocks instead of a reactive block operating on a list. This means "file1" won't react to "file2".
I took a shortcut in defining the input names by putting paste0(...) in the data of the lapply instead of in the function, though it'd be just as easy to do
lapply(1:2, function(i) {
  nm <- paste0("file", i)
  # ...
})
It's important to have nm defined outside of the observeEvent, and it has to do with delayed evaluation and namespace search order. I fell prey to this a few years ago and was straightened out by Joe Cheng: you can't use a for loop, it must be some environment-preserving operation like this.
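Here is a minimal, non-Shiny illustration of that environment point: closures created in a for loop all share the single loop variable, while each lapply call gets its own nm in its own environment.
fs <- list()
for (nm in c("file1", "file2")) {
  fs[[nm]] <- function() nm      # every closure captures the same `nm`
}
fs[["file1"]]()                  # "file2" -- only the last loop value survives

fs2 <- lapply(c("file1", "file2"), function(nm) {
  function() nm                  # each call has its own `nm` in its own environment
})
fs2[[1]]()                       # "file1"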
N.B.: this is a stub of code, and it is far from complete: having an observe or observeEvent read the data and then discard it is wrong ... it's missing something. Ideally, this should really be a reactive or eventReactive block, or the processed data should be stored in a reactiveValues or reactiveVal. For example:
server <- function(input, output) {
  mydata <- lapply(paste0("file", 1:2), function(nm) {
    observeEvent(input[[ nm ]], {
      req(input[[nm]], file.exists(input[[nm]]$datapath))
      readr::read_csv2(file = input[[nm]]$datapath)
    })
  })
  observe({
    # the following are identical, the latter more declarative
    mydata[[1]]
    mydata[["file1"]]
  })
}
(And another note about defensive programming: you cannot control perfectly how readr::read_csv2 reacts to that file ... it may error out for some reason. One further step would be to wrap it in tryCatch(..., error = function(e) { errfun(e); NULL; }), where errfun(e) does something meaningful with the error message (logs it and/or gives it to the user in a modal popup) and then returns NULL, so that reactive blocks downstream can use req(mydata[[1]]) and will not try to process the NULL.)
server <- function(input, output) {
  mydata <- lapply(paste0("file", 1:2), function(nm) {
    observeEvent(input[[ nm ]], {
      req(input[[nm]])
      file <- input[[nm]]
      tryCatch(
        readr::read_csv2(file = input[[nm]]$datapath),
        error = function(e) { errfun(e); NULL; })
    })
  })
  observe({
    # the following are identical, the latter more declarative
    mydata[[1]]
    mydata[["file1"]]
  })
}

Save a file via the sink function on a Shiny server?

I have a Shiny app that generates a .txt file to download.
In addition, I would like to keep a copy of the file that users generate on my Shiny server.
The server function looks like:
server <- function(input, output, session){
  data_gen <- reactive({
    d1 <- data.frame(...)
    d2 <- data.frame(...)
    result <- list(d1 = d1, d2 = d2)
    return(result)
  })
  create_file <- reactive({
    sink("/srv/shiny-server/S3/file.txt", append = TRUE)
    print(data_gen()$d1)
    print(data_gen()$d2)
    sink()
  })
  output$downloadData <- downloadHandler(
    filename = function() {"input.txt"},
    content = function(file) {
      sink(file, append = TRUE)
      print(data_gen()$d1)
      print(data_gen()$d2)
      sink()
    }
  )
}
I'm able to download the data, but the app never runs the create_file reactive and does not write a copy to the Shiny server.
Any idea how I could fix this?
Your create_file function is a reactive expression. Reactive expressions only evaluate when 1) their output is required and 2) their inputs have changed; neither appears to apply here.
What you could do is move the contents of create_file inside your downloadHandler: content must be a function that writes the file to the path it is given, but that function can do other things first. So try the following:
server <- function(input, output, session){
  data_gen <- reactive({
    d1 <- data.frame(...)
    d2 <- data.frame(...)
    result <- list(d1 = d1, d2 = d2)
    return(result)
  })
  output$downloadData <- downloadHandler(
    filename = function() {"input.txt"},
    content = function(file) {
      # save non-user copy
      sink("/srv/shiny-server/S3/file.txt", append = TRUE)
      print(data_gen()$d1)
      print(data_gen()$d2)
      sink()
      # copy to be returned for user
      sink(file, append = TRUE)
      print(data_gen()$d1)
      print(data_gen()$d2)
      sink()
    })
}

downloadHandler: Set reactiveValues inside content function

Is it possible to set reactiveValues inside the content part of the downloadHandler? I tried it and don't understand the behavior.
A simple example could be a counter showing how often the download button has been clicked:
library(shiny)
ui <- fluidPage(
  downloadButton("downloadData", "Download"),
  textOutput("nDownloads"),
  actionButton("trig", "get number")
)
server <- function(input, output) {
  # Our dataset
  data <- mtcars
  r.nDownloads <- reactiveValues(n = 0)
  output$nDownloads <- renderText({
    input$trig
    paste("number of downloads:", r.nDownloads$n)
  })
  output$downloadData <- downloadHandler(
    filename = function() {
      paste("data-", Sys.Date(), ".csv", sep = "")
    },
    content = function(file) {
      r.nDownloads$n <- r.nDownloads$n + 1
      write.csv(data, file)
    }
  )
}
shinyApp(ui, server)
If the download button is clicked, the textOutput is grayed out but not updated. I added an action button as a trigger to force the renderText to update. Surprisingly (at least to me), that works: the correct number is shown.
So, somehow the reactiveValue is changed by the downloadHandler, but its dependencies are only invalidated, not updated.
Of course, the proper way to do it would be to make the "data" object reactive and do the counting there. But I'm curious how the described behavior can be explained.
EDIT:
OK, now I get really confused: I tried what I mentioned above, making "data" reactive and doing the counting there. This could not be simple counting of downloads anymore, because the data reactive gets recalculated only when it is invalidated.
Here is an example with an additional input for the invalidation of "data":
library(shiny)
ui <- fluidPage(
  numericInput("nRows", label = "nRows", min = 1, max = 32, value = 15),
  downloadButton("downloadData", "Download"),
  textOutput("nDownloads")
)
server <- function(input, output) {
  r.nDownloads <- reactiveValues(n = 0)
  # Our dataset
  data <- reactive({
    isolate({
      r.nDownloads$n <- r.nDownloads$n + 1
    })
    mtcars[1:input$nRows,]
  })
  output$nDownloads <- renderText({
    paste("number of downloads:", r.nDownloads$n)
  })
  output$downloadData <- downloadHandler(
    filename = function() {
      paste("data-", Sys.Date(), ".csv", sep = "")
    },
    content = function(file) {
      write.csv(data(), file)
    }
  )
}
shinyApp(ui, server)
But still, I see similar behavior: clicking the download button grays the text out, and changing "nRows" makes the expected number of downloads (which is now downloads after a change in nRows ;-)) show up.
Now this becomes an actual problem for me: in my real app, a rather complex Excel file can be downloaded. While preparing and formatting the Excel file, events can occur that should lead to some reaction in the app; that's why the download should trigger something. The alternative I can see is to prepare the Excel file before the user clicks download, which I would like to avoid because this can take a few seconds depending on the complexity of the file/formatting.
Am I missing something obvious? If not, I'd appreciate any ideas on how the download event can trigger something in the rest of the app.
The solution is to remove the isolate() around the reactiveValues update, as the isolation prevents the counter from being updated until the numericInput changes. This is because data() depends on input$nRows.
library(shiny)
ui <- fluidPage(
  numericInput("nRows", label = "nRows", min = 1, max = 32, value = 15),
  downloadButton("downloadData", "Download"),
  textOutput("nDownloads")
)
server <- function(input, output) {
  r.nDownloads <- reactiveValues(n = 0)
  # Our dataset
  data <- reactive({
    r.nDownloads$n <- r.nDownloads$n + 1
    mtcars[1:input$nRows,]
  })
  output$nDownloads <- renderText({
    paste("number of downloads:", r.nDownloads$n)
  })
  output$downloadData <- downloadHandler(
    filename = function() {
      paste("data-", Sys.Date(), ".csv", sep = "")
    },
    content = function(file) {
      write.csv(data(), file)
    }
  )
}
shinyApp(ui, server)
With regard to the deeper problem, it would be inefficient to constantly prepare a complex Excel file if there is no guarantee that the user will download it. What you can try instead is:
1. Keep your data in a reactive (e.g. data()).
2. Write a function that preps your data for downloading (e.g. prepExcel(data)) and returns the prepped data.
3. Pass (1) and (2) into the content of the downloadHandler(), like this: write_xx(prepExcel(data())), or pipe the data into the write_xx function like this: data() %>% prepExcel() %>% write_xx(), where xx stands for the function used to output your final file, e.g. write_xlsx or write_csv (see the sketch below).
I hope this helps.
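A minimal sketch of step 3 (prepExcel() is a hypothetical helper standing in for your own formatting code, and writexl::write_xlsx is just one possible writer):
output$downloadData <- downloadHandler(
  filename = function() paste0("data-", Sys.Date(), ".xlsx"),
  content = function(file) {
    # the (possibly slow) prep work only runs when the user actually downloads
    prepped <- prepExcel(data())
    writexl::write_xlsx(prepped, path = file)
  }
)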
