I'm facing a problem with R Shiny and SQLite. My app is supposed to authenticate the user and load his/her preferences.
Here is my code:
server.R
library(shiny)
library(shinyBS)
library(dplyr)
library(lubridate)
library(DBI)
library(RSQLite)
############################
# Database functions       #
############################
# Connect the user to the app database
connect <- function(userName, pwd){
  # Put a generic path here
  db <- dbConnect(SQLite(), dbname = "my_path/database.db")
  # Query to get the correct password
  qry = paste('SELECT password from USERS where name = "', userName, '"')
  res = dbGetQuery(db, qry)
  ifelse(res == pwd, "connected", "unable to connect to the database")
  dbDisconnect(db)
}
function(input, output, session) {
  observeEvent(input$connectButton, {
    userName = renderPrint(input$username)
    print(userName)
    userPwd = paste(input$password)
    connect(user = userName, pwd = userPwd)
  })
}
ui.R
shinyUI(fluidPage(
  titlePanel("Authentification"),
  textInput('username', label = "User name"),
  textInput('password', label = "password"),
  actionButton("connectButton", label = 'Connect'),
  actionButton("subscribeButton", label = 'Subscribe')
))
app.R
library(shiny)
library(shinyBS)
####### UI
ui <- source("ui.R")
####### Server
varserver <- source("server.R")
####### APP
shinyApp(ui = ui, server = varserver)
My problem is when I want to put the content of the TextInput into the queries. I've tried several methods.
With the current version, renderPrint(input$username) returns something that seems to be a function, but it isn't actually useful.
I also tried another way, using only
userName=paste(input$userName)
This returns the content of the text field, but when I integrate it into the query it produces
[1] "SELECT password from USERS where name = \" test \""
and then I get the error
Warning: Error in matrix: length of 'dimnames' [2] not equal to array extent
My objective is to have a query like this:
Select password FROM USERS where name = 'username'
with username representing the content of the TextInput.
EDIT
I now use this version for the query, and it produces a syntactically correct query:
qry = paste0('SELECT password from USERS where name = \'',userName,'\'')
res= dbGetQuery(db,qry )
but now I face this problem:
Warning: Error in matrix: length of 'dimnames' [2] not equal to array extent
when I run the method
connect(db,qry)
I think the problem comes from the way I get the content of the TextInput: I use
function(input, output, session) {
  observeEvent(input$connectButton, {
    userName = paste0(input$username)
    userPwd = paste0(input$password)
    connect(user = userName, pwd = userPwd)
  })
}
What do you think about this?
I found a solution which works:
connect <- function(userName, pwd){
  # Put a generic path here
  db <- dbConnect(SQLite(), dbname = "my_path/database.db")
  # Query to get the correct password
  qry = paste0("SELECT password from USERS where name = '", userName, "'")
  res = dbGetQuery(db, qry)
  res = paste0(res)
  ifelse(res == pwd, print("connected"), print("unable to connect to the database"))
  dbDisconnect(db)
}
I just wrapped the argument in single quotes.
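For what it's worth, pasting user input into the SQL string is vulnerable to SQL injection and still breaks if the name itself contains a quote. A safer variant (just a sketch, untested, assuming the same USERS table with name and password columns) is a parameterised query, so DBI handles the quoting:
connect <- function(userName, pwd){
  db <- dbConnect(SQLite(), dbname = "my_path/database.db")
  on.exit(dbDisconnect(db), add = TRUE)
  res <- dbGetQuery(db,
                    "SELECT password FROM USERS WHERE name = ?",
                    params = list(userName))
  if (nrow(res) == 1 && res$password == pwd) "connected" else "unable to connect to the database"
}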
Related
I have an Oracle database which is refreshed once a day. I am a bit confused about how apps work in Shiny: what gets run once on app startup, and what gets run once per session.
My naive approach was to create a database connection and run a query outside of the UI and server code to create a data frame of around 600,000 records, which can then be filtered and sliced during the session. I am a bit concerned that by doing it inside app.R in global scope, this connection and data frame will only be created once when the server starts the app, and will never get run again (if that makes sense).
If I create the data frame in server, then my UI code fails, as it is dependent on the results of a query to populate the select list, and I do this in app.R scope at the moment so the UI can access it.
library(shiny)
library(DBI)
library(dplyr)
library(odbc)
library(stringdist)
library(reactable)
############################################################################
# business functions #
############################################################################
get_list_of_actives_from_db <- function() {
  con <- dbConnect(odbc::odbc(), Driver = "oracle", Host = "server.mycompany.net",
                   Port = "1521", SVC = "service1", UID = "user_01", PWD = "hello", timeout = 10)
  ingredients_df = dbGetQuery(con,
    '
    select DISTINCT INGREDIENTS FROM AES
    '
  )
}

get_adverse_events_from_db <- function() {
  con <- dbConnect(odbc::odbc(), Driver = "oracle", Host = "server.mycompany.net",
                   Port = "1521", SVC = "service1", UID = "user_01", PWD = "hello", timeout = 10)
  cases_df = dbGetQuery(con,
    '
    select * FROM AES
    '
  )
  return(cases_df)
}
############################################################################
# load data sets for use in dashboard #
############################################################################
cases_df = get_adverse_events_from_db()          # main data to slice and filter
ingredients_df = get_list_of_actives_from_db()   # drives the select list in the UI
############################################################################
# shiny UI #
############################################################################
ui <- fluidPage(
  "Adverse Event Fuzzy Search Tool",
  fluidRow(
    selectInput("ingredients", label = "Select one or more Active Ingredients:",
                choices = ingredients_df$PRIMARY_SUSPECT_KEY_INGREDIENT, multiple = TRUE),
    textInput("search_term", "AE Search Term:"),
    actionButton("do_search", "Perform Search")
  ),
  fluidRow(
    reactableOutput("search_results")
  )
)
############################################################################
# shiny server #
############################################################################
server <- function(input, output, session) {
# do stuff here to filter the data frame based on the selected value and render a table
}
# Run the application
shinyApp(ui = ui, server = server)
My main concern is doing this in the root of app.R; both functions run Oracle queries which never need to be re-run for the session, as the data only changes overnight via ETL.
############################################################################
# load data sets for use in dashboard #
############################################################################
cases_df = get_adverse_events_from_db()
ingredients_df = get_list_of_actives_from_db()
When and how often is this called? Once when the app is initialized, so the data set is never updated and is shared across sessions by users? Or is the entire script run end to end whenever a new session is started?
Part of me thinks it should be in the server function, so it runs once per session. But being new to Shiny, I feel like server is called constantly whenever there is a change in the UI, and I don't want to be constantly loading 600,000 records from Oracle.
Ideally I would cache the results once a day and make them available to all users across all sessions; I'm not sure how to achieve that, so for now I just want to know the best way to do this, so that each user runs the query once and has the data frame cached for the session.
Please check RStudio's article Scoping rules for Shiny apps in this context.
If I got you right, you are asking how to share a dataset across Shiny sessions and update it daily. (The title of the question didn't really fit your explanation of the problem, so I edited it.)
I'd suggest using a cross-session reactivePoll to avoid unnecessary DB queries (I once asked a similar question here - over there I gave an example showing that the same can be achieved via reactiveValues, but it's more complex).
Here is the simple pattern you can use - please note that reactivePoll is defined outside the server function, so all sessions share the same data:
library(shiny)

ui <- fluidPage(textOutput("my_db_data"))

updated_db_data <- reactivePoll(
  intervalMillis = 1000L*60L*5L, # check for a new day every 5 minutes
  session = NULL,
  checkFunc = function() {
    print(paste("Running checkFunc:", Sys.time()))
    Sys.Date()
  },
  valueFunc = function() {
    # your db query goes here:
    paste("Latest DB update:", Sys.time())
  }
)

server <- function(input, output, session) {
  output$my_db_data <- renderText(updated_db_data())
}

shinyApp(ui, server)
Here, every 5 minutes the checkFunc checks for a new day; valueFunc is executed only if the result of checkFunc changed. As a (real-world) alternative for checkFunc, you could implement a query that checks the number of rows of a certain DB table.
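For example, a row-count based checkFunc could look roughly like this (an untested sketch; the connection arguments and the AES table are copied from the question and would need adapting, and it assumes library(DBI) and library(odbc) are loaded):
updated_db_data <- reactivePoll(
  intervalMillis = 1000L*60L*5L,
  session = NULL,
  checkFunc = function() {
    # cheap query: only reload the full table when the row count changes
    con <- dbConnect(odbc::odbc(), Driver = "oracle", Host = "server.mycompany.net",
                     Port = "1521", SVC = "service1", UID = "user_01", PWD = "hello", timeout = 10)
    on.exit(dbDisconnect(con), add = TRUE)
    dbGetQuery(con, "select count(*) from AES")[[1]]
  },
  valueFunc = function() {
    con <- dbConnect(odbc::odbc(), Driver = "oracle", Host = "server.mycompany.net",
                     Port = "1521", SVC = "service1", UID = "user_01", PWD = "hello", timeout = 10)
    on.exit(dbDisconnect(con), add = TRUE)
    dbGetQuery(con, "select * from AES")
  }
)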
PS: There is an example of a cross-session reactiveFileReader (which is based on reactivePoll) when viewing ?reactiveFileReader.
PPS: When doing further filtering etc. on that dataset, also check out bindCache().
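As an illustration of the bindCache() idea (a sketch only, assuming updated_db_data() returns the full AES data frame rather than the placeholder string above, that AES has an INGREDIENTS column, and reusing the ingredients input from the question; bindCache() needs shiny >= 1.6.0):
filtered_cases <- bindCache(
  reactive({
    dplyr::filter(updated_db_data(), INGREDIENTS %in% input$ingredients)
  }),
  input$ingredients
)
output$search_results <- reactable::renderReactable(reactable::reactable(filtered_cases()))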
While untested, perhaps this architecture will work:
server <- function(input, output, session) {
  dailydata_ <- reactiveValues(when = NULL, what = NULL)

  dailydata <- reactive({
    oldwhen <- dailydata_$when
    if (is.null(oldwhen) ||
        as.Date(oldwhen) < Sys.Date()) {
      newdata <- tryCatch(
        DBI::dbGetQuery(con, "..."),
        error = function(e) e)
      if (inherits(newdata, "error")) {
        warning("error retrieving new data: ", conditionMessage(newdata))
        warning("using stale data instead")
      } else {
        dailydata_$when <- Sys.time()
        dailydata_$what <- newdata
      }
    }
    dailydata_$what
  })

  # some consumer of the real data
  output$tbl <- renderTable(dailydata())
}
The advantage of this is that the re-query will trigger when the data was retrieved on a different day. Granted, when the new ETL becomes available might change how exactly this conditional is fashioned: if it is updated at (say) 2am, then you may need some more time-math to determine whether the current data is from before or after the most recent update.
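For instance, if the ETL were assumed to land at 2am, the staleness check could compare against the most recent 2am cutoff instead of the calendar date (a sketch of the idea only):
# most recent 2am that has already passed
last_etl <- as.POSIXct(paste(Sys.Date(), "02:00:00"))
if (last_etl > Sys.time()) last_etl <- last_etl - 86400
needs_refresh <- is.null(oldwhen) || oldwhen < last_etl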
This logic has a "data available" fallback: if the new data could not be queried, then the current/stale data is re-used. If you prefer that it return no data instead, that is easy enough to change in the code.
(One thing you might want to do is show the user when the data was last retrieved; this can be read directly from dailydata_$when, accepting that it might be NULL.)
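A minimal sketch of that, using a hypothetical last_updated output:
output$last_updated <- renderText({
  if (is.null(dailydata_$when)) "Data not retrieved yet"
  else paste("Data last retrieved:", format(dailydata_$when))
})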
I have an R Shiny app connecting to a SQL database with the following code:
ui <- fluidPage(
column(12, align = 'center', textInput("userName", "User Name")),
column(12, align = 'center', actionButton("dbConnect", "Connect to Database")))
server <- function(session, input, output) {
observeEvent(input$dbConnect(odbc::odbc(),
driver = "ODBC Driver 17 for SQL Server",
Server = '123......',
Database = 'example',
UID = input$userName,
PWD = askpass("Enter Database Password"))
}
The code works for entering the username and then prompts for the password. However, it does not connect to the database and returns Login failed for user 'username'.
I have ensured that the username and password are correct and that they connect to the database when entered directly in SQL.
Are you trying to return a connection?
conn = eventReactive(
  input$dbConnect,
  {
    PWD = askpass::askpass("Enter Database Password")
    conn = DBI::dbConnect(
      odbc::odbc(),
      driver = "ODBC Driver 17 for SQL Server",
      Server = '123.......',
      Database = 'example',
      port = 1433,
      UID = input$userName,
      PWD = PWD
    )
    return(conn)
  }
)
You can then use conn() in subsequent calls, for example
output$tables = renderText({
  req(conn())
  DBI::dbListTables(conn())
})
Your sample code has a syntax error:
observeEvent(input$dbConnect(odbc::odbc(),
driver = "ODBC Driver 17 for SQL Server",
Server = '123......',
Database = 'example',
UID = input$userName,
PWD = askpass("Enter Database Password"))
should probably be
observeEvent(input$userName, {
  dbConnect(odbc::odbc(),
            driver = "ODBC Driver 17 for SQL Server",
            Server = '123......',
            Database = 'example',
            UID = input$userName,
            PWD = askpass("Enter Database Password"))
})
but even that won't work.
You never store that connection anywhere, so no query can use it. The DBI methods such as DBI::dbGetQuery all require the connection object as their first argument, so you need to put it in a real object. @langtang's suggestion to use eventReactive is the right start.
You cannot use askpass in a Shiny app that you intend to deploy. You'll need to use environment variables, config.yml (via the config R package), or some other way of passing secrets to the app. (I'm going to ignore this part for the sample code below, but keep it in mind.)
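For example, a deployment-friendly sketch would read the secrets from environment variables (assuming you set DB_UID and DB_PWD on the server, e.g. in an .Renviron file, or use an equivalent config.yml entry via config::get()):
# e.g. in .Renviron on the deployment server:
#   DB_UID=user_01
#   DB_PWD=some_secret
uid <- Sys.getenv("DB_UID")
pwd <- Sys.getenv("DB_PWD")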
My suggestion
Store the credentials in a list (as opposed to a database connection object) and include a function that connects, runs the query, then disconnects. While this adds a little overhead to each query, it is very small.
But the bigger reason is this: if you ever plan on providing even simple async operations in your Shiny app by including promises+future, the connection object cannot be transferred to the future processes, so any query in a promise will fail. However, a list of parameters can be transferred.
So you might try:
myGetQuery <- function(..., cred) {
  tmpcon <- do.call(DBI::dbConnect, c(list(odbc::odbc()), cred))
  on.exit(DBI::dbDisconnect(tmpcon), add = TRUE)
  DBI::dbGetQuery(tmpcon, ...)
}
# similar funcs needed if you need things like: dbExecute,
# dbListTables, dbListFields, etc, elsewhere in your server component

cred <- list(
  driver = "ODBC Driver 17 for SQL Server",
  Server = '123......',
  Database = 'example'
)
And somewhere in your shiny app, something like:
cred_with_userpass <- reactive({
  c(cred, UID = input$userName,
    PWD = askpass("Enter Database Password"))
})

somedat <- reactive({
  myGetQuery("select 1 as a", cred = cred_with_userpass())
})
I am pulling in query results in my R Shiny app. I want to be able to see the format of the results before I create functions to do what I need for my app. This is the basic setup for my R Shiny app:
#libraries
library(DBI)
library(rJava)
library(RJDBC)
#the driver is a JDBC if that helps
con <- dbConnect(drv, url = "some url")
my_query = "select * from my_table"
print(my_query)
server <- function(input, output, session){
}
ui=shinyUI(fluidPage(
))
shinyApp(ui = ui, server = server)
This just gives me
[1] "select * from my_table"
in the console in RStudio
How can I show my query results in the RStudio console for my Shiny app?
Here you need to pass the output from the server to the UI. In the server you should have the code that fetches the data from the query. After that, the output is passed to the UI.
Try your code in the format below. It is only an indicative example; please update this code with your original parameters.
library(DBI)
library(rJava)
library(RJDBC)

server <- function(input, output, session){
  con <- dbConnect(drv, url = "some url")
  output$table1 <- renderTable({
    my_query = "select * from my_table"
    dbGetQuery(con, my_query)
  })
}

ui = shinyUI(fluidPage(
  tableOutput("table1")
))

shinyApp(ui = ui, server = server)
Sumanta's answer is right; however, remember to close the connection to the database.
df <- dbGetQuery(con,my_query)
on.exit(RJDBC::dbDisconnect(con))
To check the format you could use
str(df) # as some people have mentioned above
# or
glimpse(df)
As I am new to developing with Shiny, I am interested in the best practices for automated database queries. At the time of writing there are a number of different sources with different information.
If I am to query my Postgres database every 10 minutes as in the example below, I want to make sure that there are no issues with a) closing the connection on session exit and b) not being able to connect due to too many open connections. My dashboard will in the future have at most a dozen users at one time.
Having done some research, I am convinced that the best way to do this is not necessarily to use a pool but to use the "one connection per query" method documented by Shiny here.
Is using reactivePoll() as I have below the correct way to implement a query that will refresh the rendered table every 10 minutes? The database I will be querying will definitely return different data with every call. Does that mean that checkFunc and valueFunc should be the same, or can checkFunc be left as an empty function altogether?
library(shiny)
library(DBI)

args <- list(
  drv = dbDriver("PostgreSQL"),
  dbname = "shinydemo",
  host = "shiny-demo.csa7qlmguqrf.us-east-1.rds.amazonaws.com",
  username = "guest",
  password = "guest"
)

ui <- fluidPage(
  textInput("ID", "Enter your ID:", "5"),
  tableOutput("tbl"),
  numericInput("nrows", "How many cities to show?", 10),
  plotOutput("popPlot")
)

server <- function(input, output, session) {
  output$tbl <- renderTable({
    conn <- do.call(DBI::dbConnect, args)
    on.exit(DBI::dbDisconnect(conn))
    sql <- "SELECT * FROM City WHERE ID = ?id;"
    query <- sqlInterpolate(conn, sql, id = input$ID)
    data <- reactivePoll(10000, session,
                         checkFunc = function() {},
                         valueFunc = function() {
                           dbGetQuery(conn, query)
                         })
  })
}

shinyApp(ui, server)
I recommend creating your db connection 'conn' outside of any output objects.
args <- list(
  drv = dbDriver("PostgreSQL"),
  dbname = "shinydemo",
  host = "shiny-demo.csa7qlmguqrf.us-east-1.rds.amazonaws.com",
  username = "guest",
  password = "guest"
)

conn <- do.call(DBI::dbConnect, args)
It could be a global environment object, like the 'args' list in your sample code, or created inside the server function; queries within rendered output objects will then all access the same 'conn' db connection. In my experience, an explicit disconnect was not necessary to include; after the R session with the Shiny app is closed, the database disconnects too.
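Putting that together with the question's reactivePoll, an untested per-session sketch could share one connection and filter on the input only when rendering (it assumes the City table has an ID column, as in the question's query):
server <- function(input, output, session) {
  # one connection per session, shared by all outputs
  conn <- do.call(DBI::dbConnect, args)
  session$onSessionEnded(function() DBI::dbDisconnect(conn))  # optional cleanup

  # re-run the query every 10 minutes; the data changes on every call,
  # so checkFunc simply always reports a change
  city_data <- reactivePoll(
    intervalMillis = 1000 * 60 * 10,
    session = session,
    checkFunc = function() Sys.time(),
    valueFunc = function() DBI::dbGetQuery(conn, "SELECT * FROM City")
  )

  output$tbl <- renderTable({
    subset(city_data(), ID == input$ID)
  })
}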
I have a Shiny app and I want to enable certain features for members who log in to the app using Google login. I am not able to implement the Google login and authentication process within my app using the googleAuthR package. Does anyone have an example of a sample Shiny app which allows the audience to log in through Google or any other social authorization provider?
I would appreciate a demo with code.
PS: I have no intention of running statistics on Google data; I only want to avoid the hassle of creating a login module for my app and let Google login take care of it.
Thank you
SD
There is an example in the readme that you can see working as a Shiny app here.
If you are intending it just for logging-in purposes, check out the googleID package, which is built with googleAuthR with this in mind.
Example code below:
## in global.R
library(googleAuthR)
library(shiny)

options(googleAuthR.scopes.selected = "https://www.googleapis.com/auth/urlshortener")
options(googleAuthR.webapp.client_id = "YOUR_PROJECT_KEY")
options(googleAuthR.webapp.client_secret = "YOUR_CLIENT_SECRET")

shorten_url <- function(url){
  body = list(
    longUrl = url
  )
  f <- gar_api_generator("https://www.googleapis.com/urlshortener/v1/url",
                         "POST",
                         data_parse_function = function(x) x$id)
  f(the_body = body)
}
## server.R
source("global.R")
server <- function(input, output, session){

  ## Create access token and render login button
  access_token <- callModule(googleAuth, "loginButton")

  short_url_output <- eventReactive(input$submit, {
    ## wrap existing function with_shiny
    ## pass the reactive token in shiny_access_token
    ## pass other named arguments
    with_shiny(f = shorten_url,
               shiny_access_token = access_token(),
               url = input$url)
  })

  output$short_url <- renderText({
    short_url_output()
  })

}
## ui.R
ui <- fluidPage(
  googleAuthUI("loginButton"),
  textInput("url", "Enter URL"),
  actionButton("submit", "Shorten URL"),
  textOutput("short_url")
)
### If the above global.R, server.R and ui.R files are in folder "test" like so:
## /home
##   |-> /test/
##          /global.R
##          /ui.R
##          /server.R
##
## Port 1221 has been set in your Google Project options as the port to listen to
## as explained in the authentication setup section
## run below in the /home directory
shiny::runApp("./test/", launch.browser = TRUE, port = 1221)
I solved this problem in a different manner, using gar_shiny_ui:
1. Define the UI in the server.
2. Get the user info and extract the email from his/her Google login.
3. Use this email to determine whether the person is from your organisation.
4. If the person is from your organisation, show the main UI; otherwise show a UI which says 'You cannot access this tool'.
# Function to get Google user data, which will be used to check if the user comes from your organisation
# (assumes library(shiny), library(googleAuthR), library(shinyjs) and library(shinythemes) are loaded)
user_info <- function(){
  f <- gar_api_generator("https://www.googleapis.com/oauth2/v1/userinfo",
                         "GET",
                         data_parse_function = function(x) x)
  f()
}

# UI code based on output coming via server code
ui <- uiOutput('myUI')

# Server-side code to do all the lifting
server = function(input, output, session) {

  gar_shiny_auth(session)

  # Check if the user has already logged in with Google authentication
  gmail = 'john.doe@unknown.com'
  tryCatch({
    x <- user_info()
    gmail = x$email
    print(gmail)
  }, error = function(e) NULL)
  print(gmail)

  # Create a different UI based on where the user comes from (MyOrg or not)
  output$myUI <- renderUI({
    if (grepl('@myorganisation.com', gmail)) {
      ui = fluidPage(
        shinyjs::useShinyjs(),
        title = 'Your Product',
        theme = shinytheme("cerulean"),
        img(src = "mycompany_logo.png", height = 200, width = 400),
        sidebarLayout(
          sidebarPanel(
            # write whatever you want
          ),
          mainPanel(
            # write whatever you want
          )
        )
      )
    } else {
      ui = fluidPage(mainPanel(
        h4("My Company Data Team Presents", align = "center"),
        h1("My Tool", align = "center"),
        p("Tool that makes analysing any and everything ",
          em("incredibly easy "),
          "with a simple click."),
        br(),
        p("- But unfortunately, your account does not have the rights to ",
          em("access "),
          "this tool.")))
    }
  })
}

shinyApp(gar_shiny_ui(ui, login_ui = silent_auth), server)