I have developed a Shiny app that provides some plots which can be filtered and "played around" with. The thing is, I would like to:
1. Give access to some colleagues. This means I would like the app to be private, so that when someone accesses the link, a password is needed to enter.
2. Not let the user manage the data. I mean, I don't want the user to download the data, so I would like the data to already be "there", on a server or something like that. This part would be my responsibility: making the data available for the rest of the colleagues.
I wonder if these things are possible.
For #1, have a look at the shinymanager package, as per this example.
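A minimal sketch of that setup, assuming a hard-coded credentials data frame purely for illustration (shinymanager can also keep credentials in an encrypted SQLite database):

library(shiny)
library(shinymanager)

# Hypothetical credentials, hard-coded here for illustration only
credentials <- data.frame(
  user = c("colleague1", "colleague2"),
  password = c("change_me_1", "change_me_2"),
  stringsAsFactors = FALSE
)

# Wrap the normal UI so a login screen is shown first
ui <- secure_app(
  fluidPage(
    plotOutput("my_plot")
  )
)

server <- function(input, output, session) {
  # Check the submitted user/password against the credentials table
  res_auth <- secure_server(check_credentials = check_credentials(credentials))
  output$my_plot <- renderPlot(plot(cars))
}

shinyApp(ui, server)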
For #2, save the data as a .csv or .rds file in the same directory as the app, and load it on visit. You can also use shinyjs::hide and shinyjs::show if you would like to make the download buttons invisible.
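For instance, a sketch assuming the data is saved as data.rds next to app.R and the download control has the id download_data (both names are made up here):

library(shiny)
library(shinyjs)

# data.rds is deployed alongside app.R, so users never touch the raw file
app_data <- readRDS("data.rds")

ui <- fluidPage(
  useShinyjs(),
  downloadButton("download_data", "Download"),
  plotOutput("my_plot")
)

server <- function(input, output, session) {
  # Keep the download control invisible so users can only view the plots
  shinyjs::hide("download_data")
  output$my_plot <- renderPlot(plot(app_data))
}

shinyApp(ui, server)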
I'm currently working on an R Shiny app that uses googlesheets4 to pull in a large dataset from GoogleSheets upon app launch. Loading this dataset into my app takes ~2 minutes, which stalls my entire app's load time.
The only visual in my app is based on this GoogleSheets data, so it is very dependent on this specific dataset. Once the dataset gets pulled into my app, it is filtered and therefore becomes much smaller (85,000 rows ---> 1,000 rows). This GoogleSheets data is updated every day, so I don't have the luxury of pre-downloading it once and storing it as a .csv forever.
There are two different fixes for this that I have tried, unsuccessfully... curious if anyone has any thoughts.
1. Have a separate app running. My first idea was to create a separate Shiny app entirely, whose sole purpose would be to pull the GoogleSheets df once a day. Once it pulls it, it would conduct the necessary data cleaning to get it down to ~1,000 rows, and then push the smaller df to a different GoogleSheet link. Then my original app with the visual would just always reference that new GoogleSheet (which would take much less time to load).
The problem I ran into here is that I couldn't figure out how to write a new GoogleSheets doc using googlesheets4. If anyone has any idea how to do that it would be much appreciated.
2. Temporarily delay the load of the GoogleSheets data, and let the visual populate first. My second idea was to have the code that pulls in the GoogleSheets df be delayed upon launch, letting my visual populate first (using old data) and then having the GoogleSheets pull happen. Once the pull is complete, the visual would re-populate with the updated data.
I couldn't figure out the best/right way to make this happen. I tried messing around with Sys.sleep() and futures/promises but couldn't get things to work correctly.
Curious if anyone has any thoughts on my 2 different approaches, or if there's a better approach I'm just not considering...
Thanks!
There is a function called write_sheet that allows you to write data to a Google Sheet. Does that work for you?
googlesheets4::write_sheet(data = your_data,
                           ss = spread_sheet_identifier,
                           sheet = "name_of_sheet_to_write_in")
If you only want to add something without deleting everything already in the sheet, the function is sheet_append:
googlesheets4::sheet_append(data = your_data,
                            ss = spread_sheet_identifier,
                            sheet = "name_of_sheet_to_write_in")
I am not sure you can store the credentials in a safe way, but couldn't you use GitHub Actions? Or alternatively a cron job on your local computer?
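For what it's worth, a rough sketch of that once-a-day job as a standalone R script; the sheet IDs, the service-account key file and the filtering step below are all placeholders:

library(googlesheets4)
library(dplyr)

# Non-interactive auth for a scheduled run (cron / GitHub Actions),
# assuming a service account that can read the source sheet and edit the target
gs4_auth(path = "service-account-key.json")

raw <- read_sheet("SOURCE_SHEET_ID")    # the large ~85,000-row sheet

# Placeholder for the real cleaning logic that gets it down to ~1,000 rows
small <- raw %>% filter(keep_row)

# Overwrite the small summary sheet that the Shiny app reads on launch
write_sheet(small, ss = "SUMMARY_SHEET_ID", sheet = "daily_summary")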
I have successfully been able to pull down an Excel file from SharePoint, but half of the time I try to run the code it works and the other half it tells me the path does not exist. I can't find any rhyme or reason as to why it works sometimes and not others. I have tried opening the site first and opening the file first, but it seems to be random. I need the code to work consistently or I can't use R for this task. I don't log onto SharePoint; it uses my Windows authentication. Is there a way to force it to recognize me, or to include a password in the read_excel command?
df <- read_excel('//sharepoint...', 'sheetname')
To make access to SharePoint files easy you should sync the sites from the web app to File Explorer. Addresses for these cloud resources that have been synced are commonly of the form:
C:\Users\username\My Org\My Teams Group - General\Project\My Excel.xlsx
This can create a problem when the code is run by multiple users. Whilst https addresses for cloud locations may work in File Explorer, they do not work directly within R packages. Users are well advised to keep track of their working directory and use relative addresses where it is helpful to do so, but there is another option.
Make the code user agnostic by setting the username as a variable or by returning the home path with the Sys.getenv() function.
library(openxlsx)

username <- Sys.getenv("USERNAME")
sharepoint_address <- "/My Org/My Teams Group - General/Project/My Excel.xlsx"

# Build the user-specific path to the synced SharePoint file
df <- read.xlsx(xlsxFile = paste0("C:/Users/", username, sharepoint_address), sheet = "Raw Data")

# More elegantly
df <- read.xlsx(xlsxFile = paste0(Sys.getenv("HOMEPATH"), sharepoint_address), sheet = "Raw Data")
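A variant of the same idea, if it helps: on Windows, Sys.getenv("USERPROFILE") usually resolves to C:\Users\<username> with the drive letter included, so the prefix does not have to be hard-coded:

library(openxlsx)

sharepoint_address <- "/My Org/My Teams Group - General/Project/My Excel.xlsx"

# USERPROFILE is typically C:\Users\<username>, so no "C:/Users/" prefix is needed
df <- read.xlsx(xlsxFile = paste0(Sys.getenv("USERPROFILE"), sharepoint_address),
                sheet = "Raw Data")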
I do not know how to illustrate this with a simple example. The problem is this:
I generate and display a flextable in a Shiny app and want to place it in a PDF. The only available method is to convert the flextable object to a PNG and then place the PNG in the PDF. This works fine, except users are reporting strange results: they get the report with a table that looks nothing like the one displayed in the app. I suspect that occasionally users are executing reports so close together in time that the last saved PNG is grabbed, but it was saved by another user.
The PNG files (there are three) are placed in the app directory, which I believe is not isolated from one user session to another. In the PDF I cannot use relative paths, so I cannot save them to a different directory.
Any suggestions?
Have you tried naming the images with a unique key, such as a per-report number, so the images are named something like chart_0153927_01.png instead of chart_01.png for report #0153927? Or something like a millisecond/microsecond timestamp set once at the start of the session to reduce collisions?
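A rough sketch of the per-session variant in Shiny, using a stand-in flextable (save_as_image() needs webshot2, or webshot on older flextable versions):

library(shiny)
library(flextable)

ui <- fluidPage()

server <- function(input, output, session) {
  # session$token is unique per user session, so concurrent users write to
  # different files instead of overwriting a shared chart_01.png
  img_path <- file.path(tempdir(), paste0("chart_", session$token, "_01.png"))

  ft <- flextable(head(mtcars))        # stand-in for the real table
  save_as_image(ft, path = img_path)

  # img_path is an absolute path, so it can be passed straight to the code
  # that assembles the PDF
}

shinyApp(ui, server)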
I am new to Ionic. I want to know what the ideal choice for storage is. Is it Ionic Storage?
Also, I need to manually enter some data for the app. I can't find any way to store data beforehand; is this possible in Ionic?
For example, let's say I need a database/storage pre-filled with some values. How can I do that? Is that possible, or do I need to get the data from the cloud?
I have posted my query on the Ionic forum, and this is the best answer I got:
Super simple solution: On start you check if a special value is set, e.g. databasePreloaded. If it is missing, you open a file, read the content and write it to the storage. Then you set databasePreloaded. On next start it will be present and the data won’t be loaded again.
My question is: if the data is around 5-6 MB, what is the ideal way to do that?
Using a check to see if the data has been loaded is a simple choice.
It would be easy to work with.
The main issue for me is whether the data will change at all once the app has been deployed; you would need to think about how that would be handled.
Finally, I'm under the impression IndexedDB can only handle up to 5 MB, so you will need to store this data in SQLite?
In my multi-user Meteor application design I want to enable users to create and store their own reactive dashboards to visualize data that they own within the application's database. For example, a user may have an object in the database representing the real-time disk usage of a processor. I want them to be able to submit/store HTML, say, to represent a dynamic dial as their dashboard. Another user may have their own weather station and want a dashboard with a last-24-hours thermometer and pressure trend. When they call up one of their stored dashboards, it is rendered and would update as their data changes.
Can anyone point to example code or explain how to accomplish this? Or authoritatively explain why it cannot be done in the framework. I have come across various dynamic APIs, but nothing that fits the bill, e.g. UI.renderWithData and Meteor._def_template.
The following topic was very similar to my question; it got me a good start, and I figured it out and posted an answer there.
How to make meteor evaluate user defined template text