Automatic re-reading of the source file with shinyapps.io - r

I have an application where I need to update the source data periodically. The source data file is a csv file, normally stored in the project directory and read with read.csv. The csv file is changed every day with the updates; the name of the file does not change, just a few cases are added.
I need the application to re-read the source csv file with some periodicity (e.g. once per day). I can do it with the reactiveFileReader function, and it works when I am running the application from RStudio, but not after I deploy the application on the web with shinyapps.io.
Can this even be done when I am not using my own server but shinyapps.io?
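For context, the pattern being described can be sketched as follows. Because a csv bundled at deploy time never changes on shinyapps.io, this sketch polls a remote copy of the file instead of a local one; the URL, the daily interval, and the output names are placeholders, not part of the original question.

```r
library(shiny)

# Hypothetical remote copy of the daily csv; shinyapps.io cannot see
# files that change on your local disk after deployment, so the data
# has to live somewhere the deployed app can reach.
csv_url <- "https://example.com/data/daily.csv"

server <- function(input, output, session) {
  src_data <- reactivePoll(
    intervalMillis = 24 * 60 * 60 * 1000,  # re-check roughly once per day
    session        = session,
    checkFunc      = function() Sys.time(),  # always triggers a re-read on schedule
    valueFunc      = function() read.csv(csv_url)
  )
  output$tbl <- renderTable(head(src_data()))
}
```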

Related

open a OneDrive file with r

I'm trying to create a Shiny app that reads an online OneDrive xlsx file and shows some things, but for the moment I'm unable to read the OneDrive xlsx file. I have already explored Microsoft365R; I can connect to my OneDrive and I can even open the file, but what it does is open a tab in Chrome with the Excel file.
I need the file in the local environment of R. This is because the Shiny app must be deployed on a web server, so every time the app runs it reads the updated file.
library(Microsoft365R)
odb <- get_business_onedrive()
odb$open_file("lcursos.xlsx")
Also, this is a business account, so I have to provide the username and key to access each file; that is why using the plain URL doesn't work, it says Error 403 FORBIDDEN.
Any ideas?
Thank you so much!
Use the download_file() method to download the file to your local machine:
odb$download_file("lcursos.xlsx")
You can set the location of the download with the dest argument. Once it's downloaded, open it with the Excel reader package of your choice; I suggest either openxlsx or readxl.
Note that if your file is password protected, your options are limited. See this question for possible solutions.
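Putting the answer together, a minimal sketch; the file name comes from the question, while the helper function name, the temp-file destination, and the use of readxl are my own assumptions:

```r
library(Microsoft365R)  # note the spelling; the question's library(Microsfot365R) is a typo
library(readxl)

# Download the workbook to a local temp file, then read it with readxl.
read_onedrive_xlsx <- function(src, dest = tempfile(fileext = ".xlsx")) {
  odb <- get_business_onedrive()  # authenticates to the business account
  odb$download_file(src, dest = dest, overwrite = TRUE)
  read_excel(dest)
}

# cursos <- read_onedrive_xlsx("lcursos.xlsx")
```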

How to avoid daily uploading of files to PCF

I have my Dash Plotly app running in PCF. My app.py runs based on an Excel file which is uploaded to PCF along with app.py, but the Excel feed changes daily, so every day I am uploading the new file to PCF using "cf push". Is it possible to avoid that, e.g. by making PCF read the Excel file from my file system instead of uploading the new file to the PCF cell container every time?
Basically you need some persistent storage attached to your container so the app can refer to the current file at run time. These are the options that can be explored:
If NFS is enabled at your end, you can mount the file share and pick up the files from that location directly.
Otherwise you can have another PCF service (kept separate for better management) that pulls the files from your server using sftp and transfers them to S3. Amend your app to read the file from S3.
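If the NFS route is available, the binding can be sketched in the app manifest; the app name, service name, and share path below are all placeholders:

```yaml
# manifest.yml (sketch): bind an NFS volume service so the feed file
# appears inside the container without a daily `cf push`.
# Create the service instance first, for example:
#   cf create-service nfs Existing nfs-feed -c '{"share":"fileserver.example.com/exports/feed"}'
applications:
- name: dash-app
  services:
  - nfs-feed
```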

r opencpu root dir for reading csv

I have opencpu (single server) up and functioning. My first function will open a dataset from a csv file stored on my hard drive.
Where should I deploy the csv file? (I tried my apps www directory, but it doesn't work)
In sum: within an opencpu app, where do I deploy a file so that this line of code will work?
indf <- read.csv(".\\nouns-categorical_R1.csv")
The answer is simple to find: add print(getwd()) to your OpenCPU function script, call the function with a POST request, and then retrieve the printed working directory with a GET request for the URL ending in /console.
The answer is that the working directory is a temp directory:
.....AppData/Local/Temp/Rtmp0qr704/ocpu_session_3780fc520c8
This means you cannot store csvs in the working directory; it changes every time you start OpenCPU.
It is possible to use a full path to the csv when calling read.csv(). However, you need to watch for security issues and file permissions once deployed on Ubuntu.
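A hedged sketch of the two workarounds; the package name "myapp" and the /var/data path are assumptions, and shipping the csv inside the app package is the alternative to a hard-coded absolute path:

```r
# Option 1: ship the csv inside the OpenCPU app package (inst/extdata)
# and resolve it at run time with system.file().
# Option 2: fall back to an absolute path on the server.
load_nouns <- function() {
  path <- system.file("extdata", "nouns-categorical_R1.csv", package = "myapp")
  if (!nzchar(path)) {
    path <- "/var/data/nouns-categorical_R1.csv"  # assumed server location
  }
  read.csv(path)
}
```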

R shiny concurrent file access

I am using the R shiny package to build a web interface for my executable program. The web interface provides user input and shows output.
On the server side, the R script formats the user inputs and saves them to a local input file. Then R calls a system command to run the executable program.
My concern is that if multiple users run the web app at the same time, the input file generated by the first user could be overwritten by the second user's input before it is read by the executable program.
One way to resolve the conflict is to have R create a temporary folder for each user and generate/run the input file under that folder. But I'd like to know whether there is a better or automatic way to resolve this potential conflict with Shiny. For example, with Shiny's fileInput, uploaded files are automatically stored in a temporary folder.
Update
Thanks for the advice, @Symbolix and @Mike Wise.
I read the persistent data storage article before, but I don't think it is exactly what I wanted; maybe my understanding is not correct. I ended up creating a temporary folder and running my executable from there.
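The temporary-folder approach the asker settled on can be sketched per session; format of the input and the executable name are placeholders for the asker's own formatting step and program:

```r
library(shiny)

server <- function(input, output, session) {
  # One scratch directory per session token: concurrent users can no
  # longer overwrite each other's input file.
  session_dir <- file.path(tempdir(), session$token)
  dir.create(session_dir, showWarnings = FALSE)

  observeEvent(input$run, {
    infile <- file.path(session_dir, "input.txt")
    writeLines(as.character(input$params), infile)  # placeholder for real input formatting
    system2("my_program", args = infile)            # placeholder executable
  })

  # Clean up the per-session directory when the session closes.
  session$onSessionEnded(function() unlink(session_dir, recursive = TRUE))
}
```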

How to Read a .csv file from url

I have successfully developed a Shiny application for sentiment analysis. When I deployed the application, it did not work. After going through the code, I found that it refers to files (emotions.csv.gz, subjectivity.csv.gz) which are located on my system. That could be the reason the application is not working: on the server it cannot find these particular files. Is there any method to read these files directly from the URL where they are hosted, so that there are no system dependencies?
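A small base-R helper along those lines; the host in the commented example is a placeholder. read.csv() can read straight from a connection, but gzipped files served over HTTP need a gz connection layered over the URL:

```r
# Read a (possibly gzipped) csv directly from a URL, with no local copy.
read_remote_csv <- function(u) {
  con <- if (grepl("\\.gz$", u)) gzcon(url(u)) else url(u)
  read.csv(con)
}

# emotions <- read_remote_csv("https://example.com/emotions.csv.gz")
```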
