Problem with directory when deploying app on shinyapps

I wrote a golem app and wanted to deploy it on shinyapps.io. Unfortunately, every time I try to do it, the following error comes up in the logs:
Warning in loadSupport(appDir, renv = sharedEnv, globalrenv = NULL) :
Loading R/ subdirectory for Shiny application, but this directory appears to contain an R package. Sourcing files in R/ may cause unexpected behavior.
All files related to my project are stored in one directory, where my golem project was initially created. I also tried manually setting the working directory to the 'R' folder (where app_server and app_ui are stored), but the error still comes up when I deploy. Moreover, every time I close my project in RStudio I save the workspace image to an '.RData' file, which is also stored in the main directory; maybe the problem is there (but I also tried to deploy without this file and it fails as well). I really don't know where the problem lies or what this error means.
Interestingly, a regular (single-file) app.R can be deployed on shinyapps.io without a problem.

Since Shiny 1.5, if you run a Shiny app with a subdirectory called R/, every function stored in it will be loaded automatically. You can avoid this by setting the autoload option to FALSE:
options(shiny.autoload.r = FALSE)
What I do (I'm not sure whether it is best practice) is to set that option just before calling shiny::runApp(). For instance, I usually have a launch() function in my package which calls shiny::runApp(); including the option within this launch() function should fix the issue.
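A minimal sketch of such a wrapper (the name launch() and its contents follow the convention described above; it is not a fixed golem API):
launch <- function(...) {
  # stop Shiny from auto-sourcing the package's R/ directory,
  # which in a golem project holds package code, not app scripts
  options(shiny.autoload.r = FALSE)
  shiny::runApp(...)
}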
Nonetheless, the message is a warning, not an error, and it is possible that everything is working properly in your shiny app.

Related

Why do I get an error when deploying a shiny app?

When I try to publish my ShinyApp with shinyapps.io,
I keep getting the following error message: 'Application mode static requires at least one document'.
Here are some of the fixes I've implemented:
created a different directory with all dependent files and a copy of the R script
re-created custom functions from my local environment within the app
Furthermore, in read.csv I refer to the files directly, without specifying the directory, in case that might be the culprit.
Do you have any idea, what the issue might be?

Unable to use correct file paths in R/RStudio

Disclaimer: I am very new here.
I am trying to learn R via RStudio through a tutorial and very early on have encountered an extremely frustrating issue: when I try to use the read.table function, the program consistently reads my files (written as "~/Desktop/R/FILENAME") as going through the path "C:/Users/Chris/Documents/Desktop/R/FILENAME". Note that the program considers my Desktop folder to be inside my Documents folder, which is preventing me from reading any files. I have already set and re-set my working directory multiple times and even re-downloaded R and RStudio, and I still encounter this error.
When I enter the entire file path instead of using the "~" shortcut, the program is successfully able to access the files, but I don't want to have to type out the full file path every single time I need to access a file.
Does anyone know how to fix this issue? Is there any further internal issue with how my computer is viewing the desktop in relation to my other files?
Best,
Chris L.
The ~ tells R to look in your default home directory, which on Windows is your Documents folder; this is why you are getting this error. You can change the default directory in the RStudio settings or your R profile. It just depends on how you want to set up your project. For example:
Put all the files in the working directory (getwd() will tell you the working directory for the project). Then you can call the files with just the filename, and you will get tab completion (awesome!). You can change the working directory with setwd(), but remember to use the full path, not just ~/XX. This might be the easiest for you if you want to minimise typing.
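A quick sketch of that setup (the username and folder are illustrative):
path.expand("~")                    # likely "C:/Users/Chris/Documents" here, which is why ~ misbehaves
setwd("C:/Users/Chris/Desktop/R")   # full path, not "~/Desktop/R"
dat <- read.table("FILENAME")       # plain filenames now resolve against the working directory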
If you use a lot of scripts, or work on multiple computers or cross-platform, the above solution isn't quite as good. In this situation, you can keep all your files in a base directory, and then in your script use the file.path function to construct the paths:
base_dir <- "C:/Desktop/R"
read.table(file.path(base_dir, "FILENAME"))
I actually keep the base_dir assignment as a code snippet in RStudio, so I can easily insert it into scripts and know explicitly what is going on, as opposed to configuring it in RStudio or my R profile. There is a conditional in the code snippet which detects the platform and assigns the directory correctly, as in the sketch below.
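A sketch of such a platform-conditional snippet (the two paths are placeholders for wherever the data lives on each machine):
# choose the base directory according to the platform the script is running on
base_dir <- if (.Platform$OS.type == "windows") {
  "C:/Desktop/R"
} else {
  "~/Desktop/R"
}
read.table(file.path(base_dir, "FILENAME"))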
When R reports "cannot open the connection", it means one of two things:
The file does not exist at that location. You can verify whether the file is there by pasting the full path echoed back in the error message into the Windows file manager. Sometimes the error is as simple as an extra subdirectory. (This seems to be the problem with your current code: the Windows Desktop is never nested in Documents.)
If the file exists at the location, then R does not have permission to access the folder. This requires changing Windows folder permissions to grant R read and write permission to the folder.
In Windows, if you launch RStudio from the folder you consider the "project workspace home", then all path references can use the dot as "relative to the workspace home", e.g. "./data/inputfile.csv".

Set an .Rmd in a package to write files to the current project working directory

I have a .Rmd which I use to report on data quality in a number of different R projects. It then splits the data to remove subsets with missing data, and interpolates missing results where appropriate. It does this via a write.csv command to a file path of the form "./Cleansed_data/".
To make an example:
open RStudio
go to the RHS 'Project' menu, and select and make a new project wherever you'd like
go to the LHS 'New Script' dropdown and select 'new .Rmd'
change the output to .pdf and hit OK
in the last R chunk, include write.csv(mtcars, file = "mtcars.csv")
hit the 'Knit PDF' button, save the report as "writeFile.Rmd" to your project working directory, and let it run
Previously I moved this .Rmd from place to place; however, now I would like to build it into an internal package. I have included it (as the documentation indicates) in inst/rmd within the package directory.
In order to do this, build or open any package you have access to, then:
add the file to inst/rmd (create it if this doesn't exist)
rebuild the package (see the sketch below)
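A rebuild sketch, assuming you use devtools (an assumption; R CMD INSTALL from the command line does the same job):
# run from within the package project
devtools::document()   # refresh NAMESPACE / help files if needed
devtools::install()    # rebuild and install the package locally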
I then rebuild the package and open a new project. I load my new package and attempt to run the document via the render command, using system.file to locate the .Rmd, like so:
rmarkdown::render(input = system.file("rmd/writeFile.Rmd", package = "MyPackage"),
                  output_file = "writeFile.pdf", output_dir = "./Cars/")
This will render the report from the package build into the folder given by output_dir; however, there are a number of pitfalls here. First, if I omit the output_dir argument, the report renders into the package library, usually located in the R installation's library on the C: drive. This is, however, fixable.
What I can't get around is that when the .Rmd hits the write.csv() call, (I believe) the .Rmd is being rendered in the package environment at the time, whose working directory is the package library folder, not the current project directory.
The Questions
How can I inform the template in the package what the current working directory of the RStudio project is? I'm vaguely aware there might be an rstudioapi package; I have nearly no understanding of what it is, though, or whether it would provide a solution.
If this is either outright impossible or just potentially a very bad idea, how can I modify the workflow to successfully retrieve a number of R object outputs into the environment or the working directory on the call to the report, without having to modify the report for each different project? Further, why specifically is this approach such a bad plan?
In order to close this off:
I have elected to keep the .Rmd in the package. The .Rmd needs to move and be versioned with the package, as the package holds the functions it uses to run.
In order to meet my requirements, I style the documents to grab the working directory via the rstudioapi, in the form:
write.csv(mtcars, file = file.path(rstudioapi::getActiveProject(), "mtcars.csv"))
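One caveat worth checking: rstudioapi::getActiveProject() returns NULL when no project is open, and the call only works inside an RStudio session, so this pattern assumes both.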
Having tested #CL's answer, this also runs and is not dependent on RStudio as an IDE; however, I know that these documents will:
always be accessed via the RStudio IDE
always be accessed from within a specific project
I fear (though have not tested) that there would be the potential for other impacts from artificially booting the file into a different working directory. Potentially this could affect things like child documents I might want to include later, or other code that might need to resolve paths relative to the package installation, not the project. In this way I think (if I interpreted Yihui correctly) the R doc is still the centre of its own universe. It just writes its data into another one :)
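For reference, an IDE-independent variant (a hedged sketch; knit_root_dir is an argument of rmarkdown::render, though whether this is exactly what the answer tested above does is an assumption) is to evaluate the document's chunks in the calling project's directory:
# render the packaged report, but evaluate its chunks in the current project directory
rmarkdown::render(input = system.file("rmd/writeFile.Rmd", package = "MyPackage"),
                  output_file = "writeFile.pdf", output_dir = "./Cars/",
                  knit_root_dir = getwd())
With this, a plain write.csv(mtcars, "mtcars.csv") inside the .Rmd lands in the project directory rather than the package library.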

R - copy files from folder (Shiny App)

I'm working on loading up a Shiny app on shinyapps.io. I use an R package that downloads data into a subfolder in my directory and saves two .RData files.
I'm having issues on the shinyapps.io server. I need to load the two .RData files. Locally, I can set the relative path using (~project/source-data). shinyapps.io does not respond to this.
I can set it as a working directory using a relative path (./source-data); however, this is not ideal, as I have further data manipulation to do at the parent-level directory, and I can't seem to set the working directory back to the parent level on shinyapps.io.
Here is what I had moved forward with:
wd <- getwd()                             # the app directory (parent level)
sd <- file.path(wd, "source-data")        # subfolder holding the two .RData files
sd2 <- list.files(sd, full.names = TRUE)  # full paths of everything in source-data
file.copy(from = sd2, to = wd)            # copy those files up to the app directory
My solution, although not ideal, is to copy the two .RData files to the parent directory. At that point, the rest of my code runs smoothly. It works locally, but not on shinyapps.io.
Does anyone have experience with a similar problem and its solution? Either helping me direct the shinyapps.io server to the sub-directory and back to the parent directory once I load the two .RData files, or copying the files to the parent directory?
I've seen solutions that work in a local R environment, but not one that satisfies the conditions of a shinyapps.io server environment.
Thank you in advance.

how to deploy shiny app that uses local data

I'm deploying my shiny app and I don't know how to input a local dataset. I keep getting Error: object "data" not found. Here is my path to the shiny folder:
library(shinyapps)
shinyapps::deployApp('C:\\Users\\Jeremy\\Desktop\\jerm2')
In this directory (jerm2), I have 3 things: ui.R, server.R, and my local dataset, a .csv called proj.csv.
In the server.R file, I set:
data <- read.csv("proj.csv")
I just don't know how to get Shiny to pick up my dataset.
You may want to add a subdirectory in your shiny folder called "Data" and put proj.csv there.
Then, in your server.R, put:
data <- read.csv("./Data/proj.csv")
That will make it clear where the data is when the app is deployed to the ShinyApps service.
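A minimal server.R sketch showing where that line sits (the table output is just illustrative):
library(shiny)

# read the data relative to the app directory, which is the working
# directory when the app runs on shinyapps.io
data <- read.csv("./Data/proj.csv")

shinyServer(function(input, output) {
  output$preview <- renderTable(head(data))
})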
I ran into this same problem. It turned out that I did not have my working directory pointing to my Shiny app at the time I used shinyapps.io to save and deploy my app.
Be sure that, if you're loading data, the code reflects that your Shiny app directory is the working directory.
Otherwise you will get a log error that looks something like this:
cannot open compressed file 'C:/Users/Joseph/Documents/data/data.rda', probable reason 'No such file or directory'
What I did was to write the csv into a subfolder (i.e. data/) of the Shiny app directory and then add data <- read.csv("/Data/proj.csv") in server.R (as indicated in the answer). I didn't put the dot and it works.
Another thing: when you publish, don't forget to publish both the Shiny app and the files in the Shiny app folder, as sketched below.
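A hedged deployment sketch using the current rsconnect package (the successor to the old shinyapps package; the file list mirrors the question and is an assumption about what the bundle needs):
library(rsconnect)

# list the bundle explicitly so the data ships alongside the app code
deployApp("C:/Users/Jeremy/Desktop/jerm2",
          appFiles = c("ui.R", "server.R", "proj.csv"))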
