How to pass environment variables to shinyapps - r

I want to pass secure parameters to a shinyapps.io deployment so that my application can read them via:
Sys.getenv('PASSWORD_X')
I cannot find anything for this in the deployApp function of the rsconnect package.

You can use Renviron.site or .Renviron to store private data and access it from your Shiny application (see Hadley Wickham's recommendations and instructions, reproduced below).
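For shinyapps.io specifically, a commonly suggested pattern is to keep a .Renviron file in the application directory itself, so that rsconnect::deployApp() includes it in the deployment bundle and the variables are available to the deployed app. A sketch, with a placeholder value; verify the file really ends up in your bundle (e.g. via the appFiles argument of deployApp):
# .Renviron, placed in the app directory next to app.R
# (keep it out of version control, e.g. via .gitignore):
#
#   PASSWORD_X=my-secret-value

# In the app, read the value at runtime:
password <- Sys.getenv("PASSWORD_X")
if (!nzchar(password)) stop("PASSWORD_X is not set")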
Solution:
Storing API Authentication Keys/Tokens (Attribution: Hadley Wickham)
If your package supports an authentication workflow based on an API key or token, encourage users to store it in an environment variable. We illustrate this using the github R package, which wraps the Github v3 API. Tailor this template to your API + package and include in README.md or a vignette.
1. Create a personal access token in the Personal access tokens area of your GitHub personal settings. Copy the token to the clipboard.
2. Identify your home directory. Not sure? Enter normalizePath("~/") in the R console.
3. Create a new text file. If in RStudio, do File > New File > Text file.
4. Create a line like this:
GITHUB_PAT=blahblahblahblahblahblah
where the name GITHUB_PAT reminds you which API this is for and blahblahblahblahblahblah is your personal access token, pasted from the clipboard.
5. Make sure the last line in the file is empty (if it isn't, R will silently fail to load the file). If you're using an editor that shows line numbers, there should be two lines, where the second one is empty.
6. Save the file in your home directory with the filename .Renviron. If questioned, YES you do want to use a filename that begins with a dot. Note that by default dotfiles are usually hidden, but within RStudio the file browser will make .Renviron visible and therefore easy to edit in the future.
7. Restart R. .Renviron is processed only at the start of an R session.
8. Use Sys.getenv() to access your token. For example, here's how to use your GITHUB_PAT with the github package:
library(github)
ctx <- create.github.context(access_token = Sys.getenv("GITHUB_PAT"))
# ... proceed to use other package functions to open issues, etc.
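If you'd rather not restart R after editing the file, you can usually reload it into the current session (a convenience not mentioned in the recipe above):
readRenviron("~/.Renviron")  # re-read ~/.Renviron into the running session
Sys.getenv("GITHUB_PAT")     # confirm the token is now visible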
FAQ: Why define this environment variable via .Renviron instead of in .bash_profile or .bashrc?
Because there are many combinations of OS and ways of running R where the .Renviron approach “just works” and the bash stuff does not. When R is a child process of, say, Emacs or RStudio, you can’t always count on environment variables being passed to R. Put them in an R-specific start-up file and save yourself some grief.

Related

Setting R's default behaviour in .Rprofile and .Renviron. Understanding what to do where

I am trying to understand the use of the files .Renviron and .Rprofile. If I understand correctly, .Rprofile is a sort of startup script, sourced as R code, that sets the options and environment variables a user may want either all the time or for a specific project. On the other hand, .Renviron is loaded before .Rprofile and defines environment variables only.
By design, R will load either the user-level or the project-level .Renviron and .Rprofile files, but it won't load both: if project-specific files are defined, only those are loaded. That said, some libraries and functions would be prudent to put in the user-level .Rprofile, as I need them pretty much all the time (e.g. I use dplyr syntax a lot), while at the same time I'd like to load project-specific libraries and functions as well.
The purpose of the .Renviron file is more elusive to me. From what I understand, its purpose is to store environment variables such as passwords, API keys, etc. However, I can also set environment variables in .Rprofile using Sys.setenv(). For example, I have this environment variable set in a project's .Rprofile, to use parallelization with the package below:
Sys.setenv(OMP_NUM_THREADS=parallel::detectCores())
library(OpenMx)
Since .Renviron doesn't contain R code, my assumption is I could've put this line in a .Renviron file with the following syntax:
OMP_NUM_THREADS=[number of cores]
However, I find little useful information on how to set environment variables in .Renviron, and what is advisable to put here.
My questions therefore are:
How can I load both the user-level and project-level .Renviron and .Rprofile files when working in a project?
What environment variables would I typically put in .Renviron? (Any list or tutorials on how to set variables would be appreciated.)
In which cases would it be recommended to add environment variables to .Renviron over using Sys.setenv() in .Rprofile, and vice versa?
However, I can also set environment variables in .Rprofile using Sys.setenv().
"Yes but" these can under standard POSIX behaviour not alter the running process for which the variables have to be set before.
I, just like you, tried to get by for as long as I could with only ~/.Rprofile (or even just Rprofile.site for the whole machine), but eventually added things to .Renviron for:
- R_LIBS_USER set to "" as I prefer not to have installations below ~
- R_MAX_NUM_DLLS, which has to be here because it is read at startup, before .Rprofile runs
- plus a few tokens for services
- plus a reticulate option
- plus an R CMD check option
So in some cases you do in fact have to use .Renviron (or Renviron.site).
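Put together, such a ~/.Renviron might look like the sketch below. The concrete values are illustrative assumptions, not the answerer's actual file:
# ~/.Renviron -- one NAME=value pair per line, no R code allowed
R_LIBS_USER=""
R_MAX_NUM_DLLS=300
GITHUB_PAT=blahblahblahblahblahblah
RETICULATE_PYTHON=/usr/bin/python3
_R_CHECK_FORCE_SUGGESTS_=false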

Set R_USER for multiple users under Windows

Windows seems to put R libraries in a OneDrive directory by default if OneDrive is in use. This is undesirable, especially if you're using both R and OneDrive on multiple computers with the same OneDrive account.
How would I set my library to be put inside of C:\Users\<username>\Documents instead of C:\Users\<username>\OneDrive\Documents? There are good solutions here (How do I change the default library path for R packages), but they're mostly focused on solving this for a single Windows account. Is there a general way to solve it for all accounts?
Every R installation has an etc/ directory with configuration; in it you can set Rprofile.site or, easier still, Renviron.site.
Files ending in .site should not get overwritten on the next install. Make sure you don't bulk-delete them, though.
You can query where it is via R.home("etc"). On my (Linux) system:
> R.home("etc")
[1] "/usr/lib/R/etc"
>
Really excellent solution from here (https://github.com/r-windows/docs/issues/3):
just create an Renviron.site file in the etc/ folder of your R installation, then copy the following line into it:
R_USER=C:/Users/${USERNAME}/Documents
This sets R_USER, which in turn determines R_LIBS_USER, according to the user directory of each account under Windows 10.
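A quick way to check the result after restarting R (a sketch; the exact paths will differ per machine):
Sys.getenv("R_USER")  # should now be C:/Users/<username>/Documents
.libPaths()           # the user library should sit below that directory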

Output from R package to user's home filespace without being "malicious or anti-social"

I'm writing a package to generate a PDF of bridge card hands. While intermediate files are saved to temporary files and then unlinked, the final PDF needs to be kept so that it can be printed or added to a collection. This contradicts the CRAN Repository Policy:
Packages should not write in the user’s home filespace (including clipboards), nor anywhere else on the file system apart from the R session’s temporary directory (or during installation in the location pointed to by TMPDIR: and such usage should be cleaned up). Installing into the system’s R installation (e.g., scripts to its bin directory) is not allowed.
How can the code be both compliant and the PDF available to the user?
Thank you,
TC
The function could just accept a filename (so the user can put the PDF in their home directory if they want to), and if none is given, you could write to a temporary PDF file and verbosely say where you put it.
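A minimal sketch of that idea (the function name make_hands_pdf and the plotting call are placeholders, not from the original package):
# Write the PDF to a user-supplied path; default to the session's
# temporary directory, which is what CRAN policy permits.
make_hands_pdf <- function(file = NULL) {
  if (is.null(file)) {
    file <- tempfile(fileext = ".pdf")
    message("No output file supplied; writing PDF to ", file)
  }
  pdf(file)    # open the PDF graphics device
  plot(1:13)   # placeholder for the real hand diagrams
  dev.off()
  invisible(file)
}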

Set an .Rmd in a package to write files to the current project working directory

I have a .Rmd which I use to report on data quality in a number of different R projects. It splits the data to remove subsets with missing data, and interpolates missing results where appropriate. It does this via a write.csv command to a file path of the form "./Cleansed_data/".
To make an example:
1. Open RStudio.
2. Go to the RHS 'Project' menu, and select and make a new project wherever you'd like.
3. Go to the LHS 'new script' drop-down and select 'new .Rmd'.
4. Change the output to .pdf and hit OK.
5. In the last R chunk, include write.csv(mtcars, file = "mtcars.csv").
6. Hit the 'Knit PDF' button, save the report as "writeFile.Rmd" to your project working directory, and let it run.
Previously I moved this .Rmd from place to place, but now I would like to build it into an internal package. I have included it (as the documentation indicates) in inst/rmd within the package directory.
To do this, build or open any package you have access to, then:
1. Add the file to inst/rmd (create the folder if it doesn't exist).
2. Rebuild the package.
I then rebuild the package and open a new project. I load my new package and attempt to render the document via the render command, using system.file to locate the .Rmd, like so:
rmarkdown::render(input = system.file("rmd/writeFile.Rmd", package = "MyPackage"),
                  output_file = "writeFile.pdf", output_dir = "./Cars/")
This renders the report from the package build into the folder given by output_dir; however, there are a number of pitfalls here. First, if I omit the output_dir argument, the report renders into the package library, usually located under the R installation's library on the C: drive. This, however, is fixable.
What I can't get around is that when the .Rmd hits write.csv(), the document is (I believe) being rendered in the package environment, whose working directory is the package library folder, not the current project directory.
The Questions
How can I inform the template in the package what the current working directory of the RStudio project is? I'm vaguely aware there is an rstudioapi package, but I have nearly no understanding of what it does, or whether it would provide a solution.
If this is either outright impossible or just potentially a very bad idea, how can I modify the workflow to retrieve a number of R object outputs into the environment or the working directory on the call to the report, without having to modify the report for each different project? Further, why specifically is this approach such a bad plan?
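One workaround worth noting (a sketch, not from the original thread): rmarkdown::render() has a knit_root_dir argument that sets the working directory used while evaluating the chunks, so relative paths such as "./Cleansed_data/" resolve against the calling project rather than the package library:
rmarkdown::render(input = system.file("rmd/writeFile.Rmd", package = "MyPackage"),
                  output_file = "writeFile.pdf", output_dir = "./Cars/",
                  knit_root_dir = getwd())  # evaluate chunks in the project directory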
In order to close this off:
I have decided to keep the .Rmd included in the package. The .Rmd files need to move and be versioned with the package, as the package holds the functions they use to run.
To meet my requirements, I style the documents to grab the project directory via the rstudioapi, in the form:
write.csv(mtcars, file = file.path(rstudioapi::getActiveProject(), "mtcars.csv"))
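A slightly more defensive variant (a sketch, assuming the rstudioapi package is installed; it falls back to the current working directory outside RStudio or outside any project):
out_dir <- if (rstudioapi::isAvailable() && !is.null(rstudioapi::getActiveProject())) {
  rstudioapi::getActiveProject()
} else {
  getwd()
}
write.csv(mtcars, file = file.path(out_dir, "mtcars.csv"))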
Having tested @CL's answer, this also runs and is not dependent on RStudio as an IDE; however, I know that these documents will:
- always be accessed via the RStudio IDE
- always be accessed from within a specific project
I fear (though have not tested) that there could be other impacts from artificially booting the file into a different working directory. Potentially this could affect things like child documents I might want to include later, or other code that needs to resolve paths relative to the package installation rather than the project. In this way I think (if I interpreted Yihui correctly) the R doc is still the centre of its own universe. It just writes its data into another one :)

Providing R script templates with my package and opening them easily

I have some scripts I want to provide to the user together with my package.
The normal way is to put them into the inst folder.
My problem is now: I want to allow the user to copy the scripts into the local directory, so they can change the default settings and execute the scripts locally.
Is there a comfortable way to open these provided scripts? Can I create links to the scripts within the package vignettes? Or does anyone have a better solution than a complicated command like this:
file.copy(system.file('test.R', package = 'MyPackage'), '.')
Especially in this case, the user has to know the script name.
Thanks for any kind of suggestion.
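One comfortable option is for the package to export a small helper that copies a bundled script into the working directory and opens it for editing. A sketch (edit_template and its arguments are hypothetical names, not an existing API):
# Copy a script shipped in inst/ into the working directory and open it.
edit_template <- function(name = "test.R", pkg = "MyPackage") {
  src <- system.file(name, package = pkg)
  if (src == "") stop("Template '", name, "' not found in package '", pkg, "'")
  file.copy(src, name, overwrite = FALSE)
  utils::file.edit(name)  # opens the copy in the user's editor
}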
