Additional information to ensure R code will run on another computer?

sessionInfo() includes very useful info that will improve the chances of someone being able to run your code on their machine, including
OS and version
R version
Attached packages
What other info can be provided with an R script to ensure someone else will be able to run it in their environment?
NB please include how to get that info (i.e. what command to run or where to look for it)

While this is not a complete answer, I tend to include this function with scripts I send along, as it will download any required package the computer does not already have. This is more of a suggestion for standalone scripts; for packages, you can explicitly declare in the DESCRIPTION file which versions of other packages your package depends on.
package_load <- function(packages = NULL, quiet = TRUE,
                         verbose = FALSE, warn.conflicts = FALSE) {
  # download required packages if they're not installed already
  pkgsToDownload <- packages[!(packages %in% installed.packages()[, "Package"])]
  if (length(pkgsToDownload) > 0)
    install.packages(pkgsToDownload, repos = "http://cran.us.r-project.org",
                     quiet = quiet, verbose = verbose)
  # then load them (looping over the names directly avoids 1:length() breaking on empty input)
  for (pkg in packages)
    require(pkg, character.only = TRUE, quietly = quiet,
            warn.conflicts = warn.conflicts)
}
## Example of use
package_load(c('dplyr', 'rgdal'))
This is helpful for one-off scripts, as it gets over the hurdle of a different computer not having the appropriate packages. However, I generally suggest to folks to make sure their version of R is up to date as well.
Is this the best solution? Probably not, but it does help with minor scripts you send along to others. For a larger code base, it would probably be better to put together a package or a Docker image.
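For a project that will be shared or rerun repeatedly, a lockfile-based approach can pin exact package versions. As a rough sketch (using the renv package, which is my suggestion here rather than something from the answer above):
# install.packages("renv")   # once per machine
renv::init()        # creates a project-specific library and renv.lock
# ...install and use packages as usual, then record their versions:
renv::snapshot()    # writes package names and versions to renv.lock
# on another computer, inside a copy of the project folder:
renv::restore()     # reinstalls the exact versions listed in renv.lock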

I think the criteria you listed are the "basics" of reusability for a script. The next level would be the possible interactions of your script with its environment (e.g. R Shiny scripts use web features, so stating the web browser and the version used to produce the script is good practice). Another kind of useful information is comments specifying the expected inputs and outputs.
NB: I would specify "attached packages and their versions", just to be sure...
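As a concrete example of recording that information (my own addition, not from the answer above), the session details can be written to a file that travels with the script:
# capture the R version, OS, and attached package versions at the time the script was run
writeLines(capture.output(sessionInfo()), "sessionInfo.txt")
# devtools::session_info() gives a similar, more detailed report, if devtools is installed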

Related

Searching for a way to use `linearKEuclid` and corresponding functions of `spatstat`

My goal is to analyse simple point patterns on linear networks with respect to Euclidean distance instead of the shortest-path distance implemented in linearK and related functions of spatstat and its sub-packages. Browsing the web I found the promisingly named function linearKEuclid() and related functions here.
Unfortunately, I could not bring those functions to life on my Windows machine, e.g. I ran into errors like this:
Error in xysegMcircle(Y$x, Y$y, D, df$x0, df$y0, df$x1, df$y1) :
object 'C_circMseg' not found
or
Error in tapply(stuff$sinalpha, list(ii, jj), harmonicsum) :
object 'harmonicsum' not found
There is always something missing. For me, this means simply copying missing functions from the web, if available, does not help.
Probably, a reason for this is that the functions are merely written for internal purposes and under internal development, see, for instance, here under "Details".
However, I am hoping for some recommendation that makes the fascinating code around linearKEuclid() runnable on my machine. Maybe someone can point me to a downloadable development version or something comparable. Many thanks in advance!
I understand your confusion; it is unnecessarily complicated to get this to work at the moment, since problems with another package on CRAN currently prevent spatstat and its subpackages from being updated. Indeed, you need to install a development version of spatstat.linnet and its dependencies. This is easiest if you have the remotes package installed (plus the tools needed to compile packages from source, which on Windows means RTools):
First run (in sequence):
remotes::install_github("spatstat/spatstat.random")
remotes::install_github("spatstat/spatstat.sparse")
remotes::install_github("baddstats/spatstat.explore")
remotes::install_github("baddstats/spatstat.model")
remotes::install_github("spatstat/spatstat.linnet")
Now the function should work (you may have to restart R if an old version of spatstat.linnet was already loaded when you updated). Try e.g. the example from the help file:
library(spatstat.linnet)
X <- rpoislpp(5, simplenet)
K <- linearKEuclid(X)

Are there any good resources/best-practices to "industrialize" code in R for a data science project?

I need to "industrialize" R code for a data science project, because the project will be rerun several times in the future with fresh data. The new code should be easy to follow, even for people who have not worked on the project before, and they should be able to redo the whole workflow quite quickly. Therefore I am looking for tips, suggestions, resources and best practices on how to achieve this objective.
Thank you for your help in advance!
You can make an R package out of your project, because a package has everything you need for a standalone project that you want to share with others:
Easy to share, download and install
R has a very efficient documentation system for your functions and objects when you work within RStudio. Combined with roxygen2, it lets you document every function precisely and makes the code clearer, since you can avoid inline comments (but please still add them where needed); see the short sketch after this list
You can specify quite easily which dependencies your package needs, so that everyone knows what to install for your project to work. You can also use packrat if you want to mimic Python's virtualenv
R also provides a long-format documentation system called vignettes, which are similar to a printed notebook: you can display code, text, code results, etc. This is where you write guidelines and methods on how to use the functions, provide detailed instructions for a certain method, and so on. Once the package is installed, vignettes are automatically included and available to all users.
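To illustrate the roxygen2 point, here is a minimal sketch of a documented function (the function itself is made up for the example); running devtools::document() turns the #' comments into the package's help pages:
#' Summarise a numeric column of a data frame
#'
#' @param df A data.frame containing the column to summarise.
#' @param column Name of the numeric column, given as a string.
#' @return A named numeric vector with the mean and standard deviation.
#' @export
summarise_column <- function(df, column) {
  x <- df[[column]]
  c(mean = mean(x, na.rm = TRUE), sd = sd(x, na.rm = TRUE))
}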
The only downside is the following: since R is a functional programming language, a package mainly consists of functions and some other relevant objects (data, for instance), but not really scripts.
More details about the last point: if your project consists of a script that calls a set of functions to do something, it cannot directly appear within the package. Two options here: (a) you write a dispatcher function that runs a set of functions to do the job, so that users just have to call one function to run the whole method (not great for maintenance); (b) you make the whole script appear in a vignette (see above). With the second method, people just have to write a single R file (which can be copy-pasted from the vignette), which may look like this:
library(mydatascienceproject)
library(...)
...
dothis()
dothat()
finishwork()
That enables you to execute the whole workflow from a terminal or a remote machine with Rscript, with the following command (using argparse to add arguments):
Rscript myautomatedtask.R --arg1 anargument --arg2 anotherargument
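For reference, a minimal sketch of what such a myautomatedtask.R could look like with the argparse package; the argument names are the placeholders from the command above, and I am assuming the workflow functions from the vignette example accept these values as arguments:
library(argparse)
library(mydatascienceproject)

# define the command-line interface
parser <- ArgumentParser(description = "Run the automated analysis")
parser$add_argument("--arg1", help = "first argument")
parser$add_argument("--arg2", help = "second argument")
args <- parser$parse_args()

# run the workflow with the parsed arguments
dothis(args$arg1)
dothat(args$arg2)
finishwork()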
And finally if you write a bash file calling Rscript, you can automate everything !
Feel free to read Hadley Wickham's book about R packages; it is super clear, full of best practices, and a great help when writing your own packages.
One can get lost in the multiple files in the project's folder, so it should be structured properly: link
Naming conventions that I use: first, second.
Set the random seed, so the outputs are reproducible.
Documentation is important: you can use the roxygen skeleton in RStudio (default Ctrl+Alt+Shift+R).
I usually separate the code into smaller, logically cohesive scripts, and use a main.R script that calls the others (see the sketch after this list).
If you use a special set of libraries, consider using packrat. Once you set it up, you can manage the installed project-specific libraries.
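A minimal sketch of such a main.R (the file names are made up for illustration):
# main.R -- single entry point that runs the smaller scripts in order
set.seed(20230101)          # fix the random seed so outputs are reproducible

source("01_load_data.R")    # read the fresh data
source("02_clean_data.R")   # cleaning / feature preparation
source("03_model.R")        # fit the models
source("04_report.R")       # write tables and figures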

Automatically respond "Yes" to R library prompt

I'm working to define a Docker container which can be spun up in a cloud environment and run some reporting on our firm's database and spin itself down, with as little involvement from our data science team (including myself) as possible.
I'm pretty much done getting everything up and running, with one irritating exception: the reporting is done in R using some code that we've been using for a few years. I'm building on top of the Rocker verse image, and I'm adding the needs library.
The annoying thing (in this use case) about needs is that when it is first run, it asks the following:
>library('needs')
Should `needs` load itself when it's... needed? (this is recommended)
1: No
2: Yes
Selection:
In a typical interactive setting this is fine, I just type "Yes" and hit enter and I'm good to go. However, when I want the whole environment to build and run once a week on its own, I don't want to have to answer this question. I'd like it to assume Yes.
What I've tried so far includes each of these:
library('needs', quiet=TRUE)
library('needs', quietly=TRUE)
suppressMessages(library('needs', quietly=TRUE))
suppressWarnings(suppressMessages(library('needs', quietly = T)))
suppressPackageStartupMessages(library('needs', quietly=TRUE))
none of which solves the issue. The needs documentation provides for changing this setting later in a programmatic way, but not for defining the setting when first running needs:
Recommended use is to allow the function to autoload when prompted the
first time the package is loaded interactively. To change this setting
later, run needs:::autoload(TRUE) or needs:::autoload(FALSE) to turn
autoloading on or off, respectively.
I've also tried quietly installing needs, also to no avail. Unfortunately, I can't run bash commands in my Dockerfile to respond Yes, or at least I haven't found a way.
I'd like to avoid removing dependencies for needs, as it will involve a LOT of code refactoring.
Any ideas on how to solve this?
Thank you! :]
-Vince
Update
The solution is a bit hacky, but in my Dockerfile I edit the file that needs assigns to the sysfile variable:
sysfile <- system.file("extdata", "promptUser", package = "needs")
which for me was /usr/local/lib/R/site-library/needs/extdata/promptUser, and changing its contents from "1" to "0" solved my problem.
A better solution would probably be to make it so it doesn't ask the question in the first place. You can view the code it runs on package load on github: https://github.com/cran/needs/blob/master/R/needs-package.R
If you set the option it checks for beforehand, then it doesn't need to ask the question in the first place:
options(needs.promptUser = FALSE)
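In a Docker or other non-interactive setup, that option has to be set before needs is loaded, for example in .Rprofile or the site-wide Rprofile.site (the exact path depends on the image; this is a sketch of one way to do it, not the only one):
# in .Rprofile / Rprofile.site, read before any package is loaded
options(needs.promptUser = FALSE)

# loading the package no longer triggers the interactive prompt
library(needs)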

Convenient way to load (and if needed install) a package in R

A user can work on many PCs. A good code runs no matter what PC it is running on. Assuming one does not want to rely on preference and option files, what is the best way to make sure a package is loaded (and installed if needed).
The library command is cool, but the require command is much better. Even require, though, is not getting the job done.
Triggering a re-install that is not needed (e.g., in RStudio) causes an interesting prompt to restart the R session, which is why unnecessary installs are best avoided.
One possible trick (A) is to do this (to avoid typing the package name too often):
doInstall <- TRUE
toInstall <- c("downloader")
if (doInstall) install.packages(toInstall)
lapply(toInstall, library, character.only = TRUE)
or a worse trick (B) would be
if (!require(downloader)) {install.packages("downloader"); require(downloader)}
Is there a "2015 way" of doing it with one command - something like
justdoitall(c("downloader","dplyr"))
Here is an example of installing package zipcode using the pacman approach.
if (!require("pacman")) install.packages("pacman")
pacman::p_load(zipcode)
Assuming one does not want to rely on preference and option files
That rules out putting anything in .Rprofile or using external packages, so we're stuck with base R to solve your problem. If that's the case, then the answer is that you can't do much better than what you have written in your question (I prefer B to A).
If you're willing to bend a little bit and require the user to load a package first (which could be done on startup by using .Rprofile), there are a few options that do exactly what you want.
installr::require2 and pacman::p_load do what you ask. Disclosure: I am an author/maintainer of pacman. I agree with your sentiment that we shouldn't rely on options or external files, especially if we plan on sharing the code. I use pacman pretty much every day (it has much more use than just installing/loading packages), but for the most part these types of functions should be treated as useful for interactive use; if you want portable, shareable code without worrying about whether packages will be available, you will have to resort to something along the lines of what you have in your question.
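For completeness, the one-command usage the question asks for looks like this with pacman (it installs anything that is missing, then loads everything):
pacman::p_load(downloader, dplyr)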

emacs ess crashes when trying to access help

My emacs/ess session crashes when I try to access help. This happens if I have two packages loaded with the same functions; for example:
library(lubridate)
library(data.table)
?month
In RGui, an interface pops up and asks me to choose which package I want help from. Emacs just crashes. A similar issue happens with install.packages, but there is a way to specify the mirror (see: Is there a way to install R packages using emacs?).
Is there a similar trick with help?
Well, there is no foolproof solution for the time being, as nobody really understands why these crashes happen. I assume you are on Windows, right?
There are plans in ESS to completely internalize all the help (and other) calls in order not to depend on R dialogs. Hopefully in the next version.
For the time being, put this into your .Rprofile:
tis <- utils:::index.search
formals(tis)[["firstOnly"]] <- TRUE
assignInNamespace("index.search", tis, "utils")
It basically makes the help system pick the first package in which the topic is found. In your case, the month help page in data.table will be ignored. Not a big deal, as clashing topic names are quite rare anyway.
I found out that calling library(tcltk) solves this problem. The menu appears even when help is called from emacs+ess. I added library(tcltk) to my Rprofile.site and now everything works great, both install.packages() and accessing help when multiple packages load the same function.
