Temporarily ignore certain libraries in R

I am currently working on a continuous integration setup for R packages developed by our company. We have one Jenkins project for each R package and a corresponding library for each project.
I have already defined logic that installs all dependencies of the package into the project library. Now I want to define a check stage which basically runs
devtools::check("${PROJECT_DIR}/pkg")
but only use the project library for dependencies. I tried to use the callr package in the following manner.
callr::r(
  function(...) {
    devtools::check(...)
  },
  args = list("${PROJECT_DIR}/pkg"),
  libpath = "${PROJECT_DIR}/lib"
)
However, the check process is still able to find packages which are not installed in libpath. Is there a way to make sure that only "${PROJECT_DIR}/lib" is used during the build stage?
So far, I have tried the following, to no avail:
- callr::r() with the libpath argument
- withr::with_libpaths() with the new argument
- looking through the documentation of devtools::check and R CMD build for appropriate parameters
- using .libPaths("${JOB_DIR}/lib")
Here is a reprex to explain the unexpected behavior of callr. I expect an error in line 3.
find.package("ggplot2", .libPaths()[1])
#> Error in find.package("ggplot2", .libPaths()[1]): there is no package called 'ggplot2'
callr::r(function() { ggplot2::vars() }, libpath = .libPaths()[1])
#> named list()
find.package("ggplot2", .libPaths()[2])
#> [1] "/data/R/3.5.3/lib/R/library/ggplot2"
callr::r(function() { ggplot2::vars() }, libpath = .libPaths()[2])
#> named list()

According to this question, there is a way to achieve this with base::assign. If there is a more proper solution, I would love to hear it.
callr::r(function() {
  assign(".lib.loc", .libPaths()[1], envir = environment(.libPaths))
  ggplot2::vars()
})
#> Error in loadNamespace(name): there is no package called ‘ggplot2’
The issues I have here are twofold:
- It is basically a hack and can break any time if the internals of .libPaths() change.
- I might have to modify .Library and .Library.site (internals of .libPaths()) as well in order to make sure that devtools::check is affected appropriately.
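As an aside to the approaches above, one direction worth sketching (an untested assumption, not something from the original post): callr::r() also takes an env argument, so the R_LIBS* startup variables can be pointed at the project library in addition to libpath. A minimal base-R sketch, with project_lib standing in for "${PROJECT_DIR}/lib":

```r
# Sketch: point every R_LIBS* startup variable at the project library
# so the child process has no user/site library to fall back on.
# `project_lib` is a stand-in for "${PROJECT_DIR}/lib".
project_lib <- file.path(tempdir(), "lib")
dir.create(project_lib, showWarnings = FALSE)

paths <- callr::r(
  function() .libPaths(),
  libpath = project_lib,
  env = c(
    R_LIBS      = project_lib,
    R_LIBS_USER = project_lib,
    R_LIBS_SITE = project_lib
  )
)
print(paths)
# Note: the system library (.Library) is always appended by R itself,
# so base packages still load; third-party packages installed outside
# project_lib should now fail to be found.
```

Whether this is enough for devtools::check depends on how the check subprocesses inherit the environment, so treat it as a starting point rather than a guarantee.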

This might be slightly off topic, but have you considered using docker for this use case?
You can define a Dockerfile, referenced in your Jenkinsfile, which defines a custom image for each CI job that runs. You install the packages onto the Docker container using devtools::install() within Jenkins. The container then gets tossed when the CI is done.
With this approach you don't have to worry about manually installing the packages yourself when you run your CI, and don't have to worry about conflicting namespaces across different packages.
This definitely has a higher start-up cost, but I think you'll find it will be worth it in the long run for testing your R packages. Source: I also test internal R packages at my job.
sample Dockerfile
FROM docker.io/rocker/r-base
USER root
# Install packages needed for package development
RUN R -e 'install.packages(c("devtools", "rmarkdown", "testthat", "roxygen2"))'
You then reference this Dockerfile in the Jenkinsfile in order to install, test, and check the package (pipeline example below)
agent {
    dockerfile {
        args '-u root'
    }
}
stages {
    stage('Install') {
        steps {
            sh 'r -e "devtools::install()"'
        }
    }
    stage('Test') {
        steps {
            sh '''
            r -e "options(testthat.output_file = 'test-out.xml'); devtools::test(reporter = 'junit')"
            '''
            junit 'test-out.xml'
        }
    }
    stage('Check') {
        // Might need to modify expected output, depends on devtools version
        steps {
            sh '''
            testOutput=$(R -e "devtools::check(args='--no-tests')" 2>&1)
            echo "${testOutput}" | grep -q "0 errors ✔ | 0 warnings ✔ | 0 notes ✔"
            '''
        }
    }
}
}
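As an alternative to grepping the console output in the Check stage, devtools 2.x returns an rcmdcheck result object whose fields can be inspected directly; the field names below ($errors, $warnings) come from rcmdcheck and should be verified against the installed devtools version. A sketch, illustrated with a mocked result:

```r
# Sketch: derive a shell-style exit code from a check result instead of
# grepping console output. Assumes devtools >= 2.0, where check()
# returns an rcmdcheck object with $errors/$warnings character vectors.
check_status <- function(res) {
  ok <- length(res$errors) == 0 && length(res$warnings) == 0
  if (ok) 0L else 1L
}

# In the Check stage one would run something like:
#   res <- devtools::check(args = "--no-tests", quiet = TRUE)
#   quit(save = "no", status = check_status(res))

# Illustration with mocked results:
print(check_status(list(errors = character(0), warnings = character(0))))
print(check_status(list(errors = "checking ... ERROR", warnings = character(0))))
```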

Related


"R CMD check" throws warning on use of 'devtools::test()', but allows 'test()', though I need to use the full function name

I'm running my package through R CMD check and the only (remaining) warning is the following:
W checking for unstated dependencies in 'tests' (4.4s)
'::' or ':::' import not declared from: 'devtools'
After getting confused for a long time about this seemingly nonsensical warning, I realized it's coming from my "test manager" script (see reason for its need below). This is file pkg/tests/testthat.R, while the tests themselves are in pkg/tests/testthat/.
# testthat.R
sink(stderr(), type = "output")
x <- tryCatch(
  {
    x <- data.frame(devtools::test()) # here's the problem!
    as.numeric(sum(x$failed) > 0)
  },
  error = function(e) {
    1
  }
)
sink(NULL, type = "output")
cat(1)
If I comment out this entire file, the R CMD check warning vanishes.
And then the weird part: if I replace devtools::test() with just test(), the R CMD check warning vanishes.
However, the purpose of this "manager" script is to be called (via Rscript) by a git pre-commit hook. This way, I can run all my tests to ensure the commit is stable. Because of this, I can't use test(), since devtools isn't loaded when the script is run via Rscript.
I tried a few things to satisfy both R CMD check and being called by Rscript:
Using library(devtools) doesn't work (throws a package not found error);
Moving testthat.R out of the /tests/ folder and into the top-level. This kills the R CMD check warning, but it now instead throws a note: Non-standard file/directory found at top level: 'testthat.R', so not exactly satisfactory (especially since keeping it in the /tests/ directory seems more logically consistent);
Testing for a function which has apparently been loaded by R CMD check, to determine behavior. Since using a naked test() works, I assumed devtools was loaded, so I prepended the following to the file (and used runTests on the problematic line). The logic being: if we can find test(), use it; if we can't, then this probably isn't R CMD check, so we can use the full name.
if (length(find("test")) == 0) {
  runTests <- devtools::test()
} else {
  runTests <- test()
}
Unfortunately, this just made things worse: the warning remains and we also get an error on the if-else block:
> if (length(find("test")) == 0) {
+ runTests <- devtools::test()
+ } else {
+ runTests <- test()
+ }
Error in loadNamespace(name) : there is no package called 'devtools'
Calls: :: ... loadNamespace -> withRestarts -> withOneRestart -> doWithOneRestart
Why devtools::test() throws an error here and just a warning on the problematic line is beyond me.
Similarly, using testthat::skip() also doesn't work.
So, what can I do to satisfy both R CMD check and being called by Rscript? Is there a way to tell R CMD check to ignore this file?
For the record, this is my git pre-commit hook, in case it can be reformulated to solve this problem some other way
#!/bin/sh
R_USER="D:/Users/wasabi/Documents"
export R_USER

# check that Rscript is accessible via PATH; fail otherwise
command -v Rscript >/dev/null || {
    echo "Rscript must be accessible via PATH. Commit aborted."
    exit 1
}

# check whether there are unstaged changes. If so, stash them.
# This allows the tests to run only on previously committed or
# indexed (added on this commit) changes.
hasChanges=$(git diff)
if [ -n "$hasChanges" ]; then
    git stash push --keep-index
fi

exitCode=$(Rscript tests/testthat.R)

# remember to unstash any unstaged changes
if [ -n "$hasChanges" ]; then
    git stash pop
fi

exit $exitCode
The solution is to simply add tests/testthat.R to .Rbuildignore (either by hand in the form of a regular expression or using usethis::use_build_ignore("tests/testthat.R")).
If you actually run R CMD check, the warning will still appear (since it runs on the source files, and therefore ignores .Rbuildignore, unless you run it on the binary itself).
But the "Check Package" command in RStudio relies on devtools::check(), which builds the package first and then checks the binary, therefore not getting the error. And since that's how my team and I will actually be running the checks, it's sufficient.
Solution inspired by this question.
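For reference, the entry that usethis::use_build_ignore("tests/testthat.R") writes to .Rbuildignore should look roughly like the following (a sketch; .Rbuildignore entries are Perl-compatible regular expressions matched against file paths, hence the escaped dot):

```
^tests/testthat\.R$
```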

RStudio Connect, Packrat, and custom packages in local repos

We have recently got RStudio Connect in my office. For our work, we have made custom packages, which we have updated amongst ourselves by opening the project and build+reloading.
I understand the only way I can get our custom packages to work within apps with RSConnect is to set up a local repo and set our options(repos) to include this.
Currently I have the following:
library(drat)
RepoAddress <- "C:/<RepoPath>" # High level path
drat::insertPackage(<sourcePackagePath>, repodir = RepoAddress)
# Add this new repo to R's knowledge of repos.
options(repos = c(options("repos")$repos,LocalCurrent = paste0("file:",RepoAddress)))
# Install <PackageName> from the local repo :)
install.packages("<PackageName>")
Currently this works nicely and I can install my custom package from the local repo. This indicates to me that the local repo is set up correctly.
As an additional aside, I have changed the DESCRIPTION file to have an extra line saying repository:LocalCurrent.
However, when I try to deploy a Shiny app or Rmd which references <PackageName>, I get the following error on my deploy:
Error in findLocalRepoForPkg(pkg, repos, fatal = fatal) :
  No package '<PackageName>' found in local repositories specified
I understand this is a problem with packrat being unable to find my local repo during the deploy process (I believe at a stage where it uses packrat::snapshot()). This is confusing, since I would have thought packrat would use my options("repos") repos, similar to install.packages. If I follow through the functions, I can see the particular point of failure is packrat:::findLocalRepoForPkg("<PackageName>", repos = packrat::get_opts("local.repos")), which fails even after I define packrat::set_opts("local.repos" = c(CurrentRepo2 = paste0("file:", RepoAddress))).
If I drill into packrat:::findLocalRepoForPkg, it fails because it can't find a file/folder called: "C://". I would have thought this is guaranteed to fail, because repos follow the C://bin/windows/contrib/3.3/ structure; at no point would a repo have the structure it is looking for.
I think this last part is showing I'm materially misunderstanding something. Any guidance on configuring my repo so packrat can understand it would be great.
One should always check which options RStudio Connect supports at the moment:
https://docs.rstudio.com/connect/admin/r/package-management/#private-packages
Personally I dislike all the options for including local/private packages, as it defeats the purpose of having a nice, easy target for deploying shiny apps. In many cases I can't just set up local repositories in the organization, because I do not have clearance for that. It is also inconvenient that I have to email IT support to make them manually install new packages. Overall I think RStudio Connect is a great product because it is simple, but when it comes to local packages it really is not.
I found a nice alternative/hack to the RStudio official recommendations. I suppose this would also work with shinyapps.io, but I have not tried. The solution goes like this:
(1) add to global.R: if (!require(local_package)) devtools::load_all("./local_package")
(2) write a script that copies all your source files, such that you get a shiny app with a source directory for a local package inside; you could call the directory ./inst/shinyconnect/ or whatever, and the local package would be copied to ./inst/shinyconnect/local_package
(3) add the script ./shinyconnect/packrat_sees_these_dependencies.R to the shiny folder; this will be picked up by the packrat manifest
(4) hack rsconnect/packrat to ignore specifically named packages when building
(1)
# start of global.R...

# load more packages for shiny
library(devtools)  # needed for load_all
library(shiny)
library(htmltools) # or whatever you need

# load already built local_package or, for shiny connect, pseudo-build on-the-fly and load
if (!require(local_package)) {
  # if local_package is here, just build it in 2 seconds with devtools::load_all()
  if (file.exists("./DESCRIPTION")) load_all(".") # for local test on PC/Mac, where the shinyapp is inside the local_package
  if (file.exists("./local_package/DESCRIPTION")) load_all("./local_package/") # for shiny connect, where local_package is inside the shinyapp
}
library(local_package) # now local_package must load
(3)
Make a script loading all the dependencies of your local package. Packrat will see this. The script will never actually be executed. Place it at ./shinyconnect/packrat_sees_these_dependencies.R
# these code lines will be recognized by packrat and the packages will be added to the manifest
library(randomForest)
library(MASS)
library(whateverpackageyouneed)
(4) During deployment, the manifest generator (packrat) will ignore the existence of any package named local_package. This is an option in packrat, but rsconnect does not expose it. A hack is to load rsconnect into memory and modify the sub-sub-sub-function performPackratSnapshot() to allow this. In the script below, I do that and deploy a shiny app.
library(rsconnect)
orig_fun <- getFromNamespace("performPackratSnapshot", pos = "package:rsconnect")

# packages you want to include manually, and packrat to ignore
ignored_packages <- c("local_package")

# hijack rsconnect
local({
  assignInNamespace(
    "performPackratSnapshot",
    value = function(bundleDir, verbose = FALSE) {
      owd <- getwd()
      on.exit(setwd(owd), add = TRUE)
      setwd(bundleDir)
      srp <- packrat::opts$snapshot.recommended.packages()
      packrat::opts$snapshot.recommended.packages(TRUE, persist = FALSE)
      packrat::opts$ignored.packages(get("ignored_packages", envir = .GlobalEnv)) # ignoring packages mentioned here
      print("ignoring following packages")
      print(get("ignored_packages", envir = .GlobalEnv))
      on.exit(packrat::opts$snapshot.recommended.packages(srp, persist = FALSE), add = TRUE)
      packages <- c("BiocManager", "BiocInstaller")
      for (package in packages) {
        if (length(find.package(package, quiet = TRUE))) {
          requireNamespace(package, quietly = TRUE)
          break
        }
      }
      suppressMessages(packrat::.snapshotImpl(
        project = bundleDir,
        snapshot.sources = FALSE, fallback.ok = TRUE, verbose = FALSE,
        implicit.packrat.dependency = FALSE
      ))
      TRUE
    },
    pos = "package:rsconnect"
  )
}, envir = as.environment("package:rsconnect"))
new_fun <- getFromNamespace("performPackratSnapshot", pos = "package:rsconnect")

rsconnect::deployApp(appDir = "./inst/shinyconnect/", appName = "shinyapp_name", logLevel = "verbose", forceUpdate = TRUE)
The problem is one of nomenclature.
I have set up a repo in the CRAN sense. It works fine and is OK. When packrat references a local repo, it is referring to a local git-style repo.
This explains why findLocalRepoForPkg doesn't look like it will work: it is designed to work with a different kind of repo.
Feel free to also reach out to support@rstudio.com
I believe the local package code path is triggered in packrat because of the missing Repository: value line in the Description file of the package. You mentioned you added this line, could you try the case-sensitive version?
That said, RStudio Connect will not be able to install the package from the RepoAddress as you've specified it (hardcoded on the Windows share). We'd recommend hosting your repo over https from a server that both your Dev environment and RStudio Connect have access to. To make this type of repo setup much easier we just released RStudio Package Manager which you (and IT!) may find a compelling alternative to manually managing releases of your internal packages via drat.
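When debugging a setup like this, one quick sanity check is whether base R can read the repo index at all, independent of packrat. A sketch using only base functions, with the placeholder paths from the question (an unreachable placeholder simply yields an empty index plus a warning):

```r
# Sketch: ask base R for the package index of the drat repo the same
# way install.packages() would. "<RepoPath>"/"<PackageName>" are the
# placeholders from the question.
RepoAddress <- "C:/<RepoPath>"
repo_url <- paste0("file:", RepoAddress)

# available.packages() reads the PACKAGES index from <repo>/src/contrib
# (source) or bin/windows/contrib/<R-version> (Windows binaries).
ap <- suppressWarnings(available.packages(repos = repo_url, type = "source"))
found <- "<PackageName>" %in% rownames(ap)
print(found)
```

If found comes back FALSE against the real repo path, the PACKAGES index is missing or in the wrong subdirectory, which points at the repo layout rather than at packrat.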

Private Bitbucket package dependency in R package

Summary
When creating a package, I can list CRAN dependencies in the Depends list in the DESCRIPTION file. This documentation outlines how to list Bitbucket dependencies, eg. Remotes: bitbucket::sulab/mygene.r#default.
However, I don't know how to do this when authentication is needed to access the repository.
Attempt
I've tried putting the following code into the main packagename.R file. The function contents work fine as a snippet at the top of a standalone script:
.onLoad <- function(libname, pkgname) {
otherPackageVersion <- "1.0"
if (suppressWarnings(suppressPackageStartupMessages(require("otherPackageName", quietly = TRUE, character.only = TRUE)))) {
if (installed.packages()[installed.packages()[,"Package"] == "otherPackageName", "Version"] != otherPackageVersion) {
remove.packages("otherPackage")
devtools::install_bitbucket(sprintf("bitbucketUser/otherPackageName#%s", otherPackageVersion), auth_token = Sys.getenv("BITBUCKET_PAT"))
}
} else {
devtools::install_bitbucket(sprintf("bitbucketUser/otherPackageName#%s", otherPackageVersion), auth_token = Sys.getenv("BITBUCKET_PAT"))
}
}
but R CMD check fails saying it cannot be loaded after hanging for a while:
checking whether package ‘packageName’ can be installed ... ERROR
Installation failed.
Further Detail
The version of devtools I have loaded is 1.12.0.9000 (see this Github thread) which I installed using devtools::install_github("hadley/devtools#1220"). This allows me to install private Bitbucket R packages using an App Password stored in an environment variable, rather than committing my username/password in plaintext.
This will not be possible until this (a pull request using Bitbucket PATs) is merged into the devtools package.
EDIT: Checking in on this many years later, it's sorted for me by the current version of devtools (2.4.3) using a BitBucket App Password with Repo Read/Write and Project Read/Write permissions.
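For completeness, a sketch of how this can look with a current remotes/devtools. The BITBUCKET_USERNAME/BITBUCKET_PASSWORD variable names and the @ref syntax are taken from the remotes documentation, so verify them against the installed version; the user and package names are the question's placeholders:

```r
# Sketch: authenticate to Bitbucket via environment variables rather
# than hardcoded credentials (variable names per the remotes docs;
# verify against your remotes version).
Sys.setenv(
  BITBUCKET_USERNAME = "bitbucketUser",
  BITBUCKET_PASSWORD = Sys.getenv("BITBUCKET_PAT")  # the App Password
)
# Then (requires network access and valid credentials):
# remotes::install_bitbucket("bitbucketUser/otherPackageName@1.0")
```

In the DESCRIPTION file the corresponding declaration would be along the lines of Remotes: bitbucket::bitbucketUser/otherPackageName@1.0, with the credentials supplied through the environment at install time rather than in the file.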

CRAN Check: No repository set, so cyclic dependency check skipped [duplicate]

As of R 3.1.0 I get the following R check:
* checking package dependencies ... NOTE
No repository set, so cyclic dependency check skipped
I tried this advice: https://twitter.com/phylorich/status/431911660698083328
No go. I put the line options(repos="http://cran.rstudio.com/") in a .Rprofile in the package root directory. Still get the Note.
Also section 1.3.1 of Writing R Extensions states:
Some Windows users may need to set environment variable R_WIN_NO_JUNCTIONS
to a non-empty value. The test of cyclic declarations in DESCRIPTION
files needs repositories (including CRAN) set: do this in ~/.Rprofile.
Is this possibly a result of the R_WIN_NO_JUNCTIONS environment variable? If so, how can I go about setting it? Any other possible causes of the note or suggested fixes?
From Writing R Extensions
The test of cyclic declarations in DESCRIPTION files needs repositories (including CRAN) set: do this in ~/.Rprofile, e.g.
options(repos = c(CRAN="http://cran.r-project.org"))
Recommended
Double-check that your .Rprofile is in your home directory and that it contains the mentioned option.
# in R session (any platform)
# where is my profile?
file.path(Sys.glob("~"),".Rprofile")
# is it there?
file.exists(file.path(Sys.glob("~"),".Rprofile"))
Or from an R session, using an extra package:
library(pathological)
r_profile()
Also double-check that the option entry is not nested in an if condition, like in the following code:
# this will not help for R CMD check --as-cran
if(interactive()) {
options(repos = c(CRAN="http://cran.r-project.org"))
}
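One way to convince yourself the option will be visible to R CMD check is to set it unconditionally and read it back the way a non-interactive session would; a small base-R sketch:

```r
# Set the mirror unconditionally, NOT inside if (interactive()),
# so a non-interactive session (like R CMD check) sees it too.
options(repos = c(CRAN = "http://cran.r-project.org"))

# Read back what R CMD check will see:
cran <- unname(getOption("repos")["CRAN"])
cat("CRAN mirror:", cran, "\n")
```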
Dry run for any platform
Here is an R script preparing an easy temporary case of an R package for testing, helping you find faster what is going wrong in your local usage.
This approach helped me locate what was wrong in my .Rprofile file, and it can generally help to set up a working initial state.
In the best case, the check run should show only 1 NOTE about a new submission.
- First, copy/paste the code and source it in your R session (--vanilla preferably).
- Then run the command printed by the script to check the test case --as-cran.
Example
# for example
R --vanilla -f makePackage.R
# here the resulting package path is as below
R --no-site-file CMD check --as-cran /tmp/pkgtest
# now see the check log
If your .Rprofile does not exist, it will be created; in any case, one new line is placed at the end of the file.
The makePackage.R script
# makePackage.R
# makes simple package for playing with check --as-cran
# copy this content to file makePackage.R
# then source it into your R --vanilla session
name <- "pkgtest"
#
# prepare and adjust package template
#
e <- new.env()
path <- dirname(tempdir())
# make simple package in path
e$fu <- function(){"Hello"}
package.skeleton(name=name,force=T,path=path,environment=e)
nil <- file.remove(
file.path(path,name,'Read-and-delete-me'),
file.path(path,name,'man',paste0(name,'-package.Rd'))
)
# adjust DESCRIPTION
D <- readLines(file.path(path,name,"DESCRIPTION"))
D[grepl("^Title: ",D)] <- "Title: Testing Skeleton"
D[grepl("^Author: ",D)] <- "Author: John Doe"
D[grepl("^Description: ",D)] <- "Description: Checking --as-cran check."
D[grepl("^Maintainer: ",D)] <- "Maintainer: John Doe <jdoe@doe.net>"
D[grepl("^License: ",D)] <- "License: GPL (>= 2)"
write(D,file.path(path,name,"DESCRIPTION"))
# make fu.Rd
write(
"\\name{fu}\\alias{fu}\\title{Prints}\\description{Prints}
\\usage{fu()}\\examples{fu()}",
file.path(path,name,'man','fu.Rd'))
#
# ensure that .Rprofile contains repos option
# add a fresh new line at the end of .Rprofile
#
userRp <- file.path(Sys.glob("~"),".Rprofile")
write("options(repos = c(CRAN='http://cran.r-project.org'))",file=userRp, append=TRUE)
#
# print final message
#
msg <- sprintf("
Your test package was created in %s,
under name %s,
your user .Rprofile in %s was modified (option repos),
now check this test package from command line by command:
R --no-site-file CMD check --as-cran %s
", path, name, userRp, file.path(path,name)
)
# now is time to check the skeleton
message(msg)
Checking the package
# replace package-path by the path adviced by the sourcing the script above
R --no-site-file CMD check --as-cran package-path
There is a user profile and a site profile; in the approach above, you bypass the site profile (in the second step) by using the --no-site-file option when checking the package skeleton.
PDF errors
You can experience PDF- and LaTeX-related errors, very likely caused by a missing or incomplete LaTeX installation. You can use the --no-manual option to skip the PDF tests.
R --no-site-file CMD check --no-manual --as-cran /tmp/pkgtest
The answer above only works for Linux. On Windows, I had to use a different method. When I tried to build and check my new package in R 3.2.0 on Windows 7, I got the same note:
checking package dependencies ... NOTE
No repository set, so cyclic dependency check skipped
I tried creating a file .Rprofile in my new package's root directory, but that didn't work. Instead I had to go to:
C:\Program Files\R\R-3.2.0\etc
and edit the file:
Rprofile.site
In the Rprofile.site file I added the suggested line:
options(repos = c(CRAN="http://cran.r-project.org"))
After I edited the Rprofile.site file, the NOTE "No repository set, so cyclic dependency check skipped" finally disappeared.
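A portable way to locate that file on any platform, rather than hardcoding C:\Program Files\R\..., is base R's R.home(); a small sketch:

```r
# Sketch: locate the site profile from the running R installation
# (base R only; note that R_PROFILE, if set, overrides this file).
site_profile <- file.path(R.home("etc"), "Rprofile.site")
cat("Site profile:", site_profile, "\n")
cat("Exists:", file.exists(site_profile), "\n")
```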
