Summary
When creating a package, I can list CRAN dependencies in the Depends field of the DESCRIPTION file. This documentation outlines how to list Bitbucket dependencies, e.g. Remotes: bitbucket::sulab/mygene.r#default.
However, I don't know how to do this when authentication is needed to access the repository.
Attempt
I've tried putting the following code into the main packagename.R file. The function contents work fine as a snippet at the top of a standalone script:
.onLoad <- function(libname, pkgname) {
  otherPackageVersion <- "1.0"
  # if the package is already installed, reinstall only on a version mismatch
  if (suppressWarnings(suppressPackageStartupMessages(require("otherPackageName", quietly = TRUE, character.only = TRUE)))) {
    if (installed.packages()[installed.packages()[, "Package"] == "otherPackageName", "Version"] != otherPackageVersion) {
      remove.packages("otherPackageName")
      devtools::install_bitbucket(sprintf("bitbucketUser/otherPackageName#%s", otherPackageVersion), auth_token = Sys.getenv("BITBUCKET_PAT"))
    }
  } else {
    devtools::install_bitbucket(sprintf("bitbucketUser/otherPackageName#%s", otherPackageVersion), auth_token = Sys.getenv("BITBUCKET_PAT"))
  }
}
but R CMD check hangs for a while and then fails, saying the package cannot be installed:
checking whether package ‘packageName’ can be installed ... ERROR
Installation failed.
Further Detail
The version of devtools I have loaded is 1.12.0.9000 (see this GitHub thread), which I installed using devtools::install_github("hadley/devtools#1220"). This allows me to install private Bitbucket R packages using an App Password stored in an environment variable, rather than committing my username/password in plaintext.
This will not be possible until this (a pull request using Bitbucket PATs) is merged into the devtools package.
EDIT: Checking in on this many years later, it's sorted for me by the current version of devtools (2.4.3) using a Bitbucket App Password with Repo Read/Write and Project Read/Write permissions.
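For reference, a minimal sketch of that working setup (all names below are placeholders, and it assumes a current remotes/devtools, which read the BITBUCKET_USER/BITBUCKET_PASSWORD environment variables for private Bitbucket repos):
# In DESCRIPTION, declare the private dependency:
#   Imports: otherPackageName
#   Remotes: bitbucket::bitbucketUser/otherPackageName@ref
# In ~/.Renviron, supply the App Password via the environment
# variables that remotes/devtools read for Bitbucket:
#   BITBUCKET_USER=bitbucketUser
#   BITBUCKET_PASSWORD=<app password>
# The private dependency then installs along with everything else:
devtools::install_dev_deps()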
I am currently working on a continuous integration setup for R packages developed by our company. We have one Jenkins project for each R package and a corresponding library for each project.
I have already defined logic that installs all dependencies of the package into the project library. Now I want to define a check stage which basically runs
devtools::check("${PROJECT_DIR}/pkg")
but only use the project library for dependencies. I tried to use the callr package in the following manner.
callr::r(
  function(...) {
    devtools::check(...)
  },
  args = list("${PROJECT_DIR}/pkg"),
  libpath = "${PROJECT_DIR}/lib"
)
However, the check process is still able to find packages which are not installed in libpath. Is there a way to make sure that only "${PROJECT_DIR}/lib" is used during the build stage?
So far, I have tried the following, to no avail:
callr::r() with the libpath argument
withr::with_libpaths() with the new argument
Looking through the documentation of devtools::check and R CMD build for appropriate parameters
Using .libPaths("${JOB_DIR}/lib")
Here is a reprex to explain the unexpected behavior of callr. I expect an error in line 3 (the first callr::r() call):
find.package("ggplot2", .libPaths()[1])
#> Error in find.package("ggplot2", .libPaths()[1]): there is no package called 'ggplot2'
callr::r(function() { ggplot2::vars() }, libpath = .libPaths()[1])
#> named list()
find.package("ggplot2", .libPaths()[2])
#> [1] "/data/R/3.5.3/lib/R/library/ggplot2"
callr::r(function() { ggplot2::vars() }, libpath = .libPaths()[2])
#> named list()
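My reading of the above (an assumption based on how base R handles library paths, not something the callr docs state for this case): .libPaths(new) always re-appends .Library (and .Library.site), so the system library at /data/R/3.5.3/lib/R/library is on the search path of the child process no matter what libpath is passed, which is why ggplot2 keeps loading. A quick way to see this:
# the child process still has the system library on its path,
# because R re-appends .Library on startup
callr::r(function() .libPaths(), libpath = .libPaths()[1])
#> presumably c(.libPaths()[1], "/data/R/3.5.3/lib/R/library")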
According to this question there is a way to achieve this with base::assign. If there is a more proper solution, I would love to hear it.
callr::r(function() {
  # overwrite the cached library path inside the .libPaths() closure
  assign(".lib.loc", .libPaths()[1], envir = environment(.libPaths))
  ggplot2::vars()
})
#> Error in loadNamespace(name): there is no package called ‘ggplot2’
The issues I have here are twofold:
It is basically a hack and can break at any time if the internals of .libPaths() change.
I might have to modify .Library and .Library.site (internals of .libPaths()) as well in order to make sure that devtools::check is affected appropriately.
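Given those two issues, a less intrusive variant might be worth trying first (a sketch, relying on callr's env argument; note that even this cannot hide packages living in R's own .Library, per the reprex above):
callr::r(
  function(...) devtools::check(...),
  args = list("${PROJECT_DIR}/pkg"),
  libpath = "${PROJECT_DIR}/lib",
  # point every library-related environment variable at the project library
  env = c(callr::rcmd_safe_env(),
          R_LIBS = "${PROJECT_DIR}/lib",
          R_LIBS_USER = "${PROJECT_DIR}/lib",
          R_LIBS_SITE = "${PROJECT_DIR}/lib")
)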
This might be slightly off topic, but have you considered using Docker for this use case?
You can define a Dockerfile, referenced in your Jenkinsfile, which defines a custom image for each CI job that runs. You install the packages into the Docker container using devtools::install() within Jenkins. The container then gets tossed when the CI is done.
With this approach you don't have to worry about manually installing the packages yourself when you run your CI, and you don't have to worry about conflicting namespaces across different packages.
This definitely has a higher start up cost, but I think you'll find it will be worth it in the long run for testing your R packages. Source: I also test internal R packages at my job.
Sample Dockerfile:
FROM docker.io/rocker/r-base
USER root
# Install packages needed for package development
RUN R -e 'install.packages(c("devtools", "rmarkdown", "testthat", "roxygen2"))'
You then reference this Dockerfile in the Jenkinsfile in order to install, test, and check the package (pipeline example below):
pipeline {
    agent {
        dockerfile {
            args '-u root'
        }
    }
    stages {
        stage('Install') {
            steps {
                sh 'r -e "devtools::install()"'
            }
        }
        stage('Test') {
            steps {
                sh '''
                r -e "options(testthat.output_file = 'test-out.xml'); devtools::test(reporter = 'junit')"
                '''
                junit 'test-out.xml'
            }
        }
        stage('Check') {
            // Might need to modify the expected output, depending on the devtools version
            steps {
                sh '''
                testOutput=$(R -e "devtools::check(args='--no-tests')" 2>&1)
                echo "${testOutput}" | grep -q "0 errors ✔ | 0 warnings ✔ | 0 notes ✔"
                '''
            }
        }
    }
}
We have recently gotten RStudio Connect at my office. For our work, we have made custom packages, which we have updated amongst ourselves by opening the project and using Build & Reload.
I understand that the only way I can get our custom packages to work within apps on RStudio Connect is to set up a local repo and set our options(repos) to include it.
Currently I have the following:
library(drat)
RepoAddress <- "C:/<RepoPath>" # high-level path
drat::insertPackage("<sourcePackagePath>", repodir = RepoAddress)
# Add this new repo to R's known repos.
options(repos = c(options("repos")$repos, LocalCurrent = paste0("file:", RepoAddress)))
# Install <PackageName> from the local repo :)
install.packages("<PackageName>")
Currently this works nicely and I can install my custom package from the local repo. This indicates to me that the local repo is set up correctly.
As an additional aside, I have changed the DESCRIPTION file to have an extra line saying repository:LocalCurrent.
However, when I try to deploy a Shiny app or Rmd which references <PackageName>, I get the following error on my deploy:
Error in findLocalRepoForPkg(pkg, repos, fatal = fatal) :
No package '<PackageName>' found in local repositories specified
I understand this is a problem with packrat being unable to find my local repos during the deploy process (I believe at a stage where it uses packrat::snapshot()). This is confusing, since I would have thought packrat would use my options("repos") repos, similar to install.packages. If I follow through the functions, I can see the particular point of failure is packrat:::findLocalRepoForPkg("<PackageName>", repos = packrat::get_opts("local.repos")), which fails even after I define packrat::set_opts("local.repos" = c(CurrentRepo2 = paste0("file:", RepoAddress))).
If I drill into packrat:::findLocalRepoForPkg, it fails because it can't find a file/folder called "C:/<RepoPath>/<PackageName>". I would have thought this is guaranteed to fail, because repos follow the C:/<RepoPath>/bin/windows/contrib/3.3/ structure. At no point would a repo have the structure it's looking for?
I think this last part is showing I'm materially misunderstanding something. Any guidance on configuring my repo so packrat can understand it would be great.
One should always check which options RStudio Connect supports at the moment:
https://docs.rstudio.com/connect/admin/r/package-management/#private-packages
Personally I dislike all the options for including local/private packages, as they defeat the purpose of having a nice easy target for deploying Shiny apps. In many cases I can't just set up local repositories in the organization, because I do not have clearance for that. It is also inconvenient that I have to email IT support to make them manually install new packages. Overall I think RStudio Connect is a great product because it is simple, but when it comes to local packages it is really not.
I found a nice alternative/hack to the official RStudio recommendations. I suppose this would also work with shinyapps.io, but I have not tried it. The solution goes like this:
(1) add to global.R: if(!require(local_package)) devtools::load_all("./local_package")
(2) write a script that copies all your source files, such that you get a Shiny app with a source directory for the local package inside; you could call the directory ./inst/shinyconnect/ or whatever, and the local package would be copied to ./inst/shinyconnect/local_package
(3) add the script ./shinyconnect/packrat_sees_these_dependencies.R to the Shiny folder; this will be picked up by the packrat manifest
(4) hack rsconnect/packrat to ignore specifically named packages when building the manifest
(1)
# start of global.R...
# load more packages for shiny
library(devtools)  # needed for load_all
library(shiny)
library(htmltools) # or whatever you need
# load the already-built local_package, or for shiny connect pseudo-build on the fly and load
if (!require(local_package)) {
  # if local_package is here, just build it in a couple of seconds with devtools::load_all()
  if (file.exists("./DESCRIPTION")) load_all(".") # for local test on PC/Mac, where the shinyapp is inside the local_package
  if (file.exists("./local_package/DESCRIPTION")) load_all("./local_package/") # for shiny connect, where local_package is inside the shinyapp
}
library(local_package) # now local_package must load
(3)
Make a script that loads all the dependencies of your local package. Packrat will see this. The script will never actually be executed. Place it at ./shinyconnect/packrat_sees_these_dependencies.R:
# these code lines will be recognized by packrat, and the packages will be added to the manifest
library(randomForest)
library(MASS)
library(whateverpackageyouneed)
(4) During deployment, the manifest generator (packrat) will ignore the existence of any named local_package. This is an option in packrat, but rsconnect does not expose it. A hack is to load rsconnect into memory and modify the sub-sub-sub-function performPackratSnapshot() to allow this. In the script below, I do that and deploy a Shiny app.
library(rsconnect)
orig_fun = getFromNamespace("performPackratSnapshot", pos = "package:rsconnect")

# packages you want to include manually, and packrat to ignore
ignored_packages = c("local_package")

# hijack rsconnect
local({
  assignInNamespace("performPackratSnapshot", value = function(bundleDir, verbose = FALSE) {
    owd <- getwd()
    on.exit(setwd(owd), add = TRUE)
    setwd(bundleDir)
    srp <- packrat::opts$snapshot.recommended.packages()
    packrat::opts$snapshot.recommended.packages(TRUE, persist = FALSE)
    # ignore the packages named above
    packrat::opts$ignored.packages(get("ignored_packages", envir = .GlobalEnv))
    print("ignoring following packages")
    print(get("ignored_packages", envir = .GlobalEnv))
    on.exit(packrat::opts$snapshot.recommended.packages(srp, persist = FALSE), add = TRUE)
    packages <- c("BiocManager", "BiocInstaller")
    for (package in packages) {
      if (length(find.package(package, quiet = TRUE))) {
        requireNamespace(package, quietly = TRUE)
        break
      }
    }
    suppressMessages(packrat::.snapshotImpl(project = bundleDir,
      snapshot.sources = FALSE, fallback.ok = TRUE, verbose = FALSE,
      implicit.packrat.dependency = FALSE))
    TRUE
  },
  pos = "package:rsconnect")
}, envir = as.environment("package:rsconnect"))
new_fun = getFromNamespace("performPackratSnapshot", pos = "package:rsconnect")
rsconnect::deployApp(appDir = "./inst/shinyconnect/", appName = "shinyapp_name", logLevel = "verbose", forceUpdate = TRUE)
The problem is one of nomenclature.
I have set up a repo in the CRAN sense. It works fine and is OK. When packrat references a local repo, it is referring to a local git-style repo.
This explains why findLocalRepoForPkg doesn't look like it will work: it is designed to work with a different kind of repo.
Feel free to also reach out to support@rstudio.com
I believe the local-package code path is triggered in packrat because of the missing Repository: value line in the DESCRIPTION file of the package. You mentioned you added this line; could you try the case-sensitive version?
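Concretely, the DESCRIPTION would carry something like this (a sketch; the field name is capitalized, and the value matches the repo name registered in options(repos)):
Package: <PackageName>
Version: 0.1.0
Repository: LocalCurrent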
That said, RStudio Connect will not be able to install the package from the RepoAddress as you've specified it (hardcoded on the Windows share). We'd recommend hosting your repo over https from a server that both your dev environment and RStudio Connect have access to. To make this type of repo setup much easier, we just released RStudio Package Manager, which you (and IT!) may find a compelling alternative to manually managing releases of your internal packages via drat.
I am trying to install a sample package from my github repo:
https://github.com/jpmarindiaz/samplepkg
I can install it when the repo is public using any of the following commands through the R interpreter:
install_github("jpmarindiaz/rdali")
install_github("rdali",user="jpmarindiaz")
install_github("jpmarindiaz/rdali",auth_user="jpmarindiaz")
But when the git repository is private, I get an error:
Installing github repo samplepkg/master from jpmarindiaz
Downloading samplepkg.zip from
https://github.com/jpmarindiaz/samplepkg/archive/master.zip
Error: client error: (406) Not Acceptable
I haven't figured out how the authentication works when the repo is private, any hints?
Have you tried setting a personal access token (PAT) and passing it along as the value of the auth_token argument of install_github()?
See ?install_github, way down at the bottom (package devtools version 1.5.0.99).
Create an access token at:
https://github.com/settings/tokens
Check the branch name and pass it to ref:
devtools::install_github("user/repo"
,ref="main"
,auth_token = "tokenstring"
)
A more modern solution to this problem is to set your credentials in R using the usethis and credentials packages.
# set config
usethis::use_git_config(user.name = "YourName", user.email = "your@mail.com")

# go to the GitHub page to generate a token
usethis::create_github_token()

# paste your PAT into the pop-up that follows...
credentials::set_github_pat()

# now remotes::install_github() will work
remotes::install_github("username/privaterepo")
More help at https://happygitwithr.com/common-remote-setups.html#common-remote-setups
I'm distributing jobs over a cluster and I'd rather not go to each machine and manually install the right packages. The job controller runs scripts as nobody, so I have to specify uncontroversial writeable paths for the installations. I actually had this working solution:
`%ni%` = Negate(`%in%`) ### "not in"
.libPaths("/tmp/") ### for local (remote non super user) install of packages
if ("xxx" %ni% installed.packages()) {install.packages("xxx", repos = "http://cran.r-project.org", lib="/tmp/")}
# ... and more
library(xxx)
# ... and more
It worked at first, but a week later I've got a strange problem.
> library(xxx)
Error in library(xxx) : there is no package called 'xxx'
xxx (and other packages) is in the manifest of installed.packages(), .libPaths() is reporting /tmp/ on the path, and ls shows a folder for the package in /tmp/. Reinstalling with install.packages throws an error, as does remove.packages, update.packages, and find.package.
Two questions:
Is there a different way that I ought to have managed the remote install?
Any ideas what has caused my problem with the failure to load the package?
Please save me from having to implement a kludge like this:
locdir <- paste("/tmp/", as.integer(runif(1, 1, 100000)), sep='')
system(paste("mkdir", locdir))
.libPaths(locdir)
install.packages("xxx", repos = "http://cran.r-project.org", lib=locdir)
library(xxx)
You might need the option character.only = TRUE, although it is weird that your code worked before but not anymore. Anyway, try this function:
packageLoad <- function(libName) {
  # try to load the package
  if (!require(libName, character.only = TRUE)) {
    # if the package is not available, install it
    install.packages(libName, dep = TRUE,
                     repos = "http://cran.r-project.org",
                     lib = "/tmp/", destdir = "/tmp/")
    # try again
    if (!require(libName, character.only = TRUE))
      stop(paste("Package", libName, "not found and its installation failed."))
  }
}
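A usage sketch (assuming, as in the question, that /tmp/ has been put at the front of the library path; data.table is just an example package):
.libPaths("/tmp/")
packageLoad("data.table") # installs into /tmp/ if missing, then loads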