How can I copy an entire renv-based project to a new PC (which does not have internet access)?

I have been given access to a beefy machine on which to run a large simulation.
I have developed the code in an RStudio project with renv. renv makes a local copy of all the packages and records their versions in a lock file.
The target machine (which runs Windows) does not have access to the internet. I have copied the project file, the code files, the renv folder (which includes all the local copies of the packages), the lock file, and the .Rprofile file to a folder on the target machine.
When I open the project on the target machine, the .Rprofile executes source("renv/activate.R"). However, this fails to load the project, instead giving me the following message:
The following package(s) are missing their DESCRIPTION files:
... Long list of packages ...
These may be left over from a prior, failed installation attempt.
Consider removing or re-installing these packages.
Trouble is I can't reinstall them since this machine does not have access to the internet. I could manually go through each package and download the binaries on my work machine, then transfer them over to the target machine, then install them one by one, but this seems like a very painful thing to do.
Is there a way for me to convince renv, or R itself, to just use the packages in the local renv folder?

From the Cache section of https://rstudio.github.io/renv/articles/renv.html:
When using renv with the global package cache, the project library is instead formed as a directory of symlinks (or, on Windows, junction points) into the renv global package cache. Hence, while each renv project is isolated from other projects on your system, they can still re-use the same installed packages as required.
I think that implies that copying the renv folder copies those junction points (which are something like shortcuts/symlinks), not the actual underlying package folders.
It looks like you already have a solution, but another option would be to call renv::isolate() to ensure that your project doesn't link to packages within the global cache, and instead just maintains an isolated library directly.
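For example, a minimal sketch (run in the project on the source machine, before copying):
# Copies packages from the renv cache into the project library,
# replacing the junction points with real folders:
renv::isolate()
After that, renv/library holds actual package installations, so the whole project folder can be copied to the offline machine as-is.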

In the end I just wrote a small script to copy the files over.
sourceFolder <- "some/path"
targetFolder <- "some/other/path"
subFolders <- list.files(sourceFolder)
for (i in seq_along(subFolders)) {
  subFolder <- subFolders[i]
  file.copy(
    # file.path() inserts the separator that a bare paste0() would omit
    from = file.path(sourceFolder, subFolder),
    to = targetFolder,
    overwrite = TRUE,
    recursive = TRUE
  )
  message(subFolder)
}
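Note: file.copy(recursive = TRUE) traverses junction points as if they were ordinary directories, so the copy contains the real package contents rather than links into a cache that does not exist on the target machine.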

Related

Determining which R packages, and dependencies, use DLL files

I work in a corporate environment that uses Microsoft Windows Defender Application Control (WDAC) to provide security. This blocks unsigned EXE and DLL files from being installed on devices, so R packages which use DLLs fail to install. The workaround is to provide an R installation from an approved central source, which also copies over a default set of packages, such as tidyverse, data.table etc., to the R library. Users can continue to install additional packages built with native R, but run into issues if they try to install, build from source, or update packages that contain DLL files.
Is there a way to check whether a package uses DLL files in advance of installation?
Output something like:
check_dll(foo)
result: "This package and its dependencies have no DLL files. You can install this package."
check_dll(bar)
result: "bar does not have any DLL files, but one dependency, OOF, uses DLL files.
You already have a version of OOF installed, so it should be safe to install bar."
check_dll(foobar)
result: "foobar has a DLL. Do not attempt to install foobar."
check_dll(RABOOF)
result: "RABOOF does not have any DLL files, but one of its dependencies,
foobar, does have a DLL file. Do not attempt to install RABOOF."
tools::package_dependencies() will list the package dependencies, but nothing else.
Downloading the zip file from CRAN and inspecting it for a libs/x64 folder with contents will work, but seems a heavyweight approach. Theoretically if a package has lots of dependencies this could result in downloading a lot of files unnecessarily.
Look for the NeedsCompilation field in the DESCRIPTION file. If it is "yes", there will be a DLL. If it is "no", there probably won't be. (If it is not there, the package wasn't built properly, so all bets are off.)
The test is not perfect: packages can put DLLs into the inst folder to get them installed without compiling them, though CRAN isn't supposed to allow that ("Source packages may not contain any form of binary executable code."), and packages like pak (mentioned in the comments) may be allowed to get around this rule, e.g. by downloading binaries. You will also need to put together a blacklist of packages that will fail your WDAC tests even though they claim not to need compilation, containing pak and others like it.
The NeedsCompilation field is included as a column of the result of available.packages(), so it is very easy to access without trying to install the package.
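A minimal sketch along those lines, using the check_dll name from the question (it assumes the available.packages() snapshot covers the package and all its dependencies):
# Flag a package, or any recursive dependency, whose NeedsCompilation
# field is "yes" and which therefore likely ships a DLL.
check_dll <- function(pkg, db = available.packages()) {
  deps <- unlist(tools::package_dependencies(pkg, db = db, recursive = TRUE))
  pkgs <- intersect(c(pkg, deps), rownames(db))
  compiled <- pkgs[db[pkgs, "NeedsCompilation"] %in% "yes"]
  if (length(compiled) == 0) {
    message(pkg, " and its dependencies have no compiled code; installing should be safe.")
  } else {
    message(pkg, ": compiled code in ", toString(compiled), ". Do not attempt to install.")
  }
  invisible(compiled)
}
Since db defaults to a single available.packages() call that is shared with the dependency lookup, the check needs only one internet query per session.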
I have accepted the answer from user2554330 as the best solution. It makes use of the normal set of commands used for package management, and the matrix generated by available.packages() can be passed to tools::package_dependencies(), removing the need for multiple internet queries.
For completeness I am documenting another possible solution. A script could query the unofficial CRAN Github mirror https://docs.r-hub.io/#cranatgh and look for a /src directory in each package project.
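A sketch of that query using only base R (hypothetical helper; the GitHub API returns 200 when the src/ directory exists and 404 otherwise, and unauthenticated calls are rate-limited):
has_src <- function(pkg) {
  url <- paste0("https://api.github.com/repos/cran/", pkg, "/contents/src")
  # attr "status" holds the HTTP status code of the response
  identical(attr(curlGetHeaders(url), "status"), 200L)
}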

move packages to different directory Julia 0.7

I would like to move ~/.julia/packages to the julia installation directory.
What else should I move ?
What env variables should I set?
JULIA_PKGDIR?
JULIA_DEPOT_PATH?
JULIA_LOAD_PATH?
push!(DEPOT_PATH, "newdirname") created the registries and compiled directories, but any newly added packages are not getting installed in the "newdirname" directory.
export JULIA_DEPOT_PATH="newdirname" removed ~/.julia/packages and replaced it with the new one, and packages started getting installed in the new directory. But now, when I move the entire installation with the packages to a different machine without internet, Julia forces me to rebuild the packages and then fails because there is no connectivity. So what would make this move work?
You can add the new path by putting the statement
push!(DEPOT_PATH, "newdirname")
in startup.jl.
CAUTION: Most of the installed code in the .julia directories will contain pathnames to your old directories, written at installation time! I'd expect a lot of installed modules to break if moved. Adding paths via push! to DEPOT_PATH is for adding additional places to load files from, NOT for moving existing installations!

R Packrat Fails to load private library

I have developed a solution using R and want to transfer it to the production server (CentOS 7), which has no Internet connection for installing packages. To facilitate installation of packages, I used packrat to bundle the packages used in my R script with the project.
Using packrat::bundle(), I created a tar file of the project, moved the file to the server, and untarred it.
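For reference, packrat can also restore a bundle in one step with unbundle(); a sketch with illustrative paths:
# On the development machine; writes a .tar.gz under packrat/bundles/ by default
packrat::bundle()
# On the server, instead of untarring by hand:
packrat::unbundle("prjName.tar.gz", where = "/home/R_Projects")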
According to a blog post, once I open the project, "When R is started from that directory, Packrat will do its magic and make sure the software environment is the same as on the source machine."
However, when I open the project on the server (using RStudio Server 0.99), nothing happens and errors about unknown packages are thrown.
When I manually execute the packrat/init.R file, the error below is thrown:
Error in ensurePackageSymlink(source, target) :
Target '/home/R_Projects/prjName/packrat/lib-R/base' already exists and is not a symlink
Well, I found the problem and solved it. The symlink error is related to CentOS (it is not related to R). I simply removed all the folders inside
/home/R_Projects/prjName/packrat/lib-R
Because these folders exist, packrat is unable to create symlinks with the same names inside the lib-R folder. Once they are removed, it creates a link (shortcut) to the actual folder where each R package is located.
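In R, that clean-up looks something like this (a sketch; adjust the path to your project):
libR <- "/home/R_Projects/prjName/packrat/lib-R"
# Delete the real directories so packrat can recreate them as symlinks
unlink(list.dirs(libR, recursive = FALSE), recursive = TRUE)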
Hope it helps future readers.

Using R with git and packrat

I have been using git for a while but just recently started using packrat. I would like my repository to be self-contained, but at the same time I do not want to include CRAN packages, as they are readily available. It seems that once R is opened in a project with packrat, it will try to use packages from the project library; if they are not available there, it will try to install them from src in the project library; and if that fails, it will look at the libraries installed on the computer. If a library is not available on the computer, would it look at CRAN next?
What files should I include in my git repo as a minimum (e.g., packrat.lock)?
You can choose to set up an external CRAN-like repository with the source tarballs of the packages and versions that you'd like available for your project. The default behaviour, however, is to look to CRAN next, as you've identified in your question. Check out the packrat.lock file: you will see that for each package you use, there is a field Source: CRAN (if you've downloaded the package from CRAN, that is).
When you have a locally stored package source file, the entry in the lock file for that package changes to the following:
Package: FooPackage
Source: source
Version: 0.4-4
Hash: 44foo4036fb68e9foo9027048d28
SourcePath: /Users/MyName/Documents/code/myrepo/RNetica
I'm a bit unclear on your final question ("What files should I include in my git repo as a minimum (e.g., packrat.lock)?"), but I'm going to take it as a combination of a) which files must be present for packrat to run, and b) which of those files should be committed to the git repo. To answer the first part, I illustrate with initialising packrat on an existing R project.
When you run packrat::init(), two important things happen (among others):
1. All the packrat scaffolding, including source tarballs etc., is created under PackageName/packrat/.
2. packrat/lib*/ is added to your .gitignore file.
So from this, we can see that anything under packrat/lib*/ doesn't need to be committed to your git-repo. This leaves the following 3 files to be committed:
packrat/init.R
packrat/packrat.lock
packrat/packrat.opts
packrat.lock is needed for collaborating with others through a version control system; it helps keep your private libraries in sync. packrat.opts allows you to specify different project specific options for packrat. The file is automatically generated using get_opts and set_opts. Committing this file to the git-repo will ensure that any options you specify are maintained for all collaborators. A final file to be committed to the repo is .Rprofile. This file tells R to use the private package library (when R is started from the project directory).
Depending on your needs, you can choose to commit the source tarballs to the repository, or not. If you don't want them available in your git repo, you simply add packrat/src/ to the .gitignore. But this will mean that anyone accessing the git repo will not have access to the package source code, and the files will be downloaded from CRAN, or from wherever the Source line within the packrat.lock file dictates.
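For example, a sketch (vcs.ignore.src is one of the documented packrat options; see get_opts() for the full set):
# Persisted in packrat/packrat.opts; controls whether packrat/src/
# is listed in .gitignore
packrat::set_opts(vcs.ignore.src = TRUE)
packrat::get_opts("vcs.ignore.src")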
From your question, it sounds like committing the packrat/src/ folder contents to your repo might be what you need.

Move R packages to a new computer which has no internet

Normally I install packages using:
install.packages("foo")
and a Repo over the internet. But I have a new machine now where I want to replicate the packages from my existing installation without having to pull everything off the internet all over again. (I've a ton of packages and slow internet access)
Both machines are Windows and run the same R version. (2.13.1)
Is there a way to do this? Closest I can get is I know I can install from local zip files using:
install.packages("pathtozip", repos = NULL)
But does R store all Zips somewhere? I found a few in locations like:
C:\Documents and Settings\foouser\Local Settings\Temp\RtmpjNKkyp\downloaded_packages
But not all.
Any tips?
The function .libPaths will give you a vector of all the libraries on your machine. Run this on your old machine to find all of them. You can simply copy all these files into the libraries on your new machine (run .libPaths on it too to find out where).
Alternatively, if you want to set up a real repository (i.e. basically a CRAN mirror) on your computer or on a network drive you can update, you can put binary or source packages into a folder and run tools::write_PACKAGES on that folder. You can then run install.packages using the contriburl argument and point it to your repository folder.
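A sketch of that repository setup, with illustrative paths (the folder layout shown is the contrib layout install.packages expects for Windows binaries of R 2.13):
repo <- "D:/myrepo/bin/windows/contrib/2.13"
dir.create(repo, recursive = TRUE)
# ...copy the package .zip files into that folder, then index it:
tools::write_PACKAGES(repo, type = "win.binary")
# On any machine that can see the folder:
install.packages("foo", contriburl = paste0("file:///", repo))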
All packages that you have installed are stored in a folder called win-library\r-version, for example
C:\Users\Ehsan\Documents\R\win-library\2.15
so it is enough to copy all the folders inside 2.15 to the same folder on your new machine. Because you have the same version of R, you do not need to update them with update.packages().
On your original computer, run
write.csv(unique(data.frame(installed.packages())[, 1]), "packages.csv", row.names = FALSE)
Save this .csv into the working directory of your new computer, then run
install.packages(as.character(read.csv("packages.csv")[, 1]))
You can check what your working directory is using getwd().