R: installing a package to a different location from R base

I am accessing R-3.4 from a location on a shared drive. IT have configured it such that, when I install a package, it will default to installing it on my local drive. In many cases, it succeeds with a warning, but a few packages are failing, saying they can't access other libraries/packages.
The location on the shared drive is available in .libPaths(), and when I specifically install to this location using install.packages("package_name", lib = .libPaths()[2]), these packages install successfully.
Unfortunately, IT have told me I'm not allowed to install to the shared drive. So: is there a way that I can install a package to my local drive, but tell install.packages() to refer to the shared drive for any dependencies?
(This problem has so far arisen with installing arules, RGoogleAnalytics, and googleAnalyticsR, but I suspect they won't be the only ones.)
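(Not a definitive answer, but one pattern worth sketching, assuming the shared-drive library stays visible in .libPaths(): install only into the local, writable library and let the search path pick up dependencies already present on the shared drive. The package name and path index below are illustrative.)
local_lib <- .libPaths()[1]   # the writable local library (index is illustrative)
# install new packages only to the local library; dependencies already installed
# in a library on .libPaths() should not normally be re-installed
install.packages("arules", lib = local_lib, dependencies = TRUE)
# at load time, lib.loc defaults to all of .libPaths(), so the shared drive is searched too
library(arules)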

Related

deployment on shinyapps.io failing

Hi, I'm trying to deploy an app to the server; however, I get the following errors:
* May be unable to deploy package dependency "rClr"; could not determine a repository URL for the source "CRAN".
* May be unable to deploy package dependency "tlf"; could not determine a repository URL for the source "CRAN".
Unable to determine the source location for some packages. Packages should be installed from a package repository like CRAN or a version control system. Check that options(repos) refers to a package repository containing the needed package versions.
The backbone packages cannot be installed from CRAN and have to be installed manually, hence the errors. How can this be fixed in order to deploy the app on the server?
Thanks for all help
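(Not an authoritative fix, but a sketch of what the warning asks for: rsconnect needs every dependency to be traceable to a repository listed in options(repos). If the manually installed packages also live in an internal CRAN-like repository, registering it before deploying may clear the message; the second URL below is purely hypothetical.)
options(repos = c(
  CRAN     = "https://cloud.r-project.org",
  INTERNAL = "https://example.org/internal-repo"   # hypothetical repository hosting rClr and tlf
))
rsconnect::deployApp()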

Is there a reason not to use R_HOME/library as user library?

Usually, install.packages() installs CRAN packages under R_LIBS_USER. On a Windows machine, this is usually a location like C:/Users/username/Documents/R/win-library/r_version/.
However, I encountered the situation where all the CRAN packages are installed in the R_HOME/library folder (in my case: C:\Program Files\R\R-3.5.2\library).
I wonder whether there are objective reasons not to choose this folder for installing additional packages?
I consulted the R Installation and Administration manual, but there is no advice on this topic.
Edit: The scenario is a single user machine.
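(To see concretely which locations are in play on a given machine, a few base-R calls are enough; nothing here depends on the specific setup above.)
.libPaths()                       # every library R will search, in order
Sys.getenv("R_LIBS_USER")         # the default per-user library location
file.path(R.home(), "library")    # the R_HOME/library folder shipped with R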

Is there a way to 'install' R packages without running install.packages()?

We are testing how to run R in the cloud in a secure, isolated environment that is blocked from CRAN and also cannot use install.packages(). We defined an environment based on Anaconda's R Essentials bundle, but we would still like to be able to customize it on demand with extra packages. Is there a way to simulate install.packages(), e.g. by downloading the package offline, zipping it, copying it to the secure environment, and unzipping it to a specific location in the library folder?
thanks!
You can download the package from CRAN as a zip and then transport it to the isolated PC as a file. For example, here is the link to dplyr on CRAN: https://cran.r-project.org/web/packages/dplyr/index.html
Then use the code below to install the local file:
install.packages("~/Downloads/dplyr_1.0.7.zip", repos = NULL)
On Windows you might need Rtools. At least there was a warning about it, but the package still installed.
For Linux machines, you can build the package from source using the tarball from the same page:
install.packages("~/Downloads/dplyr_1.0.7.tar.gz", repos = NULL, type = "source")
In both cases you need to take care of dependencies yourself, as they are not checked when installing with this method (look at the Imports field on the package's CRAN page).
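(Rather than reading the Imports field by hand, one option is to let base R list the full dependency tree on the internet-connected machine before downloading; this is only a sketch, with dplyr reused as the example package from above.)
db <- available.packages(repos = "https://cloud.r-project.org")
deps <- tools::package_dependencies("dplyr", db = db,
                                    which = c("Depends", "Imports", "LinkingTo"),
                                    recursive = TRUE)
deps[["dplyr"]]   # everything that also needs to be copied to the offline machine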

Deploying projects with renv in offline environment

What is the correct procedure to deploy packages using renv to an offline machine?
We have an internal CRAN-like repository, which is configured via options(repos = list(cran = "http://our.repo.url")) on both the development machine and the deployment machine. It is specified in renv.lock. The renv package itself is installed on both machines, and both are the same version (0.14.0).
After deployment, when I start R in the project directory, it hangs for a while and then returns an error:
# Bootstrapping renv 0.14.0--------
Warning: unable to access index for repository https://cloud.r-project.org/src/contrib/:
cannot open URL 'https://cloud.r-project.org/src/contrib/PACKAGES'
* Downloading renv 0.14.0 ... FAILED
How do I tell renv to either copy itself from the system library, or install from the internal repository?
Copying from the system library would be of course the preferred course of action, to save time compiling.
You might want to file an issue at https://github.com/rstudio/renv/issues since I think renv doesn't currently support loading the renv package from a non-project library path via the autoloader.
That said, you should be able to move forward by disabling the renv auto-loader. Before launching R, you can set the environment variable:
RENV_ACTIVATE_PROJECT = FALSE
Then, when R is started, the renv auto-loader (run via source("renv/activate.R") in the project .Rprofile) will be disabled. You can then later load renv from whichever library path is appropriate, and call renv::load() to manually load a particular project.
(The other alternative to setting that environment variable is simply removing the renv auto-loader from the project .Rprofile.)
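(Putting the answer's steps together, a rough sketch of the manual flow; the library path and project path below are placeholders.)
# in .Renviron (or the shell) before R starts:
#   RENV_ACTIVATE_PROJECT=FALSE
library(renv, lib.loc = "/opt/R/site-library")   # load renv from the system library
renv::load("/srv/projects/myproject")            # then activate the project manually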

Non-standard Remotes package INLA in R package

I have a package that requires INLA, which is not hosted on CRAN or in a standard GitHub repository. There are multiple SO questions detailing how to install the package on a personal machine, such as this one, or even how to mention it as a dependency in a package.
The two ways that are typically recommended to install on a personal machine are:
Direct from INLA website
install.packages("INLA",repos=c(getOption("repos"),INLA="https://inla.r-inla-download.org/R/stable"), dep=TRUE)
From the GitHub host
devtools::install_github(repo = "https://github.com/hrue/r-inla", ref = "stable", subdir = "rinla", build = FALSE)
Now, these are fine for personal machines, but they don't work in the DESCRIPTION file's Remotes: section.
If we do url::https://inla.r-inla-download.org/R/stable, this gives an error that the file extension isn't recognized.
Error: Error: Failed to install 'unknown package' from URL:
Don't know how to decompress files with extension
If we do github::hrue/r-inla, I am unaware of how to pass (or if it's even possible) the ref, subdir, and build arguments in the DESCRIPTION file.
Previous packages used a read-only mirror of the INLA code, hosted on GitHub solely for this purpose at this repo, and then just used github::inbo/INLA. However, this repository is out of date.
Current solution
What I'm doing instead is to directly reference the tarball hosted on the main webpage.
url::https://inla.r-inla-download.org/R/stable/src/contrib/INLA_21.02.23.tar.gz
This solution works and passes CI, as those machines are able to install and load INLA from there. The only issue is that I need to periodically update the static link to this tarball; I would prefer to reference the stable build, either directly from the INLA website as above or from the hrue/r-inla repo with those other arguments passed. Directly referencing those links also has the advantage that, when my package is re-installed on a machine, it would recognize whether or not the latest version of INLA is already installed there. Is there a way to achieve this in the DESCRIPTION file?
This is not a perfect answer, but what you can do is add the zip URL of the stable branch of INLA from the new GitHub repository of INLA:
url::https://github.com/hrue/r-inla/archive/refs/heads/stable.zip
This way, it will always install the latest stable version of the package.
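(For completeness, a hypothetical DESCRIPTION excerpt showing where that line would go; only the Remotes entry comes from the answer above, the rest is the usual layout.)
Imports:
    INLA
Remotes:
    url::https://github.com/hrue/r-inla/archive/refs/heads/stable.zip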
