Sometimes on Stack Overflow there's a question about a package that is not installed on my system and that I don't plan to reuse later.
If I install the package with install.packages(), it will be put in one of my R install libraries, where it will take up storage space and be updated each time I run update.packages().
Is there a way to install a package only for the current R session?
You can install a package temporarily with the following function:
tmp.install.packages <- function(pack, dependencies = TRUE, ...) {
    path <- tempdir()
    ## Add 'path' to .libPaths, and make sure it is not in the first
    ## position, otherwise any other package installed during this
    ## session would also end up in 'path'
    firstpath <- .libPaths()[1]
    .libPaths(c(firstpath, path))
    install.packages(pack, dependencies = dependencies, lib = path, ...)
}
You can then use it simply like this:
tmp.install.packages("pkgname")
The package is installed in a temporary directory, and its files should be deleted at the next system restart (at least on Linux systems).
Another solution for this problem is devmode from devtools. Devmode lets you install packages into a separate development library, so your regular packages are untouched when you install development versions. For example:
library(devtools)
devmode()
install_github('ggplot2', 'hadley')
devmode()
You'll notice that your regularly installed version has not changed.
pacman deals with package management issues like this:
library(pacman)
Now you can use:
p_load("pkgname") #installs or loads package if already installed
#at end of session:
p_delete("pkgname") #deletes package from lib
This is a quick way to install into your library and then delete the package at the end of the session (not really a temporary install).
As an addition to Tyler's answer, a p_temp function was recently added to the pacman package that does exactly what the question asks for.
library(pacman)
p_temp(pkgname)  # or p_temp("pkgname"); either works
This will install the package and any dependencies temporarily.
Disclosure: Tyler and I are co-authors of the pacman package...
The following is something in between juba's and sebastian-c's answers, and is as simple as this:
.libPaths("/my/path")
From now until the end of the current session, you can install packages as you normally would, and they will end up in /my/path. Package dependencies will also go to /my/path.
If you want to have control over dependencies, you can specify them manually with:
install.packages(c("pack", "dep1", "dep2", ...), dependencies = FALSE)
This approach might be useful in two particular scenarios:
A discovery session, so to speak: you want to try out new packages casually to see if something interesting pops up. Use an OS-provided temporary directory in .libPaths() to avoid cluttering your R setup; the OS will take care of the cleanup.
Creating reproducible environments, which is common nowadays. You install a base R, then add .libPaths("my/project/dir"). By looking at this directory, you get a clear picture of your project's package requirements, and you can copy the folder to another PC to reproduce the same environment. Much like Python's pipenv, you can have multiple isolated environments: for each session, you call .libPaths() with the corresponding project directory.
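For instance, here is a minimal sketch of both scenarios (the project path and package name are placeholders):
## Scenario 1: throwaway discovery session -- packages go into an
## OS-managed temporary directory and disappear with it
.libPaths(tempdir())
install.packages("somepkg")
## Scenario 2: per-project library for a reproducible environment
dir.create("~/my_project/rlib", recursive = TRUE, showWarnings = FALSE)
.libPaths("~/my_project/rlib")
install.packages("somepkg")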
Related
I just updated R to 3.4.1 ("Single Candle") on my Linux Mint 18.1 Cinnamon machine and attempted to install a package. R returned the following:
> install.packages('ggplot2')
Installing package into ‘/usr/local/lib/R/site-library’
(as ‘lib’ is unspecified)
Warning in install.packages("ggplot2") :
'lib = "/usr/local/lib/R/site-library"' is not writable
Would you like to use a personal library instead? (y/n) y
Would you like to create a personal library
NA
to install packages into? (y/n) y
Error in install.packages("ggplot2") : unable to create ‘NA’
I have encountered the 'lib is not writable' message before, but typically it offers a usable path, like this one:
Would you like to create a personal library
~/R/x86_64-pc-linux-gnu-library/3.4
to install packages into? (y/n) y
Any ideas why the personal library is suggesting NA? Is there a way to manually override this?
I don't know what's causing this problem (I'm also experiencing it on Ubuntu 16.04), but here's a quick workaround:
.libPaths(c("/home/your_username/R/x86_64-pc-linux-gnu-library/3.4/", .libPaths()))
Of course, you can replace "/home/your_username/..." with any other directory (that directory will store your personal library).
This solution makes install.packages() and library() work. Waiting for a full fix!
EDIT: I should note that this solution is not persistent, i.e. it won't survive a restart of R. You can fix this by adding the same line of code described above to your /home/your_username/.Rprofile file.
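For example, the persistent version is just that one line placed in ~/.Rprofile (same placeholder path as above):
## ~/.Rprofile: prepend the personal library at every startup
.libPaths(c("/home/your_username/R/x86_64-pc-linux-gnu-library/3.4/", .libPaths()))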
Looking at the details in @Dirk's comment (https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=866768), this is planned behaviour, so that packages are installed once for all users of the system.
The solution is to make /usr/local/lib/R/ writable for all users, rather than to reinstate the old behaviour of having a personal package library for every individual user.
Open up a terminal and:
Navigate to /usr/local/lib/ with cd /usr/local/lib/
Change the owner:group so that all users can write to the folder. I happen to have a group on my computer that all users are members of, so I used that, but see https://askubuntu.com/questions/66718/how-to-manage-users-and-groups for help with setting up a group if necessary
To change ownership, use sudo chown owner:group -R R/. owner can be any user; it doesn't really matter. group is the key one: make sure anyone wanting to use R on your system is a member of this group. -R is recursive (i.e. apply it to all files and folders in R/).
If you need to change group permissions, use chmod -R 775 R/. This gives the owner and group read, write, and execute permissions, and gives all others read and execute permissions.
Now restart R and you should be able to install packages to this shared location.
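If you want to double-check from within R that the shared location is now writable, a quick sketch (0 means writable, -1 means not):
## Check write access to the shared site-library from R
file.access("/usr/local/lib/R/site-library", mode = 2)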
My solution was the following:
The file /usr/lib/R/etc/Renviron contains R's environment configuration.
Lines 43-45 read:
# edd Jun 2017 Comment-out R_LIBS_USER
#R_LIBS_USER=${R_LIBS_USER-'~/R/x86_64-pc-linux-gnu-library/3.4'}
##R_LIBS_USER=${R_LIBS_USER-'~/Library/R/3.4/library'}
I uncommented R_LIBS_USER=${R_LIBS_USER-'~/R/x86_64-pc-linux-gnu-library/3.4'}, restarted RStudio, and now it works.
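To confirm the change after restarting, a quick check from within R (this just reads the current settings):
## Should show the personal library path again
Sys.getenv("R_LIBS_USER")
.libPaths()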
EDIT: Looking at the comments, it seems like this is planned behaviour. Here is another solution.
After July 8, 2017, the following should resolve the problem:
sudo apt-get update
The same happened to me while running the installation procedure for some Bioconductor packages.
Then I realized I could also write this (or something similar) on the bash command line:
export R_LIBS_USER=$HOME/R/x86_64-pc-linux-gnu-library/3.4 && R
or
export R_LIBS_USER=$HOME/R/x86_64-pc-linux-gnu-library/3.4 && rstudio
and then run update.packages() (or install.packages(), or biocLite()) inside R.
This way the change is temporary, and you don't have to update any config files.
This shell command is useless if a command in .Renviron or .Rprofile subsequently sets R_LIBS_USER to a different location during R startup (check your configuration).
Sticking with $HOME/R/x86_64-pc-linux-gnu-library/3.X can be desirable if you already have lots of packages in that location and want them upgraded or installed there. I have lots of Bioconductor packages in there and don't want to download them again; some of them fetch huge "Omics" datasets when used. The partition where /usr/local/lib/R resides may also have too little disk space or be on a slow drive.
Maybe this is a bug in R 3.4.1. My solution is to change the line
R_LIBS_SITE=${R_LIBS_SITE-'/usr/local/lib/R/site-library:/usr/lib/R/site-library:/usr/lib/R/library'}
in the /etc/R/Renviron file into
R_LIBS_SITE=${R_LIBS_SITE-'~/R/x86_64-pc-linux-gnu-library/3.4.1:/usr/local/lib/R/site-library:/usr/lib/R/site-library:/usr/lib/R/library'}
Is there any documentation on manually installing a package into a user library when the R.home() path is locked down and incomplete (no etc, no bin, just library)? The system does NOT support shelling out to execute R CMD, which I believe standard R does.
I would like to build existing source packages (from CRAN) and install into a user library directory, so that I can use the library() function and get all the usual namespace and *.Rdx and *.Rdb files.
At the moment, I'm plodding through the source of install.packages, tools:::.build_packages, and tools:::.install_packages, using a standard macOS R and the R sources. Hopefully this has been documented in a more user-friendly fashion and my Google searches have missed it.
Thanks.
You don't need to use a different install.packages method; you only need to specify a writable location for storing packages and give it precedence over the system default one. A simple way to accomplish this is to set an R_LIBS environment variable. For instance, in my .bashrc I have
export R_LIBS='/home/username/.local/lib/R-3.3.3'
Then, by default, all packages are installed here. Furthermore, if a package is installed both here and in the system-wide location, the copy here takes priority when loading.
You can verify that the location is being used by checking .libPaths() in your R session.
When you try to install a package in R and you don't have access rights to the default library path, R will ask you:
Would you like to use a personal library instead?
Would you like to create a personal library '~/path' to install
packages into?
However, if you are running an Rscript, those prompts will not show up and the installation will fail. I could predefine a specific path and instruct install.packages() to use it, but I don't want to create an additional library path specific to this Rscript; I just want to use the default personal library. Is there a way to force creation of the personal library without requiring interaction?
You can use Sys.getenv("R_LIBS_USER") to get the local library search location.
This is what I ended up doing, which seems to be working (the hardest part was testing the solution, since the problem only occurs the first time you try to install a package):
# create local user library path (not present by default)
dir.create(path = Sys.getenv("R_LIBS_USER"), showWarnings = FALSE, recursive = TRUE)
# install to local user library path
install.packages(p, lib = Sys.getenv("R_LIBS_USER"), repos = "https://cran.rstudio.com/")
# Bioconductor version (works for both Bioconductor and CRAN packages)
BiocManager::install(p, update = FALSE, lib = Sys.getenv("R_LIBS_USER"))
As @hrbrmstr pointed out in the comments, it may not be a good idea to force-install packages, so use this at your own risk.
I've found several posts about best practice, reproducibility and workflow in R, for example:
How to increase longer term reproducibility of research (particularly using R and Sweave)
Complete substantive examples of reproducible research using R
One of the major preoccupations is ensuring portability of code, in the sense that moving it to a new machine (possibly running a different OS) is relatively straightforward and gives the same results.
Coming from a Python background, I'm used to the concept of a virtual environment. When coupled with a simple list of required packages, this goes some way to ensuring that the installed packages and libraries are available on any machine without too much fuss. Sure, it's no guarantee - different OSes have their own foibles and peculiarities - but it gets you 95% of the way there.
Does such a thing exist within R, even if it's not as sophisticated? For example, simply maintaining a plain-text list of required packages and a script that will install any that are missing?
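To illustrate the kind of thing I have in mind, a minimal sketch (the file name requirements.txt is hypothetical):
## Read a plain-text list of package names (one per line) and
## install whatever is missing from the current library
required <- readLines("requirements.txt")
missing  <- setdiff(required, rownames(installed.packages()))
if (length(missing) > 0) install.packages(missing)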
I'm about to start using R in earnest for the first time, probably in conjunction with Sweave, and would ideally like to start in the best way possible! Thanks for your thoughts.
I'm going to use the comment posted by @cboettig in order to resolve this question.
Packrat
Packrat is a dependency management system for R. It gives you three important advantages (all of them focused on your portability needs):
Isolated : Installing a new or updated package for one project won’t break your other projects, and vice versa. That’s because packrat gives each project its own private package library.
Portable: Easily transport your projects from one computer to another, even across different platforms. Packrat makes it easy to install the packages your project depends on.
Reproducible: Packrat records the exact package versions you depend on, and ensures those exact versions are the ones that get installed wherever you go.
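A minimal sketch of the basic workflow (the project path and example package are placeholders):
install.packages("packrat")
packrat::init("~/projects/my_analysis")  # gives the project its own private library
install.packages("dplyr")                # installed into the project's private library
packrat::snapshot()                      # records exact package versions in packrat.lock
packrat::restore()                       # reinstalls those versions on another machine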
What's next?
Walkthrough guide: http://rstudio.github.io/packrat/walkthrough.html
Most common commands: http://rstudio.github.io/packrat/commands.html
Using Packrat with RStudio: http://rstudio.github.io/packrat/rstudio.html
Limitations and caveats: http://rstudio.github.io/packrat/limitations.html
Update: Packrat has been soft-deprecated and is now superseded by renv, so you might want to check this package instead.
The Anaconda package manager conda supports creating R environments.
conda create -n r-environment r-essentials r-base
conda activate r-environment
I have had a great experience using conda to maintain different Python installations, both user-specific and several versions for the same user. I have tested R with conda and the Jupyter notebook and it works great, at least for my needs, which include RNA-sequencing analyses using the DESeq2 and related packages, as well as data.table and dplyr. There are many Bioconductor packages available in conda via bioconda, and according to the comments on this SO question, it seems like install.packages() might work as well.
It looks like there is another option from RStudio devs, renv. It's available on CRAN and supersedes Packrat.
In short, you use renv::init() to initialize your project library, and use renv::snapshot() / renv::restore() to save and load the state of your library.
I prefer this option to conda r-environments because here everything is stored in the file renv.lock, which can be committed to a Git repo and distributed to the team.
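A minimal sketch of that workflow (the example package is a placeholder):
install.packages("renv")
renv::init()                    # create a project-local library and renv.lock
install.packages("data.table")  # installed into the project library
renv::snapshot()                # write the exact package versions to renv.lock
renv::restore()                 # elsewhere, reinstall the recorded versions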
To add to this:
Note:
1. Anaconda must already be installed
2. Your working directory is assumed to be "C:"
To create desired environment -> "r_environment_name"
C:\>conda create -n "r_environment_name" r-essentials r-base
To see available environments
C:\>conda info --envs
.
..
...
To activate environment
C:\>conda activate "r_environment_name"
(r_environment_name) C:\>
Launch Jupyter Notebook and let the party begin
(r_environment_name) C:\> jupyter notebook
For something similar to a requirements.txt, perhaps this link will help -> Is there something like requirements.txt for R?
Check out roveR, the R container management solution. For details, see https://www.slideshare.net/DavidKunFF/ownr-technical-introduction, in particular slide 12.
To install roveR, execute the following command in R:
install.packages("rover", repos = c("https://lair.functionalfinances.com/repos/shared", "https://lair.functionalfinances.com/repos/cran"))
To make full use of the power of roveR (including installing specific versions of packages for reproducibility), you will need access to a laiR. For CRAN, you can use our laiR instance at https://lair.ownr.io; for uploading your own packages and sharing them with your organization, you will need a laiR license. You can contact us at the email address in the presentation linked above.
Question: How do I move all of the most up-to-date R packages into one simple location that R (and everything else) will use from now and forever for my packages?
I have been playing around with R on Ubuntu 10.04 using variously RGedit, RCmdr, R shell, and RStudio. Meanwhile, I have installed packages, updated packages, and re-updated packages via apt, synaptic, install.packages(), etc... which apparently means these packages get placed everywhere, and (with the occasional sudo tossed in) with different permissions.
Currently I have different versions of different (and repeated) packages in:
/home/me/R/i486-pc-linux-gnu-library/2.10
/home/me/R/i486-pc-linux-gnu-library/2.14
/home/me/R/i486-pc-linux-gnu-library/
/usr/local/lib/R/site-library
/usr/lib/R/site-library
/usr/lib/R/library
First - I'm a single user, on a single machine - I don't want multiple library locations, I just want it to work.
Second - I am on an extremely slow connection, and can't keep just downloading packages repeatedly.
So - is there an easy way to merge all these library locations into one simple location? Can I just copy the folders over?
How do I set it in concrete that this is and always will be where anything R related looks for and updates packages?
This is maddening.
Thanks for your help.
Yes, it should almost work to just copy the folders over. But pre-2.14 packages WITHOUT a NAMESPACE file probably won't work in R 2.14, where all packages must have a namespace...
And you'd want to manually ensure you only copy the latest version of each package if you have multiple versions...
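Before merging, it can help to see which package versions live in which library. A small sketch along these lines (replace .libPaths() with an explicit vector of the directories above if some are not on the current search path):
## List what is installed in each library location to spot duplicates
for (lib in .libPaths()) {
  pkgs <- installed.packages(lib.loc = lib)[, c("Package", "Version"), drop = FALSE]
  cat("Library:", lib, "-", nrow(pkgs), "packages\n")
  print(pkgs)
}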
If you type .libPaths(), it will tell you where R looks for packages. The first entry in the list is where new packages are typically installed. I suspect that .libPaths() might return different things from RStudio vs. RCmdr, etc.
After piecing together various bits of info, here goes: a complete moron's guide to the R package directory organization:
NB1 - this is my experience with Ubuntu - your mileage may vary
NB2 - I'm a single user on a single machine, and I like things simple.
Ubuntu puts anything installed via apt, or synaptic in:
/usr/lib/R/site-library
/usr/lib/R/library
directories. The default vanilla R install will try to install packages here:
/usr/local/lib/R/site-library
Since these are system directories that the user does not have write privileges to, depending on how you are interacting with R you might be prompted with a friendly "Hey buddy - we can't write there, do you want us to put your packages in your home directory?", which seems innocent and reasonable enough... assuming you never change your GUI or your working environment. If you do, the new GUI/environment might not be looking in the directory where the packages were placed, so it won't find them. (Most interfaces have a way for you to point to where your personal library of packages is, but who wants to muck about in config files?)
What seems to be best practice for me (and feel free to correct me if I'm wrong) with a default install on Ubuntu is to do any package management from a basic R shell run as root: sudo R, and from there do your install.packages() voodoo. This seems to put packages in the /usr/local/lib/R/site-library directory.
At the same time, update.packages() will update the files in the /usr/lib/R/site-library and /usr/lib/R/library directories, as well as /usr/local/lib/R/site-library.
(As for the /usr/lib/R/ split, it looks like library/ holds the core packages, while site-library/ holds anything added, assuming it was installed via apt...)
Any packages previously installed in the wrong place can be moved to the /usr/local/lib/R/site-library directory (assuming you are sudoing it) just by moving the directories (thanks @Tommy), but as /usr/lib/R/ is controlled by apt, it's best not to add or remove anything there...
Whew. Anyway - simple enough, and in simple language. Thanks everyone for the help.