R volunteers currently maintain Ubuntu package repositories for R ~3.5 and ~4.0. For Bionic Beaver, these are:
https://cloud.r-project.org/bin/linux/ubuntu/bionic-cran35/
https://cloud.r-project.org/bin/linux/ubuntu/bionic-cran40/
I am building separate Singularity containers, each of which needs a very specific version of R installed; those versions appear to be provided by these repositories. Specifically, I'm looking to build containers for R versions 3.6.1, 4.0.3 and 4.1.0, one container per version.
I do this in the container build script by first adding the appropriate Apt source, then running the install with a pinned version. I noticed that I could only get the install to run if I used the precise version numbers listed in the package repository and also included r-recommended pinned to the same version. For example, for R 3.6.1:
apt install -y r-base=3.6.1-3bionic r-recommended=3.6.1-3bionic
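(For reference, the apt-source step mentioned above looks roughly like this; the repository line and signing key are copied from the CRAN instructions for bionic-cran35 and stand in for whatever the actual build script does.)
echo "deb https://cloud.r-project.org/bin/linux/ubuntu bionic-cran35/" > /etc/apt/sources.list.d/cran.list
apt-key adv --keyserver keyserver.ubuntu.com --recv-keys E298A3A825C0D65DFD57CBB651716619E084DAB9
apt update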
That pinned install correctly installs r-base and r-recommended at the given versions. However, when I run the containerised R, it actually reports itself to be at the latest version provided by those repositories (3.6.3, 4.1.0 and 4.1.0, respectively). Given that r-base itself is pinned correctly, this may even suggest the installations are in a broken state.
Looking through Apt's output, it's clear that many other r-* packages are defaulting to the latest versions, rather than the versions I specified. In an attempt to get around this, I tried explicitly setting the versions on all the packages that are defaulting to the latest version. For example, again with R 3.6.1:
apt install -y r-base=3.6.1-3bionic \
r-base-core=3.6.1-3bionic \
r-base-dev=3.6.1-3bionic \
r-base-html=3.6.1-3bionic \
r-doc-html=3.6.1-3bionic \
r-recommended=3.6.1-3bionic
However, this refuses to work, complaining about conflicts with other packages it's trying to install (r-cran-* packages, IIRC).
I don't know if this is an Apt-thing, an R-thing, or something to do with their repositories. Is there a way I can get these specific versions installed from the official sources, without having to build anything myself? (If not, what's the point of them keeping the older versions in their repositories?)
Thanks to @Chris's tip-off, it became clear that the structure of these R packages is important to understand.
r-base is a metapackage which pulls in, amongst other things, r-base-core and r-recommended. r-recommended is another metapackage which pulls in a suite of recommended R packages; it is these that introduce the conflicts when trying to pin everything to a specific version.
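You can inspect this structure yourself with apt (output elided here):
apt-cache depends r-base          # shows r-base-core, r-recommended and friends
apt-cache depends r-recommended   # shows the suite of r-cran-* recommended packages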
For just the R binaries and the documentation, pinned to a specific ${VERSION}, this will do the trick:
apt install -y --no-install-recommends \
r-base-core=${VERSION} \
r-base-html=${VERSION} \
r-doc-html=${VERSION}
If you want to build packages, you'd also want r-base-dev=${VERSION} in there.
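To discover the exact ${VERSION} strings a repository offers (e.g. 3.6.1-3bionic), something like the following should work:
apt-cache madison r-base-core   # lists every available version and the repository it comes from
apt-cache policy r-base-core    # shows the installed and candidate versions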
Related
On Ubuntu in a Conda environment with Python 3.7.3, when I run
conda install -c conda-forge opencv
I get OpenCV 3.4.2 (checked with import cv2 and then cv2.__version__) even though https://anaconda.org/conda-forge/opencv indicates version 4.1.1. Why?
Note that I didn't have OpenCV installed previously (I ran conda uninstall opencv and it got completely removed)
tl;dr You likely have previously installed dependencies that need updating. If you require a specific version, say 4.1, then express this to Conda:
conda install -c conda-forge opencv=4.1
Explanation
How Conda Interprets Specifications
A literal translation of the command
conda install -c conda-forge opencv
would go something like
With the conda-forge channel included, ensure that some version of the package opencv is installed in the currently active environment.
The logic here implies that any version it can install would be a valid solution. It also doesn't tell Conda that the package must come from Conda Forge, only that that channel should be included.[1]
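If you do want packages to come from conda-forge whenever a compatible build exists there, one option (see footnote [1]) is strict channel priority:
conda config --set channel_priority strict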
Two-Stage Solve Strategy
Starting with v4.7, Conda uses a two-stage dependency solving strategy. The two stages are
Solve with an implicit --freeze-installed|--no-update-deps flag. This attempts to find the newest version of the requested package that has no conflicts with installed packages. That is, it considers any installation of the package, no matter the version, to be a satisfactory solution. If it works, then it's done. Otherwise, move on to...
An unrestricted solve (what used to be the default in Conda < 4.7). This frees up dependencies to be updated and will often result in the latest versions being installed, unless there are previous explicit specifications on those packages.[2]
This strategy aims to provide a faster solve and install experience, by avoiding having to change anything in your environment. It also helps keep the environment stable by avoiding unnecessary version changes.
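Roughly speaking, the two-stage behaviour is as if Conda first tried a frozen solve and only then fell back to an unrestricted one; the commands below are only an illustration of that behaviour, not what Conda literally runs internally:
conda install --freeze-installed -c conda-forge opencv   # stage 1: keep already-installed packages as they are
conda install --update-deps -c conda-forge opencv        # stage 2: allow installed dependencies to move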
Specific Failure in Question
What happened in OP's case? One of OpenCV's dependency requirements was likely newer in v4.1.1 than what was already installed, but the installed version of that dependency was still compatible with OpenCV 3.4.2. Hence, the only change needed was adding opencv plus its missing dependencies. Technically, this is a valid solution, since one only asked for some version of opencv to be installed.
Getting the Latest Version
Option: Specifying the Version
If you know you want a specific version then you can always specify it
conda install -c conda-forge opencv=4.1.1
and since Conda can't install this without updating something in your env, the first round of solve will fail, and the full solve will get it for you.
Option: Skip the Freeze
Of course, you may not always know what the latest version number is and don't want to have to look this up on Anaconda Cloud every time. Fortunately, there is the --update-deps flag that essentially skips over the first solve stage and goes straight to the full solve. This will install the latest version for your system, as well as update any of the dependencies.
conda install --update-deps -c conda-forge opencv
Important Note: The --update-deps flag has a side-effect of converting dependencies to explicit specifications. While this is an internal environment state (managed through <env>/conda-meta/history), it does have some behavioral consequences (bugs!):
the result of the conda env export --from-history command will subsequently include all packages, instead of just the ones the user explicitly requested in the past
conda remove will not be able to prune dependencies; e.g., if scipy was installed, it would pull in numpy; if only scipy depended on numpy and scipy was removed, normally numpy would also get removed. This wouldn't work after using the --update-deps flag.
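As a hypothetical illustration of that last point:
conda install scipy                                # pulls in numpy as an implicit dependency
conda install --update-deps -c conda-forge opencv  # converts numpy (and other deps) to explicit specs
conda remove scipy                                 # numpy is now explicit, so it will not be pruned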
[1]: The behavior here depends on the channel_priority configuration option. With the strict setting, conda-forge would be prioritized over other channels; with the flexible setting, it is simply added to the list and the latest compatible version from any channel is selected.
[2]: One can check the explicit specifications of an environment with conda env export --from-history.
"There is no package called 'tidyverse'" is the error message I get after doing this:
install.packages('tidyverse', dependencies = T);
install.packages('DBI', dependencies = T);
library(DBI);
library(tidyverse);
I use Ubuntu 18.04 and RStudio.
Could anyone sort me out here, please?
You may find this blog post and associated video useful -- it shows how to install all of tidyverse on Ubuntu directly from prebuilt binaries with one command.
In short, that is what PPAs are good for. The associated slides have the relevant commands.
And once you do the required step of adding the two PPAs and running sudo apt-get update (again, both detailed in the slides) then all it takes is a single sudo apt-get install r-cran-tidyverse as the video shows.
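For reference, the whole sequence looks roughly like this; the PPA names below are my assumption of what the slides list for this Ubuntu release, so do check the slides rather than copying these blindly:
sudo add-apt-repository -y ppa:marutter/rrutter3.5
sudo add-apt-repository -y ppa:marutter/c2d4u3.5
sudo apt-get update
sudo apt-get install r-cran-tidyverse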
Added bonus: because you install pre-made binaries, it is the fastest possible installation.
Edit three years later: We now have r2u, which, thanks to its use of bspm plus its complete set of CRAN binaries, lets you simply use install.packages("tidyverse") to install all packages as binaries, along with all dependencies, in a matter of seconds, as shown in a few GIFs on the site, my blog, and elsewhere. Plus, anybody can try it in the browser via Gitpod from the r2u site.
The default setup of R on Linux is to compile packages from source, since CRAN only provides binaries for macOS and Windows. This is not the recommended way to install packages on Ubuntu. As pointed out by @DirkEddelbuettel in the edit to his answer, you can use r2u and bspm to obtain binaries for all CRAN packages. This requires some initial setup but results in a much better user experience. If you insist on compiling the tidyverse yourself, my old answer remains below.
Old answer
tidyverse has external dependencies that cannot be installed through R and that aren't preinstalled in Ubuntu. Install the following packages via the terminal:
sudo apt install libcurl4-openssl-dev libssl-dev libxml2-dev
Run install.packages("tidyverse") again after that.
You can find more help regarding this here.
We need to install R-base version 3.5+ on an offline machine running SLES 12.3.
We have downloaded all the packages from the SUSE R repo
http://download.opensuse.org/repositories/devel:/languages:/R:/released/openSUSE_12.3/x86_64/
While running zypper install on the packages, there are additional dependencies for which we are not able to find the relevant packages to download.
These include:
libtcl8.5.so()(64bit)
libgomp.so.1()(64bit)
But we are not able to find the dependency packages that include these libraries.
What would be the correct approach for installing these libraries offline? Where can we find them?
Is there a better way to install R-base offline? We tried to follow the instructions on the CRAN RStudio page.
The files you downloaded don't match the distribution you're running. SUSE Linux Enterprise (SLE) and openSUSE are similar in some ways, but they are really two separate distributions and you cannot always mix binaries between the two. To install R on SLE Server 12.3, you should use the repository https://download.opensuse.org/repositories/devel:/languages:/R:/released/SLE_12/.
You can find out these URLs by looking at the right-hand column at https://build.opensuse.org/project/show/devel:languages:R:released. Look for entries called "SLE" there.
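If the machine had network access, adding that repository would look something like the following (the repo alias is arbitrary); for a genuinely offline install, download the RPMs from that SLE_12 directory on another machine and point zypper at the local files instead:
sudo zypper addrepo https://download.opensuse.org/repositories/devel:/languages:/R:/released/SLE_12/ devel_languages_R
sudo zypper refresh
sudo zypper install R-base R-base-devel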
Install the Development Tools, according to this answer
zypper install --type pattern Basis-Devel
Download R source and install it
wget http://cran.univ-paris1.fr/src/base/R-3/R-3.5.0.tar.gz
tar zxf R-3.5.0.tar.gz
cd R-3.5.0
./configure --enable-R-shlib
make
make check
make install
Maybe there are still dependencies missing, which would need to be installed with zypper (I don't have any SUSE system to try myself). With this method you get an "empty" R, and you will install R packages one by one (with R CMD INSTALL). Maybe not the best answer for your needs, but an answer.
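Installing a package into such a from-source R then looks like this, where the tarball name is just a placeholder for one you have downloaded from CRAN:
R CMD INSTALL some_package_1.0.tar.gz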
I'm doing an internship and working on a Debian server for my R scripts. However, the version installed on the server is really outdated (2.15.1), and I think it might be the reason for some errors I have with my scripts (which work on my Windows PC with R 3.3).
But I am a total beginner with Linux and I'm stuck. I know there is a tutorial (https://cran.r-project.org/bin/linux/debian/), but it uses very specific vocabulary I don't completely understand, and my inexperience with Linux servers makes it hard to understand exactly what I have to do.
Is it possible to have more explanation on how to install R 3.3 on a Debian server?
Here are the details from sessionInfo() on the server:
R version 2.15.1 (2012-06-22)
Platform: i486-pc-linux-gnu (32-bit)
I would suggest that you install the '-dev' version of base R
sudo apt-get install r-base r-base-dev
and then as a regular user use R's install.packages() to install additional packages. This will result in an installation where R and its base packages are accessible to all but owned by root (and therefore difficult for a regular user to update or mess up), and other packages belong to the regular user (and hence are easy to update).
Some packages may have system dependencies, e.g., XML requires the libxml2 and libcurl libraries. The '-dev' versions of these libraries also need to be installed, most easily via apt-get:
sudo apt-get install libxml2-dev libcurl4-openssl-dev
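After that, installing the R package itself as a regular user is just (shown via Rscript so it fits a shell session; calling install.packages() inside an interactive R session works the same way):
Rscript -e 'install.packages("XML", repos = "https://cloud.r-project.org")'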
It may be that your version of apt knows nothing about r-base / r-base-dev. You should then follow the section 'Installing R-devel or a release branch from svn' in the document you mention; skip over the instructions in the 'R-devel' section, and instead follow 'r-patched'.
My R version is 2.7.1 (on Debian) and some packages are asking for > 2.10. I cannot find updating instructions and I don't want to remove and reinstall as I have other things depending on R and I don't want to mess up. Is there an update procedure?
Closest thing to my problem is on this thread.
Check out the instructions for installing from source. It's easy on a Linux box, and you can do the install in any directory you like; you don't even need superuser permissions. Once compiled, you can even run R from that directory without messing up any system-installed R. As long as you give the full path to R's binary when starting it, or put the path to it in your PATH environment variable, it will work fine.
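A minimal sketch of such a user-local build, assuming R 3.3.3 and an install prefix under your home directory (both are just examples):
wget https://cran.r-project.org/src/base/R-3/R-3.3.3.tar.gz
tar xzf R-3.3.3.tar.gz
cd R-3.3.3
./configure --prefix=$HOME/opt/R-3.3.3
make
make install
$HOME/opt/R-3.3.3/bin/R --version   # run via the full path, or add the bin directory to PATH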
FYI
It seems that R on Debian at versions prior to 2.7.1 cannot be updated in place.
The current core runs from 2.7.1 up. The only way to do it is to remove the existing version.
As this was not straightforward, I post it here. If you have rApache or other things connecting to R, disable them first (e.g., with a2dismod or similar).
apt-get purge r-base r-base-dev
I had to do this as well
dpkg -P r-base-core
until this shows no more installed R packages
dpkg -l r-*
Then follow the instructions from http://cran.r-project.org/bin/linux/debian/, with the amendment that you should use deb instead of deb-src in /etc/apt/sources.list.
deb http://<favorite-cran-mirror>/bin/linux/debian lenny-cran/
Before installing, run the following; it should no longer list 2.7.1 as the candidate version.
apt-cache policy r-base-dev
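Once that shows a newer candidate coming from the CRAN repository, the install itself is simply:
apt-get update
apt-get install r-base r-base-dev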