I am trying to install some packages from source (including a package that I have created), which install fine from the R console or with R CMD INSTALL.
However, while building a Docker image using a Dockerfile, I get an error for this line in the Dockerfile:
RUN R -e 'install.packages("RcppDIUtilsPackage_1.0.tar.gz",repos=NULL,type="source")'
I also tried many other commands, including R CMD INSTALL; all of them install the package fine except within the Docker image build.
Here is the error I am encountering:
Installing package into ‘/usr/local/lib/R/site-library’
(as ‘lib’ is unspecified)
Warning: invalid package ‘RcppDIUtilsPackage_1.0.tar.gz’
Error: ERROR: no packages specified
Warning message:
In install.packages("RcppDIUtilsPackage_1.0.tar.gz", repos = NULL, :
installation of package ‘RcppDIUtilsPackage_1.0.tar.gz’ had non-zero exit status
Thanks!!
Edit: The Dockerfile
FROM rocker/r-ver:3.4.4
WORKDIR /home/ubuntu/projects/DService
RUN apt-get update -qq && apt-get install -y \
libssl-dev \
libcurl4-gnutls-dev
RUN R -e "install.packages('plumber')"
RUN R -e "install.packages('Rcpp')"
RUN R -e 'install.packages("RcppDIUtilsPackage_1.0.tar.gz",repos=NULL,type="source")'
COPY / /
EXPOSE 8000
CMD ["Rscript", "DService.R"]
command: sudo docker build --no-cache -t dservice-docker-image .
This is an indirect solution to your problem, because I was not able to resolve the same issue.
The root of the issue may have something to do with the host environment that created the Docker image from the Dockerfile. Specifically, the R instance that is spun up to install the local packages may not be able to access the path where your local packages are stored.
The solution for me was to just avoid local packages. Move any local repositories to remote repositories, and reference them in the Dockerfile instead. e.g.
RUN R -e "devtools::install_github('dmanuge/shinyFilesWidget') ; system('echo 14')"
After that, rebuild your Docker image and run it accordingly. While this is not a direct solution, I reached the critical threshold of debugging and needed to move on. :)
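If you do want to keep the local tarball, one hedged alternative (I have not verified it against this exact setup) is to copy the file into the image before the install step, since a RUN command can only see files that have already been copied into the image; the destination below assumes the WORKDIR from the question's Dockerfile and that the tarball sits next to the Dockerfile:
# Assumption: RcppDIUtilsPackage_1.0.tar.gz is in the build context next to the Dockerfile
COPY RcppDIUtilsPackage_1.0.tar.gz ./
RUN R -e 'install.packages("RcppDIUtilsPackage_1.0.tar.gz", repos = NULL, type = "source")'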
I'm trying to install plumber and RPostgreSQL into my Docker image. Here's my Dockerfile:
FROM rocker/r-base
RUN R -e "install.packages('plumber')"
RUN R -e "install.packages('RPostgreSQL')"
RUN mkdir -p /code
COPY ./plumber.R /code/plumber.R
CMD Rscript --no-save /code/plumber.R
The only thing my plumber script does is try to reference the RPostgreSQL package:
library('RPostgreSQL')
When I build, it appears to install both packages successfully, but when my script runs, it complains that RPostgreSQL doesn't exist. I've tried other base images and many other things.
Any help appreciated. Thanks!
You are trying to install RPostgres and then trying to load RPostgreSQL -- these are different packages. Hence the error.
Next, as you are on r-base, the latter is more easily installed with sudo apt install r-cran-rpostgresql (maybe after an initial sudo apt update). While you're at it, you can also install plumber as a pre-made binary (along with its dependencies). So
RUN apt update -qq \
&& apt install --yes --no-install-recommends \
r-cran-rpostgresql \
r-cran-plumber
is easier and faster.
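Putting that together with the rest of your setup, a minimal sketch of the full Dockerfile (unchanged apart from the package installation step) could look like this:
FROM rocker/r-base
# Install RPostgreSQL and plumber as pre-built Debian binaries
RUN apt update -qq \
&& apt install --yes --no-install-recommends \
r-cran-rpostgresql \
r-cran-plumber
RUN mkdir -p /code
COPY ./plumber.R /code/plumber.R
CMD Rscript --no-save /code/plumber.R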
I'm trying to set up CI for an R package. In that regard I'm considering CircleCI, which has worked for previous R projects. However, this time I get the following error:
Downloading renv 0.14.0 ... OK (downloaded source)
Installing renv 0.14.0 ... Done!
Successfully installed and loaded renv 0.14.0.
Project '~/main' loaded. [renv 0.14.0]
devtools::install_deps(dependencies = TRUE)
Error in loadNamespace(x) : there is no package called ‘devtools’
Calls: loadNamespace -> withRestarts -> withOneRestart -> doWithOneRestart
Execution halted
My .circleci/config.yml looks similar to this one:
version: 2
jobs:
  build:
    docker:
      - image: my_random_image
    steps:
      - checkout
      - run:
          name: Install package dependencies
          command: R -e "devtools::install_deps(dep = TRUE)"
      - run:
          name: Build package
          command: R CMD build .
      - run:
          name: Check package
          command: R CMD check *tar.gz
and my_random_image looks as follows:
FROM r-base:4.1.2
RUN apt-get update \
&& apt-get install git libssl-dev ssh texlive-latex-base texlive-fonts-recommended
libcurl4-openssl-dev libxml2-dev -y \
&& rm -rf /var/lib/apt/lists/*
RUN R -e "install.packages(c('devtools', 'roxygen2'), repos='http://cran.us.r- project.org')"
So it's pretty standard stuff, as far as I can see. The error only occurs if renv is part of my R package; otherwise CircleCI does not complain and runs as expected without any errors.
I would like to keep renv in my R project, so I am struggling to understand the issue and how to solve it.
I appreciate any help!
The issue here is most likely that your run stage, here:
- run:
    name: Install package dependencies
    command: R -e "devtools::install_deps(dep = TRUE)"
installs packages into the default user / site libraries, but when R is launched in your project's working directory:
Downloading renv 0.14.0 ... OK (downloaded source)
Installing renv 0.14.0 ... Done!
Successfully installed and loaded renv 0.14.0.
Project '~/main' loaded. [renv 0.14.0]
the renv autoloader is automatically downloading renv, and activating the renv project library.
By default, renv isolates projects from the user / site library, so the packages installed in your earlier steps are not visible within the project. This behavior is intentional, and ensures that different project libraries are isolated both from changes in the user / site libraries, as well as in other project libraries.
One of the following should help:
If your renv.lock is up to date, call renv::restore() before trying to use devtools or other packages;
Allow renv to see the user library, with e.g. the environment variable RENV_CONFIG_USER_LIBRARY = TRUE.
I'd recommend reading https://rstudio.github.io/renv/articles/renv.html and https://rstudio.github.io/renv/articles/ci.html if you haven't already.
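For example, a hedged sketch of the first option in the CircleCI config, assuming devtools (and everything else you need) is recorded in your renv.lock so that the restore makes it available inside the project library:
      - run:
          name: Restore renv project library
          command: R -e "renv::restore()"
      - run:
          name: Install package dependencies
          command: R -e "devtools::install_deps(dep = TRUE)"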
Say you have the following list of packages you would like to install for a Docker image:
("jsonlite","dplyr","stringr","tidyr","lubridate",
"knitr","purrr","tm","cba","caret",
"plumber","httr")
It actually takes around 1 hour to install these!
Any suggestions on how to speed this up? (Or how to prevent the re-installation at every new image build?)
Side note
I do not install these packages from the Dockerfile like this:
RUN Rscript -e "install.packages('stringr')"
...
Instead, I create an R script Requirements.R which installs these packages, and simply do:
RUN Rscript Requirements.R
Is this less optimal than installing the packages directly from the Dockerfile?
Use binary packages where you can, as we often do in the Rocker Project, which provides multiple Dockerfiles for R, including the official r-base one.
If you start from Ubuntu, you get Michael's PPAs with over 3000 packages; if you start from Debian you get fewer from the distro, but still many essential ones. (There are some efforts to bring more binary packages to Debian, but nothing is up right now.)
Lastly, Dockerfile creation is of course compile time too. You spend the time once (per container creation) and re-use it potentially many times after. Also, by using Docker Hub you can avoid spending your local CPU cycles.
Edit in Sep 2020: The (updated) Ubuntu PPA now has over 4600 packages for the three most recent LTS releases. Still highly, highly recommended.
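As an illustration only (the PPA names below are my assumption of the repositories being referred to, i.e. Michael Rutter's Launchpad PPAs; adjust them if the naming has changed), an Ubuntu-based image could pull binary R packages roughly like this:
FROM ubuntu:20.04
ARG DEBIAN_FRONTEND=noninteractive
# Add the R PPAs so that r-cran-* binary packages become available (names assumed)
RUN apt-get update -qq \
&& apt-get install -y --no-install-recommends software-properties-common dirmngr gnupg \
&& add-apt-repository -y ppa:marutter/rrutter4.0 \
&& add-apt-repository -y ppa:c2d4u.team/c2d4u4.0+ \
&& apt-get update -qq \
&& apt-get install -y --no-install-recommends r-base r-cran-dplyr r-cran-caret r-cran-plumber \
&& rm -rf /var/lib/apt/lists/*
Installing binaries this way typically takes minutes rather than the hour needed to compile everything from source.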
I found an article that described how to install R packages from precompiled binaries. It reduced the build time on our Jenkins server from 45 minutes down to 3 minutes.
Here is my Dockerfile:
FROM rocker/r-apt:bionic
WORKDIR /app
RUN apt-get update && \
apt-get install -y libxml2-dev
# Install binaries (see https://datawookie.netlify.com/blog/2019/01/docker-images-for-r-r-base-versus-r-apt/)
COPY ./requirements-bin.txt .
RUN cat requirements-bin.txt | xargs apt-get install -y -qq
# Install remaining packages from source
COPY ./requirements-src.R .
RUN Rscript requirements-src.R
# Clean up package registry
RUN rm -rf /var/lib/apt/lists/*
COPY ./src /app
EXPOSE 5000
CMD ["Rscript", "Server.R"]
You can add a file requirements-bin.txt with package names:
r-cran-plumber
r-cran-quanteda
r-cran-irlba
r-cran-lsa
r-cran-caret
r-cran-stringr
r-cran-dplyr
r-cran-magrittr
r-cran-randomforest
And finally, a requirements-src.R for packages that are not available as binaries:
pkgs <- c(
'otherpackage'
)
install.packages(pkgs)
I ended up using rocker/r-base as @Dirk Eddelbuettel suggested. Also, thanks to the question "How to avoid reinstalling packages when building Docker image for Python projects?", I wrote my Dockerfile in a way that doesn't reinstall packages every time I rebuild my Docker image.
I want to share what my Dockerfile looks like now; hopefully this will be of help to others:
FROM rocker/r-base
RUN apt-get update
# install packages
RUN apt-get -y install libcurl4-openssl-dev
RUN apt-get -y install libssl-dev
# set work directory
WORKDIR /myapp
# copy requirements R script
COPY ./Requirements.R /myapp/Requirements.R
# run requirements R script
RUN Rscript Requirements.R
COPY . /myapp
EXPOSE 8094
ENV NAME R-test-service
CMD ["Rscript", "my_R_api.R"]
Let me confess first that I am new to the Docker / OpenCPU world. Here is the issue.
I installed Docker from the OpenCPU site on my Windows 10 box.
I was able to successfully run the container with "docker run --name myDocker -t -p 80:80 -p 8004:8004 opencpu/rstudio".
I successfully installed my R package with "R CMD INSTALL /tmp/AnotherPackage_0.1.0.tar.gz".
The only issue now is that I can't see my package at http://localhost/ocpu/test/: in the figure below, my package does not appear in the right-hand box (which shows all the other packages).
If I enter /library/AnotherPackage in the Endpoint text box, I can see my package's description, etc.
You probably installed the package in another library. Can you show us the output of your R CMD INSTALL line? In particular the final line that starts with installing to....
To install into the global library, either install as user opencpu:
sudo su opencpu
R CMD INSTALL /tmp/AnotherPackage_0.1.0.tar.gz
Or install as root:
sudo -i
R CMD INSTALL /tmp/AnotherPackage_0.1.0.tar.gz
I think you are running it as the opencpu user, which means user-installed packages go to /ocpu/user/{username}/library/{pkgname}/. See here for how to get a root shell so that your package ends up in /ocpu/library/{pkgname}/ as you expected.
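As a rough sketch (the container name myDocker is taken from the question; the exec flags assume a reasonably recent Docker CLI), getting a root shell inside the running container and installing from there would look something like:
# Open a root shell inside the running container
sudo docker exec -it -u root myDocker /bin/bash
# Then, inside the container, install into the global library
R CMD INSTALL /tmp/AnotherPackage_0.1.0.tar.gz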
I am trying to host a Shiny app on a remote Debian machine, but I have encountered an R version issue when installing the shiny package. I will walk through the steps that I took:
After SSHing into the VM, I install and update r-base:
sudo apt-get update
sudo apt-get install r-base
sudo apt-get install r-base-dev
The latest R version I can get this way is 3.1.1. Then I tried to install the "shiny" package as root with the following command:
sudo su - -c "R -e \"install.packages('shiny', repos='http://cran.rstudio.com/')\""
Then I got an error message like:
Installing package into ‘/usr/local/lib/R/site-library’
(as ‘lib’ is unspecified)
Warning: unable to access index for repository http://cran.rstudio.com/src/contrib
Warning message:
package ‘shiny’ is not available (for R version 3.1.1)
Is there any workaround for this issue, such as getting apt-get to install a newer R version rather than 3.1.1, or possibly installing shiny from a GitHub repo? Please help! Thanks!
You should be able to build R yourself, rather than using apt-get. This way you can choose which release to install. For example:
wget http://cran.rstudio.com/src/base/R-3/R-3.2.2.tar.gz
tar zxvf R-3.2.2.tar.gz; cd R-3.2.2/
./configure; make;
sudo make install
Then you can get shiny through the terminal as well, rather than within R:
wget https://cran.r-project.org/src/contrib/shiny_0.13.2.tar.gz
sudo R CMD INSTALL shiny_0.13.2.tar.gz
Credit to Huiong Tian, from whom I learned this a while back:
http://withr.me/install-shiny-server-on-raspberry-pi/
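As a quick sanity check (hedged: the paths assume the default make install prefix of /usr/local), you can confirm that the newly built R is the one on your PATH and that shiny was installed under it:
which R        # should print /usr/local/bin/R
R --version    # should report 3.2.2
R -e "library(shiny); packageVersion('shiny')"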