The idea is to run an R script in a Docker container. The R script itself works fine and creates the file alpha.csv. Inside the container, however, the script does not start. If I run it by hand from the root directory with Rscript /home/script/master.R, I get an error message.
The error message:
Error in file(file, ifelse(append, "a", "w")) :
cannot open the connection
Calls: write.csv -> eval.parent -> eval -> eval -> write.table -> file
In addition: Warning message:
In file(file, ifelse(append, "a", "w")) :
cannot open file '../output/alpha.csv': No such file or directory
Execution halted
I copy the script to /home/master.r in the container.
Here is my Dockerfile:
From rocker/r-base:latest
# Create directories
RUN mkdir -p home/data home/output home/script
# Copy files
COPY /src/data/test.csv /home/data/test.csv
COPY /src/master.R /home/script/master.R
COPY /src/install_packages.R /home/script/install_packages.R
# Install R-packages
RUN Rscript /home/script/install_packages.R
# Run the script
CMD Rscript /home/script/master.R
The second problem is that I need groff. So I tried this:
install.packages('groff', dependencies = TRUE, repos='http://cran.us.r-project.org')
The error message:
Installing package into ‘/usr/local/lib/R/site-library’
(as ‘lib’ is unspecified)
Warning message:
package ‘groff’ is not available (for R version 3.6.1)
Installing package into ‘/usr/local/lib/R/site-library’
(as ‘lib’ is unspecified)
Warning message:
package ‘pandoc’ is not available (for R version 3.6.1)
How do I run the container? I tried this:
docker run -it --rm test
Concerning the first issue, you should set the WORKDIR directive to /home/script (see the short illustration below). Concerning groff, I'm not aware of any R package with that name, but I am familiar with the command itself and I think that is what you actually want installed, so the Dockerfile below installs it with apt-get rather than install.packages().
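To see why WORKDIR matters for the write.csv error, here is a minimal R sketch (alpha is a hypothetical data frame standing in for whatever master.R actually writes):
# The relative path in master.R is resolved against the current working directory,
# which is exactly what WORKDIR sets for the CMD step.
setwd("/home/script")                           # effect of WORKDIR /home/script
alpha <- data.frame(x = 1:3)                    # hypothetical stand-in for the real data
write.csv(alpha, file = "../output/alpha.csv")  # now resolves to /home/output/alpha.csv
When the script is run from / instead, as in the question, ../output/alpha.csv resolves to /output/alpha.csv, which does not exist; that matches the "No such file or directory" warning above.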
This should be the resulting Dockerfile:
FROM rocker/r-base:latest
RUN apt-get update \
&& apt-get install -yq --no-install-recommends groff \
&& rm -rf /var/lib/apt/lists/*
# Create directories
RUN mkdir -p /home/data /home/output /home/script
WORKDIR /home/script
# Install R-packages
COPY ./src/install_packages.R /home/script/install_packages.R
RUN Rscript /home/script/install_packages.R
# Copy data and script
COPY ./src/data/test.csv /home/data/test.csv
COPY ./src/master.R /home/script/master.R
# Run the script
CMD Rscript /home/script/master.R
Concerning Dockerfile writing in general, I'd suggest checking https://docs.docker.com/develop/develop-images/dockerfile_best-practices/
If your test.csv file changes often, I suggest not copying it into the Docker image but mounting its folder when the container starts. I suppose you also want to access the output files once the execution is done, so you should mount the output folder as well. You can take these commands as examples:
docker build --tag newtest .
docker run \
-it --rm \
-v "$(pwd)/src/data/:/home/data/" \
-v "$(pwd)/src/output/:/home/output/" \
newtest
Related
I am building a Singularity container to run a custom R script for tree segmentation using the lidR package.
I have written the Singularity definition file as follows:
Bootstrap: docker
From: ubuntu:20.04
%setup
touch test.R
touch treeSeg_dalponte2016.R
touch /home/ljeasson/R/x86_64-pc-linux-gnu-library/3.6/rgdal/libs/rgdal.so
%files
test.R
treeSeg_dalponte2016.R
/home/ljeasson/R/x86_64-pc-linux-gnu-library/3.6/rgdal/libs/rgdal.so
%post
# Disable interactivity, including region and time zone
export DEBIAN_FRONTEND="noninteractive"
export DEBCONF_NONINTERACTIVE_SEEN=true
# Update apt and install necessary libraries and repositories
apt update
apt install -y build-essential r-base-core software-properties-common dirmngr apt-transport-https lsb-release ca-certificates
add-apt-repository ppa:ubuntugis/ubuntugis-unstable
apt install -y libgdal-dev libgeos++-dev libudunits2-dev libproj-dev libx11-dev libgl1-mesa-dev libglu1-mesa-dev libfreetype6-dev libnode-dev libxt-dev libfftw3-dev
apt clean
# Install necessary R packages and dependencies
R -e "install.packages('lidR', dependencies = TRUE)"
R -e "install.packages('raster', dependencies = TRUE)"
R -e "install.packages('sf', dependencies = TRUE)"
R -e "install.packages('dplyr', dependencies = TRUE)"
R -e "install.packages('rgdal', dependencies = TRUE, repos='https://cran.rstudio.com', configure.args=c('--with-gdal-config=/opt/conda/bin/gdal-config', '--with-proj-include=/opt/conda/include', '--with-proj-lib=/opt/conda/lib', '--with-proj-share=/opt/conda/share/proj/'))"
R -e "install.packages('gdalUtils', dependencies = TRUE, repos='https://cran.rstudio.com')"
%test
#!/bin/bash
R --version
Rscript test.R
%runscript
#!/bin/sh
echo "Arguments received: $*"
Rscript treeSeg_dalponte2016.R $*
I build the container using singularity build ga_container.sif ga_container.def.
The container builds without error, but when it is run using ./ga_container <arguments>, this error always occurs:
Error: package or namespace load failed for 'rgdal' in dyn.load(file, DLLpath = DLLpath, ...):
unable to load shared object '/home/ljeasson/R/x86_64-pc-linux-gnu-library/3.6/rgdal/libs/rgdal.so':
libgdal.so.26: cannot open shared object file: No such file or directory
Execution halted
I know that the error is occurring because it cannot find the shared object for rgdal, even though it seems I've attached it to the container in the %setup and %files sections:
%setup
touch test.R
touch treeSeg_dalponte2016.R
touch /home/ljeasson/R/x86_64-pc-linux-gnu-library/3.6/rgdal/libs/rgdal.so
%files
test.R
treeSeg_dalponte2016.R
/home/ljeasson/R/x86_64-pc-linux-gnu-library/3.6/rgdal/libs/rgdal.so
If the error is from incorrect file attachment, how do I ensure that rgdal (and other similar libraries) are attached correctly within the Singularity container?
Thanks in advance
This looks like an environment issue causing R in the container to look at your locally installed R packages instead of the ones installed in the image, perhaps via your .Rprofile or the R_LIBS/R_LIBS_USER environment variables. Try running with singularity run --cleanenv ..., or temporarily moving your .Rprofile if you have one, and see if that fixes it (a quick way to check what R is actually seeing is sketched below). If not, I have a few other observations.
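A minimal check, run from inside the container (e.g. via singularity exec ga_container.sif R):
# If a host path such as ~/R/x86_64-pc-linux-gnu-library/3.6 comes first in the output,
# R is loading rgdal from your bind-mounted home directory rather than from the image.
.libPaths()
Sys.getenv(c("R_LIBS", "R_LIBS_USER"))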
First, the %setup block is creating root owned, empty files on the host OS if they don't exist already. An empty .so file would certainly cause problems. For the majority of cases you don't want to use %setup, as it directly modifies the host as root during sudo singularity build.
In the %files block you are copying those (potentially root-owned/empty) files to a path in the image that matches your home directory. Your $HOME is automatically mounted when you run/exec/shell an image, which will hide any files in the image at that location. When adding files to an image, you should always put them in a place where they are unlikely to get clobbered by a mount; /opt/myapp or something similar usually works well.
Additionally, test.R and treeSeg_dalponte2016.R are copied to /test.R and /treeSeg_dalponte2016.R inside the container, but relative paths are used in %runscript and %test. Singularity run/exec will attempt to run from the first path that exists in the container: $PWD (implicitly mounted, but this can fail silently), then $HOME (also implicitly mounted and can fail silently), then /. You can use singularity --verbose run ... to see if anything isn't being mounted correctly, and add echo $PWD to %runscript to see where it's running from.
In %post, when you install the rgdal package, you specify several paths under /opt/conda/..., but conda is not installed or configured in the image. I'm not familiar with rgdal, so I don't know whether that would cause problems, though.
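If the apt-installed GDAL/PROJ libraries from the ubuntugis PPA (libgdal-dev, libproj-dev, set up in %post) are what you intend to build against, one possible simplification, sketched here, is to drop the /opt/conda configure arguments entirely:
# Sketch: let configure pick up the apt-installed gdal-config and PROJ from the default paths
# instead of pointing it at an /opt/conda tree that does not exist in this image.
install.packages("rgdal", dependencies = TRUE, repos = "https://cran.rstudio.com")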
I managed to install the following package in my local RStudio session:
install.packages("devtools")
devtools::install_github("tidyverse/googlesheets4")
I am now trying to install it directly in the Dockerfile of my RStudio session on my server.
For that, below the line:
RUN R -e "install.packages(c('DT', 'shiny', 'DBI', 'devtools'))"
I have added:
RUN R -e "devtools::install_version('tidyverse/googlesheets4')"
But when I launch the docker-compose up -d command I get this error:
Error in package_find_repo(package, repos) : couldn't find package 'tidyverse/googlesheets4'
Calls: ... -> download_version_url -> package_find_repo
Execution halted
ERROR: Service 'rstudio' failed to build: The command '/bin/sh -c R -e "devtools::install_version('tidyverse/googlesheets4')"' returned a non-zero code: 1
Do you know what could be the issue?
Thanks a lot.
I installed ShinyProxy using docker-compose.
When going to my Shiny app, I am running into this error:
Status code: 500
Message: Failed to start container
and when checking the error message I see:
starting container process caused \"exec: \\"R\\": executable file not found in $PATH\": unknown"}
I am not sure I understand what it means.
In case that helps, the last lines of my Shiny Dockerfile are:
EXPOSE 3838
CMD ["R", "-e", "shiny::runApp('/root/app')"]
and in my application.yml the container-cmd line is
container-cmd: ["R", "-e", "shiny::runApp('/root/app')"]
Do you see any misspelling?
Also, as an FYI (I don't know if this is useful information), I noticed that:
- There is no R folder in /usr/lib
- And there is no R folder in /usr/bin/.
And I don't understand why.
Thanks for your help!
EDIT1:
I just installed R, and now I see R in the /usr/bin/ folder, but there is still nothing in /usr/lib and I still get the same error message.
EDIT2:
I don't understand one thing: I see the R packages being installed in /usr/local/lib/R, BUT I see nothing in this folder after docker-compose up is done:
$ cd /usr/local/lib
$ ls
$
EDIT3:
As requested, I attach below the Dockerfile for my RStudio container and the Dockerfile for my Shiny container:
RStudio Dockerfile:
FROM rocker/tidyverse:3.6.1
## Create directories
RUN mkdir -p /rstudio
RUN mkdir -p /rscripts
RUN R -e "install.packages(c('rvest','shiny','DT', 'digest', 'RCurl', 'caTools', 'bitops', 'httr', 'curl', 'stringr', 'mailR', 'xlsx', 'knitr', 'kableExtra' ,'rmarkdown', 'data.table', 'RSelenium'), repos = 'http://cran.us.r-project.org')"
Shiny Dockerfile:
FROM rocker/shiny:3.5.1
RUN apt-get update && apt-get install libcurl4-openssl-dev libv8-3.14-dev -y &&\
mkdir -p /var/lib/shiny-server/bookmarks/shiny &&\
mkdir -p /root/app
# Download and install library
RUN R -e "install.packages(c('mailR', 'shinydashboard', 'shinyjs', 'V8', 'DT', 'shiny', 'rvest', 'dplyr', 'htmltools', 'promises', 'jsonlite', 'data.table', 'rlang', 'xml2', 'digest', 'XML','rmarkdown'))"
# copy the app to the image
COPY app /root/app
COPY Rprofile.site /usr/local/lib/R/etc
# make all app files readable (solves issue when dev in Windows, but building in Ubuntu)
RUN chmod -R 755 /root/app
RUN chmod -R 755 /usr/local/lib/R/etc
EXPOSE 3838
CMD ["R", "-e", "shiny::runApp('/root/app')"]
I am trying to install some packages from source (including a package that I have created), which install fine from the R console or even with R CMD INSTALL.
However, while building the Docker image from a Dockerfile, I get this error for the following line in the Dockerfile:
RUN R -e 'install.packages("RcppDIUtilsPackage_1.0.tar.gz",repos=NULL,type="source")'
I also tried many other commands, including R CMD INSTALL; they all install the package fine, except within the Docker image build.
Here is the error I am encountering:
Installing package into ‘/usr/local/lib/R/site-library’
(as ‘lib’ is unspecified)
Warning: invalid package ‘RcppDIUtilsPackage_1.0.tar.gz’
Error: ERROR: no packages specified
Warning message:
In install.packages("RcppDIUtilsPackage_1.0.tar.gz", repos = NULL, :
installation of package ‘RcppDIUtilsPackage_1.0.tar.gz’ had non-zero exit status
Thanks!!
Edit: The Dockerfile
FROM rocker/r-ver:3.4.4
WORKDIR /home/ubuntu/projects/DService
RUN apt-get update -qq && apt-get install -y \
libssl-dev \
libcurl4-gnutls-dev
RUN R -e "install.packages('plumber')"
RUN R -e "install.packages('Rcpp')"
RUN R -e 'install.packages("RcppDIUtilsPackage_1.0.tar.gz",repos=NULL,type="source")'
COPY / /
EXPOSE 8000
CMD ["Rscript", "DService.R"]
command: sudo docker build --no-cache -t dservice-docker-image .
This is an indirect solution to your problem, because I was not able to resolve the same issue.
The root of the issue may have something to do with the host environment that created the Docker image from the Dockerfile. Specifically, the R instance that is spun up to install the local packages may not be able to access the path where your local packages are stored.
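Put differently, the tarball has to already exist inside the image, at the exact path given to install.packages(), at the moment that RUN step executes. A minimal sketch of what that would require (the /opt/pkgs location is hypothetical and would need a COPY earlier in the Dockerfile):
# Sketch only: this works only if an earlier Dockerfile step has already copied the tarball
# to this (hypothetical) absolute path; in the Dockerfile above, COPY / / comes after the
# install step, so the file is not yet inside the image when install.packages() runs.
install.packages("/opt/pkgs/RcppDIUtilsPackage_1.0.tar.gz", repos = NULL, type = "source")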
The solution for me was to just avoid local packages: move any local repositories to remote repositories and reference them in the Dockerfile instead, e.g.:
RUN R -e "devtools::install_github('dmanuge/shinyFilesWidget') ; system('echo 14')"
After that, rebuild your Docker image and run it accordingly. While this is not a direct solution, I reached the critical threshold of debugging and needed to move on. :)
I use Docker to run a Shiny app, and I install some R packages in the Dockerfile (here is the relevant portion of the Dockerfile; I omitted some lines, marking them with <...>):
FROM r-base:latest
RUN apt-get update && apt-get install -y -t unstable \
sudo \
gdebi-core \
make \
git \
gcc \
<...>
R -e "install.packages(c('shiny', 'rmarkdown'), repos='https://cran.rstudio.com/')" && \
R -e "install.packages(c('ada','bsplus','caret','ddalpha','diptest','doMC','dplyr','e1071','evtree','fastAdaboost','foreach','GGally','ggplot2','gridExtra','iterators','kernlab','lattice','markdown','MASS','mboost','nnet','optparse','partykit','plyr','pROC','PRROC','randomForest','recipes','reshape2','RSNNS','scales','shinyBS','shinyFiles','shinythemes'))"
This works fine. But if I add one more R package (DT), the image still builds fine (and I can see that the package gets installed properly), but when I try to run the container I get:
Loading required package: shiny
Error in dir.exists(lib) : invalid filename argument
Calls: <Anonymous> ... load_libraries -> get_package -> install.packages -> dir.exists
Execution halted
This error is not informative at all, and I can't figure out what could possibly be wrong. I would appreciate any ideas! Thank you.