podman mounted volume issue - mount

Bottom line: output from container is not appearing in mounted local directory
I have read the documentation for bind mounts and on another project had success with this.
My Dockerfile:
FROM ubuntu:18.04
RUN apt-get update && apt-get install -yq build-essential autoconf libnetcdf-dev libxml2-dev libproj-dev valgrind wget unzip git nano
# pulls ADMB from GitHub and unzips it into folder /ADMBcode
RUN mkdir /ADMBcode
RUN wget https://github.com/admb-project/admb/releases/download/admb-12.2/admb-12.2-linux.zip
RUN mv admb-12.2-linux.zip /ADMBcode
RUN unzip ADMBcode/admb-12.2-linux.zip -d /ADMBcode
# pulls hydra repo from github into folder HYDRA
RUN mkdir /HYDRA
RUN git clone https://github.com/NOAA-EDAB/hydra_sim.git /HYDRA
# compiles and runs model
WORKDIR /HYDRA
RUN /ADMBcode/admb-12.2/admb hydra_sim.tpl
RUN ./hydra_sim
# create dir for output and move output
#RUN mkdir -p /HYDRA/output/diagnostics
#RUN mkdir /HYDRA/output/indices
# moves output to folder to be mounted
RUN mv *.out /HYDRA/output/diagnostics
RUN mv *.txt /HYDRA/output/indices
I build the image
podman build -t hydra .
and run the container using the following:
podman run --rm --name hydra --mount "type=bind,src=/path_on_local_machine/test,dst=/HYDRA/output" hydra
I have a test folder on my local machine, but the output does not appear there.
I have entered the container
podman run -it hydra
and checked that the output is there
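As a quick sanity check, you could compare what is in the image with what the bind mount shows (a sketch using the image and paths above; a bind mount replaces whatever the image has at the target path, so only files written at runtime end up in the host folder):
podman run --rm hydra ls -R /HYDRA/output
# shows the output baked into the image at build time
podman run --rm --mount "type=bind,src=/path_on_local_machine/test,dst=/HYDRA/output" hydra ls -R /HYDRA/output
# shows the same path with the bind mount in place, i.e. the contents of the host's test folder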
I have done this before for another model and everything worked. Not sure why this one does not.
Any ideas what I am doing wrong?
Thanks

Related

Installing WP plugins after the image builds

I'm trying to install WP plugins by executing a script right after the WordPress image is built.
Here is my Dockerfile:
FROM wordpress
# Update aptitude with new repo
RUN apt-get update
# Install software
RUN apt-get install -y sudo vim curl less git python-dev python3.5
# Add WP-CLI
RUN curl -o /bin/wp-cli.phar https://raw.githubusercontent.com/wp-cli/builds/gh-pages/phar/wp-cli.phar
COPY wp-su.sh /bin/wp
RUN chmod +x /bin/wp-cli.phar /bin/wp && chown www-data:www-data /bin/wp-cli.phar /bin/wp
# Copy scripts into the image
COPY install.py /usr/src/wordpress
COPY plugins.json /usr/src/wordpress
COPY wait-for-it.sh /usr/src/wordpress
RUN chmod +x /usr/src/wordpress/install.py
RUN chmod +x /usr/src/wordpress/plugins.json
RUN chmod +x /usr/src/wordpress/wait-for-it.sh
COPY --chown=www-data:www-data uploads/ /usr/src/wordpress/wp-content/uploads
# Cleanup
RUN apt-get clean
RUN rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*
ENTRYPOINT ["/usr/src/wordpress/wait-for-it.sh", "db:3306", "--", "python" , "/usr/src/wordpress/install.py" ]
CMD ["apache2-foreground"]
Once the script runs I get the following error:
Error: This does not seem to be a WordPress installation.
Pass --path=`path/to/wordpress` or run `wp core download`.
I tried doing what the error suggested but it did not work. As I understand it, the WordPress installation should be located at /usr/src/wordpress along with all the scripts I copied into it. Is what I'm doing correct? Should it even be possible to do what I'm attempting? Any help would be appreciated.
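For reference, what the error message is asking for would look roughly like this (a purely illustrative invocation; the plugin name and path are placeholders):
# hypothetical example: point WP-CLI explicitly at a WordPress install
wp plugin install hello-dolly --activate --path=/usr/src/wordpress --allow-root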
Note: This image is run from docker-compose.yml
UPDATE:
It looks to me like the reason for the above error is that the WordPress installation isn't there yet: by specifying ENTRYPOINT I am overriding the entrypoint provided by the official wordpress image, which copies the WordPress installation into the /var/www/html directory at runtime. Not sure how to get around this.
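One possible workaround (a sketch only, not tested; it assumes the official image's stock entrypoint is /usr/local/bin/docker-entrypoint.sh, which copies WordPress into /var/www/html before starting Apache) would be to chain the install step behind the original entrypoint instead of replacing it:
#!/bin/bash
# custom-entrypoint.sh (hypothetical): run the plugin install in the background
# once the database is reachable, then hand control to the stock entrypoint,
# which copies WordPress into /var/www/html and starts Apache.
# Note: install.py would need to target /var/www/html and wait until the
# WordPress files actually exist there.
set -e
(/usr/src/wordpress/wait-for-it.sh db:3306 -- python /usr/src/wordpress/install.py) &
exec docker-entrypoint.sh "$@"
and in the Dockerfile, instead of the ENTRYPOINT line above:
COPY custom-entrypoint.sh /usr/local/bin/custom-entrypoint.sh
RUN chmod +x /usr/local/bin/custom-entrypoint.sh
ENTRYPOINT ["custom-entrypoint.sh"]
CMD ["apache2-foreground"]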

Application logs to stdout with Shiny Server and Docker

I have a Docker container running a shiny app (Dockerfile here).
Shiny Server logs are output to stdout and application logs are written to /var/log/shiny-server. I'm deploying this container to AWS Fargate, where the logging only displays stdout, which makes debugging a deployed application challenging. I'd like to write the application logs to stdout as well.
I've tried a number of potential solutions:
I've tried the solution provided here, but have had no luck. I added exec xtail /var/log/shiny-server/ to my shiny-server.sh as the last line in the file, but app logs are still not written to stdout.
I noticed that writing application logs to stdout is now the default behavior in rocker/shiny, but since I'm using rocker/verse:3.6.2 (upgraded from 3.6.0 today) along with RUN export ADD=shiny, I don't think this is standard behavior for the rocker/verse:3.6.2 container with the Shiny add-on, so I don't get that default behavior out of the box.
This issue on GitHub suggests an alternative method of forcing application logging to stdout by way of an environment variable, SHINY_LOG_STDERR=1, set at runtime, but I'm not Linux-savvy enough to know where that variable needs to be set to take effect. I found documentation for Shiny Server v1.5.13 that suggests which file to set the environment variable in depending on the Linux distro; however, the output of cat /etc/os-release in my container doesn't really line up with any of the distributions in the Shiny Server documentation, which makes that documentation unhelpful.
I tried adding the environment variable from the GitHub issue above to the docker run command, i.e.,
docker run --rm -e SHINY_LOG_STDERR=1 -p 3838:3838 [my image]
as well as
docker run --rm -e APPLICATION_LOGS_TO_STDOUT=true -p 3838:3838 [my image]
and am still not getting the logs to stdout.
I must be missing something here. Can someone help me identify how to get the application logs to stdout?
You can add the line ENV SHINY_LOG_STDERR=1 to your Dockerfile (at least, this works with rocker/shiny; I'm not sure about rocker/verse). For example, with your Dockerfile:
FROM rocker/verse:3.6.2
## Add shiny capabilities to container
RUN export ADD=shiny && bash /etc/cont-init.d/add
## Install curl and xtail
RUN apt-get update && apt-get install -y \
curl \
xtail
## Add pip3 and other Python packages
RUN sudo apt-get update -y && apt-get install -y python3-pip
RUN pip3 install boto3
## Add R packages
RUN R -e "install.packages(c('shiny', 'tidyverse', 'tidyselect', 'knitr', 'rmarkdown', 'jsonlite', 'odbc', 'dbplyr', 'RMySQL', 'DBI', 'pander', 'sciplot', 'lubridate', 'zoo', 'stringr', 'stringi', 'openxlsx', 'promises', 'future', 'scales', 'ggplot2', 'zip', 'Cairo', 'tinytex', 'reticulate'), repos = 'https://cran.rstudio.com/')"
## Update and install
RUN tlmgr update --self --all
RUN tlmgr install ms
RUN tlmgr install beamer
RUN tlmgr install pgf
#Copy app dir and theme dirs to their respective locations
COPY iarr /srv/shiny-server/iarr
COPY iarr/reports/interim_annual_report/theme/SwCustom /opt/TinyTeX/texmf-dist/tex/latex/beamer/
#Force texlive to find my custom beamer theme
RUN texhash
EXPOSE 3838
## Add shiny-server information
COPY shiny-server.sh /usr/bin/shiny-server.sh
COPY shiny-customized.config /etc/shiny-server/shiny-server.conf
## Add dos2unix to eliminate Win-style line-endings and run
RUN apt-get update -y && apt-get install -y dos2unix
RUN dos2unix /usr/bin/shiny-server.sh && apt-get --purge remove -y dos2unix && rm -rf /var/lib/apt/lists/*
# Write application logs to stderr so they show up in the container output
ENV SHINY_LOG_STDERR=1
RUN ["chmod", "+x", "/usr/bin/shiny-server.sh"]
CMD ["/usr/bin/shiny-server.sh"]

Docker: "File does not exist" error on Ubuntu

I have an R plumber server that I want to run in a Docker container, and I have this configuration so far in my Dockerfile:
FROM rocker/r-ver:3.5.0
#update OS and install linux libraries needed to run plumber
RUN apt-get update -qq && apt-get install -y \
libssl-dev \
libcurl4-gnutls-dev
#load in dependencies from 00_Libraries.R file
RUN R -e "install.packages('plumber')"
#Copy all files from current directory
COPY / /
#Expose port :80 for traffic
EXPOSE 80
#when the container starts, start the runscript.R script
ENTRYPOINT ["Rscript", "runscript.R"]
In my runscript.R file, I have my server configuration like this:
pr <- plumber::plumb("/home/kristoffer/Desktop/plumber-api/rfiles/plumber.R")$run(port=8000)
whenever i try to run the docker image, i get this error:
File does not exist: /home/kristoffer/Desktop/plumber-api/rfiles/plumber.R
Execution halted
I have ensured that all the necessary files are located in the right directory.
EDIT:
I included an image of all the files in my directory to show that the Dockerfile is in the same directory as my other files.
You are copying everything under /. Change your COPY command to:
COPY . .
and make sure that the Dockerfile is in the same directory.
Besides this,
plumber::plumb("/home/kristoffer/Desktop/plumber-api/rfiles/plumber.R")
will also not work, since that path does not exist inside your container. Change it to:
plumber::plumb("plumber.R")
if this file is in the same directory.
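Putting it together, a minimal runscript.R that matches the rest of the Dockerfile might look like this (a sketch: it binds to 0.0.0.0 so the API is reachable from outside the container, and uses port 80 only because that is the port the Dockerfile exposes):
# runscript.R (sketch); plumber.R is copied next to this script by COPY . .
pr <- plumber::plumb("plumber.R")
pr$run(host = "0.0.0.0", port = 80)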

shiny app docker container failed to launch in browser

I'm trying to run my shiny app in a docker container.
My app folder structure is like this:
myApp (directory)
-app (directory)
--ui.R
--server.R
--global.R
--style.css
--mydata.xlsx
--mydata2.rds
--functions.R (contains functions I use in app)
-Dockerfile
-shiny-server.conf
-shiny-server.sh
I can cd into the myApp directory and run the app locally with runApp('app'); my Shiny app runs perfectly.
However, when I try to build the image and run it, it gives me an error.
docker build -t myshinyapp_obs .
docker run -p 80:80 myshinyapp_obs
The application failed to start.
The application exited during initialization.
The Docker image build process seems to be fine, but when I run the container I get the error above.
The interesting thing is that when I simply copy an app from the Shiny gallery and put its ui.R and server.R files under the app folder, it works fine.
My question is: why is my app not working, given that:
my app works perfectly locally
after copying a Shiny example from the gallery into the app folder, the example app works fine
How can that happen? I cannot figure it out. I've spent hours trying to make it work but failed.
Below is my Dockerfile:
# Install R version 3.6
FROM r-base:3.6.0
# Install Ubuntu packages
RUN apt-get update && apt-get install -y \
sudo \
gdebi-core \
pandoc \
pandoc-citeproc \
libcurl4-gnutls-dev \
libcairo2-dev/unstable \
libxt-dev \
libssl-dev
# Download and install ShinyServer (latest version)
RUN wget --no-verbose https://s3.amazonaws.com/rstudio-shiny-server-os-build/ubuntu-12.04/x86_64/VERSION -O "version.txt" && \
VERSION=$(cat version.txt) && \
wget --no-verbose "https://s3.amazonaws.com/rstudio-shiny-server-os-build/ubuntu-12.04/x86_64/shiny-server-$VERSION-amd64.deb" -O ss-latest.deb && \
gdebi -n ss-latest.deb && \
rm -f version.txt ss-latest.deb
# Install R packages that are required
# TODO: add further package if you need!
RUN R -e "install.packages(c('devtools','readxl','tidyverse','rlang','shiny','shinythemes', 'DT'), repos='http://cran.rstudio.com/')"
# Copy configuration files into the Docker image
COPY shiny-server.conf /etc/shiny-server/shiny-server.conf
COPY /app /srv/shiny-server/
# Make the ShinyApp available at port 80
EXPOSE 80
# Copy further configuration files into the Docker image
COPY shiny-server.sh /usr/bin/shiny-server.sh
CMD ["/usr/bin/shiny-server.sh"]

Cannot copy intermediate docker container files to host

I have a Dockerfile that runs dotnet publish, and the DLLs are copied into an intermediate Docker container. I would like to copy the DLLs generated in the container to my local (host) system as well.
I believe we can use the docker cp command to do that, but I am not able to find a way to get the intermediate container ID to use with it.
Syntax: docker cp CONTAINER:Container_Path Host_Path.
Please suggest any other, better solution for this scenario.
Dockerfile:
FROM microsoft/aspnetcore-build:1.1.4 as builder
COPY . /Code
RUN dotnet restore /Code/MyProj.csproj
RUN dotnet publish -c Release /Code/MyProj.csproj
RUN cp CONTAINER: /Code/bin/Release/netcoreapp1.1/publish /binaries
Thanks.
This answer is outside of the Dockerfile.
First, your Dockerfile would have to declare a volume:
VOLUME /my/path/in/container
To get files into and out of that volume, you can use tar -cf and tar -xf to move files between the container and the host.
To put files from newfiles.tar in the host's current directory into the container's /my/path/in/container mount:
docker run --rm \
-v my-volume-data:/my/path/in/container -v $(pwd):/newfiles ubuntu bash -c \
"cd /my/path/in/container && tar -xf /newfiles/newfiles.tar"
To get files out of the container's /my/path/in/container mount into origfiles.tar on the host:
docker run --rm \
-v my-volume-data:/my/path/in/container -v $(pwd):/newfiles ubuntu bash -c \
"cd /my/path/in/container && tar -cf /newfiles/origfiles.tar"
Adding --user 1000:1000 to these commands is optional; it is useful if your container runs as a user with uid 1000, so the created files get matching ownership.
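Alternatively, if the goal is just to pull the published DLLs out of the build image onto the host, you can create a stopped container from the image and copy from it (a sketch; the image tag is a placeholder and the publish path is taken from the question's Dockerfile):
docker build -t myproj-build .
id=$(docker create myproj-build)    # creates the container without starting it
docker cp "$id":/Code/bin/Release/netcoreapp1.1/publish ./binaries
docker rm -v "$id"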
