I have created a Docker image using the Dockerfile below. There is a jar file in the image which needs a few parameters to run. I am passing the parameters with the docker run command, but it throws an error. Details below.
Dockerfile content
FROM ubuntu:14.04
ENV http_proxy http://http.proxy.nxp.com:7000
ENV https_proxy http://http.proxy.nxp.com:7000
RUN apt-get update
<set of lines for installing java is here>
ENV JAVA_HOME /usr/lib/jvm/java-8-oracle
copy apache-jmeter-3.1 /apache-jmeter-3.1
RUN mkdir /jarloc
copy Test.jar /jarloc
RUN java -version
ENTRYPOINT [ java -jar /jarloc/Test.jar ]
RUN ls -l /jarloc
I built an image called jmaster:1.0 and ran the following command to spin up the container.
docker run jmaster:1.0 http://win_loc/soasta_parent/soasta/MyPOC/Login_Data.csv http://win_loc/soasta_parent/soasta/MyPOC/Dpc_data.csv 30 300 30
This gives me the following error.
http://win_loc/soasta_parent/soasta/MyPOC/Login_Data.csv: 1: [:
missing ]
I am able to run this from inside the container (docker run -it jmaster:1.0 /bin/bash) and it gives me the correct output. But when I try to pass the parameters in the docker run command, I get this error. Am I passing them the wrong way, or is there another way to do this?
When I go inside the container using 'docker run -it imagename /bin/bash' and execute the following, I get the correct results from the jar.
/jarloc#java -jar Test.jar http://win_loc/soasta_parent/soasta/MyPOC/Login_Data.csv http://win_loc/soasta_parent/soasta/MyPOC/Dpc_data.csv 30 300 30
Try with
ENTRYPOINT ["java","-jar","/jarloc/Test.jar"]
That should make docker run pass the parameters through to the jar.
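The reason the original form fails is that [ java -jar /jarloc/Test.jar ] is not valid JSON, so Docker falls back to the shell form and hands the line to /bin/sh, where [ is the test builtin, which is where the "missing ]" complaint comes from. With the exec (JSON array) form, everything after the image name on docker run is appended as arguments, roughly like this:
# Exec form: `docker run jmaster:1.0 arg1 arg2 ...` ends up running
#   java -jar /jarloc/Test.jar arg1 arg2 ...
ENTRYPOINT ["java","-jar","/jarloc/Test.jar"]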
I use the geospatial rocker2 image to deploy RStudio for development and a Shiny app for production. By using a single image, I have a consistent package library, credentials and database connections. I would like to use this same image to serve a plumber API.
Using the standard plumber.R example and the standard plumber Docker example, I have tried to serve it as follows:
docker run -v `pwd`/app/plumber.R:/plumber.R --name plumber --restart=unless-stopped \
-p 8000:8000 my_rocker2_fork/geospatial Rscript /plumber.R
Success, kind of. The plumber.R file is clearly being sourced, but it is not being "plumbed".
Another issue is that the container continually restarts (visible in the output of docker ps). One more oddity is that port 8000 isn't always shown in that output; sometimes it is, sometimes it isn't. I think this is related to the restarting behaviour.
My code isn't plumbed because I don't have the ENTRYPOINT that is standard in the rstudio/plumber Dockerfile, and I don't think I want that ENTRYPOINT, as it may cause issues with RStudio Server and the Shiny app that are also in this image. Therefore, I think it is probably best to "plumb" by expanding the Rscript command at the end of my docker run statement:
docker run -v `pwd`/app/plumber.R:/plumber.R -p 8000:8000 my_rocker2_fork/geospatial \
'Rscript pr("/plumber.R") %>% pr_run(port = 8000)' &
However, this fails because of all the special characters (like the pipe operator). How can I serve plumber code with an arbitrary Dockerfile without an ENTRYPOINT?
The answer is simple! Call a script that sets the plumbing in motion, e.g.
docker run -v `pwd`/app/plumb_start.R:/plumb_start.R -p 8000:8000 my_rocker2_fork/geospatial \
Rscript plumb_start.R
Where plumb_start.R contains:
pr("plumber.R") %>% pr_run(port=8000)
Make sure that you also expose port 8000 in the Dockerfile.
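To check that the API is actually reachable from the host, you can hit one of the endpoints defined in the standard plumber.R example (assuming it still contains the usual /echo endpoint):
curl "http://localhost:8000/echo?msg=hello"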
I keep getting this error even though my repo name is lowercase. The command I'm running is:
sudo docker container run --rm -p 3838:3838 -v /home/ubuntu/la-liga-2018-2019-stats/stats/:/srv/shiny-server/stats -v /home/ubuntu/log/shiny-server/:/var/log/shiny-server/ BorisRendon/shinyauth
I'm trying to deploy a Shiny app to AWS using Docker and I can't get past this step.
You should probably use docker run and not docker container run.
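That is, something along these lines; note as well that Docker requires the repository part of an image reference to be all lowercase, so the lowercased tag shown here is an assumption about how the image was actually named:
sudo docker run --rm -p 3838:3838 \
  -v /home/ubuntu/la-liga-2018-2019-stats/stats/:/srv/shiny-server/stats \
  -v /home/ubuntu/log/shiny-server/:/var/log/shiny-server/ \
  borisrendon/shinyauth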
Well, I'm new to Docker and I need to implement a Shiny app in a Docker container.
I have the image from https://hub.docker.com/r/rocker/shiny/, which includes Shiny Server, but I don't know how to deploy my app on the server.
I want to deploy the app on the server, install the packages my app requires inside the Docker image, save the changes and export the image/container.
As I said, I'm new to Docker and I don't really know how it works.
Any idea?
I guess you should start by creating a Dockerfile in a specific folder, which would look something like this:
FROM rocker/shiny:latest
RUN echo 'install.packages(c("package1","package2", ...), \
repos="http://cran.us.r-project.org", \
dependencies=TRUE)' > /tmp/packages.R \
&& Rscript /tmp/packages.R
EXPOSE 3838
CMD ["/usr/bin/shiny-server.sh"]
Then go into this folder and build your image, giving it a name, with this command:
docker build -t your-tag .
Finally, once your image is built you can create a container. If you don't forget to map the volume and the port, you should be able to reach the app at localhost:3838 with the following command, launched from the folder containing the srv folder:
docker run --rm -p 3838:3838 -v $PWD/srv/shinyapps/:/srv/shiny-server/ -v $PWD/srv/shinylog/:/var/log/shiny-server/ your-tag
As noted in the documentation at https://hub.docker.com/r/rocker/shiny/, you might want to launch it in detached mode with the -d option and map it to your host's port 80 for a real deployment.
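For example, a detached deployment mapped to the host's port 80 might look like this (same mounts as above):
docker run -d -p 80:3838 -v $PWD/srv/shinyapps/:/srv/shiny-server/ -v $PWD/srv/shinylog/:/var/log/shiny-server/ your-tag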
The link (https://hub.docker.com/r/rocker/shiny/) covers how to deploy Shiny Server.
Simplest way would be:
docker run --rm -p 3838:3838 rocker/shiny
If you want to extend Shiny Server, you can write your own Dockerfile starting from the shiny image as the base image (https://docs.docker.com/engine/reference/builder/).
Dockerfile:
FROM rocker/shiny:latest
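From there you might append something like the following to that Dockerfile; the package names and the myapp folder are placeholders for your own app:
# install any extra R packages the app needs (names are placeholders)
RUN Rscript -e 'install.packages(c("package1","package2"), repos="http://cran.us.r-project.org")'
# copy the app into Shiny Server's app directory ("myapp" is a hypothetical folder)
COPY myapp /srv/shiny-server/myapp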
I have pulled the kaggle/rstats image on my local Windows machine. I want to run a local script, script.r, in the Kaggle image.
My script.r is stored in "D:/codes/script.r". I have installed Docker and pulled the kaggle/rstats image in "E:/docker".
Can somebody please help with how to run script.r in the kaggle/rstats Docker image?
I have been using the following command to run it, but there is some issue with it that I can't figure out.
docker run -v $PWD:D:/codes -w=D:/codes --rm -it kaggle/rstats Rscript script.r
Output:
docker: Error response from daemon: Invalid bind mount spec "/c/Users/Rohan:D:/codes": invalid mode: /codes.
See 'E:\Docker Toolbox\docker.exe run --help'.
I tried the following as well:
docker run -v /D:/codes -w=/D:/codes --rm -it kaggle/rstats Rscript script.r
Output:
Fatal error: cannot open file 'script.r': No such file or directory
script.r is present in D:/codes, so I'm not sure why it says there is no such file.
What is wrong with the command?
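For what it's worth, -v expects host_path:container_path, and with Docker Toolbox only C:\Users is shared into the VM by default (as /c/Users), so a sketch along these lines may be closer to what the daemon expects; the /codes container path is an assumption, and the script would have to live somewhere under C:\Users:
docker run -v /c/Users/Rohan/codes:/codes -w /codes --rm -it kaggle/rstats Rscript script.r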
I am a bit new to Docker and I have been trying to deploy a Meteor container with my Meteor application. I have been using the Dockerfile and instructions from https://registry.hub.docker.com/u/golden/meteor-dev/
However, I can't run docker run -p 3000:3000 -t -i -v /path/to/meteor/app:/opt/application -w /opt/application meteor-dev because my Docker version (0.5.3) does not recognize the flag (-w) for setting the working directory.
Is there some workaround to set the working directory with Docker 0.5.3? The working directory is already set in the Dockerfile, but I guess I need to set it again when I run the container.
Well, my workaround was to create a bash script that changes to the working directory and then calls the commands one by one. I put the bash script where my source is located ("/path/to/meteor/app") and call docker run -p 3000:3000 -t -i -v /path/to/meteor/app:/opt/application meteor-dev bash /opt/application/start.sh, with bash as the command and my script as the argument.
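For reference, the start.sh referred to above can be as simple as a script that changes into the mounted directory before launching the app; the meteor command below is an assumption about what you normally run in that directory:
#!/bin/bash
# emulate the missing -w flag by cd'ing into the mounted app directory
cd /opt/application
# then start the app as usual (command assumed)
meteor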