I am trying to implement GitLab CI for one of our Drupal projects. For example: when a developer pushes code to the repo in GitLab, automated tests should run in the GitLab CI pipeline, which tests the code and reports whether the code quality is good enough to accept the change. If not, the developer needs to correct the coding standards violations and fix the issues reported. I have tried different combinations of the gitlab-ci.yml file with little success.
Can anyone share a sample gitlab-ci.yml file that can be used for Drupal projects, which will trigger the CI pipeline and also run the tests using the GitLab Runner, so I could build upon it? I have already read through the GitLab CI documentation.
You can start with the GitLab-provided PHP template. This will get you a Docker image with PHP, Xdebug, and your project's Composer dependencies installed.
As configured, the template expects your project to have a phpunit.xml file in the root directory.
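A minimal sketch building on that template (the PHP version, package list, and paths are assumptions, and it assumes PHPUnit and drupal/coder are installed as Composer dev dependencies; adjust everything to your project):

image: php:7.4

before_script:
  # system packages and Composer; Drupal core requires ext-gd
  - apt-get update -yqq && apt-get install -yqq git unzip libpng-dev
  - docker-php-ext-install gd
  - curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
  - composer install

phpunit:
  script:
    # assumes phpunit.xml sits in the repository root, as the template expects
    - vendor/bin/phpunit

coding_standards:
  script:
    # assumes drupal/coder registers the Drupal standard with phpcs
    - vendor/bin/phpcs --standard=Drupal web/modules/custom

With something like this, every push runs both jobs, and a failing job marks the pipeline red so the developer knows to fix the reported issues before the change is accepted.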
Related
I'm just starting work on a website which I want to integrate Elasticsearch into. In my development environment, I will need to install ES so that I and other devs can quickly get started with minimum effort.
We're using ASP.NET for the website (so I know all the devs will be running the website on Windows) and Git for source control.
Previously, on another project, I followed the installation guide and simply source-controlled the following folders in ES:
/bin/
/config/
/lib/
/modules/
Please note: the above folders were for ES 2.x, so they may differ slightly for 5.x.
I then created a simple .bat file which devs run when they start working on the project:
cd %~dp0\elastic\bin\
start elasticsearch
cd %~dp0
All the script does is run ES.
However, I wonder if I should even be source controlling these files. Perhaps it would be better to have a .bat file which downloads a fresh copy of ES when a developer starts work?
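A sketch of that alternative (the pinned version and download URL are assumptions, and Expand-Archive requires PowerShell 5+):

@echo off
rem download-es.bat -- fetch and unpack a pinned ES version on first run, then start it
cd /d %~dp0
set ES_VERSION=5.6.16
if not exist elastic\ (
    powershell -Command "Invoke-WebRequest https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-%ES_VERSION%.zip -OutFile es.zip"
    powershell -Command "Expand-Archive es.zip -DestinationPath ."
    ren elasticsearch-%ES_VERSION% elastic
    del es.zip
)
cd elastic\bin
start elasticsearch
cd /d %~dp0

That keeps the binaries out of Git entirely and pins the version in one place.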
I have 3 stages (dev / staging / production). I've successfully set up publishing for each, so that the code will be deployed, using msbuild, to the correct location, with the correct web configs transformed - all within Jenkins.
The problem I'm having is that I don't know how to deploy to staging exactly what was built on dev (and from staging to production). I'm currently using SVN for source control, so I think I would need to somehow save the latest revision number dev has built and tell Jenkins to build/deploy staging based on that number?
Is there a way to do this, or a better alternative?
Any help would be appreciated.
Edit: I decided to use the saved-revision-number method, which passes a file containing the revision number to the next job. To do this, I followed this answer:
How to promote a specific build number from another job in Jenkins?
It explains how to copy an artifact from one job to another using the promotion plugin. For the artifact itself, I added an "Execute Windows batch command" build step after the main build with:
echo DEV_ENVIRONMENT_CORE_REVISION:%SVN_REVISION%>env.properties
Then, in the staging job, following the guide above, I copied that file and used the EnvInject plugin to read it and set an environment variable, which can then be used as a parameter in the SVN repository URL.
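For illustration, the staging job's Repository URL then pegs the revision read from env.properties, something like this (the repository URL itself is a made-up example; the Subversion plugin supports the @REVISION suffix):

http://svn.example.com/project/trunk@${DEV_ENVIRONMENT_CORE_REVISION}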
You should be able to identify the changeset number that was built in DEV and manually pass that changeset to the Jenkins build to pull the same changeset from SVN. Obviously that makes your deployment more manual. Maybe you can set up Jenkins to publish the changeset number to a file and then have the later environment's build read that file for the changeset number.
We used to use this model as well, and it was always complex. Eventually we moved to a build-once, deploy-many-times model using WebDeploy. This has made the process much simpler. Check it out: http://www.dotnetcatch.com/2016/04/16/msbuild-once-msdeploy-many-times/
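For a flavor of that model, a rough sketch (server names, credentials, and the package name are all assumptions):

rem build and package once
msbuild MyWebApp.csproj /p:Configuration=Release /p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageLocation=MyWebApp.zip

rem deploy the same package to each environment in turn
msdeploy -verb:sync -source:package=MyWebApp.zip -dest:auto,computerName="https://staging-server:8172/msdeploy.axd?site=MyWebApp",authType=Basic,userName=deploy,password=secret -allowUntrusted

The key point is that the binaries deployed to staging and production are byte-for-byte the ones that were tested on dev.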
I currently have a pipeline that I use to build reports in R and publish them in Jekyll. I keep my files under version control on GitHub, and that's been working great so far.
Recently I began thinking about how I might take R, Ruby, and Jekyll and build a Docker image that any of my coworkers could download and run the same report with, without having all of the packages and gems set up on their computer. I looked at Docker Hub and found that the automated builds triggered by Git commits were a very interesting feature.
I want to build an image that I could use to run this configuration and keep it under version control as well and keep it up to date in Docker Hub. How does something like this work?
If I just kept my current setup, I could add a Dockerfile to my repo and Docker Hub would build my image for me; I just think it would be interesting to run my work on the same image.
Any thoughts on how a pipeline like this may work?
The Docker Hub build service should work (https://docs.docker.com/docker-hub/builds/). You can also consider using GitLab CI or Travis CI (GitLab will be useful for private projects; it also provides a private Docker registry).
You should have two Dockerfiles: one with all the dependencies, and a second, very minimal one for the reports (builds will be much faster). Something like:
# reports image: builds on the base image from the second repository
FROM base_image:0.1
COPY . /reports
WORKDIR /reports
# placeholder: whatever Jekyll invocation generates your site
RUN replace-with-required-jekyll-magic
The Dockerfile above should live in your reports repository.
In the second repository you can create the base image with all the tools, plus nginx or something similar for serving static files. Make sure the nginx www root is set to /reports. If you need to update the tools, just update the base_image tag in the reports Dockerfile.
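A rough sketch of that base image (the distribution, package names, and nginx config edit are assumptions):

# base image: R, Ruby/Jekyll, and nginx for serving the generated reports
FROM debian:stretch
RUN apt-get update && apt-get install -y \
        r-base \
        ruby-full \
        build-essential \
        nginx \
    && gem install jekyll \
    && rm -rf /var/lib/apt/lists/*
# point nginx's default site at /reports, matching the reports Dockerfile above
RUN sed -i 's|/var/www/html|/reports|' /etc/nginx/sites-available/default
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]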
I am a front-end developer. I do a lot of PHP/CSS/JS and HTML. Currently, our deployment to the staging environment works like this: we push our code to the Git server, go to our staging servers and pull into some directory, and then manually move the files from that directory to the correct directories in our Apache web server.
Will it be overkill if I use TeamCity to do this? I intend to write an Ant script that does the copying, which means the runner type will be Ant. So every time there is a push to the Git repo, TeamCity will pull and then run the Ant script to copy the affected code to the correct directories, something like the sketch below.
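For concreteness, the kind of Ant script I have in mind (the paths and file patterns are assumptions):

<project name="deploy" default="deploy">
    <!-- where TeamCity checks out the repo, and where Apache serves from -->
    <property name="checkout.dir" value="${basedir}"/>
    <property name="webroot" value="/var/www/html"/>

    <target name="deploy">
        <!-- copy the web assets, preserving directory structure -->
        <copy todir="${webroot}" overwrite="true">
            <fileset dir="${checkout.dir}">
                <include name="**/*.php"/>
                <include name="**/*.css"/>
                <include name="**/*.js"/>
                <include name="**/*.html"/>
            </fileset>
        </copy>
    </target>
</project>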
If not, I will gladly listen to any suggestions.
Thanks
TeamCity may be overkill right now, as you would just be using it as a fancy trigger for your build.
But consider adding custom build parameters, which it can pass to your script. You can then start automating builds to different environments through a friendly UI.
You then have a platform on which to base a proper deployment process further down the road.
Come the time when you need PHP compilation, JS minification, or unit testing, it's all just another step in your TC configuration.
I would recommend it.
Here is what I am trying to do: I have my code sitting on Bitbucket (it is an ASP.NET web application). I want to put together a build machine that has a web-based dashboard. I go to the dashboard and say: OK, show me all the branches and stuff on my Bitbucket project. Now please get the latest code from this branch for me and build it. Then please deploy it to this location, or maybe this other location. I want this dashboard to give me a history of all these builds and deployments too. I am very new to this concept; I have been reading about CC.net, MSBuild, and other tools, but I cannot find the right answer. Please point me in the right direction.
The point of a build server is that it automatically runs a build each time you commit something to your repository.
In order for the build server to know exactly what to do, you normally put a build script (with MSBuild or NAnt) into your solution which does everything you want: building your solution, maybe creating a setup package, and so on.
The build server basically knows where the repository is and where in the repository your build script is.
You need to configure this once in the build server, and then it will always run after you commit (but you can also start a build manually, if you want).
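For illustration, a minimal MSBuild script of that kind (file and target names are assumptions):

<!-- build.proj: the build server just invokes "msbuild build.proj" -->
<Project DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <Configuration>Release</Configuration>
  </PropertyGroup>
  <Target Name="Build">
    <!-- compile the whole solution in the chosen configuration -->
    <MSBuild Projects="MyWebApp.sln" Properties="Configuration=$(Configuration)" />
  </Target>
</Project>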
If you want a solution with web-based dashboard, try TeamCity.
It's a commercial product, but the free Professional edition covers small setups (it is limited by the number of build configurations, not users).
You can do everything in the web interface - configuration, running the builds AND browsing the build history.
EDIT:
Houda, concerning your question about deployment:
I don't think that TeamCity has a "deployment mode" in that sense. What you could do is include the deployment stuff in your build script that is run by TeamCity.
So, after the build itself is finished, copy the generated assemblies and files to your web server(s).
If you do it this way, you HAVE to make sure in the build script that the deployment only happens if the build didn't fail (and, if you have unit tests, only if the unit tests passed as well).
This is very important for a live application, because if you don't take care of this well enough, your app will immediately go offline every time someone commits "bad" code to your repository (and it will stay offline until the next "good" commit)!
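In MSBuild terms, that guard is just target dependencies, roughly like this (target names and paths are assumptions):

<!-- Deploy runs only if Build and Test succeeded; MSBuild stops at the first failed target -->
<Target Name="Deploy" DependsOnTargets="Build;Test">
  <ItemGroup>
    <DeployFiles Include="bin\Release\**\*.*" />
  </ItemGroup>
  <Copy SourceFiles="@(DeployFiles)" DestinationFolder="\\webserver\site\%(RecursiveDir)" />
</Target>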
EDIT 2:
See Lasse V. Karlsen's comment below: it's more comfortable with the new TeamCity version 6.