Basically I want some advice on designing an integration test suite for QA using Cucumber BDD in a microservice/integration-application environment.
Should QA maintain a separate codebase and GitHub repository for their test suite, or is it better to have the code reside inside each application's codebase?
I am building an integration test suite using Cucumber BDD to test an enterprise application platform. Each application in this platform has many microservices (built with Spring Boot) and integration apps (Apache Camel and Fuse). Currently I have a test suite built for each project in the Eclipse IDE: each microservice and integration app has its own codebase in the src/main folder, and I have Cucumber tests for each of them in the corresponding src/test folder. But I see these as system tests rather than integration tests. When an app is deployed, only its own tests are triggered in the Jenkins pipeline, and it can't run tests for a dependent app or service from its domain due to access restrictions.
So what I am thinking is to let the QA team create and maintain a separate test application in a standalone repository, rather than maintaining integration tests at each application level, integrate it into the Jenkins pipeline, and trigger the features needed for each app using proper Cucumber tags after deployment of the main app. Any suggestions are welcome.
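For the tag-based triggering, assuming the standalone suite is a Maven project using Cucumber-JVM (both assumptions), the post-deployment Jenkins step for a given app could be as simple as:

```shell
# Hypothetical post-deploy step in the shared QA suite's Jenkins job.
# The tag name (@orders-service) is an assumption; cucumber.filter.tags
# is the Cucumber-JVM property for selecting tagged features (older
# versions used -Dcucumber.options="--tags @orders-service" instead).
mvn test -Dcucumber.filter.tags="@orders-service"
```

Each feature file in the shared repository then carries tags for the apps it exercises, so one repository can serve every application's pipeline.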
I have an ASP.NET MVC solution with the following structure:
Solution.sln
--- Solution/Main.csproj
--- Solution.Services.UnitTest/Solution.Services.UnitTest.csproj
For simplicity in this question, let's say my Main project is both my website and services, so we have one solution which is the "application".
The UnitTest project is a simple project that references the Main project, but with a lot of unit tests (using NUnit).
I have set up automatic deployment in Azure. Every time I commit on develop it updates one web app, and when I update master, it updates another web app. I've done this by setting up the integration to my GitHub repo inside the Azure portal:
Webapp --> Deployment --> Deployment Options
My question is: how do I run my unit tests first?
I don't see any options to add this. I don't see any option if I use the newer Continuous Delivery (Preview) either.
I can see there is an option to add performance tests, but that is not what I need.
So my question is two-fold: how do I add these unit tests inside the Azure web portal / by updating my build file? And if this is not possible inside Azure, what is the "norm" for solving this (very common, I assume) problem?
You can add a custom KUDU script to the root directory of your solution. At that point you have "complete" control over the build and deploy pipeline in Azure. Every web app has a default script in Azure. If you pull your current KUDU script (assuming you are using dotnet core), you should just need to add a dotnet test command before the dotnet publish command and fail accordingly.
https://github.com/projectkudu/kudu/wiki/Custom-Deployment-Script
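As a sketch, the relevant part of a customized script could look like this (project paths are taken from the question; the exact generated script varies, so treat this as an illustration rather than the literal KUDU output):

```shell
# Excerpt from a customized KUDU deploy script. Run the tests first and
# abort the deployment if any fail, then publish as before.
dotnet test Solution.Services.UnitTest/Solution.Services.UnitTest.csproj \
  || { echo "Unit tests failed - aborting deployment"; exit 1; }

dotnet publish Solution/Main.csproj -o "$DEPLOYMENT_TEMP"
```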
If you are looking for a friendlier alternative, you can use any number of CI/CD tools outside of Azure. VSTS offers several free build minutes every month.
Here is my situation. I have an ASP.NET Web Forms application with code hosted in Visual Studio Online using TFVC. I recently started using the CI tools in VS Online to automate everything from building and unit testing to integration testing and production deployment.
I started in baby steps. The build is working and it runs the unit tests. But when it comes to unit testing using IIS, it throws the error below in the CI console.
[error] The test or test run is configured to run in ASP.NET in IIS,
but the current user (TASKAGENT-0005\buildguest) is not in the
Administrators group. Running tests in ASP.NET in IIS requires the
user to be in the Administrators group.
I can see that it's clearly complaining about admin privileges. Googling gives no suggestions. Any idea how to grant the permission? Also, does VS Online support testing by temporarily hosting ASP.NET on the build machine? Below is the unit test method I use to test by hosting inside IIS.
[TestMethod]
[HostType("ASP.NET")]
[UrlToTest(Common.BaseUrl + "Blogs.aspx")]
public void WhenChangeLogPageIsRequested_TitleShouldBeProper()
{
    // Code goes here
}
A typical continuous integration and continuous delivery workflow will contain:
1. Create a new Agent Pool, install the Build Agent, and configure permissions
2. Create a new build definition and configure it to execute unit tests (Continuous Integration)
3. Package the built website as a Web Deploy package
4. Create a Machine Group and add a new test web server
5. Use PowerShell DSC to configure a basic web server (IIS, ASP.NET 4.5, website & WebDeploy)
6. Use WebDeploy to deploy the site package to the newly configured web server
7. Auto-deploy and configure the new Test Agent on the web server
8. Run Coded UI Tests and report results
Follow the video here: http://blogs.msdn.com/b/visualstudioalm/archive/2015/07/17/video-configuring-continuous-integration-and-continuous-testing-with-visual-studio-2015.aspx
Can anyone provide insights on using Jenkins for automating deployment in controlled and uncontrolled environments? We have different environments - dev/qa/uat/prod - and currently we are using batch files that call msbuild/nant scripts to deploy to web and DB servers (web farm). Developers only have access to dev/qa, and production support deploys to uat/prod. Prod support gets the source code from the SVN tag folder and runs the batch file to deploy the application.
By using Jenkins, is it possible to eliminate the step of the prod support team getting the script from SVN, by having them run the jobs under their own credentials via a URL? And what is the general practice for using source control and a CI tool to deploy applications?
My recommendation is to reserve Jenkins for just building the software. That way the users of Jenkins only have access to development and perhaps QA systems.
To decouple the build system from the process that deploys the software I recommend the use of a binary repository manager like:
Nexus
Artifactory
Archiva
In that way deployment scripts can retrieve any version of a previous build. The use of a repository manager also enables your QA team to certify a release prior to its deployment to production.
Finally, consider one of the emerging deployment automation tools. Tools like Chef, Puppet, and Rundeck can be used to further version-control the configuration of your infrastructure.
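The retrieval step could be sketched like this (the repository URL, layout, and coordinates below are all placeholders for a Nexus-style setup):

```shell
# Fetch a specific, already-certified build from the repository manager
# instead of rebuilding it at deploy time (URL and coordinates assumed).
VERSION=1.4.2
curl -fSL -o "app-${VERSION}.war" \
  "https://nexus.example.com/repository/releases/com/example/app/${VERSION}/app-${VERSION}.war"
```

The deployment script then operates on a known, immutable artifact, so what QA certified is byte-for-byte what production receives.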
We are trying to automate our tests but we have some problems.
Our solution has 7 projects: one of them is a WCF server, one is a Windows Forms project, and the others are helper projects.
We created a test plan and test cases. We ran our test cases with action recording and converted the manual tests to Coded UI tests. After that, we associated our Coded UI tests with the test cases.
We defined a new build. This new build deploys the WCF server to IIS, transforms app.config, and copies the client application files to a folder.
We set up the test controller and test agent on the same build machine.
I wonder what we need to do to automate our build and tests.
How does our build trigger the automated tests?
Because of our VMware infrastructure we can't use Lab Management.
I solved this problem myself.
I used msdeploy to deploy the WCF service
I created a database project and used the VSDBCMD tool
I installed the test controller and test agent on my build server
I created test settings for automated tests and configured them to use this agent
I associated my Coded UI test with the test case
Then it is ok :)
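For reference, the msdeploy step from the list above could look roughly like this (the package name and the target computerName are placeholders, not the actual values from this setup):

```shell
# Sync the packaged WCF service onto the test web server via Web Deploy
# (package path and computerName are assumptions).
msdeploy -verb:sync -source:package="WcfService.zip" -dest:auto,computerName="http://testserver/MSDeployAgentService"
```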
Does Microsoft offer a tool where you can deploy a web application to multiple web servers in a load-balanced environment/web farm?
My team is looking for a tool, preferably from Microsoft, where we can deploy our web application from development environment to production environment automatically.
If I understand what you're asking for, you're looking for a build server. To my knowledge Microsoft doesn't offer one, but some to take a look at are TeamCity, Hudson (requires a plug-in), and CruiseControl.NET.
Basically they work by pulling from your source control, building your application, and running your tests. They all support scripting that will allow you to build and then deploy to your servers. This can be set up to run nightly, weekly, etc. You can also set it up to monitor your source control for changes and build any time it sees a change.
The only one I've used is TeamCity; the install was easy, and depending on how many build agents you need, it's free.
If you're just looking to build and deploy from VS, another option is creating a NAnt script and running it from VS as an external tool.
For a good overview of build servers check out this SO question: cruisecontrol.net vs teamcity for continuous integration
The Web Deployment Team blog at Microsoft has some reasonably useful information, and have a deployment tool you could try...
In the last environment we set up, we used TeamCity for all our builds and deployments (which is basically to say we wrote MSBuild scripts and automated them with TeamCity). In short, we had the following 5 build configurations:
Continuous Build - Automatically rebuilt our product upon every check-in, running all the tests. This build did not deploy anywhere
Nightly Build (Dev) - Automatically built and deployed our product to the development web server (no server farm). The build would run the tests, update the development database, shut down the dev IIS web site, copy the necessary files to our web server, and restart the site
Test Build - Like our nightly build, only it deployed to our test environment, and it wasn't scheduled, so it had to be started manually by logging into TeamCity and pressing a button
Stage Build - Like Test, only it deployed to a web server that was externally visible to our customers so that they could validate the application. Also only run on demand.
Production - Created a zip file of our product that the deployment team could install on our production web servers
So I guess what I'm suggesting is that you use TeamCity and then write your build scripts in such a way that they'll deploy to your web farm. If you want examples, I can supply the pertinent portions of our build scripts.
One more thing: we check the web.config files for each environment into Subversion, and then part of the build process is to copy and rename the appropriate config file for the environment. For example, web.prod.config => web.config in our production build.
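That config-swap step can be sketched in a few lines of shell (file names and the placeholder content are assumptions; in our setup it was an MSBuild task, but the idea is the same):

```shell
#!/bin/sh
# Sketch of the per-environment config swap (file names are assumptions).
# Each environment's config is checked into source control; the build
# copies the matching one over Web.config before packaging.
ENVIRONMENT=${1:-prod}

# Stand-in file so this sketch is self-contained; in the real build the
# per-environment configs already exist in the working copy.
echo "<configuration><!-- ${ENVIRONMENT} settings --></configuration>" > "Web.${ENVIRONMENT}.config"

cp "Web.${ENVIRONMENT}.config" Web.config
cat Web.config
```

Run with no arguments it selects prod; pass `dev` or `test` to swap in a different environment's config.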
I believe that SharePoint does this.
File Replication Service (e.g. DFS Replication) is a typical and very good choice for doing this.
Your changes are synced between member servers at the file-system level.
SharePoint does this automatically when you deploy a solution package.