We are trying to automate our tests but we have some problems.
Our solution has 7 projects: one of them is a WCF server, one is a Windows Forms project, and the others are helper projects.
We created a test plan and test cases. We ran our test cases with action recording and converted the manual tests to Coded UI tests. After that, we associated each Coded UI test with its test case.
We defined a new build. This build deploys the WCF server to IIS, transforms app.config, and copies the client application files to a folder.
We set up the test controller and test agent on the same build machine.
I wonder what we need to do to automate our build and tests?
How can our build trigger the automated tests?
Because of our VMware infrastructure we can't use Lab Management.
I solved this problem myself.
I used MSDeploy to deploy the WCF service.
I created a database project and used the VSDBCMD tool.
I installed a test controller and test agent on my build server.
I created test settings for the automated tests and configured them to use this agent.
I associated my Coded UI test with its test case.
After that everything worked :)
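As a rough sketch of the MSDeploy step above, a sync from a build package to IIS might look like this (the server name, package path, and site name are all hypothetical placeholders, not values from the original setup):

```
rem Hypothetical example: push a packaged WCF service to IIS with MSDeploy
msdeploy.exe -verb:sync ^
  -source:package="C:\Drops\WcfService.zip" ^
  -dest:auto,computerName="https://buildserver:8172/msdeploy.axd" ^
  -setParam:name="IIS Web Application Name",value="Default Web Site/WcfService"
```

A command like this can be added as a post-build step in the build definition so every successful build redeploys the service before the test agent runs the Coded UI tests.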
Basically I want some advice on designing an integration test suite for QA using Cucumber BDD in a microservice/integration application environment.
Should QA maintain a separate code base and GitHub repository for their test suite, or is it better to have the code reside inside each application's code base?
I am building an integration test suite using Cucumber BDD to test an enterprise application platform. Each application in this platform has many microservices (built in Spring Boot) and integration apps (Apache Camel and Fuse). Currently the test suite is built per project in the Eclipse IDE: each microservice and integration app has its own code base in the src/main folder, and I have Cucumber tests for each of them in the corresponding src/test folder. But I see these as system tests rather than integration tests. When an app is deployed, only its own tests are triggered in the Jenkins pipeline, and it can't run tests for a dependent app or service from its domain due to access restrictions.
So what I am thinking is to let the QA team create and maintain a separate test application in a standalone repository, rather than maintaining tests at each application level, integrate it into the Jenkins pipeline, and trigger the features needed for each app using proper Cucumber tags after the main app is deployed. Any suggestions are welcome.
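As a sketch of the tag-filtering idea, a Jenkins stage for the standalone QA repository could run only the scenarios for the app that was just deployed (the tag names here are hypothetical; `cucumber.filter.tags` is the Cucumber-JVM property for tag expressions):

```
# Hypothetical: run only the @orders scenarios from the standalone QA suite,
# skipping anything still tagged as work-in-progress
mvn test -Dcucumber.filter.tags="@orders and not @wip"
```

Each application's pipeline would pass its own tag, so one shared test repository can serve every app without rebuilding the app's own code base.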
I have a number of tests in my project which are run as part of the build. Some of those tests are integration tests which need a username/password set of credentials in order to run the tests.
I want to keep these credentials out of the source code so on my local machine I have set them up as user secrets and on the server they are environment variables. The deployments are working just fine with this arrangement.
My problem is running the tests as part of the build. The tests are not being fed with any login credentials and therefore are failing with authentication issues. How do I supply these values without adding them to the appsettings.json files?
I am running a dotnet core project and have a standard Azure DevOps build template.
Thanks!
Non-secret variables declared in the build are automatically turned into environment variables on the build agent.
Secret variables are intentionally not turned into environment variables, but you can add a Command Line or Script task that's appropriate for your platform (Bash, PowerShell, etc.) and set an environment variable by passing your secret in as a parameter to the script.
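For example, in a YAML pipeline the secret can be mapped explicitly into the test step's environment; the variable names below are hypothetical placeholders for whatever your integration tests read:

```yaml
steps:
- script: dotnet test --configuration Release
  env:
    TEST_USERNAME: $(TestUsername)
    # Secrets must be mapped explicitly; they are never exposed to scripts by default
    TEST_PASSWORD: $(TestPassword)
```

The test code then reads the credentials from environment variables, exactly as it would from user secrets locally, so nothing ends up in appsettings.json.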
I have an ASP.NET MVC solution with the following structure:
Solution.sln
--- Solution/Main.csproj
--- Solution.Services.UnitTest/Solution.Services.UnitTest.csproj
For simplicity in this question, let's say my Main project is both my website and services, so we have one solution which is the "application".
The unit test project is a simple project that references the Main project, but with a lot of unit tests (using NUnit).
I have setup automatic deployment in Azure. Every time I make a commit on develop it updates one web-app, and when I update master, it updates another web-app. I've done this by setting up the integration to my Github repo here inside the Azure portal:
Webapp --> Deployment --> Deployment Options
My question is: how do I run my unit test first?
I don't see any options to add this. I don't see any option if I use the newer Continuous Delivery (Preview) either.
I can see there is an option to add performance tests, but that is not what I need.
So my question is two-fold: How do I add these unit tests inside the Azure web portal / updating my build file? And if this is not possible inside Azure, what is the "norm" on how to solve this (very common, I assume) problem?
You can add a custom Kudu script to the root directory of your solution. At that point you have "complete" control over the build and deploy pipeline in Azure. Every web app has a default script in Azure. If you pull your current Kudu script (assuming you are using .NET Core), you should just need to add a dotnet test command before the dotnet publish command and fail the deployment accordingly.
https://github.com/projectkudu/kudu/wiki/Custom-Deployment-Script
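As a sketch, the relevant fragment of a customized deploy.cmd might look like this (the project paths are taken from the solution layout in the question, but treat them as illustrative; `:ExecuteCmd` and the `%DEPLOYMENT_*%` variables come from Kudu's generated script):

```
:: Fragment of a custom Kudu deploy.cmd: run tests before publishing.
:: A non-zero exit code from dotnet test aborts the deployment.
call :ExecuteCmd dotnet test "%DEPLOYMENT_SOURCE%\Solution.Services.UnitTest\Solution.Services.UnitTest.csproj"
IF !ERRORLEVEL! NEQ 0 goto error

call :ExecuteCmd dotnet publish "%DEPLOYMENT_SOURCE%\Solution\Main.csproj" --output "%DEPLOYMENT_TEMP%" --configuration Release
IF !ERRORLEVEL! NEQ 0 goto error
```

Because Kudu's error handler stops the deployment on failure, a red test run means the web app keeps serving the previous version.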
If you're looking for a friendlier alternative, you can use any number of CI/CD tools outside of Azure. VSTS offers several free build minutes every month.
I am new to testing and, to be honest, I would like to start over because I have been reading lots of articles in the MSDN library and now I am confused... Please help!
What I have done:
I created an automated coded UI test (CUIT) in VS 2010
The CUIT takes data from an Excel file
Created a link between the Excel file and the CUIT following this blog
My MS Tools
Microsoft Test Manager 2010
Visual Studio 2010
Team Foundation Server 2010 & Team Foundation Build
SQL Server 2008 R2 (if relevant)
Objective: use my PC to run an automated test in a physical environment without extra tools
Thank you very much!
Running Coded UI tests through MTM follows the same procedure as any other automated test. You have to do the following:
Associate your Coded UI test with the appropriate test case.
On the physical machine you use for automated tests, install a Test Controller and a Test Agent. Configure the Test Agent to interact with the desktop.
Create a test plan in MTM and add your test cases to the appropriate test suites.
Select the test environment you want to run your tests in, i.e. the physical machine on which you previously installed and configured the Test Controller and Agent.
Run your tests, and happy testing!
For more details follow these guides:
How to: Run Automated Tests from a Test Plan Using Microsoft Test Manager
Running Automated Tests
I highly recommend the book Software Testing with Visual Studio 2010.
It will provide you with an excellent walkthrough.
You need to install a Test Controller service and Test Agent on the environments you want to run the tests on.
This will allow you to execute the tests remotely without having visual studio installed.
see these links for more info
http://msdn.microsoft.com/en-us/library/vstudio/dd648127(v=vs.100).aspx
http://msdn.microsoft.com/en-us/library/dd728093(v=vs.100).aspx
Alternatively, you can call mstest.exe directly and pass in a parameter for the test project, but I'm not sure what installs it. Probably Visual Studio and the agents.
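As a sketch of that alternative, an mstest.exe invocation for a Coded UI test assembly might look like this (the assembly, settings, and results file names are hypothetical):

```
rem Hypothetical: run a test container with a named test settings file
rem and write the results to a .trx file
mstest.exe /testcontainer:CodedUITests.dll /testsettings:Local.testsettings /resultsfile:Results.trx
```

The test settings file is where you point the run at your configured controller/agent instead of executing locally.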
Once set up, the agents can be managed via the Lab Center in MTM.
Does Microsoft offer a tool where you can deploy a web application to multiple web servers in a load-balanced environment/web farm?
My team is looking for a tool, preferably from Microsoft, where we can deploy our web application from development environment to production environment automatically.
If I understand what you're asking, you're looking for a build server. To my knowledge Microsoft doesn't offer one, but some to take a look at are TeamCity, Hudson (requires a plug-in), and CruiseControl.NET.
Basically they work by pulling from your source control, building your application, and running your tests. They all support scripting that allows you to build and then deploy to your servers. This can be set up to run nightly, weekly, etc. You can also set it up to monitor your source control for changes and build any time it sees a change.
The only one I've used is TeamCity; the install was easy, and depending on how many build agents you need, it's free.
If you're just looking to build and deploy from VS, another option is creating an NAnt script and running it from VS as an external tool.
For a good overview of build servers, check out this SO question: cruisecontrol.net vs teamcity for continuous integration
The Web Deployment Team blog at Microsoft has some reasonably useful information, and have a deployment tool you could try...
In the last environment we set up, we used TeamCity for all our builds and deployments (which is basically to say we wrote MSBuild scripts and automated them with TeamCity). In short, we had the following five build configurations:
Continuous Build - automatically rebuilt our product upon every check-in, running all the tests. This build did not deploy anywhere.
Nightly Build (Dev) - automatically built and deployed our product to the development web server (no server farm). The build would run the tests, update the development database, shut down the dev IIS web site, copy the necessary files to our web server, and restart the site.
Test Build - like our nightly build, only it deployed to our test environment; it wasn't scheduled, so it had to be started manually by logging into TeamCity and pressing a button.
Stage Build - like Test, only it deployed to a web server that was externally visible to our customers so that they could validate the application. Also only run on demand.
Production - created a zip file of our product that the deployment team could install on our production web servers.
So I guess what I'm suggesting is that you use TeamCity and then write build scripts in such a way that they'll deploy to your web farm. If you want examples, I can supply the pertinent portions of our build scripts.
One more thing: we check our web.config files and such for each environment into Subversion, and part of the build process copies and renames the appropriate config file for the environment. For example, web.prod.config => web.config in our production build.
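The per-environment config rename can be sketched like this (their build used MSBuild; this is a plain shell sketch of the same idea, and the directory layout and file contents are made up for illustration):

```shell
# Select the checked-in config for the target environment and rename it
# to web.config in the build output.
ENVIRONMENT=prod
mkdir -p build_output
# Stand-in for the real checked-in web.prod.config file:
printf '<configuration/>\n' > "build_output/web.$ENVIRONMENT.config"
cp "build_output/web.$ENVIRONMENT.config" build_output/web.config
ls build_output
```

The same script runs unchanged for every configuration; only the ENVIRONMENT value differs per TeamCity build configuration.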
I believe that SharePoint does this.
File Replication Service ( e.g. DFS Replication ) is a typical and very good choice for doing this.
Your changes are synced between member servers at the file system level.
SharePoint does this automatically when you deploy a solution package.