I feel like I've tried everything. We currently have a setup where a check-in to TFS forces a build on CruiseControl.NET. In our solution we use the Chutzpah JS Test Adapter. We were able to successfully use Chutzpah.console.exe to fail the build if any of the JS tests fail; now we would like to fail the build on the coverage as well. I cannot find any way to have Chutzpah.console.exe output coverage to the XML file it generates.
I thought I could solve the problem by writing my own .xsl that would parse _Chutzpah.coverage.html. I was going to convert that to XML using the JUnit format that CruiseControl already knows how to interpret. Since I just care about failing the build, I was going to make the output of my transform look like more unit tests that failed; in the .xsl I would set the failures attribute > 0:
<?xml version="1.0" encoding="UTF-8" ?>
<testsuites>
<testsuite name="c:\build\jstest\jstest\tests\TestSpec2.js" tests="1" failures="1">
<testcase name="Coverage" time="46" />
</testsuite>
</testsuites>
But I really can't, since the incoming HTML has self-closing tags that the XSL transform can't handle.
So now I want to just run Chutzpah.Console.exe and pipe the output to a file (the console output does display the total average coverage), read that value back, and fail the build if it drops below a threshold.
Is there a better option? Am I missing something? I don't know that much about CruiseControl.NET.
I think writing the output to a file and parsing that is the only option left.
A pity that the coverage info is not in the xml file :-(
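For what it's worth, a minimal PowerShell sketch along these lines could run as an extra exec task in CCNet and fail the build by exiting non-zero. The Chutzpah path, the /path and /coverage switches, and the regex for the summary line are assumptions; adjust them to what your Chutzpah version actually prints:

param([int]$threshold = 70)
# Run Chutzpah with coverage and keep a copy of the console output for the build log
$output = & .\Chutzpah.Console.exe /path tests /coverage | Tee-Object -FilePath chutzpah.log
if ($LASTEXITCODE -ne 0) { exit $LASTEXITCODE }   # still fail the build on failed JS tests
# Pull the first percentage out of the coverage summary line (pattern is a guess)
$match = $output | Select-String -Pattern 'coverage[^\d]*(\d+(\.\d+)?)\s*%' | Select-Object -First 1
if (-not $match) { Write-Error "No coverage figure found in the Chutzpah output"; exit 1 }
$coverage = [double]$match.Matches[0].Groups[1].Value
if ($coverage -lt $threshold) {
    Write-Error "Average coverage $coverage% is below the $threshold% threshold"
    exit 1
}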
This is in fact more a problem with Chutzpah than with CCNet.
Think of CCNet as an upgraded task scheduler: it has a lot of options, but it relies on the input it receives from the called program. If that program cannot provide the data, you're stuck with these kinds of workarounds :-(
There is another option, but it may be more of a trip than you're interested in embarking upon. SonarQube is a server-based tool for managing and analyzing code quality. It has a plugin called "Build Breaker", which allows you to fail the build if any number of code quality metrics are not met, including unit test coverage. Like I said, it's kind of involved because you have to set up a server and learn this new tool, but it's good to have anyway.
I use Chutzpah with the /lcov command line option so that it outputs the coverage to a file, then tell Sonar to find the coverage in that file in the Sonar configuration for that project. Then you add a step to your CruiseControl.NET build process to run the Sonar analysis, and if you have configured the Build Breaker plugin right, it will fail the build in CruiseControl if the coverage is not at the level you specified.
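For what it's worth, the extra step is just an ordinary exec task in the project's ccnet.config, something roughly like this (the runner path and directories are placeholders, and the lcov file location is configured in sonar-project.properties, not here):

<tasks>
  <!-- ... build and Chutzpah tasks ... -->
  <exec>
    <!-- hypothetical install location of the SonarQube runner -->
    <executable>C:\sonar-runner\bin\sonar-runner.bat</executable>
    <baseDirectory>C:\build\jstest</baseDirectory>
    <buildTimeoutSeconds>600</buildTimeoutSeconds>
  </exec>
</tasks>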
I've been searching for how to set up phpunit or phpunit.xml to log code coverage reports, but I keep finding docs like "How to include files", "How to ignore code blocks", and such, but nothing on setting it up. Can I get some instruction on setting up code coverage with phpunit?
https://phpunit.de/manual/current/en/code-coverage-analysis.html points you to https://phpunit.de/manual/current/en/textui.html for a list of commandline switches that control code coverage functionality and to https://phpunit.de/manual/current/en/appendixes.configuration.html#appendixes.configuration.logging for the relevant configuration settings.
https://thephp.cc/dates/2015/05/php-tek/code-coverage-covered-in-depth is a presentation on code coverage that has plenty of examples.
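To save a click-through: with the PHPUnit versions those manual pages describe you need Xdebug installed as the coverage driver, and the relevant parts of phpunit.xml then look roughly like this (the source directory and report targets are placeholders):

<phpunit bootstrap="tests/bootstrap.php">
  <testsuites>
    <testsuite name="unit">
      <directory>tests</directory>
    </testsuite>
  </testsuites>
  <!-- the whitelist tells the coverage driver which source files to report on -->
  <filter>
    <whitelist>
      <directory suffix=".php">src</directory>
    </whitelist>
  </filter>
  <!-- the logging section writes the actual coverage reports -->
  <logging>
    <log type="coverage-html" target="build/coverage"/>
    <log type="coverage-clover" target="build/logs/clover.xml"/>
  </logging>
</phpunit>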
I am running test using a phpunit.xml.dist file. This file defines several test suites and specifies a bootstrap.php. In this bootstrap.php I am currently loading all dependencies for all tests.
A small subset of the tests is dependent on some third party library, which is optional. These tests are all part of a particular test suite. So I only want to load this library in the bootstrapping file when that particular test suite is specified.
How can I determine if this test suite was specified? This ensures that most tests can be run when the library is not loaded, and that one can easily verify that the code and tests which should not depend on the library indeed do not need it.
I currently have the following. Is there something better?
if ( !in_array( '--testsuite=WikibaseDatabaseStandalone', $GLOBALS['argv'] ) ) {
require_once( __DIR__ . '/evilMediaWikiBootstrap.php' );
}
The feature request on the PHPUnit bugtracker for a test suite specific bootstrap is here: https://github.com/sebastianbergmann/phpunit/issues/733
For now there are two options: one is yours, which is fine but feels really hackish and doesn't work out well when you run "all the tests" and have a specific bootstrap for every suite.
My suggestion would be to write a test listener and hook into "startTestSuite" and "endTestSuite". This is a nicely maintained and BC-compatible way to execute code only when the test suite is actually started, and you can also clean up afterwards.
See http://phpunit.de/manual/3.7/en/extending-phpunit.html#extending-phpunit.PHPUnit_Framework_TestListener and http://phpunit.de/manual/3.7/en/appendixes.configuration.html#appendixes.configuration.test-listeners for how to include the test listener.
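A rough sketch of such a listener (the class and file names are made up, and it assumes a PHPUnit version that ships PHPUnit_Framework_BaseTestListener; on older versions implement the full TestListener interface instead):

// tests/WikibaseDatabaseStandaloneListener.php
class WikibaseDatabaseStandaloneListener extends PHPUnit_Framework_BaseTestListener {

    public function startTestSuite( PHPUnit_Framework_TestSuite $suite ) {
        // Only pull in the optional dependency when this suite actually runs
        if ( $suite->getName() === 'WikibaseDatabaseStandalone' ) {
            require_once __DIR__ . '/evilMediaWikiBootstrap.php';
        }
    }

    public function endTestSuite( PHPUnit_Framework_TestSuite $suite ) {
        // Undo any global state the bootstrap set up, if needed
    }
}

and register it in phpunit.xml.dist:

<listeners>
  <listener class="WikibaseDatabaseStandaloneListener" file="tests/WikibaseDatabaseStandaloneListener.php"/>
</listeners>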
One of the usual ways to handle this is to check whether a required dependency is installed, and if not, run
$this->markTestSkipped('lib not installed');
That skipping can also happen in the setUp() phase of a test.
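For example (the class checked for here is hypothetical):

protected function setUp() {
    if ( !class_exists( 'SomeVendor\OptionalLib\Client' ) ) {
        $this->markTestSkipped( 'The optional third party library is not installed' );
    }
}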
Finally, you can add @group annotations to the test class and/or test functions to give some control over whether or not the tests are run from the command line (with the --group [names...] parameter).
Lastly, an option that has also been used in Zend Framework is to only add, in code, the TestSuite that runs a subset within a larger test suite. There is an example of being able to
a) turn the tests off at will,
b) turn them off if the extension is not loaded, or
c) run the tests, for the use of (for example) caching with APC.
A sketch of that pattern is shown below.
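The suite and test class names in this sketch are invented; it shows a suite() method that only registers the APC-dependent tests when the extension is loaded:

public static function suite() {
    $suite = new PHPUnit_Framework_TestSuite( 'Cache' );

    // Only add the APC-backed tests when the extension is actually available
    if ( extension_loaded( 'apc' ) ) {
        $suite->addTestSuite( 'My_Cache_ApcTest' );
    }

    return $suite;
}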
I have created a new build definition for TFS 2010. After building my C# solution I would like it to execute a couple of unit tests. These unit tests require an XML input file, so I have added a [DeploymentItem] attribute to the test methods which provides the relative path to the XML files. If I run the unit tests from within Visual Studio they pass ok.
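For reference, the attribute usage looks roughly like this; the folder and file names are illustrative, and the real tests use the BizTalk test tools rather than a plain file check:

using System.IO;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class MapTests
{
    public TestContext TestContext { get; set; }

    [TestMethod]
    [DeploymentItem(@"TestData\Input.xml", "TestData")]
    public void InputFileIsDeployed()
    {
        // The relative path is resolved against the test deployment directory at run time
        var path = Path.Combine(TestContext.DeploymentDirectory, @"TestData\Input.xml");
        Assert.IsTrue(File.Exists(path), "Input file was not deployed");
    }
}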
When the unit tests get run following a build (via my build definition), they fail with: "Microsoft.BizTalk.TestTools.BizTalkTestAssertFailException: Input file does not exist..."
It would be great if I could get to a trace of what the build agent was trying to do, to help with troubleshooting.
Does anyone know how to get such a trace output? I guess I could increase the verbosity of the trace output from the main solution under test, but I don't think that would give me any indication of where the build agent was looking for the test input XML or why it failed.
Thanks
Rob
I found it! Needed to click the "View Log" link from the screen that's displayed following the build. I had been looking at the default view of "View Summary"
I'm looking for a .NET coverage tool, and had been trying out PartCover, with mixed success.
I see that OpenCover is intended to replace PartCover, but I've so far been unable to link it with TypeMock Isolator so my mocked-out tests pass while gathering coverage info.
I tried replicating my setup for Partcover, but there's no defined profilename that works with the "link" argument for Isolator. Thinking that OpenCover was based on Partcover, I tried to tell Isolator to link with Partcover, and it didn't complain (I still had Partcover installed), but the linking didn't work - Isolator thought it wasn't present.
Am I missing a step? Is there a workaround? Or must I wait for an Isolator version that is friends with OpenCover?
Note: I work at Typemock
I poked around with the configuration a little bit and managed to get OpenCover to run nicely with Isolator. Here's what you can do to make them work together, until we add official support:
Register the OpenCover profiler by running regsvr32 OpenCover.Profiler.dll (you will need Administrator access for this).
Locate the file typemockconfig.xml, it should be under your installation directory, typically C:\Program Files (x86)\Typemock\Isolator\6.0.
Edit the file, and add the following entry towards the end of the file, above </ProfilerList>:
<Profiler Name="OpenCover" Clsid="{1542C21D-80C3-45E6-A56C-A9C1E4BEB7B8}" DirectLaunch="false">
<EnvironmentList />
</Profiler>
Save the file; you will now have a new entry in the Typemock Configuration utility, called OpenCover. Press the Link button to link them. You will now be able to run your tests using OpenCover.Console.exe and Isolator. For example, here's how to run your tests with MSTest:
OpenCover.Console.exe
-target:"C:\Program Files (x86)\Microsoft Visual Studio 9.0\Common7\IDE\MSTest.exe"
-targetargs:"/testcontainer:d:\code\myproject\mytests.dll"
-output:opencovertests.xml
There is still a minor issue running this with TMockRunner -link (that is, with late linking). I will need to look at it further at work.
Hope that helps.
Simple task, but for some reason no simple solution just yet.
We've all got web.config files - and I haven't worked anywhere yet that doesn't have the problem where someone yells across the room "Sh*t, I've just uploaded the wrong web.config file".
Is there a simple way of being able to auto generate a web.config file that will contain the right things for copying to release? An example of these being:
Swap connection string over to use live database
Change
Switch over to use the live/release logging system, and live/release security settings
(in our case we need to change the SessionState mode to InProc from StateServer - this isn't normal)
If you have others, let me know and I'll update it here so it's easy for someone else to find
Maintaining 2 config files works, but is a royal pain, and is usually the reason something's gone wrong while you're pushing things live.
Visual Studio 2010 supports something like this with web.config transformations (Web.Debug.config / Web.Release.config). Check it out here.
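Those transform files sit next to web.config and are applied on publish. A small sketch of a Web.Release.config (the connection string name, server and the SessionState tweak are placeholders matching the examples above):

<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <!-- replace the connection string of the same name with the live one -->
    <add name="myConnectionString"
         connectionString="Server=LIVESQL;Database=Live;Integrated Security=True"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)"/>
  </connectionStrings>
  <system.web>
    <!-- strip the debug attribute for release -->
    <compilation xdt:Transform="RemoveAttributes(debug)"/>
    <!-- the SessionState switch mentioned in the question -->
    <sessionState mode="InProc" xdt:Transform="SetAttributes(mode)"/>
  </system.web>
</configuration>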
How are you deploying your builds? In my environment this used to be a pain point too, but now we use CruiseControl.NET and script our builds in NAnt. In our script we detect the environment and have different versions of the config settings for each environment. See http://www.mattwrock.com/post/2009/10/22/The-Perfect-Build-Part-3-Continuous-Integration-with-CruiseControlnet-and-NANT-for-Visual-Studio-Projects.aspx for my blog post on the subject of using CruiseControl.NET for build management. Skip to the end for a brief description of how we handle config versions.
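The gist of it, if you go that route, is just a per-environment config file selected in the NAnt script, something like this (the property and path names are whatever your tree uses):

<!-- pick up the per-environment file and drop it in as the real web.config -->
<property name="environment" value="staging" overwrite="false"/>
<copy file="config\web.${environment}.config"
      tofile="${deploy.dir}\web.config"
      overwrite="true"/>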
In my most recent project I wrote a PowerShell script which loaded the web.config file, modified the necessary XML elements, and saved the file back out again. A bit like this:
param($mode, $src)
$ErrorActionPreference = "stop"
$config = [xml](Get-Content $src)
if ($mode -eq "Production")
{
$config.SelectSingleNode("/configuration/system.web/compilation").SetAttribute("debug", "false")
$config.SelectSingleNode("/configuration/system.web/customErrors").SetAttribute("mode", "off")
$config.SelectSingleNode("/configuration/system.net/mailSettings/smtp/network").SetAttribute("host", "live.mail.server")
$config.SelectSingleNode("/configuration/connectionStrings/add[#name='myConnectionString']").SetAttribute("connectionString", "Server=SQL; Database=Live")
}
elseif ($mode -eq "Testing")
{
# etc.
}
$config.Save($src)
This script overwrites the input file with the modifications, but it should be easy to modify it to save to a different file if needed. I have a build script that uses web deployment projects to build the web app, outputting the binaries minus the source code to a different folder - then the build script runs this script to rewrite web.config. The result is a folder containing all the files ready to be placed on the production server.
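The build script then just calls it with the target mode and the path to the copied web.config (the script file name here is made up):

powershell.exe -File Set-WebConfig.ps1 -mode Production -src .\Output\web.config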
XSLT can be used to produce parameterized XML files. Since web.config is an XML file, this approach works.
You can have one .xslt file (containing XPath expressions).
Then there can be different xml files like
1. debug.config.xml
2. staging.config.xml
3. release.config.xml
Then, in the post-build event or using some MSBuild tasks, the XSLT can be combined with the appropriate XML file to produce the different web.config variants.
A sample debug.config.xml file can be:
<Application.config>
<DatabaseServer></DatabaseServer>
<ServiceIP></ServiceIP>
</Application.config>
The .xslt can have XPath expressions referring to the XML given above.
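A stripped-down sketch of such a stylesheet (only a connection string and one app setting are shown; the element names follow the sample XML above):

<?xml version="1.0"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="xml" indent="yes"/>
  <!-- build a web.config fragment from the values in debug/staging/release.config.xml -->
  <xsl:template match="/Application.config">
    <configuration>
      <connectionStrings>
        <add name="Main"
             connectionString="Server={DatabaseServer};Integrated Security=True"/>
      </connectionStrings>
      <appSettings>
        <add key="ServiceIP" value="{ServiceIP}"/>
      </appSettings>
    </configuration>
  </xsl:template>
</xsl:stylesheet>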
Have a look at XSLT transformations. This code can be used in MSBuild or NAnt tasks, and different web.config files can be produced depending on the input config XML files.
This way you just have to manage the XML files.
There is one overhead: the XSLT file, which mirrors the web.config, also needs to be maintained, i.e. whenever a tag is added to the web.config, the XSLT needs to be changed as well.
I don't think you can 100% avoid this.
The last years of work have shown over and over: where humans work, there are mistakes.
So, here are 3 ideas from my last company, not the best maybe, but better than nothing:
Write a batch file or a C#/.NET application that changes your web.config on a double-click
Write a "ToDo on Release"-List
Do pair-releasing (== pair programming while releasing :))