Automated Build – Getting better

A while ago I wrote a post describing how to set up a continuous build system using CruiseControl.Net. In that post I showed how to get your project ready to be continuously built.

It’s all about improvement

That was a big step towards continuous improvement in the software development lifecycle. The next big thing is to always be up to date on the quality of your software and to make sure that all those bugs get fixed real soon.

To accomplish this, I created a unit-test project for my application. This can be used to do test-driven development or just to test the code you just wrote. Either way, you basically want to run these tests whenever you change something and keep a close eye on the red-green status icons. Whenever I make modifications, they should not affect existing features.

This means we have to run the tests on a regular basis. This is especially important when we want to deliver some bits: we definitely want to make sure that all tests pass before we ship something.

This is where our continuous integration server comes back into the game. We already have it set up to make a new build of the project whenever something changes – so now all we need to do is make sure all our unit-tests are run as well.

Updating the project

First off, we need to extend our NAnt build-file. We add a new target which will run the unit-tests. I prefer to use Gallio to run my unit-tests, because it allows me to use a wide variety of unit-testing frameworks without changing the execution runtime.

<target name="run-tests" description="Run Unit tests">
    <loadtasks assembly="C:\Program Files\Gallio\bin\Gallio.NAntTasks.dll"/>
    <trycatch>
      <try>
        <gallio report-types="Xml-Inline;Html"
                report-directory="testresults"
                report-name-format="Pizza.Testing.dll-results"
                runner-type="NCover"
                failonerror="true">
          <runner-property value="NCoverCoverageFile='testresults\coverage.xml'" />
          <runner-property value="NCoverArguments='//a Pizza.Testing'"/>
          <files refid="fileset_test_assemblies">
            <include name="Pizza.Testing.dll" />
          </files>
        </gallio>
      </try>
      <catch property="failure">
        <echo message="At least one test failed: ${failure}"/>
      </catch>
    </trycatch>
    <loadtasks assembly="C:\Program Files\NCoverExplorer\NCoverExplorer.NAntTasks.dll"/>
    <echo message="Running NCoverExplorer report generation ..."/>
    <ncoverexplorer program="C:\Program Files\NCoverExplorer\NCoverExplorer.console.exe"
                    reportType="3"
                    xmlReportName="testresults\coveragereport.xml"
                    satisfactoryCoverage="80">
      <fileset>
        <include name="testresults\coverage.xml"/>
      </fileset>
    </ncoverexplorer>
</target>

This target will run all tests in the Pizza.Testing assembly of the project. Furthermore, all test-results are written to a file called Pizza.Testing.dll-results.xml.

Finally, I also run NCoverExplorer to get a nice summary of my code coverage. You might notice the runner-type="NCover", which causes Gallio to run all unit-tests within the NCover context. This way we also get some nice numbers on which code is actually covered by unit-tests.

Doing it on the server

After getting the build-file up and running, we can make some adjustments to the project-configuration within CCNet (ccnet.config). Obviously we need to make sure that the newly created target is called during the build-process. That's fairly simple.

<nant description="main build">
    <executable>d:\nant\bin\NAnt.exe</executable>
    <buildFile>d:\ccnet\repositories\Pizza\Source\default.build</buildFile>
    <targetList>
      <target>Compile</target>
      <target>run-tests</target>
    </targetList>
 </nant>

But we also need to make sure we collect all the data recorded during the tests (like the test results). Since these are recorded in separate XML files, we need to merge them into the main build-log that is maintained by CCNet.

<publishers>
    <merge>
        <files>
            <file>.\source\build\release\testresults\coverage*.xml</file>
            <file>.\source\build\release\testresults\*-results.xml</file>
        </files>
    </merge>
    <xmllogger />
    <statistics />
    <modificationHistory />
</publishers>

Showing what was done

So now we have the tests automated and set up to run with each build. The final step is to show the test-results in the project dashboard.

For the results to show up, we have to modify the dashboard configuration so that the results are rendered into HTML. So far they have only been merged into the build-log.

This is done by adding new XSL-transformations to the configuration (dashboard.config). These are part of the Gallio package.

<xslReportBuildPlugin description="Gallio Test Report"
  actionName="GallioTestReport"
  xslFileName="gallio\xsl\Gallio-Report.ccnet-details.xsl" />
<xslReportBuildPlugin description="Gallio Test Report (Condensed)"
  actionName="GallioTestReportCondensed"
  xslFileName="gallio\xsl\Gallio-Report.ccnet-details-condensed.xsl" />
<xslReportBuildPlugin description="NCoverExplorer Report"
  actionName="NCoverExplorerBuildReport"
  xslFileName="xsl\NCoverExplorer.xsl" />
<gallioAttachmentBuildPlugin />

After that we can have a look at how many tests succeeded and how many failed. The display looks just like the Gallio runner.

ccnet_gallio

ccnet_ncoverexplorer

Copy all content of a web application

When developing web-applications with Visual Studio everything is quite easy. Especially since Microsoft introduced the development web-server (aka Cassini), web-development is really smooth. Just create a new project and you're ready to roll.

But at some point you will want to deploy your web-app to a real server. So what's the best way to accomplish this? Using Visual Studio you can publish a web-application to a web-server via HTTP, FTP or a UNC path. But this is a manual step you would have to trigger.

Of course you could set up a copy-task using xcopy or something similar (like NAnt's copy task). But this would mean you would have to specify all files that should be copied. Or you would have to use wildcards, specifying all extensions that should be copied. But what happens when you add some unusual (or new) file to the project? It will most likely not be copied, and you'll have to remember to modify your copy-task.

For that purpose MSBuild has a special target called _CopyWebApplication, which is part of Microsoft.WebApplication.targets. This target copies all items of the web-application project that are marked as "Content". So if your build-script is based on MSBuild, you can leverage it to copy all files of your web-application to a certain destination.
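A minimal sketch of how this could be wired up in a small MSBuild wrapper script (the project path and staging directories are my own assumptions – adjust them to your layout):

```xml
<!-- deploy.proj: copies all "Content" items of the web project to a staging folder -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="Deploy">
  <Target Name="Deploy">
    <!-- _CopyWebApplication lives in Microsoft.WebApplication.targets;
         ResolveReferences first ensures referenced assemblies end up in bin -->
    <MSBuild Projects="Pizza.Web\Pizza.Web.csproj"
             Targets="ResolveReferences;_CopyWebApplication"
             Properties="WebProjectOutputDir=d:\staging\Pizza.Web;OutDir=d:\staging\Pizza.Web\bin\" />
  </Target>
</Project>
```

WebProjectOutputDir controls where the content files land, while OutDir receives the compiled assemblies.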

Since I prefer NAnt over MSBuild, I was looking for a solution to accomplish the same with NAnt. I found a post by Petter Wigle, who has a great approach: creating a NAnt file on the fly by analyzing the web-application's project-file.

Basically he's applying an XSLT script to create a fileset-definition of all files that need to be copied. This script can then be called during the build-process.
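The core of such a transformation might look like the following minimal sketch – the fileset id and output shape are my own assumptions, not Petter's actual script. It reads the project file and emits a NAnt fileset containing every item marked as Content:

```xml
<?xml version="1.0" encoding="utf-8"?>
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                xmlns:msb="http://schemas.microsoft.com/developer/msbuild/2003">
  <xsl:output method="xml" indent="yes"/>

  <!-- turn every <Content Include="..."/> of the .csproj into a NAnt <include/> -->
  <xsl:template match="/msb:Project">
    <fileset id="fileset_web_content">
      <xsl:for-each select="msb:ItemGroup/msb:Content">
        <include name="{@Include}"/>
      </xsl:for-each>
    </fileset>
  </xsl:template>
</xsl:stylesheet>
```

The generated fragment can then be included into the build and used by a plain NAnt copy task.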

Automated Builds – Getting started

The Intro

A couple of years ago I started scripting my builds using NAnt. I considered that to be a great opportunity to increase the quality of my builds and to get all those manual steps done which might otherwise be skipped – due to dullness.

So after getting the build all straightened up, the logical next step was to get someone else to actually do the building of the app. Since the build-script was up and running, this wasn't that hard to accomplish. That's when I found CruiseControl.Net.

This was way back in 2004 – so well before Microsoft released Team Foundation Server.

Since then CCNet has served me very well, but to be honest I haven't really kept up with the latest developments in CCNet.

So I will show a little bit of how I used CCNet to improve my build process and how to get going with continuous integration.

Prep’ing the project

First off, in order to take full advantage of CCNet you need a working build-script to automate the build of the project.

Well, starting with Visual Studio 2005 the project files are based on MSBuild – so you will already have a working build-script in place. Either you stick with this script or you decide to roll your own. That's what I chose, but sticking with MSBuild is fine as well. Just out of personal preference I use NAnt for my builds. But to be honest – to actually do the compiling and such, I use the msbuild task of the NAntContrib project, which in turn utilizes the MSBuild script of Visual Studio.

So my super-simple NAnt file looks like this:

<?xml version="1.0" encoding="utf-8" ?>
<project xmlns="http://nant.sf.net/schemas/nant.xsd" name="Pizza">
    <target name="Compile">
        <msbuild project="Pizza.sln"  >
            <property name="Configuration" value="Debug"/>
        </msbuild>
    </target>
</project>

Setting up a new server

So it’s way overdue to pay some respect to the recent 1.5.0 CTP release of CCNet.

OK, let's take the new CTP for a spin. I just downloaded the ZIP file, extracted it into a common location and then took a deep dive into the config. This is one of the biggest downsides of CCNet – the huge chunk of XML configuration. Well, anyway.

Since we want to start out simple, we'll stick with a really simple ccnet.config as well. You should have a server folder, which contains a ccnet.config file. This holds all your CCNet configuration.

<cruisecontrol xmlns:cb="urn:ccnet.config.builder">
    <project name="Pizza">
        <workingDirectory>d:\ccnet\repositories\Pizza</workingDirectory>
        <artifactDirectory>d:\ccnet\repositories\Pizza\Artifacts</artifactDirectory>
        <sourcecontrol type="svn">
          <trunkUrl>file:///D:/SVNRepositories/Pizza/trunk/</trunkUrl>
          <workingDirectory>d:\ccnet\repositories\Pizza\Source</workingDirectory>
          <executable>d:\svn-win32\bin\svn.exe</executable>
          <tagOnSuccess>False</tagOnSuccess>
        </sourcecontrol>
        <triggers>
          <intervalTrigger seconds="900" />
        </triggers>
        <tasks>
          <nant description="main build">
            <executable>d:\nant\bin\NAnt.exe</executable>
            <buildFile>d:\ccnet\repositories\Pizza\Source\default.build</buildFile>
            <targetList>
              <target>Compile</target>
            </targetList>
          </nant>
        </tasks>
        <publishers>
          <xmllogger />
          <modificationHistory />
        </publishers>
        <state type="state" directory="d:\ccnet\repositories\Pizza" />
    </project>
</cruisecontrol>

This configuration checks my local, file-based SVN repository for changes every 900 seconds. If any changes are found, or a build is forced, the target "Compile" of the default.build script is executed.

This is pretty cool. We've now built ourselves a very basic build-server, which checks regularly for any modifications and, if needed, creates a new build. This way I can make sure that whenever I check something into the SVN repository I still get a working – or at least building – application.

Accessing the Build-Server

But wait – even though we've got our builds automated, we don't yet have any way to access the output of the build, or at least see what CCNet is actually doing and learn of any errors that were encountered during the build.

Basically there are two "interfaces" to CCNet: the web-dashboard and a little Windows app. For the web-dashboard you need to have IIS installed as well as ASP.NET. To get the web-dashboard up and running, just create a new virtual directory and point it to your CCNet location. There should be a sub-folder "webdashboard" – that's where you want to point your virtual directory. That's all you need to get going.

You should then see the dashboard, which will tell you whether your build succeeded or, in the case of a failure, what went wrong during the build.

ccnet_dashboard

I think for now this is working quite well. I kinda get the feeling that I'll soon write some more about how to evolve your build-experience … 🙂

To quote or not to quote – that's the question

Since I'm using NAnt to build my solution, I include all steps needed to get a build up and running. So recently I added the fantastic WSPBuilder to my latest build file:

<target name="build.wsp" description="Packages the WSP file">
    <exec program="${dir_buildtools}\wspbuilder\wspbuilder.exe">
        <arg value="-BuildCAS false"/>
        <arg value="-DeploymentTarget BIN"/>
        <arg value="-TraceLevel verbose"/>
    </exec>
    <move todir="${build.dir}" overwrite="true">
        <fileset>
            <include name="**.wsp"/>
        </fileset>
    </move>
</target>

But this didn't work out as expected: for some strange reason, the command-line args added with the arg element were just ignored!

OK, let's examine what is going on here. If you look closely, you'll notice that the actual command being executed looks like: wspbuilder.exe "-BuildCAS false" "-DeploymentTarget BIN" "-TraceLevel verbose". NAnt quotes each value that contains a space, but apparently wspbuilder doesn't like the arguments enclosed in quotes, because each whole arg is interpreted as just a single string. So parsing for -BuildCAS fails, because the actual string is longer (when using spaces for tokenizing!).

So what next? This does the trick:

<target name="build.wsp" description="Packages the WSP file">
    <exec program="${dir_buildtools}\wspbuilder\wspbuilder.exe"
          commandline="-BuildCAS false -DeploymentTarget BIN" />
    <move todir="${build.dir}" overwrite="true">
        <fileset>
            <include name="**.wsp"/>
        </fileset>
    </move>
</target>

In this case I supply the complete command line as a single attribute, which NAnt passes through without adding any quotes.
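An alternative that keeps one argument per element is NAnt's line attribute on the arg element, which – unlike value – is passed to the process verbatim, without quoting. A sketch of the same exec call using this approach:

```xml
<exec program="${dir_buildtools}\wspbuilder\wspbuilder.exe">
    <!-- 'line' is appended to the command line as-is, no quotes added -->
    <arg line="-BuildCAS false"/>
    <arg line="-DeploymentTarget BIN"/>
    <arg line="-TraceLevel verbose"/>
</exec>
```

This reads a bit more clearly than one long commandline string when there are many switches.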

MSBuild vs. NAnt

OK, this is probably the 1,000th blog post concerning whether to choose MSBuild or NAnt.

I had quite a tough time fighting my build-environment with MSBuild after starting out with NAnt (even before MSBee was available). But now I've got a couple of things worked out: I can build and do some sort of xcopy-deployment, and I can run NUnit and NCover (even though I got different results from the GUI runner and the automated NUnit target in my MSBuild script ?!). But adding more features to this build-process seems to be a much bigger burden than doing the same with NAnt.

So I figured: I will use NAnt for the overall build-process and only consider MSBuild for building my solution. I'll see how this side-by-side usage works out.