Continuous Integration

I received news this week that I'll be starting a new project (well, just another release of my last project), so I started preparing my build and source control environments in earnest.  Because I have to be able to build defect fixes for the production environment at any time while also pursuing a new stream of development, there were a number of hurdles I needed to overcome.

The first decision I had to make was whether to keep a single VSS database, use two (one for production and one for new development), or move to a new source control package.  After reviewing a couple of source control options other than VSS, I decided I needed to stick with VSS.  That decision was made solely because of the difficulty I would have getting a new system installed in the IT environment I work in.  That left only one choice: one database or two.  I decided on two, primarily because of how deep the new development was going to reach into the code base.  The second reason was that it ensures changes aren't accidentally made, left untested, and then migrated to the production environment during a hot fix.  The drawback is that we will have to duplicate any code we write for production hot fixes.  That said, knock on wood, we've had so few problems with the system that I can't see this happening very often, if at all.

After deciding on the layout of the source control environment, my next task was to tackle one of the things that caused us to drop the ball during our previous releases.  Because we have to do two specific things, one-click installs and builds for five different environment configurations, we have to manage five different setting values for each installation parameter.  The problem we encountered was that changes were not always propagated through the setup files in all environments, primarily because we were making those changes to the files manually.  The solution I decided to work with as part of solving this was CruiseControl.NET.  The combination of creating installations automatically (and repeatably) with CruiseControl.NET and using NAnt for the builds themselves (kudos to the previous team lead for getting us that far into this system) will bring greater confidence, if only for me, in the installation deliverables.
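For reference, wiring NAnt into CruiseControl.NET looks roughly like this in ccnet.config.  The project, paths, and target names here are hypothetical placeholders, not our actual setup:

```xml
<cruisecontrol>
  <project name="MyProduct">
    <!-- Poll the VSS database for changes (paths and account are placeholders) -->
    <sourcecontrol type="vss">
      <project>$/MyProduct</project>
      <ssdir>\\corpserver\vss</ssdir>
      <username>builduser</username>
    </sourcecontrol>
    <tasks>
      <!-- Hand the actual build (and installer creation) off to NAnt -->
      <nant>
        <executable>C:\nant\bin\nant.exe</executable>
        <buildFile>MyProduct.build</buildFile>
        <targetList>
          <target>package</target>
        </targetList>
      </nant>
    </tasks>
  </project>
</cruisecontrol>
```

The nice part of this split is that CCNet only watches source control and triggers builds; everything about how the installers get produced stays in the NAnt build file, where it can also be run by hand.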

The one area that I'm still lacking coverage of is the automatic generation of the installation parameter files for the different environments.  I'm planning to accomplish this with a command line program that can be called from either NAnt or CCNet and will generate the files just prior to the creation of the packages.  All the different values will be stored in an MS Access database, where I can create and store parameter values in an extensible way.
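As a rough sketch of what that generator would do, here's the idea in Python.  The environment names and parameter values are hard-coded stand-ins for what would actually come out of the Access database, and the file naming is just an illustration:

```python
# Stand-in for the parameter values that would live in the Access database:
# one set of values per target environment.
settings = {
    "dev":  {"DbServer": "devsql01", "LogLevel": "Debug"},
    "prod": {"DbServer": "prodsql01", "LogLevel": "Error"},
}

def write_parameter_file(env, values, path):
    """Write one environment's parameters as key=value lines."""
    lines = [f"{key}={value}" for key, value in sorted(values.items())]
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")

# Generate one parameter file per environment, just before packaging.
for env, values in settings.items():
    write_parameter_file(env, values, f"setup.{env}.params")
```

Because every environment's file is produced from the same data in one pass, a parameter can no longer be updated in one environment's setup file and forgotten in another.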

So far I'm really liking how our build system is growing.  The only hiccup that I've run into so far is not being able to get CCNet running as a service.  This isn't a problem with CCNet itself, though; it's a problem with our environment.  My build servers (they just happen to be in our development sandbox) are in a workgroup outside of the corporate domain, while the VSS database is on a network share inside the corporate domain.  Everything works fine when I'm logged into the server with a mapped drive that uses my domain credentials.  As soon as I run CCNet as a service, it fails because the account the service is configured to run under doesn't have permission to any resources within the corporate domain.  I'm sure I can fix this (one way is to run the service under a domain account), but I haven't had any luck yet.  I'm really hoping that this one hurdle doesn't prove to be the one that trips all these plans up.
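If I do end up going the domain-account route, it should be something along the lines of the following; the service name, account, and password here are placeholders:

```bat
REM Point the service at a domain account that can reach the VSS share
sc config CCService obj= "CORPDOMAIN\builduser" password= "s3cret"
sc stop CCService
sc start CCService
```

The space after `obj=` and `password=` is required by `sc`'s argument syntax, which trips a lot of people up.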

I’m the Igloo Coder and I think my sentimental favourites in Olympic hockey this time around are the Swiss…..but I’m Canadian and my realism is biased.