My Take on Regression Testing CSS
As we add new features to the BBC Music Beta, we have more pages to check before making a new release.
We’re using test-driven development on the code that generates the pages, but these techniques obviously don’t cover the visual appearance of the site. Because the same visual modules are reused on different pages across the site, CSS bugs creep in unexpectedly.
For example, the code generating the links module is reused on three different pages:
And to illustrate the problem, I just noticed a CSS quirk with the background of the links module on that third link. So I started thinking about how one might go about regression testing CSS, and hacked together a simple solution using CutyCapt, ImageMagick’s compare tool and Ruby Rake.
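As a sketch of the core idea (the hostnames, paths and file names here are placeholders, not the real ones), each page is captured twice with CutyCapt and the two screenshots are diffed with ImageMagick's compare tool. In practice the commands would be run with `system` or Rake's `sh`; returning them as strings keeps the sketch self-contained:

```ruby
# Build the shell command that renders a URL to an image with CutyCapt
def capture_cmd(host, path, out)
  "cutycapt --url=http://#{host}#{path} --out=#{out}"
end

# Build the ImageMagick compare command; with `-metric AE` it reports the
# number of differing pixels and writes a highlighted difference image
def compare_cmd(stable_png, dev_png, diff_png)
  "compare -metric AE #{stable_png} #{dev_png} #{diff_png}"
end

puts capture_cmd("stable.example.com", "/music/", "stable.png")
puts capture_cmd("dev.example.com", "/music/", "dev.png")
puts compare_cmd("stable.png", "dev.png", "diff.png")
```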
The result is illustrated here:
- Stable version of the BBC Music Beta artist profile page
- Development version of the BBC Music Beta artist profile page
- Difference between the stable and development versions of the site
The first image is a screen capture of the stable version of an artist’s profile page, taken directly from the live BBC Music Beta. The second image is taken from our development version of the site. The third image shows the difference between the two: in our upcoming release we are shuffling some of the modules around, so these changes are very noticeable.
At the moment, my tool is very basic: you give it the stable and development host names and a list of paths to test (example configuration). It uses CutyCapt to pull down each of the paths from the stable and development hosts, then runs the ImageMagick compare tool between each pair of images. Finally, it produces a very simple HTML file displaying all of the pages being tested.
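A minimal sketch of that configuration and the report step might look like the following; the hostnames, paths and image names are invented for illustration, and the real tool's configuration format may well differ:

```ruby
# Illustrative configuration: a stable host, a development host,
# and the list of paths to test on both
CONFIG = {
  stable_host: "www.bbc.co.uk",
  dev_host:    "dev.example.net",       # placeholder development host
  paths:       ["/music/artists/abc", "/music/"]
}

# Name the three images produced for each tested path:
# the stable capture, the development capture, and their diff
def image_triple(index)
  ["stable_#{index}.png", "dev_#{index}.png", "diff_#{index}.png"]
end

# Emit a very simple HTML page showing each triple side by side,
# in the spirit of the report described above
def build_report(config)
  sections = config[:paths].each_with_index.map do |path, i|
    stable, dev, diff = image_triple(i)
    "<h2>#{path}</h2>" \
    "<img src='#{stable}'><img src='#{dev}'><img src='#{diff}'>"
  end
  "<html><body>#{sections.join}</body></html>"
end

File.write("report.html", build_report(CONFIG))
```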
While we’re only using this tool informally at the moment, it has already proved really useful for catching unexpected CSS bugs on our site.