Knowledge of the past allows us to predict the future, at least for certain areas of human enterprise. I'm convinced that software development is one such area. The theme of this blog is 'to measure = to know' and 'to know = to predict', so by the transitivity of equals we can state 'to measure = to predict'.
But what should we measure? I suggest you measure at least the following four, and I'll try to explain why below: trends in burn down, trends in bugs, trends in test coverage and trends in complexity.
The trend in burn down is a measurement of the number of story points implemented per iteration (if you don't use story points, try some other measure, such as use cases or pages in a requirements document, as long as you can measure it for a significant period and the measured entity remains constant over time). If a team works on a system for some time, the number of story points per release will tend to stabilize: the team learns the subject matter of the system, learns each other's capabilities, and learns how to translate user stories into technology. So if you plot the number of story points per iteration against time, you should see a roughly horizontal line. If the number of story points per iteration goes down, something is wrong and you should investigate the cause. More on this below when we discuss complexity.
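As a minimal sketch of such a check, a least-squares slope over the velocity series flags a declining trend. The numbers below are hypothetical; in practice you would export them from your planning tool:

```python
def trend_slope(values):
    """Least-squares slope of a series indexed 0..n-1."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Hypothetical story points completed per iteration
velocity = [21, 23, 22, 19, 17, 15]
slope = trend_slope(velocity)
if slope < 0:
    print(f"velocity is declining ({slope:.2f} points/iteration) - investigate")
```

A roughly horizontal line shows up as a slope near zero; a clearly negative slope is the signal to start digging.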
The trend in bugs is the number of open bugs in Jira or Bugzilla. If the development team implements story points and fixes bugs, and the test team tests user stories, the number of bugs fixed should roughly cancel out the number of bugs found. If the number of open bugs goes up, this may be because the testers test better (or more) and therefore find more bugs. That is actually a good thing (the point of testing is, after all, to find bugs), though it means the team has more work to do. But if the test effort remains the same, velocity is constant, and the number of open bugs still goes up, the team is producing more bugs and we have a potential problem. Again, more on this below.
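As a sketch with hypothetical inflow/outflow counts (the real numbers would come from a Jira or Bugzilla export), the running total of open bugs is simply the accumulated difference between bugs opened and bugs fixed per iteration:

```python
from itertools import accumulate

# Hypothetical counts per iteration, e.g. exported from Jira or Bugzilla
opened = [12, 10, 11, 14, 16, 18]
fixed = [11, 11, 10, 12, 12, 11]

# Running total of open bugs: inflow minus outflow, accumulated
open_bugs = list(accumulate(o - f for o, f in zip(opened, fixed)))
print(open_bugs)

if open_bugs[-1] > open_bugs[0]:
    print("open bug count is trending up - check test effort and velocity first")
```

If the totals hover around a constant, found and fixed are cancelling out as they should; a steadily growing series is the warning sign.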
The trend in test coverage. The coverage of unit tests should be high (80% or better is our rule of thumb) and remain high during the project. If test coverage drops, this may just be sloppiness: nobody cares any more, and since writing tests is considered extra work, it is easy to stop doing it. The team is then taking a risk that may lead to more Jira issues later on. But again, there may be another cause; see below.
The trend in complexity is about the average complexity of the methods in the system. A generally accepted measure is McCabe's cyclomatic complexity, but the actual measure used doesn't really matter as long as it remains the same throughout the project. I propose to track cyclomatic complexity aggregated per class.
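Sonar computes cyclomatic complexity for you on a Java code base, but to make the metric concrete, here is a crude approximation for Python functions: McCabe's number is one plus the number of decision points, which we can count by walking the syntax tree (the sample function is made up for illustration):

```python
import ast

# Node types that add a branch; a rough approximation of McCabe's decision points
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp)

def cyclomatic_complexity(source):
    """Approximate McCabe complexity: 1 + number of branch points."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, BRANCH_NODES) for node in ast.walk(tree))

src = """
def classify(x):
    if x < 0:
        return "negative"
    for i in range(x):
        if i % 2 == 0 and i > 2:
            return "found"
    return "none"
"""
print(cyclomatic_complexity(src))  # two ifs, one for, one 'and' -> 5
```

Averaging this number per class (or per method) over each iteration gives the time series we want to watch.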
If a project is about to derail, we want to know as early as possible. My claim is that complexity is the measurement to track:
- If complexity goes up, it becomes harder to implement new features. This leads to lower velocity, but not immediately: only after a while will velocity go down, because implementing new features takes longer.
- Complex code will also contain more bugs, so the ratio of new bugs to fixed bugs will go up. Again, this takes a while, especially if testing is done by a separate test team on a release instead of by a tester who is part of the team.
- Writing unit tests for complex code is harder, so if you are struggling to get a story implemented because the code base is hard to understand, writing unit tests may be the first activity that gets dropped. At first test coverage will remain the same, but as rising complexity makes tests harder to write, coverage will start to go down.
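The three symptoms above can be watched together. A minimal sketch, with hypothetical per-iteration snapshots, compares the average of the later iterations against the earlier ones for each series:

```python
def rising(values):
    """True if the second half of the series averages higher than the first half."""
    mid = len(values) // 2
    return sum(values[mid:]) / len(values[mid:]) > sum(values[:mid]) / len(values[:mid])

# Hypothetical per-iteration snapshots of the three metrics
complexity = [4.1, 4.3, 4.6, 5.0, 5.5]   # average McCabe number
coverage = [84, 83, 81, 78, 74]          # unit test coverage, percent
velocity = [22, 21, 21, 19, 18]          # story points per iteration

if rising(complexity):
    warnings = ["complexity is rising"]
    if not rising(coverage):
        warnings.append("coverage is falling")
    if not rising(velocity):
        warnings.append("velocity is dropping")
    print("; ".join(warnings))
```

Rising complexity alone is the early warning; falling coverage and falling velocity arriving later is exactly the lag the bullet points describe.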
So, the measurement to watch is complexity as it develops over time. An excellent way to do this is to install Sonar (http://www.sonarsource.org/) and add it to the Maven build on your build server.
The other measures (burn down, bugs and test coverage) are interesting in their own right and serve as supporting evidence, so I suggest measuring them and plotting them against time as well.