The purpose of Scrum is to help teams build great software through empirical process control. A healthy emphasis on continuous process improvement fits naturally with Scrum, because Scrum evolved from applying Lean manufacturing techniques to software development. This article describes some Excel tools I developed for managing teams, which can help you examine your implementation of the Scrum process in more depth.
You’ll find a free sample workbook illustrating these tools in the downloadable code that accompanies this article, along with step-by-step instructions for running the code in the How-To sidebar.
Teams using Scrum need to thoroughly understand its effects on their development cycle. They need reliable measures of throughput and visibility into where team members’ non-sprint time is being spent. Therefore, the main thrust of the VBA-generated reports and Excel pivot tables described in this article is to enhance the visibility of what is really going on in the development process so you can produce better software faster. Knowing what is really going on is the only way to improve your process.
|Figure 1. Burndown Graph: Here’s an ideal line superimposed over the actual work completion rate line.
The essential metric of the Scrum process is the daily burndown graph, which shows the estimated work remaining for a task or set of tasks, day by day. Ideally, a burndown graph trends steadily downward; in other words, you’d like to see the team complete a day’s worth of the remaining work each day. That doesn’t always happen in practice. For example, the burndown graph in Figure 1 shows an ideal line superimposed over the actual work-completion line.
Figure 1 shows how work was loaded and then began to burn down over a 20-day sprint. The initial value of 15 days of work remaining was calculated by dividing the total number of estimated work hours by the ideal number of hours of project work a person can complete in a day (five is reasonable), multiplied by the number of people on the team:
Total estimated hours / (ideal hours per day * number of team members)
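The workbook performs this calculation in Excel, but the arithmetic is easy to sketch in a few lines of Python. The specific inputs below (300 estimated hours, a team of four) are hypothetical; they are simply one combination that yields the 15-day starting value shown in Figure 1.

```python
def initial_days_remaining(total_estimated_hours, ideal_hours_per_day, team_size):
    """Convert total estimated task hours into team-days of work remaining."""
    return total_estimated_hours / (ideal_hours_per_day * team_size)

# A hypothetical team of 4, with 300 estimated hours at 5 ideal
# hours per person per day, starts with 15 days of work remaining.
print(initial_days_remaining(300, 5, 4))  # 15.0
```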
In Figure 1, the sudden rise in the estimate of remaining work on day three suggests that the team missed a chunk of hours in the initial task estimates at the iteration planning session. It is critical to remember that an estimate is just that: an estimate. In agile development, it is perfectly acceptable for estimates to change, for new tasks to be added, and for unneeded tasks to be removed. It is crucial for the Scrum Master to foster an atmosphere that values honest estimates, without punishing team members for overestimating or underestimating. Again, estimates will always be estimates, not guaranteed future results.
Revealing Hidden Issues
While the burndown graph is an effective visual display of the team’s aggregated effort, it can hide resource-allocation or process issues. For example, if team members aren’t fungible (in other words, a developer can’t be swapped for a tester), then the work might appear to fit under the burndown line when it doesn’t actually do so.
|Figure 2. Summary Rows: The figure shows the blue shaded summary rows for each day’s development and test hours.
More concretely, if development and test are sequential rather than concurrent efforts (I recognize that this isn’t ideal, but it happens), then at the end of a sprint a couple of testers might be left with insufficient time to complete their work, even though the team burndown graph shows enough time remaining in the sprint to do so. To avoid this trap, I enhanced the burndown graphing workbook to show development and testing labor separately as well as in the aggregate, by writing a VBA macro that summarizes the daily columns in the Current Sprint worksheet (see Figure 2).
To distinguish the development rows from the test rows, I added a “Task Type” column to the worksheet with three codes: D for development, Q for QA testing, and P for project management (though the last holds little interest for reporting). The VBA macro uses this task type code to break out separate totals for development and testing.
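The macro itself is VBA, but its aggregation logic amounts to summing one day’s remaining hours, grouped by the Task Type code. Here is a minimal Python sketch of that grouping; the task names and hour figures are invented for illustration.

```python
# Hypothetical rows mirroring the Current Sprint worksheet: each task has a
# Task Type code (D = development, Q = QA testing, P = project management)
# and the remaining hours from one day's column.
tasks = [
    ("Build login screen",  "D", 12),
    ("Test login screen",   "Q", 6),
    ("Sprint ceremonies",   "P", 2),
    ("Refactor data layer", "D", 8),
]

def hours_by_type(rows):
    """Total one day's remaining hours per Task Type code."""
    totals = {}
    for _name, task_type, hours in rows:
        totals[task_type] = totals.get(task_type, 0) + hours
    return totals

print(hours_by_type(tasks))  # {'D': 20, 'Q': 6, 'P': 2}
```

Run once per daily column, these totals produce the separate development and test summary rows shaded blue in Figure 2.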
|Figure 3. Origin Code: The origin codes facilitate separating work planned from Day 0 from work added later in the sprint.
Separating QA from development is helpful because it can highlight resource allocation issues, but I wanted a picture of another dimension of the burndown: added work. A lot of tasks were being added after the sprint began. The burndown would start out trending down, and then flatten out. This was disconcerting because it meant we weren’t getting through the work that we had committed to complete. To understand what portion of the work in the current sprint was added after the sprint started, as compared to the originally estimated work, I added another column to the burndown: Origin Code.
The Origin Code metric is useful because it provides a way to see how much of the problem the team understood at the outset. Added work doesn’t necessarily imply a delta to the originally estimated task hours; rather, it often denotes completely new tasks. Added tasks, or “dark matter” (work discovered only after coding starts), are an expected phenomenon in development; the Origin Code lets you see just how much of an issue it really is. The graph in Figure 3 breaks out QA work from development work and also distinguishes added work from originally estimated work. You can see all three graphs on the Burndown worksheet in the sample workbook accompanying this article.
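Conceptually, the Origin Code split is another grouping over the same daily columns. The sketch below uses hypothetical code letters ("O" for work planned on day 0, "A" for work added mid-sprint; the workbook's actual letters may differ) and invented hour figures.

```python
# Hypothetical daily-column rows: (origin_code, task_type, hours_remaining).
# "O" = originally planned on day 0, "A" = added after the sprint started.
rows = [
    ("O", "D", 40), ("O", "Q", 16),
    ("A", "D", 10), ("A", "Q", 4),
]

def split_by_origin(rows):
    """Return (planned_hours, added_hours) for one day's column."""
    planned = sum(h for origin, _t, h in rows if origin == "O")
    added = sum(h for origin, _t, h in rows if origin == "A")
    return planned, added

print(split_by_origin(rows))  # (56, 14)
```

Plotting the planned and added totals as separate series, day by day, produces the distinction shown in Figure 3.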
|Figure 4. Non-sprint Task Breakdown: This pie chart shows a breakdown of the tasks added during the sprint by task type.
These additions provided a better sense of what was affecting the process, but they still didn’t paint a complete picture, because they didn’t reveal what kinds of tasks were missed during the initial estimating effort. To get at this information, I added a column called Task Code to the burndown and created codes to identify the common types of added work: knowledge transfer, requirements clarifications, bug fixes back from testing, production support, and so on. You can see the complete list of codes in the “Keys & Codes” worksheet, the “Added Sprint Tasks” worksheet, or in the VBA code itself.
Non-sprint work is tracked in a similar way. Most developers can be pulled into production support issues or other non-project tasks during a sprint. While I allotted weekly time for this to facilitate estimating what could fit in the sprint, I considered anything beyond the initial allotment to be non-sprint work. So, I added an N for “Non-sprint Work” to the list of Origin Codes so the macro can distinguish these tasks from the sprint work.
With all the housekeeping codes in place, I wrote a VBA macro that extracts the tasks added to the sprint and aggregates them into a summary-page report. It was a snap to set up an Excel range and create a pie chart from this data (see Figure 4). The chart gives an exceedingly clear picture of how the various types of tasks contribute to changes in the burndown rate. An identical chart provides the same insight into non-sprint tasks.
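The summary behind the pie chart is simply total hours per Task Code. Here is a Python sketch of that aggregation; the two-letter code abbreviations and hour values are hypothetical stand-ins for the full list in the "Keys & Codes" worksheet.

```python
from collections import Counter  # Counter makes per-code totals trivial

# Hypothetical added-task rows: (task_code, hours). The codes stand in for
# the article's categories: KT = knowledge transfer, RC = requirements
# clarification, BF = bug fix back from testing, PS = production support.
added_tasks = [("KT", 4), ("RC", 3), ("BF", 6), ("PS", 5), ("BF", 2)]

def added_work_summary(rows):
    """Sum hours per Task Code; each (code, total) pair feeds one pie slice."""
    totals = Counter()
    for code, hours in rows:
        totals[code] += hours
    return dict(totals)

print(added_work_summary(added_tasks))  # {'KT': 4, 'RC': 3, 'BF': 8, 'PS': 5}
```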
Viewing Data By Day
At this point, I felt my Scrum toolbox was nearly complete, but I still needed a slightly different view of the summary data for a particular day or feature. It’s critical to approach this next bit of tooling with the appropriate mindset. Scrum is about team progress on a committed feature set, but this next bit of Excel functionality would enable you to pull out performance information about individuals. If you, the Scrum Master, collect and report this information and it is subsequently misused, the loss of trust between you and the rest of the team will undermine your effectiveness as Scrum Master. Scrum teams are most successful when they are self-managing and self-organizing; the Scrum Master must support and respect these goals, or Scrum will fail.
If you use the tools presented here to punish team members for inaccurate estimates, you will simply cause them to pad their estimates in the future, and you’ll never get a clear report of the effort the team truly believes is required to get the work done. There are agile techniques for deliberately adding buffers to estimates, but they are beyond the scope of this article; if you’re interested, check out Mike Cohn’s book noted in the resources section.
|Figure 5. Pivot Table: This pivot table shows how work on various tasks stands on a given day.
With that said, it didn’t seem to violate Scrum’s tenets to get some specific insight into which features are burning down in the desired direction, and which aren’t. The earlier you gain insight into features that are struggling, the better you can get the team the assistance it needs and mitigate other risks. In the worst case, if you can sense early that a feature is in trouble, you can alert management and do some damage control to protect the team from being asked to perform the impossible, or to sacrifice their work-life balance for the remainder of the sprint.
To gain this added insight I set up a couple of native Excel pivot tables to provide insight into the daily burndown information. Figure 5 shows an example.
I set up the pivot tables and charting to show how the work stacked up at the start of the sprint. Each day, I modify a similar table and chart beside it on the same worksheet to show how things look on that particular day. You can see these tables and charts on the Report Summary page in the sample spreadsheet.
|Figure 6. Single-feature View: Using a pivot table, you can narrow the burndown view to a single task. This view has lines for both development and quality assurance.
Finally, I supplemented the chart in Figure 5 with another pivot table that lets me see the burndown for a particular feature (see Figure 6). The power of a pivot table is that it lets you make selections and essentially “pivot” the data around different variables. Using this technique, you can pull information out of the burndown and narrow the view to a single feature.
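The filter-then-regroup step a pivot table performs can be sketched in a few lines of Python. The feature names, days, and hour values below are invented; the flat Q series mimics the untouched testing effort visible in Figure 6.

```python
# Hypothetical flattened burndown rows: (feature, task_type, day, hours_remaining).
rows = [
    ("Search", "D", 1, 20), ("Search", "D", 2, 18), ("Search", "D", 3, 19),
    ("Search", "Q", 1, 8),  ("Search", "Q", 2, 8),  ("Search", "Q", 3, 8),
    ("Login",  "D", 1, 10), ("Login",  "D", 2, 6),  ("Login",  "D", 3, 3),
]

def feature_burndown(rows, feature):
    """Pivot to {task_type: [hours remaining, in day order]} for one feature."""
    series = {}
    for feat, ttype, day, hours in rows:
        if feat == feature:
            series.setdefault(ttype, []).append((day, hours))
    return {t: [h for _d, h in sorted(points)] for t, points in series.items()}

print(feature_burndown(rows, "Search"))  # {'D': [20, 18, 19], 'Q': [8, 8, 8]}
```

In this fabricated data, the development line for "Search" ticks back up on day 3 while QA sits untouched at 8 hours, the same two symptoms the Figure 6 chart exposes.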
The graph in Figure 6 points out some of the process dysfunctions quite clearly. In real life, this feature was transferred from one developer to another, leading to considerable requirements churn and a very bumpy ride. Moreover, this view shows that the testing effort had scarcely been started.
The Scrum process is ideally suited to provide near real-time insights into what is truly going on in your development process. Having powerful analysis and reporting tools at your disposal will make your job easier and allow you to support your team’s efforts more effectively. Moreover, the ability to gain earlier awareness of possible problems will improve your ability to communicate about your team’s progress and challenges.