Beef Up Your Scrum-Master Toolbox: Page 2

Learn different ways to tease information out of your Scrum processes by unleashing the power of Excel.


Revealing Hidden Issues
While the burndown graph is an effective visual display of the team's aggregated effort, it sometimes hides resource-allocation or process issues. For example, if the team members aren't fungible (in other words, a developer can't be swapped for a tester), then the work might appear to fit under the burndown line when it doesn't actually do so.
 
Figure 2. Summary Rows: The figure shows the blue shaded summary rows for each day's development and test hours.
More concretely, if development and test are sequential rather than concurrent efforts (and I recognize that this isn't ideal, but it happens), then at the end of a sprint a couple of testers might be left with insufficient time to complete their work even though the team burndown graph shows enough time remaining in the sprint to do so. To avoid this trap, I enhanced the burndown graphing workbook to let me view development and testing labor separately as well as in the aggregate, by writing a VBA macro that summarizes the daily columns in the Current Sprint worksheet (see Figure 2).

To distinguish the development rows from the test rows, I added a "Task Type" column to the worksheet with three codes: D for development, Q for QA testing, and P for project management (though the last holds little interest with respect to reporting). The VBA macro uses this task type code to break out separate totals for development and testing.
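The exact cell addresses depend on your workbook layout, but the heart of that macro is a loop like the one in this sketch. Here the Task Type codes are assumed to live in column C of the Current Sprint worksheet, the summary rows are assumed to sit just below the task list, and the day to total is taken from the active cell's column; treat these as placeholders rather than the sample workbook's actual layout.

Sub SummarizeDayByTaskType()
    ' Sketch only: totals one day's remaining hours by Task Type code.
    ' Assumes Task Type codes in column C of "Current Sprint" and that the
    ' active cell sits somewhere in the day column you want to summarize.
    Dim ws As Worksheet
    Dim dayCol As Long, lastRow As Long, r As Long
    Dim devHours As Double, qaHours As Double

    Set ws = Worksheets("Current Sprint")
    dayCol = ActiveCell.Column                       ' the day column to total
    lastRow = ws.Cells(ws.Rows.Count, "C").End(xlUp).Row

    For r = 2 To lastRow                             ' row 1 assumed to hold headers
        If IsNumeric(ws.Cells(r, dayCol).Value) Then
            Select Case UCase$(Trim$(ws.Cells(r, "C").Value))
                Case "D": devHours = devHours + ws.Cells(r, dayCol).Value
                Case "Q": qaHours = qaHours + ws.Cells(r, dayCol).Value
            End Select
        End If
    Next r

    ' Write the totals into the shaded summary rows below the task list.
    ws.Cells(lastRow + 2, dayCol).Value = devHours
    ws.Cells(lastRow + 3, dayCol).Value = qaHours
End Sub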

 
Figure 3. Origin Code: The origin codes facilitate separating work planned on Day 0 from work added later in the sprint.
Separating QA from development is helpful because it can highlight resource allocation issues, but I wanted to get a picture of another dimension of the burndown—added work. A lot of tasks were being added after the sprint began. The burndown would start out trending down—and then flatten out. This was disconcerting because it meant that we weren't getting through the work that we had committed to complete. To understand what portion of the work in the current sprint was added after the sprint started, as compared to the originally estimated work, I added another column to the burndown: Origin Code.
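Once the Origin Code column exists, splitting a day's remaining hours into planned and added totals takes only a line per bucket. The sketch below uses SumIf through VBA; the column letters (D for Origin Code, H for the day being totaled) and the code values "O" for original and "A" for added work are assumptions I've made for illustration, since the article doesn't spell out those particular codes.

Sub TotalsByOriginCode()
    ' Sketch only: splits one day's hours into originally planned vs. added work.
    ' Column D (Origin Code), column H (the day's hours), and the code values
    ' "O" (original) and "A" (added) are illustrative assumptions.
    Dim ws As Worksheet
    Dim originCodes As Range, dayHours As Range

    Set ws = Worksheets("Current Sprint")
    Set originCodes = ws.Range("D2:D200")
    Set dayHours = ws.Range("H2:H200")

    Debug.Print "Planned: " & Application.WorksheetFunction.SumIf(originCodes, "O", dayHours)
    Debug.Print "Added:   " & Application.WorksheetFunction.SumIf(originCodes, "A", dayHours)
End Sub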

The Origin Code metric is useful because it provides a way to picture how much of the problem the team understood at the outset. Added work doesn't necessarily imply a delta to the originally estimated task hours; rather, it often denotes completely new tasks. Added tasks, or "dark matter" (work discovered only after coding starts), are an expected phenomenon in development. The Origin Code lets you see just how much of an issue it really is. The graph in Figure 3 breaks out QA work from development work and also distinguishes added work from originally estimated work. You can see all three graphs on the Burndown worksheet in the sample workbook accompanying this article.

 
Figure 4. Non-sprint Task Breakdown: This pie chart shows a breakdown of the tasks added during the sprint by task type.
These additions provided a better sense of what was affecting the process, but they still didn't paint a complete picture, because they didn't reveal what kinds of tasks were missed during the initial estimating effort. To get at this information, I added a column called Task Code to the burndown and created codes to identify the common types of added work: knowledge transfer, requirements clarifications, bug fixes coming back from testing, production support, and so on. You can see the complete list of codes in the "Keys & Codes" worksheet, the "Added Sprint Tasks" worksheet, or in the VBA code itself.
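Aggregating the added work by Task Code is then a matter of filtering on the Origin Code and rolling the hours up into one total per code. This sketch does the rollup with a Scripting.Dictionary; the column letters (D for Origin Code, E for Task Code, F for estimated hours) and the "A" added-work code are again placeholders rather than the workbook's actual layout.

Sub SummarizeAddedWorkByTaskCode()
    ' Sketch only: rolls up the hours of tasks added after Day 0 by Task Code.
    ' Columns D (Origin Code), E (Task Code), F (estimated hours) and the
    ' "A" added-work code are illustrative assumptions.
    Dim ws As Worksheet
    Dim totals As Object
    Dim lastRow As Long, r As Long
    Dim code As Variant

    Set ws = Worksheets("Current Sprint")
    Set totals = CreateObject("Scripting.Dictionary")
    lastRow = ws.Cells(ws.Rows.Count, "E").End(xlUp).Row

    For r = 2 To lastRow
        If UCase$(Trim$(ws.Cells(r, "D").Value)) = "A" Then   ' added during the sprint
            code = ws.Cells(r, "E").Value                     ' Task Code
            totals(code) = totals(code) + ws.Cells(r, "F").Value
        End If
    Next r

    ' Dump the totals where a summary report or chart range can pick them up.
    For Each code In totals.Keys
        Debug.Print code, totals(code)
    Next code
End Sub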


Non-sprint work is tracked in a similar way. Most developers get pulled into production support issues or other non-project tasks during a sprint. While I allotted weekly time for this to help estimate what could fit in the sprint, I considered anything beyond the initial allotment to be non-sprint work. So I added an N for "Non-sprint Work" to the list of Origin Codes so the macro can distinguish these tasks from the sprint work. With all the housekeeping codes in place, I wrote a VBA macro that extracts the tasks added to the sprint and aggregates them into a summary-page report. It was a snap to use Excel to set up a range and create a pie chart from this data (see Figure 4). The chart gives an exceedingly clear picture of how the various types of tasks contribute to changes in the burndown rate. An identical chart provides the same insight into non-sprint tasks.
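Producing the pie chart itself takes only a few lines once the summary range exists. The sketch below assumes the summary lands in cells A1:B8 of the Added Sprint Tasks worksheet, with task codes in column A and hours in column B; swap in whatever range your own report occupies.

Sub AddedWorkPieChart()
    ' Sketch only: charts a two-column summary (Task Code, Hours) as a pie.
    ' The A1:B8 summary range on "Added Sprint Tasks" is an illustrative assumption.
    Dim ws As Worksheet
    Dim chObj As ChartObject

    Set ws = Worksheets("Added Sprint Tasks")
    Set chObj = ws.ChartObjects.Add(Left:=260, Top:=20, Width:=360, Height:=240)

    With chObj.Chart
        .ChartType = xlPie
        .SetSourceData Source:=ws.Range("A1:B8")
        .HasTitle = True
        .ChartTitle.Text = "Added Work by Task Code"
    End With
End Sub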


