Karl Wiegers Q&A - Metrics, Passarounds and Agility in Peer Review
Posted December 20, 2012

On December 5, Karl Wiegers presented a terrific webinar to the SmartBear community called, “Peer Review—An Executive Overview.” It was an extremely well-attended event and Karl did an excellent job of presenting. Attendees asked so many insightful questions that Karl wasn't able to answer all of them during the Q&A session.

Thankfully, Karl was able to go back and answer a lot of the questions that he didn't get to during the live webinar. The first group of questions appears below, and we’ll publish a second installment in the next few days.

How do you measure the success of a review as you're just starting out? We don't have any historical data or metrics yet to compare against.

A: You might not have any data yet, but I’m sure you have impressions of quality issues that occurred on previous projects and their consequences. You might also have some sense of common kinds of problems that recur with certain deliverables time after time. The success of a review can be judged by whether the author and other participants feel like they discovered enough important issues that it was worth the time it took.

Also, recognize the side benefit of sharing information throughout the team, as well as the benefit of learning how to do a better job the next time you create a similar deliverable. Finding just one severe defect that could have caused a lot of headaches and wasted time later in the project justifies the effort and expense that went into the peer review.

How do you manage reviews effectively where the development team members are in different physical locations?

A: One approach is to use asynchronous reviews (like what I call a passaround), in which reviewers contribute their comments on the work product at their convenience and can see what comments others have already provided. Collaborative peer review tools can help with these kinds of reviews. Another approach is to perform synchronous reviews through teleconferencing or videoconferencing. These have to be moderated very well to make sure that the participants, who cannot read body language well under those conditions, can all work together effectively.

Videoconferencing often has a short time lag that can lead to people talking over each other if they aren't aware of it. If you do hold synchronous reviews with people in multiple time zones, rotate the time of day at which you hold them so that all the participants are inconvenienced to about the same degree over time.

What are the key metrics to monitor constantly with regard to peer reviews?

A: The basic dimensions of software metrics are size, time, effort, and quality. For each peer review you can record the size of the material you planned to review (e.g., lines of code or pages of documents) and the actual size you got through. You can record the time spent in the review meeting, if you held one. You can record the effort, in labor hours, that the reviewers devoted to the various steps of the review: planning, overview, individual preparation, meeting, and rework. You can record the number of major and minor defects found in various categories and the number of major and minor defects corrected.

From these basic measurements you can calculate a number of metrics: defect density (defects found divided by size reviewed), total effort spent per review, average effort spent per defect discovered, average effort spent per unit size, review rate (actual size reviewed divided by meeting time), average rework effort in hours per defect corrected, etc. Accumulating these metrics over time from a series of reviews will provide a very good picture of what is happening with your review program. The data will help you plan future reviews and assess their payoff.
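To make those formulas concrete, here is a minimal sketch in Python of how the derived metrics might be computed from the basic measurements; the Review record and its field names are illustrative assumptions, not taken from any particular tool:

    from dataclasses import dataclass

    @dataclass
    class Review:
        size_reviewed: float     # lines of code, document pages, etc.
        meeting_hours: float     # duration of the review meeting, if one was held
        effort_hours: float      # total labor hours: planning through rework
        rework_hours: float      # labor hours spent correcting defects
        defects_found: int       # major plus minor defects discovered
        defects_corrected: int   # major plus minor defects corrected

    def _per(numerator: float, denominator: float) -> float:
        # Guard against empty denominators (e.g., no meeting was held).
        return numerator / denominator if denominator else float("nan")

    def metrics(r: Review) -> dict:
        return {
            "defect_density": _per(r.defects_found, r.size_reviewed),
            "effort_per_defect": _per(r.effort_hours, r.defects_found),
            "effort_per_unit_size": _per(r.effort_hours, r.size_reviewed),
            "review_rate": _per(r.size_reviewed, r.meeting_hours),
            "rework_per_defect": _per(r.rework_hours, r.defects_corrected),
        }

    # Example: 400 lines reviewed in a 1-hour meeting, 6 labor hours overall,
    # 2 hours of rework, 8 defects found and corrected.
    print(metrics(Review(400, 1.0, 6.0, 2.0, 8, 8)))

Accumulating records like these from a series of reviews is what makes trend comparisons possible later.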

ISO 9001:2008 requires that evidence be retained for any review. How can this best be handled for informal reviews, since formal reviews already produce this evidence?

A: Obviously, if you need to conform to the standard then you need to record and retain evidence. You can keep records of even informal reviews, such as peer desk checks. You can record the names of the reviewers, the time they spent on preparation, and the time the author spent on rework. The reviewers can complete a standard issues log or enter their comments into a tool so the evidence of what they discovered can be retained. From those logs you can count how many major and minor defects were discovered in different categories so you can compute the same kind of peer review process metrics that you could for a more formal review such as an inspection.
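As one hedged illustration, an informal desk check could capture its issues log in a simple CSV file and tally the entries later; the file name and column layout below are assumptions for the sketch, not a prescribed format:

    import csv
    from collections import Counter

    # Assumed columns: reviewer,severity,category,description
    LOG_FILE = "desk_check_issues.csv"

    def tally_defects(path: str) -> Counter:
        # Count issues by (severity, category) so an informal review yields
        # the same process metrics that a formal inspection would.
        with open(path, newline="") as f:
            return Counter((row["severity"], row["category"])
                           for row in csv.DictReader(f))

    # Example usage, once reviewers have filled in the log:
    # for (severity, category), count in sorted(tally_defects(LOG_FILE).items()):
    #     print(severity, category, count)

The log itself then doubles as the retained evidence the standard asks for.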

How are reviews best handled in an Agile environment that does not use pair programming?

A: The same way they would be handled on any other project, except that you will probably do more frequent, quick reviews of smaller deliverables than you would on a non-Agile project. For instance, if you are performing four-week sprints, you're going to accumulate a relatively small body of requirements and design information, which should be reviewed. The developers are going to write some code that should be reviewed by one or more of their colleagues.

By performing these incremental reviews throughout the project, you will get many opportunities to correct defects earlier than if you waited until several months' worth of work was completed before beginning reviews. Such early, periodic, incremental reviews are valuable on any project, not just in agile development. You should still use risk assessment to determine which portions of the deliverables have the highest probability of containing defects or the greatest potential negative impact if defects are not discovered. Focus your review energy on those high-risk portions.
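As a rough sketch of that risk assessment, you might score each portion of a deliverable on defect probability and impact and review the highest-scoring portions first; the 1-to-5 scales and the example entries here are assumptions for illustration:

    # Risk exposure = probability of defects x impact if a defect escapes.
    # Scores use a 1-5 scale; the entries are invented examples.
    sections = [
        ("payment calculation engine", 5, 5),
        ("report layout templates",    2, 2),
        ("newly rewritten parser",     4, 3),
    ]

    for name, prob, impact in sorted(sections, key=lambda s: s[1] * s[2], reverse=True):
        print(f"risk {prob * impact:2d}: {name}")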

Just because you are working in an agile environment does not mean that requirements and designs should not be documented. They will just be done in smaller chunks and perhaps in less depth than on a more traditional project.
