How to Kick-start the Software Quality Initiative At Your Company: An Expert's Insight

Highly collaborative, high-performance teams rely on a handful of proven techniques to meet critical quality goals. While many development organizations talk about applying these techniques to achieve higher levels of quality and productivity, Jim Sartain, VP of Worldwide Quality at McAfee, did it! And what's even more impressive, he did it at more than one organization!

We were grateful to have Jim share insights about how to translate these quality strategies into actionable steps, based on interactions with customers, a comprehensive focus on fundamental engineering practices, and a product quality directive that encompasses the entire organization.

From a product development perspective, he demonstrates how continuous integration, automated testing, and code reviews provide the foundation for product quality – quality that customers reflect back as high Net Promoter Scores. Watch the webinar on-demand to learn more.

Any quality-conscious organization can follow the practices he outlines. If you're just getting started with a quality initiative at your company, I've included a few excerpts from the webinar below that I hope you'll find helpful.

How to Get Started Implementing a Major Quality Initiative: Q&A with Jim Sartain

Q:  Who on the team is responsible for drafting quality goals and plans in the product life cycle?

A: We have a role at McAfee called the Quality Leader. I'm a Quality Leader for McAfee. I've got Quality Leaders for each major business unit, and I've got Quality Leaders for some products. It's the job of a Quality Leader to help the whole team, including the Development Manager, come up with what they think their goals should be. It works best for teams to declare goals for what they think is possible. For example, we never tell a team, thou shalt find X% of your defects in peer reviews. We do share data showing that there are teams finding up to 70% of their defects in peer reviews. The average is probably 30% to 40% right now.

So when a team says, hey, we're going to find 3%, we can say, hey, have you noticed that it seems to be possible to do much better than that, in fact, 10x better than that? Why don't you go talk to these other teams over here and ask them what they're doing and how they're doing it? It's so much more credible when it's another development team saying, yes, this is how it works, this is what we're finding. You want to have that positive reinforcement.

Let me just give you an example of some comments. I taught a peer review class in Idaho Falls last Friday, and I love to hear comments from the engineers, having them say things like, "I was a skeptic but now I'm a believer." I sincerely hope they implement this practice, that they're starting to do these things. I really want people going in who really want to give it an honest go and answer those hard questions.

There are plenty of great reasons that people could be skeptical, but the number one reason I hear for people being skeptical about this stuff working is not because they don't think they can do it themselves, but because they question whether they're going to be given the time to do the work. So the best way around that is to set a goal not only on the number of defects found, but set a goal about the amount of time that's set aside for doing reviews, the amount of effort basically, and track the effort actually invested versus the effort planned.

What we're finding is that when teams are not finding a high percentage of defects in peer reviews, it's only because they're not investing enough effort. I'll give you an example. I found a team that was only finding 8% of their defects in peer reviews, so I asked them how much effort they invested. Because they were using the Code Collaborator tool, the effort was tracked automatically: they were investing 15 hours a month in peer reviews.

I asked how many people were on the team: 25 people. So I asked, well, how much time is that per month per person? Work it out, that's roughly half an hour. How much time per person per week? That's about eight minutes. OK, I think we found your problem. I mean, try investing 30 minutes a week, or an hour per week, per person. See what happens. And sure enough, they're starting to see a significant increase in the number of defects found in peer reviews and the overall percentage.
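That back-of-the-envelope arithmetic is easy to check in a couple of lines (the 15 hours and 25 people come from the example above; the weeks-per-month figure is just an approximation):

```python
# Spreading 15 team review-hours per month across a 25-person team.
team_hours_per_month = 15
team_size = 25
weeks_per_month = 4.33  # average weeks in a month

minutes_per_person_per_month = team_hours_per_month * 60 / team_size
minutes_per_person_per_week = minutes_per_person_per_month / weeks_per_month

print(f"{minutes_per_person_per_month:.0f} min/person/month")  # 36 min/person/month
print(f"{minutes_per_person_per_week:.0f} min/person/week")    # 8 min/person/week
```

At roughly eight minutes of review time per person per week, it's no surprise the team was catching so few defects before testing.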

So a very effective metric is the percentage of defects found in peer reviews out of the total found in peer reviews plus the total found in testing. A reasonable goal for a team just starting out is at least 20%, but most teams are able to work up to 50%, even 60%. A team measures the percent found in peer reviews and budgets accordingly: if you're going to find 100 defects at an hour per defect, then the team should be investing 100 hours of time. If they've only invested 10 hours, then they're only going to find about 1/10 as many defects, right? I mean, it's simple.
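As an illustrative sketch of that metric and effort budget (the function names are mine, not something from the webinar or a McAfee tool):

```python
def review_effectiveness(found_in_reviews, found_in_testing):
    """Percent of all defects found that were caught in peer reviews."""
    total = found_in_reviews + found_in_testing
    return 100 * found_in_reviews / total

def review_hours_to_budget(expected_defects, hours_per_defect=1.0):
    """Effort to set aside if reviews find roughly one defect per hour."""
    return expected_defects * hours_per_defect

# The team above: 8 of 100 total defects were caught in reviews.
print(review_effectiveness(8, 92))    # 8.0 -- well below the 20% starter goal
print(review_hours_to_budget(100))    # 100.0 hours to find ~100 defects
```

The point of pairing the two numbers is the one Jim makes: tracking the percentage alone isn't actionable unless you also budget, and then track, the review effort needed to achieve it.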

Q:  How can someone who is not in a management position improve quality?

A: Well, there's a role for people, as I mentioned, this Quality Leader role. You can help the team establish its own goals. You can promote the idea of measuring some of these things. You can hopefully persuade the managers that this will be a good thing to try. You can persuade, hopefully influence, some of the other developers on the team; you can get just a few to try it, get one project team. The good news is you don't have to have the entire organization do it. All you need is one team, which is why I like to work with teams first, because over time, the teams will move the entire organization. So that's my advice: just find a team that's willing to give this a go.

Now, I remember when I was an engineer a long time ago, I adopted this process with my team. I had a bunch of peers, and I said, hey guys, here's an idea: let's do code reviews before we check in. I'll review your stuff and you review mine. I was fortunate, because when I was in college, I had a job programming. My boss insisted that I review his code and that he review mine, and I learned how to write great code by reading great code. I learned how to make my code better through his suggestions, and I actually helped him from time to time. So that's all it is: find a group of people who are willing to do this, and go do it with them.

If you enjoyed the Q&A above, please stay tuned, because there's more. In the coming weeks I'll be blogging more about Jim's session on software quality, particularly about the effectiveness of code review. In the meantime, feel free to view the entire webinar on-demand.
 
