Put yourself in my place: the middle of a typical performance review. My manager was talking about what I had accomplished in the past year and where I wanted to take things. My answer was more responsibility and a corresponding raise. It's no huge surprise that young, ambitious Justin wanted that. I just had no idea how to get it - or at least how to get it properly. I had lots of ideas, but nowhere for them to go.
My manager knew that moving up in an organization takes more than skill; you need to create a sphere of influence and widen it over time. Her challenge to me for the next year was to find a way to contribute more to the organization than "just" reporting bugs.
Software testers often have a lot of value to share. We just need to identify it and figure out how to communicate it.
What Testers Do
One company I worked with had a user experience expert on staff to run usability tests and help with product design. We were talking about what a normal day looks like for him and he described interviews, user studies, and design concepts. At some point during the conversation, the topic switched to testing and he made the statement that "You basically just click around and see what happens, right?" I replied with something along the lines of "do you just drag boxes around till things look nice?"
His question bothered me, but it also made me realize that we had done a pretty bad job of describing the general role of a tester. People outside of the testing group didn't realize what the act and performance of software testing looked like.
The ability to describe what we do is a crucial skill for developing respect, and for generally being effective.
Compare "playing with the software to find bugs" to something like this:
"I spent 30 minutes today looking at a new city selector for our shipping information page. I started by entering the city I'm currently in and submitting the page. The first thing I noticed was that I had to type in several letters before the list was filtered to a reasonable length. This felt tedious."
Storytelling invites people into your thought process and reminds them that you are there to participate and help make a better product. Studying test design techniques and carefully observing your own work can help you tell a more complete and accurate story of the work you did. It often results in engagement and questions, giving the whole team a better idea of how the feature works and how people might get value from it.
How To Think About Test Coverage
Every time I have been in the end-game, trying to get software out to customers, the decision makers -- development managers, product managers, and executives -- always want to know how much longer we need to finish testing. Part of my answer is usually based on gut feelings or a hunch about how things have been going: how many problems and questions I have discovered recently, how the test group is feeling, how quickly the programmers are fixing problems and how they feel.
The other part of this conversation is based on models, or the ways we think about and describe things.
Programmers talk about unit test coverage in terms of things like line coverage (the percentage of lines, or statements, executed by the tests), branch coverage (the percentage of decision branches exercised), and variable coverage. We can still use models to talk about test coverage when we aren't looking directly at code. Large, slow organizations often measure requirements coverage - the number of requirements that have a test case.
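To make the line-versus-branch distinction concrete, here is a minimal sketch in Python. The `shipping_cost` function and its numbers are invented for illustration; they are not from any real product:

```python
# Hypothetical function, used only to illustrate coverage models.
def shipping_cost(weight_kg, express=False):
    cost = weight_kg * 2.0  # executed by every test
    if express:
        cost += 5.0         # executed only when express=True
    return cost

# A single test with express=True runs every line of the function
# (100% line coverage) but exercises only one of the two branches
# of the `if` statement.
assert shipping_cost(3, express=True) == 11.0

# Full branch coverage also requires the express=False path,
# where the `if` body is skipped.
assert shipping_cost(3) == 6.0
```

Tools such as coverage.py can report both numbers; the point here is only that "all lines ran" and "all decisions were exercised" are different claims.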
I don't use that kind of language in my work, but I do care about the core scenarios and whether they have been exercised. Using the concept of testing tours, I can create an inventory of the parts of a piece of software -- variables, most-used features, menus, and so on. I could also use the Heuristic Test Strategy Model as a way to describe the different parts of a product.
No single type of coverage will tell you that you are ready to release, but understanding how to talk about test coverage will give a better picture of how far along testing is and, more importantly, can highlight what is missing. Teaching your company about test coverage might finally kill the question of "How many test cases do you have left?"
The Value Of Sharing
A good journalist spends time researching the topic, learning as much as possible before sharing their ideas with the world. Some of them parachute into a place they have never been before to experience and observe it themselves. After the experiences and observations are digested, the journalist picks the important parts, the parts the world wants to know about, to turn into text and publish.
When we dive into a new software product or a new feature, we are just like that journalist: our mission drives the work forward.
But sometimes we forget a step.
The things we learn while slashing through the jungle of software in development are useless until we share them with other people. The most common way of sprinkling the important things we learned over a group of people might be through bug reports. Software testers are salespeople in disguise. We are selling the rest of the team on bugs that need fixing. A fix creates extra work for programmers, so we have to sell them on the idea that it is worth doing. Sometimes, I want that change made before the programmers finish their current task.
A good report is designed to be understood quickly, even when the reader is just skimming.
Sometimes I don't bother with documentation and tracking systems at all. The quickest path to getting a problem resolved might be to walk over to the developer's desk or find them on Skype to demo the problem. This might result in the programmer telling me that this isn't a problem at all, or it might result in a bug getting fixed before having to deal with triage and waiting around for the next scheduled build.
Compare that with the types of bug reports that come in from sales and customers -- "Page is broken", "I can't submit". There isn't enough detail to figure out what the reporter meant. Usually someone on the test or support team will have to spend time solving the riddle.
Reporting, regardless of how we choose to do it, is an important skill. Doing this well will save a lot of time and frustration.
My wife is a nanny to a pair of boys who are 3 and nearly 5 years old now. The younger boy is in the stage where everything anyone says results in one question: "Why?" Conversations are a long series of whys and answers till he eventually says "oh", and we move on to the next thing. As we get older, that series of whys slows down and eventually dies. People forget that everything is interesting and just get on with day-to-day life.
Good testers never stop asking the question 'Why?'
By asking why, we can discover problems in thinking that might lead to developing the wrong product, discover new lessons about who is using our product and what they value, and make important decisions early, perhaps before they become problems later on. The professional names for these thinking traps are fallacies and biases. When people see good testing work and wonder how someone possibly thought of that question, it often comes from an at least intuitive understanding of these ideas.
Teaching questioning throughout a team can lead to a better product and a harder task for testers. That's good; we like a challenge.
My colleague Matt Heusser suggests that some people respond poorly to "why?", getting defensive, so choose your language well. He advises that "I'm curious - what's the why behind that (decision/thought/idea/feature)?" might yield a better response.
Playing Well Together
On the totem pole of organizational power, testers are rarely at the top. Yet we are in the position of discovering problems in the work of others and pointing them out publicly. That puts us in a position of "correcting" our "superiors", something that doesn't go down well in most of society.
When I do a great job and find important problems, the result is critiquing another person, possibly adding more work to their pile, and possibly forcing a schedule date to slip.
Learning how to do this in a diplomatic way, sending a message that the product could be better in some way but without personal attacks, can be difficult. Studying collaboration, how we work together, the push and pull of power, and how information moves from one person to another can help. Some people are great at picking up on social hints and will get this quickly. Sometimes collaboration isn't the answer and we need to discover how to contribute without having to constantly pair; we need to adapt to others instead of asking others to adapt to us.
I recently saw a quote that junior programmers study architecture and design patterns while senior programmers study social science. I think the same is true of software testers. We bring a varied skill set to software development organizations -- math, experiment design, modeling, and critical thinking. I've mentioned a few here that I have found useful to spread throughout groups I work with.
What skills have you found useful enough to teach to non-testers?