How To Be An Agile Tester

Two threads connect all of the agile companies I have worked with over the past decade -- releasing often, usually every two weeks, and doing some sort of daily status meeting. Each time I went somewhere new, I would tell myself, "OK, this is going to be agile for real. It's time to reevaluate and probably get some new skills to keep up," and inevitably it would be the same thing, but different. Developers writing code for a couple of weeks and sticking it in a build right before planning for the next sprint starts, testers saying they really (really) need to learn some automation this time, and people fluttering around asking what we were doing about regression testing.

As my colleague Matt Heusser has said before, absent a compelling vision, "Agile" becomes the three S's - Sprints, Standups, and Stories.

Even those "three S" agile conversions changed how I did my work -- while the more radical ones, on a good day, produced a level of collaboration that led to new and emergent practices, and made some old ones obsolete.

Let’s talk about how agile has tilted the tester skill set.

Cross Functional Teams

When I first started working in software, there were some sharp divisions between roles. Developers sat in their cube farm and had their own leads and managers, while testers sat in a different space with their own leads and managers. It was usually pretty obvious which team you were on. Years went by and the cube farms went the way of the dodo. Teams looked different on a superficial level as agile crept in, but there were other, more important changes.

Under agile software development, a delivery team is expected to have everything it needs to deliver software as one team. That means the developers, testers, DBAs, and analysts are all on one delivery team, working together to ship a stream of value. The focus shifts from work-product as a stage ("My job is to code this up") to getting the current release across the finish line. Before agile, I had to schedule time with a developer to find that a buffer overflow was caused by a mis-sized field in a database table, and then schedule time with a DBA to actually get that field fixed. On more modern teams, I just have to lean over and ask the person next to me who is familiar with the database technology and data structures we are using. Chances are, that person is vaguely familiar with the feature code, too.

The biggest misunderstanding I had when I first encountered this idea was thinking that each person on a cross functional team should be able to do everything. And by everything, I mean write and commit production code. The reality of cross functional teams is that someone available on the team needs the right skills for whatever feature we are working on right now. Someone who is good at building the API, someone good with JavaScript, and someone who specializes in testing. With these powers combined, you have a team that can make software.

Cross functional teams aren't a skill, exactly, but the ability to get along with people, to read intent (is this person do-not-interrupt busy or make-it-quick busy?), to be human, vulnerable, willing to teach but not a show-off ... those are skills, and they matter more on agile teams than on more traditional teams that focus on documents, roles, specific work products, and handoffs.

Technical Inclinations

If I could point to any one thing that has changed since agile hit critical mass, it's that testers are more often than not expected to have some technical interest. The interviews I had years ago typically went over questions like how I might test a calculator, and then a series of riddles and logic problems that might or might not correlate to testing ability. In more recent interviews I still got those questions, but I was also likely to get coding challenges and questions about my experience with tools.

There are a couple of reasons for this technical focus -- having the ability to test before there is a user interface potentially means finding problems much earlier, and these people can create sets of checks that run with each build to act as change detection. When I work on testing projects now, rather than waiting for a mostly done user interface, I'll open up Postman and work through the API with a developer, take a look at the database to see how things come together there, or even read the latest code in the source code repository.
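The per-build "change detection" checks mentioned above can be as simple as a script that validates the shape of an API response. Here is a minimal sketch in Python; the endpoint, field names, and payload are all hypothetical stand-ins for whatever your own service returns, and the response body is simulated rather than fetched over the network.

```python
import json

def check_user_payload(payload: dict) -> list:
    """Return a list of problems found in a hypothetical /users response body.

    An empty list means the payload still looks the way we expect --
    any change to field names or types shows up as a reported problem.
    """
    problems = []
    expected = [("id", int), ("name", str), ("active", bool)]
    for field, expected_type in expected:
        if field not in payload:
            problems.append("missing field: %s" % field)
        elif not isinstance(payload[field], expected_type):
            problems.append("wrong type for %s: %s"
                            % (field, type(payload[field]).__name__))
    return problems

# Simulated response body, as if parsed from an HTTP call made in
# Postman or with a client library.
response_body = json.loads('{"id": 42, "name": "Ada", "active": true}')
print(check_user_payload(response_body))  # an empty list: nothing changed
```

Wired into a build pipeline and pointed at a real endpoint, a handful of checks like this will flag an API contract change minutes after the commit that caused it, long before anyone opens the user interface.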

A person that understands testing and isn't afraid of tooling and programming languages is likely to find a job market that will embrace them with open arms.

Coping With Change

Process fads come and go every few years as we learn more about what makes a good team (and then commoditize it). User interface technologies are cool one year, and the next no one wants to be seen with them. And JavaScript, well, there is a new must-have JavaScript library out at least once a month. We always had a way of ignoring the world changing around us in more traditional, waterfall-ish development groups. The tech landscape would change, and we were of course aware, but it wasn't immediately important. We would upgrade when it became important to do so.

Somehow, agile embraces this change and turns the change rate up to 11. In addition to keeping up with the latest technology, there is also a drive to constantly adopt whatever is new. For me, this meant having to keep pace with development. A new library would inevitably break some test tooling, which would take time to figure out and update. Programming library upgrades would generally require a significant test effort to make sure there weren't any surprises. These changes are manageable when done every once in a while, but the "go fast, break things" mantra of agile usually has the test group finding broken things.

The mental shift, from pursuing certainty and reducing risk, to pursuing opportunity, learning from experiments, and maximizing value, can be a difficult one to make. The first book on Extreme Programming had it right with its subtitle of "Embrace Change"; coping with that mental shift is certainly a skill.

Change can be bad or good. Either way, it is something to think about and be ready for when testing on an agile team.

Upgraded Reporting

One of the earliest bits of advice I got in testing was about how to produce a good bug report. The kind of report that goes into JIRA or Bugzilla or the like. I was told that there should always be steps to reproduce the issue, that I should state expected results and compare them to actual results, and that I should be careful to include screen shots and log files. Not to mention the confusion that priority and severity can create. These reports took some time and care for me to produce.

Done like this, bug reporting is a sales technique (or, if you are into philosophy, you can call it rhetoric). There is always too much work and too little time, so when I reported a bug it was a way of saying that what I found was more important than other things that could be worked on.

I developed a good working relationship with a front end developer at the last company I worked with. Randomly throughout the day, he would ask if I had a minute to go over something he was working on. We would stand together (he was a standing desk kind of person) and I'd ask questions about how he thought the page would be used, I'd try a few test ideas and sometimes find a couple of bugs. In a more traditional environment, he might have asked me to document these for him to look at later. Here, I was able to live demo a problem, have him try a few code edits, and then try things out again. Problems just got fixed.

Conversational reporting isn't always the best solution; sometimes it is good to create documentation as a reminder for bugs that take time to think through. But when it was possible, we got good software done fast.

Team Dynamics

People value independence, I certainly do. I like being able to pick the projects I work on, say no to the ones I'm not interested in, pick the people I work with, and decide how to get the work done. In some ways, agile facilitates that independence. Think about some of the more self-directed teams you've seen. Product managers give the group a description of what they need to build, and then they go off and do it. That means making decisions about what technology should be used, and the finer details of what the end result will look like.

Despite being able to make decisions about the product, in most companies we don't pick who we work with and in agile that means spending a lot of time with these people. Ron Jeffries writes about very small groups of people, feature teams, that come together long enough to produce a feature and then move on to the next thing. It sounds extremely efficient, and the one time I got to try this out, it was.

The one drawback for me was that one feature team generally didn't provide enough work. I would float between a few different teams as they needed me and help with testing questions when asked. The result was a few development teams that were used to working together and were able to get into a groove, and me floating between them, not really feeling like I belonged to any team aside from the greater team that is the development department. The variety of work and the ability to help people were there, but I didn't get that great sense of belonging that comes from consistently spending a lot of time with a small group.

As I mentioned earlier, with the exception of developing some technical chops, most of these are not hard skills, or arguably, skills at all. Scott Barber's suggestion that testing is testing, and agile is context, is absolutely right. Testing skills are transferable; once you learn a few, you can apply them on a number of different projects. With agile, the trick is to see where the pieces fit. I like to ask myself: what is more important, what is less important, and where can I be the most useful? If you can answer those questions, then you can certainly be an agile tester.