SmartBear is still receiving questions relating to the Agile Testing All Star Webinar that we hosted in May! Thanks to Lisa Crispin, we have another installment of Q&A for Agile testers keen on honing their Agile testing skills. The following answers are in Lisa’s own words. If you missed it, you can watch the on-demand recording of Agile Testing Challenges: Overcoming Quality, Process and Team Issues.
Q: Could you explain performance testing in agile?
Lisa: In agile development, we generally work in small increments and quick iterations. When we’ve zoomed into some part of a feature, it’s easy to forget about testing other aspects of quality, such as performance.
My own team tackled performance testing by budgeting points over a few iterations to research which tools might work for us and what infrastructure we needed. After evaluating and choosing a tool, we wrote and completed a user story to establish a performance baseline. Some agile teams run performance tests regularly as part of their continuous integration. We usually do performance testing when we make a change to architecture, need to validate a new architecture or design, or change a component of our production system, such as the application server or database version. Sometimes, at the beginning of a major new theme, we start with a spike: the developers write code to implement the solution they think will work, then we do performance testing to make sure it will scale. We throw away the spike code, but if the performance is good, we start working on the real stories knowing that we’ve got good enough performance.
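The baseline idea above can be sketched as an automated check that fails the build when performance regresses. This is a minimal, hypothetical illustration, not Lisa's team's actual harness: the operation under test, the number of runs, and the baseline value are all assumptions standing in for a real system call and a real recorded measurement.

```python
import time

# Hypothetical operation under test; in a real suite this would exercise
# the actual system (an HTTP endpoint, a database query, etc.).
def operation_under_test():
    total = 0
    for i in range(100_000):
        total += i
    return total

# Hypothetical baseline recorded from the baseline user story.
BASELINE_SECONDS = 2.0

def measure(fn, runs=5):
    """Return the best wall-clock time over several runs, to reduce noise."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        timings.append(time.perf_counter() - start)
    return min(timings)

elapsed = measure(operation_under_test)
# Fail the CI build if we regress past the agreed baseline.
assert elapsed < BASELINE_SECONDS, f"performance regression: {elapsed:.3f}s"
```

Run on every integration build, a check like this turns the one-time baseline story into an ongoing guard, which is how teams that "run performance tests regularly in their continuous integration" typically operate.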
Q: Have you identified any specific best practices or processes that help you when it comes to quality and making sure that you can keep up with the speed of delivery of the sprints?
Lisa: I think one key is not so much the agile testing process as the team process in general. One of the most common problems I see with new agile teams is that they over-commit. They want to make their business people happy, make them think they can get a lot of work done in a week or two weeks or however long the iteration is. So they bring in too many stories and work on too many stories at the same time. Then they get to the end of the iteration, and the testing is not finished, or maybe the coding is not finished. It's important to under-commit and take on only a few stories. Focus on finishing one story at a time, including all the testing activities. And limit your work in process – that is really key.
I think even once people get good at the agile practices, they still have trouble understanding the customer's requirements and specifications. That's another big problem: the software may not have any technical bugs, yet it doesn't deliver exactly what the customers wanted.
Strategies like the three amigos approach, where the developers, product owner, and testers get together to discuss issues, are effective here. So is the “whole team” approach in general: working together and doing some kind of specification by example or acceptance test-driven development, where we actually start by thinking about what we're going to test and gather examples of desired behavior. Make sure to get those examples of desired behavior from the customers, translate them into tests, and then write the code to make those tests pass, so the tests drive the development.
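The specification-by-example flow described above can be sketched in code: a concrete example agreed with the product owner becomes an executable test, and the implementation is written to make it pass. The business rule, function name, and values here are hypothetical, chosen only to illustrate the technique.

```python
# Hypothetical rule agreed in a three-amigos conversation:
# "Orders of $100 or more get a 10% discount; smaller orders pay full price."

def discounted_total(order_total):
    """Implementation written afterwards, to make the example tests pass."""
    if order_total >= 100:
        return round(order_total * 0.9, 2)
    return order_total

# The customers' concrete examples, captured as executable tests first.
def test_large_order_gets_discount():
    assert discounted_total(100) == 90.0

def test_small_order_pays_full_price():
    assert discounted_total(99.99) == 99.99

test_large_order_gets_discount()
test_small_order_pays_full_price()
```

Because the tests encode the customers' own examples, passing them means the team built what was asked for, not just software without technical bugs.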