Yesterday morning, I attended Dan North’s keynote session, Deliberate Testing in an Agile World. Dan spoke in depth about the gaps in traditional testing approaches and how the risks associated with software testing can be managed. I found the session extremely insightful, so I've decided to share these insights with testers who couldn't attend STAREAST in Florida. The following are some of the topics Dan touched on during his session.
Software Development and Testing Have Changed
Traditional software development was linear, starting with development, then testing, and finally operations. The software delivery life-cycle (SDLC) looked much like the picture below.
And then Agile came along. Faster feedback and shorter release iterations brought different teams together. Collaboration happened not only downstream, with development integrating with testing, but also upstream. Concepts such as the lean startup necessitated feedback from operations into the product development process as well.
As people and teams came together, programmers embraced parts of testing and wrote a lot of tests. Examples include functional, unit, integration, and load tests, often written using practices such as BDD and TDD, to see how the application behaves.
While programmers were writing all these tests, testers focused on the parts of the application that couldn’t be tested with automated scripts. Exploratory testing and usability testing are examples of the testing that testers focused on.
Dan spoke about how he found this division between development and testing, based on what could and couldn't be automated, deeply unsatisfying, and as a result he wanted to come up with a quadrant to understand whether there was more to testing than the above-mentioned types.
The quadrant Dan came up with has automated and manual testing methods on the Y axis and deterministic and stochastic on the X axis. A deterministic test produces the same output for a given input, while a stochastic one has a random or probabilistic input element attached to it.
Identifying Gaps in Your Testing Approach
If we plot all the activities performed by testers and developers (discussed above), we can see there are massive gaps in the lower right-hand and upper left-hand quadrants.
It is therefore critical to understand what kind of testing falls into the empty quadrants. Dan discussed how the top left-hand corner of the quadrant (the intersection of manual and deterministic) could be filled with documentation testing and user journey testing. Documentation testing involves checking whether the documentation accurately describes what the product does. User journey testing involves looking for unusual things while interacting with the system; it’s like the security guard who notices anything out of the ordinary while on duty.
Similarly, the bottom right-hand corner could be filled with the following testing techniques:
Fuzzing: Fuzzing is checking how your application handles completely random input. Dan described fuzzing as the ultimate black-box testing. It can be great for finding implementation faults or security loopholes.
Property-Based Testing: In this case, the developer/tester runs tests with a variety of input combinations to find arguments that cause a test to fail. Once a failure is found, they look for simpler versions of the input data that cause the same issue.
Resilience testing: Dan gave the examples of Netflix’s Chaos Monkey and Chaos Gorilla to explain this type of testing. Chaos Monkey tests how your application responds when one service or a data center abruptly stops working. Chaos Gorilla, on the other hand, involves understanding what happens to the application when numerous services or databases encounter downtime simultaneously. Evidently, to ensure the application works as expected during a Chaos Monkey or Chaos Gorilla exercise, development and operations need to be tightly coordinated.
A/B testing: A/B testing can come in handy to determine which of two product designs performs better when rolled out in production.
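To make the fuzzing idea concrete, here is a minimal sketch in Python. The `parse_age` function and its bounds are hypothetical stand-ins for whatever input-handling code you want to exercise; the point is simply that the code under test must survive arbitrary byte strings without crashing.

```python
import random

def parse_age(raw):
    """Hypothetical code under test: returns an age, or None for bad input."""
    try:
        value = int(raw.decode("utf-8").strip())
    except (UnicodeDecodeError, ValueError):
        return None
    return value if 0 <= value <= 150 else None

def fuzz(target, runs=10_000, seed=0):
    """Throw random byte strings at the target; any uncaught exception is a finding."""
    rng = random.Random(seed)
    for _ in range(runs):
        payload = bytes(rng.randrange(256) for _ in range(rng.randrange(0, 32)))
        target(payload)  # must never raise, whatever the input

fuzz(parse_age)
print("no crashes in 10,000 random inputs")
```

Real fuzzers (AFL, libFuzzer, and the like) are far smarter about mutating inputs and tracking coverage, but the contract is the same: random input in, no crash out.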
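As a rough illustration of property-based testing (libraries such as Hypothesis or QuickCheck do this far more thoroughly), the sketch below generates random inputs against a deliberately buggy `dedupe` function, then greedily shrinks a failing input to a simpler one that still fails. All names here are invented for the example.

```python
import random

def dedupe(items):
    # Deliberately buggy implementation: using set() loses the original order.
    return list(set(items))

def holds(items):
    # Property: dedupe keeps each value's first occurrence, in order.
    expected = []
    for x in items:
        if x not in expected:
            expected.append(x)
    return dedupe(items) == expected

def find_counterexample(trials=1000, seed=1):
    # Generate random inputs until one falsifies the property.
    rng = random.Random(seed)
    for _ in range(trials):
        case = [rng.randrange(10) for _ in range(rng.randrange(0, 8))]
        if not holds(case):
            return case
    return None

def shrink(case):
    # Greedily drop one element at a time while the property still fails.
    changed = True
    while changed:
        changed = False
        for i in range(len(case)):
            smaller = case[:i] + case[i + 1:]
            if not holds(smaller):
                case, changed = smaller, True
                break
    return case

failing = find_counterexample()
print("failing input:", failing)
print("shrunk to:", shrink(failing))
```

The shrinking step is what makes the technique practical: instead of staring at a long random list, you get a near-minimal input that still triggers the bug.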
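Deciding which variant "performs better" ultimately comes down to a statistical comparison. As a sketch under assumed, made-up numbers, the snippet below computes a two-proportion z-statistic for hypothetical conversion counts from variants A and B; real experiments would also plan sample sizes up front and guard against peeking at results early.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for H0: both variants convert at the same underlying rate."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical 50/50 traffic split: 48/1000 conversions on A, 73/1000 on B.
z = two_proportion_z(48, 1000, 73, 1000)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests significance at the 5% level
```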
The complete matrix looks something like the image below.
Risk Management in Testing
In the next part of the session, Dan spoke about how to manage app testing efforts. Having high test coverage is not always enough. Risk should be measured in terms of the likelihood and impact of failure. So, if we plot risk on a plane, it would look something like the following.
Mapping software testing activities based on the likelihood and impact of a feature's failure can come in handy for prioritization. For instance, in the above figure, with “A” having the highest impact and the highest likelihood of failure, we might want to achieve higher test coverage (probably more than 80%) for “A”. On the flip side, in the case of “F”, which has lower impact and lower likelihood, 80% coverage might not be needed.
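One simple way to operationalize this prioritization is to score each feature and rank by a combined risk score. The sketch below uses hypothetical 1–5 scores, and "risk = likelihood × impact" is just one common convention, not something Dan prescribed:

```python
# Hypothetical feature risk map: likelihood and impact scored 1-5.
features = {
    "A": {"likelihood": 5, "impact": 5},
    "C": {"likelihood": 4, "impact": 2},
    "F": {"likelihood": 1, "impact": 1},
    "G": {"likelihood": 2, "impact": 5},
}

def risk_score(feature):
    # One common convention: risk = likelihood x impact.
    return feature["likelihood"] * feature["impact"]

# Spend test effort on the riskiest features first.
ranked = sorted(features, key=lambda name: risk_score(features[name]), reverse=True)
print(ranked)  # -> ['A', 'G', 'C', 'F']
```

The ranking gives a defensible answer to "where should the next unit of testing effort go?" rather than aiming for uniform coverage everywhere.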
However, mapping likelihood and impact into risk is incomplete without taking context into consideration. As a result, as seen below, the third axis of the risk plane is context.
Dan explained how context can be linked back directly to different stakeholders in an organization. Context can enable testers to understand whose, or which team’s, risk is being assessed and measured. For example, people in a regulatory testing team might be more concerned with “C” and “G” working well because those affect them, while an operations team would want to focus on “A” since it affects the day-to-day operation of the product. The risk profiles of an operations team and a regulatory team would look something like the plane below.
This analysis, when done for the entire team, can help create a risk profile for each stakeholder, which looks something like the following plane.
In conclusion, testing in an agile world is multi-dimensional. It involves studying the application in depth, identifying the various stakeholders, understanding their risk appetite and concerns, and then exploring the risk plane developed with each stakeholder.
Dan North's complete slide deck: https://speakerdeck.com/tastapod/deliberate-testing
Automated Testing is Not Agile Testing - http://blog.smartbear.com/test-automation/automated-testing-is-not-agile-testing/
Top 5 Common Challenges for Agile Testing Teams - http://blog.smartbear.com/sqc/top-5-common-challenges-for-agile-testing-teams/
Testing in an Agile Environment - http://blog.smartbear.com/agile-testing/testing-in-an-agile-environment/