3 of the Biggest Mistakes Software Testers Make When Reusing Tests
I was working at a financial services software company in the early days of User Interface automation frameworks.
We decided that automating the GUI was the way to go, and immersed ourselves in developing the framework and our own tests.
The goal of automating our tests was to speed us up, while still providing broad coverage once the system was in place. We would then be able to perform repetitive or slowly changing operations, like building a very complex document, to see if the application had memory leaks or unexpected crashes. We wanted to be able to repeat the tests and have each run be identical.
We learned a lot about what you should do and — most importantly — what you shouldn’t do when reusing tests.
Reusability can be one of your biggest time savers in testing, whether you're a manual tester, an automation engineer, or a developer building testing into the development process. But there are also pitfalls you'll need to avoid.
Here are three mistakes to watch out for:
1. Not building a framework
While automation should speed things up, it is not a “set it and forget it” solution.
Imagine that you have ten different GUI test cases and a button ID changes. All the scripts break and need to be updated. If you use a record/playback tool, you might be able to change the text, or you might need to re-record the click operation ten times. Most people don't want ten test scripts, though. They want hundreds.
The right framework, or test infrastructure, can make that burden manageable.
If you are working in the user interface, the first place to look for reusability is something called the PageObject pattern.
The term feels a little technical, but you may have seen this before. Every page on a website is made up of the elements a person needs to accomplish a job: buttons, text fields, drop-down lists, date pickers, radio button sets, and check boxes. The PageObject pattern is a style of documenting those elements in a code file before you work with that page. These files also describe the services, or things a person can do, with those user interface elements.
Imagine you want to type a user name and password, and then hit the submit button to log in. Without PageObject, you would have to start with something like this every time you wanted to touch those fields:
userNameTxt = driver.findElement(By.id("userName"))
passwordTxt = driver.findElement(By.id("password"))
submitBtn = driver.findElement(By.id("submit"))
If one of those elements changes somehow, you get the pleasure of finding and updating that change in every script you have written. If you use the PageObject pattern, you instead end up with a description file that defines each element and service once. Instead of having to define the fields and buttons everywhere, you now just import the login class and type something like logMeIn("justin", password).
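As a rough sketch of such a description file (written here in Python; the class name, method name, and element IDs are hypothetical, and `driver` stands in for whatever automation library you use):

```python
class LoginPage:
    """PageObject: the login page's elements and services, defined once."""

    # Element locators live in one place. If an ID changes, only this
    # file changes -- not every test script. (IDs here are made up.)
    USERNAME_ID = "userName"
    PASSWORD_ID = "password"
    SUBMIT_ID = "submit"

    def __init__(self, driver):
        self.driver = driver

    def logMeIn(self, username, password):
        """The 'service' a person performs on this page: logging in."""
        self.driver.find_element(self.USERNAME_ID).send_keys(username)
        self.driver.find_element(self.PASSWORD_ID).send_keys(password)
        self.driver.find_element(self.SUBMIT_ID).click()
```

A test then imports the class and calls `LoginPage(driver).logMeIn("justin", password)` instead of hunting for fields itself.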
Setting up the PageObject pattern is a good bit of work to start; you can't jump right into building checks. Instead, you model the operations of a page in an object-oriented way: editProfile(firstname, lastname, email), changePassword(oldPwd, newPwd), addTag(tag), removeTag(tag), search(searchTerm, expectedResults), and so on.
It takes time and forethought to get there. That might be a good thing.
Some tools will get you pretty close without having to be elbow deep in code. Automated testing tools like TestComplete provide record and playback solutions that allow you to build an object or service library that you can pass variables into. That means you don't have to re-record, or change an ID in every test, every time your designer wants that radio box just a few more pixels to the right.
It can be hard to visualize a webpage when working at the service level. Sometimes that's because there is no page to visualize. Instead of the PageObject pattern, here I like to build a framework around business language. This is sometimes known as a domain-specific language (DSL). A company I was working with had a product for building advertising campaigns on top of a REST API. A lot of the testing I wanted to do depended on having a campaign in place, so I knew that was a source of potentially repeated code.
To make that move a little faster, we created createImageGallery(), createVideo(), and createImage(). You'll want to make sure it's worth the time before building out that infrastructure.
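A minimal sketch of that idea (the paths, payload fields, and class names below are assumptions for illustration, not the real API) wraps the repeated REST calls in business-language helpers, so tests read like the domain rather than like HTTP:

```python
class CampaignDSL:
    """Business-language helpers over a raw REST client.

    `client` is anything with a post(path, payload) method -- in practice
    a thin wrapper over your HTTP library. Paths and fields are made up.
    """

    def __init__(self, client):
        self.client = client

    def createImage(self, name, url):
        return self.client.post("/campaign/assets",
                                {"type": "image", "name": name, "url": url})

    def createVideo(self, name, url):
        return self.client.post("/campaign/assets",
                                {"type": "video", "name": name, "url": url})

    def createImageGallery(self, name, image_ids):
        return self.client.post("/campaign/galleries",
                                {"name": name, "images": image_ids})
```

A test that needs a campaign in place can now call `dsl.createImageGallery("spring-sale", ids)` in one line, and if the API changes, only the DSL file changes.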
2. Asking the wrong questions
When I started dabbling in automation, no one ever asked the question of when to stop.
The goal from management was to completely automate our pre-release testing. In hindsight, it's easy to see that was a bad goal. Each day we would add new tests to the battery, and eventually we got to a point where we were spending as much time building infrastructure and maintaining tests as we were building new ones to cover the newest features.
Because tests have a maintenance cost, I like to ask myself "Is it worth it?" every time I add a new test to the nightly run. If that new test helps add coverage to a delicate part of the software, or helps shine light on a code base that is being refactored, then maybe the answer is "yes." One way to balance that cost is to remove an automated check from the overnight run at the same time, so I am replacing, not adding.
On one project, we had a data grid that the customer used to enter lab test values. Each row that was entered had to be saved before navigating away. As a usability update, we added asynchronous save to that data grid, so that a save happened as soon as the field being edited lost focus. Every test that used that data grid had to be updated, and we also had to consider how, and how much, to try to automate the new asynchronous save.
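Updating those tests mostly meant replacing a click on a Save button with a wait for the background save to finish. A generic polling helper handles that kind of asynchronous behavior; this is a sketch, and the timeout values and the `grid.is_row_saved` probe in the usage comment are assumptions:

```python
import time

def wait_until(condition, timeout=5.0, interval=0.1):
    """Poll `condition` until it returns True or `timeout` seconds pass.

    Returns True if the condition was met, False if we timed out.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if condition():
            return True
        time.sleep(interval)
    return False

# Usage sketch: after editing a cell, wait for the async save instead of
# clicking Save (grid.is_row_saved is a hypothetical probe):
#   assert wait_until(lambda: grid.is_row_saved(row_id))
```

A reusable helper like this keeps the "how long do we wait?" decision in one place instead of scattered sleeps across every grid test.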
Sometimes considering who is making the automation matters, too.
3. Ignoring future impact
Writing a one-off script to do something simple can be a fantastic time-saver, provided it really is a throwaway.
Throwaway code doesn't have to be DRY or go through deep code review. The problem comes when we take that code and use it again, or institutionalize it, running it for every release or putting it into our continuous integration (CI) run.
Code that will be reused should be written with the future in mind, including people other than the original author who might care about it, or at least what happens in six months when the original author needs to edit it and can't remember what the code did or why. Reusable code is expensive to write well in the first place, and more expensive if written poorly!
The person creating tools for reuse needs to have a strong understanding of testing: how and where we want to access data, how we want to move through the product, and how to see whether something is a problem or not. They should also have a good foundation in software engineering principles.
Automators with testing ability but no technical skill end up with a jumbled nightmare of a testing tool. Programmers with technical skill but no testing foundation end up with good code that might not solve key testing problems. This skill-set tug of war happens when teams try to create an automation tool that satisfies the needs of both. To make things worse, the tool is typically built by one person working alone, who is unlikely to have both skills.
This tension rears its head every time someone new joins a project.
The best solution I have seen for this is to treat an automation project like any other software development project where we have to deal with the consequences of our decisions.
That means running the project with the right people — developers, a customer, and someone to guide vision and direction. An experienced person should be making the technical decisions, and people that are newer to programming should get bite-sized pieces of the problem set that won't have wide-ranging implications.
To reuse or throw away is a serious question; if the answer is throw away, then whether to automate at all comes into play. The big question is whether there is enough value for that check to run it often enough to be made an ongoing part of the process. But there is also the very practical question of "will this be a pain in the neck next week?"
Become an Effective Tester by Reusing Tests
The way applications are being built is changing. For developers, building cross-platform apps is becoming easier than ever before. With so much emphasis on cross-compatibility in development, QA and testing teams also need to think differently. That's where reusing tests comes in.
In our newest eBook, How to Become an Effective Tester by Reusing Tests, we provide a game plan for reusing tests and introduce you to the tools you'll need to get started.
Get your copy today!