Frontiers in Virtualization for Quality Assurance
  April 25, 2011

QA led the charge to virtualization over the last decade, often teaching other departments about the advantages and pitfalls. With QA now facing the toughest problems it’s had in 30 years, it’s time to wring even more benefits from virtualization. The same goes for your career: Build on what you already know about virtualization to be a leader in the cloud.

Quality Assurance was the first domain where virtualization (as currently understood) moved from hype to reality. Testing laboratories began to manage desktop environments. Instead of inventorying them as physical assets — that machine over there is set up with WinNT SP3, and that’s a Vista system, and so on — desktop environments became software images that could be replicated, compared, archived, restored, calibrated, and adjusted. Roughly a decade ago, QA began to boost its productivity through virtualization, significantly before Development, Operations, or other departments learned to do the same.

While that first wave of virtualization is effectively complete, there’s still a lot to learn. These new ideas in virtualization are just in time, too, because QA now faces the biggest challenges it’s had in three decades. Back then, the main problems were poorly standardized operating systems and printers. Now the biggest difficulties — the ones likely to determine your career success — have to do with mobile computing, parallelism, and globalization.

What’s the problem?

Do mobility, parallelism, and globalization truly present anything you haven’t already conquered to some degree? Yes and no: if you’re not prepared for it, for instance, the extent to which mobile computing is a fashion industry will surprise you. Your end-users will likely be holding handsets bought as disposable retail devices.

Vendors’ executives often have backgrounds in marketing, and sometimes in hardware, but they rarely have software knowledge as we understand the term. While programs to certify against standards such as “HTML5” or “Android 2.3” sound like solutions to quality issues, the reality of the mobile market is that even very similar devices, including ones that share a model number, can differ enough in their firmware to give diverging results at the application level. Tracking all the variations, even for the market leaders in mobility, ranges from “tough” to “impossible.” At the same time, consumers increasingly expect “always-on” applications, ones where it’s not good enough for the tech support department to advise, “Just reboot.” The demands on QA rise correspondingly: you must find errors before your software makes it to market.

One essential in management of this complexity is to leverage virtualization well. Daniel Dern recently wrote a marvelous article, Testing Apps For SmartPhones and Mobile Devices (Without Buying Out the Store), with seven specific tips on managing testing targeted at handsets. Not only did he explain how to use simulators and emulators, but he correctly identified the next big evolution of virtualization: the cloud. What Dern labels the “Online Mobile Device Petting Zoo” is remote on-demand access to handsets managed as a utility service.

The real problems the cloud really addresses

There has certainly been an excess of “cloud” rhetoric, much of it designed to persuade purchasers who don’t understand what they’re buying that it’s somehow soft and fluffy.

For QA, though, the advantages of cloud computing — on-demand, commoditized computational service — are very real. Rather than buy a particular handset that you only need for testing eight hours out of the year, rent it. Rather than buy enough servers to model every load profile your manager asks you to try, spin up virtual machines at a cloud provider for the day or two it takes to run that one experiment. Instead of hiring one administrator to tend to all the inmates in your “browser barn” (“was that Firefox 3.5 or 3.5.1, for Mac OS Panther?”), buy a few pennies’ worth of time with professionally maintained installations of scores of different browsers.
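To make the rental idea concrete, here is a minimal Python sketch that treats the “browser barn” as data rather than as machines you own: each browser/platform pair becomes a Selenium-style “desired capabilities” request you could hand to a hosted grid. The browser and platform strings below are illustrative, not any particular provider’s catalog:

```python
# Sketch: enumerate the browser/OS combinations you would otherwise
# install and maintain by hand, as capability requests for a hosted
# Selenium grid. The names and versions below are illustrative only.
from itertools import product

BROWSERS = [("firefox", "3.5"), ("firefox", "3.5.1"), ("iexplore", "8")]
PLATFORMS = ["Windows XP", "Mac OS 10.3"]  # "Panther", from the example above

def capability_matrix(browsers=BROWSERS, platforms=PLATFORMS):
    """Return one desired-capabilities dict per browser/platform pair."""
    return [
        {"browserName": name, "version": version, "platform": platform}
        for (name, version), platform in product(browsers, platforms)
    ]

if __name__ == "__main__":
    for caps in capability_matrix():
        # In a real run, each dict would go to a Remote WebDriver pointed
        # at the provider's hub; here we just show the matrix itself.
        print(caps)
```

Adding a browser or platform is then a one-line change to a list, not a new machine to rack and patch.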

Exploit virtualization to shake out most problems before they reach the level of a specific end-user client. Many of the problems your department faces are not unique; they’ve been tackled before and, to a growing extent, someone is now in a position to sell at least a partial solution as a managed service.

Notice how well this plays into the episodic character of QA work. Between releases, you plan, learn, and prepare. Then, when software is ready for testing and “thrown over the wall,” you leverage the cloud’s capabilities to test all combinations quickly, thoroughly, and reliably. As the cloud matures, providers like Saucelabs emerge. Not only does Saucelabs automate and host remote Selenium-based testing, it also adds conveniences and refinements to Selenium of the sort you’d build yourself, if only you had the time. You concentrate on the scripting that’s specific to your organization, and let the specialists add value in their specialty.
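A minimal Python sketch of what that organization-specific scripting looks like against a hosted Selenium service. The credentials are placeholders, and the hub-URL layout and capability keys follow the provider conventions of the era, so check current documentation before relying on them:

```python
# Sketch of driving a hosted Selenium service such as Saucelabs'
# OnDemand offering. SAUCE_USER and SAUCE_KEY are placeholders for
# your own account credentials; the URL layout is an assumption
# based on the provider's conventions of the time.

def hub_url(user, access_key, host="ondemand.saucelabs.com", port=80):
    """Build a remote WebDriver hub URL for a hosted Selenium grid."""
    return "http://%s:%s@%s:%d/wd/hub" % (user, access_key, host, port)

CAPS = {
    "browserName": "firefox",
    "version": "3.5",
    "platform": "Windows XP",
    "name": "smoke test",  # labels the job in the provider's dashboard
}

def run_smoke_test(url="http://example.com/"):
    """Run one hosted-browser check. Needs the selenium package, a live
    account, and network access, so it is defined but not called here."""
    from selenium import webdriver
    driver = webdriver.Remote(
        command_executor=hub_url("SAUCE_USER", "SAUCE_KEY"),
        desired_capabilities=CAPS,
    )
    try:
        driver.get(url)
        assert driver.title  # your organization-specific checks go here
    finally:
        driver.quit()
```

The only parts you maintain are the capabilities and the checks inside `run_smoke_test`; the browsers, operating systems, and Selenium servers are the provider’s problem.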

However much you have improved your operations with virtualization to this point, it’s nearly certain that at least that much more progress is available to you right now. Typical modern applications have at least two, and often three, tiers, each of which deserves testing in all sorts of modes: black-box, load, white-box, and so on. Every separate aspect is an opportunity to virtualize and to better manage your testing.
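The combinatorics behind that claim are easy to make concrete. A short sketch, with illustrative numbers (the five-environment figure is an assumption, not a measurement):

```python
# Sketch: why "every separate aspect is an opportunity to virtualize".
# Even modest, illustrative numbers multiply quickly.
TIERS = ["client", "application server", "database"]
MODES = ["black-box", "white-box", "load"]
ENVIRONMENTS = 5  # e.g. five browser/OS images; an assumed figure

def test_runs(tiers=TIERS, modes=MODES, environments=ENVIRONMENTS):
    """Count the (tier, mode, environment) combinations to cover."""
    return len(tiers) * len(modes) * environments

print(test_runs())  # 3 tiers x 3 modes x 5 environments = 45 runs
```

Forty-five distinct runs is trivial to schedule against rented virtual machines, and painful to schedule against a fixed rack of physical ones.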

It’s not all going to be easy. Cloud providers will occasionally have outages, and some will doubtless even go out of business. You won’t control individual resources the way you did when QA had its own budget for purchasing machines. On balance, though, a move to the cloud is the right one. It’s the only way you can hope to keep up with the problems your organization expects you to tackle. It’s also the only way to keep ahead of the range of domains you need to master to test modern products adequately.