How Praxis Engineering Delivers Speed and Quality in a World of Complexity With Zephyr Enterprise

Maryland-based Praxis Engineering Technologies operates on a global scale. With a mission that extends well beyond typical enterprise software, the wholly owned subsidiary of General Dynamics specializes in compliance, cybersecurity, and mission-oriented solutions that support defense and intelligence initiatives.

With a client base that includes international governments, Praxis does more than develop software – they deliver turnkey hardware and software solutions that safeguard national security while ensuring travelers can enter and exit countries around the globe safely, securely, and efficiently.

A Matter of National Security

It’s easy to understand why Praxis takes its QA so seriously. Their applications and hardware solutions affect hundreds of thousands, if not millions, of people daily. It’s a reality that Carolyn Jones, PMP and Testing Lead at Praxis, is accustomed to. Jones, a 14-year Praxis veteran, is responsible for quality assurance and testing on the complex Praxis Border Control program.

Downplaying the scope of the project she has been a part of since 2017, Jones explains the nature of the system:

“It’s a web application, deployed worldwide, with four major components: a component for traveler processing that scans your passport and checks you into a country, a screening-list management system that keeps track of watchlists for persons of interest, and reporting and system administration modules.”

Small Team, Big Responsibilities

“We were charged with the testing – all of it. We’re small but mighty and we support some 25 or so developers.”

With nearly three years on the program, Jones recalls the challenges she and her six-person test team faced from the outset. The pressure was on them to succeed. “We were brought in after two attempts at automation had failed. We had carte blanche – charged with innovating any way we could.”

Multiple Sites, Multiple Versions, Infinite Configurations

“It’s a highly configurable application, so we’ve got all sorts of different configurations out in the field. It’s not always easy to know when someone has made a configuration change and if that change is what’s causing the current issue that we’re looking at.”

One of the key pain points Jones and the team faced was the many variations in the product – both in development and on client sites – which she breaks down into three primary branches:

“In the field, we have multiple versions of the legacy application developed in Swing. We’re working on developing our next generation of that application, and we’re also working on developing a mobile version we can deploy to a cell phone or a tablet. When we test, we have to remember the differences between all those versions.”

As a result of that complexity, Jones says releases were limited to “one or two releases per year, max” and regression testing took “six very painful weeks when no other testing could get done.”

Comfortably Numb

Jones points out that, early on, the issue wasn’t an absence of automated tests, but one of unmaintained, poorly documented tests. “On my first day on the program, I looked at the overnight test run, and out of 300 tests, something like 110 had failed overnight and had been failing for months.” As a result, after an automation run, the team would run the same tests again manually, both duplicating work and causing further delays in bringing the product into production.

“Nobody trusted the tests. People were numb to the failures for a variety of reasons. Sure, we had flaky tests, but we had actual failures too – bugs in the code that the tests were catching and nobody noticed because they were so buried by everything else.”

An Organizational Nightmare

When it came to test automation, the lack of a central, organized test repository played a large role in the failure of the previous initiatives. In the beginning, Jones admits, “We had a ton of Microsoft Word documents and Excel spreadsheets. We were going through this massive six-week testing cycle where our progress in TestComplete was actually tracked manually in spreadsheets.”

In addition to that, there was no execution history. “I couldn’t point to a particular test case and say, this has been executed half a dozen times and it’s failed twice.” Moreover, “We had no way to know if a test case had been updated to reflect a change in the application, or a change in functionality. None of those things were available to me.”

Her team had a lot of ground to cover. “We needed a reliable set of automated tests to help us with smoke and regression testing so that we could focus on the more functional aspects that couldn’t be automated.”

More importantly, they needed a way to manage their efforts and make every test count.

Making the Case for Zephyr Enterprise

When the search for a solution began, Jones knew of Zephyr Enterprise but had never used the product. “I heard about Zephyr for the first time in a test automation class that I was taking,” she says. “I knew that we needed something better, so I got a trial version and played with it.”

To make her case for acquiring the tool, Jones used her evaluation copy to “set up Zephyr to track an actual release to see how it would really work.” She was impressed with its flexibility and pleased with the outcome. “I took the results to management, brought them up on screen and said, 'This is the kind of thing that I can provide to you with this tool.'”

Jones shares one other reason she had for choosing the SmartBear tool over the competition. “I looked at another tool, but they had a poor market presence. In comparison, the robust online communities SmartBear has are amazing resources.”

The Zephyr Difference

With Zephyr Enterprise endorsed by management and the system in place, Jones started incorporating Agile best practices and tools she was familiar with, like Selenium WebDriver, the page object model, and Cucumber (another SmartBear product).
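
To make the pattern concrete, here is a minimal page-object sketch using Selenium WebDriver in Java. Everything in it – the class name, locators, methods, and IDs – is hypothetical, invented for illustration rather than taken from the Praxis codebase.

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

import java.time.Duration;

// Hypothetical page object for a traveler-processing screen.
// The idea behind the pattern: locators and interactions for one screen
// live in one class, so a UI change means editing one page object
// instead of every test that touches that screen.
public class TravelerProcessingPage {
    private final WebDriver driver;
    private final WebDriverWait wait;

    // Illustrative locators; real IDs would come from the application.
    private final By passportField = By.id("passport-number");
    private final By submitButton = By.id("submit-scan");
    private final By statusBanner = By.cssSelector(".entry-status");

    public TravelerProcessingPage(WebDriver driver) {
        this.driver = driver;
        this.wait = new WebDriverWait(driver, Duration.ofSeconds(10));
    }

    // Enter a passport number and submit the scan.
    public TravelerProcessingPage scanPassport(String passportNumber) {
        wait.until(ExpectedConditions.visibilityOfElementLocated(passportField))
            .sendKeys(passportNumber);
        driver.findElement(submitButton).click();
        return this;
    }

    // Read the entry status shown after processing.
    public String entryStatus() {
        return wait.until(ExpectedConditions.visibilityOfElementLocated(statusBanner))
                   .getText();
    }
}
```

Tests written against a class like this read as intent (“scan a passport, check the status”) rather than DOM plumbing, which is a large part of what makes them fast to build and cheap to maintain.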

“All these good things now allow us to build tests at a very rapid pace,” she explains. “In addition to the first 200 tests we created and refactored, we now have 442 tests that run every night, and cover the major parts of our application.”
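
One plausible shape for that rapid test-building is Cucumber glue code sitting on top of a page object like the one sketched above. The step definitions below are again hypothetical – the step wording, URL, and class names are invented for this example, not drawn from the Praxis suite.

```java
import io.cucumber.java.en.Given;
import io.cucumber.java.en.When;
import io.cucumber.java.en.Then;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

import static org.junit.jupiter.api.Assertions.assertEquals;

// Hypothetical Cucumber step definitions reusing the page object above.
// Each Gherkin step maps to one short method, so new scenarios are
// largely recombinations of steps that already exist.
public class TravelerProcessingSteps {
    private WebDriver driver;
    private TravelerProcessingPage page;

    @Given("the traveler processing screen is open")
    public void openScreen() {
        driver = new ChromeDriver();
        driver.get("https://example.test/traveler-processing"); // placeholder URL
        page = new TravelerProcessingPage(driver);
    }

    @When("the officer scans passport {string}")
    public void scanPassport(String passportNumber) {
        page.scanPassport(passportNumber);
    }

    @Then("the entry status shows {string}")
    public void verifyStatus(String expected) {
        assertEquals(expected, page.entryStatus());
        driver.quit(); // in a real suite, teardown would live in an @After hook
    }
}
```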

Smaller Releases, Higher Quality Software

The improvements in the process were significant. Emphasizing the difference, Jones says, “From a system-test standpoint, we’ve pared down the releases. We now do a release every month, and it takes a week to test it.” It is a marked advancement over the old once- or twice-yearly release schedule, in which each release required six weeks of intensive, primarily manual, testing.

Pleased with the progress, Jones says, “Our application is less buggy. We’re also relying on our automated tests. The tests run exactly the same way every time, and we get fast feedback.”

Sealing the Cracks – Time Better Spent

With everything neatly tied together using Zephyr Enterprise, the team at Praxis is free to pursue more intensive manual testing where it’s needed most – like in the UI where Jones feels that “automated testing is most brittle.”

“Solid automation means that my team can focus on things like exploratory testing, knowing that the barebones, basic stuff is covered. It allows my functional testers to get creative – to find some of those edge cases that otherwise fall through the cracks.”

Improving the Customer Experience

Jones and her team aren’t the only ones benefiting from the improved, Zephyr-enabled processes. Praxis is passing the gains forward to its clients. When an issue is discovered in the field, “Our customers don’t have to wait for a backported fix. Since we’re doing a release every month, we can easily say, ‘Give us a few weeks, you’ll have this fix in the most up-to-date version of the software.’”

It is a win-win proposition. The customer is happy with the rapid response, and Praxis maintains consistent versions of the software they run on customer sites.

Benefits and Results

“We’re seeing a whole lot of benefits out of this. In particular, we can track test execution progress in a release cycle and say, at a glance, ‘OK. Just one or two more things to do’.”

Among other things, she notes:

  • Reduced Regression-Test Time, More Frequent Releases – Software releases have gone from once or twice a year to monthly, and regression testing at Praxis has fallen from six weeks to one week, with improved test coverage.
  • Trusted Test Automation – The team turned a broken library of failing tests into a robust, trusted repository of 442 tests that cover the majority of the product.
  • Increased Test Coverage – Jones makes better use of her testers’ time, shifting them from laborious manual checks for obvious bugs to exploratory testing that captures edge cases.
  • Centralized Test Documentation – The QA team shifted from a disparate collection of Excel spreadsheets and Word documents to a shared, central document repository.

Last Word

Asked to provide a highlight of her experience with SmartBear and Zephyr Enterprise, Jones doesn’t hesitate to pinpoint the customer support she has received:

“I feel like SmartBear responds to me more as a partner than as a customer. I feel like you’re on my team trying to help me get to where I need to be, rather than trying to tell me how I should mold my process to fit the tool.”
