USS QA team saves time with automated functional testing

From left to right: Adam Williams/Faisal, Scott Wilgar, Reed Garside, Dan Thornley, and Clint Bowles

By Emily Rushton

The University Support Services (USS) Quality Assurance (QA) team has a tough job: ensuring that every software application and system USS develops for students, staff, and faculty is customer-ready before release.

“We’re the verifiers,” said Dan Thornley, QA associate director.

This means doing a whole lot of validation and testing.

Testing scenario output example

“In a basic sense, our goal is to replicate the end user experience,” said Thornley. And with the addition of automated functional testing (AFT) to some of QA’s projects, the process is getting more accurate all the time.

“We felt that our best bet was to move as much as we can into an automated arena, because when it’s automated, it’s harder to hit the wrong key, or forget what you did before,” Thornley added.

AFT simplifies much of the testing effort with pre-coded tests written from real end user testing data and results. That data is captured by documenting or recording what users do – what they click on, the steps they follow – while using a new system or application. The recordings are then saved to a library, where they can be used for automated test development.
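In code, the record-and-replay idea described above can be sketched roughly as follows. This is a minimal illustration, not the QA team's actual tooling; the action names, function names, and step format are all invented for the example:

```python
# Sketch of record-and-replay testing: user actions are captured as steps,
# saved to a library, and can later be replayed by an automated test.

recorded_steps = []  # the "library" of captured actions

def record(action, target):
    """Capture one user action, e.g. a click or a keystroke."""
    recorded_steps.append((action, target))

def replay(steps):
    """Replay captured steps; a real framework would drive the UI here."""
    return [f"{action} -> {target}" for action, target in steps]

# Simulate documenting what a user does while testing a new application.
record("click", "Login button")
record("type", "Username field")
record("click", "Submit")

print(replay(recorded_steps))
```

Because the recording preserves the exact sequence of steps, replaying it reproduces the same path through the application every time – which is what makes a captured error so useful to developers.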

Recording is most useful when a user is testing a new system and runs into an error.

“We have the error captured at that point,” said Reed Garside, QA lead for Finance, Faculty, and Research. Not only that, but the recording shows the exact steps taken leading up to the error, which helps developers replicate and then fix it.

In the past, the QA team had to wait for developers to finish writing the code for a new application or system upgrade before being able to conduct their testing – and this added a lot of time to the development cycle.

“The problem is, you can’t start writing an automation until you’ve got something to run it against,” said Thornley.

Now, the team uses a test-driven development approach, working with the USS user interface group to create prototypes of applications before the developers even start writing code. The prototypes look and feel almost exactly like the future end product, but are much easier to create because they require no backend coding. The QA team writes its automation tests against the prototype; when the real application is finally developed, QA simply points those tests at the actual code instead of the prototype, saving a tremendous amount of time.
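The retargeting step works because the tests themselves don't change – only the address they run against does. A minimal sketch of the idea, with invented URLs and a stand-in for the real browser-driving test logic:

```python
# Sketch of "point the tests at the real code": the test stays the same,
# and only the base URL it runs against changes.
# The URLs below are illustrative, not USS's actual environments.

PROTOTYPE_URL = "https://prototype.example.edu/ubenefits"   # clickable mockup, no backend
PRODUCTION_URL = "https://apps.example.edu/ubenefits"       # real application, once built

def check_login_page(base_url):
    """Stand-in for an automated UI test; a real suite would drive a browser here."""
    return f"ran login-page checks against {base_url}"

# While developers build the backend, tests run against the prototype...
print(check_login_page(PROTOTYPE_URL))
# ...and later the same tests are simply pointed at the finished application.
print(check_login_page(PRODUCTION_URL))
```

Keeping the target address as configuration rather than hard-coding it into each test is what lets the bulk of the automation code survive the switch from prototype to product.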

Example of multiple testing scenarios written in Gherkin, the English-like syntax that Cucumber (the QA team's automation framework) uses to run tests
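The article doesn't reproduce the team's actual test library, but a Gherkin scenario for an application like UBenefits could look something like this; the feature name, steps, and page names below are invented for illustration:

```gherkin
Feature: Benefits enrollment

  Scenario: Employee views current benefits
    Given an employee is signed in to UBenefits
    When they open the "My Benefits" page
    Then their current benefit elections are displayed
```

Cucumber maps each Given/When/Then line to a step definition written in code, so plain-English scenarios like this one can drive fully automated tests.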

“Obviously some things will change, but we’ve already got a bulk of the code written, so it minimizes how long it takes us to get it done,” said Thornley.

Finally, once the application has been developed and the tests written, the QA team can run hundreds of tests with a simple click of a button. There are over 1,500 automated testing steps written for the UBenefits application alone.

“Today when I came into work, all I had to do was push ‘play’ on this automation, and it tests the entire code after the developers have made their updates,” said Clint Bowles, QA lead for HR & Auxiliary. “We can know within an hour whether or not something has broken.”

The automation has saved so much time that Thornley estimates it's the equivalent of multiple full-time employees.

“If we had to do this all manually, we’d have to have two to three more people here just doing testing,” said Thornley.

The team hopes to expand AFT into more projects and groups within USS, but acknowledges it’ll take some time to get to that point.

“Our goal is to become as familiar as possible with what the end user wants,” said Thornley. “And then do all we can to help validate that it’s being delivered.”

Last Updated: 8/30/17