How to Test an IRT: A Primer for Users, Part II

IRT Essentials

You’ve worked through the user requirements for your clinical project and how it will be implemented by your Interactive Response Technology (IRT/IWR/IVR) vendor. Your vendor has built the system and completed their verification testing to confirm it meets user specifications (see How to Test an IRT, Part I).  Presumably your team has also had a chance to join the vendor for an informal walk-through of the system to confirm, in global terms, that it correctly supports your project.

Now it’s time for you to complete your User Acceptance Testing (UAT) to validate the project IRT. Validation is performed by the user to formally confirm that the system as developed meets the detailed needs of the study.

To ensure the best quality results, user testing should be as independent of vendor testing as possible. Robust testing dictates that users do not simply repeat the test steps and test-subject selections the vendor employed. The optimal goal is new tests and new challenges to the system from the user's 'real use' perspective.

This is where common difficulties may arise for newcomers to the concepts of UAT. What do you test for? How? How does one go about creating test cases and/or scripts? Should you include both negative and positive testing, or just the latter?  Can you use the vendor’s verification documents or must you create testing materials of your own?

Vendor vs. User Test Cases

Vendor test scripts/cases can be leveraged as informative tools that enable you to see what has and hasn’t been tested during the vendor’s verification testing. These should be reviewed by the UAT team prior to finalizing their own test plans and test cases. Again, you do NOT have to repeat the vendor’s tests. You use their scripts to help determine that each of the project’s User Functions and User Rules (defined below) have been tested to your satisfaction.

During their verification testing, vendors will typically rely on their already validated CORE (base) functionality to determine what does and doesn’t need to be re-tested by them at the level of each project. These omissions are a good general indication of what users should test during UAT. The goal of project-level UAT is to ensure that the final configuration of the system, as applied by the vendor, is appropriate for the project.

Here’s a way to think about UAT:

The project specifications include two kinds of requirements, functions that users are able to perform (i.e., User Functions, such as Screening a Subject or Receiving a Shipment) and rules for the system to follow (i.e., User Rules, such as checking that a new subject’s age is within a specified range for the study and issuing a warning if it is not).

In a system with a pre-validated CORE technology (which includes most mainstream IRTs), the vendor has already confirmed that whatever configuration data is entered into the system during project setup will work appropriately. A good example of this is testing the age range of new subjects when they’re added to the system. Perhaps the vendor didn’t explicitly test this rule in their verification testing because they’ve built a system where age range is configurable, and during their extensive testing of their CORE system they’ve confirmed that the configuration item works regardless of the minimum and maximum ages specified. Therefore, testing for age restrictions may not be included in the vendor’s verification testing. What most likely will be part of the vendor’s verification testing is a test that confirms the configuration for the current project’s age range has been set correctly, for example a range of 18-65.

This common restriction on verification testing means you’ll often want to test a User Function using different steps than the vendor to confirm the system responds with the appropriate error messages and allows you to complete – or prevents you from completing, when appropriate – a task successfully, with the correct information ultimately stored in the system.

Let’s drill down a little further:

The User Rule for this example is, "Subjects must be between 18 and 65 years of age." The steps to test this rule are:

  • Add a new subject
  • Out of Range Test 1: Enter a birthday so that tomorrow is the subject’s 18th birthday
    • Expected Result - a warning is generated by the system stating that the subject’s age is outside the expected range
  • Out of Range Test 2: Enter a birthday so that today is the subject’s 66th birthday
    • Expected Result - a warning is generated by the system stating that the subject’s age is outside the expected range
  • In Range Test 1: Enter a birthday so that today is the subject’s 18th birthday
    • Expected Result – subject is added to the system
  • In Range Test 2: Enter a birthday so that tomorrow is the subject’s 66th birthday
    • Expected Result – subject is added to the system
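The boundary logic behind these four test cases can be sketched in code. This is a minimal illustration, not an actual IRT implementation: the `MIN_AGE`/`MAX_AGE` configuration values, the helper functions, and the reference date are all hypothetical, chosen only to show why birthdays one day either side of the limits fall in or out of range.

```python
from datetime import date

# Hypothetical project configuration (in a real IRT these are configurable items).
MIN_AGE = 18
MAX_AGE = 65

def age_on(dob: date, today: date) -> int:
    """Compute age in whole years as of `today`."""
    years = today.year - dob.year
    # Subtract one year if the birthday hasn't occurred yet this year.
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1
    return years

def is_in_range(dob: date, today: date) -> bool:
    """User Rule: subject's age must be between MIN_AGE and MAX_AGE inclusive."""
    return MIN_AGE <= age_on(dob, today) <= MAX_AGE

today = date(2024, 6, 15)  # arbitrary reference date for the example

# Out of Range Test 1: tomorrow is the 18th birthday -> age is 17 today.
assert not is_in_range(date(2006, 6, 16), today)
# Out of Range Test 2: today is the 66th birthday -> age is 66.
assert not is_in_range(date(1958, 6, 15), today)
# In Range Test 1: today is the 18th birthday -> age is 18.
assert is_in_range(date(2006, 6, 15), today)
# In Range Test 2: tomorrow is the 66th birthday -> age is still 65 today.
assert is_in_range(date(1958, 6, 16), today)
```

Note how each test sits exactly one day to either side of a boundary; this is the classic boundary-value pattern UAT scripts should follow.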

By reviewing the vendor’s test cases, you’ll be able to see how comprehensively they’ve tested or not tested for issues relating to dates and age. This knowledge will help you determine whether you want to conduct additional tests. For instance, perhaps you want to test for proper handling of invalid dates (such as 32-Jan-2000), or a future date (such as 01-Jan-2030), or even so-called garbage text (“&^*^$^@HU”).
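The invalid-date and garbage-text checks above can likewise be sketched. This is a hypothetical helper, assuming a DD-Mon-YYYY entry format; a real IRT would enforce these checks at the data-entry screen, but the expected rejections are the same.

```python
from datetime import datetime

def parse_dob(text: str):
    """Return a date if `text` is a valid, non-future DD-Mon-YYYY date, else None.
    (Hypothetical helper for illustrating negative test cases.)"""
    try:
        dob = datetime.strptime(text, "%d-%b-%Y").date()
    except ValueError:
        return None  # rejects impossible dates and garbage text
    if dob > datetime.now().date():
        return None  # a birthday cannot be in the future
    return dob

assert parse_dob("32-Jan-2000") is None      # invalid date
assert parse_dob("01-Jan-2030") is None      # future date
assert parse_dob("&^*^$^@HU") is None        # garbage text
assert parse_dob("15-Jun-1990") is not None  # valid entry is accepted
```

Each `None` here corresponds to a negative test: the expected result in your UAT script is that the system rejects the entry with an appropriate error message.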

The point here is that while you do not want to repeat the testing that was done by the vendor, you do want to test that all User Functions perform in a way that meets your project expectations.

Stay tuned for our next hot topic on How to Test an IRT!
