Sunday, September 21, 2008

TESTING: System Testing vs User Acceptance Testing

A new and disturbing trend seems to be catching on in the IT arena in the area of Testing. A couple of years ago I found myself in a horrendous development environment in which the application Development Lead had prepared "test cases" that he used for his System Test and then turned those same test cases over to the User community so that they could do their UAT! What was he expecting, different results? The Product Manager happened to be a political animal, and the head of the department was a putz who believed in throwing excessive resource hours at a project just to get the product to market, supported by a "yes man" in the PMO who managed the project administrators. The latter's claim to fame was knowing MS Project Server; he did not have a clue as to what a good Project Plan was or what it should consist of. So I gave him a detailed template for a Project Plan before I left the place. But I digress...

Let's begin by defining our terms:

A System Test is executed to ensure that the System is performing to specifications stated in the System Design and the Functional Specifications, whereas a User Acceptance Test ensures that the application addresses and satisfies the Business Requirements and the Workflow. There are of course other stages of testing such as Unit Testing and Load Testing.

So here are the artifacts of a System Test:

Screen Navigation –

  • Ensure that the sequence of screens adheres to the workflow.
  • Ensure that each Tab on a primary screen corresponds to a Business Process and each Tab on a secondary screen corresponds to a Business Function. I base this nomenclature on Structured Analysis which denotes a business process to consist of one or more Functions.
  • Ensure that the PF keys perform to standards.
  • Ensure that the correct Error Message is displayed on the correct screen for each error condition.
  • Ensure that each error is processed correctly, as stated in the Functional Specifications.
  • Ensure that data entered on a screen is carried forward, wholly or partially, to the next screen in the sequence if it is needed there, so that the User does not have to re-key the same data.
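
That last check can be automated. Here is a minimal sketch of the carry-forward rule; the `carry_forward` helper and the screen-as-dictionary model are my own illustrative assumptions, not any particular test tool:

```python
# Sketch: verify that data entered on one screen is carried forward to the
# next screen in the sequence, so the User never re-keys the same data.
# Screens are modeled as plain dictionaries for illustration.

def carry_forward(prev_screen: dict, next_screen_fields: set) -> dict:
    """Pre-populate the next screen with any fields it shares with the previous one."""
    return {f: prev_screen[f] for f in next_screen_fields if f in prev_screen}

# Screen 1 captured these entries; screen 2 needs account_no and name.
screen1 = {"account_no": "10045", "name": "J. Smith", "branch": "NY"}
screen2 = carry_forward(screen1, {"account_no", "name", "amount"})

# Shared fields arrive pre-filled; "amount" is left for fresh entry.
assert screen2 == {"account_no": "10045", "name": "J. Smith"}
```

A System Test case would then assert, for each screen pair in the workflow, that the pre-populated values match what the User keyed earlier.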

Screen Design & Content –

  • Ensure that the screens are not cluttered. As a rule of thumb: up to 5 data items = simple screen, 6-10 items = medium complexity, and more than 10 items = complex screen.
  • Ensure that field definitions are adhered to: a field is numeric or alphanumeric as stated in the Functional Specification, and field lengths are consistent across all screens that display that data item. It is also good practice to apply the "auto-skip" feature, especially to pure data-entry screens.
  • Ensure cursor performance complies with industry standards.
  • Ensure that the color scheme is the one chosen by the Users, or that it displays the corporate colors.
  • Ensure that field names are meaningful and consistent on all screens that contain the field.
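
The field-definition and clutter checks above lend themselves to simple automation. The sketch below assumes a hypothetical spec table keyed by field name; the format of a real Functional Specification will of course differ:

```python
# Sketch: check field values against the Functional Specification and
# classify screen complexity by data-item count. SPEC is an illustrative
# stand-in for the real spec document.

SPEC = {
    "account_no": {"type": "numeric", "length": 5},
    "name":       {"type": "alphanumeric", "length": 20},
}

def check_field(name: str, value: str) -> list:
    """Return a list of spec violations for one field (empty list = clean)."""
    spec = SPEC[name]
    errors = []
    if spec["type"] == "numeric" and not value.isdigit():
        errors.append(f"{name}: expected numeric, got {value!r}")
    if len(value) > spec["length"]:
        errors.append(f"{name}: length {len(value)} exceeds {spec['length']}")
    return errors

def complexity(item_count: int) -> str:
    """Rule of thumb: up to 5 items simple, 6-10 medium, more than 10 complex."""
    if item_count <= 5:
        return "simple"
    if item_count <= 10:
        return "medium"
    return "complex"

assert check_field("account_no", "10045") == []   # clean numeric field
assert check_field("account_no", "ABC") != []     # type violation flagged
assert complexity(8) == "medium"
```

Running a check like this over every screen catches length and type inconsistencies that are tedious to spot by eye.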

Exception Processing –

  • The system should manage the processing of exceptions as detailed in the Functional Specification.

Response Time –

  • This is not a substitute for a Load Test. An initial test should be done just by hitting the Enter key to ensure that the response time between screen displays is within acceptable limits. Fine tuning can be done after the Load Test results are obtained.
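
A crude version of that initial check can be scripted. The threshold and the stand-in round-trip function below are illustrative assumptions, not a benchmark:

```python
# Sketch: time one screen-to-screen round trip and flag it if it exceeds
# an acceptable limit. This is a sanity check only, not a Load Test; the
# 2-second threshold is an assumed example, not a standard.

import time

THRESHOLD_SECONDS = 2.0

def screen_round_trip():
    """Stand-in for hitting Enter and waiting for the next screen to display."""
    time.sleep(0.01)

start = time.perf_counter()
screen_round_trip()
elapsed = time.perf_counter() - start

assert elapsed < THRESHOLD_SECONDS, f"response time {elapsed:.2f}s exceeds limit"
```

Fine tuning still waits for the Load Test results, as noted above.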

It should be noted that testing the above-mentioned functions is NOT a User responsibility. As technical professionals, we should hand over to the Users a correctly working "System" that will efficiently support their business application.

I will present User Acceptance Test artifacts tomorrow.