Goal Testing
The other aspect of performance testing is verifying that code executes within any specified SLA, both as a single statement execution and when run in concert with multiple users. When committing to an SLA, the management team should commit only to those sections of the system it controls.
A good SLA reads, "Transaction X will complete within Y milliseconds (or seconds) at the database level with Z simultaneous users." A poorly worded SLA reads, "Screen X will complete within Y seconds." Why is the second example a poor SLA? If the user viewing screen X is sitting next to the server, chances are good the SLA can be met. If, on the other hand, the user is in Bangladesh while the server is in the U.S., all bets are off once multiple network latencies come into the picture.
An example test harness for manual goal testing is shown in Listing 2.
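Listing 2 is not reproduced here, but the following is a minimal sketch of what such a harness might look like, assuming an Oracle PL/SQL environment run from SQL*Plus. The 500 ms SLA, the iteration count, and the query against USER_OBJECTS standing in for "transaction X" are all placeholder assumptions; replace the query with the actual statement or procedure under test.

SET SERVEROUTPUT ON

DECLARE
  l_start      NUMBER;
  l_avg_ms     NUMBER;
  l_dummy      PLS_INTEGER;
  l_sla_ms     CONSTANT NUMBER      := 500;  -- assumed SLA: 500 ms per execution
  l_iterations CONSTANT PLS_INTEGER := 100;  -- number of timed executions
BEGIN
  l_start := DBMS_UTILITY.GET_TIME;          -- elapsed time in hundredths of a second
  FOR i IN 1 .. l_iterations LOOP
    -- Stand-in for "transaction X"; replace with the statement or procedure under test
    SELECT COUNT(*) INTO l_dummy FROM user_objects;
  END LOOP;
  l_avg_ms := (DBMS_UTILITY.GET_TIME - l_start) * 10 / l_iterations;

  DBMS_OUTPUT.PUT_LINE('Average elapsed time: ' || ROUND(l_avg_ms, 2) || ' ms');
  DBMS_OUTPUT.PUT_LINE('SLA of ' || l_sla_ms || ' ms is ' ||
                       CASE WHEN l_avg_ms <= l_sla_ms THEN 'met' ELSE 'missed' END);
END;
/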
As with the scalability test, the manual test needs to be run simultaneously by multiple users to fully test the SLA. Goal test results are usually reported as the transactions per minute (TPM) or transactions per second (TPS) required to reach a specific number of seconds, or as the number of users required to reach a pre-specified TPS or TPM.
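As a worked example with hypothetical figures: if 10 concurrent sessions each complete 600 transactions during a 120-second test window, the reported throughput works out to 50 TPS, or 3,000 TPM.

-- Hypothetical figures: 10 sessions, 600 transactions each, 120-second window
SELECT (10 * 600) / 120      AS tps,  -- 50 transactions per second
       (10 * 600) / 120 * 60 AS tpm   -- 3,000 transactions per minute
  FROM dual;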
Automated Testing for Scalability and Goal Testing
Several companies provide utilities with automated testing capabilities; for example, Load Runner from HP (formerly from Mercury) provides a rich testing environment. However, it may be overkill for the single developer wishing to perform scalability and goal testing on a single statement, procedure, or functional component. Toad for Developers from Quest Software provides a test environment that is friendlier to the single developer.