Roadmap: Performance Testing Process
An overview of the Performance Testing process
Main Description

Performance testing is a well-understood discipline that has been practiced for more than 30 years. First came time-sharing on mainframe computers, then minicomputers with multiple asynchronous terminals, and later networks of personal computers connected to server systems. Testing all of these systems revolved around the need to understand the capacity of the shared portions of the system.

The process of performance testing has not changed significantly since these earlier system types were being tested. However, the complexity of modern system designs, with more distributed, intelligent hardware components and many more interconnected software subsystems, creates greater challenges in the analysis and tuning parts of the process. On current systems, performance testing should be done iteratively and often throughout system design and implementation. Tests should be performed during implementation of critical subsystems, during their integration into a complete system, and finally under full-capacity workloads before the system is deployed into production.

The basic steps in the performance testing process are listed here. Each step is explained in the sections that follow, in terms of the actions taken and the results produced.

  1. Determine the system performance questions that you need to answer.
  2. Characterize the workload that you want to apply to the system.
  3. Identify the important measurements to make within the applied workload.
  4. Establish success criteria for the measurements to be taken.
  5. Design the modeled workload, including elements of variation.
  6. Build the modeled workload elements, and validate each at the various stages of development (single, looped, parallel, and loaded execution modes).
  7. Construct workload definitions for each of the experiment's load levels so that you can collect your workload measurements.
  8. Run the test and monitor the system activities to make sure that the test is running properly.
  9. Analyze measurement results and perform system-tuning activities as necessary to improve the results, and then repeat test runs as necessary.
  10. Publish the analysis of the measurements to answer the established system performance questions.
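To make the measurement-oriented steps concrete, the following is a minimal sketch in Python of steps 7 through 9: running a modeled workload at several load levels, collecting response-time measurements, and summarizing them for comparison against success criteria. All names here (`sample_transaction`, `run_load_level`, the load levels, and the 50 ms criterion) are hypothetical illustrations, not part of any prescribed tooling; a real test would use a dedicated load-testing tool against the actual system under test.

```python
import concurrent.futures
import statistics
import time

def sample_transaction():
    """Stand-in for one unit of the modeled workload (hypothetical)."""
    time.sleep(0.001)  # simulate ~1 ms of work in the system under test

def run_load_level(transaction, users, requests_per_user):
    """Apply one load level (steps 7-8) and collect per-request latencies (step 3)."""
    def user_session():
        times = []
        for _ in range(requests_per_user):
            start = time.perf_counter()
            transaction()
            times.append(time.perf_counter() - start)
        return times

    latencies = []
    # Each simulated user runs in its own thread (parallel execution mode).
    with concurrent.futures.ThreadPoolExecutor(max_workers=users) as pool:
        futures = [pool.submit(user_session) for _ in range(users)]
        for future in futures:
            latencies.extend(future.result())
    return latencies

def summarize(latencies):
    """Analysis inputs for step 9: mean and 95th-percentile latency."""
    ordered = sorted(latencies)
    p95 = ordered[int(0.95 * (len(ordered) - 1))]
    return {"mean": statistics.mean(ordered), "p95": p95}

if __name__ == "__main__":
    # Step 7: define experiment load levels; step 8: run each one.
    for users in (1, 5, 10):
        results = summarize(run_load_level(sample_transaction, users, 20))
        # Step 4: compare against an assumed success criterion, e.g. p95 < 50 ms.
        print(f"users={users} p95={results['p95']:.4f}s "
              f"pass={results['p95'] < 0.050}")
```

Running the same harness at increasing load levels, and repeating runs after each tuning change, is what makes the step 9 analysis iterative rather than a one-shot measurement.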