Example: Using quality objectives

Your team can use Rational® Quality Manager to set overall quality objectives and to manage both entry and exit criteria. Quality objectives are defined at the project level and implemented in individual test plans, where you can track whether each objective has been met.

About this task

Here is an example of how a test team can use quality objectives:

Procedure

  1. During the planning process, the test team defines the quality objectives.
    1. A test manager examines the predefined quality objectives in System Properties and evaluates whether they are suitable for her test team.

      Figure: Manage Quality Objectives

      Each quality objective includes a Name and Description, as well as a Condition and a Target.

    2. The test manager decides to modify the settings of some of the predefined quality objectives and also create some new ones.

      For example, one of the predefined quality objectives states that the Percentage of Failed Execution Records must be less than 10%. The test manager changes the target from less than 10% to less than 5% by double-clicking the value and replacing 10 with 5.

      Both the predefined and new quality objectives can be used in any test plan within that project area. However, all user-defined quality objectives are informational only and do not contain computed values.

    3. A test lead who is responsible for a particular test plan defines the overall quality objectives for the test plan, using some of the quality objectives defined by the test manager and, if necessary, additional quality objectives that she creates in the test plan.
      Note: Any new quality objectives that are added in the test plan are available to all test plans in the project area.

      For example, the test lead might want to add some overall quality objectives that have not been defined in System Properties, such as the number of concurrent users that the application under test must support or the maximum time allowed for the application under test to open.

      When a quality objective is added to a test plan, the Condition and Target are merged together in the Expected column, as shown in the following figure:

      Figure: Quality Objectives

      The test lead sets the status to Not Started.

    4. The test lead opens the Entry Criteria section of the test plan to define the prerequisites that must be met before testing can begin.

      For example, a System Verification Test team might want to require that all functional verification tests have been attempted and that 95% have been completed; a Functional Verification Test team might want to require that the user interface is frozen. (The arithmetic behind a completion threshold like this is sketched after the procedure.)

    5. The test lead opens the Exit Criteria section of the test plan to define the conditions that must be met before testing can be concluded.

      For example, a System Verification Test team might want to require that all System Verification Tests have been attempted and that 95% have been completed.

  2. As the development effort progresses, the test lead determines if the test entry criteria are being met.
    1. The test lead runs several reports to determine metrics such as the number of Severity 1 Defects and the Execution Record Pass Rate.

      The software computes the Actual Value for each quality objective.

    2. The test lead compares these values with the expected values, sets the Status, and comments on each quality objective. (A simplified illustration of this comparison appears after the procedure.)

      Figure: Entry Criteria

    3. The test lead meets with other team members to determine whether the entry criteria have been met.

      The test lead might decide to keep the original entry criteria or to adjust them.

  3. As the testing effort moves forward, the test lead determines whether the exit criteria are being met, following a similar process to that of the entry criteria.
  4. At the end of the testing effort, the team evaluates whether the overall quality objectives have been met.
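
The comparison logic behind these steps can be summarized in a short sketch. The following Python example is a hypothetical illustration only, not Rational Quality Manager code or its API; it models a quality objective's Condition and Target as an expected value and checks a computed actual value against it, as the test lead does in step 2.

    # Hypothetical illustration; not the Rational Quality Manager implementation or API.
    from dataclasses import dataclass

    @dataclass
    class QualityObjective:
        name: str        # for example, "Percentage of Failed Execution Records"
        condition: str   # "less than" or "greater than"
        target: float    # for example, 5 (percent)

    def evaluate(objective: QualityObjective, actual: float) -> str:
        """Compare a computed actual value against the expected value (condition plus target)."""
        if objective.condition == "less than":
            met = actual < objective.target
        else:
            met = actual > objective.target
        return "Met" if met else "Not Met"

    # The modified objective from the example: failed execution records must stay below 5%.
    objective = QualityObjective("Percentage of Failed Execution Records", "less than", 5.0)

    # Compute an actual value from execution-record counts (hypothetical numbers).
    failed, total = 6, 150
    actual_failed_pct = failed / total * 100       # 4.0
    print(evaluate(objective, actual_failed_pct))  # Met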

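The entry and exit criteria in this example reduce to similar percentage arithmetic. The following sketch, again a hypothetical Python illustration rather than product code, shows how a criterion such as "all tests attempted and 95% completed" could be checked.

    # Hypothetical illustration; not the Rational Quality Manager implementation or API.
    def criteria_met(total: int, attempted: int, completed: int,
                     required_completion_pct: float = 95.0) -> bool:
        """Return True when every test has been attempted and the completion
        percentage reaches the required threshold."""
        all_attempted = attempted == total
        completion_pct = (completed / total) * 100 if total else 0.0
        return all_attempted and completion_pct >= required_completion_pct

    # 200 functional verification tests: all attempted, 192 completed (96%).
    print(criteria_met(total=200, attempted=200, completed=192))  # True

    # 200 tests: all attempted, but only 180 completed (90%).
    print(criteria_met(total=200, attempted=200, completed=180))  # False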