Guideline: Process Performance Models
Establish process performance models to understand past and present process performance, and to predict future results.
Main Description

Process performance models help an organization understand how its processes have performed in the past and provide an expected range of future process performance based on that history. The models are calibrated with process and product metrics from past projects in order to predict the results future projects will achieve by following the same processes.

Organizations establish process performance models in order to:

  • assess the potential return on investment and business value of software capability improvement activities;
  • provide insight into the effect of implemented process changes on process capability and performance;
  • understand how projects should select and tailor organizational processes for optimum performance;
  • provide insight into how project variables impact process and product performance;
  • estimate progress toward achieving objectives that cannot be measured until a later phase of the project lifecycle;
  • build confidence in estimates and in meeting project objectives (within predicted performance ranges) through consistent statistical analysis of processes;
  • enable "what-if" and trade-off analysis for project planning and for taking corrective actions during the project lifecycle.

Establish and maintain process performance models

Identify business objectives and critical processes

In order to determine which process performance models to establish, you must first confirm your business objectives. This step is typically performed when you set up a Performance Measurement System and is re-confirmed as part of its ongoing management. Identify the relationships between those measurable objectives (e.g. improve quality, improve productivity) and their associated critical processes and subprocesses. These vital processes are candidates to statistically manage and control with process performance models. Narrow the list of potential candidates down to only those subprocesses that drive key business objectives and for which data is available to establish baselines and develop models.

For example, if a key business objective is to Improve Quality, a process performance objective may be to reduce defect density to less than 0.5 defects per KLOC (thousand lines of code). You may also attach qualifiers to that performance objective to prevent suboptimization in another area (e.g. you want to reduce defect density without increasing costs by more than a certain amount). Next, identify the processes that drive that performance objective. Processes like functional testing, reviews, and requirements development all impact defect density. These processes can be further broken down into subprocesses like integration testing, walkthroughs, and requirements analysis. These subprocesses are candidates for statistical management and control.
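
As a minimal illustration (in Python, with hypothetical figures), an objective like this can be expressed as a simple check that combines the defect-density target with its cost qualifier:

```python
# Minimal sketch: express the quality objective (defect density below
# 0.5 defects/KLOC) together with a cost qualifier. All figures are hypothetical.

def defect_density(defects_found: int, size_loc: int) -> float:
    """Defects per thousand lines of code (KLOC)."""
    return defects_found / (size_loc / 1000.0)

def meets_objective(defects_found: int, size_loc: int,
                    actual_cost: float, baseline_cost: float,
                    density_target: float = 0.5,
                    max_cost_growth: float = 0.10) -> bool:
    """True if the density target is met without exceeding the cost qualifier."""
    density_ok = defect_density(defects_found, size_loc) < density_target
    cost_ok = actual_cost <= baseline_cost * (1.0 + max_cost_growth)
    return density_ok and cost_ok

# Hypothetical project: 42 defects in 120 KLOC, 5% cost growth over baseline.
print(defect_density(42, 120_000))                         # 0.35 defects/KLOC
print(meets_objective(42, 120_000, 1_050_000, 1_000_000))  # True
```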

Identify prediction outcomes

Identify the outcomes you will predict for projects across the organization; these are your outcome measures. For example, if you want to reduce defect density, predict project defect arrival and removal trends (control measures) in each phase of the lifecycle. These measurements are indicators of project quality, and the predicted outcomes help you identify when corrective actions are required. Outcome and control measures are typically identified when you set up your Performance Measurement System.
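
A small illustrative sketch of tracking such outcome measures, using hypothetical phase names and defect counts:

```python
# Illustrative sketch: track defect arrival and removal counts per lifecycle
# phase so the observed trends can be compared with predicted ranges.
# Phase names and counts are hypothetical.

arrivals = {"Requirements": 8, "Design": 15, "Code": 40, "Test": 22}
removals = {"Requirements": 6, "Design": 12, "Code": 35, "Test": 28}

open_defects = 0
for phase in arrivals:
    open_defects += arrivals[phase] - removals[phase]
    print(f"{phase:<12} arrived={arrivals[phase]:3d} "
          f"removed={removals[phase]:3d} still open={open_defects:3d}")
```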

Identify controllable factors that impact outcomes

Projects have direct or indirect influence over factors that can affect project outcomes. These include controllable factors such as staffing, skills and expertise, selected tools, and technologies, which projects can adjust to improve their outcomes. You should also identify uncontrollable factors, such as contract-related constraints and regulations; these may be fixed on a particular project yet not constrain future projects.
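
One possible way to record these factors as inputs to a model, sketched in Python with illustrative field names:

```python
# Sketch of recording the factors that feed a process performance model,
# separating those a project can adjust from those it cannot.
# The field names are illustrative, not a prescribed schema.

from dataclasses import dataclass

@dataclass
class ProjectFactors:
    # Controllable: the project can adjust these to influence outcomes.
    team_size: int
    avg_experience_years: float
    review_method: str
    # Uncontrollable: recorded as context for later analysis.
    regulatory_domain: str = "none"
    fixed_delivery_date: bool = False

factors = ProjectFactors(team_size=7, avg_experience_years=4.5,
                         review_method="peer review",
                         regulatory_domain="medical")
print(factors)
```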

Collect data and assess its integrity

Harvest project measurement data as part of your Performance Measurement System so that it can be analyzed. Confirm the completeness, accuracy, and integrity of your measurement data to avoid errors in decision making and the implementation of process changes that are inappropriate or unnecessary. Typical causes of bad measurement data include data entry errors, missing data, inconsistent metric definitions, lack of management support for the priority of data collection, and inefficient data collection mechanisms.
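
A minimal sketch of such integrity checks, assuming a hypothetical record layout, might look like this:

```python
# Illustrative integrity checks run before measurement data is used for
# modeling: flag missing fields, impossible values, and duplicate records.
# The record layout and values are hypothetical.

records = [
    {"project": "A", "phase": "Test", "defects": 12, "kloc": 30.5},
    {"project": "B", "phase": "Test", "defects": None, "kloc": 18.0},  # missing value
    {"project": "C", "phase": "Test", "defects": -3, "kloc": 22.1},    # impossible value
    {"project": "A", "phase": "Test", "defects": 12, "kloc": 30.5},    # duplicate
]

def integrity_issues(rows):
    issues, seen = [], set()
    for i, row in enumerate(rows):
        if any(value is None for value in row.values()):
            issues.append((i, "missing value"))
        if isinstance(row["defects"], (int, float)) and row["defects"] < 0:
            issues.append((i, "negative defect count"))
        key = tuple(sorted(row.items()))
        if key in seen:
            issues.append((i, "duplicate record"))
        seen.add(key)
    return issues

for index, problem in integrity_issues(records):
    print(f"record {index}: {problem}")
```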

Identify data types of all outcomes and variables

Determine the types of data you collect (e.g. ratio, nominal, ordinal). These types constrain which analysis techniques you can apply.
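
For illustration, a simple (non-exhaustive) mapping from data types to candidate techniques could be kept alongside the measure definitions:

```python
# Simple sketch: map each measure's data type to candidate analysis techniques.
# The mapping and measure names are illustrative, not exhaustive or prescriptive.

TECHNIQUES_BY_TYPE = {
    "ratio":   ["regression", "control charts", "Monte Carlo simulation"],
    "ordinal": ["rank correlation", "chi-square tests"],
    "nominal": ["chi-square tests", "logistic regression"],
}

measures = {
    "defect_density": "ratio",
    "review_type": "nominal",
    "defect_severity": "ordinal",
}

for name, data_type in measures.items():
    print(f"{name} ({data_type}): {', '.join(TECHNIQUES_BY_TYPE[data_type])}")
```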

Establish baselines

Performance baselines capture the past and current history of all identified factors. Baselines are used to monitor improvement resulting from process changes and from changes in project variables.
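
A minimal sketch of deriving a baseline from historical observations (hypothetical defect-density values, with a +/- 3-sigma range as one common way to express the expected spread):

```python
# Minimal sketch: derive a baseline (mean and an expected range) from
# historical defect-density observations. The sample values are hypothetical,
# and the +/- 3-sigma range is just one common way to express the spread.

from statistics import mean, stdev

historical_density = [0.62, 0.48, 0.55, 0.41, 0.58, 0.50, 0.45, 0.53]

baseline_mean = mean(historical_density)
baseline_sigma = stdev(historical_density)
lower = baseline_mean - 3 * baseline_sigma
upper = baseline_mean + 3 * baseline_sigma

print(f"baseline mean:  {baseline_mean:.3f} defects/KLOC")
print(f"expected range: {lower:.3f} to {upper:.3f} defects/KLOC")
```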

Select appropriate analytical models

Analysis techniques typically used in process performance modeling include statistical modeling, Monte Carlo simulation, probabilistic modeling, and many others. Knowing the data types of your variables and outcomes lets you identify the appropriate analysis technique. For example, Monte Carlo simulation allows modeling of variables that are uncertain (e.g. a range of values instead of a single value). This technique is useful when you want to analyze the simultaneous effects of many uncertain variables, such as in risk management or schedule estimation (best case/worst case).
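
A small Monte Carlo sketch for schedule estimation, using hypothetical task ranges and a triangular distribution as a stand-in for whatever distribution the organization's data actually supports:

```python
# Monte Carlo sketch for schedule estimation: total duration is the sum of
# three uncertain task durations, each given as a (best, most likely, worst)
# range in days. Task values are hypothetical; a triangular distribution
# stands in for whatever distribution the organization's data supports.

import random

tasks = [        # (best case, most likely, worst case) in days
    (10, 15, 25),
    (20, 30, 50),
    (5, 8, 14),
]

def simulate_once():
    # random.triangular takes (low, high, mode)
    return sum(random.triangular(best, worst, likely)
               for best, likely, worst in tasks)

random.seed(1)                       # fixed seed so the sketch is repeatable
runs = sorted(simulate_once() for _ in range(10_000))
p10, p50, p90 = (runs[int(len(runs) * p)] for p in (0.10, 0.50, 0.90))
print(f"schedule estimate: P10={p10:.1f}  P50={p50:.1f}  P90={p90:.1f} days")
```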

Create predictions with both confidence and prediction intervals

Create predictions of outcomes to enable decision making. A confidence interval is a statistical range for an average value computed from a sample of future data points; a prediction interval is a statistical range for an individual future data point.
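
To illustrate the difference, the following sketch computes both intervals for a hypothetical sample using the standard normal-theory formulas:

```python
# Sketch contrasting the two interval types for a hypothetical sample of
# review effort data (hours per KLOC), using standard normal-theory formulas.

import math
from statistics import mean, stdev

sample = [4.2, 5.1, 3.8, 4.9, 5.5, 4.4, 4.7, 5.0]
n, m, s = len(sample), mean(sample), stdev(sample)
t = 2.365   # t critical value for 95% confidence, n - 1 = 7 degrees of freedom

ci_half = t * s / math.sqrt(n)           # half-width: range for the sample mean
pi_half = t * s * math.sqrt(1 + 1 / n)   # half-width: range for one new observation

print(f"95% confidence interval for the mean:  {m - ci_half:.2f} .. {m + ci_half:.2f}")
print(f"95% prediction interval for one value: {m - pi_half:.2f} .. {m + pi_half:.2f}")
```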

Statistically manage subprocesses with process performance models

With process performance models in place you can begin to statistically manage your subprocesses, comparing predicted process performance against actual results. If performance falls outside the predicted range, perform additional analysis to determine the cause (see the sketch after the list below).

The accuracy of predictions depends on:

  • the quality, stability, and tailorability of the process definition
  • the consistency of project compliance
  • the relevance and suitability of the selected process performance model and analysis technique
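
A minimal sketch of this comparison, assuming a hypothetical predicted range and actual measurements:

```python
# Minimal sketch of statistical management: flag a subprocess measurement that
# falls outside the range a process performance model predicted, so the cause
# can be analyzed. The predicted interval and actuals are hypothetical.

predicted_low, predicted_high = 0.38, 0.62   # e.g. predicted defect-density range

actuals = {"Iteration 1": 0.45, "Iteration 2": 0.59, "Iteration 3": 0.71}

for iteration, value in actuals.items():
    if predicted_low <= value <= predicted_high:
        status = "within predicted range"
    else:
        status = "OUTSIDE predicted range - investigate cause"
    print(f"{iteration}: {value:.2f} ({status})")
```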

Take action based on predictions

Investigate unacceptable ranges of values for given outcomes in order to plan corrective actions and process improvements. You may need to rebuild your process performance models to account for previously unidentified project variables, or perform trade-off analysis for process improvements (e.g. improving defect detection processes may make projects more expensive).

Calibrate process performance models

Validate and maintain your process performance models. Confirm that each model is providing practical value and that the analysis techniques used are appropriate for the associated data types. Confirm that your models are accurately predicting performance. Recalibrate your model as needed.

Process performance models may not be very accurate in the beginning. The more project data harvested and fed into the model over time, the better your process performance predictions will be.
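
As a simple illustration of recalibration, the sketch below refits a least-squares relationship between size and defects each time a completed project's (hypothetical) data is added:

```python
# Illustrative recalibration sketch: refit a simple least-squares model
# (defects predicted from size) each time a completed project is added to
# the measurement repository. The data points are hypothetical.

def fit_line(points):
    """Ordinary least squares for y = a + b*x."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

history = [(12, 30), (25, 61), (18, 44), (40, 95)]   # (KLOC, defects) per project
a, b = fit_line(history)
print(f"before recalibration: defects ~ {a:.1f} + {b:.2f} * KLOC")

history.append((32, 70))                             # newly completed project
a, b = fit_line(history)
print(f"after recalibration:  defects ~ {a:.1f} + {b:.2f} * KLOC")
```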

Prerequisites

In order to support process performance modeling, the organization needs the following enablers:

  • a process improvement organization
  • a Performance Measurement System and measurement repository
  • statistical control over target processes, i.e. processes that are predictable and stable
  • statistical analysis tools

Tool support 

Statistical analysis tools and spreadsheets are the typical tools used to create and manage process performance models. IBM Rational Focal Point provides features for performing statistical analysis of project financials and risk.