Capture Work Status
There are different ways to approach this step, and much of the approach will depend on your project culture. Where
available, gather and collate progress reports prepared by individual team members or sub-teams. Project time sheets
are another possible source to consider. If a project scheduling system such as Microsoft® Project is actively used
and updated with actual progress, it provides another useful source of information. You might also derive objective
status or progress metrics from configuration and change management systems, where these are available and actively
used.
For this step and subsequent steps that deal with gathering information and assessing the test effort, try to obtain a
balanced view, incorporating both objective and subjective measures. Remember that objective numbers only give part of
the picture, and need to be supported and explained by the current project "climate". Conversely, do not rely purely on
hearsay and subjective speculation about the test effort: look for supporting objective evidence. Supplement your
objective data by discussion with either team leads or (where possible) individual team members to gather subjective
assessments and gauge how much confidence you can place in the objective data.
|
Gather Test Effort Productivity and Effectiveness Metrics
Investigate how much effort has been spent on the identification, definition, design, implementation, and execution of
tests. Keep an eye out for signs of excessive effort being devoted to one aspect of the test effort to the detriment of
others. Look also for areas where effort may be unproductive, or not showing sufficient benefit for the level of effort
being expended.
Also examine the effectiveness of the testing, looking for data that supports your initial observations. Consider
aspects such as defect discovery rate, defect severity counts, duplicate defect statistics, and defects detected as
test escapes.
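As a concrete illustration, the sketch below computes some of these measures from a hypothetical export of defect
records. The field names (severity, duplicate_of, escaped) are assumptions for illustration, not a schema from any
particular change management tool.

```python
from collections import Counter
from datetime import date

# Hypothetical defect records; the field names are assumptions,
# not a schema from any particular change management tool.
defects = [
    {"id": 1, "severity": "high", "opened": date(2024, 1, 8),  "duplicate_of": None, "escaped": False},
    {"id": 2, "severity": "low",  "opened": date(2024, 1, 9),  "duplicate_of": 1,    "escaped": False},
    {"id": 3, "severity": "high", "opened": date(2024, 1, 15), "duplicate_of": None, "escaped": True},
]

# Discovery rate: defects opened per ISO (year, week).
discovery_rate = Counter(d["opened"].isocalendar()[:2] for d in defects)

# Severity counts, excluding duplicates so they are not double-counted.
severity_counts = Counter(d["severity"] for d in defects if d["duplicate_of"] is None)

# Duplicate rate: a persistently high value can itself signal wasted effort.
duplicate_rate = sum(1 for d in defects if d["duplicate_of"] is not None) / len(defects)

# Test escapes: defects that were detected outside the test effort.
escapes = [d["id"] for d in defects if d["escaped"]]

print(discovery_rate, severity_counts, duplicate_rate, escapes)
```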
|
Gather Change Request Distribution, Trend and Aging Metrics
Identify important trends evident in the Change Request data. In general, it is less important for this task to spend
time analyzing data volumes, and more important to identify what the relative data trends are indicating. Look for
positive signs such as a steady, continuous rate of defect discovery, or a slight ongoing increase or decrease in
discovery rate over time. Be on the lookout for sharp peaks and troughs in discovery rate that indicate the test team
may be encountering process, environmental, political, or other problems that are reducing their productivity.
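A simple way to surface such peaks and troughs is to compare each period's discovery count against the overall
distribution. The following is a minimal sketch, assuming weekly counts have already been extracted; the 1.5 standard
deviation threshold is an arbitrary starting point to calibrate for your project.

```python
from statistics import mean, stdev

# Hypothetical weekly defect-discovery counts, oldest first.
weekly_discoveries = [12, 14, 13, 15, 2, 16, 14, 31, 13]

mu, sigma = mean(weekly_discoveries), stdev(weekly_discoveries)

# Flag weeks whose count deviates sharply from the norm; the 1.5-sigma
# threshold is an assumption to tune against your project's history.
for week, count in enumerate(weekly_discoveries, start=1):
    if abs(count - mu) > 1.5 * sigma:
        print(f"week {week}: discovery count {count} is a sharp peak or trough")
```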
Look at trends in defect closures. Look for significant increases in closures by development staff as "not
reproducible": identify cases where this results from the test team not performing sufficient analysis of the
failure, and quantify the extent of the problem. Look at trends in defects being closed by development staff as
"functioning as designed": identify cases where this results from the test team not performing sufficient analysis of
the specification, and quantify the extent of the problem. Be careful to confirm that these indications are genuine,
and not simply the result of overworked developers triaging their workload. You should also analyze defect
verification trends as fixes are released to the test team in subsequent builds: look out for trends indicating that
defects awaiting verification by the test team are aging or growing to an unmanageable number.
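As a rough illustration of monitoring the verification queue, the sketch below computes the age of each defect
awaiting verification; the record fields and both thresholds are assumptions to calibrate against your build cadence
and team size.

```python
from datetime import date

today = date(2024, 3, 1)

# Hypothetical defects in the "awaiting verification" state, with the date
# the fix was delivered to the test team; field names are assumptions.
awaiting_verification = [
    {"id": 101, "fix_delivered": date(2024, 1, 5)},
    {"id": 102, "fix_delivered": date(2024, 2, 20)},
    {"id": 103, "fix_delivered": date(2024, 2, 27)},
]

ages = [(today - d["fix_delivered"]).days for d in awaiting_verification]

# Two warning signs: the queue is growing too large, or items in it are aging.
# Both thresholds (50 defects, 14 days) are assumptions, not recommendations.
if len(ages) > 50:
    print(f"verification backlog is growing: {len(ages)} defects waiting")
if max(ages) > 14:
    print(f"oldest unverified fix has waited {max(ages)} days")
```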
Look for other trends that indicate problems. Look at the way in which defects and other change requests have been
recorded or managed by the test team: ambiguous and insufficient information on a change request is difficult and
frustrating for a developer to act on. The team should monitor that the quality of the information recorded against
defects remains (on average) relatively high. Take the opportunity to improve the clarity of the associated Change
Requests, eliminating ambiguity and emotive language and reasoning. Work with the individuals who created these work
products to ensure that the essence of each problem is clearly stated, and encourage them to find factual and
accurate ways to discuss the issues.
Also look out for imbalances in defect distribution across a number of different dimensions. Look for functional
areas of the application or the specification that have low defect counts raised against them: this may indicate an
exposure, in that insufficient testing has been undertaken in that functional area. Look also at distribution by test
team member: there may be indications that individual team members are overworked and that productivity is suffering.
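A sketch of both distribution checks follows, assuming defect records carry a functional area and the name of the
tester who raised them; the field names, area names, and data are hypothetical.

```python
from collections import Counter

# Hypothetical defect records; "area" and "raised_by" are assumed fields.
defects = [
    {"id": 1, "area": "billing",   "raised_by": "asha"},
    {"id": 2, "area": "billing",   "raised_by": "asha"},
    {"id": 3, "area": "reporting", "raised_by": "ben"},
    {"id": 4, "area": "billing",   "raised_by": "asha"},
]

by_area = Counter(d["area"] for d in defects)
by_tester = Counter(d["raised_by"] for d in defects)

# An area with unusually few defects may be under-tested rather than healthy;
# compare against the planned coverage. "login" here is an assumed planned area.
planned_areas = {"billing", "reporting", "login"}
for area in sorted(planned_areas):
    if by_area[area] == 0:
        print(f"no defects raised against {area}: check test coverage there")

# A heavily skewed per-tester distribution may indicate overwork.
print("defects per tester:", by_tester)
```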
|
Gather Traceability, Coverage and Dependency Metrics
Analyze the state of the traceability relationships between the testing assets (Test Ideas, Test Motivators, Test
Cases, Test Scripts, Test Suites, and Change Requests) and the upstream and downstream assets that they relate to. Look
for signs that the test effort is focused on the correct areas and driven by a useful set of motivators. Look also
for negative indications that suggest certain aspects of testing are missing or no longer of importance: if the
requirements or development teams are working on areas not represented by the current test effort, this should raise
concerns.
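One way to make such gaps visible is a simple set comparison between the current requirement identifiers and the
traceability links held by the Test Cases. The sketch below assumes these have already been exported into plain sets;
all identifiers are illustrative only.

```python
# Hypothetical traceability links between Test Cases and the requirements
# they cover; identifiers are illustrative, not from any real project.
requirements = {"REQ-1", "REQ-2", "REQ-3", "REQ-4"}
test_case_traces = {
    "TC-10": {"REQ-1"},
    "TC-11": {"REQ-1", "REQ-2"},
    "TC-12": {"REQ-9"},  # traces only to a requirement no longer in scope
}

covered = set().union(*test_case_traces.values())

# Requirements with no test coverage: candidates for missing test effort.
uncovered = requirements - covered

# Test cases tracing only to items outside the current requirement set:
# candidates for tests that are no longer of importance.
stale = [tc for tc, reqs in test_case_traces.items() if not reqs & requirements]

print("uncovered requirements:", sorted(uncovered))
print("possibly stale test cases:", stale)
```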
|
Evaluate Metrics and Formulate Initial Assessment
Collate all of the information you have gathered and evaluate it as a collective whole. Remember that each piece of the
data gathered only addresses one aspect of the total assessment, and you must formulate your assessment of the test
effort based on a balanced and considered view of all data.
Record your initial assessment in a format that makes it easy for the stakeholders to review it, make comments, and
give feedback.
|
Record Findings
This task produces summary status information that is important to the project manager and other roles in the
management team. These roles will use the summary findings to make informed decisions about the project.
Record some aspects of the test effort assessment in a format that allows subsequent assessments to be compared and
contrasted with previous ones. This will enable you to analyze the relative trend in test effort improvements over
time.
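One lightweight way to achieve this is to append each assessment's key metrics to a running log. The sketch below is
a minimal illustration; the file name and the chosen metrics are assumptions, not a prescribed format.

```python
import csv
from datetime import date

# One row of summary metrics per assessment; fields are illustrative only.
snapshot = {
    "date": date.today().isoformat(),
    "open_defects": 42,
    "weekly_discovery_rate": 13.5,
    "verification_backlog": 7,
    "requirements_covered_pct": 88.0,
}

# Append to a running log so successive assessments can be compared.
with open("test_effort_assessments.csv", "a", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=snapshot.keys())
    if f.tell() == 0:  # write the header only when starting a new file
        writer.writeheader()
    writer.writerow(snapshot)
```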
|
Plan and Implement Improvement Initiatives
Based on your analysis and the feedback you have received from various stakeholders, identify opportunities for
improvement. Look for ways to make the testing more effective, productive, and efficient. This might involve
reassigning staff (including pairing staff to work more effectively, or employing specialized contractors), using
productivity tools to improve efficiency, or finding alternative approaches and techniques that are more productive
in terms of finding defects.
In most cases, it is better to make small, incremental improvements to the test effort and avoid the risk of
derailing the project with large, unsettling changes; in some cases, however, a bigger change is warranted and
useful. Use your best judgment to formulate an appropriate approach to improvement, and discuss your ideas with other
management staff to get their input before committing the team to large changes.
|
Monitor and Support Improvement Initiatives
For the improvements to be effective, you will need to actively manage them. Identify ways that you will be able to
monitor improvement initiatives (preferably in advance of their adoption) to assess their effectiveness. Either
actively monitor the progress being made in adopting the changes yourself, or appoint someone else on the team to do
so.
Most changes meet resistance or problems that must be overcome for them to be ultimately successful. Allow time for,
and be prepared to quickly address, any issues that arise and threaten to prevent the initiative from succeeding. Be
sensitive to people's natural reluctance to change, and find ways to address their concerns appropriately.
|
Evaluate and Verify Your Results
Evaluate whether your work is of appropriate quality and complete enough to be useful to the team members who will
make subsequent use of it as input to their work. Where possible, use checklists to verify that quality and
completeness are good enough.
Have the people who perform the downstream tasks that rely on your work review it while you still have time to
address their concerns. You should also evaluate your work against the key input work products to make sure that you
have represented them accurately and sufficiently. It may be useful to have the author of an input work product
review your work on this basis.
|