Getting Started with Software Reviews

🗓️ February 9, 2006 🏷️ Development 🏷️ Process 🏷️ Review
📖 5 minute read
  1. The purpose of design and code reviews is to find defects.
  2. The factor that most influences the length of a design or code review is the amount of work product to be reviewed, measured in pages of design or lines of source code.
  3. Checklist-directed reviews are the most effective at finding defects specific to the project and product. The checklist must be built from the defects the person or team has struggled with most recently.

Starting a Review Practice in a Team

Checklist-directed reviews have a 70% to 90% yield in finding defects. Yield is defined as the percentage of the defects present in the work product on entry into the review that the review actually finds. When a team starts a review practice, there is no checklist yet. How can a team get started? Here are some tips to get reviews going.

Which parts of the system to target for review?

  1. Create on a large piece of paper, or on a whiteboard, a map of the system that the team is working on.
    1. This map must include at least all the physical modules of the system; going to a finer level of detail is better.
    2. Each system part drawn should represent around 2 KLOC of NCSS (KLOC = thousand lines of code; NCSS = non-comment source statements).
  2. Review the defect log.
    1. From the defect log, select the 100 most recent defects (or however many you have from the last month).
  3. Place a mark by each system part for each defect that can be traced to that part.

The system part with the most marks gets to be the target of the investigation.
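The tallying step above can be sketched in a few lines. This is an illustrative example, not a tool from the text; the module names and defect IDs are hypothetical.

```python
from collections import Counter

# Hypothetical defect log: each entry pairs a defect ID with the
# system part (module) it was traced to.
defect_log = [
    ("D-101", "parser"),
    ("D-102", "parser"),
    ("D-103", "network"),
    ("D-104", "parser"),
    ("D-105", "storage"),
]

# Place one mark against each part per defect traced to it.
marks = Counter(part for _, part in defect_log)

# The part with the most marks is the target of the investigation.
target, count = marks.most_common(1)[0]
print(target, count)  # parser 3
```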

If, after a cursory assessment, it seems that finding and fixing all the defects in the part would take longer than rewriting it, then the team should choose the rewrite route.

How much code can you review?

Review speed appears to be roughly constant across projects, ranging from about 200 to 300 NCSS per hour. Reviewing at higher speeds will make the reviewers miss what they are looking for.
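A simple consequence is that review time should be planned from the size of the work product. A minimal sketch, assuming the 200-300 NCSS/hour range above (the 250 default is my midpoint, not a figure from the text):

```python
def review_hours(ncss, speed=250):
    """Estimated hours needed to review `ncss` non-comment
    source statements at a sustainable review speed."""
    return ncss / speed

# A 2 KLOC module at the middle of the range takes a full day:
print(review_hours(2000))  # 8.0
```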

What is the difference between a review and defect finding?

During a review, the reviewer is looking directly at the source code, so when a defect is noticed it is right there in front of them; no further detection is needed. By contrast, when a developer has to investigate a defect reported by a customer, the cause of the defect must first be established. This is a time-consuming activity because it involves a great deal of uncertainty.

What to look for in a review?

It is best to look for defects that have been observed already. Chances are that defects noted in one part of the system are present in other parts as well (especially if both parts have recently been worked on).

Types of Reviews

  1. Personal Review
  2. Peer Review
  3. Inspection
  4. Walkthrough

Review Principles

Personal reviews follow a process with

  • entry and exit criteria
  • a defined review structure
  • guidelines, checklists, and standards

The goal of a personal review is to find every defect before the first unit test. To address this goal:

  • use coding standards
  • use design completeness criteria
  • measure and improve your review process

Design Review Principles

  1. Produce designs that can be reviewed.
  2. Follow an explicit review strategy.
  3. Review the design in stages.
  4. Verify that the logic correctly implements the requirements.

What is a Reviewable Design?

A reviewable design has:

  1. defined context
  2. precise representation
  3. consistent and clear structure

This suggests that:

  1. the design’s purpose and function are explicitly stated
  2. you have criteria for design completeness
  3. the design is explicitly structured in logical elements

Checklists

Checklists: The Theory

  1. When performing precise tasks, it is difficult to do more than one thing well at a time.
  2. The checklist defines the review steps in the suggested order for performing them.
  3. By checking off each item, you are more likely to perform it properly.
  4. Establish a personal checklist that is customized to your defect experience.
  5. Process yield is the percentage of defects found before the first unit test execution; aim for 70% or higher.

Checklists: HOWTO

  1. Use your review strategy.
  2. Review one product component at a time.
  3. Check for one type of defect at a time.

Checklists: The Key Point

Treat each check as a personal certification that the product is free of this defect.

For Extra Credit: Estimating Remaining Defects After a Peer Review

Use the Capture-Recapture Method:

  • A: The number of defects found by the first reviewer.
  • B: The number of defects found by the second reviewer.
  • C: The number of defects found by both the first and the second reviewer.

Estimated total number of defects in the product:   E = (A × B) / C
Total defects found so far:   D = A + B − C
Estimated remaining defects in the product:   R = E − D = (A × B) / C − (A + B − C)
Inspection yield:   Y = (D / E) × 100
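The capture-recapture formulas above translate directly into code. A minimal sketch; the example counts are made up for illustration:

```python
def capture_recapture(a, b, c):
    """Estimate defects remaining after a two-reviewer peer review.

    a: defects found by the first reviewer
    b: defects found by the second reviewer
    c: defects found by BOTH reviewers (the overlap)
    """
    if c == 0:
        raise ValueError("no overlap: estimate undefined (C must be > 0)")
    e = a * b / c       # E: estimated total defects in the product
    d = a + b - c       # D: distinct defects found so far
    r = e - d           # R: estimated defects still remaining
    y = d / e * 100     # Y: inspection yield, in percent
    return e, d, r, y

# Reviewer 1 finds 20 defects, reviewer 2 finds 15, and 10 are common:
e, d, r, y = capture_recapture(20, 15, 10)
# e = 30.0 total, d = 25 found, r = 5.0 remaining, y ≈ 83.3% yield
```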

HOWTO Quick Summary

  1. Pick a piece of code that you are uneasy about.
  2. Review it for defects that you had recently. KEEP A LIST!
  3. Keep your review speed at 200 LOC/hr. Plan ahead!
  4. Only look for one defect at a time.
  5. Treat each check as a personal certification that the product is free of this defect.

See Also

Guiding Principles for Reviews from Wiegers

  1. Check your egos at the door (Weinberg)
  2. Keep the review team small.
  3. Find problems during reviews, but don't try to solve them.
  4. Limit review meetings to about two hours.
  5. Review only about 200 to 400 NCSS per hour.
  6. Start the reviews where the perceived pain is the greatest.
  7. "It is only a mistake if it gets out of the review."
  8. "Avoid using technical reviewers who are themselves 'above' review."

References

  1. Peer Reviews in Software, by Karl Wiegers
  2. Introduction to the Team Software Process, by Watts Humphrey
  3. When two eyes aren't enough, by Karl Wiegers
  4. Seven Truths About Peer Reviews, by Karl Wiegers
  5. Seven Deadly Sins of Software Reviews, by Karl Wiegers
  6. Handbook of Walkthroughs, Inspections and Technical Reviews, by Freedman and Weinberg