Source: The Software Experts, Software Inspection, http://www.the-software-experts.de/e_dta-sw-test-inspection.htm
Software Inspection
There are various names for the same thing. Some call it software inspection, which can also extend to the design and its documentation; some call it code inspection, which relates more to the source code. A third name is Fagan Inspection, named after Michael Fagan, the person who invented this quality assurance and testing method.
Code inspections are a highly efficient test method which cannot be substituted by any other test method. It is time-consuming, but according to statistics it will find up to 90% of the contained errors if done properly. However, it all depends on the methods and checks applied and on the diligence of the inspectors. It must not be confused with the so-called "code review" or "walk-through", which is usually done in a single meeting lasting a couple of hours. A proper code inspection may take several days and needs the help of tools to browse the symbols in order to find the places where they are used. The code review can be used in addition, e.g. to generate acceptance of a software package by the integrators, but it must not be a substitute for a proper inspection. Proper inspections can be applied to almost all work products in the software life cycle. At first glance they may look very time-consuming, but statistical evaluations have shown that over the whole life cycle of the software development they actually save resources, and thus money, and improve the quality of the product.
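As an illustration of the kind of tool support meant here, the following is a minimal sketch of a symbol-usage search over a C source tree. It is purely hypothetical (the function name find_symbol_uses and the file suffixes are my own choices, not something defined in any standard); in practice an IDE cross-reference or a dedicated tagging tool would typically do this job.

import re
import sys
from pathlib import Path

def find_symbol_uses(root, symbol, suffixes=(".c", ".h")):
    # List every place the given identifier appears in the source tree so the
    # inspector can follow its uses during preparation.
    pattern = re.compile(rf"\b{re.escape(symbol)}\b")
    for path in Path(root).rglob("*"):
        if path.is_file() and path.suffix in suffixes:
            for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
                if pattern.search(line):
                    yield path, lineno, line.strip()

if __name__ == "__main__":
    # Usage: python find_symbol_uses.py <source-root> <symbol>
    for path, lineno, line in find_symbol_uses(sys.argv[1], sys.argv[2]):
        print(f"{path}:{lineno}: {line}")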
Most people are not aware that the manual static testing methods, i.e. inspections, reviews and walk-throughs, are defined in the "IEEE Standard for Software Reviews", IEEE 1028-1997. I want to give a short overview of the main definitions in this standard. However, I will not discuss the "Management Review", which is in the widest sense a check of a project's performance and the related documents. I will also omit a discussion of the Audits described in the standard, which are more related to having external checks on work products and processes. I will focus on the review techniques for technical work products as they are typically used within a company. I also want to point out the problems involved with these methods and attempt to present some solutions for these problems.
Source of the diagram: Michael Fagan
Walk-throughs
A walk-through can have a twofold purpose. First of all, it can be performed to evaluate a software product in order to:
1. Find anomalies
2. Improve the software product
3. Consider alternative implementations
4. Evaluate the conformance to standards and specifications
In summary, you could say that this kind of walk-through is a method which should be used throughout the design phase of a software product to collect ideas and input from other team members, leading to an overall improvement of the product. The second objective of a walk-through is to exchange techniques and train the participants. It is a method to raise all team members to the same level of knowledge regarding programming styles and details of the product. In a sense it also generates agreement within the team about the object of the walk-through.
The formal aspects of a walk-through have a low profile. Only a few roles are defined in the standard: a walk-through leader, a recorder, the author of the work product and team members. The standard says that at least two members have to be assembled for the walk-through and that the roles can be shared among them. The walk-through has to be planned, which means that the participants have to be defined and the meeting has to be scheduled. Further, the findings and outcomes of the meeting have to be recorded.
In total, this is a simple and easy-to-use method for everyday technical work from which you can expect good benefit.
Technical Reviews
About the purpose of technical reviews the IEEE standard says: "The purpose of a technical review is to evaluate a software product by a team of qualified personnel to determine its suitability for its intended use and identify discrepancies from specifications and standards." In other words, the technical review is a meeting in which a team analyzes a work product to see if its quality is as expected or if it needs some improvement. The standard further states that not all aspects of the review object necessarily have to be examined and that it is a possible purpose of the meeting to come up with alternatives for a better design. The list of work products to which the review can be applied is quite long: Software requirements specification, Software design description, Software test documentation, Software user documentation, Maintenance manuals, System build procedures, Installation procedures and Release notes are all possible candidates for the review. The review meetings should be planned in the project plan, or they can be held on request, e.g. by the quality group. The roles involved in a technical review are as follows:
· Decision maker
· Review leader
· Recorder
· Technical staff
· Management staff (optional)
· Other team members (optional)
· Customer or user representative (optional)
According to the IEEE standard the input to the technical review shall include the following:
· A statement of objectives for the technical review (mandatory)
· The software product being examined (mandatory)
· Software project management plan (mandatory)
· Current anomalies or issues list for the software product (mandatory)
· Documented review procedures (mandatory)
· Relevant review reports (should)
· Any regulations, standards, guidelines, plans, and procedures against which the software product is to be examined (should)
· Anomaly categories (See IEEE Std 1044-1993 [B7]) (should)
Instead of describing the technical review procedure in words, I have put it into a diagram. In the following flow chart you find the steps of the review, the expected inputs, i.e. the work product and other documents, and a rough description of each process step in the comment boxes:
Inspections
The inspection as described in the IEEE standard is basically the same as the Fagan Inspection, as invented and described by Michael Fagan in 1976. Compared to the technical review, there is more formalism and there are more roles involved. Further down you will find a comparison and discussion of these review techniques, so I will not go into details here. The only thing which needs to be pointed out at this point is that the IEEE standard states that the inspection should be done according to the project plan; it is usually not just done on demand. Further, there is an explicit reference to the software verification and validation plan, in which the inspections should be reflected. An inspection is therefore regarded as a proper testing activity rather than an activity to evaluate a work product for suitability. The IEEE standard says that the following roles shall be established for the inspection:
· Inspection leader
· Recorder
· Reader
· Author
· Inspectors
According to the IEEE standard, the input to the inspection shall include the following (a small sketch of the mandatory/recommended split follows the list):
· The software product to be inspected (mandatory)
· Documented inspection procedure (mandatory)
· Inspection reporting forms (mandatory)
· Current anomalies or issues list (mandatory)
· Inspection checklists (should)
· Any regulations, standards, guidelines, plans, and procedures against which the software product is to be inspected (should)
· Hardware product specifications (should)
· Hardware performance data (should)
· Anomaly categories (see IEEE Std 1044-1993 [B7]) (should)
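Purely as an illustration of the mandatory versus recommended ("should") split in the input list above, here is a small hypothetical sketch of an entry check that an inspection leader could run before scheduling the meeting. The item names and the class InspectionPackage are my own shorthand, not wording taken from the standard.

from dataclasses import dataclass, field

# Shorthand labels for the IEEE 1028 input items listed above.
MANDATORY_INPUTS = {
    "software product",
    "documented inspection procedure",
    "inspection reporting forms",
    "current anomalies or issues list",
}
RECOMMENDED_INPUTS = {
    "inspection checklists",
    "regulations, standards, guidelines, plans and procedures",
    "hardware product specifications",
    "hardware performance data",
    "anomaly categories",
}

@dataclass
class InspectionPackage:
    provided_inputs: set = field(default_factory=set)

    def missing_mandatory(self) -> set:
        return MANDATORY_INPUTS - self.provided_inputs

    def missing_recommended(self) -> set:
        return RECOMMENDED_INPUTS - self.provided_inputs

    def ready_for_meeting(self) -> bool:
        # The meeting should not be scheduled while mandatory inputs are missing.
        return not self.missing_mandatory()

package = InspectionPackage({"software product", "inspection reporting forms"})
print(package.missing_mandatory())    # mandatory inputs still to be collected
print(package.missing_recommended())  # recommended but absent inputs
print(package.ready_for_meeting())    # False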
Again, I have described the procedure in a diagram. In the following flow chart you find the steps of the inspection, the expected inputs, i.e. the work product and other documents, and a rough description of each process step in the comment boxes:
Comparison of Inspections and Reviews
As you can see from the diagrams, a technical review is not the same as an inspection. There are big differences, and I want to summarize them in the following table:
Issue / Technical Review / Inspection
Objective / Determine the suitability of a work product for its intended use. / Determine the suitability of a work product for its intended use, but beyond that search for anomalies by examination through educated inspectors.
Roles / A minimum of two persons is required, and each of them can assume multiple roles. Since the scope is different, additional persons, e.g. management or customer representatives, can participate, but this is not regarded as a role in the process. / Additionally required roles are the Author and the Reader. The roles are explicitly separated and cannot be assumed by one person.
Input / The inputs are very similar to those of the inspection. It has to be observed that checklists are not mentioned, i.e. not required. / Additional inputs are: inspection reporting forms, inspection checklists, hardware product specifications and hardware performance data. Some of these inputs are optional; the reporting forms are mandatory, as is the inspection checklist. The wording of the standard would suggest that checklists are not mandatory, but a table in the appendix of the standard makes them mandatory.
Output / The only outputs are an action item list and a record (meeting minutes) of the technical review. / The outputs are a formal inspection report, a formal defect summary and a defect list with classified defects. The emphasis is on providing a standard record of found errors which would also allow statistical evaluations.
Entry Criteria for the Meeting / No special criteria are mentioned. / It is explicitly stated that the meeting has to be re-scheduled if the inspection leader finds that the participants are not well prepared.
Meeting / The described rules of a review meeting have to be kept. / The defined roles have to be explicitly kept: the reader, and not the author, presents the material. The other roles also have to be formally observed.
Outcome of the Meeting / Generation of the defined review report. / At the end of the inspection a decision has to be made: accept the work product; close the inspection but require rework; or reject the work product and require a re-inspection after the rework is done.
There are some additional things to be observed. According to the standard, the inspection leader has to have formal training in the inspection process. Further, practical experience shows that an inspection or review meeting should not be longer than 2 hours; after that time the quality of the inspection or review decreases. Another rule is that in an inspection you should cover a maximum of 100 lines of code per hour and a maximum of 400 lines of code per day.
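To make these pacing rules concrete, here is a minimal, purely illustrative sketch that turns the limits just quoted (2-hour meetings, 100 lines of code per hour, 400 lines of code per day) into a rough estimate of the number of meetings and calendar days needed for a given code size; plan_inspection is a hypothetical helper of my own, not part of any standard.

import math

MAX_MEETING_HOURS = 2     # a single meeting should not exceed 2 hours
MAX_LOC_PER_HOUR = 100    # at most 100 lines of code per hour
MAX_LOC_PER_DAY = 400     # at most 400 lines of code per day

def plan_inspection(loc):
    # 200 lines of code can be covered per 2-hour meeting.
    loc_per_meeting = MAX_MEETING_HOURS * MAX_LOC_PER_HOUR
    meetings = math.ceil(loc / loc_per_meeting)
    days = math.ceil(loc / MAX_LOC_PER_DAY)   # lower bound on calendar days
    return {"loc": loc, "meetings": meetings, "days": days}

print(plan_inspection(2_000))    # about 10 meetings spread over at least 5 days
print(plan_inspection(20_000))   # about 100 meetings over at least 50 days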
Practical Problems with Inspections
An inspection requires a big effort and a lot of resources. The number of participants in an inspection is usually around 5 to 6 persons. However, in modern industry it is very difficult to get 6 people who fulfill the "requirements" to attend numerous inspection meetings. We do not live in 1976 any more: time pressure is high and development teams are global. If you think of a small piece of software, let's say 2000 lines of code, and you try to keep to the inspection rules, this means you have to schedule approximately 10 inspection meetings to do a complete inspection of the code. In case the leader finds that preparation was not done well, he has to re-schedule the meetings. In a modern company with high time pressure in development and a global setup of the teams this seems almost impossible to accomplish, especially if you consider that a medium microcontroller application consists of 20,000 or more lines of code!
The formalism in the inspection process is heavy. However, the formalism does not guarantee that bugs are detected! Formalism in a way guarantees that the inspection meetings are performed in the intended way, and by this it may be implied that the quality of the inspection meets a certain standard, but that's all. You guarantee that a number of meetings were held according to the rules. But there is no real benefit in it and no guarantee that the real task, i.e. TO FIND ANOMALIES, is really accomplished to an extent that justifies the heavy effort.
The preparation period as described in the standard and as performed in real life is very vague! The checklists may not cover everything that is necessary; usually they are too general. A typical question in such a checklist is: "Are overflows/underflows in calculations considered?" It is then left up to the inspector to find a way to check this. Some experienced inspectors may have a standard method of their own and produce good results, but there is no clear method for the exact procedure that has to be followed in the preparation period or in the inspection meeting. This is left completely open. However, this activity IS THE BUG FINDING ACTIVITY! It is especially during the preparation time that the bugs are found. In the meeting things may become clearer when they are presented in the words of the reader, and additional problems may be found, but the main bug finding activity is the preparation.
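To illustrate what such a checklist question is aiming at, here is a small hypothetical example that emulates 16-bit unsigned arithmetic as a microcontroller would perform it. The buggy midpoint calculation is exactly the kind of anomaly an inspector working through the overflow question during preparation would be expected to find; the function names are my own, chosen only for this illustration.

MASK_16 = 0xFFFF  # emulate a 16-bit unsigned register

def add_u16(a, b):
    # Addition as a 16-bit target would perform it: the result silently wraps.
    return (a + b) & MASK_16

def midpoint_buggy(lo, hi):
    # "Are overflows/underflows in calculations considered?" -- here they are not:
    # (lo + hi) can exceed 0xFFFF and wrap, so the midpoint is wrong.
    return add_u16(lo, hi) // 2

def midpoint_fixed(lo, hi):
    # Rewritten so the intermediate result always stays within 16 bits.
    return lo + (hi - lo) // 2

lo, hi = 40_000, 60_000           # both values fit into an unsigned 16-bit variable
print(midpoint_buggy(lo, hi))     # 17232 -- wrapped, not even between lo and hi
print(midpoint_fixed(lo, hi))     # 50000 -- the intended midpoint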