Chapter 3 Using Formal Inspections in Software Quality Assurance

CIS 841 Web Book -- Fall 1999

Authors

Ge Fan                     Email: gefan@cis.ksu.edu
Fang Fang                Email: ffa1928@cis.ksu.edu
Stacy Lacy               Email: slacy@kscable.com
Instructor Dr. Bill Hankley

Abstract

Formal inspections may be applied to any product or partial product of the software development process, including requirements, design, and code. Software formal inspections are in-process technical reviews of a product of the software life cycle, which aim to detect and eliminate defects in the early stages of each product's development. In addition, feedback from formal inspections can be returned immediately to the author, improving the quality of future products.

The cost of changing requirements, altering design elements, and fixing and retesting code errors escalates as the project progresses. Implementing formal inspections into the software development process can have a direct positive impact on project cost, schedule, and functionality.

This tutorial will discuss the general concepts and definitions of software formal inspections, the formal inspection process, and the roles of the inspection team. Due to the widespread adoption of object-oriented development, emphasis will be on formal inspections in an object-oriented environment.


Table of Contents

1.0 Introduction (Fan)
1.1 What are inspections
1.2 Who uses inspections
1.3 Why use inspections
1.4 What are the differences among inspections, walkthroughs and reviews
1.5 What are paybacks from inspections
2.0 Inspection Process (Lacy)
3.0 Roles in Inspection Process (Fang)
3.1 Procedural roles
3.1.1 Moderator
3.1.2 Author
3.1.3 Reader
3.1.4 Recorder
3.1.5 Inspectors
3.2 Guidelines for roles

3.3 Participation of inspectors

3.3.1 Planning
3.3.2 Overview
3.3.3 Preparation
3.3.4 Inspection Meeting
3.3.5 Discussion
3.3.6 Rework
3.3.7 Follow-up
4.0 Inspections During the Software Life Cycle
4.1 Requirements Inspection (Fang) This section is currently unavailable
4.1.1 Purpose
4.1.2 Materials to be distributed
4.1.3 Entrance criteria
4.1.4 Reference materials
4.1.5 Coverage rates
4.1.6 Participants and roles
4.1.7 Procedure
4.1.8 Exit criteria
4.1.9 Checklist
5.0 Inspection Tools and Techniques (Fan)
5.1 ASSIST
5.2 Scrutiny
5.3 ICICLE
5.4 CSI
5.5 WiP
5.6 ReviewPro
5.7 CheckMate
6.0 Summary (Lacy)

7.0 References

8.0 Self-assessment (Fan, Fang, Lacy)

9.0 Glossary (Fan, Fang, Lacy)

Extended Example


1.0 Introduction

        Software inspections were first introduced by Michael E. Fagan in the 1970s, when he was a software development manager at IBM. The inspection process for software development evolved from the practice at IBM at that time. To distinguish software inspection from general inspection, the software inspection is properly called the in-process inspection. For simplicity, we frequently use the term "inspection" as synonymous with "in-process inspection".

1.1 What are inspections
        Inspections are a means of verifying intellectual products by manually examining the developing product, a piece at a time, in small groups of peers to ensure that it is correct and conforms to product specifications and requirements [STRAUSS]. The purpose of inspections is the detection of defects. There are two aspects of inspections to address. First, inspections occur in the early stages of the software life cycle (requirements, design, and coding) and examine a piece of the developing product at a time. Without inspections, defects introduced in the requirements stage are amplified into more defects in the design stage and still more in the coding stage. Earlier defect detection therefore lowers development cost and better ensures the quality of the delivered product. Second, a small group of peers concentrating on one part of the product can detect more defects than the same number of people working alone. This improved effectiveness comes from the thoroughness of the inspection procedures and the synergy achieved by an inspection team.

1.2 Who uses inspections
        Inspections can play a significant role in a quality management system when used consistently and correctly, but they are of little value if applied haphazardly, without controls, or for the wrong tasks [STRAUSS]. There are circumstances in which inspections cannot be applied, or are of only limited use.

Even if applied correctly and consistently, inspections are not effective in all project environments [STRAUSS]. For instance, they will not work well on unstructured projects, such as research, or on a project that has no interim checkpoints. Inspections are therefore best suited to structured projects with defined checkpoints.

1.3 Why use inspections
        Every effort expended on a project is expected to contribute to quality, productivity, and customer satisfaction rather than be wasted. Much of the cost of poor quality, however, is the cost of doing things over, possibly several times, until they are done correctly. In addition, the costs of lost time, lost productivity, lost customers, and lost business are real costs with no return on them. The cost of quality is not such a negative cost: experience with inspections shows that the time added to the development cycle to accommodate the inspection process is more than gained back in the testing and manufacturing cycles, and in the cost of redevelopment that no longer needs to be done [STRAUSS].

1.4 What are the differences among inspections, walkthroughs and reviews
        Among quality control methods, inspection is a mechanism that has proven extremely effective for the specific objective of product verification in many development activities. It is a structured method of quality control: it must follow a specified series of steps that define what can be inspected, when it can be inspected, who can inspect it, what preparation is needed for the inspection, how the inspection is to be conducted, what data is to be collected, and what the follow-up to the inspection is. Inspections therefore offer close procedural control and repeatability. Reviews and walkthroughs, by contrast, have less structured procedures and can have many purposes and formats. Reviews can be used to form decisions and resolve issues of design and development; they can also serve as a forum for information swapping or brainstorming. Walkthroughs are used for the resolution of design or implementation issues. Both methods can range from formal, following a predefined set of procedures, to completely informal, and thus they lack the close procedural control and repeatability of inspections.

1.5 What are paybacks from inspections
        The key reason for inspections is to obtain a significant improvement in software quality, as measured by the defects found in the product when it is used. A project example from AT&T, the Integrated Corporate Information System (ICIS), shows inspections applied to a portion of ICIS.

The evaluation indicates the high quality of the resulting product. Moreover, as Mike Fagan pointed out in 1976, inspections shorten the development schedule, and productivity increases as inspections are performed formally. In addition, it should be expected that the development timescale, testing cost, and lifetime cost will be reduced and the manageability of the development process will be improved.


2.0 Inspection Process

2.1 Inspection Process Overview

2.1.1 Inspection Process Flow
The inspection process consists of seven primary steps: Planning, Overview, Preparation, Inspection, Discussion, Rework, and Follow-up. Two of these steps (Overview and Discussion) are optional based on the needs of the project. This chapter describes each of these steps. A flow chart of the inspection process is provided in Figure 2.1; refer to it throughout this section to better understand the material.
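The seven-step flow with its two optional steps can be sketched as a simple filter. This is an illustrative model only; the names below are ours, not part of any inspection standard:

```python
# The seven primary steps of the inspection process, in order.
STEPS = ["Planning", "Overview", "Preparation", "Inspection",
         "Discussion", "Rework", "Follow-up"]

# Overview and Discussion are optional, based on the needs of the project.
OPTIONAL = {"Overview", "Discussion"}

def steps_for_project(include_optional: bool) -> list:
    """Return the inspection steps a project will actually execute."""
    return [step for step in STEPS
            if include_optional or step not in OPTIONAL]
```

A minimal project runs only the five mandatory steps; a project with unfamiliar inspectors or complex defects would include the optional two as well.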
 
Figure 2.1 Inspection Process Flow (currently unavailable).

 
 
2.1.2 Inspection Steps and Deliverables
Each step in the inspection process should have specified deliverables. (If there are no deliverables, the step should be eliminated.) The following table outlines recommended deliverables for each part of the inspection process, along with the role responsible for each deliverable.
 

Inspection Step   Deliverable(s)                    Responsible Role
Planning          Participant List                  Moderator
                  Materials for Inspection          Author
                  Agenda                            Moderator
                  Entrance Criteria                 Moderator
Overview          Common Understanding of Project   Moderator
Preparation       Preparation Logs                  Each Inspector
Inspection        Defect Log                        Recorder
                  Inspection Report                 Moderator
Discussion        Suggested Defect Fixes            Various
Rework            Corrected Defects                 Author
Follow-up         Inspection Report (amended)       Moderator

 
 

2.2 Planning an Inspection

Planning for a software review or inspection consists of four key parts: selecting the participants, developing the agenda, distributing necessary materials, and determining entrance criteria.
2.2.1 Selecting Participants
Selection of participants for an inspection can involve good political and negotiation skills. The main purpose of the inspection is to improve software quality and reduce defects. Rule of thumb #1: If a participant does not have the qualifications to contribute to the inspection, they should not be included [FREE].

Another complexity in selecting participants is how many people should be included. The number is partly determined by assigning specific roles; section 3.0 discusses roles and how to assign them. A second consideration on team size has to do with communication. The larger the group, the more lines of communication must be maintained. If the group grows beyond 6-10 people, the time spent on communication, scheduling, and maintaining focus will detract from the quality and timeliness of the inspection. Rule of thumb #2: The optimum team size for inspections is less than 10.
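The communication cost grows quadratically: a team of n people has n(n-1)/2 pairwise channels. A quick illustrative sketch (the function name is ours, not from the inspection literature):

```python
def communication_channels(team_size: int) -> int:
    """Number of pairwise communication channels in a team of the given size."""
    return team_size * (team_size - 1) // 2

# A 4-person team has 6 channels to maintain; a 10-person team already has 45,
# which is why large inspection teams spend more time coordinating than inspecting.
for n in (4, 6, 10):
    print(n, communication_channels(n))
```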

A common question is: should managers be included? Managers should be aware of the outcome of inspections, but generally should not be included [FREE]. Referring to rule of thumb #1, managers should only participate if they will directly add value to the substance of the inspection. Inspections and reviews are meant to improve software quality, not to assess or manage people. If the latter becomes the goal of the process, inspections will be threatening and participants will not be completely open [NASA1].


2.2.2 Developing the Agenda
The agenda for an inspection should be created in advance and distributed to all participants. A standard template for each type of inspection should be developed based on the organization's implementation of the inspection process. The inspection meeting should be managed closely to follow the agenda; if meetings frequently go off track and run late, participation will decline.

2.2.3 Distributing Materials
Prior to an inspection, all materials that will be used in the meeting should be distributed. The lead time depends upon the size of the items to be reviewed. In general, there should be time for each participant to thoroughly review the component they are responsible for inspecting. The author should provide the materials to the moderator. The moderator will verify and distribute them.

The method of distribution for materials is dependent on the culture of the development team and the type of inspection to be conducted. Many participants may prefer hard copy for design elements, while others prefer to navigate through electronic documents. Online access to code for inspection may be preferred for access to search functions.

Inspection participants should be expected to bring any materials they need for the review to the meeting. There should be no new material at the meeting; if the materials were incomplete, the review should be rescheduled.

2.2.4 Entrance Criteria
Specific criteria should be required to qualify a product or artifact for an inspection [COLL]. Typical entrance criteria would be that the product is complete and would be delivered to the customer if the inspection finds no defects.

2.3 Pre-Inspection Overview

An optional pre-inspection overview may be held at the direction of the moderator. It is possible that inspectors from groups not immediately involved with the project will be necessary to conduct a thorough inspection. In this situation it is recommended that an overview of the project be conducted bringing the new players up to speed.

For inspections conducted by people already familiar with the project, the overview may be abbreviated or eliminated. If an inspection seems to be bogged down in questions that should be general knowledge, the moderator may wish to postpone the inspection and conduct an overview.

2.4 Preparing for the Inspection

2.4.1 Inspecting the Product or Artifacts
The primary activity in preparation for a software inspection is the thorough inspection of the product or artifacts in question. This should be conducted prior to the inspection meeting and at a line-by-line or item-by-item level of detail. Inspection should consider standards, best practices, and regulations or policies.
2.4.2 Checklists
Inspections cover a wide variety of project artifacts including requirements, design, code, and test plans. For each different type of inspection it is useful for an organization to create standard checklists for use by the inspectors [NASA1]. The checklists should be "living" documents so an organization can improve on inspection practices and develop core inspection knowledge.
2.4.3 Preparation Logs
During preparation, inspectors should log all defects found and track the time spent on preparation. The use of a Preparation Log by each inspector is a best practice. The preparation log should be given to the moderator before the inspection meeting. The moderator can determine based on all the logs whether the inspection team is adequately prepared for the event. Click here for an example Preparation Log.
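The moderator's readiness check described above can be sketched as a simple rule. The minimum-preparation threshold below is a hypothetical figure chosen for illustration, not one prescribed by the inspection literature:

```python
# Minutes of preparation reported in each inspector's Preparation Log.
preparation_logs = {"reader": 95, "recorder": 120, "inspector": 40}

MIN_PREP_MINUTES = 60  # hypothetical threshold set by the moderator

def team_is_prepared(logs: dict) -> bool:
    """The moderator checks all logs; one under-prepared inspector
    is enough to question whether the meeting should proceed."""
    return all(minutes >= MIN_PREP_MINUTES for minutes in logs.values())

if not team_is_prepared(preparation_logs):
    print("Consider rescheduling: team is not adequately prepared")
```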

2.5 Conducting the Inspection

The software inspection is conducted by presenting the material, focusing on inspection of the product and completing two deliverables, the Defect Log and the Inspection Report.
2.5.1 Presenting the Material
The reader presents the inspection material at a pace and in a manner such that all participants can understand, keep up, and contribute. An inspection should not last more than a few hours, because attention span and physical comfort are difficult to maintain. If an inspection covers more material than can be handled in 2 hours, it should be broken up into multiple sessions or multiple inspections [NASA1].
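The two-hour rule can be turned into a simple session-planning calculation. The 150 lines/hour coverage rate below is a hypothetical figure; real rates should come from an organization's own inspection data:

```python
import math

def plan_sessions(total_lines: int, lines_per_hour: int = 150,
                  max_hours_per_session: int = 2) -> int:
    """Number of inspection sessions needed so that no session exceeds
    the two-hour limit at the assumed coverage rate."""
    lines_per_session = lines_per_hour * max_hours_per_session
    return math.ceil(total_lines / lines_per_session)

# 1000 lines at 150 lines/hour needs four two-hour sessions.
print(plan_sessions(1000))
```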
2.5.2 Focus on Inspecting the Product
The moderator must maintain focus of the meeting on the inspection. Discussion should only point out defects to the author or ask for clarification. Any discussion about how to correct defects should be deferred to another meeting. By maintaining focus on the actual product being inspected, feelings of confrontation can be avoided. Specifically, defects logged are found in the product, not in people.
2.5.3 Defect Log
An accurate log of all defects must be kept. The log should indicate what the defect is, the defect type, where it was found, and the severity of the defect (major, minor). The defect log will be used by the author to prioritize and correct defects. It may also be used for metrics in a process improvement effort. The Defect Log should not be used in evaluating individuals; its purpose is to improve software quality through honest and open feedback, and any other use could lead to problems.

The defect log may be sufficient for many projects. However, defects may be somewhat complex and further information may need to be provided. A defect report form can be used in conjunction with the defect log to provide more information. In such circumstances, the defect log may become the index to individual defect reports.

Click here for an example Defect Log.
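The fields named above (what the defect is, its type, where it was found, its severity) suggest a simple record structure. A minimal sketch; the field names and sample defects are our own, not from the example log:

```python
from dataclasses import dataclass

@dataclass
class Defect:
    description: str   # what the defect is
    defect_type: str   # e.g. "logic", "interface", "standards"
    location: str      # where it was found
    severity: str      # "major" or "minor"

defect_log = [
    Defect("loop reads past end of buffer", "logic", "parser.c:120", "major"),
    Defect("misleading variable name", "standards", "parser.c:88", "minor"),
]

# The author works through major defects first, since section 2.7 requires
# majors to be fixed before the inspection results in approval to proceed.
rework_order = sorted(defect_log, key=lambda d: d.severity != "major")
```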

2.5.4 Inspection Report
At the conclusion of the inspection, the moderator should compile an inspection report. It should note the quantity of major and minor defects and indicate whether the project may proceed or needs rework. In the event there are defects in need of correction, an estimate of time and effort should be provided. The Inspection Report should be used to communicate to project management the results of the inspection.

If rework is required, the Inspection Report will later be amended after completion of rework. Click here for an example Inspection Report.
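The report-compiling logic described above (count major and minor defects, then decide between proceeding and rework) can be sketched as follows. The function and field names are illustrative, not from the example report:

```python
from collections import Counter

def compile_report(defects):
    """Summarize an inspection: defect counts plus a proceed/rework verdict.

    `defects` is a list of (description, severity) pairs. Any major defect
    forces rework before the project may proceed, as section 2.7 requires.
    """
    counts = Counter(severity for _, severity in defects)
    verdict = "rework" if counts["major"] > 0 else "proceed"
    return {"major": counts["major"], "minor": counts["minor"],
            "verdict": verdict}

report = compile_report([("loop reads past end of buffer", "major"),
                         ("misleading variable name", "minor")])
```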

2.6 Post-Inspection Discussion

The focus of the formal inspection meeting must be maintained. When significant issues arise during inspections, they should be deferred to a follow-up meeting. Additionally, if the author wishes to ask the inspectors or others to help determine how to correct a defect, this should be done after the completion of the inspection meeting. The follow-up meeting or meetings may be referred to as Post-Inspection Discussions or Third-hour meetings [NASA1].

2.7 Rework

After the completion of an inspection meeting, the author has the responsibility for addressing all items in the Defect Log. Items listed as major defects must be fixed before the inspection results in approval to proceed. Items with minor impact may be fixed based on cost and time justification. An estimate of time to remediate defects should be provided to the moderator so follow-up meetings can be scheduled.

2.8 Follow-up

2.8.1 Follow-up Analysis
At the completion of rework on the product inspected, the moderator will determine if the defects have been corrected in an acceptable manner. If the analysis by the moderator identifies significant risk, a new inspection can be scheduled.
2.8.2 Amended Inspection Report
When all major defects identified through the inspection process have been corrected, the Inspection Report should be amended. The moderator should recommend that the project proceed. The Inspection Report and Defect Log may also be used to collect metrics for improving the development and inspection process.

The amended Inspection Report should provide one of two recommendations for the product inspected: Accept or Reject.


3.0 Roles in Inspection Process

The inspection process is performed by an inspection team and is repeated many times on many work products during a product's development. How well the inspection teams do their job determines whether there is a net decrease in the overall development schedule and a net increase in productivity. To carry out an inspection, five specific procedural roles are assigned.

3.1 Procedural Roles

     Moderator
     Reader
     Recorder
     Author
     Inspector

3.2 Guidelines for roles

To ensure the quality, efficiency, and effectiveness of inspection teams, it is very important to carefully manage and use well-formed inspection teams. Inspection
teams need to balance several factors.
All team members are inspectors. Readers and recorders should be experienced inspectors, and the number of inexperienced inspectors should be limited if possible.
The minimum team is three (a moderator/recorder, a reader, and an author). Enough team members are needed to adequately verify the work product for the intended purpose of the inspection, but additional people reduce the effectiveness of the process.
So the inspection team should be small, with a maximum of seven persons.

3.3 Participation of inspectors

3.3.1 Planning
Roles:  - Moderator
            - Author
 

3.3.2 Overview
Roles:  - Moderator
            - Author
            - Inspectors

3.3.3 Preparation
Roles: - All inspectors

3.3.4 Inspection Meeting
Roles:   - Moderator
             - Author
             - Reader
             - Recorder
             - Inspectors

3.3.5 Discussion
Roles:  - All inspectors

3.3.6 Rework
Roles:  - Author

3.3.7 Follow-Up
Roles:  - Moderator
           - Author
 



4.0 Inspections During the Software Life Cycle

Formal inspections are in-process peer reviews conducted within the phase of the life cycle in which the product is developed. The software life cycle is the period of time that starts when a software product is conceived and ends when the product is no longer available for use. It traditionally includes the following eight phases:

     Concept and Initiation Phase
     Requirements Phase
     Architectural Design Phase
     Detailed Design Phase
     Implementation Phase
     Integration and Test Phase
     Acceptance and Delivery Phase
     Sustaining Engineering and Operations Phase.

This tutorial emphasizes inspections in the phases of the requirements, design, and implementation in software development and suggests products that may
be inspected during each phase. The software life cycle used is the NASA standard waterfall model. The following sections describe the inspections to be
conducted in the three phases.



4.1 Requirements Inspection (Fang)   currently unavailable
 

4.2 Design Inspection  (Lacy)

Design inspections are used to eliminate design defects before they propagate into the construction phase of a project. After construction begins, altering the design of a system can be very costly. Although organizations differ in their approach to inspections and reviews, the most common term used in verifying design work is the Design Review. Generally there are two levels or stages of design reviews: the Preliminary Design Review and the Critical Design Review.

4.2.1 Preliminary Design Reviews

The Preliminary Design Review (PDR) focuses on reviewing the software architecture or basic design constructs. This approach is part of a two-stage process for reviewing design. Some organizations may refer to this as a software architecture review or inspection; others use the term High Level Design review or Logical Design review [FREE]. The goal of this review is to verify the architecture of a system design before effort is applied to detailed design work.

In an object-oriented project, the PDR would focus on the package and collaboration diagrams. The PDR might include reviewing the design to ensure compliance with a corporate component framework. A verification that all requirements for the system map to a software component may also be done in this phase. Some organizations require that a test matrix be defined and reviewed in the PDR [NASA2].

PDRs can be facilitated by the use of checklists for reviewers. Since the PDR is at a high level, the checklist can be useful for a wide range of projects implemented with varying technologies. The NASA Jet Propulsion Lab uses an Architecture Design Checklist. Click here for the Checklist

4.2.2 Critical Design Reviews

The Critical Design Review (CDR) is the second stage in the design review/inspection process. At this stage, thorough inspection of the detailed design occurs. Often many inspections are completed, and the CDR reviews their outcome. The CDR should be used to ensure that design and documentation standards are met. It should also validate that the design inspections are complete and that the resulting defects have been resolved.

Each inspection of the design completed as part of or before the CDR follows the basic inspection process outlined in section 2 of this chapter: planning, preparation, inspection, discussion, rework, and follow-up. The inspection should result in a defect log that may be customized for the design inspection process. The inspection report provides management with an assessment of readiness to begin construction of the system.

CDRs can also benefit from a standard checklist of design considerations. Click here for the Detail Design Checklist from JPL.



4.3 Code Inspections

Code and all new documentation are candidates for inspection during this phase. Code inspections should check for technical accuracy and completeness of the code, verify that it implements the planned design, and ensure that good coding practices and standards are used. Code inspections should be done after the code has been compiled and all syntax errors removed, but before it has been unit tested. Other candidates are the integration and test plan and procedures, and other documents that have been produced. Documents should be inspected for accuracy, completeness, and traceability to higher-level documents. The inspection team may be selected from participants in the detailed design, code, test, verification and validation, or from software quality assurance [NASA1].

4.3.1 Purpose
According to [STRAUSS] and [SCHULMEYER], the basic purpose of the code inspection is to verify the code before program testing proceeds. Code inspections for each software module are typical of the pass/fail events that serve as milestones in software development schedules. The result of a successful code inspection should be complete code that conforms to the high-level design, low-level design, and PPS.

4.3.2 Materials to be distributed
The following materials should be distributed prior to the inspection.

4.3.3 Entrance criteria

4.3.4 Reference materials

4.3.5 Coverage rates

4.3.6 Participants and roles
The roles required for a code inspection are the standard procedural roles. It is effective for the code being inspected to be examined from several recommended viewpoints, and it is acceptable for a single inspector to take more than one viewpoint. When possible, these roles should be filled by the people who are actually responsible for the effort. The inspection team should include at least one peer other than the author who has experience writing in the language being inspected.

4.3.7 Inspection procedures
Examine the code for conformance to the detailed design, conformance to the coding standards, and correctness. Focus on control logic, linkage parameters, internal and external interfaces, and data definitions and usage. Look for performance, structuring, and storage problems.

4.3.8 Exit criteria

4.3.9 Checklist
The checklist is a popular defect detection method; its contents depend on the features of the materials to be inspected. The code inspection checklist (I2) in [NASA1] includes a checklist for C and a checklist for Fortran. Gilb and Graham [GILB] define two key characteristics of a checklist. Dunsmore commented that the first highlights that checklists change over time, to keep up to date with the most frequent defects found in previous inspections. The second emphasises the use of checklists to deliver a series of questions that have the highest probability of finding major defects. Both points highlight the fact that checklists are used to emphasize certain areas of code that have a higher probability of containing errors, which may preclude an inspector from gaining a complete understanding of all the code to be inspected [DUNS]. To show how a checklist works, an example code inspection applies the checklist for C/C++ (click for checklist) from [MAC] to a piece of code and its basic requirements. The results of defect detection are recorded in the preparation log and the defect log.
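Gilb and Graham's first point, that checklists evolve to track the most frequent defects found in previous inspections, can be sketched as a reordering step. The defect history below is entirely hypothetical:

```python
from collections import Counter

# Hypothetical history: which checklist question flagged each past defect.
defect_history = ["uninitialized variable", "off-by-one loop bound",
                  "uninitialized variable", "missing null check",
                  "uninitialized variable", "missing null check"]

def reorder_checklist(history):
    """Put the questions with the highest historical defect yield first,
    so inspectors spend their limited preparation time where it pays most."""
    return [question for question, _ in Counter(history).most_common()]

checklist = reorder_checklist(defect_history)
# "uninitialized variable" now leads the list, having caught the most defects.
```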



5.0 Inspection Tools and Techniques

This section surveys existing inspection tools and techniques, available either commercially or free of charge. Dunsmore [DUNS] evaluated several tools that had been created for inspections, such as ASSIST, Scrutiny, ICICLE, CSI, WiP, ReviewPro and CheckMate. The following sections briefly describe some current inspection tools and highlight any features they have that could help in the comprehension of code being inspected.

5.1 ASSIST
ASSIST (Asynchronous/Synchronous Software Inspection Support Tool) is an inspection tool designed by F. Macdonald to support any inspection process and allow inspection of any type of document. This is a research tool that uses a custom-designed process modeling language called IPDL (Inspection Process Definition Language). ASSIST is based on a client/server architecture, where the server is used as a central repository of documents and other data. ASSIST supports both individual and group-based phases of inspection. Group-based phases can be performed synchronously or asynchronously, with the choice of same-place or different-place synchronous meetings.

Macdonald created a generic software inspection template, which would cater for all current inspection processes and be versatile enough to cope with any future processes. This generic template was converted into a process definition language, and was embedded into ASSIST.

Facilities within ASSIST include defect finding aids, enhanced document representations, facilities for metric collection and analysis, and support for distributed inspection. ASSIST provides an on-line checklist, which can be marked off as each item on the list is covered, as well as a text browser used to view the product under inspection. The text browser can be used to highlight areas of the document and add annotations; the annotations describe defects that have been found in the document [DUNS].

ASSIST is freely available for research purposes. It currently runs on SunOS 4.1.3, Solaris 5.1 and OSF/1 platforms. It requires Python 1.4 and Tcl 7.4 / Tk 4.0. Full details are available on the ASSIST homepage: http://www.cs.strath.ac.uk/CS/research/efocs/assist.html

5.2 Scrutiny
Scrutiny is an on-line inspection system developed by Bull HN Information Systems in conjunction with the University of Illinois. It is a general inspection support tool which can support distributed inspections. The inspection method used by Scrutiny is based on four phases. The first phase, Initiation, is where the inspection team is formed and the moderator prepares the necessary documentation. In phase two, Preparation, the inspectors create their annotations of the documents presented for inspection. Phase three, Resolution, is equivalent to the inspection meeting. The final phase, Completion, encompasses both the rework and follow-up stages of the inspection process.

Scrutiny supports only text documents, but has been designed as an open tool, in the hope of integrating other tools at a later date. In the product window, which displays the document being inspected, areas of text can be highlighted and an annotation assigned to them. There is no comprehension support available in Scrutiny, and no support for checklists [DUNS].

A related paper, "Scrutiny: A Collaborative Inspection and Review System" by John W. Gintell, John Arnold, Michael Houde, Jacek Kruszelnicki, Roland McKenney, and Gerard Memmi (Proceedings of the Fourth European Software Engineering Conference, Garmisch-Partenkirchen, Germany, September 1993), can be found at URL: http://www.ics.hawaii.edu/~johnson/FTR/Bib/Gintell93.ps

5.3 ICICLE
Intelligent Code Inspection in a C Language Environment (ICICLE) is an intelligent inspection assistant for the inspection of C code. A major difference between this tool and others is that it tries to find common defects itself, in an attempt to help the inspector by removing the more obvious errors in the code. ICICLE achieves this through its own rule-based static debugging tool and the UNIX lint tool.

ICICLE supports a two-phase inspection: the individual inspection and the inspection meeting. Group meetings are held in the same room; no distributed facilities are provided. In the individual inspection stage, inspectors can produce comments for each line of code. A referencing system is supplied for variables and functions, allowing quick movement through the code; e.g., selecting a variable name moves you to its point of declaration. A hypertext browser is also available, providing domain-specific knowledge. No other facilities are provided for comprehension or defect detection [DUNS].

A related paper, "ICICLE: Groupware for Code Inspection" by L. Brothers, V. Sembugamoorthy, and M. Muller, can be found in Proceedings of the 1990 ACM Conference on Computer Supported Cooperative Work, pages 169-181, October 1990.

5.4 CSI
CSI (Collaborative Software Inspection) is an on-line inspection environment, developed at the University of Minnesota, that allows distributed inspections. All material is available on-line, and inspection products are created on-line. CSI was designed to cope with four types of collaborative inspection meeting: 1) same time, same place; 2) same time, different place; 3) different time, same place; 4) different time, different place. CSI supports both synchronous activities, e.g. group meetings, and asynchronous activities, e.g. individual checking.

CSI contains a browser that displays the material under inspection (currently text only), with hyperlinks from the inspected material to a fault list, note pad, inspection summary, and action list. Hyperlinks also allow easy navigation between different areas of the system, but there is currently no support for the preparation stage, for checklists, or for any other form of defect detection.

A related paper, "Distributed, Collaborative Software Inspection" by Vahid Mashayekhi, Janet Drake, Wei-Tek Tsai, and John Riedl, can be found in IEEE Software, Volume 10, Number 5, September 1993.

5.5 WiP
The WiP tool is designed to support distributed inspections, attempting to solve the problem of a scattered inspection team. WiP utilizes the World Wide Web and is designed to distribute the documents to be inspected, allow annotation of those documents, search related documents, allow selection of a checklist, and gather inspection statistics. For the preparation stage of inspection, users are given access to source documents and checklists, as well as informal information which could lead to a better understanding of the given documents. The WiP interface also contains hyperlinks to allow easy navigation through the documents. Annotations made during the inspection are not made to the documents directly, to avoid multiplication of data; they are instead kept separately and sent back to the main server. By the authors' own admission, WiP was designed primarily as an investigation into the possibility of carrying out inspections over the World Wide Web, and not as a complete inspection tool. As with ASSIST, WiP is aimed at enforcing the rigours of the overall inspection process [DUNS].

Evaluation of the tool indicated that, despite some minor shortcomings, it was well liked. Test users found on-line commenting convenient, and eliminating the need to juggle a variety of paper documents and reports made inspections more effective. Many test users liked the simple statistics, and the tool was easy to learn and use.

A related paper, "A WWW-Based Tool for Software Inspection" by Lasse Harjumaa and Ilkka Tervonen (University of Oulu), can be found in Proceedings of the 31st Hawaii International Conference on System Sciences (HICSS'98), published by the IEEE Computer Society.

5.6 ReviewPro
ReviewPro is a commercial, web-based technical review and inspection tool developed by Software Development Technologies Corporation. It automates the review and inspection process, with the aim of preventing the migration of defects to later phases of software development. ReviewPro is architecturally independent of the Web browser, Web server, and messaging server software, and runs on both Windows NT and Unix server platforms.

5.7 CheckMate
CheckMate enables a software inspection group to automatically inspect C and C++ source code against a pre-determined coding policy. CheckMate allows users to configure the coding policy or standard according to the developers' needs. It also keeps a list of software metric statistics. Basically, it allows the inspectors to concentrate on the functionality and architecture of the classes and their methods. CheckMate is available for all Windows platforms, with UNIX/VMS versions and Visual Basic support under development. More information can be found at http://www.sybernet.ie/source/projects_frame.html
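The rule-based style of checking that tools like CheckMate perform can be illustrated with a small sketch. The function below is purely hypothetical (it is not CheckMate's implementation); it enforces a single rule taken from the JPL C checklist later in this chapter, that constant names defined via "#define" be upper case:

```cpp
#include <cctype>
#include <sstream>
#include <string>
#include <vector>

// Hypothetical illustration of rule-based coding-policy checking:
// return the names of any "#define" constants that are not all upper case
// (JPL C checklist, Constants item 1). Not a real tool, just a sketch.
std::vector<std::string> check_define_case(const std::string& source) {
    std::vector<std::string> violations;
    std::istringstream in(source);
    std::string line;
    while (std::getline(in, line)) {
        std::istringstream ls(line);
        std::string token, name;
        ls >> token;
        if (token == "#define" && (ls >> name)) {
            for (char c : name) {
                if (std::islower(static_cast<unsigned char>(c))) {
                    violations.push_back(name);  // lower-case letter found: flag it
                    break;
                }
            }
        }
    }
    return violations;
}
```

A real policy checker would parse the code properly rather than scan lines, but the principle is the same: each configured rule is applied mechanically, leaving the human inspectors free to concentrate on functionality and design.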


7.0 References

[COLL] Collofello, James S., "The Software Technical Review Process: SEI Curriculum Module SEI-CM-3-1.5", Carnegie Mellon University Software Engineering Institute, June 1988.

[DUNS] Alastair Dunsmore, "Comprehension and Visualisation of Object-Oriented Code for Inspections", URL: http://www2.umassd.edu/SWPI/EFoCS/EFoCS-33-98.pdf

[MAC] Macdonald, F. and J. Miller, "A Comparison of Tool-Based and Paper-Based Software Inspection", URL: http://www2.umassd.edu/SWPI/ISERN/ISERN-98-17.pdf

[FREE] Freedman, Daniel P. and Gerald M. Weinberg, "Handbook of Walkthroughs, Inspections, and Technical Reviews: Evaluating Programs, Projects, and Products", Dorset House Publishing, New York, 1990.

[HOLL] Hollocker, C. P., "Software Reviews and Audits Handbook", John Wiley & Sons, 1990.

[IEEE1] Institute of Electrical and Electronics Engineers, "IEEE Standard for Software Reviews", sponsored by the Software Engineering Standards Committee of the IEEE Computer Society, New York, 1998.

[NASA1] National Aeronautics and Space Administration. "Software Formal Inspections Guidebook", NASA-GB-A302. August, 1993.

[NASA2] National Aeronautics and Space Administration. "Software Assurance Guidebook", NASA-GB-A201.

[NASA3] National Aeronautics and Space Administration, "Software Formal Inspections Standard", NASA-STD-2202-93, April 1993.

[NASA4] National Aeronautics and Space Administration, "NASA Guidebook and Standard".

"Reviews and Inspections" (this usefull link is temperarily listed here)

"Brad Appleton's Software Engineering Links" (this usefull link is temperarily listed here)

"Software Quality Links" (this usefull link is temperarily listed here)

[PRESSMAN] Pressman, R., "Software Engineering: A Practitioner's Approach (4th Edition)", McGraw-Hill, 1997.

[STRAUSS] Strauss, S. H. and R. G. Ebenau, "Software Inspection Process", McGraw-Hill, Inc, 1994.

[GILB] Gilb, T. and D. Graham, "Software Inspections", Addison-Wesley, 1993.

[SCHULMEYER] Schulmeyer G. G. and J. I. McManus, "Handbook of Software Quality Assurance", Van Nostrand Reinhold Company Inc., 1987.



8.0 Self-assessment

These questions are to help you assess your understanding of the concepts in this article.
 
1. What is the purpose of an inspection?

A. To make a go/no-go decision to move to the next phase.

B. To find defects through a detailed examination of the product.

C. To define the methods for developing software.

D. To allow managers to evaluate personnel.

2. What is the difference between a review and an inspection?

A. They are the same.

B. An inspection is a project management check and a review finds errors.

C. An inspection is a formal process to detect defects; a review is a management process to determine whether the product is ready to move to the next phase.

D. A review is a sub-set of an inspection.

3. What is a defect?

A. A violation of a standard or a procedure.

B. A sub-optimal design approach.

C. Code that will not compile.

D. A difference in design approach.

 

4. Identify the five valid roles in an inspection.

A. Author

B. Leader

C. Coordinator

D. Recorder

E. Secretary

F. Moderator

G. Reader

H. Manager

I. Inspectors

ANSWERS: 1-B, 2-C, 3-A, 4-A, D, F, G, I



Short Answer Questions.
 
  1. Name and describe the seven steps in the software inspection process.
  2. Which two steps in the inspection process are optional?
  3. Name the deliverables for each step of the software inspection process.
  4. Should managers participate in a software inspection? Why or why not?
  5. What is the purpose of the Preparation Log?
  6. What is the purpose of the Defect Log?
  7. Describe the two different types of design inspections.


9.0 Glossary

Roles

Author(s)

Person or persons primarily responsible for creating a work product. The member of the inspection team that provides information about the work product during all stages of the inspection process and corrects defects during the rework stage. (Also known as (AKA) "Owner(s)".)
Inspection Team
A small group of peers who have a vested interest in the quality of the inspected work product and perform the inspection. This group usually ranges in size from 3 to 8 people and can be selected from various areas of the development life cycle (requirements, design, implementation, testing, quality assurance, user, etc.). Selected members of the inspection team fulfill the roles of moderator, author, reader, and recorder.
Inspector
A person whose responsibilities include reviewing work products created by others. All members of an inspection team should be considered inspectors.
Moderator
Person who is primarily responsible for facilitating and coordinating an inspection. When there is no "Reader" in the inspection process, the moderator also controls the pace of review of the work product during the inspection meeting.
Reader
Person who guides the team during the inspection meeting by reading or paraphrasing the work product. The role of the reader is usually fulfilled by a member of the inspection team other than the author(s). Not all inspection methods use this role.
Recorder
Person who records, in writing, each defect found and its related information (severity, type, etc.) during the inspection meeting. (AKA "Scribe".)
Process

Formal Inspection

Inspection

Inspection Stages

Planning Stage
Period of time in which details for an inspection are decided and necessary arrangements are made. These usually include checking to ensure that entry criteria have been met, selecting an inspection team, finding a time and place for the inspection meeting, and deciding whether an overview meeting is needed.
Overview Meeting Stage
Meeting where the author(s) present background information on the work product for the inspection team. An overview meeting is held only when the inspection team needs background information to efficiently and effectively examine the work product. (AKA Kickoff Stage.)
Kickoff Stage
The kickoff stage is used to brief the inspection team on the contents of the inspection packet, inspection objective(s), inspector's defect finding role(s), logistics for the inspection meeting, recommended preparation time and preparation stage data to be collected. The moderator can elect to hold a short (5 to 30 minutes) meeting or may use any other method that will accomplish briefing the team.
Preparation Stage
Period of time in which inspectors individually study and examine the work product. Usually a checklist is used to suggest potential defects in the work product.
Inspection Meeting Stage
Meeting where the work product is examined for defects by the entire inspection team. The results of this meeting are recorded in a defect list (defect log).
Third Hour Stage
Time allotted for members of the inspection team to resolve open issues and suggest solutions to known defects (or Causal Analysis Stage).
Causal Analysis Stage
Time allotted for the inspection team to analyze defect causes and/or inspection process problems and, if possible, to determine solutions for those problems.
Rework Stage
Time allotted for the author(s) to correct defects found by the inspection team.
Follow-Up Stage
Short meeting between the author(s) and moderator to verify that defects found during the inspection meeting have been corrected and that the exit criteria have been met. When the exit criteria have not been met, the inspection team repeats the inspection stages described under "Process".


Materials

Checklist

A list of items posed as questions summarizing potential technical problems for an inspection. Checklists are used during the preparation stage to suggest potential defects and again during the inspection meeting stage to classify defects according to their type.
Defect

Defect Classification
The process where defects identified during an inspection are classified by severity and type.
Entry Criteria
A set of measurable actions that must be completed before the start of a given task.
Exit Criteria
A checklist of activities or work items that must be complete or exist, respectively, prior to the end of a given process stage, activity, or subactivity.
Inspection Package
The collection of work products and corresponding documentation presented for inspection, as well as required and appropriate reference materials.
Severity
A degree or category of magnitude for the ultimate impact or effect of executing a given software fault, regardless of probability. The severity of a defect is generally classified as "Major" or "Minor".
Major
A defect which, if not identified and removed in a work product or a subsequent work product, could result in a test or customer/field-reported problem.
Minor
All other defects.
Type of Inspection
A group of inspections which share a common type of work product, life cycle phase, checklist, entry criteria, and exit criteria. Common inspections types include (but are not limited to): systems requirements, system design, software requirements, software detailed design, source code, test plans, test procedures, user documentation, plans, standards, procedures, training materials, hardware diagrams, interface specifications, and the SIRO Newsletter.
Work Product
The output of a task. Formal work products are delivered to the acquirer. Informal work products are necessary to an engineering task but not deliverable. A work product may also be an input to a task.


I2 - Code Inspection Checklist (JPL) - C language

FUNCTIONALITY
1. Does each module have a single function?
2. Is there code which should be in a separate function?
3. Is the code consistent with performance requirements?
4. Does the code match the Detailed Design? (The problem may be in either the code or the design.)

DATA USAGE
A. Data and Variables
1. Are all variable names lower case?
2. Are names of all internals distinct in 8 characters?
3. Are names of all externals distinct in 6 characters?
4. Do all initializers use "="? (v.7 and later; in all cases should be consistent.)
5. Are declarations grouped into externals and internals?
6. Do all but the most obvious declarations have comments?
7. Is each name used for only a single function (except single character variables "c", "i", "j", "k", "n", "p", "q", "s")?
B. Constants
1. Are all constant names upper case?
2. Are constants defined via "# define"?
3. Are constants that are used in multiple files defined in an INCLUDE header file?
C. Pointers Typing
1. Are pointers declared and used as pointers (not integers)?
2. Are pointers not typecast (except assignment of NULL)?

CONTROL
1. Are "else__if" and "switch" used clearly? (generally "else__if" is clearer, but "switch" may be used for not-mutually-exclusive cases, and may also be faster).
2. Are "goto" and "labels" used only when absolutely necessary, and always with well-commented code?
3. Is "while" rather than "do-while" used wherever possible?

LINKAGE
1. Are "INCLUDE" files used according to project standards?
2. Are nested "INCLUDE" files avoided?
3. Is all data local in scope (internal static or external static) unless global linkage is specifically necessary and commented?
4. Are the names of macros all upper case?

COMPUTATION
A. Lexical Rules for Operators
1. Are unary operators adjacent to their operands?
2. Do primary operators "->" "." "()" have a space around them? (They should have none.)
3. Do assignment and conditional operators always have space around them?
4. Are commas and semicolons followed by a space?
5. Are keywords followed by a blank?
6. Is the "(" following a function name adjacent to the identifier?
7. Are spaces used to show precedence? If precedence is at all complicated, are parentheses used (especially with bitwise ops)?
B. Evaluation Order
1. Are parentheses used properly for precedence?
2. Does the code depend on evaluation order, except in the following cases?
a. expr1, expr2
b. expr1 ? expr2 : expr3
c. expr1 && expr2
d. expr1 || expr2
3. Are shifts used properly?
4. Does the code depend on order of effects? (e.g., i = i++;)?
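The precedence and evaluation-order items above are easy to illustrate. The sketch below is ours (the function names are invented for illustration); it shows the classic parenthesization trap the checklist warns about with bitwise operators: "==" binds tighter than "&", so an unparenthesized mask test does not mean what it appears to say.

```cpp
// Checklist items A.7 and B.1: use parentheses to make precedence explicit,
// especially with bitwise operators.

// "flags & mask == 0" parses as "flags & (mask == 0)" because "=="
// has higher precedence than "&" - almost never the intended meaning.
bool mask_clear_buggy(unsigned flags, unsigned mask) {
    return flags & mask == 0;        // precedence bug left in deliberately
}

// The intended test, with parentheses showing the precedence.
bool mask_clear_intended(unsigned flags, unsigned mask) {
    return (flags & mask) == 0;
}
```

With flags = 4 and mask = 2, the buggy version reports false while the intended version reports true, exactly the kind of silent logic error an inspector should catch at this point in the checklist.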

MAINTENANCE
1. Are library routines used?
2. Are non-standard usages isolated in subroutines and well documented?
3. Does each module have one exit point?
4. Is the module easy to change?
5. Is the module independent of specific devices where possible?
6. Is the system standard defined types header used if possible (otherwise use project standard header, by "include")?
7. Is use of "int" avoided (use standard defined type instead)?

CLARITY
A. Comments
1. Is the module header informative and complete?
2. Are there sufficient comments to understand the code?
3. Are the comments in the modules informative?
4. Are comment lines used to group logically-related statements?
5. Are the functions of arrays and variables described?
6. Are changes made to a module after its release noted in the development history section of the header?
B. Layout
1. Is the layout of the code such that the logic is apparent?
2. Are loops indented and visually separated from the surrounding code?
C. Lexical Control Structures
Is a standard project-wide (or at least consistent) lexical control structure pattern used:
e.g.

while (expr)
{
stmts;
}
or
while (expr) {
stmts;
}
ETC.



I2 - Code Inspection Checklist (JPL) - FORTRAN

FUNCTIONALITY
1. Do the modules meet the design requirements?
2. Does each module have a single purpose?
3. Is there some code in the module which should be a function or a subroutine?
4. Are utility modules used correctly?
5. Does the code match the Detailed Design specifications? If not, the design specifications may be in error.
6. Does the code impair the performance of the module (or program) to any significant degree?

DATA USAGE
A. General
1. Is the data defined?
2. Are there undefined or unused variables?
3. Are there typos, particularly "O" for zero, and "l" for one?
4. Are there misspelled names which are compiled as function or subroutine references?
5. Are declarations in the correct sequence? (DIMENSION, EQUIVALENCE, DATA).
B. Common/Equivalence
1. Are there local variables which are in fact misspellings of a COMMON element?
2. Are the elements in the COMMON in the right sequence?
3. Do EQUIVALENCE statements force any unintended shared data storage?
4. Is each EQUIVALENCE commented?
C. Arrays
1. Are all arrays DIMENSIONed?
2. Are array subscript references in column, row order? (Check all indices in multi-dimensioned arrays.)
3. Are array subscript references within the bounds of the array?
4. Are array subscript references checked in critical cases?
5. Is each array used for only one purpose?
D. Variables
1. Are the variables initialized in DATA statements, BLOCK DATA, or previously defined by assignments or COMMON usage?
2. Should variables initialized in DATA statements actually be initialized by an assignment statement; that is, should the variable be initialized each time the module is invoked?
3. Are variables used for only one purpose?
4. Are variables used for logical unit assignments?
5. Are the correct types (REAL, INTEGER, LOGICAL, COMPLEX) used?
E. Input and Output
1. Do FORMATs correspond with the READ and WRITE lists?
2. Is the intended conversion of data specified in the FORMAT?
3. Are there redundant or unused FORMAT statements?
4. Should this module be doing any I/O? Should it be using a message facility?
5. Are messages understandable?
6. Are messages phrased with the correct grammar? Do messages read like a robot or person talking? Robot: "Mount tape on drive. Turn on." Person: "Mount the tape on the tape drive. Then turn the tape drive on."
7. Does each line of a message fit on all of the expected output devices?
F. Data
1. Are all logical unit numbers and flags assigned correctly?
2. Is the DATA statement used and not the PARAMETER statement?
3. Are constant values constant?

CONTROL
A. Loops
1. Are the loop parameters expressed as variables?
2. Is the initial parameter tested before the loop in those cases where the initial parameter may be greater than the terminal parameter?
3. Is the loop index within the range of any array it is subscripting? Is there a check in critical cases such as COMMONs?
4. Is the index variable only used within the DO loop?
5. If the value of the index variable is required outside the loop, is it stored in another location?
6. Does the loop handle all the conditions required?
7. Does the loop handle error conditions?
8. Does the loop handle cases which may "fall through"?
9. Is loop nesting in the correct order?
10. Can loops be combined?
11. If possible, do nested loops process arrays as they are stored, with the innermost loop processing the first index (column index) and outer loops processing the row index?
B. Branches
1. Are branches handled correctly?
2. Are branches commented?
3. When using computed GO TOs, is the fall-through case tested, checked, and handled correctly?
4. Are floating point comparisons done with tolerances and never made to an exact value?
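Item B.4 deserves a sketch. Although this checklist targets FORTRAN, the rule applies in any language; a common C++ formulation is shown below (the tolerance value and scaling are assumptions chosen for illustration, not a project standard):

```cpp
#include <algorithm>
#include <cmath>

// Checklist item B.4: never compare floating point values for exact
// equality; use a tolerance scaled to the magnitude of the operands.
// The default tolerance of 1e-9 is illustrative only.
bool nearly_equal(double a, double b, double tol = 1e-9) {
    double scale = std::max(1.0, std::max(std::fabs(a), std::fabs(b)));
    return std::fabs(a - b) <= tol * scale;
}
```

For example, 0.1 + 0.2 is not exactly equal to 0.3 in binary floating point, so an exact "==" test fails where a tolerance-based comparison succeeds.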

LINKAGE
1. Does the CALLing program have the same number of parameters as each routine?
2. Are the passed parameters in the correct order?
3. Are the passed parameters the correct type? If not, are mismatches proper?
4. Are constant values passed via a symbol (variable) rather than being passed directly?
5. Is an unused parameter named DUMMY, or some name which reflects its inactive status?
6. Is an array passed to a subroutine only when an array is defined in the subroutine?
7. Are the input parameters listed before the output parameters?
8. Does the subroutine return an error status output parameter?
9. Do the return codes follow conventions?
10. Are arrays used as intended?
11. If array dimensions are passed (dynamic dimensioning) are they greater than 0?
12. If a subroutine modifies an array, are the indices checked, or are the dimensions passed as parameters?
13. Does a subroutine modify any input parameter? If so, is this fact clearly stated?
14. Do subroutines end with a RETURN statement and not a STOP or a CALL EXIT?
15. Does a FUNCTION routine have only one output value?

COMPUTATION
1. Are arithmetic expressions evaluated as specified?
2. Are parentheses used correctly?
3. Is the use of mixed-mode expressions avoided?
4. Are intermediate results stored instead of recomputed?
5. Is all integer arithmetic involving multiplication and division performed correctly?
6. Do integer comparisons account for truncation?
7. Are complex numbers used correctly?
8. Is the precision length selected adequate?
9. Is arithmetic performed efficiently?
10. Can a multiplication be used instead of a division? If so, is it commented so as not to obscure the process?

MAINTENANCE
1. Are library routines used?
2. Is non-standard FORTRAN isolated in subroutines and well documented?
3. Is the use of EQUIVALENCE limited so that it does not impede understanding the module?
4. Is the use of GO TOs limited so that it does not impede understanding the module?
5. Does each module have one exit point?
6. Is there no self-modifying code? (No ASSIGN statements, or PARAMETER statements.)
7. Is the module easy to change?
8. Is the module independent of specific devices where possible?
9. Where possible, are the CALLing routine parameter names the same as the subroutine parameter names?
10. Are type declarations implicit rather than explicit when possible?

CLARITY
1. Is the module header informative and complete?
2. Are there sufficient comments to understand the code?
3. Are the comments in the modules informative?
4. Are comment lines used to group logically-related statements?
5. Are the functions of arrays and variables described?


/********************************************************************************************
Training Program simple_sort.cc
Specification for program "simple_sort"
Name:     simple_sort - sort a list of numbers entered by the user
Usage:    simple_sort
Description:
                simple_sort starts by prompting for the number of items to be sorted. The program then reads in the list of numbers from the user, sorts them into ascending numerical order, then prints the sorted list.
Options:  None.
Example: Sorting a list of ten numbers:
% simple_sort
Enter the number of data values: 10
Data item 0: 5
Data item 1: 6
Data item 2: 7
Data item 3: 8
Data item 4: 2
Data item 5: 3
Data item 6: 9
Data item 7: 1
Data item 8: 4
Data item 9: 10
Sorted list:
Data item 0: 1
Data item 1: 2
Data item 2: 3
Data item 3: 4
Data item 4: 5
Data item 5: 6
Data item 6: 7
Data item 7: 8
Data item 8: 9
Data item 9: 10
Restrictions:  The number of elements which can be sorted is currently limited to 100.
Author:    Fraser Macdonald
********************************************************************************************/



// Code: simple_sort.cc

1       #include <iostream.h>
2
3       const int TABLESIZE = 100;
4
5       void swap(int x, int y)
6       {
7           int temp = x;
8           x = y;
9           y = temp;
10     }
11
12     int max(int x, int y)
13     {if (x > y) return x; else return y;}
14
15     int main()
16     {
17         int size;
18         int table[TABLESIZE];
19
20         cout << "Enter the number of data values: ";
21         cin >> size;
22
23         if(size >= TABLESIZE)
24             cout << "Too many elements, maximum is " << TABLESIZE << endl;
25         else {
26             for(int i = 0; i < size; i++){
27                 cout << "Data item " << i << ": ";
28                 cin >> table[i];
29             }
30             for(i = size - 1; i > 0; i--)
31                 for(int j = 0; j <=i - 1; j++)
32                     if(table[j] > table[j+1])
33                         swap(table[j], table[j+1]);
34
35             cout << endl << "Sorted lits:" << endl;
36             for(i = 0; i < size; i++)
37             cout << "Data item " << i << ": " << table[i] << endl;
38         }
39         return(0);
40     }
41


C/C++ Code Checklist

1. Specification

     Is the functionality described in the specification fully implemented by the code?
     Is there any excess functionality implemented by the code but not described in the specification?
     Is the program interface implemented as described in the specification?

2. Initialisation and Declarations

     Are all local and global variables initialised before use?
     Are variables and class members i) required, ii) of the appropriate type and iii) correctly scoped?

3. Function Calls

     Are parameters presented in the correct order?
     Are pointers and & used correctly?
     Is the correct function being called, or should it be a different function with a similar name?

4. Arrays

     Are there any off-by-one errors in array indexing?
     Can array indexes ever go out-of-bounds?

5. Pointers and Strings

     Check that pointers are initialised to NULL
     Check that pointers are never unexpectedly NULL
     Check that all strings are identified by pointers and are NULL-terminated at all points in the program

6. Dynamic Storage Allocation

     Is too much/too little space allocated?

7. Output Format

     Are there any spelling or grammatical errors in displayed output?
     Is the output formatted correctly in terms of line stepping and spacing?

8. Computation, Comparisons and Assignments

     Check order of computation/evaluation, operator precedence and parenthesising
     Can the denominator of a division ever be zero?
     Is integer arithmetic, especially division, ever used inappropriately, causing unexpected truncation/rounding?
     Are the comparison and boolean operators correct?
     If the test is an error-check, can the error condition actually be legitimate in some cases?
     Does the code rely on any implicit type conversions?
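The integer-arithmetic item above is a frequent source of subtle defects, so a small sketch may help (the function names are ours, for illustration): dividing two ints silently truncates, and the fix is to promote to floating point before the division, not after.

```cpp
// Checklist: "Is integer arithmetic, especially division, ever used
// inappropriately, causing unexpected truncation/rounding?"

// Integer division truncates toward zero: the average of 3 and 4 is 3 here.
int int_average(int a, int b) {
    return (a + b) / 2;          // truncates the fractional part
}

// Promoting one operand to double before dividing preserves the fraction.
double real_average(int a, int b) {
    return (a + b) / 2.0;        // 2.0 forces floating point division
}
```

Note that writing `double(( a + b) / 2)` would not help: the truncation happens inside the integer division before the conversion, which is exactly the pattern this checklist item asks inspectors to look for.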

9. Flow of Control

     In a switch statement, is any case not terminated by break or return?
     Do all switch statements have a default branch?
     Are all loops correctly formed, with the appropriate initialisation, increment and termination expressions?
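The two switch items above can be shown in one short sketch. The example is ours, reusing the Major/Minor severity terms from the glossary purely as illustrative data; each case is terminated (here by return, which prevents fall-through just as break does) and a default branch catches unexpected values.

```cpp
#include <string>

// Checklist: every case terminated by break or return,
// and every switch given a default branch.
std::string severity_name(int code) {
    switch (code) {
        case 1:  return "Major";    // return ends the case: no fall-through
        case 2:  return "Minor";
        default: return "Unknown";  // default handles unexpected codes
    }
}
```

Had the first case lacked its return (or a break), code 1 would fall through and report "Minor"; had the default been omitted, an unexpected code would slip through the switch unhandled.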

10. Files

     Are all files properly declared and opened?
     Is a file not closed in the case of an error?
     Are EOF conditions detected and handled correctly?


Inspection Preparation Log

Product: Simple Sort
Author/Team: Fraser Macdonald

Defect #1
Description: Parameters are passed by value, not by reference. "swap" doesn't correctly swap the numbers, so the sort is not carried out correctly.
Type: function calls    Location: line 5, function swap()    Severity: failure

Defect #2
Description: Function max() is defined, but never used. No failure apparently, but a checklist violation.
Type: function calls    Location: line 12, function max()    Severity: trivial

Defect #3
Description: ">=" should be ">". The program only accepts one less than the true maximum number of elements.
Type: comparisons    Location: line 23, function main()    Severity: sometimes errors

Defect #4
Description: "list" is spelled incorrectly in the message. The program displays incorrect output.
Type: output format    Location: line 35, function main()    Severity: bad quality of output

Inspector: Ge Fan
Date Received: October 23, 1999
Date Completed: October 23, 1999
Time Spent: 20 min



Inspection Defect Log

Product: Simple Sort
Date: October 23, 1999
Author/Team: Fraser Macdonald

Defect #1
Description: Parameters are passed by value, not by reference. "swap" doesn't correctly swap the numbers, so the sort is not carried out correctly.
Type: function calls    Location: line 5, function swap()    Severity: failure

Defect #2
Description: Function max() is defined, but never used. No failure apparently, but a checklist violation.
Type: function calls    Location: line 12, function max()    Severity: trivial

Defect #3
Description: ">=" should be ">". The program only accepts one less than the true maximum number of elements.
Type: comparisons    Location: line 23, function main()    Severity: sometimes errors

Defect #4
Description: "list" is spelled incorrectly in the message. The program displays incorrect output.
Type: output format    Location: line 35, function main()    Severity: bad quality of output

Moderator: Noname1
Recorder: Noname2
Inspectors: Ge Fan
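Defect #1 in the log above is worth sketching, since its rework is the least obvious of the four. The sketch below is ours (function names invented to avoid clashing with std::swap): a by-value swap exchanges only local copies, while the corrected version takes references so the caller's variables are exchanged.

```cpp
// Defect #1, as logged: swap's parameters passed by value exchange only
// copies, leaving the caller's array elements untouched.
void swap_by_value(int x, int y) {
    int temp = x;   // x and y are local copies
    x = y;
    y = temp;       // caller's variables are unchanged
}

// The rework: pass by reference so the caller's values are exchanged.
void swap_fixed(int& x, int& y) {
    int temp = x;
    x = y;
    y = temp;       // caller's variables really swap
}
```

In the follow-up stage the moderator would verify that this change (along with the ">=" comparison and the "Sorted list" message) has been made and that the program now sorts correctly.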