The Project will conduct a software assurance program that shall include quality assurance, verification and validation, quality engineering, safety assurance, and security and privacy assurance. The Project Software Assurance Manager (SAM) shall be responsible for planning and execution of the assurance program. The following paragraphs detail the Project's plan, and specify software assurance requirements for software providers.
Each software provider shall conduct a software assurance program that satisfies the provider requirements in this document and that satisfies the requirements in the NASA Software Assurance Standard, NASA-STD-2201-93.
Each provider shall include in the required software management plan (see section 1.2) a plan for a software assurance program that satisfies the requirements stated above.
8.1 Quality Assurance Planning
The Project and each provider shall conduct a program of Software Quality Assurance (SQA), which is a planned and systematic approach to the evaluation of the quality of and adherence to software product standards, processes, and procedures. SQA includes the process of assuring that standards and procedures are established and are followed throughout the software acquisition life cycle. Compliance with agreed-upon standards and procedures is evaluated through process monitoring, product evaluation, and audits. Software development and control processes shall include quality assurance approval points, where an SQA evaluation of the product shall be done in relation to applicable standards.
8.1.1 Approach and Activities
The Project will conduct oversight of the provider SQA organization to assure that the provider is carrying out a software assurance program that meets requirements. The Project SAM is assigned this oversight responsibility. As part of the oversight role, the Project will perform both scheduled and unscheduled audits of providers to establish the degree of conformance to the standards and procedures and to reported status.
The Project will review and approve the standards and procedures proposed for use by providers, and will assess the quality of all of the provider's delivered products against the appropriate standards.
The provider shall conduct software quality assurance activities throughout the software life cycle in accordance with the following requirements:
·During the software requirements phase, the software quality assurance activity shall assure that the software requirements are complete, testable, and properly expressed as functional, performance, and interface requirements.
·During the software architectural (preliminary) design phase, the software quality assurance activity shall:
·assure that all software requirements are allocated to software components;
·assure that a test verification matrix exists and is kept up to date;
·assure that Interface Control Documents are in agreement with the standard in form and content;
·review Preliminary Design Review documentation and assure that all action items are resolved; and
·assure that the approved design is placed under configuration control.
·During the software detailed design phase, the software quality assurance activity shall:
·assure that the results of design inspections are included in the design; and
·review Critical Design Review documentation and assure that all action items are resolved.
·During the software implementation phase, the software quality assurance activity shall audit:
·status of all deliverable items;
·configuration management activities and the software development library; and
·the nonconformance reporting and corrective action system.
·During the software system testing phase, the software quality assurance activity shall:
·assure that all tests are run according to approved test plans and procedures and that any nonconformances are reported and resolved;
·assure that test reports are complete and correct;
·certify that testing is complete and software and documentation are ready for delivery; and
·participate in the Test Readiness Review and assure all action items are completed.
·During the software sustaining engineering and operations phase, there will be mini-development cycles to enhance or correct the software. During these development cycles, the software quality assurance activity shall conduct the appropriate phase-specific activities described above.
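As an illustrative sketch only, not a Plan requirement, the requirements-allocation and test-verification-matrix checks called for above might be automated along the following lines. All identifiers (requirement IDs, component names, test case IDs) are hypothetical.

```python
# Illustrative sketch: verify that every software requirement is allocated
# to at least one component and appears in the test verification matrix.

def find_allocation_gaps(requirements, allocation, verification_matrix):
    """Return requirements missing a component allocation or a verifying test.

    requirements        -- iterable of requirement IDs
    allocation          -- dict: requirement ID -> list of component names
    verification_matrix -- dict: requirement ID -> list of test case IDs
    """
    unallocated = [r for r in requirements if not allocation.get(r)]
    unverified = [r for r in requirements if not verification_matrix.get(r)]
    return unallocated, unverified

reqs = ["SRD-001", "SRD-002", "SRD-003"]
alloc = {"SRD-001": ["telemetry"], "SRD-002": ["commanding"]}
matrix = {"SRD-001": ["TC-01"], "SRD-003": ["TC-07"]}

unallocated, unverified = find_allocation_gaps(reqs, alloc, matrix)
print(unallocated)  # ['SRD-003']
print(unverified)   # ['SRD-002']
```

A check of this kind can be rerun at each quality assurance approval point to confirm the matrix is being kept up to date.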
The Project will use an audit guide and checklists to perform scheduled and unscheduled audits of the provider's software process, products, and status reports. The Project will use checklists and the provider's Project-approved standards in its detailed evaluations of each of the provider's products.
The provider shall explain, in the required SMP, the methods and techniques to be used.
The Project will develop audit reports in accordance with Audit Report, NASA-DID-R002 for each audit conducted. The results of the audit will be conveyed to the provider so that appropriate action can be taken to correct any deficiencies found.
The provider shall describe in the required SMP the products of the SQA process. The provider shall develop audit reports in accordance with Audit Report, NASA-DID-R002 for each audit conducted. Copies of the reports shall be furnished to the Project, as required in section 126.96.36.199, for review and action.
8.2 Verification and Validation (V&V) Planning
The Project and the provider shall conduct V&V activities, including formal reviews, inspections, informal technical reviews, and testing of all deliverable products.
8.2.1 Approach and Activities
The Project will conduct, and the provider shall support, formal reviews at the end of each life cycle phase. These reviews shall include the Software Requirements Review, the Software Preliminary Design Review, the Software Critical Design Review, and the Software Test Readiness Review.
The reviews shall encompass the items to be included in the configuration management baselines to be established after the successful completion of the review. See section 10.2.1 for the minimum contents of each baseline.
After each formal review, the Project Software Manager will decide upon the readiness of the provider to begin the next development life cycle phase. The Project Software Assurance Manager will make a readiness recommendation to the Software Manager based on an assessment of status and readiness of processes, procedures and standards needed in the next phase. After completion of rework for problems found during the review and correction of any readiness problems, permission to begin the next phase will be given.
GSFC Flight Assurance Reviews are in addition to the end of phase reviews specified above. They will be conducted by the Office of Flight Assurance. These reviews will be at the system level, and software will be among the items reviewed. The provider shall support these reviews as required above for Project formal reviews.
The provider shall conduct an inspection and internal technical review program as follows:
·The provider shall conduct formal inspections or walkthroughs of all deliverable products.
Acceptance readiness testing shall be formal testing conducted by the provider and witnessed by the Project. The purpose of acceptance readiness testing shall be to show that the software is ready for acceptance testing by the Project. All discrepancies found during formal testing shall be entered in the provider's Nonconformance Reporting and Corrective Action (NRCA) system (see Section 8.7) and tracked until closure. Access to this data in the NRCA system shall be provided to the Project.
Test planning shall be done for all levels of testing. The provider shall submit to the Project for review and approval test plans for the formal testing, such as the acceptance readiness testing. Once a test plan is approved, the provider shall prepare test procedures according to DID A200. The procedures shall be used for the tests, and shall be available for Project review and comment.
The Project will conduct formal acceptance testing on delivered software, following a Test Readiness Review of the results of acceptance readiness testing. Formal acceptance testing will include the generation of the system from source code, using installation procedures provided. The Project will prepare test plans, based on requirements and operations manuals, and will develop test procedures according to DID A200. Discrepancies found during acceptance testing will be entered into the Project's NRCA system and tracked until closure. The provider shall correct all discrepancies found.
The provider shall conduct verification and validation activities throughout the software life cycle in accordance with the following:
·During the software requirements phase, verification and validation activities shall include:
·assuring that the requirements are testable and capable of being satisfied.
·conducting formal inspections of requirements.
·creating a preliminary version of formal test plans, including a test verification matrix.
·beginning development of test beds and test data generators.
·During the software design phases, verification and validation activities shall include:
·conducting informal technical reviews or formal inspections of the preliminary software and data base designs;
·conducting informal technical reviews or formal inspections of the detailed software and data base designs.
·During the software implementation phase, verification and validation activities shall include:
·unit testing of software and data structure units.
·locating and correcting errors and testing the changed software.
·development of test procedures for the next two phases.
·During the software system testing phase, verification and validation activities shall include:
·documenting test performance, test completion, and conformance of test results versus expected results.
·providing a test report that includes a summary of nonconformances found during testing.
·locating, recording, correcting, and retesting nonconformances.
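As an illustrative sketch only, documenting conformance of test results versus expected results, as required above, reduces to a simple comparison. Test IDs and result values here are hypothetical.

```python
# Illustrative sketch: summarize conformance of actual vs. expected results.

def conformance_report(expected, actual):
    """expected/actual: dicts of test ID -> result value.
    Returns (passed IDs, failed IDs, tests not run)."""
    passed = sorted(t for t in expected if actual.get(t) == expected[t])
    failed = sorted(t for t in expected
                    if t in actual and actual[t] != expected[t])
    not_run = sorted(t for t in expected if t not in actual)
    return passed, failed, not_run

expected = {"TC-01": 7, "TC-02": 3, "TC-03": 0}
actual = {"TC-01": 7, "TC-02": 5}
print(conformance_report(expected, actual))
# (['TC-01'], ['TC-02'], ['TC-03'])
```

Failed tests would then feed nonconformance reports into the NRCA system (section 8.7), and tests not run indicate incomplete test execution.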
For each formal review, the provider shall:
·Develop and organize material for oral presentation to the Project review team. Copies of visual aids and other supporting material that are pertinent to the review shall be submitted to the Project at least 3 working days before the review.
·Support splinter meetings resulting from the review.
·Produce written responses to recommendations and action items resulting from the review.
The provider shall conduct informal testing in accordance with Project-approved provider standards and procedures.
The provider shall conduct, and the Project will witness, formal testing in accordance with the Project-approved test plan and provider developed procedures.
The products of Formal Inspections shall be inspection reports in accordance with the DID for Inspection Reports, NASA-DID-R003.
The results of informal reviews and walkthroughs shall be documented in the appropriate Software Development Folder (See Section 188.8.131.52.10). The provider shall summarize the results of informal reviews and walkthroughs in the Monthly Progress Report.
The results of informal testing shall be recorded in the appropriate software development folders.
The results of formal tests shall be documented in accordance with Test Report, NASA-DID-R009.
Discrepancy reports shall be documented in accordance with Discrepancy (NRCA) Report, NASA-DID-R004. See section 8.7 for a discussion of Nonconformance Reporting and Corrective Action.
8.3 Quality Engineering Assurance Planning
Software Quality Engineering (SQE) is the activity that evaluates, assesses, and improves the quality of software. Software quality is often defined as the degree to which software meets requirements for reliability, maintainability, transportability, etc., as contrasted with functional, performance, and interface requirements that are satisfied as a result of software engineering.
The provider shall conduct a Software Quality Engineering program that satisfies the requirements of the Software Assurance Standard, NASA-STD-2201-93, section 3.3.2. In addition, the program shall satisfy the requirements in the remainder of 8.3.
8.3.1 Approach and Activities
Software classified in categories higher than "Normal" in reliability requirements will have additional processes conducted during development to ensure that reliability is built in. These categories, the software assigned to each category, and the activities to be conducted are described in section 5.3.2.
The provider shall collect data, analyze metrics, and use them to guide quality engineering activities. Metrics and associated requirements are described in section 184.108.40.206.
8.3.2 Methods and Techniques
Metric data shall be collected, stored in provider data bases, and provided to the Project. The Project will compute metrics and trends using PC-based tools.
The Project will develop graphs and other displays that can be used in management and risk analysis.
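As an illustrative sketch only, one trend of the kind the Project might compute for management and risk analysis is the count of open nonconformances per reporting period. The data shape below is hypothetical.

```python
# Illustrative sketch: derive a weekly open-nonconformance trend from
# (opened_week, closed_week_or_None) report records.

def weekly_open_counts(reports):
    """reports: list of (opened_week, closed_week_or_None) tuples.
    Returns {week: number of reports open during that week}."""
    last = max(closed if closed is not None else opened
               for opened, closed in reports)
    counts = {}
    for week in range(1, last + 1):
        counts[week] = sum(
            1 for opened, closed in reports
            if opened <= week and (closed is None or closed > week))
    return counts

data = [(1, 3), (1, None), (2, 4), (3, None)]
print(weekly_open_counts(data))  # {1: 2, 2: 3, 3: 3, 4: 2}
```

A rising open count with a flat closure rate is the sort of signal such displays are intended to surface for risk analysis.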
8.4 Safety Assurance Planning
During the baseline risk assessment process described in section 9.0, the Project will identify safety risks that can be caused by the failure of software to perform as required, as well as any system risks that are to be controlled by software. Identified safety risks will be tracked by the Project as technical risks, and risk mitigation actions will be the responsibility of the SRMB.
The provider shall conduct a software safety assurance program that satisfies the requirements of the Software Assurance Standard, NASA-STD-2201-93, section 3.3.5. In addition, the software safety program shall satisfy the requirements in the remainder of section 8.4.
8.4.1 Approach and Activities
The following activities will be performed to assure safety requirements are met:
·Software requirements associated with safety hazards will be identified as critical, safety related, requirements.
·The provider shall describe in the SMP a plan to assure the safety of the system, the methodology for doing safety analyses, and the methods to be used to ensure that the software system satisfies critical, safety related, requirements.
·The methodology used by the provider shall contain a method for the tracing of safety critical requirements to software components and the identification of the component as safety critical. For identified safety critical software components, software safety activities shall be initiated to include requirements, design, and code analyses and special testing.
·The provider shall document and report at all formal reviews:
·the approaches used to confront, address, and neutralize hazards.
·the use of safety engineering approaches.
The provider shall identify in the SMP the methods and techniques to be used to identify safety critical requirements and safety critical software components and the analyses and V&V methods to be used to ensure that they function as required. The provider shall include the following in the SMP:
·For identified critical components, the provider shall conduct formal inspection of the detailed requirements, the detailed design, the code, and the test plan and procedures.
·For identified critical components, the provider shall have some form of design and code analysis and special safety testing, which focus on locating program weaknesses and identifying extreme or unexpected situations that could cause the software to fail in ways that would cause a violation of safety critical requirements.
The provider shall submit risk analyses and safety hazard reports to the Project as required, and shall have available the results of all safety related analyses, inspections, and tests, in the SDFs of the critical components.
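As an illustrative sketch only, the tracing method required in this section, by which a component is identified as safety critical when a safety-critical requirement traces to it, might look as follows. All identifiers are hypothetical.

```python
# Illustrative sketch: flag every component to which any safety-critical
# requirement traces, so safety analyses and special testing can be initiated.

def safety_critical_components(trace, critical_requirements):
    """trace: dict of requirement ID -> list of component names.
    Returns the set of components any critical requirement traces to."""
    flagged = set()
    for req, components in trace.items():
        if req in critical_requirements:
            flagged.update(components)
    return flagged

trace = {"SRD-010": ["thruster_ctrl"], "SRD-011": ["telemetry"],
         "SRD-012": ["thruster_ctrl", "watchdog"]}
critical = {"SRD-010", "SRD-012"}
print(sorted(safety_critical_components(trace, critical)))
# ['thruster_ctrl', 'watchdog']
```

Components flagged in this way are then subject to the formal inspections and special safety testing required above.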
8.5 Security and Privacy Assurance Planning
The Project will conduct a security assessment process by considering and categorizing the sensitive information that is to be managed and controlled by the Project software. The information, including both programs and data, will be categorized according to its sensitivity. The categorization will meet the requirements contained in NMI 2410.7A, "Assuring the Security and Integrity of NASA Automated Information Systems."
Based on the categorization, the provider shall develop security requirements. The security requirements shall encompass system access control, including network access and physical access; data management and data access; environmental controls (power, air conditioning, etc.) and off-line storage; human resource security; and audit trails and usage records.
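As an illustrative sketch only, deriving security requirements from a sensitivity categorization can follow the pattern below. The category names and control names are hypothetical placeholders, not the categories defined in NMI 2410.7A.

```python
# Illustrative sketch: map hypothetical sensitivity categories to required
# controls and take the union over a categorized information inventory.

CONTROLS_BY_CATEGORY = {
    "low":    ["system access control", "audit trails"],
    "medium": ["system access control", "audit trails",
               "data access control", "off-line storage"],
    "high":   ["system access control", "audit trails",
               "data access control", "off-line storage",
               "physical access control", "environmental controls"],
}

def required_controls(items):
    """items: dict of information item -> sensitivity category.
    Returns the union of controls required by the categorized items."""
    controls = set()
    for category in items.values():
        controls.update(CONTROLS_BY_CATEGORY[category])
    return controls

inventory = {"command loads": "high", "housekeeping data": "low"}
print(sorted(required_controls(inventory)))
```

The most sensitive item in the inventory dominates the control set, which is why categorization precedes requirements development.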
The provider shall conduct a software security and privacy assurance program that satisfies the requirements of the Software Assurance Standard, NASA-STD-2201-93, section 3.3.6. In addition, the software security and privacy assurance program shall satisfy the requirements in the remainder of section 8.5.
8.5.1 Approach and Activities
The provider shall conduct security assurance activities directed toward ensuring that the information being (or to be) processed by the software system, and the software being developed, have been assigned the proper sensitivity category as defined in NMI 2410.7A, and that the appropriate protection requirements have been developed and met in the software. In addition, security assurance activities shall include ensuring the control and protection of the software being developed and/or maintained, and of software support tools and data. A minimum security assurance program shall ensure that:
·Security requirements have been established for the software and data being developed and/or maintained.
·Security requirements have been established for the development and/or maintenance process.
·Each software review and/or audit includes evaluation of security requirements.
·The configuration management and corrective action processes provide security for the existing software, and the change evaluation processes prevent security violations.
8.5.2 Methods and Techniques
The provider shall review and analyze security and privacy requirements to include the following aspects: effective and accurate operations; protection from unauthorized alteration, disclosure, use or misuse of information processed, stored, or transmitted; maintenance of continuity of automated information support; incorporation of management and operational controls; and appropriate technical, administrative, environmental, and access safeguards.
Results of the security review shall be provided to the Project.
8.6 Certification Planning
8.6.1 Approach and Activities
8.6.2 Methods and Techniques
8.7 Nonconformance Reporting and Corrective Action
The Project and each software provider shall establish a Nonconformance Reporting and Corrective Action (NRCA) system, which shall provide for the recording of nonconformances, the evaluation of impact and establishment of priority, the tracking and reporting of status, and closure after testing. A nonconformance shall be defined as a deviation of any product from its requirements or standards. Nonconformance reports shall be filed against any product in any phase of the software life cycle after a product is first approved or baselined by its developer and released for wider use. The NRCA system shall interface with the CM system in order to track the product changes and versions that result from correcting nonconformances.
8.7.1 Approach and Activities
A designated form shall be used to make the nonconformance report. The form shall contain at least the following information:
·Error identification (report number and title).
·Reporting individual and organization.
·Individual responsible for corrective action.
·Criticality of the nonconformance.
·Statement of the nonconformance.
·Proposed fix for the nonconformance.
·Identifier of the unit of code, data, or documentation in which corrective action must be taken.
·Life cycle phase in which the nonconformance was introduced.
·Life cycle phase in which the nonconformance was detected.
·Final closure resolution.
·Date and/or version of the configuration item in which the correction will be included.
·Date on which the nonconformance is closed.
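As an illustrative sketch only, a record carrying the minimum information listed above could be structured as follows. The field names are hypothetical; a real NRCA system would define its own form.

```python
# Illustrative sketch: a nonconformance report record with the minimum
# fields required by section 8.7.1.
from dataclasses import dataclass
from typing import Optional

@dataclass
class NonconformanceReport:
    report_number: str
    title: str
    reported_by: str            # reporting individual and organization
    assigned_to: str            # individual responsible for corrective action
    criticality: str            # e.g. "critical", "major", "minor"
    statement: str              # statement of the nonconformance
    proposed_fix: str = ""
    affected_unit: str = ""     # unit of code, data, or documentation
    phase_introduced: str = ""  # life cycle phase introduced
    phase_detected: str = ""    # life cycle phase detected
    closure_resolution: str = ""
    fixed_in_version: str = ""  # configuration item version with the fix
    closed_on: Optional[str] = None  # date closed; None while still open

    def is_open(self) -> bool:
        return self.closed_on is None

ncr = NonconformanceReport("NCR-001", "Telemetry dropout", "J. Doe / QA",
                           "A. Smith", "major",
                           "Frame counter resets unexpectedly")
print(ncr.is_open())  # True
```

Recording the phases in which nonconformances are introduced and detected enables the per-phase error metrics discussed in section 8.3.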
8.7.2 Methods and Techniques
A nonconformance tracking and reporting system shall be used that is able to provide management reports containing error and correction status, the number of errors found per product, and the criticality of open problems. This data enables the impact of nonconformances to be evaluated so that the use of resources may be prioritized. All Nonconformance reports shall be entered and tracked by the reporting system. The Project shall have access to and use of the information in provider nonconformance systems.
Nonconformance reports shall be evaluated for criticality and level of importance. In addition, each nonconformance report shall be evaluated to identify those that contain requirements changes disguised as nonconformances. Such reports shall be rejected and shall result in the opening of a change request. Factors to be considered in determining criticality and level of importance shall include:
·The resources required for correcting the nonconformance.
·The impact on other baselined items if the nonconformance is corrected.
Each NRCA system shall provide access to the actual nonconformance reports, and shall provide summary and status reports that show the status of nonconformances.
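As an illustrative sketch only, the summary and status reports required above can be produced by simple aggregation over the report records. The tuple shape and product names here are hypothetical.

```python
# Illustrative sketch: count open nonconformances by product and criticality,
# the kind of management summary an NRCA system is required to provide.
from collections import Counter

def status_summary(reports):
    """reports: list of (product, criticality, closed) tuples.
    Returns open counts keyed by (product, criticality)."""
    return Counter((product, criticality)
                   for product, criticality, closed in reports
                   if not closed)

reports = [("cmd", "major", False), ("cmd", "minor", True),
           ("tlm", "major", False), ("cmd", "major", False)]
print(status_summary(reports))
# Counter({('cmd', 'major'): 2, ('tlm', 'major'): 1})
```

Summaries keyed this way let management prioritize resources toward the products with the most critical open problems, as section 8.7.2 intends.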