SWE-203 - Mandatory Assessments for Non-Conformances

1. Requirements

5.5.3 The project manager shall implement mandatory assessments of reported non-conformances for all COTS, GOTS, MOTS, OSS, or reused software components. 

1.1 Notes

This includes operating systems, run-time systems, device drivers, code generators, compilers, math libraries, and build and Configuration Management (CM) tools. The assessment should be performed pre-flight, with mandatory code audits for critical defects.

1.2 History

SWE-203 - Last used in rev NPR 7150.2D

Rev A
SWE Statement: (none)

Difference between A and B: N/A

Rev B
SWE Statement: (none)

Difference between B and C: NEW

Rev C
SWE Statement: 5.5.3 The project manager shall implement mandatory assessments of reported non-conformances for all COTS, GOTS, MOTS, OSS, or reused software components.

Difference between C and D: Change "or" to "and/or"

Rev D
SWE Statement: 5.5.3 The project manager shall implement mandatory assessments of reported non-conformances for all COTS, GOTS, MOTS, OSS, and/or reused software components.



1.3 Applicability Across Classes

 

Class          A      B      C      D      E      F

Applicable?

Key:  - Applicable  |  - Not Applicable

2. Rationale

Software components that are used to build the software product (e.g., compilers) or become part of the software itself (e.g., operating systems) can introduce unexpected defects into the delivered product. Whenever non-conformances in these products are discovered, a thorough assessment is required to identify any other potential impacts.

3. Guidance

Non-conformances discovered in COTS, GOTS, MOTS, OSS, and reused software components are particularly difficult to diagnose, and determining the risk that errors in these acquired components pose to the overall system is challenging. Most COTS and OSS products, and some GOTS, MOTS, and reused software components, share lists of known defects and non-conformances. The intent of the requirement is for the user of the software to verify whether the developing organization maintains a site or list of known non-conformances or reported bugs, and to review that list to determine whether any of those non-conformances or bugs could or do impact the software component as the project uses it.

Most commercial products and open-source software have a site that lists well-known bugs or non-conformances. The requirement is to research this information and determine whether any of the known bugs affect the software component being used by the project. Thorough assessments of all reported non-conformances in these products, performed by qualified software and systems engineers, are essential for project managers. Additionally, non-conformances found in these products should be reported back to the supplier for analysis and potential correction in future versions.
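
Many open-source and commercial suppliers expose their known-defect lists through public advisory services that can be queried automatically. The following sketch is illustrative only and is not part of the requirement: it queries the OSV.dev advisory database for issues reported against a hypothetical OSS component. The package name, ecosystem, and version are placeholders; a project would substitute the defect or non-conformance feed actually maintained for each of its acquired components.

# Minimal sketch (Python): query a public advisory database (OSV.dev) for
# known issues reported against an OSS component. The component named below
# is a placeholder, not one mandated or assumed by SWE-203.
import json
import urllib.request

OSV_QUERY_URL = "https://api.osv.dev/v1/query"

def known_issues(name: str, ecosystem: str, version: str) -> list:
    """Return the advisories OSV.dev has on record for this component/version."""
    payload = json.dumps({
        "package": {"name": name, "ecosystem": ecosystem},
        "version": version,
    }).encode("utf-8")
    request = urllib.request.Request(
        OSV_QUERY_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request, timeout=30) as response:
        return json.load(response).get("vulns", [])

if __name__ == "__main__":
    # Hypothetical component under assessment.
    for issue in known_issues("requests", "PyPI", "2.19.0"):
        print(issue["id"], "-", issue.get("summary", "(no summary provided)"))

Each returned advisory would then be assessed for project impact and, where relevant, entered in the project discrepancy database.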

4. Small Projects

No additional guidance is available for small projects.

5. Resources

5.1 References

  • (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN SPAN is a compendium of Processes, Procedures, Job Aids, Examples and other recommended best practices.

5.2 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool.  The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

6.1 NASA Lessons Learned

No Lessons Learned have currently been identified for this requirement.

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.

7. Software Assurance

SWE-203 - Mandatory Assessments for Non-Conformances
5.5.3 The project manager shall implement mandatory assessments of reported non-conformances for all COTS, GOTS, MOTS, OSS, or reused software components. 

7.1 Tasking for Software Assurance

  1. Confirm that evaluations of reported non-conformances for all COTS, GOTS, MOTS, OSS, or reused software components are occurring throughout the project life cycle.

  2. Assess the impact of non-conformances on the safety, quality, and reliability of the project software.

7.2 Software Assurance Products

  • Software Design Analysis
  • Source Code Analysis
  • Verification Activities Analysis
  • SA impact assessment of non-conformances on software quality (safety, quality, reliability)


    Objective Evidence

    • Software defect or problem reporting data for all COTS, GOTS, MOTS, OSS, or reused software components
    • Software configuration management data
    • Software assurance audit results on the change management or defect management processes
    • Software milestone results
    • Software version description documents
    • Software control board data or presentations

    Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

    • Observations, findings, issues, or risks found by the SA/safety person, which may be expressed in an audit or checklist record, email, memo, or entry in a tracking system (e.g., a Risk Log).
    • Meeting minutes with attendance lists, SA meeting notes, or assessments of the activities, recorded in the project repository.
    • Status report, email, or memo containing statements, with dates, that confirmation has been performed (a checklist of confirmations could be used to record when each confirmation has been done).
    • Signatures on SA-reviewed or SA-witnessed products or activities, or
    • Status report, email, or memo containing a short summary of information gained by performing the activity. Some examples of using a “short summary” as objective evidence of a confirmation are:
      • To confirm that “IV&V Program Execution exists,” the summary might be: the IV&V Plan is in draft state and is expected to be complete by (some date).
      • To confirm that “Traceability between software requirements and hazards with SW contributions exists,” the summary might be: x% of the hazards with software contributions are traced to the requirements.
    • The specific products listed in the Introduction of 8.16 are also objective evidence, as are the examples listed above.

7.3 Metrics

  • Total # of Non-Conformances over time (Open, Closed, # of days Open, and Severity of Open)
  • # of Non-Conformances in the current reporting period (Open, Closed, Severity)
  • # of Non-Conformances identified in source code products used (Open, Closed)
  • # of safety-related Non-Conformances
  • # of Non-Conformances identified in embedded COTS, GOTS, MOTS, OSS, or reused components in ground or flight software vs. # of Non-Conformances successfully closed
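
As one illustration of how the counts above might be derived, the sketch below assumes a simple non-conformance record containing a severity, open and close dates, and a safety flag. These field names are hypothetical, not a prescribed schema; a project would map them to its own discrepancy-tracking tool.

# Minimal sketch (Python): tallying the non-conformance metrics listed above
# from a project's discrepancy records. The record layout is an assumption.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class NonConformance:
    identifier: str
    severity: str                      # e.g., "1" (critical) through "5" (minor)
    opened: date
    closed: Optional[date] = None      # None means still open
    safety_related: bool = False

def metrics(records: list, today: date) -> dict:
    """Compute totals, open/closed counts, days open, and severity breakdown."""
    open_ncs = [r for r in records if r.closed is None]
    return {
        "total": len(records),
        "open": len(open_ncs),
        "closed": len(records) - len(open_ncs),
        "days_open": {r.identifier: (today - r.opened).days for r in open_ncs},
        "open_by_severity": {
            sev: sum(1 for r in open_ncs if r.severity == sev)
            for sev in sorted({r.severity for r in open_ncs})
        },
        "safety_related": sum(1 for r in records if r.safety_related),
    }

if __name__ == "__main__":
    sample = [
        NonConformance("NC-001", "2", date(2024, 1, 10)),
        NonConformance("NC-002", "4", date(2024, 2, 1), closed=date(2024, 2, 20)),
    ]
    print(metrics(sample, today=date(2024, 3, 1)))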

7.4 Guidance

Software Assurance will verify that the project is receiving the reports of non-conformances for all of the COTS, GOTS, MOTS, OSS, or reused software being used by the project. Check that these non-conformance reports are being received periodically and that the project is reviewing them to determine whether there are any impacts to the project. If fixes that might impact the project are released for any of the COTS, GOTS, MOTS, OSS, or reused software, the project should implement them. Verify that any necessary changes are recorded in the project discrepancy database and track these entries to closure.

Review the lists of COTS, GOTS, MOTS, OSS, or reused software non-conformances and determine whether there might be any impacts on the software’s safety, quality, or reliability. If the non-conformance is in an area of safety-critical software, review the associated hazard analysis to determine the impact of the non-conformance. In non-safety-critical areas of the code, assess what impact the non-conformance would have: would it cause an incorrect value to be computed, prevent critical functionality from working properly, or severely impact performance, or is it something minor, such as a spelling error on a display? Reviewing the functionality of the code in the area of the non-conformance, thinking through the operational scenarios, or looking at redundancy for the code area can help in determining the impacts of the non-conformance. If the source code is available, it may also be helpful to run code analyzers to see whether the known problems are discovered and how the analyzer classifies the non-conformances. If there are non-trivial impacts, verify that these non-conformances have been added to the project discrepancy database and track them to closure.
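
As a purely illustrative aid for this kind of triage, the sketch below flags reported component non-conformances whose descriptions mention functions the project has identified as safety-critical, so they can be entered in the discrepancy database and routed to a hazard-analysis review. The component names, function list, and keyword matching are assumptions for the example, not a prescribed method.

# Minimal sketch (Python): flag reported component non-conformances that touch
# functions the project considers safety-critical. All names are hypothetical.
from dataclasses import dataclass

@dataclass
class ReportedNonConformance:
    component: str        # COTS/GOTS/MOTS/OSS/reused product name
    identifier: str       # vendor or community bug ID
    description: str      # text of the reported non-conformance

# Hypothetical map of acquired components to the safety-critical functions that use them.
SAFETY_CRITICAL_USES = {
    "rtos_kernel": {"watchdog", "task scheduling", "interrupt latency"},
    "math_library": {"attitude determination"},
}

def triage(reports: list) -> list:
    """Return (component, bug id, matched function) tuples that need a project
    discrepancy entry and a review against the associated hazard analysis."""
    flagged = []
    for report in reports:
        text = report.description.lower()
        for function in SAFETY_CRITICAL_USES.get(report.component, set()):
            if function in text:
                flagged.append((report.component, report.identifier, function))
    return flagged

if __name__ == "__main__":
    reports = [ReportedNonConformance(
        "rtos_kernel", "BUG-1432",
        "Watchdog timer fails to reset under heavy interrupt load")]
    for component, bug_id, function in triage(reports):
        print(f"{component} {bug_id}: review hazard analysis coverage of '{function}'")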
