SWE-136 - Software Tool Accreditation

1. Requirements

4.4.8 The project manager shall validate and accredit the software tool(s) required to develop or maintain software.

1.1 Notes

All software development tools contain some number of software defects. Validation and accreditation of the critical software development and maintenance tools ensure that the tools being used during the software development life cycle do not generate or insert errors in the software executable components. Software tool accreditation is the certification that a software tool is acceptable for use for a specific purpose. Accreditation is conferred by the organization best positioned to make the judgment that the software tool in question is acceptable. The likelihood that work products will function properly is enhanced, and the risk of error is reduced if the tools used in the development and maintenance processes have been validated and accredited themselves.

1.2 History

SWE-136 - Last used in rev NPR 7150.2D

Rev    SWE Statement

A      3.3.7 The project shall validate and accredit software tool(s) required to develop or maintain software.

Difference between A and B: No change

B      4.4.8 The project manager shall validate and accredit software tool(s) required to develop or maintain software.

Difference between B and C: No change

C      4.4.8 The project manager shall validate and accredit the software tool(s) required to develop or maintain software.

Difference between C and D: No change

D      4.4.8 The project manager shall validate and accredit the software tool(s) required to develop or maintain software.



1.3 Applicability Across Classes

Class          A      B      C      D      E      F

Applicable?

Key:  ✓ - Applicable  |  ✗ - Not Applicable

2. Rationale

Software development tools, including commercial ones, contain software defects.  Validation and accreditation of the critical software development and maintenance tools ensure that the tools being used during the software development life cycle do not generate or insert errors in the software executable components.  This requirement reduces the risk in the software development and maintenance areas of the software life cycle by assessing the tools against defined validation and accreditation criteria.  The likelihood that work products will function properly is enhanced, and the risk of error is reduced, if the tools used in the development and maintenance processes have been validated and accredited themselves. This is particularly important for flight software (Classes A and B), which must work correctly on first use if critical mishaps are to be avoided.

3. Guidance

Project managers are to use validated and accredited software tool(s) to develop or maintain software.  Software is validated against test cases using tools and test environments that have been tested and are known to produce acceptable results. Multiple techniques for validating software may be required based on the nature and complexity of the software being developed. This process involves comparing the results produced by the software work product(s) being developed against the results from a set of ‘reference’ programs or recognized test cases (a minimal sketch of this comparison approach follows the list below). Care must be taken that the tools and environment used for this process induce no error or bias into the software performance. Validation is achieved if the results agree with expectations and no major discrepancies are found.

Software tool accreditation is the certification that a software tool is acceptable for use for a specific purpose. Accreditation is conferred by the organization best positioned to make the judgment that the software tool in question is acceptable. The organization may be an operational user, the program office, or a contractor, depending upon the purposes intended. Validation and accreditation of software tool(s) required to develop or maintain software are particularly necessary in cases where:

  • Complex and critical interoperability is being represented.
  • Reuse is intended.
  • The safety of life is involved.
  • Significant resources are involved.
  • Flight software is running on new processors where a software development tool may not have been previously validated by an outside organization.
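For illustration only, the following is a minimal sketch of the reference-comparison approach described above, in Python. It assumes the tool under evaluation can be run from the command line, that "golden" outputs from an already-accepted reference exist for each recognized test case, and that a byte-for-byte comparison is an acceptable agreement criterion; the tool name, file paths, and file extensions are hypothetical.

# Sketch only: compare a candidate tool's outputs against reference outputs
# produced by an already-accepted source across a set of recognized test cases.
import subprocess
from pathlib import Path

TEST_CASES = Path("validation/test_cases")    # inputs for recognized test cases (hypothetical path)
GOLDEN = Path("validation/golden_outputs")    # outputs from the accepted reference (hypothetical path)

def run_tool(input_file: Path) -> bytes:
    """Run the candidate tool on one test case and capture its output."""
    result = subprocess.run(
        ["toolx", str(input_file)],            # "toolx" is a placeholder tool name
        capture_output=True,
        check=True,
    )
    return result.stdout

def validate_against_reference() -> list[str]:
    """Return the names of test cases whose output disagrees with the reference."""
    discrepancies = []
    for case in sorted(TEST_CASES.glob("*.in")):
        expected = (GOLDEN / case.with_suffix(".out").name).read_bytes()
        if run_tool(case) != expected:
            discrepancies.append(case.name)
    return discrepancies

if __name__ == "__main__":
    failures = validate_against_reference()
    print("No major discrepancies found" if not failures else f"Discrepancies: {failures}")

In practice, the agreement criterion may need to be looser than an exact match (for example, numerical tolerances), and the criterion itself should be agreed upon as part of the validation planning.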

The targeted purpose of this requirement was the validation and accreditation of software tools that directly affect the software being developed. Examples of software tools that directly affect software code development include but are not limited to compilers, code-coverage tools, development environments, build tools, user interface tools, debuggers, and code generation tools. For example, if your project is using a code generation tool, the code output of that code generation tool is validated and accredited to verify that the generated code does not have any errors or timing constraints that could affect execution. Another example is the use of math libraries associated with a compiler. These linked libraries are typically validated and accredited before use in the executable code.
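As an illustration of the math library example, the sketch below spot-checks a few library functions against independently obtained reference values before the library is accepted for use. It is a sketch only: "mathlib" here is bound to Python's own math module as a stand-in for the library actually linked by the project's compiler, and the reference values and tolerance are illustrative.

# Sketch only: spot-check a linked math library against reference values.
import math
import math as mathlib  # stand-in for the project's actual linked math library

# (function name, argument, independently obtained reference value)
REFERENCE_CASES = [
    ("sqrt", 2.0, 1.4142135623730951),
    ("sin", math.pi / 6, 0.5),
    ("exp", 1.0, 2.718281828459045),
]

def check_library(rel_tol: float = 1e-12) -> list[str]:
    """Return any discrepancies between the library under test and the reference values."""
    problems = []
    for name, arg, expected in REFERENCE_CASES:
        actual = getattr(mathlib, name)(arg)
        if not math.isclose(actual, expected, rel_tol=rel_tol):
            problems.append(f"{name}({arg}) = {actual}, expected {expected}")
    return problems

if __name__ == "__main__":
    issues = check_library()
    print("Math library spot check passed" if not issues else issues)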

This requirement does not mean that a project needs to validate and accredit a software problem reporting tool or software defect database or a software metric reporting tool. These types of indirect tools can be validated by use. It is also recognized that formal accreditation of software development/maintenance tools is often provided by organizations outside NASA (e.g., National Institute of Standards and Technology [NIST], Association for Computing Machinery [ACM]) for many common software development tools such as compilers. It could be prohibitively expensive or impossible for NASA projects to perform equivalent accreditation activities.

Accreditation is conferred by the organization in the best position to make the judgment that the software tool in question is acceptable. Someone on the project needs to be able to determine an acceptable level of validation and accreditation for each tool being used on a software project and be willing to make a judgment that the tool is acceptable for the tool's intended use. Most of the time that decision can be, and is, made by the software technical authority. The acceptance criteria and validation approach can be captured and recorded in a software development plan, software management plan, or software maintenance plan.

Often the software work products being developed are based on new requirements and techniques for use in novel environments. These may require the development of supporting systems, i.e., new tools, test systems, and original environments, none of which have themselves ever been accredited. The software development team may also plan to employ accredited tools that have not been previously used or adapted to the new environment. The key is to evaluate and then accredit the development environment and its associated development tools against another environment/tool system that is accredited. More typically, new software development brings previously accredited tools together in a previously accredited development environment in a combination that has never been used before and has never been accredited as an integrated system.

A caveat exists that for many new NASA projects, previous ensembles of accredited development environments and tools just don’t exist. For these cases, projects must develop acceptable methods to assure their supporting environment is correct.

The flow of a software accreditation protocol or procedure typically requires an expert to conduct a demonstration and to declare that a particular software work product exhibits compatible results across a portfolio of test cases and environments.
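Where the new environment assembles previously accredited tools and bit-for-bit reproducible builds are expected, one way to demonstrate compatible results is to build the same portfolio of test programs in both the accredited environment and the candidate environment and compare the resulting artifacts. The sketch below assumes, for illustration only, that each environment provides a build script that writes its output to a given path and that a SHA-256 match is an acceptable agreement criterion; the script name, paths, and test program names are hypothetical.

# Sketch only: compare build artifacts from an accredited environment and a
# candidate environment across a portfolio of test programs.
import hashlib
import subprocess
from pathlib import Path

TEST_PROGRAMS = ["case01", "case02", "case03"]   # hypothetical portfolio of test cases

def build(env_dir: Path, program: str) -> str:
    """Build one test program with the environment's build script; return a SHA-256 digest."""
    out = env_dir / "out" / f"{program}.bin"
    subprocess.run([str(env_dir / "build.sh"), program, str(out)], check=True)
    return hashlib.sha256(out.read_bytes()).hexdigest()

def compare_environments(accredited: Path, candidate: Path) -> list[str]:
    """Return the test programs whose build outputs differ between the two environments."""
    return [p for p in TEST_PROGRAMS if build(accredited, p) != build(candidate, p)]

if __name__ == "__main__":
    diffs = compare_environments(Path("envs/accredited"), Path("envs/candidate"))
    print("Environments agree on all test programs" if not diffs else f"Differences: {diffs}")

If builds are not expected to be bit-for-bit reproducible, the comparison would instead be made on the behavior of the built programs (for example, their outputs on the same test inputs), and the agreement criterion should be defined as part of the accreditation planning.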

Software tools can also be accredited by using industry accreditation tools.  Engineers can be trained and thus certified in the correct use of the software accreditation tools.

Many of the techniques used to validate software (see SWE-055) can be applied by the software development team to validate and accredit their development tools, systems, and environments. These include the following:

  • Functional demonstrations.
  • Formal reviews.
  • Peer reviews/component inspections.
  • Analysis.
  • Use case simulations.
  • Software testing.

One suggested flow for identifying, validating, accepting, and accrediting new development tools and environments to be used to develop software work products subject to this requirement is shown in Figure 3.1. While it may seem obvious, it is important to review the current and expected development tools (compilers, debuggers, code generation tools, code-coverage tools, and others planned for use), the development environments, and the release procedures to assure that the complete set is identified, understood, and included in the accrediting process.


          Figure 3.1 – Sample Accreditation Process Flow     


Once you have the complete list, the next step in the process is to review the accreditation status of each item. Ask the following questions (a simple sketch of such a status review follows the list):

  • Is the accreditation current?
  • Does the accreditation need to be re-certified?
  • Does the accreditation apply to the current development task?
  • Will its accreditation continue throughout the development life cycle?
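A simple way to record and review this information is sketched below. It assumes, for illustration only, that each tool's accreditation record carries an expiration date and a scope statement and that a scope/date check is sufficient to flag tools needing attention; the record fields, tool names, and dates are hypothetical.

# Sketch only: review a project's tool inventory against the accreditation questions above.
from dataclasses import dataclass
from datetime import date

@dataclass
class ToolRecord:
    name: str
    accredited: bool       # has the tool been accredited at all?
    expires: date | None   # when the current accreditation lapses, if known
    scope: str             # the task/environment the accreditation covers

def review(tools: list[ToolRecord], task: str, today: date) -> list[str]:
    """Flag tools whose accreditation is missing, lapsed, or out of scope for the task."""
    actions = []
    for t in tools:
        if not t.accredited:
            actions.append(f"{t.name}: needs initial accreditation")
        elif t.expires is not None and t.expires < today:
            actions.append(f"{t.name}: accreditation lapsed, needs re-certification")
        elif task not in t.scope:
            actions.append(f"{t.name}: accreditation does not cover '{task}'")
    return actions

if __name__ == "__main__":
    inventory = [
        ToolRecord("cross-compiler", True, date(2026, 1, 1), "flight build"),
        ToolRecord("code generator", False, None, ""),
        ToolRecord("coverage tool", True, date(2023, 6, 30), "unit test campaign"),
    ]
    for action in review(inventory, "flight build", date.today()):
        print(action)

Note that the last question in the list above is a planning question: comparing the expiration date against the planned end of the development effort, rather than today's date, answers whether the accreditation will continue throughout the life cycle.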

For tools that need accreditation, use the information in the preceding paragraphs to develop a new accreditation process. Keep in mind that the goal is to show that the tools will do what the project expects them to do. The project team needs to review and accept the new process, and the software technical authority and the Center's software assurance personnel are to be informed of it. When the integration of accredited tools needs its own accrediting process, that planning is conducted using the same steps.

The actual steps and tasks in the process are not specified in this guidance because of the wide variety of tools and environments that currently exist or will have to be developed. They are left for the project and the software development team to develop. Once the project team has planned and approved the new and integrating accreditation processes, the appropriate validation activities are conducted. Center software assurance organization representative(s) are informed, and their observation of the validation processes is accounted for in the activities. Results, process improvements, lessons learned, and any discrepancies that result are documented and retained in the project configuration management system. Any issues or discrepancies are re-planned and tracked to closure as part of the project's regular corrective and preventive action reporting system.

The final activity in the accreditation process is an inspection and peer review of the validation activity results. Once agreement is reached on the quality of the results (what constitutes "agreement" is decided ahead of time as part of the process acceptance planning), the identified tools and environments can be accepted as accredited.

Additional guidance related to software tool accreditation may be found in related requirements in this Handbook.


4. Small Projects

Small projects with limited resources may look to software development tools and environments that were previously accredited on other projects. Reviews of project reports, Process Asset Libraries (PALs), and technology reports may surface previous use cases and/or acceptance procedures that apply to new tools pertinent to the current project.

5. Resources

5.1 References


5.2 Tools


Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool.  The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

6.1 NASA Lessons Learned

No Lessons Learned have currently been identified for this requirement.

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.

7. Software Assurance

SWE-136 - Software Tool Accreditation
4.4.8 The project manager shall validate and accredit the software tool(s) required to develop or maintain software.

7.1 Tasking for Software Assurance

  1. Confirm that the software tool(s) needed to create and maintain software is validated and accredited.

7.2 Software Assurance Products

  • None at this time.


Objective Evidence

  • Software tool validation and accreditation criteria.
  • Software tool validation and accreditation results.

Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

  • Observations, findings, issues, or risks found by the SA/safety person, which may be expressed in an audit or checklist record, email, memo, or entry into a tracking system (e.g., Risk Log).
  • Meeting minutes with attendance lists, SA meeting notes, or assessments of the activities, recorded in the project repository.
  • Status report, email, or memo containing statements that confirmation has been performed, with the date (a checklist of confirmations could be used to record when each confirmation has been done).
  • Signatures on SA-reviewed or witnessed products or activities, or
  • Status report, email, or memo containing a short summary of information gained by performing the activity. Some examples of using a “short summary” as objective evidence of a confirmation are:
    • To confirm that: “IV&V Program Execution exists”, the summary might be: IV&V Plan is in draft state. It is expected to be complete by (some date).
    • To confirm that: “Traceability between software requirements and hazards with SW contributions exists”, the summary might be x% of the hazards with software contributions are traced to the requirements.
  • The specific products listed in the Introduction of 8.16 are also objective evidence, in addition to the examples listed above.

7.3 Metrics

  • The number of defects or non-conformances found in flight code, ground code, tools, and COTS products used (open versus closed).
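A minimal sketch of computing this metric is shown below, assuming (for illustration only) that defect or non-conformance records carry an "area" field (flight code, ground code, tool, COTS) and a "status" field; the record structure and values are hypothetical, not a required schema.

# Sketch only: open-versus-closed counts of defects/non-conformances by area.
from collections import Counter

defects = [
    {"area": "tool", "status": "open"},
    {"area": "tool", "status": "closed"},
    {"area": "flight code", "status": "open"},
    {"area": "COTS", "status": "closed"},
]

counts = Counter((d["area"], d["status"]) for d in defects)
for (area, status), n in sorted(counts.items()):
    print(f"{area:12s} {status:7s} {n}")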

7.4 Guidance

For this requirement, software assurance needs to confirm that any software tools used for development and maintenance have been validated and accredited. All software development tools contain some number of software defects. Validation and accreditation of the critical software development and maintenance tools ensure that the tools being used during the software development life cycle do not generate or insert errors in the software executable components. Software tool accreditation is the certification that a software tool is acceptable for use for a specific purpose. Accreditation is conferred by the organization best positioned to make the judgment that the software tool in question is acceptable. The likelihood that work products will function properly is enhanced, and the risk of error is reduced if the tools used in the development and maintenance processes have been validated and accredited themselves.

The first step in confirming the validation and accreditation of the software tools used for development and maintenance is to obtain a list of the tools being used. Then check to see if the project (or another organization positioned to make the judgment that the tools are acceptable) has verified, validated, and approved them for use in the development, analysis, testing, or maintenance of the project software.

Examples of software tools that directly affect software code development can include, but are not limited to, the compilers, code-coverage tools, development environments, build tools, user interface tools, debuggers, and code generation tools used by a project. Another example would be the project’s use of a linked code library, which would typically be validated and accredited before use in the executable code.

There is information on validating and accrediting these tools in the Guidance section of this requirement (Section 3 above).

If the software tools (e.g., simulators, compiler libraries, built-in memory checkers, build tools, development environments, code generators, code-coverage tools, etc.) have an impact on the safety of the system, then determine whether they are correctly applied and used, operated as intended, the results documented, etc.  Typically, once it is determined that a tool has safety implications, an assessment is performed of the severity of impact if the tool provides one or more wrong outputs and of the likelihood of that occurring; together these determine the level of effort or hazard mitigations to employ.
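The sketch below illustrates that kind of qualitative assessment: the severity of a wrong tool output combined with the likelihood of it occurring suggests a level of validation effort or mitigation. The category names, scoring, and resulting levels of effort are illustrative assumptions, not values mandated by this requirement.

# Sketch only: map a severity/likelihood judgment for a tool's wrong output to an
# illustrative level of validation effort or mitigation.
SEVERITY = ["negligible", "marginal", "critical", "catastrophic"]
LIKELIHOOD = ["improbable", "remote", "occasional", "probable"]

def level_of_effort(severity: str, likelihood: str) -> str:
    """Combine severity and likelihood ranks into an illustrative level of effort."""
    score = SEVERITY.index(severity) + LIKELIHOOD.index(likelihood)
    if score >= 5:
        return "full validation and accreditation plus hazard mitigations"
    if score >= 3:
        return "targeted validation of the tool's safety-related functions"
    return "validation by use with documented results"

if __name__ == "__main__":
    # Example: a code generator whose wrong output could cause a critical failure
    # and which is judged to fail only remotely.
    print(level_of_effort("critical", "remote"))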

Software assurance may want to check on the following other considerations for software tools:

  • When software tools are used to design, develop, analyze, test, or maintain software systems, assure that:
    • The tools are kept up to date and configuration managed, incorporating changes as needed
    • Inputs are correct and complete
    • The models are testable, tested, and accredited
  • When using software tools, check:
    • Are up-to-date versions and licenses in place for purchased tools?
    • Have any concerns or risks been recorded and resolved?
    • Are the tools being operated within the parameters/limitations of their capabilities?
  • Have any known errors or discrepancies been documented for the software tools in use, and if so, have the risks been input to the risk management system and evaluated?
  • Are the results as expected or can they be explained? 
  • Is there any analysis stating the level of certainty/trust from the results of the tools, including any concerns, risks, issues? 


