

SWE-185 - Secure Coding Standards Verification

1. Requirements

3.11.7 The project manager shall verify that the software code meets the project’s secure coding standard by using the results from static analysis tool(s).

1.1 Notes

If a static analysis tool will not work with the selected coding standard, other methods are acceptable, including manual inspection.

1.2 History

SWE-185 - Last used in rev NPR 7150.2D

Rev - SWE Statement

A: (no statement)

Difference between A and B: N/A

B: (no statement)

Difference between B and C: NEW

C: 3.11.9 The project manager shall verify that the software code meets the project’s secure coding standard by using the results from static analysis tool(s).

Difference between C and D: No change. A note was added under this SWE for clarification.

D: 3.11.7 The project manager shall verify that the software code meets the project’s secure coding standard by using the results from static analysis tool(s).



1.3 Applicability Across Classes

 

Class          A      B      C      D      E      F

Applicable?

Key: ✓ - Applicable | ✗ - Not Applicable


2. Rationale

The use of uniform software coding methods, standards, and/or criteria ensures consistent coding practices, reduces errors through safe language subsets, and improves code readability. Verification that these practices have been adhered to reduces the risk of software malfunction for the project during its operations and maintenance phases. Assuring adherence of the developed software to the coding standards provides the greatest benefit when followed from software development inception to completion. Coding standards are selected at the start of the software development effort. Verification activities for the software work products include reviews, such as peer reviews and inspections (see SWE-087 - Software Peer Reviews and Inspections for Requirements, Plans, Design, Code, and Test Procedures), and assessments of how the coding standards are used to develop the software work products. The use of automated tools to assess adherence to standards at appropriate reviews, or even in a batch run overnight, assists the project team in both adherence and verification.

3. Guidance

3.1 Verification To Coding Standards

Verification of the developed software against the coding standards provides the greatest benefit when followed from software development inception to completion. Coding standards are selected at the start of the software development effort. Verification activities for the software work products include reviews, such as peer reviews and inspections (see SWE-087 - Software Peer Reviews and Inspections for Requirements, Plans, Design, Code, and Test Procedures), and assessments of how the coding standards are used to develop the software work products. See also Topic 7.10 - Peer Review and Inspections Including Checklists.

The use of automated tools to assess adherence to standards at appropriate reviews, or even in a batch run overnight, assists the project team in both adherence and verification. The less a known coding standard is customized, the higher the probability that an automated tool will work with the selected standard. The quote below illustrates the importance of using a coding standard and adhering to it during development. If a tool is used, how often to run it is left to the project’s discretion.

Software Certification – Coding, Code, and Coders

ABSTRACT: "code is mechanically checked against the standards with the help of state-of-the-art static source code analyzers..."

Paraphrasing from section 2.2, The Code:
"Code should be checked nightly for compliance with a coding standard and subjected to rigorous analysis with state-of-the-art static source code analysis tools. The warnings generated by each of these tools are combined with the output of mission-specific checkers that secure compliance with naming conventions, coding style, etc. In addition, all warnings, if any (there should be none), from the standard C compiler, used in pedantic mode with all warnings enabled, should be provided to the software developers... (who) are required to close out all reports before a formal code review is initiated. In peer code reviews, an additional source of input is provided by designated peer code reviewers... Separately, key parts of the software design can also be checked for correctness and compliance with higher-level design requirements with the help of logic model checkers." 477
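As a concrete illustration of the nightly "zero warnings" discipline described above, the sketch below compiles each source file in pedantic mode with all warnings enabled and fails if any diagnostics remain open. It is a minimal sketch, not a prescribed setup: the compiler choice, flags, and src/ layout are assumptions to adapt to the project.

```python
# warning_gate.py - minimal sketch of a nightly "zero warnings" gate.
# Assumptions: gcc is on PATH and C sources live under src/.
import pathlib
import subprocess
import sys

FLAGS = ["-Wall", "-Wextra", "-pedantic", "-fsyntax-only"]

def main() -> int:
    failures = 0
    for source in sorted(pathlib.Path("src").rglob("*.c")):
        result = subprocess.run(
            ["gcc", *FLAGS, str(source)], capture_output=True, text=True
        )
        # Any diagnostic (warning or error) counts as an open report that
        # must be closed out before a formal code review is initiated.
        if result.returncode != 0 or result.stderr.strip():
            failures += 1
            print(f"{source}: open diagnostics", file=sys.stderr)
            print(result.stderr, file=sys.stderr)
    return 1 if failures else 0

if __name__ == "__main__":
    sys.exit(main())
```

A job like this can run in an overnight batch, with a nonzero exit code blocking the next formal review until the reports are closed out.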

Training should be provided for the software development team in the use of any tools for the analysis and verification of software. See also SWE-017 - Project and Software Training.

An analysis to verify the complete application of safety or security coding standards is all but impossible. 476 Teams are still required to perform these checks and provide a best effort.

Static analysis is performed on code without executing it. Static code analysis can be automated (by using static code analysis tools) or performed manually (e.g., by code reviews/inspections). Projects should not get caught up in the specific tool(s) used to implement the intent of the requirement. Use of a tool is recommended but not required.

Static analysis techniques are useful tools for assessing code. Newer techniques further extend the team's ability to proactively identify errors in code or design. Where appropriate, teams should adopt additional analysis techniques and approaches, especially where these can be integrated into automated testing. Additional techniques may include dynamic execution analysis, software fuzzing, etc.
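Where fuzzing is adopted, even a very small harness can stress an input-handling routine with malformed data. The sketch below uses only the Python standard library; parse_packet is a hypothetical stand-in for whatever routine the project wants to exercise, not an interface from any real tool.

```python
# fuzz_sketch.py - minimal random-input fuzzing loop (standard library only).
import random

def parse_packet(data: bytes) -> None:
    # Hypothetical routine under test; substitute the project's own parser.
    if len(data) > 2 and data[0] == 0x7E:
        length = data[1]
        _payload = data[2:2 + length]  # a real parser must bound-check this

def main() -> None:
    random.seed(0)  # a fixed seed makes failing inputs reproducible
    for i in range(100_000):
        data = random.randbytes(random.randint(0, 64))
        try:
            parse_packet(data)
        except Exception as exc:  # any crash is a finding worth triaging
            print(f"iteration {i}: {exc!r} on input {data.hex()}")
            raise

if __name__ == "__main__":
    main()
```

Dedicated fuzzing tools add coverage guidance and corpus management, but the principle is the same: feed unexpected inputs and treat any crash as a defect.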

See also SWE-058 - Detailed Design, SWE-060 - Coding Software, SWE-061 - Coding Standards, SWE-135 - Static Analysis, SWE-136 - Software Tool Accreditation, Topic 8.04 - Additional Requirements Considerations for Use with Safety-Critical Software, SWE-157 - Protect Against Unauthorized Access, and PAT-022 - Programming Practices Checklist.

3.2 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:

3.3 Center Process Asset Libraries

SPAN - Software Processes Across NASA
SPAN contains links to Center managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance including processes, forms, checklists, training, and templates related to Software Development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki  197

See the following link(s) in SPAN for process assets from contributing Centers (NASA Only). 

4. Small Projects

If the project has access to automated tools from the institution, it is recommended that the project use them to reduce cost and schedule impacts. The project should weigh the benefits of customizing/tailoring the standards in the tool to best meet the project constraints.

5. Resources

5.1 References

  • (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN. SPAN is a compendium of processes, procedures, job aids, examples, and other recommended best practices.
  • (SWEREF-476) Aguilar, Michael. NASA Engineering and Safety Center, October 21, 2014.
  • (SWEREF-477) Havelund, Klaus, and Gerard J. Holzmann. Software Certification – Coding, Code, and Coders. Laboratory for Reliable Software (LaRS), Jet Propulsion Laboratory, California Institute of Technology, 4800 Oak Grove Drive, Pasadena, California, 91109-8099.
  • (SWEREF-664) OCE site in NASA Engineering Network. Portal that houses information for software developers to develop code in a secure fashion. Formerly known as "Secure Coding."
  • (SWEREF-665) National Vulnerability Database (NVD). NVD is the U.S. government repository of standards-based vulnerability management data represented using the Security Content Automation Protocol (SCAP).
  • (SWEREF-666) CVE®. A dictionary of publicly disclosed cybersecurity vulnerabilities and exposures that is free to search, use, and incorporate into products and services, per the terms of use.


5.2 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool.  The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

 


6. Lessons Learned

6.1 NASA Lessons Learned

No Lessons Learned have currently been identified for this requirement.

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.

7. Software Assurance

SWE-185 - Secure Coding Standards Verification
3.11.7 The project manager shall verify that the software code meets the project’s secure coding standard by using the results from static analysis tool(s).

7.1 Tasking for Software Assurance

From NASA-STD-8739.8B

1. Analyze the engineering data or perform independent static code analysis to verify that the code meets the project’s secure coding standard requirements.

7.2 Software Assurance Products

  • Source Code Analysis
  • The analysis of engineering results, or the SA independent static code analysis of the source code, showing that secure coding practices were followed.
  • Identification of any risks or issues with the use of secure coding practices.


    Objective Evidence

    • The software development organization secure coding standard.
    • Static analysis results.

    Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

    • Observations, findings, issues, and risks found by the SA/safety person, which may be expressed in an audit or checklist record, email, memo, or entry into a tracking system (e.g., Risk Log).
    • Meeting minutes with attendance lists or SA meeting notes or assessments of the activities and recorded in the project repository.
    • Status report, email or memo containing statements that confirmation has been performed with date (a checklist of confirmations could be used to record when each confirmation has been done!).
    • Signatures on SA reviewed or witnessed products or activities, or
    • Status report, email or memo containing a short summary of information gained by performing the activity. Some examples of using a “short summary” as objective evidence of a confirmation are:
      • To confirm that: “IV&V Program Execution exists”, the summary might be: IV&V Plan is in draft state. It is expected to be complete by (some date).
      • To confirm that: “Traceability between software requirements and hazards with SW contributions exists”, the summary might be x% of the hazards with software contributions are traced to the requirements.
    • The specific products listed in the Introduction of Topic 8.16 are also objective evidence, as are the examples listed above.

7.3 Metrics

  • # of software work product Non-Conformances identified by life cycle phase over time
  • Document the Static Code Analysis tools used with associated Non-Conformances
  • # of total errors and warnings identified by the tool
  • # of errors and warnings evaluated vs. # of total errors and warnings identified by the tool
  • # of Non-Conformances raised by SA vs. total # of raised Non-Conformances
  • # of static code errors and warnings identified as “positives” vs. # of total errors and warnings identified by the tool
  • Total # of static code analysis “positives” vs. # of “positives” resolved, trended over time
  • # of static code errors and warnings resolved by Severity vs. # of static code errors and warnings identified by Severity by the tool
  • # of static code “positives” over time (Open, Closed, Severity)
  • # of Cybersecurity vulnerabilities and weaknesses identified by the tool
  • # of coding standard violations identified (Open, Closed, type of violation, Severity)

See also Topic 8.18 - SA Suggested Metrics.
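Most of these counts fall out of simple bookkeeping once tool findings are exported. The sketch below tallies a few of the metrics above from a list of exported findings; the record layout (status, severity, true_positive fields) is an assumption for illustration, not a format produced by any particular tool.

```python
# metrics_sketch.py - tallies a few SWE-185 metrics from exported findings.
from collections import Counter

# Assumed export format: one record per tool finding after triage.
findings = [
    {"status": "open", "severity": "high", "true_positive": True},
    {"status": "closed", "severity": "low", "true_positive": False},
    {"status": "closed", "severity": "high", "true_positive": True},
]

total = len(findings)
positives = [f for f in findings if f["true_positive"]]
resolved = [f for f in positives if f["status"] == "closed"]
by_severity = Counter((f["severity"], f["status"]) for f in positives)

print(f"total errors/warnings identified by the tool: {total}")
print(f'"positives" vs. total findings: {len(positives)}/{total}')
print(f'"positives" resolved vs. total "positives": {len(resolved)}/{len(positives)}')
print(f'"positives" by (severity, status): {dict(by_severity)}')
```

Running the same tally at each life cycle milestone produces the over-time trends the metrics call for.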

7.4 Guidance

  1. Confirm that the coding guidelines (e.g., coding standards) address secure coding practices. The selection of which coding standard to use should be made during the planning phase of the software project.

Some of the widely used coding standards that consider safety and security are:

For C language: MISRA C, SEI CERT C Coding Standard. The SEI CERT C Coding Standard is a software coding standard for the C programming language, developed by the CERT Coordination Center to improve the safety, reliability, and security of software systems.

For C++ language: MISRA C++, JSF AV C++ Coding Standard, SEI CERT C++ Coding Standard, AUTOSAR C++ Coding Guidelines.
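A full standards check requires a real static analysis tool, but the flavor of a simple, mission-specific checker is easy to see. The sketch below scans C sources for a few library calls that secure C coding standards commonly ban or restrict (e.g., gets()); the rule list and src/ layout are illustrative assumptions, not a substitute for checking MISRA C or SEI CERT C rules with a proper tool.

```python
# banned_calls.py - naive scan for calls commonly flagged by secure C standards.
# Illustrative only: a real checker parses the code rather than pattern-matching.
import pathlib
import re

# Example rules: gets() is banned outright; the others warrant review.
BANNED = {"gets": "banned", "strcpy": "review", "sprintf": "review"}
CALL = re.compile(r"\b(" + "|".join(BANNED) + r")\s*\(")

for source in sorted(pathlib.Path("src").rglob("*.c")):
    text = source.read_text(errors="replace")
    for lineno, line in enumerate(text.splitlines(), 1):
        for match in CALL.finditer(line):
            name = match.group(1)
            print(f"{source}:{lineno}: {name}() [{BANNED[name]}]")
```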

2. Confirm that secure coding practices are used.

Confirm that the project is using the coding standard it selected.

3. Perform an independent static code analysis for secure coding practices.

Use a static code analysis tool on the source code to check for compliance with the coding standard rules. Doing this by hand without a tool is nearly impossible; there are too many coding rules and too much code.

If engineering is running a tool that performs the standards checking, then SA can review and use the tool output to determine whether the code meets the coding standards.

It is best if SA runs a coding standard checker on the source code. Part of the reason is to get SA more directly involved with the source code product, rather than just relying on what engineering says about the source code.

IV&V may be able to help you with an independent static code analysis for secure coding practices.

Check whether the engineering team and the project have run a static analysis tool to assess the cybersecurity vulnerabilities and weaknesses in the source code; if so, check that the findings from the static analysis tool have been addressed by the team.

Use of a tool is recommended, but if a tool is not available, static code analysis can be performed manually (e.g., by code reviews/inspections). Projects should not get caught up in the specific tool(s) used to implement the intent of the requirement.

Static analysis techniques are useful tools for assessing code. Newer techniques further extend the team's ability to proactively identify errors in code or design. Where appropriate, teams should adopt additional analysis techniques and approaches, especially where these can be integrated into automated testing. Additional techniques may include dynamic execution analysis, software fuzzing, etc.

A method of identifying weaknesses and vulnerabilities is to use the National Vulnerability Database (NVD) 665 from NIST, which is the U.S. government repository of standards-based vulnerability data. Software weaknesses can be identified using the Common Weakness Enumeration (CWE) 666, a dictionary created by MITRE.
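The NVD exposes a public REST interface, so vulnerability lookups can be scripted. The sketch below issues a keyword search against the NVD CVE API 2.0 endpoint; the endpoint URL, parameters, and response fields reflect NVD's published 2.0 API but should be verified against current NVD documentation before use.

```python
# nvd_query.py - keyword search against the NVD CVE API (2.0 endpoint assumed).
# Verify the endpoint and response layout against current NVD documentation.
import json
import urllib.parse
import urllib.request

def search_cves(keyword: str, limit: int = 5) -> None:
    query = urllib.parse.urlencode(
        {"keywordSearch": keyword, "resultsPerPage": limit}
    )
    url = f"https://services.nvd.nist.gov/rest/json/cves/2.0?{query}"
    with urllib.request.urlopen(url, timeout=30) as response:
        payload = json.load(response)
    for item in payload.get("vulnerabilities", []):
        cve = item["cve"]
        summary = cve["descriptions"][0]["value"]
        print(f'{cve["id"]}: {summary[:100]}')

if __name__ == "__main__":
    search_cves("buffer overflow")
```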

See the secure coding site 664 for more information (NASA access only).

See also Topic 8.04 - Additional Requirements Considerations for Use with Safety-Critical Software.

7.5 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:


