- 1. The Requirement
- 2. Rationale
- 3. Guidance
- 4. Small Projects
- 5. Resources
- 6. Lessons Learned
- 7. Software Assurance
1. Requirements
3.11.9 The project manager shall verify that the software code meets the project’s secure coding standard by using the results from static analysis tool(s).
1.1 Notes
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
1.2 History
1.3 Applicability Across Classes
(Applicability table not rendered: the original table marks each software class, A through F, as Applicable or Not Applicable; see NPR 7150.2 for the per-class markings.)
2. Rationale
The use of uniform software coding methods, standards, and/or criteria ensures consistent coding practices, reduces errors through safe language subsets, and improves code readability. Verification that these practices have been adhered to reduces the risk of software malfunction for the project during its operations and maintenance phases.
3. Guidance
Verifying that the developed software adheres to the coding standards provides the greatest benefit when this practice is followed from software development inception to completion. Coding standards are selected at the start of the software development effort. Verification activities for the software work products include reviews, such as peer reviews and inspections (see SWE-087), and assessments of how the coding standards are used to develop the software work products.
The use of automated tools for assessing adherence to standards at appropriate reviews, or even on a batch mode run overnight, will assist the project team in adherence and verification. “Code should be mechanically checked against the standards with the help of state-of-the-art static source code analyzers. ... Flight code should be checked nightly for compliance with a coding standard and subjected to rigorous analysis with state-of-the-art [static source code analysis tools]. The warnings generated by each of these tools are combined with the output of mission-specific checkers that secure compliance with naming conventions, coding style, etc. In addition, all warnings, if any (there should be none), from the standard C compiler, used in pedantic mode with all warnings enabled, should be provided to the software developers... [who] are required to close out all reports before a formal code review is initiated. In peer code reviews, an additional source of input is provided by designated peer code reviewers... Separately, key parts of the software design can be also checked for correctness and compliance with higher-level design requirements with the help of logic model checkers.” 477
Training should be provided for the software development team in the use of the logic model checkers for the analysis and verification of flight software. 477
Manual analysis to verify the complete application of safety or security coding standards is all but impossible. 476
Additional guidance related to software coding standards and verification may be found in other related requirements in this Handbook.
4. Small Projects
No additional guidance is available for small projects.
5. Resources
5.1 References
- (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN SPAN is a compendium of Processes, Procedures, Job Aids, Examples and other recommended best practices.
- (SWEREF-476) Michael Aguilar, NASA Engineering and Safety Center, October 21, 2014.
- (SWEREF-477) Klaus Havelund and Gerard J. Holzmann Laboratory for Reliable Software (LaRS) Jet Propulsion Laboratory, California Institute of Technology 4800 Oak Grove Drive, Pasadena, California, 91109-8099.
- (SWEREF-664) OCE site in NASA Engineering Network. A portal that houses information for software developers to develop code in a secure fashion; formerly known as "Secure Coding".
- (SWEREF-665) NVD is the U.S. government repository of standards based vulnerability management data represented using the Security Content Automation Protocol (SCAP).
- (SWEREF-666) CVE® is a dictionary of publicly disclosed cybersecurity vulnerabilities and exposures that is free to search, use, and incorporate into products and services, per the terms of use.
5.2 Tools
NASA users can find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.
6. Lessons Learned
6.1 NASA Lessons Learned
No Lessons Learned have currently been identified for this requirement.
6.2 Other Lessons Learned
No other Lessons Learned have currently been identified for this requirement.
7. Software Assurance
7.1 Tasking for Software Assurance
- Analyze the engineering data or perform independent static code analysis to verify that the code meets the project’s secure coding standard requirements.
7.2 Software Assurance Products
- Source Code Analysis
- The analysis of engineering results or the SA independent static code analysis of the source code, showing that secure coding practices were followed
- Identification of any risks or issues with the use of the secure coding practices
Objective Evidence
- The software development organization secure coding standard.
- Static analysis results.
7.3 Metrics
- # of software work product Non-Conformances identified by life-cycle phase over time
- Document the Static Code Analysis tools used with associated Non-Conformances
- # of total errors and warnings identified by the tool
- # of errors and warnings evaluated vs. # of total errors and warnings identified by the tool
- # of Non-Conformances raised by SA vs. total # of raised Non-Conformances
- # of static code errors and warnings identified as “positives” vs. # of total errors and warnings identified by the tool
- # of static code errors and warnings resolved by Severity vs. # of static code errors and warnings identified by Severity by the tool
- # of static code “positives” over time (Open, Closed, Severity)
- # of Cybersecurity vulnerabilities and weaknesses identified by the tool
- # coding standard violations identified (Open, Closed, type of violation, Severity)
7.4 Guidance
1. Confirm the coding guidelines (e.g., coding standards) address secure coding practices.
The selection of which coding standard to use should be made during the planning phase of the software project.
Some of the widely used coding standards that consider safety are:
For C language: MISRA C, SEI CERT C Coding Standard. The SEI CERT C Coding Standard is a software coding standard for the C programming language, developed by the CERT Coordination Center to improve the safety, reliability, and security of software systems.
For C++ language: MISRA C++, JSF AV C++ Coding Standard, SEI CERT C++ Coding Standard, AUTOSAR C++ Coding Guidelines.
2. Confirm that secure coding practices are used.
Confirm that the project is using the code standard selected.
3. Perform an independent static code analysis for secure coding practices.
Use a static code analysis tool on the source code to check compliance with the coding standard rules. Doing this by hand is nearly impossible: there are too many coding rules and too much code.
If engineering is running a tool that performs the standards checking, then SA can review and use the tool output to determine whether the code meets the coding standards.
It is best if SA runs a coding standard checker on the source code itself. This gets SA directly involved with the source code product rather than relying solely on what engineering says about it.
IV&V may be able to help you with an independent static code analysis for secure coding practices.
Check whether the engineering team and the project have run a static analysis tool to assess cybersecurity vulnerabilities and weaknesses in the source code. If so, check that the findings from the static analysis tool have been addressed by the team.
A method of identifying weaknesses and vulnerabilities is to use the National Vulnerability Database (NVD) 665 from NIST, the U.S. government repository of standards-based vulnerability data. Software weaknesses can be identified using the Common Weakness Enumeration (CWE) 666, a dictionary created by MITRE.
See the secure coding site 664 for more information (NASA access only).