8.03 - Organizational Goals of Software Assurance Metrics

1. Metrics

The table below shows the derivation of SA Metrics from the Goal Statements using the Goal, Question, Metric (GQM) method (SWEREF-391, SWEREF-392).
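The GQM derivation lends itself to a simple data model. The sketch below is a minimal illustration in Python (the class and variable names are ours, not the Handbook's) encoding the first goal below, its questions, and the metrics that answer them:

    from dataclasses import dataclass, field

    @dataclass
    class Question:
        text: str                      # the question that probes the goal
        metrics: list[str] = field(default_factory=list)  # metrics that answer it

    @dataclass
    class Goal:
        statement: str                 # full goal statement
        name: str                      # short goal name
        questions: list[Question] = field(default_factory=list)

    # First goal in the table below, encoded as data (wording abbreviated).
    quality_requirements = Goal(
        statement="Assure delivery of quality software requirements ...",
        name="Quality Software Requirements",
        questions=[
            Question(
                text="Are the software requirements detailed enough for "
                     "development and test?",
                metrics=[
                    "Ratio of detailed software requirements to developed SLOC",
                    "Percentage complete of each area of traceability",
                ],
            ),
        ],
    )

    for question in quality_requirements.questions:
        print(question.text, "->", question.metrics)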

Goal Statement: Assure delivery of quality software requirements to assure safe and secure products in support of mission success and customer objectives.

Goal: Quality Software Requirements

Question: Are the software requirements detailed enough for development and test?
SA Metrics:
  • The ratio of the number of detailed software requirements to the number of SLOC to be developed by the project.
  • Percentage complete of each area of traceability.

Question: Are requirements stable?
SA Metrics:
  • Software requirements volatility, trended after the project baseline (e.g., number of requirements added, deleted, or modified; number of TBDs). (See the sketch following this goal.)

Question: Are the software hazards adequately addressed in the software requirements?
SA Metrics:
  • Percentage complete of traceability to each hazard with software items. (New)
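To make the volatility metric above concrete, here is a minimal sketch, assuming a hypothetical change log in which each entry records a reporting period and the kind of change; the format and names are illustrative, not prescribed by the Handbook:

    from collections import Counter

    # Hypothetical change log: (reporting_period, change_kind) pairs recorded
    # after the project baseline. The format is illustrative only.
    change_log = [
        ("2024-Q1", "added"), ("2024-Q1", "modified"),
        ("2024-Q2", "added"), ("2024-Q2", "deleted"), ("2024-Q2", "modified"),
    ]

    def volatility_by_period(log):
        """Count requirements added/deleted/modified per reporting period."""
        trend = {}
        for period, kind in log:
            trend.setdefault(period, Counter())[kind] += 1
        return trend

    for period, counts in sorted(volatility_by_period(change_log).items()):
        print(period, dict(counts), "total:", sum(counts.values()))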

Goal Statement: Assure delivery of quality, safe, and secure code.

Goal: Quality Code

Question: Is the code secure, and has the code addressed cybersecurity requirements?
SA Metrics:
  • Number of cybersecurity secure coding violations per number of developed lines of code. (See the sketch following this goal.)
  • List of types of secure coding violations found.

Question: Is the safety-critical code safe?
SA Metrics:
  • Software cyclomatic complexity data for all identified safety-critical software components.

Question: What is the quality of the code?
SA Metrics:
  • Number of defects or issues found in the software after delivery.
  • The number of defects or non-conformances found in flight code, ground code, tools, and COTS products used.

Question: Do the requirements adequately address cybersecurity?
SA Metrics:
  • Number and type of identified cybersecurity vulnerabilities and weaknesses found by the project.
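As one way to compute the violation-density metric above, the following sketch assumes a list of rule identifiers as a static analyzer might report them; the input format and the CWE IDs are illustrative examples only:

    from collections import Counter

    def violation_metrics(violations, developed_sloc):
        """Violations per 1,000 developed SLOC, plus the types observed."""
        per_ksloc = len(violations) / (developed_sloc / 1000.0)
        return per_ksloc, Counter(violations)

    # Illustrative analyzer output and project size.
    density, types = violation_metrics(
        ["CWE-120", "CWE-787", "CWE-120"], developed_sloc=25_000)
    print(f"{density:.2f} violations per KSLOC; types: {dict(types)}")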

Goal Statement: Continuously improve the quality and adequacy of software testing to assure safe and reliable products and services are delivered.

Goal: Quality Software Testing

Question: Does the test program provide adequate coverage?
SA Metrics:
  • Software code coverage data.
  • Software requirements test coverage percentages, including the percentage of testing completed and the percentage of the detailed software requirements successfully tested to date. (See the sketch following this goal.)
  • Number of issues and discrepancies found during each test.
  • The number of lines of code tested.

Question: Does the software test program test all of the safety-critical code?
SA Metrics:
  • Test coverage data for all identified safety-critical software components.
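The requirements test coverage percentages above could be computed from a requirements-to-test trace matrix. The sketch below assumes a simple mapping from requirement ID to test status; the IDs and statuses are hypothetical:

    def requirements_test_coverage(trace):
        """Percentages of detailed requirements tested and passed to date.

        `trace` maps a requirement ID to a test status ("passed", "failed",
        or "untested").
        """
        total = len(trace)
        tested = sum(1 for s in trace.values() if s != "untested")
        passed = sum(1 for s in trace.values() if s == "passed")
        return 100.0 * tested / total, 100.0 * passed / total

    trace = {"SRS-001": "passed", "SRS-002": "failed", "SRS-003": "untested"}
    pct_tested, pct_passed = requirements_test_coverage(trace)
    print(f"{pct_tested:.0f}% tested; {pct_passed:.0f}% successfully tested")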

Goal Statement: Continuously monitor software projects to improve management of Software Plans, Procedures, and Defects to assure quality products and services are delivered on time and within budget.

Goal: Quality Software Plans, Procedures, and Defect Tracking

Question: Is the SW project proceeding as planned?
SA Metrics:
  • Comparison of the initial cost estimate and the final actual cost, noting assumptions and differences in cost parameters.

Question: Is the SW project addressing identified problems?
SA Metrics:
  • The number of findings from process non-compliances and process maturity.

Question: Is the SW project using peer reviews to increase product quality?
SA Metrics:
  • Number of peer reviews performed vs. number planned.
  • The number of defects found in each peer review.

Question: How well is the project following its processes and procedures?
SA Metrics:
  • Number of audit findings per audit.
  • The time required to close the audit findings.

Question: What is the defect tracking status, and why did the defects occur?
SA Metrics:
  • Problem/change report status: total number, number closed, number opened in the current reporting period, age, severity. (See the sketch following this goal.)
  • Number of defects or issues found in the software after delivery.
  • The number of defects or non-conformances found in flight code, ground code, tools, and COTS products used.
  • Number of software non-conformances at each severity level for each software configuration item.
  • The number of root cause analyses performed; list of findings identified by each root cause analysis.
  • The trend showing the closure of corrective actions over time.
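A problem/change report status rollup like the one described above can be produced in a few lines. This sketch assumes each report carries a status, a severity, and an opened date; the record format is illustrative only:

    from collections import Counter
    from datetime import date

    # Hypothetical problem/change reports: (status, severity, date_opened).
    reports = [
        ("closed", 3, date(2024, 1, 10)),
        ("open",   1, date(2024, 3, 2)),
        ("open",   2, date(2024, 4, 18)),
    ]

    def report_status(reports, today):
        """Total, open/closed counts, severity mix, and ages of open reports."""
        open_reports = [r for r in reports if r[0] == "open"]
        return {
            "total": len(reports),
            "closed": len(reports) - len(open_reports),
            "open": len(open_reports),
            "by_severity": dict(Counter(sev for _, sev, _ in reports)),
            "open_age_days": [(today - opened).days
                              for _, _, opened in open_reports],
        }

    print(report_status(reports, today=date(2024, 5, 1)))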

Goal Statement: Maintain and advance organizational capability in software assurance processes and practices to meet NASA-STD-8739.8 requirements.

Goal: SA Process Improvements

Question: Are SA findings providing value to software development?
SA Metrics:
  • The number of SA findings (e.g., number open, number closed, latency, number accepted) mapped against SA activities through the life cycle, including process non-compliances and process maturity.
  • The number of defects found by software assurance during each peer review activity.

Question: Is the SA effort proceeding as planned?
SA Metrics:
  • Trend of the software assurance cost estimates through the project life cycle.
  • Planned SA resource allocation versus actual SA resource allocation. (See the sketch following this goal.)
  • Percent of the required training completed for each of the project SA personnel.
  • The number of compliance audits planned vs. the number of compliance audits completed, and trends on non-conformances from the audits.
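For the planned-versus-actual metrics above (resource allocation, compliance audits), a simple variance-per-period computation is often enough. A minimal sketch, assuming per-period totals keyed by reporting period (the names and units are illustrative):

    def planned_vs_actual(planned, actual):
        """Per-period variance: positive means over plan, negative under."""
        return {period: actual.get(period, 0) - planned[period]
                for period in planned}

    # Illustrative SA staff-hours per reporting period.
    planned_hours = {"2024-Q1": 400, "2024-Q2": 400}
    actual_hours = {"2024-Q1": 380, "2024-Q2": 450}
    print(planned_vs_actual(planned_hours, actual_hours))
    # -> {'2024-Q1': -20, '2024-Q2': 50}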

See also 8.18 - SA Suggested Metrics

1.1 Additional Guidance

Links to Additional Guidance materials for this subject have been compiled in the Relevant Links table; see the Additional Guidance in the Resources tab.

2. Resources

2.1 Resources

  • (SWEREF-391) Basili, V., Caldiera, G., and Rombach, H. D. "The Goal Question Metric Approach." Institute for Advanced Computer Studies, Department of Computer Science, University of Maryland, College Park, Maryland; FB Informatik, Universität Kaiserslautern, Kaiserslautern, Germany.
  • (SWEREF-392) Park, R., Goethert, W., and Florac, W. "Goal-Driven Software Measurement: A Guidebook." CMU/SEI-96-HB-002, Software Engineering Institute, Carnegie Mellon University, 1996.


2.2 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:

2.3 Center Process Asset Libraries

SPAN - Software Processes Across NASA
SPAN contains links to Center managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance including processes, forms, checklists, training, and templates related to Software Development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki

See the following link(s) in SPAN for process assets from contributing Centers (NASA Only). 

SPAN Links