

SWE-199 - Performance Measures

1. Requirements

5.4.5 The project manager shall monitor measures to ensure the software will meet or exceed performance and functionality requirements, including satisfying constraints.

1.1 Notes

The metrics could include planned and actual use of computer hardware resources (such as processor capacity, memory capacity, input/output device capacity, auxiliary storage device capacity, and communications/network equipment capacity, bus traffic, partition allocation) over time. As part of the verification of the software detailed design, the developer will update the estimation of the technical resource metrics. As part of the verification of the coding, testing, and validation, the technical resource metrics will be updated with the measured values and will be compared to the margins.
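The comparison of measured resource values against their margins lends itself to a simple automated check. The sketch below is illustrative only; the resource names, capacities, and margin fractions are hypothetical placeholders for whatever the project has actually allocated.

    # Illustrative margin check; resource names, capacities, and margins are hypothetical.
    PLANNED_LIMITS = {
        # resource: (allocated capacity, required headroom as a fraction of capacity)
        "cpu_percent": (100.0, 0.30),
        "memory_mb":   (512.0, 0.25),
        "storage_mb":  (2048.0, 0.20),
    }

    def check_margins(measured):
        """Return findings for any resource whose measured use violates its margin."""
        findings = []
        for name, (capacity, margin) in PLANNED_LIMITS.items():
            used = measured.get(name, 0.0)
            headroom = (capacity - used) / capacity
            if headroom < margin:
                findings.append(f"{name}: {used:.1f} of {capacity:.1f} used, "
                                f"headroom {headroom:.0%} is below the {margin:.0%} margin")
        return findings

    # Measured values would come from design estimates early on and from test runs later.
    print(check_margins({"cpu_percent": 78.0, "memory_mb": 410.0, "storage_mb": 900.0}))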

1.2 History

SWE-199 - Last used in rev NPR 7150.2D

Rev | SWE Statement
A |
Difference between A and B | N/A
B |
Difference between B and C | NEW
C | 5.4.5 The project manager shall monitor measures to ensure the software will meet or exceed performance and functionality requirements, including satisfying constraints.
Difference between C and D | No change
D | 5.4.5 The project manager shall monitor measures to ensure the software will meet or exceed performance and functionality requirements, including satisfying constraints.



1.3 Applicability Across Classes

Class        |  A  |  B  |  C  |  D  |  E  |  F
Applicable?  |     |     |     |     |     |

Key:    - Applicable | - Not Applicable


2. Rationale

It is very important to consider constraints on resources in the design of a system so that the development effort can make appropriate decisions for both hardware and software components.  As development proceeds, it is important to check regularly that the software is meeting the performance and functionality constraints. These results should be reported at major milestone reviews and regularly to the Project Manager.

3. Guidance

3.1 Requirements Testing

It is important to remember that functional requirements are the ‘what’ and nonfunctional requirements are the ‘how’. So, the testing of functional requirements is the verification that the software is executing actions as it should, while nonfunctional testing helps verify that customer expectations are being met.

3.2 Functional Requirements

An early metric for gauging the success of meeting functional requirements would be the results of unit testing. The next step would be looking at the number of issues captured while dry-running verifications. Finally, the number of issues found and resolved along with the number of requirements verified during formal verification are good metrics to track.
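As a simple illustration, the counts mentioned above can be rolled into a small set of tracked numbers. The figures in this sketch are hypothetical; real values would come from the project's test results and issue-tracking tools.

    # Hypothetical functional-progress summary; all inputs are illustrative.
    def functional_metrics(unit_tests_run, unit_tests_passed,
                           reqs_verified, reqs_total,
                           issues_found, issues_resolved):
        """Summarize early indicators of progress against functional requirements."""
        return {
            "unit_test_pass_rate": unit_tests_passed / unit_tests_run if unit_tests_run else 0.0,
            "requirements_verified_pct": reqs_verified / reqs_total if reqs_total else 0.0,
            "open_issues": issues_found - issues_resolved,
        }

    print(functional_metrics(250, 238, 40, 120, 31, 27))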

3.3 Performance Requirements

Performance requirements can be difficult to determine since many are domain- and project-specific. They tend to depend on the software design as a whole, including how it interacts with the hardware. Performance requirements are usually, but not always, related to the quality attributes of the software system.

Some typical performance requirements involve the following:

  • Peak demand processing, i.e., transactions per second over some prescribed duration
  • Sustained processing, i.e., transactions per second over any period
  • Response time, i.e., time to service an interrupt
  • Storage capacity/utilization
  • Sampling rates
  • CPU utilization
  • Memory capacity/utilization

At a minimum, these items should be evaluated and reported to the Project at all major milestone reviews as well as during any major maintenance upgrades. See also SWE-195 - Software Maintenance Phase, 5.04 - Maint - Software Maintenance Plan, Topic 7.09 - Entrance and Exit Criteria, and 8.52 - Software Assurance Status Reports.
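A minimal sketch of how two of these measures (sustained processing and response time) might be collected is shown below. The process_transaction() function is a hypothetical stand-in for the real workload; the measured values would then be compared against the project's peak-demand, sustained-processing, and response-time requirements.

    import time

    # Hypothetical harness: process_transaction() stands in for the real workload.
    def process_transaction():
        _ = sum(i * i for i in range(1000))  # placeholder work

    def measure_throughput(duration_s=2.0):
        """Return (transactions per second, worst-case response time in seconds)."""
        count, worst = 0, 0.0
        end = time.perf_counter() + duration_s
        while time.perf_counter() < end:
            start = time.perf_counter()
            process_transaction()
            worst = max(worst, time.perf_counter() - start)
            count += 1
        return count / duration_s, worst

    tps, worst = measure_throughput()
    print(f"sustained: {tps:.0f} transactions/s, worst response: {worst * 1e3:.2f} ms")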

3.4 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:

3.5 Center Process Asset Libraries

SPAN - Software Processes Across NASA
SPAN contains links to Center managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance including processes, forms, checklists, training, and templates related to Software Development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki (SWEREF-197)

See the following link(s) in SPAN for process assets from contributing Centers (NASA Only). 

SPAN Links

4. Small Projects

No additional guidance is available for small projects.

5. Resources

5.1 References

  • (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN. SPAN is a compendium of Processes, Procedures, Job Aids, Examples, and other recommended best practices.


5.2 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool.  The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

6.1 NASA Lessons Learned

No Lessons Learned have currently been identified for this requirement.

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.

7. Software Assurance

SWE-199 - Performance Measures
5.4.5 The project manager shall monitor measures to ensure the software will meet or exceed performance and functionality requirements, including satisfying constraints.

7.1 Tasking for Software Assurance

From NASA-STD-8739.8B

1. Confirm that the project monitors and updates planned measurements to ensure the software meets or exceeds performance and functionality requirements, including satisfying constraints.

2. Monitor and track any performance or functionality requirements that are not being met or are at risk of not being met.

7.2 Software Assurance Products

  • Analysis of SW measures relating to SW meeting or exceeding requirements, highlighting any issues.


    Objective Evidence

    • Software measurement or metric data
    • Trends and analysis results on the metric set being provided
    • Status presentation showing metrics and trending data

    Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

    • Observations, findings, issues, or risks found by the SA/safety person; these may be expressed in an audit or checklist record, email, memo, or entry into a tracking system (e.g., Risk Log).
    • Meeting minutes with attendance lists, SA meeting notes, or assessments of the activities, recorded in the project repository.
    • Status report, email, or memo containing statements that confirmation has been performed, with the date (a checklist of confirmations could be used to record when each confirmation was done).
    • Signatures on SA reviewed or witnessed products or activities, or
    • Status report, email or memo containing a short summary of information gained by performing the activity. Some examples of using a “short summary” as objective evidence of a confirmation are:
      • To confirm that: “IV&V Program Execution exists”, the summary might be: IV&V Plan is in draft state. It is expected to be complete by (some date).
      • To confirm that: “Traceability between software requirements and hazards with SW contributions exists”, the summary might be: “x% of the hazards with software contributions are traced to the requirements.”
    • In addition to the examples listed above, the specific products listed in the Introduction of Topic 8.16 also serve as objective evidence.

7.3 Metrics

  • The number of requirements being met, the number of TBD/TBC/TBR requirements, and the number of requirements not being met.
  • Performance and functionality measures (Schedule deviations, closure of corrective actions, product and process audit results, peer review results, requirements volatility, number of requirements tested versus the total number of requirements, etc.)

See also Topic 8.18 - SA Suggested Metrics.
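The requirement-status counts above are straightforward to compute from the project's requirements data. In the sketch below the records and status labels are hypothetical; real data would come from the project's requirements management tool.

    from collections import Counter

    # Hypothetical requirement records for illustration only.
    requirements = [
        {"id": "SRS-001", "status": "met",     "tested": True},
        {"id": "SRS-002", "status": "TBD",     "tested": False},
        {"id": "SRS-003", "status": "not met", "tested": True},
        {"id": "SRS-004", "status": "met",     "tested": False},
    ]

    status_counts = Counter(r["status"] for r in requirements)
    tested_pct = sum(r["tested"] for r in requirements) / len(requirements)

    print(dict(status_counts))                       # counts of met / TBD / not met
    print(f"requirements tested: {tested_pct:.0%} of total")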

7.4 Guidance

Task 1: The software assurance personnel will review the software development/software management plan to become familiar with the functional and performance metrics the project has planned to collect. Functional metrics measure “what” the project is doing; for example, the number of unit tests run and passed would be a functional measure. Examples of performance metrics are:

  • Peak demand processing, i.e., transactions per second over some prescribed duration
  • Sustained processing, i.e., transactions per second over any period
  • Response time, i.e., time to service an interrupt
  • Storage capacity/utilization
  • Sampling rates
  • CPU utilization
  • Memory capacity/utilization

Software assurance then confirms that the project is collecting the planned measures and assessing whether they will meet their performance and functionality goals. Confirm that the project is updating the set of metrics being collected if the initial set chosen is not providing the information they need.

Task 2: Software assurance will do an independent analysis of the performance and functionality measures that the project is collecting to determine whether the project will meet or exceed its functional and performance requirements. It is particularly important to analyze some of these measures to get an early indication of problems. For example, if the number of unit tests run and passed is much lower than expected or planned for this point in the schedule, then it is unlikely the project will deliver on schedule without some correction. Requirements volatility is another early indicator of problems. If there is still a lot of volatility in the requirements going into implementation, then it is likely that implementation will be slower than planned and involve a lot of rework to account for changing requirements. 

When doing this analysis, software assurance should perform its own independent assessment of the performance and functionality measures that the project is collecting rather than simply accepting or relying on the engineering assessment.
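The two early indicators discussed above (unit-test progress against plan and requirements volatility) can be checked with a short script such as the following sketch; the figures and thresholds are hypothetical and would come from the project's plan and metric reports.

    # Illustrative early-warning check; figures and thresholds are hypothetical.
    def progress_risks(planned_tests_passed, actual_tests_passed,
                       req_changes_last_period, total_requirements,
                       lag_threshold=0.85, volatility_threshold=0.05):
        """Flag schedule and volatility risks from two simple leading indicators."""
        risks = []
        if planned_tests_passed and actual_tests_passed / planned_tests_passed < lag_threshold:
            risks.append("unit-test progress lags the plan; schedule risk")
        if total_requirements and req_changes_last_period / total_requirements > volatility_threshold:
            risks.append("requirements volatility is still high going into implementation")
        return risks

    print(progress_risks(planned_tests_passed=200, actual_tests_passed=150,
                         req_changes_last_period=14, total_requirements=180))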

7.5 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:
