Excerpt Include
2D-SWE Page Message
2D-SWE Page Message
nopanel
true
Tabsetup
0
1. The Requirement
1
2. Rationale
2
3. Guidance
3
4. Small Projects
4
5. Resources
5
6. Lessons Learned
6
7. Software Assurance
Div
id
tabs-1
1. The Requirement
Excerpt
5.4.5 The project manager shall monitor measures to ensure the software will meet or exceed performance and functionality requirements, including satisfying constraints.
1.1 Notes
The metrics could include planned and actual use of computer hardware resources over time, such as processor capacity, memory capacity, input/output device capacity, auxiliary storage device capacity, communications/network equipment capacity, bus traffic, and partition allocation. As part of the verification of the software detailed design, the developer updates the estimates of the technical resource metrics. As part of the verification of coding, testing, and validation, the technical resource metrics are updated with measured values and compared against the margins.
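The margin comparison described in the note above can be sketched as follows. This is an illustrative example only, not a Handbook artifact; the resource names, planned utilization figures, and 20% margin are assumptions chosen for the sketch.

```python
# Sketch (not from the Handbook): comparing measured technical resource
# utilization against a required margin. Resource names, planned values,
# and the margin are illustrative assumptions.

PLANNED = {  # planned utilization as a fraction of total capacity
    "processor": 0.60,
    "memory": 0.55,
    "bus_traffic": 0.40,
}
MARGIN = 0.20  # assumed reserve: 20% of each capacity must remain free

def check_margins(measured: dict) -> list:
    """Return (resource, measured, planned) tuples that erode the margin."""
    violations = []
    for resource, used in measured.items():
        if used > 1.0 - MARGIN:
            violations.append((resource, used, PLANNED.get(resource)))
    return violations

measured = {"processor": 0.85, "memory": 0.50, "bus_traffic": 0.45}
for resource, used, planned in check_margins(measured):
    print(f"{resource}: measured {used:.0%} exceeds the {MARGIN:.0%} margin "
          f"(planned {planned:.0%})")
```

A real project would feed this from instrumented builds or lab measurements and report any violations at milestone reviews.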
1.2 History
Expand
title
Click here to view the history of this requirement: SWE-199 History
Include Page
SITE:SWE-199 History
SITE:SWE-199 History
1.3 Applicability Across Classes
Class       A  B  CSC  C  D  DSC  E  F  G  H
Applicable  1  1  1    1  0  1    0  0  0  0
(1 = applicable to that software class; 0 = not applicable)
Show If
label
activity
1.4 Related Activities
This requirement is related to the following Activities:
Related Links
Include Page
SWE-199 - Related Activities
SWE-199 - Related Activities
Div
id
tabs-2
2. Rationale
It is important to consider resource constraints in the design of a system so that the development effort can make appropriate decisions for both hardware and software components. As development proceeds, the team should check regularly that the software is meeting its performance and functionality constraints, and report the results at major milestone reviews and regularly to the Project Manager.
Div
id
tabs-3
3. Guidance
3.1 Requirements Testing
It is important to remember that functional requirements are the "what" and nonfunctional requirements are the "how." Testing of functional requirements verifies that the software executes actions as it should, while nonfunctional testing helps verify that customer expectations are being met.
3.2 Functional Requirements
An early metric for gauging the success of meeting functional requirements would be the results of unit testing. The next step would be looking at the number of issues captured while dry-running verifications. Finally, the number of issues found and resolved along with the number of requirements verified during formal verification are good metrics to track.
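The three functional metrics named above can be rolled into a simple progress summary. The sketch below is an assumed structure, not a Handbook-defined report; the function name and input counts are illustrative.

```python
# Sketch (illustrative, not a Handbook artifact): summarizing the three
# functional metrics above — unit-test results, issues from dry-run
# verifications, and formal verification progress.

def verification_status(unit_passed, unit_total,
                        dry_run_issues_open, dry_run_issues_total,
                        reqs_verified, reqs_total):
    """Return each functional metric as a completion fraction."""
    return {
        "unit_test_pass_rate": unit_passed / unit_total,
        "dry_run_issues_resolved": 1 - dry_run_issues_open / dry_run_issues_total,
        "requirements_verified": reqs_verified / reqs_total,
    }

status = verification_status(480, 500, 3, 40, 110, 150)
for metric, value in status.items():
    print(f"{metric}: {value:.1%}")
```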
3.3 Performance Requirements
Performance requirements can be difficult to determine since many are domain and project-specific. They tend to be dependent on the software design as a whole including how it interacts with the hardware. Performance requirements are usually, but not always, related to the Quality attributes of the software system.
Some typical performance requirements involve the following:
Peak demand processing, i.e., transactions per second over some prescribed duration
Sustained processing, i.e., transactions per second over any period
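The distinction between the two measures above can be made concrete. This is a minimal sketch under assumed conditions: transaction timestamps are seconds since the start of measurement, and the peak window size is a parameter the project would choose.

```python
# Sketch (illustrative): peak-demand processing is the busiest fixed-size
# window, while sustained processing averages over the whole run.
from collections import Counter
import math

def throughput(timestamps, peak_window=1.0):
    """Return (peak, sustained) transactions per second."""
    if not timestamps:
        return 0.0, 0.0
    # Sustained: total transactions over the full elapsed time.
    elapsed = max(timestamps) - min(timestamps) or 1.0
    sustained = len(timestamps) / elapsed
    # Peak: count transactions per fixed-size window, take the busiest.
    buckets = Counter(math.floor(t / peak_window) for t in timestamps)
    peak = max(buckets.values()) / peak_window
    return peak, sustained
```

A burst of transactions can satisfy a sustained-rate requirement while still violating a peak-demand one, which is why both are specified.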
Additional guidance related to this requirement may be found in the following materials in this Handbook:
Related Links
Include Page
SWE-199 - Related SWEs
SWE-199 - Related SWEs
Include Page
SWE-199 - Related SM
SWE-199 - Related SM
3.4 Center Process Asset Libraries
Excerpt Include
SITE:SPAN
SITE:SPAN
nopanel
true
See the following link(s) in SPAN for process assets from contributing Centers (NASA Only).
SPAN Links
Include Page
SITE:SPAN Metrics
SITE:SPAN Metrics
Div
id
tabs-4
4. Small Projects
No additional guidance is available for small projects.
Div
id
tabs-5
5. Resources
5.1 References
refstable
5.2 Tools
Include Page
Tools Table Statement
Tools Table Statement
Div
id
tabs-6
6. Lessons Learned
6.1 NASA Lessons Learned
No Lessons Learned have currently been identified for this requirement.
6.2 Other Lessons Learned
No other Lessons Learned have currently been identified for this requirement.
Div
id
tabs-7
7. Software Assurance
Excerpt Include
SWE-199 - Performance Measures
SWE-199 - Performance Measures
7.1 Tasking for Software Assurance
Panel
borderColor
blue
title
From NASA-STD-8739.8B
Include Page
SITE:SWE-199 - SA Task1
SITE:SWE-199 - SA Task1
Include Page
SITE:SWE-199 - SA Task2
SITE:SWE-199 - SA Task2
7.2 Software Assurance Products
Analysis of software measures relating to the software meeting or exceeding performance and functionality requirements, highlighting any issues.
Note
title
Objective Evidence
Software measurement or metric data
Trends and analysis results on the metric set being provided
Status presentation showing metrics and trending data
Expand
title
Definition of objective evidence
Include Page
SITE:Definition of Objective Evidence
SITE:Definition of Objective Evidence
7.3 Metrics
The number of requirements being met, any TBD/TBC/TBR requirements, or any requirements that are not being met.
Performance and functionality measures (Schedule deviations, closure of corrective actions, product and process audit results, peer review results, requirements volatility, number of requirements tested versus the total number of requirements, etc.)
Task 1: Software assurance personnel will review the software development/software management plan to become familiar with the functional and performance metrics the project has planned to collect. Functional metrics measure "what" the project is doing; for example, the number of unit tests run and passed is a functional measure. Examples of performance metrics include:
Peak demand processing, i.e., transactions per second over some prescribed duration
Sustained processing, i.e., transactions per second over any period
Response time, i.e., time to service an interrupt
Storage capacity/utilization
Sampling rates
CPU utilization
Memory capacity/utilization
Software assurance then confirms that the project is collecting the planned measures and assessing whether it will meet its performance and functionality goals, and confirms that the project updates the set of metrics being collected if the initial set does not provide the information needed.
Task 2: Software assurance will do an independent analysis of the performance and functionality measures that the project is collecting to determine whether the project will meet or exceed its functional and performance requirements. It is particularly important to analyze some of these measures to get an early indication of problems. For example, if the number of unit tests run and passed is much lower than expected or planned for this point in the schedule, then it is unlikely the project will deliver on schedule without some correction. Requirements volatility is another early indicator of problems. If there is still a lot of volatility in the requirements going into implementation, then it is likely that implementation will be slower than planned and involve a lot of rework to account for changing requirements.
In this case, software assurance should perform its own independent assessment of the performance and functionality measures the project is collecting, rather than simply accepting the engineering assessment, to determine whether the project will meet or exceed its functional and performance requirements.
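The early-warning checks described in Task 2 can be sketched as simple threshold tests. The thresholds below (80% of planned test progress, 10% requirements volatility) are hypothetical values chosen for illustration; each project would set its own.

```python
# Sketch (hypothetical thresholds): flagging the two early indicators
# discussed above — unit-test progress well behind plan, and high
# requirements volatility going into implementation.

def early_warnings(planned_tests_passed, actual_tests_passed,
                   requirements_changed, requirements_total,
                   lag_threshold=0.80, volatility_threshold=0.10):
    """Return a list of warning strings; empty means no flags raised."""
    warnings = []
    if actual_tests_passed < lag_threshold * planned_tests_passed:
        warnings.append("unit-test progress well behind plan: "
                        f"{actual_tests_passed}/{planned_tests_passed}")
    volatility = requirements_changed / requirements_total
    if volatility > volatility_threshold:
        warnings.append(f"requirements volatility high: {volatility:.0%}")
    return warnings

for warning in early_warnings(planned_tests_passed=200,
                              actual_tests_passed=120,
                              requirements_changed=30,
                              requirements_total=200):
    print("WARNING:", warning)
```

Raising these flags early gives the project time to correct course before the schedule impact is locked in.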
7.4 Additional Guidance
Additional guidance related to this requirement may be found in the following materials in this Handbook: