SWE-051 - Software Requirements Analysis

1. Requirements

4.1.3 The project manager shall perform software requirements analysis based on flowed-down and derived requirements from the top-level systems engineering requirements, safety and reliability analyses, and the hardware specifications and design. 

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 History

See SWE-051 History for the history of this requirement.

1.3 Applicability Across Classes

[Applicability matrix for software classes A–F not reproduced.]

Key: A & B = Always Safety Critical; C & D = Sometimes Safety Critical; E & F = Never Safety Critical.

2. Rationale

Software requirements are the basis of a software project.

Analyzing software requirements allows a team to ensure that they are properly formed and accurately and clearly describe the software system to be built.  The analysis provides a structured method of reviewing requirements to identify any issues with them individually or as a collected set.  The team can address identified issues before using the requirements for further project work.  This reduces the need for future rework, not only of the requirements but also of any work based on those requirements.

3. Guidance

The software requirements analysis determines the requirement's safety criticality, correctness, consistency, clarity, completeness, traceability, feasibility, verifiability, and maintainability. The software requirements analysis activities include the allocation of functional, non-functional, and performance requirements to functions and sub-functions.

It is important to ensure that requirements have been evaluated adequately because incomplete requirements can cause several problems:

  • Incorrect estimates of project resources.
  • Missing or additional design elements.
  • Additional cost and schedule due to rework to correct missing or incorrect requirements.
  • Added resources for verification and validation.
  • Loss of customer confidence due to improperly described requirements.

Requirements are not incorporated into the software requirements specification until the analysis process has been completed.

The requirements analysis methodology needs to be "measurable or otherwise verifiable." 278 Checklists of questions to consider (such as those included in the Resources section of this guidance) may be helpful.

"In light of the operational concept and scenarios, the requirements for one level of the product hierarchy are analyzed to determine whether they are necessary and sufficient to meet the objectives of higher levels of the product hierarchy. The analyzed requirements then provide the basis for more detailed and precise requirements for lower levels of the product hierarchy." 157

Regardless of the methods chosen, the project team documents the methodology used for software requirements analysis in an appropriate project document, such as the Software Development Plan/Software Management Plan (SDP/SMP), and includes some minimum steps:

  • Verify requirements' safety criticality, correctness, consistency, and completeness.
  • Verify that the requirements are clear, precise, unequivocal, verifiable, testable, maintainable, and feasible.
  • Verify requirements traceability.
  • Verify that requirements have been properly flowed down from one level to the next (i.e., from the system requirements to the software subsystem requirements and to the various levels of requirements within the software subsystem).
  • Verify that requirements have been properly identified and flowed across from the software interfaces, including all computer hardware requirements and all fault management requirements.
  • Examine the requirements "individually and as an integrated set." 276

The team may perform analysis of software requirements in conjunction with the allocation of requirements to various levels of functions and sub-functions. Guidance on the logical decomposition of requirements may be found in SWE-050.

The following roles may be involved in software requirements analysis:

  • Software Requirements Engineers.
  • Software Safety and Assurance personnel.
  • Systems Engineers.
  • Hardware Engineers.
  • Operations.
  • Fault Management Engineers.
  • Customers.

Software requirements analysis begins after the System Requirements Review (SRR). The development team analyzes the software requirements for completeness and feasibility. The development team uses structured or object-oriented analysis and a requirements classification methodology to clarify and amplify the requirements. Prioritizing requirements may also occur as part of requirements analysis. Developers work closely with the requirements definition team to resolve ambiguities, discrepancies, and to-be-determined (TBD) requirements or specifications. The theme of reuse plays a prominent role throughout the requirements analysis and the design phase. Special emphasis is placed on identifying potentially reusable architectures, designs, code, and approaches.

When requirements analysis is complete, the development team prepares a summary requirements analysis report and holds a Software Requirements Review (SwRR). During the SwRR, the development team presents the results of their analysis for evaluation. Following the SwRR, the requirements definition team updates the requirements document to incorporate any necessary modifications and the requirements analysis is revised based on changes to requirements made after SwRR. This revision work is completed by Preliminary Design Review (PDR) at the same time the requirements are finalized.

Software requirements analysis is a continuous activity performed on all software requirements and software requirement changes.

The use of formal inspections is an excellent method of reviewing requirements with stakeholders because it brings multiple viewpoints to bear and achieves a common understanding of the requirements. Information on formal inspections can be found in SWE-087. Software peer reviews/inspections (SWE-088, SWE-089) are a recommended best practice for all safety- and mission-success-related requirements, design, and code components. Guidelines for software peer reviews/inspections are contained in the NASA Software Formal Inspections Standard (NASA-STD-8739.9). 277

Determine safety criticality

Software safety personnel need to be involved in the analysis of software requirements to determine their safety criticality. Software safety personnel analyze software requirements in terms of safety objectives to determine whether each requirement has safety implications. Those requirements with safety implications are designated, marked, and tracked as "safety-critical."

Additional analysis steps typically performed by software safety personnel include:

  • Verification that software safety requirements are derived from appropriate parent requirements, include modes, states of operation, and safety-related constraints, and are properly marked.
  • Verification that software safety requirements "maintain the system in a safe state and provide adequate proactive and reactive responses to potential failures."      

Additional information on the analysis performed by software safety personnel can be found in the NASA Software Safety Standard (NASA-STD-8719.13) 271 and the NASA Software Safety Guidebook (NASA-GB-8719.13). 276

Determine correctness

Requirements are considered correct if they "respond properly to situations" 001 and are appropriate to meet the objectives of higher-level requirements. A method for determining correctness is to compare the requirements set against operational scenarios developed for the project.

Determine consistency

Requirements are consistent if they do not conflict with each other within the same requirements set and if they do not conflict with system (or higher-level) requirements. It is helpful to have at least one person read through the entire set of requirements to confirm the use of consistent terms/terminology throughout.
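Reading for consistent terminology can be partly supported by simple tooling. The sketch below is illustrative only: the glossary, requirement IDs, and function names are invented for the example. It flags any synonym group from a project glossary where the requirements set mixes more than one variant of the same term.

```python
# Illustrative sketch: flag requirements whose wording mixes synonyms
# drawn from a project glossary, so a reviewer can confirm that one
# term is used consistently across the set.

def find_term_variants(requirements, synonym_groups):
    """Return {canonical_term: {variant: [req_ids]}} for every synonym
    group where more than one variant appears in the requirement set."""
    findings = {}
    for canonical, variants in synonym_groups.items():
        hits = {}
        for req_id, text in requirements.items():
            lowered = text.lower()
            for variant in variants:
                if variant in lowered:
                    hits.setdefault(variant, []).append(req_id)
        if len(hits) > 1:  # more than one variant in use -> inconsistent
            findings[canonical] = hits
    return findings

# Invented example data.
reqs = {
    "SRS-10": "The flight software shall send telemetry to the ground station.",
    "SRS-11": "The ground segment shall acknowledge each telemetry frame.",
}
glossary = {"ground system": ["ground station", "ground segment"]}
print(find_term_variants(reqs, glossary))
```

A pass like this cannot judge meaning; it only surfaces candidate inconsistencies for the human read-through described above.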

Determine clarity

Requirements are clear if they are precise, unequivocal, and unambiguous ("can only be interpreted one way" 001) both individually and as a collection. Requirements need to be concise, "stated as briefly as possible without affecting the meaning." 001

Suggested methods for confirming the clarity of requirements include:

  • Reading the requirements and their supporting documents.
  • Formal inspection.

Determine completeness

Requirements are complete if there are no omissions or undefined conditions in the requirements set. Requirements are also complete if there are no "TBDs" in the requirements set.

Suggested methods for confirming the completeness of requirements include:

  • Reading the requirements and their supporting documents.
  • Formal inspection.
  • Reviewing the requirements set to confirm that availability, installation, maintainability, performance, portability, reliability, safety, security, and other requirements are included as appropriate to the project. 061
  • Reviewing the requirements set to confirm they are "sufficiently complete to begin design." 061
  • Reviewing the requirements to confirm they have any necessary accompanying rationale and verifiable assumptions. 086
  • Reviewing the requirements set against operational scenarios developed for the project.
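Part of the completeness review can be automated as a first pass. The sketch below is illustrative only (the TBD/TBR/TBS markers and requirement IDs are assumptions); it scans a requirements set for placeholder markers that indicate undefined conditions.

```python
import re

# Illustrative sketch: scan requirement text for placeholder markers
# (TBD/TBR/TBS) that indicate undefined conditions in the set.
PLACEHOLDER = re.compile(r"\b(TBD|TBR|TBS)\b")

def find_placeholders(requirements):
    """Return {req_id: [markers]} for requirements containing placeholders."""
    return {
        req_id: PLACEHOLDER.findall(text)
        for req_id, text in requirements.items()
        if PLACEHOLDER.search(text)
    }
```

Such a scan catches only explicit markers; omissions and undefined conditions still require the reading and inspection methods listed above.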

Determine traceability

When determining requirement traceability, the team ensures that requirements trace bi-directionally: all software requirements have a parent (higher-level) requirement, and all levels of software requirements are flowed down to the appropriate detailed (lower) levels for implementation. For requirements to be properly traced, they must also be uniquely identified.

Suggested methods for this type of analysis include:

  • Tracing requirements from parent/source documents into the software requirements specification and vice versa.
  • Reviewing existing traceability matrices for completeness and accuracy (SWE-052).
  • Reviewing the requirements set to confirm there are no "extra" or "unneeded" requirements (those not necessary to meet the parent requirement).
  • Reviewing the requirements to confirm all performance requirements are realistic.
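The bidirectional trace check can be sketched programmatically. The example below is illustrative only: the requirement IDs and the shape of the trace data are assumptions, not a NASA-defined format. It reports orphan software requirements (no parent) and parent requirements that were never flowed down.

```python
def check_traceability(software_reqs, parent_reqs, trace):
    """trace maps a software requirement ID to the set of parent
    requirement IDs it satisfies (empty set or absence = untraced).
    Returns (orphans, childless):
      orphans   -- software requirements with no parent (broken upward trace)
      childless -- parent requirements never flowed down (possible gap)
    """
    orphans = sorted(r for r in software_reqs if not trace.get(r))
    traced_parents = set().union(*trace.values()) if trace else set()
    childless = sorted(p for p in parent_reqs if p not in traced_parents)
    return orphans, childless

# Invented example: SRS-2 has no parent, and SYS-2 was never flowed down.
orphans, childless = check_traceability(
    software_reqs=["SRS-1", "SRS-2"],
    parent_reqs=["SYS-1", "SYS-2"],
    trace={"SRS-1": {"SYS-1"}},
)
```

A real project would draw the same data from its requirements management tool's traceability matrix (SWE-052) rather than hand-built dictionaries.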

Determine feasibility

Technically feasible requirements are reasonable, realistic requirements that can be implemented and integrated together successfully to meet the operational concepts and system requirements of the project within the given operating environment, budget, schedule, available technology, and other constraints. 061

Suggested methods for this type of analysis include:

  • Reviewing requirements to confirm they do not "overly constrain the design." 061
  • Reviewing the requirements to confirm they do not unnecessarily "necessitate the use of non-standard, unusual, or unique hardware or software." 061
  • Reviewing the requirements to confirm they are appropriate for the operation and maintenance of the project.

Determine verifiability

Requirements are verifiable if they are testable, i.e., if there is "a technique to verify and/or validate the requirement." 001 Suggested techniques include testing, demonstration, inspection, and analysis.

Suggested methods for determining if requirements are verifiable include:

  • Reviewing the requirements to confirm that they use verifiable terms (e.g., do not use terms such as "easy," "sufficient," "adequate").
  • Reviewing the requirements set to confirm requirements are "stated precisely to facilitate specification of system test success criteria." 086
  • Reviewing the requirements to confirm that there is at least one feasible method identified to verify the requirement.
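Scanning for non-verifiable wording is straightforward to automate as a first pass. The sketch below is illustrative only; the list of vague terms is a small, project-specific example, not a standard vocabulary. It flags requirements that use terms a tester cannot objectively measure.

```python
# Illustrative sketch: flag requirement text that uses vague,
# unmeasurable terms, which a reviewer should replace with
# quantified, testable criteria.
VAGUE_TERMS = ("easy", "sufficient", "adequate", "user-friendly", "fast")

def find_unverifiable_terms(requirements):
    """Return {req_id: [vague terms found]} for flagged requirements."""
    return {
        req_id: [t for t in VAGUE_TERMS if t in text.lower()]
        for req_id, text in requirements.items()
        if any(t in text.lower() for t in VAGUE_TERMS)
    }
```

For example, "The system shall respond adequately" would be flagged, prompting a rewrite such as "The system shall respond within 2 seconds."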

Determine maintainability

Requirements are maintainable if they are "written so that ripple effects from changes are minimized (i.e., requirements are as weakly coupled as possible)." 086 Maintainability can be assessed by reviewing the requirements set for unnecessarily coupled or interdependent requirements.
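One rough, automatable proxy for coupling is the number of other requirements a requirement references by ID. The sketch below is illustrative only: the "SRS-n" ID pattern and the threshold are assumptions, and heavy cross-referencing is a prompt for review, not proof of a defect.

```python
import re

# Assumed project-specific requirement ID pattern.
REQ_ID = re.compile(r"\bSRS-\d+\b")

def coupling_report(requirements, threshold=2):
    """Flag requirements that reference more than `threshold` other
    requirements by ID -- a rough proxy for tight coupling whose
    ripple effects would complicate future changes."""
    report = {}
    for req_id, text in requirements.items():
        refs = set(REQ_ID.findall(text)) - {req_id}
        if len(refs) > threshold:
            report[req_id] = sorted(refs)
    return report
```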

Communicate outcome

Although not part of the engineering requirement, it is recommended that the results of software requirements analysis be captured in the project documentation and communicated to those who need this information to make decisions or to develop (or update) project documents. The stakeholders and the project will decide how to address the results of the analysis, including any changes that need to be made to address findings. The methodology used for the software requirements analysis and the results of that analysis are communicated at multiple formal project reviews, as defined in the software development or management plan. Specifically, according to the NASA Software Safety Standard (NASA-STD-8719.13), "The provider software safety requirements analysis will be available to the acquirer and the acquirer SMA [Safety and Mission Assurance] for program, project, and facility formal reviews, system-level safety reviews, and upon acquirer request." 271

When capturing the results of software requirements analysis, consider the following content:

  • Purpose and background of the project, overall system concepts, and document overview.
  • Key reuse candidates and overall architectural concept for the system.
  • Updates to operations concepts resulting from work performed during the requirements analysis phase.
    • Updated operations scenarios.
    • Operational modes, including volume and frequency of data to be processed in each mode, order, and type of operations, etc.
    • Updated descriptions of input, output, and messages.
  • Specification analysis
    • Summary of classifications (mandatory, derived, "wish list," information only, or TBD) assigned to requirements and functional specifications.
    • Problematic specifications (identification and discussion of conflicting, ambiguous, infeasible, untestable, and TBD requirements and specifications).
    • Unresolved requirements/operations issues, including the dates by which resolutions are needed.
    • Analysis of mathematical algorithms.
  • System constraints
    • Hardware availability (execution, storage, peripherals).
    • Operating system limitations.
    • Support software limitations.
  • Development assumptions.
  • Risks, both to costs and schedules, including risks related to TBD or changing requirements, as well as technical risks.
  • Prototyping efforts needed to resolve technical risks, including the goals and schedule for each prototyping effort.
  • Data flow or object-oriented diagrams (results of all functional decomposition or object-oriented analysis of the requirements performed during the requirements analysis phase).
  • Data dictionary for the updated processes, data flows, and objects shown in the diagrams.

Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources related to software requirements analysis, including relevant checklists.

NASA-specific requirements analysis process information and resources are available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook. 

Additional guidance related to software requirements analysis may be found in other related requirements in this Handbook.

4. Small Projects

Projects with small budgets or limited personnel may choose to limit the number of reviews involved in software requirements analysis. It is important in this situation to avoid skipping any important analysis activities. Consider using checklists or other guides to ensure all analysis elements are addressed.

Additionally, multiple roles may be filled by a single person on small projects, so it may be helpful to request assistance from experts outside the project when conducting requirements analysis. These persons can provide "fresh eyes" as well as specific key perspectives that may not be available on the core project team.

5. Resources

5.1 References

  • (SWEREF-001) Software Development Process Description Document, EI32-OI-001, Revision R, Flight and Ground Software Division, Marshall Space Flight Center (MSFC), 2010. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA-users from the SPAN tab in this Handbook.

  • (SWEREF-057) Software Management Plan (SMP) Template, GRC-SW-TPLT-SMP, NASA Glenn Research Center (GRC), 2009. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.

  • (SWEREF-061) JPL Document D-24994, NASA Jet Propulsion Laboratory, 2003. See Page 20. Approved for U.S. and foreign release.

  • (SWEREF-086) 5526_7-21-06_Req_RevA_generic-R1V0, 2006. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.

  • (SWEREF-105) Software System/Subsystem Requirements Specifications (SSRS) Checklist, NASA Marshall Space Flight Center (MSFC), 2012. This NASA-specific information and resource may be available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.

  • (SWEREF-157) CMMI Development Team (2010). CMU/SEI-2010-TR-033, Software Engineering Institute.

  • (SWEREF-174) Department of Defense Systems Management College, supplementary text prepared by the Defense Acquisition University Press, Fort Belvoir, VA, 2001.

  • (SWEREF-189) Writing an SRS, Foster, C.M. (1993). Analex Corporation for Glenn Research Center. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.

  • (SWEREF-271) NASA-STD-8719.13, Revision C, 2013-05-07.

  • (SWEREF-276) NASA-GB-8719.13, NASA, 2004.

  • (SWEREF-277) NASA-STD-8739.9, NASA Office of Safety and Mission Assurance, 2013. Change 1, 2016-10-07.

  • (SWEREF-278) NASA-STD-8739.8A, NASA Technical Standards System, approved 2020-06-01, superseding NASA-STD-8739.8 with Change 1.

  • (SWEREF-559) Public Lessons Learned Entry: 1501.

  • (SWEREF-576) Public Lessons Learned Entry: 3377.

5.2 Tools


6. Lessons Learned

6.1 NASA Lessons Learned

The NASA Lessons Learned database contains the following lessons learned related to software requirements analysis:

  • Software Requirements Management. Lesson Number 3377: 576 "Cost and schedule impacts that result from incomplete, incorrect, or changing software requirements increase the later they occur in the software life cycle."
  • Orbital Space Plane - Stay true to the process! (Contributor to Orbital Space Plane (OSP) problems.) Lesson Number 1501: 559 "Development of the Level 2 requirements did not follow established systems engineering guidelines for allocation, the inclusion of performance and functional requirements, validation, and feasibility assessments. ... Requirement development, analyses, and system design activities were not synchronized. Functional decomposition was not complete before system design started and before Level 3 requirements were base-lined. ... The process for demonstrating requirements feasibility was unclear."

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.

7. Software Assurance

SWE-051 - Software Requirements Analysis
4.1.3 The project manager shall perform software requirements analysis based on flowed-down and derived requirements from the top-level systems engineering requirements, safety and reliability analyses, and the hardware specifications and design. 

7.1 Tasking for Software Assurance

  1. Perform a software assurance analysis on the detailed software requirements to analyze the software requirement sources and identify any incorrect, missing, or incomplete requirements. 

7.2 Software Assurance Products

  • From SWE-050: the results of the independent SA analysis performed on the detailed software requirements, including the list of requirements issues identified and recorded in a problem tracking system.

    Objective Evidence:

    • None

    Definition of objective evidence:

    Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

    • Observations, findings, issues, risks found by the SA/safety person and may be expressed in an audit or checklist record, email, memo or entry into a tracking system (e.g. Risk Log).
    • Meeting minutes with attendance lists or SA meeting notes or assessments of the activities and recorded in the project repository.
    • Status report, email or memo containing statements that confirmation has been performed with date (a checklist of confirmations could be used to record when each confirmation has been done!).
    • Signatures on SA reviewed or witnessed products or activities, or
    • Status report, email, or memo containing a short summary of information gained by performing the activity. Some examples of using a "short summary" as objective evidence of a confirmation are:
      • To confirm that: “IV&V Program Execution exists”, the summary might be: IV&V Plan is in draft state. It is expected to be complete by (some date).
      • To confirm that: “Traceability between software requirements and hazards with SW contributions exists”, the summary might be x% of the hazards with software contributions are traced to the requirements.

7.3 Metrics

  • # of software work product Non-Conformances identified by life-cycle phase over time
  • # of Software Requirements (e.g. Project, Application, Subsystem, System, etc.)
  • # of Software Requirements that do not trace to a parent requirement   
  • Defect trends for trace quality (# of circular traces, orphans, widows, etc.)
  • # of detailed software requirements vs. # of estimated SLOC to be developed by the project
  • # of incorrect, missing and incomplete requirements (i.e., # of requirements issues) vs. # of requirements issues resolved
  • # of safety-related requirement issues (Open, Closed) over time
  • # of safety-related non-conformances identified by life-cycle phase over time
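Several of these metrics can be computed directly from a requirements-issue log. The sketch below is illustrative only; the issue-record fields are assumptions for the example, not a NASA-defined schema. It derives total issue counts, resolution counts, open safety-related issues, and a breakdown by issue kind.

```python
from collections import Counter

def requirements_issue_metrics(issues):
    """issues: list of dicts with assumed fields 'kind' (e.g. incorrect,
    missing, incomplete), 'status' ('Open'/'Closed'), and
    'safety_related' (bool). Returns counts matching several of the
    metrics listed above."""
    total = len(issues)
    resolved = sum(1 for i in issues if i["status"] == "Closed")
    safety_open = sum(1 for i in issues
                      if i["safety_related"] and i["status"] == "Open")
    by_kind = Counter(i["kind"] for i in issues)
    return {"issues": total, "resolved": resolved,
            "safety_open": safety_open, "by_kind": dict(by_kind)}
```

Tracking these counts per life-cycle phase over time yields the trend metrics above.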

7.4 Guidance

Software assurance and software safety should perform an independent SA analysis on the detailed software requirements as they are developed.  Use the guidelines in the SAANALYSIS - Software Assurance Analysis on the Detailed Software Requirements.

Make sure that the detailed software requirements include or point to requirements for COTS, GOTS, MOTS, OSS, or reused software components that are part of the software. For example, requirements for functions performed by a real-time operating system should be captured in the detailed software requirements.

NASA Missions go through a logical decomposition in defining their requirements. Requirements analysis addresses a system’s software requirements including analysis of the functional and performance requirements, hardware requirements, interfaces external to the software, and requirements for qualification, quality, safety, security, dependability, human interfaces, data definitions, user requirements, installation, acceptance, user operation, and user maintenance.  

When evaluating the software requirements, consider the list of items below:

  1. Is the approach to requirements decomposition reasonable, appropriate, and consistent?
  2. Are the system’s software requirements both individually and in aggregate of high quality (correct, consistent, complete, accurate, unambiguous, and verifiable)?
  3. Will requirements adequately meet the needs of the system and expectations of its customers and users?
  4. Do requirements consider the operational environment under nominal and off-nominal conditions? Specifically:
    • Do the requirements specify what the system is supposed to do?
    • Do requirements guard against what the system is not supposed to do?
    • Do the requirements describe how the software responds under adverse conditions?
  5. Is this requirement necessary?
  6. Are the requirements understandable?
  7. Are the requirements unnecessarily complicated?
  8. Has system performance been captured as part of the requirements?
  9. Are the system boundaries (or perhaps operational environment) well defined?
  10. Is a requirement realistic given the current technology?
  11. Is the requirement singular in nature, or could it be broken down into several requirements? (Judged by grammar, not by whether the requirement could be further decomposed.)
  12. Within each requirement level, are requirements at an appropriate and consistent level of abstraction?
  13. In the traceability, are the parent requirements represented in the appropriate child requirements? (Note: SWE-052 requires that software complete the bidirectional traceability, and the software assurance tasking for SWE-052 requires the assessment of the bidirectional traceability as well as confirming that the traceability includes tracing to hazard analysis.)
  14. Do the parent requirements include outside sources such as:
    • Hardware specifications
    • Computer/Processor/Programmable Logic Device specifications
    • Hardware interfaces
    • Operating system requirements and board support packages
    • Data/File definitions and interfaces
    • Communication interfaces, including bus communications
    • Software interfaces
    • Derived from Domain Analysis
    • Fault Detection, Isolation and Recovery requirements
    • Models
    • Commercial Software interfaces and functional requirements
    • Software Security Requirements
    • User Interface Requirements
    • Algorithms
    • Legacy or Reuse software requirements
    • Derived from Operational Analysis
    • Prototyping activities
    • Interviews
    • Surveys
    • Questionnaires
    • Brainstorming
    • Observation
    • Software Test Requirements
    • Software Fault Management Requirements
    • Hazard Analysis
