Tab 2 of this page lists all the current content of the SWE tab 4s. It can be used to build a meaningful Small Projects topic.

This requirement primarily applies to the Office of the Chief Engineer (OCE) at Headquarters and the performing Centers executing projects. The size of the project is not relevant to this requirement. Center requirements and policies apply to small projects as defined in the Center documentation. Typically, the OCE includes a small project in the survey activities at a Center. The OCE survey leader will work with the Center SW point of contact (POC) to develop the appropriate level of survey involvement for small projects.

Small projects do need to establish, document, execute, and maintain software processes per the requirements in this directive. Smaller projects may reuse processes previously developed or use processes developed by the organization or Center. Existing processes may be tailored to meet the project's risk level.

All software developed, acquired, or being maintained by NASA is included in the inventory. The size of the project is not used as a discriminator for the inclusion of project data in the inventory.

Small projects with limited budgets or personnel may consider combining several plans into a single plan, devoting sections of the larger plan to specific planning topics.

The basic estimation techniques apply to small as well as large tasks, but they are tailored or scaled to fit the size and scope of the task. One area of concern is the cost models, as they tend to be calibrated to medium-size and larger projects. It is recommended that greater reliance be placed upon analogy-based cost estimates unless the model being used can be verified for use on small tasks in your local environment.

Microsoft® Project is frequently used to develop and track schedules. This software is typically accessible for use on small projects.
The level of detail associated with a small project's schedule could be less than that for a larger project. This saves small projects time in both the development and tracking of the schedule. Use a tool that is appropriate for the scale of your project. The development team may also use a Project Management Tool (like Microsoft® Project or Omniplan) to which it already has access. However, a small project of 2-3 people, or a project of less than 1 month, that may not already have access to an existing tool, may be better served by a Microsoft® Excel/Word or text-based schedule than by acquiring and being trained on using a project management tool. Small projects can consider using a pre-existing training plan if the project is similar to previous projects. Organizations that are responsible for developing numerous, similar small projects can develop an umbrella training plan that covers these projects. This requirement applies to all projects depending on the determination of the software classification of the project (see SWE-020 ). A smaller project, however, may be able to get by with less frequent reviews, if the risk to the overall project or program by the software product is low. Periodic evaluations of the software classification and risk level may validate the use of less frequent reviews or suggest an increase in their frequency. 7.8 - Maturity of Life Cycle Products at Milestone Reviews provides a chart that summarizes the current guidance approved by the NASA Office of the Chief Engineer (OCE) for software engineering life cycle products and their maturity level at the various software project life cycle reviews. This chart serves as guidance only and NASA Center procedures should take precedence for projects at those Centers. No additional guidance is available for small projects. This requirement applies to projects, regardless of size, unless a waiver is granted by the appropriate TA. 
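For the Excel/text-based schedule option mentioned earlier in this section, the tracking a 2-3 person project needs can be very small. Below is a minimal sketch, assuming hypothetical milestone names and dates; it only illustrates the kind of lightweight status report a text-based schedule can give without a project management tool.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Milestone:
    name: str
    due: date
    done: bool = False

# Hypothetical milestones for a 2-3 person, roughly one-month effort.
schedule = [
    Milestone("Requirements baselined", date(2024, 3, 4)),
    Milestone("Code complete", date(2024, 3, 18)),
    Milestone("Unit tests passed", date(2024, 3, 22)),
    Milestone("Delivery", date(2024, 3, 29)),
]

def status_report(milestones, today):
    """One line per milestone: done, late, or still due."""
    lines = []
    for m in milestones:
        if m.done:
            state = "DONE"
        elif m.due < today:
            state = "LATE"
        else:
            state = "DUE"
        lines.append(f"{state:4}  {m.due.isoformat()}  {m.name}")
    return "\n".join(lines)
```

The same information fits equally well in a spreadsheet column; the point is that the tracking overhead stays proportional to the project.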
Projects with limited financial or personnel resources need to carefully analyze what work the team needs to perform for a transitioned project. For example, if a project has completed the design phase and then the decision is made to transition that project to a higher classification, all completed work through the design phase needs to be assessed to determine how much of that work fails to meet the requirements of the higher classification. A small project with limited personnel or financial resources needs to weigh the value of repeating or performing work to meet the higher classification requirements against the risk to the overall project of having requirements and design that do not meet the requirements of the higher classification. The result of this analysis may provide the basis for the project to reduce or eliminate some of the transition efforts. Keep in mind, however, that all projects, regardless of size, will require waivers for any unmet requirements.

Software assurance is required regardless of project size. Since NASA-STD-8739.8 278 defines roles, not individuals, projects with limited personnel resources can use one person to fulfill multiple roles and perform multiple software assurance functions, as long as the proper independence for the specific requirement is maintained. In small project situations, sometimes it will be necessary for a project to have software assurance conducted by someone not on the project, but potentially from the same organization. This is not the preferred approach, but better than having no software assurance done at all.
Additionally, for acquired software, software assurance is performed by both the acquirer and the provider, so projects with small acquirer staffs could consider doing more "insight" (a surveillance mode requiring the monitoring of customer-identified metrics and contracted milestones). Per NASA-STD-8739.8, "Tailoring the implementation of software assurance requirements is acceptable commensurate with the program/project classification as well as size, complexity, criticality, and risk." The specific activities and depth of analyses needed to meet the requirements can, and should, be scaled to the software safety risk. In other words, while the requirements must be met, the implementation and approach to meeting those requirements may and should vary to reflect the system to which they are applied. Substantial differences may exist when the same software safety requirements are applied to dissimilar projects. For projects designated as small based on personnel or budget, several options may be considered to assist in the fulfillment of this requirement.

This requirement applies to all projects regardless of size. It is not unusual for smaller and less critical projects to utilize engineering personnel to fulfill some or all of the assurance duties (rather than personnel from the Center's Safety and Mission Assurance organization).

Smaller projects may also consider using a work tracking system or configuration management tool that provides automatic notification and tracking when updates to documentation occur and re-baselining is necessary. Several small projects have begun using wikis to create and maintain project documentation. The use of a wiki allows those working on the project to be notified of changes as they occur.

This requirement applies to all projects regardless of size. The National Defense Industrial Association (NDIA) CMMI® Working Group conducted a study on the use of CMMI®-DEV within small businesses in 2010 158.
One of the counter-intuitive findings was that the "Perceptions that CMMI® is too burdensome for small businesses is not supported by data on CMMI®-DEV adoption". Significant numbers of organizations in the 1-20 employee range adopted and achieved CMMI® Level ratings. Small projects are expected to take advantage of the Agency, Center, and/or organizational assets.

While assessing all available options is important for any software development project, it may be even more important for projects with limited budgets, personnel, or both. Small projects need to evaluate their available resources against the possible solutions to find the best fit with the least risk. The use of existing trade studies and market analyses may reduce the cost and time of assessing available options.

No additional guidance is available for small projects. The community of practice is encouraged to submit guidance candidates for this paragraph.

Small projects may want to use a standard set of processes that have been tailored for their development environment and type of project. These processes may have been developed by people in the same organization who have done similar developments.

Project review plans and milestones are covered in NPR 7120.5 083, NPR 7120.7 264, NPR 7120.8 269, and NPR 7123.1 041. Projects are to ensure that software components are a part of specific project milestone reviews. Small projects need to determine a review process that meets the project requirements and contains adequate content to provide the greatest insight into progress toward the project's technical goals and the technical direction of the project.

This requirement applies to all projects regardless of size. Electronic access to software work products and software process tracking information is required for every project.
However, access plans need to be written to a level of detail (e.g., limited schedules, minimum deliveries) appropriate for and commensurate with the size, complexity, risk, and safety aspects of the project.

No additional guidance is available for small projects.

For projects with limited personnel, consider limiting audit participation to monitoring progress and reviewing the results, as this causes less interference and requires fewer resources.

This requirement applies to all projects regardless of size. Smaller projects may get by with fewer levels of detail in the schedule for the life cycle.

Projects with small budgets or limited personnel may choose to limit the number of reviews involved in software requirements analysis. It is important in this situation to avoid skipping any important analysis activities. Consider using checklists or other guides to ensure all analysis elements are addressed. Additionally, multiple roles may be filled by a single person on small projects, so it may be helpful to request assistance from experts outside the project when conducting requirements analysis. These persons can provide "fresh eyes" as well as specific key perspectives that may not be available on the core project team.

For projects with limited budgets or staff, spreadsheet programs may be used to manage the requirements and track changes to them; when the number of requirements is large and a project can afford it, using an appropriate combination of a requirements management tool, a change management tool, and a change request tool can make managing requirements change easier for the project.

No additional guidance is available for small projects. Small projects need to balance the effectiveness of the available methods against available resources to validate requirements associated with the software.
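The spreadsheet-based requirements management described above amounts to a table of requirements plus an append-only change history. A minimal sketch follows; the requirement IDs, texts, and change-request number are hypothetical, and the CSV stands in for a real spreadsheet file.

```python
import csv
import io

# Hypothetical "spreadsheet": one row per requirement.
REQS_CSV = """\
id,text,version,status
SRS-001,The system shall log all commands,2,approved
SRS-002,The system shall reject malformed input,1,draft
"""

def load_requirements(csv_text):
    """Index requirement rows by their ID."""
    return {row["id"]: row for row in csv.DictReader(io.StringIO(csv_text))}

def change_requirement(reqs, history, req_id, new_text, rationale):
    """Record the old wording before updating, so every change is traceable."""
    old = reqs[req_id]
    history.append((req_id, int(old["version"]), old["text"], rationale))
    old["text"] = new_text
    old["version"] = str(int(old["version"]) + 1)

reqs = load_requirements(REQS_CSV)
history = []
change_requirement(reqs, history, "SRS-001",
                   "The system shall log all commands with timestamps",
                   "CR-042 (hypothetical): operators need event times")
```

The same discipline, never overwrite a requirement without logging its prior wording, is what a dedicated requirements management tool enforces automatically.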
Safety-critical requirements, human-rated requirements, and other critical requirements need to be validated with appropriately rigorous methods that are documented in the project's software development/management plan. Software architecture is one of those non-coding activities that can improve the quality of the software. Small projects may want a less-formal, more-affordable method of development. In general, if software development involves a low-risk and highly precedented system, the project can skimp on architecture. If the development involves high-risk and novel systems, the project must pay more attention to it. 131 Smaller, less risky projects may do just enough architecture by identifying their project's most pressing risks and applying only architecture and design techniques that mitigate them. Regardless of size, the resulting software architecture still needs to be adequately documented. The completion of a software development project inherently means that detailed design activities are conducted and completed. Smaller projects may benefit from limiting the development or use of original or unique tools and environments in the detailed design process. Smaller projects may consider using previously developed coding standards and documentation processes, if applicable, rather than developing their own. These standard applications may be available in the Center's Process Asset Library (PAL) or the software PALs of other Centers. No additional guidance is available for small projects. The community of practice is encouraged to submit guidance candidates for this paragraph. Smaller projects may consider using previously developed/tailored coding methods, standards, and guidelines rather than developing their own. These standard applications may be available in the software Process Asset Library (PAL) of other Centers, if not available at the performing Center. 
Projects with limited budgets and personnel may choose to perform unit testing or capture unit test results and artifacts in a less formal manner than projects with greater resources. Regardless of the formality of the procedures used, the software test plans for unit testing need to describe the test environment/setup, the results captured, simple documentation procedures, and compliance checks against the procedures. Some Centers have tailored lean unit test procedures and support tools specifically for small projects. No additional guidance is available for small projects. In many cases, test documentation can be combined, for example, the test plan and test procedures may be documented in the same document or the test procedures can be organized so test results can be captured in the same file or document. Software testing is required regardless of project size. No additional guidance is available for small projects. Small projects may choose to lighten their Verification and Validation (V&V) and accreditation requirements through the use of software models, simulations, and analysis tools verified, validated, and accredited by other NASA projects that used these tools in a similar manner and purpose as the small project. To determine the relevance and usefulness of this option, small projects need to be aware of the differences between the projects and the prior project's V&V and accreditation activities as well as the versions of the models, simulations, and analysis tools on which the V&V and accreditation activities were performed. No additional guidance is available for small projects. The community of practice is encouraged to submit guidance candidates for this paragraph. The small project does not normally involve highly complex platforms, so it is generally easier and cheaper to validate software systems on the targeted platform. 
However, the environment for space systems will typically need to be simulated during validation for projects regardless of size. When using simulated platforms, small projects are advised to look for existing tools rather than creating their own.

To facilitate planning activities for projects with limited staff or budgets, consider adapting a maintenance plan from a similar project, making sure to update the plan to reflect the current project's operations, maintenance, and retirement plans. The contents of a maintenance plan can also be addressed as sections in another project document and are not required to be in a separate document.

No additional guidance is available for small projects.

CM activities are based on risk, so projects designated small by the size of the team or budget need to ensure that their Software Configuration Management (SCM) plans consider all the recommended content noted in 7.18 - Documentation Guidance, but only include those processes and the associated structure commensurate with project risk. This might mean planning to use simpler tools or using personnel to fill multiple roles to carry out the SCM processes. It could also mean planning to use a single tool for multiple purposes to reduce tool management and overhead. Small projects may not require the formality of a separate SCM plan; instead, SCM planning may be documented as a section of the project's Software Management Plan. Alternatively, one master SCM plan may document CM for multiple small projects.

Projects with limited budgets or personnel could reduce the overhead of tracking and evaluating changes, collecting metrics, etc., by using automated change request tools. Using existing tools can reduce purchase and setup costs for the project, and if the tools are familiar to team personnel, training and start-up costs may also be minimized.
Some automated tools have multiple capabilities that can provide the team with the means to perform multiple change tracking and evaluation activities with a single tool. Additionally, a small team size may be conducive to less formal evaluation methods, such as incorporating impact analysis into team meetings rather than holding separate meetings or assigning separate tasks with formal reports due to an evaluation board. Even though small projects may use less formal methods of tracking and evaluating changes, it is still very important to have a record of the changes and associated decisions so the team can have confidence in the final products. No additional guidance is available for small projects. For projects with a small staff size, the change authority or Change Control Board (CCB) for baselines and modifications to CIs may be a single person with the proper vision and oversight, such as the software manager, systems manager, product development lead, etc. Small projects with a limited budget or limited access to complex or expensive change request tools may choose to use a simpler spreadsheet tool such as the Problem Report Tool to manage change requests and authorizations, and obtain associated metrics. Projects with limited personnel and access to an automated CM tool that has reporting features may find that those features are helpful in fulfilling this requirement. For projects with limited personnel, consider sharing lead auditors or audit team members among projects. Another suggestion is for members of small projects to conduct configuration audits of other small projects. Projects with limited budgets may choose to follow a common Center or project-level release management procedure rather than have separate procedures for each project. Slight modifications may be required for each project, but the overall master process would not have to be developed or maintained on a per-project basis. 
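The simple spreadsheet-style change request tracking described above, with a single tool producing the associated metrics, can be sketched as follows. The change-request IDs and field names are hypothetical; a real project would keep these rows in its spreadsheet or change request tool.

```python
from collections import Counter

# Hypothetical change-request log, the kind a small project might keep in
# a single spreadsheet tab instead of a dedicated change-management tool.
change_requests = [
    {"id": "CR-001", "state": "closed", "disposition": "approved"},
    {"id": "CR-002", "state": "open",   "disposition": None},
    {"id": "CR-003", "state": "closed", "disposition": "rejected"},
    {"id": "CR-004", "state": "closed", "disposition": "approved"},
]

def cr_metrics(crs):
    """The basic counts a change authority (even a one-person CCB) wants."""
    states = Counter(cr["state"] for cr in crs)
    approved = sum(1 for cr in crs if cr["disposition"] == "approved")
    return {"open": states["open"], "closed": states["closed"],
            "approved": approved}
```

Deriving the metrics from the same rows used to authorize changes keeps the record and the reporting from drifting apart.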
Projects with limited budgets may consider using spreadsheets or small databases to track their project risks rather than purchase a tool for this purpose. Small projects could also consider using tools available at the Center level since those may have no associated purchase or lease costs. While small projects are required to use peer reviews and inspection processes to evaluate key artifact types, they could make the task more manageable by varying the size of the inspection team, as long as key stakeholders are still represented. When it isn't possible to find all of the needed expertise from within the project team itself, consider whether the peer review/inspection team can leverage personnel from: Small teams also determine whether quality assurance personnel from the Center participate, for example, by providing a trained moderator to oversee the inspection logistics. Checklists for various types of inspections can be found at the Fraunhofer Center website 421. Various inspection tools can be used to reduce the effort of tracking the information associated with inspections. See the "Tools" section of the Resources tab for a list of tools. Projects with small budgets or a limited number of personnel need not use complex or user-intensive data collection logistics. Given the amount of data typically collected, well-known and easy-to-use tools such as Excel sheets or small databases (e.g., implemented in MS Access) are usually sufficient to store and analyze the inspections performed on a project. Since small projects are typically constrained by budget and staff, they choose the objectives most important to them to help keep the cost and effort within budget. Some Centers have tailored measurement requirements for small projects. Be sure to check your Center's requirements when choosing objectives and associated measures. A few key measures to monitor the project's status and meet sponsor and institutional objectives may be sufficient. 
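The spreadsheet or small-database risk tracking suggested above usually reduces to a ranked risk register. A minimal sketch follows; the risks, the 5x5 likelihood-by-consequence scoring, and the red/yellow/green thresholds are illustrative assumptions, not values from any Center procedure.

```python
# Hypothetical risk register rows, as they might appear in a spreadsheet.
risks = [
    {"id": "R-1", "title": "Key developer leaves",  "likelihood": 2, "consequence": 4},
    {"id": "R-2", "title": "Target platform delay", "likelihood": 4, "consequence": 3},
    {"id": "R-3", "title": "Late requirements churn", "likelihood": 3, "consequence": 2},
]

def rank_risks(risks):
    """Score each risk (likelihood x consequence) and sort highest first."""
    def exposure(r):
        return r["likelihood"] * r["consequence"]
    for r in risks:
        score = exposure(r)
        r["exposure"] = score
        # Illustrative banding thresholds.
        r["band"] = "red" if score >= 12 else "yellow" if score >= 6 else "green"
    return sorted(risks, key=exposure, reverse=True)
```

A tool available at the Center level would typically add ownership, mitigation status, and trend columns to the same basic table.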
Data collection timing may be limited in frequency. The use of tools that collect measures automatically helps considerably. While a small project may propose limited sets and relaxed time intervals for the measures to be collected and recorded, the project still needs to select, and record in the Software Development Plan, the procedures it will use to collect and store software measurement data. Small projects may consider software development environments or configuration management systems that contain automated collection, tracking, and storage of measurement data. Many projects within NASA have been using the JIRA environment (see section 5.1, Tools) with a variety of plug-ins that help capture measurement data associated with software development.

Data reporting activities may be restricted to measures that support safety and quality assessments and the overall organization's goals for software process improvement activities. Data reporting timing may be limited to annual or major review cycles.

Often small projects have difficulty affording tools that would enable automatic measurement collection. There are several solutions to this issue. Some Centers, such as GSFC, have developed simple tools (often Excel-based) that will produce the measurements automatically. GSFC examples are the staffing tool, the requirements metrics tool, the action item tool, the risk tool, and the problem reporting tool. Other solutions for small projects involve organizational support. Some organizations support a measurement person on staff to assist the small projects with measurement collection, storage, and analysis, and some Centers use tools that can be shared by small projects.
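The automated measurement rollup that the simple Excel-based tools above produce can be sketched in a few lines. The problem-report fields and counts below are hypothetical; the point is only that a small script over existing records can satisfy a relaxed reporting interval.

```python
# Hypothetical problem-report records, as a tracking tool might export them.
problem_reports = [
    {"id": "PR-1", "severity": "major", "open": False},
    {"id": "PR-2", "severity": "minor", "open": True},
    {"id": "PR-3", "severity": "major", "open": True},
]

def rollup(prs):
    """Counts a small project might report at each major review cycle."""
    return {
        "total": len(prs),
        "open": sum(1 for p in prs if p["open"]),
        "open_major": sum(1 for p in prs
                          if p["open"] and p["severity"] == "major"),
    }
```

Regenerating these counts from the raw records at each review, rather than maintaining them by hand, is what makes the relaxed collection interval safe.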
No additional guidance is available for small projects. No additional guidance is available for small projects.

While SWE-091 may offer limits to the type and interval of measures to be recorded, the project still needs to collect and analyze the selected software measurement data to develop the key software metrics. Using previously defined analysis procedures can help a project reduce the time and effort needed to develop procedures. Also, certain development environments, such as JIRA (see section 5.1, Tools) and associated plug-ins, or configuration management systems, can help automate the collection and distribution of information associated with the analysis of development metrics.

No additional guidance is available for small projects. No additional guidance is available for small projects.

Small projects will find value in using the examples, templates, tools, and best practices on their project. This is primarily an institutional requirement; training in the unique implementation approaches and technologies being used by a small project is the small project's responsibility to provide.

Small projects may lack the resources and schedule to individually apply for waiver relief from specific sets of NPR 7150.2 requirements. Centers can request a generic waiver that will cover multiple small projects.
All requirements are listed in the compliance matrix, but the project can determine the critical entries, along with waiver/deviation candidates, during the development of the compliance matrix to balance project risks when personnel or funding restrictions limit documentation and other activities.

This requirement applies to all projects regardless of size. The OCE survey leader will work with the Center SW POC to develop the appropriate level of survey involvement for small projects.

All projects with mission- or safety-critical software that receive IV&V services are treated the same. The same process is used for developing an IPEP, regardless of project size, and IPEPs all have the same basic elements (objectives, schedules, interfaces, etc.). The actual content of each IPEP is, of course, different for each project.

This requirement applies to all projects regardless of size. Small projects can benefit from inserting static analysis into their development process. As mentioned above, there are many static analyzers available free of charge, so using these does not impose an additional cost on a small project with constrained resources. It might, however, have an impact on development time: it takes time to get used to a new tool, and free tools tend to generate a lot of false positives. In practice, the time spent getting used to a tool is usually negligible; most tools use the same interfaces as compilers and are not difficult to use. There might be differences due to the language parser used by the tool. To ensure smooth integration of the tool into the development process, pick a tool that relies on parsers from common compilers (e.g., GCC). This helps to ensure that the tool can be easily integrated into existing makefiles or other compiling mechanisms. The generation of many false positives can be a problem, since it might overwhelm the users.
In some sense, this is similar to the problem of choosing an adequate warning level for a compiler: generating too many warnings causes them to be ignored. Choose a static analyzer that has the capability of filtering results or that can be adjusted to generate fewer warnings. In general, it is a good habit to start using a static analyzer at the level that generates the fewest warnings and to slowly increase the level until too many warnings are generated.

Small projects with limited resources may look to software development tools and environments that were previously accredited on other projects. Reviews of project reports, Process Asset Libraries (PALs), and technology reports may surface previous use cases and/or acceptance procedures that apply to new tools pertinent to the current project.

No additional guidance is available for small projects. No additional guidance is available for small projects. All projects are assessed against these criteria to determine whether they should receive IV&V services. No additional guidance is available for small projects. The community of practice is encouraged to submit guidance candidates for this paragraph. No additional guidance is available for small projects.

This requirement applies to all projects regardless of size. Process assets from small projects may be useful for other small projects in their efforts to meet the requirements of this NPR. This requirement applies to all projects regardless of size. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects.

The conditions for cost estimates apply to small as well as large tasks but should be scaled to fit the size and scope of the task. All estimates are made using some combination of the basic estimation methods of analogy or Cost Estimating Relationship (CER).
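The advice above about starting at the analyzer level that generates the fewest warnings and slowly raising it can be automated with a simple warning "ratchet": accept the current count as a baseline and fail a build only when the count grows. This is a sketch under assumptions; the log format and the `: warning:` marker mirror GCC-style output, and a real project would adapt the parser to its own analyzer.

```python
def count_warnings(analyzer_log):
    """Count GCC-style warning lines; notes and errors are ignored here."""
    return sum(1 for line in analyzer_log.splitlines()
               if ": warning:" in line)

def check_ratchet(analyzer_log, baseline_count):
    """Return (ok, new_count): ok is False if warnings exceed the baseline."""
    n = count_warnings(analyzer_log)
    return n <= baseline_count, n

# Hypothetical analyzer output for illustration.
log = """\
main.c:10: warning: unused variable 'x'
main.c:42: warning: comparison between signed and unsigned
util.c:7: note: declared here
"""
ok, n = check_ratchet(log, baseline_count=5)
```

Lowering `baseline_count` to the new count whenever it drops gives the gradual tightening the text recommends without ever overwhelming the team.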
Whatever method is used, it is most important that the assumptions and formulas used be documented, to enable more thorough reviews and to make it easier to revise estimates at future dates when assumptions may need to be revised. A documented estimate is the first line of defense against arbitrary budget cuts. Small projects nominally use the analogy estimation method if the development organization has data to support it. The use of at least two estimates is considered a best practice for small and large software estimation activities alike.

No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects.

Small projects should consider the value of capturing and tracking these elements (each Center should define what constitutes a small project). The value and benefit need to be determined for the project and organization. If an organization only does small projects, then the value of collecting data may be important to future business and process improvement.

No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects.
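The two basic estimation methods named above, analogy and CER, reduce to small formulas, which also makes the "document your assumptions and formulas" advice easy to follow. The sketch below is illustrative only: the sizes and hours are hypothetical, and the default CER coefficients are COCOMO II-style nominal values that must be calibrated to your local environment before being trusted, especially on small tasks.

```python
def analogy_estimate(past_effort_hours, past_size_ksloc, new_size_ksloc):
    """Scale a completed similar task's actual effort by relative size."""
    return past_effort_hours * (new_size_ksloc / past_size_ksloc)

def cer_estimate(size_ksloc, a=2.94, b=1.0997):
    """Power-law CER: effort (person-months) = a * size^b.

    a and b are placeholder nominal coefficients; calibrate locally.
    """
    return a * size_ksloc ** b

# Best practice per the text above: produce at least two estimates
# and compare them. Inputs here are hypothetical.
by_analogy = analogy_estimate(past_effort_hours=400,
                              past_size_ksloc=5.0, new_size_ksloc=8.0)
by_cer = cer_estimate(8.0)
```

Recording the inputs (`past_effort_hours`, the size ratio, the coefficients) alongside the result is exactly the documentation that makes the estimate defensible and revisable later.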
No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. Software assurance applies to all projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. This requirement applies to all NASA centers and all software classifications. This requirement applies to all NASA centers and all software classifications. This requirement applies to all NASA centers and all software classifications. This requirement applies to all NASA centers and all software classifications. This requirement applies to all NASA centers and all software classifications. This requirement applies to all Class A, B, C, and D projects that have safety-critical software regardless of size. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects. No additional guidance is available for small projects.
1. Introduction
This is a working page for the conversion of SWE tab 4 content into a separate Topic, so that tab 4 ("Small Projects") can be repurposed into a "Software Engineering" tab for Project Activities.
1.1 Create list of current content of tab 4s in SWEs
2. Tab 4 content in SWEs
SWE Link and Tab 4 text (where present):
SWE-002 - Software Engineering Initiative
SWE-003 - Center Improvement Plans
SWE-004 - OCE Benchmarking
SWE-005 - Software Processes
SWE-006 - Center Software Inventory
SWE-013 - Software Plans
SWE-015 - Cost Estimation
SWE-016 - Software Schedule
SWE-017 - Project and Software Training
SWE-018 - Software Activities Review
SWE-020 - Software Classification
SWE-021 - Transition to a Higher Class
SWE-022 - Software Assurance
SWE-023 - Software Safety-Critical Requirements
SWE-024 - Plan Tracking
SWE-027 - Use of Commercial, Government, and Legacy Software
SWE-032 - CMMI Levels for Class A and B Software
SWE-033 - Acquisition vs. Development Assessment
SWE-034 - Acceptance Criteria
SWE-036 - Software Process Determination
SWE-037 - Software Milestones
SWE-039 - Software Supplier Insight
SWE-040 - Access to Software Products
SWE-042 - Source Code Electronic Access
SWE-045 - Project Participation in Audits
SWE-046 - Supplier Software Schedule
SWE-050 - Software Requirements: "Any project with resource limitations must establish the relative priorities of the requested features, use cases, or functional requirements. Prioritization helps the project management plan for staged releases, make trade-off decisions, and respond to requests for adding more functionality. It can also help you avoid the traumatic 'rapid de-scoping phase' late in the project when you start throwing features overboard to get a product out the door on time." 358
SWE-051 - Software Requirements Analysis
SWE-052 - Bidirectional Traceability: For small projects without access to a requirements tool that includes tracing features, and with time/budget limitations preventing them from acquiring a new tool and associated training, requirements tracing may be done with a spreadsheet (such as Microsoft® Excel), a simple database (such as Microsoft® Access), or a textual document. The project must be diligent about keeping such traces up to date.
These methods do not include automatic updates when requirements, design elements, or other relevant documents change.
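The spreadsheet-style trace described above can even be kept as plain CSV. The following is a minimal sketch, with hypothetical requirement and artifact IDs, of how a small project might store forward links and flag gaps; it is not a prescribed NASA format:

```python
# Minimal sketch of a spreadsheet-style requirements trace (hypothetical IDs).
# A real project would maintain this in Excel or Access; csv keeps the idea portable.
import csv
import io

# Forward trace: requirement -> design and test artifacts (illustrative data only)
trace = [
    {"req_id": "SRS-001", "design": "SDD-3.1", "test": "TP-010"},
    {"req_id": "SRS-002", "design": "SDD-3.2", "test": ""},  # gap: no test link yet
]

def untraced(rows, field):
    """Flag requirements missing a link in the given column."""
    return [r["req_id"] for r in rows if not r[field]]

# Write the matrix as CSV text (a file path would be used in practice)
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["req_id", "design", "test"])
writer.writeheader()
writer.writerows(trace)

print(buf.getvalue())
print("Requirements with no test trace:", untraced(trace, "test"))
```

A periodic run of a gap check like `untraced` is one inexpensive way to stay diligent about keeping the trace current, since nothing updates automatically.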
Value-based requirement tracing may be an option for projects with small budgets where traceability of safety-critical requirements is the priority. It prioritizes all of the requirements in the system, with the amount of time and effort expended tracing each requirement depending on that requirement's priority. This can save a significant amount of effort by focusing traceability activities on the most important requirements. However, value-based tracing requires a clear understanding of the importance of each requirement in the system; it may not be an option if full tracing is required by the customer or by the development process standards used for the project. 237
SWE-053 - Manage Requirements Changes
SWE-054 - Corrective Action for Inconsistencies
SWE-055 - Requirements Validation
SWE-057 - Software Architecture
SWE-058 - Detailed Design
SWE-060 - Coding Software
SWE-061 - Coding Standards
SWE-062 - Unit Test
SWE-063 - Release Version Description
SWE-065 - Test Plan, Procedures, Reports
SWE-066 - Perform Testing
SWE-068 - Evaluate Test Results
SWE-070 - Models, Simulations, Tools
SWE-071 - Update Test Plans and Procedures
SWE-073 - Platform or Hi-Fidelity Simulations
SWE-075 - Plan Operations, Maintenance, Retirement
SWE-077 - Deliver Software Products
SWE-079 - Develop CM Plan
SWE-080 - Track and Evaluate Changes
SWE-081 - Identify Software CM Items
SWE-082 - Authorizing Changes
SWE-083 - Status Accounting
SWE-084 - Configuration Audits
SWE-085 - Release Management
SWE-086 - Continuous Risk Management
SWE-087 - Software Peer Reviews and Inspections for Requirements, Plans, Design, Code, and Test Procedures
SWE-088 - Software Peer Reviews and Inspections - Checklist Criteria and Tracking
SWE-089 - Software Peer Reviews and Inspections - Basic Measurements
SWE-090 - Management and Technical Measurements
SWE-091 - Establish and Maintain Measurement Repository
SWE-092 - Using Measurement Data
SWE-093 - Analysis of Measurement Data
SWE-094 - Reporting of Measurement Analysis
SWE-095 - Report Engineering Discipline Status
SWE-098 - Agency Process Asset Library
SWE-100 - Software Training Funding
SWE-121 - Document Tailored Requirements
SWE-125 - Requirements Compliance Matrix
SWE-126 - Tailoring Considerations
SWE-129 - OCE NPR Appraisals
SWE-131 - Independent Verification and Validation Project Execution Plan
SWE-134 - Safety-Critical Software Design Requirements
SWE-135 - Static Analysis
SWE-136 - Software Tool Accreditation
SWE-139 - Shall Statements
SWE-140 - Comply with Requirements
SWE-141 - Software Independent Verification and Validation
SWE-142 - Software Cost Repositories
SWE-143 - Software Architecture Review
SWE-144 - Software Engineering Process Assets
SWE-146 - Auto-generated Source Code
SWE-147 - Specify Reusability Requirements
SWE-148 - Contribute to Agency Software Catalog
SWE-150 - Review Changes To Tailored Requirements
SWE-151 - Cost Estimate Conditions
SWE-152 - Review Requirements Mapping Matrices
SWE-153 - ETA Define Document Content
SWE-154 - Identify Security Risks
SWE-156 - Evaluate Systems for Security Risks
SWE-157 - Protect Against Unauthorized Access
SWE-159 - Verify and Validate Risk Mitigations
SWE-174 - Software Planning Parameters
SWE-176 - Software Records
SWE-178 - IV&V Artifacts
SWE-179 - IV&V Submitted Issues and Risks
SWE-184 - Software-related Constraints and Assumptions
SWE-185 - Secure Coding Standards Verification
SWE-186 - Unit Test Repeatability
SWE-187 - Control of Software Items
SWE-189 - Code Coverage Measurements
SWE-190 - Verify Code Coverage
SWE-191 - Software Regression Testing
SWE-192 - Software Hazardous Requirements
SWE-193 - Acceptance Testing for Affected System and Software Behavior
SWE-194 - Delivery Requirements Verification
SWE-195 - Software Maintenance Phase
SWE-196 - Software Retirement Archival
SWE-199 - Performance Measures
SWE-200 - Software Requirements Volatility Metrics
SWE-201 - Software Non-Conformances
SWE-202 - Software Severity Levels
SWE-203 - Mandatory Assessments for Non-Conformances
SWE-204 - Process Assessments
SWE-205 - Determination of Safety-Critical Software
SWE-206 - Auto-Generation Software Inputs
SWE-207 - Secure Coding Practices
SWE-208 - Advancing Software Assurance and Software Safety Practices
SWE-209 - Benchmarking Software Assurance and Software Safety Capabilities
SWE-210 - Detection of Adversarial Actions
SWE-211 - Test Levels of Non-Custom Developed Software
SWE-212 - NASA-STD-8739 Mapping Matrices
SWE-214 - Internal Software Sharing and Reuse
SWE-215 - Software License Rights
SWE-216 - Internal Software Sharing List
SWE-217 - List of All Contributors and Disclaimer Notice
SWE-218 - Contracting Officers
SWE-219 - Code Coverage for Safety Critical Software
SWE-220 - Cyclomatic Complexity for Safety-Critical Software
SWE-221 - OSMA NPR Appraisals
SWE-222 - Software Assurance Training
SWE-223 - Tailoring IV&V project selections
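The value-based requirement tracing described under SWE-052 above can be sketched as a simple priority-to-effort mapping. The priority scale, thresholds, and tracing depths below are illustrative assumptions, not levels prescribed by the NPR:

```python
# Hypothetical sketch of value-based requirement tracing: trace effort is
# concentrated on the highest-priority (e.g., safety-critical) requirements.
# Priority scale and thresholds are illustrative assumptions only.

FULL, PARTIAL, DEFER = "full trace", "link only", "defer"

def trace_depth(priority):
    """Map a requirement's priority (1 = safety-critical ... 4 = cosmetic)
    to the amount of tracing effort it receives."""
    if priority == 1:
        return FULL      # bidirectional trace to design, code, and test
    if priority <= 3:
        return PARTIAL   # single forward link, reviewed at milestones
    return DEFER         # traced only if time permits

# Illustrative requirement IDs with assumed priorities
requirements = {"SRS-001": 1, "SRS-002": 3, "SRS-003": 4}
plan = {rid: trace_depth(p) for rid, p in requirements.items()}
print(plan)
```

As the guidance notes, this only works when requirement priorities are well understood and full tracing is not mandated by the customer or process standards.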
3. Repurposing Tab 4 in SWEs
Web Resources