SCAMPI: Engineering Process Area Artifacts

Requirements Management | Requirements Development | Technical Solution | Product Integration | Verification | Validation

REQM - Requirements Management

Practice Area
Basic & Advanced Maturity
Artifacts (Direct / Indirect) | Considerations
Goal 1: Requirements are managed and inconsistencies with project plans and work products are identified
Practice 1.1 - Develop an understanding with the requirements providers on the meaning of the requirements.
  • An agreed-to set of (product and/or product component) requirements
    • Requirements documents in a mutually acceptable form and format (text, objects, data-flow diagrams, etc.)
  • Defined criteria for evaluation and acceptance of requirements.
  • Criteria for selecting appropriate requirements
  • Results of analyses against requirements criteria
  • Evidence of clarification reviews with requirements providers (e.g., analysis reports, minutes, clarifications, review logs, requirements updates) resulting in identified requirements issues
  • Action items issued to track resolution of requirements issues
  • Agreement on the requirements list by its providers.
 
Practice 1.2 - Obtain commitment to the requirements from the project participants.
  • Documented commitments to requirements and requirements changes
  • Requirements impact assessments.
  • Requirements change request logs, with recorded commitment (e.g., signature) and estimates of impact (e.g., functionality impact, interface impact, algorithm impact, design impact, testing impact, validation impact, etc.). Identification of the version where the change request will be introduced. (See also SP1.3-1.)
  • Requirements database reports, with attributes for review / commitment status.
  • Evidence of internal requirements reviews being held (e.g., minutes, checklists, logs, metrics, etc.) by key members of the project team (e.g., design authority, team leaders).
  • Communication of requirements to project stakeholders, and involvement in establishing commitment.
  • Ensure this is performed not only for the initial requirements set, but also for subsequent changes.
  • Verify that the impact on verification test and validation test has been considered
  • The intent of this practice includes consideration of impact upon project stakeholders prior to commitment to requirements (e.g., plans, estimates, schedules).
Practice 1.3 - Manage changes to the requirements as they evolve during the project.
  • Requirements change request logs, with recorded commitment (e.g., signature) and estimates of impact
  • Updated requirements change history with the rationale for the changes
  • Impact analysis for any requirements change request that includes input from all relevant stakeholders
  • Requirements status
  • Requirements database
  • Requirements decision database
    • Requirements reports with attributes indicating current state (e.g., approval, source, rationale, revision history, impact).
    • Change requests, notices, or proposals
    • Version control of baselined and documented requirements revisions
    • Evidence of requirements change reviews during which requirements changes are evaluated with relevant stakeholders, including impact assessment
    • Evidence of project plan updates (e.g., milestone update, resource update, cost-at-completion update) due to requirements change requests
    • Measures of requirements effectiveness (e.g., volatility)
    • Revisions to work products resulting from changed requirements.
Much of requirements management is change management. Tracking the status of project change requests is illuminating: how many of them are open and how many closed? How many requests were approved and how many rejected? How much effort was spent implementing each approved change? How long have the requests been open? Change requests that remain unresolved for a long time suggest a change management process that isn't very effective.
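The status questions above can be answered mechanically if change requests are kept as structured records. A minimal sketch in Python; the record fields and function names are illustrative, not taken from any particular requirements tool:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ChangeRequest:
    cr_id: str
    opened: date
    status: str                      # "open", "approved", "rejected", "closed"
    effort_hours: float = 0.0        # effort spent implementing, if approved
    closed_on: Optional[date] = None

def cr_metrics(requests: list, today: date) -> dict:
    """Answer the tracking questions: how many open/closed,
    approved/rejected, and how long have open requests aged?"""
    open_crs = [r for r in requests if r.status == "open"]
    return {
        "open": len(open_crs),
        "closed": sum(1 for r in requests if r.status == "closed"),
        "approved": sum(1 for r in requests if r.status == "approved"),
        "rejected": sum(1 for r in requests if r.status == "rejected"),
        "avg_age_open_days": (
            sum((today - r.opened).days for r in open_crs) / len(open_crs)
            if open_crs else 0.0
        ),
    }
```

A steadily growing average age of open requests is the quantitative form of the "unresolved for a long time" warning sign.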
Practice 1.4 - Maintain bi-directional traceability among the requirements and the project plans and work products.
  • Requirements traceability matrix
  • Reports or database indicating traceability of requirements to/from project plans and work products, at each applicable level of system decomposition.
  • Requirements tracking system
    • Criteria and completed checklists and minutes for review of requirements traceability
    • Requirements tracking logs
    • Revision and maintenance of requirements traceability across the lifecycle
    • Listings of allocated requirements included in reviews of project plans and work products across the lifecycle
    • Requirements mappings used to support impact assessments.
  • Ensure that both vertical and horizontal traceability are included (e.g., across functions and/or interfaces)
  • Assessing traceability of requirements to project plans is probably more implicit than explicit, and applies to plans such as test plans, V&V plans, etc.
  • See the Project Planning Practice Area for project plans that might be affected. The assessment team must reach consensus on how this is to be assessed for the organization. Consider: how does the project go forward if the requirements are not driving the project tasks and activities?
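Bi-directional traceability lends itself to mechanical gap checks in both directions: requirements with no downstream work product, and work products that trace back to no requirement. A hypothetical sketch (the matrix representation and names are assumptions, not from any specific tracking system):

```python
def traceability_gaps(req_to_items: dict, all_reqs: list, all_items: list):
    """Check both directions of a traceability matrix.

    req_to_items maps a requirement ID to the work products it
    traces to. Returns (requirements with no trace, orphan items)."""
    traced = {item for items in req_to_items.values() for item in items}
    untraced_reqs = sorted(r for r in all_reqs if not req_to_items.get(r))
    orphan_items = sorted(i for i in all_items if i not in traced)
    return untraced_reqs, orphan_items
```

Either list being non-empty is exactly the kind of inconsistency Practice 1.5 asks the project to surface.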
Practice 1.5 - Identify inconsistencies between the project plans and work products and the requirements.
  • Documentation of identified requirements inconsistencies including sources, conditions, rationales
  • Corrective Actions
  • Completed checklists, forms, logs, action items, or minutes substantiating reviews of requirements consistency with the project plans, activities or work products
  • Sources of inconsistencies are identified so that corrective action is initiated. This corrective action is tracked to completion and verified through a quality assurance function.


REQD - Requirements Development

Practice Area
Basic & Advanced Maturity
Artifacts (Direct / Indirect) | Considerations
Goal 1: Stakeholder needs, expectations, constraints, and interfaces are collected and translated into customer requirements.
Practice 1.1 - Identify, elicit, and collect stakeholder needs, expectations, constraints, and interfaces for all phases of the product life cycle.
  • Artifacts indicating stakeholder needs, expectations, and constraints that address the various product life-cycle activities have been consolidated and conflicts between major stakeholders have been resolved to produce the "customer" requirements
  • Results of requirements collection methods (e.g., interviews, prototypes, operational scenarios, market surveys, use cases, product domain analysis, reverse engineering)
  • Notes that indicate the stakeholders have agreed to the resolution of "conflicts" that surfaced during the gathering and consolidation of their needs, expectations, constraints, and possible operational concepts
Practice 1.2 - Transform stakeholder needs, expectations, constraints, and interfaces into customer requirements.
  • Customer Requirements
  • Customer constraints for verification process
  • Customer constraints for validation process
  • Test cases and expected results
  • Interface definitions
  • Constraints
  • Some form of hard or soft representation of the "customer" requirements that is under configuration management and is accessible to all relevant stakeholders
  • A test procedure is a set of detailed instructions for the setup, execution, and evaluation of results for a given test.
Goal 2: Customer requirements are refined and elaborated to develop product and product-component requirements
Practice 2.1 - Establish and maintain product and product-component requirements, which are based on the customer requirements
  • Derived requirements
  • Product requirements
  • Product component requirements
  • Analysis and rationale of cost performance tradeoffs of requirements and of lifecycle phases considering business objectives
  • Performance modeling results
  • Description and results of methods used to translate customer needs into technical parameters
  • Architecture requirements
  • Design requirements
  • Requirements traceability matrix
  • There should be an obvious difference between the "customer" requirements and the product and product-component requirements
Practice 2.2 - Allocate the requirements for each product component
  • Requirement allocation sheets
  • Provisional requirement allocations
  • Design constraints
  • Derived requirements
  • Relationships between derived requirements.
  • Indication of allocated requirements traceability
  • Include allocation of product performance, design constraints, and fit, form, and function to meet the requirements
  • Production guidelines
  • Performance requirements that had to be partitioned to two or more product components are handled as derived requirements
  • Clearly understand all functions to which the requirements were allocated (i.e., software, hardware, mechanics, manufacturing, electrical, plastic, glass, firmware)
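When a performance requirement is partitioned across two or more product components as derived requirements, the allocations should be checked against the parent. A minimal sketch, assuming a simple additive budget (e.g., a latency or mass budget; the function and field names are illustrative):

```python
def check_partition(parent_budget: float, derived_allocations: dict):
    """Verify that derived allocations of a partitioned performance
    requirement stay within the parent budget.

    Returns (within_budget, remaining_margin); a negative margin
    means the allocation sheet over-commits the parent requirement."""
    total = sum(derived_allocations.values())
    return total <= parent_budget, parent_budget - total
```

The remaining margin is itself useful evidence for the allocation sheets listed above.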

Classes of Requirements
  • User requirements list the tasks and goals of the user or consumer. They are intended to make the tool or product easier to use, faster, less error prone.
  • Business requirements list the goals of the business. At the highest level, these goals are to increase revenue, decrease costs, improve data management, increase knowledge transfer, improve efficiency, and so on.
  • Technical requirements are the hardware and software integration issues such as security, compatibility with existing systems, performance requirements, and so on.
  • Functional requirements answer the question, “how will we make the product or application?” Functional requirements are mostly about process.
Practice 2.3 - Identify interface requirements.
  • Interface requirements (external to the product and internal to the product)
  • Interfaces with product-related life-cycle processes such as test equipment, support systems, and manufacturing facilities.
  • If the interface requirements are not documented or captured as a separate artifact from the product and product-component requirements, the assessment team should press to understand the difficulty of updating the product and product-component requirements specifications and the traceable "customer" requirements whenever the interface descriptions have to be updated. Changes to the "customer" requirements require the involvement of the organizational-level Change Control Board.
Goal 3: The requirements are analyzed and validated, and a definition of required functionality is developed.
Practice 3.1 - Establish and maintain operational concepts and associated scenarios.
  • Operational concept
  • Product installation
  • Operational, maintenance and support concepts
  • Disposal concepts
  • Use cases
  • Timeline scenarios
  • Lower level detailed requirements
  • Revision histories
  • Conceptual solutions
  • Definition of environment in which the product will operate
  • Describe the path that was taken to develop the operational concept, whether a one-off or a customer-provided idea. Describe the participants in the development of the operational concept and associated operational scenarios.
Practice 3.2 - Establish and maintain a definition of required functionality
  • Functional architecture
    • Definition of functions
    • Logical Groupings
    • Association with requirements.
  • The definition of functions and their logical groupings should be established.
Practice 3.3 - Analyze requirements to ensure that they are necessary and sufficient.
  • Requirements defects reports
  • Key requirements
  • Refined requirements and new requirements.
  • Proposed requirements changes to resolve defects
  • Technical performance measures
  • Requirements traceability matrix or equivalent that shows the path from the lower level derived requirements to their higher level parent requirements
  • Key requirements that are documented and tracked because they have a strong influence on cost, schedule, functionality, risk, or performance
  • Focus on this practice to determine if the requirements are reviewed, who participated in the review (i.e., systems engineering, software engineering, manufacturing, mechanical engineering, electrical engineering, quality assurance, independent test, etc. ), what checklists are used and if the review at least is able to result in requirements that are complete, feasible, realizable, and verifiable.
  • Verification criteria should be captured for revised or new requirements before the requirements are re-baselined
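The last two bullets describe a gate: before requirements are re-baselined, each revised or new requirement needs verification criteria, and derived requirements need a trace to a parent. A hypothetical sketch of that gate, assuming requirements are kept as simple records (field names are illustrative):

```python
def rebaseline_blockers(requirements: list) -> dict:
    """Gate a re-baseline: collect, per requirement ID, the issues
    that must be resolved before the baseline can be updated."""
    problems = {}
    for req in requirements:
        issues = []
        if not req.get("verification_criteria"):
            issues.append("missing verification criteria")
        if req.get("derived") and not req.get("parent"):
            issues.append("missing parent trace")
        if issues:
            problems[req["id"]] = issues
    return problems
```

An empty result is the "necessary and sufficient" exit condition; anything else feeds the requirements defects reports listed above.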
Practice 3.4 - Analyze requirements to balance stakeholder needs and constraints
  • Assessment of risks related to requirements
    • Results of requirements analysis indicating impact on cost, schedule, performance, functionality, reusable components, quality factors such as maintainability and expandability, or risk.
  • Risk mitigation plan
  • Project Management plan
  • Probe deep enough to find out which methods were used to analyze the requirements. Models, simulations, and prototyping are useful but should be used effectively. This activity is often bypassed in order to get to market quickly.
Practice 3.5 - Validate requirements to ensure the resulting product will perform appropriately in its intended use environment.
  • Results of requirements validation.
  • Requirements traceability matrix
  • Requirements changes
  • Requirements specification
  • Results of techniques to demonstrate requirements functionality (e.g., prototypes, simulations, analyses, scenarios, and story boards)
  • Rationale of why a certain validation technique was used over other possible techniques and the interpretation of its effectiveness


TS - Technical Solution

Practice Area
Basic & Advanced Maturity
Artifacts (Direct / Indirect) | Considerations
Goal 1: Product or product-component solutions are selected from alternative solutions
Practice 1.1 - Develop alternative solutions and establish selection criteria.
  • Alternative solutions that span the acceptable range of cost, schedule, performance, and quality
  • Selection criteria for final selection, which may include:
    • Technical performance
    • Complexity of the product component
    • Product expansion and growth
    • Sensitivity to construction methods and materials
    • Capabilities and limitations of end users
  • Evaluations of new technologies
  • Evaluation of solutions and technologies (new or legacy)
  • Design issues.
  • A process or processes for identifying solution alternatives, selection criteria, and design issues
  • COTS evaluations
  • Determine how design issues should be ranked; any necessary pre-determined activity must take place.
  • Design criteria should provide clear discrimination and an indication of success in arriving at a balanced solution across the life of the product. They typically include measures of cost, schedule, performance, and risk.
  • Alternative solutions need to be identified and analyzed to enable the selection of a balanced solution across the life of the product in terms of cost, schedule, and technical performance.
  • Selection of the best solution establishes the requirements provisionally allocated to that solution as the set of allocated requirements.
  • The circumstances in which it would not be useful to examine alternative solutions are infrequent in new developments. However, developments of precedented product components are candidates for not examining, or only minimally examining, alternative solutions.
  • Details of the alternative solutions provide more accurate and comprehensive information about the solution than nondetailed alternatives. For example, characterization of performance based on design content rather than on simple estimating enables effective assessment and understanding of environment and operating concept impacts.
  • Reference the Allocate Product-Component Requirements specific practice in the Requirements Development process area for more information about obtaining provisional allocations of requirements to solution alternatives for the product components
  • Reference the Decision Analysis and Resolution process area for more information about establishing selection criteria and identifying alternatives
  • Reference the Requirements Management process area for more information about managing the provisional and established allocated requirements.
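The selection criteria above (cost, schedule, performance, risk, and so on) are typically applied through a weighted decision matrix, as in the Decision Analysis and Resolution process area. A minimal sketch of one such matrix; the criteria names, weights, and scores are invented for illustration:

```python
def rank_alternatives(weights: dict, scores: dict):
    """Weighted-sum scoring of alternative solutions against the
    selection criteria; returns (alternative, total) pairs, best first.

    weights maps criterion -> weight; scores maps alternative ->
    {criterion -> raw score}."""
    totals = {
        alt: sum(weights[crit] * val for crit, val in crit_scores.items())
        for alt, crit_scores in scores.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
```

The matrix itself, plus the rationale for each weight, is exactly the kind of direct artifact an appraisal team looks for under this practice.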
Practice 1.2 - Evolve the operational concept, scenarios, and environments to describe the conditions, operating modes, and operating states specific to each product component.
  • Product-component operational concepts, scenarios, and environments for all product-related life-cycle processes
  • Timeline analyses of product-component interactions
  • Use cases
  • Detail on the operational concepts and scenarios
  • Operational concepts and scenarios document the interaction of the product components with the environment, users, and other product components, regardless of engineering discipline. They should be documented for operations, product deployment, delivery, support (including maintenance and sustainment), training, and disposal and for all modes and states.
  • The environment of any given product component will be influenced by other product components as well as the external environment.
Practice 1.3 - Select the product component solutions that best satisfy the criteria established.
  • Product component selection decisions and rationale
  • Documented relationships between requirements and product components.
  • Alternative solutions under consideration and selection criteria (see Practice 1.1)
    • Operating concepts, modes, and states (see Practice 1.2)
    • Technical memos
    • Requirements allocated to product components
    • Resolution of issues for selection of best alternative solution using the functional requirements as a parameter
    • Documentation of selected solutions using the allocated requirements and selected product components
    • Processes and procedures for selection of product component solutions
    • Product component solutions that will be reused or acquired.
  • Product component solutions should be selected using the criteria established in Practice 1.1
  • There should be documented decisions and rationale, according to the selection criteria.
  • Reference the Requirements Development Practice Area for information on establishing allocated requirements and interface requirements.
  • Reference the Decision Analysis and Resolution Practice Area for more information about structured decision making
  • The descriptions of the solutions and rationale for selection are documented in an initial technical data package. The technical data package evolves throughout development…
  • Rationale for selection decisions should be maintained to support downstream decision making
Goal 2: Develop Product or product-component designs.
Practice 2.1 - Develop a design for the product or product component.
  • Product architecture
  • Product capabilities
    • Product partitions
    • Product-component identifications
    • Systems states
    • Major intercomponent interfaces
    • External product interfaces
  • Product component detailed designs
  • Structural elements
  • Coordination mechanisms
  • Standards and design rules that govern development of product components and their interfaces
  • Fully characterized interfaces
  • Product components completely defined
  • Updated traceability matrix
  • Design criteria against which the design can be evaluated
  • COTS components that must be taken into account and that might modify the requirements
  • Implementation of this practice should include not only the standards for establishing and documenting a design, but also evidence that these standards are followed (e.g., completed review documentation or checklists)
  • Look for sufficient detail in product or product component designs to support life-cycle content (e.g., implementation, modification, reprocurement, maintenance, sustainment, installation)
  • Reference any models of software design methods (prototyping, structural models, Object Oriented Design, patterns, etc.), standards (user interface, safety, production, etc.), design attributes and criteria (modularity, maintainability, performance, etc.)
  • The design methods used may vary for different portions of the product component design.
  • Criteria are maintained through a process against which their effectiveness is measured.
Practice 2.2 - Establish and maintain a technical data package.
  • Technical data package.
  • Drawings
  • Specifications
  • Design descriptions
  • Design databases
  • Performance requirements
  • Quality assurance provisions
  • Packaging details
  • Different views that were captured to help organize data defining design descriptions
  • A technical data package provides the developer with a comprehensive description of the product or product component as it is developed. It may include:
    • product architecture description
    • allocated requirements
    • product-component descriptions
    • product-related life-cycle process descriptions if not described as separate product components
    • key product characteristics
    • required physical characteristics and constraints
    • interface requirements
    • materials requirements (bills of material and material characteristics)
    • fabrication and manufacturing requirements (for both the original equipment manufacturer and field support)
    • the verification criteria used to ensure requirements have been achieved
    • conditions of use (environments) and operating/usage scenarios, modes and states for operations, support, training, manufacturing, disposal, and verifications throughout the life of the product
    • rationale for decisions and characteristics (requirements, requirement allocations; design choices)
  • Determining the number of levels of product components that require documentation and requirements traceability is important to manage documentation costs and to support integration and verification plans
Practice 2.3 - Establish and maintain the solution for product component interfaces.
  • Interface design
  • Interface design documents
    • Revision history and descriptions of changes incorporated to controlled interfaces
  • Interface requirements -internal and external to the product
  • For a comprehensive design, include:
    • Origination
    • Destination
    • Stimulus and data characteristics for software
    • Electrical, mechanical, and functional characteristics of the hardware
  • Interfaces between product components and the product-related lifecycles
  • Interface control and design documents
  • Interface specification criteria, templates, and checklists used by design team
  • The criteria for interfaces frequently reflect a comprehensive list of critical parameters that must be defined, or at least investigated, to ascertain their applicability. These parameters are often peculiar to a given type of product (e.g., software, mechanical, electrical) and are often associated with safety, security, durability, and mission-critical characteristics.
  • Reference the Organizational Process Definition process area for more information about establishing and maintaining organizational process assets.
Practice 2.4 - Evaluate whether the product components should be developed, purchased, or reused based on established criteria.
  • Criteria for design and component reuse
  • Make or buy analyses including the factors that were taken into consideration
    • Functions the products or services will provide
    • Available project resources and skills
    • Costs of acquiring versus developing internally
    • Strategic business alliances
    • Market research of available products
    • Functionality and quality of available products
    • Skills and capabilities of potential suppliers
    • Product availability.
  • Supplier agreements.
  • Reuse component libraries, guidance, and criteria for reuse of non-developmental items (NDI).
  • Evaluation criteria, rationale, and reports for make-buy analyses and product component selection.
  • Plans for maintenance, support, and transition of COTS/NDI components.
  • Product acceptance criteria.
  • Product operational, maintenance and support concepts
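The cost dimension of a make/buy analysis can be reduced to a total-cost-of-ownership comparison; the qualitative factors listed above (skills, alliances, supplier capability) still need separate review. A sketch under that assumption, with invented parameter names:

```python
def make_or_buy_cost(dev_cost: float, purchase_cost: float,
                     integration_cost: float, annual_license: float,
                     support_years: int):
    """Compare in-house development cost against the total cost of
    buying: purchase price + integration + licenses over the
    support period. Returns (decision, buy_total)."""
    buy_total = purchase_cost + integration_cost + annual_license * support_years
    decision = "make" if dev_cost < buy_total else "buy"
    return decision, buy_total
```

Recording the inputs alongside the decision produces the "evaluation criteria, rationale, and reports" artifact this practice asks for.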
Goal 3: Product components, and associated support documentation, are implemented from their designs
Practice 3.1 - Implement the designs of the product components
  • Product component implementation and support data (e.g., source code, documented data and services, fabricated parts, deployed manufacturing processes, facilities, materials).
  • Product component construction methods (e.g. coding, fabrication).
  • Standards, criteria, and checklists for constructed product components.
  • Results of peer reviews, inspections, or verifications performed on constructed components.
  • Unit test plans, procedures, results, and acceptance criteria.
  • Configuration and change control data for revision to product components.
  • Methods to implement the product components are documented, either directly or by reference, in the project's defined process.
  • Reference the Verification Practice Area for more information on peer reviews performed on product components.
  • Look for evidence of satisfying unit test criteria (e.g., test coverage (statement coverage, branch coverage, path coverage, etc.), bounds).
  • Ensure peer reviews are performed on selected product components.
Practice 3.2 - Develop and maintain the end-use documentation.
  • End-user training materials
  • User's manual
  • Operator's manual
  • Maintenance manual
    • Documentation for installation, operation, use, maintenance and support of product components.
    • Revision history and maintenance of product documentation.
  • Installation Manual/Build Book
  • On-line help
  • Documentation processes, standards, criteria, and checklists.
  • Help desk support.
  • Artifacts related to peer review of applicable documentation.
  • Site installation, training, and maintenance records
  • Documentation methods are documented, either directly or by reference, in the project's defined process.
  • Look for revision of affected documentation upon changes to requirements, design, implementation.


PI - Product Integration

Practice Area
Basic & Advanced Maturity
Artifacts (Direct / Indirect) | Considerations
Goal 1: Preparation for product integration is conducted.
Practice 1.1 - Determine the product-component integration sequence.
  • Product integration sequence and plan
  • Rationale for selecting or rejecting integration sequences
    • List of components to be integrated
    • Integration schedules and dependencies
    • Meetings or presentations at which the plans for product integration are reviewed
  • It may be useful to document this information in project plans, e.g. product integration plan, or system integration and test plan
  • The product components that are integrated may include those that are a part of the product to be delivered along with test equipment, test software, or other integration items such as fixtures.
  • Reference the Technical Solution Practice Area for information about design/development of product components and defining interfaces.
  • Reference the Decision Analysis and Resolution process area for more information about using a formal evaluation process to select the appropriate product integration sequence
  • Reference the Risk Management process area for more information about identifying and handling risks associated with the integration sequence
  • Reference the Supplier Agreement Management process area for more information about transitioning acquired product components and the need for handling those product components in the product integration sequence
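An integration sequence is constrained by component dependencies, so a candidate ordering can be derived with a topological sort. A sketch using Python's standard library (3.9+); the component names and dependency table are invented:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

def integration_sequence(depends_on: dict) -> list:
    """Derive a candidate integration order from component
    dependencies: depends_on[c] lists the components that must be
    integrated (and available) before c. Raises CycleError if the
    dependency graph is circular."""
    return list(TopologicalSorter(depends_on).static_order())
```

The sort gives only a feasible order; the rationale for choosing among feasible orders (risk, test equipment availability, supplier deliveries) is what the "rationale for selecting or rejecting integration sequences" artifact should capture.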
Practice 1.2 - Establish and maintain the environment needed to support the integration of the product components.
  • Descriptions or configuration of the verified environment for product integration, revised and maintained throughout the project
  • Support documentation for the product integration environment
    • Product integration plan
    • Product integration test bed (e.g., test equipment, simulators, HW equipment, recording devices)
  • It may be useful to think in terms of an integration test bed, test harnesses, simulators, etc., when considering this practice.
  • A demonstration of the integration environment might be used as a source of evidence.
  • The product integration plan, or equivalent, should document the planned integration environment. Resources within the environment may be acquired, developed, or reused.
  • For unprecedented, complex projects, the product integration environment can be a major development. As such, it would involve project planning, requirements development, technical solutions, verification, validation, and risk management.
  • Reference the Supplier Agreement Management process area for more information about acquiring parts of the integration environment.
Practice 1.3 - Establish and maintain procedures and criteria for integration of the product components.
  • Product integration procedures
  • Product integration criteria
    • Revision history of integration procedures and criteria, maintained throughout the project.
  • Criteria and checklists for product-component readiness, integration, and evaluation
  • Criteria and checklists for validation, and delivery of the integrated product
  • Product integration inputs, outputs, expected results, and progress criteria
  • Incremental build/integration plan and procedures
  • Reviews or presentations of integration plans, procedures, and criteria
  • Test readiness reviews
  • Procedures for the integration of the product components can include such things as the number of incremental iterations to be performed and details of the expected tests and other evaluations to be carried out at each stage
  • Criteria can indicate the readiness of a product component for integration or its acceptability
  • Procedures and criteria for product integration address:
    • Level of testing for build components
    • Verification of interfaces
    • Thresholds of performance deviation
    • Derived requirements for the assembly and its external interfaces
    • Allowable substitutions of components
    • Testing environment parameters
    • Limits on cost of testing
    • Quality/cost tradeoffs for integration operations
    • Probability of proper functioning
    • Delivery rate and its variation
    • Lead time from order to delivery
    • Personnel availability
    • Availability of the integration facility/line/environment.
  • Criteria can be defined for
    • how the product components are to be verified and the functions they are expected to have
    • how the assembled product components and final integrated product are to be validated and delivered
  • Criteria may also constrain the degree of simulation permitted for a product component to pass a test, or may constrain the environment to be used for the integration test.
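Readiness criteria like those above are most useful when they can be evaluated the same way for every component. A minimal sketch of such an evaluation, assuming criteria are expressed as named predicates over a component status record (the names and record fields are illustrative):

```python
def ready_for_integration(component: dict, criteria: dict):
    """Evaluate readiness criteria (named predicate functions)
    against a component's status record.

    Returns (ready, failed) where failed lists the names of the
    criteria the component did not meet."""
    failed = [name for name, check in criteria.items()
              if not check(component)]
    return (not failed), failed
```

The completed checklist (pass/fail per criterion) is the direct artifact; the criteria definitions themselves are the procedures this practice maintains.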
Goal 2: Ensure compatibility of product-component interfaces, both internal and external.
Practice 2.1 - Review interface descriptions for coverage and completeness.
  • Reviewed interface descriptions
  • Interface Categories
  • List of interfaces per category
  • Mapping of the interfaces to the product components and product integration environment
    • Interface specifications, control documents (ICDs), connection markings, design documents (IDDs)
    • Criteria and checklists for interface reviews
    • Result of interface reviews
    • Traceability matrices between requirements and interfaces
  • The interfaces should include, in addition to product-component interfaces, all the interfaces with the product integration environment
  • There should be no deviation between the existing interface descriptions and the products being developed, processed, produced, or bought.
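An interface review of the kind described above can be supported mechanically once a traceability matrix exists. The sketch below is purely illustrative (the requirement and ICD names are invented, and nothing here is prescribed by the model): it flags requirements with no interface behind them, and interfaces with no requirement behind them.

```python
# Hypothetical traceability data: interface requirements and the interface
# descriptions (ICDs) that claim to satisfy them.
requirements = {"IR-1": "Bus voltage levels",
                "IR-2": "Message framing",
                "IR-3": "Connector pinout"}
interfaces = {"ICD-PWR": ["IR-1"], "ICD-DATA": ["IR-2"], "ICD-MECH": []}

def trace_gaps(requirements, interfaces):
    """Return (untraced requirements, interfaces with no requirement)."""
    covered = {r for reqs in interfaces.values() for r in reqs}
    untraced_reqs = sorted(set(requirements) - covered)
    orphan_ifaces = sorted(i for i, reqs in interfaces.items() if not reqs)
    return untraced_reqs, orphan_ifaces

untraced, orphans = trace_gaps(requirements, interfaces)
```

Both output lists would feed the "results of interface reviews" and any resulting action items.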
Practice 2.2 - Manage internal and external interface definitions, designs, and changes for products and product components
  • List of agreed-to interfaces defined for each pair of product components
  • Updated interface description or agreement
    • Interface descriptions and relationships among product components
    • Interface specifications, Interface control documents (ICDs), Interface design documents (IDDs)
  • Table of relationships between the product components and the external environment (e.g., main power supply, fastening product, computer bus system)
  • Table of relationships between the different product components
  • Reports from the interface control working group meetings
  • Action items for updating interfaces
  • Application Program Interface (API)
    • Result of interface reviews (e.g., peer reviews, quality assurance inspections, design reviews, interface control working groups, CCBs, action items to resolve interface issues)
    • Repository of interface data (e.g. interface data base).
    • Change requests for revision to interfaces.
  • Interface requirements drive the development of the interfaces necessary to integrate product components. Managing product and product-component interfaces starts very early in the development of the product. The definitions and designs for interfaces affect not only the product components and external systems, but can also affect the verification and validation environments
  • Management of the interfaces includes maintenance of the consistency of the interfaces throughout the life of the product, and resolution of conflict, noncompliance, and change issues
  • Reference the Requirements Development process area for more information about requirements for interfaces
  • Reference the Technical Solution process area for more information about design of interfaces between product components.
  • Reference the Requirements Management process area for more information about managing the changes to the interface requirements
  • Reference the Configuration Management process area for more information about distributing changes to the interface descriptions (specifications), so that everyone can know the current state of the interfaces
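The "repository of interface data" mentioned above can be as simple as versioned records with a change log. The following is a minimal sketch under assumed conventions; the record fields, names, and the `apply_change` helper are illustrative, not part of the model.

```python
from dataclasses import dataclass, field

# Hypothetical minimal interface-repository record: each approved change
# request is logged against the version it superseded.
@dataclass
class InterfaceRecord:
    name: str
    version: int = 1
    description: str = ""
    change_log: list = field(default_factory=list)

    def apply_change(self, request_id, new_description):
        """Record an approved change request and bump the version."""
        self.change_log.append((request_id, self.version, new_description))
        self.version += 1
        self.description = new_description

rec = InterfaceRecord("ICD-DATA", description="CAN bus, 500 kbit/s")
rec.apply_change("CR-042", "CAN bus, 1 Mbit/s")
```

Keeping the old version number in each log entry is what lets everyone "know the current state of the interfaces" while preserving the history a CCB would audit.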
Goal 3: Verified product components are assembled and the integrated, verified, and validated product is delivered.
Practice 3.1 - Confirm, prior to assembly, that each product component required to assemble the product has been properly identified, functions according to its description, and that the product component interfaces comply with the interface descriptions.
  • Acceptance documents for received product components
    • Verified acceptance test results or inspection report for product components
    • Discrepancies identified in received product components.
  • Delivery receipts
  • Checked packing lists
  • Exception reports
  • Waivers
    • Configuration status reports for product components
    • Product integration plans and procedures
    • Criteria and checklists for product component readiness, delivery, integration, and evaluation.
  • The purpose of this specific practice is to ensure that the properly identified product component that meets its description can actually be assembled according to the product integration sequence and available procedures
  • Readiness is determined relative to the integration plans and procedures described in Practices 1.1 through 1.3
  • Only qualified components should be accepted for integration; see the Technical Solution and Verification process areas for details on verifying individual product components.
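The readiness determination in this practice amounts to gating each component against the documented criteria before assembly. A hedged sketch, with invented criteria names and component attributes (the model does not prescribe any of these):

```python
# Hypothetical readiness gate: a component is accepted for integration only
# when every readiness criterion from the integration procedures passes.
def ready_for_assembly(component, criteria):
    """Return (ok, failures): ok is True only if every criterion passes."""
    failures = [name for name, check in criteria.items() if not check(component)]
    return (not failures), failures

component = {"id": "PC-7", "accepted": True,
             "interfaces_verified": True, "config_status": "baselined"}
criteria = {
    "identified":          lambda c: bool(c.get("id")),
    "acceptance_passed":   lambda c: c.get("accepted", False),
    "interfaces_comply":   lambda c: c.get("interfaces_verified", False),
    "under_configuration": lambda c: c.get("config_status") == "baselined",
}
ok, failures = ready_for_assembly(component, criteria)
```

The `failures` list maps naturally onto the exception reports and discrepancy records listed as artifacts for this practice.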
Practice 3.2 - Assemble product components according to the product integration sequence and available procedures.
  • Assembled product or product components
  • Product integration sequence (Practice 1.1)
  • Product integration procedures and criteria (Practice 1.3)
  • Records indicating performance of the product integration sequence and procedures (e.g., integration reports, completed checklists, configuration audits)
  • Recorded configuration and assembly information (e.g., identification, configuration status, calibration data).
  • Integration status and schedule reports (e.g., planned vs. actual components integrated)
  • Revisions to the integration plans or procedures.
  • The assembly activities of this specific practice (and the evaluation activities of the next specific practice) are conducted iteratively, from the initial product components, through the interim assemblies of product components, to the product as a whole.
Practice 3.3 - Evaluate assembled product components for interface compatibility.
  • Exception reports
  • Interface evaluation reports
  • Product integration summary reports
    • Discrepancies detected during checkout of product components
      • Milestones for completion of integration activities
  • Evaluation results
  • Logbook of product component issues or parameters.
  • Product integration sequence (Practice 1.1)
  • Product integration procedures and criteria (Practice 1.3)
  • Regression testing procedures and results
  • This evaluation involves examining and testing assembled product components for performance, suitability, or readiness using the available procedures and environment.
  • Beware of interpreting this practice too narrowly and focusing simply on interfaces. Consider the ability of the integrated components to cooperatively satisfy their intended purpose (functionality, performance, etc.). Interface compatibility is a key part of this, but compatibility may be determined explicitly or implicitly.
  • It may be useful to think of this practice in terms of a "checkout"
  • Reference the Verification and Validation process areas for more information on verifying and validating the assembled product components.
  • The assembly and evaluation of product components is often performed together, and it may be difficult to objectively distinguish these as discrete activities.
Practice 3.4 - Package the assembled product or product component and deliver it to the appropriate customer.
  • Packaged product or product components
  • Delivery documentation
  • Packaging procedures
  • Transportation and delivery procedures
  • Packing list
  • Certification for readiness of the operation site
  • Site installation surveys and procedures
  • Reference the Verification and Validation process areas for more information on verifying and validating the assembled product before packaging, or upon deployment at the operational site.
  • Consider site installation and checkout in accordance with this practice, where relevant, not just delivery.


VER - Verification

Practice Area
Basic & Advanced Maturity
Artifacts / Considerations
Direct / Indirect
Goal 1: Preparation for verification is conducted.
Practice 1.1 - Select the work products to be verified and the verification methods that will be used for each.
  • Lists of work products selected for verification
  • Verification methods for each selected work product
  • Requirements verification matrix with traceability to work products
  • Verification cross reference matrix
  • Verification plan
  • Re-verification approach (i.e., regression testing)
  • Peer review plans
  • The work products to be verified may include those associated with maintenance, training, and support services.
  • The requirements to be verified for each work product should be identified along with the verification methods, which should address both the overall technical approach to work product verification and the specific approaches used to verify that each work product meets its requirements
  • Selection of the verification methods typically begins with involvement in the definition of product and product-component requirements to ensure that these requirements are verifiable
  • Re-verification should be addressed by the verification methods to ensure that rework performed on work products did not cause unintended defects.
  • Reference the Maintain Bidirectional Traceability of Requirements specific practice in the Requirements Management process area to help identify the requirements for each work product.
Practice 1.2 - Establish and maintain the environment needed to support verification.
  • Verification environment
  • Requirements for the verification environment
  • Definition of verification support equipment and tools
  • Acquisition plan for verification environment components (e.g., COTS, reuse of existing assets, custom developed tools)
  • Plans or reports tracking availability of verification environment components
  • An environment must be established to enable verification to take place. The verification environment may be acquired, developed, reused, modified, or a combination of these, depending on the needs of the project
  • The type of environment required will depend on the work products selected for verification and the verification methods used. A peer review may require little more than a package of materials, reviewers, and a room. A product test may require simulators, emulators, scenario generators, data reduction tools, environmental controls, and interfaces with other systems.
Practice 1.3 - 3 Establish and maintain verification procedures and criteria for the selected work products.
  • Verification procedures
  • Verification criteria
  • Expected results and tolerances identified
  • Equipment and environmental components identified
  • Examples of sources for verification criteria:
    • Product and product-component requirements
    • Standards
    • Organizational policies
    • Test type
    • Test parameters
    • Parameters for tradeoff between quality and cost of testing
    • Type of work products.
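Where criteria include "expected results and tolerances identified", the check itself can be mechanical. An illustrative sketch; the measurement names, expected values, and limits are invented for the example, not drawn from the model:

```python
# Hypothetical verification criteria: expected value plus allowed deviation
# for each measured quantity.
def within_tolerance(measured, expected, tolerance):
    """True if the measurement deviates from expected by at most tolerance."""
    return abs(measured - expected) <= tolerance

criteria = {
    "supply_voltage_v": (5.0, 0.25),   # (expected, allowed deviation)
    "response_time_ms": (20.0, 5.0),
}
measurements = {"supply_voltage_v": 5.1, "response_time_ms": 27.0}

results = {name: within_tolerance(measurements[name], exp, tol)
           for name, (exp, tol) in criteria.items()}
```

Any `False` entry would become a discrepancy in the verification results analyzed under Goal 3.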
Goal 2: Peer reviews are performed on selected work products
Practice 2.1 - Prepare for peer reviews of selected work products.
  • Peer review schedule
  • Selected work products to be reviewed
    • Peer review plans, processes, and schedules.
  • Peer review checklist
  • Entry and exit criteria for work products
  • Criteria for requiring another peer review
  • Peer review training material
    • Description of method chosen for the peer review such as inspections, walkthroughs, etc.
    • Peer review data package
    • Peer review preparation metrics.
  • Types of peer reviews include inspections, structured walkthroughs, and active reviews
  • Examples of items addressed by the checklists include rules of construction, design guidelines, completeness, correctness, maintainability and common defect types
  • Reference the Measurement and Analysis process area for information on identifying and collecting data.
Practice 2.2 - Conduct peer reviews on selected work products to identify issues resulting from the peer reviews.
  • Peer review results
  • Peer review issues
  • Peer review data
    • Identified defects
    • Action items for corrective action
    • Data summarizing the conduct and results of the peer review.
  • Schedules showing peer reviews and re-review
  • Peer review data repository
  • Completed peer review checklists.
  • Peer reviews are performed incrementally, as work products are being developed.
  • Peer reviews should be clearly focused on the work product under review, not on the person who produced it.
  • When issues arise during the peer review, they should be forwarded to the primary developer of the work product for correction
  • Reference the Project Monitoring and Control process area for information about tracking issues that arise during a peer review.
Practice 2.3 - 2 Analyze data about preparation, conduct, and results of the peer reviews.
  • Peer review data
    • Data recorded to reflect the conduct of the review (preparation, conduct and results)
    • Documented peer review analysis results.
  • Peer review action items
    • Peer review data repository
    • List of action items produced during peer reviews.
  • Typical data are product name, product size, composition of the peer review team, type of peer review, preparation time per reviewer, length of the review meeting, number of defects found, type and origin of defect, etc. Additional information on the work product being peer reviewed may be collected, such as size, development stage, operating modes examined, and requirements being evaluated.
  • Reference the Project Monitoring and Control process area for information about tracking issues that arise during a peer review.
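The typical data listed above support simple derived metrics such as defect density and preparation rate. A sketch with made-up review records (the field names and numbers are illustrative only):

```python
# Hypothetical peer review records: product size, preparation effort, and
# defects found, as collected during the reviews.
reviews = [
    {"product": "design.doc", "size_pages": 40, "prep_hours": 6.0, "defects": 12},
    {"product": "parser.c",   "size_pages": 15, "prep_hours": 2.0, "defects": 9},
]

def review_metrics(r):
    """Derive per-review metrics from the raw peer review data."""
    return {
        "defect_density": r["defects"] / r["size_pages"],   # defects per page
        "prep_rate": r["size_pages"] / r["prep_hours"],     # pages per hour
    }

metrics = {r["product"]: review_metrics(r) for r in reviews}
```

Trends in such metrics (e.g., unusually low preparation rates paired with low defect counts) are the kind of analysis result this practice expects to be documented.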
Goal 3: Selected work products are verified against their specified requirements
Practice 3.1 - Perform verification on the selected work products.
  • Verification results
    • Test results
    • Peer review results
  • Verification reports (trouble reports, corrective action reports, presentations, etc.)
  • As-run procedures log.
  • Demonstrations
    • Action items identified
    • Requirements verification list.
  • This is a Capability level 1 practice. Verification processes at capability level 1 or 2 may not include procedures and criteria, which are created in the Establish Verification Procedures and Criteria specific practice at capability level 3 (Defined Process). When there are no procedures or criteria established, the methods established by the Select Work Products for Verification specific practice can be used to accomplish capability level 1 performance.
Practice 3.2 - 2 Analyze the results of all verification activities and identify corrective action.
  • Analysis reports
  • Identified corrective actions to verification methods, criteria, and/or infrastructure
  • Trouble reports
  • Method, criteria, and infrastructure change requests.
  • For each work product, all available verification results are incrementally analyzed and corrective actions are initiated to ensure that the requirements have been met.
  • Since a peer review is one of several verification methods, peer review data should be included in this analysis activity to ensure that the verification results are analyzed sufficiently.
  • Analysis reports or “as-run” method documentation may also indicate that bad verification results are due to method problems, criteria problems, or a verification environment problem.
  • Reference the corrective action practices of Project Monitoring and Control process area for more information on implementing corrective action.


VAL - Validation

Practice Area
Basic & Advanced Maturity
Artifacts / Considerations
Direct / Indirect
Goal 1: Preparation for validation is conducted.
Practice 1.1 - Select products and product components to be validated and the validation methods that will be used for each.
  • Lists of products and product components selected for validation
  • Validation methods for each product or product component.
  • Requirements for performing validation for each product or product component
  • Validation constraints for each product or product component
    • Evaluation criteria defined
    • Stakeholder reviews of validation methods
    • Validation plans and procedures
  • For each product component, the scope of the validation (e.g., operational behavior, maintenance, training, and user interface) should be determined
  • Validation methods should be selected early in the life of the project so they are clearly understood and agreed to by the relevant stakeholders
  • The product or product component must be maintainable and supportable in its intended operational environment.
Practice 1.2 - Establish and maintain the environment needed to support validation.
  • Validation environment
  • Validation plans and equipment and tools
  • Resource plan, including reuse of existing resources
  • Examples of the type of elements in a validation environment:
    • Test tools interfaced with the product being validated
    • Temporary embedded test software
    • Recording tools for dumps, further analysis, and replay
    • Simulated subsystems or components
    • Simulated interfaced systems
    • Real interfaced systems
    • Facilities and customer-supplied products
    • The skilled people to operate or use all the above elements
    • Dedicated computing or network test environment
Practice 1.3 - Establish and maintain procedures and criteria for validation.
  • Validation procedures
  • Validation criteria
  • Test and evaluation procedures for maintenance, training, and support.
    • Product requirements mapping to validation procedures and methods
    • Documented environment, operational scenario, inputs, outputs and evaluation criteria
    • Reviews of validation procedures and criteria.
  • Examples of sources for validation criteria:
    • Product and product-component requirements
    • Standards
    • Customer acceptance criteria
    • Environmental performance
    • Thresholds of performance deviation.
Goal 2: The product or product-components are validated to ensure that they are suitable for use in their intended operating environment.
Practice 2.1 - Perform validation on the selected products and product components.
  • Validation reports
  • Validation results
  • As-run procedures log
  • Validation cross-reference matrix
  • Operational demonstrations
    • Data collected from performing validation procedures
    • List of deviations encountered during execution of validation procedures.
  • To be acceptable to users, a product or product component must perform as expected in its intended operational environment
  • The as-run validation procedures should be documented and the deviations occurring during the execution should be noted.
  • This is a capability level 1 specific practice. Validation processes at capability level 1 or 2 may not include procedures and criteria, which are created in the Establish Validation Procedures and Criteria specific practice at capability level 3. When there are no procedures or criteria established, the methods established by the Select Products for Validation specific practice can be used to accomplish capability level 1 performance.
Practice 2.2 - Capture and analyze the results of the validation activities and identify issues.
  • Validation deficiency reports, issues, reports, and results
  • Procedure change request
    • Validation evaluation criteria
    • Comparison of actual vs. expected results (e.g., measurements and performance data)
    • Minutes of reviews of validation results
  • Analysis reports or as-run validation documentation may also indicate that bad test results are due to a validation procedure problem or a validation environment problem.
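The "comparison of actual vs. expected results" above is the core of this analysis and is easy to sketch. A hypothetical example that collects deviations for a deficiency report; the result names and values are invented:

```python
# Hypothetical expected vs. actual validation results for one scenario.
expected = {"startup_time_s": 3.0, "ops_mode": "nominal", "alarms": 0}
actual   = {"startup_time_s": 4.5, "ops_mode": "nominal", "alarms": 2}

def deviations(expected, actual):
    """List (name, expected, actual) for every result that differs."""
    return [(k, expected[k], actual[k])
            for k in expected if actual.get(k) != expected[k]]

issues = deviations(expected, actual)
```

Each entry in `issues` would be recorded as a deviation encountered during execution of the validation procedures and dispositioned (procedure problem, environment problem, or genuine product deficiency).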
