SCAMPI Support Management Process Areas Artifacts

Configuration Management | Quality Control | Measurement & Analysis | Decision Analysis & Resolution | Organizational Environment for Integration | Causal Analysis & Resolution

CM - Configuration Management

Support - Configuration Management
Practice ID | Artifacts (Direct / Indirect) | Considerations
Goal 1: Baselines of identified work products are established
Practice 1.1 - Identify the configuration items, components and related work products that will be placed under configuration management
  • Identified configuration items
  • Configuration management lifecycle for controlled items (e.g., owner, point at which placed under control, degree of control, change approval)
  • Configuration management plan.
  • Configuration item identifiers, attributes and characteristics.
  • Documented criteria for selecting configuration items
  • Be sure to consider configuration items representative of all disciplines and processes within the appraisal scope and context. In a sense, this SP specifies the constraints under which the remaining SPs should be considered and assessed.
  • See model for definition and description of configuration item and its work product components.
  • See model for typical examples of work products that may be part of a configuration item (e.g. process descriptions, requirements, design, tools)
  • See model overview material for Generic Practice 2.6 [Manage Configurations] for a description of the various levels of control that might be provided across the lifecycle, e.g. version control vs. formal configuration management.
  • This process area applies not only to configuration management on projects, but also configuration management on organization work products such as standards, procedures, and reuse libraries.
  • Recall that this PA supports configuration management needs of all other process areas, as invoked by Generic Practice 2.6 [Manage Configurations]
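
A minimal sketch of a configuration item record can make the identification attributes above concrete. The class, field names, and control-level labels are illustrative assumptions; the model does not prescribe a format:

    from dataclasses import dataclass
    from enum import Enum

    class ControlLevel(Enum):
        # Degrees of control described under Generic Practice 2.6 (labels assumed)
        VERSION_CONTROL = "version control"
        BASELINE_CONTROL = "baseline control"
        FORMAL_CM = "formal configuration management"

    @dataclass
    class ConfigurationItem:
        # Identifier, attributes, and lifecycle data named in the bullets above
        item_id: str                  # unique configuration item identifier
        description: str
        owner: str                    # who approves changes to the item
        control_level: ControlLevel   # degree of control applied
        placed_under_control: str     # lifecycle point at which controlled

    # Hypothetical example entry:
    srs = ConfigurationItem(
        item_id="CI-REQ-001",
        description="Software Requirements Specification",
        owner="Systems Engineering",
        control_level=ControlLevel.FORMAL_CM,
        placed_under_control="after requirements baseline review",
    )
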
Practice 1.2 - Establish and maintain a configuration management and change management system for controlling work products
  • Configuration management system with controlled work products
  • Change request database
  • Configuration management and change management procedures
  • Configuration management system access control procedures
  • CM library records and reports (e.g. baseline contents, level of controlled items, CCB status, audit reports)
  • Change management database reports
  • CM plan, describing tools and mechanisms for storage, retrieval, multiple levels of control
  • Records of the revision of the configuration management structure, as necessary
  • CM system backup and archive media
  • A configuration management system includes the storage media, the procedures, and the tools for accessing the configuration system.
  • Look for evidence of consistent use of the CM system and change management system for various types of work products (e.g. documentation, design, code, test) across the development lifecycle.
  • Consider potential differences in CM processes and tools across the life cycle (developmental CM, baseline control and management, archives, etc.)
  • Demos of the CM tool and change management tool capabilities, with inspection of random items, can serve as effective (affirmation) evidence of implementation of this practice.
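
To make the check-in/check-out mechanics concrete, here is a deliberately toy sketch of the bookkeeping a CM system performs; the class and method names are illustrative assumptions, not any particular tool's API:

    import copy

    class CMSystem:
        # Toy store of versioned work products with exclusive check-out
        def __init__(self):
            self._versions = {}     # item_id -> list of stored versions
            self._checked_out = {}  # item_id -> user holding the item

        def add_item(self, item_id, content):
            self._versions[item_id] = [copy.deepcopy(content)]

        def check_out(self, item_id, user):
            if item_id in self._checked_out:
                raise RuntimeError(f"{item_id} is already checked out")
            self._checked_out[item_id] = user
            return copy.deepcopy(self._versions[item_id][-1])

        def check_in(self, item_id, user, new_content):
            # Only the holder may check in; each check-in adds a revision
            if self._checked_out.get(item_id) != user:
                raise RuntimeError(f"{user} does not hold {item_id}")
            self._versions[item_id].append(copy.deepcopy(new_content))
            del self._checked_out[item_id]

        def revision_history(self, item_id):
            return list(enumerate(self._versions[item_id], start=1))

Real CM systems layer access control, multiple levels of control, and audit records on top of this bookkeeping.
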
Practice 1.3 - Create or release baselines for internal use and for delivery to the customer.
  • Baselines
  • Descriptions of baselines
    • Baseline identifiers with defined and controlled contents (configuration items)
  • Configuration management records and reports
  • CCB meeting minutes
  • Change documentation and version control associated with a baseline
  • Baseline generation / release procedures, scripts, transmittal documents
  • CM tool or repository demo (e.g., baselines, items, nodes, branches)
  • Baseline audits
  • Examples may include functional, allocated, and product baselines, releases to a customer, or internal builds.
  • Consider different types of baselines that may be established for representative work products throughout the project or product lifecycle and across the disciplines being assessed.
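
As a hedged sketch, releasing a baseline amounts to freezing a named set of configuration item versions; the identifiers below are invented for illustration:

    def create_baseline(baseline_id, item_versions):
        # Freeze a named snapshot of configuration item versions;
        # real releases also record date, approver, and CCB reference
        return {"baseline_id": baseline_id, "contents": dict(item_versions)}

    product_baseline = create_baseline(
        "PB-2.0",
        {"CI-REQ-001": "v3", "CI-DES-002": "v5", "CI-SRC-104": "v12"},
    )
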
Goal 2: Changes to the work products under configuration management are tracked and controlled.
Practice 2.1 - Track change requests for the configuration items
  • Change request tracking products (e.g., change request database, reports, logs, closure status, metrics)
  • Recorded evaluation and disposition of change requests (e.g., review, authorization, approval of changes)
  • Change request impact analyses
  • Change request lifecycle or workflow descriptions
  • CCB / stakeholder review records (e.g., logs, meeting minutes)
  • Configuration item revision history
  • Typical change request contents include entries such as item identifier, description of change, proposed change, rationale, impact analysis, review / authorization, etc.
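
A minimal change request record mirroring the typical contents listed above might look like this; the field names are assumptions for illustration:

    from dataclasses import dataclass, field
    from enum import Enum

    class CRStatus(Enum):
        OPEN = "open"
        IN_REVIEW = "in review"
        APPROVED = "approved"
        REJECTED = "rejected"
        CLOSED = "closed"

    @dataclass
    class ChangeRequest:
        cr_id: str
        item_id: str                 # configuration item affected
        description: str             # description of the change
        proposed_change: str
        rationale: str
        impact_analysis: str = ""
        status: CRStatus = CRStatus.OPEN
        approvals: list = field(default_factory=list)  # review/authorization trail
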
Practice 2.2 - Control changes to the content of the configuration items
  • Revision history of configuration items
  • Revised configuration items and baselines incorporating approved changes (e.g., CCB approval)
  • Archives of the baselines
    • Configuration management records and reports describing the revision status of baselines and configuration items
    • Impact analyses, reviews, or regression tests to ensure the integrity of baseline revisions
    • Change request review and tracking products (e.g., checklists, evaluation criteria, reports, logs, closure status, metrics)
    • Recorded evaluation and disposition of change requests (e.g., review, authorization, approval of changes)
    • Check-in/check-out procedures from the configuration management system.
  • Configuration items identified in Practice 1.1 should be controlled at the appropriate level.
  • Archives should be maintained for review / retrieval of superseded versions of baselines and configuration items (e.g., to support rollback to prior versions)
  • Typical change request contents include entries such as item identifier, description of change, proposed change, rationale, impact analysis, review / authorization, etc.
Goal 3: Integrity of baselines is established and maintained.
Practice 3.1 - Establish and maintain records describing configuration items
  • Records describing content, status, and version of configuration items and baselines.
  • Reports describing configuration item status, available to affected individuals and groups (e.g., CM library reports, baseline access.)
  • Multiple versions of configuration item records, maintained over time.
  • Revision history of configuration items
  • Change log
    • Change request logs or database.
  • Copy of the change requests
  • Status of configuration items
  • Differences between baselines
  • Ensure stakeholder access to configuration management records, baselines, and configuration items, as appropriate.
Practice 3.2 - Perform configuration audits to maintain integrity of the configuration baselines
  • Configuration audit results
  • Action items
  • Criteria and checklists used to conduct configuration audits
  • Quality inspection records
  • Configuration audit schedules and descriptions.
  • Minutes of meetings in which the accuracy and contents of baselines or releases are reviewed.
  • Tools or reports to verify configuration baseline contents
  • Configuration audits may take several forms (functional, physical, logical, etc.), particularly when considering disciplines outside of software
  • The frequency and conduct of configuration audits are typically described in the configuration management plan
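
A physical configuration audit can be sketched as a checksum comparison between the recorded baseline manifest and what is actually on disk. This is a minimal illustration under assumed data shapes, not a complete audit procedure:

    import hashlib
    from pathlib import Path

    def audit_baseline(manifest, root):
        # manifest: relative path -> expected SHA-256 hex digest
        # Returns discrepancies for the configuration audit report
        findings = []
        for rel_path, expected in manifest.items():
            path = Path(root) / rel_path
            if not path.is_file():
                findings.append(f"MISSING: {rel_path}")
                continue
            actual = hashlib.sha256(path.read_bytes()).hexdigest()
            if actual != expected:
                findings.append(f"MODIFIED: {rel_path}")
        return findings
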


PPQA - Process and Product Quality Assurance

Support - Quality Control
Practice ID | Artifacts (Direct / Indirect) | Considerations
Goal 1: Adherence of the performed process and associated work products and services to applicable process descriptions, standards, and procedures is objectively evaluated.
Practice 1.1 - Objectively evaluate the designated performed processes against the applicable process descriptions, standards, and procedures.

Practice 1.2 - Objectively evaluate the designated work products and services against the applicable process descriptions, standards, and procedures.

  • Evaluation reports
  • Noncompliance reports
  • Corrective actions
    • Quality assurance plan, identifying the processes subject to evaluation, and procedures for performing evaluations.
    • Applicable process descriptions, standards, and procedures.
    • Action items for noncompliance issues, tracked to closure.
    • Criteria and checklists used for process and work product evaluations (e.g. what, when, how, who).
    • Schedule for performing process evaluations (planned, actual) at selected milestones throughout the product development life cycle.
    • Quality assurance records, reports, or database.
    • Records of reviews or events indicating QA involvement (e.g. attendance lists, signature)
  • Objectivity in quality assurance evaluations is critical to the success of the project. A description of the quality assurance reporting chain and how it ensures objectivity should be defined.
  • The frequency of evaluations or audits is typically defined in a quality assurance plan. Look for evaluations performed throughout the lifecycle, not just at the end of a project or in close proximity to the appraisal. For example, management processes lend themselves to periodic examination; technical processes lend themselves to event-driven examination based on project schedules and progress.
  • A typical implementation of this practice is through the development and use of a quality assurance plan that may be a standalone document or incorporated into another plan.
  • Depending on the culture of the organization, the process and product quality assurance role may be performed, partially or completely, by peers, and the quality assurance function may be embedded in the process. Where the quality assurance function is embedded in the process, there must be a review of the objectivity aspect to ensure that the evaluation mechanism is being adequately applied.
  • A common misunderstanding is that the existence of a work product indicates satisfactory execution of the process. For all process evaluations, examine how the evaluation was done to ensure that the evaluation mechanism considers more than just work product existence.
Goal 2: Noncompliance issues are objectively tracked and communicated, and resolution is ensured.
Practice 2.1 - Communicate quality issues and ensure resolution of noncompliance issues with the staff and managers.
  • Corrective action and evaluation reports
  • Quality trends
    • Action items for noncompliance issues, tracked to closure.
    • Revised work products, standards and procedures, or waivers issued to resolve noncompliance issues.
    • Reports or briefings communicating noncompliance issues to relevant stakeholders.
    • Evidence of reviews held periodically to receive and act upon noncompliance issues.
    • Quality metrics and trend analyses.
    • Tracking system or database for noncompliance issues.
  • The status of noncompliance issues provides an indication of quality trends
  • When local resolution of noncompliance issues cannot be obtained, established escalation mechanisms should be used to ensure that the appropriate level of management can resolve the issue. Track noncompliance issues to resolution.
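
The escalation consideration above can be illustrated with a small aging check; the record fields and the 30-day threshold are assumptions, not prescribed values:

    from datetime import date, timedelta

    def issues_to_escalate(issues, today, age_limit_days=30):
        # Flag open noncompliance issues older than the local
        # resolution window so management can resolve them
        limit = timedelta(days=age_limit_days)
        return [i["id"] for i in issues
                if i["status"] != "closed" and today - i["opened"] > limit]

    sample = [
        {"id": "NC-12", "opened": date(2024, 1, 5), "status": "open"},
        {"id": "NC-13", "opened": date(2024, 3, 1), "status": "closed"},
    ]
    print(issues_to_escalate(sample, today=date(2024, 3, 10)))  # ['NC-12']
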
Practice 2.2 - Establish and maintain records of the quality assurance activities.
  • Evaluation logs
  • Quality assurance reports and records of activities.
  • Status reports of corrective actions and of quality assurance activities
  • Reports of quality trends
    • Noncompliance actions, reports, logs, or database
    • Completed evaluation checklists
    • Schedule for performing process and product evaluations (planned, actual).
    • Records of reviews or events indicating QA involvement (e.g. attendance lists, signature)
    • Metrics or analyses used for quality assurance of processes and work products.
  • Record activities in sufficient detail that their status and results are known
  • Examine records to ensure that appropriate actions are being taken, and resources applied, to manage the effectiveness of the quality assurance function


MA - Measurement & Analysis

Support - Measurement & Analysis
Practice ID | Artifacts (Direct / Indirect) | Considerations
Goal 1: Measurement objectives and activities are aligned with identified information needs and objectives.
Practice 1.1 - Establish and maintain measurement objectives that are derived from identified information needs and objectives.
  • Measurement objectives
  • Alignment between business goals, measurement objectives/goals, information needs/objectives
  • Identified information needs, objectives, and priorities
  • Documented sources of information needs
  • Reviews of measurement objectives with affected stakeholders.
  • The sources for measurement objectives may be management, technical, project, product, or process implementation needs
  • Measurement objectives may be constrained by existing processes, available resources, or other measurement considerations
  • Judgments may need to be made about whether the value of the results will be commensurate with the resources devoted to doing the work
  • Modifications to identified information needs and objectives may, in turn, be indicated as a consequence of the process and results of measurement and analysis
  • Sources of information needs and objectives:
    • Project plans
    • Monitoring of project performance
    • Interviews with managers and others who have information needs
    • Established management objectives
    • Strategic plans
    • Business plans
    • Formal requirements or contractual obligations
    • Recurring or other troublesome management or technical problems
    • Experiences of other projects or organizational entities
    • External industry benchmarks
    • Process-improvement plans
  • Reference the Project Planning process area for more information about estimating project attributes and other planning information needs.
  • Reference the Project Monitoring and Control process area for more information about project performance information needs.
  • Reference the Requirements Development process area for more information about meeting customer requirements and related information needs.
  • Reference the Requirements Management process area for more information about maintaining requirements traceability and related information needs.
Practice 1.2 - Specify measures to address the measurement objectives.
  • Linkage between measures and project / organization measurement objectives and information needs
  • Algorithms, templates, checklists, procedures, and other means of consistently collecting and recording measures for the identified product, project, and process attributes
  • Evidence of review of proposed specifications with stakeholders and other end users.
  • List of prioritized measures.
  • Measures may be (see the sketch at the end of this practice's considerations):
    • base: obtained by direct measurement
    • derived: computed from other data, typically by combining two or more base measures.
  • The measurement objectives are refined into specific measures:
    • KEY GOAL INDICATORS (KGI): define measures that tell management — after the fact — whether an IT process has achieved its business requirements, usually expressed in terms of information criteria:
      • Availability of information needed to support the business needs
      • Absence of integrity and confidentiality risks
      • Cost-efficiency of processes and operations
      • Confirmation of reliability, effectiveness and compliance.
    • KEY PERFORMANCE INDICATORS (KPI): define measures to determine how well the IT process is performing in enabling the goal to be reached; are lead indicators of whether a goal will likely be reached or not; and are good indicators of capabilities, practices and skills.
  • Operational definitions are stated in precise and unambiguous terms. They address two important criteria:
    • Communication: What has been measured, how was it measured, what are the units of measure, and what has been included or excluded?
    • Repeatability: Can the measurement be repeated, given the same definition, to get the same results?
  • Proposed specifications of the measures are reviewed for their appropriateness with potential end users and other relevant stakeholders. Priorities are set or changed, and specifications of the measures are updated as necessary.
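
As referenced in the base/derived bullet above, a derived measure is computed from base measures. A common example, with an assumed operational definition, is defect density:

    def defect_density(defects_found, size_ksloc):
        # Derived measure combining two base measures. Assumed operational
        # definition: defects_found = unique confirmed defects recorded
        # for the release; size_ksloc = delivered size in thousands of
        # logical source lines.
        if size_ksloc <= 0:
            raise ValueError("size must be positive")
        return defects_found / size_ksloc

    print(defect_density(defects_found=42, size_ksloc=17.5))  # 2.4 defects/KSLOC
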
Practice 1.3 - Specify how measurement data will be obtained and stored.
  • Data collection and storage procedures - who (responsibilities), how (procedures and tools), when (frequency), where (repository).
  • Data collection tools
    • Data collection mechanisms and supporting tools
    • Raw data collected, time tagged, and stored.
    • Analysis reports and trending indicating completeness of collected data.
    • Measurement repository.
    • Reports of invalid or discarded data.
  • Explicit specifications are made of how, where, and when the data will be collected. Procedures for collecting valid data are specified. The data are stored in an accessible manner for analysis, and it is determined whether they will be saved for possible re-analysis or documentation purposes.
  • Explicit specification of collection methods helps ensure that the right data are collected properly. It may also aid in further clarifying information needs and measurement objectives
  • Proper attention to storage and retrieval procedures helps ensure that data are available and accessible for future use.
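
A minimal sketch of time-tagged collection into an accessible repository, assuming a simple CSV layout (the columns are invented for illustration):

    import csv
    from datetime import datetime, timezone

    def record_measurement(repository_path, project, measure, value):
        # Append one raw, time-tagged observation, per the
        # who/how/when/where collection specification above
        timestamp = datetime.now(timezone.utc).isoformat()
        with open(repository_path, "a", newline="") as f:
            csv.writer(f).writerow([timestamp, project, measure, value])

    record_measurement("measures.csv", "Project-A", "open_defects", 42)
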
Practice 1.4 - Specify how measurement data will be analyzed and reported.
  • Analysis specification and procedures - Analysis descriptions, including who (responsibilities), how (procedures and tools), when (frequency), where (repository), and how the results will be used.
  • Data analysis tools
    • Results of data analyses
    • Alignment of data analyses with measurement objectives
    • Evidence of evaluations or meetings held to review measurement analyses
    • Criteria for evaluating the utility of measurement and analysis data
    • Revisions to measures and measurement objectives.
  • Early attention should be paid to the analyses that will be conducted and to the manner in which the results will be reported. These should explicitly address the documented measurement objectives and presentation of the results should be clearly understandable by the target audiences
  • Clarification of analysis criteria can affect measurement. Specifications for some measures may be refined further based on the specifications established for data analysis procedures. Other measures may prove to be unnecessary, or a need for additional measures may be recognized.
  • Criteria for evaluating the utility of the analysis:
    • The results are (1) provided on a timely basis, (2) understandable, and (3) used for decision making
    • The work does not cost more to perform than is justified by the benefits that it provides.
  • Criteria for evaluating the conduct of the measurement and analysis:
    • The amount of missing data or the number of flagged inconsistencies is beyond specified thresholds.
    • There is selection bias in sampling
    • The measurement data are repeatable
    • Statistical assumptions are acceptable
Goal 2: Measurement results that address identified information needs and objectives are provided.
Practice 2.1 - Obtain specified measurement data.
  • Base and derived measurement data sets
    • Raw data collected, time tagged, and stored in accordance with defined data collection procedures (Practice 1.3)
    • Derived measures calculated from collected base measures.
  • Results of data integrity tests
    • Measurement repository populated with the specified measures
    • Analysis reports and trending indicating completeness of collected data
    • Results of integrity checks (e.g., tools, forms, reviews); reports of invalid or discarded data.
  • All measurements are subject to error in specifying or recording data. It is always better to identify such errors and to identify sources of missing data early in the measurement and analysis cycle.
  • Checks can include scans for missing data, out-of-bounds data values, and unusual patterns and correlation across measures.
  • It is important to test and correct for inconsistency of classifications made by human judgment (i.e., to determine how frequently people make differing classification decisions based on the same information, otherwise known as “inter-coder reliability”).
  • It is important to empirically examine the relationships among the measures that are used to calculate additional derived measures.
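
A hedged sketch of two of the checks named above (missing data and out-of-bounds values); thresholds and bounds would come from the analysis specification:

    def integrity_check(values, lower, upper):
        # Scan a data set for missing and out-of-bounds values and
        # return counts for the data-completeness report
        missing = sum(1 for v in values if v is None)
        out_of_bounds = sum(1 for v in values
                            if v is not None and not (lower <= v <= upper))
        return {"missing": missing, "out_of_bounds": out_of_bounds,
                "total": len(values)}

    print(integrity_check([3.2, None, 7.9, 120.0, 5.1], lower=0.0, upper=100.0))
    # {'missing': 1, 'out_of_bounds': 1, 'total': 5}
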
Practice 2.2 - Analyze and interpret measurement data.
  • Analysis results and draft reports.
  • Representations for analysis results
  • Evidence of evaluations or meetings held to review measurement analyses
  • Follow-up analyses performed to address areas of concern, if necessary.
  • Revisions of criteria for future analysis.
  • The results of data analyses are rarely self evident. Criteria for interpreting the results and drawing conclusions should be stated explicitly
  • The results of planned analyses may suggest (or require) additional, unanticipated analyses; they may also identify needs to refine existing measures, to calculate additional derived measures, or even to collect data for additional primitive measures to properly complete the planned analysis

Practice 2.3 - Manage and store measurement data, measurement specifications, and analysis results.
  • Measurement repository with historical data and results.
  • Contextual information for understanding and interpreting the measures, and assessing them for reasonableness and applicability.
  • Measurement repository, with access restriction to the stored data
  • Information stored typically includes measurement plans, specifications of measures, collected data sets, and analysis reports and presentations
  • The stored information contains or references the information needed to understand and interpret the measures and assess them for reasonableness and applicability
  • Data sets for derived measures typically can be recalculated and need not be stored. However, it may be appropriate to store summaries based on derived measures. Interim analysis results need not be stored separately if they can be efficiently reconstructed
  • Projects may choose to store project-specific data and results in a project-specific repository. When data are shared more widely across projects, the data may reside in the organization’s measurement repository.
  • Reference the Establish the Organization’s Measurement Repository specific practice of the Organizational Process Definition process area for more information about establishing the organization’s measurement repository
  • Reference the Configuration Management process area for information on managing measurement work products.
Practice 2.4 - Report results of measurement and analysis activities to all relevant stakeholders.
  • Delivered reports and related analysis results.
  • Contextual data or guidance to aid in interpretation of analysis results
    • Presentations of data analyses and reports
    • Measurement indicator templates
    • Distribution lists or web pages for communicating measurement results.
  • Measurement results are communicated in time to be used for their intended purposes. Reports are unlikely to be used if they are distributed with little effort to follow up with those who need to know the results
  • To the extent possible and as part of the normal way they do business, users of measurement results are kept personally involved in setting objectives and deciding on plans of action for measurement and analysis. The users are regularly kept apprised of progress and interim results.
  • Reference the Project Monitoring and Control process area for more information on the use of measurement results.


DAR - Decision Analysis & Resolution

Support - Decision Analysis & Resolution
Practice ID | Artifacts (Direct / Indirect) | Considerations
Goal 1: Decisions are based on an evaluation of alternatives using established criteria.
Practice 1.1 - Establish and maintain guidelines to determine which issues are subject to a formal evaluation process.
  • Guidelines for when to apply a formal evaluation process
  • Criteria or checklists for determining when to apply a formal evaluation process
  • Process description for conducting formal evaluations and selection of applicable decision-making techniques
  • Identified set of typical issues subject to a formal evaluation process
  • Organizations may have different approaches to how formal evaluation processes are architected and documented. They may be embedded within several associated processes (e.g., supplier selection or trade studies) rather than documented as a separate decision analysis and resolution process. If distributed across several processes, this may itself indicate the organization's decision as to which processes need formal evaluation techniques, rather than a separate, integrated set of guidelines.
  • Identification of the issues subject to formal evaluation (i.e., applying these guidelines) is not explicitly addressed by the other specific practices of this process area; therefore, it should be considered here. An outcome is the identified set of issues subject to application of the formal evaluation process (which is detailed in the remaining specific practices).
  • There may be a variety of suitable evaluation processes that could be selected from, as appropriate to the situation. Formal evaluation processes can vary in formality, type of criteria, and methods employed.
  • Typical guidelines for determining when to require a formal evaluation process include:
    • When a decision is directly related to topics assessed as being of medium or high risk
    • When a decision is related to changing work products under configuration management
    • When a decision would cause schedule delays over a certain percentage or specific amount of time
    • When a decision affects the ability to achieve project objectives
    • When the costs of the formal evaluation process are reasonable when compared to the decision’s impact
  • Reference the Risk Management process area for more information about determining which issues are medium or high risk.
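
The guidelines above can be read as a simple decision rule. The sketch below assumes illustrative thresholds and parameter names, not values prescribed by the model:

    def requires_formal_evaluation(risk_level, schedule_delay_pct,
                                   affects_project_objectives,
                                   changes_controlled_item):
        # True if any of the guideline conditions above triggers
        # a formal evaluation process
        return (risk_level in ("medium", "high")
                or schedule_delay_pct > 10.0       # assumed threshold
                or affects_project_objectives
                or changes_controlled_item)

    print(requires_formal_evaluation("low", 15.0, False, False))  # True
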
Practice 1.2 - Establish and maintain the criteria for evaluating alternatives, and the relative ranking of these criteria.
  • Documented evaluation criteria
  • Rankings of criteria importance
  • Traceability of criteria to documented sources (e.g., requirements, assumptions, business objectives)
  • Guidance for determining and applying evaluation criteria (e.g., ranges, scales, formulas, rationale)
  • Rationale for selection and rejection of evaluation criteria.
  • This process area is referenced by many other process areas in the model, and there are many contexts in which a formal evaluation process can be used. Therefore, in some situations you may find that criteria have already been defined as part of another process. This specific practice does not suggest that a second development of criteria be conducted.
  • Document the evaluation criteria to minimize the possibility that decisions will be second-guessed, or that the reason for making the decision will be forgotten. Decisions based on criteria that are explicitly defined and established remove barriers to stakeholder buy-in.
  • Criteria should be traceable to requirements, scenarios, business case assumptions, business objectives, or other documented sources
  • Scales of relative importance for evaluation criteria can be established with nonnumeric values or with formulas that relate the evaluation parameter to a numerical weight
  • Documentation of selection criteria and rationale may be needed to justify solutions or for future reference and use.
Practice 1.3 - Identify alternative solutions to address issues.
  • Identified alternatives
  • Results of brainstorming sessions, interviews, or other techniques used to identify potential solutions
  • Research resources and references (e.g. literature surveys)
  • A wider range of alternatives can surface by soliciting as many stakeholders as practical for input. Input from stakeholders with diverse skills and backgrounds can help teams identify and address assumptions, constraints, and biases
  • Evaluation criteria are an effective starting point for identifying alternatives. The evaluation criteria identify the priorities of the relevant stakeholders and the importance of technical challenges
  • Combining key attributes of existing alternatives can generate additional and sometimes stronger alternatives.
  • Brainstorming sessions may stimulate innovative alternatives through rapid interaction and feedback.
  • As the analysis proceeds, other alternatives should be added to the list of potential candidate solutions. The generation and consideration of multiple alternatives early in a decision analysis and resolution process increases the likelihood that an acceptable decision will be made, and that consequences of the decision will be understood.
Practice 1.4 - Select the evaluation methods.
  • Selected evaluation methods.
  • List of candidate or preferred evaluation methods
  • Guidance on selection of appropriate evaluation methods
  • Methods for evaluating alternative solutions against established criteria can range from simulations to the use of probabilistic models and decision theory. These methods need to be carefully selected. The level of detail of a method should be commensurate with cost, schedule, performance, and risk impacts
  • While many problems may need only one evaluation method, some problems may require multiple methods. For instance, simulations may augment a trade study to determine which design alternative best meets a given criterion.
Practice 1.5 - Evaluate alternative solutions using the documented criteria.
  • Conclusions or findings from evaluations.
  • Evaluated assumptions and constraints for application of evaluation criteria or interpretation of results (e.g., uncertainty, significance)
  • Completed evaluation forms, checklists, or assigned criteria.
  • Results of simulations, modeling, prototypes, pilots, life cycle cost analyses, studies, etc., performed on potential solutions.
  • Iterative cycles of analysis are sometimes necessary.
  • Supporting analyses, experimentation, prototyping, or simulations may be needed to substantiate scoring and conclusions
  • The relative importance of criteria may be imprecise, and the total effect on a solution may not be apparent until after the analysis is performed. In cases where the resulting scores differ by relatively small amounts, the best selection among alternative solutions may not be clear-cut
  • Challenges to criteria and assumptions should be encouraged.
  • Untested criteria, their relative importance, and supporting data or functions may cause the validity of solutions to be questioned
  • Criteria and their relative priorities and scales can be tested with trial runs against a set of alternatives. These trial runs of a select set of criteria allow for the evaluation of the cumulative impact of the criteria on a solution. If the trials reveal problems, different criteria or alternatives might be considered to avoid biases.
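
One common formal evaluation method is a weighted-sum decision matrix. The criteria, weights, and scores below are illustrative assumptions:

    def score_alternatives(criteria_weights, scores):
        # Weighted-sum evaluation against documented, ranked criteria
        totals = {}
        for alt, ratings in scores.items():
            totals[alt] = round(sum(criteria_weights[c] * ratings[c]
                                    for c in criteria_weights), 2)
        return dict(sorted(totals.items(), key=lambda kv: kv[1], reverse=True))

    weights = {"cost": 0.4, "performance": 0.35, "risk": 0.25}
    candidates = {
        "Alternative A": {"cost": 7, "performance": 9, "risk": 6},
        "Alternative B": {"cost": 8, "performance": 6, "risk": 7},
    }
    print(score_alternatives(weights, candidates))
    # {'Alternative A': 7.45, 'Alternative B': 7.05}

Scores this close illustrate the "not clear-cut" consideration above; sensitivity checks on the weights may be warranted before selecting a solution.
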
Practice 1.6 - Select solutions from the alternatives based on the evaluation criteria.
  • Recommended solutions to address significant issues
  • Documented results and rationale of the decision.
  • Decisions must often be made with incomplete information. There can be substantial risk associated with the decision resulting from incomplete information
  • Identified risks should be monitored
  • When decisions must be made according to a specific schedule, time and resources may not be available for gathering complete information. Consequently, risky decisions made with incomplete information may require re-analysis later
  • It is important to record both why a solution is selected and why another solution was rejected.


OEI - Organizational Environment for Integration

Support - Organizational Environment for Integration
Practice ID | Artifacts (Direct / Indirect) | Considerations
Goal 1: An infrastructure that maximizes the productivity of people and affects the collaboration necessary for integration is provided.
Practice 1.1 - Establish and maintain a shared vision for the organization.
  • Revision history for the organization's shared vision
  • Evaluations of the organization's shared vision
  • Guidelines for shared-vision building within projects and integrated teams
    • Minutes from meetings at which the organizational shared vision is developed or reviewed
    • Documented agreement and commitment to the organization's shared vision
    • Communications of organizational expectations for projects and integrated teams (newsletters, email, bulletin board postings, presentations, posters, etc.)
  • Creating a shared vision involves establishing and actively maintaining agreement and commitment about what is to be done and how it will be accomplished, both procedurally and behaviorally.
  • The organization’s shared vision must speak to every element of the organization
  • The shared vision should set reasonable expectations on the rate of change in an organization. Unrealistic proclamations can transform the shared vision into a source of frustration and cause the organization to retreat from it after initial pilot demonstrations.
Practice 1.2 - Establish and maintain an integrated work environment that supports IPPD by enabling collaboration and concurrent development.
  • Integrated work environment
    • Tools and facilities designed and used to facilitate and enable collaboration and teamwork in the organization.
  • Requirements for the integrated work environment
  • Design of the integrated work environment
    • Sufficient meeting rooms, email, fax, FTP or web sites, video teleconferencing facilities
    • Evaluations of effectiveness of the integrated work environment
    • Maintenance and support of the integrated work environment
    • Training materials and records for the integrated work environment
    • Evidence of awareness of current and emerging integrated work environment technologies, resources and tools.
  • The integrated work environment must accommodate both collocated and distributed integrated teams as required. Two-way communications media should be easily accessible by all relevant stakeholders
  • Integrated communication tool sets reduce time spent converting information from one medium or platform to another, and correcting transcriptions or misunderstandings when people do the conversions. Requirements for product and process information usability should be considered throughout the life cycle
  • Integrated work environments are developed with the same, or greater, rigor as that used to develop a specific product or service. Integrated work environments are capital assets that are often expensive, have unique implementations, are irreversible (their implementation can destroy or make unusable the assets being replaced), and whose modification disrupts ongoing activities. The rigor appropriate to the development should be matched to the magnitude of the needs to be resolved and the deployment risks
  • The work environment should be monitored throughout its existence to ascertain if, and when, its performance degrades below that expected (or specified) as well as to identify opportunities for improvements.
Practice 1.3 - Identify the unique skills needed to support the IPPD environment.
  • IPPD strategic and tactical training needs.
  • Skills matrix for people to collaborate, integrate, and lead others
  • Training modules to develop and reinforce collaborative skills in an IPPD environment
  • Plan or schedule for training to support the introduction and sustainment of skills for IPPD
  • Leadership challenges include:
    • ensuring that all team members mutually understand their roles and responsibilities;
    • employing people in their intended roles; and
    • effectively accessing the depth and wealth of specific expertise resident in the organization and integrating it into the overall integrated team effort.
  • Reference the Organizational Training process area for more information about determining training needs and delivering the training.
  • The organization's leadership and work force will typically need to develop new skills. IPPD requires integrative leadership and interpersonal skills beyond those typically found in traditional environments, where people tend to work alone or interact primarily with others from their own or similar functions or disciplines.
Goal 2: People are managed to nurture the integrative and collaborative behaviors of an IPPD environment.
Practice 2.1 - Establish and maintain leadership mechanisms to enable timely collaboration.
  • Guidelines for determining the degree of empowerment of people and integrated teams
  • Guidelines for setting leadership and decision-making context
  • Organizational process documentation for issue resolution
  • Process descriptions and training for leadership mechanisms in an IPPD environment
  • Documented reporting relationships and decision-making authority.
  • Standard processes enable, promote, and reinforce the integrative behaviors expected from projects, integrated teams, and people
  • Organizational guidelines that scope the degree of empowerment for integrated teams serve an issue-prevention role. Best practices promote documented and deployed organizational guidelines that can preclude issues arising from empowerment and authority misinterpretation
  • In establishing the context for decision making, the various kinds of issues are described and agreements are reached on the decision type that will be used to resolve each kind of issue.
  • An organizational process for issue resolution can form the basis for project- and integrated-team-specific procedures and help ensure that basic issue-resolution avenues are available to projects and integrated teams when unresolved issues must be escalated.
Practice 2.2 - Establish and maintain incentives for adopting and demonstrating integrative and collaborative behaviors at all levels of the organization.
  • Policies and procedures for performance appraisal and recognition that reinforce collaboration
  • Integrated team and individual recognition and rewards - Incentive program that recognizes individual achievement as well as team performance.
  • Criteria for distinguishing collaborative behaviors
  • Performance review process that considers both functional supervisor and team leader input.
  • Individual excellence still should be recognized, but criteria should discern whether such excellence was achieved at the expense of the integrative behaviors expected or in support of them.
  • Incentives should be consistent with the objectives of the organization and applied to achieve desired behavior at all levels of the organization.
  • Criteria can establish guidelines for the reassignment of people who are unable to demonstrate desired behavior and the selection of people who can exhibit desired behavior for challenging or important jobs.
Practice 2.3 - Establish and maintain organizational guidelines to balance team and home organization responsibilities.
  • Organizational guidelines for balancing team and home organization responsibilities.
  • Performance review process that considers both functional supervisor and team leader input
    • Role descriptions and organization charts that identify responsibilities to both the project team and home organization
    • Department budgets and financial reports identifying individual participation in home department activities
    • Compensation policies, procedures, and criteria that recognize both individual and team achievement.
  • The balance must be reflected in the personal or career development plans for each individual. The knowledge and skills needed for an individual to succeed in both their functional and integrated team role should be honed, taking into account current and future assignments.
  • Guidelines should also be in place for disbanding teams and maintaining home organizations. It has been observed that sometimes teams attempt to remain in place beyond their productive life in organizations that do not have a home organization for the team members to report back to after the team is dissolved.


CAR - Causal Analysis & Resolution

Support - Causal Analysis & Resolution
Practice ID | Artifacts (Direct / Indirect) | Considerations
Goal 1: Root causes of defects and other problems are systematically determined.
Practice 1.1 - Select the defects and other problems for analysis.
  • Defect and problem data selected for further analysis.
  • Defects and other problems to be analyzed further
  • Defect selection criteria
Practice 1.2 - Perform causal analysis of selected defects and other problems and propose actions to address them.
  • Action proposal
  • Defect categories
  • Enhanced infrastructure integrity
  • Fewer latent errors in environment
  • Causal analysis is performed with those people who have an understanding of the selected defect or problem under study, typically in meetings.
  • Depending on the type and number of defects, it may make sense to first group the defects before identifying their root causes.
  • Reference the Quantitative Project Management process area for more information about achieving the project’s quality and process performance objectives.
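
Grouping defects before root cause analysis, as suggested above, can be as simple as a category count; the categories below are invented for illustration:

    from collections import Counter

    defect_categories = ["interface", "logic", "interface", "requirements",
                         "logic", "interface", "documentation"]

    # Largest clusters first, so the causal analysis meeting can focus there
    for category, count in Counter(defect_categories).most_common():
        print(f"{category}: {count}")
    # interface: 3, logic: 2, requirements: 1, documentation: 1
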
Goal 2: Root causes of defects and other problems are systematically addressed to prevent their future occurrence.
Practice 2.1 - Implement the selected action proposals that were developed in causal analysis.
  • Action proposals selected for implementation
  • Improvement proposals
  • Criteria for prioritizing action proposals
  • Removal of like defects (current and future) through implementation of the action proposal
  • Information provided in an action item
  • Only changes that prove to be of value should be considered for broad implementation.
  • Reference the Organizational Innovation and Deployment process area for more information about the selection and deployment of improvement proposals for the set of standard processes.
Practice 2.2 - Evaluate the effect of changes on process performance.
  • Measures of performance and performance change.
  • Influence on the ability of a process to meet its quality and process-performance objectives
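
A minimal sketch of evaluating the effect of a change, assuming a before/after sample of the same process performance measure:

    from statistics import mean

    def performance_change(before, after):
        # Compare a performance measure (e.g., escaped defects per
        # inspection) before and after the implemented change
        b, a = mean(before), mean(after)
        return {"before": b, "after": a,
                "relative_change": (a - b) / b if b else None}

    print(performance_change(before=[4, 5, 6], after=[2, 3, 4]))
    # mean falls from 5 to 3, a 40% reduction
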
 
Practice 2.3 - Record causal analysis and resolution data for use across the project and organization.
  • Causal analysis and resolution records
  • Data on defects and other problems that were analyzed
    • Rationale for decisions
    • Action proposals from causal analysis meetings
    • Action items resulting from action proposals
    • Cost of the analysis and resolution activities
    • Measures of changes to the performance of the defined process resulting from resolutions
 
