AI05: System Implementation & Accreditation


1. Description

A formalised installation, migration, conversion and acceptance plan.


2. Control Objectives




3. Key Goal Indicators




4. Key Performance Indicators




5. Critical Success Factors



6. Service Maturity Variations

0 (Non-existent): There is a complete lack of formal installation or accreditation processes, and neither senior management nor IT staff recognises the need to verify that solutions are fit for their intended purpose.

1 (Initial/Ad Hoc): There is an awareness of the need to verify and confirm that implemented solutions serve the intended purpose. Testing is performed for some projects, but the initiative for testing is left to the individual project teams and the approaches taken vary. Formal accreditation and sign-off are rare or non-existent.

2 (Repeatable but Intuitive): There is some consistency between the testing and accreditation approaches, but they are not based on any methodology. The individual development teams normally decide the testing approach, and integration testing is usually absent. There is an informal approval process, not necessarily based on standardised criteria. Formal accreditation and sign-off are applied inconsistently.

3 (Defined Process): A formal methodology for installation, migration, conversion and acceptance is in place; however, management has no means of assessing compliance with it. IT installation and accreditation processes are integrated into the system life cycle and automated to some extent. Training, testing, transition to production status and accreditation are likely to vary from the defined process, based on individual decisions. The quality of systems entering production is inconsistent, and new systems often generate a significant level of post-implementation problems.

4 (Managed and Measurable): The procedures are formalised and developed to be well organised and practical, with defined test environments and accreditation procedures. In practice, all major changes to systems follow this formalised approach. Evaluation of whether user requirements are met is standardised and measurable, producing metrics that management can effectively review and analyse. The quality of systems entering production is satisfactory to management, with reasonable levels of post-implementation problems. Automation of the process is ad hoc and project dependent. Neither post-implementation evaluations nor continuous quality reviews are employed consistently, although management may be satisfied with the current level of efficiency. The test system adequately reflects the live environment. Stress testing for new systems and regression testing for existing systems are applied for major projects.

5 (Optimised): The installation and accreditation processes have been refined to a level of best practice, based on the results of continuous improvement and refinement. IT installation and accreditation processes are fully integrated into the system life cycle and automated when advisable, facilitating the most efficient training, testing and transition to production status for new systems. Well-developed test environments, problem registers and fault-resolution processes ensure an efficient and effective transition to the production environment. Accreditation usually takes place with limited rework, and post-implementation problems are normally limited to minor corrections. Post-implementation reviews are standardised, with lessons learned channelled back into the process to ensure continuous quality improvement. Stress testing for new systems and regression testing for amended systems are applied consistently.
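The six levels above form an ordered scale: a process sits at the highest level whose characteristics it fully exhibits. As a minimal sketch of how such an assessment could be automated, the following Python fragment encodes the scale and a hypothetical set of marker criteria distilled from the level descriptions; the criterion names and the `assess` helper are illustrative assumptions, not part of any formal maturity framework.

```python
from enum import IntEnum


class MaturityLevel(IntEnum):
    """The six maturity levels described above, in ascending order."""
    NON_EXISTENT = 0
    INITIAL_AD_HOC = 1
    REPEATABLE_BUT_INTUITIVE = 2
    DEFINED_PROCESS = 3
    MANAGED_AND_MEASURABLE = 4
    OPTIMISED = 5


# Hypothetical marker criteria for each level, distilled from the prose
# descriptions; a real assessment would rest on far richer evidence.
LEVEL_CRITERIA = [
    (MaturityLevel.INITIAL_AD_HOC, {"ad_hoc_testing"}),
    (MaturityLevel.REPEATABLE_BUT_INTUITIVE, {"consistent_testing_approach"}),
    (MaturityLevel.DEFINED_PROCESS, {"formal_methodology"}),
    (MaturityLevel.MANAGED_AND_MEASURABLE,
     {"measurable_metrics", "defined_test_environments"}),
    (MaturityLevel.OPTIMISED,
     {"continuous_improvement", "consistent_regression_testing"}),
]


def assess(evidence: set) -> MaturityLevel:
    """Walk up the scale, stopping at the first level whose criteria
    are not all present in the observed evidence."""
    current = MaturityLevel.NON_EXISTENT
    for level, criteria in LEVEL_CRITERIA:
        if not criteria <= evidence:  # subset test: all criteria observed?
            break
        current = level
    return current
```

For example, an organisation that tests ad hoc, has a consistent approach, and follows a formal methodology, but produces no metrics, would assess at level 3 (Defined Process); the ordered walk mirrors the cumulative nature of the scale, where each level presupposes the ones below it.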

