Service Transition

4. Service Transition Processes

4.6 Evaluation

Evaluation is a generic process that considers whether the performance of something is acceptable, whether it offers value for money, and so on, and hence whether it will be proceeded with, accepted into use, paid for, etc.

4.6.1 Purpose, Goal And Objective
The purpose of evaluation is to provide a consistent and standardized means of determining the performance of a service change in the context of existing and proposed services and IT infrastructure. The actual performance of a change is assessed against its predicted performance and any deviations between the two are understood and managed.

The goal of evaluation is to set stakeholder expectations correctly and provide effective and accurate information to Change Management to make sure changes that adversely affect service capability and introduce risk are not transitioned unchecked.

The objective is to:

4.6.2 Scope
Specifically, this section considers the evaluation of new or changed services defined by Service Design, during deployment and before final transition to service operations. Evaluating the actual performance of any service change against its anticipated performance is an important source of information for service providers: it helps ensure that the expectations set are realistic, and it identifies any reasons why production performance does not meet what was expected.

4.6.3 Value To Business
Evaluation is, by its very nature, concerned with value. Specifically, effective evaluation establishes the use made of resources in terms of delivered benefit, and this information allows a more accurate focus on value in future service development and Change Management. Continual Service Improvement can take a great deal of intelligence from evaluation to analyse future improvements to the process of change and to the prediction and measurement of service change performance.

4.6.4 Policies, Principles And Basic Concepts
Policies
The following policies apply to the evaluation process:

Principles
The following principles shall guide the execution of the evaluation process:

Basic Concepts
The evaluation process uses the Plan-Do-Check-Act (PDCA) model to ensure consistency across all evaluations.
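By way of illustration only, the PDCA structure of a single evaluation can be sketched as a simple control loop. Everything in this sketch (the function names, the return values, the RFC identifier) is a hypothetical reading of the model, not an ITIL-defined interface:

```python
# A minimal sketch of the PDCA cycle applied to one evaluation.
# All names here are illustrative assumptions, not ITIL-defined APIs.

def plan(change):
    """Plan: produce an evaluation plan from the RFC and Service Design package."""
    return {"change": change, "perspectives": ["intended", "unintended"]}

def do(evaluation_plan):
    """Do: execute the evaluation against predicted and actual performance."""
    return {"plan": evaluation_plan, "findings": []}

def check(results):
    """Check: compare findings against Acceptance Criteria and risk thresholds."""
    return len(results["findings"]) == 0

def act(results, acceptable):
    """Act: report to Change Management and feed lessons into the next cycle."""
    return "recommend acceptance" if acceptable else "recommend rejection"

def evaluate_change(change):
    results = do(plan(change))
    return act(results, check(results))

print(evaluate_change("RFC-0001"))  # hypothetical RFC identifier
```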

4.6.5 Process Activities, Methods And Techniques
4.6.5.1 Service Evaluation Terms
The key terms shown in Table 4.13 apply to the service evaluation process.

Term: Function/Means

Service change: A change to an existing service or the introduction of a new service; the service change arrives into service evaluation and qualification in the form of a Request for Change (RFC) from Change Management
Service Design package: Defines the service and provides a plan of service changes for the next period (e.g. the next 12 months). Of particular interest to service evaluation are the Acceptance Criteria and the predicted performance of a service with respect to a service change
Performance: The utilities and warranties of a service
Performance model: A representation of the performance of a service
Predicted performance: The expected performance of a service following a service change
Actual performance: The performance achieved following a service change
Deviations report: A report of the differences between predicted and actual performance
Risk: A function of the likelihood and negative impact of a service not performing as expected
Countermeasures: The mitigation that is implemented to reduce risk
Test plan and results: The test plan is a response to an impact assessment of the proposed service change. Typically the plan specifies how the change will be tested; what records will result from testing and where they will be stored; who will approve the change; and how it will be ensured that the change and the service(s) it affects remain stable over time. The test plan may include a qualification plan and a validation plan if the change affects a regulated environment. The results represent the actual performance following implementation of the change
Residual risk: The risk remaining after countermeasures have been deployed
Service capability: The ability of a service to perform as required
Capacity: An organization's ability to maintain service capability under any predefined circumstances
Constraint: A limit on an organization's capacity
Resource: The normal requirements of an organization to maintain service capability
Evaluation plan: The outcome of the evaluation planning exercise
Evaluation report: A report generated by the evaluation function, which is passed to Change Management and which comprises:
  • A risk profile
  • A deviations report
  • A recommendation
  • A qualification statement.
Table 4.13 Key terms that apply to the service evaluation process
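To make the relationship between these terms concrete, the evaluation report of Table 4.13 could be represented as a simple record. This is a minimal sketch in Python; the field names follow the table, but the types and example values are assumptions:

```python
from dataclasses import dataclass

@dataclass
class EvaluationReport:
    """Sketch of the evaluation report passed to Change Management (Table 4.13).

    Field types and example values are illustrative assumptions; ITIL does
    not prescribe a format.
    """
    risk_profile: dict                 # residual risks after countermeasures
    deviations_report: dict            # predicted vs actual performance, per factor
    recommendation: str                # e.g. "accept" or "reject"
    qualification_statement: str = ""  # only required in regulated environments

report = EvaluationReport(
    risk_profile={"data loss": "low"},
    deviations_report={"M - Modelling and measurement": "within tolerance"},
    recommendation="accept",
)
print(report.recommendation)
```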

Figure 4.34 Evaluation process

4.6.5.2 Evaluation Process
Figure 4.34 shows the evaluation process with inputs and outputs.

4.6.5.3 Evaluation Plan
Evaluation of a change should be carried out from a number of different perspectives to ensure any unintended effects of a change are understood as well as the intended effects.

Generally speaking, we would expect the intended effects of a change to be beneficial, and they should match the Acceptance Criteria. The unintended effects are harder to predict: they are often not seen until the pilot stage or even once in production, they are difficult to measure, and they are frequently ignored. They will not always be beneficial to the business, for example in terms of impact on other services, impact on customers and users of the service, or network overloading.

4.6.5.4 Understanding The Intended Effect Of A Change
The details of the service change, customer requirements and Service Design package should be carefully analyzed to understand fully the purpose of the change and the expected benefit from implementing it. Examples might include: reduce cost of running the service; increase service performance; reduce resources required to operate the service; or improve service capability.

The change documentation should make clear what the intended effect of the change will be, together with the specific measures that will be used to determine its effectiveness. If these are in any way unclear or ambiguous, the evaluation should cease and a recommendation not to proceed should be forwarded to Change Management. Note that even deliberately designed changes may be detrimental to some elements of the service: for example, the introduction of SOX-compliant procedures, which, while delivering the benefit of legal compliance, introduces extra work steps and costs.

4.6.5.5 Understanding The Unintended Effect Of A Change
In addition to the expected effects on the service and broader organization there are likely to be additional effects which were not expected or planned for. These effects must also be surfaced and considered if the full impact of a service change is to be understood. One of the most effective ways of identifying such effects is by discussion with all stakeholders. Not just customers, but also users of the service, those who maintain it, those who fund it etc. Care should be taken in presenting the details of the change to ensure stakeholders fully understand the implications and can therefore provide accurate feedback.

4.6.5.6 Factors For Considering The Effect Of A Service Change
Table 4.14 shows the factors to be included when considering the effect of a service change.

Factor: Evaluation of Service Design

S - Service provider capability: The ability of a service provider or service unit to perform as required.
T - Tolerance: The ability or capacity of a service to absorb the service change or release.
O - Organizational setting: The ability of an organization to accept the proposed change. For example, is appropriate access available for the implementation team? Have all existing services that would be affected by the change been updated to ensure a smooth transition?
R - Resources: The availability of appropriately skilled and knowledgeable people, sufficient finances, infrastructure, applications and other resources necessary to run the service following transition.
M - Modelling and measurement: The extent to which the predictions of behaviour generated from the model match the actual behaviour of the new or changed service.
P - People: The people within a system and the effect of the change on them.
U - Use: Will the service be fit for use? The ability to deliver the warranties: e.g. is it continuously available, is there enough capacity, will it be secure enough?
P - Purpose: Will the new or changed service be fit for purpose? Can the required performance be supported? Will the constraints be removed as planned?
Table 4.14 Factors for considering the effects of a service change

4.6.5.7 Evaluation Of Predicted Performance
Using customer requirements (including Acceptance Criteria), the predicted performance and the performance model, a risk assessment is carried out. If the risk assessment suggests that predicted performance may create unacceptable risks from the change or not meet the Acceptance Criteria, an interim evaluation report is sent to alert Change Management.

The interim evaluation report includes the outcome of the risk assessment and/or the outcome of the predicted performance versus Acceptance Criteria, together with a recommendation to reject the service change in its current form.

Evaluation activities cease at this point pending a decision from Change Management.

4.6.5.8 Evaluation Of Actual Performance
Once the service change has been implemented a report on actual performance is received from operations. Using customer requirements (including Acceptance Criteria), the actual performance and the performance model, a risk assessment is carried out. Again if the risk assessment suggests that actual performance is creating unacceptable risks, an interim evaluation report is sent to Change Management.

The interim evaluation report includes the outcome of the risk assessment and/or the outcome of the actual performance versus Acceptance Criteria, together with a recommendation to remediate the service change. Evaluation activities cease at this point pending a decision from Change Management.
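The decision logic of sections 4.6.5.7 and 4.6.5.8 is essentially one gate applied twice, first to predicted and then to actual performance. A minimal sketch of that gate, assuming simple checks against named Acceptance Criteria (the function, parameters and example values are hypothetical):

```python
def evaluate_performance(performance, acceptance_criteria, risk_is_acceptable):
    """Gate used for both predicted (4.6.5.7) and actual (4.6.5.8) performance.

    Returns None if evaluation can continue, or an interim evaluation report
    recommending that Change Management reject or remediate the change.
    """
    meets_criteria = all(
        performance.get(criterion) == expected
        for criterion, expected in acceptance_criteria.items()
    )
    if meets_criteria and risk_is_acceptable:
        return None  # no interim report; evaluation continues
    return {
        "risk_assessment": "acceptable" if risk_is_acceptable else "unacceptable",
        "meets_acceptance_criteria": meets_criteria,
        "recommendation": "reject or remediate the service change in its current form",
    }

# Hypothetical predicted-performance check against two Acceptance Criteria.
interim = evaluate_performance(
    performance={"availability": "99.9%", "throughput": "500 tps"},
    acceptance_criteria={"availability": "99.9%", "throughput": "600 tps"},
    risk_is_acceptable=True,
)
print(interim)  # the throughput deviation triggers an interim report
```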

4.6.5.9 Risk Management
There are two steps in risk management: risk assessment and mitigation. Risk assessment is concerned with analyzing threats and weaknesses that have been or would be introduced as a result of a service change.

A risk occurs when a threat can exploit a weakness. The likelihood of a threat exploiting a weakness, and the impact if it does, are the fundamental factors in determining risk.

The risk management formula is simple but very powerful:

Risk = Likelihood x Impact

Obviously, the introduction of new threats and weaknesses increases the likelihood of a threat exploiting a weakness. Placing greater dependence on a service or component increases the impact if an existing threat exploits an existing weakness within the service. These are just a couple of examples of how risk may increase as a result of a service change.

It is a clear requirement that evaluation of a proposed service change assesses both the existing risks within a service and the predicted risks following implementation of the change.

If the risk level has increased then the second stage of risk management is used to mitigate the risk. In the examples given above mitigation may include steps to eliminate a threat or weakness and using disaster recovery and backup techniques to increase the resilience of a service on which the organization has become more dependent.

Following mitigation the risk level is re-assessed and compared with the original. This second assessment and any subsequent assessments are in effect determining residual risk - the risk that remains after mitigation. Assessment of residual risk and associated mitigation continues to cycle until risk is managed down to an acceptable level.

The guiding principle here is that the initially assessed risk, or any residual risk level after mitigation, must be equal to or less than the original risk prior to the service change. If this is not the case then evaluation will recommend rejection of the proposed service change, or backing out of an implemented service change.
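As a worked illustration of Risk = Likelihood x Impact and the mitigation cycle described above, the following sketch re-assesses risk after each countermeasure until residual risk no longer exceeds the pre-change level. The 1-5 scales, scores and countermeasures are assumptions for illustration:

```python
def risk(likelihood, impact):
    """Risk = Likelihood x Impact, on assumed 1-5 ordinal scales."""
    return likelihood * impact

# Pre-change baseline: likelihood 2, impact 3, so original risk = 6.
original_risk = risk(2, 3)

# Post-change assessment: the change raised both likelihood and impact.
likelihood, impact = 4, 4

# Hypothetical countermeasures, each reducing likelihood or impact one step.
countermeasures = [
    ("eliminate weakness in new interface", "likelihood"),
    ("add backup/recovery for the dependent service", "impact"),
    ("restrict access to the changed component", "likelihood"),
]

for name, target in countermeasures:
    if risk(likelihood, impact) <= original_risk:
        break  # residual risk managed down to an acceptable level
    if target == "likelihood":
        likelihood = max(1, likelihood - 1)
    else:
        impact = max(1, impact - 1)
    print(f"applied: {name}; residual risk = {risk(likelihood, impact)}")

acceptable = risk(likelihood, impact) <= original_risk
print("recommend acceptance" if acceptable else "recommend rejection/back-out")
```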

The risk representation recommended here takes a fundamentally different approach. Building on the work of Drake (2005a, 2005b), it recognizes that risks almost always grow exponentially over time if left unmanaged, and that a risk that will not cause a loss is probably not worth worrying about too much.

It is therefore proposed that a stronger risk representation is as shown in Figure 4.35. Principally, this representation is intended to promote debate and agreement among stakeholders: is the risk positioned correctly in terms of time and potential or actual loss? Could mitigation have been deployed later (e.g. more economically)? Should it have been deployed earlier (e.g. for better protection)?

Deviations - Predicted Vs Actual Performance
Once the service change passes the evaluation of predicted performance and actual performance, essentially as standalone evaluations, a comparison of the two is carried out. To have reached this point it will have been determined that predicted performance and actual performance are acceptable, and that there are no unacceptable risks. The output of this activity is a deviations report. For each factor in Table 4.14 the report states the extent of any deviation between predicted and actual performance.
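A deviations report of this kind amounts to a per-factor comparison. A minimal sketch, assuming each Table 4.14 factor can be scored numerically (the scores and the 0-100 scale are hypothetical):

```python
# Hypothetical predicted and actual scores per Table 4.14 factor (0-100 scale).
predicted = {"S": 80, "T": 70, "O": 75, "R": 90, "M": 85, "P (people)": 70,
             "U": 95, "P (purpose)": 90}
actual    = {"S": 78, "T": 72, "O": 60, "R": 88, "M": 85, "P (people)": 65,
             "U": 94, "P (purpose)": 89}

def deviations_report(predicted, actual):
    """For each Table 4.14 factor, state the deviation of actual from predicted."""
    return {factor: actual[factor] - predicted[factor] for factor in predicted}

for factor, deviation in deviations_report(predicted, actual).items():
    print(f"{factor}: {deviation:+d}")
```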

Figure 4.35 Example risk profile

Test Plan And Results
The testing function provides the means for determining the actual performance of the service following implementation of a service change. Testing provides the service evaluation function with the test plan and a report on the results of any testing; the actual results are also made available to service evaluation. These are evaluated and used as described in section 4.6.5.8.

In some circumstances it is necessary to provide a statement of qualification and/or validation status following a change. This takes place in regulated environments such as pharmaceuticals and defence. The context for these activities is shown in Figure 4.36.

Figure 4.36 Context for qualification and validation activities

The inputs to these activities are the qualification plan and results and/or validation plan and results. The evaluation process ensures that the results meet the requirements of the plans. A qualification and/or validation statement is provided as output.
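A minimal sketch of that check, treating each plan as a list of required results (the record structure and example entries are assumptions, not a prescribed format):

```python
def qualification_statement(plan, results):
    """Compare qualification/validation results against the plan's requirements.

    Returns a statement suitable for inclusion in the evaluation report.
    """
    unmet = [requirement for requirement in plan if requirement not in results]
    if not unmet:
        return "qualified: all planned requirements were met"
    return f"not qualified: unmet requirements {unmet}"

# Hypothetical qualification plan and results for a regulated environment.
plan = ["installation qualification", "operational qualification"]
results = ["installation qualification"]
print(qualification_statement(plan, results))
```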

4.6.6 Evaluation Report
The evaluation report contains the following sections.

Risk Profile
A representation of the residual risk left after a change has been implemented and after countermeasures have been applied.

Deviations Report
The difference between predicted and actual performance following the implementation of a change.

A Qualification Statement (if appropriate)
Following review of the qualification test results and the qualification plan, a statement of whether or not the change has left the service in a state whereby it could not be qualified.

A Validation Statement (if appropriate)
Following review of the validation test results and the validation plan, a statement of whether or not the change has left the service in a state whereby it could not be validated.

A Recommendation
Based on the other factors within the evaluation report, a recommendation to Change Management to accept or reject the change.

4.6.7 Triggers, Inputs, Outputs and Inter-process Interfaces
Triggers

Inputs:

Outputs:

4.6.8 Information Management

4.6.9 Key Performance Indicators And Metrics
The customer/business KPIs are:

The internal KPIs include:

4.6.9.1 Challenges
Challenges include:
