Evaluation and Launch

Phase 5 - Incremental Development, Testing, Deployment and Improvement

Activity: Evaluation and Launch

Objective. The Evaluation and Launch activity covers transferring production processing from the existing system to the new system and cutting over system operations and support from the project team to client operations personnel. The initial focus is on validating that all aspects of the system (hardware, applications, data and organisation) are ready for rollout. This includes establishing a contingency plan for possible rollout complications in any of these areas.

Major Deliverables
  • Operational System
Tasks

Task: Conduct Final User Acceptance Test (UAT)

Objective: The user population should run through a complete test cycle to confirm that all programs and procedures are in place for normal operations. The results should be documented in writing to validate that start-up operations are ready to begin. This should be run only as a final user acceptance test before going live; users should be very familiar with the system by this time and should already have voiced any concerns about functionality gaps.
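
As an illustration only, the documented results could be tallied mechanically before sign-off. The sketch below assumes a hypothetical uat_results.csv log with scenario, result and severity columns; none of these names come from MIKE2.0.

    import csv
    from collections import Counter

    # Hypothetical UAT log: one row per test scenario, with columns
    # "scenario", "result" (pass/fail) and "severity" (1-4, blank if passed).
    with open("uat_results.csv", newline="") as f:
        results = list(csv.DictReader(f))

    tally = Counter(r["result"] for r in results)
    blocking = [r for r in results
                if r["result"] == "fail" and r["severity"] in ("1", "2")]

    print(f"Scenarios executed: {len(results)}, passed: {tally['pass']}")
    # Sign-off requires every scenario run and no Severity 1/2 failures.
    if blocking:
        print("Not ready for cut-over; blocking defects:")
        for r in blocking:
            print(f"  - {r['scenario']} (severity {r['severity']})")
    else:
        print("Acceptance Test Results documented; ready for cut-over.")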


Input:

  • Trained Users
  • Trained Operations Staff
  • Deployed Production System


Output:

  • Acceptance Test Results

Task: Prepare Staff for Cut-Over

Objective: This task should create a forum to build excitement, set final expectations and set the scene for feedback from the users. Unlike operational systems, some of the major functions of information management systems (especially analytical systems) can be changed far more rapidly, so user feedback is vital to ensure improvement opportunities are considered. The specific sub-tasks include:

  • Brief User Staff
  • Brief Operations Staff
  • Brief Project Team
  • Finalise Conversion and Cut-over Plans


Input:

  • Trained Users
  • Trained Operations Staff
  • Acceptance Test Results


Output:

  • Staff Ready for Launch

Task: Load Historical Data

Objective: In this step, historical data is loaded into the target production environment. It may come directly from a testing environment and be loaded via dump extracts rather than through the integration layer. In many cases the data will have gone through a re-engineering process. For Data Warehouses in particular, this task may run for multiple days, as it often involves very large volumes of data.
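
As a minimal sketch (not a MIKE2.0 prescription), a chunked load of a dump extract might look like the following. The sqlite3 connection, table layout, file name and batch size are all illustrative assumptions; a real warehouse load would normally use the platform's native bulk loader.

    import csv
    import sqlite3  # illustrative stand-in for the target warehouse connection

    BATCH_SIZE = 50_000  # very large historical loads are committed in batches

    conn = sqlite3.connect("warehouse.db")
    conn.execute("CREATE TABLE IF NOT EXISTS sales_history (sale_date TEXT, amount REAL)")

    loaded = 0
    batch = []
    with open("historical_extract.csv", newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row of the dump extract
        for row in reader:  # assumed layout: sale_date, amount
            batch.append(row)
            if len(batch) == BATCH_SIZE:
                conn.executemany("INSERT INTO sales_history VALUES (?, ?)", batch)
                conn.commit()  # committing per batch lets a failed run resume
                loaded += len(batch)
                batch.clear()
    if batch:
        conn.executemany("INSERT INTO sales_history VALUES (?, ?)", batch)
        conn.commit()
        loaded += len(batch)

    print(f"Historical rows loaded: {loaded}")  # reconcile against source counts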


Input:

  • Deployed Production System


Output:

  • All Historical Data Loaded into the target production environment
  • Initial Production Verification Test (PVT) of the environment

Task: Activate Automation Components and Load Incremental Data

Objective: Once historical data is loaded into the system and an initial PVT has been run to identify any issues, the incremental loads of data can begin. This means that all components are activated and running in an automated fashion, driven either by events (e.g. an update of information) or on a temporal basis.
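
For illustration, a purely temporal trigger can be sketched with the Python standard library alone. run_incremental_load is a hypothetical stand-in for the real integration job; a production deployment would normally rely on an enterprise scheduler or event framework instead.

    import sched
    import time

    INTERVAL = 24 * 60 * 60  # temporal trigger: once per day

    def run_incremental_load():
        # Hypothetical stand-in for the real integration job: extract the
        # day's delta, load it, and log the result for the operations staff.
        print(f"{time.strftime('%H:%M:%S')} incremental load started")

    scheduler = sched.scheduler(time.time, time.sleep)

    def run_and_rearm():
        run_incremental_load()
        scheduler.enter(INTERVAL, 1, run_and_rearm)  # re-arm for the next cycle

    scheduler.enter(0, 1, run_and_rearm)
    scheduler.run()  # runs without human intervention until stopped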


Input:

  • Deployed Production System
  • Historical Data Loaded
  • Initial PVT complete (optional)


Output:

  • Incremental Loads Running
  • All automation components activated and running without human intervention

Task: Conduct Production Verification Testing (PVT)

Objective: The purpose of PVT is to test that, after the deployment activities have been performed, the system operates in the expected manner from an end-user and operator perspective. PVT focuses on handing over a reliable and stable production environment to the business.

In some cases, PVT may be broken into multiple steps, including testing after baseline implementation of hardware and software, testing after the historical data load and testing after automation components have been turned on. The deployment team is responsible for PVT.

PVT Test Cases typically leverage a subset of existing UAT Test Cases and may be considered a "final UAT". Entry criteria for PVT typically include:

  • Production approval for implementation received from the project team and stakeholders.
  • All previous test levels successfully completed as per their exit criteria.
  • No Severity 1 or 2 defects outstanding.
  • Agreed plans in place to resolve outstanding Severity 3 and 4 defects.

Exit criteria for PVT typically include:

  • No Severity 1 or 2 issues raised in PVT.
  • 100% of relevant requirements tested in the production environment.
  • 100% sign-off from related interfaces tested.
  • Any issues raised are logged in Production Support for resolution.
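
The exit criteria above lend themselves to a mechanical check. The following is a sketch under assumed inputs; the defect records and coverage counts are hypothetical, not MIKE2.0 artefacts.

    def pvt_exit_ok(defects, reqs_tested, reqs_total, ifaces_signed, ifaces_total):
        """Return True when the PVT exit criteria listed above are met."""
        # No Severity 1 or 2 issues raised in PVT.
        if any(d["severity"] in (1, 2) for d in defects):
            return False
        # 100% of relevant requirements tested in the production environment.
        if reqs_tested < reqs_total:
            return False
        # 100% sign-off from related interfaces.
        return ifaces_signed == ifaces_total

    # Hypothetical figures for illustration only; Severity 3/4 defects may
    # remain open provided agreed resolution plans exist.
    defects = [{"id": "D-17", "severity": 3}]
    print(pvt_exit_ok(defects, 42, 42, 5, 5))  # True: ready to exit PVT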


Input:

  • Deployed Production System
  • Historical Data Loaded
  • Incremental Loads Running


Output:

  • PVT Results

Task: Launch New System

Objective: Launching the new system includes initiating production operations and establishing support and tracking processes for resolving issues with the system, data and procedures. Communications and change management are key aspects of the system launch. Once system processes are stabilised, operational functions are transitioned to the support team. System operations are monitored to identify improvement areas (a sketch of a measurable baseline follows the list below), which include:

  • Functionality
  • Performance
  • Reliability
  • Accuracy
  • Usability
  • Controls
  • Operating costs
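
One way to make these areas actionable is to give each a measurable proxy and an agreed target. All metrics, targets and observed values below are illustrative assumptions, not MIKE2.0 prescriptions.

    # Illustrative launch-monitoring baseline: each improvement area gets a
    # measurable proxy and a target agreed with operations (values assumed).
    baseline = {
        "performance":    {"metric": "p95 query time (s)",  "target": 5.00, "observed": 6.20},
        "reliability":    {"metric": "load success rate",   "target": 0.99, "observed": 1.00},
        "accuracy":       {"metric": "reconciliation rate", "target": 1.00, "observed": 0.998},
        "operating cost": {"metric": "monthly cost index",  "target": 1.00, "observed": 0.90},
    }

    LOWER_IS_BETTER = {"performance", "operating cost"}

    for area, m in baseline.items():
        # Flag areas missing their target so the support team can prioritise.
        if area in LOWER_IS_BETTER:
            ok = m["observed"] <= m["target"]
        else:
            ok = m["observed"] >= m["target"]
        print(f"{area:15s} {m['metric']:20s} {'OK' if ok else 'IMPROVE'}")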


Input:

  • User Procedures Manual
  • Operations Procedures Manual
  • Desk Procedures Manual
  • Trained Users
  • Trained Operations Staff
  • Deployed Production System
  • Historical Data Loaded
  • Incremental Loads Running
  • All automation components activated and running without human intervention
  • Acceptance Test Results
  • Staff Ready for Launch


Output:

  • Live Production System

Task: Support the Transition

Objective: Although the operational system has been rigorously tested and accepted by users, there is no substitute for the "live" working environment. It is important that the delivery team does not disperse immediately after deployment. Assessing, monitoring and tuning the new solution will not only assist in ensuring user acceptance, but will also give vital insight that can be used for subsequent increments. The sub-tasks include:

  • Monitor and Fine-Tune New System
  • Evaluate System Performance
  • Review Organisational Impact
  • Develop Recommendations
  • Assist with Production Support

Additionally, experience with live systems is vital to delivering leading solutions for our clients. The support process therefore provides invaluable insight which can be fed back into the MIKE2.0 methodology in the Lessons Learned section.

Continuous Improvement Activities specify detailed tasks on how the solution can be improved during the support phase.


Input:

  • Live Production System


Output:

  • Working Operational System

Task: Assess Compliance with Project Objectives

Objective: Once the new system has been implemented and stabilised, the results are evaluated against the initially defined business objectives.


Input:

  • Business Case
  • Live Production System


Output:

  • Compliance Assessment

Task: Perform Post-Implementation Systems Performance Review

Objective: Once the new system has been in operation for a period of time, the processing results should be reviewed. Certain performance improvements may be identified and developed to enhance operations. The results of these activities should be documented and integrated into the ongoing iterative development process.

This task is particularly important to the incremental development process, ensuring that the following increment is an improvement and is developed more efficiently. The sub-tasks include:

  • Present Recommendations to Management
  • Finalise Recommendations
  • Present Results to Management
  • Update Phase Documents

The Continuous Improvement Activities explicitly detail a number of these tasks.


Input:

  • Live Production System
  • Compliance Assessment


Output:

  • Performance Review

Roles

  • Deployment Manager
  • Test Manager

Yellow Flags

  • Resources with key skills are no longer available to help with deployment
  • Deployment issues on past increments
  • Complex deployment requirements involving scenarios (e.g. small time windows, large data volumes) that have not been explicitly tested during Testing Activities
  • A significant number of inconsistent viewpoints in post-implementation reviews
  • The support phase does not cover an upcoming complex scenario (e.g. generation of a major quarterly report, major data changes)