Activity: Evaluation and Launch

Objective: The Evaluation and Launch activity encompasses transferring production processing from the existing system to the new system, and cutting over system operations and support from the project team to client operations personnel. The initial focus is on validating that all aspects of the system, including the hardware, applications, data and organization, are ready for rollout. This includes establishing a contingency plan for possible rollout complications in any of these areas.

Major Deliverables
Tasks
Task: Conduct Final User Acceptance Test (UAT)

Objective: The user population should run through a complete test cycle to verify that all programs and procedures are in place for normal operations. The results should be documented in writing to validate that start-up operations are ready to begin. This should only be done as a final user acceptance test before going live; users should be very familiar with the system by this time and would already have voiced any concerns about functionality gaps.
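As a purely illustrative sketch of what "documented in writing" can look like in practice, the final cycle's outcomes might be captured in a results file from which the go/no-go decision is read. The test case names, testers and file naming below are hypothetical; MIKE2.0 does not prescribe a format.

```python
# Hypothetical sketch: record final UAT outcomes and derive a go/no-go flag.
import csv
from datetime import date

# Illustrative results only; real cases come from the project's UAT suite.
uat_results = [
    {"case": "UAT-001 login and navigation", "result": "pass", "tester": "jsmith"},
    {"case": "UAT-014 month-end report totals", "result": "pass", "tester": "mlee"},
]

with open(f"uat_signoff_{date.today()}.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=["case", "result", "tester"])
    writer.writeheader()
    writer.writerows(uat_results)

# Start-up operations are ready only if every case passed.
print("ready for go-live:", all(r["result"] == "pass" for r in uat_results))
```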
Task: Prepare Staff for Cut-Over

Objective: This task should create a forum to generate excitement, set final expectations and set the scene for feedback from the users. Unlike operational systems, some of the major functions of information management systems (especially analytical systems) can be changed far more rapidly, so user feedback is vital to ensure opportunities are considered. The specific sub-tasks include:
Task: Load Historical Data

Objective: In this step, historical data is loaded into the target production environment. This data may come directly from a testing environment and be loaded via dump extracts, as opposed to going through the integration layer. In many cases, the data has gone through a re-engineering process. Especially in the case of Data Warehouses, this task may run for multiple days, as it often involves very large volumes of data.
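A minimal sketch of such a load, assuming a CSV dump extract and a DB-API target (sqlite3 stands in here for the production warehouse; the file, table and column names are illustrative). Committing in batches provides restart points, which matters when a load runs for days.

```python
# Chunked historical load from a dump extract; sqlite3 is a stand-in target.
import csv
import sqlite3

BATCH_SIZE = 50_000  # commit in batches so a long-running load can be resumed

def load_historical(extract_path, conn):
    cur = conn.cursor()
    cur.execute(
        "CREATE TABLE IF NOT EXISTS sales_history "
        "(sale_id TEXT PRIMARY KEY, sale_date TEXT, amount REAL)"
    )
    sql = "INSERT OR REPLACE INTO sales_history VALUES (?, ?, ?)"  # idempotent
    loaded, batch = 0, []
    with open(extract_path, newline="") as fh:
        for row in csv.DictReader(fh):
            batch.append((row["sale_id"], row["sale_date"], row["amount"]))
            if len(batch) >= BATCH_SIZE:
                cur.executemany(sql, batch)
                conn.commit()  # checkpoint: safe restart point after a failure
                loaded, batch = loaded + len(batch), []
    if batch:
        cur.executemany(sql, batch)
        conn.commit()
        loaded += len(batch)
    return loaded

if __name__ == "__main__":
    conn = sqlite3.connect("warehouse.db")
    print("rows loaded:", load_historical("sales_extract.csv", conn))
```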
Task: Activate Automation Components and Load Incremental Data

Objective: Once historical data is loaded into the system and an initial PVT has been run to identify any issues, the incremental loads of data can begin. This means that all components are activated and running in an automated fashion, driven either by events (e.g. an update of information) or on a temporal basis.
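The sketch below shows one way such a driver might combine the two triggers: polling a landing directory for new extract files (event-driven) while also forcing a run on a fixed schedule (temporal). The directory name, interval and load stub are assumptions for illustration.

```python
# Illustrative driver: incremental loads fired by events or on a schedule.
import time
from pathlib import Path

LANDING_DIR = Path("landing")     # source systems drop extract files here
POLL_SECONDS = 60                 # event-detection granularity
SCHEDULE_SECONDS = 6 * 60 * 60    # temporal trigger: every six hours

def load_increment(files):
    """Stand-in for the real integration-layer load."""
    for f in files:
        print(f"loading increment from {f.name}")
        f.rename(f.with_suffix(".done"))  # mark as processed

def main():
    LANDING_DIR.mkdir(exist_ok=True)
    last_run = time.monotonic()
    while True:
        new_files = sorted(LANDING_DIR.glob("*.csv"))
        overdue = time.monotonic() - last_run >= SCHEDULE_SECONDS
        if new_files or overdue:  # event-driven or temporal trigger
            load_increment(new_files)
            last_run = time.monotonic()
        time.sleep(POLL_SECONDS)

if __name__ == "__main__":
    main()
```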
Task: Conduct Production Verification Testing (PVT)

Objective: The purpose of PVT is to test that, after performing deployment activities, the system operates in the expected manner from an end-user and operator perspective. PVT focuses on handing a reliable and stable production environment over to the business. In some cases, PVT may be broken into multiple steps, including testing after the baseline implementation of hardware and software, testing after the historical data load, and testing after automation components have been turned on. The deployment team is responsible for PVT. PVT test cases typically leverage a subset of existing UAT test cases and may be considered a "final UAT". Entry criteria for PVT typically include:
Exit criteria for PVT typically include:
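However a project defines those criteria, the checks themselves are often automated so they can be rerun after each deployment step. Below is a minimal sketch, reusing the hypothetical sales_history table from the historical-load sketch above; a real PVT suite would derive its cases from the project's actual UAT test cases.

```python
# Minimal PVT smoke checks against the hypothetical warehouse.db target.
import sqlite3

def check_connectivity(conn):
    conn.execute("SELECT 1")  # any error here means the environment is down

def check_history_loaded(conn):
    (n,) = conn.execute("SELECT COUNT(*) FROM sales_history").fetchone()
    assert n > 0, "historical load appears empty"

def check_dates_present(conn):
    (latest,) = conn.execute("SELECT MAX(sale_date) FROM sales_history").fetchone()
    assert latest is not None, "no sale_date values found"

if __name__ == "__main__":
    conn = sqlite3.connect("warehouse.db")
    failures = 0
    for check in (check_connectivity, check_history_loaded, check_dates_present):
        try:
            check(conn)
            print("PASS", check.__name__)
        except (AssertionError, sqlite3.Error) as exc:
            failures += 1
            print("FAIL", check.__name__, "-", exc)
    raise SystemExit(failures)  # non-zero exit lets tooling gate the cut-over
```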
Task: Launch New System

Objective: Launching the new system includes initiating production operations and establishing support and tracking processes for resolving issues with the system, data and procedures. Communications and change management are key aspects of the system launch. Once system processes are stabilized, operational functions are transitioned to the support team. System operations are monitored to identify improvement areas, which include:
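As one hedged illustration of such monitoring, each production job can write a run record that the support team watches for failures and creeping run times. The job_log table and the context-manager convention below are assumptions for this sketch, not part of MIKE2.0.

```python
# Sketch: wrap each production job so it logs duration and outcome.
import sqlite3
import time
from contextlib import contextmanager

@contextmanager
def tracked_job(conn, job_name):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS job_log "
        "(job TEXT, started REAL, seconds REAL, status TEXT)"
    )
    start = time.time()
    status = "OK"
    try:
        yield
    except Exception:
        status = "FAILED"
        raise
    finally:
        conn.execute(
            "INSERT INTO job_log VALUES (?, ?, ?, ?)",
            (job_name, start, time.time() - start, status),
        )
        conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect("warehouse.db")
    with tracked_job(conn, "daily_incremental"):
        time.sleep(0.1)  # stand-in for the real load logic
```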
Task: Support the Transition

Objective: Although the operational system has been rigorously tested and accepted by users, there is no substitute for the "live" working environment. It is important that the delivery team does not disperse immediately after deployment. Assessing, monitoring and tuning the new solution will not only help ensure user acceptance, but will also give vital insight that can be used for subsequent increments. The sub-tasks include:
Additionally, experience with live systems is vital to delivering leading solutions for our clients. The support process therefore provides invaluable insight which can be fed back into the MIKE2.0 methodology in the Lessons Learned section. The Continuous Improvement Activities specify detailed tasks on how the solution can be improved during the support phase.
Task: Assess Compliance with Project Objectives

Objective: Once the new system has been implemented and stabilized, the results are evaluated against the initially defined business objectives.
Task: Perform Post-Implementation Systems Performance Review

Objective: Once the new system has been in operation for a period of time, the processing results should be reviewed. Certain performance improvements may be identified and developed to enhance operations. The results of these activities should be documented and integrated into the ongoing iterative development process. This task is particularly important to the incremental development process, ensuring that the following increment is an improvement and is developed more efficiently. The sub-tasks include:
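Building on the hypothetical job_log table from the launch sketch above, the review might start with a simple pass over run-time statistics to surface tuning candidates for the next increment; the one-hour threshold is an assumed example.

```python
# Sketch: summarise the hypothetical job_log to flag slow or failing jobs.
import sqlite3

THRESHOLD_SECONDS = 3600  # assumed review threshold: anything averaging > 1h

conn = sqlite3.connect("warehouse.db")
rows = conn.execute(
    "SELECT job, COUNT(*), AVG(seconds), SUM(status = 'FAILED') "
    "FROM job_log GROUP BY job ORDER BY 3 DESC"
).fetchall()
for job, runs, avg_s, failed in rows:
    flag = "  <- tuning candidate" if avg_s > THRESHOLD_SECONDS else ""
    print(f"{job}: {runs} runs, avg {avg_s:.1f}s, {failed} failed{flag}")
```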
The Continuous Improvement Activities explicitly detail a number of these tasks.
Role: Deployment Manager
Role: Test Manager

Yellow Flags