
Stress and Volume Testing

From MIKE2.0 Methodology

Phase 5 - Incremental Development, Testing, Deployment and Improvement


Activity: Stress and Volume Testing

Objective

Stress and Volume Testing (SVT) verifies that system response times meet the performance targets defined in the non-functional requirements specification. SVT determines the ability of the system to process expected production volumes under defined production environment conditions, as well as under peak business conditions. Besides testing for failures, performance times are also measured to ensure that the system functions within the specified acceptable parameters.

As SVT occurs late in the cycle, changes at this stage are expected to be at the tuning level: significant performance issues should be discovered before SVT begins and can be covered by functional or SIT test cases if necessary.

A specialised vendor tool is generally very helpful for SVT: it can represent load scenarios, monitor them, and recommend changes for an optimised environment.
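The core SVT measurement described above can be sketched in a few lines. This is a minimal illustration, not a vendor tool: `run_transaction` is a hypothetical stand-in for the system under test, and the 95th-percentile limit is an assumed example of a non-functional requirement.

```python
import time

# Hypothetical system-under-test call; in a real SVT run this would
# invoke the application (e.g. over HTTP) rather than sleep.
def run_transaction():
    time.sleep(0.01)  # stand-in for real work

def measure_response_times(n_requests=100, p95_limit_s=0.5):
    """Execute n_requests transactions and compare the 95th-percentile
    response time against the assumed non-functional requirement."""
    durations = []
    for _ in range(n_requests):
        start = time.perf_counter()
        run_transaction()
        durations.append(time.perf_counter() - start)
    durations.sort()
    p95 = durations[int(0.95 * len(durations)) - 1]
    return p95, p95 <= p95_limit_s

p95, passed = measure_response_times()
print(f"p95={p95:.3f}s passed={passed}")
```

A real SVT run would drive production-like volumes from a load-generation tool and record the full distribution of response times, not just a single percentile.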

Major Deliverables

  • SVT Test Results
  • Performance tuning changes to hardware/software environment (if required)

Tasks

Migrate Software to SVT Testing Environment

Objective:

Move software from the configuration management environment to the SVT testing environment, ensuring that all changes have been made prior to propagating the software.


Input:

  • Availability of a suitable and stable testing environment (ideally that mirrors production)


Output:

  • SVT Test Environment ready

Test BI Application Development Work Products

Objective:

This task tests the performance of BI applications in accessing data, to ensure they meet performance targets and do not fail under high-volume scenarios.


Input:

  • SVT Test Environment ready
  • SVT BI Application Test Plans and Test Cases
  • Completed Test Case Execution for BI Application Development E2E Testing


Output:

  • Completed Test Case Execution
  • Completed Initial Test Case Execution for BI Application Development SVT Testing
  • Performance tuning revisions to applications (if required)

Test Integration Work Products

Objective:

This task tests the performance of integration functions to ensure they meet performance targets and do not fail under high-volume scenarios. For off-the-shelf integration components (e.g. ETL tools) this must specifically include scenarios that are relevant to the system architecture (e.g. multiple jobs running at once against a centralised ETL server). As many vendors now offer parallel processing capability, this must be tested both for its ability to meet performance requirements and for any functional issues in operation.
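The "multiple jobs against one server" scenario above can be sketched as a concurrency test. This is an illustrative assumption, not a vendor API: `etl_job` is a hypothetical placeholder for launching a real tool's job, and the batch window is an example value.

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical ETL job; a real test would launch actual tool jobs
# (e.g. via the vendor's CLI) against the shared ETL server.
def etl_job(job_id, rows=1000):
    total = 0
    for i in range(rows):
        total += i  # stand-in for transform work
    return job_id, total

def run_concurrent_jobs(n_jobs=8, window_s=5.0):
    """Run n_jobs in parallel, mirroring multiple jobs hitting a
    centralised ETL server, and check the batch fits its window."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=n_jobs) as pool:
        results = list(pool.map(lambda j: etl_job(j), range(n_jobs)))
    elapsed = time.perf_counter() - start
    failed = [job_id for job_id, total in results if total is None]
    return elapsed, failed, elapsed <= window_s and not failed

elapsed, failed, ok = run_concurrent_jobs()
```

The point of the test is contention: each job may pass in isolation yet breach the window, or fail functionally, when all jobs share the server at once.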

Input:

  • SVT Test Environment ready
  • SVT Integration Test Plans and Test Cases
  • Completed Test Case Execution for Integration Development E2E Testing


Output:

  • Completed Initial Test Case Execution for Integration Development SVT Testing
  • Performance tuning revisions to integration layer and services (if required)

Test Information Development Work Products

Objective:

This task tests the performance of information management functions to ensure they meet performance targets and do not fail under high-volume scenarios. This is particularly important for testing data re-engineering or data monitoring processes that have been operationalised as part of the production system.


Input:

  • SVT Test Environment ready
  • SVT Information Management Test Plans and Test Cases
  • Completed Test Case Execution for Information Development E2E Testing


Output:

  • Completed Initial Test Case Execution for Information Development SVT Testing
  • Performance tuning revisions to information management layer and services (if required)

Test Infrastructure Management Processes

Objective:

SVT of Infrastructure Management environments ensures that outage windows are sufficient during high-volume scenarios, and measures the expected time it will take to recover from a disaster scenario.


Input:

  • SVT Test Environment ready
  • SVT Infrastructure Management Test Plans and Test Cases
  • Completed Test Case Execution for Infrastructure Management E2E Testing


Output:

  • Completed Initial Test Case Execution for Infrastructure Management SVT Testing
  • Performance tuning revisions to the infrastructure environment (if required)

Defect Resolution and Re-Testing

Objective:

This task covers re-execution of SVT test cases that had issues in the initial round of SVT Testing. Due to dependencies between test cases, this may require regression testing of some test cases.


Input:

  • Completed Initial Test Case Execution for BI Application Development SVT Testing
  • Completed Initial Test Case Execution for Integration Development SVT Testing
  • Completed Initial Test Case Execution for Information Development SVT Testing
  • Completed Initial Test Case Execution for Infrastructure Management SVT Testing


Output:

  • Completed Test Case Execution for BI Application Development SVT Testing
  • Completed Test Case Execution for Integration Development SVT Testing
  • Completed Test Case Execution for Information Development SVT Testing
  • Completed Test Case Execution for Infrastructure Management SVT Testing

Exit criteria depend on project metrics, but typical SVT exit criteria may include:

  • 98% of performance test cases successfully completed
  • No Severity 1 or 2 defects outstanding, including performance issues
  • Agreed plans to resolve outstanding Severity 3 and 4 defects

These metrics should be agreed upon as part of the overall Test Strategy.
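The example exit criteria above reduce to a simple check. The sketch below assumes hypothetical defect records with a numeric severity field; the 98% pass rate and the Severity 1/2 block come directly from the criteria listed above.

```python
# Hypothetical defect records: each has an id and a numeric severity,
# where 1 is most severe. The thresholds mirror the example exit
# criteria: >= 98% pass rate and no open Severity 1 or 2 defects.
def svt_exit_ok(cases_passed, cases_total, open_defects,
                pass_rate_required=0.98):
    """Evaluate the example SVT exit criteria: pass rate met and no
    open Severity 1 or 2 defects (including performance defects)."""
    pass_rate = cases_passed / cases_total
    blocking = [d for d in open_defects if d["severity"] <= 2]
    return pass_rate >= pass_rate_required and not blocking

# Severity 3/4 defects may remain open if resolution plans are agreed.
defects = [{"id": "D-101", "severity": 3}, {"id": "D-102", "severity": 4}]
print(svt_exit_ok(99, 100, defects))  # prints True
```

Note that the check does not cover the third criterion (agreed plans for Severity 3/4 defects), which is a governance judgement rather than a computable metric.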

Core Supporting Assets

Yellow Flags

  • No representative platforms are available for testing load scenarios
  • Test cases do not cover appropriate exception scenarios
  • Testing tools to help simulate SVT scenarios are unavailable

Key Resource Requirements
