ETL (Extract, Transform, Load) Component

One of the key foundation components of the SAFE architecture is the ETL integration environment. Within the ETL layer, the technology can be used to validate data, check business rules and mediate process flows between multiple systems. The tools that provide these capabilities are generally strong at performing complex set-oriented transformations and handling bulk data extraction.
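
As a concrete illustration, below is a minimal sketch of the kind of data validation and business-rule checking performed in the ETL layer. The rule set and record layout are illustrative assumptions, not part of the MIKE2.0 specification.

```python
# Minimal sketch: declarative business rules applied to records in the ETL
# layer. The rule names and record fields are illustrative assumptions.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    check: Callable[[dict], bool]  # returns True when the record passes

RULES = [
    Rule("customer_id present", lambda r: bool(r.get("customer_id"))),
    Rule("amount non-negative", lambda r: r.get("amount", 0) >= 0),
]

def validate(record: dict) -> list[str]:
    """Return the names of all rules that the record violates."""
    return [rule.name for rule in RULES if not rule.check(record)]

print(validate({"customer_id": "C-1001", "amount": -5.0}))
# ['amount non-negative']
```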

ETL Overview

At the highest level, the ETL process is generally segmented into three steps, illustrated in the sketch after this list:

  • Extracting the data from the source system
  • Transforming the data into the format required by the target model
  • Loading the data into the target system.
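
The sketch below walks a small dataset through all three steps. The CSV source file, column names and SQLite target are illustrative assumptions chosen only to keep the example self-contained; an ETL tool would abstract these behind its own connectors.

```python
# Minimal sketch of the three ETL steps; source and target are assumptions.
import csv
import sqlite3

def extract(path: str) -> list[dict]:
    """Extract: read raw rows from the source system (here, a CSV file)."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: reshape rows into the format required by the target model."""
    return [(r["customer_id"], float(r["amount"])) for r in rows]

def load(rows: list[tuple], db: str = "warehouse.db") -> None:
    """Load: write the transformed rows into the target system."""
    con = sqlite3.connect(db)
    con.execute("CREATE TABLE IF NOT EXISTS sales (customer_id TEXT, amount REAL)")
    con.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    con.commit()
    con.close()

# Chain the three steps end to end (assumes sales.csv exists).
load(transform(extract("sales.csv")))
```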

This ETL process is typically used for data integration; its analogous components for application integration are middleware and process automation. There is overlap and convergence between these areas: ETL tools can be used for application integration through the application tier, and application integration technologies can be used for data integration. At this time, vendors still tend to have multiple products in this space, and each product tends to perform best in its primary area of functionality.

ETL Implementation

The ETL process has traditionally been implemented through custom-developed SQL scripts. Some of the benefits of using a dedicated ETL tool instead include:

  • Simplifying the integration environment and improving reuse opportunities by storing all integration logic in a single engine environment.
  • Significantly improving audit and security capabilities by automatically tracking data remediation, auditing and interface activities (see the sketch after this list).
  • Providing highly flexible development functions that allow teams to react rapidly to data remediation and reporting requirements.
  • Improving the utilisation of existing team resources by simplifying review and QA processes through the toolset's user-friendly Graphical User Interface (GUI).
  • Reducing negative resource impacts on production systems by moving processor-intensive tasks onto a separate ETL server and by producing faster, more efficient system processes.
  • Making transformation rules available, through the tool's metadata, to popular data access tools.
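
To illustrate the audit point above, the following sketch approximates the activity tracking that an ETL tool performs automatically, here using a decorator that records row counts and timings for each step. The step, the log structure and the field names are illustrative assumptions.

```python
# Minimal sketch of automatic audit tracking around ETL steps.
import functools
import time

audit_log: list[dict] = []

def audited(step):
    """Wrap an ETL step so its activity is recorded in the audit log."""
    @functools.wraps(step)
    def wrapper(rows):
        start = time.time()
        result = step(rows)
        audit_log.append({
            "step": step.__name__,
            "rows_in": len(rows),
            "rows_out": len(result),
            "seconds": round(time.time() - start, 3),
        })
        return result
    return wrapper

@audited
def drop_negative_amounts(rows):
    """A remediation step whose activity is captured automatically."""
    return [r for r in rows if r["amount"] >= 0]

drop_negative_amounts([{"amount": 10.0}, {"amount": -2.0}])
print(audit_log)
# [{'step': 'drop_negative_amounts', 'rows_in': 2, 'rows_out': 1, ...}]
```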

Relationship to other components of the SAFE Architecture

Within the SAFE architecture framework, it is recommended that data profiling be performed as a direct input to the ETL process, that re-engineering jobs be operationalised in the ETL layer, that data management services be implemented as part of a Service-Oriented Architecture (SOA), and that the ETL tool use a shared metadata environment for storing business and technical metadata.
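
As a sketch of the first recommendation, profiling results feeding directly into the ETL process might look like the following. The profile metrics, the null-ratio threshold and the job-planning step are illustrative assumptions rather than part of the SAFE specification.

```python
# Minimal sketch: data profiling output used as a direct input to ETL planning.
def profile(rows: list[dict], column: str) -> dict:
    """Compute simple profile statistics for one source column."""
    values = [r.get(column) for r in rows]
    filled = [v for v in values if v not in (None, "")]
    return {
        "column": column,
        "null_ratio": 1 - len(filled) / len(values) if values else 0.0,
        "distinct": len(set(filled)),
    }

def plan_etl(profiles: list[dict], max_null_ratio: float = 0.05) -> list[str]:
    """Turn profiling results into the remediation jobs the ETL layer should run."""
    return [f"impute:{p['column']}" for p in profiles if p["null_ratio"] > max_null_ratio]

rows = [{"region": "EU"}, {"region": ""}, {"region": "US"}]
print(plan_etl([profile(rows, "region")]))
# ['impute:region']
```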
