Guidelines for Using the FSLDM

When leveraging a Financial Services Logical Data Model (FSLDM), such as the Teradata FSLDM or the IBM BDW (Banking Data Warehouse), the emphasis is less on traditional data modeling activity and more on mapping the requirements to the classification entity groupings within the FSLDM, validating those mappings, and identifying the gaps that remain. From this, the additions or extensions needed within the FSLDM classification entity groupings start to become visible.


Requirements Mapping and Gap Analysis to the FSLDM

[Figure: Conceptual Data Model for Financial Services]

Variations to this general approach will occur based on where the requirements come from, what level of detail they have been rationalized to, and how well they are defined, documented, and understood. Examples of where requirements can come from are:

  • Analytic models that map directly to the FSLDM (such as Basel II)
  • Client-specific Asset/Liability Management (ALM) requirements yet to be mapped/verified against the FSLDM
  • Source files mapped to the FSLDM which contain the atomic-level ALM data
  • A hybrid of the above (top-down and bottom-up)

Examples of information details that will need to be interpreted at their lowest level:

  • Report-based (e.g. dimensions, measures, aggregates, etc.)
  • Third-party product-based – interpreting requirements that are considered out-of-the-box
  • Business query/question – What are the elements of the question (i.e. the nouns and verbs) that are broken down to identify the business concepts and rules?
  • Performance indicators – What are the measures and elements of which the Key Performance Indicators (KPIs) are composed?
  • Source file – Is the attribute at its lowest level, or is it a derived attribute (i.e. a formula was used)? (A sketch of these last two checks follows this list.)
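
To make the last two checks concrete, here is a minimal Python sketch. All KPI, measure, and attribute names are hypothetical, not drawn from any actual model: it records which measures a KPI is composed of, and flags whether a source attribute is atomic or derived.

  from dataclasses import dataclass, field

  @dataclass
  class SourceAttribute:
      name: str
      formula: str | None = None   # a non-None formula means the value is derived

      def is_atomic(self) -> bool:
          # An attribute is at its lowest level only if no formula produced it.
          return self.formula is None

  @dataclass
  class KPI:
      name: str
      measures: list[str]                      # elements the KPI is composed of
      dimensions: list[str] = field(default_factory=list)

  # Hypothetical example: a net-interest-margin KPI and two source attributes.
  nim = KPI("Net Interest Margin",
            measures=["interest_income", "interest_expense", "earning_assets"],
            dimensions=["time_period", "business_unit"])

  attrs = [SourceAttribute("interest_income"),                       # atomic
           SourceAttribute("net_interest", "income - expense")]      # derived

  for a in attrs:
      status = "atomic" if a.is_atomic() else f"derived via '{a.formula}'"
      print(f"{a.name}: {status}")
  print(f"{nim.name} is composed of: {nim.measures}")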

DW Design Considerations – Mapping of Subject Areas to the FSLDM

To ensure the various subject areas (ALM, Risk, Profitability, etc.) align to the relevant classification groupings and entities in the FSLDM, a requirements-driven approach followed by a source-file mapping activity is recommended.

This involves a map-and-gap analysis exercise of the various subject-area requirements (i.e. ALM, etc.) against the relevant entity classification areas of the FSLDM. An in-depth validation exercise will need to occur with the ALM business users or Subject Matter Experts (SMEs). This provides a first-cut draft of which ALM element requirements are already within the FSLDM and what needs to be added or extended. The next step (source-to-target mapping) will highlight the realities of where the gaps are from a more physical (data) perspective.
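
As a rough illustration only, the Python sketch below shows the shape of such a map-and-gap exercise; the classification groupings and requirement names are invented and do not come from the actual FSLDM.

  # Hypothetical FSLDM-style classification groupings and the entities they hold.
  fsldm_classifications = {
      "PARTY":     {"counterparty", "customer"},
      "AGREEMENT": {"loan_account", "deposit_account"},
      "EVENT":     {"interest_accrual", "repricing_event"},
  }

  alm_requirements = ["loan_account", "repricing_event", "behavioural_maturity"]

  def map_and_gap(requirements, classifications):
      """Return a first-cut draft of mapped requirements and remaining gaps."""
      mapped, gaps = {}, []
      for req in requirements:
          hit = next((c for c, entities in classifications.items()
                      if req in entities), None)
          if hit:
              mapped.setdefault(hit, []).append(req)
          else:
              gaps.append(req)
      return mapped, gaps

  mapped, gaps = map_and_gap(alm_requirements, fsldm_classifications)
  print("Mapped (to validate with SMEs):", mapped)
  print("Gaps (candidate FSLDM extensions):", gaps)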

While a source (file) to FSLDM target mapping may be seen as part of the prerequisite ETL activity (i.e. requirement specifications), it further verifies what has been accurately modeled/mapped and what cannot be. When the original source data (i.e. the system of record) is involved, a different perspective may emerge once actual data values are used to confirm or refute assumptions. This may influence the business rules between entities and key attributes, cardinality, optionality, ordering within the model entities, or a more appropriate alignment to other FSLDM entities/attributes.
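
The sketch below illustrates the kind of data-value check meant here, under assumed column names: profiling actual source rows to confirm or refute an assumed 1:1 cardinality and a mandatory (non-null) attribute before the mapping is finalized.

  from collections import defaultdict

  # Hypothetical source extract. Assumptions to test: each agreement has at
  # most one party (1:1), and rate_type is mandatory (never null).
  rows = [
      {"agreement_id": "A1", "party_id": "P1", "rate_type": "FIXED"},
      {"agreement_id": "A1", "party_id": "P2", "rate_type": "FIXED"},  # breaks 1:1
      {"agreement_id": "A2", "party_id": "P3", "rate_type": None},     # breaks NOT NULL
  ]

  parties = defaultdict(set)
  null_rate_rows = 0
  for r in rows:
      parties[r["agreement_id"]].add(r["party_id"])
      if r["rate_type"] is None:
          null_rate_rows += 1

  violations = {a: p for a, p in parties.items() if len(p) > 1}
  print("1:1 assumption violated for:", violations)  # may force a 1:M relationship
  print("Rows with null rate_type:", null_rate_rows) # optionality differs from model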

The use of Subject Matter Experts (SMEs) who understand the classification groupings and can accurately interpret the FSLDM element definitions will be essential. Combining these with the client's subject-area SMEs (i.e. ALM, profitability) and source-system SMEs will significantly increase the accuracy of this exercise. The more one has to assume or interpret what is being mapped, the higher the risk.

It is not recommended to pursue a source-system to FSLDM mapping exercise as the first initiative; there are significant pitfalls to this approach which can severely impact the success of a DW initiative. A source-to-FSLDM mapping is meant as a follow-on exercise to increase the accuracy of a requirements-driven mapping approach and to highlight gaps that may exist between what is being requested and what is currently available from a data perspective.

Data Mart (DM) Design Considerations

Different modeling techniques for building a data mart

The use of materialized views, OLAP reporting tools, or an MIS application that uses the DM as its source of data will influence the Data Mart's ultimate design.

Where the DM uses dimensional modeling techniques, there is limited flexibility after the design has been completed. This is because a dimensional design closely aligns to a physical representation of how the data is to be stored, versus a logical representation without storage or access considerations, as is the case with the FSLDM.

The FSLDM, which is modeled using normalization techniques, is designed to reflect the way the business works, not how the data is to be reported on or accessed by the business end users. The latter is what a DM dimensional (star schema) model is used for.
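
To make the contrast concrete, here is a deliberately simplified sketch (Python holding illustrative SQL DDL; the table and column names are invented and are not FSLDM entities) of the two shapes side by side.

  # Normalized (3NF) shape: each business concept is its own entity, related by
  # keys, with no assumptions about storage or access paths.
  normalized_ddl = """
  CREATE TABLE party     (party_id INT PRIMARY KEY, party_name VARCHAR(100));
  CREATE TABLE agreement (agreement_id INT PRIMARY KEY,
                          party_id INT REFERENCES party(party_id),
                          start_date DATE);
  """

  # Dimensional (star schema) shape: one fact table at a declared grain,
  # surrounded by denormalized dimensions tuned for reporting and access.
  star_ddl = """
  CREATE TABLE dim_party    (party_key INT PRIMARY KEY, party_name VARCHAR(100));
  CREATE TABLE dim_date     (date_key INT PRIMARY KEY, calendar_date DATE);
  CREATE TABLE fact_balance (party_key INT REFERENCES dim_party(party_key),
                             date_key  INT REFERENCES dim_date(date_key),
                             balance_amount DECIMAL(18,2));
  """
  print(normalized_ddl, star_ddl)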

As a result, the design of the DM requires acknowledgement of several factors before an appropriate design can be pursued.

In general, the design of a DM (where dimensional modeling / a star schema is involved) follows this methodology:

  • Identify a business process/requirement (i.e. ALM requirements, MIS reports, etc.). A DM is designed around known requirements.
  • Identify the lowest level of the process (i.e. individual transaction, individual daily/monthly snapshot) that will be represented in the fact table for this process.
  • Analyze the elements of the business process or requirement and identify the dimensions, measures, etc. of the process, noting related characteristics (i.e. hierarchies, aggregates, history, etc.).
  • For each business process, identify the fact table, the related dimension tables, their contents, and the relationships between the tables. Where there are multiple business processes or requirements within a subject area (i.e. ALM, Profitability, etc.), this approach continues (a sketch recording these steps follows this list).
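
One compact way to record the outcome of these four steps is sketched below; the ALM process, grain, and column names are hypothetical.

  from dataclasses import dataclass, field

  @dataclass
  class DimensionalDesign:
      business_process: str                 # step 1: the known requirement
      grain: str                            # step 2: lowest level in the fact table
      measures: list[str]                   # step 3: measures at that grain
      dimensions: dict[str, list[str]] = field(default_factory=dict)  # step 4

  alm_design = DimensionalDesign(
      business_process="ALM daily position reporting",
      grain="one row per account per business day",
      measures=["principal_balance", "accrued_interest"],
      dimensions={
          "dim_account": ["account_id", "product_type"],
          "dim_date":    ["calendar_date", "month", "quarter"],  # hierarchy noted
          "dim_org":     ["branch", "region"],
      },
  )
  print(alm_design.business_process, "->", alm_design.grain)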

If the DM logical design is based on a subset of the FSLDM, a process similar to that discussed for designing the DW would be pursued.

Additional Data Mart Considerations

It should be noted that with a dimensional DM, extensions (i.e. later additions) to the data model may be more complex and can affect the ETL process. The same is not true of the FSLDM, as it is based on a 3NF modeling methodology, which more closely aligns to how the business works and/or how the data is stored in the originating OLTP source systems.

Hierarchies, aggregates, and measures are the most affected areas within the DM design. The use or influence of an OLAP reporting tool, an MIS application, or performance implications can all affect such items.

Where multiple fact tables from different subject areas share the same dimensions (i.e. a federated Data Mart architecture), it is essential that shared dimensions be conformed (i.e. standardized). Reference Data Management and/or an active data governance process will need to be leveraged to ensure the appropriateness and accuracy of the common dimensions can be achieved and maintained.
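
As an operational illustration of conformance, the sketch below compares the definition of a shared dimension as built in two hypothetical marts and reports attribute drift that a governance process would need to resolve.

  # Hypothetical metadata: the same dimension as defined in two marts.
  alm_mart    = {"dim_customer": {"customer_key", "customer_name", "segment"}}
  profit_mart = {"dim_customer": {"customer_key", "customer_name", "region"}}

  def conformance_gaps(dim, mart_a, mart_b):
      """Attributes present in one mart's dimension but not the other's."""
      a, b = mart_a[dim], mart_b[dim]
      return {"only_in_first": a - b, "only_in_second": b - a}

  gaps = conformance_gaps("dim_customer", alm_mart, profit_mart)
  print(gaps)  # non-empty sets mean the shared dimension is not conformed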

Considerations to maintain the integrity of the evolving model

  • A rigid set of standards and design principles will be required and must be approved by interested and/or affected parties. These provide focus and conformity to the approach, and help mitigate the differences and debates that will inevitably arise.
  • Authority for the data and modeling standards must be aligned with and/or approved by a data management/governance body that can and will react should these standards be circumvented or challenged. If there is a high probability that the data modeling team will be overridden (as it normally lacks the authority to enforce data standards), a significant decline in the quality of the model and its contents can occur.
  • A centralized data architecture team will be required and will be responsible for ensuring the integrity of the model.
  • If parallel streams of development activity by different modelers are occurring against the same subject-area entities within the FSLDM, strict adherence to model publishing (i.e. the most current version), version control, and impact analysis against downstream deliverables (i.e. tables, ETL processes/programs, etc.) needs to be pursued (a dependency-map sketch follows this list).
  • Use of an industry-standard data modeling tool with relevant reporting and versioning capabilities will be required.
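
The impact-analysis point above can be made concrete with a simple dependency map, sketched below with invented entity and deliverable names.

  # Hypothetical map from FSLDM entities to downstream deliverables that must be
  # re-examined when a parallel work stream changes the entity.
  downstream = {
      "AGREEMENT": ["tbl_agreement", "etl_load_agreement", "dm_alm_fact_balance"],
      "PARTY":     ["tbl_party", "etl_load_party"],
  }

  def impact_of(changed_entities):
      return sorted({d for e in changed_entities for d in downstream.get(e, [])})

  print(impact_of(["AGREEMENT"]))  # deliverables needing impact review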