
Detailed Design

Hybrid 2.0

Author: Upal Chakraborty, Biju Das, Sandeepan Dey, Manoj Singh
Version: V3.0
Status: Final Draft
Date: 05/09/2014


Document data

Document Change History

Version # | Date | Author | Section | Nature of Change
V0.1 | 18-06-2014 | Manoj Singh, Upal Chakraborty | | Initial draft
V0.2 | 25-06-2014 | Manoj Singh | | Data Type for New Tables, Purging and Retention, Statistics
V0.3 | 26-06-2014 | Manoj Singh | | Modified the table structure to show detail about data load into tables; added NULLABLE column information
V0.4 | 26-06-2014 | Manoj Singh | 7.3 | Modified 7.3
V0.5 | 30-06-2014 | Manoj Singh | | Minor changes
V0.6 | 30-06-2014 | Manoj Singh | | Population Method update for new tables
V0.7 | 03-07-2014 | Biju Das | 7.5 | ADQM
V0.8 | 04-07-2014 | Biju Das | | Updated as per initial review
V0.9 | 08-07-2014 | Biju Das | | Minor change in Backup facet config
V1.0 | 10-07-2014 | Manoj Singh | 7.1.2.1 | Updated
V1.1 | 14-07-2014 | Manoj Singh | 7.1.1.6, 7.1.2.2, 7.1.3.1 | Added tabular structure for data insertion and backup facet
V1.2 | 16-07-2014 | Biju Das | 7.5.2 | Updated; page formatted
V1.3 | 16-07-2014 | Manoj Singh | | Minor change
V1.4 | 22-07-2014 | Manoj Singh | 7.1.4 | Adding new facet
V1.5 | 31-07-2014 | Manoj Singh | 7.1.1.3 | Addition of new column in INCENTIVE_EVENT table
V2.0 | 03-09-2014 | Manoj Singh | 7.4, 7.1.4, 7.1.1.6, 7.1.1.3, 7.1.2, 7.1.2.3 | Minor changes: column name change for INCENTIVE_EVENT table, column type change for Stats Gather, PRODUCT_ITEM ETL load
V3.0 | 05-09-2014 | Biju Das | | Minor changes: version number on main page, footer etc.

Contributors and Reviewers

Name | Department | Role
Rajib Roy | Detailed design | Reviewer
Frits Hermans | BI Strategy & Architecture | Reviewer
Eric Janssen | BI Operations | Reviewer
Upal Chakraborty, Manoj Singh, Biju Das, Sandeepan Dey | Detailed Design (IBM Team) | Contributor

Document Approval

Version # | Approval Date | Role | Name | Department

Table of Contents

Document Change History
1 Introduction
  1.1 Overview
  1.2 Purpose
  1.3 Charcoal Diagram
  1.4 Related Documentation
  1.5 Open Items
  1.6 Risks
2 Scope
  2.1 In Scope
  2.2 Assumptions
  2.3 Out of Scope
3 Interface File and KPI File
  3.1 Characteristics of the data file
  3.2 General requirements of the data file
  3.3 General Business Rules
  3.4 Functional Descriptions of the files
  3.5 Naming convention of the data file
  3.6 Structure of the data file
  3.7 Characteristics of the KPI file
  3.8 General requirements of the KPI file
  3.9 Naming convention of the KPI file
  3.10 Structure of the KPI file
  3.11 Characteristics of the metadata file
  3.12 General requirements of the metadata file
  3.13 Naming Convention of the metadata file
  3.14 Structure of the metadata file
  3.15 Sample metadata file for data file
4 ETL Design
  4.1 Project Creations
  4.2 File Polling
  4.3 Generic Batch Processing
  4.4 Post Processing
    4.4.1 KPI Post Processing
    4.4.2 Feed Files Post Processing
  4.5 DDL Housekeeping
    4.5.1 Housekeeping for feed files
    4.5.2 Housekeeping for KPI files
  4.6 ATLAS Processing
    4.6.1 Project Structure
    4.6.2 Polling
    4.6.3 Source File Registration and Metadata Validation
    4.6.4 Surrogate Key extraction
    4.6.5 Create Max Key lookup
    4.6.6 MDW Process
    4.6.7 ATLAS lookup synchronization
    4.6.8 Update Max Key Lookup in Legacy
    4.6.9 Source Target Mapping
      4.6.9.1 Event Facet
    4.6.10 Loading process for KPI files
  4.7 Impact of changes due to introduction of new field in the PRODUCT_ITEM_HIST table
    4.7.1 Changes to be made in Legacy system
    4.7.2 Changes to be made in the existing MDW lookups
  4.8 ATLAS Housekeeping
5 Process Flow Diagrams
  5.1 Feed File Process Flow
  5.2 MDW Process Flow
  5.3 KPI File Process Flow
6 TWS scheduling
7 Teradata Design
  7.1 Hybrid 2.0 data Load (New)
    7.1.1 Event.Base facet: LDM tables data load
      7.1.1.1 One time load script for EVENT_CLASS Table
      7.1.1.2 One time load script for EVENT_TABLE_TYPE Table
      7.1.1.3 New Table: INCENTIVE_EVENT (SET TABLE)
      7.1.1.4 New Table: INCENTIVE_RESULT_TYPE (SET TABLE)
      7.1.1.5 Copy-to-Prod Configuration for Event.Base facet
      7.1.1.6 Data Backup: Configure Backup facet Script
      7.1.1.7 One-to-One view (New)
      7.1.1.8 One-to-One PL view (New)
      7.1.1.9 One time script for INCENTIVE_RESULT_TYPE Table
    7.1.2 Offer.Product facet: LDM tables data load
      7.1.2.1 Existing Table: PRODUCT_ITEM_HIST (SET TABLE)
      7.1.2.2 Data Backup: Configure Backup facet Script
      7.1.2.3 One time script for PRODUCT_ITEM Table
      7.1.2.4 One-to-One view (New)
      7.1.2.5 One-to-One PL view (New)
    7.1.3 Miscellaneous facet: LDM tables data load
      7.1.3.1 Data Backup: Configure Backup facet Script
      7.1.3.2 One time script for CREATION_SOURCE_TYPE Table
    7.1.4 Campaign facet: LDM tables data load
      7.1.4.1 Data Backup: Configure Backup facet Script
      7.1.4.2 One time load script for CAMPAIGN_STRATEGY Table
  7.2 Data Backup
  7.3 Data Retention and Purging
  7.4 Statistics Gathering
  7.5 ADQM
    7.5.1 Values to be calculated
    7.5.2 Detailed KPI Calculations
    7.5.3 Price Reference Feed
      7.5.3.1 Number of records per Price plan, FBS
    7.5.4 Campaign Discount Feed
      7.5.4.1 Number of records per Result Code, Campaign
8 Appendix
  A. Key Decision Emails

1 Introduction
1.1 Overview
Flex Hybrid 2.0 is an improvement over the originally deployed Hybrid 1.0, also known as Flex 1.0, which went live in January 2014. The Hybrid 2.0 project mainly focuses on the capability to introduce more than one Hybrid Price Plan and the capability to apply reduced monthly charges based on campaign data from Unica.
In Flex Hybrid 1.0 it was also not possible to link data from the Vesta system with data from the Surepay system (containing the product id) in order to reconcile and check the monthly charge. The payment information from Vesta does not carry any product id. Vesta payments happen a few days after the Surepay balance event data arrives in ATLAS, and the Surepay balance events do not contain the event amount, so it is not possible to reconcile between Surepay events and Vesta payments.
As part of Hybrid 2.0, reference tables are needed which will allow reconciliation of the monthly charge vs. the monthly bundle for a flex product, taking the price plan id into account.
The above improvements would be brought in by introducing 2 feeds from MOSA: one feed would contain campaign related data targeting the customers eligible for receiving discounts or a reduced pricing plan, and the other would contain the price plan and bundle related data along with the associated prices, which would help the business to set up an association between the Flex products and their corresponding prices. These data feeds would be propagated through the DDL server and passed through the Ab Initio MDW process to the LDM in ATLAS.

1.2 Purpose
The purpose of this document is to describe in detail the changes needed in the ETL process to load MOSA data to the ATLAS LDM via DDL. Two new data files along with 2 KPI files will be loaded to the LDM through DDL.


1.3 Charcoal Diagram

1.4 Related Documentation


Doc # | Title | Document Name
1 | HLD Hybrid 2.0 | HLD-HYBRID 2.0_v0.9.doc, Version 009, June 2014
2 | Vodafone IT HLD | HLD - Hybrid 2 0 - V1.0.docx
3 | Unify DDL Design Document | VFNL-Unify-DD-DDL.doc, Version 0.2, dated 30-01-2012
4 | Unify DDL Design Amendment Document | VFNL-Unify-DA-DDL.doc, Version 2.6, dated 14-09-2012
5 | VFNL-Unify-IDD-DDL-ATLAS-111.doc | VFNL-Unify-IDD-DDL-ATLAS-11-1.doc, dated 06/12/2013
6 | VFNL-DA-ETL Design ATLAS.doc | VFNL-DA-ETL Design ATLAS.doc, version 1.2, dated 15/04/2013
7 | Detailed Design EBU_CVM | Detailed Design-EBU_CVM_v1.2.doc

1.5 Open Items

- Interface documents (IDD) from AMDOCS need to be confirmed and finalized. Details such as the datatypes of the fields are still pending on their side.
- Sample data from the source was not received during the DLD phase. The datatypes, formats and details such as nullability of the source fields and the exact attribute names were still not available during the detailed design phase.
- OLA will need to be taken care of in later phases of the project.

1.6 Risks

If the source doesn't send a final version of the IDD with all file details before the development phase, there is a risk that the data from the source files will not be processed correctly, which may result in data inconsistency.


2 Scope
2.1 In Scope
Below are the requirements from a reporting perspective:

- Two new feeds from MOSA would send data towards ATLAS. One of the feeds would contain price reference data and the other would contain campaign related data for the targeted customers.
- Processing of 2 data files and 2 metadata files from the new feeds through DDL, and loading of the data into ATLAS LDM tables using Ab Initio MDW.
- Processing of 2 KPI files and their corresponding metadata files, and loading them into ATLAS LDM tables.

2.2 Assumptions

- No history loading will be needed as part of this project.
- There is no impact on the feeds presently received from Vesta and Surepay.
- The 2 new files from the MOSA interface will be verified against the IDD document provided by the source system during system testing.
- KPI and metadata files will be provided by MOSA for the two feeds from MOSA to ATLAS.
- Sample data should be provided by the source system during later phases of the project (during the detailed design phase).
- MOSA will provide the CTN values in the same format as visible in the sample file from UNICA, in which case no additional data cleansing will be required on the CTN values.

2.3 Out of Scope

- Any kind of report generation is out of scope of this project.
- Process development for reconciliation between existing Vesta payments and Surepay events is out of scope of this project.
- Changes in the simrtb and ama events developed as part of Flex 1.0 are out of scope of this project.


3 Interface File and KPI File


Each data file will be precisely described by a corresponding metadata file.
The metadata will therefore contain information about the data file: size, version, record count, etc. This data will also be used to confirm the file transmission: if the metadata has been received, the data transmission is complete and processing can start.
In order to meet the Vodafone ADQM requirements, each data file must be accompanied by a so-called KPI file to be used for reconciliation purposes. These KPIs are determined by Vodafone per source.
In total, each day MOSA will land the following files:

- two data files
- two metadata files describing each of the data files
- two KPI files
- two metadata files describing each of the KPI files

3.1 Characteristics of the data file


Expected # records (daily): 18-20 records for the price_reference feed and around 3500 records for the campaign_discount file on a daily basis.
Expected # MB (daily): around 1 MB per day
Expected growth (%) per month: <To be finalized once sample file is received>
Extraction-type: daily delta file for the campaign_discount file and daily full dump for the price_reference file.

3.2 General requirements of the data file


Frequency / time date of delivery: Daily
Time of delivery: <TBD>
Dependency: <TBD>
Coding Standard: ascii
Compression Method: Compressed
File Type: .gz
File Transfer Method: SFTP push
File authorization: at least read/ write for all (UNIX: rw-rw-rw-)
Column delimiter: |
Each data file is accompanied by a corresponding metadata-file.
The metadata files should be delivered as plain text (not compressed).

3.3 General Business Rules



MOSA should provide data as per the requirements of the Hybrid 2.0 project.
Any filtering required on the data should be done in MOSA.
The data file should be sent before the metadata file.
The KPI file should be sent before the corresponding metadata file.

3.4 Functional Descriptions of the files


(i) Data Files: There are two data files which will be needed as part of the Hybrid 2.0 population: campaign_discount and price_reference.
The campaign_discount file contains the list of Hybrid customers identified by their CTN, the corresponding campaign identified by the campaigncode field, the discount associated with it, and the resultcode, which indicates whether a Hybrid customer is eligible for the campaign or not.
The Hybrid campaign customer list is initiated by Unica, the campaign management system, and sent to MOSA, which checks whether a customer is eligible for the discount to be applied and populates the value of the resultcode (1 or 0) accordingly.
The price_reference file is a snapshot file which contains the list of all possible combinations of Hybrid Price Plan and FBS and the associated price for each.

(ii) KPI Files: There would be two KPI files, one associated with each of the above mentioned data files: one for price reference and the other for campaign discount.
The KPI file for campaign_discount provides the count of records per result_code and campaign_code provided.
The KPI file for price_reference provides the count of records per price_plan_id and FBS_id.

3.5 Naming convention of the data file


<sourcesystem>_<filename>_<date_of_content>_<seq_no>.dat.gz
Whereby:
Sourcesystem: name of the source system, e.g. mosa
Filename: name of the file within the source system, e.g. price_reference
Date of content: the date of the data which the file is holding. Format: YYYYMMDDHH24MISS
Seq no: sequence number of the file, e.g. 99999999
Example(s):

mosa_price_reference_YYYYMMDD_99999999.dat.gz
mosa_campaign_discount_YYYYMMDD_99999999.dat.gz

3.6 Structure of the data file


There would be 2 data files, the structures of which are shown below:

mosa_price_reference_YYYYMMDD_99999999.dat.gz

Attribute | Type | Description/Example
priceplan | string | Flex
FBS | string | Flex_Sma_DataS
Price | decimal | 2750

mosa_campaign_discount_YYYYMMDD_99999999.dat.gz

Attribute | Type | Description/Example
CTN | decimal | 610774354
CampaignCode | string | Campaign identification as received within the HCL input file. Not relevant for the MOSA discount calculation, but used to report back in the return-file towards UNICA.
Campaign Effective Startdate | DATE | Effective start date of a campaign
Campaign Effective Enddate | DATE | Effective end date of a campaign
Minimum Value | decimal | Referred to as current bundle value, in euros
Discount value | decimal | The absolute amount that is to be deducted from the Monthly Charge
FBS ID | string | The FBS ID applicable at the moment of charge calculation by MOSA. This is the actual name of the FBS.
PricePlanId | string | The PricePlan ID applicable at the moment of charge calculation by MOSA. This is the actual name of the Price Plan (not the numeric code), e.g. flex 1
Price | decimal | The actual price that would have applied if no campaign was applicable
Date Time of Charging event | datetimestamp | Date/time stamp of the moment the discount was applied
Result | decimal | 1 = Reduction applied; 2 = Reduction not applied, minimum value not reached

3.7 Characteristics of the KPI file


Expected # records (daily): below 20
Expected # MB (daily): approx. 10 KB
Expected growth (%) per month: NA
Extraction-type: Full

3.8 General requirements of the KPI file


Frequency / time date of delivery: Daily
Time of delivery: Should be same as that of the corresponding data files
Dependency: <TBD>
Coding Standard: ascii
Compression Method: gzip

File Type: gz
File Transfer Method: SFTP push
File authorization: at least read/ write for all (UNIX: rw-rw-rw-)
Column delimiter: |. Please note that there should not be any pipe in any of the data attributes.
Each KPI file is accompanied by a corresponding metadata-file.
KPI files should be delivered in Unix-zip format (compressed).
The metadata files should be delivered as plain text (not compressed).
Below are the KPIs decided for each source file:

3.9 Naming convention of the KPI file


<sourcesystem>_<kpifilename>_<date_of_content>_<file_seq_no>.dat.gz
Whereby:
Sourcesystem: name of the source system, e.g. MOSA
Kpi filename: name of the KPI file within the source system, e.g. price_reference_kpi
Date of content: the date of the data which the file is holding. Format: YYYYMMDDHH
Example(s):

mosa_price_reference_kpi_YYYYMMDD_99999999.dat.gz
mosa_campaign_discount_kpi_YYYYMMDD_99999999.dat.gz

Note*: The sequence number is governed by the source system; for every file delivery the sequence number is supposed to be incremented by 1 by the source system.

3.10 Structure of the KPI file

mosa_price_reference_kpi_YYYYMMDD_99999999.dat.gz

Column name | Value
SYSTEM | MOSA
KPI_ID | 5
GROUP_LEVEL_1 | Priceplanid
GROUP_LEVEL_2 | FBSid
KPI_DATE | Date
KPI_VALUE | Number of counted records per group level 1 and 2
UOM | Number

Below is a sample KPI file for the price_reference feed:

MOSA|5|priceplanid|fbsid|2014-05-01|10|number

mosa_campaign_discount_kpi_YYYYMMDD_99999999.dat.gz

Column name | Value
SYSTEM | MOSA
KPI_ID | 6
GROUP_LEVEL_1 | Result_Code
GROUP_LEVEL_2 | Campaign_code
KPI_DATE | Date
KPI_VALUE | Number of counted records per group level 1 and 2
UOM | Number

Below is a sample KPI file:

MOSA|6|<result_code>|<campaign_code>|2014-05-01|10|number

The above KPI files are sample files. Please note that the number of records in these KPI files will be dynamic, as the number of services will be dynamic for each day.
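To make the KPI definition concrete: the KPI_VALUE is simply the record count per (group level 1, group level 2) combination. Below is a minimal SQL sketch of how the price_reference counts could be derived, assuming a hypothetical table price_reference_data holding the content of the day's data file (the actual KPI files are produced by the source system, not by this SQL):

    -- Sketch only: price_reference_data is a hypothetical staging of the daily file.
    SELECT 'MOSA'        AS SRC_SYSTEM,
           5             AS KPI_ID,
           priceplan     AS GROUP_LEVEL_1,
           FBS           AS GROUP_LEVEL_2,
           CURRENT_DATE  AS KPI_DATE,
           COUNT(*)      AS KPI_VALUE,   -- records per priceplan/FBS pair
           'number'      AS UOM
    FROM   price_reference_data
    GROUP  BY priceplan, FBS;

The campaign_discount KPI (KPI_ID 6) would be derived analogously, grouping by result code and campaign code.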

3.11 Characteristics of the metadata file


Expected # records (daily): < 10
Expected # MB (daily): < 100 KB
Expected growth (%) per month: n/a
Extraction-type: Full

3.12 General requirements of the metadata file


Frequency / time date of delivery: same as data/ KPI file.
Time of delivery: same as data/ KPI file.
Dependency: Must be sent after the corresponding data/KPI file.
Coding Standard: ascii
Compression Method: no compression
File Type: met
File Transfer Method: SFTP push
File authorization: at least read/ write for all (UNIX: rw-rw-rw-)
Column delimiter: |. Please note that there should not be any pipe in any of the data attributes.

3.13 Naming Convention of the metadata file


(all lowercase)
<sourcesystem>_<filename>_<date_of_content>_<file_seqno>.met
Whereby:
Sourcesystem: name of the source system
Filename: same as the name of the data/KPI file
Date of content: the date of the data which the file is holding. Format: YYYYMMDD
File seq no: sequence number of the file within the source system/filename deliveries. Format: 00000000. The sequence number starts with 00000001; the next file should have seq nr 00000002 and so on. There shouldn't be gaps in the seq_nos. If for whatever reason a file has to be re-sent, the seq_no should be the same (not increased).

Example(s):
For the data files, the names of the metadata files would be:

mosa_campaign_discount_YYYYMMDD_<SEQNO>.met
mosa_price_reference_YYYYMMDD_<SEQNO>.met

For the KPI files, the names of the metadata files would be:

mosa_campaign_discount_kpi_YYYYMMDD_<SEQNO>.met
mosa_price_reference_kpi_YYYYMMDD_<SEQNO>.met

3.14 Structure of the metadata file


There is one metadata file per data file.
The files should contain the following information:

Fieldname/tag | Description | Example
SOURCE_SYSTEM | Name of the source system | MOSA
FILE_TYPE | Type of the source file | DATA or KPI
NUM_REC | Number of records in the datafile. This number must be equal to the actual number of records in the transferred file. | 1723
BYTES | Number of bytes in the datafile. This number must be equal to the actual number of bytes in the transferred file. | 332324
DELIMITER | Character used as delimiter between the attributes of the datafile | |
FULL_DELTA | Indicator if the datafile holds a full dump or deltas only | D/F
FIELD_NAMES | List of all the attributes separated by a | | 
CRE_DATE | Same as date_of_content in naming convention | 20090422
SEQ_NO | Same as file_seqno in naming convention | 00000001
VERSION | Version of interface definition | 001

3.15 Sample metadata file for data file

SOURCE_SYSTEM=MOSA
FILE_TYPE=DATA
NUM_REC=292934
BYTES=3232324
DELIMITER=|
FULL_DELTA=D
FIELD_NAMES=<Incoming field names >
CRE_DATE=20130801
SEQ_NO=00000001
VERSION=001


4 ETL Design
4.1 Project Creations:
Since the MOSA interface will have two new daily feeds which need to go through DDL processing and the ATLAS load, a new source project needs to be introduced for MOSA at the DDL end. The new project structure would look like the following: a new private project named mosa has to be created in the path /Projects/ddl/source, and a public project named com_mosa has to be created at the same path /Projects/ddl/source, as depicted below. The com_mosa common project has to be included in the mosa private project.

4.2 File Polling


1. All MOSA interface files will land into the DDL landing directory on the DDL server in the following location: $COM_MOSA_INBOUND (/opt/SP/ai_serial/data/ddl/source/com_mosa/pending)
Note*: The above is an example of a production path.
2. Polling of all the files will be done on the DDL server with the existing polling script.
Name: $COM_DDL_BIN/generic_wait_for_file.ksh
Usage: generic_wait_for_file.ksh [-h] -s <sandbox root> -p <project name> [-m] -F <feed prefix> -S <feed suffix> [-M <metadata file suffix>] [-D <duration>] [-I <polling interval>] [-P <directory parameter>] [-H <host>]
Where:
-h displays this help message.
-s <sandbox root> the name of the folder where all Ab Initio projects are rooted
-p <project name> the name of the project whose parameters are to be sourced for folder information
-F feed file name prefix; can take wild cards, including unix commands
-S feed file name suffix
-M optional metadata file suffix; needs to be provided if there is a metadata file to look for
-D optional duration in minutes for which to wait for files; defaults to 30 mins
-I optional polling interval in seconds after which files are looked for again; defaults to 30 secs
-P parameter name (without $) in the sandbox which points to the directory where to look for files; defaults to COM_<PROJECT_NAME>_INBOUND
-H optional host name where to look for the file (ssh should be enabled for it); defaults to localhost
Example:
/ai_serial/sand/ddl/abiprod/com_ddl/bin/generic_wait_for_file.ksh -s /ai_serial/sand/ddl/abiprod/ddl -p mosa -F campaign_discount -users -S .dat.gz -M .met -D 300 -I 300
Note*: The polling needs to be done for both the data files and their corresponding KPI files.

4.3 Generic Batch Processing


The two new source files landed from MOSA undergo the generic DDL batch processing. For details of the existing DDL framework, the following DDL detailed design document and the corresponding amendment document need to be followed: VFNL-Unify-DD-DDL.doc, Version 0.2, dated 30-01-2012 and VFNL-Unify-DA-DDL.doc, Version 2.6, dated 14-09-2012, as mentioned in Sec 1.4.
This will lead to the creation of two new psets for each of the source files (one generic graph pset and another generic batch plan pset).
The following psets need to be created:

/Projects/ddl/source/mosa/pset/price_reference_plan.pset
/Projects/ddl/source/mosa/pset/price_reference_generic_batch.pset
/Projects/ddl/source/mosa/pset/campaign_discount_plan.pset
/Projects/ddl/source/mosa/pset/campaign_discount_generic_batch.pset
/Projects/ddl/source/mosa/pset/price_reference_kpi_plan.pset
/Projects/ddl/source/mosa/pset/price_reference_kpi_generic_batch.pset
/Projects/ddl/source/mosa/pset/campaign_discount_kpi_plan.pset
/Projects/ddl/source/mosa/pset/campaign_discount_kpi_generic_batch.pset

The above mentioned feeds undergo the generic batch processing and create the respective delta and snapshot files. Since this process runs through generic and reusable DDL graphs, new parameter values need to be supplied for each of the psets mentioned above. The details of the parameter values for each of the above mentioned psets are provided in the attached spreadsheet:

Mosa_interface_generic_batch_parameter_details.xls

For any new source, the corresponding KPI files will be landed in the source specific landing directory, from where the KPI files will be picked up and processed by the respective private project process, in this case mosa.
Currently the KPI files undergo a singleton post processing, which means that there is a generic post processing graph which collects all KPI delta files from the $COM_KPI_PROCESSED directory. So, in order that the existing post processing can be reused (or changed only minimally), the KPI files for MOSA should finally get created in $COM_KPI_PROCESSED rather than $COM_MOSA_PROCESSED. This can be achieved by including the common project com_kpi in the private project of mosa and passing the value $COM_KPI_PROCESSED to the parameter DELTA_DIRECTORY for the psets:

/Projects/ddl/source/mosa/pset/price_reference_kpi_generic_batch.pset
/Projects/ddl/source/mosa/pset/campaign_discount_kpi_generic_batch.pset

4.4 Post Processing


The delta files produced as a result of the generic batch process described above need to undergo a post processing step wherein the records are brought into the format which the downstream needs. More details of the existing post processing framework can be inferred from documents 3, 4 & 6 as listed in the Related Documentation section.
The details of the post processing changes for the KPI feeds and the data feeds are highlighted below:

4.4.1 KPI Post Processing


There currently exists a generic post processing plan, atlas_kpi_post_processing.plan, which invokes a bespoke graph $AI_MP/atlas_kpi_post_processing.mp for the post processing of KPI files. This generic KPI post processing was tailored towards the needs of the KPI files processed for Unify, where the resources of the plan are set to be dependent on the various KPI files which were sent from the source during the Unify project development. We have two options here:

Option 1: Use the existing set up, i.e. the existing KPI post processing plan and graphs. This would need some changes to the KPI graph, which includes small changes in the pset parameters and the inclusion of the new resource names along with the existing resource names in the Ab Initio plan.
Advantages: no new set of plans or graphs needs to be developed; the existing ones can be used.
Disadvantages: creates a dependency upon the Unify KPI stream, since this plan is invoked once the Unify DDL processing for KPI is completed.

Option 2: Create a separate plan and graph for MOSA KPI processing.
Advantages: no dependency upon the Unify stream; MOSA KPI can be processed independently.
Disadvantages: creation of a new plan and graphs and new TWS jobs.

Keeping in mind the advantages and disadvantages of both options, option 2 is more beneficial, as it will allow the sources to get their KPIs processed independently of each other, especially now that the KPI files for each source are landed in the source specific landing area.
The changes which need to be made for option 2 are described below. One new graph and a new plan need to be created:
Graph Name: /Projects/ddl/target/ddl_atlas/mp/atlas_mosa_kpi_post_processing.mp
Plan Name: /Projects/ddl/target/ddl_atlas/plan/atlas_mosa_kpi_post_processing_plan.plan
The graph and plan should be similar to the existing KPI graph and plan, with the exception of the following plan parameters:

Parameter Name | Value
METADATA_FILE_URL | $COM_DDL_ATLAS_PENDING/atlas_kpi_mosa_last_extracted_details_without_status.dat
SOURCE_FEEDS | MOSA,price_reference_kpi|MOSA,campaign_discount_kpi
DISTRIBUTION_FILE | $COM_DDL_ATLAS_PENDING/atlas_kpi_mosa_last_extracted_details_with_status.dat

Care should be taken that the output file produced as a result of the post processing is named differently from any other regular KPI post processing files and is source specific. An example of the output file which might be created as a result of this process:
$COM_DDL_ATLAS_PENDING/ddl_kpi_mosa_<YYYYMMDD>_<seq_no>.dat

4.4.2 Feed Files Post Processing


The price_reference and campaign_discount files need to undergo the post processing process; the changes needed are mentioned below:
(i) Two new plan and graph psets need to be created for each feed, as below:

atlas_mosa_campaign_discount_generic_post_processing_plan.pset
atlas_mosa_campaign_discount_generic_post_processing.pset
atlas_price_reference_generic_post_processing_plan.pset
atlas_price_reference_generic_post_processing.pset

(ii) Parameter values for each of the plan psets are to be provided as supplied in the below document:

MOSA_data_postprocessing_params.xls


4.5 DDL HOUSEKEEPING


4.5.1 Housekeeping for feed files
A new plan needs to be created in the path /Projects/ddl/source/mosa/plan/housekeeping.plan. The plan should look something like the one below:

The sub plans "price reference deltas" and "campaign discount deltas" would determine the purge date for each of the corresponding subscribers and purge them accordingly. The above sub plans would consist of the following:

(i) Determining the purge date:
The first graph task would determine the purge date for the subscriber and the second graph task would purge the required data in the delta files.
Two new psets need to be created:
$AI_PSET/determine_ddl_data_store_purge_date_cmpgn_disc.pset
$AI_PSET/determine_ddl_data_store_purge_date_price_ref.pset
The above psets would be passed in the graph parameter of the above mentioned determine purge date subplan.
Graph called by the psets: /Projects/ddl/com_housekeeping/mp/determine_ddl_data_store_purge_date.mp
The parameters of the above psets are as follows:

SOURCE_SYSTEM | MOSA
FEED | campaign_discount/price_reference
LKP_LOCATION | $COM_DDL_SERIAL_LOOKUP

(ii) Removing older delta files:
The second graph task would remove the delta files which are older than the PURGE_DATE. This needs the creation of two new psets:
$AI_PSET/housekeep_cmpgn_disc_deltas.pset
$AI_PSET/housekeep_price_ref_deltas.pset
Graph called by the psets: /Projects/ddl/com_housekeeping/mp/data_store_copied_files_housekeeping.mp
The parameters of the psets are as follows:

SOURCE_SYSTEM | MOSA
FEED | campaign_discount/price_reference
FEED_DIRECTORY | $COM_MOSA_PROCESSED
FILE_PATTERN | mosa_cmpgn_disc_delta/mosa_price_ref_delta
PURGE_DATE_FILE_LOCATION | $AI_SERIAL

(iii) Insertion of onetime records in control tables:
1. The object /Projects/ddl/com_ddl/sql/create_control_tables.sql would be modified to insert the below sql queries.

INSERT INTO DDL_HOUSEKEEPING VALUES ('MOSA','Log',30);
INSERT INTO DDL_HOUSEKEEPING VALUES ('MOSA','Archive',30);
INSERT INTO DDL_HOUSEKEEPING VALUES ('MOSA','Reject',30);
INSERT INTO DDL_HOUSEKEEPING VALUES ('MOSA','Error',30);

2. The object /Projects/ddl/target/ddl_atlas/sql/target_system_config.sql would be modified to insert the below sql queries.

insert into ddl_subscribers (source_system,feed,subscriber) values ('MOSA','campaign_discount','ATLAS_CAMPAIGN_DISCOUNT');
insert into ddl_subscribers (source_system,feed,subscriber) values ('MOSA','price_reference','ATLAS_PRICE_REFERENCE');
insert into ddl_subscribers (source_system,feed,subscriber) values ('MOSA','campaign_discount_kpi','ATLAS_KPI');
insert into ddl_subscribers (source_system,feed,subscriber) values ('MOSA','price_reference_kpi','ATLAS_KPI');

3. The object /Projects/ddl/source/mosa/sql/source_system_config.sql would be modified to insert the below sql queries.

insert into ddl_data_retention_period values ('MOSA','campaign_discount',7);
insert into ddl_data_retention_period values ('MOSA','price_reference',7);
insert into ddl_data_retention_period values ('MOSA','campaign_discount_kpi',7);
insert into ddl_data_retention_period values ('MOSA','price_reference_kpi',7);

(iv) Creation of psets for housekeeping of log, reject, error and archive files of MOSA:
The following psets need to be created, to be passed to the error/log/reject/archive sub plans:

$AI_PSET/housekeep_mosa_error_files.pset
$AI_PSET/housekeep_mosa_log_files.pset
$AI_PSET/housekeep_mosa_reject_files.pset
$AI_PSET/housekeep_mosa_archive_files.pset

The parameters of the above psets are as follows:

URL | $AI_SERIAL_ERROR/$COM_MOSA_REJECT/$COM_MOSA_ARCHIVE/$AI_SERIAL_LOG
HOUSEKEEPING_FILE_TYPE | Error/Reject/Archive/Log

4.5.2 Housekeeping for KPI files


For housekeeping the KPI files created as a result of the MOSA source, the same new housekeeping plan as described in sec 4.5.1 would be used, wherein two new subplans, "Housekeep price reference KPI deltas" and "Housekeep campaign discount KPI deltas", need to be introduced. Since the DELTA_PERIOD value of the batch processing for KPI files is "All", the subplan would consist of something like below:

The same details as for housekeeping the feed files also apply to the KPI files; they are briefly described below:
(i) Determining the purge date:
Two new psets need to be created:
$AI_PSET/determine_ddl_data_store_purge_date_cmpgn_disc_kpi.pset
$AI_PSET/determine_ddl_data_store_purge_date_price_ref_kpi.pset
Details of the parameter values:

SOURCE_SYSTEM | MOSA
FEED | campaign_discount_kpi/price_reference_kpi
LKP_LOCATION | $COM_DDL_SERIAL_LOOKUP

(ii) Housekeeping delta files:
Create a pset each in the mosa project for the graph data_store_housekeeping.mp located in the com_housekeeping project, for the two KPI feeds. This will delete records from the delta file based upon the purge date determined in the above step.
Pset names:
$AI_PSET/housekeep_mosa_campaign_discount_kpi_deltas.pset
$AI_PSET/housekeep_mosa_price_reference_kpi_deltas.pset

4.6 ATLAS Processing


The DDL post processing process will sftp the files to the ATLAS server. Before the data is loaded to the ATLAS LDM it will go through meta validation and a series of MDW graphs where the business rules are applied, and eventually the data will be loaded to the LDM through the MDW framework graph.

4.6.1 Project Structure


The below diagram shows the proposed project structure which should exist at the ATLAS server end. The boxes marked in yellow are the new projects which need to be introduced.

4.6.2 Polling
A new TWS job needs to be defined during the development phase which will poll for the new feed files and the corresponding KPI files of MOSA. The existing wrapper /Projects/atlas_dwh/com_atlas/bin/generic_conduct_etl_flow.ksh needs to be used for the polling purpose.

4.6.3 Source File Registration and Metadata Validation


(i) Feed file registration process:
The new feed files of MOSA will undergo the file registration process along with metadata file validation against the received data file. In order to run this process, a new pset per feed needs to be created, as mentioned below.
pset Location: /Projects/atlas_dwh/source/mosa/pset/
pset Names:
atlas_meta_validation_mv_mosa_price_reference.pset
atlas_meta_validation_mv_mosa_campaign_discount.pset
Graph Name: /Projects/atlas_dwh/com_atlas/mp/atlas_meta_validation.mp


Graph Parameter Description (pset):

Name | Example | Description
SOURCE_SYSTEM | MOSA | Source system name
FEED | campaign_discount/price_reference | Feed name as per the pset used
SOURCE_DIRECTORY | ${COM_ATLAS_SERIAL_PENDING} | The directory the source files can be found in
FILE_PATTERN | <metadata file pattern> | The pattern to be used to locate the metadata
IS_SOURCE_COMPRESSED | True | 
SOURCE_DML_FILE | Target extract dml file name of each of the feeds | 
SOURCE_TRANSFORM | out :: reformat(in) = begin out.* :: in.*; end; | 
OUTPUT_DML_FILE | Same as source dml file | Consolidated file record format
IS_VAL_REQD | True | If record validation is required
DBC_FILE | $COM_ATLAS_DB/atlas_ldm.dbc | Teradata DBC file name
SCHEMA | $ATLAS_CTRL_OWNER | Schema name of the atlas registration table
CREATE_LOOKUP | False | Whether to use the consolidated file also as a lookup file in downstream processes
NO_OF_FILES_TO_PICK | | 

(ii) KPI files registration process:
The corresponding KPI files of MOSA will undergo the file registration process along with metadata file validation against the received data file. In order to run this process, a new pset per feed needs to be created, as mentioned below.
pset Location: /Projects/atlas_dwh/source/mosa/pset/
pset Names:
atlas_meta_validation_mv_mosa_price_reference_kpi.pset
atlas_meta_validation_mv_mosa_campaign_discount_kpi.pset
The same graph as mentioned above for the data feed files will be called from the above mentioned psets. The values of the parameters of the psets are as follows:

Name | Example | Description
SOURCE_SYSTEM | MOSA | Source system name
FEED | campaign_discount_kpi/price_reference_kpi | Feed name as per the pset used
SOURCE_DIRECTORY | ${COM_ATLAS_SERIAL_PENDING} | The directory the source files can be found in
FILE_PATTERN | ddl_kpi_mosa*.met | The pattern to be used to locate the metadata
IS_SOURCE_COMPRESSED | True | 
SOURCE_DML_FILE | ${COM_DDL_ATLAS_DML}/kpi/ddl_kpicommon_out.dml | 
SOURCE_TRANSFORM | out :: reformat(in) = begin out.* :: in.*; end; | 
OUTPUT_DML_FILE | ${COM_DDL_ATLAS_DML}/kpi/ddl_kpicommon_out.dml | Consolidated file record format
IS_VAL_REQD | True | If record validation is required
DBC_FILE | $COM_ATLAS_DB/atlas_ldm.dbc | Teradata DBC file name
SCHEMA | $ATLAS_CTRL_OWNER | Schema name of the atlas registration table
CREATE_LOOKUP | False | Whether to use the consolidated file also as a lookup file in downstream processes
NO_OF_FILES_TO_PICK | | 

4.6.4 Surrogate Key extraction


Surrogate keys of the tables CAMPAIGN, PRODUCT, PRODUCT_ITEM and EVENT have to be extracted before the start of the MDW process. The existing psets can be reused without making any change, and no new TWS jobs need to be created; the existing surrogate key extraction jobs for the above mentioned dimension tables can be used.
Details of the TWS jobs are provided in the TWS job section.

4.6.5 Create Max Key lookup


With the surrogate keys extracted from the various dimension tables, a max key lookup file is created. This will be created through the current create max key lookup process, so no change to this process is needed in this project.

4.6.6 MDW Process


The MDW processes data from the MOSA feeds into a set of target databases, i.e. the LDM database. The MDW allows multiple source datasets to contribute portions of records and be combined into a single target table, or a single source dataset to contribute to multiple target tables.

Process flow:

MDW script process flow

The script performs the following actions:
1. In parallel: MDW source_to_model reads input data from one or more source feed files and maps the data to the target model, i.e. applies the business rules.
2. In parallel: MDW key_census identifies key fields.
3. Serially: MDW process_key creates all needed surrogate keys. This must run as a singleton to ensure key consistency and prevent duplicates.
4. In parallel: MDW apply_keys replaces natural keys with existing surrogate keys or newly created surrogate keys.
5. In parallel: MDW model_to_physical maps all the fields in the model to the physical table record format.
6. In parallel: MDW database_load loads all waiting input files (possibly from many feeds) targeted for the table and consolidates them. There will be one MDW database load pset per target table.

4.6.7 ATLAS lookup synchronization


Lookup files of the tables EVENT, CAMPAIGN and PRODUCT_ITEM_HIST have to be synchronized after the MDW database load is completed, by running the following psets in TWS:
/Projects/atlas_dwh/com_atlas/pset/atlas_synchronization/sync_unify_to_atlas.event.pset
/Projects/atlas_dwh/com_atlas/pset/atlas_synchronization/sync_unify_to_atlas.product_item_hist.pset
/Projects/atlas_dwh/com_atlas/pset/atlas_synchronization/sync_unify_to_atlas.campaign.pset
For details of the lookup sync up jobs, please refer to the ATLAS TWS job sheet in section 6.

4.6.8 Update Max Key Lookup in Legacy


The current job which updates the max key would be used for updating the max key lookup file; no new jobs are introduced in this project.

4.6.9 Source Target Mapping


4.6.9.1 Event Facet
The event facet undergoes a change wherein a new table, INCENTIVE_EVENT, is introduced to hold the campaign related events; at the same time an INCENTIVE_RESULT_TYPE table is also created, which will hold the INCENTIVE_RESULT_CD of the introduced incentive events. The data model for the above is shown below:

The details of the mapping are as described in the embedded mapping spreadsheet:

Detailed Design-Hybrid 2.0_v01.1.xlsx

4.6.10 Loading process for KPI files

The consolidated KPI files created by the registration process mentioned in section 4.6.3 would be picked up for loading purposes. The KPI data doesn't go through the surrogate key generation process of the MDW framework and is directly loaded into the SOURCE_KPI_RECON table.
The current KPI loading process/graph needs to undergo a change to allow the graph to pick up a list of KPI files created from different processes, based on a naming pattern of the KPI files. Currently the load process looks for a fixed file name; it should be changed to look for a list of KPI files created in the $COM_ATLAS_SERIAL_TEMP dir.
The benefit of the change is that a single KPI load process would be responsible for loading all the KPI load files created in a day.
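As an illustration of the end result, below is a minimal sketch of a loaded KPI row. The column names are an assumption that SOURCE_KPI_RECON mirrors the KPI file layout; the actual load is performed by the KPI load graph, not by hand-written SQL.

    -- Sketch only: column names assume SOURCE_KPI_RECON mirrors the KPI file layout
    -- (system, KPI id, group levels, date, value, unit of measure).
    INSERT INTO SOURCE_KPI_RECON
        (SRC_SYSTEM, KPI_ID, GROUP_LEVEL_1, GROUP_LEVEL_2, KPI_DATE, KPI_VALUE, UOM)
    VALUES ('MOSA', 5, 'priceplanid', 'fbsid', DATE '2014-05-01', 10, 'number');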


Similar changes are taking place in a separate project, EBU_CVM; the details of those changes can be found in the detailed design document of EBU_CVM as listed in the Related Documentation section (document # 7).
Note*: During the development phase the Hybrid 2.0 development team needs to coordinate with the EBU_CVM development team for this change, so that a single change satisfies both projects' needs.

4.7 Impact of changes due to introduction of new field in the PRODUCT_ITEM_HIST table

In the current project a new field, SECONDARY_PRODUCT_ID, would be added to the PRODUCT_ITEM_HIST table. This would need changes and adjustments to the current code which interacts with this table, in both the legacy and Unify code bases.

4.7.1 Changes to be made in Legacy system:


Currently in the legacy system, PRODUCT_ITEM_HIST gets loaded from the Product facet. The below highlighted graph needs to be looked at, and any dml changes/adjustments need to be done at this level.
Graph Name: /Projects/atlas/product/mp/pdm_trf_product_facet_gemini.mp
Areas of impact in the graph are highlighted below; some of the components and output data files have embedded dmls which need to be changed accordingly.
The load ready file $ATLAS_DML/pdm_ldb_product_item_hist.dml also needs to be regenerated to cater for the addition of the new field in the table.

Note*:
(i) Arrangements need to be made to transform the previous day's file pdm_lkp_product_item_hist to the new format, i.e. with the new field secondary_product_id, where the value of the new field will be populated with NULLs, just before the day the production run happens with the changed dml. This onetime change can be done by means of a onetime Ab Initio graph or by using any scripting code.
(ii) The above change need not be done if the table PRODUCT_ITEM_HIST is already extracted by the lookup creation job which runs at the start of the facet jobs in the legacy system. This needs to be verified during the development phase.
(iii) Rigorous testing needs to be done during the development phase to establish code robustness for the changes made.

4.7.2 Changes to be made in the existing MDW lookups:


There would be subtle changes to the existing MDW lookups, which are used to maintain the relationship between the parent and child tables and establish the capability to seamlessly allow processing in the MDW key generation process for any new or modified tables in this project. The list of table changes is as below:

- A new column is added to the PRODUCT_ITEM_HIST table, hence the existing MDW logical and physical dmls have to be regenerated.
- A new table INCENTIVE_EVENT is introduced, so the corresponding logical and physical dmls need to be generated.
- The existing table CAMPAIGN, which was getting loaded through the legacy process, is introduced in the Unify load process.

For all of the above, the relation of the tables to their corresponding dimensions needs to be fed to a one time MDW Ab Initio process to generate the logical and physical dmls and to establish the correct relationship between each of the tables and its corresponding dimension tables. The psets to be used to execute this one time process are:
Pset Name: /Projects/atlas_dwh/utility/pset/3_mdw_import.VFNL.pset
Plan Name: /Projects/mdw2/MDW/plan/mdw_import.plan
The above plan takes an excel spreadsheet as an input parameter. The spreadsheet contains the dimension list, the relationship of the table with respect to the dimension, the physical dml structure, and the MDW keydefs, keysequences, keycontexts, valuecontexts etc.
A sample spreadsheet is attached below; however, during the development phase the latest spreadsheet has to be acquired by the developer from the existing Unify team, and changes need to be built on top of it.

Logical_Data_Model_20140522.xls

4.8 ATLAS Housekeeping


Housekeeping is required to purge the archive and error files generated through meta validation.
Here are the steps.
1. Update the sql /Projects/atlas_dwh/com_atlas/sql/create_registration_table.sql to insert these two queries:

INSERT INTO ${ATLAS_CTRL_OWNER}.ATLAS_HOUSEKEEPING VALUES ('MOSA','Archive',7);
INSERT INTO ${ATLAS_CTRL_OWNER}.ATLAS_HOUSEKEEPING VALUES ('MOSA','Error',7);

2. Create two psets for the graph /Projects/atlas_dwh/com_housekeeping/mp/generic_housekeeping.mp to purge MOSA archive and error files:

/Projects/atlas_dwh/source/mosa/pset/generic_housekeeping_archive_files.pset
/Projects/atlas_dwh/source/mosa/pset/generic_housekeeping_error_files.pset

Graph Name: /Projects/atlas_dwh/com_housekeeping/mp/generic_housekeeping.mp
Parameter Name | Value | Description
URL | $AI_SERIAL_ARCHIVE/$AI_SERIAL_ERROR | Dir to be purged
FILE_TYPE | DIR | Use the default value
SYSTEM_NAME | MOSA | Source system name
HOUSEKEEPING_FILE_TYPE | Archive/Error | Archive dir for the archive housekeeping pset and Error dir for the error housekeeping pset
LKP_LOCATION | $COM_ATLAS_SERIAL_LOOKUP | Keep the default value

5 Process Flow Diagrams


5.1 Feed File Process Flow


5.2 MDW Process Flow


5.3 KPI File Process Flow


6 TWS scheduling
New TWS jobs need to be introduced to execute the above illustrated process flows. The TWS jobs can be broadly classified into DDL jobs and jobs at the ATLAS end. The list of TWS jobs which would be created and their corresponding dependencies are given in the attached documents below:
(i) List of DDL and post processing jobs to be created:

Hybrid_DDL_job_schedules.xls

(ii) List of ATLAS and MDW TWS jobs to be created:

Hybrid_ATLAS_job_schedules.xls

Note*: The jobs are named as per the testing naming conventions; the names need to be realigned during development as per the environment where the jobs will be scheduled to run.
The job dependencies on existing jobs need to be revalidated during the development phase against the actual jobs running in prod, in case there are changes to the job stream during the development phase.


7. Teradata Design

As part of the Hybrid 2.0 project, below are the design items related to Teradata covered in this document.

7.1 Hybrid 2.0 data Load (New)


Two new feeds from the MOSA source system will be added, and their data will be pushed towards the ATLAS server through new interfaces. The interfaces shall contain the business data and metadata as well as KPI records for ATLAS reconciliation. New ETL mappings would be needed to transform and load the data into ATLAS.
The data load into the LDM should be done using the MDW framework, as is currently done for Unify. On top of the ATLAS LDM, one-to-one views for each of the tables, as well as 1-on-1 PL views, will be created in the presentation layer.

7.1.1 Event.Base facet: LDM tables data load

Below is the list of tables that will be impacted for this facet in STG_LDM & PROD_LDM:

TABLE NAME | NEW/EXISTING | REFERENCE TABLE (Y/N) | ETL LOAD (Y/N) | COMMENTS
EVENT | EXISTING | N | Y |
EVENT_TABLE_TYPE | EXISTING | Y | N | One Time Manual Load
EVENT_CLASS | EXISTING | Y | N | One Time Manual Load
INCENTIVE_EVENT | NEW | N | Y |
INCENTIVE_RESULT_TYPE | NEW | Y | N | One Time Manual Load
7.1.1.1 One time load script for EVENT_CLASS Table

A new script (insert_event_class.sql) will be created to insert an entry into the EVENT_CLASS table in the STG environment.
The default value will be -1 for both ICID & LUID for all the one-time inserts into the tables mentioned below.

EVENT_CLASS_CD | EVENT_CLASS_NAME | EVENT_CLASS_DESC | ICID | LUID
IE | Incentive event | Incentive event | -1 | -1
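For illustration, a minimal sketch of what insert_event_class.sql could contain is given below. The database qualifier ${STG_LDM_DB} is an assumed placeholder (the real scripts use the environment's own substitution variables); the same pattern applies to the other one-time load scripts in this chapter.

-- Sketch of insert_event_class.sql; ${STG_LDM_DB} is an assumed placeholder
INSERT INTO ${STG_LDM_DB}.EVENT_CLASS
    (EVENT_CLASS_CD, EVENT_CLASS_NAME, EVENT_CLASS_DESC, ICID, LUID)
VALUES ('IE', 'Incentive event', 'Incentive event', -1, -1);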

7.1.1.2 One time load script for EVENT_TABLE_TYPE Table

A new script (insert_event_table_type.sql) will be created to insert an entry into the EVENT_TABLE_TYPE table in the STG environment.
The default value will be -1 for both ICID & LUID for all the one-time inserts into the tables mentioned below.

EVENT_TABLE_TYPE_CD | EVENT_TABLE_TYPE_NAME | EVENT_TABLE_TYPE_DESC | ICID | LUID
INEV | Incentive event | Incentive event | -1 | -1
7.1.1.3 New Table: INCENTIVE_EVENT (SET TABLE)

This table is not present in ATLAS.
A script will be developed to create this table in the STG and PROD environments with the definition mentioned below.

Column | Data Type | Description | NULL (Y/N) | Sample | Population Method | Indexes
INCENTIVE_EVENT_ID | INTEGER | Foreign key to EVENT.EVENT_ID | | | | Unique Primary Index
ACCS_METH_ID | INTEGER | | | | Using the source field CTN, look up the ACCESS_METHOD table to get the access method id. | NA
CAMPAIGN_ID | INTEGER | | | | FK to the CAMPAIGN table, based on a lookup of the source field campaign_code. | NA
PRODUCT_ID | INTEGER | | | | Lookup on PRODUCT.PRODUCT_NAME for source field priceplanid values; use the product_id post lookup. | NA
SECONDARY_PRODUCT_ID | INTEGER | | | | Lookup on PRODUCT.PRODUCT_NAME for source field fbsid values; use the product_id post lookup. | NA
ACTUAL_CHARGE | DECIMAL(8,2) | | | | Source data from the Price field. This field can have datatype integer/decimal, but as the source datatype wasn't available and no sample data was available, it is advised to revisit this during detailed design. | NA
INCENTIVE_RESULT_ID | INTEGER | | | | FK to INCENTIVE_RESULT_TYPE.INCENTIVE_RESULT_ID. | NA
CREATION_SOURCE_TYPE_CD | VARCHAR(3) | | | MOS (MOSA) | | NA
MINIMUM_VALUE | DECIMAL(8,2) | | | | Mapped to the field minimum value from the source feed. This field can have datatype integer/decimal, but as the source datatype wasn't available and no sample data was available, it is advised to revisit this during detailed design. | NA
REDUCTION_AMOUNT | DECIMAL(8,2) | | | | Mapped to the field discount value from the source feed. This field can have datatype integer/decimal, but as the source datatype wasn't available and no sample data was available, it is advised to revisit this during detailed design. | NA
ORG_KEY | VARCHAR(255) | | | | NA | NA
ICID | SMALLINT | | | | Through ETL Process | NA
LUID | SMALLINT | | | | Through ETL Process | NA
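For illustration, a hedged sketch of the corresponding Teradata DDL is given below, derived from the column definitions above. The ${STG_LDM_DB} qualifier is an assumed placeholder, the NOT NULL on the primary index column is assumed, and storage/character-set options are omitted; the real DDL follows the existing ATLAS standards.

CREATE SET TABLE ${STG_LDM_DB}.INCENTIVE_EVENT
(
    INCENTIVE_EVENT_ID      INTEGER NOT NULL,   -- FK to EVENT.EVENT_ID; NOT NULL assumed for the UPI
    ACCS_METH_ID            INTEGER,
    CAMPAIGN_ID             INTEGER,
    PRODUCT_ID              INTEGER,
    SECONDARY_PRODUCT_ID    INTEGER,
    ACTUAL_CHARGE           DECIMAL(8,2),
    INCENTIVE_RESULT_ID     INTEGER,            -- FK to INCENTIVE_RESULT_TYPE.INCENTIVE_RESULT_ID
    CREATION_SOURCE_TYPE_CD VARCHAR(3),
    MINIMUM_VALUE           DECIMAL(8,2),
    REDUCTION_AMOUNT        DECIMAL(8,2),
    ORG_KEY                 VARCHAR(255),
    ICID                    SMALLINT,
    LUID                    SMALLINT
)
UNIQUE PRIMARY INDEX (INCENTIVE_EVENT_ID);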

7.1.1.4 New Table: INCENTIVE_RESULT_TYPE (SET TABLE)

This table is not present in ATLAS. This table should be loaded one time through manual scripts.
A script will be developed to create this table in the STG and PROD environments with the definition mentioned below.

Column | Data Type | Description | NULL (Y/N) | Sample | Population Method | Indexes
INCENTIVE_RESULT_ID | INTEGER | Incentive result code | | 1 = Reduction Applied, 2 = Reduction Not applied, minimum value not reached | Generated ID | Unique Primary Index
INCENTIVE_RESULT_DESC | VARCHAR(255) | Incentive result description | | Reduction Applied | |
ORG_KEY | VARCHAR(255) | Originating key from source | | | Straight move |
ICID | SMALLINT | | N | | |
LUID | SMALLINT | | N | | |

NOTE: incentive_result_cd (as per the HLD) has been changed to INCENTIVE_RESULT_ID; as per the existing standard, it should be an id, not a code. Also, the column desc (as per the HLD) has been changed to INCENTIVE_RESULT_DESC.
7.1.1.5 Copy-to-Prod Configuration for Event.Base facet

The generic COPY-TO-PROD script, which copies the data from STG_LDM to PROD_LDM on a daily basis, will have to be extended for the below list of tables:

Table name | Delete/Truncate from PROD_LDM | Insert into PROD_LDM
INCENTIVE_EVENT | Yes | Yes
INCENTIVE_RESULT_TYPE | Yes | Yes

No change will be required in the COPY_TO_PROD.ksh shell script. However, there will be entries for the new tables as mentioned above in the control table PROD_DBA.COPY_TO_PROD.
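For illustration only, the new control entries could look like the sketch below. The column list of PROD_DBA.COPY_TO_PROD is not specified in this document, so the column names used here (TABLENAME, DELETE_IND, INSERT_IND) are hypothetical placeholders; the real names must be taken from the existing table definition.

-- Hypothetical sketch; real column names come from the existing COPY_TO_PROD definition
INSERT INTO PROD_DBA.COPY_TO_PROD (TABLENAME, DELETE_IND, INSERT_IND)
VALUES ('INCENTIVE_EVENT', 'Y', 'Y');
INSERT INTO PROD_DBA.COPY_TO_PROD (TABLENAME, DELETE_IND, INSERT_IND)
VALUES ('INCENTIVE_RESULT_TYPE', 'Y', 'Y');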

7.1.1.6 Data Backup: Configure Backup facet Script

The generic facet backup script will have to be extended for all the new tables introduced for this facet.
This facet backup script backs up the tables in the STG_LDM database to the STG_BACKUP database on a daily basis.
No change will be required in the facet backup shell script. However, there will be entries for all the impacted tables in the control table STG_DBA.BACKUP_FACET. Sample data from the BACKUP_FACET table is given below for reference.
DATABASENAME | DATABASENAME_PRE | DATABASENAME_POST | TABLENAME | FACET | BACKUPNAME | EXC | DATA_IND | CREATE_LKP_IND | MAX_VALUE_IND
STG_LDM | STG | LDM | EVENT | EVENT_HB | H_B_EVENT | NULL | MAIN | |
STG_LDM | STG | LDM | EVENT_CLASS | EVENT_HB | H_B_EVENT_CLASS | NULL | REF | |
STG_LDM | STG | LDM | EVENT_TABLE_TYPE | EVENT_HB | H_B_EVENT_TABLE_TYPE | NULL | REF | |
STG_LDM | STG | LDM | INCENTIVE_EVENT | EVENT_INCENTIVE_HB | H_B_INCENTIVE_EVENT | NULL | MAIN | |
STG_LDM | STG | LDM | INCENTIVE_RESULT_TYPE | EVENT_INCENTIVE_HB | H_B_INCENTIVE_RESULT_TYPE | NULL | REF | |
7.1.1.7 One-to-One view (New)

One-to-one views (create_view_event.sql) need to be created for the new tables mentioned below in the STG_VIEW and PROD_VIEW databases.

Sno. | Source Database | Source Table Name | Target view database | View name
1 | PROD_LDM | INCENTIVE_EVENT | PROD_VIEW | INCENTIVE_EVENT
2 | PROD_LDM | INCENTIVE_RESULT_TYPE | PROD_VIEW | INCENTIVE_RESULT_TYPE

Sno. | Source Database | Source Table Name | Target view database | View name
1 | STG_LDM | INCENTIVE_EVENT | STG_VIEW | INCENTIVE_EVENT
2 | STG_LDM | INCENTIVE_RESULT_TYPE | STG_VIEW | INCENTIVE_RESULT_TYPE
7.1.1.8 One-to-One PL view (New)

One-to-one PL views (create_PL_view_event.sql) need to be created for the tables mentioned below in the PROD_PL_VIEW database.

Sno. | Source Database | Source Table Name | Target view database | View name
1 | PROD_VIEW | INCENTIVE_EVENT | PROD_PL_VIEW | INCENTIVE_EVENT
2 | PROD_VIEW | INCENTIVE_RESULT_TYPE | PROD_PL_VIEW | INCENTIVE_RESULT_TYPE
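For illustration, a minimal sketch of a one-to-one view definition is shown below, assuming a plain column-for-column projection. The access-lock modifier is an assumption based on common Teradata practice; the actual scripts follow the existing ATLAS view standards.

-- Sketch of a one-to-one view (plain 1:1 projection assumed)
REPLACE VIEW PROD_VIEW.INCENTIVE_EVENT AS
LOCKING ROW FOR ACCESS                 -- assumed; per common Teradata view practice
SELECT *                               -- production scripts typically list the columns explicitly
FROM PROD_LDM.INCENTIVE_EVENT;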

7.1.1.9 One time script for INCENTIVE_RESULT_TYPE TABLE

A new script (insert_stg_incentive_result_type.sql) will be created to insert the values related to Hybrid 2.0 into the INCENTIVE_RESULT_TYPE table in the STG environment.
The default value will be -1 for both ICID & LUID for all the one-time inserts into the tables mentioned below.

INCENTIVE_RESULT_ID | INCENTIVE_RESULT_DESC | ORG_KEY | ICID | LUID
1 | Reduction Applied | 1 | -1 | -1
2 | Reduction Not applied, minimum value not reached | 2 | -1 | -1
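A sketch of the corresponding inserts is shown below, assuming the result ids 1 and 2 from the sample values in 7.1.1.4 and the assumed ${STG_LDM_DB} placeholder for the staging database.

INSERT INTO ${STG_LDM_DB}.INCENTIVE_RESULT_TYPE
    (INCENTIVE_RESULT_ID, INCENTIVE_RESULT_DESC, ORG_KEY, ICID, LUID)
VALUES (1, 'Reduction Applied', '1', -1, -1);
INSERT INTO ${STG_LDM_DB}.INCENTIVE_RESULT_TYPE
    (INCENTIVE_RESULT_ID, INCENTIVE_RESULT_DESC, ORG_KEY, ICID, LUID)
VALUES (2, 'Reduction Not applied, minimum value not reached', '2', -1, -1);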

7.1.2 Offer.Product facet: LDM tables data load

Below is the list of tables that will be impacted for this facet in STG_LDM & PROD_LDM:

TABLE NAME | NEW/EXISTING | REFERENCE TABLE (Y/N) | ETL LOAD (Y/N) | COMMENTS
PRODUCT_ITEM_HIST | EXISTING | | | There is a structural change in the table; one new column is to be added.
PRODUCT_ITEM | EXISTING | | | One Time ETL Load
7.1.2.1 Existing Table: PRODUCT_ITEM_HIST (SET TABLE)

This is an existing table; one column will be added as part of Hybrid 2.0. The new column is SECONDARY_PRODUCT_ID, shown as the last row of the table below.

Column | Data Type | Description | NULL (Y/N) | Sample | Population Method | Indexes
PRODUCT_ID | INTEGER | | | 383047 | Lookup on PRODUCT.PRODUCT_NAME for source field priceplanid. |
PRODUCT_ITEM_ID | INTEGER | | | | Reference to PRODUCT_ITEM.PRODUCT_ITEM_ID. |
PRODUCT_ITEM_START_DT | DATE | | | 2013-12-01 | For a new combination of priceplanid and fbsid, insert a new record with sysdate. If the combination of <priceplanid,fbsid> for a particular price changes and a new value of price comes in, then a new record will be populated with the sysdate. Date format 'yyyy-mm-dd'. |
PERIOD_CD | CHAR(1) | | N | 0 | |
DISCOUNT_METHOD_CD | CHAR(1) | | N | X | |
TIER_LEVEL_CD | CHAR(1) | | N | X | |
ACTION_CD | CHAR(1) | | N | X | |
PRODUCT_ITEM_START_TM | TIME(0) | | N | 12/31/1899 | For a new combination of priceplanid and fbsid, insert a new record with systime. If the combination of <priceplanid,fbsid> for a particular price changes and a new value of price comes in, then a new record will be populated with the systime. |
PRODUCT_ITEM_END_DT | DATE | | | | For a new combination of priceplanid and fbsid, insert a new record with NULL. If the combination of <priceplanid,fbsid> for a particular price changes and a new value of price comes in, then a new record will be populated with NULL; the older record will be closed with sysdate. |
PRODUCT_ITEM_END_TM | TIME(0) | | | | For a new combination of priceplanid and fbsid, insert a new record with NULL. If the combination of <priceplanid,fbsid> for a particular price changes and a new value of price comes in, then a new record will be populated with NULL; the older record will be closed with sysdate. |
PRODUCT_ITEM_CHARGE_TYPE_CD | CHAR(1) | | | X | |
PRODUCT_ITEM_MONETARY_AMT | DECIMAL(18,6) | | | | Source - Price field |
MONETARY_UNIT_OF_MEASURE_CD | CHAR(4) | | | EUR* | |
PRELIMINARY_CHARGE | DECIMAL(12,5) | | | NA | NA |
UNIT_DURATION | INTEGER | | Y | NA | |
DURATION_UNIT_OF_MEASURE_CD | CHAR(4) | | Y | NULL | |
ICID | SMALLINT | | | | Through ETL Process |
LUID | SMALLINT | | | | Through ETL Process |
SECONDARY_PRODUCT_ID | INTEGER | | | | Lookup on PRODUCT.PRODUCT_NAME for source field fbs. |

Note: The changes need to be made in both the staging and the target database.
* Verify against sample data whether the value will be EUR or EURC.
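For illustration, adding the new column could be done with a statement like the sketch below; the ${STG_LDM_DB} qualifier is an assumed placeholder, and the real DDL scripts follow the environment's naming conventions.

-- Sketch: add the new column in staging; repeat for the target database
ALTER TABLE ${STG_LDM_DB}.PRODUCT_ITEM_HIST
    ADD SECONDARY_PRODUCT_ID INTEGER;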

7.1.2.2 Data Backup: Configure Backup facet Script

The generic facet backup script will have to be extended for all the impacted tables for this facet.
This facet backup script backs up the tables in the STG_LDM database to the STG_BACKUP database on a daily basis.
No change will be required in the facet backup shell script. However, there will be entries for all the impacted tables in the control table STG_DBA.BACKUP_FACET. Sample data from the BACKUP_FACET table is given below for reference.
DATABASENAME | DATABASENAME_PRE | DATABASENAME_POST | TABLENAME | FACET | BACKUPNAME | EXC | DATA_IND | CREATE_LKP_IND | MAX_VALUE_IND | MAX_KEY | VR_IND | PARTITION_KEY | PRIMARY_KEY
STG_LDM | STG | LDM | PRODUCT_ITEM_HIST | OFFER_HB | H_B_PRODUCT_ITEM_HIST | NULL | HIST | | | | | | PRODUCT_ID;PRODUCT_ITEM_ID;PRODUCT_ITEM_START_DT;PERIOD_CD;DISCOUNT_METHOD_CD;TIER_LEVEL_CD;ACTION_CD
STG_LDM | STG | LDM | PRODUCT_ITEM | OFFER_HB | H_B_PRODUCT_ITEM | NULL | MAIN | | | | | | PRODUCT_ITEM_ID
7.1.2.3 One time script for PRODUCT_ITEM TABLE

A new script (insert_product_item.sql) will be created to insert the value related to Hybrid 2.0 into the PRODUCT_ITEM table.
The default value will be -1 for both ICID & LUID for all the one-time inserts into the tables mentioned below.
A one-time load will be done into the table through the ETL process.

PRODUCT_ITEM_ID | PRODUCT_ITEM_GROUP_CD | PRODUCT_ITEM_SALE_IND | PRODUCT_ITEM_NAME | PRODUCT_ITEM_DESC | PRODUCT_ITEM_PRICING_NAME | PRODUCT_ITEM_TYPE_CD | CREATION_SOURCE_TYPE_CD | ICID | LUID
MAX(PRODUCT_ITEM_ID)+1 | UNK | P | FLEX PRICE | FLEX PRICE | NULL | UNK | LEG | -1 | -1

7.1.2.4 One-to-One view (New)

The existing view of the PRODUCT_ITEM_HIST table needs to be refreshed in both the STG_VIEW and PROD_VIEW databases, as a new column has been added to the table.

7.1.2.5 One-to-One PL view (New)

The existing view of the PRODUCT_ITEM_HIST table needs to be refreshed in the PROD_PL_VIEW database, as a new column has been added to the table.
7.1.3 Miscellaneous facet: LDM tables data load

Below is the list of tables that will be impacted for this facet in STG_LDM & PROD_LDM:

TABLE NAME | NEW/EXISTING | REFERENCE TABLE (Y/N) | ETL LOAD (Y/N) | COMMENTS
CREATION_SOURCE_TYPE | EXISTING | Y | N | One Time Manual Load
7.1.3.1 Data Backup: Configure Backup facet Script

The generic facet backup script will have to be extended for all the impacted tables for this facet.
This facet backup script backs up the tables in the STG_LDM database to the STG_BACKUP database on a daily basis.
No change will be required in the facet backup shell script. However, there will be entries for all the impacted tables in the control table STG_DBA.BACKUP_FACET. Sample data from the BACKUP_FACET table is given below for reference.

DATABASENAME | DATABASENAME_PRE | DATABASENAME_POST | TABLENAME | FACET | BACKUPNAME | EXC | DATA_IND | CREATE_LKP_IND | MAX_VALUE_IND | MAX_KEY | VR_IND | PARTITION_KEY | PRIMARY_KEY
STG_LDM | STG | LDM | CREATION_SOURCE_TYPE | MISC_HB | H_B_CREATION_SOURCE_TYPE | NULL | REF | | | | | | CREATION_SOURCE_TYPE_CD
7.1.3.2 One time script for CREATION_SOURCE_TYPE TABLE

A new script (insert_creation_source_type.sql) will be created to insert the value related to Hybrid 2.0 into the CREATION_SOURCE_TYPE table.
The default value will be -1 for both ICID & LUID for all the one-time inserts into the tables mentioned below.

CREATION_SOURCE_TYPE_CD | CREATION_SOURCE_TYPE_NAME | CREATION_SOURCE_TYPE_DESC | ICID | LUID
MOS | MOSA | MOSA | -1 | -1
7.1.4 Campaign facet: LDM tables data load

Below is the list of tables that will be impacted for this facet in STG_LDM & PROD_LDM:

TABLE NAME | NEW/EXISTING | REFERENCE TABLE (Y/N) | ETL LOAD (Y/N) | COMMENTS
CAMPAIGN | EXISTING | N | Y |
CAMPAIGN_STRATEGY | EXISTING | Y | N | One Time Manual Load
7.1.4.1 Data Backup: Configure Backup facet Script

The generic facet backup script will have to be extended for all the impacted tables for this facet.
This facet backup script backs up the tables in the STG_LDM database to the STG_BACKUP database on a daily basis.
No change will be required in the facet backup shell script. However, there will be entries for all the impacted tables in the control table STG_DBA.BACKUP_FACET. Sample data from the BACKUP_FACET table is given below for reference.

DATABASENAME | DATABASENAME_PRE | DATABASENAME_POST | TABLENAME | FACET | BACKUPNAME | EXC | DATA_IND | CREATE_LKP_IND | MAX_VALUE_IND | MAX_KEY | VR_IND | PARTITION_KEY | PRIMARY_KEY
STG_LDM | STG | LDM | CAMPAIGN | CAMPAIGN_HB | H_B_CAMPAIGN | NULL | MAIN | | | | | CAMPAIGN_ID | CAMPAIGN_ID
STG_LDM | STG | LDM | CAMPAIGN_STRATEGY | CAMPAIGN_HB | H_B_CAMPAIGN_STRATEGY | NULL | DATA | | | | | | CAMPAIGN_STRATEGY_CD
7.1.4.2 One time load script for CAMPAIGN_STRATEGY Table

A new script (insert_campaign_strategy.sql) will be created to insert an entry for Hybrid 2.0 into the CAMPAIGN_STRATEGY table in the STG environment.
The default value will be -1 for both ICID & LUID for all the one-time inserts into the tables mentioned below.

CAMPAIGN_STRATEGY_CD | CAMPAIGN_STRATEGY_NAME | CAMPAIGN_STRATEGY_DESC | ICID | LUID
HCL | Hybrid Campaign List | Campaign List for Hybrid customers | -1 | -1
7.2 Data Backup

Database-level backup is done by the DBA team on a daily basis.

7.3 Data Retention and Purging

Data Retention
Database level: Online backup is taken at database level.
Table level: No history requirement.

Data Purging
Database level: Purging is not happening at database level.
Table level: Not in scope.
7.4 Statistics Gathering

Statistics need to be gathered on the columns identified as PI for the new tables introduced for the Hybrid 2.0 data load.
The control table COL_STAT_DATA needs to be configured in the PROD_DBA database.
The table below shows the data and column mapping for the table COL_STAT_DATA to gather statistics in the Staging database.

DATABASENAME | TABLENAME | COLUMNTYPE | COLUMNNAMELIST | SAMPLE_STATS | INTERVAL | JOBNAME | ACTIVE
${DB_ENV1}_LDM | INCENTIVE_EVENT | IDX | INCENTIVE_EVENT_ID | NULL | | REST |
${DB_ENV1}_LDM | INCENTIVE_RESULT_TYPE | IDX | INCENTIVE_RESULT_ID | NULL | | REST |
${DB_ENV1}_LDM | PRODUCT_ITEM_HIST | COL | PRODUCT_ID,SECONDARY_PRODUCT_ID | NULL | D | REST | Y

The table below shows the data and column mapping for the table COL_STAT_DATA to gather statistics in the Target database.

DATABASENAME | TABLENAME | COLUMNTYPE | COLUMNNAMELIST | SAMPLE_STATS | INTERVAL | JOBNAME | ACTIVE
${DB_ENV1}_LDM | INCENTIVE_EVENT | IDX | INCENTIVE_EVENT_ID | NULL | | DELTA |
${DB_ENV1}_LDM | INCENTIVE_RESULT_TYPE | IDX | INCENTIVE_RESULT_ID | NULL | | DELTA |

Note: ${DB_ENV1} will be replaced by STG/PROD.
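For illustration, the statements generated from these control entries would be along the lines of the sketch below. The actual statements are produced by the existing stats-gathering job; the entries are shown here as single-column statistics, though the PRODUCT_ITEM_HIST entry may equally define one multi-column statistic.

COLLECT STATISTICS ON ${DB_ENV1}_LDM.INCENTIVE_EVENT COLUMN (INCENTIVE_EVENT_ID);
COLLECT STATISTICS ON ${DB_ENV1}_LDM.INCENTIVE_RESULT_TYPE COLUMN (INCENTIVE_RESULT_ID);
COLLECT STATISTICS ON ${DB_ENV1}_LDM.PRODUCT_ITEM_HIST COLUMN (PRODUCT_ID);
COLLECT STATISTICS ON ${DB_ENV1}_LDM.PRODUCT_ITEM_HIST COLUMN (SECONDARY_PRODUCT_ID);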

7.5 ADQM

The KPI file is used for the purpose of reconciliation in ATLAS. The ADQM sub-system within Atlas provides a reconciliation mechanism between the supplying source systems and the Atlas LDM. Source systems calculate values for specific KPIs, and these values are then re-calculated within the Atlas LDM.
Each KPI file should be associated with a corresponding metadata file. As part of the Hybrid 2.0 project, there will be two KPI files:

 Priceplan Feed
 Discount Feed (HCL feed)

The existing load processes shall be re-used, and the KPIs generated from these queries loaded into the PROD_RECON.ATLAS_KPI_RECON table. The KPIs delivered by the source systems shall be loaded into PROD_RECON.SOURCES_KPI_RECON.
The ADQM sub-system consists of a number of scripts that execute SQL against the LDM and load equivalent data, calculated within Atlas rather than within the source system, into the LDM.
7.5.1 Values to be calculated

Both Source-generated and ADQM-generated KPI data have the same structure:

No. | Name | Type | Description
0 | SYSTEM | String(20) | Unique identifier per system
1 | KPI_ID | Number | Unique identifier per measurement within 1 system
2 | GROUP_LEVEL_1 | String(100) | Grouping level 1
3 | GROUP_LEVEL_2 | String(100) | Grouping level 2
4 | KPI_DATE | Date | Date for the measurement
5 | KPI_VALUE | Number | Value of the measurement for the specific date
6 | UOM | String | Unit of measurement

The values calculated within ADQM are described below. The following sections describe how each of the KPIs is to be calculated within Atlas against the LDM; they are the basis of the BTEQ scripts that are executed.
7.5.2 DETAILED KPI CALCULATIONS

For the new KPIs, reference data needs to be added in the following table.
The new KPI details need to be added in the PROD_RECON.REF_KPI_DESCR_RECON table.

SYSTEM | KPI_ID | KPI_DESCRIPTION | TARGET_DIFFERENCE | TARGET_DIFF_DATE | DEVIATION
MOSA | 5 | Number of records per Priceplan, Fbs | 1,00 | CURRENT DATE | 15,00
MOSA | 6 | Number of records per result code per campaign_id | 1,00 | CURRENT DATE | 15,00
7.5.3 Price Reference Feed

7.5.3.1 Number of records per Price plan, FBS
Get the total number of records from the following staging table:

 STG_VIEW.PRODUCT_ITEM_HIST [IE]

The following rules apply:

 Where IE.product_item_id = { use the lookup logic ETL will be using }
Note: This will be a hard-coded value with which the script needs to be modified during deployment, and the value for the product item id should be picked only after executing the one-time script (insert_product_item.sql) for the PRODUCT_ITEM table.
 Consider the latest active records: { PRODUCT_ITEM_END_DT is null }
 Apply aggregation per PRODUCT, SECONDARY PRODUCT ID (IE.PRODUCT_ID, IE.SECONDARY_PRODUCT_ID) level.

Column Name | Value
SYSTEM | MOSA
KPI_ID | 5
GROUP_LEVEL_1 | PRICEPLAN_ID
GROUP_LEVEL_2 | FBS_ID
KPI_DATE | Date (XLS.BATCH_DATE)
KPI_VALUE | Count(*)
UOM | Number
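A hedged sketch of the corresponding BTEQ query is given below, assuming the hard-coded product_item_id placeholder described above; the KPI_DATE derivation is simplified here to CURRENT_DATE, whereas the real scripts use XLS.BATCH_DATE per the table above.

SELECT 'MOSA'                   AS SYSTEM_NM      -- SYSTEM field of the KPI record
     , 5                        AS KPI_ID
     , IE.PRODUCT_ID            AS GROUP_LEVEL_1  -- priceplan product
     , IE.SECONDARY_PRODUCT_ID  AS GROUP_LEVEL_2  -- FBS product
     , CURRENT_DATE             AS KPI_DATE       -- simplified; real scripts use XLS.BATCH_DATE
     , COUNT(*)                 AS KPI_VALUE
FROM  STG_VIEW.PRODUCT_ITEM_HIST IE
WHERE IE.PRODUCT_ITEM_ID = { hard-coded id, picked after running insert_product_item.sql }
  AND IE.PRODUCT_ITEM_END_DT IS NULL              -- latest active records only
GROUP BY IE.PRODUCT_ID, IE.SECONDARY_PRODUCT_ID;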

7.5.4 Campaign Discount Feed

7.5.4.1 Number of records per Result Code, Campaign
Get the total number of records from the following staging tables:

 STG_VIEW.INCENTIVE_EVENT [IE]
 STG_VIEW.XL2_LOAD_STATUS [XLS]
 STG_VIEW.XL2_LOAD_STATUS_HIST [XLSH]

Apply join conditions (INNER/OUTER JOIN) as appropriate:

 Join [IE] to [XLS] via IE.ICID to XLS.ICID and IE.LUID to XLS.LUID (>= non-equi join).
 Join [XLS] to [XLSH] via XLS.BATCH_NR to XLSH.BATCH_NR (>= non-equi join).

The following rules apply:

 Consider those records from MOSA: { Creation_Source_Type_Cd = MOS }
 From XL2_LOAD_STATUS_HIST, take the latest batch data related to the Call centre interface: { XLSH.SOURCE = MOSA }
 Apply aggregation per RESULT CODE, CAMPAIGN (IE.INCENTIVE_RESULT_ID, IE.CAMPAIGN_ID) level.

Column Name | Value
SYSTEM | MOSA
KPI_ID | 6
GROUP_LEVEL_1 | INCENTIVE_RESULT_ID
GROUP_LEVEL_2 | CAMPAIGN_ID
KPI_DATE | Date (XLS.BATCH_DATE)
KPI_VALUE | Count(*)
UOM | Number
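As with the previous KPI, a hedged sketch of the aggregation is shown below. The exact non-equi join predicates and the latest-batch selection are defined in the existing ADQM scripts, so the join and WHERE clauses here are illustrative only.

SELECT 'MOSA'                   AS SYSTEM_NM      -- SYSTEM field of the KPI record
     , 6                        AS KPI_ID
     , IE.INCENTIVE_RESULT_ID   AS GROUP_LEVEL_1
     , IE.CAMPAIGN_ID           AS GROUP_LEVEL_2
     , XLS.BATCH_DATE           AS KPI_DATE
     , COUNT(*)                 AS KPI_VALUE
FROM  STG_VIEW.INCENTIVE_EVENT IE
JOIN  STG_VIEW.XL2_LOAD_STATUS XLS
  ON  IE.ICID >= XLS.ICID
 AND  IE.LUID >= XLS.LUID                         -- ">=" non-equi join per the rules above
JOIN  STG_VIEW.XL2_LOAD_STATUS_HIST XLSH
  ON  XLS.BATCH_NR >= XLSH.BATCH_NR               -- ">=" non-equi join per the rules above
WHERE IE.CREATION_SOURCE_TYPE_CD = 'MOS'          -- records from MOSA
  AND XLSH.SOURCE = 'MOSA'                        -- latest MOSA batch per the rules above
GROUP BY IE.INCENTIVE_RESULT_ID, IE.CAMPAIGN_ID, XLS.BATCH_DATE;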

8. Appendix

A. Key Decision Emails