
1. Which of the following Teradata utilities have you used?

Fastload

Multiload

TPump

BTEQ Import

Can you provide an example of when you would use each one?

What do you understand by the term "sessions" in this context?


2. The following data-load process is reported to be running extremely slowly. Given the table definition of the target table, and the SQL process
itself, can you make some recommendations as to how to speed up the load process?

TABLE DEFINITION

CREATE TABLE mytable
(
Sort_Code INTEGER NOT NULL,
Account_Number INTEGER NOT NULL,
Customer_Identifier DECIMAL(10,0),
Product_Identifier SMALLINT NOT NULL,
Telephone_Home DECIMAL(11,0),
Telephone_Mobile DECIMAL(10,0),
.
.
Balance DECIMAL(15,2) NOT NULL,
Overdraft_Limit DECIMAL(5,0)
)
PRIMARY INDEX (Sort_Code,Account_Number)
INDEX (Customer_Identifier)
INDEX (Product_Identifier)
INDEX (Telephone_Home)
INDEX (Telephone_Mobile)
;

SLOW PROCESS

DELETE FROM mytable ALL;

INSERT INTO mytable
SELECT …..
FROM source_table_1
;

INSERT INTO mytable
SELECT …..
FROM source_table_2
;
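One direction an answer might take (a sketch only, with the column lists elided exactly as in the question): the four secondary indexes must be maintained for every inserted row, and running the DELETE and the two INSERTs as separate requests forces transient-journal logging of each insert into a non-empty table. Dropping the indexes around the load and submitting the statements as one multi-statement request avoids much of that work:

```sql
/* Sketch: drop the secondary indexes so they are not maintained
   row by row during the load */
DROP INDEX (Customer_Identifier) ON mytable;
DROP INDEX (Product_Identifier) ON mytable;
DROP INDEX (Telephone_Home) ON mytable;
DROP INDEX (Telephone_Mobile) ON mytable;

/* Submitting DELETE and both INSERTs as a single multi-statement
   request keeps the whole load in one transaction, preserving the
   fast-path insert into an empty table */
DELETE FROM mytable ALL
;INSERT INTO mytable SELECT ….. FROM source_table_1
;INSERT INTO mytable SELECT ….. FROM source_table_2;

/* Rebuild the secondary indexes once, in bulk, after the load */
CREATE INDEX (Customer_Identifier) ON mytable;
CREATE INDEX (Product_Identifier) ON mytable;
CREATE INDEX (Telephone_Home) ON mytable;
CREATE INDEX (Telephone_Mobile) ON mytable;
```

Other defensible answers include questioning whether all four secondary indexes earn their keep, and whether the telephone columns should be indexed at all.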
3. Given the following tables and SQL process, write down the key steps of the query plan (i.e. the results of an EXPLAIN) that you would
expect Teradata to produce for the query.

CREATE TABLE mytable_1
(
Customer_Identifier DECIMAL(10,0) NOT NULL,
Customer_Name VARCHAR(50) NOT NULL,
County_Code CHAR(3) NOT NULL,
.
.
Customer_Type_Code CHAR(1) NOT NULL
COMPRESS ('N','O','P')
)
UNIQUE PRIMARY INDEX (Customer_Identifier)
;

Data Volume: 18 million rows

CREATE TABLE mytable_2
(
Country_Code CHAR(3) NOT NULL,
Country_Name VARCHAR(35) NOT NULL
)
UNIQUE PRIMARY INDEX (Country_Code)
;

Data Volume: 220 rows

Additionally, how would you ensure the query runs as expected?
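On the second part, one common answer is to collect statistics on the join columns so that the optimizer can see that mytable_2 is tiny and reliably chooses to duplicate its 220 rows to all AMPs rather than redistribute the 18-million-row table. A sketch:

```sql
/* Sketch: give the optimizer demographics for the join columns,
   then confirm the plan with EXPLAIN before running the query */
COLLECT STATISTICS ON mytable_1 COLUMN (County_Code);
COLLECT STATISTICS ON mytable_2 COLUMN (Country_Code);
```

Running EXPLAIN on the query afterwards and checking for the "duplicated on all AMPs" step is the usual way to verify the plan.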


4. The following data-items are to be held in a Teradata table. Using any financial system knowledge you have, propose appropriate data-types
for each of these, and also define any other column attributes that you believe would be appropriate. Justify your reasons for your answers.

                                          Can Contain  Can Be
                                          NULLs        Compressed  Compress
Data-Item                  Data-Type      (Y/N)        (Y/N)       Values    Justification For Answer
-------------------------  -------------  -----------  ----------  --------  ------------------------
Credit-Debit-Account-Code  CHAR(6)
Mortgage_Held_Indicator    CHAR(1)
Monthly_Debit_Count        SMALLINT
Monthly_Debit_Balance      DECIMAL(15,2)
Account_Opened_Date        DATE
Account_Closed_Date        DATE
Customer_Surname           VARCHAR(15)
Customer_Initials          CHAR(1)
5. Given the following data-definition and sample data, can you please write a piece
of SQL that will determine the balance of an account at the end of January 2006 and
the end of February 2006, and output the data in the following format:

Account-Number   January-2006-Balance   February-2006-Balance
--------------   --------------------   ---------------------
12345678         99.00                  103.00
87654321         0.00                   50.00

CREATE TABLE mytable_3
(
Account_Number INTEGER NOT NULL,
Balance_Amount DECIMAL(15,2) NOT NULL,
Balance_Start_Date DATE NOT NULL,
Balance_End_Date DATE NOT NULL
)
PRIMARY INDEX (Account_Number);

Sample-Data

Account_   Balance_   Balance_      Balance_
Number     Amount     Start_Date    End_Date
---------  ---------  ------------  ------------
12345678   123.00     21-12-2005    17-01-2006
12345678   132.00     18-01-2006    22-01-2006
12345678    99.00     23-01-2006    19-02-2006
12345678   103.00     20-02-2006    01-03-2006
Etc.
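One possible answer (a sketch, assuming each account has exactly one balance period covering each month-end date) pivots the two dates into columns with conditional aggregation:

```sql
SELECT Account_Number
     , MAX(CASE WHEN DATE '2006-01-31' BETWEEN Balance_Start_Date
                                           AND Balance_End_Date
                THEN Balance_Amount END) AS January_2006_Balance
     , MAX(CASE WHEN DATE '2006-02-28' BETWEEN Balance_Start_Date
                                           AND Balance_End_Date
                THEN Balance_Amount END) AS February_2006_Balance
FROM mytable_3
GROUP BY Account_Number;
```

Against the sample rows for account 12345678, 2006-01-31 falls in the 23-01-2006 to 19-02-2006 period (99.00) and 2006-02-28 in the 20-02-2006 to 01-03-2006 period (103.00), matching the required output.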
6. Given the DDL below, and the following statements, choose what you believe to be
the best Primary Index for the table, and state why you have chosen this.

CREATE TABLE mytable_4
(
Customer_Identifier DECIMAL(10,0) NOT NULL,
Summary_Date DATE NOT NULL,
Product_Identifier SMALLINT NOT NULL,
Average_Credit_Balance DECIMAL(15,2) NOT NULL,
Average_Debit_Balance DECIMAL(15,2) NOT NULL,
Net_Interest_Revenue DECIMAL(9,2) NOT NULL
)

• The primary key of the table is Customer_Identifier, Summary_Date and Product_Identifier.

• Many queries that use this table will be looking to calculate the total balance
and net-interest-revenue for each customer per month.

• Many users will look to use the data above with additional customer attributes
from other customer tables that will be indexed on Customer-Identifier.
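For reference, one defensible answer, given the join workload described, is a non-unique primary index on Customer_Identifier alone, so the table hashes to the same AMPs as the other customer tables:

```sql
/* Sketch: NUPI on Customer_Identifier keeps joins to other tables
   indexed on Customer_Identifier AMP-local, and per-customer
   aggregations run without redistribution.  Uniqueness of the full
   primary key would then need enforcing elsewhere, e.g. via the
   load process or a unique secondary index. */
CREATE TABLE mytable_4
( …columns as above… )
PRIMARY INDEX (Customer_Identifier);
```

The trade-off to discuss is the row count per PI value (one row per customer per product per summary date), which affects hash-synonym chain length.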
7. Using data from the tables below, write a piece of SQL that will work out for every
account in the ACCOUNTS_TABLE the number of credit and debit transactions per
account during September 2006.

CREATE TABLE accounts_table
(
Account_Number INTEGER NOT NULL,
Sort_Code INTEGER NOT NULL,
Summary_Date DATE NOT NULL,
Average_Credit_Balance DECIMAL(15,2) NOT NULL,
Average_Debit_Balance DECIMAL(15,2) NOT NULL
)
PRIMARY INDEX (Account_Number,Sort_Code);

Sample-data:

Account    Sort     Summary      Average Credit   Average Debit
Number     Code     Date         Balance          Balance
---------  -------  -----------  ---------------  --------------
12345678   404775   2006-09-30   123.30           0.00
23456789   404775   2006-09-30   0.00             -5.47
34567890   404775   2006-09-30   20.00            0.00

CREATE TABLE credit_trans
(
Account_Number INTEGER NOT NULL,
Sort_Code INTEGER NOT NULL,
Transaction_Date DATE NOT NULL,
Transaction_Type CHAR(3) NOT NULL,
Transaction_Amount DECIMAL(9,2) NOT NULL
)
PRIMARY INDEX (Account_Number,Sort_Code);

Sample Data:

Account    Sort     Transaction   Transaction   Transaction
Number     Code     Date          Type          Amount
---------  -------  ------------  ------------  ------------
12345678   404775   2006-09-16    C17           23.00
12345678   404775   2006-09-22    C19           1046.18
CREATE TABLE debit_trans
(
Account_Number INTEGER NOT NULL,
Sort_Code INTEGER NOT NULL,
Transaction_Date DATE NOT NULL,
Transaction_Type CHAR(3) NOT NULL,
Transaction_Amount DECIMAL(9,2) NOT NULL
)
PRIMARY INDEX (Account_Number,Sort_Code);

Sample Data:

Account    Sort     Transaction   Transaction   Transaction
Number     Code     Date          Type          Amount
---------  -------  ------------  ------------  ------------
12345678   404775   2006-09-16    D06           -44.00
12345678   404775   2006-09-24    D22           -232.17
12345678   404775   2006-09-29    D23           -32.05
23456789   404775   2006-09-02    D06           -44.00
23456789   404775   2006-09-09    D22           -22.17
23456789   404775   2006-09-11    D23           -2.05
23456789   404775   2006-09-18    D23           -30.00

Output required:

Account_    Sort_    Number_Credit_   Number_Debit_
Number      Code     Transactions     Transactions
----------  -------  ---------------  --------------
12345678    404775   2                3
23456789    404775   0                4
34567890    404775   0                0
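One possible answer (a sketch; outer joins are needed so that accounts with no transactions still appear with zero counts, as in the required output):

```sql
SELECT a.Account_Number
     , a.Sort_Code
     , COALESCE(c.Tran_Count, 0) AS Number_Credit_Transactions
     , COALESCE(d.Tran_Count, 0) AS Number_Debit_Transactions
FROM accounts_table a
LEFT JOIN (SELECT Account_Number, Sort_Code, COUNT(*) AS Tran_Count
           FROM credit_trans
           WHERE Transaction_Date BETWEEN DATE '2006-09-01'
                                      AND DATE '2006-09-30'
           GROUP BY 1,2) c
  ON  a.Account_Number = c.Account_Number
  AND a.Sort_Code      = c.Sort_Code
LEFT JOIN (SELECT Account_Number, Sort_Code, COUNT(*) AS Tran_Count
           FROM debit_trans
           WHERE Transaction_Date BETWEEN DATE '2006-09-01'
                                      AND DATE '2006-09-30'
           GROUP BY 1,2) d
  ON  a.Account_Number = d.Account_Number
  AND a.Sort_Code      = d.Sort_Code;
```

Pre-aggregating in the derived tables before joining keeps each account to one row per side and avoids double counting.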
8. Give some examples of when you have or would use:

• Volatile tables

• Global temporary tables

• Permanent tables
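For illustration, the three kinds differ mainly in where the definition lives and how long rows persist; hedged sketches (table and column names invented):

```sql
/* Volatile: definition and rows exist only for this session,
   in spool space; typical scratch table inside one job */
CREATE VOLATILE TABLE vt_work AS
  (SELECT Account_Number FROM accounts_table)
WITH DATA
ON COMMIT PRESERVE ROWS;

/* Global temporary: definition is held permanently in the data
   dictionary, but each session materialises its own private copy
   of the rows; useful for a shared multi-step process */
CREATE GLOBAL TEMPORARY TABLE gtt_work
( Account_Number INTEGER NOT NULL )
ON COMMIT PRESERVE ROWS;
```

Permanent tables are the ordinary CREATE TABLE case already shown throughout this paper: shared data that must survive across sessions.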
9. Given the following data definition:

CREATE TABLE my_stats_table
(
Customer_Identifier DECIMAL(10,0) NOT NULL,
Branch_Identifier INTEGER NOT NULL,
Total_VAPM_Value DECIMAL(9,2) NOT NULL
)
UNIQUE PRIMARY INDEX (Customer_Identifier)
;

Sample Data:

Customer     Branch       Total_VAPM_
Identifier   Identifier   Value
-----------  -----------  ------------
1002         204505       11.11
1004         204505       -65.14
1008         204511       123.45

Write a piece of SQL to split the customers into Total_VAPM_Value decile bands.

Write a second piece of SQL to determine the top-10 customers, in Total_VAPM_Value terms, at each branch.
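Possible answers, as sketches (Teradata's QUANTILE function numbers the bands 0 to 9; a RANK-based formulation would also be accepted):

```sql
/* Decile bands across all customers by Total_VAPM_Value */
SELECT Customer_Identifier
     , Total_VAPM_Value
     , QUANTILE(10, Total_VAPM_Value) AS Decile_Band
FROM my_stats_table;

/* Top 10 customers per branch by Total_VAPM_Value */
SELECT Branch_Identifier
     , Customer_Identifier
     , Total_VAPM_Value
FROM my_stats_table
QUALIFY RANK() OVER (PARTITION BY Branch_Identifier
                     ORDER BY Total_VAPM_Value DESC) <= 10;
```

A follow-up worth discussing is tie handling: RANK can return more than 10 rows per branch when values tie at the boundary, whereas ROW_NUMBER would cut arbitrarily at exactly 10.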
10. Users have been complaining about the negative performance impacts when using
the following table – the DDL is provided below:

CREATE MULTISET TABLE mytable_5
(
Customer_Identifier DECIMAL(10,0) NOT NULL,
Summary_Date DATE NOT NULL,
Main_Sort_Code INTEGER NOT NULL,
Main_Account_Number INTEGER NOT NULL,
Barclays_Risk_Grade CHAR(2) NOT NULL,
Customer_Name CHAR(100) NOT NULL,
Date_Of_Birth DATE NOT NULL,
Date_Became_Customer DATE NOT NULL,
Relationship_Manager_Id INTEGER NOT NULL,
Relationship_Manager_Name CHAR(100) NOT NULL,
Relationship_Manager_Team CHAR(100) NOT NULL,
Relationship_Manager_Area CHAR(100) NOT NULL,
Relationship_Manager_Region CHAR(100) NOT NULL
)
UNIQUE PRIMARY INDEX (Customer_Identifier,Summary_Date)
;

Here are some facts about the table:

• There are 10 million customer records per month.

• The table stores 36 months' worth of data.

• Less than 2% of customers join the bank per month.

• Less than 1% of customers leave the bank per month.

• Less than 3% of customers’ details change on a regular basis.

• Users join this table to other customer-level tables.

Please make some recommendations about how the performance of queries against
the table could be improved.

What other considerations would need to be made?
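By way of illustration, two recommendations an answer might develop are (a) partitioning by Summary_Date so that single-month queries scan one partition rather than all 360 million rows, and (b) normalising the wide, slowly changing name/relationship-manager columns into their own tables, replacing fixed CHAR(100) with VARCHAR. A sketch of the repartitioned table (the new table name and date range are invented for illustration):

```sql
/* Sketch: monthly partitioning; Summary_Date is part of the UPI,
   so the index can remain unique on the partitioned table.  The
   CHAR(100) descriptive columns move to customer- and RM-level
   tables keyed on Customer_Identifier / Relationship_Manager_Id. */
CREATE MULTISET TABLE mytable_5_ppi
(
Customer_Identifier DECIMAL(10,0) NOT NULL,
Summary_Date DATE NOT NULL,
Main_Sort_Code INTEGER NOT NULL,
Main_Account_Number INTEGER NOT NULL,
Barclays_Risk_Grade CHAR(2) NOT NULL,
Relationship_Manager_Id INTEGER NOT NULL
)
UNIQUE PRIMARY INDEX (Customer_Identifier,Summary_Date)
PARTITION BY RANGE_N (Summary_Date BETWEEN DATE '2004-01-01'
                      AND DATE '2006-12-31'
                      EACH INTERVAL '1' MONTH);
```

Considerations to raise include how joins on Customer_Identifier behave once the table is partitioned, the cost of repopulating or migrating 36 months of history, keeping statistics current, and how the monthly purge of the oldest month is handled (ALTER TABLE to drop a partition versus a mass DELETE).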
