
DATASTREAM DATA LOADER

CUSTOMER SPECIFICATION
INVESTMENT & ADVISORY DATAFEEDS

REVISION HISTORY
Name           Date        Version  Summary of changes
Paul Bacon     March 2009  1.0      General update & redesign
Paul Bacon     July 2010   1.5      Update and corrections
Gareth Izzard  Dec 2012    2.0      General update & redesign

DDL Customer Specification Guide



CONTENTS
About this document
  Intended Readership
  In This Guide
Chapter 1  Support
  Contacting the Helpdesk
  Service Alerts
  Datastream Data Alerts
Chapter 2  DDL Operational Hours
  The DDL Week
  Historical File Processing
  Accepted Usage
Chapter 3  Hardware / Software Requirements
Chapter 4  Other Resources for the DDL User
Chapter 5  Connecting To The DDL FTP Server and Communications
Chapter 6  FTP Account
Chapter 7  Naming Conventions
  Request File
  Validation Output Files
  Free Form Names
Chapter 8  What Happens When a Request File is Uploaded?
Chapter 9  Request File Structure
  Request File Facts
  Request File Template
  Logon Record
  Global Level Format Record
  Specific Data Channel Formatting
  Extraction Level Records
  Trigger Record
  Extraction Level Format Record
  Request Type Record
  Datatype Record
  Instrument Record
Chapter 10  General Request File Information
  Extract Sizing
Chapter 11  Standard DDL Output Formats
  Default Format – Static Request
  Default Format – Time Series Request
Chapter 12  Examples of Formatting Output Options
  Static Requests
  Time Series Requests
Chapter 13  System Thresholds
Chapter 14  Error / Warning Messages
Chapter 15  Functions & Expressions
Chapter 16  Using the Weekend Extracts
Chapter 17  Handling Price History Adjustments & Corporate Actions


ABOUT THIS DOCUMENT


This document outlines the information clients need to access and use the Datastream Data Loader (DDL)
Custom service.
If you have any queries regarding DDL, please contact the local Thomson Reuters representative who provided you
with this document. If this is not possible, contact our regional Helpdesks.

INTENDED READERSHIP
This user guide is for DDL users who create and amend DDL request files and require background information on
using the DDL service.

IN THIS GUIDE
This guide details the information end users of DDL need to design and build their request files successfully. It
also aims to assist with fault-finding and provides links to other reference sites. The DDL server is set to London
local time, and all timings within this guide reflect that.


CHAPTER 1 SUPPORT
CONTACTING THE HELPDESK
The Thomson Reuters Helpdesk can be contacted by region on the telephone numbers below:

European Tel. No.
Austria 0800 234 805
Belgium 0800 80 928
France 0800 97 0234
Germany 0800 7388 3771
Italy 800 929 080
Luxembourg 800 239 89 (French IVR)
Luxembourg 800 239 90 (German IVR)
Portugal 800 206 953
Spain 900 81 1027
Switzerland 0800 00 1509 (French IVR)
Switzerland 0800 00 1535 (German IVR)
Switzerland 0800 00 1529 (Italian IVR)
United Kingdom 0870 1910581
Other countries +44 20 3229 0644

United States Tel. No.
United States 1-888-888-1082
United States +1 (646) 822 2777

Asia-Pacific Tel. No.
Australia 1800 630 128
China (Mandarin) 4008811408
Hong Kong 852 2524 0077
India 000 800 100 7376
Japan +81 3 4589 2424
Malaysia 1800 814 158
New Zealand 0800 738 837
Philippines 1800 1855 0002
Singapore 1800 776 7188
Taiwan 0080 185 5287

Latam Tel. No.
Argentina 0800 288 9999
Aruba 5411-5554-7391
Bolivia 800 10 0277
Brazil 0800 891 7872
Chile 800 80 0058
Colombia 01800 944 2979
Costa Rica 0800 011 0849
Ecuador 1866-222-0650
El Salvador 800-6122
Guatemala 1866-222-2581
Guyana 5411-5554-7391
Honduras 1866-222-2605
Mexico 01800 123 0162
Neth Antilles 001 800 898 4679
Nicaragua 001 800 044 0075
Panama 001 800 898 4679
Paraguay 0800 11 45 28
Peru 0800 51 828
St Marteen 5411-5554-7391
Uruguay 000 411 009 3079
Venezuela 0800 100 4242

If you would prefer to email the desks please use the Thomson Reuters Customer Zone service by registering at
https://customers.reuters.com/Home/Default.aspx using the “Contact Us” link and selecting Datastream as the
Product.

When raising queries with the support teams relating to a specific DDL issue or extraction question, please include
details of the file in question (the extraction ID) and your three digit DDL FTP account ID. This ensures any second
and third level teams have the information required to begin investigations quickly.


SERVICE ALERTS
If there are any issues with our network or services, you need to know as soon as possible. Our Service Alerts, which
are free to Thomson Reuters customers, will send you updates via Thomson Reuters Messenger or email
so you know what's going on straight away. And because they can be tailored to the Thomson Reuters services
you use, you can make sure you only receive alerts that matter to you.

Setting up Product Service Alerts

For New Users

• Register on the Thomson Reuters Customer Zone at
https://customers.reuters.com/Home/Default.aspx
• Go to Service Alerts, which can be found from Home>Support>Technical Support>Service Alerts, or directly from
https://customers.reuters.com/support/ServiceAlerts
• Complete the three steps of the Subscription Wizard:
  Select your preferred Region.
  Select your preferred Role.
  Amend your categories accordingly (Datastream can be found in Investment & Advisory Solutions>Investment
  Management).
• Select receipt options from the Delivery link (email, text, Reuters Messaging) and then select “Update” to save your
preferences.

For Existing Subscribers

Go to Edit Subscriptions. The Datastream options are found at:
• Investment & Advisory Solutions
  o Investment Management


DATASTREAM DATA ALERTS

Overview
The email alerts system, an extension to the Thomson Reuters Datastream Extranet, allows you to sign up to receive
emails when:

• A ‘Data Alert’ matching your specified search criteria is created on the Extranet.
Data Alerts are added to the Extranet when an issue is identified with a particular data set – for example, where
equity data for a particular market is late because of supplier difficulties. The Data Alerts search interface allows you
to specify which data categories you are interested in and key words to search for in the description.

• A ‘Content & Product Update’ matching your specified search criteria is created on the Extranet.
Content & Product Updates are added to the Extranet when new content is added for a particular data set, or where
there is a significant change to series definitions – for example, where a supplier of economics data announces they
plan to make major revisions to their series. The Content & Product Update search interface again allows you to
specify which data categories you are interested in and key words to search for in the description.

SIGNING UP FOR THE EMAIL ALERTS SERVICE


In order to access the Thomson Reuters Datastream Email Alerts services you need a valid Extranet logon
ID. If you do not currently have a logon ID, access the registration page found here:
http://extranet.datastream.com/extranetregistration/registrationpage.aspx

CREATING ALERTS
Alerts are created based on the Search criteria you specify via the Content & Product Update, Data Alerts and
Infostream search forms.

The process of setting up an Alert is the same for all three search forms, namely:
• Enter a search in the relevant search form
• Check the “Add to My Alerts” box on the Search results page
• Name the Alert
• Click the “Add to My Alerts” button.

A full guide with examples of creating, amending and disabling alerts can be found on the Datastream Extranet site
at the following link:
http://extranet.datastream.com/DataAlertsDocuments/Email%20Alerts%20user%20guide%20v3.pdf


CHAPTER 2 DDL OPERATIONAL HOURS

The DDL FTP server, where client output files are stored, is available 24 hours a day, seven days a week. However,
the DDL processing engine is operational only from Monday to Saturday.

THE DDL WEEK


The DDL week starts at 9:00 am UK time on a Monday. Processing of extracts continues through to 8:00 am the
following day – this period is known as “DDL Monday”. The same pattern repeats throughout the week, with each
DDL day starting at 9:00 am and running until 8:00 am the next morning, through to 8:00 am on Saturday, when final
daily processing for “DDL Friday” ends.

This “off-set” of day start and end times is deliberate; it ensures data for all regions is captured within a single
24-hour period.

Weekend processing requires an additional process to be enabled on each FTP account. If you would like to enable
weekend processing please contact your local Thomson Reuters support staff.


HISTORICAL FILE PROCESSING


The processing of large historical files for seeding a database or refreshing history should be undertaken during a
“weekend process”. If users require access to weekend processing or need assistance in creating large historical
downloads, they should contact their Thomson Reuters Relationship Manager.

ACCEPTED USAGE
For the benefit of all users, clients are requested not to place a large number of extractions on the same trigger, but
space them out during the DDL day.

We advise clients not to request large data downloads on triggers just before the housekeeping window (0730 &
0800), as this will impact processing for all users.

If you have any questions relating to the above, please contact your local Helpdesk.


CHAPTER 3 - HARDWARE/SOFTWARE REQUIREMENTS

In line with Thomson Reuters batch FTP products there are no specific hardware or software requirements as long
as the customer system can:

• Connect to the DDL FTP server using an FTP client.


• Issue appropriate ID and Password details on FTP logon.
• Create the ASCII formatted request file (e.g. using MS Notepad).
• Connect to and download the data extract file from the DDL FTP server.
• Decompress the ZIP-compressed CSV output file (PKZIP is the compression software used).


CHAPTER 4 – OTHER RESOURCES FOR THE DDL USER

Datastream Extranet - http://extranet.datastream.com/


The main resource for DDL users is the Datastream Extranet. This web site has a huge amount of
information related to Datastream content (registration is required). From here users can search for codes and find
out about new data, data delays, and product & content news.

DDL Extranet Homepage - http://extranet.datastream.com/User%20Support/PubDoc/DDL.htm


This part of the Datastream Extranet stores documentation for Datastream Data Loader; here users can find sample
request files, the latest Trigger lists, and a request builder.

Datastream Navigator - http://product.datastream.com/Navigator/Seriessearch.aspx


This web site provides users with access to an intelligent code look-up facility, normally found within the Advance &
AFO Excel add-ins.

Datastream Datatype look up - http://product.datastream.com/Navigator/Datatypesearch.aspx?


Look up facility for Datastream datatypes and definitions.


CHAPTER 5 - CONNECTING TO THE DDL FTP SERVER AND COMMUNICATIONS

There are several connection methods available to users:

• Via Internet
• Connection to Thomson Reuters private network TF1.Net - leased circuit
• Connection to Thomson’s VPN LANAS - leased circuit
• BT MPLS

Access to the DDL server is IP restricted and therefore a static WAN side IP address is required to be registered with
the Thomson Reuters Network Security team.

Clients changing locations / servers will need to register each different WAN IP – 2-3 working days are currently
needed to register changes on the Thomson Reuters corporate firewall.

Clients connect to the DDL FTP server using the DNS name: uktfmonddlp1.datastream.com

A user's ID and Password can be presented in one string using the following format:
ftp://<DDL ID>:<PASSWORD>@uktfmonddlp1.datastream.com/<DDL ID>
For example:
ftp://ABC:Password1@uktfmonddlp1.datastream.com/ABC
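For clients scripting their connection, the logon string above can be assembled and used with a standard FTP client. Below is a minimal, illustrative Python sketch; the helper names are our own, and an actual connection additionally requires your WAN IP to be registered as described above.

```python
from ftplib import FTP

DDL_HOST = "uktfmonddlp1.datastream.com"

def build_ftp_url(ddl_id: str, password: str) -> str:
    """Assemble the single-string logon URL in the format shown above."""
    return f"ftp://{ddl_id}:{password}@{DDL_HOST}/{ddl_id}"

def connect(ddl_id: str, password: str) -> FTP:
    """Open an FTP session to the DDL server (requires a registered WAN IP).

    Changing into the account folder after login is an assumption based on
    the /<DDL ID> path at the end of the logon URL above.
    """
    ftp = FTP(DDL_HOST)
    ftp.login(user=ddl_id, passwd=password)
    ftp.cwd(ddl_id)
    return ftp
```

For example, `build_ftp_url("ABC", "Password1")` reproduces the sample logon string shown above.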

The DDL FTP server will disable access to an account if an incorrect password is input more than 5 (five) times in a
30-minute period. If your account does become disabled for this reason, please contact your local Helpdesk, or raise
the issue via https://customers.reuters.com/Home/, providing your source IP address and DDL FTP account ID.
Executives on the Helpdesk can place request files into your account or send you output files by e-mail while the
FTP account is re-set.

The DDL system & basic operational process can be seen in the simplified diagram below.


CHAPTER 6 - FTP ACCOUNT

All DDL client accounts are structured the same way. Each account has a 3 character name and a randomly
generated password. If you do not know the ID or Password, please contact your Thomson Reuters Relationship
Manager.

The FTP account consists of the following folders:

• Request – This folder contains the request file that is to be processed. Note that only one request file can be in
this folder.
• Internal Def – This is where perpetual request files are held. Clients should contact their Thomson Reuters
Relationship Manager or the Helpdesk to get files put in here, as access is not granted to customers.
• Validation Output – Shortly after a request file has been uploaded to the Request folder, a report is produced to
advise the user whether the request file is valid or not. This .rpt file will also list any invalid instruments used.
• Send – When processing of a request has been completed, output files will be available in this folder.
• Sizing Output – This folder provides basic sizing information about the extractions within the request folder
(dates, number of data points, datatypes, instruments requested, etc.).

The Request and Send folders have up to 5 sub-folders; each sub-folder is named after the date the request file
was processed and contains one day's worth of output data.


CHAPTER 7 - NAMING CONVENTIONS

When using Datastream Data Loader there are a number of naming conventions for different files with which the
user will need to become familiar.

REQUEST FILE:
The DDL Request file contains all of the information required for an output to be generated. These files are custom
built so that users get exactly the data they require at the right time.

Every DDL request file has the same naming convention: DD4ccc.req, where [ccc] is your 3 character FTP
account ID.
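The naming rule can be captured in a small helper. This is an illustrative sketch only; the alphanumeric check is an assumption, since the specification states only that the account ID is three characters.

```python
def request_filename(account_id: str) -> str:
    """Return the DD4ccc.req name for a three-character FTP account ID."""
    # Assumption: account IDs are alphanumeric (the spec only says "3 character").
    if len(account_id) != 3 or not account_id.isalnum():
        raise ValueError("DDL FTP account IDs are three characters")
    return f"DD4{account_id}.req"
```

For example, `request_filename("ABC")` yields `DD4ABC.req`, matching the convention above.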

VALIDATION OUTPUT FILES:

DD4ccc.rpt – This is a report file produced shortly after receipt of a new request file on the FTP server; see section 6
for more information.

Output files within the Send folder:

The default option for output file names is to use the date of extraction and the extract number, defined below by
[mmdd] and [nn] accordingly.

When an output is produced, 4 files are returned to the Send folder: a .CSV, .ERR, .LOG & .ZIP. These are
detailed below.

• [mmdd].[nn].csv – This is the data output file.
• [mmdd].[nn].err – This is the error file. Error details can be found in Chapter 14.
• [mmdd].[nn].log – This is the output file log, showing the details of what was processed for each request
within the extract and how many values were output for each datatype.
• [mmdd].[nn].zip – This is a pkzip compressed version of the CSV file only. This is the last file to be added to
the Send folder when the output data is being sent to the FTP server.
Clients should look for the availability of this file when polling the FTP server.
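Because the .zip is the last file written, clients polling the Send folder can treat its presence as a completion marker. A minimal, illustrative sketch of that check; in practice the listing would come from an FTP directory listing, and the names below are examples only.

```python
def completed_extracts(send_listing):
    """Return the [mmdd].[nn] stems whose .zip completion marker is present."""
    return sorted(
        name[:-len(".zip")]
        for name in send_listing
        if name.lower().endswith(".zip")
    )

# Example listing: extract 1208.01 is complete; 1208.02 is still being written,
# because its .zip has not yet appeared.
listing = ["1208.01.csv", "1208.01.err", "1208.01.log", "1208.01.zip",
           "1208.02.csv", "1208.02.err"]
```

Here `completed_extracts(listing)` reports only `1208.01` as safe to download.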


FREE FORM FILE NAMES:

Clients have the ability in DDL to dictate part of the output file name themselves. This facility is switched on by
requesting the {FORMAT} option SRVRFILEPREFIX=. The files that this option affects are those found in the Send
folder.

More information about the format and placing of this option in the request file can be found in the Request File
Structure chapter of this document.

Clients can include the current DDL date in their own free form file name by using the following tokens:

%D = Two character day value
%M = Two character month value
%C = Two character century value
%Y = Two character year value

For example: SRVRFILEPREFIX=%D%M%Y_EQUITY would create an output file with the name
091208_EQUITY.[nn] for extracts generated on 9th December 2008.
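The token substitution can be sketched as follows. This is an illustrative helper for predicting the file name DDL will produce, not part of DDL itself.

```python
from datetime import date

def expand_prefix(prefix: str, run_date: date) -> str:
    """Substitute the %D/%M/%C/%Y tokens documented above."""
    tokens = {
        "%D": f"{run_date.day:02d}",          # two-character day
        "%M": f"{run_date.month:02d}",        # two-character month
        "%C": f"{run_date.year // 100:02d}",  # two-character century
        "%Y": f"{run_date.year % 100:02d}",   # two-character year
    }
    for token, value in tokens.items():
        prefix = prefix.replace(token, value)
    return prefix
```

For 9th December 2008, `expand_prefix("%D%M%Y_EQUITY", date(2008, 12, 9))` reproduces the `091208_EQUITY` example above.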


CHAPTER 8 - WHAT HAPPENS WHEN A REQUEST FILE IS UPLOADED

When a client uploads a request file to the DDL FTP server two processes occur straight away. These processes
are designed to help customers identify any potential problems with the syntax or amount of data being requested in
each Extract. The output of the two processes is made available to clients in an .rpt file which can be found in the
Validation Folder of the DDL account being used.

The first process checks the sizing of the file. DDL has two processing limitations: time and disk space. A sizing
algorithm calculates these based on the information in the request file. If an extract exceeds either of these limits it
will be rejected. The second process checks the request file for any signs of syntax errors, mixed frequencies,
invalid instruments or datatypes in each extract. The error messages contained in the second part of the .rpt report
are listed in Chapter 14.

Errors, either syntactical or sizing, that cause an Extract to be rejected are shown in the first part of the .rpt file.
Clients can use keywords to check the file for rejected extracts. This part of the report file also tells clients the
number of instruments and datatypes in each Extract. An example of the first part of the report file can be seen
below:

DD4VAL Data Download: Validation Summary - ABC Date: 09/09/2012
--------------------------------------- Time: 19:00:15
Extraction Definition 01 0900
Seq 1 Static Datatypes: 20 Stocks: 190
Extraction Definition 02 1245
Seq 1 Time-Series Datatypes: 8 Stocks: 442
Seq 1 Static Datatypes: 20 Stocks: 442
Extraction Definition XX WRONG
REJECTED – SYNTAX

The report file (.rpt) is expected to be available to the customer within ten minutes of the client uploading their
request file to the FTP server. Very large request files may take slightly longer than the stated ten minutes.

The Extraction Definition line includes the extract number and the triggers requested. The sequence line (shown by
keyword Seq) shows the type of request and number of instruments and datatypes requested. An extraction
definition line with the keyword ‘WRONG’ shows that extract has been rejected. The line directly beneath this shows
the reason, either syntax or sizing. If the reason is to do with syntax then further information can be found by looking
at the error messages in the second part of the file for that extract.
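The keyword structure above lends itself to a simple scan. Below is an illustrative sketch that flags rejected extracts in a validation report; it assumes only the line layout shown in the example report, and the function name is our own.

```python
def rejected_extracts(rpt_text: str):
    """Scan a .rpt validation summary for extracts flagged WRONG/REJECTED."""
    rejected = {}
    current = None
    for line in rpt_text.splitlines():
        line = line.strip()
        if line.startswith("Extraction Definition"):
            current = line.split()[2]   # the two-character extract ID
        elif line.startswith("REJECTED") and current is not None:
            rejected[current] = line    # e.g. "REJECTED - SYNTAX"
    return rejected

# Sample text following the layout of the example report above.
sample = """Extraction Definition 01 0900
Seq 1 Static Datatypes: 20 Stocks: 190
Extraction Definition XX WRONG
REJECTED - SYNTAX"""
```

Running `rejected_extracts(sample)` reports extract `XX` as rejected with the reason line, so a client script can alert on it before any output is awaited.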

Warning

When posting a new request file users should take care not to overwrite any extraction details for files yet to be
produced. The safest way to prevent this would be to append any new requests to the bottom of the current request
file.

If users are sure all previous requests have been produced, the Request file can be overwritten completely.


CHAPTER 9 - REQUEST FILE STRUCTURE


REQUEST FILE FACTS
• DDL uses a single request file.
• Within 1 request file clients can request up to 99 Extracts.
• An extract can contain up to 999 requests.
• A maximum of 99 extracts can be produced per account per day.
  If you need to create more than 99 extractions in a day, please contact your Thomson Reuters Relationship Manager to discuss additional
  DDL accounts.
• Every request file must contain the following records:
  {LOGON} – DS Logon record
  {FORMAT} – Global level format record
  {EXTRACTION} – Extraction record
  {TRIGGER} – Trigger record
  {FORMAT} – Extraction based format record
  {REQTYPE} – Request Type format record
  {DATATYPES} – Datatypes record
  {INSTRUMENTS} – Instrument record
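The mandatory records above can be assembled programmatically. Below is a minimal, illustrative sketch; the codes, trigger and datatypes shown are placeholders, and both {FORMAT} sections are left empty for brevity.

```python
def build_request(logon_id, extract_id, trigger, reqtype, datatypes, instruments):
    """Assemble the mandatory DDL record sections in the order listed above."""
    lines = [f"{{LOGON}} {logon_id}",
             "{FORMAT}",                      # global level format (empty here)
             f"{{EXTRACTION}} {extract_id}",
             "{TRIGGER}", trigger,
             "{FORMAT}",                      # extraction level format (empty here)
             f"{{REQTYPE}} {reqtype}",
             "{DATATYPES}", *datatypes,
             "{INSTRUMENTS}", *instruments]
    return "\n".join(lines) + "\n"

# One static request against a placeholder trigger and constituent list.
req = build_request("XABC123", "01", "UKPRCEQ", "S -0D",
                    ["SECD", "ISIN"], ["FBRIT"])
```

The resulting string would be saved as DD4ccc.req and uploaded to the Request folder.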

DDL REQUEST FILE TEMPLATE

This section outlines the basic requirements for a valid DDL v4.1 request file. It is intended for use as a basic
template guide for setting up a DDL request.

Please note that although some format options are given below, this section does not cover all of the various format
options available; these are outlined in more detail in the 'Global Level Format Record' section of this chapter.

SECTION CARDS

{LOGON}
Insert your Datastream research or logon ID (Xcccnnn) here.
Example:
{LOGON} XABC123

{FORMAT} (global level)
This section should include any format options to be applied to all extractions. Please note that any extraction level
format options will over-ride options specified at this level.
Example:
{FORMAT}
NOTAVAIL=N/A

The two sections above can only appear once, at the top of your request file.

{EXTRACTION}
Insert an extraction ID of up to 2 alpha-numeric characters here.
Example:
{EXTRACTION} 01

{TRIGGER}
Insert the market based or time based triggers here. Multiple market based triggers are allowed, but only one time
based trigger is allowed.
Example:
{TRIGGER}
UKEQPRC
SWEQPRC
1800

{FORMAT} (extraction level)
Insert extraction specific format options here. These will over-ride any duplicate format options set at the global
level above.
Example:
{FORMAT}
SRVRFILEPREFIX=UK_Equity_Prices

The {EXTRACTION}, {TRIGGER} and extraction level {FORMAT} sections can be repeated up to 99 times in a
request file, providing up to 99 separate output files.

{REQTYPE}
Insert the type of request (Time-series or Static), start date, end date, retrieval frequency and number of decimal
places.
Examples:
{REQTYPE} S D 4
{REQTYPE} T 01/01/2011 -0D D 4
{REQTYPE} T 01/01/2011 01/01/2012 D 4

{DATATYPES}
Insert the required datatypes here. Output datatype names can be over-ridden by adding
:USERNAME=[own datatype name] to the end of the datatype.
Example:
{DATATYPES}
P:USERNAME=Closing_Price
VO

{INSTRUMENTS}
Insert the instruments required here. Comments can also be added on separate lines.
Note: by adding an end of file marker after the last instrument section of an extract you can easily tell that download
of an output CSV file was complete.
Example:
{INSTRUMENTS}
LFTSE100
U:F
<COMMENT>End of FTSE100 constituents
<COMMENT>@End of file#

The {REQTYPE}, {DATATYPES} and {INSTRUMENTS} sections can be repeated up to 999 times in a single
extraction.

See the main body of this DDL v4.1 technical specification for more detailed format options, instrument over-rides and
positioning of section cards etc. in the request file

Notes:
• The request file must be named DD4[ccc].req where [ccc] is your three character FTP account Id.
• All lines must start at position 1, i.e. far left hand side of the request file
• Having blank lines in the request file can result in a major error, which may invalidate the entire request file.
The only exception to this is the {INSTRUMENTS} and {DATATYPES} sections. Any blank lines included in these
sections will generate a warning message in the validation report (.rpt file) before being ignored.
• Extractions must have different IDs. Once an ID has been used for a particular extract, it cannot be used again
that day.
• When uploading new or amended extracts to the DDL FTP server, ensure you include all extracts you want
processed for that day that have not yet run. Leaving an unprocessed extract out of the request file will act as a
delete.
• Up to 200 datatypes are allowed in a single {DATATYPES} section
• All section cards, wrapped in {} brackets, must be in full upper case
• Up to 20,000 lines are allowed in the {INSTRUMENTS} section, including comments
• Up to 100,000 instruments included in constituent lists are allowed in the {INSTRUMENTS} section
• The request file must not total more than 300,000 lines
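Some of these limits can be checked locally before upload. Below is a partial, illustrative validator; it covers only the line-count, datatype-count and upper-case section card rules, and is not a substitute for the DDL validation report.

```python
MAX_FILE_LINES = 300_000   # request file limit stated above
MAX_DATATYPES = 200        # per {DATATYPES} section limit stated above

def check_limits(request_text: str):
    """Return a list of limit violations found in a request file string."""
    problems = []
    lines = request_text.splitlines()
    if len(lines) > MAX_FILE_LINES:
        problems.append("request file exceeds 300,000 lines")
    in_datatypes, count = False, 0
    for line in lines:
        if line.startswith("{"):
            card = line.split()[0]
            if card != card.upper():
                problems.append(f"section card not upper case: {card}")
            in_datatypes = card == "{DATATYPES}"
            count = 0
        elif in_datatypes and line.strip():
            count += 1
            if count == MAX_DATATYPES + 1:
                problems.append("more than 200 datatypes in one section")
    return problems

good = "{LOGON} XABC123\n{DATATYPES}\nP\nVO\n"
bad = "{Logon} XABC123\n"
```

Running the checker on the `bad` sample flags the lower-case `{Logon}` card, while the `good` sample passes cleanly.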


An example DDL request showing one static and one time-series request can be seen below. The records, e.g.
{LOGON}, shown in bold are the Global records, those in normal font are the Extract level records:

{LOGON} XABC123
{FORMAT}
{EXTRACTION} 01
{TRIGGER}
UKPRCEQ
{FORMAT}
MNEMONIC=Y
SRVRFILEPREFIX=UK_EQ_STAT
{REQTYPE} S -0D
{DATATYPES}
SECD
ISIN
GEOGN
{INSTRUMENTS}
FBRIT
{EXTRACTION} 02
{TRIGGER}
UKPRCEQ
{FORMAT}
MNEMONIC=Y
SRVRFILEPREFIX=UK_EQ_HIST
{REQTYPE} T 01/01/2008 31/12/2008 D 3
{DATATYPES}
P
PH
PL
PO
VO
{INSTRUMENTS}
FBRIT

For the request above, a successful validation report would be produced and made available in the Validation
Output folder, looking like this:

DD4VAL Data Download: Validation Summary - XXX Date: 09/09/2012


--------------------------------------- Time: 14:25:52
Extraction Definition 01 UKPRCEQ
Seq 1 Static Datatypes: 3 Stocks: 2188
Extraction Definition 02 UKPRCEQ
Seq 1 Time-Series Datatypes: 5 Stocks: 2188
** End of Extraction Summary **
** End of Validation Report **


LOGON RECORD
To use DDL you require 2 IDs. The first is an ID used to log on to the DDL FTP site. This is purely for access to
the FTP server, and is registered on the access control system of the FTP server only. The second ID is a valid
Datastream Research Logon ID. This is embedded in the header record of the request file, and uses the
Datastream research permission system to control content access levels.
{LOGON} XABC123

GLOBAL LEVEL FORMAT RECORD


The Format record controls the format of the extract file. The FORMAT section contains the mandatory word
{FORMAT} followed by a series of optional format settings each on its own line in the request file e.g.

{FORMAT}
DCDEFLT=Y
MERGE=N
TITLES=Y

Formats can be set at the Global level and will be applied to all Extractions in the file or they can be set specific to the
individual files under the Extraction Level format.

The following options can be set in the Global Format Record; where a default applies, it is assumed when the
option is omitted.


{FORMAT} (mandatory)
Key word to signify the start of the formatting options.

MNEMONIC= (Y or N; Char 1; optional, default assumed if omitted)
Output the requested code next to the DS code in the CSV file.

NAME= (Y or N; Char 1; optional, default assumed if omitted)
Output the instrument name, e.g. ‘IMP.CHEM.INDS’.

EXNAME= (Y or N; Char 1; optional, default assumed if omitted)
Output the expression name, e.g. ‘IMP.CHEM.INDS – PRICE’.

DFLTDTUSERNAME= (user defined string; Char 26; optional, standard Datastream mnemonic output if omitted)
Allows clients to replace default datatype mnemonics such as X or DFLT with something more meaningful.

SRVRFILEPREFIX= (user defined string; Char 26; optional, default mmdd.nn.[ext] assumed if omitted – see the file
names section for more information)
Allows the client to define a specific file name for output data; replaces the [mmdd] part of the standard filename.

PREFIXSEDOL= (Y or N; Char 1; optional)
The Datastream system requires alpha-numeric SEDOL numbers to be prefixed with ‘UK’ to be valid. Selecting ‘Y’
will cause DDL to automatically prefix any 7-character code conforming to the SEDOL check digit algorithm with
‘UK’. This allows clients to continue using 7-character alpha-numeric SEDOLs. Use of a 7-character alpha-numeric
SEDOL without a UK prefix and without this {FORMAT} option switched on will cause the instrument to be rejected
with the error ‘INVALID INSTRUMENT – POSSIBLE SEDOL?’.

NUMQUOT= (Y or N; Char 1; optional, default assumed if omitted)
Numbers in quotes.

DATASEP= (, comma / S space / T tab / ; semi-colon / | pipe; Char 1; optional, default assumed if omitted)
Data separator. Note: if the decimal point indicator (DECSEP=) is changed from <period> to <comma> then the
delimiter is implicitly changed to <semi-colon>.

DECSEP= (. full stop / , comma; Char 1; optional, default assumed if omitted)
Decimal separator.

NOTAVAIL= (nothing (default) or any 4 characters; up to Char 3; optional, default assumed if omitted)
Not available string.

STATNULL= (Y or N; Char 1; optional, default assumed if omitted)
Display static null values.

PADTIME= (U or P; Char 1; optional, default assumed if omitted)
Pad time-series data.

DATE= (Existing Format / DD/MM/YY / MM-DD-YY / DD/MM/CCYY / YY/MM/DD; Char 10; optional, default
assumed if omitted)
Date format.

The options listed below are specific “Data Channel” formatting options. They require that the DCDEFLT=Y record
is also entered; if it is omitted, the file is rejected. Data Channel formatting options are described in more detail
below.

DCDEFLT= (N or Y; Char 1; optional, default assumed if omitted)
Data Channel format.

MERGE= (Y or N; Char 1; optional, default assumed if omitted)
Merge.

TRAN= (Y or N; Char 1; optional, default assumed if omitted)
Transpose.

COLUMN= (Y or N; Char 1; optional, default assumed if omitted)
Column headings.

ROW= (Y or N; Char 1; optional, default assumed if omitted)
Row headings.

TITLES= (N or Y; Char 1; optional, default assumed if omitted)
Titles/Dates.

If no formatting parameters are entered (apart from the mandatory {FORMAT}), the output will be in the standard
Data Loader format; examples of this output are displayed later in this guide.

SPECIFIC DATA CHANNEL FORMATTING


Any of the format parameters in the first table, when used individually, will result in different formatting options (such
as commas, quotes etc.) being output in the normal Data Loader style format. However, when any of the Data
Channel style options are selected in conjunction with these parameters, or just on their own, the format changes to
mirror that offered by Data Channel.

It is important to understand that if DCDEFLT=Y is entered, the output will be in Data Channel format with the Data
Channel default options applied. The following section details the layout of output when different Data Channel
formatting options have been selected. For purposes of clarity, the output has been placed in a table; actual output
would be in a delimited (normally CSV) format.


EXTRACTION LEVEL RECORDS


The extraction records are shown as {EXTRACTION} followed by a 2 character alphanumeric ID. For example:

{EXTRACTION} 01

Or

{EXTRACTION} SW

Or

{EXTRACTION} K9

The Extract number has to be 2 characters; DDL will not accept leading or trailing spaces. The Extraction record
does not determine the order in which requests are processed; this is decided by the contents of the Trigger record,
see below.

Warning

An Extract with a particular Extract ID (the two characters that follow the {EXTRACTION} card) can only be
processed once in a DDL day. Once processed, any further requests for that Extract number to be processed that
day will be ignored; however, no error will be generated in the validation report. If an Extract, for example 01, is
processed at 09:30 UK time and the request file is uploaded again at 13:00 UK time, the second version will be
ignored. It will NOT be carried across to the following day.


TRIGGER RECORD
Directly below the Global level format record settings comes the TRIGGER record. This record contains the
mandatory word {TRIGGER} followed by at least one item, each on its own line in the request file, e.g.

{LOGON} XABC123
{FORMAT}
{EXTRACTION} 01
{TRIGGER}
1000

In DDL, extracts are processed based on triggers supplied by the user. There are two types of trigger: market
based and time based. Details on how to use these triggers are outlined below:

{TRIGGER} (mandatory; maximum of 1 (one) per Extract; Char 9)
Key word to signify the start of the extract trigger options.

Market based trigger (see Chapter 8; at least one market or time based trigger must exist; up to 35 (thirty five) per
Extract; Char 15)
Market based triggers allow the customer to have an extract processed as soon as the data for a particular market
is updated on Datastream.

Time based trigger (15 minute intervals from 0900 am – 0800 am UK time; at least one market or time based
trigger must exist; maximum of 1 (one) per Extract; Num 4)
Used when either a market trigger is not available or data is required at a set time.

NOW (as soon as possible; optional; maximum of 1 (one) per Extract; Char 3)
Processes in the next available 15 minute schedule after the validation process has completed.

NEVER (validation only; optional; maximum of 1 (one) per Extract; Char 5)
Used to validate a request file only.


Market Based Triggers

There can be multiple triggers for an extract. The extract will not process until all triggers are satisfied. This means
that if a market archive is seriously delayed then all data within that Extract will wait for it. If a trigger has already
been satisfied when the request is uploaded, the extract will process as soon as it is passed to the mainframe.
This behaviour applies only when market based triggers alone are listed.

{LOGON} XABC123
{FORMAT}
{EXTRACTION} 01
{TRIGGER}
UKPRCEQ

Time Based Triggers

There can only be one time based trigger per extract. The extract will process when the time is reached – all times
are UK local time. If the time has passed when the request file is uploaded to the FTP server, the extract will be
processed as soon as it is passed to the mainframe; this effectively provides batch on demand.

{LOGON} XABC123
{FORMAT}
{EXTRACTION} 01
{TRIGGER}
1000

Combining Market and Time Based Triggers

Multiple market triggers can be combined with a single time trigger, creating a "drop-dead" deadline. If the market
based triggers all satisfy before the time based trigger, the Extract will be processed when the last market trigger
satisfies. However, if one or more market triggers are still outstanding when the time trigger is reached, the extract
will be processed when the time trigger satisfies. This means that the latest data may or may not be available,
depending on which triggers are requested.

{LOGON} XABC123
{FORMAT}
{EXTRACTION} 01
{TRIGGER}
UKPRCEQ
2000
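The combined behaviour can be modelled as: the extract fires once every market trigger has satisfied, or when the time trigger is reached, whichever comes first. A sketch (illustrative model of the behaviour described above, not DDL source code):

```python
def extract_fires(market_triggers, time_reached):
    """Return True once the extract would process: either every listed
    market trigger has satisfied, or the single time based trigger has
    been reached (the drop-dead deadline). market_triggers maps trigger
    name -> satisfied flag. (Illustrative model, not DDL source.)"""
    return all(market_triggers.values()) or time_reached
```

For the example above, the extract waits on UKPRCEQ until 20:00; if the archive has not updated by then, the time trigger forces processing anyway.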


Using the “NOW” Trigger

If a user requires their data as soon as possible, the NOW trigger can be used. The request, once validated, will
be processed on the next available 15 minute schedule. For example, if the request is validated at 11:06 am then the
extraction will begin processing at 11:15 am.

{LOGON} XABC123
{FORMAT}
{EXTRACTION} 01
{TRIGGER}
NOW

Using the “NEVER” Trigger

The NEVER trigger allows users to submit a request file for validation purposes only. DDL will check the file for errors
and create the validation output as normal, but the request will not be scheduled to process. This is particularly
useful for users wanting to make changes to their request file and check them prior to releasing on one of the
aforementioned trigger options.

{LOGON} XABC123
{FORMAT}
{EXTRACTION} 01
{TRIGGER}
NEVER

EXTRACTION LEVEL FORMAT RECORD


The format parameters allowed in this section are the same as in the Global Records section listed previously. The
difference is that any format parameters specifically listed at the Extract level will override the Global level format
options. If left empty, the default or Global level format will be applied, with Global format options overriding the
default options.


REQUEST TYPE RECORD


This record denotes the beginning of a request within an extract, of which there has to be at least 1 (one), and no
more than 99 (ninety nine).

Field                                              Start position  Length  Field value
Keyword                                            1               9       ‘{REQTYPE}’
Separator                                          10              1       <SPACE>
Static or Time Series indicator                    11              1       ‘S’ for Static data / ‘T’ for Time series data
Separator                                          12              1       <SPACE>
Start date                                         13              10      Relative date or DD/MM/CCYY
Separator                                          23              1       <SPACE>
End date                                           24              10      Relative date or DD/MM/CCYY
Separator                                          34              1       <SPACE>
Frequency                                          35              1       ‘D’ for daily / ‘W’ for weekly / ‘M’ for monthly / ‘Y’ for yearly values
Separator (Time series requests only – optional)   36              1       <SPACE>
No. of decimal places in output (optional)         37              1       n, where n is a number 1-9 (trailing zeros in the output will not be displayed)

For this part of the request file to pass validation the following rules need to be observed.
• Static/Time series indicator must be ‘S’ or ‘T’.
• Start and end dates default to the run date if not present.
• For static data requests, the end date and frequency are ignored.
• For time series data requests, the start date must not be after the end date.
• Frequency must be either ‘D’, ‘W’, ‘M’ or ‘Y’.
• All parameters and the keyword must be in the correct position on the line.
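A client-side pre-check of these rules might look like the following sketch. It is a loose, space-delimited parser (real DDL validation is position based, per the table above), and the function name is hypothetical:

```python
def parse_reqtype(line):
    """Loosely parse a {REQTYPE} record, handling both static requests
    such as '{REQTYPE} S -0D' and time-series requests such as
    '{REQTYPE} T 01/01/2012 05/01/2012 D 4'. (Sketch only; DDL's own
    validation is position based rather than space delimited.)"""
    parts = line.split()
    if parts[0] != "{REQTYPE}" or parts[1] not in ("S", "T"):
        raise ValueError("not a valid {REQTYPE} record")
    req = {"kind": parts[1], "start": parts[2] if len(parts) > 2 else None}
    if parts[1] == "T":
        req["end"] = parts[3]
        req["frequency"] = parts[4]
        if req["frequency"] not in ("D", "W", "M", "Y"):
            raise ValueError("frequency must be D, W, M or Y")
        if len(parts) > 5:
            req["decimals"] = int(parts[5])
    return req
```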


Here are some examples of fixed and relative dates for both Time Series and Static requests that can be used in a
request string (note the correct character spacing).

Relative dates
Relative dates work on DDL in a slightly different manner to the Datastream desktop. We use the following rules:

• The DDL system date is different from the normal Datastream date and changes at approximately 08:30 am
every day. It therefore runs about 8.5 hours behind UK time: before 08:30 am it is set to the previous weekday's
date, while after 08:30 am it reflects the current weekday. The DDL system date is never set to a Sunday date
but runs daily Monday/Tuesday/Wednesday/Thursday/Friday/Monday, with a single Weekend Extract
processing on Saturday at 12:00 pm.

• When making requests with relative dates on DDL, the date arithmetic is very simple. The current DDL system
date at the time the extract runs is used as the reference date, and any relative date is calculated from that DDL
system date.

For example, say an extract ran at 20:30 on Monday 7th April 2008. The DDL system date would be 7th
April. If the customer specified +0d in a request that ran in that extract, it would be evaluated as 7th April; -1d
evaluates to 6th April, -1w to 1st April, and -1m to 7th March.
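The arithmetic above can be sketched in Python. This is an illustration of the stated rules only; the month arithmetic simply steps the month (as in the 7th April to 7th March example) and does not attempt to handle day-of-month overflow such as 31st January -1m:

```python
from datetime import date, timedelta

def resolve_relative(system_date, token):
    """Resolve a DDL-style relative date token (+0D, -1D, -1W, -1M)
    against the DDL system date, following the simple arithmetic
    described above. (Illustrative sketch, not DDL's own routine.)"""
    n, unit = int(token[:-1]), token[-1].upper()
    if unit == "D":
        return system_date + timedelta(days=n)
    if unit == "W":
        return system_date + timedelta(weeks=n)
    if unit == "M":
        # Step the month, keeping the day-of-month (overflow not handled).
        month = system_date.month - 1 + n
        year = system_date.year + month // 12
        return system_date.replace(year=year, month=month % 12 + 1)
    raise ValueError("unsupported unit: " + unit)
```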


DATATYPE RECORD

This record contains a list of all the datatypes that a client wishes to have data returned for. Each datatype must be
placed on a separate line. The Datastream Datatype Navigator can be used to look up datatypes and provide
definitions. To access the online version go to http://product.datastream.com/Navigator/Datatypesearch.aspx
(registration required).

The example part request file below shows a static request for the datatypes Sedol, ISIN & Geographic Name

{LOGON} XABC123
{FORMAT}
{EXTRACTION} 01
{TRIGGER}
UKPRCEQ
{FORMAT}
MNEMONIC=Y
SRVRFILEPREFIX=UK_EQ_STAT
{REQTYPE} S -0D
{DATATYPES}
SECD
ISIN
GEOGN
{INSTRUMENTS}
FBRIT

The USERNAME function allows customers to include their own coding in the request file to be output in the CSV file.
This makes it easier for customers with their own coding system to ingest the data as it is already mapped to their
internal Ids. The same request file has been amended to use the USERNAME function.

{LOGON} XABC123
{FORMAT}
{EXTRACTION} 01
{TRIGGER}
UKPRCEQ
{FORMAT}
MNEMONIC=Y
SRVRFILEPREFIX=UK_EQ_STAT
{REQTYPE} S -0D
{DATATYPES}
SECD:USERNAME=SEDOL_CODE
ISIN:USERNAME=ISIN_CODE
GEOGN:USERNAME=COUNTRY_ID
{INSTRUMENTS}
FBRIT


The Datatypes field can be amended further to customise the output:

Field                          Length of field  Field value
Datatype                       N/A              Any valid Datastream datatype
Separator                      1                ‘:’
Padding option                 9                PADTIME=P for padded time-series data / PADTIME=U for non-padded time-series data
Separator                      1                ‘,’
Number of decimal places       7                DECPL=n – where n can be a value from 1-9
Separator                      1                ‘,’
User specified datatype name   69               USERNAME=[User specified string]

In the example below the Datatype for Price (P) has been customised

{DATATYPES}
P:USERNAME=CLOSING_PRICE

This could be further customised to include options to pad the data for non-trading days and to increase the number
of displayed decimal places:

{DATATYPES}
P:PADTIME=P,DECPL=3,USERNAME=CLOSING_PRICE
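Assembling these option strings by hand is error-prone; a hypothetical helper that follows the colon-then-comma syntax above (not part of DDL itself):

```python
def datatype_record(datatype, padtime=None, decpl=None, username=None):
    """Assemble one {DATATYPES} line with optional modifiers: a colon
    after the datatype, commas between subsequent options.
    (Hypothetical helper illustrating the documented syntax.)"""
    options = []
    if padtime:
        options.append("PADTIME=" + padtime)
    if decpl:
        options.append("DECPL=" + str(decpl))
    if username:
        options.append("USERNAME=" + username)
    return datatype + (":" + ",".join(options) if options else "")
```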

For this part of the request file to pass validation the following rules need to be observed.

• There must be at least one datatype present in the Datatypes record
• Each entry must be a valid Datastream datatype, or a DDL function or expression
• The first separator after the datatype must be a colon (:); all others must be commas (,)
• In the USERNAME section spaces and punctuation are allowed
• If requesting composite datatypes, such as RD001, then enough USERNAME references must be given for each
repeat, each separated by a comma. For example:
Input:
Input:

{DATATYPES}
RD001:USERNAME=DATE,VALUE

Output:

DATE001=16/06/2001
VALUE001=1.309

Support for functions and expressions is described in Chapter 15.


INSTRUMENT RECORD

This section includes the series, mainframe lists, or user created lists that data is required against.

The USERNAME function allows customers to include their own coding in the request file to be output in the CSV file.
This makes it easier for customers with their own coding system to ingest the data, as it is already mapped to their
internal IDs. This works in an identical way to the datatype record. The example below shows the USERNAME
function being used to map Datastream codes to I/B/E/S codes.

{LOGON} XABC123
{FORMAT}
{EXTRACTION} 01
{TRIGGER}
UKPRCEQ
{FORMAT}
MNEMONIC=Y
SRVRFILEPREFIX=UK_EQ_STAT
{REQTYPE} S -0D
{DATATYPES}
SECD
ISIN
{INSTRUMENTS}
901419:USERNAME=@BTA
911488:USERNAME=@BIA
900995:USERNAME=@BPA
914447:USERNAME=@BHI
901295:USERNAME=@BAT

For this part of the request file to pass validation the following rules need to be observed.

• Input under {INSTRUMENTS} can be either a Datastream code, an ISIN code or a “UK” prefixed Sedol code
• If the instrument is valid, the LOGON ID must have access to it; if it does not, contact your Thomson Reuters
Relationship Manager to gain access. If the instrument is invalid, it will be logged to the error report and deleted
from the request file; in this instance, the whole request will not be rejected.
• The first character of the instrument mnemonic must be in position 1 of the request file line directly under
{INSTRUMENTS}
• There must be at least one instrument in the record for the request to be valid
• When using the USERNAME function clients must delimit the instrument mnemonic from the USERNAME
keyword with a colon (:), e.g. MKS:USERNAME=U:MKS
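A small helper can render an {INSTRUMENTS} section from a mapping of Datastream codes to internal codes, applying the colon delimiter rule above (hypothetical helper, not part of DDL):

```python
def instruments_record(mapping):
    """Render an {INSTRUMENTS} section from {datastream_code: user_code}.
    Codes with a user code get a :USERNAME= suffix per the rule above;
    codes mapped to None are emitted as-is. (Illustrative helper.)"""
    lines = ["{INSTRUMENTS}"]
    for code, user_code in mapping.items():
        lines.append(code + ":USERNAME=" + user_code if user_code else code)
    return "\n".join(lines)
```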


Handling of Duplicates within an INSTRUMENTS record

During the validation process the DDL system will ignore only duplicates that directly follow each other in the record.
Duplicate series found elsewhere within the instruments section will be processed.

For example, consider the following request file, in which Extraction 02 contains non-adjacent duplicate series and
Extraction 03 contains adjacent duplicates:

{EXTRACTION} 01
{TRIGGER}
NEVER
{FORMAT}
{REQTYPE} S -0D
{DATATYPES}
SECD
ISIN
{INSTRUMENTS}
929724
507534
{EXTRACTION} 02
{TRIGGER}
NEVER
{FORMAT}
{REQTYPE} S -0D
{DATATYPES}
SECD
ISIN
{INSTRUMENTS}
929724
507534
929724
507534
{EXTRACTION} 03
{TRIGGER}
NEVER
{FORMAT}
{REQTYPE} S -0D
{DATATYPES}
SECD
ISIN
{INSTRUMENTS}
929724
929724
507534
507534


Once validated, the DDL system ignores only the duplicates in Extraction 03:

DD4VAL Data Download: Validation Summary - ABC Date: 16/10/2012


--------------------------------------- Time: 11:40:21
Extraction Definition 01 NEVER
Seq 1 Static Datatypes: 2 Stocks: 2
Extraction Definition 02 NEVER
Seq 1 Static Datatypes: 2 Stocks: 4
Extraction Definition 03 NEVER
Seq 1 Static Datatypes: 2 Stocks: 2
** End of Extraction Summary **
{EXTRACTION} 03
{TRIGGER}
{FORMAT}
{REQTYPE} S -0D 00001
{DATATYPES}
{INSTRUMENTS}
929724
********* DUPLICATE STOCK - IGNORED
507534
********* DUPLICATE STOCK - IGNORED
** End of Errors **
** End of Validation Report **
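The adjacent-only duplicate handling shown in this report can be reproduced client-side to predict what DDL will drop (an illustrative model of the documented behaviour, not DDL code):

```python
from itertools import groupby

def ddl_dedupe(instruments):
    """Mimic DDL duplicate handling: only duplicates that directly follow
    each other are dropped; repeats elsewhere in the list survive and
    will be processed. (Model of the behaviour described above.)"""
    return [code for code, _ in groupby(instruments)]
```

Applying this to the three extractions above reproduces the stock counts in the validation summary (2, 4 and 2).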

Economic instrument validation

DDL additionally validates on the basis of the default frequency of the instruments requested, to ensure that the
structure of the resultant file is consistent. To that end, if economics data is requested, the series must be wholly
contained in their own request; this may or may not form part of a multiple-request file. Furthermore, economics
data of varying frequencies must be split between requests so that series of similar frequency sit in the same request.

The following simple request contains two US economic series: the first, for Unemployment, is a monthly series;
the second, for GDP, is a quarterly series.

{EXTRACTION} 01
{TRIGGER}
NEVER
{FORMAT}
{REQTYPE} T -10Y M
{DATATYPES}
X
{INSTRUMENTS}
USUN%TOTQ
USGDP...D


Attempting to validate this 10 year monthly time series request produces the following validation report:

DD4VAL Data Download: Validation Summary - ABC Date: 16/10/2012


--------------------------------------- Time: 13:55:27
Extraction Definition 01 NEVER
Seq 1 Time-Series Datatypes: 1 Stocks: 1
** End of Extraction Summary **
{EXTRACTION} 01
{TRIGGER}
{FORMAT}
{REQTYPE} T -10Y M 00001
{DATATYPES}
{INSTRUMENTS}
USGDP...D
********* ECONOMICS WITH FREQUENCY MISMATCH
** End of Errors **
** End of Validation Report **

The quarterly GDP series has been rejected because its data cannot be returned at the monthly frequency
established by the first series.

To run these two series in the same Extraction requires a second request:

{EXTRACTION} 01
{TRIGGER}
NOW
{FORMAT}
{REQTYPE} T -10Y M 2
{DATATYPES}
X
{INSTRUMENTS}
USUN%TOTQ
{REQTYPE} T -10Y Q 2
{DATATYPES}
X
{INSTRUMENTS}
USGDP...D
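Grouping series by frequency before building the request file avoids the mismatch error; a sketch (the frequencies must be supplied by the caller, as DDL's default frequencies are not queried here):

```python
from collections import defaultdict

def split_by_frequency(series_frequencies):
    """Group economic series by default frequency so each group can be
    issued as its own {REQTYPE} request, as the validation above
    requires. Input is {mnemonic: frequency}, e.g. 'M' or 'Q'; the
    frequencies are caller-supplied assumptions. (Illustrative helper.)"""
    groups = defaultdict(list)
    for mnemonic, freq in series_frequencies.items():
        groups[freq].append(mnemonic)
    return dict(groups)
```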

In a similar fashion, although not validated against, splitting the instruments by instrument type (equities, indices,
economics, etc.) and using a separate request for each type has proved more efficient in terms of both extraction
and ingest times and file sizes.


CHAPTER 10 - GENERAL REQUEST FILE INFORMATION


EXTRACT SIZING

Each Extract must comply with the following rules regarding its size:

• All valid requests must not total more than 100,000 lines of text (lists are not decomposed into their individual
constituents i.e. individual instruments);
• All static requests (when lists are decomposed) must not exceed 69,000 lines;
• There cannot be more than 350 datatypes in an Extract;
• There cannot be more than 999 requests in an Extract.

Note: Decomposed refers to the number of constituents contained in a list mnemonic. For example, the Datastream
list LFTSE100 would use up 1 line in the DDL request file but, when ‘decomposed’, would result in approximately 100
lines.

Request File ASCII Formatting

In order for DDL to be able to read a request file, the ASCII formatting has to be correct. DDL requires that each line
ends with <CR><LF> (carriage return, line feed) ASCII characters.

Warning

It is possible that when converting a DDL file from a UNIX machine to Windows the <CR> character gets lost in the
translation. This will cause DDL to fail to recognise the file format and not process it.
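A safe way to guard against this is to normalise line endings just before upload; a minimal sketch:

```python
def ensure_crlf(text):
    """Normalise a request file to the <CR><LF> line endings DDL requires,
    guarding against the UNIX-to-Windows conversion issue noted above.
    Collapse any existing \r\n first so the CR is never doubled up."""
    normalised = text.replace("\r\n", "\n").replace("\n", "\r\n")
    return normalised.encode("ascii")  # DDL expects plain ASCII content
```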


CHAPTER 11 - STANDARD DDL OUTPUT FORMATS

There are two types of request available to the DDL user, Static & Time Series. This section provides information on
the output formats for these request types.

DEFAULT FORMAT – STATIC REQUEST

The request file below was used to create the output shown further below. This is the default output users will receive
if they do not use DCDEFLT=Y or other format options within a static request.

{LOGON} XABC123
{FORMAT}
{EXTRACTION} 01
{TRIGGER}
NOW
{FORMAT}
{REQTYPE} S -1D
{DATATYPES}
NAME
ISIN
SECD
GEOGN
INDM
P
UP
{INSTRUMENTS}
LDJINDUS

DEFINITION OF FIELDS
The above request file generates the following output:

RUN ,,201210221034,103437,,0,01
"DATES",,"20121019",,,1,"20121019"
"902172","NAME","20121019",,,1,"3M"
"902172","ISIN","20121019",,,1,"US88579Y1010"
"902172","SECD","20121019",,,1,"2595708"
"902172","GEOGN","20121019",,,1,"UNITED STATES"
"902172","INDM","20121019",,,1,"Divers. Industrials"
"902172","P","20121019",,,1,92.94
"902172","UP","20121019",,,1,92.94
"945388","NAME","20121019",,,1,"AT&T"


The first line

RUN ,,201210221034,103437,,0,01

This line shows the date, time and extract ID for the request

RUN ,,[RUN_DATE]YYYYMMDDHHMM,[RUN_TIME]HHMMSS,,0,[EXTRACT_ID]

The ‘DATES’ line denotes the beginning of an output set for a particular {REQTYPE} set within an Extract.

"DATES",,"20121019",,,1,"20121019"

It is included as a header record to the content for that request set. Each “DATES” line follows the format below:
“DATES”,,”REQUESTED_DATE”,,,[NUMBER_OF_VALUES],”DATE_OF_VALUE”
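A reader for these static output lines can lean on standard CSV parsing; a sketch that extracts the instrument, datatype, date and value fields (the RUN and DATES header rows would need separate handling in a full reader):

```python
import csv
import io

def parse_static_line(line):
    """Parse one default-format static output line, e.g.
    "902172","ISIN","20121019",,,1,"US88579Y1010"
    into (instrument, datatype, date, value). (Sketch against the layout
    shown above, not official DDL tooling.)"""
    fields = next(csv.reader(io.StringIO(line)))
    # Fields 3-4 are empty placeholders and field 5 is the value count.
    return fields[0], fields[1], fields[2], fields[6]
```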

DEFAULT FORMAT – TIME SERIES REQUEST


The following request file was used to create the output below. This is the default output users will receive if they do
not use the DCDEFLT=Y with a Time Series request. This request is using 3 datatypes and the stocks within the Dow
Jones Industrial Average index.

{LOGON} XABC123
{FORMAT}
{EXTRACTION} 02
{TRIGGER}
NOW
{FORMAT}
{REQTYPE} T 01/01/2012 05/01/2012 D 4
{DATATYPES}
P
DY
MV
{INSTRUMENTS}
LDJINDUS


DEFINITION OF FIELDS

RUN ,,201210221220,122044,,0,02
"DATES",,"20111230","20120105","D",5,"20111230","20120102","20120103","20120104","20120105"
"902172","P","20111230","20120105","D",5,81.7300,,83.4900,84.1800,83.8000
"902172","DY","20111230","20120105","D",5,2.6918,,2.6350,2.6134,2.6253
"902172","MV","20111230","20120105","D",5,57280.0200,,58513.4900,58997.0600,58730.7400
"945388","P","20111230","20120105","D",5,30.2400,,30.3800,30.4300,30.4000
"945388","DY","20111230","20120105","D",5,5.8201,,5.7933,5.7838,5.7895

RUN ,,201210221220,122044,,0,02

As with the default static formatting this line shows the date, time and extract ID for the request

RUN ,,[RUN_DATE]YYYYMMDDHHMM,[RUN_TIME]HHMMSS,,0,[EXTRACT_ID]

"DATES",,"20111230","20120105","D",5,"20111230","20120102","20120103","20120104","20120105"

The ‘DATES’ line denotes the beginning of an output set for a particular {REQTYPE} set within an Extract. It is
included as a header record to the content for that request set. Each “DATES” line follows the format below:

“DATES”,,”START_DATE”,”END_DATE”,”FREQUENCY”,”NUMBER_OF_VALUES”,”DATE1”,”DATE2” etc
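Pairing each value row with the preceding DATES header yields a per-date map; a sketch against the layout above, in which empty fields (non-trading days) become None:

```python
import csv
import io

def parse_timeseries(block):
    """Pair each value row of a default-format time-series output with
    the preceding DATES header row. Returns
    {(instrument, datatype): {date: value}}. (Sketch based on the layout
    shown above, not official DDL tooling.)"""
    dates = None
    out = {}
    for row in csv.reader(io.StringIO(block)):
        if not row or row[0].startswith("RUN"):
            continue                      # skip the RUN header line
        if row[0] == "DATES":
            dates = row[6:]               # the individual point dates
        elif dates is not None:
            # Empty fields are non-trading days; keep them as None.
            values = [v if v else None for v in row[6:]]
            out[(row[0], row[1])] = dict(zip(dates, values))
    return out
```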

More output file formats are available. These are known as ‘DataChannel’ formats and details of these can be found in
Chapter 12.

Where clients have no initial formatting option requirements it is generally advised to use the default layouts and
program against these. As requests grow in size or time series histories get longer, it is not always possible for DDL
to maintain specific formatting options. The request will still run, but the .LOG file will indicate that the formatting
has defaulted to the standard layout; this can cause ingest scripts to fail and need to be rewritten. See Chapter 13
for thresholds.

NOTES:
• If no values exist for a time-series datatype/instrument combination then that line will be omitted from the output file
• Clients can request null values to be displayed or omitted by using the ‘STATNULL=’ format option; see the relevant
section for more information
• The .log file that accompanies the .CSV file details instruments for which no data was returned; these are omitted
from the output file
• If data you expect is missing from the .CSV file, first check the .rpt file for errors, or the .log file for the information
detailed in the previous note


OTHER FORMATTING OPTIONS


The following details the functionality for other formatting options for DDL.
DDL DataChannel Formatting supports user-friendly functionality: input mnemonic; user-defined labels for instruments
and datatypes; instrument NAME literal; comments.

DataChannel Formatting-specific options:


Enable formatting keyword DCDEFLT= - Y or N (default N)
Merge output keyword MERGE= - Y or N (default Y)
Transpose output keyword TRAN= - Y or N (default contextual: N –
static; Y – time-series)
Column Headings keyword COLUMN= - Y or N (default Y)
Row Headings keyword ROW= - Y or N (default Y)
Titles/dates keyword TITLES= - Y or N (default N)

Optional output fields handled as part of DataChannel formatting:


Input Mnemonic keyword MNEMONIC= - Y or N (default N)
Instrument name keyword NAME= - Y or N (default N)
Series name keyword EXNAME= - Y or N (default Y for DataChannel formatting only)

User-friendly functionality options: input MNEMONIC; user-defined labels for instruments and datatypes; instrument
NAME literal; <COMMENT> records. EXNAME gives greater flexibility in how the series name is both generated and
output. Note that these options are not exclusive to DataChannel formatting; EXNAME, for example, applies equally
to DDL-format output, where the series name literal can be output to identify time series in the same way as occurs
for DataChannel 900B.

MNEMONIC= Tags the output record with the identifier used in the input request.
If a list identifier was defined on input then it is propagated for each constituent. (For time-series
requests, this will be verbose depending on the number of series!)

NAME= Promote instrument name to output record prefix rather than as part of data.

EXNAME= Include series name as part of output record prefix in the same way that DataChannel 900B does.
(Only has relevance in a time-series context.)

Note that for the two NAME-type options, the name literal is derived from the retrieval context rather than that
returned by the NAME datatype.

Taking 26923V / IHG as an example:

NAME datatype value “ICTL.HTLS.GP..”


Derived from Company Accounts retrieval “INTERCONTINENTAL HOTELS GROUP”
Derived from Worldscope retrieval “INTERCONTINENTAL HOTELS GROUP PLC”

For DDL-format output, EXNAME=N is the default, but if EXNAME=Y is explicitly included in the {FORMAT} section of
the request file the series name will be returned. Note that in this context NAME=Y,EXNAME=Y is redundant.

For DataChannel format, EXNAME=Y is the default so EXNAME=N should really only be explicitly included in the
{FORMAT} section of the request file in order to disable this feature.

USERNAME={literal string}
These are used to associate user-defined symbology with instruments and/or datatypes and, where present in the
input request (they may be defined sporadically), will be mapped accordingly.


CHAPTER 12 EXAMPLES OF FORMATTING OUTPUT OPTIONS


STATIC REQUESTS

The following basic request was used to create the outputs on the next pages. The highlighted MERGE= & TRAN= are
the only parts of the request file that are being changed to produce the different style outputs.

The inputs to change MERGE= & TRAN= are either Y or N. All other default settings are applied.


Static format 1

{FORMAT}
DCDEFLT=Y
MERGE=Y
TRAN=N
MNEMONIC=Y
NAME=Y
SRVRFILEPREFIX=SFORMAT1
{REQTYPE} S -0D
{DATATYPES}
P
MV
NOSH
{INSTRUMENTS}
@AAPL
K:CHT
S:NESN

OUTPUT
"Code","Mnemonic","Name","P","MV","NOSH"
"992816","@AAPL ","APPLE",659.3899,618115.9,937406
"867874","K:CHT","CHINA MOBILE",85.95,1727563,20099620
"929724","S:NESN ","NESTLE 'R'",60,193487.9,3224798

Code Mnemonic Name P MV NOSH


992816 @AAPL APPLE 659.3899 618115.9 937406
867874 K:CHT CHINA MOBILE 85.95 1727563 20099620
929724 S:NESN NESTLE 'R' 60 193487.9 3224798


Static format 2

{FORMAT}
DCDEFLT=Y
MERGE=N
TRAN=N
MNEMONIC=Y
NAME=Y
SRVRFILEPREFIX=SFORMAT2
{REQTYPE} S -0D
{DATATYPES}
P
MV
NOSH
{INSTRUMENTS}
@AAPL
K:CHT
S:NESN

OUTPUT

"Code","Name" Code Name


"992816","APPLE" 992816 APPLE
"Code","Mnemonic" Code Mnemonic
"992816","@AAPL " 992816 @AAPL
"Code","P" Code P
"992816",659.3899 992816 659.3899
"Code","MV" Code MV
"992816",618115.9 992816 618115.9
"Code","NOSH" Code NOSH
"992816",937406 992816 937406
"Code","Name" Code Name
"867874","CHINA MOBILE" 867874 CHINA MOBILE
"Code","Mnemonic" Code Mnemonic
"867874","K:CHT " 867874 K:CHT
"Code","P" Code P
"867874",85.95 867874 85.95
"Code","MV" Code MV
"867874",1727563 867874 1727563
"Code","NOSH" Code NOSH
"867874",20099620 867874 20099620
"Code","Name" Code Name
"929724","NESTLE 'R'" 929724 NESTLE 'R'
"Code","Mnemonic" Code Mnemonic
"929724","S:NESN " 929724 S:NESN


Static format 3

{FORMAT}
DCDEFLT=Y
MERGE=Y
TRAN=Y
MNEMONIC=Y
NAME=Y
SRVRFILEPREFIX=SFORMAT3
{REQTYPE} S -0D
{DATATYPES}
P
MV
NOSH
{INSTRUMENTS}
@AAPL
K:CHT
S:NESN

OUTPUT

"Code","992816","867874","929724"
"Mnemonic","@AAPL","K:CHT","S:NESN"
"Name","APPLE","CHINA MOBILE","NESTLE 'R'"
"P",659.3899,85.95,60
"MV",618115.9,1727563,193487.9
"NOSH",937406,20099620,3224798

Code 992816 867874 929724


Mnemonic @AAPL K:CHT S:NESN
Name APPLE CHINA MOBILE NESTLE 'R'
P 659.3899 85.95 60
MV 618115.9 1727563 193487.9
NOSH 937406 20099620 3224798


Static format 4

{FORMAT}
DCDEFLT=Y
MERGE=N
TRAN=Y
MNEMONIC=Y
NAME=Y
SRVRFILEPREFIX=SFORMAT4
{REQTYPE} S -0D
{DATATYPES}
P
MV
NOSH
{INSTRUMENTS}
@AAPL
K:CHT
S:NESN

OUTPUTS

Code,"992816" Code 992816


Name,"APPLE" Name APPLE
Code,"992816" Code 992816
Mnemonic," @AAPL" Mnemonic @AAPL
Code,"992816" Code 992816
P,659.3899 P 659.3899
Code,"992816" Code 992816
MV,618115.9 MV 618115.9
Code,"992816" Code 992816
NOSH,937406 NOSH 937406
Code,"867874" Code 867874
Name,"CHINA MOBILE" Name CHINA MOBILE
Code,"867874" Code 867874
Mnemonic," K:CHT" Mnemonic K:CHT
Code,"867874" Code 867874
P,85.95 P 85.95
Code,"867874" Code 867874
MV,1727563 MV 1727563
Code,"867874" Code 867874
NOSH,20099620 NOSH 20099620
Code,"929724" Code 929724
Name,"NESTLE 'R'" Name NESTLE 'R'
Code,"929724" Code 929724
Mnemonic,"S:NESN" Mnemonic S:NESN
Code,"929724" Code 929724
P,60 P 60
Code,"929724" Code 929724
MV,193487.9 MV 193487.9
Code,"929724" Code 929724
NOSH,3224798 NOSH 3224798


TIME SERIES REQUESTS

The following basic Time Series request was used to create the outputs on the next pages. The highlighted MERGE= &
TRAN= are the only sections of the request file that are changed to produce the different style outputs. The inputs to
change MERGE= & TRAN= are either Y or N. All other default settings are applied.

The request below is providing a time series output for 3 equities and returning data (Price, Market Capitalisation &
Dividend Yield) for 5 days in January 2012

In each of the 4 output styles the input mnemonic and series name have been requested under the extraction level
{FORMAT} section.

{LOGON} XABC123
{FORMAT}
{EXTRACTION}
{TRIGGER}
NOW
{FORMAT}
DCDEFLT=Y
MERGE=
TRAN=
MNEMONIC=Y
NAME=Y
{REQTYPE} T 01/01/2012 05/01/2012 D 4
{DATATYPES}
P
MV
DY
{INSTRUMENTS}
@AAPL
K:CHT
S:NESN


Time Series Format 1

{FORMAT}
DCDEFLT=Y
MERGE=Y
TRAN=Y
MNEMONIC=Y
NAME=Y

"Series","Mnemonic","20111230","20120102","20120103","20120104","20120105"
"APPLE","@AAPL",405.0000,,411.2300,413.4399,418.0298
"APPLE - DIVIDEND YIELD","@AAPL",0.0000,,0.0000,0.0000,0.0000
"APPLE - MARKET VALUE","@AAPL",377518.8000,,383326.1000,385386.1000,389664.4000
"CHINA MOBILE","K:CHT",75.9000,,77.1000,75.7000,76.7500
"CHINA MOBILE - DIVIDEND YIELD","K:CHT",4.2989,,4.2319,4.3102,4.2512
"CHINA MOBILE - MARKET VALUE","K:CHT",1523404.0000,,1547556.0000,1519456.0000,1540531.0000
"NESTLE 'R'","S:NESN",54.0000,,54.3000,54.4000,54.4500
"NESTLE 'R' - DIVIDEND YIELD","S:NESN",3.4259,,3.4070,3.4007,3.3976
"NESTLE 'R' - MARKET VALUE","S:NESN",178199.9000,,179189.9000,179519.9000,179684.9000

Series Mnemonic 20111230 20120102 20120103 20120104 20120105


APPLE @AAPL 405 411.23 413.4399 418.0298
APPLE - DIVIDEND YIELD @AAPL 0 0 0 0
APPLE - MARKET VALUE @AAPL 377518.8 383326.1 385386.1 389664.4
CHINA MOBILE K:CHT 75.9 77.1 75.7 76.75
CHINA MOBILE - DIVIDEND YIELD K:CHT 4.2989 4.2319 4.3102 4.2512
CHINA MOBILE - MARKET VALUE K:CHT 1523404 1547556 1519456 1540531
NESTLE 'R' S:NESN 54 54.3 54.4 54.45
NESTLE 'R' - DIVIDEND YIELD S:NESN 3.4259 3.407 3.4007 3.3976
NESTLE 'R' - MARKET VALUE S:NESN 178199.9 179189.9 179519.9 179684.9


Time Series Format 2

{FORMAT}
DCDEFLT=Y
MERGE=Y
TRAN=N
MNEMONIC=Y
NAME=Y

"Name","APPLE","APPLE - DIVIDEND YIELD","APPLE - MARKET VALUE","CHINA MOBILE","CHINA MOBILE -


DIVIDEND YIELD","CHINA MOBILE - MARKET VALUE","NESTLE 'R'","NESTLE 'R' - DIVIDEND YIELD","NESTLE
'R' - MARKET VALUE"
"Code","992816(P)","992816(DY)","992816(MV)","867874(P)","867874(DY)","867874(MV)","929724(P)","929
724(DY)","929724(MV)"
"Mnemonic","@AAPL","@AAPL","@AAPL","K:CHT","K:CHT","K:CHT","S:NESN","S:NESN","S:NESN"
"20111230",405.0000,0.0000,377518.8000,75.9000,4.2989,1523404.0000,54.0000,3.4259,178199.9000
"20120102",,,,,,,,,
"20120103",411.2300,0.0000,383326.1000,77.1000,4.2319,1547556.0000,54.3000,3.4070,179189.9000
"20120104",413.4399,0.0000,385386.1000,75.7000,4.3102,1519456.0000,54.4000,3.4007,179519.9000
"20120105",418.0298,0.0000,389664.4000,76.7500,4.2512,1540531.0000,54.4500,3.3976,179684.9000

CHINA
APPLE - APPLE - CHINA CHINA MOBILE MOBILE -
Name APPLE DIVIDEND MARKET - DIVIDEND
YIELD VALUE MOBILE YIELD MARKET
VALUE
Code 992816(P) 992816(DY) 992816(MV) 867874(P) 867874(DY) 867874(MV)
Mnemonic @AAPL @AAPL @AAPL K:CHT K:CHT K:CHT
20111230 405 0 377518.8 75.9 4.2989 1523404
20120102
20120103 411.23 0 383326.1 77.1 4.2319 1547556
20120104 413.4399 0 385386.1 75.7 4.3102 1519456
20120105 418.0298 0 389664.4 76.75 4.2512 1540531


Time Series Format 3

{FORMAT}
DCDEFLT=Y
MERGE=N
TRAN=N
MNEMONIC=Y
NAME=Y

Name,"APPLE" Name APPLE


Code,"992816(P)" Code 992816(P)
Mnemonic,"@AAPL" Mnemonic @AAPL
20111230,405.0000 20111230 405
20120102, 20120102
20120103,411.2300 20120103 411.23
20120104,413.4399 20120104 413.4399
20120105,418.0298 20120105 418.0298
Name,"APPLE - DIVIDEND YIELD" Name APPLE - DIVIDEND YIELD
Code,"992816(DY)" Code 992816(DY)
Mnemonic,"@AAPL" Mnemonic @AAPL
20111230,0.0000 20111230 0
20120102, 20120102
20120103,0.0000 20120103 0
20120104,0.0000 20120104 0
20120105,0.0000 20120105 0
Name,"APPLE - MARKET VALUE" Name APPLE - MARKET VALUE
Code,"992816(MV)" Code 992816(MV)
Mnemonic,"@AAPL" Mnemonic @AAPL
20111230,377518.8000 20111230 377518.8
20120102, 20120102
20120103,383326.1000 20120103 383326.1
20120104,385386.1000 20120104 385386.1
20120105,389664.4000 20120105 389664.4
Name,"CHINA MOBILE" Name CHINA MOBILE
Code,"867874(P)" Code 867874(P)
Mnemonic,"K:CHT" Mnemonic K:CHT
20111230,75.9000 20111230 75.9
20120102, 20120102
20120103,77.1000 20120103 77.1
20120104,75.7000 20120104 75.7
20120105,76.7500 20120105 76.75
Name,"CHINA MOBILE - DIVIDEND YIELD" Name CHINA MOBILE - DIVIDEND YIELD


Time Series Format 4

{FORMAT}
DCDEFLT=Y
MERGE=N
TRAN=Y
MNEMONIC=Y
NAME=Y

"Series","Mnemonic","20111230","20120102","20120103","20120104","20120105"
"APPLE","@AAPL",405.0000,,411.2300,413.4399,418.0298
"Series","Mnemonic","20111230","20120102","20120103","20120104","20120105"
"APPLE - DIVIDEND YIELD","@AAPL",0.0000,,0.0000,0.0000,0.0000
"Series","Mnemonic","20111230","20120102","20120103","20120104","20120105"
"APPLE - MARKET VALUE","@AAPL",377518.8000,,383326.1000,385386.1000,389664.4000
"Series","Mnemonic","20111230","20120102","20120103","20120104","20120105"
"CHINA MOBILE","K:CHT",75.9000,,77.1000,75.7000,76.7500
"Series","Mnemonic","20111230","20120102","20120103","20120104","20120105"
"CHINA MOBILE - DIVIDEND YIELD","K:CHT",4.2989,,4.2319,4.3102,4.2512
"Series","Mnemonic","20111230","20120102","20120103","20120104","20120105"
"CHINA MOBILE - MARKET VALUE","K:CHT",1523404.0000,,1547556.0000,1519456.0000,1540531.0000
"Series","Mnemonic","20111230","20120102","20120103","20120104","20120105"
"NESTLE 'R'","S:NESN",54.0000,,54.3000,54.4000,54.4500
"Series","Mnemonic","20111230","20120102","20120103","20120104","20120105"
"NESTLE 'R' - DIVIDEND YIELD","S:NESN",3.4259,,3.4070,3.4007,3.3976
"Series","Mnemonic","20111230","20120102","20120103","20120104","20120105"
"NESTLE 'R' - MARKET VALUE","S:NESN",178199.9000,,179189.9000,179519.9000,179684.9000



CHAPTER 13 SYSTEM THRESHOLDS


Users should note the following system thresholds; if a threshold is exceeded, output will be forced to MERGE=N, TRAN=N.

STATIC REQUESTS

TRAN=N (ROW – instruments; COLUMN – datatypes)

The number of columns/datatypes must not exceed 678 if COLUMN=Y, or 1414 if COLUMN=N. (This threshold cannot in practice be exceeded, as an extract will not permit this number of datatypes.)

TRAN=Y (ROW – datatypes; COLUMN – instruments)

The number of columns/instruments must not exceed 1414. However, this limit is reduced to 160 if any datatypes returning long literal fields (such as WC06091 – Worldscope Business Description) are requested. Additionally, comments are factored into the number of columns (1 comment ≡ 3 instruments), and there is a further restriction (internal program array space for inverting the output): the number of rows × the number of columns × 103 (or 203 for long literal fields) must not exceed 4,450,000.

TIME-SERIES REQUESTS

TRAN=Y (ROW – series; COLUMN – datapoint value)

No. of columns/datapoints must not exceed 1410.


(Note that DDL format has a larger theoretical maximum, as overflow/continuation records are generated if the DATES or values records exceed the MVS operating system limit of 32k characters. Data Channel formatting's lower figure ensures that values are aligned under a single date for retrievals of up to 5.4 years of daily data.)

TRAN=N (ROW – datapoint value; COLUMN – series)

The number of columns/series must not exceed 1414. Additionally, there is a further restriction (internal program array space for inverting the output): the number of rows × the number of columns × 23 must not exceed 4,450,000.

Warning

There is also an absolute limit on the number of output records, which can only be breached by time-series MERGE=N, TRAN=N ("long-and-thin") output - this is approximately 5 million. Once exceeded, an attempt is made to terminate output at the end of the request set being processed, and a warning message is written to the .log file.


CHAPTER 14 ERROR / WARNING MESSAGES

The table below outlines all of the possible return descriptions for errors that occur during the processing of a DDL request file. These error messages are displayed in the .rpt file produced after a request file is uploaded, and in the .err file produced along with the .CSV DDL output. Each error message is displayed with the relevant instrument code, datatype or format card that the message is associated with.

ERROR MESSAGE ERROR DETAIL

LOGON CARD MISSING: The 1st card in the request must be a {LOGON} card.

REQTYPE CARD MISSING: The {REQTYPE} card has not been entered, or has been entered incorrectly.

STATIC/TIME SERIES IND. MUST BE S OR T: The static/time series indicator entered in the {REQTYPE} card is incorrect. It must be either 'S' or 'T'.

INVALID START DATE: The start date has been entered incorrectly in the {REQTYPE} card.

INVALID END DATE: The end date has been entered incorrectly in the {REQTYPE} card.

DUPLICATE DATATYPE: Consecutive datatypes have been found to be the same - this is not permitted.

FREQUENCY MUST BE D W M Q Y: The frequency entered in the {REQTYPE} card is incorrect. It must be either 'D', 'W', 'M', 'Q', 'Y' or ' '.

DATATYPE CARD MISSING: The {DATATYPES} card must follow on from the {REQTYPE} card. If the {DATATYPES} card is missing, the set will be rejected and all following input will be ignored until the next {REQTYPE} card is reached.

INVALID DATATYPE: The datatype in question has been returned as invalid.

INSTRUMENT CARD MISSING: The {INSTRUMENTS} card has not been entered, or has been incorrectly entered.

INVALID STOCK/LIST TYPE: The requested instrument is neither a valid stock nor a valid list, or the user does not have permission for this instrument.

FORMAT CARD MISSING: The 2nd card in the request must be a {FORMAT} card.

INVALID DATATYPE FOR TIME SERIES: A datatype entered as part of a time series request is only valid for static data retrieval.

INCOMPLETE SET - CHECK INPUT FILE DATA: The end of the request has been found, but the request is incomplete; it must begin with {LOGON} and {FORMAT} cards, with each set containing a {REQTYPE} card, a {DATATYPES} card followed by at least one datatype for a STATIC request (although not necessarily for a TIME SERIES request), and an {INSTRUMENTS} card followed by at least one instrument (in that order).

NO VALID DATATYPES FOR STATIC: No valid datatypes have been found for a STATIC request, which must have at least ONE valid datatype associated with it.


START DATE MUST NOT BE AFTER END DATE: The start date entered in the {REQTYPE} card is later than the end date.

EMPTY FILE - PREVIOUS/NO REQUEST PROCESSED: The input file is empty.

DATATYPES EXCEED 100: The maximum allowable number of datatypes has been exceeded.

DATATYPE NOT ACCOUNT/CODIFIED PROFILE: The first datatype in the request was a company accounts datatype; therefore all subsequent datatypes must be company accounts or codified profiles datatypes.

ACCOUNT NUM/CODIFIED PROFILES NOT ALLOWED: The first datatype in the set was NOT a company accounts datatype, so subsequent datatypes may NOT be company accounts or codified profiles datatypes.

STOCKS EXCEEDING 20000 IGNORED: A maximum of 20000 individually requested stocks per set is permitted - this amount has been exceeded.

LISTS EXCEEDING 20000 IGNORED: A maximum of 20000 lists per set is permitted - this amount has been exceeded.

DATATYPE CODENUM - IGNORED: If the datatype CODENUM is entered it will be ignored, as it is applied to all instruments later regardless of whether or not it has been requested by the user.

ALL STOCKS/LISTS ARE INVALID: All instruments found in this set have been returned as invalid.

NO INSTRUMENTS FOUND IN THE SET: No instruments have been entered for this set.

ALL DATATYPES INVALID: All datatypes in the set are invalid, so the set is rejected.

LIMIT OF 10 COMPOSITE DATATYPES EXCEEDED: The maximum number of composite datatypes allowed is 10 - this number has been exceeded.

ECONOMICS NOT ALLOWED HERE: An economic instrument has been found in a set where the first instrument was not an economic one. Whilst this is acceptable for STATIC series, it is NOT acceptable for TIME SERIES.

ONLY ECONOMICS ALLOWED HERE: A non-economic instrument has been found in a TIME SERIES set which was started with economics. Only economics are allowed in this set.

ECONOMICS WITH INCOMPATIBLE FREQUENCIES: For economic series the display frequency must be the same as, or less frequent than, the series base frequency. In an economics set with default frequency, the base frequencies of the series must be the same.

INVALID DECIMAL NUMBER INPUT: The required number of decimal places entered in the {REQTYPE} card or on a datatype card is incorrect. It must be a single digit between 0 and 9.

X.U DATATYPE NOT ALLOWED IN TIME SERIES: The datatype X.U is only valid for STATIC requests.

ACCOUNTS FREQUENCY MUST BE Y: The requested frequency for TIME SERIES account datatypes must be 'Y'.

DATATYPE LONGER THAN 45 CHARACTERS: The datatype entered exceeds 45 characters in length.


INVALID FORMAT PARAMETER: An invalid format parameter has been detected.

DUPLICATE FORMAT PARAMETER: A duplicate format parameter has been detected.

DATA CHANNEL FORMAT MUST BE YES: Data Channel processing is NO by default, or has been set to NO, but one or more of the following Data Channel formatting parameters has been entered: MERGE=, TRAN=, COLUMN=, ROW=, TITLES=. To use these, DCDEFLT=Y must be set.

DATA CHANNEL FORMAT MUST BE NO: If Piranha formatting is set to YES, then Data Channel formatting must be set to NO.

PROCESSING STOPPED DUE TO FORMAT ERRORS: One or more of the formatting parameters have been incorrectly entered, so processing has been stopped.

COMMAS AND LEADING SPACES NOT ALLOWED: The chosen three-letter abbreviation for NOT AVAILABLE cannot contain commas or leading/separating spaces.

DISPLAY STATIC NULL VALUES MUST BE NO: If Data Channel formatting is selected (DCDEFLT=Y) then STATNULL must be set to 'N'.

MORE THAN 1 DECIMAL NUMBER INPUT: More than one DECPL=x has been entered on the same datatype card.

MORE THAN 1 PADDING INDICATOR INPUT: More than one PADTIME=x has been entered on the same datatype card.

INVALID PADDING INDICATOR INPUT: The padding indicator entered on the {REQTYPE} card or on a datatype card is invalid. It must be 'P', 'U' or ' '.

DUPLICATE STOCK - IGNORED: Consecutive stocks have been found to be the same - this is not permitted.

INVALID LOGON CARD - READING TO NEXT ONE: The {LOGON} card is invalid - nothing will be processed until the next {LOGON} card is found or the end of file is reached.


CHAPTER 15 FUNCTIONS & EXPRESSIONS

FUNCTIONS

Functions are an important part of Datastream functionality. DDL can access that functionality.

How do functions work?


Datastream functions are statistical operators that allow you to calculate and view data in the way you want. For
example, you may need to look at a moving average or percentage change rather than an actual price movement.
There are currently more than 50 Datastream functions.

To learn more about Datastream functions, visit the Datastream Extranet:


http://extranet.datastream.com/User%20Support/PubDoc/FunctionsAndExps.htm

Construction
All functions share a common format:

Function#(Expression,Parameters) where:

Function# is a function code, for example MAV# for a moving average.

Expression – in DDL, the expression is always represented by a substitutable "X". At execution time, X is substituted by each instrument defined in the {INSTRUMENTS} section of the request.

Parameters vary according to the function, but generally relate to time, for example, a start date or time period.

For example:

To calculate a 30 day moving average of British Airways share price, type:

{DATATYPES}
MAV#(X,30D)
{INSTRUMENTS}
BAY
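Since every function datatype follows the Function#(Expression,Parameters) pattern with a substitutable X, the {DATATYPES} lines in a request file can be generated programmatically. A minimal sketch follows; the helper is our own, not a DDL facility:

```python
# Build DDL function-datatype strings of the form FUNC#(X,params).
# The substitutable "X" is filled in by DDL at execution time with
# each instrument listed in the {INSTRUMENTS} section.

def function_datatype(func, *params):
    """e.g. function_datatype('MAV', '30D') -> 'MAV#(X,30D)'"""
    return "{}#(X,{})".format(func, ",".join(params))

print(function_datatype("MAV", "30D"))               # MAV#(X,30D)
print(function_datatype("SUM", "1/1/04", "1/6/04"))  # SUM#(X,1/1/04,1/6/04)
```

Each generated string is written as one line of the {DATATYPES} section, exactly as in the example above.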

Static and Moving Formats


There are two forms of function: static and moving. Some functions can be used in both static and moving formats:

Actual and Percentage change - ACH#, PCH#


Maximum and Minimum values - MAX#, MIN#
Standard deviation values - SDN#, SDN1#
Sum of values - SUM#
Weighted average - WAV#


Static Formats

Static format functions produce a single value between two specified dates and take the format:

Function#(Expression,Start Date,End Date)

For example, to calculate the total number of Marks & Spencer shares traded between 1/1/04 and 1/6/04, type:

{DATATYPES}
SUM#(X(VO),1/1/04,1/6/04)
{INSTRUMENTS}
MKS

This gives a single figure; the total number of shares (VO) traded between the start and end date.

Moving Formats
Moving format functions produce a rolling value over a specified period and take the format:

Function#(Expression,Period)

For example, to calculate the weekly total volume of shares traded for Marks & Spencer over a 5 day period, type:

{DATATYPES}
SUM#(X(VO),5D)
{INSTRUMENTS}
MKS

This gives the total number of shares traded in the previous five days for each day in the time period specified in the
program, for example, 12 months.

Using dates within functions


You can use two date formats with functions within DDL:

• Actual dates
• Relative dates

Actual dates
Actual dates are used to define the actual start and end dates for a function. For example,
to obtain the average share price for British Airways between 1/6/98 and 1/6/99, type:

{DATATYPES}
AVG#(X,1/6/98,1/6/99)
{INSTRUMENTS}
BAY

Warning

If the actual date falls on a Saturday or Sunday, data for the previous Friday is used.


Relative Dates

Relative dates allow you to specify a period relative to the DDL date at the time of execution. For example, -10D
indicates the 10 days prior to the DDL date. The default end date is the DDL date. The periods you can use are:

• D days
• W weeks
• M months
• Q quarters
• Y years

For example, the following gives the highest value of the FTSE 100 index over the past 30 days:

{DATATYPES}
MAX#(X,-30D,)
{INSTRUMENTS}
FTSE100

Or

The following gives the highest value of the FTSE 100 index over the past 2 weeks:

{DATATYPES}
MAX#(X,-2W,)
{INSTRUMENTS}
FTSE100

Warning

If you use displacement dates with a single function, take care not to confuse the moving and static formats of SDN#,
MAX#, MIN# and SUM#. See below.

The following two examples, identical in format except for the final comma, produce very different results. In both cases,
a time series request is used with the start and end dates being 1/1/98 and 1/1/99 respectively.

{DATATYPES}
SUM#(X,-3M)
{INSTRUMENTS}
FTSE100

In this example, a moving total for the FTSE 100 index is calculated with each calculated value based on a moving
three month spread of data, ranging from October-December 1997 for the earliest figure though to October-December
1998 for the last figures. Note that the minus sign (-3M) is ignored in this format.


{DATATYPES}
SUM#(X,-3M,)
{INSTRUMENTS}
FTSE100

In this example, a single sum for the FTSE 100 index is returned, with the calculated value based on the last three months' data, that is, from October 1998 to December 1998.

Time period dates specify the period (days, weeks, months, or years) over which a calculation is made. You can use
any of the following period codes: D, W, M, Q, or Y.
For example, to calculate a moving average

MAV#(X,5D) gives the average of X over a moving 5 day period


MAV#(X,1M) gives the average of X over a moving 1 month period

Function time periods must be greater than the data frequency, that is, the frequency at which data is stored. For
example, equity data is stored on a daily basis: the minimum time period you can use, therefore, is two days (that is,
2D). Monthly economic data requires a minimum time period of two months (that is, 2M), and so on. Please note the
following exceptions to this rule.

The following functions accept a time period that is equal to the data frequency:

• Actual change - ACH#


• Lag or lead - LAG#
• Percentage change - PCH#

For these functions, single unit time periods are accepted, that is: 1D for daily data, 1M for monthly data, 1Q for quarterly data, and 1Y for annual data.

Nested functions
Nested functions are functions used within other functions. A simple series mnemonic can be replaced with a second
function.

For example, the following calculates the monthly percentage change in the US Dollar - Japanese Yen exchange rate:

{DATATYPES}
PCH#(X,1M)
{INSTRUMENTS}
JAPYNUS

You can replace the exchange rate mnemonic with a function, for example, the moving average of the exchange rate:

{DATATYPES}
PCH#(MAV#(X,1M),1M)
{INSTRUMENTS}
JAPYNUS


This new function calculates the monthly percentage change in the monthly moving average for the US Dollar-
Japanese Yen exchange rate. This generates a much smoother trend line when graphed. You can combine, or nest, up
to 6 functions in a single expression.

You can place functions anywhere in an expression. For example, to show the deviation of the actual British Airways share price around the 30 day moving average (by subtracting the moving average from the actual price; positive values indicate a price above its moving average), type:

{DATATYPES}
X-MAV#(X,30D)
{INSTRUMENTS}
BAY

Nested functions are subject to the following rules:

• Actual dates are only permitted in the bottom (i.e. innermost) nested function
• A maximum of 6 functions per expression
• A maximum of 19 operands per expression
• A maximum of three function 'blocks'. That is, sections of an expression separated by mathematical
operators (see the volatility example below)

These rules allow you to construct highly complex expressions. For example, the expression for historic volatility is:

POW#(12,0.5)*SDN#(LN#(X/LAG#(X,1M)),60M)

In this example, LN# and LAG# are nested within SDN#, where X is a series mnemonic. This expression has two function blocks: POW# defines the first, and SDN# defines the second.
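The nesting rules above can be checked programmatically before a request is submitted. The following sketch composes nested function expressions and counts function codes; the helpers are our own, and the 6-function limit is the one quoted in this chapter:

```python
# Compose and validate nested DDL function expressions.

def nest(func, expression, *params):
    """Wrap an expression in a function,
    e.g. nest('PCH', 'MAV#(X,1M)', '1M') -> 'PCH#(MAV#(X,1M),1M)'."""
    return "{}#({},{})".format(func, expression, ",".join(params))

def function_count(expression):
    """Count function codes (each marked by '#') in an expression."""
    return expression.count("#")

expr = nest("PCH", nest("MAV", "X", "1M"), "1M")
print(expr)                            # PCH#(MAV#(X,1M),1M)
assert function_count(expr) <= 6       # within the nesting limit

# The historic volatility expression above contains four functions.
print(function_count("POW#(12,0.5)*SDN#(LN#(X/LAG#(X,1M)),60M)"))  # 4
```

Counting '#' characters is only a rough check, but it is enough to catch expressions that would breach the 6-function limit before DDL rejects them.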

Warning

Nested functions are self-contained functions, complete with mnemonics and date parameters, contained within a
second function. They are quite distinct from function codes without parameters used as arguments in other functions -
see next section.

EXPRESSIONS

Expressions are formulae made up of Datastream codes, or series, combined with datatypes, functions, or
mathematical operators. Users can create their own expressions within the Datastream Advance application by clicking the Expression button.

For example: Stock(MV) / Index(MV) * 100

This calculates the weighting of a stock in an index. There is a set of global expressions created by Datastream available for your use. These can be listed, with descriptions, using the Expression button within Datastream Advance or AFO; all global expressions are stored as nnnE codes.


Construction
Any Datastream code or series can be combined with appropriate datatypes, functions, and mathematical operators to form an expression. You can form expressions with variables to represent dates and codes.
The following rules apply to the construction of expressions:

Mathematical operators
* multiply
/ divide
+ add
- subtract

Brackets
Rounded brackets must be used to indicate priority. If you do not use brackets, mathematical priority applies; that is, * and / are processed before + and -.

Operands
Operands may be series codes with or without datatypes and optionally:
• Currency conversion, tilde ~
• Expression codes, nnnE
• Functions
• Constants

Warning

Expressions may be symbolic. Constants of 6 or more digits must include a decimal point to distinguish them from
series codes. The maximum number of operands you can have in one expression is 19.

Formats

Symbolic expressions use variables to represent codes or values in their construction. This enables you to use the
same expression with many different codes or values.

For example: (X/65)*100

Assigning values to symbolic expressions

When using a symbolic expression, you must enter parameters for the variables. For example, the expression to
calculate the stock turnover of a company is PCH#(X/X(PE),1Y) where X represents the equity.

This expression is stored as global expression 004E. To use it, you must state the value for X.

For example:

{DATATYPES}
004E(X)
{INSTRUMENTS}
BAY
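Before submitting a symbolic expression, it can be useful to preview how DDL will resolve it by substituting an instrument for the variable X. A sketch follows; the helper is our own, and the expression body is the 004E formula quoted above:

```python
import re

def resolve(expression, instrument):
    """Preview a symbolic expression by replacing the standalone
    variable X with an instrument code. Word boundaries keep other
    letters (e.g. the X in a mnemonic) untouched."""
    return re.sub(r"\bX\b", instrument, expression)

print(resolve("PCH#(X/X(PE),1Y)", "BAY"))   # PCH#(BAY/BAY(PE),1Y)
```

Note that DDL performs this substitution itself at execution time; the preview is only a client-side aid for checking that an expression is well-formed.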


Global expressions
Global expressions are a set of expressions created by Datastream and are available to all users. They can be accessed through the function button within Datastream Advance or AFO, then by choosing Global Expressions from the drop-down menu.


CHAPTER 16 USING THE WEEKEND EXTRACTS

The DDL weekend extract facility is provided for clients that need to download long histories. To use this facility, clients should first contact their Thomson Reuters Relationship Manager to have their DDL account entitled for access. Clients are asked to provide a brief summary of their requirements; once these have been identified, the appropriate weekend TRIGGER will be provided.


Clients should be aware of the following.

• A separate request file from the daily request file must be used. This should be named DD4[ccc]W.REQ, where [ccc] is the client's three-character FTP account ID.

• Weekend processing starts at approximately 12:00 pm UK time each Saturday.

• Clients can use up to 99 Extracts in the weekend request file, each extract must have the same trigger, but
must have a unique {EXTRACTION} identifier.

• The weekend request file must be uploaded to the Request folder of the client's FTP account.

• Clients can upload their weekend request file at any time during the week, but it must be available by 11 am UK time on Saturday to ensure processing.

• A separate .rpt file will be made available shortly after a weekend request file has been uploaded; it will contain a 'W' in the filename to distinguish it from the daily .rpt file.

• If the WEEKEND extract is enabled, a DD4[ccc]W.REQ weekend request file must always be present in your Request directory. If you have no weekend requirement for a particular Saturday, the dummy (e.g. 0 kb) request file that is set up when the WEEKEND extract is enabled satisfies this minimum requirement. A copy of the dummy request file is also posted to the Internal_def folder when the extract is first enabled.
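The upload step in the rules above can be scripted. The following sketch places the weekend request file in the Request folder over FTP; the host name, credentials and local file path are placeholders, so substitute your own account details:

```python
from ftplib import FTP

ACCOUNT_ID = "abc"   # placeholder: your three-character FTP account ID
REQUEST_FILE = "DD4" + ACCOUNT_ID.upper() + "W.REQ"

def upload_weekend_request(host, user, password, local_path):
    """Upload the weekend request file to the Request folder of
    the DDL FTP account."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        ftp.cwd("Request")   # the weekend file goes in the Request folder
        with open(local_path, "rb") as fh:
            ftp.storbinary("STOR " + REQUEST_FILE, fh)

# Example call (placeholder host and credentials):
# upload_weekend_request("ftp.example.com", "abc", "secret", "DD4ABCW.REQ")
```

A scheduled job running such a script during the week comfortably meets the 11 am Saturday deadline.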


Below is an example request file requesting time-series and static data in two separate Extracts. Please note that all
other request file rules that apply to the daily request file also apply in the weekend request file:

{LOGON} DS ID
{FORMAT}
{EXTRACTION} 01
{TRIGGER}
WEEKEND
{FORMAT}
{REQTYPE} S -0D
{DATATYPES}
NAME
SECD
EXMNEM
{INSTRUMENTS}
LFTSE100
543232
{EXTRACTION} AA
{TRIGGER}
WEEKEND
{FORMAT}
{REQTYPE} T 31/12/2000 -0D D
{DATATYPES}
P
VO

The WEEKEND extract is not automatically enabled when your account is initially set up. To enable this facility please
contact your Thomson Reuters Relationship Manager.

Perpetual WEEKEND request files can be set up by forwarding your request file to your local Helpdesk. The same rules apply to editing/overwriting a perpetual WEEKEND request as to the standard daily request files.


CHAPTER 17 HANDLING PRICE HISTORY ADJUSTMENTS & CORPORATE


ACTIONS

If you are downloading extensive price histories and storing them locally, it is important to maintain the integrity of your local database by adjusting price histories appropriately for corporate actions. This section explains different methods for achieving this.

Using the LAFAMEND List


The suggested method is to download data through DDL only for those instruments that have experienced a corporate action adjusting the price history in the previous 24 hours.

The method uses exactly the same request structure as current DDL requests and can be included in a request file with requests for other data. Instruments that have experienced an adjustment to their price history are placed into a mainframe list called LAFAMEND. This list is created at approximately 6 am GMT by a mainframe process that checks all instruments in the Equity universe for a change to the datatype CAI in the previous 24 hours. The LAFAMEND list usually contains up to 30 or 40 instruments, making any output files very small.

The procedure is for clients to request the LAFAMEND list in a request file, along with the datatypes they are interested in, for the amount of history they require. An example is given below:

{LOGON} XABC123
{FORMAT}
{EXTRACTION} CA
{TRIGGER}
CAMEND
{FORMAT}
{REQTYPE} T 01/01/1980 -0D D
{DATATYPES}
P
PO
PH
PL
VO
{INSTRUMENTS}
LAFAMEND

If you cut and paste the example above, the Extract will process as soon as the LAFAMEND list has been updated; this is achieved by the CAMEND trigger, which is linked to the LAFAMEND creation process.

As the list is produced using the entire Datastream Research Equity universe, not the client's own universe, it is rare that the contents of the LAFAMEND list will exactly match the instruments a client is interested in. Therefore, the client must filter the output data to ensure that they only ingest data for the instruments they are interested in, or contracted for.
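The filtering step can be as simple as discarding output records whose leading instrument code is not in a locally maintained set of contracted codes. A sketch follows; the code set and sample records are illustrative, and the parsing assumes a CSV layout where each record begins with the instrument code, so adapt it to your chosen output format:

```python
# Filter DDL output produced from the LAFAMEND list so that only
# contracted instruments are ingested.

CONTRACTED = {"992816", "867874"}   # placeholder: codes you are entitled to

def filter_records(lines, wanted=CONTRACTED):
    """Yield only CSV records whose leading code field is wanted."""
    for line in lines:
        code = line.split(",", 1)[0].strip('"')
        if code in wanted:
            yield line

sample = ["992816,20120105,418.0298", "123456,20120105,10.0000"]
print(list(filter_records(sample)))   # keeps only the 992816 record
```

Records that pass the filter can then be applied to the local price history; everything else is discarded.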

The availability time of the output data cannot be guaranteed, as the amount of data the client, or other clients, will request is not known in advance. Also, due to the variability in the amount of processing occurring on the Datastream system at any one time, the LAFAMEND creation time can vary, typically between 4 am and 6 am GMT.


There are some things to bear in mind. These are:

1. The LAFAMEND creation process is set to run off the last relevant data ingest job. If that job is delayed, the creation of the LAFAMEND list may be delayed too. By using the CAMEND trigger, clients are assured that the Extract will only be processed once the LAFAMEND list has been successfully updated.

2. The list is not available historically. The build process that runs daily overwrites the previous contents of the file with
the new instruments for that day.

3. The list can be used with any relevant datatype, i.e. NAME, AX, AF, etc. The datatypes do not have to be time-series only, and do not have to relate to the retrieval of history. The client can retrieve anything that they are contractually allowed to.

Other Methods of Tracking Adjusted Price History


In addition to the method given above, it is also possible for clients using DDL to track these changes themselves. There are a number of datatypes that can be used to spot a corporate action. Below is a definition of each mnemonic and how it is used. Important: the results of some of these datatypes indicate a corporate action; this does not necessarily mean a change to the price history.

AF – This datatype shows the adjustment factor. It is updated daily and is available historically. The datatype shows the
adjustment factor on any given date. Each time there is a corporate action that has an adjustment factor, this is shown
on the same day on datatype AF. Also, the historical values for AF are then adjusted accordingly to provide a consistent
adjustment factor for the price history.

AX – This datatype shows the adjustment coefficient and indicates that a corporate action has taken place on a
particular day. Datatype AX will always show 0 or N/A unless there has been a corporate action on that day, when it will
reflect the adjustment factor. Again this datatype is available historically.

CAI – This is similar to the datatype AF, but is expressed in the form of an index rather than actual adjustment factors,
and the base value at the base date of the stock history is fixed at 1. The adjustment factors are accumulated in
chronological order. For a capital action occurring on day t, the CAI value changes on day t. The issue with having the
client follow this datatype is they are likely to be one day behind in updating their history.
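One way to act on these datatypes is to flag the days on which AX is non-zero, since AX only shows a value on the day of a corporate action. The sketch below is our own; the (date, value) layout is illustrative, not a DDL output format, so map your parsed AX history into it first:

```python
# Use the AX datatype (adjustment coefficient) to spot days on which
# a corporate action adjusted the price history. AX shows 0 or N/A
# except on the day of an action, when it reflects the factor.

def adjustment_days(ax_series):
    """ax_series: list of (date, value) pairs; value may be None (N/A).
    Returns the dates on which a corporate action occurred."""
    return [date for date, value in ax_series
            if value is not None and value != 0]

series = [("20120103", 0.0), ("20120104", 0.5), ("20120105", None)]
print(adjustment_days(series))   # ['20120104']
```

Each flagged date indicates a stock whose local price history should be refreshed or re-adjusted.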


Advance notice of Corporate Actions

Clients wanting advance notice of corporate actions should use the list code ADJALERT. This list catches any corporate action that will occur in the next 24 hours on Datastream. The list is updated at 06:00 am UK time daily, and again at 12:00 pm UK time to catch late updates. The default data in the list is Datastream codes. To display the names, codes, adjustment factor details, type of corporate action and corporate action date, the following datatypes can be used.

An example request using the ADJALERT list is below

{LOGON} XABC0101
{FORMAT}
{EXTRACTION} AD
{TRIGGER}
1130
{FORMAT}
SRVRFILEPREFIX=ADJALERT
DCDEFLT=Y
MERGE=Y
TRAN=N
MNEMONIC=Y
{REQTYPE} S -0D
{DATATYPES}
NAME
AF
AX
CAI
CEXT
CEXD
{INSTRUMENTS}
ADJALERT



Corporate Actions Composite Datatypes

In 2008, new datatypes were released that allow a composite download of historic corporate actions through DDL. These datatypes provide the last 1, 2, 3, 4, 5, 10, 15, 20, 25 or 30 corporate actions for an individual stock or equity list.

The available datatypes are:

Mnemonic Description
CA01 Corporate Action - Last event
CA02 Corporate Action - Last 2 events
CA03 Corporate Action - Last 3 events
CA04 Corporate Action - Last 4 events
CA05 Corporate Action - Last 5 events
CA10 Corporate Action - Last 10 events
CA15 Corporate Action - Last 15 events
CA20 Corporate Action - Last 20 events
CA25 Corporate Action - Last 25 events
CA30 Corporate Action - Last 30 events

Each entry will display the following data items where available:

o EE – Ex/Event Date
o TI – Type of Issue
   - RGHT – Rights Issue
   - BONS – Bonus Issue
   - DISC – Stock Distribution Cash Amount Equivalent
   - DIST – Stock Distribution
   - STKD – Stock Dividend
   - STKC – Cash Distribution
   - CAPR – Capital Repayment
   - TAKE – Takeover
   - MERG – Merger
   - DMRG – Demerger
   - EXCH – Exchange
   - SPLT – Stock Split
   - SUBD – Subdivision
   - CONS – Consolidation
   - REVS – Reverse Split
   - SPNO – Spin Off
   - LOTS – Trading Lot Size Change
o CI – Complex Issue Marker
   - N – Explains adjustment factor
   - Y – Does not explain adjustment factor
   - X – Not applicable
o RM – Renounceable Marker
   - N – Non-Renounceable (e.g. Open offer)
   - Y – Renounceable
   - X – Not applicable
o ST – Status (for takeovers/mergers)
   - PRP – Proposed
   - LPS – Lapsed
   - EFF – Effective
   - UNC – Unconditional
   - WUC – Wholly Unconditional
   - CLO – Closed
o AD – Announcement Date
o RD – Record Date
o ED – Expiry Date
o TN – Terms No. New Shares
o TO – Terms No. Old Shares
o CU – Currency
o IO – Issue/Offered Price
o RS – Resultant Stock Code
o OC – Offering Co. Name
o MI – Multiple Issue Marker
   - O – On original holding
   - N – On new holding
   - X – Not applicable
Please note these datatypes are currently only available for data from 1st January 1998.


© 2008 Thomson Reuters. All rights reserved.