
Ministry of Defence

Defence Standard 03-44


Issue 2 Publication Date 31 March 2008

A Generic Process for the Verification & Validation of Modelling and Simulation & Synthetic Environments Systems


Contents
Foreword ....................................................iv
0 Introduction ..............................................v
0.1 Background ..............................................v
0.2 Intended Audience .......................................v
0.3 Tailoring ...............................................v
1 Scope .....................................................1
2 Warning ...................................................1
3 Definitions ...............................................1
4 Benefits of V&V ...........................................2
4.1 Why do V&V? .............................................2
4.2 When to do V&V? .........................................2
4.3 Acceptance Considerations ...............................2
4.4 Key V&V Stakeholders ....................................2
5 The V&V Process ...........................................3
5.1 Overview ................................................3
5.2 The 5 Step Process ......................................4
5.3 Plan V&V ................................................4
5.4 Collate Evidence ........................................5
5.5 Assess the Evidence .....................................5
5.6 Determine the Residual Risk .............................5
5.7 Reporting ...............................................5
5.8 Acceptance ..............................................6
5.9 Detailed V&V Process and Activities .....................6
Annex A V&V Process Activity Descriptions ...................7
A.1 Required Activities .....................................7
A.2 Plan V&V ................................................7
A.3 Collate Evidence ........................................15
A.4 Assess Evidence .........................................17
A.5 Determine the Residual Risk .............................18
A.6 Report ..................................................19
Annex B Detailed Information Requirements and Worked Examples ...21
B.1 Key Inputs ..............................................21
B.2 Further Sources of Information ..........................27

Figures
Figure 1 The Top Level V&V Process...........................................................................................................4
Figure 2 V&V Activities ..................................................................................................................................7
Figure 3 Information Requirements ...........................................................................................................21
Figure 4 Test Hierarchy ...............................................................................................................................26

Tables
Table 1 Definitions .........................................1
Table 2 Simple V&V Objectives ...............................9
Table 3 V&V Techniques ......................................10
Table 4 Sample Activity Table ...............................12
Table 5 Example Acceptance Criteria Recording Table .........24


Foreword
AMENDMENT RECORD

Amd No | Date | Text Affected | Signature and Date

REVISION NOTE
This standard is raised to Issue 2 following a review of comments submitted during the Interim status of the
Defence Standard.
HISTORICAL RECORD
This standard supersedes the following:
Interim Def Stan 03-44 Issue 1 dated 31 January 2007.

a. This Standard provides Top Level Process guidance, and details associated activities, for the Verification & Validation (V&V) process for Modelling and Simulation & Synthetic Environment systems.

b. This Standard has been produced on behalf of the Defence Material Standardization Committee (DMSC) by the Directorate of Analysis, Experimentation and Simulation (DAES) in MOD.

c. This Standard has been agreed by the authorities concerned with its use and is intended to be used whenever relevant in all future designs, contracts, orders etc. and whenever practicable by amendment to those already in existence. If any difficulty arises which prevents application of the Defence Standard, UK Defence Standardization (DStan) shall be informed so that a remedy may be sought.

d. Any enquiries regarding this standard in relation to an invitation to tender or a contract in which it is incorporated are to be addressed to the responsible technical or supervising authority named in the invitation to tender or contract.

e. Compliance with this Defence Standard shall not in itself relieve any person from any legal obligations imposed upon them.

f. This Standard has been devised solely for the use of the Ministry of Defence (MOD) and its contractors in the execution of contracts for the MOD. To the extent permitted by law, the MOD hereby excludes all liability whatsoever and howsoever arising (including, but without limitation, liability resulting from negligence) for any loss or damage however caused when the standard is used for any other purpose.


0 Introduction
This Standard describes the Verification & Validation (V&V) process and associated activities that have been
developed by DAES under Output 03A (Joint Capability Studies) research activity. The generic process is
designed to be both tailorable and adaptable to a wide range of Modelling and Simulation & Synthetic
Environment (MS&SE) activities.

0.1 Background

The process is based on best practice derived from a considerable body of work undertaken in the UK and
the rest of the world over the course of many years. The approach taken by this Standard distils much of this
early work into a simple high-level process from which more detailed activities can be derived. Both the
process and the activities are described in this Standard.

0.2 Intended Audience

This Standard has been written for a very broad range of potential readers that are organized into three
basic groups:
a. User Community. The users of modelling and simulation are the personnel that conduct studies,
experimentation, Test & Evaluation (T&E) activities or deliver training using MS&SE based tools.
b. Developers. The developers of models or MS&SE are the technical personnel that engineer a new
model or simulation or integrate them as a Synthetic Environment (SE).
c. V&V Practitioners. Those who plan and execute the V&V process. Dependent upon the nature of
the MS&SE activity they may be part of either the user or the development community, but it would
be preferable if they have some degree of independence to minimize the effects of bias.

0.3 Tailoring

0.3.1 Users of this Standard should be aware that the V&V activities described, whilst being generally applicable to MS&SE, are intended to be tailored to meet the needs of each individual application. Every use and therefore every application of this Standard could be unique; hence the extent to which the recommended V&V practices can be performed depends on the quality of the requirements information and the resources available to the V&V Practitioners.
0.3.2 The practices described in this Standard should be used as a starting point for developing the
specific approach to the V&V requirement to support the intended use.


A Generic Process for the Verification & Validation of Modelling and Simulation & Synthetic Environments Systems

1 Scope

1.1 The objective of this Defence Standard is to provide concise, easy to read, generic guidance on the
V&V of MS&SE being used for Defence purposes. The guidance is general and could be applied to any
MS&SE application domain; however, the context is targeted at the UK Defence community.
1.2 Throughout this standard the term V&V refers to the MS&SE itself and not to the scrutiny of the activity (experimentation, training, acquisition etc) as a whole. V&V of the MS&SE and its data does not include the experimental design, the analysis or the presentation of the results.

2 Warning

The Ministry of Defence (MOD), like its contractors, is subject to both United Kingdom and European laws
regarding Health and Safety at Work. All Defence Standards either directly or indirectly invoke the use of
processes and procedures that could be injurious to health if adequate precautions are not taken. Defence
Standards or their use in no way absolves users from complying with statutory and legal requirements
relating to Health and Safety at Work.

3 Definitions

The Definitions in Table 1 are used throughout this Standard:

Table 1 Definitions

Acceptance: The process whereby the Customer/User accepts that the MS&SE is fit for use for its intended purpose.

Acceptance Criteria: A set of standards, established by the modelling and simulation application sponsor or approved authority, that a particular model or simulation must meet to be accepted as fit for purpose. The criteria will be unique to each problem and will give key insights to potential solutions.

Claim: The hierarchical breakdown of credibility decomposed into sub-claims (arguments), and arguments into evidence (facts), to determine the appropriateness and the correctness of a model or simulation.

Conceptual Model: A statement of the content and internal representations which are the user's and developer's combined concept of the model. It includes logic and algorithms and explicitly recognizes assumptions and limitations.

Credibility: The relevance that the user sees in the model and the confidence that the user has that the model or simulation can serve his purpose. The appropriateness of the model plus the correctness of the model equals the credibility.

Face Validation: The process of determining whether a model or simulation seems reasonable to people who are knowledgeable about the system under study, based on performance. This process does not review the software code or logic, but rather reviews the inputs and outputs to ensure that they appear realistic or representative.

Interoperability: The ability of a model or simulation to provide services to and accept services from other models and simulations, and to use the services so exchanged to enable them to operate effectively together.

Model: A representation of something. A model may be physical (e.g. a wooden mock-up) or virtual (computer-based). The 'something' may be physical (e.g. a vehicle) or abstract (e.g. a relationship between variables in a graph).

Simulation: The exercising of a model over time. A simulation may be: 'live', exercising real people and/or real equipments in the real world (e.g. a live trial or exercise); 'virtual', exercising real people and virtual people/equipments, possibly in a virtual world; or 'constructive', exercising virtual people and/or equipments, usually in a virtual world.

Synthetic Environment: A computer-based representation of the real world, usually a current or future battle space, within which any combination of 'players' may interact. The 'players' may be computer models, simulations, people or instrumented real equipments.

Validation: The process of determining the degree to which a model or simulation is an accurate representation of the real world from the perspective of the intended uses of the model or simulation.

Verification: The process of determining that a model or simulation implementation accurately represents the developer's conceptual description and specification. Verification also evaluates the extent to which the model or simulation has been developed using sound and established software engineering techniques.

4 Benefits of V&V

4.1 Why do V&V?

4.1.1 If MS&SE are going to be used to help make decisions derived from T&E or experimentation, or to produce a training effect, then V&V is required to ensure that the resulting decisions can be made in confidence or that the skills developed are transferable to the actual operational environment.
4.1.2 The decision to invest in V&V, and the decision as to how much V&V is required is entirely based on
the criticality of the decisions that result from the application of MS&SE.

4.2 When to do V&V?

4.2.1 V&V should be conducted when the risk of making an incorrect decision, or the risk of not
developing the correct skill level in a trainee, or the risk of developing an incorrect Mission Rehearsal or
Operational context, outweighs the risk of conducting the V&V activity.
4.2.2 The true benefits of V&V are only realised when it is an integral part of the MS&SE development.
Conducting V&V throughout the development cycle (part of a whole life approach) enables the V&V team to
capture issues that can be more efficiently addressed upfront and hence pro-actively de-risk the MS&SE
development.

4.3 Acceptance Considerations

4.3.1 The User Needs Statement (UNS) (see Annex B, clause B.1.1) is a precise statement of the User's problem and the underlying context. It is fundamental to the design and development of the MS&SE itself. It is also the document from which the Acceptance Criteria (AC) (see Annex B, clause B.1.3) are derived.
4.3.2 The acceptance of an MS&SE or its results is a decision made by the User. The V&V process
presents the User with an objective set of conclusions that demonstrate that the MS&SE is fit for purpose.
4.3.3 The fitness for purpose is demonstrated by the V&V activities collecting and assessing the evidence
required to show that each of the AC have been met. Hence the selection of appropriate AC is key to the
success of the V&V.

4.4 Key V&V Stakeholders

4.4.1 The V&V activity does not exist in a vacuum and it is essential that those responsible for the V&V
interact with other stakeholders. These stakeholders either have a vested interest in the output from the V&V
or they are contributors to the V&V activity, for example, by providing access to the evidence required to
demonstrate that the MS&SE is fit for purpose.

4.4.2 Generic groups of V&V stakeholders are identified below; these will vary with the structure of the programme, and on small MS&SE developments the same person may fulfil more than one role:
a. User. The User needs to solve a problem or make a decision and wants to use an MS&SE to do so;
the User also has to convince the scrutiny authorities that the decisions based on the output of the
MS&SE are sound. The User defines the requirements, agrees the AC by which MS&SE fitness for
purpose will be assessed and ultimately accepts the results.
b. MS&SE Programme Manager (PM) is responsible for planning and managing resources for MS&SE development, directing the overall MS&SE effort, and overseeing configuration management and maintenance of the MS&SE. The PM will need to accept and resource the V&V plan.
c. MS&SE Designers are those individuals charged with the design of the MS&SE and production of
the Design Products.
d. Component Suppliers are those individuals or companies that supply the components or federates,
that comprise the SE. They will generally also produce the Development Products.
e. Developers/Integrators/Implementers are responsible for the design, implementation and test of
the MS&SE and will usually produce the Test and Integration Products.
f. V&V Agent/Team is responsible for undertaking the V&V activities required to demonstrate to the user that the MS&SE is fit for purpose. The V&V team is also responsible for the management of the V&V task, within the delegated authority set by the PM.

4.4.3 For the purposes of this Standard, the User is responsible for the acceptance of the MS&SE and/or its results, and this acceptance should be an informed decision based upon the V&V team's recommendations.

5 The V&V Process

5.1 Overview

5.1.1 As explained in sub-clause 4.2, the full benefits of V&V can only be realised if it is treated as an
integral part of the MS&SE development, and is planned, monitored and budgeted for in the same manner
as any other aspect of the MS&SE development.
5.1.2 This section identifies the High Level 5-step process for planning and executing V&V that is
designed to be integrated into an overarching MS&SE development process. The aim of this section is to
describe each of the steps.

5.2 The 5 Step Process

The V&V process is represented as 5 steps, as shown in Figure 1. Although there is some linearity to the
process, it should be noted that the process can, and should, iterate stages as required.
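As an illustration only, and not part of the Standard's normative text, the sketch below expresses the five steps of Figure 1 as an iterable pipeline in Python; the stopping test is a hypothetical placeholder for whatever condition the V&V team uses to decide that another pass through the steps is needed.

```python
# Illustrative sketch only: the five V&V steps of Figure 1 as an iterable
# pipeline that repeats while gaps remain. `execute_step` is a hypothetical
# caller-supplied callable, not something this Standard defines.

V_AND_V_STEPS = [
    "Plan V&V",
    "Collate Evidence",
    "Assess Evidence",
    "Determine Residual Risk",
    "Report",
]

def run_v_and_v(execute_step, max_iterations=3):
    """Walk the five steps in order, repeating the cycle while gaps remain.

    execute_step(step, iteration) returns True if that step exposed a gap
    (e.g. missing evidence) that forces another iteration of the process.
    """
    for iteration in range(1, max_iterations + 1):
        gaps_remain = False
        for step in V_AND_V_STEPS:
            gaps_remain |= bool(execute_step(step, iteration))
        if not gaps_remain:
            break
```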

[Figure 1 The Top Level V&V Process: Plan V&V -> Collate Evidence -> Assess Evidence -> Determine Residual Risk -> Report]

5.3 Plan V&V

5.3.1 The initial step is to plan the V&V activity. The outputs of this step are much the same as for any
other project management planning activity, and as such the planner should use the tools and
documentation techniques that they feel comfortable with or as prescribed by the wider MS&SE development
programme management.
5.3.2 The planning activity will be iterative and there are a number of key inputs that must be derived
before a detailed plan can be issued, these include:
a. Level of V&V Required: The V&V plan must reflect the level of V&V rigour that is required (see Annex A, sub-clause A.2.5); this is a reflection of the intended usage and sensitivity of the MS&SE results. This information enables the planner to tailor the V&V activities to be commensurate with the User's needs for the MS&SE results.
b. Agreed Acceptance Criteria: V&V is largely based upon demonstrating that the results from the MS&SE are fit for purpose. To do this the User agrees a set of criteria; if the MS&SE is shown to meet the AC, then the User will be satisfied that it meets their needs.
c. V&V Objectives: The V&V objectives detail how the V&V activity will demonstrate that the MS&SE
results are fit for purpose. They are the basis for what evidence is required and how this evidence
will be integrated and assessed to make the case for a V&V recommendation.
5.3.3 Once all necessary inputs are available then a typical project plan can be compiled based around:

a. Tasks.
b. Schedule.
c. Budgeting.
d. Dependencies.
e. Resources.
f. Monitoring regime.

5.3.4 The layout and content detail shall be in accordance with the overall programme quality plan and the
plan should be agreed with both the User and the appropriate Scrutiny Authority.

5.4 Collate Evidence

5.4.1 It is recommended that the V&V team produce the evidence requirement early in the programme, and this requirement shall be checked for completeness and consistency with the V&V Objectives.
5.4.2 The evidence requirement should list not only the evidence required, but also the source of the evidence; this could include:
a. Design and development documentation.
b. Planned federation testing.
c. Legacy federate documentation.
d. Face validation with Subject Matter Expert (SME).
5.4.3 Where the V&V team are dependent upon third party evidence collation, for example, from
federation testing or from design documents, the evidence requirement should be communicated as early as
possible to these providers.
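A minimal sketch, not mandated by this Standard, of how such an evidence requirement might be recorded, pairing each item of evidence with its source and the party responsible for providing it; the field names and example entries are illustrative assumptions.

```python
# Illustrative sketch of an evidence requirement record; field names and
# example content are assumptions, not a format defined by this Standard.
from dataclasses import dataclass

@dataclass
class EvidenceRequirement:
    item: str        # the evidence required
    source: str      # where it comes from (see sub-clause 5.4.2 a-d)
    provider: str    # the party responsible for delivering it
    needed_by: str   # when the V&V team must have it

requirement = [
    EvidenceRequirement(
        item="Radar component verification results",
        source="Legacy federate documentation",
        provider="Component supplier",
        needed_by="Before federation testing",
    ),
    EvidenceRequirement(
        item="Behavioural plausibility assessment",
        source="Face validation with SME",
        provider="V&V team",
        needed_by="After scenario testing",
    ),
]
```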

5.5 Assess the Evidence

5.5.1 The collated evidence is assessed. The objective of the assessment is to build the evidence case to demonstrate to the User that each of the agreed acceptance criteria has been met.
5.5.2 It is not necessary to wait until the end of MS&SE development for this process to start. Indeed it is
preferable if this assessment is conducted as and when the evidence is available. This early stage approach
allows for corrective action to be made if the evidence suggests that the criteria are at risk of not being met
by the MS&SE development.
5.5.3 The techniques used to assess the evidence shall be determined by both the V&V objectives and
the level of V&V and may vary from one line of evidence to the next.

5.6 Determine the Residual Risk

5.6.1 The Residual Risk associated with a particular MS&SE shall be evaluated in the context of the
MS&SE use.
5.6.2 The methods to be used in the determination of residual risk should be identified at the same time
that the AC are derived.

5.7 Reporting

5.7.1 The V&V reporting activity has three functions:

a. To recommend that the User accepts the MS&SE results based on their fitness for purpose. This shall include the scope of use across which the MS&SE is a valid application.
b. To document the evidence and reasoning that supports the recommendation.
c. To support MS&SE reuse. A structured approach to V&V documentation and appropriate archiving
should enable more cost effective reuse of the MS&SE and/or its results.


5.8 Acceptance

5.8.1 Although it is beyond the scope of this Standard to consider in detail the acceptance decision and the wider concept of approval, the following points are proffered:
a. It is the User who is responsible for making the acceptance decision. This should be an informed
decision based upon the findings of the V&V report.
b. The acceptance decision will be dependent upon the User's trust in the ability and objectivity of the V&V team, and their confidence that the MS&SE or its results will withstand detailed scrutiny.
c. In accepting the MS&SE the User shall recognise that they are also accepting all the residual risk.

5.9 Detailed V&V Process and Activities

5.9.1 Detailed descriptions of each of the 5 steps outlined in this Section, together with their processes and activities, are given at Annex A to this Standard.

5.9.2 The information required by the processes and activities described in this Section and Annex A is discussed in detail, together with examples, at Annex B to this Standard.

Annex A
V&V Process Activity Descriptions

A.1 Required Activities

This Section describes the typical Activities required when conducting V&V. The Top Level Process,
Activities and Information requirements are summarised in Figure 2 below:

[Figure 2 V&V Activities: the Top Level V&V Process steps mapped to their V&V Activities. Plan V&V: Determine Level of V&V; Determine V&V Objectives; Determine Evidence Requirement; Identify Required V&V Activities; Plan Schedule & Resources. Collate Evidence: Issue Evidence Requirement; Collect Evidence. Assess Evidence: Integrate Evidence; Compile Case, Argument, Evidence. Determine Residual Risk: Conduct Residual Risk Assessment. Report: Document V&V Case; V&V Recommendation.]

A.2 Plan V&V

A.2.1 The planning activity is the initial V&V step that defines the necessary tasks to the level where they
can be assigned to V&V personnel and executed.
A.2.2 The development of the V&V plan is inevitably incremental, as the detail of V&V execution is not
revealed until the V&V objectives and level of V&V required, are determined. As such the plan will mature
with these early phase V&V activities.
A.2.3 It is good planning practice to conduct a risk assessment as part of the planning activity. This should
be conducted with the User, MS&SE PM, and Developers to identify the high-risk areas and priorities upon
which the V&V effort should focus. These help determine the scope and magnitude of the V&V effort.
A.2.4 This Section describes some of the sub-activities associated with the Planning step.

A.2.5 Determine Level of V&V

A.2.5.1 Activity Objective

A.2.5.2 To determine the Level of V&V required for each component and consequently for the MS&SE as a whole.


A.2.5.3 Information Requirement

A.2.5.3.1 The level of V&V required is dependent upon the usage of the MS&SE outputs. Clearly an
MS&SE that feeds into sensitive decisions upon which large commitments of life or money are made should
be subject to more rigorous V&V than an MS&SE that is used to provide less sensitive back of the envelope
outputs.
A.2.5.3.2 In line with the MoD Acquisition Management System (AMS) which contains Director General
(Scrutiny & Analysis) (DG(S&A)) guidance on the V&V of Operational Analysis (OA) modelling capabilities,
this Standard recommends the three levels of validation detailed below:
a. Unvalidated: little or no assessment of the modelling capability has been undertaken or documented, or fundamental shortfalls have been identified.
b. Validated, Level 1: the model has undergone full assessment by review and its strengths and weaknesses recorded. Areas in which it could not be used for key decisions have been identified together with the risks to decisions in areas in which it is to be used.
c. Validated, Level 2: as Level 1 but, additionally, validation against trials/exercises and/or historical analysis has taken place.
A.2.5.3.3 The guidance is clear that validation shall consider the model itself and its data (either for
calibration or experimentation).
A.2.5.3.4 The level of validation required will tend to be component specific with different parts of the
MS&SE requiring greater or lesser attention to detail depending on the sensitivity of the MS&SE as a whole
to changes or errors in a specific component.
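As an illustration only, the three levels described above can be captured as a simple enumeration so that a per-component level of V&V can be recorded; the component names in the example assignment are hypothetical.

```python
# Sketch: the three validation levels of sub-clause A.2.5.3.2 as an
# enumeration, allowing a component-specific level of V&V to be recorded
# (see sub-clause A.2.5.3.4). Component names below are hypothetical.
from enum import Enum

class ValidationLevel(Enum):
    UNVALIDATED = 0  # little or no assessment undertaken or documented
    LEVEL_1 = 1      # full assessment by review; strengths/weaknesses recorded
    LEVEL_2 = 2      # as Level 1, plus validation against trials/exercises
                     # and/or historical analysis

component_levels = {
    "radar_model": ValidationLevel.LEVEL_2,      # sensitive to errors
    "terrain_database": ValidationLevel.LEVEL_1,  # less sensitive
}
```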
A.2.5.4 Sub-Activities

Determining the requirement for component Log Books.

A.2.5.5 Outputs

A.2.5.5.1 A description of the level of V&V effort for the MS&SE as a whole and for specific components.
A.2.5.5.2 Directives on Log Book requirement and content.
A.2.5.6 Impact/Distribution

As a minimum the: User; Component suppliers; MS&SE Designers; MS&SE Integrators; V&V Agents.
A.2.5.7 Recommended Practice

The DG(S&A) has recommended the use of Log Books for all model developments; although their guidelines do not specifically recommend them for MS&SE, they provide a good guide. DSTL have published guidelines for the format of a typical Log Book in the MoD AMS. Although this Log Book was originally intended for OA models, the typical content translates well to MS&SE components, and suppliers should be encouraged to offer as much detail as they possibly can.
A.2.6 Determine V&V Objectives

A.2.6.1 Activity Objective

A.2.6.1.1 To determine the specific objectives for the V&V activity.


A.2.6.1.1.1 The V&V Objectives state how each of the AC, which have been agreed with the User, will be
demonstrated as being met. The V&V Objectives therefore provide the basis for the Evidence Requirement
and the subsequent integration and assessment of the evidence.

A.2.6.2 Information Requirement

a. User Needs Statements.


b. Requirements documentation.
c. Acceptance Criteria.
d. MS&SE Architectural Design Document.
e. MS&SE System Design Document.
A.2.6.3 Sub-Activities

Typically, the objectives will need to be decomposed into a set of more easily manipulated sub-objectives. It is recommended that a structured approach is adhered to, such as that offered by the Claims-Arguments-Evidence structure (see sub-clause A.4.2.6.1). This provides a framework for decomposing the V&V problem into a coherent set of acceptability criteria (claims) and the subsequent evidence requirement.
A.2.6.4 Outputs

V&V Objectives and sub-objectives.


A.2.6.5 Impact/Distribution

As a minimum the: Component suppliers; MS&SE Designers; MS&SE Integrators; V&V Agent.
A.2.6.6 Recommended Practice

Examples of V&V objectives are contained in Table 2.

Table 2 Simple V&V Objectives

a. To support the evidence provided with a suitable audit trail.
b. To identify those components that are critical to the performance of the systems under test in the MS&SE (for example: the performance of the radar component).
c. To show how the experiment (MS&SE) has met each of the requirements placed upon it.
d. To provide a detailed account of how each of the MS&SE components is fit for purpose in this case.

A.2.7 Determine Evidence Requirement

A.2.7.1 Activity Objective

A.2.7.1.1 To determine what evidence will be required in support of V&V and how that evidence will be gathered.

A.2.7.1.2 The V&V objectives state how the MS&SE is going to be demonstrated as being fit for purpose.
This is done by collecting and assessing the evidence required to show that the AC have been met. The next
logical step is to derive the chain of evidence required to do this. The evidence requirement must itself be
checked to ensure that it is complete with respect to the V&V objectives and of course is internally
consistent.

A.2.7.2 Information Requirement

a. V&V objectives.
b. Acceptance Criteria.
c. MS&SE Architectural Design Document.
d. MS&SE System Design Document.
A.2.7.3 Sub-Activities

A.2.7.3.1 Identify evidence sources.

A.2.7.3.1.1 The evidence requirement should also indicate the source for the evidence. There are three
classes of evidence source:
a. Evidence that is generated as a third party activity, and may be found in intermediate development
products such as design documents.
b. Evidence that needs to be derived for the purposes of V&V; for example, through specially
commissioned tests or the use of SMEs in face validation exercises.
c. Evidence that will be generated during the development and execution of the SE:
c.1 Objective evidence.
c.2 Subjective evidence.
c.3 Technical evidence.
A.2.7.3.2 Estimate the resource requirements for capturing the evidence requirements.

A.2.7.4 Outputs

a. Statements of evidence requirements in V&V Plan.


b. Resource requirements for inclusion in the overall programme plan.
A.2.7.5 Impact/Distribution

As a minimum the: MS&SE Programme Manager; Component suppliers; MS&SE Designers; MS&SE
Integrators; V&V Agent.
A.2.7.6 Recommended Practice

A.2.7.6.1 The following are valid techniques, as recommended by Sargent (WSC2003), for the V&V of
simulation models. It should be noted that simulation models in this case refers to individual standalone
models and not to MS&SE per se; however as can be seen from Table 3 many of these techniques can be
used to generate evidence for the V&V of MS&SE.
Table 3 V&V Techniques

Animation: MS&SE behaviour is viewed graphically (visually) as time progresses.

Comparison to other models: Various results from the MS&SE are compared with results from other (valid) MS&SE (or OA/standalone modelling).

Degenerate tests: MS&SE behaviour is tested by appropriate selection of input and internal parameters. Intuitive results are expected and tested for.

Event validity: Compare MS&SE events with those from real systems.

Extreme condition tests: Test the MS&SE for plausibility when extreme or unlikely combinations of levels or factors are applied.

Face validity: Use SME knowledge (military expertise) to assess whether the MS&SE behaviour is reasonable.

Historical data validation: If historical data is available then some of the data should be used to build the MS&SE and the rest should be used to test MS&SE behaviour.

Historical methods: Rationalism assumes everyone knows whether the assumptions in the MS&SE are true; logic is then used to derive the correct MS&SE. Empiricism requires every assumption and output to be empirically validated. Positive Economics requires that the MS&SE is able to predict the future and is not concerned with the MS&SE assumptions or structure.

Internal validity: Several replications are made and the stochastic variability of the output assessed. A large variability may raise questions over the MS&SE appropriateness.

Multistage validation: Develop the MS&SE assumptions on theory, observation and general knowledge. Validate the assumptions by empirically testing them. Compare the input-output relationships of the MS&SE to the real system.

Operational graphics: The values of various performance measures are shown graphically as the MS&SE runs.

Parameter variability (sensitivity analysis): Changing the value of various input and internal parameters to determine the effect upon MS&SE behaviour.

Predictive validation: The MS&SE is used to predict system behaviour. The actual (field trials) and predicted behaviours are compared.

Traces: The behaviour of specific entities is traced (followed) through the MS&SE to determine if the MS&SE logic is correct.

Turing test: SMEs are asked if they can distinguish between MS&SE and real system outputs.

A.2.7.6.2 Clearly not all of these techniques will be applied to every component in every case, but it is
important that the relevant or necessary techniques are identified and the evidence associated with those
techniques clearly stated.
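As an illustration of one Table 3 technique, parameter variability (sensitivity analysis), the sketch below perturbs a single input parameter and records the effect on an output measure; the simulation function is a hypothetical stand-in for one run of the MS&SE under assessment.

```python
# Sketch of parameter variability / sensitivity analysis (Table 3).
# `run_simulation` is a hypothetical caller-supplied function mapping an
# input parameter value to a numeric output measure of interest.

def sensitivity_sweep(run_simulation, baseline, deltas):
    """Perturb one input parameter and record the change in the output."""
    reference = run_simulation(baseline)
    results = {}
    for delta in deltas:
        output = run_simulation(baseline + delta)
        results[delta] = output - reference
    return results

# Hypothetical usage: how sensitive is a detection-range measure to a
# +/-10% change in a radar power parameter?
# effects = sensitivity_sweep(radar_model, baseline=1.0, deltas=[-0.1, 0.1])
```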
A.2.7.6.3 A structured approach, such as the Claim-Argument-Evidence structure described in more detail in sub-clause A.4.2.6.1, will provide a coherent mechanism for deriving the evidence requirement from the V&V objectives discussed in sub-clause A.2.6.
A.2.7.6.4 Guidance on other V&V techniques can be found in clause B.2.

A.2.7.6.5 Once the Evidence Requirement has been identified, consideration must be given to the
gathering process. It will be important to determine who will gather what evidence, where they will store it
and in what format. Equally important will be identification of the route from the data archive to the analysts.
A.2.8 Identify Required V&V Activities

A.2.8.1 Activity Objective

To identify what actually has to be done in the execution of the V&V Plan.
A.2.8.2 Information Requirement

a. V&V Plan.
b. V&V Objectives.
c. Evidence requirement.
d. Analysis requirement.



A.2.8.3 Sub-Activities

As required to support the activities identified and recorded in the V&V Plan activity table.
A.2.8.4 Outputs

a. Activity plan including timescales, deliverables, reviews.


b. Resource requirements.
A.2.8.5 Impact/Distribution

A.2.8.5.1 As a minimum the: MS&SE Programme Manager; Component suppliers; MS&SE Designers;
MS&SE Integrators; V&V agent.
A.2.8.6 Recommended Practice

An example of an Activity Table is shown at Table 4.

Table 4 Sample Activity Table

Activity | Action/Resources | Target Date | Remarks
V&V plan review | | 22 Nov | Complete
MS&SE design available | Developer/Integrator/Implementer | | Complete
Test & Integration plan available | Developer/Integrator/Implementer | 28 Nov | Complete
Architecture representation available | Developer/Integrator/Implementer | 16 Dec | Complete
Component Verification log books available from all suppliers | Component suppliers | 12 Dec |
Scenarios available | Developer/Integrator/Implementer | 19 Dec |
Data capture plan available | Developer/Integrator/Implementer | 9 Jan |
Analysis plan available | Developer/Integrator/Implementer | 16 Jan | Bulk of the detail will be in the Test and Integration plan
Supplier site component testing | Component suppliers | Complete by 12 Dec |
Collocation at experiment site | Component suppliers | 12 Dec |
Component verification testing | MS&SE Developer | 13-16 Dec |
Architecture representation conformance testing | MS Programme Manager | 9-23 Jan |
Scenario testing and SME acceptance | MS&SE Developer | 9-23 Jan | Validation criteria required


A.2.9 Plan Schedule and Resources

A.2.9.1 Activity Objective

To identify the resources required in the execution of the V&V Plan.


A.2.9.2 Information Requirement

a. MS&SE Programme Plan.


b. Evidence requirement.
c. Required Activities.
d. Analysis requirement.
A.2.9.3 Sub-Activities

a. Identify suitable personnel for the V&V team.


b. Identify V&V tool requirement.
A.2.9.4 Outputs

Resource plan.
A.2.9.5 Impact/Distribution

As a minimum the: MS&SE Programme Manager; Component suppliers; MS&SE Designers; MS&SE Integrators; V&V Agent.
A.2.9.6 Recommended Practice

A.2.9.6.1 This is a standard Project Management Activity, and as such the mechanics are not outlined
here. The following should be considered:
a. The need for V&V to be a through life activity.
b. The V&V effort should be commensurate with the usage of the MS&SE and the demands of the
scrutiny authority.
c. The need for the V&V team to have access to, and require information from, other MS&SE
stakeholders.
A.2.9.6.2 Although the V&V effort is monitored by the MS&SE Programme Manager the V&V Agent is in
charge of implementing the V&V plan. In this capacity, the V&V Agent has several management
responsibilities, including:
a. Keeping the V&V effort focused on essential technical activities.
b. Selecting appropriate and available tools, methods, and techniques and changing them to match the
program changes when required.
c. Ensuring that the necessary resources are available when needed.
d. Locating appropriate personnel to participate in the V&V effort and providing adequate training when
needed.
e. Keeping current with the Developer's configuration management system and products.



A.2.9.6.3 Cost

A.2.9.6.3.1 Manpower is a key cost in the V&V activity and the effort required to perform the tasks will be
determined by the complexity and size of the development effort and the level of V&V rigour required to meet
the Acceptability Criteria. Other factors that have a significant effect on the cost include:
a. Availability and quality of data and development artefacts.
b. Maturity and experience of the Developers.
c. Maturity and experience of the Component Supplier staff.
d. Component maturity.
e. Stability of the requirements.
f. The level of component fidelity.

A.2.9.6.3.2 The V&V planner needs to produce a cost estimate based on the above. It should include
other direct costs for such things as tools, hardware, support software and SMEs.
A.2.9.6.3.3 It can be expected that there will be some negotiation with the MS&SE PM in agreeing the funding of the V&V effort and this may require some compromises from both parties. Underfunding can be a significant problem since it invariably results in compromises, which, if severe, can jeopardize completion of all the required activities. The risk assessment should enable the V&V team to demonstrate the increased risk and downstream impact of reduction of V&V activities.
A.2.9.6.4 Selecting the Right People to Support the V&V Effort

A.2.9.6.4.1 A successful V&V effort requires skilled and experienced participants. To identify the types of
skills, experience, and knowledge needed, the V&V Agent should have a thorough understanding of both the
intended use and the general requirements of the MS&SE program as well as the MS&SE technology itself.
A.2.9.6.4.2 Verification of technical interoperability can often be accomplished by the MS&SE Developers
and Engineers within the context of software and systems testing and greater efficiency will result from the
utilization of development team members where they are available. This is both logical and appropriate and
should not compromise the independence of the V&V team.
A.2.9.6.4.3 Consideration should also be given to the need for training, especially if V&V expertise is not
readily available in the resource pool. Training could be important if personnel are being asked to work with
unfamiliar V&V tools.
A.2.9.6.4.4 In a typical validation exercise there will be some reliance on SMEs, and lower levels of V&V
require greater SME input. A common need is for experts in the problem domain and User domain
associated with the application to assist with requirements verification and Conceptual Model (CM)
validation. The User community is usually the best source for experts in the problem domain, and the User
can often either supply these people or make good recommendations about whom to ask and how to secure
their help.
A.2.9.6.4.5 Clearly when SMEs are being relied upon to validate results, data, behaviours, etc, the quality
of the validation product is a function of the intrinsic quality of the SME. SME selection is therefore key and
effort should be made to ensure that the SME is credible, and has the appropriate range and depth of
knowledge, experience and communication skills to undertake the validation task. Additional selection
criteria could include:
a. Formal training.
b. Interest in the project.
c. Experience.
d. Ability to support the effort for the specified times.


A.2.9.6.5 Ensuring the Availability of Development Products

Another important success factor is the timely availability of development artefacts, data, and products, which depends on the responsiveness of the Developer. A few key things need to occur at this planning stage for this to work effectively:
a. The MS&SE Programme Manager should document and agree with the Developer and the V&V
Agent what products should be produced and who should have access to them.
b. The V&V Agent should identify what development artefacts and products will be needed at each
phase of the development, and identify what V&V products will be provided in return.
A.2.9.6.6 Acquiring the Right Tools

A.2.9.6.6.1 There exist a number of commercial tools to support V&V activities, in particular verification. The selection of appropriate tools, and the cost effectiveness of using such tools, will depend to some extent on the V&V team's previous exposure to a particular tool, and significant coercion might be required for them to change their tool of the trade. In a large MS&SE programme that has a significant V&V effort it might be considered worthwhile doing some market analysis and cost benefit studies to select a set of tools and associated training.
A.2.9.6.6.2 Several of the key factors include compatibility with the Developers' tools, cost, training, availability and maturity, flexibility, and required accuracy and capabilities. The best tools for V&V efforts are often the smaller, easier-to-use, cheaper tools that respond better to quick reaction assessments than the more elaborate, highly integrated development environments and tools, which are more effective for development.
A.2.9.6.7 Using Appropriate Techniques

It is possible to undertake the activities described in this standard in differing ways and the techniques used
to perform the activities are chosen based on the needs of the programme. The technique chosen for a
particular task depends on the importance of that task to the overall program, the risks involved, and the
priorities established by the User.

A.3 Collate Evidence

A.3.1 Having defined the evidence requirement there is a significant challenge involved in disseminating
that requirement to all the responsible parties and then in making sure that the evidence is either delivered or
collected.
A.3.2 Issue Evidence Requirement

A.3.2.1 Activity Objective

A.3.2.1.1 To disseminate the Evidence Requirement to all the necessary responsible parties.

A.3.2.2 Information Requirement

a. Evidence requirement.
b. Analysis plan.
A.3.2.3 Sub-Activities

a. Define evidence gathering tools and techniques.


b. Develop data gathering sheets for Observer Analysts.
c. Define log file formats.
d. Define collection, archiving and dissemination processes.


e. Define archive procedures.
A.3.2.4 Outputs

Evidence collection and co-ordination plan.


A.3.2.5 Impact/Distribution

As a minimum the: Programme Manager; MS&SE Designers; MS&SE Developers; MS&SE Component
Suppliers; V&V Agent.
A.3.2.6 Recommended Practice

A.3.2.6.1 The evidence requirement shall be distributed to all V&V stakeholders including those third
parties that are expected to provide V&V evidence. As such the requirement must be distributed at a time
when these requests for evidence can be efficiently addressed. It is recommended that the V&V team meet
with the third party providers at an early stage to negotiate the evidence requirement; this is particularly
important where the third party is a commercial organisation under contract.
A.3.2.6.2 It is almost impossible to include too much detail at this point; it is essential that the processes for collecting and storing the evidence data are defined, as well as the route from that archive to the analysts. Any shortcomings in these plans will result in data being lost or delays to the analysis process.
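A minimal sketch of one way the sub-activity "define log file formats" might be realised, so that evidence arrives in a known, analysable form; the record fields are illustrative assumptions rather than a format required by this Standard.

```python
# Sketch: a simple line-oriented evidence log format (CSV), defined up front
# so that collection, archiving and dissemination all handle the same fields.
# Field names are illustrative assumptions.
import csv

LOG_FIELDS = ["timestamp", "component", "test_id", "measure", "value", "collector"]

def append_evidence(path, record):
    """Append one evidence record (a dict keyed by LOG_FIELDS) to the log."""
    with open(path, "a", newline="") as handle:
        writer = csv.DictWriter(handle, fieldnames=LOG_FIELDS)
        if handle.tell() == 0:  # write the header row on first use
            writer.writeheader()
        writer.writerow(record)
```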
A.3.3 Collect Evidence

A.3.3.1 Activity Objective

To collect all the evidence required.


A.3.3.2 Information Required

a. Evidence requirement.
b. Test plans.
c. Analysis plan.
d. Evidence collection plan.
A.3.3.3 Sub-Activities

a. Conduct Objective, Subjective and Technical tests.


b. Gather log data/evidence sheets.
A.3.3.4 Outputs

A complete set of evidence data.


A.3.3.5 Impact/Distribution

As a minimum the: MS&SE Programme Manager; MS&SE Designers; MS&SE Developers; MS&SE
Component suppliers; V&V Agent.
A.3.3.6 Recommended Practice

If the evidence requirement has been rigorously defined then the collection of the evidence should follow
naturally. However, in practice it will be essential to ensure that those identified as responsible in the
evidence requirement have all the resources they need to perform the collection. Similarly a monitoring
process will be required to ensure that the data is actually being collected and that it is in the expected
formats and is suitable for analysis as required.


A.4 Assess Evidence

A.4.1 Integrate Evidence

A.4.1.1 Activity Objective

To consolidate all of the evidence gathered into a form suitable for analysis.
A.4.1.2 Information Requirement

a. Evidence requirement.
b. Analysis plan.
A.4.1.3 Sub-Activities

Data conversions, formatting, archiving, distribution.


A.4.1.4 Outputs

A consolidated collection of evidence.


A.4.1.5 Impact/Distribution

As a minimum the: MS&SE Programme Manager; MS&SE Designers; MS&SE Developers; MS&SE
Component suppliers; V&V Agent; Analysts.
A.4.1.6 Recommended Practice

Having gathered all the evidence required for the V&V case it is usually best to make one person responsible for the management of the evidence data. This may be a member of the V&V team or it might be one of the analysts associated with processing the output of the MS&SE itself. It is highly likely that there will be a good deal of commonality between the two sets of data (V&V data and MS&SE output data) and so careful cooperation and coordination will be required.
A.4.2 Compile Case, Argument, Evidence

A.4.2.1 Activity Objective

To prepare the V&V case through use of a structured Claim-Argument-Evidence approach.


A.4.2.2 Information Requirements

a. V&V Objectives.
b. V&V plan.
c. Evidence data.
A.4.2.3 Sub-Activities

a. Analysis and claim formulation.


b. Data analysis.
A.4.2.4 Outputs

Arguments to support the V&V case for each V&V Objective.


A.4.2.5 Impact/Distribution

As a minimum the: MS&SE Programme Manager; V&V Agent; Analysts.




A.4.2.6 Recommended Practice

It has been recommended throughout this standard that a structured approach should be adopted to increase the efficiency and help ensure consistency in the V&V approach. The Claim-Argument-Evidence approach outlined below is one such approach.

Claim: Acceptance criteria evaluation. Pre-defined claims for each case (Acceptance Criterion), decomposed into sub-claims if required and connected by arguments, and supported by items of evidence.

Argument: Rationale for decomposing claims.

Evidence: V&V results that support claims.
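A minimal sketch of the Claim-Argument-Evidence structure as a data structure, assuming a simple recursive form in which a claim is supported either directly by evidence or by argued sub-claims; the class and field names are illustrative, not part of the Standard.

```python
# Sketch: Claim-Argument-Evidence as a recursive structure. A claim (one per
# Acceptance Criterion) is supported either directly by evidence or by an
# argument that decomposes it into sub-claims. Names are assumptions.
from dataclasses import dataclass, field

@dataclass
class Claim:
    statement: str                  # e.g. an Acceptance Criterion
    argument: str = ""              # rationale for any decomposition
    sub_claims: list["Claim"] = field(default_factory=list)
    evidence: list[str] = field(default_factory=list)  # V&V results

    def is_supported(self):
        """A claim holds if it has evidence, or all its sub-claims hold."""
        if self.evidence:
            return True
        return bool(self.sub_claims) and all(
            c.is_supported() for c in self.sub_claims
        )
```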

A.5 Determine the Residual Risk

A.5.1 Residual Risk Assessment

A.5.1.1 Activity Objective
Activity Objective

To identify the residual risks associated with the use of the MS&SE to support particular decisions or
programme interventions.
A.5.1.2 Information Requirement

a. Acceptability Criteria.
b. Analysis output from the SE.
c. Output from the Claim-Argument-Evidence process.
d. User Needs Statements.
e. MS&SE design documentation.
A.5.1.3 Sub-Activities

a. Identify risk assessment techniques.


b. Conduct individual risk assessments.
A.5.1.4 Outputs

Risk statements.
A.5.1.5 Impact/Distribution

As a minimum the: MS&SE Programme Manager; MS&SE Designer; V&V Agent.


A.5.1.6 Recommended Practice

A.5.1.6.1 The Residual Risk associated with a particular MS&SE must be evaluated in the context of the
MS&SE use.
A.5.1.6.2 In equipment acquisition the risk can be expressed in technology, scientific, contractual or project management terms, and the MoD AMS Risk Management Guide elaborates further on each of these particular areas. For example, in a warfighting experiment, or any other MS&SE with significant Hardware in the Loop involvement, the residual risk may be better evaluated by identifying key sensitivities in the MS&SE components or the way in which it has been constructed. The methods to be used in the determination of residual risk should be identified at the same time that the AC are derived.
A.5.1.6.3 Residual risks should be recorded and documented in a format agreed between the User and
MS&SE Programme Manager.
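A minimal sketch of a residual risk record in the spirit of A.5.1.6.3; the fields, scales and example content are illustrative assumptions, and the actual format remains whatever is agreed between the User and the MS&SE Programme Manager.

```python
# Sketch: one way to record a residual risk statement. The likelihood/impact
# scales, field names and example values are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ResidualRisk:
    description: str      # the risk to decisions based on the MS&SE
    affected_use: str     # the part of the intended use that is exposed
    likelihood: str       # e.g. "low" / "medium" / "high"
    impact: str           # e.g. "low" / "medium" / "high"
    mitigation: str = ""  # suggested alternative activity, if any

risk = ResidualRisk(
    description="Radar detection ranges unvalidated beyond 40 km",
    affected_use="Long-range engagement decisions",
    likelihood="medium",
    impact="high",
    mitigation="Augment with judgement panel review",
)
```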

A.6 Report

A.6.1 The V&V report is prepared after the final assessment of the evidence has been completed. The aim of the report is to provide a supported recommendation to the User to support his acceptance decision. The report should include the findings from the V&V activities and the outcome of the evidence assessment, illustrating how the AC and V&V objectives have been met. A typical report will include:
a. Reports from each V&V activity.
b. A summary of the accomplishments, findings, conclusions, and recommendations of the overall V&V
effort.
c. A list of all the evidence, data, documentation, assessments and reports associated with the V&V
effort, including information leveraged from other parts of the program.
A.6.2 Depending upon the level of independence of the V&V team, the report is either:

a. Internally reviewed by the V&V team and released to the User as an input into his acceptance
decision.
b. Reviewed by the MS&SE programme manager for approval before its release to the User.
A.6.3 It is recommended that the V&V Agent should also meet with the User to provide a walkthrough of
the V&V recommendation and to ensure the materials provided throughout the V&V effort provide the
information needed by the User to make an informed acceptance decision.
A.6.4 Document V&V Case

A.6.4.1 Activity Objective

To compile a report summarising the V&V activity.


A.6.4.2 Information Requirement

a. V&V Plan.
b. Evidence assessment.
c. Residual risk assessment.
A.6.4.3 Sub-Activities

As required to provide the necessary evidence to support the report recommendations.


A.6.4.4 Outputs

The final V&V report.


A.6.4.5 Impact/Distribution

A.6.4.5.1 As a minimum the: MS&SE Designer; V&V Agent; User.



A.6.4.6 Recommended Practice

A.6.4.6.1 It is necessary to build and document the V&V history for an MS&SE to:

a. Capture the background knowledge that will support the acceptance of the system or its results.
b. Provide an audit trail to support the acceptance decision.
c. Support re-use of the MS&SE and associated subsequent V&V by facilitating re-use of V&V
evidence and assessment.
A.6.4.6.2 Documentation of the V&V case will collect together the information that supports the
acceptance and provides an audit trail. Of equal importance, the documentation will support the cost
effective re-use of the MS&SE or its constituent components.
A.6.4.6.3 The level and method for documentation will be dependent upon the MS&SE and its application
and the demands of the scrutiny authority. The documentation of the V&V case should be archived
appropriately to facilitate re-use.
A.6.5 V&V Recommendation

A.6.5.1 Activity Objective

To produce a recommendation to the User regarding acceptance of the MS&SE results based on their
fitness for purpose. This will include the scope of use across which the MS&SE is considered valid.
A.6.5.2 Information Requirement

a. V&V Plan.
b. Evidence assessment.
c. Residual risk assessment.
A.6.5.3 Sub-Activities

As required to support the formulation of the recommendation.


A.6.5.4 Outputs

V&V Recommendation.
A.6.5.5 Impact/Distribution

A.6.5.5.1 The User; MS&SE Designer; V&V Agent.

A.6.5.6 Recommended Practice

A.6.5.6.1 In the ideal case, all of the AC will have been met and suitable concrete arguments will have
been made for each.
A.6.5.6.2 In many cases not all of the AC will have been demonstrated as having been met, because insufficient evidence was collected and/or assessed. The recommendation may then be more difficult to make and will need further qualification. It may be possible in these situations to recommend that the MS&SE should still be accepted, but with a proviso on the increased risk of doing so, or to suggest that the MS&SE is augmented by alternative activities, such as judgement panels, war gaming, or other experiments and trials, to provide a more robust solution. A V&V recommendation for non-acceptance can include suggestions for remedial action based on the V&V findings.


Annex B
Detailed Information Requirements and Worked Examples

B.1 Key Inputs

This section describes the typical Key Inputs or Information Requirements, shown in Figure 3, for conducting successful V&V.

[Figure 3 Information Requirements: the V&V Process steps (Plan V&V; Collate Evidence; Assess Evidence; Determine Residual Risk; Report) with their key inputs. Input documents: User Needs Statement; Input Data and Scenarios; Conceptual Model; Design Products; Development Products; Integration and Test Products; Results Analysis. Derived documents: Acceptance Criteria; V&V Objectives; V&V Plan.]

B.1.1 The User Needs Statement (UNS)

B.1.1.1 The UNS is a precise statement of the User's problem and the underlying context. It is fundamental to the design and development of the MS&SE itself. It must be captured during the requirements definition phase and agreed with the User, who will eventually be called upon to accept the MS&SE. This should be independent of any MS&SE implementation to avoid imposing a solution on the problem at the outset of the programme.
B.1.1.2 The UNS should be used to define a number of objective tests that must be conducted to prove
the fitness for purpose of the MS&SE. The UNS can be used to define a number of System Requirements
that can be tested during the MS&SE Test and Integration phase through subjective and technical tests.
There are two components that should be represented in the UNS:
a. The problem domain: the world environment in which the problem operates (for example collective
training).
b. The application domain: the definition of the type of problem (for example Naval C4I).
B.1.1.3 It is only through consideration of both of these domains that the complete understanding of the
users requirements can be achieved. It is necessary for the V&V team to work with the User to gain a full
understanding of the UNS and subsequently, together, dervive a complete set of AC.
B.1.1.4 It is assumed that the UNS is the top-level description of the problem and does not itself require
V&V. It provides the background against which the CM is derived and verified.

B.1.2 Input Data and Scenarios

B.1.2.1 The V&V activity should encompass both the simulation and the data; in fact it is virtually
impossible to separately evaluate a model and the data it uses because it is the interaction of data and code
that produces simulation results.
B.1.2.2 This mutual dependency suggests that data V&V activities should be considered part of the overall
V&V process. Indeed, data V&V activities are discussed as part of the V&V process throughout this
standard. However, the issue of data V&V differs in some key ways:
a. Data V&V tasks are conducted on different sets of data.
b. Different data V&V tasks may be required for different sets of data.
c. Different techniques and tools may be needed to conduct data V&V tasks on different sets of data.
d. Different data sets are obtained at different times.
e. The people performing data V&V activities frequently require different qualifications (e.g., SMEs with
expertise in individual data areas).
B.1.2.3 Whoever conducts data V&V activities should work closely with those developing and preparing
the simulation for use and with those performing the model V&V activities. Data V&V activities should be
carefully documented and included in the V&V report.
B.1.2.4 Data V&V activities reflect the nature of the data. In some cases real data will exist, for example performance data for an existing system. Some MS&SE aim to represent future systems for which there is no real-world data (in the sense of data that has been observed and measured), only estimates. The same is the case for scenarios; these are specific cases where a system, multiple systems and/or people can be exercised.
B.1.2.5 Where real data exists it is possible to derive a referent against which the data V&V can be
conducted. The V&V will test that the MS&SE data is complete and also how accurately it reflects the real
world as represented in the referent.
B.1.2.6 Clearly there may be many referents, for example weapon system data can be found in data
supplied by the manufacturer, the system specification, field trials, or operational use depending on the
maturity of the system. The choice of referent will need to be made in the light of the MS&SE application.
B.1.2.7 Where real data does not exist then the referent will need to be compiled from other sources, such as validated theory describing the phenomena (e.g., Newton's laws of motion), validated simulations representing the phenomena (e.g., a simulation that represents the projectile's kinematics), SME knowledge of the phenomena (e.g., an artillery officer's experience of the behaviour of actual cannon rounds) or a combination of these different sources.
B.1.2.8 It is usually the case that the referent material is held disparately; it is recommended that there is a V&V task to collect the information from the different referent sources into a single consistent referent. Collecting the referent information includes defining the referent's scope of coverage, identifying credible sources and actually acquiring the information. It is also good practice to collect or estimate the uncertainties associated with the information in the referent. It is key that the MS&SE User accepts the credibility of the referent, because they must be confident that the data has sufficient fidelity to serve their purposes.
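As a purely illustrative sketch of the bookkeeping this implies (the names ReferentEntry and compile_referent are assumptions for the example, not terms from this standard), a consolidated referent might record each value with its uncertainty and source, and flag entries that different sources cannot reconcile:

    from dataclasses import dataclass

    @dataclass
    class ReferentEntry:
        """One item of referent information, with its provenance."""
        parameter: str      # e.g. "max_range_km"
        value: float
        uncertainty: float  # estimated, e.g. plus or minus one standard deviation
        source: str         # e.g. "manufacturer data sheet", "field trial"

    def compile_referent(entries):
        """Collect entries into a single referent, flagging conflicts.

        Two sources are treated as conflicting when their values differ
        by more than the sum of their stated uncertainties; such entries
        would need SME adjudication before the referent is accepted.
        """
        referent, conflicts = {}, []
        for e in entries:
            prev = referent.get(e.parameter)
            if prev and abs(prev.value - e.value) > prev.uncertainty + e.uncertainty:
                conflicts.append((prev, e))
            else:
                referent[e.parameter] = e
        return referent, conflicts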
B.1.2.9 Component Input Data
There is a requirement to capture the level of abstraction that the Developer judged appropriate and, in addition, where for pragmatic reasons the Developer went into more or less detail, and in what sense this was so. This data should be captured at the MS&SE level as well as at the component level.
B.1.2.9.1 It is essential that details of each of the components used in an MS&SE are recorded in an appropriate Component Log Book. The level of detail required will depend on the exact usage of the particular component in the context of the simulation. Where the residual risk associated with a particular component is high, more detail will be required in the Component Log Book.

B.1.2.9.2 It is essential that component suppliers are made aware of the logbook requirement early in the
component selection process. Some suppliers may be wary of the logbook compilation process and may
require assurance over the confidentiality of the information supplied.
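By way of illustration only (the record structure and field names below are assumptions for the sketch, not requirements of this standard), a minimal Component Log Book entry might capture:

    from dataclasses import dataclass

    @dataclass
    class LogBookEntry:
        """Minimal component logbook record; indicative fields only.

        Entries would be extended where the residual risk associated
        with the component is high."""
        component: str
        version: str
        supplier: str
        prior_use: str          # pedigree: where the component has been used before
        known_limitations: str  # assumptions and limits on valid use
        vv_evidence_refs: str   # pointers to existing V&V evidence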
B.1.2.9.3 Personnel, SMEs and Warfighters
Whilst many of the components supplied for inclusion in an MS&SE may have a significant pedigree of use
and exploitation, the effectiveness of a particular model or simulation in any given instance may depend
heavily on the expertise and experience of the personnel supplied to integrate and manage the component
and on the competency of the operators assigned to the component. It is essential that summary CVs are
produced for the technical personnel involved in component integration and support and for the SMEs used
to operate the components during execution of the MS&SE.
B.1.3 Acceptance Criteria

B.1.3.1 Defining the AC is a fundamental step in undertaking V&V, since the core V&V activity is to
demonstrate that the AC have been met.
B.1.3.2 The AC are a set of conditions that the MS&SE must satisfy for it to be accepted by the User; they should relate directly to the UNS described in sub-clause B.1.1. It is necessary that the User agrees the AC set, and indeed it is preferable that the User and V&V Agent formulate the AC set together, ideally with support from the MS&SE Programme Manager and SMEs.
B.1.3.3 It is also good practice to conduct a risk assessment at the same time as deriving the AC to
determine which requirements are most important to the success of the application; this can also determine
the priorities for V&V (i.e., which issues should be addressed first and what level of effort should be focused
on each).
B.1.3.4 The AC are derived from an assessment of the UNS. The AC should, where possible, be quantifiable, and the User should give some indication of the evidence he requires to be satisfied that the AC have indeed been met.
B.1.3.5 It is also recommended that the V&V Agent discusses the comparative importance of the AC with the User at this stage, so that priorities can be determined. This will enable the V&V planning stage to be flexible if there are resource or cost constraints.
B.1.3.6 The AC can be used to support the V&V of data and scenarios and should include non-functional
requirements such as availability, failure rates, operating costs, security requirements and documentation
standards.
B.1.3.7 One approach to deriving the AC is to walk through the UNS, identifying a specific criterion for each specific requirement; it is recommended that for each criterion an indication of how it can be demonstrated and assessed is derived, along with the User's relative priority. The following steps are provided:
a. Obtain an accurate and complete statement of the problem, objectives and requirements from the
User.
b. Conduct a risk assessment.
c. Identify acceptance information needs.


B.1.3.8 Table 5 illustrates how a set of AC can be recorded:

Table 5 Example Acceptance Criteria Recording Table

Requirement: The MS&SE shall be capable of accepting input data characterizing a particular missile flight test.
Acceptance Criteria: Analysis team is able to perform post-flight analysis/reconstruction with the simulations using conditions of the actual flight test as input.
Assessment Method: Inspection of documentation of post-flight analysis for at least one test flight.
Priority (H/M/L):

Requirement: The MS&SE shall model aerodynamics of the missile.
Acceptance Criteria: Simulation predictions shall match Test & Measurement data to a degree that is acceptable to SMEs in the established program office simulation working group. (If an objective pass/fail number can be derived, or the SMEs can agree on a quantitative pass/fail criterion, e.g. the predicted value of a particular parameter in the simulation must match the measured value from instrumented test data to within x% of the measured value, the quantitative criteria should be listed.)
Assessment Method: Post-flight analysis. Compare simulation predictions with TM for the following parameters: fin position and angle of attack for a given Mach number. See approved support plan for exact Test & Measurement channels to be compared with these simulation parameters.
Priority (H/M/L):

Requirement: The MS&SE shall model the mass properties for the Mk xx configuration.
Acceptance Criteria: Mass properties are modelled and the source of the data in the simulation is documented.
Assessment Method: Inspection of code to confirm that mass properties are modelled. Inspection of simulation documentation to confirm that the source of the mass property data is documented.
Priority (H/M/L):

B.1.3.9 Such a table can be used to form the basis of an acceptance checklist.

B.1.3.10 The AC can be expressed as a threshold, typically the minimum capability needed for the
MS&SE to be fit for the intended purpose as defined and agreed to by the User.
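Where a criterion can be made quantitative, the threshold check is mechanical. The following sketch is illustrative only (the function name and the 5% tolerance are assumptions for the example, echoing the "within x%" criterion style of Table 5):

    def meets_criterion(predicted, measured, tolerance_pct=5.0):
        """Pass if the simulation prediction is within tolerance_pct
        of the measured value from an instrumented test."""
        if measured == 0:
            raise ValueError("measured value must be non-zero for a percentage test")
        error_pct = abs(predicted - measured) / abs(measured) * 100.0
        return error_pct <= tolerance_pct

    # e.g. a fin-position prediction of 12.1 degrees against a measured 11.8
    assert meets_criterion(12.1, 11.8)  # within 5%, so the criterion is met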
B.1.4 V&V Objectives
B.1.4.1 The V&V Objectives should be clearly stated in the V&V Plan.

B.1.4.2 V&V objectives should be defined in terms of the measures and the specific tasks to be performed to address each measure. Objectives may need to be decomposed to attain this level of specificity. The ideal situation is when one V&V objective is defined by one measure that can be obtained by performing one task. This situation seldom occurs. Generally, it will take many measures to address each objective, and multiple tasks to address each measure. However, where possible the principle of simplicity and direct traceability of objectives to sub-objectives, to measures, to tasks should be used as a guide in the full development of objectives.
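That traceability chain is easy to represent and check mechanically. The sketch below is illustrative only (the class and function names are assumptions, not terms from this standard):

    from dataclasses import dataclass, field

    @dataclass
    class Measure:
        name: str
        tasks: list = field(default_factory=list)    # V&V tasks addressing this measure

    @dataclass
    class Objective:
        name: str
        measures: list = field(default_factory=list)

    def traceability_gaps(objectives):
        """Report breaks in the objective -> measure -> task chain."""
        gaps = []
        for obj in objectives:
            if not obj.measures:
                gaps.append(f"objective '{obj.name}' has no measures")
            for m in obj.measures:
                if not m.tasks:
                    gaps.append(f"measure '{m.name}' of '{obj.name}' has no tasks")
        return gaps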
B.1.5 Conceptual Model

B.1.5.1 The MS&SE CM is the Developer's view of what he intends to build in response to the UNS. As such, a robust CM is an important bridge for communication between application domain experts and MS&SE developers.
B.1.5.2 The CM is produced by the MS&SE Developer and is a statement of his understanding of the requirement. It is an abstraction of, and states, the MS&SE concept that the Developer is intending to
produce. The CM should include descriptions of MS&SE entities, objects, algorithms, relationships
(architectures) and data as well as assumptions and limitations.
B.1.5.3 The CM demonstrates the Developer's understanding of the intended application. It also:

a. Connects the design to the requirements, by providing a comprehensive description of the architecture, algorithms, behaviours, interactions, and capabilities to be addressed in the design specification.
b. Supports the transition from requirements to design by serving as the framework where the MS&SE requirements are converted into the necessary capabilities needed by the simulation.
c. Presents a thorough functional-level description and architectural model of the MS&SE that is capable of supporting the design process and future reuse.
B.1.5.4 The CM, along with the AC, forms the baseline against which subsequent V&V activities are conducted, so it is essential that the CM is itself validated. CM validation is done to ensure that all participants, especially the V&V Agent and Developer, have a clear and accurate understanding of the Developer's vision of the intended use and proposed capabilities of the simulation. CM validation consists of a series of reviews and assessments by the V&V Agent and appropriate SMEs to determine if the various parts of it are adequately defined and represented.
B.1.5.5 The validated CM becomes (or at least bounds) the referent for the MS&SE because it presents an adequate representation of the performances, behaviours, interactions, and fidelity needed to meet the intended use.
B.1.5.6 Fidelity Issues. The level of fidelity needed to address a specific application is a highly significant and difficult characteristic to determine. It requires a thorough understanding of the problem being addressed and of how the MS&SE is to be used. The perpetual question is "how much is enough?", and the answer varies considerably based on any number of factors, such as what characteristics of the entities need to be represented and what behaviours and interactions are expected. The level of fidelity of interacting entities should be roughly equivalent to ensure fair and unbiased interoperability (e.g., a level battlefield). The level of fidelity of the simulation has to be sufficient to adequately and accurately represent the referent (i.e., the representations in the simulation have to be enough like the real systems to satisfy the User's needs).
B.1.6 Design Products

B.1.6.1 During MS&SE design a number of design products are produced; these artefacts are important inputs to the V&V activity. A typical set of design products is:
a. Requirements.
b. Conceptual Model.
c. Federation Design.
d. Specifications.
e. Federate Selection.
f. Test Criteria.

B.1.6.2 The above list is generalised and will vary between MS&SE developments; however, it is common to have a sequential, waterfall-style set of documents. Each subsequent document can be verified against its predecessor. The verification aims to ensure that each document is complete and consistent (externally with the predecessor and internally with itself). This is usually achieved either by walkthroughs or by a mapping exercise.
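A mapping exercise of this kind reduces to simple set comparisons. The sketch below is illustrative only (the function name and data layout are assumptions for the example):

    def mapping_gaps(predecessor_ids, successor_trace):
        """Check a successor document against its predecessor.

        predecessor_ids: set of element IDs in the predecessor document.
        successor_trace: dict mapping each successor element to the set of
                         predecessor IDs it claims to satisfy.
        Returns (uncovered, dangling): predecessor IDs nothing traces to
        (incompleteness), and references to IDs that do not exist
        (inconsistency).
        """
        referenced = {rid for rids in successor_trace.values() for rid in rids}
        uncovered = predecessor_ids - referenced
        dangling = referenced - predecessor_ids
        return uncovered, dangling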
B.1.6.3 During the design and development of the MS&SE numerous (by)products will be produced; these include such things as software specifications and test designs. The exact nature and number of products will be determined by the complexity of the MS&SE and the development process being followed.


B.1.7 Re-use of Components

B.1.7.1 In the development of an MS&SE there is a desire to re-use components. The use of legacy components can reduce programme costs and timescales; however, there are implications resulting from re-use. The V&V team should be central in supporting the selection of components by conducting the following activities.
B.1.7.2 Assess the legacy component's capabilities. Only by knowing what the component brings with it, in terms of capabilities and limitations, can the User determine what needs to be done to ensure the simulation is fit for the intended purpose. Once a component has been selected, the V&V Agent will help assess how well that component can address the M&S requirements and identify what should be done, if anything, to improve the simulation's fitness for the intended use.
B.1.7.3 Assess the legacy component for the intended use. Several V&V activities should be performed regardless of whether the component needs to be modified. The input data need to be verified and validated for the intended use, and the results from executing the overall implementation need to be compared against the representational requirements of the intended application. Further, depending on the completeness and credibility of existing component information, additional verification tasks may need to be performed.
B.1.7.4 Provide support to the modification effort. Where a component needs some modification for re-use, the V&V Agent will support this effort by verifying the proposed changes. Once the changes are accepted and implemented, the V&V Agent will verify that the changes have no unintended effects on the overall MS&SE. To make this interaction work effectively, the V&V Agent needs ready access to the data, documents, and interim products.
B.1.7.5 The majority of the information required for this task should be available in the form of a
Component Log Book that should be requested from the supplier.
B.1.7.6 In the case where the development is following the Federation Development and Execution Process (FEDEP), IEEE 1516.3, there are potentially a large number of FEDEP-specific artefacts that could be produced, depending upon the tailoring of the FEDEP. It is recommended that, if the FEDEP is being followed, the V&V activities pertinent to FEDEP products are conducted in accordance with the FEDEP Verification, Validation & Accreditation (VV&A) Overlay (draft IEEE guidance document).
B.1.8 Development Products

B.1.8.1 During the actual development of the MS&SE a number of products will be generated; the nature of these products will vary depending on the development methodology in place.
B.1.8.2 During this period MS&SE components will be developed, legacy components modified as required and some component testing conducted. These development activities will have associated documentation that will be an input into the V&V process. As with the design products in sub-clause B.1.6, there must be completeness and consistency checking of these documents against their hierarchical predecessors. Evidence of testing and associated test designs are typically of interest to the V&V team and will be required as part of V&V evidence collation. The typical test hierarchy is shown in Figure 4.

Subjective Tests

Objective Tests

Technical Tests

Figure 4
26

Test Hierarchy


B.1.8.3 The V&V team can add value during development by highlighting issues that may impact final
acceptance at an early and fixable stage and should be available to support and/or direct testing as is
appropriate and efficient.
B.1.9 Integration and Test Products

B.1.9.1 The integration and test of the MS&SE will deliver products that will support the V&V case for, in
particular, technical interoperability. At this stage evidence to support the AC that relate to MS&SE
performance will be collected; again the V&V team can support integration and test and will need to ensure
that the evidence that they require to support the acceptance decision is collected.
B.1.9.2 Many of the products at this stage will be the results from the various component and MS&SE
integration tests. This testing can generate a huge volume of data; it is essential that this volume is
anticipated by the V&V team at the Evidence Requirement definition stage and that processes are in place to
capture, store and analyse the data.
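As a simple illustration of such a process (the CSV layout and field name are assumptions for the sketch; real test logs may use quite different formats), large result files can be summarised incrementally rather than loaded whole:

    import csv

    def summarise_runs(path, field_name):
        """Incrementally summarise one recorded parameter from a large
        test log without loading the whole file into memory."""
        count, total, worst = 0, 0.0, float("-inf")
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                value = float(row[field_name])
                count += 1
                total += value
                worst = max(worst, value)
        if count == 0:
            raise ValueError("no rows found")
        return {"runs": count, "mean": total / count, "max": worst}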
B.1.10 Analyse Results

B.1.10.1 The bottom line for any MS&SE is that it should meet the UNS, and this is largely addressed by the analysis of the MS&SE results upon its execution. Results validation is the final line of evidence to support acceptance; it examines the extent to which the MS&SE, and its associated data, can provide appropriate responses when exercised in the context of the application.
B.1.10.2 Beginning in the implementation phase, individual components or modules are executed to
ensure appropriate performance and output. Additional testing is done as the components are integrated.
Ultimately, the complete MS&SE is executed and the results are analysed to determine if it is producing
credible, appropriate answers.
B.1.10.3 An MS&SE has results validity when testing has provided sufficient results to assure that it meets the purpose's representational requirements. This decision may come from any one, or a combination, of the following comparison techniques:
a. SME Assessments (face validation).
b. Visual comparison.
c. Analytical comparison.
d. Formal comparisons.
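For the analytical comparison in item c above, a minimal illustrative sketch (the function name and the 0.2 threshold are assumptions for the example) might compare paired simulation outputs against referent observations by root-mean-square error:

    import math

    def rms_error(sim_series, referent_series):
        """Root-mean-square error between paired simulation outputs
        and referent observations."""
        if len(sim_series) != len(referent_series):
            raise ValueError("series must be paired point-for-point")
        squares = [(s - r) ** 2 for s, r in zip(sim_series, referent_series)]
        return math.sqrt(sum(squares) / len(squares))

    # illustrative check against an assumed accuracy threshold
    assert rms_error([1.0, 2.1, 2.9], [1.0, 2.0, 3.0]) < 0.2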

B.2 Further Sources of Information

B.2.1 Verification and validation is an area of ongoing research and development. Of particular note are the following:
B.2.1.1 The US DoD Defense Modeling and Simulation Office (DMSO) has been a leader in the development of V&V techniques and maintains a useful website for accessing its guidance documentation (http://vva.dmso.mil/).
B.2.1.2 The Simulation Interoperability Standards Organisation (SISO) is a leading organisation in developing V&V techniques and standards. Up-to-date information can be obtained from the SISO website (www.sisostds.org).


Crown Copyright 2008

Copying Only as Agreed with DStan

Defence Standards are Published by and Obtainable from:

Defence Equipment and Support


UK Defence Standardization
Kentigern House
65 Brown Street
GLASGOW G2 8EX

DStan Helpdesk

Tel 0141 224 2531/2


Fax 0141 224 2503
Internet e-mail: enquiries@dstan.mod.uk

File Reference
The DStan file reference relating to work on this standard is D/Dstan/23/44.
Contract Requirements
When Defence Standards are incorporated into contracts users are responsible for their correct
application and for complying with contractual and statutory requirements. Compliance with a Defence
Standard does not in itself confer immunity from legal obligations.
Revision of Defence Standards
Defence Standards are revised as necessary by an up issue or amendment. It is important that users
of Defence Standards should ascertain that they are in possession of the latest issue or amendment.
Information on all Defence Standards can be found on the DStan Website www.dstan.mod.uk,
updated weekly and supplemented regularly by Standards in Defence News (SID News). Any person
who, when making use of a Defence Standard encounters an inaccuracy or ambiguity is requested to
notify UK Defence Standardization (DStan) without delay in order that the matter may be investigated
and appropriate action taken.
