
2007 Fall Extension Conference

Evaluating Training Programs
The Four Levels
Dr. Myron A. Eighmy
Based on the work of Dr. Donald L.
Kirkpatrick, University of Wisconsin -
Madison
Objectives
Upon completion of this presentation you will
be able to:
State why evaluation of programs is critical to you
and your organization.
Apply Kirkpatrick's four levels of evaluation to
your programs.
Use guidelines for developing evaluations.
Implement various forms and approaches to
evaluation.
Why Evaluate?
Determine the effectiveness of the program design
How the program was received by the participants
How learners fared on assessment of their learning
Determine what instructional strategies work:
  presentation mode
  presentation methods
  learning activities
  desired level of learning
Program improvement
Why Evaluate?
Should the program be continued?
How do you justify your existence?
How do you determine the return on
investment for the program?
  human capital
  individual competence
  social/economic benefit

Four Levels of Evaluation
Kirkpatrick
During-program evaluation
  Level One: Reaction
  Level Two: Learning
Post-program evaluation
  Level Three: Behavior
  Level Four: Results
Reaction Level
A customer satisfaction measure
Were the participants pleased with
the program?
Perception of whether they learned anything
Likelihood of applying the content
Effectiveness of particular strategies
Effectiveness of the packaging of
the course
Examples of Level One
Your Opinion, Please
In a word, how would you describe this
workshop?
Intent: Solicit feedback about the course. Can
also assess whether respondents transposed the
numeric scales.
Example of Level One
Using a number, how would you describe this
program? (circle a number)

Terrible Average Outstanding
1 2 3 4 5

Intent: Provides quantitative feedback to determine
average responses (descriptive data). Watch scale
sets!
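The averaging described above can be sketched in a few lines. This is a hypothetical example with invented response data, not part of the original presentation: it summarizes circled 1–5 ratings into descriptive statistics (count, mean, median, and the distribution of answers).

```python
# Hypothetical sketch: summarizing 1-5 Likert ratings from a level-one
# reaction form into descriptive data. All responses are invented.
from statistics import mean, median
from collections import Counter

# Circled numbers: 1 = Terrible ... 5 = Outstanding
responses = [4, 5, 3, 4, 4, 2, 5, 4, 3, 5]

summary = {
    "n": len(responses),                                   # number of respondents
    "mean": round(mean(responses), 2),                     # average rating
    "median": median(responses),                           # middle rating
    "distribution": dict(sorted(Counter(responses).items())),  # count per rating
}
print(summary)  # e.g. {'n': 10, 'mean': 3.9, 'median': 4.0, 'distribution': {2: 1, 3: 2, 4: 4, 5: 3}}
```

Keeping the full distribution alongside the mean helps reveal the "scale set" problems the slide warns about, such as a cluster of reversed ratings.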
Example of Level One
How much did you know about this subject before taking this
workshop?
Nothing Some A lot
1 2 3 4 5

How much do you know about this subject after participating in
this workshop?
Nothing Some A lot
1 2 3 4 5

Intent: The question does not assess actual learning; it
assesses perceived learning.
Example of Level One
How likely are you to use some or all of the skills
taught in this workshop in your work/community/
family?
Not Likely    Likely    Very Likely
1 2 3 4 5
Intent: Determine learners' perceived relevance of
the material. May correlate with the satisfaction
learners feel.
Example of Level One
The best part of this program was …
The one thing that could be improved most is …

Intent: Qualitative feedback on the course that
helps prioritize revision work. Develop themes
on exercises, pace of the course, etc.
Guidelines for Evaluating Reaction
Decide what you want to find out.
Design a form that will quantify reactions.
Encourage written comments.
Get 100% immediate response.
Get honest responses.
If desirable, get delayed reactions.
Determine acceptable standards.
Measure future reactions against the standard.
Learning Level
What did the participants learn in the
program?
The extent to which participants change attitudes,
increase knowledge, and/or increase skill.
What exactly did the participant learn and not
learn?
Pretest Posttest
Learning Level
Requires developing specific
learning objectives to be
evaluated.
Learning measures should be
objective and quantifiable.
Paper-and-pencil tests, performance on
skills tests, simulations, role-plays,
case studies, etc.
Level Two Examples
Develop a written exam based on the desired
learning objectives.
Use the exam as a pretest
Provide participants with a worksheet/activity sheet
that will allow for tracking during the session.
Emphasize and repeat key learning points during the
session.
Use the pretest exam as a posttest exam.
Compute the posttest-pretest gain on the exam.
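The final step above, computing the posttest-pretest gain, can be sketched as follows. The participant names and exam scores are invented for illustration:

```python
# Hypothetical sketch: posttest-minus-pretest gain per participant on a
# level-two exam, plus the group's average gain. All scores are invented.
pretest  = {"ann": 55, "ben": 70, "carla": 60}   # scores before the session
posttest = {"ann": 80, "ben": 85, "carla": 75}   # same exam after the session

# Gain for each participant: posttest score minus pretest score
gains = {name: posttest[name] - pretest[name] for name in pretest}

# Average gain across the group
avg_gain = sum(gains.values()) / len(gains)
print(gains)                     # per-participant gains
print(round(avg_gain, 2))        # group average gain
```

Per-item gains (rather than just the total) can also show exactly which objectives were and were not learned, which the Learning Level slide asks about.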
What makes a good test?
The only valid test questions emerge from
the objectives.
Consider writing main objectives and
supporting objectives.
Test questions usually test supporting
objectives.
Ask more than one question on each
objective.


Level Two Strategies
Consider using scenarios, case studies,
sample project evaluations, etc., rather than
test questions. Develop a rubric of desired
responses.
Develop between 3 and 10 questions or
scenarios for each main objective.
Level Two Strategies
Provide instructor feedback during the
learning activities.
Requires the instructor to actively monitor
participants' discussion, practice activities, and
engagement, and to provide learners with feedback.
Ask participants open-ended questions
(congruent with the learning objectives) during
activities to test participant understanding.
Example
Which of the following should be considered when
evaluating at the Reaction Level? (more than one
answer possible)
___Evaluate only the lesson content
___Obtain both subjective and objective
responses
___Get 100% response from participants
___Honest responses are important
___Only the course instructor should review results.
Example
Match the following to the choices below:
___ Reaction Level
___ Learning Level
A. Changes in performance at work
B. Participant satisfaction
C. Organizational Improvement
D. What the participant learned in class
Scenario Example
An instructor would like to know the
effectiveness of the course design and how
much a participant has learned in a seminar.
The instructor would like to achieve at least
Level Two evaluation.
What techniques could the instructor use to
achieve level two evaluation?
Should the instructor also consider doing a level
one evaluation? Why or why not?
Rubric for Scenario Question
Directions to instructor: Use the following topic checklist to determine
the completeness of the participant's response:

___ Learner demonstrated an accurate understanding of what level two
is: learning level.
___ Learner provided at least two specific examples: pretest-posttest,
performance rubrics, scenarios, case studies, hands-on practice.
___ Learner demonstrated an accurate understanding of what level one
evaluation is: reaction level.
___The learner provided at least three specific examples of why level
one is valuable: assess satisfaction, learning activities, course
packaging, learning strategies, likelihood of applying learning.
Behavior Level
How the training affects performance.
The extent to which change in behavior
occurred.
Was the learning transferred from the
classroom to the real world?
Transfer, transfer, transfer!

Conditions Necessary to Change
The person must:
have a desire to change.
know what to do and how to do it.
work in the right climate.
be rewarded for changing.


Types of Climates
Preventing: use of the learning is forbidden.
Discouraging: changes in the current way of
doing things are not desired.
Neutral: learning is ignored.
Encouraging: receptive to applying new
learning.
Requiring: change in behavior is
mandated.
Guidelines for Evaluating Behavior
Measure on a before/after basis
Allow time for behavior change (adaptation) to take
place
Survey or interview one or more who are in the best
position to see change.
The participant/learner
The supervisor/mentor
Subordinates or peers
Others familiar with the participant's actions.
Guidelines for Evaluating Behavior
Get 100% response or a sample?
Depends on size of group. The more the better.
Repeat at appropriate times
Remember that other factors can influence
behavior over time.
Use a control group if practical
Consider cost vs. benefits of the evaluation
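The before/after and control-group guidelines above can be sketched together. This is a hypothetical illustration with invented behavior-rating data (e.g. supervisor ratings on a 1–5 scale), not a method from the presentation: the training effect is estimated as the trained group's average change minus the control group's average change, which helps separate the program's effect from the "other factors" the slide mentions.

```python
# Hypothetical sketch: before/after behavior ratings (1-5 scale) for a
# trained group vs. an untrained control group. All numbers are invented.
def mean_change(before, after):
    """Average per-person change from the before rating to the after rating."""
    return sum(a - b for b, a in zip(before, after)) / len(before)

trained_before, trained_after = [3, 2, 4, 3], [5, 4, 5, 4]   # participants
control_before, control_after = [3, 3, 4, 2], [3, 4, 4, 2]   # control group

# Training effect = trained group's change minus control group's change
effect = mean_change(trained_before, trained_after) - mean_change(control_before, control_after)
print(effect)
```

With real data, the same comparison would be repeated at appropriate times after the program, once behavior change has had time to take place.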
Level Three Examples
Observation
Survey or Interview
Participant and/or others
Performance benchmarks
Before and after
Control group
Evidence or Portfolio
Survey or Patterned Interview
1. Explain purpose of the survey/interview.
2. Review program objectives and content.
3. Ask the program participant to what extent performance was improved as a result
of the program. __ Large extent __ Some extent __ Not at all
If "Large extent" or "Some extent," ask the participant to explain.
4. If "Not at all," indicate why not:
___ Program content wasn't practical
___ No opportunity to use what I learned
___ My supervisor prevented or discouraged me from changing
___ Other higher priorities
___ Other reason (please explain)
5. Ask, "In the future, to what extent do you plan to change your behavior?"
___ Large extent ___ Some extent ___ Not at all
Ask the participant to explain.
Evidence and Portfolio
Thank you for participating. I am very interested in how the
evaluation skills you have learned are used in your work.

Please send me a copy of at least one of the following:
a level three evaluation that you have designed.
a copy of level two evaluations that use more than one method of
evaluating participant learning.
a copy of a level one evaluation that you have modified and tell me how
it influenced program improvement.
(indicate if you would like my critique on any of the evaluations)
If I do not hear from you before January 30, I will give you a call:
no pressure, I would just love to learn what you are doing.
Results Level
Impact of education and training on the
organization or community.
The final results that occurred as a result of
training.
The ROI for training.
Examples of Level Four
How did the training save costs?
Did work output increase?
Was there a change in the quality of work?
Did the social condition improve?
Did the individual create an impact on the
community?
Is there evidence that the organization or
community has changed?
Guidelines for Evaluating Results
Measure before and after
Allow time for change to take place
Repeat at appropriate times
Use a control group if practical
Consider cost vs. benefits of doing Level Four
Remember, other factors can affect results
Be satisfied with "evidence" if "proof" is not possible.