
University of Kentucky

UKnowledge
Educational, School, and Counseling Psychology Faculty Publications

3-2002

Does It Make a Difference? Evaluating Professional Development

Thomas R. Guskey
University of Kentucky, GUSKEY@UKY.EDU

Follow this and additional works at: http://uknowledge.uky.edu/edp_facpub


Part of the Educational Assessment, Evaluation, and Research Commons

Repository Citation
Guskey, Thomas R., "Does It Make a Difference? Evaluating Professional Development" (2002). Educational, School, and Counseling
Psychology Faculty Publications. 7.
http://uknowledge.uky.edu/edp_facpub/7

This Article is brought to you for free and open access by the Educational, School, and Counseling Psychology at UKnowledge. It has been accepted for
inclusion in Educational, School, and Counseling Psychology Faculty Publications by an authorized administrator of UKnowledge. For more
information, please contact UKnowledge@lsv.uky.edu.
Authors
Thomas R. Guskey

Does It Make a Difference? Evaluating Professional Development

Notes/Citation Information
Published in Educational Leadership, v. 59, issue 6, p. 45-51.

Copyright 2002 Thomas R. Guskey

The copyright holder has granted permission for posting the article here.

This article is available at UKnowledge: http://uknowledge.uky.edu/edp_facpub/7


Does It Make a Difference?
Evaluating Professional Development
Using five critical levels of evaluation, you can improve your school's professional development program. But be sure to start with the desired result: improved student outcomes.

Thomas R. Guskey

Educators have long considered professional development to be their right, something they deserve as dedicated and hardworking individuals. But legislators and policymakers have recently begun to question that right. As education budgets grow tight, they look at what schools spend on professional development and want to know, Does the investment yield tangible payoffs, or could that money be spent in better ways? Such questions make effective evaluation of professional development programs more important than ever.

Traditionally, educators haven't paid much attention to evaluating their professional development efforts. Many consider evaluation a costly, time-consuming process that diverts attention from more important activities such as planning, implementation, and follow-up. Others feel they lack the skill and expertise to become involved in rigorous evaluations; as a result, they either neglect evaluation issues completely or leave them to "evaluation experts."

Good evaluations don't have to be complicated. They simply require thoughtful planning, the ability to ask good questions, and a basic understanding of how to find valid answers. What's more, they can provide meaningful information that you can use to make thoughtful, responsible decisions about professional development processes and effects.

What Is Evaluation?
In simplest terms, evaluation is "the systematic investigation of merit or worth" (Joint Committee on Standards for Educational Evaluation, 1994, p. 3). Systematic implies a focused, thoughtful, and intentional process. We conduct evaluations for clear reasons and with explicit intent. Investigation refers to the collection and analysis of pertinent information through appropriate methods and techniques. Merit or worth denotes appraisal and judgment. We use evaluations to determine the value of something and to answer such questions as, Is this program or activity achieving its intended results? Is it better than what was done in the past? Is it better than another, competing activity? Is it worth the costs?

Some educators understand the importance of evaluation for event-driven professional development activities, such as workshops and seminars, but forget the wide range of less formal, ongoing, job-embedded professional development activities: study groups, action research, collaborative planning, curriculum development, structured observations, peer coaching, mentoring, and so on. But regardless of its form, professional development should be a purposeful endeavor. Through evaluation, you can determine whether these activities are achieving their purposes.

Critical Levels of Professional Development Evaluation
Effective professional development evaluations require the collection and analysis of the five critical levels of information shown in Figure 1 (Guskey, 2000a). With each succeeding level, the process of gathering evaluation information gets a bit more complex. And because each level builds on those that come before, success at one level is usually necessary for success at higher levels.

Level 1: Participants' Reactions
The first level of evaluation looks at participants' reactions to the professional development experience. This is the most common form of professional development evaluation and the easiest type of information to gather and analyze.

At Level 1, you address questions focusing on whether or not participants liked the experience. Did they feel their time was well spent? Did the material make sense to them? Were the activities well planned and meaningful? Was the leader knowledgeable and helpful? Did the participants find the information useful? Important questions for professional development workshops and seminars also include, Was the coffee hot and ready on time? Was the room at the right temperature? Were the chairs comfortable? To some, questions such as these may seem silly and inconsequential. But experienced professional developers know the importance of attending to these basic human needs.

Information on participants' reactions is generally gathered through questionnaires handed out at the end of a session or activity. These questionnaires typically include a combination of rating-scale items and open-ended response questions that allow participants to make personal comments. Because of the general nature of this information, many organizations use the same questionnaire for all their professional development activities.

Some educators refer to these measures of participants' reactions as "happiness quotients," insisting that they reveal only the entertainment value of an activity, not its quality or worth. But measuring participants' initial satisfaction with the experience can help you improve the design and delivery of programs or activities in valid ways.

Level 2: Participants' Learning
In addition to liking their professional development experience, we also hope that participants learn something from it. Level 2 focuses on measuring the knowledge and skills that participants gained. Depending on the goals of the program or activity, this can involve anything from a pencil-and-paper assessment (Can participants describe the crucial attributes of mastery learning



and give examples of how these might be applied in typical classroom situations?) to a simulation or full-scale skill demonstration (Presented with a variety of classroom conflicts, can participants diagnose each situation and then prescribe and carry out a fair and workable solution?). You can also use oral personal reflections or portfolios that participants assemble to document their learning.

Although you can usually gather Level 2 evaluation information at the completion of a professional development activity, it requires more than a standardized form. Measures must show attainment of specific learning goals. This means that indicators of successful learning need to be outlined before activities begin. You can use this information as a basis for improving the content, format, and organization of the program or activities.

Level 3: Organization Support and Change
At Level 3, the focus shifts to the organization. Lack of organization support and change can sabotage any professional development effort, even when all the individual aspects of professional development are done right.

Suppose, for example, that several secondary school educators participate in a professional development program on cooperative learning. They gain a thorough understanding of the theory and develop a variety of classroom activities based on cooperative learning principles. Following their training, they try to implement these activities in schools where students are graded "on the curve" (according to their relative standing among classmates) and great importance is attached to selecting the class valedictorian. Organization policies and practices such as these make learning highly competitive and will thwart the most valiant efforts to have students cooperate and help one another learn (Guskey, 2000b).

The lack of positive results in this case doesn't reflect poor training or inadequate learning, but rather organization policies that undermine implementation efforts. Problems at Level 3 have essentially canceled the gains made at Levels 1 and 2 (Sparks & Hirsh, 1997). That's why professional development evaluations must include information on organization support and change.

At Level 3, you need to focus on questions about the organization characteristics and attributes necessary for success. Did the professional development activities promote changes that were aligned with the mission of the school and district? Were changes at the individual level encouraged and supported at all levels? Were sufficient resources made available, including time for sharing and reflection? Were successes recognized and shared? Issues such as these can play a large part in determining the success of any professional development effort.

Gathering information at Level 3 is generally more complicated than at previous levels. Procedures differ depending on the goals of the program or activity. They may involve analyzing district or school records, examining the minutes from follow-up meetings, administering questionnaires, and interviewing participants and school administrators. You can use this information not only to document and improve organization support but also to inform future change initiatives.

Level 4: Participants' Use of New Knowledge and Skills
At Level 4 we ask, Did the new knowledge and skills that participants learned make a difference in their professional practice? The key to gathering relevant information at this level rests in specifying clear indicators of both the degree and the quality of implementation. Unlike Levels 1 and 2, this information cannot be gathered at the end of a professional development session. Enough time must pass to allow participants to adapt the new ideas and practices to their settings.


FIGURE 1
Five Levels of Professional Development Evaluation

Level 1: Participants' Reactions
What questions are addressed? Did they like it? Was their time well spent? Did the material make sense? Will it be useful? Was the leader knowledgeable and helpful? Were the refreshments fresh and tasty? Was the room the right temperature? Were the chairs comfortable?
How will information be gathered? Questionnaires administered at the end of the session.
What is measured or assessed? Initial satisfaction with the experience.
How will information be used? To improve program design and delivery.

Level 2: Participants' Learning
What questions are addressed? Did participants acquire the intended knowledge and skills?
How will information be gathered? Paper-and-pencil instruments; simulations; demonstrations; participant reflections (oral and/or written); participant portfolios.
What is measured or assessed? New knowledge and skills of participants.
How will information be used? To improve program content, format, and organization.

Level 3: Organization Support & Change
What questions are addressed? Was implementation advocated, facilitated, and supported? Was the support public and overt? Were problems addressed quickly and efficiently? Were sufficient resources made available? Were successes recognized and shared? What was the impact on the organization? Did it affect the organization's climate and procedures?
How will information be gathered? District and school records; minutes from follow-up meetings; questionnaires; structured interviews with participants and district or school administrators; participant portfolios.
What is measured or assessed? The organization's advocacy, support, accommodation, facilitation, and recognition.
How will information be used? To document and improve organization support; to inform future change efforts.

Level 4: Participants' Use of New Knowledge and Skills
What questions are addressed? Did participants effectively apply the new knowledge and skills?
How will information be gathered? Questionnaires; structured interviews with participants and their supervisors; participant reflections (oral and/or written); participant portfolios; direct observations; video or audio tapes.
What is measured or assessed? Degree and quality of implementation.
How will information be used? To document and improve the implementation of program content.

Level 5: Student Learning Outcomes
What questions are addressed? What was the impact on students? Did it affect student performance or achievement? Did it influence students' physical or emotional well-being? Are students more confident as learners? Is student attendance improving? Are dropouts decreasing?
How will information be gathered? Student records; school records; questionnaires; structured interviews with students, parents, teachers, and/or administrators; participant portfolios.
What is measured or assessed? Student learning outcomes: cognitive (performance and achievement); affective (attitudes and dispositions); psychomotor (skills and behaviors).
How will information be used? To focus and improve all aspects of program design, implementation, and follow-up; to demonstrate the overall impact of professional development.


Because implementation is often a gradual and uneven process, you may also need to measure progress at several time intervals.

You may gather this information through questionnaires or structured interviews with participants and their supervisors, oral or written personal reflections, or examination of participants' journals or portfolios. The most accurate information typically comes from direct observations, either with trained observers or by reviewing video- or audiotapes. These observations, however, should be kept as unobtrusive as possible (for examples, see Hall & Hord, 1987).

You can analyze this information to help restructure future programs and activities to facilitate better and more consistent implementation.

Level 5: Student Learning Outcomes
Level 5 addresses "the bottom line": How did the professional development activity affect students? Did it benefit them in any way? The particular student learning outcomes of interest depend, of course, on the goals of that specific professional development effort.

In addition to the stated goals, the activity may result in important unintended outcomes. For this reason, evaluations should always include multiple measures of student learning (Joyce, 1993). Consider, for example, elementary school educators who participate in study groups dedicated to finding ways to improve the quality of students' writing and devise a series of strategies that they believe will work for their students. In gathering Level 5 information, they find that their students' scores on measures of writing ability over the school year increased significantly compared with those of comparable students whose teachers did not use these strategies.

On further analysis, however, they discover that their students' scores on mathematics achievement declined compared with those of the other students. This unintended outcome apparently occurred because the teachers inadvertently sacrificed instructional time in mathematics to provide more time for writing. Had information at Level 5 been restricted to the single measure of students' writing, this important unintended result might have gone unnoticed.

Measures of student learning typically include cognitive indicators of student performance and achievement, such as portfolio evaluations, grades, and scores from standardized tests. In addition, you may want to measure affective outcomes (attitudes and dispositions) and psychomotor outcomes (skills and behaviors). Examples include students' self-concepts, study habits, school attendance, homework completion rates, and classroom behaviors. You can also consider such schoolwide indicators as enrollment in advanced classes, membership in honor societies, participation in school-related activities, disciplinary actions, and retention or drop-out rates. Student and school records provide the majority of such information. You can also include results from questionnaires and structured interviews with students, parents, teachers, and administrators.

Level 5 information about a program's overall impact can guide improvements in all aspects of professional development, including program design, implementation, and follow-up. In some cases, information on student learning outcomes is used to estimate the cost effectiveness of professional development, sometimes referred to as "return on investment" or "ROI evaluation" (Parry, 1996; Todnem & Warner, 1993).

Look for Evidence, Not Proof
Using these five levels of information in professional development evaluations, are you ready to "prove" that professional development programs make a difference?
Can you now demonstrate that a particular professional development program, and nothing else, is solely responsible for the school's 10 percent increase in student achievement scores or its 50 percent reduction in discipline referrals?

Of course not. Nearly all professional development takes place in real-world settings. The relationship between professional development and improvements in student learning in these real-world settings is far too complex and includes too many intervening variables to permit simple causal inferences (Guskey, 1997; Guskey & Sparks, 1996). What's more, most schools are engaged in systemic reform initiatives that involve the simultaneous implementation of multiple innovations (Fullan, 1992). Isolating the effects of a single program or activity under such conditions is usually impossible.

But in the absence of proof, you can collect good evidence about whether a professional development program has contributed to specific gains in student learning. Superintendents, board members, and parents rarely ask, "Can you prove it?" Instead, they ask for evidence. Above all, be sure to gather evidence on measures that are meaningful to stakeholders in the evaluation process.

Consider, for example, the use of anecdotes and testimonials. From a methodological perspective, they are a poor source of data. They are typically highly subjective, and they may be inconsistent and unreliable. Nevertheless, as any trial attorney will tell you, they offer the kind of personalized evidence that most people believe, and they should not be ignored as a source of information. Of course, anecdotes and testimonials should never form the basis of an entire evaluation. Setting up meaningful comparison groups and using appropriate pre- and post-measures provide valuable information. Time-series designs that include multiple measures collected before and after implementation are another useful alternative.

Keep in mind, too, that good evidence isn't hard to come by if you know what you're looking for before you begin. Many educators find evaluation at Levels 4 and 5 difficult, expensive, and time-consuming because they are coming in after the fact to search for results (Gordon, 1991). If you don't know where you are going, it's very difficult to tell whether you've arrived. But if you clarify your goals up front, most evaluation issues fall into place.

Working Backward Through the Five Levels
Three important implications stem from this model for evaluating professional development. First, each of these five levels is important. The information gathered at each level provides vital data for improving the quality of professional development programs.

Second, tracking effectiveness at one level tells you nothing about the impact at the next. Although success at an early level may be necessary for positive results at the next higher one, it's clearly not sufficient. Breakdowns can occur at any point along the way. It's important to be aware of the difficulties involved in moving from professional development experiences (Level 1) to improvements in student learning (Level 5) and to plan for the time and effort required to build this connection.

The third implication, and perhaps the most important, is this: In planning professional development to improve student learning, the order of these levels must be reversed. You must plan "backward" (Guskey, 2001), starting where you want to end and then working back.

In backward planning, you first consider the student learning outcomes that you want to achieve (Level 5). For example, do you want to improve students' reading comprehension, enhance their skills in problem solving, develop their sense of confidence in learning situations, or improve their collaboration with classmates? Critical analyses of relevant data from assessments of student learning, examples of student work, and school records are especially useful in identifying these student learning goals.

Then you determine, on the basis of pertinent research evidence, what instructional practices and policies will most effectively and efficiently produce those outcomes (Level 4). You need to ask, What evidence verifies that these particular practices and policies will lead to the desired results? How good or reliable is that evidence? Was it gathered in a context similar to ours? Watch out for popular innovations that are



more opinion-based than research-based, promoted by people more concerned with "what sells" than with "what works." You need to be cautious before jumping on any education bandwagon, always making sure that trustworthy evidence validates whatever approach you choose.

Next, consider what aspects of organization support need to be in place for those practices and policies to be implemented (Level 3). Sometimes, as I mentioned earlier, aspects of the organization actually pose barriers to implementation. "No tolerance" policies regarding student discipline and grading, for example, may limit teachers' options in dealing with students' behavioral or learning problems. A big part of planning involves ensuring that organization elements are in place to support the desired practices and policies.

Then, decide what knowledge and skills the participating professionals must have to implement the prescribed practices and policies (Level 2). What must they know and be able to do to successfully adapt the innovation to their specific situation and bring about the sought-after change?

Finally, consider what set of experiences will enable participants to acquire the needed knowledge and skills (Level 1). Workshops and seminars, especially when paired with collaborative planning and structured opportunities for practice with feedback, action research projects, organized study groups, and a wide range of other activities can all be effective, depending on the specified purpose of the professional development.

This backward planning process is so important because the decisions made at each level profoundly affect those at the next. For example, the particular student learning outcomes you want to achieve influence the kinds of practices and policies you implement. Likewise, the practices and policies you want to implement influence the kinds of organization support or change required, and so on.

The context-specific nature of this work complicates matters further. Even if we agree on the student learning outcomes that we want to achieve, what works best in one context with a particular community of educators and a particular group of students might not work as well in another context with different educators and different students. This is what makes developing examples of truly universal "best practices" in professional development so difficult. What works always depends on where, when, and with whom.

Unfortunately, professional developers can fall into the same trap in planning that teachers sometimes do: making plans in terms of what they are going to do, instead of what they want their students to know and be able to do. Professional developers often plan in terms of what they will do (workshops, seminars, institutes) or how they will do it (study groups, action research, peer coaching). This diminishes the effectiveness of their efforts and makes evaluation much more difficult.

Instead, begin planning professional development with what you want to achieve in terms of learning and learners and then work backward from there. Planning will be much more efficient, and the results will be much easier to evaluate.

Making Evaluation Central
A lot of good things are done in the name of professional development. But so are a lot of rotten things. What educators haven't done is provide evidence to document the difference between the two. Evaluation provides the key to making that distinction. By including systematic information gathering and analysis as a central component of all professional development activities, we can enhance the success of professional development efforts everywhere.

References
Fullan, M. G. (1992). Visions that blind. Educational Leadership, 49(5), 19-20.
Gordon, J. (1991, August). Measuring the "goodness" of training. Training, 19-25.
Guskey, T. R. (1997). Research needs to link professional development and student learning. Journal of Staff Development, 18(2), 36-40.
Guskey, T. R. (2000a). Evaluating professional development. Thousand Oaks, CA: Corwin.
Guskey, T. R. (2000b). Grading policies that work against standards and how to fix them. NASSP Bulletin, 84(620), 20-29.
Guskey, T. R. (2001). The backward approach. Journal of Staff Development, 22(3), 60.
Guskey, T. R., & Sparks, D. (1996). Exploring the relationship between staff development and improvements in student learning. Journal of Staff Development, 17(4), 34-38.
Hall, G. E., & Hord, S. M. (1987). Change in schools: Facilitating the process. Albany, NY: SUNY Press.
Joint Committee on Standards for Educational Evaluation. (1994). The program evaluation standards (2nd ed.). Thousand Oaks, CA: Sage.
Joyce, B. (1993). The link is there, but where do we go from here? Journal of Staff Development, 14(3), 10-12.
Parry, S. B. (1996). Measuring training's ROI. Training & Development, 50(5), 72-75.
Sparks, D. (1996, February). Viewing reform from a systems perspective. The Developer, 2, 6.
Sparks, D., & Hirsh, S. (1997). A new vision for staff development. Alexandria, VA: ASCD.
Todnem, G., & Warner, M. P. (1993). Using ROI to assess staff development efforts. Journal of Staff Development, 14(3), 32-34.

Copyright 2002 Thomas R. Guskey.

Thomas R. Guskey (Guskey@uky.edu) is a professor of education policy studies and evaluation, College of Education, University of Kentucky, Taylor Education Building, Lexington, KY 40506, and author of Evaluating Professional Development (Corwin Press, 2000).

