

“Historically, the evaluation component of the classic consulting model has largely been downplayed or ignored. Today, with increasing pressure from organizations to demonstrate the value of our efforts, having both a well-designed and articulated evaluation strategy is key, as is a detailed multi-measure and multi-level measurement process.”

The Art and Science of Evaluating Organization Development Interventions
By Allan H. Church

How do we evaluate the impact of our organization change programs, processes, and initiatives? What are the best ways to measure success or failure of various interventions? How do we know we have really made a difference? While the field of organization development (OD) has its origins in action research and enhancing the growth and development of organizations and their people, if we are honest with ourselves, our focus as a field on formally evaluating the impact of our work has lagged far behind. The level of emphasis collectively placed as a field on the debate around having a clear and consistent definition of OD, the right set of core OD values, and creating new types of tools and techniques has far overshadowed the rigor and share of mind given to measuring the impact of our efforts.

Why is this the case? Is this due to practitioners' lack of measurement design and analytics capability, as some have argued (e.g., Church & Dutta, 2013)? Recent research conducted on over 380 practitioners in the field (Shull, Church, & Burke, 2014) would suggest this might be part of the issue. Only 29% cited using statistics and research methods in their OD toolkit. Or is it because it is not part of their values structure? That same study reported that evidence-based practice was ranked 21st out of 34 possible values that drive their efforts. The history of evaluation as a core practice area in OD would also support this argument. Could the reason for our lack of focus on outcomes simply be because it is too difficult and daunting to design a meaningful and robust evaluation process?

Whether you are an external consultant or internal practitioner there are a host of challenges associated with the measurement of causal relationships resulting from organizational interventions at the individual or systems level. Despite the definitive writings of Kirkpatrick (1998), the prospect of evaluating much of what we do in the social sciences and in organizational settings involves dynamic, interdependent, and often long lead times that can far outlast the consultant-client (or employee-employer) relationship. In fact, one of the most frustrating elements of being an external practitioner is the lack of visibility into the long-term impact of one's work. This is often cited as a key reason why people take internal positions. While it is fantastic to experience multiple client organizations and challenges, it can also be quite rewarding to experience first-hand the changes in a company (or other social system) following work you have personally had a hand in over time. The other side of the equation is the personal challenge, and even threat, involved in evaluating one's own work. As an external consultant, if your project fails to deliver you might not get paid for that engagement. As an internal practitioner, your program or process might be cancelled and you could find yourself out of a job.

All of this sits squarely in juxtaposition with the client's interest in measuring the impact, and in some cases the financial return on investment (ROI), of our interventions.


[Figure 1. Classic OD Consulting Process Model (cycle: Entry → Contracting → Data Gathering → Data Analysis → Feedback & Interpretation → Intervention(s) → Evaluation / Success Metrics)]

While this element of the work has always been present (arguably few paid organizational interventions are done purely for the sake of humanistic values alone), in today's hypercompetitive business environment the emphasis has never been stronger. Given the dynamics and challenges cited above (e.g., capability, values, complexity, and personal interest in the game), how do we as OD practitioners move the needle and more holistically embrace the evaluation conundrum in our efforts?

The purpose of this paper is to discuss this issue in depth and attempt to answer these questions by focusing on the art and science involved in evaluating OD interventions. The paper begins with a brief overview of the evolution of the evaluation phase of the classic consulting paradigm from an afterthought to a core element required in the field today. This will be followed by a discussion of three key requirements for setting an evaluation strategy. Lack of attention to these areas works against practitioners' and their clients' ability to appropriately conceptualize and implement suitable outcome measures. These requirements will be presented in the context of why they cause issues and how best to address them up front in the process. The paper then offers three additional recommendations for creating an effective evaluation process that will yield the right kinds of information needed to demonstrate the impact of OD and related applied organizational interventions.

Recognizing Our Biases

Before proceeding, however, and as any OD practitioner should do, it is important to recognize several biases inherent in the approach to be presented here. First, the perspective is that of an organizational psychologist by formal training with a significant grounding in applied statistics and measurement theory (and therefore represents a scientist-practitioner mindset). Thus, there is an inherent bias that evaluation methods for all types of organizational initiatives (whether directed at organization change, enhancing development, or improving performance) should follow some degree of rigor and contain both quantitative as well as qualitative components of OD as a data-driven process (Waclawski & Church, 2002). Second, the insights and observations offered are based on the author's personal external and internal consulting experience and evaluation research over the past 25 years following the implementation of a variety of large-scale organizational change initiatives and global talent management processes. The focus therefore is less on one-off individual coaching engagements, team building efforts, or group interventions but more on measuring the impact of broader initiatives, systems, and processes. Thus, while the suggestions offered reflect a certain normative and science-based paradigm, and are constrained by the experiences of the author, it is hoped that they will appeal to a much broader spectrum of OD applications.

The Evolution of Evaluation in OD

Although the evaluation stage of the classic consulting model that OD shares with many other applied social science disciplines, such as industrial-organizational psychology (I-O) and human resource development (HRD), has been present since Burke (1982), Nadler (1977), and others (e.g., Rothwell, Sullivan, & McLean, 1995) discussed the framework, the focus continues to be relatively limited in the field.

As many authors have noted (e.g., Anderson, 2012; Burke, 2011; Church, 2003; Cummings & Worley, 2015), the evaluation stage in the model is often given lip service or overlooked entirely. This is true whether you look at classic approaches to doing OD work as well as the newest dialogic approaches (Bushe & Marshak, 2015—where evaluation, for example, is not even listed in the topic index). A quick scan of the EBSCO database shows no academic articles published at all, for example, on the terms evaluation and OD since 2012.

Instead, the emphasis has often been placed on the actions taken or the change in behaviors and culture observed. While some research on the evaluation of various individual OD methods has certainly been done (e.g., Basu & Das, 2000; Terpstra, 1981; Woodman & Wayne, 1985), and there is a plethora of qualitative and quantitative case studies both individually and in OD texts citing the impact of various programs, relatively few authors have taken a more focused view of how to systematically measure our efforts.


Moreover, many of these have come from authors with cross-disciplinary backgrounds (e.g., Armenakis, Bedeian, & Pond, 1983; Edwards, Scott, & Raju, 2003; Martineau & Preskill, 2002; Rothwell et al., 1995). At the same time, scholar-practitioners in other related disciplines (e.g., Holton, 1996; Kirkpatrick, 1998; Roberts & Robertson, 1993; Svyantek, O'Connell, & Baumgardner, 1992; Terpstra, 1981; Woodman & Wayne, 1985) have taken this topic on years ago. Surprisingly enough, those with more academic orientations have advanced the field in far more significant ways than more traditional OD scholar-practitioners, with the introduction of scorecards (e.g., Becker, Huselid, & Ulrich, 2001), more recently bottom-line linkages (e.g., Lawler, 2014; Savitz & Weber, 2013), and the application of decision-science (Boudreau & Ramstad, 2007) to their work.

Given the trends in the field it would seem we are at a crossroads. There is increasing pressure to measure the impact of our efforts (Anderson, 2012; Shull et al., 2014), at multiple levels in the organization (Lawler, 2014), and using more types of complex and potentially anti-OD oriented "Big Data" applications (Church & Dutta, 2014). Yet arguably few OD practitioners have the right set of capabilities included in their formal training (Church, 2001) or core toolkits (Church & Dutta, 2013; Shull et al., 2014) to address these trends. Consequently, we are simply not engaging in rigorous evaluation methods of our organizational interventions. Let us now explore the reasons in more depth in the hopes of finding some answers and potential solutions to this challenge in the field.

Three Key Requirements for Setting an Evaluation Strategy

There are many reasons why OD professionals might not pursue, or even actively seek to avoid, engaging in the evaluation stage of their work. Some of these reflect internal states and motivations (e.g., values, personal investment in the outcome) and/or a lack of specific capabilities and skills (e.g., in research design and statistics). While very important for setting a baseline, they do not necessarily tell the complete story. Instead, the emphasis here is on a third set of reasons—that is, the dynamics and complexities of measuring change in real time in organizations. Listed below are three key challenges and requirements for setting an effective evaluation strategy.

1. Clarifying the Definition of Impact

Whether we start with a measurement theory approach (what is the criterion?) or an OD consulting model (what is included in the contract?), the importance of clarifying up front the outcomes to be measured is critical to the success of any intervention (see Figure 1). This is true whether the effort is a simple team building exercise, an executive coaching assignment, the implementation of a large-scale engagement survey program, or a whole systems process intervention. Whatever the initiative, it is imperative that outcomes be clearly articulated and that there is alignment up front in the contracting or project charter stages. Importantly, this is more than a set of objectives or goals for the project. The definition of impact, i.e., the intended outcomes, needs to be crystal clear in such a way that it can be measured at one or more specific levels of analysis.

Many practitioners have as their goal culture change, behavior change, process improvement, organizational effectiveness, enhanced team functioning, etc. These are all excellent objectives but they are not tight enough to be used as measures of impact. You need to be able to answer the question of "what will be the measurable indicators of a positive outcome as a result of this effort?" These can be quantitative or qualitative (some of both methods is generally best for maximizing perceived evaluation credibility) but they need to be measurable and aligned.

Marshall Goldsmith, for example, in his coaching practice is known for his contracting efforts around behavior change. As part of the commitment to his work, he uses a pre- and post-behavioral feedback assessment tool following a yearlong engagement. If he does not see change in the measure, it is a direct indicator of the success or failure of the coaching project. While this might sound simple enough, there are important measurement aspects, such as the quality of the feedback measure used, the nature of the raters selected, the rating process (confidential vs. anonymous), etc., that can impact the outcomes in ways that might be unexpected.

While other interventions (e.g., a cultural integration effort following a merger and acquisition) are much more complex than this, the same principle applies. In this context, the outcome measure might be comprised of targeted levels of turnover, identification and retention of key talent, improvements in levels of engagement or other cultural indicators on an employee survey, increased performance in business units, or an increase in the innovation pipeline 2–3 years later. There are no correct answers but there are aligned ones. The focus needs to be on realistic, measurable indicators of impact that can be linked to the timing of a specific intervention.
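To make this kind of pre/post behavior-change contract concrete, here is a minimal sketch in Python of testing whether rated behavior changed over a yearlong engagement. The data, scale, and test are illustrative assumptions, not Goldsmith's actual instrument or analysis:

```python
# Minimal sketch: paired pre/post comparison of behavioral feedback ratings.
# Hypothetical data; in practice the measure is whatever instrument was
# contracted up front as the agreed indicator of impact.
from scipy import stats

# One rating per rater on the target behavior, before and after the engagement
pre_ratings  = [3.1, 2.8, 3.4, 3.0, 2.9, 3.2, 3.3, 2.7]   # baseline (1-5 scale)
post_ratings = [3.9, 3.5, 4.1, 3.6, 3.8, 4.0, 3.7, 3.4]   # one year later

# Paired t-test: did the same raters score the behavior higher afterward?
t_stat, p_value = stats.ttest_rel(post_ratings, pre_ratings)

mean_change = (sum(post_ratings) - sum(pre_ratings)) / len(pre_ratings)
print(f"Mean change: {mean_change:+.2f} points, t = {t_stat:.2f}, p = {p_value:.4f}")
```

With a real engagement, the raters, the scale, and the analysis would all be specified during contracting—which is precisely the point of defining impact up front.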


2. Setting Realistic Time Horizons for Measurement

OD practitioners all know the simple fact that change takes time. Based on the prevailing problem, scope, and intervention, this can range anywhere from minutes following a process observation to years after a leadership transition. Unfortunately, clients are not always of the same mind. While fewer and fewer executives seem to believe in the fallacy of changing corporate culture overnight, their sense of timing and urgency is often directly proportional to the pace of their business. For example, consumer products organizations generally move faster than pharmaceuticals. The point is that as part of the outcome alignment process OD professionals need to ensure that the timing window of the evaluation and measurement components is clear and reasonable.

Another issue that can occur is an overemphasis up front on planning for the timing of the intervention launch, with less attention paid to the appropriate lag time required to observe the impact of the effort. Often this is because the bulk of the development work and consulting delivery costs are front-loaded. There is more concern about meeting the deadline to deliver a new program or roll out a change agenda on target than there is on determining the best window (and method) for measuring the impact of that work over time.

A related issue, and a common fallacy in organizations, is the use of the "pilot" concept as a means for testing the impact of a new program. While launching an intervention in a small-scale environment or controlled area of the business can be very useful for ironing out the implementation kinks, rarely does this offer an effective means for predicting the potential impact of a much larger scale program. This is because larger scale OD efforts need to be aligned to a larger set of systems factors, which requires much broader thinking about organizational impact than what typically occurs in a small pilot context. The question to ask yourself here is "given what we are anticipating measuring, when do we expect to see this outcome change as a result of our efforts?"

One important caveat should be raised here. Although the discussion so far might suggest that all OD interventions have a distinct beginning and end to them, we know this is not the case. While the classic consulting model tends to present the world in this semi-linear fashion, the vast majority of our work rarely begins from a blank slate. Burke et al. (1997) have termed this effect "midstream consulting" and it applies in just about every case, whether internal or external. Further, there is often not a definitive end to the engagement. In fact, many organizational processes (and in particular those implemented for employee development, performance management, and talent management purposes) continue to evolve long after the initial design and implementation phases. From an evaluation perspective, then, the measurement aspect of assessing impact needs to be seen as occurring at discrete points in time and not as an end-state. This is an important distinction as it enables the practitioner to contract regarding "points of impact" measurement at different stages of evolution, and not rely exclusively on a single evaluation metric. Not only should this remove some of the burden of having to show impact all at once, but the measurement quality will improve as well. Time series studies and multi-method approaches are far more rigorous and valid than are single program reviews.

Case in point: in the mid-2000s there was an applied study done at PepsiCo on the impact of their global employee organizational health survey program (Church & Oliver, 2006). The research was conducted in an effort to answer senior leadership's questions regarding the impact of the survey on key employee outcomes. The researchers analyzed survey data over time, including the use of an action planning variable, and demonstrated the impact of taking action from the results on both softer survey outcomes of employee satisfaction and commitment as well as hard metrics such as turnover, lost time, and accidents at the plant level. Specifically, they reported that employee satisfaction and commitment a year later (via the second survey) were significantly impacted by managers who both shared the results and took action, versus those who only shared results or did nothing at all. While these insights were extremely well received in the organization, several years later the same questions emerged under new leadership, so the study was conducted again (Church et al., 2012). The same results were evident across multiple years (see Figure 2), demonstrating the power of an organizational survey with an action planning focus at the local level to drive organizational change and employee engagement.

[Figure 2. The Impact of Taking Action From Survey Results on Employee Satisfaction]
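The logic of that kind of action-planning analysis can be sketched in a few lines of Python. Everything below is hypothetical (invented unit-level data and column names), not the actual PepsiCo dataset or analysis; the point is simply the mechanics of comparing change across action-taking groups over two measurement waves:

```python
# Hypothetical sketch of an action-planning linkage analysis.
# Each row is a work unit measured at two survey waves, tagged by what the
# manager did with the wave-1 results.
import pandas as pd

units = pd.DataFrame({
    "manager_action": ["shared_and_acted", "shared_only", "did_nothing",
                       "shared_and_acted", "shared_only", "did_nothing"],
    "satisfaction_w1": [3.4, 3.5, 3.3, 3.6, 3.2, 3.4],
    "satisfaction_w2": [4.0, 3.6, 3.2, 4.2, 3.4, 3.3],
})

# Change in satisfaction between waves, averaged within action-taking group
units["delta"] = units["satisfaction_w2"] - units["satisfaction_w1"]
print(units.groupby("manager_action")["delta"].mean())
```

In a real study the group differences would also be tested for statistical significance and linked to hard metrics (turnover, lost time, accidents), as the study described above did.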


3. Applying Systems Thinking to Systems Interventions

This third recommendation is simple enough. OD practitioners have a deep understanding of systems thinking, so it should be easy to apply that same approach to the evaluation of their efforts. This means considering variables across all levels and sub-systems involved (Katz & Kahn, 1978) and aligning the measurement of those impacting and impacted by the change. Whether you prefer the Burke-Litwin model (1992) or some other approach to conceptualizing an organizational system, it is critical that broader thinking be applied than just a micro analysis of a single intervention. This is actually one area where OD practitioners should have the advantage over practitioners from other disciplines such as I-O psychologists, who tend toward a more individual level of analysis. Unfortunately, OD practitioners seem to ignore their own strengths when it comes to evaluation.

Too many practitioners treat the evaluation process as an afterthought or something to be considered once they are further along in the intervention. Though there is certainly wisdom in revising, aligning, and adjusting the measurement approach if needed as the intervention progresses, it should have a solid basis in evidence-based science articulated at the beginning of the intervention. When practitioners fail to focus on evaluation early in the process, it is no surprise that when pushed by clients to show results, they need to scramble to put something in place. This often results in a poorly designed and/or implemented measurement approach which can yield inaccurate results and might even derail the intervention itself.

As it turns out, often the best form of evaluating impact is to design a new measurement process at the start of the intervention or change effort to enable a pre-post comparison. The Marshall Goldsmith approach is in fact a perfect example of this principle in action, as is the PepsiCo organizational health survey, where the questions regarding action planning were asked the year before the company was interested in learning about the impact of the results. Even if you are not in a position to implement a pre-measurement tool, there are scenarios where it might take significant time and resources to collect the necessary information to show results, and this would need to start earlier on in the effort. Take, for example, an organization that would like to know the impact of a new leadership curriculum on online learning utilization. While the data regarding learning system utilization might not be centralized at the start of the project, by the end centralization would be necessary to aggregate the data. This would require lead time and dedicated resources.


Kirkpatrick's famous multi-level framework (1998) is perhaps the best and most easily recognized approach to setting a systemic strategy for evaluation. While initially designed for the evaluation of learning interventions, it has since been expanded to include additional concepts such as ROI (Phillips, 1991) and can easily be adapted to OD and related work including talent management (e.g., Church, 2013; DeTuncq & Schmidt, 2013; Holton, 1996). The core idea is that there are multiple levels of impact that can be measured in various ways (as aligned and timed per above). Essentially the model measures outcomes at the following levels: (1) reaction, (2) learning, (3) application, (4) business impact, and (5) bottom-line/long-term outcomes.

The key is spending the time required at the initial stages of the effort to design an integrated systems approach to the five levels of analysis, with the right types of metrics and under the right time horizons. This needs to be measurable to: (a) have sufficient rigor to demonstrate the impact of your efforts, (b) satisfy your clients, and (c) be reasonably executed with enough latitude to adjust for potential contingencies. Figure 3 provides a simple framework for how this might be applied. Only by setting the strategy for evaluation up front can you get ahead of the potential issues that will naturally come with these types of efforts.

[Figure 3. A Multi-Level Framework for Aligning Evaluation Efforts]
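One way to honor this up-front design requirement is to write the evaluation strategy down explicitly before launch. The sketch below is one hypothetical way to record such a plan against the five levels; the specific measures and time horizons are invented placeholders that would come out of contracting, not a prescribed set:

```python
# Illustrative evaluation plan mapping Kirkpatrick-style levels to
# hypothetical metrics and measurement windows agreed during contracting.
evaluation_plan = {
    "reaction":        {"measures": ["post-session pulse survey"],            "window_months": 1},
    "learning":        {"measures": ["pre/post knowledge assessment"],        "window_months": 3},
    "application":     {"measures": ["360 feedback on target behaviors"],     "window_months": 6},
    "business_impact": {"measures": ["unit turnover", "engagement scores"],   "window_months": 12},
    "bottom_line":     {"measures": ["productivity", "sales vs. baseline"],   "window_months": 24},
}

for level, spec in evaluation_plan.items():
    print(f"{level}: {', '.join(spec['measures'])} (measured at {spec['window_months']} months)")
```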
Three Recommendations for Building an OD Evaluation Process

Now that the importance of, and critical factors involved in, having an evaluation strategy have been discussed, let us turn to the process itself. What elements and types of data should be included in the evaluation process? What are the key factors in conveying the messages around the impact of your efforts? What is the role of OD values in analytics (is that an oxymoron?) and what are the pitfalls to avoid? Listed below are three key recommendations for developing an impactful evaluation process. While they are not meant to reflect all the elements of what's involved in designing evaluation research (see instead texts such as Edwards et al., 2003; Kirkpatrick, 1998), these three should be helpful in designing an evaluation approach that helps put your OD efforts in the best possible light yet stays grounded in solid measurement theory and rigor.

1. Design Using a Multiple Measures and Levels (MML) Approach

One of the aspects that makes OD efforts so exciting and attractive to people as a profession is the variety of projects that comprise the spectrum of our work. Whether the interventions are focused on the individual, group, or organizational levels, because they are grounded in the social and behavioral sciences there are always a myriad of complex human dynamics involved (unlike, say, pure management consulting, which often focuses more on business strategy, design, or financials). Unfortunately, this element of OD work also makes it particularly challenging to evaluate at times. Often the interventions are focused on less quantifiable aspects of human interactions such as group dynamics, power, archetypes, norms, and culture. Even when behaviors are involved (e.g., leadership competencies, management practices, communication or collaboration skills, digital capability, learning, etc.) they are not always easily measured by standard tools or off-the-shelf assessments. In addition, the measurement of performance has come under fire recently in the literature (e.g., Church, Ginther, Levine, & Rotolo, 2015) as being negatively received by leaders and poorly designed and implemented in companies today. It is no wonder, then, that recent conversations with senior leaders have resulted in their questioning the use of internal performance management data as a valid indicator of the impact of OD interventions. If PMP is not measuring the right things, how can it serve as a criterion for something else? The bottom line here is that to have an effective evaluation process in today's multi-faceted data-driven landscape you need to have a multiple measures and levels (MML) solution.

What does this mean in practice? Maximizing your ability to measure the true impact of your efforts requires more than one type of tool, process, or information system producing data. Preferably these multiple measures are done at different levels of analysis (e.g., organization-macro, group-meso, and individual-micro), and you collect at least two different types of data at any given moment in time. By using multiple levels of analysis, you enable a more complex and interdependent way of assessing impact and change. So, for example, from a multi-levels approach, in measuring the rollout of a new set of corporate values you might measure how senior executives model the behavior in town halls and other public forums (via personal observation or interview feedback), how middle managers are rated as demonstrating the behaviors in the workplace (via 360 feedback), and how employees feel about the authenticity of the new values for the culture (via employee focus groups or a pulse survey).

Conversely, by using a multiple measures approach at the same level and point in time you enable a process of triangulation. This allows you to see if all measures are showing the same type of change as a result of your intervention or if one indicator is moving when another is not (or going in the opposite direction).


Continuing with the values rollout example, from a multiple measures perspective, in order to add a second indicator of the practice of the new values at the managerial level you might incorporate an audit of performance reviews along with the 360-feedback process. The question would be: are managers being reviewed by their bosses based on the values, and are they being rated by others as demonstrating them? A nice side benefit of this approach would be the correlations you could run between manager ratings on the 360 survey and manager ratings (if available) on the performance tool. By using this MML approach applied to OD interventions you are essentially following a similar path to what I-O psychologists call the multi-trait multi-method (MTMM) approach, which is used for assessing individual skills and capabilities.
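As a concrete illustration of that side benefit, the convergence between the two indicators can be checked with a simple correlation. The ratings below are invented for the sketch; with real data you would of course need matched managers and an adequate sample size:

```python
# Hypothetical triangulation check: do 360 ratings of values behaviors
# converge with performance-review ratings for the same managers?
from scipy import stats

# One pair of scores per manager (invented data, 1-5 scales)
ratings_360  = [3.8, 4.1, 2.9, 3.5, 4.4, 3.1, 3.9, 2.7]
ratings_perf = [3.6, 4.3, 3.1, 3.3, 4.2, 2.8, 4.0, 3.0]

r, p = stats.pearsonr(ratings_360, ratings_perf)
print(f"Convergence between measures: r = {r:.2f} (p = {p:.3f})")
# A strong positive r suggests the two indicators are moving together;
# a weak or negative r flags a triangulation problem worth investigating.
```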
Survey feedback, behavioral feedback, performance ratings, observations, interviews, focus groups, etc. are just some of the ways you can collect data and use them for your evaluation purposes. The practice of OD is replete with all sorts of both qualitative and quantitative data-driven tools (Waclawski & Church, 2002) and the savvy applied behavioral scientist can convert the output of almost anything into a measure that can be used for some level of analysis (even if it's non-parametric). Linking that softer data to perceived harder metrics such as those in the list below is where the rubber meets the road:
» performance (caveats notwithstanding)
» turnover
» quality of new hires
» promotion rates
» talent transfers
» external reputation indices and awards
» market share
» EBITA (earnings before interest, taxes, and amortization)
» diversity representation
» customer satisfaction data
» learning completion rates
» sales
» productivity
» product shrinkage
» senior leadership tenure, etc.

For example, in evaluating the impact of PepsiCo's Potential Leader Development Center (PLDC), a custom talent assessment process and part of their broader LeAD program aimed at identifying high potentials early in their careers, the program team employed an MML approach (Church & Rotolo, 2016). Specifically, they wanted to know whether being transparent (i.e., sharing the results of the assessment of "potential") with over 5,000 employees globally had any impact on how people felt about the program itself and/or resulted in any unintended changes in turnover or performance. As part of the evaluation strategy, at six months and one year after the individual feedback reports had been delivered, the team surveyed participants regarding their attitudes about the program content and mechanics and their perceptions of the feedback they had received. The data was then linked at the individual level to assessment results (i.e., leadership potential scores), annual business performance ratings, promotion rates, and turnover. After an in-depth analysis, the project team was able to answer senior leaders' key questions regarding the impact of the program one year later on talent movement, the extent to which results were used in individual development planning, and on employee perceptions of the culture.

More specifically, the research indicated that: (1) the assessment process was effective at predicting future success—i.e., actual performance and promotion rates one year later were significantly correlated with performance on the assessment tools; (2) transparency of how employees scored (their level of LIFT, a proxy for potential) had no negative impact on satisfaction with the program (70% favorable), perceptions of organizational commitment, or actual turnover; and (3) the program met its goal of providing developmental feedback to all participants, with the vast majority (77% and 83%, respectively) indicating that the results had helped them increase their effectiveness as a leader and showed an investment by the company in their personal growth and development. In short, the data provided statistically significant and meaningful results regarding the impact of the program and dispelled management's myths regarding the potential negative outcomes of being transparent with the results. It would have been impossible to demonstrate these relationships without using this type of MML evaluation approach.

In the end, there is no single best outcome measure or set of measures. The important point is to always cycle back to your evaluation strategy and measurement construct and build from there, laying the foundation for multiple measures. It is also important to remember, however, two old adages when it comes to measurement: (a) what gets measured gets done, and (b) beware the law of unintended consequences. If the goal of your talent management program is enhanced movement and you do not account for other factors then you will get movement even if it is of the wrong type (Church, 2013; Church & Waclawski, 2010). Remember to think holistically and at the systems level when designing your measurement process.
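The individual-level linkage step described above can be pictured with a short sketch. All identifiers, scores, and outcomes here are invented for illustration (this is not the PLDC data), and any real analysis of this kind must respect the confidentiality terms under which the data was collected—a point taken up under the third recommendation below:

```python
# Hypothetical individual-level linkage: join assessment scores to outcomes
# collected a year later, then test the predictive relationship.
import pandas as pd
from scipy import stats

assessments = pd.DataFrame({
    "employee_id": [1, 2, 3, 4, 5, 6],
    "potential_score": [72, 85, 64, 91, 58, 77],   # assessment-center result
})
outcomes = pd.DataFrame({
    "employee_id": [1, 2, 3, 4, 5, 6],
    "perf_rating_next_year": [3.2, 4.1, 2.9, 4.4, 2.7, 3.6],
})

linked = assessments.merge(outcomes, on="employee_id")
r, p = stats.pearsonr(linked["potential_score"], linked["perf_rating_next_year"])
print(f"Assessment score vs. next-year performance: r = {r:.2f} (p = {p:.3f})")
```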


2. Build a Meaningful Story Through Insights

Inasmuch as the first recommendation is about getting your hands on a significant amount of data from different sources, the second key to creating a sustainable evaluation process is the ability to build a meaningful story out of the insights you generate from that data. There are two parts to this.

First, having data by itself with no connectivity or insights is meaningless. It's just information. This is true whether it's one simple set of evaluation ratings or five years of culture data linked to business unit performance and employee turnover. How significant is the overall effect? Where is the change really happening and where is it failing? What are the dynamics and interplay between culture, behavior, and turnover? All of this needs to be answered in a way that addresses the questions posed at the start of the intervention.

The data you gather for your evaluation efforts must be analyzed in such a way as to create insights into what is going on in the organization. In terms of the values rollout example, if the 360-feedback data were to show that managers were engaging in the desired behaviors but the reward systems were not reinforcing them, this would be a key insight and useful in explaining why the program could be failing or having less of an impact than might be desired. This is why analysis skills, though important, are not by themselves enough (Church & Dutta, 2013). The ability to see connections and derive meaning from those connections—in some cases even causality between variables, if the methodology and research design support these conclusions—is critical. It is a skill that comes with practice and experience working with large and often complex datasets (Church & Waclawski, 2001).

Second is the ability to tell a compelling story from those insights. Here your communication skills are tested. Even if you have the greatest dataset in the world regarding your OD intervention, if you put it in front of senior executives they may not know what to do with it. In fact, the more complex your data, the worse it gets. A collection of interesting insights alone is often not enough to determine your intervention's success. You need to be able to put it all together into a compelling package that is tailored to the right audience (Church & Waclawski, 2001). What has been happening since you launched your intervention? What other changes at the systems level have occurred? How have these affected the change program, and how do you know that (e.g., what other measures or data do you have)? Can you pinpoint the drivers of impact and identify clear areas for action going forward?

Although the approach outlined here is similar to the data analysis, feedback, and interpretation stages (aka diagnosis) of the classic OD consulting model, there is a slightly different spin. While the same analytical techniques and capabilities generally apply, the emphasis in evaluation is on what has changed, both positively and/or negatively, as a consequence of your interventions. The point is to isolate the direct and indirect results of your efforts, and the potential moderating effect of other factors that you have measured over time as well.

3. Maintain a Watchful OD Values Lens to Your Evaluation Work

The final recommendation for building an effective evaluation process concerns a return to the use of OD values. This paper has not been focused on the OD values component of evaluation, yet it is vital to ensuring our work is evaluated effectively and in the right context. While there has always been an element of the dark side involved in data analysis and storytelling (see How to Lie with Statistics by Huff, 1993), the rise of Big Data and analytics functions in organizations is exacerbating this problem exponentially. There are several reasons for this situation.

First, the fascination with Big Data and machine intelligence is making all types of analyses even more complex. OD interventions and I-O practices such as selection and high-potential identification are now part of the ongoing debates in the field (e.g., Chamorro-Premuzic et al., 2016; Church, 2014; Church & Silzer, 2016; Rotolo & Church, 2015). As information flows up and down at an increasingly rapid pace (remember Big Data is characterized by volume, velocity, variety, and veracity), the challenge of determining causality versus random relationships is even more apparent. Unfortunately, many analytics teams take the data blender approach (e.g., throw a large cluster of variables in a blender, mix it all up, and see what spits out) and the resulting findings can be meaningless or at best difficult to interpret. This suggests not evidence-based science but more of a fishing expedition. Relationships that make little sense can emerge and limited judgement is applied. Without a values filter applied to the analysis of data, relationships identified might be spurious, or even potentially unethical from an OD perspective.

Consider the case of employee engagement survey data that was collected under the auspices of being confidential, yet through linkage research is connected at the individual level to other variables such as performance data, promotions, social media postings, and iPhone activity. If these analyses are done internally and misused (e.g., to classify individuals for various decisions or opportunities) it could result in a major violation of employee trust and engagement with the organization. Once it gets around that leaders cannot be trusted, the entire process of gathering employee perspectives will be destroyed. This is just a simple example but a powerful and quite real one. As OD practitioners, we need to ensure that the data is used in the way it was intended and communicated to employees. The psychological contract and integrity of our profession needs to be maintained. One misstep with employees can erase in a heartbeat the positive momentum gained from an entire multi-year OD intervention.

Big Data by itself of course is not the problem, nor is the use of advanced analytics capabilities. In fact, the future of human capital management is going to require a digital mindset and statistical prowess beyond what most practitioners have in their toolkits today.



Rather, the problem rests with those doing the analyses. The concern is the application of what might be termed "values free analytics" to understanding the social organizational phenomena at work. While linkage and similar research represent important approaches for demonstrating impact, in the wrong hands they can be misleading or even damaging to an organizational change agenda and threaten the existing culture and practices.

The key recommendation here is to ensure that whatever analysis approach is taken, safeguards are in place to protect employees and the organization within the context of the work being done. OD practitioners should play an integral role in the analyses being done (if not doing them ourselves) to ensure they are done in the right manner and with the right frame of reference. This means being involved in every phase along the way: designing the evaluation approach, selecting the outcomes, asking the right questions, ensuring the analyses are robust and appropriate to the data in hand, interpreting results to determine key insights, telling the story given the context involved, and working with the client to make the right decisions. None of these steps should be left to someone with a limited sense of OD or complex change dynamics (e.g., a pure statistics or economics background). We bring a unique perspective to the table as OD practitioners with a specific set of research questions, and we should always be present to influence and protect the use, and prevent the misuse, of employee data.

Conclusion

The purpose of this paper has been to discuss some of the challenges and opportunities associated with the evaluation of OD interventions in organizations. Historically, the evaluation component of the classic consulting model has largely been downplayed or ignored. Today, with increasing pressure from organizations to demonstrate the value of our efforts, having both a well-designed and articulated evaluation strategy is key, as is a detailed multi-measure and multi-level measurement process. While many OD practitioners continue to have limited interest and/or capability in engaging in evidence-based science for their evaluation efforts, with the increase in Big Data and the need for ROI there is nowhere to hide anymore. Going forward, practitioners need to start with the science to build these skills and ensure they are designing the right types of measures and analyzing the data appropriately. OD professionals need to be facile at both generating insights from their data and telling a compelling story about the impact of their interventions. There is clearly a science and an art to conducting evaluation efforts in OD, and none of it should be as painful or as daunting as one might think.


Allan H. Church, PhD, is Senior Vice President of Global Talent Assessment & Development at PepsiCo. Over the past 16 years he has held a variety of roles in organization development and talent management in the company. Previously he was with Warner Burke Associates for almost a decade, and before that at IBM. He is currently on the Board of Directors of HRPS and the Conference Board's Council of Talent Management, an Adjunct Professor at Columbia University, and Associate Editor of JABS. He is a former Chair of the Mayflower Group. Church received his PhD in Organizational Psychology from Columbia University, and is a Fellow of SIOP, APA, and APS. He can be reached at Allan.Church@pepsico.com.

References

Anderson, D. L. (2012). Organization development: The process of leading organizational change (2nd ed.). Thousand Oaks, CA: Sage.

Armenakis, A. A., Bedeian, A. G., & Pond, S. B. (1983). Research issues in OD evaluation: Past, present, and future. The Academy of Management Review, 8(2), 320–328.

Basu, K., & Das, P. (2000). Evaluations dilemmas in OD interventions: Mixed record involving Indian rural credit institutions. Public Administration Quarterly, 24(4), 433–444.

Becker, B. E., Huselid, M. A., & Ulrich, D. (2001). The HR scorecard: Linking people, strategy, and performance. Boston, MA: Harvard Business School Press.

Boudreau, J. W., & Ramstad, P. M. (2007). Beyond HR: The new science of human capital. Boston, MA: Harvard Business School Press.

Burke, W. W. (1982). Organization development: Principles and practices. Glenview, IL: Scott, Foresman.

Burke, W. W. (2011). Organization change: Theory and practice (3rd ed.). Thousand Oaks, CA: Sage.

Burke, W. W., Javitch, M. J., Waclawski, J., & Church, A. H. (1997). The dynamics of midstream consulting. Consulting Psychology Journal: Practice and Research, 49(2), 83–95.

Burke, W. W., & Litwin, G. H. (1992). A causal model of organizational performance and change. Journal of Management, 18(3), 523–545.

Bushe, G. R., & Marshak, R. J. (Eds.). (2015). Dialogic organization development: The theory and practice of transformational change. Oakland, CA: Berrett-Koehler.

Chamorro-Premuzic, T., Winsborough, D., Sherman, R. A., & Hogan, R. (2016). New talent signals: Shiny new objects or a brave new world? Industrial and Organizational Psychology: Perspectives on Science and Practice, 9(3), 621–640.

Church, A. H. (2001). The professionalization of organization development: The next step in an evolving field. In W. A. Pasmore & R. W. Woodman (Eds.), Research in organizational change and development (Vol. 13, pp. 1–42). Greenwich, CT: JAI Press.

Church, A. H. (2003). Organization development. In J. E. Edwards, J. C. Scott, & N. S. Raju (Eds.), The human resources program evaluation handbook (pp. 322–342). Thousand Oaks, CA: Sage.

Church, A. H. (2013). Assessing the effectiveness of talent movement within a succession planning process. In T. H. DeTuncq & L. Schmidt (Eds.), Integrated talent management scorecards: Insights from world-class organizations on demonstrating value (pp. 255–273). Alexandria, VA: ASTD Press.

Church, A. H. (2014). What do we know about developing leadership potential? The role of OD in strategic talent management. OD Practitioner, 46(3), 52–61.

Church, A. H., & Dutta, S. (2013). The promise of big data for OD: Old wine in new bottles or the next generation of data-driven methods for change? OD Practitioner, 45(4), 23–31.

Church, A. H., Ginther, N. M., Levine, R., & Rotolo, C. T. (2015). Going beyond the fix: Taking performance management to the next level. Industrial and Organizational Psychology: Perspectives on Science and Practice, 8(1), 121–129.

Church, A. H., Golay, L. M., Rotolo, C. T., Tuller, M. D., Shull, A. C., & Desrosiers, E. I. (2012). Without effort there can be no change: Reexamining the impact of survey feedback and action planning on employee attitudes. In A. B. Shani, W. A. Pasmore, & R. W. Woodman (Eds.), Research in organizational change and development (Vol. 20, pp. 223–264). Bingley, UK: Emerald Group Publishing Limited.

Church, A. H., & Oliver, D. H. (2006). The importance of taking action, not just sharing survey feedback. In A. Kraut (Ed.), Getting action from organizational surveys: New concepts, technologies and applications (pp. 102–130). San Francisco, CA: Jossey-Bass.

Church, A. H., & Rotolo, C. T. (2016). Lifting the veil: What happens when you are transparent with people about their future potential? People & Strategy, 39(4), 36–40.

Church, A. H., & Silzer, R. (2016). Are we on the same wavelength? Four steps for moving from talent signals to valid talent management applications. Industrial and Organizational Psychology: Perspectives on Science and Practice, 9(3), 645–654.

Church, A. H., & Waclawski, J. (2001). Designing and using organizational surveys: A seven step approach. San Francisco, CA: Jossey-Bass.

Church, A. H., & Waclawski, J. (2010). Take the Pepsi Challenge: Talent development at PepsiCo. In R. Silzer & B. E. Dowell (Eds.), Strategy-driven talent management: A leadership imperative (pp. 617–640). San Francisco, CA: Jossey-Bass.

Cummings, T. G., & Worley, C. G. (2015). Organization development and change (10th ed.). Stamford, CT: Cengage Learning.

DeTuncq, T. H., & Schmidt, L. (Eds.). (2013). Integrated talent management scorecards: Insights from world-class organizations on demonstrating value. Alexandria, VA: ASTD Press.

Edwards, J. E., Scott, J. C., & Raju, N. S. (Eds.). (2003). The human resources program evaluation handbook. Thousand Oaks, CA: Sage.

Holton, E. F. III (1996). The flawed four-level evaluation model. Human Resource Development Quarterly, 7(1), 5–21.

Huff, D. (1993). How to lie with statistics. New York, NY: W. W. Norton & Company.

Katz, D., & Kahn, R. L. (1978). The social psychology of organizations (2nd ed.). New York, NY: John Wiley.

Kirkpatrick, D. L. (1998). Evaluating training programs. San Francisco, CA: Berrett-Koehler.

Lawler, E. E. III (2014). Sustainable effectiveness and organization development: Beyond the triple bottom line. OD Practitioner, 46(4), 65–67.

Martineau, J. W., & Preskill, H. (2002). Evaluating the impact of organization development interventions. In J. Waclawski & A. H. Church (Eds.), Organization development: A data-driven approach to organizational change (pp. 286–301). San Francisco, CA: Jossey-Bass.

Nadler, D. A. (1977). Feedback and organization development: Using data-based methods. Reading, MA: Addison-Wesley.

Phillips, J. J. (1991). Handbook of training evaluation and measurement methods (2nd ed.). Boston, MA: Butterworth-Heinemann.

Roberts, D. R., & Robertson, P. J. (1993). Positive-findings bias, and measuring methodological rigor, in evaluations of organization development. Journal of Applied Psychology, 77(6), 918–925.

Rothwell, W. J., Sullivan, R., & McLean, G. N. (Eds.). (1995). Practicing organization development: A guide for consultants. San Francisco, CA: Jossey-Bass/Pfeiffer.

Rotolo, C. T., & Church, A. H. (2015). Big data recommendations for industrial-organizational psychology: Are we in Whoville? Industrial and Organizational Psychology: Perspectives on Science and Practice, 8(4), 515–520.

Savitz, A. W., & Weber, K. (2013). Talent, transformation, and the triple bottom line: How companies can leverage human resources to achieve sustainable growth. San Francisco, CA: Jossey-Bass.

Shull, A. C., Church, A. H., & Burke, W. W. (2014). Something old, something new: Research findings on the practice and values of OD. OD Practitioner, 46(4), 23–30.

Svyantek, D. J., O'Connell, M. S., & Baumgardner, T. S. (1992). Applications of Bayesian methods to OD evaluation and decision making. Human Relations, 45(6), 621–636.

Terpstra, D. E. (1981). Relationship between methodological rigor and reported outcomes in organization development evaluation research. Journal of Applied Psychology, 66(5), 541–543.

Waclawski, J., & Church, A. H. (2002). Introduction and overview of organization development as a data-driven approach for organizational change. In J. Waclawski & A. H. Church (Eds.), Organization development: A data-driven approach to organizational change (pp. 3–26). San Francisco, CA: Jossey-Bass.

Woodman, R. W., & Wayne, S. J. (1985). An investigation of positive findings bias in evaluation of organization development interventions. Academy of Management Journal, 28(4), 889–913.

Copyright © 2017 by the Organization Development Network, Inc. All rights reserved.



A New e-Book Resource for Practitioners

ORGANIZATION
DEVELOPMENT
Organization Development in Practice
IN PRACTICE Editors
William J. Rothwell, Jacqueline M. Stavros,
Roland L. Sullivan, and John Vogelsang
WILLIAM J. ROTHWELL, JACQUELINE M. STAVROS,
ROLAND L. SULLIVAN, & JOHN VOGELSANG Editors

Available from the Organization Development Network


OD Network

Organization Development in Practice brings together experienced OD professionals who share their methods
for developing more effective and resilient organizations, enabling organizational and social change, and being
responsive to continuous change.
Some of the chapters include:
The Ebb and Flow of OD Methods the fundamentals of action research in a process
Billie T. Alban and Barbara Benedict Bunker describe called the Culture of Opportunity that leverages
the first and second wave of OD methods and their the talent, relationships, knowledge, capital, and
perspective on what is happening in the 21st century. communications that are largely fragmented and
When OD methods first emerged in the 1960s, they were disconnected in most organizations. They outline the
considered innovative and exciting. OD practitioners process of instilling a Culture of Opportunity within
have shifted their methods with time and adapted to three distinct organizations that hit crisis points in
current situations. However, Alban and Bunker question response to changing environments and difficult
which of the current methods are new and which are just circumstances.
a repackaging of already existing practices. As the pace
At the Crossroads of Organization Development
of change has accelerated, they also wonder whether
and Knowledge Management
the turbulent external environment has driven many to
Denise Easton describes what emerges at the
think they need new methods when what they may need
intersection of OD and Enterprise Knowledge
is more creative adaptation of existing methods.
Management, where a collaborative partnership
How the Mind-Brain Revolution Supports accelerates the understanding, development, and
the Evolution of OD Practice transformation of dynamic, techno-centric systems
Teri Eagan, Julie Chesley, and Suzanne Lahl believe of knowledge, information, learning, and networks
that the early promise of OD was inspired by a found in 21st century organizations. When OD is part
desire to influence human systems towards greater of developing knowledge management processes,
levels of justice, participation, and excellence. They systems, and structures the organization not only
propose that a critical and integrative neurobiological survives but thrives.
perspective holds the potential to advance OD in two
Accelerating Change: New Ways of Thinking
ways: what we do—the nature and quality of our ability
about Engaging the Whole System
to assess and intervene in service of more effective
Paul D. Tolchinsky offers new ways of developing,
organizations and a better world; and who we are—our
nurturing, and leveraging intrapreneurialship in
competencies, resilience, and agility as practitioners.
organizations. Most organizations underutilize
Culture of Opportunity: Building Resilient the capabilities and the entrepreneurial spirit of
Organizations in a Time of Great Transition employees. Tolchinsky describes how to unleash the
Mark Monchek, Lynnea Brinkerhoff, and Michael entrepreneurial energy that exists in most companies.
Pergola explore how to foster resiliency, the ability In addition, he offers five suggestions organizations
to respond effectively to change or challenges. They can implement, drawing on several examples from
examine the inherent potential of resilient organiza­ corporations such as Zappos, FedEx, HCL Technologies,
tions to reinvent themselves by understanding their and companies developing internal Kick Starters and
social networks, using design thinking, and utilizing crowd sourcing platforms.
Journal of the Organization Development Network
Guidelines for Authors

Journal Information

The OD Practitioner (ODP) is published by the Organization Development Network. The purpose of the ODP is to foster critical reflection on OD theory and practice and to share applied research, innovative approaches, evidence-based practices, and new developments in the OD field. We welcome articles by authors who are OD practitioners, clients of OD processes, Human Resource staff who have partnered with OD practitioners or are practicing OD, and academics who teach OD theory and practice. As part of our commitment to ensure all OD Network programs and activities expand the culture of inclusion, we encourage submissions from authors who represent diversity of race, gender, sexual orientation, religious/spiritual practice, economic class, education, nationality, experience, opinion, and viewpoint.

The Review Process

The ODP is a peer-reviewed journal. Authors can choose between two review processes and should notify the Editor which they prefer when they submit an article:

Process 1 (open peer review): Submit articles with a cover page with the article's title, all authors' identifying and contact information, and a 50–80 word biography for each of the authors; also include any acknowledgements. Two members of the ODP Review Board will review the article. They will recommend accepting the article for publication, pursuing publication after suggested changes, or rejecting the article. If they decide the article is publishable with changes, one of the Review Board members will email or call the primary author to discuss the suggested changes. Once the author has made the changes to the satisfaction of the two Review Board members, the ODP Editor will work with the author to prepare the article for publication.

Process 2 (double-blind peer review): This option is offered to meet the standards for academic institutions. Submit articles with a cover page with the article's title, all authors' identifying and contact information, and brief biographies for each of the authors; also include any acknowledgements. Provide an abbreviated title running head for the article. Do not include any identifying information other than on the title page. Two members of the Review Board will independently receive the article without the author's information and without knowing the identity of the other reviewer. Each reviewer will recommend accepting the article for publication, rejecting the article with explanation, or sending the article back to the author for revision and resubmittal. Recommendations for revision and resubmittal will include detailed feedback on what is required to make the article publishable. Each ODP Board member will send their recommendation to the ODP Editor. If the Editor asks the author to revise and resubmit, the Editor will send the article to both reviewers after the author has made the suggested changes. The two members of the Review Board will work with the author on any further changes, then send it to the ODP Editor for preparation for publication. The ODP Editor makes the final decision about which articles will be published.

Criteria for Accepting an Article

Content
»» Bridges academic rigor and relevance to practice
»» Is accessible to practitioners
»» Presents applied research, innovative practice, or new developments in the OD field
»» Includes cases, illustrations, and practical applications
»» References sources for ideas, theories, and practices
»» Reflects OD values: respect and inclusion, collaboration, authenticity, self-awareness, and empowerment.

Stylistic
»» Clearly states the purpose and content of the article
»» Presents ideas logically and with clear transitions
»» Includes section headings to help guide the reader
»» Is gender-inclusive
»» Avoids jargon and overly formal expressions
»» Avoids self-promotion

If the article is accepted for publication, the author will receive a PDF proof of the article for final approval before publication. At this stage the author may make only minor changes to the text. After publication, the Editor will send the author a PDF of the article and of the complete issue of ODP in which the article appears.



Preparing the Article for Submission

Article Length
Articles are usually 4,000–5,000 words.

Citations and References
The ODP follows the guidelines of the American Psychological Association Publication Manual (6th edition). This style uses parenthetical reference citations within the text and full references at the end of the article. Please include the DOI (digital object identifier; http://www.apastyle.org/learn/faqs/what-is-doi.aspx), if available, with references for articles in a periodical.

Graphics
Graphics that enhance an article are encouraged. The ODP reserves the right to resize graphics when necessary. The graphics should be in a program that allows editing. We prefer graphics to match the ODP's three-, two-, or one-column, half-page or full-page formats. If authors have questions or concerns about graphics or computer art, please contact the Editor.

Other Publications
The ODP publishes original articles, not reprints from other publications or journals. Authors may publish materials first published in the ODP in another publication as long as the publication gives credit to the OD Practitioner as the original place of publication.

Policy on Self-Promotion
Although publication in the ODP is a way of letting the OD community know about an author's work, and is therefore good publicity, the purpose of the ODP is to exchange ideas and information. Consequently, it is the policy of the OD Network to not accept articles that are primarily for the purpose of marketing or advertising an author's practice.

Submission Deadlines
Authors should email articles to the editor, John Vogelsang, at jvogelsang@earthlink.net. The deadlines for submitting articles are as follows: October 1 for the winter issue; January 1 for the spring issue; April 1 for the summer issue; and July 1 for the fall issue.



Products and Services

Publications
»» OD Practitioner, the flagship publication of the OD Network, is a peer-reviewed quarterly journal.
»» Practicing OD provides practice-related concepts, processes, and tools in short articles by and for busy practitioners.

Both publications and their submission guidelines are available online at http://www.odnetwork.org.

Professional Development
OD Network professional development events offer cutting-edge theory and practice. Learn more at http://www.odnetwork.org.
»» OD Network Conferences, held annually, provide unsurpassed professional development and networking opportunities.
»» Regular webinars include events in the Theory and Practice Series, Conference Series, and OD Network Live Briefs.

Member Benefits
Low annual dues provide members with a host of benefits:
»» Free subscriptions to our publications.
»» Free access to online job ads in the OD Network Job Exchange.
»» Discounts on conference registration, OD Network products (including back issues of this journal), Job Exchange postings, professional liability insurance, books from John Wiley & Sons, and more.
»» OD Network Member Roster, an essential networking tool, in print and in a searchable online database.
»» Online Toolkits on action research, consulting skills, and HR for OD—foundational theory and useful tools to enhance your practice.
»» Case studies illustrating the value of OD to potential client organizations.

Online Resources
In addition to the online resources for members only, the OD Network website offers valuable tools that are open to the public:
»» Education directory of OD-related degree and certificate programs.
»» Catalog of OD professional development and networking events.
»» Bookstore of titles recommended by OD Network members.
»» Links to some of the best OD resources available.
»» E-mail discussion lists that allow OD practitioners worldwide to share ideas.
»» Lists, with contact information, of regional and international OD networks.

Copyright © 2017 by the Organization Development Network, Inc. All rights reserved.



Mark your calendar!
2017 OD NETWORK Annual Conference
THE CALL OF OUR TIME: OD INNOVATING for IMPACT
OCTOBER 14–17, 2017
LOEWS CHICAGO O'HARE
Learn more: www.odnetwork.org/2017Conference

Sponsor & Exhibitor Opportunities are Open!
Visit www.odnetwork.org for details.

