
QUEST, 1992, 44, 15-34

Field Systems Analysis:


Prioritizing Patterns
in Time and Context
Among Observable Variables
Tom Sharpe and Andrew Hawkins
Many research paradigms are currently competing for acceptance in teacher
effectiveness research, all striving to lend greater insight into the subtle
intricacies of expert instruction. One of these, called field systems analysis,
is presented from conceptual and technical perspectives as an alternative
to traditional behavior-analytic approaches. This method primarily
focuses upon the temporal relationships among organismic behaviors and
context-specific setting elements in classroom settings, providing a more
exhaustive, and hence more accurate, description of what actually occurs
in particular gymnasium settings. The methodology includes (a) verbal
description, (b) inductive category system construction, (c) alternative forms
of data presentation, and (d) alternative dimensions of data interpretation.
Selected results of an exemplary study with one teacher are provided to
demonstrate the nature and benefits of the paradigm. Several contiguous
relationships among instructional behaviors were apparent, with insight
gained into the conditional relationships in time among encouraging, instruc-
tional, feedback, and interpersonal teacher behaviors.

Theoretical Foundations
Pedagogical inquiry has tended to produce research that has yielded
conflicting findings when attempting to make broad generalizations (Borich,
1986). It has also frequently been the case that practitioners within any particular
teaching paradigm have been hesitant to accept another camp's point of view
(Berliner, 1986; Finn, 1988; Firestone, 1987). Some have even proposed that
exemplary teaching is primarily an artistic endeavor, beyond the scope of
science, for it does not presently conform to rigorous laws that yield generalized
predictability and control (Gage, 1978, 1984). This has, in part, created a

About the Author: Tom Sharpe is with the School of Health, Physical Education
and Recreation, 204 Mabel Lee Hall, University of Nebraska, Lincoln, NE 68588-0229.
Andrew Hawkins is with the Department of HPEAT, PO Box 6116, West Virginia
University, Morgantown, WV 26506-6116.
legitimacy challenge regarding what makes for effective teaching (Lawson,
1989). Hawkins and Wiegand (1987) point to these difficulties as tending to
constrain the development of technological improvements in research on teaching.
It is our position, however, that the research pedagogy community should not
allow the notion of effective teaching being beyond the grasp of systematic
inquiry to take hold. Instead, we should focus our energies on the discovery of
alternative strategies that allow for the capture of instructional elements
previously thought to be beyond our scientific grasp, especially if we are to
promote the idea of research on teaching as a genuine scientific endeavor.

Behavioral research on instructional effectiveness has historically viewed
teacher behavior in terms of isolated components, with little attention paid to
the relationships in time among instructional variables in context (Graham &
Heimerer, 1981; Johnston & Pennypacker, 1980). Although many insightful
studies have been produced by this research tradition (e.g., Brophy & Good,
1986; Gage, 1976, 1978; Good, 1979; Medley, 1979; Rosenshine & Furst, 1973),
characteristics such as the temporal locus and extent of teacher behavior, the
ecological context or setting in which the instruction occurs, and the multiple
relationships in time among the many, traditionally termed extraneous, factors
operating within an instructional domain have sometimes been overlooked.

Recent work founded on contemporary computer technology by mediating
process paradigm and ecological researchers (e.g., Dickie & Hursh, 1989;
Greenwood, Delquadri, & Hall, 1989; Greenwood, Delquadri, Stanley, Terry, &
Hall, 1985; Lincoln & Guba, 1985; Schroeder, 1990) has shown growth toward a
setting-specific focus upon the multiple temporal relationships among observable
elements, rather than scrutinizing the discrete characteristics of each element in
an isolated fashion. Though in its infancy, this approach is allowing for a more
complex and fine-grained perspective that primarily focuses on elemental
interdependence in terms of temporal interrelationships among observable
elements. Field systems analysis is one attempt, along with a few emergent others
(see Frick, 1990; Ray & Delprato, 1989), to overcome the complex problems
associated with systematically analyzing the temporal relationships of multiple
variables emerging from multiple sources over time. Indeed, the fundamental tenet
of field theory, that systems are temporally and characteristically unique and
constantly evolving configurations of historical, contextual, and behavioral events,
creates special challenges for the conduct of scientific analysis. Accordingly, this
alternative concurrently monitors a greater number of organismic behaviors and
contextual stimuli, placing priority upon their interconnections across time.

Descriptive and qualitative research methods are prevalent in educational
research and have been able to subjectively describe differences between effective
and ineffective instructors. However, many researchers have recognized that they
lack a systematic focus and have only allowed educational researchers to point
to "true" experts in an intuitive fashion (Dawe, 1984; Eisner, 1983; Gage, 1978;
Rubin, 1985). These methods have not, as yet, been able to systematically
demonstrate, in an empirically well-defined fashion, just how particular expert
teachers orchestrate their complex instructional repertoires.

Behavior-analytic inquiry, on the other hand, has successfully quantified
many observable components of effective instruction. However, it has typically
extracted the discrete quantifiable elements of concern from the larger time- and
context-dependent ecosystem in which they naturally reside (Kantor, 1970). This
has led to confounding results in many cases, for one stimulus may affect many
responses (e.g., teacher attention not only may reinforce the response on which
it is contingent but also may elicit other responses and set the occasion for further
stimuli), and responses are seldom the function of a single stimulus (e.g.,
requesting toys is a joint function of the presence of toys and an audience).
Therefore, strategies are warranted that may more systematically describe mul-
tiple interrelationships across time regarding the array of interactive variables
inherent to classroom settings.
A Conceptual Recommendation
Recommendations that observable instructional processes be the primary
focus of pedagogical research, most specifically by focusing on processes related
to instruction and by generating quantitative data from live observations of those
processes, parallel movement toward a systems methodology (Johnston, 1988;
Metzler, 1989). According to Metzler (1989), research and development models
that can better integrate teacher process, content, observable context, and student
outcome and assist in looking for ways to improve those components within a
particular instructional setting are the research models that hold promise for
furthering pedagogical content knowledge.
The methodological model described herein is rooted in the interbehavioral
and ecobehavioral literature. We have employed the term field systems, following
a recently opened avenue of applied research that emphasizes the recording and
quantifying of temporal and contextual patterns and relationships for particular
subjects and their settings (Powell & Dickie, 1990). Before proceeding with a
tactical outline, we must make it clear that this type of analysis differs substantially
from more traditional strategies. As Frick (1990) has surmised, until the past 3
to 5 years, behavioral research has been inclined to measure variables separately
and then attempt to define their relationships with an appropriate mathematical
model (e.g., dependent variable y is some function of independent variable x).
Though the principle may be extended to multiple variables, the mathematical
equation that expresses this relationship is represented by a line (straight or
curved) in two dimensions. When this type of relationship does indeed exist,
then an equation is a very elegant and parsimonious means of expression.
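
For concreteness, the kind of relationship at issue here can be written in its familiar form (a generic illustration of the linear case, not a model taken from the studies cited above):

```latex
y = f(x_1, x_2, \ldots, x_k), \qquad \text{e.g.,} \qquad
y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \cdots + \beta_k x_k
```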
However, if the emphasis is upon analysis of temporal and contextual
patterns, alternative means of representing the relationships among variables is
warranted. In field systems, relationships among variables (e.g., observable
elements in a setting) are viewed as a set of temporal patterns, not as a line on
a Cartesian coordinate system. Element connections are measured by counting
occurrences of existent temporal patterns and then by aggregating the frequencies,
durations, rates per minute, and percent durations and, most importantly, by
prioritizing the recurrent temporal relationships among those patterns. In this
regard, complex linear mathematical models may distract from relations that are
of interest.

Definitional Literature
The field systems model, therefore, departs from traditional three-term
contingency analyses that isolate the stimulus and response of interest from the
larger setting in which they reside (Kantor, 1922; Lichtenstein, 1983). A systems
focus directs explanatory efforts away from the search for an ultimate cause and,
instead, encourages a probabilistic descriptive approach in terms of an entire set
or series of conditions taken in its available totality (Holton, 1973). Such a
perspective is predicated on the assumption that the organisms under scrutiny
are part of a larger field of dynamically changing events (Thorne, 1973).
Therefore, only by attempting to represent the entire system of factors operating
in that field will it be possible to provide a clear and accurate description of a
particular event within that setting (Kantor, 1969).
Systems theory holds that simple linear models, with their assumptions of
independent cause and effect (Cook & Campbell, 1979), do not provide satisfac-
tory foundations for the temporal and contextual analysis necessary in applied
research. Field systems research thus emphasizes mutual implications and inter-
dependence among ecologies and organismic behaviors, allowing for multi-
directional influence among many variables. This requires that labeling one
element as independent and another as dependent be avoided, in that all behavioral
and environmental cues tracked within a particular field are thought of as
interdependent. In other words, temporally related packages or clusters of
behavioral and contextual events combine to promote packages of other events.
As such, these temporal relationships are the primary focus of analysis.
Field systems methodology also takes into account that living systems flow
across time. The focus is not merely on multiple behaviors and environmental
stimuli but is primarily upon multiple observations of strings, or series, of events
across time. However, field systems' time series are unlike the time series
analyzed in linear behavior-analytic research, in which individual variables are
isolated, measured, and compared on a particular dimension across sessions.
Field systems methodology focuses on series of multiple events as they unfold
in concert through time, primarily within sessions.
Ray and Delprato (1989, p. 87) summarize the major implications of a field
systems orientation as follows:
methodology needs to remain sensitive to (a) the continuity of organismic-
environmental interactions across time and circumstance; (b) the inter-
dependencies of these interactions; (c) the interactional context, or setting
conditions attending the interaction; and (d) multiple classes of events within
each of many possible domains.

Representation of interdependence and the probabilistic organization of behav-
ioral and environmental events across time should allow researchers to glean the
temporal relationships among multiple variables as they actually exist in class-
room settings, a dimension often overlooked with traditional research methods.
Purpose
The purposes of this paper are to provide a brief overview of field systems
theory and to demonstrate its implementation in an applied setting. Though some
findings emerging from our study of a single teacher may begin to extend the
knowledge base of pedagogical expertise, our primary purpose is to familiarize
the reader with the technical aspects of the paradigm. Though the behavior of
the student is ultimately a primary concern of instructional efficacy, student data
have been purposefully omitted in order to simplify the explanation and to allow
the reader to focus on the method's tactical aspects. Hence, one study of a single
setting of an expert teacher is used more for demonstration purposes than for a
study in expertise per se.
For greater depth of information the reader is referred to an epistemological
review (Sharpe & Hawkins, in press), a dissertation (Sharpe, 1989), and a strategic
discussion with examples from the psychological literature (Ray & Delprato,
1989). For the purposes of this paper, terminology that is commonly understood
among field systems theorists is defined as the methodology unfolds. For a more
complete treatment of the language of field systems, Lichtenstein (1983) and
Pronko (1980) also provide excellent reference points.

Field Systems Application


The following discussion details field systems implementation within the
context of one classroom setting. A study with one complete class period of an
"expert" primary physical education instructor is our exemplar. The focus is on
the teacher, with scrutiny of the multiple ecological and behavioral element
relationships as they occur through the class period serving to highlight
the relationships in time among multiple variables. Our primary objective is to
convey the methodology and stimulate collegial discussion regarding its relative
merit and development. At this juncture, strategic questions concerning the
following are open to discussion: (a) fine discrimination among conceptually
similar events, (b) subject choice methods, (c) effects of the base rate of the
events over time that trigger conditional probabilities, (d) standards for the
relative constitution of velocity and complexity, (e) serial dependency,
(f) standards for representation in a chain structure, and (g) appropriate parametric
tests such as the Q methodology (see Bakeman & Gottman, 1986). Our intention
at this stage is not to draw conclusions about expertise in general, only to describe
an appealing methodology in its infancy via an exemplar of one expert's setting.
Figure 1 outlines the four analysis steps. The methodology is driven
inductively. Induction is ensured, in large part, by generating a verbal descriptive
chronology from which behaviors and contextual events within a particular
setting are defined for further analysis.
Subject Selection
Any teacher's setting could legitimately be used to demonstrate the field
systems model. However, our academic interests lie in the area of expertise.
Motivated by a desire to ultimately describe and compare experts and novices,
we purposely selected subjects who could be so categorized. Though Berliner
(1986) has convincingly posited a rationale for studying expert (and not-so-
expert) teachers, he did not distinguish experts from teachers with experience.
Believing that experience is a necessary but insufficient condition for expertise,
we employed the behavioral definitions of expertise, experience, and effectiveness
adopted by Siedentop and Eldar (1989, p. 254). These definitions were expanded

Figure 1 - Field systems tactical outline: (1) verbal field description; (2) category
system construction (contextual/environmental setting elements, contextual/organismic
setting elements, discrete environmental stimulus elements, specific behavioral elements);
(3) data presentation (discrete element synopsis, probability summary matrix, kinematic
flow chart, quasi-graphic analysis); (4) data interpretation (frequency, duration, rhythm,
complexity, coherence, velocity).

and developed into a three-part Likert scale (Sharpe, 1989) that was used in
choosing the teacher in our study.
A remote-sound videotape of one instructional lesson from beginning to end
was generated for several prospective subjects. Instructors had been encouraged to
teach the subject matter with which they were most familiar and comfortable in
a manner perceived by them to be most effective. The videotapes were then
viewed independently by three raters (from the education faculty) and categorized
using the Likert scale. Agreement quotients of 90% were attained across raters
using Kazdin's (1982) point-by-point interrater agreement formula, verifying consen-
sus regarding the relative expertise of our subject.
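
For reference, the point-by-point agreement quotient reported here follows the conventional form (the rating decisions compared are those made on the three-part Likert scale described above):

```latex
\text{agreement (\%)} = \frac{\text{agreements}}{\text{agreements} + \text{disagreements}} \times 100
```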
Verbal Field Description
By repeated viewing (with, typically, 10 to 12 complete passes), a rich,
narrative account of one entire setting (i.e., from teacher entrance to pupil and
teacher exit) is produced from the videotape. Investigators are encouraged to exercise
prerogative in stipulating and defining alternative terminology as necessary to best
describe what actually occurred in a particular setting. Consultation among investi-
gators on unique or atypical behaviors encourages accuracy of the description.
Elemental Category System Construction
Once the setting is verbally described, a category system is constructed
based on the narrative. Each behavioral and context element initially described
is assigned a key word descriptor. Next, descriptors are combined into categories
by their similarity of function in order to obtain an all-inclusive category system.
Operational definitions are then written for each category.
Appendix A shows the category system generated for our study. The first
two sections of the system are environmental and organismic setting elements.
They comprise the historical events and characteristics that do not change within
the context of the setting under investigation. These categories are valuable for
across-study comparisons and generalization studies, in addition to providing
insight into the historical impact upon a particular setting. It should also be noted
that setting elements are not precisely identical with the "setting events" defined
in the behavioral literature.
The second two parts of the category system, environmental stimulus
elements and specific behavioral elements, comprise all observable events that
begin and terminate at least once within the setting duration. Most of these
elements tend to change rapidly. Though every attempt is made to generate an
all-inclusive category system, no attempt is made to make it mutually exclusive.
It seems clear that many behaviors and environmental stimuli evidence themselves
simultaneously or overlap in a setting. Retaining these contextual relationships
to ensure that the classroom setting imposed its own syntactic organization on
the investigator (and not the other way around) seems paramount in teasing out
our primary concern of element relationships in time. For example, a teacher
may be watching a group of students perform at the same time as he or she is
modeling and verbally prompting a particular task. In this case, Behavioral
Elements 19 (verbal instructional prompts), 24 (individual modeling), and 12
(specific observation) would overlap in time.

Data Presentation

The third step in our strategy involves data collection and presentation of
the behavioral and stimulus elements (Appendix A) as they appear in time on
the videotape. In our study, a NEC PC-8300 microcomputer, programmed to
collect real-time data, was used.¹
Data Collection. Each of the stimulus and behavioral elements in Appen-
dix A are assigned an alpha or numeric character corresponding to the keyboard.
Data are recorded on two or three behavioral and/or stimulus elements with each
viewing of the videotape by pressing and holding a particular key for the duration
of each occurrence or by toggling a particular key to a momentary press as events
begin and end. Technological advances permit the start time between the VCR
and the NEC computer to be synchronized for each viewing, allowing accurate
temporal merging of each two- to three-element data set into one comprehensive
data set for analysis.
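
In contemporary terms, the merge step might be sketched as follows; the record format, time units, and function names are illustrative assumptions rather than the authors' actual collection software.

```python
# Hypothetical sketch of merging two or three coding passes into one data set.
# Each pass is assumed to be a list of (element, onset, offset) records in the
# same time units, synchronized to a common videotape start time.

def merge_passes(*passes):
    """Combine several 2- to 3-element coding passes into one list of records,
    ordered by onset time for later temporal pattern analysis."""
    merged = [record for coding_pass in passes for record in coding_pass]
    merged.sort(key=lambda record: record[1])     # order by onset time
    return merged

pass_1 = [(17, 120, 155), (26, 150, 190)]         # e.g., encouragement, feedback
pass_2 = [(12, 100, 900)]                         # e.g., specific observation
print(merge_passes(pass_1, pass_2))
# [(12, 100, 900), (17, 120, 155), (26, 150, 190)]
```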
After the data are collected and merged, the videotape is viewed a final
time for two purposes. The first is to assure the investigators that the merging
process accurately reflects the temporal relationships extant in the setting.
Second, assurance of agreement among the research team regarding the proper
categorization of each element, based upon the previously established definitions,
is thought necessary. This step to assure intersubjective agreement is an important
one in better ensuring reliability of the coding process.
Any necessary editing of the data may also be accomplished by bringing
the data-set text into a word processing program. Start and stop times are
expressed in terms of microseconds (i.e., 0.01953 s, or approximately 50 units
equaling 1 s) allowing for an extremely fine-grained analysis of even the most
fleeting elements in the classroom setting. Message cues allow for the addition
of brief statements to mark behaviors or stimuli of interest.
Discrete Element Synopsis. The merged and inspected data set is then
downloaded into a more capable microcomputer for implementing four formats
for data presentation and analysis. Each focuses on the stimulus and behavioral
elements in Appendix A. The first presentation format includes a synopsis of
these elements with respect to frequency, average length of occurrence, percent
duration, and rate per minute for the entire class setting. This provides the
investigator with an initial sense of the multiple characteristics of each element
in the system. This step is similar to traditional behavior-analytic efforts, though
a multiple-dimension analysis provides a more detailed account of a particular
setting than has traditionally been offered.
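
The synopsis itself is straightforward to reproduce from timestamped records. The following sketch assumes a merged list of (element, onset, offset) records and a known session length; the names are illustrative, not those of the software described above.

```python
# Hypothetical sketch of the discrete element synopsis (cf. Table 1).
from collections import defaultdict

def element_synopsis(records, session_seconds):
    """Return frequency, mean duration, percent duration, and rate per minute
    for each coded element."""
    totals = defaultdict(lambda: {"frequency": 0, "duration": 0.0})
    for element, onset, offset in records:
        totals[element]["frequency"] += 1
        totals[element]["duration"] += offset - onset
    return {
        element: {
            "frequency": t["frequency"],
            "mean_duration": t["duration"] / t["frequency"],
            "pct_duration": 100.0 * t["duration"] / session_seconds,
            "rate_per_min": t["frequency"] / (session_seconds / 60.0),
        }
        for element, t in totals.items()
    }

records = [(17, 12.0, 12.7), (17, 30.5, 31.1), (26, 31.1, 31.6)]
print(element_synopsis(records, session_seconds=2556.0))   # ~42.6-min class
```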
Table 1 relates a partial data profile. For clarity of tactical explanation, we
have opted to provide excerpts from the complete data set, focusing upon
one of the more marked clusters of strongly related elements. A complete
representation is available upon request.
To illustrate, Element 17 in Table 1 corresponds to Behavioral Element 17
(content-specific encouragement) in Appendix A. Further, Table 1 relates that
this type of encouragement occurred 281 times and was of very short duration
(a mean of 33.7 microseconds per occurrence). It may also be seen that Elements
26 (positive instructional feedback) and 31 (positive nonverbal communication)
were both frequent and fleeting.
Some initial behavioral and contextual relationships may also be gleaned
from this representation. For example, similar frequencies among encouraging,

Table 1
Partial Behavioral and Stimulus Element Summary

Specific behavioral elements

Element                                     Frequency   Duration   M duration   % duration   Rate/min
12 - specific observation                          98     31.676        .323        74.36       2.30
17 - content-specific encouragement               281      3.087        .011         7.25       6.60
19 - verbal instruction prompts                   198      3.866        .020         9.07       4.65
21 - verbal instruction skill statements          161      8.505        .053        19.96       3.78
24 - individual modeling                           98     11.314        .115        26.56       2.30
26 - positive instructional feedback              337      3.985        .012         9.36       7.91
31 - nonverbal positive communication             355      8.555        .024        20.08       8.33

Note. Mean duration time unit = 1.953125E-02 s; 100 units = 2 s.


feedback, and nonverbal communication are evident. A similar relationship is
true of specific observation and individual modeling episodes.
Matrix Construction. Once individual behavior and stimulus characteris-
tics have been summarized, a probability matrix for all stimulus and behavioral
events is constructed. This matrix relates the frequency of occurrence and the
relative probability with which the onset of each coded event immediately
succeeds the onset of all other events. Figure 2 presents a representative portion
of the complete matrix.
As Figure 2 forms a foundation for all further discussion and data analysis,
further elaboration is in order. With this data presentation step, focus is shifted
away from the discrete variable characteristics toward the relationships in time
among these variables as they actually operate within their instructional setting.
The axis numbers on the matrix correspond to the behavioral and stimulus
elements in Appendix A and in Table 1, with the horizontal bar comprising
succeeding elements, and the vertical bar, preceding elements. Elements on the
vertical bar correspond to the same elements as on the horizontal bar (e.g.,
Element 12 represents specific observation on the ordinate and the abscissa).
Within each matrix cell the upper number indicates the frequency with which a
succeeding element was the next to occur in the setting, or the first to follow the
onset of the preceding event on the vertical bar. It is important to note that the
preceding element could still be ongoing when a succeeding element begins.
Building the matrices according to element start times allows for scrutiny of
overlapping or simultaneously occurring events.
The lower number in each cell indicates the probability with which a
succeeding element followed the preceding element relative to all of the possible
preceding elements in this setting. These lower number probability indicators
add up to 1.00 across each preceding element's horizontal line within a complete
matrix. We have presented aggregate probabilities in the right-hand column for
the succeeding elements represented.
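
The construction of such a matrix reduces to counting onset-to-next-onset pairs and normalizing each preceding element's row. The sketch below illustrates that logic under an assumed record format; it is not the analysis software used in the study.

```python
# Hypothetical sketch of the onset-succession (conditional probability) matrix.
from collections import defaultdict

def succession_matrix(records):
    """records: (element, onset, offset) tuples; returns pair counts and the
    relative probability of each succeeding element, by preceding element."""
    onsets = sorted(records, key=lambda record: record[1])
    counts = defaultdict(lambda: defaultdict(int))
    for (preceding, _, _), (succeeding, _, _) in zip(onsets, onsets[1:]):
        counts[preceding][succeeding] += 1        # next onset, even if overlapping
    probabilities = {
        preceding: {nxt: n / sum(row.values()) for nxt, n in row.items()}
        for preceding, row in counts.items()
    }
    return counts, probabilities

counts, probabilities = succession_matrix(
    [(17, 1.0, 1.6), (26, 1.5, 2.0), (17, 5.0, 5.5), (19, 5.2, 6.0)]
)
print(probabilities[17])   # {26: 0.5, 19: 0.5}
```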
To illustrate, preceding Element 17 (content-specific encouragement; Fig-
ure 2, vertical axis) immediately preceded Element 26 (positive instructional
feedback; Figure 2, horizontal axis) 70 times, for a relative probability of .25.
Teacher Behavior 21 (verbal skill statement) immediately preceded Behavior 31
(positive nonverbal communication) 38 times, for a probability of .24, and so

forth.

Figure 2 - Expert conditional probability matrix example (rows list preceding elements,
columns list succeeding elements; cell entries give frequencies and relative
probabilities, with aggregate probabilities in the right-hand column).


This step in data presentation begins to demonstrate the temporal and
contextual nature of instructional behavior by focusing on all of the dual
behavior and/or stimulus patterns in time within the setting, rather than merely
characteristics of isolated elements.
Kinematic Flow Charting. The third data presentation step within field
systems analysis represents the probable flow in time of behavioral and stimulus
chains of high frequency. Selection of the trigger element (i.e., the element that
initiates a chain) and the formulation of chains is based upon the conditional
probability matrix (Figure 2) and upon the behavioral and stimulus element
summary (Table 1). Succeeding, as well as antecedent, elements may be exam-
ined, and minimum probability levels and latencies between events are used to
guide the chaining process (e.g., elements with lower than a 10% relative
probability or a frequency of 3 or less were ignored for the purposes of our
analysis).
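
Applying the screening thresholds noted above (a relative probability of at least .10 and a frequency greater than 3) to the succession matrix yields the dual chains of Figure 3. A hypothetical sketch of that filtering step, reusing the counts and probabilities structures from the matrix sketch above:

```python
# Hypothetical sketch of dual-chain selection from the succession matrix.

def dual_chains(counts, probabilities, min_prob=0.10, min_freq=4):
    """Return (trigger, successor, frequency, probability) tuples that pass
    the frequency and relative-probability screens."""
    chains = []
    for trigger, row in counts.items():
        for successor, frequency in row.items():
            probability = probabilities[trigger][successor]
            if frequency >= min_freq and probability >= min_prob:
                chains.append((trigger, successor, frequency, round(probability, 2)))
    return sorted(chains, key=lambda chain: chain[3], reverse=True)

# With a full session's records, a call such as dual_chains(counts, probabilities)
# would return entries of the form (17, 26, 70, 0.25), as reported in the text.
```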
Once behavioral and stimulus elements of high frequency are determined,
relative probabilities of dual-event chains may be gleaned directly from the
software programming. Occurrence in time may be displayed in several ways,
some of which include (a) a list of high-frequency dual-element chains and their
respective frequencies and probabilities (Figure 3); (b) a topographic field of
probable element sequences, which depicts a greater portion of the system in
action and interaction (Figure 4); or (c) groups of preceding and succeeding
elements of frequent occurrence that cluster around a trigger element (Figure 5).

Figure 3 - Behavioral and stimulus element dual chain examples (for each trigger
element, the succeeding elements are listed with their frequencies and relative
probabilities).



Figure 4 - Behavioral and stimulus element sequential field example.

Figure 5 - Pre- and post-trigger element nest examples.

To elaborate, Figure 3 lists multiple dual chains of high frequency. This
type of visual representation clusters high-frequency behaviors, along with
frequency and relative probabilities of occurrence, after particular trigger ele-
ments. For example, Trigger Element 17 (content-specific encouragement) is
immediately followed by itself and Elements 19 (verbal instructional prompts),
21 (verbal instruction skill statements), 26 (positive instructional feedback), and
31 (positive nonverbal communication) with relatively high frequency. These five
conceptually related succeeding behaviors comprise a relatively high cumulative
probability of positive follow-up to initial content-specific encouragement for
our exemplary teacher (i.e., .75).
This representation format describes in detail element clusters that directly
follow the content-specific encouraging behaviors of the teacher, demonstrating
the high-probability elemental time-pattern characteristics of a particular expert
instructor. From this type of data result, we may substantiate the importance of
emitting instructional behaviors in sequence. For example, it may be equally
important for a teacher to provide content-specific encouragement immediately
prior to instructional conveyance behaviors (to better ensure student receptivity
to an instructional attempt) as to simply include these discrete behaviors in her
or his repertoire. In the same light, it seems apparent that specific observation
should immediately succeed instructional behaviors to ascertain student under-
standing, instructional feedback should be provided immediately after the afore-
mentioned student observations to tie knowledge of results to the immediate task
at hand and better ensure student improvement, and positive communication
should be tied to this behavioral time sequence to promote a warm emotional
climate within the temporal context of the encouragement-instruction-
observation-feedback teacher behavior chain.
Given the emerging evidence of similar clusters of behavioral and/or
stimulus elements following particular trigger elements with regularity within
Figure 3, a more encompassing visual representation as depicted by Figure 4
may be warranted in representing multiple occurrences of similar time patterns
within the setting. This data representation allows the researcher to represent a
larger network of elemental relationships in portraying more of the important
interactive characteristics of the experimental system.
To illustrate, Behavior 12 (specific observation) is chosen as the initial
trigger event. From the behavior's emission, immediately succeeding behavioral
or stimulus elements of high probability are depicted (i.e., Elements 17, content-
specific encouragement; 19, verbal instruction prompts; 26, positive instructional
feedback; and 31, positive nonverbal communication). We then depict each of
the secondary succeeding elements that occur with high probability after the
primary succeeding behavior, the tertiary after the secondary, and so forth.
This data representation format, again, attempts to demonstrate the multiple
relationships among behaviors (and setting elements if applicable) within the
actual time sequence in which they were emitted. From the depiction in Figure
4, multiple overlapping chains of behaviors and ecological elements may be
clearly shown within the larger field. One of the interesting features in Figure
4, for example, shows content-specific encouragement and positive instructional
feedback (i.e., Elements 17 and 26), both to be followed first by content-specific
encouragement (17), verbal instructional prompts (19), positive instructional
feedback (26), and nonverbal positive communication (31). This type of informa-
tion also gives an indication of longer chains of time-dependent relationships
that occur in a setting with high probability.
Due to the complex nature of Figure 4's field-like representation of the
behavioral and stimulus elements and the possibility of more complex temporal
relationships among specific behavioral elements, a more microscopic representa-
tion of the high-probability element clusters that occur before and after the trigger
event may aid in data interpretation. Therefore, Figure 5 relates nests of high-
probability behaviors that occur immediately prior to, and immediately after, a
particular event's onset. Analysis software allows search parameters such as
(a) frequency (>3 in this example), (b) relative probability (>.10), (c) lag-time
specification between elements (not used in this example), and/or (d) within-
chain behavior or context element specification searches (not used in this
example), in determining specific elements within a nest package. The observa-
tional, encouraging, modeling, feedback, positive communication cluster (i.e.,
Elements 12, 17, 24, 26, and 31) that occurs prior to teacher behavior Element
21 (verbal instruction skill statements) and the similar nest package that follows
verbal instruction (Element 21) illustrate the utility of this data representation
format quite elegantly, as there is great similarity among pre- and post-element
clusters for our experimental teacher (see Figure 5).
All of the chains we've represented, as well as their respective configura-
tions, are dependent upon the particular investigative setting. It is therefore the
researcher's prerogative to determine which visual format best conveys the
structure of the field system. In cases where element strings occur in the data set
but are interrupted by unrelated events, time-lag analyses should also be con-
ducted. In lag analysis, instead of determining the probability with which the
onset of an element immediately follows the onset of a preceding one, we assess
the probability that the onset of an element occurs within x amount of time after
the onset of a trigger element. Time lags are quite useful in determining the
degree of temporal contiguity among elements within a particular string of
interest. Further, frequent strings initially hidden by simultaneously occurring
behavioral or stimulus elements may be discerned.
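
The lag computation itself is simple: for each onset of a trigger element, test whether the target element's onset appears within a stated window. A hypothetical sketch, with the record format assumed as before:

```python
# Hypothetical sketch of time-lag (windowed) analysis.

def lag_probability(records, trigger, target, window):
    """Probability that an onset of `target` occurs within `window` time units
    after an onset of `trigger`, regardless of intervening events."""
    trigger_onsets = [onset for element, onset, _ in records if element == trigger]
    target_onsets = [onset for element, onset, _ in records if element == target]
    if not trigger_onsets:
        return 0.0
    hits = sum(
        any(0 < t_onset - tr_onset <= window for t_onset in target_onsets)
        for tr_onset in trigger_onsets
    )
    return hits / len(trigger_onsets)

records = [(17, 10, 12), (19, 11, 15), (26, 40, 41), (17, 60, 61), (26, 62, 63)]
print(lag_probability(records, trigger=17, target=26, window=5))   # 0.5
```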
Graphic Representation. The final data presentation format involves
graphic depiction of the relationships in time among multiple elements in
promoting a more accurate representation of how observable events actually
occur in a setting context. Those behavioral and environmental stimulus elements
that evidence a high relative probability (Figure 2) are juxtaposed graphically in
order to visually examine further possible interrelationships or more clearly
display the temporal patterns evidenced by the behavioral and setting elements
of interest. Elements are purposely selected, based on information in the three
previous data presentation formats, and strung horizontally from the graph's
ordinal axis. Start and stop-time parameters are specified in terms of micro-
seconds² along the abscissa. Graphics software allows the investigator to specify
abscissa time parameters in order to gain differential visual slices of representative
portions, or the total field system, in question. Figure 6 depicts a representative
4-min portion from our exemplar.
Time comprises the horizontal axis, and chosen elements are juxtaposed
along the vertical axis in Figure 6. Each individual element stream is stretched
across time, with the boxed portions' appearance signifying the onset of one
instance of a particular behavioral or stimulus episode, and each box's end
showing its termination. Therefore, element frequency and duration for the time
frame specified may be visually discerned from the graph. Alternate behavioral
and setting elements viewed on several graphs provide a unique representation
of the temporal locus and extent of events in relation to one another in the context
of the time sequence in which they actually occurred.
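
The original graphs were produced with purpose-built graphics software; today an equivalent display could be sketched with a general plotting library, as below. The record format and labels are assumptions for illustration only.

```python
# Hypothetical sketch of a Figure 6-style element-stream timeline.
import matplotlib.pyplot as plt

def plot_element_streams(records, elements):
    """Draw one horizontal stream of boxes per element, spanning onset to offset."""
    fig, ax = plt.subplots(figsize=(8, 0.6 * len(elements)))
    for row, element in enumerate(elements):
        spans = [(onset, offset - onset) for el, onset, offset in records if el == element]
        ax.broken_barh(spans, (row - 0.3, 0.6))
    ax.set_yticks(range(len(elements)))
    ax.set_yticklabels([f"Element {e}" for e in elements])
    ax.set_xlabel("Time (1.953125E-02 s units)")
    plt.show()

records = [(12, 60000, 70000), (17, 60500, 60540), (19, 60560, 60620)]
plot_element_streams(records, elements=[12, 17, 19])
```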

Figure 6 - Four-minute graphic representation of select behavioral elements (streams for
elements such as 17, content-specific encouragement; 19, verbal instructional prompts;
and 24, individual modeling, plotted against time in units of 1.953125E-02 s).


Figure 6's vertical axis includes the seven numerically designated elements
chosen for tactical explanation. In a horizontal scan of the graph across time, the
elements listed on the graph are viewed primarily in terms of their temporal
relationships with one another, rather than simply by their isolated discrete
characteristics. From this example, we may systematically describe our teacher/
subject by stating that it is first clear that Element 19 (verbal instructional
prompts) regularly succeeds Element 17 (content-specific encouragement) within
the context of ongoing Element 12 (specific observation). Second, longer duration,
lower frequency periods of Element 21 (verbal skill statements), and even longer
duration Element 24 (individual modeling) episodes, follow encouraging and
prompting behaviors (Elements 17 and 19). Finally, a tightly coupled dual chain
of high-frequency, short-duration Element 26 (positive instructional feedback)
and Element 31 (positive nonverbal communication) terminates the interactive
package.
A richly detailed time-dependent view of multiple elements within an
instructional setting may therefore be uniquely attained as multiple graphic
displays of elements of possible interest are viewed and interpreted. Such a fine-
grained analysis, in terms of time, allows for observation, interpretation, and, to
some extent, even prediction of discrete observable events, due to their repeated
temporal relationships with one another.
Data Interpretation
Thus far, we have attempted to show how field systems analysis moves
beyond traditional behavioral description, collation, and synthesis of isolated
behavioral and stimulus elements toward a focus upon a more encompassing
time-series-dependent system. Further, the temporal and contextual nature of
data presentation gradually moves from simple dual connections to complex
elemental clusters that actually occur in the setting. Though the ability to
demonstrate independent causation among events may not be possible, this
paradigm is able to describe in an alternative light some of the correlational, or
probabilistic, intricacies of the inherently complex instructional interaction of
expert teachers. What is additionally necessary are suitable interpretive concepts
that may communicate field systems findings to the scientific community.
We therefore identify and define five alternative descriptive dimensions of
field systems in an attempt to more clearly articulate the temporal characteristics
of a setting. Though this is by no means an exhaustive listing, it may suffice as
an appropriate starting point for describing the nature of a field system.
Frequency and Duration. The terms frequency and duration are ascribed
traditional meanings that speak to the number and length of occurrence for each
individual element.
Rhythm. The term rhythm refers to regularly recurring temporal patterns
of elements, chains, or clusters of elements. The frequency and relative probability
of patterns are a major focus of the analysis. Rhythm involves the tempo and
regularity with which patterns are repeated through time.
Complexity. To obtain the complexity dimension, the system is defined
in terms of the number of conceptually different subcategories required for
its description. In our example, 34 stimulus and behavioral elements were
synthesized from the original verbal description. Given that virtually any
investigative situation may be broken down into more and more elemental
categories, with finer discriminations among conceptually similar elements,
complexity rests upon the notion of conceptually differentiated subcategories
of elements. Of these, five different environmental stimulus elements and
nine different behavioral subcategories were determined (Appendix A). When
investigating other instructional systems, a different number of subcategories
may be necessary. Relative complexity may thus be seen as a meaningful
interpretive concept.
Coherence. Coherence is a more complex interpretive dimension. Implicit
in the complete probability matrix, of which Figure 2 is only a small portion, are
indications of the actual versus the possible number of different behavioral and
stimulus relations in time. If one were to consider the number of pairs that could
have occurred (i.e., the total number of boxed portions contained within one
matrix) and compare this to the number that actually did occur (i.e., the number
of cells that contain data), it may be concluded that the number of pairs actually
observed does not reflect all of the possible relations.
Therefore, the number of matrix cells containing entries in relation to the
total number of possible cells reflects both the variability of elemental change
and the probable reliability, or coherence, of the structural configuration of
elements within the system. In effect, coherence represents the relative degree
of order and consistency extant in a setting. High probabilities of high frequency
within few cells reflect a greater organizational coherence. Conversely, low
probabilities of low frequency within many cells indicate less coherence within
a field. We should not, however, confuse the terms coherence and predictability
with an assessment of independent causality, nor should predictability be taken
to imply a deterministic system. It may be more accurately characterized as the
relative degree of correlation among elements in time.
Velocity. The velocity dimension totals the frequency of occurrence for
every element pair within the complete probability matrix. This numeric total
quantifies the frequency of element sequencing for a setting. Considered in light
of a known time parameter, such a measure reflects the velocity of element
change, or the rate at which observable element change occurs in a setting.
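
Both dimensions follow directly from the succession-count matrix. A hypothetical sketch, again reusing the counts structure from the matrix example (names and structures are assumptions, not the authors' software):

```python
# Hypothetical sketches of the coherence and velocity dimensions.

def coherence(counts, n_elements):
    """Ratio of matrix cells containing entries to total possible cells; a lower
    ratio (activity concentrated in fewer cells) indicates greater coherence."""
    filled_cells = sum(len(row) for row in counts.values())
    return filled_cells / (n_elements * n_elements)

def velocity(counts, session_minutes):
    """Total frequency of element-to-element transitions per minute of setting,
    i.e., the rate at which observable element change occurs."""
    total_pairs = sum(freq for row in counts.values() for freq in row.values())
    return total_pairs / session_minutes
```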

Selected Application Results

Our analysis of one expert instructor's lesson conducted in mid unit yielded
a number of interesting observations that attest to the value of field systems
analysis. Though lacking a comparative view within this paper, we found evidence
of what appears to be a consistent and coherent system of instructional behavior.
This is corroborated in the repetitive nature of the behavioral chains in Figures
3 through 6. Second, velocity of behavior was in agreement with Jackson's (1968)
high-velocity perspective on effective instruction. Though relative complexity
may not be determined at this juncture, the rate at which behavior change occurred
seemed high, as did the number of different categories required for exhaustive
systemic description. This finding is in accord with the perception that as teachers
attain expertise, they acquire a larger repertoire of instructional behaviors and
implement them at a more rapid and time-efficient rate. Further, our expert
proved adept at emitting multiple behaviors simultaneously. This finding is
clearly viewed in Figure 6 by moving a vertical line across the graph. Many
behavioral elements occurred simultaneously or overlapped in time. Finally, a
consistent rhythm of instructional activity was evident. Certain chains, or
streams of behavioral elements, were found to have high probabilities of
occurrence. This, too, is consistent with the proposition that expert teachers
attain a great degree of consistency and predictability in their instructional
repertoires.
In highlighting contiguous temporal relationships across stimulus and
behavioral elements (refer to Figure 6), a link between specific observation
(Element 12) and encouraging (17) and prompting (19) behaviors was evident.
A consistent coupling between encouragement (Element 17), skill instruction (21
and 24), and positive feedback (26 and 31) was also seen. Additionally, longer
periods of more intense forms of instruction (e.g., Element 24, individual
modeling) frequently followed initial instructional prompts (Element 19), and,
when instructional prompts were emitted, positive nonverbal (31) communication
consistently ensued.
Finally, connections between our exemplary subject's instructional reper-
toire and the descriptive dimensions of velocity, complexity, coherence, and
rhythm used in characterizing a field system were periodically made. This
particular instructor seemed to be purposefully orchestrating a field of stimulus
and behavioral elements through a well-defined instructional path in which
particular element emissions were predicated upon appropriate temporal place-
ment within the larger context of the setting. Implications for further scrutiny of
temporal relationships and their possible importance for enabling exemplary
instruction are clear.

Summary
The field systems paradigm, thus, has the potential to fulfill a number of
roles. First, additional insight into category system development may be gained.
Such an analysis is a closer approximation to the provision of category systems
that avoid imposition of a deductive conceptual scheme on the data. Due to the
inductive nature of the paradigm, category systems developed and refined by this
analytic technique have the potential for new and different descriptions of teacher
expertise, which should add to the existing literature. Additionally, and of primary
import, a contribution may be made regarding the conceptualization
of variables in the context of their relationships with others in time, allowing for
the description of teaching expertise in terms of the interconnected nature of
observable pedagogical components.
As such, field systems analysis has demonstrated promise for extending
the body of knowledge regarding the nature of instructional efficacy. A clearer
view of the interconnections in time among the many environmental elements in
which expert instructors operate, and the behavioral elements that experts
successfully orchestrate, should inform teacher educators about more of what is
involved in becoming a teacher par excellence in the context of live instructional
settings. Adding a comparative dimension to the paradigm and including student
behavior in the system description should greatly facilitate this process.
Turning to such a field-oriented research method should also allow teacher
educators to better prepare aspiring teachers in a more systematic and context-
appropriate fashion. Hopefully, implementation of alternative strategies and
tactics such as this may move the educational research community closer to the
ultimate goal of describing with accuracy previously undiscovered characteristics
of effective classrooms. Receptivity to alternative paradigms, when coupled with
replication and comparison across instructors of variable expertise and experience,
should make this an achievable goal.

References
Bakeman, R., & Gottman, J.M. (1986). Observing interaction: An introduction to sequen-
tial analysis. New York: Cambridge.
Berliner, D.C. (1986). In pursuit of the expert pedagogue. Educational Researcher, 15(7),
5-13.
Borich, G.D. (1986). Paradigms of teacher effectiveness research: Their relationship to
the concept of effective teaching. Education and Urban Society, 18, 143-167.
Brophy, J., & Good, T.L. (1986). Teacher behavior and student achievement. In M.C.
Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 328-375). New
York: Macmillan.
Cook, T.D., & Campbell, D.T. (1979). Quasi-experimentation: Design and analysis issues
for field settings. Boston: Houghton Mifflin.
Dawe, H.A. (1984). Teaching: Social science or performing art? Harvard Educational
Review, 54, 111-114.
Dickie, R.F., & Hursh, D.E. (Eds.) (1989). Juniper Gardens special issue. Education and
Treatment of Children, 12(4).
Eisner, E.W. (1983). The art and craft of teaching. Educational Leadership, 40(1), 4-13.
Finn, C.E. (1988). What ails education research. Educational Researcher, 17(1), 5-8.

Firestone, W.A. (1987). Meaning in method: The rhetoric of quantitative and qualitative
research. Educational Researcher, 16(7), 16-21.
Frick, T.W. (1990). Analysis of patterns in time: A method of recording and quantifying
temporal relations in education. American Educational Research Journal, 27(1),
180-204.
Gage, N.L. (1976). A factorially designed experiment on teacher structuring, soliciting,
and reacting. Journal of Teacher Education, 29(1), 35-38.
Gage, N.L. (1978). The scientific basis of the art of teaching. New York: Teachers College
Press.
Gage, N.L. (1984). What do we know about teaching effectiveness? Phi Delta Kappan,
66, 87-90.
Good, T.L. (1979). Teacher effectiveness in the elementary school. Journal of Teacher
Education, 30(2), 52-64.
Graham, G., & Heimerer, E. (1981). Research on teacher effectiveness. Quest, 33(1), 14-25.
Greenwood, C.R., Delquadri, J.C., & Hall, R.V. (1989). Longitudinal effects of classwide
peer tutoring. Journal of Educational Psychology, 81, 371-383.
Greenwood, C.R., Delquadri, J.C., Stanley, S.O., Terry, B., & Hall, R.V. (1985). Assess-
ment of eco-behavioral interaction in school settings. Journal of Psychopathology
and Behavioral Assessment, 7, 331-347.
Hawkins, A.H., & Wiegand, R.L. (1987). Where technology and accountability converge:
The confessions of an educational technologist. In G.T. Barrette, R.S. Feingold,
C.R. Rees, & M. Pieron (Eds.), Myths, models, & methods in sport pedagogy (pp.
67-75). Champaign, IL: Human Kinetics.
Holton, B. (1973). Introduction to concepts and theories in physical science (2nd ed.).
Reading, MA: Addison-Wesley.
Jackson, P.W. (1968). Life in classrooms. New York: Holt Rinehart & Winston.
Johnston, J.M. (1988). Strategic and tactical limits of comparison studies. The Behavior
Analyst, 11(2), 1-9.
Johnston, J.M., & Pennypacker, H.S. (1980). Strategies and tactics of human behavioral
research. Hillsdale, NJ: Erlbaum.
Kantor, J.R. (1922). Can the psychophysical experiment reconcile introspectionists and
objectivists? American Journal of Psychology, 32, 481-510.
Kantor, J.R. (1969). The scientific evolution of psychology (Vol. 2). Chicago: Principia
Press.
Kantor, J.R. (1970). An analysis of the experimental analysis of behavior (TEAB). Journal
of the Experimental Analysis of Behavior, 13, 101-108.
Kazdin, A.E. (1982). Single-case research designs. New York: Oxford University Press.
Lawson, H.A. (1989). Review essay: The multiversity and issues of discipline, profession,
and centrality. Quest, 41(1), 68-80.
Lichtenstein, P.E. (1983). The interbehavioral approach to psychological theory. In N.W.
Smith, P.T. Mountjoy, & D.H. Ruben (Eds.), Reassessment in psychology: The
interbehavioral alternative (pp. 3-20). Washington, DC: University Press of America.
Lincoln, Y., & Guba, E. (1985). Naturalistic inquiry. London: Sage.
Medley, D. (1979). The effectiveness of teachers. In P.L. Peterson & H.J. Walberg (Eds.),
Research on teaching: Concepts, findings, and implications (pp. 11-27). Berkeley,
CA: McCutchan.
Metzler, M. (1989, October). Bringing the teaching act back into sport pedagogy research.
Paper presented at the R. Tait McKenzie symposium on research in sport pedagogy,
Knoxville, TN.

Powell, J., & Dickie, R.F. (1990). Kinetic output: A conceptual, dimensional, and empirical
analysis. The Behavior Analyst, 13(1), 3-10.
Pronko, N.H. (1980). Psychology from the standpoint of an interbehaviorist. Monterey,
CA: Brooks/Cole.
Ray, R.D., & Delprato, D.J. (1989). Behavioral systems analysis: Methodological strategies
and tactics. Behavioral Science, 34(2), 81-127.
Rosenshine, B., & Furst, N. (1973). The use of direct observation to study teaching. In
R.M.W. Travers (Ed.), Second handbook of research on teaching (pp. 156-157).
Chicago: Rand McNally.
Rubin, L.J. (1985). Artistry in teaching. New York: Random House.
Schroeder, S. (Ed.) (1990). Ecobehavioral analysis and developmental disabilities. New
York: Springer-Verlag.
Sharpe, T. (1989). Interbehavioral field systems analysis across expert and novice
elementary physical education instructors: A paradigmatic alternative to the teacher
effectiveness quantification puzzle (Doctoral dissertation, West Virginia University,
1989). Dissertation Abstracts International.
Sharpe, T., & Hawkins, A. (in press). An epistemological postscript. Journal of Teaching
in Physical Education.
Siedentop, D., & Eldar, E. (1989). Expertise, experience, and effectiveness. Journal of
Teaching in Physical Education, 8, 254-260.
Thorne, F.C. (1973). Eclectic psychotherapy. In R. Corsini (Ed.), Current psychotherapies
(pp. 445-486). Itasca, IL: F.E. Peacock Press.

Notes
¹A complete software guide and accompanying technological hardware contacts
are available upon request. For further details contact the primary author.
²Though microseconds is a complex unit of measurement, current software is
programmed in this fashion for other applications. The reader may convert via the
following approximations: 100 microseconds = 2 s; 1,000 microseconds = 20 s; 3,000
microseconds = 1 min.

Acknowledgment
We gratefully acknowledge the methodological input of Roger Ray of Rollins
College and the historical and conceptual advice of Dennis Delprato of Eastern Michigan
University. Their work and encouragement have given impetus to our efforts.

Appendix A
Category system

Contextual/environmental setting elements
-Full-court gymnasium, optimal lighting, 30 pupils, five-station 1PI movement-education unit,
new equipment
-Urban, middle-class third-grade homogeneous student population with an equal mix of
males and females
-All pupils exposed to a daily physical education class
Contextual/organismic setting elements
-Seven-year veteran female instructor in her early 30s with a master's degree in physical
education
Discrete environmental stimulus elements
--Content stage -Noise level
1. Preview 6. High
2. Instruction 7. Medium
3. Review 8. Low
-Pupil configuration -Teacher position
4. Station drill work 9. Central
5. Games situation 10. Peripheral
-Indiscriminate environmental stimuli
13. Transition
33. Indecision/contradiction
Specific behavioral elements
-Observation -Nonverbal instruction
11. General 23. Group modeling
12. Specific 24. Individual modeling
-Management 25. Physical guidance
14. Request -Instructional feedback
15. Direction 26. Positive
16. Support 27. Negative
-Encouragement -Nontask verbal
17. Content specific 28. Positive
18. Content unrelated 30. Negative
-Verbal instruction 29. Off-task
19. Prompts -Nonverbal communication
20. Strategies/rules 31. Positive
21. Skill statements 32. Negative
22. Analogy
36. Tie-in statements

Note. The numbers that correspond to the stimulus and behavioral elements are used as
elemental descriptors throughout the results and analyses sections. Headings indicate a
mutually exclusive category grouping: Elements within a category heading may not occur si-
multaneously or overlap; conversely, elements across category headings may occur simul-
taneously or overlap.
