
A Graphical Exploration of SPC

Editor's note: This article is the second of a two-part illustrated discussion of the definitions and practices of process control, especially those practices that are supported by the construction and interpretation of Shewhart control charts. The first article, which appeared in the May 1996 issue, addressed issues relating to the structure of control charts. This second article focuses on the interpretation of control charts, with special emphasis on the sensitivity of seven rules that are commonly used to help quality professionals hear the voice of the process.

Part 2: The probability structure of rules for interpreting control charts

by Robert W. Hoyer and Wayne C. Ellis

Attention will be focused on a set of seven rules designed to help quality professionals interpret information contained in standard X charts or in any control chart for which the marginal distribution of the statistic is approximately normally distributed. But before you read further,
examine the B chart in Figure 1. Circle the points
indicating that the process is out of control. Do not
circle the entire set of points that defines the pattern; circle only the point that triggers the rule you
are using. After circling these points, examine the
list of rules in Table 1. If Table 1 contains any of
the rules that you used to identify out-of-control
conditions in the chart, write the number of the
rule next to the corresponding point in Figure 1.
This exercise of identifying the out-of-control
signals in Figure 1 always gets mixed results.
Some people circle only two points, while others
circle almost half of the points in the display. A
glance at the chart reveals that the values of B are
initially fairly large, averaging close to two standard deviations of B above the centerline. Then
they begin a downward trend that bottoms out at a
point below the lower control limit. On the right-hand side of the chart, there is a sudden jump in the values of B that, within the space of two samples, is greater than four standard deviations of B.
Within the context of the rules in Table 1, eight of the B chart's 16 points trigger rules that suggest the presence of special-cause variation. In addition, the 14th point in the time series is, by itself, consistent with five of the seven rules.
The rules in Table 1 are a fairly
standard set; they are used in many
manufacturing, assembly, and service-delivery settings. Each rule describes a
pattern of a statistic that, if it occurs in
a control chart, implies the presence of
a special cause of variation.
Notice that the patterns in Table 1
are for points above the centerline or
for points that are in an upward trend.
Because the rules' comparative strength (or sensitivity) is of interest, the rules have been described in a one-sided fashion. For example, instead of stating that a point outside the control limits signals a special cause of variation, it is stated that a point above the upper control limit sends such a signal. Each pattern has a symmetric counterpart, however, for points below the centerline or for points that are in a downward trend. Furthermore, because the rules' relative sensitivities are of interest, the rules have been described in a manner that causes several to be subsumed by others (e.g., Rule 2 subsumes Rule 2'). Finally, the rules are described in terms of a generic B chart.
The rationale for specifying that the patterns in
Table 1 are indicative of special-cause variation is
rarely discussed in the statistical process control
(SPC) literature. Thus, the decision to focus on the
seven rules is, for the most part, based on the facts
that few quality professionals would find them
objectionable and few process control experts use a
more extensive set.
Without supporting arguments, it can be stated that a signal of a special cause of variation:
• Is a systematic pattern of the statistic
• Has a low probability of occurring when the process is in control
Figure 1. The B Chart

Quality Progress June 1996

Table 1. Rules for Identifying Special Causes of Variation in Control Charts

Rule 1: A point is above the upper control limit (Bi > B̄ + 3σB).
Rule 2: Of three consecutive points, two are more than two standard deviations above the centerline (of Bi, Bi+1, Bi+2, two exceed B̄ + 2σB).
Rule 2': Two consecutive points are more than two standard deviations above the centerline (Bi and Bi+1 both exceed B̄ + 2σB).
Rule 3: Of five consecutive points, four are more than one standard deviation above the centerline (of Bi, ..., Bi+4, four exceed B̄ + 1σB).
Rule 3': Four consecutive points are more than one standard deviation above the centerline (Bi, ..., Bi+3 all exceed B̄ + 1σB).
Rule 4: Seven consecutive points are above the centerline (Bi, ..., Bi+6 all exceed B̄).
Rule 5: Six consecutive points are in a monotone increasing pattern (count the first and last points in the pattern): Bi < Bi+1 < Bi+2 < Bi+3 < Bi+4 < Bi+5.
Rule 6: Of 10 consecutive values of B, there is a subset of eight (reading from left to right) that are in a monotone increasing pattern (count the first and last points in the pattern).
Rule 7: Given two consecutive points, the second is at least four standard deviations above the first (Bi+1 − Bi ≥ 4σB).

While it is fairly easy to calculate the probability of a specific pattern of the statistic when the process is in control, deciding whether it is a systematic pattern is essentially subjective. Using the rules in Table 1, Figure 2 shows those points that are both systematic patterns and have low probabilities of occurring when the process is in control.

What is sensitivity?
Suppose you have been measuring a certain process characteristic that has been in control for a long time. Suddenly, you
reach a point in the chart where one of the seven patterns of the
statistic occurs. You can be fairly certain that one of two possibilities is true: Either you have witnessed a miracle (a low-probability event for an in-control process) or the pattern is not surprising (it can easily be explained by adjusting the distribution of the process output in a manner that makes the pattern's occurrence a reasonably high-probability event).
These two alternatives are illustrated in the X charts in Figure 3. Notice that in both charts, there is no reason to believe that the process is out of control until reaching the 15th value of X. At that point, it seems reasonable to believe that either the process is in control (chart a) and you have just witnessed a value of X that occurs with a probability (P) less than P = 0.00135 (i.e., 0.135% of the time) or the process actually moved upward (chart b).
Since a statistician, by definition, is a variance analyzer who has never witnessed a miracle, he or she always accepts the second explanation. But accepting the second explanation requires an adjustment of the distribution of the process output to make the probability seem less than miraculous, and that means there is at least one special cause of variation. Eureka, the process is out of control!

What was potentially a surprising event was not surprising at all. Values of X this large will occur almost 7% of the time. But to make the occurrence of a potentially unusual pattern seem ordinary, it is necessary to assume that the process moved upward. In other words, it is necessary to assume that a special cause of variation is affecting the process.
Most quality professionals know that the probability of
observing a value of X above the upper control limit is P =
0.00135 when the process is in control. But few know about the
X chart probabilities associated with other rules (and even fewer
know about the probabilities of patterns in other types of charts,
such as R or s charts). Take this challenge: Rank the rules in
Table 1 according to their sensitivity for detecting a special
cause of process variation. In other words, indicate which rule
most strongly signals that the process is out of control, which
rule is the second strongest signal, and so forth.
When asked to rank the rules in terms of their sensitivities,
most people want to know: What is meant by sensitivity? For
purposes of this discussion, the definition of sensitivity is:
Given any two rules, say, Rule A and Rule B: Rule A is a more sensitive indicator that the process is out of control than is Rule B if, when the process is in control, the probability of the event in Rule A is smaller than the probability of the event in Rule B.
Now, armed with this definition, you are encouraged to assume that:
• The distribution of the process characteristic being measured is approximately symmetric.
• The chart being assessed is an X chart with n = 5 observations in each sample.
In light of these assumptions, again rank the sensitivities of the nine rules (two of which are special cases of other rules) in Table 1. Then examine Figure 2 and label the designated points most sensitive, second most sensitive, and so forth.
Almost without exception, those who respond to this challenge give top billing to Rule 1, with Rule 4 usually close behind. But, as you will soon discover, Rule 2' is actually more

Figure 2. Points That Signal Special-Cause Variation

Table 2. Rules for Identifying Special Causes of Variation in Control Charts (Ranked in Order of Sensitivity)

Rule (sensitivity): Description. In-control probability.
Rule 6' (7.50): Of nine consecutive points, there is a subset of eight (reading from left to right) that are in a monotone increasing pattern (count the first and last points in the pattern). 0.018%
Rule 2' (2.60): Two consecutive points are more than two standard deviations above the centerline. 0.052%
Rule 3' (2.14): Four consecutive points are more than one standard deviation above the centerline. 0.063%
Rule 6 (1.96): Of 10 consecutive points, there is a subset of eight (reading from left to right) that are in a monotone increasing pattern (count the first and last points in the pattern). 0.069%
Rule 1 (1.00): A point is above the upper control limit. 0.135%
Rule 5 (0.97): Six consecutive points are in a monotone increasing pattern (count the first and last points in the pattern). 0.139%
Rule 2 (0.88): Of three consecutive points, two are more than two standard deviations above the centerline. 0.153%
Rule 7 (0.58): Given two consecutive points, the second is at least four standard deviations above the first. 0.234%
Rule 3 (0.49): Of five consecutive points, four are more than one standard deviation above the centerline. 0.277%
Rule 4'' (0.35): Eight consecutive points are above the centerline. 0.391%
Rule 6'' (0.32): Of nine consecutive points, there is a subset of seven (reading from left to right) that are in a monotone increasing pattern (count the first and last points in the pattern). 0.417%
Rule 4 (0.17): Seven consecutive points are above the centerline. 0.781%
Rule 4' (0.09): Six consecutive points are above the centerline. 1.563%
sensitive than Rule 1, and Rule 4 describes the least sensitive of all of the signals in Table 1.

How sensitivity is calculated

If the distribution of the process characteristic is approximately symmetric (which includes, of course, a normal distribution), the central limit theorem guarantees that X is approximately normally distributed, even with relatively small sample sizes. With this knowledge and the assumption that the process is in control, the exact probabilities of the rules listed in Table 1 can be calculated. These probabilities are displayed in Table 2, where the list of rules has been expanded and ranked by the rules' sensitivities.
Although there are several scenarios within which the probabilities in Table 2 could have been calculated, the following
real-time assessment was chosen: Imagine that you are a
process owner (operator) constructing an X chart in real time
and evaluating it accordingly (which, incidentally, is the optimal
use of the chart). Imagine also that the process is in control. At a specific point in time, you ask yourself: "Of the next three points, what is the probability that I will observe two that are more than two standard deviations of X above the centerline?" The answers to such questions led to the development of Table 2.
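As a concrete check, the normal-theory arithmetic behind most of Table 2 can be reproduced in a few lines. The sketch below is ours, not the article's; it assumes the plotted statistic is standard normal and independent from point to point, and it omits Rule 6 and its variants, whose longest-increasing-subsequence probabilities lack a simple closed form.

```python
# Reproduce most of the Table 2 in-control probabilities, assuming the
# plotted statistic is standard normal and successive points are independent.
from math import comb, erf, factorial, sqrt

def upper_tail(k):
    """P(statistic is more than k standard deviations above the centerline)."""
    return 0.5 * (1.0 - erf(k / sqrt(2.0)))

p1 = upper_tail(1)   # beyond +1 sigma, about 0.1587
p2 = upper_tail(2)   # beyond +2 sigma, about 0.02275

probs = {
    # Rule 1: a point above the upper control limit (+3 sigma).
    "Rule 1":  upper_tail(3),
    # Rule 2: of three consecutive points, at least two beyond +2 sigma.
    "Rule 2":  comb(3, 2) * p2**2 * (1 - p2) + p2**3,
    # Rule 2': two consecutive points beyond +2 sigma.
    "Rule 2'": p2**2,
    # Rule 3: of five consecutive points, at least four beyond +1 sigma.
    "Rule 3":  comb(5, 4) * p1**4 * (1 - p1) + p1**5,
    # Rule 3': four consecutive points beyond +1 sigma.
    "Rule 3'": p1**4,
    # Rule 4: seven consecutive points above the centerline.
    "Rule 4":  0.5**7,
    # Rule 5: six consecutive points strictly increasing; only 1 of the
    # 6! equally likely orderings of six i.i.d. values is increasing.
    "Rule 5":  1.0 / factorial(6),
    # Rule 7: the second of two points at least 4 sigma above the first;
    # the difference of two independent values has standard deviation sqrt(2).
    "Rule 7":  upper_tail(4.0 / sqrt(2.0)),
}

for rule, p in sorted(probs.items(), key=lambda kv: kv[1]):
    print(f"{rule:7s} {100.0 * p:6.3f}%")
```

Sorted by increasing probability, the output agrees with Table 2's ordering and percentages to the printed precision (0.052% for Rule 2', 0.135% for Rule 1, 0.781% for Rule 4, and so on).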
In many process control cultures, Rule 1 has an exalted status compared to that of other rules. For that reason, a crude sensitivity index was also developed. This index compares each pattern's probability of occurrence when the process is in control to the corresponding probability of observing a point above the upper control limit when the process is in control. In other words, the index is:

Sensitivity[Rule A] = P[Pattern in Rule 1] / P[Pattern in Rule A]
Discovering a pattern of a statistic (i.e., a rule) with a sensitivity index value greater than 1 is more surprising than finding
a point outside the control limits. Consequently, when you actually observe such a pattern, there is a strong reason to believe
that at least one special cause of variation is affecting the
process.
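The index can be computed directly from the in-control probabilities. A brief sketch of ours, assuming a normally distributed statistic and using Rule 2' as the example:

```python
# The crude sensitivity index: the in-control probability of the Rule 1
# pattern divided by the in-control probability of the Rule A pattern.
from math import erf, sqrt

def upper_tail(k):
    """P(statistic is more than k standard deviations above the centerline)."""
    return 0.5 * (1.0 - erf(k / sqrt(2.0)))

P_RULE_1 = upper_tail(3)  # a point above the upper control limit

def sensitivity(p_rule_a):
    """Index > 1 means Rule A is a rarer (stronger) in-control event than Rule 1."""
    return P_RULE_1 / p_rule_a

# Rule 2': two consecutive points more than 2 sigma above the centerline.
# The index lands near the 2.60 listed in Table 2.
p_rule_2_prime = upper_tail(2) ** 2
print(f"Sensitivity[Rule 2'] = {sensitivity(p_rule_2_prime):.2f}")
```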
Many quality professionals respond in disbelief to the information in Table 2 because they have grown accustomed to using a single criterion, namely Rule 1, to identify special causes of process variation. They find it difficult to believe that there are other rules that are more sensitive indicators of out-of-control conditions.
Consider, for example, the X chart in Figure 4. Ask a few of
your colleagues who are knowledgeable about SPC to look at
the chart and identify all signals of special-cause variation.
Without exception, they will choose the 15th point; a few will
choose the 12th point; and almost none will choose the seventh
point. What is remarkable about these points is the fact that all
three are strong signals that the process is out of control; of the
three, the point above the upper control limit is the weakest signal (has the smallest sensitivity). In fact, by the time the process
reached the seventh time period, the process output had either
shifted or trended upward, indicating that the process is out of
control. The signal that is triggered by the seventh value is
almost three times as strong as the signal sent by the point
above the upper control limit.1
Even those who are knowledgeable about other rules sometimes ignore them. For example, while working with a process control team at an automobile assembly plant, we observed that only Rule 1 was being used to assess the control status of a large number of processes. We were told, "Management instructed us to use only the out-of-control rule," even though these other rules could give us additional information about the presence of special causes of variation. Frankly, this sort of misunderstanding about both the nature of out-of-control processes and the principles of continuous improvement is far more prevalent than you would expect, given that it has been


Table 3. Rule Probabilities for X Charts When the Process Is in Control

Columns: X normal (sample size is irrelevant); X slightly skewed (n = 5, 10, 25); X seriously skewed (n = 5, 10, 25).

Rule 1: X more than 3σX above the centerline: 0.135% | 0.254%, 0.209%, 0.191% | 0.488%, 0.380%, 0.281%
Rule 1: X more than 3σX below the centerline: 0.135% | 0.021%, 0.046%, 0.080% | 0.000%, 0.008%, 0.035%
Rule 2: of three consecutive values of X, two more than 2σX above the centerline: 0.153% | 0.237%, 0.212%, 0.190% | 0.342%, 0.281%, 0.235%
Rule 2: of three consecutive values of X, two more than 2σX below the centerline: 0.153% | 0.076%, 0.098%, 0.119% | 0.020%, 0.048%, 0.082%
Rule 2': two consecutive values of X more than 2σX above the centerline: 0.052% | 0.080%, 0.072%, 0.064% | 0.117%, 0.096%, 0.080%
Rule 2': two consecutive values of X more than 2σX below the centerline: 0.052% | 0.026%, 0.033%, 0.040% | 0.007%, 0.016%, 0.028%
Rule 3: of five consecutive values of X, four more than 1σX above the centerline: 0.277% | 0.284%, 0.281%, 0.280% | 0.273%, 0.274%, 0.277%
Rule 3: of five consecutive values of X, four more than 1σX below the centerline: 0.277% | 0.284%, 0.284%, 0.278% | 0.268%, 0.273%, 0.276%
Rule 3': four consecutive values of X more than 1σX above the centerline: 0.063% | 0.065%, 0.064%, 0.064% | 0.062%, 0.063%, 0.063%
Rule 3': four consecutive values of X more than 1σX below the centerline: 0.063% | 0.065%, 0.065%, 0.064% | 0.061%, 0.063%, 0.063%
Rule 4: seven consecutive values of X above the centerline: 0.781% | 0.631%, 0.680%, 0.705% | 0.494%, 0.568%, 0.636%
Rule 4: seven consecutive values of X below the centerline: 0.781% | 0.961%, 0.896%, 0.865% | 1.202%, 1.060%, 0.953%
Rule 4'': eight consecutive values of X above the centerline: 0.391% | 0.306%, 0.333%, 0.347% | 0.231%, 0.271%, 0.309%
Rule 4'': eight consecutive values of X below the centerline: 0.391% | 0.495%, 0.457%, 0.439% | 0.639%, 0.554%, 0.491%
Rule 5: six consecutive values of X in a monotone increasing pattern: 0.139% in every column
Rule 5: six consecutive values of X in a monotone decreasing pattern: 0.139% in every column
Rule 6: of 10 consecutive values of X, a subset of eight (reading from left to right) in a monotone increasing pattern: 0.069% in every column
Rule 6: of 10 consecutive values of X, a subset of eight (reading from left to right) in a monotone decreasing pattern: 0.069% in every column
Rule 7: of two consecutive values of X, one more than 4σX larger than the other: 0.234% | 0.229%, 0.230%, 0.232% | 0.282%, 0.261%, 0.243%

Figure 3. Explaining an Unusual Pattern of X

more than 65 years since Walter Shewhart began using control charts at Bell Labs.
We will be the first to admit that, in ranking the rules in terms of their sensitivities, we are making far too much of minuscule differences between probabilities such as P = 0.018% (Rule 6'), P = 0.417% (Rule 6''), and all of the values of P in between. In truth, each of these probabilities is associated with a very unlikely pattern of the statistic when the process is in control; so witnessing such an event should give the control chart analyst ample reason to conclude that special-cause variation is present.

On the other hand, we will gladly accept the criticism of being excessively concerned with minor differences if it will inspire quality professionals to use a broadly based collection of rules to assess process behavior. There is a great deal more to reading a control chart than identifying points outside the control limits. If managers are going to instruct process owners to respond to only one type of out-of-control signal (it is hoped that they will learn enough about continuous improvement issues to recognize the absurdity of that strategy), let them choose the rule that two consecutive points are more than two standard deviations of the statistic away from the center of the process (Rule 2') or any of the other rules that are more sensitive than Rule 1.
To illustrate the importance of using a comprehensive set of
rules to assess process control, consider the X chart in Figure 5.
Notice that:
• None of the 13 rules in Table 2 applies to the patterns in this chart. The inexperienced process control analyst might be tempted to conclude that this is an excellent illustration of an in-control process. But there is strong reason to believe that around the seventh value of X, there was an upward shift in process output equal to one of X's standard deviations.
• Furthermore, after observing 10 values of X beyond the point where the shift took place, and despite using the fairly comprehensive set of rules, there is still no formal information that the shift actually occurred.
In the final analysis, there is no substitute for Brookes' optical trauma:

You know the process is out of control when, Bang!, it hits you between the eyes.
Assessment of the upward shift in the process described by the control chart in Figure 5 is a classic example of Brookes' optical trauma. Although, up to this point, the rules were silent about the presence of a special cause of variation, an alert process owner could easily identify the process shift.

Using only Rule 1 to detect special causes of variation will almost certainly result in the loss of important information. A well-known theorem in statistics states that if only Rule 1 is used to determine the control status of a process, then, on average, an X chart will detect a shift of 1σX in process output approximately 44 samples after the process shift takes place.2 Thus, when it is the only rule used to evaluate a control chart, Rule 1 is a relatively insensitive analytical tool. Undue reliance on it could have a disastrous effect on the output of a process, especially if the process has only marginally acceptable capability. That, in itself, is reason enough to expand the collection of rules that are applied to control chart interpretation.3
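The "approximately 44 samples" figure can be recovered from geometric waiting-time arithmetic. A sketch of ours under the normality assumption: after an upward shift of one standard deviation of X, each new point exceeds the old upper control limit with probability P(Z > 3 − 1), and the average run length (ARL) is the reciprocal of that probability.

```python
# Average run length for detecting a 1-sigma upward shift using only Rule 1.
# After the shift, a point lands above the old +3 sigma limit with
# probability P(Z > 3 - 1); detection time is geometric, so ARL = 1/p.
from math import erf, sqrt

def upper_tail(k):
    """P(standard normal exceeds k)."""
    return 0.5 * (1.0 - erf(k / sqrt(2.0)))

shift = 1.0                          # shift size, in sigma-of-X units
p_detect = upper_tail(3.0 - shift)   # per-sample detection probability
arl = 1.0 / p_detect
print(f"ARL after a 1-sigma shift: {arl:.1f} samples")  # about 44
```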
The rules in Table 2 are not all-inclusive. Rather, they are
simply a fairly standard set that is used more frequently than
other rules. (These other rules would make just as much sense
as this set and could be used just as easily as this set.) In addition, the rules in Table 2 do not constitute the final statement
about the likelihood of a special cause of variation affecting
process output. They are merely practical aids for process owners who use SPC as one of their tools to achieve continuous
process improvement.
As a final note on the rules for identifying special-cause variation, experience indicates that, after Rule 1 (a point outside the control limits), Rule 4 (seven consecutive points on one side of the centerline) is the most frequently used rule on the shop floor. Anyone using Rule 4 should know, however, that it will mistakenly identify instances of random process behavior as special causes of variation much more frequently than any of the other rules. The probability of this outcome when the process is in control (P = 0.781%) is substantially larger than the other probabilities listed in Table 2. Even the probability of eight consecutive values of X on one side of the centerline (P = 0.391%) is somewhat larger than the others.
It was observed earlier that control chart analysts appear to be unusually responsive to values of the statistic that are outside the control limits. Because the probability of a value of X being above the upper control limit is P = 0.135%, perhaps that probability should be considered something of an ideal sensitivity to which the sensitivities of the remaining rules should be compared. In that case, it is noteworthy that the probability of Rule 4''' (nine consecutive points above, or below, the X chart's centerline) is P = 0.195%, which is very close to the probability associated with Rule 1.

Figure 4. X Chart for an Out-of-Control Process
In practice, it is a simple matter to make suggestions about which of the rules, say, Rule 4, 4', 4'', or 4''', the analyst should use. If the process output has high capability and if, historically, the process has not been susceptible to large shifts, Rule 4'' will probably serve the analyst quite well. If there is a change in the supplier of process input, use Rule 4 or even Rule 4' for a short time after the new materials have been incorporated into the process. Similarly, if the process output has a low capability, Rule 4 and Rule 4' are probably satisfactory.
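The in-control probabilities that separate the Rule 4 family come from simple coin-flip arithmetic: for a symmetric in-control distribution, each point independently falls above (or below) the centerline with probability 1/2, so a run of k points on one side occurs with probability (1/2)^k. A quick sketch:

```python
# In-control probability of k consecutive points on one side of the
# centerline, for a symmetric distribution: each point is on a given side
# with probability 1/2, independently.
for k, label in [(6, "Rule 4'"), (7, "Rule 4"), (8, "Rule 4''"), (9, "Rule 4'''")]:
    print(f"{label:9s} {k} consecutive points: {100.0 * 0.5**k:.4f}%")
```

These values round to the 1.563%, 0.781%, 0.391%, and 0.195% cited in the text and Table 2.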
In any event, there are two cardinal rules that everyone concerned with continuous quality improvement must remember:
• The rules in Table 2 are merely guidelines. There is no substitute for the input of a process owner who has an intimate knowledge of virtually every facet of the process. It is a thoughtful, well-trained, empowered process owner for whom Brookes' optical trauma is most pertinent.
• Discovering that a process is out of control is not a terrible event. It should not be hidden from supervisors, managers, auditors, quality control experts, or, most important, customers. In a sense, it is an event that should be celebrated because it gives the process owner an opportunity to improve the process. It is very frustrating to encounter manufacturing, assembly, and service-delivery cultures in which out-of-control processes are treated like horrible diseases that should be hidden from everyone except the quality professional.

Figure 5. Out-of-Control Process Shift


How to bend the rules for asymmetric distributions


The primary assumption underlying the calculation of the
probabilities in Table 2 was that the distribution of the process
output from which the samples were selected was approximately symmetric. The conclusion of the normal theory (when the
process output is normally distributed) and the central limit theorem (when the process output is not normally distributed) is
that the distribution of X will be approximately normally distributed. It is this fact and the inherent probability structure of
normal distributions that provide the basis for the probability
calculations in Table 2.
On the other hand, many natural distributions of process output are not normally distributed. Indeed, there are times when the distribution of the process output is sufficiently asymmetric to call into question the validity of the rules that are typically used to assess process control. To explore such possibilities, two distributions, a slightly skewed distribution and a seriously skewed distribution, will be discussed. These two distributions, which are actually variations of Weibull distributions, are displayed in Figure 6, along with the sampling distributions of X based on samples of size n = 5, n = 10, and n = 25.
In Figure 6, it is easy to see that the distribution of X for the
slightly skewed process output looks much more like a normal
distribution as n increases. Despite this, it is visually apparent
that the distribution of X is skewed to the right, even when n is
as large as 10. It is somewhat fatter in the tails (i.e., X has a
greater probability of being an extreme value) than a normal
distribution, even when n = 10.
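The resulting tail asymmetry can be explored by simulation. The sketch below is ours, with an illustrative Weibull shape chosen for strong right skew (the article does not publish the parameters of its two Weibull distributions): the estimated probability of the sample mean falling above the three-sigma upper limit is several times the nominal 0.135%, while falling below the lower limit becomes vanishingly rare, the same qualitative pattern as the skewed columns of Table 3.

```python
# Monte Carlo sketch of Rule 1's tail asymmetry when process output is
# skewed to the right. The Weibull shape is illustrative only; the article
# does not publish the parameters of its two Weibull distributions.
import random
from statistics import mean, stdev

random.seed(1)
SHAPE, SCALE = 1.0, 1.0        # shape 1.0 is a strongly right-skewed Weibull
N, TRIALS = 5, 100_000         # sample size and number of simulated samples

# Simulate the sampling distribution of the sample mean for samples of size N.
# Note: random.weibullvariate takes (alpha=scale, beta=shape).
xbars = [mean(random.weibullvariate(SCALE, SHAPE) for _ in range(N))
         for _ in range(TRIALS)]
center, sigma = mean(xbars), stdev(xbars)

upper = sum(x > center + 3.0 * sigma for x in xbars) / TRIALS
lower = sum(x < center - 3.0 * sigma for x in xbars) / TRIALS
print(f"P(above UCL) = {upper:.3%}   P(below LCL) = {lower:.3%}")
```

With this shape, the lower three-sigma limit actually sits below zero, so the lower-tail probability is essentially zero while the upper tail is inflated well past 0.135%.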
The probabilities of the seven rules for the slightly skewed
and seriously skewed process output and for a normally distributed process output are shown in Table 3. The primary reason
for examining these probabilities is to develop a sense for how
you should bend the rules for identifying special-cause variation
when the distribution of the process output is skewed. Although
the comments here will focus on distributions that are skewed to
the right, there is an obvious flip-side assessment when the distributions are skewed to the left. Furthermore, since a great
many control charts are based on samples of size n = 1 (every X
and MR chart, for example), n = 3, or n = 5, attention will be
focused on the smaller sample sizes. For convenience, Rules 1,
2, 2, 3, and 3 will be referred to as the zone rules.
In Table 3, observe that for both of the skewed distributions,
almost all of the conditions have small probabilities (less than
1%) when the process is in control. Therefore, even if the usual
seven rules for identifying special-cause variation were used,
there would be a relatively small cost associated with making
non-optimal decisions. Nevertheless, there is some advantage in
being a more sophisticated analyst by knowing that:
• Unless the process is susceptible to small shifts in its center or unless the process output has low capability, Rule 4'' instead of Rule 4 should be used. For a process characteristic whose distribution is skewed to the right, it is very difficult to make a case for responding to seven consecutive values of X below the centerline. The analyst should, however, respond to eight consecutive points above the centerline of such a process.
• The probabilities of the extreme zone rules (Rules 1, 2, and 2') are unusually small for patterns below the centerline when process output is skewed to the right. While all of the zone rules' probabilities are small and, therefore, require responses to suspected special causes of variation, observing a pattern below the centerline is next to impossible unless the process is out of control. In such cases, the analyst should be willing to wager next month's salary that the process center has dipped below the centerline of the chart.
• Independent of the extent of the skewness, when a pattern of values of X is consistent with either Rule 5 or Rule 6, it is almost guaranteed that there is a trend in the process. In other words, these two rules signal the presence of trends and are very sensitive indicators of special-cause variation. Keep in mind, however, that Rules 5 and 6 are the least effective of all of the rules for identifying the process behavior they are designed to recognize. For example, there is an obvious trend in the data displayed in Figure 7, yet more than 11 samples after the trend began, neither of the trend rules sent a signal to that effect. This demonstrates the importance of both Rule 3 (which has been violated three times) and Brookes' optical trauma.
• When the distribution of the process output is seriously skewed to the right, it is not unusual to observe a point above the upper control limit when the process is in control. While we would not want to be accused of encouraging anyone to ignore opportunities to improve the process by removing special-cause variation, we cannot overlook the fact that, for small sample sizes, both Rule 2 and Rule 3 are much more sensitive to upward movements in the process center than is Rule 1.

Figure 6. Distributions of X for Slightly Skewed and Seriously Skewed Distributions

No matter whether the distribution of the process output is symmetric or skewed, Rule 1 should be viewed as merely a single indicator, and not even the most important one, in a comprehensive set of rules that can help analysts identify special-cause variation. Frankly, singling out Rule 1 as an indicator that can be used by itself to assess process control is intellectually naive and borders on being dishonest.

The 10 commandments of SPC

On a personal level, we were inspired to write this two-part article because our experience indicated that a sizable majority
of quality professionals are not knowledgeable about fairly
basic issues of statistics and SPC. Our instructional activities in
a broad range of academic, industrial, and service delivery environments have convinced us that there are many individuals
who are "doing SPC" without understanding what it is about. It
is not surprising to encounter, not just a few, but many individuals who have been entrusted with continuous improvement
responsibilities who cannot define an in-control process, who
cannot accurately distinguish between process control and
process capability, who cannot distinguish between process
capability and product capability, who do not understand the
basic structure of a control chart, who do not have practical
knowledge of the fundamental theorem of SPC, and who do not
understand the significance and relative importance of various
signals of special causes of variation. And why should they?
Our review of a very large number of SPC textbooks reveals
page after page of cookbook discussions of practically everything under the sun, with very little discussion of the foundation of SPC.
Although it is disappointing that the technical content of the
quality improvement discipline has progressed so little during
the past 65 years, that is probably not the most significant problem facing process control initiatives during the next decade.
Instead, there is every reason to be concerned that many quality
professionals are directing continuous process improvement
activities without a sound understanding of the basic issues.
Since the subject of this two-part article has been rather
broad, its conclusions can be summarized in a list of the 10
commandments of SPC:
I. Thou shalt not elevate the importance of process output
above the importance of the process itself.
II. Thou shalt not confuse the issues of process control and
process capability.
III. Thou shalt not use the output of an out-of-control
process to construct a control chart.
IV. Thou shalt not draw specification limits on any control
chart.
V. Thou shalt not adulterate thy data to make an out-of-control process appear to be in control.
VI. Thou shalt not worship the rule of a point outside the
control limits.
VII. Thou shalt not bear false witness against thy data.
VIII. Thou shalt not ignore the distribution of process output.
IX. Thou shalt not ignore the presence of special-cause
variation.

X. Thou shalt not conceal an out-of-control process from thy supervisor, manager, or customer.

Figure 7. Out-of-Control Process Trend

References
1. Any argument in support of the mathematical accuracy of this relative comparison of sensitivities will be vulnerable to the counterargument that the measurement scale for pattern sensitivity is probably not a ratio-level scale.
2. See Thomas P. Ryan, Statistical Methods for Quality Improvement (New York, NY: John Wiley and Sons, 1989), pp. 102-104.
3. For other perspectives on this issue, see Robert B. Davis and William H. Woodall, "Performance of the Control Chart Trend Rule Under Linear Shift," Journal of Quality Technology, Vol. 20, No. 4, October 1988, and Andrew C. Palm, "Tables of Run Length Percentiles for Determining the Sensitivity of Shewhart Control Charts for Averages with Supplementary Runs Rules," Journal of Quality Technology, Vol. 22, No. 4, October 1990.

Robert W. Hoyer is the president of Decision Dynamics Inc. in Ann Arbor, MI. He received a doctorate in statistics from the Virginia Polytechnic Institute in Blacksburg.

Wayne C. Ellis is a professor of statistics at Eastern Michigan University in Ypsilanti. He received a doctorate in mathematics from the University of Michigan in Ann Arbor.
