To elucidate practical issues and decisions related to the application of SEM, and to highlight certain issues that researchers should consider more seriously, we present
a modified decision-making framework. Specifically, SEM
applications typically follow a five-step process (Bollen and
Figure 1. Cudeck–Henly (1991) Representation of the Relationship Between a Model and the Real World
Causal Inferences
Appropriate use of SEM also entails a priori consideration
of the models one plans to test, the nature of the manifest
variables, and the sampling design. Far too often, it is
only after data have been collected and the first attempt at
model estimation conducted that researchers consider the
adequacy and compatibility of the research problem, theoretical framework, population sampled, manifest variables,
and plausible models.
Marketing researchers often construct theoretical cause-and-effect statements involving networks of latent variables
and evaluate the networks using cross-sectional data in
structural equation models. In fact, in its earlier days,
researchers often referred to SEM as causal modeling, and
the path-analytic structures of structural equation models
clearly imply causal flows from exogenous to endogenous
constructs. However, deriving causal inferences from
cross-sectional data can be fraught with risk, especially
when there is an implicit (nonexplicated) assumption that
changes in one construct cause changes in another construct
(e.g., Cliff 1983). Doing so requires assuming that
the causal relationship is instantaneous, that homeostasis
has occurred, or, in the case of self-report data, that study
participants can reasonably be asked to provide information simultaneously on previous, present, and future behaviors. Such assumptions appear unwarranted in
most cross-sectional SEM applications. Indeed, researchers
using SEM techniques should keep in mind that causality
can never be conclusively established from correlational
(including longitudinal) data.
Closely related to the use of temporally lagged dependent variables to facilitate causal inferences is the use of
multiple data sources for the same purpose. For example,
it is common in studies of individual performance in sales
management or services marketing contexts to employ
manager ratings or objective measures of performance in
order to avoid common-method bias and facilitate causal
inferences, which are generally implicit in such models.
Even though it is clearly commendable to incorporate different data sources in this manner, causal inferences are
still subject to the same caveat as noted above. Further, this
type of model generally specifies a path-analytic (causal)
flow linking other constructs, which are represented as
antecedents of performance and are operationalized using
data taken from the same source (e.g., individual salespeople
or service providers). The causal relationships implied to
exist among such constructs are subject to stronger caveats
than those that link the antecedents to performance, but
researchers tend to accept such inferences as long as the
ultimate dependent variable is measured using data from
a different source.
The issue of variable timing in causal structural equation
models, regardless of whether cross-sectional or longitudinal data are being analyzed, is one that requires significant
research. Although cross-lagged relationships can lead to
meaningful causal inferences, it is important to recall
that a temporal sequence of data is not by itself sufficient to infer causality. There may be intervening
constructs or constructs acting on both the hypothesized
cause-and-effect constructs. One is reminded of the research
and counterresearch that appeared in the sales forecasting
literature over the proper time lag to use when predicting
sales responses to marketing efforts (and the demonstrations
that the time lag employed had a significant effect on sales
response parameter estimates).
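The caveat that a temporal lag alone cannot license a causal claim can be illustrated with a small simulation (a hypothetical sketch, not drawn from the original study; all variable names are illustrative). When an unobserved construct drives both the presumed cause and, one period later, the presumed effect, a substantial cross-lagged correlation emerges even though no causal path links the two measured variables.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Hypothetical setup: a latent confounder u acts on both the
# presumed cause x (measured at time t) and the presumed effect y
# (measured at time t + 1). x has NO causal effect on y.
u = rng.normal(size=n)       # unobserved common cause
x = u + rng.normal(size=n)   # x_t      = u + noise
y = u + rng.normal(size=n)   # y_{t+1}  = u + noise

# The cross-lagged correlation corr(x_t, y_{t+1}) is substantial
# (0.5 in expectation: cov = var(u) = 1, each sd = sqrt(2)),
# despite the absence of any causal path from x to y.
r = np.corrcoef(x, y)[0, 1]
print(f"cross-lagged correlation: {r:.2f}")
```

A model that specified a causal path from x to y would fit these data well, which is precisely why temporal ordering alone cannot rule out intervening or common-cause constructs.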
study. Specifically, we noted important limitations regarding the use of SEM in light of certain research objectives and
conditions. Although SEM can facilitate the evaluation of complex structures of relationships
and explicitly model measurement error, it is important to
recognize the limitations inherent in this capability.
In this regard, we hope that our review serves as a useful
reminder of best practices when applying SEM.
NOTE
1. The Chamberlin (1965) reference cited here is a reprint of
an article published in 1890 in Science.
REFERENCES
Allison, Paul D. (2003), "Missing Data Techniques for Structural Equation Modeling," Journal of Abnormal Psychology, 112 (November), 545–557.
Anderson, James C., and David W. Gerbing (1988), "Structural Equation Modeling in Practice: A Review and Recommended Two-Step Approach," Psychological Bulletin, 103 (May), 411–423.
Armstrong, J. Scott, Roderick J. Brodie, and Andrew G. Parsons (2001), "Hypotheses in Marketing Science: Literature Review and Publication Audit," Marketing Letters, 12 (May), 171–187.
Bagozzi, Richard P. (1980), Causal Models in Marketing, New York: John Wiley & Sons.
Bagozzi, Richard P., and Youjae Yi (1988), "On the Evaluation of Structural Equation Models," Journal of the Academy of Marketing Science, 16 (Spring), 74–94.
Bauer, Daniel J. (2003), "Estimating Multilevel Linear Models as Structural Equation Models," Journal of Educational and Behavioral Statistics, 28 (Summer), 135–167.
Bekker, Paul A., Arien Merckens, and Tom J. Wansbeek (1994), Identification, Equivalent Models and Computer Algebra, Boston: Academic Press.
Blalock, Hubert M., Jr. (1979), "The Presidential Address: Measurement and Conceptualization Problems: The Major Obstacle to Integrating Theory and Research," American Sociological Review, 44 (December), 881–894.
Blalock, Hubert M., Jr. (1986), "Multiple Causation, Indirect Measurement and Generalizability in the Social Sciences," Synthese, 68 (July), 13–36.
Bollen, Kenneth A., and J. Scott Long, eds. (1993), Testing Structural Equation Models, Newbury Park, CA: Sage.
Bollen, Kenneth A., and Kwok-fai Ting (2000), "A Tetrad Test for Causal Indicators," Psychological Methods, 5 (March), 3–22.
Boomsma, Anne (2000), "Reporting Analyses of Covariance Structures," Structural Equation Modeling, 7 (3), 461–483.
Browne, Michael W., and Steven H.C. du Toit (1992), "Automated Fitting of Nonstandard Models," Multivariate Behavioral Research, 27 (April), 269–300.
Butner, Jonathan, Polemnia G. Amazeen, and Genna M. Mulvey (2005), "Multilevel Modeling of Two Cyclical Processes: Extending Differential Structural Equation Modeling to
Pedhazur, Elazar J., and Liora Pedhazur Schmelkin (1991), Measurement, Design, and Analysis: An Integrated Approach, Hillsdale, NJ: Lawrence Erlbaum.
Peterson, Robert A. (2005), "Response Construction in Consumer Behavior Research," Journal of Business Research, 58 (March), 348–353.
Platt, John R. (1964), "Strong Inference," Science, 146 (October 16), 347–353.
Podsakoff, Philip M., Scott B. MacKenzie, Jeong-Yeon Lee, and Nathan P. Podsakoff (2003), "Common Method Biases in Behavioral Research: A Critical Review of the Literature and Recommended Remedies," Journal of Applied Psychology, 88 (September), 879–903.
Quintana, Stephen M., and Scott E. Maxwell (1999), "Implications of Recent Developments in Structural Equation Modeling for Counseling Psychology," Counseling Psychologist, 27 (July), 485–527.
Rozeboom, William W. (2005), "Meehl on Metatheory," Journal of Clinical Psychology, 61 (October), 1317–1354.
Saris, Willem E., Albert Satorra, and Dag Sörbom (1987), "The Detection and Correction of Specification Errors in Structural Equation Models," in Sociological Methodology, Clifford C. Clogg, ed., Washington, DC: American Sociological Association, 105–129.
Satorra, Albert, and Peter M. Bentler (1994), "Corrections to Test Statistics and Standard Errors in Covariance Structure Analysis," in Latent Variables Analysis: Applications for Developmental Research, Alexander von Eye and Clifford C. Clogg, eds., Thousand Oaks, CA: Sage, 399–419.