
The Prediction Problems of Earthquake System Science
Editor's note: The following is the text of the SSA Presidential Address presented at the Annual Luncheon of the Seismological Society of America (SSA) Annual Meeting on 30 April 2014.
The Seismological Society of America (SSA) has always been dedicated to understanding and reducing the earthquake threat. The Society was founded in 1906 for "the acquisition and diffusion of knowledge concerning earthquakes and allied phenomena." According to our new strategic plan, approved by the Board in 2012, the core purpose of SSA is "to advance seismology and the understanding of earthquakes for the benefit of society." This plan lays out the vision for SSA to be "the primary forum for the assembly, exchange, and dissemination of scientific knowledge essential for an earthquake-aware and safer world."
In the past twenty years or so, the study of earthquakes has become a true system science, offering new pathways for the advancement of seismology. Today I would like to explore what the rise of earthquake system science might imply for the future of our field and for SSA's mission in earthquake research.
System science seeks to explain phenomena that emerge
from nature at the system scale, such as global climate change
or earthquake activity in California or Alaska. The system is
not a physical reality, but a hypothetical representation of
nature, typically a numerical model that replicates an emergent
behavior and predicts its future course.
The choice of target behavior determines the system model, as can be illustrated by two representations of earthquake activity in California. One is UCERF3, the latest uniform California earthquake rupture forecast of the Working Group on California Earthquake Probabilities, which represents future earthquake activity in terms of time-dependent fault-rupture probabilities. Another is the Southern California Earthquake Center's (SCEC) CyberShake ground-motion model, which uses simulations to represent the probability of future earthquake shaking at geographic sites, conditional on the fault rupture. These two system-level models can be combined to generate site-specific hazard curves, the main forecasting tool of probabilistic seismic-hazard analysis (PSHA).
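As a minimal sketch of how these two ingredients combine, the snippet below (with invented rupture probabilities, intensity levels, and conditional exceedance probabilities, none taken from UCERF3 or CyberShake) applies the total-probability bookkeeping that turns a rupture forecast plus a conditional ground-motion model into a site-specific hazard curve.

```python
# Minimal sketch (illustrative numbers only) of how a rupture forecast and a
# conditional ground-motion model combine into a site-specific hazard curve.
import numpy as np

# Hypothetical rupture forecast: probability that each rupture occurs during
# the forecast interval (the UCERF-style ingredient).
rupture_prob = np.array([0.02, 0.005, 0.001])

# Hypothetical ground-motion model: for each rupture, the probability that
# shaking at the site exceeds each intensity level, given that the rupture
# occurs (the CyberShake-style ingredient).
intensity_levels = np.array([0.1, 0.2, 0.4, 0.8])   # e.g., spectral acceleration in g
p_exceed_given_rupture = np.array([
    [0.9, 0.6, 0.2, 0.03],
    [0.7, 0.3, 0.05, 0.005],
    [0.5, 0.1, 0.01, 0.001],
])

# Probability that at least one rupture produces shaking above each level,
# assuming the ruptures occur independently of one another.
p_no_exceed = np.prod(1.0 - rupture_prob[:, None] * p_exceed_given_rupture, axis=0)
hazard_curve = 1.0 - p_no_exceed

for level, p in zip(intensity_levels, hazard_curve):
    print(f"P(shaking > {level:.1f} g) = {p:.4f}")
```

Production PSHA carries out the same product-and-sum over hundreds of thousands of ruptures, many intensity measures, and full probability distributions, but the underlying logic is no more than this.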
The first point to emphasize is that earthquake system science is all about forecasting and prediction. For many years now, earthquake prediction has remained an awkward topic in polite seismological company, primarily because it has been defined in the public mind by something we cannot do, which is to predict with high probability the regional occurrence of large earthquakes over the short term. Yet the P-word is too central to our science to be banned from our working vocabulary. From a practical perspective, we must be able to predict earthquake hazards in order to lower seismic risk. From the basic-research perspective of system science, testing a model's predictions against new data is the principal means by which we can gain confidence in the hypotheses and theories on which the model is built.
For example, many interesting problems of contingent predictability can be posed as physics questions in a system-specific context. What will be the shaking intensity in the Los Angeles basin from a magnitude 7.8 earthquake on the southern San Andreas fault? By how much will the strong shaking be amplified by the coupling of source directivity to basin effects? Will deep injection of waste fluids cause felt earthquakes near a newly drilled well in Oklahoma? How intense will the shaking be during the next minute of an ongoing earthquake in Seattle? SSA should stake its claim as the central forum for the physics-based study of earthquake predictability, and its publications should be the place where progress in understanding predictability is most rigorously documented.
My second point is that forecasting and prediction are all about probabilities. The deep uncertainties intrinsic to earthquake forecasting are most coherently expressed in terms of two distinct types of probability: the aleatory variability that describes the randomness of the system, and the epistemic uncertainty that characterizes our lack of knowledge about the system. In UCERF3, the former is cast as the time-dependent probabilities of fault ruptures, of which there are over 250,000, whereas the latter is expressed as a logic tree with 5760 alternative branches. Similarly, CyberShake represents the aleatory variability in wave excitation through conditional hypocenter distributions and conditional slip distributions, and it characterizes the epistemic uncertainty in the wavefield calculations in terms of alternative 3D seismic-velocity models.
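A toy example may help keep the two kinds of probability distinct. In the sketch below, all weights and rates are invented, and the three branches stand in for the thousands in a real logic tree: the branch weights carry the epistemic uncertainty about which model is right, while the within-branch occurrence probability carries the aleatory variability of the earthquake process itself.

```python
# Toy illustration (invented weights and rates) of the two kinds of probability:
# epistemic uncertainty as weighted logic-tree branches, aleatory variability as
# the randomness within each branch's model.
import numpy as np

# Three hypothetical logic-tree branches (alternative models) with expert weights.
branch_weights = np.array([0.5, 0.3, 0.2])       # epistemic: which model is right?
branch_rates = np.array([0.010, 0.015, 0.030])   # each branch's assumed annual event rate

# Aleatory probability of at least one event in 50 years within each branch,
# assuming Poisson occurrence.
p_event_50yr = 1.0 - np.exp(-branch_rates * 50.0)

# The branch-weighted mean collapses the epistemic spread into one number;
# preserving that spread is exactly what the logic tree is for.
print("per-branch 50-yr probabilities:", np.round(p_event_50yr, 3))
print("weighted mean:", round(float(branch_weights @ p_event_50yr), 3))
```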
The full-3D treatment of seismic-wave propagation has the potential to improve our PSHA models considerably. A variance-decomposition analysis of the recent CyberShake results indicates that more accurate earthquake simulations could reduce the aleatory variance of the strong-motion predictions
by at least a factor of 2 relative to the empirical ground-motion prediction equations in current use; other factors being equal, this would lower the exceedance probabilities at high-hazard levels by an order of magnitude. The practical ramifications of this probability gain for the formulation of risk-reduction strategies could be substantial.
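The order-of-magnitude claim follows from the lognormal form of ground-motion variability, as the following back-of-the-envelope sketch shows (the residual standard deviation and the chosen hazard level are hypothetical, not CyberShake values): halving the aleatory variance shrinks the upper tail of the shaking distribution, so the exceedance probability at a fixed high shaking level drops by a factor of a few tens.

```python
# Back-of-the-envelope illustration (not the actual CyberShake analysis) of why
# halving the aleatory variance of a lognormal ground-motion model lowers
# exceedance probabilities at high hazard levels by roughly an order of magnitude.
from math import erf, sqrt

def p_exceed(z):
    """Probability that a standard normal variate exceeds z."""
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

sigma_old = 0.6                     # hypothetical log-residual standard deviation
sigma_new = sigma_old / sqrt(2.0)   # variance reduced by a factor of 2

# A high-hazard shaking level, 2.5 old standard deviations above the median.
z_level = 2.5 * sigma_old

p_old = p_exceed(z_level / sigma_old)
p_new = p_exceed(z_level / sigma_new)
print(f"exceedance probability, old sigma: {p_old:.2e}")
print(f"exceedance probability, new sigma: {p_new:.2e}")
print(f"reduction factor: {p_old / p_new:.0f}x")
```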
The coherent representation of aleatory variability and epistemic uncertainty in physics-based hazard models involves massive forward and inverse calculations, typically requiring very large ensembles of deterministic simulations. For example, a CyberShake hazard model for the Los Angeles region involves the computation of about 240 million synthetic seismograms. These calculations have been made feasible by the development of clever algorithms based on seismic reciprocity and highly optimized anelastic wave-propagation codes, but they still strain the capabilities of the world's fastest supercomputers, which are currently operating at petascale (10^15 floating-point operations per second).
It is important to realize that our community's needs for computation are growing more rapidly than our nation's supercomputer resources. In this year alone, for example, SCEC simulations will consume almost 200 million core-hours on National Science Foundation (NSF) supercomputers such as Blue Waters and Department of Energy (DOE) supercomputers such as Titan. As we move toward exascale computing, the machine architectures will become more heterogeneous and difficult to code, and the workflows will increase in complexity. To an ever-increasing degree, progress in earthquake system science will depend on deep, sustained collaborations among the seismologists and computational scientists focused on extreme-scale computing. SSA should think carefully about how to accommodate such interdisciplinary collaborations within its structure, and it will need to work with NSF, DOE, and other government agencies to make sure our computational capabilities are sufficient for the demands of physics-based PSHA.
PSHA occupies a central position in the universe of seismic-risk reduction. However, recent earthquake disasters have reinvigorated a long-standing debate about PSHA methodology. Many practical deficiencies have been noted, not the least of which is the paucity of data for retrospective calibration and prospective testing of long-term PSHA models. But some critics have raised the more fundamental question of whether PSHA is misguided because it cannot capture the aleatory variability of large-magnitude earthquakes produced by complex fault systems. Moreover, the pervasive role of subjective probabilities and expert opinion in specifying the epistemic uncertainties in PSHA has made this methodology a target for scientists who adhere to a strictly frequentist view of probabilities. According to some of these critics, PSHA should be replaced by neodeterministic hazard estimates based on a maximum credible earthquake.
As Warner Marzocchi pointed out in an Eos article last July, neodeterministic SHA is not an adequate replacement for probabilistic SHA. The choice of a maximum credible earthquake requires uncertain assumptions, such as choosing a return period, which essentially fix the level of acceptable risk. This black-and-white approach is fundamentally flawed because it conflates the role of scientific advisor with that of a decision maker, mixing scientific judgments with political and economic choices that lie outside the domain of science. Fully probabilistic descriptions, such as those given by PSHA, are needed for two reasons: first, to avoid unintended and often uninformed decision making in the tendering of scientific forecasts, and second, to provide decision makers, including the public, with a complete rendering of the scientific information they need to balance the costs and benefits of risk-mitigation actions.
We may never be able to predict the impending occurrence of extreme earthquakes with any certainty, but we do know that earthquakes cluster in space and time, and that earthquake probabilities can locally increase by a thousand-fold during episodes of seismicity. The lessons of L'Aquila and Christchurch make clear that this information must be delivered to the public quickly, transparently, authoritatively, and on a continuing basis. Systems for this type of operational earthquake forecasting (OEF) are being developed in several countries, including Italy, New Zealand, and the United States, and they raise many questions about how to inform decision making in situations where the probability of a significant earthquake may go way up in a relative sense but still remain very low (<1% per day) in absolute terms.
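The contrast between relative gain and absolute probability is worth making explicit with numbers; those in the sketch below are hypothetical but of the right order of magnitude.

```python
# Hypothetical illustration of the OEF communication problem: a large relative
# probability gain that still leaves a small absolute probability.
background_prob_per_day = 5e-6   # assumed long-term daily probability of a damaging quake nearby
probability_gain = 1000          # assumed clustering-driven gain during an active sequence

elevated_prob_per_day = background_prob_per_day * probability_gain
print(f"background: {background_prob_per_day:.4%} per day")   # 0.0005% per day
print(f"elevated:   {elevated_prob_per_day:.1%} per day")     # 0.5% per day, still below 1%
```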
As we usher in new technologies that will spew out predictive information in near real time (and here we should include earthquake early warning (EEW) systems as well as OEF), the need to engage the public has never been more critical.
We must continually educate the public and bring them into the conversation about what can, and cannot, be foretold about earthquake activity.
Toward this end, SSA should increase its role in communicating the science that underlies OEF and EEW. In particular, it should provide a roundtable for seismologists to interact with social scientists and risk-communication experts in helping the responsible government agencies translate uncertain probabilistic forecasts into effective risk-mitigation actions.
This brings me to my final point, which concerns the importance of rigorous forecast validation. Validation involves testing whether a forecasting model replicates the earthquake-generating process well enough to be sufficiently reliable for some useful purpose, such as OEF or EEW. Since 2006, a new international organization, the Collaboratory for the Study of Earthquake Predictability (CSEP), has been developing the cyberinfrastructure needed for the prospective testing of short-term earthquake forecasts. CSEP testing centers have been set up in Los Angeles, Wellington, Zürich, Tokyo, and Beijing, and more than 380 short-term forecasting models are being prospectively evaluated against authoritative seismicity catalogs in natural laboratories around the world. CSEP experiments
have validated the probability gains of short-term forecasting models that are being used, or will be used, in OEF. Moreover, the Collaboratory is capable of supporting OEF and EEW by providing an environment for the continual testing of operational models against alternatives. However, U.S. participation in CSEP has thus far been primarily funded by a private organization, the W. M. Keck Foundation, and stable support for its long-term mission is not guaranteed.
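For readers unfamiliar with how such prospective evaluation works, here is a minimal sketch, not the actual CSEP test suite, of the kind of likelihood scoring involved: a gridded rate forecast is scored against observed counts with a Poisson log-likelihood and compared with a reference model. All rates and the synthetic "observed" catalog below are invented for illustration.

```python
# Minimal sketch (not the actual CSEP test suite) of prospective forecast scoring:
# a Poisson log-likelihood of a gridded rate forecast against observed counts,
# compared with an uninformative reference model.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)

# Hypothetical expected counts per space-magnitude bin for two forecasts.
forecast_a = rng.uniform(0.01, 0.5, size=100)   # candidate short-term model
forecast_b = np.full(100, forecast_a.mean())    # uninformative reference model

# Hypothetical "observed" catalog counts (drawn here from forecast A for illustration).
observed = rng.poisson(forecast_a)

def log_likelihood(expected, observed_counts):
    """Joint Poisson log-likelihood of the observed bin counts."""
    return poisson.logpmf(observed_counts, expected).sum()

ll_a = log_likelihood(forecast_a, observed)
ll_b = log_likelihood(forecast_b, observed)

# Probability gain per earthquake of model A over the reference model.
gain = np.exp((ll_a - ll_b) / observed.sum())
print(f"log-likelihood A: {ll_a:.1f}, reference: {ll_b:.1f}, gain per event: {gain:.2f}")
```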
Of course, extreme earthquakes are very rare, so it will be a while before enough instrumental data have accumulated to properly test our long-term forecasts. However, as Dave Jackson argued in a paper presented at this meeting, the earthquake hiatus in California suggests the current UCERF model inadequately represents the large-scale interactions that are modulating the earthquake activity of the San Andreas fault system. Use of paleoseismology to extend the earthquake record back into geologic time is a clear priority. The SSA should be the home for this type of historical geophysics.
It is also urgent that we increase the spatial scope of our
research to compensate for our lack of time. One goal of SSA
should be to join forces with CSEP and other international
efforts, such as the Global Earthquake Model (GEM) project,
in fostering comparative studies of fault systems around the
world. The issue is not whether to focus on the prediction
problems of earthquake system science, but how to accomplish
this research in a socially responsible way according to the most
rigorous scientific standards.
I call upon a new generation of seismologists, the students and early-career scientists in this room, to take on the challenges of earthquake system science. You are fortunate to be in a field where the basic prediction problems remain mostly unsolved and major discoveries are still possible. You are also fortunate to have access to vast new datasets and tremendous computational capabilities for attacking these problems. System-level models, such as those I have described here, will no doubt become powerful devices in your scientific arsenal.
However, these models can be big and unwieldy, requiring a scale of expertise and financial resources that is rarely available to one scientist or a small research group. This raises a number of issues about how to organize the interdisciplinary, multi-institutional efforts needed to develop these models. In particular, all of us at SSA need to make sure that any research structure dominated by earthquake system science allows you, as the rising leaders in this field, to develop new ideas about how earthquake systems actually work.
Thomas H. Jordan
Southern California Earthquake Center
University of Southern California
Los Angeles, California 90089-0742 U.S.A.
tjordan@usc.edu
