The Prediction Problems of Earthquake System Science

doi: 10.1785/0220140088

Editor's note: The following is the text of the SSA Presidential Address presented at the Annual Luncheon of the Seismological Society of America (SSA) Annual Meeting on 30 April 2014.

The Seismological Society of America (SSA) has always been dedicated to understanding and reducing the earthquake threat. The Society was founded in 1906 "for the acquisition and diffusion of knowledge concerning earthquakes and allied phenomena." According to our new strategic plan, approved by the Board in 2012, the core purpose of SSA is to "advance seismology and the understanding of earthquakes for the benefit of society." This plan lays out the vision for SSA to be "the primary forum for the assembly, exchange, and dissemination of scientific knowledge essential for an earthquake-aware and safer world."

In the past twenty years or so, the study of earthquakes has become a true system science, offering new pathways for the advancement of seismology. Today I would like to explore what the rise of earthquake system science might imply for the future of our field and for SSA's mission in earthquake research.

System science seeks to explain phenomena that emerge from nature at the system scale, such as global climate change or earthquake activity in California or Alaska. The "system" is not a physical reality, but a hypothetical representation of nature, typically a numerical model that replicates an emergent behavior and predicts its future course.

The choice of target behavior determines the system model, as can be illustrated by two representations of earthquake activity in California. One is UCERF3, the latest uniform California earthquake rupture forecast of the Working Group on California Earthquake Probabilities, which represents future earthquake activity in terms of time-dependent fault-rupture probabilities. Another is the Southern California Earthquake Center (SCEC)'s CyberShake ground-motion model, which uses simulations to represent the probability of future earthquake shaking at geographic sites, conditional on the fault rupture. These two system-level models can be combined to generate site-specific hazard curves, the main forecasting tool of probabilistic seismic-hazard analysis (PSHA).

The first point to emphasize is that earthquake system science is all about forecasting and prediction. For many years now, earthquake prediction has remained an awkward topic in polite seismological company, primarily because it has been defined in the public mind by something we cannot do, which is to predict with high probability the regional occurrence of large earthquakes over the short term. Yet the "P-word" is too central to our science to be banned from our working vocabulary. From a practical perspective, we must be able to predict earthquake hazards in order to lower seismic risk. From the basic-research perspective of system science, testing a model's predictions against data is the principal means by which we can gain confidence in the hypotheses and theories on which the model is built. System science offers a new brick-by-brick approach to building up our understanding of earthquake predictability.

For example, many interesting problems of contingent predictability can be posed as physics questions in a system-specific context. What will be the shaking intensity in the Los Angeles basin from a magnitude 7.8 earthquake on the southern San Andreas fault? By how much will the strong shaking be amplified by the coupling of source directivity to basin effects? Will deep injection of waste fluids cause felt earthquakes near a newly drilled well in Oklahoma? How intense will the shaking be during the next minute of an ongoing earthquake in Seattle? SSA should stake its claim as the central forum for the physics-based study of earthquake predictability, and its publications should be the place where progress in understanding predictability is most rigorously documented.

My second point is that forecasting and prediction are all about probabilities. The deep uncertainties intrinsic to earthquake forecasting are most coherently expressed in terms of two distinct types of probability: the aleatory variability that describes the randomness of the system, and the epistemic uncertainty that characterizes our lack of knowledge about the system. In UCERF3, the former is cast as the time-dependent probabilities of fault ruptures, of which there are over 250,000, whereas the latter is expressed as a logic tree with 5760 alternative branches. Similarly, CyberShake represents the aleatory variability in wave excitation through conditional hypocenter distributions and conditional slip distributions, and it characterizes the epistemic uncertainty in the wavefield calculations in terms of alternative 3D seismic-velocity models.

The full-3D treatment of seismic-wave propagation has the potential to improve our PSHA models considerably. A variance-decomposition analysis of the recent CyberShake results indicates that more accurate earthquake simulations could reduce the aleatory variance of the strong-motion predictions by at least a factor of 2 relative to the empirical ground-motion prediction equations in current use; other factors being equal, this would lower the exceedance probabilities at high-hazard levels by an order of magnitude. The practical ramifications of this probability gain for the formulation of risk-reduction strategies could be substantial.
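The arithmetic behind that last claim can be seen in a minimal sketch, assuming a lognormal ground-motion distribution with an illustrative ln-sigma of 0.6 for the empirical model; all numbers here are placeholders, not values from the CyberShake analysis:

```python
import math

def exceedance_prob(ln_x, ln_median, sigma):
    # P(ln shaking > ln_x) when ln shaking ~ Normal(ln_median, sigma)
    z = (ln_x - ln_median) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

sigma_gmpe = 0.6                        # assumed ln-sigma of an empirical GMPE
sigma_sim = sigma_gmpe / math.sqrt(2)   # aleatory variance cut by a factor of 2

for z in (1.0, 2.0, 3.0):               # thresholds z sigmas above the median
    ln_x = z * sigma_gmpe               # median normalized so ln(median) = 0
    p_gmpe = exceedance_prob(ln_x, 0.0, sigma_gmpe)
    p_sim = exceedance_prob(ln_x, 0.0, sigma_sim)
    print(f"{z:.0f}-sigma shaking level: {p_gmpe:.1e} -> {p_sim:.1e}, "
          f"factor {p_gmpe / p_sim:.0f}")
```

At thresholds two to three sigma above the median, the shaking levels that matter most at high hazard, the exceedance probability in this sketch falls by a factor of ten or more.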
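To make the structure of these models concrete, here is a toy hazard-curve calculation, orders of magnitude smaller than UCERF3 or CyberShake: a rupture forecast supplies annual rates, a lognormal ground-motion model supplies the aleatory probability of exceeding a shaking level conditional on each rupture, and a small set of weighted branches stands in for an epistemic logic tree. Every rate, median, weight, and amplification factor below is invented for illustration.

```python
import math

def p_exceed(ln_x, ln_median, sigma):
    # Aleatory part: P(ln shaking > ln_x | rupture), lognormal shaking assumed
    return 0.5 * math.erfc((ln_x - ln_median) / (sigma * math.sqrt(2.0)))

# Rupture forecast: (annual rate, median shaking at the site in g) per rupture
ruptures = [(0.01, 0.05),    # frequent rupture, weak shaking at the site
            (0.002, 0.20),   # rarer rupture, moderate shaking
            (0.0005, 0.60)]  # rare rupture, strong shaking

# Epistemic branches: alternative site-amplification factors with weights,
# a stand-in for a logic tree of alternative models (UCERF3 uses 5760 branches)
branches = [(0.5, 1.0), (0.3, 1.5), (0.2, 0.7)]  # (weight, amplification)

sigma = 0.6                                      # aleatory ln-sigma (assumed)
for x in (0.1, 0.2, 0.4):                        # shaking thresholds in g
    rate = sum(w * sum(r * p_exceed(math.log(x), math.log(med * amp), sigma)
                       for r, med in ruptures)
               for w, amp in branches)
    # For a Poisson process, annual exceedance probability = 1 - exp(-rate)
    print(f"x = {x:.1f} g: mean annual exceedance rate = {rate:.2e}, "
          f"annual probability = {1 - math.exp(-rate):.2e}")
```

In practice the branch-by-branch hazard curves are kept separate rather than collapsed into a weighted mean, so that the spread across branches displays the epistemic uncertainty to the user.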
The coherent representation of aleatory variability and epistemic uncertainty in physics-based hazard models involves massive forward and inverse calculations, typically requiring very large ensembles of deterministic simulations. For example, a CyberShake hazard model for the Los Angeles region involves the computation of about 240 million synthetic seismograms. These calculations have been made feasible by the development of clever algorithms based on seismic reciprocity and highly optimized anelastic wave propagation codes, but they still strain the capabilities of the world's fastest supercomputers, which are currently operating at petascale (~10^15 floating-point operations per second).

It is important to realize that our community's needs for computation are growing more rapidly than our nation's supercomputer resources. In this year alone, for example, SCEC simulations will consume almost 200 million core-hours on National Science Foundation (NSF) supercomputers such as Blue Waters and Department of Energy (DOE) supercomputers such as Titan. As we move toward exascale computing, the machine architectures will become more heterogeneous and difficult to code, and the workflows will increase in complexity. To an ever-increasing degree, progress in earthquake system science will depend on deep, sustained collaborations among the seismologists and computational scientists focused on extreme-scale computing. SSA should think carefully about how to accommodate such interdisciplinary collaborations within its structure, and it will need to work with NSF, DOE, and other government agencies to make sure our computational capabilities are sufficient for the demands of physics-based PSHA.

PSHA occupies a central position in the universe of seismic-risk reduction. However, recent earthquake disasters have reinvigorated a long-standing debate about PSHA methodology. Many practical deficiencies have been noted, not the least of which is the paucity of data for retrospective calibration and prospective testing of long-term PSHA models. But some critics have raised the more fundamental question of whether PSHA is misguided because it cannot capture the aleatory variability of large-magnitude earthquakes produced by complex fault systems. Moreover, the pervasive role of subjective probabilities and expert opinion in specifying the epistemic uncertainties in PSHA has made this methodology a target for scientists who adhere to a strictly frequentist view of probabilities. According to some of these critics, PSHA should be replaced by "neodeterministic" hazard estimates based on a maximum credible earthquake.

As Warner Marzocchi pointed out in an Eos article last July, neodeterministic SHA is not an adequate replacement for probabilistic SHA. The choice of a maximum credible earthquake requires uncertain assumptions, such as choosing a return period, which essentially fix the level of acceptable risk. This black-and-white approach is fundamentally flawed because it conflates the role of scientific advisor with that of a decision maker, mixing scientific judgments with political and economic choices that lie outside the domain of science. Fully probabilistic descriptions, such as those given by PSHA, are needed for two reasons: first, to avoid unintended and often uninformed decision making in the tendering of scientific forecasts, and second, to provide decision makers, including the public, with a complete rendering of the scientific information they need to balance the costs and benefits of risk-mitigation actions.

We may never be able to predict the impending occurrence of extreme earthquakes with any certainty, but we do know that earthquakes cluster in space and time, and that earthquake probabilities can locally increase a thousand-fold during episodes of seismicity. The lessons of L'Aquila and Christchurch make clear that this information must be delivered to the public quickly, transparently, authoritatively, and on a continuing basis. Systems for this type of operational earthquake forecasting (OEF) are being developed in several countries, including Italy, New Zealand, and the United States, and they raise many questions about how to inform decision making in situations where the probability of a significant earthquake may go way up in a relative sense but still remain very low (< 1% per day) in absolute terms.

As we usher in new technologies that will spew out predictive information in near real time, and here we should include earthquake early warning (EEW) systems as well as OEF, the need to engage the public has never been more critical. We must continually bring the public into the conversation about what can, and cannot, be foretold about earthquake activity.

Toward this end, SSA should increase its role in communicating the science that underlies OEF and EEW. In particular, it should provide a roundtable for seismologists to interact with social scientists and risk-communication experts in helping the responsible government agencies translate uncertain probabilistic forecasts into effective risk-mitigation actions.

This brings me to my final point, which concerns the importance of rigorous forecast validation. Validation involves testing whether a forecasting model replicates the earthquake-generating process well enough to be sufficiently reliable for some useful purpose, such as OEF or EEW. Since 2006, a new international organization, the Collaboratory for the Study of Earthquake Predictability (CSEP), has been developing the cyberinfrastructure needed for the prospective testing of short-term earthquake forecasts. CSEP testing centers have been set up in Los Angeles, Wellington, Zürich, Tokyo, and Beijing, and more than 380 short-term forecasting models are being prospectively evaluated against authoritative seismicity catalogs in natural laboratories around the world.
CSEP experiments have validated the probability gains of short-term forecasting models that are being used, or will be used, in OEF. Moreover, the Collaboratory is capable of supporting OEF and EEW by providing an environment for the continual testing of operational models against alternatives. However, U.S. participation in CSEP has thus far been primarily funded by a private organization, the W. M. Keck Foundation, and stable support for its long-term mission is not guaranteed.

Of course, extreme earthquakes are very rare, so it will be a while before enough instrumental data have accumulated to properly test our long-term forecasts. However, as Dave Jackson argued in a paper presented at this meeting, the earthquake hiatus in California suggests the current UCERF model inadequately represents the large-scale interactions that are modulating the earthquake activity of the San Andreas fault system. Use of paleoseismology to extend the earthquake record back into geologic time is a clear priority. The SSA should be the home for this type of historical geophysics. It is also urgent that we increase the spatial scope of our research to compensate for our lack of time. One goal of SSA should be to join forces with CSEP and other international efforts, such as the Global Earthquake Model (GEM) project, in fostering comparative studies of fault systems around the world.

The issue is not whether to focus on the prediction problems of earthquake system science, but how to accomplish this research in a socially responsible way according to the most rigorous scientific standards.
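To illustrate the kind of scoring such prospective experiments perform, here is a schematic sketch in the spirit of CSEP's likelihood-based comparisons; it is not the CSEP test suite, and every rate and count below is invented. A time-varying clustering forecast is compared with a constant reference model on the Poisson log-likelihood of the observed daily earthquake counts in a single spatial cell.

```python
import math

def poisson_loglike(rate, count):
    # ln P(count | Poisson with expectation 'rate')
    return count * math.log(rate) - rate - math.lgamma(count + 1)

# Daily expected numbers of target earthquakes in one cell (all invented):
reference = [1e-5] * 6                            # long-term background model
forecast = [1e-5, 1e-5, 1e-2, 5e-3, 2e-3, 1e-5]   # clustering model: a
                                                  # thousand-fold gain during a
                                                  # hypothetical sequence
observed = [0, 0, 1, 0, 0, 0]                     # hypothetical catalog counts

ll_ref = sum(poisson_loglike(r, n) for r, n in zip(reference, observed))
ll_fc = sum(poisson_loglike(r, n) for r, n in zip(forecast, observed))
print(f"log-likelihood gain over the reference model: {ll_fc - ll_ref:.2f}")
# Note the OEF dilemma in these numbers: the daily probability jumps a
# thousand-fold (1e-5 -> 1e-2) yet never exceeds 1% in absolute terms.
```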
I call upon a new generation of seismologists—the students and early-career scientists in this room—to take on the challenges of earthquake system science. You are fortunate to be in a field where the basic prediction problems remain mostly unsolved and major discoveries are still possible. You are also fortunate to have access to vast new datasets and tremendous computational capabilities for attacking these problems. System-level models, such as those I have described here, will no doubt become powerful devices in your scientific arsenal. However, these models can be big and unwieldy, requiring a scale of expertise and financial resources that are rarely available to one scientist or a small research group. This raises a number of issues about how to organize the interdisciplinary, multi-institutional efforts needed to develop these models. In particular, all of us at SSA need to make sure that any research structure dominated by earthquake system science allows you, as the rising leaders in this field, to develop new ideas about how earthquake systems actually work.
Thomas H. Jordan
Southern California Earthquake Center
University of Southern California
Los Angeles, California 90089-0742 U.S.A.
[email protected]