Working Group: Collaboratory for the Study of Earthquake Predictability (CSEP)


Research Objectives

CSEP is developing a virtual, distributed laboratory—a collaboratory—that supports a wide range of scientific prediction experiments in multiple regional or global natural laboratories. This earthquake system science approach seeks to provide answers to the questions: (1) How should scientific prediction experiments be conducted and evaluated? and (2) What is the intrinsic predictability of the earthquake rupture process?

Example Research Strategies

  • Establishing rigorous procedures in controlled environments (testing centers) for registering prediction procedures, including the delivery and maintenance of versioned, documented code for making and evaluating predictions, as well as intercomparisons to evaluate prediction skill;
  • Constructing community-endorsed standards for testing and evaluating probability-based, alarm-based, fault-based, and event-based predictions;
  • Developing hardware facilities and software support to allow individual researchers and groups to participate in prediction experiments;
  • Designing and developing programmatic interfaces that provide access to earthquake forecasts and forecast evaluations;
  • Providing prediction experiments with access to data sets and monitoring products, authorized by the agencies that produce them, for use in calibrating and testing algorithms;
  • Characterizing limitations and uncertainties of such data sets (e.g., completeness magnitudes, source parameter and other data uncertainties) with respect to their influence on experiments;
  • Expanding the range of physics-based models to test hypotheses that some aspects of earthquake triggering are dominated by dynamic rather than quasi-static stress changes and that slow slip event activity can be used to forecast large earthquakes;
  • Evaluating hypotheses critical to forecasting large earthquakes, including the characteristic earthquake hypothesis, the seismic gap hypothesis, and the maximum-magnitude hypothesis;
  • Conducting workshops to facilitate international collaboration.

A major focus of CSEP is to develop international collaborations between the regional testing centers and to accommodate a wide-ranging set of prediction experiments involving geographically distributed fault systems in different tectonic environments.

Research Priorities

  • Retrospective Canterbury experiment: finalizing the retrospective evaluation of physics-based and statistical forecasting models during the 2010-12 Canterbury, New Zealand, earthquake sequence by (i) comparing retrospective forecasts against extant prospective models, and (ii) transitioning models to prospective evaluation, including in other regions;
  • Global CSEP experiments: developing and testing global models, including, but not limited to, those developed for the Global Earthquake Model (GEM);
  • Strengthening testing and evaluation methods: developing computationally efficient performance metrics of forecasts and predictions that (i) account for aleatory variability and epistemic uncertainties, and (ii) facilitate comparisons between a variety of probability-based and alarm-based models (including reference models);
  • Supporting Operational Earthquake Forecasting (OEF): (i) developing forecasting methods that explicitly address real-time data deficiencies, (ii) updating forecasts on an event basis and evaluating forecasts with overlapping time-windows or on an event basis, (iii) improving short-term forecasting models, (iv) developing prospective and retrospective experiments to evaluate OEF candidate models;
  • Earthquake rupture simulators: developing experiments to evaluate the predictive skills of earthquake rupture simulators, against both synthetic (simulated) and observed data (see also the WGCEP section), with specific focus on automating the association of a large earthquake with a modeled fault;
  • External Forecasts and Predictions (EFP): developing and refining experiments to evaluate EFPs (generated outside of CSEP), including operational forecasts by official agencies and prediction algorithms based on seismic and electromagnetic data;
  • Induced seismicity: developing models and experiments to evaluate hypotheses of induced seismicity, e.g. in the Salton Trough or in Oklahoma, including providing data access to injection/depletion rates and other potentially pertinent data;
  • Hybrid/ensemble models: developing methods for forming optimal hybrid and ensemble models from a variety of existing probability-based or alarm-based forecasting models;
  • Hazard models: developing experiments to evaluate seismic hazard models and their components (e.g., ground motion models);
  • Coulomb stress: developing forecasting models based on the Coulomb stress hypothesis that can be tested retrospectively and prospectively within CSEP;
  • Developing methodology to forecast focal mechanisms and evaluating the skill of such forecasts;
  • Testing paleo-based forecasts: developing experiments to prospectively test the fault rupture and earthquake probabilities implied by paleoseismic investigations of California faults (e.g., testing probabilities of future ruptures at paleoseismic sites where numerous ruptures have been documented, the relative effectiveness of proposed fault segment boundaries at stopping ruptures, and the relative frequency of on-fault and off-fault ruptures in California) (see also the WGCEP and SoSafe sections).
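Many of these priorities rest on the consistency tests CSEP runs in its testing centers. As an illustrative pure-Python sketch (not CSEP's production code), the classic number test (N-test) compares the observed count of target earthquakes against the count implied by a Poisson forecast:

```python
import math

def poisson_cdf(k, mu):
    """P(N <= k) for a Poisson(mu) count, computed by direct summation."""
    if k < 0:
        return 0.0
    term = math.exp(-mu)   # P(N = 0)
    total = term
    for i in range(1, k + 1):
        term *= mu / i     # P(N = i) from P(N = i-1)
        total += term
    return min(total, 1.0)

def n_test(n_forecast, n_observed):
    """Two one-sided quantile scores of a CSEP-style number test.

    delta1 = P(N >= n_observed): small when the forecast under-predicts.
    delta2 = P(N <= n_observed): small when the forecast over-predicts.
    """
    delta1 = 1.0 - poisson_cdf(n_observed - 1, n_forecast)
    delta2 = poisson_cdf(n_observed, n_forecast)
    return delta1, delta2
```

Likelihood-based and alarm-based tests follow the same pattern: the forecast is fixed before the experiment, then scored against what actually occurred.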

Recent Results

CSEP activities have continued within a vigorous international collaboration, ranging from software development through model development and testing to workshops and conference sessions. Software development at SCEC has focused on installing new models, evaluating results, and upgrading CSEP software and hardware. CSEP also hosted a workshop at the 2014 SCEC annual meeting in collaboration with the USGS and the Global Earthquake Model (GEM) Foundation.

Testing the USGS Hazard Model

Figure 1: Hazard curves of all NSHM models (in color) compared with the observation from DYFI (black). The inset shows the hatched part enlarged. For the higher ground motions, the models overestimate the recurrence compared to the observation. [Mak et al., in preparation]

Testing seismic hazard assessments (SHAs) faces one overriding challenge: the lack of data. This challenge has two components: few earthquakes occur, and even when they do, ground-motion records are often missing. Testing the full outcome of an SHA is even harder in this respect than testing its individual components because, regardless of a region's seismicity, the earthquakes that contribute substantially to the hazard of interest are always rare events.

Mak et al. from GFZ Potsdam have confronted this challenge in two ways: a spatial-temporal aggregation approach, and the tentative use of a new form of data. Spatial-temporal aggregation means testing the hazard of a region as a whole instead of the point hazard that is the direct outcome of an SHA; this converts an event that is rare at any single location into a less rare event over an area. Even so, most regions in the world are not instrumented with enough accelerometers to record earthquake ground motion. Macroseismic intensity data generated by an internet-based ground-motion collection system, "Did You Feel It?" (DYFI), were therefore used as a proxy for true ground-motion data.

Controlling for data completeness, the regional seismic hazard observed via DYFI data collected from 2000 to 2015 in California was compared with the corresponding hazard predicted by the National Seismic Hazard Maps (versions 1996, 2002, 2008, and 2014). The same comparison was also performed using instrumental data. The DYFI and instrumental data gave consistent results, confirming the usefulness of DYFI data. The analysis was then extended to compare the observed and predicted hazard in the Central and Eastern US (CEUS), where instrumental data are lacking.
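The aggregation step can be sketched as follows. All numbers here are hypothetical and serve only to illustrate how summing point-hazard rates over sites and exposure years turns a rare per-site event into a testable regional count; this is not Mak et al.'s code:

```python
import math

def expected_exceedances(site_rates, years):
    """Aggregate point-hazard predictions over a region (illustrative).

    site_rates: predicted annual rate, at each site, of exceeding a fixed
    ground-motion level (the hazard-curve ordinate at that level).
    Summing rate * exposure time over all sites yields an expected
    regional count that can be compared against observations.
    """
    return sum(rate * years for rate in site_rates)

# Hypothetical example: 500 DYFI-sampled sites, each with a predicted
# 1%-in-50-years exceedance probability, observed over a 15-year window.
rate_1_in_50 = -math.log(1.0 - 0.01) / 50.0  # annual exceedance rate
predicted = expected_exceedances([rate_1_in_50] * 500, years=15)
```

The predicted count (here about 1.5 site-exceedances) is then testable against the observed count, which no single site's hazard curve would be.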

This study reveals a slight but statistically significant overprediction of hazard for California, and a slight (but statistically significant) underprediction for the CEUS. It also shows that the most recent version of the hazard maps is the most consistent with the observed hazard.

Retrospective Evaluations of a Rate-and-State Coulomb Stress Model

The GFZ group developed and tested a rate-and-state Coulomb-based seismicity rate forecast for the Japan CSEP testing regions (all of Japan, Mainland, and Kanto). Unlike previous physics-based forecasts submitted to CSEP, stress is calculated by inverting variations in past seismicity rates for Coulomb stress steps over defined time intervals (Dieterich et al., 2000). Compared to deriving the stress tensor from a fault dislocation model, the rate-and-state Coulomb stress inversion relies on fewer assumed physical parameters, such as the coefficient of friction or receiver-fault orientation. Additionally, stress singularity artifacts, which often distort the Coulomb stress field near fault-patch boundaries, are smoothed out when inverting seismicity for Coulomb stress changes. Using background seismicity rates derived from inter-earthquake distances (Ogata, 2011), the model calculated the Coulomb stress evolution and expected seismicity rates over three years, one year, three months, and one day in 2009 and following the Tohoku earthquake. The hybrid Coulomb-ETAS forecast underestimates the number of earthquakes during the testing periods; however, the stress perturbations improve the spatial distribution of these events compared to the original ETAS forecast. As anticipated from Dieterich's study, the stress inversion method yields more consistent associations between stress change and earthquake distribution over longer time intervals, showing potential for application to long-term, alarm-based earthquake forecasts. This model is now under prospective testing in CSEP Japan.
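The core of the inversion can be sketched from Dieterich's rate-and-state relation, under which an instantaneous Coulomb stress step multiplies the background seismicity rate by exp(Δτ/Aσ). The function below is a minimal illustration with our own parameter names, not the GFZ implementation:

```python
import math

def stress_step_from_rate_change(r_before, r_after, a_sigma):
    """Invert an observed seismicity-rate jump for a Coulomb stress step.

    Under Dieterich's rate-and-state framework, a sudden stress step
    d_tau changes the instantaneous seismicity rate from r_before to
    r_after = r_before * exp(d_tau / (A * sigma)), so
    d_tau = A * sigma * ln(r_after / r_before).

    a_sigma is the constitutive parameter A*sigma (units of stress);
    the value and units here are the caller's assumption.
    """
    return a_sigma * math.log(r_after / r_before)
```

Because the input is the seismicity rate itself, no fault dislocation model, friction coefficient, or receiver-fault orientation enters the calculation, which is the advantage noted above.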

Collaboration with CSEP Japan

We have intensified the collaboration with CSEP Japan. D. Schorlemmer visited the Earthquake Research Institute (ERI) at the University of Tokyo twice in 2015 and will visit again in October 2015. Besides keeping the testing center at ERI running on the latest CSEP software distribution, scientific collaborations are ongoing. Together with H. Tsuruoka and N. Hirata, CSEP is investigating the resolution dependence of current CSEP seismicity-rate testing metrics. Initial results indicate a noticeable dependence, but these findings need further investigation before recommendations for future CSEP testing strategies can be made. A. Strader from GFZ Potsdam visited ERI and is developing a physics-based rate-and-state Coulomb model for the California and Japan testing regions (see previous section); this model development will include several Japanese researchers to further strengthen the collaboration. D. Schorlemmer has finished a study of the network recording completeness of the Japan Meteorological Agency covering the entire period of instrumental earthquake recording (1923 to 2014); the results will soon be publicly available. He is currently developing a system at ERI to track recording completeness in near real time from 2015 on.

Collaboration with the Global Earthquake Model

CSEP has worked together with the Global Earthquake Model (GEM) Foundation on testing earthquake forecasts, ground-motion prediction equations, and hazard. The results of testing the USGS hazard model were presented in a previous section. This work continues with testing the Japanese hazard model, thereby covering two of the most important hazard models. In the domain of seismicity-model testing, investigations of the GEAR1 model are upcoming (see next section).

Installing and Evaluating Global Earthquake Forecasting Models

CSEP has installed two new global earthquake forecasting models for prospective testing. The first model, SHIFT-GSRM2f by Bird and Kreemer (2015), calculates seismicity rates from a new global strain-rate map and provides an interesting alternative to seismicity-based forecasts. The second global model, GEAR1, was developed by Bird et al. (2015) in collaboration with the GEM Foundation; it optimally combines a smoothed-seismicity model and a strain-rate model to provide complementary forecasting skill. CSEP is now developing the software for new testing metrics (based on Kagan's information-gain scores) to investigate the forecasting power of these and other global models. Software development and evaluation are being led by the GFZ Potsdam CSEP/GEM team.
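For Poisson grid forecasts, a Kagan-style information-gain score reduces to a log-likelihood ratio per earthquake. The sketch below only illustrates the shape of such a metric under that assumption; it is not the code under development:

```python
import math

def information_gain_per_eq(model_rates, ref_rates,
                            total_model, total_ref):
    """Information gain per earthquake of a forecast over a reference.

    model_rates / ref_rates: forecast rates in the grid cells where the
    N target earthquakes actually occurred.  total_model / total_ref:
    each forecast's rate summed over the whole testing region.  For
    Poisson cell forecasts this is the log-likelihood ratio per event:
    positive values mean the model out-forecasts the reference.
    """
    n = len(model_rates)
    ll_ratio = sum(math.log(m / r)
                   for m, r in zip(model_rates, ref_rates))
    return (ll_ratio - (total_model - total_ref)) / n
```

A model that concentrates its rate where earthquakes actually occur, without inflating its total rate, scores a positive gain; an identical model scores exactly zero.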

Figure 2: Information gains of 1-year forecasts issued right after the 2010 Darfield earthquake and updated once in September 2011. Black: retrospective mode using best available data. Red: pseudo-prospective mode using near-real-time data. [Werner et al., 2015]

Retrospective Evaluation of Time-Dependent Earthquake Forecasting Models during the 2010-2012 Canterbury, New Zealand, Earthquake Sequence

The M7.1 Darfield earthquake triggered a complex earthquake cascade that provides a wealth of new scientific data to study earthquake triggering and evaluate the predictive skill of short-term earthquake forecasting models. To provide maximally objective results, a global CSEP collaboration of scientists from the US, New Zealand, and Europe conducted a retrospective evaluation of short-term forecasting models during this sequence. Their primary objective was to assess the performance of newly developed physics-based Coulomb/rate-state seismicity models and hybrid statistical/Coulomb models against observations and against extant Omori-Utsu clustering models such as the Epidemic-Type Aftershock Sequence (ETAS) model. In stark contrast to previous CSEP results, Werner et al. (2015) observed that Coulomb/rate-state models and hybrid Coulomb/statistical models provided more informative forecasts during the sequence than statistical models over all tested forecast horizons (1-year, 1-month, and 1-day). They also evaluated the effect of near-real-time data on the quality of the forecasts by using daily real-time catalog snapshots obtained by the CSEP New Zealand testing center during the sequence. Surprisingly, forecasts do not universally degrade in quality when real-time data are used as input; the results are model-dependent.

Ensemble Modeling

Figure 3: Bayesian model averaging of 1-day earthquake forecast models over a one-year period from 2012 to 2013 within the CSEP California testing region. Red squares indicate magnitudes of observed earthquakes. Curves indicate model weights. [Werner, Coe and Rougier, 2015]

CSEP is implementing strategies for combining multiple models into optimal forecasts. Both linear and multiplicative combination strategies are being pursued. Werner et al. (2015b) combined 1-day forecast models in California using Bayesian Model Averaging (BMA). Their preliminary results (Figure 3), which cover a one-year period from 2012 to 2013, show that the optimal ensemble model is heavily dominated by just a few models, while the weights of the other models quickly diminish toward zero. Specifically, the models K3 and ETAS_K3 (itself an ensemble model) take the lion's share of the ensemble weights after several months of data.
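The mechanism behind this behavior is the sequential BMA weight update: each model's weight is multiplied by the likelihood it assigned to the newest observation and renormalized, so persistently poor models decay toward zero. A minimal sketch, not Werner et al.'s implementation:

```python
def update_bma_weights(weights, likelihoods):
    """One sequential Bayesian-model-averaging update (illustrative).

    weights: current (prior) model weights, summing to 1.
    likelihoods: the probability each model assigned to the newest
    observation (e.g., one day's earthquakes).  The posterior weight is
    prior * likelihood, renormalized; models that repeatedly assign low
    likelihood lose weight, which is how a few models come to dominate.
    """
    posterior = [w * l for w, l in zip(weights, likelihoods)]
    total = sum(posterior)
    return [p / total for p in posterior]

# Illustrative run: two models start equal; the first explains the data
# better each day and absorbs nearly all of the ensemble weight.
w = [0.5, 0.5]
for _ in range(30):
    w = update_bma_weights(w, [0.9, 0.1])
```
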

Development of External Forecasts and Predictions (EFP) Experiments

CSEP has designed and implemented a communication protocol for registering externally generated predictions in collaboration with the QuakeFinder group. A machine-readable XML schema was developed for transmitting earthquake predictions from QuakeFinder to CSEP, along with a file-transmission protocol to automate and sanity-check the delivery of predictions.
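The schema itself is not reproduced here; the element and attribute names below are hypothetical and sketch only the general shape of such a machine-readable prediction payload and its sanity check:

```python
import xml.etree.ElementTree as ET

# Hypothetical element names -- the actual QuakeFinder/CSEP schema is
# not reproduced here; this only illustrates the kind of payload a
# registration protocol exchanges: a time window, a target region, and
# a magnitude threshold.
pred = ET.Element("prediction", issuer="example-group")
ET.SubElement(pred, "window", start="2015-10-01T00:00:00Z",
              end="2015-10-31T00:00:00Z")
ET.SubElement(pred, "region", lat="34.05", lon="-118.25", radius_km="50")
ET.SubElement(pred, "magnitude", min="5.0")
xml_bytes = ET.tostring(pred, encoding="utf-8")

# A receiving testing center can sanity-check the payload by parsing it
# back and confirming the required elements before registering it.
parsed = ET.fromstring(xml_bytes)
assert parsed.find("window") is not None
assert parsed.find("magnitude") is not None
```

Registering the prediction before the window opens is what makes the later evaluation prospective.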

Select Publications

  • Bird, P., and C. Kreemer (2015), Revised tectonic forecast of global shallow seismicity based on version 2.1 of the Global Strain Rate Map, Bull. Seismol. Soc. Am., 105(1), 152-166 plus electronic supplements, doi: 10.1785/0120140129. SCEC Contribution 6091
  • Bird, P., D. D. Jackson, Y. Y. Kagan, C. Kreemer, and R. S. Stein (2015), GEAR1: a Global Earthquake Activity Rate model constructed from geodetic strain rates and smoothed seismicity, Bull. Seismol. Soc. Am., 105(5), doi: 10.1785/0120150058. SCEC Contribution 2075
  • Rhoades, D. A., M. C. Gerstenberger, A. Christophersen, J. D. Zechar, D. Schorlemmer, M. J. Werner, and T. H. Jordan (2014), Regional Earthquake Likelihood Models II: Information Gains of Multiplicative Hybrids, Bull. Seismol. Soc. Am., 104(6), 3072-3083. SCEC Contribution 1837
  • Werner, M. J., et al. (2015), Retrospective Evaluation of Time-Dependent Earthquake Forecast Models during the 2010-12 Canterbury, New Zealand, Earthquake Sequence, SSA Annual Meeting, Pasadena, 2015. SCEC Contribution 6001