A Milestone for Forecasting Earthquake Hazards

In a new study, researchers report that their physics-based model of California earthquake hazards replicated estimates from the state’s leading statistical model.

By Kim Martineau
August 22, 2018

Researchers Replicate Latest California Earthquake Projections with Physics-based Model

In a milestone for earthquake forecasting, researchers report that their physics-based model of California earthquake hazards replicated estimates from the state’s leading statistical model. In the above radar image, a section of California’s San Andreas fault can be seen below the Crystal Springs Reservoir (in black), with San Francisco (in pink) to the east. (Image: NASA Jet Propulsion Laboratory)

Earthquakes pose a profound danger to people and cities worldwide, but with the right hazard-mitigation efforts, from stricter building requirements to careful zoning, the potential for catastrophic collapses of roads and buildings, and for loss of life, can be limited.

All of these measures depend on science delivering high-quality seismic hazard models. Yet current models rest on a number of uncertain assumptions, and their predictions are difficult to test in the real world because of the long intervals between big earthquakes.

Now, a team of researchers from Columbia University’s Lamont-Doherty Earth Observatory, the University of Southern California, the University of California, Riverside, and the U.S. Geological Survey has come up with a physics-based model that marks a turning point in earthquake forecasting. Their results appear in the new issue of Science Advances.

“Whether a big earthquake happens next week or 10 years from now, engineers need to build for the long run,” says the study’s lead author, Bruce Shaw, a geophysicist at Lamont-Doherty. “We now have a physical model that tells us what the long-term hazards are.”

Simulating nearly 500,000 years of California earthquakes on a supercomputer, researchers were able to match hazard estimates from the state’s leading statistical model, which is based on a hundred years of instrumental data. The mutually validating results add support for California’s current hazard projections, which help to set insurance rates and building design standards across the state. The results also suggest a growing role for physics-based models in forecasting earthquake hazard and evaluating competing models in California and other earthquake-prone regions.

The physics-based model produced its results after simulating nearly 500,000 years of California earthquakes. A randomly selected 3,000-year segment is visualized here. (Courtesy: Kevin Milner, University of Southern California)
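The advantage of such a long synthetic record is that hazard numbers can be read off by direct counting rather than extrapolated from a short observational window. As a rough, purely illustrative sketch, and not the study’s actual analysis pipeline, the Python snippet below (with made-up numbers) shows how an exceedance rate counted from a long catalog translates into the familiar “chance of exceedance in 50 years” figure, assuming earthquakes arrive as a Poisson process.

    import math

    def prob_exceedance(num_exceedances, catalog_years, window_years=50.0):
        """Probability of at least one exceedance of a chosen shaking (or
        magnitude) threshold within a forecast window, estimated by counting
        events in a long synthetic catalog and assuming Poisson occurrence."""
        annual_rate = num_exceedances / catalog_years
        return 1.0 - math.exp(-annual_rate * window_years)

    # Hypothetical numbers for illustration only: 1,200 threshold exceedances
    # in 500,000 simulated years implies roughly an 11 percent chance of at
    # least one exceedance in any given 50-year window.
    print(prob_exceedance(1200, 500_000))

A century of instrumental data, by contrast, may contain only a handful of such events, which is why statistical models must lean on additional assumptions to fill the gap.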

The earthquake simulator used in the study, RSQSim, takes a simpler approach than California’s statistical model, eliminating many of the assumptions that go into estimating the likelihood of an earthquake of a certain size hitting a specific region. The researchers, in fact, were surprised when the simulator, programmed with relatively basic physics, was able to reproduce estimates from a model that has improved steadily for decades. “This shows our simulator is ready for prime time,” says Shaw.

Seismologists can now use RSQSim to test the statistical model’s region-specific predictions. Accurate hazard estimates are especially important to government regulators in high-risk cities like Los Angeles and San Francisco, who write and revise building codes based on the latest science. In a state with a severe housing shortage, regulators are under pressure to make buildings strong enough to withstand heavy shaking while keeping construction costs down. A second tool to confirm hazard estimates gives the numbers added credibility.

“If you can get similar results with different techniques, that builds confidence you’re doing something right,” says study coauthor Tom Jordan, a geophysicist at USC.

A hallmark of the simulator is its use of rate- and state-dependent friction to approximate how real-world faults break and transfer stress to other faults, sometimes setting off even bigger quakes. Developed at UC Riverside more than a decade ago, and refined further in the current study, RSQSim is the first physics-based model to replicate California’s most recent rupture forecast, UCERF3. When results from both models were fed into California’s statistical ground-shaking model, they produced similar hazard profiles.
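In rate- and state-dependent friction, the strength of a fault patch depends both on how fast it is slipping and on a “state” variable that tracks the age of its frictional contacts. A standard textbook form, the Dieterich–Ruina law with the aging evolution equation, is shown below for illustration; the exact formulation and approximations used inside RSQSim may differ.

    \mu = \mu_0 + a\,\ln\!\left(\frac{V}{V_0}\right) + b\,\ln\!\left(\frac{V_0\,\theta}{D_c}\right),
    \qquad
    \frac{d\theta}{dt} = 1 - \frac{V\theta}{D_c}

Here \mu is the friction coefficient, V the slip rate, V_0 a reference slip rate at which the friction equals \mu_0, \theta the state variable, D_c a characteristic slip distance, and a and b laboratory-derived constants. Patches where a − b is negative weaken as they speed up and can nucleate earthquakes; patches where it is positive tend to creep stably.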

John Vidale, director of the Southern California Earthquake Center, which helped fund the study, says the new model has created a realistic 500,000-year history of earthquakes along California’s faults for researchers to explore. Vidale predicted the model would improve as computing power grows and more physics is added to the software. “Details such as earthquakes in unexpected places, the evolution of earthquake faults over geological time, and the viscous flow deep under the tectonic plates are not yet built in,” he said.

The researchers plan to use the model to learn more about aftershocks and how they unfold on California’s faults, and to study other fault systems globally. They are also working on incorporating the simulator into a physics-based ground-motion model, called CyberShake, to see if it can reproduce shaking estimates from the current statistical model.

“As we improve the physics in our simulations and computers become more powerful, we will better understand where and when the really destructive earthquakes are likely to strike,” says study coauthor Kevin Milner, a researcher at USC.

The study was funded by the National Science Foundation, Southern California Earthquake Center and W.M. Keck Foundation. The other authors are Ned Field, U.S. Geological Survey; Jacquelyn Gilchrist, USC; and Keith Richards-Dinger and James Dieterich, UC Riverside.

Study: A physics-based earthquake simulator replicates seismic hazard statistics across California