Workflow Management of the SCEC Computational Platforms for Physics-Based Seismic Hazard Analysis

Thomas H. Jordan, Scott Callaghan, Philip J. Maechling, Gideon Juve, Ewa Deelman, Mats Rynge, Karan Vahi, & Fabio Silva

Published December 2012, SCEC Contribution #1782

Earthquake simulation has the potential to substantially improve seismic hazard and risk forecasting, but the practicality of using simulation results is limited by the scale and complexity of the computations. Here we will focus on the experience of the Southern California Earthquake Center (SCEC) in applying workflow management tools to facilitate physics-based seismic hazard analysis. This system-level problem can be partitioned into a series of computational pathways according to causal sequences described in terms of conditional probabilities. For example, the exceedance probabilities of shaking intensities at geographically distributed sites conditional on a particular fault rupture (a ground motion prediction model or GMPM) can be combined with the probabilities of different ruptures (an earthquake rupture forecast or ERF) to create a seismic hazard map. Deterministic simulations of ground motions from very large suites (millions) of ruptures, now feasible through high-performance computational facilities such as SCEC's CyberShake Platform, are allowing seismologists to replace empirical GMPMs with physics-based models that more accurately represent wave propagation through heterogeneous geologic structures, such as the sedimentary basins that amplify seismic shaking. One iteration of the current broadband CyberShake hazard model for the Los Angeles region, which calculates ground motions deterministically up to 0.5 Hz and stochastically up to 10 Hz, requires the execution of about 3.3 billion jobs, taking 12.8 million computer hours and producing 10 TB of simulation data. We will show how the scalability and reliability of CyberShake calculations on some of the nation's largest computers have been improved using the Pegasus Workflow Management System.
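The combination of a GMPM with an ERF described above follows the total-probability rule: the hazard curve at a site is the sum, over all ruptures, of each rupture's forecast probability times the conditional probability that it produces shaking above a given intensity. The following is a minimal sketch of that combination; the rupture probabilities and exceedance values are invented toy numbers, not CyberShake data, and the two-rupture forecast is purely illustrative.

```python
import numpy as np

# Hypothetical ERF: annual occurrence probabilities for two ruptures
# (toy values, not from any real forecast).
rupture_probs = np.array([0.01, 0.002])

# Hypothetical GMPM output: for each rupture, the probability that ground
# motion at the site exceeds each intensity level (toy values).
intensity_levels = np.array([0.1, 0.2, 0.4, 0.8])  # e.g. spectral acceleration, in g
p_exceed_given_rupture = np.array([
    [0.90, 0.50, 0.10, 0.01],   # conditional exceedance probs, rupture 1
    [0.99, 0.80, 0.40, 0.05],   # conditional exceedance probs, rupture 2
])

# Total-probability combination:
#   P(IM > x) = sum_k P(rupture_k) * P(IM > x | rupture_k)
hazard_curve = rupture_probs @ p_exceed_given_rupture

for x, p in zip(intensity_levels, hazard_curve):
    print(f"P(IM > {x:.1f} g) = {p:.5f}")
```

Evaluating this sum at every site in a dense geographic grid, for every rupture in a forecast with hundreds of thousands of entries, is what drives the job counts quoted above.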
We will also describe the current challenges of scaling these calculations up by an order of magnitude to create a California-wide hazard model, which will be based on the new Uniform California Earthquake Rupture Forecast (version 3). Finally, we will review the problems associated with the management of workflows for SCEC's Broadband Platform in a distributed computing environment.

Jordan, T. H., Callaghan, S., Maechling, P. J., Juve, G., Deelman, E., Rynge, M., Vahi, K., & Silva, F. (2012, December). Workflow Management of the SCEC Computational Platforms for Physics-Based Seismic Hazard Analysis. Oral Presentation at AGU Fall Meeting 2012.