SCEC Working Group A: Master Model
The Master Model Group had two major foci in 1999: the effect of tectonic and seismic stress increments on earthquake occurrence (the physical master model), and the general problem of estimating earthquake potential for use in the more general master model. Figure 1 is the current version of the SCEC Master Model, as prepared by Ned Field and edited by Bernard Minster.
Stress Evolution
We made substantial progress in studying stress evolution and its influence on earthquakes. Important stress effects include the steadily accumulating tectonic stress, the episodic stress increments from earthquakes, the effects of afterslip on faults, the effects of viscoelastic flow in the lower crust, and poro-elastic effects caused by the flow of fluids in response to stress changes. The first three can be modeled elastically, and SCEC researchers had already made considerable progress modeling them. The latter two are inelastic effects that require explicitly time-dependent and nonlinear calculations; this year brought significant progress in these areas.
Ross Stein organized a two-day workshop to introduce participants, in hands-on sessions, to three software codes in current use for studies of earthquakes, faults, and stress interaction: Coulomb 1.0 (S. Toda, G. King, & R. Stein), 3D-DEF (M. Ellis & J. Gomberg), and VISCO1D (F. Pollitz). The authors of the programs participated in the workshop and helped instruct users in their proper use. Preparation for the workshop included extensive documentation of the programs, which are now freely available on the Web.
Jishu Deng and his colleagues Gurnis, Hudnut, and Hauksson modeled both elastic and viscoelastic stresses following the 1994 Northridge earthquake. They used a finite element program called "FEVER," developed with SCEC funding, to calculate the deformation and stress history. They found that the locations and focal mechanisms of Northridge aftershocks correlated much better with the postseismic viscoelastic stresses than with the coseismic stress increment. They showed that the viscoelastic effects are quite significant, lending importance to SCEC's efforts in this area. Their paper was published in Geophysical Research Letters in 1999.
Several investigators continued work on the stress increments from the Landers earthquake and their effect on later seismicity, including the Hector Mine earthquake of 1999. R. Stein showed that the elastic effects of the Landers earthquake increased the Coulomb stress over a certain range of depths on the fault that ruptured in the Hector Mine earthquake. The calculation assumed a friction coefficient of 0.4, so that normal stress contributes significantly to the Coulomb stress. Jackson and Wang computed the viscoelastic contribution to the stress at Hector Mine, using a model optimized to fit the postseismic geodetic deformation following Landers. They found that the shear stress on the shallow part of the Hector Mine fault was increased first by the coseismic effects of Landers and then by postseismic relaxation in the lower crust. The viscoelastic effect increased the shear stress by about 20% of the elastic effect.
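All of these Coulomb calculations follow the same basic recipe: the Coulomb stress change on a receiver fault is the shear stress change in the slip direction plus the friction-weighted normal stress change. A minimal sketch (the stress values below are hypothetical, not numbers from these studies):

```python
# Coulomb failure stress change on a receiver fault:
#   dCFS = d_tau + mu_eff * d_sigma_n
# d_tau:     shear stress change resolved in the fault's slip direction (MPa)
# d_sigma_n: normal stress change, with unclamping (tension) positive (MPa)
# mu_eff:    effective friction coefficient (0.4 in the study above)

def coulomb_stress_change(d_tau, d_sigma_n, mu_eff=0.4):
    """Positive values bring the receiver fault closer to failure."""
    return d_tau + mu_eff * d_sigma_n

# Hypothetical illustrative values:
print(coulomb_stress_change(d_tau=0.15, d_sigma_n=0.05))  # 0.17 MPa
```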
Hough and Seeber studied the characteristics of earthquakes on the southern San Andreas and Brawley faults immediately following, and probably triggered by, the Hector Mine earthquake. Many were typical events for the area, but some were unusual, with spectral characteristics like those of magmatic or hydrothermal events at Long Valley, CA. These results suggest that the events were triggered by fluid-controlled deformation in response to the seismic stresses of the Hector Mine event. A similar class of mechanisms has been suggested to explain earthquakes in volcanic and hydrothermal areas triggered by the Landers event in 1992.
Hardebeck considered the influence of tectonic stress and coseismic elastic stress changes on the focal mechanisms of background earthquakes in southern California. She constructed a new stress orientation map from inversion of earthquake focal mechanisms and compared it to the focal mechanisms of individual small events along the San Andreas. Earthquakes away from the fault have mechanisms consistent with the average far-field stress orientation. However, earthquakes near the SAF have compression axes nearly perpendicular to the fault, indicating that the shear stress on the fault is much smaller than the normal stress. The focal mechanisms near the fault are consistent with a model in which high fluid pressures weaken the rock near the fault.
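Hardebeck's inference follows from standard Mohr-circle geometry: when the maximum compression axis lies nearly perpendicular to a fault, the resolved shear stress on the fault plane is small while the normal stress is near its maximum. A short illustration with hypothetical principal stresses:

```python
import numpy as np

# Resolved normal and shear stress on a plane whose normal makes angle
# theta with the maximum compression axis sigma1 (2D Mohr relations):
#   sigma_n = sigma1*cos(theta)**2 + sigma3*sin(theta)**2
#   tau     = 0.5*(sigma1 - sigma3)*sin(2*theta)
s1, s3 = 100.0, 40.0  # hypothetical principal stresses (MPa), compression positive

for theta_deg in (0.0, 30.0, 45.0, 80.0):
    t = np.radians(theta_deg)
    sigma_n = s1 * np.cos(t) ** 2 + s3 * np.sin(t) ** 2
    tau = 0.5 * (s1 - s3) * np.sin(2 * t)
    print(f"theta={theta_deg:5.1f} deg: sigma_n={sigma_n:6.1f} MPa, tau={tau:5.1f} MPa")

# theta = 0 (compression axis perpendicular to the fault) gives tau = 0 and
# sigma_n = sigma1: high normal stress with negligible shear stress, the
# geometry inferred near the San Andreas.
```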
In a separate study, Hardebeck and Hauksson compared the effects of the Landers and Northridge earthquakes on aftershock seismicity. Roughly speaking, the aftershocks of Landers agreed well with the Coulomb stress increments from the main shock, while for Northridge the effect of stress triggering was not significant. The authors found that prior stresses help to explain the difference: Northridge is a more complex area with large, variable stresses, which complicate the effects of coseismic stresses on aftershocks.
Kagan and Jackson studied the effect of shear and normal stresses on future events by correlating later seismicity with the invariants of the stress tensor from all previous events. Using the stress invariants avoids the necessity of resolving the fault-plane ambiguity. They found that for individual events normal stress might encourage or discourage future events, but on average normal stress was not an important factor. This suggests that fluids or other mechanisms conspire to reduce the effective friction on earthquake faults.
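The appeal of invariants is that they are unchanged under rotation of the coordinate frame, so they do not depend on which nodal plane of a focal mechanism is taken as the true fault plane. The specific invariants Kagan and Jackson used are not detailed here; the sketch below uses two common choices, the trace I1 and the second deviatoric invariant J2, applied to a hypothetical stress-change tensor:

```python
import numpy as np

def stress_invariants(sigma):
    """sigma: 3x3 symmetric stress tensor. Returns (I1, J2)."""
    I1 = np.trace(sigma)                   # first invariant (3 x mean stress)
    dev = sigma - (I1 / 3.0) * np.eye(3)   # deviatoric part
    J2 = 0.5 * np.sum(dev * dev)           # second deviatoric invariant
    return I1, J2

# Hypothetical stress increment (MPa):
sigma = np.array([[0.20, 0.10, 0.00],
                  [0.10, -0.10, 0.05],
                  [0.00, 0.05, -0.05]])
print(stress_invariants(sigma))
```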
Earthquake Potential
Field and many others organized a working group to develop models of earthquake potential appropriate for seismic hazard calculations, specified in a way that allows their implications to be compared and the models to be tested over time. Petersen is developing certain aspects of the CDMG source model for hazard calculations. Dolan, Foxall, and Rockwell are developing and compiling new information on faults in southern California and expressing their parameters in a standard form. Kagan and Jackson are formulating a model of earthquake probability based on past seismicity, with optimal smoothing to represent the spatial distribution of earthquake triggering. Shen is relating strain rate to earthquake occurrence in order to develop an earthquake potential model based on observed geodetic strain rates. Ward has developed a theoretical stress rate model, based on paleoseismic data and a dislocation model of stresses on faults, which can be used in a predictive model. Jackson and Kagan have developed a formalism for computing the predicted spatial, temporal, and magnitude distributions of all of these models so that they can be tested statistically against future earthquake occurrence. Field has written a program to calculate probable ground shaking for use in exploring the implications of these models, and he has tested his program against results of the CDMG and others.
SCEC Working Group B: Ground Motion Prediction
The effect of basin structure on estimates of ground motion has been well documented by SCEC scientists, e.g., Olsen, Archuleta, and Matarese (1995); Olsen and Archuleta (1996); Wald and Graves (1998); Graves (1998). Because of the significance of the basin structure, SCEC has invested effort from many different groups to refine the velocity structure and geometry of the southern California basins. Within Group B there have been complementary approaches to documenting the size of the effect and to modifying the basin structure based on computed waveforms.

Chen Ji, Don Helmberger, and Dave Wald have been using broadband seismograms recorded from earthquakes at local and regional distances to refine basin structure. For the region outside the basin, they assume a one-dimensional (1D) crustal model and use analytical techniques to propagate the energy from sources to the basin edge, where the motions are then interfaced with a two-dimensional (2D) finite-difference algorithm. By comparing data from local and regional earthquakes with computed seismograms, they can iteratively adjust the velocity structure of the basin to improve the fit.

Leo Eisner and Rob Clayton have used a slightly different approach. Using a 3D finite-difference method, they have been comparing synthetic seismograms with records from 30 earthquakes near or within the Los Angeles basins. They have found that both the basins and the velocity structure outside the basins must be improved for a better fit between computed and recorded ground motion. A similar study is continuing with Rob Graves, Dave Wald, and Arben Pitarka. By comparing 3D synthetics with observations from the Landers and Northridge earthquakes, they have suggested refinements to Version 1 of SCEC's 3D Seismic Velocity Model. Like Eisner and Clayton, they find that the velocity structure outside the basins needs to be improved, primarily by including a sharp shear-wave velocity gradient (2.7 km/s decreasing to 1.0 km/s) in the top 0.6 km of the model. They do find that synthetics computed in the SCEC velocity model agree with data in terms of duration and patterns of amplification in the San Bernardino and Los Angeles basins.

Kim Olsen has been examining the amplification in the latest SCEC velocity model using nine different scenario earthquakes. The 3D model now includes the Chino, San Bernardino, and Ventura basins, and the previous velocity structure has been updated according to the most recent constraints. The new model and rupture scenarios have generated a new and improved amplification pattern. For example, a large part of the strongest amplification now occurs above some of the basin-edge sites, in addition to the deeper basin sites. Because most engineering analyses assume a 1D vertically propagating shear wave, Olsen has computed the ratio of 3D to 1D amplification throughout the Los Angeles basin. Although there are variations, the average amplification due to 3D structure is around 1.7.

In addition to the velocity structure, SCEC scientists have recognized that the attenuation structure is critical and becomes more important as the maximum frequency in the computations is increased. Modifying the coarse-grained method proposed by Steve Day, Peng-Cheng Liu and Ralph Archuleta added Q to a newly developed 3D finite-difference code. They optimized the coefficients (both weights and relaxation times) of the function that mimics Q while keeping the storage requirements as small as possible.
Day and Chris Bradley have added Q to Olsen's 3D finite-difference code for use in computing synthetics for southern California. Overall, there is convergence toward a 3D velocity and attenuation model for southern California that is consistent with data from past earthquakes. The goal is to compute realistic ground motion throughout southern California for suites of scenario earthquakes.
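The optimization that Liu and Archuleta describe is, in outline, a fitting problem: pick relaxation times spanning the frequency band of interest and solve for weights so that a small set of relaxation mechanisms mimics a constant Q. The sketch below illustrates only that fitting idea; it does not reproduce the coarse-grained spatial distribution of Day's method or any of the actual SCEC codes:

```python
import numpy as np

# Approximate a constant target Q over 0.01-1 Hz with a handful of
# relaxation mechanisms:  1/Q(w) ~ sum_k w_k * (w*tau_k)/(1 + (w*tau_k)**2)
f = np.logspace(-2, 0, 50)                          # frequency band (Hz)
omega = 2 * np.pi * f
tau = 1.0 / (2 * np.pi * np.logspace(-2, 0, 4))     # 4 relaxation times
target_Q = 50.0

# Each column of A is one mechanism's contribution to 1/Q; solve for weights.
wt = omega[:, None] * tau[None, :]
A = wt / (1.0 + wt ** 2)
w, *_ = np.linalg.lstsq(A, np.full_like(f, 1.0 / target_Q), rcond=None)

fit_Q = 1.0 / (A @ w)
print("max relative Q misfit: %.3f" % np.max(np.abs(fit_Q - target_Q) / target_Q))
```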
Because the numerical methods (finite difference and finite element) are computationally limited to low frequencies (f < 1.0 Hz) for problems the physical size of the southern California basins, other methods must be considered for more broadband seismograms. SCEC has recognized that as the maximum frequency in the simulated ground motion increases, the objective changes from waveform fits (phase and amplitude) to attributes of ground motion, e.g., peak acceleration, response spectra, cumulative absolute velocity (CAV), or Husid curves. At the forefront of this approach are John Anderson and Yuehua Zeng, who have continued to refine their stochastic source method for estimating ground motion with the appropriate attributes. One of the critical refinements has been incorporating nonlinear soil response into the calculations of ground motion.

There is considerable discussion of how nonlinearity of the soil affects the ground motion. Daniel Lavallee, Fabian Bonilla, and Archuleta have been developing new analytical and computational algorithms for simulating nonlinear soil response. They have developed a robust and simple model that includes nonlinear effects such as anelasticity, hysteretic behavior (also known as the memory effect), and reduction of stiffness due to pore-water pressure. Numerical comparisons with two other hysteresis models have shown that the new model provides a better description of soil behavior for situations ranging from simple laboratory loading to earthquake accelerograms. This model is being used to analyze data recorded at the Van Norman Dam complex during the Northridge earthquake.

The accelerometer sites at Van Norman Dam have been drilled and logged for their geotechnical properties. Jamison Steidl has been instrumenting these boreholes to record small events in order to test linear response for comparison with possible nonlinear response during the Northridge mainshock. The borehole data are used as input to linear models characterized by geotechnical data provided in collaboration with the ROSRINE project. Steidl has been able to reproduce the surface observations in the time, frequency, and response-spectral domains for frequencies up to 10 Hz for the small earthquakes. This work will be extended to the nonlinear range with the Northridge mainshock.

In addition to the portable borehole instrumentation, Steidl and Archuleta have been installing permanent borehole instruments throughout the Los Angeles basin for the past three years. To date, seven boreholes have been drilled, logged, sampled for near-surface soil properties, and cased for installation. Four of the seven have been instrumented and are providing data in real time to the Caltech/USGS Southern California Seismic Network (SCSN); these data are available via the SCEC data center archive. A fifth borehole, located at the CDMG California Strong Motion Instrumentation Program (CSMIP) Obregon Park site in the LA basin, should be instrumented this fall; its data will be streamed in real time to both the SCSN and CDMG in Sacramento. A 350 m deep borehole has been drilled and cased in the LA basin at the Long Beach water reclamation plant, along with a shallow 30 m borehole at the same location. Both will be instrumented this year. This vertical array will provide critical data on the soil behavior of a "typical" LA basin site.
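For readers unfamiliar with hysteretic soil models, the sketch below conveys the flavor of such behavior using the classic hyperbolic backbone curve with Masing unload/reload rules. This is not the Lavallee-Bonilla-Archuleta model (which additionally treats anelasticity and pore-pressure effects), and all parameter values are hypothetical:

```python
import numpy as np

G0 = 60e6       # small-strain shear modulus (Pa), hypothetical
tau_max = 60e3  # shear strength (Pa), hypothetical

def backbone(gamma):
    """Hyperbolic backbone: tau = G0*gamma / (1 + G0*|gamma|/tau_max)."""
    return G0 * gamma / (1.0 + G0 * np.abs(gamma) / tau_max)

def masing_loop(gamma_hist):
    """Stress for a strain history: virgin loading follows the backbone;
    after each strain reversal the branch is the backbone scaled by 2
    (simple Masing rule, without extended loop-closure rules)."""
    tau = np.empty_like(gamma_hist)
    tau[0] = backbone(gamma_hist[0])
    g_rev = t_rev = None
    for i in range(1, len(gamma_hist)):
        d_new = gamma_hist[i] - gamma_hist[i - 1]
        d_old = gamma_hist[i - 1] - gamma_hist[i - 2] if i > 1 else d_new
        if d_new * d_old < 0:                       # strain reversal detected
            g_rev, t_rev = gamma_hist[i - 1], tau[i - 1]
        if g_rev is None:
            tau[i] = backbone(gamma_hist[i])        # virgin curve
        else:
            tau[i] = t_rev + 2.0 * backbone((gamma_hist[i] - g_rev) / 2.0)
    return tau

# One cycle: load to 0.2% strain, unload to -0.2%, reload to 0.2%.
gamma = np.concatenate([np.linspace(0, 2e-3, 50),
                        np.linspace(2e-3, -2e-3, 100)[1:],
                        np.linspace(-2e-3, 2e-3, 100)[1:]])
tau = masing_loop(gamma)
print("secant modulus at peak strain: %.2e Pa (vs G0 = %.2e Pa)"
      % (tau[49] / gamma[49], G0))   # shows stiffness reduction under load
```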
The M 7.1 Hector Mine earthquake generated an immediate response. Though there were stringent restrictions on the immediate availability of instruments and on access to field sites (the fault was contained entirely within the US Marine Corps Air Ground Combat Center), 85 recorders and sensors were placed in the field. Steidl and Aaron Martin coordinated the field deployment, which involved 25 individuals from six SCEC institutions (UCSB, USC, UCLA, UCSD, SDSU, and the USGS). Two dense arrays of seismometers were deployed near the fault trace two weeks after the mainshock. The northern array is at a remote site in the Bullion Mountains, very near the section of the fault with the most slip; the site is mostly hard rock. The array consists of two perpendicular lines, each about 1.0 km long. The fault-crossing line has 21 stations, and the fault-parallel line has 11 stations. The station spacing is 10-25 m near the fault, increasing to 50 m farther away. The southern array is located where the fault separates conglomerates on the west side from a sedimentary valley on the east side. During the mainshock this section of the fault had about 1.0 m of slip. The southern array is similar to the northern one, with two orthogonal lines; however, it is augmented with a 24-station 2D array extending from the fault into the sedimentary valley. The combination of data from the crossing array and the 2D array is expected to be useful for studying both fault-guided waves and basin reverberations.
Accounting for Site Effects in Probabilistic Seismic Hazard Analysis
(SCEC Phase III Report)
Overview
It has been known for over 100 years that neighboring sites can experience significantly different levels of shaking during earthquakes, a phenomenon referred to as a "site effect" or "site response". The most dramatic site effects are produced by sedimentary deposits, which influence ground motion through impedance changes, resonant modes, focusing and defocusing effects, basin-edge-induced surface waves, and nonlinear behavior. A fundamental question with respect to probabilistic seismic hazard analysis (PSHA) is whether and how such site effects can be accounted for. Specifically, given the variety of earthquake locations considered in PSHA, are there any site attributes that systematically predispose a location to greater or lower levels of shaking?
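Two of the simplest quantitative handles on the sediment effects listed above come from 1D layer theory: the fundamental resonance of a soil layer over rock, f0 = Vs/(4H), and a first-order impedance amplification, roughly the square root of the rock-to-soil impedance ratio. A sketch with hypothetical layer properties:

```python
# Single soil layer of thickness H over rock (standard 1D results):
#   fundamental resonance:      f0 = Vs / (4*H)
#   impedance amplification:    A ~ sqrt(rho_r*Vr / (rho_s*Vs))
H = 200.0                   # soil thickness (m), hypothetical
Vs, rho_s = 400.0, 1800.0   # soil shear velocity (m/s) and density (kg/m^3)
Vr, rho_r = 2700.0, 2500.0  # rock shear velocity and density

f0 = Vs / (4.0 * H)
A = ((rho_r * Vr) / (rho_s * Vs)) ** 0.5
print(f"fundamental resonance f0 = {f0:.2f} Hz, amplification ~ {A:.1f}")
```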
This question is being addressed by a working group of the Southern California Earthquake Center (the so-called SCEC Phase III effort). In a collection of ten papers that will soon be published together in the Bulletin of the Seismological Society of America, we have addressed the problem from a variety of angles, including both theoretical modeling and analysis of empirical data. Based on the findings, two customized attenuation relationships have been developed to account for site effects in southern California (Steidl; Lee & Anderson). These new models are being evaluated with respect to their implied seismic hazard (Field et al.).
Tentative Conclusions (based on existing data and theoretical considerations):
1) Detailed geological classifications (that is, beyond rock vs soil, or a Quaternary/Tertiary/Mesozoic distinction) are generally not significant with respect to site effects.
2) Sedimentary basin depth is a significant and important factor. For example, sites over the deepest parts of the LA basin have ground motion levels that are about 60% greater (on average) than sites near the edge. The physical explanation for this is not yet resolved; that is, the basin depth factor may be a proxy for something else (for example, the model by Joyner implies that distance from the edge, which is correlated with depth, is the relevant parameter).
3) The uncertainty in ground motion (i.e., sigma in the attenuation relationship) is not significantly reduced after making all possible site corrections. In other words, the intrinsic variability of response at a site remains high because basin effects, and scattering in general, are sensitive to source location. Thus, in terms of accounting for site effects, PSHA is reaching a point of diminishing returns; obtaining more precise estimates of ground motion will require deterministic waveform modeling.
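To see why sigma dominates, consider how it enters a hazard calculation. In the sketch below (with a hypothetical source rate and median ground motion), halving sigma sharply reduces the computed rate of exceeding a rare shaking level, which is why the finding that site corrections leave sigma essentially unchanged matters for PSHA:

```python
from math import erf, log, sqrt

def normal_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def exceedance_rate(y, nu=0.01, median=0.2, sigma=0.5):
    """Annual rate of ground motion exceeding y (in g) from one source
    with annual rate nu and a lognormal distribution (median, sigma)."""
    return nu * (1.0 - normal_cdf((log(y) - log(median)) / sigma))

for s in (0.7, 0.5, 0.35):
    print(f"sigma={s}: rate of PGA > 0.5 g = {exceedance_rate(0.5, sigma=s):.2e}")
```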
Issues that remain to be resolved in future studies include the exact influence of nonlinear sediment behavior (changes in amplification factors as a function of ground motion level), and ground motion during large (M>7) earthquakes, particularly in the near field. Existing data are very limited in terms of addressing these questions. As we await additional observations, theoretical investigations can provide valuable insights.
List of Papers (to be published together in the Bulletin of the Seismological Society of America early in 2000):
Accounting for Site Effects in Probabilistic Seismic Hazard Analysis: Overview of the SCEC Phase III Report
By: E.H. Field & the Phase III Working Group
Evaluation of Methods for Estimating Linear Site Response Amplifications in the Los Angeles Region
By: L.A. Wald and J. Mori
Site Amplification in the Los Angeles Basin from 3D Modeling of Ground Motion
By: K.B. Olsen
Site Response in Southern California for Probabilistic Seismic Hazard Analysis
By: J.H. Steidl
Evaluation of Empirical Attenuation Relations
By: Y. Lee, J.G. Anderson, and Y. Zeng
Expected Shape of Regressions for Ground Motion Parameters on Rock
By: J.G. Anderson
Expected Signature of Nonlinearity on Regression for Strong Ground Motion Parameter
By: S.-D. Ni, J.G. Anderson, Y. Zeng, and R.V. Siddharthan
Strong Motion from Surface Waves in Deep Sedimentary Basins
By: W.B. Joyner
Study of Residuals from a Regression on Strong Ground Motion
By: Y. Lee and J.G. Anderson
Probabilistic Seismic Hazard Calculations: Test of Various Possible Site Response Parameterizations
By: E.H. Field, M.D. Petersen, and C.H. Cramer
SCEC Working Group 2000 for the Development of Earthquake Probability Models for Southern California
(AKA SCEC Phase IV Initiative)
Goals: To develop in detail credible models of earthquake potential (the probability per unit area, magnitude, and time) for magnitude 5 and larger earthquakes in southern California.
To examine the implications of these models for seismic hazard.
To test these models for consistency with existing seismic, geologic, and geodetic data, and to design conclusive prospective tests for use with future data.
To describe the range of uncertainty in earthquake potential based on present data and understanding of earthquake processes.
Present Participants: Edward (Ned) Field, David D. Jackson, William Foxall, Mark Petersen, Lucile Jones, James Dolan, Steven Ward, Egill Hauksson, Julie Nazareth, Kate Hutton, Zheng-kang Shen, Yan Y. Kagan, John Anderson.
Overview
SCEC's Phase II report (WGCEP, 1995) represented the first effort to integrate seismic, geodetic, and geologic constraints into a complete seismic-hazard source model using the concept of seismic moment budgeting. However, the model predicted that magnitude 6 to 7 earthquakes should occur about twice as often as they have historically, which led to a widely publicized debate over whether the apparent deficit was real, an artifact of the limiting magnitude implied by fault size, or simply a reflection of uncertainties (Jackson, 1996; Hough, 1996; Schwartz, 1996; Stirling and Wesnousky, 1997; Stein and Hanks, 1998). Similarly, the model developed for the USGS/CDMG statewide hazard maps also exhibits a factor-of-two discrepancy near magnitude 6.5 (Petersen et al., 1996). Field et al. (1999) have since demonstrated that an alternative source model based on active fault data can also satisfy the historic earthquake data, and they identified several factors that produced the discrepancy in previous models.
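The moment-budgeting concept at the heart of this debate is simple to state: the rate of seismic moment accumulation on a fault (rigidity times fault area times slip rate) must, over the long term, be balanced by the moment released in earthquakes, which ties slip rates to earthquake rates once a magnitude distribution is assumed. A minimal sketch with hypothetical fault parameters:

```python
# Moment budgeting: moment accumulation rate = mu * area * slip_rate.
mu = 3e10              # crustal rigidity (Pa)
area = 100e3 * 15e3    # hypothetical fault: 100 km long x 15 km deep (m^2)
slip_rate = 0.025      # 25 mm/yr expressed in m/yr

moment_rate = mu * area * slip_rate      # N*m accumulated per year

# If the budget is released entirely in characteristic M 7.5 events:
M = 7.5
M0 = 10 ** (1.5 * M + 9.05)              # moment (N*m), Hanks-Kanamori scale
print(f"implied recurrence interval ~ {M0 / moment_rate:.0f} yr")
```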
Neither M>8 earthquakes nor an accelerated earthquake rate is required to satisfy the available data, but neither phenomenon is precluded. For example, time-dependent recurrence models generally predict a rate acceleration because most faults are deemed overdue. In addition, some models allow a finite probability of M ~8.5 earthquakes (Kagan, 1999). Because these models cannot be excluded, it behooves us to evaluate their implications for seismic hazard. This will define the range of hazard levels implied by our current understanding and identify important issues for future research.
Our approach is different from previous "working group" reports in that we are evaluating several viable models rather than constructing one consensus model. This approach is appropriate for several reasons: 1) Our immediate goal is a scientific study, rather than a policy document or engineering study; 2) we won't force a consensus when none exists; 3) participants will not be asked to compromise their best judgement; 4) a comparison of results will reveal which factors are most significant, thereby establishing a basis for setting future research priorities; 5) our approach will provide the background research needed by those who produce official source models (in terms of exploring possible logic-tree branches); and 6) we won't confuse the user community with yet another "consensus" hazard map or interfere with those whose mandate it is to generate such maps.
Part of the effort will involve updating and documenting the geological fault database, the earthquake catalog, and the geodetic strain-rate map. We envision publishing our results as a collection of papers in a peer-reviewed journal such as the Bulletin of the Seismological Society of America. Specifically, there will be separate papers on each model or class of models and on the updated data constraints. There will also be a paper comparing the hazard implications of each model, a paper outlining a formalism for testing the models against observed and/or synthetic earthquake catalogs, and an overview paper.
Models to be Tested
The models must be well documented, scientifically defensible, and publishable. Those currently slated for analysis are:
(1) The standard CDMG/USGS (1996) model.
(2a) A characteristic earthquake model similar to the 1988 Working Group model, with segmentation and no cascades, with and without time dependence. Field, Petersen, & Jackson
(2b) A geologically based characteristic earthquake model with strong cascade interactions (with and without time dependence). Field, Jackson, & Petersen
(2c) A geologically based model that abandons segmentation. Field, Jackson, & Petersen
(3) A model based on smoothed historical seismicity that uses a modified Gutenberg-Richter distribution with magnitudes up to 8.5 (see the sketch following this list). Jackson & Kagan
(4) A model with seismicity proportional to the maximum shear strain rate. Shen
(5) Models with alternative LA Basin fault geometries (known sources versus known and speculative sources). Foxall & Dolan.
(6) A "Standard Physical Model" that includes stress interaction between earthquakes. Ward.
(7) A model that includes spatial and temporal foreshock/aftershock statistics. Jones & Hauksson (tentative; Kagan).
We also invite other models. In particular, we are interested in models based on stress evolution and rate and state friction.
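One common form of the modified Gutenberg-Richter distribution referenced in model (3) is the tapered form with a corner moment (Kagan, 1999); whether this is the exact form the working group will adopt is not specified here. A sketch under that assumption:

```python
import numpy as np

def moment(Mw):
    """Seismic moment (N*m) from moment magnitude, Hanks-Kanamori scale."""
    return 10 ** (1.5 * Mw + 9.05)

def tapered_gr_survivor(m, m_t, m_c, beta=0.67):
    """P(M0 > m) for the tapered Gutenberg-Richter distribution with
    threshold moment m_t and corner moment m_c."""
    return (m_t / m) ** beta * np.exp((m_t - m) / m_c)

m_t = moment(5.0)   # magnitude 5 threshold, as in the specification below
m_c = moment(8.5)   # corner near M 8.5, per the text above
for Mw in (6.0, 7.0, 8.0, 8.5):
    p = tapered_gr_survivor(moment(Mw), m_t, m_c)
    print(f"fraction of M>=5 events exceeding M {Mw}: {p:.2e}")
```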
Specification of Models
Each model must specify the probability per unit area, magnitude, and time for events of magnitude 5.0 and larger throughout southern California as defined in the 1995 Working Group report, though models need not be restricted to this region alone. Models that focus on a subregion of southern California, such as alternative rupture scenarios for LA basin faults, will be folded into one of the complete models for testing and for comparing hazard implications.
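One natural way to encode this specification, shown purely as an illustration (the working group's actual discretization is not defined here), is a gridded table of expected annual rates per spatial cell and magnitude bin:

```python
import numpy as np

# Hypothetical grid: 0.1-degree cells over southern California and
# 0.1-magnitude bins from M 5.0 to M 8.5.
lons = np.arange(-121.0, -114.0, 0.1)
lats = np.arange(32.0, 37.0, 0.1)
mags = np.arange(5.0, 8.6, 0.1)

# rates[i, j, k]: expected annual number of events in cell (i, j), bin k.
rates = np.empty((len(lats), len(lons), len(mags)))
rates[:] = 10.0 / rates.size    # placeholder uniform background, 10 events/yr

# Total predicted annual rate of M >= 6 events over the whole region
# (compare against 5.95 to sidestep floating-point bin-edge issues):
m6 = mags >= 5.95
print("annual M>=6 rate:", rates[:, :, m6].sum())
```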
Other Potential Contributions
The models developed by WGCEP (1996), Petersen et al. (1996), and Field et al. (1999) all relied on the Wells and Coppersmith (1994) regression of magnitude versus fault length or area. This regression was derived from earthquakes around the globe, and some have suggested that California earthquakes have a larger magnitude for a given fault area. It would seem appropriate to reexamine these magnitude regressions, especially given the recent spate of M>7 earthquakes.
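For reference, the Wells and Coppersmith (1994) all-slip-type regression of moment magnitude on rupture area A (in km^2) is M = 4.07 + 0.98 log10(A); the suggestion above is that California earthquakes may lie above this global average for a given area. A one-line check:

```python
from math import log10

def wells_coppersmith_mag(area_km2):
    """Wells & Coppersmith (1994) all-slip-type regression: M(A)."""
    return 4.07 + 0.98 * log10(area_km2)

# Example: a 100 km x 15 km rupture implies M ~ 7.2.
print(f"M ~ {wells_coppersmith_mag(100 * 15):.1f}")
```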
Other topics of interest include: seismogenic thicknesses; the percentage of aseismic slip; the amount of seismic moment accommodated by compression of the Transverse Ranges; through-going rupture along the San Bernardino Mountains segment of the San Andreas; and whether previously defined segment boundaries are meaningful. We welcome paper submissions on these or other related topics.
Timeframe
Our goal is to have the fault database and earthquake catalog updated by February 2000, and to have all of the models/papers submitted for publication by August 1, 2000. In the meantime, each model will be examined for its probabilistic hazard implications and for agreement with existing seismic, geologic, and geodetic data. We anticipate publishing the report early in 2001.