Optimal separation of the coseismic and postseismic deformation from the tectonic motion
(Progress Report)
Danan Dong
Space Geodetic Science and Applications Group
Jet Propulsion Laboratory, California Institute of Technology,
Pasadena, California
1. INTRODUCTION
This year the SCEC working group's phase I velocity map was published in EOS (Shen et al., 1997). There is increasing demand to develop the phase II velocity map and to release it to the public soon, so my first priority has been to continue this velocity map work. This year the geodesy group changed its policy and required each institution to develop its own velocity map independently. Although such a velocity map cannot be fully independent, because so much information from our previous work has been shared, I nevertheless report my work here as if it were "independent". I have also done software development work in support of my research and other groups; I do not report its details here and refer interested readers to our methodology paper (Dong et al., 1997), which has been accepted by the Journal of Geodesy (SCEC paper number 356). The separation of the coseismic and postseismic deformation fields from the steady tectonic motion has become the key factor in further improving the velocity map, in particular for testing the recently reported velocity change after the Landers earthquake (Bock et al., 1997). I report my preliminary results here.
2. HORIZONTAL VELOCITY FIELD IN SOUTHERN CALIFORNIA
My previous strategy was to use velocity constraints from both local and global sites to define the velocity reference frame and to allow stochastic perturbations in the vertical coordinates to absorb the inconsistencies between campaign quasi-observation solutions; in particular, I allowed some stochastic perturbations in the horizontal coordinates at most global sites (see Shen et al., 1997). This strategy forces the rates of the coordinate time series at the tightly constrained sites to be consistent with the velocity reference frame and, if there are inconsistencies, forces the coordinates to move to absorb them. Essentially, such a strategy is similar to the "local filtering" technique (Bock et al., 1997), which spends several degrees of freedom moving the local site coordinates so that the slopes of the coordinate time series remain consistent with the predetermined velocities. Because of unresolved systematic errors (see Bock et al., 1997, for details), such a technique is unavoidable when combining GPS campaign data, particularly the early campaign data and the quasi-observations from old GAMIT versions. However, it sacrifices the determination of the vertical deformation field, which is important for the GPS/SAR comparison and combination, and of the postseismic deformation field. Given that several new 1997 GPS campaign data sets are available and part of the old campaign data have been reanalyzed, I explore the possibility of reducing the ambiguities in the vertical velocity field and the postseismic deformation field, which in turn strengthens the determination of the horizontal velocity field.
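As a minimal sketch of how coordinate constraints behave in such a combination (a toy one-parameter normal equation of my own, not the QOCA formulation), a constraint enters as a pseudo-observation whose sigma decides whether the coordinate stays pinned to its a priori value or moves to absorb campaign-to-campaign inconsistencies:

```python
import numpy as np

def add_pseudo_obs(N, b, idx, value, x0, sigma):
    """Add the pseudo-observation x[idx] = value, with standard deviation
    sigma (m), to the normal equations N dx = b (dx is the correction to
    the a priori coordinate x0).  A small sigma pins the coordinate to
    'value'; a large sigma lets other data move it."""
    w = 1.0 / sigma**2
    N[idx, idx] += w
    b[idx] += w * (value - x0)
    return N, b

# Toy case: one coordinate estimated by two campaigns that disagree by 3 cm.
N = np.zeros((1, 1)); b = np.zeros(1); x0 = 0.0
for est, sig in [(0.00, 0.005), (0.03, 0.005)]:      # campaign estimates (m)
    N, b = add_pseudo_obs(N, b, 0, est, x0, sig)

# A tight 1 mm constraint to the a priori value keeps the coordinate near x0
# (the inconsistency is pushed elsewhere); a loose 1 m constraint lets the
# coordinate absorb it and settle near the campaign mean (~1.5 cm).
for prior_sigma in (0.001, 1.0):
    Nc, bc = add_pseudo_obs(N.copy(), b.copy(), 0, x0, x0, prior_sigma)
    print(prior_sigma, float(np.linalg.solve(Nc, bc)[0]))
```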
My new strategy is to perform the time series analysis based on the ITRF94 core site coordinates and velocities; the perturbation levels and the affected sites are then determined from that analysis. Since no ITRF94 core sites are common to all GPS campaigns, and some core sites could have problems during particular campaigns, I use a bootstrap approach. First, I constrain the coordinates of the ITRF94 core sites (class A and B) at the 1 meter level and loosely constrain the coordinates of the other sites. Then I gradually tighten the coordinate constraints at the ITRF94 core sites and check their mutual consistency (using the separate analysis mode of QOCA), removing inconsistent core sites from the tight constraining list. In the end, five ITRF94 core sites (DS10, VNDP, YELL, ALGO, FAIR) survive at the 1 cm horizontal and 2 cm vertical constraint level. However, only some of the GPS campaigns include more than three of these five core sites. Using that subset of campaign data and tightly constraining the five core sites, I check the consistency of the other sites, in particular the tracking sites with the most quasi-observations. Four sites (VNDN, DRAO, DS60, KOSG) turn out to be consistent with the five core sites. Adding constraints of 2 cm horizontal and 5 cm vertical on these four sites, and extending the analysis to campaigns containing more than three of the nine tightly constrained sites, I further check the consistency of the remaining sites. Nine more sites (KOKR, KOKE, MOJA, MOJM, MOJ1, ONSA, RICT, AUST, OVRO) are adopted as consistent, with constraints of 5 cm horizontal and 10 cm vertical. With these 18 selected sites I perform the time series analysis for all campaign data; at this stage, most campaigns contain at least three constrained sites. Figure 1 shows typical time series at two southern California sites.
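A minimal version of the consistency check used at each bootstrap stage is a detrended weighted RMS of each site's coordinate time series; the sketch below is my own illustration, not the statistics actually produced by QOCA, and flags sites whose scatter about a linear trend exceeds a chosen threshold:

```python
import numpy as np

def detrended_wrms(t, x, sigma):
    """Weighted RMS of a coordinate time series about its best-fit linear
    trend (offset + rate).  A large value flags the site as inconsistent
    with the constrained reference frame."""
    t, x, sigma = map(np.asarray, (t, x, sigma))
    w = 1.0 / sigma**2
    A = np.vstack([np.ones_like(t), t - t.mean()]).T
    coef, *_ = np.linalg.lstsq(A * np.sqrt(w)[:, None], x * np.sqrt(w), rcond=None)
    resid = x - A @ coef
    return np.sqrt(np.sum(w * resid**2) / np.sum(w))

# Synthetic east series: 30 mm/yr rate, 5 mm scatter -> WRMS near 5 mm,
# so this site would pass a few-centimeter consistency threshold.
rng = np.random.default_rng(0)
t = np.arange(1990.0, 1998.0, 0.5)
x = 0.030 * (t - t[0]) + rng.normal(0.0, 0.005, t.size)
print(detrended_wrms(t, x, np.full(t.size, 0.005)))
```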
I also tested the internal constraint approach (see Heflin et al., 1992). For lack of a consistent set of tracking stations, the results were mixed, reflecting a redistribution of the inconsistencies. From the time series analysis I reach the following conclusions:
1. Under these constraints, the consistency of the horizontal coordinate time series (after removing trends and coseismic jumps) at southern California sites is at the 2-5 centimeter level. The consistency of the vertical coordinate time series is at the several-decimeter level.
2. We also see common deviations in the time series at local sites (see Figure 1, east component), which are more likely due to unresolved systematic errors than to the real tectonic motion described by Bock et al. (1997); a simple stacking sketch for isolating such common deviations follows this list.
3. The velocity change after the Landers earthquake can be marginally seen in the time series at several local sites (see Figure 1, north component of PINT). To gain confidence in it, we need to tightly constrain more local sites so that the common deviations from unresolved systematic errors can be reduced or eliminated; at present we lack sufficient a priori knowledge to do so. The combination of campaign and continuous GPS data will be helpful.
4. In the reanalyzed STRC89 quasi-observation data, the local sites and the global sites disagree in the east component at the 30-50 cm level. I suspect that some 1989 global sites used Minimac receivers, which differed from the local campaign GPS receivers and had time tag differences of more than half a second. Synchronizing the different time tags to a common epoch probably left inconsistencies at the 1.0-1.5 nanosecond level, which corresponds to roughly 30-45 cm of range at the speed of light.
5. In the original (not yet reprocessed) quasi-observation data, the global sites disagree, mostly in the north component and particularly between the northern and southern hemispheres. Such an inconsistency indicates latitude-dependent errors, probably related to the old GAMIT version. The most significantly affected campaigns are CONTRACT92, VEN93 and VF4.
6. The vertical component time series are consistent for most campaigns (see Figure 1). Once all campaign data have been reprocessed with reliable antenna phase center modeling, we should be able to reduce or remove the vertical stochastic perturbation and obtain the vertical deformation field.
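One way to isolate the common deviations noted in conclusion 2 (a simple stacking sketch of my own, not necessarily the procedure of Bock et al., 1997) is to average the detrended residuals of the local sites epoch by epoch and remove that common-mode series:

```python
import numpy as np

def common_mode(residuals):
    """Epoch-by-epoch mean of detrended residuals from several local sites
    (rows = sites, columns = epochs; NaN marks epochs a site did not
    observe).  Subtracting it suppresses deviations shared by all sites."""
    return np.nanmean(np.asarray(residuals, dtype=float), axis=0)

# Toy example: three sites share a 10 mm excursion at the third epoch.
resid = 1e-3 * np.array([[ 0.0,  1.0, 10.0, -1.0],
                         [ 1.0, -1.0, 11.0,  0.0],
                         [-1.0,  0.0,  9.0,  1.0]])   # metres
cm = common_mode(resid)
print(cm)                    # the shared excursion shows up at epoch 3
print(resid - cm)            # site residuals after removing the common mode
```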
Based on conclusions 1 and 2, it seems premature to use the coordinate constraint approach, particularly when part of the campaign data have not yet been reanalyzed, so I still adopt the velocity constraint approach aligned to the ITRF94 velocity reference frame. Following conclusion 4, I remove all global sites from the STRC89 data. Following conclusion 5, I remove some of the global sites from the CONTRACT92, VEN93 and VF4 data, mostly southern hemisphere and high-latitude sites. The vertical stochastic perturbation level is set to 0.01 m²/yr, with no perturbation at ALGO, DRAO, DS60, FAIR, KOKE and KOSG. The remaining procedures are similar to my previous approach, except for the treatment of the postseismic deformation field (see next section). Figures 2 and 3 show the velocity fields from GPS only and from the GPS/EDM combination (420 effective sites).
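For scale, if the stochastic perturbation is treated as a random walk with variance rate q = 0.01 m²/yr (the usual interpretation of such a perturbation level, assumed here), the vertical coordinate is allowed to wander by roughly sqrt(q·Δt) from the reference epoch:

```python
import numpy as np

q = 0.01                              # vertical perturbation level (m^2/yr)
for dt in (0.5, 1.0, 2.0, 4.0):       # years from the reference epoch
    print(f"{dt:3.1f} yr: allowed vertical wander ~ {100*np.sqrt(q*dt):4.1f} cm")
```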
3. POSTSEISMIC DEFORMATION FIELD
I adopt the work of Lu (1987) as the theoretical framework; it extends the theory of detectability for a single deformation model to the separability of two deformation models and introduces the correlation cofactor matrix between them. Given the observations and the parameterizations of two deformation models, we can quantitatively assess their correlation coefficient and separability factor (i.e., whether the probability of a wrong decision between the two models is smaller than a chosen criterion). I omit the mathematical details here.
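For two one-parameter models, a minimal version of this correlation is the normalized inner product of their predicted signals under the data weights; the sketch below is my simplified illustration of the idea, not Lu's full cofactor-matrix formulation:

```python
import numpy as np

def model_correlation(g1, g2, w=None):
    """Correlation between two one-parameter deformation models whose
    predicted signals at the observation epochs are g1 and g2.  |rho|
    close to 1 means the data cannot separate the two models."""
    g1, g2 = np.asarray(g1), np.asarray(g2)
    w = np.ones_like(g1) if w is None else np.asarray(w)
    return np.sum(w * g1 * g2) / np.sqrt(np.sum(w * g1**2) * np.sum(w * g2**2))

# Can sparse campaign epochs distinguish a steady rate from a slow
# logarithmic transient?  With these epochs the correlation is ~0.99,
# i.e. the two models are barely separable.
t = np.array([0.2, 0.8, 1.5, 2.5, 4.0])   # years after the earthquake
tau = 3.0                                  # characteristic time (yr)
print(model_correlation(t, np.log(1.0 + t / tau)))
```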
The coseismic deformation is relatively easy to separate from the secular tectonic motion as long as there are multiple measurements spanning more than one year before and after the earthquake. There are three commonly used postseismic deformation models (see Shen et al., 1994 for details; illustrative functional forms are sketched after this list):
(1) The exponential decay model reflects the relaxation behavior of viscoelastic material (lower crust or upper mantle).
(2) The logarithmic model is supported by rock mechanics experiments.
(3) The power law model follows from the aftershock seismicity.
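As a quick illustration of the three models' time behavior, a minimal sketch follows; the exact parameterizations are assumptions made here for illustration, not necessarily the forms used by Shen et al. (1994):

```python
import numpy as np

# dt is time since the earthquake in years.
def exp_decay(dt, D, tau):        # (1) viscoelastic relaxation; converges to D
    return D * (1.0 - np.exp(-dt / tau))

def logarithmic(dt, D, tau):      # (2) rock-mechanics form; grows without bound
    return D * np.log(1.0 + dt / tau)

def power_law(dt, A, p):          # (3) cumulative form of decaying aftershock activity (p < 1)
    return A * dt**p

dt = np.array([0.1, 1.0, 3.0, 10.0])
print(exp_decay(dt, 0.05, 3.0))
print(logarithmic(dt, 0.05, 3.0))
print(power_law(dt, 0.02, 0.5))
```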
Among the three models, only the exponential decay model converges with time, so that it is separable from the velocity estimate. Over a long period, the logarithmic model and the power law model display behavior similar to a "velocity change". In reality, however, it is unrealistic to expect the crustal rock to keep breaking forever. I therefore modify the logarithmic model as
d = D log[1 + (t - t_0)/τ]    when t - t_0 < τ
d = D                         when t - t_0 ≥ τ

where t_0 is the earthquake epoch and τ is the characteristic time.
Such a modified model makes the separation of the postseismic deformation field and the velocity field possible if our data span after the earthquake is significantly longer than the characteristic time τ. In this model the parameter D has a clearer definition, namely the maximum amplitude of the postseismic deformation. I used this modified logarithmic model to estimate the postseismic deformation field after the Landers earthquake. Because of the sparse temporal sampling after the Landers earthquake, the separability of the postseismic deformation field from the velocity field is small unless strong constraints are assigned to one of them. I tentatively assigned an a priori postseismic deformation model ranging from 3% of the coseismic displacements for near-field sites, with 6 mm constraints, to 10% of the coseismic displacements for far-field sites, with 1 mm constraints. The characteristic time was set to 3 years. Such a model did improve the consistency of the estimated velocity field (Figure 2). However, this treatment is somewhat subjective, and I plan to loosen the constraints when performing the combination of campaign and continuous GPS data.
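A sketch of the modified model, transcribed directly from the piecewise form above, is given below; the 1 m coseismic offset is a hypothetical value used only to illustrate the 3% near-field amplitude scaling:

```python
import numpy as np

def modified_log(dt, D, tau):
    """Piecewise modified logarithmic model: logarithmic growth while
    dt < tau, capped at the maximum amplitude D once dt >= tau
    (dt = time since the earthquake, in years)."""
    dt = np.asarray(dt, dtype=float)
    return np.where(dt < tau, D * np.log(1.0 + dt / tau), D)

# Example with the values quoted above: tau = 3 yr and an a priori amplitude
# of 3% of a (hypothetical) 1 m coseismic offset at a near-field site.
dt = np.array([0.5, 1.0, 2.0, 3.0, 5.0])
print(modified_log(dt, 0.03 * 1.0, 3.0))
```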
REFERENCES
Bock, Y., et al., Southern California Permanent GPS Geodetic Array: continuous measurements of regional crustal deformation between the 1992 Landers and 1994 Northridge earthquakes, J. Geophys. Res., 102, 18013-18033, 1997
Shen, Z., et al., Geodetic measurements of southern California
crustal deformation, EOS, Trans. Am. Geophys. Union, 78,
No. 43, 477-482, 1997
Dong, D., T. H. Herring, and R. W. King, Estimating regional
deformation from a combination of space and terrestrial geodetic
data, Journal of Geodesy, in press, 1997
Heflin, M., et al., Global geodesy using GPS without fiducial sites, Geophys. Res. Lett., 19, 131-134, 1992
Lu, G., On the separability of deformation models, Z. f. Verm., 11, 555-563, 1987
Shen, Z., et al., Postseismic deformation following the Landers
earthquake, Calif., June 28, 1992, Bull. Seis. Soc. Amer.,
84, 780-791, 1994