
Probabilistic Seismic Hazard Analysis without the Ergodic Assumption

John G. Anderson, & James N. Brune

Published 1999, SCEC Contribution #451

An ergodic process is a random process in which the distribution of a random variable in space is the same as the distribution of that variable at a single point when sampled as a function of time. An ergodic assumption is commonly made in probabilistic seismic hazard analysis (PSHA). Regression analysis derives a mean curve to predict ground motions as a function of magnitude and distance (and sometimes other parameters). The standard deviation of the regression is determined mainly by the misfit between observations and the prediction at multiple stations for a small number of well-recorded earthquakes; thus the standard deviation is dominated by the statistics of the spatial variability of the ground motions. The basic model used for probabilistic seismic hazard analysis makes an ergodic assumption when it uses this estimate of the standard deviation to describe the temporal distribution of ground motion at a single site over multiple earthquakes. To the extent that path and site response play a major role in controlling ground motions, this assumption cannot be correct.
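The effect described above can be sketched with a minimal Monte Carlo simulation. All numbers here (site-term and event-term standard deviations, counts of sites and events) are hypothetical illustrations, not values from the paper: each site is given a fixed, repeatable site/path term, and the scatter pooled across many sites (what a regression measures) comes out larger than the scatter of repeated recordings at any single site (what governs that site's hazard).

```python
import numpy as np

rng = np.random.default_rng(0)

n_sites, n_events = 50, 200
# Hypothetical standard deviations (natural-log ground-motion units):
sigma_site = 0.5    # repeatable site/path term, fixed in time at each site
sigma_event = 0.3   # true event-to-event (temporal) variability

# Each site gets one fixed term; every event adds independent temporal scatter.
site_terms = rng.normal(0.0, sigma_site, size=n_sites)
residuals = site_terms[:, None] + rng.normal(0.0, sigma_event,
                                             size=(n_sites, n_events))

# Pooled std across all sites and events -- what a multi-station
# regression reports as "the" standard deviation:
sigma_regression = residuals.std()

# Std of the time series at one fixed site -- the quantity that actually
# controls the temporal distribution of ground motion there:
sigma_single_site = residuals[0].std()

print("pooled (spatial) sigma:   ", round(float(sigma_regression), 3))
print("single-site (temporal) sigma:", round(float(sigma_single_site), 3))
```

Under these assumptions the pooled sigma approaches sqrt(sigma_site^2 + sigma_event^2), while the single-site sigma approaches sigma_event alone, which is the sense in which the ergodic assumption overstates the variability relevant to one site.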

More general PSHA models distinguish between epistemic uncertainty (due to lack of knowledge) and aleatory uncertainty (due to truly random effects). A thought experiment involving a site where hazard is dominated by repetition of identical characteristic earthquakes on a single fault demonstrates that the correct separation of aleatory and epistemic uncertainty can have a large impact on the results of PSHA. We propose that the distinction between aleatory and epistemic uncertainty in the attenuation relationships depends on an absolute standard rather than a model-dependent standard. The aleatory uncertainty should only include uncertainty that arises from temporal dependence in the Earth's behavior, such as variability in the source processes on a fault that change from one earthquake to the next. In contrast, epistemic uncertainty treats the repeatable, but presently unknown, behavior caused by path and site response.
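The thought experiment above can be sketched numerically. In this illustrative simulation (all parameter values are hypothetical, not from the paper), a site experiences several repetitions of the same characteristic earthquake during an exposure time. The ergodic model redraws the full uncertainty independently for every event; the non-ergodic model fixes the epistemic site/path term and redraws only the aleatory part. The probability of exceeding a rare ground-motion level differs substantially between the two.

```python
import numpy as np

rng = np.random.default_rng(1)

n_trials = 20_000
n_eq = 10                 # characteristic-earthquake repeats in the exposure time
mu = 0.0                  # median log ground motion (hypothetical)
sigma_aleatory = 0.3      # true temporal, event-to-event variability
sigma_epistemic = 0.5     # repeatable but unknown site/path term
threshold = 1.2           # log ground-motion level of interest (hypothetical)

# Ergodic model: every event is an independent draw with the total sigma.
sigma_total = np.hypot(sigma_aleatory, sigma_epistemic)
ergodic = rng.normal(mu, sigma_total, size=(n_trials, n_eq))
p_ergodic = (ergodic.max(axis=1) > threshold).mean()

# Non-ergodic model: one fixed site term per trial (epistemic, sampled once),
# with only the aleatory scatter varying from earthquake to earthquake.
site = rng.normal(mu, sigma_epistemic, size=(n_trials, 1))
nonergodic = site + rng.normal(0.0, sigma_aleatory, size=(n_trials, n_eq))
p_nonergodic = (nonergodic.max(axis=1) > threshold).mean()

print("P(exceedance), ergodic:    ", round(float(p_ergodic), 3))
print("P(exceedance), non-ergodic:", round(float(p_nonergodic), 3))
```

With these assumed values, the ergodic model gives the site many independent chances to draw an extreme value, so it predicts a markedly higher exceedance probability at rare levels than the non-ergodic model, in which most sites (those without an unlucky fixed site term) rarely exceed the threshold no matter how many times the earthquake repeats.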

The optimum distribution of uncertainty between aleatory uncertainty and epistemic uncertainty must be determined from data, not assumed. Evidence from the distribution of precarious rocks near the San Andreas fault suggests that the ergodic assumption causes the aleatory uncertainty to be overestimated and the epistemic uncertainty to be underestimated. The distinction is important for any part of a seismic hazard analysis where the exposure time is large compared to the repeat time of the earthquakes, as may happen in the magnitude 5-6 range, or for larger events on very active faults.

Citation
Anderson, J. G., & Brune, J. N. (1999). Probabilistic Seismic Hazard Analysis without the Ergodic Assumption. Seismological Research Letters, 70(1), 19-28.