It provides authoritative estimates of the likelihood and severity of potentially damaging earthquake ruptures in the long- and near-term.
[5] This allows seismicity to be distributed in a more realistic manner, which has corrected a problem with prior studies that overpredicted earthquakes of moderate size (between magnitude 6.5 and 7.0).
This requires accommodation of 34 to 48 millimeters (about 1.3 to 1.9 inches) of slippage per year,[19] with some of that taken up in portions of the Basin and Range Province to the east of California.
In theory, this should produce some regularity in the earthquakes on a given fault, and knowing the date of the last rupture is a clue to how soon the next one can be expected.
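For illustration, the following minimal sketch shows this reasoning for a single fault; the recurrence interval, its variability, and the rupture date are hypothetical placeholders, not UCERF3 values.

```python
# Illustrative only: a naive recurrence-interval estimate for one fault.
# All numbers are hypothetical placeholders, not values from UCERF3.

mean_recurrence_yr = 150      # assumed mean time between ruptures on the fault
std_recurrence_yr = 50        # assumed variability in that interval
last_rupture_year = 1857      # assumed date of the most recent rupture

expected_next = last_rupture_year + mean_recurrence_yr
window = (expected_next - std_recurrence_yr, expected_next + std_recurrence_yr)

print(f"Expected next rupture around {expected_next}, "
      f"roughly between {window[0]} and {window[1]}")
```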
[29] This uses a supercomputer to solve a system of linear equations that simultaneously satisfies multiple constraints, such as known slip rates.
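As a rough sketch of what such an inversion looks like, the toy example below solves a tiny made-up system for rupture rates that reproduce assumed section slip rates; the fault layout and all numbers are hypothetical, and the simple non-negative least-squares solver stands in for the vastly larger computation described above.

```python
# A toy "inversion": solve for rupture rates that satisfy slip-rate constraints.
# Fault geometry, slips, and target rates are invented for illustration.
import numpy as np
from scipy.optimize import nnls

# Rows: fault sections; columns: candidate ruptures.
# Entry (i, j) = average slip (m) that rupture j puts on section i.
slip_per_event = np.array([
    [1.0, 0.0, 2.0],
    [1.0, 1.5, 2.0],
    [0.0, 1.5, 2.0],
])

# Target long-term slip rates for each section (m/yr), e.g. 34 mm/yr = 0.034.
target_slip_rate = np.array([0.034, 0.040, 0.020])

# Solve for non-negative rupture rates (events/yr) that best reproduce
# the section slip rates in a least-squares sense.
rupture_rates, misfit = nnls(slip_per_event, target_slip_rate)
print("rupture rates (events/yr):", rupture_rates)
print("misfit:", misfit)
```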
[31] While UCERF3 represents a considerable improvement over UCERF2,[32] and the best available science to date for estimating California's earthquake hazard,[33] the authors caution that it remains an approximation of the natural system.
[37] There are a number of sources of uncertainty, such as insufficient knowledge of fault geometry (especially at depth) and slip rates,[38] and balancing the various elements of the model to achieve the best fit with the available observations remains a considerable challenge.
The data does fit if a certain constraint (the regional Magnitude-Frequency Distribution) is relaxed, but this brings back the problem of overpredicting moderate events.
The model implies that achieving Gutenberg-Richter (GR) consistency would require certain changes in seismological understanding that "fall outside the current bounds of consensus-level acceptability".
[40] Whether the Gutenberg-Richter relation is inapplicable at the scale of individual faults, or some basis of the model is incorrect, "will be equally profound scientifically, and quite consequential with respect to hazard."
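For context, the Gutenberg-Richter relation states that the number of earthquakes at or above magnitude M falls off exponentially, log10 N(≥M) = a − bM, with b typically close to 1. The sketch below uses arbitrary example values of a and b (not California's regional fit) to show the kind of magnitude-frequency distribution the model is constrained to match.

```python
# The Gutenberg-Richter magnitude-frequency relation: log10 N(>=M) = a - b*M.
# a and b here are arbitrary example values, not California's regional fit.
import numpy as np

a, b = 5.0, 1.0                              # example productivity and b-value
magnitudes = np.arange(5.0, 8.1, 0.5)
annual_counts = 10 ** (a - b * magnitudes)   # expected events/yr at or above M

for m, n in zip(magnitudes, annual_counts):
    print(f"M >= {m:.1f}: ~{n:.3f} events/yr")
```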