GLORIA

GEOMAR Library Ocean Research Information Access


Export
  • 1
    Publication Date: 2021-06-25
    Description: The assessment of earthquake forecast models for practical purposes requires more than simply checking model consistency in a statistical framework. One also needs to understand how to construct the best model for specific forecasting applications. We describe a Bayesian approach to evaluating earthquake forecasting models, and we consider related procedures for constructing ensemble forecasts. We show how evaluations based on Bayes factors, which measure the relative skill among forecasts, can be complementary to common goodness-of-fit tests used to measure the absolute consistency of forecasts with data. To construct ensemble forecasts, we consider averages across a forecast set, weighted by either posterior probabilities or inverse log-likelihoods derived during prospective earthquake forecasting experiments. We account for model correlations by conditioning weights using the Garthwaite–Mubwandarikwa capped eigenvalue scheme. We apply these methods to the Regional Earthquake Likelihood Models (RELM) five-year earthquake forecast experiment in California, and we discuss how this approach can be generalized to other ensemble forecasting applications. Specific applications of seismological importance include experiments being conducted within the Collaboratory for the Study of Earthquake Predictability (CSEP) and ensemble methods for operational earthquake forecasting.
    Status: Published
    Pages: 2574–2584
    Project: 4.2. TTC - Models for estimating seismic hazard at the national scale
    Journal Type: JCR Journal
    Access: restricted
    Keywords: earthquake forecasting ; ensemble model ; 04. Solid Earth::04.06. Seismology::04.06.02. Earthquake interactions and probability
    Repository Name: Istituto Nazionale di Geofisica e Vulcanologia (INGV)
    Type: article
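The weighting schemes this abstract describes can be illustrated with a short sketch. A minimal version, assuming plain inverse-negative-log-likelihood weights (one of the two schemes mentioned); function names are illustrative, and the Garthwaite–Mubwandarikwa capped eigenvalue conditioning the paper applies on top of these weights is not reproduced here:

```python
import numpy as np

def inverse_loglik_weights(log_likelihoods):
    """Weight each forecast by the inverse of its negative log-likelihood.

    Forecast log-likelihoods are negative; a larger (less negative) value
    means a better fit to the observed catalog, so 1/(-logL) grows with
    forecast skill.  Weights are normalized to sum to 1.
    """
    ll = np.asarray(log_likelihoods, dtype=float)
    inv = 1.0 / (-ll)            # assumes every logL < 0
    return inv / inv.sum()

def ensemble_rates(rate_grids, weights):
    """Weighted average of per-model expected-rate grids (one grid per model)."""
    grids = np.asarray(rate_grids, dtype=float)
    w = np.asarray(weights, dtype=float)
    return np.tensordot(w, grids, axes=1)
```

A Bayesian variant would replace `inverse_loglik_weights` with posterior model probabilities derived from Bayes factors; the averaging step is unchanged.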
  • 2
    Publication Date: 2021-06-25
    Description: We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time-dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground-motion exceedance probabilities as well as short-term rupture probabilities—in concert with the long-term forecasts of probabilistic seismic-hazard analysis (PSHA).
    Status: Published
    Pages: 955–959
    Project: 3T. Seismic hazard and its contribution to risk definition
    Journal Type: JCR Journal
    Access: reserved
    Keywords: Operational earthquake forecasting ; seismic preparedness ; 04. Solid Earth::04.06. Seismology::04.06.11. Seismic risk
    Repository Name: Istituto Nazionale di Geofisica e Vulcanologia (INGV)
    Type: article
  • 3
    Publication Date: 2013-03-22
    Description: The Regional Earthquake Likelihood Models (RELM) working group designed a 5-year experiment to forecast the number, spatial distribution, and magnitude distribution of subsequent target earthquakes, defined to be those with magnitude ≥4.95 (M 4.95+) in a well-defined California testing region. Included in the experiment specification were the description of the data source, the methods for data processing, and the proposed evaluation metrics. The RELM experiment began on 1 January 2006 and involved 17 time-invariant forecasts constructed by seismicity modelers; by the end of the experiment on 1 January 2011, 31 target earthquakes had occurred. We analyze the experiment outcome by applying the proposed consistency tests based on likelihood measures and additional comparison tests based on a measure of information gain. We find that the smoothed seismicity forecast by Helmstetter et al. (2007), based on M 2+ earthquakes since 1981, is the best forecast, regardless of whether aftershocks are included in the analysis. The RELM experiment has helped to clarify ideas about testing that can be applied to more wide-ranging earthquake forecasting experiments conducted by the Collaboratory for the Study of Earthquake Predictability (CSEP). Online Material: Figures and tables showing the RELM testing region and collection region definitions, numerical results associated with the RELM experiment, and the uncorrected forecast by Ebel et al. (2007).
    Print ISSN: 0037-1106
    Electronic ISSN: 1943-3573
    Topics: Geosciences, Physics
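The likelihood-based consistency tests this abstract mentions include, in CSEP practice, a count-based N-test that compares the observed number of target earthquakes with the forecast's expected count. A minimal sketch, assuming a Poisson distribution on the total count (the function name and the two-quantile form are illustrative of the common CSEP convention, not taken from this paper):

```python
import math

def n_test(n_forecast, n_observed):
    """Poisson N-test quantile scores for a forecast expecting `n_forecast`
    events when `n_observed` events occurred.

    Returns (delta1, delta2):
      delta1 = P(N >= n_observed)  -- small if the forecast badly underpredicts
      delta2 = P(N <= n_observed)  -- small if the forecast badly overpredicts
    """
    # Poisson CDF: P(N <= n_observed) under mean n_forecast
    cdf = sum(math.exp(-n_forecast) * n_forecast**k / math.factorial(k)
              for k in range(n_observed + 1))
    pmf_at_obs = (math.exp(-n_forecast) * n_forecast**n_observed
                  / math.factorial(n_observed))
    delta1 = 1.0 - cdf + pmf_at_obs   # P(N >= n_observed)
    delta2 = cdf                      # P(N <= n_observed)
    return delta1, delta2
```

A forecast fails the test at significance level alpha when either quantile falls below alpha/2.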
  • 4
    Publication Date: 2012-12-01
    Description: The assessment of earthquake forecast models for practical purposes requires more than simply checking model consistency in a statistical framework. One also needs to understand how to construct the best model for specific forecasting applications. We describe a Bayesian approach to evaluating earthquake forecasting models, and we consider related procedures for constructing ensemble forecasts. We show how evaluations based on Bayes factors, which measure the relative skill among forecasts, can be complementary to common goodness-of-fit tests used to measure the absolute consistency of forecasts with data. To construct ensemble forecasts, we consider averages across a forecast set, weighted by either posterior probabilities or inverse log-likelihoods derived during prospective earthquake forecasting experiments. We account for model correlations by conditioning weights using the Garthwaite–Mubwandarikwa capped eigenvalue scheme. We apply these methods to the Regional Earthquake Likelihood Models (RELM) five-year earthquake forecast experiment in California, and we discuss how this approach can be generalized to other ensemble forecasting applications. Specific applications of seismological importance include experiments being conducted within the Collaboratory for the Study of Earthquake Predictability (CSEP) and ensemble methods for operational earthquake forecasting. Online Material: Tables of likelihoods for each testing phase and code to analyze the RELM experiment.
    Print ISSN: 0037-1106
    Electronic ISSN: 1943-3573
    Topics: Geosciences, Physics
  • 5
    Publication Date: 2017-05-31
    Description: We, the ongoing Working Group on California Earthquake Probabilities, present a spatiotemporal clustering model for the Third Uniform California Earthquake Rupture Forecast (UCERF3), with the goal of representing aftershocks, induced seismicity, and otherwise triggered events as a potential basis for operational earthquake forecasting (OEF). Specifically, we add an epidemic-type aftershock sequence (ETAS) component to the previously published time-independent and long-term time-dependent forecasts. This combined model, referred to as UCERF3-ETAS, collectively represents a relaxation of segmentation assumptions, the inclusion of multifault ruptures, an elastic-rebound model for fault-based ruptures, and a state-of-the-art spatiotemporal clustering component. It also represents an attempt to merge fault-based forecasts with statistical seismology models, such that information on fault proximity, activity rate, and time since last event is considered in OEF. We describe several unanticipated challenges that were encountered, including a need for elastic rebound and characteristic magnitude–frequency distributions (MFDs) on faults, both of which are required to get realistic triggering behavior. UCERF3-ETAS produces synthetic catalogs of M ≥2.5 events, conditioned on any prior M ≥2.5 events that are input to the model. We evaluate results with respect to both long-term (1000-year) simulations and 10-year periods following a variety of hypothetical scenario mainshocks. Although the results are very plausible, they are not always consistent with the simple notion that triggering probabilities should be greater if a mainshock is located near a fault. Important factors include whether the MFD near faults includes a significant characteristic earthquake component, as well as whether large triggered events can nucleate from within the rupture zone of the mainshock. Because UCERF3-ETAS has many sources of uncertainty, as will any subsequent version or competing model, potential usefulness needs to be considered in the context of actual applications. Electronic Supplement: Figures showing discretization, verification of the DistanceDecayCubeSampler, average simulated participation rate, and average cumulative magnitude–frequency distributions (MFDs).
    Print ISSN: 0037-1106
    Electronic ISSN: 1943-3573
    Topics: Geosciences, Physics
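The ETAS component this record describes has a standard temporal form: a background rate plus a modified-Omori contribution from every prior event, scaled exponentially with magnitude. A minimal sketch of that conditional intensity; the parameter values are placeholders for illustration, not UCERF3-ETAS values:

```python
import math

def etas_rate(t, catalog, mu=0.5, K=0.02, alpha=1.0, c=0.01, p=1.2, m0=2.5):
    """Temporal ETAS conditional intensity (events/day) at time t (days).

    catalog: list of (t_i, m_i) pairs for prior events.
    lambda(t) = mu + sum_i K * exp(alpha*(m_i - m0)) / (t - t_i + c)**p
    All parameter values here are illustrative placeholders.
    """
    rate = mu
    for t_i, m_i in catalog:
        if t_i < t:  # only events before t contribute
            rate += K * math.exp(alpha * (m_i - m0)) / (t - t_i + c) ** p
    return rate
```

UCERF3-ETAS layers this clustering on top of fault-based long-term rates rather than a uniform background `mu`, which is where the elastic-rebound and characteristic-MFD complications the abstract describes arise.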
  • 6
    Publication Date: 2014-06-12
    Description: We generalize the formulation of probabilistic seismic hazard analysis to accommodate simulation-based hazard models by expressing the joint probability distribution among the parameters of a kinematically complete earthquake rupture forecast in terms of a conditional hypocenter distribution and a conditional slip distribution. The seismological hierarchy implied by these dependencies allows the logarithmic excitation functional to be exactly and uniquely decomposed into a series of uncorrelated terms that include zero-mean averages of the site, source, hypocenter, and source-complexity effects. We use this averaging-based factorization to compare the CyberShake prototype hazard model developed by the Southern California Earthquake Center, CS11, with the empirical ground-motion prediction equations (GMPEs) of the 2008 Next Generation Attenuation (NGA08) project. For horizontal-response spectral accelerations at long periods (2–10 s), the basin and directivity effects of CS11 are substantially larger than those of the NGA08 GMPEs. Directivity–basin coupling and other 3D wave propagation effects not represented in the GMPEs contribute significantly to the excitation patterns in CS11. The total variance of the CS11 excitations is about 60% higher than the NGA root mean square (rms) at the 2 s period but almost 30% lower at 10 s. Relative to the NGA rms, the residual variance in CS11 at 2 s is larger than the aleatory variability in the NGA08 database by a factor of nearly 1.6. Recent CyberShake experiments with alternative source and structural models suggest that the high CS11 variances are due to an overestimation of the basin and directivity effects at short periods. The CyberShake site and path effects unexplained by the NGA08 models account for 40%–50% of total residual variance, suggesting that improvements to the simulation-based hazard models could reduce the aleatory variability intrinsic to the current GMPEs by as much as 25%.
    Print ISSN: 0037-1106
    Electronic ISSN: 1943-3573
    Topics: Geosciences, Physics
  • 7
    Publication Date: 2014-06-12
    Description: The 2014 Working Group on California Earthquake Probabilities (WGCEP14) presents the time-independent component of the Uniform California Earthquake Rupture Forecast, Version 3 (UCERF3), which provides authoritative estimates of the magnitude, location, and time-averaged frequency of potentially damaging earthquakes in California. The primary achievements have been to relax fault segmentation and include multifault ruptures, both limitations of UCERF2. The rates of all earthquakes are solved for simultaneously and from a broader range of data, using a system-level inversion that is both conceptually simple and extensible. The inverse problem is large and underdetermined, so a range of models is sampled using an efficient simulated annealing algorithm. The approach is more derivative than prescriptive (e.g., magnitude–frequency distributions are no longer assumed), so new analysis tools were developed for exploring solutions. Epistemic uncertainties were also accounted for using 1440 alternative logic-tree branches, necessitating access to supercomputers. The most influential uncertainties include alternative deformation models (fault slip rates), a new smoothed seismicity algorithm, alternative values for the total rate of Mw ≥5 events, and different scaling relationships, virtually all of which are new. As a notable first, three deformation models are based on kinematically consistent inversions of geodetic and geologic data, also providing slip-rate constraints on faults previously excluded due to lack of geologic data. The grand inversion constitutes a system-level framework for testing hypotheses and balancing the influence of different experts. For example, we demonstrate serious challenges with the Gutenberg–Richter hypothesis for individual faults. UCERF3 is still an approximation of the system, however, and the range of models is limited (e.g., constrained to stay close to UCERF2). Nevertheless, UCERF3 removes the apparent UCERF2 overprediction of M 6.5–7 earthquake rates and also includes types of multifault ruptures seen in nature. Although UCERF3 fits the data better than UCERF2 overall, there may be areas that warrant further site-specific investigation. Supporting products may be of general interest, and we list key assumptions and avenues for future model improvements.
    Print ISSN: 0037-1106
    Electronic ISSN: 1943-3573
    Topics: Geosciences, Physics
  • 8
    Publication Date: 2015-01-30
    Description: We derive time-dependent, renewal-model earthquake probabilities for the case in which the date of the last event is completely unknown, and compare these with the time-independent Poisson probabilities that are customarily used as an approximation in this situation. For typical parameter values, the renewal-model probabilities exceed Poisson results by more than 10% when the forecast duration exceeds ~20% of the mean recurrence interval. We also derive probabilities for the case in which the last event is further constrained to have occurred before historical record keeping began (the historic open interval), which can only serve to increase earthquake probabilities for typically applied renewal models. We conclude that accounting for the historic open interval can improve long-term earthquake rupture forecasts for California and elsewhere.
    Print ISSN: 0037-1106
    Electronic ISSN: 1943-3573
    Topics: Geosciences, Physics
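The renewal-versus-Poisson contrast in this abstract can be sketched with a standard renewal-theory result: when the date of the last event is completely unknown, the elapsed time is distributed according to the equilibrium (forward recurrence time) density S(x)/mu, where S is the recurrence-time survival function and mu the mean recurrence interval, so the probability of an event within a window T is the integral of S over [0, T] divided by mu. This is a generic illustration, not the paper's exact derivation:

```python
import math

def prob_event_unknown_last(surv, mean_recurrence, duration, n=10000):
    """Probability of >=1 event within `duration` for a renewal process
    whose date of last event is completely unknown.

    surv(x): survival function of the recurrence-time distribution, S(x).
    Integrates S(x)/mu over [0, duration] (midpoint rule), which is the
    CDF of the equilibrium forward-recurrence-time distribution.
    """
    dx = duration / n
    integral = sum(surv((i + 0.5) * dx) for i in range(n)) * dx
    return integral / mean_recurrence

def poisson_prob(mean_recurrence, duration):
    """Time-independent Poisson probability customarily used instead."""
    return 1.0 - math.exp(-duration / mean_recurrence)
```

With an exponential recurrence distribution the two expressions coincide; with a more periodic (low-aperiodicity) distribution the renewal probability exceeds the Poisson value once the window is an appreciable fraction of the mean recurrence interval, consistent with the abstract's ~20% figure.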
  • 9
    Publication Date: 2015-04-01
    Description: The 2014 Working Group on California Earthquake Probabilities (WGCEP 2014) presents time-dependent earthquake probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3). Building on the UCERF3 time-independent model published previously, renewal models are utilized to represent elastic-rebound-implied probabilities. A new methodology has been developed that solves applicability issues in the previous approach for unsegmented models. The new methodology also supports magnitude-dependent aperiodicity and accounts for the historic open interval on faults that lack a date-of-last-event constraint. Epistemic uncertainties are represented with a logic tree, producing 5760 different forecasts. Results for a variety of evaluation metrics are presented, including logic-tree sensitivity analyses and comparisons to the previous model (UCERF2). For 30 yr M ≥6.7 probabilities, the most significant changes from UCERF2 are a threefold increase on the Calaveras fault and a threefold decrease on the San Jacinto fault. Such changes are due mostly to differences in the time-independent models (e.g., fault-slip rates), with relaxation of segmentation and inclusion of multifault ruptures being particularly influential. In fact, some UCERF2 faults were simply too long to produce M 6.7 events given the segmentation assumptions in that study. Probability model differences are also influential, with the implied gains (relative to a Poisson model) being generally higher in UCERF3. Accounting for the historic open interval is one reason. Another is an effective 27% increase in the total elastic-rebound-model weight. The exact factors influencing differences between UCERF2 and UCERF3, as well as the relative importance of logic-tree branches, vary throughout the region and depend on the evaluation metric of interest. For example, M ≥6.7 probabilities may not be a good proxy for other hazard or loss measures. This sensitivity, coupled with the approximate nature of the model and known limitations, means the applicability of UCERF3 should be evaluated on a case-by-case basis.
    Print ISSN: 0037-1106
    Electronic ISSN: 1943-3573
    Topics: Geosciences, Physics
  • 10
    Publication Date: 2014-12-05
    Description: The Regional Earthquake Likelihood Models experiment in California tested the performance of earthquake likelihood models over a five-year period. First-order analysis showed a smoothed-seismicity model by Helmstetter et al. (2007) to be the best model. We construct optimal multiplicative hybrids involving the best individual model as a baseline and one or more conjugate models. Conjugate models are transformed using an order-preserving function. Two parameters for each conjugate model and an overall normalizing constant are fitted to optimize the hybrid model. Many two-model hybrids have an appreciable information gain (log probability gain) per earthquake relative to the best individual model. For the whole of California, the Bird and Liu (2007) Neokinema and Holliday et al. (2007) pattern informatics (PI) models both give gains close to 0.25. For southern California, the Shen et al. (2007) geodetic model gives a gain of more than 0.5, and several others give gains of about 0.2. The best three-model hybrid for the whole region has the Neokinema and PI models as conjugates. The best three-model hybrid for southern California has the Shen et al. (2007) and PI models as conjugates. The information gains of the best multiplicative hybrids are greater than those of additive hybrids constructed from the same set of models. The gains tend to be larger when the contributing models involve markedly different concepts or data. These results need to be confirmed by further prospective tests. Multiplicative hybrids will be useful for assimilating other earthquake-related observations into forecasting models and for combining forecasting models at all timescales.
    Print ISSN: 0037-1106
    Electronic ISSN: 1943-3573
    Topics: Geosciences, Physics
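The multiplicative-hybrid construction this abstract describes — a baseline model modulated by conjugate models through an order-preserving transform with two fitted parameters each, plus an overall normalizing constant — can be sketched generically. The specific transform `max(r, floor)**power` used below is one order-preserving two-parameter choice; the paper's exact functional form and fitted values may differ, and the values here are purely illustrative:

```python
import numpy as np

def multiplicative_hybrid(baseline, conjugates, params, norm):
    """Multiplicative hybrid of earthquake-rate grids.

    baseline:   expected-rate grid of the best individual model.
    conjugates: list of conjugate-model rate grids (same shape as baseline).
    params:     one (floor, power) pair per conjugate model; the transform
                f(r) = max(r, floor)**power is order-preserving.
    norm:       overall normalizing constant.

    In an actual application the (floor, power) pairs and norm would be
    fitted by maximizing the hybrid's likelihood on a training catalog;
    here they are inputs.
    """
    rate = np.asarray(baseline, dtype=float).copy()
    for grid, (floor, power) in zip(conjugates, params):
        rate *= np.maximum(np.asarray(grid, dtype=float), floor) ** power
    return norm * rate
```

An additive hybrid would instead take a weighted sum of the grids; the abstract reports that the multiplicative form achieved larger information gains on the RELM models.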