Sample records for California earthquake probabilities

  1. Probability of one or more M ≥7 earthquakes in southern California in 30 years

    USGS Publications Warehouse

    Savage, J.C.

    1994-01-01

    Eight earthquakes of magnitude greater than or equal to seven have occurred in southern California in the past 200 years. If one assumes that such events are the product of a Poisson process, the probability of one or more earthquakes of magnitude seven or larger in southern California within any 30 year interval is 67% ± 23% (95% confidence interval). Because five of the eight M ≥ 7 earthquakes in southern California in the last 200 years occurred away from the San Andreas fault system, the probability of one or more M ≥ 7 earthquakes in southern California but not on the San Andreas fault system occurring within 30 years is 52% ± 27% (95% confidence interval). -Author
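The Poisson point estimate in this abstract can be reproduced in a few lines. The sketch below (plain Python, event counts taken from the abstract) gives roughly 70% for the 30-year probability; Savage's quoted 67% ± 23% additionally propagates the uncertainty in the estimated rate, which a point estimate ignores.

```python
import math

def poisson_prob(n_events, span_years, horizon_years):
    """Probability of one or more events in a future window, assuming a
    Poisson process with rate estimated as n_events / span_years."""
    rate = n_events / span_years
    return 1.0 - math.exp(-rate * horizon_years)

# 8 M >= 7 events in 200 years, 30-year window: point estimate ~0.70.
# Savage's 67% +/- 23% additionally reflects uncertainty in the rate.
p_all = poisson_prob(8, 200, 30)
# 5 of the 8 events occurred off the San Andreas system: ~0.53
p_off_saf = poisson_prob(5, 200, 30)
```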

  2. An empirical model for earthquake probabilities in the San Francisco Bay region, California, 2002-2031

    USGS Publications Warehouse

    Reasenberg, P.A.; Hanks, T.C.; Bakun, W.H.

    2003-01-01

    The moment magnitude M 7.8 earthquake in 1906 profoundly changed the rate of seismic activity over much of northern California. The low rate of seismic activity in the San Francisco Bay region (SFBR) since 1906, relative to that of the preceding 55 yr, is often explained as a stress-shadow effect of the 1906 earthquake. However, existing elastic and visco-elastic models of stress change fail to fully account for the duration of the lowered rate of earthquake activity. We use variations in the rate of earthquakes as a basis for a simple empirical model for estimating the probability of M ≥6.7 earthquakes in the SFBR. The model preserves the relative magnitude distribution of sources predicted by the Working Group on California Earthquake Probabilities' (WGCEP, 1999; WGCEP, 2002) model of characterized ruptures on SFBR faults and is consistent with the occurrence of the four M ≥6.7 earthquakes in the region since 1838. When the empirical model is extrapolated 30 yr forward from 2002, it gives a probability of 0.42 for one or more M ≥6.7 in the SFBR. This result is lower than the probability of 0.5 estimated by WGCEP (1988), lower than the 30-yr Poisson probability of 0.60 obtained by WGCEP (1999) and WGCEP (2002), and lower than the 30-yr time-dependent probabilities of 0.67, 0.70, and 0.63 obtained by WGCEP (1990), WGCEP (1999), and WGCEP (2002), respectively, for the occurrence of one or more large earthquakes. This lower probability is consistent with the lack of adequate accounting for the 1906 stress-shadow in these earlier reports. The empirical model represents one possible approach toward accounting for the stress-shadow effect of the 1906 earthquake. However, the discrepancy between our result and those obtained with other modeling methods underscores the fact that the physics controlling the timing of earthquakes is not well understood. Hence, we advise against using the empirical model alone (or any other single probability model) for estimating the

  3. Earthquake Rate Model 2 of the 2007 Working Group for California Earthquake Probabilities, Magnitude-Area Relationships

    USGS Publications Warehouse

    Stein, Ross S.

    2008-01-01

    The Working Group for California Earthquake Probabilities must transform fault lengths and their slip rates into earthquake moment-magnitudes. First, the down-dip coseismic fault dimension, W, must be inferred. We have chosen the Nazareth and Hauksson (2004) method, which uses the depth above which 99% of the background seismicity occurs to assign W. The product of the observed or inferred fault length, L, with the down-dip dimension, W, gives the fault area, A. We must then use a scaling relation to relate A to moment-magnitude, Mw. We assigned equal weight to the Ellsworth B (Working Group on California Earthquake Probabilities, 2003) and Hanks and Bakun (2007) equations. The former uses a single logarithmic relation fitted to the M ≥ 6.5 portion of the data of Wells and Coppersmith (1994); the latter uses a bilinear relation with a slope change at M = 6.65 (A = 537 km²) and also was tested against a greatly expanded dataset for large continental transform earthquakes. We also present an alternative power-law relation, which fits the newly expanded Hanks and Bakun (2007) data best and captures the change in slope that Hanks and Bakun attribute to a transition from area- to length-scaling of earthquake slip. We have not opted to use the alternative relation for the current model. The selections and weights were developed by unanimous consensus of the Executive Committee of the Working Group, following an open meeting of scientists, a solicitation of outside opinions from additional scientists, and presentation of our approach to the Scientific Review Panel. The magnitude-area relations and their assigned weights are unchanged from those used in Working Group (2003).

  4. The Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2)

    USGS Publications Warehouse

    2008-01-01

    California's 35 million people live among some of the most active earthquake faults in the United States. Public safety demands credible assessments of the earthquake hazard to maintain appropriate building codes for safe construction and earthquake insurance for loss protection. Seismic hazard analysis begins with an earthquake rupture forecast: a model of probabilities that earthquakes of specified magnitudes, locations, and faulting types will occur during a specified time interval. This report describes a new earthquake rupture forecast for California developed by the 2007 Working Group on California Earthquake Probabilities (WGCEP 2007).

  5. Uniform California earthquake rupture forecast, version 2 (UCERF 2)

    USGS Publications Warehouse

    Field, E.H.; Dawson, T.E.; Felzer, K.R.; Frankel, A.D.; Gupta, V.; Jordan, T.H.; Parsons, T.; Petersen, M.D.; Stein, R.S.; Weldon, R.J.; Wills, C.J.

    2009-01-01

    The 2007 Working Group on California Earthquake Probabilities (WGCEP, 2007) presents the Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2). This model comprises a time-independent (Poisson-process) earthquake rate model, developed jointly with the National Seismic Hazard Mapping Program, and a time-dependent earthquake-probability model, based on recent earthquake rates and stress-renewal statistics conditioned on the date of last event. The models were developed from updated statewide earthquake catalogs and fault deformation databases using a uniform methodology across all regions and implemented in the modular, extensible Open Seismic Hazard Analysis framework. The rate model satisfies integrating measures of deformation across the plate-boundary zone and is consistent with historical seismicity data. An overprediction of earthquake rates found at intermediate magnitudes (6.5 ≤ M ≤ 7.0) in previous models has been reduced to within the 95% confidence bounds of the historical earthquake catalog. A logic tree with 480 branches represents the epistemic uncertainties of the full time-dependent model. The mean UCERF 2 time-dependent probability of one or more M ≥ 6.7 earthquakes in the California region during the next 30 yr is 99.7%; this probability decreases to 46% for M ≥ 7.5 and to 4.5% for M ≥ 8.0. These probabilities do not include the Cascadia subduction zone, largely north of California, for which the estimated 30 yr, M ≥ 8.0 time-dependent probability is 10%. The M ≥ 6.7 probabilities on major strike-slip faults are consistent with the WGCEP (2003) study in the San Francisco Bay Area and the WGCEP (1995) study in southern California, except for significantly lower estimates along the San Jacinto and Elsinore faults, owing to provisions for larger multisegment ruptures. Important model limitations are discussed.

  6. Earthquake Rate Model 2.2 of the 2007 Working Group for California Earthquake Probabilities, Appendix D: Magnitude-Area Relationships

    USGS Publications Warehouse

    Stein, Ross S.

    2007-01-01

    Summary: To estimate the down-dip coseismic fault dimension, W, the Executive Committee has chosen the Nazareth and Hauksson (2004) method, which uses the 99% depth of background seismicity to assign W. For the predicted earthquake magnitude-fault area scaling used to estimate the maximum magnitude of an earthquake rupture from a fault's length, L, and W, the Committee has assigned equal weight to the Ellsworth B (Working Group on California Earthquake Probabilities, 2003) and Hanks and Bakun (2002) (as updated in 2007) equations. The former uses a single relation; the latter uses a bilinear relation which changes slope at M = 6.65 (A = 537 km²).

  7. The California Earthquake Advisory Plan: A history

    USGS Publications Warehouse

    Roeloffs, Evelyn A.; Goltz, James D.

    2017-01-01

    Since 1985, the California Office of Emergency Services (Cal OES) has issued advisory statements to local jurisdictions and the public following seismic activity that scientists on the California Earthquake Prediction Evaluation Council view as indicating elevated probability of a larger earthquake in the same area during the next several days. These advisory statements are motivated by statistical studies showing that about 5% of moderate earthquakes in California are followed by larger events within a 10-km, five-day space-time window (Jones, 1985; Agnew and Jones, 1991; Reasenberg and Jones, 1994). Cal OES issued four earthquake advisories from 1985 to 1989. In October, 1990, the California Earthquake Advisory Plan formalized this practice, and six Cal OES Advisories have been issued since then. This article describes that protocol’s scientific basis and evolution.

  8. Next-Day Earthquake Forecasts for California

    NASA Astrophysics Data System (ADS)

    Werner, M. J.; Jackson, D. D.; Kagan, Y. Y.

    2008-12-01

    We implemented a daily forecast of m > 4 earthquakes for California in the format suitable for testing in community-based earthquake predictability experiments: Regional Earthquake Likelihood Models (RELM) and the Collaboratory for the Study of Earthquake Predictability (CSEP). The forecast is based on near-real-time earthquake reports from the ANSS catalog above magnitude 2 and will be available online. The model used to generate the forecasts is based on the Epidemic-Type Earthquake Sequence (ETES) model, a stochastic model of clustered and triggered seismicity. Our particular implementation is based on the earlier work of Helmstetter et al. (2006, 2007), but we extended the forecast to all of California, use more data to calibrate the model and its parameters, and made some modifications. Our forecasts will compete against the Short-Term Earthquake Probabilities (STEP) forecasts of Gerstenberger et al. (2005) and other models in the next-day testing class of the CSEP experiment in California. We illustrate our forecasts with examples and discuss preliminary results.
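The epidemic-type models referenced here combine a constant background rate with Omori-law triggering from past events. The sketch below is a generic epidemic-type conditional intensity, not the calibrated Werner et al. implementation; all parameter values are illustrative assumptions.

```python
import math

def etes_rate(t, past_events, mu=0.2, K=0.05, alpha=1.0, c=0.01, p=1.2, m0=2.0):
    """Conditional intensity (events/day) at time t (days): a constant
    background rate mu plus Omori-law contributions from past events.
    All parameter values here are illustrative, not a calibrated model."""
    rate = mu
    for t_i, m_i in past_events:
        if t_i < t:
            rate += K * math.exp(alpha * (m_i - m0)) * (t - t_i + c) ** (-p)
    return rate

past = [(0.0, 5.5), (1.0, 3.2)]   # (origin time in days, magnitude)
r_soon = etes_rate(1.5, past)     # elevated shortly after the events
r_late = etes_rate(10.0, past)    # decays back toward the background
```

A next-day forecast would evaluate this intensity on a spatial grid and convert expected rates to cell probabilities.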

  9. Computing Earthquake Probabilities on Global Scales

    NASA Astrophysics Data System (ADS)

    Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.

    2016-03-01

    Large events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near-mean-field systems having long-range interactions, an example of which is earthquakes with their elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
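The count-to-probability conversion described above can be sketched as a Weibull CDF evaluated at the small-event count since the last large event. The scale and shape values below are purely illustrative assumptions; in practice they would be fit to the regional catalog.

```python
import math

def weibull_prob(n_small, scale, shape):
    """Weibull CDF evaluated at the number of small events counted since
    the last large event; scale and shape would be fit to the regional
    catalog (the values used below are purely illustrative)."""
    return 1.0 - math.exp(-((n_small / scale) ** shape))

p = weibull_prob(120, scale=150.0, shape=1.4)
# the probability grows monotonically as small events accumulate
```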

  10. Maximum Magnitude and Probabilities of Induced Earthquakes in California Geothermal Fields: Applications for a Science-Based Decision Framework

    NASA Astrophysics Data System (ADS)

    Weiser, Deborah Anne

    Induced seismicity is occurring at increasing rates around the country. Brodsky and Lajoie (2013) and others have recognized anthropogenic quakes at a few geothermal fields in California. I use three techniques to assess if there are induced earthquakes in California geothermal fields; there are three sites with clear induced seismicity: Brawley, The Geysers, and Salton Sea. Moderate to strong evidence is found at Casa Diablo, Coso, East Mesa, and Susanville. Little to no evidence is found for Heber and Wendel. I develop a set of tools to reduce or cope with the risk imposed by these earthquakes, and also to address uncertainties through simulations. I test if an earthquake catalog may be bounded by an upper magnitude limit. I address whether the earthquake record during pumping time is consistent with the past earthquake record, or if injection can explain all or some of the earthquakes. I also present ways to assess the probability of future earthquake occurrence based on past records. I summarize current legislation for eight states where induced earthquakes are of concern. Unlike tectonic earthquakes, the hazard from induced earthquakes has the potential to be modified. I discuss direct and indirect mitigation practices. I present a framework with scientific and communication techniques for assessing uncertainty, ultimately allowing more informed decisions to be made.

  11. Long‐term time‐dependent probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3)

    USGS Publications Warehouse

    Field, Edward; Biasi, Glenn P.; Bird, Peter; Dawson, Timothy E.; Felzer, Karen R.; Jackson, David A.; Johnson, Kaj M.; Jordan, Thomas H.; Madden, Christopher; Michael, Andrew J.; Milner, Kevin; Page, Morgan T.; Parsons, Thomas E.; Powers, Peter; Shaw, Bruce E.; Thatcher, Wayne R.; Weldon, Ray J.; Zeng, Yuehua

    2015-01-01

    The 2014 Working Group on California Earthquake Probabilities (WGCEP 2014) presents time-dependent earthquake probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3). Building on the UCERF3 time-independent model, published previously, renewal models are utilized to represent elastic-rebound-implied probabilities. A new methodology has been developed that solves applicability issues in the previous approach for un-segmented models. The new methodology also supports magnitude-dependent aperiodicity and accounts for the historic open interval on faults that lack a date-of-last-event constraint. Epistemic uncertainties are represented with a logic tree, producing 5,760 different forecasts. Results for a variety of evaluation metrics are presented, including logic-tree sensitivity analyses and comparisons to the previous model (UCERF2). For 30-year M≥6.7 probabilities, the most significant changes from UCERF2 are a threefold increase on the Calaveras fault and a threefold decrease on the San Jacinto fault. Such changes are due mostly to differences in the time-independent models (e.g., fault slip rates), with relaxation of segmentation and inclusion of multi-fault ruptures being particularly influential. In fact, some UCERF2 faults were simply too short to produce M 6.7 sized events given the segmentation assumptions in that study. Probability model differences are also influential, with the implied gains (relative to a Poisson model) being generally higher in UCERF3. Accounting for the historic open interval is one reason. Another is an effective 27% increase in the total elastic-rebound-model weight. The exact factors influencing differences between UCERF2 and UCERF3, as well as the relative importance of logic-tree branches, vary throughout the region, and depend on the evaluation metric of interest. For example, M≥6.7 probabilities may not be a good proxy for other hazard or loss measures. This sensitivity, coupled with the

  12. California Fault Parameters for the National Seismic Hazard Maps and Working Group on California Earthquake Probabilities 2007

    USGS Publications Warehouse

    Wills, Chris J.; Weldon, Ray J.; Bryant, W.A.

    2008-01-01

    This report describes development of fault parameters for the 2007 update of the National Seismic Hazard Maps and the Working Group on California Earthquake Probabilities (WGCEP, 2007). These reference parameters are contained within a database intended to be a source of values for use by scientists interested in producing either seismic hazard or deformation models to better understand the current seismic hazards in California. These parameters include descriptions of the geometry and rates of movement of faults throughout the state. These values are intended to provide a starting point for development of more sophisticated deformation models which include known rates of movement on faults as well as geodetic measurements of crustal movement and the rates of movement of the tectonic plates. The values will be used in developing the next generation of the time-independent National Seismic Hazard Maps and the time-dependent seismic hazard calculations being developed for the WGCEP. Due to the multiple uses of this information, development of these parameters has been coordinated between USGS, CGS, and SCEC. SCEC provided the database development and editing tools, in consultation with USGS, Golden. This database has been implemented in Oracle and supports electronic access (e.g., for on-the-fly access). A GUI-based application has also been developed to aid in populating the database. Both the continually updated 'living' version of this database and any locked-down official releases (e.g., used in a published model for calculating earthquake probabilities or seismic shaking hazards) are part of the USGS Quaternary Fault and Fold Database, http://earthquake.usgs.gov/regional/qfaults/. CGS has been primarily responsible for updating and editing of the fault parameters, with extensive input from USGS and SCEC scientists.

  13. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California.

    PubMed

    Lee, Ya-Ting; Turcotte, Donald L; Holliday, James R; Sachs, Michael K; Rundle, John B; Chen, Chien-Chih; Tiampo, Kristy F

    2011-10-04

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor-Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most "successful" in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts.

  14. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California

    PubMed Central

    Lee, Ya-Ting; Turcotte, Donald L.; Holliday, James R.; Sachs, Michael K.; Rundle, John B.; Chen, Chien-Chih; Tiampo, Kristy F.

    2011-01-01

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M≥4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M≥4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor–Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most “successful” in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts. PMID:21949355

  15. Conditional Probabilities of Large Earthquake Sequences in California from the Physics-based Rupture Simulator RSQSim

    NASA Astrophysics Data System (ADS)

    Gilchrist, J. J.; Jordan, T. H.; Shaw, B. E.; Milner, K. R.; Richards-Dinger, K. B.; Dieterich, J. H.

    2017-12-01

    Within the SCEC Collaboratory for Interseismic Simulation and Modeling (CISM), we are developing physics-based forecasting models for earthquake ruptures in California. We employ the 3D boundary element code RSQSim (Rate-State Earthquake Simulator of Dieterich & Richards-Dinger, 2010) to generate synthetic catalogs with tens of millions of events that span up to a million years each. This code models rupture nucleation by rate- and state-dependent friction and Coulomb stress transfer in complex, fully interacting fault systems. The Uniform California Earthquake Rupture Forecast Version 3 (UCERF3) fault and deformation models are used to specify the fault geometry and long-term slip rates. We have employed the Blue Waters supercomputer to generate long catalogs of simulated California seismicity from which we calculate the forecasting statistics for large events. We have performed probabilistic seismic hazard analysis with RSQSim catalogs that were calibrated with system-wide parameters and found a remarkably good agreement with UCERF3 (Milner et al., this meeting). We build on this analysis, comparing the conditional probabilities of sequences of large events from RSQSim and UCERF3. In making these comparisons, we consider the epistemic uncertainties associated with the RSQSim parameters (e.g., rate- and state-frictional parameters), as well as the effects of model-tuning (e.g., adjusting the RSQSim parameters to match UCERF3 recurrence rates). The comparisons illustrate how physics-based rupture simulators might assist forecasters in understanding the short-term hazards of large aftershocks and multi-event sequences associated with complex, multi-fault ruptures.

  16. Time‐dependent renewal‐model probabilities when date of last earthquake is unknown

    USGS Publications Warehouse

    Field, Edward H.; Jordan, Thomas H.

    2015-01-01

    We derive time-dependent, renewal-model earthquake probabilities for the case in which the date of the last event is completely unknown, and compare these with the time-independent Poisson probabilities that are customarily used as an approximation in this situation. For typical parameter values, the renewal-model probabilities exceed Poisson results by more than 10% when the forecast duration exceeds ~20% of the mean recurrence interval. We also derive probabilities for the case in which the last event is further constrained to have occurred before historical record keeping began (the historic open interval), which can only serve to increase earthquake probabilities for typically applied renewal models. We conclude that accounting for the historic open interval can improve long-term earthquake rupture forecasts for California and elsewhere.
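One way to formalize the unknown-date-of-last-event case is to assume the process is observed at a random (equilibrium) time, in which case the probability of an event in the next T reduces to (1/μ) times the integral of the renewal model's survivor function over [0, T]. This is a sketch under that stationarity assumption, not the paper's derivation; it uses a BPT renewal model with assumed values μ = 1 and α = 0.5, and reproduces the qualitative claim that renewal probabilities exceed Poisson by more than 10% once the window exceeds roughly 20-30% of the mean recurrence interval.

```python
import math

def norm_cdf(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bpt_cdf(t, mu=1.0, alpha=0.5):
    """Brownian passage time CDF (closed form via the normal CDF)."""
    if t <= 0.0:
        return 0.0
    a = math.sqrt(t / mu)
    return (norm_cdf((a - 1.0 / a) / alpha)
            + math.exp(2.0 / alpha**2) * norm_cdf(-(a + 1.0 / a) / alpha))

def prob_unknown_last_event(T, mu=1.0, alpha=0.5, steps=3000):
    """P(one or more events in the next T) when the elapsed time since
    the last event is unknown, via the equilibrium-age identity
    P = (1/mu) * integral_0^T S(u) du, with S(u) = 1 - CDF(u)."""
    du = T / steps
    total = sum((1.0 - bpt_cdf((i + 0.5) * du, mu, alpha)) * du
                for i in range(steps))
    return total / mu

T = 0.3                                  # window = 30% of mean recurrence
p_renewal = prob_unknown_last_event(T)
p_poisson = 1.0 - math.exp(-T)           # time-independent comparison
# p_renewal exceeds p_poisson by roughly 15% for these parameters
```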

  17. FORESHOCKS AND TIME-DEPENDENT EARTHQUAKE HAZARD ASSESSMENT IN SOUTHERN CALIFORNIA.

    USGS Publications Warehouse

    Jones, Lucile M.

    1985-01-01

    The probability that an earthquake in southern California (M ≥ 3.0) will be followed by an earthquake of larger magnitude within 5 days and 10 km (i.e., will be a foreshock) is 6 ± 0.5 percent (1 s.d.), and is not significantly dependent on the magnitude of the possible foreshock between M = 3 and M = 5. The probability that an earthquake will be followed by an M ≥ 5.0 main shock, however, increases with the magnitude of the foreshock, from less than 1 percent at M ≥ 3 to 6.5 ± 2.5 percent (1 s.d.) at M ≥ 5. The main shock will most likely occur in the first hour after the foreshock, and the probability of a main shock decreases with elapsed time from the occurrence of the possible foreshock by approximately the inverse of time. Thus, the occurrence of an earthquake of M ≥ 3.0 in southern California increases the earthquake hazard within a small space-time window several orders of magnitude above the normal background level.
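The inverse-of-time decay described above implies that, when a mainshock does follow, the fraction expected between elapsed times t1 and t2 scales as ln(t2/t1). A minimal sketch, with an assumed short-time cutoff (the 1/t law must be truncated near zero; the cutoff value is an illustrative assumption):

```python
import math

def mainshock_fraction(t1, t2, t_min=0.01, t_max=120.0):
    """Fraction of foreshock-mainshock pairs whose mainshock falls
    between t1 and t2 hours after the candidate foreshock, assuming a
    ~1/t rate truncated at t_min (assumed cutoff) and t_max (5 days)."""
    return math.log(t2 / t1) / math.log(t_max / t_min)

first_hour = mainshock_fraction(0.01, 1.0)   # roughly half the pairs
second_hour = mainshock_fraction(1.0, 2.0)   # far fewer
```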

  18. Nowcasting Earthquakes: A Comparison of Induced Earthquakes in Oklahoma and at the Geysers, California

    NASA Astrophysics Data System (ADS)

    Luginbuhl, Molly; Rundle, John B.; Hawkins, Angela; Turcotte, Donald L.

    2018-01-01

    Nowcasting is a new method of statistically classifying seismicity and seismic risk (Rundle et al. 2016). In this paper, the method is applied to the induced seismicity at the Geysers geothermal region in California and the induced seismicity due to fluid injection in Oklahoma. Nowcasting utilizes the catalogs of seismicity in these regions. Two earthquake magnitudes are selected, one large, say Mλ ≥ 4, and one small, say Mσ ≥ 2. The method utilizes the number of small earthquakes that occur between pairs of large earthquakes, and the cumulative probability distribution of these interevent counts is obtained. The earthquake potential score (EPS) is defined by where the number of small earthquakes that have occurred since the last large earthquake falls on this cumulative probability distribution. A major advantage of nowcasting is that it utilizes "natural time", earthquake counts between events, rather than clock time. Thus, it is not necessary to decluster aftershocks, and the results are applicable even if the level of induced seismicity varies in time. The application of natural time to the accumulation of the seismic hazard depends on the applicability of Gutenberg-Richter (GR) scaling. The increasing number of small earthquakes that occur after a large earthquake can be scaled to give the risk of a large earthquake occurring. To illustrate our approach, we utilize the number of Mσ ≥ 2.75 earthquakes in Oklahoma to nowcast the number of Mλ ≥ 4.0 earthquakes in Oklahoma. The applicability of the scaling is illustrated during the rapid build-up of injection-induced seismicity between 2012 and 2016, and the subsequent reduction in seismicity associated with a reduction in fluid injections. The same method is applied to the geothermal-induced seismicity at the Geysers, California, for comparison.
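The EPS computation described above amounts to evaluating an empirical CDF of historical interevent counts at the current count. A minimal sketch with made-up counts:

```python
def earthquake_potential_score(interevent_counts, count_since_last):
    """EPS = fraction of historical small-event counts (between
    consecutive large earthquakes) that are <= the count observed
    since the last large earthquake."""
    below = sum(1 for c in interevent_counts if c <= count_since_last)
    return below / len(interevent_counts)

# illustrative small-quake counts between past large events (not real data)
history = [40, 75, 110, 150, 210, 320, 500]
eps = earthquake_potential_score(history, 180)   # 4 of 7 counts <= 180
```

Because the score depends only on event counts ("natural time"), no declustering or rate normalization is needed.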

  19. Foreshocks, aftershocks, and earthquake probabilities: Accounting for the Landers earthquake

    USGS Publications Warehouse

    Jones, Lucile M.

    1994-01-01

    The equation to determine the probability that an earthquake occurring near a major fault will be a foreshock to a mainshock on that fault is modified to include the case of aftershocks to a previous earthquake occurring near the fault. The addition of aftershocks to the background seismicity makes it less probable that an earthquake will be a foreshock, because nonforeshocks have become more common. As the aftershocks decay with time, the probability that an earthquake will be a foreshock increases. However, fault interactions between the first mainshock and the major fault can increase the long-term probability of a characteristic earthquake on that fault, which will, in turn, increase the probability that an event is a foreshock, compensating for the decrease caused by the aftershocks.
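The dilution effect described above can be sketched as a rate ratio: a candidate event is interpreted as a foreshock with probability proportional to the foreshock rate's share of all candidate-event rates, so adding an aftershock term to the denominator lowers that share. This is a schematic of the idea, not Jones's actual equation, and the rates below are illustrative assumptions.

```python
def prob_is_foreshock(r_foreshock, r_background, r_aftershock):
    """Probability that a candidate event near the fault is a foreshock,
    taken as the foreshock rate's share of all candidate-event rates
    (schematic only; rates are in arbitrary consistent units)."""
    return r_foreshock / (r_foreshock + r_background + r_aftershock)

quiet = prob_is_foreshock(0.02, 0.5, 0.0)   # no active aftershock sequence
early = prob_is_foreshock(0.02, 0.5, 2.0)   # early in an aftershock sequence
# the aftershock term dilutes the foreshock interpretation; as the
# sequence decays (r_aftershock -> 0), the probability recovers
```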

  20. Earthquake education in California

    USGS Publications Warehouse

    MacCabe, M. P.

    1980-01-01

    In a survey of community response to the earthquake threat in southern California, Ralph Turner and his colleagues in the Department of Sociology at the University of California, Los Angeles, found that the public very definitely wants to be educated about the kinds of problems and hazards they can expect during and after a damaging earthquake; and they also want to know how they can prepare themselves to minimize their vulnerability. Decisionmakers, too, are recognizing this new wave of public concern. 

  1. Accessing northern California earthquake data via Internet

    NASA Astrophysics Data System (ADS)

    Romanowicz, Barbara; Neuhauser, Douglas; Bogaert, Barbara; Oppenheimer, David

    The Northern California Earthquake Data Center (NCEDC) provides easy access to central and northern California digital earthquake data. It is located at the University of California, Berkeley, is operated jointly with the U.S. Geological Survey (USGS) in Menlo Park, Calif., and is funded by the University of California and the National Earthquake Hazard Reduction Program. It has been accessible to users in the scientific community through the Internet since mid-1992. The data center provides an on-line archive for parametric and waveform data from two regional networks: the Northern California Seismic Network (NCSN) operated by the USGS and the Berkeley Digital Seismic Network (BDSN) operated by the Seismographic Station at the University of California, Berkeley.

  2. A physically-based earthquake recurrence model for estimation of long-term earthquake probabilities

    USGS Publications Warehouse

    Ellsworth, William L.; Matthews, Mark V.; Nadeau, Robert M.; Nishenko, Stuart P.; Reasenberg, Paul A.; Simpson, Robert W.

    1999-01-01

    A physically motivated model for earthquake recurrence based on the Brownian relaxation oscillator is introduced. The renewal process defining this point-process model can be described by the steady rise of a state variable from the ground state to a failure threshold, as modulated by Brownian motion. Failure times in this model follow the Brownian passage time (BPT) distribution, which is specified by the mean time to failure, μ, and the aperiodicity of the mean, α (equivalent to the familiar coefficient of variation). Analysis of 37 series of recurrent earthquakes, M -0.7 to 9.2, suggests a provisional generic value of α = 0.5. For this value of α, the hazard function (instantaneous failure rate of survivors) exceeds the mean rate for times > μ/2, and is ~2/μ for all times > μ. Application of this model to the next M 6 earthquake on the San Andreas fault at Parkfield, California suggests that the annual probability of the earthquake is between 1:10 and 1:13.
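The BPT distribution has a closed-form CDF in terms of the normal CDF, so the hazard-function behavior this abstract describes is easy to check numerically. A sketch with μ = 1 and α = 0.5 (the abstract's provisional generic value):

```python
import math

def norm_cdf(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bpt_cdf(t, mu=1.0, alpha=0.5):
    """BPT (inverse Gaussian) CDF, closed form via the normal CDF."""
    if t <= 0.0:
        return 0.0
    a = math.sqrt(t / mu)
    return (norm_cdf((a - 1.0 / a) / alpha)
            + math.exp(2.0 / alpha**2) * norm_cdf(-(a + 1.0 / a) / alpha))

def bpt_pdf(t, mu=1.0, alpha=0.5):
    return (math.sqrt(mu / (2.0 * math.pi * alpha**2 * t**3))
            * math.exp(-((t - mu) ** 2) / (2.0 * mu * alpha**2 * t)))

def bpt_hazard(t, mu=1.0, alpha=0.5):
    """Instantaneous failure rate of survivors, f(t) / (1 - F(t))."""
    return bpt_pdf(t, mu, alpha) / (1.0 - bpt_cdf(t, mu, alpha))

# for alpha = 0.5 the hazard is below the mean rate (1/mu) before ~mu/2,
# then levels off near 2/mu, matching the abstract's description
```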

  3. WGCEP Historical California Earthquake Catalog

    USGS Publications Warehouse

    Felzer, Karen R.; Cao, Tianqing

    2008-01-01

    This appendix provides an earthquake catalog for California and the surrounding area. Our goal is to provide a listing for all known M > 5.5 earthquakes that occurred from 1850 to 1932 and all known M > 4.0 earthquakes that occurred from 1932 to 2006 within the region of 31.0 to 43.0 degrees North and 114.0 to 126.0 degrees West. Some pre-1932 earthquakes [...], before the Northern California network was online. Some earthquakes from 1900-1932, and particularly from 1910-1932, are also based on instrumental readings, but the quality of the instrumental record and the precision of the resulting analysis are much lower than for later listings. A partial exception is for some of the largest earthquakes, such as the San Francisco earthquake of April 18, 1906, for which global teleseismic records (Wald et al. 1993) and geodetic measurements (Thatcher et al. 1997) have been used to help determine magnitudes.

  4. A 30-year history of earthquake crisis communication in California and lessons for the future

    NASA Astrophysics Data System (ADS)

    Jones, L.

    2015-12-01

    The first statement from the US Geological Survey to the California Office of Emergency Services quantifying the probability of a possible future earthquake was made in October 1985, about the probability (approximately 5%) that a M4.7 earthquake located directly beneath the Coronado Bay Bridge in San Diego would be a foreshock to a larger earthquake. In the next 30 years, publication of aftershock advisories has become routine, and formal statements about the probability of a larger event have been developed in collaboration with the California Earthquake Prediction Evaluation Council (CEPEC) and sent to CalOES more than a dozen times. Most of these were subsequently released to the public. These communications have spanned a variety of approaches, with and without quantification of the probabilities, and using different ways to express the spatial extent and the magnitude distribution of possible future events. The USGS is re-examining its approach to aftershock probability statements and to operational earthquake forecasting with the goal of creating pre-vetted automated statements that can be released quickly after significant earthquakes. All of the previous formal advisories were written during the earthquake crisis. With experience, the time to create and release a statement became shorter than the 18 hours taken for the first public advisory (for the 1988 Lake Elsman earthquake), but a statement was never completed in less than 2 hours. As was done for the Parkfield experiment, the process will be reviewed by CEPEC and NEPEC (National Earthquake Prediction Evaluation Council) so that statements can be sent to the public automatically. This talk will review the advisories, the variations in wording, and the public response, and compare these with social science research about successful crisis communication, to create recommendations for future advisories.

  5. Keeping the History in Historical Seismology: The 1872 Owens Valley, California Earthquake

    NASA Astrophysics Data System (ADS)

    Hough, Susan E.

    2008-07-01

    The importance of historical earthquakes is being increasingly recognized. Careful investigations of key pre-instrumental earthquakes can provide critical information and insights not only for seismic hazard assessment but also for earthquake science. In recent years, with the explosive growth in computational sophistication in the Earth sciences, researchers have developed increasingly sophisticated methods to analyze macroseismic data quantitatively. These methodological developments can be extremely useful for fully exploiting the temporally and spatially rich information source that seismic intensities often represent. For example, the exhaustive and painstaking investigations by Ambraseys and his colleagues of early Himalayan earthquakes provide information that can be used to map out site response in the Ganges basin. In any investigation of macroseismic data, however, one must stay mindful that intensity values are not data but rather interpretations. The results of any subsequent analysis, regardless of the degree of sophistication of the methodology, will be only as reliable as the interpretations of available accounts, and only as complete as the research done to ferret out, and in many cases translate, these accounts. When intensities are assigned without an appreciation of historical setting and context, seemingly careful subsequent analysis can yield grossly inaccurate results. As a case study, I report here on the results of a recent investigation of the 1872 Owens Valley, California earthquake. Careful consideration of macroseismic observations reveals that this event was probably larger than the great San Francisco earthquake of 1906, and possibly the largest historical earthquake in California. The results suggest that some large earthquakes in California will generate significantly larger ground motions than San Andreas fault events of comparable magnitude.

  6. Keeping the History in Historical Seismology: The 1872 Owens Valley, California Earthquake

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hough, Susan E.

    2008-07-08

    The importance of historical earthquakes is being increasingly recognized. Careful investigations of key pre-instrumental earthquakes can provide critical information and insights not only for seismic hazard assessment but also for earthquake science. In recent years, with the explosive growth in computational sophistication in the Earth sciences, researchers have developed increasingly sophisticated methods to analyze macroseismic data quantitatively. These methodological developments can be extremely useful for fully exploiting the temporally and spatially rich information source that seismic intensities often represent. For example, the exhaustive and painstaking investigations by Ambraseys and his colleagues of early Himalayan earthquakes provide information that can be used to map out site response in the Ganges basin. In any investigation of macroseismic data, however, one must stay mindful that intensity values are not data but rather interpretations. The results of any subsequent analysis, regardless of the degree of sophistication of the methodology, will be only as reliable as the interpretations of available accounts, and only as complete as the research done to ferret out, and in many cases translate, these accounts. When intensities are assigned without an appreciation of historical setting and context, seemingly careful subsequent analysis can yield grossly inaccurate results. As a case study, I report here on the results of a recent investigation of the 1872 Owens Valley, California earthquake. Careful consideration of macroseismic observations reveals that this event was probably larger than the great San Francisco earthquake of 1906, and possibly the largest historical earthquake in California. The results suggest that some large earthquakes in California will generate significantly larger ground motions than San Andreas fault events of comparable magnitude.

  7. Estimating earthquake-induced failure probability and downtime of critical facilities.

    PubMed

    Porter, Keith; Ramer, Kyle

    2012-01-01

    Fault trees have long been used to estimate failure risk in earthquakes, especially for nuclear power plants (NPPs). One interesting application is that one can assess and manage the probability that two facilities - a primary and backup - would be simultaneously rendered inoperative in a single earthquake. Another is that one can calculate the probabilistic time required to restore a facility to functionality, and the probability that, during any given planning period, the facility would be rendered inoperative for any specified duration. A large new peer-reviewed library of component damageability and repair-time data for the first time enables fault trees to be used to calculate the seismic risk of operational failure and downtime for a wide variety of buildings other than NPPs. With the new library, seismic risk of both the failure probability and probabilistic downtime can be assessed and managed, considering the facility's unique combination of structural and non-structural components, their seismic installation conditions, and the other systems on which the facility relies. An example is offered of real computer data centres operated by a California utility. The fault trees were created and tested in collaboration with utility operators, and the failure probability and downtime results validated in several ways.
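As a rough sketch of the fault-tree arithmetic described above (not the paper's actual trees, component data, or library), AND/OR gates over assumed-independent component failure probabilities look like this:

```python
def or_gate(ps):
    # OR gate: the subsystem fails if any input fails (inputs assumed independent)
    prod = 1.0
    for p in ps:
        prod *= (1.0 - p)
    return 1.0 - prod

def and_gate(ps):
    # AND gate: failure requires every input to fail (inputs assumed independent)
    prod = 1.0
    for p in ps:
        prod *= p
    return prod

# hypothetical component failure probabilities in one scenario earthquake:
# structure, power, cooling at each site
primary = or_gate([0.05, 0.02, 0.10])
backup = or_gate([0.04, 0.03, 0.08])
print(and_gate([primary, backup]))  # both sites inoperative in the same event
```

Treating the primary and backup as independent understates the joint failure probability when both sites shake in the same earthquake; a real analysis of the kind described would model that correlation through the common ground-motion input.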

  8. Earthquakes and faults in southern California (1970-2010)

    USGS Publications Warehouse

    Sleeter, Benjamin M.; Calzia, James P.; Walter, Stephen R.

    2012-01-01

    The map depicts both active and inactive faults and earthquakes of magnitude 1.5 to 7.3 in southern California (1970–2010). The bathymetry was generated from digital files from the California Department of Fish and Game, Marine Region, Coastal Bathymetry Project. Elevation data are from the U.S. Geological Survey National Elevation Dataset. The Landsat satellite image is from fourteen Landsat 5 Thematic Mapper scenes collected between 2009 and 2010. Fault data are reproduced with permission from 2006 California Geological Survey and U.S. Geological Survey data. The earthquake data are from the U.S. Geological Survey National Earthquake Information Center.

  9. The earthquake prediction experiment at Parkfield, California

    USGS Publications Warehouse

    Roeloffs, E.; Langbein, J.

    1994-01-01

    Since 1985, a focused earthquake prediction experiment has been in progress along the San Andreas fault near the town of Parkfield in central California. Parkfield has experienced six moderate earthquakes since 1857 at average intervals of 22 years, the most recent a magnitude 6 event in 1966. The probability of another moderate earthquake soon appears high, but studies assigning it a 95% chance of occurring before 1993 now appear to have been oversimplified. The identification of a Parkfield fault "segment" was initially based on geometric features in the surface trace of the San Andreas fault, but more recent microearthquake studies have demonstrated that those features do not extend to seismogenic depths. On the other hand, geodetic measurements are consistent with the existence of a "locked" patch on the fault beneath Parkfield that has presently accumulated a slip deficit equal to the slip in the 1966 earthquake. A magnitude 4.7 earthquake in October 1992 brought the Parkfield experiment to its highest level of alert, with a 72-hour public warning that there was a 37% chance of a magnitude 6 event. However, this warning proved to be a false alarm. Most data collected at Parkfield indicate that strain is accumulating at a constant rate on this part of the San Andreas fault, but some interesting departures from this behavior have been recorded. Here we outline the scientific arguments bearing on when the next Parkfield earthquake is likely to occur and summarize geophysical observations to date.

  10. The October 1992 Parkfield, California, earthquake prediction

    USGS Publications Warehouse

    Langbein, J.

    1992-01-01

    A magnitude 4.7 earthquake occurred near Parkfield, California, on October 20, 1992, at 05:28 UTC (October 19 at 10:28 p.m. local, or Pacific Daylight, Time). This moderate shock, interpreted as a potential foreshock of a damaging earthquake on the San Andreas fault, triggered long-standing federal, state, and local government plans to issue a public warning of an imminent magnitude 6 earthquake near Parkfield. Although the predicted earthquake did not take place, sophisticated suites of instruments deployed as part of the Parkfield Earthquake Prediction Experiment recorded valuable data associated with an unusual series of events. This article describes the geological aspects of these events, which occurred near Parkfield in October 1992. The accompanying article, an edited version of a press conference by Richard Andrews, the Director of the California Office of Emergency Services (OES), describes the governmental response to the prediction.

  11. The southern California uplift and associated earthquakes

    USGS Publications Warehouse

    Castle, R.O.; Bernknopf, R.L.

    1996-01-01

    Southern California earthquakes ≥ M 5.5 during the period 1955/01/01-1994/01/17 were concentrated along or adjacent to the south flank of the southern California uplift, as defined both at its culmination and following its partial collapse. Spatial clustering of these earthquakes within three more-or-less distinct groups suggests gaps along the south flank that either were previously filled or are yet to be filled. Nearly all of the indicated earthquakes accompanied or followed partial collapse of the uplift, and seismic activity within this regime seems to have been increasing through at least 1994/01/17. Copyright 1996 by the American Geophysical Union.

  12. Prospective Tests of Southern California Earthquake Forecasts

    NASA Astrophysics Data System (ADS)

    Jackson, D. D.; Schorlemmer, D.; Gerstenberger, M.; Kagan, Y. Y.; Helmstetter, A.; Wiemer, S.; Field, N.

    2004-12-01

    We are testing earthquake forecast models prospectively using likelihood ratios. Several investigators have developed such models as part of the Southern California Earthquake Center's project called Regional Earthquake Likelihood Models (RELM). Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. Here we describe the testing procedure and present preliminary results. Forecasts are expressed as the yearly rate of earthquakes within pre-specified bins of longitude, latitude, magnitude, and focal mechanism parameters. We test models against each other in pairs, which requires that both forecasts in a pair be defined over the same set of bins. For this reason we specify a standard "menu" of bins and ground rules to guide forecasters in using common descriptions. One menu category includes five-year forecasts of magnitude 5.0 and larger. Contributors will be requested to submit forecasts in the form of a vector of yearly earthquake rates on a 0.1 degree grid at the beginning of the test. Focal mechanism forecasts, when available, are also archived and used in the tests. Interim progress will be evaluated yearly, but final conclusions would be made on the basis of cumulative five-year performance. The second category includes forecasts of earthquakes above magnitude 4.0 on a 0.1 degree grid, evaluated and renewed daily. Final evaluation would be based on cumulative performance over five years. Other types of forecasts with different magnitude, space, and time sampling are welcome and will be tested against other models with shared characteristics. Tests are based on the log likelihood scores derived from the probability that future earthquakes would occur where they do if a given forecast were true [Kagan and Jackson, J. Geophys. Res., 100, 3,943-3,959, 1995]. For each pair of forecasts, we compute alpha, the probability that the first would be wrongly rejected in favor of the second, and beta, the probability that the second would be wrongly rejected in favor of the first.
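The log-likelihood scoring underlying such tests can be sketched for gridded Poisson forecasts; the bin rates and observed counts below are invented illustrations, not RELM submissions:

```python
import math

def poisson_loglike(rates, counts):
    # log likelihood of observed counts given forecast rates, assuming an
    # independent Poisson distribution in each space-magnitude bin
    return sum(n * math.log(lam) - lam - math.lgamma(n + 1)
               for lam, n in zip(rates, counts))

# hypothetical yearly rates for two competing forecasts over the same four bins
forecast_a = [0.50, 0.20, 0.05, 0.25]
forecast_b = [0.10, 0.10, 0.40, 0.40]
observed = [1, 0, 0, 1]  # earthquakes actually recorded in each bin

ratio = poisson_loglike(forecast_a, observed) - poisson_loglike(forecast_b, observed)
print(ratio)  # a positive log-likelihood ratio favors forecast A
```

In the testing framework the abstract describes, this ratio would then be compared against its distribution under each forecast to obtain the alpha and beta rejection probabilities.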

  13. THE GREAT SOUTHERN CALIFORNIA SHAKEOUT: Earthquake Science for 22 Million People

    NASA Astrophysics Data System (ADS)

    Jones, L.; Cox, D.; Perry, S.; Hudnut, K.; Benthien, M.; Bwarie, J.; Vinci, M.; Buchanan, M.; Long, K.; Sinha, S.; Collins, L.

    2008-12-01

    Earthquake science is being communicated to and used by the 22 million residents of southern California to improve resiliency to future earthquakes through the Great Southern California ShakeOut. The ShakeOut began when the USGS partnered with the California Geological Survey, the Southern California Earthquake Center, and many other organizations to bring 300 scientists and engineers together to formulate a comprehensive description of a plausible major earthquake, released in May 2008 as the ShakeOut Scenario, a description of the impacts and consequences of a M7.8 earthquake on the Southern San Andreas Fault (USGS OFR2008-1150). The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. The ShakeOut drill occurred in houses, businesses, and public spaces throughout southern California at 10AM on November 13, 2008, when southern Californians were asked to pretend that the M7.8 scenario earthquake had occurred and to practice actions that could reduce its impact on their lives. Residents, organizations, schools, and businesses registered to participate in the drill through www.shakeout.org, where they could get accessible information about the scenario earthquake and share ideas for better preparation. As of September 8, 2008, over 2.7 million confirmed participants had been registered. The primary message of the ShakeOut is that what we do now, before a big earthquake, will determine what our lives will be like after. The goal of the ShakeOut has been to change the culture of earthquake preparedness in southern California, making earthquakes a regularly discussed reality. This implements the sociological finding that 'milling,' discussing a problem with loved ones, is a prerequisite to taking action. ShakeOut milling is taking place at all levels, from individuals and families to corporations and governments. Actions taken as a result of the ShakeOut include the adoption of earthquake

  14. Discrepancy between earthquake rates implied by historic earthquakes and a consensus geologic source model for California

    USGS Publications Warehouse

    Petersen, M.D.; Cramer, C.H.; Reichle, M.S.; Frankel, A.D.; Hanks, T.C.

    2000-01-01

    We examine the difference between expected earthquake rates inferred from the historical earthquake catalog and the geologic data that were used to develop the consensus seismic source characterization for the state of California [California Department of Conservation, Division of Mines and Geology (CDMG), and U.S. Geological Survey (USGS); Petersen et al., 1996; Frankel et al., 1996]. On average, the historic earthquake catalog and the seismic source model both indicate about one M 6 or greater earthquake per year in the state of California. However, the rates of earthquakes with magnitudes (M) between 6 and 7 in this seismic source model are higher, by at least a factor of 2, than the mean historic earthquake rates for both southern and northern California. The earthquake rate discrepancy results from a seismic source model that includes earthquakes with characteristic (maximum) magnitudes that are primarily between M 6.4 and 7.1. Many of these faults are interpreted to accommodate high strain rates from geologic and geodetic data but have not ruptured in large earthquakes during historic time. Our sensitivity study indicates that the rate differences between magnitudes 6 and 7 can be reduced by adjusting the magnitude-frequency distribution of the source model to reflect more characteristic behavior, by decreasing the moment rate available for seismogenic slip along faults, by increasing the maximum magnitude of the earthquake on a fault, or by decreasing the maximum magnitude of the background seismicity. However, no single parameter can be adjusted, consistent with scientific consensus, to eliminate the earthquake rate discrepancy. Applying a combination of these parametric adjustments yields an alternative earthquake source model that is more compatible with the historic data.
The 475-year return period hazard for peak ground and 1-sec spectral acceleration resulting from this alternative source model differs from the hazard resulting from the

  15. Earthquake probabilities in the San Francisco Bay Region: 2000 to 2030 - a summary of findings

    USGS Publications Warehouse

    ,

    1999-01-01

    The San Francisco Bay region sits astride a dangerous “earthquake machine,” the tectonic boundary between the Pacific and North American Plates. The region has experienced major and destructive earthquakes in 1838, 1868, 1906, and 1989, and future large earthquakes are a certainty. The ability to prepare for large earthquakes is critical to saving lives and reducing damage to property and infrastructure. An increased understanding of the timing, size, location, and effects of these likely earthquakes is a necessary component of any effective program of preparedness. This study reports on the probabilities of occurrence of major earthquakes in the San Francisco Bay region (SFBR) for the three decades 2000 to 2030. The SFBR extends from Healdsburg on the northwest to Salinas on the southeast and encloses the entire metropolitan area, including its most rapidly expanding urban and suburban areas. In this study a “major” earthquake is defined as one with M≥6.7 (where M is moment magnitude). As experience from the Northridge, California (M6.7, 1994) and Kobe, Japan (M6.9, 1995) earthquakes has shown us, earthquakes of this size can have a disastrous impact on the social and economic fabric of densely urbanized areas. To reevaluate the probability of large earthquakes striking the SFBR, the U.S. Geological Survey solicited data, interpretations, and analyses from dozens of scientists representing a wide cross section of the Earth-science community (Appendix A). The primary approach of this new Working Group (WG99) was to develop a comprehensive, regional model for the long-term occurrence of earthquakes, founded on geologic and geophysical observations and constrained by plate tectonics. The model considers a broad range of observations and their possible interpretations. Using this model, we estimate the rates of occurrence of earthquakes and 30-year earthquake probabilities. Our study considers a range of magnitudes for earthquakes on the major faults in the

  16. A synoptic view of the Third Uniform California Earthquake Rupture Forecast (UCERF3)

    USGS Publications Warehouse

    Field, Edward; Jordan, Thomas H.; Page, Morgan T.; Milner, Kevin R.; Shaw, Bruce E.; Dawson, Timothy E.; Biasi, Glenn; Parsons, Thomas E.; Hardebeck, Jeanne L.; Michael, Andrew J.; Weldon, Ray; Powers, Peter; Johnson, Kaj M.; Zeng, Yuehua; Bird, Peter; Felzer, Karen; van der Elst, Nicholas; Madden, Christopher; Arrowsmith, Ramon; Werner, Maximilian J.; Thatcher, Wayne R.

    2017-01-01

    Probabilistic forecasting of earthquake‐producing fault ruptures informs all major decisions aimed at reducing seismic risk and improving earthquake resilience. Earthquake forecasting models rely on two scales of hazard evolution: long‐term (decades to centuries) probabilities of fault rupture, constrained by stress renewal statistics, and short‐term (hours to years) probabilities of distributed seismicity, constrained by earthquake‐clustering statistics. Comprehensive datasets on both hazard scales have been integrated into the Uniform California Earthquake Rupture Forecast, Version 3 (UCERF3). UCERF3 is the first model to provide self‐consistent rupture probabilities over forecasting intervals from less than an hour to more than a century, and it is the first capable of evaluating the short‐term hazards that result from multievent sequences of complex faulting. This article gives an overview of UCERF3, illustrates the short‐term probabilities with aftershock scenarios, and draws some valuable scientific conclusions from the modeling results. In particular, seismic, geologic, and geodetic data, when combined in the UCERF3 framework, reject two types of fault‐based models: long‐term forecasts constrained to have local Gutenberg–Richter scaling, and short‐term forecasts that lack stress relaxation by elastic rebound.

  17. Earthquake alarm; operating the seismograph station at the University of California, Berkeley.

    USGS Publications Warehouse

    Stump, B.

    1980-01-01

    At the University of California seismographic stations, the task of locating and determining magnitudes for both local and distant earthquakes is a continuous one. Teleseisms must be located rapidly so that events that occur in the Pacific can be identified and the Pacific Tsunami Warning System alerted. For great earthquakes anywhere, there is a responsibility to notify public agencies such as the California Office of Emergency Services, the Federal Disaster Assistance Administration, the Earthquake Engineering Research Institute, the California Seismic Safety Commission, and the American Red Cross. In the case of damaging local earthquakes, it is also necessary to alert the California Department of Water Resources, California Division of Mines and Geology, U.S. Army Corps of Engineers, Federal Bureau of Reclamation, and the Bay Area Rapid Transit. These days, any earthquakes that are felt in northern California cause immediate inquiries from the news media and an interested public. The series of earthquakes that jolted the Livermore area from January 24 to 26, 1980, is a good case in point.

  18. The Virtual Quake Earthquake Simulator: Earthquake Probability Statistics for the El Mayor-Cucapah Region and Evidence of Predictability in Simulated Earthquake Sequences

    NASA Astrophysics Data System (ADS)

    Schultz, K.; Yoder, M. R.; Heien, E. M.; Rundle, J. B.; Turcotte, D. L.; Parker, J. W.; Donnellan, A.

    2015-12-01

    We introduce a framework for developing earthquake forecasts using Virtual Quake (VQ), the generalized successor to the perhaps better known Virtual California (VC) earthquake simulator. We discuss the basic merits and mechanics of the simulator, and we present several statistics of interest for earthquake forecasting. We also show that, though the system as a whole (in aggregate) behaves quite randomly, (simulated) earthquake sequences limited to specific fault sections exhibit measurable predictability in the form of increasing seismicity precursory to large M > 7 earthquakes. In order to quantify this, we develop an alert-based forecasting metric similar to those presented in Keilis-Borok (2002) and Molchan (1997), and show that it exhibits significant information gain compared to random forecasts. We also discuss the long-standing question of activation versus quiescent type earthquake triggering. We show that VQ exhibits both behaviors separately for independent fault sections; some fault sections exhibit activation type triggering, while others are better characterized by quiescent type triggering. We discuss these aspects of VQ specifically with respect to faults in the Salton Basin and near the El Mayor-Cucapah region in southern California, USA, and northern Baja California, Mexico.

  19. A post-Tohoku earthquake review of earthquake probabilities in the Southern Kanto District, Japan

    NASA Astrophysics Data System (ADS)

    Somerville, Paul G.

    2014-12-01

    The 2011 Mw 9.0 Tohoku earthquake generated an aftershock sequence that affected a large part of northern Honshu, and has given rise to widely divergent forecasts of changes in earthquake occurrence probabilities in northern Honshu. The objective of this review is to assess these forecasts as they relate to potential changes in the occurrence probabilities of damaging earthquakes in the Kanto Region. It is generally agreed that the 2011 Mw 9.0 Tohoku earthquake increased the stress on faults in the southern Kanto district. Toda and Stein (Geophys Res Lett 40: doi:10.1002, 2013) further conclude that the probability of earthquakes in the Kanto Corridor has increased by a factor of 2.5 for the time period 11 March 2013 to 10 March 2018. Estimates of earthquake probabilities in a wider region of the Southern Kanto District by Nanjo et al. (Geophys J Int, doi:10.1093, 2013) indicate that any increase in the probability of earthquakes is insignificant in this larger region. Uchida et al. (Earth Planet Sci Lett 374: 81-91, 2013) conclude that the Philippine Sea plate extends well north of the northern margin of Tokyo Bay, inconsistent with the Kanto Fragment hypothesis of Toda et al. (Nat Geosci, 1:1-6, 2008), which attributes deep earthquakes in this region, which they term the Kanto Corridor, to a broken fragment of the Pacific plate. The results of Uchida and Matsuzawa (J Geophys Res 115:B07309, 2013) support the conclusion that fault creep in southern Kanto may be slowly relaxing the stress increase caused by the Tohoku earthquake without causing more large earthquakes. Stress transfer calculations indicate a large stress transfer to the Off Boso Segment as a result of the 2011 Tohoku earthquake. However, Ozawa et al. (J Geophys Res 117:B07404, 2012) used onshore GPS measurements to infer large post-Tohoku creep on the plate interface in the Off-Boso region, and Uchida and Matsuzawa (ibid.) measured similar large creep off the Boso

  20. Earthquake outlook for the San Francisco Bay region 2014–2043

    USGS Publications Warehouse

    Aagaard, Brad T.; Blair, James Luke; Boatwright, John; Garcia, Susan H.; Harris, Ruth A.; Michael, Andrew J.; Schwartz, David P.; DiLeo, Jeanne S.; Jacques, Kate; Donlin, Carolyn

    2016-06-13

    Using information from recent earthquakes, improved mapping of active faults, and a new model for estimating earthquake probabilities, the 2014 Working Group on California Earthquake Probabilities updated the 30-year earthquake forecast for California. They concluded that there is a 72 percent probability (or likelihood) of at least one earthquake of magnitude 6.7 or greater striking somewhere in the San Francisco Bay region before 2043. Earthquakes this large are capable of causing widespread damage; therefore, communities in the region should take simple steps to help reduce injuries, damage, and disruption, as well as accelerate recovery from these earthquakes.
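The regional "at least one earthquake" figure aggregates probabilities over many fault sources. A minimal sketch of that aggregation step (the fault names are real Bay Area sources, but the probabilities are invented for illustration; the working group's actual calculation does not simply treat sources as independent):

```python
# hypothetical 30-year M >= 6.7 probabilities for individual fault sources;
# not the 2014 Working Group values, just an illustration of the arithmetic
fault_probs = {
    "Hayward-Rodgers Creek": 0.33,
    "Calaveras": 0.25,
    "San Andreas": 0.22,
    "other faults": 0.15,
}

p_none = 1.0
for p in fault_probs.values():
    p_none *= (1.0 - p)  # probability that this source stays quiet, assuming independence

print(1.0 - p_none)  # regional probability of at least one such earthquake
```

Even modest per-fault probabilities compound into a large regional one, which is why the regional number is so much higher than any single fault's probability.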

  1. Excel, Earthquakes, and Moneyball: exploring Cascadia earthquake probabilities using spreadsheets and baseball analogies

    NASA Astrophysics Data System (ADS)

    Campbell, M. R.; Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.

    2017-12-01

    Much recent media attention focuses on Cascadia's earthquake hazard. A widely cited magazine article starts "An earthquake will destroy a sizable portion of the coastal Northwest. The question is when." Stories include statements like "a massive earthquake is overdue," "in the next 50 years, there is a 1-in-10 chance a 'really big one' will erupt," or "the odds of the big Cascadia earthquake happening in the next fifty years are roughly one in three." These lead students to ask where the quoted probabilities come from and what they mean. These probability estimates involve two primary choices: what data are used to describe when past earthquakes happened, and what models are used to forecast when future earthquakes will happen. The data come from a 10,000-year record of large paleoearthquakes compiled from subsidence data on land and turbidites, offshore deposits recording submarine slope failure. Earthquakes seem to have happened in clusters of four or five events, separated by gaps. Earthquakes within a cluster occur more frequently and regularly than in the full record. Hence the next earthquake is more likely if we assume that we are in the recent cluster that started about 1700 years ago than if we assume the cluster is over. Students can explore how changing assumptions drastically changes probability estimates using easy-to-write and easy-to-display spreadsheets. Insight can also come from baseball analogies. The cluster issue is like deciding whether to assume that a hitter's performance in the next game is better described by his lifetime record or by the past few games, since he may be hitting unusually well or in a slump. The other big choice is whether to assume that the probability of an earthquake is constant with time, or is small immediately after one occurs and then grows with time. This is like whether to assume that a player's performance is the same from year to year, or changes over their career. Thus saying "the chance of
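The spreadsheet exercise described above can be mimicked in a few lines; the 50-year window matches the quoted statements, while the mean recurrence intervals are assumed round numbers chosen to illustrate the cluster-versus-full-record choice, not the paleoseismic estimates themselves:

```python
import math

def poisson_prob(window, mean_interval):
    # probability of at least one event in `window` years, assuming a
    # time-independent (Poisson) process with the given mean recurrence
    return 1.0 - math.exp(-window / mean_interval)

# assumed mean recurrence intervals (yr): one from the full 10,000-yr record,
# one from the shorter intervals within a cluster
print(poisson_prob(50, 530))  # full-record rate: roughly a 1-in-10 chance
print(poisson_prob(50, 150))  # within-cluster rate: closer to 1-in-3
```

This shows how the same 50-year window yields very different headline probabilities depending solely on which recurrence interval is assumed, before the second choice (constant versus time-growing hazard) is even considered.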

  2. UCERF3: A new earthquake forecast for California's complex fault system

    USGS Publications Warehouse

    Field, Edward H.; ,

    2015-01-01

    With innovations, fresh data, and lessons learned from recent earthquakes, scientists have developed a new earthquake forecast model for California, a region under constant threat from potentially damaging events. The new model, referred to as the third Uniform California Earthquake Rupture Forecast, or "UCERF3" (http://www.WGCEP.org/UCERF3), provides authoritative estimates of the magnitude, location, and likelihood of earthquake fault rupture throughout the state. Overall the results confirm previous findings, but with some significant changes because of model improvements. For example, compared to the previous forecast (Uniform California Earthquake Rupture Forecast 2), the likelihood of moderate-sized earthquakes (magnitude 6.5 to 7.5) is lower, whereas that of larger events is higher. This is because of the inclusion of multifault ruptures, where earthquakes are no longer confined to separate, individual faults, but can occasionally rupture multiple faults simultaneously. The public-safety implications of this and other model improvements depend on several factors, including site location and type of structure (for example, family dwelling compared to a long-span bridge). Building codes, earthquake insurance products, emergency plans, and other risk-mitigation efforts will be updated accordingly. This model also serves as a reminder that damaging earthquakes are inevitable for California. Fortunately, there are many simple steps residents can take to protect lives and property.

  3. Long Period Earthquakes Beneath California's Young and Restless Volcanoes

    NASA Astrophysics Data System (ADS)

    Pitt, A. M.; Dawson, P. B.; Shelly, D. R.; Hill, D. P.; Mangan, M.

    2013-12-01

    The newly established USGS California Volcano Observatory has the broad responsibility of monitoring and assessing hazards at California's potentially threatening volcanoes, most notably Mount Shasta, Medicine Lake, Clear Lake Volcanic Field, and Lassen Volcanic Center in northern California; and Long Valley Caldera, Mammoth Mountain, and Mono-Inyo Craters in east-central California. Volcanic eruptions occur in California about as frequently as the largest San Andreas Fault Zone earthquakes-more than ten eruptions have occurred in the last 1,000 years, most recently at Lassen Peak (1666 C.E. and 1914-1917 C.E.) and Mono-Inyo Craters (c. 1700 C.E.). The Long Valley region (Long Valley caldera and Mammoth Mountain) underwent several episodes of heightened unrest over the last three decades, including intense swarms of volcano-tectonic (VT) earthquakes, rapid caldera uplift, and hazardous CO2 emissions. Both Medicine Lake and Lassen are subsiding at appreciable rates, and along with Clear Lake, Long Valley Caldera, and Mammoth Mountain, sporadically experience long period (LP) earthquakes related to migration of magmatic or hydrothermal fluids. Worldwide, the last two decades have shown the importance of tracking LP earthquakes beneath young volcanic systems, as they often provide indication of impending unrest or eruption. Herein we document the occurrence of LP earthquakes at several of California's young volcanoes, updating a previous study published in Pitt et al., 2002, SRL. All events were detected and located using data from stations within the Northern California Seismic Network (NCSN). Event detection was spatially and temporally uneven across the NCSN in the 1980s and 1990s, but additional stations, adoption of the Earthworm processing system, and heightened vigilance by seismologists have improved the catalog over the last decade. 
LP earthquakes are now relatively well-recorded under Lassen (~150 events since 2000), Clear Lake (~60 events), Mammoth Mountain

  4. Depth dependence of earthquake frequency-magnitude distributions in California: Implications for rupture initiation

    USGS Publications Warehouse

    Mori, J.; Abercrombie, R.E.

    1997-01-01

    Statistics of earthquakes in California show linear frequency-magnitude relationships in the range of M2.0 to M5.5 for various data sets. Assuming Gutenberg-Richter distributions, there is a systematic decrease in b value with increasing depth of earthquakes. We find consistent results for various data sets from northern and southern California that both include and exclude the larger aftershock sequences. We suggest that at shallow depth (~0 to 6 km) conditions with more heterogeneous material properties and lower lithospheric stress prevail. Rupture initiations are more likely to stop before growing into large earthquakes, producing relatively more small earthquakes and consequently higher b values. These ideas help to explain the depth-dependent observations of foreshocks in the western United States. The higher occurrence rate of foreshocks preceding shallow earthquakes can be interpreted in terms of rupture initiations that are stopped before growing into the mainshock. At greater depth (9-15 km), any rupture initiation is more likely to continue growing into a larger event, so there are fewer foreshocks. If one assumes that frequency-magnitude statistics can be used to estimate probabilities of a small rupture initiation growing into a larger earthquake, then a small (M2) rupture initiation at 9 to 12 km depth is 18 times more likely to grow into a M5.5 or larger event, compared to the same small rupture initiation at 0 to 3 km. Copyright 1997 by the American Geophysical Union.
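    The factor of 18 follows directly from the Gutenberg-Richter relation: among ruptures that initiate as M2 events, the fraction that grow to M >= 5.5 scales as 10**(-b * 3.5), so a depthwise decrease in b raises the odds. A sketch with illustrative b values chosen to reproduce the quoted factor (not the paper's exact fits):

```python
def growth_probability(b: float, m_small: float, m_large: float) -> float:
    """Fraction of M >= m_small initiations reaching M >= m_large under Gutenberg-Richter."""
    return 10.0 ** (-b * (m_large - m_small))

b_shallow, b_deep = 1.2, 0.84   # illustrative: b decreases with depth
p_shallow = growth_probability(b_shallow, 2.0, 5.5)
p_deep = growth_probability(b_deep, 2.0, 5.5)
print(f"deep/shallow odds ratio: {p_deep / p_shallow:.1f}")  # ~18
```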

  5. The 2004 Parkfield, CA Earthquake: A Teachable Moment for Exploring Earthquake Processes, Probability, and Earthquake Prediction

    NASA Astrophysics Data System (ADS)

    Kafka, A.; Barnett, M.; Ebel, J.; Bellegarde, H.; Campbell, L.

    2004-12-01

    The occurrence of the 2004 Parkfield earthquake provided a unique "teachable moment" for students in our science course for teacher education majors. The course uses seismology as a medium for teaching a wide variety of science topics appropriate for future teachers. The 2004 Parkfield earthquake occurred just 15 minutes after our students completed a lab on earthquake processes and earthquake prediction. That lab included a discussion of the Parkfield Earthquake Prediction Experiment as a motivation for the exercises they were working on that day. Furthermore, this earthquake was recorded on an AS1 seismograph right in their lab, just minutes after the students left. About an hour after we recorded the earthquake, the students were able to see their own seismogram of the event in the lecture part of the course, which provided an excellent teachable moment for a lecture/discussion on how the occurrence of the 2004 Parkfield earthquake might affect seismologists' ideas about earthquake prediction. The specific lab exercise that the students were working on just before we recorded this earthquake was a "sliding block" experiment that simulates earthquakes in the classroom. The experimental apparatus includes a flat board on top of which are blocks of wood attached to a bungee cord and a string wrapped around a hand crank. Plate motion is modeled by slowly turning the crank, and earthquakes are modeled as events in which the block slips ("blockquakes"). We scaled the earthquake data and the blockquake data (using how much the string moved as a proxy for time) so that we could compare blockquakes and earthquakes. This provided an opportunity to use interevent-time histograms to teach about earthquake processes, probability, and earthquake prediction, and to compare earthquake sequences with blockquake sequences. We were able to show the students, using data obtained directly from their own lab, how global earthquake data fit a Poisson exponential distribution better
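    The interevent-time comparison the lab relies on can be sketched with synthetic catalogs: a Poisson process yields exponentially distributed interevent times with a coefficient of variation (CV) near 1, while a quasi-periodic sequence has CV well below 1. All numbers here are simulated, not classroom or earthquake data:

```python
import random

random.seed(0)

def coefficient_of_variation(samples):
    """Standard deviation divided by mean; ~1 for an exponential distribution."""
    n = len(samples)
    mean = sum(samples) / n
    variance = sum((x - mean) ** 2 for x in samples) / n
    return variance ** 0.5 / mean

# Poisson-like catalog: exponential interevent times.
poisson_times = [random.expovariate(1.0) for _ in range(20000)]
# Quasi-periodic catalog: narrowly spread interevent times.
periodic_times = [max(0.01, random.gauss(1.0, 0.2)) for _ in range(20000)]

cv_poisson = coefficient_of_variation(poisson_times)
cv_periodic = coefficient_of_variation(periodic_times)
print(round(cv_poisson, 2), round(cv_periodic, 2))  # CV near 1 vs CV well below 1
```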

  6. California Earthquake Residual Transportation Capability Study

    DOT National Transportation Integrated Search

    1983-12-01

    This report addresses the ability of transportation facilities in California to survive four postulated earthquakes that are based on historical events. The survival of highways, railroads, ports, airports, and pipelines is investigated following ind...

  7. Revisiting the 1872 Owens Valley, California, Earthquake

    USGS Publications Warehouse

    Hough, S.E.; Hutton, K.

    2008-01-01

    The 26 March 1872 Owens Valley earthquake is among the largest historical earthquakes in California. The felt area and maximum fault displacements have long been regarded as comparable to, if not greater than, those of the great San Andreas fault earthquakes of 1857 and 1906, but mapped surface ruptures of the latter two events were 2-3 times longer than that inferred for the 1872 rupture. The preferred magnitude estimate of the Owens Valley earthquake has thus been 7.4, based largely on the geological evidence. Reinterpreting macroseismic accounts of the Owens Valley earthquake, we infer generally lower intensity values than those estimated in earlier studies. Nonetheless, as recognized in the early twentieth century, the effects of this earthquake were still generally more dramatic at regional distances than the macroseismic effects from the 1906 earthquake, with light damage to masonry buildings at (nearest-fault) distances as large as 400 km. Macroseismic observations thus suggest a magnitude greater than that of the 1906 San Francisco earthquake, which appears to be at odds with geological observations. However, while the mapped rupture length of the Owens Valley earthquake is relatively low, the average slip was high. The surface rupture was also complex and extended over multiple fault segments. It was first mapped in detail over a century after the earthquake occurred, and recent evidence suggests it might have been longer than earlier studies indicated. Our preferred magnitude estimate is Mw 7.8-7.9, values that we show are consistent with the geological observations. The results of our study suggest that either the Owens Valley earthquake was larger than the 1906 San Francisco earthquake or that, by virtue of source properties and/or propagation effects, it produced systematically higher ground motions at regional distances. 
The latter possibility implies that some large earthquakes in California will generate significantly larger ground motions than San

  8. Significance of stress transfer in time-dependent earthquake probability calculations

    USGS Publications Warehouse

    Parsons, T.

    2005-01-01

    A sudden change in stress is seen to modify earthquake rates, but should it also revise earthquake probability? Data used to derive input parameters permit an array of forecasts, so how large a static stress change is required to cause a statistically significant earthquake probability change? To answer that question, effects of parameter and philosophical choices are examined through all phases of sample calculations. Drawing at random from distributions of recurrence-aperiodicity pairs identifies many that recreate long paleoseismic and historic earthquake catalogs. Probability density functions built from the recurrence-aperiodicity pairs give the range of possible earthquake forecasts under a point process renewal model. Consequences of choices made in stress transfer calculations, such as different slip models, fault rake, dip, and friction, are tracked. For interactions among large faults, calculated peak stress changes may be localized, with most of the receiving fault area changed less than the mean. Thus, to avoid overstating probability change on segments, stress change values should be drawn from a distribution reflecting the spatial pattern rather than using the segment mean. Disparity resulting from interaction probability methodology is also examined. For a fault with a well-understood earthquake history, a minimum stress change to stressing rate ratio of 10:1 to 20:1 is required to significantly skew probabilities with >80-85% confidence. That ratio must be closer to 50:1 to exceed 90-95% confidence levels. Thus revision to earthquake probability is achievable when a perturbing event is very close to the fault in question or the tectonic stressing rate is low.
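    The kind of calculation being tested can be sketched with a renewal model in which a static stress change advances the fault's clock by the stress change divided by the tectonic stressing rate. The sketch below uses a lognormal recurrence distribution and illustrative parameters (200-yr median recurrence, aperiodicity 0.5, a 10:1 stress-change-to-stressing-rate ratio), not values from the paper:

```python
import math

def lognormal_cdf(t: float, median: float, aperiodicity: float) -> float:
    """CDF of a lognormal recurrence-time distribution (sigma = aperiodicity)."""
    if t <= 0:
        return 0.0
    return 0.5 * (1.0 + math.erf(math.log(t / median) / (aperiodicity * math.sqrt(2.0))))

def conditional_prob(elapsed: float, window: float, median: float, aperiodicity: float) -> float:
    """P(failure within `window` years given survival to `elapsed` years)."""
    f0 = lognormal_cdf(elapsed, median, aperiodicity)
    f1 = lognormal_cdf(elapsed + window, median, aperiodicity)
    return (f1 - f0) / (1.0 - f0)

median, sigma, elapsed, window = 200.0, 0.5, 150.0, 30.0
p_base = conditional_prob(elapsed, window, median, sigma)

# A stress step maps to a clock advance dt = stress change / stressing rate;
# a 10:1 ratio of stress change to annual stressing rate is a 10-yr advance.
p_stepped = conditional_prob(elapsed + 10.0, window, median, sigma)

print(f"baseline 30-yr probability: {p_base:.3f}")
print(f"after 10-yr clock advance:  {p_stepped:.3f}")
```

With these numbers the 30-yr probability rises only from about 0.19 to about 0.20, illustrating why modest stress steps rarely produce statistically significant probability changes.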

  9. Paleoseismic event dating and the conditional probability of large earthquakes on the southern San Andreas fault, California

    USGS Publications Warehouse

    Biasi, G.P.; Weldon, R.J.; Fumal, T.E.; Seitz, G.G.

    2002-01-01

    We introduce a quantitative approach to paleoearthquake dating and apply it to paleoseismic data from the Wrightwood and Pallett Creek sites on the southern San Andreas fault. We illustrate how stratigraphic ordering, sedimentological, and historical data can be used quantitatively in the process of estimating earthquake ages. Calibrated radiocarbon age distributions are used directly from layer dating through recurrence intervals and recurrence probability estimation. The method does not eliminate subjective judgements in event dating, but it does provide a means of systematically and objectively approaching the dating process. Date distributions for the most recent 14 events at Wrightwood are based on sample and contextual evidence in Fumal et al. (2002) and site context and slip history in Weldon et al. (2002). Pallett Creek event and dating descriptions are from published sources. For the five most recent events at Wrightwood, our results are consistent with previously published estimates, with generally comparable or narrower uncertainties. For Pallett Creek, our earthquake date estimates generally overlap with previous results but typically have broader uncertainties. Some event date estimates are very sensitive to details of data interpretation. The historical earthquake in 1857 ruptured the ground at both sites but is not constrained by radiocarbon data. Radiocarbon ages, peat accumulation rates, and historical constraints at Pallett Creek for event X yield a date estimate in the earliest 1800s and preclude a date in the late 1600s. This event is almost certainly the historical 1812 earthquake, as previously concluded by Sieh et al. (1989). This earthquake also produced ground deformation at Wrightwood. All events at Pallett Creek, except for event T, about A.D. 1360, and possibly event I, about A.D. 960, have corresponding events at Wrightwood with some overlap in age ranges. 
Event T falls during a period of low sedimentation at Wrightwood when conditions
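    The quantitative approach described, carrying calibrated age distributions through to recurrence intervals, can be sketched by Monte Carlo: sample one date per event from its PDF, discard draws that violate stratigraphic ordering, and accumulate interval statistics. The event dates and normal approximations below are hypothetical stand-ins for calibrated radiocarbon PDFs:

```python
import random

random.seed(7)

# Hypothetical age distributions for three successive events, youngest first,
# approximated as normals (the study uses full calibrated radiocarbon PDFs).
event_ages = [(1480, 30), (1350, 40), (1100, 50)]  # (mean year C.E., std dev)

def sample_intervals():
    """Draw one age per event, keeping only draws that honor stratigraphic order."""
    while True:
        ages = [random.gauss(mu, sd) for mu, sd in event_ages]
        if ages[0] > ages[1] > ages[2]:
            return [ages[i] - ages[i + 1] for i in range(len(ages) - 1)]

intervals = [sample_intervals() for _ in range(10000)]
mean_first = sum(iv[0] for iv in intervals) / len(intervals)
print(round(mean_first))  # mean interval between the two youngest events
```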

  10. Two examples of earthquake- hazard reduction in southern California.

    USGS Publications Warehouse

    Kockelman, W.J.; Campbell, C.C.

    1983-01-01

    Because California is seismically active, planners and decisionmakers must try to anticipate earthquake hazards there and, where possible, to reduce the hazards. Geologic and seismologic information provides the basis for the necessary plans and actions. Two examples of how such information is used are presented. The first involves assessing the impact of a major earthquake on critical facilities in southern California, and the second involves strengthening or removing unsafe masonry buildings in the Los Angeles area. -from Authors

  11. Fundamental questions of earthquake statistics, source behavior, and the estimation of earthquake probabilities from possible foreshocks

    USGS Publications Warehouse

    Michael, Andrew J.

    2012-01-01

    Estimates of the probability that an ML 4.8 earthquake, which occurred near the southern end of the San Andreas fault on 24 March 2009, would be followed by an M 7 mainshock over the following three days vary from 0.0009 using a Gutenberg–Richter model of aftershock statistics (Reasenberg and Jones, 1989) to 0.04 using a statistical model of foreshock behavior and long‐term estimates of large earthquake probabilities, including characteristic earthquakes (Agnew and Jones, 1991). I demonstrate that the disparity between the existing approaches depends on whether or not they conform to Gutenberg–Richter behavior. While Gutenberg–Richter behavior is well established over large regions, it could be violated on individual faults if they have characteristic earthquakes or over small areas if the spatial distribution of large‐event nucleations is disproportional to the rate of smaller events. I develop a new form of the aftershock model that includes characteristic behavior and combines the features of both models. This new model and the older foreshock model yield the same results when given the same inputs, but the new model has the advantage of producing probabilities for events of all magnitudes, rather than just for events larger than the initial one. Compared with the aftershock model, the new model has the advantage of taking into account long‐term earthquake probability models. Using consistent parameters, the probability of an M 7 mainshock on the southernmost San Andreas fault is 0.0001 for three days from long‐term models and the clustering probabilities following the ML 4.8 event are 0.00035 for a Gutenberg–Richter distribution and 0.013 for a characteristic‐earthquake magnitude–frequency distribution. Our decisions about the existence of characteristic earthquakes and how large earthquakes nucleate have a first‐order effect on the probabilities obtained from short‐term clustering models for these large events.
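    The mechanics of combining a long-term prior with foreshock evidence can be sketched with Bayes' rule, in the spirit of the foreshock model described; all three inputs below are hypothetical illustrations, not the paper's values:

```python
def mainshock_prob(prior: float, p_fore_given_main: float, p_background: float) -> float:
    """Posterior P(mainshock in window | candidate foreshock) via Bayes' rule (sketch)."""
    num = p_fore_given_main * prior
    return num / (num + p_background * (1.0 - prior))

# Hypothetical inputs: long-term 3-day mainshock probability (prior), chance a
# mainshock is preceded by such a candidate, and chance of a background candidate.
p = mainshock_prob(prior=1e-4, p_fore_given_main=0.5, p_background=0.01)
print(f"{p:.4f}")
```

Raising the prior, for example by letting a characteristic-earthquake distribution concentrate large-event nucleations on the fault, raises the small posterior almost proportionally, which is the first-order effect the abstract describes.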

  12. The initial subevent of the 1994 Northridge, California, earthquake: Is earthquake size predictable?

    USGS Publications Warehouse

    Kilb, Debi; Gomberg, J.

    1999-01-01

    We examine the initial subevent (ISE) of the Mw 6.7, 1994 Northridge, California, earthquake in order to discriminate between two end-member rupture initiation models: the 'preslip' and 'cascade' models. Final earthquake size may be predictable from an ISE's seismic signature in the preslip model but not in the cascade model. In the cascade model ISEs are simply small earthquakes that can be described as purely dynamic ruptures. In this model a large earthquake is triggered by smaller earthquakes; there is no size scaling between triggering and triggered events and a variety of stress transfer mechanisms are possible. Alternatively, in the preslip model, a large earthquake nucleates as an aseismically slipping patch in which the patch dimension grows and scales with the earthquake's ultimate size; the byproduct of this loading process is the ISE. In this model, the duration of the ISE signal scales with the ultimate size of the earthquake, suggesting that nucleation and earthquake size are determined by a more predictable, measurable, and organized process. To distinguish between these two end-member models we use short period seismograms recorded by the Southern California Seismic Network. We address questions regarding the similarity in hypocenter locations and focal mechanisms of the ISE and the mainshock. We also compare the ISE's waveform characteristics to those of small earthquakes and to the beginnings of earthquakes with a range of magnitudes. We find that the focal mechanisms of the ISE and mainshock are indistinguishable, and both events may have nucleated on and ruptured the same fault plane. These results satisfy the requirements for both models and thus do not discriminate between them. However, further tests show the ISE's waveform characteristics are similar to those of typical small earthquakes in the vicinity and more importantly, do not scale with the mainshock magnitude. These results are more consistent with the cascade model.

  13. Time-dependent earthquake probabilities

    USGS Publications Warehouse

    Gomberg, J.; Belardinelli, M.E.; Cocco, M.; Reasenberg, P.

    2005-01-01

    We have attempted to provide a careful examination of a class of approaches for estimating the conditional probability of failure of a single large earthquake, particularly approaches that account for static stress perturbations to tectonic loading, as in the approaches of Stein et al. (1997) and Hardebeck (2004). We have developed a framework based on a simple, generalized rate change formulation and applied it to these two approaches to show how they relate to one another. We also have attempted to show the connection between models of seismicity rate changes applied to (1) populations of independent faults, as in background and aftershock seismicity, and (2) changes in estimates of the conditional probability of failure of a single fault, where the notion of failure rate corresponds to successive failures of different members of a population of faults. The latter application requires specification of some probability distribution (probability density function, or PDF) that describes some population of potential recurrence times. This PDF may reflect our imperfect knowledge of when past earthquakes have occurred on a fault (epistemic uncertainty), the true natural variability in failure times, or some combination of both. We suggest two end-member conceptual single-fault models that may explain natural variability in recurrence times and suggest how they might be distinguished observationally. When viewed deterministically, these single-fault patch models differ significantly in their physical attributes, and when faults are immature, they differ in their responses to stress perturbations. Estimates of conditional failure probabilities effectively integrate over a range of possible deterministic fault models, usually with ranges that correspond to mature faults. Thus conditional failure probability estimates usually should not differ significantly for these models. Copyright 2005 by the American Geophysical Union.
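    The contrast between time-independent and time-dependent conditional probabilities can be sketched with two recurrence PDFs: an exponential (Poisson) distribution, which is memoryless, and a Weibull renewal distribution, whose hazard grows with elapsed time. All parameters below are illustrative:

```python
import math

def conditional_poisson(elapsed: float, window: float, mean_rt: float) -> float:
    """Exponential recurrence is memoryless: elapsed time does not matter."""
    return 1.0 - math.exp(-window / mean_rt)

def conditional_weibull(elapsed: float, window: float, scale: float, shape: float) -> float:
    """Weibull renewal: survival S(t) = exp(-(t/scale)**shape); hazard grows for shape > 1."""
    s0 = math.exp(-((elapsed / scale) ** shape))
    s1 = math.exp(-(((elapsed + window) / scale) ** shape))
    return 1.0 - s1 / s0

# 30-yr conditional probability at two elapsed times since the last event.
for elapsed in (50.0, 150.0):
    print(elapsed,
          round(conditional_poisson(elapsed, 30.0, 200.0), 3),
          round(conditional_weibull(elapsed, 30.0, 200.0, 2.0), 3))
```

The Poisson column is identical in both rows, while the Weibull column grows with elapsed time, which is why only the renewal formulation can register a stress perturbation as a change in probability.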

  14. Losses to single-family housing from ground motions in the 1994 Northridge, California, earthquake

    USGS Publications Warehouse

    Wesson, R.L.; Perkins, D.M.; Leyendecker, E.V.; Roth, R.J.; Petersen, M.D.

    2004-01-01

    The distributions of insured losses to single-family housing following the 1994 Northridge, California, earthquake for 234 ZIP codes can be satisfactorily modeled with gamma distributions. Regressions of the parameters in the gamma distribution on estimates of ground motion, derived from ShakeMap estimates or from interpolated observations, provide a basis for developing curves of conditional probability of loss given a ground motion. Comparison of the resulting estimates of aggregate loss with the actual aggregate loss gives satisfactory agreement for several different ground-motion parameters. Estimates of loss based on a deterministic spatial model of the earthquake ground motion, using standard attenuation relationships and NEHRP soil factors, give satisfactory results for some ground-motion parameters if the input ground motions are increased about one and one-half standard deviations above the median, reflecting the fact that the ground motions for the Northridge earthquake tended to be higher than the median ground motion for other earthquakes with similar magnitude. The results give promise for making estimates of insured losses to a similar building stock under future earthquake loading. © 2004, Earthquake Engineering Research Institute.
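    The gamma-distribution loss model can be sketched with a method-of-moments fit plus a Monte Carlo exceedance estimate; the mean and variance below are hypothetical, not the Northridge ZIP-code values:

```python
import random

random.seed(42)

def gamma_params_from_moments(mean: float, variance: float):
    """Method-of-moments fit of a gamma distribution: shape k, scale theta."""
    theta = variance / mean
    k = mean / theta
    return k, theta

# Hypothetical ZIP-code loss data: mean loss fraction 0.05, variance 0.002.
k, theta = gamma_params_from_moments(0.05, 0.002)

# Monte Carlo exceedance probability: P(loss fraction > 0.10).
samples = [random.gammavariate(k, theta) for _ in range(100000)]
p_exceed = sum(1 for s in samples if s > 0.10) / len(samples)
print(round(k, 3), round(theta, 3), round(p_exceed, 3))
```

In the study's setting, the fitted k and theta would themselves be regressed on the ground-motion parameter, giving a curve of conditional loss probability versus shaking level.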

  15. SCIGN; new Southern California GPS network advances the study of earthquakes

    USGS Publications Warehouse

    Hudnut, Ken; King, Nancy

    2001-01-01

    Southern California is a giant jigsaw puzzle, and scientists are now using GPS satellites to track the pieces. These puzzle pieces are continuously moving, slowly straining the faults in between. That strain is then eventually released in earthquakes. The innovative Southern California Integrated GPS Network (SCIGN) tracks the motions of these pieces over most of southern California with unprecedented precision. This new network greatly improves the ability to assess seismic hazards and quickly measure the larger displacements that occur during and immediately after earthquakes.

  16. Relative Contributions of Geothermal Pumping and Long-Term Earthquake Rate to Seismicity at California Geothermal Fields

    NASA Astrophysics Data System (ADS)

    Weiser, D. A.; Jackson, D. D.

    2015-12-01

    In a tectonically active area, a definitive discrimination between geothermally-induced and tectonic earthquakes is difficult to achieve. We focus our study on California's 11 major geothermal fields: Amedee, Brawley, Casa Diablo, Coso, East Mesa, The Geysers, Heber, Litchfield, Salton Sea, Susanville, and Wendel. The Geysers geothermal field is the world's largest geothermal energy producer. California's Department of Oil, Gas, and Geothermal Resources provides field-wide monthly injection and production volumes for each of these sites, which allows us to study the relationship between geothermal pumping activities and seismicity. Since many of the geothermal fields began injecting and producing before nearby seismic stations were installed, we use smoothed seismicity since 1932 from the ANSS catalog as a proxy for tectonic earthquake rate. We examine both geothermal pumping and long-term earthquake rate as factors that may control earthquake rate. Rather than focusing only on the largest earthquake, which is essentially a random occurrence in time, we examine how M≥4 earthquake rate density (probability per unit area, time, and magnitude) varies for each field. We estimate relative contributions to the observed earthquake rate of M≥4 from both a long-term earthquake rate (Kagan and Jackson, 2010) and pumping activity. For each geothermal field, respective earthquake catalogs (NCEDC and SCSN) are complete above at least M3 during the test period (which we tailor to each site). We test the hypothesis that the observed earthquake rate at a geothermal site during the test period is a linear combination of the long-term seismicity and pumping rates. We use a grid search to determine the confidence interval of the weighting parameters.
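    The hypothesis test described, observed rate as a linear combination of long-term and pumping rates with weights found by grid search, can be sketched by maximizing a Poisson likelihood over a weight grid. All rates and counts below are synthetic illustrations, not the study's data:

```python
import math

# Synthetic yearly inputs: smoothed long-term rate, scaled pumping rate, counts.
tectonic = [0.8, 0.8, 0.9, 1.0, 1.0, 1.1, 1.1, 1.2]
pumping  = [0.1, 0.3, 0.5, 0.9, 1.2, 1.4, 1.3, 1.5]
observed = [1, 1, 2, 2, 3, 3, 2, 4]   # hypothetical M>=4 counts per year

def neg_log_likelihood(w_t: float, w_p: float) -> float:
    """Poisson negative log-likelihood for rate = w_t*tectonic + w_p*pumping."""
    total = 0.0
    for lam_t, lam_p, n in zip(tectonic, pumping, observed):
        lam = max(1e-9, w_t * lam_t + w_p * lam_p)
        total -= n * math.log(lam) - lam - math.lgamma(n + 1)
    return total

# Grid search over the two weights, as in the abstract's approach.
best = min((neg_log_likelihood(wt, wp), wt, wp)
           for wt in [i / 20 for i in range(41)]
           for wp in [i / 20 for i in range(41)])
print("best (w_tectonic, w_pumping):", best[1], best[2])
```

A confidence interval on the weights then follows from the likelihood surface around the grid minimum.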

  17. Future WGCEP Models and the Need for Earthquake Simulators

    NASA Astrophysics Data System (ADS)

    Field, E. H.

    2008-12-01

    The 2008 Working Group on California Earthquake Probabilities (WGCEP) recently released the Uniform California Earthquake Rupture Forecast version 2 (UCERF 2), developed jointly by the USGS, CGS, and SCEC with significant support from the California Earthquake Authority. Although this model embodies several significant improvements over previous WGCEPs, the following are some of the significant shortcomings that we hope to resolve in a future UCERF3: 1) assumptions of fault segmentation and the lack of fault-to-fault ruptures; 2) the lack of an internally consistent methodology for computing time-dependent, elastic-rebound-motivated renewal probabilities; 3) the lack of earthquake clustering/triggering effects; and 4) unwarranted model complexity. It is believed by some that physics-based earthquake simulators will be key to resolving these issues, either as exploratory tools to help guide the present statistical approaches, or as a means to forecast earthquakes directly (although significant challenges remain with respect to the latter).

  18. In the shadow of 1857-the effect of the great Ft. Tejon earthquake on subsequent earthquakes in southern California

    USGS Publications Warehouse

    Harris, R.A.; Simpson, R.W.

    1996-01-01

    The great 1857 Fort Tejon earthquake is the largest earthquake to have hit southern California during the historic period. We investigated if seismicity patterns following 1857 could be due to static stress changes generated by the 1857 earthquake. When post-1857 earthquakes with unknown focal mechanisms were assigned strike-slip mechanisms with strike and rake determined by the nearest active fault, 13 of the 13 southern California M≥5.5 earthquakes between 1857 and 1907 were encouraged by the 1857 rupture. When post-1857 earthquakes in the Transverse Ranges with unknown focal mechanisms were assigned reverse mechanisms and all other events were assumed strike-slip, 11 of the 13 earthquakes were encouraged by the 1857 earthquake. These results show significant correlations between static stress changes and seismicity patterns. The correlation disappears around 1907, suggesting that tectonic loading began to overwhelm the effect of the 1857 earthquake early in the 20th century.
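    The significance of the "13 of 13" and "11 of 13" counts can be checked with a binomial tail probability: if static stress changes were irrelevant, each earthquake would land in an encouraged region with probability about one half. This back-of-envelope check is an illustration, not the paper's own statistical analysis:

```python
from math import comb

def binom_tail(n: int, k: int, p: float = 0.5) -> float:
    """P(X >= k) for X ~ Binomial(n, p): chance of k or more encouraged events."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

print(binom_tail(13, 13))  # 13 of 13 encouraged: ~1.2e-4
print(binom_tail(13, 11))  # 11 of 13 encouraged: ~0.011
```

Both tail probabilities are small, consistent with the abstract's claim of a significant correlation between the 1857 stress changes and subsequent seismicity.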

  19. Catalog of earthquakes along the San Andreas fault system in Central California, July-September 1972

    USGS Publications Warehouse

    Wesson, R.L.; Meagher, K.L.; Lester, F.W.

    1973-01-01

    Numerous small earthquakes occur each day in the coast ranges of Central California. The detailed study of these earthquakes provides a tool for gaining insight into the tectonic and physical processes responsible for the generation of damaging earthquakes. This catalog contains the fundamental parameters for earthquakes located within and adjacent to the seismograph network operated by the National Center for Earthquake Research (NCER), U.S. Geological Survey, during the period July - September, 1972. The motivation for these detailed studies has been described by Pakiser and others (1969) and by Eaton and others (1970). Similar catalogs of earthquakes for the years 1969, 1970 and 1971 have been prepared by Lee and others (1972 b, c, d). Catalogs for the first and second quarters of 1972 have been prepared by Wesson and others (1972 a & b). The basic data contained in these catalogs provide a foundation for further studies. This catalog contains data on 1254 earthquakes in Central California. Arrival times at 129 seismograph stations were used to locate the earthquakes listed in this catalog. Of these, 104 are telemetered stations operated by NCER. Readings from the remaining 25 stations were obtained through the courtesy of the Seismographic Stations, University of California, Berkeley (UCB); the Earthquake Mechanism Laboratory, National Oceanic and Atmospheric Administration, San Francisco (EML); and the California Department of Water Resources, Sacramento. The Seismographic Stations of the University of California, Berkeley, have for many years published a bulletin describing earthquakes in Northern California and the surrounding area, and readings at UCB Stations from more distant events. The purpose of the present catalog is not to replace the UCB Bulletin, but rather to supplement it, by describing the seismicity of a portion of central California in much greater detail.

  20. Catalog of earthquakes along the San Andreas fault system in Central California: January-March, 1972

    USGS Publications Warehouse

    Wesson, R.L.; Bennett, R.E.; Meagher, K.L.

    1973-01-01

    Numerous small earthquakes occur each day in the Coast Ranges of Central California. The detailed study of these earthquakes provides a tool for gaining insight into the tectonic and physical processes responsible for the generation of damaging earthquakes. This catalog contains the fundamental parameters for earthquakes located within and adjacent to the seismograph network operated by the National Center for Earthquake Research (NCER), U.S. Geological Survey, during the period January - March, 1972. The motivation for these detailed studies has been described by Pakiser and others (1969) and by Eaton and others (1970). Similar catalogs of earthquakes for the years 1969, 1970 and 1971 have been prepared by Lee and others (1972 b,c,d). The basic data contained in these catalogs provide a foundation for further studies. This catalog contains data on 1,718 earthquakes in Central California. Of particular interest is a sequence of earthquakes in the Bear Valley area which contained single shocks with local magnitudes of 5.0 and 4.6. Earthquakes from this sequence make up roughly 66% of the total and are currently the subject of an interpretative study. Arrival times at 118 seismograph stations were used to locate the earthquakes listed in this catalog. Of these, 94 are telemetered stations operated by NCER. Readings from the remaining 24 stations were obtained through the courtesy of the Seismographic Stations, University of California, Berkeley (UCB); the Earthquake Mechanism Laboratory, National Oceanic and Atmospheric Administration, San Francisco (EML); and the California Department of Water Resources, Sacramento. The Seismographic Stations of the University of California, Berkeley, have for many years published a bulletin describing earthquakes in Northern California and the surrounding area, and readings at UCB Stations from more distant events.
The purpose of the present catalog is not to replace the UCB Bulletin, but rather to supplement it, by describing the

  1. Catalog of earthquakes along the San Andreas fault system in Central California, April-June 1972

    USGS Publications Warehouse

    Wesson, R.L.; Bennett, R.E.; Lester, F.W.

    1973-01-01

    Numerous small earthquakes occur each day in the Coast Ranges of Central California. The detailed study of these earthquakes provides a tool for gaining insight into the tectonic and physical processes responsible for the generation of damaging earthquakes. This catalog contains the fundamental parameters for earthquakes located within and adjacent to the seismograph network operated by the National Center for Earthquake Research (NCER), U.S. Geological Survey, during the period April - June, 1972. The motivation for these detailed studies has been described by Pakiser and others (1969) and by Eaton and others (1970). Similar catalogs of earthquakes for the years 1969, 1970 and 1971 have been prepared by Lee and others (1972 b, c, d). A catalog for the first quarter of 1972 has been prepared by Wesson and others (1972). The basic data contained in these catalogs provide a foundation for further studies. This catalog contains data on 910 earthquakes in Central California. A substantial portion of the earthquakes reported in this catalog represents a continuation of the sequence of earthquakes in the Bear Valley area which began in February, 1972 (Wesson and others, 1972). Arrival times at 126 seismograph stations were used to locate the earthquakes listed in this catalog. Of these, 101 are telemetered stations operated by NCER. Readings from the remaining 25 stations were obtained through the courtesy of the Seismographic Stations, University of California, Berkeley (UCB); the Earthquake Mechanism Laboratory, National Oceanic and Atmospheric Administration, San Francisco (EML); and the California Department of Water Resources, Sacramento. The Seismographic Stations of the University of California, Berkeley, have for many years published a bulletin describing earthquakes in Northern California and the surrounding area, and readings at UCB Stations from more distant events.
The purpose of the present catalog is not to replace the UCB Bulletin, but rather to supplement

  2. Seismicity alert probabilities at Parkfield, California, revisited

    USGS Publications Warehouse

    Michael, A.J.; Jones, L.M.

    1998-01-01

    For a decade, the US Geological Survey has used the Parkfield Earthquake Prediction Experiment scenario document to estimate the probability that earthquakes observed on the San Andreas fault near Parkfield will turn out to be foreshocks followed by the expected magnitude six mainshock. During this time, we have learned much about the seismogenic process at Parkfield, about the long-term probability of the Parkfield mainshock, and about the estimation of these types of probabilities. The probabilities for potential foreshocks at Parkfield are reexamined and revised in light of these advances. As part of this process, we have confirmed both the rate of foreshocks before strike-slip earthquakes in the San Andreas physiographic province and the uniform distribution of foreshocks with magnitude proposed by earlier studies. Compared to the earlier assessment, these new estimates of the long-term probability of the Parkfield mainshock are lower, our estimate of the rate of background seismicity is higher, and we find that the assumption that foreshocks at Parkfield occur in a unique way is not statistically significant at the 95% confidence level. While the exact numbers vary depending on the assumptions that are made, the new alert probabilities are lower than previously estimated. Considering the various assumptions and the statistical uncertainties in the input parameters, we also compute a plausible range for the probabilities. The range is large, partly due to the extra knowledge that exists for the Parkfield segment, making us question the usefulness of these numbers.
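
    The alert-probability logic described here can be sketched as a simple Bayesian odds calculation in the spirit of foreshock-alert studies. The function and all numerical values below are hypothetical placeholders for illustration, not the paper's estimates.

```python
def alert_probability(p_char, rate_foreshock, rate_background):
    """Probability that a candidate event is a foreshock of the mainshock.

    p_char          -- long-term probability of the mainshock in the window
    rate_foreshock  -- expected number of foreshocks, given the mainshock occurs
    rate_background -- expected number of unrelated background events
    """
    numer = rate_foreshock * p_char
    denom = numer + rate_background * (1.0 - p_char)
    return numer / denom

# Illustrative values only: lowering the long-term mainshock probability and
# raising the background seismicity rate, the two revisions reported in the
# abstract, both lower the alert probability.
p_old = alert_probability(p_char=0.9, rate_foreshock=0.5, rate_background=0.1)
p_new = alert_probability(p_char=0.5, rate_foreshock=0.5, rate_background=0.3)
assert p_new < p_old
```

    This illustrates the qualitative conclusion of the abstract: the revised inputs necessarily produce lower alert probabilities, whatever the exact numbers.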

  3. Probabilities of Earthquake Occurrences along the Sumatra-Andaman Subduction Zone

    NASA Astrophysics Data System (ADS)

    Pailoplee, Santi

    2017-03-01

    Earthquake activities along the Sumatra-Andaman Subduction Zone (SASZ) were clarified using the derived frequency-magnitude distribution in terms of the (i) most probable maximum magnitudes, (ii) return periods and (iii) probabilities of earthquake occurrences. The northern segment of the SASZ, along the western coast of Myanmar to southern Nicobar, was found to be capable of generating an earthquake of magnitude 6.1-6.4 Mw in the next 30-50 years, whilst the southern segment, offshore of the northwestern and western parts of Sumatra (defined as a high-hazard region), had short recurrence intervals of 6-12 and 10-30 years for 6.0 and 7.0 Mw earthquakes, respectively, compared with the other regions. Throughout the area along the SASZ, there is a 70% to almost 100% probability that an earthquake of Mw up to 6.0 will be generated in the next 50 years, whilst the northern segment has less than a 50% chance of a 7.0 Mw earthquake in the next 50 years. Although Rangoon was identified as the lowest-hazard of the major cities in the vicinity of the SASZ, it still has a 90% chance of a 6.0 Mw earthquake in the next 50 years. An effective seismic hazard mitigation plan should therefore be developed.
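
    The chain from a frequency-magnitude distribution to return periods and occurrence probabilities can be sketched as follows. The Gutenberg-Richter parameters here (a = 4.5, b = 1.0) are made-up placeholders, not the study's estimates.

```python
import math

def annual_rate(mag, a=4.5, b=1.0):
    """Gutenberg-Richter: cumulative annual rate of events >= mag (illustrative a, b)."""
    return 10.0 ** (a - b * mag)

def return_period(mag, **kw):
    """Mean recurrence interval in years for events >= mag."""
    return 1.0 / annual_rate(mag, **kw)

def occurrence_probability(mag, years, **kw):
    """Poisson probability of at least one event >= mag within `years`."""
    return 1.0 - math.exp(-annual_rate(mag, **kw) * years)

# With these placeholder parameters, larger magnitudes are always rarer
# (longer return periods) and less probable over a fixed 50-yr window.
assert return_period(6.0) < return_period(7.0)
assert occurrence_probability(7.0, 50) < occurrence_probability(6.0, 50)
```

    The same three quantities, rate, return period, and window probability, are what the abstract reports for each SASZ segment.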

  4. Chapter C. The Loma Prieta, California, Earthquake of October 17, 1989 - Landslides

    USGS Publications Warehouse

    Keefer, David K.

    1998-01-01

    Central California, in the vicinity of San Francisco and Monterey Bays, has a history of fatal and damaging landslides, triggered by heavy rainfall, coastal and stream erosion, construction activity, and earthquakes. The great 1906 San Francisco earthquake (MS=8.2-8.3) generated more than 10,000 landslides throughout an area of 32,000 km2; these landslides killed at least 11 people and caused substantial damage to buildings, roads, railroads, and other civil works. Smaller numbers of landslides, which caused more localized damage, have also been reported from at least 20 other earthquakes that have occurred in the San Francisco Bay-Monterey Bay region since 1838. Conditions that make this region particularly susceptible to landslides include steep and rugged topography, weak rock and soil materials, seasonally heavy rainfall, and active seismicity. Given these conditions and history, it was no surprise that the 1989 Loma Prieta earthquake generated thousands of landslides throughout the region. Landslides caused one fatality and damaged at least 200 residences, numerous roads, and many other structures. Direct damage from landslides probably exceeded $30 million; additional, indirect economic losses were caused by long-term landslide blockage of two major highways and by delays in rebuilding brought about by concern over the potential long-term instability of some earthquake-damaged slopes.

  5. Automatic 3D Moment tensor inversions for southern California earthquakes

    NASA Astrophysics Data System (ADS)

    Liu, Q.; Tape, C.; Friberg, P.; Tromp, J.

    2008-12-01

    We present a new source mechanism (moment-tensor and depth) catalog for about 150 recent southern California earthquakes with Mw ≥ 3.5. We carefully select the initial solutions from a few available earthquake catalogs as well as our own preliminary 3D moment tensor inversion results. We pick useful data windows by assessing the quality of fits between the data and synthetics using the automatic windowing package FLEXWIN (Maggi et al., 2008). We compute the source Fréchet derivatives of moment-tensor elements and depth for a recent 3D southern California velocity model inverted using finite-frequency event kernels calculated by adjoint methods and a nonlinear conjugate gradient technique with subspace preconditioning (Tape et al., 2008). We then invert for the source mechanisms and event depths using the techniques introduced by Liu et al. (2005). We assess the quality of this new catalog, as well as the other existing ones, by computing the 3D synthetics for the updated 3D southern California model. We also plan to implement the moment-tensor inversion methods to automatically determine the source mechanisms for earthquakes with Mw ≥ 3.5 in southern California.

  6. Time-dependent earthquake probability calculations for southern Kanto after the 2011 M9.0 Tohoku earthquake

    NASA Astrophysics Data System (ADS)

    Nanjo, K. Z.; Sakai, S.; Kato, A.; Tsuruoka, H.; Hirata, N.

    2013-05-01

    Seismicity in southern Kanto activated with the 2011 March 11 Tohoku earthquake of magnitude M9.0, but does this cause a significant difference in the probability of more earthquakes at present or in the future? To answer this question, we examine the effect of a change in the seismicity rate on the probability of earthquakes. Our data set is from the Japan Meteorological Agency earthquake catalogue, downloaded on 2012 May 30. Our approach is based on time-dependent earthquake probability calculations, often used for aftershock hazard assessment, which rest on two statistical laws: the Gutenberg-Richter (GR) frequency-magnitude law and the Omori-Utsu (OU) aftershock-decay law. We first confirm that the seismicity following a quake of M4 or larger is well modelled by the GR law with b ˜ 1. Then, there is good agreement with the OU law with p ˜ 0.5, which indicates that the decay was notably slow. Based on these results, we then calculate the most probable estimates of future M6-7-class events for various periods, all with a starting date of 2012 May 30. The estimates are higher than pre-quake levels if we consider a period of 3-yr duration or shorter. However, for statistics-based forecasting such as this, errors that arise from parameter estimation must be considered. Taking into account the contribution of these errors to the probability calculations, we conclude that any increase in the probability of earthquakes is insignificant. Although we try to avoid overstating the change in probability, our observations combined with results from previous studies support the likelihood that afterslip (fault creep) in southern Kanto will slowly relax a stress step caused by the Tohoku earthquake. This afterslip in turn reminds us of the potential for stress redistribution to the surrounding regions. We note the importance of varying hazards not only in time but also in space to improve the probabilistic seismic hazard assessment for southern Kanto.
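
    The kind of calculation described, combining the GR and OU laws into a time-dependent aftershock probability, can be sketched in the Reasenberg-Jones style. The b ˜ 1 and p ˜ 0.5 values follow the abstract's rough fits, but the productivity constant `a`, the offset `c`, and the mainshock magnitude are illustrative placeholders.

```python
import math

def expected_count(t1, t2, mag, a=-5.0, b=1.0, p=0.5, c=0.1, m_main=9.0):
    """Expected number of events >= mag in [t1, t2] days after a mainshock,
    from a GR magnitude distribution and OU temporal decay (a, c illustrative)."""
    productivity = 10.0 ** (a + b * (m_main - mag))
    if p == 1.0:
        time_integral = math.log((t2 + c) / (t1 + c))
    else:
        time_integral = ((t2 + c) ** (1 - p) - (t1 + c) ** (1 - p)) / (1 - p)
    return productivity * time_integral

def probability(t1, t2, mag, **kw):
    """Poisson probability of at least one event >= mag in the window."""
    return 1.0 - math.exp(-expected_count(t1, t2, mag, **kw))

# The slow decay (p ~ 0.5) keeps rates elevated: the second year after the
# mainshock still contributes a substantial fraction of the first year's count.
year1 = expected_count(0.0, 365.0, 6.0)
year2 = expected_count(365.0, 730.0, 6.0)
assert year2 > 0.3 * year1
```

    Parameter-estimation errors would, as the abstract notes, have to be propagated through such a calculation (e.g., by sampling a, b, p, c from their uncertainty distributions) before concluding that any probability change is significant.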

  7. A spatiotemporal clustering model for the Third Uniform California Earthquake Rupture Forecast (UCERF3‐ETAS): Toward an operational earthquake forecast

    USGS Publications Warehouse

    Field, Edward; Milner, Kevin R.; Hardebeck, Jeanne L.; Page, Morgan T.; van der Elst, Nicholas; Jordan, Thomas H.; Michael, Andrew J.; Shaw, Bruce E.; Werner, Maximilian J.

    2017-01-01

    We, the ongoing Working Group on California Earthquake Probabilities, present a spatiotemporal clustering model for the Third Uniform California Earthquake Rupture Forecast (UCERF3), with the goal being to represent aftershocks, induced seismicity, and otherwise triggered events as a potential basis for operational earthquake forecasting (OEF). Specifically, we add an epidemic‐type aftershock sequence (ETAS) component to the previously published time‐independent and long‐term time‐dependent forecasts. This combined model, referred to as UCERF3‐ETAS, collectively represents a relaxation of segmentation assumptions, the inclusion of multifault ruptures, an elastic‐rebound model for fault‐based ruptures, and a state‐of‐the‐art spatiotemporal clustering component. It also represents an attempt to merge fault‐based forecasts with statistical seismology models, such that information on fault proximity, activity rate, and time since last event are considered in OEF. We describe several unanticipated challenges that were encountered, including a need for elastic rebound and characteristic magnitude–frequency distributions (MFDs) on faults, both of which are required to get realistic triggering behavior. UCERF3‐ETAS produces synthetic catalogs of M≥2.5 events, conditioned on any prior M≥2.5 events that are input to the model. We evaluate results with respect to both long‐term (1000 year) simulations as well as for 10‐year time periods following a variety of hypothetical scenario mainshocks. Although the results are very plausible, they are not always consistent with the simple notion that triggering probabilities should be greater if a mainshock is located near a fault. Important factors include whether the MFD near faults includes a significant characteristic earthquake component, as well as whether large triggered events can nucleate from within the rupture zone of the mainshock. Because UCERF3‐ETAS has many sources of uncertainty, as

  8. Distribution and Characteristics of Repeating Earthquakes in Northern California

    NASA Astrophysics Data System (ADS)

    Waldhauser, F.; Schaff, D. P.; Zechar, J. D.; Shaw, B. E.

    2012-12-01

    Repeating earthquakes are playing an increasingly important role in the study of fault processes and behavior, and have the potential to improve hazard assessment, earthquake forecast, and seismic monitoring capabilities. These events rupture the same fault patch repeatedly, generating virtually identical seismograms. In California, repeating earthquakes have been found predominately along the creeping section of the central San Andreas Fault, where they are believed to represent failing asperities on an otherwise creeping fault. Here, we use the northern California double-difference catalog of 450,000 precisely located events (1984-2009) and associated database of 2 billion waveform cross-correlation measurements to systematically search for repeating earthquakes across various tectonic regions. An initial search for pairs of earthquakes with high-correlation coefficients and similar magnitudes resulted in 4,610 clusters including a total of over 26,000 earthquakes. A subsequent double-difference re-analysis of these clusters resulted in 1,879 sequences (8,640 events) where a common rupture area can be resolved to the precision of a few tens of meters or less. These repeating earthquake sequences (RES) include between 3 and 24 events with magnitudes up to ML=4. We compute precise relative magnitudes between events in each sequence from differential amplitude measurements. Differences between these and standard coda-duration magnitudes have a standard deviation of 0.09. The RES occur throughout northern California, but RES with 10 or more events (6%) only occur along the central San Andreas and Calaveras faults. We are establishing baseline characteristics for each sequence, such as recurrence intervals and their coefficient of variation (CV), in order to compare them across tectonic regions. CVs for these clusters range from 0.002 to 2.6, indicating a range of behavior between periodic occurrence (CV~0), random occurrence, and temporal clustering. 
10% of the RES
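
    The coefficient of variation (CV) used above to characterize each sequence is simply the standard deviation of a sequence's recurrence intervals divided by their mean; a minimal sketch with made-up event times:

```python
import statistics

def coefficient_of_variation(event_times):
    """CV of recurrence intervals: ~0 for periodic sequences, ~1 for random
    (Poissonian) occurrence, larger for temporally clustered sequences."""
    intervals = [t2 - t1 for t1, t2 in zip(event_times, event_times[1:])]
    return statistics.pstdev(intervals) / statistics.fmean(intervals)

periodic = [0, 10, 20, 30, 40]           # near-periodic repeater (illustrative)
clustered = [0, 1, 2, 30, 31, 32, 60]    # temporally clustered bursts (illustrative)
assert coefficient_of_variation(periodic) < 0.01
assert coefficient_of_variation(clustered) > 1.0
```
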

  9. Forecasting California's earthquakes: What can we expect in the next 30 years?

    USGS Publications Warehouse

    Field, Edward H.; Milner, Kevin R.; ,

    2008-01-01

    In a new comprehensive study, scientists have determined that the chance of having one or more magnitude 6.7 or larger earthquakes in the California area over the next 30 years is greater than 99%. Such quakes can be deadly, as shown by the 1989 magnitude 6.9 Loma Prieta and the 1994 magnitude 6.7 Northridge earthquakes. The likelihood of at least one even more powerful quake of magnitude 7.5 or greater in the next 30 years is 46%; such a quake is most likely to occur in the southern half of the State. Building codes, earthquake insurance, and emergency planning will be affected by these new results, which highlight the urgency to prepare now for the powerful quakes that are inevitable in California's future.

  10. The California Post-Earthquake Information Clearinghouse: A Plan to Learn From the Next Large California Earthquake

    NASA Astrophysics Data System (ADS)

    Loyd, R.; Walter, S.; Fenton, J.; Tubbesing, S.; Greene, M.

    2008-12-01

    In the rush to remove debris after a damaging earthquake, perishable data related to a wide range of impacts on the physical, built and social environments can be lost. The California Post-Earthquake Information Clearinghouse is intended to prevent this data loss by supporting the earth scientists, engineers, and social and policy researchers who will conduct fieldwork in the affected areas in the hours and days following the earthquake to study these effects. First called for by Governor Ronald Reagan following the destructive M6.5 San Fernando earthquake in 1971, the concept of the Clearinghouse has since been incorporated into the response plans of the National Earthquake Hazard Reduction Program (USGS Circular 1242). This presentation is intended to acquaint scientists with the purpose, functions, and services of the Clearinghouse. Typically, the Clearinghouse is set up in the vicinity of the earthquake within 24 hours of the mainshock and is maintained for several days to several weeks. It provides a location where field researchers can assemble to share and discuss their observations, plan and coordinate subsequent field work, and communicate significant findings directly to the emergency responders and to the public through press conferences. As the immediate response effort winds down, the Clearinghouse will ensure that collected data are archived and made available through "lessons learned" reports and publications that follow significant earthquakes. Participants in the quarterly meetings of the Clearinghouse include representatives from state and federal agencies, universities, NGOs and other private groups. Overall management of the Clearinghouse is delegated to the agencies represented by the authors above.

  11. Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.; the International Commission on Earthquake Forecasting for Civil Protection

    2011-12-01

    Operational earthquake forecasting (OEF) is the dissemination of authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes. Most previous work on the public utility of OEF has anticipated that forecasts would deliver high probabilities of large earthquakes; i.e., deterministic predictions with low error rates (false alarms and failures-to-predict) would be possible. This expectation has not been realized. An alternative to deterministic prediction is probabilistic forecasting based on empirical statistical models of aftershock triggering and seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains in excess of 100 relative to long-term forecasts. The utility of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing OEF in this sort of "low-probability environment." The need to move more quickly has been underscored by recent seismic crises, such as the 2009 L'Aquila earthquake sequence, in which an anxious public was confused by informal and inaccurate earthquake predictions. After the L'Aquila earthquake, the Italian Department of Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF), which I chaired, to recommend guidelines for OEF utilization. Our report (Ann. Geophys., 54, 4, 2011; doi: 10.4401/ag-5350) concludes: (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and need to convey epistemic uncertainties. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. 
(c) All operational models should be evaluated
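
    The "low-probability environment" can be made concrete with a one-line calculation (all numbers illustrative): even a hundredfold probability gain applied to a small weekly background probability yields only a few percent.

```python
import math

weekly_background = 2e-4   # illustrative: ~1% per year spread over ~52 weeks
gain = 100.0               # hundredfold short-term probability gain during a swarm
short_term = weekly_background * gain

# Even with the gain, the absolute weekly probability remains ~2%:
# the most likely outcome is still that no large earthquake occurs.
assert math.isclose(short_term, 0.02)
assert short_term < 0.05
```
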

  12. What to Expect from the Virtual Seismologist: Delay Times and Uncertainties of Initial Earthquake Alerts in California

    NASA Astrophysics Data System (ADS)

    Behr, Y.; Cua, G. B.; Clinton, J. F.; Racine, R.; Meier, M.; Cauzzi, C.

    2013-12-01

    The Virtual Seismologist (VS) method is a Bayesian approach to regional network-based earthquake early warning (EEW) originally formulated by Cua and Heaton (2007). Implementation of VS into real-time EEW codes has been an on-going effort of the Swiss Seismological Service at ETH Zürich since 2006, with support from ETH Zürich, various European projects, and the United States Geological Survey (USGS). VS is one of three EEW algorithms that form the basis of the California Integrated Seismic Network (CISN) ShakeAlert system, a USGS-funded prototype end-to-end EEW system that could potentially be implemented in California. In Europe, VS is currently operating as a real-time test system in Switzerland, western Greece and Istanbul. As part of the on-going EU project REAKT (Strategies and Tools for Real-Time Earthquake Risk Reduction), VS installations in southern Italy, Romania, and Iceland are planned or underway. The possible use cases for an EEW system will be determined by the speed and reliability of earthquake source parameter estimates. A thorough understanding of both is therefore essential to evaluate the usefulness of VS. For California, we present state-wide theoretical alert times for hypothetical earthquakes by analyzing time delays introduced by the different components in the VS EEW system. Taking advantage of the fully probabilistic formulation of the VS algorithm we further present an improved way to describe the uncertainties of every magnitude estimate by evaluating the width and shape of the probability density function that describes the relationship between waveform envelope amplitudes and magnitude. We evaluate these new uncertainty values for past seismicity in California through off-line playbacks and compare them to the previously defined static definitions of uncertainty based on real-time detections. Our results indicate where VS alerts are most useful in California and also suggest where most effective improvements to the VS EEW system

  13. Earthquake Education and Public Information Centers: A Collaboration Between the Earthquake Country Alliance and Free-Choice Learning Institutions in California

    NASA Astrophysics Data System (ADS)

    Degroot, R. M.; Springer, K.; Brooks, C. J.; Schuman, L.; Dalton, D.; Benthien, M. L.

    2009-12-01

    In 1999 the Southern California Earthquake Center initiated an effort to expand its reach to multiple target audiences through the development of an interpretive trail on the San Andreas fault at Wallace Creek and an earthquake exhibit at Fingerprints Youth Museum in Hemet. These projects and involvement with the San Bernardino County Museum in Redlands beginning in 2007 led to the creation of Earthquake Education and Public Information Centers (EPIcenters) in 2008. The impetus for the development of the network was to broaden participation in The Great Southern California ShakeOut. In 2009 it has grown to be more comprehensive in its scope including its evolution into a statewide network. EPIcenters constitute a variety of free-choice learning institutions, representing museums, science centers, libraries, universities, parks, and other places visited by a variety of audiences including families, seniors, and school groups. They share a commitment to demonstrating and encouraging earthquake preparedness. EPIcenters coordinate Earthquake Country Alliance activities in their county or region, lead presentations or organize events in their communities, or in other ways demonstrate leadership in earthquake education and risk reduction. The San Bernardino County Museum (Southern California) and The Tech Museum of Innovation (Northern California) serve as EPIcenter regional coordinating institutions. They interact with over thirty institutional partners who have implemented a variety of activities from displays and talks to earthquake exhibitions. While many activities are focused on the time leading up to and just after the ShakeOut, most EPIcenter members conduct activities year round. Network members at Kidspace Museum in Pasadena and San Diego Natural History Museum have formed EPIcenter focus groups on early childhood education and safety and security. 
This presentation highlights the development of the EPIcenter network, synergistic activities resulting from this

  14. Recalculated probability of M ≥ 7 earthquakes beneath the Sea of Marmara, Turkey

    USGS Publications Warehouse

    Parsons, T.

    2004-01-01

    New earthquake probability calculations are made for the Sea of Marmara region and the city of Istanbul, providing a revised forecast and an evaluation of time-dependent interaction techniques. Calculations incorporate newly obtained bathymetric images of the North Anatolian fault beneath the Sea of Marmara [Le Pichon et al., 2001; Armijo et al., 2002]. Newly interpreted fault segmentation enables an improved regional A.D. 1500-2000 earthquake catalog and interevent model, which form the basis for time-dependent probability estimates. Calculations presented here also employ detailed models of coseismic and postseismic slip associated with the 17 August 1999 M = 7.4 Izmit earthquake to investigate effects of stress transfer on seismic hazard. Probability changes caused by the 1999 shock depend on Marmara Sea fault-stressing rates, which are calculated with a new finite element model. The combined 2004-2034 regional Poisson probability of M ≥ 7 earthquakes is ~38%, the regional time-dependent probability is 44 ± 18%, and incorporation of stress transfer raises it to 53 ± 18%. The most important effect of adding time dependence and stress transfer to the calculations is an increase in the 30 year probability of a M ≥ 7 earthquake affecting Istanbul. The 30 year Poisson probability at Istanbul is 21%, and the addition of time dependence and stress transfer raises it to 41 ± 14%. The ranges given on probability values are sensitivities of the calculations to input parameters determined by Monte Carlo analysis; 1000 calculations are made using parameters drawn at random from distributions. Sensitivities are large relative to mean probability values and enhancements caused by stress transfer, reflecting a poor understanding of large-earthquake aperiodicity.
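
    A time-dependent (renewal-model) probability with Monte Carlo sensitivity analysis of the general kind described can be sketched as follows. The lognormal interevent distribution and every number here are illustrative assumptions, not the paper's model or values.

```python
import math
import random

def conditional_probability(t_elapsed, dt, mean, cov):
    """P(event in [t_elapsed, t_elapsed + dt] | no event yet) for a lognormal
    renewal model with the given mean interevent time and coefficient of variation."""
    sigma = math.sqrt(math.log(1.0 + cov * cov))
    mu = math.log(mean) - 0.5 * sigma * sigma
    def cdf(t):  # lognormal CDF via the error function
        return 0.5 * (1.0 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2.0))))
    f0, f1 = cdf(t_elapsed), cdf(t_elapsed + dt)
    return (f1 - f0) / (1.0 - f0)

# Monte Carlo sensitivity: draw the poorly known mean recurrence time from a
# broad range and examine the spread of 30-yr conditional probabilities.
random.seed(0)
samples = [conditional_probability(250.0, 30.0, random.uniform(200.0, 600.0), 0.5)
           for _ in range(1000)]
mean_p = sum(samples) / len(samples)
assert 0.0 < min(samples) < mean_p < max(samples) < 1.0
```

    As in the abstract, the output of such an analysis is a mean probability plus a sensitivity range reflecting how poorly the input parameters (here, the mean recurrence time) are known.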

  15. Simple Physical Model for the Probability of a Subduction- Zone Earthquake Following Slow Slip Events and Earthquakes: Application to the Hikurangi Megathrust, New Zealand

    NASA Astrophysics Data System (ADS)

    Kaneko, Yoshihiro; Wallace, Laura M.; Hamling, Ian J.; Gerstenberger, Matthew C.

    2018-05-01

    Slow slip events (SSEs) have been documented in subduction zones worldwide, yet their implications for future earthquake occurrence are not well understood. Here we develop a relatively simple, simulation-based method for estimating the probability of megathrust earthquakes following tectonic events that induce any transient stress perturbations. This method has been applied to the locked Hikurangi megathrust (New Zealand) surrounded on all sides by the 2016 Kaikoura earthquake and SSEs. Our models indicate the annual probability of a M≥7.8 earthquake over 1 year after the Kaikoura earthquake increases by 1.3-18 times relative to the pre-Kaikoura probability, and the absolute probability is in the range of 0.6-7%. We find that probabilities of a large earthquake are mainly controlled by the ratio of the total stressing rate induced by all nearby tectonic sources to the mean stress drop of earthquakes. Our method can be applied to evaluate the potential for triggering a megathrust earthquake following SSEs in other subduction zones.

  16. Earthquake precursors: activation or quiescence?

    NASA Astrophysics Data System (ADS)

    Rundle, John B.; Holliday, James R.; Yoder, Mark; Sachs, Michael K.; Donnellan, Andrea; Turcotte, Donald L.; Tiampo, Kristy F.; Klein, William; Kellogg, Louise H.

    2011-10-01

    We discuss the long-standing question of whether the probability for large earthquake occurrence (magnitudes m > 6.0) is highest during time periods of smaller event activation, or highest during time periods of smaller event quiescence. The physics of the activation model are based on an idea from the theory of nucleation, that a small magnitude earthquake has a finite probability of growing into a large earthquake. The physics of the quiescence model is based on the idea that the occurrence of smaller earthquakes (here considered as magnitudes m > 3.5) may be due to a mechanism such as critical slowing down, in which fluctuations in systems with long-range interactions tend to be suppressed prior to large nucleation events. To illuminate this question, we construct two end-member forecast models illustrating, respectively, activation and quiescence. The activation model assumes only that activation can occur, either via aftershock nucleation or triggering, but expresses no choice as to which mechanism is preferred. Both of these models are in fact a means of filtering the seismicity time-series to compute probabilities. Using 25 yr of data from the California-Nevada catalogue of earthquakes, we show that of the two models, activation and quiescence, the latter appears to be the better model, as judged by backtesting (by a slight but not significant margin). We then examine simulation data from a topologically realistic earthquake model for California seismicity, Virtual California. This model includes not only earthquakes produced from increases in stress on the fault system, but also background and off-fault seismicity produced by a BASS-ETAS driving mechanism. Applying the activation and quiescence forecast models to the simulated data, we come to the opposite conclusion. Here, the activation forecast model is preferred to the quiescence model, presumably due to the fact that the BASS component of the model is essentially a model for activated seismicity. 
These

  17. Real-time forecasts of tomorrow's earthquakes in California: a new mapping tool

    USGS Publications Warehouse

    Gerstenberger, Matt; Wiemer, Stefan; Jones, Lucy

    2004-01-01

    We have derived a multi-model approach to calculate time-dependent earthquake hazard resulting from earthquake clustering. This report explains the theoretical background behind the approach, the specific details used in applying the method to California, and the statistical testing used to validate the technique. We have implemented our algorithm as a real-time tool that has been automatically generating short-term hazard maps for California since May of 2002, at http://step.wr.usgs.gov

  18. Chapter F. The Loma Prieta, California, Earthquake of October 17, 1989 - Tectonic Processes and Models

    USGS Publications Warehouse

    Simpson, Robert W.

    1994-01-01

    If there is a single theme that unifies the diverse papers in this chapter, it is the attempt to understand the role of the Loma Prieta earthquake in the context of the earthquake 'machine' in northern California: as the latest event in a long history of shocks in the San Francisco Bay region, as an incremental contributor to the regional deformation pattern, and as a possible harbinger of future large earthquakes. One of the surprises generated by the earthquake was the rather large amount of uplift that occurred as a result of the reverse component of slip on the southwest-dipping fault plane. Preearthquake conventional wisdom had been that large earthquakes in the region would probably be caused by horizontal, right-lateral, strike-slip motion on vertical fault planes. In retrospect, the high topography of the Santa Cruz Mountains and the elevated marine terraces along the coast should have provided some clues. With the observed ocean retreat and the obvious uplift of the coast near Santa Cruz that accompanied the earthquake, Mother Nature was finally caught in the act. Several investigators quickly saw the connection between the earthquake uplift and the long-term evolution of the Santa Cruz Mountains and realized that important insights were to be gained by attempting to quantify the process of crustal deformation in terms of Loma Prieta-type increments of northward transport and fault-normal shortening.

  19. Very-long-period volcanic earthquakes beneath Mammoth Mountain, California

    USGS Publications Warehouse

    Hill, D.P.; Dawson, P.; Johnston, M.J.S.; Pitt, A.M.; Biasi, G.; Smith, K.

    2002-01-01

    Detection of three very-long-period (VLP) volcanic earthquakes beneath Mammoth Mountain emphasizes that magmatic processes continue to be active beneath this young, eastern California volcano. These VLP earthquakes, which occurred in October 1996 and July and August 2000, appear as bell-shaped pulses with durations of one to two minutes on a nearby borehole dilatometer and on the displacement seismogram from a nearby broadband seismometer. They are accompanied by rapid-fire sequences of high-frequency (HF) earthquakes and several long-period (LP) volcanic earthquakes. The limited VLP data are consistent with a CLVD source at a depth of ~3 km beneath the summit, which we interpret as resulting from a slug of fluid (CO2-saturated magmatic brine or perhaps basaltic magma) moving into a crack.

  20. The Loma Prieta, California, Earthquake of October 17, 1989: Earthquake Occurrence

    USGS Publications Warehouse

    Coordinated by Bakun, William H.; Prescott, William H.

    1993-01-01

    Professional Paper 1550 seeks to understand the M6.9 Loma Prieta earthquake itself. It examines how the fault that generated the earthquake ruptured, searches for and evaluates precursors that may have indicated an earthquake was coming, reviews forecasts of the earthquake, and describes the geology of the earthquake area and the crustal forces that affect this geology. Some significant findings were: * Slip during the earthquake occurred on 35 km of fault at depths ranging from 7 to 20 km. Maximum slip was approximately 2.3 m. The earthquake may not have released all of the strain stored in rocks next to the fault, indicating that the potential for another damaging earthquake in the Santa Cruz Mountains in the near future may still exist. * The earthquake involved a large amount of uplift on a dipping fault plane. Pre-earthquake conventional wisdom was that large earthquakes in the Bay area occurred as horizontal displacements on predominantly vertical faults. * The fault segment that ruptured approximately coincided with a fault segment identified in 1988 as having a 30% probability of generating a M7 earthquake in the next 30 years. This was one of more than 20 relevant earthquake forecasts made in the 83 years before the earthquake. * Calculations show that the Loma Prieta earthquake changed stresses on nearby faults in the Bay area. In particular, the earthquake reduced stresses on the Hayward Fault, which decreased the frequency of small earthquakes on it. * Geological and geophysical mapping indicate that, although the San Andreas Fault can be mapped as a through-going fault in the epicentral region, the southwest-dipping Loma Prieta rupture surface is a separate fault strand and one of several along this part of the San Andreas that may be capable of generating earthquakes.

  1. Earthquake and Tsunami planning, outreach and awareness in Humboldt County, California

    NASA Astrophysics Data System (ADS)

    Ozaki, V.; Nicolini, T.; Larkin, D.; Dengler, L.

    2008-12-01

    Humboldt County has the longest coastline in California and is one of the most seismically active areas of the state. It is at risk from earthquakes located on and offshore and from tsunamis generated locally from faults associated with the Cascadia subduction zone (CSZ), other regional fault systems, and from distant sources elsewhere in the Pacific. In 1995 the California Division of Mines and Geology published the first earthquake scenario to include both strong ground shaking effects and a tsunami. As a result of the scenario, the Redwood Coast Tsunami Work Group (RCTWG), an organization of representatives from government agencies, tribes, service groups, academia and the private sector from the three northern coastal California counties, was formed in 1996 to coordinate and promote earthquake and tsunami hazard awareness and mitigation. The RCTWG and its member agencies have sponsored a variety of projects including education/outreach products and programs, tsunami hazard mapping, and signage and siren planning, and have sponsored an Earthquake-Tsunami Education Room at the Humboldt County fair for the past eleven years. Three editions of Living on Shaky Ground, an earthquake-tsunami preparedness magazine for California's North Coast, have been published since 1993, and a fourth is due to be published in fall 2008. In 2007, Humboldt County was the first region in the country to participate in a tsunami training exercise at FEMA's Emergency Management Institute in Emmitsburg, MD and the first area in California to conduct a full-scale tsunami evacuation drill. The County has conducted numerous multi-agency, multi-discipline coordinated exercises using a county-wide tsunami response plan. Two Humboldt County communities were recognized as TsunamiReady by the National Weather Service in 2007. Over 300 tsunami hazard zone signs have been posted in Humboldt County since March 2008. Six assessment surveys from 1993 to 2006 have tracked preparedness actions and personal

  2. The Northern California Earthquake Management System: A Unified System From Realtime Monitoring to Data Distribution

    NASA Astrophysics Data System (ADS)

    Neuhauser, D.; Dietz, L.; Lombard, P.; Klein, F.; Zuzlewski, S.; Kohler, W.; Hellweg, M.; Luetgert, J.; Oppenheimer, D.; Romanowicz, B.

    2006-12-01

    The longstanding cooperation between the USGS Menlo Park and UC Berkeley's Seismological Laboratory for monitoring earthquakes and providing data to the research community is achieving a new level of integration. While station support and data collection for each network (NC, BK, BP) remain the responsibilities of the host institution, picks, codas and amplitudes will be produced and shared between the data centers continuously. Thus, realtime earthquake processing from triggering and locating through magnitude and moment tensor calculation and Shakemap production will take place independently at both locations, improving the robustness of event reporting in the Northern California Earthquake Management Center. Parametric data will also be exchanged with the Southern California Earthquake Management System to allow statewide earthquake detection and processing for further redundancy within the California Integrated Seismic Network (CISN). The database plays an integral part in this system, providing the coordination for event processing as well as the repository for event, instrument (metadata) and waveform information. The same master database serves both realtime processing, data quality control and archival, and the data center which provides waveforms and earthquake data to users in the research community. Continuous waveforms from all BK, BP, and NC stations, event waveform gathers, and event information automatically become available at the Northern California Earthquake Data Center (NCEDC). Currently, the NCEDC collects and makes available over 4 TBytes of data per year from the NCEMC stations and other seismic networks, as well as from GPS and other geophysical instrumentation.

  3. School Safety Down to Earth: California's Earthquake-Resistant Schools.

    ERIC Educational Resources Information Center

    Progressive Architecture, 1979

    1979-01-01

    Schools in California being built to resist damage by earthquakes are part of a program to meet building standards established in 1933. The three new schools presented reflect the strengths and weaknesses of the program. (Author/MLF)

  4. Earthquake prediction research at the Seismological Laboratory, California Institute of Technology

    USGS Publications Warehouse

    Spall, H.

    1979-01-01

    Nevertheless, basic earthquake-related information has always been of consuming interest to the public and the media in this part of California (fig. 2). So it is not surprising that earthquake prediction continues to be a significant research program at the laboratory. Several of the current spectrum of projects related to prediction are discussed below.

  5. Post-Earthquake Traffic Capacity of Modern Bridges in California

    DOT National Transportation Integrated Search

    2010-03-01

    Evaluation of the capacity of a bridge to carry self-weight and traffic loads after an earthquake is essential for a safe and timely re-opening of the bridge. In California, modern highway bridges designed using the Caltrans Seismic Design Criter...

  6. Preparing a population for an earthquake like Chi-Chi: The Great Southern California ShakeOut

    USGS Publications Warehouse

    Jones, Lucile M.; ,

    2009-01-01

    The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. On November 13, 2008, over 5 million southern Californians pretended that a magnitude-7.8 earthquake had occurred and practiced actions that could reduce its impact on their lives. The primary message of the ShakeOut is that what we do now, before a big earthquake, will determine what our lives will be like after. The drill was based on a scenario of the impacts and consequences of such an earthquake on the Southern San Andreas Fault, developed by over 300 experts led by the U.S. Geological Survey in partnership with the California Geological Survey, the Southern California Earthquake Center, Earthquake Engineering Research Institute, lifeline operators, emergency services and many other organizations. The ShakeOut campaign was designed and implemented by earthquake scientists, emergency managers, sociologists, art designers and community participants. The means of communication were developed using results from sociological research on what encouraged people to take action. This was structured around four objectives: 1) consistent messages – people are more inclined to believe something when they hear the same thing from multiple sources; 2) visual reinforcement – people are more inclined to do something they see other people doing; 3) encourage “milling” or discussing contemplated action – people need to discuss an action with others they care about before committing to undertaking it; and 4) focus on concrete actions – people are more likely to prepare for a set of concrete consequences of a particular hazard than for an abstract concept of risk. The goals of the ShakeOut were established in Spring 2008 and were: 1) to register 5 million people to participate in the drill; 2) to change the culture of earthquake preparedness in southern California; and 3) to reduce earthquake losses in southern California. All of these

  7. Post-earthquake traffic capacity of modern bridges in California.

    DOT National Transportation Integrated Search

    2010-03-01

    Evaluation of the capacity of a bridge to carry self-weight and traffic loads after an earthquake is essential for a safe and timely re-opening of the bridge. In California, modern highway bridges designed using the Caltrans Seismic Design Criteria a...

  8. Database of potential sources for earthquakes larger than magnitude 6 in Northern California

    USGS Publications Warehouse

    ,

    1996-01-01

    The Northern California Earthquake Potential (NCEP) working group, composed of many contributors and reviewers in industry, academia and government, has pooled its collective expertise and knowledge of regional tectonics to identify potential sources of large earthquakes in northern California. We have created a map and database of active faults, both surficial and buried, that forms the basis for the northern California portion of the national map of probabilistic seismic hazard. The database contains 62 potential sources, including fault segments and areally distributed zones. The working group has integrated constraints from broadly based plate tectonic and VLBI models with local geologic slip rates, geodetic strain rate, and microseismicity. Our earthquake source database derives from a scientific consensus that accounts for conflict in the diverse data. Our preliminary product, as described in this report, brings to light many gaps in the data, including a need for better information on the proportion of deformation in fault systems that is aseismic.

  9. Images of crust beneath southern California will aid study of earthquakes and their effects

    USGS Publications Warehouse

    Fuis, G.S.; Okaya, D.A.; Clayton, R.W.; Lutter, W.J.; Ryberg, T.; Brocher, T.M.; Henyey, T.M.; Benthien, M.L.; Davis, P.M.; Mori, J.; Catchings, R.D.; ten Brink, Uri S.; Kohler, M.D.; Klitgord, Kim D.; Bohannon, R.G.

    1996-01-01

    The Whittier Narrows earthquake of 1987 and the Northridge earthquake of 1994 highlighted the earthquake hazards associated with buried faults in the Los Angeles region. A more thorough knowledge of the subsurface structure of southern California is needed to reveal these and other buried faults and to aid us in understanding how the earthquake-producing machinery works in this region.

  10. Probability estimates of seismic event occurrence compared to health hazards - Forecasting Taipei's Earthquakes

    NASA Astrophysics Data System (ADS)

    Fung, D. C. N.; Wang, J. P.; Chang, S. H.; Chang, S. C.

    2014-12-01

    Using a revised statistical model built on past seismic probability models, the probability of different magnitude earthquakes occurring within variable timespans can be estimated. The revised model is based on the Poisson distribution and includes the use of best-estimate values of the probability distribution of different magnitude earthquakes recurring on a fault from literature sources. Our study aims to apply this model to the Taipei metropolitan area with a population of 7 million, which lies in the Taipei Basin and is bounded by two normal faults: the Sanchaio and Taipei faults. The Sanchaio fault is suggested to be responsible for previous large magnitude earthquakes, such as the 1694 magnitude 7 earthquake in northwestern Taipei (Cheng et al., 2010). Based on a magnitude 7 earthquake return period of 543 years, the model predicts the occurrence of a magnitude 7 earthquake within 20 years at 1.81%, within 79 years at 6.77% and within 300 years at 21.22%. These estimates increase significantly when considering a magnitude 6 earthquake; the chance of one occurring within the next 20 years is estimated to be 3.61%, within 79 years 13.54%, and within 300 years 42.45%. The 79-year period represents the average lifespan of the Taiwan population. In contrast, based on data from 2013, the probability of Taiwan residents experiencing heart disease or malignant neoplasm is 11.5% and 29%, respectively. The inference of this study is that the calculated risk to the Taipei population from a potentially damaging magnitude 6 or greater earthquake occurring within their lifetime is just as great as that of suffering from a heart attack or other health ailments.
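
    Figures of this kind follow from the standard Poisson relation P = 1 - exp(-t/T), where T is the mean return period and t the exposure window. A minimal sketch (the 543-year return period is from the abstract; the function name is ours, and the abstract's magnitude-7 figures come from the authors' revised weighting, which is not reproduced here):

```python
import math

def poisson_prob(return_period_yr: float, window_yr: float) -> float:
    """Probability of at least one event in a time window, assuming a
    Poisson process with the given mean return period."""
    return 1.0 - math.exp(-window_yr / return_period_yr)

# With the 543-year return period quoted in the abstract:
for window in (20, 79, 300):
    print(f"{window:3d} yr: {100 * poisson_prob(543, window):.2f}%")
# prints 3.62%, 13.54%, 42.45% -- matching the magnitude-6 figures
# in the abstract to within rounding
```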

  11. Map showing surface ruptures associated with the Mammoth Lakes, California, earthquakes of May 1980

    USGS Publications Warehouse

    Clark, M.M.; Yount, J.C.; Vaughn, P.R.; Zepeda, R.L.

    1982-01-01

    This map shows surface ruptures associated with the M 6 Mammoth Lakes earthquakes of May 25-27, 1980 (Sherburne, 1980). The ruptures were mapped during USGS field investigations May 28 to June 4 and July 14-19, 1980. The map also includes some of the ruptures recorded by California Division of Mines and Geology investigators May 26-31, June 26-27, and July 7-11, 1980 (Taylor and Bryant, 1980). Because most of the surface ruptures developed in either unconsolidated pumice, alluvium, or till (and many were on slopes of scarps created by earlier faulting), wind, rain, and animals quickly erased many of the ruptures. In places, the minimum detectable slip was 3-10 mm. Thus the lines on the map do not record all of the ruptures that formed at the time of the earthquake. Many of the areas where we show gaps between lines on the map probably had cracks originally.

  12. Southern California Earthquake Center Geologic Vertical Motion Database

    NASA Astrophysics Data System (ADS)

    Niemi, Nathan A.; Oskin, Michael; Rockwell, Thomas K.

    2008-07-01

    The Southern California Earthquake Center Geologic Vertical Motion Database (VMDB) integrates disparate sources of geologic uplift and subsidence data at 10^4- to 10^6-year time scales into a single resource for investigations of crustal deformation in southern California. Over 1800 vertical deformation rate data points in southern California and northern Baja California populate the database. Four mature data sets are now represented: marine terraces, incised river terraces, thermochronologic ages, and stratigraphic surfaces. An innovative architecture and interface of the VMDB exposes distinct data sets and reference frames, permitting user exploration of this complex data set and allowing user control over the assumptions applied to convert geologic and geochronologic information into absolute uplift rates. Online exploration and download tools are available through all common web browsers, allowing the distribution of vertical motion results as HTML tables, tab-delimited GIS-compatible text files, or via a map interface through the Google Maps™ web service. The VMDB represents a mature product for research of fault activity and elastic deformation of southern California.

  13. Fluid‐driven seismicity response of the Rinconada fault near Paso Robles, California, to the 2003 M 6.5 San Simeon earthquake

    USGS Publications Warehouse

    Hardebeck, Jeanne L.

    2012-01-01

    The 2003 M 6.5 San Simeon, California, earthquake caused significant damage in the city of Paso Robles and a persistent cluster of aftershocks close to Paso Robles near the Rinconada fault. Given the importance of secondary aftershock triggering in sequences of large events, a concern is whether this cluster of events could trigger another damaging earthquake near Paso Robles. An epidemic‐type aftershock sequence (ETAS) model is fit to the Rinconada seismicity, and multiple realizations indicate a 0.36% probability of at least one M≥6.0 earthquake during the next 30 years. However, this probability estimate is only as good as the projection into the future of the ETAS model. There is evidence that the seismicity may be influenced by fluid pressure changes, which cannot be forecasted using ETAS. The strongest evidence for fluids is the delay between the San Simeon mainshock and a high rate of seismicity in mid to late 2004. This delay can be explained as having been caused by a pore pressure decrease due to an undrained response to the coseismic dilatation, followed by increased pore pressure during the return to equilibrium. Seismicity migration along the fault also suggests fluid involvement, although the migration is too slow to be consistent with pore pressure diffusion. All other evidence, including focal mechanisms and b‐value, is consistent with tectonic earthquakes. This suggests a model where the role of fluid pressure changes is limited to the first seven months, while the fluid pressure equilibrates. The ETAS modeling adequately fits the events after July 2004 when the pore pressure stabilizes. The ETAS models imply that while the probability of a damaging earthquake on the Rinconada fault has approximately doubled due to the San Simeon earthquake, the absolute probability remains low.
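
    The ETAS model used above treats seismicity as a constant background rate plus Omori-type aftershock triggering from every prior event, with each event's contribution scaled exponentially by its magnitude. A minimal sketch of the conditional intensity; all parameter values and names below are illustrative placeholders, not the fitted Rinconada values:

```python
import math

def etas_rate(t, events, mu=0.02, k=0.05, alpha=1.0, c=0.01, p=1.2, m_min=2.0):
    """ETAS conditional intensity (events/day) at time t: a background
    rate mu plus an Omori-type contribution from every prior event
    (t_i, m_i), scaled by exp(alpha * (m_i - m_min)). Illustrative
    parameters only, not the values fit to the Rinconada seismicity."""
    rate = mu
    for t_i, m_i in events:
        if t_i < t:
            rate += k * math.exp(alpha * (m_i - m_min)) / (t - t_i + c) ** p
    return rate

# A single M 6.5 mainshock at t = 0 dominates the rate soon afterward,
# then the rate decays back toward the background level:
catalog = [(0.0, 6.5)]
early, late = etas_rate(0.5, catalog), etas_rate(365.0, catalog)
```

Forecasts like the 0.36% figure above come from simulating many synthetic catalogs under such an intensity and counting the fraction that contain an M ≥ 6.0 event.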

  14. Instability model for recurring large and great earthquakes in southern California

    USGS Publications Warehouse

    Stuart, W.D.

    1985-01-01

    The locked section of the San Andreas fault in southern California has experienced a number of large and great earthquakes in the past, and thus is expected to have more in the future. To estimate the location, time, and slip of the next few earthquakes, an earthquake instability model is formulated. The model is similar to one recently developed for moderate earthquakes on the San Andreas fault near Parkfield, California. In both models, unstable faulting (the earthquake analog) is caused by failure of all or part of a patch of brittle, strain-softening fault zone. In the present model the patch extends downward from the ground surface to about 12 km depth, and extends 500 km along strike from Parkfield to the Salton Sea. The variation of patch strength along strike is adjusted by trial until the computed sequence of instabilities matches the sequence of large and great earthquakes since A.D. 1080 reported by Sieh and others. The last earthquake was the M=8.3 Ft. Tejon event in 1857. The resulting strength variation has five contiguous sections of alternately low and high strength. From north to south, the approximate locations of the sections are: (1) Parkfield to Bitterwater Valley, (2) Bitterwater Valley to Lake Hughes, (3) Lake Hughes to San Bernardino, (4) San Bernardino to Palm Springs, and (5) Palm Springs to the Salton Sea. Sections 1, 3, and 5 have strengths between 53 and 88 bars; sections 2 and 4 have strengths between 164 and 193 bars. Patch section ends and unstable rupture ends usually coincide, although one or more adjacent patch sections may fail unstably at once. The model predicts that the next sections of the fault to slip unstably will be 1, 3, and 5; the order and dates depend on the assumed length of an earthquake rupture in about 1700. © 1985 Birkhäuser Verlag.

  15. 1957 Gobi-Altay, Mongolia, earthquake as a prototype for southern California's most devastating earthquake

    USGS Publications Warehouse

    Bayarsayhan, C.; Bayasgalan, A.; Enhtuvshin, B.; Hudnut, K.W.; Kurushin, R.A.; Molnar, P.; Olziybat, M.

    1996-01-01

    The 1957 Gobi-Altay earthquake was associated with both strike-slip and thrust faulting, processes similar to those along the San Andreas fault and the faults bounding the San Gabriel Mountains just north of Los Angeles, California. Clearly, a major rupture either on the San Andreas fault north of Los Angeles or on the thrust faults bounding the Los Angeles basin poses a serious hazard to inhabitants of that area. By analogy with the Gobi-Altay earthquake, we suggest that simultaneous rupturing of both the San Andreas fault and the thrust faults nearer Los Angeles is a real possibility that amplifies the hazard posed by ruptures on either fault system separately.

  16. Aftershocks and triggered events of the Great 1906 California earthquake

    USGS Publications Warehouse

    Meltzner, A.J.; Wald, D.J.

    2003-01-01

    The San Andreas fault is the longest fault in California and one of the longest strike-slip faults in the world, yet little is known about the aftershocks following the most recent great event on the San Andreas, the Mw 7.8 San Francisco earthquake on 18 April 1906. We conducted a study to locate and to estimate magnitudes for the largest aftershocks and triggered events of this earthquake. We examined existing catalogs and historical documents for the period April 1906 to December 1907, compiling data on the first 20 months of the aftershock sequence. We grouped felt reports temporally and assigned modified Mercalli intensities for the larger events based on the descriptions judged to be the most reliable. For onshore and near-shore events, a grid-search algorithm (derived from empirical analysis of modern earthquakes) was used to find the epicentral location and magnitude most consistent with the assigned intensities. For one event identified as far offshore, the event's intensity distribution was compared with those of modern events in order to constrain the event's location and magnitude. The largest aftershock within the study period, an M ~6.7 event, occurred ~100 km west of Eureka on 23 April 1906. Although not within our study period, another M ~6.7 aftershock occurred near Cape Mendocino on 28 October 1909. Other significant aftershocks included an M ~5.6 event near San Juan Bautista on 17 May 1906 and an M ~6.3 event near Shelter Cove on 11 August 1907. An M ~4.9 aftershock occurred on the creeping segment of the San Andreas fault (southeast of the mainshock rupture) on 6 July 1906. The 1906 San Francisco earthquake also triggered events in southern California (including separate events in or near the Imperial Valley, the Pomona Valley, and Santa Monica Bay), in western Nevada, in southern central Oregon, and in western Arizona, all within 2 days of the mainshock. Of these triggered events, the largest were an M ~6.1 earthquake near Brawley

  17. Web-Based Real Time Earthquake Forecasting and Personal Risk Management

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.

    2012-12-01

    Earthquake forecasts have been computed by a variety of countries and economies world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. One example is the Working Group on California Earthquake Probabilities, which has been responsible for the official California earthquake forecast since 1988. However, in a time of increasingly severe global financial constraints, we are now moving inexorably towards personal risk management, wherein mitigating risk is becoming the responsibility of individual members of the public. Under these circumstances, open access to a variety of web-based tools, utilities and information is a necessity. Here we describe a web-based system that has been operational since 2009 at www.openhazards.com and www.quakesim.org. Models for earthquake physics and forecasting require input data, along with model parameters. The models we consider are the Natural Time Weibull (NTW) model for regional earthquake forecasting, together with models for activation and quiescence. These models use small earthquakes ("seismicity-based models") to forecast the occurrence of large earthquakes, either through varying rates of small earthquake activity, or via an accumulation of this activity over time. These approaches use data-mining algorithms combined with the ANSS earthquake catalog. The basic idea is to compute large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Each of these approaches has computational challenges associated with computing forecast information in real time. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we show that real-time forecasting is possible at a grid scale of 0.1°. We have analyzed the performance of these models using Reliability/Attributes and standard Receiver Operating Characteristic (ROC) tests. We show how the Reliability and
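
    The small-earthquake counting idea described above can be sketched as follows. The Weibull functional form and the beta value here are illustrative assumptions on our part, not the authors' calibrated NTW model:

```python
import math

def ntw_probability(n_small, n_expected, beta=1.5):
    """Sketch of a natural-time-style Weibull forecast: the probability
    that the next large event arrives by "natural time" n_small /
    n_expected, where n_small counts small earthquakes since the last
    large one and n_expected is the mean count per large-event cycle
    (e.g. derived from the Gutenberg-Richter relation). The Weibull
    form and beta are illustrative assumptions."""
    chi = n_small / n_expected
    return 1.0 - math.exp(-chi ** beta)

# The forecast probability grows monotonically as small events accumulate:
probs = [ntw_probability(n, 100.0) for n in (25, 50, 100, 200)]
```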

  18. Seasonal Water Storage, the Resulting Deformation and Stress, and Occurrence of Earthquakes in California

    NASA Astrophysics Data System (ADS)

    Johnson, C. W.; Burgmann, R.; Fu, Y.; Dutilleul, P.

    2015-12-01

    In California, the accumulated winter snow pack in the Sierra Nevada and the reservoir and groundwater storage in the Central Valley follow an annual periodic cycle, and each contributes to the resulting surface deformation, which can be observed using GPS time series. The ongoing drought conditions in the western U.S. amplify the observed uplift signal as the Earth's crust responds to the mass changes associated with the water loss. The near-surface hydrological mass loss can result in annual stress changes of ~1 kPa at seismogenic depths. Similarly small static stress perturbations have previously been associated with changes in earthquake activity. Periodicity analysis of earthquake catalog time series suggests that periods of 4, 6, 12, and 14.24 months are statistically significant in regions of California, providing documentation for the modulation of earthquake populations at the periods of natural loading cycles. Knowledge of what governs the timing of earthquakes is essential to understanding the nature of the earthquake cycle. If small static stress changes influence the timing of earthquakes, then one could expect that events will occur more rapidly during periods of greater external load increases. To test this hypothesis we develop a loading model using GPS-derived surface water storage for California and calculate the stress change at seismogenic depths for different faulting geometries. We then evaluate the degree of correlation between the stress models and the seismicity, taking into consideration the variable amplitude of stress cycles, the orientation of transient load stress with respect to the background stress field, and the geometry of active faults revealed by focal mechanisms.

  19. Cruise report for A1-98-SC southern California Earthquake Hazards Project

    USGS Publications Warehouse

    Normark, William R.; Bohannon, Robert G.; Sliter, Ray; Dunhill, Gita; Scholl, David W.; Laursen, Jane; Reid, Jane A.; Holton, David

    1999-01-01

    The focus of the Southern California Earthquake Hazards project, within the Western Region Coastal and Marine Geology team (WRCMG), is to identify the landslide and earthquake hazards and related ground-deformation processes that can potentially impact the social and economic well-being of the inhabitants of the Southern California coastal region, the most populated urban corridor along the U.S. Pacific margin. The primary objective is to help mitigate the earthquake hazards for the Southern California region by improving our understanding of how deformation is distributed (spatially and temporally) in the offshore with respect to the onshore region. To meet this overall objective, we are investigating the distribution, character, and relative intensity of active (i.e., primarily Holocene) deformation within the basins and along the shelf adjacent to the most highly populated areas (see Fig. 1). In addition, the project will examine the Pliocene-Pleistocene record of how this deformation has shifted in space and time. The results of this study should improve our knowledge of shifting deformation for both the long-term (10^5 to several 10^6 yr) and short-term (<50 ky) time frames and enable us to identify actively deforming structures that may constitute current significant seismic hazards.

  20. History of Modern Earthquake Hazard Mapping and Assessment in California Using a Deterministic or Scenario Approach

    NASA Astrophysics Data System (ADS)

    Mualchin, Lalliana

    2011-03-01

    Modern earthquake ground motion hazard mapping in California began following the 1971 San Fernando earthquake in the Los Angeles metropolitan area of southern California. Earthquake hazard assessment followed a traditional approach, later called Deterministic Seismic Hazard Analysis (DSHA) in order to distinguish it from the newer Probabilistic Seismic Hazard Analysis (PSHA). In DSHA, the seismic hazard posed by the Maximum Credible Earthquake (MCE) magnitude on each of the known seismogenic faults within and near the state is assessed. The likely occurrence of the MCE has been assumed qualitatively by using late Quaternary and younger faults that are presumed to be seismogenic, but not when or within what time intervals the MCE may occur. The MCE is the largest or upper-bound potential earthquake in moment magnitude, and it supersedes and automatically considers all other possible earthquakes on that fault. That moment magnitude is used for estimating ground motions by applying it to empirical attenuation relationships, and for calculating ground motions as in neo-DSHA (Zuccolo et al., 2008). The first deterministic California earthquake hazard map was published in 1974 by the California Division of Mines and Geology (CDMG), which has been called the California Geological Survey (CGS) since 2002, using the best available fault information and ground motion attenuation relationships at that time. The California Department of Transportation (Caltrans) later assumed responsibility for printing the refined and updated peak acceleration contour maps, which were heavily utilized by geologists, seismologists, and engineers for many years. Some engineers involved in the siting process of large important projects, for example, dams and nuclear power plants, continued to challenge the map(s). The second edition map was completed in 1985, incorporating more faults, improving the MCE estimation method, and using new ground motion attenuation relationships from the latest published

  1. Long-range dependence in earthquake-moment release and implications for earthquake occurrence probability.

    PubMed

    Barani, Simone; Mascandola, Claudia; Riccomagno, Eva; Spallarossa, Daniele; Albarello, Dario; Ferretti, Gabriele; Scafidi, Davide; Augliera, Paolo; Massa, Marco

    2018-03-28

    Since the beginning of the 1980s, when Mandelbrot observed that earthquakes occur on 'fractal' self-similar sets, many studies have investigated the dynamical mechanisms that lead to self-similarities in the earthquake process. Interpreting seismicity as a self-similar process is undoubtedly convenient to bypass the physical complexities related to the actual process. Self-similar processes are indeed invariant under suitable scaling of space and time. In this study, we show that long-range dependence is an inherent feature of the seismic process, and is universal. Examination of series of cumulative seismic moment both in Italy and worldwide through Hurst's rescaled range analysis shows that seismicity is a memory process with a Hurst exponent H ≈ 0.87. We observe that H is substantially space- and time-invariant, except in cases of catalog incompleteness. This has implications for earthquake forecasting. Hence, we have developed a probability model for earthquake occurrence that allows for long-range dependence in the seismic process. Unlike the Poisson model, dependent events are allowed. This model can be easily transferred to other disciplines that deal with self-similar processes.
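    Hurst's rescaled range (R/S) analysis, mentioned above, can be illustrated in a few lines. The sketch below is a minimal, generic implementation (not the authors' code); the dyadic window scheme and the minimum window of 8 samples are arbitrary choices for the illustration.

```python
import numpy as np

def rescaled_range(x):
    """R/S statistic of a 1-D series: range of the mean-adjusted
    cumulative sum, divided by the standard deviation."""
    y = np.cumsum(x - x.mean())
    return (y.max() - y.min()) / x.std(ddof=0)

def hurst_exponent(series, min_n=8):
    """Estimate the Hurst exponent H as the slope of log(R/S) vs. log(n)
    over dyadic window sizes n."""
    series = np.asarray(series, dtype=float)
    total = len(series)
    sizes, rs_means = [], []
    size = total
    while size >= min_n:
        chunks = [series[i:i + size] for i in range(0, total - size + 1, size)]
        vals = [rescaled_range(c) for c in chunks if c.std(ddof=0) > 0]
        if vals:
            sizes.append(size)
            rs_means.append(np.mean(vals))
        size //= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_means), 1)
    return slope

rng = np.random.default_rng(42)
noise = rng.standard_normal(4096)
# A memoryless (white-noise) series gives H near 0.5 (small-sample bias
# pushes the estimate slightly above 0.5); the paper reports H ≈ 0.87
# for cumulative seismic-moment series.
print(f"H (white noise) ≈ {hurst_exponent(noise):.2f}")
```

    For a series with persistent memory the slope rises toward 1, so an estimate of H ≈ 0.87, as found in the study, indicates strong long-range dependence rather than Poisson-like independence.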

  2. Uncertainties in Earthquake Loss Analysis: A Case Study From Southern California

    NASA Astrophysics Data System (ADS)

    Mahdyiar, M.; Guin, J.

    2005-12-01

    Probabilistic earthquake hazard and loss analyses play important roles in many areas of risk management, including earthquake related public policy and insurance ratemaking. Rigorous loss estimation for portfolios of properties is difficult since there are various types of uncertainties in all aspects of modeling and analysis. It is the objective of this study to investigate the sensitivity of earthquake loss estimation to uncertainties in regional seismicity, earthquake source parameters, ground motions, and sites' spatial correlation on typical property portfolios in Southern California. Southern California is an attractive region for such a study because it has a large population concentration exposed to significant levels of seismic hazard. During the last decade, there have been several comprehensive studies of most regional faults and seismogenic sources. There have also been detailed studies on regional ground motion attenuations and regional and local site responses to ground motions. This information has been used by engineering seismologists to conduct regional seismic hazard and risk analysis on a routine basis. However, one of the more difficult tasks in such studies is the proper incorporation of uncertainties in the analysis. From the hazard side, there are uncertainties in the magnitudes, rates and mechanisms of the seismic sources and local site conditions and ground motion site amplifications. From the vulnerability side, there are considerable uncertainties in estimating the state of damage of buildings under different earthquake ground motions. From an analytical side, there are challenges in capturing the spatial correlation of ground motions and building damage, and integrating thousands of loss distribution curves with different degrees of correlation. In this paper we propose to address some of these issues by conducting loss analyses of a typical small portfolio in southern California, taking into consideration various source and ground

  3. On the reported ionospheric precursor of the Hector Mine, California earthquake

    USGS Publications Warehouse

    Thomas, J.N.; Love, J.J.; Komjathy, A.; Verkhoglyadova, O.P.; Butala, M.; Rivera, N.

    2012-01-01

    Using Global Positioning System (GPS) data from sites near the 16 Oct. 1999 Hector Mine, California earthquake, Pulinets et al. (2007) identified anomalous changes in the ionospheric total electron content (TEC) starting one week prior to the earthquake. Pulinets (2007) suggested that precursory phenomena of this type could be useful for predicting earthquakes. On the other hand, and in a separate analysis, Afraimovich et al. (2004) concluded that TEC variations near the epicenter were controlled by solar and geomagnetic activity that were unrelated to the earthquake. In an investigation of these very different results, we examine TEC time series of long duration from GPS stations near and far from the epicenter of the Hector Mine earthquake, and long before and long after the earthquake. While we can reproduce the essential time series results of Pulinets et al., we find that the signal they identified as being anomalous is not actually anomalous. Instead, it is just part of normal global-scale TEC variation. We conclude that the TEC anomaly reported by Pulinets et al. is unrelated to the Hector Mine earthquake.

  4. Earthquake potential in California-Nevada implied by correlation of strain rate and seismicity

    USGS Publications Warehouse

    Zeng, Yuehua; Petersen, Mark D.; Shen, Zheng-Kang

    2018-01-01

    Rock mechanics studies and dynamic earthquake simulations show that patterns of seismicity evolve with time through (1) accumulation phase, (2) localization phase, and (3) rupture phase. We observe a similar pattern of changes in seismicity during the past century across California and Nevada. To quantify these changes, we correlate GPS strain rates with seismicity. Earthquakes of M > 6.5 are collocated with regions of highest strain rates. By contrast, smaller magnitude earthquakes of M ≥ 4 show clear spatiotemporal changes. From 1933 to the late 1980s, earthquakes of M ≥ 4 were more diffused and broadly distributed in both high and low strain rate regions (accumulation phase). From the late 1980s to 2016, earthquakes were more concentrated within the high strain rate areas focused on the major fault strands (localization phase). In the same time period, the rate of M > 6.5 events also increased significantly in the high strain rate areas. The strong correlation between current strain rate and the later period of seismicity indicates that seismicity is closely related to the strain rate. The spatial patterns suggest that before the late 1980s, the strain rate field was also broadly distributed because of the stress shadows from previous large earthquakes. As the deformation field evolved out of the shadow in the late 1980s, strain has refocused on the major fault systems and we are entering a period of increased risk for large earthquakes in California.

  5. Earthquake Potential in California-Nevada Implied by Correlation of Strain Rate and Seismicity

    NASA Astrophysics Data System (ADS)

    Zeng, Yuehua; Petersen, Mark D.; Shen, Zheng-Kang

    2018-02-01

    Rock mechanics studies and dynamic earthquake simulations show that patterns of seismicity evolve with time through (1) accumulation phase, (2) localization phase, and (3) rupture phase. We observe a similar pattern of changes in seismicity during the past century across California and Nevada. To quantify these changes, we correlate GPS strain rates with seismicity. Earthquakes of M > 6.5 are collocated with regions of highest strain rates. By contrast, smaller magnitude earthquakes of M ≥ 4 show clear spatiotemporal changes. From 1933 to the late 1980s, earthquakes of M ≥ 4 were more diffused and broadly distributed in both high and low strain rate regions (accumulation phase). From the late 1980s to 2016, earthquakes were more concentrated within the high strain rate areas focused on the major fault strands (localization phase). In the same time period, the rate of M > 6.5 events also increased significantly in the high strain rate areas. The strong correlation between current strain rate and the later period of seismicity indicates that seismicity is closely related to the strain rate. The spatial patterns suggest that before the late 1980s, the strain rate field was also broadly distributed because of the stress shadows from previous large earthquakes. As the deformation field evolved out of the shadow in the late 1980s, strain has refocused on the major fault systems and we are entering a period of increased risk for large earthquakes in California.

  6. Dynamic models of an earthquake and tsunami offshore Ventura, California

    USGS Publications Warehouse

    Ryan, Kenny J.; Geist, Eric L.; Barall, Michael; Oglesby, David D.

    2015-01-01

    The Ventura basin in Southern California includes coastal dip-slip faults that can likely produce earthquakes of magnitude 7 or greater and significant local tsunamis. We construct a 3-D dynamic rupture model of an earthquake on the Pitas Point and Lower Red Mountain faults to model low-frequency ground motion and the resulting tsunami, with a goal of elucidating the seismic and tsunami hazard in this area. Our model results in an average stress drop of 6 MPa, an average fault slip of 7.4 m, and a moment magnitude of 7.7, consistent with regional paleoseismic data. Our corresponding tsunami model uses final seafloor displacement from the rupture model as initial conditions to compute local propagation and inundation, resulting in large peak tsunami amplitudes northward and eastward due to site and path effects. Modeled inundation in the Ventura area is significantly greater than that indicated by the State of California's current reference inundation line.

  7. Observation of the seismic nucleation phase in the Ridgecrest, California, earthquake sequence

    USGS Publications Warehouse

    Ellsworth, W.L.; Beroza, G.C.

    1998-01-01

    Near-source observations of five M 3.8-5.2 earthquakes near Ridgecrest, California are consistent with the presence of a seismic nucleation phase. These earthquakes start abruptly, but then slow or stop before rapidly growing again toward their maximum rate of moment release. Deconvolution of instrument and path effects by empirical Green's functions demonstrates that the initial complexity at the start of the earthquake is a source effect. The rapid growth of the P-wave arrival at the start of the seismic nucleation phase supports the conclusion of Mori and Kanamori [1996] that these earthquakes begin without a magnitude-scaled slow initial phase of the type observed by Iio [1992, 1995].

  8. Initial rupture of earthquakes in the 1995 Ridgecrest, California sequence

    USGS Publications Warehouse

    Mori, J.; Kanamori, H.

    1996-01-01

    Close examination of the P waves from earthquakes ranging in size across several orders of magnitude shows that the shape of the initiation of the velocity waveforms is independent of the magnitude of the earthquake. A model in which earthquakes of all sizes have similar rupture initiation can explain the data. This suggests that it is difficult to estimate the eventual size of an earthquake from the initial portion of the waveform. Previously reported curvature seen in the beginning of some velocity waveforms can be largely explained as the effect of anelastic attenuation; thus there is little evidence for a departure from models of simple rupture initiation that grow dynamically from a small region. The results of this study indicate that any "precursory" radiation at seismic frequencies must emanate from a source region no larger than the equivalent of an M 0.5 event (i.e., a characteristic length of ~10 m). The size of the nucleation region for magnitude 0 to 5 earthquakes thus is not resolvable with the standard seismic instrumentation deployed in California. Copyright 1996 by the American Geophysical Union.
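    The quoted ~10 m characteristic length can be checked against standard source-scaling relations. The sketch below is a back-of-the-envelope illustration (not from the paper), combining the Hanks-Kanamori moment-magnitude relation with Eshelby's circular-crack formula and assuming a typical 3 MPa stress drop:

```python
def moment_from_magnitude(mw):
    """Seismic moment M0 in N*m from moment magnitude (Hanks & Kanamori, 1979)."""
    return 10.0 ** (1.5 * mw + 9.05)

def source_radius(mw, stress_drop_pa=3.0e6):
    """Radius in meters of a circular crack with the given moment and
    stress drop, from Eshelby (1957): M0 = (16/7) * stress_drop * a**3."""
    m0 = moment_from_magnitude(mw)
    return (7.0 * m0 / (16.0 * stress_drop_pa)) ** (1.0 / 3.0)

# An M 0.5 source has a radius of roughly 10 m under these assumptions,
# consistent with the characteristic length quoted in the abstract.
print(f"M 0.5 source radius: {source_radius(0.5):.1f} m")
```

    The radius scales with the cube root of moment, so the choice of stress drop (here an assumed 3 MPa) shifts the answer only modestly.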

  9. Products and Services Available from the Southern California Earthquake Data Center (SCEDC) and the Southern California Seismic Network (SCSN)

    NASA Astrophysics Data System (ADS)

    Yu, E.; Chen, S.; Chowdhury, F.; Bhaskaran, A.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.

    2009-12-01

    The SCEDC archives continuous and triggered data from nearly 3000 data channels from 375 SCSN recorded stations. The SCSN and SCEDC process and archive an average of 12,000 earthquakes each year, contributing to the southern California earthquake catalog that spans from 1932 to present. The SCEDC provides public, searchable access to these earthquake parametric and waveform data through its website www.data.scec.org and through client applications such as STP, NETDC, and DHI.
    New data products:
    ● The SCEDC is distributing synthetic waveform data from the 2008 ShakeOut scenario (Jones et al., USGS Open File Rep. 2008-1150; Graves et al., 2008, Geophys. Res. Lett.), a M 7.8 earthquake on the southern San Andreas fault. Users can download 40 sps velocity waveforms in SAC format from the SCEDC website. The SCEDC is also distributing synthetic GPS data for this scenario (Crowell et al., 2009, Seismol. Res. Lett.).
    ● The SCEDC has added a new web page showing the latest tomographic model of southern California, based on Tape et al. (2009, Science).
    New data services:
    ● The SCEDC is exporting data in QuakeML format, an XML format adopted by the Advanced National Seismic System (ANSS). These data will also be available as a web service.
    ● The SCEDC is exporting station metadata in StationXML format, an XML format created by the SCEDC and adopted by ANSS to fully describe station metadata. These data will also be available as a web service.
    ● The stp 1.6 client can now access both the SCEDC and the Northern California Earthquake Data Center (NCEDC) earthquake and waveform archives.
    In progress - SCEDC to distribute 1 sps GPS data in miniSEED format:
    ● As part of a NASA Advanced Information Systems Technology project in collaboration with the Jet Propulsion Laboratory and Scripps Institution of Oceanography, the SCEDC will receive real time 1 sps streams of GPS displacement solutions from the California

  10. The Loma Prieta earthquake of October 17, 1989 : a brief geologic view of what caused the Loma Prieta earthquake and implications for future California earthquakes: What happened ... what is expected ... what can be done.

    USGS Publications Warehouse

    Ward, Peter L.; Page, Robert A.

    1990-01-01

    The San Andreas fault, in California, is the primary boundary between the North American plate and the Pacific plate. Land west of the fault has been moving northwestward relative to land on the east at an average rate of 2 inches per year for millions of years. This motion is not constant but occurs typically in sudden jumps during large earthquakes. This motion is relentless; therefore earthquakes in California are inevitable.

  11. Distributing Earthquakes Among California's Faults: A Binary Integer Programming Approach

    NASA Astrophysics Data System (ADS)

    Geist, E. L.; Parsons, T.

    2016-12-01

    Statement of the problem is simple: given regional seismicity specified by a Gutenberg-Richter (G-R) relation, how are earthquakes distributed to match observed fault-slip rates? The objective is to determine the magnitude-frequency relation on individual faults. The California statewide G-R b-value and a-value are estimated from historical seismicity, with the a-value accounting for off-fault seismicity. UCERF3 consensus slip rates are used, based on geologic and geodetic data and including estimates of coupling coefficients. The binary integer programming (BIP) problem is set up such that each earthquake from a synthetic catalog spanning millennia can occur at any location along any fault. The decision vector, therefore, consists of binary variables, with values equal to one indicating the location of each earthquake that results in an optimal match of slip rates, in an L1-norm sense. Rupture area and slip associated with each earthquake are determined from a magnitude-area scaling relation. Uncertainty bounds on the UCERF3 slip rates provide explicit minimum and maximum constraints to the BIP model, with the former more important to feasibility of the problem. There is a maximum magnitude limit associated with each fault, based on fault length, providing an implicit constraint. Solution of integer programming problems with a large number of variables (>10^5 in this study) has been possible only since the late 1990s. In addition to the classic branch-and-bound technique used for these problems, several other algorithms have been recently developed, including pre-solving, sifting, cutting planes, heuristics, and parallelization. An optimal solution is obtained using a state-of-the-art BIP solver for M≥6 earthquakes and California's faults with slip rates > 1 mm/yr. Preliminary results indicate a surprising diversity of on-fault magnitude-frequency relations throughout the state.
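    The optimization setup described above can be illustrated with a toy binary integer program. The sketch below uses hypothetical numbers (not UCERF3 values) and scipy's HiGHS-based `milp` solver rather than the solver used in the study; auxiliary slack variables linearize the L1 misfit between placed-earthquake slip and the target slip rates.

```python
import numpy as np
from scipy.optimize import Bounds, LinearConstraint, milp

# Toy problem: 2 faults, 4 candidate earthquakes from a synthetic catalog.
# A[i, j] = slip rate (mm/yr) contributed to fault i if earthquake j is placed.
A = np.array([[1.0, 0.0, 0.5, 0.2],
              [0.0, 1.2, 0.5, 0.8]])
target = np.array([1.5, 1.8])          # observed slip rates (mm/yr)
n_flt, n_eq = A.shape

# Variables: binary placement flags x (length 4), then L1 slacks t (length 2).
# Objective: minimize sum(t), i.e., the L1 misfit |A @ x - target|.
c = np.concatenate([np.zeros(n_eq), np.ones(n_flt)])

# |A @ x - target| <= t  <=>  A @ x - t <= target  and  A @ x + t >= target
upper = LinearConstraint(np.hstack([A, -np.eye(n_flt)]), -np.inf, target)
lower = LinearConstraint(np.hstack([A, np.eye(n_flt)]), target, np.inf)

integrality = np.concatenate([np.ones(n_eq), np.zeros(n_flt)])  # x integer, t continuous
bounds = Bounds(np.zeros(n_eq + n_flt),
                np.concatenate([np.ones(n_eq), np.full(n_flt, np.inf)]))

res = milp(c, constraints=[upper, lower], integrality=integrality, bounds=bounds)
placement = np.round(res.x[:n_eq]).astype(int)
print("placement:", placement.tolist(), "L1 misfit:", round(res.fun, 6))
```

    For these numbers the optimum keeps the first three events (misfit 0.1 mm/yr). The real problem differs in scale (>10^5 binary variables) and adds the slip-rate uncertainty bounds and per-fault maximum-magnitude constraints described in the abstract.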

  12. The 1868 Hayward fault, California, earthquake: Implications for earthquake scaling relations on partially creeping faults

    USGS Publications Warehouse

    Hough, Susan E.; Martin, Stacey

    2015-01-01

    The 21 October 1868 Hayward, California, earthquake is among the best-characterized historical earthquakes in California. In contrast to many other moderate-to-large historical events, the causative fault is clearly established. Published magnitude estimates have been fairly consistent, ranging from 6.8 to 7.2, with 95% confidence limits including values as low as 6.5. The magnitude is of particular importance for assessment of seismic hazard associated with the Hayward fault and, more generally, to develop appropriate magnitude–rupture length scaling relations for partially creeping faults. The recent reevaluation of archival accounts by Boatwright and Bundock (2008), together with the growing volume of well-calibrated intensity data from the U.S. Geological Survey “Did You Feel It?” (DYFI) system, provide an opportunity to revisit and refine the magnitude estimate. In this study, we estimate the magnitude using two different methods that use DYFI data as calibration. Both approaches yield preferred magnitude estimates of 6.3–6.6, assuming an average stress drop. A consideration of data limitations associated with settlement patterns increases the range to 6.3–6.7, with a preferred estimate of 6.5. Although magnitude estimates for historical earthquakes are inevitably uncertain, we conclude that, at a minimum, a lower-magnitude estimate represents a credible alternative interpretation of available data. We further discuss implications of our results for probabilistic seismic-hazard assessment from partially creeping faults.

  13. Hydrothermal response to a volcano-tectonic earthquake swarm, Lassen, California

    USGS Publications Warehouse

    Ingebritsen, Steven E.; Shelly, David R.; Hsieh, Paul A.; Clor, Laura; P.H. Seward,; Evans, William C.

    2015-01-01

    The increasing capability of seismic, geodetic, and hydrothermal observation networks allows recognition of volcanic unrest that could previously have gone undetected, creating an imperative to diagnose and interpret unrest episodes. A November 2014 earthquake swarm near Lassen Volcanic National Park, California, which included the largest earthquake in the area in more than 60 years, was accompanied by a rarely observed outburst of hydrothermal fluids. Although the earthquake swarm likely reflects upward migration of endogenous H2O-CO2 fluids in the source region, there is no evidence that such fluids emerged at the surface. Instead, shaking from the modest sized (moment magnitude 3.85) but proximal earthquake caused near-vent permeability increases that triggered increased outflow of hydrothermal fluids already present and equilibrated in a local hydrothermal aquifer. Long-term, multiparametric monitoring at Lassen and other well-instrumented volcanoes enhances interpretation of unrest and can provide a basis for detailed physical modeling.

  14. Cascadia Earthquake and Tsunami Scenario for California's North Coast

    NASA Astrophysics Data System (ADS)

    Dengler, L.

    2006-12-01

    In 1995 the California Division of Mines and Geology (now the California Geological Survey) released a planning scenario for an earthquake on the southern portion of the Cascadia subduction zone (CSZ). This scenario was the 8th and last of the Earthquake Planning Scenarios published by CDMG. It was the largest magnitude CDMG scenario, an 8.4 earthquake rupturing the southern 200 km of the CSZ, and it was the only scenario to include tsunami impacts. This scenario event has not occurred in historic times and depicts impacts far more severe than any recent earthquake. The local tsunami hazard is new; there is no written record of significant local tsunami impact in the region. The north coast scenario received considerable attention in Humboldt and Del Norte Counties and contributed to a number of mitigation efforts. The Redwood Coast Tsunami Work Group (RCTWG), an organization of scientists, emergency managers, government agencies, and businesses from Humboldt, Mendocino, and Del Norte Counties, was formed in 1996 to assist local jurisdictions in understanding the implications of the scenario and to promote a coordinated, consistent mitigation program. The group has produced print and video materials and promoted response and evacuation planning. Since 1997 the RCTWG has sponsored an Earthquake Tsunami Education Room at county fairs featuring preparedness information, hands-on exhibits and regional tsunami hazard maps. Since the development of the TsunamiReady Program in 2001, the RCTWG facilitates community TsunamiReady certification. To assess the effectiveness of mitigation efforts, five telephone surveys between 1993 and 2001 were conducted by the Humboldt Earthquake Education Center. A sixth survey is planned for this fall. Each survey includes between 400 and 600 respondents. 
Over the nine year period covered by the surveys, the percent with houses secured to foundations has increased from 58 to 80 percent, respondents aware of a local tsunami hazard increased

  15. Foreshocks and aftershocks of the Great 1857 California earthquake

    USGS Publications Warehouse

    Meltzner, A.J.; Wald, D.J.

    1999-01-01

    The San Andreas fault is the longest fault in California and one of the longest strike-slip faults anywhere in the world, yet we know little about many aspects of its behavior before, during, and after large earthquakes. We conducted a study to locate and to estimate magnitudes for the largest foreshocks and aftershocks of the 1857 M 7.9 Fort Tejon earthquake on the central and southern segments of the fault. We began by searching archived first-hand accounts from 1857 through 1862, by grouping felt reports temporally, and by assigning modified Mercalli intensities to each site. We then used a modified form of the grid-search algorithm of Bakun and Wentworth, derived from empirical analysis of modern earthquakes, to find the location and magnitude most consistent with the assigned intensities for each of the largest events. The result confirms a conclusion of Sieh that at least two foreshocks ('dawn' and 'sunrise') located on or near the Parkfield segment of the San Andreas fault preceded the mainshock. We estimate their magnitudes to be M ~ 6.1 and M ~ 5.6, respectively. The aftershock rate was below average but within one standard deviation of the number of aftershocks expected based on statistics of modern southern California mainshock-aftershock sequences. The aftershocks included two significant events during the first eight days of the sequence, with magnitudes M ~ 6.25 and M ~ 6.7, near the southern half of the rupture; later aftershocks included a M ~ 6 event near San Bernardino in December 1858 and a M ~ 6.3 event near the Parkfield segment in April 1860. From earthquake logs at Fort Tejon, we conclude that the aftershock sequence lasted a minimum of 3.75 years.

  16. Analysis and selection of magnitude relations for the Working Group on Utah Earthquake Probabilities

    USGS Publications Warehouse

    Duross, Christopher; Olig, Susan; Schwartz, David

    2015-01-01

    Prior to calculating time-independent and -dependent earthquake probabilities for faults in the Wasatch Front region, the Working Group on Utah Earthquake Probabilities (WGUEP) updated a seismic-source model for the region (Wong and others, 2014) and evaluated 19 historical regressions on earthquake magnitude (M). These regressions relate M to fault parameters for historical surface-faulting earthquakes, including linear fault length (e.g., surface-rupture length [SRL] or segment length), average displacement, maximum displacement, rupture area, seismic moment (M0), and slip rate. These regressions show that significant epistemic uncertainties complicate the determination of characteristic magnitude for fault sources in the Basin and Range Province (BRP). For example, we found that M estimates (as a function of SRL) span about 0.3–0.4 units (figure 1) owing to differences in the fault parameter used; age, quality, and size of historical earthquake databases; and fault type and region considered.

  17. Injuries and Traumatic Psychological Exposures Associated with the South Napa Earthquake - California, 2014.

    PubMed

    Attfield, Kathleen R; Dobson, Christine B; Henn, Jennifer B; Acosta, Meileen; Smorodinsky, Svetlana; Wilken, Jason A; Barreau, Tracy; Schreiber, Merritt; Windham, Gayle C; Materna, Barbara L; Roisman, Rachel

    2015-09-11

    On August 24, 2014, at 3:20 a.m., a magnitude 6.0 earthquake struck California, with its epicenter in Napa County (1). The earthquake was the largest to affect the San Francisco Bay area in 25 years and caused significant damage in Napa and Solano counties, including widespread power outages, five residential fires, and damage to roadways, waterlines, and 1,600 buildings (2). Two deaths resulted (2). On August 25, Napa County Public Health asked the California Department of Public Health (CDPH) for assistance in assessing postdisaster health effects, including earthquake-related injuries and effects on mental health. On September 23, Solano County Public Health requested similar assistance. A household-level Community Assessment for Public Health Emergency Response (CASPER) was conducted for these counties in two cities (Napa, 3 weeks after the earthquake, and Vallejo, 6 weeks after the earthquake). Among households reporting injuries, a substantial proportion (48% in Napa and 37% in western Vallejo) reported that the injuries occurred during the cleanup period, suggesting that increased messaging on safety precautions after a disaster might be needed. One fifth of respondents overall (27% in Napa and 9% in western Vallejo) reported one or more traumatic psychological exposures in their households. These findings were used by Napa County Mental Health to guide immediate-term mental health resource allocations and to conduct public training sessions and education campaigns to support persons with mental health risks following the earthquake. In addition, to promote community resilience and future earthquake preparedness, Napa County Public Health subsequently conducted community events on the earthquake anniversary and provided outreach workers with psychological first aid training.

  18. Source properties of earthquakes near the Salton Sea triggered by the 16 October 1999 M 7.1 Hector Mine, California, earthquake

    USGS Publications Warehouse

    Hough, S.E.; Kanamori, H.

    2002-01-01

    We analyze the source properties of a sequence of triggered earthquakes that occurred near the Salton Sea in southern California in the immediate aftermath of the M 7.1 Hector Mine earthquake of 16 October 1999. The sequence produced a number of early events that were not initially located by the regional network, including two moderate earthquakes: the first within 30 sec of the P-wave arrival and a second approximately 10 minutes after the mainshock. We use available amplitude and waveform data from these events to estimate magnitudes to be approximately 4.7 and 4.4, respectively, and to obtain crude estimates of their locations. The sequence of small events following the initial M 4.7 earthquake is clustered and suggestive of a local aftershock sequence. Using both broadband TriNet data and analog data from the Southern California Seismic Network (SCSN), we also investigate the spectral characteristics of the M 4.4 event and other triggered earthquakes using empirical Green's function (EGF) analysis. We find that the source spectra of the events are consistent with expectations for tectonic (brittle shear failure) earthquakes, and infer stress drop values of 0.1 to 6 MPa for six M 2.1 to M 4.4 events. The estimated stress drop values are within the range observed for tectonic earthquakes elsewhere. They are relatively low compared to typically observed stress drop values, which is consistent with expectations for faulting in an extensional, high heat flow regime. The results therefore suggest that, at least in this case, triggered earthquakes are associated with a brittle shear failure mechanism. This further suggests that triggered earthquakes may tend to occur in geothermal-volcanic regions because shear failure occurs at, and can be triggered by, relatively low stresses in extensional regimes.

  19. Statistical analysis of an earthquake-induced landslide distribution - The 1989 Loma Prieta, California event

    USGS Publications Warehouse

    Keefer, D.K.

    2000-01-01

    The 1989 Loma Prieta, California earthquake (moment magnitude, M=6.9) generated landslides throughout an area of about 15,000 km² in central California. Most of these landslides occurred in an area of about 2000 km² in the mountainous terrain around the epicenter, where they were mapped during field investigations immediately following the earthquake. The distribution of these landslides is investigated statistically, using regression and one-way analysis of variance (ANOVA) techniques to determine how the occurrence of landslides correlates with distance from the earthquake source, slope steepness, and rock type. The landslide concentration (defined as the number of landslide sources per unit area) has a strong inverse correlation with distance from the earthquake source and a strong positive correlation with slope steepness. The landslide concentration differs substantially among the various geologic units in the area. The differences correlate to some degree with differences in lithology and degree of induration, but this correlation is less clear, suggesting a more complex relationship between landslide occurrence and rock properties. © 2000 Elsevier Science B.V. All rights reserved.

  20. A record of large earthquakes during the past two millennia on the southern Green Valley Fault, California

    USGS Publications Warehouse

    Lienkaemper, James J.; Baldwin, John N.; Turner, Robert; Sickler, Robert R.; Brown, Johnathan

    2013-01-01

    We document evidence for surface-rupturing earthquakes (events) at two trench sites on the southern Green Valley fault, California (SGVF). The 75-80-km-long dextral SGVF creeps ~1-4 mm/yr. We identify stratigraphic horizons disrupted by upward-flowering shears and in-filled fissures unlikely to have formed from creep alone. The Mason Rd site exhibits four events from ~1013 CE to the present. The Lopes Ranch site (LR, 12 km to the south) exhibits three events from 18 BCE to the present, including the most recent event (MRE), 1610 ±52 yr CE (1σ), and a two-event interval (18 BCE-238 CE) isolated by a millennium of low deposition. Using OxCal to model the timing of the four-event earthquake sequence from radiocarbon data and the LR MRE yields a mean recurrence interval (RI or μ) of 199 ±82 yr (1σ; ±35 yr standard error of the mean), the first based on geologic data. The time since the most recent earthquake (open window since the MRE) is 402 ±52 yr, well past μ ~200 yr. The shape of the probability density function (pdf) of the average RI from OxCal resembles a Brownian Passage Time (BPT) pdf rather than a normal one, which permits rarer, longer ruptures potentially involving the Berryessa and Hunting Creek sections of the northernmost GVF. The model coefficient of variation (cv, σ/μ) is 0.41, but a larger value (cv ~0.6) fits better when using BPT. A BPT pdf with μ of 250 yr and cv of 0.6 yields 30-yr rupture probabilities of 20-25%, versus a Poisson probability of 11-17%.
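    The 30-yr probabilities quoted above can be checked in a few lines, since the BPT distribution is the inverse-Gaussian distribution. The sketch below is a back-of-the-envelope check, not the authors' computation; it uses μ = 250 yr, cv = 0.6, and the 402-yr open interval from the abstract, mapping those parameters onto scipy's `invgauss` parametrization.

```python
from math import exp
from scipy.stats import invgauss

mu, alpha = 250.0, 0.6          # mean recurrence interval (yr) and aperiodicity (cv)
elapsed, window = 402.0, 30.0   # open interval since the 1610 CE MRE; forecast window

# BPT is the inverse-Gaussian distribution; in scipy's parametrization,
# invgauss(alpha**2, scale=mu/alpha**2) has mean mu and coefficient of
# variation alpha.
F = lambda t: invgauss.cdf(t, alpha**2, scale=mu / alpha**2)

# Conditional probability of rupture in the next `window` years,
# given no rupture in the `elapsed` years since the MRE.
p_bpt = (F(elapsed + window) - F(elapsed)) / (1.0 - F(elapsed))

# Time-independent (Poisson) probability for comparison.
p_poisson = 1.0 - exp(-window / mu)

print(f"BPT conditional 30-yr probability: {p_bpt:.0%}")
print(f"Poisson 30-yr probability: {p_poisson:.0%}")
```

    With these inputs the Poisson value reproduces the 11% end of the quoted 11-17% range, and the BPT conditional value comes out near the low edge of the quoted 20-25%; the exact figures depend on the elapsed-time and μ values chosen.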

  1. One hundred years of earthquake recording at the University of California

    USGS Publications Warehouse

    Bolt, B. A.

    1987-01-01

    The best seismographs then available arrived from England in 1887 and were installed at Lick Observatory on Mt. Hamilton and at the Students' Astronomical Observatory on the Berkeley campus. The first California earthquake recorded by the Lick instrument was on April 24, 1887. These seismographic stations have functioned continuously from their founding to the present day, with improvements in instruments from time to time as technology advanced. Now they are part of a seismographic network of 16 stations recording with great completeness both local and distant earthquakes.

  2. NASA Satellite Imagery Shows Sparse Population of Region Near Baja, California Earthquake

    NASA Image and Video Library

    2010-04-09

    This image from NASA's Terra spacecraft shows where a magnitude 7.2 earthquake struck Baja California, Mexico, on April 4, 2010, at shallow depth along the principal plate boundary between the North American and Pacific plates.

  3. On the reported ionospheric precursor of the 1999 Hector Mine, California earthquake

    USGS Publications Warehouse

    Thomas, Jeremy N.; Love, Jeffrey J.; Komjathy, Attila; Verkhoglyadova, Olga P.; Butala, Mark; Rivera, Nicholas

    2012-01-01

    Using Global Positioning System (GPS) data from sites near the 16 Oct. 1999 Hector Mine, California earthquake, Pulinets et al. (2007) identified anomalous changes in the ionospheric total electron content (TEC) starting one week prior to the earthquake. Pulinets (2007) suggested that precursory phenomena of this type could be useful for predicting earthquakes. On the other hand, and in a separate analysis, Afraimovich et al. (2004) concluded that TEC variations near the epicenter were controlled by solar and geomagnetic activity that were unrelated to the earthquake. In an investigation of these very different results, we examine TEC time series of long duration from GPS stations near and far from the epicenter of the Hector Mine earthquake, and long before and long after the earthquake. While we can reproduce the essential time series results of Pulinets et al., we find that the signal they identify as anomalous is not actually anomalous. Instead, it is just part of normal global-scale TEC variation. We conclude that the TEC anomaly reported by Pulinets et al. is unrelated to the Hector Mine earthquake.

  4. Permanently enhanced dynamic triggering probabilities as evidenced by two M ≥ 7.5 earthquakes

    USGS Publications Warehouse

    Gomberg, Joan S.

    2013-01-01

    The 2012 M7.7 Haida Gwaii earthquake radiated waves that likely dynamically triggered the 2013 M7.5 Craig earthquake, setting two precedents. First, the triggered earthquake is the largest dynamically triggered shear failure event documented to date. Second, the events highlight a connection between geologic structure, sedimentary troughs that act as waveguides, and triggering probability. The Haida Gwaii earthquake excited extraordinarily large waves within and beyond the Queen Charlotte Trough, which propagated well into mainland Alaska, likely triggering the Craig earthquake along the way. Previously, focusing and associated dynamic triggering have been attributed to unpredictable source effects. This case suggests that elevated dynamic triggering probabilities may exist along the many structures where sedimentary troughs overlie major faults, such as subduction zones' accretionary prisms and transform faults' axial valleys. Although data are sparse, I find no evidence of accelerating seismic activity in the vicinity of the Craig rupture in the interval between the two earthquakes.

  5. Earthquake swarms and local crustal spreading along major strike-slip faults in California

    USGS Publications Warehouse

    Weaver, C.S.; Hill, D.P.

    1978-01-01

    Earthquake swarms in California are often localized to areas within dextral offsets in the linear trend of active fault strands, suggesting a relation between earthquake swarms and local crustal spreading. Local crustal spreading is required by the geometry of dextral offsets when, as in the San Andreas system, faults have dominantly strike-slip motion with right-lateral displacement. Three clear examples of this relation occur in the Imperial Valley, Coso Hot Springs, and the Danville region, all in California. The first two of these areas are known for their Holocene volcanism and geothermal potential, which is consistent with crustal spreading and magmatic intrusion. The third example, however, shows no evidence for volcanism or geothermal activity at the surface. © 1978 Birkhäuser Verlag.

  6. Heightened odds of large earthquakes near Istanbul: an interaction-based probability calculation

    USGS Publications Warehouse

    Parsons, T.; Toda, S.; Stein, R.S.; Barka, A.; Dieterich, J.H.

    2000-01-01

    We calculate the probability of strong shaking in Istanbul, an urban center of 10 million people, from the description of earthquakes on the North Anatolian fault system in the Marmara Sea during the past 500 years, and we test the resulting catalog against the frequency of damage in Istanbul during the preceding millennium. Departing from current practice, we include the time-dependent effect of stress transferred by the 1999 moment magnitude M = 7.4 Izmit earthquake to faults nearer to Istanbul. We find a 62 ± 15% probability (one standard deviation) of strong shaking during the next 30 years and 32 ± 12% during the next decade.

  7. Liquefaction-induced lateral spreading in Oceano, California, during the 2003 San Simeon Earthquake

    USGS Publications Warehouse

    Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.; Di Alessandro, Carola; Boatwright, John; Tinsley, John C.; Sell, Russell W.; Rosenberg, Lewis I.

    2004-01-01

    The December 22, 2003, San Simeon, California, (M6.5) earthquake caused damage to houses, road surfaces, and underground utilities in Oceano, California. The community of Oceano is approximately 50 miles (80 km) from the earthquake epicenter. Damage at this distance from a M6.5 earthquake is unusual. To understand the causes of this damage, the U.S. Geological Survey conducted extensive subsurface exploration and monitoring of aftershocks in the months after the earthquake. The investigation included 37 seismic cone penetration tests, 5 soil borings, and aftershock monitoring from January 28 to March 7, 2004. The USGS investigation identified two earthquake hazards in Oceano that explain the San Simeon earthquake damage: site amplification and liquefaction. Site amplification is a phenomenon observed in many earthquakes where the strength of the shaking increases abnormally in areas where the seismic-wave velocity of shallow geologic layers is low. As a result, earthquake shaking is felt more strongly than in surrounding areas without similar geologic conditions. Site amplification in Oceano is indicated by the physical properties of the geologic layers beneath Oceano and was confirmed by monitoring aftershocks. Liquefaction, which is also commonly observed during earthquakes, is a phenomenon where saturated sands lose their strength during an earthquake and become fluid-like and mobile. As a result, the ground may undergo large permanent displacements that can damage underground utilities and well-built surface structures. The type of displacement of major concern associated with liquefaction is lateral spreading because it involves displacement of large blocks of ground down gentle slopes or towards stream channels. The USGS investigation indicates that the shallow geologic units beneath Oceano are very susceptible to liquefaction. They include young sand dunes and clean sandy artificial fill that was used to bury and convert marshes into developable lots. 
Most of

  8. Chapter B. The Loma Prieta, California, Earthquake of October 17, 1989 - Forecasts

    USGS Publications Warehouse

    Harris, Ruth A.

    1998-01-01

    The magnitude (Mw) 6.9 Loma Prieta earthquake struck the San Francisco Bay region of central California at 5:04 p.m. PDT on October 17, 1989, killing 62 people and generating billions of dollars in property damage. Scientists were not surprised by the occurrence of a destructive earthquake in this region and had, in fact, been attempting to forecast the location of the next large earthquake in the San Francisco Bay region for decades. This paper summarizes more than 20 scientifically based forecasts made before the 1989 Loma Prieta earthquake for a large earthquake that might occur in the Loma Prieta area. The forecasts geographically closest to the actual earthquake primarily consisted of right-lateral strike-slip motion on the San Andreas Fault northwest of San Juan Bautista. Several of the forecasts did encompass the magnitude of the actual earthquake, and at least one approximately encompassed the along-strike rupture length. The 1989 Loma Prieta earthquake differed from most of the forecasted events in two ways: (1) it occurred with considerable dip-slip in addition to strike-slip motion, and (2) it was much deeper than expected.

  9. Rupture directivity of moderate earthquakes in northern California

    USGS Publications Warehouse

    Seekins, Linda C.; Boatwright, John

    2010-01-01

    We invert peak ground velocity and acceleration (PGV and PGA) to estimate rupture direction and rupture velocity for 47 moderate earthquakes (3.5 ≤ M ≤ 5.4) in northern California. We correct sets of PGAs and PGVs recorded at stations within 55–125 km, depending on source depth, for site amplification and source–receiver distance, then fit the residual peak motions to the unilateral directivity function of Ben-Menahem (1961). We independently invert PGA and PGV. The rupture direction can be determined using as few as seven peak motions if the station distribution is sufficient. The rupture velocity is unstable, however, if there are no takeoff angles within 30° of the rupture direction. Rupture velocities are generally subsonic (0.5β–0.9β); for stability, we limit the rupture velocity at v=0.92β, the Rayleigh wave speed. For 73 of 94 inversions, the rupture direction clearly identifies one of the nodal planes as the fault plane. The 35 strike-slip earthquakes have rupture directions that range from nearly horizontal (6 events) to directly updip (5 events); the other 24 rupture partly along strike and partly updip. Two strike-slip earthquakes rupture updip in one inversion and downdip in the other. All but 1 of the 11 thrust earthquakes rupture predominantly updip. We compare the rupture directions for 10 M≥4.0 earthquakes to the relative location of the mainshock and the first two weeks of aftershocks. Spatial distributions of 8 of 10 aftershock sequences agree well with the rupture directivity calculated for the mainshock.
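    The peak-motion inversion can be illustrated with a toy grid search over a unilateral directivity factor of the form 1/(1 − (v_r/β)cos(θ − θ_rup)), a simplified stand-in for the Ben-Menahem (1961) function. The function names and station geometry below are mine, and the authors' actual parameterization and misfit norm may differ:

```python
import math

def directivity_amp(theta, theta_rup, ratio):
    """Simplified unilateral directivity factor; `ratio` is v_r / beta."""
    return 1.0 / (1.0 - ratio * math.cos(theta - theta_rup))

def fit_directivity(azimuths, log_amps, ratios=None):
    """Grid-search the rupture azimuth and v_r/beta that best fit
    log peak-motion residuals (least squares, up to a constant offset)."""
    if ratios is None:
        # subshear trial velocities, capped at 0.92 (the Rayleigh speed)
        ratios = [r / 100.0 for r in range(40, 93, 2)]
    best = None
    for ratio in ratios:
        for deg in range(0, 360, 2):
            th = math.radians(deg)
            pred = [math.log(directivity_amp(a, th, ratio)) for a in azimuths]
            # absolute amplitude level is unknown: remove the mean offset
            off = sum(o - p for o, p in zip(log_amps, pred)) / len(pred)
            misfit = sum((o - p - off) ** 2 for o, p in zip(log_amps, pred))
            if best is None or misfit < best[0]:
                best = (misfit, deg, ratio)
    return best[1], best[2]  # rupture azimuth (deg), v_r/beta

# synthetic check: stations every 15 degrees, true azimuth 60, v_r/beta = 0.8
az = [math.radians(d) for d in range(0, 360, 15)]
obs = [math.log(directivity_amp(a, math.radians(60.0), 0.8)) for a in az]
best_deg, best_ratio = fit_directivity(az, obs)
```

    With noise-free synthetic data the search recovers the input azimuth and velocity ratio; with real peak motions, the shape of the misfit surface is what lets the rupture direction discriminate between the two nodal planes.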

  10. Preliminary Analysis of Remote Triggered Seismicity in Northern Baja California Generated by the 2011, Tohoku-Oki, Japan Earthquake

    NASA Astrophysics Data System (ADS)

    Wong-Ortega, V.; Castro, R. R.; Gonzalez-Huizar, H.; Velasco, A. A.

    2013-05-01

    We analyze possible variations of seismicity in northern Baja California due to the passage of seismic waves from the 2011, M9.0, Tohoku-Oki, Japan earthquake. The northwestern area of Baja California is characterized by a mountain range composed of crystalline rocks. The Peninsular Ranges of Baja California exhibit high microseismic activity and moderate size earthquakes. In the eastern region of Baja California shearing between the Pacific and the North American plates takes place, and the Imperial and Cerro-Prieto faults generate most of the seismicity. The seismicity in these regions is monitored by the seismic network RESNOM operated by the Centro de Investigación Científica y de Educación Superior de Ensenada (CICESE). This network consists of 13 three-component seismic stations. We use the seismic catalog of RESNOM to search for changes in local seismicity rates after the passage of surface waves generated by the Tohoku-Oki, Japan earthquake. When we compare one month of seismicity before and after the M9.0 earthquake, the preliminary analysis shows an absence of triggered seismicity in the northern Peninsular Ranges and an increase of seismicity south of the Mexicali valley, where the Imperial fault jumps southwest and the Cerro Prieto fault continues.

  11. Stress transferred by the 1995 Mw = 6.9 Kobe, Japan, shock: Effect on aftershocks and future earthquake probabilities

    USGS Publications Warehouse

    Toda, S.; Stein, R.S.; Reasenberg, P.A.; Dieterich, J.H.; Yoshida, A.

    1998-01-01

    The Kobe earthquake struck at the edge of the densely populated Osaka-Kyoto corridor in southwest Japan. We investigate how the earthquake transferred stress to nearby faults, altering their proximity to failure and thus changing earthquake probabilities. We find that relative to the pre-Kobe seismicity, Kobe aftershocks were concentrated in regions of calculated Coulomb stress increase and less common in regions of stress decrease. We quantify this relationship by forming the spatial correlation between the seismicity rate change and the Coulomb stress change. The correlation is significant for stress changes greater than 0.2-1.0 bars (0.02-0.1 MPa), and the nonlinear dependence of seismicity rate change on stress change is compatible with a state- and rate-dependent formulation for earthquake occurrence. We extend this analysis to future mainshocks by resolving the stress changes on major faults within 100 km of Kobe and calculating the change in probability caused by these stress changes. Transient effects of the stress changes are incorporated by the state-dependent constitutive relation, which amplifies the permanent stress changes during the aftershock period. Earthquake probability framed in this manner is highly time-dependent, much more so than is assumed in current practice. Because the probabilities depend on several poorly known parameters of the major faults, we estimate uncertainties of the probabilities by Monte Carlo simulation. This enables us to include uncertainties on the elapsed time since the last earthquake, the repeat time and its variability, and the period of aftershock decay. We estimate that a calculated 3-bar (0.3-MPa) stress increase on the eastern section of the Arima-Takatsuki Tectonic Line (ATTL) near Kyoto causes a fivefold increase in the 30-year probability of a subsequent large earthquake near Kyoto; a 2-bar (0.2-MPa) stress decrease on the western section of the ATTL results in a reduction in probability by a factor of 140 to
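    The Monte Carlo treatment of parameter uncertainty can be sketched generically: draw the poorly known fault parameters from assumed distributions, compute a conditional probability for each draw, and report percentiles of the resulting distribution. Everything below is an illustrative placeholder (a simple lognormal renewal model and made-up parameter values), not the paper's state- and rate-dependent calculation:

```python
import math
import random

def lognorm_cdf(t, median, sigma):
    """CDF of a lognormal recurrence model (illustrative renewal model)."""
    return 0.5 * (1.0 + math.erf(math.log(t / median) / (sigma * math.sqrt(2.0))))

def cond_prob(elapsed, window, median, sigma):
    """P(rupture within `window` yr, given `elapsed` quiet yr)."""
    f = lognorm_cdf(elapsed, median, sigma)
    return (lognorm_cdf(elapsed + window, median, sigma) - f) / (1.0 - f)

random.seed(0)
probs = []
for _ in range(10_000):
    median = random.gauss(1000.0, 200.0)   # hypothetical repeat time (yr)
    elapsed = random.gauss(400.0, 100.0)   # hypothetical open interval (yr)
    if median > 50.0 and elapsed > 1.0:    # discard nonphysical draws
        probs.append(cond_prob(elapsed, 30.0, median, 0.5))
probs.sort()
p16, p50, p84 = (probs[int(len(probs) * q)] for q in (0.16, 0.50, 0.84))
```

    Reporting a percentile range (e.g., 16th-84th) rather than a single number is the point of the exercise: the spread reflects how poorly the fault parameters are known.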

  12. Distribution of intensity for the Westmorland, California, earthquake of April 26, 1981

    USGS Publications Warehouse

    Barnhard, L.M.; Thenhaus, P.C.; Algermissen, Sylvester Theodore

    1982-01-01

    The maximum Modified Mercalli intensity of the April 26, 1981 earthquake located 5 km northwest of Westmorland, California is VII. Twelve buildings in Westmorland were severely damaged with an additional 30 sustaining minor damage. Two brick parapets fell in Calipatria, 14 km northeast of Westmorland and 10 km from the earthquake epicenter. Significant damage in rural areas was restricted to unreinforced, concrete-lined irrigation canals. Liquefaction effects and ground slumping were widespread in rural areas and were the primary causes of road cracking. Preliminary local government estimates of property loss range from one to three million dollars (Imperial Valley Press, 1981). The earthquake was felt over an area of approximately 160,000 km2; about the same felt area of the October 15, 1979 (Reagor and others, 1980), and May 18, 1940 (Ulrich, 1941) Imperial Valley earthquakes.

  13. Archiving and Distributing Seismic Data at the Southern California Earthquake Data Center (SCEDC)

    NASA Astrophysics Data System (ADS)

    Appel, V. L.

    2002-12-01

    The Southern California Earthquake Data Center (SCEDC) archives and provides public access to earthquake parametric and waveform data gathered by the Southern California Seismic Network (SCSN) and, since January 1, 2001, by the TriNet seismic network, southern California's earthquake monitoring network. The parametric data in the archive include earthquake locations, magnitudes, moment-tensor solutions and phase picks. The SCEDC waveform archive prior to TriNet consists primarily of short-period, 100-samples-per-second waveforms from the SCSN. The addition of the TriNet array added continuous recordings of 155 broadband stations (20 samples per second or less), and triggered seismograms from 200 accelerometers and 200 short-period instruments. Since the Data Center and TriNet use the same Oracle database system, new earthquake data are available to the seismological community in near real-time. Primary access to the database and waveforms is through the Seismogram Transfer Program (STP) interface. The interface enables users to search the database for earthquake information, phase picks, and continuous and triggered waveform data. Output is available in SAC, miniSEED, and other formats. Both the raw counts format (V0) and the gain-corrected format (V1) of COSMOS (Consortium of Organizations for Strong-Motion Observation Systems) are now supported by STP. EQQuest is an interface to prepackaged waveform data sets for select earthquakes in Southern California stored at the SCEDC. Waveform data for large-magnitude events have been prepared and new data sets will be available for download in near real-time following major events. The parametric data from 1981 to present have been loaded into the Oracle 9.2.0.1 database system and the waveforms for that time period have been converted to miniSEED format and are accessible through the STP interface. 
The DISC optical-disk system (the "jukebox") that currently serves as the mass-storage for the SCEDC is in the process of being replaced

  14. Spatial organization of foreshocks as a tool to forecast large earthquakes.

    PubMed

    Lippiello, E; Marzocchi, W; de Arcangelis, L; Godano, C

    2012-01-01

    An increase in the number of smaller magnitude events, retrospectively named foreshocks, is often observed before large earthquakes. We show that the linear density probability of earthquakes occurring before and after small or intermediate mainshocks displays a symmetrical behavior, indicating that the size of the area fractured during the mainshock is encoded in the foreshock spatial organization. This observation can be used to discriminate spatial clustering due to foreshocks from the one induced by aftershocks and is implemented in an alarm-based model to forecast m > 6 earthquakes. A retrospective study of the last 19 years of the Southern California catalog shows that the daily occurrence probability presents isolated peaks closely located in time and space to the epicenters of five of the six m > 6 earthquakes. We find daily probabilities as high as 25% (in cells of size 0.04° × 0.04°), with significant probability gains with respect to standard models.

  16. CISN ShakeAlert: Using early warnings for earthquakes in California

    NASA Astrophysics Data System (ADS)

    Vinci, M.; Hellweg, M.; Jones, L. M.; Khainovski, O.; Schwartz, K.; Lehrer, D.; Allen, R. M.; Neuhauser, D. S.

    2009-12-01

    Educated users who have developed response plans and procedures are just as important for an earthquake early warning (EEW) system as are the algorithms and computers that process the data and produce the warnings. In Japan, for example, the implementation of the EEW system, which now provides advance alerts of ground shaking, included intense outreach efforts to both institutional and individual recipients. Alerts are now used in automatic control systems that stop trains, place sensitive equipment in safe mode and isolate hazards while the public takes cover. In California, the California Integrated Seismic Network (CISN) is now developing and implementing components of a prototype system for EEW, ShakeAlert. As this processing system is developed, we invite a suite of prospective users from critical industries and institutions throughout California to partner with us in developing useful ShakeAlert products and procedures. At the same time, we will support their efforts to determine and implement appropriate responses to an early warning of earthquake shaking. As a first step, in a collaboration with BART, we have developed a basic system allowing BART's operations center to receive real-time ground shaking information from more than 150 seismic stations operating in the San Francisco Bay Area. BART engineers are implementing a display system for this information. Later phases will include the development of improved response procedures utilizing this information. We plan to continue this collaboration to include more sophisticated information from the prototype CISN ShakeAlert system.

  17. Impact of a Large San Andreas Fault Earthquake on Tall Buildings in Southern California

    NASA Astrophysics Data System (ADS)

    Krishnan, S.; Ji, C.; Komatitsch, D.; Tromp, J.

    2004-12-01

    In 1857, an earthquake of magnitude 7.9 occurred on the San Andreas fault, starting at Parkfield and rupturing in a southeasterly direction for more than 300 km. Such a unilateral rupture produces significant directivity toward the San Fernando and Los Angeles basins. The strong shaking in the basins due to this earthquake would have had a significant long-period content (2–8 s). If such motions were to happen today, they could have a serious impact on tall buildings in Southern California. In order to study the effects of large San Andreas fault earthquakes on tall buildings in Southern California, we use the finite source of the magnitude 7.9 2001 Denali fault earthquake in Alaska and map it onto the San Andreas fault with the rupture originating at Parkfield and proceeding southward over a distance of 290 km. Using the SPECFEM3D spectral element seismic wave propagation code, we simulate a Denali-like earthquake on the San Andreas fault and compute ground motions at sites located on a grid with a 2.5–5.0 km spacing in the greater Southern California region. We subsequently analyze 3D structural models of an existing tall steel building designed in 1984 as well as one designed according to the current building code (Uniform Building Code, 1997) subjected to the computed ground motion. We use a sophisticated nonlinear building analysis program, FRAME3D, that has the ability to simulate damage in buildings due to three-component ground motion. We summarize the performance of these structural models on contour maps of carefully selected structural performance indices. This study could benefit the city in laying out emergency response strategies in the event of an earthquake on the San Andreas fault, in undertaking appropriate retrofit measures for tall buildings, and in formulating zoning regulations for new construction. 
In addition, the study would provide risk data associated with existing and new construction to insurance companies, real estate developers, and

  18. Chapter B. The Loma Prieta, California, Earthquake of October 17, 1989 - Highway Systems

    USGS Publications Warehouse

    Yashinsky, Mark

    1998-01-01

    This paper summarizes the impact of the Loma Prieta earthquake on highway systems. City streets, urban freeways, county roads, state routes, and the national highway system were all affected. There was damage to bridges, roads, tunnels, and other highway structures. The most serious damage occurred in the cities of San Francisco and Oakland, 60 miles from the fault rupture. The cost to repair and replace highways damaged by this earthquake was $2 billion. About half of this cost was to replace the Cypress Viaduct, a long, elevated double-deck expressway that had a devastating collapse which resulted in 42 deaths and 108 injuries. The earthquake also resulted in some positive changes for highway systems. Research on bridges and earthquakes began to be funded at a much higher level. Retrofit programs were started to upgrade the seismic performance of the nation's highways. The Loma Prieta earthquake changed earthquake policy and engineering practice for highway departments not only in California, but all over the world.

  19. Nonlinear site response in medium magnitude earthquakes near Parkfield, California

    USGS Publications Warehouse

    Rubinstein, Justin L.

    2011-01-01

    Careful analysis of strong-motion recordings of 13 medium magnitude earthquakes (3.7 ≤ M ≤ 6.5) in the Parkfield, California, area shows that very modest levels of shaking (approximately 3.5% of the acceleration of gravity) can produce observable changes in site response. Specifically, I observe a drop and subsequent recovery of the resonant frequency at sites that are part of the USGS Parkfield dense seismograph array (UPSAR) and Turkey Flat array. While further work is necessary to fully eliminate other models, given that these frequency shifts correlate with the strength of shaking at the Turkey Flat array and only appear for the strongest shaking levels at UPSAR, the most plausible explanation for them is that they are a result of nonlinear site response. Assuming this to be true, these are observations of nonlinear site response in events as small as M 3.7, not only in the largest events studied (the 2003 M 6.5 San Simeon earthquake and the 2004 M 6 Parkfield earthquake).

  20. Contrasts between source parameters of M ≥ 5.5 earthquakes in northern Baja California and southern California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doser, D.I.

    1993-04-01

    Source parameters determined from the body waveform modeling of large (M ≥ 5.5) historic earthquakes occurring between 1915 and 1956 along the San Jacinto and Imperial fault zones of southern California and the Cerro Prieto, Tres Hermanas and San Miguel fault zones of Baja California have been combined with information from post-1960s events to study regional variations in source parameters. The results suggest that large earthquakes along the relatively young San Miguel and Tres Hermanas fault zones have complex rupture histories, small source dimensions (< 25 km), high stress drops (60 bar average), and a high incidence of foreshock activity. This may be a reflection of the rough, highly segmented nature of the young faults. In contrast, Imperial-Cerro Prieto events of similar magnitude have low stress drops (16 bar average) and longer rupture lengths (42 km average), reflecting rupture along older, smoother fault planes. Events along the San Jacinto fault zone appear to lie in between these two groups. These results suggest a relationship between the structural and seismological properties of strike-slip faults that should be considered during seismic risk studies.

  1. Historiographical analysis of the 1857 Ft. Tejon earthquake, San Andreas Fault, California: Preliminary results

    NASA Astrophysics Data System (ADS)

    Martindale, D.; Evans, J. P.

    2002-12-01

    Past historical analyses of the 1857 Fort Tejon earthquake include Townley and Allen (1939); Wood (1955), who re-examined the earthquake and added some new material; and Agnew and Sieh (1978), who published an extensive review of the previous publications and included primary sources not formerly known. Since 1978, most authors have reiterated the findings of Agnew and Sieh, with the exception of Meltzner and Wald's 1998 work, which built on Sieh's foreshock research and included an extensive study of aftershocks. Approximately twenty-five years have passed since the last full investigation of the event. In the last several decades, libraries and archives have continued to gather additional documents. Staff members continually inventory new and existing collections, making them accessible to researchers today. As a result, we are conducting an updated examination, with the hope of new insight regarding the 1857 Fort Tejon earthquake. We use a new approach to the topic: the research skills of a historian in collaboration with a geologist to generate quantitative data on the nature and location of ground shaking associated with the earthquake. We analyze documents from the Huntington Library, California State Historical Society, California State Library-California Room, Utah Historical Association Information Center, the Church of Jesus Christ of Latter-day Saints (LDS) Archives and Historical Department, Cal Tech Archives, the National Archives, and the Fort Tejon State Park. New facilities reviewed also include Utah State University, University of Utah, and the LDS Family History Center. Each facility provided formerly quoted sources, and many also offered new materials. For example, previous scholars examined popular, well-known newspapers; yet publications in smaller towns, and in languages other than English, also existed. Thirty newspapers published in January 1857 were located. We find records of the event at least one year after the earthquake. 
One outcome

  2. Properties of the probability distribution associated with the largest event in an earthquake cluster and their implications to foreshocks.

    PubMed

    Zhuang, Jiancang; Ogata, Yosihiko

    2006-04-01

    The space-time epidemic-type aftershock sequence model is a stochastic branching process in which earthquake activity is classified into background and clustering components and each earthquake triggers other earthquakes independently according to certain rules. This paper gives the probability distributions associated with the largest event in a cluster and their properties for all three cases: when the process is subcritical, critical, or supercritical. One of the direct uses of these probability distributions is to evaluate the probability that an earthquake is a foreshock, and the magnitude distributions of foreshocks and nonforeshock earthquakes. To verify these theoretical results, the Japan Meteorological Agency earthquake catalog is analyzed. The proportion of events that have one or more larger descendants is found to be as high as about 15% of all events. When the differences between background events and triggered events in the behavior of triggering children are considered, a background event has a probability of about 8% of being a foreshock. This probability decreases as the magnitude of the background event increases. These results, obtained from a complicated clustering model in which the characteristics of background events and triggered events differ, are consistent with the results obtained in [Ogata, Geophys. J. Int. 127, 17 (1996)] by using the conventional single-linked cluster declustering method.

  3. Superficial simplicity of the 2010 El Mayor-Cucapah earthquake of Baja California in Mexico

    USGS Publications Warehouse

    Wei, S.; Fielding, E.; Leprince, S.; Sladen, A.; Avouac, J.-P.; Helmberger, D.; Hauksson, E.; Chu, R.; Simons, M.; Hudnut, K.; Herring, T.; Briggs, R.

    2011-01-01

    The geometry of faults is usually thought to be more complicated at the surface than at depth and to control the initiation, propagation and arrest of seismic ruptures. The fault system that runs from southern California into Mexico is a simple strike-slip boundary: the west side of California and Mexico moves northwards with respect to the east. However, the Mw 7.2 2010 El Mayor-Cucapah earthquake on this fault system produced a pattern of seismic waves that indicates a far more complex source than slip on a planar strike-slip fault. Here we use geodetic, remote-sensing and seismological data to reconstruct the fault geometry and history of slip during this earthquake. We find that the earthquake produced a straight 120-km-long fault trace that cut through the Cucapah mountain range and across the Colorado River delta. However, at depth, the fault is made up of two different segments connected by a small extensional fault. Both segments strike N130°E but dip in opposite directions. The earthquake initiated on the connecting extensional fault and 15 s later ruptured the two main segments with dominantly strike-slip motion. We show that complexities in the fault geometry at depth explain well the complex pattern of radiated seismic waves. We conclude that the location and detailed characteristics of the earthquake could not have been anticipated on the basis of observations of surface geology alone.

  4. Triggered seismicity and deformation between the Landers, California, and Little Skull Mountain, Nevada, earthquakes

    USGS Publications Warehouse

    Bodin, Paul; Gomberg, Joan

    1994-01-01

    This article presents evidence for the channeling of strain energy released by the Ms = 7.4 Landers, California, earthquake within the eastern California shear zone (ECSZ). We document an increase in seismicity levels during the 22-hr period starting with the Landers earthquake and culminating 22 hr later with the Ms = 5.4 Little Skull Mountain (LSM), Nevada, earthquake. We evaluate the completeness of regional seismicity catalogs during this period and find that the continuity of post-Landers strain release within the ECSZ is even more pronounced than is evident from the catalog data. We hypothesize that regional-scale connectivity of faults within the ECSZ and LSM region is a critical ingredient in the unprecedented scale and distribution of remotely triggered earthquakes and geodetically manifest strain changes that followed the Landers earthquake. The viability of static strain changes as triggering agents is tested using numerical models. Modeling results illustrate that regional-scale fault connectivity can increase the static strain changes by approximately an order of magnitude at distances of at least 280 km, the distance between the Landers and LSM epicenters. This is possible for models that include both a network of connected faults that slip “sympathetically” and realistic levels of tectonic prestrain. Alternatively, if dynamic strains are a more significant triggering agent than static strains, ECSZ structure may still be important in determining the distribution of triggered seismic and aseismic deformation.

  5. A prototype operational earthquake loss model for California based on UCERF3-ETAS – A first look at valuation

    USGS Publications Warehouse

    Field, Edward; Porter, Keith; Milner, Kevin

    2017-01-01

    We present a prototype operational loss model based on UCERF3-ETAS, the third Uniform California Earthquake Rupture Forecast with an Epidemic Type Aftershock Sequence (ETAS) component. As such, UCERF3-ETAS represents the first earthquake forecast to relax fault segmentation assumptions and to include multi-fault ruptures, elastic rebound, and spatiotemporal clustering, all of which seem important for generating realistic and useful aftershock statistics. UCERF3-ETAS is nevertheless an approximation of the system, so usefulness will vary and potential value needs to be ascertained in the context of each application. We examine this question with respect to statewide loss estimates, exemplifying how risk can be elevated by orders of magnitude due to triggered events following various scenario earthquakes. Two important considerations are the probability gains, relative to loss likelihoods in the absence of main shocks, and the rapid decay of those gains with time. Significant uncertainties and model limitations remain, so we hope this paper will inspire similar analyses with respect to other risk metrics, to help ascertain whether operationalization of UCERF3-ETAS would be worth the considerable resources required.
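The probability gains and their rapid decay can be sketched by dividing a modified-Omori aftershock rate by a constant background rate. The parameter values below are illustrative stand-ins, not UCERF3-ETAS output:

```python
import math

def omori_rate(t_days, k=50.0, c=0.05, p=1.1):
    # Modified-Omori aftershock rate (events/day); parameters are illustrative
    return k / (t_days + c) ** p

background = 0.02  # assumed background rate, events/day

# Gain relative to the background rate decays rapidly after the main shock
for t in (1, 7, 30, 365):
    gain = omori_rate(t) / background
    print(f"day {t:>3}: rate gain ~{gain:.0f}x")
```

The gain is thousands-fold in the first days and near unity within a year, which is the qualitative behavior the abstract describes.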

  6. The Redwood Coast Tsunami Work Group: Promoting Earthquake and Tsunami Resilience on California's North Coast

    NASA Astrophysics Data System (ADS)

    Dengler, L. A.; Henderson, C.; Larkin, D.; Nicolini, T.; Ozaki, V.

    2014-12-01

    In historical times, Northern California has suffered the greatest losses from tsunamis of the contiguous 48 U.S. states. Thirty-nine tsunamis have been recorded in the region since 1933, including five that caused damage. This paper describes the Redwood Coast Tsunami Work Group (RCTWG), an organization formed in 1996 to address the tsunami threat from both near and far sources. It includes representatives from government agencies; public, private, and volunteer organizations; academic institutions; and individuals interested in working to reduce tsunami risk. The geographic isolation of the region, the absence there of scientific agencies such as the USGS and CGS, and the relatively frequent occurrence of both earthquakes and tsunamis have created a unique role for the RCTWG, with activities ranging from basic research to policy and education and outreach programs. Regional interest in tsunami issues began in the early 1990s, when there was relatively little interest in tsunamis elsewhere in the state. As a result, the group pioneered tsunami messaging and outreach programs. Beginning in 2008, the RCTWG has partnered with the National Weather Service and the California Office of Emergency Services in conducting the annual "live code" tsunami communications tests, the only area outside of Alaska to do so. In 2009, the RCTWG joined with the Southern California Earthquake Alliance and the Bay Area Earthquake Alliance to form the Earthquake Country Alliance, to promote a coordinated and consistent approach to both earthquake and tsunami preparedness throughout the state. The RCTWG has produced and promoted a variety of preparedness projects, including hazard mapping and sign placement, an annual "Earthquake - Tsunami Room" at county fairs, public service announcements and print material, assistance with TsunamiReady community recognition, and facilitation of numerous multi-agency, multidiscipline coordinated exercises and community evacuation drills. 
Nine assessment surveys from 1993 to 2013

  7. Potential earthquake faults offshore Southern California, from the eastern Santa Barbara Channel south to Dana Point

    USGS Publications Warehouse

    Fisher, M.A.; Sorlien, C.C.; Sliter, R.W.

    2009-01-01

    Urban areas in Southern California are at risk from major earthquakes, not only quakes generated by long-recognized onshore faults but also ones that occur along poorly understood offshore faults. We summarize recent research findings concerning these lesser known faults. Research by the U.S. Geological Survey during the past five years indicates that these faults from the eastern Santa Barbara Channel south to Dana Point pose a potential earthquake threat. Historical seismicity in this area indicates that, in general, offshore faults can unleash earthquakes having at least moderate (M 5-6) magnitude. Estimating the earthquake hazard in Southern California is complicated by strain partitioning and by inheritance of structures from early tectonic episodes. The three main episodes are Mesozoic through early Miocene subduction, early Miocene crustal extension coeval with rotation of the Western Transverse Ranges, and Pliocene and younger transpression related to plate-boundary motion along the San Andreas Fault. Additional complication in the analysis of earthquake hazards derives from the partitioning of tectonic strain into strike-slip and thrust components along separate but kinematically related faults. The eastern Santa Barbara Basin is deformed by large active reverse and thrust faults, and this area appears to be underlain regionally by the north-dipping Channel Islands thrust fault. These faults could produce moderate to strong earthquakes and destructive tsunamis. On the Malibu coast, earthquakes along offshore faults could have left-lateral-oblique focal mechanisms, and the Santa Monica Mountains thrust fault, which underlies the oblique faults, could give rise to large (M ≥7) earthquakes. Offshore faults near Santa Monica Bay and the San Pedro shelf are likely to produce both strike-slip and thrust earthquakes along northwest-striking faults. 
In all areas, transverse structures, such as lateral ramps and tear faults, which crosscut the main faults, could

  8. Foreshock occurrence before large earthquakes

    USGS Publications Warehouse

    Reasenberg, P.A.

    1999-01-01

    Rates of foreshock occurrence involving shallow M ≥ 6 and M ≥ 7 mainshocks and M ≥ 5 foreshocks were measured in two worldwide catalogs over ~20-year intervals. The overall rates observed are similar to ones measured in previous worldwide and regional studies when they are normalized for the ranges of magnitude difference they each span. The observed worldwide rates were compared to a generic model of earthquake clustering based on patterns of small and moderate aftershocks in California. The aftershock model was extended to the case of moderate foreshocks preceding large mainshocks. Overall, the observed worldwide foreshock rates exceed the extended California generic model by a factor of ~2. Significant differences in foreshock rate were found among subsets of earthquakes defined by their focal mechanism and tectonic region, with the rate before thrust events higher and the rate before strike-slip events lower than the worldwide average. Among the thrust events, a large majority, composed of events located in shallow subduction zones, had a high foreshock rate, while a minority, located in continental thrust belts, had a low rate. These differences may explain why previous surveys have found low foreshock rates among thrust events in California (especially southern California), while the worldwide observations suggest the opposite: California, lacking an active subduction zone in most of its territory, and including a region of mountain-building thrusts in the south, reflects the low rate apparently typical of continental thrusts, while the worldwide observations, dominated by shallow subduction zone events, are foreshock-rich. If this is so, then the California generic model may significantly underestimate the conditional probability for a very large (M ≥ 8) earthquake following a potential (M ≥ 7) foreshock in Cascadia. 
The magnitude differences among the identified foreshock-mainshock pairs in the Harvard catalog are consistent with a uniform

  9. Conditional, Time-Dependent Probabilities for Segmented Type-A Faults in the WGCEP UCERF 2

    USGS Publications Warehouse

    Field, Edward H.; Gupta, Vipin

    2008-01-01

    This appendix presents elastic-rebound-theory (ERT) motivated time-dependent probabilities, conditioned on the date of the last earthquake, for the segmented type-A fault models of the 2007 Working Group on California Earthquake Probabilities (WGCEP). These probabilities are included as one option in the WGCEP's Uniform California Earthquake Rupture Forecast 2 (UCERF 2), with the other options being time-independent Poisson probabilities and an 'Empirical' model based on observed seismicity rate changes. A more general discussion of the pros and cons of all methods for computing time-dependent probabilities, as well as the justification of those chosen for UCERF 2, is given in the main body of this report (and the 'Empirical' model is also discussed in Appendix M). What this appendix addresses is the computation of conditional, time-dependent probabilities when both single- and multi-segment ruptures are included in the model. Computing conditional probabilities is relatively straightforward when a fault is assumed to obey strict segmentation in the sense that no multi-segment ruptures occur (e.g., WGCEP (1988, 1990), or see Field (2007) for a review of all previous WGCEPs; from here we assume basic familiarity with conditional probability calculations). However, as we'll see below, the calculation is not straightforward when multi-segment ruptures are included, in essence because we are attempting to apply a point-process model to a non-point process. The next section gives a review and evaluation of the single- and multi-segment rupture probability-calculation methods used in the most recent statewide forecast for California (WGCEP UCERF 1; Petersen et al., 2007). We then present results for the methodology adopted here for UCERF 2. We finish with a discussion of issues and possible alternative approaches that could be explored and perhaps applied in the future. A fault-by-fault comparison of UCERF 2 probabilities with those of previous studies is given in the
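For a single segment under strict segmentation, the conditional calculation the appendix builds on reduces to renewal-model arithmetic: P(rupture in the next dt years, given t years since the last event) = (F(t + dt) - F(t)) / (1 - F(t)), where F is the CDF of the recurrence-time distribution. A minimal sketch with a lognormal stand-in (the WGCEP reports use related renewal distributions such as the BPT) and a hypothetical fault segment:

```python
import math

def lognormal_cdf(t, mu, sigma):
    # CDF of a lognormal recurrence-time distribution
    return 0.5 * (1.0 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2.0))))

def conditional_prob(t_elapsed, dt, mean_ri, cov):
    """P(rupture in next dt years | t_elapsed years since last rupture)."""
    # Convert mean recurrence interval and coefficient of variation
    # to the lognormal parameters mu and sigma
    sigma = math.sqrt(math.log(1.0 + cov ** 2))
    mu = math.log(mean_ri) - 0.5 * sigma ** 2
    F = lambda t: lognormal_cdf(t, mu, sigma)
    return (F(t_elapsed + dt) - F(t_elapsed)) / (1.0 - F(t_elapsed))

# Hypothetical segment: 150-yr mean recurrence, aperiodicity 0.5,
# 140 years since the last rupture, 30-yr forecast window
p = conditional_prob(140.0, 30.0, 150.0, 0.5)
print(f"30-yr conditional probability: {p:.2f}")
```

The complication the appendix addresses is precisely that this clean single-segment arithmetic breaks down once multi-segment ruptures share segments.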

  10. The HayWired Earthquake Scenario—Earthquake Hazards

    USGS Publications Warehouse

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of
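For context, the quoted 33-percent, 30-year figure can be converted to an equivalent annual rate under a time-independent Poisson assumption. The WGCEP forecast itself is time-dependent, so this is only a back-of-the-envelope translation:

```python
import math

p30 = 0.33  # 30-yr probability of a M>=6.7 Hayward Fault earthquake (WGCEP 2014)

# Solve 1 - exp(-rate * 30) = p30 for the equivalent annual Poisson rate
rate = -math.log(1.0 - p30) / 30.0

def prob_in_years(t):
    # Probability of at least one event in t years under the Poisson model
    return 1.0 - math.exp(-rate * t)

print(f"equivalent annual rate: {rate:.4f} per year")
print(f"10-yr probability: {prob_in_years(10):.2f}")
```

The same one-liner recovers the original 30-year probability, which is a quick consistency check on the conversion.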

  11. Monte Carlo Method for Determining Earthquake Recurrence Parameters from Short Paleoseismic Catalogs: Example Calculations for California

    USGS Publications Warehouse

    Parsons, Tom

    2008-01-01

    Paleoearthquake observations often lack enough events at a given site to directly define a probability density function (PDF) for earthquake recurrence. Sites with fewer than 10-15 intervals do not provide enough information to reliably determine the shape of the PDF using standard maximum-likelihood techniques [e.g., Ellsworth et al., 1999]. In this paper I present a method that attempts to fit wide ranges of distribution parameters to short paleoseismic series. From repeated Monte Carlo draws, it becomes possible to quantitatively estimate most likely recurrence PDF parameters, and a ranked distribution of parameters is returned that can be used to assess uncertainties in hazard calculations. In tests on short synthetic earthquake series, the method gives results that cluster around the mean of the input distribution, whereas maximum likelihood methods return the sample means [e.g., NIST/SEMATECH, 2006]. For short series (fewer than 10 intervals), sample means tend to reflect the median of an asymmetric recurrence distribution, possibly leading to an overestimate of the hazard should they be used in probability calculations. Therefore a Monte Carlo approach may be useful for assessing recurrence from limited paleoearthquake records. Further, the degree of functional dependence among parameters like mean recurrence interval and coefficient of variation can be established. The method is described for use with time-independent and time-dependent PDFs, and results from 19 paleoseismic sequences on strike-slip faults throughout the state of California are given.

  12. Monte Carlo method for determining earthquake recurrence parameters from short paleoseismic catalogs: Example calculations for California

    USGS Publications Warehouse

    Parsons, T.

    2008-01-01

    Paleoearthquake observations often lack enough events at a given site to directly define a probability density function (PDF) for earthquake recurrence. Sites with fewer than 10-15 intervals do not provide enough information to reliably determine the shape of the PDF using standard maximum-likelihood techniques (e.g., Ellsworth et al., 1999). In this paper I present a method that attempts to fit wide ranges of distribution parameters to short paleoseismic series. From repeated Monte Carlo draws, it becomes possible to quantitatively estimate most likely recurrence PDF parameters, and a ranked distribution of parameters is returned that can be used to assess uncertainties in hazard calculations. In tests on short synthetic earthquake series, the method gives results that cluster around the mean of the input distribution, whereas maximum likelihood methods return the sample means (e.g., NIST/SEMATECH, 2006). For short series (fewer than 10 intervals), sample means tend to reflect the median of an asymmetric recurrence distribution, possibly leading to an overestimate of the hazard should they be used in probability calculations. Therefore a Monte Carlo approach may be useful for assessing recurrence from limited paleoearthquake records. Further, the degree of functional dependence among parameters like mean recurrence interval and coefficient of variation can be established. The method is described for use with time-independent and time-dependent PDFs, and results from 19 paleoseismic sequences on strike-slip faults throughout the state of California are given.
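The two preceding records describe the same method, whose spirit can be caricatured in a few lines: draw recurrence-distribution parameters from wide ranges, score each draw against the observed intervals, and keep a ranked list. The sketch below is not Parsons' published algorithm (it amounts to a random-search likelihood fit to a lognormal recurrence PDF), and the five-interval paleoseismic series is hypothetical:

```python
import math
import random

random.seed(0)
# Hypothetical short paleoseismic record: five inter-event intervals (years)
intervals = [95.0, 180.0, 140.0, 60.0, 210.0]

def loglike(data, mean_ri, cov):
    # Log-likelihood of the intervals under a lognormal recurrence PDF
    sig2 = math.log(1.0 + cov ** 2)
    mu = math.log(mean_ri) - 0.5 * sig2
    ll = 0.0
    for t in data:
        ll += (-math.log(t * math.sqrt(2.0 * math.pi * sig2))
               - (math.log(t) - mu) ** 2 / (2.0 * sig2))
    return ll

# Monte Carlo draws over wide parameter ranges, ranked by fit to the data
draws = [(random.uniform(50.0, 400.0), random.uniform(0.2, 1.5))
         for _ in range(50000)]
ranked = sorted(draws, key=lambda d: loglike(intervals, *d), reverse=True)
best_mean, best_cov = ranked[0]
print(f"best-fit mean recurrence ~{best_mean:.0f} yr, CoV ~{best_cov:.2f}")
```

The ranked list (not just its top entry) is what makes the approach useful for propagating parameter uncertainty into hazard calculations.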

  13. Re‐estimated effects of deep episodic slip on the occurrence and probability of great earthquakes in Cascadia

    USGS Publications Warehouse

    Beeler, Nicholas M.; Roeloffs, Evelyn A.; McCausland, Wendy

    2013-01-01

    Mazzotti and Adams (2004) estimated that rapid deep slip during typically two-week-long episodes beneath northern Washington and southern British Columbia increases the probability of a great Cascadia earthquake by 30–100 times relative to the probability during the ∼58 weeks between slip events. Because the corresponding absolute probability remains very low at ∼0.03% per week, their conclusion is that though it is more likely that a great earthquake will occur during a rapid slip event than during other times, a great earthquake is unlikely to occur during any particular rapid slip event. This previous estimate used a failure model in which great earthquakes initiate instantaneously at a stress threshold. We refine the estimate, assuming a delayed failure model that is based on laboratory‐observed earthquake initiation. Laboratory tests show that failure of intact rock in shear and the onset of rapid slip on pre‐existing faults do not occur at a threshold stress. Instead, slip onset is gradual and shows a damped response to stress and loading rate changes. The characteristic time of failure depends on loading rate and effective normal stress. Using this model, the probability enhancement during the period of rapid slip in Cascadia is negligible (<10%) for effective normal stresses of 10 MPa or more and only increases by 1.5 times for an effective normal stress of 1 MPa. We present arguments that the hypocentral effective normal stress exceeds 1 MPa. In addition, the probability enhancement due to rapid slip extends into the interevent period. With this delayed failure model for effective normal stresses greater than or equal to 50 kPa, it is more likely that a great earthquake will occur between the periods of rapid deep slip than during them. Our conclusion is that great earthquake occurrence is not significantly enhanced by episodic deep slip events.

  14. Water level and strain changes preceding and following the August 4, 1985 Kettleman Hills, California, earthquake

    USGS Publications Warehouse

    Roeloffs, E.; Quilty, E.

    1997-01-01

    Two of the four wells monitored near Parkfield, California, during 1985 showed water level rises beginning three days before the M 6.1 Kettleman Hills earthquake. In one of these wells, the 3.0 cm rise was nearly unique in five years of water level data. However, in the other well, which showed a 3.8 cm rise, many other changes of comparable size have been observed. The two wells that did not display pre-earthquake rises tap partially confined aquifers that cannot sustain pressure changes due to tectonic strain having periods longer than several days. We evaluate the effect of partial aquifer confinement on the ability of these four wells to display water level changes in response to aquifer strain. Although the vertical hydraulic diffusivities cannot be determined uniquely, we can find a value of diffusivity for each site that is consistent with the site's tidal and barometric responses as well as with the rate of partial recovery of the coseismic water level drops. Furthermore, the diffusivity for one well is high enough to explain why the preseismic rise could not have been detected there. For the fourth well, the diffusivity is high enough to have reduced the size of the preseismic signal by as much as 50%, although it should still have been detectable. Imperfect confinement cannot explain the persistent water level changes in the two partially confined aquifers, but it does show that they were not due to volume strain. The pre-earthquake water level rises may have been precursors to the Kettleman Hills earthquake. If so, they probably were not caused by accelerating slip over the part of the fault plane that ruptured in that earthquake, because they are of opposite sign to the observed coseismic water level drops.

  15. Landslides triggered by the 1994 Northridge, California, earthquake

    USGS Publications Warehouse

    Harp, E.L.; Jibson, R.W.

    1996-01-01

    The 17 January 1994 Northridge, California, earthquake (Mw = 6.7) triggered more than 11,000 landslides over an area of about 10,000 km². Most of the landslides were concentrated in a 1000-km² area that included the Santa Susana Mountains and the mountains north of the Santa Clara River valley. We mapped landslides triggered by the earthquake in the field and from 1:60,000-nominal-scale aerial photography provided by the U.S. Air Force and taken the morning of the earthquake; these mapped landslides were subsequently digitized and plotted in a GIS-based format. Most of the triggered landslides were shallow (1- to 5-m thick), highly disrupted falls and slides within weakly cemented Tertiary to Pleistocene clastic sediment. Average volumes of these types of landslides were less than 1000 m³, but many had volumes exceeding 100,000 m³. The larger disrupted slides commonly had runout paths of more than 50 m, and a few traveled as far as 200 m from the bases of steep parent slopes. Deeper (>5-m thick) rotational slumps and block slides numbered in the tens to perhaps hundreds, a few of which exceeded 100,000 m³ in volume. Most of these were reactivations of previously existing landslides. The largest single landslide triggered by the earthquake was a rotational slump/block slide having a volume of 8 × 10⁶ m³. Analysis of the mapped landslide distribution with respect to variations in (1) landslide susceptibility and (2) strong shaking recorded by hundreds of instruments will form the basis of a seismic landslide hazard analysis of the Los Angeles area.

  16. Assessing Lay Understanding of Common Presentations of Earthquake Hazard Information

    NASA Astrophysics Data System (ADS)

    Thompson, K. J.; Krantz, D. H.

    2010-12-01

    The Working Group on California Earthquake Probabilities (WGCEP) includes, in its introduction to earthquake rupture forecast maps, the assertion that "In daily living, people are used to making decisions based on probabilities -- from the flip of a coin (50% probability of heads) to weather forecasts (such as a 30% chance of rain) to the annual chance of being killed by lightning (about 0.0003%)." [3] However, psychology research identifies a large gap between lay and expert perception of risk for various hazards [2], and cognitive psychologists have shown in numerous studies [1,4-6] that people neglect, distort, misjudge, or misuse probabilities, even when given strong guidelines about the meaning of numerical or verbally stated probabilities [7]. The gap between lay and expert use of probability needs to be recognized more clearly by scientific organizations such as WGCEP. This study undertakes to determine how the lay public interprets earthquake hazard information, as presented in graphical map form by the Uniform California Earthquake Rupture Forecast (UCERF), compiled by the WGCEP and other bodies including the USGS and CGS. It also explores alternate ways of presenting hazard data, to determine which presentation format most effectively translates information from scientists to the public. Participants both from California and from elsewhere in the United States are included, to determine whether familiarity -- either with the experience of an earthquake, or with the geography of the forecast area -- affects people's ability to interpret an earthquake hazards map. We hope that the comparisons between the interpretations by scientific experts and by different groups of laypeople will both enhance theoretical understanding of factors that affect information transmission and assist bodies such as the WGCEP in their laudable attempts to help people prepare themselves and their communities for possible natural hazards. [1] Kahneman, D & Tversky, A (1979). 
Prospect

  17. Radiated Seismic Energy of Earthquakes in the South-Central Region of the Gulf of California, Mexico

    NASA Astrophysics Data System (ADS)

    Castro, Raúl R.; Mendoza-Camberos, Antonio; Pérez-Vertti, Arturo

    2018-05-01

    We estimated the radiated seismic energy (ES) of 65 earthquakes located in the south-central region of the Gulf of California. Most of these events occurred along active transform faults that define the Pacific-North America plate boundary and have magnitudes between M3.3 and M5.9. We corrected the spectral records for attenuation using nonparametric S-wave attenuation functions determined with the whole data set. The path effects were isolated from the seismic source using a spectral inversion. We computed the radiated seismic energy of the earthquakes by integrating the squared velocity source spectrum and estimated their apparent stresses. We found that most events have apparent stress between 3 × 10⁻⁴ MPa and 3 MPa. Model-independent estimates of the ratio between seismic energy and moment (ES/M0) indicate that this ratio is independent of earthquake size. We conclude that in general the apparent stress is low (σa < 3 MPa) in the south-central and southern Gulf of California.
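The apparent-stress bookkeeping used above is compact enough to reproduce: sigma_a = mu * ES / M0, with seismic moment from the Hanks-Kanamori relation. The event values below are hypothetical and the shear modulus is an assumed crustal average:

```python
import math

MU = 3.0e10  # assumed crustal shear modulus, Pa

def moment_from_mw(mw):
    # Seismic moment in N*m from moment magnitude (Hanks-Kanamori relation)
    return 10.0 ** (1.5 * mw + 9.05)

def apparent_stress(es, m0):
    # Apparent stress sigma_a = mu * Es / M0, in Pa
    return MU * es / m0

# Hypothetical M4.5 event radiating 5e9 J of seismic energy
m0 = moment_from_mw(4.5)
sigma_a = apparent_stress(5.0e9, m0)
print(f"M0 = {m0:.2e} N*m")
print(f"apparent stress = {sigma_a / 1.0e6:.3f} MPa")
```

Because ES/M0 enters linearly, a roughly size-independent energy-to-moment ratio (as the abstract reports) translates directly into a roughly size-independent apparent stress.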

  18. Intensity earthquake scenario (scenario event - a damaging earthquake with higher probability of occurrence) for the city of Sofia

    NASA Astrophysics Data System (ADS)

    Aleksandrova, Irena; Simeonova, Stela; Solakov, Dimcho; Popova, Maria

    2014-05-01

    Among the many kinds of natural and man-made disasters, earthquakes dominate with regard to their social and economic impact on the urban environment. Global seismic risk is increasing steadily as urbanization and development occupy more areas that are prone to the effects of strong earthquakes. Additionally, the uncontrolled growth of megacities in highly seismic areas around the world is often associated with the construction of seismically unsafe buildings and infrastructure, and is undertaken with insufficient knowledge of the regional seismicity and seismic hazard. The assessment of seismic hazard and the generation of earthquake scenarios are the first link in the prevention chain and the first step in the evaluation of seismic risk. Earthquake scenarios are intended as a basic input for developing detailed earthquake damage scenarios for cities and can be used in earthquake-safe town and infrastructure planning. The city of Sofia is the capital of Bulgaria. It is situated in the centre of the Sofia area, the most populated (more than 1.2 million inhabitants), industrial and cultural region of Bulgaria, which faces considerable earthquake risk. The available historical documents prove the occurrence of destructive earthquakes during the 15th-18th centuries in the Sofia zone. In the 19th century the city of Sofia experienced two strong earthquakes: the 1818 earthquake with epicentral intensity I0=8-9 MSK and the 1858 earthquake with I0=9-10 MSK. During the 20th century the strongest event to occur in the vicinity of the city of Sofia was the 1917 earthquake with MS=5.3 (I0=7-8 MSK). Almost a century later (95 years), an earthquake of moment magnitude 5.6 (I0=7-8 MSK) hit the city of Sofia on May 22nd, 2012. In the present study, the deterministic scenario event considered is a damaging earthquake with a higher probability of occurrence that could affect the city with intensity less than or equal to VIII

  19. Frequency-Dependent Tidal Triggering of Low Frequency Earthquakes Near Parkfield, California

    NASA Astrophysics Data System (ADS)

    Xue, L.; Burgmann, R.; Shelly, D. R.

    2017-12-01

    The effect of small periodic stress perturbations on earthquake generation is not clear; however, the rate of low-frequency earthquakes (LFEs) near Parkfield, California, has been found to be strongly correlated with solid earth tides. Laboratory experiments and theoretical analyses show that the period of the imposed forcing and the source properties affect the sensitivity to triggering and the phase relation between the peak seismicity rate and the periodic stress, but frequency-dependent triggering has not been quantitatively explored in the field. Tidal forcing acts over a wide range of frequencies, so the sensitivity of LFEs to tidal triggering provides a good probe into the physical mechanisms affecting earthquake generation. In this study, we consider the tidal triggering of LFEs near Parkfield, California, since 2001. We find the LFE rate is correlated with tidal shear stress, normal stress rate, and shear stress rate. The occurrence of LFEs can also be independently modulated by groups of tidal constituents at semi-diurnal, diurnal, and fortnightly frequencies. The strength of the response of LFEs to the different tidal constituents varies between LFE families. Each LFE family has an optimal triggering frequency, which does not appear to be depth dependent or systematically related to other known properties. This suggests the period of the applied forcing plays an important role in the triggering process, and that the interaction of the loading history with source region properties, such as friction, effective normal stress, and pore fluid pressure, produces the observed frequency-dependent tidal triggering of LFEs.

  20. The 1868 Hayward Earthquake Alliance: A Case Study - Using an Earthquake Anniversary to Promote Earthquake Preparedness

    NASA Astrophysics Data System (ADS)

    Brocher, T. M.; Garcia, S.; Aagaard, B. T.; Boatwright, J. J.; Dawson, T.; Hellweg, M.; Knudsen, K. L.; Perkins, J.; Schwartz, D. P.; Stoffer, P. W.; Zoback, M.

    2008-12-01

    Last October 21st marked the 140th anniversary of the M6.8 1868 Hayward Earthquake, the last damaging earthquake on the southern Hayward Fault. This anniversary was used to help publicize the seismic hazards associated with the fault because: (1) the past five such earthquakes on the Hayward Fault occurred about 140 years apart on average, and (2) the Hayward-Rodgers Creek Fault system is the most likely fault in the Bay Area (with a 31 percent probability) to produce a M6.7 or greater earthquake in the next 30 years. To promote earthquake awareness and preparedness, over 140 public and private agencies and companies and many individuals joined the public-private nonprofit 1868 Hayward Earthquake Alliance (1868alliance.org). The Alliance sponsored many activities including a public commemoration at Mission San Jose in Fremont, which survived the 1868 earthquake. This event was followed by an earthquake drill at Bay Area schools involving more than 70,000 students. The anniversary prompted the Silver Sentinel, an earthquake response exercise based on the scenario of an earthquake on the Hayward Fault, conducted by Bay Area County Offices of Emergency Services. Sixty other public and private agencies also participated in this exercise. The California Seismic Safety Commission and KPIX (CBS affiliate) produced professional videos designed for school classrooms promoting Drop, Cover, and Hold On. Starting in October 2007, the Alliance and the U.S. Geological Survey held a sequence of press conferences to announce the release of new research on the Hayward Fault as well as new loss estimates for a Hayward Fault earthquake. These included: (1) a ShakeMap for the 1868 Hayward earthquake, (2) a report by the U.S. Bureau of Labor Statistics forecasting the number of employees, employers, and wages predicted to be within areas most strongly shaken by a Hayward Fault earthquake, (3) new estimates of the losses associated with a Hayward Fault earthquake, (4) new ground motion

  1. ERTS Applications in earthquake research and mineral exploration in California

    NASA Technical Reports Server (NTRS)

    Abdel-Gawad, M.; Silverstein, J.

    1973-01-01

    Examples are presented showing that ERTS imagery can be effectively utilized to identify, locate, and map faults which show geomorphic evidence of geologically recent breakage. Several important faults not previously known have been identified. By plotting epicenters of historic earthquakes in parts of California, Sonora, Mexico, Arizona, and Nevada, we found that areas known for historic seismicity are often characterized by abundant evidence of recent fault and crustal movements. There are also many examples of seismically quiet areas where outstanding evidence of recent fault movements is observed. One application is clear: ERTS-1 imagery could be effectively utilized to delineate areas susceptible to earthquake recurrence which, on the basis of seismic data alone, might be misleadingly considered safe. ERTS data can also be utilized in planning new sites in the geophysical network of fault movement monitoring and strain and tilt measurements.

  2. Earthquake epicenters and fault intersections in central and southern California

    NASA Technical Reports Server (NTRS)

    Abdel-Gawad, M. (Principal Investigator); Silverstein, J.

    1972-01-01

    The author has identified the following significant results. ERTS-1 imagery provided evidence for the existence of short transverse fault segments lodged between faults of the San Andreas system in the Coast Ranges, California. They indicate that an early episode of transverse shear affected the Coast Ranges prior to the establishment of the present San Andreas fault. The fault has been offset by transverse faults of the Transverse Ranges. It appears feasible to identify from ERTS-1 imagery geomorphic criteria of recent fault movements. Plots of historic earthquakes in the Coast Ranges and western Transverse Ranges show clusters in areas where structures are complicated by the interaction of two active fault systems. A fault lineament apparently not previously mapped was identified in the Uinta Mountains, Utah. Part of the lineament shows evidence of recent faulting which corresponds to a moderate earthquake cluster.

  3. Great East Japan earthquake, JR East mitigation successes, and lessons for California high-speed rail.

    DOT National Transportation Integrated Search

    2015-04-01

    California and Japan both experience frequent seismic activity, which is often damaging to infrastructure. Seismologists have developed systems for detecting and analyzing earthquakes in real-time. JR East has developed systems to mitigate the da...

  4. A public health issue related to collateral seismic hazards: The valley fever outbreak triggered by the 1994 Northridge, California earthquake

    USGS Publications Warehouse

    Jibson, R.W.

    2002-01-01

    Following the 17 January 1994 Northridge, California earthquake (M = 6.7), Ventura County, California, experienced a major outbreak of coccidioidomycosis (CM), commonly known as valley fever, a respiratory disease contracted by inhaling airborne fungal spores. In the 8 weeks following the earthquake (24 January through 15 March), 203 outbreak-associated cases were reported, which is about an order of magnitude more than the expected number of cases, and three of these cases were fatal. Simi Valley, in easternmost Ventura County, had the highest attack rate in the county, and the attack rate decreased westward across the county. The temporal and spatial distribution of CM cases indicates that the outbreak resulted from inhalation of spore-contaminated dust generated by earthquake-triggered landslides. Canyons northeast of Simi Valley produced many highly disrupted, dust-generating landslides during the earthquake and its aftershocks. Winds after the earthquake were from the northeast, which transported dust into Simi Valley and beyond to communities to the west. The three fatalities from the CM epidemic accounted for 4 percent of the total earthquake-related fatalities.

  5. A Public Health Issue Related To Collateral Seismic Hazards: The Valley Fever Outbreak Triggered By The 1994 Northridge, California Earthquake

    NASA Astrophysics Data System (ADS)

    Jibson, Randall W.

    Following the 17 January 1994 Northridge, California earthquake (M = 6.7), Ventura County, California, experienced a major outbreak of coccidioidomycosis (CM), commonly known as valley fever, a respiratory disease contracted by inhaling airborne fungal spores. In the 8 weeks following the earthquake (24 January through 15 March), 203 outbreak-associated cases were reported, which is about an order of magnitude more than the expected number of cases, and three of these cases were fatal. Simi Valley, in easternmost Ventura County, had the highest attack rate in the county, and the attack rate decreased westward across the county. The temporal and spatial distribution of CM cases indicates that the outbreak resulted from inhalation of spore-contaminated dust generated by earthquake-triggered landslides. Canyons northeast of Simi Valley produced many highly disrupted, dust-generating landslides during the earthquake and its aftershocks. Winds after the earthquake were from the northeast, which transported dust into Simi Valley and beyond to communities to the west. The three fatalities from the CM epidemic accounted for 4 percent of the total earthquake-related fatalities.

  6. Comparison of the different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com; Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr

    In this study we examined and compared three different probability distributions to determine the most suitable model for probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 with magnitude M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions, namely the Weibull distribution, the Frechet distribution, and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probabilities and the conditional probabilities of earthquake occurrence for different elapsed times using these three distributions. We used EasyFit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution was the most suitable of the three distributions in this region.
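    The fitting-and-testing workflow described above can be sketched in a few lines. The sketch below uses synthetic inter-event times and scipy rather than the study's catalog and EasyFit/Matlab, so the fitted parameters and test statistics are purely illustrative.

```python
# Illustrative sketch: fit candidate recurrence-time distributions and compare
# them with the Kolmogorov-Smirnov goodness-of-fit test. The inter-event times
# are synthetic, not the North Anatolian Fault catalog used in the study.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
intervals = rng.weibull(1.5, size=60) * 12.0  # synthetic inter-event times (yr)

candidates = {
    "weibull_2p": stats.weibull_min,  # 2-parameter Weibull (location fixed at 0)
    "frechet": stats.invweibull,      # Frechet is scipy's inverse Weibull
    "weibull_3p": stats.weibull_min,  # 3-parameter Weibull (location free)
}

for name, dist in candidates.items():
    if name == "weibull_3p":
        params = dist.fit(intervals)          # shape, loc, scale all free
    else:
        params = dist.fit(intervals, floc=0)  # fix location at zero
    # Smaller K-S statistic D (larger p) indicates a better fit.
    ks_stat, p_value = stats.kstest(intervals, dist.cdf, args=params)
    print(f"{name:10s} K-S D = {ks_stat:.3f}, p = {p_value:.3f}")
```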

  7. Web Services and Other Enhancements at the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.

    2012-12-01

    The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, or MiniSEED depending on the service, and are compatible with the equivalent IRIS DMC web services. The NCEDC is currently providing the following Web Services: (1) station inventory and channel response information delivered in StationXML format, (2) channel response information delivered in RESP format, (3) time series availability delivered in text and XML formats, (4) single channel and bulk data requests delivered in MiniSEED format. The NCEDC is also developing a rich Earthquake Catalog Web Service to allow users to query earthquake catalogs based on selection parameters such as time, location or geographic region, magnitude, depth, azimuthal gap, and rms. It will return (in QuakeML format) user-specified results that can include simple earthquake parameters, as well as observations such as phase arrivals, codas, amplitudes, and computed parameters such as first motion mechanisms, moment tensors, and rupture length. The NCEDC will work with both IRIS and the International Federation of Digital Seismograph Networks (FDSN) to define a uniform set of web service specifications that can be implemented by multiple data centers to provide users with a common data interface across data centers. The NCEDC now hosts earthquake catalogs and waveforms from the US Department of Energy (DOE) Enhanced Geothermal Systems (EGS) monitoring networks. These
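    Because the abstract notes compatibility with the equivalent IRIS DMC web services, a query can be composed in the standard FDSN style. The base URL and parameter choices below are assumptions for illustration only; the data center's own documentation is authoritative.

```python
# Hedged sketch: composing an FDSN-style station-service query of the kind the
# NCEDC describes. The endpoint and parameters here are assumptions, not a
# documented NCEDC interface.
import urllib.parse

BASE = "https://service.ncedc.org/fdsnws/station/1/query"  # assumed endpoint

params = {
    "network": "BK",   # example network code (assumed)
    "level": "channel",
    "format": "xml",   # StationXML, as the abstract describes
}
url = BASE + "?" + urllib.parse.urlencode(params)
print(url)  # fetch with urllib.request.urlopen(url) when online
```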

  8. Earthquakes, September-October 1984

    USGS Publications Warehouse

    Person, W.J.

    1985-01-01

    In the United States, Wyoming experienced a couple of moderate earthquakes, and off the coast of northern California, a strong earthquake shook much of the northern coast of California and parts of the Oregon coast. 

  9. Earthquake recurrence models and occurrence probabilities of strong earthquakes in the North Aegean Trough (Greece)

    NASA Astrophysics Data System (ADS)

    Christos, Kourouklas; Eleftheria, Papadimitriou; George, Tsaklidis; Vassilios, Karakostas

    2018-06-01

    The determination of strong earthquakes' recurrence time above a predefined magnitude, associated with specific fault segments, is an important component of seismic hazard assessment. The occurrence of these earthquakes is neither periodic nor completely random but often clustered in time. This fact, in connection with their limited number due to the shortness of the available catalogs, inhibits a deterministic approach to recurrence time calculation, and for this reason, application of stochastic processes is required. In this study, recurrence time determination in the area of the North Aegean Trough (NAT) is developed through the application of time-dependent stochastic models, introducing an elastic rebound motivated concept for individual fault segments located in the study area. For this purpose, all the available information on strong earthquakes (historical and instrumental) with Mw ≥ 6.5 is compiled and examined for magnitude completeness. Two possible starting dates of the catalog are assumed with the same magnitude threshold, Mw ≥ 6.5, and the data are divided into five data sets according to a new segmentation model for the study area. Three Brownian Passage Time (BPT) models with different levels of aperiodicity are applied and evaluated with the Anderson-Darling test for each segment, for both catalog versions where possible. The preferred models are then used to estimate the occurrence probabilities of Mw ≥ 6.5 shocks on each segment of the NAT for the next 10, 20, and 30 years from 01/01/2016. Uncertainties in the probability calculations are also estimated using a Monte Carlo procedure. It must be mentioned that the provided results should be treated carefully because of their dependence on the initial assumptions, which exhibit large variability; alternative choices may yield different final results.
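    A minimal sketch of the BPT conditional-probability step described above, using scipy's inverse-Gaussian distribution (the BPT model is the inverse Gaussian); the mean recurrence, aperiodicity, and elapsed time below are invented values, not the paper's estimates.

```python
# Sketch of a Brownian Passage Time (BPT) conditional-probability calculation.
# With mean recurrence mu and aperiodicity alpha, the BPT model corresponds to
# scipy's invgauss(alpha**2, scale=mu / alpha**2): its mean is mu and its
# coefficient of variation is alpha.
from scipy.stats import invgauss

def bpt_conditional_prob(mu, alpha, elapsed, horizon):
    """P(event in (elapsed, elapsed + horizon] | no event in first `elapsed` yr)."""
    dist = invgauss(alpha**2, scale=mu / alpha**2)
    survive = dist.sf(elapsed)  # P(T > elapsed)
    return (dist.cdf(elapsed + horizon) - dist.cdf(elapsed)) / survive

# Illustrative values: mean recurrence 150 yr, aperiodicity 0.5,
# 100 yr already elapsed, probability over the next 30 yr.
p30 = bpt_conditional_prob(mu=150.0, alpha=0.5, elapsed=100.0, horizon=30.0)
print(f"30-yr conditional probability: {p30:.2f}")
```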

  10. Quasi-periodic recurrence of large earthquakes on the southern San Andreas fault

    USGS Publications Warehouse

    Scharer, Katherine M.; Biasi, Glenn P.; Weldon, Ray J.; Fumal, Tom E.

    2010-01-01

    It has been 153 yr since the last large earthquake on the southern San Andreas fault (California, United States), but the average interseismic interval is only ~100 yr. If the recurrence of large earthquakes is periodic, rather than random or clustered, the length of this period is notable and would generally increase the risk estimated in probabilistic seismic hazard analyses. Unfortunately, robust characterization of a distribution describing earthquake recurrence on a single fault is limited by the brevity of most earthquake records. Here we use statistical tests on a 3000 yr combined record of 29 ground-rupturing earthquakes from Wrightwood, California. We show that earthquake recurrence there is more regular than expected from a Poisson distribution and is not clustered, leading us to conclude that recurrence is quasi-periodic. The observation of unimodal time dependence is persistent across an observationally based sensitivity analysis that critically examines alternative interpretations of the geologic record. The results support formal forecast efforts that use renewal models to estimate probabilities of future earthquakes on the southern San Andreas fault. Only four intervals (15%) from the record are longer than the present open interval, highlighting the current hazard posed by this fault.
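    The contrast drawn here between Poisson and quasi-periodic recurrence can be made concrete with a small calculation. The aperiodicity value and the choice of a lognormal renewal model below are assumptions for illustration, not the paper's analysis; they simply show why an open interval longer than the mean raises the conditional probability under a renewal model but not under a Poisson model.

```python
# Illustrative comparison: time-independent Poisson vs. a quasi-periodic
# renewal model, for a ~100 yr mean interval and a 153 yr open interval.
import math
from scipy.stats import lognorm

mean_T, open_interval, horizon = 100.0, 153.0, 30.0

# Poisson: memoryless, so the open interval does not matter.
p_poisson = 1.0 - math.exp(-horizon / mean_T)

# Lognormal renewal with a modest coefficient of variation (quasi-periodic).
cv = 0.5                                 # assumed aperiodicity
sigma = math.sqrt(math.log(1 + cv**2))
scale = mean_T / math.exp(sigma**2 / 2)  # chosen so the mean equals mean_T
dist = lognorm(sigma, scale=scale)
p_renewal = (dist.cdf(open_interval + horizon)
             - dist.cdf(open_interval)) / dist.sf(open_interval)

print(f"Poisson: {p_poisson:.2f}, renewal: {p_renewal:.2f}")  # renewal > Poisson
```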

  11. The Active Fault Parameters for Time-Dependent Earthquake Hazard Assessment in Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, Y.; Cheng, C.; Lin, P.; Shao, K.; Wu, Y.; Shih, C.

    2011-12-01

    Taiwan is located at the boundary between the Philippine Sea Plate and the Eurasian Plate, with a convergence rate of ~80 mm/yr in a ~N118E direction. The plate motion is so active that earthquakes are very frequent. In the Taiwan area, disaster-inducing earthquakes often result from active faults. For this reason, understanding the activity and hazard of active faults is an important subject. The active faults in Taiwan are mainly located in the Western Foothills and the Eastern Longitudinal Valley. The active fault distribution map published by the Central Geological Survey (CGS) in 2010 shows that there are 31 active faults on the island of Taiwan, some of which are related to past earthquakes. Many researchers have investigated these active faults and continuously update new data and results, but few have integrated them for time-dependent earthquake hazard assessment. In this study, we gather previous research and field work results and integrate these data into an active fault parameter table for time-dependent earthquake hazard assessment. We gather the seismic profiles or earthquake relocations for a fault and combine them with the fault trace on land to establish a 3D fault geometry model in a GIS system. We collect research on fault source scaling in Taiwan and estimate the maximum magnitude from fault length or fault area. We use the characteristic earthquake model to evaluate the active fault earthquake recurrence interval. For the other parameters, we collect previous studies and historical references to complete our parameter table of active faults in Taiwan. WG08 performed the time-dependent earthquake hazard assessment of active faults in California: they established fault models, deformation models, earthquake rate models, and probability models, and then computed the probabilities for faults in California. 
Following these steps, we have preliminarily evaluated the probability of earthquake-related hazards in certain

  12. Earthquake Swarm Along the San Andreas Fault near Palmdale, Southern California, 1976 to 1977.

    PubMed

    McNally, K C; Kanamori, H; Pechmann, J C; Fuis, G

    1978-09-01

    Between November 1976 and November 1977 a swarm of small earthquakes (local magnitude ≤ 3) occurred on or near the San Andreas fault near Palmdale, California. This swarm was the first observed along this section of the San Andreas since cataloging of instrumental data began in 1932. The activity followed partial subsidence of the 35-centimeter vertical crustal uplift known as the Palmdale bulge along this "locked" section of the San Andreas, which last broke in the great (surface-wave magnitude = 8 1/4+) 1857 Fort Tejon earthquake. The swarm events exhibit characteristics previously observed for some foreshock sequences, such as tight clustering of hypocenters and time-dependent rotations of stress axes inferred from focal mechanisms. However, because of our present lack of understanding of the processes that precede earthquake faulting, the implications of the swarm for future large earthquakes on the San Andreas fault are unknown.

  13. Earthquake swarm along the San Andreas fault near Palmdale, Southern California, 1976 to 1977

    USGS Publications Warehouse

    Mcnally, K.C.; Kanamori, H.; Pechmann, J.C.; Fuis, G.

    1978-01-01

    Between November 1976 and November 1977 a swarm of small earthquakes (local magnitude ≤ 3) occurred on or near the San Andreas fault near Palmdale, California. This swarm was the first observed along this section of the San Andreas since cataloging of instrumental data began in 1932. The activity followed partial subsidence of the 35-centimeter vertical crustal uplift known as the Palmdale bulge along this "locked" section of the San Andreas, which last broke in the great (surface-wave magnitude = 8 1/4+) 1857 Fort Tejon earthquake. The swarm events exhibit characteristics previously observed for some foreshock sequences, such as tight clustering of hypocenters and time-dependent rotations of stress axes inferred from focal mechanisms. However, because of our present lack of understanding of the processes that precede earthquake faulting, the implications of the swarm for future large earthquakes on the San Andreas fault are unknown. Copyright © 1978 AAAS.

  14. ViscoSim Earthquake Simulator

    USGS Publications Warehouse

    Pollitz, Fred

    2012-01-01

    Synthetic seismicity simulations have been explored by the Southern California Earthquake Center (SCEC) Earthquake Simulators Group in order to guide long‐term forecasting efforts related to the Unified California Earthquake Rupture Forecast (Tullis et al., 2012a). In this study I describe the viscoelastic earthquake simulator (ViscoSim) of Pollitz (2009). Recapitulating to a large extent material previously presented by Pollitz (2009, 2011), I describe its implementation of synthetic ruptures and how it differs from the other simulators being used by the group.

  15. How does the 2010 El Mayor - Cucapah Earthquake Rupture Connect to the Southern California Plate Boundary Fault System

    NASA Astrophysics Data System (ADS)

    Donnellan, A.; Ben-Zion, Y.; Arrowsmith, R.

    2016-12-01

    The Pacific - North American plate boundary in southern California is marked by several major strike-slip faults. The 2010 M7.2 El Mayor - Cucapah earthquake ruptured 120 km of upper crust in Baja California, extending to the US-Mexico border. The earthquake triggered slip along an extensive network of faults in the Salton Trough, from the Mexican border to the southern end of the San Andreas fault. Earthquakes >M5 were triggered in the gap between the Laguna Salada and Elsinore faults at Ocotillo and on the Coyote Creek segment of the San Jacinto fault 20 km northwest of Borrego Springs. UAVSAR observations, collected since October 2009, measure slip associated with the M5.7 Ocotillo aftershock, with deformation continuing into 2014. The Elsinore fault has been remarkably quiet, however, with only M5.0 and M5.2 earthquakes occurring on the Coyote Mountains segment of the fault, in 1940 and 1968 respectively. In contrast, the Imperial Valley has been quite active historically, with numerous moderate events occurring since 1935. Moderate event activity is increasing along the San Jacinto fault zone (SJFZ), especially in the trifurcation area, where 6 of the 12 historic earthquakes in this 20 km long fault zone have occurred since 2000. However, no recent deformation has been detected in this area using UAVSAR measurements, including for the recent M5.2 June 2016 Borrego earthquake. Does the El Mayor - Cucapah rupture connect to and transfer stress primarily to a single southern California fault, or to several? What is its role relative to the background plate motion? UAVSAR observations indicate that the southward extension of the Elsinore fault has recently experienced the most localized deformation. Seismicity suggests that the San Jacinto fault is more active than neighboring major faults, and geologic evidence suggests that the southern San Andreas fault has been the major plate boundary fault in southern California. 
Topographic data with 3-4 cm resolution using structure from motion from

  16. Development of damage probability matrices based on Greek earthquake damage data

    NASA Astrophysics Data System (ADS)

    Eleftheriadou, Anastasia K.; Karabinis, Athanasios I.

    2011-03-01

    A comprehensive study is presented for empirical seismic vulnerability assessment of typical structural types, representative of the building stock of Southern Europe, based on a large set of damage statistics. The observational database was obtained from post-earthquake surveys carried out in the area struck by the September 7, 1999 Athens earthquake. After analysis of the collected observational data, a unified damage database was created comprising 180,945 damaged buildings from the near-field area of the earthquake. The damaged buildings are classified into specific structural types, according to the materials, seismic codes and construction techniques in Southern Europe. The seismic demand is described in terms of both the regional macroseismic intensity and the ratio αg/ao, where αg is the maximum peak ground acceleration (PGA) of the earthquake event and ao is the unique PGA value that characterizes each municipality on the Greek hazard map. The relative and cumulative frequencies of the different damage states for each structural type and each intensity level are computed in terms of damage ratio. Damage probability matrices (DPMs) and vulnerability curves are obtained for specific structural types. A comparative analysis is performed between the produced and the existing vulnerability models.
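    The core DPM computation, relative damage-state frequencies per intensity level, can be sketched as follows. The building counts are invented for illustration and do not come from the Athens survey database.

```python
# Minimal sketch of how a damage probability matrix (DPM) is assembled from
# survey counts: for one structural type, the relative frequency of each
# damage state at each intensity level. Counts below are invented.
import numpy as np

damage_states = ["none", "slight", "moderate", "heavy", "collapse"]
# Rows: intensity levels VII and VIII; columns: building counts per damage state.
counts = np.array([
    [500, 300, 150, 40, 10],    # intensity VII
    [200, 250, 300, 150, 100],  # intensity VIII
], dtype=float)

dpm = counts / counts.sum(axis=1, keepdims=True)       # relative frequencies
cumulative = np.cumsum(dpm[:, ::-1], axis=1)[:, ::-1]  # P(damage >= state)

for label, row in zip(["VII", "VIII"], dpm):
    print(label, " ".join(f"{p:.2f}" for p in row))
```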

  17. The Loma Prieta, California, Earthquake of October 17, 1989: Strong Ground Motion and Ground Failure

    USGS Publications Warehouse

    Coordinated by Holzer, Thomas L.

    1992-01-01

    Professional Paper 1551 describes the effects at the land surface caused by the Loma Prieta earthquake. These effects include: the pattern and characteristics of strong ground shaking, liquefaction of both floodplain deposits along the Pajaro and Salinas Rivers in the Monterey Bay region and sandy artificial fills along the margins of San Francisco Bay, landslides in the epicentral region, and increased stream flow. Some significant findings and their impacts were: * Strong shaking, amplified by a factor of about two by soft soils, caused damage up to 100 kilometers (60 miles) from the epicenter. * Instrumental recordings of the ground shaking have been used to improve how building codes consider site amplification effects from soft soils. * Liquefaction at 134 locations caused $99.2 million of the total earthquake loss of $5.9 billion. Liquefaction of floodplain deposits and sandy artificial fills was similar in nature to that which occurred in the 1906 San Francisco earthquake and indicated that many areas remain susceptible to liquefaction damage in the San Francisco and Monterey Bay regions. * Landslides caused $30 million in earthquake losses, damaging at least 200 residences. Many landslides showed evidence of movement in previous earthquakes. * Recognition of the similarities between liquefaction and landslides in 1906 and 1989, and research in intervening years that established methodologies to map liquefaction and landslide hazards, prompted the California legislature to pass in 1990 the Seismic Hazards Mapping Act, which required the California Geological Survey to delineate regulatory zones of areas potentially susceptible to these hazards. * The earthquake caused the flow of many streams in the epicentral region to increase. Effects were noted up to 88 km from the epicenter. * Post-earthquake studies of the Marina District of San Francisco provide perhaps the most comprehensive case history of earthquake effects at a specific site developed for

  18. Stress/strain changes and triggered seismicity following the MW7.3 Landers, California, earthquake

    USGS Publications Warehouse

    Gomberg, J.

    1996-01-01

    Calculations of dynamic stresses and strains, constrained by broadband seismograms, are used to investigate their role in generating the remotely triggered seismicity that followed the June 28, 1992, MW 7.3 Landers, California earthquake. I compare straingrams and dynamic Coulomb failure functions calculated for the Landers earthquake at sites that did experience triggered seismicity with those at sites that did not. Bounds on triggering thresholds are obtained from analysis of dynamic strain spectra calculated for the Landers and MW 6.1 Joshua Tree, California, earthquakes at various sites, combined with results of static strain investigations by others. I interpret three principal results of this study together with those of a companion study by Gomberg and Davis [this issue]. First, the dynamic elastic stress changes themselves cannot explain the spatial distribution of triggered seismicity, particularly the lack of triggered activity along the San Andreas fault system. In addition to the requirement to exceed a Coulomb failure stress level, this result implies the need to invoke and satisfy the requirements of an appropriate slip instability theory. Second, results of this study are consistent with the existence of frequency- or rate-dependent stress/strain triggering thresholds, inferred from the companion study and interpreted in terms of earthquake initiation involving a competition of processes, one promoting failure and the other inhibiting it. Such competition is also part of relevant instability theories. Third, the triggering threshold must vary from site to site, suggesting that the potential for triggering strongly depends on site characteristics and response. 
The lack of triggering along the San Andreas fault system may be correlated with the advanced maturity of its fault gouge zone; the strains from the Landers earthquake were either insufficient to exceed its larger critical slip distance or some other critical failure parameter; or the faults failed stably as
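    The Coulomb failure function at the center of this analysis can be sketched as follows; the effective friction coefficient and the sinusoidal stress histories below are placeholders for illustration, not values derived from the Landers seismograms.

```python
# Sketch of a (dynamic) Coulomb failure function of the kind used in such
# triggering studies: CFF(t) = tau(t) + mu_eff * sigma_n(t), with normal
# stress tension-positive, so positive CFF promotes failure. The stress
# histories are synthetic sinusoids, not seismogram-derived values.
import numpy as np

def coulomb_failure_function(shear, normal, mu_eff=0.4):
    """Shear and normal stress changes (MPa, tension positive) -> CFF (MPa)."""
    return shear + mu_eff * normal

t = np.linspace(0.0, 30.0, 301)             # time (s)
shear = 0.1 * np.sin(2 * np.pi * t / 10.0)  # synthetic dynamic shear stress
normal = 0.05 * np.sin(2 * np.pi * t / 10.0 + 0.5)
cff = coulomb_failure_function(shear, normal)
print(f"peak dynamic CFF: {cff.max():.3f} MPa")
```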

  19. Multi-sensor Integration of Space and Ground Observations of Pre-earthquake Anomalies Associated with M6.0, August 24, 2014 Napa, California

    NASA Astrophysics Data System (ADS)

    Ouzounov, Dimitar; Tramutoli, Valerio; Pulinets, Sergey; Liu, Tiger; Filizzola, Carolina; Genzano, Nicola; Lisi, Mariano; Petrov, Leonid; Kafatos, Menas

    2015-04-01

    We integrate multiple spaceborne and ground sensors for monitoring pre-earthquake geophysical anomalies that can provide significant early notification for earthquakes larger than M5.5 worldwide. The latest M6.0 event of August 24, 2014 in South Napa, California generated pre-earthquake signatures during our ongoing tests for California, and an experimental warning was documented about 17 days in advance. We process, in a controlled environment, different satellite and ground data for California (and several other test areas) by using: a) data from the NPOES sensors recording OLR (Outgoing Longwave Radiation) in the infrared; b) 2/GNSS, FORMOSAT (GPS/TEC); c) Earth Observing System assimilation models from NASA; d) ground-based gas observations and meteorological data; e) TIR (Thermal Infrared) data from geostationary satellite (GOES). On Aug 4th, we detected (prospectively) a large anomaly of the OLR transient field at the TOA over Northern California. The location was shifted about 150 km to the northeast of the Aug 23rd epicentral area. Compared to the reference field of August 2004 to 2014, the hotspot anomaly was the largest energy flux anomaly over the entire continental United States at that time. Based on the temporal and spatial estimates of the anomaly, on August 4th we issued an internal warning for a M5.5+ earthquake in Northern California within the next 1-4 weeks. TIR retrospective analysis showed significant (spatially extended and temporally persistent) sequences of TIR anomalies starting August 1st in the future epicentral area, approximately the same area affected by the OLR anomalies in the following days. GPS/TEC retrospective analysis based on GIM and TGIM products shows anomalous TEC variations 1-3 days before the event over a region north of the Napa earthquake epicenter. 
The calculated index of atmospheric chemical potential, based on the NASA GEOS-5 numerical assimilation weather model, indicates abnormal variations near the epicentral area days

  20. Some facts about aftershocks to large earthquakes in California

    USGS Publications Warehouse

    Jones, Lucile M.; Reasenberg, Paul A.

    1996-01-01

    Earthquakes occur in clusters. After one earthquake happens, we usually see others at nearby (or identical) locations. To talk about this phenomenon, seismologists coined three terms: foreshock, mainshock, and aftershock. In any cluster of earthquakes, the one with the largest magnitude is called the mainshock; earthquakes that occur before the mainshock are called foreshocks, while those that occur after the mainshock are called aftershocks. A mainshock will be redefined as a foreshock if a subsequent event in the cluster has a larger magnitude. Aftershock sequences follow predictable patterns. That is, a sequence of aftershocks follows certain global patterns as a group, but the individual earthquakes comprising the group are random and unpredictable. This relationship between the pattern of a group and the randomness (stochastic nature) of the individuals has a close parallel in actuarial statistics. We can describe the pattern that aftershock sequences tend to follow with well-constrained equations. However, we must keep in mind that the actual aftershocks are only probabilistically described by these equations. Once the parameters in these equations have been estimated, we can determine the probability of aftershocks occurring in various space, time and magnitude ranges as described below. Clustering of earthquakes usually occurs near the location of the mainshock. The stress on the mainshock's fault changes drastically during the mainshock, and that fault produces most of the aftershocks. This causes a change in the regional stress, the size of which decreases rapidly with distance from the mainshock. Sometimes the change in stress caused by the mainshock is great enough to trigger aftershocks on other, nearby faults. While there is no hard "cutoff" distance beyond which an earthquake is totally incapable of triggering an aftershock, the vast majority of aftershocks are located close to the mainshock. As a rule of thumb, we consider earthquakes to be
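    The "well-constrained equations" referred to above are of the modified Omori type; a hedged sketch of that style of calculation, using the oft-quoted generic California parameter values rather than estimates from this paper, is:

```python
# Hedged sketch of a modified-Omori / Reasenberg-Jones style calculation: the
# expected number of aftershocks with magnitude >= m in a time window after a
# mainshock, and the Poisson probability of at least one. The parameter values
# (a, b, p, c) are the commonly quoted "generic California" values; treat them
# as placeholders, not this paper's estimates.
import math

def expected_aftershocks(mainshock_mag, m, t1, t2, a=-1.67, b=0.91, p=1.08, c=0.05):
    """Integral of 10**(a + b*(M - m)) * (t + c)**(-p) from t1 to t2 (t in days)."""
    rate_amp = 10.0 ** (a + b * (mainshock_mag - m))
    integral = ((t2 + c) ** (1.0 - p) - (t1 + c) ** (1.0 - p)) / (1.0 - p)
    return rate_amp * integral

def prob_one_or_more(mainshock_mag, m, t1, t2):
    """Poisson probability of at least one aftershock >= m in (t1, t2]."""
    return 1.0 - math.exp(-expected_aftershocks(mainshock_mag, m, t1, t2))

# e.g. probability of an M >= 5 aftershock in the week after an M7 mainshock
print(f"{prob_one_or_more(7.0, 5.0, 0.0, 7.0):.2f}")
```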

  1. Seismology program; California Division of Mines and Geology

    USGS Publications Warehouse

    Sherburne, R. W.

    1981-01-01

    The year 1980 marked the centennial of the California Division of Mines and Geology (CDMG) and a decade of the Division's involvement in seismology. Factors which contributed to the formation of a Seismology Group within CDMG included increased concerns for environmental and earthquake safety, interest in earthquake prediction, the 1971 San Fernando earthquake, and the 1973 publication by CDMG of an urban geology master plan for California. Reasons to be concerned about California's earthquake problem are demonstrated by the accompanying table and the figures. Recent seismicity in California, the Southern California uplift reflecting changes in crustal strain, and other possible earthquake precursors have heightened concern among scientific and governmental groups about the possible occurrence of a major damaging earthquake (M>7) in California.

  2. Media exposure related to the 2008 Sichuan Earthquake predicted probable PTSD among Chinese adolescents in Kunming, China: A longitudinal study.

    PubMed

    Yeung, Nelson C Y; Lau, Joseph T F; Yu, Nancy Xiaonan; Zhang, Jianping; Xu, Zhening; Choi, Kai Chow; Zhang, Qi; Mak, Winnie W S; Lui, Wacy W S

    2018-03-01

    This study examined the prevalence and the psychosocial predictors of probable PTSD among Chinese adolescents in Kunming (approximately 444 miles from the epicenter), China, who were indirectly exposed to the Sichuan Earthquake in 2008. Using a longitudinal study design, primary and secondary school students (N = 3577) in Kunming completed questionnaires at baseline (June 2008) and 6 months afterward (December 2008) in classroom settings. Participants' exposure to earthquake-related imagery and content, perceptions and emotional reactions related to the earthquake, and posttraumatic stress symptoms were measured. Univariate and forward stepwise multivariable logistic regression models were fit to identify significant predictors of probable PTSD at the 6-month follow-up. The prevalences of probable PTSD (defined as a Children's Revised Impact of Event Scale score ≥30) at baseline and 6-month follow-up were 16.9% and 11.1%, respectively. In the multivariable analysis, participants who were frequently exposed to distressful imagery, had experienced at least two types of negative life events, perceived that teachers were distressed due to the earthquake, believed that the earthquake resulted from damage to the ecosystem, or felt apprehensive and emotionally disturbed due to the earthquake had a higher risk of probable PTSD at the 6-month follow-up (all ps < .05). Exposure to distressful media images, emotional responses, and disaster-related perceptions at baseline were predictive of probable PTSD several months after indirect exposure to the event. Parents, teachers, and the mass media should be aware of the negative impacts of disaster-related media exposure on adolescents' psychological health.

  3. Geometry and earthquake potential of the shoreline fault, central California

    USGS Publications Warehouse

    Hardebeck, Jeanne L.

    2013-01-01

    The Shoreline fault is a vertical strike‐slip fault running along the coastline near San Luis Obispo, California. Much is unknown about the Shoreline fault, including its slip rate and the details of its geometry. Here, I study the geometry of the Shoreline fault at seismogenic depth, as well as the adjacent section of the offshore Hosgri fault, using seismicity relocations and earthquake focal mechanisms. The Optimal Anisotropic Dynamic Clustering (OADC) algorithm (Ouillon et al., 2008) is used to objectively identify the simplest planar fault geometry that fits all of the earthquakes to within their location uncertainty. The OADC results show that the Shoreline fault is a single continuous structure that connects to the Hosgri fault. Discontinuities smaller than about 1 km may be undetected, but would be too small to be barriers to earthquake rupture. The Hosgri fault dips steeply to the east, while the Shoreline fault is essentially vertical, so the Hosgri fault dips towards and under the Shoreline fault as the two faults approach their intersection. The focal mechanisms generally agree with pure right‐lateral strike‐slip on the OADC planes, but suggest a non‐planar Hosgri fault or another structure underlying the northern Shoreline fault. The Shoreline fault most likely transfers strike‐slip motion between the Hosgri fault and other faults of the Pacific–North America plate boundary system to the east. A hypothetical earthquake rupturing the entire known length of the Shoreline fault would have a moment magnitude of 6.4–6.8. A hypothetical earthquake rupturing the Shoreline fault and the section of the Hosgri fault north of the Hosgri–Shoreline junction would have a moment magnitude of 7.2–7.5.

  4. Recent developments in understanding the tectonic evolution of the Southern California offshore area: Implications for earthquake-hazard analysis

    USGS Publications Warehouse

    Fisher, M.A.; Langenheim, V.E.; Nicholson, C.; Ryan, H.F.; Sliter, R.W.

    2009-01-01

    During late Mesozoic and Cenozoic time, three main tectonic episodes affected the Southern California offshore area. Each episode imposed its unique structural imprint such that early-formed structures controlled or at least influenced the location and development of later ones. This cascaded structural inheritance greatly complicates analysis of the extent, orientation, and activity of modern faults. These fault attributes play key roles in estimates of earthquake magnitude and recurrence interval. Hence, understanding the earthquake hazard posed by offshore and coastal faults requires an understanding of the history of structural inheritance and modification. In this report we review recent (mainly since 1987) findings about the tectonic development of the Southern California offshore area and use analog models of fault deformation as guides to comprehend the bewildering variety of offshore structures that developed over time. This report also provides a background in regional tectonics for other chapters in this section that deal with the threat from offshore geologic hazards in Southern California. © 2009 The Geological Society of America.

  5. The 2011 M = 9.0 Tohoku oki earthquake more than doubled the probability of large shocks beneath Tokyo

    USGS Publications Warehouse

    Toda, Shinji; Stein, Ross S.

    2013-01-01

    The Kanto seismic corridor surrounding Tokyo has hosted four to five M ≥ 7 earthquakes in the past 400 years. Immediately after the Tohoku earthquake, the seismicity rate in the corridor jumped 10-fold, while the rate of normal focal mechanisms dropped by half. The seismicity rate decayed for 6–12 months, after which it steadied at three times the pre-Tohoku rate. The seismicity rate jump and decay to a new rate, as well as the focal mechanism change, can be explained by the static stress imparted by the Tohoku rupture and postseismic creep to Kanto faults. We therefore fit the seismicity observations to a rate/state Coulomb model, which we use to forecast the time-dependent probability of large earthquakes in the Kanto seismic corridor. We estimate a 17% probability of a M ≥ 7.0 shock over the 5 year prospective period 11 March 2013 to 10 March 2018, two-and-a-half times the probability had the Tohoku earthquake not struck.
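As a rough illustration of how a rate change maps to a probability change, a simple Poisson calculation using the abstract's figures (four to five M ≥ 7 shocks in 400 years, with the rate settling at three times its pre-Tohoku value) gives numbers of the same order as the paper's 17% estimate. The paper itself uses a time-dependent rate/state Coulomb model, not this toy calculation:

```python
import math

def poisson_prob(rate_per_yr, years):
    """P(>= 1 event in the exposure window) for a Poisson process."""
    return 1 - math.exp(-rate_per_yr * years)

# Illustrative inputs taken from the abstract: ~4.5 M >= 7 events per 400 yr
# before Tohoku, and a post-sequence rate three times higher.
background = poisson_prob(4.5 / 400, 5)      # 5-yr probability at the old rate
elevated = poisson_prob(3 * 4.5 / 400, 5)    # 5-yr probability at the new rate
```

This yields roughly 0.16 for the elevated rate versus about 0.06 for the background, of the same order as the paper's model-based 17% and "two-and-a-half times" statements.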

  6. Hospital compliance with a state unfunded mandate: the case of California's Earthquake Safety Law.

    PubMed

    McCue, Michael J; Thompson, Jon M

    2012-01-01

    In recent years, community hospitals have experienced heightened regulation with many unfunded mandates. The authors assessed the market, organizational, operational, and financial characteristics of general acute care hospitals in California that have a main acute care hospital building that is noncompliant with state requirements and at risk of major structural collapse from earthquakes. Using California hospital data from 2007 to 2009, and employing logistic regression analysis, the authors found that hospitals having buildings that are at the highest risk of collapse are located in larger population markets, possess smaller market share, have a higher percentage of Medicaid patients, and have less liquidity.

  7. The Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) Internship Program

    NASA Astrophysics Data System (ADS)

    Perry, S.; Jordan, T.

    2006-12-01

    Our undergraduate research program, SCEC/UseIT, an NSF Research Experience for Undergraduates site, provides software for earthquake researchers and educators, movies for outreach, and ways to strengthen the technical career pipeline. SCEC/UseIT motivates diverse undergraduates towards science and engineering careers through team-based research in the exciting field of earthquake information technology. UseIT provides the cross-training in computer science/information technology (CS/IT) and geoscience needed to make fundamental progress in earthquake system science. Our high and increasing participation of women and minority students is crucial given the nation's precipitous enrollment declines in CS/IT undergraduate degree programs, especially among women. UseIT also casts a "wider, farther" recruitment net that targets scholars interested in creative work but not traditionally attracted to summer science internships. Since 2002, SCEC/UseIT has challenged 79 students in three dozen majors from as many schools with difficult, real-world problems that require collaborative, interdisciplinary solutions. Interns design and engineer open-source software, creating increasingly sophisticated visualization tools (see "SCEC-VDO," session IN11), which are employed by SCEC researchers, in new curricula at the University of Southern California, and by outreach specialists who make animated movies for the public and the media. SCEC-VDO would be a valuable tool for research-oriented professional development programs.

  8. Spatio-temporal variation of seismicity before the 1971 San Fernando earthquake, California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ishida, M.; Kanamori, H.

    1977-08-01

    The spatio-temporal variation of seismicity prior to the 1971 San Fernando, California, earthquake is studied for the area within 35 km of the epicenter. During the period from 1932 to 1961, the seismicity in this area was relatively low and random. A remarkable NE-SW trending alignment of activity occurred during the period from 1961 to 1964, the period corresponding to the inferred onset of the Palmdale uplift. During the period from 1965 to 1968, the seismicity around the epicentral area became extremely low; no event was located within 13 km of the epicenter. During the period from 1969 to the occurrence of the San Fernando earthquake, activity around the epicentral area increased. This activity may be considered to be foreshock activity in a broad sense.

  9. Seismicity and stress transfer studies in eastern California and Nevada: Implications for earthquake sources and tectonics

    NASA Astrophysics Data System (ADS)

    Ichinose, Gene Aaron

    The source parameters for eastern California and western Nevada earthquakes are estimated from regionally recorded seismograms using a moment tensor inversion. We use the point source approximation and fit the seismograms at long periods. We generated a moment tensor catalog for Mw > 4.0 since 1997 and Mw > 5.0 since 1990. The catalog includes centroid depths, seismic moments, and focal mechanisms. The regions with the most moderate sized earthquakes in the last decade were in aftershock zones located in Eureka Valley, Double Spring Flat, Coso, Ridgecrest, Fish Lake Valley, and Scotty's Junction. The remaining moderate size earthquakes were distributed across the region. The 1993 (Mw 6.0) Eureka Valley earthquake occurred in the Eastern California Shear Zone. Careful aftershock relocations were used to resolve structure from aftershock clusters. The mainshock appears to rupture along the western side of the Last Chance Range along a 30° to 60° west dipping fault plane, consistent with previous geodetic modeling. We estimate the source parameters for aftershocks at source-receiver distances less than 20 km using waveform modeling. The relocated aftershocks and waveform modeling results do not indicate any significant evidence of low-angle faulting (dips < 30°). The results did reveal deformation along vertical faults within the hanging-wall block, consistent with observed surface rupture along the Saline Range above the dipping fault plane. The 1994 (Mw 5.8) Double Spring Flat earthquake occurred along the eastern Sierra Nevada between overlapping normal faults. Aftershock migration and cross fault triggering occurred in the following two years, producing seventeen Mw > 4 aftershocks. The source parameters for the largest aftershocks were estimated from regionally recorded seismograms using moment tensor inversion. We estimate the source parameters for two moderate sized earthquakes which occurred near Reno, Nevada, the 1995 (Mw 4.4) Border Town, and the 1998 (Mw

  10. Remotely triggered microearthquakes and tremor in central California following the 2010 Mw 8.8 Chile earthquake

    USGS Publications Warehouse

    Peng, Zhigang; Hill, David P.; Shelly, David R.; Aiken, Chastity

    2010-01-01

    We examine remotely triggered microearthquakes and tectonic tremor in central California following the 2010 Mw 8.8 Chile earthquake. Several microearthquakes near the Coso Geothermal Field were apparently triggered, with the largest earthquake (Ml 3.5) occurring during the large-amplitude Love surface waves. The Chile mainshock also triggered numerous tremor bursts near the Parkfield-Cholame section of the San Andreas Fault (SAF). The locally triggered tremor bursts are partially masked at lower frequencies by the regionally triggered earthquake signals from Coso, but can be identified by applying high-pass or matched filters. Both triggered tremor along the SAF and the Ml 3.5 earthquake in Coso are consistent with frictional failure at different depths on critically-stressed faults under the Coulomb failure criteria. The triggered tremor, however, appears to be more phase-correlated with the surface waves than the triggered earthquakes, likely reflecting differences in constitutive properties between the brittle, seismogenic crust and the underlying lower crust.

  11. Predicted Surface Displacements for Scenario Earthquakes in the San Francisco Bay Region

    USGS Publications Warehouse

    Murray-Moraleda, Jessica R.

    2008-01-01

    In the immediate aftermath of a major earthquake, the U.S. Geological Survey (USGS) will be called upon to provide information on the characteristics of the event to emergency responders and the media. One such piece of information is the expected surface displacement due to the earthquake. In conducting probabilistic hazard analyses for the San Francisco Bay Region, the Working Group on California Earthquake Probabilities (WGCEP) identified a series of scenario earthquakes involving the major faults of the region, and these were used in their 2003 report (hereafter referred to as WG03) and the recently released 2008 Uniform California Earthquake Rupture Forecast (UCERF). Here I present a collection of maps depicting the expected surface displacement resulting from those scenario earthquakes. The USGS has conducted frequent Global Positioning System (GPS) surveys throughout northern California for nearly two decades, generating a solid baseline of interseismic measurements. Following an earthquake, temporary GPS deployments at these sites will be important to augment the spatial coverage provided by continuous GPS sites for recording postseismic deformation, as will the acquisition of Interferometric Synthetic Aperture Radar (InSAR) scenes. The information provided in this report allows one to anticipate, for a given event, where the largest displacements are likely to occur. This information is valuable both for assessing the need for further spatial densification of GPS coverage before an event and prioritizing sites to resurvey and InSAR data to acquire in the immediate aftermath of the earthquake. In addition, these maps are envisioned to be a resource for scientists in communicating with emergency responders and members of the press, particularly during the time immediately after a major earthquake before displacements recorded by continuous GPS stations are available.

  12. TriNet "ShakeMaps": Rapid generation of peak ground motion and intensity maps for earthquakes in southern California

    USGS Publications Warehouse

    Wald, D.J.; Quitoriano, V.; Heaton, T.H.; Kanamori, H.; Scrivner, C.W.; Worden, C.B.

    1999-01-01

    Rapid (3-5 minutes) generation of maps of instrumental ground-motion and shaking intensity is accomplished through advances in real-time seismographic data acquisition combined with newly developed relationships between recorded ground-motion parameters and expected shaking intensity values. Estimation of shaking over the entire regional extent of southern California is obtained by the spatial interpolation of the measured ground motions with geologically based frequency and amplitude-dependent site corrections. Production of the maps is automatic, triggered by any significant earthquake in southern California. Maps are now made available within several minutes of the earthquake for public and scientific consumption via the World Wide Web; they will be made available with dedicated communications for emergency response agencies and critical users.
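The "newly developed relationships between recorded ground-motion parameters and expected shaking intensity values" can be illustrated with the commonly cited Wald et al. (1999) PGA-intensity regression; treating these exact coefficients as the relation used in the TriNet maps is an assumption here:

```python
import math

def mmi_from_pga(pga_cm_s2):
    """Instrumental intensity from peak ground acceleration (cm/s^2),
    using the Wald et al. (1999) regression Imm = 3.66*log10(PGA) - 1.66
    (assumed here; the published fit is valid roughly for MMI V-VIII)."""
    return 3.66 * math.log10(pga_cm_s2) - 1.66

# A site recording ~0.1 g (about 100 cm/s^2) maps to intensity near VI.
mmi = mmi_from_pga(100.0)
```

A ShakeMap-style product applies a relation like this at each grid node after interpolating the recorded motions with site corrections.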

  13. GPS Time Series Analysis of Southern California Associated with the 2010 M7.2 El Mayor/Cucapah Earthquake

    NASA Technical Reports Server (NTRS)

    Granat, Robert; Donnellan, Andrea

    2011-01-01

    The magnitude 7.2 El Mayor/Cucapah earthquake that occurred in Mexico on April 4, 2010 was well instrumented with continuous GPS stations in California. Large offsets were observed at the GPS stations as a result of deformation from the earthquake, providing information about the co-seismic fault slip as well as fault slip from large aftershocks. Information can also be obtained from the position time series at each station.

  14. Potential Effects of a Scenario Earthquake on the Economy of Southern California: Small Business Exposure and Sensitivity Analysis to a Magnitude 7.8 Earthquake

    USGS Publications Warehouse

    Sherrouse, Benson C.; Hester, David J.; Wein, Anne M.

    2008-01-01

    The Multi-Hazards Demonstration Project (MHDP) is a collaboration between the U.S. Geological Survey (USGS) and various partners from the public and private sectors and academia, meant to improve Southern California's resiliency to natural hazards (Jones and others, 2007). In support of the MHDP objectives, the ShakeOut Scenario was developed. It describes a magnitude 7.8 (M7.8) earthquake along the southernmost 300 kilometers (200 miles) of the San Andreas Fault, identified by geoscientists as a plausible event that will cause moderate to strong shaking over much of the eight-county (Imperial, Kern, Los Angeles, Orange, Riverside, San Bernardino, San Diego, and Ventura) Southern California region. This report contains an exposure and sensitivity analysis of small businesses in terms of labor and employment statistics. Exposure is measured as the absolute counts of labor market variables anticipated to experience each level of Instrumental Intensity (a proxy measure of damage). Sensitivity is the percentage of the exposure of each business establishment size category to each Instrumental Intensity level. The analysis concerns the direct effect of the earthquake on small businesses. The analysis is inspired by the Bureau of Labor Statistics (BLS) report that analyzed the labor market losses (exposure) of a M6.9 earthquake on the Hayward fault by overlaying geocoded labor market data on Instrumental Intensity values. The method used here is influenced by the ZIP-code-level data provided by the California Employment Development Department (CA EDD), which requires the assignment of Instrumental Intensities to ZIP codes. The ZIP-code-level labor market data includes the number of business establishments, employees, and quarterly payroll categorized by business establishment size.

  15. Quantifying Earthquake Collapse Risk of Tall Steel Braced Frame Buildings Using Rupture-to-Rafters Simulations

    NASA Astrophysics Data System (ADS)

    Mourhatch, Ramses

    This thesis examines the collapse risk of tall steel braced frame buildings using rupture-to-rafters simulations of a suite of San Andreas earthquakes. Two key advancements in this work are the development of (i) a rational methodology for assigning scenario earthquake probabilities and (ii) an artificial-correction-free approach to broadband ground motion simulation. The work can be divided into the following sections: earthquake source modeling, earthquake probability calculations, ground motion simulations, building response, and performance analysis. As a first step, kinematic source inversions of past earthquakes in the magnitude range of 6-8 are used to simulate 60 scenario earthquakes on the San Andreas fault. For each scenario earthquake a 30-year occurrence probability is calculated, and we present a rational method to redistribute the forecast earthquake probabilities from UCERF to the simulated scenario earthquakes. We illustrate the inner workings of the method through an example involving earthquakes on the San Andreas fault in southern California. Next, three-component broadband ground motion histories are computed at 636 sites in the greater Los Angeles metropolitan area by superposing short-period (0.2 s-2.0 s) empirical Green's function synthetics on long-period (>2.0 s) synthetics computed from kinematic source models using the spectral element method. Using the ground motions at the 636 sites for the 60 scenario earthquakes, 3-D nonlinear analyses of several variants of an 18-story steel braced frame building, designed for three soil types using the 1994 and 1997 Uniform Building Code provisions, are conducted. Model performance is classified into one of five performance levels: Immediate Occupancy, Life Safety, Collapse Prevention, Red-Tagged, and Model Collapse. The results are combined with

  16. Connecting crustal seismicity and earthquake-driven stress evolution in Southern California

    USGS Publications Warehouse

    Pollitz, Fred; Cattania, Camilla

    2017-01-01

    Tectonic stress in the crust evolves during a seismic cycle, with slow stress accumulation over interseismic periods, episodic stress steps at the time of earthquakes, and transient stress readjustment during a postseismic period that may last months to years. Static stress transfer to surrounding faults has been well documented to alter regional seismicity rates over both short and long time scales. While static stress transfer is instantaneous and long lived, postseismic stress transfer driven by viscoelastic relaxation of the ductile lower crust and mantle leads to additional, slowly varying stress perturbations. Both processes may be tested by comparing a decade-long record of regional seismicity to predicted time-dependent seismicity rates based on a stress evolution model that includes viscoelastic stress transfer. Here we explore crustal stress evolution arising from the seismic cycle in Southern California from 1981 to 2014 using five M≥6.5 source earthquakes: the M7.3 1992 Landers, M6.5 1992 Big Bear, M6.7 1994 Northridge, M7.1 1999 Hector Mine, and M7.2 2010 El Mayor-Cucapah earthquakes. We relate the stress readjustment in the surrounding crust generated by each earthquake to regional seismicity using rate-and-state friction theory. Using a log-likelihood approach, we quantify the seismicity-triggering potential of both static and viscoelastic stress transfer, finding that both processes have systematically shaped the spatial pattern of Southern California seismicity since 1992.
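The rate-and-state friction framework mentioned above is commonly implemented with Dieterich's (1994) seismicity-rate response to a static stress step; the parameter values below (A-sigma and the aftershock duration) are illustrative assumptions, not values from this study:

```python
import math

def seismicity_rate(t_yr, stress_step_bar, a_sigma_bar=0.3, t_a_yr=10.0, r=1.0):
    """Dieterich (1994) seismicity rate at time t after a static stress step:
    the rate jumps by exp(dtau / A*sigma) and relaxes back toward the
    background rate r over the aftershock duration t_a. Values illustrative."""
    gamma = (math.exp(-stress_step_bar / a_sigma_bar) - 1) * math.exp(-t_yr / t_a_yr)
    return r / (1 + gamma)

# Rate immediately after a 1-bar stress increase, and long afterward.
r0 = seismicity_rate(0.0, 1.0)
r_late = seismicity_rate(100.0, 1.0)
```

Feeding time-dependent stress histories (static plus viscoelastic) through an equation like this is what lets the authors compare predicted and observed seismicity rates.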

  17. Web Services and Data Enhancements at the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Zuzlewski, S.; Lombard, P. N.; Allen, R. M.

    2013-12-01

    The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, simple text, or MiniSEED depending on the service and selected output format. The NCEDC offers the following web services that are compliant with the International Federation of Digital Seismograph Networks (FDSN) web services specifications: (1) fdsn-dataselect: time series data delivered in MiniSEED format, (2) fdsn-station: station and channel metadata and time series availability delivered in StationXML format, (3) fdsn-event: earthquake event information delivered in QuakeML format. In addition, the NCEDC offers the following IRIS-compatible web services: (1) sacpz: provide channel gains, poles, and zeros in SAC format, (2) resp: provide channel response information in RESP format, (3) dataless: provide station and channel metadata in Dataless SEED format. The NCEDC is also developing a web service to deliver time series from pre-assembled event waveform gathers. The NCEDC has waveform gathers for ~750,000 northern and central California events from 1984 to the present, many of which were created by the USGS NCSN prior to the establishment of the joint NCSS (Northern California Seismic System). We are currently adding waveforms to these older event gathers with time series from the UCB networks and other networks with waveforms archived at the NCEDC, and ensuring that the waveforms for each channel in the event gathers have the highest
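A minimal sketch of how a client might assemble one of these FDSN-style queries. The base URL, network, station, and channel codes below are illustrative assumptions; consult the NCEDC documentation for the actual service endpoints:

```python
from urllib.parse import urlencode

# Assumed service host for illustration only; verify against NCEDC docs.
BASE = "https://service.ncedc.org/fdsnws"

def fdsn_query(service, **params):
    """Build an FDSN-style web service query URL. The fdsn-dataselect,
    fdsn-station, and fdsn-event services all share this query form."""
    return f"{BASE}/{service}/1/query?{urlencode(sorted(params.items()))}"

# Hypothetical request for vertical-component waveforms around an event.
url = fdsn_query("dataselect",
                 net="BK", sta="CMB", cha="BHZ",
                 starttime="2014-08-24T10:20:44",
                 endtime="2014-08-24T10:30:44")
```

Because the services follow the FDSN specification, the same query shape works against any compliant data center by swapping the base URL.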

  18. Unusual downhole and surface free-field records near the Carquinez Strait bridges during the 24 August 2014 Mw6.0 South Napa, California earthquake

    USGS Publications Warehouse

    Çelebi, Mehmet; Ghahari, S. Farid; Taciroglu, Ertugrul

    2015-01-01

    This paper reports the results of Part A of a study of the recorded strong-motion accelerations at the well-instrumented network of the two side-by-side parallel bridges over the Carquinez Strait during the 24 August 2014 (Mw 6.0) South Napa, Calif. earthquake that occurred at 03:20:44 PDT with epicentral coordinates 38.22N, 122.31W (http://earthquake.usgs.gov/earthquakes/eqarchives/poster/2014/20140824.php, last accessed on October 17, 2014). Both bridges and two boreholes were instrumented by the California Strong Motion Instrumentation Program (CSMIP) of the California Geological Survey (CGS) (Shakal et al., 2014). A comprehensive comparison of several ground motion prediction equations as they relate to recorded ground motions of the earthquake is provided by Baltay and Boatwright (2015).

  19. Holocene paleoseismicity, temporal clustering, and probabilities of future large (M > 7) earthquakes on the Wasatch fault zone, Utah

    USGS Publications Warehouse

    McCalpin, J.P.; Nishenko, S.P.

    1996-01-01

    The chronology of M>7 paleoearthquakes on the central five segments of the Wasatch fault zone (WFZ) is one of the best dated in the world and contains 16 earthquakes in the past 5600 years with an average repeat time of 350 years. Repeat times for individual segments vary by a factor of 2, and range from about 1200 to 2600 years. Four of the central five segments ruptured between about 620±30 and 1230±60 calendar years B.P. The remaining segment (Brigham City segment) has not ruptured in the past 2120±100 years. Comparison of the WFZ space-time diagram of paleoearthquakes with synthetic paleoseismic histories indicates that the observed temporal clusters and gaps have about an equal probability (depending on model assumptions) of reflecting random coincidence as opposed to intersegment contagion. Regional seismicity suggests that for exposure times of 50 and 100 years, the probability for an earthquake of M>7 anywhere within the Wasatch Front region, based on a Poisson model, is 0.16 and 0.30, respectively. A fault-specific WFZ model predicts 50 and 100 year probabilities for a M>7 earthquake on the WFZ itself, based on a Poisson model, as 0.13 and 0.25, respectively. In contrast, segment-specific earthquake probabilities that assume quasi-periodic recurrence behavior on the Weber, Provo, and Nephi segments are less (0.01-0.07 in 100 years) than the regional or fault-specific estimates (0.25-0.30 in 100 years), due to the short elapsed times compared to average recurrence intervals on those segments. The Brigham City and Salt Lake City segments, however, have time-dependent probabilities that approach or exceed the regional and fault-specific probabilities. For the Salt Lake City segment, these elevated probabilities are due to the elapsed time being approximately equal to the average late Holocene recurrence time. For the Brigham City segment, the elapsed time is significantly longer than the segment-specific late Holocene recurrence time.
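The fault-specific Poisson numbers quoted above follow directly from the stated 350-year average repeat time; a minimal check:

```python
import math

def poisson_prob(mean_repeat_yr, exposure_yr):
    """P(>= 1 event in the exposure window) for a Poisson process with the
    given mean repeat time: P = 1 - exp(-exposure / repeat_time)."""
    return 1 - math.exp(-exposure_yr / mean_repeat_yr)

# WFZ-wide average repeat time of 350 yr, for 50- and 100-yr exposures.
p50 = poisson_prob(350, 50)    # matches the abstract's 0.13
p100 = poisson_prob(350, 100)  # matches the abstract's 0.25
```

The regional estimates (0.16 and 0.30) come from the seismicity-based rate rather than the paleoseismic repeat time, which is why they differ slightly.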

  20. Hotspots, Lifelines, and the SAFRR HayWired Earthquake Sequence

    NASA Astrophysics Data System (ADS)

    Ratliff, J. L.; Porter, K.

    2014-12-01

    Though California has experienced many large earthquakes (San Francisco, 1906; Loma Prieta, 1989; Northridge, 1994), the San Francisco Bay Area has not had a damaging earthquake for 25 years. Earthquake risk and surging reliance on smartphones and the Internet to handle everyday tasks raise the question: is an increasingly technology-reliant Bay Area prepared for potential infrastructure impacts caused by a major earthquake? How will a major earthquake on the Hayward Fault affect lifelines (roads, power, water, communication, etc.)? The U.S. Geological Survey Science Application for Risk Reduction (SAFRR) program's HayWired disaster scenario, a hypothetical two-year earthquake sequence triggered by a M7.05 mainshock on the Hayward Fault, addresses these and other questions. We explore four geographic aspects of lifeline damage from earthquakes: (1) geographic lifeline concentrations, (2) areas where lifelines pass through high shaking or potential ground-failure zones, (3) areas with diminished lifeline service demand due to severe building damage, and (4) areas with increased lifeline service demand due to displaced residents and businesses. Potential mainshock lifeline vulnerability and spatial demand changes will be discerned by superimposing earthquake shaking, liquefaction probability, and landslide probability damage thresholds with lifeline concentrations and with large-capacity shelters. Intersecting high hazard levels and lifeline clusters represent potential lifeline susceptibility hotspots. We will also analyze possible temporal vulnerability and demand changes using an aftershock shaking threshold. The results of this analysis will inform regional lifeline resilience initiatives and response and recovery planning, as well as reveal potential redundancies and weaknesses for Bay Area lifelines. Identified spatial and temporal hotspots can provide stakeholders with a reference for possible systemic vulnerability resulting from an earthquake sequence.

  1. Earthquakes, May-June 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    In the United States, a magnitude 5.8 earthquake in southern California on June 28 killed two people and caused considerable damage. Strong earthquakes hit Alaska on May 1 and May 30; the May 1 earthquake caused some minor damage. 

  2. Characterizing potentially induced earthquake rate changes in the Brawley Seismic Zone, southern California

    USGS Publications Warehouse

    Llenos, Andrea L.; Michael, Andrew J.

    2016-01-01

    The Brawley seismic zone (BSZ), in the Salton trough of southern California, has a history of earthquake swarms and geothermal energy exploitation. Some earthquake rate changes may have been induced by fluid extraction and injection activity at local geothermal fields, particularly at the North Brawley Geothermal Field (NBGF) and at the Salton Sea Geothermal Field (SSGF). We explore this issue by examining earthquake rate changes and interevent distance distributions in these fields. In Oklahoma and Arkansas, where considerable wastewater injection occurs, increases in background seismicity rate and aftershock productivity and decreases in interevent distance were indicative of fluid‐injection‐induced seismicity. Here, we test if similar changes occur that may be associated with fluid injection and extraction in geothermal areas. We use stochastic epidemic‐type aftershock sequence models to detect changes in the underlying seismogenic processes, shown by statistically significant changes in the model parameters. The most robust model changes in the SSGF roughly occur when large changes in net fluid production occur, but a similar correlation is not seen in the NBGF. Also, although both background seismicity rate and aftershock productivity increased for fluid‐injection‐induced earthquake rate changes in Oklahoma and Arkansas, the background rate increases significantly in the BSZ only, roughly corresponding with net fluid production rate increases. Moreover, in both fields the interevent spacing does not change significantly during active energy projects. This suggests that, although geothermal field activities in a tectonically active region may not significantly change the physics of earthquake interactions, earthquake rates may still be driven by fluid injection or extraction rates, particularly in the SSGF.
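    For reference, the temporal conditional intensity of a standard ETAS model takes the form below (a common parameterization; the study's exact form may differ):

```latex
\lambda(t \mid \mathcal{H}_t) \;=\; \mu \;+\; \sum_{t_i < t} \frac{K\, e^{\alpha (M_i - M_c)}}{\left(t - t_i + c\right)^{p}}
```

    Here μ is the background seismicity rate and K the aftershock productivity, the two parameters whose statistically significant changes the study uses to flag altered seismogenic processes; α, c, and p are the magnitude-scaling and Omori-decay constants, and M_c is the catalog completeness magnitude.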

  3. Scenario earthquake hazards for the Long Valley Caldera-Mono Lake area, east-central California (ver. 2.0, January 2018)

    USGS Publications Warehouse

    Chen, Rui; Branum, David M.; Wills, Chris J.; Hill, David P.

    2014-06-30

    As part of the U.S. Geological Survey’s (USGS) multi-hazards project in the Long Valley Caldera-Mono Lake area, the California Geological Survey (CGS) developed several earthquake scenarios and evaluated potential seismic hazards, including ground shaking, surface fault rupture, liquefaction, and landslide hazards associated with these earthquake scenarios. The results of these analyses can be useful for estimating the extent of potential damage and economic losses from these earthquakes and for preparing emergency response plans. The Long Valley Caldera-Mono Lake area has numerous active faults. Five of these faults or fault zones are considered capable of producing magnitude ≥6.7 earthquakes according to the Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2) developed by the 2007 Working Group on California Earthquake Probabilities (WGCEP) and the USGS National Seismic Hazard Mapping Program. These five faults are the Fish Slough, Hartley Springs, Hilton Creek, Mono Lake, and Round Valley Faults. CGS developed earthquake scenarios for these five faults in the study area and for the White Mountains Fault Zone to the east of the study area. In this report, an earthquake scenario is intended to depict the potential consequences of significant earthquakes. A scenario earthquake is not necessarily the largest or most damaging earthquake possible on a recognized fault. Rather, it is both large enough and likely enough that emergency planners should consider it in regional emergency response plans. In particular, the ground motion predicted for a given scenario earthquake does not represent a full probabilistic hazard assessment, and thus it does not provide the basis for hazard zoning and earthquake-resistant building design. Earthquake scenarios presented here are based on fault geometry and activity data developed by the WGCEP, and are consistent with the 2008 Update of the United States National Seismic Hazard Maps (NSHM). Alternatives

  4. Earthquakes; July-August, 1978

    USGS Publications Warehouse

    Person, W.J.

    1979-01-01

    Earthquake activity during this period was about normal. Deaths from earthquakes were reported from Greece and Guatemala. Three major earthquakes (magnitude 7.0-7.9) occurred in Taiwan, Chile, and Costa Rica. In the United States, the most significant earthquake was a magnitude 5.6 on August 13 in southern California.

  5. Timing of paleoearthquakes on the northern Hayward Fault: preliminary evidence in El Cerrito, California

    USGS Publications Warehouse

    Lienkaemper, J.J.; Schwartz, D.P.; Kelson, K.I.; Lettis, W.R.; Simpson, Gary D.; Southon, J.R.; Wanket, J.A.; Williams, P.L.

    1999-01-01

    The Working Group on California Earthquake Probabilities estimated that the northern Hayward fault had the highest probability (0.28) of producing a M7 Bay Area earthquake in 30 years (WGCEP, 1990). This probability was based, in part, on the assumption that the last large earthquake on this segment occurred in 1836. However, a recent study of historical documents concludes that the 1836 earthquake did not occur on the northern Hayward fault, thereby extending the elapsed time to at least 220 yr, the beginning of the written record. The average recurrence interval for a M7 earthquake on the northern Hayward fault is unknown. WGCEP (1990) assumed an interval of 167 years. The 1996 Working Group on Northern California Earthquake Potential estimated ~210 yr, based on extrapolations from southern Hayward paleoseismological studies and a revised estimate of 1868 slip on the southern Hayward fault. To help constrain the timing of paleoearthquakes on the northern Hayward fault for the 1999 Bay Area probability update, we excavated two trenches that cross the fault and a sag pond on the Mira Vista golf course. Because the site is on the second fairway, we were limited to less than ten days to document these trenches. Analysis was aided by rapid C-14 dating of more than 90 samples, which gave near real-time results while the trenches were still open. A combination of upward fault terminations, disrupted strata, and discordant angular relations indicates that at least four, and possibly seven or more, surface-faulting earthquakes occurred during a 1630-2130 yr interval. Hence, the average recurrence time could be <270 yr, but is no more than 710 yr. The most recent earthquake (MRE) occurred after AD 1640. Preliminary analysis of calibrated dates supports the assumption that no large historical (post-1776) earthquakes have ruptured the surface here, but the youngest dates need more corroboration. Analyses of pollen for the presence of non-native species help to constrain the time of the MRE. The earthquake
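    For a sense of scale, a time-independent (Poisson) calculation converts an assumed recurrence interval into a 30-yr probability. Note that WGCEP (1990)'s value of 0.28 came from a time-dependent model, so the Poisson values below are not directly comparable; this is only an illustrative sketch:

```python
import math

def poisson_prob(recurrence_interval_yr, window_yr=30.0):
    """P(at least one event in the window) for a Poisson process whose
    mean recurrence interval is `recurrence_interval_yr`."""
    rate = 1.0 / recurrence_interval_yr
    return 1.0 - math.exp(-rate * window_yr)

# WGCEP (1990) assumed a 167-yr interval; WGNCEP (1996) estimated ~210 yr.
p_167 = poisson_prob(167)   # ~0.16
p_210 = poisson_prob(210)   # ~0.13
```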

  6. Regional Earthquake Likelihood Models: A realm on shaky grounds?

    NASA Astrophysics Data System (ADS)

    Kossobokov, V.

    2005-12-01

    Seismology is a young science, and its statistical tools to date may have a "medieval flavor" for those who rush to apply the fuzzy language of a highly developed probability theory. To become "quantitatively probabilistic," earthquake forecasts/predictions must be defined with scientific accuracy. Following the most popular objectivist viewpoint on probability, we cannot claim "probabilities" are adequate without a long series of "yes/no" forecast/prediction outcomes. Without the "antiquated binary language" of "yes/no" certainty we cannot judge an outcome ("success/failure") and therefore cannot objectively quantify the performance of a forecast/prediction method. Likelihood scoring is one of the delicate tools of statistics, which can be worthless or even misleading when inappropriate probability models are used. This is a basic loophole for the misuse of likelihood, as well as of other statistical methods, in practice. The flaw can be avoided by accurate verification of generic probability models against empirical data. This is not an easy task within the framework of the Regional Earthquake Likelihood Models (RELM) methodology, which neither defines the forecast precision nor provides a means to judge ultimate success or failure in specific cases. Hopefully, the RELM group realizes the problem and its members will do their best to close the hole with an adequate, data-supported choice. Regretfully, this is not the case with the erroneous choice of Gerstenberger et al., who started the public web site with forecasts of expected ground shaking for "tomorrow" (Nature 435, 19 May 2005). Gerstenberger et al. have inverted the critical evidence of their study, i.e., the 15 years of recent seismic record accumulated in just one figure, which suggests rejecting, with confidence above 97%, "the generic California clustering model" used in the automatic calculations. As a result, since the date of publication in Nature the United States Geological Survey website delivers to the public, emergency

  7. WHITTIER NARROWS, CALIFORNIA EARTHQUAKE OF OCTOBER 1, 1987-PRELIMINARY ASSESSMENT OF STRONG GROUND MOTION RECORDS.

    USGS Publications Warehouse

    Brady, A.G.; Etheredge, E.C.; Porcella, R.L.

    1988-01-01

    More than 250 strong-motion accelerograph stations were triggered by the Whittier Narrows, California, earthquake of 1 October 1987. Considering the number of multichannel structural stations in the area of strong shaking, this set of records is one of the most significant in history. Three networks, operated by the U.S. Geological Survey, the California Division of Mines and Geology, and the University of Southern California, produced the majority of the records. The excellent performance of the instruments in these and the smaller arrays is attributable to the quality of the maintenance programs. Readiness for a magnitude 8 event depends directly on these maintenance programs. Even prior to computer analysis of the analog film records, a number of important structural resonant modes can be identified, and their frequencies and simple mode shapes have been scaled.

  8. History of earthquakes and tsunamis along the eastern Aleutian-Alaska megathrust, with implications for tsunami hazards in the California Continental Borderland

    USGS Publications Warehouse

    Ryan, Holly F.; von Huene, Roland E.; Wells, Ray E.; Scholl, David W.; Kirby, Stephen; Draut, Amy E.; Dumoulin, Julie A.; Dusel-Bacon, C.

    2012-01-01

    During the past several years, devastating tsunamis were generated along subduction zones in Indonesia, Chile, and most recently Japan. Both the Chile and Japan tsunamis traveled across the Pacific Ocean and caused localized damage at several coastal areas in California. The question remains as to whether coastal California, in particular the California Continental Borderland, is vulnerable to more extensive damage from a far-field tsunami sourced along a Pacific subduction zone. Assuming that the coast of California is at risk from a far-field tsunami, its coastline is most exposed to a trans-Pacific tsunami generated along the eastern Aleutian-Alaska subduction zone. We present the background geologic constraints that could control a possible giant (Mw ~9) earthquake sourced along the eastern Aleutian-Alaska megathrust. Previous great earthquakes (Mw ~8) in 1788, 1938, and 1946 ruptured single segments of the eastern Aleutian-Alaska megathrust. However, in order to generate a giant earthquake, it is necessary to rupture through multiple segments of the megathrust. Potential barriers to a throughgoing rupture, such as high-relief fracture zones or ridges, are absent on the subducting Pacific Plate between the Fox and Semidi Islands. Possible asperities (areas on the megathrust that are locked and therefore subject to infrequent but large slip) are identified by patches of high moment release observed in the historical earthquake record, geodetic studies, and the location of forearc basin gravity lows. Global Positioning System (GPS) data indicate that some areas of the eastern Aleutian-Alaska megathrust, such as that beneath Sanak Island, are weakly coupled. We suggest that although these areas will have reduced slip during a giant earthquake, they are not really large enough to form a barrier to rupture. 
A key aspect in defining an earthquake source for tsunami generation is determining the possibility of significant slip on the updip end of the megathrust near

  9. Responses of a tall building in Los Angeles, California as inferred from local and distant earthquakes

    USGS Publications Warehouse

    Çelebi, Mehmet; Hasan Ulusoy,; Nori Nakata,

    2016-01-01

    The growing inventory of tall buildings in the United States and elsewhere may be subjected to motions generated by near and far seismic sources that cause long-period effects. Multiple sets of records that exhibited such effects were retrieved from tall buildings in Tokyo and Osaka, ~350 km and ~770 km from the epicenter of the 2011 Tohoku earthquake. In California, very few tall buildings have been instrumented. An instrumented 52-story building in downtown Los Angeles recorded seven local and distant earthquakes. Spectral and system identification methods reveal significant low frequencies of interest (~0.17 Hz, 0.56 Hz, and 1.05 Hz). These frequencies compare well with those computed by transfer functions; however, small variations are observed between the significant low frequencies for each of the seven earthquakes. The torsional and translational frequencies are very close and are coupled. A beating effect is observed in the data from at least two of the seven earthquakes.

  10. Earthquakes, September-October, 1979

    USGS Publications Warehouse

    Person, W.J.

    1980-01-01

    In the United States, California experienced the strongest earthquake in that State since 1971. The quake, a M=6.8, occurred on October 15, in Baja California, Mexico, near the California border and caused injuries and damage. 

  11. Application of Second-Moment Source Analysis to Three Problems in Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Donovan, J.; Jordan, T. H.

    2011-12-01

    probabilities? The FMT representation allows us to generalize the models typically used for this purpose (e.g., marked point process models, such as ETAS), which will again be necessary in operational earthquake forecasting. To quantify aftershock probabilities, we compare mainshock FMTs with the first and second spatial moments of weighted aftershock hypocenters. We will describe applications of these results to the Uniform California Earthquake Rupture Forecast, version 3, which is now under development by the Working Group on California Earthquake Probabilities.

  12. Predicted liquefaction of East Bay fills during a repeat of the 1906 San Francisco earthquake

    USGS Publications Warehouse

    Holzer, T.L.; Blair, J.L.; Noce, T.E.; Bennett, M.J.

    2006-01-01

    Predicted conditional probabilities of surface manifestations of liquefaction during a repeat of the 1906 San Francisco (M7.8) earthquake range from 0.54 to 0.79 in the area underlain by the sandy artificial fills along the eastern shore of San Francisco Bay near Oakland, California. Despite widespread liquefaction in 1906 of sandy fills in San Francisco, most of the East Bay fills were emplaced after 1906 without soil improvement to increase their liquefaction resistance. They have yet to be shaken strongly. Probabilities are based on the liquefaction potential index computed from 82 CPT soundings using median (50th percentile) estimates of PGA based on a ground-motion prediction equation. Shaking estimates consider both distance from the San Andreas Fault and local site conditions. The high probabilities indicate extensive and damaging liquefaction will occur in East Bay fills during the next M ≥ 7.8 earthquake on the northern San Andreas Fault. © 2006, Earthquake Engineering Research Institute.
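    The liquefaction potential index mentioned above is commonly computed with the Iwasaki-style depth weighting; a discretized sketch follows. The soil-profile values are hypothetical, and the paper's CPT-based procedure involves more steps than shown:

```python
def liquefaction_potential_index(depths_m, factors_of_safety, dz=1.0):
    """Discretized Iwasaki-style LPI: integrate severity F(z) = 1 - FS
    (only where FS < 1 and depth < 20 m) against weight w(z) = 10 - 0.5*z."""
    lpi = 0.0
    for z, fs in zip(depths_m, factors_of_safety):
        if z < 20.0 and fs < 1.0:
            lpi += (1.0 - fs) * (10.0 - 0.5 * z) * dz
    return lpi

# Hypothetical profile: liquefiable layers in the upper few meters dominate
# the index because the depth weight w(z) is largest near the surface.
lpi = liquefaction_potential_index([1, 2, 3, 10], [0.6, 0.8, 1.2, 0.9])
```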

  13. Operational earthquake forecasting can enhance earthquake preparedness

    USGS Publications Warehouse

    Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C.

    2014-01-01

    We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time‐dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground‐motion exceedance probabilities as well as short‐term rupture probabilities—in concert with the long‐term forecasts of probabilistic seismic‐hazard analysis (PSHA).

  14. Products and Services Available from the Southern California Earthquake Data Center (SCEDC) and the Southern California Seismic Network (SCSN)

    NASA Astrophysics Data System (ADS)

    Yu, E.; Bhaskaran, A.; Chen, S.; Chowdhury, F. R.; Meisenhelter, S.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.

    2010-12-01

    Currently the SCEDC archives continuous and triggered data from nearly 5000 data channels from 425 SCSN recorded stations, processing and archiving an average of 12,000 earthquakes each year. The SCEDC provides public access to these earthquake parametric and waveform data through its website www.data.scec.org and through client applications such as STP and DHI. This poster will describe the most significant developments at the SCEDC in the past year.
    Updated hardware:
    ● The SCEDC has more than doubled its waveform file storage capacity by migrating to 2 TB disks.
    New data holdings:
    ● Waveform data: Beginning Jan 1, 2010 the SCEDC began continuously archiving all high-sample-rate strong-motion channels. All seismic channels recorded by SCSN are now continuously archived and available at SCEDC.
    ● Portable data from the El Mayor-Cucapah 7.2 sequence: Seismic waveforms from portable stations installed by researchers (contributed by Elizabeth Cochran, Jamie Steidl, and Octavio Lazaro-Mancilla) have been added to the archive and are accessible through STP, either as continuous data or associated with events in the SCEDC earthquake catalog. These additional data will help SCSN analysts and researchers improve event locations from the sequence.
    ● Real-time GPS solutions from the El Mayor-Cucapah 7.2 event: Three-component 1 Hz seismograms of California Real Time Network (CRTN) GPS stations, from the April 4, 2010, magnitude 7.2 El Mayor-Cucapah earthquake, are available in SAC format at the SCEDC. These time series were created by Brendan Crowell, Yehuda Bock, the project PI, and Mindy Squibb at SOPAC using data from the CRTN. The El Mayor-Cucapah earthquake demonstrated definitively the power of real-time high-rate GPS data: they measure dynamic displacements directly, they do not clip, and they are able to detect the permanent (coseismic) surface deformation.
    ● Triggered data from the Quake Catcher Network (QCN) and Community Seismic Network (CSN): The SCEDC in

  15. 2016 National Earthquake Conference

    Science.gov Websites

    Thank you to our Presenting Sponsor, the California Earthquake Authority. What's New? What's Next? What's Your Role in Building a National Strategy? The National Earthquake Conference (NEC) gathers state government leaders, social science practitioners, and U.S. State and Territorial Earthquake Managers.

  16. The 1999 Mw 7.1 Hector Mine, California, earthquake: A test of the stress shadow hypothesis?

    USGS Publications Warehouse

    Harris, R.A.; Simpson, R.W.

    2002-01-01

    We test the stress shadow hypothesis for large earthquake interactions by examining the relationship between two large earthquakes that occurred in the Mojave Desert of southern California, the 1992 Mw 7.3 Landers and 1999 Mw 7.1 Hector Mine earthquakes. We want to determine if the 1999 Hector Mine earthquake occurred at a location where the Coulomb stress was increased (earthquake advance, stress trigger) or decreased (earthquake delay, stress shadow) by the previous large earthquake. Using four models of the Landers rupture and a range of possible hypocentral planes for the Hector Mine earthquake, we discover that most scenarios yield a Landers-induced relaxation (stress shadow) on the Hector Mine hypocentral plane. Although this result would seem to weigh against the stress shadow hypothesis, the results become considerably more uncertain when the effects of a nearby Landers aftershock, the 1992 ML 5.4 Pisgah earthquake, are taken into account. We calculate the combined static Coulomb stress changes due to the Landers and Pisgah earthquakes to range from −0.3 to +0.3 MPa (−3 to +3 bars) at the possible Hector Mine hypocenters, depending on choice of rupture model and hypocenter. These varied results imply that the Hector Mine earthquake does not provide a good test of the stress shadow hypothesis for large earthquake interactions. We use a simple approach, that of static dislocations in an elastic half-space, yet we still obtain a wide range of both negative and positive Coulomb stress changes. Our findings serve as a caution that more complex models purporting to explain the triggering or shadowing relationship between the 1992 Landers and 1999 Hector Mine earthquakes need to also consider the parametric and geometric uncertainties raised here.

  17. Spatial-temporal variation of low-frequency earthquake bursts near Parkfield, California

    USGS Publications Warehouse

    Wu, Chunquan; Guyer, Robert; Shelly, David R.; Trugman, D.; Frank, William; Gomberg, Joan S.; Johnson, P.

    2015-01-01

    Tectonic tremor (TT) and low-frequency earthquakes (LFEs) have been found in the deeper crust of various tectonic environments globally in the last decade. The spatial-temporal behaviour of LFEs provides insight into deep fault zone processes. In this study, we examine recurrence times from a 12-yr catalogue of 88 LFE families with ∼730 000 LFEs in the vicinity of the Parkfield section of the San Andreas Fault (SAF) in central California. We apply an automatic burst detection algorithm to the LFE recurrence times to identify the clustering behaviour of LFEs (LFE bursts) in each family. We find that the burst behaviours in the northern and southern LFE groups differ. Generally, the northern group has longer burst duration but fewer LFEs per burst, while the southern group has shorter burst duration but more LFEs per burst. The southern group LFE bursts are generally more correlated than the northern group, suggesting more coherent deep fault slip and relatively simpler deep fault structure beneath the locked section of the SAF. We also find that the 2004 Parkfield earthquake clearly increased the number of LFEs per burst and the average burst duration for both the northern and the southern groups, with a relatively larger effect on the northern group. This could be due to the weakness of the northern part of the fault, or to the northwesterly rupture direction of the Parkfield earthquake.
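    A minimal gap-threshold detector conveys the idea of grouping LFE occurrence times into bursts. The study's automatic algorithm is more sophisticated, and the times and threshold below are hypothetical:

```python
def detect_bursts(event_times, max_gap):
    """Group events into bursts: consecutive events separated by no more than
    `max_gap` (same time units) belong to the same burst. Returns a list of
    (start_time, end_time, n_events) tuples for bursts with >= 2 events."""
    bursts = []
    start = prev = event_times[0]
    count = 1
    for t in event_times[1:]:
        if t - prev <= max_gap:
            count += 1
        else:
            if count >= 2:
                bursts.append((start, prev, count))
            start, count = t, 1
        prev = t
    if count >= 2:
        bursts.append((start, prev, count))
    return bursts

# Two clusters separated by a long quiescent interval; the final
# isolated event at t=20.0 is not counted as a burst.
bursts = detect_bursts([0.0, 0.1, 0.2, 5.0, 5.05, 5.1, 20.0], max_gap=0.5)
```

    From the (start, end, count) tuples one can then measure burst duration and LFEs per burst, the two quantities the study compares between the northern and southern groups.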

  18. Operational Earthquake Forecasting: Proposed Guidelines for Implementation (Invited)

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2010-12-01

    The goal of operational earthquake forecasting (OEF) is to provide the public with authoritative information about how seismic hazards are changing with time. During periods of high seismic activity, short-term earthquake forecasts based on empirical statistical models can attain nominal probability gains in excess of 100 relative to the long-term forecasts used in probabilistic seismic hazard analysis (PSHA). Prospective experiments are underway by the Collaboratory for the Study of Earthquake Predictability (CSEP) to evaluate the reliability and skill of these seismicity-based forecasts in a variety of tectonic environments. How such information should be used for civil protection is by no means clear, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing formal procedures for OEF in this sort of “low-probability environment.” Nevertheless, the need to move more quickly towards OEF has been underscored by recent experiences, such as the 2009 L’Aquila earthquake sequence and other seismic crises in which an anxious public has been confused by informal, inconsistent earthquake forecasts. Whether scientists like it or not, rising public expectations for real-time information, accelerated by the use of social media, will require civil protection agencies to develop sources of authoritative information about the short-term earthquake probabilities. In this presentation, I will discuss guidelines for the implementation of OEF informed by my experience on the California Earthquake Prediction Evaluation Council, convened by CalEMA, and the International Commission on Earthquake Forecasting, convened by the Italian government following the L’Aquila disaster. (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and

  19. Finite Moment Tensors of Southern California Earthquakes

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.; Chen, P.; Zhao, L.

    2003-12-01

    We have developed procedures for inverting broadband waveforms for the finite moment tensors (FMTs) of regional earthquakes. The FMT is defined in terms of second-order polynomial moments of the source space-time function and provides the lowest-order representation of a finite fault rupture; it removes the fault-plane ambiguity of the centroid moment tensor (CMT) and yields several additional parameters of seismological interest: the characteristic length L_c, width W_c, and duration T_c of the faulting, as well as the directivity vector v_d of the fault slip. To formulate the inverse problem, we follow and extend the methods of McGuire et al. [2001, 2002], who have successfully recovered the second-order moments of large earthquakes using low-frequency teleseismic data. We express the Fourier spectra of a synthetic point-source waveform in its exponential (Rytov) form and represent the observed waveform relative to the synthetic in terms of two frequency-dependent differential times, a phase delay δτ_p(ω) and an amplitude-reduction time δτ_q(ω), which we measure using Gee and Jordan's [1992] isolation-filter technique. We numerically calculate the FMT partial derivatives in terms of second-order spatiotemporal gradients, which allows us to use 3D finite-difference seismograms as our isolation filters. We have applied our methodology to a set of small to medium-sized earthquakes in Southern California. Errors in the anelastic structure introduced perturbations larger than the signal caused by finite-source effects. We have therefore employed a joint inversion technique that recovers the CMT parameters of the aftershocks, as well as the CMT and FMT parameters of the mainshock, under the assumption that the source finiteness of the aftershocks can be ignored. The joint system of equations relating the δτ_p and δτ_q data to the source parameters of the mainshock-aftershock cluster is denuisanced for path anomalies in both observables
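    Under the convention of McGuire et al. [2001], the characteristic dimensions referenced above follow from the second-degree central moments of the normalized source space-time function (stated here as background; the abstract does not spell out these definitions):

```latex
L_c = 2\sqrt{\lambda_{\max}\!\left[\hat{\boldsymbol{\mu}}^{(2,0)}\right]},
\qquad
T_c = 2\sqrt{\hat{\mu}^{(0,2)}},
\qquad
\mathbf{v}_d = \frac{\hat{\boldsymbol{\mu}}^{(1,1)}}{\hat{\mu}^{(0,2)}}
```

    where \hat{\mu}^{(i,j)} denotes the degree-(i,j) central space-time moment, \lambda_{\max} is the largest eigenvalue of the spatial second-moment tensor, and W_c is defined analogously from its second eigenvalue.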

  20. The 2014 Mw 6.0 Napa Earthquake, California: Observations from Real-time GPS-enhanced Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Johanson, I. A.; Grapenthin, R.; Allen, R. M.

    2014-12-01

    Recently, progress has been made to demonstrate feasibility and benefits of including real-time GPS (rtGPS) in earthquake early warning and rapid response systems. While most concepts have yet to be integrated into operational environments, the Berkeley Seismological Laboratory is currently running an rtGPS based finite fault inversion scheme in true real-time, which is triggered by the seismic-based ShakeAlert system and then sends updated earthquake alerts to a test receiver. The Geodetic Alarm System (G-larmS) was online and responded to the 2014 Mw6.0 South Napa earthquake in California. We review G-larmS' performance during this event and for 13 aftershocks, and we present rtGPS observations and real-time modeling results for the main shock. The first distributed slip model and a magnitude estimate of Mw5.5 were available 24 s after the event origin time, which could be reduced to 14 s after a bug fix (~8 s S-wave travel time, ~6 s data latency). The system continued to re-estimate the magnitude once every second: it increased to Mw5.9 3 s after the first alert and stabilized at Mw5.8 after 15 s. G-larmS' solutions for the subsequent small magnitude aftershocks demonstrate that Mw~6.0 is the current limit for alert updates to contribute back to the seismic-based early warning system.
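    The distributed slip models that G-larmS inverts from rtGPS displacements yield a scalar seismic moment, from which magnitude updates follow; the conversion is presumably the standard Hanks-Kanamori relation, sketched here (the example moment value is illustrative, not taken from the event):

```python
import math

def moment_magnitude(seismic_moment_nm):
    """Hanks-Kanamori moment magnitude from scalar seismic moment in N*m:
    Mw = (2/3) * (log10(M0) - 9.1)."""
    return (2.0 / 3.0) * (math.log10(seismic_moment_nm) - 9.1)

# A moment of ~1.26e18 N*m corresponds to roughly Mw 6.0.
mw = moment_magnitude(1.26e18)
```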

  1. Earthquake prediction in California using regression algorithms and cloud-based big data infrastructure

    NASA Astrophysics Data System (ADS)

    Asencio-Cortés, G.; Morales-Esteban, A.; Shang, X.; Martínez-Álvarez, F.

    2018-06-01

    Earthquake magnitude prediction is a challenging problem that has been widely studied during the last decades. Statistical, geophysical, and machine learning approaches can be found in the literature, with no particularly satisfactory results. In recent years, powerful computational techniques to analyze big data have emerged, making possible the analysis of massive datasets. These new methods make use of physical resources like cloud-based architectures. California is known for being one of the regions with the highest seismic activity in the world, and many data are available. In this work, the use of several regression algorithms combined with ensemble learning is explored in the context of big data (a 1 GB catalog is used), in order to predict earthquake magnitudes within the next seven days. The Apache Spark framework, the H2O library in R, and Amazon cloud infrastructure were used, with very promising results.
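    The ensemble idea, averaging the predictions of several base regressors, can be illustrated with a dependency-free sketch; the feature values, data, and models below are hypothetical stand-ins for the Spark/H2O pipelines actually used:

```python
def fit_linear(xs, ys):
    """Ordinary least-squares fit y = intercept + slope * x (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return lambda x: intercept + slope * x

def ensemble_predict(models, x):
    """Simple ensemble: average the predictions of the base regressors."""
    preds = [m(x) for m in models]
    return sum(preds) / len(preds)

# Hypothetical features (e.g., a seismicity indicator) vs. observed magnitudes.
xs, ys = [1.0, 2.0, 3.0, 4.0], [3.1, 3.9, 5.2, 5.8]
baseline = lambda x: sum(ys) / len(ys)   # constant mean predictor
linear = fit_linear(xs, ys)
pred = ensemble_predict([baseline, linear], 2.5)
```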

  2. Expanding CyberShake Physics-Based Seismic Hazard Calculations to Central California

    NASA Astrophysics Data System (ADS)

    Silva, F.; Callaghan, S.; Maechling, P. J.; Goulet, C. A.; Milner, K. R.; Graves, R. W.; Olsen, K. B.; Jordan, T. H.

    2016-12-01

    As part of its program of earthquake system science, the Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, to perform physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by first simulating a tensor-valued wavefield of Strain Green Tensors. CyberShake then takes an earthquake rupture forecast and extends it by varying the hypocenter location and slip distribution, resulting in about 500,000 rupture variations. Seismic reciprocity is used to calculate synthetic seismograms for each rupture variation at each computation site. These seismograms are processed to obtain intensity measures, such as spectral acceleration, which are then combined with probabilities from the earthquake rupture forecast to produce a hazard curve. Hazard curves are calculated at seismic frequencies up to 1 Hz for hundreds of sites in a region and the results interpolated to obtain a hazard map. In developing and verifying CyberShake, we have focused our modeling in the greater Los Angeles region. We are now expanding the hazard calculations into Central California. Using workflow tools running jobs across two large-scale open-science supercomputers, NCSA Blue Waters and OLCF Titan, we calculated 1-Hz PSHA results for over 400 locations in Central California. For each location, we produced hazard curves using both a 3D central California velocity model created via tomographic inversion, and a regionally averaged 1D model. These new results provide low-frequency exceedance probabilities for the rapidly expanding metropolitan areas of Santa Barbara, Bakersfield, and San Luis Obispo, and lend new insights into the effects of directivity-basin coupling associated with basins juxtaposed to major faults such as the San Andreas. Particularly interesting are the basin effects associated with the deep sediments of the southern San Joaquin Valley. We will compare hazard
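    The combination step described above, turning rupture-forecast rates and simulated intensity measures into a hazard curve, can be sketched as follows. The rates and intensity values are hypothetical, and real CyberShake calculations involve far more machinery:

```python
import math

def hazard_curve(ruptures, im_thresholds, years=50.0):
    """Toy PSHA combination: each rupture has an annual rate and a list of
    simulated intensity measures (one per rupture variation, assumed equally
    likely). Returns, for each IM threshold, the probability of exceedance
    in `years` assuming Poissonian occurrence."""
    probs = []
    for x in im_thresholds:
        annual_rate = 0.0
        for rate, ims in ruptures:
            frac_exceeding = sum(1 for im in ims if im > x) / len(ims)
            annual_rate += rate * frac_exceeding
        probs.append(1.0 - math.exp(-annual_rate * years))
    return probs

# Two hypothetical ruptures with a handful of variations each; exceedance
# probability decreases monotonically with the IM threshold.
curve = hazard_curve(
    ruptures=[(0.01, [0.2, 0.4, 0.6]), (0.002, [0.5, 0.9, 1.1])],
    im_thresholds=[0.1, 0.5, 1.0])
```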

  3. Testing hypotheses of earthquake occurrence

    NASA Astrophysics Data System (ADS)

    Kagan, Y. Y.; Jackson, D. D.; Schorlemmer, D.; Gerstenberger, M.

    2003-12-01

We present a relatively straightforward likelihood method for testing those earthquake hypotheses that can be stated as vectors of earthquake rate density in defined bins of area, magnitude, and time. We illustrate the method as it will be applied to the Regional Earthquake Likelihood Models (RELM) project of the Southern California Earthquake Center (SCEC). Several earthquake forecast models are being developed as part of this project, and additional contributed forecasts are welcome. Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. We would test models in pairs, requiring that both forecasts in a pair be defined over the same set of bins. Thus we offer a standard "menu" of bins and ground rules to encourage standardization. One menu category includes five-year forecasts of magnitude 5.0 and larger. Forecasts would be in the form of a vector of yearly earthquake rates on a 0.05 degree grid at the beginning of the test. Focal mechanism forecasts, when available, would also be archived and used in the tests. The five-year forecast category may be appropriate for testing hypotheses of stress shadows from large earthquakes. Interim progress will be evaluated yearly, but final conclusions would be made on the basis of cumulative five-year performance. The second category includes forecasts of earthquakes above magnitude 4.0 on a 0.05 degree grid, evaluated and renewed daily. Final evaluation would be based on cumulative performance over five years. Other types of forecasts with different magnitude, space, and time sampling are welcome and will be tested against other models with shared characteristics. All earthquakes would be counted, and no attempt made to separate foreshocks, main shocks, and aftershocks. Earthquakes would be considered as point sources located at the hypocenter.
For each pair of forecasts, we plan to compute alpha, the probability that the first would be wrongly rejected in favor of
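The binned rate-density comparison described in this record can be sketched with a per-bin Poisson log-likelihood; the model with the higher joint log-likelihood over the same bins is favored by the observed catalog. The two forecasts and the observed counts below are invented toy values, not RELM submissions.

```python
import math

def poisson_log_likelihood(forecast_rates, observed_counts):
    """Joint log-likelihood of observed counts given forecast rates,
    treating each space-magnitude bin as an independent Poisson variable."""
    ll = 0.0
    for lam, n in zip(forecast_rates, observed_counts):
        ll += -lam + n * math.log(lam) - math.lgamma(n + 1)
    return ll

# Two hypothetical five-year forecasts defined over the same four bins,
# and the earthquake counts actually observed in those bins (all invented).
model_a = [0.5, 1.0, 2.0, 0.1]
model_b = [0.2, 1.5, 2.5, 0.05]
observed = [1, 1, 2, 0]

ll_a = poisson_log_likelihood(model_a, observed)
ll_b = poisson_log_likelihood(model_b, observed)
```

In a real test the significance of the log-likelihood difference would be assessed by simulation, which is what the alpha mentioned above quantifies.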

  4. Cross-sections and maps showing double-difference relocated earthquakes from 1984-2000 along the Hayward and Calaveras faults, California

    USGS Publications Warehouse

    Simpson, Robert W.; Graymer, Russell W.; Jachens, Robert C.; Ponce, David A.; Wentworth, Carl M.

    2004-01-01

    We present cross-section and map views of earthquakes that occurred from 1984 to 2000 in the vicinity of the Hayward and Calaveras faults in the San Francisco Bay region, California. These earthquakes came from a catalog of events relocated using the double-difference technique, which provides superior relative locations of nearby events. As a result, structures such as fault surfaces and alignments of events along these surfaces are more sharply defined than in previous catalogs.

  5. Prototype operational earthquake prediction system

    USGS Publications Warehouse

    Spall, Henry

    1986-01-01

An objective of the U.S. Earthquake Hazards Reduction Act of 1977 is to introduce, into all regions of the country that are subject to large and moderate earthquakes, systems for predicting earthquakes and assessing earthquake risk. In 1985, the USGS developed for the Secretary of the Interior a program for implementation of a prototype operational earthquake prediction system in southern California.

  6. Simulating Earthquakes for Science and Society: Earthquake Visualizations Ideal for use in Science Communication and Education

    NASA Astrophysics Data System (ADS)

    de Groot, R.

    2008-12-01

    The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). For example, The Great Southern California ShakeOut was based on a potential magnitude 7.8 earthquake on the southern San Andreas fault. The visualization created for the ShakeOut was a key scientific and communication tool for the earthquake drill. This presentation will also feature SCEC Virtual Display of Objects visualization software developed by SCEC Undergraduate Studies in Earthquake Information Technology interns. According to Gordin and Pea (1995), theoretically visualization should make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.

  7. Public Perception of Relative Risk: Earthquakes vs. Hurricanes in the San Diego Region

    NASA Astrophysics Data System (ADS)

    Means, J. D.

    2014-12-01

Public perception of risk is key in pre-disaster preparation. Despite admonitions from emergency planners, people often fail to take reasonable precautions. But if emergency planners also fail to realize the possibility of a particular disaster scenario, there is very little chance that the public will plan for it. In Southern California there is a well-known risk associated with earthquakes, and it would be difficult to find anyone who didn't understand that the region was subject to risk from earthquakes. On the other hand, few, if any, people consider the risk associated with tropical storms or hurricanes. This is reasonable considering people have always been told that the west coast of the United States is immune from hurricanes due to the cold water associated with the California Current, and the hazard of earthquakes is fairly obvious to anyone who has lived there for a while. Such an attitude is probably justifiable for most of Southern California, but it's unclear whether this is true for the San Diego region: destructive earthquakes are historically rare, and there is good evidence that the region was affected by a Category 1 hurricane in 1858. Indeed, during the last 70 years, more people have died from tropical cyclones in California's southernmost counties (San Diego and Imperial) than have died from earthquakes. In this paper we compare the relative risks from these two different types of disasters for the San Diego region, and attempt to answer why one type of hazard is emphasized in public planning while the other is neglected.

  8. Sensitivity of Earthquake Loss Estimates to Source Modeling Assumptions and Uncertainty

    USGS Publications Warehouse

    Reasenberg, Paul A.; Shostak, Nan; Terwilliger, Sharon

    2006-01-01

    Introduction: This report explores how uncertainty in an earthquake source model may affect estimates of earthquake economic loss. Specifically, it focuses on the earthquake source model for the San Francisco Bay region (SFBR) created by the Working Group on California Earthquake Probabilities. The loss calculations are made using HAZUS-MH, a publicly available computer program developed by the Federal Emergency Management Agency (FEMA) for calculating future losses from earthquakes, floods and hurricanes within the United States. The database built into HAZUS-MH includes a detailed building inventory, population data, data on transportation corridors, bridges, utility lifelines, etc. Earthquake hazard in the loss calculations is based upon expected (median value) ground motion maps called ShakeMaps calculated for the scenario earthquake sources defined in WGCEP. The study considers the effect of relaxing certain assumptions in the WG02 model, and explores the effect of hypothetical reductions in epistemic uncertainty in parts of the model. For example, it addresses questions such as what would happen to the calculated loss distribution if the uncertainty in slip rate in the WG02 model were reduced (say, by obtaining additional geologic data)? What would happen if the geometry or amount of aseismic slip (creep) on the region's faults were better known? And what would be the effect on the calculated loss distribution if the time-dependent earthquake probability were better constrained, either by eliminating certain probability models or by better constraining the inherent randomness in earthquake recurrence? The study does not consider the effect of reducing uncertainty in the hazard introduced through models of attenuation and local site characteristics, although these may have a comparable or greater effect than does source-related uncertainty. 
Nor does it consider sources of uncertainty in the building inventory, building fragility curves, and other assumptions

  9. Earthquakes, July-August 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

There was one major earthquake during this reporting period, a magnitude 7.1 shock off the coast of Northern California on August 17. Earthquake-related deaths were reported from Indonesia, Romania, Peru, and Iraq.

  10. CRUSTAL REFRACTION PROFILE OF THE LONG VALLEY CALDERA, CALIFORNIA, FROM THE JANUARY 1983 MAMMOTH LAKES EARTHQUAKE SWARM.

    USGS Publications Warehouse

    Luetgert, James H.; Mooney, Walter D.

    1985-01-01

Seismic-refraction profiles recorded north of Mammoth Lakes, California, using earthquake sources from the January 1983 swarm complement earlier explosion refraction profiles and provide velocity information from deeper in the crust in the area of the Long Valley caldera. Eight earthquakes from a depth range of 4.9 to 8.0 km confirm the observation of basement rocks with seismic velocities ranging from 5.8 to 6.4 km/sec extending at least to depths of 20 km. The data provide further evidence for the existence of a partial melt zone beneath Long Valley caldera and constrain its geometry.

  11. Uniform California earthquake rupture forecast, version 3 (UCERF3): the time-independent model

    USGS Publications Warehouse

    Field, Edward H.; Biasi, Glenn P.; Bird, Peter; Dawson, Timothy E.; Felzer, Karen R.; Jackson, David D.; Johnson, Kaj M.; Jordan, Thomas H.; Madden, Christopher; Michael, Andrew J.; Milner, Kevin R.; Page, Morgan T.; Parsons, Thomas; Powers, Peter M.; Shaw, Bruce E.; Thatcher, Wayne R.; Weldon, Ray J.; Zeng, Yuehua; ,

    2013-01-01

    In this report we present the time-independent component of the Uniform California Earthquake Rupture Forecast, Version 3 (UCERF3), which provides authoritative estimates of the magnitude, location, and time-averaged frequency of potentially damaging earthquakes in California. The primary achievements have been to relax fault segmentation assumptions and to include multifault ruptures, both limitations of the previous model (UCERF2). The rates of all earthquakes are solved for simultaneously, and from a broader range of data, using a system-level "grand inversion" that is both conceptually simple and extensible. The inverse problem is large and underdetermined, so a range of models is sampled using an efficient simulated annealing algorithm. The approach is more derivative than prescriptive (for example, magnitude-frequency distributions are no longer assumed), so new analysis tools were developed for exploring solutions. Epistemic uncertainties were also accounted for using 1,440 alternative logic tree branches, necessitating access to supercomputers. The most influential uncertainties include alternative deformation models (fault slip rates), a new smoothed seismicity algorithm, alternative values for the total rate of M≥5 events, and different scaling relationships, virtually all of which are new. As a notable first, three deformation models are based on kinematically consistent inversions of geodetic and geologic data, also providing slip-rate constraints on faults previously excluded because of lack of geologic data. The grand inversion constitutes a system-level framework for testing hypotheses and balancing the influence of different experts. For example, we demonstrate serious challenges with the Gutenberg-Richter hypothesis for individual faults. UCERF3 is still an approximation of the system, however, and the range of models is limited (for example, constrained to stay close to UCERF2). Nevertheless, UCERF3 removes the apparent UCERF2 overprediction of
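The "grand inversion" above solves a large, underdetermined system for rupture rates with simulated annealing. A toy version of that idea is sketched below; the two-constraint, three-rate system is invented (the rows might stand for a fault slip-rate budget and a regional seismicity rate), and bears no relation to UCERF3's actual data or constraint weights.

```python
import math
import random

random.seed(0)

# Toy underdetermined inversion: two data constraints on three rupture rates.
A = [[1.0, 1.0, 0.0],
     [0.0, 1.0, 1.0]]
d = [2.0, 3.0]

def misfit(x):
    """Sum of squared residuals of A x - d."""
    return sum((sum(a * xi for a, xi in zip(row, x)) - di) ** 2
               for row, di in zip(A, d))

def anneal(steps=20000, temp0=1.0, step=0.1):
    x = [1.0, 1.0, 1.0]              # start from uniform nonnegative rates
    e = misfit(x)
    for k in range(steps):
        temp = temp0 * (1.0 - k / steps) + 1e-9
        cand = x[:]
        i = random.randrange(len(cand))
        cand[i] = max(0.0, cand[i] + random.uniform(-step, step))
        e_cand = misfit(cand)
        # Always accept improvements; accept worse models with Boltzmann
        # probability so the search can escape local minima early on.
        if e_cand <= e or random.random() < math.exp((e - e_cand) / temp):
            x, e = cand, e_cand
    return x, e

rates, residual = anneal()
```

Because the system is underdetermined, different random seeds reach different rate vectors with comparably small misfit, which is exactly why UCERF3 samples a range of models rather than reporting a single solution.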

  12. Response of the San Andreas fault to the 1983 Coalinga-Nuñez earthquakes: an application of interaction-based probabilities for Parkfield

    USGS Publications Warehouse

    Toda, Shinji; Stein, Ross S.

    2002-01-01

    The Parkfield-Cholame section of the San Andreas fault, site of an unfulfilled earthquake forecast in 1985, is the best monitored section of the world's most closely watched fault. In 1983, the M = 6.5 Coalinga and M = 6.0 Nuñez events struck 25 km northeast of Parkfield. Seismicity rates climbed for 18 months along the creeping section of the San Andreas north of Parkfield and dropped for 6 years along the locked section to the south. Right-lateral creep also slowed or reversed from Parkfield south. Here we calculate that the Coalinga sequence increased the shear and Coulomb stress on the creeping section, causing the rate of small shocks to rise until the added stress was shed by additional slip. However, the 1983 events decreased the shear and Coulomb stress on the Parkfield segment, causing surface creep and seismicity rates to drop. We use these observations to cast the likelihood of a Parkfield earthquake into an interaction-based probability, which includes both the renewal of stress following the 1966 Parkfield earthquake and the stress transfer from the 1983 Coalinga events. We calculate that the 1983 shocks dropped the 10-year probability of a M ∼ 6 Parkfield earthquake by 22% (from 54 ± 22% to 42 ± 23%) and that the probability did not recover until about 1991, when seismicity and creep resumed. Our analysis may thus explain why the Parkfield earthquake did not strike in the 1980s, but not why it was absent in the 1990s. We calculate a 58 ± 17% probability of a M ∼ 6 Parkfield earthquake during 2001–2011.
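The interaction-based probability above combines a renewal model with a stress step mapped into a clock change on the fault. A minimal sketch of the permanent effect, using a lognormal renewal distribution and invented Parkfield-like parameter values (mean recurrence, stressing rate, and stress change are all hypothetical):

```python
import math

def lognorm_cdf(t, mu, sigma):
    """CDF of a lognormal recurrence-time distribution (mu, sigma in log space)."""
    if t <= 0:
        return 0.0
    return 0.5 * math.erfc(-(math.log(t) - mu) / (sigma * math.sqrt(2.0)))

def conditional_prob(elapsed, window, mu, sigma):
    """P(event within the next `window` years | `elapsed` years since last event)."""
    num = lognorm_cdf(elapsed + window, mu, sigma) - lognorm_cdf(elapsed, mu, sigma)
    den = 1.0 - lognorm_cdf(elapsed, mu, sigma)
    return num / den

# Invented numbers: ~22-yr mean recurrence, 10-yr forecast window,
# 17 yr elapsed since the last event at the time of the stress step.
mu, sigma = math.log(22.0), 0.5
elapsed = 17.0

# A stress decrease delta_tau shifts the effective clock back by
# delta_t = delta_tau / stressing_rate (both values hypothetical).
delta_tau = -0.1                    # bars
stressing_rate = 0.05               # bars/yr
shift = delta_tau / stressing_rate  # -2 yr: the fault is set back in its cycle

p_before = conditional_prob(elapsed, 10.0, mu, sigma)
p_after = conditional_prob(elapsed + shift, 10.0, mu, sigma)
```

A negative stress step lowers the conditional probability, which is the sense of the 54% to 42% drop the authors calculate.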

  13. Source processes of industrially-induced earthquakes at the Geysers geothermal area, California

    USGS Publications Warehouse

    Ross, A.; Foulger, G.R.; Julian, B.R.

    1999-01-01

Microearthquake activity at The Geysers geothermal area, California, mirrors the steam production rate, suggesting that the earthquakes are industrially induced. A 15-station network of digital, three-component seismic stations was operated for one month in 1991, and 3,900 earthquakes were recorded. Highly accurate moment tensors were derived for 30 of the best recorded earthquakes by tracing rays through tomographically derived 3-D VP and VP/VS structures, and inverting P- and S-wave polarities and amplitude ratios. The orientations of the P- and T-axes are very scattered, suggesting that there is no strong, systematic deviatoric stress field in the reservoir, which could explain why the earthquakes are not large. Most of the events had significant non-double-couple (non-DC) components in their source mechanisms, with volumetric components up to ~30% of the total moment. Explosive and implosive sources were observed in approximately equal numbers, and must be caused by cavity creation (or expansion) and collapse. It is likely that there is a causal relationship between these processes and fluid reinjection and steam withdrawal. Compensated linear vector dipole (CLVD) components were up to 100% of the deviatoric component. Combinations of opening cracks and shear faults cannot explain all the observations, and rapid fluid flow may also be involved. The pattern of non-DC failure at The Geysers contrasts with that of the Hengill-Grensdalur area in Iceland, a largely unexploited water-dominated field in an extensional stress regime. These differences are poorly understood but may be linked to the contrasting regional stress regimes and the industrial exploitation at The Geysers.

  14. Predicting the impact of tsunami in California under rising sea level

    NASA Astrophysics Data System (ADS)

    Dura, T.; Garner, A. J.; Weiss, R.; Kopp, R. E.; Horton, B.

    2017-12-01

The flood hazard for the California coast depends not only on the magnitude, location, and rupture length of Alaska-Aleutian subduction zone earthquakes and their resultant tsunamis, but also on rising sea levels, which combine with tsunamis to produce overall flood levels. The magnitude of future sea-level rise remains uncertain even on the decadal scale, with future sea-level projections becoming even more uncertain at timeframes of a century or more. Earthquake statistics indicate that timeframes of ten thousand to one hundred thousand years are needed to capture rare, very large earthquakes. Because of the different timescales between reliable sea-level projections and earthquake distributions, simply combining the different probabilities in the context of a tsunami hazard assessment may be flawed. Here, we considered 15 earthquake magnitudes between Mw 8 and Mw 9.4 on the Alaska-Aleutian subduction zone, bounded by 171°W and 140°W. We employed 24 realizations at each magnitude with random epicenter locations and different fault length-to-width ratios, and simulated the tsunami evolution from these 360 earthquakes at each decade from the years 2000 to 2200. These simulations were then carried out for different sea-level-rise projections to analyze the future flood hazard for California. Looking at the flood levels at tide gauges, we found that the flood level simulated at, for example, the year 2100 (including respective sea-level change) is different from the flood level calculated by adding the flood for the year 2000 to the sea-level change prediction for the year 2100. This is consistent for all sea-level rise scenarios, and the difference in flood levels ranges between 5% and 12% for the larger half of the given magnitude interval. Focusing on flood levels at the tide gauge in the Port of Los Angeles, the most probable flood level (including all earthquake magnitudes) in the year 2000 was 5 cm.
Depending on the sea-level predictions, in the year 2050 the most probable

  15. Identifying a large landslide with small displacements in a zone of coseismic tectonic deformation; the Villa Del Monte landslide triggered by the 1989 Loma Prieta, California, earthquake

    USGS Publications Warehouse

    Keefer, David K.; Harp, Edwin L.; Griggs, Gary B.; Evans, Stephen G.; DeGraff, Jerome V.

    2002-01-01

The Villa Del Monte landslide was one of 20 large and complex landslides triggered by the 1989 Loma Prieta, California, earthquake in a zone of pervasive coseismic ground cracking near the fault rupture. The landslide was approximately 980 m long, 870 m wide, and encompassed an area of approximately 68 ha. Drilling data suggested that movement may have extended to depths as great as 85 m below the ground surface. Even though the landslide moved <1 m, it caused substantial damage to numerous dwellings and other structures, primarily as a result of differential displacements and internal fissuring. Surface cracks, scarps, and compression features delineating the Villa Del Monte landslide were discontinuous, probably because coseismic displacements were small; such discontinuous features were also characteristic of the other large, coseismic landslides in the area, which also moved only short distances during the earthquake. Because features marking landslide boundaries were discontinuous and because other types of coseismic ground cracks were widespread in the area, identification of the landslides required detailed mapping and analysis. Recognition that landslides such as that at Villa Del Monte may occur near earthquake-generating fault ruptures should aid in future hazard evaluations of areas along active faults.

  16. Northern California Earthquake Data Center: Data Sets and Data Services

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Allen, R. M.; Zuzlewski, S.

    2015-12-01

The Northern California Earthquake Data Center (NCEDC) provides a permanent archive and real-time data distribution services for unique and comprehensive seismological and geophysical data sets encompassing northern and central California. We provide access to over 85 terabytes of continuous and event-based time series data from broadband, short-period, strong motion, and strain sensors as well as continuous and campaign GPS data at both standard and high sample rates. The Northern California Seismic System (NCSS), operated by UC Berkeley and USGS Menlo Park, has recorded over 900,000 events from 1984 to the present, and the NCEDC serves catalog, parametric information, moment tensors and first motion mechanisms, and time series data for these events. We also serve event catalogs, parametric information, and event waveforms for DOE enhanced geothermal system monitoring in northern California and Nevada. The NCEDC provides several ways for users to access these data. The most recent development is web services, which provide interactive, command-line, or program-based workflow access to data. Web services use well-established server and client protocols and RESTful software architecture that allow users to easily submit queries and receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, simple text, or MiniSEED depending on the service and selected output format. The NCEDC supports all FDSN-defined web services as well as a number of IRIS-defined and NCEDC-defined services. We also continue to support older email-based and browser-based access to data. NCEDC data and web services can be found at http://www.ncedc.org and http://service.ncedc.org.
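FDSN-defined web services of the kind the record describes are queried with simple parameterized URLs. The sketch below builds an event query; the parameter names follow the FDSN web service specification, while the specific date range and magnitude cutoff are invented, and the exact service paths offered should be checked against http://service.ncedc.org.

```python
from urllib.parse import urlencode

# Hypothetical FDSN-style event query (one day of M >= 5 events, text output).
params = {
    "starttime": "2014-08-24",
    "endtime": "2014-08-25",
    "minmagnitude": 5.0,
    "format": "text",
}
url = "http://service.ncedc.org/fdsnws/event/1/query?" + urlencode(params)
```

Because the parameter vocabulary is standardized across FDSN data centers, the same query shape works against other compliant archives by swapping the host.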

  17. Napa Earthquake impact on water systems

    NASA Astrophysics Data System (ADS)

    Wang, J.

    2014-12-01

The South Napa earthquake occurred in Napa, California, on August 24, 2014, at 3 a.m. local time, with a magnitude of 6.0. It was the largest earthquake in the San Francisco Bay Area since the 1989 Loma Prieta earthquake, and economic losses topped $1 billion. Winemakers were cleaning up and estimating the damage to tourism; around 15,000 cases of cabernet poured into the garden at the Hess Collection. Earthquakes can raise water-pollution risks and could cause a water crisis. California has suffered water shortages in recent years, so understanding how to prevent earthquake-related pollution of groundwater and surface water is valuable. This research gives a clear view of the drinking-water system in California and pollution of its river systems, as well as an estimate of the earthquake's impact on water supply. The Sacramento-San Joaquin River Delta (close to Napa) is the center of the state's water-distribution system, delivering fresh water to more than 25 million residents and 3 million acres of farmland. Delta water conveyed through a network of levees is crucial to Southern California. The drought has significantly curtailed water exports, and saltwater intrusion has reduced freshwater outflows. Strong shaking from a nearby earthquake can cause liquefaction of saturated, loose, sandy soils and could damage major Delta levee systems near Napa. The Napa earthquake is a wake-up call for Southern California: a similar event could damage its freshwater supply system.

  18. Persistent water level changes in a well near Parkfield, California, due to local and distant earthquakes

    NASA Astrophysics Data System (ADS)

    Roeloffs, Evelyn A.

    1998-01-01

Coseismic water level rises in the 30-m deep Bourdieu Valley (BV) well near Parkfield, California, have occurred in response to three local and five distant earthquakes. Coseismic changes in static strain cannot explain these water level rises because (1) the well is insensitive to strain at tidal periods; (2) for the distant earthquakes, the expected coseismic static strain is extremely small; and (3) the water level response is of the incorrect sign for the local earthquakes. These water level changes must therefore be caused by seismic waves, but unlike seismic water level oscillations, they are monotonic, persist for days or weeks, and seem to be caused by waves with periods of several seconds rather than long-period surface waves. Other investigators have reported a similar phenomenon in Japan. Certain wells consistently exhibit this type of coseismic water level change, which is always in the same direction, regardless of the earthquake's azimuth or focal mechanism, and approximately proportional to the inverse square of hypocentral distance. To date, the coseismic water level rises in the BV well have never exceeded the seasonal water level maximum, although their sizes are relatively well correlated with earthquake magnitude and distance. The frequency independence of the well's response to barometric pressure in the frequency band 0.1 to 0.7 cpd implies that the aquifer is fairly well confined. High aquifer compressibility, probably due to a gas phase in the pore space, is the most likely reason why the well does not respond to Earth tides. The phase and amplitude relationships between the seasonal water level and precipitation cycles constrain the horizontal hydraulic diffusivity to within a factor of 4.5, bounding hypothetical earthquake-induced changes in aquifer hydraulic properties.
Moreover, changes of hydraulic conductivity and/or diffusivity throughout the aquifer would not be expected to change the water level in the same direction at every time

  19. Persistent water level changes in a well near Parkfield, California, due to local and distant earthquakes

    USGS Publications Warehouse

    Roeloffs, E.A.

    1998-01-01

    Coseismic water level rises in the 30-m deep Bourdieu Valley (BV) well near Parkfield, California, have occurred in response to three local and five distant earthquakes. Coseismic changes in static strain cannot explain these water level rises because (1) the well is insensitive to strain at tidal periods; (2) for the distant earthquakes, the expected coseismic static strain is extremely small; and (3) the water level response is of the incorrect sign for the local earthquakes. These water level changes must therefore be caused by seismic waves, but unlike seismic water level oscillations, they are monotonic, persist for days or weeks, and seem to be caused by waves with periods of several seconds rather than long-period surface waves. Other investigators have reported a similar phenomenon in Japan. Certain wells consistently exhibit this type of coseismic water level change, which is always in the same direction, regardless of the earthquake's azimuth or focal mechanism, and approximately proportional to the inverse square of hypocentral distance. To date, the coseismic water level rises in the BV well have never exceeded the seasonal water level maximum, although their sizes are relatively well correlated with earthquake magnitude and distance. The frequency independence of the well's response to barometric pressure in the frequency band 0.1 to 0.7 cpd implies that the aquifer is fairly well confined. High aquifer compressibility, probably due to a gas phase in the pore space, is the most likely reason why the well does not respond to Earth tides. The phase and amplitude relationships between the seasonal water level and precipitation cycles constrain the horizontal hydraulic diffusivity to within a factor of 4.5, bounding hypothetical earthquake-induced changes in aquifer hydraulic properties. 
Moreover, changes of hydraulic conductivity and/or diffusivity throughout the aquifer would not be expected to change the water level in the same direction at every time

  20. Earthquakes, September-October 1978

    USGS Publications Warehouse

    Person, W.J.

    1979-01-01

The months of September and October were somewhat quiet, seismically speaking. One major earthquake, magnitude (M) 7.7, occurred in Iran on September 16. In Germany, a magnitude 5.0 earthquake caused damage and considerable alarm to many people in parts of that country. In the United States, the largest earthquake occurred along the California-Nevada border region.

  1. Dynamic 3D simulations of earthquakes on en echelon faults

    USGS Publications Warehouse

    Harris, R.A.; Day, S.M.

    1999-01-01

    One of the mysteries of earthquake mechanics is why earthquakes stop. This process determines the difference between small and devastating ruptures. One possibility is that fault geometry controls earthquake size. We test this hypothesis using a numerical algorithm that simulates spontaneous rupture propagation in a three-dimensional medium and apply our knowledge to two California fault zones. We find that the size difference between the 1934 and 1966 Parkfield, California, earthquakes may be the product of a stepover at the southern end of the 1934 earthquake and show how the 1992 Landers, California, earthquake followed physically reasonable expectations when it jumped across en echelon faults to become a large event. If there are no linking structures, such as transfer faults, then strike-slip earthquakes are unlikely to propagate through stepovers >5 km wide. Copyright 1999 by the American Geophysical Union.

  2. The Southern California Earthquake Survival Program

    USGS Publications Warehouse

    Harris, J.M.

    1989-01-01

In July 1988, the Los Angeles County Board of Supervisors directed the Chief Administrative Office to develop an educational program aimed at improving earthquake preparedness among Los Angeles County residents. The board recognized that current earthquake education efforts were not only insufficient, but also often confusing and costly. The board unanimously approved the development of a program that would make earthquake preparedness a year-long effort by encouraging residents to take a different precaution each month.

  3. Inventory of landslides triggered by the 1994 Northridge, California earthquake

    USGS Publications Warehouse

    Harp, Edwin L.; Jibson, Randall W.

    1995-01-01

The 17 January 1994 Northridge, California, earthquake (M=6.7) triggered more than 11,000 landslides over an area of about 10,000 km². Most of the landslides were concentrated in a 1,000-km² area that includes the Santa Susana Mountains and the mountains north of the Santa Clara River valley. We mapped landslides triggered by the earthquake in the field and from 1:60,000-scale aerial photography provided by the U.S. Air Force and taken the morning of the earthquake; these were subsequently digitized and plotted in a GIS-based format, as shown on the accompanying maps (which also are accessible via Internet). Most of the triggered landslides were shallow (1-5 m), highly disrupted falls and slides in weakly cemented Tertiary to Pleistocene clastic sediment. Average volumes of these types of landslides were less than 1,000 m³, but many had volumes exceeding 100,000 m³. Many of the larger disrupted slides traveled more than 50 m, and a few moved as far as 200 m from the bases of steep parent slopes. Deeper (>5 m) rotational slumps and block slides numbered in the hundreds, a few of which exceeded 100,000 m³ in volume. The largest triggered landslide was a block slide having a volume of 8×10⁶ m³. Triggered landslides damaged or destroyed dozens of homes, blocked roads, and damaged oil-field infrastructure. Analysis of landslide distribution with respect to variations in (1) landslide susceptibility and (2) strong shaking recorded by hundreds of instruments will form the basis of a seismic landslide hazard analysis of the Los Angeles area.

  4. Triggering of repeating earthquakes in central California

    USGS Publications Warehouse

    Wu, Chunquan; Gomberg, Joan; Ben-Naim, Eli; Johnson, Paul

    2014-01-01

    Dynamic stresses carried by transient seismic waves have been found capable of triggering earthquakes instantly in various tectonic settings. Delayed triggering may be even more common, but the mechanisms are not well understood. Catalogs of repeating earthquakes, earthquakes that recur repeatedly at the same location, provide ideal data sets to test the effects of transient dynamic perturbations on the timing of earthquake occurrence. Here we employ a catalog of 165 families containing ~2500 total repeating earthquakes to test whether dynamic perturbations from local, regional, and teleseismic earthquakes change recurrence intervals. The distance to the earthquake generating the perturbing waves is a proxy for the relative potential contributions of static and dynamic deformations, because static deformations decay more rapidly with distance. Clear changes followed the nearby 2004 Mw6 Parkfield earthquake, so we study only repeaters prior to its origin time. We apply a Monte Carlo approach to compare the observed number of shortened recurrence intervals following dynamic perturbations with the distribution of this number estimated for randomized perturbation times. We examine the comparison for a series of dynamic stress peak amplitude and distance thresholds. The results suggest a weak correlation between dynamic perturbations in excess of ~20 kPa and shortened recurrence intervals, for both nearby and remote perturbations.
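
    The randomized-times test described above can be sketched as a simple Monte Carlo permutation test (a toy illustration with hypothetical helper names, not the authors' catalog processing): draw random perturbation times, recount how many recurrence intervals containing a perturbation come out shorter than the family median, and compare with the observed count.

```python
import numpy as np

rng = np.random.default_rng(42)

def shortened_count(event_times, perturb_times):
    """Count recurrence intervals that (a) contain a perturbation and
    (b) are shorter than the median interval of the sequence."""
    intervals = np.diff(event_times)
    median = np.median(intervals)
    n = 0
    for start, dt in zip(event_times[:-1], intervals):
        hit = np.any((perturb_times > start) & (perturb_times < start + dt))
        if hit and dt < median:
            n += 1
    return n

def randomization_test(event_times, perturb_times, n_trials=1000):
    """Fraction of randomized perturbation catalogs that yield at least
    as many shortened intervals as observed (a one-sided p-value)."""
    observed = shortened_count(event_times, perturb_times)
    lo, hi = event_times[0], event_times[-1]
    exceed = sum(
        shortened_count(event_times, rng.uniform(lo, hi, perturb_times.size))
        >= observed
        for _ in range(n_trials)
    )
    return observed, exceed / n_trials
```

    A small resulting p-value would suggest that perturbations shorten recurrence intervals more often than chance, mirroring the weak correlation the abstract reports.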

  5. Catalog of Oroville, California, earthquakes; June 7, 1975 to July 31, 1976

    USGS Publications Warehouse

    Mantis, Constance; Lindh, Allan; Savage, William; Marks, Shirley

    1979-01-01

    On August 1, 1975, at 2020 GMT a magnitude 5.7 (ML) earthquake occurred 15 km south of Oroville, California, in the western foothills of the Sierra Nevada. It was preceded by 61 foreshocks that began on June 7, 1975, and was followed by thousands of aftershocks. Several studies have reported locations or analyses of various subsets of the Oroville sequence, including Morrison and others (1975), Savage and others (1975), Lester and others (1975), Toppozada and others (1975), Ryall and others (1975), Bufe and others (1976), Morrison and others (1976), and Lahr and others (1976). In this report arrival time data have been compiled from the original records at several institutions to produce a single catalog of the Oroville sequence from June 7, 1975, through July 31, 1976. This study has four objectives: (1) to compile a list of earthquakes in the Oroville sequence that is as complete as possible above the minimum magnitude threshold of approximately 1.0; (2) to determine accurate and uniform hypocentral coordinates for the earthquakes; (3) to determine reliable and consistent magnitude values for the sequence; and (4) to provide a statistically uniform basis for further investigation of the physical processes involved in the Oroville sequence as revealed by the parameters of the foreshocks and aftershocks. The basis and procedures for the data analysis are described in this report.

  6. Earthquakes, July-August, 1979

    USGS Publications Warehouse

    Person, W.J.

    1980-01-01

    In the United States, on August 6, central California experienced a moderately strong earthquake, which injured several people and caused some damage. A number of earthquakes occurred in other parts of the United States but caused very little damage. 

  7. Earthquake effects at nuclear reactor facilities: San Fernando earthquake of February 9th, 1971

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Howard, G.; Ibanez, P.; Matthiesen, F.

    1972-02-01

    The effects of the San Fernando earthquake of February 9, 1971 on 26 reactor facilities located in California, Arizona, and Nevada are reported. The safety performance of the facilities during the earthquake is discussed. (JWR)

  8. M≥7 Earthquake rupture forecast and time-dependent probability for the Sea of Marmara region, Turkey

    USGS Publications Warehouse

    Murru, Maura; Akinci, Aybige; Falcone, Guiseppe; Pucci, Stefano; Console, Rodolfo; Parsons, Thomas E.

    2016-01-01

    We forecast time-independent and time-dependent earthquake ruptures in the Marmara region of Turkey for the next 30 years using a new fault-segmentation model. We also augment time-dependent Brownian Passage Time (BPT) probability with static Coulomb stress changes (ΔCFF) from interacting faults. We calculate Mw > 6.5 probability from 26 individual fault sources in the Marmara region. We also consider a multisegment rupture model that allows higher-magnitude ruptures over some segments of the Northern branch of the North Anatolian Fault Zone (NNAF) beneath the Marmara Sea. A total of 10 different Mw=7.0 to Mw=8.0 multisegment ruptures are combined with the other regional faults at rates that balance the overall moment accumulation. We use Gaussian random distributions to treat parameter uncertainties (e.g., aperiodicity, maximum expected magnitude, slip rate, and consequently mean recurrence time) of the statistical distributions associated with each fault source. We then estimate uncertainties of the 30-year probability values for the next characteristic event obtained from three different models (Poisson, BPT, and BPT+ΔCFF) using a Monte Carlo procedure. The Gerede fault segment located at the eastern end of the Marmara region shows the highest 30-yr probability, with a Poisson value of 29%, and a time-dependent interaction probability of 48%. We find an aggregated 30-yr Poisson probability of M >7.3 earthquakes at Istanbul of 35%, which increases to 47% if time dependence and stress transfer are considered. We calculate a 2-fold probability gain (ratio time-dependent to time-independent) on the southern strands of the North Anatolian Fault Zone.
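
    The time-dependent renewal probability used above can be sketched with SciPy. The BPT model is an inverse-Gaussian distribution with mean recurrence time m and aperiodicity α, which in SciPy's parametrization is invgauss(mu=α², scale=m/α²); the parameter values in the example are illustrative only, not taken from the paper.

```python
import math

from scipy.stats import invgauss

def bpt_conditional_prob(mean_recurrence, aperiodicity, elapsed, horizon=30.0):
    """P(event in (t, t + horizon] | quiet through t) under a Brownian
    Passage Time renewal model; times in years."""
    dist = invgauss(mu=aperiodicity**2,
                    scale=mean_recurrence / aperiodicity**2)
    survival = dist.sf(elapsed)
    if survival == 0.0:
        return 1.0
    return (dist.cdf(elapsed + horizon) - dist.cdf(elapsed)) / survival

def poisson_prob(mean_recurrence, horizon=30.0):
    """Time-independent comparison: P(one or more events in the horizon)."""
    return 1.0 - math.exp(-horizon / mean_recurrence)

# Illustrative values: 250-yr mean recurrence, aperiodicity 0.5,
# and 240 yr elapsed since the last event.
p_bpt = bpt_conditional_prob(250.0, 0.5, 240.0)
p_poisson = poisson_prob(250.0)
```

    Comparing the two outputs reproduces the kind of probability gain (time-dependent over Poisson) the abstract reports for the Marmara faults.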

  9. Converting Advances in Seismology into Earthquake Science

    NASA Astrophysics Data System (ADS)

    Hauksson, Egill; Shearer, Peter; Vidale, John

    2004-01-01

    Federal and state agencies and university groups all operate seismic networks in California. The U.S. Geological Survey (USGS) operates seismic networks in California in cooperation with the California Institute of Technology (Caltech) in southern California, and the University of California (UC) at Berkeley in northern California. The California Geological Survey (CGS) and the USGS National Strong Motion Program (NSMP) operate dial-out strong motion instruments in the state, primarily to capture data from large earthquakes for earthquake engineering and, more recently, emergency response. The California Governor's Office of Emergency Services (OES) provides leadership for the most recent project, the California Integrated Seismic Network (CISN), to integrate all of the California efforts, and to take advantage of the emergency response capabilities of the seismic networks. The core members of the CISN are Caltech, UC Berkeley, CGS, USGS Menlo Park, and USGS Pasadena (http://www.cisn.org). New seismic instrumentation is in place across southern California, and significant progress has been made in improving instrumentation in northern California. Since 2001, these new field instrumentation efforts, data sharing, and software development for real-time reporting and archiving have been coordinated through the CISN. The CISN is also the California region of the Advanced National Seismic System (ANSS). In addition, EarthScope deployments of USArray that will begin in early 2004 in California are coordinated with the CISN. The southern and northern California earthquake data centers (SCEDC and NCEDC) have new capabilities that enable seismologists to obtain large volumes of data with only modest effort.

  10. Application of an improved spectral decomposition method to examine earthquake source scaling in Southern California

    NASA Astrophysics Data System (ADS)

    Trugman, Daniel T.; Shearer, Peter M.

    2017-04-01

    Earthquake source spectra contain fundamental information about the dynamics of earthquake rupture. However, the inherent tradeoffs in separating source and path effects, when combined with limitations in recorded signal bandwidth, make it challenging to obtain reliable source spectral estimates for large earthquake data sets. We present here a stable and statistically robust spectral decomposition method that iteratively partitions the observed waveform spectra into source, receiver, and path terms. Unlike previous methods of its kind, our new approach provides formal uncertainty estimates and does not assume self-similar scaling in earthquake source properties. Its computational efficiency allows us to examine large data sets (tens of thousands of earthquakes) that would be impractical to analyze using standard empirical Green's function-based approaches. We apply the spectral decomposition technique to P wave spectra from five areas of active contemporary seismicity in Southern California: the Yuha Desert, the San Jacinto Fault, and the Big Bear, Landers, and Hector Mine regions of the Mojave Desert. We show that the source spectra are generally consistent with an increase in median Brune-type stress drop with seismic moment but that this observed deviation from self-similar scaling is both model dependent and varies in strength from region to region. We also present evidence for significant variations in median stress drop and stress drop variability on regional and local length scales. These results both contribute to our current understanding of earthquake source physics and have practical implications for the next generation of ground motion prediction assessments.
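
    The core idea of spectral decomposition can be illustrated with a toy alternating-average fit of log spectra to per-source plus per-receiver terms (the authors' implementation additionally solves for path terms, robustness, and formal uncertainties; all names below are hypothetical):

```python
import numpy as np

def decompose(log_spectra, src_idx, rec_idx, n_iter=20):
    """Iteratively partition observed log spectra (records x frequencies)
    into per-source and per-receiver spectral terms."""
    n_src, n_rec = src_idx.max() + 1, rec_idx.max() + 1
    n_freq = log_spectra.shape[1]
    src = np.zeros((n_src, n_freq))
    rec = np.zeros((n_rec, n_freq))
    for _ in range(n_iter):
        resid = log_spectra - rec[rec_idx]      # strip current receiver terms
        for i in range(n_src):
            src[i] = resid[src_idx == i].mean(axis=0)
        resid = log_spectra - src[src_idx]      # strip current source terms
        for j in range(n_rec):
            rec[j] = resid[rec_idx == j].mean(axis=0)
    return src, rec
```

    Note the split is only determined up to a constant trade-off between source and receiver terms, which is why real implementations pin the absolute level with an empirical Green's function or a reference spectral model.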

  11. Fluid-faulting interactions: Fracture-mesh and fault-valve behavior in the February 2014 Mammoth Mountain, California, earthquake swarm

    USGS Publications Warehouse

    Shelly, David R.; Taira, Taka’aki; Prejean, Stephanie; Hill, David P.; Dreger, Douglas S.

    2015-01-01

    Faulting and fluid transport in the subsurface are highly coupled processes, which may manifest seismically as earthquake swarms. A swarm in February 2014 beneath densely monitored Mammoth Mountain, California, provides an opportunity to witness these interactions in high resolution. Toward this goal, we employ massive waveform-correlation-based event detection and relative relocation, which quadruples the swarm catalog to more than 6000 earthquakes and produces high-precision locations even for very small events. The swarm's main seismic zone forms a distributed fracture mesh, with individual faults activated in short earthquake bursts. The largest event of the sequence, M 3.1, apparently acted as a fault valve and was followed by a distinct wave of earthquakes propagating ~1 km westward from the updip edge of rupture, 1–2 h later. Late in the swarm, multiple small, shallower subsidiary faults activated with pronounced hypocenter migration, suggesting that a broader fluid pressure pulse propagated through the subsurface.

  12. Computer systems for automatic earthquake detection

    USGS Publications Warehouse

    Stewart, S.W.

    1974-01-01

    U.S. Geological Survey seismologists in Menlo Park, California, are utilizing the speed, reliability, and efficiency of minicomputers to monitor seismograph stations and to automatically detect earthquakes. An earthquake detection computer system, believed to be the only one of its kind in operation, automatically reports about 90 percent of all local earthquakes recorded by a network of over 100 central California seismograph stations. The system also monitors the stations for signs of malfunction or abnormal operation. Before the automatic system was put in operation, all of the earthquakes recorded had to be detected by manually searching the records, a time-consuming process. With the automatic detection system, the stations are efficiently monitored continuously.

  13. Earthquake!

    ERIC Educational Resources Information Center

    Hernandez, Hildo

    2000-01-01

    Examines the types of damage experienced by California State University at Northridge during the 1994 earthquake and discusses the lessons learned in handling this emergency. The problem of loose asbestos is addressed. (GR)

  14. A physical model for earthquakes. I - Fluctuations and interactions. II - Application to southern California

    NASA Technical Reports Server (NTRS)

    Rundle, John B.

    1988-01-01

    The idea that earthquakes represent a fluctuation about the long-term motion of plates is expressed mathematically through the fluctuation hypothesis, under which all physical quantities that pertain to the occurrence of earthquakes are required to depend on the difference between the present state of slip on the fault and its long-term average. It is shown that under certain circumstances the model fault dynamics undergo a sudden transition from a spatially ordered, temporally disordered state to a spatially disordered, temporally ordered state, and that the latter states are stable for long intervals of time. For long enough faults, the dynamics are evidently chaotic. The methods developed are then used to construct a detailed model for earthquake dynamics in southern California. The result is a set of slip-time histories for all the major faults, which are similar to data obtained by geological trenching studies. Although there is an element of periodicity to the events, the patterns shift, change, and evolve with time. Time scales for pattern evolution seem to be of the order of a thousand years for average recurrence intervals of about a hundred years.

  15. Earthquakes March-April 1992

    USGS Publications Warehouse

    Person, Waverly J.

    1992-01-01

    The months of March and April were quite active seismically speaking. There was one major earthquake (7.0) in California. Earthquake-related deaths were reported in Iran, Costa Rica, Turkey, and Germany.

  16. Slip on the San Andreas fault at Parkfield, California, over two earthquake cycles, and the implications for seismic hazard

    USGS Publications Warehouse

    Murray, J.; Langbein, J.

    2006-01-01

    Parkfield, California, which experienced M 6.0 earthquakes in 1934, 1966, and 2004, is one of the few locales for which geodetic observations span multiple earthquake cycles. We undertake a comprehensive study of deformation over the most recent earthquake cycle and explore the results in the context of geodetic data collected prior to the 1966 event. Through joint inversion of the variety of Parkfield geodetic measurements (trilateration, two-color laser, and Global Positioning System), including previously unpublished two-color data, we estimate the spatial distribution of slip and slip rate along the San Andreas using a fault geometry based on precisely relocated seismicity. Although the three most recent Parkfield earthquakes appear complementary in their along-strike distributions of slip, they do not produce uniform strain release along strike over multiple seismic cycles. Since the 1934 earthquake, more than 1 m of slip deficit has accumulated on portions of the fault that slipped in the 1966 and 2004 earthquakes, and an average of 2 m of slip deficit exists on the 33 km of the fault southeast of Gold Hill to be released in a future, perhaps larger, earthquake. It appears that the fault is capable of partially releasing stored strain in moderate earthquakes, maintaining a disequilibrium through multiple earthquake cycles. This complicates the application of simple earthquake recurrence models that assume only the strain accumulated since the most recent event is relevant to the size or timing of an upcoming earthquake. Our findings further emphasize that accumulated slip deficit is not sufficient for earthquake nucleation.
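
    The deficit figures quoted above come from simple bookkeeping: slip deficit is loading at the long-term fault slip rate minus the slip released coseismically. A hypothetical illustration (the rate and slip values are made up, not Parkfield estimates):

```python
def slip_deficit_m(slip_rate_mm_per_yr, years, coseismic_slips_m):
    """Accumulated slip deficit in meters: tectonic loading minus the
    slip released in earthquakes over the same period."""
    loading_m = slip_rate_mm_per_yr * 1e-3 * years
    return loading_m - sum(coseismic_slips_m)

# 25 mm/yr of loading over 70 yr, partly released by two events
# with 0.5 m and 0.4 m of slip: 1.75 m - 0.9 m = 0.85 m of deficit.
deficit = slip_deficit_m(25.0, 70.0, [0.5, 0.4])
```

    As the abstract cautions, a positive deficit by this accounting is necessary but not sufficient for earthquake nucleation.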

  17. Analysis of the Seismicity Preceding Large Earthquakes

    NASA Astrophysics Data System (ADS)

    Stallone, A.; Marzocchi, W.

    2016-12-01

    The most common earthquake forecasting models assume that the magnitude of the next earthquake is independent of the past. This feature is probably one of the most severe limitations on the capability to forecast large earthquakes. In this work, we investigate this aspect empirically, exploring whether spatial-temporal variations in seismicity encode some information on the magnitude of future earthquakes. For this purpose, and to verify the universality of the findings, we consider seismic catalogs covering quite different space-time-magnitude windows, such as the Alto Tiberina Near Fault Observatory (TABOO) catalog and the California and Japanese seismic catalogs. Our method is inspired by the statistical methodology proposed by Zaliapin (2013) to distinguish triggered and background earthquakes, using nearest-neighbor clustering analysis in a two-dimensional plane defined by rescaled time and space. In particular, we generalize the nearest-neighbor metric to one based on k-nearest-neighbors clustering analysis, which allows us to consider the overall space-time-magnitude distribution of the k earthquakes (k foreshocks) that anticipate one target event (the mainshock); we then analyze the statistical properties of the clusters identified in this rescaled space. In essence, the main goal of this study is to verify whether different classes of mainshock magnitude are characterized by distinctive k-foreshock distributions. The final step is to show how the findings of this work may (or may not) improve the skill of existing earthquake forecasting models.

  18. Postearthquake relaxation and aftershock accumulation linearly related after the 2003 M 6.5 Chengkung, Taiwan, and the 2004 M 6.0 Parkfield, California, earthquakes

    USGS Publications Warehouse

    Savage, J.C.; Yu, S.-B.

    2007-01-01

    We treat both the number of earthquakes and the deformation following a mainshock as the superposition of a steady background accumulation and the post-earthquake process. The preseismic displacement and seismicity rates r_u and r_E are used as estimates of the background rates. Let t be the time after the mainshock, u(t) + u0 the postseismic displacement less the background accumulation r_u t, and ΔN(t) the observed cumulative number of postseismic earthquakes less the background accumulation r_E t. For the first 160 days (duration limited by the occurrence of another nearby earthquake) following the Chengkung (M 6.5, 10 December 2003, eastern Taiwan) and the first 560 days following the Parkfield (M 6.0, 28 September 2004, central California) earthquakes, u(t) + u0 is a linear function of ΔN(t). The aftershock accumulation ΔN(t) for both earthquakes is described by the modified Omori law dΔN/dt ∝ (1 + t/τ)^(-p) with p = 0.96 and τ = 0.03 days. Although the Chengkung earthquake involved sinistral, reverse slip on a moderately dipping fault and the Parkfield earthquake right-lateral slip on a near-vertical fault, the earthquakes share an unusual feature: both occurred on faults exhibiting interseismic fault creep at the surface. The source of the observed postseismic deformation appears to be afterslip on the coseismic rupture. The linear relation between u(t) + u0 and ΔN(t) suggests that this afterslip also generates the aftershocks. The linear relation between u(t) + u0 and ΔN(t) obtains after neither the 1999 M 7.1 Hector Mine (southern California) nor the 1999 M 7.6 Chi-Chi (central Taiwan) earthquakes, neither of which occurred on fault segments exhibiting fault creep.
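
    The modified Omori decay quoted above integrates in closed form, which is convenient for comparing cumulative aftershock counts against data; the productivity constant below is a free, hypothetical parameter, not a value from the paper.

```python
def omori_rate(t, productivity, p=0.96, tau=0.03):
    """Aftershock rate dN/dt = A * (1 + t/tau)**(-p); t, tau in days."""
    return productivity * (1.0 + t / tau) ** (-p)

def omori_cumulative(t, productivity, p=0.96, tau=0.03):
    """Closed-form integral of the rate from 0 to t (valid for p != 1):
    A * tau / (1 - p) * ((1 + t/tau)**(1 - p) - 1)."""
    return productivity * tau / (1.0 - p) * ((1.0 + t / tau) ** (1.0 - p) - 1.0)
```

    Differentiating the cumulative form recovers the rate, so fitting either the rate or the running count of aftershocks gives the same (p, τ) pair.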

  19. Epistemic uncertainty in California-wide synthetic seismicity simulations

    USGS Publications Warehouse

    Pollitz, Fred F.

    2011-01-01

    The generation of seismicity catalogs on synthetic fault networks holds the promise of providing key inputs into probabilistic seismic-hazard analysis, for example, the coefficient of variation, mean recurrence time as a function of magnitude, the probability of fault-to-fault ruptures, and conditional probabilities for foreshock–mainshock triggering. I employ a seismicity simulator that includes the following ingredients: static stress transfer, viscoelastic relaxation of the lower crust and mantle, and vertical stratification of elastic and viscoelastic material properties. A cascade mechanism combined with a simple Coulomb failure criterion is used to determine the initiation, propagation, and termination of synthetic ruptures. It is employed on a 3D fault network provided by Steve Ward (unpublished data, 2009) for the Southern California Earthquake Center (SCEC) Earthquake Simulators Group. This all-California fault network, initially consisting of 8000 patches, each of ∼12 square kilometers in size, has been rediscretized into ∼100,000 patches, each of ∼1 square kilometer in size, in order to simulate the evolution of California seismicity and crustal stress at magnitude M∼5–8. Resulting synthetic seismicity catalogs spanning 30,000 yr and about one-half million events are evaluated with magnitude-frequency and magnitude-area statistics. For a priori choices of fault-slip rates and mean stress drops, I explore the sensitivity of various constructs on input parameters, particularly mantle viscosity. Slip maps obtained for the southern San Andreas fault show that the ability of segment boundaries to inhibit slip across the boundaries (e.g., to prevent multisegment ruptures) is systematically affected by mantle viscosity.

  1. Transient stress-coupling between the 1992 Landers and 1999 Hector Mine, California, earthquakes

    USGS Publications Warehouse

    Masterlark, Timothy; Wang, H.F.

    2002-01-01

    A three-dimensional finite-element model (FEM) of the Mojave block region in southern California is constructed to investigate transient stress-coupling between the 1992 Landers and 1999 Hector Mine earthquakes. The FEM simulates a poroelastic upper-crust layer coupled to a viscoelastic lower-crust layer, which is decoupled from the upper mantle. FEM predictions of the transient mechanical behavior of the crust are constrained by global positioning system (GPS) data, interferometric synthetic aperture radar (InSAR) images, fluid-pressure data from water wells, and the dislocation source of the 1999 Hector Mine earthquake. Two time-dependent parameters, hydraulic diffusivity of the upper crust and viscosity of the lower crust, are calibrated to 10⁻² m²·sec⁻¹ and 5 × 10¹⁸ Pa·sec, respectively. The hydraulic diffusivity is relatively insensitive to heterogeneous fault-zone permeability specifications and fluid-flow boundary conditions along the elastic free-surface at the top of the problem domain. The calibrated FEM is used to predict the evolution of Coulomb stress during the interval separating the 1992 Landers and 1999 Hector Mine earthquakes. The predicted change in Coulomb stress near the hypocenter of the Hector Mine earthquake increases from 0.02 to 0.05 MPa during the 7-yr interval separating the two events. This increase is primarily attributed to the recovery of decreased excess fluid pressure from the 1992 Landers coseismic (undrained) strain field. Coulomb stress predictions are insensitive to small variations of fault-plane dip and hypocentral depth estimations of the Hector Mine rupture.

  2. Connecting slow earthquakes to huge earthquakes.

    PubMed

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  3. Instrumental intensity distribution for the Hector Mine, California, and the Chi-Chi, Taiwan, earthquakes: Comparison of two methods

    USGS Publications Warehouse

    Sokolov, V.; Wald, D.J.

    2002-01-01

    We compare two methods of seismic-intensity estimation from ground-motion records for the two recent strong earthquakes: the 1999 (M 7.1) Hector Mine, California, and the 1999 (M 7.6) Chi-Chi, Taiwan. The first technique utilizes the peak ground acceleration (PGA) and velocity (PGV), and it is used for rapid generation of the instrumental intensity map in California. The other method is based on the revised relationships between intensity and Fourier amplitude spectrum (FAS). The results of using the methods are compared with independently observed data and between the estimations from the records. For the case of the Hector Mine earthquake, the calculated intensities in general agree with the observed values. For the case of the Chi-Chi earthquake, the areas of maximum calculated intensity correspond to the areas of the greatest damage and highest number of fatalities. However, the FAS method produces higher-intensity values than those of the peak amplitude method. The specific features of ground-motion excitation during the large, shallow, thrust earthquake may be considered a reason for the discrepancy. The use of PGA and PGV is simple; however, the use of FAS provides a natural consideration of site amplification by means of generalized or site-specific spectral ratios. Because the calculation of seismic-intensity maps requires rapid processing of data from a large network, it is very practical to generate a "first-order" map from the recorded peak motions. Then, a "second-order" map may be compiled using an amplitude-spectra method on the basis of available records and numerical modeling of the site-dependent spectra for the regions of sparse station spacing.

  4. Collaboratory for the Study of Earthquake Predictability

    NASA Astrophysics Data System (ADS)

    Schorlemmer, D.; Jordan, T. H.; Zechar, J. D.; Gerstenberger, M. C.; Wiemer, S.; Maechling, P. J.

    2006-12-01

    Earthquake prediction is one of the most difficult problems in physical science and, owing to its societal implications, one of the most controversial. The study of earthquake predictability has been impeded by the lack of an adequate experimental infrastructure---the capability to conduct scientific prediction experiments under rigorous, controlled conditions and evaluate them using accepted criteria specified in advance. To remedy this deficiency, the Southern California Earthquake Center (SCEC) is working with its international partners, which include the European Union (through the Swiss Seismological Service) and New Zealand (through GNS Science), to develop a virtual, distributed laboratory with a cyberinfrastructure adequate to support a global program of research on earthquake predictability. This Collaboratory for the Study of Earthquake Predictability (CSEP) will extend the testing activities of SCEC's Working Group on Regional Earthquake Likelihood Models, from which we will present first results. CSEP will support rigorous procedures for registering prediction experiments on regional and global scales, community-endorsed standards for assessing probability-based and alarm-based predictions, access to authorized data sets and monitoring products from designated natural laboratories, and software to allow researchers to participate in prediction experiments. CSEP will encourage research on earthquake predictability by supporting an environment for scientific prediction experiments that allows the predictive skill of proposed algorithms to be rigorously compared with standardized reference methods and data sets. It will thereby reduce the controversies surrounding earthquake prediction, and it will allow the results of prediction experiments to be communicated to the scientific community, governmental agencies, and the general public in an appropriate research context.

  5. Remotely triggered earthquakes following moderate main shocks

    USGS Publications Warehouse

    Hough, S.E.

    2007-01-01

    Since 1992, remotely triggered earthquakes have been identified following large (M > 7) earthquakes in California as well as in other regions. These events, which occur at much greater distances than classic aftershocks, occur predominantly in active geothermal or volcanic regions, leading to theories that the earthquakes are triggered when passing seismic waves cause disruptions in magmatic or other fluid systems. In this paper, I focus on observations of remotely triggered earthquakes following moderate main shocks in diverse tectonic settings. I summarize evidence that remotely triggered earthquakes occur commonly in mid-continent and collisional zones. This evidence is derived from analysis of both historic earthquake sequences and from instrumentally recorded M5-6 earthquakes in eastern Canada. The latter analysis suggests that, while remotely triggered earthquakes do not occur pervasively following moderate earthquakes in eastern North America, a low level of triggering often does occur at distances beyond conventional aftershock zones. The inferred triggered events occur at the distances at which SmS waves are known to significantly increase ground motions. A similar result was found for 28 recent M5.3-7.1 earthquakes in California. In California, seismicity is found to increase on average to a distance of at least 200 km following moderate main shocks. This supports the conclusion that, even at distances of ∼100 km, dynamic stress changes control the occurrence of triggered events. There are two explanations that can account for the occurrence of remotely triggered earthquakes in intraplate settings: (1) they occur at local zones of weakness, or (2) they occur in zones of local stress concentration. © 2007 The Geological Society of America.

  6. Reply to “Comment on “Should Memphis build for California's earthquakes?” From A.D. Frankel”

    NASA Astrophysics Data System (ADS)

    Stein, Seth; Tomasello, Joseph; Newman, Andrew

    Carl Sagan observed that “extraordinary claims require extraordinary evidence.” In our view, A.D. Frankel's arguments (see accompanying Comment piece) do not reach the level required to demonstrate the counter-intuitive propositions that the earthquake hazard in the New Madrid Seismic Zone (NMSZ) is comparable to that in coastal California, and that buildings should be built to similar standards. This interchange is the latest in an ongoing debate beginning with Newman et al.'s [1999a] recommendation, based on analysis of Global Positioning System and earthquake data, that Frankel et al.'s [1996] estimate of California-level seismic hazard for the NMSZ should be reduced. Most points at issue, except for those related to the costs and benefits of the proposed new International Building Code 2000, have already been argued at length by both sides in the literature [e.g., Schweig et al., 1999; Newman et al., 1999b, 2001; Cramer, 2001]. Hence, rather than rehash these points, we will try here to provide readers not enmeshed in this morass with an overview of the primary differences between our view and that of Frankel.

  7. Crowdsourced earthquake early warning.

    PubMed

    Minson, Sarah E; Brooks, Benjamin A; Glennie, Craig L; Murray, Jessica R; Langbein, John O; Owen, Susan E; Heaton, Thomas H; Iannucci, Robert A; Hauser, Darren L

    2015-04-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California's Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

  8. Earthquakes, November-December 1992

    USGS Publications Warehouse

    Person, W.J.

    1993-01-01

    There were two major earthquakes (7.0≤M<8.0) during the last two months of the year: a magnitude 7.5 earthquake on December 12 in the Flores region, Indonesia, and a magnitude 7.0 earthquake on December 20 in the Banda Sea. Earthquakes caused fatalities in China and Indonesia. The greatest number of deaths (2,500) for the year occurred in Indonesia. In Switzerland, six people were killed by an accidental explosion recorded by seismographs. In the United States, a magnitude 5.3 earthquake caused slight damage at Big Bear in southern California.

  9. Liquefaction and other ground failures in Imperial County, California, from the April 4, 2010, El Mayor-Cucapah earthquake

    USGS Publications Warehouse

    McCrink, Timothy P.; Pridmore, Cynthia L.; Tinsley, John C.; Sickler, Robert R.; Brandenberg, Scott J.; Stewart, Jonathan P.

    2011-01-01

    The Colorado River Delta region of southern Imperial Valley, California, and Mexicali Valley, Baja California, is a tectonically dynamic area characterized by numerous active faults and frequent large seismic events. Significant earthquakes that have been accompanied by surface fault rupture and/or soil liquefaction occurred in this region in 1892 (M7.1), 1915 (M6.3; M7.1), 1930 (M5.7), 1940 (M6.9), 1950 (M5.4), 1957 (M5.2), 1968 (M6.5), 1979 (M6.4), 1980 (M6.1), 1981 (M5.8), and 1987 (M6.2; M6.8). Following this trend, the M7.2 El Mayor-Cucapah earthquake of April 4, 2010, ruptured approximately 120 kilometers along several known faults in Baja California. Liquefaction caused by the M7.2 El Mayor-Cucapah earthquake was widespread throughout the southern Imperial Valley but concentrated in the southwest corner of the valley, southwest of the city centers of Calexico and El Centro, where ground motions were highest. Although there are few strong motion recordings in the very western part of the area, the recordings that do exist indicate that ground motions were on the order of 0.3 to 0.6 g where the majority of liquefaction occurrences were found. More distant liquefaction occurrences, at Fites Road southwest of Brawley and along the Rosita Canal northwest of Holtville, were triggered where ground motions were about 0.2 g. Damage to roads was associated mainly with liquefaction of sandy river deposits beneath bridge approach fills, and in some cases liquefaction within the fills. Liquefaction damage to canal and drain levees was not always accompanied by vented sand, but the nature of the damage leads the authors to infer that liquefaction was involved in the majority of observed cases. Liquefaction-related damage to several public facilities - Calexico Waste Water Treatment Plant, Fig Lagoon levee system, and Sunbeam Lake Dam in particular - appears to be extensive. The cost to repair these facilities to prevent future liquefaction damage will likely be prohibitive. As

  10. Water-level changes induced by local and distant earthquakes at Long Valley caldera, California

    USGS Publications Warehouse

    Roeloffs, Evelyn A.; Sneed, Michelle; Galloway, Devin L.; Sorey, Michael L.; Farrar, Christopher D.; Howle, James F.; Hughes, J.

    2003-01-01

    Distant as well as local earthquakes have induced groundwater-level changes persisting for days to weeks at Long Valley caldera, California. Four wells open to formations as deep as 300 m have responded to 16 earthquakes, and responses to two earthquakes in the 3-km-deep Long Valley Exploratory Well (LVEW) show that these changes are not limited to weathered or unconsolidated near-surface rocks. All five wells exhibit water-level variations in response to earth tides, indicating they can be used as low-resolution strainmeters. Earthquakes induce gradual water-level changes that increase in amplitude for as long as 30 days, then return more slowly to pre-earthquake levels. The gradual water-level changes are always drops at wells LKT, LVEW, and CH-10B, and always rises at well CW-3. At a dilatometer just outside the caldera, earthquake-induced strain responses consist of either a step followed by a contractional strain-rate increase, or a transient contractional signal that reaches a maximum in about seven days and then returns toward the pre-earthquake value. The sizes of the gradual water-level changes generally increase with earthquake magnitude and decrease with hypocentral distance. Local earthquakes in Long Valley produce coseismic water-level steps; otherwise the responses to local earthquakes and distant earthquakes are indistinguishable. In particular, water-level and strain changes in Long Valley following the 1992 M7.3 Landers earthquake, 450 km distant, closely resemble those initiated by a M4.9 local earthquake on November 22, 1997, during a seismic swarm with features indicative of fluid involvement. At the LKT well, many of the response time histories are identical for 20 days after each earthquake, and can be matched by a theoretical solution giving the pore pressure as a function of time due to diffusion of a nearby, instantaneous, pressure drop. Such pressure drops could be produced by accelerated inflation of the resurgent dome by amounts too
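
    The diffusion model invoked above can be made concrete with a minimal sketch. The snippet below evaluates the standard point-source solution of the 3-D pore-pressure diffusion equation for an instantaneous, localized pressure drop; the distance, diffusivity, and source-strength values are hypothetical illustrations, not parameters from the study.

```python
import math

def pore_pressure_change(r_m, t_s, c_m2_s, source_strength):
    """Pore-pressure change at distance r and time t after an
    instantaneous, localized pressure perturbation (point-source
    solution of the 3-D diffusion equation). source_strength is the
    integrated source amplitude; c is the hydraulic diffusivity."""
    if t_s <= 0:
        return 0.0
    kernel = (4.0 * math.pi * c_m2_s * t_s) ** -1.5
    return source_strength * kernel * math.exp(-r_m**2 / (4.0 * c_m2_s * t_s))

# At a fixed distance the response rises, peaks, then slowly decays,
# qualitatively like the gradual water-level changes described above.
# For this 3-D kernel the peak time is t_peak = r^2 / (6 c).
r, c = 500.0, 1.0          # hypothetical: 500 m away, c = 1 m^2/s
t_peak = r**2 / (6.0 * c)  # ~41,667 s (~0.5 day)
```

    The rise-then-decay shape, rather than a step, is what distinguishes a diffusive response from a direct coseismic strain step.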

  11. Analysis of Earthquake Source Spectra in Salton Trough

    NASA Astrophysics Data System (ADS)

    Chen, X.; Shearer, P. M.

    2009-12-01

    Previous studies of the source spectra of small earthquakes in southern California show that average Brune-type stress drops vary among different regions, with particularly low stress drops observed in the Salton Trough (Shearer et al., 2006). The Salton Trough marks the southern end of the San Andreas Fault and is prone to earthquake swarms, some of which are driven by aseismic creep events (Lohman and McGuire, 2007). In order to learn the stress state and understand the physical mechanisms of swarms and slow slip events, we analyze the source spectra of earthquakes in this region. We obtain Southern California Seismic Network (SCSN) waveforms for earthquakes from 1977 to 2009 archived at the Southern California Earthquake Center (SCEC) data center, which includes over 17,000 events. After resampling the data to a uniform 100 Hz sample rate, we compute spectra for both signal and noise windows for each seismogram, and select traces with a P-wave signal-to-noise ratio greater than 5 between 5 Hz and 15 Hz. Using selected displacement spectra, we isolate the source spectra from station terms and path effects using an empirical Green’s function approach. From the corrected source spectra, we compute corner frequencies and estimate moments and stress drops. Finally we analyze spatial and temporal variations in stress drop in the Salton Trough and compare them with studies of swarms and creep events to assess the evolution of faulting and stress in the region. References: Lohman, R. B., and J. J. McGuire (2007), Earthquake swarms driven by aseismic creep in the Salton Trough, California, J. Geophys. Res., 112, B04405, doi:10.1029/2006JB004596 Shearer, P. M., G. A. Prieto, and E. Hauksson (2006), Comprehensive analysis of earthquake source spectra in southern California, J. Geophys. Res., 111, B06303, doi:10.1029/2005JB003979.
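
    The corner-frequency-to-stress-drop step described above is commonly carried out with the Brune (1970) source model. The sketch below shows that calculation under assumed values (a shear-wave speed of 3.5 km/s and the Brune constant 2.34 relating source radius to corner frequency); it is an illustration of the standard relation, not the authors' exact procedure.

```python
import math

def brune_stress_drop(moment_nm, corner_freq_hz, beta_m_s=3500.0):
    """Brune-model stress drop (Pa) from seismic moment (N·m) and
    corner frequency (Hz); beta is the shear-wave speed (m/s).
    Source radius: r = 2.34 * beta / (2 * pi * fc)."""
    r = 2.34 * beta_m_s / (2.0 * math.pi * corner_freq_hz)
    return 7.0 * moment_nm / (16.0 * r**3)

def moment_from_magnitude(mw):
    """Hanks-Kanamori relation: M0 [N·m] = 10^(1.5*Mw + 9.1)."""
    return 10.0 ** (1.5 * mw + 9.1)

# Hypothetical example: an M2 event with a 10 Hz corner frequency
# yields a stress drop of a few tenths of an MPa, on the low side,
# as reported for the Salton Trough.
dsigma = brune_stress_drop(moment_from_magnitude(2.0), 10.0)
```

    Because stress drop scales with the cube of corner frequency, small errors in fc propagate strongly, which is one reason the empirical Green's function correction matters.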

  12. The California Integrated Seismic Network

    NASA Astrophysics Data System (ADS)

    Hellweg, M.; Given, D.; Hauksson, E.; Neuhauser, D.; Oppenheimer, D.; Shakal, A.

    2007-05-01

    The mission of the California Integrated Seismic Network (CISN) is to operate a reliable, modern system to monitor earthquakes throughout the state; to generate and distribute information in real-time for emergency response, for the benefit of public safety, and for loss mitigation; and to collect and archive data for seismological and earthquake engineering research. To meet these needs, the CISN operates data processing and archiving centers, as well as more than 3000 seismic stations. Furthermore, the CISN is actively developing and enhancing its infrastructure, including its automated processing and archival systems. The CISN integrates seismic and strong motion networks operated by the University of California Berkeley (UCB), the California Institute of Technology (Caltech), and the United States Geological Survey (USGS) offices in Menlo Park and Pasadena, as well as the USGS National Strong Motion Program (NSMP), and the California Geological Survey (CGS). The CISN operates two earthquake management centers (the NCEMC and SCEMC) where statewide, real-time earthquake monitoring takes place, and an engineering data center (EDC) for processing strong motion data and making it available in near real-time to the engineering community. These centers employ redundant hardware to minimize disruptions to the earthquake detection and processing systems. At the same time, dual feeds of data from a subset of broadband and strong motion stations are telemetered in real-time directly to both the NCEMC and the SCEMC to ensure the availability of statewide data in the event of a catastrophic failure at one of these two centers. The CISN uses a backbone T1 ring (with automatic backup over the internet) to interconnect the centers and the California Office of Emergency Services. The T1 ring enables real-time exchange of selected waveforms, derived ground motion data, phase arrivals, earthquake parameters, and ShakeMaps. With the goal of operating similar and redundant

  13. Earthquake Education in Prime Time

    NASA Astrophysics Data System (ADS)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in spring 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and The Department of Homeland Security Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU), on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs.
We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

  14. The Non-Regularity of Earthquake Recurrence in California: Lessons From Long Paleoseismic Records in Simple vs Complex Fault Regions (Invited)

    NASA Astrophysics Data System (ADS)

    Rockwell, T. K.

    2010-12-01

    A long paleoseismic record at Hog Lake on the central San Jacinto fault (SJF) in southern California documents evidence for 18 surface ruptures in the past 3.8-4 ka. This yields a long-term recurrence interval of about 210 years, consistent with its slip rate of ~16 mm/yr and field observations of 3-4 m of displacement per event. However, during the past 3800 years, the fault has switched from a quasi-periodic mode of earthquake production, during which the recurrence interval is similar to the long-term average, to clustered behavior with inter-event periods as short as a few decades. There are also some periods as long as 450 years during which there were no surface ruptures, and these periods are commonly followed by one to several closely-timed ruptures. The coefficient of variation (CV) for the timing of these earthquakes is about 0.6 for the past 4000 years (17 intervals). Similar behavior has been observed on the San Andreas Fault (SAF) south of the Transverse Ranges, where clusters of earthquakes have been followed by periods of lower seismic production, and the CV is as high as 0.7 for some portions of the fault. In contrast, the central North Anatolian Fault (NAF) in Turkey, which ruptured in 1944, appears to have produced ruptures with similar displacement at fairly regular intervals for the past 1600 years. With a CV of 0.16 for timing, and close to 0.1 for displacement, the 1944 rupture segment near Gerede appears to have been both periodic and characteristic. The SJF and SAF are part of a broad plate boundary system with multiple parallel strands with significant slip rates. Additional faults lie to the east (Eastern California shear zone) and west (faults of the LA basin and southern California Borderland), which makes the southern SAF system a complex and broad plate boundary zone. In comparison, the 1944 rupture section of the NAF is simple, straight and highly localized, which contrasts with the complex system of parallel faults in southern
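
    The coefficient of variation quoted above (CV ≈ 0.6 for the SJF versus 0.16 for the NAF) is simply the standard deviation of inter-event times divided by their mean: CV << 1 indicates quasi-periodic recurrence, CV ≈ 1 is Poisson-like, and CV > 1 indicates clustering. A minimal sketch, using hypothetical interval lists rather than the actual paleoseismic data:

```python
import statistics

def coefficient_of_variation(intervals):
    """CV = sample standard deviation / mean of inter-event times.
    CV << 1: quasi-periodic; CV ~ 1: Poisson-like; CV > 1: clustered."""
    return statistics.stdev(intervals) / statistics.mean(intervals)

# Hypothetical inter-event times in years (NOT the Hog Lake data):
quasi_periodic = [210, 190, 230, 200, 220]   # regular, NAF-like
clustered = [30, 40, 450, 60, 470]           # bursts and gaps, SJF-like
```

    Both lists have the same ~210-year mean recurrence, but only the CV distinguishes the regular from the clustered behavior.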

  15. Comparative Study of Earthquake Clustering in Relation to Hydraulic Activities at Geothermal Fields in California

    NASA Astrophysics Data System (ADS)

    Martínez-Garzón, P.; Zaliapin, I. V.; Ben-Zion, Y.; Kwiatek, G.; Bohnhoff, M.

    2017-12-01

    We investigate earthquake clustering properties from three geothermal reservoirs to clarify how earthquake patterns respond to hydraulic activities. We process ≈9 years of data from four datasets corresponding to the Geysers (both the entire field and a local subset), Coso and Salton Sea geothermal fields, California. For each, the completeness magnitude, b-value and fractal dimension are calculated and used to identify seismicity clusters using the nearest-neighbor approach of Zaliapin and Ben-Zion [2013a, 2013b]. Estimates of the temporal evolution of different clustering properties in relation to hydraulic parameters point to different responses of earthquake dynamics to hydraulic operations in each case study. The clustering at the Geysers at local scale and at the Salton Sea are most and least affected by hydraulic activities, respectively. The response of the earthquake clustering from different datasets to the hydraulic activities may reflect the regional seismo-tectonic complexity as well as the scale of the geothermal activities performed (e.g. number of active wells and superposition of injection and production activities). Two clustering properties significantly respond to hydraulic changes across all datasets: the background rates and the proportion of clusters consisting of a single event. Background rates are larger at the Geysers and Coso during high injection-production periods, while the opposite holds for the Salton Sea. This possibly reflects the different physical mechanisms controlling seismicity at each geothermal field. Additionally, a lower proportion of singles is found during time periods with higher injection-production rates. This may reflect decreasing effective stress in areas subjected to higher pore pressure and larger earthquake triggering by stress transfer.
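
    The nearest-neighbor approach of Zaliapin and Ben-Zion cited above links each event to the earlier event that minimizes a rescaled space-time-magnitude proximity; small proximities identify triggered offspring, large ones background events. A minimal sketch of that metric, with typical but assumed parameter values (b = 1.0, fractal dimension d_f = 1.6) and a simplified planar distance:

```python
import math

def nn_proximity(t_parent, t_child, r_km, mag_parent, b=1.0, d_f=1.6):
    """Zaliapin/Ben-Zion space-time-magnitude proximity between an
    earlier event i and a later event j:
        eta = dt * r^d_f * 10^(-b * m_i),  with dt = t_j - t_i > 0.
    Small eta suggests j is a triggered offspring of i."""
    dt = t_child - t_parent
    if dt <= 0:
        return math.inf
    return dt * (r_km ** d_f) * 10.0 ** (-b * mag_parent)

def nearest_parent(catalog, j):
    """Index of the earlier event minimizing eta for event j.
    catalog: list of (time_days, x_km, y_km, magnitude) tuples."""
    tj, xj, yj, _ = catalog[j]
    best, best_eta = None, math.inf
    for i, (ti, xi, yi, mi) in enumerate(catalog):
        if i == j:
            continue
        r = math.hypot(xj - xi, yj - yi)
        eta = nn_proximity(ti, tj, max(r, 1e-3), mi)
        if eta < best_eta:
            best, best_eta = i, eta
    return best, best_eta
```

    Thresholding the resulting eta distribution (often bimodal) is what separates clustered seismicity from the background rate tracked in the study.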

  16. Ring-Shaped Seismicity Structures in Southern California: Possible Preparation for Large Earthquake in the Los Angeles Basin

    NASA Astrophysics Data System (ADS)

    Kopnichev, Yu. F.; Sokolova, I. N.

    2017-12-01

    Some characteristics of seismicity in Southern California are studied. It is found that ring-shaped seismicity structures with threshold magnitudes Mth of 4.1, 4.1, and 3.8 formed prior to three large (Mw > 7.0) earthquakes in 1992, 1999, and 2010, respectively. The sizes of these structures are several times smaller than for intracontinental strike-slip events with similar magnitudes. Two ring-shaped structures are identified in areas east of the city of Los Angeles, where relatively large earthquakes have not occurred for at least 150 years. The magnitudes of large events which can occur in the areas of these structures are estimated on the basis of the previously obtained correlation dependence of ring sizes on magnitudes of the strike-slip earthquakes. Large events with magnitudes of Mw = 6.9 ± 0.2 and Mw = 8.6 ± 0.2 can occur in the area to the east of the city of Los Angeles and in the rupture zone of the 1857 great Fort Tejon earthquake, respectively. We believe that ring-structure formation, similarly to the other regions, is connected with deep-seated fluid migration.

  17. Earthquake watch

    USGS Publications Warehouse

    Hill, M.

    1976-01-01

    When the time comes that earthquakes can be predicted accurately, what shall we do with the knowledge? This was the theme of a November 1975 conference on earthquake warning and response, held in San Francisco and called by Assistant Secretary of the Interior Jack W. Carlson. Invited were officials of State and local governments from Alaska, California, Hawaii, Idaho, Montana, Nevada, Utah, Washington, and Wyoming, and representatives of the news media.

  18. Monitoring reservoir response to earthquakes and fluid extraction, Salton Sea geothermal field, California

    PubMed Central

    Taira, Taka’aki; Nayak, Avinash; Brenguier, Florent; Manga, Michael

    2018-01-01

    Continuous monitoring of in situ reservoir responses to stress transients provides insights into the evolution of geothermal reservoirs. By exploiting the stress dependence of seismic velocity changes, we investigate the temporal evolution of the reservoir stress state of the Salton Sea geothermal field (SSGF), California. We find that the SSGF experienced a number of sudden velocity reductions (~0.035 to 0.25%) that are most likely caused by openings of fractures due to dynamic stress transients (as small as 0.08 MPa and up to 0.45 MPa) from local and regional earthquakes. Depths of velocity changes are estimated to be about 0.5 to 1.5 km, similar to the depths of the injection and production wells. We derive an empirical in situ stress sensitivity of seismic velocity changes by relating velocity changes to dynamic stresses. We also observe systematic velocity reductions (0.04 to 0.05%) during earthquake swarms in mid-November 2009 and late-December 2010. On the basis of volumetric static and dynamic stress changes, the expected velocity reductions from the largest earthquakes with magnitude ranging from 3 to 4 in these swarms are less than 0.02%, which suggests that these earthquakes are likely not responsible for the velocity changes observed during the swarms. Instead, we argue that velocity reductions may have been induced by poroelastic opening of fractures due to aseismic deformation. We also observe a long-term velocity increase (~0.04%/year) that is most likely due to poroelastic contraction caused by the geothermal production. Our observations demonstrate that seismic interferometry provides insights into in situ reservoir response to stress changes. PMID:29326977
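
    The velocity changes reported above come from seismic interferometry; one standard way to measure a relative velocity change dv/v from correlation functions is the stretching method, sketched below under simplifying assumptions (uniform time grid, linear interpolation, grid search over stretch factors). This is an illustration of the technique, not the authors' processing code.

```python
import math

def stretch_dv_v(reference, current, times, eps_grid):
    """Stretching-method estimate of a relative velocity change.
    A homogeneous change dv/v = eps rescales lapse times so that
    current(t) ~ reference(t * (1 + eps)). Returns the eps on
    eps_grid whose stretched reference best correlates with current,
    along with that correlation coefficient."""
    def interp(trace, t):
        # linear interpolation on the uniform time grid (0 outside)
        dt = times[1] - times[0]
        k = (t - times[0]) / dt
        i = int(math.floor(k))
        if i < 0 or i >= len(trace) - 1:
            return 0.0
        frac = k - i
        return trace[i] * (1 - frac) + trace[i + 1] * frac

    def corr(a, b):
        na = math.sqrt(sum(x * x for x in a)) or 1.0
        nb = math.sqrt(sum(x * x for x in b)) or 1.0
        return sum(x * y for x, y in zip(a, b)) / (na * nb)

    best_eps, best_cc = 0.0, -2.0
    for eps in eps_grid:
        stretched = [interp(reference, t * (1.0 + eps)) for t in times]
        cc = corr(stretched, current)
        if cc > best_cc:
            best_eps, best_cc = eps, cc
    return best_eps, best_cc
```

    In practice the grid search is refined (or replaced by moving-window cross-spectral methods), but the principle of maximizing waveform similarity under a stretch is the same.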

  20. Earthquakes, July-August 1992

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    There were two major earthquakes (7.0≤M<8.0) during this reporting period. A magnitude 7.5 earthquake occurred in Kyrgyzstan on August 19 and a magnitude 7.0 quake struck the Ascension Island region on August 28. In southern California, aftershocks of the magnitude 7.6 earthquake on June 28, 1992, continued. One of these aftershocks caused damage and injuries, and at least one other aftershock caused additional damage. Earthquake-related fatalities were reported in Kyrgyzstan and Pakistan.

  1. Operational Earthquake Forecasting of Aftershocks for New England

    NASA Astrophysics Data System (ADS)

    Ebel, J.; Fadugba, O. I.

    2015-12-01

    Although the forecasting of mainshocks is not possible, recent research demonstrates that probabilistic forecasts of expected aftershock activity following moderate and strong earthquakes are possible. Previous work has shown that aftershock sequences in intraplate regions behave similarly to those in California, and thus the operational aftershock forecasting methods currently employed in California can be adopted for use in areas of the eastern U.S. such as New England. In our application, immediately after a felt earthquake in New England, a forecast of expected aftershock activity for the next 7 days will be generated based on a generic aftershock activity model. Approximately 24 hours after the mainshock, the parameters of the aftershock model will be updated using the aftershock activity observed to that point in time, and a new forecast of expected aftershock activity for the next 7 days will be issued. The forecast will estimate the average number of weak, felt aftershocks and the average expected number of aftershocks based on the aftershock statistics of past New England earthquakes. The forecast will also estimate the probability that an earthquake stronger than the mainshock will take place during the next 7 days. The aftershock forecast will specify the expected aftershock locations as well as the areas over which aftershocks of different magnitudes could be felt. The system will use web pages, email, and text messages to distribute the aftershock forecasts. For protracted aftershock sequences, new forecasts will be issued on a regular basis, such as weekly. Initially, the distribution system for the aftershock forecasts will be limited, but it will be expanded as experience with and confidence in the system grow.
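
    Generic aftershock models of the kind described above are often implemented with the Reasenberg and Jones (1989) rate, a modified-Omori decay scaled by the mainshock magnitude. A minimal sketch, using the oft-quoted generic California parameter values as placeholder defaults (an operational New England system would refit them to regional aftershock statistics, as the abstract describes):

```python
import math

def expected_aftershocks(mag_main, mag_min, t1_days, t2_days,
                         a=-1.67, b=0.91, c=0.05, p=1.08):
    """Expected number of aftershocks with M >= mag_min in the window
    [t1, t2] days after the mainshock, Reasenberg & Jones (1989) style:
        rate(t) = 10^(a + b*(Mm - M)) * (t + c)^(-p)
    Defaults are the commonly quoted generic California parameters."""
    amp = 10.0 ** (a + b * (mag_main - mag_min))
    if abs(p - 1.0) < 1e-9:
        integral = math.log(t2_days + c) - math.log(t1_days + c)
    else:
        integral = ((t2_days + c) ** (1 - p) - (t1_days + c) ** (1 - p)) / (1 - p)
    return amp * integral

def prob_one_or_more(n_expected):
    """Poisson probability of at least one such aftershock."""
    return 1.0 - math.exp(-n_expected)

# Hypothetical example: 7-day forecast of M>=3 aftershocks
# immediately after an M5 mainshock.
n = expected_aftershocks(5.0, 3.0, 0.0, 7.0)
p7 = prob_one_or_more(n)
```

    Updating the forecast 24 hours in means re-estimating (a, b, c, p) from the observed sequence and integrating from t1 = 1 day onward.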

  2. Analysis of Earthquake Recordings Obtained from the Seafloor Earthquake Measurement System (SEMS) Instruments Deployed off the Coast of Southern California

    USGS Publications Warehouse

    Boore, D.M.; Smith, C.E.

    1999-01-01

    For more than 20 years, a program has been underway to obtain records of earthquake shaking on the seafloor at sites offshore of southern California, near oil platforms. The primary goal of the program is to obtain data that can help determine if ground motions at offshore sites are significantly different from those at onshore sites; if so, caution may be necessary in using onshore motions as the basis for the seismic design of oil platforms. We analyze data from eight earthquakes recorded at six offshore sites; these are the most important data recorded at these stations to date. Seven of the earthquakes were recorded at only one offshore station; the eighth event was recorded at two sites. The earthquakes range in magnitude from 4.7 to 6.1. Because of the scarcity of multiple recordings from any one event, most of the analysis is based on the ratio of spectra from vertical and horizontal components of motion. The results clearly show that the offshore motions have very low vertical motions compared to those from an average onshore site, particularly at short periods. Theoretical calculations find that the water layer has little effect on the horizontal components of motion but that it produces a strong spectral null on the vertical component at the resonant frequency of P waves in the water layer. The vertical-to-horizontal ratios for a few selected onshore sites underlain by relatively low shear-wave velocities are similar to the ratios from offshore sites for frequencies less than about one-half the water layer P-wave resonant frequency, suggesting that the shear-wave velocities beneath a site are more important than the water layer in determining the character of the ground motions at lower frequencies.
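
    The spectral null described above occurs at the quarter-wavelength P-wave resonance of the water column, which is straightforward to compute. A minimal sketch (the 100 m depth is a hypothetical example, not a site from the study):

```python
def water_layer_resonance(depth_m, n=1, vp_water=1500.0):
    """Frequency (Hz) of the n-th P-wave resonance of a water layer
    of thickness H over the seafloor: f_n = (2n - 1) * Vp / (4 * H).
    Near f_1 the vertical component shows a strong spectral null."""
    return (2 * n - 1) * vp_water / (4.0 * depth_m)

# Hypothetical platform site in ~100 m of water:
f1 = water_layer_resonance(100.0)   # 3.75 Hz
```

    The abstract's observation that offshore and soft-soil onshore V/H ratios agree below about f1/2 follows because the water layer only perturbs the vertical component strongly near and above the resonance.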

  3. A testable model of earthquake probability based on changes in mean event size

    NASA Astrophysics Data System (ADS)

    Imoto, Masajiro

    2003-02-01

    We studied changes in mean event size using data on microearthquakes obtained from a local network in Kanto, central Japan, from the viewpoint that mean event size tends to increase as the critical point is approached. A parameter describing the changes was defined using a simple weighted-average procedure. In order to obtain the distribution of the parameter in the background, we surveyed values of the parameter from 1982 to 1999 in a 160 × 160 × 80 km volume. The 16 events of M5.5 or larger in this volume were selected as target events. The conditional distribution of the parameter was estimated from the 16 values, each of which referred to the value immediately prior to each target event. The background distribution is symmetric, with its center corresponding to no change in b value. In contrast, the conditional distribution exhibits an asymmetric feature, which tends to decrease the b value. The difference between the two distributions was significant and provided us with a hazard function for estimating earthquake probabilities. Comparing the hazard function with a Poisson process, we obtained an Akaike Information Criterion (AIC) reduction of 24. This reduction agreed closely with the probability gains of a retrospective study in a range of 2-4. A successful example of the proposed model can be seen in the earthquake of 3 June 2000, which is the only event during the period of prospective testing.
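
    The link between the AIC reduction and the quoted probability gains can be made explicit: for models with the same number of free parameters, an AIC reduction of ΔAIC corresponds to a log-likelihood gain of ΔAIC/2, and the geometric-mean probability gain per target event is exp(ΔlnL/N). A minimal sketch of that arithmetic (ignoring any parameter-count correction unless one is supplied):

```python
import math

def probability_gain_per_event(aic_reduction, n_events, extra_params=0):
    """Geometric-mean probability gain per target event implied by an
    AIC reduction relative to a reference (e.g. Poisson) model:
        aic_reduction = 2 * delta_lnL - 2 * extra_params
        gain = exp(delta_lnL / n_events)"""
    delta_lnl = aic_reduction / 2.0 + extra_params
    return math.exp(delta_lnl / n_events)

# The quoted AIC reduction of 24 over 16 target events implies a
# mean gain of exp(0.75) ~ 2.1, consistent with the stated 2-4 range.
g = probability_gain_per_event(24, 16)
```
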

  4. Analysis of the seismicity preceding large earthquakes

    NASA Astrophysics Data System (ADS)

    Stallone, Angela; Marzocchi, Warner

    2017-04-01

    The most common earthquake forecasting models assume that the magnitude of the next earthquake is independent of the past. This feature is probably one of the most severe limitations on the capability to forecast large earthquakes. In this work, we investigate this aspect empirically, exploring whether variations in seismicity in the space-time-magnitude domain encode some information on the size of future earthquakes. For this purpose, and to verify the stability of the findings, we consider seismic catalogs covering quite different space-time-magnitude windows, such as the Alto Tiberina Near Fault Observatory (TABOO) catalogue and the California and Japanese seismic catalogs. Our method is inspired by the statistical methodology proposed by Baiesi & Paczuski (2004) and elaborated by Zaliapin et al. (2008) to distinguish between triggered and background earthquakes, based on a pairwise nearest-neighbor metric defined by properly rescaled temporal and spatial distances. We generalize the method to a metric based on the k-nearest-neighbors that allows us to consider the overall space-time-magnitude distribution of k-earthquakes, which are the strongly correlated ancestors of a target event. Finally, we analyze the statistical properties of the clusters composed of the target event and its k-nearest-neighbors. In essence, the main goal of this study is to verify whether different classes of target event magnitudes are characterized by distinctive "k-foreshocks" distributions. The final step is to show how the findings of this work may (or may not) improve the skill of existing earthquake forecasting models.

  5. Current Development at the Southern California Earthquake Data Center (SCEDC)

    NASA Astrophysics Data System (ADS)

    Appel, V. L.; Clayton, R. W.

    2005-12-01

    Over the past year, the SCEDC has completed or is nearing completion of three featured projects: Station Information System (SIS) Development: The SIS will provide users with an interface into complete and accurate station metadata for all current and historic data at the SCEDC. The goal of this project is to develop a system that can interact with a single database source to enter, update and retrieve station metadata easily and efficiently. The system will provide accurate station/channel information for active stations to the SCSN real-time processing system, as well as station/channel information for stations that have parametric data at the SCEDC, i.e., for users retrieving data via STP. Additionally, the SIS will supply information required to generate dataless SEED and COSMOS V0 volumes and allow stations to be added to the system with a minimal but incomplete set of information using predefined defaults that can be easily updated as more information becomes available. Finally, the system will facilitate statewide metadata exchange for both real-time processing and provide a common approach to CISN historic station metadata. Moment Tensor Solutions: The SCEDC is currently archiving and delivering Moment Magnitudes and Moment Tensor Solutions (MTS) produced by the SCSN in real-time, as well as post-processing solutions for events spanning back to 1999. The automatic MTS runs on all local events with magnitudes > 3.0, and all regional events > 3.5. The distributed solution automatically creates links from all USGS Simpson Maps to a text e-mail summary solution, creates a .gif image of the solution, and updates the moment tensor database tables at the SCEDC. Searchable Scanned Waveforms Site: The Caltech Seismological Lab has made available 12,223 scanned images of pre-digital analog recordings of major earthquakes recorded in Southern California between 1962 and 1992 at http://www.data.scec.org/research/scans/.
The SCEDC has developed a searchable web interface that allows

  6. Prevalence and Predictors of Somatic Symptoms among Child and Adolescents with Probable Posttraumatic Stress Disorder: A Cross-Sectional Study Conducted in 21 Primary and Secondary Schools after an Earthquake.

    PubMed

    Zhang, Ye; Zhang, Jun; Zhu, Shenyue; Du, Changhui; Zhang, Wei

    2015-01-01

    To explore the prevalence rates and predictors of somatic symptoms among child and adolescent survivors with probable posttraumatic stress disorder (PTSD) after an earthquake. A total of 3053 students from 21 primary and secondary schools in Baoxing County were administered the Patient Health Questionnaire-13 (PHQ-13), a short version of PHQ-15 without the two items about sexuality and menstruation, the Children's Revised Impact of Event Scale (CRIES), and the self-made Earthquake-Related Experience Questionnaire 3 months after the Lushan earthquake. Among child and adolescent survivors, the prevalence rates of all somatic symptoms were higher in the probable PTSD group compared with the controls. The most frequent somatic symptoms were trouble sleeping (83.2%), feeling tired or having low energy (74.4%), stomach pain (63.2%), dizziness (58.1%), and headache (57.7%) in the probable PTSD group. Older age, having lost family members, having witnessed someone get seriously injured, and having witnessed someone get buried were predictors for somatic symptoms among child and adolescent survivors with probable PTSD. Somatic symptoms among child and adolescent earthquake survivors with probable PTSD in schools were common, and predictors of these somatic symptoms were identified. These findings may help those providing psychological health programs to find the child and adolescent students with probable PTSD who are at high risk of somatic symptoms in schools after an earthquake in China.

  7. The race to seismic safety : protecting California's transportation system.

    DOT National Transportation Integrated Search

    2003-12-01

    Will future California earthquakes again cause destruction of portions of California's transportation system, or will their impacts be controlled to limit the damage and disruption any large earthquake will cause? This is the key question addressed...

  8. Predicting earthquake effects—Learning from Northridge and Loma Prieta

    USGS Publications Warehouse

    Holzer, Thomas L.

    1994-01-01

    The continental United States has been rocked by two particularly damaging earthquakes in the last 4.5 years, Loma Prieta in northern California in 1989 and Northridge in southern California in 1994. Combined losses from these two earthquakes approached $30 billion. Approximately half these losses were reimbursed by the federal government. Because large earthquakes typically overwhelm state resources and place unplanned burdens on the federal government, it is important to learn from these earthquakes how to reduce future losses. My purpose here is to explore a potential implication of the Northridge and Loma Prieta earthquakes for hazard-mitigation strategies: earth scientists should increase their efforts to map hazardous areas within urban regions. 

  9. Portals for Real-Time Earthquake Data and Forecasting: Challenge and Promise (Invited)

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Feltstykket, R.; Donnellan, A.; Glasscoe, M. T.

    2013-12-01

    Earthquake forecasts have been computed by a variety of countries worldwide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. However, recent events clearly demonstrate that mitigating personal risk is becoming the responsibility of individual members of the public. Open access to a variety of web-based forecasts, tools, utilities and information is therefore required. Portals for data and forecasts present particular challenges, and require the development of both apps and the client/server architecture to deliver the basic information in real time. The basic forecast model we consider is the Natural Time Weibull (NTW) method (Rundle et al., Phys. Rev. E, 86, 021106, 2012). This model uses small earthquakes ('seismicity-based models') to forecast the occurrence of large earthquakes, via data-mining algorithms combined with the ANSS earthquake catalog. This method computes large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Localizing these forecasts in space so that global forecasts can be computed in real time presents special algorithmic challenges, which we describe in this talk. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we compute real-time global forecasts at a grid scale of 0.1°. We analyze and monitor the performance of these models using the standard tests, which include the Reliability/Attributes and Receiver Operating Characteristic (ROC) tests. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges of serving up these datasets over the web on web-based platforms such as those at www.quakesim.org, www.e-decider.org, and www.openhazards.com.
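
    The core of the NTW computation, a hazard that grows with the count of small earthquakes since the last large one, can be sketched as follows. This is a minimal illustration rather than the published model: the Weibull form in natural time follows the abstract's description, but the parameters `beta` and `tau` and the event counts are invented placeholders.

    ```python
    import math

    def ntw_probability(n_small, delta_n, beta=1.4, tau=120.0):
        """Conditional probability that a large earthquake occurs within
        the next delta_n small earthquakes, given that n_small small
        earthquakes have occurred since the last large event. Uses a
        Weibull survival function in natural time (event count) rather
        than clock time; beta > 1 makes the hazard grow with the count."""
        def survival(n):
            return math.exp(-((n / tau) ** beta))
        # P(large quake in (n, n + delta_n] | none through n)
        return 1.0 - survival(n_small + delta_n) / survival(n_small)

    # The hazard increases as more small events accumulate:
    p_early = ntw_probability(n_small=20, delta_n=50)
    p_late = ntw_probability(n_small=200, delta_n=50)
    ```

    With `beta > 1` the conditional probability over a fixed window of future small events rises as the count since the last large earthquake grows, which is the qualitative behaviour the abstract describes.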

  10. Comparison of Observed Spatio-temporal Aftershock Patterns with Earthquake Simulator Results

    NASA Astrophysics Data System (ADS)

    Kroll, K.; Richards-Dinger, K. B.; Dieterich, J. H.

    2013-12-01

    Due to the complex nature of faulting in southern California, knowledge of rupture behavior near fault step-overs is of critical importance to properly quantify and mitigate seismic hazards. Estimates of earthquake probability are complicated by the uncertainty that a rupture will stop at or jump a fault step-over, which affects both the magnitude and frequency of occurrence of earthquakes. In recent years, earthquake simulators and dynamic rupture models have begun to address the effects of complex fault geometries on earthquake ground motions and rupture propagation. Early models incorporated vertical faults with highly simplified geometries. Many current studies examine the effects of varied fault geometry, fault step-overs, and fault bends on rupture patterns; however, these works are limited by the small numbers of integrated fault segments and simplified orientations. The previous work of Kroll et al., 2013 on the northern extent of the 2010 El Mayor-Cucapah rupture in the Yuha Desert region uses precise aftershock relocations to show an area of complex conjugate faulting within the step-over region between the Elsinore and Laguna Salada faults. Here, we employ an innovative approach of incorporating this fine-scale fault structure defined through seismological, geologic and geodetic means in the physics-based earthquake simulator, RSQSim, to explore the effects of fine-scale structures on stress transfer and rupture propagation and examine the mechanisms that control aftershock activity and local triggering of other large events. We run simulations with primary fault structures in the state of California and northern Baja California and incorporate complex secondary faults in the Yuha Desert region. These models produce aftershock activity that enables comparison between the observed and predicted distribution and allow for examination of the mechanisms that control them. We investigate how the spatial and temporal distribution of aftershocks are affected by

  11. Broadband records of earthquakes in deep gold mines and a comparison with results from SAFOD, California

    USGS Publications Warehouse

    McGarr, Arthur F.; Boettcher, M.; Fletcher, Jon Peter B.; Sell, Russell; Johnston, Malcolm J.; Durrheim, R.; Spottiswoode, S.; Milev, A.

    2009-01-01

    For one week during September 2007, we deployed a temporary network of field recorders and accelerometers at four sites within two deep, seismically active mines. The ground-motion data, recorded at 200 samples/sec, are well suited to determining source and ground-motion parameters for the mining-induced earthquakes within and adjacent to our network. Four earthquakes with magnitudes close to 2 were recorded with high signal/noise at all four sites. Analysis of seismic moments and peak velocities, in conjunction with the results of laboratory stick-slip friction experiments, was used to estimate source processes that are key to understanding source physics and to assessing underground seismic hazard. The maximum displacements on the rupture surfaces can be estimated from the parameter Rv, where v is the peak ground velocity at a given recording site and R is the hypocentral distance. For each earthquake, the maximum slip and seismic moment can be combined with results from laboratory friction experiments to estimate the maximum slip rate within the rupture zone. Analysis of the four M 2 earthquakes recorded during our deployment and one of special interest recorded by the in-mine seismic network in 2004 revealed maximum slips ranging from 4 to 27 mm and maximum slip rates from 1.1 to 6.3 m/sec. Applying the same analyses to an M 2.1 earthquake within a cluster of repeating earthquakes near the San Andreas Fault Observatory at Depth site, California, yielded similar results for maximum slip and slip rate, 14 mm and 4.0 m/sec.

  12. Satellite IR thermal measurements prior to the September 2004 earthquakes in central California

    NASA Astrophysics Data System (ADS)

    Ouzounov, D.; Logan, T.; Bryant, N.; Taylor, P.

    2004-12-01

    We present and discuss observed variations in thermal transients and radiation fields prior to the earthquakes of September 18, 2004 near Bodie (M5.5) and September 28, 2004 near Parkfield (M6.0) in California. Previous analyses of earthquake events have indicated the presence of a thermal anomaly, where temperatures increased or did not return to their usual nighttime values. The procedure used in our work is to analyze weather satellite data taken at night and to record the general conditions as the ground cools after sunset. Two days before the Bodie earthquake, lower temperature radiation was observed by the NOAA/AVHRR satellite. This occurred when the entire region was relatively cloud-free. IR land surface nighttime temperature from the MODIS instrument rose by 4 °C within a 100 km radius around the Bodie epicenter. The thermal transient field recorded by MODIS in the vicinity of Parkfield, also in a cloud-free environment, was around +1 °C, significantly smaller than the thermal anomaly around the Bodie epicenter. Ground surface temperature near the Parkfield epicenter, however, showed a steady increase over the 4 days prior to the earthquake and a significant drop the night before the quake. Geosynchronous weather satellite thermal IR measurements, taken every half hour from sunset to dawn, were also recorded for 10 days prior to the Parkfield event, on the day of the quake, and for 5 days after. To establish a baseline, we also obtained GOES data for the same Julian days for the three years prior to the Parkfield earthquake. These September 2004 IR data sets were then used to systematically observe and record any thermal anomaly prior to the events that deviated from the baseline. 
    Our recent results support the hypothesis of a possible relationship between thermodynamic processes produced by increasing tectonic stress in the Earth's crust and a subsequent electro-chemical interaction between the crust and the atmosphere/ionosphere.

  13. Probability of detecting perchlorate under natural conditions in deep groundwater in California and the Southwestern United States

    USGS Publications Warehouse

    Fram, Miranda S.; Belitz, Kenneth

    2011-01-01

    We use data from 1626 groundwater samples collected in California, primarily from public drinking water supply wells, to investigate the distribution of perchlorate in deep groundwater under natural conditions. The wells were sampled for the California Groundwater Ambient Monitoring and Assessment Priority Basin Project. We develop a logistic regression model for predicting probabilities of detecting perchlorate at concentrations greater than multiple threshold concentrations as a function of climate (represented by an aridity index) and potential anthropogenic contributions of perchlorate (quantified as an anthropogenic score, AS). AS is a composite categorical variable including terms for nitrate, pesticides, and volatile organic compounds. Incorporating water-quality parameters in AS permits identification of perturbation of natural occurrence patterns by flushing of natural perchlorate salts from unsaturated zones by irrigation recharge, as well as by addition of perchlorate from industrial and agricultural sources. The data and model results indicate that low concentrations (0.1-0.5 μg/L) of perchlorate occur under natural conditions in groundwater across a wide range of climates, beyond the arid to semiarid climates in which they have mostly been previously reported. The probability of detecting perchlorate at concentrations greater than 0.1 μg/L under natural conditions ranges from 50-70% in semiarid to arid regions of California and the Southwestern United States to 5-15% in the wettest regions sampled (the Northern California coast). The probability of concentrations above 1 μg/L under natural conditions is low (generally <3%).
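
    The prediction step of such a model is just a logistic function of the two explanatory variables. The sketch below uses the standard functional form, but the coefficients, and the convention that a higher aridity index means a wetter climate, are illustrative assumptions rather than the study's fitted values.

    ```python
    import math

    def detection_probability(aridity_index, anthropogenic_score,
                              b0=-1.5, b_ai=-2.0, b_as=0.8):
        """Probability of detecting perchlorate above some threshold,
        modeled as a logistic function of climate (aridity index, here
        taken as higher = wetter) and the anthropogenic score AS.
        All coefficients are hypothetical placeholders."""
        z = b0 + b_ai * aridity_index + b_as * anthropogenic_score
        return 1.0 / (1.0 + math.exp(-z))

    # Drier climate (low aridity index) -> higher natural detection probability:
    p_arid = detection_probability(aridity_index=0.1, anthropogenic_score=0)
    p_wet = detection_probability(aridity_index=1.5, anthropogenic_score=0)
    ```

    With a negative climate coefficient the model reproduces the qualitative pattern reported above: detection is far more probable in arid regions than in the wettest ones, and a positive AS coefficient raises the probability where anthropogenic inputs are indicated.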

  14. Satellite IR Thermal Measurements Prior to the September 2004 Earthquakes in Central California

    NASA Technical Reports Server (NTRS)

    Ouzounov, D.; Logan, T.; Taylor, Patrick

    2004-01-01

    We present and discuss observed variations in thermal transients and radiation fields prior to the earthquakes of September 18, 2004 near Bodie (M5.5) and September 28, 2004 near Parkfield (M6.0) in California. Previous analyses of earthquake events have indicated the presence of a thermal anomaly, where temperatures increased or did not return to their usual nighttime values. The procedure used in our work is to analyze weather satellite data taken at night and to record the general conditions as the ground cools after sunset. Two days before the Bodie earthquake, lower temperature radiation was observed by the NOAA/AVHRR satellite. This occurred when the entire region was relatively cloud-free. IR land surface nighttime temperature from the MODIS instrument rose by 4 °C within a 100 km radius around the Bodie epicenter. The thermal transient field recorded by MODIS in the vicinity of Parkfield, also in a cloud-free environment, was around +1 °C, significantly smaller than the thermal anomaly around the Bodie epicenter. Ground surface temperature near the Parkfield epicenter, however, showed a steady increase over the 4 days prior to the earthquake and a significant drop the night before the quake. Geosynchronous weather satellite thermal IR measurements, taken every half hour from sunset to dawn, were also recorded for 10 days prior to the Parkfield event, on the day of the quake, and for 5 days after. To establish a baseline, we also obtained GOES data for the same Julian days for the three years prior to the Parkfield earthquake. These September 2004 IR data sets were then used to systematically observe and record any thermal anomaly prior to the events that deviated from the baseline. Our recent results support the hypothesis of a possible relationship between thermodynamic processes produced by increasing tectonic stress in the Earth's crust and a subsequent electro-chemical interaction between the crust and the atmosphere/ionosphere.

  15. Data Sets and Data Services at the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.

    2014-12-01

    The Northern California Earthquake Data Center (NCEDC) houses a unique and comprehensive data archive and provides real-time services for a variety of seismological and geophysical data sets that encompass northern and central California. We have over 80 terabytes of continuous and event-based time series data from broadband, short-period, strong motion, and strain sensors, as well as continuous and campaign GPS data at both standard and high sample rates in both raw and RINEX format. The Northern California Seismic System (NCSS), operated by UC Berkeley and USGS Menlo Park, has recorded over 890,000 events from 1984 to the present, and the NCEDC provides catalog, parametric information, moment tensors and first motion mechanisms, and time series data for these events. We also host and provide event catalogs, parametric information, and event waveforms for DOE enhanced geothermal system monitoring in northern California and Nevada. The NCEDC provides a variety of ways for users to access these data. The most recent development is web services, which provide interactive, command-line, or program-based workflow access to data. Web services use well-established server and client protocols and RESTful software architecture that allow users to easily submit queries and receive the requested data in real time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, simple text, or MiniSEED depending on the service and selected output format. The NCEDC supports all FDSN-defined web services as well as a number of IRIS-defined and NCEDC-defined services. We also continue to support older email-based and browser-based access to data. NCEDC data and web services can be found at http://www.ncedc.org and http://service.ncedc.org.
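
    The access pattern for FDSN-defined web services is a plain HTTP query string. The sketch below only builds such a URL; the parameter names follow the FDSN web-service specification, while the date range and magnitude cutoff are hypothetical example values.

    ```python
    from urllib.parse import urlencode

    def fdsn_event_query(base="http://service.ncedc.org/fdsnws/event/1/query",
                         **params):
        """Build an FDSN event web-service query URL from keyword
        arguments. Parameter names follow the FDSN web-service
        specification; which options a given server honors depends on
        that server. Sorting keeps the URL deterministic."""
        return base + "?" + urlencode(sorted(params.items()))

    url = fdsn_event_query(starttime="2014-01-01", endtime="2014-02-01",
                           minmagnitude=3.5, format="text")
    ```

    Fetching that URL with any HTTP client would return a catalog excerpt in the requested format (here, simple text); the same pattern applies to the dataselect and station services.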

  16. Products and Services Available from the Southern California Earthquake Data Center (SCEDC) and the Southern California Seismic Network (SCSN)

    NASA Astrophysics Data System (ADS)

    Chen, S. E.; Yu, E.; Bhaskaran, A.; Chowdhury, F. R.; Meisenhelter, S.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.

    2011-12-01

    Currently, the SCEDC archives continuous and triggered data from nearly 8400 data channels from 425 SCSN recorded stations, processing and archiving an average of 6.4 TB of continuous waveforms and 12,000 earthquakes each year. The SCEDC provides public access to these earthquake parametric and waveform data through its website www.data.scec.org and through client applications such as STP and DHI. This poster will describe the most significant developments at the SCEDC during 2011. New website design: The SCEDC has revamped its website. The changes make it easier for users to search the archive and discover updates and new content, and they improve our ability to manage and update the site. New data holdings: Post-processing on the El Mayor-Cucapah 7.2 sequence continues; to date, 11,847 events have been reviewed, and updates are available in the earthquake catalog immediately. A double-difference catalog (Hauksson et al., 2011) spanning 1981 to 6/30/11 will be available for download at www.data.scec.org and via STP. A focal mechanism catalog determined by Yang et al. (2011) is available for distribution at www.data.scec.org. Waveforms from Southern California NetQuake stations are now being stored in the SCEDC archive and are available via STP as event-associated waveforms; amplitudes from these stations are also being stored in the archive and used by ShakeMap. As part of a NASA/AIST project in collaboration with JPL and SIO, the SCEDC will receive real-time 1 sps streams of GPS displacement solutions from the California Real Time Network (http://sopac.ucsd.edu/projects/realtime; Genrich and Bock, 2006, J. Geophys. Res.). These channels will be archived at the SCEDC as miniSEED waveforms, which can then be distributed to the user community via applications such as STP. Improvements in the user tool STP: STP SAC output now includes picks from the SCSN. New archival methods: 
The SCEDC is exploring the feasibility of archiving and distributing

  17. The Virtual Quake earthquake simulator: a simulation-based forecast of the El Mayor-Cucapah region and evidence of predictability in simulated earthquake sequences

    NASA Astrophysics Data System (ADS)

    Yoder, Mark R.; Schultz, Kasey W.; Heien, Eric M.; Rundle, John B.; Turcotte, Donald L.; Parker, Jay W.; Donnellan, Andrea

    2015-12-01

    In this manuscript, we introduce a framework for developing earthquake forecasts using Virtual Quake (VQ), the generalized successor to the perhaps better known Virtual California (VC) earthquake simulator. We discuss the basic merits and mechanics of the simulator, and we present several statistics of interest for earthquake forecasting. We also show that, though the system as a whole (in aggregate) behaves quite randomly, (simulated) earthquake sequences limited to specific fault sections exhibit measurable predictability in the form of increasing seismicity precursory to large m > 7 earthquakes. In order to quantify this, we develop an alert-based forecasting metric, and show that it exhibits significant information gain compared to random forecasts. We also discuss the long-standing question of activation versus quiescent type earthquake triggering. We show that VQ exhibits both behaviours separately for independent fault sections; some fault sections exhibit activation type triggering, while others are better characterized by quiescent type triggering. We discuss these aspects of VQ specifically with respect to faults in the Salton Basin and near the El Mayor-Cucapah region in southern California, USA and northern Baja California Norte, Mexico.
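
    The alert-based metric described above can be made concrete with a toy score: declare an alert whenever a precursory rate exceeds a threshold, then compare the fraction of large events caught against the fraction of time spent in alert. The rate series, events, and threshold below are synthetic illustrations, not Virtual Quake output.

    ```python
    import random

    def alert_skill(rates, event_times, threshold):
        """Score an alert-based forecast. An alert is on whenever the
        precursory rate exceeds the threshold. Returns (fraction of
        events occurring during an alert, fraction of time in alert);
        skill means the first exceeds the second, i.e. the forecast
        beats a random alarm strategy of the same duty cycle."""
        on = [r > threshold for r in rates]
        alert_fraction = sum(on) / len(on)
        hit_fraction = sum(1 for t in event_times if on[t]) / len(event_times)
        return hit_fraction, alert_fraction

    random.seed(0)
    # Toy precursory rate: background noise, elevated before each "large event".
    rates = [random.random() for _ in range(1000)]
    event_times = [100, 400, 800]
    for t in event_times:
        for dt in range(-5, 1):
            rates[t + dt] += 1.0  # activation-type precursory increase

    hit_frac, alert_frac = alert_skill(rates, event_times, threshold=1.0)
    ```

    The difference between the hit fraction and the alert fraction is one simple expression of the information gain relative to a random (Poisson) alarm; quiescent-type triggering would be captured by alerting on rate decreases instead.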

  18. Signatures of the seismic source in EMD-based characterization of the 1994 Northridge, California, earthquake recordings

    USGS Publications Warehouse

    Zhang, R.R.; Ma, S.; Hartzell, S.

    2003-01-01

    In this article we use empirical mode decomposition (EMD) to characterize the 1994 Northridge, California, earthquake records and investigate the signatures carried over from the source rupture process. Comparison of the current study results with existing source inverse solutions that use traditional data processing suggests that the EMD-based characterization contains information that sheds light on aspects of the earthquake rupture process. We first summarize the fundamentals of the EMD and illustrate its features through the analysis of a hypothetical and a real record. Typically, the Northridge strong-motion records are decomposed into eight or nine intrinsic mode functions (IMF's), each of which emphasizes a different oscillation mode with different amplitude and frequency content. The first IMF has the highest-frequency content; frequency content decreases with an increase in IMF component. With the aid of a finite-fault inversion method, we then examine aspects of the source of the 1994 Northridge earthquake that are reflected in the second to fifth IMF components. This study shows that the second IMF is predominantly wave motion generated near the hypocenter, with high-frequency content that might be related to a large stress drop associated with the initiation of the earthquake. As one progresses from the second to the fifth IMF component, there is a general migration of the source region away from the hypocenter with associated longer-period signals as the rupture propagates. This study suggests that the different IMF components carry information on the earthquake rupture process that is expressed in their different frequency bands.
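
    To make the sifting idea concrete, here is a deliberately crude EMD sketch: linear rather than the usual cubic-spline envelopes, and a fixed sift count instead of a convergence criterion. It illustrates the property the abstract relies on, that the IMFs plus the residue reconstruct the record with the earliest IMFs carrying the highest frequencies, but it is not the implementation used in the study.

    ```python
    import numpy as np

    def sift_once(x):
        """One sifting pass: subtract the mean of upper and lower
        envelopes, here built by linear interpolation through local
        extrema. Returns None when there are too few extrema."""
        maxima = [i for i in range(1, len(x) - 1) if x[i - 1] < x[i] >= x[i + 1]]
        minima = [i for i in range(1, len(x) - 1) if x[i - 1] > x[i] <= x[i + 1]]
        if len(maxima) < 2 or len(minima) < 2:
            return None
        t = np.arange(len(x))
        upper = np.interp(t, maxima, x[maxima])
        lower = np.interp(t, minima, x[minima])
        return x - (upper + lower) / 2.0

    def emd(x, max_imfs=8, n_sifts=10):
        """Crude EMD: repeatedly sift the residue to peel off intrinsic
        mode functions, highest-frequency first."""
        imfs, residue = [], np.asarray(x, dtype=float).copy()
        for _ in range(max_imfs):
            if sift_once(residue) is None:  # residue ~monotone: done
                break
            h = residue.copy()
            for _ in range(n_sifts):
                nxt = sift_once(h)
                if nxt is None:
                    break
                h = nxt
            imfs.append(h)
            residue = residue - h
        return imfs, residue

    # Two well-separated tones standing in for a strong-motion record:
    t = np.linspace(0.0, 4.0, 400, endpoint=False)
    x = np.sin(2 * np.pi * 5.0 * t) + np.sin(2 * np.pi * 0.5 * t)
    imfs, res = emd(x)
    ```

    By construction the decomposition is complete: summing the IMFs and the residue recovers the input signal, which is what lets each IMF be interpreted as a distinct oscillation mode of the original record.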

  19. Evidence of shallow fault zone strengthening after the 1992 M7.5 Landers, California, earthquake

    USGS Publications Warehouse

    Li, Y.-G.; Vidale, J.E.; Aki, K.; Xu, Fei; Burdette, T.

    1998-01-01

    Repeated seismic surveys of the Landers, California, fault zone that ruptured in the magnitude (M) 7.5 earthquake of 1992 reveal an increase in seismic velocity with time. P, S, and fault zone trapped waves were excited by near-surface explosions in two locations in 1994 and 1996, and were recorded on two linear, three-component seismic arrays deployed across the Johnson Valley fault trace. The travel times of P and S waves for identical shot-receiver pairs decreased by 0.5 to 1.5 percent from 1994 to 1996, with the larger changes at stations located within the fault zone. These observations indicate that the shallow Johnson Valley fault is strengthening after the main shock, most likely because of closure of cracks that were opened by the 1992 earthquake. The increase in velocity is consistent with the prevalence of dry over wet cracks and with a reduction in the apparent crack density near the fault zone by approximately 1.0 percent from 1994 to 1996.

  20. Magnetic field observations in the near-field the 28 June 1992 Mw 7.3 Landers, California, earthquake

    USGS Publications Warehouse

    Johnston, M.J.; Mueller, R.J.; Sasai, Yoichi

    1994-01-01

    Recent reports suggest that large magnetic field changes occur prior to, and during, large earthquakes. Two continuously operating proton magnetometers, LSBM and OCHM, at distances of 17.3 and 24.2 km, respectively, from the epicenter of the 28 June 1992 Mw 7.3 Landers earthquake, recorded data through the earthquake and its aftershocks. These two stations are part of a differentially connected array of proton magnetometers that has been operated along the San Andreas fault since 1976. The instruments have a sensitivity of 0.25 nT or better and transmit data every 10 min through the GOES satellite to the USGS headquarters in Menlo Park, California. Seismomagnetic offsets of −1.2 ± 0.6 and −0.7 ± 0.7 nT were observed at these sites. In comparison, offsets of −0.3 ± 0.2 and −1.3 ± 0.2 nT were observed during the 8 July 1986 ML 5.9 North Palm Springs earthquake, which occurred directly beneath the OCHM magnetometer site. The observations are generally consistent with seismomagnetic models of the earthquake, in which fault geometry and slip have the same form as that determined by either inversion of the seismic data or inversion of geodetically determined ground displacements produced by the earthquake. In these models, right-lateral rupture occurs on connected fault segments in a homogeneous medium with average magnetization of 2 A/m. The fault-slip distribution has roughly the same form as the observed surface rupture, and the total moment release is 1.1 × 10²⁰ N m. There is no indication of diffusion-like character to the magnetic field offsets that might indicate these effects result from fluid flow phenomena. It thus seems unlikely that these earthquake-generated offsets and those produced by the North Palm Springs earthquake were generated by electrokinetic effects. Also, there are no indications of enhanced low-frequency magnetic noise before the earthquake at frequencies below 0.001 Hz.

  1. The ShakeOut Earthquake Scenario - A Story That Southern Californians Are Writing

    USGS Publications Warehouse

    Perry, Suzanne; Cox, Dale; Jones, Lucile; Bernknopf, Richard; Goltz, James; Hudnut, Kenneth; Mileti, Dennis; Ponti, Daniel; Porter, Keith; Reichle, Michael; Seligson, Hope; Shoaf, Kimberley; Treiman, Jerry; Wein, Anne

    2008-01-01

    The question is not if but when southern California will be hit by a major earthquake - one so damaging that it will permanently change lives and livelihoods in the region. How severe the changes will be depends on the actions that individuals, schools, businesses, organizations, communities, and governments take to get ready. To help prepare for this event, scientists of the U.S. Geological Survey (USGS) have changed the way that earthquake scenarios are done, uniting a multidisciplinary team that spans an unprecedented number of specialties. The team includes the California Geological Survey, Southern California Earthquake Center, and nearly 200 other partners in government, academia, emergency response, and industry, working to understand the long-term impacts of an enormous earthquake on the complicated social and economic interactions that sustain southern California society. This project, the ShakeOut Scenario, has applied the best current scientific understanding to identify what can be done now to avoid an earthquake catastrophe. More information on the science behind this project will be available in The ShakeOut Scenario (USGS Open-File Report 2008-1150; http://pubs.usgs.gov/of/2008/1150/). The 'what if?' earthquake modeled in the ShakeOut Scenario is a magnitude 7.8 on the southern San Andreas Fault. Geologists selected the details of this hypothetical earthquake by considering the amount of stored strain on that part of the fault with the greatest risk of imminent rupture. From this, seismologists and computer scientists modeled the ground shaking that would occur in this earthquake. Engineers and other professionals used the shaking to produce a realistic picture of this earthquake's damage to buildings, roads, pipelines, and other infrastructure. From these damages, social scientists projected casualties, emergency response, and the impact of the scenario earthquake on southern California's economy and society. The earthquake, its damages, and

  2. Triggered reverse fault and earthquake due to crustal unloading, northwest Transverse Ranges, California.

    USGS Publications Warehouse

    Yerkes, R.F.; Ellsworth, W.L.; Tinsley, J.C.

    1983-01-01

    A reverse-right-oblique surface rupture, associated with a ML 2.5 earthquake, formed in a diatomite quarry near Lompoc, California, in the northwesternmost Transverse Ranges on April 7, 1981. The 575-m-long narrow zone of ruptures formed in clay interbeds in diatomite and diatomaceous shale of the Neogene Monterey Formation. The ruptures parallel bedding, dip 39°-59°S, and trend about N84°E on the north limb of an open symmetrical syncline. Maximum net slip was 25 cm; maximum reverse dip slip was 23 cm, maximum right-lateral strike slip was about 9 cm, and average net slip was about 12 cm. The seismic moment of the earthquake is estimated at 1 to 2 × 10¹⁸ dyne·cm and the static stress drop at about 3 bar. The removal of an average of about 44 m of diatomite resulted in an average load reduction of about 5 bar, which decreased the normal stress by about 3.5 bar and increased the shear stress on the tilted bedding plane by about 2 bar. The April 7, 1981, event was a very shallow bedding-plane rupture, apparently triggered by crustal unloading. -Authors
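
    The quoted stress changes amount to resolving a vertical unloading of roughly ρgh onto the dipping bedding plane. The sketch below applies the standard resolution formulas with an assumed diatomite density and a single mid-range dip, so it reproduces the reported values only approximately; the authors' figures presumably account for the actual fault orientation and slip direction.

    ```python
    import math

    RHO_DIATOMITE = 1200.0  # kg/m^3, assumed low density for diatomite
    G = 9.81                # m/s^2
    BAR = 1.0e5             # Pa per bar

    def unloading_stress_changes(thickness_m, dip_deg, rho=RHO_DIATOMITE):
        """Resolve removal of an overburden layer onto a plane dipping
        dip_deg. Returns (vertical load reduction, normal-stress
        decrease, shear-stress increase), all in bars."""
        dsv = rho * G * thickness_m             # vertical stress change, Pa
        d = math.radians(dip_deg)
        dsn = dsv * math.cos(d) ** 2            # normal stress on the plane
        dtau = dsv * math.sin(d) * math.cos(d)  # shear stress on the plane
        return dsv / BAR, dsn / BAR, dtau / BAR

    # 44 m of diatomite removed above bedding dipping ~45 deg (mid-range):
    dsv, dsn, dtau = unloading_stress_changes(thickness_m=44.0, dip_deg=45.0)
    ```

    With these assumptions the vertical load reduction comes out near 5 bar, and the resolved normal and shear changes are each a few bars, the same order as the reported ~3.5 and ~2 bar; both changes move the plane toward failure, consistent with the triggering interpretation.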

  3. The Road to Total Earthquake Safety

    NASA Astrophysics Data System (ADS)

    Frohlich, Cliff

    Cinna Lomnitz is possibly the most distinguished earthquake seismologist in all of Central and South America. Among many other credentials, Lomnitz has personally experienced the shaking and devastation that accompanied no fewer than five major earthquakes—Chile, 1939; Kern County, California, 1952; Chile, 1960; Caracas, Venezuela, 1967; and Mexico City, 1985. Thus he clearly has much to teach someone like myself, who has never even actually felt a real earthquake. What is this slim book? The Road to Total Earthquake Safety summarizes Lomnitz's May 1999 presentation at the Seventh Mallet-Milne Lecture, sponsored by the Society for Earthquake and Civil Engineering Dynamics. His arguments are motivated by the damage that occurred in three earthquakes—Mexico City, 1985; Loma Prieta, California, 1989; and Kobe, Japan, 1995. All three quakes occurred in regions where earthquakes are common. Yet in all three some of the worst damage occurred in structures located a significant distance from the epicenter and engineered specifically to resist earthquakes. Some of the damage also indicated that the structures failed because they had experienced considerable rotational or twisting motion. Clearly, Lomnitz argues, there must be fundamental flaws in the usually accepted models explaining how earthquakes generate strong motions, and how we should design resistant structures.

  4. Dynamic deformations and the M6.7, Northridge, California earthquake

    USGS Publications Warehouse

    Gomberg, J.

    1997-01-01

    A method of estimating the complete time-varying dynamic deformation field from commonly available three-component single-station seismic data has been developed and applied to study the relationship between dynamic deformation and ground failures and structural damage using observations from the 1994 Northridge, California earthquake. Estimates from throughout the epicentral region indicate that the horizontal strains exceed the vertical ones by more than a factor of two. The largest strains (exceeding ≈100 µstrain) correlate with regions of greatest ground failure. There is a poor correlation between structural damage and peak strain amplitudes. The smallest strains, ≈35 µstrain, are estimated in regions of no damage or ground failure. Estimates in the two regions with the most severe and well-mapped permanent deformation, Potrero Canyon and the Granada-Mission Hills regions, exhibit the largest strains; peak horizontal strain estimates in these regions equal ≈139 and ≈229 µstrain, respectively. Of note, the dynamic principal strain axes have strikes consistent with the permanent failure features, suggesting that, while gravity, sub-surface materials, and hydrologic conditions undoubtedly played fundamental roles in determining where and what types of failures occurred, the dynamic deformation field may have been favorably sized and oriented to initiate failure processes. These results support other studies that conclude that the permanent deformation resulted from ground shaking, rather than from static strains associated with primary or secondary faulting. They also suggest that such an analysis, either using data or theoretical calculations, may enable observations of paleo-ground failure to be used as quantitative constraints on the size and geometry of previous earthquakes. © 1997 Elsevier Science Limited.
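
    Dynamic strain estimates of the kind reported above can be related to more familiar ground-motion measures through the standard plane-wave approximation, in which strain scales as particle velocity divided by phase velocity. The sketch below is a minimal illustration of that relationship only; the 1 km/s phase velocity is an assumed value for shallow S waves, not a number from the paper.

```python
# First-order peak dynamic strain from peak ground velocity (PGV),
# assuming a propagating plane wave: strain ~ v / c.
# The 1 km/s phase velocity is an illustrative assumption.

def peak_dynamic_strain(pgv_m_per_s: float,
                        phase_velocity_m_per_s: float = 1000.0) -> float:
    """Return dimensionless peak strain for a plane wave."""
    return pgv_m_per_s / phase_velocity_m_per_s

# A PGV of 0.1 m/s with a 1 km/s phase velocity gives 100 microstrain,
# comparable to the largest strains reported in the study.
strain = peak_dynamic_strain(0.1)
print(f"{strain * 1e6:.0f} microstrain")
```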

  5. Earthquakes, November-December 1977

    USGS Publications Warehouse

    Person, W.J.

    1978-01-01

    In the United States, the largest earthquake during this reporting period was a magnitude 6.6 in the Andreanof Islands, which are part of the Aleutian Islands chain, on November 4 that caused some minor damage. Northern California was struck by a magnitude 4.8 earthquake on November 22, causing moderate damage in the Willits area. This was the most damaging quake in the United States during the year. Two major earthquakes of magnitude 7.0 or above brought the total number of major earthquakes to 14 for the year.

  6. The Redwood Coast Tsunami Work Group: a unique organization promoting earthquake and tsunami resilience on California's North Coast

    NASA Astrophysics Data System (ADS)

    Dengler, L.; Henderson, C.; Larkin, D.; Nicolini, T.; Ozaki, V.

    2012-12-01

    The Northern California counties of Del Norte, Humboldt, and Mendocino account for over 30% of California's coastline and form one of the most seismically active areas of the contiguous 48 states. The region is at risk from earthquakes located on- and offshore and from tsunamis generated locally on faults associated with the Cascadia subduction zone (CSZ) and by distant sources elsewhere in the Pacific. In 1995 the California Geological Survey (CGS) published a scenario for a CSZ earthquake that included both strong ground shaking effects and a tsunami. As a result of the scenario, the Redwood Coast Tsunami Work Group (RCTWG), an organization of government agencies, tribes, service groups, academia, and the private sector, was formed to coordinate and promote earthquake and tsunami hazard awareness and mitigation in the three-county region. The RCTWG and its member agencies' projects include education/outreach products and programs, tsunami hazard mapping, and signage and siren planning. Since 2008, the RCTWG has worked with the California Emergency Management Agency (Cal EMA) in conducting tsunami warning communications tests on the North Coast. In 2007, RCTWG members helped develop and carry out the first tsunami training exercise at FEMA's Emergency Management Institute in Emmitsburg, MD. The RCTWG has facilitated numerous multi-agency, multi-discipline coordinated exercises, and RCTWG county tsunami response plans have been a model for other regions of the state and country. Eight North Coast communities have been recognized as TsunamiReady by the National Weather Service, including the first national park, the first state park, and the only tribe in California to be so recognized. Over 500 tsunami hazard zone signs have been posted in the RCTWG region since 2008. 
Eight assessment surveys from 1993 to 2010 have tracked preparedness actions and personal awareness of earthquake and tsunami hazards in the county and additional surveys have tracked public awareness and tourist

  7. Potential Effects of a Scenario Earthquake on the Economy of Southern California: Intraregional Commuter, Worker, and Earnings Flow Analysis

    USGS Publications Warehouse

    Sherrouse, Benson C.; Hester, David J.

    2008-01-01

    The Multi-Hazards Demonstration Project (MHDP) is a collaboration between the U.S. Geological Survey (USGS) and various partners from the public and private sectors and academia, meant to improve Southern California's resiliency to natural hazards (Jones and others, 2007). In support of the MHDP objectives, the ShakeOut Scenario was developed. It describes a magnitude 7.8 (M7.8) earthquake along the southernmost 300 kilometers (200 miles) of the San Andreas Fault, identified by geoscientists as a plausible event that will cause moderate to strong shaking over much of the eight-county (Imperial, Kern, Los Angeles, Orange, Riverside, San Bernardino, San Diego, and Ventura) Southern California region (Jones and others, 2008). This report uses selected datasets from the U.S. Census Bureau and the State of California's Employment Development Department to develop preliminary estimates of the number and spatial distribution of commuters who cross the San Andreas Fault and to characterize these commuters by the industries in which they work and their total earnings. The analysis concerns the relative exposure of the region's economy to the effects of the earthquake as described by the location, volume, and earnings of those commuters who work in each of the region's economic sectors. It is anticipated that damage to transportation corridors traversing the fault would lead to at least short-term disruptions in the ability of commuters to travel between their places of residence and work.

  8. The influence of one earthquake on another

    NASA Astrophysics Data System (ADS)

    Kilb, Deborah Lyman

    1999-12-01

    Part one of my dissertation examines the initiation of earthquake rupture. We study the initial subevent (ISE) of the Mw 6.7 1994 Northridge, California earthquake to distinguish between two end-member hypotheses of an organized and predictable earthquake rupture initiation process or, alternatively, a random process. We find that the focal mechanisms of the ISE and mainshock are indistinguishable, and both events may have nucleated on and ruptured the same fault plane. These results satisfy the requirements for both end-member models, and do not allow us to distinguish between them. However, further tests show the ISE's waveform characteristics are similar to those of typical nearby small earthquakes (i.e., dynamic ruptures). The second part of my dissertation examines aftershocks of the M 7.1 1989 Loma Prieta, California earthquake to determine if theoretical models of static Coulomb stress changes correctly predict the fault plane geometries and slip directions of Loma Prieta aftershocks. Our work shows individual aftershock mechanisms cannot be successfully predicted because a similar degree of predictability can be obtained using a randomized catalogue. This result is probably a function of combined errors in the models of mainshock slip distribution, background stress field, and aftershock locations. In the final part of my dissertation, we test the idea that earthquake triggering occurs when properties of a fault and/or its loading are modified by Coulomb failure stress changes that may be transient and oscillatory (i.e., dynamic) or permanent (i.e., static). We propose a triggering threshold failure stress change exists, above which the earthquake nucleation process begins although failure need not occur instantaneously. We test these ideas using data from the 1992 M 7.4 Landers earthquake and its aftershocks. 
Stress changes can be categorized as either dynamic (generated during the passage of seismic waves), static (associated with permanent fault offsets
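
    The static Coulomb failure stress change examined in the second and third parts of the dissertation is conventionally written ΔCFS = Δτ + μ′Δσn, with the shear stress change resolved in the slip direction and the normal stress change positive for unclamping. A minimal sketch of that relation; the effective friction value μ′ = 0.4 is a commonly assumed number, not one taken from this work.

```python
# Static Coulomb failure stress change on a receiver fault:
#     dCFS = d_tau + mu_eff * d_sigma_n
# d_tau: shear stress change in the slip direction (MPa)
# d_sigma_n: normal stress change, positive for unclamping (MPa)
# mu_eff = 0.4 is an assumed effective friction coefficient.

def coulomb_stress_change(d_tau_mpa: float, d_sigma_n_mpa: float,
                          mu_eff: float = 0.4) -> float:
    """Return the Coulomb failure stress change in MPa."""
    return d_tau_mpa + mu_eff * d_sigma_n_mpa

# Positive dCFS moves the receiver fault toward failure.
print(f"{coulomb_stress_change(0.05, 0.02):.3f} MPa")  # 0.058 MPa
```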

  9. A comparison among observations and earthquake simulator results for the allcal2 California fault model

    USGS Publications Warehouse

    Tullis, Terry. E.; Richards-Dinger, Keith B.; Barall, Michael; Dieterich, James H.; Field, Edward H.; Heien, Eric M.; Kellogg, Louise; Pollitz, Fred F.; Rundle, John B.; Sachs, Michael K.; Turcotte, Donald L.; Ward, Steven N.; Yikilmaz, M. Burak

    2012-01-01

    In order to understand earthquake hazards we would ideally have a statistical description of earthquakes for tens of thousands of years. Unfortunately the ∼100-year instrumental, several-hundred-year historical, and few-thousand-year paleoseismological records are woefully inadequate to provide a statistically significant record. Physics-based earthquake simulators can generate arbitrarily long histories of earthquakes; thus they can provide a statistically meaningful history of simulated earthquakes. The question is, how realistic are these simulated histories? The purpose of this paper is to begin to answer that question. We compare the results between different simulators and with information that is known from the limited instrumental, historic, and paleoseismological data. As expected, the results from all the simulators show that the observational record is too short to properly represent the system behavior; therefore, although tests of the simulators against the limited observations are necessary, they are not a sufficient test of the simulators' realism. The simulators appear to pass this necessary test. In addition, the physics-based simulators show similar behavior even though there are large differences in the methodology. This suggests that they represent realistic behavior. Different assumptions concerning the constitutive properties of the faults do result in enhanced capabilities of some simulators. However, it appears that the similar behavior of the different simulators may result from the fault-system geometry, slip rates, and assumed strength drops, along with the shared physics of stress transfer. This paper describes the results of running four earthquake simulators that are described elsewhere in this issue of Seismological Research Letters. The simulators ALLCAL (Ward, 2012), VIRTCAL (Sachs et al., 2012), RSQSim (Richards-Dinger and Dieterich, 2012), and ViscoSim (Pollitz, 2012) were run on our most recent all-California fault

  10. Rates and patterns of surface deformation from laser scanning following the South Napa earthquake, California

    USGS Publications Warehouse

    DeLong, Stephen B.; Lienkaemper, James J.; Pickering, Alexandra J; Avdievitch, Nikita N.

    2015-01-01

    The A.D. 2014 M6.0 South Napa earthquake, despite its moderate magnitude, caused significant damage to the Napa Valley in northern California (USA). Surface rupture occurred along several mapped and unmapped faults. Field observations following the earthquake indicated that the magnitude of postseismic surface slip was likely to approach or exceed the maximum coseismic surface slip and as such presented ongoing hazard to infrastructure. Using a laser scanner, we monitored postseismic deformation in three dimensions through time along 0.5 km of the main surface rupture. A key component of this study is the demonstration of proper alignment of repeat surveys using point cloud–based methods that minimize error imposed by both local survey errors and global navigation satellite system georeferencing errors. Using solid modeling of natural and cultural features, we quantify dextral postseismic displacement at several hundred points near the main fault trace. We also quantify total dextral displacement of initially straight cultural features. Total dextral displacement from both coseismic displacement and the first 2.5 d of postseismic displacement ranges from 0.22 to 0.29 m. This range increased to 0.33–0.42 m at 59 d post-earthquake. Furthermore, we estimate up to 0.15 m of vertical deformation during the first 2.5 d post-earthquake, which then increased by ∼0.02 m at 59 d post-earthquake. This vertical deformation is not expressed as a distinct step or scarp at the fault trace but rather as a broad up-to-the-west zone of increasing elevation change spanning the fault trace over several tens of meters, challenging common notions about fault scarp development in strike-slip systems. Integrating these analyses provides three-dimensional mapping of surface deformation and identifies spatial variability in slip along the main fault trace that we attribute to distributed slip via subtle block rotation. These results indicate the benefits of laser scanner surveys along
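
    The growth of displacement between the 2.5-day and 59-day epochs reported above is broadly consistent with the logarithmic afterslip behavior often observed on creeping faults. The sketch below fits s(t) = a + b·ln(1 + t/τ) to the midpoints of the reported dextral-displacement ranges under an assumed decay time τ; it is an illustration of the functional form, not the authors' model.

```python
import math

# Fit s(t) = a + b * ln(1 + t / tau) through two displacement epochs.
# Inputs are the midpoints of the ranges reported in the abstract
# (0.255 m at 2.5 d, 0.375 m at 59 d); tau = 1 day is an assumption.

TAU_DAYS = 1.0

def solve_log_afterslip(t1, s1, t2, s2, tau=TAU_DAYS):
    """Return (a, b) for the two-point logarithmic fit."""
    x1, x2 = math.log(1 + t1 / tau), math.log(1 + t2 / tau)
    b = (s2 - s1) / (x2 - x1)
    a = s1 - b * x1
    return a, b

a, b = solve_log_afterslip(2.5, 0.255, 59.0, 0.375)
one_year = a + b * math.log(1 + 365.0 / TAU_DAYS)
print(f"{one_year:.2f} m projected after one year")
```

Such a projection is only as good as the assumed decay time and the logarithmic form itself; its purpose here is to show how sparse repeat surveys constrain a slip-history curve.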

  11. Comments on potential geologic and seismic hazards affecting coastal Ventura County, California

    USGS Publications Warehouse

    Ross, Stephanie L.; Boore, David M.; Fisher, Michael A.; Frankel, Arthur D.; Geist, Eric L.; Hudnut, Kenneth W.; Kayen, Robert E.; Lee, Homa J.; Normark, William R.; Wong, Florence L.

    2004-01-01

    This report examines the regional seismic and geologic hazards that could affect proposed liquefied natural gas (LNG) facilities in coastal Ventura County, California. Faults throughout this area are thought to be capable of producing earthquakes of magnitude 6.5 to 7.5, which could produce surface fault offsets of as much as 15 feet. Many of these faults are sufficiently well understood to be included in the current generation of the National Seismic Hazard Maps; others may become candidates for inclusion in future revisions as research proceeds. Strong shaking is the primary cause of earthquake damage, and this area is zoned with a high level of shaking hazard. The estimated probability of a magnitude 6.5 or larger earthquake (comparable in size to the 2003 San Simeon quake) occurring in the next 30 years within 30 miles of Platform Grace is 50-60%; for Cabrillo Port, the estimate is a 35% likelihood. Combining these probabilities of earthquake occurrence with relationships that give expected ground motions yields the estimated seismic-shaking hazard. In parts of the project area, the estimated shaking hazard is as high as along the San Andreas Fault. The combination of long-period basin waves and LNG installations with large long-period resonances potentially increases this hazard.
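
    Thirty-year occurrence probabilities like the 50-60% figure above are commonly expressed through a Poisson model, P = 1 − exp(−λT). A short sketch of the relation and its inverse; the specific numbers below are illustrative, not taken from the report.

```python
import math

# Poisson occurrence probability over an exposure time:
#     P(at least one event) = 1 - exp(-rate * T)

def poisson_probability(annual_rate: float, years: float) -> float:
    return 1.0 - math.exp(-annual_rate * years)

def rate_from_probability(p: float, years: float) -> float:
    """Invert the relation to recover the equivalent annual rate."""
    return -math.log(1.0 - p) / years

# A 55% chance in 30 years corresponds to roughly one such earthquake
# every 37-38 years on average.
rate = rate_from_probability(0.55, 30.0)
print(f"{1.0 / rate:.1f} years between events")
```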

  12. Aftershocks of the 2014 South Napa, California, Earthquake: Complex faulting on secondary faults

    USGS Publications Warehouse

    Hardebeck, Jeanne L.; Shelly, David R.

    2016-01-01

    We investigate the aftershock sequence of the 2014 MW6.0 South Napa, California, earthquake. Low-magnitude aftershocks missing from the network catalog are detected by applying a matched-filter approach to continuous seismic data, with the catalog earthquakes serving as the waveform templates. We measure precise differential arrival times between events, which we use for double-difference event relocation in a 3D seismic velocity model. Most aftershocks are deeper than the mainshock slip, and most occur west of the mapped surface rupture. While the mainshock coseismic and postseismic slip appears to have occurred on the near-vertical, strike-slip West Napa fault, many of the aftershocks occur in a complex zone of secondary faulting. Earthquake locations in the main aftershock zone, near the mainshock hypocenter, delineate multiple dipping secondary faults. Composite focal mechanisms indicate strike-slip and oblique-reverse faulting on the secondary features. The secondary faults were moved towards failure by Coulomb stress changes from the mainshock slip. Clusters of aftershocks north and south of the main aftershock zone exhibit vertical strike-slip faulting more consistent with the West Napa Fault. The northern aftershocks correspond to the area of largest mainshock coseismic slip, while the main aftershock zone is adjacent to the fault area that has primarily slipped postseismically. Unlike most creeping faults, the zone of postseismic slip does not appear to contain embedded stick-slip patches that would have produced on-fault aftershocks. The lack of stick-slip patches along this portion of the fault may contribute to the low productivity of the South Napa aftershock sequence.
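
    The matched-filter detection step can be sketched as a sliding normalized cross-correlation of a template waveform against continuous data, declaring a detection wherever the correlation exceeds a threshold. The single-channel synthetic example below illustrates the idea only; the study's implementation operates on real multi-station, multi-component data with relocation downstream.

```python
import numpy as np

# Matched-filter sketch: normalized cross-correlation of a template
# against continuous data, with detections above a fixed threshold.

def matched_filter(data: np.ndarray, template: np.ndarray,
                   threshold: float = 0.8):
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    detections = []
    for i in range(len(data) - n + 1):
        win = data[i:i + n]
        denom = win.std()
        if denom == 0:
            continue
        cc = np.sum(t * (win - win.mean())) / denom  # Pearson correlation
        if cc >= threshold:
            detections.append((i, cc))
    return detections

# Embed a scaled copy of the template in noise and recover it.
rng = np.random.default_rng(0)
template = rng.standard_normal(100)
data = 0.05 * rng.standard_normal(2000)
data[700:800] += 0.3 * template
print(matched_filter(data, template)[0][0])  # detection at sample 700
```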

  13. School Site Preparedness for the Safety of California's Children K-12. Official Report of the Northridge Earthquake Task Force on Education.

    ERIC Educational Resources Information Center

    California State Legislature, Sacramento. Senate Select Committee on the Northridge Earthquake.

    This report asserts that disaster preparedness at all school sites must become a major and immediate priority. Should a disaster equaling the magnitude of the Northridge earthquake occur, the current varying levels of site preparedness may not adequately protect California's children. The report describes why the state's children are not safe and…

  14. Crustal deformation in great California earthquake cycles

    NASA Technical Reports Server (NTRS)

    Li, Victor C.; Rice, James R.

    1986-01-01

    Periodic crustal deformation associated with repeated strike-slip earthquakes is computed for the following model: a depth L (less than or similar to H) extending downward from the Earth's surface at a transform boundary between uniform elastic lithospheric plates of thickness H is locked between earthquakes. It slips an amount consistent with remote plate velocity V_pl after each lapse of earthquake cycle time T_cy. Lower portions of the fault zone at the boundary slip continuously so as to maintain constant resistive shear stress. The plates are coupled at their base to a Maxwellian viscoelastic asthenosphere through which steady deep-seated mantle motions, compatible with plate velocity, are transmitted to the surface plates. The coupling is described approximately through a generalized Elsasser model. It is argued that the model gives a more realistic physical description of tectonic loading, including the time dependence of deep slip and crustal stress buildup throughout the earthquake cycle, than do simpler kinematic models in which loading is represented as imposed uniform dislocation slip on the fault below the locked zone.
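
    The simpler kinematic model mentioned at the end of the abstract is usually represented by a buried screw dislocation slipping steadily at the plate rate below locking depth D, which gives the classic arctangent surface-velocity profile v(x) = (V/π)·arctan(x/D). A sketch of that baseline model with illustrative parameter values (not taken from the paper):

```python
import math

# Interseismic surface velocity for a buried screw dislocation:
#     v(x) = (V / pi) * arctan(x / D)
# V: far-field plate rate; D: locking depth; x: distance from fault.
# The parameter values are illustrative assumptions.

def interseismic_velocity(x_km: float, v_plate_mm_yr: float = 35.0,
                          locking_depth_km: float = 15.0) -> float:
    """Fault-parallel surface velocity (mm/yr) at distance x."""
    return (v_plate_mm_yr / math.pi) * math.atan(x_km / locking_depth_km)

# Velocity is zero on the fault trace and approaches +/- V/2 far away.
print(round(interseismic_velocity(0.0), 2),
      round(interseismic_velocity(150.0), 2))
```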

  15. The USGS Earthquake Notification Service (ENS): Customizable notifications of earthquakes around the globe

    USGS Publications Warehouse

    Wald, Lisa A.; Wald, David J.; Schwarz, Stan; Presgrave, Bruce; Earle, Paul S.; Martinez, Eric; Oppenheimer, David

    2008-01-01

    At the beginning of 2006, the U.S. Geological Survey (USGS) Earthquake Hazards Program (EHP) introduced a new automated Earthquake Notification Service (ENS) to take the place of the National Earthquake Information Center (NEIC) "Bigquake" system and the various other individual EHP e-mail list-servers for separate regions in the United States. These included northern California, southern California, and the central and eastern United States. ENS is a "one-stop shopping" system that allows Internet users to subscribe to flexible and customizable notifications for earthquakes anywhere in the world. The customization capability allows users to define the what (magnitude threshold), the when (day and night thresholds), and the where (specific regions) for their notifications. Customization is achieved by employing a per-user based request profile, allowing the notifications to be tailored for each individual's requirements. Such earthquake-parameter-specific custom delivery was not possible with simple e-mail list-servers. Now that event and user profiles are in a structured query language (SQL) database, additional flexibility is possible. At the time of this writing, ENS had more than 114,000 subscribers, with more than 200,000 separate user profiles. On a typical day, more than 188,000 messages get sent to a variety of widely distributed users for a wide range of earthquake locations and magnitudes. The purpose of this article is to describe how ENS works, highlight the features it offers, and summarize plans for future developments.
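
    The per-user profile matching described above (magnitude threshold, day/night thresholds, geographic region) can be sketched as a simple filter applied to each incoming event. The Profile class and its field names below are illustrative, not the actual ENS schema.

```python
from dataclasses import dataclass

# Sketch of ENS-style per-user notification filtering. All names here
# are hypothetical; ENS stores real profiles in a SQL database.

@dataclass
class Profile:
    day_magnitude: float    # magnitude threshold for daytime alerts
    night_magnitude: float  # usually higher threshold for overnight
    lat_range: tuple        # (min_lat, max_lat)
    lon_range: tuple        # (min_lon, max_lon)

def should_notify(profile: Profile, magnitude: float,
                  lat: float, lon: float, is_night: bool) -> bool:
    threshold = profile.night_magnitude if is_night else profile.day_magnitude
    in_region = (profile.lat_range[0] <= lat <= profile.lat_range[1]
                 and profile.lon_range[0] <= lon <= profile.lon_range[1])
    return in_region and magnitude >= threshold

# A northern-California profile: notify for M3.5+ by day, M5+ at night.
p = Profile(3.5, 5.0, (36.0, 42.0), (-125.0, -119.0))
print(should_notify(p, 4.2, 38.0, -122.5, is_night=False))  # True
print(should_notify(p, 4.2, 38.0, -122.5, is_night=True))   # False
```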

  16. Repeating Earthquake and Nonvolcanic Tremor Observations of Aseismic Deep Fault Transients in Central California.

    NASA Astrophysics Data System (ADS)

    Nadeau, R. M.; Traer, M.; Guilhem, A.

    2005-12-01

    Seismic indicators of fault zone deformation can complement geodetic measurements by providing information on aseismic transient deformation: 1) from deep within the fault zone, 2) on a regional scale, 3) with intermediate temporal resolution (weeks to months) and 4) that spans over two decades (1984 to early 2005), including pre-GPS and InSAR coverage. Along the San Andreas Fault (SAF) in central California, two types of seismic indicators are proving to be particularly useful for providing information on deep fault zone deformation. The first, characteristically repeating microearthquakes, provide long-term coverage (decades) on the evolution of aseismic fault slip rates at seismogenic depths along a large (~175 km) stretch of the SAF between the rupture zones of the ~M8 1906 San Francisco and 1857 Fort Tejon earthquakes. In Cascadia and Japan the second type of seismic indicator, nonvolcanic tremors, have shown a remarkable correlation between their activity rates and GPS and tiltmeter measurements of transient deformation in the deep (sub-seismogenic) fault zone. This correlation suggests that tremor rate changes and deep transient deformation are intimately related and that deformation associated with the tremor activity may be stressing the seismogenic zone in both areas. Along the SAF, nonvolcanic tremors have only recently been discovered (i.e., in the Parkfield-Cholame area), and knowledge of their full spatial extent is still relatively limited. Nonetheless the observed temporal correlation between earthquake and tremor activity in this area is consistent with a model in which sub-seismogenic deformation and seismogenic zone stress changes are closely related. We present observations of deep aseismic transient deformation associated with the 28 September 2004, M6 Parkfield earthquake from both repeating earthquake and nonvolcanic tremor data. 
Also presented are updated deep fault slip rate estimates from repeating earthquakes in the San Juan Bautista area with

  17. Hayward Fault, California Interferogram

    NASA Technical Reports Server (NTRS)

    2000-01-01

    This image of California's Hayward fault is an interferogram created using a pair of images taken by Synthetic Aperture Radar (SAR) combined to measure changes in the surface that may have occurred between the time the two images were taken.

    The images were collected by the European Space Agency's Remote Sensing satellites ERS-1 and ERS-2 in June 1992 and September 1997 over the central San Francisco Bay in California.

    The radar image data are shown as a gray-scale image, with the interferometric measurements that show the changes rendered in color. Only the urbanized area could be mapped with these data. The color changes from orange tones to blue tones across the Hayward fault (marked by a thin red line) show about 2-3 centimeters (0.8-1.1 inches) of gradual displacement or movement of the southwest side of the fault. The block west of the fault moved horizontally toward the northwest during the 63 months between the acquisition of the two SAR images. This fault movement is called aseismic creep because the fault moved slowly without generating an earthquake.

    Scientists are using SAR interferometry along with other data collected on the ground to monitor this fault motion in an attempt to estimate the probability of an earthquake on the Hayward fault, which last had a major earthquake of magnitude 7 in 1868. This analysis indicates that the northern part of the Hayward fault is creeping all the way from the surface to a depth of 12 kilometers (7.5 miles). This suggests that the potential for a large earthquake on the northern Hayward fault might be less than previously thought. The blue area to the west (lower left) of the fault near the center of the image seemed to move upward relative to the yellow and orange areas nearby by about 2 centimeters (0.8 inches). The cause of this apparent motion is not yet confirmed, but the rise of groundwater levels during the time between the images may have caused the reversal of a small portion of the subsidence that
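
    The conversion from interferometric phase to line-of-sight displacement underlying such maps is d = (λ/4π)·Δφ: one full fringe (2π of phase) corresponds to half a wavelength of motion toward or away from the radar. For the C-band ERS satellites (λ ≈ 5.6 cm) that is 2.8 cm per fringe, about the size of the creep signal described above. A minimal sketch:

```python
import math

# Line-of-sight displacement from interferometric phase:
#     d = (wavelength / (4 * pi)) * delta_phi
# ERS-1/2 are C-band radars; 5.6 cm is the approximate wavelength.

C_BAND_WAVELENGTH_CM = 5.6

def los_displacement_cm(delta_phi_rad: float,
                        wavelength_cm: float = C_BAND_WAVELENGTH_CM) -> float:
    return wavelength_cm / (4.0 * math.pi) * delta_phi_rad

# One fringe (2*pi of phase) is 2.8 cm of line-of-sight motion.
print(round(los_displacement_cm(2.0 * math.pi), 1))  # 2.8
```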

  18. Seasonal water storage, stress modulation, and California seismicity.

    PubMed

    Johnson, Christopher W; Fu, Yuning; Bürgmann, Roland

    2017-06-16

    Establishing what controls the timing of earthquakes is fundamental to understanding the nature of the earthquake cycle and critical to determining time-dependent earthquake hazard. Seasonal loading provides a natural laboratory to explore the crustal response to a quantifiable transient force. In California, water storage deforms the crust as snow and water accumulates during the wet winter months. We used 9 years of global positioning system (GPS) vertical deformation time series to constrain models of monthly hydrospheric loading and the resulting stress changes on fault planes of small earthquakes. The seasonal loading analysis reveals earthquakes occurring more frequently during stress conditions that favor earthquake rupture. We infer that California seismicity rates are modestly modulated by natural hydrological loading cycles. Copyright © 2017, American Association for the Advancement of Science.

  19. Preparing for a "Big One": The great southern California shakeout

    USGS Publications Warehouse

    Jones, L.M.; Benthien, M.

    2011-01-01

    The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. On November 13, 2008, over 5 million Southern Californians pretended that the magnitude-7.8 ShakeOut scenario earthquake was occurring and practiced actions derived from results of the ShakeOut Scenario, to reduce the impact of a real San Andreas Fault event. The communications campaign was based on four principles: 1) consistent messaging from multiple sources; 2) visual reinforcement; 3) encouragement of "milling"; and 4) focus on concrete actions. The goals of the ShakeOut established in spring 2008 were: 1) to register 5 million people to participate in the drill; 2) to change the culture of earthquake preparedness in Southern California; and 3) to reduce earthquake losses in Southern California. Over 90% of the registrants surveyed the next year reported improvement in earthquake preparedness at their organization as a result of the ShakeOut. © 2011, Earthquake Engineering Research Institute.

  20. Tests of remote aftershock triggering by small mainshocks using Taiwan's earthquake catalog

    NASA Astrophysics Data System (ADS)

    Peng, W.; Toda, S.

    2014-12-01

    To understand earthquake interaction and forecast time-dependent seismic hazard, it is essential to evaluate which stress transfer, static or dynamic, plays the major role in triggering aftershocks and subsequent mainshocks. Felzer and Brodsky focused on small mainshocks (2≤M<3) and their aftershocks and argued that only dynamic stress change produces earthquake-to-earthquake triggering, whereas Richards-Dinger et al. (2010) claimed that those selected small mainshock-aftershock pairs reflected not earthquake-to-earthquake triggering but the simultaneous occurrence of independent aftershocks following a larger earthquake or during a significant swarm sequence. We test those hypotheses using Taiwan's earthquake catalog, taking advantage of its lack of any larger events and the absence of the significant seismic swarms typically seen near active volcanoes. Using Felzer and Brodsky's method and their standard parameters, we found only 14 mainshock-aftershock pairs within 20 km distance in Taiwan's catalog from 1994 to 2010. Although Taiwan's catalog has a similar number of earthquakes to California's, the number of pairs is about 10% of that in the California catalog, which may reflect the absence of large earthquakes and significant seismic swarms in the catalog. To fully understand the properties of the Taiwan catalog, we loosened the screening parameters to obtain more pairs and then found a linear aftershock density with a power-law decay of -1.12±0.38, very similar to the one in Felzer and Brodsky. However, none of those mainshock-aftershock pairs were associated with a M7 rupture event or M6 events. To find what mechanism controlled the aftershock density triggered by small mainshocks in Taiwan, we randomized earthquake magnitudes and locations. We then found that the short-time density decay behaves more like a randomized catalog than like mainshock-aftershock triggering. Moreover, 5 out of 6 pairs were found in a swarm-like temporal seismicity rate increase
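
    Power-law decay exponents such as the −1.12 ± 0.38 value above are typically estimated by linear regression in log-log space. A sketch of that fit on synthetic data (the numbers below are illustrative, not the Taiwan catalog):

```python
import numpy as np

# Fit the aftershock linear-density decay r(t) ~ t**(-p) by linear
# regression of log10(density) on log10(time).

def fit_power_law_exponent(t: np.ndarray, density: np.ndarray) -> float:
    slope, _intercept = np.polyfit(np.log10(t), np.log10(density), 1)
    return slope

t = np.array([0.1, 0.3, 1.0, 3.0, 10.0])  # days after mainshock
density = 5.0 * t ** -1.1                  # exact power law, p = 1.1
print(round(fit_power_law_exponent(t, density), 2))  # -1.1
```

In practice the scatter of binned densities dominates the uncertainty, which is why the reported exponent carries a ±0.38 error bar.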

  1. New cooperative seismograph networks established in southern California

    USGS Publications Warehouse

    Hill, D.P.

    1974-01-01

    Southern California has more active faults located close to large, urban population centers than any other region in the United States. Reduction of risk to life and property posed by potential earthquakes along these active faults is a primary motivation for a cooperative earthquake research program between the U.S. Geological Survey and major universities in Southern California.

  2. In-situ fluid-pressure measurements for earthquake prediction: An example from a deep well at Hi Vista, California

    USGS Publications Warehouse

    Healy, J.H.; Urban, T.C.

    1985-01-01

    Short-term earthquake prediction requires sensitive instruments for measuring the small anomalous changes in stress and strain that precede earthquakes. Instruments installed at or near the surface have proven too noisy for measuring anomalies of the size expected to occur, and it is now recognized that even to have the possibility of a reliable earthquake-prediction system will require instruments installed in drill holes at depths sufficient to reduce the background noise to a level below that of the expected premonitory signals. We are conducting experiments to determine the maximum signal-to-noise improvement that can be obtained in drill holes. In a 592-m well in the Mojave Desert near Hi Vista, California, we measured water-level changes with amplitudes greater than 10 cm, induced by earth tides. By removing the effects of barometric pressure and the stress related to earth tides, we have achieved a sensitivity to volumetric strain rates of 10^-9 to 10^-10 per day. Further improvement may be possible, and it appears that a successful earthquake-prediction capability may be achieved with an array of instruments installed in drill holes at depths of about 1 km, assuming that the premonitory strain signals are, in fact, present. © 1985 Birkhäuser Verlag.
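
    Removing barometric and tidal effects from a water-level record, as described above, amounts to a least-squares regression of the record against known reference series, with small strain transients then sought in the residual. A synthetic sketch of that step; the series, amplitudes, and coefficients are all illustrative, not the paper's processing.

```python
import numpy as np

# Least-squares removal of tidal and barometric signals from a
# synthetic water-level record (all amplitudes illustrative).

rng = np.random.default_rng(1)
hours = np.arange(0, 24 * 30, dtype=float)           # 30 days, hourly
tide = 10.0 * np.cos(2 * np.pi * hours / 12.42)      # M2 tide proxy, cm
pressure = 3.0 * np.sin(2 * np.pi * hours / 24.0)    # barometric proxy, cm
noise = 0.2 * rng.standard_normal(hours.size)
water_level = 0.8 * tide + 0.5 * pressure + noise

# Regress the record on the reference series plus a constant offset.
G = np.column_stack([tide, pressure, np.ones_like(hours)])
coeffs, *_ = np.linalg.lstsq(G, water_level, rcond=None)
residual = water_level - G @ coeffs

# The residual is far quieter than the raw record, which is where a
# premonitory strain signal would have to be detected.
print(residual.std() < water_level.std())
```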

  3. Earthquake Shaking - Finding the "Hot Spots"

    USGS Publications Warehouse

    Field, Edward; Jones, Lucile; Jordan, Tom; Benthien, Mark; Wald, Lisa

    2001-01-01

    A new Southern California Earthquake Center study has quantified how local geologic conditions affect the shaking experienced in an earthquake. The important geologic factors at a site are softness of the rock or soil near the surface and thickness of the sediments above hard bedrock. Even when these 'site effects' are taken into account, however, each earthquake exhibits unique 'hotspots' of anomalously strong shaking. Better predictions of strong ground shaking will therefore require additional geologic data and more comprehensive computer simulations of individual earthquakes.

  4. Earthquakes-Rattling the Earth's Plumbing System

    USGS Publications Warehouse

    Sneed, Michelle; Galloway, Devin L.; Cunningham, William L.

    2003-01-01

    Hydrogeologic responses to earthquakes have been known for decades, and have occurred both close to, and thousands of miles from, earthquake epicenters. Water wells have become turbid, gone dry, or begun flowing; discharge of springs and ground water to streams has increased and new springs have formed; and well and surface-water quality have become degraded as a result of earthquakes. Earthquakes affect our Earth’s intricate plumbing system—whether you live near the notoriously active San Andreas Fault in California, or far from active faults in Florida, an earthquake near or far can affect you and the water resources you depend on.

  5. Estimating the Locations of Past and Future Large Earthquake Ruptures using Recent M4 and Greater Events

    NASA Astrophysics Data System (ADS)

    Ebel, J.; Chambers, D. W.

    2017-12-01

    Although most aftershock activity dies away within months or a few years of a mainshock, there is evidence that aftershocks still occur decades or even centuries after mainshocks, particularly in areas of low background seismicity such as stable continental regions. There also is evidence of long-lasting aftershock sequences in California. New work to study the occurrences of recent M≥4 earthquakes in California shows that these events occur preferentially at the edges of past major ruptures, with the effect lessening with decreasing magnitude below M4. Prior to several California mainshocks, the M≥4 seismicity was uniformly spread along the future fault ruptures without concentrations at the fault ends. On these faults, the rates of the M≥4 earthquakes prior to the mainshocks were much greater than the rates of the recent M≥4 earthquakes. These results suggest that the spatial patterns and rates of M≥4 earthquakes may help identify which faults are most prone to rupturing in the near future. Using this idea, speculation on which faults in California may be the next ones to experience major earthquakes is presented. Some Japanese earthquakes were also tested for the patterns of M≥4 earthquakes seen in California. The 2000 Mw6.6 Western Tottori earthquake shows a premonitory pattern similar to the patterns seen in California, and there have not been any M≥4 earthquakes in the fault vicinity since 2010. The 1995 Mw6.9 Kobe earthquake had little M≥4 seismicity in the years prior to the mainshock, and the M≥4 seismicity since 2000 has been scattered along the fault rupture. Both the 2016 M7.3 Kumamoto, Kyushu earthquake and the 2016 Mw6.2 Central Tottori earthquake had some M≥4 earthquakes along the fault in the two decades before the mainshocks. The results of these analyses suggest that the locations of recent M≥4 earthquakes may be useful for determining the spatial extents of past earthquake ruptures and also may help indicate which faults may have strong

  6. Simulating Earthquakes for Science and Society: New Earthquake Visualizations Ideal for Use in Science Communication

    NASA Astrophysics Data System (ADS)

    de Groot, R. M.; Benthien, M. L.

    2006-12-01

    The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. These types of visualizations are becoming pervasive in the teaching and learning of concepts related to earth science. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). Earthquakes are ideal candidates for visualization products: they cannot be predicted, are completed in a matter of seconds, occur deep in the earth, and the time between events can be on a geologic time scale. For example, the southern part of the San Andreas fault has not seen a major earthquake since about 1690, setting the stage for an earthquake as large as magnitude 7.7 -- the "big one." Since no one has experienced such an earthquake, visualizations can help people understand the scale of such an event. Accordingly, SCEC has developed a revolutionary simulation of this earthquake, with breathtaking visualizations that are now being distributed. According to Gordin and Pea (1995), visualization should, in theory, make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.

  7. A method for producing digital probabilistic seismic landslide hazard maps; an example from the Los Angeles, California, area

    USGS Publications Warehouse

    Jibson, Randall W.; Harp, Edwin L.; Michael, John A.

    1998-01-01

    The 1994 Northridge, California, earthquake is the first earthquake for which we have all of the data sets needed to conduct a rigorous regional analysis of seismic slope instability. These data sets include (1) a comprehensive inventory of triggered landslides, (2) about 200 strong-motion records of the mainshock, (3) 1:24,000-scale geologic mapping of the region, (4) extensive data on engineering properties of geologic units, and (5) high-resolution digital elevation models of the topography. All of these data sets have been digitized and rasterized at 10-m grid spacing in the ARC/INFO GIS platform. Combining these data sets in a dynamic model based on Newmark's permanent-deformation (sliding-block) analysis yields estimates of coseismic landslide displacement in each grid cell from the Northridge earthquake. The modeled displacements are then compared with the digital inventory of landslides triggered by the Northridge earthquake to construct a probability curve relating predicted displacement to probability of failure. This probability function can be applied to predict and map the spatial variability in failure probability in any ground-shaking conditions of interest. We anticipate that this mapping procedure will be used to construct seismic landslide hazard maps that will assist in emergency preparedness planning and in making rational decisions regarding development and construction in areas susceptible to seismic slope failure.
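    The Newmark permanent-deformation (sliding-block) analysis at the core of this mapping procedure can be sketched in a few lines: the block accumulates relative velocity only while ground acceleration exceeds the slope's critical (yield) acceleration, and permanent displacement is the time-integral of that velocity. The following is a minimal illustration; the acceleration pulse and yield value are invented for demonstration, not taken from the Northridge data sets.

    ```python
    import numpy as np

    def newmark_displacement(accel, dt, a_crit):
        """Cumulative permanent displacement of a rigid sliding block.

        accel : ground acceleration time series (m/s^2)
        dt    : sample interval (s)
        a_crit: critical (yield) acceleration of the slope (m/s^2)
        """
        v = 0.0   # relative sliding velocity (m/s)
        d = 0.0   # cumulative permanent displacement (m)
        for a in accel:
            if v > 0.0:                   # block is sliding
                v += (a - a_crit) * dt    # decelerates once a < a_crit
                if v < 0.0:
                    v = 0.0               # block re-locks
            elif a > a_crit:              # sliding initiates
                v = (a - a_crit) * dt
            d += v * dt
        return d

    # Synthetic single pulse exceeding a 1 m/s^2 yield acceleration
    dt = 0.01
    t = np.arange(0.0, 2.0, dt)
    accel = 3.0 * np.exp(-((t - 0.5) / 0.1) ** 2)
    disp = newmark_displacement(accel, dt, a_crit=1.0)
    ```

    In the full analysis this displacement is computed per 10-m grid cell from the recorded strong motion, then calibrated against the landslide inventory to yield failure probability.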

  8. Earthquake site response in Santa Cruz, California

    USGS Publications Warehouse

    Carver, D.; Hartzell, S.H.

    1996-01-01

    Aftershocks of the 1989 Loma Prieta, California, earthquake are used to estimate site response in a 12 km² area centered on downtown Santa Cruz. A total of 258 S-wave records from 36 aftershocks recorded at 33 sites are used in a linear inversion for site-response spectra. The inversion scheme takes advantage of the redundancy of the large data set for which several aftershocks are recorded at each site. The scheme decomposes the observed spectra into source, path, and site terms. The path term is specified before the inversion. The undetermined degree of freedom in the decomposition into source and site spectra is removed by specifying the site-response factor to be approximately 1.0 at two sites on crystalline bedrock. The S-wave site responses correlate well with the surficial geology and observed damage pattern of the mainshock. The site-response spectra of the floodplain sites, which include the heavily damaged downtown area, exhibit significant peaks. The largest peaks are between 1 and 4 Hz. Five floodplain sites have amplification factors of 10 or greater. Most of the floodplain site-response spectra also have a smaller secondary peak between 6 and 8 Hz. Residential areas built on marine terraces above the floodplain experienced much less severe damage. Site-response spectra for these areas also have their largest peaks between 1 and 4 Hz, but the amplification is generally below 6. Several of these sites also have a secondary peak between 6 and 8 Hz. The response peaks seen at nearly all sites between 1 and 4 Hz are probably caused by the natural resonance of the sedimentary rock column. The higher amplifications at floodplain sites may be caused by surface waves generated at the basin margins. The secondary peak between 6 and 8 Hz at many sites may be a harmonic of the 1- to 4-Hz peaks. We used waveforms from a seven-station approximately linear array located on the floodplain to calculate the apparent velocity and azimuth of propagation of coherent
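    The decomposition step described above reduces, at each frequency and after path correction, to an overdetermined linear system in the log-spectral domain: log O(event i, site j) = src(i) + site(j), with the reference bedrock site(s) pinned to unit amplification to remove the source/site trade-off. A minimal synthetic sketch (event and site terms invented, one frequency only):

    ```python
    import numpy as np

    # Synthetic "observed" log spectral amplitudes: source term plus site
    # term plus small noise. Site 0 plays the crystalline-bedrock reference
    # with log amplification fixed at 0 (i.e., a site factor of 1.0).
    rng = np.random.default_rng(0)
    n_ev, n_st = 6, 5
    true_src = rng.normal(0.0, 1.0, n_ev)
    true_site = np.array([0.0, 0.3, 0.8, 1.0, 0.1])
    obs = true_src[:, None] + true_site[None, :] + rng.normal(0, 0.01, (n_ev, n_st))

    # Linear system G m = d with unknowns m = [src(0..5), site(1..4)];
    # pinning site 0 removes the undetermined degree of freedom.
    rows, data = [], []
    for i in range(n_ev):
        for j in range(n_st):
            g = np.zeros(n_ev + n_st - 1)
            g[i] = 1.0
            if j > 0:
                g[n_ev + j - 1] = 1.0
            rows.append(g)
            data.append(obs[i, j])
    m, *_ = np.linalg.lstsq(np.array(rows), np.array(data), rcond=None)
    site_est = np.concatenate([[0.0], m[n_ev:]])
    ```

    The redundancy of many events per site is what makes the least-squares estimates of the site terms stable.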

  9. The California Hazards Institute

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Kellogg, L. H.; Turcotte, D. L.

    2006-12-01

    California's abundant resources are linked with its natural hazards. Earthquakes, landslides, wildfires, floods, tsunamis, volcanic eruptions, severe storms, fires, and droughts afflict the state regularly. These events have the potential to become great disasters, like the San Francisco earthquake and fire of 1906, that overwhelm the capacity of society to respond. At such times, the fabric of civic life is frayed, political leadership is tested, economic losses can dwarf available resources, and full recovery can take decades. A patchwork of Federal, state, and local programs is in place to address individual hazards, but California lacks effective coordination to forecast, prevent, prepare for, mitigate, respond to, and recover from, the harmful effects of natural disasters. Moreover, we do not know enough about the frequency, size, time, or locations where they may strike, nor about how the natural environment and man-made structures would respond. As California's population grows and becomes more interdependent, even moderate events have the potential to trigger catastrophes. Natural hazards need not become natural disasters if they are addressed proactively and effectively, rather than reactively. The University of California, with 10 campuses distributed across the state, has world-class faculty and students engaged in research and education in all fields of direct relevance to hazards. For that reason, the UC can become a world leader in anticipating and managing natural hazards in order to prevent loss of life and property and degradation of environmental quality. The University of California, Office of the President, has therefore established a new system-wide Multicampus Research Project, the California Hazards Institute (CHI), as a mechanism to research innovative, effective solutions for California. The CHI will build on the rich intellectual capital and expertise of the Golden State to provide the best available science, knowledge and tools for

  10. Tilt precursors before earthquakes on the San Andreas fault, California

    USGS Publications Warehouse

    Johnston, M.J.S.; Mortensen, C.E.

    1974-01-01

    An array of 14 biaxial shallow-borehole tiltmeters (at 10⁻⁷ radian sensitivity) has been installed along 85 kilometers of the San Andreas fault during the past year. Earthquake-related changes in tilt have been simultaneously observed on up to four independent instruments. At earthquake distances greater than 10 earthquake source dimensions, there are few clear indications of tilt change. For the four instruments with the longest records (>10 months), 26 earthquakes have occurred since July 1973 with at least one instrument closer than 10 source dimensions and 8 earthquakes with more than one instrument within that distance. Precursors in tilt direction have been observed before more than 10 earthquakes or groups of earthquakes, and no similar effect has yet been seen without the occurrence of an earthquake.

  11. California quake assessed

    NASA Astrophysics Data System (ADS)

    Wuethrich, Bernice

    On January 17, at 4:31 A.M., a magnitude 6.6 earthquake hit the Los Angeles area, crippling much of the local infrastructure and claiming 51 lives. Members of the Southern California Earthquake Network, a consortium of scientists at universities and the United States Geological Survey (USGS), entered a controlled crisis mode. Network scientists, including David Wald, Susan Hough, Kerry Sieh, and a half dozen others, went into the field to gather information on the earthquake, which apparently ruptured an unmapped fault.

  12. Kinematics of the 2015 San Ramon, California earthquake swarm: Implications for fault zone structure and driving mechanisms

    NASA Astrophysics Data System (ADS)

    Xue, Lian; Bürgmann, Roland; Shelly, David R.; Johnson, Christopher W.; Taira, Taka'aki

    2018-05-01

    Earthquake swarms represent a sudden increase in seismicity that may indicate a heterogeneous fault zone, the involvement of crustal fluids, and/or slow fault slip. Swarms sometimes precede major earthquake ruptures. An earthquake swarm occurred in October 2015 near San Ramon, California in an extensional right step-over region between the northern Calaveras Fault and the Concord-Mt. Diablo fault zone, which has hosted ten major swarms since 1970. The 2015 San Ramon swarm is examined here from 11 October through 18 November using template matching analysis. The relocated seismicity catalog contains ∼4000 events with magnitudes between - 0.2
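    Template matching of the kind used to build such a relocated catalog slides a known waveform along continuous data and declares a detection wherever the normalized cross-correlation exceeds a threshold. A single-channel sketch with a synthetic template buried in noise (all parameters invented, not from the San Ramon study):

    ```python
    import numpy as np

    def match_template(trace, template, threshold=0.8):
        """Return (offset, correlation) pairs where the normalized
        cross-correlation of template against trace exceeds threshold."""
        n = len(template)
        t = (template - template.mean()) / (template.std() * n)
        detections = []
        for k in range(len(trace) - n + 1):
            win = trace[k:k + n]
            s = win.std()
            if s == 0.0:
                continue
            cc = np.sum(t * (win - win.mean()) / s)  # Pearson correlation
            if cc >= threshold:
                detections.append((k, cc))
        return detections

    # Synthetic template hidden in noise at sample offset 300
    rng = np.random.default_rng(1)
    template = np.sin(np.linspace(0, 8 * np.pi, 100)) * np.hanning(100)
    trace = 0.1 * rng.normal(size=1000)
    trace[300:400] += template
    hits = match_template(trace, template)
    ```

    In practice each cataloged event serves as a template across many stations and channels, and detections are stacked over the network before being declared.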

  13. Geodetic slip rate for the eastern California shear zone and the recurrence time of Mojave desert earthquakes

    USGS Publications Warehouse

    Sauber, J.; Thatcher, W.; Solomon, S.C.; Lisowski, M.

    1994-01-01

    Where the San Andreas fault passes along the southwestern margin of the Mojave desert, it exhibits a large change in trend, and the deformation associated with the Pacific/North American plate boundary is distributed broadly over a complex shear zone. The importance of understanding the partitioning of strain across this region, especially to the east of the Mojave segment of the San Andreas in a region known as the eastern California shear zone (ECSZ), was highlighted by the occurrence (on 28 June 1992) of the magnitude 7.3 Landers earthquake in this zone. Here we use geodetic observations in the central Mojave desert to obtain new estimates for the rate and distribution of strain across a segment of the ECSZ, and to determine a coseismic strain drop of ~770 µrad for the Landers earthquake. From these results we infer a strain energy recharge time of 3,500-5,000 yr for a Landers-type earthquake and a slip rate of ~12 mm yr⁻¹ across the faults of the central Mojave. The latter estimate implies that a greater fraction of plate motion than heretofore inferred from geodetic data is accommodated across the ECSZ.
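    The recharge-time inference follows from dividing the coseismic strain drop by the interseismic strain accumulation rate. A back-of-envelope version is below; the effective shear-zone width is an assumed value chosen for illustration, and the paper's geodetic modeling, not this arithmetic, underlies its quoted 3,500-5,000 yr range.

    ```python
    # Back-of-envelope strain-energy recharge time for a Landers-type event.
    slip_rate = 0.012      # m/yr across the central Mojave faults (~12 mm/yr)
    zone_width = 60e3      # m; ASSUMED effective width of the shear zone
    strain_rate = slip_rate / zone_width   # shear strain per year
    strain_drop = 770e-6   # ~770 microradian coseismic strain drop (Landers)
    recharge_time = strain_drop / strain_rate   # years to re-accumulate
    ```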

  14. Potential Effects of a Scenario Earthquake on the Economy of Southern California: Labor Market Exposure and Sensitivity Analysis to a Magnitude 7.8 Earthquake

    USGS Publications Warehouse

    Sherrouse, Benson C.; Hester, David J.; Wein, Anne M.

    2008-01-01

    The Multi-Hazards Demonstration Project (MHDP) is a collaboration between the U.S. Geological Survey (USGS) and various partners from the public and private sectors and academia, meant to improve Southern California's resiliency to natural hazards (Jones and others, 2007). In support of the MHDP objectives, the ShakeOut Scenario was developed. It describes a magnitude 7.8 (M7.8) earthquake along the southernmost 300 kilometers (200 miles) of the San Andreas Fault, identified by geoscientists as a plausible event that will cause moderate to strong shaking over much of the eight-county (Imperial, Kern, Los Angeles, Orange, Riverside, San Bernardino, San Diego, and Ventura) Southern California region. This report contains an exposure and sensitivity analysis of economic Super Sectors in terms of labor and employment statistics. Exposure is measured as the absolute counts of labor market variables anticipated to experience each level of Instrumental Intensity (a proxy measure of damage). Sensitivity is the percentage of the exposure of each Super Sector to each Instrumental Intensity level. The analysis concerns the direct effect of the scenario earthquake on economic sectors and provides a baseline for the indirect and interactive analysis of an input-output model of the regional economy. The analysis is inspired by the Bureau of Labor Statistics (BLS) report that analyzed the labor market losses (exposure) of a M6.9 earthquake on the Hayward fault by overlaying geocoded labor market data on Instrumental Intensity values. The method used here is influenced by the ZIP-code-level data provided by the California Employment Development Department (CA EDD), which requires the assignment of Instrumental Intensities to ZIP codes. The ZIP-code-level labor market data includes the number of business establishments, employees, and quarterly payroll categorized by the North American Industry Classification System. According to the analysis results, nearly 225,000 business

  15. Earthquake forecast for the Wasatch Front region of the Intermountain West

    USGS Publications Warehouse

    DuRoss, Christopher B.

    2016-04-18

    The Working Group on Utah Earthquake Probabilities has assessed the probability of large earthquakes in the Wasatch Front region. There is a 43 percent probability of one or more magnitude 6.75 or greater earthquakes and a 57 percent probability of one or more magnitude 6.0 or greater earthquakes in the region in the next 50 years. These results highlight the threat of large earthquakes in the region.
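    Under a Poisson model of the kind used in such forecasts, the probability of one or more events in time t at annual rate λ is 1 − exp(−λt). The sketch below backs out the annual rate implied by the reported 43 percent / 50-year figure; this is illustrative arithmetic on the published number, not the Working Group's actual calculation.

    ```python
    import math

    def poisson_prob_one_or_more(rate_per_year, years):
        """P(N >= 1) for a Poisson process: 1 - exp(-rate * t)."""
        return 1.0 - math.exp(-rate_per_year * years)

    # Annual rate of M >= 6.75 events implied by a 43% / 50-yr probability
    rate = -math.log(1.0 - 0.43) / 50.0
    # Implied probability over a shorter 30-yr horizon, for comparison
    p30 = poisson_prob_one_or_more(rate, 30)
    ```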

  16. Static-stress impact of the 1992 Landers earthquake sequence on nucleation and slip at the site of the 1999 M=7.1 Hector Mine earthquake, southern California

    USGS Publications Warehouse

    Parsons, Tom; Dreger, Douglas S.

    2000-01-01

    The proximity in time (∼7 years) and space (∼20 km) between the 1992 M=7.3 Landers earthquake and the 1999 M=7.1 Hector Mine event suggests a possible link between the quakes. We thus calculated the static stress changes following the 1992 Joshua Tree/Landers/Big Bear earthquake sequence on the 1999 M=7.1 Hector Mine rupture plane in southern California. Resolving the stress tensor into rake-parallel and fault-normal components and comparing with changes in the post-Landers seismicity rate allows us to estimate a coefficient of friction on the Hector Mine plane. Seismicity following the 1992 sequence increased at Hector Mine where the fault was unclamped. This increase occurred despite a calculated reduction in right-lateral shear stress. The dependence of seismicity change primarily on normal stress change implies a high coefficient of static friction (µ≥0.8). We calculated the Coulomb stress change using µ=0.8 and found that the Hector Mine hypocenter was mildly encouraged (0.5 bars) by the 1992 earthquake sequence. In addition, the region of peak slip during the Hector Mine quake occurred where Coulomb stress is calculated to have increased by 0.5–1.5 bars. In general, slip was more limited where Coulomb stress was reduced, though there was some slip where the strongest stress decrease was calculated. Interestingly, many smaller earthquakes nucleated at or near the 1999 Hector Mine hypocenter after 1992, but only in 1999 did an event spread to become a M=7.1 earthquake.
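    The static Coulomb failure stress change used in this analysis combines the shear stress change resolved in the rake direction with friction times the fault-normal (unclamping) stress change: ΔCFS = Δτ + µ·Δσn. The numbers below are illustrative, in the spirit of the abstract's description (shear stress slightly reduced, strong unclamping), not the paper's actual resolved values.

    ```python
    def coulomb_stress_change(d_shear, d_normal, mu=0.8):
        """Static Coulomb failure stress change on a receiver fault.

        d_shear : shear stress change in the rake direction (bar; positive
                  promotes slip)
        d_normal: fault-normal stress change (bar; positive = unclamping)
        mu      : effective coefficient of static friction
        """
        return d_shear + mu * d_normal

    # Illustrative: right-lateral shear reduced, but unclamping dominates,
    # giving a net Coulomb stress increase that encourages failure.
    dcfs = coulomb_stress_change(d_shear=-0.3, d_normal=1.0, mu=0.8)
    ```

    With µ as high as 0.8, the normal-stress term dominates, which is how a fault can be encouraged toward failure despite a reduction in shear stress.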

  17. Introducing ShakeMap to potential users in Puerto Rico using scenarios of damaging historical and probable earthquakes

    NASA Astrophysics Data System (ADS)

    Huerfano, V. A.; Cua, G.; von Hillebrandt, C.; Saffar, A.

    2007-12-01

    The island of Puerto Rico has a long history of damaging earthquakes. Major earthquakes from off-shore sources have affected Puerto Rico in 1520, 1615, 1670, 1751, 1787, 1867, and 1918 (Mueller et al, 2003; PRSN Catalogue). Recent trenching has also yielded evidence of possible M7.0 events inland (Prentice, 2000). The high seismic hazard, large population, high tsunami potential and relatively poor construction practice can result in a potentially devastating combination. Efficient emergency response in event of a large earthquake will be crucial to minimizing the loss of life and disruption of lifeline systems in Puerto Rico. The ShakeMap system (Wald et al, 2004) developed by the USGS to rapidly display and disseminate information about the geographical distribution of ground shaking (and hence potential damage) following a large earthquake has proven to be a vital tool for post earthquake emergency response efforts, and is being adopted/emulated in various seismically active regions worldwide. Implementing a robust ShakeMap system is among the top priorities of the Puerto Rico Seismic Network. However, the ultimate effectiveness of ShakeMap in post- earthquake response depends not only on its rapid availability, but also on the effective use of the information it provides. We developed ShakeMap scenarios of a suite of damaging historical and probable earthquakes that severely impact San Juan, Ponce, and Mayagüez, the 3 largest cities in Puerto Rico. Earthquake source parameters were obtained from McCann and Mercado (1998); and Huérfano (2004). For historical earthquakes that generated tsunamis, tsunami inundation maps were generated using the TIME method (Shuto, 1991). The ShakeMap ground shaking maps were presented to local and regional governmental and emergency response agencies at the 2007 Annual conference of the Puerto Rico Emergency Management and Disaster Administration in San Juan, PR, and at numerous other emergency management talks and training

  18. Earthquakes, May-June, 1992

    USGS Publications Warehouse

    Person, Waverly J.

    1992-01-01

    The months of May and June were very active in terms of earthquake occurrence. Six major earthquakes (magnitude 7.0 or greater) occurred. These earthquakes included a magnitude 7.1 in Papua New Guinea on May 15, a magnitude 7.1 followed by a magnitude 7.5 in the Philippine Islands on May 17, a magnitude 7.0 in the Cuba region on May 25, and a magnitude 7.3 in the Santa Cruz Islands of the Pacific on May 27. In the United States, a magnitude 7.6 earthquake struck in southern California on June 28 followed by a magnitude 6.7 quake about three hours later.

  19. Winnetka deformation zone: Surface expression of coactive slip on a blind fault during the Northridge earthquake sequence, California. Evidence that coactive faulting occurred in the Canoga Park, Winnetka, and Northridge areas during the 17 January 1994, Northridge, California earthquake

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cruikshank, K.M.; Johnson, A.M.; Fleming, R.W.

    1996-12-31

    Measurements of normalized length changes of streets over an area of 9 km² in the San Fernando Valley of Los Angeles, California, define a distinctive strain pattern that may well reflect blind faulting during the 1994 Northridge earthquake. Strain magnitudes are about 3 × 10⁻⁴, locally 10⁻³. They define a deformation zone trending diagonally from near Canoga Park in the southwest, through Winnetka, to near Northridge in the northeast. The deformation zone is about 4.5 km long and 1 km wide. The northwestern two-thirds of the zone is a belt of extension of streets, and the southeastern one-third is a belt of shortening of streets. On the northwest and southeast sides of the deformation zone the magnitude of the strains is too small to measure, less than 10⁻⁴. Complete states of strain measured in the northeastern half of the deformation zone show that the directions of principal strains are parallel and normal to the walls of the zone, so the zone is not a strike-slip zone. The magnitudes of strains measured in the northeastern part of the Winnetka area were large enough to fracture concrete and soils, and the area of larger strains correlates with the area of greater damage to such roads and sidewalks. All parts of the pattern suggest a blind fault at depth, most likely a reverse fault dipping northwest but possibly a normal fault dipping southeast. The magnitudes of the strains in the Winnetka area are consistent with the strains produced at the ground surface by a blind fault plane extending to depth on the order of 2 km and a net slip on the order of 1 m, within a distance of about 100 to 500 m of the ground surface. The pattern of damage in the San Fernando Valley suggests a fault segment much longer than the 4.5 km defined by survey data in the Winnetka area. The blind fault segment may extend several kilometers in both directions beyond the Winnetka area. This study of the Winnetka area further supports

  20. Earthquake potential revealed by tidal influence on earthquake size-frequency statistics

    NASA Astrophysics Data System (ADS)

    Ide, Satoshi; Yabe, Suguru; Tanaka, Yoshiyuki

    2016-11-01

    The possibility that tidal stress can trigger earthquakes has long been debated. In particular, a clear causal relationship between small earthquakes and the phase of tidal stress is elusive. However, tectonic tremors deep within subduction zones are highly sensitive to tidal stress levels, with tremor rate increasing at an exponential rate with rising tidal stress. Thus, slow deformation and the possibility of earthquakes at subduction plate boundaries may be enhanced during periods of large tidal stress. Here we calculate the tidal stress history, and specifically the amplitude of tidal stress, on a fault plane in the two weeks before large earthquakes globally, based on data from the global, Japanese, and Californian earthquake catalogues. We find that very large earthquakes, including the 2004 Sumatra earthquake, the 2010 Maule earthquake in Chile, and the 2011 Tohoku-Oki earthquake in Japan, tend to occur near the time of maximum tidal stress amplitude. This tendency is not obvious for small earthquakes. However, we also find that the fraction of large earthquakes increases (the b-value of the Gutenberg-Richter relation decreases) as the amplitude of tidal shear stress increases. The relationship is also reasonable, considering the well-known relationship between stress and the b-value. This suggests that the probability of a tiny rock failure expanding to a gigantic rupture increases with increasing tidal stress levels. We conclude that large earthquakes are more probable during periods of high tidal stress.
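    The Gutenberg-Richter b-value invoked above is commonly estimated with the Aki maximum-likelihood formula, b = log₁₀(e) / (mean(M) − Mc), where Mc is the completeness magnitude. The sketch below recovers b = 1 from a synthetic exponential catalogue; the catalogue is simulated for illustration, not drawn from any of the catalogues used in the study.

    ```python
    import math
    import random

    def b_value(mags, m_c):
        """Aki (1965) maximum-likelihood b-value for magnitudes >= m_c
        (continuous magnitudes; binned data would need Utsu's dm/2 bias
        correction on m_c)."""
        m = [x for x in mags if x >= m_c]
        return math.log10(math.e) / (sum(m) / len(m) - m_c)

    # Synthetic Gutenberg-Richter catalogue with b = 1: magnitude excesses
    # above the completeness level are exponential with rate b * ln(10).
    random.seed(0)
    beta = 1.0 * math.log(10.0)
    mags = [2.0 + random.expovariate(beta) for _ in range(20000)]
    b_est = b_value(mags, m_c=2.0)
    ```

    A decrease in b_est under high tidal shear stress is exactly the signal the authors report: relatively more large events per small event.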

  1. Business losses, transportation damage and the Northridge Earthquake

    DOT National Transportation Integrated Search

    1998-05-01

    The 1994 Northridge earthquake damaged four major freeways in the Los Angeles area. Southern California firms were surveyed to assess the role that these transportation disruptions played in business losses. Of the firms that reported any earthquake ...

  2. Earthquake early Warning ShakeAlert system: West coast wide production prototype

    USGS Publications Warehouse

    Kohler, Monica D.; Cochran, Elizabeth S.; Given, Douglas; Guiwits, Stephen; Neuhauser, Doug; Hensen, Ivan; Hartog, Renate; Bodin, Paul; Kress, Victor; Thompson, Stephen; Felizardo, Claude; Brody, Jeff; Bhadha, Rayo; Schwarz, Stan

    2017-01-01

    Earthquake early warning (EEW) is an application of seismological science that can give people, as well as mechanical and electrical systems, up to tens of seconds to take protective actions before peak earthquake shaking arrives at a location. Since 2006, the U.S. Geological Survey has been working in collaboration with several partners to develop EEW for the United States. The goal is to create and operate an EEW system, called ShakeAlert, for the highest risk areas of the United States, starting with the West Coast states of California, Oregon, and Washington. In early 2016, the Production Prototype v.1.0 was established for California; then, in early 2017, v.1.2 was established for the West Coast, with earthquake notifications being distributed to a group of beta users in California, Oregon, and Washington. The new ShakeAlert Production Prototype was an outgrowth from an earlier demonstration EEW system that began sending test notifications to selected users in California in January 2012. ShakeAlert leverages the considerable physical, technical, and organizational earthquake monitoring infrastructure of the Advanced National Seismic System, a nationwide federation of cooperating seismic networks. When fully implemented, the ShakeAlert system may reduce damage and injury caused by large earthquakes, improve the nation’s resilience, and speed recovery.

  3. The 2010 Mw 7.2 El Mayor-Cucapah Earthquake Sequence, Baja California, Mexico and Southernmost California, USA: Active Seismotectonics along the Mexican Pacific Margin

    NASA Astrophysics Data System (ADS)

    Hauksson, Egill; Stock, Joann; Hutton, Kate; Yang, Wenzheng; Vidal-Villegas, J. Antonio; Kanamori, Hiroo

    2011-08-01

    The El Mayor-Cucapah earthquake sequence started with a few foreshocks in March 2010, and a second sequence of 15 foreshocks of M > 2 (up to M4.4) that occurred during the 24 h preceding the mainshock. The foreshocks occurred along a north-south trend near the mainshock epicenter. The Mw 7.2 mainshock on April 4 exhibited complex faulting, possibly starting with a ~M6 normal faulting event, followed ~15 s later by the main event, which included simultaneous normal and right-lateral strike-slip faulting. The aftershock zone extends for 120 km from the south end of the Elsinore fault zone north of the US-Mexico border almost to the northern tip of the Gulf of California. The waveform-relocated aftershocks form two abutting clusters, each about 50 km long, as well as a 10 km north-south aftershock zone just north of the epicenter of the mainshock. Even though the Baja California data are included, the magnitude of completeness and the hypocentral errors increase gradually with distance south of the international border. The spatial distribution of large aftershocks is asymmetric with five M5+ aftershocks located to the south of the mainshock, and only one M5.7 aftershock, but numerous smaller aftershocks to the north. Further, the northwest aftershock cluster exhibits complex faulting on both northwest and northeast planes. Thus, the aftershocks also express a complex pattern of stress release along strike. The overall rate of decay of the aftershocks is similar to the rate of decay of a generic California aftershock sequence. In addition, some triggered seismicity was recorded along the Elsinore and San Jacinto faults to the north, but significant northward migration of aftershocks has not occurred. The synthesis of the El Mayor-Cucapah sequence reveals transtensional regional tectonics, including the westward growth of the Mexicali Valley and the transfer of Pacific-North America plate motion from the Gulf of California in the south into the southernmost San

  4. Statistical analysis of low-rise building damage caused by the San Fernando earthquake

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholl, R.E.

    1974-02-01

    An empirical investigation of damage to low-rise buildings in two selected control areas within Glendale, California, caused by the ground motion precipitated by the San Fernando earthquake of February 9, 1971 is summarized. The procedures for obtaining the appropriate data and the methodology used in deriving ground motion-damage relationships are described. Motion-damage relationships are derived for overall damage and for the most frequently damaged building components. Overall motion-damage relationships are expressed in terms of damage incidence (damage ratio) and damage cost (damage cost factor). The motion-damage relationships derived from the earthquake data are compared with similar data obtained for low-rise buildings subjected to ground motion generated by an underground nuclear explosion. Overall comparison results show that for the same spectral acceleration, the earthquake caused slightly more damage. Differences in ground-motion characteristics for the two types of disturbances provide the most probable explanation for this discrepancy. (auth)

  5. Risk and return: evaluating Reverse Tracing of Precursors earthquake predictions

    NASA Astrophysics Data System (ADS)

    Zechar, J. Douglas; Zhuang, Jiancang

    2010-09-01

    In 2003, the Reverse Tracing of Precursors (RTP) algorithm attracted the attention of seismologists and international news agencies when researchers claimed two successful predictions of large earthquakes. These researchers had begun applying RTP to seismicity in Japan, California, the eastern Mediterranean and Italy; they have since applied it to seismicity in the northern Pacific, Oregon and Nevada. RTP is a pattern recognition algorithm that uses earthquake catalogue data to declare alarms, and these alarms indicate that RTP expects a moderate to large earthquake in the following months. The spatial extent of alarms is highly variable and each alarm typically lasts 9 months, although the algorithm may extend alarms in time and space. We examined the record of alarms and outcomes since the prospective application of RTP began, and in this paper we report on the performance of RTP to date. To analyse these predictions, we used a recently developed approach based on a gambling score, and we used a simple reference model to estimate the prior probability of target earthquakes for each alarm. Formally, we believe that RTP investigators did not rigorously specify the first two `successful' predictions in advance of the relevant earthquakes; because this issue is contentious, we consider analyses with and without these alarms. When we included contentious alarms, RTP predictions demonstrate statistically significant skill. Under a stricter interpretation, the predictions are marginally unsuccessful.
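
    The gambling-score idea can be sketched in a few lines, assuming the simple scoring rule of Zhuang (2010): a successful alarm with reference probability p earns (1 - p)/p points, while a failed alarm loses 1 point. The alarm list below is illustrative, not the actual RTP record.

```python
# Gambling-score tally for alarm-based predictions (illustrative sketch).
def gambling_score(alarms):
    """alarms: list of (p_reference, succeeded) tuples."""
    score = 0.0
    for p, hit in alarms:
        # A hit against a low prior probability earns many points; a miss costs 1.
        score += (1.0 - p) / p if hit else -1.0
    return score

# Two hits against low reference probabilities outweigh two misses:
example = [(0.1, True), (0.2, True), (0.15, False), (0.3, False)]
print(round(gambling_score(example), 2))
```

Under this rule, dropping the two contentious early alarms from the tally can flip the net score from positive to negative, which is exactly the sensitivity the authors discuss.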

  6. A three-step Maximum-A-Posterior probability method for InSAR data inversion of coseismic rupture with application to four recent large earthquakes in Asia

    NASA Astrophysics Data System (ADS)

    Sun, J.; Shen, Z.; Burgmann, R.; Liang, F.

    2012-12-01

    We develop a three-step Maximum-A-Posterior probability (MAP) method for coseismic rupture inversion, which aims at maximizing the a posteriori probability density function (PDF) of elastic solutions of earthquake rupture. The method originates from the Fully Bayesian Inversion (FBI) and the Mixed linear-nonlinear Bayesian inversion (MBI) methods, shares the same a posteriori PDF with them, and keeps most of their merits, while overcoming their convergence difficulties when large numbers of low-quality data are used and greatly improving the convergence rate through optimization procedures. A highly efficient global optimization algorithm, Adaptive Simulated Annealing (ASA), is used to search for the maximum posterior probability in the first step. The non-slip parameters are determined by the global optimization method, and the slip parameters are inverted for using the least squares method, initially without a positivity constraint and then damped to a physically reasonable range. This first-step MAP inversion brings the inversion close to the 'true' solution quickly and jumps over local maximum regions in the high-dimensional parameter space. The second-step inversion approaches the 'true' solution further, with positivity constraints subsequently applied on the slip parameters using the Monte Carlo Inversion (MCI) technique, with all parameters obtained from step one as the initial solution. The slip artifacts are then eliminated from the slip models in the third-step MAP inversion with the fault geometry parameters fixed. We first used a designed model with a 45 degree dipping angle and oblique slip, and corresponding synthetic InSAR data sets, to validate the efficiency and accuracy of the method. We then applied the method to four recent large earthquakes in Asia, namely the 2010 Yushu, China earthquake, the 2011 Burma earthquake, the 2011 New Zealand earthquake and the 2008 Qinghai, China earthquake, and compared our results with those from other groups. Our results show the effectiveness of
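
    The first-step global search can be sketched with plain simulated annealing (the paper uses the more elaborate Adaptive Simulated Annealing): propose random perturbations of the nonlinear parameters and accept uphill moves with a temperature-dependent probability, so the search can escape local optima. The objective below is a stand-in, not an elastic dislocation model.

```python
# Minimal simulated-annealing minimizer (a sketch of the ASA idea, not ASA itself).
import math
import random

def anneal(objective, x0, step=0.5, t0=1.0, cooling=0.995, iters=2000, seed=1):
    rng = random.Random(seed)
    x, fx, temp = list(x0), objective(x0), t0
    best_x, best_f = list(x), fx
    for _ in range(iters):
        cand = [xi + rng.gauss(0.0, step) for xi in x]
        fc = objective(cand)
        # Accept improvements always; accept worse moves with Boltzmann probability.
        if fc < fx or rng.random() < math.exp((fx - fc) / temp):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = list(x), fx
        temp *= cooling  # geometric cooling schedule
    return best_x, best_f

# Minimize a bumpy 2-D function with many local minima:
f = lambda v: (v[0] - 3) ** 2 + (v[1] + 1) ** 2 + math.sin(5 * v[0]) ** 2
x, fval = anneal(f, [0.0, 0.0])
```

In the paper's setting the annealer handles only the nonlinear fault-geometry parameters; for each candidate geometry the slip parameters are solved by linear least squares, which is what keeps the search space small.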

  7. Earthquake Preparedness 101: Planning Guidelines for Colleges and Universities.

    ERIC Educational Resources Information Center

    California Governor's Office, Sacramento.

    This publication is a guide for California colleges and universities wishing to prepare for earthquakes. An introduction aimed at institutional leaders emphasizes that earthquake preparedness is required by law and argues that there is much that can be done to prepare for earthquakes. The second section, addressed to the disaster planner, offers…

  8. Three dimensional images of geothermal systems: local earthquake P-wave velocity tomography at the Hengill and Krafla geothermal areas, Iceland, and The Geysers, California

    USGS Publications Warehouse

    Julian, B.R.; Prisk, A.; Foulger, G.R.; Evans, J.R.; ,

    1993-01-01

    Local earthquake tomography - the use of earthquake signals to form a 3-dimensional structural image - is now a mature geophysical analysis method, particularly suited to the study of geothermal reservoirs, which are often seismically active and severely laterally inhomogeneous. Studies have been conducted of the Hengill (Iceland), Krafla (Iceland) and The Geysers (California) geothermal areas. All three systems are exploited for electricity and/or heat production, and all are highly seismically active. Tomographic studies of volumes a few km in dimension were conducted for each area using the method of Thurber (1983).

  9. Earthquake likelihood model testing

    USGS Publications Warehouse

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful, but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
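
    The core of the binned likelihood comparison can be sketched directly: a forecast specifies an expected earthquake rate per space-magnitude bin, and the joint log-likelihood of the observed counts assumes an independent Poisson distribution in each bin. The bin values below are illustrative, not a RELM forecast.

```python
# Joint Poisson log-likelihood of observed counts given forecast rates per bin.
import math

def poisson_log_likelihood(rates, counts):
    """Sum over bins of log P(n_i | rate r_i) under a Poisson model."""
    ll = 0.0
    for r, n in zip(rates, counts):
        ll += -r + n * math.log(r) - math.lgamma(n + 1)
    return ll

forecast = [0.5, 1.2, 0.1]   # expected events per bin (illustrative)
observed = [1, 1, 0]
print(poisson_log_likelihood(forecast, observed))
```

Comparing this number against the likelihood distribution simulated from the forecast itself gives the consistency test; comparing log-likelihoods between two forecasts on the same observed catalog gives the pairwise test.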

  10. Viscoelastic coupling model of the San Andreas fault along the big bend, southern California

    USGS Publications Warehouse

    Savage, J.C.; Lisowski, M.

    1997-01-01

    The big bend segment of the San Andreas fault is the 300-km-long segment in southern California that strikes about N65°W, roughly 25° counterclockwise from the local tangent to the small circle about the Pacific-North America pole of rotation. The broad distribution of deformation of trilateration networks along this segment implies a locking depth of at least 25 km as interpreted by the conventional model of strain accumulation (continuous slip on the fault below the locking depth at the rate of relative plate motion), whereas the observed seismicity and laboratory data on fault strength suggest that the locking depth should be no greater than 10 to 15 km. The discrepancy is explained by the viscoelastic coupling model which accounts for the viscoelastic response of the lower crust. Thus the broad distribution of deformation observed across the big bend segment can be largely associated with the San Andreas fault itself, not subsidiary faults distributed throughout the region. The Working Group on California Earthquake Probabilities [1995] in using geodetic data to estimate the seismic risk in southern California has assumed that strain accumulated off the San Andreas fault is released by earthquakes located off the San Andreas fault. Thus they count the San Andreas contribution to total seismic moment accumulation more than once, leading to an overestimate of the seismicity for magnitude 6 and greater earthquakes in their Type C zones.

  11. Foreshock occurrence rates before large earthquakes worldwide

    USGS Publications Warehouse

    Reasenberg, P.A.

    1999-01-01

    Global rates of foreshock occurrence involving shallow M ≥ 6 and M ≥ 7 mainshocks and M ≥ 5 foreshocks were measured using earthquakes listed in the Harvard CMT catalog for the period 1978-1996. These rates are similar to those measured in previous worldwide and regional studies when they are normalized for the ranges of magnitude difference they each span. The observed worldwide rates were compared to a generic model of earthquake clustering, based on patterns of small and moderate aftershocks in California, and were found to exceed the California model by a factor of approximately 2. Significant differences in foreshock rate were found among subsets of earthquakes defined by their focal mechanism and tectonic region, with the rate before thrust events higher and the rate before strike-slip events lower than the worldwide average. Among the thrust events a large majority, composed of events located in shallow subduction zones, registered a high foreshock rate, while a minority, located in continental thrust belts, registered a low rate. These differences may explain why previous surveys have revealed low foreshock rates among thrust events in California (especially southern California), while the worldwide observations suggest the opposite: California, lacking an active subduction zone in most of its territory, and including a region of mountain-building thrusts in the south, reflects the low rate apparently typical of continental thrusts, while the worldwide observations, dominated by shallow subduction zone events, are foreshock-rich.

  12. 3-D P- and S-wave velocity structure and low-frequency earthquake locations in the Parkfield, California region

    USGS Publications Warehouse

    Zeng, Xiangfang; Thurber, Clifford H.; Shelly, David R.; Harrington, Rebecca M.; Cochran, Elizabeth S.; Bennington, Ninfa L.; Peterson, Dana; Guo, Bin; McClement, Kara

    2016-01-01

    To refine the 3-D seismic velocity model in the greater Parkfield, California region, a new data set including regular earthquakes, shots, quarry blasts and low-frequency earthquakes (LFEs) was assembled. Hundreds of traces from each LFE family at two temporary arrays were stacked with a time-frequency domain phase-weighted stacking method to improve the signal-to-noise ratio. We extend our model resolution to lower crustal depths with the LFE data. Our model images not only previously identified features but also low velocity zones (LVZs) in the area around the LFEs and in the lower crust beneath the southern Rinconada Fault. The former LVZ is consistent with high fluid pressure that can account for several aspects of LFE behaviour. The latter LVZ is consistent with a high conductivity zone seen in magnetotelluric studies. A new Vs model was developed with S picks obtained with a new autopicker. At shallow depth, the low Vs areas underlie the areas of strongest shaking in the 2004 Parkfield earthquake. We relocate the LFE families and analyse the location uncertainties with the NonLinLoc and tomoDD codes. The two methods yield similar results.
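
    The stacking step can be illustrated with a time-domain phase-weighted stack in the spirit of Schimmel and Paulssen (1997): each trace is reduced to its instantaneous phase via the analytic signal, and the linear stack is weighted by the coherence of those phases. (The study works in the time-frequency domain; this plain time-domain version conveys the idea.)

```python
# Time-domain phase-weighted stacking sketch.
import numpy as np

def analytic_signal(x):
    """Analytic signal via FFT (equivalent to scipy.signal.hilbert)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0   # double positive frequencies
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.fft.ifft(X * h)

def phase_weighted_stack(traces, nu=2.0):
    """traces: (n_traces, n_samples); nu sharpens the coherence weight."""
    traces = np.asarray(traces, dtype=float)
    phasors = np.array([analytic_signal(tr) for tr in traces])
    phasors /= np.abs(phasors)                     # unit phasors e^{i*phi}
    weight = np.abs(phasors.mean(axis=0)) ** nu    # 1 where phases agree, ~0 otherwise
    return traces.mean(axis=0) * weight
```

Coherent arrivals shared by all traces in an LFE family survive the weighting, while incoherent noise is suppressed, which is why stacking hundreds of low-amplitude LFE traces yields usable picks.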

  13. Earthquakes and faults in the San Francisco Bay area (1970-2003)

    USGS Publications Warehouse

    Sleeter, Benjamin M.; Calzia, James P.; Walter, Stephen R.; Wong, Florence L.; Saucedo, George J.

    2004-01-01

    The map depicts both active and inactive faults and earthquakes of magnitude 1.5 to 7.0 in the greater San Francisco Bay area. Twenty-two earthquakes of magnitude 5.0 and greater are indicated on the map and listed chronologically in an accompanying table. The data are compiled from records from 1970-2003. The bathymetry was generated from a digital version of NOAA maps and hydrogeographic data for San Francisco Bay. Elevation data are from the USGS National Elevation Database. The Landsat satellite image is from seven Landsat 7 Enhanced Thematic Mapper Plus scenes. Fault data are reproduced with permission from the California Geological Survey. The earthquake data are from the Northern California Earthquake Catalog.

  14. Impacts and responses : goods movement after the Northridge Earthquake

    DOT National Transportation Integrated Search

    1998-05-01

    The 1994 Northridge earthquake disrupted goods movement on four major highway routes in : Southern California. This paper examines the impacts of the earthquake on Los Angeles County : trucking firms, and finds that the impact was initially widesprea...

  15. Predicted liquefaction in the greater Oakland area and northern Santa Clara Valley during a repeat of the 1868 Hayward Fault (M6.7-7.0) earthquake

    USGS Publications Warehouse

    Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.

    2010-01-01

    Probabilities of surface manifestations of liquefaction due to a repeat of the 1868 (M6.7-7.0) earthquake on the southern segment of the Hayward Fault were calculated for two areas along the margin of San Francisco Bay, California: greater Oakland and the northern Santa Clara Valley. Liquefaction is predicted to be more common in the greater Oakland area than in the northern Santa Clara Valley owing to the presence of 57 km2 of susceptible sandy artificial fill. Most of the fills were placed into San Francisco Bay during the first half of the 20th century to build military bases, port facilities, and shoreline communities like Alameda and Bay Farm Island. Probabilities of liquefaction in the area underlain by this sandy artificial fill range from 0.2 to ~0.5 for a M7.0 earthquake, and decrease to 0.1 to ~0.4 for a M6.7 earthquake. In the greater Oakland area, liquefaction probabilities generally are less than 0.05 for Holocene alluvial fan deposits, which underlie most of the remaining flat-lying urban area. In the northern Santa Clara Valley for a M7.0 earthquake on the Hayward Fault and an assumed water-table depth of 1.5 m (the historically shallowest water level), liquefaction probabilities range from 0.1 to 0.2 along Coyote and Guadalupe Creeks, but are less than 0.05 elsewhere. For a M6.7 earthquake, probabilities are greater than 0.1 along Coyote Creek but decrease along Guadalupe Creek to less than 0.1. Areas with high probabilities in the Santa Clara Valley are underlain by young Holocene levee deposits along major drainages where liquefaction and lateral spreading occurred during large earthquakes in 1868 and 1906.

  16. FORECAST MODEL FOR MODERATE EARTHQUAKES NEAR PARKFIELD, CALIFORNIA.

    USGS Publications Warehouse

    Stuart, William D.; Archuleta, Ralph J.; Lindh, Allan G.

    1985-01-01

    The paper outlines a procedure for using an earthquake instability model and repeated geodetic measurements to attempt an earthquake forecast. The procedure differs from other prediction methods, such as recognizing trends in data or assuming failure at a critical stress level, by using a self-contained instability model that simulates both preseismic and coseismic faulting in a natural way. In short, physical theory supplies a family of curves, and the field data select the member curves whose continuation into the future constitutes a prediction. Model inaccuracy and resolving power of the data determine the uncertainty of the selected curves and hence the uncertainty of the earthquake time.

  17. Probability of a great earthquake to recur in the Tokai district, Japan: reevaluation based on newly-developed paleoseismology, plate tectonics, tsunami study, micro-seismicity and geodetic measurements

    NASA Astrophysics Data System (ADS)

    Rikitake, T.

    1999-03-01

    In light of newly-acquired geophysical information about earthquake generation in the Tokai area, Central Japan, where occurrence of a great earthquake of magnitude 8 or so has recently been feared, probabilities of earthquake occurrence in the near future are reevaluated. Much of the data used for evaluation here relies on recently-developed paleoseismology, tsunami study and GPS geodesy. The new Weibull distribution analysis of recurrence tendency of great earthquakes in the Tokai-Nankai zone indicates that the mean return period of great earthquakes there is estimated as 109 yr with a standard deviation amounting to 33 yr. These values do not differ much from those of previous studies (Rikitake, 1976, 1986; Utsu, 1984). Taking the newly-determined velocities of the motion of the Philippine Sea plate at various portions of the Tokai-Nankai zone into account, the ultimate displacements to rupture at the plate boundary are obtained. A Weibull distribution analysis results in the mean ultimate displacement amounting to 4.70 m with a standard deviation estimated as 0.86 m. A return period amounting to 117 yr is obtained at the Suruga Bay portion by dividing the mean ultimate displacement by the relative plate velocity. With the aid of the fault models as determined from the tsunami studies, the increases in the cumulative seismic slips associated with the great earthquakes are examined at various portions of the zone. It appears that a slip-predictable model can better be applied to the occurrence mode of great earthquakes in the zone than a time-predictable model. The crustal strain accumulating over the Tokai area as estimated from the newly-developed geodetic work including the GPS observations is compared to the ultimate strain presumed by the above two models. The probabilities for a great earthquake to recur in the Tokai district are then estimated with the aid of the Weibull analysis parameters obtained for the four cases discussed in the above.
All the probabilities
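
    The recurrence arithmetic above lends itself to a small worked example: given a Weibull inter-event distribution with mean 109 yr and standard deviation 33 yr (the values quoted above), the conditional probability of an event within a coming window, given the time already elapsed, follows from the Weibull survival function. The shape and scale below are fit numerically from those two moments, and the elapsed time of 145 yr is an illustrative input, not the paper's.

```python
# Weibull renewal-model conditional probability, fit from a mean and sd.
import math

def weibull_params(mean, sd):
    """Solve for Weibull shape k and scale lam matching a mean and sd."""
    k = 1.0
    for _ in range(100):                        # damped fixed-point iteration on k
        g1 = math.gamma(1 + 1 / k)
        g2 = math.gamma(1 + 2 / k)
        cv = math.sqrt(g2 - g1 * g1) / g1       # coefficient of variation at this k
        k *= (cv / (sd / mean)) ** 0.5          # nudge k toward the target CV
    return k, mean / math.gamma(1 + 1 / k)

def conditional_prob(elapsed, window, k, lam):
    """P(event in (elapsed, elapsed + window] | no event by elapsed)."""
    S = lambda t: math.exp(-((t / lam) ** k))   # Weibull survival function
    return 1.0 - S(elapsed + window) / S(elapsed)

k, lam = weibull_params(109.0, 33.0)
print(round(conditional_prob(145.0, 30.0, k, lam), 3))   # illustrative elapsed time
```

Because the fitted shape parameter exceeds 1, the hazard increases with elapsed time, so the conditional probability grows as the quiescent interval lengthens.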

  18. Comments on baseline correction of digital strong-motion data: Examples from the 1999 Hector Mine, California, earthquake

    USGS Publications Warehouse

    Boore, D.M.; Stephens, C.D.; Joyner, W.B.

    2002-01-01

    Residual displacements for large earthquakes can sometimes be determined from recordings on modern digital instruments, but baseline offsets of unknown origin make it difficult in many cases to do so. To recover the residual displacement, we suggest tailoring a correction scheme by studying the character of the velocity obtained by integration of the zeroth-order-corrected acceleration, and then seeing whether the residual displacements are stable when the various parameters of the particular correction scheme are varied. For many seismological and engineering purposes, however, the residual displacements are of lesser importance than ground motions at periods less than about 20 sec. These ground motions are often recoverable with simple baseline correction and low-cut filtering. In this largely empirical study, we illustrate the consequences of various correction schemes, drawing primarily from digital recordings of the 1999 Hector Mine, California, earthquake. We show that with simple processing the displacement waveforms for this event are very similar for stations separated by as much as 20 km. We also show that a strong pulse on the transverse component was radiated from the Hector Mine earthquake and propagated with little distortion to distances exceeding 170 km; this pulse leads to large response spectral amplitudes around 10 sec.
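
    A minimal sketch of the zeroth-order correction discussed above: remove the pre-event mean from the acceleration, integrate to velocity, fit a straight line to the late portion of the velocity trace, and subtract the implied constant acceleration offset before integrating to displacement. The window choices are illustrative parameters, not the paper's.

```python
# Zeroth-order baseline correction sketch (illustrative windows).
import numpy as np

def baseline_correct(acc, dt, pre_event_samples, fit_start):
    acc = acc - acc[:pre_event_samples].mean()     # remove pre-event mean
    vel = np.cumsum(acc) * dt                      # crude integration to velocity
    t = np.arange(len(acc)) * dt
    slope, intercept = np.polyfit(t[fit_start:], vel[fit_start:], 1)
    acc_corr = acc - slope                         # a velocity slope = an acceleration offset
    vel_corr = np.cumsum(acc_corr) * dt
    disp = np.cumsum(vel_corr) * dt
    return acc_corr, vel_corr, disp
```

The stability check the authors recommend amounts to varying `pre_event_samples` and `fit_start` and asking whether the final displacement value changes materially.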

  19. Are Earthquake Clusters/Supercycles Real or Random?

    NASA Astrophysics Data System (ADS)

    Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.

    2016-12-01

    Long records of earthquakes at plate boundaries such as the San Andreas or Cascadia often show that large earthquakes occur in temporal clusters, also termed supercycles, separated by less active intervals. These are intriguing because the boundary is presumably being loaded by steady plate motion. If so, earthquakes resulting from seismic cycles - in which their probability is small shortly after the past one, and then increases with time - should occur quasi-periodically rather than be more frequent in some intervals than others. We are exploring this issue with two approaches. One is to assess whether the clusters result purely by chance from a time-independent process that has no "memory." Thus a future earthquake is equally likely immediately after the past one and much later, so earthquakes can cluster in time. We analyze the agreement between such a model and inter-event times for Parkfield, Pallet Creek, and other records. A useful tool is transformation by the inverse cumulative distribution function, so the inter-event times have a uniform distribution when the memorylessness property holds. The second is via a time-variable model in which earthquake probability increases with time between earthquakes and decreases after an earthquake. The probability of an event increases with time until one happens, after which it decreases, but not to zero. Hence after a long period of quiescence, the probability of an earthquake can remain higher than the long-term average for several cycles. Thus the probability of another earthquake is path dependent, i.e. depends on the prior earthquake history over multiple cycles. Time histories resulting from simulations give clusters with properties similar to those observed. The sequences of earthquakes result from both the model parameters and chance, so two runs with the same parameters look different. 
The model parameters control the average time between events and the variation of the actual times around this average, so
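
    The inverse-CDF transformation described above can be sketched directly: under a memoryless (Poisson) process the inter-event times are exponential, so transforming them by a fitted exponential CDF should produce approximately uniform values, and a Kolmogorov-Smirnov distance against the uniform CDF measures the departure. The interval lists below are illustrative, not the Parkfield or Pallet Creek records.

```python
# Memorylessness check: exponential-CDF transform plus a KS distance to uniform.
import math

def ks_uniform(values):
    """KS statistic of values in [0, 1] against the uniform distribution."""
    v = sorted(values)
    n = len(v)
    return max(max(abs((i + 1) / n - x), abs(x - i / n))
               for i, x in enumerate(v))

def memoryless_misfit(intervals):
    mean = sum(intervals) / len(intervals)
    u = [1.0 - math.exp(-t / mean) for t in intervals]   # exponential CDF transform
    return ks_uniform(u)

# Strongly quasi-periodic intervals (illustrative) transform to tightly
# clustered values and hence a large KS distance:
print(memoryless_misfit([100, 105, 95, 102, 98, 101]))
```

A record whose transformed intervals stay close to uniform is consistent with a time-independent process, so apparent clusters in it could be chance.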

  20. Bayesian probabilities for Mw 9.0+ earthquakes in the Aleutian Islands from a regionally scaled global rate

    NASA Astrophysics Data System (ADS)

    Butler, Rhett; Frazer, L. Neil; Templeton, William J.

    2016-05-01

    We use the global rate of Mw ≥ 9.0 earthquakes, and standard Bayesian procedures, to estimate the probability of such mega events in the Aleutian Islands, where they pose a significant risk to Hawaii. We find that the probability of such an earthquake along the Aleutian island arc is 6.5% to 12% over the next 50 years (50% credibility interval) and that the annualized risk to Hawai'i is about $30 M. Our method (the regionally scaled global rate method, or RSGR) is to scale the global rate of Mw 9.0+ events in proportion to the fraction of global subduction (units of area per year) that takes place in the Aleutians. The RSGR method assumes that Mw 9.0+ events are a Poisson process with a rate that is both globally and regionally stationary on the time scale of centuries; it follows the principle of Burbidge et al. (2008), who used the product of fault length and convergence rate, i.e., the area being subducted per annum, to scale the Poisson rate for the GSS to sections of the Indonesian subduction zone. Before applying RSGR to the Aleutians, we first apply it to five other regions of the global subduction system where its rate predictions can be compared with those from paleotsunami, paleoseismic, and geoarcheology data. To obtain regional rates from paleodata, we give a closed-form solution for the probability density function of the Poisson rate when event count and observation time are both uncertain.
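
    The RSGR idea reduces to two lines of arithmetic: scale the global Mw 9.0+ rate by the region's share of global subduction area per year, then apply the Poisson formula for the probability of at least one event in the window. The numeric inputs below are placeholders, not the paper's values.

```python
# Regionally scaled global rate (RSGR) sketch with placeholder inputs.
import math

def rsgr_probability(global_rate, regional_fraction, years):
    """P(at least one event in `years`) under a stationary Poisson process."""
    regional_rate = global_rate * regional_fraction
    return 1.0 - math.exp(-regional_rate * years)

# e.g. five Mw 9.0+ events per century globally, 4% of subduction in the region:
p = rsgr_probability(global_rate=0.05, regional_fraction=0.04, years=50.0)
print(round(p, 3))
```

The Bayesian machinery in the paper replaces the point-value `global_rate` with a posterior distribution, which is what produces the credibility interval rather than a single probability.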

  1. BAREPP: Earthquake preparedness for the San Francisco Bay area

    USGS Publications Warehouse

    1986-01-01

    The threat of major and damaging earthquakes in California is a fact. How people respond to that threat is a concern shared by many local, state, federal, volunteer and private sector organizations. The Bay Area Regional Earthquake Preparedness Project (BAREPP) promotes comprehensive earthquake preparedness actions by these organizations and provides technical and planning assistance for a variety of programs.

  2. Earthquake and volcano hazard notices: An economic evaluation of changes in risk perceptions

    USGS Publications Warehouse

    Bernknopf, R.L.; Brookshire, D.S.; Thayer, M.A.

    1990-01-01

    Earthquake and volcano hazard notices were issued for the Mammoth Lakes, California area by the U.S. Geological Survey under the authority granted by the Disaster Relief Act of 1974. The effects on investment, recreation visitation, and risk perceptions are explored. The hazard notices did not affect recreation visitation, although investment was affected. A perceived loss in the market value of homes was documented. Risk perceptions were altered for property owners. Communication of the probability of an event over time would enhance hazard notices as a policy instrument and would mitigate unnecessary market perturbations. © 1990.

  3. Earthquake-by-earthquake fold growth above the Puente Hills blind thrust fault, Los Angeles, California: Implications for fold kinematics and seismic hazard

    USGS Publications Warehouse

    Leon, L.A.; Christofferson, S.A.; Dolan, J.F.; Shaw, J.H.; Pratt, T.L.

    2007-01-01

    Boreholes and high-resolution seismic reflection data collected across the forelimb growth triangle above the central segment of the Puente Hills thrust fault (PHT) beneath Los Angeles, California, provide a detailed record of incremental fold growth during large earthquakes on this major blind thrust fault. These data document fold growth within a discrete kink band that narrows upward from ~460 m at the base of the Quaternary section (200-250 m depth); most folding and uplift (82% at 250 m depth) occur within discrete kink bands, thereby enabling us to develop a paleoseismic history of the underlying blind thrust fault. The borehole data reveal that the youngest part of the growth triangle in the uppermost 20 m comprises three stratigraphically discrete growth intervals marked by southward-thickening sedimentary strata that are separated by intervals in which sediments do not change thickness across the site. We interpret the intervals of growth as occurring after the formation of now-buried paleofold scarps during three large PHT earthquakes in the past 8 kyr. The intervening intervals of no growth record periods of structural quiescence and deposition at the regional, near-horizontal stream gradient at the study site. Minimum uplift in each of the scarp-forming events, which occurred at 0.2-2.2 ka (event Y), 3.0-6.3 ka (event X), and 6.6-8.1 ka (event W), ranged from ~1.1 to ~1.6 m, indicating minimum thrust displacements of ~2.5 to 4.5 m. Such large displacements are consistent with the occurrence of large-magnitude earthquakes (Mw > 7). Cumulative minimum uplift in the past three events was 3.3 to 4.7 m, suggesting cumulative thrust displacement of ~7 to 10.5 m. These values yield a minimum Holocene slip rate for the PHT of ~0.9 to 1.6 mm/yr. The borehole and seismic reflection data demonstrate that dip within the kink band is acquired incrementally, such that older strata that have been deformed by more earthquakes dip more steeply than younger

  4. The 1906 earthquake and a century of progress in understanding earthquakes and their hazards

    USGS Publications Warehouse

    Zoback, M.L.

    2006-01-01

    The 18 April 1906 San Francisco earthquake killed nearly 3000 people and left 225,000 residents homeless. Three days after the earthquake, an eight-person Earthquake Investigation Commission, working with some 25 geologists, seismologists, geodesists, biologists and engineers, as well as some 300 others, started work under the supervision of Andrew Lawson to collect and document physical phenomena related to the quake. On 31 May 1906, the commission published a preliminary 17-page report titled "The Report of the State Earthquake Investigation Commission". The report included the bulk of the geological and morphological descriptions of the faulting, detailed reports on shaking intensity, as well as an impressive atlas of 40 oversized maps and folios. Nearly 100 years after its publication, the Commission Report remains a model for post-earthquake investigations. Because the diverse data sets were so complete and carefully documented, researchers continue to apply modern analysis techniques to learn from the 1906 earthquake. While the earthquake marked a seminal event in the history of California, it also served as the impetus for the birth of modern earthquake science in the United States.

  5. Paleo-earthquake timing on the North Anatolian Fault: Where, when, and how sure are we?

    NASA Astrophysics Data System (ADS)

    Fraser, J.; Vanneste, K.; Hubert-Ferrari, A.

    2009-04-01

    fundamental questions in the field of paleoseismology. For example: can we use sample ages from PIs situated 100s of kilometres apart, on a historical rupture segment, to more accurately determine the timing of paleo-earthquakes? Because the approach to earthquake age constraint is continuing to evolve, this study highlights the importance of publishing raw data from paleoseismic investigations. Biasi, G. and R. Weldon (1994). "Quantitative refinement of calibrated C-14 distributions." Quaternary Research 41: 1-18. Biasi, G. and R. Weldon (2002). "Paleoseismic Event Dating and the Conditional Probability of Large Earthquakes on the Southern San Andreas Fault, California." Bulletin of the Seismological Society of America 92(7): 2761-2781. Bronk Ramsey, C. (2007). OxCal version 4.0.5 Radiocarbon Calibration software.

  6. Breaks in Pavement and Pipes as Indicators of Range-Front Faulting Resulting from the 1989 Loma Prieta Earthquake near the Southwest Margin of the Santa Clara Valley, California

    USGS Publications Warehouse

    Schmidt, Kevin M.; Ellen, Stephen D.; Haugerud, Ralph A.; Peterson, David M.; Phelps, Geoffery A.

    1995-01-01

    Damage to pavement and near-surface utility pipes caused by the October 17, 1989, Loma Prieta earthquake provides indicators of ground deformation in a 663 km2 area near the southwest margin of the Santa Clara Valley, California. The spatial distribution of 1284 sites of such damage documents the extent and distribution of detectable ground deformation. Damage was concentrated in four zones, three of which are near previously mapped faults. The zone through Los Gatos showed the highest concentration of damage, as well as evidence for pre- and post-earthquake deformation. Damage along the foot of the Santa Cruz Mountains reflected shortening that is consistent with movement along reverse faults in the region and with the hypothesis that tectonic strain is distributed widely across numerous faults in the California Coast Ranges.

  7. Comparison of four moderate-size earthquakes in southern California using seismology and InSAR

    USGS Publications Warehouse

    Mellors, R.J.; Magistrale, H.; Earle, P.; Cogbill, A.H.

    2004-01-01

    Source parameters determined from interferometric synthetic aperture radar (InSAR) measurements and from seismic data are compared from four moderate-size (less than M 6) earthquakes in southern California. The goal is to verify approximate detection capabilities of InSAR, assess differences in the results, and test how the two results can be reconciled. First, we calculated the expected surface deformation from all earthquakes greater than magnitude 4 in areas with available InSAR data (347 events). A search for deformation from the events in the interferograms yielded four possible events with magnitudes less than 6. The search for deformation was based on a visual inspection as well as cross-correlation in two dimensions between the measured signal and the expected signal. A grid-search algorithm was then used to estimate focal mechanism and depth from the InSAR data. The results were compared with locations and focal mechanisms from published catalogs. An independent relocation using seismic data was also performed. The seismic locations fell within the area of the expected rupture zone for the three events that show clear surface deformation. Therefore, the technique shows the capability to resolve locations with high accuracy and is applicable worldwide. The depths determined by InSAR agree with well-constrained seismic locations determined in a 3D velocity model. Depth control for well-imaged shallow events using InSAR data is good, and better than the seismic constraints in some cases. A major difficulty for InSAR analysis is the poor temporal coverage of InSAR data, which may make it impossible to distinguish deformation due to different earthquakes at the same location.

  8. Liquefaction at Oceano, California, during the 2003 San Simeon earthquake

    USGS Publications Warehouse

    Holzer, T.L.; Noce, T.E.; Bennett, M.J.; Tinsley, J. C.; Rosenberg, L.I.

    2005-01-01

    The 2003 M 6.5 San Simeon, California, earthquake caused liquefaction-induced lateral spreading at Oceano at an unexpectedly large distance from the seismogenic rupture. We conclude that the liquefaction was caused by ground motion that was enhanced by both rupture directivity in the mainshock and local site amplification by unconsolidated fine-grained deposits. Liquefaction occurred in sandy artificial fill and undisturbed eolian sand and fluvial deposits. The largest and most damaging lateral spread was caused by liquefaction of artificial fill; the head of this lateral spread coincided with the boundary between the artificial fill and undisturbed eolian sand deposits. Values of the liquefaction potential index at liquefaction sites were generally greater than 5, the threshold value that has been proposed for liquefaction hazard mapping. Although the mainshock ground motion at Oceano was not recorded, peak ground acceleration was estimated to range from 0.25 to 0.28g on the basis of the liquefaction potential index and aftershock recordings. These estimates fall within the range of peak ground acceleration values associated with the modified Mercalli intensity of VII reported at the U.S. Geological Survey (USGS) "Did You Feel It?" web site.
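
The liquefaction potential index referenced here is commonly computed by depth-weighting liquefaction severity over the top 20 m of a soil profile (an Iwasaki-style formulation). A layer-based sketch, with a hypothetical soil profile:

```python
def liquefaction_potential_index(layers):
    """layers: (z_top_m, z_bot_m, factor_of_safety) tuples.
    LPI = sum over layers of F * w(z_mid) * thickness within the top 20 m,
    where F = 1 - FS for FS < 1 (else 0) and w(z) = 10 - 0.5*z.
    The midpoint rule is exact here because w(z) is linear in z."""
    total = 0.0
    for z_top, z_bot, fs in layers:
        z_bot = min(z_bot, 20.0)          # only the top 20 m contributes
        if z_top >= z_bot:
            continue
        severity = max(0.0, 1.0 - fs)     # F = 0 for non-liquefiable layers
        z_mid = 0.5 * (z_top + z_bot)
        total += severity * (10.0 - 0.5 * z_mid) * (z_bot - z_top)
    return total

# Hypothetical profile: liquefiable sand from 2-6 m, stable layers elsewhere.
lpi = liquefaction_potential_index([(0, 2, 1.4), (2, 6, 0.6), (6, 20, 1.2)])
# An LPI greater than 5 would flag the site under the mapping threshold above.
```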

  9. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake"

    NASA Astrophysics Data System (ADS)

    Durukal, E.; Sesetyan, K.; Erdik, M.

    2009-04-01

    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake, which has an annual probability of occurrence of about 2%. This paper examines the expected building losses in terms of probable maximum and average annualized losses and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The TCIP system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and approximately one-half of all policies concentrated in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, an inadequate premium structure, and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering building losses in Istanbul in the event of a large earthquake. The annualized earthquake losses in Istanbul are between 140 and 300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification to the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers, and development of a claim processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. Under such a model, losses would not be indemnified; instead, payouts would be calculated directly on the basis of indexed ground-motion levels and damage. The immediate improvement of a parametric insurance model over the existing one would be the elimination of claim processing
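
A parametric trigger of the kind proposed pays out on measured ground motion rather than on adjusted claims. A toy sketch, in which the tiers and payout fractions are invented for illustration and are not an actual TCIP schedule:

```python
def parametric_payout(pga_g, sum_insured):
    """Payout indexed to recorded peak ground acceleration (in g).
    Thresholds and fractions are hypothetical."""
    if pga_g >= 0.40:
        return sum_insured            # full limit at severe shaking
    if pga_g >= 0.25:
        return 0.5 * sum_insured
    if pga_g >= 0.15:
        return 0.2 * sum_insured
    return 0.0                        # below the trigger: no payout, no claim handling

payout = parametric_payout(0.30, 100_000)
```

Because the payout depends only on an observable index, settlement requires no loss adjustment, which is the claim-processing improvement the abstract describes.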

  10. Earthquake induced liquefaction hazard, probability and risk assessment in the city of Kolkata, India: its historical perspective and deterministic scenario

    NASA Astrophysics Data System (ADS)

    Nath, Sankar Kumar; Srivastava, Nishtha; Ghatak, Chitralekha; Adhikari, Manik Das; Ghosh, Ambarish; Sinha Ray, S. P.

    2018-01-01

    Liquefaction-induced ground failure is among the leading causes of infrastructure damage from large earthquakes in unconsolidated, non-cohesive, water-saturated alluvial terrains. The city of Kolkata is located on the potentially liquefiable alluvial fan deposits of the Ganga-Brahmaputra-Meghna Delta system, with a subsurface litho-stratigraphic sequence comprising varying percentages of clay, cohesionless silt, sand, and gravel interbedded with decomposed wood and peat. Additionally, the region has a moderately shallow groundwater table, especially in the post-monsoon seasons. In view of the burgeoning population, there has been unplanned expansion of settlements under hazardous geological, geomorphological, and hydrological conditions, exposing the city to severe liquefaction hazard. The 1897 Shillong and 1934 Bihar-Nepal earthquakes, both of Mw 8.1, reportedly induced Modified Mercalli Intensities of IV-V and VI-VII, respectively, in the city, triggering widespread to sporadic liquefaction with surface manifestations of sand boils, lateral spreading, ground subsidence, etc., thus posing a strong case for liquefaction potential analysis in the terrain. With the motivation of assessing the seismic hazard, vulnerability, and risk of the city of Kolkata through concerted federal funding stipulated for all the metros and upstart urban centers in India located in BIS seismic zones III, IV, and V with populations of more than one million, an attempt has been made here to understand the liquefaction susceptibility of Kolkata under earthquake loading employing modern multivariate techniques, and also to predict a deterministic liquefaction scenario for the city under a probabilistic seismic hazard condition with 10% probability of exceedance in 50 years and a return period of 475 years. We conducted in-depth geophysical and geotechnical investigations in the city encompassing a 435 km2 area. The stochastically

  11. Earthquake and ambient vibration monitoring of the steel-frame UCLA factor building

    USGS Publications Warehouse

    Kohler, M.D.; Davis, P.M.; Safak, E.

    2005-01-01

    Dynamic property measurements of the moment-resisting steel-frame University of California, Los Angeles, Factor building are being made to assess how forces are distributed over the building. Fourier amplitude spectra have been calculated from several intervals of ambient vibrations, a 24-hour period of strong winds, and from the 28 March 2003 Encino, California (ML = 2.9), the 3 September 2002 Yorba Linda, California (ML = 4.7), and the 3 November 2002 Central Alaska (Mw = 7.9) earthquakes. Measurements made from the ambient vibration records show that the first-mode frequency of horizontal vibration is between 0.55 and 0.6 Hz. The second horizontal mode has a frequency between 1.6 and 1.9 Hz. In contrast, the first-mode frequencies measured from earthquake data are about 0.05 to 0.1 Hz lower than those corresponding to ambient vibration recordings, indicating softening of the soil-structure system as amplitudes become larger. The frequencies revert to pre-earthquake levels within five minutes of the Yorba Linda earthquake. Shaking due to strong winds that occurred during the Encino earthquake dominates the frequency decrease, which correlates in time with the duration of the strong winds. The first shear wave recorded from the Encino and Yorba Linda earthquakes takes about 0.4 sec to travel up the 17-story building. © 2005, Earthquake Engineering Research Institute.

  12. Understanding Earthquake Fault Systems Using QuakeSim Analysis and Data Assimilation Tools

    NASA Technical Reports Server (NTRS)

    Donnellan, Andrea; Parker, Jay; Glasscoe, Margaret; Granat, Robert; Rundle, John; McLeod, Dennis; Al-Ghanmi, Rami; Grant, Lisa

    2008-01-01

    We are using the QuakeSim environment to model interacting fault systems. One goal of QuakeSim is to prepare for the large volumes of data that spaceborne missions such as DESDynI will produce. QuakeSim has the ability to ingest distributed heterogeneous data in the form of InSAR, GPS, seismicity, and fault data into various earthquake modeling applications, automating the analysis when possible. Virtual California simulates interacting faults in California. We can compare output from long time-history Virtual California runs with the current state of strain and the strain history in California. In addition to spaceborne data we will begin assimilating data from UAVSAR airborne flights over the San Francisco Bay Area, the Transverse Ranges, and the Salton Trough. Results of the models are important for understanding future earthquake risk and for providing decision support following earthquakes. Improved models require this sensor web of different data sources, and a modeling environment for understanding the combined data.

  13. Hypothesis testing and earthquake prediction.

    PubMed

    Jackson, D D

    1996-04-30

    Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"--that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earthquakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions.
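
Test (i), comparing the number of actual earthquakes to the number predicted, reduces to tail probabilities of a Poisson distribution when the forecast specifies an expected count. A self-contained sketch (the forecast value and observed count are illustrative):

```python
import math

def poisson_pmf(k, lam):
    """Poisson probability of exactly k events given expected count lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def count_test(n_obs, lam):
    """Probabilities of observing at most / at least n_obs events under a
    fully specified Poisson forecast with mean lam. Either tail being very
    small indicates inconsistency between forecast and observation."""
    p_at_most = sum(poisson_pmf(k, lam) for k in range(n_obs + 1))
    p_at_least = 1.0 - sum(poisson_pmf(k, lam) for k in range(n_obs))
    return p_at_most, p_at_least

# e.g. a forecast of 5 events over the test period, with 9 observed
p_le, p_ge = count_test(9, 5.0)
```

Here `p_ge` is roughly 0.07, so nine events would be marginally consistent with the forecast at conventional significance levels.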

  14. Hypothesis testing and earthquake prediction.

    PubMed Central

    Jackson, D D

    1996-01-01

    Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"--that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earthquakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions. PMID:11607663

  15. Associating an ionospheric parameter with major earthquake occurrence throughout the world

    NASA Astrophysics Data System (ADS)

    Ghosh, D.; Midya, S. K.

    2014-02-01

    With time, ionospheric variation analysis is gaining ground over lithospheric monitoring as a source of precursors for earthquake forecasting. The current paper highlights the association of major (Ms ≥ 6.0) and medium (4.0 ≤ Ms < 6.0) earthquake occurrences throughout the world with different ranges of the Ionospheric Earthquake Parameter (IEP), where `Ms' is earthquake magnitude on the Richter scale. From statistical and graphical analyses, it is concluded that the probability of earthquake occurrence is maximum when the defined parameter lies within the range 0-75 (lower range). In the higher ranges, earthquake occurrence probability gradually decreases. A probable explanation is also suggested.

  16. Baja Earthquake Perspective View

    NASA Image and Video Library

    2010-04-05

    The topography surrounding the Laguna Salada Fault in the Mexican state of Baja California is shown in this combined radar image and topographic view, with data from NASA's Shuttle Radar Topography Mission, in the area where a magnitude 7.2 earthquake struck on April 4, 2010.

  17. Earthquake Clusters and Spatio-temporal Migration of earthquakes in Northeastern Tibetan Plateau: a Finite Element Modeling

    NASA Astrophysics Data System (ADS)

    Sun, Y.; Luo, G.

    2017-12-01

    Seismicity in a region is usually characterized by earthquake clusters and earthquake migration along its major fault zones. However, we do not fully understand why and how earthquake clusters and spatio-temporal migration of earthquakes occur. The northeastern Tibetan Plateau is a good example for investigating these problems. In this study, we construct and use a three-dimensional viscoelastoplastic finite-element model to simulate earthquake cycles and spatio-temporal migration of earthquakes along major fault zones in the northeastern Tibetan Plateau. We calculate stress evolution and fault interactions, and explore the effects of topographic loading and the viscosity of the middle-lower crust and upper mantle on model results. Model results show that earthquakes and fault interactions increase Coulomb stress on neighboring faults or segments, accelerating future earthquakes in the region. Thus, earthquakes occur sequentially in a short time, leading to regional earthquake clusters. Through long-term evolution, stresses on some seismogenic faults that are far apart may almost simultaneously reach the critical state of fault failure, probably also leading to regional earthquake clusters and earthquake migration. Based on our model's synthetic seismic catalog and paleoseismic data, we analyze the probability of earthquake migration between major faults in the northeastern Tibetan Plateau. We find that following the 1920 M 8.5 Haiyuan earthquake and the 1927 M 8.0 Gulang earthquake, the next big event (M≥7) in the northeastern Tibetan Plateau would be most likely to occur on the Haiyuan fault.

  18. The magnitude 6.7 Northridge, California, earthquake of 17 January 1994

    USGS Publications Warehouse

    Jones, L.; Aki, K.; Boore, D.; Celebi, M.; Donnellan, A.; Hall, J.; Harris, R.; Hauksson, E.; Heaton, T.; Hough, S.; Hudnut, K.; Hutton, K.; Johnston, M.; Joyner, W.; Kanamori, H.; Marshall, G.; Michael, A.; Mori, J.; Murray, M.; Ponti, D.; Reasenberg, P.; Schwartz, D.; Seeber, L.; Shakal, A.; Simpson, R.; Thio, H.; Tinsley, J.; Todorovska, M.; Trifunac, M.; Wald, D.; Zoback, M.L.

    1994-01-01

    The most costly American earthquake since 1906 struck Los Angeles on 17 January 1994. The magnitude 6.7 Northridge earthquake resulted from more than 3 meters of reverse slip on a 15-kilometer-long south-dipping thrust fault that raised the Santa Susana mountains by as much as 70 centimeters. The fault appears to be truncated by the fault that broke in the 1971 San Fernando earthquake at a depth of 8 kilometers. Of these two events, the Northridge earthquake caused many times more damage, primarily because its causative fault is directly under the city. Many types of structures were damaged, but the fracture of welds in steel-frame buildings was the greatest surprise. The Northridge earthquake emphasizes the hazard posed to Los Angeles by concealed thrust faults and the potential for strong ground shaking in moderate earthquakes.

  19. Earthquakes and Schools

    ERIC Educational Resources Information Center

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  20. Fixed recurrence and slip models better predict earthquake behavior than the time- and slip-predictable models 1: repeating earthquakes

    USGS Publications Warehouse

    Rubinstein, Justin L.; Ellsworth, William L.; Chen, Kate Huihsuan; Uchida, Naoki

    2012-01-01

    The behavior of individual events in repeating earthquake sequences in California, Taiwan and Japan is better predicted by a model with fixed inter-event time or fixed slip than it is by the time- and slip-predictable models for earthquake occurrence. Given that repeating earthquakes are highly regular in both inter-event time and seismic moment, the time- and slip-predictable models seem ideally suited to explain their behavior. Taken together with evidence from the companion manuscript that shows similar results for laboratory experiments, we conclude that the short-term predictions of the time- and slip-predictable models should be rejected in favor of earthquake models that assume either fixed slip or fixed recurrence interval. This implies that the elastic rebound model underlying the time- and slip-predictable models offers no additional value in describing earthquake behavior in an event-to-event sense, but its value in a long-term sense cannot be determined. These models likely fail because they rely on assumptions that oversimplify the earthquake cycle. We note that the time and slip of these events are predicted quite well by fixed slip and fixed recurrence models, so in some sense they are time- and slip-predictable. While fixed recurrence and slip models better predict repeating earthquake behavior than the time- and slip-predictable models, we observe a correlation between slip and the preceding recurrence time for many repeating earthquake sequences in Parkfield, California. This correlation is not found in other regions, and the sequences with the correlative slip-predictable behavior are not distinguishable from nearby earthquake sequences that do not exhibit this behavior.

  1. Probabilistic Relationships between Ground‐Motion Parameters and Modified Mercalli Intensity in California

    USGS Publications Warehouse

    Worden, C.B.; Wald, David J.; Rhoades, D.A.

    2012-01-01

    We use a database of approximately 200,000 modified Mercalli intensity (MMI) observations of California earthquakes collected from USGS "Did You Feel It?" (DYFI) reports, along with a comparable number of peak ground-motion amplitudes from California seismic networks, to develop probabilistic relationships between MMI and peak ground velocity (PGV), peak ground acceleration (PGA), and 0.3-s, 1-s, and 3-s 5% damped pseudospectral acceleration (PSA). After associating each ground-motion observation with an MMI computed from all the DYFI responses within 2 km of the observation, we derived a joint probability distribution between MMI and ground motion. We then derived reversible relationships between MMI and each ground-motion parameter by using a total least squares regression to fit a bilinear function to the median of the stacked probability distributions. Among the relationships, the fit to peak ground velocity has the smallest errors, though linear combinations of PGA and PGV give nominally better results. We also find that magnitude and distance terms reduce the overall residuals and are justifiable on an information theoretic basis. For intensities MMI≥5, our results are in close agreement with the relations of Wald, Quitoriano, Heaton, and Kanamori (1999); for lower intensities, our results fall midway between Wald, Quitoriano, Heaton, and Kanamori (1999) and those of Atkinson and Kaka (2007). The earthquakes in the study ranged in magnitude from 3.0 to 7.3, and the distances ranged from less than a kilometer to about 400 km from the source.
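
The bilinear (hinged) form fit in this regression can be written as a continuous two-slope function of log ground motion. The coefficients below are placeholders for illustration, not the published values:

```python
def bilinear(x, c1, c2, c3, x0):
    """Continuous hinged-linear function: slope c2 below the hinge x0,
    slope c3 above it, matched at x0 so there is no jump."""
    if x <= x0:
        return c1 + c2 * x
    return c1 + c2 * x0 + c3 * (x - x0)

# Illustrative MMI from log10(PGV): intercept 4.0, slopes 1.5 / 3.0, hinge 0.5.
mmi = bilinear(1.0, 4.0, 1.5, 3.0, 0.5)
```

Fitting the median of the stacked probability distributions with total least squares then amounts to estimating `c1`, `c2`, `c3`, and the hinge `x0` while allowing errors in both variables.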

  2. Seasonal water storage, stress modulation and California seismicity

    NASA Astrophysics Data System (ADS)

    Johnson, C. W.; Burgmann, R.; Fu, Y.

    2017-12-01

    Establishing what controls the timing of earthquakes is fundamental to understanding the nature of the earthquake cycle and critical to determining time-dependent earthquake hazard. Seasonal loading provides a natural laboratory to explore the crustal response to a quantifiable transient force. In California, the accumulation of winter snowpack in the Sierra Nevada, surface water in lakes and reservoirs, and groundwater in sedimentary basins follow the annual cycle of wet winters and dry summers. The surface loads resulting from the seasonal changes in water storage produce elastic deformation of the Earth's crust. We used 9 years of global positioning system (GPS) vertical deformation time series to constrain models of monthly hydrospheric loading and the resulting stress changes on fault planes of small earthquakes. Previous studies posit that temperature, atmospheric pressure, or hydrologic changes may strain the lithosphere and promote additional earthquakes above background levels. Depending on fault geometry, the addition or removal of water increases the Coulomb failure stress. The largest stress amplitudes are occurring on dipping reverse faults in the Coast Ranges and along the eastern Sierra Nevada range front. We analyze 9 years of M≥2.0 earthquakes with known focal mechanisms in northern and central California to resolve fault-normal and fault-shear stresses for the focal geometry. Our results reveal 10% more earthquakes occurring during slip-encouraging fault-shear stress conditions and suggest that earthquake populations are modulated at periods of natural loading cycles, which promote failure by stress changes on the order of 1-5 kPa. We infer that California seismicity rates are modestly modulated by natural hydrological loading cycles.
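
The Coulomb failure stress change invoked here is ΔCFS = Δτ + μ′Δσn (with unclamping, i.e. tension, taken positive). Resolving a stress-change tensor onto a given fault geometry is a few lines; the effective friction μ′ = 0.4 is a common but assumed value, and the example numbers are illustrative:

```python
import numpy as np

def coulomb_stress_change(dstress, n_hat, s_hat, mu_eff=0.4):
    """dCFS = d_tau + mu' * d_sigma_n for a 3x3 stress-change tensor (Pa),
    unit fault normal n_hat, and unit slip direction s_hat (tension positive)."""
    traction = dstress @ n_hat
    d_sigma_n = traction @ n_hat      # normal stress change on the plane
    d_tau = traction @ s_hat          # shear stress change in the slip direction
    return d_tau + mu_eff * d_sigma_n

# 1 kPa of simple shear resolved on a vertical fault slipping along x
ds = np.array([[0.0, 1e3, 0.0], [1e3, 0.0, 0.0], [0.0, 0.0, 0.0]])
dcfs = coulomb_stress_change(ds, np.array([0.0, 1.0, 0.0]), np.array([1.0, 0.0, 0.0]))
```

The result here is 1 kPa of slip-encouraging stress, the same order as the 1-5 kPa seasonal changes the abstract argues can modulate seismicity rates.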

  3. San Andreas fault geometry at Desert Hot Springs, California, and its effects on earthquake hazards and groundwater

    USGS Publications Warehouse

    Catchings, R.D.; Rymer, M.J.; Goldman, M.R.; Gandhok, G.

    2009-01-01

    The Mission Creek and Banning faults are two of the principal strands of the San Andreas fault zone in the northern Coachella Valley of southern California. Structural characteristics of the faults affect both regional earthquake hazards and local groundwater resources. We use seismic, gravity, and geological data to characterize the San Andreas fault zone in the vicinity of Desert Hot Springs. Seismic images of the upper 500 m of the Mission Creek fault at Desert Hot Springs show multiple fault strands distributed over a 500 m wide zone, with concentrated faulting within a central 200 m wide area of the fault zone. High-velocity (up to 5000 m/sec) rocks on the northeast side of the fault are juxtaposed against low-velocity deposits on the southwest side. The two most recent damaging (M > 6.0) earthquakes in the area (in 1948 and 1986) occurred at or near the depths (~10 to 12 km) of the merged (San Andreas) fault. Large-magnitude earthquakes that nucleate at or below the merged fault will likely generate strong shaking from guided waves along both fault zones and from amplified seismic waves in the low-velocity basin between the two fault zones. The Mission Creek fault zone is a groundwater barrier with the top of the water table varying by 60 m in depth and the aquifer varying by about 50 m in thickness across a 200 m wide zone of concentrated faulting.

  4. Reexamination of the subsurface fault structure in the vicinity of the 1989 moment-magnitude-6.9 Loma Prieta earthquake, central California, using steep-reflection, earthquake, and magnetic data

    USGS Publications Warehouse

    Zhang, Edward; Fuis, Gary S.; Catchings, Rufus D.; Scheirer, Daniel S.; Goldman, Mark; Bauer, Klaus

    2018-06-13

    We reexamine the geometry of the causative fault structure of the 1989 moment-magnitude-6.9 Loma Prieta earthquake in central California, using seismic-reflection, earthquake-hypocenter, and magnetic data. Our study is prompted by recent interpretations of a two-part dip of the San Andreas Fault (SAF), accompanied by a flower-like structure, in the Coachella Valley in southern California. Initially, the prevailing interpretation of fault geometry in the vicinity of the Loma Prieta earthquake was that the mainshock did not rupture the SAF, but rather a secondary fault within the SAF system, because network locations of aftershocks defined neither a vertical plane nor a fault plane that projected to the surface trace of the SAF. Subsequent waveform cross-correlation and double-difference relocations of Loma Prieta aftershocks appear to have clarified the fault geometry somewhat, with steeply dipping faults in the upper crust possibly connecting to the more moderately southwest-dipping mainshock rupture in the middle crust. Examination of steep-reflection data, extracted from a 1991 seismic-refraction profile through the Loma Prieta area, reveals three robust fault-like features that agree approximately in geometry with the clusters of upper-crustal relocated aftershocks. The subsurface geometry of the San Andreas, Sargent, and Berrocal Faults can be mapped using these features and the aftershock clusters. The San Andreas and Sargent Faults appear to dip northeastward in the uppermost crust and change dip continuously toward the southwest with depth. Previous models of gravity and magnetic data on profiles through the aftershock region also define a steeply dipping SAF, with an initial northeastward dip in the uppermost crust that changes with depth. At depths of 6 to 9 km, upper-crustal faults appear to project into the moderately southwest-dipping, planar mainshock rupture. The change to a planar dipping rupture at 6–9 km is similar to fault geometry seen in the

  5. Surface slip during large Owens Valley earthquakes

    NASA Astrophysics Data System (ADS)

    Haddon, E. K.; Amos, C. B.; Zielke, O.; Jayko, A. S.; Bürgmann, R.

    2016-06-01

    The 1872 Owens Valley earthquake is the third largest known historical earthquake in California. Relatively sparse field data and a complex rupture trace, however, inhibited attempts to fully resolve the slip distribution and reconcile the total moment release. We present a new, comprehensive record of surface slip based on lidar and field investigation, documenting 162 new measurements of laterally and vertically displaced landforms for 1872 and prehistoric Owens Valley earthquakes. Our lidar analysis uses a newly developed analytical tool to measure fault slip based on cross-correlation of sublinear topographic features and to produce a uniquely shaped probability density function (PDF) for each measurement. Stacking PDFs along strike to form cumulative offset probability distribution plots (COPDs) highlights common values corresponding to single and multiple-event displacements. Lateral offsets for 1872 vary systematically from ˜1.0 to 6.0 m and average 3.3 ± 1.1 m (2σ). Vertical offsets are predominantly east-down between ˜0.1 and 2.4 m, with a mean of 0.8 ± 0.5 m. The average lateral-to-vertical ratio compiled at specific sites is ˜6:1. Summing displacements across subparallel, overlapping rupture traces implies a maximum of 7-11 m and net average of 4.4 ± 1.5 m, corresponding to a geologic Mw ˜7.5 for the 1872 event. We attribute progressively higher-offset lateral COPD peaks at 7.1 ± 2.0 m, 12.8 ± 1.5 m, and 16.6 ± 1.4 m to three earlier large surface ruptures. Evaluating cumulative displacements in context with previously dated landforms in Owens Valley suggests relatively modest rates of fault slip, averaging between ˜0.6 and 1.6 mm/yr (1σ) over the late Quaternary.
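
Stacking the per-measurement PDFs into a COPD amounts to summing the individual densities on a common offset axis and reading off the peaks. A simplified sketch using Gaussian stand-ins for the empirically shaped, cross-correlation-derived PDFs (the offsets and widths below are hypothetical):

```python
import numpy as np

def stack_copd(offsets, sigmas, x):
    """Sum unit-area Gaussian PDFs (stand-ins for each offset measurement's
    PDF) along a common offset axis x to form a COPD."""
    copd = np.zeros_like(x)
    for mu, sigma in zip(offsets, sigmas):
        copd += np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return copd

x = np.linspace(0.0, 10.0, 2001)
# hypothetical offsets (m): a cluster near 3.3 m plus one larger, multi-event value
copd = stack_copd([3.1, 3.3, 3.4, 7.0], [0.3, 0.4, 0.3, 0.5], x)
peak = x[np.argmax(copd)]   # dominant peak, interpreted as single-event offset
```

Secondary peaks at roughly integer multiples of the dominant one are what the study interprets as cumulative displacements from earlier ruptures.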

  6. Surface slip during large Owens Valley earthquakes

    USGS Publications Warehouse

    Haddon, E.K.; Amos, C.B.; Zielke, O.; Jayko, Angela S.; Burgmann, R.

    2016-01-01

    The 1872 Owens Valley earthquake is the third largest known historical earthquake in California. Relatively sparse field data and a complex rupture trace, however, inhibited attempts to fully resolve the slip distribution and reconcile the total moment release. We present a new, comprehensive record of surface slip based on lidar and field investigation, documenting 162 new measurements of laterally and vertically displaced landforms for 1872 and prehistoric Owens Valley earthquakes. Our lidar analysis uses a newly developed analytical tool to measure fault slip based on cross-correlation of sublinear topographic features and to produce a uniquely shaped probability density function (PDF) for each measurement. Stacking PDFs along strike to form cumulative offset probability distribution plots (COPDs) highlights common values corresponding to single and multiple-event displacements. Lateral offsets for 1872 vary systematically from ∼1.0 to 6.0 m and average 3.3 ± 1.1 m (2σ). Vertical offsets are predominantly east-down between ∼0.1 and 2.4 m, with a mean of 0.8 ± 0.5 m. The average lateral-to-vertical ratio compiled at specific sites is ∼6:1. Summing displacements across subparallel, overlapping rupture traces implies a maximum of 7–11 m and net average of 4.4 ± 1.5 m, corresponding to a geologic Mw ∼7.5 for the 1872 event. We attribute progressively higher-offset lateral COPD peaks at 7.1 ± 2.0 m, 12.8 ± 1.5 m, and 16.6 ± 1.4 m to three earlier large surface ruptures. Evaluating cumulative displacements in context with previously dated landforms in Owens Valley suggests relatively modest rates of fault slip, averaging between ∼0.6 and 1.6 mm/yr (1σ) over the late Quaternary.

  7. California: Diamond Valley

    Atmospheric Science Data Center

    2014-05-15

    Watching the Creation of Southern California's Largest Reservoir: Diamond Valley Lake is designed to provide protection against drought and a six-month emergency supply in the event of earthquake damage to a ...

  8. The Parkfield earthquake prediction of October 1992; the emergency services response

    USGS Publications Warehouse

    Andrews, R.

    1992-01-01

    The science of earthquake prediction is interesting and worthy of support. In many respects the ultimate payoff of earthquake prediction or earthquake forecasting is how the information can be used to enhance public safety and public preparedness. This is a particularly important issue here in California where we have such a high level of seismic risk historically, and currently, as a consequence of activity in 1989 in the San Francisco Bay Area, in Humboldt County in April of this year (1992), and in southern California in the Landers-Big Bear area in late June of this year (1992). We are currently very concerned about the possibility of a major earthquake, one or more, happening close to one of our metropolitan areas. Within that context, the Parkfield experiment becomes very important. 

  9. Earthquake Prediction is Coming

    ERIC Educational Resources Information Center

    MOSAIC, 1977

    1977-01-01

    Describes (1) several methods used in earthquake research, including P:S ratio velocity studies, dilatancy models; and (2) techniques for gathering base-line data for prediction using seismographs, tiltmeters, laser beams, magnetic field changes, folklore, animal behavior. The mysterious Palmdale (California) bulge is discussed. (CS)

  10. Improved Data Access From the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D.; Oppenheimer, D.; Zuzlewski, S.; Klein, F.; Jensen, E.; Gee, L.; Murray, M.; Romanowicz, B.

    2002-12-01

descriptions for both historic and current NCSN instrumentation. The NCEDC and USGS jointly developed a procedure to create and maintain the hardware attributes and instrument responses at the NCEDC for the 3500 NCSN channels. As a result, the NCSN waveform data can now be distributed in SEED format. The NCEDC provides access to waveform data through Web forms, email requests, and programming interfaces. The SeismiQuery Web interface provides information about data holdings. NetDC allows users to retrieve inventory information, instrument responses, and waveforms in SEED format. STP provides both a Web and programming interface to retrieve data in SEED or other user-friendly formats. Through the newly formed California Integrated Seismic Network, we are working with the SCEDC to provide unified access to California earthquake data.

  11. Timing of large earthquakes since A.D. 800 on the Mission Creek strand of the San Andreas fault zone at Thousand Palms Oasis, near Palm Springs, California

    USGS Publications Warehouse

    Fumal, T.E.; Rymer, M.J.; Seitz, G.G.

    2002-01-01

Paleoseismic investigations across the Mission Creek strand of the San Andreas fault at Thousand Palms Oasis indicate that four and probably five surface-rupturing earthquakes occurred during the past 1200 years. Calendar age estimates for these earthquakes are based on a chronological model that incorporates radiocarbon dates from 18 in situ burn layers and stratigraphic ordering constraints. These five earthquakes occurred in about A.D. 825 (770-890) (mean, 95% range), A.D. 982 (840-1150), A.D. 1231 (1170-1290), A.D. 1502 (1450-1555), and after a date in the range of A.D. 1520-1680. The most recent surface-rupturing earthquake at Thousand Palms is likely the same as the A.D. 1676 ± 35 event at Indio reported by Sieh and Williams (1990). Each of the past five earthquakes recorded on the San Andreas fault in the Coachella Valley strongly overlaps in time with an event at the Wrightwood paleoseismic site, about 120 km northwest of Thousand Palms Oasis. Correlation of events between these two sites suggests that at least the southernmost 200 km of the San Andreas fault zone may have ruptured in each earthquake. The average repeat time for surface-rupturing earthquakes on the San Andreas fault in the Coachella Valley is 215 ± 25 years, whereas the elapsed time since the most recent event is 326 ± 35 years. This suggests the southernmost San Andreas fault zone likely is very near failure. The Thousand Palms Oasis site is underlain by a series of six channels cut and filled since about A.D. 800 that cross the fault at high angles. A channel margin about 900 years old is offset right laterally 2.0 ± 0.5 m, indicating a slip rate of 4 ± 2 mm/yr. This slip rate is low relative to geodetic and other geologic slip rate estimates (26 ± 2 mm/yr and about 23-35 mm/yr, respectively) on the southernmost San Andreas fault zone, possibly because (1) the site is located in a small step-over in the fault trace and so the rate may not be representative of the Mission Creek fault

  12. Subduction zone earthquake probably triggered submarine hydrocarbon seepage offshore Pakistan

    NASA Astrophysics Data System (ADS)

Fischer, David; Mogollón, José M.; Strasser, Michael; Pape, Thomas; Bohrmann, Gerhard; Fekete, Noemi; Spiess, Volkhard; Kasten, Sabine

    2014-05-01

    Seepage of methane-dominated hydrocarbons is heterogeneous in space and time, and trigger mechanisms of episodic seep events are not well constrained. It is generally found that free hydrocarbon gas entering the local gas hydrate stability field in marine sediments is sequestered in gas hydrates. In this manner, gas hydrates can act as a buffer for carbon transport from the sediment into the ocean. However, the efficiency of gas hydrate-bearing sediments for retaining hydrocarbons may be corrupted: Hypothesized mechanisms include critical gas/fluid pressures beneath gas hydrate-bearing sediments, implying that these are susceptible to mechanical failure and subsequent gas release. Although gas hydrates often occur in seismically active regions, e.g., subduction zones, the role of earthquakes as potential triggers of hydrocarbon transport through gas hydrate-bearing sediments has hardly been explored. Based on a recent publication (Fischer et al., 2013), we present geochemical and transport/reaction-modelling data suggesting a substantial increase in upward gas flux and hydrocarbon emission into the water column following a major earthquake that occurred near the study sites in 1945. Calculating the formation time of authigenic barite enrichments identified in two sediment cores obtained from an anticlinal structure called "Nascent Ridge", we find they formed 38-91 years before sampling, which corresponds well to the time elapsed since the earthquake (62 years). Furthermore, applying a numerical model, we show that the local sulfate/methane transition zone shifted upward by several meters due to the increased methane flux and simulated sulfate profiles very closely match measured ones in a comparable time frame of 50-70 years. We thus propose a causal relation between the earthquake and the amplified gas flux and present reflection seismic data supporting our hypothesis that co-seismic ground shaking induced mechanical fracturing of gas hydrate-bearing sediments

  13. Safety and survival in an earthquake

    USGS Publications Warehouse

    ,

    1969-01-01

    Many earth scientists in this country and abroad are focusing their studies on the search for means of predicting impending earthquakes, but, as yet, an accurate prediction of the time and place of such an event cannot be made. From past experience, however, one can assume that earthquakes will continue to harass mankind and that they will occur most frequently in the areas where they have been relatively common in the past. In the United States, earthquakes can be expected to occur most frequently in the western states, particularly in Alaska, California, Washington, Oregon, Nevada, Utah, and Montana. The danger, however, is not confined to any one part of the country; major earthquakes have occurred at widely scattered locations.

  14. Evaluating the Real-time and Offline Performance of the Virtual Seismologist Earthquake Early Warning Algorithm

    NASA Astrophysics Data System (ADS)

    Cua, G.; Fischer, M.; Heaton, T.; Wiemer, S.

    2009-04-01

The Virtual Seismologist (VS) algorithm is a Bayesian approach to regional, network-based earthquake early warning (EEW). Bayes' theorem as applied in the VS algorithm states that the most probable source estimate at any given time is a combination of contributions from relatively static prior information that does not change over the timescale of earthquake rupture and a likelihood function that evolves with time to take into account incoming pick and amplitude observations from the ongoing earthquake. Potentially useful types of prior information include network topology or station health status, regional hazard maps, earthquake forecasts, and the Gutenberg-Richter magnitude-frequency relationship. The VS codes provide magnitude and location estimates once picks are available at 4 stations; these source estimates are subsequently updated each second. The algorithm predicts the geographical distribution of peak ground acceleration and velocity using the estimated magnitude and location and appropriate ground motion prediction equations; the peak ground motion estimates are also updated each second. Implementation of the VS algorithm in California and Switzerland is funded by the Seismic Early Warning for Europe (SAFER) project. The VS method is one of three EEW algorithms whose real-time performance is being evaluated and tested by the California Integrated Seismic Network (CISN) EEW project. A crucial component of operational EEW algorithms is the ability to distinguish between noise and earthquake-related signals in real-time. We discuss various empirical approaches that allow the VS algorithm to operate in the presence of noise. Real-time operation of the VS codes at the Southern California Seismic Network (SCSN) began in July 2008. On average, the VS algorithm provides initial magnitude, location, origin time, and ground motion distribution estimates within 17 seconds of the earthquake origin time.
These initial estimate times are dominated by the time for 4
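The Bayesian update at the core of a VS-style source estimate can be sketched as follows. The Gutenberg-Richter prior and the Gaussian likelihood around a hypothetical station-derived magnitude are illustrative assumptions, not the actual VS implementation; the point is how a static prior shifts the estimate away from the raw observation.

```python
# Sketch: posterior(M) ∝ prior(M) × likelihood(data | M) on a magnitude grid.
# Prior: Gutenberg-Richter relative frequency 10^(-b·M).
# Likelihood: Gaussian around a hypothetical station-derived magnitude.
import math

def posterior(mag_grid, b_value, obs_mag, obs_sigma):
    prior = [10 ** (-b_value * m) for m in mag_grid]
    like = [math.exp(-0.5 * ((m - obs_mag) / obs_sigma) ** 2) for m in mag_grid]
    unnorm = [p * l for p, l in zip(prior, like)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

mags = [3.0 + 0.1 * i for i in range(41)]            # magnitude 3.0-7.0 grid
post = posterior(mags, b_value=1.0, obs_mag=5.5, obs_sigma=0.5)
map_mag = mags[post.index(max(post))]                # MAP estimate, below 5.5
```

Because small earthquakes are a priori more frequent, the maximum a posteriori magnitude lands below the raw observation of 5.5, illustrating how prior information tempers early, noisy estimates.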

  15. Recent Achievements of the Collaboratory for the Study of Earthquake Predictability

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.; Liukis, M.; Werner, M. J.; Schorlemmer, D.; Yu, J.; Maechling, P. J.; Jackson, D. D.; Rhoades, D. A.; Zechar, J. D.; Marzocchi, W.

    2016-12-01

The Collaboratory for the Study of Earthquake Predictability (CSEP) supports a global program to conduct prospective earthquake forecasting experiments. CSEP testing centers are now operational in California, New Zealand, Japan, China, and Europe with 442 models under evaluation. The California testing center, started by SCEC on September 1, 2007, currently hosts 30-minute, 1-day, 3-month, 1-year and 5-year forecasts, both alarm-based and probabilistic, for California, the Western Pacific, and worldwide. Our tests are now based on the hypocentral locations and magnitudes of cataloged earthquakes, but we plan to test focal mechanisms, seismic hazard models, ground motion forecasts, and finite rupture forecasts as well. We have increased computational efficiency for high-resolution global experiments, such as the evaluation of the Global Earthquake Activity Rate (GEAR) model, introduced Bayesian ensemble models, and implemented support for non-Poissonian simulation-based forecast models. We are currently developing formats and procedures to evaluate externally hosted forecasts and predictions. CSEP supports the USGS program in operational earthquake forecasting and a DHS project to register and test external forecast procedures from experts outside seismology. We found that earthquakes as small as magnitude 2.5 provide important information on subsequent earthquakes larger than magnitude 5. A retrospective experiment for the 2010-2012 Canterbury earthquake sequence showed that some physics-based and hybrid models outperform catalog-based (e.g., ETAS) models. This experiment also demonstrates the ability of the CSEP infrastructure to support retrospective forecast testing. Current CSEP development activities include adoption of the Comprehensive Earthquake Catalog (ComCat) as an authorized data source, retrospective testing of simulation-based forecasts, and support for additive ensemble methods.
We describe the open-source CSEP software that is available to researchers as

  16. Testing earthquake source inversion methodologies

    USGS Publications Warehouse

    Page, M.; Mai, P.M.; Schorlemmer, D.

    2011-01-01

Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  17. Association of earthquakes and faults in the San Francisco Bay area using Bayesian inference

    USGS Publications Warehouse

    Wesson, R.L.; Bakun, W.H.; Perkins, D.M.

    2003-01-01

    Bayesian inference provides a method to use seismic intensity data or instrumental locations, together with geologic and seismologic data, to make quantitative estimates of the probabilities that specific past earthquakes are associated with specific faults. Probability density functions are constructed for the location of each earthquake, and these are combined with prior probabilities through Bayes' theorem to estimate the probability that an earthquake is associated with a specific fault. Results using this method are presented here for large, preinstrumental, historical earthquakes and for recent earthquakes with instrumental locations in the San Francisco Bay region. The probabilities for individual earthquakes can be summed to construct a probabilistic frequency-magnitude relationship for a fault segment. Other applications of the technique include the estimation of the probability of background earthquakes, that is, earthquakes not associated with known or considered faults, and the estimation of the fraction of the total seismic moment associated with earthquakes less than the characteristic magnitude. Results for the San Francisco Bay region suggest that potentially damaging earthquakes with magnitudes less than the characteristic magnitudes should be expected. Comparisons of earthquake locations and the surface traces of active faults as determined from geologic data show significant disparities, indicating that a complete understanding of the relationship between earthquakes and faults remains elusive.
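The association step described above reduces to a single application of Bayes' theorem per earthquake: the posterior probability that an event belongs to fault k is proportional to the location likelihood given that fault times the fault's prior probability. The fault names, priors, and likelihoods below are invented for illustration.

```python
# Sketch: P(fault_k | location) ∝ P(location | fault_k) · P(fault_k),
# normalized over all candidate sources (including a "background" class
# for earthquakes not associated with any considered fault).
def associate(priors, likelihoods):
    """priors, likelihoods: dicts keyed by fault name; returns posterior dict."""
    unnorm = {f: priors[f] * likelihoods[f] for f in priors}
    z = sum(unnorm.values())
    return {f: v / z for f, v in unnorm.items()}

priors = {"Hayward": 0.3, "San Andreas": 0.5, "background": 0.2}
# likelihood of the observed intensity-based location given each source
likelihoods = {"Hayward": 0.08, "San Andreas": 0.01, "background": 0.02}
post = associate(priors, likelihoods)
best = max(post, key=post.get)   # the location evidence outweighs the prior
```

In this toy case the Hayward fault wins despite a lower prior, because the location likelihood dominates; summing such posteriors over many earthquakes yields the per-fault frequency-magnitude estimates mentioned in the abstract.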

  18. Calculation of earthquake rupture histories using a hybrid global search algorithm: Application to the 1992 Landers, California, earthquake

    USGS Publications Warehouse

    Hartzell, S.; Liu, P.

    1996-01-01

A method is presented for the simultaneous calculation of slip amplitudes and rupture times for a finite fault using a hybrid global search algorithm. The method we use combines simulated annealing with the downhill simplex method to produce a more efficient search algorithm than either of the two constituent parts. This formulation has advantages over traditional iterative or linearized approaches to the problem because it is able to escape local minima in its search through model space for the global optimum. We apply this global search method to the calculation of the rupture history for the Landers, California, earthquake. The rupture is modeled using three separate finite-fault planes to represent the three main fault segments that failed during this earthquake. Both the slip amplitude and the time of slip are calculated for a grid work of subfaults. The data used consist of digital, teleseismic P and SH body waves. Long-period, broadband, and short-period records are utilized to obtain a wideband characterization of the source. The results of the global search inversion are compared with a more traditional linear-least-squares inversion for only slip amplitudes. We use a multi-time-window linear analysis to relax the constraints on rupture time and rise time in the least-squares inversion. Both inversions produce similar slip distributions, although the linear-least-squares solution has a 10% larger moment (7.3 × 10^26 dyne-cm compared with 6.6 × 10^26 dyne-cm). Both inversions fit the data equally well and point out the importance of (1) using a parameterization with sufficient spatial and temporal flexibility to encompass likely complexities in the rupture process, (2) including suitable physically based constraints on the inversion to reduce instabilities in the solution, and (3) focusing on those robust rupture characteristics that rise above the details of the parameterization and data set.
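The escape-from-local-minima property that motivates the hybrid scheme can be illustrated with the simulated-annealing half of the method alone (the downhill simplex refinement is omitted for brevity). The test function, cooling schedule, and all tuning parameters below are invented; this is a minimal sketch, not the inversion code.

```python
# Sketch: simulated annealing on a 1-D multimodal function. The Metropolis
# rule occasionally accepts uphill moves, letting the search jump out of
# the local minimum near x = +2 and find the global minimum near x = -2.
import math, random

def f(x):
    # Invented test function: local minimum near x = +2, global near x = -2.
    return (x * x - 4.0) ** 2 + x

def anneal(fn, x0, seed, steps=5000, t0=5.0):
    rng = random.Random(seed)
    x, fx = x0, fn(x0)
    best_x, best_f = x, fx
    for k in range(steps):
        t = t0 * (1.0 - k / steps) + 1e-9        # linear cooling schedule
        cand = x + rng.gauss(0.0, 0.5)           # random neighbor proposal
        fc = fn(cand)
        # Metropolis rule: accept improvements always, uphill moves sometimes.
        if fc < fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
    return best_x, best_f

# Start in the basin of the LOCAL minimum; restart over a few seeds.
best_x, best_f = min((anneal(f, 2.0, seed=s) for s in range(5)), key=lambda r: r[1])
```

A plain downhill search started at x = 2 would stop at the local minimum; the annealing run reaches the global basin, which is the property the abstract credits to the hybrid algorithm.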

  19. The nature of earthquake prediction

    USGS Publications Warehouse

    Lindh, A.G.

    1991-01-01

Earthquake prediction is inherently statistical. Although some people continue to think of earthquake prediction as the specification of the time, place, and magnitude of a future earthquake, it has been clear for at least a decade that this is an unrealistic and unreasonable definition. The reality is that earthquake prediction starts from long-term forecasts of place and magnitude, with very approximate time constraints, and progresses, at least in principle, to a gradual narrowing of the time window as data and understanding permit. Primitive long-term forecasts are clearly possible at this time on a few well-characterized fault systems. Tightly focused monitoring experiments aimed at short-term prediction are already underway in Parkfield, California, and in the Tokai region in Japan; only time will tell how much progress will be possible.

  20. Deviant Earthquakes: Data-driven Constraints on the Variability in Earthquake Source Properties and Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Trugman, Daniel Taylor

The complexity of the earthquake rupture process makes earthquakes inherently unpredictable. Seismic hazard forecasts often presume that the rate of earthquake occurrence can be adequately modeled as a space-time homogeneous or stationary Poisson process and that the dynamical source properties of small and large earthquakes obey self-similar scaling relations. While these simplified models provide useful approximations and encapsulate the first-order statistical features of the historical seismic record, they are inconsistent with the complexity underlying earthquake occurrence and can lead to misleading assessments of seismic hazard when applied in practice. The six principal chapters of this thesis explore the extent to which the behavior of real earthquakes deviates from these simplified models, and the implications that the observed deviations have for our understanding of earthquake rupture processes and seismic hazard. Chapter 1 provides a brief thematic overview and introduction to the scope of this thesis. Chapter 2 examines the complexity of the 2010 M7.2 El Mayor-Cucapah earthquake, focusing on the relation between its unexpected and unprecedented occurrence and anthropogenic stresses from the nearby Cerro Prieto Geothermal Field. Chapter 3 compares long-term changes in seismicity within California's three largest geothermal fields in an effort to characterize the relative influence of natural and anthropogenic stress transients on local seismic hazard. Chapter 4 describes a hybrid, hierarchical clustering algorithm that can be used to relocate earthquakes using waveform cross-correlation, and applies the new algorithm to study the spatiotemporal evolution of two recent seismic swarms in western Nevada. Chapter 5 describes a new spectral decomposition technique that can be used to analyze the dynamic source properties of large datasets of earthquakes, and applies this approach to revisit the question of self-similar scaling of

  1. Crustal deformation in Great California Earthquake cycles

    NASA Technical Reports Server (NTRS)

    Li, Victor C.; Rice, James R.

    1987-01-01

    A model in which coupling is described approximately through a generalized Elsasser model is proposed for computation of the periodic crustal deformation associated with repeated strike-slip earthquakes. The model is found to provide a more realistic physical description of tectonic loading than do simpler kinematic models. Parameters are chosen to model the 1857 and 1906 San Andreas ruptures, and predictions are found to be consistent with data on variations of contemporary surface strain and displacement rates as a function of distance from the 1857 and 1906 rupture traces. Results indicate that the asthenosphere appropriate to describe crustal deformation on the earthquake cycle time scale lies in the lower crust and perhaps the crust-mantle transition zone.

  2. The 3-D aftershock distribution of three recent M5~5.5 earthquakes in the Anza region,California

    NASA Astrophysics Data System (ADS)

    Zhang, Q.; Wdowinski, S.; Lin, G.

    2011-12-01

The San Jacinto fault zone (SJFZ) exhibits the highest level of seismicity compared to other regions in southern California. On average, it produces four earthquakes per day, most of them at depths of 10-17 km. Over the past decade, increasing seismic activity occurred in the Anza region, including three M5~5.5 events and their aftershock sequences. These events occurred in 2001, 2005, and 2010. In this research we map the 3-D distribution of these three events to evaluate their rupture geometry and better understand the unusual deep seismic pattern along the SJFZ, which was termed "deep creep" (Wdowinski, 2009). We relocated 97,562 events from 1981 to 2011 in the Anza region by applying the Source-Specific Station Term (SSST) method (Lin et al., 2006), using an accurate 1-D velocity model derived from the 3-D model of Lin et al. (2007). In order to separate the aftershock sequences from background seismicity, we characterized each of the three aftershock sequences using Omori's law. Preliminary results show that all three sequences had a similar geometry of deep, elongated aftershock distribution. Most aftershocks occurred at depths of 10-17 km and extended over a 70-km-long segment of the SJFZ, centered at the mainshock hypocenters. A comparative study of other M5~5.5 mainshocks and their aftershock sequences in southern California reveals a very different geometrical pattern, suggesting that the three Anza M5~5.5 events are unique and may be indicative of "deep creep" deformation processes. References: 1. Lin, G. and Shearer, P.M., 2006, The COMPLOC earthquake location package, Seism. Res. Lett. 77, pp. 440-444. 2. Lin, G., Shearer, P.M., Hauksson, E., and Thurber, C.H., 2007, A three-dimensional crustal seismic velocity model for southern California from a composite event method, J. Geophys. Res. 112, B12306, doi:10.1029/2007JB004977. 3. Wdowinski, S., 2009, Deep creep as a cause for the excess seismicity along the San Jacinto fault, Nat. Geosci., doi:10.1038/NGEO684.
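Separating an aftershock sequence from background seismicity, as done in the study above, typically uses the modified Omori law, under which the aftershock rate decays as n(t) = K / (t + c)^p. The parameter values and background rate below are illustrative, not fits to the Anza sequences.

```python
# Sketch: modified Omori law n(t) = K / (t + c)^p for the aftershock rate,
# and a simple criterion for when the sequence has decayed into the
# background: the first day on which n(t) drops below the background rate.
def omori_rate(t_days, K=100.0, c=0.05, p=1.1):
    return K / (t_days + c) ** p

def sequence_duration(background_rate, K=100.0, c=0.05, p=1.1):
    """First whole day on which the Omori rate falls below background."""
    t = 1
    while omori_rate(t, K, c, p) >= background_rate:
        t += 1
    return t

dur = sequence_duration(background_rate=0.5)  # e.g. ~0.5 events/day background
```

Events occurring after this crossover day are more plausibly background seismicity than aftershocks, which is the kind of partition the Omori-law characterization provides.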

  3. Photomosaics and logs of trenches on the San Andreas Fault, Thousand Palms Oasis, California

    USGS Publications Warehouse

    Fumal, Thomas E.; Frost, William T.; Garvin, Christopher; Hamilton, John C.; Jaasma, Monique; Rymer, Michael J.

    2004-01-01

    We present photomosaics and logs of the walls of trenches excavated for a paleoseismic study at Thousand Palms Oasis (Fig. 1). The site is located on the Mission Creek strand of the San Andreas fault zone, one of two major active strands of the fault in the Indio Hills along the northeast margin of the Coachella Valley (Fig. 2). The Coachella Valley section is the most poorly understood major part of the San Andreas fault with regard to slip rate and timing of past large-magnitude earthquakes, and therefore earthquake hazard. No large earthquakes have occurred for more than three centuries, the longest elapsed time for any part of the southern San Andreas fault. In spite of this, the Working Group on California Earthquake Probabilities (1995) assigned the lowest 30-year conditional probability on the southern San Andreas fault to the Coachella Valley. Models of the behavior of this part of the fault, however, have been based on very limited geologic data. The Thousand Palms Oasis is an attractive location for paleoseismic study primarily because of the well-bedded late Holocene sedimentary deposits with abundant layers of organic matter for radiocarbon dating necessary to constrain the timing of large prehistoric earthquakes. Previous attempts to develop a chronology of paleoearthquakes for the region have been hindered by the scarcity of in-situ 14C-dateable material for age control in this desert environment. Also, the fault in the vicinity of Thousand Palms Oasis consists of a single trace that is well expressed, both geomorphically and as a vegetation lineament (Figs. 2, 3). Results of our investigations are discussed in Fumal et al. (2002) and indicate that four and probably five surface-rupturing earthquakes occurred along this part of the fault during the past 1200 years. The average recurrence time for these earthquakes is 215 ± 25 years, although interevent times may have been as short as a few decades or as long as 400 years. Thus, although the elapsed

  4. Children and the San Fernando earthquake

    USGS Publications Warehouse

    Howard, S. J.

    1980-01-01

Before dawn, on February 9, 1971, a magnitude 6.4 earthquake occurred in the San Fernando Valley of California. On the following day, the San Fernando Valley Child Guidance Clinic, through radio and newspapers, offered mental health crisis services to children frightened by the earthquake. Response to this invitation was immediate and almost overwhelming. During the first 2 weeks, the Clinic's staff counseled hundreds of children who were experiencing various degrees of anxiety.

  5. Precisely locating the Klamath Falls, Oregon, earthquakes

    USGS Publications Warehouse

    Qamar, A.; Meagher, K.L.

    1993-01-01

In this article we present preliminary results of a close-in, instrumental study of the Klamath Falls earthquake sequence, carried out as a cooperative effort by scientists from the U.S. Geological Survey (USGS) and universities in Washington, Oregon, and California. In addition to obtaining much more accurate earthquake locations, this study has improved our understanding of the relationship between seismicity and mapped faults in the region.

  6. Earthquake Early Warning: A Prospective User's Perspective (Invited)

    NASA Astrophysics Data System (ADS)

    Nishenko, S. P.; Savage, W. U.; Johnson, T.

    2009-12-01

    With more than 25 million people at risk from high hazard faults in California alone, Earthquake Early Warning (EEW) presents a promising public safety and emergency response tool. EEW represents the real-time end of an earthquake information spectrum which also includes near real-time notifications of earthquake location, magnitude, and shaking levels; as well as geographic information system (GIS)-based products for compiling and visually displaying processed earthquake data such as ShakeMap and ShakeCast. Improvements to and increased multi-national implementation of EEW have stimulated interest in how such information products could be used in the future. Lifeline organizations, consisting of utilities and transportation systems, can use both onsite and regional EEW information as part of their risk management and public safety programs. Regional EEW information can provide improved situational awareness to system operators before automatic system protection devices activate, and allow trained personnel to take precautionary measures. On-site EEW is used for earthquake-actuated automatic gas shutoff valves, triggered garage door openers at fire stations, system controls, etc. While there is no public policy framework for preemptive, precautionary electricity or gas service shutdowns by utilities in the United States, gas shut-off devices are being required at the building owner level by some local governments. In the transportation sector, high-speed rail systems have already demonstrated the ‘proof of concept’ for EEW in several countries, and more EEW systems are being installed. Recently the Bay Area Rapid Transit District (BART) began collaborating with the California Integrated Seismic Network (CISN) and others to assess the potential benefits of EEW technology to mass transit operations and emergency response in the San Francisco Bay region. A key issue in this assessment is that significant earthquakes are likely to occur close to or within the BART

  7. Local Public Health System Response to the Tsunami Threat in Coastal California following the Tōhoku Earthquake

    PubMed Central

    Hunter, Jennifer C.; Crawley, Adam W.; Petrie, Michael; Yang, Jane E.; Aragón, Tomás J.

    2012-01-01

    Background On Friday March 11, 2011 a 9.0 magnitude earthquake triggered a tsunami off the eastern coast of Japan, resulting in thousands of lives lost and billions of dollars in damage around the Pacific Rim. The tsunami first reached the California coast on Friday, March 11th, causing more than $70 million in damage and at least one death. While the tsunami’s impact on California pales in comparison to the destruction caused in Japan and other areas of the Pacific, the event tested emergency responders’ ability to rapidly communicate and coordinate a response to a potential threat. Methods To evaluate the local public health system emergency response to the tsunami threat in California, we surveyed all local public health, emergency medical services (EMS), and emergency management agencies in coastal or floodplain counties about several domains related to the tsunami threat in California, including: (1) the extent to which their community was affected by the tsunami, (2) when and how they received notification of the event, (3) which public health response activities were carried out to address the tsunami threat in their community, and (4) which organizations contributed to the response. Public health activities were characterized using the Centers for Disease Control and Prevention (CDC) Public Health Preparedness Capabilities (PHEP) framework. Findings The tsunami's impact on coastal communities in California ranged widely, both in terms of the economic consequences and the response activities. Based on estimates from the National Oceanic and Atmospheric Administration (NOAA), ten jurisdictions in California reported tsunami-related damage, which ranged from $15,000 to $35 million. Respondents first became aware of the tsunami threat in California between the hours of 10:00pm Pacific Standard Time (PST) on Thursday March 10th and 2:00pm PST on Friday March 11th, a range of 16 hours, with notification occurring through both formal and informal channels. In

  8. Local Public Health System Response to the Tsunami Threat in Coastal California following the Tōhoku Earthquake.

    PubMed

    Hunter, Jennifer C; Crawley, Adam W; Petrie, Michael; Yang, Jane E; Aragón, Tomás J

    2012-07-16

    Background On Friday March 11, 2011 a 9.0 magnitude earthquake triggered a tsunami off the eastern coast of Japan, resulting in thousands of lives lost and billions of dollars in damage around the Pacific Rim. The tsunami first reached the California coast on Friday, March 11th, causing more than $70 million in damage and at least one death. While the tsunami's impact on California pales in comparison to the destruction caused in Japan and other areas of the Pacific, the event tested emergency responders' ability to rapidly communicate and coordinate a response to a potential threat. Methods To evaluate the local public health system emergency response to the tsunami threat in California, we surveyed all local public health, emergency medical services (EMS), and emergency management agencies in coastal or floodplain counties about several domains related to the tsunami threat in California, including: (1) the extent to which their community was affected by the tsunami, (2) when and how they received notification of the event, (3) which public health response activities were carried out to address the tsunami threat in their community, and (4) which organizations contributed to the response. Public health activities were characterized using the Centers for Disease Control and Prevention (CDC) Public Health Preparedness Capabilities (PHEP) framework. Findings The tsunami's impact on coastal communities in California ranged widely, both in terms of the economic consequences and the response activities. Based on estimates from the National Oceanic and Atmospheric Administration (NOAA), ten jurisdictions in California reported tsunami-related damage, which ranged from $15,000 to $35 million. Respondents first became aware of the tsunami threat in California between the hours of 10:00pm Pacific Standard Time (PST) on Thursday March 10th and 2:00pm PST on Friday March 11th, a range of 16 hours, with notification occurring through both formal and informal channels. In

  9. The effects of earthquake measurement concepts and magnitude anchoring on individuals' perceptions of earthquake risk

    USGS Publications Warehouse

    Celsi, R.; Wolfinbarger, M.; Wald, D.

    2005-01-01

    The purpose of this research is to explore earthquake risk perceptions in California. Specifically, we examine the risk beliefs, feelings, and experiences of lay, professional, and expert individuals to explore how risk is perceived and how risk perceptions are formed relative to earthquakes. Our results indicate that individuals tend to perceptually underestimate the degree that earthquake (EQ) events may affect them. This occurs in large part because individuals' personal felt experience of EQ events is generally overestimated relative to experienced magnitudes. An important finding is that individuals engage in a process of "cognitive anchoring" of their felt EQ experience toward the reported earthquake magnitude. The anchoring effect is moderated by the degree to which individuals comprehend EQ magnitude measurement and EQ attenuation. Overall, the results of this research provide us with a deeper understanding of EQ risk perceptions, especially as they relate to individuals' understanding of EQ measurement and attenuation concepts. © 2005, Earthquake Engineering Research Institute.

  10. Centrality in earthquake multiplex networks

    NASA Astrophysics Data System (ADS)

    Lotfi, Nastaran; Darooneh, Amir Hossein; Rodrigues, Francisco A.

    2018-06-01

    Seismic time series have been mapped as complex networks, where a geographical region is divided into square cells that represent the nodes and connections are defined according to the sequence of earthquakes. In this paper, we map a seismic time series to a temporal network, described by a multiplex network, and characterize the evolution of the network structure in terms of the eigenvector centrality measure. We generalize previous works that considered the single-layer representation of earthquake networks. Our results suggest that the multiplex representation captures earthquake activity better than methods based on single-layer networks. We also verify that the regions with the highest seismological activity in Iran and California can be identified from the network centrality analysis. The temporal modeling of seismic data provided here may open new possibilities for a better comprehension of the physics of earthquakes.
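
    As an illustrative sketch (not the authors' code), eigenvector centrality for a single network layer can be computed by power iteration on the adjacency matrix; in the multiplex setting the same measure is evaluated across layers. The 4-cell grid below is hypothetical.

```python
def eigenvector_centrality(adj, iters=200):
    """Power iteration on an adjacency matrix given as a list of lists."""
    n = len(adj)
    x = [1.0 / n] * n
    for _ in range(iters):
        y = [sum(adj[i][j] * x[j] for j in range(n)) for i in range(n)]
        norm = sum(v * v for v in y) ** 0.5
        x = [v / norm for v in y]
    return x

# Hypothetical 4-cell region: consecutive earthquakes repeatedly pass
# through cell 1 (the hub); cells 2 and 3 are also linked to each other.
adj = [
    [0, 1, 0, 0],
    [1, 0, 1, 1],
    [0, 1, 0, 1],
    [0, 1, 1, 0],
]
c = eigenvector_centrality(adj)
assert c[1] == max(c)  # the hub cell is the most central
```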

  11. Evidence for large earthquakes on the San Andreas fault at the Wrightwood, California paleoseismic site: A.D. 500 to present

    USGS Publications Warehouse

    Fumal, T.E.; Weldon, R.J.; Biasi, G.P.; Dawson, T.E.; Seitz, G.G.; Frost, W.T.; Schwartz, D.P.

    2002-01-01

    We present structural and stratigraphic evidence from a paleoseismic site near Wrightwood, California, for 14 large earthquakes that occurred on the southern San Andreas fault during the past 1500 years. In a network of 38 trenches and creek-bank exposures, we have exposed a composite section of interbedded debris flow deposits and thin peat layers more than 24 m thick; fluvial deposits occur along the northern margin of the site. The site is a 150-m-wide zone of deformation bounded on the surface by a main fault zone along the northwest margin and a secondary fault zone to the southwest. Evidence for most of the 14 earthquakes occurs along structures within both zones. We identify paleoearthquake horizons using infilled fissures, scarps, multiple rupture terminations, and widespread folding and tilting of beds. Ages of stratigraphic units and earthquakes are constrained by historic data and 72 radiocarbon (14C) ages, mostly from samples of peat and some from plant fibers, wood, pine cones, and charcoal. Comparison of the long, well-resolved paleoseismic record at Wrightwood with records at other sites along the fault indicates that rupture lengths of past earthquakes were at least 100 km. Paleoseismic records at sites in the Coachella Valley suggest that each of the past five large earthquakes recorded there ruptured the fault at least as far northwest as Wrightwood. Comparisons with event chronologies at Pallett Creek and sites to the northwest suggest that approximately the same part of the fault that ruptured in 1857 may also have failed in the early to mid-sixteenth century and several other times during the past 1200 years. Records at Pallett Creek and Pitman Canyon suggest that, in addition to the 14 earthquakes we document, one and possibly two other large earthquakes ruptured the part of the fault including Wrightwood since about A.D. 500. These observations and elapsed times that are significantly longer than mean recurrence intervals at Wrightwood and sites to

  12. Regional patterns of earthquake-triggered landslides and their relation to ground motion

    NASA Astrophysics Data System (ADS)

    Meunier, Patrick; Hovius, Niels; Haines, A. John

    2007-10-01

    We have documented patterns of landsliding associated with large earthquakes on three thrust faults: the Northridge earthquake in California, Chi-Chi earthquake in Taiwan, and two earthquakes on the Ramu-Markham fault bounding the Finisterre Mountains of Papua New Guinea. In each case, landslide densities are shown to be greatest in the area of strongest ground acceleration and to decay with distance from the epicenter. In California and Taiwan, the density of co-seismic landslides is linearly and highly correlated with both the vertical and horizontal components of measured peak ground acceleration. Based on this observation, we derive an expression for the spatial variation of landslide density analogous with regional seismic attenuation laws. In its general form, this expression applies to our three examples, and we determine best fit values for individual cases. Our findings open a window on the construction of shake maps from geomorphic observations for earthquakes in non-instrumented regions.
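
    The reported linear correlation between co-seismic landslide density and peak ground acceleration (PGA) can be illustrated with an ordinary least-squares fit. The PGA and density values below are invented for the sketch and are not the paper's data.

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

pga = [0.2, 0.4, 0.6, 0.8]       # peak ground acceleration in g (hypothetical)
density = [1.1, 2.0, 3.1, 3.9]   # landslides per km^2 (hypothetical)
slope, intercept = linear_fit(pga, density)
assert slope > 0  # density increases with ground acceleration
```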

  13. The impact of land ownership, firefighting, and reserve status on fire probability in California

    NASA Astrophysics Data System (ADS)

    Starrs, Carlin Frances; Butsic, Van; Stephens, Connor; Stewart, William

    2018-03-01

    The extent of wildfires in the western United States is increasing, but how land ownership, firefighting, and reserve status influence fire probability is unclear. California serves as a unique natural experiment to estimate the impact of these factors, as ownership is split equally between federal and non-federal landowners; there is a relatively large proportion of reserved lands where extractive uses are prohibited and fire suppression is limited; and land ownership and firefighting responsibility are purposefully not always aligned. Panel Poisson regression techniques and pre-regression matching were used to model changes in annual fire probability from 1950-2015 on reserve and non-reserve lands on federal and non-federal ownerships across four vegetation types: forests, rangelands, shrublands, and forests without commercial species. Fire probability was found to have increased over time across all 32 categories. A marginal effects analysis showed that federal ownership and firefighting were associated with increased fire probability, and that the difference in fire probability on federal versus non-federal lands is increasing over time. Ownership, firefighting, and reserve status played roughly equal roles in determining fire probability, and were found to have much greater influence than average maximum temperature (°C) during summer months (June, July, August), average annual precipitation (cm), and average annual topsoil moisture content by volume, demonstrating the critical role these factors play in western fire regimes and the importance of including them in future analyses focused on understanding and predicting wildfire in the western United States.

  14. The relationship between earthquake exposure and posttraumatic stress disorder in 2013 Lushan earthquake

    NASA Astrophysics Data System (ADS)

    Wang, Yan; Lu, Yi

    2018-01-01

    The objective of this study is to explore the relationship between earthquake exposure and the incidence of PTSD. A stratified random sample survey was conducted to collect data in the Longmenshan thrust fault area three years after the Lushan earthquake. We used the Children's Revised Impact of Event Scale (CRIES-13) and the Earthquake Experience Scale. Subjects in this study included 3944 student survivors in eleven local schools. The prevalence of probable PTSD was relatively higher among those who were trapped in the earthquake, were injured in the earthquake, or had relatives who died in the earthquake. We conclude that researchers need to pay more attention to children and adolescents, and that the government should pay more attention to these groups and provide more economic support.

  15. Inferred Rheology and Petrology of Southern California and Northwest Mexico Mantle from Postseismic Deformation following the 2010 El Mayor-Cucapah Earthquake

    NASA Astrophysics Data System (ADS)

    Freed, A. M.; Dickinson, H.; Huang, M. H.; Fielding, E. J.; Burgmann, R.; Andronicos, C.

    2015-12-01

    The Mw 7.2 El Mayor-Cucapah (EMC) earthquake ruptured a ~120 km long series of faults striking northwest from the Gulf of California to the Sierra Cucapah. Five years after the EMC event, a dense network of GPS stations in southern California and a sparse array of sites installed after the earthquake in northern Mexico measure ongoing surface deformation as coseismic stresses relax. We use 3D finite element models of seismically inferred crustal and mantle structure with earthquake slip constrained by GPS, InSAR range change and SAR and SPOT image sub-pixel offset measurements to infer the rheologic structure of the region. Model complexity, including 3D Moho structure and distinct geologic regions such as the Peninsular Ranges and Salton Trough, enable us to explore vertical and lateral heterogeneities of crustal and mantle rheology. We find that postseismic displacements can be explained by relaxation of a laterally varying, stratified rheologic structure controlled by temperature and crustal thickness. In the Salton Trough region, particularly large postseismic displacements require a relatively weak mantle column that weakens with depth, consistent with a strong but thin (22 km thick) crust and high regional temperatures. In contrast, beneath the neighboring Peninsular Ranges a strong, thick (up to 35 km) crust and cooler temperatures lead to a rheologically stronger mantle column. Thus, we find that the inferred rheologic structure corresponds with observed seismic structure and thermal variations. Significant afterslip is not required to explain postseismic displacements, but cannot be ruled out. Combined with isochemical phase diagrams, our results enable us to go beyond rheologic structure and infer some basic properties about the regional mantle, including composition, water content, and the degree of partial melting.

  16. Configurational entropy of critical earthquake populations

    NASA Astrophysics Data System (ADS)

    Goltz, C.; Böse, M.

    2002-10-01

    We present an approach to describe the evolution of distributed seismicity by configurational entropy. We demonstrate the detection of phase transitions in the sense of a critical point phenomenon in a 2D site-percolation model and in temporal and spatial vicinity to the 1992, M7.3 Landers earthquake in Southern California. Our findings support the assumption of intermittent criticality in the Earth's crust. We also address the potential usefulness of the method for earthquake catalogue declustering.
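
    A minimal sketch of one way to quantify configurational entropy over a gridded region (a Shannon-type definition over grid-cell occupation probabilities is assumed here; it is not necessarily the authors' exact formulation). Clustered seismicity yields lower entropy than uniformly spread seismicity.

```python
import math

def configurational_entropy(counts):
    """Shannon entropy S = -sum p_i ln p_i over grid-cell event counts."""
    total = sum(counts)
    ps = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in ps)

# Hypothetical event counts in four cells:
clustered = [97, 1, 1, 1]      # almost all events in one cell
uniform = [25, 25, 25, 25]     # events spread evenly

assert configurational_entropy(clustered) < configurational_entropy(uniform)
# Uniform occupation gives the maximum entropy, ln(number of cells).
assert abs(configurational_entropy(uniform) - math.log(4)) < 1e-12
```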

  17. Earthquake triggering by seismic waves following the Landers and Hector Mine earthquakes

    USGS Publications Warehouse

    Gomberg, J.; Reasenberg, P.A.; Bodin, P.; Harris, R.A.

    2001-01-01

    The proximity and similarity of the 1992, magnitude 7.3 Landers and 1999, magnitude 7.1 Hector Mine earthquakes in California permit testing of earthquake triggering hypotheses not previously possible. The Hector Mine earthquake confirmed inferences that transient, oscillatory 'dynamic' deformations radiated as seismic waves can trigger seismicity rate increases, as proposed for the Landers earthquake. Here we quantify the spatial and temporal patterns of the seismicity rate changes. The seismicity rate increase was to the north for the Landers earthquake and primarily to the south for the Hector Mine earthquake. We suggest that rupture directivity results in elevated dynamic deformations north and south of the Landers and Hector Mine faults, respectively, as evident in the asymmetry of the recorded seismic velocity fields. Both dynamic and static stress changes seem important for triggering in the near field with dynamic stress changes dominating at greater distances. Peak seismic velocities recorded for each earthquake suggest the existence of, and place bounds on, dynamic triggering thresholds. These thresholds vary from a few tenths to a few MPa in most places, depend on local conditions, and exceed inferred static thresholds by more than an order of magnitude. At some sites, the onset of triggering was delayed until after the dynamic deformations subsided. Physical mechanisms consistent with all these observations may be similar to those that give rise to liquefaction or cyclic fatigue.

  18. Re-Evaluation of Event Correlations in Virtual California Using Statistical Analysis

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Heflin, M. B.; Granat, R. A.; Yikilmaz, M. B.; Heien, E.; Rundle, J.; Donnellan, A.

    2010-12-01

    Fusing the results of simulation tools with statistical analysis methods has contributed to our better understanding of the earthquake process. In a previous study, we used a statistical method to investigate emergent phenomena in data produced by the Virtual California earthquake simulator. The analysis indicated that there were some interesting fault interactions and possible triggering and quiescence relationships between events. We have converted the original code from Matlab to python/C++ and are now evaluating data from the most recent version of Virtual California in order to analyze and compare any new behavior exhibited by the model. The Virtual California earthquake simulator can be used to study fault and stress interaction scenarios for realistic California earthquakes. The simulation generates a synthetic earthquake catalog of events with a minimum size of ~M 5.8 that can be evaluated using statistical analysis methods. Virtual California utilizes realistic fault geometries and a simple Amontons-Coulomb stick-slip friction law in order to drive the earthquake process by means of a back-slip model, where loading of each segment occurs due to the accumulation of a slip deficit at the prescribed slip rate of the segment. Like any complex system, Virtual California may generate emergent phenomena unexpected even by its designers. In order to investigate this, we have developed a statistical method that analyzes the interaction between Virtual California fault elements and thereby determines whether events on any given fault elements show correlated behavior. Our method examines events on one fault element and then determines whether there is an associated event within a specified time window on a second fault element. Note that an event in our analysis is defined as any time an element slips, rather than any particular “earthquake” along the entire fault length. Results are then tabulated and differenced with an expected correlation
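
    The windowed-association count described above can be sketched as follows (an assumed minimal implementation, with hypothetical slip times; the real analysis then compares such counts against the expectation for uncorrelated events).

```python
def associated_count(times_a, times_b, window):
    """Number of slip events on element A followed by a slip event on
    element B within `window` time units."""
    return sum(
        1 for ta in times_a
        if any(0 <= tb - ta <= window for tb in times_b)
    )

# Hypothetical slip times (arbitrary units) on two fault elements:
times_a = [10.0, 50.0, 90.0]
times_b = [12.0, 55.0, 200.0]

# Events at 10.0 and 50.0 on A are each followed within 6 units by an
# event on B; the event at 90.0 is not.
assert associated_count(times_a, times_b, window=6.0) == 2
```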

  19. Effects of Strike-Slip Fault Segmentation on Earthquake Energy and Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Madden, E. H.; Cooke, M. L.; Savage, H. M.; McBeck, J.

    2014-12-01

    Many major strike-slip faults are segmented along strike, including those along plate boundaries in California and Turkey. Failure of distinct fault segments at depth may be the source of multiple pulses of seismic radiation observed for single earthquakes. However, how and when segmentation affects fault behavior and energy release is the basis of many outstanding questions related to the physics of faulting and seismic hazard. These include the probability for a single earthquake to rupture multiple fault segments and the effects of segmentation on earthquake magnitude, radiated seismic energy, and ground motions. Using numerical models, we quantify components of the earthquake energy budget, including the tectonic work acting externally on the system, the energy of internal rock strain, the energy required to overcome fault strength and initiate slip, the energy required to overcome frictional resistance during slip, and the radiated seismic energy. We compare the energy budgets of systems of two en echelon fault segments with various spacing that include both releasing and restraining steps. First, we allow the fault segments to fail simultaneously and capture the effects of segmentation geometry on the earthquake energy budget and on the efficiency with which applied displacement is accommodated. Assuming that higher efficiency correlates with higher probability for a single, larger earthquake, this approach has utility for assessing the seismic hazard of segmented faults. Second, we nucleate slip along a weak portion of one fault segment and let the quasi-static rupture propagate across the system. Allowing fractures to form near faults in these models shows that damage develops within releasing steps and promotes slip along the second fault, while damage develops outside of restraining steps and can prohibit slip along the second fault. 
Work is consumed in both the propagation of and frictional slip along these new fractures, impacting the energy available

  20. Borehole strainmeter measurements spanning the 2014, Mw6.0 South Napa Earthquake, California: The effect from instrument calibration

    USGS Publications Warehouse

    Langbein, John O.

    2015-01-01

    The 24 August 2014 Mw6.0 South Napa, California earthquake produced significant offsets on 12 borehole strainmeters in the San Francisco Bay area. These strainmeters are located between 24 and 80 km from the source, and the observed offsets ranged up to 400 parts per billion (ppb), which exceeds their nominal precision by a factor of 100. However, the observed offsets of tidally calibrated strains differ by up to 130 ppb from predictions based on a moment tensor derived from seismic data. The large misfit can be attributed to a combination of poor instrument calibration and limitations of the point-source model of the earthquake. Borehole strainmeters require in-situ calibration, which historically has been accomplished by comparing their measurements of Earth tides with the strain tides predicted by a model. Although a borehole strainmeter accurately measures the deformation within the borehole, the long-wavelength strain signals from tides or other tectonic processes recorded in the borehole are modified by the presence of the borehole and the elastic properties of the grout and the instrument. Previous analyses of surface-mounted strainmeter data and their relationship with the predicted tides suggest that tidal models could be in error by 30%. The poor fit of the borehole strainmeter data from this earthquake can be improved by simultaneously varying the components of the model tides by up to 30% and making small adjustments to the point-source model of the earthquake, which reduces the RMS misfit from 130 ppb to 18 ppb. This suggests that relying on tidal models to calibrate borehole strainmeters significantly reduces their accuracy.

  1. Late Holocene slip rate and ages of prehistoric earthquakes along the Maacama Fault near Willits, Mendocino County, northern California

    USGS Publications Warehouse

    Prentice, Carol S.; Larsen, Martin C.; Kelsey, Harvey M.; Zachariasen, Judith

    2014-01-01

    The Maacama fault is the northward continuation of the Hayward–Rodgers Creek fault system and creeps at a rate of 5.7±0.1 mm/yr (averaged over the last 20 years) in Willits, California. Our paleoseismic studies at Haehl Creek suggest that the Maacama fault has produced infrequent large earthquakes in addition to creep. Fault terminations observed in several excavations provide evidence that a prehistoric surface‐rupturing earthquake occurred between 1060 and 1180 calibrated years (cal) B.P. at the Haehl Creek site. A folding event, which we attribute to a more recent large earthquake, occurred between 790 and 1060 cal B.P. In the last 560–690 years, a buried channel deposit has been offset 4.6±0.2 m, giving an average slip rate of 6.4–8.6 mm/yr, which is higher than the creep rate over the last 20 years. The difference between this slip rate and the creep rate suggests that coseismic slip up to 1.7 m could have occurred after the formation of the channel deposit and could be due to a paleoearthquake known from paleoseismic studies in the Ukiah Valley, about 25 km to the southeast. Therefore, we infer that at least two, and possibly three, large earthquakes have occurred at the Haehl Creek site since 1180 cal B.P. (770 C.E.), consistent with earlier studies suggesting infrequent, large earthquakes on the Maacama fault. The short‐term geodetic slip rate across the Maacama fault zone is approximately twice the slip rate that we have documented at the Haehl Creek site, which is averaged over the last approximately 600 years. If the geodetic rate represents the long‐term slip accumulation across the fault zone, then we infer that, in the last ∼1200 years, additional earthquakes may have occurred either on the Haehl Creek segment of the Maacama fault or on other active faults within the Maacama fault zone at this latitude.
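
    The slip-rate range quoted above follows from simple interval arithmetic on the offset and age bounds, which can be checked directly:

```python
def slip_rate_range(offset_m, offset_err_m, age_min_yr, age_max_yr):
    """Return (min, max) slip rate in mm/yr from an offset (with
    uncertainty) accumulated over a bracketed age range."""
    lo = (offset_m - offset_err_m) * 1000.0 / age_max_yr
    hi = (offset_m + offset_err_m) * 1000.0 / age_min_yr
    return lo, hi

# 4.6 +/- 0.2 m of offset over 560-690 years:
lo, hi = slip_rate_range(4.6, 0.2, 560, 690)
assert round(lo, 1) == 6.4 and round(hi, 1) == 8.6  # the quoted 6.4-8.6 mm/yr
```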

  2. Triggering of the Ms = 5.4 Little Skull Mountain, Nevada, earthquake with dynamic strains

    USGS Publications Warehouse

    Gomberg, Joan; Bodin, Paul

    1994-01-01

    We have developed an approach to test the viability of dynamic strains as a triggering mechanism by quantifying the dynamic strain tensor at seismogenic depths. We focus on the dynamic strains at the hypocenter of the Ms = 5.4 Little Skull Mountain (LSM), Nevada, earthquake. This event is noteworthy because it is the largest earthquake demonstrably triggered at remote distances (∼280 km) by the Ms = 7.4 Landers, California, earthquake and because of its ambiguous association with magmatic activity. Our analysis shows that, if dynamic strains initiate remote triggering, the orientation and modes of faulting most favorable for being triggered by a given strain transient change with depth. The geometry of the most probable LSM fault plane was favorably oriented with respect to the geometry of the dynamic strain tensor. We estimate that the magnitude of the peak dynamic strains at the hypocentral depth of the LSM earthquake was ∼4 μstrain (∼0.2 MPa), which is ∼50% smaller than estimates from velocity seismograms recorded at the surface. We suggest that these strains are too small to cause Mohr-Coulomb style failure unless the fault was prestrained to near-failure levels, the fault was exceptionally weak, and/or the dynamic strains trigger other processes that lead to failure.
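
    The quoted strain-to-stress conversion can be reproduced with a back-of-the-envelope calculation. The shear modulus of ~50 GPa below is our assumption, chosen so that ~4 μstrain maps to the quoted ~0.2 MPa; a softer crustal rigidity would give a proportionally smaller stress.

```python
SHEAR_MODULUS_PA = 50e9   # assumed crustal rigidity (our choice, not the paper's)
strain = 4e-6             # ~4 microstrain at hypocentral depth

# stress ~ rigidity * strain, converted from Pa to MPa
stress_mpa = SHEAR_MODULUS_PA * strain / 1e6
assert abs(stress_mpa - 0.2) < 1e-9  # matches the quoted ~0.2 MPa
```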

  3. Crustal earthquake triggering by pre-historic great earthquakes on subduction zone thrusts

    USGS Publications Warehouse

    Sherrod, Brian; Gomberg, Joan

    2014-01-01

    Triggering of earthquakes on upper-plate faults during and shortly after recent great (M>8.0) subduction thrust earthquakes raises concerns about earthquake triggering following Cascadia subduction zone earthquakes. Of particular regard to Cascadia was the previously noted, but only qualitatively identified, clustering of M>~6.5 crustal earthquakes in the Puget Sound region between about 1200–900 cal yr B.P. and the possibility that this cluster was triggered by a great Cascadia subduction thrust earthquake, and therefore portends future such clusters. We confirm quantitatively the extraordinary nature of the Puget Sound region crustal earthquake clustering between 1200–900 cal yr B.P., over at least the last 16,000 years. We conclude that this cluster was not triggered by the penultimate, and possibly full-margin, great Cascadia subduction thrust earthquake. However, we also show that the paleoseismic record for Cascadia is consistent with the conclusions of our companion study of the global modern record outside Cascadia: M>8.6 subduction thrust events have a high probability of triggering one or more M>~6.5 crustal earthquakes.

  4. Creating a Global Building Inventory for Earthquake Loss Assessment and Risk Management

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.

    2008-01-01

    contribution of building stock, its relative vulnerability, and its distribution are vital components for determining the extent of casualties during an earthquake. It is evident from large deadly historical earthquakes that the distribution of vulnerable structures and their occupancy level during an earthquake control the severity of human losses. For example, though the number of strong earthquakes in California is comparable to that of Iran, the total earthquake-related casualties in California during the last 100 years are dramatically lower than the casualties from several individual Iranian earthquakes. The relatively low casualty count in California is attributed mainly to the fact that more than 90 percent of the building stock in California is made of wood and is designed to withstand moderate to large earthquakes (Kircher, Seligson and others, 2006). In contrast, the building stock in Iran, 80 percent of which is adobe and/or non-engineered masonry with poor lateral-load-resisting systems, succumbs even at moderate levels of ground shaking. Consequently, the heavy death toll of the 2003 Bam, Iran earthquake, which claimed 31,828 lives (Ghafory-Ashtiany and Mousavi, 2005), is directly attributable to such poorly resistant construction, and future events will produce comparable losses unless practices change. Similarly, multistory, precast-concrete framed buildings caused heavy casualties in the 1988 Spitak, Armenia earthquake (Bertero, 1989); weaker masonry and reinforced-concrete framed construction designed for gravity loads with soft first stories dominated losses in the Bhuj, India earthquake of 2001 (Madabhushi and Haigh, 2005); and adobe and weak masonry dwellings in Peru controlled the death toll in the Peru earthquake of 2007 (Taucer, J. and others, 2007). 
Spence (2007), after conducting a brief survey of the most lethal earthquakes since 1960, found that building collapse remains a major cause of earthquake mortality and unreinforced masonry buildings are one of the mos

  5. Studies of earthquakes stress drops, seismic scattering, and dynamic triggering in North America

    NASA Astrophysics Data System (ADS)

    Escudero Ayala, Christian Rene

    I use the Relative Source Time Function (RSTF) method to determine the source properties of earthquakes within southeastern Alaska-northwestern Canada in the first part of the project, and earthquakes within the Denali fault in the second part. I deconvolve a small event's P-arrival signal from a larger event's by the following method: select arrivals with a tapered cosine window, apply a fast Fourier transform to obtain the spectrum, apply the water-level deconvolution technique, and bandpass filter before inverse transforming the result to obtain the RSTF. I compare the source processes of earthquakes within the area to determine stress drop differences and their relation to the tectonic setting of the earthquake locations. Results show consistency with previous work: stress drop is independent of moment, implying self-similarity; stress drop correlates with tectonic regime; stress drop is independent of depth; stress drop depends on focal mechanism, with strike-slip events showing larger stress drops; and stress drop decreases as a function of time. I determine seismic wave attenuation in the central-western United States using coda waves. I select approximately 40 moderate earthquakes (magnitude between 5.5 and 6.5) located along the California-Baja California, California-Nevada, Eastern Idaho, Gulf of California, Hebgen Lake, Montana, Nevada, New Mexico, off coast of Northern California, off coast of Oregon, southern California, southern Illinois, Vancouver Island, Washington, and Wyoming regions. These events were recorded by the EarthScope transportable array (TA) network from 2005 to 2009. We obtain the data from the Incorporated Research Institutions for Seismology (IRIS). In this study we implement a method, based on the assumption that coda waves are single backscattered waves from randomly distributed heterogeneities, to calculate the coda Q. The frequencies studied lie between 1 and 15 Hz. 
The scattering attenuation is calculated for frequency bands centered
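
    The single-backscattering coda-Q estimate referred to above can be sketched as a linear fit. The Aki-Chouet envelope form A(t) ~ C · t⁻¹ · exp(−π f t / Q) is assumed here, so ln(A·t) decays linearly in lapse time t with slope −π f / Q; the synthetic envelope below is generated, not real data.

```python
import math

def coda_q(times, amps, freq):
    """Estimate coda Q from envelope amplitudes via the slope of
    ln(A * t) versus lapse time t (single-backscattering model)."""
    ys = [math.log(a * t) for t, a in zip(times, amps)]
    n = len(times)
    mt, my = sum(times) / n, sum(ys) / n
    slope = sum((t - mt) * (y - my) for t, y in zip(times, ys)) / \
            sum((t - mt) ** 2 for t in times)
    return -math.pi * freq / slope

# Synthetic coda envelope at 2 Hz generated with Q = 300; the fit
# should recover that value.
f, q_true = 2.0, 300.0
times = [20.0, 30.0, 40.0, 50.0, 60.0]   # lapse times, s
amps = [math.exp(-math.pi * f * t / q_true) / t for t in times]
assert abs(coda_q(times, amps, f) - q_true) < 1e-6
```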

  6. Rapid finite-fault inversions in Southern California using Cybershake Green's functions

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Polet, J.

    2017-12-01

    We have developed a system for rapid finite fault inversion for intermediate and large Southern California earthquakes using local, regional and teleseismic seismic waveforms as well as geodetic data. For modeling the local seismic data, we use 3D Green's functions from the Cybershake project, which were made available to us courtesy of the Southern California Earthquake Center (SCEC). The use of 3D Green's functions allows us to extend the inversion to higher frequency waveform data and smaller magnitude earthquakes, in addition to achieving improved solutions in general. The ultimate aim of this work is to develop the ability to provide high quality finite fault models within a few hours after any damaging earthquake in Southern California, so that they may be used as input to various post-earthquake assessment tools such as ShakeMap, as well as by the scientific community and other interested parties. Additionally, a systematic determination of finite fault models has value as a resource for scientific studies on detailed earthquake processes, such as rupture dynamics and scaling relations. We are using an established least-squares finite fault inversion method that has been applied extensively to both large and smaller regional earthquakes, in conjunction with the 3D Green's functions, where available, as well as 1D Green's functions for areas for which the Cybershake library has not yet been developed. We are carrying out validation and calibration of this system using significant earthquakes that have occurred in the region over the last two decades, spanning a range of locations and magnitudes (5.4 and higher).

  7. A double-difference earthquake location algorithm: Method and application to the northern Hayward fault, California

    USGS Publications Warehouse

    Waldhauser, F.; Ellsworth, W.L.

    2000-01-01

    We have developed an efficient method to determine high-resolution hypocenter locations over large distances. The location method incorporates ordinary absolute travel-time measurements and/or cross-correlation P- and S-wave differential travel-time measurements. Residuals between observed and theoretical travel-time differences (or double-differences) are minimized for pairs of earthquakes at each station while linking together all observed event-station pairs. A least-squares solution is found by iteratively adjusting the vector difference between hypocentral pairs. The double-difference algorithm minimizes errors due to unmodeled velocity structure without the use of station corrections. Because catalog and cross-correlation data are combined into one system of equations, interevent distances within multiplets are determined to the accuracy of the cross-correlation data, while the relative locations between multiplets and uncorrelated events are simultaneously determined to the accuracy of the absolute travel-time data. Statistical resampling methods are used to estimate data accuracy and location errors. Uncertainties in double-difference locations are improved by more than an order of magnitude compared to catalog locations. The algorithm is tested, and its performance is demonstrated on two clusters of earthquakes located on the northern Hayward fault, California. There it collapses the diffuse catalog locations into sharp images of seismicity and reveals horizontal lineations of hypocenters that define narrow regions on the fault where stress is released by brittle failure.
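
    A minimal sketch of the double-difference residual at the heart of the algorithm (not the authors' code; a uniform velocity and 2D geometry are assumed for illustration): for an event pair (i, j) observed at station k, the residual is dr = (t_i − t_j)_obs − (t_i − t_j)_calc, which cancels travel-time error common to both ray paths.

```python
import math

V = 5.0  # assumed uniform P-wave velocity, km/s

def travel_time(event, station):
    """Straight-ray travel time in a uniform medium."""
    return math.dist(event, station) / V

def double_difference(obs_i, obs_j, loc_i, loc_j, station):
    """Residual between observed and calculated travel-time differences."""
    calc = travel_time(loc_i, station) - travel_time(loc_j, station)
    return (obs_i - obs_j) - calc

station = (30.0, 0.0)
true_i, true_j = (0.0, 0.0), (1.0, 0.0)  # hypothetical true epicenters, km
obs_i, obs_j = travel_time(true_i, station), travel_time(true_j, station)

# With catalog locations equal to the true ones, the residual vanishes;
# mislocating event j leaves a residual the inversion would minimize.
assert abs(double_difference(obs_i, obs_j, true_i, true_j, station)) < 1e-12
assert abs(double_difference(obs_i, obs_j, true_i, (2.0, 0.0), station)) > 0.1
```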

  8. Probing the mechanical properties of seismically active crust with space geodesy: Study of the coseismic deformation due to the 1992 Mw7.3 Landers (southern California) earthquake

    NASA Astrophysics Data System (ADS)

    Fialko, Yuri

    2004-03-01

    The coseismic deformation due to the 1992 Mw7.3 Landers earthquake, southern California, is investigated using synthetic aperture radar (SAR) and Global Positioning System (GPS) measurements. The ERS-1 satellite data from the ascending and descending orbits are used to generate contiguous maps of three orthogonal components (east, north, up) of the coseismic surface displacement field. The coseismic displacement field exhibits symmetries with respect to the rupture plane that are suggestive of a linear relationship between stress and strain in the crust. Interferometric synthetic aperture radar (InSAR) data show small-scale deformation on nearby faults of the Eastern California Shear Zone. Some of these faults (in particular, the Calico, Rodman, and Pinto Mountain faults) were also subsequently strained by the 1999 Mw7.1 Hector Mine earthquake. I test the hypothesis that the anomalous fault strain represents essentially an elastic response of kilometer-scale compliant fault zones to stressing by nearby earthquakes [, 2002]. The coseismic stress perturbations due to the Landers earthquake are computed using a slip model derived from inversions of the InSAR and GPS data. Calculations are performed for both homogeneous and transversely isotropic half-space models. The compliant zone model that best explains the deformation on the Calico and Pinto Mountain faults due to the Hector Mine earthquake successfully predicts the coseismic displacements on these faults induced by the Landers earthquake. Deformation on the Calico and Pinto Mountain faults implies about a factor of 2 reduction in the effective shear modulus within the ˜2 km wide fault zones. The depth extent of the low-rigidity zones is poorly constrained but is likely in excess of a few kilometers. The same type of structure is able to explain high gradients in the radar line of sight displacements observed on other faults adjacent to the Landers rupture. In particular, the Lenwood fault north of the Soggy

  9. Multi-Scale Structure and Earthquake Properties in the San Jacinto Fault Zone Area

    NASA Astrophysics Data System (ADS)

    Ben-Zion, Y.

    2014-12-01

    I review multi-scale multi-signal seismological results on structure and earthquake properties within and around the San Jacinto Fault Zone (SJFZ) in southern California. The results are based on data of the southern California and ANZA networks covering scales from a few km to over 100 km, additional near-fault seismometers and linear arrays with instrument spacing 25-50 m that cross the SJFZ at several locations, and a dense rectangular array with >1100 vertical-component nodes separated by 10-30 m centered on the fault. The structural studies utilize earthquake data to image the seismogenic sections and ambient noise to image the shallower structures. The earthquake studies use waveform inversions and additional time domain and spectral methods. We observe pronounced damage regions with low seismic velocities and anomalous Vp/Vs ratios around the fault, and clear velocity contrasts across various sections. The damage zones and velocity contrasts produce fault zone trapped and head waves at various locations, along with time delays, anisotropy and other signals. The damage zones follow a flower-shape with depth; in places with velocity contrast they are offset to the stiffer side at depth as expected for bimaterial ruptures with persistent propagation direction. Analysis of PGV and PGA indicates clear persistent directivity at given fault sections and overall motion amplification within several km around the fault. Clear temporal changes of velocities, probably involving primarily the shallow material, are observed in response to seasonal, earthquake and other loadings. Full source tensor properties of M>4 earthquakes in the complex trifurcation area include statistically-robust small isotropic component, likely reflecting dynamic generation of rock damage in the source volumes. The dense fault zone instruments record seismic "noise" at frequencies >200 Hz that can be used for imaging and monitoring the shallow material with high space and time details, and

  10. MMI attenuation and historical earthquakes in the basin and range province of western North America

    USGS Publications Warehouse

    Bakun, W.H.

    2006-01-01

    Earthquakes in central Nevada (1932-1959) were used to develop a modified Mercalli intensity (MMI) attenuation model for estimating moment magnitude M for earthquakes in the Basin and Range province of interior western North America. M is 7.4-7.5 for the 26 March 1872 Owens Valley, California, earthquake, in agreement with Beanland and Clark's (1994) M 7.6 that was estimated from geologic field observations. M is 7.5 for the 3 May 1887 Sonora, Mexico, earthquake, in agreement with Natali and Sbar's (1982) M 7.4 and Suter's (2006) M 7.5, both estimated from geologic field observations. MMI values at sites in California for earthquakes in the Nevada Basin and Range apparently are not much affected by the Sierra Nevada except at sites near the Sierra Nevada, where MMI is reduced. This reduction in MMI is consistent with a shadow zone produced by the root of the Sierra Nevada. In contrast, MMI assignments for earthquakes located in the eastern Sierra Nevada near the west margin of the Basin and Range are greater than predicted at sites in California. These higher MMI values may result from critical reflections due to layering near the base of the Sierra Nevada.

  11. Making the Handoff from Earthquake Hazard Assessments to Effective Mitigation Measures (Invited)

    NASA Astrophysics Data System (ADS)

    Applegate, D.

    2010-12-01

    of earthquake scientists and engineers. In addition to the national maps, the USGS produces more detailed urban seismic hazard maps that communities have used to prioritize retrofits and design critical infrastructure that can withstand large earthquakes. At a regional scale, the USGS and its partners in California have developed a time-dependent earthquake rupture forecast that is being used by the insurance sector, which can serve to distribute risk and foster mitigation if the right incentives are in place. What the USGS and partners are doing at the urban, regional, and national scales, the Global Earthquake Model project is seeking to do for the world. A significant challenge for engaging the public to prepare for earthquakes is making low-probability, high-consequence events real enough to merit personal action. Scenarios help by starting with the hazard posed by a specific earthquake and then exploring the fragility of the built environment, cascading failures, and the real-life consequences for the public. To generate such a complete picture takes multiple disciplines working together. Earthquake scenarios are being used both for emergency management exercises and much broader public preparedness efforts like the Great California ShakeOut, which engaged nearly 7 million people.

  12. Earthquakes: Risk, Monitoring, Notification, and Research

    DTIC Science & Technology

    2008-06-19

    Washington, Oregon, and Hawaii. The Rocky Mountain region, a portion of the central United States known as the New Madrid Seismic Zone, and portions...California, Washington, Oregon, and Alaska and Hawaii. Alaska is the most earthquake-prone state, experiencing a magnitude 7 earthquake almost every...Oakland, CA $349 23 Las Vegas, NV $28 4 San Francisco, CA $346 24 Anchorage, AK $25 5 San Jose, CA $243 25 Boston, MA $23 6 Orange, CA $214 26 Hilo, HI $20

  13. Detailed observations of California foreshock sequences: Implications for the earthquake initiation process

    USGS Publications Warehouse

    Dodge, D.A.; Beroza, G.C.; Ellsworth, W.L.

    1996-01-01

    We find that foreshocks provide clear evidence for an extended nucleation process before some earthquakes. In this study, we examine in detail the evolution of six California foreshock sequences: the 1986 Mount Lewis (ML = 5.5), the 1986 Chalfant (ML = 6.4), the 1986 Stone Canyon (ML = 4.7), the 1990 Upland (ML = 5.2), the 1992 Joshua Tree (MW = 6.1), and the 1992 Landers (MW = 7.3) sequences. Typically, uncertainties in hypocentral parameters are too large to establish the geometry of foreshock sequences and hence to understand their evolution. However, the similarity of location and focal mechanisms for the events in these sequences leads to similar foreshock waveforms that we cross correlate to obtain extremely accurate relative locations. We use these results to identify small-scale fault zone structures that could influence nucleation and to determine the stress evolution leading up to the mainshock. In general, these foreshock sequences are not compatible with a cascading failure nucleation model in which the foreshocks all occur on a single fault plane and trigger the mainshock by static stress transfer. Instead, the foreshocks seem to concentrate near structural discontinuities in the fault and may themselves be a product of an aseismic nucleation process. Fault zone heterogeneity may also be important in controlling the number of foreshocks, i.e., the stronger the heterogeneity, the greater the number of foreshocks. The size of the nucleation region, as measured by the extent of the foreshock sequence, appears to scale with mainshock moment in the same manner as determined independently by measurements of the seismic nucleation phase. We also find evidence for slip localization as predicted by some models of earthquake nucleation. Copyright 1996 by the American Geophysical Union.

  14. Earthquake Testing

    NASA Technical Reports Server (NTRS)

    1979-01-01

    During NASA's Apollo program, it was necessary to subject the mammoth Saturn V launch vehicle to extremely forceful vibrations to assure the moon booster's structural integrity in flight. Marshall Space Flight Center assigned vibration testing to a contractor, the Scientific Services and Systems Group of Wyle Laboratories, Norco, California. Wyle-3S, as the group is known, built a large facility at Huntsville, Alabama, and equipped it with an enormously forceful shock and vibration system to simulate the liftoff stresses the Saturn V would encounter. Saturn V is no longer in service, but Wyle-3S has found spinoff utility for its vibration facility. It is now being used to simulate earthquake effects on various kinds of equipment, principally equipment intended for use in nuclear power generation. Government regulations require that such equipment demonstrate its ability to survive earthquake conditions. In the upper left photo, Wyle-3S is preparing to conduct an earthquake test on a 25-ton diesel generator built by Atlas Polar Company, Ltd., Toronto, Canada, for emergency use in a Canadian nuclear power plant. Being readied for test in the lower left photo is a large circuit breaker to be used by Duke Power Company, Charlotte, North Carolina. Electro-hydraulic and electro-dynamic shakers in and around the pit simulate earthquake forces.

  15. Transport-related impacts of the Northridge Earthquake

    DOT National Transportation Integrated Search

    1998-05-01

    This research estimates the transport-related business interruption impacts of the 1994 Northridge earthquake using a spatial allocation model, SCPM (the Southern California Planning Model) and surveys of businesses and individuals. Total business in...

  16. Earthquakes May-June 1980.

    USGS Publications Warehouse

    Person, W.J.

    1981-01-01

    The months were seismically active, although the only major earthquake (magnitude 7.0-7.9) struck an unpopulated Philippine island. Mexico was struck by a magnitude 6.3 quake on June 9, killing at least two people. The most significant earthquake in the United States was in the Mammoth Lakes area of California. -from Author

  17. Credible occurrence probabilities for extreme geophysical events: earthquakes, volcanic eruptions, magnetic storms

    USGS Publications Warehouse

    Love, Jeffrey J.

    2012-01-01

    Statistical analysis is made of rare, extreme geophysical events recorded in historical data -- counting the number of events $k$ with sizes that exceed chosen thresholds during specific durations of time $\tau$. Under transformations that stabilize data and model-parameter variances, the most likely Poisson-event occurrence rate, $k/\tau$, applies for frequentist inference and, also, for Bayesian inference with a Jeffreys prior that ensures posterior invariance under changes of variables. Frequentist confidence intervals and Bayesian (Jeffreys) credibility intervals are approximately the same and easy to calculate: $(1/\tau)[(\sqrt{k} - z/2)^{2},(\sqrt{k} + z/2)^{2}]$, where $z$ is a parameter that specifies the width, $z=1$ ($z=2$) corresponding to $1\sigma$, $68.3\%$ ($2\sigma$, $95.4\%$). If only a few events have been observed, as is usually the case for extreme events, then these "error-bar" intervals might be considered to be relatively wide. From historical records, we estimate most likely long-term occurrence rates, 10-yr occurrence probabilities, and intervals of frequentist confidence and Bayesian credibility for large earthquakes, explosive volcanic eruptions, and magnetic storms.
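
The quoted interval is simple enough to compute directly. A minimal sketch under the abstract's Poisson model: the most likely rate $k/\tau$, the variance-stabilized interval, and a 10-yr occurrence probability $1 - e^{-10k/\tau}$. The event counts in the example are hypothetical, not the paper's data.

```python
import math

def poisson_rate_interval(k, tau, z=1.0):
    """Most likely rate k/tau and the interval
    (1/tau) * [(sqrt(k) - z/2)^2, (sqrt(k) + z/2)^2];
    z=1 gives roughly a 68.3% interval, z=2 roughly 95.4%."""
    lo = max(math.sqrt(k) - z / 2.0, 0.0) ** 2 / tau
    hi = (math.sqrt(k) + z / 2.0) ** 2 / tau
    return k / tau, (lo, hi)

def occurrence_probability(rate, t_years=10.0):
    """Probability of one or more events in t_years for a Poisson process."""
    return 1.0 - math.exp(-rate * t_years)

# Hypothetical record: k = 4 extreme events observed in tau = 200 years
rate, (lo, hi) = poisson_rate_interval(k=4, tau=200.0, z=2.0)
print(rate, (lo, hi))            # 0.02 events/yr, interval (0.005, 0.045)
print(occurrence_probability(rate))
```

The `max(..., 0.0)` clamp keeps the lower bound sensible when few or no events have been observed (e.g. $k = 0$ with $z = 2$), which is exactly the sparse-data regime the abstract emphasizes.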

  18. Marine geology and earthquake hazards of the San Pedro Shelf region, southern California

    USGS Publications Warehouse

    Fisher, Michael A.; Normark, William R.; Langenheim, V.E.; Calvert, Andrew J.; Sliter, Ray

    2004-01-01

    High-resolution seismic-reflection data have been combined with a variety of other geophysical and geological data to interpret the offshore structure and earthquake hazards of the San Pedro Shelf, near Los Angeles, California. Prominent structures investigated include the Wilmington Graben, the Palos Verdes Fault Zone, various faults below the western part of the shelf and slope, and the deep-water San Pedro Basin. The structure of the Palos Verdes Fault Zone changes markedly southeastward across the San Pedro Shelf and slope. Under the northern part of the shelf, this fault zone includes several strands, but the main strand dips west and is probably an oblique-slip fault. Under the slope, this fault zone consists of several fault strands having normal separation, most of which dip moderately east. To the southeast near Lasuen Knoll, the Palos Verdes Fault Zone locally is a low-angle fault that dips east, but elsewhere near this knoll the fault appears to dip steeply. Fresh sea-floor scarps near Lasuen Knoll indicate recent fault movement. The observed regional structural variation along the Palos Verdes Fault Zone is explained as the result of changes in strike and fault geometry along a master strike-slip fault at depth. The shallow summit and possible wavecut terraces on Lasuen Knoll indicate subaerial exposure during the last sea-level lowstand. Modeling of aeromagnetic data indicates the presence of a large magnetic body under the western part of the San Pedro Shelf and upper slope. This is interpreted to be a thick body of basalt of Miocene(?) age. Reflective sedimentary rocks overlying the basalt are tightly folded, whereas folds in sedimentary rocks east of the basalt have longer wavelengths. This difference might mean that the basalt was more competent during folding than the encasing sedimentary rocks. West of the Palos Verdes Fault Zone, other northwest-striking faults deform the outer shelf and slope. Evidence for recent movement along these

  19. Estimating the empirical probability of submarine landslide occurrence

    USGS Publications Warehouse

    Geist, Eric L.; Parsons, Thomas E.; Mosher, David C.; Shipp, Craig; Moscardelli, Lorena; Chaytor, Jason D.; Baxter, Christopher D. P.; Lee, Homa J.; Urgeles, Roger

    2010-01-01

    The empirical probability for the occurrence of submarine landslides at a given location can be estimated from age dates of past landslides. In this study, tools developed to estimate earthquake probability from paleoseismic horizons are adapted to estimate submarine landslide probability. In both types of estimates, one has to account for the uncertainty associated with age-dating individual events as well as the open time intervals before and after the observed sequence of landslides. For observed sequences of submarine landslides, we typically only have the age date of the youngest event and possibly of a seismic horizon that lies below the oldest event in a landslide sequence. We use an empirical Bayes analysis based on the Poisson-Gamma conjugate prior model specifically applied to the landslide probability problem. This model assumes that landslide events as imaged in geophysical data are independent and occur in time according to a Poisson distribution characterized by a rate parameter λ. With this method, we are able to estimate the most likely value of λ and, importantly, the range of uncertainty in this estimate. Examples considered include landslide sequences observed in the Santa Barbara Channel, California, and in Port Valdez, Alaska. We confirm that given the uncertainties of age dating that landslide complexes can be treated as single events by performing statistical test of age dates representing the main failure episode of the Holocene Storegga landslide complex.
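
The Poisson-Gamma conjugacy the authors exploit has a closed form: with a Gamma(α, β) prior on the rate λ and k events observed in a window of length T, the posterior is Gamma(α + k, β + T), and the posterior-predictive probability of at least one event in a future interval t is 1 − ((β + T)/(β + T + t))^(α + k). A sketch, with assumed prior parameters rather than the study's empirically fitted values:

```python
def prob_one_or_more(k, T, t, alpha=1.0, beta=1.0):
    """Posterior-predictive probability of >= 1 event in the next t
    (same time units as T), given k dated events in an observed window T.
    Prior: lambda ~ Gamma(alpha, beta); posterior: Gamma(alpha + k, beta + T).
    P(no events in t) = ((beta + T) / (beta + T + t)) ** (alpha + k)."""
    return 1.0 - ((beta + T) / (beta + T + t)) ** (alpha + k)

# Hypothetical sequence: 3 landslides imaged in a 10 kyr record;
# probability of another within the next 1 kyr
print(prob_one_or_more(k=3, T=10.0, t=1.0))
```

Integrating over the Gamma posterior rather than plugging in a point estimate of λ is what propagates the rate uncertainty, the quantity the abstract highlights, into the forecast probability.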

  20. SEISMICITY OF THE LASSEN PEAK AREA, CALIFORNIA: 1981-1983.

    USGS Publications Warehouse

    Walter, Stephen R.; Rojas, Veronica; Kollmann, Auriel

    1984-01-01

    Over 700 earthquakes occurred in the vicinity of Lassen Peak, California, from February 1981 through December 1983. These earthquakes define a broad, northwest-trending seismic zone that extends from the Sierra Nevada through the Lassen Peak area and either terminates or is offset to the northeast about 20 kilometers northwest of Lassen Peak. Approximately 25% of these earthquakes are associated with the geothermal system south of Lassen Peak. Earthquakes in the geothermal area generally occur at depths shallower than 6 kilometers.

  1. Seismicity remotely triggered by the magnitude 7.3 landers, california, earthquake

    USGS Publications Warehouse

    Hill, D.P.; Reasenberg, P.A.; Michael, A.; Arabaz, W.J.; Beroza, G.; Brumbaugh, D.; Brune, J.N.; Castro, R.; Davis, S.; Depolo, D.; Ellsworth, W.L.; Gomberg, J.; Harmsen, S.; House, L.; Jackson, S.M.; Johnston, M.J.S.; Jones, L.; Keller, Rebecca Hylton; Malone, S.; Munguia, L.; Nava, S.; Pechmann, J.C.; Sanford, A.; Simpson, R.W.; Smith, R.B.; Stark, M.; Stickney, M.; Vidal, A.; Walter, S.; Wong, V.; Zollweg, J.

    1993-01-01

    The magnitude 7.3 Landers earthquake of 28 June 1992 triggered a remarkably sudden and widespread increase in earthquake activity across much of the western United States. The triggered earthquakes, which occurred at distances up to 1250 kilometers (17 source dimensions) from the Landers mainshock, were confined to areas of persistent seismicity and strike-slip to normal faulting. Many of the triggered areas also are sites of geothermal and recent volcanic activity. Static stress changes calculated for elastic models of the earthquake appear to be too small to have caused the triggering. The most promising explanations involve nonlinear interactions between large dynamic strains accompanying seismic waves from the mainshock and crustal fluids (perhaps including crustal magma).

  2. The Landers earthquake; preliminary instrumental results

    USGS Publications Warehouse

    Jones, L.; Mori, J.; Hauksson, E.

    1992-01-01

    Early on the morning of June 28, 1992, millions of people in southern California were awakened by the largest earthquake to occur in the western United States in the past 40 years. At 4:58 a.m. PDT (local time), faulting associated with the magnitude 7.3 earthquake broke through to the Earth's surface near the town of Landers, California. The surface rupture then propagated 70 km (45 mi) to the north and northwest along a band of faults passing through the middle of the Mojave Desert. Fortunately, the strongest shaking occurred in uninhabited regions of the Mojave Desert. Still, one child was killed in Yucca Valley, and about 400 people were injured in the surrounding area. The desert communities of Landers, Yucca Valley, and Joshua Tree in San Bernardino County suffered considerable damage to buildings and roads. Damage to water and power lines caused problems in many areas.

  3. A record of large earthquakes on the southern Hayward fault for the past 1800 years

    USGS Publications Warehouse

    Lienkaemper, J.J.; Williams, P.L.

    2007-01-01

    This is the second article presenting evidence of the occurrence and timing of paleoearthquakes on the southern Hayward fault as interpreted from trenches excavated within a sag pond at the Tyson's Lagoon site in Fremont, California. We use the information to estimate the mean value and aperiodicity of the fault's recurrence interval (RI): two fundamental parameters for estimation of regional seismic hazard. An earlier article documented the four most recent earthquakes, including the historic 1868 earthquake. In this article we present evidence for at least seven earlier paleoruptures since about A.D. 170. We document these events with evidence for ground rupture, such as the presence of blocky colluvium at the base of the main trace fault scarp, and by corroborating evidence such as simultaneous liquefaction or an increase in deformation immediately below event horizons. The mean RI is 170 ± 82 yr (1σ, standard deviation of the sample), aperiodicity is 0.48, and individual intervals may be expected to range from 30 to 370 yr (95.4% confidence). The mean RI is consistent with the recurrence model of the Working Group on California Earthquake Probabilities (2003) (mean, 161 yr; range, 99 yr [2.5%]; 283 yr [97.5%]). We note that the mean RI for the five most recent events may have been only 138 ± 58 yr (1σ). Hypothesis tests for the shorter RI do not demonstrate that any recent acceleration has occurred compared to the earlier period or the entire 1800-yr record, principally because of inherent uncertainties of the event ages.
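
A mean RI and an aperiodicity (coefficient of variation) are all that a back-of-the-envelope conditional rupture probability requires under a renewal model. The sketch below assumes a lognormal renewal distribution (the Working Group used a Brownian passage-time model; lognormal is a common stand-in of similar shape) and an elapsed time of roughly 139 yr since the 1868 rupture as of the 2007 publication; it is illustrative, not the authors' hazard calculation.

```python
import math

def lognormal_cdf(x, mean, aperiodicity):
    """CDF of a lognormal matched to a given mean and coefficient of
    variation (aperiodicity): sigma^2 = ln(1 + cv^2), mu = ln(mean) - sigma^2/2."""
    sigma2 = math.log(1.0 + aperiodicity ** 2)
    mu = math.log(mean) - sigma2 / 2.0
    return 0.5 * (1.0 + math.erf((math.log(x) - mu) / math.sqrt(2.0 * sigma2)))

def conditional_prob(mean_ri, aperiodicity, elapsed, window=30.0):
    """P(rupture within `window` yr | no rupture for `elapsed` yr)."""
    F = lambda x: lognormal_cdf(x, mean_ri, aperiodicity)
    return (F(elapsed + window) - F(elapsed)) / (1.0 - F(elapsed))

# Southern Hayward values from the abstract: mean RI 170 yr, aperiodicity 0.48,
# ~139 yr elapsed since 1868
print(conditional_prob(170.0, 0.48, 139.0))
```

The conditional form matters: for a renewal process, unlike a Poisson process, the probability in the next 30 yr grows as the elapsed time approaches the mean recurrence interval.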

  4. Critical behavior in earthquake energy dissipation

    NASA Astrophysics Data System (ADS)

    Wanliss, James; Muñoz, Víctor; Pastén, Denisse; Toledo, Benjamín; Valdivia, Juan Alejandro

    2017-09-01

    We explore bursty multiscale energy dissipation from earthquakes between latitudes 29° S and 35.5° S and longitudes 69.501° W and 73.944° W (in the Chilean central zone). Our work compares the predictions of a theory of nonequilibrium phase transitions with nonstandard statistical signatures of earthquake complex scaling behaviors. For temporal scales less than 84 hours, the time development of earthquake radiated energy activity follows an algebraic arrangement consistent with estimates from the theory of nonequilibrium phase transitions. There are no characteristic scales for probability distributions of sizes and lifetimes of the activity bursts in the scaling region. The power-law exponents describing the probability distributions suggest that the main energy dissipation takes place due to the largest bursts of activity, such as major earthquakes, as opposed to smaller activations, which contribute less significantly though they occur more often. The results obtained provide statistical evidence that earthquake energy dissipation mechanisms are essentially "scale-free", displaying statistical and dynamical self-similarity. Our results provide some evidence that earthquake radiated energy and directed percolation belong to a similar universality class.

  5. Heterogeneous rupture in the great Cascadia earthquake of 1700 inferred from coastal subsidence estimates

    USGS Publications Warehouse

    Wang, Pei-Ling; Engelhart, Simon E.; Wang, Kelin; Hawkes, Andrea D.; Horton, Benjamin P.; Nelson, Alan R.; Witter, Robert C.

    2013-01-01

    Past earthquake rupture models used to explain paleoseismic estimates of coastal subsidence during the great A.D. 1700 Cascadia earthquake have assumed a uniform slip distribution along the megathrust. Here we infer heterogeneous slip for the Cascadia margin in A.D. 1700 that is analogous to slip distributions during instrumentally recorded great subduction earthquakes worldwide. The assumption of uniform distribution in previous rupture models was due partly to the large uncertainties of then available paleoseismic data used to constrain the models. In this work, we use more precise estimates of subsidence in 1700 from detailed tidal microfossil studies. We develop a 3-D elastic dislocation model that allows the slip to vary both along strike and in the dip direction. Despite uncertainties in the updip and downdip slip extensions, the more precise subsidence estimates are best explained by a model with along-strike slip heterogeneity, with multiple patches of high-moment release separated by areas of low-moment release. For example, in A.D. 1700, there was very little slip near Alsea Bay, Oregon (~44.4°N), an area that coincides with a segment boundary previously suggested on the basis of gravity anomalies. A probable subducting seamount in this area may be responsible for impeding rupture during great earthquakes. Our results highlight the need for more precise, high-quality estimates of subsidence or uplift during prehistoric earthquakes from the coasts of southern British Columbia, northern Washington (north of 47°N), southernmost Oregon, and northern California (south of 43°N), where slip distributions of prehistoric earthquakes are poorly constrained.

  6. Liquefaction Hazard Maps for Three Earthquake Scenarios for the Communities of San Jose, Campbell, Cupertino, Los Altos, Los Gatos, Milpitas, Mountain View, Palo Alto, Santa Clara, Saratoga, and Sunnyvale, Northern Santa Clara County, California

    USGS Publications Warehouse

    Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.

    2008-01-01

    Maps showing the probability of surface manifestations of liquefaction in the northern Santa Clara Valley were prepared with liquefaction probability curves. The area includes the communities of San Jose, Campbell, Cupertino, Los Altos, Los Gatos, Milpitas, Mountain View, Palo Alto, Santa Clara, Saratoga, and Sunnyvale. The probability curves were based on complementary cumulative frequency distributions of the liquefaction potential index (LPI) for surficial geologic units in the study area. LPI values were computed with extensive cone penetration test soundings. Maps were developed for three earthquake scenarios: an M7.8 on the San Andreas Fault comparable to the 1906 event, an M6.7 on the Hayward Fault comparable to the 1868 event, and an M6.9 on the Calaveras Fault. Ground motions were estimated with the Boore and Atkinson (2008) attenuation relation. Liquefaction is predicted for all three events in young Holocene levee deposits along the major creeks. Liquefaction probabilities are highest for the M7.8 earthquake, ranging from 0.33 to 0.37 if a 1.5-m deep water table is assumed, and 0.10 to 0.14 if a 5-m deep water table is assumed. Liquefaction probabilities of the other surficial geologic units are less than 0.05. Probabilities for the scenario earthquakes are generally consistent with observations during historical earthquakes.

  7. Disaster Response and Decision Support in Partnership with the California Earthquake Clearinghouse

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Rosinski, A.; Vaughan, D.; Morentz, J.

    2014-12-01

    Getting the right information to the right people at the right time is critical during a natural disaster. E-DECIDER (Emergency Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response) is a NASA decision support system designed to produce remote sensing and geophysical modeling data products that are relevant to the emergency preparedness and response communities and serve as a gateway to enable the delivery of NASA decision support products to these communities. The E-DECIDER decision support system has several tools, services, and products that have been used to support end-user exercises in partnership with the California Earthquake Clearinghouse since 2012, including near real-time deformation modeling results and on-demand maps of critical infrastructure that may have been potentially exposed to damage by a disaster. E-DECIDER's underlying service architecture allows the system to facilitate delivery of NASA decision support products to the Clearinghouse through XchangeCore Web Service Data Orchestration that allows trusted information exchange among partner agencies. This in turn allows Clearinghouse partners to visualize data products produced by E-DECIDER and other NASA projects through incident command software such as SpotOnResponse or ArcGIS Online.

  8. Continuous GPS observations of postseismic deformation following the 16 October 1999 Hector Mine, California, earthquake (Mw 7.1)

    USGS Publications Warehouse

    Hudnut, K.W.; King, N.E.; Galetzka, J.E.; Stark, K.F.; Behr, J.A.; Aspiotes, A.; van Wyk, S.; Moffitt, R.; Dockter, S.; Wyatt, F.

    2002-01-01

    Rapid field deployment of a new type of continuously operating Global Positioning System (GPS) network and data from Southern California Integrated GPS Network (SCIGN) stations that had recently begun operating in the area allow unique observations of the postseismic deformation associated with the 1999 Hector Mine earthquake. Innovative solutions in fieldcraft, devised for the 11 new GPS stations, provide high-quality observations with 1-year time histories on stable monuments at remote sites. We report on our results from processing the postseismic GPS data available from these sites, as well as 8 other SCIGN stations within 80 km of the event (a total of 19 sites). From these data, we analyze the temporal character and spatial pattern of the postseismic transients. Data from some sites display statistically significant time variation in their velocities. Although this is less certain, the spatial pattern of the postseismic velocity field also appears to have changed. The pattern now is similar to the pre-Landers (pre-1992) secular field, but laterally shifted and locally at twice the rate. We speculate that a 30 km × 50 km portion of crust (near Twentynine Palms), which was moving at nearly the North American plate rate (to within 3.5 mm/yr of that rate) prior to the 1992 Landers sequence, now is moving along with the crust to the west of it, as though it has been entrained in flow along with the Pacific Plate as a result of the Landers and Hector Mine earthquake sequence. The inboard axis of right-lateral shear deformation (at lower crustal to upper mantle depth) may have jumped 30 km farther into the continental crust at this fault junction that comprises the southern end of the eastern California shear zone.

  9. Role of stress triggering in earthquake migration on the North Anatolian fault

    USGS Publications Warehouse

    Stein, R.S.; Dieterich, J.H.; Barka, A.A.

    1996-01-01

    Ten M ≥ 6.7 earthquakes ruptured 1,000 km of the North Anatolian fault (Turkey) during 1939-92, providing an unsurpassed opportunity to study how one large shock sets up the next. Calculations of the change in Coulomb failure stress reveal that 9 out of 10 ruptures were brought closer to failure by the preceding shocks, typically by 5 bars, equivalent to 20 years of secular stressing. We translate the calculated stress changes into earthquake probabilities using an earthquake-nucleation constitutive relation, which includes both permanent and transient stress effects. For the typical 10-year period between triggering and subsequent rupturing shocks in the Anatolia sequence, the stress changes yield an average three-fold gain in the ensuing earthquake probability. Stress is now calculated to be high at several isolated sites along the fault. During the next 30 years, we estimate a 15% probability of an M ≥ 6.7 earthquake east of the major eastern center of Erzincan, and a 12% probability for a large event south of the major western port city of Izmit. Such stress-based probability calculations may thus be useful to assess and update earthquake hazards elsewhere. © 1997 Elsevier Science Ltd.
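
The abstract's conversion of a stress step into equivalent loading time is one line of arithmetic: a stress increase Δτ advances the earthquake cycle by Δτ divided by the secular stressing rate. This simple clock-advance view ignores the transient, rate- and state-dependent effects the authors actually model, and the 0.25 bar/yr rate below is not quoted in the abstract; it is just the value implied by the stated "5 bars ≈ 20 years" equivalence.

```python
def clock_advance_years(delta_tau_bars, stressing_rate_bars_per_yr):
    """Years of secular tectonic loading equivalent to a coseismic
    Coulomb stress step (permanent-stress approximation only)."""
    return delta_tau_bars / stressing_rate_bars_per_yr

# A 5-bar stress increase at ~0.25 bar/yr secular stressing
print(clock_advance_years(5.0, 0.25))  # 20.0 years
```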

  10. Earthquakes: Risk, Detection, Warning, and Research

    DTIC Science & Technology

    2010-01-14

    which affect taller, multi-story buildings. Ground motion that affects shorter buildings of a few stories, called short-period seismic waves, is...places in a single fault, or jump between connected faults. Earthquakes that occur along the Sierra Madre fault in southern California, for example

  11. Earthquake nucleation by transient deformations caused by the M = 7.9 Denali, Alaska, earthquake

    USGS Publications Warehouse

    Gomberg, J.; Bodin, P.; Larson, K.; Dragert, H.

    2004-01-01

    The permanent and dynamic (transient) stress changes inferred to trigger earthquakes are usually orders of magnitude smaller than the stresses relaxed by the earthquakes themselves, implying that triggering occurs on critically stressed faults. Triggered seismicity rate increases may therefore be most likely to occur in areas where loading rates are highest and elevated pore pressures, perhaps facilitated by high-temperature fluids, reduce frictional stresses and promote failure. Here we show that the 2002 magnitude M = 7.9 Denali, Alaska, earthquake triggered widespread seismicity rate increases throughout British Columbia and into the western United States. Dynamic triggering by seismic waves should be enhanced in directions where rupture directivity focuses radiated energy, and we verify this using seismic and new high-sample-rate GPS recordings of the Denali mainshock. These observations are comparable in scale only to the triggering caused by the 1992 M = 7.4 Landers, California, earthquake, and demonstrate that Landers triggering did not reflect some peculiarity of the region or the earthquake. However, the rate increases triggered by the Denali earthquake occurred in areas that are not obviously tectonically active, implying that even in areas of low ambient stressing rates, faults may still be critically stressed and that dynamic triggering may be ubiquitous and unpredictable.

  12. Baja Earthquake, Radar Image and Colored Height

    NASA Image and Video Library

    2010-04-05

    The topography surrounding the Laguna Salada Fault in the Mexican state of Baja California, where a magnitude 7.2 earthquake struck on April 4, 2010, is shown in this perspective view built with data from NASA's Shuttle Radar Topography Mission.

  13. Southern California Earthquake Center (SCEC) Communication, Education and Outreach Program

    NASA Astrophysics Data System (ADS)

    Benthien, M. L.

    2003-12-01

    The SCEC Communication, Education, and Outreach Program (CEO) offers student research experiences, web-based education tools, classroom curricula, museum displays, public information brochures, online newsletters, and technical workshops and publications. This year, much progress has been made on the development of the Electronic Encyclopedia of Earthquakes (E3), a collaborative project with CUREE and IRIS. The E3 development system is now fully operational, and 165 entries are in the pipeline. When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth science, engineering, physics, and mathematics. To coordinate activities for the 10-year anniversary of the Northridge Earthquake in 2004 (and beyond), the "Earthquake Country Alliance" is being organized by SCEC CEO to present common messages, to share or promote existing resources, and to develop new activities and products jointly (such as a new version of Putting Down Roots in Earthquake Country). The group includes earthquake science and engineering researchers and practicing professionals, preparedness experts, response and recovery officials, news media representatives, and education specialists. A web portal, http://www.earthquakecountry.info, is being developed, with links to web pages and descriptions of other resources and services that the Alliance members provide. Another ongoing strength of SCEC is the Summer Intern program, which now has a year-round counterpart with students working on IT projects at USC. Since Fall 2002, over 32 students have participated in the program, including 7 students working with scientists throughout SCEC, 17 students involved in the USC "Earthquake Information Technology" intern program, and 7 students involved in CEO projects.
These and other activities of the SCEC CEO program will be presented, along with lessons learned during program design and

  14. Reducing Community Vulnerability to Wildland Fires in Southern California

    NASA Astrophysics Data System (ADS)

    Keeley, J. E.

    2010-12-01

    In the US, fires are not treated like other hazards such as earthquakes, but rather as preventable through landscape fuel treatments and aggressive fire suppression. In southern California, extreme fire weather has made it impossible to control all fires, and thus loss of homes and lives is a constant threat to communities. There is growing evidence indicating that we are not likely to ever eliminate fires on these landscapes. Thus, it is time to reframe the fire problem and think of fires as we do other natural hazards such as earthquakes. We do not attempt to stop earthquakes; rather, the primary emphasis is on altering human infrastructure in ways that minimize community vulnerability. In other words, we need to change our approach from risk elimination to risk management. This approach means we accept that we cannot eliminate fires and instead learn to live with fire, with communities becoming more fire adapted. We can potentially make great strides in reducing community vulnerability by finding those factors that have high impacts and are sensitive to changes in management. Presently, decision makers have relatively little guidance about which of these is likely to have the greatest impact. Future reductions in fire risk to communities require that we address both the wildland and urban elements that contribute to destructive losses. Damage risk D is determined by D = f(I, S, E, G, H), where I = the probability of a fire starting in the landscape; S = the probability of the fire reaching a size sufficient to reach the urban environment; E = the probability of it encroaching into the urban environment; G = the probability of fire propagating within the built environment; and H = the probability of a fire, once within the built environment, resulting in the destruction of a building. In southern California, reducing I through more strategic fire prevention has potential for reducing fire risk. There are many ignition sources that could be reduced, such as replacing power line ignitions with
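One minimal reading of the damage chain D = f(I, S, E, G, H) is a product of conditional probabilities, which also makes the abstract's point concrete: shrinking the ignition term I scales the entire risk. The probability values below are purely illustrative assumptions, not numbers from the abstract.

```python
# Toy damage-chain model: each stage is a conditional probability,
# so overall damage risk is their product (illustrative values only).
p_ignition = 0.30   # I: a fire starts in the landscape
p_spread = 0.20     # S: fire grows large enough to reach the urban edge
p_encroach = 0.50   # E: fire encroaches into the urban environment
p_propagate = 0.40  # G: fire propagates within the built environment
p_destroy = 0.60    # H: fire destroys a building once inside

p_damage = 1.0
for p in (p_ignition, p_spread, p_encroach, p_propagate, p_destroy):
    p_damage *= p
print(round(p_damage, 4))        # -> 0.0072
print(round(p_damage * 0.5, 4))  # halving ignitions I halves D -> 0.0036
```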

  15. Computing and Visualizing the Complex Dynamics of Earthquake Fault Systems: Towards Ensemble Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Rundle, J.; Rundle, P.; Donnellan, A.; Li, P.

    2003-12-01

    We consider the problem of the complex dynamics of earthquake fault systems, and whether numerical simulations can be used to define an ensemble forecasting technology similar to that used in weather and climate research. To effectively carry out such a program, we need 1) a topologically realistic model to simulate the fault system; 2) data sets to constrain the model parameters through a systematic program of data assimilation; 3) a computational technology making use of modern paradigms of high-performance and parallel computing systems; and 4) software to visualize and analyze the results. In particular, we focus attention on a new version of our code Virtual California (version 2001) in which we model all of the major strike-slip faults extending throughout California, from the Mexico-California border to the Mendocino Triple Junction. We use the historic data set of earthquakes of magnitude M > 6 to define the frictional properties of all 654 fault segments (degrees of freedom) in the model. Previous versions of Virtual California had used only 215 fault segments to model the strike-slip faults in southern California. To compute the dynamics and the associated surface deformation, we use message passing as implemented in the MPICH standard distribution on a small Beowulf cluster consisting of 10 CPUs. We are also planning to run the code on significantly larger machines so that we can begin to examine much finer spatial scales of resolution, and to assess scaling properties of the code. We present results of simulations both as static images and as MPEG movies, so that the dynamical aspects of the computation can be assessed by the viewer. We also compute a variety of statistics from the simulations, including magnitude-frequency relations, and compare these with data from real fault systems.
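One of the statistics mentioned, the magnitude-frequency relation, is computed the same way for a simulated catalog as for a real one: cumulative counts N(≥ M) plotted on a log scale. The tiny catalog below is invented for illustration; it is not Virtual California output.

```python
import math

# Invented toy catalog of event magnitudes (illustrative only).
mags = [6.1, 6.3, 6.0, 6.8, 7.1, 6.2, 6.5, 6.0, 6.4, 6.9, 6.1, 6.6]

bins = [6.0, 6.5, 7.0]
# Cumulative Gutenberg-Richter counts: N(>= M) for each magnitude threshold.
counts = [sum(1 for m in mags if m >= b) for b in bins]
log_counts = [round(math.log10(n), 2) for n in counts]
print(counts, log_counts)  # -> [12, 5, 1] [1.08, 0.7, 0.0]
```

A straight line through log10 N(≥ M) versus M would give the b-value, the slope typically compared between simulated and real fault systems.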

  16. The 2014 Mw6.1 South Napa Earthquake: A unilateral rupture with shallow asperity and rapid afterslip

    USGS Publications Warehouse

    Wei, Shengji; Barbot, Sylvain; Graves, Robert; Lienkaemper, James J.; Wang, Teng; Hudnut, Kenneth W.; Fu, Yuning; Helmberger, Don

    2015-01-01

    The Mw6.1 South Napa earthquake occurred near Napa, California on August 24, 2014 (UTC), and was the largest inland earthquake in Northern California since the 1989 Mw6.9 Loma Prieta earthquake. The first report of the earthquake from the Northern California Earthquake Data Center (NCEDC) indicates a hypocentral depth of 11.0 km with longitude and latitude of (122.3105°W, 38.217°N). Surface rupture was documented by field observations and Lidar imaging (Brooks et al. 2014; Hudnut et al. 2014; Brocher et al., 2015), with about 12 km of continuous rupture starting near the epicenter and extending to the northwest. The southern part of the rupture is relatively straight, but the strike changes by about 15° at the northern end over a 6-km segment. The peak dextral offset was observed near the Buhman residence with right-lateral motion of 46 cm, near the location where the strike of the fault begins to rotate clockwise (Hudnut et al., 2014). The earthquake was well recorded by the strong motion network operated by the NCEDC, the California Geological Survey, and the U.S. Geological Survey (USGS). There are about 12 sites within an epicentral distance of 15 km, with relatively good azimuthal coverage (Fig. 1). The largest peak ground velocity (PGV) of nearly 100 cm/s was observed at station 1765, which is the closest station to the rupture and lies about 3 km east of the northern segment (Fig. 1). The ground deformation associated with the earthquake was also well recorded by the high-resolution COSMO-SkyMed and Sentinel-1A satellites, providing independent static observations.

  17. Stress-based aftershock forecasts made within 24h post mainshock: Expected north San Francisco Bay area seismicity changes after the 2014M=6.0 West Napa earthquake

    USGS Publications Warehouse

    Parsons, Thomas E.; Segou, Margaret; Sevilgen, Volkan; Milner, Kevin; Field, Edward; Toda, Shinji; Stein, Ross S.

    2014-01-01

    We calculate stress changes resulting from the M = 6.0 West Napa earthquake on north San Francisco Bay area faults. The earthquake ruptured within a series of long faults that pose significant hazard to the Bay area, and we are thus concerned with potential increases in the probability of a large earthquake through stress transfer. We conduct this exercise as a prospective test because the skill of stress-based aftershock forecasting methodology is inconclusive. We apply three methods: (1) generalized mapping of regional Coulomb stress change, (2) stress changes resolved on Uniform California Earthquake Rupture Forecast faults, and (3) a mapped rate/state aftershock forecast. All calculations were completed within 24 h after the main shock and were made without benefit of known aftershocks, which will be used to evaluate the prospective forecast. All methods suggest that we should expect heightened seismicity on parts of the southern Rodgers Creek, northern Hayward, and Green Valley faults.

  18. Stress-based aftershock forecasts made within 24 h postmain shock: Expected north San Francisco Bay area seismicity changes after the 2014 M = 6.0 West Napa earthquake

    NASA Astrophysics Data System (ADS)

    Parsons, Tom; Segou, Margaret; Sevilgen, Volkan; Milner, Kevin; Field, Edward; Toda, Shinji; Stein, Ross S.

    2014-12-01

    We calculate stress changes resulting from the M = 6.0 West Napa earthquake on north San Francisco Bay area faults. The earthquake ruptured within a series of long faults that pose significant hazard to the Bay area, and we are thus concerned with potential increases in the probability of a large earthquake through stress transfer. We conduct this exercise as a prospective test because the skill of stress-based aftershock forecasting methodology is inconclusive. We apply three methods: (1) generalized mapping of regional Coulomb stress change, (2) stress changes resolved on Uniform California Earthquake Rupture Forecast faults, and (3) a mapped rate/state aftershock forecast. All calculations were completed within 24 h after the main shock and were made without benefit of known aftershocks, which will be used to evaluate the prospective forecast. All methods suggest that we should expect heightened seismicity on parts of the southern Rodgers Creek, northern Hayward, and Green Valley faults.

  19. Earthquake and tsunami forecasts: Relation of slow slip events to subsequent earthquake rupture

    PubMed Central

    Dixon, Timothy H.; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-01-01

    The 5 September 2012 Mw 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr–Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential. PMID:25404327
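The abstract's point that earthquake magnitude depends on rupture area follows from the standard seismic-moment definition M0 = μAD and the Hanks-Kanamori moment magnitude. The rigidity, area, and slip below are assumed round numbers chosen to land near the Mw 7.6 of this event, not values from the paper.

```python
import math

mu = 3.0e10    # Pa, assumed crustal rigidity
area = 5.0e9   # m^2, assumed rupture area (~100 km x 50 km)
slip = 2.0     # m, assumed average coseismic slip

m0 = mu * area * slip                        # seismic moment in N*m
mw = (2.0 / 3.0) * (math.log10(m0) - 9.1)    # Hanks-Kanamori moment magnitude
print(round(mw, 2))  # -> 7.58
```

Because Mw grows with log10 of area, shrinking the area available for rupture (here, the zone not already relieved by slow slip) directly caps the achievable magnitude.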

  20. Earthquake and tsunami forecasts: relation of slow slip events to subsequent earthquake rupture.

    PubMed

    Dixon, Timothy H; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-12-02

    The 5 September 2012 M(w) 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr-Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential.

  1. Long-Term Fault Memory: A New Time-Dependent Recurrence Model for Large Earthquake Clusters on Plate Boundaries

    NASA Astrophysics Data System (ADS)

    Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.; Campbell, M. R.

    2017-12-01

    A challenge for earthquake hazard assessment is that geologic records often show large earthquakes occurring in temporal clusters separated by periods of quiescence. For example, in Cascadia, a paleoseismic record going back 10,000 years shows four to five clusters separated by approximately 1,000 year gaps. If we are still in the cluster that began 1700 years ago, a large earthquake is likely to happen soon. If the cluster has ended, a great earthquake is less likely. For a Gaussian distribution of recurrence times, the probability of an earthquake in the next 50 years is six times larger if we are still in the most recent cluster. Earthquake hazard assessments typically employ one of two recurrence models, neither of which directly incorporate clustering. In one, earthquake probability is time-independent and modeled as Poissonian, so an earthquake is equally likely at any time. The fault has no "memory" because when a prior earthquake occurred has no bearing on when the next will occur. The other common model is a time-dependent earthquake cycle in which the probability of an earthquake increases with time until one happens, after which the probability resets to zero. Because the probability is reset after each earthquake, the fault "remembers" only the last earthquake. This approach can be used with any assumed probability density function for recurrence times. We propose an alternative, Long-Term Fault Memory (LTFM), a modified earthquake cycle model where the probability of an earthquake increases with time until one happens, after which it decreases, but not necessarily to zero. Hence the probability of the next earthquake depends on the fault's history over multiple cycles, giving "long-term memory". Physically, this reflects an earthquake releasing only part of the elastic strain stored on the fault. We use the LTFM to simulate earthquake clustering along the San Andreas Fault and Cascadia. 
In some portions of the simulated earthquake history, events would
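The LTFM idea can be sketched in a few lines (our own toy implementation, not the authors' code): strain accumulates steadily, event probability grows with stored strain, and each event releases only a fraction of that strain, so the probability never resets to zero. The loading rate, release fraction, and hazard scale are all assumed parameters.

```python
import random

random.seed(1)
loading_rate = 1.0       # strain units per year (assumed)
release_fraction = 0.6   # fraction of stored strain released per event (assumed)
hazard_scale = 200.0     # maps stored strain to annual event probability (assumed)

strain = 0.0
events = []
for year in range(5000):
    strain += loading_rate
    p_event = min(1.0, strain / hazard_scale)  # probability rises with stored strain
    if random.random() < p_event:
        events.append(year)
        strain *= (1.0 - release_fraction)     # partial, not total, reset
print(len(events))
```

Because strain stays elevated after an event, follow-on events arrive sooner, producing the clusters separated by quiescent gaps that the paleoseismic records show.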

  2. Supercomputing meets seismology in earthquake exhibit

    ScienceCinema

    Blackwell, Matt; Rodger, Arthur; Kennedy, Tom

    2018-02-14

    When the California Academy of Sciences created the "Earthquake: Evidence of a Restless Planet" exhibit, they called on Lawrence Livermore to help combine seismic research with the latest data-driven visualization techniques. The outcome is a series of striking visualizations of earthquakes, tsunamis, and tectonic plate evolution. Seismic-wave research is a core competency at Livermore. While most often associated with earthquakes, the research has many other applications of national interest, such as nuclear explosion monitoring, explosion forensics, energy exploration, and seismic acoustics. For the Academy effort, Livermore researchers simulated the San Andreas and Hayward fault events at high resolution. Such calculations require significant computational resources. To simulate the 1906 earthquake, for instance, visualizing 125 seconds of ground motion required over 1 billion grid points, 10,000 time steps, and 7.5 hours of processor time on 2,048 cores of Livermore's Sierra machine.

  3. Testing new methodologies for short -term earthquake forecasting: Multi-parameters precursors

    NASA Astrophysics Data System (ADS)

    Ouzounov, Dimitar; Pulinets, Sergey; Tramutoli, Valerio; Lee, Lou; Liu, Tiger; Hattori, Katsumi; Kafatos, Menas

    2014-05-01

    We are conducting real-time tests involving multi-parameter observations over different seismo-tectonic regions in our investigation of phenomena preceding major earthquakes. Our approach is based on a systematic analysis of several selected parameters, namely: gas discharge; thermal infrared radiation; ionospheric electron density; and atmospheric temperature and humidity, which we believe are all associated with the earthquake preparation phase. We are testing a methodology capable of producing alerts in advance of major earthquakes (M > 5.5) in different regions of active earthquakes and volcanoes. During 2012-2013 we established a collaborative framework with the PRE-EARTHQUAKE (EU) and iSTEP3 (Taiwan) projects for coordinated measurements and prospective validation over seven testing regions: Southern California (USA), Eastern Honshu (Japan), Italy, Greece, Turkey, Taiwan (ROC), and Kamchatka and Sakhalin (Russia). The current experiment provided a "stress test" opportunity to validate the physically based earthquake precursor approach over regions of high seismicity. Our initial results are: (1) Real-time tests have shown the presence of anomalies in the atmosphere and ionosphere before most of the significant (M > 5.5) earthquakes; (2) False positives exist, and their ratios differ for each region, varying from 50% (Southern Italy) and 35% (California) down to 25% (Taiwan, Kamchatka, and Japan), with a significant reduction of false positives as soon as at least two geophysical parameters are used simultaneously; (3) The main problems remain the systematic collection and real-time integration of pre-earthquake observations. Our findings suggest that real-time testing of physically based pre-earthquake signals provides a short-term predictive power (in all three important parameters, namely location, time, and magnitude) for the occurrence of major earthquakes in the tested regions, and this result encourages testing to continue with a more detailed analysis of

  4. Prospective testing of Coulomb short-term earthquake forecasts

    NASA Astrophysics Data System (ADS)

    Jackson, D. D.; Kagan, Y. Y.; Schorlemmer, D.; Zechar, J. D.; Wang, Q.; Wong, K.

    2009-12-01

    Earthquake-induced Coulomb stresses, whether static or dynamic, suddenly change the probability of future earthquakes. Models to estimate stress and the resulting seismicity changes could help to illuminate earthquake physics and guide appropriate precautionary response. But do these models have improved forecasting power compared to empirical statistical models? The best answer lies in prospective testing, in which a fully specified model, with no subsequent parameter adjustments, is evaluated against future earthquakes. The Collaboratory for the Study of Earthquake Predictability (CSEP) facilitates such prospective testing of earthquake forecasts, including several short-term forecasts. Formulating Coulomb stress models for formal testing involves several practical problems, mostly shared with other short-term models. First, earthquake probabilities must be calculated after each "perpetrator" earthquake but before the triggered earthquakes, or "victims". The time interval between a perpetrator and its victims may be very short, as characterized by the Omori law for aftershocks. CSEP evaluates short-term models daily, and allows daily updates of the models. However, much can happen in a day. An alternative is to test and update models on the occurrence of each earthquake over a certain magnitude. To make such updates rapidly enough and to qualify as prospective, earthquake focal mechanisms, slip distributions, stress patterns, and earthquake probabilities would have to be computed without human intervention. This scheme would be more appropriate for evaluating scientific ideas, but it may be less useful for practical applications than daily updates. Second, triggered earthquakes are imperfectly recorded following larger events because their seismic waves are buried in the coda of the earlier event. To solve this problem, testing methods need to allow for "censoring" of early aftershock data, and a quantitative model for detection threshold as a function of
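The timing problem described here, most "victims" arriving very soon after the "perpetrator", is quantified by the modified Omori law n(t) = K/(c + t)^p. The parameter values below are typical assumed values, not numbers from the abstract.

```python
# Modified Omori law for aftershock rate: n(t) = K / (c + t)**p.
# K (productivity), c (time offset), and p (decay exponent) are assumed
# typical values for illustration.
K, c, p = 100.0, 0.05, 1.1

def omori_rate(t_days):
    """Aftershock rate (events/day) t_days after the mainshock."""
    return K / (c + t_days) ** p

# Rate near the middle of day 1 versus the middle of day 30: the rate
# drops by nearly two orders of magnitude, so a daily update cadence
# misses most of the triggered activity on day one.
print(omori_rate(0.5), omori_rate(29.5))
```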

  5. Learning from physics-based earthquake simulators: a minimal approach

    NASA Astrophysics Data System (ADS)

    Artale Harris, Pietro; Marzocchi, Warner; Melini, Daniele

    2017-04-01

    Physics-based earthquake simulators aim to generate synthetic seismic catalogs of arbitrary length, accounting for fault interaction, elastic rebound, realistic fault networks, and a simple earthquake nucleation process such as rate-and-state friction. Through comparison of synthetic and real catalogs, seismologists can gain insight into the earthquake occurrence process. Moreover, earthquake simulators can be used to infer aspects of the statistical behavior of earthquakes within the simulated region, by analyzing timescales not accessible through observations. The development of earthquake simulators is commonly led by the approach "the more physics, the better", pushing seismologists toward ever more Earth-like simulators. However, despite its immediate attractiveness, we argue that this kind of approach makes it increasingly difficult to understand which physical parameters are really relevant to describing the features of the seismic catalog in which we are interested. For this reason, here we take the opposite, minimal approach and analyze the behavior of a purposely simple earthquake simulator applied to a set of California faults. The idea is that a simple model may be more informative than a complex one for some specific scientific objectives, because it is more understandable. The model has three main components: the first is a realistic tectonic setting, i.e., a fault dataset of California; the other two are quantitative laws for earthquake generation on each single fault, and the Coulomb Failure Function for modeling fault interaction. The final goal of this work is twofold. On one hand, we aim to identify the minimum set of physical ingredients that can satisfactorily reproduce the features of the real seismic catalog, such as short-term seismic clustering, and to investigate hypothetical long-term behavior and fault synchronization. On the other hand, we want to investigate the limits of predictability of the model itself.

  6. Automated Radar Image of Deformation for Amatrice, Italy Earthquake

    NASA Image and Video Library

    2016-08-31

    The Amatrice earthquake in central Italy caused widespread building damage to several towns throughout the region. This earthquake was the strongest in that area since the 2009 earthquake that destroyed the city of L'Aquila. The Advanced Rapid Imaging and Analysis (ARIA) data system, a collaborative project between NASA's Jet Propulsion Laboratory, Pasadena, California, and the California Institute of Technology in Pasadena, automatically generated interferometric synthetic aperture radar images from the Copernicus Sentinel-1A satellite, operated by the European Space Agency (ESA) for the European Commission, to calculate a map of the deformation of Earth's surface caused by the quake. This false-color map shows the amount of permanent surface movement, as viewed by the satellite, during a 12-day interval between two Sentinel-1 images acquired on Aug. 15, 2016, and Aug. 27, 2016. The movement was caused almost entirely by the earthquake. In this map, the colors are proportional to the amount of surface displacement. The red and pink tones show the areas where the land moved toward the satellite by up to 2 inches (5 centimeters). The area with various shades of blue moved away from the satellite, mostly downward, by as much as 8 inches (20 centimeters). Contours of surface motion are at 2-inch (5-centimeter) intervals. The green star shows the epicenter where the earthquake started, as located by the U.S. Geological Survey National Earthquake Information Center. Black dots show town locations. Scientists use these maps to build detailed models of the fault slip at depth and associated land movements to better understand the impact on future earthquake activity. The map shows the fault or faults that moved in the earthquake are about 14 miles (22 kilometers) long between Amatrice and Norcia and slope to the west beneath the area that moved downward. http://photojournal.jpl.nasa.gov/catalog/PIA20896

  7. The Loma Prieta, California, Earthquake of October 17, 1989: Societal Response

    USGS Publications Warehouse

    Coordinated by Mileti, Dennis S.

    1993-01-01

    Professional Paper 1553 describes how people and organizations responded to the earthquake and how the earthquake impacted people and society. The investigations evaluate the tools available to the research community to measure the nature, extent, and causes of damage and losses. They describe human behavior during and immediately after the earthquake and how citizens participated in emergency response. They review the challenges confronted by police and fire departments and disruptions to transbay transportation systems. And they survey the challenges of post-earthquake recovery. Some significant findings were: * Loma Prieta provided the first test of ATC-20, the red, yellow, and green tagging of buildings. Its successful application has led to widespread use in other disasters, including the September 11, 2001, New York City terrorist incident. * Most people responded calmly and without panic to the earthquake and acted to get themselves to a safe location. * Actions by people to help alleviate emergency conditions were proportional to the level of need at the community level. * Some solutions caused problems of their own. The police perimeter around the Cypress Viaduct isolated businesses from their customers, leading to a loss of business, and the evacuation of employees from those businesses hindered the movement of supplies to the disaster scene. * Emergency transbay ferry service was established 6 days after the earthquake, but required constant revision of service contracts and schedules. * The Loma Prieta earthquake produced minimal disruption to the regional economy. The total economic disruption resulted in maximum losses to the Gross Regional Product of $725 million in 1 month and $2.9 billion in 2 months, but 80% of the loss was recovered during the first 6 months of 1990. Approximately 7,100 workers were laid off.

  8. Earthquake Fingerprints: Representing Earthquake Waveforms for Similarity-Based Detection

    NASA Astrophysics Data System (ADS)

    Bergen, K.; Beroza, G. C.

    2016-12-01

    New earthquake detection methods, such as Fingerprint and Similarity Thresholding (FAST), use fast approximate similarity search to identify similar waveforms in long-duration data without templates (Yoon et al. 2015). These methods have two key components: fingerprint extraction and an efficient search algorithm. Fingerprint extraction converts waveforms into fingerprints, compact signatures that represent short-duration waveforms for identification and search. Earthquakes are detected using an efficient indexing and search scheme, such as locality-sensitive hashing, that identifies similar waveforms in a fingerprint database. The quality of the search results, and thus the earthquake detection results, is strongly dependent on the fingerprinting scheme. Fingerprint extraction should map similar earthquake waveforms to similar waveform fingerprints to ensure a high detection rate, even under additive noise and small distortions. Additionally, fingerprints corresponding to noise intervals should be mutually dissimilar to minimize false detections. In this work, we compare the performance of multiple fingerprint extraction approaches for the earthquake waveform similarity search problem. We apply existing audio fingerprinting (used in content-based audio identification systems) and time series indexing techniques and present modified versions that are specifically adapted for seismic data. We also explore data-driven fingerprinting approaches that can take advantage of labeled or unlabeled waveform data. For each fingerprinting approach we measure its ability to identify similar waveforms in a low signal-to-noise setting, and quantify the trade-off between true and false detection rates in the presence of persistent noise sources. We compare the performance using known event waveforms from eight independent stations in the Northern California Seismic Network.
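The two components described, fingerprint extraction plus an efficient bucket-based search, can be sketched in miniature. The amplitude-threshold fingerprint and band-hashing scheme below are deliberately toy stand-ins for FAST's actual spectrogram/wavelet fingerprints and locality-sensitive hashing; the waveforms are invented.

```python
from collections import defaultdict

def fingerprint(waveform, threshold=0.5):
    # Toy "fingerprint": 1 where normalized absolute amplitude exceeds a threshold.
    peak = max(abs(x) for x in waveform) or 1.0
    return tuple(1 if abs(x) / peak > threshold else 0 for x in waveform)

def build_index(fingerprints, band=4):
    # Bucket fingerprints by bands of bits: similar fingerprints share buckets,
    # a crude analogue of locality-sensitive hashing.
    index = defaultdict(set)
    for i, fp in enumerate(fingerprints):
        for b in range(0, len(fp), band):
            index[(b, fp[b:b + band])].add(i)
    return index

def query(index, fp, band=4):
    hits = set()
    for b in range(0, len(fp), band):
        hits |= index.get((b, fp[b:b + band]), set())
    return hits

db = [fingerprint([0, 1, 9, 8, 1, 0, 0, 1]),
      fingerprint([0, 2, 8, 9, 1, 0, 0, 1]),   # similar event waveform
      fingerprint([5, 0, 0, 0, 0, 0, 9, 0])]   # dissimilar noise interval
idx = build_index(db)
print(sorted(query(idx, db[0])))  # -> [0, 1]
```

The two similar waveforms collapse to identical fingerprints and are retrieved together, while the dissimilar one never shares a bucket, which is exactly the property (similar-to-similar, noise-to-dissimilar) the abstract says a good fingerprinting scheme must have.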

  9. The Origin of High-angle Dip-slip Earthquakes at Geothermal Fields in California

    NASA Astrophysics Data System (ADS)

    Barbour, A. J.; Schoenball, M.; Martínez-Garzón, P.; Kwiatek, G.

    2016-12-01

    We examine the source mechanisms of earthquakes occurring in three California geothermal fields: The Geysers, Salton Sea, and Coso. We find source mechanisms ranging from strike-slip faulting, consistent with the tectonic settings, to dip-slip with unusually steep dip angles that are inconsistent with local structures. For example, we identify a fault zone in the Salton Sea Geothermal Field, imaged using precisely relocated hypocenters, with a dip angle of 60°, yet double-couple focal mechanisms indicate higher-angle dip-slip on ≥75° dipping planes. We observe considerable temporal variability in the distribution of source mechanisms. For example, at the Salton Sea we find that the number of high-angle dip-slip events increased after 1989, when net-extraction rates were highest. There is a concurrent decline in strike-slip and strike-slip-normal faulting, the mechanisms expected from regional tectonics. These unusual focal mechanisms and their spatio-temporal patterns are enigmatic in terms of our understanding of faulting in geothermal regions. While near-vertical fault planes are expected to slip in a strike-slip sense, and dip slip is expected to occur on moderately dipping faults, we observe dip slip on near-vertical fault planes. However, for plausible stress states, and accounting for geothermal production, the resolved fault planes should be stable. We systematically analyze the source mechanisms of these earthquakes using full moment tensor inversion to understand the constraints imposed by assuming a double-couple source. Applied to The Geysers field, we find a significant reduction in the number of high-angle dip-slip mechanisms using the full moment tensor. The remaining mechanisms displaying high-angle dip-slip could be consistent with faults accommodating subsidence and compaction associated with volumetric strain changes in the geothermal reservoir.

  10. California's Vulnerability to Volcanic Hazards: What's at Risk?

    NASA Astrophysics Data System (ADS)

    Mangan, M.; Wood, N. J.; Dinitz, L.

    2015-12-01

    California is a leader in comprehensive planning for devastating earthquakes, landslides, floods, and tsunamis. Far less attention, however, has focused on the potentially devastating impact of volcanic eruptions, despite the fact that they occur in the State about as frequently as the largest earthquakes on the San Andreas Fault Zone. At least 10 eruptions have occurred in the past 1,000 years—most recently in northern California (Lassen Peak 1914 to 1917)—and future volcanic eruptions are inevitable. The likelihood of renewed volcanism in California is about one in a few hundred to one in a few thousand annually. Eight young volcanoes, ranked as Moderate to Very High Threat [1], are dispersed throughout the State. Partially molten rock (magma) resides beneath at least seven of these—Medicine Lake Volcano, Mount Shasta, Lassen Volcanic Center, Clear Lake Volcanic Field, Long Valley Volcanic Region, Coso Volcanic Field, and Salton Buttes—causing earthquakes, toxic gas emissions, hydrothermal activity, and (or) ground deformation. Understanding the hazards and identifying what is at risk are the first steps in building community resilience to volcanic disasters. This study, prepared in collaboration with the State of California Governor's Office of Emergency Management and the California Geological Survey, provides a broad perspective on the State's exposure to volcano hazards by integrating mapped volcano hazard zones with geospatial data on at-risk populations, infrastructure, and resources. The study reveals that ~16 million acres fall within California's volcano hazard zones, along with ~190,000 permanent residents and some 22 million transitory visitors. Additionally, far-field disruption to key water delivery systems, agriculture, utilities, and air traffic is likely. Further site- and sector-specific analyses will lead to improved hazard mitigation efforts and more effective disaster response and recovery. [1] "Volcanic Threat and Monitoring Capabilities

  11. Identified EM Earthquake Precursors

    NASA Astrophysics Data System (ADS)

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After a number of custom rock experiments, two hypotheses were formed that could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate in creating strong EM fields, because rock has a very high electrical resistance, making it a high-quality insulator. Penetrative flow could not be corroborated either, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating, long-wave generation with sufficient amplitude, and the capability of penetrating solid rock. Additionally, fracture spaces, either air- or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field occurring with associated phases using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011, and outside the NE Texas town of Timpson in February 2013.
The antennae have mobility and observations were noted for

  12. USGS SAFRR Tsunami Scenario: Potential Impacts to the U.S. West Coast from a Plausible M9 Earthquake near the Alaska Peninsula

    NASA Astrophysics Data System (ADS)

    Ross, S.; Jones, L. M.; Wilson, R. I.; Bahng, B.; Barberopoulou, A.; Borrero, J. C.; Brosnan, D.; Bwarie, J. T.; Geist, E. L.; Johnson, L. A.; Hansen, R. A.; Kirby, S. H.; Knight, E.; Knight, W. R.; Long, K.; Lynett, P. J.; Miller, K. M.; Mortensen, C. E.; Nicolsky, D.; Oglesby, D. D.; Perry, S. C.; Porter, K. A.; Real, C. R.; Ryan, K. J.; Suleimani, E. N.; Thio, H. K.; Titov, V. V.; Wein, A. M.; Whitmore, P.; Wood, N. J.

    2012-12-01

    inform decision makers. The SAFRR Tsunami Scenario is organized by a coordinating committee with several working groups, including Earthquake Source, Paleotsunami/Geology Field Work, Tsunami Modeling, Engineering and Physical Impacts, Ecological Impacts, Emergency Management and Education, Social Vulnerability, Economic and Business Impacts, and Policy. In addition, the tsunami scenario process is being assessed and evaluated by researchers from the Natural Hazards Center at the University of Colorado at Boulder. The source event, defined by the USGS' Tsunami Source Working Group, is an earthquake similar to the 2011 Tohoku event, but set in the Semidi subduction sector, between Kodiak Island and the Shumagin Islands off the Pacific coast of the Alaska Peninsula. The Semidi sector is probably late in its earthquake cycle and comparisons of the geology and tectonic settings between Tohoku and the Semidi sector suggest that this location is appropriate. Tsunami modeling and inundation results have been generated for many areas along the California coast and elsewhere, including current velocity modeling for the ports of Los Angeles, Long Beach, and San Diego, and Ventura Harbor. Work on impacts to Alaska and Hawaii will follow. Note: Costas Synolakis (USC) is also an author of this abstract.

  13. Understanding earthquake hazards in urban areas - Evansville Area Earthquake Hazards Mapping Project

    USGS Publications Warehouse

    Boyd, Oliver S.

    2012-01-01

    The region surrounding Evansville, Indiana, has experienced minor damage from earthquakes several times in the past 200 years. Because of this history and the proximity of Evansville to the Wabash Valley and New Madrid seismic zones, there is concern among nearby communities about hazards from earthquakes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as a result of an earthquake and are able to design structures to withstand this estimated ground shaking. Earthquake-hazard maps provide one way of conveying such information and can help the region of Evansville prepare for future earthquakes and reduce earthquake-caused loss of life and financial and structural loss. The Evansville Area Earthquake Hazards Mapping Project (EAEHMP) has produced three types of hazard maps for the Evansville area: (1) probabilistic seismic-hazard maps show the ground motion that is expected to be exceeded with a given probability within a given period of time; (2) scenario ground-shaking maps show the expected shaking from two specific scenario earthquakes; (3) liquefaction-potential maps show how likely the strong ground shaking from the scenario earthquakes is to produce liquefaction. These maps complement the U.S. Geological Survey's National Seismic Hazard Maps but are more detailed regionally and take into account surficial geology, soil thickness, and soil stiffness; these elements greatly affect ground shaking.
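
    The "ground motion that is expected to be exceeded with a given probability within a given period of time" phrasing has a standard quantitative form: under a Poisson assumption, an annual exceedance rate λ gives probability P = 1 − exp(−λt) over t years. A minimal sketch; the 2%-in-50-years level is included only as a familiar example from building-code practice:

```python
import math

def prob_exceedance(annual_rate, years):
    # Poisson probability of at least one exceedance in `years` years.
    return 1.0 - math.exp(-annual_rate * years)

def return_period(prob, years):
    # Mean recurrence interval implied by exceedance probability `prob`
    # over `years` years (inverse of the relation above).
    return -years / math.log(1.0 - prob)

# The common "2% in 50 years" hazard level corresponds to a
# return period of roughly 2,475 years:
rp_2pct_50yr = return_period(0.02, 50)
```

    Hazard maps at different probability levels (e.g., 10% vs. 2% in 50 years) are simply slices of the same hazard curve at different return periods.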

  14. Prospects for earthquake prediction and control

    USGS Publications Warehouse

    Healy, J.H.; Lee, W.H.K.; Pakiser, L.C.; Raleigh, C.B.; Wood, M.D.

    1972-01-01

    The San Andreas fault is viewed, according to the concepts of seafloor spreading and plate tectonics, as a transform fault that separates the Pacific and North American plates and along which relative movements of 2 to 6 cm/year have been taking place. The resulting strain can be released by creep, by earthquakes of moderate size, or (as near San Francisco and Los Angeles) by great earthquakes. Microearthquakes, as mapped by a dense seismograph network in central California, generally coincide with zones of the San Andreas fault system that are creeping. Microearthquakes are few and scattered in zones where elastic energy is being stored. Changes in the rate of strain, as recorded by tiltmeter arrays, have been observed before several earthquakes of about magnitude 4. Changes in fluid pressure may control timing of seismic activity and make it possible to control natural earthquakes by controlling variations in fluid pressure in fault zones. An experiment in earthquake control is underway at the Rangely oil field in Colorado, where the rates of fluid injection and withdrawal in experimental wells are being controlled. © 1972.

  15. Larger earthquakes recur more periodically: New insights in the megathrust earthquake cycle from lacustrine turbidite records in south-central Chile

    NASA Astrophysics Data System (ADS)

    Moernaut, J.; Van Daele, M.; Fontijn, K.; Heirman, K.; Kempf, P.; Pino, M.; Valdebenito, G.; Urrutia, R.; Strasser, M.; De Batist, M.

    2018-01-01

    Historical and paleoseismic records in south-central Chile indicate that giant earthquakes on the subduction megathrust - such as in AD 1960 (Mw 9.5) - recur on average every ∼300 yr. Based on geodetic calculations of the interseismic moment accumulation since AD 1960, it was postulated that the area already has the potential for an Mw 8 earthquake. However, to estimate the short-term probability of such a great earthquake, one needs to frame this hypothesis within the long-term recurrence pattern of megathrust earthquakes in south-central Chile. Here we present two long lacustrine records, comprising up to 35 earthquake-triggered turbidites over the last 4800 yr. Calibration of turbidite extent with historical earthquake intensity reveals a different macroseismic intensity threshold (≥VII½ vs. ≥VI½) for the generation of turbidites at the coring sites. The strongest earthquakes (≥VII½) have longer recurrence intervals (292 ± 93 yr) than earthquakes with intensity of ≥VI½ (139 ± 69 yr). Moreover, distribution fitting and the coefficient of variation (CoV) of inter-event times indicate that the stronger earthquakes recur in a more periodic way (CoV: 0.32 vs. 0.5). Regional correlation of our multi-threshold shaking records with coastal paleoseismic data of complementary nature (tsunami, coseismic subsidence) suggests that the intensity ≥VII½ events repeatedly ruptured the same part of the megathrust over a distance of at least ∼300 km and can be assigned to Mw ≥ 8.6. We hypothesize that a zone of high plate locking - identified by geodetic studies and large slip in AD 1960 - acts as a dominant regional asperity, on which elastic strain builds up over several centuries and mostly gets released in quasi-periodic great and giant earthquakes. Our paleo-records indicate that Poissonian recurrence models are inadequate to describe large megathrust earthquake recurrence in south-central Chile. Moreover, they show an enhanced
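
    The coefficient of variation used here is simply the standard deviation of the inter-event times divided by their mean: CoV → 0 for clock-like recurrence, ≈1 for a Poisson process, and >1 for clustered behavior. A sketch:

```python
import numpy as np

def coefficient_of_variation(event_times):
    # CoV of inter-event times: ~0 = periodic, ~1 = Poisson, >1 = clustered.
    intervals = np.diff(np.sort(np.asarray(event_times, dtype=float)))
    return intervals.std(ddof=1) / intervals.mean()
```

    The reported CoV of 0.32 for the strongest events is consistent with the quoted interval statistics (93/292 ≈ 0.32), i.e. markedly more periodic than a Poisson process would allow.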

  16. Chapter A. The Loma Prieta, California, Earthquake of October 17, 1989 - Lifelines

    USGS Publications Warehouse

    Schiff, Anshel J.

    1998-01-01

    To the general public who had their televisions tuned to watch the World Series, the 1989 Loma Prieta earthquake was a lifelines earthquake. It was the images seen around the world of the collapsed Cypress Street viaduct, with the frantic and heroic efforts to pull survivors from the structure that was billowing smoke; the collapsed section of the San Francisco-Oakland Bay Bridge and subsequent home video of a car plunging off the open span; and the spectacular fire in the Marina District of San Francisco fed by a broken gasline. To many of the residents of the San Francisco Bay region, the relation of lifelines to the earthquake was characterized by sitting in the dark because of power outage, the inability to make telephone calls because of network congestion, and the slow and snarled traffic. Had the public been aware of the actions of the engineers and tradespeople working for the utilities and other lifeline organizations on the emergency response and restoration of lifelines, the lifeline characteristics of this earthquake would have been even more significant. Unobserved by the public were the warlike devastation in several electrical-power substations, the 13 miles of gas-distribution lines that had to be replaced in several communities, and the more than 1,200 leaks and breaks in water mains and service connections that had to be excavated and repaired. Like the 1971 San Fernando, Calif., earthquake, which was a seminal event for activity to improve the earthquake performance of lifelines, the 1989 Loma Prieta earthquake demonstrated that the tasks of preparing lifelines in 'earthquake country' were incomplete; indeed, new lessons had to be learned.

  17. Simulation Based Earthquake Forecasting with RSQSim

    NASA Astrophysics Data System (ADS)

    Gilchrist, J. J.; Jordan, T. H.; Dieterich, J. H.; Richards-Dinger, K. B.

    2016-12-01

    We are developing a physics-based forecasting model for earthquake ruptures in California. We employ the 3D boundary element code RSQSim to generate synthetic catalogs with millions of events that span up to a million years. The simulations incorporate rate-state fault constitutive properties in complex, fully interacting fault systems. The Unified California Earthquake Rupture Forecast Version 3 (UCERF3) model and data sets are used for calibration of the catalogs and specification of fault geometry. Fault slip rates match the UCERF3 geologic slip rates and catalogs are tuned such that earthquake recurrence matches the UCERF3 model. Utilizing the Blue Waters Supercomputer, we produce a suite of million-year catalogs to investigate the epistemic uncertainty in the physical parameters used in the simulations. In particular, values of the rate- and state-friction parameters a and b, the initial shear and normal stress, as well as the earthquake slip speed, are varied over several simulations. In addition to testing multiple models with homogeneous values of the physical parameters, the parameters a, b, and the normal stress are varied with depth as well as in heterogeneous patterns across the faults. Cross validation of UCERF3 and RSQSim is performed within the SCEC Collaboratory for Interseismic Simulation and Modeling (CISM) to determine the effect of the uncertainties in physical parameters observed in the field and measured in the lab, on the uncertainties in probabilistic forecasting. We are particularly interested in the short-term hazards of multi-event sequences due to complex faulting and multi-fault ruptures.

  18. Annualized earthquake loss estimates for California and their sensitivity to site amplification

    USGS Publications Warehouse

    Chen, Rui; Jaiswal, Kishor; Bausch, D; Seligson, H; Wills, C.J.

    2016-01-01

    Input datasets for annualized earthquake loss (AEL) estimation for California were updated recently by the scientific community, and include the National Seismic Hazard Model (NSHM), site‐response model, and estimates of shear‐wave velocity. Additionally, the Federal Emergency Management Agency’s loss estimation tool, Hazus, was updated to include the most recent census and economic exposure data. These enhancements necessitated a revisit to our previous AEL estimates and a study of the sensitivity of AEL estimates to alternate inputs for site amplification. The NSHM ground motions for a uniform site condition are modified to account for the effect of local near‐surface geology. The site conditions are approximated in three ways: (1) by VS30 (time‐averaged shear‐wave velocity in the upper 30 m) value obtained from a geology‐ and topography‐based map consisting of 15 VS30 groups, (2) by site classes categorized according to National Earthquake Hazards Reduction Program (NEHRP) site classification, and (3) by a uniform NEHRP site class D. In case 1, ground motions are amplified using the Seyhan and Stewart (2014) semiempirical nonlinear amplification model. In cases 2 and 3, ground motions are amplified using the 2014 version of the NEHRP site amplification factors, which are also based on the Seyhan and Stewart model but are approximated to facilitate their use for building code applications. Estimated AELs are presented at multiple resolutions, starting with the state-level assessment and followed by detailed assessments for counties, metropolitan statistical areas (MSAs), and cities. The AEL estimate at the state level is ∼$3.7 billion, 70% of which is contributed from Los Angeles–Long Beach–Santa Ana, San Francisco–Oakland–Fremont, and Riverside–San Bernardino–Ontario MSAs. The statewide AEL estimate is insensitive to alternate assumptions of site amplification. However, we note significant differences in AEL estimates
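
    Conceptually, an annualized loss collapses the full hazard into a single expected value: a sum over hazard levels (or scenarios) of annual occurrence rate times loss. The following is a minimal sketch of that rate-weighted-sum idea with hypothetical numbers, not Hazus's actual computation:

```python
def annualized_loss(scenarios):
    # Expected annual loss: sum of (annual occurrence rate * loss) over
    # a set of loss-causing scenarios approximating the hazard curve.
    return sum(rate * loss for rate, loss in scenarios)

# Hypothetical example: a 1-in-100-yr event causing $1B and a
# 1-in-1000-yr event causing $10B each contribute ~$10M per year.
ael = annualized_loss([(0.01, 1e9), (0.001, 1e10)])
```

    Site amplification enters upstream of this sum, by scaling the ground motions that determine each scenario's loss; that is why the statewide total can be insensitive even when local estimates shift.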

  19. The 2015 Fillmore earthquake swarm and possible crustal deformation mechanisms near the bottom of the eastern Ventura Basin, California

    USGS Publications Warehouse

    Hauksson, Egill; Andrews, Jennifer; Plesch, Andreas; Shaw, John H.; Shelly, David R.

    2016-01-01

    The 2015 Fillmore swarm occurred about 6 km west of the city of Fillmore in Ventura County, California, and was located beneath the eastern part of the actively subsiding Ventura basin at depths from 11.8 to 13.8 km, similar to two previous swarms in the area. Template‐matching event detection showed that it started on 5 July 2015 at 2:21 UTC with an M∼1.0 earthquake. The swarm exhibited unusual episodic spatial and temporal migrations and unusual diversity in the nodal planes of the focal mechanisms as compared to the simple hypocenter‐defined plane. It was also noteworthy because it consisted of >1400 events of M≥0.0, with M 2.8 being the largest event. We suggest that fluids released by metamorphic dehydration processes, migration of fluids along a detachment zone, and cascading asperity failures caused this prolific earthquake swarm, but other mechanisms (such as simple mainshock–aftershock stress triggering or a regional aseismic creep event) are less likely. Dilatant strengthening may be a mechanism that causes the temporal decay of the swarm as pore‐pressure drop increased the effective normal stress, and counteracted the instability driving the swarm.

  20. Estimating earthquake location and magnitude from seismic intensity data

    USGS Publications Warehouse

    Bakun, W.H.; Wentworth, C.M.

    1997-01-01

    Analysis of Modified Mercalli intensity (MMI) observations for a training set of 22 California earthquakes suggests a strategy for bounding the epicentral region and moment magnitude M from MMI observations only. We define an intensity magnitude MI that is calibrated to be equal in the mean to M: MI = mean(Mi), where Mi = (MMIi + 3.29 + 0.0206 * Δi)/1.68 and Δi is the epicentral distance (km) of observation MMIi. The epicentral region is bounded by contours of rms[MI] = rms(MI - Mi) - rms0(MI - Mi), where rms is the root mean square, rms0(MI - Mi) is the minimum rms over a grid of assumed epicenters, and empirical site corrections and a distance weighting function are used. Empirical contour values for bounding the epicenter location and empirical bounds for M estimated from MI appropriate for different levels of confidence and different quantities of intensity observations are tabulated. The epicentral region bounds and MI obtained for an independent test set of western California earthquakes are consistent with the instrumental epicenters and moment magnitudes of these earthquakes. The analysis strategy is particularly appropriate for the evaluation of pre-1900 earthquakes for which the only available data are a sparse set of intensity observations.
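
    The calibration above translates directly into code. A sketch of the magnitude part only, with the epicenter grid search, site corrections, and distance weighting omitted; the felt-report values below are hypothetical:

```python
def intensity_magnitude(observations):
    # Bakun & Wentworth (1997): Mi = (MMI_i + 3.29 + 0.0206 * Delta_i) / 1.68,
    # where Delta_i is epicentral distance in km; MI is the mean of the Mi.
    mi = [(mmi + 3.29 + 0.0206 * dist) / 1.68 for mmi, dist in observations]
    return sum(mi) / len(mi)

# Hypothetical felt reports as (MMI, epicentral distance in km) pairs:
reports = [(7, 10), (6, 40), (5, 90), (4, 150)]
m_est = intensity_magnitude(reports)
```

    In the full method, this mean is recomputed for each trial epicenter on a grid, and the rms misfit surface bounds both the epicentral region and M.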

  1. Retardations in fault creep rates before local moderate earthquakes along the San Andreas fault system, central California

    USGS Publications Warehouse

    Burford, R.O.

    1988-01-01

    Records of shallow aseismic slip (fault creep) obtained along parts of the San Andreas and Calaveras faults in central California demonstrate that significant changes in creep rates often have been associated with local moderate earthquakes. An immediate postearthquake increase followed by gradual, long-term decay back to a previous background rate is generally the most obvious earthquake effect on fault creep. This phenomenon, identified as aseismic afterslip, usually is characterized by above-average creep rates for several months to a few years. In several cases, minor step-like movements, called coseismic slip events, have occurred at or near the times of mainshocks. One extreme case of coseismic slip, recorded at Cienega Winery on the San Andreas fault 17.5 km southeast of San Juan Bautista, consisted of 11 mm of sudden displacement coincident with earthquakes of ML=5.3 and ML=5.2 that occurred 2.5 minutes apart on 9 April 1961. At least one of these shocks originated on the main fault beneath the winery. Creep activity subsequently stopped at the winery for 19 months, then gradually returned to a nearly steady rate slightly below the previous long-term average. The phenomena mentioned above can be explained in terms of simple models consisting of relatively weak material along shallow reaches of the fault responding to changes in load imposed by sudden slip within the underlying seismogenic zone. In addition to coseismic slip and afterslip phenomena, however, pre-earthquake retardations in creep rates also have been observed. Onsets of significant, persistent decreases in creep rates have occurred at several sites 12 months or more before the times of moderate earthquakes. A 44-month retardation before the 1979 ML=5.9 Coyote Lake earthquake on the Calaveras fault was recorded at the Shore Road creepmeter site 10 km northwest of Hollister. 
Creep retardation on the San Andreas fault near San Juan Bautista has been evident in records from one creepmeter site for

  2. Retardations in fault creep rates before local moderate earthquakes along the San Andreas fault system, central California

    NASA Astrophysics Data System (ADS)

    Burford, Robert O.

    1988-06-01

    Records of shallow aseismic slip (fault creep) obtained along parts of the San Andreas and Calaveras faults in central California demonstrate that significant changes in creep rates often have been associated with local moderate earthquakes. An immediate postearthquake increase followed by gradual, long-term decay back to a previous background rate is generally the most obvious earthquake effect on fault creep. This phenomenon, identified as aseismic afterslip, usually is characterized by above-average creep rates for several months to a few years. In several cases, minor step-like movements, called coseismic slip events, have occurred at or near the times of mainshocks. One extreme case of coseismic slip, recorded at Cienega Winery on the San Andreas fault 17.5 km southeast of San Juan Bautista, consisted of 11 mm of sudden displacement coincident with earthquakes of ML=5.3 and ML=5.2 that occurred 2.5 minutes apart on 9 April 1961. At least one of these shocks originated on the main fault beneath the winery. Creep activity subsequently stopped at the winery for 19 months, then gradually returned to a nearly steady rate slightly below the previous long-term average. The phenomena mentioned above can be explained in terms of simple models consisting of relatively weak material along shallow reaches of the fault responding to changes in load imposed by sudden slip within the underlying seismogenic zone. In addition to coseismic slip and afterslip phenomena, however, pre-earthquake retardations in creep rates also have been observed. Onsets of significant, persistent decreases in creep rates have occurred at several sites 12 months or more before the times of moderate earthquakes. A 44-month retardation before the 1979 ML=5.9 Coyote Lake earthquake on the Calaveras fault was recorded at the Shore Road creepmeter site 10 km northwest of Hollister. Creep retardation on the San Andreas fault near San Juan Bautista has been evident in records from one creepmeter

  3. The 2006 Bahía Asunción Earthquake Swarm: Seismic Evidence of Active Deformation Along the Western Margin of Baja California Sur, Mexico

    NASA Astrophysics Data System (ADS)

    Munguía, Luis; Mayer, Sergio; Aguirre, Alfredo; Méndez, Ignacio; González-Escobar, Mario; Luna, Manuel

    2016-10-01

    The study of the Bahía Asunción earthquake swarm is important for two reasons. First, the earthquakes are clear evidence of present activity along the zone of deformation on the Pacific margin of Baja California. The swarm, with earthquakes of magnitude Mw up to 5.0, occurred on the coastline of the peninsula, showing that the Tosco-Abreojos zone of deformation is wider than previously thought. Second, the larger earthquakes in the swarm caused some damage and much concern in Bahía Asunción, a small town located in the zone of epicenters. We relocated the larger earthquakes with regional and/or local seismic data. Our results put the earthquake sources below the urban area of Bahía Asunción, at 40-50 km to the north of the teleseismically determined epicenters. In addition, these new locations are in the area of epicenters of many smaller events that were located with data from local temporary stations. This area trends in an E-W direction and has dimensions of approximately 15 km by 10 km. Most earthquakes had sources at depths between 4 and 9 km. A composite focal mechanism for the smaller earthquakes indicated that right-lateral strike-slip motion and pure-normal faulting occurred during this swarm. Interestingly, the ANSS earthquake catalog of the United States Geological Survey (USGS) reported each of these faulting styles for two large events of the swarm, one occurring two days before the other. We associate the strike-slip event with the San Roque Fault and the normal-faulting event with the Asunción Fault. However, further study is needed to verify this possible relation between the faults and the earthquakes. In addition, we recorded peak accelerations of up to 0.63 g with an accelerometer installed in Bahía Asunción. At this site, an earthquake of Mw 4.9 produced those high values at a distance of 4.1 km. We also used the acceleration dataset from this site

  4. Triggered surface slips in the Coachella Valley area associated with the 1992 Joshua Tree and Landers, California, Earthquakes

    USGS Publications Warehouse

    Rymer, M.J.

    2000-01-01

    The Coachella Valley area was strongly shaken by the 1992 Joshua Tree (23 April) and Landers (28 June) earthquakes, and both events caused triggered slip on active faults within the area. Triggered slip associated with the Joshua Tree earthquake was on a newly recognized fault, the East Wide Canyon fault, near the southwestern edge of the Little San Bernardino Mountains. Slip associated with the Landers earthquake formed along the San Andreas fault in the southeastern Coachella Valley. Surface fractures formed along the East Wide Canyon fault in association with the Joshua Tree earthquake. The fractures extended discontinuously over a 1.5-km stretch of the fault, near its southern end. Sense of slip was consistently right-oblique, west side down, similar to the long-term style of faulting. Measured offset values were small, with right-lateral and vertical components of slip ranging from 1 to 6 mm and 1 to 4 mm, respectively. This is the first documented historic slip on the East Wide Canyon fault, which was first mapped only months before the Joshua Tree earthquake. Surface slip associated with the Joshua Tree earthquake most likely developed as triggered slip given its 5 km distance from the Joshua Tree epicenter and aftershocks. As revealed in a trench investigation, slip formed in an area with only a thin (<3 m thick) veneer of alluvium in contrast to earlier documented triggered slip events in this region, all in the deep basins of the Salton Trough. A paleoseismic trench study in an area of 1992 surface slip revealed evidence of two and possibly three surface faulting events on the East Wide Canyon fault during the late Quaternary, probably latest Pleistocene (first event) and mid- to late Holocene (second two events). About two months after the Joshua Tree earthquake, the Landers earthquake then triggered slip on many faults, including the San Andreas fault in the southeastern Coachella Valley. 
Surface fractures associated with this event formed discontinuous

  5. CISN ShakeAlert: Accounting for site amplification effects and quantifying time and spatial dependence of uncertainty estimates in the Virtual Seismologist earthquake early warning algorithm

    NASA Astrophysics Data System (ADS)

    Caprio, M.; Cua, G. B.; Wiemer, S.; Fischer, M.; Heaton, T. H.; CISN EEW Team

    2011-12-01

    The Virtual Seismologist (VS) earthquake early warning (EEW) algorithm is one of three EEW approaches being incorporated into the California Integrated Seismic Network (CISN) ShakeAlert system, a prototype EEW system being tested in real-time in California. The VS algorithm, implemented by the Swiss Seismological Service at ETH Zurich, is a Bayesian approach to EEW, wherein the most probable source estimate at any given time is a combination of contributions from a likelihood function that evolves in response to incoming data from the on-going earthquake, and selected prior information, which can include factors such as network topology, the Gutenberg-Richter relationship or previously observed seismicity. The VS codes have been running in real-time at the Southern California Seismic Network (SCSN) since July 2008, and at the Northern California Seismic Network (NCSN) since February 2009. With the aim of improving the convergence of real-time VS magnitude estimates to network magnitudes, we evaluate various empirical and Vs30-based approaches to accounting for site amplification. Empirical station corrections for SCSN stations are derived from M>3.0 events from 2005 through 2009. We evaluate the performance of the various approaches using an independent 2010 dataset. In addition, we analyze real-time VS performance from 2008 to the present to quantify the time and spatial dependence of VS uncertainty estimates. We also summarize real-time VS performance for significant 2011 events in California. Improved magnitude and uncertainty estimates potentially increase the utility of EEW information for end-users, particularly those intending to automate damage-mitigating actions based on real-time information.
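
    The Bayesian structure described above (posterior proportional to a data likelihood times a prior such as Gutenberg-Richter) can be illustrated on a magnitude grid. This is a toy sketch, not the VS implementation; the Gaussian likelihood, its width, and the b-value are all assumptions:

```python
import numpy as np

def magnitude_posterior(mag_grid, likelihood, b_value=1.0):
    # Posterior over magnitude: Gutenberg-Richter prior p(M) ~ 10^(-b*M)
    # times the data likelihood, normalized over the grid.
    prior = 10.0 ** (-b_value * mag_grid)
    post = prior * likelihood
    return post / post.sum()

# Toy example: suppose early amplitude data suggest M ~ 5.0 with sigma 0.3;
# the Gutenberg-Richter prior pulls the most probable magnitude slightly lower,
# since small earthquakes are far more common than large ones.
mags = np.arange(3.0, 8.0, 0.05)
like = np.exp(-((mags - 5.0) ** 2) / (2 * 0.3 ** 2))
post = magnitude_posterior(mags, like)
```

    As more stations report, the likelihood narrows and dominates the prior, which is why early estimates drift toward the network magnitude over time.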

  6. "Did you feel it?" Intensity data: A surprisingly good measure of earthquake ground motion

    USGS Publications Warehouse

    Atkinson, G.M.; Wald, D.J.

    2007-01-01

    The U.S. Geological Survey is tapping a vast new source of engineering seismology data through its "Did You Feel It?" (DYFI) program, which collects online citizen responses to earthquakes. To date, more than 750,000 responses have been compiled in the United States alone. The DYFI data make up in quantity what they may lack in scientific quality and offer the potential to resolve longstanding issues in earthquake ground-motion science. Such issues have been difficult to address due to the paucity of instrumental ground-motion data in regions of low seismicity. In particular, DYFI data provide strong evidence that earthquake stress drops, which control the strength of high-frequency ground shaking, are higher in the central and eastern United States (CEUS) than in California. Higher earthquake stress drops, coupled with lower attenuation of shaking with distance, result in stronger overall shaking over a wider area and thus more potential damage for CEUS earthquakes in comparison to those of equal magnitude in California - a fact also definitively captured with these new DYFI data and maps.

  7. Potential Effects of a Scenario Earthquake on the Economy of Southern California: Baseline County-Level Migration Characteristics and Trends 1995-2000 and 2001-2010

    USGS Publications Warehouse

    Sherrouse, Benson C.; Hester, David J.

    2008-01-01

The Multi-Hazards Demonstration Project (MHDP) is a collaboration between the U.S. Geological Survey (USGS) and various partners from the public and private sectors and academia, meant to improve Southern California's resiliency to natural hazards. In support of the MHDP objectives, the ShakeOut Scenario was developed. It describes a magnitude 7.8 earthquake along the southernmost 300 kilometers (200 miles) of the San Andreas Fault, identified by geoscientists as a plausible event that would cause moderate to strong shaking over much of the eight-county (Imperial, Kern, Los Angeles, Orange, Riverside, San Bernardino, San Diego, and Ventura) Southern California region. This report uses historical, estimated, and projected population data from several Federal and State data sources to estimate baseline characteristics and trends of the region's population migration (that is, changes in a person's place of residence over time). The analysis characterizes migration by various demographic, economic, family, and household variables for the period 1995-2000. It also uses existing estimates (beginning in 2001) of the three components of population change - births, deaths, and migration - to extrapolate near-term projections of county-level migration trends through 2010. The 2010 date was chosen to provide baseline projections corresponding to a two-year recovery period following the November 2008 date selected for the occurrence of the ShakeOut Scenario earthquake. The baseline characteristics and projections will assist in evaluating the effects of inflow and outflow migration trends under alternative futures in which the simulated M7.8 earthquake either does or does not occur, and the impact of the event on housing and jobs, as well as on community composition and regional-economy changes driven by the dispersion of intellectual, physical, economic, and cultural capital.

  8. Great earthquakes along the Western United States continental margin: implications for hazards, stratigraphy and turbidite lithology

    NASA Astrophysics Data System (ADS)

    Nelson, C. H.; Gutiérrez Pastor, J.; Goldfinger, C.; Escutia, C.

    2012-11-01

We summarize the importance of great earthquakes (Mw ≳ 8) for hazards, stratigraphy of basin floors, and turbidite lithology along the active tectonic continental margins of the Cascadia subduction zone and the northern San Andreas Transform Fault by utilizing studies of swath bathymetry, visual core descriptions, grain-size analysis, X-ray radiographs, and physical properties. Recurrence times of Holocene turbidites as proxies for earthquakes on the Cascadia and northern California margins are analyzed using two methods: (1) radiometric dating (14C method) and (2) relative dating, using hemipelagic sediment thickness and sedimentation rates (H method). The H method provides (1) the best estimate of minimum recurrence times, which are the most important for seismic hazards risk analysis, and (2) the most complete dataset of recurrence times, which shows a normal distribution pattern for paleoseismic turbidite frequencies. We observe that, on these tectonically active continental margins, during the sea-level highstand of Holocene time, triggering of turbidity currents is controlled dominantly by earthquakes, and paleoseismic turbidites have an average recurrence time of ~550 yr in northern Cascadia Basin and ~200 yr along the northern California margin. The minimum recurrence times for great earthquakes are approximately 300 yr for the Cascadia subduction zone and 130 yr for the northern San Andreas Fault, which indicates that both fault systems are within (Cascadia) or very close to (San Andreas) the early window for another great earthquake. On active tectonic margins with great earthquakes, the volumes of mass transport deposits (MTDs) are limited on basin floors along the margins. The maximum run-out distances of MTD sheets across abyssal-basin floors along active margins are an order of magnitude less (~100 km) than on passive margins (~1000 km). The great earthquakes along the Cascadia and northern California margins cause seismic strengthening of the sediment, which
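The relative-dating H method described above amounts to a simple quotient: the time between two successive earthquake-triggered turbidites is the thickness of hemipelagic sediment deposited between them divided by the hemipelagic sedimentation rate. A minimal sketch, with hypothetical thickness and rate values chosen so the result matches the ~550 yr Cascadia average quoted above:

```python
# Minimal sketch of the "H method": interseismic time = hemipelagic
# sediment thickness between turbidites / hemipelagic sedimentation rate.
# The thickness (11 cm) and rate (20 cm/kyr) below are hypothetical.

def recurrence_time_yr(hemipelagic_thickness_cm, sed_rate_cm_per_kyr):
    """Interseismic time (years) implied by hemipelagic deposition."""
    return hemipelagic_thickness_cm * 1000.0 / sed_rate_cm_per_kyr

print(recurrence_time_yr(11.0, 20.0))   # → 550.0
```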

  9. CyberShake: A Physics-Based Seismic Hazard Model for Southern California

    NASA Astrophysics Data System (ADS)

    Graves, Robert; Jordan, Thomas H.; Callaghan, Scott; Deelman, Ewa; Field, Edward; Juve, Gideon; Kesselman, Carl; Maechling, Philip; Mehta, Gaurang; Milner, Kevin; Okaya, David; Small, Patrick; Vahi, Karan

    2011-03-01

    CyberShake, as part of the Southern California Earthquake Center's (SCEC) Community Modeling Environment, is developing a methodology that explicitly incorporates deterministic source and wave propagation effects within seismic hazard calculations through the use of physics-based 3D ground motion simulations. To calculate a waveform-based seismic hazard estimate for a site of interest, we begin with Uniform California Earthquake Rupture Forecast, Version 2.0 (UCERF2.0) and identify all ruptures within 200 km of the site of interest. We convert the UCERF2.0 rupture definition into multiple rupture variations with differing hypocenter locations and slip distributions, resulting in about 415,000 rupture variations per site. Strain Green Tensors are calculated for the site of interest using the SCEC Community Velocity Model, Version 4 (CVM4), and then, using reciprocity, we calculate synthetic seismograms for each rupture variation. Peak intensity measures are then extracted from these synthetics and combined with the original rupture probabilities to produce probabilistic seismic hazard curves for the site. Being explicitly site-based, CyberShake directly samples the ground motion variability at that site over many earthquake cycles (i.e., rupture scenarios) and alleviates the need for the ergodic assumption that is implicitly included in traditional empirically based calculations. Thus far, we have simulated ruptures at over 200 sites in the Los Angeles region for ground shaking periods of 2 s and longer, providing the basis for the first generation CyberShake hazard maps. Our results indicate that the combination of rupture directivity and basin response effects can lead to an increase in the hazard level for some sites, relative to that given by a conventional Ground Motion Prediction Equation (GMPE). Additionally, and perhaps more importantly, we find that the physics-based hazard results are much more sensitive to the assumed magnitude-area relations and
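The final step described above, combining peak intensity measures from the synthetics with the rupture probabilities to form a probabilistic hazard curve, can be sketched as follows. This is a schematic illustration, not the CyberShake code; the rupture rates and peak spectral-acceleration values are invented, and each rupture variation is assumed equally likely.

```python
# Schematic hazard-curve assembly from simulated rupture variations
# (illustrative only; rates and peak values are invented).

def hazard_curve(ruptures, im_levels):
    """ruptures: list of (annual_rate, peaks), where peaks holds the
    simulated peak intensity measure of each rupture variation.
    Returns the annual exceedance rate at each intensity level, treating
    every variation of a rupture as equally likely."""
    curve = []
    for x in im_levels:
        rate = 0.0
        for annual_rate, peaks in ruptures:
            frac = sum(1 for p in peaks if p > x) / len(peaks)  # P(IM > x | rupture)
            rate += annual_rate * frac
        curve.append(rate)
    return curve

ruptures = [
    (0.01,  [0.12, 0.20, 0.35, 0.18]),   # rate (1/yr), peak SA (g) per variation
    (0.002, [0.40, 0.55, 0.31, 0.62]),
]
print(hazard_curve(ruptures, [0.1, 0.3, 0.5]))   # exceedance rates (1/yr)
```

Replacing the per-rupture exceedance fraction with a GMPE's lognormal exceedance probability recovers the conventional empirical calculation the abstract contrasts against.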

  10. CyberShake: A Physics-Based Seismic Hazard Model for Southern California

    USGS Publications Warehouse

    Graves, R.; Jordan, T.H.; Callaghan, S.; Deelman, E.; Field, E.; Juve, G.; Kesselman, C.; Maechling, P.; Mehta, G.; Milner, K.; Okaya, D.; Small, P.; Vahi, K.

    2011-01-01

CyberShake, as part of the Southern California Earthquake Center's (SCEC) Community Modeling Environment, is developing a methodology that explicitly incorporates deterministic source and wave propagation effects within seismic hazard calculations through the use of physics-based 3D ground motion simulations. To calculate a waveform-based seismic hazard estimate for a site of interest, we begin with Uniform California Earthquake Rupture Forecast, Version 2.0 (UCERF2.0) and identify all ruptures within 200 km of the site of interest. We convert the UCERF2.0 rupture definition into multiple rupture variations with differing hypocenter locations and slip distributions, resulting in about 415,000 rupture variations per site. Strain Green Tensors are calculated for the site of interest using the SCEC Community Velocity Model, Version 4 (CVM4), and then, using reciprocity, we calculate synthetic seismograms for each rupture variation. Peak intensity measures are then extracted from these synthetics and combined with the original rupture probabilities to produce probabilistic seismic hazard curves for the site. Being explicitly site-based, CyberShake directly samples the ground motion variability at that site over many earthquake cycles (i.e., rupture scenarios) and alleviates the need for the ergodic assumption that is implicitly included in traditional empirically based calculations. Thus far, we have simulated ruptures at over 200 sites in the Los Angeles region for ground shaking periods of 2 s and longer, providing the basis for the first generation CyberShake hazard maps. Our results indicate that the combination of rupture directivity and basin response effects can lead to an increase in the hazard level for some sites, relative to that given by a conventional Ground Motion Prediction Equation (GMPE). Additionally, and perhaps more importantly, we find that the physics-based hazard results are much more sensitive to the assumed magnitude-area relations and

  11. Forecasting of future earthquakes in the northeast region of India considering energy released concept

    NASA Astrophysics Data System (ADS)

    Zarola, Amit; Sil, Arjun

    2018-04-01

This study presents the forecasting of the time and magnitude of the next earthquake in northeast India, using four probability distribution models (Gamma, Lognormal, Weibull, and Log-logistic) and an updated catalog of earthquakes of magnitude Mw ≥ 6.0 that occurred from 1737 to 2015 in the study area. On the basis of the past seismicity of the region, two types of conditional probabilities have been estimated using the best-fit model and its parameters. The first is the probability that the seismic energy (e × 10^20 ergs) expected to be released in the future earthquake exceeds a certain level of seismic energy (E × 10^20 ergs). The second is the probability that the seismic energy expected to be released per year (a × 10^20 ergs/year) exceeds a certain level of seismic energy per year (A × 10^20 ergs/year). The log-likelihood functions (ln L) were also estimated for all four probability distribution models; a higher value of ln L indicates a better-fitting model, and a lower value a worse one. The time of the future earthquake is forecast by dividing the total seismic energy expected to be released in the future earthquake by the total seismic energy expected to be released per year. The epicentres of the recent 4 January 2016 Manipur earthquake (M 6.7), the 13 April 2016 Myanmar earthquake (M 6.9), and the 24 August 2016 Myanmar earthquake (M 6.8) are located in zones Z.12, Z.16, and Z.15, respectively, which are identified seismic source zones in the study area, showing that the proposed techniques and models yield good forecasting accuracy.
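The forecasting rule described above, dividing the seismic energy expected in the next earthquake by the energy expected to be released per year, can be illustrated with a short sketch. The Gutenberg-Richter energy-magnitude relation (log10 E = 11.8 + 1.5 Mw, E in ergs) and the example magnitudes are our assumptions for illustration, not values from the study:

```python
# Sketch of energy-based forecasting: waiting time to the next event =
# (energy expected in that event) / (energy expected per year).

def gutenberg_richter_energy(mw):
    """Seismic energy in ergs from magnitude: log10(E) = 11.8 + 1.5*Mw."""
    return 10.0 ** (11.8 + 1.5 * mw)

def forecast_interval_years(event_energy, annual_energy_rate):
    """Forecast waiting time (years) to the next earthquake."""
    return event_energy / annual_energy_rate

# Hypothetical zone: next event expected to be Mw 7.0, while the zone
# releases the energy equivalent of one Mw 6.0 event per year.
t = forecast_interval_years(gutenberg_richter_energy(7.0),
                            gutenberg_richter_energy(6.0))
print(round(t, 1))   # ≈ 10**1.5 ≈ 31.6 years
```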

  12. Anomalous geomagnetic variations associated with Parkfield (Ms=6.0, 28-SEP-2004, California, USA) earthquake

    NASA Astrophysics Data System (ADS)

    Kotsarenko, A. A.; Pilinets, S. A.; Perez Enriquez, R.; Lopez Cruz Abeyro, J. A.

    2007-05-01

Analysis of geomagnetic and telluric data measured at station PRK (Parkfield, ULF flux-gate 3-axial magnetometer) during the week up to and including the day of the major EQ (earthquake, Ms=6.0, 28-SEP-2004, 17:15:24) near Parkfield, California, USA, is presented. Spectral analysis reveals ULF geomagnetic disturbances observed the day before the event, Sep 27, at 15:00-20:00 UT, and on the day of the EQ, Sep 28, at 11:00-19:00. Filtering in the corresponding frequency band f = 0.25-0.5 Hz gives the following amplitude estimates: up to 20 pT for the magnetic channels and 1.5 μV/km for the telluric ones. The observed phenomena occur under quiet geomagnetic conditions (|Dst|<20 nT); review of data from reference stations situated far from the EQ epicenter (330 km) does not reveal any similar effect. Moreover, the QuakeFinder research group (http://www.quakefinder.com) obtained very similar results (ELF-range instrument, placed about 50 km from the EQ epicenter) for the day of the EQ. These observations suggest a localized source, possibly of ionospheric or tectonic origin rather than magnetospheric. Comparative analysis of the two stations shows that we observed the lower-frequency part of the ULF-ELF burst, localized in the frequency range 0.25-1 Hz, generated 9 hours before the earthquake. Acknowledgements: The authors are grateful to Malcolm Johnston for providing the geomagnetic data.

  13. Submarine landslides of the Southern California Borderland

    USGS Publications Warehouse

    Lee, H.J.; Greene, H. Gary; Edwards, B.D.; Fisher, M.A.; Normark, W.R.

    2009-01-01

Conventional bathymetry, sidescan-sonar and seismic-reflection data, and recent multibeam surveys of large parts of the Southern California Borderland disclose the presence of numerous submarine landslides. Most of these features are fairly small, with lateral dimensions less than ~2 km. In areas where multibeam surveys are available, only two large landslide complexes were identified on the mainland slope: the Goleta slide in Santa Barbara Channel and the Palos Verdes debris avalanche on the San Pedro Escarpment south of Palos Verdes Peninsula. Both of these complexes indicate repeated recurrences of catastrophic slope failure. Recurrence intervals are not well constrained but appear to be in the range of 7500 years for the Goleta slide. The most recent major activity of the Palos Verdes debris avalanche occurred roughly 7500 years ago. A small failure deposit in Santa Barbara Channel, the Gaviota mudflow, was perhaps caused by an 1812 earthquake. Most landslides in this region are probably triggered by earthquakes, although the larger failures were likely conditioned by other factors, such as oversteepening, development of shelf-edge deltas, and high fluid pressures. If a future landslide were to occur in the area of these large landslide complexes, a tsunami would probably result. Runup distances of 10 m over a 30-km-long stretch of the Santa Barbara coastline are predicted for a recurrence of the Goleta slide, and a runup of 3 m over a comparable stretch of the Los Angeles coastline is modeled for the Palos Verdes debris avalanche. © 2009 The Geological Society of America.

  14. Geophysical setting of the 2000 ML 5.2 Yountville, California, earthquake: Implications for seismic Hazard in Napa Valley, California

    USGS Publications Warehouse

    Langenheim, V.E.; Graymer, R.W.; Jachens, R.C.

    2006-01-01

The epicenter of the 2000 ML 5.2 Yountville earthquake was located 5 km west of the surface trace of the West Napa fault, as defined by Helley and Herd (1977). On the basis of the re-examination of geologic data and the analysis of potential-field data, the earthquake occurred on a strand of the West Napa fault, the main basin-bounding fault along the west side of Napa Valley. Linear aeromagnetic anomalies and a prominent gravity gradient extend the length of the fault to the latitude of Calistoga, suggesting that this fault may be capable of larger-magnitude earthquakes. Gravity data indicate an ~2-km-deep basin centered on the town of Napa, where damage was concentrated during the Yountville earthquake. The basin most likely played a minor role in enhancing shaking during this event but may lead to enhanced shaking, caused by wave trapping, during a larger-magnitude earthquake.

  15. Probabilistic Tsunami Hazard Assessment along Nankai Trough (1) An assessment based on the information of the forthcoming earthquake that Earthquake Research Committee(2013) evaluated

    NASA Astrophysics Data System (ADS)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Morikawa, N.; Kawai, S.; Ohsumi, T.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.

    2015-12-01

The Earthquake Research Committee (ERC)/HERP, Government of Japan (2013) revised their long-term evaluation of the forthcoming large earthquake along the Nankai Trough: the next earthquake is estimated at M8 to M9 class, and the probability (P30) that it will occur within the next 30 years (from Jan. 1, 2013) is 60% to 70%. In this study, we assess tsunami hazards (maximum coastal tsunami heights) in the near future, in terms of a probabilistic approach, from the next earthquake along the Nankai Trough, on the basis of the ERC (2013) report. The probabilistic tsunami hazard assessment that we applied is as follows: (1) Characterized earthquake fault models (CEFMs) are constructed on each of the 15 hypothetical source areas (HSAs) that ERC (2013) identified. The characterization rule follows Toyama et al. (2015, JpGU). As a result, we obtained a total of 1441 CEFMs. (2) We calculate tsunamis due to the CEFMs by solving nonlinear, finite-amplitude, long-wave equations with advection and bottom-friction terms by a finite-difference method; run-up computation on land is included. (3) A time-predictable model predicts that the recurrence interval of the present seismic cycle is T = 88.2 years (ERC, 2013). We fix P30 = 67% by applying a renewal process based on the BPT distribution with mean T and aperiodicity alpha = 0.24. (4) We divide the probability P30 into P30(i) for the i-th subgroup, consisting of the earthquakes occurring in each of the 15 HSAs, following a probability re-distribution concept (ERC, 2014). Each earthquake (CEFM) in the i-th subgroup is then assigned a probability P30(i)/N, where N is the number of CEFMs in the subgroup. Note that this re-distribution of the probability is tentative, because present seismology cannot provide knowledge deep enough to constrain it; an epistemic logic-tree approach may be required in the future. (5) We synthesize a number of tsunami hazard curves at every evaluation point on the coasts by integrating the information about 30 years occurrence
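Step (3) above, a renewal process on the BPT (Brownian Passage Time, i.e. inverse Gaussian) distribution, can be sketched as follows. The elapsed time of 67 years (counting from the 1946 Nankai earthquake to 2013) is our illustrative assumption, not part of the source report; with T = 88.2 yr and alpha = 0.24 it yields a 30-year conditional probability close to the 60%-70% quoted above.

```python
import math

# Sketch of a BPT renewal-probability calculation. The BPT distribution
# is the inverse Gaussian with mean recurrence T and aperiodicity alpha.

def _phi(x):
    """Standard normal CDF."""
    return 0.5 * math.erfc(-x / math.sqrt(2.0))

def bpt_cdf(t, T, alpha):
    """CDF of the Brownian Passage Time distribution."""
    if t <= 0.0:
        return 0.0
    u = math.sqrt(T / t) / alpha
    term1 = _phi(u * (t / T - 1.0))
    tail = 0.5 * math.erfc(u * (t / T + 1.0) / math.sqrt(2.0))  # Phi(-x2)
    # exp(2/alpha^2) alone can overflow for small alpha, so combine it
    # with the tiny normal tail in log space.
    term2 = math.exp(2.0 / alpha ** 2 + math.log(tail)) if tail > 0.0 else 0.0
    return min(1.0, term1 + term2)

def conditional_prob(elapsed, window, T, alpha):
    """P(event within `window` yr | no event during the first `elapsed` yr)."""
    f0 = bpt_cdf(elapsed, T, alpha)
    return (bpt_cdf(elapsed + window, T, alpha) - f0) / (1.0 - f0)

p30 = conditional_prob(67.0, 30.0, T=88.2, alpha=0.24)
print(round(p30, 2))   # falls near the 60-70% range quoted above
```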

  16. Some characteristics of the complex El Mayor-Cucapah, MW7.2, April 4, 2010, Baja California, Mexico, earthquake, from well-located aftershock data from local and regional networks.

    NASA Astrophysics Data System (ADS)

    Frez, J.; Nava Pichardo, F. A.; Acosta, J.; Munguia, L.; Carlos, J.; García, R.

    2015-12-01

Aftershocks from the El Mayor-Cucapah (EMC), MW7.2, April 4, 2010, Baja California, Mexico, earthquake were recorded over two months by a 31-station local array (Reftek RT130 seismographs loaned from IRIS-PASSCAL), complemented by regional data from the SCSN and CICESE. The resulting database includes 518 aftershocks with ML ≥ 3.0, plus 181 smaller events. Reliable hypocenters were determined using HypoDD and a velocity structure determined from refraction data for a mesa located to the west of the Mexicali-Imperial Valley. Aftershock hypocenters show that the El Mayor-Cucapah earthquake was a multiple event comprising two or three different ruptures, of which the last constituted the main event. The main-event rupture, which extends in a roughly N45°W direction, is complex, with well-defined segments having different characteristics. The central segment, located close to the first-event epicenter, is roughly vertical; the northwest segment dips ~68°NE, while the two southeast segments dip ~60°SW and ~52°SW, respectively, in agreement with results of previous studies based on long-period teleseismic data and on GPS-InSAR. All main-rupture aftershock hypocenters have depths above 10-11 km and, except for the central segment, they delineate the edges of the zones with the largest coseismic displacement. The two southern segments show seismicity concentrated below 5 km and 3.5 km, respectively; the paucity of shallow seismicity may be caused by the thick layer of unconsolidated sediments in this region. The ruptures delineated by aftershocks in the southern regions correspond to the Indiviso fault, unidentified until the occurrence of the EMC earthquake. The first event was relocated together with the aftershocks; its epicenter lies slightly west of published locations, but it definitely does not lie on, or close to, the main rupture. The focal mechanism of the first event, based on first-arrival polarities, is predominantly strike-slip; the focal plane

  17. Chapter C. The Loma Prieta, California, Earthquake of October 17, 1989 - Preseismic Observations

    USGS Publications Warehouse

    Johnston, Malcolm J. S.

    1993-01-01

    The October 17, 1989, Loma Prieta, Calif., Ms=7.1 earthquake provided the first opportunity in the history of fault monitoring in the United States to gather multidisciplinary preearthquake data in the near field of an M=7 earthquake. The data obtained include observations on seismicity, continuous strain, long-term ground displacement, magnetic field, and hydrology. The papers in this chapter describe these data, their implications for fault-failure mechanisms, the scale of prerupture nucleation, and earthquake prediction in general. Of the 10 papers presented here, about half identify preearthquake anomalies in the data, but some of these results are equivocal. Seismicity in the Loma Prieta region during the 20 years leading up to the earthquake was unremarkable. In retrospect, however, it is apparent that the principal southwest-dipping segment of the subsequent Loma Prieta rupture was virtually aseismic during this period. Two M=5 earthquakes did occur near Lake Elsman near the junction of the Sargent and San Andreas faults within 2.5 and 15 months of, and 10 km to the north of, the Loma Prieta epicenter. Although these earthquakes were not on the subsequent rupture plane of the Loma Prieta earthquake and other M=5 earthquakes occurred in the preceding 25 years, it is now generally accepted that these events were, in some way, foreshocks to the main event.

  18. High-resolution seismic-reflection data offshore of Dana Point, southern California borderland

    USGS Publications Warehouse

    Sliter, Ray W.; Ryan, Holly F.; Triezenberg, Peter J.

    2010-01-01

The U.S. Geological Survey collected high-resolution shallow seismic-reflection profiles in September 2006 in the offshore area between Dana Point and San Mateo Point in southern Orange and northern San Diego Counties, California. Reflection profiles were located to image folds and reverse faults associated with the San Mateo fault zone and high-angle strike-slip faults near the shelf break (the Newport-Inglewood fault zone) and at the base of the slope. Interpretations of these data were used to update the USGS Quaternary fault database and in shaking hazard models for the State of California developed by the Working Group for California Earthquake Probabilities. This cruise was funded by the U.S. Geological Survey Coastal and Marine Catastrophic Hazards project. Seismic-reflection data were acquired aboard the R/V Sea Explorer, which is operated by the Ocean Institute at Dana Point. A SIG ELC820 minisparker seismic source and a SIG single-channel streamer were used. More than 420 km of seismic-reflection data were collected. This report includes maps of the seismic-survey sections, linked to Google Earth software, and digital data files showing images of each transect in SEG-Y, JPEG, and TIFF formats.

  19. The 2014 update to the National Seismic Hazard Model in California

    USGS Publications Warehouse

    Powers, Peter; Field, Edward H.

    2015-01-01

The 2014 update to the U.S. Geological Survey National Seismic Hazard Model in California introduces a new earthquake rate model and new ground motion models (GMMs) that give rise to numerous changes to seismic hazard throughout the state. The updated earthquake rate model is the third version of the Uniform California Earthquake Rupture Forecast (UCERF3), wherein the rates of all ruptures are determined via a self-consistent inverse methodology. This approach accommodates multifault ruptures and reduces the overprediction of moderate earthquake rates exhibited by the previous model (UCERF2). UCERF3 introduces new faults, changes to slip or moment rates on existing faults, and adaptively smoothed gridded seismicity source models, all of which contribute to significant changes in hazard. New GMMs increase ground motion near large strike-slip faults and reduce hazard over dip-slip faults. The addition of very large strike-slip ruptures and decreased reverse fault rupture rates in UCERF3 further enhances these effects.
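The self-consistent inverse methodology mentioned above solves for rupture rates that jointly reproduce fault-section slip rates. A deliberately tiny toy version can make the idea concrete; the two sections, two ruptures, and all slip numbers are invented, and the real UCERF3 inversion handles far more constraints with a simulated-annealing solver rather than direct solution.

```python
# Toy illustration of rate balancing in a rupture-forecast inversion
# (not the real UCERF3 methodology; all numbers are invented).
# Each rupture places some slip on the fault sections it breaks; rupture
# rates are chosen so the summed slip matches each section's long-term
# slip rate.

slip = {
    "A": {"s1": 1.0},              # rupture A: 1 m of slip on section s1
    "B": {"s1": 1.0, "s2": 1.0},   # rupture B: 1 m on s1 and on s2
}
target = {"s1": 0.003, "s2": 0.002}  # section slip rates (m/yr)

# Only rupture B slips s2, so its rate is fixed first; rupture A then
# makes up the remainder of the slip budget on s1.
rate_B = target["s2"] / slip["B"]["s2"]
rate_A = (target["s1"] - rate_B * slip["B"]["s1"]) / slip["A"]["s1"]

print(round(rate_A, 6), round(rate_B, 6))   # ruptures per year
```

Note how allowing the multisection rupture B to carry part of s1's budget lowers the rate of the single-section rupture A, the same mechanism by which multifault ruptures reduce moderate-earthquake rates in UCERF3.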

  20. Future Earth: Reducing Loss By Automating Response to Earthquake Shaking

    NASA Astrophysics Data System (ADS)

    Allen, R. M.

    2014-12-01

Earthquakes pose a significant threat to society in the U.S. and around the world. The risk is easily forgotten given the infrequent recurrence of major damaging events, yet the likelihood of a major earthquake in California in the next 30 years is greater than 99%. As our societal infrastructure becomes ever more interconnected, the potential impacts of these future events are difficult to predict. Yet the same interconnected infrastructure also allows us to rapidly detect earthquakes as they begin and provide seconds, tens of seconds, or a few minutes of warning. A demonstration earthquake early warning system is now operating in California and is being expanded to the west coast (www.ShakeAlert.org). In recent earthquakes in the Los Angeles region, alerts were generated that could have provided warning to the vast majority of Angelenos who experienced the shaking. Efforts are underway to build a public system. Smartphone technology will be used not only to issue the alerts but also to collect data and improve the warnings. The MyShake project at UC Berkeley is currently testing an app that attempts to turn millions of smartphones into earthquake detectors. As our development of the technology continues, we can anticipate ever more automated responses to earthquake alerts. Already, the BART system in the San Francisco Bay Area automatically stops trains based on the alerts. In the future, elevators will stop, machinery will pause, hazardous materials will be isolated, and self-driving cars will pull over to the side of the road. In this presentation we will review the current status of the earthquake early warning system in the U.S., illustrate how smartphones can contribute to the system, and review applications of the information to reduce future losses.