Sample records for california earthquake probabilities

  1. An empirical model for earthquake probabilities in the San Francisco Bay region, California, 2002-2031

    USGS Publications Warehouse

    Reasenberg, P.A.; Hanks, T.C.; Bakun, W.H.

    2003-01-01

    The moment magnitude M 7.8 earthquake in 1906 profoundly changed the rate of seismic activity over much of northern California. The low rate of seismic activity in the San Francisco Bay region (SFBR) since 1906, relative to that of the preceding 55 yr, is often explained as a stress-shadow effect of the 1906 earthquake. However, existing elastic and visco-elastic models of stress change fail to fully account for the duration of the lowered rate of earthquake activity. We use variations in the rate of earthquakes as a basis for a simple empirical model for estimating the probability of M ≥ 6.7 earthquakes in the SFBR. The model preserves the relative magnitude distribution of sources predicted by the Working Group on California Earthquake Probabilities' (WGCEP, 1999; WGCEP, 2002) model of characterized ruptures on SFBR faults and is consistent with the occurrence of the four M ≥ 6.7 earthquakes in the region since 1838. When the empirical model is extrapolated 30 yr forward from 2002, it gives a probability of 0.42 for one or more M ≥ 6.7 earthquakes in the SFBR. This result is lower than the probability of 0.5 estimated by WGCEP (1988), lower than the 30-yr Poisson probability of 0.60 obtained by WGCEP (1999) and WGCEP (2002), and lower than the 30-yr time-dependent probabilities of 0.67, 0.70, and 0.63 obtained by WGCEP (1990), WGCEP (1999), and WGCEP (2002), respectively, for the occurrence of one or more large earthquakes. This lower probability is consistent with the lack of adequate accounting for the 1906 stress shadow in these earlier reports. The empirical model represents one possible approach toward accounting for the stress-shadow effect of the 1906 earthquake. However, the discrepancy between our result and those obtained with other modeling methods underscores the fact that the physics controlling the timing of earthquakes is not well understood. Hence, we advise against using the empirical model alone (or any other single probability model) for estimating the earthquake hazard and endorse the use of all credible earthquake probability models for the region, including the empirical model, with appropriate weighting, as was done in WGCEP (2002).
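
The 30-yr Poisson probabilities quoted in this abstract follow directly from a constant-annual-rate assumption. A minimal sketch of that arithmetic (the rate below is implied by the reported 0.60 probability, not taken from the WGCEP reports themselves):

```python
import math

def poisson_prob(rate_per_yr: float, window_yr: float) -> float:
    """P(at least one event in the window) under a constant-rate Poisson model."""
    return 1.0 - math.exp(-rate_per_yr * window_yr)

def rate_from_prob(prob: float, window_yr: float) -> float:
    """Invert: the annual rate implied by a given window probability."""
    return -math.log(1.0 - prob) / window_yr

# A 30-yr Poisson probability of 0.60 (WGCEP, 1999; WGCEP, 2002) implies an
# annual rate of about 0.031 M >= 6.7 events per year in the region.
rate = rate_from_prob(0.60, 30.0)
print(round(rate, 4))
print(round(poisson_prob(rate, 30.0), 2))
```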

  2. California Fault Parameters for the National Seismic Hazard Maps and Working Group on California Earthquake Probabilities 2007

    USGS Publications Warehouse

    Wills, Chris J.; Weldon, Ray J., II; Bryant, W.A.

    2008-01-01

    This report describes development of fault parameters for the 2007 update of the National Seismic Hazard Maps and the Working Group on California Earthquake Probabilities (WGCEP, 2007). These reference parameters are contained within a database intended to be a source of values for use by scientists interested in producing either seismic hazard or deformation models to better understand the current seismic hazards in California. These parameters include descriptions of the geometry and rates of movements of faults throughout the state. These values are intended to provide a starting point for development of more sophisticated deformation models which include known rates of movement on faults as well as geodetic measurements of crustal movement and the rates of movements of the tectonic plates. The values will be used in developing the next generation of the time-independent National Seismic Hazard Maps, and the time-dependent seismic hazard calculations being developed for the WGCEP. Due to the multiple uses of this information, development of these parameters has been coordinated between USGS, CGS and SCEC. SCEC provided the database development and editing tools, in consultation with USGS, Golden. This database has been implemented in Oracle and supports electronic (on-the-fly) access. A GUI-based application has also been developed to aid in populating the database. Both the continually updated 'living' version of this database, as well as any locked-down official releases (e.g., used in a published model for calculating earthquake probabilities or seismic shaking hazards), are part of the USGS Quaternary Fault and Fold Database http://earthquake.usgs.gov/regional/qfaults/. CGS has been primarily responsible for updating and editing of the fault parameters, with extensive input from USGS and SCEC scientists.

  3. The earthquake prediction experiment at Parkfield, California

    Microsoft Academic Search

    Evelyn Roeloffs; John Langbein

    1994-01-01

    Since 1985, a focused earthquake prediction experiment has been in progress along the San Andreas fault near the town of Parkfield in central California. Parkfield has experienced six moderate earthquakes since 1857 at average intervals of 22 years, the most recent a magnitude 6 event in 1966. The probability of another moderate earthquake soon appears high, but studies assigning

  4. The Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2)

    USGS Publications Warehouse

    2007 Working Group on California Earthquake Probabilities

    2008-01-01

    California's 35 million people live among some of the most active earthquake faults in the United States. Public safety demands credible assessments of the earthquake hazard to maintain appropriate building codes for safe construction and earthquake insurance for loss protection. Seismic hazard analysis begins with an earthquake rupture forecast: a model of probabilities that earthquakes of specified magnitudes, locations, and faulting types will occur during a specified time interval. This report describes a new earthquake rupture forecast for California developed by the 2007 Working Group on California Earthquake Probabilities (WGCEP 2007).

  5. Ontology of Earthquake Probability: Metaphor

    E-print Network

    Stark, Philip B.

    (Slide-deck excerpt; only the slide headings Intro, Probability, Rabbits, Description, and Predictions survive text extraction.)

  6. Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2)

    Microsoft Academic Search

    E. H. Field; T. E. Dawson; K. R. Felzer; A. D. Frankel; V. Gupta; T. H. Jordan; T. Parsons; M. D. Petersen; R. S. Stein; R. J. Weldon; C. J. Wills

    2009-01-01

    The 2007 Working Group on California Earthquake Probabilities (WGCEP, 2007) presents the Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2). This model comprises a time-independent (Poisson-process) earthquake rate model, developed jointly with the National Seismic Hazard Mapping Program, and a time-dependent earthquake-probability model, based on recent earthquake rates and stress-renewal statistics conditioned on the date of last event. The models were developed from updated statewide earthquake catalogs and fault deformation databases using a uniform

  7. Cosmogenic beryllium-10 exposure dating of probable earthquake-triggered rock avalanches in Yosemite Valley, California

    NASA Astrophysics Data System (ADS)

    Thompson, J. A.; Stock, G. M.; Rood, D.; Frankel, K. L.

    2013-12-01

    In Yosemite Valley, rock falls commonly originate from the glacially-steepened walls. Deposition of the smaller rock falls, from hundreds up to tens of thousands of cubic meters, is typically limited to the active talus slopes beneath the cliffs. The floor of Yosemite Valley, however, preserves at least seven extremely large rock fall deposits, here termed rock avalanches, up to several million cubic meters in volume. These deposits extend far beyond the base of active talus slopes onto the valley floor, and have occurred since the retreat of Last Glacial Maximum glaciers circa 15-17 ka. Using airborne LiDAR data that resolves individual boulders, we mapped the rock avalanche deposits in the field and in ArcGIS. Minimum exposed volumes range from hundreds of thousands to several million cubic meters. To assess the frequency of rock avalanche occurrence, we employed cosmogenic beryllium-10 surface exposure dating of large (>4 cubic meters) boulders embedded within the deposits. These deposits are ideal targets for cosmogenic 10Be exposure dating, as they are instantaneous events that excavate deep-seated quartz-rich granitic rocks, and once deposited, are essentially immune to post-depositional erosion or modification. Mean exposure ages indicate that failures occurred at 1.0, 1.8, 2.3, 3.7, 4.4, 6.4, and 11.6 ka. At least three of the deposits appear to represent two or more failures, separated in time by hundreds to thousands of years. Synchronous rock avalanches (within the uncertainty of the exposure ages (<200 yrs)) at different locations within the valley appear to have occurred at 3.7 ka, and possibly at 2.3 ka, suggesting possible coseismic triggering. Age correlations from paleoseismic work tentatively identify large earthquakes originating from the eastern Sierra Nevada or western Nevada as possible triggers for at least half of the rock avalanches. 
These unique and robust age data provide key information for accurately mapping rock avalanches in Yosemite Valley and for quantifying their recurrence intervals.

  8. Historic Earthquakes in Southern California

    NSDL National Science Digital Library

    This page contains a map of southern California with epicenters of earthquakes shown as circles of different sizes and colors. The size and color of each earthquake symbol corresponds to its magnitude, as indicated by a scale on the map. Clicking on an epicenter takes the user to a page of information about that earthquake. Earthquakes dating back to 1812 are shown. Also available on this page are links to fault maps, earthquake animations, and other indexes of seismological information.

  9. Parkfield, California, earthquake prediction experiment

    Microsoft Academic Search

    W. H. Bakun; A. G. Lindh

    1985-01-01

    Five moderate (magnitude 6) earthquakes with similar features have occurred on the Parkfield section of the San Andreas fault in central California since 1857. The next moderate Parkfield earthquake is expected to occur before 1993. The Parkfield prediction experiment is designed to monitor the details of the final stages of the earthquake preparation process; observations and reports of seismicity and

  10. The Parkfield, California, Earthquake Experiment

    NSDL National Science Digital Library

    This report describes research being carried out in Parkfield, California, whose purpose is to better understand the physics of earthquakes: what actually happens on the fault and in the surrounding region before, during and after an earthquake. Ultimately, scientists hope to better understand the earthquake process and, if possible, to provide a scientific basis for earthquake prediction. Topics include the scientific background for the experiment, including the tectonic setting at Parkfield, historical earthquake activity on this section of the San Andreas fault, the monitoring and data collecting activities currently being carried out, and plans for future research. Data are also available to view in real time and to download.

  11. Probabilities of occurrence of large plate rupturing earthquakes for the San Andreas, San Jacinto, and Imperial faults, California, 1983-2003

    SciTech Connect

    Sykes, L.R.; Nishenko, S.P.

    1984-07-10

    The San Andreas, San Jacinto, and Imperial faults in California are divided into 19 segments; conditional probabilities are calculated that a particular segment will be the site of a large plate rupturing earthquake, i.e., an event that breaks the entire down-dip extent of the seismogenic zone, during the next 20 years. The sizes of such events, which account for most of the slip that occurs seismically, appear to vary greatly for different segments of these faults. Repeat time of large shocks, coseismic displacement, moment release, rupture length, and seismic magnitude appear to correlate with one another and to be a function of the tectonic style of different parts of those fault zones. Tectonic inhomogeneities on a scale of about 1 to 100 km are much larger than displacement in any single seismic event and may be regarded as being invariant in their effects upon earthquake generation over many cycles of large shocks. It is this invariance that appears to lead to a given segment of a fault rupturing repeatedly in events of nearly the same size. Since repeat time varies, however, for a given segment of a fault, a simple probabilistic approach is used to forecast the likelihood of large future earthquakes for each segment, using as input the time of the last large shock, the average recurrence time, and the standard deviation of time intervals between events. Dates of the last large shocks are available for most of the segments investigated. Repeat times are estimated from times of historic and prehistoric events, tectonic similarity, and times calculated from coseismic displacement in the last large shock divided by a rate of fault motion or strain buildup.
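
The forecast recipe described here (time of last shock, average recurrence time, standard deviation of intervals) reduces to a conditional probability from a renewal distribution. A sketch using a normal recurrence distribution as a stand-in and hypothetical segment parameters (150-yr mean repeat time, 30-yr standard deviation, 126 yr elapsed):

```python
import math

def norm_cdf(x: float, mu: float, sigma: float) -> float:
    """CDF of a normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def conditional_prob(elapsed: float, window: float,
                     mean_repeat: float, sigma: float) -> float:
    """P(rupture within `window` yr | no rupture for `elapsed` yr),
    assuming recurrence intervals are normally distributed."""
    num = norm_cdf(elapsed + window, mean_repeat, sigma) - norm_cdf(elapsed, mean_repeat, sigma)
    den = 1.0 - norm_cdf(elapsed, mean_repeat, sigma)
    return num / den

# Hypothetical fault segment, 20-yr forecast window as in the abstract.
print(round(conditional_prob(126.0, 20.0, 150.0, 30.0), 2))
```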

  12. Appendix H: WGCEP Historical California Earthquake Catalog

    E-print Network

    Felzer, Karen

    Appendix H: WGCEP Historical California Earthquake Catalog, by Karen R. Felzer and Tianqing Cao. Suggested citation: Felzer, K.R., and Cao, Tianqing, 2008, WGCEP historical California earthquake catalog, Appendix H in The Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2): U.S. Geological Survey.

  13. How Do Scientists Determine Earthquake Probabilities?

    NSDL National Science Digital Library

    This site provides many links to articles, graphics, scientific papers, and podcasts to help students understand how scientists determine probabilities for earthquake occurrences. Topics include the locations of faults and how much they need to move in order to release the strain that accumulates; the study of past earthquakes on each fault to predict the size of possible earthquakes that could occur in the future; and using information on how long it's been since the last earthquake to estimate the probability that an earthquake will occur in the next few years. Links to additional information are embedded in the text.

  14. Southern California Earthquake Data Center

    NSDL National Science Digital Library

    To say that there are a few earthquake research centers in Southern California is a bit like saying that Chicago sits on a lake of some size. It's a bit of an obvious remark, but given that there are a number of such projects, it's important to take a look at some of the more compelling ones out there. One such important resource is the Southern California Earthquake Data Center, sponsored by a host of organizations, including the California Institute of Technology and the United States Geological Survey. Visitors to the project site can peruse some of its recent work, which includes a clickable map of the region that features information on recent earthquakes in California and Nevada. Equally compelling is the clickable fault map of Southern California where visitors can learn about the local faults and recent activity along each fault. Another key element of the site is the historical earthquake database, which may be of interest to both the general public and those who are studying this area.

  15. Southern California Earthquake Center (SCEC) Home Page

    NSDL National Science Digital Library

    This is the home page of the Southern California Earthquake Center (SCEC), a consortium of universities and research institutions dedicated to gathering information about earthquakes in Southern California, integrating that knowledge into a comprehensive and predictive understanding of earthquake phenomena, and communicating this understanding to end-users and the general public in order to increase earthquake awareness, reduce economic losses, and save lives. News of recent earthquake research, online resources, and educational information are available here.

  16. Uniform California earthquake rupture forecast, version 2 (UCERF 2)

    USGS Publications Warehouse

    Field, E.H.; Dawson, T.E.; Felzer, K.R.; Frankel, A.D.; Gupta, V.; Jordan, T.H.; Parsons, T.; Petersen, M.D.; Stein, R.S.; Weldon, R.J.; Wills, C.J.

    2009-01-01

    The 2007 Working Group on California Earthquake Probabilities (WGCEP, 2007) presents the Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2). This model comprises a time-independent (Poisson-process) earthquake rate model, developed jointly with the National Seismic Hazard Mapping Program, and a time-dependent earthquake-probability model, based on recent earthquake rates and stress-renewal statistics conditioned on the date of last event. The models were developed from updated statewide earthquake catalogs and fault deformation databases using a uniform methodology across all regions and implemented in the modular, extensible Open Seismic Hazard Analysis framework. The rate model satisfies integrating measures of deformation across the plate-boundary zone and is consistent with historical seismicity data. An overprediction of earthquake rates found at intermediate magnitudes (6.5 ≤ M ≤ 7.0) in previous models has been reduced to within the 95% confidence bounds of the historical earthquake catalog. A logic tree with 480 branches represents the epistemic uncertainties of the full time-dependent model. The mean UCERF 2 time-dependent probability of one or more M ≥ 6.7 earthquakes in the California region during the next 30 yr is 99.7%; this probability decreases to 46% for M ≥ 7.5 and to 4.5% for M ≥ 8.0. These probabilities do not include the Cascadia subduction zone, largely north of California, for which the estimated 30 yr, M ≥ 8.0 time-dependent probability is 10%. The M ≥ 6.7 probabilities on major strike-slip faults are consistent with the WGCEP (2003) study in the San Francisco Bay Area and the WGCEP (1995) study in southern California, except for significantly lower estimates along the San Jacinto and Elsinore faults, owing to provisions for larger multisegment ruptures. Important model limitations are discussed.

  17. Northern California Earthquake Data Center

    NSDL National Science Digital Library

    A project between the University of California Berkeley Seismological Laboratory and the United States Geological Survey, the Northern California Earthquake Data Center (NCEDC) "is a long-term archive and distribution center for seismological and geodetic data for Northern and Central California." Educators and students can examine recent seismograms from the Berkeley Digital Seismic Network. Researchers will benefit from the site's many data collections, including BARD, a system of 67 continuously operating Global Positioning System receivers in Northern California. By reading the annual reports, educators will also learn about the center's many outreach activities, from talks and lab tours to the production of classroom resources for kindergarten through twelfth grade teachers. This site is also reviewed in the October 17, 2003 NSDL Physical Sciences Report.

  18. Probability of derailment under earthquake conditions

    E-print Network

    Guillaud, Lucile M. (Lucile Marie)

    2006-01-01

    A quantitative assessment of the probability of derailment under earthquake conditions is presented. Two derailment modes are considered: by vibratory motion - during the ground motion - and by permanent track deformation ...

  19. Northern California Earthquake Data Center (NCEDC)

    NSDL National Science Digital Library

    This is the home page of the Northern California Earthquake Data Center (NCEDC) which is a joint project of the University of California Berkeley Seismological Laboratory and the U. S. Geological Survey at Menlo Park. The NCEDC is an archive for seismological and geodetic data for Northern and Central California. Accessible through this page are news items, recent earthquake information, links to earthquake catalogs, seismic waveform data sets, and Global Positioning System information. Most data sets are accessible for downloading via ftp.

  20. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California.

    PubMed

    Lee, Ya-Ting; Turcotte, Donald L; Holliday, James R; Sachs, Michael K; Rundle, John B; Chen, Chien-Chih; Tiampo, Kristy F

    2011-10-01

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor-Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most "successful" in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts. PMID:21949355

  1. An Investigation of Southern California Earthquakes

    NSDL National Science Digital Library

    This site has directions for a classroom activity in which students plot locations of major Southern California earthquakes on a map. A table listing major earthquakes, when they occurred, their locations and their magnitudes is included. There is also a set of questions for the students to answer once they have plotted the earthquake data on their map. This site is in PDF format.

  2. Prospective Tests of Southern California Earthquake Forecasts

    Microsoft Academic Search

    D. D. Jackson; D. Schorlemmer; M. Gerstenberger; Y. Y. Kagan; A. Helmstetter; S. Wiemer; N. Field

    2004-01-01

    We are testing earthquake forecast models prospectively using likelihood ratios. Several investigators have developed such models as part of the Southern California Earthquake Center's project called Regional Earthquake Likelihood Models (RELM). Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. Here we describe the testing procedure and present preliminary results. Forecasts are expressed

  3. Real-time forecasts of tomorrow's earthquakes in California

    USGS Publications Warehouse

    Gerstenberger, M.C.; Wiemer, S.; Jones, L.M.; Reasenberg, P.A.

    2005-01-01

    Despite a lack of reliable deterministic earthquake precursors, seismologists have significant predictive information about earthquake activity from an increasingly accurate understanding of the clustering properties of earthquakes. In the past 15 years, time-dependent earthquake probabilities based on a generic short-term clustering model have been made publicly available in near-real time during major earthquake sequences. These forecasts describe the probability and number of events that are, on average, likely to occur following a mainshock of a given magnitude, but are not tailored to the particular sequence at hand and contain no information about the likely locations of the aftershocks. Our model builds upon the basic principles of this generic forecast model in two ways: it recasts the forecast in terms of the probability of strong ground shaking, and it combines an existing time-independent earthquake occurrence model based on fault data and historical earthquakes with increasingly complex models describing the local time-dependent earthquake clustering. The result is a time-dependent map showing the probability of strong shaking anywhere in California within the next 24 hours. The seismic hazard modelling approach we describe provides a better understanding of time-dependent earthquake hazard, and increases its usefulness for the public, emergency planners and the media.
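
The generic short-term clustering forecasts referred to here are of Reasenberg-Jones type: an Omori-law aftershock rate scaled by mainshock magnitude. A sketch using the commonly quoted generic California parameter values (illustrative only, not the operational forecast model):

```python
import math

# Reasenberg & Jones (1989)-style generic California parameters (illustrative).
a, b, c, p = -1.67, 0.91, 0.05, 1.08

def aftershock_rate(t_days: float, m_min: float, mainshock_m: float) -> float:
    """Daily rate of aftershocks with M >= m_min at time t after the mainshock."""
    return 10 ** (a + b * (mainshock_m - m_min)) / (t_days + c) ** p

def expected_count(t1: float, t2: float, m_min: float, mainshock_m: float,
                   n: int = 10000) -> float:
    """Expected number of aftershocks in (t1, t2), by trapezoidal integration."""
    h = (t2 - t1) / n
    s = 0.5 * (aftershock_rate(t1, m_min, mainshock_m) +
               aftershock_rate(t2, m_min, mainshock_m))
    s += sum(aftershock_rate(t1 + i * h, m_min, mainshock_m) for i in range(1, n))
    return s * h

# Expected M >= 5 aftershocks in the 24 h after an M 7 mainshock, and the
# Poisson probability of at least one such event.
n_exp = expected_count(0.0, 1.0, 5.0, 7.0)
prob = 1.0 - math.exp(-n_exp)
print(round(n_exp, 2), round(prob, 2))
```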

  4. Real-time forecasts of tomorrow's earthquakes in California.

    PubMed

    Gerstenberger, Matthew C; Wiemer, Stefan; Jones, Lucile M; Reasenberg, Paul A

    2005-05-19

    Despite a lack of reliable deterministic earthquake precursors, seismologists have significant predictive information about earthquake activity from an increasingly accurate understanding of the clustering properties of earthquakes. In the past 15 years, time-dependent earthquake probabilities based on a generic short-term clustering model have been made publicly available in near-real time during major earthquake sequences. These forecasts describe the probability and number of events that are, on average, likely to occur following a mainshock of a given magnitude, but are not tailored to the particular sequence at hand and contain no information about the likely locations of the aftershocks. Our model builds upon the basic principles of this generic forecast model in two ways: it recasts the forecast in terms of the probability of strong ground shaking, and it combines an existing time-independent earthquake occurrence model based on fault data and historical earthquakes with increasingly complex models describing the local time-dependent earthquake clustering. The result is a time-dependent map showing the probability of strong shaking anywhere in California within the next 24 hours. The seismic hazard modelling approach we describe provides a better understanding of time-dependent earthquake hazard, and increases its usefulness for the public, emergency planners and the media. PMID:15902254

  5. The Phase Dynamics of Earthquakes: Implications for Forecasting in Southern California

    Microsoft Academic Search

    Kristy F. Tiampo; John B. Rundle; Seth McGinnis; Susanna Gross; William Klein

    2001-01-01

    We analyze the space-time patterns of earthquake occurrence in southern California using a new method that treats earthquakes as a phase dynamical system. The system state vector is used to obtain a probability measure for current and future earthquake occurrence. Thousands of statistical tests indicate the method has considerable forecast skill. We emphasize that the method is not a model,

  6. Combining earthquake forecasts using differential probability gains

    NASA Astrophysics Data System (ADS)

    Shebalin, Peter N.; Narteau, Clément; Zechar, Jeremy Douglas; Holschneider, Matthias

    2014-12-01

    We describe an iterative method to combine seismicity forecasts. With this method, we produce the next generation of a starting forecast by incorporating predictive skill from one or more input forecasts. For a single iteration, we use the differential probability gain of an input forecast relative to the starting forecast. At each point in space and time, the rate in the next-generation forecast is the product of the starting rate and the local differential probability gain. The main advantage of this method is that it can produce high forecast rates using all types of numerical forecast models, even those that are not rate-based. Naturally, a limitation of this method is that the input forecast must have some information not already contained in the starting forecast. We illustrate this method using the Every Earthquake a Precursor According to Scale (EEPAS) and Early Aftershocks Statistics (EAST) models, which are currently being evaluated at the US testing center of the Collaboratory for the Study of Earthquake Predictability. During a testing period from July 2009 to December 2011 (with 19 target earthquakes), the combined model we produce has better predictive performance - in terms of Molchan diagrams and likelihood - than the starting model (EEPAS) and the input model (EAST). Many of the target earthquakes occur in regions where the combined model has high forecast rates. Most importantly, the rates in these regions are substantially higher than if we had simply averaged the models.
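
The rate-update step described above is simply a per-cell product. A cartoon of that step only, with made-up rates and gains (in the paper the differential probability gain is estimated from the input forecast's past performance, which is not sketched here):

```python
import numpy as np

# Starting-forecast rates (events per space-time cell) and the local
# differential probability gain of an input forecast; values are invented.
starting = np.array([0.02, 0.05, 0.01, 0.10])
gain     = np.array([1.0, 2.5, 0.8, 1.2])

# Next-generation forecast: product of starting rate and local gain.
next_gen = starting * gain
print(next_gen.tolist())
```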

  7. Probability based earthquake load and resistance factor design criteria for offshore platforms

    SciTech Connect

    Bea, R.G. [Univ. of California, Berkeley, CA (United States). Dept. of Civil Engineering

    1996-12-31

    This paper describes a probability reliability based formulation to determine earthquake Load and Resistance Factor Design (LRFD) parameters for conventional, steel, pile supported, tubular membered platforms that is proposed as a basis for earthquake design criteria and guidelines for offshore platforms that are intended to have worldwide applicability. The formulation is illustrated with application to platforms located in five areas: offshore California, Venezuela (Rio Caribe), the East Coast of Canada, in the Caspian Sea (Azeri), and the Norwegian sector of the North Sea.

  8. High Resolution Long and Short-Term Earthquake Forecasts for California

    Microsoft Academic Search

    M. J. Werner; A. Helmstetter; D. D. Jackson; Y. Y. Kagan

    2009-01-01

    We present two models for estimating the probabilities of future earthquakes in California, to be tested in the Collaboratory for the Study of Earthquake Predictability (CSEP). The first, time-independent model, modified from Helmstetter et al. (2007), provides five-year forecasts for magnitudes m > 4.95. We show that large quakes occur on average near the locations of small m > 2

  9. A physically-based earthquake recurrence model for estimation of long-term earthquake probabilities

    USGS Publications Warehouse

    Ellsworth, William L.; Matthews, Mark V.; Nadeau, Robert M.; Nishenko, Stuart P.; Reasenberg, Paul A.; Simpson, Robert W.

    1999-01-01

    A physically-motivated model for earthquake recurrence based on the Brownian relaxation oscillator is introduced. The renewal process defining this point process model can be described by the steady rise of a state variable from the ground state to failure threshold as modulated by Brownian motion. Failure times in this model follow the Brownian passage time (BPT) distribution, which is specified by the mean time to failure, μ, and the aperiodicity of the mean, α (equivalent to the familiar coefficient of variation). Analysis of 37 series of recurrent earthquakes, M -0.7 to 9.2, suggests a provisional generic value of α = 0.5. For this value of α, the hazard function (instantaneous failure rate of survivors) exceeds the mean rate for times > μ/2, and is approximately 2/μ for all times > μ. Application of this model to the next M 6 earthquake on the San Andreas fault at Parkfield, California suggests that the annual probability of the earthquake is between 1:10 and 1:13.
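
The BPT distribution is the inverse-Gaussian distribution reparameterized by the mean μ and aperiodicity α. A sketch of the density and of a window probability computed by numerical integration (μ and α values are illustrative, not the Parkfield estimates):

```python
import math

def bpt_pdf(t: float, mu: float, alpha: float) -> float:
    """Brownian passage time density with mean mu and aperiodicity alpha
    (inverse Gaussian with shape parameter mu / alpha**2)."""
    return math.sqrt(mu / (2.0 * math.pi * alpha**2 * t**3)) * \
           math.exp(-(t - mu)**2 / (2.0 * mu * alpha**2 * t))

def window_prob(t0: float, dt: float, mu: float, alpha: float,
                n: int = 10000) -> float:
    """P(t0 < failure time <= t0 + dt), by trapezoidal integration."""
    h = dt / n
    s = 0.5 * (bpt_pdf(t0, mu, alpha) + bpt_pdf(t0 + dt, mu, alpha))
    s += sum(bpt_pdf(t0 + i * h, mu, alpha) for i in range(1, n))
    return s * h

# With mu = 1 (time in units of the mean recurrence interval) and the
# generic alpha = 0.5, nearly all probability mass lies below 10 mu.
print(round(window_prob(0.01, 10.0, 1.0, 0.5), 3))
```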

  10. Time-dependent renewal-model probabilities when date of last earthquake is unknown

    USGS Publications Warehouse

    Field, Edward H.; Jordan, Thomas H.

    2015-01-01

    We derive time-dependent, renewal-model earthquake probabilities for the case in which the date of the last event is completely unknown, and compare these with the time-independent Poisson probabilities that are customarily used as an approximation in this situation. For typical parameter values, the renewal-model probabilities exceed Poisson results by more than 10% when the forecast duration exceeds ~20% of the mean recurrence interval. We also derive probabilities for the case in which the last event is further constrained to have occurred before historical record keeping began (the historic open interval), which can only serve to increase earthquake probabilities for typically applied renewal models. We conclude that accounting for the historic open interval can improve long-term earthquake rupture forecasts for California and elsewhere.
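
The effect described here can be checked by Monte Carlo: when the date of the last event is unknown, the elapsed time follows the stationary (length-biased) distribution of time-since-last-event. A sketch using a lognormal recurrence distribution as a stand-in (the paper works with standard renewal models such as BPT; all parameter values here are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical renewal model: lognormal recurrence, median 150 yr.
intervals = rng.lognormal(np.log(150.0), 0.4, 100_000)

# Unknown date of last event: sample an interval with length-biased weights,
# then a uniform position inside it, giving stationary elapsed/remaining times.
picked = rng.choice(intervals, 100_000, p=intervals / intervals.sum())
elapsed = picked * rng.random(100_000)
remaining = picked - elapsed

window = 30.0
p_renewal = np.mean(remaining <= window)          # renewal-model probability
p_poisson = 1.0 - np.exp(-window / intervals.mean())  # Poisson approximation
print(round(p_renewal, 3), round(p_poisson, 3))
```

With these (invented) parameters the forecast duration is roughly 20% of the mean recurrence interval, and the renewal probability exceeds the Poisson value, consistent with the abstract's claim.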

  11. Conditional Probabilities for Large Events Estimated by Small Earthquake Rate

    NASA Astrophysics Data System (ADS)

    Wu, Yi-Hsuan; Chen, Chien-Chih; Li, Hsien-Chi

    2015-01-01

    We examined forecasting quiescence and activation models to obtain the conditional probability that a large earthquake will occur in a specific time period on different scales in Taiwan. The basic idea of the quiescence and activation models is to use earthquakes that have magnitudes larger than the completeness magnitude to compute the expected properties of large earthquakes. We calculated the probability time series for the whole Taiwan region and for three subareas, the western, eastern, and northeastern Taiwan regions, using 40 years of data from the Central Weather Bureau catalog. In the probability time series for the eastern and northeastern Taiwan regions, high probability values usually correspond to clustered events, such as sequences with foreshocks or events concentrated in a short time period. In addition to the time series, we produced probability maps by calculating the conditional probability for every grid point at the time just before a large earthquake. The probability maps show that high probability values occur around the epicenter before a large earthquake. The receiver operating characteristic (ROC) curves of the probability maps demonstrate that the probability maps are not random forecasts, but they also suggest that lowering the magnitude of a forecasted large earthquake may not improve the forecast method itself. From both the probability time series and probability maps, we observe that the probability obtained from the quiescence model increases before a large earthquake, whereas the probability obtained from the activation model increases as the large earthquakes occur. These results lead us to conclude that the quiescence model has better forecast potential than the activation model.

  12. Earthquake Alerting in California Prof. of Engineering Seismology

    E-print Network

    Greer, Julia R.

    Earthquake Alerting in California. Tom Heaton, Prof. of Engineering Seismology, Caltech. Earthquake alerting ... a different kind of prediction: What if earthquakes were really slow, like the weather? We could recognize that an earthquake is beginning and then broadcast information on its development.

  13. Earthquake Probability Map for San Francisco Bay Area

    NSDL National Science Digital Library

    This map displays earthquake probabilities for several faults in the San Francisco Bay Area. The probability values are for the occurrence of one or more major (magnitude greater than or equal to 6.7) earthquakes in the San Francisco Bay Region during the next thirty years. Each fault on the map is color-coded to indicate the relative probability, and numerical values are displayed in boxes. The site includes links to a fact sheet and full technical report that summarize the findings, indicating a 62 percent total probability of a major earthquake over the next thirty years. There are also links to planning scenario maps, a study on potential losses, a webcast on earthquake probability, and to a set of downloadable graphics (TIF or PDF files) used in the probability study.

  14. Infrasonic observations of the Northridge, California, earthquake

    SciTech Connect

    Mutschlecner, J.P.; Whitaker, R.W.

    1994-09-01

    Infrasonic waves from the Northridge, California, earthquake of 17 January 1994 were observed at the St. George, Utah, infrasound array of the Los Alamos National Laboratory. The distance to the epicenter was 543 kilometers. The signal shows a complex character with many peaks and a long duration. An interpretation is given in terms of several modes of signal propagation and generation including a seismic-acoustic secondary source mechanism. A number of signals from aftershocks are also observed.

  15. Seismicity alert probabilities at Parkfield, California, revisited

    USGS Publications Warehouse

    Michael, A.J.; Jones, L.M.

    1998-01-01

    For a decade, the US Geological Survey has used the Parkfield Earthquake Prediction Experiment scenario document to estimate the probability that earthquakes observed on the San Andreas fault near Parkfield will turn out to be foreshocks followed by the expected magnitude six mainshock. During this time, we have learned much about the seismogenic process at Parkfield, about the long-term probability of the Parkfield mainshock, and about the estimation of these types of probabilities. The probabilities for potential foreshocks at Parkfield are reexamined and revised in light of these advances. As part of this process, we have confirmed both the rate of foreshocks before strike-slip earthquakes in the San Andreas physiographic province and the uniform distribution of foreshocks with magnitude proposed by earlier studies. Compared to the earlier assessment, these new estimates of the long-term probability of the Parkfield mainshock are lower, our estimate of the rate of background seismicity is higher, and we find that the assumption that foreshocks at Parkfield occur in a unique way is not statistically significant at the 95% confidence level. While the exact numbers vary depending on the assumptions that are made, the new alert probabilities are lower than previously estimated. Considering the various assumptions and the statistical uncertainties in the input parameters, we also compute a plausible range for the probabilities. The range is large, partly due to the extra knowledge that exists for the Parkfield segment, making us question the usefulness of these numbers.

  16. Estimating the Probability of Earthquake-Induced Landslides

    NASA Astrophysics Data System (ADS)

    McRae, M. E.; Christman, M. C.; Soller, D. R.; Sutter, J. F.

    2001-12-01

    The development of a regionally applicable, predictive model for earthquake-triggered landslides is needed to improve mitigation decisions at the community level. The distribution of landslides triggered by the 1994 Northridge earthquake in the Oat Mountain and Simi Valley quadrangles of southern California provided an inventory of failures against which to evaluate the significance of a variety of physical variables in probabilistic models of static slope stability. Through a cooperative project, the California Division of Mines and Geology provided 10-meter resolution data on elevation, slope angle, coincidence of bedding plane and topographic slope, distribution of pre-Northridge landslides, internal friction angle and cohesive strength of individual geologic units. Hydrologic factors were not evaluated since failures in the study area were dominated by shallow, disrupted landslides in dry materials. Previous studies indicate that 10-meter digital elevation data is required to properly characterize the short, steep slopes on which many earthquake-induced landslides occur. However, to explore the robustness of the model at different spatial resolutions, models were developed at the 10, 50, and 100-meter resolution using classification and regression tree (CART) analysis and logistic regression techniques. Multiple resampling algorithms were tested for each variable in order to observe how resampling affects the statistical properties of each grid, and how relationships between variables within the model change with increasing resolution. Various transformations of the independent variables were used to see which had the strongest relationship with the probability of failure. These transformations were based on deterministic relationships in the factor of safety equation. Preliminary results were similar for all spatial scales. Topographic variables dominate the predictive capability of the models. 
The distribution of prior landslides and the coincidence of slope with bedding plane had poor predictive capability. The predictive value of cohesive strength and internal friction is still unresolved. Although several strength variables were statistically significant, the explanatory power of the model was little improved over models based solely on topographic variables. Finally, 10-meter resolution digital elevation data was necessary to constrain locally steep slopes. However, regional slopes were also statistically significant, suggesting that coarser resolution models may refine finer resolution models by exposing larger scale controls on the distribution of landslides.

  17. The initial subevent of the 1994 Northridge, California, earthquake: Is earthquake size predictable?

    Microsoft Academic Search

    D. Kilb; J. Gomberg

    1999-01-01

    We examine the initial subevent (ISE) of the M 6.7, 1994 Northridge, California, earthquake in order to discriminate between two end-member rupture initiation models: the ‘preslip’ and ‘cascade’ models. Final earthquake size may be predictable from an ISE's seismic signature in the preslip model but not in the cascade model. In the cascade model ISEs are simply small earthquakes that

  18. A new procedure modeling the probability distribution of earthquake size

    NASA Astrophysics Data System (ADS)

    Wang, J. P.; Yun, X.; Chang, S. C.

    2014-11-01

    The probability distribution of earthquake size is needed as input for some earthquake analyses. A common procedure is to calibrate the so-called b-value in the Gutenberg-Richter relationship and to use it as the best-estimate model parameter in an algorithm to simulate the observed earthquake-size distribution. This paper introduces a new procedure for such a simulation, based on an optimization search for the optimum model parameter. The new option and an existing method are then both used to model the earthquake-size distribution around Taiwan since 1978. Owing to the nature and power of optimization, the three case studies presented in this paper all indicate that the new optimization procedure can indeed improve such a simulation over the existing procedure. Moreover, with a proper tool such as Excel Solver, applying the new method to model the observed earthquake-size distribution is as effortless as using the existing procedure.
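The two procedures being compared can be sketched side by side: classical b-value calibration via Aki's maximum-likelihood formula, and an optimization that minimizes the misfit between observed and predicted cumulative counts (the role a solver such as Excel Solver plays in the paper). The synthetic catalog, bins, and grid below are illustrative, not the paper's setup:

```python
import math
import random

random.seed(42)
B_TRUE, M_C, N = 1.0, 2.0, 20000

# Synthetic Gutenberg-Richter catalog: magnitudes above M_C are exponential.
mags = [M_C - math.log10(random.random()) / B_TRUE for _ in range(N)]

# Classical calibration: Aki (1965) maximum-likelihood estimator.
b_mle = math.log10(math.e) / (sum(mags) / N - M_C)

def sse(b):
    """Squared misfit between observed and predicted log10 cumulative
    counts, the objective a solver would minimize."""
    err = 0.0
    for m in [2.0, 2.5, 3.0, 3.5, 4.0]:
        n_obs = sum(1 for x in mags if x >= m)
        log_pred = math.log10(N) - b * (m - M_C)   # log10 N(>= m)
        err += (math.log10(n_obs) - log_pred) ** 2
    return err

# Optimization alternative: grid search over candidate b-values.
b_opt = min((round(0.80 + 0.01 * i, 2) for i in range(41)), key=sse)
```

Both estimates land near the true b = 1.0 here; on real catalogs the two can diverge, which is the gap the paper's optimization procedure targets.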

  19. Detection of hydrothermal precursors to large northern california earthquakes.

    PubMed

    Silver, P G; Valette-Silver, N J

    1992-09-01

    During the period 1973 to 1991, the interval between eruptions from a periodic geyser in Northern California exhibited precursory variations 1 to 3 days before the three largest earthquakes within a 250-kilometer radius of the geyser. These include the magnitude 7.1 Loma Prieta earthquake of 18 October 1989, for which a similar preseismic signal was recorded by a strainmeter located halfway between the geyser and the earthquake. These data show that at least some earthquakes possess observable precursors, one of the prerequisites for successful earthquake prediction. All three earthquakes were farther than 130 kilometers from the geyser, suggesting that precursors might be more easily found around, rather than within, the ultimate rupture zone of large California earthquakes. PMID:17738277

  20. Fundamental questions of earthquake statistics, source behavior, and the estimation of earthquake probabilities from possible foreshocks

    USGS Publications Warehouse

    Michael, Andrew J.

    2012-01-01

    Estimates of the probability that an ML 4.8 earthquake, which occurred near the southern end of the San Andreas fault on 24 March 2009, would be followed by an M 7 mainshock over the following three days vary from 0.0009 using a Gutenberg–Richter model of aftershock statistics (Reasenberg and Jones, 1989) to 0.04 using a statistical model of foreshock behavior and long-term estimates of large earthquake probabilities, including characteristic earthquakes (Agnew and Jones, 1991). I demonstrate that the disparity between the existing approaches depends on whether or not they conform to Gutenberg–Richter behavior. While Gutenberg–Richter behavior is well established over large regions, it could be violated on individual faults if they have characteristic earthquakes or over small areas if the spatial distribution of large-event nucleations is disproportionate to the rate of smaller events. I develop a new form of the aftershock model that includes characteristic behavior and combines the features of both models. This new model and the older foreshock model yield the same results when given the same inputs, but the new model has the advantage of producing probabilities for events of all magnitudes, rather than just for events larger than the initial one. Compared with the aftershock model, the new model has the advantage of taking into account long-term earthquake probability models. Using consistent parameters, the probability of an M 7 mainshock on the southernmost San Andreas fault is 0.0001 for three days from long-term models, and the clustering probabilities following the ML 4.8 event are 0.00035 for a Gutenberg–Richter distribution and 0.013 for a characteristic-earthquake magnitude–frequency distribution. Our decisions about the existence of characteristic earthquakes and how large earthquakes nucleate have a first-order effect on the probabilities obtained from short-term clustering models for these large events.
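The 0.0009 figure can be reproduced with the Reasenberg-Jones clustering rate, lambda(t, M) = 10^(a + b(Mm - M)) * (t + c)^(-p), integrated over the forecast window. The parameter values below (a = -1.67, b = 0.91, p = 1.08, c = 0.05 day) are the generic California values commonly quoted for this model; treat them as illustrative defaults rather than the paper's exact inputs:

```python
import math

def rj_probability(m_main, m_target, t1, t2,
                   a=-1.67, b=0.91, p=1.08, c=0.05):
    """P(one or more events >= m_target in [t1, t2] days after an event of
    magnitude m_main), from the Reasenberg-Jones clustering rate model."""
    k = 10.0 ** (a + b * (m_main - m_target))
    # Closed-form integral of (t + c)^(-p) over [t1, t2], valid for p != 1.
    n_expected = k * ((t2 + c) ** (1.0 - p) - (t1 + c) ** (1.0 - p)) / (1.0 - p)
    return 1.0 - math.exp(-n_expected)     # Poisson chance of at least one

# Three-day probability of an M >= 7 mainshock after an M 4.8 event:
p = rj_probability(4.8, 7.0, 0.0, 3.0)     # ~0.0009, matching the abstract
```

The characteristic-earthquake alternative replaces the 10^(-b dM) magnitude scaling with the fault's own magnitude-frequency budget, which is how the same trigger yields 0.013 instead.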

  1. Earthquakes and faults in southern California (1970-2010)

    USGS Publications Warehouse

    Sleeter, Benjamin M.; Calzia, James P.; Walter, Stephen R.

    2012-01-01

    The map depicts both active and inactive faults and earthquakes magnitude 1.5 to 7.3 in southern California (1970–2010). The bathymetry was generated from digital files from the California Department of Fish And Game, Marine Region, Coastal Bathymetry Project. Elevation data are from the U.S. Geological Survey National Elevation Database. Landsat satellite image is from fourteen Landsat 5 Thematic Mapper scenes collected between 2009 and 2010. Fault data are reproduced with permission from 2006 California Geological Survey and U.S. Geological Survey data. The earthquake data are from the U.S. Geological Survey National Earthquake Information Center.

  2. Earthquake preparedness levels amongst youth and adults in Oakland, California

    NASA Astrophysics Data System (ADS)

    Burris, M.; Arroyo-Ruiz, D.; Crockett, C.; Dixon, G.; Jones, M.; Lei, P.; Phillips, B.; Romero, D.; Scott, M.; Spears, D.; Tate, L.; Whitlock, J.; Diaz, J.; Chagolla, R.

    2011-12-01

    The San Francisco Bay Area has not experienced a large earthquake since 1989. However, paleoseismic research shows that the Hayward fault is overdue for a tremor. To analyze the level of earthquake preparedness in the Oakland area (close to the Hayward fault), we surveyed over 150 people to assess their understanding of earthquakes. Our research evaluates whether increased earthquake knowledge affects people's preparedness for, and concern about, earthquake events. Data were collected using smart-phone technology and survey software at four sites across Oakland: North Oakland, Downtown, East Oakland, and a summer school program in East Oakland that draws youth from throughout the city. Preliminary studies show that over 60% of interviewees have sufficient earthquake knowledge, but that over half of all interviewees are not prepared for a seismic event. Our study shows that earthquake preparedness levels in Oakland, California vary, which suggests a need to develop more ways to disseminate information on earthquake preparedness.

  3. School Safety Down to Earth: California's Earthquake-Resistant Schools.

    ERIC Educational Resources Information Center

    Progressive Architecture, 1979

    1979-01-01

    Schools in California being built to resist damage by earthquakes are part of a program to meet building standards established in 1933. The three new schools presented reflect the strengths and weaknesses of the program. (Author/MLF)

  4. Static Coulomb stress-based Southern California earthquake forecasts: A pseudoprospective test

    NASA Astrophysics Data System (ADS)

    Strader, Anne; Jackson, David D.

    2015-03-01

    Many studies support the hypothesis that where earthquakes occur, recent changes in resolved Coulomb stress tend to be positive. How about the converse hypothesis, that where resolved Coulomb stress recently increased, earthquakes are more likely to occur? Successful earthquake forecasting by Coulomb stress changes requires the converse. To test this, we calculated stress everywhere in our study area, not just at earthquake locations. We modeled stress accumulation in Southern California since 1812 both from the elastic effect of slip below locked faults and from M ≥ 5 "source" earthquakes up to any given date. To minimize the effect of secondary aftershocks not directly related to the source earthquakes, we measured seismicity using a gridded binary map: each 0.1° × 0.1° cell is "activated" if containing one or more test events ("receiver" earthquakes) of M ≥ 2.8. We then constructed an empirical relationship between resolved Coulomb stress and activation rate within regions with similar stress values, defining probabilities of activated cells during the "test" period, within 11 years of the M7.1 Hector Mine earthquake. We found that Coulomb stress reliably indicates future earthquake locations at the 95% confidence interval. However, smoothed seismicity forecasts outperformed Coulomb forecasts in some areas with large earthquakes due to aftershock clustering. Most earthquakes tend to nucleate in areas with Coulomb stress changes greater than 0.5 MPa or less than -0.5 MPa. Within areas with increased Coulomb stress from older earthquakes, fewer earthquakes occurred than anticipated. After reducing stress uncertainty impact, Coulomb rate-and-state forecasts may also improve upon statistical earthquake forecasts.
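The gridded binary seismicity measure is easy to sketch: a 0.1° × 0.1° cell counts as activated if it contains one or more receiver events of M ≥ 2.8, regardless of how many. The catalog entries below are hypothetical, for illustration only:

```python
# Minimal sketch of the binary activation map described above.
CELL = 0.1          # grid spacing in degrees
M_MIN = 2.8         # receiver-event magnitude threshold

def cell_of(lon, lat):
    """Index of the 0.1-degree cell containing (lon, lat)."""
    return (int(lon // CELL), int(lat // CELL))

def activation_map(catalog):
    """Set of activated cells for a catalog of (lon, lat, mag) tuples.
    A cell appears once no matter how many events fall inside it."""
    return {cell_of(lon, lat) for lon, lat, mag in catalog if mag >= M_MIN}

catalog = [
    (-116.27, 34.59, 7.1),   # a Hector Mine-like source event
    (-116.31, 34.62, 3.0),   # receivers in two other cells...
    (-116.45, 34.35, 2.9),
    (-116.86, 34.16, 2.0),   # ...and one below threshold, ignored
]
cells = activation_map(catalog)
```

Binarizing per cell is what damps the influence of prolific aftershock clusters: a cell with one event and a cell with a hundred contribute equally to the activation rate.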

  5. Assessment of Spatial Aftershock Probabilities: a Feasibility Study in Earthquake Hazard

    Microsoft Academic Search

    J. McCloskey

    2003-01-01

    Non-linearity in the generating dynamics of earthquakes which may forbid deterministic earthquake prediction does not preclude the estimation of earthquake probability and, in particular, how this might change in space and time. Recent developments in the understanding of stress triggering of earthquakes allow us satisfactorily to explain the special variation of aftershock distributions following any large earthquake. To date, however,

  6. Mathematical principles of predicting the probabilities of large earthquakes

    E-print Network

    Ghertzik, V M

    2009-01-01

    A multicomponent random process is used as a model for the problem of space-time earthquake prediction; this allows us to develop consistent estimates of the conditional probabilities of large earthquakes when the values of the predictor characterizing the seismicity prehistory are known. We introduce tools for assessing prediction efficiency, including a separate determination of efficiency for "time prediction" and "location prediction": a generalized correlation coefficient and the density of information gain. We suggest a technique for testing the predictor to decide whether the null hypothesis of no predictive power can be rejected.

  7. The magnitude distribution of declustered earthquakes in Southern California

    PubMed Central

    Knopoff, Leon

    2000-01-01

    The binned distribution densities of magnitudes in both the complete and the declustered catalogs of earthquakes in the Southern California region have two significantly different branches with crossover magnitude near M = 4.8. In the case of declustered earthquakes, the b-values on the two branches differ significantly from each other by a factor of about two. The absence of self-similarity across a broad range of magnitudes in the distribution of declustered earthquakes is an argument against the application of an assumption of scale-independence to models of main-shock earthquake occurrence, and in turn to the use of such models to justify the assertion that earthquakes are unpredictable. The presumption of scale-independence for complete local earthquake catalogs is attributable, not to a universal process of self-organization leading to future large earthquakes, but to the universality of the process that produces aftershocks, which dominate complete catalogs. PMID:11035770

  8. Simulations of the 1906 San Francisco Earthquake and Scenario Earthquakes in Northern California

    Microsoft Academic Search

    S. Larsen; D. Dreger; D. Dolenc

    2006-01-01

    3-D simulations of seismic ground motions are performed to better characterize the 1906 San Francisco earthquake and to investigate the seismic consequences from scenario events in northern California. Specifically, we perform simulations of: 1) the 1906 earthquake, which bilaterally ruptured a 480-km segment of the San Andreas fault from San Juan Bautista to Cape Mendocino (epicenter a few kilometers off

  9. Real-Time Foreshock Probability Forecasting Experiments in Japan, Southern California and Whole Globe

    NASA Astrophysics Data System (ADS)

    Ogata, Y.

    2014-12-01

    I am concerned with whether currently occurring earthquakes will be "foreshocks" of a significantly larger earthquake or not. When plural earthquakes occur in a region, I attempt to statistically discriminate foreshocks from a swarm or a mainshock-aftershock sequence. The forecast requires identification of an earthquake cluster using the single-link algorithm; the probability is then calculated based on the clustering strength and magnitude correlations. The probability forecast model was estimated from the JMA hypocenter data of earthquakes of M ≥ 4 in the period 1926-1993 (Ogata et al., 1996). We then presented the performance and validation of the forecasts during 1994-2010 using the same model (Ogata and Katsura, 2012). The forecasts perform significantly better than the unconditional (average) foreshock probability throughout the Japan region. The frequency of the actual foreshocks is consistent with the forecasted probabilities. In my poster, I discuss details of the outcomes of the forecasting and evaluations, and apply the forecasting to California and global catalogs to show some universality in the forecasting procedure. Reference: [1] Ogata, Y., Utsu, T. and Katsura, K. (1996). Statistical discrimination of foreshocks from other earthquake clusters, Geophys. J. Int. 127, 17-30. [2] Ogata, Y. and Katsura, K. (2012). Prospective foreshock forecast experiment during the last 17 years, Geophys. J. Int., 191, 1237-1244.
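The single-link cluster identification step can be sketched generically: two events join the same cluster whenever they fall within a space-time link threshold of any existing member, so membership is transitive. The thresholds and the tiny synthetic catalog below are illustrative, not the JMA settings:

```python
import math

def single_link_clusters(events, d_link=30.0, t_link=5.0):
    """events: list of (time_days, x_km, y_km) tuples. Returns clusters as
    lists of event indices, built by transitive (single-link) linking."""
    n = len(events)
    parent = list(range(n))           # union-find forest

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    for i in range(n):
        ti, xi, yi = events[i]
        for j in range(i + 1, n):
            tj, xj, yj = events[j]
            if abs(ti - tj) <= t_link and math.hypot(xi - xj, yi - yj) <= d_link:
                parent[find(i)] = find(j)   # link the two clusters

    clusters = {}
    for i in range(n):
        clusters.setdefault(find(i), []).append(i)
    return list(clusters.values())

events = [(0.0, 0.0, 0.0), (1.0, 5.0, 5.0), (2.0, 8.0, 2.0),   # one cluster
          (50.0, 200.0, 200.0)]                                # isolated event
clusters = single_link_clusters(events)
```

Once a cluster is isolated this way, its internal magnitude ordering and spacing feed the foreshock-versus-swarm probability model.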

  10. The Loma Prieta, California, Earthquake of October 17, 1989: Earthquake Occurrence

    USGS Publications Warehouse

    Coordinated by Bakun, William H.; Prescott, William H.

    1993-01-01

    Professional Paper 1550 seeks to understand the M6.9 Loma Prieta earthquake itself. It examines how the fault that generated the earthquake ruptured, searches for and evaluates precursors that may have indicated an earthquake was coming, reviews forecasts of the earthquake, and describes the geology of the earthquake area and the crustal forces that affect this geology. Some significant findings were:
    * Slip during the earthquake occurred on 35 km of fault at depths ranging from 7 to 20 km. Maximum slip was approximately 2.3 m. The earthquake may not have released all of the strain stored in rocks next to the fault, indicating that a potential for another damaging earthquake in the Santa Cruz Mountains may still exist in the near future.
    * The earthquake involved a large amount of uplift on a dipping fault plane. Pre-earthquake conventional wisdom was that large earthquakes in the Bay area occurred as horizontal displacements on predominantly vertical faults.
    * The fault segment that ruptured approximately coincided with a fault segment identified in 1988 as having a 30% probability of generating a M7 earthquake in the next 30 years. This was one of more than 20 relevant earthquake forecasts made in the 83 years before the earthquake.
    * Calculations show that the Loma Prieta earthquake changed stresses on nearby faults in the Bay area. In particular, the earthquake reduced stresses on the Hayward Fault, which decreased the frequency of small earthquakes on it.
    * Geological and geophysical mapping indicate that, although the San Andreas Fault can be mapped as a through-going fault in the epicentral region, the southwest-dipping Loma Prieta rupture surface is a separate fault strand and one of several along this part of the San Andreas that may be capable of generating earthquakes.

  11. Keeping the History in Historical Seismology: The 1872 Owens Valley, California Earthquake

    SciTech Connect

    Hough, Susan E. [U.S. Geological Survey, 525 South Wilson Avenue, Pasadena, California 91106 (United States)

    2008-07-08

    The importance of historical earthquakes is being increasingly recognized. Careful investigations of key pre-instrumental earthquakes can provide critical information and insights not only for seismic hazard assessment but also for earthquake science. In recent years, with the explosive growth in computational sophistication in Earth sciences, researchers have developed increasingly sophisticated methods to analyze macroseismic data quantitatively. These methodological developments can be extremely useful to exploit fully the temporally and spatially rich information source that seismic intensities often represent. For example, the exhaustive and painstaking investigations of early Himalayan earthquakes by Ambraseys and his colleagues provide information that can be used to map out site response in the Ganges basin. In any investigation of macroseismic data, however, one must stay mindful that intensity values are not data but rather interpretations. The results of any subsequent analysis, regardless of the degree of sophistication of the methodology, will be only as reliable as the interpretations of available accounts - and only as complete as the research done to ferret out, and in many cases translate, these accounts. When intensities are assigned without an appreciation of historical setting and context, seemingly careful subsequent analysis can yield grossly inaccurate results. As a case study, I report here on the results of a recent investigation of the 1872 Owens Valley, California, earthquake. Careful consideration of macroseismic observations reveals that this event was probably larger than the great San Francisco earthquake of 1906, and possibly the largest historical earthquake in California. The results suggest that some large earthquakes in California will generate significantly larger ground motions than San Andreas fault events of comparable magnitude.

  12. Discrepancy between earthquake rates implied by historic earthquakes and a consensus geologic source model for California

    USGS Publications Warehouse

    Petersen, M.D.; Cramer, C.H.; Reichle, M.S.; Frankel, A.D.; Hanks, T.C.

    2000-01-01

    We examine the difference between expected earthquake rates inferred from the historical earthquake catalog and the geologic data that was used to develop the consensus seismic source characterization for the state of California [California Department of Conservation, Division of Mines and Geology (CDMG) and U.S. Geological Survey (USGS) Petersen et al., 1996; Frankel et al., 1996]. On average the historic earthquake catalog and the seismic source model both indicate about one M 6 or greater earthquake per year in the state of California. However, the overall earthquake rates of earthquakes with magnitudes (M) between 6 and 7 in this seismic source model are higher, by at least a factor of 2, than the mean historic earthquake rates for both southern and northern California. The earthquake rate discrepancy results from a seismic source model that includes earthquakes with characteristic (maximum) magnitudes that are primarily between M 6.4 and 7.1. Many of these faults are interpreted to accommodate high strain rates from geologic and geodetic data but have not ruptured in large earthquakes during historic time. Our sensitivity study indicates that the rate differences between magnitudes 6 and 7 can be reduced by adjusting the magnitude-frequency distribution of the source model to reflect more characteristic behavior, by decreasing the moment rate available for seismogenic slip along faults, by increasing the maximum magnitude of the earthquake on a fault, or by decreasing the maximum magnitude of the background seismicity. However, no single parameter can be adjusted, consistent with scientific consensus, to eliminate the earthquake rate discrepancy. Applying a combination of these parametric adjustments yields an alternative earthquake source model that is more compatible with the historic data. 
The 475-year return period hazard for peak ground and 1-sec spectral acceleration resulting from this alternative source model differs from the hazard resulting from the standard CDMG-USGS model by less than 10% across most of California but is higher (generally about 10% to 30%) within 20 km from some faults.
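The tension between geologic source models and historic rates comes down to moment-balance arithmetic: a fault's slip rate fixes its seismic moment rate, and dividing by the moment of the characteristic event gives the implied event rate. A sketch of that arithmetic, with entirely hypothetical fault parameters (not values from the paper):

```python
RIGIDITY = 3.0e10        # shear modulus of crustal rock, Pa (typical value)

def moment_magnitude_to_moment(m):
    """Seismic moment in N*m from moment magnitude (Hanks-Kanamori scale)."""
    return 10.0 ** (1.5 * m + 9.05)

def characteristic_rate(length_km, width_km, slip_rate_mm_yr, m_char):
    """Events per year if all accumulated moment is released in
    characteristic earthquakes of magnitude m_char."""
    area = length_km * 1e3 * width_km * 1e3                   # m^2
    moment_rate = RIGIDITY * area * slip_rate_mm_yr * 1e-3    # N*m per year
    return moment_rate / moment_magnitude_to_moment(m_char)

# A hypothetical 20 km x 12 km fault slipping 5 mm/yr, releasing all of its
# moment in M 6.7 events, implies roughly one such event per ~350 years.
rate = characteristic_rate(20.0, 12.0, 5.0, 6.7)
recurrence = 1.0 / rate
```

Each of the paper's proposed adjustments moves one input of this calculation: lowering the seismogenic moment rate, raising the maximum magnitude, or spreading the moment across a Gutenberg-Richter distribution instead of a single characteristic size.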

  13. Depth dependence of earthquake frequency-magnitude distributions in California: Implications for rupture initiation

    USGS Publications Warehouse

    Mori, J.; Abercrombie, R.E.

    1997-01-01

    Statistics of earthquakes in California show linear frequency-magnitude relationships in the range of M2.0 to M5.5 for various data sets. Assuming Gutenberg-Richter distributions, there is a systematic decrease in b value with increasing depth of earthquakes. We find consistent results for various data sets from northern and southern California that both include and exclude the larger aftershock sequences. We suggest that at shallow depth (~0 to 6 km) conditions with more heterogeneous material properties and lower lithospheric stress prevail. Rupture initiations are more likely to stop before growing into large earthquakes, producing relatively more smaller earthquakes and consequently higher b values. These ideas help to explain the depth-dependent observations of foreshocks in the western United States. The higher occurrence rate of foreshocks preceding shallow earthquakes can be interpreted in terms of rupture initiations that are stopped before growing into the mainshock. At greater depth (9-15 km), any rupture initiation is more likely to continue growing into a larger event, so there are fewer foreshocks. If one assumes that frequency-magnitude statistics can be used to estimate probabilities of a small rupture initiation growing into a larger earthquake, then a small (M2) rupture initiation at 9 to 12 km depth is 18 times more likely to grow into a M5.5 or larger event, compared to the same small rupture initiation at 0 to 3 km. Copyright 1997 by the American Geophysical Union.
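The factor of 18 follows from Gutenberg-Richter arithmetic: if frequency-magnitude statistics give the chance that an initiation keeps growing, the probability of bridging the M2-to-M5.5 gap scales as 10^(-b * 3.5), so the deep-to-shallow ratio is 10^(3.5 * delta_b). The b-values below are assumed purely for illustration (a contrast of about 0.36 reproduces the paper's factor); the paper does not quote these exact values:

```python
DELTA_M = 5.5 - 2.0      # magnitude gap the growing rupture must bridge

def growth_probability_ratio(b_shallow, b_deep):
    """How much likelier a deep M2 initiation grows to >= M5.5 than a
    shallow one, if growth odds follow Gutenberg-Richter scaling."""
    return 10.0 ** ((b_shallow - b_deep) * DELTA_M)

# Assumed illustrative b-values with a 0.36 contrast between depth ranges:
ratio = growth_probability_ratio(b_shallow=1.15, b_deep=0.79)   # ~18
```

The steep 10^(3.5 delta_b) dependence is why even modest depth variations in b translate into order-of-magnitude differences in growth probability.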

  14. Towards Practical, Real-Time Estimation of Spatial Aftershock Probabilities: A Feasibility Study in Earthquake Hazard

    Microsoft Academic Search

    P. Morrow; J. McCloskey; S. Steacy

    2001-01-01

    It is now widely accepted that the goal of deterministic earthquake prediction is unattainable in the short term and may even be forbidden by nonlinearity in the generating dynamics. This nonlinearity does not, however, preclude the estimation of earthquake probability and, in particular, how this probability might change in space and time; earthquake hazard estimation might be possible in the

  15. Triggering of repeating earthquakes in central California

    USGS Publications Warehouse

    Wu, Chunquan; Gomberg, Joan; Ben-Naim, Eli; Johnson, Paul

    2014-01-01

    Dynamic stresses carried by transient seismic waves have been found capable of triggering earthquakes instantly in various tectonic settings. Delayed triggering may be even more common, but the mechanisms are not well understood. Catalogs of repeating earthquakes, earthquakes that recur repeatedly at the same location, provide ideal data sets to test the effects of transient dynamic perturbations on the timing of earthquake occurrence. Here we employ a catalog of 165 families containing ~2500 total repeating earthquakes to test whether dynamic perturbations from local, regional, and teleseismic earthquakes change recurrence intervals. The distance to the earthquake generating the perturbing waves is a proxy for the relative potential contributions of static and dynamic deformations, because static deformations decay more rapidly with distance. Clear changes followed the nearby 2004 Mw6 Parkfield earthquake, so we study only repeaters prior to its origin time. We apply a Monte Carlo approach to compare the observed number of shortened recurrence intervals following dynamic perturbations with the distribution of this number estimated for randomized perturbation times. We examine the comparison for a series of dynamic stress peak amplitude and distance thresholds. The results suggest a weak correlation between dynamic perturbations in excess of ~20 kPa and shortened recurrence intervals, for both nearby and remote perturbations.
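The Monte Carlo comparison described above is a standard randomization test: build the null distribution of the count of shortened recurrence intervals under randomized perturbation times, then ask how often the null meets or exceeds the observed count. A generic, self-contained sketch with synthetic numbers (the 65-of-100 observation and the 50% null rate are hypothetical, for illustration only):

```python
import random

random.seed(1)

def randomization_p_value(observed_count, null_counts):
    """One-sided p-value: fraction of randomized trials that produced at
    least as many shortened intervals as actually observed."""
    hits = sum(1 for c in null_counts if c >= observed_count)
    return hits / len(null_counts)

N_INTERVALS = 100        # repeating-earthquake recurrence intervals examined
P_SHORT_NULL = 0.5       # chance an interval counts as "shortened" by chance

# Null distribution: shortened-interval counts when perturbation times are
# random (a binomial, generated here by direct simulation).
null_counts = [sum(random.random() < P_SHORT_NULL for _ in range(N_INTERVALS))
               for _ in range(5000)]

# Suppose 65 of 100 intervals following real perturbations were shortened:
p_value = randomization_p_value(65, null_counts)   # small: unlikely by chance
```

A small p-value here would support triggering; the paper's actual test additionally conditions on dynamic stress amplitude and source distance thresholds.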

  16. Statistical analysis of earthquake event correlations in Virtual California

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Granat, R. A.; Rundle, J. B.; Kellogg, L. H.; Donnellan, A.

    2008-12-01

    The combination of advanced computer simulation tools and statistical analysis methods has yielded promising improvements in our understanding of the earthquake process. The Virtual California simulation tool can be used to study fault and stress interaction scenarios for realistic California earthquakes. The simulation generates a synthetic earthquake catalog of events with a minimum size of M 5.0 that can be evaluated using statistical analysis methods. Virtual California is a Monte Carlo based simulation code that utilizes realistic fault geometries and a rate- and state-dependent friction model to drive the earthquake process through stress interactions among faults and slip deficits on them. Like any complex system, Virtual California may generate emergent phenomena unexpected even by its designers. In order to investigate this, we have developed a statistical method that analyzes the interactions between Virtual California fault elements and thereby determines whether events on any given fault elements show correlated behavior. Our method examines events on one fault element and then determines whether there is an associated event within a specified time window on a second fault element. Note that an event in our analysis is defined as any time an element slips, rather than any particular "earthquake" along the entire fault length. Results are then tabulated and differenced with an expected correlation, calculated by assuming either (1) a uniform distribution of events in time or (2) a random distribution of events in time. We generate a correlation score matrix, which indicates how weakly or strongly correlated each fault element is to every other over the course of the VC simulation. We calculate correlation scores by summing the difference between the actual and expected correlations over all time window lengths and normalizing by the time window size.
The correlation score matrix can focus attention on the most interesting areas for more in-depth analysis of event correlation vs. time. We have performed this analysis on 59 faults (639 elements) in the model, which includes all the faults save the creeping section of the San Andreas. The analysis spans 40,000 yrs of Virtual California-generated earthquake data. Preliminary statistical analysis of the data yields promising insights into emergent behavior, such as interactions between fault elements that include long-range interaction between faults in different geographical regions (i.e. fault elements in northern California interacting with those in southern California). In addition, the analysis indicates possible triggering and quiescence relationships between events (i.e. between the southern San Andreas and the Eastern California Shear Zone). We will carry out further investigations to compare model results to geologic observations.
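A minimal version of the element-pair correlation score described above might look like this (names and the uniform-rate expectation are illustrative simplifications; the actual analysis used 639 elements over 40,000 simulated years):

```python
from collections import defaultdict

def correlation_scores(events, n_elements, span, windows=(1.0, 5.0, 10.0)):
    """events: list of (time, element_id) slip events.
    Returns {(i, j): score}, where the score sums, over window lengths w,
    the observed minus expected count of i-events followed by a j-event
    within w, normalized by w.  The expectation assumes events on j are
    uniformly distributed in time."""
    by_elem = defaultdict(list)
    for t, e in events:
        by_elem[e].append(t)
    for ts in by_elem.values():
        ts.sort()
    scores = {}
    for i in range(n_elements):
        for j in range(n_elements):
            if i == j or i not in by_elem or j not in by_elem:
                continue
            rate_j = len(by_elem[j]) / span      # uniform-rate model
            score = 0.0
            for w in windows:
                observed = sum(
                    any(ti < tj <= ti + w for tj in by_elem[j])
                    for ti in by_elem[i]
                )
                expected = len(by_elem[i]) * min(1.0, rate_j * w)
                score += (observed - expected) / w
            scores[(i, j)] = score
    return scores
```

With events on element 1 consistently following events on element 0, the (0, 1) score exceeds the (1, 0) score, which is the asymmetry a triggering relationship would produce.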

  17. The San Ardo, California, earthquake of 24 November 1985

    USGS Publications Warehouse

    Poley, C.M.

    1988-01-01

    Presented here are the main shock focal-plane solution and accompanying aftershock hypocentral distribution of an M = 4.5 earthquake that occurred at 1921 UTC on 24 November 1985 (at latitude 36°2.3'N, longitude 120°52.1'W) at 11.3 km depth, near the town of San Ardo, California. -Author

  18. Moho Orientation Beneath Central California From Regional Earthquake Travel Times

    Microsoft Academic Search

    David H. Oppenheimer; Jerry P. Eaton

    1984-01-01

    Analysis of relative Pn travel times recorded by 238 stations of the U.S. Geological Survey central California seismic network (CALNET) from 77 regional earthquakes and explosions shows that arrivals are progressively delayed to the northeast in both the Coast Ranges and Sierra Nevada foothills. By dividing the CALNET into 10 subarrays and assuming laterally uniform velocities for a crust over

  19. In the shadow of 1857-the effect of the great Ft. Tejon earthquake on subsequent earthquakes in southern California

    USGS Publications Warehouse

    Harris, R.A.; Simpson, R.W.

    1996-01-01

    The great 1857 Fort Tejon earthquake is the largest earthquake to have hit southern California during the historic period. We investigated whether seismicity patterns following 1857 could be due to static stress changes generated by the 1857 earthquake. When post-1857 earthquakes with unknown focal mechanisms were assigned strike-slip mechanisms with strike and rake determined by the nearest active fault, 13 of the 13 southern California M ≥ 5.5 earthquakes between 1857 and 1907 were encouraged by the 1857 rupture. When post-1857 earthquakes in the Transverse Ranges with unknown focal mechanisms were assigned reverse mechanisms and all other events were assumed strike-slip, 11 of the 13 earthquakes were encouraged by the 1857 earthquake. These results show significant correlations between static stress changes and seismicity patterns. The correlation disappears around 1907, suggesting that tectonic loading began to overwhelm the effect of the 1857 earthquake early in the 20th century.

  20. Transient Response of Seismicity and Earthquake Probabilities to Stress Transfer in a Brownian Earthquake Model

    NASA Astrophysics Data System (ADS)

    Ellsworth, W. L.; Matthews, M. V.; Simpson, R. W.

    2001-12-01

    A statistical mechanical description of elastic rebound is used to study earthquake interaction and stress transfer effects in a point process model of earthquakes. The model is a Brownian Relaxation Oscillator (BRO) in which a random walk (standard Brownian motion) is added to a steady tectonic loading to produce a stochastic load state process. Rupture occurs in this model when the load state reaches a critical value. The load state is a random variable and may be described at any point in time by its probability density. Load state evolves toward the failure threshold due to tectonic loading (drift), and diffuses due to Brownian motion (noise) according to a diffusion equation. The Brownian perturbation process formally represents the sum total of all factors, aside from tectonic loading, that govern rupture. Physically, these factors may include effects of earthquakes external to the source, aseismic loading, interaction effects within the source itself, healing, pore pressure evolution, etc. After a sufficiently long time, load state always evolves to a steady state probability density that is independent of the initial condition and completely described by the drift rate and noise scale. Earthquake interaction and stress transfer effects are modeled by an instantaneous change in the load state. A negative step reduces the probability of failure, while a positive step may either immediately trigger rupture or increase the failure probability (hazard). When the load state is far from failure, the effects are well approximated by "clock advances" that shift the unperturbed hazard down or up, as appropriate for the sign of the step. However, when the load state is advanced in the earthquake cycle, the response is a sharp, temporally localized decrease or increase in hazard. Recovery of the hazard is characteristically "Omori like" (~1/t), which can be understood in terms of equilibrium thermodynamical considerations since state evolution is diffusion with drift. The recovery from a step is nearly identical in form to that given by rate and state friction models for interaction. Thus, the BRO offers an alternative approach to modeling the transient response of seismicity to stress perturbations in a formulation that contains a complete and self-consistent probabilistic model.
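The load-state dynamics described above are easy to simulate directly (a toy discretization with illustrative parameter values, not the authors' formulation; the reflecting barrier at zero load is a simplification):

```python
import random

def bro_mean_cycle(drift=1.0, noise=0.5, threshold=1.0, step=0.0,
                   step_time=None, n_cycles=500, dt=1e-3, seed=42):
    """Simulate the Brownian Relaxation Oscillator: the load state grows
    at rate `drift`, diffuses with scale `noise`, and the cycle ends
    (rupture) when the state reaches `threshold`.  An optional
    instantaneous stress step of size `step` is applied at `step_time`
    within each cycle.  Returns the mean cycle length."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_cycles):
        x, t = 0.0, 0.0
        while x < threshold:
            if step_time is not None and abs(t - step_time) < dt / 2:
                x += step                   # instantaneous stress transfer
            x += drift * dt + noise * rng.gauss(0.0, 1.0) * dt ** 0.5
            x = max(x, 0.0)                 # reflecting barrier (simplification)
            t += dt
        total += t
    return total / n_cycles
```

A positive stress step applied early in the cycle shortens the mean recurrence time, while a negative step lengthens it, mirroring the clock-advance behavior described in the abstract.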

  1. 1957 Gobi-Altay, Mongolia, earthquake as a prototype for southern California's most devastating earthquake

    Microsoft Academic Search

    C. Bayarsayhan; A. Bayasgalan; B. Enhtuvshin; R. A. Kurushin; Peter Molnar; M. Ölziybat

    1996-01-01

    The 1957 Gobi-Altay earthquake was associated with both strike-slip and thrust faulting, processes similar to those along the San Andreas fault and the faults bounding the San Gabriel Mountains just north of Los Angeles, California. Clearly, a major rupture either on the San Andreas fault north of Los Angeles or on the thrust faults bounding the Los Angeles basin poses

  2. 1957 Gobi-Altay, Mongolia, earthquake as a prototype for southern California's most devastating earthquake

    E-print Network

    Mojzsis, Stephen J.

    1957 Gobi-Altay, Mongolia, earthquake as a prototype for southern California's most devastating earthquake. A. Bayasgalan, B. Enhtuvshin, Centre for Informatics and Remote Sensing, Mongolian Academy of Sciences, Ulaanbaatar 210351, Mongolia; Kenneth W. Hudnut, U.S. Geological Survey, 525 South Wilson Avenue

  3. Southern California Earthquake Center Operates 1991 present, $3 -$5 million per year

    E-print Network

    Southern California Earthquake Center · Operates 1991-present, $3 - $5 million per year · NSF, USC · High profile seismic hazard reports from 1993 · Community data bases - faults, earthquakes, 3-D faults · Quake rates elsewhere · Putting it all together ... Uniform California Earthquake Rupture Forecast

  4. An earthquake detection algorithm with pseudo-probabilities of multiple indicators

    NASA Astrophysics Data System (ADS)

    Ross, Z. E.; Ben-Zion, Y.

    2014-04-01

    We develop an automatic earthquake detection algorithm combining information from numerous indicator variables in a non-parametric framework. The method is shown to perform well with multiple ratios of moving short- and long-time averages having ranges of time intervals and frequency bands. The results from each indicator are transformed to a pseudo-probability time-series (PPTS) in the range [0, 1]. The various PPTS of the different indicators are multiplied to form a single joint PPTS that is used for detections. Since all information is combined, redundancy among the different indicators produces robust peaks in the output. This allows the trigger threshold applied to the joint PPTS to be significantly lower than for any one detector, leading to substantially more detected earthquakes. Application of the algorithm to a small data set recorded during a 7-d window by 13 stations near the San Jacinto fault zone detects 3.13 times as many earthquakes as listed in the Southern California Seismic Network catalogue. The method provides a convenient statistical platform for including other indicators, and may utilize different sets of indicators to detect other information such as specific seismic phases or tremor.
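The multiplicative combination of indicators can be sketched as follows (the STA/LTA windows, the min-max rescaling used here as the pseudo-probability transform, and the detection threshold are illustrative assumptions, not the paper's actual choices):

```python
def sta_lta(signal, short, long_):
    """Ratio of short- to long-term moving averages of |signal|."""
    out = []
    for i in range(len(signal)):
        s = signal[max(0, i - short + 1): i + 1]
        l = signal[max(0, i - long_ + 1): i + 1]
        sta = sum(abs(v) for v in s) / len(s)
        lta = sum(abs(v) for v in l) / len(l)
        out.append(sta / lta if lta > 0 else 0.0)
    return out

def to_ppts(ratios):
    """Map an indicator to [0, 1] by rescaling over its observed range
    (one simple choice of pseudo-probability transform)."""
    lo, hi = min(ratios), max(ratios)
    span = (hi - lo) or 1.0
    return [(r - lo) / span for r in ratios]

def joint_ppts(indicator_series):
    """Multiply the individual PPTS, so only times where all indicators
    agree produce a strong joint detection peak."""
    joint = [1.0] * len(indicator_series[0])
    for series in indicator_series:
        joint = [j * p for j, p in zip(joint, to_ppts(series))]
    return joint

def detect(joint, threshold=0.1):
    """Indices where the joint pseudo-probability exceeds the threshold."""
    return [i for i, v in enumerate(joint) if v >= threshold]
```

On a synthetic trace with a single burst, indicators with different window lengths peak at slightly different times, but their product still concentrates detections near the onset while suppressing the individual indicators' background fluctuations.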

  5. Short-term earthquake probabilities during the L'Aquila earthquake sequence in central Italy, 2009

    NASA Astrophysics Data System (ADS)

    Falcone, G.; Murru, M.; Zhuang, J.; Console, R.

    2014-12-01

    We compare the forecasting performance of several statistical models, which are used to describe the occurrence process of earthquakes, in forecasting the short-term earthquake probabilities during the occurrence of the L'Aquila earthquake sequence in central Italy, 2009. These models include the Proximity to Past Earthquakes (PPE) model and different versions of the Epidemic Type Aftershock Sequence (ETAS) model. We used the information gains corresponding to the Poisson and binomial scores to evaluate the performance of these models. It is shown that all ETAS models work better than the PPE model. However, when comparing the different types of ETAS models, the one with the same fixed exponent coefficient α = 2.3 for both the productivity function and the scaling factor in the spatial response function performs better in forecasting the active aftershock sequence than the other models with different exponent coefficients when the Poisson score is adopted. The latter models perform better only when a lower magnitude threshold of 2.0 and the binomial score are used. The reason is likely that the catalog does not contain an event of magnitude similar to the L'Aquila main shock (Mw 6.3) in the training period (April 16, 2005 to March 15, 2009). In this case the a-value is underestimated, and thus the forecasted seismicity is also underestimated when the productivity function is extrapolated to high magnitudes. These results suggest that the training catalog used for estimating the model parameters should include earthquakes of magnitude similar to the main shock when forecasting seismicity during an aftershock sequence.

  6. Earthquake!

    ERIC Educational Resources Information Center

    Hernandez, Hildo

    2000-01-01

    Examines the types of damage experienced by California State University, Northridge, during the 1994 earthquake and discusses the lessons learned in handling this emergency. The problem of loose asbestos is addressed. (GR)

  7. Significance of stress transfer in time-dependent earthquake probability calculations

    E-print Network

    of forecasts; so how large a static stress change is required to cause a statistically significant earthquake ..., 2003]. [3] If we want to make a probabilistic earthquake forecast in a region under the influence ... Significance of stress transfer in time-dependent earthquake probability calculations. Tom Parsons, U

  8. Automatic 3D Moment tensor inversions for southern California earthquakes

    NASA Astrophysics Data System (ADS)

    Liu, Q.; Tape, C.; Friberg, P.; Tromp, J.

    2008-12-01

    We present a new source mechanism (moment-tensor and depth) catalog for about 150 recent southern California earthquakes with Mw ≥ 3.5. We carefully select the initial solutions from a few available earthquake catalogs as well as our own preliminary 3D moment tensor inversion results. We pick useful data windows by assessing the quality of fits between the data and synthetics using an automatic windowing package, FLEXWIN (Maggi et al., 2008). We compute the source Fréchet derivatives of moment-tensor elements and depth for a recent 3D southern California velocity model inverted based upon finite-frequency event kernels calculated by the adjoint methods and a nonlinear conjugate gradient technique with subspace preconditioning (Tape et al., 2008). We then invert for the source mechanisms and event depths based upon the techniques introduced by Liu et al. (2005). We assess the quality of this new catalog, as well as the other existing ones, by computing the 3D synthetics for the updated 3D southern California model. We also plan to implement the moment-tensor inversion methods to automatically determine the source mechanisms for earthquakes with Mw ≥ 3.5 in southern California.

  9. Deterministic Earthquake Hazard Assessment by Public Agencies in California

    NASA Astrophysics Data System (ADS)

    Mualchin, L.

    2005-12-01

    Even in its short recorded history, California has experienced a number of damaging earthquakes that have resulted in new codes and other legislation for public safety. In particular, the 1971 San Fernando earthquake produced some of the most lasting results, such as the Hospital Safety Act, the Strong Motion Instrumentation Program, the Alquist-Priolo Special Studies Zone Act, and the California Department of Transportation (Caltrans) fault-based deterministic seismic hazard (DSH) map. The latter product provides values for earthquake ground motions based on Maximum Credible Earthquakes (MCEs), defined as the largest earthquakes that can reasonably be expected on faults in the current tectonic regime. For surface fault rupture displacement hazards, detailed studies of the same faults apply. Originally, hospitals, dams, and other critical facilities used seismic design criteria based on deterministic seismic hazard analyses (DSHA). However, probabilistic methods grew and took hold by introducing earthquake design criteria based on time factors and by quantifying uncertainties through procedures such as logic trees. These probabilistic seismic hazard analyses (PSHA) ignored the DSH approach, and some agencies were influenced to adopt only the PSHA method. However, deficiencies in the PSHA method are becoming recognized, and its use is now the focus of strong debate. Caltrans is in the process of producing the fourth edition of its DSH map. Caltrans prefers the DSH method because it believes it is more realistic than the probabilistic method for assessing earthquake hazards that may affect critical facilities, and is the best available method for ensuring public safety. Its time-invariant values help to produce robust design criteria that are soundly based on physical evidence, and it is the method that offers the least opportunity for unwelcome surprises.

  10. Earthquake resistant building design codes and safety standards: The California experience

    Microsoft Academic Search

    Stephen H. Cutcliffe

    2000-01-01

    Seismologists and earthquake engineers have sought to understand and predict earthquakes and to develop better building designs to withstand them for well over a century. In the United States, the 1906 San Francisco earthquake provided the first real impetus for establishing building design codes and safety standards. Subsequent major California earthquakes in Santa Barbara (1925), Long Beach (1933), San Fernando

  11. UNIVERSITY OF SOUTHERN CALIFORNIA BUILDING PERIODS FOR USE IN EARTHQUAKE RESISTANT DESIGN

    E-print Network

    Southern California, University of

    UNIVERSITY OF SOUTHERN CALIFORNIA BUILDING PERIODS FOR USE IN EARTHQUAKE RESISTANT DESIGN CODES October 3, 2007 ... BUILDING PERIODS FOR USE IN EARTHQUAKE RESISTANT DESIGN CODES - EARTHQUAKE RESPONSE DATA COMPILATION AND ANALYSIS OF TIME AND AMPLITUDE VARIATIONS. Final Project Report

  12. Should Coulomb stress change calculations be used to forecast aftershocks and to influence earthquake probability estimates? (Invited)

    NASA Astrophysics Data System (ADS)

    Parsons, T.

    2009-12-01

    After a large earthquake, our concern immediately moves to the likelihood that another large shock could be triggered, threatening an already weakened building stock. A key question is whether it is best to map out Coulomb stress change calculations shortly after mainshocks to potentially highlight the most likely aftershock locations, or whether it is more prudent to wait until the best information is available. It has been shown repeatedly that spatial aftershock patterns can be matched with Coulomb stress change calculations a year or more after mainshocks. However, with the onset of rapid source slip model determinations, the method has produced encouraging results, such as the M=8.7 earthquake that was forecast by McCloskey et al. [2005] using stress change calculations from the 2004 great Sumatra earthquake. Here, I look back at two additional prospective calculations published shortly after the 2005 M=7.6 Kashmir and 2008 M=8.0 Wenchuan earthquakes. With the benefit of 1.5-4 years of additional seismicity, it is possible to assess the performance of rapid Coulomb stress change calculations. In the second part of the talk, within the context of the ongoing Working Group on California Earthquake Probabilities (WGCEP) assessments, uncertainties associated with time-dependent probability calculations are convolved with uncertainties inherent to Coulomb stress change calculations to assess the strength of signal necessary for a physics-based calculation to merit consideration in a formal earthquake forecast. Conclusions are as follows: (1) subsequent aftershock occurrence shows that prospective static stress change calculations for both the Kashmir and Wenchuan examples failed to adequately predict the spatial post-mainshock earthquake distributions.
(2) For a San Andreas fault example with relatively well-understood recurrence, a static stress change on the order of 30 to 40 times the annual stressing rate would be required to cause a significant (90%) perturbation to the distribution of allowable 30-year time-dependent probability results.

  13. MOHO ORIENTATION BENEATH CENTRAL CALIFORNIA FROM REGIONAL EARTHQUAKE TRAVEL TIMES.

    USGS Publications Warehouse

    Oppenheimer, David H.; Eaton, Jerry P.

    1984-01-01

    This paper examines relative Pn arrival times, recorded by the U. S. Geological Survey seismic network in central and northern California from an azimuthally distributed set of regional earthquakes. Improved estimates are presented of upper mantle velocities in the Coast Ranges, Great Valley, and Sierra Nevada foothills and estimates of the orientation of the Moho throughout this region. Finally, the azimuthal distribution of apparent velocities, corrected for dip and individual station travel time effects, is then studied for evidence of upper mantle velocity anisotropy and for indications of lower crustal structure in central California.

  14. Recalculated probability of M ≥ 7 earthquakes beneath the Sea of Marmara, Turkey

    E-print Network

    Recalculated probability of M ≥ 7 earthquakes beneath the Sea of Marmara, Turkey Tom Parsons U February 2004; published 22 May 2004. [1] New earthquake probability calculations are made for the Sea of Marmara region and the city of Istanbul, providing a revised forecast and an evaluation of time

  15. UNIVERSITY OF CALIFORNIA, SAN DIEGO Parallel Finite Element Modeling of Earthquake Ground

    E-print Network

    Stanford University

    UNIVERSITY OF CALIFORNIA, SAN DIEGO Parallel Finite Element Modeling of Earthquake Ground Response. University of California, San Diego, 2006. Table of Contents: Signature Page ... 1.2 Parallel Computing in FE Analysis

  16. Cascadia Earthquake and Tsunami Scenario for California's North Coast

    NASA Astrophysics Data System (ADS)

    Dengler, L.

    2006-12-01

    In 1995 the California Division of Mines and Geology (now the California Geological Survey) released a planning scenario for an earthquake on the southern portion of the Cascadia subduction zone (CSZ). This scenario was the 8th and last of the Earthquake Planning Scenarios published by CDMG. It was the largest magnitude CDMG scenario, an 8.4 earthquake rupturing the southern 200 km of the CSZ, and it was the only scenario to include tsunami impacts. This scenario event has not occurred in historic times and depicts impacts far more severe than any recent earthquake. The local tsunami hazard is new; there is no written record of significant local tsunami impact in the region. The north coast scenario received considerable attention in Humboldt and Del Norte Counties and contributed to a number of mitigation efforts. The Redwood Coast Tsunami Work Group (RCTWG), an organization of scientists, emergency managers, government agencies, and businesses from Humboldt, Mendocino, and Del Norte Counties, was formed in 1996 to assist local jurisdictions in understanding the implications of the scenario and to promote a coordinated, consistent mitigation program. The group has produced print and video materials and promoted response and evacuation planning. Since 1997 the RCTWG has sponsored an Earthquake Tsunami Education Room at county fairs featuring preparedness information, hands-on exhibits and regional tsunami hazard maps. Since the development of the TsunamiReady Program in 2001, the RCTWG facilitates community TsunamiReady certification. To assess the effectiveness of mitigation efforts, five telephone surveys between 1993 and 2001 were conducted by the Humboldt Earthquake Education Center. A sixth survey is planned for this fall. Each survey includes between 400 and 600 respondents. 
Over the nine-year period covered by the surveys, the percentage of houses secured to foundations increased from 58 to 80 percent, respondents aware of a local tsunami hazard increased from 51 to 73 percent, and those knowing what the Cascadia subduction zone is increased from 16 to 42 percent. It is not surprising that the earlier surveys showed increases, as several strong earthquakes occurred in the area between 1992 and 1995 and there was considerable media attention. But the 2001 survey, seven years after the last widely felt event, still shows significant increases in almost all preparedness indicators. The 1995 CDMG scenario was not the sole reason for the increased interest in earthquake and tsunami hazards in the area, but the scenario gave government recognition to an event that was previously considered seriously only in the scientific community, and it has acted as a catalyst for mitigation and planning efforts.

  17. Southern California Edison's Evaluation of California Energy Commission

    E-print Network

    ... the seismic hazard at SONGS. The analysis specifically included the Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2), by the 2007 Working Group on California Earthquake Probabilities (WGCEP, 2008), which is the joint product of the Southern California Earthquake Center (SCEC

  18. High Resolution Long- and Short-Term Earthquake Forecasts for California

    E-print Network

    Werner, M J; Jackson, D D; Kagan, Y Y

    2009-01-01

    We present two models for estimating the probabilities of future earthquakes in California, to be tested in the Collaboratory for the Study of Earthquake Predictability (CSEP). The first, time-independent model, modified from Helmstetter et al. (2007), provides five-year forecasts for magnitudes m > 4.95. We show that large quakes occur on average near the locations of small m > 2 events, so that a high-resolution estimate of the spatial distribution of future large quakes is obtained from the locations of the numerous small events. We employ an adaptive spatial kernel of optimized bandwidth and assume a universal, tapered Gutenberg-Richter distribution. In retrospective tests, we show that no Poisson forecast could capture the observed variability. We therefore also test forecasts using a negative binomial distribution for the number of events. We modify existing likelihood-based tests to better evaluate the spatial forecast. Our time-dependent model, an Epidemic Type Aftershock Sequence (ETAS) model modifie...
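The adaptive-kernel idea above, estimating where large quakes are likely from the locations of many small ones, can be illustrated with a planar toy version (the k-nearest-neighbor bandwidth rule captures the spirit of the method; the coordinates, normalization, and parameters here are simplifications, not the authors' optimized choices):

```python
import math

def smoothed_rate_map(epicenters, grid, bandwidth_k=2):
    """Smooth past epicenters onto grid cells with an adaptive Gaussian
    kernel: each event's bandwidth is its distance to its k-th nearest
    neighbor, so dense clusters get sharp kernels and sparse regions
    broad ones.  Coordinates are treated as planar for simplicity."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    rates = []
    for cell in grid:
        total = 0.0
        for e in epicenters:
            others = sorted(dist(e, o) for o in epicenters if o is not e)
            h = max(others[bandwidth_k - 1], 1e-6)   # adaptive bandwidth
            r = dist(cell, e)
            total += math.exp(-r * r / (2 * h * h)) / (2 * math.pi * h * h)
        rates.append(total)
    s = sum(rates)
    return [r / s for r in rates]   # normalize to a spatial probability map
```

Cells near a tight cluster of past epicenters receive most of the probability mass, while an isolated event contributes only a broad, low-amplitude bump.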

  19. Earthquake epicenters and fault intersections in central and southern California

    NASA Technical Reports Server (NTRS)

    Abdel-Gawad, M. (principal investigator); Silverstein, J.

    1972-01-01

    The author has identified the following significant results. ERTS-1 imagery provided evidence for the existence of short transverse fault segments lodged between faults of the San Andreas system in the Coast Ranges, California. They indicate that an early episode of transverse shear has affected the Coast Ranges prior to the establishment of the present San Andreas fault. The fault has been offset by transverse faults of the Transverse Ranges. It appears feasible to identify from ERTS-1 imagery geomorphic criteria of recent fault movements. Plots of historic earthquakes in the Coast Ranges and western Transverse Ranges show clusters in areas where structures are complicated by the interaction of two active fault systems. A fault lineament apparently not previously mapped was identified in the Uinta Mountains, Utah. Part of the lineament shows evidence of recent faulting, which corresponds to a moderate earthquake cluster.

  20. Comparison of Short-Term and Time-Independent Earthquake Forecast Models for Southern California

    Microsoft Academic Search

    Agnes Helmstetter; Yan Y. Kagan; David D. Jackson

    2006-01-01

    We have initially developed a time-independent forecast for southern California by smoothing the locations of magnitude 2 and larger earthquakes. We show that using small m ≥ 2 earthquakes gives a reasonably good prediction of m ≥ 5 earthquakes. Our forecast outperforms other time-independent models (Kagan and Jackson, 1994; Frankel et al., 1997), mostly because it has higher spatial resolution. We have

  1. Foreshocks and aftershocks of the Great 1857 California earthquake

    USGS Publications Warehouse

    Meltzner, A.J.; Wald, D.J.

    1999-01-01

    The San Andreas fault is the longest fault in California and one of the longest strike-slip faults anywhere in the world, yet we know little about many aspects of its behavior before, during, and after large earthquakes. We conducted a study to locate and to estimate magnitudes for the largest foreshocks and aftershocks of the 1857 M 7.9 Fort Tejon earthquake on the central and southern segments of the fault. We began by searching archived first-hand accounts from 1857 through 1862, by grouping felt reports temporally, and by assigning modified Mercalli intensities to each site. We then used a modified form of the grid-search algorithm of Bakun and Wentworth, derived from empirical analysis of modern earthquakes, to find the location and magnitude most consistent with the assigned intensities for each of the largest events. The result confirms a conclusion of Sieh that at least two foreshocks ('dawn' and 'sunrise') located on or near the Parkfield segment of the San Andreas fault preceded the mainshock. We estimate their magnitudes to be M ~ 6.1 and M ~ 5.6, respectively. The aftershock rate was below average but within one standard deviation of the number of aftershocks expected based on statistics of modern southern California mainshock-aftershock sequences. The aftershocks included two significant events during the first eight days of the sequence, with magnitudes M ~ 6.25 and M ~ 6.7, near the southern half of the rupture; later aftershocks included a M ~ 6 event near San Bernardino in December 1858 and a M ~ 6.3 event near the Parkfield segment in April 1860. From earthquake logs at Fort Tejon, we conclude that the aftershock sequence lasted a minimum of 3.75 years.
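The grid-search step borrowed from Bakun and Wentworth can be sketched as follows (the attenuation coefficients below are generic placeholders rather than the calibrated California values, and the misfit measure is simplified to the spread of site-implied magnitudes):

```python
import math, statistics

def locate_from_intensities(sites, grid, c0=-1.0, c1=1.0, c2=3.0):
    """Grid-search location and magnitude from assigned intensities, in
    the spirit of Bakun and Wentworth.  For each trial epicenter, the
    magnitude implied by site i is M_i = (I_i - c0 + c2*log10(d_i))/c1
    (a generic intensity-attenuation form with illustrative
    coefficients).  The preferred epicenter minimizes the spread of the
    site-implied magnitudes."""
    best = None
    for gx, gy in grid:
        mags = []
        for (sx, sy), intensity in sites:
            d = max(math.hypot(sx - gx, sy - gy), 1.0)   # distance, floored
            mags.append((intensity - c0 + c2 * math.log10(d)) / c1)
        spread = statistics.pstdev(mags)
        if best is None or spread < best[0]:
            best = (spread, (gx, gy), statistics.mean(mags))
    return best[1], best[2]
```

Given intensities generated by the same attenuation form from a true epicenter, the search recovers that epicenter exactly (zero spread) along with the generating magnitude.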

  2. Variability of Near-Term Probability for the Next Great Earthquake on the Cascadia Subduction Zone

    Microsoft Academic Search

    Stephane Mazzotti; John Adams

    2004-01-01

    The threat of a great (M 9) earthquake along the Cascadia subduction zone is evidenced by both paleoseismology data and current strain accumulation along the fault. On the basis of recent information on the characteristics of this subduction system, we estimate the conditional probabilities of a great earthquake occurring within the next 50 years and their variabilities. The most important

  3. Probability map of the next M ≥ 5.5 earthquakes in Italy

    Microsoft Academic Search

    F. R. Cinti; L. Faenza; W. Marzocchi; P. Montone

    2004-01-01

    The main goal of this work is to provide a probability map for the next moderate to large earthquakes (M ≥ 5.5) in Italy. For this purpose we apply a new nonparametric multivariate model to characterize the spatiotemporal distribution of earthquakes. The method is able to account for tectonics/physics parameters that can potentially influence the spatiotemporal variability and tests their

  4. Error propagation in time-dependent probability of occurrence for characteristic earthquakes in Italy

    Microsoft Academic Search

    Laura Peruzza; Bruno Pace; Fabio Cavallini

    2010-01-01

    Time-dependent models for seismic hazard and earthquake probabilities are at the leading edge of research nowadays. In the framework of a 2-year national Italian project (2005–2007), we have applied the Brownian passage time (BPT) renewal model to the recently released Database of Individual Seismogenic Sources (DISS) to compute earthquake probability in the period 2007–2036. Observed interevent times on faults in
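For reference, the BPT conditional probability used in such time-dependent calculations follows directly from the inverse-Gaussian CDF (a standard formula; the mean recurrence and aperiodicity values in the usage example are hypothetical, not DISS values):

```python
import math

def bpt_cdf(t, mu, alpha):
    """CDF of the Brownian passage time (inverse Gaussian) distribution
    with mean recurrence `mu` and aperiodicity `alpha`."""
    if t <= 0:
        return 0.0
    phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    u = math.sqrt(mu / (alpha ** 2 * t))
    return (phi(u * (t / mu - 1.0))
            + math.exp(2.0 / alpha ** 2) * phi(-u * (t / mu + 1.0)))

def conditional_probability(elapsed, window, mu, alpha):
    """P(event in (elapsed, elapsed + window] | no event by `elapsed`)."""
    f0 = bpt_cdf(elapsed, mu, alpha)
    f1 = bpt_cdf(elapsed + window, mu, alpha)
    return (f1 - f0) / (1.0 - f0)
```

For example, with a hypothetical mean recurrence of 140 yr and aperiodicity 0.5, the 30-yr conditional probability grows substantially as the elapsed time since the last event accumulates, which is the renewal behavior these models exploit.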

  5. SCIGN; new Southern California GPS network advances the study of earthquakes

    USGS Publications Warehouse

    Hudnut, Ken; King, Nancy

    2001-01-01

    Southern California is a giant jigsaw puzzle, and scientists are now using GPS satellites to track the pieces. These puzzle pieces are continuously moving, slowly straining the faults in between. That strain is then eventually released in earthquakes. The innovative Southern California Integrated GPS Network (SCIGN) tracks the motions of these pieces over most of southern California with unprecedented precision. This new network greatly improves the ability to assess seismic hazards and quickly measure the larger displacements that occur during and immediately after earthquakes.

  6. Catalog of earthquakes along the San Andreas fault system in Central California, April-June 1972

    USGS Publications Warehouse

    Wesson, R.L.; Bennett, R.E.; Lester, F.W.

    1973-01-01

    Numerous small earthquakes occur each day in the coast ranges of Central California. The detailed study of these earthquakes provides a tool for gaining insight into the tectonic and physical processes responsible for the generation of damaging earthquakes. This catalog contains the fundamental parameters for earthquakes located within and adjacent to the seismograph network operated by the National Center for Earthquake Research (NCER), U.S. Geological Survey, during the period April - June, 1972. The motivation for these detailed studies has been described by Pakiser and others (1969) and by Eaton and others (1970). Similar catalogs of earthquakes for the years 1969, 1970 and 1971 have been prepared by Lee and others (1972 b, c, d). A catalog for the first quarter of 1972 has been prepared by Wesson and others (1972). The basic data contained in these catalogs provide a foundation for further studies. This catalog contains data on 910 earthquakes in Central California. A substantial portion of the earthquakes reported in this catalog represents a continuation of the sequence of earthquakes in the Bear Valley area which began in February, 1972 (Wesson and others, 1972). Arrival times at 126 seismograph stations were used to locate the earthquakes listed in this catalog. Of these, 101 are telemetered stations operated by NCER. Readings from the remaining 25 stations were obtained through the courtesy of the Seismographic Stations, University of California, Berkeley (UCB); the Earthquake Mechanism Laboratory, National Oceanic and Atmospheric Administration, San Francisco (EML); and the California Department of Water Resources, Sacramento. The Seismographic Stations of the University of California, Berkeley, have for many years published a bulletin describing earthquakes in Northern California and the surrounding area, and readings at UCB Stations from more distant events. 
The purpose of the present catalog is not to replace the UCB Bulletin, but rather to supplement it, by describing the seismicity of a portion of central California in much greater detail.

  7. Simulations of the 1906 San Francisco Earthquake and Scenario Earthquakes in Northern California

    NASA Astrophysics Data System (ADS)

    Larsen, S.; Dreger, D.; Dolenc, D.

    2006-12-01

    3-D simulations of seismic ground motions are performed to better characterize the 1906 San Francisco earthquake and to investigate the seismic consequences from scenario events in northern California. Specifically, we perform simulations of: 1) the 1906 earthquake, which bilaterally ruptured a 480-km segment of the San Andreas fault from San Juan Bautista to Cape Mendocino (epicenter a few kilometers off the coast of San Francisco); 2) large scenario San Andreas events with different epicentral locations; and 3) smaller scenario events along faults local to the San Francisco Bay Area. Simulations of the 1906 earthquake indicate that significant ground motion occurred up and down the northern California coast and out into the Central Valley. Comparisons between the simulated motions and observed data (e.g., shaking intensities, Boatwright and Bundock, 2005), suggest that the moment magnitude of this event was between M7.8 and M7.9. Simulations of 1906-like scenario events along the San Andreas fault reveal that ground motions in the San Francisco Bay Area and in the Sacramento Delta region would be significantly stronger for earthquakes initiating along the northern section of the fault and rupturing to the southeast. Simulations of smaller scenario events in the San Francisco Bay Area indicate areas of concentrated shaking. These simulations are performed using a recently developed 3-D geologic model of northern California (Brocher and Thurber, 2005; Jachens et al., 2005), together with finite-difference codes (E3D and a new public domain package). The effects of topography and attenuation are included. The full computational domain spans most of the geologic model and is 630x320x50 km3. The minimum S-wave velocity is constrained to 500 m/s, except in water. Frequencies up to 1.0 Hz are modeled. The grid spacing ranges from 75 m to 200 m. 
High performance supercomputers are used for the simulations, which include models of over 23 billion grid nodes using 2000 processors. This work was performed under the auspices of the U.S. Department of Energy by University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48.
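The stated domain size and finest grid spacing can be checked against the quoted model size with simple arithmetic. The sketch below assumes a uniform 75-m spacing throughout (the actual grid varies from 75 m to 200 m, so this is an upper-bound estimate):

```python
# Back-of-envelope node count for the stated 630 x 320 x 50 km^3 domain.
# Domain dimensions are from the abstract; uniform 75-m spacing is our
# simplifying assumption (the actual spacing varies from 75 m to 200 m).
domain_km = (630, 320, 50)
spacing_m = 75.0

nodes_per_axis = [int(d * 1000 / spacing_m) + 1 for d in domain_km]
total_nodes = 1
for n in nodes_per_axis:
    total_nodes *= n

print(f"~{total_nodes / 1e9:.1f} billion grid nodes")
```

At 75 m everywhere this gives roughly 24 billion nodes, consistent with the "over 23 billion grid nodes" figure quoted above.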

  8. Landslides triggered by the 1994 Northridge, California, earthquake

    USGS Publications Warehouse

    Harp, E.L.; Jibson, R.W.

    1996-01-01

    The 17 January 1994 Northridge, California, earthquake (Mw = 6.7) triggered more than 11,000 landslides over an area of about 10,000 km². Most of the landslides were concentrated in a 1000-km² area that included the Santa Susana Mountains and the mountains north of the Santa Clara River valley. We mapped landslides triggered by the earthquake in the field and from 1:60,000-nominal-scale aerial photography provided by the U.S. Air Force and taken the morning of the earthquake; these mapped landslides were subsequently digitized and plotted in a GIS-based format. Most of the triggered landslides were shallow (1- to 5-m thick), highly disrupted falls and slides within weakly cemented Tertiary to Pleistocene clastic sediment. Average volumes of these types of landslides were less than 1000 m³, but many had volumes exceeding 100,000 m³. The larger disrupted slides commonly had runout paths of more than 50 m, and a few traveled as far as 200 m from the bases of steep parent slopes. Deeper (>5-m thick) rotational slumps and block slides numbered in the tens to perhaps hundreds, a few of which exceeded 100,000 m³ in volume. The largest single landslide triggered by the earthquake was a rotational slump/block slide having a volume of 8 × 10⁶ m³. Analysis of the mapped landslide distribution with respect to variations in (1) landslide susceptibility and (2) strong shaking recorded by hundreds of instruments will form the basis of a seismic landslide hazard analysis of the Los Angeles area.

  9. A hypothesis for delayed dynamic earthquake triggering

    E-print Network

    A hypothesis for delayed dynamic earthquake triggering, Tom Parsons, U.S. Geological Survey, Menlo Park; published 16 February 2005. [1] It is uncertain whether more near-field earthquakes are triggered by static ... component of earthquake probability forecasts [e.g., Working Group on California Earthquake Probabilities ...

  10. Three-dimensional tomography of the 1992 southern California earthquake sequence: Constraints on dynamic earthquake rupture?

    NASA Astrophysics Data System (ADS)

    Lees, Jonathan M.; Nicholson, Craig

    1993-05-01

    Tomographic inversion of P-wave arrival times from aftershocks of 1992 southern California earthquakes is used to produce three-dimensional images of subsurface velocity. The preliminary 1992 data set, augmented by the 1986 M 5.9 North Palm Springs sequence, consists of 6458 high-quality events recorded by the permanent regional network, providing 76,306 raypaths for inversion. The target area consisted of a 104 x 104 x 32 km³ volume divided into 52 x 52 x 10 rectilinear blocks. Significant velocity perturbations appear to correlate with rupture properties of recent major earthquakes. Preliminary results indicate that a low-velocity anomaly separates the dynamic rupture of the M 6.5 Big Bear event from the M 7.4 Landers main shock; a similar low-velocity region separates the M 6.1 Joshua Tree sequence from the Landers rupture. High-velocity anomalies occur at or near nucleation sites of all four recent main shocks (North Palm Springs, Joshua Tree, Landers, Big Bear). A high-velocity anomaly is present along the San Andreas fault between 5 and 12 km depth through San Gorgonio Pass; this high-velocity area may define an asperity where stress is concentrated and where future large earthquakes may begin.

  11. Recalculated probability of M ≥ 7 earthquakes beneath the Sea of Marmara, Turkey

    USGS Publications Warehouse

    Parsons, T.

    2004-01-01

    New earthquake probability calculations are made for the Sea of Marmara region and the city of Istanbul, providing a revised forecast and an evaluation of time-dependent interaction techniques. Calculations incorporate newly obtained bathymetric images of the North Anatolian fault beneath the Sea of Marmara [Le Pichon et al., 2001; Armijo et al., 2002]. Newly interpreted fault segmentation enables an improved regional A.D. 1500-2000 earthquake catalog and interevent model, which form the basis for time-dependent probability estimates. Calculations presented here also employ detailed models of coseismic and postseismic slip associated with the 17 August 1999 M = 7.4 Izmit earthquake to investigate effects of stress transfer on seismic hazard. Probability changes caused by the 1999 shock depend on Marmara Sea fault-stressing rates, which are calculated with a new finite element model. The combined 2004-2034 regional Poisson probability of M ≥ 7 earthquakes is ~38%, the regional time-dependent probability is 44 ± 18%, and incorporation of stress transfer raises it to 53 ± 18%. The most important effect of adding time dependence and stress transfer to the calculations is an increase in the 30 year probability of an M ≥ 7 earthquake affecting Istanbul. The 30 year Poisson probability at Istanbul is 21%, and the addition of time dependence and stress transfer raises it to 41 ± 14%. The ranges given on probability values are sensitivities of the calculations to input parameters determined by Monte Carlo analysis; 1000 calculations are made using parameters drawn at random from distributions. Sensitivities are large relative to mean probability values and enhancements caused by stress transfer, reflecting a poor understanding of large-earthquake aperiodicity.
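The time-dependent calculation with Monte Carlo sensitivities described above can be sketched as follows, assuming a lognormal interevent-time model. All numeric inputs below (mean recurrence 250 ± 50 yr, aperiodicity 0.5, 230 yr elapsed) are illustrative assumptions, not values from the study, and the study's renewal model and stress-transfer terms are not reproduced:

```python
import math
import random

def lognorm_cdf(t, mu, sigma):
    """CDF of a lognormal interevent-time distribution."""
    return 0.5 * (1 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2))))

def conditional_prob(elapsed, window, mu, sigma):
    """P(event in next `window` yr | fault quiet for `elapsed` yr)."""
    f0 = lognorm_cdf(elapsed, mu, sigma)
    f1 = lognorm_cdf(elapsed + window, mu, sigma)
    return (f1 - f0) / (1 - f0)

# Monte Carlo over uncertain inputs, mimicking the sensitivity analysis:
# 1000 draws from assumed parameter distributions.
random.seed(0)
draws = []
for _ in range(1000):
    mean_rec = random.gauss(250, 50)   # assumed mean recurrence (yr)
    if mean_rec <= 1:
        continue
    aper = 0.5                         # assumed coefficient of variation
    sigma = math.sqrt(math.log(1 + aper**2))
    mu = math.log(mean_rec) - 0.5 * sigma**2
    draws.append(conditional_prob(elapsed=230, window=30, mu=mu, sigma=sigma))

mean_p = sum(draws) / len(draws)
spread = (sum((p - mean_p)**2 for p in draws) / len(draws)) ** 0.5
print(f"30-yr probability ~ {mean_p:.0%} +/- {spread:.0%}")
```

The spread of the 1000 conditional probabilities plays the role of the ± ranges quoted in the abstract: it measures sensitivity to the input-parameter uncertainty, not aleatory chance.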

  12. Subduction zone earthquake probably triggered submarine hydrocarbon seepage offshore Pakistan

    NASA Astrophysics Data System (ADS)

    Fischer, David; Mogollón, José M.; Strasser, Michael; Pape, Thomas; Bohrmann, Gerhard; Fekete, Noemi; Spiess, Volkhard; Kasten, Sabine

    2014-05-01

    Seepage of methane-dominated hydrocarbons is heterogeneous in space and time, and trigger mechanisms of episodic seep events are not well constrained. It is generally found that free hydrocarbon gas entering the local gas hydrate stability field in marine sediments is sequestered in gas hydrates. In this manner, gas hydrates can act as a buffer for carbon transport from the sediment into the ocean. However, the efficiency of gas hydrate-bearing sediments for retaining hydrocarbons may be corrupted: Hypothesized mechanisms include critical gas/fluid pressures beneath gas hydrate-bearing sediments, implying that these are susceptible to mechanical failure and subsequent gas release. Although gas hydrates often occur in seismically active regions, e.g., subduction zones, the role of earthquakes as potential triggers of hydrocarbon transport through gas hydrate-bearing sediments has hardly been explored. Based on a recent publication (Fischer et al., 2013), we present geochemical and transport/reaction-modelling data suggesting a substantial increase in upward gas flux and hydrocarbon emission into the water column following a major earthquake that occurred near the study sites in 1945. Calculating the formation time of authigenic barite enrichments identified in two sediment cores obtained from an anticlinal structure called "Nascent Ridge", we find they formed 38-91 years before sampling, which corresponds well to the time elapsed since the earthquake (62 years). Furthermore, applying a numerical model, we show that the local sulfate/methane transition zone shifted upward by several meters due to the increased methane flux and simulated sulfate profiles very closely match measured ones in a comparable time frame of 50-70 years. 
We thus propose a causal relation between the earthquake and the amplified gas flux and present reflection seismic data supporting our hypothesis that co-seismic ground shaking induced mechanical fracturing of gas hydrate-bearing sediments creating pathways for free gas to migrate from a shallow reservoir within the gas hydrate stability zone into the water column. Our results imply that free hydrocarbon gas trapped beneath a local gas hydrate seal was mobilized through earthquake-induced mechanical failure and in that way circumvented carbon sequestration within the sediment. These findings lead us to conclude that hydrocarbon seepage triggered by earthquakes can play a role in carbon budgets at other seismically active continental margins. The newly identified process presented in our study may also help interpret data from similar sites. Reference: Fischer, D., Mogollon, J.M., Strasser, M., Pape, T., Bohrmann, G., Fekete, N., Spieß, V. and Kasten, S., 2013. Subduction zone earthquake as potential trigger of submarine hydrocarbon seepage. Nature Geoscience 6: 647-651.

  13. Probable maximum loss estimation in earthquakes: an application to welded steel moment frames

    Microsoft Academic Search

    Charles C. Thiel Jr

    1997-01-01

    An approach to evaluation of the damageability for buildings and the determination of probable maximum loss (PML) values is presented and applied to welded steel moment frame buildings. PML is defined as the loss that has a given (usually 10%) probability of exceedance in a specified number of years from earthquake ground shaking. A Markov Model developed is used in

  14. Catalog of earthquakes along the San Andreas fault system in Central California, July-September 1972

    USGS Publications Warehouse

    Wesson, R.L.; Meagher, K.L.; Lester, F.W.

    1973-01-01

    Numerous small earthquakes occur each day in the coast ranges of Central California. The detailed study of these earthquakes provides a tool for gaining insight into the tectonic and physical processes responsible for the generation of damaging earthquakes. This catalog contains the fundamental parameters for earthquakes located within and adjacent to the seismograph network operated by the National Center for Earthquake Research (NCER), U.S. Geological Survey, during the period July - September, 1972. The motivation for these detailed studies has been described by Pakiser and others (1969) and by Eaton and others (1970). Similar catalogs of earthquakes for the years 1969, 1970 and 1971 have been prepared by Lee and others (1972 b, c, d). Catalogs for the first and second quarters of 1972 have been prepared by Wesson and others (1972 a & b). The basic data contained in these catalogs provide a foundation for further studies. This catalog contains data on 1254 earthquakes in Central California. Arrival times at 129 seismograph stations were used to locate the earthquakes listed in this catalog. Of these, 104 are telemetered stations operated by NCER. Readings from the remaining 25 stations were obtained through the courtesy of the Seismographic Stations, University of California, Berkeley (UCB); the Earthquake Mechanism Laboratory, National Oceanic and Atmospheric Administration, San Francisco (EML); and the California Department of Water Resources, Sacramento. The Seismographic Stations of the University of California, Berkeley, have for many years published a bulletin describing earthquakes in Northern California and the surrounding area, and readings at UCB Stations from more distant events. The purpose of the present catalog is not to replace the UCB Bulletin, but rather to supplement it, by describing the seismicity of a portion of central California in much greater detail.

  15. UCERF3: A new earthquake forecast for California's complex fault system

    USGS Publications Warehouse

    Field, Edward H.; 2014 Working Group on California Earthquake Probabilities

    2015-01-01

    With innovations, fresh data, and lessons learned from recent earthquakes, scientists have developed a new earthquake forecast model for California, a region under constant threat from potentially damaging events. The new model, referred to as the third Uniform California Earthquake Rupture Forecast, or "UCERF" (http://www.WGCEP.org/UCERF3), provides authoritative estimates of the magnitude, location, and likelihood of earthquake fault rupture throughout the state. Overall the results confirm previous findings, but with some significant changes because of model improvements. For example, compared to the previous forecast (Uniform California Earthquake Rupture Forecast 2), the likelihood of moderate-sized earthquakes (magnitude 6.5 to 7.5) is lower, whereas that of larger events is higher. This is because of the inclusion of multifault ruptures, where earthquakes are no longer confined to separate, individual faults, but can occasionally rupture multiple faults simultaneously. The public-safety implications of this and other model improvements depend on several factors, including site location and type of structure (for example, family dwelling compared to a long-span bridge). Building codes, earthquake insurance products, emergency plans, and other risk-mitigation efforts will be updated accordingly. This model also serves as a reminder that damaging earthquakes are inevitable for California. Fortunately, there are many simple steps residents can take to protect lives and property.

  16. Inventory of landslides triggered by the 1994 Northridge, California earthquake

    USGS Publications Warehouse

    Harp, Edwin L.; Jibson, Randall W.

    1995-01-01

    The 17 January 1994 Northridge, California, earthquake (M = 6.7) triggered more than 11,000 landslides over an area of about 10,000 km². Most of the landslides were concentrated in a 1,000-km² area that includes the Santa Susana Mountains and the mountains north of the Santa Clara River valley. We mapped landslides triggered by the earthquake in the field and from 1:60,000-scale aerial photography provided by the U.S. Air Force and taken the morning of the earthquake; these were subsequently digitized and plotted in a GIS-based format, as shown on the accompanying maps (which also are accessible via Internet). Most of the triggered landslides were shallow (1-5 m), highly disrupted falls and slides in weakly cemented Tertiary to Pleistocene clastic sediment. Average volumes of these types of landslides were less than 1,000 m³, but many had volumes exceeding 100,000 m³. Many of the larger disrupted slides traveled more than 50 m, and a few moved as far as 200 m from the bases of steep parent slopes. Deeper (>5 m) rotational slumps and block slides numbered in the hundreds, a few of which exceeded 100,000 m³ in volume. The largest triggered landslide was a block slide having a volume of 8 × 10⁶ m³. Triggered landslides damaged or destroyed dozens of homes, blocked roads, and damaged oil-field infrastructure. Analysis of landslide distribution with respect to variations in (1) landslide susceptibility and (2) strong shaking recorded by hundreds of instruments will form the basis of a seismic landslide hazard analysis of the Los Angeles area.

  17. Probability estimates of seismic event occurrence compared to health hazards - Forecasting Taipei's Earthquakes

    NASA Astrophysics Data System (ADS)

    Fung, D. C. N.; Wang, J. P.; Chang, S. H.; Chang, S. C.

    2014-12-01

    Using a revised statistical model built on past seismic probability models, the probability of different magnitude earthquakes occurring within variable timespans can be estimated. The revised model is based on Poisson distribution and includes the use of best-estimate values of the probability distribution of different magnitude earthquakes recurring from a fault from literature sources. Our study aims to apply this model to the Taipei metropolitan area with a population of 7 million, which lies in the Taipei Basin and is bounded by two normal faults: the Sanchaio and Taipei faults. The Sanchaio fault is suggested to be responsible for previous large magnitude earthquakes, such as the 1694 magnitude 7 earthquake in northwestern Taipei (Cheng et al., 2010). Based on a magnitude 7 earthquake return period of 543 years, the model predicts the occurrence of a magnitude 7 earthquake within 20 years at 1.81%, within 79 years at 6.77% and within 300 years at 21.22%. These estimates increase significantly when considering a magnitude 6 earthquake; the chance of one occurring within the next 20 years is estimated to be 3.61%, 79 years at 13.54% and 300 years at 42.45%. The 79 year period represents the average lifespan of the Taiwan population. In contrast, based on data from 2013, the probability of Taiwan residents experiencing heart disease or malignant neoplasm is 11.5% and 29%, respectively. This study therefore suggests that the Taipei population's calculated risk of experiencing a potentially damaging magnitude 6 or greater earthquake within their lifetime is comparable to their risk of suffering from heart disease or other serious health ailments.
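For reference, the baseline Poisson occurrence probability underlying such estimates is P = 1 − e^(−t/T) for a window of t years and a mean return period of T years. A minimal sketch follows; note that the abstract's revised model incorporates magnitude-recurrence distributions from the literature, so its per-magnitude figures differ from this plain baseline:

```python
import math

def poisson_occurrence_prob(window_yr, return_period_yr):
    """P(at least one event in the window) for a Poisson process
    with the given mean return period: 1 - exp(-t/T)."""
    return 1.0 - math.exp(-window_yr / return_period_yr)

# Baseline values for the 543-yr return period quoted in the abstract.
for t in (20, 79, 300):
    p = poisson_occurrence_prob(t, 543)
    print(f"within {t} yr: {p:.2%}")
```

For the 543-yr return period this baseline yields roughly 3.6%, 13.5%, and 42.5% for the three windows, which matches one of the abstract's reported series; the revised model's weighting accounts for the remaining differences.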

  18. Bridge pier failure probabilities under combined hazard effects of scour, truck and earthquake. Part I: occurrence probabilities

    NASA Astrophysics Data System (ADS)

    Liang, Zach; Lee, George C.

    2013-06-01

    In many regions of the world, a bridge will experience multiple extreme hazards during its expected service life. The current American Association of State Highway and Transportation Officials (AASHTO) load and resistance factor design (LRFD) specifications are formulated based on failure probabilities, which are fully calibrated for dead load and nonextreme live loads. Design against earthquake loads is established separately. Design against scour effect is also formulated separately by using the concept of capacity reduction (or increased scour depth). Furthermore, scour effect cannot be linked directly to an LRFD limit state equation, because the latter is formulated using force-based analysis. This paper (in two parts) presents a probability-based procedure to estimate the combined hazard effects on bridges due to truck, earthquake and scour, by treating the effect of scour as an equivalent load effect so that it can be included in reliability-based bridge failure calculations. In Part I of this series, the general principle of treating the scour depth as an equivalent load effect is presented. The individual and combined partial failure probabilities due to truck, earthquake and scour effects are described. To explain the method of including non-force-based natural hazards effects, two types of common scour failures are considered. In Part II, the corresponding bridge failure probability, the occurrence of scour as well as simultaneously having both truck load and equivalent scour load are quantitatively discussed.
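The idea of combining partial failure probabilities from several hazards can be sketched in its simplest form, assuming statistical independence of the individual failure events. This is only the baseline combination rule; the paper's actual formulation treats scour as an equivalent load within an LRFD limit state and handles joint occurrence explicitly (Part II), and the numeric inputs below are assumed, not from the paper:

```python
def combined_failure_prob(partials):
    """P(failure from at least one cause), assuming the individual
    failure events are statistically independent: 1 - prod(1 - p_i)."""
    p_survive = 1.0
    for p in partials:
        p_survive *= (1.0 - p)
    return 1.0 - p_survive

# Illustrative annual partial failure probabilities (assumed values):
# truck overload, earthquake, scour.
p_total = combined_failure_prob([1e-4, 5e-4, 2e-4])
print(f"combined annual failure probability: {p_total:.2e}")
```

For small probabilities the result is close to the plain sum of the partials (here just under 8 × 10⁻⁴); the cross terms that the independence assumption drops are exactly what the paper's combined-hazard treatment restores.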

  19. Bridge pier failure probabilities under combined hazard effects of scour, truck and earthquake. Part II: failure probabilities

    NASA Astrophysics Data System (ADS)

    Liang, Zach; Lee, George C.

    2013-06-01

    In many regions of the world, a bridge will experience multiple extreme hazards during its expected service life. The current American Association of State Highway and Transportation Officials (AASHTO) load and resistance factor design (LRFD) specifications are formulated based on failure probabilities, which are fully calibrated for dead load and non-extreme live loads. Design against earthquake load effect is established separately. Design against scour effect is also formulated separately by using the concept of capacity reduction (or increased scour depth). Furthermore, scour effect cannot be linked directly to an LRFD limit state equation because the latter is formulated using force-based analysis. This paper (in two parts) presents a probability-based procedure to estimate the combined hazard effects on bridges due to truck, earthquake and scour, by treating the effect of scour as an equivalent load effect so that it can be included in reliability-based failure calculations. In Part I of this series, the general principle for treating the scour depth as an equivalent load effect is presented. In Part II, the corresponding bridge failure probability, the occurrence of scour as well as simultaneously having both truck load and equivalent scour load effect are quantitatively discussed. The key formulae of the conditional partial failure probabilities and the necessary conditions are established. In order to illustrate the methodology, an example of dead, truck, earthquake and scour effects on a simple bridge pile foundation is represented.

  20. Monte Carlo method for determining earthquake recurrence parameters from short paleoseismic catalogs: Example calculations for California

    USGS Publications Warehouse

    Parsons, T.

    2008-01-01

    Paleoearthquake observations often lack enough events at a given site to directly define a probability density function (PDF) for earthquake recurrence. Sites with fewer than 10-15 intervals do not provide enough information to reliably determine the shape of the PDF using standard maximum-likelihood techniques (e.g., Ellsworth et al., 1999). In this paper I present a method that attempts to fit wide ranges of distribution parameters to short paleoseismic series. From repeated Monte Carlo draws, it becomes possible to quantitatively estimate most likely recurrence PDF parameters, and a ranked distribution of parameters is returned that can be used to assess uncertainties in hazard calculations. In tests on short synthetic earthquake series, the method gives results that cluster around the mean of the input distribution, whereas maximum likelihood methods return the sample means (e.g., NIST/SEMATECH, 2006). For short series (fewer than 10 intervals), sample means tend to reflect the median of an asymmetric recurrence distribution, possibly leading to an overestimate of the hazard should they be used in probability calculations. Therefore a Monte Carlo approach may be useful for assessing recurrence from limited paleoearthquake records. Further, the degree of functional dependence among parameters like mean recurrence interval and coefficient of variation can be established. The method is described for use with time-independent and time-dependent PDFs, and results from 19 paleoseismic sequences on strike-slip faults throughout the state of California are given.
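The ranked-draw idea described above can be sketched as follows: draw wide ranges of candidate recurrence parameters, score each draw by the likelihood of a short observed interval series, and rank. The lognormal recurrence PDF and the synthetic five-interval series are our assumptions for illustration; the paper evaluates both time-independent and time-dependent PDFs against real paleoseismic records:

```python
import math
import random

def lognorm_pdf(t, mu, sigma):
    """Lognormal probability density for an interevent time t > 0."""
    z = (math.log(t) - mu) / sigma
    return math.exp(-0.5 * z * z) / (t * sigma * math.sqrt(2 * math.pi))

def rank_recurrence_params(intervals, n_draws=20000, seed=1):
    """Monte Carlo sketch: draw (mean recurrence, coefficient of
    variation) pairs over wide ranges, score each by the likelihood
    of the observed intervals, and return draws ranked by fit."""
    rng = random.Random(seed)
    scored = []
    for _ in range(n_draws):
        mean = rng.uniform(50, 500)   # candidate mean recurrence (yr)
        cov = rng.uniform(0.2, 1.5)   # candidate coefficient of variation
        sigma = math.sqrt(math.log(1 + cov**2))
        mu = math.log(mean) - 0.5 * sigma**2
        like = 1.0
        for t in intervals:
            like *= lognorm_pdf(t, mu, sigma)
        scored.append((like, mean, cov))
    scored.sort(reverse=True)
    return scored

# A short synthetic paleoseismic interval series (assumed, in years).
ranked = rank_recurrence_params([120, 95, 210, 140, 180])
best = ranked[0]
print(f"best-fit mean recurrence ~ {best[1]:.0f} yr, cov ~ {best[2]:.2f}")
```

The full ranked list (not just the top draw) is what supports the paper's point: it quantifies how broad the acceptable parameter range is, so hazard calculations can carry that uncertainty forward rather than committing to a single sample mean.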

  2. IMPACT OF A LARGE SAN ANDREAS FAULT EARTHQUAKE ON TALL BUILDINGS IN SOUTHERN CALIFORNIA

    E-print Network

    Krishnan, Swaminathan

    IMPACT OF A LARGE SAN ANDREAS FAULT EARTHQUAKE ON TALL BUILDINGS IN SOUTHERN CALIFORNIA, Swaminathan Krishnan. ... exposed the vulnerability of steel moment-resisting frame buildings to fracture (SAC 1995a; SAC 1995b; SAC 1995c). These buildings resist lateral forces from an earthquake through bending in rigidly connected ...

  3. Persistent water level changes in a well near Parkfield, California, due to local and distant earthquakes

    Microsoft Academic Search

    Evelyn A. Roeloffs

    1998-01-01

    Coseismic water level rises in the 30-m deep Bourdieu Valley (BV) well near Parkfield, California, have occurred in response to three local and five distant earthquakes. Coseismic changes in static strain cannot explain these water level rises because (1) the well is insensitive to strain at tidal periods; (2) for the distant earthquakes, the expected coseismic static strain is extremely

  4. Seismic Moment, Stress, and Source Dimensions for Earthquakes in the California-Nevada Region

    Microsoft Academic Search

    Max Wyss; James N. Brune

    1968-01-01

    The source mechanism of earthquakes in the California-Nevada region was studied using surface wave analyses, surface displacement observations in the source region, magnitude determinations, and accurate epicenter locations. Fourier analyses of surface waves from thirteen earthquakes in the Parkfield region have yielded the following relationship between seismic moment, Mo, and Richter magnitude, ML: log Mo = 1.4 ML ...

  5. Streamflow increase due to rupturing of hydrothermal reservoirs: Evidence from the 2003 San Simeon, California, Earthquake

    E-print Network

    Manga, Michael

    Streamflow increase due to rupturing of hydrothermal reservoirs: Evidence from the 2003 San Simeon, California, Earthquake. Chi-Yuen Wang, Michael Manga, Douglas Dreger, and Alexander Wong, Department of Earth ...; Hydrothermal systems (8424). Citation: Wang, C.-Y., M. Manga, D. Dreger, and A. Wong (2004), Streamflow ... [... and Manga, 2003, for overview]. Following the 1974 Izu-Hanto-oki earthquake, Japan, Wakita [1975] suggests ...

  6. Stress transferred by the 1995 Mw = 6.9 Kobe, Japan, shock: Effect on aftershocks and future earthquake probabilities

    Microsoft Academic Search

    Shinji Toda; Ross S. Stein; Paul A. Reasenberg; James H. Dieterich; Akio Yoshida

    1998-01-01

    The Kobe earthquake struck at the edge of the densely populated Osaka-Kyoto corridor in southwest Japan. We investigate how the earthquake transferred stress to nearby faults, altering their proximity to failure and thus changing earthquake probabilities. We find that relative to the pre-Kobe seismicity, Kobe aftershocks were concentrated in regions of calculated Coulomb stress increase and less common in regions

  7. Regression models for predicting the probability of near-fault earthquake ground motion pulses, and their period.

    E-print Network

    Baker, Jack W.

Stanford University, Stanford, CA, USA. ABSTRACT: Near-fault earthquake ground motions containing large velocity pulses are the subject of this study. Regression models for predicting the probability of such pulses, and their period, relate the pulse probability to the earthquake magnitude, but other predictive parameters are also considered and discussed. Both empirical

  8. Analysis of Earthquake Recordings Obtained from the Seafloor Earthquake Measurement System (SEMS) Instruments Deployed off the Coast of Southern California

    Microsoft Academic Search

    David M. Boore; Charles E. Smith

    1999-01-01

For more than 20 years, a program has been underway to obtain records of earthquake shaking on the seafloor at sites offshore of southern California, near oil platforms. The primary goal of the program is to obtain data that can help determine whether ground motions at offshore sites are significantly different from those at onshore sites; if so, caution

  9. Northern California Earthquake Data Center Data Retrieval (title provided or enhanced by cataloger)

    NSDL National Science Digital Library

    The Northern California Earthquake Data Center (NCEDC) offers various types of earthquake-related data. Most of the datasets are available on the WWW. A few require the establishment of a research account. Available information includes: earthquake catalogs and lists; seismic waveform data from the Berkeley Digital Seismic Network, the Northern California Seismic Network, the Parkfield High-Resolution Seismic Network, and the Calpine/Unocal Geysers Network; Global Positioning System data from continuous monitoring stations; and Berkeley Digital Seismic Network temperature, electromagnetic and strain data.

  10. Probability model and solution on earthquake effects combination in along wind resistant design of tall-flexible buildings

    Microsoft Academic Search

    Xiao-jian Hong; Ming Gu

    2006-01-01

A model on the earthquake effects combination in wind resistant design of high-rise flexible structures is proposed in accordance with the probability method. Based on the Turkstra criteria, the stochastic characters of wind velocity, earthquake ground acceleration and excitation occurrence probability are taken into account, and then the combination of the earthquake effects in structure wind resistant design is analyzed

11. A survey of expert opinion on low probability earthquakes. [Nuclear power plant site selection]

    Microsoft Academic Search

    Okrent

    1975-01-01

    As one way of examining the uncertainties in the prediction of low probability earthquakes, a number of experts in the field review eleven sites within the United States and provide their independent estimates on seismicity. The seven individuals participating in the seismic survey are listed. They were each sent the same descriptive material about the eleven sites, together with a

  12. Heightened Odds of Large Earthquakes Near Istanbul: An Interaction-Based Probability Calculation

    Microsoft Academic Search

    Tom Parsons; Shinji Toda; Ross S. Stein; Aykut Barka; James H. Dieterich

    2000-01-01

    We calculate the probability of strong shaking in Istanbul, an urban center of 10 million people, from the description of earthquakes on the North Anatolian fault system in the Marmara Sea during the past 500 years and test the resulting catalog against the frequency of damage in Istanbul during the preceding millennium. Departing from current practice, we include the time-dependent

  13. Differential Energy Radiation from Two Earthquakes with Similar Mw: The Baja California 2010 and Haiti 2010 Earthquakes

    NASA Astrophysics Data System (ADS)

    Meng, L.; Shi, B.

    2010-12-01

The April 4, 2010, Mw 7.2 Baja California, Mexico, earthquake occurred at shallow depth in northern Baja California along the principal plate boundary between the North American and Pacific plates; two people were killed in the Mexicali area. The January 12, 2010, Mw 7.0 Haiti earthquake occurred in the vicinity of Port-au-Prince, the capital of Haiti, on the Enriquillo-Plantain Garden fault, with an estimated death toll of almost 250,000. International media attributed the far greater losses in Haiti to poor building design compared with the Mexicali area. Although the moment magnitude of the Haiti earthquake is similar to that of the Baja earthquake, its radiated energy was almost 15 times greater, resulting in stronger near-fault ground motions. For these two earthquakes of similar moment magnitude, two finite fault models are constructed to simulate the near-fault strong ground motion for comparison. We propose a new technique, based on integrating far-field energy over a simple finite fault, to estimate radiated S-wave energy with an associated composite source model. The fault slip distributions on both faults are generated with the composite source model, in which the subevent source function is described by Brune's pulse. Including the shallow velocity structure (V30, the average shear velocity down to 30 m), the near-field peak ground accelerations (PGAs) from the Haiti earthquake are almost 20 times those from the Baja earthquake, while the peak ground velocities (PGVs) are almost 8 times those from the Baja earthquake. The radiated seismic energy therefore plays a significant role in determining the level of strong ground motion, and stronger ground accelerations usually cause much more property damage.
The source rupture dynamics related to frictional overshoot and undershoot are discussed and used to constrain source parameters such as the static and dynamic stress drops. We note that, in addition to moment conservation on the main fault, measurements of radiated seismic energy or apparent stress should be added to the numerical simulation in order to obtain physically realistic results. The numerical modeling developed in this study has potential application in ground motion estimation and prediction for earthquake engineering purposes.
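As a back-of-the-envelope check on the abstract's central point, the standard Gutenberg-Richter energy relation (log10 E = 1.5 M + 4.8, E in joules; this relation is not part of the abstract itself) converts the reported 15-fold radiated-energy difference into an equivalent magnitude difference:

```python
import math

# Sketch (not from the abstract): under log10 E = 1.5 M + 4.8, an energy
# ratio R between two events of equal moment magnitude corresponds to an
# "energy magnitude" difference of (2/3) * log10(R).
def radiated_energy_joules(m):
    """Radiated energy implied by magnitude m under the GR energy relation."""
    return 10 ** (1.5 * m + 4.8)

ratio = 15.0                              # Haiti vs. Baja radiated energy
d_me = (2.0 / 3.0) * math.log10(ratio)    # equivalent magnitude difference
print(round(d_me, 2))                     # roughly 0.8 magnitude units
```

So a 15x energy contrast at equal Mw amounts to nearly a full unit of "energy magnitude", consistent with the much stronger shaking reported for Haiti.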

  14. Mapping the source of the 1983 Mw 6.5 Coalinga thrust earthquake (California)

    NASA Astrophysics Data System (ADS)

    Meunier, Patrick; Marc, Odin; Hovius, Niels

    2013-04-01

We have recently shown that density patterns of co-seismic landslides associated with large thrust earthquakes can be used to map the area of maximum slip on the fault plane (Meunier et al., 2013), arguing that, once adjusted for site effects, landslide distributions can supplement or replace instrumental records of earthquakes. We have applied our method to the 1983 Mw 6.5 Coalinga thrust earthquake (California). At the time of the main shock, the epicentral area of this earthquake was not covered by the dense network of accelerometers that has been installed since. Consequently, the slip distribution, inverted from leveling cross-sections and teleseismic data, is poorly constrained in comparison with the recent large thrust earthquakes we have studied. We discuss the inversion of the source of this earthquake and compare its location with the one proposed by Stein and Ekstrom (1992).

  15. Collaborative Projects at the Northern California Earthquake Data Center (NCEDC)

    NASA Astrophysics Data System (ADS)

    Neuhauser, D.; Oppenheimer, D.; Zuzlewski, S.; Gee, L.; Murray, M.; Bassett, A.; Prescott, W.; Romanowicz, B.

    2001-12-01

    The NCEDC is a joint project of the UC Berkeley Seismological Laboratory and the USGS Menlo Park to provide a long-term archive and distribution center for geophysical data for northern California. Most data are available via the Web at http://quake.geo.berkeley.edu and research accounts are available for access to specialized datasets. Current efforts continue to expand the available datasets and to enhance distribution methods. The NCEDC currently archives continuous and event seismic waveform data from the BDSN and the USGS NCSN. Data from the BDSN are available in SEED and work is underway to make NCSN data available in this format. This massive project requires assembling and tracking the instrument responses from over 5000 current and historic NCSN data channels. Event waveforms from specialized networks, such as Geysers and Parkfield, are also available. In collaboration with the USGS, the NCEDC has archived a total of 887 channels from 139 sites of the "USGS low-frequency" geophysical network (UL), including data from strainmeters, creep meters, magnetometers, water well levels, and tiltmeters. There are 486 current data channels being updated at the NCEDC on a daily basis. All UL data are available in SEED. Data from the BARD network of over 40 continuously recording GPS sites are archived at the NCEDC in both raw and RINEX format. The NCEDC is now the primary archive for survey-mode GPS and other geodetic data collected in northern California by the USGS, universities, and other agencies. All of the BARD data and GPS data archived from USGS Menlo Park surveys are now available from the NCEDC via FTP. To support more portable and uniform data query programs among data centers, the NCEDC developed a set of Generic Data Center Views (GDVs) that incorporates the basic information that most datacenters maintain about data channels, instrument responses, and waveform inventory. 
We defined MSQL (Meta SeismiQuery Language), a query language based on the SQL SELECT command, to perform queries on the GDVs, and developed a program which converts the MSQL to an SQL request. MSQL2SQL converts the MSQL command into a parse tree, and defines an API allowing each datacenter to traverse the parse tree and revise it to produce a data center-specific SQL request. The NCEDC converted the IRIS SeismiQuery program to use the GDVs and MSQL, installed it at the NCEDC, and distributed the software to IRIS, SCEC-DC, and other interested parties. The resulting program should be much easier to install and support at other data centers. The NCEDC is also working on several data center integration projects in order to provide users with seamless access to data. The NCEDC is collaborating with IRIS on the NETDC project and with UNAVCO on the GPS Seamless Archive Centers initiative. Through the newly formed California Integrated Seismic Network, we are working with the SCEC-DC to provide unified access to California earthquake data.

  16. Who bears the burden of earthquakes in California and how has this changed from 1990 to 2000?

    Microsoft Academic Search

    Stephanie Kay

    2008-01-01

New Geographic Information Systems (GIS) analysis shows that certain demographic groups in California suffer more in the event of an earthquake. Earthquakes are a threat to the livelihood of many Californians, and by identifying which groups suffer the most, policy can be implemented to improve the livelihood of those populations. Since the risk of earthquakes could de-value

  17. Forecasting California's earthquakes: What can we expect in the next 30 years?

    USGS Publications Warehouse

    Field, Edward H.; Milner, Kevin R.; The 2007 Working Group on California Earthquake Probabilities

    2008-01-01

In a new comprehensive study, scientists have determined that the chance of having one or more magnitude 6.7 or larger earthquakes in the California area over the next 30 years is greater than 99%. Such quakes can be deadly, as shown by the 1989 magnitude 6.9 Loma Prieta and the 1994 magnitude 6.7 Northridge earthquakes. The likelihood of at least one even more powerful quake of magnitude 7.5 or greater in the next 30 years is 46%; such a quake is most likely to occur in the southern half of the State. Building codes, earthquake insurance, and emergency planning will be affected by these new results, which highlight the urgency to prepare now for the powerful quakes that are inevitable in California's future.
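The 30-year probabilities quoted in forecasts like this can be related to equivalent annual rates under a Poisson assumption. A minimal sketch (this standard conversion is illustrative only; it is not how the Working Group's time-dependent numbers were derived):

```python
import math

def annual_rate(p, years=30.0):
    """Poisson rate implied by probability p of one or more events in `years`."""
    return -math.log(1.0 - p) / years

def prob_one_or_more(rate, years=30.0):
    """Probability of one or more events in `years` at a given Poisson rate."""
    return 1.0 - math.exp(-rate * years)

r = annual_rate(0.46)                  # the reported 46% chance of M >= 7.5
print(round(prob_one_or_more(r), 2))   # recovers 0.46
```

The same two functions show why a >99% 30-year probability implies an annual rate above about 0.15 events per year for M >= 6.7.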

  18. Earthquake Counting Method for Spatially Localized Probabilities: Challenges in Real-Time Information Delivery

    E-print Network

    Holliday, James R; Rundle, John B; Turcotte, Donald L

    2013-01-01

    We develop and implement a new type of global earthquake forecast. Our forecast is a perturbation on a smoothed seismicity (Relative Intensity) spatial forecast combined with a temporal time-averaged (Poisson) forecast. A variety of statistical and fault-system models have been discussed for use in computing forecast probabilities. Our paper takes a new approach. The idea is based on the observation that GR statistics characterize seismicity for all space and time. Small magnitude event counts (quake counts) are used as markers for the approach of large events. More specifically, if the GR b-value = 1, then for every 1000 M>3 earthquakes, one expects 1 M>6 earthquake. So if ~1000 M>3 events have occurred in a spatial region since the last M>6 earthquake, another M>6 earthquake should be expected soon. In physics, event count models have been called natural time models, since counts of small events represent a physical or natural time scale characterizing the system dynamics. In a previous paper, we used condi...
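The counting argument in this abstract follows directly from the Gutenberg-Richter law; a minimal sketch (thresholds and counts are illustrative):

```python
def expected_large_events(n_small, b=1.0, delta_m=3.0):
    """Number of events above the large-magnitude threshold implied by
    n_small events above a threshold delta_m magnitude units lower,
    under the Gutenberg-Richter law N(>=M) ~ 10**(-b*M)."""
    return n_small * 10 ** (-b * delta_m)

# With b = 1, every 1000 M>3 earthquakes imply ~1 M>6 earthquake, so a
# region that has accumulated ~1000 M>3 events since its last M>6 event
# is, in "natural time", due for another.
print(expected_large_events(1000))  # -> 1.0
```

This is the sense in which small-event counts serve as a clock: the forecast ticks in earthquakes rather than in calendar years.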

  19. Real-time forecasts of tomorrow's earthquakes in California: a new mapping tool

    USGS Publications Warehouse

    Gerstenberger, Matt; Wiemer, Stefan; Jones, Lucy

    2004-01-01

We have derived a multi-model approach to calculate time-dependent earthquake hazard resulting from earthquake clustering. This report explains the theoretical background behind the approach, the specific details used in applying the method to California, and the statistical testing performed to validate the technique. We have implemented our algorithm as a real-time tool that has been automatically generating short-term hazard maps for California since May of 2002, at http://step.wr.usgs.gov

  20. Southern California Earthquake Center--Virtual Display of Objects (SCEC-VDO): An Earthquake Research and Education Tool

    NASA Astrophysics Data System (ADS)

    Perry, S.; Maechling, P.; Jordan, T.

    2006-12-01

Interns in the program Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT, an NSF Research Experience for Undergraduates Site) have designed, engineered, and distributed SCEC-VDO (Virtual Display of Objects), interactive software used by earthquake scientists and educators to integrate and visualize global and regional, georeferenced datasets. SCEC-VDO is written in Java/Java3D with an extensible, scalable architecture. An increasing number of SCEC-VDO datasets are obtained on the fly through web services and connections to remote databases, and user sessions may be saved in xml-encoded files. Currently users may display time-varying sequences of earthquake hypocenters and focal mechanisms, several 3-dimensional fault and rupture models, satellite imagery - optionally draped over digital elevation models - and cultural datasets including political boundaries. The ability to juxtapose and interactively explore these data and their temporal and spatial relationships has been particularly important to SCEC scientists who are evaluating fault and deformation models, or who must quickly evaluate the menace of evolving earthquake sequences. Additionally, SCEC-VDO users can annotate the display and can script and render animated movies with adjustable compression levels. SCEC-VDO movies are excellent communication tools and have been featured in scientific presentations, classrooms, press conferences, and television reports.

  1. Earthquake Education and Public Information Centers: A Collaboration Between the Earthquake Country Alliance and Free-Choice Learning Institutions in California

    Microsoft Academic Search

    R. M. Degroot; K. Springer; C. J. Brooks; L. Schuman; D. Dalton; M. L. Benthien

    2009-01-01

    In 1999 the Southern California Earthquake Center initiated an effort to expand its reach to multiple target audiences through the development of an interpretive trail on the San Andreas fault at Wallace Creek and an earthquake exhibit at Fingerprints Youth Museum in Hemet. These projects and involvement with the San Bernardino County Museum in Redlands beginning in 2007 led to

  2. Short Notes Variability of Near-Term Probability for the Next Great Earthquake on the Cascadia Subduction Zone

    Microsoft Academic Search

    Stephane Mazzotti; John Adams

    2004-01-01

    The threat of a great (M 9) earthquake along the Cascadia subduction zone is evidenced by both paleoseismology data and current strain accumulation along the fault. On the basis of recent information on the characteristics of this subduction system, we estimate the conditional probabilities of a great earthquake occurring within the next 50 years and their variabilities. The most important

  3. Intermediate-term, pre-earthquake phenomena in California, 1975-1986, and preliminary forecast of seismicity for the next decade

    USGS Publications Warehouse

    Wesson, R.L.; Nicholson, C.

    1988-01-01

Intermediate-term observations preceding earthquakes of magnitude 5.7 or greater in California from 1975 through 1986 suggest that: (1) The sudden appearance of earthquakes in a previously inactive area indicates an increased likelihood of a significant earthquake in that area for a period from days to years; (2) these larger earthquakes tend to occur towards the ends of creeping fault segments; (3) one large earthquake in a region increases the likelihood of a subsequent significant event in the adjacent area; and (4) marginal evidence for the occurrence of a regional deformation event suggests that such events increase the probability of earthquake occurrence throughout the entire area. A common element in many of these observed patterns appears to be the transmission and amplification of tectonic stress changes by the mechanism of fault creep, which suggests that surface fault creep is a sensitive indicator of changes in stress. The preceding criteria are used to construct a preliminary 'forecast' of the likely locations of significant earthquakes over the next decade. © 1988 Birkhäuser Verlag.

  4. Diagnosis of Time of Increased Probability (TIP) for Volcanic Earthquakes at Mt. Vesuvius

    NASA Astrophysics Data System (ADS)

    Rotwain, I.; Natale, G. De; Kuznetsov, I.; Peresan, A.; Panza, G. F.

    2006-01-01

The possibility of intermediate-term earthquake prediction at Mt. Vesuvius by means of the CN algorithm is explored. CN was originally designed to identify the Times of Increased Probability (TIPs) for the occurrence of strong tectonic earthquakes, with magnitude M ≥ M0, within a region delimited a priori. Here the CN algorithm is applied, for the first time, to the analysis of volcanic seismicity. The earthquakes recorded at Mt. Vesuvius during the period from February 1972 to June 2004 are considered, and the magnitude threshold M0 selecting the events to be predicted is varied within the range 3.0 to 3.3. Satisfactory prediction results are obtained, by retrospective analysis, when a time scaling is introduced. In particular, when the length of the time windows is reduced by a factor of 2.5 to 3, with respect to the standard version of the CN algorithm, more than 90% of the events with M ≥ M0 occur within the TIP intervals, with TIPs occupying about 30% of the total time considered. The control experiment "Seismic History" demonstrates the stability of the obtained results and indicates that the CN algorithm can be applied to monitor the preparation of impending earthquakes with M ≥ 3.0 at Mt. Vesuvius.

  5. Development of damage probability matrices based on Greek earthquake damage data

    NASA Astrophysics Data System (ADS)

    Eleftheriadou, Anastasia K.; Karabinis, Athanasios I.

    2011-03-01

A comprehensive study is presented for empirical seismic vulnerability assessment of typical structural types, representative of the building stock of Southern Europe, based on a large set of damage statistics. The observational database was obtained from post-earthquake surveys carried out in the area struck by the September 7, 1999 Athens earthquake. After analysis of the collected observational data, a unified damage database has been created which comprises 180,945 damaged buildings from the near-field area of the earthquake. The damaged buildings are classified in specific structural types, according to the materials, seismic codes and construction techniques in Southern Europe. The seismic demand is described in terms of both the regional macroseismic intensity and the ratio αg/ao, where αg is the maximum peak ground acceleration (PGA) of the earthquake event and ao is the unique PGA value that characterizes each municipality shown on the Greek hazard map. The relative and cumulative frequencies of the different damage states for each structural type and each intensity level are computed in terms of damage ratio. Damage probability matrices (DPMs) and vulnerability curves are obtained for specific structural types. A comparison analysis is fulfilled between the produced and the existing vulnerability models.
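The damage probability matrix construction described here is, at its core, row-normalization of damage-state counts per intensity level. A minimal sketch with made-up counts (the real study uses 180,945 buildings and its own damage-state definitions):

```python
import numpy as np

# Hypothetical survey counts: rows = intensity levels, columns = damage
# states (e.g. none / moderate / heavy). Not the actual Greek data.
counts = np.array([
    [800.0, 150.0, 50.0],   # intensity VII
    [500.0, 300.0, 200.0],  # intensity VIII
    [200.0, 400.0, 400.0],  # intensity IX
])

# A damage probability matrix gives, for each intensity level, the
# probability of reaching each damage state: normalize each row to 1.
dpm = counts / counts.sum(axis=1, keepdims=True)
print(dpm[0])  # damage-state probabilities at intensity VII
```

Each row of `dpm` is then a discrete damage distribution conditioned on intensity, which is exactly what a vulnerability curve interpolates across intensity levels.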

  6. Chapter B. The Loma Prieta, California, Earthquake of October 17, 1989 - Forecasts

    USGS Publications Warehouse

    Harris, Ruth A.

    1998-01-01

    The magnitude (Mw) 6.9 Loma Prieta earthquake struck the San Francisco Bay region of central California at 5:04 p.m. P.d.t. on October 17, 1989, killing 62 people and generating billions of dollars in property damage. Scientists were not surprised by the occurrence of a destructive earthquake in this region and had, in fact, been attempting to forecast the location of the next large earthquake in the San Francisco Bay region for decades. This paper summarizes more than 20 scientifically based forecasts made before the 1989 Loma Prieta earthquake for a large earthquake that might occur in the Loma Prieta area. The forecasts geographically closest to the actual earthquake primarily consisted of right-lateral strike-slip motion on the San Andreas Fault northwest of San Juan Bautista. Several of the forecasts did encompass the magnitude of the actual earthquake, and at least one approximately encompassed the along-strike rupture length. The 1989 Loma Prieta earthquake differed from most of the forecasted events in two ways: (1) it occurred with considerable dip-slip in addition to strike-slip motion, and (2) it was much deeper than expected.

  7. Moment tensor inversions of M ~ 3 earthquakes in the Geysers geothermal fields, California

    NASA Astrophysics Data System (ADS)

    Guilhem, A.; Hutchings, L.; Dreger, D. S.; Johnson, L. R.

    2014-03-01

Microearthquakes have come into high public awareness because they can be induced by the development and exploitation of enhanced and natural geothermal fields, hydrofracturing, and CO2 sequestration sites. Characterizing and understanding the faulting process of induced earthquakes, which is generally achieved through moment tensor inversion, could help both in risk prediction and in reservoir development monitoring. However, this is a challenging task because of their low signal-to-noise ratio at the frequencies typically used in earthquake source analyses. Therefore, higher-resolution velocity models and modeling of seismic waves at higher frequencies are required. In this study, we examine both the potential to obtain moment tensor solutions for small earthquakes and the uncertainty of those solutions. We utilize a short-period seismic network located in the Geysers geothermal field in northern California and limit our study to what could be achieved by industry in a typical reservoir environment. We obtain full moment tensor solutions of M ~ 3 earthquakes using waveform modeling and first-motion inversions. We find that these two data sets give complementary yet different solutions. Some earthquakes possibly correspond to complex processes in which both shear and tensile failures occur simultaneously or sequentially. This illuminates the presence of fluids at depth and their role in the generation of these small-magnitude earthquakes. Finally, since first motions are routinely obtained for earthquakes of all magnitudes, our approach could be extended to small earthquakes where noise levels and complex Green's functions prohibit using waveforms in moment tensor inversions.

  8. FORECAST MODEL FOR MODERATE EARTHQUAKES NEAR PARKFIELD, CALIFORNIA.

    USGS Publications Warehouse

    Stuart, William D.; Archuleta, Ralph J.; Lindh, Allan G.

    1985-01-01

    The paper outlines a procedure for using an earthquake instability model and repeated geodetic measurements to attempt an earthquake forecast. The procedure differs from other prediction methods, such as recognizing trends in data or assuming failure at a critical stress level, by using a self-contained instability model that simulates both preseismic and coseismic faulting in a natural way. In short, physical theory supplies a family of curves, and the field data select the member curves whose continuation into the future constitutes a prediction. Model inaccuracy and resolving power of the data determine the uncertainty of the selected curves and hence the uncertainty of the earthquake time.

  9. Earthquake Hazards: The next big one?

    NSDL National Science Digital Library

    John Taber

    Students work in small groups or individually to investigate the earthquake hazards in California, Missouri, and their own location. Students begin the activity with an exploration of the concept of probability, and then work to understand how earthquake hazards are described by probabilities.

  10. Stress drops and radiated energies of aftershocks of the 1994 Northridge, California, earthquake

    E-print Network

    Abercrombie, Rachel E.

Received December 2000; revised 11 May 2003; accepted 25 June 2003; published 28 November 2003. [1] We study stress drops and radiated energies of aftershocks of the 1994 Northridge, California, earthquake. We estimate static and dynamic stress drops from the source time functions and compare them to well

  11. EARTHQUAKE PREPAREDNESS TIPS California Governor's Office of Emergency Services Secure Tabletop Objects

    E-print Network

    Vernon, Frank

Secure tabletop objects. Use latches or positive catch latches, designed for boats, to secure your cabinet doors. Make sure your gas [...] with a strong shatter-resistant film. Be sure you use safety film and not just a solar filter. Secure overhead

  12. Uniform California Earthquake Rupture Forecast, Version 3 (UCERF3) --The Time-Independent Model

    E-print Network

    Shaw, Bruce E.

[...] of potentially damaging earthquakes in California. The primary achievements have been to relax fault segmentation [...] An outline of this report helps readers navigate the various sections: 1. Introduction (Background; Model); Inversion Setup and Associated Gridded Seismicity; Gardner-Knopoff Aftershock Filter; 5. Results (Model [...])

  13. Complexity of energy release during the Imperial Valley, California, earthquake of 1940

    Microsoft Academic Search

    M. D. Trifunac; JAMES N. BRUNE

    1970-01-01

The pattern of energy release during the Imperial Valley, California, earthquake of 1940 is studied by analyzing the El Centro strong motion seismograph record and records from the Tinemaha seismograph station, 546 km from the epicenter. The earthquake was a multiple event sequence with at least 4 events recorded at El Centro in the first 25 seconds, followed by

  14. Earthquake Locations and Three-Dimensional Crustal Structure in the Coyote Lake Area, Central California

    Microsoft Academic Search

    Clifford H. Thurber

    1983-01-01

    Previous work on the simultaneous inversion method has been improved and extended to incorporate iterative solution for earthquake locations and laterally heterogeneous structure. Approximate ray tracing and parameter separation are important elements of the improved method. Application of the method to P wave arrival time data recorded by stations of the U.S. Geological Survey Central California Network yields a three-dimensional

  15. Intensity of the 18 April 1906 earthquake in and near San Jose, California

    Microsoft Academic Search

    Nancy C. Shostak

    2009-01-01

    Intensities of shaking for previously untapped, historical sources of damage data from the 1906 earthquake in and near San José, California, were developed with a refined and expanded version of the Modified Mercalli intensity (MMI) scale designed to bring out rich detail in the dense data. Sanborn fire insurance maps provide construction details and precise locations for 80 percent of

  16. What Parts of PTSD Are Normal: Intrusion, Avoidance, or Arousal? Data from the Northridge, California, Earthquake

    Microsoft Academic Search

    J. Curtis McMillen; Carol S. North; Elizabeth M. Smith

    2000-01-01

    The incidence and comorbidity of posttraumatic stress disorder (PTSD) are addressed in a study of 130 Northridge, California, earthquake survivors interviewed 3 months postdisaster. Only 13% of the sample met full PTSD criteria, but 48% met both the reexperiencing and the arousal symptom criteria, without meeting the avoidance and numbing symptom criterion. Psychiatric comorbidity was associated mostly with avoidance and

  17. Precise estimation of repeating earthquake moment: Example from parkfield, california

    USGS Publications Warehouse

    Rubinstein, J.L.; Ellsworth, W.L.

    2010-01-01

We offer a new method for estimating the relative size of repeating earthquakes using the singular value decomposition (SVD). This method takes advantage of the highly coherent waveforms of repeating earthquakes and arrives at far more precise and accurate descriptions of earthquake size than standard catalog techniques allow. We demonstrate that uncertainty in relative moment estimates is reduced from ±75% for the standard coda-duration techniques employed by the network to ±6.6% when the SVD method is used. This implies that a single-station estimate of moment using the SVD method has far less uncertainty than whole-network estimates of moment based on coda duration. The SVD method offers a significant improvement in our ability to describe the size of repeating earthquakes and thus an opportunity to better understand how they accommodate slip as a function of time.
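The SVD approach lends itself to a compact illustration: stack the aligned waveforms of a repeating-event family as rows of a matrix, and the leading left singular vector (scaled by the leading singular value) recovers each event's relative amplitude. A synthetic sketch, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic repeating-earthquake family: one common waveform shape scaled
# by a different relative moment per event, plus a little noise.
t = np.linspace(0.0, 1.0, 200)
pulse = np.exp(-((t - 0.3) / 0.05) ** 2)     # shared source pulse
true_amps = np.array([1.0, 2.0, 4.0])        # relative event sizes
records = np.outer(true_amps, pulse) + 0.01 * rng.standard_normal((3, 200))

# SVD of the (events x samples) matrix: the rank-1 approximation separates
# the common waveform (right singular vector) from the per-event amplitudes
# (left singular vector scaled by the leading singular value).
U, s, Vt = np.linalg.svd(records, full_matrices=False)
rel = U[:, 0] * s[0]
rel = rel / rel[0]     # moments relative to the first event in the family
print(np.round(rel, 2))
```

Because all records share one waveform, the leading singular component captures nearly all the signal, which is why the amplitude estimates are so much tighter than coda-duration magnitudes.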

  18. Evidence for dyke intrusion earthquake mechanisms near long valley caldera, California

    USGS Publications Warehouse

    Julian, B.R.

    1983-01-01

A re-analysis of the magnitude 6 earthquakes that occurred near Long Valley caldera in eastern California on 25 and 27 May 1980 suggests that at least two of them, including the largest, were probably caused by fluid injection along nearly vertical surfaces and not by slip on faults. Several investigators [1,2] have reported difficulty in explaining both the long-period surface-wave amplitudes and phases and the locally recorded short-period body-wave first motions from these events, using conventional double-couple (shear fault) source models. They attributed this difficulty to: (1) complex sources, not representable by single-fault models; (2) artefacts of the analysis methods used; or (3) effects of wave propagation through hypothetical structures beneath the caldera. We show here that the data agree well with the predictions for a compensated linear-vector dipole (CLVD) equivalent-force system [3] with its principal extensional axis horizontal and trending N 55°-65° E. Such a mechanism is what would be expected for fluid injection into dykes striking N 25°-35° W, which is the approximate strike of numerous normal faults in the area. © 1983 Nature Publishing Group.

  19. Distribution of intensity for the Westmorland, California, earthquake of April 26, 1981

    USGS Publications Warehouse

    Barnhard, L.M.; Thenhaus, P.C.; Algermissen, Sylvester Theodore

    1982-01-01

    The maximum Modified Mercalli intensity of the April 26, 1981 earthquake located 5 km northwest of Westmorland, California is VII. Twelve buildings in Westmorland were severely damaged, with an additional 30 sustaining minor damage. Two brick parapets fell in Calipatria, 14 km northeast of Westmorland and 10 km from the earthquake epicenter. Significant damage in rural areas was restricted to unreinforced, concrete-lined irrigation canals. Liquefaction effects and ground slumping were widespread in rural areas and were the primary causes of road cracking. Preliminary local government estimates of property loss range from one to three million dollars (Imperial Valley Press, 1981). The earthquake was felt over an area of approximately 160,000 km2, about the same felt area as the October 15, 1979 (Reagor and others, 1980), and May 18, 1940 (Ulrich, 1941), Imperial Valley earthquakes.

  20. Instability model for recurring large and great earthquakes in southern California

    USGS Publications Warehouse

    Stuart, W.D.

    1985-01-01

    The locked section of the San Andreas fault in southern California has experienced a number of large and great earthquakes in the past, and thus is expected to have more in the future. To estimate the location, time, and slip of the next few earthquakes, an earthquake instability model is formulated. The model is similar to one recently developed for moderate earthquakes on the San Andreas fault near Parkfield, California. In both models, unstable faulting (the earthquake analog) is caused by failure of all or part of a patch of brittle, strain-softening fault zone. In the present model the patch extends downward from the ground surface to about 12 km depth, and extends 500 km along strike from Parkfield to the Salton Sea. The variation of patch strength along strike is adjusted by trial until the computed sequence of instabilities matches the sequence of large and great earthquakes since a.d. 1080 reported by Sieh and others. The last earthquake was the M=8.3 Ft. Tejon event in 1857. The resulting strength variation has five contiguous sections of alternately low and high strength. From north to south, the approximate locations of the sections are: (1) Parkfield to Bitterwater Valley, (2) Bitterwater Valley to Lake Hughes, (3) Lake Hughes to San Bernardino, (4) San Bernardino to Palm Springs, and (5) Palm Springs to the Salton Sea. Sections 1, 3, and 5 have strengths between 53 and 88 bars; sections 2 and 4 have strengths between 164 and 193 bars. Patch section ends and unstable rupture ends usually coincide, although one or more adjacent patch sections may fail unstably at once. The model predicts that the next sections of the fault to slip unstably will be 1, 3, and 5; the order and dates depend on the assumed length of an earthquake rupture in about 1700. © 1985 Birkhäuser Verlag.

  1. Paleoseismic Evidence for Prehistoric Earthquakes on the Northern Maacama Fault, Willits, California

    NASA Astrophysics Data System (ADS)

    Fenton, C. H.; Prentice, C. S.; Benton, J. L.; Crosby, C. J.; Sickler, R. R.; Stephens, T. A.

    2002-12-01

    The right-lateral strike-slip Maacama fault zone (MFZ) has been interpreted as the northern continuation of the Hayward-Rodgers Creek fault system, and is considered to be an active structure because of its youthful tectonic geomorphology, association with an elevated rate of small magnitude seismicity, and contemporary creep. However, it has not generated any known large historical earthquakes. The MFZ is known to creep at a rate of about 6.5 mm/yr (measured over a 10 year period) through the town of Willits (Galehouse, 2002). Modeling of geodetic data across northern California suggests approximately 14 mm/yr of strike-slip motion across the MFZ (Freymueller et al., 1999). Therefore, we expect the MFZ, like the Hayward fault, to both creep and produce large earthquakes in order to reconcile the difference between the creep rate and the total slip rate across the fault. Previous paleoseismic investigations near Ukiah, 30 km south of Willits, suggest that the most recent large earthquake on the MFZ occurred between 1500 AD and 1630 AD (Sickler et al., 1999). We excavated six trenches across the MFZ in Willits. Four trenches crossed a southwest-facing, two- to five-m-high fault scarp on the southwest side of a pressure ridge, and two trenches were excavated across the projection of the fault to the south in an area of younger alluvium with no surface expression of the fault. The trenches across the scarp exposed a steeply northeast-dipping fault plane, juxtaposing Pleistocene lacustrine clays (dated at 18,810 ± 80 radiocarbon years BP) overlain by a cap of Holocene alluvium to the northeast, against a sequence of Holocene fluvial and colluvial deposits to the southwest. Structural and stratigraphic relations suggest at least four and probably five faulting events during the (late?) Holocene. Slickensides plunging 10-12° to the northwest show a horizontal to vertical slip ratio of about 5:1.
The trenches excavated across the area of younger alluvium show evidence for only one faulting event. We interpret the uppermost deformation in these trenches, a series of upward fanning fractures with little displacement, to be the result of fault creep.

  2. Probable earthquake ground motion as related to structural response in Las Vegas, Nevada

    SciTech Connect

    Greensfelder, R.W.; Kintzer, F.C.; Somerville, M.R.

    1980-12-01

    Ground motion parameters are necessary for structural damage assessments and dynamic effects prediction in Las Vegas, Nevada. To develop these, a model of tectonic activity in the southern Basin and Range province was constructed on the basis of late Cenozoic patterns of crustal deformation, estimates of regional strain rates in Holocene time, and historic seismicity. From this information, the region surrounding Las Vegas was subdivided into six seismotectonic zones. Historic seismicity was analyzed on the basis of seismographically recorded earthquakes and compared with long-term seismotectonic activity. Apparent agreement between the two sets of data indicates that the average rates of historic seismicity observed in the areas analyzed are reasonably representative of long-term seismicity. Magnitude-recurrence relationships were developed for each of the six seismotectonic zones, and probable maximum values of peak ground acceleration in Las Vegas were calculated using a computer program (HAZARD) developed for the study. Probable causative earthquake magnitudes in each source zone, probable values of duration of seismic shaking, and predominant periods likely to be associated with various peak accelerations were also determined.
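Magnitude-recurrence relationships of the kind developed for each seismotectonic zone are conventionally of Gutenberg-Richter form, log10 N(M) = a − bM, where N is the annual rate of events at or above magnitude M. A minimal sketch with illustrative a- and b-values (not those of the Las Vegas study):

```python
# Gutenberg-Richter recurrence sketch. The a- and b-values below are
# hypothetical placeholders chosen only to show the shape of the relation.
a, b = 3.5, 1.0

def annual_rate(M):
    """Annual rate of earthquakes with magnitude >= M."""
    return 10 ** (a - b * M)

for M in (5.0, 6.0, 7.0):
    rate = annual_rate(M)
    print(f"M>={M}: {rate:.4f} events/yr, mean recurrence ~{1 / rate:.0f} yr")
```

Relations like this, one per zone, are what a hazard code such as the study's HAZARD program integrates (together with attenuation relations) to obtain probable peak accelerations at a site.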

  3. Probabilistic approach to short- and intermediate-term earthquake forecasting: II test application to southern California

    Microsoft Academic Search

    M. Niazi; C. P. Mortgat; K. N. Truong

    1985-01-01

    A method is proposed for assessing the modifying effects of precursory observations on long-term probabilities of strong earthquakes (M ≥ 6). Short- and intermediate-term probabilities so estimated rely on the mean precursory time and its uncertainty as a function of the mainshock magnitude and epicentral position. Short- and intermediate-term modification of long-term probabilities within a 120,000 km2 circular area covering most of Southern

  5. A record of large earthquakes during the past two millennia on the southern Green Valley Fault, California

    USGS Publications Warehouse

    Lienkaemper, James J.; Baldwin, John N.; Turner, Robert; Sickler, Robert R.; Brown, Johnathan

    2013-01-01

    We document evidence for surface-rupturing earthquakes (events) at two trench sites on the southern Green Valley fault, California (SGVF). The 75-80-km-long dextral SGVF creeps ~1-4 mm/yr. We identify stratigraphic horizons disrupted by upward-flowering shears and in-filled fissures unlikely to have formed from creep alone. The Mason Rd site exhibits four events from ~1013 CE to the present. The Lopes Ranch site (LR, 12 km to the south) exhibits three events from 18 BCE to the present, including the most recent event (MRE), 1610 ± 52 CE (1σ), and a two-event interval (18 BCE-238 CE) isolated by a millennium of low deposition. Using OxCal to model the timing of the 4-event earthquake sequence from radiocarbon data and the LR MRE yields a mean recurrence interval (RI, or μ) of 199 ± 82 yr (1σ) and ± 35 yr (standard error of the mean), the first based on geologic data. The time since the most recent earthquake (open window since the MRE) is 402 ± 52 yr, well past μ ~200 yr. The shape of the probability density function (pdf) of the average RI from OxCal resembles a Brownian Passage Time (BPT) pdf (i.e., rather than normal) that permits rarer, longer ruptures potentially involving the Berryessa and Hunting Creek sections of the northernmost GVF. The model coefficient of variation (cv, σ/μ) is 0.41, but a larger value (cv ~0.6) fits better when using BPT. A BPT pdf with μ of 250 yr and cv of 0.6 yields 30-yr rupture probabilities of 20-25%, versus a Poisson probability of 11-17%.
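The BPT-versus-Poisson comparison in the abstract can be checked roughly with a short numerical sketch. Parameters follow the abstract (mean recurrence μ = 250 yr, cv = 0.6, ~400 yr elapsed since the 1610 CE event); this is an illustrative calculation, not the paper's formal OxCal/BPT analysis.

```python
import numpy as np

mu, cv, elapsed, window = 250.0, 0.6, 400.0, 30.0

def bpt_pdf(t):
    """Brownian Passage Time (inverse Gaussian) density with mean mu, cv."""
    return np.sqrt(mu / (2.0 * np.pi * cv**2 * t**3)) * \
        np.exp(-((t - mu) ** 2) / (2.0 * mu * cv**2 * t))

# Conditional probability of rupture in the next `window` years, given
# survival to `elapsed` years, by numerical integration of the density tail.
dt = 0.05
t = np.arange(elapsed, 5000.0, dt)      # density beyond 5000 yr is negligible
pdf = bpt_pdf(t)
p_bpt = pdf[t < elapsed + window].sum() / pdf.sum()

# Memoryless comparison: Poisson probability over the same window.
p_poisson = 1.0 - np.exp(-window / mu)
print(f"BPT ~{p_bpt:.2f} vs Poisson ~{p_poisson:.2f}")
```

Because the fault is already well past its mean recurrence interval, the renewal (BPT) probability exceeds the Poisson value, which is the qualitative point of the abstract's 20-25% versus 11-17% comparison.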

  6. Pulse Azimuth Clusters Preceding Earthquakes in California, 2005-2010

    NASA Astrophysics Data System (ADS)

    Dunson, C.; Bleier, T. E.

    2010-12-01

    Increases in geomagnetic pulsation activity have been observed preceding many earthquakes in the Quakefinder magnetometer network. Some of these pulses are extremely large, and many have been shown to have a consistent angle of arrival (azimuth), determined using the pairs of horizontal coils (North-South and East-West) at each site. These pulsations can be seen 1-15 days prior to earthquakes near the same sites, and since they occur in bursts, they have been termed 'pulse azimuth clusters.' These clusters can be studied empirically using Bernoulli trials, an approach that can support a decision-making framework for an operational earthquake warning system. A number of issues regarding these ideas are discussed in this poster.
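One way the Bernoulli-trial framing mentioned above can work is as a significance test: treat each cluster as a trial that "succeeds" if an earthquake follows within the 1-15 day window, and ask how surprising the observed number of successes is under a background chance rate. All counts below are hypothetical, for illustration only.

```python
from math import comb

def binom_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p): chance of k or more hits in n trials."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

# Hypothetical counts: 20 observed clusters, 8 followed by earthquakes,
# against an assumed 10% per-trial background chance of coincidence.
n, k, p = 20, 8, 0.1
print(f"P(>= {k} hits by chance) = {binom_tail(n, k, p):.2e}")
```

A small tail probability argues the cluster-earthquake association is unlikely to be coincidence; an operational warning system would then tune the per-trial window and threshold to balance missed events against false alarms.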

  7. Earthquakes

    MedlinePLUS

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  8. Identification and Reduction of Nonstructural Earthquake Hazards in California Schools.

    ERIC Educational Resources Information Center

    Greene, Marjorie; And Others

    It is necessary to identify nonstructural hazards at the school site to reduce the possibility of injury in the event of an earthquake. Nonstructural hazards can occur in every part of a building and in all of its contents, with the exception of the structure itself. In other words, nonstructural elements are everything but the columns, beams, floors, load-bearing…

  9. Residual analysis methods for space–time point processes with applications to earthquake forecast models in California

    Microsoft Academic Search

    Robert Alan Clements; Frederic Paik Schoenberg; Danijel Schorlemmer

    2011-01-01

    Modern, powerful techniques for the residual analysis of spatial-temporal point process models are reviewed and compared. These methods are applied to California earthquake forecast models used in the Collaboratory for the Study of Earthquake Predictability (CSEP). Assessments of these earthquake forecasting models have previously been performed using simple, low-power means such as the L-test and N-test. We instead propose residual

  10. The 2014 Mw 6.0 Napa Earthquake, California: Observations from Real-time GPS-enhanced Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Johanson, I. A.; Grapenthin, R.; Allen, R. M.

    2014-12-01

    Recently, progress has been made to demonstrate feasibility and benefits of including real-time GPS (rtGPS) in earthquake early warning and rapid response systems. While most concepts have yet to be integrated into operational environments, the Berkeley Seismological Laboratory is currently running an rtGPS based finite fault inversion scheme in true real-time, which is triggered by the seismic-based ShakeAlert system and then sends updated earthquake alerts to a test receiver. The Geodetic Alarm System (G-larmS) was online and responded to the 2014 Mw6.0 South Napa earthquake in California. We review G-larmS' performance during this event and for 13 aftershocks, and we present rtGPS observations and real-time modeling results for the main shock. The first distributed slip model and a magnitude estimate of Mw5.5 were available 24 s after the event origin time, which could be reduced to 14 s after a bug fix (~8 s S-wave travel time, ~6 s data latency). The system continued to re-estimate the magnitude once every second: it increased to Mw5.9 3 s after the first alert and stabilized at Mw5.8 after 15 s. G-larmS' solutions for the subsequent small magnitude aftershocks demonstrate that Mw~6.0 is the current limit for alert updates to contribute back to the seismic-based early warning system.

  11. The Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) Internship Program

    NASA Astrophysics Data System (ADS)

    Perry, S.; Jordan, T.

    2006-12-01

    Our undergraduate research program, SCEC/UseIT, an NSF Research Experience for Undergraduates site, provides software for earthquake researchers and educators, movies for outreach, and ways to strengthen the technical career pipeline. SCEC/UseIT motivates diverse undergraduates towards science and engineering careers through team-based research in the exciting field of earthquake information technology. UseIT provides the cross-training in computer science/information technology (CS/IT) and geoscience needed to make fundamental progress in earthquake system science. Our high and increasing participation of women and minority students is crucial given the nation's precipitous enrollment declines in CS/IT undergraduate degree programs, especially among women. UseIT also casts a "wider, farther" recruitment net that targets scholars interested in creative work but not traditionally attracted to summer science internships. Since 2002, SCEC/UseIT has challenged 79 students in three dozen majors from as many schools with difficult, real-world problems that require collaborative, interdisciplinary solutions. Interns design and engineer open-source software, creating increasingly sophisticated visualization tools (see "SCEC-VDO," session IN11), which are employed by SCEC researchers, in new curricula at the University of Southern California, and by outreach specialists who make animated movies for the public and the media. SCEC-VDO would be a valuable tool for research-oriented professional development programs.

  12. Chapter B. The Loma Prieta, California, Earthquake of October 17, 1989 - Highway Systems

    USGS Publications Warehouse

    Yashinsky, Mark

    1998-01-01

    This paper summarizes the impact of the Loma Prieta earthquake on highway systems. City streets, urban freeways, county roads, state routes, and the national highway system were all affected. There was damage to bridges, roads, tunnels, and other highway structures. The most serious damage occurred in the cities of San Francisco and Oakland, 60 miles from the fault rupture. The cost to repair and replace highways damaged by this earthquake was $2 billion. About half of this cost was to replace the Cypress Viaduct, a long, elevated double-deck expressway that had a devastating collapse which resulted in 42 deaths and 108 injuries. The earthquake also resulted in some positive changes for highway systems. Research on bridges and earthquakes began to be funded at a much higher level. Retrofit programs were started to upgrade the seismic performance of the nation's highways. The Loma Prieta earthquake changed earthquake policy and engineering practice for highway departments not only in California, but all over the world.

  13. Stress transferred by the 1995 Mw = 6.9 Kobe, Japan, shock: Effect on aftershocks and future earthquake probabilities

    USGS Publications Warehouse

    Toda, S.; Stein, R.S.; Reasenberg, P.A.; Dieterich, J.H.; Yoshida, A.

    1998-01-01

    The Kobe earthquake struck at the edge of the densely populated Osaka-Kyoto corridor in southwest Japan. We investigate how the earthquake transferred stress to nearby faults, altering their proximity to failure and thus changing earthquake probabilities. We find that relative to the pre-Kobe seismicity, Kobe aftershocks were concentrated in regions of calculated Coulomb stress increase and less common in regions of stress decrease. We quantify this relationship by forming the spatial correlation between the seismicity rate change and the Coulomb stress change. The correlation is significant for stress changes greater than 0.2-1.0 bars (0.02-0.1 MPa), and the nonlinear dependence of seismicity rate change on stress change is compatible with a state- and rate-dependent formulation for earthquake occurrence. We extend this analysis to future mainshocks by resolving the stress changes on major faults within 100 km of Kobe and calculating the change in probability caused by these stress changes. Transient effects of the stress changes are incorporated by the state-dependent constitutive relation, which amplifies the permanent stress changes during the aftershock period. Earthquake probability framed in this manner is highly time-dependent, much more so than is assumed in current practice. Because the probabilities depend on several poorly known parameters of the major faults, we estimate uncertainties of the probabilities by Monte Carlo simulation. This enables us to include uncertainties on the elapsed time since the last earthquake, the repeat time and its variability, and the period of aftershock decay. 
We estimate that a calculated 3-bar (0.3-MPa) stress increase on the eastern section of the Arima-Takatsuki Tectonic Line (ATTL) near Kyoto causes fivefold increase in the 30-year probability of a subsequent large earthquake near Kyoto; a 2-bar (0.2-MPa) stress decrease on the western section of the ATTL results in a reduction in probability by a factor of 140 to 2000. The probability of a Mw = 6.9 earthquake within 50 km of Osaka during 1997-2007 is estimated to have risen from 5-6% before the Kobe earthquake to 7-11% afterward; during 1997-2027, it is estimated to have risen from 14-16% before Kobe to 16-22%.
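The transient amplification of stress changes during the aftershock period follows Dieterich's rate- and state-dependent formulation the abstract refers to: a sudden Coulomb stress step Δτ multiplies the background seismicity rate by a factor that decays back to unity over the aftershock duration t_a. A minimal sketch (the Δτ = 3 bars matches the abstract; Aσ and t_a are illustrative, not the paper's calibrated values):

```python
import numpy as np

def rate_ratio(t, dtau, a_sigma, t_a):
    """Dieterich (1994) seismicity-rate response to a stress step:
    R/r = 1 / ((exp(-dtau/A_sigma) - 1) * exp(-t/t_a) + 1),
    where r is the background rate, A_sigma the constitutive parameter,
    and t_a the aftershock decay time."""
    return 1.0 / ((np.exp(-dtau / a_sigma) - 1.0) * np.exp(-t / t_a) + 1.0)

dtau, a_sigma, t_a = 3.0, 1.0, 35.0   # bars, bars, years (illustrative)
t = np.array([0.0, 10.0, 35.0, 100.0])
# Rate jumps to exp(dtau/A_sigma) at t = 0, then decays toward 1.
print(np.round(rate_ratio(t, dtau, a_sigma, t_a), 2))
```

It is this temporarily elevated rate, integrated over the exposure window, that makes the probabilities "highly time-dependent" in the sense the abstract describes.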

  14. Analysis of Earthquake Recordings Obtained from the Seafloor Earthquake Measurement System (SEMS) Instruments Deployed off the Coast of Southern California

    USGS Publications Warehouse

    Boore, D.M.; Smith, C.E.

    1999-01-01

    For more than 20 years, a program has been underway to obtain records of earthquake shaking on the seafloor at sites offshore of southern California, near oil platforms. The primary goal of the program is to obtain data that can help determine if ground motions at offshore sites are significantly different than those at onshore sites; if so, caution may be necessary in using onshore motions as the basis for the seismic design of oil platforms. We analyze data from eight earthquakes recorded at six offshore sites; these are the most important data recorded on these stations to date. Seven of the earthquakes were recorded at only one offshore station; the eighth event was recorded at two sites. The earthquakes range in magnitude from 4.7 to 6.1. Because of the scarcity of multiple recordings from any one event, most of the analysis is based on the ratio of spectra from vertical and horizontal components of motion. The results clearly show that the offshore motions have very low vertical motions compared to those from an average onshore site, particularly at short periods. Theoretical calculations find that the water layer has little effect on the horizontal components of motion but that it produces a strong spectral null on the vertical component at the resonant frequency of P waves in the water layer. The vertical-to-horizontal ratios for a few selected onshore sites underlain by relatively low shear-wave velocities are similar to the ratios from offshore sites for frequencies less than about one-half the water layer P-wave resonant frequency, suggesting that the shear-wave velocities beneath a site are more important than the water layer in determining the character of the ground motions at lower frequencies.
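The spectral null the abstract describes sits at the quarter-wavelength P-wave resonance of the water column, f0 = Vp / (4H) for water depth H. A minimal sketch (depths are illustrative, not those of the SEMS sites):

```python
VP_WATER = 1500.0   # m/s, approximate P-wave speed in seawater

def water_resonance_hz(depth_m):
    """Fundamental P-wave resonant frequency of a water layer of given depth."""
    return VP_WATER / (4.0 * depth_m)

for h in (100.0, 200.0, 400.0):
    print(f"depth {h:.0f} m -> f0 = {water_resonance_hz(h):.2f} Hz")
```

Deeper water pushes the vertical-component null to lower frequency, which is why the abstract compares offshore and onshore spectral ratios relative to one-half of this resonant frequency.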

  15. Geodetic record of a complete earthquake cycle: constraints on frictional properties and earthquake hazard in the Imperial Valley, southern California

    NASA Astrophysics Data System (ADS)

    Lindsey, Eric; Fialko, Yuri

    2015-04-01

    We analyze a suite of geodetic observations across the Imperial fault in southern California that span the complete earthquake cycle, and show that co-, post- and interseismic observations are all required to obtain a robust constraint on the frictional properties and behavior of the fault. Coseismic and postseismic slip on the fault was measured with high precision following the 1979 M6.6 Imperial Valley earthquake, and interseismic deformation is presently recorded by a combination of multiple InSAR viewing geometries and survey-mode GPS. We combine more than 100 survey-mode GPS velocities (Crowell et al., 2013) with new InSAR observations from Envisat descending tracks 84 and 356 and ascending tracks 77 and 306 (149 total acquisitions), processed using the Stanford Method for Persistent Scatterers (StaMPS) package (Hooper et al., 2007). The result is a dense map of surface velocities across the Imperial fault and surrounding areas, revealing the rate of interseismic loading and along-strike variations in surface creep. We compare the geodetic data to models of earthquake cycles with rate- and state-dependent friction and find that a complete record of the earthquake cycle is required to constrain key fault properties including the velocity-strengthening or velocity-weakening parameter (a-b) and its variation with depth; moment accumulation rate; and recurrence interval of large events. We also investigate the possibility that a little-known extension of the San Jacinto fault through the town of El Centro may accommodate a significant portion of the slip previously attributed to the Imperial fault. Models including this additional fault are more consistent with the available observations, a scenario which has significant hazard implications.

  16. Hospital compliance with a state unfunded mandate: the case of California's Earthquake Safety Law.

    PubMed

    McCue, Michael J; Thompson, Jon M

    2012-01-01

    In recent years, community hospitals have experienced heightened regulation with many unfunded mandates. The authors assessed the market, organizational, operational, and financial characteristics of general acute care hospitals in California that have a main acute care hospital building that is noncompliant with state requirements and at risk of major structural collapse from earthquakes. Using California hospital data from 2007 to 2009, and employing logistic regression analysis, the authors found that hospitals having buildings that are at the highest risk of collapse are located in larger population markets, possess smaller market share, have a higher percentage of Medicaid patients, and have less liquidity. PMID:23216262

  17. Earthquake swarms and local crustal spreading along major strike-slip faults in California

    USGS Publications Warehouse

    Weaver, C.S.; Hill, D.P.

    1978-01-01

    Earthquake swarms in California are often localized to areas within dextral offsets in the linear trend of active fault strands, suggesting a relation between earthquake swarms and local crustal spreading. Local crustal spreading is required by the geometry of dextral offsets when, as in the San Andreas system, faults have dominantly strike-slip motion with right-lateral displacement. Three clear examples of this relation occur in the Imperial Valley, Coso Hot Springs, and the Danville region, all in California. The first two of these areas are known for their Holocene volcanism and geothermal potential, which is consistent with crustal spreading and magmatic intrusion. The third example, however, shows no evidence for volcanism or geothermal activity at the surface. © 1978 Birkhäuser Verlag.

  18. Earthquake and Tsunami planning, outreach and awareness in Humboldt County, California

    NASA Astrophysics Data System (ADS)

    Ozaki, V.; Nicolini, T.; Larkin, D.; Dengler, L.

    2008-12-01

    Humboldt County has the longest coastline in California and is one of the most seismically active areas of the state. It is at risk from earthquakes located on and offshore and from tsunamis generated locally from faults associated with the Cascadia subduction zone (CSZ), other regional fault systems, and from distant sources elsewhere in the Pacific. In 1995 the California Division of Mines and Geology published the first earthquake scenario to include both strong ground shaking effects and a tsunami. As a result of the scenario, the Redwood Coast Tsunami Work Group (RCTWG), an organization of representatives from government agencies, tribes, service groups, academia and the private sector from the three northern coastal California counties, was formed in 1996 to coordinate and promote earthquake and tsunami hazard awareness and mitigation. The RCTWG and its member agencies have sponsored a variety of projects including education/outreach products and programs, tsunami hazard mapping, signage and siren planning, and has sponsored an Earthquake - Tsunami Education Room at the Humboldt County fair for the past eleven years. Three editions of Living on Shaky Ground, an earthquake-tsunami preparedness magazine for California's North Coast, have been published since 1993, and a fourth is due to be published in fall 2008. In 2007, Humboldt County was the first region in the country to participate in a tsunami training exercise at FEMA's Emergency Management Institute in Emmitsburg, MD and the first area in California to conduct a full-scale tsunami evacuation drill. The County has conducted numerous multi-agency, multi-discipline coordinated exercises using the county-wide tsunami response plan. Two Humboldt County communities were recognized as TsunamiReady by the National Weather Service in 2007. Over 300 tsunami hazard zone signs have been posted in Humboldt County since March 2008.
Six assessment surveys from 1993 to 2006 have tracked preparedness actions and personal awareness of earthquake and tsunami hazards in the county, and additional surveys have tracked public awareness and tourist concerns about tsunami hazard signs. Over the thirteen-year period covered by the surveys, the percentage of houses secured to their foundations increased from 58 to 80 percent, the share of respondents aware of a local tsunami hazard increased from 51 to 73 percent, and the share knowing what the Cascadia subduction zone is increased from 16 to 42 percent.

  19. Historiographical analysis of the 1857 Ft. Tejon earthquake, San Andreas Fault, California: Preliminary results

    NASA Astrophysics Data System (ADS)

    Martindale, D.; Evans, J. P.

    2002-12-01

    Past historical analyses of the 1857 Fort Tejon earthquake include Townley and Allen (1939); Wood (1955) re-examined the earthquake and added some new material; and Agnew and Sieh (1978) published an extensive review of the previous publications that included primary sources not formerly known. Since 1978, most authors have reiterated the findings of Agnew and Sieh, with the exception of Meltzner and Wald's 1998 work, which built on Sieh's foreshock research and included an extensive study of aftershocks. Approximately twenty-five years have passed since the last full investigation of the event. In the last several decades, libraries and archives have continued to gather additional documents. Staff members continually inventory new and existing collections, making them accessible to researchers today. As a result, we are conducting an updated examination, with the hope of new insight regarding the 1857 Fort Tejon earthquake. We use a new approach to the topic: the research skills of a historian in collaboration with a geologist to generate quantitative data on the nature and location of ground shaking associated with the earthquake. We analyze documents from the Huntington Library, California State Historical Society, California State Library-California Room, Utah Historical Association Information Center, the Church of Jesus Christ of Latter-day Saints (LDS) Archives and Historical Department, Cal Tech Archives, the National Archives, and the Fort Tejon State Park. New facilities reviewed also include Utah State University, University of Utah, and the LDS Family History Center. Each facility not only provided formerly quoted sources, but many offered new materials. For example, previous scholars examined popular, well-known newspapers; yet publications in smaller towns, and in languages other than English, also existed. Thirty newspapers published in January 1857 were located. We find records of the event at least one year after the earthquake.
One outcome of the search is a set of letters and approximately eight pictures useful in structure-damage analysis. Over 170 newspapers were published during 1857 throughout California, Nevada, and New Mexico Territory, encompassing the area of Arizona and New Mexico today. Historical information regarding the settlement of areas also proved useful. Although earlier scholars knew of LDS settlement missions in San Bernardino, California and Las Vegas, Nevada, only brief information was located. Preliminary results include extending the felt area to include Las Vegas, Nevada; support for a Mercalli intensity of IX or even X for San Bernardino; VIII or greater for sites NE of Sacramento; a northwest-to-southeast rupture pattern; and reports of electromagnetic disturbances. Based on these results, we suggest that the 1857 Ft. Tejon earthquake was felt over a wider area, and in places created greater ground shaking, than previously documented.

  20. Earthquake-induced sediment failures on a 0.25° slope, Klamath River delta, California.

    USGS Publications Warehouse

    Field, M.E.; Gardner, J.V.; Jennings, A.E.; Edwards, B.D.

    1982-01-01

    On Nov. 8, 1980, a major earthquake (magnitude 6.5-7.2) occurred 60 km off the coast of N California. A survey of the area using high-resolution seismic-reflection and side-scan sonar equipment revealed the presence of extensive sediment failure and flows in a zone about 1 km wide and 20 km long that trends parallel to the shelf on the very gently sloping (less than 0.25°) Klamath River delta.-from Authors

  1. Processed seismic motion records from earthquakes (1982-1993): Recorded at Scotty's Castle, California

    SciTech Connect

    Lum, P K; Honda, K K

    1993-10-01

    The 8mm data tape contains the processed seismic data of earthquakes recorded at Scotty's Castle, California. The seismic data were recorded by seismographs maintained by the DOE/NV in southern Nevada. Four files were generated from each seismic recorder: (1) uncorrected acceleration time histories, (2) corrected acceleration, velocity, and displacement time histories, (3) original recordings, and (4) Fourier amplitude spectra of acceleration.
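The fourth file type, a Fourier amplitude spectrum of acceleration, is a standard strong-motion product. As an illustration only (not the processing code used for these tapes), a minimal NumPy sketch of computing one from a sampled acceleration record:

```python
import numpy as np

def fourier_amplitude_spectrum(accel, dt):
    """One-sided Fourier amplitude spectrum of an acceleration time history."""
    n = len(accel)
    freqs = np.fft.rfftfreq(n, d=dt)
    # Scale by dt so the amplitude approximates the continuous-time transform
    amps = np.abs(np.fft.rfft(accel)) * dt
    return freqs, amps

# Synthetic record: a 2 Hz sinusoid sampled at 100 Hz for 10 s
dt = 0.01
t = np.arange(0, 10, dt)
accel = np.sin(2 * np.pi * 2.0 * t)
freqs, amps = fourier_amplitude_spectrum(accel, dt)
# The spectral peak falls at the 2 Hz bin
```

The `dt` scaling is one common convention; processed-data archives may normalize differently.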

  2. A New Trigger Criterion for Improved Real-Time Performance of Onsite Earthquake Early Warning in Southern California

    Microsoft Academic Search

    M. Bose; E. Hauksson; K. Solanki; H. Kanamori; Y.-M. Wu; T. H. Heaton

    2009-01-01

    We have implemented and tested an algorithm for onsite earthquake early warning (EEW) in California using the infrastructure of the Southern California Seismic Network (SCSN). The algorithm relies on two parameters derived from the initial 3 sec of P waveform data at a single seismic sensor: the period parameter τc and the high-pass filtered displacement amplitude Pd. Previous studies have determined em-
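The two parameters have published definitions: τc is Kanamori's period parameter, 2π·sqrt(∫u²dt / ∫u̇²dt) over the initial P window, and Pd is the peak of the high-pass filtered displacement. A minimal sketch of both, assuming clean displacement and velocity traces and ignoring the filtering details of the SCSN implementation:

```python
import numpy as np

def tau_c(u, v, dt):
    """Kanamori's period parameter from displacement u and velocity v."""
    return 2 * np.pi * np.sqrt(np.sum(u**2) / np.sum(v**2))

def peak_displacement(u):
    """Pd: peak absolute displacement in the window."""
    return np.max(np.abs(u))

# 3 s window at 100 Hz; a pure 1 s period signal should yield tau_c ~ 1 s
dt = 0.01
t = np.arange(0, 3, dt)
u = np.sin(2 * np.pi * t)               # displacement, period 1 s
v = 2 * np.pi * np.cos(2 * np.pi * t)   # its time derivative
```

Note `dt` cancels in the ratio of the two discrete sums, so it is omitted there.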

  3. The 3 August 2009 Mw 6.9 Canal de Ballenas Region, Gulf of California, Earthquake and Its Aftershocks

    E-print Network

    Shearer, Peter

    On 3 August 2009, an earthquake of Mw 6.9 occurred near Canal de Ballenas, in the north-central region of the Gulf of California, Mexico. Later in August, a third aftershock of Mw 5.7 was located in the Canal de Ballenas region.

  4. Earthquakes

    NSDL National Science Digital Library

    Mrs. Hemedinger

    2007-11-26

    Students will participate in a virtual earthquake lab where they will locate an epicenter and measure Richter Scale magnitude. They will also plot the positions of earthquakes that occurred that day. 1) Go to the Virtual Earthquake website and follow instructions to complete the online lab assignment. 2) Go to the USGS earthquake site. Take a few minutes to explore the earthquakes displayed on the world map. Click on "M2.5/4+ Earthquake List". Use the world map provided by your teacher to plot the locations ...
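Epicenter location in labs of this kind typically rests on the S-minus-P arrival time gap at each station. A sketch of the distance step, with assumed average crustal velocities (these values are illustrative, not taken from the lab itself):

```python
def sp_distance_km(sp_seconds, vp=6.0, vs=3.5):
    """Epicentral distance from the S-minus-P time gap.

    vp, vs: assumed average crustal P and S wave speeds in km/s.
    Both waves leave together; S lags by d/vs - d/vp, so d = gap / (1/vs - 1/vp).
    """
    return sp_seconds / (1.0 / vs - 1.0 / vp)

# A 10 s S-P gap corresponds to roughly 84 km with these velocities;
# circles of such radii from three stations intersect at the epicenter.
distance = sp_distance_km(10.0)
```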

  5. Earthquakes

    NSDL National Science Digital Library

    This lesson on earthquakes is based on naturalist John Muir's experiences with two significant earthquakes, the 1872 earthquake on the east side of the Sierra Nevada Mountains, and the Great San Francisco Earthquake of 1906. Students will learn to explain that earthquakes are sudden motions along breaks in the crust called faults, and list the major geologic events including earthquakes, volcanic eruptions and mountain building, which are the result of crustal plate motions. A downloadable, printable version (PDF) of the lesson plan is available.

  6. Earthquake Education and Public Information Centers: A Collaboration Between the Earthquake Country Alliance and Free-Choice Learning Institutions in California

    NASA Astrophysics Data System (ADS)

    Degroot, R. M.; Springer, K.; Brooks, C. J.; Schuman, L.; Dalton, D.; Benthien, M. L.

    2009-12-01

    In 1999 the Southern California Earthquake Center initiated an effort to expand its reach to multiple target audiences through the development of an interpretive trail on the San Andreas fault at Wallace Creek and an earthquake exhibit at Fingerprints Youth Museum in Hemet. These projects and involvement with the San Bernardino County Museum in Redlands beginning in 2007 led to the creation of Earthquake Education and Public Information Centers (EPIcenters) in 2008. The impetus for the development of the network was to broaden participation in The Great Southern California ShakeOut. By 2009 it had grown more comprehensive in scope, including its evolution into a statewide network. EPIcenters constitute a variety of free-choice learning institutions, representing museums, science centers, libraries, universities, parks, and other places visited by a variety of audiences including families, seniors, and school groups. They share a commitment to demonstrating and encouraging earthquake preparedness. EPIcenters coordinate Earthquake Country Alliance activities in their county or region, lead presentations or organize events in their communities, or in other ways demonstrate leadership in earthquake education and risk reduction. The San Bernardino County Museum (Southern California) and The Tech Museum of Innovation (Northern California) serve as EPIcenter regional coordinating institutions. They interact with over thirty institutional partners who have implemented a variety of activities from displays and talks to earthquake exhibitions. While many activities are focused on the time leading up to and just after the ShakeOut, most EPIcenter members conduct activities year round. Network members at Kidspace Museum in Pasadena and San Diego Natural History Museum have formed EPIcenter focus groups on early childhood education and safety and security. 
This presentation highlights the development of the EPIcenter network, synergistic activities resulting from this collaboration, and lessons learned from interacting with free-choice learning institutions.

  7. Cruise report for 01-99-SC: southern California earthquake hazards project

    USGS Publications Warehouse

    Normark, William R.; Reid, Jane A.; Sliter, Ray W.; Holton, David; Gutmacher, Christina E.; Fisher, Michael A.; Childs, Jonathan R.

    1999-01-01

    The focus of the Southern California Earthquake Hazards project is to identify the landslide and earthquake hazards and related ground-deformation processes occurring in the offshore areas that have significant potential to impact the inhabitants of the Southern California coastal region. The project activity is supported through the Coastal and Marine Geology Program of the Geologic Division of the U. S. Geological Survey (USGS) and is a component of the Geologic Division's Science Strategy under Goal 1—Conduct Geologic Hazard Assessments for Mitigation Planning (Bohlen et al., 1998). The project research is specifically stated under Activity 1.1.2 of the Science Strategy: Earthquake Hazard Assessments and Loss Reduction Products in Urban Regions. This activity involves "research, seismic and geodetic monitoring, field studies, geologic mapping, and analyses needed to provide seismic hazard assessments of major urban centers in earthquake-prone regions including adjoining coastal and offshore areas." The southern California urban areas, which form the most populated urban corridor along the U.S. Pacific margin, are among a few specifically designated for special emphasis under the Division's science strategy (Bohlen et al., 1998). The primary objective of the project is to help mitigate the earthquake hazards for the Southern California region by improving our understanding of how deformation is distributed (spatially and temporally) in the offshore with respect to the onshore region. To meet this objective, we are conducting field investigations to observe the distribution, character, and relative intensity of active (i.e., primarily Holocene) deformation within the basins and along the shelf adjacent to the most highly populated areas (Fig. 1). 
In addition, acoustic imaging should help determine the subsurface dimensions of the faults and identify the size and frequency of submarine landslides, both of which are necessary for evaluating the potential for generating destructive tsunamis in the southern California offshore. In order to evaluate the strain associated with the offshore structures, the initial results from the field mapping under this project will be used to identify possible sites for deployment of acoustic geodetic instruments to monitor strain in the offshore region. A major goal of mapping under this project is to provide detailed geologic and geophysical information in GIS databases that build on the earlier studies and use the new data to precisely locate active faults and to map recent submarine landslide deposits.

  8. WHITTIER NARROWS, CALIFORNIA EARTHQUAKE OF OCTOBER 1, 1987-PRELIMINARY ASSESSMENT OF STRONG GROUND MOTION RECORDS.

    USGS Publications Warehouse

    Brady, A.G.; Etheredge, E.C.; Porcella, R.L.

    1988-01-01

    More than 250 strong-motion accelerograph stations were triggered by the Whittier Narrows, California earthquake of 1 October 1987. Considering the number of multichannel structural stations in the area of strong shaking, this set of records is one of the more significant in history. Three networks, operated by the U. S. Geological Survey, the California Division of Mines and Geology, and the University of Southern California produced the majority of the records. The excellent performance of the instruments in these and the smaller arrays is attributable to the quality of the maintenance programs. Readiness for a magnitude 8 event is directly related to these maintenance programs. Prior to computer analysis of the analog film records, a number of important structural resonant modes can be identified, and frequencies and simple mode shapes have been scaled.

  9. Factors Affecting the Probability of Default: Student Loans in California.

    ERIC Educational Resources Information Center

    Woo, Jennie H.

    2002-01-01

    Linked a database of California student borrowers with background financial and demographic information and post-college employment data to examine factors that predict default for borrowers in the federal Family Education Loan program. Found that background demographic and financial characteristics, leaving school without a degree, having low…

  10. The 2014 Mw 6.0 Napa earthquake, California: Observations from real-time GPS-enhanced earthquake early warning

    NASA Astrophysics Data System (ADS)

    Grapenthin, Ronni; Johanson, Ingrid; Allen, Richard M.

    2014-12-01

    Recently, progress has been made to demonstrate the feasibility and benefits of including real-time GPS (rtGPS) in earthquake early warning and rapid response systems. Most concepts, however, have yet to be integrated into operational environments. The Berkeley Seismological Laboratory runs an rtGPS-based finite fault inversion scheme in real time. This system (G-larmS) detected the 2014 Mw 6.0 South Napa earthquake in California. We review G-larmS' performance during this event and 13 aftershocks and present rtGPS observations and real-time modeling results for the main shock. The first distributed slip model and magnitude estimates were available 24 s after the event origin time, which, after optimizations, was reduced to 14 s (~8 s S wave travel time, ~6 s data latency). G-larmS' solutions for the aftershocks (which had no measurable surface displacements) demonstrate that, in combination with the seismic early warning magnitude, Mw 6.0 is our current resolution limit.

  11. Web Services and Other Enhancements at the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.

    2012-12-01

    The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, or MiniSEED depending on the service, and are compatible with the equivalent IRIS DMC web services. The NCEDC is currently providing the following Web Services: (1) Station inventory and channel response information delivered in StationXML format, (2) Channel response information delivered in RESP format, (3) Time series availability delivered in text and XML formats, (4) Single channel and bulk data request delivered in MiniSEED format. The NCEDC is also developing a rich Earthquake Catalog Web Service to allow users to query earthquake catalogs based on selection parameters such as time, location or geographic region, magnitude, depth, azimuthal gap, and rms. It will return (in QuakeML format) user-specified results that can include simple earthquake parameters, as well as observations such as phase arrivals, codas, amplitudes, and computed parameters such as first motion mechanisms, moment tensors, and rupture length. The NCEDC will work with both IRIS and the International Federation of Digital Seismograph Networks (FDSN) to define a uniform set of web service specifications that can be implemented by multiple data centers to provide users with a common data interface across data centers. The NCEDC now hosts earthquake catalogs and waveforms from the US Department of Energy (DOE) Enhanced Geothermal Systems (EGS) monitoring networks. 
These data can be accessed through the above web services and through special NCEDC web pages.
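Queries of this kind follow the FDSN web service convention of simple URL parameters returning data in standard formats. A sketch of assembling such a query URL; the base endpoint shown and the parameter set are assumptions to be checked against the NCEDC's own web service documentation:

```python
from urllib.parse import urlencode

# Assumed FDSN-style event service endpoint (verify against NCEDC docs)
BASE = "https://service.ncedc.org/fdsnws/event/1/query"

def event_query_url(starttime, endtime, minmagnitude=None, maxdepth=None, fmt="xml"):
    """Build an FDSN-convention event catalog query URL."""
    params = {"starttime": starttime, "endtime": endtime, "format": fmt}
    if minmagnitude is not None:
        params["minmagnitude"] = minmagnitude
    if maxdepth is not None:
        params["maxdepth"] = maxdepth
    return BASE + "?" + urlencode(params)

# One month of M >= 3.0 events, as QuakeML-style XML
url = event_query_url("2012-01-01", "2012-02-01", minmagnitude=3.0)
```

The same pattern applies to the station, dataselect, and availability services described above.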

  12. Earthquakes

    NSDL National Science Digital Library

    Timothy Heaton

    This site contains 22 questions on the topic of earthquakes, which covers seismic waves, earthquake characteristics, and earthquake magnitudes. This is part of the Principles of Earth Science course at the University of South Dakota. Users submit their answers and are provided immediate verification.

  13. Earthquakes

    MedlinePLUS

    Earthquakes are sudden rolling or shaking events caused by movement under the earth’s surface. Earthquakes happen along cracks in the earth's surface, called ... although they usually last less than one minute. Earthquakes cannot be predicted — although scientists are working on ...

  14. Localization of intermediate-term earthquake prediction

    Microsoft Academic Search

    V. G. Kossobokov; V. I. Keilis-Borok; S. W. Smith

    1990-01-01

    Relative seismic quiescence within a region which has already been diagnosed as having entered a Time of Increased Probability (TIP) for the occurrence of a strong earthquake can be used to refine the locality in which the earthquake may be expected to occur. A simple algorithm with parameters fitted from the data in Northern California preceding the 1980 magnitude 7.0

  15. Spatial and temporal synthesized probability gain for middle and long-term earthquake forecast and its preliminary application

    Microsoft Academic Search

    Xiao-Qing Wang; Zheng-Xiang Fu; Li-Ren Zhang; Sheng-Ping Su; Xiang Ding

    2000-01-01

    The principle of the middle and long-term earthquake forecast model of spatial and temporal synthesized probability gain and the evaluation of forecast efficiency (R-values) of various forecast methods are introduced in this paper. The R-value method, developed by Xu (1989), is further developed here and can be applied to more complicated cases. Probability gains in spatial and/or temporal domains and the

  16. The Redwood Coast Tsunami Work Group: Promoting Earthquake and Tsunami Resilience on California's North Coast

    NASA Astrophysics Data System (ADS)

    Dengler, L. A.; Henderson, C.; Larkin, D.; Nicolini, T.; Ozaki, V.

    2014-12-01

    In historic times, Northern California has suffered the greatest losses from tsunamis in the U.S. contiguous 48 states. Thirty-nine tsunamis have been recorded in the region since 1933, including five that caused damage. This paper describes the Redwood Coast Tsunami Work Group (RCTWG), an organization formed in 1996 to address the tsunami threat from both near and far sources. It includes representatives from government agencies, public, private and volunteer organizations, academic institutions, and individuals interested in working to reduce tsunami risk. The geographic isolation and absence of scientific agencies such as the USGS and CGS in the region, and relatively frequent occurrence of both earthquakes and tsunami events has created a unique role for the RCTWG, with activities ranging from basic research to policy and education and outreach programs. Regional interest in tsunami issues began in the early 1990s when there was relatively little interest in tsunamis elsewhere in the state. As a result, the group pioneered tsunami messaging and outreach programs. Beginning in 2008, the RCTWG has partnered with the National Weather Service and the California Office of Emergency Services in conducting the annual "live code" tsunami communications tests, the only area outside of Alaska to do so. In 2009, the RCTWG joined with the Southern California Earthquake Alliance and the Bay Area Earthquake Alliance to form the Earthquake Country Alliance to promote a coordinated and consistent approach to both earthquake and tsunami preparedness throughout the state. The RCTWG has produced and promoted a variety of preparedness projects including hazard mapping and sign placement, an annual "Earthquake - Tsunami Room" at County Fairs, public service announcements and print material, assisting in TsunamiReady community recognition, and facilitating numerous multi-agency, multidiscipline coordinated exercises, and community evacuation drills. 
Nine assessment surveys from 1993 to 2013 have tracked preparedness actions and personal awareness of tsunami hazards. Over the twenty-year period covered by the surveys, respondents aware of a local tsunami hazard increased from 51 to 90 percent and awareness of the Cascadia subduction zone increased from 16 to 60 percent.

  17. Systematic search for missing earthquakes in Central California around the 2003 Mw6.5 San Simeon and the 2004 Mw6.0 Parkfield earthquakes

    NASA Astrophysics Data System (ADS)

    Meng, X.; Peng, Z.; Hardebeck, J. L.; Yu, X.; Hong, B.

    2012-12-01

    Whether static or dynamic triggering is the dominant triggering mechanism in the near field is still under debate. The key factor for differentiating the two mechanisms is the 'stress shadow', where the static stress and the seismicity rate both decrease. However, clear evidence of 'stress shadows' is very difficult to obtain, because it requires a high seismicity rate before the mainshock. An ideal setting for finding 'stress shadows' is one where two large earthquakes occurred close in space and time within a dense seismic network. In this case, if static triggering is dominant, the seismicity rate in certain regions could be promoted by the first earthquake and then stifled by the second one. The 2003 Mw6.5 San Simeon and 2004 Mw6.0 Parkfield earthquakes, which are separated by only 9 months and ~60 km, fulfill these requirements and hence could be a perfect case for testing earthquake triggering mechanisms. However, existing earthquake catalogs are incomplete immediately following large earthquakes, mainly due to extremely high seismicity rates and masking by the coda waves of the mainshock and large aftershocks. Discovering those missing earthquakes is crucial for obtaining the genuine seismicity rate changes and therefore for further understanding the physical mechanisms of earthquake triggering. In this study, we apply the recent matched filter technique to systematically search for missing earthquakes from a half-year before the 2003 Mw6.5 San Simeon earthquake to a half-year after the 2004 Mw6.0 Parkfield earthquake. We use ~50,000 template events covering a wide region in Central California and run the detection codes on multiple GPU cards, which significantly reduces the computation time. Preliminary results show that 8048 events are detected by the matched filter technique in the two days before and after the Parkfield mainshock, ~8 times more than listed in the Northern California Seismic Network catalog. 
With the more complete earthquake catalog, we will be able to examine the correspondence between the genuine seismicity rate and the stress change pattern after the Parkfield earthquake, which could contribute to the aforementioned debate on static versus dynamic triggering of aftershocks and triggered seismicity.
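The matched filter technique slides each template waveform over the continuous data and flags windows whose normalized cross-correlation exceeds a detection threshold. A toy single-template, single-channel sketch (the study's multi-template GPU implementation is far more elaborate):

```python
import numpy as np

def matched_filter(data, template):
    """Sliding normalized cross-correlation; values near 1 flag template repeats."""
    nt = len(template)
    t = (template - template.mean()) / (template.std() * nt)
    cc = np.empty(len(data) - nt + 1)
    for i in range(len(cc)):
        w = data[i:i + nt]
        cc[i] = np.sum(t * (w - w.mean())) / w.std()
    return cc

rng = np.random.default_rng(0)
# Tapered synthetic "earthquake" waveform as the template
template = np.sin(2 * np.pi * np.arange(100) / 20) * np.hanning(100)
data = 0.05 * rng.standard_normal(2000)
data[700:800] += template  # bury one repeat of the template in noise
cc = matched_filter(data, template)
# The correlation peak recovers the buried event at sample 700
```

In practice detections are declared where `cc` exceeds some multiple of its median absolute deviation, summed over network channels.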

  18. CISN ShakeAlert: Using early warnings for earthquakes in California

    NASA Astrophysics Data System (ADS)

    Hellweg, M.; Vinci, M.; CISN-EEW Project Team

    2011-12-01

    As part of a USGS-funded project, the California Integrated Seismic Network (CISN) is now implementing and testing a prototype, end-to-end system for earthquake early warning, the ShakeAlert system. Having an alert of shaking just before it starts can improve resilience if the recipient of the alert has developed plans for responding to it and acts on them. We are working with a suite of prospective users from critical industries and institutions throughout California, such as the Bay Area Rapid Transit District, to identify information necessary for ShakeAlert users, as well as delivery mechanisms, procedures and products. At the same time, we will support their efforts to determine and implement appropriate responses to alerts of expected earthquake shaking, and to assess possible uses and especially benefits to themselves and to society. Thus, a detailed introduction to the CISN ShakeAlert system is an integral part of our interaction with the users, as are regular opportunities for feedback and support. In a final workshop, users will be surveyed for evaluations of prospective uses for early warning in their organizations as well as expected improvements in their response to earthquakes due to the early warning and their expected savings in terms of lives, damage and resilience.

  19. Cruise report for A1-00-SC southern California earthquake hazards project, part A

    USGS Publications Warehouse

    Gutmacher, Christina E.; Normark, William R.; Ross, Stephanie L.; Edwards, Brian D.; Sliter, Ray; Hart, Patrick; Cooper, Becky; Childs, Jon; Reid, Jane A.

    2000-01-01

    A three-week cruise to obtain high-resolution boomer and multichannel seismic-reflection profiles supported two project activities of the USGS Coastal and Marine Geology (CMG) Program: (1) evaluating the earthquake and related geologic hazards posed by faults in the near offshore area of southern California and (2) determining the pathways through which sea-water is intruding into aquifers of Los Angeles County in the area of the Long Beach and Los Angeles harbors. The 2000 cruise, A1-00-SC, is the third major data-collection effort in support of the first objective (Normark et al., 1999a, b); one more cruise is planned for 2002. This report deals primarily with the shipboard operations related to the earthquake-hazard activity. The sea-water intrusion survey is confined to shallow water and the techniques used are somewhat different from that of the hazards survey (see Edwards et al., in preparation).

  20. CSMIP (California Strong Motion Instrumentation Program) strong-motion records from the Santa Cruz Mountains (Loma Prieta), California earthquake of 17 October 1989

    SciTech Connect

    Shakal, A.; Reichle, M.; Ventura, C.; Cao, T.; Sherburne, R.; Savage, M.; Darragh, R.; Petersen, C.

    1990-01-01

    Strong-motion records were recovered from 93 stations of the California Strong Motion Instrumentation Program (CSMIP) after the earthquake. CSMIP provides information on the force of ground motion and the deformation induced in structures and in rock and soil by earthquake-generated ground motion. This information is recorded by strong-motion sensors placed in engineered structures and at free field (ground) sites, and is used by earthquake engineers and earth scientists to improve the design of earthquake-resistant structures. The strong-motion instrumentation program was established after the San Fernando earthquake in 1971. A total of 125 records were recovered from the 93 CSMIP stations which recorded the Loma Prieta event. These 125 records contain data from a total of 690 strong-motion sensors. These data are important because of the unique structures and sites at which records were obtained during this event. Some highlights of particular interest are included in this paper.

  1. GPS Time Series Analysis of Southern California Associated with the 2010 M7.2 El Mayor/Cucapah Earthquake

    NASA Technical Reports Server (NTRS)

    Granat, Robert; Donnellan, Andrea

    2011-01-01

    The magnitude 7.2 El Mayor/Cucapah earthquake that occurred in Mexico on April 4, 2010 was well instrumented with continuous GPS stations in California. Large offsets were observed at the GPS stations as a result of deformation from the earthquake, providing information about the co-seismic fault slip as well as fault slip from large aftershocks. Information can also be obtained from the position time series at each station.

  2. Investigating earthquake cycle vertical deformation recorded by GPS and regional tide gauge stations in California

    NASA Astrophysics Data System (ADS)

    Hardy, S.; Konter, B.

    2013-12-01

    Geodetic and tide gauge measurements of vertical deformation record localized zones of uplift and subsidence that may document critical components of both long and short-period earthquake cycle deformation. In this study, we compare vertical tide gauge data from the Permanent Service for Mean Sea Level (PSMSL) and vertical GPS data from the EarthScope Plate Boundary Observatory (PBO) for 10 approximately co-located station pairs along coastal California from Point Reyes, CA to Ensenada, Mexico. To compare these two datasets, we first truncate both datasets so that they span a common time frame for all stations (2007 - 2012). PSMSL data are corrected for both average global sea level rise (~1.8 mm/yr) and glacial isostatic adjustment. We then calculate a 2-month running mean for tide gauge and a 1-month running mean for GPS datasets to smooth out daily oceanographic or anthropogenic disturbances while maintaining the overall trend of each dataset. As major ocean-climate signals, such as El Niño, are considered regional features of the Pacific Ocean and likely common to all California tide gauge stations, we subtract a reference sea level record (San Francisco station) from all other stations to eliminate this signal. The GPS and tide gauge data show varying degrees of correlation spanning both 3-month and 4-year time-scales. We infer that the slope of the vertical displacements is largely controlled by interseismic motion; however, displacements from major earthquakes are evident and are required to explain some of the unique signatures in the tide gauge and GPS data. Specifically, we find that stations from both datasets in Southern California show an anomalous trend since the 2010 Baja California earthquake. To further investigate this trend and others, we compare these data to vertical motions estimated by a suite of 3-D viscoelastic earthquake cycle deformation models. 
Long-term tide gauge time series are well simulated by the models, but short-term time series are not as well predicted; additional parameter adjustments are needed to improve these. Alternatively, both tide gauge and GPS data show a better short-term than long-term correlation; oceanographic and possibly groundwater effects could be responsible for these differences.
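The smoothing and common-mode steps described in this record (running means, then subtracting a reference sea level record) can be sketched as follows; the window lengths and toy series are illustrative only:

```python
import numpy as np

def running_mean(series, window):
    """Centered moving average; edge bins average fewer real samples."""
    kernel = np.ones(window) / window
    return np.convolve(series, kernel, mode="same")

def remove_common_mode(station, reference):
    """Subtract a reference record (e.g. San Francisco) to suppress regional ocean signals."""
    return np.asarray(station) - np.asarray(reference)

# Toy monthly series: linear uplift plus a seasonal signal shared by all stations
t = np.arange(60)
shared = 5.0 * np.sin(2 * np.pi * t / 12)
station = 0.2 * t + shared
residual = remove_common_mode(station, shared)  # the 0.2 mm/month trend remains
```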

  3. Stress/strain changes and triggered seismicity following the Mw 7.3 Landers, California earthquake

    NASA Astrophysics Data System (ADS)

    Gomberg, Joan

    1996-01-01

    Calculations of dynamic stresses and strains, constrained by broadband seismograms, are used to investigate their role in generating the remotely triggered seismicity that followed the June 28, 1992 Mw7.3 Landers, California earthquake. I compare straingrams and dynamic Coulomb failure functions calculated for the Landers earthquake at sites that did experience triggered seismicity with those at sites that did not. Bounds on triggering thresholds are obtained from analysis of dynamic strain spectra calculated for the Landers and Mw6.1 Joshua Tree, California earthquakes at various sites, combined with results of static strain investigations by others. I interpret three principal results of this study with those of a companion study by Gomberg and Davis [this issue]. First, the dynamic elastic stress changes themselves cannot explain the spatial distribution of triggered seismicity, particularly the lack of triggered activity along the San Andreas fault system. In addition to the requirement to exceed a Coulomb failure stress level, this result implies the need to invoke and satisfy the requirements of appropriate slip instability theory. Second, results of this study are consistent with the existence of frequency- or rate-dependent stress/strain triggering thresholds, inferred from the companion study and interpreted in terms of earthquake initiation involving a competition of processes, one promoting failure and the other inhibiting it. Such competition is also part of relevant instability theories. Third, the triggering threshold must vary from site to site, suggesting that the potential for triggering strongly depends on site characteristics and response. 
The lack of triggering along the San Andreas fault system may be correlated with the advanced maturity of its fault gouge zone; the strains from the Landers earthquake were either insufficient to exceed its larger critical slip distance or some other critical failure parameter; or the faults failed stably as aseismic creep events. Variations in the triggering threshold at sites of triggered seismicity may be attributed to variations in gouge zone development and properties. Finally, these interpretations provide ready explanations for the time delays between the Landers earthquake and the triggered events.
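The Coulomb failure functions referred to above combine shear and normal stress changes resolved on a receiver fault. A minimal sketch of the standard static form ΔCFF = Δτ + μ′Δσn, with an assumed effective friction coefficient and a sign convention in which unclamping (reduced compression) is positive:

```python
def coulomb_failure_change(d_shear, d_normal, mu_eff=0.4):
    """Change in Coulomb failure function (same units as inputs, e.g. MPa).

    d_shear:  shear stress change in the slip direction
    d_normal: normal stress change, positive = unclamping
    mu_eff:   assumed effective friction coefficient (0.4 is a common choice)
    """
    return d_shear + mu_eff * d_normal

# 0.10 MPa of shear loading plus 0.05 MPa of unclamping both push toward failure
d_cff = coulomb_failure_change(0.10, 0.05)
```

The dynamic version evaluated in the study applies the same combination to time-varying stresses from the seismograms; sign conventions vary between papers, so check them before reuse.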

  4. Satellite IR thermal measurements prior to the September 2004 earthquakes in central California

    NASA Astrophysics Data System (ADS)

    Ouzounov, D.; Logan, T.; Braynt, N.; Taylor, P.

    2004-12-01

    We present and discuss observed variations in thermal transients and radiation fields prior to the earthquakes of September 18, 2004 near Bodie (M5.5) and September 28, 2004 near Parkfield (M6.0) in California. Previous analyses of earthquake events have indicated the presence of a thermal anomaly, where temperatures increased or did not return to their usual nighttime values. The procedure used in our work is to analyze weather satellite data taken at night and to record the general condition in which the ground cools after sunset. Two days before the Bodie earthquake, lower temperature radiation was observed by the NOAA/AVHRR satellite. This occurred when the entire region was relatively cloud-free. IR land surface nighttime temperature from the MODIS instrument rose to +4 degrees C in a 100 km radius around the Bodie epicenter. The thermal transient field recorded by MODIS in the vicinity of Parkfield, also in a cloud-free environment, was around +1 degree C, significantly smaller than the thermal anomaly around the Bodie epicenter. Ground surface temperature near the Parkfield epicenter, however, showed a steady increase 4 days prior to the earthquake and a significant drop the night before the quake. Geosynchronous weather satellite thermal IR measurements, taken every half hour from sunset to dawn, were also recorded for 10 days prior to the Parkfield event, the day of the quake, and 5 days after. To establish a baseline we also obtained GOES data for the same Julian days for the three years prior to the Parkfield earthquake. These September 2004 IR data sets were then used to systematically observe and record any thermal anomaly prior to the events that deviated from the baseline. 
Our recent results support the hypothesis of a possible relationship between thermodynamic processes produced by increasing tectonic stress in the Earth's crust and a subsequent electro-chemical interaction between the crust and the atmosphere/ionosphere.

  5. Marine geology and earthquake hazards of the San Pedro Shelf region, southern California

    USGS Publications Warehouse

    Fisher, Michael A.; Normark, William R.; Langenheim, Victoria E.; Calvert, Andrew J.; Sliter, Ray

    2004-01-01

    High-resolution seismic-reflection data have been combined with a variety of other geophysical and geological data to interpret the offshore structure and earthquake hazards of the San Pedro Shelf, near Los Angeles, California. Prominent structures investigated include the Wilmington Graben, the Palos Verdes Fault Zone, various faults below the western part of the shelf and slope, and the deep-water San Pedro Basin. The structure of the Palos Verdes Fault Zone changes markedly southeastward across the San Pedro Shelf and slope. Under the northern part of the shelf, this fault zone includes several strands, but the main strand dips west and is probably an oblique-slip fault. Under the slope, this fault zone consists of several fault strands having normal separation, most of which dip moderately east. To the southeast near Lasuen Knoll, the Palos Verdes Fault Zone locally is a low-angle fault that dips east, but elsewhere near this knoll the fault appears to dip steeply. Fresh sea-floor scarps near Lasuen Knoll indicate recent fault movement. The observed regional structural variation along the Palos Verdes Fault Zone is explained as the result of changes in strike and fault geometry along a master strike-slip fault at depth. The shallow summit and possible wavecut terraces on Lasuen Knoll indicate subaerial exposure during the last sea-level lowstand. Modeling of aeromagnetic data indicates the presence of a large magnetic body under the western part of the San Pedro Shelf and upper slope. This is interpreted to be a thick body of basalt of Miocene(?) age. Reflective sedimentary rocks overlying the basalt are tightly folded, whereas folds in sedimentary rocks east of the basalt have longer wavelengths. This difference might mean that the basalt was more competent during folding than the encasing sedimentary rocks. West of the Palos Verdes Fault Zone, other northwest-striking faults deform the outer shelf and slope. 
Evidence for recent movement along these faults is equivocal, because age dates on deformed or offset sediment are lacking.

  6. Earthquakes!

    NSDL National Science Digital Library

    A strong earthquake struck Istanbul, Turkey on Monday, only weeks after a major quake in the same area claimed more than 15,500 lives. This site, from The Why Files (see the August 9, 1996 Scout Report), offers background information on the science of earthquakes, with particular emphasis on the recent tectonic activity in Turkey.

  7. High precision earthquake locations reveal seismogenic structure beneath Mammoth Mountain, California

    USGS Publications Warehouse

    Prejean, S.; Stork, A.; Ellsworth, W.; Hill, D.; Julian, B.

    2003-01-01

    In 1989, an unusual earthquake swarm occurred beneath Mammoth Mountain that was probably associated with magmatic intrusion. To improve our understanding of this swarm, we relocated Mammoth Mountain earthquakes using a double-difference algorithm. Relocated hypocenters reveal that most earthquakes occurred on two structures: a near-vertical plane at 7-9 km depth that has been interpreted as an intruding dike, and a circular ring-like structure at ~5.5 km depth, above the northern end of the inferred dike. Earthquakes on this newly discovered ring structure form a conical section that dips outward, away from the aseismic interior. Fault-plane solutions indicate that in 1989 the seismicity ring was slipping as a ring-normal fault as the center of the mountain rose with respect to the surrounding crust. Seismicity migrated around the ring, away from the underlying dike, at a rate of ~0.4 km/month, suggesting that fluid movement triggered seismicity on the ring fault. Copyright 2003 by the American Geophysical Union.

  8. Earthquake source mechanisms and transform fault tectonics in the Gulf of California

    NASA Technical Reports Server (NTRS)

    Goff, John A.; Bergman, Eric A.; Solomon, Sean C.

    1987-01-01

    The source parameters of 19 large earthquakes in the Gulf of California were determined from inversions of long-period P and SH waveforms. The goal was to understand the recent slip history of this dominantly transform boundary between the Pacific and North American plates, as well as the effect on earthquake characteristics of the transition from young oceanic to continental lithosphere. For the better recorded transform events, the fault strike is resolved to ±4° at 90 percent confidence. The slip vectors thus provide important constraints on the direction of relative plate motion. Most centroid depths are poorly resolved because of tradeoffs between depth and source time function. On the basis of waveform modeling, historical seismicity, and other factors, it is appropriate to divide the Gulf into three distinct zones. The difference in seismic character among the three zones is likely the result of differing levels of maturity of the processes of rifting, generation of oceanic crust, and formation of stable oceanic transform faults. The mechanism of an earthquake on the Tres Marias Escarpment is characterized by thrust faulting and likely indicates the direction of relative motion between the Rivera and North American plates. This mechanism requires revision of plate velocity models, which predict strike-slip motion at this location.

  9. Estimated Ground Motion From the 1994 Northridge, California, Earthquake at the Site of the Interstate 10 and La Cienega Boulevard Bridge Collapse, West Los Angeles, California

    Microsoft Academic Search

    David M. Boore; James F. Gibbs; William B. Joyner; John C. Tinsley; Daniel J. Ponti

    2003-01-01

    We have estimated ground motions at the site of a bridge collapse during the 1994 Northridge, California, earthquake. The estimated motions are based on correcting motions recorded during the mainshock 2.3 km from the collapse site for the relative site response of the two sites. Shear-wave slownesses and damping based on analysis of borehole measurements at the two sites were

  10. Introducing ShakeMap to potential users in Puerto Rico using scenarios of damaging historical and probable earthquakes

    NASA Astrophysics Data System (ADS)

    Huerfano, V. A.; Cua, G.; von Hillebrandt, C.; Saffar, A.

    2007-12-01

    The island of Puerto Rico has a long history of damaging earthquakes. Major earthquakes from off-shore sources have affected Puerto Rico in 1520, 1615, 1670, 1751, 1787, 1867, and 1918 (Mueller et al, 2003; PRSN Catalogue). Recent trenching has also yielded evidence of possible M7.0 events inland (Prentice, 2000). The high seismic hazard, large population, high tsunami potential and relatively poor construction practice can result in a potentially devastating combination. Efficient emergency response in the event of a large earthquake will be crucial to minimizing the loss of life and disruption of lifeline systems in Puerto Rico. The ShakeMap system (Wald et al, 2004) developed by the USGS to rapidly display and disseminate information about the geographical distribution of ground shaking (and hence potential damage) following a large earthquake has proven to be a vital tool for post-earthquake emergency response efforts, and is being adopted/emulated in various seismically active regions worldwide. Implementing a robust ShakeMap system is among the top priorities of the Puerto Rico Seismic Network. However, the ultimate effectiveness of ShakeMap in post-earthquake response depends not only on its rapid availability, but also on the effective use of the information it provides. We developed ShakeMap scenarios for a suite of damaging historical and probable earthquakes that severely impact San Juan, Ponce, and Mayagüez, the 3 largest cities in Puerto Rico. Earthquake source parameters were obtained from McCann and Mercado (1998) and Huérfano (2004). For historical earthquakes that generated tsunamis, tsunami inundation maps were generated using the TIME method (Shuto, 1991). 
The ShakeMap ground shaking maps were presented to local and regional governmental and emergency response agencies at the 2007 Annual conference of the Puerto Rico Emergency Management and Disaster Administration in San Juan, PR, and at numerous other emergency management talks and training sessions. Economic losses are estimated using the ShakeMap scenario ground motions (Saffar, 2007). The calibration tasks necessary in generating these scenarios (developing Vs30 maps, attenuation relationships) complement the on-going efforts of the Puerto Rico Seismic Network to generate ShakeMaps in real-time.

  11. Crustal velocities near Coalinga, California, modeled from a combined earthquake/explosion refraction profile

    USGS Publications Warehouse

    Macgregor-Scott, N.; Walter, A.

    1988-01-01

    Crustal velocity structure for the region near Coalinga, California, has been derived from both earthquake and explosion seismic phase data recorded along a NW-SE seismic-refraction profile on the western flank of the Great Valley, east of the Diablo Range. Comparison of the two data sets reveals P-wave phases in common that can be correlated with changes in the velocity structure below the earthquake hypocenters. In addition, the earthquake records reveal secondary phases at station ranges of less than 20 km that could be the result of S- to P-wave conversions at velocity interfaces above the earthquake hypocenters. Two-dimensional ray-trace modeling of the P-wave travel times resulted in a P-wave velocity model for the western flank of the Great Valley comprising: 1) a 7- to 9-km-thick section of sedimentary strata with velocities similar to those found elsewhere in the Great Valley (1.6 to 5.2 km/s); 2) a middle crust extending to about 14 km depth with velocities comparable to those reported for the Franciscan assemblage in the Diablo Range (5.6 to 5.9 km/s); and 3) a 13- to 14-km-thick lower crust with velocities similar to those reported beneath the Diablo Range and the Great Valley (6.5 to 7.3 km/s). This lower crust may have been derived from subducted oceanic crust that was thickened by accretionary underplating or crustal shortening. -Authors

  12. Satellite IR Thermal Measurements Prior to the September 2004 Earthquakes in Central California

    NASA Technical Reports Server (NTRS)

    Ouzounov, D.; Logan, T.; Taylor, Patrick

    2004-01-01

    We present and discuss observed variations in thermal transients and radiation fields prior to the earthquakes of September 18 near Bodie (M5.5) and September 28, 2004 near Parkfield (M6.0) in California. Previous analyses of earthquake events have indicated the presence of a thermal anomaly, where temperatures increased or did not return to their usual nighttime values. The procedure used in our work is to analyze weather satellite data taken at night and to record the general condition in which the ground cools after sunset. Two days before the Bodie earthquake, lower temperature radiation was observed by the NOAA/AVHRR satellite. This occurred when the entire region was relatively cloud-free. IR land-surface nighttime temperature from the MODIS instrument rose to +4 degrees C within a 100 km radius around the Bodie epicenter. The thermal transient field recorded by MODIS in the vicinity of Parkfield, also in a cloud-free environment, was around +1 degree C, significantly smaller than the thermal anomaly around the Bodie epicenter. Ground surface temperature near the Parkfield epicenter, however, showed a steady increase over the 4 days prior to the earthquake and a significant drop the night before the quake. Geosynchronous weather satellite thermal IR measurements taken every half hour from sunset to dawn were also recorded for the 10 days prior to the Parkfield event, the day of the quake, and 5 days after. To establish a baseline we also obtained GOES data for the same Julian days for the three years prior to the Parkfield earthquake. These September 2004 IR data sets were then used to systematically observe and record any thermal anomaly prior to the events that deviated from the baseline. Our results support the hypothesis of a possible relationship between thermodynamic processes produced by increasing tectonic stress in the Earth's crust and a subsequent electro-chemical interaction between the crust and the atmosphere/ionosphere.

  13. Source processes of industrially-induced earthquakes at the Geysers geothermal area, California

    USGS Publications Warehouse

    Ross, A.; Foulger, G.R.; Julian, B.R.

    1999-01-01

    Microearthquake activity at The Geysers geothermal area, California, mirrors the steam production rate, suggesting that the earthquakes are industrially induced. A 15-station network of digital, three-component seismic stations was operated for one month in 1991, and 3,900 earthquakes were recorded. Highly accurate moment tensors were derived for 30 of the best recorded earthquakes by tracing rays through tomographically derived 3-D VP and VP / VS structures, and inverting P- and S-wave polarities and amplitude ratios. The orientations of the P- and T-axes are very scattered, suggesting that there is no strong, systematic deviatoric stress field in the reservoir, which could explain why the earthquakes are not large. Most of the events had significant non-double-couple (non-DC) components in their source mechanisms, with volumetric components up to ~30% of the total moment. Explosive and implosive sources were observed in approximately equal numbers, and must be caused by cavity creation (or expansion) and collapse. It is likely that there is a causal relationship between these processes and fluid reinjection and steam withdrawal. Compensated linear vector dipole (CLVD) components were up to 100% of the deviatoric component. Combinations of opening cracks and shear faults cannot explain all the observations, and rapid fluid flow may also be involved. The pattern of non-DC failure at The Geysers contrasts with that of the Hengill-Grensdalur area in Iceland, a largely unexploited water-dominated field in an extensional stress regime. These differences are poorly understood but may be linked to the contrasting regional stress regimes and the industrial exploitation at The Geysers.
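The non-double-couple percentages quoted above come from splitting each moment tensor into isotropic, DC, and CLVD parts. Below is a minimal sketch of the standard eigenvalue decomposition; the abstract does not state the authors' exact convention, so the percentage definitions here are one common choice.

```python
import numpy as np

def decompose_moment_tensor(M):
    """Split a symmetric 3x3 moment tensor into isotropic (volumetric),
    double-couple (DC), and compensated-linear-vector-dipole (CLVD)
    percentages via the standard eigenvalue decomposition."""
    iso = np.trace(M) / 3.0
    dev = M - iso * np.eye(3)
    ev = np.linalg.eigvalsh(dev)
    m = ev[np.argsort(np.abs(ev))]                 # |m0| <= |m1| <= |m2|
    eps = -m[0] / abs(m[2]) if m[2] != 0 else 0.0  # 0 = pure DC, +/-0.5 = CLVD
    clvd_pct = 200.0 * abs(eps)                    # share of deviatoric moment
    dc_pct = 100.0 - clvd_pct
    total = abs(iso) + abs(m[2])
    iso_pct = 100.0 * abs(iso) / total if total else 0.0
    return iso_pct, dc_pct, clvd_pct

dc = np.array([[0., 1., 0.],                       # pure strike-slip DC
               [1., 0., 0.],
               [0., 0., 0.]])
clvd = np.diag([2., -1., -1.])                     # pure CLVD (eps = 0.5)
```

With these conventions, `dc` decomposes to 100% DC and `clvd` to 100% CLVD, the two end members between which the observed Geysers mechanisms fall.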

  14. California takes earthquakes very seriously. The state straddles two major tectonic plates and is subject to relatively frequent, often major, potentially devastating quakes.

    E-print Network

    California takes earthquakes very seriously. The state straddles two major tectonic plates … the Pacific and North American tectonic plates. The Pacific Plate includes a sliver of California and Baja California, as well as Hawaii and most of the Pacific Ocean, while the North American Plate includes

  15. Paleoseismology of the Johnson Valley, Kickapoo, and Homestead Valley Faults: Clustering of Earthquakes in the Eastern California Shear Zone

    Microsoft Academic Search

    T. K. Rockwell; S. Lindvall; M. Herzberg; D. Murbach; T. Dawson; G. Berger

    2000-01-01

    Paleoseismic data from 11 trenches at seven sites excavated across the southern Johnson Valley, Kickapoo, and Homestead Valley faults that ruptured in the 1992 Landers earthquake, as well as the northern Johnson Valley fault which did not fail in 1992, indicate that the return period for large surface rupturing events in this part of the eastern California shear zone is

  16. Fault structure and kinematics of the Long Valley Caldera region, California, revealed by high-accuracy earthquake hypocenters and

    E-print Network

    Waldhauser, Felix

    Fault structure and kinematics of the Long Valley Caldera region, California, revealed by high-accuracy earthquake hypocenters … that occurred between 1980 and 2000 in the Long Valley caldera area using a double-difference earthquake … planes in the southern caldera and adjacent Sierra Nevada block (SNB). Intracaldera faults include …

  17. A Public Health Issue Related To Collateral Seismic Hazards: The Valley Fever Outbreak Triggered By The 1994 Northridge, California Earthquake

    NASA Astrophysics Data System (ADS)

    Jibson, Randall W.

    Following the 17 January 1994 Northridge, California earthquake (M = 6.7), Ventura County, California, experienced a major outbreak of coccidioidomycosis (CM), commonly known as valley fever, a respiratory disease contracted by inhaling airborne fungal spores. In the 8 weeks following the earthquake (24 January through 15 March), 203 outbreak-associated cases were reported, which is about an order of magnitude more than the expected number of cases, and three of these cases were fatal. Simi Valley, in easternmost Ventura County, had the highest attack rate in the county, and the attack rate decreased westward across the county. The temporal and spatial distribution of CM cases indicates that the outbreak resulted from inhalation of spore-contaminated dust generated by earthquake-triggered landslides. Canyons northeast of Simi Valley produced many highly disrupted, dust-generating landslides during the earthquake and its aftershocks. Winds after the earthquake were from the northeast, which transported dust into Simi Valley and beyond to communities to the west. The three fatalities from the CM epidemic accounted for 4 percent of the total earthquake-related fatalities.

  18. A public health issue related to collateral seismic hazards: The valley fever outbreak triggered by the 1994 Northridge, California earthquake

    USGS Publications Warehouse

    Jibson, R.W.

    2002-01-01

    Following the 17 January 1994 Northridge, California earthquake (M = 6.7), Ventura County, California, experienced a major outbreak of coccidioidomycosis (CM), commonly known as valley fever, a respiratory disease contracted by inhaling airborne fungal spores. In the 8 weeks following the earthquake (24 January through 15 March), 203 outbreak-associated cases were reported, which is about an order of magnitude more than the expected number of cases, and three of these cases were fatal. Simi Valley, in easternmost Ventura County, had the highest attack rate in the county, and the attack rate decreased westward across the county. The temporal and spatial distribution of CM cases indicates that the outbreak resulted from inhalation of spore-contaminated dust generated by earthquake-triggered landslides. Canyons northeast of Simi Valley produced many highly disrupted, dust-generating landslides during the earthquake and its aftershocks. Winds after the earthquake were from the northeast, which transported dust into Simi Valley and beyond to communities to the west. The three fatalities from the CM epidemic accounted for 4 percent of the total earthquake-related fatalities.

  19. Holocene paleoseismicity, temporal clustering, and probabilities of future large (M > 7) earthquakes on the Wasatch fault zone, Utah

    USGS Publications Warehouse

    McCalpin, J.P.; Nishenko, S.P.

    1996-01-01

    The chronology of M>7 paleoearthquakes on the central five segments of the Wasatch fault zone (WFZ) is one of the best dated in the world and contains 16 earthquakes in the past 5600 years with an average repeat time of 350 years. Repeat times for individual segments vary by a factor of 2, and range from about 1200 to 2600 years. Four of the central five segments ruptured between ~620±30 and 1230±60 calendar years B.P. The remaining segment (Brigham City segment) has not ruptured in the past 2120±100 years. Comparison of the WFZ space-time diagram of paleoearthquakes with synthetic paleoseismic histories indicates that the observed temporal clusters and gaps have about an equal probability (depending on model assumptions) of reflecting random coincidence as opposed to intersegment contagion. Regional seismicity suggests that for exposure times of 50 and 100 years, the probability for an earthquake of M>7 anywhere within the Wasatch Front region, based on a Poisson model, is 0.16 and 0.30, respectively. A fault-specific WFZ model predicts 50 and 100 year probabilities for a M>7 earthquake on the WFZ itself, based on a Poisson model, as 0.13 and 0.25, respectively. In contrast, segment-specific earthquake probabilities that assume quasi-periodic recurrence behavior on the Weber, Provo, and Nephi segments are less (0.01-0.07 in 100 years) than the regional or fault-specific estimates (0.25-0.30 in 100 years), due to the short elapsed times compared to average recurrence intervals on those segments. The Brigham City and Salt Lake City segments, however, have time-dependent probabilities that approach or exceed the regional and fault specific probabilities. For the Salt Lake City segment, these elevated probabilities are due to the elapsed time being approximately equal to the average late Holocene recurrence time. For the Brigham City segment, the elapsed time is significantly longer than the segment-specific late Holocene recurrence time.
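The Poisson probabilities above follow from P = 1 − exp(−λt), so the paired 50- and 100-yr numbers can be cross-checked directly. A quick sketch (the function names are ours, not from the paper):

```python
import math

def poisson_prob(rate_per_yr, t_yr):
    """P(at least one event in t years) for a Poisson process."""
    return 1.0 - math.exp(-rate_per_yr * t_yr)

def rate_from_prob(p, t_yr):
    """Invert an exposure-time probability back to an annual rate."""
    return -math.log(1.0 - p) / t_yr

# Regional Wasatch Front estimate: P(M>7 in 50 yr) = 0.16
lam = rate_from_prob(0.16, 50.0)
p100 = poisson_prob(lam, 100.0)       # ~0.294, consistent with the quoted 0.30

# Fault-specific WFZ estimate: P(50 yr) = 0.13
p100_wfz = poisson_prob(rate_from_prob(0.13, 50.0), 100.0)  # ~0.243 (~0.25)
```

Because exposure windows compound as 1 − (1 − P50)², the quoted 50- and 100-yr pairs are internally consistent.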

  20. Frequency-magnitude statistics and spatial correlation dimensions of earthquakes at Long Valley caldera, California

    USGS Publications Warehouse

    Barton, D.J.; Foulger, G.R.; Henderson, J.R.; Julian, B.R.

    1999-01-01

    Intense earthquake swarms at Long Valley caldera in late 1997 and early 1998 occurred on two contrasting structures. The first is defined by the intersection of a north-northwesterly array of faults with the southern margin of the resurgent dome, and is a zone of hydrothermal upwelling. Seismic activity there was characterized by high b-values and relatively low values of D, the spatial fractal dimension of hypocentres. The second structure is the pre-existing South Moat fault, which has generated large-magnitude seismic activity in the past. Seismicity on this structure was characterized by low b-values and relatively high D. These observations are consistent with low-magnitude, clustered earthquakes on the first structure, and higher-magnitude, diffuse earthquakes on the second structure. The first structure is probably an immature fault zone, fractured on a small scale and lacking a well-developed fault plane. The second zone represents a mature fault with an extensive, coherent fault plane.
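The b-values contrasted above are conventionally estimated with Aki's maximum-likelihood formula, b = log10(e) / (mean(M) − Mc); the abstract does not say which estimator the authors used, so the sketch below assumes this standard one.

```python
import math
import random

def b_value_mle(mags, mc, dm=0.1):
    """Aki (1965) maximum-likelihood b-value for magnitudes >= mc,
    with Utsu's dm/2 correction for magnitudes binned at interval dm."""
    m = [x for x in mags if x >= mc]
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - (mc - dm / 2.0))

# Synthetic catalog with true b = 1 (exponential magnitude excess above Mc)
random.seed(1)
beta = math.log(10) * 1.0                      # beta = b * ln(10)
cat = [2.0 + random.expovariate(beta) for _ in range(5000)]
b = b_value_mle(cat, 2.0, dm=0.0)              # continuous magnitudes: dm = 0
```

High b on the immature hydrothermal structure versus low b on the South Moat fault corresponds to mean magnitudes closer to, or further above, the completeness threshold Mc.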

  1. Turbidity anomaly and probability of slope failure following the 2011 Great Tohoku Earthquake

    NASA Astrophysics Data System (ADS)

    Noguchi, T.; Tanikawa, W.; Hirose, T.; Lin, W.; Kawagucci, S.; Yoshida, Y.; Honda, M. C.; Takai, K.; Kitazato, H.; Okamura, K.

    2011-12-01

    Turbidity anomalies at the seafloor are often observed immediately after earthquakes (Thunnell et al., 1999; Mikada et al., 2006). Such deep-sea turbidity anomalies are thought to result from seismically induced landslides on trench slopes. Turbidity distribution was observed using a turbidity meter (Seapoint Sensors Inc.) in the mainshock area of the 2011 off the Pacific coast of Tohoku earthquake (Mw 9.0) one month after the event. Turbidity anomalies, in which the turbidity increased with depth, were observed near the seafloor at all four sites. The thickness of the anomalous zones increased with water depth; the thickness at station B, the deepest measurement site, was about 1300 m above the seafloor, and the average particle concentration (equivalent to turbidity) in the zone was 1.5 mg/L. We analyzed the mineral composition and grain-size distribution of the suspended particles collected one month after the earthquake and of a shallow sediment core collected before the earthquake in the mainshock area. The grain size of the suspended particles ranged from 1 to 300 μm, and XRD analysis confirmed the presence of chlorite, illite, quartz, and albite in the particles. These characteristics are similar to the subsurface sediment material. Earlier studies (Prior, 1984) introduced a mathematical model for the analysis of submarine slope stability that includes the effects of vertical and horizontal seismic accelerations caused by an earthquake. We analyzed slope instability on the basis of this model, using the physical properties (density and shear strength) of the shallow sediment core materials and the acceleration of the 2011 off the Pacific coast of Tohoku earthquake. Our results show that a submarine landslide can be induced by a very large ground acceleration, as high as 3 m/s2, even if the sediment layer on the sliding surface is not very thick. 
We interpret the high turbidity observed one month after the Tohoku earthquake as the result of thin submarine landsliding. 
References: Mikada, H., K. Mitsuzawa, H. Matsumoto, T. Watanabe, S. Morita, R. Otsuka, H. Sugioka, T. Baba, E. Araki, and K. Suyehiro (2006), New discoveries in dynamics of an M8 earthquake: phenomena and their implications from the 2003 Tokachi-oki earthquake using a long-term monitoring cabled observatory, Tectonophysics, 426, 95-105. Prior, D. B. (1984), Methods of stability analysis, in Slope Instability, edited by D. Brunsden and D. B. Prior, pp. 419-455, Wiley, New York. Thunnell, R., E. Tappa, R. Varela, M. Llano, Y. Astor, F. Muller-Karger, and R. Bohrer (1999), Increased marine sediment suspension and fluxes following an earthquake, Nature, 398, 233-236.
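The style of stability analysis described above, with a horizontal seismic acceleration term, can be illustrated with a generic pseudostatic infinite-slope factor of safety. This formulation and all parameter values below are illustrative stand-ins, not the Prior (1984) equations or the paper's measured core properties.

```python
import math

def pseudostatic_fs(su_kpa, gamma_kn_m3, z_m, beta_deg, kh):
    """Pseudostatic factor of safety for an infinite submarine slope under
    undrained (total-stress) conditions.
      su_kpa      : undrained shear strength of the sliding layer
      gamma_kn_m3 : buoyant unit weight of the sediment
      z_m         : depth of the potential sliding surface
      beta_deg    : slope angle
      kh          : horizontal seismic coefficient (a_h / g)
    Generic textbook formulation standing in for the Prior (1984) model."""
    b = math.radians(beta_deg)
    static_shear = gamma_kn_m3 * z_m * math.sin(b) * math.cos(b)
    seismic_shear = kh * gamma_kn_m3 * z_m * math.cos(b) ** 2
    return su_kpa / (static_shear + seismic_shear)

# Illustrative soft shallow sediment on a gentle trench slope;
# a_h = 3 m/s^2 (as in the abstract) gives kh ~ 0.31
fs_shaking = pseudostatic_fs(su_kpa=4.0, gamma_kn_m3=7.0, z_m=2.0,
                             beta_deg=4.0, kh=3.0 / 9.81)   # < 1: failure
fs_static = pseudostatic_fs(4.0, 7.0, 2.0, 4.0, 0.0)        # > 1: stable
```

The point mirrors the abstract's conclusion: a thin, weak layer that is stable statically can fail once a large horizontal acceleration is added to the driving shear.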

  2. Probable Earthquake Archaeological Effects in the ancient pyramids of Quetzalcóatl and Sun in Teotihuacán (Central Mexico)

    NASA Astrophysics Data System (ADS)

    Perez-Lopez, Raul; Rodríguez-Pascua, Miguel Angel; Garduño-Monroy, Victor Hugo; Oliveros, Arturo; Giner-Robles, Jorge L.; Silva, Pablo G.

    2010-05-01

    Teotihuacán was one of the flourishing and greatest cities of the Prehispanic cultural period in the central valley of México and one of the best archaeological finds on Earth. During its period of splendour (Middle-Late Classic Period, 350-650 AD), almost 125,000 inhabitants lived in a vast city with more than 2000 stucco and block buildings, including the great religious and ceremonial pyramids: the Great Sun Pyramid, built between 1 and 150 AD; the Moon Pyramid, built over a long time span (1-650 AD); and the outstanding Quetzalcóatl Pyramid (Feathered Snake Temple), built in two phases: the original edifice, built before 350 AD, and a second phase, mainly repairs of the west side, dated post-350 AD. The Quetzalcóatl Pyramid (Q-pyramid) has a quadrangular base of ca. 3500 m2 with an extraordinary decoration of feathered snakes (attributed to the god Quetzalcóatl) and lizards. The second phase of construction consisted of a townhouse façade covering the west side of the pyramid (post-350 AD), with no evidence so far to justify such an annexed wrapper on this west side. This ceremonial building stands within the Citadel, a complex area of Teotihuacán that also contains residential and common zones (i.e., a market). A detailed view of the steps of the west-side stairs displays different patterns of deformation affecting the blocks of the stair. The original, ancient stair exhibits rotated, overturned and displaced blocks, with the deformation strongest at the base of the pyramid. Moreover, the upper corners of the blocks appear broken in a form similar to the seismic-related feature defined as dipping broken corners or chipped corners. However, the horizontal disposition of the blocks suggests lateral vibration between them from horizontal shaking propagation. Besides, this feature appears less evident on the lower blocks of the annexed west façade, the only originally preserved ones. 
We have carried out a systematic measurement of this feature across the original west stairs of the Q-pyramid and the first stair level of the Sun Pyramid. Furthermore, these horizontal dipping broken corners were also described on the new stairs of the annexed façade of the Q-pyramid. This suggests that seismic shaking could have produced this deformation, with a relative date of 350 AD post quem. More data are necessary to properly test the earthquake occurrence and to bracket a probable intensity value.

  3. Probability

    NSDL National Science Digital Library

    Ms. Thompson

    2008-12-01

    This is an introduction to probability. This project allows students to explore probability, and the following days give students a further look into it. Today, you are going to experiment with probability. Go to the site An Introduction into Probability and read the first three sections; the last section is the one with the picture of the coins. After you have read a little bit about probability, you now get to explore probability through some games. Start ...

  4. Data and Visualizations in the Southern California Earthquake Center's Fault Information System

    NASA Astrophysics Data System (ADS)

    Perry, S.

    2003-12-01

    The Southern California Earthquake Center's Fault Information System (FIS) provides a single point of access to fault-related data and models from multiple databases and datasets. The FIS is built of computer code, metadata and Web interfaces based on Web services technology, which enables queries and data interchange irrespective of computer software or platform. Currently we have working prototypes of programmatic and browser-based access. The first generation FIS may be searched and downloaded live, by automated processes, as well as interactively, by humans using a browser. Users get ascii data in plain text or encoded in XML. Via the Earthquake Information Technology (EIT) Interns (Juve and others, this meeting), we are also testing the effectiveness of querying multiple databases using a fault database ontology. For more than a decade, the California Geological Survey (CGS), SCEC, and the U. S. Geological Survey (USGS) have put considerable, shared resources into compiling and assessing published fault data, then providing the data on the Web. Several databases now exist, with different formats, datasets, purposes, and users, in various stages of completion. When fault databases were first envisioned, the full power of today's internet was not yet recognized, and the databases became the Web equivalents of review papers, where one could read an overview summation of a fault, then copy and paste pertinent data. Today, numerous researchers also require rapid queries and downloads of data. Consequently, the first components of the FIS are MySQL databases that deliver numeric values from earlier, text-based databases. Another essential service provided by the FIS is visualizations of fault representations such as those in SCEC's Community Fault Model. The long term goal is to provide a standardized, open-source, platform-independent visualization technique. 
Currently, the FIS makes available fault model viewing software for users with access to Matlab or Java3D. The latter is the interactive LA3D software of the SCEC EIT intern team, which will be demonstrated at this session.

  5. Comparison of four moderate-size earthquakes in southern California using seismology and InSAR

    USGS Publications Warehouse

    Mellors, R.J.; Magistrale, H.; Earle, P.; Cogbill, A.H.

    2004-01-01

    Source parameters determined from interferometric synthetic aperture radar (InSAR) measurements and from seismic data are compared from four moderate-size (less than M 6) earthquakes in southern California. The goal is to verify approximate detection capabilities of InSAR, assess differences in the results, and test how the two results can be reconciled. First, we calculated the expected surface deformation from all earthquakes greater than magnitude 4 in areas with available InSAR data (347 events). A search for deformation from the events in the interferograms yielded four possible events with magnitudes less than 6. The search for deformation was based on a visual inspection as well as cross-correlation in two dimensions between the measured signal and the expected signal. A grid-search algorithm was then used to estimate focal mechanism and depth from the InSAR data. The results were compared with locations and focal mechanisms from published catalogs. An independent relocation using seismic data was also performed. The seismic locations fell within the area of the expected rupture zone for the three events that show clear surface deformation. Therefore, the technique shows the capability to resolve locations with high accuracy and is applicable worldwide. The depths determined by InSAR agree with well-constrained seismic locations determined in a 3D velocity model. Depth control for well-imaged shallow events using InSAR data is good, and better than the seismic constraints in some cases. A major difficulty for InSAR analysis is the poor temporal coverage of InSAR data, which may make it impossible to distinguish deformation due to different earthquakes at the same location.
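The grid-search estimation of depth and focal mechanism from InSAR data can be sketched generically: scan candidate parameters, run a forward model for each, and keep the minimum-misfit combination. The toy forward model below stands in for a real elastic-dislocation (Okada-type) calculation; only the search logic reflects the approach named in the abstract.

```python
import numpy as np

def forward(depth_km, rake_deg, x_km):
    """Toy line-of-sight displacement vs. distance from the epicenter.
    Illustrative stand-in for an elastic half-space dislocation model."""
    ang = np.radians(rake_deg)
    r = np.hypot(x_km, depth_km)
    return (np.cos(ang) + 0.5 * np.sin(ang)) * depth_km / r**2

def grid_search(obs, x_km):
    """Scan depth and rake exhaustively; keep the minimum-RMS-misfit pair."""
    best = (None, None, np.inf)
    for depth in np.arange(1.0, 15.0, 0.5):
        for rake in np.arange(-90.0, 91.0, 5.0):
            rms = np.sqrt(np.mean((forward(depth, rake, x_km) - obs) ** 2))
            if rms < best[2]:
                best = (depth, rake, rms)
    return best

x = np.linspace(-20.0, 20.0, 41)
obs = forward(6.0, 30.0, x)              # synthetic "InSAR" observations
depth, rake, rms = grid_search(obs, x)   # recovers depth=6.0, rake=30.0
```

With noise-free synthetic data the search recovers the true parameters exactly; with real interferograms the misfit surface would be used to assess depth resolution, as the study does.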

  6. Rupture propagation of the 2004 Parkfield, California, earthquake from observations at the UPSAR

    USGS Publications Warehouse

    Fletcher, Joe B.; Spudich, P.; Baker, L.M.

    2006-01-01

    Using a short-baseline seismic array (U.S. Geological Survey Parkfield Dense Seismograph Array [UPSAR]) about 12 km west of the rupture initiation of the 28 September 2004 M 6.0 Parkfield, California, earthquake, we have observed the movement of the rupture front of this earthquake on the San Andreas fault. The sources of high-frequency arrivals at UPSAR, which we use to identify the rupture front, are mapped onto the San Andreas fault using their apparent velocity and back azimuth. Measurements of apparent velocity and back azimuth are calibrated using aftershocks, which have a compact source and known location. Aftershock back azimuths show considerable lateral refraction, consistent with a high-velocity ridge on the southwest side of the fault. We infer that the initial mainshock rupture velocity was approximately the Rayleigh speed (with respect to the slower side of the fault), and the rupture then slowed to about 0.66 times the shear-wave speed near the town of Parkfield after 2 sec. The last well-correlated pulse, 4 sec after S, is the largest at UPSAR, and its source is near the region of large accelerations recorded by strong-motion accelerographs and close to the northern extent of continuous surface fractures on the southwest fracture zone. The coincidence of sources with preshock and aftershock distributions suggests that fault material properties control rupture behavior. High-frequency sources approximately correlate with the edges of asperities identified as regions of high slip derived from inversion of strong-motion waveforms.

  7. Triggered reverse fault and earthquake due to crustal unloading, northwest Transverse Ranges, California.

    USGS Publications Warehouse

    Yerkes, R.F.; Ellsworth, W.L.; Tinsley, J.C.

    1983-01-01

    A reverse-right-oblique surface rupture, associated with a ML 2.5 earthquake, formed in a diatomite quarry near Lompoc, California, in the northwesternmost Transverse Ranges on April 7, 1981. The 575-m-long narrow zone of ruptures formed in clay interbeds in diatomite and diatomaceous shale of the Neogene Monterey Formation. The ruptures parallel bedding, dip 39°-59° S, and trend about N84°E on the north limb of an open symmetrical syncline. Maximum net slip was 25 cm; maximum reverse dip slip was 23 cm, maximum right-lateral strike slip was about 9 cm, and average net slip was about 12 cm. The seismic moment of the earthquake is estimated at 1 to 2 × 10^18 dyne-cm and the static stress drop at about 3 bar. The removal of an average of about 44 m of diatomite resulted in an average load reduction of about 5 bar, which decreased the normal stress by about 3.5 bar and increased the shear stress on the tilted bedding plane by about 2 bar. The April 7, 1981, event was a very shallow bedding-plane rupture, apparently triggered by crustal unloading. -Authors

  8. The 1868 Hayward Earthquake Alliance: A Case Study - Using an Earthquake Anniversary to Promote Earthquake Preparedness

    NASA Astrophysics Data System (ADS)

    Brocher, T. M.; Garcia, S.; Aagaard, B. T.; Boatwright, J. J.; Dawson, T.; Hellweg, M.; Knudsen, K. L.; Perkins, J.; Schwartz, D. P.; Stoffer, P. W.; Zoback, M.

    2008-12-01

    Last October 21st marked the 140th anniversary of the M6.8 1868 Hayward Earthquake, the last damaging earthquake on the southern Hayward Fault. This anniversary was used to help publicize the seismic hazards associated with the fault because: (1) the past five such earthquakes on the Hayward Fault occurred about 140 years apart on average, and (2) the Hayward-Rodgers Creek Fault system is the most likely (with a 31 percent probability) fault in the Bay Area to produce a M6.7 or greater earthquake in the next 30 years. To promote earthquake awareness and preparedness, over 140 public and private agencies and companies and many individuals joined the public-private nonprofit 1868 Hayward Earthquake Alliance (1868alliance.org). The Alliance sponsored many activities including a public commemoration at Mission San Jose in Fremont, which survived the 1868 earthquake. This event was followed by an earthquake drill at Bay Area schools involving more than 70,000 students. The anniversary prompted the Silver Sentinel, an earthquake response exercise based on the scenario of an earthquake on the Hayward Fault conducted by Bay Area County Offices of Emergency Services. 60 other public and private agencies also participated in this exercise. The California Seismic Safety Commission and KPIX (CBS affiliate) produced professional videos designed for school classrooms promoting Drop, Cover, and Hold On. Starting in October 2007, the Alliance and the U.S. Geological Survey held a sequence of press conferences to announce the release of new research on the Hayward Fault as well as new loss estimates for a Hayward Fault earthquake. These included: (1) a ShakeMap for the 1868 Hayward earthquake, (2) a report by the U. S.
Bureau of Labor Statistics forecasting the number of employees, employers, and wages predicted to be within areas most strongly shaken by a Hayward Fault earthquake, (3) new estimates of the losses associated with a Hayward Fault earthquake, (4) new ground motion simulations of a Hayward Fault earthquake, (5) a new USGS Fact Sheet about the earthquake and the Hayward Fault, (6) a virtual tour of the 1868 earthquake, and (7) a new online field trip guide to the Hayward Fault using locations accessible by car and public transit. Finally, the California Geological Survey and many other Alliance members sponsored the Third Conference on Earthquake Hazards in the East Bay at CSU East Bay in Hayward for the three days following the 140th anniversary. The 1868 Alliance hopes to commemorate the anniversary of the 1868 Hayward Earthquake every year to maintain and increase public awareness of this fault, the hazards it and other East Bay Faults pose, and the ongoing need for earthquake preparedness and mitigation.

  9. Crustal refraction profile of the Long Valley caldera, California, from the January 1983 Mammoth Lakes earthquake swarm

    USGS Publications Warehouse

    Luetgert, James H.; Mooney, Walter D.

    1985-01-01

    Seismic-refraction profiles recorded north of Mammoth Lakes, California, using earthquake sources from the January 1983 swarm complement earlier explosion refraction profiles and provide velocity information from deeper in the crust in the area of the Long Valley caldera. Eight earthquakes from a depth range of 4.9 to 8.0 km confirm the observation of basement rocks with seismic velocities ranging from 5.8 to 6.4 km/sec extending at least to depths of 20 km. The data provide further evidence for the existence of a partial melt zone beneath Long Valley caldera and constrain its geometry.

  10. Probability

    NSDL National Science Digital Library

    Mrs. Marsh

    2006-11-16

    Using different probabilities, help the Cybersquad clean up the buggy mess. Which bugs will probability show up? Try out all the probability possibilities. Find the answers to these ratio activities. Practice ratios and proportions. ...

  11. Earthquake Myths

    NSDL National Science Digital Library

    This site serves to belie several popular myths about earthquakes. Students will learn that most earthquakes do not occur in the early morning and one cannot be swallowed up by an earthquake. In addition, there is no such thing as earthquake weather and California is not falling into the ocean. On the more practical side, students can learn that good building codes do not ensure good buildings, it is safer under a table than in a doorway during an earthquake, and most people do not panic during an earthquake.

  12. Monitoring of the stress state variations of the Southern California for the purpose of earthquake prediction

    NASA Astrophysics Data System (ADS)

    Gokhberg, M.; Garagash, I.; Bondur, V.; Steblov, G. M.

    2014-12-01

    The three-dimensional geomechanical model of Southern California was developed, including mountain relief, fault tectonics, and characteristic internal features such as the roof of the consolidated crust and the Moho surface. The initial stress state of the model is governed by gravitational forces and by horizontal tectonic motions estimated from GPS observations. The analysis shows that three-dimensional geomechanical models allow monitoring of changes in the stress state during the seismic process in order to constrain the locations of future increases in seismic activity. This investigation demonstrates one possible approach to monitoring upcoming seismicity over periods of days to weeks to months. Continuous analysis of the stress state was carried out during 2009-2014. Each new earthquake with M ~ 1 and above from the USGS catalog was considered as a new defect of the Earth's crust, which has some definite size and causes redistribution of the stress state. The overall calculation technique was based on a single damage function of the Earth's crust, recalculated each half month. As a result, each half month in the upper crustal layers, and partially in the middle layers, we revealed the locations of the maximal values of the stress-state parameters: elastic energy density, shear stress, and proximity of the crustal layers to their strength limit. All these parameters exhibit similar spatial and temporal distributions. The observations show that all four strongest events with M ~ 5.5-7.2 that occurred in Southern California during the analyzed period were preceded by anomalies in these parameters, with lead times of weeks to months, within 10-50 km of the upcoming earthquake. After each event the stress-state anomaly disappeared. The figure shows the migration of the maxima of the gradients of the stress-state variations (parameter D) in the vicinity of the epicenter of the 04.04.2010 M = 7.2 earthquake during the period 01.01.2010-01.05.2010. Grey lines show the major faults. In the table the values are sampled every 2 weeks; "-" indicates time before the event, "+" indicates time after the event.

  13. Southern California permanent GPS geodetic array: Spatial filtering of daily positions for estimating coseismic and postseismic displacements induced by the 1992 Landers earthquake

    Microsoft Academic Search

    Shimon Wdowinski; Yehuda Bock; Jie Zhang; Peng Fang; Joachim Genrich

    1997-01-01

    The June 28, 1992 (MW=7.3) Landers, California, earthquake was the first earthquake to be surveyed by a continuously operating Global Positioning System (GPS) array. The coordinate time series of seven sites are evaluated for station displacements during an interval of 100 days centered on the day of the earthquake. We employ a new spatial filtering technique that removes common-mode errors

  14. Fault structure and mechanics of the Hayward Fault, California from double-difference earthquake locations

    USGS Publications Warehouse

    Waldhauser, F.; Ellsworth, W.L.

    2002-01-01

    The relationship between small-magnitude seismicity and large-scale crustal faulting along the Hayward Fault, California, is investigated using a double-difference (DD) earthquake location algorithm. We used the DD method to determine high-resolution hypocenter locations of the seismicity that occurred between 1967 and 1998. The DD technique incorporates catalog travel time data and relative P and S wave arrival time measurements from waveform cross correlation to solve for the hypocentral separation between events. The relocated seismicity reveals a narrow, near-vertical fault zone at most locations. This zone follows the Hayward Fault along its northern half and then diverges from it to the east near San Leandro, forming the Mission trend. The relocated seismicity is consistent with the idea that slip from the Calaveras Fault is transferred over the Mission trend onto the northern Hayward Fault. The Mission trend is not clearly associated with any mapped active fault as it continues to the south and joins the Calaveras Fault at Calaveras Reservoir. In some locations, discrete structures adjacent to the main trace are seen, features that were previously hidden in the uncertainty of the network locations. The fine structure of the seismicity suggests that the fault surface on the northern Hayward Fault is curved or that the events occur on several substructures. Near San Leandro, where the more westerly striking trend of the Mission seismicity intersects with the surface trace of the (aseismic) southern Hayward Fault, the seismicity remains diffuse after relocation, with strong variation in focal mechanisms between adjacent events indicating a highly fractured zone of deformation. The seismicity is highly organized in space, especially on the northern Hayward Fault, where it forms horizontal, slip-parallel streaks of hypocenters only a few tens of meters wide, bounded by areas almost devoid of seismic activity.
During the interval from 1984 to 1998, when digital waveforms are available, we find that fewer than 6.5% of the earthquakes can be classified as repeating earthquakes, events that rupture the same fault patch more than once. These are most commonly located in the shallow creeping part of the fault, or within the streaks at greater depth. The slow repeat rate of 2-3 times within the 15-year observation period for events with magnitudes around M = 1.5 is indicative of a low slip rate or a high stress drop. The absence of microearthquakes over large, contiguous areas of the northern Hayward Fault plane in the depth interval from ~5 to 10 km and the concentrations of seismicity at these depths suggest that the aseismic regions are either locked or retarded and are storing strain energy for release in future large-magnitude earthquakes.
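
The core of the double-difference idea — that differential arrival times at common stations constrain the hypocentral separation between nearby events — can be illustrated with a toy linearized inversion. The uniform velocity, station geometry, and event offset below are invented; the real algorithm handles thousands of events in three dimensions with both catalog and cross-correlation data:

```python
# Toy sketch of the double-difference principle (not the hypoDD code):
# differences in arrival times of two nearby events at common stations
# constrain their relative offset, largely cancelling path effects.
import numpy as np

v = 5.0  # km/s, uniform half-space velocity (an assumption for this sketch)
stations = np.array([[30.0, 0.0], [0.0, 25.0], [-20.0, 10.0], [15.0, -30.0]])
x1 = np.array([0.0, 0.0])        # reference event (epicentral coordinates)
x2 = np.array([0.6, -0.4])       # nearby event: the separation to recover

def tt(x):
    """Travel times from a source at x to all stations."""
    return np.linalg.norm(stations - x, axis=1) / v

dt = tt(x2) - tt(x1)             # "measured" differential arrival times

# Linearize: dt_k ~ g_k . dx, with g_k the travel-time gradient at x1
g = (x1 - stations) / (np.linalg.norm(stations - x1, axis=1)[:, None] * v)
dx, *_ = np.linalg.lstsq(g, dt, rcond=None)
print("recovered separation:", dx)   # close to (0.6, -0.4)
```

Because both rays to each station traverse nearly the same structure, velocity-model errors mostly cancel in `dt`, which is why the relocated streaks sharpen so dramatically.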

  15. Probability

    NSDL National Science Digital Library

    2011-01-18

    This application demonstrates simple probability concepts by having students rank the probability of an event on a probability line (from impossible to certain). After several trials the application then allows students to complete a simulation and collect data based on the probability task (retrieving balls from a machine). Several guiding questions are provided throughout the activity to encourage student dialogue.
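
A minimal simulation in the spirit of the activity described above — estimating a probability by repeated draws from a machine. The 3-red/7-blue composition is hypothetical:

```python
# Estimate a probability by simulation: draw repeatedly from a
# hypothetical machine holding 3 red and 7 blue balls.
import random

random.seed(1)
machine = ["red"] * 3 + ["blue"] * 7   # assumed contents, P(red) = 0.30

trials = 10_000
reds = sum(random.choice(machine) == "red" for _ in range(trials))
print(f"estimated P(red) = {reds / trials:.2f}  (theory: 0.30)")
```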

  16. Residual analysis methods for space--time point processes with applications to earthquake forecast models in California

    E-print Network

    Clements, Robert Alan; Schorlemmer, Danijel; 10.1214/11-AOAS487

    2012-01-01

    Modern, powerful techniques for the residual analysis of spatial-temporal point process models are reviewed and compared. These methods are applied to California earthquake forecast models used in the Collaboratory for the Study of Earthquake Predictability (CSEP). Assessments of these earthquake forecasting models have previously been performed using simple, low-power means such as the L-test and N-test. We instead propose residual methods based on rescaling, thinning, superposition, weighted K-functions and deviance residuals. Rescaled residuals can be useful for assessing the overall fit of a model, but as with thinning and superposition, rescaling is generally impractical when the conditional intensity λ is volatile. While residual thinning and superposition may be useful for identifying spatial locations where a model fits poorly, these methods have limited power when the modeled conditional intensity assumes extremely low or high values somewhere in the observation region, and this is commonly t...
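
Residual thinning, one of the methods proposed above, can be sketched in a few lines: if the fitted conditional intensity is correct, retaining each observed point with probability min(λ)/λ(point) should leave an approximately homogeneous Poisson process. The one-dimensional intensity and simulated catalog below are synthetic:

```python
# Sketch of residual thinning for a point process on [0, 1].
# If the model intensity is right, the thinned points look uniform.
import numpy as np

rng = np.random.default_rng(2)

lam = lambda x: 50.0 + 100.0 * x          # "true" (and here, fitted) intensity
lam_max = 150.0

# simulate the inhomogeneous process by thinning a homogeneous one
t = np.sort(rng.uniform(0, 1, rng.poisson(lam_max)))
pts = t[rng.uniform(0, lam_max, t.size) < lam(t)]

# residual thinning under the fitted model
lam_min = 50.0
kept = pts[rng.uniform(0, 1, pts.size) < lam_min / lam(pts)]

# the thinned points should look uniform: compare counts in the two halves
n_lo, n_hi = np.sum(kept < 0.5), np.sum(kept >= 0.5)
print("thinned counts (low/high half):", n_lo, n_hi)
```

The abstract's caveat is visible here: when λ is very large somewhere, min(λ)/λ becomes tiny and almost no points survive, so the test loses power.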

  17. Case 21 water level and strain changes preceding and following the August 4, 1985 Kettleman Hills, California, earthquake

    Microsoft Academic Search

    Evelyn Roeloffs; Eddie Quilty; C. H. Scholz

    1997-01-01

    Two of the four wells monitored near Parkfield, California, during 1985 showed water level rises beginning three days before the Mw 6.1 Kettleman Hills earthquake. In one of these wells, the 3.0 cm rise was nearly unique in five years of water level data. However, in the other well, which showed a 3.8 cm rise, many other changes of comparable size

  18. Processed seismic motion records from Landers, California earthquake of June 28, 1992, recorded at seismograph stations in southern Nevada

    SciTech Connect

    Lum, P.K.; Honda, K.K.

    1993-04-01

    The 8mm data tape contains the processed seismic data of the Landers, California earthquake of June 28, 1992. The seismic data were recorded by 19 seismographs maintained by the DOE/NV in Southern Nevada. Four files were generated from each seismic recorder. They are: 1. Uncorrected acceleration time histories, 2. Corrected acceleration, velocity and displacement time histories, 3. Pseudo response velocity spectra, and 4. Fourier amplitude spectra of acceleration.

  19. Processed seismic motion records from Big Bear, California earthquake of June 28, 1992, recorded at seismograph stations in southern Nevada

    SciTech Connect

    Lum, P.K.; Honda, K.K.

    1993-04-01

    The 8mm data tape contains the processed seismic data of the Big Bear, California earthquake of June 28, 1992. The seismic data were recorded by 15 seismographs maintained by the DOE/NV in Southern Nevada. Four files were generated from each seismic recorder. They are: 1. Uncorrected acceleration time histories, 2. Corrected acceleration, velocity and displacement time histories, 3. Pseudo response velocity spectra, and 4. Fourier amplitude spectra of acceleration.

  20. Full Moment Tensor Variations and Isotropic Characteristics of Earthquakes in the Gulf of California Transform Fault System

    NASA Astrophysics Data System (ADS)

    Ortega, Roberto; Quintanar, Luis; Rivera, Luis

    2014-10-01

    The full moment tensor is a mathematical expression of six independent variables; however, on a routine basis, it is common practice to reduce them to five by assuming that the isotropic component is zero. This constraint is valid in most tectonic regimes where slip occurs entirely at the fault surface (e.g. subduction zones); however, we found that full moment tensors are better representations in transform fault systems. Here we present a method to analyze the source complexity of earthquakes of different sizes using a simple formulation that relates the elastic constants obtained from independent studies to the angle between the slip vector and the fault normal vector; this angle is obtained from the full moment tensors. This angle, the proportion of volume change (k), and the constant-volume (shear) component (T) are numerical indicators of the complexity of the source; earthquakes are more complex as the angle deviates from its nominal value or as T and k deviate from zero. These parameters are obtained from the eigensolution of the full moment tensor. We analyzed earthquakes in the Gulf of California that exhibit a clear isotropic component and we observed that the constant-volume parameter T is independent of scalar moments, suggesting that big and small earthquakes are equally complex. In addition, simple models of one single fault are not sufficient to describe physically all the combinations of these parameters in a source-type plot. We also found that the principal direction of the strike of the Transform Fault System in the Gulf of California follows the first-order approximation of the normal surface of the full moment tensor solution, whereas for deviatoric moment tensors the principal direction does not coincide with the strike of the Transform Fault System. Our observations that small and large earthquakes are equally complex are in agreement with recent studies of strike-slip earthquakes.
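
As a hedged illustration of the eigensolution step, the sketch below computes two standard Hudson-type source parameters — an isotropic fraction k and a constant-volume parameter T (zero for a pure double couple, ±1 for a pure CLVD) — from a full moment tensor. The tensor values are invented, and the exact parameter definitions in the paper may differ from these:

```python
# Hudson-style source-type parameters from a full moment tensor's
# eigensolution (illustrative; assumes a nonzero deviatoric part).
import numpy as np

M = np.array([[1.2, 0.3, 0.0],
              [0.3, -0.8, 0.1],
              [0.0, 0.1, 0.2]])   # hypothetical full moment tensor

iso = np.trace(M) / 3.0                    # isotropic (volume-change) part
dev = M - iso * np.eye(3)                  # deviatoric part (trace ~ 0)
ev = np.linalg.eigvalsh(dev)               # eigenvalues, ascending
m_abs_max = max(abs(ev[0]), abs(ev[2]))    # largest-magnitude deviatoric ev

T = -2.0 * ev[1] / m_abs_max               # 0 = pure DC, +/-1 = pure CLVD
k = iso / (abs(iso) + m_abs_max)           # 0 = no volume change
print(f"k = {k:.3f}, T = {T:.3f}")
```

For a pure double couple such as diag(1, 0, −1) this yields k = 0, T = 0, consistent with the abstract's framing that deviations of T and k from zero signal added complexity.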

  1. Detailed observations of California foreshock sequences: Implications for the earthquake initiation process

    NASA Astrophysics Data System (ADS)

    Dodge, Douglas A.; Beroza, Gregory C.; Ellsworth, W. L.

    1996-10-01

    We find that foreshocks provide clear evidence for an extended nucleation process before some earthquakes. In this study, we examine in detail the evolution of six California foreshock sequences, the 1986 Mount Lewis (ML = 5.5), the 1986 Chalfant (ML = 6.4), the 1986 Stone Canyon (ML = 4.7), the 1990 Upland (ML = 5.2), the 1992 Joshua Tree (MW = 6.1), and the 1992 Landers (MW = 7.3) sequence. Typically, uncertainties in hypocentral parameters are too large to establish the geometry of foreshock sequences and hence to understand their evolution. However, the similarity of location and focal mechanisms for the events in these sequences leads to similar foreshock waveforms that we cross correlate to obtain extremely accurate relative locations. We use these results to identify small-scale fault zone structures that could influence nucleation and to determine the stress evolution leading up to the mainshock. In general, these foreshock sequences are not compatible with a cascading failure nucleation model in which the foreshocks all occur on a single fault plane and trigger the mainshock by static stress transfer. Instead, the foreshocks seem to concentrate near structural discontinuities in the fault and may themselves be a product of an aseismic nucleation process. Fault zone heterogeneity may also be important in controlling the number of foreshocks, i.e., the stronger the heterogeneity, the greater the number of foreshocks. The size of the nucleation region, as measured by the extent of the foreshock sequence, appears to scale with mainshock moment in the same manner as determined independently by measurements of the seismic nucleation phase. We also find evidence for slip localization as predicted by some models of earthquake nucleation.
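
The relative-location step rests on measuring lags between highly similar waveforms by cross correlation. A minimal sketch with synthetic waveforms (the wavelet, noise level, and lag are invented):

```python
# Measure the sample lag between two similar seismograms by
# cross-correlation, the timing step behind precise relative locations.
import numpy as np

rng = np.random.default_rng(3)
n, lag_true = 256, 7

t = np.arange(40)
wavelet = np.exp(-0.5 * ((t - 20) / 4.0) ** 2) * np.sin(t)
a = np.zeros(n); a[60:100] = wavelet                        # "master" event
b = np.zeros(n); b[60 + lag_true:100 + lag_true] = wavelet  # similar, delayed event
a += 0.05 * rng.standard_normal(n)
b += 0.05 * rng.standard_normal(n)

cc = np.correlate(b - b.mean(), a - a.mean(), mode="full")
lag = int(np.argmax(cc)) - (n - 1)          # positive: b arrives later
print("measured lag:", lag, "samples")      # expect 7
```

Sub-sample precision, used in the study to reach relative-location errors of meters to tens of meters, is typically obtained by interpolating the correlation peak.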

  2. Dynamic deformations and the M6.7, Northridge, California earthquake

    USGS Publications Warehouse

    Gomberg, J.

    1997-01-01

    A method of estimating the complete time-varying dynamic deformation field from commonly available three-component single station seismic data has been developed and applied to study the relationship between dynamic deformation and ground failures and structural damage using observations from the 1994 Northridge, California earthquake. Estimates from throughout the epicentral region indicate that the horizontal strains exceed the vertical ones by more than a factor of two. The largest strains (exceeding ~100 μstrain) correlate with regions of greatest ground failure. There is a poor correlation between structural damage and peak strain amplitudes. The smallest strains, ~35 μstrain, are estimated in regions of no damage or ground failure. Estimates in the two regions with the most severe and well mapped permanent deformation, Potrero Canyon and the Granada-Mission Hills regions, exhibit the largest strains; peak horizontal strain estimates in these regions equal ~139 and ~229 μstrain respectively. Of note, the dynamic principal strain axes have strikes consistent with the permanent failure features, suggesting that, while gravity, sub-surface materials, and hydrologic conditions undoubtedly played fundamental roles in determining where and what types of failures occurred, the dynamic deformation field may have been favorably sized and oriented to initiate failure processes. These results support other studies that conclude that the permanent deformation resulted from ground shaking, rather than from static strains associated with primary or secondary faulting. They also suggest that such an analysis, either using data or theoretical calculations, may enable observations of paleo-ground failure to be used as quantitative constraints on the size and geometry of previous earthquakes. © 1997 Elsevier Science Limited.
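
Although the study estimates the full time-varying strain field, the order of magnitude of dynamic strain is often gauged with the plane-wave approximation strain ≈ peak ground velocity / phase velocity. The numbers below are illustrative, not from the paper:

```python
# Back-of-the-envelope dynamic strain via the plane-wave approximation.
# Both inputs are assumed values, not data from the Northridge study.
pgv = 0.35          # m/s, an illustrative strong-shaking peak ground velocity
c = 3000.0          # m/s, assumed shear/surface-wave phase velocity

strain = pgv / c
print(f"dynamic strain ~ {strain * 1e6:.0f} microstrain")
```

Values of order 100 μstrain from such estimates sit in the same range as the peak strains the study associates with ground failure.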

  3. Stability and uncertainty of finite-fault slip inversions: Application to the 2004 Parkfield, California, earthquake

    USGS Publications Warehouse

    Hartzell, S.; Liu, P.; Mendoza, C.; Ji, C.; Larson, K.M.

    2007-01-01

    The 2004 Parkfield, California, earthquake is used to investigate stability and uncertainty aspects of the finite-fault slip inversion problem with different a priori model assumptions. We utilize records from 54 strong ground motion stations and 13 continuous, 1-Hz sampled, geodetic instruments. Two inversion procedures are compared: a linear least-squares subfault-based methodology and a nonlinear global search algorithm. These two methods encompass a wide range of the different approaches that have been used to solve the finite-fault slip inversion problem. For the Parkfield earthquake and the inversion of velocity or displacement waveforms, near-surface related site response (top 100 m, frequencies above 1 Hz) is shown to not significantly affect the solution. Results are also insensitive to selection of slip rate functions with similar duration and to subfault size if proper stabilizing constraints are used. The linear and nonlinear formulations yield consistent results when the same limitations in model parameters are in place and the same inversion norm is used. However, the solution is sensitive to the choice of inversion norm, the bounds on model parameters, such as rake and rupture velocity, and the size of the model fault plane. The geodetic data set for Parkfield gives a slip distribution different from that of the strong-motion data, which may be due to the spatial limitation of the geodetic stations and the bandlimited nature of the strong-motion data. Cross validation and the bootstrap method are used to set limits on the upper bound for rupture velocity and to derive mean slip models and standard deviations in model parameters. This analysis shows that slip on the northwestern half of the Parkfield rupture plane from the inversion of strong-motion data is model dependent and has a greater uncertainty than slip near the hypocenter.
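
The bootstrap step mentioned above — resampling the data to obtain standard deviations of model parameters — can be sketched on a toy linear inversion. The system d = Gm + noise below is invented and far smaller than a real finite-fault problem:

```python
# Bootstrap uncertainty sketch for a linear least-squares inversion:
# resample rows (data) with replacement, re-solve, and take statistics.
import numpy as np

rng = np.random.default_rng(4)

# toy "slip inversion": d = G m + noise, with two model parameters
G = rng.standard_normal((40, 2))
m_true = np.array([1.0, -0.5])
d = G @ m_true + 0.1 * rng.standard_normal(40)

boot = []
for _ in range(500):
    idx = rng.integers(0, 40, 40)          # resample data with replacement
    m_b, *_ = np.linalg.lstsq(G[idx], d[idx], rcond=None)
    boot.append(m_b)
boot = np.array(boot)

print("mean model:", boot.mean(axis=0))    # near m_true
print("std dev:   ", boot.std(axis=0))
```

In the finite-fault setting the resampled "rows" are stations or waveform segments, and the spread of the resulting slip models is what flags the poorly constrained northwestern half of the rupture plane.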

  4. Capture probability of released males of two Bactrocera species (Diptera: Tephritidae) in detection traps in California.

    PubMed

    Shelly, T; Nishimoto, J; Diaz, A; Leathers, J; War, M; Shoemaker, R; Al-Zubaidy, M; Joseph, D

    2010-12-01

    The genus Bactrocera (Diptera: Tephritidae) includes approximately 70 polyphagous species that are major pests of fruit and vegetable crops. Most Bactrocera species have limited geographic distributions, but several species are invasive, and many countries operate continuous trapping programs to detect infestations. In the United States, California maintains approximately 25,000 traps (baited with male lures) specifically for Bactrocera detection distributed over an area of approximately 6,400 km2 (2,500 miles2) in the Los Angeles area. Although prior studies have used male lures to describe movement of Bactrocera males, they do not explicitly relate capture probability with fly distance from lure-baited traps; consequently, they do not address the relative effectiveness of male lures in detecting incipient populations of Bactrocera species. The objective of this study was to measure the distance-dependent capture probability of marked, released males of Bactrocera dorsalis (Hendel) and Bactrocera cucurbitae (Coquillett) (methyl eugenol- and cue lure-responding species, respectively) within the detection trapping grid operating in southern California. These data were then used to compute simple probability estimates for detecting populations of different sizes of the two species. Methyl eugenol was the more powerful attractant, and based on the mark-recapture data, we estimated that B. dorsalis populations with as few as approximately 50 males would always (>99.9%) be detected using the current trap density of five methyl eugenol-baited traps per 2.6 km2 (1 mile2). By contrast, we estimated that certain detection of B. cucurbitae populations would not occur until these contained approximately 350 males. The implications of the results for the California trapping system are discussed, and the findings are compared with mark-release-recapture data obtained for the same two species in Hawaii. PMID:21309224
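
The "simple probability estimates" referenced above follow from independent capture: if each of n males is captured with probability p, the detection probability is 1 − (1 − p)^n. The per-fly probability below is hypothetical, chosen only so that n ≈ 50 yields >99.9% detection as in the abstract:

```python
# Detection probability for a population of n flies, each captured
# independently with probability p (p here is an assumed value).
def p_detect(n, p):
    return 1.0 - (1.0 - p) ** n

p = 0.13   # hypothetical per-fly capture probability
for n in (10, 50, 350):
    print(f"n = {n:4d}: P(detect) = {p_detect(n, p):.4f}")
```

The study's actual p values come from the distance-dependent mark-recapture data, so they differ by species and lure; the arithmetic above is only the final combining step.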

  5. Changes in the discharge characteristics of thermal springs and fumaroles in the Long Valley Caldera, California, resulting from earthquakes on May 25-27, 1980

    USGS Publications Warehouse

    Sorey, M.L.; Clark, Mark D.

    1981-01-01

    Changes in flow rate and turbidity have been observed and measured in hot springs in the Long Valley caldera, California, following earthquakes with magnitudes up to 6.3 in May 1980. Increases in flow rate of some hot springs occurred within minutes of the earthquakes, followed by more gradual decreases in flow rate to pre-earthquake levels. Spring temperatures and chemistries also show no long-term variations following earthquakes. Transient changes in discharge characteristics of the hot springs appear to result from increases in the permeability of fault conduits transmitting the hot water to the surface. (USGS)

  6. Fault tectonics and earthquake hazards in the Peninsular Ranges, Southern California

    NASA Technical Reports Server (NTRS)

    Merifield, P. M.; Lamar, D. L. (principal investigators)

    1974-01-01

    The author has identified the following significant results. ERTS and Skylab images reveal a number of prominent lineaments in the basement terrane of the Peninsular Ranges, Southern California. The major, well-known, active, northwest trending, right-slip faults are well displayed, but northeast and west to west-northwest trending lineaments are also present. Study of large-scale airphotos followed by field investigations have shown that several of these lineaments represent previously unmapped faults. Pitches of striations on shear surfaces of the northeast and west trending faults indicate oblique-slip movement; data are insufficient to determine the net-slip. These faults are restricted to the pre-Tertiary basement terrane and are truncated by the major northwest trending faults; therefore, they may have formed in response to an earlier stress system. Future work should be directed toward determining whether the northeast and west trending faults are related to the presently active stress system or to an older inactive system, because this question relates to the earthquake risk in the vicinity of these faults.

  7. Low Velocity Zones along the San Jacinto Fault, Southern California, inferred from Local Earthquakes

    NASA Astrophysics Data System (ADS)

    Li, Z.; Yang, H.; Peng, Z.; Ben-Zion, Y.; Vernon, F.

    2013-12-01

    Natural fault zones have regions of brittle damage leading to a low-velocity zone (LVZ) in the immediate vicinity of the main fault interface. The LVZ may amplify ground motion, modify rupture propagation, and impact derivation of earthquake properties. Here we image low-velocity fault zone structures along the San Jacinto Fault (SJF), southern California, using waveforms of local earthquakes that are recorded at several dense arrays across the SJFZ. We use generalized ray theory to compute synthetic travel times to track the direct and FZ-reflected waves bouncing from the FZ boundaries. This method can effectively reduce the trade-off between FZ width and velocity reduction relative to the host rock. Our preliminary results from travel time modeling show the clear signature of LVZs along the SJF, including the segment of the Anza seismic gap. At the southern part near the trifurcation area, the LVZ of the Clark Valley branch (array JF) has a width of ~200 m with ~55% reduction in Vp and Vs. This is consistent with what has been suggested by previous studies. In comparison, we find that the velocity reduction relative to the host rock across the Anza seismic gap (array RA) is ~50% for both Vp and Vs, nearly as prominent as that on the southern branches. The width of the LVZ is ~230 m. In addition, the LVZ across the Anza gap appears to be located on the northeast side of the RA array, implying a potential preferred propagation direction of past ruptures.
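
The width/velocity-reduction trade-off that the FZ-reflected waves help resolve can be seen in the delay a wave accumulates crossing the zone: dt = W/v · (r/(1 − r)) for width W and fractional reduction r. The host velocity and the two example zones below are illustrative; they produce nearly identical delays despite very different geometry:

```python
# Travel-time delay across a low-velocity zone of width W (m) with
# velocity reduced by fraction r from host velocity v (assumed values).
v = 3.0e3            # m/s, assumed host-rock S velocity

def delay(W, r):
    return W / (v * (1 - r)) - W / v   # = (W / v) * r / (1 - r)

# two different zones producing nearly the same travel-time delay:
print(f"W=200 m, 55% reduction:   {delay(200, 0.55) * 1e3:.1f} ms")
print(f"W=440 m, 35.7% reduction: {delay(440, 0.357) * 1e3:.1f} ms")
```

Direct-wave delays alone cannot separate the two cases; the reflected phases, whose timing depends on the boundary positions rather than only on the integrated slowness, break the degeneracy.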

  8. Localization of intermediate-term earthquake prediction

    SciTech Connect

    Kossobokov, V.G.; Keilis-Borok, V.I. (International Inst. of Earthquake Prediction Theory and Mathematical Geophysics, Moscow (USSR)); Smith, S.W. (Univ. of Washington, Seattle (USA))

    1990-11-10

    Relative seismic quiescence within a region which has already been diagnosed as having entered a Time of Increased Probability (TIP) for the occurrence of a strong earthquake can be used to refine the locality in which the earthquake may be expected to occur. A simple algorithm with parameters fitted from the data in Northern California preceding the 1980 magnitude 7.0 earthquake offshore from Eureka depicts relative quiescence within the region of a TIP. The procedure was tested, without readaptation of parameters, on 17 other strong earthquake occurrences in North America, Japan, and Eurasia, most of which were in regions for which a TIP had been previously diagnosed. The localizing algorithm successfully outlined a region within which the subsequent earthquake occurred for 16 of these 17 strong earthquakes. The area of prediction in each case was reduced significantly, ranging between 7% and 25% of the total area covered by the TIP.

  9. Probability

    NSDL National Science Digital Library

    Ms. Banks

    2005-05-11

    What are the chances? What is probability? Math Glossary What are the chances that you will get a baby brother or a baby sister? Boy or Girl? If you flip more than one coin, what are the combinations you could get? What are the chances you will get each combination? Probability in Flipping Coins ...

  10. Seismicity remotely triggered by the magnitude 7.3 Landers, California, earthquake

    Microsoft Academic Search

    D. P. Hill; P. A. Reasenberg; A. Michael; W. J. Arabaz; G. Beroza; D. Brumbaugh; J. N. Brune; R. Castro; S. Davis; D. Depolo; R. B. Smith; L. Munguia; A. Vidal; V. Wong; J. Gomberg; S. Harmsen; L. House; S. M. Jackson; L. Jones; R. Keller; S. Malone; A. Sanford; S. Walter; J. Zollweg

    1993-01-01

    The magnitude 7.3 Landers earthquake of 28 June 1992 triggered a remarkably sudden and widespread increase in earthquake activity across much of the western United States. The triggered earthquakes, which occurred at distances up to 1250 kilometers (17 source dimensions) from the Landers mainshock, were confined to areas of persistent seismicity and strike-slip to normal faulting. Many of the triggered

  11. Possible Earthquake Rupture Connections on Mapped California Faults Ranked by Calculated Coulomb Linking Stresses

    E-print Network

    ... ruptures within localized fault systems. ... Fault-based earthquake forecasts require decisions ... distance is not a sufficient criterion for decision making. ... these types of earthquakes are very difficult to forecast from a fault-based model, but recent experience ...

  12. Earthquakes, active faults, and geothermal areas in the Imperial Valley, California.

    PubMed

    Hill, D P; Mowinckel, P; Peake, L G

    1975-06-27

    A dense seismograph network in the Imperial Valley recorded a series of earthquake swarms along the Imperial and Brawley faults and a diffuse pattern of earthquakes along the San Jacinto fault. Two known geothermal areas are closely associated with these earthquake swarms. This seismicity pattern demonstrates that seismic slip is occurring along both the Imperial-Brawley and San Jacinto fault systems. PMID:17772600

  13. Adaptively Smoothed Seismicity Earthquake Forecasts for Italy

    Microsoft Academic Search

    M. J. Werner; A. Helmstetter; D. D. Jackson; Y. Y. Kagan; S. Wiemer

    2010-01-01

    We present a model for estimating the probabilities of future earthquakes of magnitudes m > 4.95 in Italy. The model, a slightly modified version of the one proposed for California by Helmstetter et al. (2007) and Werner et al. (2010), approximates seismicity by a spatially heterogeneous, temporally homogeneous Poisson point process. The temporal, spatial and magnitude dimensions are entirely decoupled.
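
    The model described above treats seismicity as a temporally homogeneous Poisson point process, so the probability of one or more target events in a forecast window follows directly from the rate: P = 1 - exp(-λT). A minimal sketch in Python (the rate below is hypothetical, chosen only to illustrate the arithmetic; it is not a value from the paper):

```python
import math

def prob_at_least_one(rate_per_year: float, years: float) -> float:
    """P(N >= 1) for a homogeneous Poisson process: 1 - exp(-rate * T)."""
    return 1.0 - math.exp(-rate_per_year * years)

# Hypothetical annual rate of target-magnitude events, for illustration only:
rate = 0.0305  # events per year
print(round(prob_at_least_one(rate, 30.0), 2))  # 30-yr probability -> 0.6
```

    This is the same relation that converts an annual rate into the multi-decade probabilities quoted in regional forecast reports.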

  14. Earthquake Forecasting Using Hidden Markov Models

    Microsoft Academic Search

    Daniel W. Chambers; Jenny A. Baglivo; John E. Ebel; Alan L. Kafka

    2011-01-01

    This paper develops a novel method, based on hidden Markov models, to forecast earthquakes and applies the method to mainshock seismic activity in southern California and western Nevada. The forecasts are of the probability of a mainshock within 1, 5, and 10 days in the entire study region or in specific subregions and are based on the observations available at
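
    The abstract describes forecasts built on hidden Markov models. A toy two-state ("quiet"/"active") forward filter illustrates the mechanics; every parameter below is invented for illustration and is not the calibrated model of Chambers et al.:

```python
# Toy HMM forecast sketch; the transition (A), emission (B), and initial
# (pi) parameters are invented, not the paper's calibrated values.
A = [[0.95, 0.05],   # P(next state | current state); 0 = quiet, 1 = active
     [0.20, 0.80]]
B = [[0.90, 0.10],   # P(observation | state); obs 0 = no event, 1 = event
     [0.60, 0.40]]
pi = [0.8, 0.2]      # initial state distribution

def filter_states(obs):
    """Forward algorithm: normalized posterior over hidden states."""
    belief = [pi[s] * B[s][obs[0]] for s in range(2)]
    norm = sum(belief)
    belief = [b / norm for b in belief]
    for o in obs[1:]:
        pred = [sum(belief[s] * A[s][t] for s in range(2)) for t in range(2)]
        belief = [pred[t] * B[t][o] for t in range(2)]
        norm = sum(belief)
        belief = [b / norm for b in belief]
    return belief

def forecast_event_prob(obs):
    """One-step-ahead probability of an event given the observations so far."""
    belief = filter_states(obs)
    pred = [sum(belief[s] * A[s][t] for s in range(2)) for t in range(2)]
    return sum(pred[t] * B[t][1] for t in range(2))
```

    After a long quiet run the forecast probability is low; recent events shift the belief toward the "active" state and raise it, which is the qualitative behavior such forecasts exploit.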

  15. Micro-earthquake Analysis for Reservoir Properties at the Prati-32 Injection Test, The Geysers, California

    NASA Astrophysics Data System (ADS)

    Hutchings, L. J.; Singh, A. K.

    2014-12-01

    The Prati-32 injection test offers a particular opportunity to test rock physics theories and tomography results, as it occurred in a previously undisturbed portion of The Geysers, California. Within the northwest Geysers, there is a high temperature zone (HTZ) directly below the normal temperature reservoir (NTR) at ~2.6 km below ground surface. We demonstrate an analysis of micro-earthquake data with rock physics theory to identify fractures, the state of fluids, and permeable zones. We obtain earthquake source properties (hypocenters, magnitudes, stress drops, and moment tensors), 3D isotropic velocity (Vp and Vs) and attenuation (Qp and Qs seismic quality factors), derived elastic moduli (Lambda, bulk and Young's moduli), and Poisson's ratio. After one month of injection, changes in these parameters occur right at the point where injection occurred, which confirms the accuracy of the tomography. Bulk modulus, Poisson's ratio, and Lambda increased; Vs decreased; Qp and Vp increased slightly; and Qs did not change. We interpret these observations to indicate fluid saturation along with fracturing around the well bottom. Fracturing would decrease Vs, while saturation would not affect Vs; saturation would increase Vp, even with fracturing. Saturation and fracturing should have competing effects on intrinsic and extrinsic Q: saturation should increase intrinsic Qp but not affect extrinsic Qp. We cannot explain the unchanged Qs unless the effect of increasing intrinsic Qs is offset by a decrease in extrinsic Qs. The increases in Poisson's ratio and Lambda are a further indication of saturation. After two months of injection, as compared to one month of injection, bulk modulus and Vp have returned to values comparable to those before injection for the volume around the well bottom. A new anomaly in Vp has moved below the well. Vs continues to be low, and Lambda and Poisson's ratio continue to be high compared to before injection. These anomalies have not moved, but have increased in size. We interpret these observations to indicate continued saturation, but with increased fracturing; only Vp and the bulk modulus have changed significantly, and this is due to the increased fracturing offsetting the saturation.

  16. A three-step Maximum-A-Posterior probability method for InSAR data inversion of coseismic rupture with application to four recent large earthquakes in Asia

    NASA Astrophysics Data System (ADS)

    Sun, J.; Shen, Z.; Burgmann, R.; Liang, F.

    2012-12-01

    We develop a three-step Maximum-A-Posterior probability (MAP) method for coseismic rupture inversion, which aims at maximizing the a posteriori probability density function (PDF) of elastic solutions of earthquake rupture. The method originates from the Fully Bayesian Inversion (FBI) and Mixed linear-nonlinear Bayesian inversion (MBI) methods, shares the same a posteriori PDF with them, and keeps most of their merits, while overcoming their convergence difficulty when large numbers of low-quality data are used and greatly improving the convergence rate through optimization procedures. A highly efficient global optimization algorithm, Adaptive Simulated Annealing (ASA), is used to search for the maximum posterior probability in the first step. The non-slip parameters are determined by the global optimization method, and the slip parameters are initially inverted for using the least squares method without a positivity constraint and then damped to a physically reasonable range. This first-step MAP inversion quickly brings the inversion close to the 'true' solution and jumps over local maximum regions in the high-dimensional parameter space. The second-step inversion approaches the 'true' solution further, with positivity constraints subsequently applied on the slip parameters using the Monte Carlo Inversion (MCI) technique and with all parameters obtained from step one as the initial solution. The slip artifacts are then eliminated from the slip models in the third-step MAP inversion, with the fault geometry parameters fixed. We first used a designed model with a 45-degree dipping angle and oblique slip, and corresponding synthetic InSAR data sets, to validate the efficiency and accuracy of the method. We then applied the method to four recent large earthquakes in Asia, namely the 2010 Yushu, China earthquake, the 2011 Burma earthquake, the 2011 New Zealand earthquake, and the 2008 Qinghai, China earthquake, and compared our results with those from other groups. Our results show the effectiveness of the method in earthquake studies and a number of its advantages over other methods. The details will be reported at the meeting.

  17. In-situ fluid-pressure measurements for earthquake prediction: An example from a deep well at Hi Vista, California

    USGS Publications Warehouse

    Healy, J.H.; Urban, T.C.

    1985-01-01

    Short-term earthquake prediction requires sensitive instruments for measuring the small anomalous changes in stress and strain that precede earthquakes. Instruments installed at or near the surface have proven too noisy for measuring anomalies of the size expected to occur, and it is now recognized that even to have the possibility of a reliable earthquake-prediction system will require instruments installed in drill holes at depths sufficient to reduce the background noise to a level below that of the expected premonitory signals. We are conducting experiments to determine the maximum signal-to-noise improvement that can be obtained in drill holes. In a 592 m well in the Mojave Desert near Hi Vista, California, we measured water-level changes with amplitudes greater than 10 cm, induced by earth tides. By removing the effects of barometric pressure and the stress related to earth tides, we have achieved a sensitivity to volumetric strain rates of 10⁻⁹ to 10⁻¹⁰ per day. Further improvement may be possible, and it appears that a successful earthquake-prediction capability may be achieved with an array of instruments installed in drill holes at depths of about 1 km, assuming that the premonitory strain signals are, in fact, present. © 1985 Birkhäuser Verlag.

  18. LLNL-Generated Content for the California Academy of Sciences, Morrison Planetarium Full-Dome Show: Earthquake

    SciTech Connect

    Rodgers, A J; Petersson, N A; Morency, C E; Simmons, N A; Sjogreen, B

    2012-01-23

    The California Academy of Sciences (CAS) Morrison Planetarium is producing a 'full-dome' planetarium show on earthquakes and asked LLNL to produce content for the show. Specifically, the show features numerical ground motion simulations of the M 7.9 1906 San Francisco earthquake and a possible future M 7.05 Hayward fault scenario earthquake. The show also features concepts of plate tectonics and mantle convection using images from LLNL's G3D global seismic tomography. This document describes the data that were provided to the CAS in support of production of the 'Earthquake' show. The CAS is located in Golden Gate Park, San Francisco, and hosts over 1.6 million visitors. The Morrison Planetarium, within the CAS, is the largest all-digital planetarium in the world. It features a 75-foot diameter spherical section projection screen tilted at a 30-degree angle. Six projectors cover the entire field of view and give a three-dimensional immersive experience. CAS shows strive to use scientifically accurate digital data in their productions. The show, entitled simply 'Earthquake', will debut on 26 May 2012. Graphics and animations based on the same data sets are also being developed for display on LLNL powerwalls and flat-screens, as well as for public release.

  19. Probability of failure in BWR reactor coolant piping: Guillotine break indirectly induced by earthquakes

    SciTech Connect

    Hardy, G.S.; Campbell, R.D.; Ravindra, M.K.

    1986-12-01

    The requirements to design nuclear power plants for the effects of an instantaneous double-ended guillotine break (DEGB) of the reactor coolant piping have led to excessive design costs, interference with normal plant operation and maintenance, and unnecessary radiation exposure of plant maintenance personnel. This report describes an aspect of the NRC/Lawrence Livermore National Laboratory-sponsored research program aimed at investigating whether the probability of DEGB in reactor coolant loop piping of nuclear power plants is acceptably small, such that the requirements to design for DEGB effects (e.g., provision of pipe whip restraints) may be removed. This study estimates the probability of indirect DEGB in reactor coolant piping as a consequence of seismically induced structural failures within the containment of the GE-supplied boiling water reactor at the Brunswick nuclear power plant. The median probability of indirect DEGB was estimated to be 2 × 10⁻⁸ per year. Using conservative assumptions, the 90% subjective probability (confidence) value of P(DEGB) was found to be less than 5 × 10⁻⁷ per year.
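
    The abstract quotes a median of 2 x 10^-8 per year and a 90% confidence value below 5 x 10^-7 per year. If one assumes the uncertainty is lognormal, a common choice in seismic probabilistic risk assessment (an assumption made here, not stated in the abstract), the implied logarithmic standard deviation can be back-calculated from those two quantiles:

```python
import math
from statistics import NormalDist

median = 2e-8  # median annual probability of indirect DEGB (from abstract)
p90 = 5e-7     # 90th-percentile bound quoted in the abstract

# For a lognormal distribution, x_q = median * exp(sigma * z_q), so
# sigma = ln(x_90 / median) / z_0.90.
z90 = NormalDist().inv_cdf(0.90)
sigma = math.log(p90 / median) / z90
print(round(sigma, 2))  # implied logarithmic standard deviation
```

    A sigma this large (about 2.5) reflects more than an order of magnitude of spread between the median and the 90% bound, which is typical of rare-event seismic failure estimates.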

  20. Behavior of Repeating Earthquake Sequences in Central California and the Implications for Subsurface Fault Creep

    SciTech Connect

    Templeton, D C; Nadeau, R; Burgmann, R

    2007-07-09

    Repeating earthquakes (REs) are sequences of events that have nearly identical waveforms and are interpreted to represent fault asperities driven to failure by loading from aseismic creep on the surrounding fault surface at depth. We investigate the occurrence of these REs along faults in central California to determine which faults exhibit creep and the spatio-temporal distribution of this creep. At the juncture of the San Andreas and southern Calaveras-Paicines faults, both faults as well as a smaller secondary fault, the Quien Sabe fault, are observed to produce REs over the observation period of March 1984-May 2005. REs in this area reflect a heterogeneous creep distribution along the fault plane with significant variations in time. Cumulative slip over the observation period at individual sequence locations is determined to range from 5.5-58.2 cm on the San Andreas fault, 4.8-14.1 cm on the southern Calaveras-Paicines fault, and 4.9-24.8 cm on the Quien Sabe fault. Creep at depth appears to mimic the behavior of creep seen at the surface, in that steady slip, triggered slip, and episodic slip phenomena are also observed in the RE sequences. For comparison, we investigate the occurrence of REs west of the San Andreas fault within the southern Coast Range. Events within these RE sequences occurred only minutes to weeks apart from each other and then did not repeat again over the observation period, suggesting that REs in this area are not produced by steady aseismic creep of the surrounding fault surface.

  1. Full-3D Tomography of the Crustal Structure in Southern California Using Earthquake Seismograms and Ambient-Noise Correlagrams

    NASA Astrophysics Data System (ADS)

    Lee, E. J.; Chen, P.; Jordan, T. H.; Maechling, P. J.; Denolle, M.; Beroza, G. C.

    2014-12-01

    We have constructed a high-resolution model for the Southern California crust, CVM-S4.26, by inverting more than half-a-million waveform-misfit measurements from about 38,000 earthquake seismograms and 12,000 ambient-noise correlagrams. The inversion was initiated with the Southern California Earthquake Center's Community Velocity Model, CVM-S4, and seismograms were simulated using K. Olsen's staggered-grid finite-difference code, AWP-ODC, which was highly optimized for massively parallel computation on supercomputers by Y. Cui et al. We navigated the tomography through 26 iterations, alternating the inversion sequences between the adjoint-wavefield (AW) method and the more rapidly converging, but more data-intensive, scattering-integral (SI) method. Earthquake source errors were reduced at various stages of the tomographic navigation by inverting the waveform data for the earthquake centroid-moment tensors. All inversions were done on the Mira supercomputer of the Argonne Leadership Computing Facility. The resulting model, CVM-S4.26, is consistent with independent observations, such as high-resolution 2D refraction surveys and Bouguer gravity data. Many of the high-contrast features of CVM-S4.26 conform to known fault structures and other geological constraints not applied in the inversions. We have conducted several other validation experiments, including checking the model against a large number (>28,000) of seismograms not used in the inversions. We illustrate this consistency with the excellent fits at low frequencies (≤ 0.2 Hz) to three-component seismograms recorded throughout Southern California from the 17 Mar 2014 Encino (MW4.4) and 29 Mar 2014 La Habra (MW5.1) earthquakes, and we show these fits to be much better than those obtained by two community velocity models in current use, CVM-S4 and CVM-H11.9. We conclude by describing some of the novel features of the CVM-S4.26 model, which include unusual velocity reversals in some regions of the mid-crust.

  2. Point Mugu, California, earthquake of 21 February 1973 and its aftershocks.

    PubMed

    Ellsworth, W L; Campbell, R H; Hill, D P; Page, R A; Alewine, R W; Hanks, T C; Heaton, T H; Hileman, J A; Kanamori, H; Minster, B; Whitcomb, J H

    1973-12-14

    Seismological investigations show that the Point Mugu earthquake involved north-south crustal shortening deep within the complex fault zone that marks the southern front of the Transverse Ranges province. This earthquake sequence results from the same stress system responsible for the deformation in this province in the Pliocene through Holocene and draws attention to the significant earthquake hazard that the southern frontal fault system poses to the Los Angeles metropolitan area. PMID:17810814

  3. Earthquake-induced structures in sediments of Van Norman Lake, San Fernando, California

    USGS Publications Warehouse

    Sims, J.D.

    1973-01-01

    The 9 February 1971 earthquake in the San Fernando Valley damaged the Lower Van Norman Dam severely enough to warrant draining the reservoir. In March 1972 the sediment deposited on the reservoir floor was examined to determine whether the 1971 earthquake had induced sediment deformation and, if so, what types. A zone of deformational structures characterized by small-scale loads and slightly recumbent folds associated with the 1971 earthquake was discovered, in addition to two older zones of load structures. Each of the zones has been tentatively correlated with an historic earthquake.

  4. Damage and restoration of geodetic infrastructure caused by the 1994 Northridge, California, earthquake

    USGS Publications Warehouse

    Hodgkinson, Kathleen M.; Stein, Ross S.; Hudnut, Kenneth W.; Satalich, Jay; Richards, John H.

    1996-01-01

    We seek to restore the integrity of the geodetic network in the San Fernando, Simi, and Santa Clarita Valleys and in the northern Los Angeles Basin by remeasurement of the network and identification of bench marks (BMs) that experienced non-tectonic displacements associated with the Northridge earthquake. We then use the observed displacement of BMs in the network to portray or predict the permanent vertical and horizontal deformation associated with the 1994 Northridge earthquake throughout the area, including sites where we lack geodetic measurements. To accomplish this, we find the fault geometry and earthquake slip that are most compatible with the geodetic and independent seismic observations of the earthquake. We then use that fault model to predict the deformation everywhere at the earth's surface, both at locations where geodetic observations exist and also where they are absent. We compare displacements predicted for a large number of numerical models of the earthquake faulting to the coseismic displacements, treating the earthquake fault as a cut or discontinuity embedded in a stiff elastic solid. This comparison is made after non-tectonic deformation has been removed from the measured elevation changes. The fault slip produces strain in the medium and deforms the ground surface. The model compatible with seismic observations that best fits the geodetic data within their uncertainties is selected. The acceptable model fault bisects the mainshock focus, and the earthquake size (magnitude) is compatible with the earthquake size measured seismically. Our fault model was used to identify geodetic monuments on engineered structures that were anomalously displaced by the earthquake.

  5. Current progress in using multiple electromagnetic indicators to determine location, time, and magnitude of earthquakes in California and Peru (Invited)

    NASA Astrophysics Data System (ADS)

    Bleier, T. E.; Dunson, C.; Roth, S.; Heraud, J.; Freund, F. T.; Dahlgren, R.; Bryant, N.; Bambery, R.; Lira, A.

    2010-12-01

    Since ultra-low frequency (ULF) magnetic anomalies were discovered prior to the 1989 Loma Prieta, Ca. M7.0 earthquake, QuakeFinder, a small R&D group based in Palo Alto, California, has systematically monitored ULF magnetic signals with a network of 3-axis induction magnetometers in California since 2000. The raw magnetometer data were collected at 20-50 samples per second, with no preprocessing, in an attempt to collect an accurate time history of electromagnetic waveforms prior to, during, and after large earthquakes within 30 km of these sensors. In October 2007, the QuakeFinder team observed a series of strange magnetic pulsations at the Alum Rock, California site, 14 days prior to a M5.4 earthquake. The magnetic signals observed were relatively short, random pulsations, not continuous waveform signals like Pc1 or Pc3 micropulsations. The magnetic pulses have characteristic unipolar shapes and durations of 0.5 sec. to 30 sec., much longer than lightning signals. In May of 2010, very similar pulses were observed at Tacna, Peru, 13 days prior to a M6.2 earthquake, using a QuakeFinder station jointly operated under collaboration with the Catholic University in Lima, Peru (PUCP). More examples of these pulsations were sought, and a historical review of older California magnetic data discovered fewer, but similar, pulsations at the Hollister, Ca. site operated by UC Berkeley (e.g., the San Juan Bautista M5.1 earthquake on August 12, 1998). Further analysis of the direction of arrival of the magnetic pulses showed an interesting “azimuth clustering” observed in both the Alum Rock, Ca. and Tacna, Peru data. The complete time series of the Alum Rock data allowed the team to analyze subsequent changes observed in magnetometer “filter banks” (0.001 Hz to 10 Hz filter bands, similar to those used by Fraser-Smith in 1989), but this time using time-adjusted limits based on time of day, time of year, Kp, and site background noise.
    These site-customized limits showed similar increases in 30-minute averaged energy excursions, but the 30-minute averages had a disadvantage in that they reduced the signal-to-noise ratio relative to the individual pulse-counting method. Among other electromagnetic monitoring methods, air conductivity instrumentation showed major changes in positive air-borne ions observed near the Alum Rock and Tacna sites, peaking during the 24 hours prior to the earthquake. GOES (geosynchronous) satellite infrared (IR) data showed that an unusual apparent “night time heating” occurred in an extended area within 40+ km of the Alum Rock site, and this IR signature peaked around the time of the magnetic pulse count peak. The combination of these three indicators (magnetic pulse counts, air conductivity, and IR night-time heating) may be a starting point for determining the time (within 1-2 weeks), location (within 20-40 km), and magnitude (within +/- 1 increment of Richter magnitude) of earthquakes greater than M5.4.

  6. The Magnitude 6.7 Northridge, California, Earthquake of January 17, 1994

    NASA Technical Reports Server (NTRS)

    Donnellan, A.

    1994-01-01

    The most damaging earthquake in the United States since 1906 struck northern Los Angeles on January 17, 1994. The magnitude 6.7 Northridge earthquake produced a maximum of more than 3 meters of reverse (up-dip) slip on a south-dipping thrust fault rooted under the San Fernando Valley and projecting north under the Santa Susana Mountains.

  7. Chapter C. The Loma Prieta, California, Earthquake of October 17, 1989 - Preseismic Observations

    USGS Publications Warehouse

    Johnston, Malcolm J. S., (Edited By)

    1993-01-01

    The October 17, 1989, Loma Prieta, Calif., Ms=7.1 earthquake provided the first opportunity in the history of fault monitoring in the United States to gather multidisciplinary preearthquake data in the near field of an M=7 earthquake. The data obtained include observations on seismicity, continuous strain, long-term ground displacement, magnetic field, and hydrology. The papers in this chapter describe these data, their implications for fault-failure mechanisms, the scale of prerupture nucleation, and earthquake prediction in general. Of the 10 papers presented here, about half identify preearthquake anomalies in the data, but some of these results are equivocal. Seismicity in the Loma Prieta region during the 20 years leading up to the earthquake was unremarkable. In retrospect, however, it is apparent that the principal southwest-dipping segment of the subsequent Loma Prieta rupture was virtually aseismic during this period. Two M=5 earthquakes did occur near Lake Elsman near the junction of the Sargent and San Andreas faults within 2.5 and 15 months of, and 10 km to the north of, the Loma Prieta epicenter. Although these earthquakes were not on the subsequent rupture plane of the Loma Prieta earthquake and other M=5 earthquakes occurred in the preceding 25 years, it is now generally accepted that these events were, in some way, foreshocks to the main event.

  8. A search for evidence of secondary static stress triggering during the 1992 Mw7.3 Landers, California, earthquake sequence

    NASA Astrophysics Data System (ADS)

    Meier, M.-A.; Werner, M. J.; Woessner, J.; Wiemer, S.

    2014-04-01

    Secondary triggering of aftershocks is widely observed and often ascribed to secondary static stress transfer. However, small to moderate earthquakes are generally disregarded in estimates of Coulomb stress changes (ΔCFS), either because of source parameter uncertainties or a perceived lack of importance. We use recently published high-quality focal mechanisms and hypocenters to reassess the role of small to moderate earthquakes for static stress triggering of aftershocks during the 1992 Mw7.3 Landers, California, earthquake sequence. We compare the ΔCFS imparted by aftershocks (2 ≤ M ≤ 6) onto subsequent aftershocks with the total ΔCFS induced by the M > 6 main shocks. We find that incremental stress changes between aftershock pairs are potentially more often positive than expected over intermediate distances. Cumulative aftershock stress changes are not reliable for receivers with nearby aftershock stress sources because we exclude unrealistic aftershock stress shadows that result from uniform slip models. Nonetheless, 27% of aftershocks receive greater positive stress from aftershocks than from the main shocks. Overall, 85% of aftershocks are encouraged by the main shocks, while adding secondary stress encourages only 79%. We infer that source parameter uncertainties of small aftershocks remain too large to convincingly demonstrate (or rule out) that secondary stress transfer induces aftershocks. An important exception concerns aftershocks in main shock stress shadows: well-resolved secondary stress from detected aftershocks rarely compensates negative main shock stress; these aftershocks require a different triggering mechanism.

  9. Comments on baseline correction of digital strong-motion data: Examples from the 1999 Hector Mine, California, earthquake

    USGS Publications Warehouse

    Boore, D.M.; Stephens, C.D.; Joyner, W.B.

    2002-01-01

    Residual displacements for large earthquakes can sometimes be determined from recordings on modern digital instruments, but baseline offsets of unknown origin make it difficult in many cases to do so. To recover the residual displacement, we suggest tailoring a correction scheme by studying the character of the velocity obtained by integration of zeroth-order-corrected acceleration and then seeing if the residual displacements are stable when the various parameters in the particular correction scheme are varied. For many seismological and engineering purposes, however, the residual displacements are of lesser importance than ground motions at periods less than about 20 sec. These ground motions are often recoverable with simple baseline correction and low-cut filtering. In this largely empirical study, we illustrate the consequences of various correction schemes, drawing primarily from digital recordings of the 1999 Hector Mine, California, earthquake. We show that with simple processing the displacement waveforms for this event are very similar for stations separated by as much as 20 km. We also show that a strong pulse on the transverse component was radiated from the Hector Mine earthquake and propagated with little distortion to distances exceeding 170 km; this pulse leads to large response spectral amplitudes around 10 sec.

  10. A model of earthquake triggering probabilities and application to dynamic deformations constrained by ground motion observations

    USGS Publications Warehouse

    Gomberg, J.; Felzer, K.

    2008-01-01

    We have used observations from Felzer and Brodsky (2006) of the variation of linear aftershock densities (i.e., aftershocks per unit length) with the magnitude of and distance from the main shock fault to derive constraints on how the probability of a main shock triggering a single aftershock at a point, P(r, D), varies as a function of distance, r, and main shock rupture dimension, D. We find that P(r, D) becomes independent of D as the triggering fault is approached. When r >> D, P(r, D) scales as D^m, where m ~ 2, and decays with distance approximately as r^-n, with n = 2, with a possible change to r^-(n-1) at r > h, where h is the closest distance between the fault and the boundaries of the seismogenic zone. These constraints may be used to test hypotheses about the types of deformations and mechanisms that trigger aftershocks. We illustrate this using dynamic deformations (i.e., radiated seismic waves) and a posited proportionality with P(r, D). Deformation characteristics examined include peak displacements, peak accelerations and velocities (proportional to strain rates and strains, respectively), and two measures that account for cumulative deformations. Our model indicates that either peak strains alone or strain rates averaged over the duration of rupture may be responsible for aftershock triggering.
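
    The scaling constraints described in the abstract can be written as a simple piecewise function. The sketch below illustrates that scaling only; the prefactor k and the sharp regime boundaries are simplifications not specified in the abstract:

```python
def trigger_prob(r, D, h, m=2.0, n=2.0, k=1.0):
    """Piecewise sketch of P(r, D): probability (up to the arbitrary
    constant k) that a main shock of rupture dimension D triggers an
    aftershock at distance r; h is the distance from the fault to the
    seismogenic-zone boundary.
    """
    if r <= D:
        return k                          # near-fault: independent of D
    if r <= h:
        return k * D**m * r**-n           # far-field: D**m growth, r**-n decay
    return k * D**m * h**-n * (h / r)**(n - 1.0)  # slower r**-(n-1) decay past h
```

    For m = n = 2 the branches join continuously at r = D and r = h: doubling the main shock dimension quadruples the far-field triggering probability, while doubling the distance quarters it.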

  11. Chapter B. The Loma Prieta, California, Earthquake of October 17, 1989 - Liquefaction

    USGS Publications Warehouse

    Holzer, Thomas L.

    1998-01-01

    The 1989 Loma Prieta earthquake both reconfirmed the vulnerability of areas in the San Francisco-Monterey Bay region to liquefaction and provided an opportunity to test methodologies for predicting liquefaction that have been developed since the mid-1970's. This vulnerability is documented in the chapter edited by O'Rourke and by the investigators in this chapter who describe case histories of liquefaction damage and warn us about the potential for even greater damage from liquefaction if an earthquake similar to the 1989 Loma Prieta earthquake, but located closer to their study sites, were to occur.

  12. Probing the mechanical properties of seismically active crust with space geodesy: Study of the coseismic deformation due to the 1992 Mw7.3 Landers (southern California) earthquake

    Microsoft Academic Search

    Yuri Fialko

    2004-01-01

    The coseismic deformation due to the 1992 Mw7.3 Landers earthquake, southern California, is investigated using synthetic aperture radar (SAR) and Global Positioning System (GPS) measurements. The ERS-1 satellite data from the ascending and descending orbits are used to generate contiguous maps of three orthogonal components (east, north, up) of the coseismic surface displacement field. The coseismic displacement field exhibits symmetries

  13. The dependence of peak horizontal acceleration on magnitude, distance, and site effects for small-magnitude earthquakes in California and eastern North America

    USGS Publications Warehouse

    Campbell, K.W.

    1989-01-01

One hundred and ninety free-field accelerograms recorded on deep soil (>10 m deep) were used to study the near-source scaling characteristics of peak horizontal acceleration for 91 earthquakes (2.5 ≤ ML ≤ 5.0) located primarily in California. An analysis of residuals based on an additional 171 near-source accelerograms from 75 earthquakes indicated that accelerograms recorded in building basements sited on deep soil have 30 per cent lower accelerations, and that free-field accelerograms recorded on shallow soil (≤10 m deep) have 82 per cent higher accelerations than free-field accelerograms recorded on deep soil. An analysis of residuals based on 27 selected strong-motion recordings from 19 earthquakes in eastern North America indicated that near-source accelerations associated with frequencies less than about 25 Hz are consistent with predictions based on attenuation relationships derived from California. - from Author
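As a worked example, the residual findings reported above amount to multiplicative site factors relative to free-field deep-soil recordings. The function and category names below are our own illustration, not part of Campbell's regression:

```python
def site_adjusted_pga(pga_deep_soil_freefield, site):
    """Apply the site factors reported in the abstract to a reference
    peak horizontal acceleration recorded free-field on deep soil.

    Basement recordings on deep soil average ~30 per cent lower, and
    free-field recordings on shallow soil ~82 per cent higher, than
    free-field recordings on deep soil.
    """
    factors = {
        "deep_soil_freefield": 1.00,    # reference condition
        "deep_soil_basement": 0.70,     # 30 per cent lower
        "shallow_soil_freefield": 1.82, # 82 per cent higher
    }
    return pga_deep_soil_freefield * factors[site]
```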

  14. Seismicity remotely triggered by the magnitude 7.3 Landers, California, earthquake

    USGS Publications Warehouse

    Hill, D.P.; Reasenberg, P.A.; Michael, A.; Arabaz, W.J.; Beroza, G.; Brumbaugh, D.; Brune, J.N.; Castro, R.; Davis, S.; Depolo, D.; Ellsworth, W.L.; Gomberg, J.; Harmsen, S.; House, L.; Jackson, S.M.; Johnston, M.J.S.; Jones, L.; Keller, R.; Malone, S.; Munguia, L.; Nava, S.; Pechmann, J.C.; Sanford, A.; Simpson, R.W.; Smith, R.B.; Stark, M.; Stickney, M.; Vidal, A.; Walter, S.; Wong, V.; Zollweg, J.

    1993-01-01

    The magnitude 7.3 Landers earthquake of 28 June 1992 triggered a remarkably sudden and widespread increase in earthquake activity across much of the western United States. The triggered earthquakes, which occurred at distances up to 1250 kilometers (17 source dimensions) from the Landers mainshock, were confined to areas of persistent seismicity and strike-slip to normal faulting. Many of the triggered areas also are sites of geothermal and recent volcanic activity. Static stress changes calculated for elastic models of the earthquake appear to be too small to have caused the triggering. The most promising explanations involve nonlinear interactions between large dynamic strains accompanying seismic waves from the mainshock and crustal fluids (perhaps including crustal magma).

  15. Chapter C. The Loma Prieta, California, Earthquake of October 17, 1989 - Building Structures

    USGS Publications Warehouse

    Celebi, Mehmet

    1998-01-01

    Several approaches are used to assess the performance of the built environment following an earthquake -- preliminary damage surveys conducted by professionals, detailed studies of individual structures, and statistical analyses of groups of structures. Reports of damage that are issued by many organizations immediately following an earthquake play a key role in directing subsequent detailed investigations. Detailed studies of individual structures and statistical analyses of groups of structures may be motivated by particularly good or bad performance during an earthquake. Beyond this, practicing engineers typically perform stress analyses to assess the response of a particular structure to the vibrational levels experienced during an earthquake. The levels may be determined from recorded or estimated ground motions; actual levels usually differ from design levels. If a structure has seismic instrumentation to record response data, the estimated and recorded response and behavior of the structure can be compared.

  16. Chapter A. The Loma Prieta, California, Earthquake of October 17, 1989 - Lifelines

    USGS Publications Warehouse

    Schiff, Anshel J., (Edited By)

    1998-01-01

    To the general public who had their televisions tuned to watch the World Series, the 1989 Loma Prieta earthquake was a lifelines earthquake. It was the images seen around the world of the collapsed Cypress Street viaduct, with the frantic and heroic efforts to pull survivors from the structure that was billowing smoke; the collapsed section of the San Francisco-Oakland Bay Bridge and subsequent home video of a car plunging off the open span; and the spectacular fire in the Marina District of San Francisco fed by a broken gas line. To many of the residents of the San Francisco Bay region, the relation of lifelines to the earthquake was characterized by sitting in the dark because of power outage, the inability to make telephone calls because of network congestion, and the slow and snarled traffic. Had the public been aware of the actions of the engineers and tradespeople working for the utilities and other lifeline organizations on the emergency response and restoration of lifelines, the lifeline characteristics of this earthquake would have been even more significant. Unobserved by the public were the warlike devastation in several electrical-power substations, the 13 miles of gas-distribution lines that had to be replaced in several communities, and the more than 1,200 leaks and breaks in water mains and service connections that had to be excavated and repaired. Like the 1971 San Fernando, Calif., earthquake, which was a seminal event for activity to improve the earthquake performance of lifelines, the 1989 Loma Prieta earthquake demonstrated that the tasks of preparing lifelines in 'earthquake country' were incomplete; indeed, new lessons had to be learned.

  17. Surface Displacement of the 17 May 1993 Eureka Valley, California, Earthquake Observed by SAR Interferometry

    Microsoft Academic Search

    Gilles Peltzer; Paul Rosen

    1995-01-01

    Satellite synthetic aperture radar (SAR) interferometry shows that the magnitude 6.1 Eureka Valley earthquake of 17 May 1993 produced an elongated subsidence basin oriented north-northwest, parallel to the trend defined by the aftershock distribution, whereas the source mechanism of the earthquake implies a north-northeast-striking normal fault. The ±3-millimeter accuracy of the radar-observed displacement map over short spatial scales allowed identification

  18. Slip partitioning of the Calaveras fault, California, and prospects for future earthquakes

    SciTech Connect

    Oppenheimer, D.H.; Bakun, W.H.; Lindh, A.G. (Geological Survey, Menlo Park, CA (United States))

    1990-06-10

    Examination of main shock and microearthquake data from the Calaveras fault during the last 20 years reveals that main shock hypocenters occur at depths of 8-9 km near the base of the zone of microearthquakes. The spatial pattern of pre-main shock microseismicity surrounding the Coyote Lake and Morgan Hill hypocenters is similar to the pattern of the post-main shock microseismicity. Microseismicity extends between depths of 4 and 10 km and defines zones of concentrated microseismicity and aseismic zones. Estimates of the fault regions which slipped during the Coyote Lake and Morgan Hill earthquakes, as derived from seismic radiation, coincide with zones which are otherwise aseismic. The authors propose that these persistent aseismic zones represent stuck patches which slip only during moderate earthquakes. From the pattern of microearthquake locations they recognize six aseismic zones where they expect future main shocks will rupture the Calaveras fault. From an analysis of historic seismic data they establish the main shock rupture history for each aseismic zone and identify two zones that are the most likely sites for the next M > 5 earthquakes. The first zone is located near Gilroy and was last ruptured by a M5.2 earthquake in 1949. The second zone is located south of Calaveras Reservoir and north of the 1988 M5.1 Alum Rock earthquake. It has not slipped seismically since at least 1903, and the size of the aseismic region is sufficiently large to sustain a M5.5 earthquake.

  19. The Redwood Coast Tsunami Work Group: a unique organization promoting earthquake and tsunami resilience on California's North Coast

    NASA Astrophysics Data System (ADS)

    Dengler, L.; Henderson, C.; Larkin, D.; Nicolini, T.; Ozaki, V.

    2012-12-01

    The Northern California counties of Del Norte, Humboldt, and Mendocino account for over 30% of California's coastline and form one of the most seismically active areas of the contiguous 48 states. The region is at risk from earthquakes located on- and offshore and from tsunamis generated locally from faults associated with the Cascadia subduction zone (CSZ) and from distant sources elsewhere in the Pacific. In 1995 the California Geological Survey (CGS) published a scenario for a CSZ earthquake that included both strong ground shaking effects and a tsunami. As a result of the scenario, the Redwood Coast Tsunami Work Group (RCTWG), an organization of government agencies, tribes, service groups, academia, and the private sector, was formed to coordinate and promote earthquake and tsunami hazard awareness and mitigation in the three-county region. The projects of the RCTWG and its member agencies include education/outreach products and programs, tsunami hazard mapping, and signage and siren planning. Since 2008, the RCTWG has worked with the California Emergency Management Agency (Cal EMA) in conducting tsunami warning communications tests on the North Coast. In 2007, RCTWG members helped develop and carry out the first tsunami training exercise at FEMA's Emergency Management Institute in Emmitsburg, MD. The RCTWG has facilitated numerous multi-agency, multi-discipline coordinated exercises, and RCTWG county tsunami response plans have been a model for other regions of the state and country. Eight North Coast communities have been recognized as TsunamiReady by the National Weather Service, including the first National Park, the first State Park, and the only tribe in California to be so recognized. Over 500 tsunami hazard zone signs have been posted in the RCTWG region since 2008.
Eight assessment surveys from 1993 to 2010 have tracked preparedness actions and personal awareness of earthquake and tsunami hazards in the county, and additional surveys have tracked public awareness and tourist concerns about tsunami hazard signs. Over the seventeen-year period covered by the surveys, the percentage of respondents with houses secured to foundations increased from 58 to 84 percent, respondents aware of a local tsunami hazard increased from 51 to 89 percent, and those knowing what the Cascadia subduction zone is increased from 16 to 57 percent. In 2009, the RCTWG was recognized by the Western States Seismic Policy Council (WSSPC) with an award for innovation, and in 2010, the RCTWG-sponsored class "Living on Shaky Ground" was awarded WSSPC's overall Award in Excellence. The RCTWG works closely with CGS and Cal EMA on a number of projects including tsunami mapping, evacuation zone planning, siren policy, tsunami safety for boaters, and public education messaging. Current projects include working with CGS to develop a "playbook" tsunami mapping product to illustrate the expected effects from a range of tsunami source events and assist local governments in focusing future response actions to reflect the range of expected impacts from distant source events. Preparedness efforts paid off on March 11, 2011, when a tsunami warning was issued for the region and significant damage occurred in harbor regions of Del Norte County and Mendocino County. Full-scale evacuations were carried out in a coordinated manner, and the majority of the commercial fishing fleet in Crescent City was able to exit the harbor before the tsunami arrived.

  20. GPS Measurements of crustal motion associated with the 2010 Mw 7.2 Sierra El Mayor-Cucapah Earthquake, Baja California, Mexico

    NASA Astrophysics Data System (ADS)

    Spinler, J. C.; Bennett, R. A.; Gonzalez-Garcia, J. J.; Walls, C. P.; Lawrence, S.

    2010-12-01

    We present crustal motion data obtained from an analysis of continuous and campaign GPS data for southern California and northern Baja California, Mexico, including the first results from a new six-station extension of the NSF EarthScope PBO network across the international border into northern Baja California. This PBO extension was constructed in late summer and early fall 2010 in response to the April 4, 2010, Mw7.2 Sierra El Mayor-Cucapah earthquake. The locations of the six new PBO sites were selected to complement the dense existing continuous and campaign GPS networks in southern California and to provide needed continuous data for northern Mexico. Coseismic displacements for sites located nearest to the earthquake exceed 20 cm. In addition to allowing precise estimates of coseismic displacements, the ~17 years of pre-earthquake data for northern Baja California and southern California obtained from the archives at SCEC, UNAVCO, SOPAC, the University of Miami, and other sources provide an opportunity to assess the evolution of postseismic deformation over the coming years. We will present updated pre- and post-seismic velocity results for an assessment of post-seismic perturbations to the secular velocity field.

  1. The Loma Prieta, California, Earthquake of October 17, 1989: Societal Response

    USGS Publications Warehouse

    Coordinated by Mileti, Dennis S.

    1993-01-01

    Professional Paper 1553 describes how people and organizations responded to the earthquake and how the earthquake impacted people and society. The investigations evaluate the tools available to the research community to measure the nature, extent, and causes of damage and losses. They describe human behavior during and immediately after the earthquake and how citizens participated in emergency response. They review the challenges confronted by police and fire departments and disruptions to transbay transportation systems. And they survey the challenges of post-earthquake recovery. Some significant findings were:
    * Loma Prieta provided the first test of ATC-20, the red, yellow, and green tagging of buildings. Its successful application has led to widespread use in other disasters, including the September 11, 2001, New York City terrorist incident.
    * Most people responded calmly and without panic to the earthquake and acted to get themselves to a safe location.
    * Actions by people to help alleviate emergency conditions were proportional to the level of need at the community level.
    * Some solutions caused problems of their own. The police perimeter around the Cypress Viaduct isolated businesses from their customers, leading to a loss of business, and the evacuation of employees from those businesses hindered the movement of supplies to the disaster scene.
    * Emergency transbay ferry service was established 6 days after the earthquake but required constant revision of service contracts and schedules.
    * The Loma Prieta earthquake produced minimal disruption to the regional economy. The total economic disruption resulted in maximum losses to the Gross Regional Product of $725 million in 1 month and $2.9 billion in 2 months, but 80% of the loss was recovered during the first 6 months of 1990. Approximately 7,100 workers were laid off.

  2. Chapter F. The Loma Prieta, California, Earthquake of October 17, 1989 - Marina District

    USGS Publications Warehouse

    O'Rourke, Thomas D., (Edited By)

    1992-01-01

    During the earthquake, a total land area of about 4,300 km2 was shaken with seismic intensities that can cause significant damage to structures. The area of the Marina District of San Francisco is only 4.0 km2--less than 0.1 percent of the area most strongly affected by the earthquake--but its significance with respect to engineering, seismology, and planning far outstrips its proportion of shaken terrain and makes it a centerpiece for lessons learned from the earthquake. The Marina District provides perhaps the most comprehensive case history of seismic effects at a specific site developed for any earthquake. The reports assembled in this chapter, which provide an account of these seismic effects, constitute a unique collection of studies on site, infrastructure, and societal response that cover virtually all aspects of the earthquake, ranging from incoming ground waves to the outgoing airwaves used for emergency communication. The Marina District encompasses the area bounded by San Francisco Bay on the north, the Presidio on the west, and Lombard Street and Van Ness Avenue on the south and east, respectively. Nearly all of the earthquake damage in the Marina District, however, occurred within a considerably smaller area of about 0.75 km2, bounded by San Francisco Bay and Baker, Chestnut, and Buchanan Streets. At least five major aspects of earthquake response in the Marina District are covered by the reports in this chapter: (1) dynamic site response, (2) soil liquefaction, (3) lifeline performance, (4) building performance, and (5) emergency services.

  3. Processed seismic motion records from earthquakes (1982--1993): Recorded at Scotty's Castle, California

    SciTech Connect

    Lum, P.K.; Honda, K.K.

    1993-10-01

    As part of the contract with the US Department of Energy, Nevada Operations Office (DOE/NV), URS/John A. Blume & Associates, Engineers (URS/Blume) maintained a network of seismographs to monitor the ground motion generated by underground nuclear explosions (UNEs) at the Nevada Test Site (NTS). The seismographs were located in the communities surrounding the NTS and the Las Vegas valley. When these seismographs were not being used to monitor UNE-generated motions, a limited number were maintained to monitor motion from sources other than UNEs (e.g., earthquakes, wind, blasts). Scotty's Castle was one of the selected earthquake monitoring stations. During the period from 1982 through 1993, numerous earthquakes of varying magnitudes and distances were recorded at Scotty's Castle. The records from 24 earthquakes were processed and included in this report. Tables 1 and 2 list the processed earthquakes in chronological order and in order of epicentral distance, respectively. Figure 1 shows these epicenters and magnitudes. Because of the potential benefit of these data to the scientific community, DOE/NV and the National Park Service authorized the release of these records.

  4. Earthquake response

    NASA Astrophysics Data System (ADS)

    Simpson, David; Hough, Susan; Lerner-Lam, Arthur; Phinney, Robert

    The Loma Prieta earthquake in northern California gave geophysicists an unexpected chance to mobilize a team to take portable seismographs to an earthquake region. The magnitude-7.1 earthquake occurred Tuesday, October 17 at 5:04 P.M. Pacific Daylight Time. Less than 48 hours after the main shock, IRIS consortium seismologists from Lamont-Doherty Geological Observatory in Palisades, N.Y., were setting up new, portable equipment around San Francisco.The ability to move quickly to the earthquake area was an unanticipated bonus of two National Science Foundation programs: IRIS, the Incorporated Research Institutions for Seismology in Arlington, Va., and NCEER, the National Center for Earthquake Engineering Research in Buffalo, N.Y.

  5. Surface Displacement of the 17 May 1993 Eureka Valley, California, Earthquake Observed by SAR Interferometry

    NASA Astrophysics Data System (ADS)

    Peltzer, Gilles; Rosen, Paul

    1995-06-01

    Satellite synthetic aperture radar (SAR) interferometry shows that the magnitude 6.1 Eureka Valley earthquake of 17 May 1993 produced an elongated subsidence basin oriented north-northwest, parallel to the trend defined by the aftershock distribution, whereas the source mechanism of the earthquake implies a north-northeast-striking normal fault. The ±3-millimeter accuracy of the radar-observed displacement map over short spatial scales allowed identification of the main surface rupture associated with the event. These observations suggest that the rupture began at depth and propagated diagonally upward and southward on a west-dipping, north-northeast fault plane, reactivating the largest escarpment in the Saline Range.

  6. Comment on "Revisiting the 1872 owens valley, California, earthquake" by Susan E. Hough and Kate Hutton

    USGS Publications Warehouse

    Bakun, W.H.

    2009-01-01

    Bakun (2009) argues that the conclusions of Hough and Hutton (2008) are wrong because the study failed to take into account the Sierra Nevada attenuation model of Bakun (2006). In particular, Bakun (2009) argues that propagation effects can explain the relatively high intensities generated by the 1872 Owens Valley earthquake. Using an intensity attenuation model that attempts to account for attenuation through the Sierra Nevada, Bakun (2006) infers the magnitude estimate (Mw 7.4-7.5) that is currently accepted by the National Earthquake Information Center (NEIC).

  7. Chapter E. The Loma Prieta, California, Earthquake of October 17, 1989 - Geologic Setting and Crustal Structure

    USGS Publications Warehouse

    Wells, Ray E.

    2004-01-01

    Although some scientists considered the Ms=7.1 Loma Prieta, Calif., earthquake of 1989 to be an anticipated event, some aspects of the earthquake were surprising. It occurred 17 km beneath the Santa Cruz Mountains along a left-stepping restraining bend in the San Andreas fault system. Rupture on the southwest-dipping fault plane consisted of subequal amounts of right-lateral and reverse motion but did not reach the surface. In the area of maximum uplift, severe shaking and numerous ground cracks occurred along Summit Road and Skyland Ridge, several kilometers south of the main trace of the San Andreas fault. The relatively deep focus of the earthquake, the distribution of ground failure, the absence of throughgoing surface rupture on the San Andreas fault, and the large component of uplift raised several questions about the relation of the 1989 Loma Prieta earthquake to the San Andreas fault: Did the earthquake actually occur on the San Andreas fault? Where exactly is the San Andreas fault in the heavily forested Santa Cruz Mountains, and how does the fault relate to ground ruptures that occurred there in 1989 and 1906? What is the geometry of the San Andreas fault system at depth, and how does it relate to the major crustal blocks identified by geologic mapping? Subsequent geophysical and geologic investigations of crustal structure in the Loma Prieta region have addressed these and other questions about the relation of the earthquake to geologic structures observed in the southern Santa Cruz Mountains. The diverse papers in this chapter cover several topics: geologic mapping of the region, potential-field and electromagnetic modeling of crustal structure, and the velocity structure of the crust and mantle in and below the source region for the earthquake. Although these papers were mostly completed between 1992 and 1997, they provide critical documentation of the crustal structure of the Loma Prieta region.
Together, they present a remarkably coherent, three-dimensional picture of the earthquake source region--a geologically complex volume of crust with a long history of both right-lateral faulting and fault-normal compression, thrusting, and uplift.

  8. Southern California Permanent GPS Geodetic Array: Continuous measurements of regional crustal deformation between the 1992 Landers and 1994 Northridge earthquakes

    USGS Publications Warehouse

    Bock, Y.; Wdowinski, S.; Fang, P.; Zhang, J.; Williams, S.; Johnson, H.; Behr, J.; Genrich, J.; Dean, J.; Van Domselaar, M.; Agnew, D.; Wyatt, F.; Stark, K.; Oral, B.; Hudnut, K.; King, R.; Herring, T.; Dinardo, S.; Young, W.; Jackson, D.; Gurtner, W.

    1997-01-01

    The southern California Permanent GPS Geodetic Array (PGGA) was established in 1990 across the Pacific-North America plate boundary to continuously monitor crustal deformation. We describe the development of the array and the time series of daily positions estimated for its first 10 sites in the 19-month period between the June 28, 1992 (Mw=7.3), Landers and January 17, 1994 (Mw=6.7), Northridge earthquakes. We compare displacement rates at four site locations with those reported by Feigl et al. [1993], which were derived from an independent set of Global Positioning System (GPS) and very long baseline interferometry (VLBI) measurements collected over nearly a decade prior to the Landers earthquake. The velocity differences for three sites 65-100 km from the earthquake's epicenter are of the order of 3-5 mm/yr and are systematically coupled with the corresponding directions of coseismic displacement. The fourth site, 300 km from the epicenter, shows no significant velocity difference. These observations suggest large-scale postseismic deformation with a relaxation time of at least 800 days. The statistical significance of our observations is complicated by our incomplete knowledge of the noise properties of the two data sets; two possible noise models fit the PGGA data equally well, as described in the companion paper by Zhang et al. [this issue]; the pre-Landers data are too sparse and heterogeneous to derive a reliable noise model. Under a fractal white noise model for the PGGA data we find that the velocity differences for all three sites are statistically different at the 99% significance level. A white noise plus flicker noise model results in significance levels of only 94%, 43%, and 88%. Additional investigations of the pre-Landers data, and analysis of longer spans of PGGA data, could have an important effect on the significance of these results and will be addressed in future work. Copyright 1997 by the American Geophysical Union.

  9. Formation of left-lateral fractures within the Summit Ridge shear zone, 1989 Loma Prieta, California, earthquake

    SciTech Connect

    Johnson, A.M.; Fleming, R.W. [Purdue Univ., West Lafayette, IN (United States)]|[Geological Survey, Denver, CO (United States)

    1993-12-01

    The 1989 Loma Prieta, California, earthquake is characterized by the lack of major, throughgoing, coseismic, right-lateral faulting along strands of the San Andreas fault zone in the epicentral area. Instead, throughout the Summit Ridge area there are zones of tension cracks and left-lateral fracture zones oriented about N45 deg W, that is, roughly parallel to the San Andreas fault in this area. The left-lateral fracture zones are enigmatic because their left-lateral slip is opposite to the right-lateral sense of the relative motion between the Pacific and North American plates. We suggest that the enigmatic fractures can be understood if we assume that coseismic deformation was by right-lateral shear across a broad zone, about 0.5 km wide and 4 km long, beneath Summit Ridge. Contrary to most previous reports on the Loma Prieta earthquake, which assert that coseismic, right-lateral ground rupture was restricted to considerable (greater than 4 km) depths in the epicentral area, we find that nearly all the right-lateral offset is represented at the ground surface by the Summit Ridge shear zone.

  10. The SAFRR Tsunami Scenario: Improving Resilience for California from a Plausible M9 Earthquake near the Alaska Peninsula

    NASA Astrophysics Data System (ADS)

    Ross, S.; Jones, L.; Wilson, R. I.; Bahng, B.; Barberopoulou, A.; Borrero, J. C.; Brosnan, D.; Bwarie, J.; Geist, E. L.; Johnson, L.; Kirby, S. H.; Knight, W.; Long, K.; Lynett, P. J.; Miller, K.; Mortensen, C. E.; Nicolsky, D.; Oglesby, D. D.; Perry, S. C.; Plumlee, G. S.; Porter, K. A.; Real, C. R.; Ryan, K. J.; Suleimani, E.; Thio, H. K.; Titov, V.; Wein, A. M.; Whitmore, P.; Wood, N. J.

    2013-12-01

    The SAFRR Tsunami Scenario models a hypothetical but plausible tsunami, created by an Mw9.1 earthquake occurring offshore from the Alaskan peninsula, and its impacts on the California coast. We present the likely inundation areas, current velocities in key ports and harbors, physical damage and repair costs, economic consequences, environmental impacts, social vulnerability, emergency management, and policy implications for California associated with the tsunami scenario. The intended users are those who must make mitigation decisions before and rapid decisions during future tsunamis. Around a half million people would be present in the scenario's inundation area in residences, businesses, public venues, parks and beaches. Evacuation would likely be ordered for the State of California's maximum mapped tsunami inundation zone, evacuating an additional quarter million people from residences and businesses. Some island and peninsula communities would face particular evacuation challenges because of limited access options and short warning time, caused by the distance between Alaska and California. Evacuations may also be a challenge for certain dependent-care populations. One third of the boats in California's marinas could be damaged or sunk, costing at least $700 million in repairs to boats and docks, and potentially much more to address serious issues due to sediment transport and environmental contamination. Fires would likely start at many sites where fuel and petrochemicals are stored in ports and marinas. Tsunami surges and bores may travel several miles inland up coastal rivers. Debris clean-up and recovery of inundated and damaged areas will take days, months, or years depending on the severity of impacts and the available resources for recovery. The Ports of Los Angeles and Long Beach (POLA/LB) would be shut down for a minimum of two days due to strong currents.
Inundation of dry land in the ports would result in $100 million in damage to cargo and additional downtime. The direct exposure of port trade value totals over $1.2 billion, while associated business interruption losses in the California economy could more than triple that value. Other estimated damages include $1.8 billion of property damage and $85 million for highway and railroad repairs. In total, we have estimated repair and replacement costs of almost $3 billion to California marinas, coastal properties, and the POLA/LB. These damages could cause $6 billion of business interruption losses in the California economy, but that could be reduced by 80-90% with the implementation of business continuity or resilience strategies. This scenario provides the basis for improving preparedness, mitigation, and continuity planning for tsunamis, which can reduce damage and economic impacts and enhance recovery efforts. Two positive outcomes have already resulted from the SAFRR Tsunami Scenario. Emergency managers in areas where the scenario inundation exceeds the State's maximum inundation zone have been notified and evacuation plans have been updated appropriately. The State has also worked with NOAA's West Coast and Alaska Tsunami Warning Center to modify future message protocols to facilitate effective evacuations in California. While our specific results pertain to California, the lessons learned and our scenario approach can be applied to other regions.

  11. Monitoring velocity variations in the crust using earthquake doublets: An application to the Calaveras fault, California

    Microsoft Academic Search

    G. Poupinet; W. L. Ellsworth; J. Frechet

    1984-01-01

    We present a technique that greatly improves the precision in measuring temporal variations of crustal velocities using an earthquake doublet, or pair of microearthquakes that have nearly identical waveforms and the same hypocenter and magnitude but occur on different dates. We compute differences in arrival times between seismograms recorded at the same station in the frequency domain by cross correlation
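
    The delay measurement the abstract describes can be sketched as cross-correlation of the two doublet seismograms computed via the frequency domain, with parabolic refinement of the peak for sub-sample precision. This is an illustrative sketch, not the authors' code; the function name and synthetic waveforms are assumptions.

```python
import numpy as np

def doublet_delay(x, y, dt):
    """Estimate the delay (s) of waveform y relative to x via FFT-based
    cross-correlation, refining the peak by parabolic interpolation."""
    n = len(x)
    # circular cross-correlation computed in the frequency domain
    cc = np.fft.irfft(np.conj(np.fft.rfft(x)) * np.fft.rfft(y), n)
    cc = np.roll(cc, n // 2)                 # move zero lag to the center
    k = int(np.argmax(cc))
    if 0 < k < n - 1:                        # sub-sample peak refinement
        a, b, c = cc[k - 1], cc[k], cc[k + 1]
        k = k + 0.5 * (a - c) / (a - 2 * b + c)
    return (k - n // 2) * dt

# A doublet is a pair of nearly identical records; here a 3-sample shift at 100 Hz
t = np.arange(256)
pulse = np.exp(-((t - 128) / 8.0) ** 2)
print(doublet_delay(pulse, np.roll(pulse, 3), dt=0.01))  # ~0.03 s
```

    In practice the measurement is made on narrow time windows around each phase arrival, so velocity changes along different parts of the ray path can be separated.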

  12. Chapter B. The Loma Prieta, California, Earthquake of October 17, 1989 - Public Response

    USGS Publications Warehouse

    Bolton, Patricia A., (Edited By)

    1993-01-01

    Major earthquakes provide seismologists and engineers an opportunity to examine the performance of the Earth and the man-made structures in response to the forces of the quake. So, too, do they provide social scientists an opportunity to delve into human responses evoked by the ground shaking and its physical consequences. The findings from such research can serve to guide the development and application of programs and practices designed to reduce death, injury, property losses, and social disruption in subsequent earthquakes. This chapter contains findings from studies focused mainly on public response to the Loma Prieta earthquake; that is, on the behavior and perceptions of the general population rather than on the activities of specific organizations or on the impact on procedures or policies. A major feature of several of these studies is that the information was collected from the population throughout the Bay area, not just from persons in the most badly damaged communities or who had suffered the greatest losses. This wide range serves to provide comparisons of behavior for those most directly affected by the earthquake with others who were less directly affected by it but still had to consider it very 'close to home.'

  13. Direct and indirect evidence for earthquakes; an example from the Lake Tahoe Basin, California-Nevada

    NASA Astrophysics Data System (ADS)

    Maloney, J. M.; Noble, P. J.; Driscoll, N. W.; Kent, G.; Schmauder, G. C.

    2012-12-01

    High-resolution seismic CHIRP data can image direct evidence of earthquakes (i.e., offset strata) beneath lakes and the ocean. Nevertheless, direct evidence often is not imaged due to conditions such as gas in the sediments, or steep basement topography. In these cases, indirect evidence for earthquakes (i.e., debris flows) may provide insight into the paleoseismic record. The four sub-basins of the tectonically active Lake Tahoe Basin provide an ideal opportunity to image direct evidence for earthquake deformation and compare it to indirect earthquake proxies. We present results from high-resolution seismic CHIRP surveys in Emerald Bay, Fallen Leaf Lake, and Cascade Lake to constrain the recurrence interval on the West Tahoe Dollar Point Fault (WTDPF), which was previously identified as potentially the most hazardous fault in the Lake Tahoe Basin. Recently collected CHIRP profiles beneath Fallen Leaf Lake image slide deposits that appear synchronous with slides in other sub-basins. The temporal correlation of slides between multiple basins suggests triggering by events on the WTDPF. If correct, we postulate a recurrence interval for the WTDPF of ~3-4 k.y., indicating that the WTDPF is near its seismic recurrence cycle. In addition, CHIRP data beneath Cascade Lake image strands of the WTDPF that offset the lakefloor as much as ~7 m. The Cascade Lake data combined with onshore LiDAR allowed us to map the geometry of the WTDPF continuously across the southern Lake Tahoe Basin and yielded an improved geohazard assessment.

  14. Stability and Uncertainty of Finite-Fault Slip Inversions: Application to the 2004 Parkfield, California, Earthquake

    E-print Network

    Larson, Kristine

    We examine aspects of the finite-fault slip inversion problem under different a priori model assumptions. Two inversion procedures are compared: a linear least-squares subfault-based methodology and a nonlinear global approach used to solve the finite-fault slip inversion problem. For the Parkfield earthquake and the inversion

  15. In Search for Thermal Precursors to Earthquakes in California Using MODIS Land Surface Temperature Data

    Microsoft Academic Search

    D. A. Adams; M. Eneva

    2007-01-01

    We test claims that earthquakes are preceded by thermal anomalies by analyzing daily nighttime land surface temperatures (LSTs) derived from data collected by the MODIS (Moderate Resolution Imaging Spectroradiometer) instruments mounted on the Terra and Aqua satellites. Terra precedes Aqua by ~3 hours, so the LST difference between the two satellites provides an estimate of nighttime cooling/warming rates. The MODIS

  16. Three dimensional images of geothermal systems: local earthquake P-wave velocity tomography at the Hengill and Krafla geothermal areas, Iceland, and The Geysers, California

    USGS Publications Warehouse

    Julian, B.R.; Prisk, A.; Foulger, G.R.; Evans, J.R.

    1993-01-01

    Local earthquake tomography - the use of earthquake signals to form a 3-dimensional structural image - is now a mature geophysical analysis method, particularly suited to the study of geothermal reservoirs, which are often seismically active and severely laterally inhomogeneous. Studies have been conducted of the Hengill (Iceland), Krafla (Iceland) and The Geysers (California) geothermal areas. All three systems are exploited for electricity and/or heat production, and all are highly seismically active. Tomographic studies of volumes a few km in dimension were conducted for each area using the method of Thurber (1983).

  17. Migrating swarms of brittle-failure earthquakes in the lower crust beneath Mammoth Mountain, California

    NASA Astrophysics Data System (ADS)

    Shelly, D. R.; Hill, D. P.

    2011-12-01

    Brittle-failure earthquakes in the lower crust, where high pressures and temperatures would typically promote ductile deformation, are relatively rare but occasionally observed beneath active volcanic centers. When they occur, these earthquakes provide a unique opportunity to constrain volcanic processes in the lower crust, such as fluid injection and migration. Here, we examine recent brief earthquake swarms occurring deep beneath Mammoth Mountain, located on the southwestern margin of Long Valley Caldera. Brief lower-crustal swarms were observed beneath Mammoth in 2006, 2008, and 2009. These brittle-failure earthquakes at depths of 19 to 30 km are likely occurring within the more mafic mid- to lower crust, which can remain in the brittle domain to temperatures as high as ~700 degrees C. Above these deep events are two distinct shallower zones of seismicity. Mid-crustal, long-period earthquakes between 10 and 19 km are presumably occurring within the silicic crust, but below the rheological transition from brittle to plastic behavior, which is expected to occur at temperatures of ~350 to 400 degrees C. Above this transition, shallow brittle-failure earthquakes occur in the upper 8 kilometers of the silicic crust. We focus primarily on a deep swarm that occurred September 29-30, 2009, which is the best recorded of the recent lower-crustal swarms. To maximally illuminate the spatial-temporal progression of seismicity, we supplement the earthquake catalog by identifying additional small events with similar waveforms in the continuous data, achieving up to a 10-fold increase in the number of locatable events. We then relocate all events, using cross-correlation and a double-difference algorithm. We find that the 2009 swarm exhibits systematically decelerating upward migration, with hypocenters shallowing from 21 to 19 km depth over approximately 12 hours. 
We also observe substantial diversity in the pattern of P-wave first motions, where events with very similar hypocenters and origin times exhibit nearly opposite patterns of compressional and dilational first motions at network seismometers. These lower-crustal, brittle-failure earthquakes are similar in many respects to those that occurred beneath the Sierra Nevada crest in the vicinity of Lake Tahoe in late 2003, which Smith et al. (Science, 2004) concluded were associated with a magmatic intrusion into the lower crust. The 2009 Mammoth sequence, however, is much shorter in duration (1-2 days compared with several months), faster migrating, and has no detectable accompanying geodetic signal. This suggests that the events may be triggered by upward diffusion of a lower-viscosity fluid. CO2 is a likely candidate, given its abundant release in the area at the surface. Thus our preferred hypothesis is that this earthquake swarm is a symptom of ascending high-pressure CO2, perhaps reflecting slip induced on pre-existing fractures by reducing the effective normal stress. Indeed, the concentration of earthquakes with similar epicenters at a wide range of depths beneath Mammoth Mountain suggests that this may be a preferred pathway for CO2, and occasionally melt, to travel upward through the crust.
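
    The catalog-augmentation step described above, scanning continuous data for waveforms similar to known events, is essentially matched filtering. A minimal single-channel sketch (function and variable names are illustrative; the authors' actual pipeline additionally relocates the detected events):

```python
import numpy as np

def matched_filter(cont, tmpl, thresh=0.8):
    """Slide a template event waveform along continuous data and flag
    windows whose normalized correlation coefficient exceeds thresh."""
    n = len(tmpl)
    tn = (tmpl - tmpl.mean()) / (np.std(tmpl) * n)    # pre-normalized template
    detections = []
    for i in range(len(cont) - n + 1):
        w = cont[i:i + n]
        s = np.std(w)
        if s == 0.0:
            continue                                   # flat window, skip
        cc = float(np.dot(tn, (w - w.mean()) / s))     # correlation in [-1, 1]
        if cc > thresh:
            detections.append((i, cc))
    return detections
```

    In practice the detection statistic is stacked over many stations and channels, and the threshold is set relative to the noise level (for example, a multiple of the median absolute deviation of the stacked correlation trace).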

  18. Superficial simplicity of the 2010 El Mayor-Cucapah earthquake of Baja California in Mexico

    E-print Network

    Herring, Thomas A.

    The geometry of faults is usually thought to be more complicated at the surface than at depth and to control the initiation, propagation and arrest of seismic ruptures. The fault system that runs from southern California ...

  19. Probability Assessment of Mega-thrust Earthquakes in Global Subduction Zones?-from the View of Slip Deficit-

    NASA Astrophysics Data System (ADS)

    Ikuta, R.; Mitsui, Y.; Ando, M.

    2014-12-01

    We studied the inter-plate slip history over roughly 100 years using earthquake catalogs. On the assumption that each earthquake has a stick-slip patch centered on its centroid, we regard the cumulative seismic slip around the centroid as representing the inter-plate dislocation. We evaluated the slip on the stick-slip patches of over-M5-class earthquakes prior to three recent mega-thrust earthquakes: the 2004 Sumatra (Mw9.2), the 2010 Chile (Mw8.8), and the 2011 Tohoku (Mw9.0) events. Comparing the cumulative seismic slip with the plate convergence, the slip before the mega-thrust events fell significantly short over areas comparable in size to the mega-thrust ruptures. We also examined the cumulative seismic slip after three other mega-thrust earthquakes of the past 100 years: the 1952 Kamchatka (Mw9.0), the 1960 Chile (Mw9.5), and the 1964 Alaska (Mw9.2) events. The cumulative slip has remained significantly short in and around the focal areas since their occurrence. This result should reflect the persistence of strongly or extensively coupled inter-plate areas capable of mega-thrust earthquakes. Applying the same procedure to global subduction zones, we find that 21 regions, including the focal areas of the mega-thrust earthquakes above, show a slip deficit over areas large enough for M9-class earthquakes. Considering that at least six M9-class earthquakes occurred in the past 100 years and that each recurrence interval should be 500-1000 years, it would not be surprising if five to ten times the number of already known regions (30 to 60 regions) were capable of M9-class earthquakes. The 21 regions identified as prospective M9-class focal areas in our study number fewer than five to ten times the known six, but some of these regions may divide into several M9-class focal areas because they extend over areas much larger than a typical M9-class rupture.
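
    The slip bookkeeping behind this kind of analysis converts each catalog magnitude to seismic moment via the Hanks-Kanamori relation and then to average slip over the patch using M0 = mu * A * D. A sketch with an assumed rupture area, rigidity, and convergence rate (all illustrative numbers, not values from the study):

```python
def avg_slip_m(mw, area_km2, mu=3.0e10):
    """Average slip (m) implied by moment magnitude Mw over a patch of area A.
    mu is an assumed rigidity in Pa."""
    m0 = 10 ** (1.5 * mw + 9.1)           # seismic moment, N*m (Hanks & Kanamori)
    return m0 / (mu * area_km2 * 1.0e6)   # D = M0 / (mu * A)

# An Mw 9.0 rupture over an assumed 500 km x 200 km patch implies ~13 m of
# average slip; at an assumed 8 cm/yr convergence rate that represents
# roughly 165 years of accumulated plate motion.
slip = avg_slip_m(9.0, 500 * 200)
print(slip, slip / 0.08)
```

    Summing such slips for all over-M5 events on a patch and subtracting the product of convergence rate and elapsed time gives the slip deficit the abstract refers to.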

  20. Multi-sensor Integration of Space and Ground Observations of Pre-earthquake Anomalies Associated with M6.0, August 24, 2014 Napa, California

    NASA Astrophysics Data System (ADS)

    Ouzounov, Dimitar; Tramutoli, Valerio; Pulinets, Sergey; Liu, Tiger; Filizzola, Carolina; Genzano, Nicola; Lisi, Mariano; Petrov, Leonid; Kafatos, Menas

    2015-04-01

    We integrate multiple space-borne and ground sensors for monitoring pre-earthquake geophysical anomalies that can provide significant early notification for earthquakes larger than M5.5 worldwide. The latest M6.0 event of August 24, 2014 in South Napa, California generated pre-earthquake signatures during our ongoing tests for California, and an experimental warning was documented about 17 days in advance. We process, in a controlled environment, different satellite and ground data for California (and several other test areas) by using: a) data from the NPOES sensors recording OLR (Outgoing Longwave Radiation) in the infrared; b) GPS/GNSS and FORMOSAT (GPS/TEC); c) Earth Observing System assimilation models from NASA; d) ground-based gas observations and meteorological data; e) TIR (Thermal Infrared) data from geostationary satellite (GOES). On Aug 4th, we detected (prospectively) a large anomaly of the OLR transient field at the TOA over Northern California. The location was shifted in the northeast direction about 150 km from the Aug 23rd epicentral area. Compared to the reference field of August 2004 to 2014, the hotspot anomaly was the largest energy flux anomaly over the entire continental United States at this time. Based on the temporal and spatial estimates of the anomaly, on August 4th we issued an internal warning for a M5.5+ earthquake in Northern California within the next 1-4 weeks. TIR retrospective analysis showed significant (spatially extended and temporally persistent) sequences of TIR anomalies starting August 1st in the future epicentral area, approximately the same area affected by OLR anomalies in the following days. GPS/TEC retrospective analysis based on GIM and TGIM products shows anomalous TEC variations 1-3 days before the event over a region north of the Napa earthquake epicenter. 
The calculated index of atmospheric chemical potential, based on the NASA GEOS-5 numerical assimilation weather model, indicates abnormal variations near the epicentral area in the days before the quake. Our real-time and post-event integration of several atmospheric parameters from satellite and ground observations during the M6.0 event of 08.24.2014 in Napa, California demonstrated the synergy of related variations of these parameters, implying their connection with the earthquake preparation process.

  1. Local Public Health System Response to the Tsunami Threat in Coastal California following the Tōhoku Earthquake

    PubMed Central

    Hunter, Jennifer C.; Crawley, Adam W.; Petrie, Michael; Yang, Jane E.; Aragón, Tomás J.

    2012-01-01

    Background On Friday March 11, 2011 a 9.0 magnitude earthquake triggered a tsunami off the eastern coast of Japan, resulting in thousands of lives lost and billions of dollars in damage around the Pacific Rim. The tsunami first reached the California coast on Friday, March 11th, causing more than $70 million in damage and at least one death. While the tsunami’s impact on California pales in comparison to the destruction caused in Japan and other areas of the Pacific, the event tested emergency responders’ ability to rapidly communicate and coordinate a response to a potential threat. Methods To evaluate the local public health system emergency response to the tsunami threat in California, we surveyed all local public health, emergency medical services (EMS), and emergency management agencies in coastal or floodplain counties about several domains related to the tsunami threat in California, including: (1) the extent to which their community was affected by the tsunami, (2) when and how they received notification of the event, (3) which public health response activities were carried out to address the tsunami threat in their community, and (4) which organizations contributed to the response. Public health activities were characterized using the Centers for Disease Control and Prevention (CDC) Public Health Preparedness Capabilities (PHEP) framework. Findings The tsunami's impact on coastal communities in California ranged widely, both in terms of the economic consequences and the response activities. Based on estimates from the National Oceanic and Atmospheric Administration (NOAA), ten jurisdictions in California reported tsunami-related damage, which ranged from $15,000 to $35 million. Respondents first became aware of the tsunami threat in California between the hours of 10:00pm Pacific Standard Time (PST) on Thursday March 10th and 2:00pm PST on Friday March 11th, a range of 16 hours, with notification occurring through both formal and informal channels. 
In response to this threat, the activities most commonly reported by the local government agencies included in this study were: emergency public information and warning, emergency operations coordination, and inter-organizational information sharing, which were reported by 86%, 75%, and 65% of all respondents, respectively. When looking at the distribution of responsibility, emergency management agencies were the most likely to report assuming a lead role in these common activities as well as those related to evacuation and community recovery. While activated less frequently, public health agencies carried out emergency response functions related to surveillance and epidemiology, environmental health, and mental health/psychological support. Both local public health and EMS agencies took part in mass care and medical material management activities. A large network of organizations contributed to response activities, with emergency management, law enforcement, fire, public health, public works, EMS, and media cited by more than half of respondents. Conclusions In response to the tsunami threat in California, we found that emergency management agencies assumed a lead role in the local response efforts. While public health and medical agencies played a supporting role in the response, they uniquely contributed to a number of specific activities. If the response to the recent tsunami is any indication, these support activities can be anticipated in planning for future events with similar characteristics to the tsunami threat. Additionally, we found that many respondents first learned of the tsunami through the media, rather than through rapid notification systems, which suggests that government agencies must continue to develop and maintain the ability to rapidly aggregate and analyze information in order to provide accurate assessments and guidance to a potentially well-informed public. Citation: Hunter JC, Crawley AW, Petrie M, Yang JE, Aragón TJ. 
Local Public Health System Response to the Tsunami Threat in Coastal California following the Tōhok

  2. Adaptively Smoothed Seismicity Earthquake Forecasts for Italy

    E-print Network

    Werner, M J; Jackson, D D; Kagan, Y Y; Wiemer, S

    2010-01-01

    We present a model for estimating the probabilities of future earthquakes of magnitudes m > 4.95 in Italy. The model, a slightly modified version of the one proposed for California by Helmstetter et al. (2007) and Werner et al. (2010), approximates seismicity by a spatially heterogeneous, temporally homogeneous Poisson point process. The temporal, spatial and magnitude dimensions are entirely decoupled. Magnitudes are independently and identically distributed according to a tapered Gutenberg-Richter magnitude distribution. We estimated the spatial distribution of future seismicity by smoothing the locations of past earthquakes listed in two Italian catalogs: a short instrumental catalog and a longer instrumental and historical catalog. The bandwidth of the adaptive spatial kernel is estimated by optimizing the predictive power of the kernel estimate of the spatial earthquake density in retrospective forecasts. When available and trustworthy, we used small earthquakes m>2.95 to illuminate active fault structur...
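
    The adaptive-kernel idea, each past epicenter smoothed with a Gaussian whose bandwidth is the distance to its k-th nearest neighbor, can be sketched as follows. Names and parameters are illustrative; the model described above additionally optimizes the bandwidth choice against retrospective forecasts.

```python
import numpy as np

def adaptive_rate_map(epicenters, grid, k=2):
    """Spatial earthquake density: a sum of 2-D Gaussian kernels, one per past
    epicenter, each with its own bandwidth (distance to k-th nearest neighbor)."""
    rates = np.zeros(len(grid))
    for ev in epicenters:
        d = np.sort(np.linalg.norm(epicenters - ev, axis=1))
        h = max(d[min(k, len(d) - 1)], 1e-6)   # d[0] is the event itself
        r2 = np.sum((grid - ev) ** 2, axis=1)
        rates += np.exp(-0.5 * r2 / h ** 2) / (2 * np.pi * h ** 2)
    return rates  # events per unit area; integrates to the event count
```

    Multiplying the normalized map by the expected total number of events and the cell area turns it into a per-cell forecast of the kind RELM-style experiments evaluate.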

  3. Forecasting damaging earthquakes in the central and eastern United States

    USGS Publications Warehouse

    Nishenko, S.P.; Bollinger, G.A.

    1990-01-01

    Analysis of seismograph network data, earthquake catalogs from 1727 to 1982, and paleoseismic data for the central and eastern United States indicate that the Poisson probability of a damaging earthquake (magnitude ≥ 6.0) occurring during the next 30 years is at a moderate to high level (0.4 to 0.6). When differences in seismic wave attenuation are taken into account, the central and eastern United States has approximately two-thirds the likelihood of California to produce an earthquake with comparable damage area and societal impact within the next 30 years.

  4. A RELM earthquake forecast based on pattern informatics

    E-print Network

    Holliday, J R; Donnelan, A; Rundle, J B; Tiampo, K F; Turcotte, D L; Chen, Chien-chih; Donnelan, Andrea; Holliday, James R.; Rundle, John B.; Tiampo, Kristy F.; Turcotte, Donald L.

    2005-01-01

    We present a RELM forecast of future earthquakes in California that is primarily based on the pattern informatics (PI) method. This method identifies regions that have systematic fluctuations in seismicity, and it has been demonstrated to be successful. A PI forecast map originally published on 19 February 2002 for southern California successfully forecast the locations of sixteen of eighteen M>5 earthquakes during the past three years. The method has also been successfully applied to Japan and on a worldwide basis. An alternative approach to earthquake forecasting is the relative intensity (RI) method. The RI forecast map is based on recent levels of seismic activity of small earthquakes. Recent advances in the PI method show considerable improvement, particularly when compared with the RI method using relative operating characteristic (ROC) diagrams for binary forecasts. The RELM application requires a probability for each location for a number of magnitude bins over a five year period. We have therefore co...

  5. The 1987 Whittier Narrows earthquake in the Los Angeles metropolitan area, California

    USGS Publications Warehouse

    Hauksson, E.; Jones, L.M.; Davis, T.L.; Hutton, L.K.; Brady, A.G.; Reasenberg, P.A.; Michael, A.J.; Yerkes, R.F.; Williams, Pat; Reagor, G.; Stover, C.W.; Bent, A.L.; Shakal, A.K.; Etheredge, E.; Porcella, R.L.; Bufe, C.G.; Johnston, M.J.S.; Cranswick, E.

    1988-01-01

    The Whittier Narrows earthquake sequence (local magnitude, ML=5.9), which caused over $358 million in damage, indicates that assessments of earthquake hazards in the Los Angeles metropolitan area may be underestimated. The sequence ruptured a previously unidentified thrust fault that may be part of a large system of thrust faults that extends across the entire east-west length of the northern margin of the Los Angeles basin. Peak horizontal accelerations from the main shock, which were measured at ground level and in structures, were as high as 0.6g (where g is the acceleration of gravity at sea level) within 50 kilometers of the epicenter. The distribution of the modified Mercalli intensity VII reflects a broad north-south elongated zone of damage that is approximately centered on the main shock epicenter.

  6. Preseismic and coseismic deformation associated with the Coyote Lake, California, earthquake

    NASA Astrophysics Data System (ADS)

    King, N. E.; Savage, J. C.; Lisowski, M.; Prescott, W. H.

    1981-02-01

    Supplement is available with entire article on microfiche. Order from American Geophysical Union, 2000 Florida Avenue, N.W., Washington, D.C. 20006. Document J79-008; $01.00. Payment must accompany order. The Coyote Lake earthquake (ML = 5.9; August 6, 1979; epicenter about 100 km southeast of San Francisco) occurred on the Calaveras fault within a geodetic network that had been surveyed annually since 1972 to monitor strain accumulation. The rupture surface as defined by aftershocks is a vertical rectangle 20 km in length extending from a depth of 4 km to about 12 km. The observed deformation of the geodetic network constrains the average slip to be about 0.33 ± 0.05 m right lateral. Although the geodetic data furnished an exceptionally detailed picture of the preearthquake deformation, no significant premonitory anomaly associated with the Coyote Lake earthquake can be identified.

  7. Non-shear focal mechanisms of earthquakes at The Geysers, California and Hengill, Iceland, geothermal areas

    USGS Publications Warehouse

    Julian, B.R.; Miller, A.D.; Foulger, G.R.

    1993-01-01

    Several thousand earthquakes were recorded in each area. We report an initial investigation of the focal mechanisms based on P-wave polarities. Distortion by complicated three-dimensional crustal structure was minimized using tomographically derived three-dimensional crustal models. Events with explosive and implosive source mechanisms, suggesting cavity opening and collapse, have been tentatively identified at The Geysers. The new data show that some of these events do not fit the model of tensile cracking accompanied by isotropic pore pressure decreases that was suggested in earlier studies, but that they may instead involve a combination of explosive and shear processes. However, the confirmation of earthquakes dominated by explosive components supports the model that the events are caused by crack opening induced by thermal contraction of the heat source.

  8. The use of geologic and seismologic information to reduce earthquake Hazards in California

    USGS Publications Warehouse

    Kockelman, W.J.; Campbell, C.C.

    1984-01-01

    Five examples illustrate how geologic and seismologic information can be used to reduce the effects of earthquakes. Included are procedures for anticipating damage to critical facilities; preparing, adopting, or implementing seismic safety studies, plans, and programs; retrofitting highway bridges; regulating development in areas subject to fault rupture; and strengthening or removing unreinforced masonry buildings. The collective effect of these procedures is to improve the public safety, health, and welfare of individuals and their communities. © 1984 Springer-Verlag New York Inc.

  9. A Double-Difference Earthquake Location Algorithm: Method and Application to the Northern Hayward Fault, California

    Microsoft Academic Search

    Felix Waldhauser; William L. Ellsworth

    2000-01-01

    We have developed an efficient method to determine high-resolution hypocenter locations over large distances. The location method incorporates ordinary absolute travel-time measurements and/or cross-correlation P- and S-wave differential travel-time measurements. Residuals between observed and theoretical travel-time differences (or double-differences) are minimized for pairs of earthquakes at each station while linking together all observed event-station pairs. A least-squares solution is found
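
    The double-difference idea, iteratively adjusting hypocenters so that predicted travel-time differences between event pairs match the observed ones, can be illustrated with a toy 2-D, uniform-velocity Gauss-Newton inversion for a single event pair. All names and values are illustrative; the real algorithm (hypoDD) links many pairs, weights the data, and uses layered velocity models.

```python
import numpy as np

V = 5.0  # km/s, assumed uniform velocity for this toy model

def relocate_pair(ev1, ev2, stations, obs_diff, n_iter=15):
    """Adjust two 2-D hypocenters so predicted travel-time differences
    t1 - t2 at each station match the observed differential times."""
    ev1, ev2 = ev1.astype(float).copy(), ev2.astype(float).copy()
    for _ in range(n_iter):
        G, d = [], []
        for st, td in zip(stations, obs_diff):
            r1, r2 = ev1 - st, ev2 - st
            n1, n2 = np.linalg.norm(r1), np.linalg.norm(r2)
            # partial derivatives of (t1 - t2) w.r.t. [dx1, dy1, dx2, dy2]
            G.append(np.concatenate([r1 / (V * n1), -r2 / (V * n2)]))
            d.append(td - (n1 - n2) / V)     # double-difference residual
        dm = np.linalg.lstsq(np.array(G), np.array(d), rcond=None)[0]
        ev1 += dm[:2]
        ev2 += dm[2:]
    return ev1, ev2
```

    Because differential times cancel common path effects, the relative vector between the two events is recovered far more precisely than either absolute location, which is exactly the property the method exploits.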

  10. Chapter A. The Loma Prieta, California, Earthquake of October 17, 1989 - Loss Estimation and Procedures

    USGS Publications Warehouse

    Tubbesing, Susan K., (Edited By)

    1994-01-01

    This Professional Paper includes a collection of papers on subjects ranging from evaluation of building safety, to human injuries, to correlation of ground deformation with building damage. What these papers share is a common goal to improve the tools available to the research community to measure the nature, extent, and causes of damage and losses due to earthquakes. These measurement tools are critical to reducing future loss.

  11. Chapter D. The Loma Prieta, California, Earthquake of October 17, 1989 - Recovery, Mitigation, and Reconstruction

    USGS Publications Warehouse

    Nigg, Joanne M., (Edited By)

    1998-01-01

    The papers in this chapter reflect the broad spectrum of issues that arise following a major damaging urban earthquake-the regional economic consequences, rehousing problems, reconstruction strategies and policies, and opportunities for mitigation before the next major seismic event. While some of these papers deal with structural or physical science topics, their significant social and policy implications make them relevant for improving our understanding of the processes and dynamics that take place during the recovery period.

  12. Chapter A. The Loma Prieta, California, Earthquake of October 17, 1989 - Main Shock Characteristics

    USGS Publications Warehouse

    Spudich, Paul, (Edited By)

    1996-01-01

    The October 17, 1989, Loma Prieta, Calif., earthquake (0004:15.2 G.m.t. October 18; lat 37.036° N., long 121.883° W.; 19-km depth) had a local magnitude (ML) of about 6.7, a surface-wave magnitude (MS) of 7.1, a seismic moment of 2.2×10^19 N-m to 3.5×10^19 N-m, a source duration of 6 to 15 s, and an average stress drop of at least 50 bars. Slip occurred on a dipping fault surface about 35 km long and was largely confined to a depth of about 7 to 20 km. The slip vector had a large vertical component, and slip was distributed in two main regions situated northwest and southeast of the hypocenter. This slip distribution caused about half of the earthquake's energy to be focused toward the urbanized San Francisco Bay region, while the other half was focused toward the southeast. Had the rupture initiated at the southeast end of the aftershock zone, shaking in the bay region would have been both longer and stronger. These source parameters suggest that the earthquake was not a typical shallow San Andreas-type event but a deeper event on a different fault with a recurrence interval of many hundreds of years. Therefore, the potential for a damaging shallow event on the San Andreas fault in the Santa Cruz Mountains may still exist.

  13. A century of oilfield operations and earthquakes in the greater Los Angeles Basin, southern California

    USGS Publications Warehouse

    Hauksson, Egill; Goebel, Thomas; Ampuero, Jean-Paul; Cochran, Elizabeth S.

    2015-01-01

    Most of the seismicity in the Los Angeles Basin (LA Basin) occurs at depth below the sediments and is caused by transpressional tectonics related to the big bend in the San Andreas fault. However, some of the seismicity could be associated with fluid extraction or injection in oil fields that have been in production for almost a century and cover ~17% of the basin. In a recent study, first the influence of industry operations was evaluated by analyzing seismicity characteristics, including normalized seismicity rates, focal depths, and b-values, but no significant difference was found in seismicity characteristics inside and outside the oil fields. In addition, to identify possible temporal correlations, the seismicity and available monthly fluid extraction and injection volumes since 1977 were analyzed. Second, the production and deformation history of the Wilmington oil field were used to evaluate whether other oil fields are likely to experience similar surface deformation in the future. Third, the maximum earthquake magnitudes of events within the perimeters of the oil fields were analyzed to see whether they correlate with total net injected volumes, as suggested by previous studies. Similarly, maximum magnitudes were examined to see whether they exhibit an increase with net extraction volume. Overall, no obvious previously unidentified induced earthquakes were found, and the management of balanced production and injection of fluids appears to reduce the risk of induced-earthquake activity in the oil fields.

  14. Coseismic deformation during the 1989 Loma Prieta earthquake and range-front thrusting along the southwestern margin of the Santa Clara Valley, California

    USGS Publications Warehouse

    Langenheim, V.E.; Schmidt, K.M.; Jachens, R.C.

    1997-01-01

    Damage patterns caused by the 1989 Loma Prieta earthquake along the southwestern margin of the Santa Clara Valley, California, form three zones that coincide with mapped and inferred traces of range-front thrust faults northeast of the San Andreas fault. Damage in these zones was largely contractional, consistent with past displacement associated with these faults. The damage zones coincide with gravity and aeromagnetic anomalies; modeling of the anomalies defines a southwest-dipping thrust fault that places the Franciscan Complex over Cenozoic sedimentary rocks to minimum depths of 2 km. Diffuse Loma Prieta earthquake aftershocks encompass the downward projection of this modeled thrust to depths of 9 km. Our results indicate that in this region the potential for concentrated damage arising from either primary deformation along the thrust faults themselves or by sympathetic motion triggered by earthquakes on the San Andreas fault may be higher than previously recognized.

  15. Avian Flu / Earthquake Prediction

    NSDL National Science Digital Library

    This radio broadcast includes a discussion of the avian flu spreading through Southeast Asia, Russia, and parts of Europe. Topics include whether the outbreak is a pandemic in the making, and what preparations might be made to control the outbreak. The next segment of the broadcast discusses earthquake prediction, in light of the 2005 earthquake in Pakistan. Two seismologists discuss what was learned in the Parkfield project, an experiment in earthquake prediction conducted in California. Other topics include the distribution of large versus small earthquakes; how poor construction magnifies earthquake devastation; and the relationship of plate tectonics to the Pakistan earthquake.

  16. Differentiating Tectonic and Anthropogenic Earthquakes in the Greater Los Angeles Basin, Southern California

    NASA Astrophysics Data System (ADS)

    Hauksson, E.; Goebel, T.; Cochran, E. S.; Ampuero, J. P.

    2014-12-01

    The 2014 flurry of moderate earthquakes in the Los Angeles region raised the concern that some of this or past seismicity was of anthropogenic origin as opposed to being caused by ongoing transpressional tectonics. The Mw5.1 La Habra sequence is located near several major oil fields, but the Mw4.4 Encino sequence was located away from oil fields, within the Santa Monica Mountains. The last century of seismicity in the Los Angeles area consists of numerous small and large earthquakes. Most of these earthquakes occur beneath the basin sediments and are associated with transpressional tectonics, related to the big bend in the San Andreas fault, but some could be associated with large oil fields. In particular, both the 1933 Mw6.4 Long Beach and the 1987 Mw5.9 Whittier Narrows earthquakes were spatially associated with two major oil fields, the Huntington Beach and Montebello fields. Numerous large oil fields have been in production for more than 125 years. The geographical locations of the oil fields follow major tectonic trends such as the Newport-Inglewood fault, the Whittier fault, and the thrust belt located at the north edge of the Los Angeles basin. More than 60 fields have oil wells, and some of these have both disposal and fracking wells. Before fluid injection became common, Kovach (1974) documented six damaging events induced by fluid extraction from 1947 to 1961 in the Wilmington oil field. Since 1981 the waveform-relocated earthquake catalog for the Los Angeles basin is complete on average above M2.0. We compare the spatial distribution of these events with the locations of nearby active oil fields. We will also analyze the seismicity in the context of available monthly fluid extraction and injection volumes and search for temporal correlations. The La Habra sequence apparently correlates with temporal changes in extraction and injection volumes in the Santa Fe Springs oil field but not with activities in other oil fields within closer spatial proximity.

  17. Coseismic and Initial Postseismic Deformation from the 2004 Parkfield, California, Earthquake, Observed by Global Positioning System, Electronic Distance Meter, Creepmeters, and Borehole Strainmeters

    Microsoft Academic Search

    J. Langbein; J. R. Murray; H. A. Snyder

    2006-01-01

    Global Positioning System (GPS), electronic distance meter, creepmeter, and strainmeter measurements spanning the M 6.0 Parkfield, California, earthquake are examined. Using these data from 100 sec through 9 months following the mainshock, Omori's law, with rate inversely related to time (1/t^p, with p ranging between 0.7 and 1.3), characterizes the time-dependent deformation during the postseismic period;
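The Omori-style decay quoted above (rate proportional to 1/t^p) can be fit by least squares in log-log space, since log(rate) = log(K) - p*log(t) is linear in log(t). This is a minimal sketch with synthetic data, not the authors' inversion; K, p, and the time samples are illustrative.

```python
import math

def fit_omori_p(times, rates):
    """Estimate the Omori exponent p via least squares on
    log(rate) = log(K) - p * log(t); returns -slope."""
    xs = [math.log(t) for t in times]
    ys = [math.log(r) for r in rates]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope

# Synthetic postseismic rates with p = 0.9 (hypothetical units/values)
p_true, K = 0.9, 5.0
times = [10 ** (k / 4) for k in range(1, 20)]   # log-spaced sample times
rates = [K / t ** p_true for t in times]
print(fit_omori_p(times, rates))  # recovers p = 0.9 for this noise-free data
```

With real geodetic series one would fit cumulative displacement rather than rates and include noise, but the exponent enters the same way.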

  18. Coseismic and Postseismic Deformation Associated with the Mw7.2 2010 El Mayor-Cucapah Earthquake, Baja California, Mexico, from GPS Geodesy

    NASA Astrophysics Data System (ADS)

    Spinler, J. C.; Bennett, R. A.; Gonzalez-Garcia, J. J.; Walls, C. P.

    2011-12-01

    We present crustal motion measurements from an analysis of continuous (CGPS) and campaign data for southern California and northern Baja California, Mexico. The 2010 April 4 Mw7.2 El Mayor-Cucapah (EMC) earthquake is the largest event to occur along the southern San Andreas fault system in nearly two decades. We analyzed data from 103 CGPS sites located within 250 km of the EMC earthquake, determining coordinate time series relative to a stable North America reference frame. In addition to these CGPS sites that had data for a significant time range prior to the EMC event, we analyzed data from 8 CGPS sites that were constructed following the earthquake, and are now part of the EarthScope Plate Boundary Observatory (PBO) network. We also analyzed data from both before and after the EMC earthquake for ~80 campaign sites located primarily in northern Baja California, Mexico. We combined the CGPS and campaign datasets during our analysis in order to produce the best estimates for postseismic deformation. We applied a constrained random search algorithm minimizing the misfit to the individual site coordinate positions to determine the parameters of our kinematic model. We estimate interseismic velocity, annual and semi-annual perturbations, coseismic offsets and postseismic deformation following the Mw7.2 EMC event and the largest aftershock, a Mw5.7 event in southern California on June 15th, 2010, and the characteristic exponential decay time for each event at each individual site. We assess any potential tradeoffs between the different parameters in our kinematic model, including tradeoffs between estimates for aftershock coseismic offsets and postseismic relaxation.

  19. Processed seismic motion records from Desert Hot Springs, California earthquake of April 22, 1992, recorded at seismic stations in southern Nevada

    SciTech Connect

    Lum, P.K.; Honda, K.K.

    1993-04-01

    The 8mm data tape contains the processed seismic data of the Desert Hot Springs, California, earthquake of April 22, 1992. The seismic data were recorded by 19 seismographs maintained by the DOE/NV in southern Nevada. Four files were generated from each seismic recorder: (1) uncorrected acceleration time histories; (2) corrected acceleration, velocity, and displacement time histories; (3) pseudo response velocity spectra; and (4) Fourier amplitude spectra of acceleration.

  20. INVERSION OF STRONG GROUND MOTION AND TELESEISMIC WAVEFORM DATA FOR THE FAULT RUPTURE HISTORY OF THE 1979 IMPERIAL VALLEY, CALIFORNIA, EARTHQUAKE

    Microsoft Academic Search

    STEPHEN H. HARTZELL; THOMAS H. HEATON

    1983-01-01

    A least-squares point-by-point inversion of strong ground motion and teleseismic body waves is used to infer the fault rupture history of the 1979 Imperial Valley, California, earthquake. The Imperial fault is represented by a plane embedded in a half-space where the elastic properties vary with depth. The inversion yields both the spatial and temporal variations in dislocation on the

  1. Interaction of the san jacinto and san andreas fault zones, southern california: triggered earthquake migration and coupled recurrence intervals.

    PubMed

    Sanders, C O

    1993-05-14

    Two lines of evidence suggest that large earthquakes that occur on either the San Jacinto fault zone (SJFZ) or the San Andreas fault zone (SAFZ) may be triggered by large earthquakes that occur on the other. First, the great 1857 Fort Tejon earthquake in the SAFZ seems to have triggered a progressive sequence of earthquakes in the SJFZ. These earthquakes occurred at times and locations that are consistent with triggering by a strain pulse that propagated southeastward at a rate of 1.7 kilometers per year along the SJFZ after the 1857 earthquake. Second, the similarity in average recurrence intervals in the SJFZ (about 150 years) and in the Mojave segment of the SAFZ (132 years) suggests that large earthquakes in the northern SJFZ may stimulate the relatively frequent major earthquakes on the Mojave segment. Analysis of historic earthquake occurrence in the SJFZ suggests little likelihood of extended quiescence between earthquake sequences. PMID:17818388

  2. Earthquake Forecasting Using Hidden Markov Models

    Microsoft Academic Search

    Daniel W. Chambers; Jenny A. Baglivo; John E. Ebel; Alan L. Kafka

    This paper develops a novel method, based on hidden Markov models, to forecast earthquakes and applies the method to mainshock seismic activity in southern California and western Nevada. The forecasts are of the probability of a mainshock within 1, 5, and 10 days in the entire study region or in specific subregions and are based on the observations available at the
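Hidden-Markov-model forecasting of the kind described above rests on the standard forward algorithm, which sums the probability of an observation sequence over all hidden-state paths. The sketch below is a generic two-state illustration with made-up transition and emission probabilities, not the paper's fitted model.

```python
def forward(obs, pi, A, B):
    """HMM forward algorithm: total probability of an observation
    sequence, marginalized over all hidden-state paths."""
    n = len(pi)
    alpha = [pi[s] * B[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        alpha = [B[j][o] * sum(alpha[i] * A[i][j] for i in range(n))
                 for j in range(n)]
    return sum(alpha)

# Hypothetical 2-state model: state 0 = seismically quiet, 1 = active.
# Observation symbols: 0 = no mainshock in the window, 1 = mainshock.
pi = [0.8, 0.2]                       # initial state probabilities
A = [[0.9, 0.1], [0.3, 0.7]]          # state-transition matrix
B = [[0.95, 0.05], [0.6, 0.4]]        # emission probabilities per state
print(forward([0, 0, 1], pi, A, B))   # P(quiet window, quiet window, mainshock)
```

A forecast probability for "mainshock within the next window" would condition on the observed history via the same forward variables.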

  3. Probability of chance correlations of earthquakes with predictions in areas of heterogeneous seismicity rate: the VAN case

    Microsoft Academic Search

    M. Wyss; A. Allmann

    1996-01-01

    Evaluations of 22 claims of successful earthquake predictions in Greece by Varotsos and Lazaridou [1991] were performed using the Ms (surface wave) as well as the ML (local) magnitude scales. If we assume that the predicted magnitudes were Ms (the scale was not specified in the prediction telegrams), and use the Preliminary Determinations of Epicenters (PDE) to estimate the seismicity

  4. Evidence for possible horizontal faulting in southern California from earthquake mechanisms

    Microsoft Academic Search

    Weishi Huang; L. T. Silver; H. Kanamori

    1996-01-01

    We find that 36 of the 505 fault-plane solutions (M ≥ 3.0, 1981-1990) in southern California have a nodal plane dipping no more than 30°. With the assumption of the low-angle nodal planes being the fault planes, four cross sections are constructed to show the possible horizontal faults in the middle and upper crust. More than half of

  5. Three-dimensional fault framework of the 2014 South Napa Earthquake, San Francisco Bay region, California

    NASA Astrophysics Data System (ADS)

    Graymer, R. W.

    2014-12-01

    Assignment of the South Napa earthquake to a mapped fault is difficult, as it occurred where three large, northwest-trending faults converge and may interact in the subsurface. The surface rupture did not fall on the main trace of any of these faults, but instead between the Carneros and West Napa faults and northwest along strike from the northern mapped end of the Franklin Fault. The 2014 rupture plane appears to be nearly vertical, based on focal mechanisms of the mainshock and connection of the surface trace/rupture to the relocated hypocenter (J. Hardebeck, USGS). 3D surfaces constructed from published data show that the Carneros Fault is a steeply west-dipping fault that runs just west of the near-vertical 2014 rupture plane. The Carneros Fault does not appear to have been involved in the earthquake, although relocated aftershocks suggest possible minor triggered slip. The main West Napa Fault is also steeply west-dipping, and its projection intersects the 2014 rupture plane at around the depth of the mainshock hypocenter. UAVSAR data (A. Donnellan, JPL) and relocated aftershocks suggest that the main West Napa Fault experienced triggered slip/afterslip along a length of roughly 20 km. It is possible that the 2014 rupture took place along a largely unrecognized westerly strand of the West Napa Fault. The Franklin Fault is a steeply east-dipping fault (with a steeply west-dipping subordinate trace east of Mare Island) that has documented late Quaternary offset. Given the generally aligned orientation of the 3D fault surfaces, an alternative interpretation is that the South Napa earthquake occurred on the northernmost reach of the Franklin Fault within its 3D junction with the West Napa Fault. This interpretation is supported, but not proven, by a short but prominent linear feature in the UAVSAR data at Slaughterhouse Point west of Vallejo, along trend south-southeast of the observed coseismic surface rupture.

  6. Predicting catastrophic earthquakes

    NSDL National Science Digital Library

    Iwata et al.

    This article discusses a method, based on the magnitude-frequency distribution of previous earthquakes in a region, for estimating the probability that a small earthquake will grow into a catastrophic one. When a small earthquake is detected in a region where a catastrophic one is expected, early warning systems can be modified to estimate the probability that the earthquake will grow in magnitude. It was found that if the observed earthquake magnitude reaches 6.5, the estimated probability that the final magnitude will reach 7.5 is between 25 and 41 percent.
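Under an idealized, unbounded Gutenberg-Richter magnitude-frequency distribution, the probability that an earthquake already at magnitude m grows to at least magnitude M reduces to 10^(-b(M-m)). The study's 25-41% figure comes from region-specific distributions, not this one-liner, so the b-values below are purely illustrative.

```python
def growth_probability(m_reached, m_target, b):
    """P(final magnitude >= m_target | magnitude already >= m_reached)
    under an unbounded Gutenberg-Richter law with slope b."""
    return 10 ** (-b * (m_target - m_reached))

for b in (0.4, 0.6, 1.0):  # illustrative b-values, not fitted ones
    print(b, growth_probability(6.5, 7.5, b))
```

For b between roughly 0.4 and 0.6 this idealized formula gives about 25-40%, the same ballpark as the abstract's estimate; with the textbook b = 1 it gives 10%, which shows how sensitive the growth probability is to the regional distribution.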

  7. Structural Constraints and Earthquake Recurrence Estimates for the West Tahoe-Dollar Point Fault, Lake Tahoe Basin, California

    NASA Astrophysics Data System (ADS)

    Maloney, J. M.; Driscoll, N. W.; Kent, G.; Brothers, D. S.; Baskin, R. L.; Babcock, J. M.; Noble, P. J.; Karlin, R. E.

    2011-12-01

    Previous work in the Lake Tahoe Basin (LTB), California, identified the West Tahoe-Dollar Point Fault (WTDPF) as the most hazardous fault in the region. Onshore and offshore geophysical mapping delineated three segments of the WTDPF extending along the western margin of the LTB. The rupture patterns between the three WTDPF segments remain poorly understood. Fallen Leaf Lake (FLL), Cascade Lake, and Emerald Bay are three sub-basins of the LTB, located south of Lake Tahoe, that provide an opportunity to image primary earthquake deformation along the WTDPF and associated landslide deposits. We present results from recent (June 2011) high-resolution seismic CHIRP surveys in FLL and Cascade Lake, as well as complete multibeam swath bathymetry coverage of FLL. Radiocarbon dates obtained from the new piston cores acquired in FLL provide age constraints on the older FLL slide deposits and build on and complement previous work that dated the most recent event (MRE) in Fallen Leaf Lake at ~4.1-4.5 k.y. BP. The CHIRP data beneath FLL image slide deposits that appear to correlate with contemporaneous slide deposits in Emerald Bay and Lake Tahoe. A major slide imaged in FLL CHIRP data is slightly younger than the Tsoyowata ash (7950-7730 cal yrs BP) identified in sediment cores and appears synchronous with a major Lake Tahoe slide deposit (7890-7190 cal yrs BP). The equivalent age of these slides suggests the penultimate earthquake on the WTDPF may have triggered them. If correct, we postulate a recurrence interval of ~3-4 k.y. These results suggest the FLL segment of the WTDPF is near the end of its seismic recurrence cycle. Additionally, CHIRP profiles acquired in Cascade Lake image the WTDPF for the first time in this sub-basin, which is located near the transition zone between the FLL and Rubicon Point Sections of the WTDPF. We observe two fault strands trending N45°W across southern Cascade Lake for ~450 m. 
The strands produce scarps of ~5 m and ~2.7 m, respectively, on the lake floor, but offset increases down-section to ~14 m and ~8 m at the acoustic basement. Studying the style and timing of earthquake deformation in Fallen Leaf Lake, Cascade Lake, Emerald Bay and Lake Tahoe will help us to understand how strain is partitioned between adjacent segments and the potential rupture magnitude.

  8. The 130-km-long Green Valley Fault Zone of Northern California: Discontinuities Regulate Its Earthquake Recurrence

    NASA Astrophysics Data System (ADS)

    Lienkaemper, J. J.

    2012-12-01

    The Green Valley fault (GVF), a branch of the dextral strike-slip San Andreas fault system, connects the Northern Calaveras fault (NCF) to the Bartlett Springs fault (BSF) to the north. Although the GVF may occasionally rupture along its entire length to produce M7 earthquakes, 2-3 km discontinuities in its trace appear to modulate the length and frequency of ruptures. The global historical earthquake record suggests that ruptures tend to stop at such fault discontinuities (1-4 km steps) about half the time (Wesnousky and Biasi, 2011). The GVF has three sections: the 77-km-long southern GVF (SGVF), the 25-km Berryessa (BF), and the 30-km Hunting Creek (HCF). The SGVF itself could produce large (M6.7) events, and the BF and HCF somewhat smaller events (M6.3-6.6). The BF is centered on a compressional pop-up structure. It is separated to the north from the HCF by a ~2.5-3 km extensional stepover and to the south from the SGVF by a ~2.5-3 km extensional bend. At its south end, the GVF is separated from the NCF by the 5-km Alamo stepover, which is likely to stop all ruptures; and at its north end the GVF (HCF section) makes a 2.5 km right step to the BSF at Wilson Valley. The HCF apparently forms a significant transition between the BSF and the GVF. The overall trend of the GVF bends ~17° through the HCF and emerges on the BSF trend. Thus, this bend, along with the Wilson Valley step-over, would tend to inhibit ruptures between BSF and sections of the GVF. Creep rates along most of the GVF (SGVF, HCF) range from 1 to 4 mm/yr. No creep is known for the BF section, but its microseismicity levels are similar to creeping parts of the GVF and BSF, so we assume that the BF may creep too. We estimate the slip rate on the GVF to be 6±2 mm/yr by interpolating rates on the BSF and the NCF. Lienkaemper and Brown (2009) estimated ~6.5 mm/yr for the average deep loading rate on the BSF using a rigid block model of the USGS-GPS site velocities observed in the central BSF. 
This rate is comparable to the 6 mm/yr Holocene slip rate observed on the NCF (Kelson et al., 1996). Microearthquakes on the GVF reach a depth of ~14 km. Applying the methods of Savage and Lisowski (1993) to the GVF suggests that creep may on average extend to depths of ~7.5 km, leaving a width of ~6.5 km of locked fault zone below. Trenching on the SGVF indicates 400 (±50) years have elapsed since the most recent large earthquake (MRE) in 1610±50 yr CE. Previous earthquake recurrence intervals (RI) in the past millennium indicate a mean RI of 200±80 yr (μ±σ) for the SGVF, which is much shorter than the 400-yr open interval. Preliminary evidence from trenching on the BF gives an MRE of 1630±100 yr CE, which may thus coincide with the MRE on the SGVF. If the MRE on the BF and SGVF sections is the same earthquake, then its expected larger size (M~6.9-7.0 vs 6.7) and greater fault complexity may have produced a large stress drop, which would possibly help explain the current long open interval. The SGVF paleoseismic recurrence model is consistent with a simple probabilistic rupture model (i.e., 50%-probable rupture across 1-4 km steps) and with a Brownian Passage Time recurrence model with a mean RI of 250 yr, CV (coefficient of variation, σ/μ) of 0.6, and a 30-yr rupture probability of 20-25%.
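The 30-yr rupture probability quoted above can be approximately reproduced from the stated parameters (mean RI 250 yr, CV 0.6, 400-yr open interval) using the standard inverse-Gaussian form of the Brownian Passage Time distribution. This sketch is illustrative only; the abstract's exact 20-25% figure may rest on a different elapsed-time treatment.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bpt_cdf(t, mean, alpha):
    """CDF of the Brownian Passage Time (inverse Gaussian) distribution
    with mean recurrence `mean` and aperiodicity (CV) `alpha`."""
    lam = mean / alpha ** 2          # inverse-Gaussian shape parameter
    a = math.sqrt(lam / t)
    return (norm_cdf(a * (t / mean - 1.0))
            + math.exp(2.0 * lam / mean) * norm_cdf(-a * (t / mean + 1.0)))

def conditional_prob(elapsed, window, mean, alpha):
    """P(rupture within `window` yr | no rupture in the last `elapsed` yr)."""
    f0 = bpt_cdf(elapsed, mean, alpha)
    f1 = bpt_cdf(elapsed + window, mean, alpha)
    return (f1 - f0) / (1.0 - f0)

# SGVF values from the abstract: mean RI 250 yr, CV 0.6, 400-yr open interval
print(conditional_prob(elapsed=400.0, window=30.0, mean=250.0, alpha=0.6))
```

The result lands near the abstract's 20-25% range, which is the point of the BPT model: a long open interval does not drive the conditional probability toward certainty the way a narrow periodic model would.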

  9. History of earthquakes and tsunamis along the eastern Aleutian-Alaska megathrust, with implications for tsunami hazards in the California Continental Borderland

    USGS Publications Warehouse

    Ryan, Holly F.; von Huene, Roland; Wells, Ray E.; Scholl, David W.; Kirby, Stephen; Draut, Amy E.

    2012-01-01

    During the past several years, devastating tsunamis were generated along subduction zones in Indonesia, Chile, and most recently Japan. Both the Chile and Japan tsunamis traveled across the Pacific Ocean and caused localized damage at several coastal areas in California. The question remains as to whether coastal California, in particular the California Continental Borderland, is vulnerable to more extensive damage from a far-field tsunami sourced along a Pacific subduction zone. Assuming that the coast of California is at risk from a far-field tsunami, its coastline is most exposed to a trans-Pacific tsunami generated along the eastern Aleutian-Alaska subduction zone. We present the background geologic constraints that could control a possible giant (Mw ~9) earthquake sourced along the eastern Aleutian-Alaska megathrust. Previous great earthquakes (Mw ~8) in 1788, 1938, and 1946 ruptured single segments of the eastern Aleutian-Alaska megathrust. However, in order to generate a giant earthquake, it is necessary to rupture through multiple segments of the megathrust. Potential barriers to a throughgoing rupture, such as high-relief fracture zones or ridges, are absent on the subducting Pacific Plate between the Fox and Semidi Islands. Possible asperities (areas on the megathrust that are locked and therefore subject to infrequent but large slip) are identified by patches of high moment release observed in the historical earthquake record, geodetic studies, and the location of forearc basin gravity lows. Global Positioning System (GPS) data indicate that some areas of the eastern Aleutian-Alaska megathrust, such as that beneath Sanak Island, are weakly coupled. We suggest that although these areas will have reduced slip during a giant earthquake, they are not really large enough to form a barrier to rupture. 
A key aspect in defining an earthquake source for tsunami generation is determining the possibility of significant slip on the updip end of the megathrust near the trench. Large slip on the updip part of the eastern Aleutian-Alaska megathrust is a viable possibility owing to the small frontal accretionary prism and the presence of arc basement relatively close to the trench along most of the megathrust.

  10. [Comment on “Should Memphis build for California's earthquakes?”] from A.D. Frankel

    NASA Astrophysics Data System (ADS)

    Frankel, Arthur D.

    Seth Stein et al. [2003] argue that the city of Memphis, Tennessee, should not adopt the seismic design provisions of the new International Building Code (IBC), which are based on the national seismic hazard maps produced by the U.S. Geological Survey [Frankel et al., 1996; Frankel et al., 2000]. They contend that the USGS national seismic hazard maps overestimate the earthquake hazard from the New Madrid Seismic Zone (NMSZ) in the central United States. As project chief for the USGS national seismic hazard maps, I think that constructive criticism is always useful. However, Stein et al. [2003] do not present any valid scientific reason why they think the ground motion values in the USGS maps are too high. They also ignore important scientific research that has been conducted over the past 20 years on earthquakes and ground motions in intra-plate areas, research that is included in the USGS maps. They do not mention the extensive consensus process among scientific experts involved in developing the USGS hazard maps, as well as the elaborate consensus and vetting process among engineers in the decisions made in constructing the IBC design maps from the hazard maps. They also present a cost-benefit analysis to argue against adopting the IBC. I contend that their cost-benefit analysis is unrealistic.

  11. Using Logistic Regression to Predict the Probability of Debris Flows in Areas Burned by Wildfires, Southern California, 2003-2006

    USGS Publications Warehouse

    Rupert, Michael G.; Cannon, Susan H.; Gartner, Joseph E.; Michael, John A.; Helsel, Dennis R.

    2008-01-01

    Logistic regression was used to develop statistical models that can be used to predict the probability of debris flows in areas recently burned by wildfires by using data from 14 wildfires that burned in southern California during 2003-2006. Twenty-eight independent variables describing the basin morphology, burn severity, rainfall, and soil properties of 306 drainage basins located within those burned areas were evaluated. The models were developed as follows: (1) Basins that did and did not produce debris flows soon after the 2003 to 2006 fires were delineated from data in the National Elevation Dataset using a geographic information system; (2) Data describing the basin morphology, burn severity, rainfall, and soil properties were compiled for each basin. These data were then input to a statistics software package for analysis using logistic regression; and (3) Relations between the occurrence or absence of debris flows and the basin morphology, burn severity, rainfall, and soil properties were evaluated, and five multivariate logistic regression models were constructed. All possible combinations of independent variables were evaluated to determine which combinations produced the most effective models, and the multivariate models that best predicted the occurrence of debris flows were identified. Percentage of high burn severity and 3-hour peak rainfall intensity were significant variables in all models. Soil organic matter content and soil clay content were significant variables in all models except Model 5. Soil slope was a significant variable in all models except Model 4. The most suitable model can be selected from these five models on the basis of the availability of independent variables in the particular area of interest and field checking of probability maps. 
The multivariate logistic regression models can be entered into a geographic information system, and maps showing the probability of debris flows can be constructed in recently burned areas of southern California. This study demonstrates that logistic regression is a valuable tool for developing models that predict the probability of debris flows occurring in recently burned landscapes.
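The multivariate logistic regression described above maps the basin predictors to a debris-flow probability through the logistic function p = 1/(1 + exp(-(b0 + Σ bi·xi))). The sketch below uses hypothetical coefficients and predictor values, not the fitted values from the USGS models.

```python
import math

def debris_flow_probability(coeffs, intercept, predictors):
    """Logistic-regression probability of a debris flow:
    p = 1 / (1 + exp(-(intercept + sum(b_i * x_i))))."""
    z = intercept + sum(b * x for b, x in zip(coeffs, predictors))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical predictors, in the spirit of the variables named above:
# % high burn severity, 3-hr peak rainfall intensity, soil organic
# matter content (%), soil clay content (%). Coefficients are placeholders.
coeffs = [0.04, 0.16, -0.5, -0.3]
x = [60.0, 20.0, 1.5, 10.0]
print(debris_flow_probability(coeffs, intercept=-4.0, predictors=x))
```

Evaluating this function on a grid of basins is what turns the fitted model into the GIS probability maps the report describes.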

  12. Probability gains of an epidemic-type aftershock sequence model in retrospective forecasting of M ?? 5 earthquakes in Italy

    Microsoft Academic Search

    R. Console; M. Murru; G. Falcone

    2010-01-01

    A stochastic triggering (epidemic) model incorporating short-term clustering was fitted to the instrumental earthquake catalog of Italy for events with local magnitudes 2.6 and greater to optimize its ability to retrospectively forecast 33 target events of magnitude 5.0 and greater that occurred in the period 1990–2006. To obtain an unbiased evaluation of the information value of the model, forecasts of

  13. Probable satellite thermal infrared anomaly before the Zhangbei M s =6.2 earthquake on January 10, 1998

    Microsoft Academic Search

    Qi-Qi Lü; Jian-Hai Ding; Cheng-Yu Cui

    2000-01-01

    This paper used the thermal infrared data of the satellite NOAA-AVHRR over the northern part of North China (113–119°E, 38–42°N), processed the remote sensing data through radiation adjustment, geometric adjustment, and so on with the software “The Monitoring and Fast Process System of Earthquake Precursor Thermal Infrared Anomaly”, and inverted the earth surface temperature. Some disturbance effects had been excluded,

  14. Spatial organization of foreshocks as a tool to forecast large earthquakes.

    PubMed

    Lippiello, E; Marzocchi, W; de Arcangelis, L; Godano, C

    2012-01-01

    An increase in the number of smaller-magnitude events, retrospectively named foreshocks, is often observed before large earthquakes. We show that the linear density probability of earthquakes occurring before and after small or intermediate mainshocks displays a symmetrical behavior, indicating that the size of the area fractured during the mainshock is encoded in the foreshock spatial organization. This observation can be used to discriminate spatial clustering due to foreshocks from that induced by aftershocks and is implemented in an alarm-based model to forecast m > 6 earthquakes. A retrospective study of the last 19 years of the Southern California catalog shows that the daily occurrence probability presents isolated peaks closely located in time and space to the epicenters of five of the six m > 6 earthquakes. We find daily probabilities as high as 25% (in cells of size 0.04° × 0.04°), with significant probability gains with respect to standard models. PMID:23152938

  15. Surface slip associated with the 2014 South Napa, California earthquake measured on alinement arrays

    NASA Astrophysics Data System (ADS)

    Lienkaemper, J. J.; Brooks, B. A.; DeLong, S. B.; Domrose, C. J.; Rosa, C. M.

    2014-12-01

    The main rupture associated with the South Napa earthquake of Sept. 24, 2014 was ~15 km long from its epicenter (defined here as km 0, see figure below) to the surface rupture's north end (~km 15). Near km 10 a maximum of ~0.45 m dextral slip was most likely entirely coseismic, because it showed the same amount of slip at 12 days post-earthquake (d-PE) as it did at 1.5 d-PE. However, farther south (km~6) by 1-2 d-PE conspicuous growth of offsets on cultural features indicated high rates of afterslip (~10-20 cm/day) had occurred. Although afterslip is gradually slowing, it is expected to continue for many months or possibly years. To closely monitor this rapid afterslip, we installed four 70-140-m-long alinement arrays across the main rupture (labeled NLAR-NLOD on figure below), measuring slip to millimeter accuracy. A fifth array that spans a northeastern branch rupture has shown no afterslip. We have run early observations (to 26-d-PE) of afterslip (coupled with accumulated total slip as measured on adjacent offset cultural features) in the program AFTER (Boatwright et al., 1989). This analysis allows us to make preliminary estimates of initial (1 d-PE), final or total accumulated event slip, and coseismic estimates (i.e., projecting slip toward a ~0.5-1 s rise time). Thus far modeled slip on all four arrays indicates that final values of total (coseismic plus post-seismic) slip might be approaching the maximum coseismic slip as a limit (~0.4 ± 0.1 m). The final values of total surface slip may thus become more uniform along the fault over time as compared to modeled heterogeneous seismic slip at depth. The timing of the surface slip release differs strikingly from south to north along the 2014 rupture; AFTER models suggest that slip south of the location of maximum slip (km 0-10) appears to have been dominantly postseismic (~50-100%), whereas north of the maximum slip (km 10-15) slip was mainly coseismic (~50-100%). 
The current AFTER model predicts that as surface slip along the fault approaches final values of total slip associated with this earthquake (e.g., ≥1000 d-PE), the respective contributions to the total event surface slip integrated along the entire fault will approach 27% coseismic slip and 73% postseismic slip.

  16. Analysis of Injection-Induced Micro-Earthquakes in a Geothermal Steam Reservoir, The Geysers Geothermal Field, California

    SciTech Connect

    Rutqvist, Jonny; Rutqvist, J.; Oldenburg, C.M.

    2008-05-15

    In this study we analyze relative contributions to the cause and mechanism of injection-induced micro-earthquakes (MEQs) at The Geysers geothermal field, California. We estimated the potential for inducing seismicity by coupled thermal-hydrological-mechanical analysis of the geothermal steam production and cold water injection to calculate changes in stress (in time and space) and investigated if those changes could induce a rock mechanical failure and associated MEQs. An important aspect of the analysis is the concept of a rock mass that is critically stressed for shear failure. This means that shear stress in the region is near the rock-mass frictional strength, and therefore very small perturbations of the stress field can trigger an MEQ. Our analysis shows that the most important cause for injection-induced MEQs at The Geysers is cooling and associated thermal-elastic shrinkage of the rock around the injected fluid that changes the stress state in such a way that mechanical failure and seismicity can be induced. Specifically, the cooling shrinkage results in unloading and associated loss of shear strength in critically shear-stressed fractures, which are then reactivated. Thus, our analysis shows that cooling-induced shear slip along fractures is the dominant mechanism of injection-induced MEQs at The Geysers.

  17. Non-double-couple earthquake mechanisms at the Geysers geothermal area, California

    USGS Publications Warehouse

    Ross, A.; Foulger, G.R.; Julian, B.R.

    1996-01-01

    Inverting P- and S-wave polarities and P:SH amplitude ratios using linear programming methods suggests that about 20% of earthquakes at The Geysers geothermal area have significantly non-double-couple focal mechanisms, with explosive volumetric components as large as 33% of the seismic moment. This conclusion contrasts with those of earlier studies, which interpreted data in terms of double couples. The non-double-couple mechanisms are consistent with combined shear and tensile faulting, possibly caused by industrial water injection. Implosive mechanisms, which might be expected because of rapid steam withdrawal, have not been found. Significant compensated-linear-vector-dipole (CLVD) components in some mechanisms may indicate rapid fluid flow accompanying crack opening. Copyright 1996 by the American Geophysical Union.

  18. The 2007 M5.4 Alum Rock, California, earthquake: Implications for future earthquakes on the central and southern Calaveras Fault

    NASA Astrophysics Data System (ADS)

    Oppenheimer, David H.; Bakun, William H.; Parsons, Tom; Simpson, Robert W.; Boatwright, John; Uhrhammer, Robert A.

    2010-08-01

    The similarity of seismograms recorded by two seismic stations demonstrates that the 31 October 2007 moment magnitude M5.4 Alum Rock earthquake is a repeat of a 1955 ML5.5 earthquake. Both occurred on Oppenheimer et al.'s (1990) Zone V "stuck patch" on the central Calaveras fault, providing new support for their model of Calaveras fault earthquake activity. We suggest that Zone V fails only in a family of recurring M ˜ 5.4-5.5 earthquakes. The 1955 and 2007 earthquakes are the penultimate and ultimate Zone V events. Earthquakes in 1891 and 1864 are possible earlier Zone V events. The next Zone V event is not expected in the next few decades, assuming a time-dependent recurrence model: the mean forecast date is 2064 (2035-2104, 95% confidence range). We further suggest that Zones I, II, III, and IV fail in recurring M ˜ 5.1-5.3, M ˜ 5.6-5.8, M ˜ 6.1-6.3, and M ˜ 4.9-5.0 earthquakes, respectively. If our earthquake recurrence model is correct, the next Zone I event is overdue and could occur anytime, and M5-6 earthquakes should not occur on Zones II, III, and IV before 2014, 2012, and 2026, respectively. We cannot rule out the possibility that Zone VI, which lies at the southern end of the Mission Seismic Trend, where the southern Hayward and central Calaveras faults appear to connect at depth, fails aseismically or in large events on the southern Hayward fault, such as last occurred in 1868, or in large events on the adjoining northern Calaveras fault segment.
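
A time-dependent recurrence forecast of the kind invoked above can be sketched with a lognormal renewal model, in which the probability of the next event depends on the time elapsed since the last one. The mean recurrence, coefficient of variation, and function names below are illustrative assumptions, not the authors' calibrated model.

```python
import math

def lognormal_cdf(t, mu, sigma):
    # CDF of a lognormal recurrence-time distribution
    return 0.5 * (1.0 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2.0))))

def conditional_prob(elapsed, window, mean, cov):
    """P(next event within `window` yr | quiet for `elapsed` yr) under a
    lognormal renewal model with the given mean recurrence time and
    coefficient of variation."""
    sigma2 = math.log(1.0 + cov ** 2)
    mu = math.log(mean) - 0.5 * sigma2
    sigma = math.sqrt(sigma2)
    f0 = lognormal_cdf(elapsed, mu, sigma) if elapsed > 0 else 0.0
    f1 = lognormal_cdf(elapsed + window, mu, sigma)
    return (f1 - f0) / (1.0 - f0)

# hypothetical Zone V-like numbers: ~57-yr mean recurrence (a 2007 event
# with mean forecast date 2064), COV 0.4; probability in the 30 yr after 2007
p30 = conditional_prob(elapsed=0.0, window=30.0, mean=57.0, cov=0.4)
print(f"{p30:.3f}")
```

A small 30-yr probability just after an event, growing as elapsed time approaches the mean recurrence, is the qualitative behavior behind "not expected in the next few decades."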

  19. Holocene slip rate and earthquake recurrence of the northern Calaveras Fault at Leyden Creek, northern California

    NASA Astrophysics Data System (ADS)

    Kelson, Keith I.; Simpson, Gary D.; Lettis, William R.; Haraden, Colleen C.

    1996-03-01

    The northern Calaveras fault traverses a heavily populated area in the eastern San Francisco Bay region and has not had a large earthquake in more than 130 years. To obtain data on the number, timing, and recurrence of large paleoearthquakes, we conducted paleoseismologic investigations at Leyden Creek, which crosses the fault in the rugged southern East Bay Hills. The site is characterized by a prominent west facing scarp and five fluvial terraces on the western (upstream) side of the fault. On the eastern (downstream) side of the fault, the creek flows through a narrow bedrock canyon that constricts the modern valley and has constrained the location of a late Pleistocene paleovalley. The margin of a buried bedrock valley west of the fault trends nearly perpendicular to the fault and is offset 54 (+18, -14) m in a right-lateral sense from the narrow bedrock canyon. Based on radiocarbon ages for alluvial sediments predating and postdating this paleovalley margin, we estimate an age of 11.5 (+3, -1) ka for the valley margin and a Holocene slip rate of 5 ± 2 mm/yr for the fault at Leyden Creek. Slickensides exposed in multiple trenches across the fault show that the most recent movement was predominantly lateral with a minor component of down-to-the-west slip. Multiple displaced scarp-derived colluvial deposits are interpreted as results of five or six surface ruptures within the past 2500 years. Twenty-one radiocarbon samples from scarp-derived colluvium and interfingered alluvial deposits suggest an average interval between surface rupture earthquakes of 250 to 850 years.

  20. Probability

    E-print Network

    Hinton, Geoffrey E.

    Values of variables are usually written in lower case: x; we write p(x) to mean probability(X = x). Sample space: the set of values x can take; the probability function expresses degree of belief in each x. Ensemble: random variable + sample space + probability function. Expectations and moments: E[x] = Σx x p(x), variance = Σx (x − E[x])² p(x); moments are expectations of higher-order powers of x.
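
The expectation and variance definitions in the notes above can be illustrated over a small discrete sample space; the die example and function names here are our own minimal sketch.

```python
def expectation(pmf, power=1):
    # E[x**power] = sum over the sample space of x**power * p(x)
    return sum((x ** power) * p for x, p in pmf.items())

def variance(pmf):
    # var(x) = E[x**2] - E[x]**2
    mean = expectation(pmf)
    return expectation(pmf, power=2) - mean ** 2

# fair six-sided die as the ensemble: sample space 1..6, uniform p(x)
die = {x: 1.0 / 6.0 for x in range(1, 7)}
print(expectation(die), variance(die))  # approx. 3.5 and 2.9167
```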

  1. Static-stress impact of the 1992 Landers earthquake sequence on nucleation and slip at the site of the 1999 M=7.1 Hector Mine earthquake, southern California

    NASA Astrophysics Data System (ADS)

    Parsons, Tom; Dreger, Douglas S.

    2000-07-01

    The proximity in time (˜7 years) and space (˜20 km) between the 1992 M=7.3 Landers earthquake and the 1999 M=7.1 Hector Mine event suggests a possible link between the quakes. We thus calculated the static stress changes following the 1992 Joshua Tree/Landers/Big Bear earthquake sequence on the 1999 M=7.1 Hector Mine rupture plane in southern California. Resolving the stress tensor into rake-parallel and fault-normal components and comparing with changes in the post-Landers seismicity rate allows us to estimate a coefficient of friction on the Hector Mine plane. Seismicity following the 1992 sequence increased at Hector Mine where the fault was unclamped. This increase occurred despite a calculated reduction in right-lateral shear stress. The dependence of seismicity change primarily on normal stress change implies a high coefficient of static friction (µ ≈ 0.8). We calculated the Coulomb stress change using µ=0.8 and found that the Hector Mine hypocenter was mildly encouraged (0.5 bars) by the 1992 earthquake sequence. In addition, the region of peak slip during the Hector Mine quake occurred where Coulomb stress is calculated to have increased by 0.5-1.5 bars. In general, slip was more limited where Coulomb stress was reduced, though there was some slip where the strongest stress decrease was calculated. Interestingly, many smaller earthquakes nucleated at or near the 1999 Hector Mine hypocenter after 1992, but only in 1999 did an event spread to become a M=7.1 earthquake.
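
The Coulomb stress calculation described above combines shear and normal stress changes through the friction coefficient, ΔCFS = Δτ + µΔσn. A minimal sketch, with hypothetical stress values chosen only to echo the Hector Mine pattern (shear stress slightly reduced, fault unclamped):

```python
def coulomb_stress_change(d_shear, d_normal, mu):
    """Change in Coulomb failure stress (bars): d_shear is the change in
    rake-parallel shear stress, d_normal the fault-normal change (positive
    = unclamping), mu the coefficient of static friction."""
    return d_shear + mu * d_normal

# hypothetical stress changes: right-lateral shear stress slightly
# reduced, but the fault plane unclamped
d_tau, d_sn = -0.3, 1.0  # bars
print(coulomb_stress_change(d_tau, d_sn, mu=0.8))  # positive: failure encouraged
print(coulomb_stress_change(d_tau, d_sn, mu=0.1))  # negative: failure discouraged
```

The sign flip between the two friction values is the paper's point: only a high µ lets unclamping outweigh the shear-stress reduction.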

  2. Seismic guided waves trapped in the fault zone of the Landers, California, earthquake of 1992

    NASA Astrophysics Data System (ADS)

    Li, Yong-Gang; Aki, Keiiti; Adams, David; Hasemi, Akiko; Lee, William H. K.

    1994-06-01

    A mobile seismic array of seven stations was deployed at 11 sites along the fault trace of the M7.4 Landers earthquake of June 28, 1992, with a maximum offset of 1 km from the trace. We found a distinct wave train with a relatively long period following the S waves that shows up only when both the stations and the events are close to the fault trace. This wave train is interpreted as a seismic guided wave trapped in a low-velocity fault zone. To study the distribution of amplitude of the guided waves with distance from the fault trace, and their attenuation with travel distance along the fault zone, we eliminated source and recording-site effects by the coda normalization method. The normalized amplitudes of guided waves show a spectral peak at 3-4 Hz, which decays sharply with distance from the fault trace. Spectral amplitudes at high frequencies (8-15 Hz) show the opposite trend, increasing with distance from the fault trace. The normalized amplitudes of guided waves at 3-4 Hz also show a systematic decrease with hypocentral distance along the fault zone, from which we infer an apparent Q of 50. In order to confirm the existence of the guided waves, a dense array of 31 stations was deployed at one of the 11 sites. The resultant records revealed unequivocal evidence for the existence of guided waves associated with the fault zone. By modeling the waveforms as S waves trapped in a low-velocity waveguide sandwiched between two homogeneous half-spaces with shear velocity Vs = 3.0 km/s, we infer a waveguide width of about 180 m, a shear velocity of 2.0-2.2 km/s, and a Q of approximately 50. Hypocenters of aftershocks with clear guided waves show a systematic distribution, both laterally and with depth, delineating the extent of the low-velocity fault zone in three dimensions. We find that the zone extends to a depth of at least 10 km.
This zone apparently continues to the south across the Pinto Mountain fault, because guided waves are observed at stations north of the Pinto Mountain fault for earthquakes with epicenters south of it. On the other hand, the zone appears to be discontinuous at the fault bend located about 20 km north of the mainshock epicenter; guided waves were observed for stations and epicenters located on the same side of the fault bend, but not for those on opposite sides.
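
The apparent Q inferred from guided-wave amplitude decay can be sketched assuming the standard exponential attenuation A(r) = A0 exp(−πfr/(Qv)). The function names and synthetic amplitudes below are our illustrative assumptions, with parameters chosen to echo the paper's values (3-4 Hz, ~2.1 km/s, Q ≈ 50).

```python
import math

def guided_wave_amplitude(r_km, a0, f_hz, q, v_kms):
    # amplitude decay of a trapped wave along the fault-zone waveguide
    return a0 * math.exp(-math.pi * f_hz * r_km / (q * v_kms))

def apparent_q(r1, a1, r2, a2, f_hz, v_kms):
    """Apparent Q from normalized amplitudes at two hypocentral distances,
    inverting the exponential decay law."""
    return -math.pi * f_hz * (r2 - r1) / (v_kms * math.log(a2 / a1))

# synthetic check at 3.5 Hz in a 2.1 km/s waveguide with Q = 50
a1 = guided_wave_amplitude(5.0, 1.0, 3.5, 50.0, 2.1)
a2 = guided_wave_amplitude(20.0, 1.0, 3.5, 50.0, 2.1)
q_est = apparent_q(5.0, a1, 20.0, a2, 3.5, 2.1)
print(q_est)  # recovers Q = 50
```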

  3. Earthquakes Living Lab: Geology and the 1906 San Francisco Earthquake

    NSDL National Science Digital Library

    2014-09-18

    Students examine the effects of geology on earthquake magnitudes and how engineers anticipate and prepare for these effects. Using information provided through the Earthquakes Living Lab interface, students investigate how geology, specifically soil type, can amplify the magnitude of earthquakes and their consequences. Students look in-depth at the historical 1906 San Francisco earthquake and its destruction through photographs and data. They compare the 1906 California earthquake to another historical earthquake in Kobe, Japan, looking at the geological differences and impacts in the two regions, and learning how engineers, geologists and seismologists work to predict earthquakes and minimize calamity. A worksheet serves as a student guide for the activity.

  4. Surface deformation before, during and after the 2014 South Napa, California, earthquake from a spatially dense network of survey and continuous GPS sites

    NASA Astrophysics Data System (ADS)

    Floyd, M.; Funning, G.; Murray, J. R.; Svarc, J. L.; Herring, T.; Johanson, I. A.; Swiatlowski, J.; Materna, K.; Johnson, C. W.; Boyd, O. S.; Sutton, J. M.; Phillips, E.

    2014-12-01

    The South Napa, California, earthquake occurred on the West Napa Fault within a dense network of established GPS sites. Continuous sites from the Plate Boundary Observatory (PBO), Bay Area Regional Deformation (BARD) and other publicly-available networks lie mostly in the far-field, greater than 15 km from the epicenter. The near-field is covered by two networks of survey GPS sites, one observed by the University of California, Riverside (UCR) and MIT, and the other by the US Geological Survey (USGS). First, we present the pre-earthquake GPS velocity solution from the combination of these networks, covering the entire Pacific-North America plate boundary zone north of San Francisco Bay. We take 1-D fault-perpendicular profile and block model approaches to fit the GPS velocities, both with and without an explicit West Napa Fault, to update previous estimates of slip rates on faults to the east of the Rodgers Creek Fault, which include the West Napa and Green Valley Faults. Second, we present results from the survey GPS field response to the South Napa earthquake. 13 survey sites within 20 km of the epicenter were re-observed within 15 hours of the earthquake by a UCR-MIT group with assistance from UC Berkeley. An additional two sites to the north of the rupture were re-observed within 36 hours. A USGS-led group re-observed 17 sites within 25 km of the epicenter, as well as sites further afield. In total, 35 survey-mode GPS instruments were deployed to observe post-earthquake motions for up to four weeks after the event. Maximum displacements of greater than 20 cm are observed at two survey GPS sites within 2 km and either side of the surface rupture, in agreement with visual inspection of surface rupture offsets. Other observed survey sites within 20 km show at least 2 cm of displacement and 22 continuous GPS sites show displacements that are statistically significant at the 2-sigma level. 
Further, we show post-earthquake displacements over time as a result of this event relative to our precise pre-earthquake velocity solution.

  5. Measuring Possible Tsunami Currents from the April 1, 2014 Mw 8.2 Chile Earthquake in Crescent City, California

    NASA Astrophysics Data System (ADS)

    Admire, A. R.; Crawford, G. B.; Dengler, L. A.

    2014-12-01

    Crescent City, California has a long history of damaging tsunamis. Thirty-nine tsunamis have been recorded since 1933, including five that caused damage. Crescent City's harbor and small boat basin are particularly vulnerable to strong currents. Humboldt State University has installed Acoustic Doppler Profilers (ADPs) in order to directly measure water pressure fluctuations and currents caused by tsunamis. An instrument in Humboldt Bay, ~100 km south of Crescent City, recorded tsunamis generated by the 2010 Mw 8.8 Chile and 2011 Mw 9.0 Japan earthquakes and demonstrated the usefulness of ADPs in measuring tsunami currents. In 2013, an ADP was deployed in Crescent City's harbor adjacent to the NOAA tide gauge. On April 1, 2014, a Mw 8.2 earthquake occurred in northern Chile, producing a modest Pacific-wide tsunami and a 16 cm peak amplitude on the Crescent City tide gauge. We analyze the ADP data before and during the expected arrival of the April 2 tsunami to see if a tsunami signal is present. Tidal currents are generally small (5 cm/s or less). For two months before the tsunami, intermittent, high-frequency variability is present in velocity and pressure at periods on the order of 20, 9 and 5 min, which compare favorably to modal periods predicted using some simplified models of open-ended basins. For several hours after the tsunami arrival on April 2, spectral power levels in velocity and pressure around the 20 min period are notably enhanced.
These results suggest that: (1) the observed periods of enhanced variability represent the first three modes (n=0, 1 and 2) of free oscillations in the harbor, (2) the dominant period of (non-tidal) oscillations observed during the April 2, 2014 tsunami (~20 min) and during previous tsunamis (e.g., the water level record for the March 11, 2011 tsunami; also ~20 min) represents harbor resonance corresponding to the lowest order mode, and (3) this event is very near the ADP limit of detectability with peak tsunami currents of 5-10 cm/s and higher frequency variability and instrument noise root-mean-squared amplitude of 4-5 cm/s.

  6. Chronology of Postglacial Eruptive Activity and Calculation of Eruption Probabilities for Medicine Lake Volcano, Northern California

    USGS Publications Warehouse

    Nathenson, Manuel; Donnelly-Nolan, Julie M.; Champion, Duane E.; Lowenstern, Jacob B.

    2007-01-01

    Medicine Lake volcano has had 4 eruptive episodes in its postglacial history (since 13,000 years ago) comprising 16 eruptions. Time intervals between events within the episodes are relatively short, whereas time intervals between the episodes are much longer. An updated radiocarbon chronology for these eruptions is presented that uses paleomagnetic data to constrain the choice of calibrated ages. This chronology is used with exponential, Weibull, and mixed-exponential probability distributions to model the data for time intervals between eruptions. The mixed exponential distribution is the best match to the data and provides estimates for the conditional probability of a future eruption given the time since the last eruption. The probability of an eruption at Medicine Lake volcano in the next year from today is 0.00028.
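
The mixed-exponential conditional probability used above can be sketched as a two-component mixture of interval distributions, one for short within-episode intervals and one for long between-episode intervals. The weights and rate parameters below are hypothetical stand-ins, not the fitted Medicine Lake values.

```python
import math

def survival(t, w, lam1, lam2):
    # survival function of a two-component mixed-exponential interval model
    return w * math.exp(-lam1 * t) + (1.0 - w) * math.exp(-lam2 * t)

def conditional_eruption_prob(elapsed, window, w, lam1, lam2):
    """P(eruption within `window` yr | quiet for `elapsed` yr)."""
    s0 = survival(elapsed, w, lam1, lam2)
    s1 = survival(elapsed + window, w, lam1, lam2)
    return (s0 - s1) / s0

# hypothetical parameters: ~100-yr within-episode intervals mixed with
# ~3000-yr between-episode intervals, and ~950 yr since the last eruption
p_1yr = conditional_eruption_prob(elapsed=950.0, window=1.0,
                                  w=0.75, lam1=1.0 / 100.0, lam2=1.0 / 3000.0)
print(f"{p_1yr:.5f}")
```

After a long quiescence the short-interval component has decayed away, so the conditional one-year probability is controlled by the slow between-episode rate; that is how a mixture yields the small annual probabilities quoted above.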

  7. Operational earthquake forecasting can enhance earthquake preparedness

    USGS Publications Warehouse

    Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C.

    2014-01-01

    We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time-dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard (ground-motion exceedance probabilities as well as short-term rupture probabilities) in concert with the long-term forecasts of probabilistic seismic-hazard analysis (PSHA).

  8. Predicting Earthquakes

    NSDL National Science Digital Library

    Five moderate-to-strong earthquakes struck California in June 2005. Could the cluster of quakes be a harbinger of the Big One? Another earthquake-prone area, New Madrid, near Memphis, Tennessee, has had more than its share of impressive quakes and strain is building along its fault lines. This radio broadcast discusses these two seismic zones, the new data based on years of GPS (Global Positioning System) measurements that may give scientists more information, and how the Earth generates the stress which leads to earthquakes. There is also discussion of the danger of tsunamis in the Virgin Islands and the need for a worldwide tsunami warning network. The broadcast is 18 minutes in length.

  9. Correlation of ground motion and intensity for the 17 January 1994 Northridge, California, earthquake

    USGS Publications Warehouse

    Boatwright, J.; Thywissen, K.; Seekins, L.C.

    2001-01-01

    We analyze the correlations between intensity and a set of ground-motion parameters obtained from 66 free-field stations in Los Angeles County that recorded the 1994 Northridge earthquake. We use the tagging intensities from Thywissen and Boatwright (1998) because these intensities are determined independently on census tracts, rather than interpolated from zip codes, as are the modified Mercalli isoseismals from Dewey et al. (1995). The ground-motion parameters we consider are the peak ground acceleration (PGA), the peak ground velocity (PGV), the 5% damped pseudovelocity response spectral (PSV) ordinates at 14 periods from 0.1 to 7.5 sec, and the rms average of these spectral ordinates from 0.3 to 3 sec. Visual comparisons of the distribution of tagging intensity with contours of PGA, PGV, and the average PSV suggest that PGV and the average PSV are better correlated with the intensity than PGA. The correlation coefficients between the intensity and the ground-motion parameters bear this out: r = 0.75 for PGA, 0.85 for PGV, and 0.85 for the average PSV. Correlations between the intensity and the PSV ordinates, as a function of period, are strongest at 1.5 sec (r = 0.83) and weakest at 0.2 sec (r = 0.66). Regressing the intensity on the logarithms of these ground-motion parameters yields relations of the form I ≈ m log x, with 3.0 ≤ m ≤ 5.2 for the parameters analyzed, where m = 4.4 ± 0.7 for PGA, 3.4 ± 0.4 for PGV, and 3.6 ± 0.5 for the average PSV.
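
A regression of intensity on the logarithm of a ground-motion parameter, I ≈ m log x + b, together with a correlation coefficient r, can be sketched with ordinary least squares. The PGV-intensity pairs below are hypothetical illustration values, not the Northridge data.

```python
import math

def fit_intensity_vs_log(gm, intensity):
    """Least-squares fit I = m*log10(gm) + b and correlation coefficient r."""
    x = [math.log10(g) for g in gm]
    n = len(x)
    mx = sum(x) / n
    my = sum(intensity) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, intensity))
    syy = sum((yi - my) ** 2 for yi in intensity)
    m = sxy / sxx
    b = my - m * mx
    r = sxy / math.sqrt(sxx * syy)
    return m, b, r

# hypothetical PGV (cm/s) and tagging-intensity pairs
pgv = [2.0, 5.0, 12.0, 30.0, 70.0]
intensity = [5.1, 6.3, 7.8, 9.0, 10.4]
m, b, r = fit_intensity_vs_log(pgv, intensity)
print(f"slope m = {m:.1f}, r = {r:.2f}")
```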

  10. Finite-fault analysis of the 2004 Parkfield, California, earthquake using Pnl waveforms

    USGS Publications Warehouse

    Mendoza, C.; Hartzell, S.

    2008-01-01

    We apply a kinematic finite-fault inversion scheme to Pnl displacement waveforms recorded at 14 regional stations, using both synthetic Green's functions (SGF) computed from an assumed crustal structure and empirical Green's functions (EGF) derived from a Mw 5.0 aftershock. Slip is modeled on a rectangular fault subdivided into 2×2 km subfaults assuming a constant rupture velocity and a 0.5 sec rise time. A passband filter of 0.1–0.5 Hz is applied to both data and subfault responses prior to waveform inversion. The SGF inversions are performed such that the final seismic moment is consistent with the known magnitude (Mw 6.0) of the earthquake. For these runs, it is difficult to reproduce the entire Pnl waveform due to inaccuracies in the assumed crustal structure. Also, the misfit between observed and predicted vertical waveforms is similar in character for different rupture velocities, indicating that neither the rupture velocity nor the exact position of slip sources along the fault can be uniquely identified. The pattern of coseismic slip, however, compares well with independent source models derived using other data types, indicating that the SGF inversion procedure provides a general first-order estimate of the 2004 Parkfield rupture using the vertical Pnl records. The best-constrained slip model is obtained using the single-aftershock EGF approach. In this case, the waveforms are very well reproduced for both vertical and horizontal components, suggesting that the method provides a powerful tool for estimating the distribution of coseismic slip using the regional Pnl waveforms. The inferred slip model shows a localized patch of high slip (55 cm peak) near the hypocenter and a larger slip area (~50 cm peak) extending between 6 and 20 km to the northwest.

  11. The Distribution of Earthquakes: Where Do Large Earthquakes Occur?

    NSDL National Science Digital Library

    John Marquis

    In this activity, students investigate the distribution of large earthquakes (magnitude greater than 6) in Southern California. Using online maps of earthquake epicenters in Southern California and the Los Angeles Basin, they will compare these distributions with historic distributions (1932-1996), and with respect to the locations of major fault traces.

  12. Spatially heterogeneous stress field in the source area of the 2011 Mw 6.6 Fukushima-Hamadori earthquake, NE Japan, probably caused by static stress change

    NASA Astrophysics Data System (ADS)

    Yoshida, Keisuke; Hasegawa, Akira; Okada, Tomomi

    2015-05-01

    In order to know whether principal stress orientations in the source area rotated after the 2011 April 11 Mw 6.6 Fukushima-Hamadori earthquake in NE Japan, we investigated detailed spatial distributions of stress orientations for both the pre- and post-main shock periods using a large amount of focal mechanism data. We applied stress tensor inversions to focal mechanism data from Japan's National Research Institute for Earth Science and Disaster Prevention's F-net broadband seismic network and the Japan Meteorological Agency (JMA). The σ3-axes estimated for the pre-main shock period are predominantly oriented WSW-ENE, and are relatively homogeneous in space. In contrast, the orientations of the σ3-axes show a significantly heterogeneous distribution in space for the post-main shock period. In the northern subarea of the focal region, the σ3-axes are oriented NW-SE. In the east and west portions of the central subarea, they are oriented NNW-SSE and WNW-ESE, respectively, almost perpendicular to each other. In the southern subarea, the σ3-axes are oriented WSW-ENE. On the whole, the σ3-axis orientations show a concentric circle-like distribution surrounding the large slip area of the Mw 6.6 main shock rupture. The change of principal stress axis orientations after the earthquake is not significant because of the sparse data set for the pre-main shock period. We calculated static stress changes from the Mw 6.6 main shock and three Mw > 5.5 earthquakes to compare with the observed stress axis orientations in the post-main shock period. The σ3-axis orientations of the calculated total static stress change show a concentric circle-like distribution surrounding the large slip area of the main shock, similar to that noted above. This observation strongly suggests that the spatially heterogeneous stress orientations in the post-main shock period were caused by the static stress change from the Mw 6.6 main shock and other large earthquakes.
In order to estimate the differential stress magnitude in the focal area, we calculated deviatoric stress tensors in the post-main shock period by assuming that they are the sum of the deviatoric stress tensors in the pre-main shock period and the static stress changes. Comparison of the calculated and observed stress tensors revealed differential stress magnitudes of 2-30 MPa that explain the observed stress orientations, considering the probable range of estimated stress ratios in the pre-main shock period.

  13. The 1994 Northridge, California, earthquake: Investigation of rupture velocity, risetime, and high-frequency radiation

    NASA Astrophysics Data System (ADS)

    Hartzell, Stephen; Liu, Pengcheng; Mendoza, Carlos

    1996-09-01

    A hybrid global search algorithm is used to solve the nonlinear problem of calculating slip amplitude, rake, risetime, and rupture time on a finite fault. Thirty-five strong motion velocity records are inverted by this method over the frequency band from 0.1 to 1.0 Hz for the Northridge earthquake. Four regions of larger-amplitude slip are identified: one near the hypocenter at a depth of 17 km, a second west of the hypocenter at about the same depth, a third updip from the hypocenter at a depth of 10 km, and a fourth updip from the hypocenter and to the northwest. The results further show an initial fast rupture with a velocity of 2.8 to 3.0 km/s followed by a slow termination of the rupture with velocities of 2.0 to 2.5 km/s. The initial energetic rupture phase lasts for 3 s, extending out 10 km from the hypocenter. Slip near the hypocenter has a short risetime of 0.5 s, which increases to 1.5 s for the major slip areas removed from the hypocentral region. The energetic rupture phase is also shown to be the primary source of high-frequency radiation (1-15 Hz) by an inversion of acceleration envelopes. The same global search algorithm is used in the envelope inversion to calculate high-frequency radiation intensity on the fault and rupture time. The rupture timing from the low- and high-frequency inversions is similar, indicating that the high frequencies are produced primarily at the mainshock rupture front. Two major sources of high-frequency radiation are identified within the energetic rupture phase, one at the hypocenter and another deep source to the west of the hypocenter. The source at the hypocenter is associated with the initiation of rupture and the breaking of a high-stress-drop asperity and the second is associated with stopping of the rupture in a westerly direction.

  14. Prospective Tests of Earthquake Forecasts

    NASA Astrophysics Data System (ADS)

    Jackson, D. D.; Kagan, Y. Y.; Rong, Y.; Shen, Z.

    2001-12-01

    We use likelihood and other probabilistic tests to evaluate several recent earthquake forecasts. Here we discuss the testing methodology and applications to specific forecasts. When forecasts are made in terms of rate density (probability per unit area, magnitude, and time), it is especially easy to compare their performance against future earthquakes and against other forecasts. Here we use only prospective tests (that is, based on earthquakes that occur after the model and all parameters are fixed and announced). In 1999 we announced yearly test forecasts for earthquakes over magnitude 5.8 in the NW Pacific and SW Pacific regions (http://scec.ess.ucla.edu/~ykagan/predictions_index.html). The forecast model is based on smoothed seismicity determined from the Harvard CMT catalog, which begins in 1977. The forecast model assumes no time dependence, thus the forecast provides a good null hypothesis against which to test time-dependent models. Earthquakes in 1999 and 2000 were consistent with our forecasts except for the SW Pacific during 1999. Before then, we had smoothed the seismicity too much. We reduced the smoothing for the year 2000 forecast, with good results. In this presentation, we will present preliminary results for 2001. We also presented in 1999 a probability map for California based on the historic earthquake catalog back to 1850. Competing forecasts, based on geodetic strain rates, have since been proposed. All are time-independent and will provide a standard against which to test time-dependent models. We will present a comparison of the predictive ability of the various forecasts at a magnitude threshold of about 4.5. Kossobokov et al. went public in 1999 with semi-annual "M8" forecasts (http://mitp.ru/), which identified circles within which large earthquakes (magnitude 7.5 and above) are expected. Another version uses a threshold of 8.0.
We will present an alternative "null hypothesis," with earthquake rate constant in time and following a modified Gutenberg-Richter magnitude distribution. We will evaluate these hypotheses prospectively against each other.
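
The likelihood testing of rate-density forecasts described above is commonly posed as independent Poisson counts per space-magnitude cell, with the forecast providing the expected rates. A minimal sketch, with hypothetical four-cell forecasts (not the NW/SW Pacific models):

```python
import math

def poisson_log_likelihood(rates, counts):
    """Joint log-likelihood of observed earthquake counts per cell under a
    forecast of expected rates, treating cells as independent Poissons."""
    return sum(n * math.log(lam) - lam - math.lgamma(n + 1)
               for lam, n in zip(rates, counts))

# hypothetical 4-cell comparison: a smoothed-seismicity forecast versus a
# time-independent uniform null with the same total rate
forecast = [0.5, 1.5, 0.2, 0.8]
null = [0.75, 0.75, 0.75, 0.75]
observed = [1, 2, 0, 1]

ll_diff = (poisson_log_likelihood(forecast, observed)
           - poisson_log_likelihood(null, observed))
print(ll_diff)  # positive: observations favour the spatially varying forecast
```

In a prospective test, the rates are frozen before the test period begins, and the log-likelihood difference accumulated over cells (and years) scores one forecast against the other.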

  15. Evidence for large earthquakes on the San Andreas fault at the Wrightwood, California paleoseismic site: A.D. 500 to present

    USGS Publications Warehouse

    Fumal, T.E.; Weldon, R.J.; Biasi, G.P.; Dawson, T.E.; Seitz, G.G.; Frost, W.T.; Schwartz, D.P.

    2002-01-01

    We present structural and stratigraphic evidence from a paleoseismic site near Wrightwood, California, for 14 large earthquakes that occurred on the southern San Andreas fault during the past 1500 years. In a network of 38 trenches and creek-bank exposures, we have exposed a composite section of interbedded debris flow deposits and thin peat layers more than 24 m thick; fluvial deposits occur along the northern margin of the site. The site is a 150-m-wide zone of deformation bounded on the surface by a main fault zone along the northwest margin and a secondary fault zone to the southwest. Evidence for most of the 14 earthquakes occurs along structures within both zones. We identify paleoearthquake horizons using infilled fissures, scarps, multiple rupture terminations, and widespread folding and tilting of beds. Ages of stratigraphic units and earthquakes are constrained by historic data and 72 14C ages, mostly from samples of peat and some from plant fibers, wood, pine cones, and charcoal. Comparison of the long, well-resolved paleoseismic record at Wrightwood with records at other sites along the fault indicates that rupture lengths of past earthquakes were at least 100 km long. Paleoseismic records at sites in the Coachella Valley suggest that each of the past five large earthquakes recorded there ruptured the fault at least as far northwest as Wrightwood. Comparisons with event chronologies at Pallett Creek and sites to the northwest suggest that approximately the same part of the fault that ruptured in 1857 may also have failed in the early to mid-sixteenth century and several other times during the past 1200 years. Records at Pallett Creek and Pitman Canyon suggest that, in addition to the 14 earthquakes we document, one and possibly two other large earthquakes ruptured the part of the fault including Wrightwood since about A.D. 500.
These observations, and elapsed times that are significantly longer than mean recurrence intervals at Wrightwood and sites to the southeast, suggest that at least the southernmost 200 km of the San Andreas fault is near failure.

  16. Downscaling of slip distribution for strong earthquakes

    NASA Astrophysics Data System (ADS)

    Yoshida, T.; Oya, S.; Kuzuha, Y.

    2013-12-01

    We intend to develop a downscaling model to enhance the earthquake slip distribution resolution. Slip distributions have been obtained by other researchers using various inversion methods. As a downscaling model, we are discussing fractal models that include mono-fractal models (fractional Brownian motion, fBm; fractional Lévy motion, fLm) and multi-fractal models as candidates. Log - log-linearity of k (wave number) versus E (k) (power spectrum) is the necessary condition for fractality: the slip distribution is expected to satisfy log - log-linearity described above if we can apply fractal model to a slip distribution as a downscaling model. Therefore, we conducted spectrum analyses using slip distributions of 11 earthquakes as explained below. 1) Spectrum analyses using one-dimensional slip distributions (strike direction) were conducted. 2) Averaging of some results of power spectrum (dip direction) was conducted. Results show that, from the viewpoint of log - log-linearity, applying a fractal model to slip distributions can be inferred as valid. We adopt the filtering method after Lavallée (2008) to generate fBm/ fLm. In that method, generated white noises (random numbers) are filtered using a power law type filter (log - log-linearity of the spectrum). Lavallée (2008) described that Lévy white noise that generates fLm is more appropriate than the Gaussian white noise which generates fBm. In addition, if the 'alpha' parameter of the Lévy law, which governs the degree of attenuation of tails of the probability distribution, is 2.0, then the Lévy distribution is equivalent to the Gauss distribution. 
We analyzed slip distributions of 11 earthquakes, including the Tohoku earthquake (Wei et al., 2011), Haiti earthquake (Sladen, 2010), Simeulue earthquake (Sladen, 2008), eastern Sichuan earthquake (Sladen, 2008), Peru earthquake (Konca, 2007), Tocopilla earthquake (Sladen, 2007), Kuril earthquake (Sladen, 2007), Bengkulu earthquake (Konca, 2007), and southern Java earthquake (Konca, 2006). We obtained the following results. 1) Log-log linearity (the slope of the linear relationship is −β) of k versus E(k) holds for all earthquakes. 2) For example, β = 3.70 and α = 1.96 for the Tohoku earthquake (2011), and β = 4.16 and α = 2.00 for the Haiti earthquake (2010). For these cases the Gauss law is appropriate because α is almost 2.00. 3) However, β = 5.25 and α = 1.25 for the Peru earthquake (2007), and β = 2.24 and α = 1.57 for the Simeulue earthquake (2008). For these earthquakes the Lévy law is more appropriate because α is far from 2.0. 4) Although Lavallée (2003, 2008) concluded that the Lévy law is more appropriate than the Gauss law for the white noise that is later filtered, our results show that the Gauss law is appropriate for some earthquakes. References: Lavallée and Archuleta (2003), Stochastic modeling of slip spatial complexities for the 1979 Imperial Valley, California, earthquake, Geophysical Research Letters, 30(5); Lavallée (2008), On the random nature of earthquake source and ground motion: a unified theory, Advances in Geophysics, 50, Chap. 16.
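The filtering method described above can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the authors' code: Gaussian white noise is filtered with a power-law spectrum (the fBm case; substituting Lévy-stable noise would give fLm), and the spectral slope −β is recovered by a log-log fit. Function names and the seed are our own assumptions.

```python
import numpy as np

def powerlaw_filtered_noise(n, beta, seed=0):
    """Generate a 1-D field whose power spectrum E(k) ~ k**(-beta)
    by spectrally filtering Gaussian white noise (fBm-type field)."""
    rng = np.random.default_rng(seed)
    white = rng.standard_normal(n)
    spec = np.fft.rfft(white)
    k = np.fft.rfftfreq(n)
    k[0] = k[1]                  # avoid division by zero at k = 0
    spec *= k ** (-beta / 2.0)   # amplitude ~ k^(-beta/2)  =>  power ~ k^(-beta)
    return np.fft.irfft(spec, n)

def spectral_slope(x):
    """Estimate the slope of log E(k) vs log k (expected to be -beta)."""
    power = np.abs(np.fft.rfft(x)) ** 2
    k = np.fft.rfftfreq(len(x))[1:]          # drop the DC term
    slope, _ = np.polyfit(np.log(k), np.log(power[1:]), 1)
    return slope

# beta = 3.70 is the value reported above for the Tohoku slip distribution
field = powerlaw_filtered_noise(1024, beta=3.7)
```

Checking `spectral_slope(field)` against the input β is exactly the log-log-linearity test applied to the 11 slip distributions.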

  17. ULF Pulsations, Air Conductivity Changes, and Infrared (IR) Radiation Signatures Observed Prior to the 2008 Alum Rock (California) M5.4 Earthquake

    NASA Astrophysics Data System (ADS)

    Bleier, T. E.; Dunson, C.; Maniscalco, M.; Bryant, N.; Bambery, R.

    2008-12-01

A collaboration between QuakeFinder (Palo Alto) and NASA JPL utilized both ground and space instruments to observe a series of electromagnetic (EM) signals detected up to 2 weeks prior to the October 30, 2007, Alum Rock, California, M5.4 earthquake. These signals included ultra-low-frequency (ULF: 0.01 to 12 Hz) pulsations detected with a 3-axis induction magnetometer located 2 km from the epicenter. The 1- to 12-s-wide pulsations were 10-50 times more intense than normal background noise levels established over 2 years, and they occurred 10-30 times more frequently in the 2 weeks prior to the quake than the average pulse count for the preceding 1.8 years. The air-conductivity sensor at the same site saturated for much of the evening prior to the quake. The conductivity levels were compared to the previous year's average conductivity patterns at the site and determined not to be caused by moisture contamination. The GOES-West weather satellite, which routinely observes the west coast of California, detected during October 2007 an area extending almost 100 km around the quake in which the usual nighttime cooling rate (a 4-yr-average negative temperature slope) changed to a positive nighttime slope for much of the 2 weeks prior to the quake. These EM signals were then compared against predictions based on several earthquake theories postulated in recent years.

  18. Real-time Testing of On-site Earthquake Early Warning within the California Integrated Seismic Network (CISN) Using Statewide Distributed and On-site Processing

    NASA Astrophysics Data System (ADS)

    Böse, M.; Hauksson, E.; Solanki, K.; Kanamori, H.; Heaton, T. H.; Wu, Y.

    2008-12-01

Currently, the real-time performance of three algorithms for earthquake early warning is being tested within the California Integrated Seismic Network (CISN). We report on the implementation and performance of the τc-Pd on-site warning algorithm in California and describe recent improvements to the software. These include: (1) the development of a new τc-Pd-based trigger criterion to reduce the number of false triggers and the scatter in magnitude estimates for small and moderate earthquakes; (2) the integration of additional broadband stations, including the ANZA network and stations with older dataloggers that provide waveform data at a maximum sampling rate of 80 sps; and (3) the implementation of leap-second capabilities in the real-time software used within CISN. At present, we are working on the implementation of remote processing sites for the BK, NC, and NP networks operated by UC Berkeley and USGS Menlo Park. These processing sites will analyze available local waveform data and provide τc-Pd values as well as Mw and PGV estimates. The new processing sites will provide more data for algorithm testing and improved data analysis for earthquakes located in northern California. We are also implementing the τc-Pd algorithm software on SLATE field processors to eliminate telemetry delays associated with waveform data. On site, the SLATE receives data from a Q330 datalogger and provides τc-Pd estimates from the first 3 seconds of the P waveform. These τc-Pd values, along with station-specific Mw and PGV values, are transmitted to the central site as a short notification message. In the future, such station processors could also transmit warnings to local users directly. The τc-Pd algorithm software performed well during the recent July 29, 2008, Mw 5.4 Chino Hills earthquake. A total of 36 stations provided real-time estimates of τc-Pd values and derived Mw and PGV values. 
The current CISN network configuration would have provided a 6-second warning at Los Angeles City Hall, which is located 50 km west-southwest of the mainshock epicenter.

  19. Comparisons of ground motions from the 1999 Chi-Chi earthquake with empirical predictions largely based on data from California

    USGS Publications Warehouse

    Boore, D.M.

    2001-01-01

This article has the modest goal of comparing the ground motions recorded during the 1999 Chi-Chi, Taiwan, mainshock with predictions from four empirically based equations commonly used for western North America; these empirical predictions are largely based on data from California. Comparisons are made for peak acceleration and 5%-damped response spectra at periods between 0.1 and 4 sec. The general finding is that the Chi-Chi ground motions are smaller than those predicted by the empirically based equations for periods less than about 1 sec, by factors averaging about 0.4 but as small as 0.26 (depending on period, on which equation is used, and on whether the sites are assumed to be rock or soil). There is a trend for the observed motions to approach or even exceed the predicted motions at longer periods. Motions at similar distances (30-60 km) to the east and to the west of the fault differ dramatically at periods between about 2 and 20 sec: long-duration wave trains are present in the motions to the west, and when normalized to similar amplitudes at short periods, the response spectra of the motions at the western stations are as much as five times larger than those at the eastern stations. The explanation for the difference is probably related to site and propagation effects; the western stations are on the Coastal Plain, whereas the eastern stations are at the foot of young and steep mountains, either in the relatively narrow Longitudinal Valley or along the eastern coast; the sediments underlying the eastern stations are probably shallower and have higher velocities than those under the western stations.

  20. Testing hypotheses of earthquake occurrence

    NASA Astrophysics Data System (ADS)

    Kagan, Y. Y.; Jackson, D. D.; Schorlemmer, D.; Gerstenberger, M.

    2003-12-01

We present a relatively straightforward likelihood method for testing those earthquake hypotheses that can be stated as vectors of earthquake rate density in defined bins of area, magnitude, and time. We illustrate the method as it will be applied to the Regional Earthquake Likelihood Models (RELM) project of the Southern California Earthquake Center (SCEC). Several earthquake forecast models are being developed as part of this project, and additional contributed forecasts are welcome. Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. We would test models in pairs, requiring that both forecasts in a pair be defined over the same set of bins. Thus we offer a standard "menu" of bins and ground rules to encourage standardization. One menu category includes five-year forecasts of magnitude 5.0 and larger. Forecasts would be in the form of a vector of yearly earthquake rates on a 0.05 degree grid at the beginning of the test. Focal mechanism forecasts, when available, would also be archived and used in the tests. The five-year forecast category may be appropriate for testing hypotheses of stress shadows from large earthquakes. Interim progress will be evaluated yearly, but final conclusions would be made on the basis of cumulative five-year performance. The second category includes forecasts of earthquakes above magnitude 4.0 on a 0.05 degree grid, evaluated and renewed daily. Final evaluation would be based on cumulative performance over five years. Other types of forecasts with different magnitude, space, and time sampling are welcome and will be tested against other models with shared characteristics. All earthquakes would be counted, and no attempt made to separate foreshocks, main shocks, and aftershocks. Earthquakes would be considered as point sources located at the hypocenter. 
For each pair of forecasts, we plan to compute alpha, the probability that the first would be wrongly rejected in favor of the second, and beta, the probability that the second would be wrongly rejected in favor of the first. Computing alpha and beta requires knowing the theoretical distribution of likelihood scores under each hypothesis, which we will estimate by simulations. Each forecast is given equal status; there is no "null hypothesis" that would be accepted by default. Forecasts and test results would be archived and posted on the RELM web site. The same methods can be applied to any region with adequate monitoring and sufficient earthquakes. If fewer than ten events are forecast, the likelihood tests may not give definitive results. The tests do force certain requirements on the forecast models. Because the tests are based on absolute rates, stress models must be explicit about how stress increments affect past seismicity rates. Aftershocks of triggered events must be accounted for. Furthermore, the tests are sensitive to magnitude, so forecast models must specify the magnitude distribution of triggered events. Models should account for probable errors in magnitude and location by appropriate smoothing of the probabilities, as the tests will be "cold hearted": near misses won't count.
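The simulation-based estimate of alpha described above can be sketched as follows. This is a simplified Poisson-rate illustration with hypothetical helper names, not the RELM code: catalogs are simulated under forecast A, and alpha is the fraction in which forecast B scores the higher likelihood.

```python
import numpy as np

def log_likelihood(rates, counts):
    """Poisson log-likelihood of observed bin counts given forecast rates.
    The log(n!) term is omitted because it cancels in likelihood ratios."""
    rates = np.asarray(rates, float)
    counts = np.asarray(counts, float)
    return np.sum(counts * np.log(rates) - rates)

def misrejection_prob(rates_a, rates_b, n_sim=5000, seed=0):
    """Estimate alpha: the probability that forecast A is wrongly rejected
    in favor of B when A is in fact true, via simulated catalogs."""
    rng = np.random.default_rng(seed)
    wins_b = 0
    for _ in range(n_sim):
        catalog = rng.poisson(rates_a)  # a world generated by forecast A
        if log_likelihood(rates_b, catalog) > log_likelihood(rates_a, catalog):
            wins_b += 1
    return wins_b / n_sim
```

Swapping the roles of the two rate vectors gives beta; as the abstract notes, neither forecast is privileged as a null hypothesis.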

  1. Web-Based Real Time Earthquake Forecasting and Personal Risk Management

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.

    2012-12-01

Earthquake forecasts have been computed by a variety of countries and economies world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance, and underwriters of catastrophe bonds. One example is the Working Group on California Earthquake Probabilities, which has been responsible for the official California earthquake forecast since 1988. However, in a time of increasingly severe global financial constraints, we are moving inexorably towards personal risk management, wherein mitigating risk is becoming the responsibility of individual members of the public. Under these circumstances, open access to a variety of web-based tools, utilities, and information is a necessity. Here we describe a web-based system that has been operational since 2009 at www.openhazards.com and www.quakesim.org. Models for earthquake physics and forecasting require input data, along with model parameters. The models we consider are the Natural Time Weibull (NTW) model for regional earthquake forecasting, together with models for activation and quiescence. These "seismicity-based models" use small earthquakes to forecast the occurrence of large earthquakes, either through varying rates of small earthquake activity or via an accumulation of this activity over time. These approaches use data-mining algorithms combined with the ANSS earthquake catalog. The basic idea is to compute large-earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Each of these approaches has computational challenges associated with computing forecast information in real time. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we show that real-time forecasting is possible at a grid scale of 0.1°. We have analyzed the performance of these models using Reliability/Attributes and standard Receiver Operating Characteristic (ROC) tests. 
We show how the Reliability and ROC tests allow us to judge data completeness and estimate error. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges and pitfalls in serving up these datasets over the web.
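The "count small earthquakes since the last large one" idea can be illustrated with a toy natural-time Weibull calculation. The functional form, parameter values, and function name below are our own illustrative assumptions, not the operational NTW model:

```python
import math

def ntw_probability(n_small, n_mean, beta=1.4, horizon_frac=0.1):
    """Conditional probability that the next large earthquake occurs within
    the next horizon_frac * n_mean small events, given that n_small small
    events have elapsed since the last large one, under a Weibull survival
    model in natural time (event counts instead of clock time)."""
    def survival(n):
        return math.exp(-((n / n_mean) ** beta))
    s_now = survival(n_small)
    s_later = survival(n_small + horizon_frac * n_mean)
    return 1.0 - s_later / s_now
```

With a shape parameter beta > 1 the hazard grows as small-event counts accumulate, so the forecast probability rises the longer a region has gone without a large earthquake, in natural time.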

  2. Locating Earthquake Epicenters

    NSDL National Science Digital Library

    Pinter, Nicholas

In this exercise, students use data from the 1994 Northridge, California, earthquake to locate the earthquake and its time of occurrence, and plot data from Central and South America on a map to delineate plate boundaries. Introductory materials explain how earthquakes are caused, describe the types of seismic waves, and explain how the difference in arrival times may be used to calculate distance to the earthquake. Each portion of the exercise includes instructions, datasets, maps, travel-time graphs, study questions, and tables for entering data. A bibliography is also provided.
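The arrival-time-difference idea behind the exercise reduces to a one-line calculation: since P and S waves travel the same path at different speeds, the S-minus-P delay fixes the distance. The velocity values below are typical crustal assumptions for illustration, not values from the exercise:

```python
def sp_distance(ts_minus_tp, vp=6.0, vs=3.5):
    """Distance (km) to an earthquake from the S-minus-P arrival-time
    difference (s), assuming uniform P and S wave speeds (km/s).
    tp = d/vp and ts = d/vs, so ts - tp = d * (1/vs - 1/vp)."""
    return ts_minus_tp / (1.0 / vs - 1.0 / vp)
```

A 10-s S-minus-P delay then corresponds to a distance of 84 km; circles of such radii from three stations intersect at the epicenter, which is the triangulation step students perform on the maps.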

  3. USGS Earthquake Hazards Program

    NSDL National Science Digital Library

    This site serves as a portal to all US Geological Survey (USGS) earthquake information, both real-time and historic. At the site, visitors can find information on past, present, and predicted future earthquake activity; access a range of publications, maps, and fact sheets; use a number of earthquake education activities; link to various earthquake research centers; and read in-depth information on selected recent earthquakes worldwide. While the site does offer some detailed information, it is probably still best suited for K-12 students and general users.

  4. A simulation-based approach to forecasting the next great San Francisco earthquake.

    PubMed

    Rundle, J B; Rundle, P B; Donnellan, A; Turcotte, D L; Shcherbakov, R; Li, P; Malamud, B D; Grant, L B; Fox, G C; McLeod, D; Yakovlev, G; Parker, J; Klein, W; Tiampo, K F

    2005-10-25

    In 1906 the great San Francisco earthquake and fire destroyed much of the city. As we approach the 100-year anniversary of that event, a critical concern is the hazard posed by another such earthquake. In this article, we examine the assumptions presently used to compute the probability of occurrence of these earthquakes. We also present the results of a numerical simulation of interacting faults on the San Andreas system. Called Virtual California, this simulation can be used to compute the times, locations, and magnitudes of simulated earthquakes on the San Andreas fault in the vicinity of San Francisco. Of particular importance are results for the statistical distribution of recurrence times between great earthquakes, results that are difficult or impossible to obtain from a purely field-based approach. PMID:16219696

  5. Seismic Hazard Assessment: Conditional Probability

    NSDL National Science Digital Library

    Nicholas Pinter

    In this exercise, students investigate the use of conditional probability (the likelihood that a given event will occur within a specified time period) in assessing earthquake hazards. Introductory materials explain that conditional probability is based on the past history of earthquakes in a region and on how and when earthquakes recur; and discuss the different types of models that can be developed to predict recurrences. Using a table of probability values, students will calculate probabilities for earthquakes along the San Andreas and Wasatch Fault zones, and calculate probabilities that they will exceed a given acceleration (ground shaking) value. Example problems and a bibliography are provided.
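The conditional-probability calculation described above can be sketched for one common recurrence model. The lognormal form and all parameter values below are illustrative assumptions, not the exercise's tabulated values:

```python
import math

def conditional_probability(t_elapsed, dt, mean_ri, cov):
    """P(earthquake within the next dt years, given none in the t_elapsed
    years since the last one), for a lognormal renewal model with the
    given mean recurrence interval and coefficient of variation."""
    sigma = math.sqrt(math.log(1.0 + cov ** 2))
    mu = math.log(mean_ri) - 0.5 * sigma ** 2

    def cdf(t):  # lognormal CDF via the error function
        return 0.5 * (1.0 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2.0))))

    return (cdf(t_elapsed + dt) - cdf(t_elapsed)) / (1.0 - cdf(t_elapsed))
```

The conditioning on elapsed time is what distinguishes this from a Poisson estimate: the same 30-yr window yields different probabilities depending on how long ago the last earthquake occurred.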

  6. Hotspots, Lifelines, and the SAFRR Haywired Earthquake Sequence

    NASA Astrophysics Data System (ADS)

    Ratliff, J. L.; Porter, K.

    2014-12-01

    Though California has experienced many large earthquakes (San Francisco, 1906; Loma Prieta, 1989; Northridge, 1994), the San Francisco Bay Area has not had a damaging earthquake for 25 years. Earthquake risk and surging reliance on smartphones and the Internet to handle everyday tasks raise the question: is an increasingly technology-reliant Bay Area prepared for potential infrastructure impacts caused by a major earthquake? How will a major earthquake on the Hayward Fault affect lifelines (roads, power, water, communication, etc.)? The U.S. Geological Survey Science Application for Risk Reduction (SAFRR) program's Haywired disaster scenario, a hypothetical two-year earthquake sequence triggered by a M7.05 mainshock on the Hayward Fault, addresses these and other questions. We explore four geographic aspects of lifeline damage from earthquakes: (1) geographic lifeline concentrations, (2) areas where lifelines pass through high shaking or potential ground-failure zones, (3) areas with diminished lifeline service demand due to severe building damage, and (4) areas with increased lifeline service demand due to displaced residents and businesses. Potential mainshock lifeline vulnerability and spatial demand changes will be discerned by superimposing earthquake shaking, liquefaction probability, and landslide probability damage thresholds with lifeline concentrations and with large-capacity shelters. Intersecting high hazard levels and lifeline clusters represent potential lifeline susceptibility hotspots. We will also analyze possible temporal vulnerability and demand changes using an aftershock shaking threshold. The results of this analysis will inform regional lifeline resilience initiatives and response and recovery planning, as well as reveal potential redundancies and weaknesses for Bay Area lifelines. Identified spatial and temporal hotspots can provide stakeholders with a reference for possible systemic vulnerability resulting from an earthquake sequence.

  7. Coseismic fault slip associated with the 1992 M(sub w) 6.1 Joshua Tree, California, earthquake: Implications for the Joshua Tree-Landers earthquake sequence

    NASA Technical Reports Server (NTRS)

    Bennett, Richard A.; Reilinger, Robert E.; Rodi, William; Li, Yingping; Toksoz, M. Nafi; Hudnut, Ken

    1995-01-01

Coseismic surface deformation associated with the M(sub w) 6.1, April 23, 1992, Joshua Tree earthquake is well represented by estimates of geodetic monument displacements at 20 locations independently derived from Global Positioning System and trilateration measurements. The rms signal-to-noise ratio for these inferred displacements is 1.8, with near-fault displacement estimates exceeding 40 mm. In order to determine the long-wavelength distribution of slip over the plane of rupture, a Tikhonov regularization operator is applied to these estimates which minimizes stress variability subject to purely right-lateral slip and zero surface slip constraints. The resulting slip distribution yields a geodetic moment estimate of 1.7 x 10(exp 18) N m with corresponding maximum slip around 0.8 m and compares well with independent and complementary information including seismic moment and source time function estimates and main shock and aftershock locations. From empirical Green's function analyses, a rupture duration of 5 s is obtained, which implies a rupture radius of 6-8 km. Most of the inferred slip lies to the north of the hypocenter, consistent with northward rupture propagation. Stress drop estimates are in the range of 2-4 MPa. In addition, predicted Coulomb stress increases correlate remarkably well with the distribution of aftershock hypocenters; most of the aftershocks occur in areas for which the mainshock rupture produced stress increases larger than about 0.1 MPa. In contrast, predicted stress changes are near zero at the hypocenter of the M(sub w) 7.3, June 28, 1992, Landers earthquake, which nucleated about 20 km beyond the northernmost edge of the Joshua Tree rupture. Based on aftershock migrations and the predicted static stress field, we speculate that redistribution of Joshua Tree-induced stress perturbations played a role in the spatio-temporal development of the earthquake sequence culminating in the Landers event.

  8. Processed seismic motion records from Big Bear, California earthquake of June 28, 1992, recorded at seismograph stations in southern Nevada

    SciTech Connect

    Lum, P.K.; Honda, K.K.

    1993-04-01

    As part of the contract with the US Department of Energy, Nevada Field office (DOE/NV), URS/John A. Blume & Associates, Engineers (URS/Blume) maintains a network of seismographs in southern Nevada to monitor the ground motion generated by the underground nuclear explosions (UNEs) at the Nevada Test Site (NTS). The seismographs are located in the communities surrounding the NTS and the Las Vegas valley. When these seismographs are not used for monitoring the UNE generated motions, a limited number of seismographs are maintained for monitoring motion generated by other than UNEs (e.g. motion generated by earthquakes, wind, blast). During the subject earthquake of June 28, 1992, a total of 15 of these systems recorded the earthquake motions. This report contains the recorded data.

  9. Processed seismic motion records from Landers, California earthquake of June 28, 1992, recorded at seismograph stations in southern Nevada

    SciTech Connect

    Lum, P.K.; Honda, K.K.

    1993-04-01

    As part of the contract with the US Department of Energy, Nevada Field office (DOE/NV), URS/John A. Blume & Associates, Engineers (URS/Blume) maintains a network of seismographs in southern Nevada to monitor the ground motion generated by the underground nuclear explosions (UNEs) at the Nevada Test Site (NTS). The seismographs are located in the communities surrounding the NTS and the Las Vegas valley. When these seismographs are not used for monitoring the UNE generated motions, a limited number of seismographs are maintained for monitoring motion generated by other than UNEs (e.g. motion generated by earthquakes, wind, blast). During the subject earthquake of June 28, 1992, a total of 19 of these systems recorded the earthquake motions. This report contains the recorded data.

  10. The influence of one earthquake on another

    NASA Astrophysics Data System (ADS)

    Kilb, Deborah Lyman

    1999-12-01

    Part one of my dissertation examines the initiation of earthquake rupture. We study the initial subevent (ISE) of the Mw 6.7 1994 Northridge, California earthquake to distinguish between two end-member hypotheses of an organized and predictable earthquake rupture initiation process or, alternatively, a random process. We find that the focal mechanisms of the ISE and mainshock are indistinguishable, and both events may have nucleated on and ruptured the same fault plane. These results satisfy the requirements for both end-member models, and do not allow us to distinguish between them. However, further tests show the ISE's waveform characteristics are similar to those of typical nearby small earthquakes (i.e., dynamic ruptures). The second part of my dissertation examines aftershocks of the M 7.1 1989 Loma Prieta, California earthquake to determine if theoretical models of static Coulomb stress changes correctly predict the fault plane geometries and slip directions of Loma Prieta aftershocks. Our work shows individual aftershock mechanisms cannot be successfully predicted because a similar degree of predictability can be obtained using a randomized catalogue. This result is probably a function of combined errors in the models of mainshock slip distribution, background stress field, and aftershock locations. In the final part of my dissertation, we test the idea that earthquake triggering occurs when properties of a fault and/or its loading are modified by Coulomb failure stress changes that may be transient and oscillatory (i.e., dynamic) or permanent (i.e., static). We propose a triggering threshold failure stress change exists, above which the earthquake nucleation process begins although failure need not occur instantaneously. We test these ideas using data from the 1992 M 7.4 Landers earthquake and its aftershocks. 
Stress changes can be categorized as either dynamic (generated during the passage of seismic waves), static (associated with permanent fault offsets caused by fault slip), or complete (including both static and dynamic). We examine theoretically calculated Coulomb failure stress changes for the static (ΔCFS) and complete (ΔCFS(t)) cases, and statistically test for a correlation with spatially varying post-Landers seismicity rate changes. We find that directivity, which was required to model waveforms of the 1992 Landers earthquake, creates an asymmetry in mapped peak ΔCFS(t). A similar asymmetry is apparent in the seismicity rate change map but not in the ΔCFS map. Statistical analyses show that peak ΔCFS(t) correlates as well or better with seismicity rate change as ΔCFS, and qualitatively peak ΔCFS(t) is the preferred model. (Abstract shortened by UMI.)
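The Coulomb failure stress change underlying both ΔCFS and ΔCFS(t) has a simple standard form; a minimal sketch follows, with the effective friction coefficient chosen as a commonly used illustrative value rather than one from this dissertation:

```python
def coulomb_stress_change(d_shear, d_normal, friction=0.4):
    """Coulomb failure stress change (MPa) resolved on a receiver fault:
    dCFS = d_tau + mu' * d_sigma_n, with shear stress change d_tau taken
    positive in the fault's slip direction and normal stress change
    d_sigma_n positive in tension (unclamping). Positive dCFS moves the
    fault toward failure; negative dCFS is a stress shadow."""
    return d_shear + friction * d_normal
```

For the static case the inputs are the permanent coseismic stress changes; for ΔCFS(t) the same expression is evaluated through time as the seismic waves pass, and its peak value is mapped.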

  11. Earthquakes and Schools

    ERIC Educational Resources Information Center

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  12. Fault systems of the 1971 San Fernando and 1994 Northridge earthquakes, southern California: Relocated aftershocks and seismic images from LARSE II

    USGS Publications Warehouse

    Fuis, G.S.; Clayton, R.W.; Davis, P.M.; Ryberg, T.; Lutter, W.J.; Okaya, D.A.; Hauksson, E.; Prodehl, C.; Murphy, J.M.; Benthien, M.L.; Baher, S.A.; Kohler, M.D.; Thygesen, K.; Simila, G.; Keller, Gordon R.

    2003-01-01

    We have constructed a composite image of the fault systems of the M 6.7 San Fernando (1971) and Northridge (1994), California, earthquakes, using industry reflection and oil test well data in the upper few kilometers of the crust, relocated aftershocks in the seismogenic crust, and LARSE II (Los Angeles Region Seismic Experiment, Phase II) reflection data in the middle and lower crust. In this image, the San Fernando fault system appears to consist of a decollement that extends 50 km northward at a dip of ???25?? from near the surface at the Northridge Hills fault, in the northern San Fernando Valley, to the San Andreas fault in the middle to lower crust. It follows a prominent aseismic reflective zone below and northward of the main-shock hypocenter. Interpreted upward splays off this decollement include the Mission Hills and San Gabriel faults and the two main rupture planes of the San Fernando earthquake, which appear to divide the hanging wall into shingle- or wedge-like blocks. In contrast, the fault system for the Northridge earthquake appears simple, at least east of the LARSE II transect, consisting of a fault that extends 20 km southward at a dip of ???33?? from ???7 km depth beneath the Santa Susana Mountains, where it abuts the interpreted San Fernando decollement, to ???20 km depth beneath the Santa Monica Mountains. It follows a weak aseismic reflective zone below and southward of the mainshock hypocenter. The middle crustal reflective zone along the interpreted San Fernando decollement appears similar to a reflective zone imaged beneath the San Gabriel Mountains along the LARSE I transect, to the east, in that it appears to connect major reverse or thrust faults in the Los Angeles region to the San Andreas fault. However, it differs in having a moderate versus a gentle dip and in containing no mid-crustal bright reflections.

  13. Liquefaction Hazard Maps for Three Earthquake Scenarios for the Communities of San Jose, Campbell, Cupertino, Los Altos, Los Gatos, Milpitas, Mountain View, Palo Alto, Santa Clara, Saratoga, and Sunnyvale, Northern Santa Clara County, California

    USGS Publications Warehouse

    Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.

    2008-01-01

Maps showing the probability of surface manifestations of liquefaction in the northern Santa Clara Valley were prepared with liquefaction probability curves. The area includes the communities of San Jose, Campbell, Cupertino, Los Altos, Los Gatos, Milpitas, Mountain View, Palo Alto, Santa Clara, Saratoga, and Sunnyvale. The probability curves were based on complementary cumulative frequency distributions of the liquefaction potential index (LPI) for surficial geologic units in the study area. LPI values were computed from extensive cone penetration test soundings. Maps were developed for three earthquake scenarios: an M7.8 on the San Andreas Fault comparable to the 1906 event, an M6.7 on the Hayward Fault comparable to the 1868 event, and an M6.9 on the Calaveras Fault. Ground motions were estimated with the Boore and Atkinson (2008) attenuation relation. Liquefaction is predicted for all three events in young Holocene levee deposits along the major creeks. Liquefaction probabilities are highest for the M7.8 earthquake, ranging from 0.33 to 0.37 if a 1.5-m deep water table is assumed, and 0.10 to 0.14 if a 5-m deep water table is assumed. Liquefaction probabilities of the other surficial geologic units are less than 0.05. Probabilities for the scenario earthquakes are generally consistent with observations during historical earthquakes.
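The liquefaction potential index used above is conventionally computed by depth-weighted integration of a factor-of-safety profile. The sketch below follows the standard Iwasaki-style formula as an illustration; it is not the report's code, and the severity function is the simplest common choice:

```python
import numpy as np

def liquefaction_potential_index(depths_m, factor_of_safety):
    """LPI = integral over 0-20 m of F(z) * w(z) dz, where the severity is
    F(z) = 1 - FS(z) wherever FS < 1 (else 0) and the depth weighting is
    w(z) = 10 - 0.5 z, so shallow liquefiable layers count most."""
    z = np.asarray(depths_m, float)
    fs = np.asarray(factor_of_safety, float)
    severity = np.where(fs < 1.0, 1.0 - fs, 0.0)
    weight = np.clip(10.0 - 0.5 * z, 0.0, None)  # zero below 20 m
    return np.trapz(severity * weight, z)
```

LPI ranges from 0 (no layer liquefies) to 100 (FS = 0 over the full 20 m); cumulative frequency distributions of such LPI values per geologic unit are what the probability curves in this report summarize.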

  14. Comments on Baseline Correction of Digital Strong-Motion Data: Examples from the 1999 Hector Mine, California, Earthquake

    Microsoft Academic Search

    David M. Boore; Christopher D. Stephens; William B. Joyner

    2002-01-01

Residual displacements for large earthquakes can sometimes be determined from recordings on modern digital instruments, but baseline offsets of unknown origin make it difficult in many cases to do so. To recover the residual displacement, we suggest tailoring a correction scheme by studying the character of the velocity obtained by integration of zeroth-order-corrected acceleration and then see-

  15. Tectonics, Earthquakes, Volcanoes

    NSDL National Science Digital Library

    Camille Holmgren

    Students do background reading on plate tectonics and associated geologic hazards. In the first part of this exercise, students use on-line courseware from California State University, Los Angeles (Virtual Earthquake) to investigate seismograph records and use these records to determine earthquake epicenters and magnitudes. In the second part, they complete a crossword puzzle designed to help them master new vocabulary related to plate tectonics.

  16. Quasi-periodic recurrence of large earthquakes on the southern San Andreas fault

    USGS Publications Warehouse

    Scharer, Katherine M.; Biasi, Glenn P.; Weldon, Ray J., II; Fumal, Tom E.

    2010-01-01

    It has been 153 yr since the last large earthquake on the southern San Andreas fault (California, United States), but the average interseismic interval is only ~100 yr. If the recurrence of large earthquakes is periodic, rather than random or clustered, the length of this period is notable and would generally increase the risk estimated in probabilistic seismic hazard analyses. Unfortunately, robust characterization of a distribution describing earthquake recurrence on a single fault is limited by the brevity of most earthquake records. Here we use statistical tests on a 3000 yr combined record of 29 ground-rupturing earthquakes from Wrightwood, California. We show that earthquake recurrence there is more regular than expected from a Poisson distribution and is not clustered, leading us to conclude that recurrence is quasi-periodic. The observation of unimodal time dependence is persistent across an observationally based sensitivity analysis that critically examines alternative interpretations of the geologic record. The results support formal forecast efforts that use renewal models to estimate probabilities of future earthquakes on the southern San Andreas fault. Only four intervals (15%) from the record are longer than the present open interval, highlighting the current hazard posed by this fault.
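The basic periodicity diagnostic in studies like this one, the coefficient of variation (COV) of recurrence intervals, is easy to compute. This is a generic sketch, not the authors' statistical tests:

```python
import numpy as np

def coefficient_of_variation(event_years):
    """COV of the recurrence intervals between dated earthquakes:
    approximately 1 for a Poisson (random) process, < 1 for quasi-periodic
    recurrence, and > 1 for clustered recurrence."""
    intervals = np.diff(np.sort(np.asarray(event_years, float)))
    return intervals.std(ddof=1) / intervals.mean()
```

A perfectly periodic record gives COV = 0; values well below 1, as found at Wrightwood, are the signature of quasi-periodic behavior that justifies renewal-model forecasts.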

  17. A reevaluation of the Pallett Creek earthquake chronology based on new AMS radiocarbon dates, San Andreas fault, California

    USGS Publications Warehouse

    Scharer, K.M.; Biasi, G.P.; Weldon, R.J., II

    2011-01-01

    The Pallett Creek paleoseismic record occupies a keystone position in most attempts to develop rupture histories for the southern San Andreas fault. Previous estimates of earthquake ages at Pallett Creek were determined by decay counting radiocarbon methods. That method requires large samples which can lead to unaccounted sources of uncertainty in radiocarbon ages because of the heterogeneous composition of organic layers. In contrast, accelerator mass spectrometry (AMS) radiocarbon dates may be obtained from small samples that have known carbon sources and also allow for a more complete sampling of the section. We present 65 new AMS radiocarbon dates that span nine ground-rupturing earthquakes at Pallett Creek. Overall, the AMS dates are similar to and reveal no dramatic bias in the conventional dates. For many layers, however, individual charcoal samples were younger than the conventional dates, leading to earthquake ages that are overall slightly younger than previously reported. New earthquake ages are determined by Bayesian refinement of the layer ages based on stratigraphic ordering and sedimentological constraints. The new chronology is more regular than previously published records in large part due to new samples constraining the age of event R. The closed interval from event C to 1857 has a mean recurrence of 135 years (σ = 83.2 years) and a quasiperiodic coefficient of variation (COV) of 0.61. We show that the new dates and resultant earthquake chronology have a stronger effect on COV than the specific membership of this long series and dating precision improvements from sedimentation rates. Copyright 2011 by the American Geophysical Union.

  18. The search for repeating earthquakes in the northern San Francisco Bay area

    NASA Astrophysics Data System (ADS)

    Shakibay Senobari, N.; Funning, G.

    2013-12-01

    Repeating earthquakes (REs) are sequences of nearly identical events that recur either irregularly (aperiodic) or nearly regularly (quasi-periodic). There are two important characteristics of the events in a RE sequence: they have the same source characteristics (i.e. magnitude and mechanism) and they have the same location, and therefore their waveforms at the same stations are extremely similar. Several authors have proposed that quasi-periodic REs result from recurrent rupture of a small locked patch on a fault surface surrounded by a larger area of creep. The implication is that any detection of characteristic REs along a fault can be interpreted as a signature of creep at depth on that fault. In addition, REs can be used for locating faults and determining their geometries at depth. In this study, we look for REs on the northern Rodgers Creek and southern Maacama faults near Santa Rosa, CA. There is some observational evidence for creep along portions of these faults (e.g. from InSAR, alignment arrays and offset cultural features) but the depth of creep on both faults is still unknown. Finding the locations of REs in this area, and combining them with geodetic data, will help us to place stronger constraints on the distributions of aseismic slip on both the Rodgers Creek and Maacama faults. In order to identify such events, we use data from the Northern California Seismic Network (NCSN) for the period from 1984 to July 2013. We use earthquake waveforms from 2080 events located inside a 25 by 30 km area around Santa Rosa. We choose 7 stations located both inside and outside the selection area at distances of up to 50 km. We calculate the coherence for all pairs of events for each station during the time that the sensor has not been changed (mostly from 1987 to 2013). We align the waveforms using P arrival times from the NCSN catalog.
The time windows for the seismogram analysis are set at 0.2 sec before and 10.2 sec after the P-wave arrivals. This time window always contains the S phase, which guarantees that the waves have the same S-P time (i.e. the same location) if they have high cross-correlation coefficients. We calculate the cross-correlation function by allowing a maximum time shift of 0.5 sec (50 samples) in the time domain. An earthquake pair is chosen to be a candidate for REs when the maximum cross-correlation coefficients at 1-15 Hz are larger than 0.95 at four or more stations. We then link a pair of REs with another pair if the two share the same earthquake. We find 49 such clusters of highly correlated events that are candidates for RE sequences. Using the northern California double-difference earthquake catalog (Waldhauser and Schaff, 2008), we find that they are mostly located close to the Rodgers Creek fault trace. In order to confirm whether these clusters are indeed REs, additional tests (e.g. phase and amplitude coherence, recurrence times, visual inspection) must be performed.
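    A minimal sketch of this screening procedure, assuming synthetic waveforms in place of NCSN data (the station geometry, 1-15 Hz filtering, and multi-station vote are omitted): peak normalized cross-correlation within a ±50-sample shift flags candidate pairs, and pairs sharing an event are linked into clusters.

```python
import numpy as np
from itertools import combinations

def max_xcorr(w1, w2, max_shift=50):
    """Peak normalized cross-correlation allowing +/- max_shift samples
    (0.5 s at an assumed 100 Hz sampling rate)."""
    w1 = (w1 - w1.mean()) / (w1.std() * len(w1))
    w2 = (w2 - w2.mean()) / w2.std()
    cc = np.correlate(w1, w2, mode="full")
    mid = len(cc) // 2                      # zero-lag index
    return cc[mid - max_shift: mid + max_shift + 1].max()

def link_clusters(pairs, n_events):
    """Union-find: link event pairs that share a member into candidate clusters."""
    parent = list(range(n_events))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i, j in pairs:
        parent[find(i)] = find(j)
    clusters = {}
    for i in range(n_events):
        clusters.setdefault(find(i), []).append(i)
    return [c for c in clusters.values() if len(c) > 1]

# Toy example: events 0 and 1 share a waveform shape; event 2 is noise.
t = np.linspace(0, 10.4, 1040)
wave = np.sin(40 * t) * np.exp(-t)
waveforms = [wave, np.roll(wave, 10), np.random.default_rng(1).normal(size=t.size)]
pairs = [(i, j) for i, j in combinations(range(3), 2)
         if max_xcorr(waveforms[i], waveforms[j]) > 0.95]
print(link_clusters(pairs, 3))  # -> [[0, 1]]
```

    In practice a pair would only be accepted when the 0.95 threshold is met at four or more stations, as described above.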

  19. The Distribution of Earthquakes: An Earthquake Deficit?

    NSDL National Science Digital Library

    John Marquis

    In this activity, students use online resources to investigate the occurrence of earthquakes in Southern California to decide if there has been a 'deficit', that is, not enough earthquakes in the area in historical time to release the amount of strain energy that plate tectonics is constantly supplying to the crust. In the first two parts, they must determine the appropriate year to begin their study of historic earthquake records (from 1860-1900), and then they must decide if the energy released by past earthquakes has been equivalent to the amount of energy accumulating through the action of plate tectonics over the same number of years. In part three, they perform an analysis of their findings by answering a set of questions. References are included.

  19. A seismic landslide susceptibility rating of geologic units based on analysis of characteristics of landslides triggered by the 17 January, 1994 Northridge, California earthquake

    USGS Publications Warehouse

    Parise, M.; Jibson, R.W.

    2000-01-01

    One of the most significant effects of the 17 January, 1994 Northridge, California earthquake (M=6.7) was the triggering of thousands of landslides over a broad area. Some of these landslides damaged and destroyed homes and other structures, blocked roads, disrupted pipelines, and caused other serious damage. Analysis of the distribution and characteristics of these landslides is important in understanding what areas may be susceptible to landsliding in future earthquakes. We analyzed the frequency, distribution, and geometries of triggered landslides in the Santa Susana 7.5-minute quadrangle, an area of intense seismic landslide activity near the earthquake epicenter. Landslides occurred primarily in young (Late Miocene through Pleistocene) uncemented or very weakly cemented sediment that has been repeatedly folded, faulted, and uplifted in the past 1.5 million years. The most common types of landslide triggered by the earthquake were highly disrupted, shallow falls and slides of rock and debris. Far less numerous were deeper, more coherent slumps and block slides, primarily occurring in more cohesive or competent materials. The landslides in the Santa Susana quadrangle were divided into two samples: single landslides (1502) and landslide complexes (60), which involved multiple coalescing failures of surficial material. We described landslide morphologies by computing simple morphometric parameters (area, length, width, aspect ratio, slope angle). To quantify and rank the relative susceptibility of each geologic unit to seismic landsliding, we calculated two indices: (1) the susceptibility index, which is the ratio (given as a percentage) of the area covered by landslide sources within a geologic unit to the total outcrop area of that unit; and (2) the frequency index [given in landslides per square kilometer (ls/km2)], which is the total number of landslides within each geologic unit divided by the outcrop area of that unit.
Susceptibility categories include very high (>2.5% landslide area or >30 ls/km2), high (1.0-2.5% landslide area or 10-30 ls/km2), moderate (0.5-1.0% landslide area or 3-10 ls/km2), and low (<0.5% landslide area and <3 ls/km2). © 2000 Elsevier Science B.V. All rights reserved.
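    The two indices and the rating scheme above reduce to a few lines of arithmetic. This sketch uses hypothetical outcrop numbers, not values from the Santa Susana quadrangle:

```python
def susceptibility_index(landslide_area_km2, unit_area_km2):
    """Ratio of landslide source area to unit outcrop area, as a percentage."""
    return 100.0 * landslide_area_km2 / unit_area_km2

def frequency_index(n_landslides, unit_area_km2):
    """Landslides per square kilometer of unit outcrop (ls/km2)."""
    return n_landslides / unit_area_km2

def rating(si, fi):
    """Rank per the categories above; the top three need either criterion,
    'low' requires both."""
    if si > 2.5 or fi > 30:
        return "very high"
    if si >= 1.0 or fi >= 10:
        return "high"
    if si >= 0.5 or fi >= 3:
        return "moderate"
    return "low"

# Hypothetical unit: 12 km2 outcrop, 0.2 km2 of landslide sources, 80 slides
si = susceptibility_index(0.2, 12.0)   # ~1.67 %
fi = frequency_index(80, 12.0)         # ~6.7 ls/km2
print(rating(si, fi))                  # -> high
```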

  1. Measuring Earthquakes: Intensity Maps

    NSDL National Science Digital Library

    This set of exercises will introduce students to the construction of earthquake intensity maps, familiarize them with the Modified Mercalli Intensity Scale, and give them the opportunity to build their own maps online in order to locate the epicenter of an earthquake. In the first exercise, they will use intensity data from the 1986 North Palm Springs, California earthquake to create an isoseismal map. In the second, they will use a special interactive page of dynamic HTML to plot intensities that they assign based on reports, and attempt to determine the epicenter based on the area of highest intensity.

  2. On the resolution of shallow mantle viscosity structure using post-earthquake relaxation data: Application to the 1999 Hector Mine, California, earthquake

    USGS Publications Warehouse

    Pollitz, Fred F.; Thatcher, Wayne R.

    2010-01-01

    Most models of lower crust/mantle viscosity inferred from postearthquake relaxation assume one or two uniform-viscosity layers. A few existing models possess apparently significant radially variable viscosity structure in the shallow mantle (e.g., the upper 200 km), but the resolution of such variations is not clear. We use a geophysical inverse procedure to address the resolving power of inferred shallow mantle viscosity structure using postearthquake relaxation data. We apply this methodology to 9 years of GPS-constrained crustal motions after the 16 October 1999 M = 7.1 Hector Mine earthquake. After application of a differencing method to isolate the postearthquake signal from the “background” crustal velocity field, we find that surface velocities diminish from ~20 mm/yr in the first few months to ~2 mm/yr after 2 years. Viscoelastic relaxation of the mantle, with a time-dependent effective viscosity prescribed by a Burgers body, provides a good explanation for the postseismic crustal deformation, capturing both the spatial and temporal pattern. In the context of the Burgers body model (which involves a transient viscosity and steady state viscosity), a resolution analysis based on the singular value decomposition reveals that at most, two constraints on depth-dependent steady state mantle viscosity are provided by the present data set. Uppermost mantle viscosity (depth ≤ 60 km) is moderately resolved, but deeper viscosity structure is poorly resolved. The simplest model that explains the data better than that of uniform steady state mantle viscosity involves a linear gradient in logarithmic viscosity with depth, with a small increase from the Moho to 220 km depth. However, the viscosity increase is not statistically significant. This suggests that the depth-dependent steady state viscosity is not resolvably different from uniformity in the uppermost mantle.
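    The singular-value counting that underlies such a resolution analysis can be sketched as follows. The sensitivity matrix here is a synthetic stand-in with a prescribed singular spectrum (one strong direction, one marginal, two weak), not the study's Jacobian; the 10% noise floor is likewise an assumption:

```python
import numpy as np

# Build a toy sensitivity (Jacobian) matrix with a known singular spectrum:
# rows = surface-velocity observations, columns = layer log-viscosities.
rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.normal(size=(40, 4)))   # orthonormal data-space basis
V, _ = np.linalg.qr(rng.normal(size=(4, 4)))    # orthonormal model-space basis
s_true = np.array([1.0, 0.15, 0.01, 0.002])     # prescribed sensitivities
G = U @ np.diag(s_true) @ V.T

# Count singular values that stand above the assumed relative noise floor:
# each such value is one independently resolvable combination of parameters.
s = np.linalg.svd(G, compute_uv=False)
noise_floor = 0.1 * s[0]
n_resolved = int((s > noise_floor).sum())
print(f"resolvable viscosity parameters: {n_resolved}")  # -> 2
```

    With this spectrum only two combinations of layer viscosities are constrained, mirroring the "at most two constraints" conclusion quoted above.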

  3. Slip rate, earthquake recurrence, and seismogenic potential of the Rodgers Creek fault zone, northern California: Initial results

    SciTech Connect

    Budding, K.E. (Geological Survey, Denver, CO (United States)); Schwartz, D.P.; Oppenheimer, D.H. (Geological Survey, Menlo Park, CA (United States))

    1991-03-01

    Instrumental seismicity defines a seismic gap along the Rodgers Creek fault zone (RCFZ) between Santa Rosa and San Pablo Bay. Results of a paleoseismicity study within the gap, using offset channels in late Holocene alluvial deposits as piercing points, indicate a minimum slip rate of 2.1 to 5.8 mm/yr for the past 1,300 years, a preferred range for the maximum recurrence interval of 248 to 679 years, and a surface offset of 2 +0.3/−0.2 m during the most recent event. The RCFZ has produced past M7 earthquakes, and historical seismicity data indicate a minimum elapsed time of 182 years since the most recent earthquake of this size.

  4. Chapter C. The Loma Prieta, California, Earthquake of October 17, 1989 - Fire, Police, Transportation and Hazardous Materials

    USGS Publications Warehouse

    Van Anne, Craig, (Edited By); Scawthorn, Charles R.

    1994-01-01

    The papers in this chapter discuss some of the failures and successes that resulted from the societal response by a multitude of agencies to the Loma Prieta earthquake. Some of the lessons learned were old ones relearned. Other lessons were obvious ones which had gone unnoticed. Still, knowledge gained from past earthquakes spawned planning and mitigation efforts which proved to be successful in limiting the aftermath effects of the Loma Prieta event. At least four major areas of response are presented in this chapter: the Oakland Police Department response to the challenge of controlled access to the Cypress freeway collapse area without inhibiting relief and recovery efforts; search and rescue of the freeway collapse and the monumental crisis management problem that accompanied it; the short- and long-term impact on transbay transportation systems to move a large work force from home to business; and the handling of hazardous material releases throughout the Bay Area.

  5. Chapter D. The Loma Prieta, California, Earthquake of October 17, 1989 - Earth Structures and Engineering Characterization of Ground Motion

    USGS Publications Warehouse

    Holzer, Thomas L.

    1998-01-01

    This chapter contains two papers that summarize the performance of engineered earth structures, dams and stabilized excavations in soil, and two papers that characterize for engineering purposes the attenuation of ground motion with distance during the Loma Prieta earthquake. Documenting the field performance of engineered structures and confirming empirically based predictions of ground motion are critical for safe and cost effective seismic design of future structures as well as the retrofitting of existing ones.

  6. Earthquake clustering inferred from Pliocene Gilbert-type fan deltas in the Loreto basin, Baja California Sur, Mexico

    NASA Astrophysics Data System (ADS)

    Dorsey, Rebecca J.; Umhoefer, Paul J.; Falk, Peter D.

    1997-08-01

    A stacked sequence of Pliocene Gilbert-type fan deltas in the Loreto basin was shed from the footwall of the dextral-normal Loreto fault and deposited at the margin of a marine basin during rapid fault-controlled subsidence. Fan-delta parasequences coarsen upward from marine siltstone and sandstone at the base, through sandy bottomsets and gravelly foresets, to gravelly nonmarine topsets. Each topset unit is capped by a thin shell bed that records marine flooding of the delta plain. Several mechanisms may have produced repetitive vertical stacking of Gilbert deltas: (1) autocyclic delta-lobe switching; (2) eustatic sea-level fluctuations; (3) climatically controlled fluctuations in sediment input; and (4) episodic subsidence produced by temporal clustering of earthquakes. We favor hypothesis 4 for several reasons, but hypotheses 2 and 3 cannot be rejected at this time. Earthquake clustering can readily produce episodic subsidence at spatial and temporal scales consistent with stratigraphic trends observed in the Loreto basin. This model is supported by comparison with paleoseismological studies that document clustering on active faults over a wide range of time scales. Earthquake clustering is a new concept in basin analysis that may be helpful for understanding repetitive stratigraphy in tectonically active basins.

  7. The 1999 (Mw 7.1) Hector Mine, California, Earthquake: Near-Field Postseismic Deformation from ERS Interferometry

    NASA Technical Reports Server (NTRS)

    Jacobs, Allison; Sandwell, David; Fialko, Yuri; Sichoix, Lydie

    2002-01-01

    Interferometric synthetic aperture radar (InSAR) data over the area of the Hector Mine earthquake (Mw 7.1, 16 October 1999) reveal postseismic deformation of several centimeters over a spatial scale of 0.5 to 50 km. We analyzed seven SAR acquisitions to form interferograms over four time periods after the event. The main deformations seen in the line-of-sight (LOS) displacement maps are a region of subsidence (60 mm LOS increase) on the northern end of the fault, a region of uplift (45 mm LOS decrease) located to the northeast of the primary fault bend, and a linear trough running along the main rupture having a depth of up to 15 mm and a width of about 2 km. We correlate these features with a double left-bending, right-lateral, strike-slip fault that exhibits contraction on the restraining side and extension along the releasing side of the fault bends. The temporal variations in the near-fault postseismic deformation are consistent with a characteristic time scale of 135 +42/−25 days, which is similar to the relaxation times following the 1992 Landers earthquake. High gradients in the LOS displacements occur on the fault trace, consistent with afterslip on the earthquake rupture. We derive an afterslip model by inverting the LOS data from both the ascending and descending orbits. Our model indicates that much of the afterslip occurs at depths of less than 3 to 4 km.
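    A characteristic time like the 135-day value above is typically obtained by fitting an exponential relaxation model to the displacement time series. The sketch below fits d(t) = A(1 − exp(−t/τ)) to a synthetic LOS series by grid search; the epochs, amplitude, and noise-free data are illustrative, not the ERS observations:

```python
import numpy as np

def fit_relaxation(t, d):
    """Grid-search fit of d(t) = A * (1 - exp(-t / tau)) to postseismic
    displacements; returns (A, tau). A simple stand-in for a formal inversion."""
    best = (None, None, np.inf)
    for tau in np.linspace(30, 400, 371):          # 1-day grid over tau
        basis = 1.0 - np.exp(-t / tau)
        A = (basis @ d) / (basis @ basis)          # least-squares amplitude
        misfit = np.sum((d - A * basis) ** 2)
        if misfit < best[2]:
            best = (A, tau, misfit)
    return best[0], best[1]

# Synthetic LOS time series with tau = 135 days (the value reported above)
t = np.arange(0.0, 800.0, 35.0)                    # repeat-pass epochs, days
d = 50.0 * (1.0 - np.exp(-t / 135.0))              # mm
A, tau = fit_relaxation(t, d)
print(f"A = {A:.1f} mm, tau = {tau:.0f} days")     # -> A = 50.0 mm, tau = 135 days
```

    With real interferograms the misfit would be weighted by data covariance and the asymmetric uncertainties (+42/−25 days) estimated from the misfit surface around the minimum.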

  8. Displaced rocks, strong motion, and the mechanics of shallow faulting associated with the 1999 Hector Mine, California, earthquake

    USGS Publications Warehouse

    Michael, A.J.; Ross, S.L.; Stenner, H.D.

    2002-01-01

    The paucity of strong-motion stations near the 1999 Hector Mine earthquake makes instrumental studies of key questions about near-fault strong-motion patterns associated with this event impossible. However, observations of displaced rocks allow a qualitative investigation of these problems. By observing the slope of the desert surface and the frictional coefficient between these rocks and the desert surface, we estimate the minimum horizontal acceleration needed to displace the rocks. Combining this information with observations of how many rocks were displaced in different areas near the fault, we infer the level of shaking. Given current empirical shaking attenuation relationships, the number of rocks that moved is slightly lower than expected, implying that slightly lower than expected shaking occurred during the Hector Mine earthquake. Perhaps more importantly, stretches of the fault with 4 m of total displacement at the surface displaced few nearby rocks on 15° slopes, suggesting that the horizontal accelerations were below 0.2g within meters of the fault scarp. This low level of shaking suggests that the shallow parts of this rupture did not produce strong accelerations. Finally, we did not observe an increased incidence of displaced rocks along the fault zone itself. This suggests that, despite observations of fault-zone-trapped waves generated by aftershocks of the Hector Mine earthquake, such waves were not an important factor in controlling peak ground acceleration during the mainshock.
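    The threshold-acceleration estimate can be illustrated with the standard pseudostatic (Newmark-type) sliding criterion, used here as a hedged stand-in for the authors' rock-specific analysis; the friction coefficient of 0.5 is an assumed value:

```python
import math

def min_horizontal_acceleration(mu, slope_deg):
    """Pseudostatic threshold (in g) for a block on a slope to slide downhill
    under horizontal shaking: a/g = tan(phi - theta), where phi = atan(mu) is
    the friction angle and theta the slope angle (the Newmark yield
    acceleration for a rigid sliding block)."""
    phi = math.atan(mu)
    theta = math.radians(slope_deg)
    return math.tan(phi - theta)

# Assumed rock-on-desert-surface friction coefficient of 0.5 on a 15 deg slope
ky = min_horizontal_acceleration(0.5, 15.0)
print(f"threshold ~ {ky:.2f} g")   # ~0.2 g, consistent with the bound above
```

    A block that stayed put therefore implies peak horizontal accelerations below this threshold at its site, which is how sparse rock observations substitute for missing instruments.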

  9. Putting Down Roots in Earthquake Country

    NSDL National Science Digital Library

    Putting Down Roots in Earthquake Country is an informational Web site provided by the Southern California Earthquake Center. Citizens can learn about the San Andreas fault, other California faults, how to build and maintain an earthquake safe house, how to survive an earthquake, how they are measured and what the magnitude means, common earthquake myths, and much more. As a safety and an educational site, this unique resource does a good job of presenting a lot of information, illustrations, and graphics in an easy-to-follow format that helps explain this powerful and potentially deadly natural occurrence.

  10. Breaks in Pavement and Pipes as Indicators of Range-Front Faulting Resulting from the 1989 Loma Prieta Earthquake near the Southwest Margin of the Santa Clara Valley, California

    USGS Publications Warehouse

    Schmidt, Kevin M.; Ellen, Stephen D.; Haugerud, Ralph A.; Peterson, David M.; Phelps, Geoffery A.

    1995-01-01

    Damage to pavement and near-surface utility pipes, caused by the October 17, 1989, Loma Prieta earthquake, provides an indicator of ground deformation in a 663 km2 area near the southwest margin of the Santa Clara Valley, California. The spatial distribution of 1284 sites of such damage documents the extent and distribution of detectable ground deformation. Damage was concentrated in four zones, three of which are near previously mapped faults. The zone through Los Gatos showed the highest concentration of damage, as well as evidence for pre- and post-earthquake deformation. Damage along the foot of the Santa Cruz Mountains reflected shortening that is consistent with movement along reverse faults in the region and with the hypothesis that tectonic strain is distributed widely across numerous faults in the California Coast Ranges.

  11. The possible statistical relation of Pc1 pulsations to Earthquake occurrence at low latitudes

    NASA Astrophysics Data System (ADS)

    Bortnik, J.; Cutler, J. W.; Dunson, C.; Bleier, T. E.

    2008-09-01

    We examine the association between earthquakes and Pc1 pulsations observed at a low-latitude station in Parkfield, California. The period under examination is ~7.5 years in total, from February 1999 to July 2006, and we use an automatic identification algorithm to extract information on Pc1 pulsations from the magnetometer data. These pulsations are then statistically correlated to earthquakes from the USGS NEIC catalog within a radius of 200 km around the magnetometer, and M>3.0. Results indicate that there is an enhanced occurrence probability of Pc1 pulsations ~5–15 days in advance of the earthquakes, during the daytime. We quantify the statistical significance and show that such an enhancement is unlikely to have occurred due to chance alone. We then examine the effect of declustering our earthquake catalog, and show that even though significance decreases, there is still a statistically significant daytime enhancement prior to the earthquakes. Finally, we select only daytime Pc1 pulsations as the fiducial time of our analysis, and show that earthquakes are ~3–5 times more likely to occur in the week following these pulsations than normal. Comparing these results to other events, it is preliminarily shown that the normal earthquake probability is unaffected by geomagnetic activity, or a random event sequence.
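    A simple surrogate test in the spirit of the significance analysis above: count pulsation days that precede an earthquake by 5–15 days, then compare that count against catalogs with randomly re-dated earthquakes. All dates below are synthetic, not from the Parkfield data:

```python
import numpy as np

rng = np.random.default_rng(3)

def enhancement_significance(pulse_days, quake_days, window=(5, 15), n_shuffle=5000):
    """Count pulsations followed by an earthquake 5-15 days later, then compare
    the count against randomly re-dated earthquake catalogs (surrogate test)."""
    def count_hits(quakes):
        return sum(any(window[0] <= q - p <= window[1] for q in quakes)
                   for p in pulse_days)
    observed = count_hits(quake_days)
    span = max(max(pulse_days), max(quake_days))
    null = [count_hits(rng.uniform(0, span, len(quake_days)))
            for _ in range(n_shuffle)]
    p = float(np.mean([n >= observed for n in null]))
    return observed, p

# Synthetic catalogs: three pulsations lead a quake by 10 days; one does not.
pulses = [90, 200, 290, 490]
quakes = [100, 300, 500]
observed, p = enhancement_significance(pulses, quakes)
print(f"pulsations followed by a quake in 5-15 days: {observed}, p = {p:.4f}")
```

    A declustering step, as in the paper, would remove aftershock chains from the quake catalog before running the same test.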

  12. Spatio-temporal Signatures of Post-seismic Relaxation due to the Mojave Desert (S. California) Earthquakes from InSAR and GPS Data, With Implications for the Driving Mechanisms

    Microsoft Academic Search

    Y. Fialko

    2004-01-01

    The 1992 Mw7.3 Landers and 1999 Mw7.3 Hector Mine earthquakes in the Mojave desert (southern California) produced some of the best ever documented post-seismic deformation transients. I use a well-populated catalog of the Synthetic Aperture Radar (SAR) data from the ERS 1 and 2 satellites that includes more than 200 interferable acquisitions, and time series from several tens of Global

  13. Earthquake forecasting using the pattern informatics (PI) index

    Microsoft Academic Search

    K. F. Tiampo; J. B. Rundle; J. Holliday; K. Z. Nanjo; C. Chen; D. L. Turcotte; A. Jimenez; S. Levin

    2005-01-01

    Recent large earthquakes include the M ~ 7.4 event that struck Izmit, Turkey in August of 1999, the M ~ 7.6 Taiwan earthquake which occurred in September of 1999, the M ~ 7.1 Hector Mine, California earthquake of October 1999, and the M ~ 9 Indonesian earthquake of December 2004. Many similar examples have been documented over the course of

  14. Probable epizootic chlamydiosis in wild California (Larus californicus) and ring-billed (Larus delawarensis) gulls in North Dakota.

    PubMed

    Franson, J C; Pearson, J E

    1995-07-01

    During the summer of 1986, more than 400 California gulls (Larus californicus) and ring-billed gulls (Larus delawarensis), primarily fledglings, died on an island in Lake Sakakawea near New Town, North Dakota (USA). Mortality was attributed largely to chlamydiosis. Necropsy findings in nine carcasses included splenomegaly (n = 9), hepatomegaly (n = 4), and pericarditis (n = 1). Livers from three California gulls and two ring-billed gulls, and spleens from the same five birds plus a third ring-billed gull were positive for Chlamydia psittaci by the direct immunofluorescence test. Chlamydia psittaci was isolated from separate pools of liver and spleen from one California gull and one ring-billed gull. This is believed to be the first record of epizootic chlamydiosis in gulls and the second report of epizootic chlamydial mortality in wild birds in North America. PMID:8592370

  15. Probable epizootic chlamydiosis in wild California (Larus californicus) and ring-billed (Larus delawarensis) gulls in North Dakota

    USGS Publications Warehouse

    Franson, J.C.; Pearson, J.E.

    1995-01-01

    During the summer of 1986, more than 400 California gulls (Larus californicus) and ring-billed gulls (Larus delawarensis), primarily fledglings, died on an island in Lake Sakakawea near New Town, North Dakota (USA). Mortality was attributed largely to chlamydiosis. Necropsy findings in nine carcasses included splenomegaly (n = 9), hepatomegaly (n = 4), and pericarditis (n = 1). Livers from three California gulls and two ring-billed gulls, and spleens from the same five birds plus a third ring-billed gull were positive for Chlamydia psittaci by the direct immunofluorescence test. Chlamydia psittaci was isolated from separate pools of liver and spleen from one California gull and one ring-billed gull. This is believed to be the first record of epizootic chlamydiosis in gulls and the second report of epizootic chlamydial mortality in wild birds in North America.

  16. The Prediction of Spatial Aftershock Probabilities (PRESAP)

    Microsoft Academic Search

    J. McCloskey

    2003-01-01

    It is now widely accepted that the goal of deterministic earthquake prediction is unattainable in the short term and may even be forbidden by nonlinearity in the generating dynamics. This nonlinearity does not, however, preclude the estimation of earthquake probability and, in particular, how this probability might change in space and time; earthquake hazard estimation might be possible in the

  17. Permanent GPS Geodetic Array in Southern California

    NASA Technical Reports Server (NTRS)

    Green, Cecil H.; Green, Ida M.

    1998-01-01

    The southern California Permanent GPS Geodetic Array (PGGA) was established in the spring of 1990 to evaluate continuous Global Positioning System (GPS) measurements as a new tool for monitoring crustal deformation. Southern California is an ideal location because of the relatively high rate of tectonic deformation, the high probability of intense seismicity, the long history of conventional and space geodetic measurements, and the availability of a well-developed infrastructure to support continuous operations. Within several months of the start of regular operations, the PGGA recorded far-field coseismic displacements induced by the June 28, 1992 (Mw = 7.3), Landers earthquake, the largest magnitude earthquake in California in the past 40 years and the first one to be recorded by a continuous GPS array. Only nineteen months later, on 17 January 1994, the PGGA recorded coseismic displacements for the strongest earthquake to strike the Los Angeles basin in two decades, the (Mw = 6.7) Northridge earthquake. At the time of the Landers earthquake, only seven continuous GPS sites were operating in southern California; by the beginning of 1994, three more sites had been added to the array. However, only a pair of sites were situated in the Los Angeles basin. The destruction caused by the Northridge earthquake spurred a fourfold increase in the number of continuous GPS sites in southern California within 2 years of this event. The PGGA is now the regional component of the Southern California Integrated GPS Network (SCIGN), a major ongoing densification of continuous GPS sites, with a concentration in the Los Angeles metropolitan region. Continuous GPS provides temporally dense measurements of surface displacements induced by crustal deformation processes including interseismic, coseismic, postseismic, and aseismic deformation and the potential for detecting anomalous events such as preseismic deformation and interseismic strain variations.
Although strain meters yield much higher short-term resolution for periods up to about 1 year, a single continuous GPS site is significantly less expensive than a single strain meter and probably has better long-term stability beyond a 1-year period. Compared to less frequent field measurements, continuous GPS provides the means to better characterize the errors in GPS position measurements and thereby obtain more realistic estimates of derived parameters such as site velocities.

  18. Earthquakes and Geology

    NSDL National Science Digital Library

    David Ozsvath

    In this activity, students investigate the relationship between intensity of ground motion and type of rock or alluvium, as seen in the 1994 Northridge, California earthquake. They will examine a map of Mercalli intensity, a cross-section showing geologic structures and rock types, and a map of surficial geology, and answer questions pertaining to amplification of ground motion and S-wave velocities.

  19. Small-scale deformations associated with the 1992 Landers, California, earthquake mapped by synthetic aperture radar interferometry phase gradients

    NASA Astrophysics Data System (ADS)

    Price, Evelyn J.; Sandwell, David T.

    1998-11-01

    The Landers earthquake (Mw 7.3) occurred on June 28, 1992, and ruptured nearly 100 km of previously mapped and unmapped faults in the Mojave Desert. We use synthetic aperture radar interferometry (InSAR) to examine the cumulative surface deformation between April 24 and August 7, 1992, in a 100 × 100 km region surrounding the northern portion of the earthquake rupture. Also, we introduce a technique for manipulating SAR interferograms to extract short-wavelength displacement information. This technique involves computation and subsequent combination of interferometric phase gradient maps. The InSAR results show significant deformation signatures associated with faults, fractures, dry lake beds, and mountainous regions within 75-100 km of the main rupture. Using the phase gradient method, we are able to extract small-scale deformation patterns near the main rupture. Many of the preexisting, mapped faults within 50 km of the main rupture experienced triggered slip; these include the Old Woman, Lenwood, Johnson Valley, West Calico, and Calico Faults. The InSAR results also indicate right-lateral offsets along secondary fractures trending N-NE within the left-lateral zone of shear between the main rupture and the Johnson Valley Fault. Additionally, there are interesting interferogram fringe signatures surrounding Troy Dry Lake and Coyote Dry Lake that are related to deformation of dry lake beds.

  20. A preliminary study of the Santa Barbara, California, earthquake of August 13, 1978, and its major aftershocks

    USGS Publications Warehouse

    Lee, William Hung Kan; Johnson, C.E.; Henyey, T.L.; Yerkes, R.L.

    1978-01-01

    The ML 5.1 Santa Barbara earthquake of August 13, 1978 occurred at lat 34°22.2′ N., long 119°43.0′ W., 4 km south of Santa Barbara, Calif., at a depth of 12.5 km in the northeast Santa Barbara Channel, part of the western Transverse Ranges geomorphic-structural province. This part of the province is characterized by seismically active, east-trending reverse faults and rates of coastal uplift that have averaged up to about 10 m/1000 years over the last 45,000 years. No surface rupture was detected onshore. Subsurface rupture propagated northwest from the main shock toward Goleta, 15 km west of Santa Barbara, where a maximum acceleration of 0.44 g was measured at ground level and extensive minor damage occurred; only minor injuries were reported. A fairly well-constrained fault-plane solution of the main shock and distribution of the aftershocks indicate that left-reverse-oblique slip occurred on west-northwest-trending, north-dipping reverse faults; inadequate dip control precludes good correlation with any one of several mapped faults. Had the earthquake been larger and rupture propagated to the southeast or a greater distance to the northwest, it could have posed a hazard to oilfield operations. The fault-plane solution and aftershock pattern closely fit the model of regional deformation and the solution closely resembles those of five previously mapped events located within a 15-km radius.

  1. Earthquake prediction, societal implications

    NASA Astrophysics Data System (ADS)

    Aki, Keiiti

    1995-07-01

    "If I were a brilliant scientist, I would be working on earthquake prediction." This is a statement from a Los Angeles radio talk show I heard just after the Northridge earthquake of January 17, 1994. Five weeks later, at a monthly meeting of the Southern California Earthquake Center (SCEC), where more than two hundred scientists and engineers gathered to exchange notes on the earthquake, a distinguished French geologist who works on earthquake faults in China envied me for working now in southern California. This place is like northeastern China 20 years ago, when high seismicity and research activities led to the successful prediction of the Haicheng earthquake of February 4, 1975 with magnitude 7.3. A difficult question still haunting us [Aki, 1989] is whether the Haicheng prediction was founded on the physical reality of precursory phenomena or on the wishful thinking of observers subjected to the political pressure which encouraged precursor reporting. It is, however, true that a successful life-saving prediction like the Haicheng prediction can only be carried out by the coordinated efforts of decision makers and physical scientists.

  2. Simulating Earthquakes for Science and Society: New Earthquake Visualizations Ideal for Use in Science Communication

    Microsoft Academic Search

    R. M. de Groot; M. L. Benthien

    2006-01-01

    The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. These types of visualizations are becoming pervasive in the teaching and learning of concepts related to earth science. Computers have opened up

  3. Coseismic and initial postseismic deformation from the 2004 Parkfield, California, earthquake, observed by global positioning system, electronic distance meter, creepmeters, and borehole strainmeters

    USGS Publications Warehouse

    Langbein, J.; Murray, J.R.; Snyder, H.A.

    2006-01-01

    Global Positioning System (GPS), electronic distance meter, creepmeter, and strainmeter measurements spanning the M 6.0 Parkfield, California, earthquake are examined. Using these data from 100 sec through 9 months following the mainshock, we find that Omori's law, with rate inversely related to time (1/t^p, with p ranging between 0.7 and 1.3), characterizes the time-dependent deformation during the postseismic period; these results are consistent with creep models for elastic solids. With an accurate function of postseismic response, the coseismic displacements can be estimated from the high-rate, 1-min sampling GPS; these coseismic displacements are approximately 75% of those estimated from the daily solutions. Consequently, fault-slip models using daily solutions overestimate coseismic slip. In addition, at 2 months and at 8 months following the mainshock, postseismic displacements are modeled as slip on the San Andreas fault, with a lower bound on the moment exceeding the coseismic moment.
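    The Omori-style decay described in this abstract, rate ∝ 1/t^p, can be sketched with a simple log-log fit. This is a minimal illustration on synthetic data (the true p and noise level are assumptions), not the authors' inversion of the Parkfield records:

```python
import numpy as np

# Synthetic postseismic rates following rate(t) = K / t**p,
# with p chosen inside the 0.7-1.3 range reported in the abstract.
rng = np.random.default_rng(0)
t = np.logspace(-3, 2, 200)                # days after the mainshock
p_true, K = 1.1, 5.0
rate = K / t**p_true * rng.lognormal(0.0, 0.05, t.size)  # noisy rates

# In log space the model is linear: log(rate) = log(K) - p*log(t),
# so the regression slope estimates -p.
slope, intercept = np.polyfit(np.log(t), np.log(rate), 1)
p_est = -slope
```

    A slope fit like this recovers p to within a few percent here; real data require care with the choice of start time after the mainshock, since early samples dominate the log-log fit.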

  4. A stochastic estimate of ground motion at Oceano, California, for the M 6.5 22 December 2003 San Simeon earthquake, derived from aftershock recordings

    USGS Publications Warehouse

    Di Alessandro, C.; Boatwright, J.

    2006-01-01

    The U.S. Geological Survey deployed a digital seismic station in Oceano, California, in February 2004, to investigate the cause of damage and liquefaction from the 22 December 2003 M 6.5 San Simeon earthquake. This station recorded 11 M > 2.8 aftershocks in almost 8 weeks. We analyze these recordings, together with recordings of the mainshock and the same aftershocks obtained from nearby stations in Park Hill and San Luis Obispo, to estimate the mainshock ground motion in Oceano. We estimate the Fourier amplitude spectrum using generalized spectral ratio analysis. We test a set of aftershocks as Green's functions by comparing simulated and recorded acceleration amplitude spectra for the mainshock at San Luis Obispo and Park Hill. We convolve the aftershock accelerograms with a stochastic operator to simulate the duration and phase of the mainshock accelerograms. This approximation allows us to extend the range of aftershocks that can be used as Green's functions to events nearly three magnitude units smaller than the mainshock. Our realizations for the mainshock accelerogram at Oceano yield peak ground accelerations distributed as 28% ± 4% g. We interpret these realizations as upper bounds for the actual ground motion, because our analysis assumes a linear response, whereas the presence of liquefaction indicates that the ground behaved nonlinearly in Oceano.
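    The core operation, convolving a small-event record with a stochastic operator to approximate a larger event's duration and randomized phase, can be sketched as follows. The operator shape, scaling, and durations are illustrative assumptions, not the authors' calibrated method:

```python
import numpy as np

# A toy aftershock accelerogram: white noise under an exponential envelope.
rng = np.random.default_rng(1)
dt = 0.01                                   # sample interval, s
n = 500
aftershock = rng.standard_normal(n) * np.exp(-np.arange(n) * dt / 1.0)

# Stochastic operator: random signs spread over the extra source duration,
# normalized so the convolution preserves mean-square amplitude.
extra_dur = 4.0                             # s of added mainshock duration (assumed)
n_op = int(round(extra_dur / dt))
operator = rng.choice([-1.0, 1.0], n_op) / np.sqrt(n_op)

# The simulated "mainshock" record is longer and has scrambled phase.
simulated = np.convolve(aftershock, operator)
```

    The random-sign operator is one common choice for such stochastic summation; its length sets how much the source duration is stretched relative to the small event.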

  5. Systematic procedural and sensitivity analysis of the pattern informatics method for forecasting large (M > 5) earthquake events in southern California

    E-print Network

    Holliday, J R; Klein, B; Rundle, J B; Tiampo, K F

    2005-01-01

    Recent studies in the literature have introduced a new approach to earthquake forecasting based on representing the space-time patterns of localized seismicity by a time-dependent system state vector in a real-valued Hilbert space and deducing information about future space-time fluctuations from the phase angle of the state vector. While the success rate of this Pattern Informatics (PI) method has been encouraging, the method is still in its infancy. Procedural analysis, statistical testing, parameter sensitivity investigation and optimization all still need to be performed. In this paper, we attempt to optimize the PI approach by developing quantitative values for "predictive goodness" and analyzing possible variations in the proposed procedure. In addition, we attempt to quantify the systematic dependence on the quality of the input catalog of historic data and develop methods for combining catalogs from regions of different seismic rates.

  6. Use of microearthquakes in the study of the mechanics of earthquake generation along the San Andreas fault in central California

    USGS Publications Warehouse

    Eaton, J.P.; Lee, W.H.K.; Pakiser, L.C.

    1970-01-01

    A small, dense network of independently recording portable seismograph stations was used to delineate the slip surface associated with the 1966 Parkfield-Cholame earthquake by precise three dimensional mapping of the hypocenters of its aftershocks. The aftershocks were concentrated in a very narrow vertical zone beneath or immediately adjacent to the zone of surface fracturing that accompanied the main shock. Focal depths ranged from less than 1 km to a maximum of 15 km. The same type of portable network was used to study microearthquakes associated with an actively creeping section of the San Andreas fault south of Hollister during the summer of 1967. Microearthquake activity during the 6-week operation of this network was dominated by aftershocks of a magnitude-4 earthquake that occurred within the network near Bear Valley on July 23. Most of the aftershocks were concentrated in an equidimensional region about 2 1/2 km across that contained the hypocenter of the main shock. The zone of the concentrated aftershocks was centered near the middle of the rift zone at a depth of about 3 1/2 km. Hypocenters of other aftershocks outlined a 25 km long zone of activity beneath the actively creeping strand of the fault and extending from the surface to a depth of about 13 km. A continuing study of microearthquakes along the San Andreas, Hayward, and Calaveras faults between Hollister and San Francisco has been under way for about 2 years. The permanent telemetered network constructed for this purpose has grown from about 30 stations in early 1968 to about 45 stations in late 1969. Microearthquakes between Hollister and San Francisco are heavily concentrated in narrow, nearly vertical zones along sections of the Sargent, San Andreas, and Calaveras faults. Focal depths range from less than 1 km to about 14 km.

  7. Stress loading from viscous flow in the lower crust and triggering of aftershocks following the 1994 Northridge, California, earthquake

    USGS Publications Warehouse

    Deng, J.; Hudnut, K.; Gurnis, M.; Hauksson, E.

    1999-01-01

    Following the Mw 6.7 Northridge earthquake, significant postseismic displacements were resolved with GPS. Using a three-dimensional viscoelastic model, we suggest that this deformation is mainly driven by viscous flow in the lower crust. Such flow can transfer stress to the upper crust and load the rupture zone of the main shock at a decaying rate. Most aftershocks within the rupture zone, especially those that occurred after the first several weeks of the main shock, may have been triggered by continuous stress loading from viscous flow. The long-term decay time of aftershocks (about 2 years) approximately matches the decay of viscoelastic loading, and thus is controlled by the viscosity of the lower crust. Our model provides a physical interpretation of the observed correlation between aftershock decay rate and surface heat flow.

  8. Geologic investigations of a 'slip gap' in the surficial ruptures of the 1992 Landers earthquake, southern California

    NASA Technical Reports Server (NTRS)

    Spotila, James A.; Sieh, Kerry

    1995-01-01

    A 3-km-long gap in the dextral surficial rupture of the 1992 Mw = 7.3 Landers earthquake occurs at the north end of a major fault stepover between the Johnson Valley and Homestead Valley faults. This gap is situated along a segment of the Landers rupture that has been modeled geophysically as having a deficit in average slip at depth. To better evaluate the nature of the slip gap, we document in detail the character and distribution of surficial rupture within it. Along the gap is a northwest-trending thrust fault rupture with an average of less than 1 m of northeast directed reverse slip and nearly no oblique right slip. We interpret this rupture to be limited to the shallow crust of the northern end of the stepover and to have been the secondary result of dextral shear, rather than a mechanism of rigid-block slip transfer from the Landers-Kickapoo fault. A zone of en echelon extensional ruptures also occurs along the slip gap, which we interpret as the secondary result of diffuse dextral shear that accommodated less than 0.5 m of west-northwest extension. These secondary ruptures represent a discontinuity in the surficial dextral rupture of the Landers earthquake, which we propose resulted from the lack of a mature fault connection between the Johnson Valley and Homestead Valley faults. The rupture pattern of the slip gap implies a significant deficit in net surficial slip, which compares favorably with some geophysical models. Aspects of this rupture pattern also suggest a temporal sequence of rupture that compares favorably with geophysical interpretations of the dynamic rupture propagation.

  9. Modeling earthquake dynamics

    NASA Astrophysics Data System (ADS)

    Charpentier, Arthur; Durand, Marilou

    2015-07-01

    In this paper, we investigate questions arising in Parsons and Geist (Bull Seismol Soc Am 102:1-11, 2012). Pseudo-causal models connecting magnitudes and waiting times are considered, through generalized regression. We use conditional models (magnitude given previous waiting time, and conversely) as an extension of the joint distribution model described in Nikoloulopoulos and Karlis (Environmetrics 19:251-269, 2008). On the one hand, we fit a Pareto distribution for earthquake magnitudes, where the tail index is a function of the waiting time following the previous earthquake; on the other hand, waiting times are modeled using a Gamma or a Weibull distribution, where parameters are functions of the magnitude of the previous earthquake. We use those two models, alternately, to generate the dynamics of earthquake occurrence and to estimate the probability of occurrence of several earthquakes within a year or a decade.
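    The alternating conditional scheme the abstract describes can be sketched as a simple generator: magnitudes drawn from a Pareto law whose tail index depends on the previous waiting time, and waiting times drawn from a Gamma law whose scale depends on the previous magnitude. The link functions and parameter values below are illustrative assumptions, not the paper's fitted forms:

```python
import numpy as np

rng = np.random.default_rng(2)
m_min = 4.0                                  # magnitude threshold (assumed)
mags, waits = [5.0], [30.0]                  # seed magnitude (Mw) and wait (days)

for _ in range(1000):
    # Tail index grows with the preceding waiting time: longer quiescence,
    # thinner magnitude tail (an assumed, not fitted, link function).
    alpha = 1.5 + 0.01 * waits[-1]
    # Pareto draw by inverse CDF: X = m_min * U**(-1/alpha).
    mags.append(m_min * (1.0 - rng.random()) ** (-1.0 / alpha))
    # Gamma waiting time whose scale grows with the preceding magnitude.
    scale = 10.0 * np.exp(0.3 * (mags[-1] - m_min))
    waits.append(rng.gamma(shape=2.0, scale=scale))
```

    Repeating such simulations many times gives Monte Carlo estimates of quantities like the probability of several large events within a year, which is how the paper uses its fitted versions of these two conditionals.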

  10. High-resolution seismic reflection/refraction imaging from Interstate 10 to Cherry Valley Boulevard, Cherry Valley, Riverside County, California: implications for water resources and earthquake hazards

    USGS Publications Warehouse

    Gandhok, G.; Catchings, R.D.; Goldman, M.R.; Horta, E.; Rymer, M.J.; Martin, P.; Christensen, A.

    1999-01-01

    This report is the second of two reports on seismic imaging investigations conducted by the U.S. Geological Survey (USGS) during the summers of 1997 and 1998 in the Cherry Valley area in California (Figure 1a). In the first report (Catchings et al., 1999), data and interpretations were presented for four seismic imaging profiles (CV-1, CV-2, CV-3, and CV-4) acquired during the summer of 1997. In this report, we present data and interpretations for three additional profiles (CV-5, CV-6, and CV-7) acquired during the summer of 1998 and the combined seismic images for all seven profiles. This report addresses both groundwater resources and earthquake hazards in the San Gorgonio Pass area because the shallow (upper few hundred meters) subsurface stratigraphy and structure affect both issues. The cities of Cherry Valley and Beaumont are located approximately 130 km (~80 miles) east of Los Angeles, California along the southern alluvial fan of the San Bernardino Mountains (see Figure 1b). These cities are two of several small cities that are located within San Gorgonio Pass, a lower-lying area between the San Bernardino and the San Jacinto Mountains. Cherry Valley and Beaumont are desert cities with summer daytime temperatures often well above 100°F. High water usage in the arid climate taxes the available groundwater supply in the region, increasing the need for efficient management of the groundwater resources. The USGS and the San Gorgonio Water District (SGWD) work cooperatively to evaluate the quantity and quality of groundwater supply in the San Gorgonio Pass region. To better manage the water supplies within the District during wet and dry periods, the SGWD sought to develop a groundwater recharge program, whereby excess water would be stored in underground aquifers during wet periods (principally winter months) and retrieved during dry periods (principally summer months). 
The SGWD preferred a surface recharge approach because it could be less expensive than a recharging program based on injection wells. However, at an existing surface recharge site, surface recharge of the aquifer was limited by the presence of clay-rich layers that impede the downward percolation of the surface water. In boreholes, these clay-rich layers were found to extend from the near surface to about 50 m depth. If practical, the SGWD desired to relocate the recharge ponds to another location within the Cherry Valley–Beaumont area. This required that sites be found where the clay-rich layers were absent. The SGWD elected to explore for such sites by employing a combination of drilling and seismic techniques. A number of near-surface faults have been suggested in the Cherry Valley-Beaumont area (Figure 1b). However, there may be additional unmapped faults that underlie the alluvial valley of San Gorgonio Pass. Because faults are known to act as barriers to lateral groundwater flow in alluvial groundwater systems, mapped and unmapped subsurface faults in the Cherry Valley-Beaumont area would likely influence groundwater flow and the lateral distribution of recharged water. These same faults may pose a significant hazard to the local desert communities and to greater areas of southern California due to the presence of lifelines (water, electrical, gas, transportation, etc.) that extend through San Gorgonio Pass to larger urban areas. The three principal goals of the seismic investigation presented in this report were to laterally map the subsurface stratigraphic horizons, locate faults that may act as barriers to groundwater flow, and measure velocities of shallow sediments that may give rise to amplified shaking during major earthquakes.

  11. Liquefaction potential in San Jose, California

    SciTech Connect

    Power, M.S.; Wesling, J.R.; Youngs, R.R.; Perman, R.C.; Disilvestro, L.A. (Geomatrix Consultants, San Francisco, CA (United States))

    1992-01-01

    San Jose, California is located within a region of high seismic potential. This study assesses liquefaction potential within San Jose by evaluating liquefaction susceptibility, liquefaction opportunity, and probability of liquefaction. Liquefaction susceptibility was evaluated principally by mapping the Quaternary geologic deposits in the City, and by analyzing soil data and groundwater data in approximately 850 geotechnical reports contained in the files of the City's Department of Public Works. A seismic zonation of liquefaction susceptibility was developed wherein susceptibility is a function of sediment age and mode of deposition, and of depth to groundwater. Liquefaction opportunity was evaluated by characterizing the frequency of occurrence of earthquakes on seismic sources in the region, characterizing earthquake ground motion attenuation, and calculating the probability or frequency of exceedance of earthquake ground motions at various locations throughout the City. Probabilities of liquefaction were evaluated by combining the results of the assessments of liquefaction susceptibility and opportunity. For Holocene fluvial deposits, the probabilities of liquefaction in a 50-year time period are about 60% to 80% for areas of shallow groundwater and 40% to 50% for areas of intermediate groundwater depth. For Holocene alluvial fan deposits and shallow groundwater, the probability of liquefaction in a 50-year time period is 40% to 50%.
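    The combination of "opportunity" (the rate of liquefaction-triggering shaking) and "susceptibility" (the chance of liquefaction given that shaking) into a 50-year probability can be sketched with a Poisson exceedance model. The rate and conditional probability below are illustrative assumptions, not the study's calibrated values:

```python
import math

# Assumed inputs for one susceptibility zone (illustrative only):
annual_rate = 0.02         # /yr, rate of shaking above the liquefaction threshold
p_liq_given_shaking = 0.8  # susceptibility: P(liquefaction | triggering shaking)
t = 50.0                   # exposure time, years

# Thinned Poisson process: liquefaction-causing events occur at rate
# annual_rate * p_liq_given_shaking, so the probability of at least one
# such event in t years is 1 - exp(-rate * t).
p_50yr = 1.0 - math.exp(-annual_rate * p_liq_given_shaking * t)
print(round(p_50yr, 2))    # 0.55, the same order as the 40-80% range reported
```

    In the actual study the opportunity term is a full hazard curve (probability of exceedance versus ground-motion level), integrated against susceptibility rather than collapsed to a single threshold rate as here.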

  12. Risk and return: evaluating Reverse Tracing of Precursors earthquake predictions

    NASA Astrophysics Data System (ADS)

    Zechar, J. Douglas; Zhuang, Jiancang

    2010-09-01

    In 2003, the Reverse Tracing of Precursors (RTP) algorithm attracted the attention of seismologists and international news agencies when researchers claimed two successful predictions of large earthquakes. These researchers had begun applying RTP to seismicity in Japan, California, the eastern Mediterranean and Italy; they have since applied it to seismicity in the northern Pacific, Oregon and Nevada. RTP is a pattern recognition algorithm that uses earthquake catalogue data to declare alarms, and these alarms indicate that RTP expects a moderate to large earthquake in the following months. The spatial extent of alarms is highly variable and each alarm typically lasts 9 months, although the algorithm may extend alarms in time and space. We examined the record of alarms and outcomes since the prospective application of RTP began, and in this paper we report on the performance of RTP to date. To analyse these predictions, we used a recently developed approach based on a gambling score, and we used a simple reference model to estimate the prior probability of target earthquakes for each alarm. Formally, we believe that RTP investigators did not rigorously specify the first two `successful' predictions in advance of the relevant earthquakes; because this issue is contentious, we consider analyses with and without these alarms. When we included contentious alarms, RTP predictions demonstrate statistically significant skill. Under a stricter interpretation, the predictions are marginally unsuccessful.

  13. Written in Stone Earthquake Animations

    NSDL National Science Digital Library

    Jeff Sale, EdCenter Staff Scientist

    This group of brief animations shows destructive phenomena related to earthquakes and provides some advice on mitigating their effects. The collection includes an animation of Rayleigh waves, showing the reverse elliptical motion that makes them especially damaging; a demonstration of the difference in wave propagation and amplitude between hard rock and unconsolidated sediment; and an animation showing the relationship between earthquake magnitude and fault movement on the San Andreas Fault. For homeowners, there are animations depicting an unsecured cripple wall and chimney failure, with suggestions for strengthening these components. There are also animations of fault movement that occurred during specific earthquakes, including the 1994 Northridge earthquake, the 1992 Landers earthquake, and the 1906 San Francisco earthquake. The animations were developed for the educational video "Written in Stone," a project funded by and developed for the California Seismic Safety Commission.

  14. Tidal triggering of low frequency earthquakes near Parkfield, California: Implications for fault mechanics within the brittle-ductile transition

    NASA Astrophysics Data System (ADS)

    Thomas, A. M.; Bürgmann, R.; Shelly, D. R.; Beeler, N. M.; Rudolph, M. L.

    2012-05-01

    Studies of nonvolcanic tremor (NVT) have established the significant impact of small stress perturbations on NVT generation. Here we analyze the influence of the solid earth and ocean tides on a catalog of ~550,000 low frequency earthquakes (LFEs) distributed along a 150 km section of the San Andreas Fault centered at Parkfield. LFE families are identified in the NVT data on the basis of waveform similarity and are thought to represent small, effectively co-located earthquakes occurring on brittle asperities on an otherwise aseismic fault at depths of 16 to 30 km. We calculate the sensitivity of each of these 88 LFE families to the tidally induced right-lateral shear stress (RLSS), fault-normal stress (FNS), and their time derivatives and use the hypocentral locations of each family to map the spatial variability of this sensitivity. LFE occurrence is most strongly modulated by fluctuations in shear stress, with the majority of families demonstrating a correlation with RLSS at the 99% confidence level or above. Producing the observed LFE rate modulation in response to shear stress perturbations requires low effective stress in the LFE source region. There are substantial lateral and vertical variations in tidal shear stress sensitivity, which we interpret to reflect spatial variation in source region properties, such as friction and pore fluid pressure. Additionally, we find that highly episodic, shallow LFE families are generally less correlated with tidal stresses than their deeper, continuously active counterparts. The majority of families have weaker or insignificant correlation with positive (tensile) FNS. Two groups of families demonstrate a stronger correlation with fault-normal tension to the north and with compression to the south of Parkfield. 
The families that correlate with fault-normal clamping coincide with a releasing right bend in the surface fault trace and the LFE locations, suggesting that the San Andreas remains localized and contiguous down to near the base of the crust. The deep families that have high sensitivity to both shear and tensile normal stress perturbations may be indicative of an increase in effective fault contact area with depth. Synthesizing our observations with those of other LFE-hosting localities will help to develop a comprehensive understanding of transient fault slip below the "seismogenic zone" by providing constraints on parameters in physical models of slow slip and LFEs.

  15. Reply to "Comment on 'Operational Earthquake Forecasting: Status of Knowledge and Guidelines for Implementation by Jordan et al. [2011]'

    E-print Network

    Reply to "Comment on 'Operational Earthquake Forecasting: Status of Knowledge and Guidelines Commission on Earthquake Forecasting (ICEF) report [Jordan et al. 2011], Crampin [2012] claims Yamaoka11, Jochen Zschau12 1 Southern California Earthquake Center, Los Angeles, USA 2 University

  16. The Pulse Azimuth effect as seen in induction coil magnetometers located in California and Peru 2007-2010, and its possible association with earthquakes

    NASA Astrophysics Data System (ADS)

    Dunson, J. C.; Bleier, T. E.; Roth, S.; Heraud, J.; Alvarez, C. H.; Lira, A.

    2011-07-01

    The QuakeFinder network of magnetometers has recorded geomagnetic field activity in California since 2000. Established as an effort to follow up observations of ULF activity reported from before and after the M = 7.1 Loma Prieta earthquake in 1989 by Stanford University, the QuakeFinder network has over 50 sites, fifteen of which are high-resolution QF1005 and QF1007 systems. Pairs of high-resolution sites have also been installed in Peru and Taiwan. Increases in pulse activity preceding nearby seismic events are followed by decreases in activity afterwards in the three cases that are discussed here. In addition, longer term data is shown, revealing a rich signal structure not previously known in QuakeFinder data, or by many other authors who have reported on pre-seismic ULF phenomena. These pulses occur as separate ensembles, with demonstrable repeatability and uniqueness across a number of properties such as waveform, angle of arrival, amplitude, and duration. Yet they appear to arrive with exponentially distributed inter-arrival times, which indicates a Poisson process rather than a periodic, i.e., stationary process. These pulses were observed using three-axis induction coil magnetometers that are buried 1-2 m under the surface of the Earth. Our sites use a Nyquist frequency of 16 Hertz (25 Hertz for the new QF1007 units), and they record these pulses at amplitudes from 0.1 to 20 nano-Tesla with durations of 0.1 to 12 s. They are predominantly unipolar pulses, which may imply charge migration, and they are stronger in the two horizontal (north-south and east-west) channels than they are in the vertical channels. Pulses have been seen to occur in bursts lasting many hours. The pulses have large amplitudes, and study of the three-axis data shows that the amplitude ratios of the pulses taken from pairs of orthogonal coils are stable across the bursts, suggesting a similar source. 
This paper presents three instances of increases in pulse activity in the 30 days prior to an earthquake, each followed by a steep decline after the event. The pulses are shown, methods of detecting the pulses and calculating their azimuths are developed and discussed, and the paper closes with a brief look at future work.
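    The Poisson-process inference above rests on a standard property: a homogeneous Poisson process has exponentially distributed inter-arrival times, whose coefficient of variation (std/mean) is 1. A minimal check of that diagnostic, on synthetic data standing in for the pulse catalog, might look like:

```python
import numpy as np

# Synthetic pulse inter-arrival times; a mean spacing of 120 s is an
# arbitrary illustrative value, not a QuakeFinder statistic.
rng = np.random.default_rng(3)
inter_arrivals = rng.exponential(scale=120.0, size=5000)   # seconds

# For an exponential distribution, std == mean, so the coefficient of
# variation is ~1; CV << 1 would instead suggest quasi-periodic arrivals.
cv = inter_arrivals.std() / inter_arrivals.mean()
```

    A fuller test would compare the empirical distribution against an exponential fit (e.g., a Kolmogorov-Smirnov test), since CV ≈ 1 is necessary but not sufficient for Poisson arrivals.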

  17. Applying time-reverse-imaging techniques to locate individual low-frequency earthquakes on the San Andreas fault near Cholame, California

    NASA Astrophysics Data System (ADS)

    Horstmann, T.; Harrington, R. M.; Cochran, E.; Shelly, D. R.

    2013-12-01

    Observations of non-volcanic tremor have become ubiquitous in recent years. In spite of the abundance of observations, locating tremor remains a difficult task because of the lack of distinctive phase arrivals. Here we use time-reverse-imaging techniques that do not require identifying phase arrivals to locate individual low-frequency earthquakes (LFEs) within tremor episodes on the San Andreas fault near Cholame, California. Time windows of 1.5-second duration containing LFEs are selected from continuously recorded waveforms of the local seismic network filtered between 1-5 Hz. We propagate the time-reversed seismic signal back through the subsurface using a staggered-grid finite-difference code. Assuming all rebroadcasted waveforms result from similar wave fields at the source origin, we search for wave field coherence in time and space to obtain the source location and origin time where the constructive interference is a maximum. We use an interpolated velocity model with a grid spacing of 100 m and a 5 ms time step to calculate the relative curl field energy amplitudes for each rebroadcasted seismogram every 50 ms for each grid point in the model. Finally, we perform a grid search for coherency in the curl field using a sliding time window, taking the absolute value of the correlation coefficient to account for differences in radiation pattern. The highest median cross-correlation coefficient value at a given grid point indicates the source location for the rebroadcasted event. Horizontal location errors based on the spatial extent of the highest 10% cross-correlation coefficients are on the order of 4 km, and vertical errors on the order of 3 km. Furthermore, a test of the method using earthquake data shows that the method produces an identical hypocentral location (within errors) as that obtained by standard ray-tracing methods. 
We also compare the event locations to a LFE catalog that locates the LFEs from stacked waveforms of repeated LFEs identified by cross-correlation techniques [Shelly and Hardebeck, 2010]. The LFE catalog uses stacks of at least several hundred templates to identify phase arrivals used to estimate the location. We find epicentral locations for individual LFEs based on the time-reverse-imaging technique are within ~4 km relative to the LFE catalog [Shelly and Hardebeck, 2010]. LFEs locate between 15-25 km depth, and have similar focal depths found in previous studies of the region. Overall, the method can provide robust locations of individual LFEs without identifying and stacking hundreds of LFE templates; the locations are also more accurate than envelope location methods, which have errors on the order of tens of km [Horstmann et al., 2013].

  18. Incorporating anthropogenic influences into fire probability models: Effects of development and climate change on fire activity in California

    NASA Astrophysics Data System (ADS)

    Mann, M.; Moritz, M.; Batllori, E.; Waller, E.; Krawchuk, M.; Berck, P.

    2014-12-01

    The costly interactions between humans and natural fire regimes throughout California demonstrate the need to understand the uncertainties surrounding wildfire, especially in the face of a changing climate and expanding human communities. Although a number of statistical and process-based wildfire models exist for California, there is enormous uncertainty about the location and number of future fires. Models estimate an increase in fire occurrence between nine and fifty-three percent by the end of the century. Our goal is to assess the role of uncertainty in climate and anthropogenic influences on the state's fire regime from 2000-2050. We develop an empirical model that integrates novel information about the distribution and characteristics of future plant communities without assuming a particular distribution, and improve on previous efforts by integrating dynamic estimates of population density at each forecast time step. Historically, we find that anthropogenic influences account for up to fifty percent of the total fire count, and that further housing development will incite or suppress additional fires according to their intensity. We also find that the total area burned is likely to increase but at a slower than historical rate. Previous findings of substantially increased numbers of fires may be tied to the assumption of static fuel loadings, and the use of proxy variables not relevant to plant community distributions. We also find considerable agreement between GFDL and PCM model A2 runs, with decreasing fire counts expected only in areas of coastal influence below San Francisco and above Los Angeles. Due to potential shifts in rainfall patterns, substantial uncertainty remains for the semiarid deserts of the inland south. The broad shifts of wildfire between California's climatic regions forecast in this study point to dramatic shifts in the pressures plant and human communities will face by midcentury. 
The information provided by this study reduces the level of uncertainty surrounding the influence that natural and anthropogenic systems have on wildfire.

  19. GUIDELINES FOR EARTHQUAKE RESISTANT DESIGN and EVALUATION OF EARTHQUAKE FORCES

    Microsoft Academic Search

    S. K. Bhattacharyya

    Seismic risk is the probability that social or economic consequences of earthquakes will equal or exceed specified values at a site or at several sites or in an area during a specified exposure time. The seismic risk for a project depends to a great extent on the seismic activity of the region. As most earthquakes arise from stress build-up due

  20. The 1999 Hector Mine Earthquake, Southern California: Vector Near-Field Displacements from ERS InSAR

    NASA Technical Reports Server (NTRS)

    Sandwell, David T.; Sichoix, Lydie; Smith, Bridget

    2002-01-01

    Two components of fault slip are uniquely determined from two line-of-sight (LOS) radar interferograms by assuming that the fault-normal component of displacement is zero. We use this approach with ascending and descending interferograms from the ERS satellites to estimate surface slip along the Hector Mine earthquake rupture. The LOS displacement is determined by visually counting fringes to within 1 km of the outboard ruptures. These LOS estimates and uncertainties are then transformed into strike- and dip-slip estimates and uncertainties; the transformation is singular for a N-S oriented fault and optimal for an E-W oriented fault. In contrast to our previous strike-slip estimates, which were based only on a descending interferogram, we now find good agreement with the geological measurements, except at the ends of the rupture. The ascending interferogram reveals significant west-side-down dip-slip (approximately 1.0 m) which reduces the strike-slip estimates by 1 to 2 m, especially along the northern half of the rupture. A spike in the strike-slip displacement of 6 m is observed in the central part of the rupture. This large offset is confirmed by subpixel cross correlation of features in the before and after amplitude images. In addition to strike slip and dip slip, we identify uplift and subsidence along the fault, related to the restraining and releasing bends in the fault trace, respectively. Our main conclusion is that at least two look directions are required for accurate estimates of surface slip even along a pure strike-slip fault. Models and results based only on a single look direction could have major errors. Our new estimates of strike slip and dip slip along the rupture provide a boundary condition for dislocation modeling. A simple model, which has uniform slip to a depth of 12 km, shows good agreement with the observed ascending and descending interferograms.
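
    The transformation described above reduces, point by point along the rupture, to a two-by-two linear system once the fault-normal component is assumed zero. A minimal numerical sketch of that step (the look vectors, fault geometry, and function name here are illustrative, not the authors' values):

```python
import numpy as np

def slip_from_two_los(los_asc, los_desc, d_asc, d_desc, strike_hat, dip_hat):
    """Solve for strike-slip and dip-slip amplitudes from ascending and
    descending LOS displacements, assuming zero fault-normal motion."""
    # Each row projects the two slip basis vectors onto one LOS direction.
    G = np.array([
        [strike_hat @ los_asc,  dip_hat @ los_asc],
        [strike_hat @ los_desc, dip_hat @ los_desc],
    ])
    # Singular for a N-S fault (rows become proportional); well posed for E-W.
    return np.linalg.solve(G, np.array([d_asc, d_desc]))

# Synthetic check: E-W striking fault, vertical dip vector, ERS-like look vectors.
asc = np.array([-0.38, -0.08, 0.92])
desc = np.array([0.38, -0.08, 0.92])
strike_hat = np.array([1.0, 0.0, 0.0])
dip_hat = np.array([0.0, 0.0, 1.0])
true_u = 2.0 * strike_hat + 0.5 * dip_hat
s, d = slip_from_two_los(asc, desc, true_u @ asc, true_u @ desc, strike_hat, dip_hat)
# recovers s = 2.0 m strike slip, d = 0.5 m dip slip
```

    Propagating the fringe-counting uncertainties through the same matrix gives the strike- and dip-slip uncertainties mentioned in the abstract.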

  1. Observation and prediction of dynamic ground strains, tilts, and torsions caused by the Mw 6.0 2004 Parkfield, California, earthquake and aftershocks, derived from UPSAR array observations

    USGS Publications Warehouse

    Spudich, P.; Fletcher, Joe B.

    2008-01-01

    The 28 September 2004 Parkfield, California, earthquake (Mw 6.0) and four aftershocks (Mw 4.7-5.1) were recorded on 12 accelerograph stations of the U.S. Geological Survey Parkfield seismic array (UPSAR), an array of three-component accelerographs occupying an area of about 1 km2 located 8.8 km from the San Andreas fault. Peak horizontal acceleration and velocity at UPSAR during the mainshock were 0.45g and 27 cm/sec, respectively. We determined both time-varying and peak values of ground dilatations, shear strains, torsions, tilts, torsion rates, and tilt rates by applying a time-dependent geodetic analysis to the observed array displacement time series. Array-derived dilatations agree fairly well with point measurements made on high sample rate recordings of the Parkfield-area dilatometers (Johnston et al., 2006). Torsion Fourier amplitude spectra agree well with ground velocity spectra, as expected for propagating plane waves. A simple predictive relation, using the predicted peak velocity from the Boore-Atkinson ground-motion prediction relation (Boore and Atkinson, 2007) scaled by a phase velocity of 1 km/sec, predicts observed peak Parkfield and Chi-Chi rotations (Huang, 2003) well. However, rotation rates measured during Mw 5 Ito, Japan, events observed on a gyro sensor (Takeo, 1998) are factors of 5-60 greater than those predicted by our predictive relation. This discrepancy might be caused by a scale dependence in rotation, with rotations measured over a short baseline exceeding those measured over long baselines. An alternative hypothesis is that events having significant non-double-couple mechanisms, like the Ito events, radiate much stronger rotations than double-couple events. If this is true, then rotational observations might provide an important source of new information for monitoring seismicity in volcanic areas.
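
    The predictive relation above is essentially a plane-wave kinematic scaling: for a transversely polarized plane wave, peak rotation is peak velocity divided by twice the phase velocity. A sketch with the Parkfield numbers quoted above (the factor of two is the standard plane-wave convention; it is an assumption here, not necessarily the paper's exact calibration):

```python
def peak_rotation_plane_wave(peak_velocity, phase_velocity):
    """Peak rotation (rad) implied by a transversely polarized plane wave:
    rotation = v / (2c). Velocities in m/s."""
    return peak_velocity / (2.0 * phase_velocity)

# UPSAR mainshock: 27 cm/s peak velocity; assumed phase velocity of 1 km/s.
rotation = peak_rotation_plane_wave(0.27, 1000.0)  # 1.35e-4 rad
```

    The factor-of-5-to-60 discrepancy with the Ito gyro observations corresponds to the same formula predicting far smaller rotations than were measured at a point sensor.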

  2. Lower crustal structure in northern California: Implications from strain rate variations following the 1906 San Francisco earthquake

    NASA Astrophysics Data System (ADS)

    Kenner, Shelley J.; Segall, Paul

    2003-01-01

    It is well known that geodetic data from a single instant in time cannot uniquely characterize structure or rheology beneath active seismogenic zones. Nevertheless, comparison of spatial and temporal variations in deformation rate with time-dependent mechanical models can place valuable constraints on fault zone geometry and rheology. We consider postseismic strain rate transients by comparing geodetic data from north of San Francisco Bay obtained between 1906 and 1995 to predictions from viscoelastic finite element models. Models include (1) an elastic plate over a viscoelastic half-space, (2) distributed shear within a viscoelastic layer, (3) discrete shear zones within an otherwise elastic layer, (4) discrete shear zones in combination with distributed viscoelastic shear, and (5) midcrustal detachment surfaces. We vary, as applicable, locking depth, elastic thickness, depth to the top and bottom of the distributed shear layer, distributed shear relaxation time, discrete shear zone relaxation time, and discrete shear zone width. The best fitting, physically reasonable elastic plate over viscoelastic half-space models (1) do a poor job simultaneously predicting spatial and temporal variations in the data. The best fitting distributed shear models (2) do a poor job predicting spatial variations in the deformation rate. Although they fit the geodetic data, recent findings from seismic reflection-refraction studies in northern California argue against models with shallow subhorizontal detachments (5). Models incorporating discrete shear zones (3, 4) provide the best fit to the geodetic data and are consistent with seismic studies that argue for discrete fault zones extending through the entire crust.

  3. Rating the Size of Earthquakes

    NSDL National Science Digital Library

    This report describes how the work of K. Wadati, Charles F. Richter, Harry O. Wood, and Beno Gutenberg resulted in a way of rating earthquakes in southern California according to an instrumental analysis of the amount of energy they released in the form of seismic waves. This work resulted in the first use of the term "magnitude" for describing the amount of energy released by an earthquake, and in the development of the now-famous Richter Scale for quantifying earthquake magnitudes. Topics include the original definition of Richter magnitude and a brief synopsis of how Richter used earthquake data from southern California to graphically represent trace amplitude and develop a table of values that could be used to calculate magnitudes.

  4. Parkfield: Earthquake Prediction: A Brief History

    NSDL National Science Digital Library

    This report describes recent efforts at earthquake prediction, focusing on the modern era beginning in the mid- to late 1970s. Topics include a history of prediction efforts, the measurement of physical parameters in areas where earthquakes occur, and the development of a model upon which predictions could be based. The efforts centered on Parkfield, California, whose well-known seismic history allowed the development of a 'characteristic Parkfield earthquake' model and led to a formal prediction that a moderate-size earthquake would occur at Parkfield between 1985 and 1993. However, the anticipated earthquake did not occur until September 2004.

  5. The California Hazards Institute

    Microsoft Academic Search

    J. B. Rundle; L. H. Kellogg; D. L. Turcotte

    2006-01-01

    California's abundant resources are linked with its natural hazards. Earthquakes, landslides, wildfires, floods, tsunamis, volcanic eruptions, severe storms, fires, and droughts afflict the state regularly. These events have the potential to become great disasters, like the San Francisco earthquake and fire of 1906, that overwhelm the capacity of society to respond. At such times, the fabric of civic life is

  6. Hayward Fault, California Interferogram

    NASA Technical Reports Server (NTRS)

    2000-01-01

    This image of California's Hayward fault is an interferogram created from a pair of Synthetic Aperture Radar (SAR) images, combined to measure changes in the surface that may have occurred between the times the two images were taken.

    The images were collected by the European Space Agency's Remote Sensing satellites ERS-1 and ERS-2 in June 1992 and September 1997 over the central San Francisco Bay in California.

    The radar image data are shown as a gray-scale image, with the interferometric measurements that show the changes rendered in color. Only the urbanized area could be mapped with these data. The color changes from orange tones to blue tones across the Hayward fault (marked by a thin red line) show about 2-3 centimeters (0.8-1.1 inches) of gradual displacement or movement of the southwest side of the fault. The block west of the fault moved horizontally toward the northwest during the 63 months between the acquisition of the two SAR images. This fault movement is called aseismic creep because the fault moved slowly without generating an earthquake.

    Scientists are using the SAR interferometry along with other data collected on the ground to monitor this fault motion in an attempt to estimate the probability of an earthquake on the Hayward fault, which last had a major earthquake of magnitude 7 in 1868. This analysis indicates that the northern part of the Hayward fault is creeping all the way from the surface to a depth of 12 kilometers (7.5 miles). This suggests that the potential for a large earthquake on the northern Hayward fault might be less than previously thought. The blue area to the west (lower left) of the fault near the center of the image seemed to move upward relative to the yellow and orange areas nearby by about 2 centimeters (0.8 inches). The cause of this apparent motion is not yet confirmed, but the rise of groundwater levels during the time between the images may have caused the reversal of a small portion of the subsidence that this area suffered in the past.

    This research is the result of collaboration between the University of California's Berkeley and Davis campuses, the Lawrence Berkeley National Laboratory, and NASA's Jet Propulsion Laboratory in Pasadena, Calif. and is reported in the August 18, 2000, issue of Science magazine.

  7. Earthquake forecasting: Statistics and Information

    E-print Network

    Gertsik, V; Krichevets, A

    2013-01-01

    We present an axiomatic approach to earthquake forecasting in terms of multi-component random fields on a lattice. This approach provides a method for constructing point estimates and confidence intervals for conditional probabilities of strong earthquakes under conditions on the levels of precursors. It also provides an approach for setting up a multilevel alarm system and for hypothesis testing with binary alarms. We use a method of comparison for different earthquake forecasts in terms of the increase of Shannon information. 'Forecasting' and 'prediction' of earthquakes are equivalent in this approach.
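
    The information-based comparison mentioned above can be sketched as an average log-likelihood-ratio score: each space-time bin gets a forecast probability p and a reference probability q, and the gain in bits is averaged over outcomes. This is an illustrative scoring function, not the authors' exact estimator:

```python
import math

def information_gain_per_event(forecast_probs, reference_probs, outcomes):
    """Average Shannon information gain (bits per bin) of a binary forecast over
    a reference model: log2(p/q) where an event occurred, log2((1-p)/(1-q))
    where none did."""
    total = 0.0
    for p, q, hit in zip(forecast_probs, reference_probs, outcomes):
        total += math.log2(p / q) if hit else math.log2((1 - p) / (1 - q))
    return total / len(outcomes)

# Two bins: the forecast raises the probability where the event occurred
# and lowers it where nothing happened, so the gain is positive.
gain = information_gain_per_event([0.5, 0.1], [0.25, 0.25], [True, False])
```

    A forecast no better than the reference scores zero on average; a misleading one scores negative.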

  8. Self-organized criticality and earthquake predictability

    NASA Astrophysics Data System (ADS)

    Godano, C.; Alonzo, M. L.; Caruso, V.

    1993-11-01

    We analyse a seismic catalogue of southern California to investigate the possibility of earthquake prediction using the hypothesis that seismic events are self-organized critical phenomena. The relation found previously is valid only in a mean-field approximation and cannot be used for earthquake prediction, because the time clustering of seismic events makes it impossible to define a standard deviation of earthquake waiting times.

  9. Simulation of rockfalls triggered by earthquakes

    Microsoft Academic Search

    Y. Kobayashi; E. L. Harp; T. Kagawa

    1990-01-01

    Summary A computer program to simulate the downslope movement of boulders in rolling or bouncing modes has been developed and applied to actual rockfalls triggered by the Mammoth Lakes, California, earthquake sequence in 1980 and the Central Idaho earthquake in 1983. In order to reproduce a movement mode where bouncing predominated, we introduced an artificial unevenness to the slope surface

  10. 75 FR 22872 - California Disaster # CA-00154

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-30

    ...Administrative declaration of a disaster for the State of California dated 04/21/2010. Incident: Northern Baja California Earthquake. Incident Period: 04/04/2010 and continuing. Effective Date: 04/21/2010. Physical Loan Application...

  11. THE DEVELOPMENT OF THE NETWORK FOR EARTHQUAKE ENGINEERING SIMULATION (NEES)

    E-print Network

    Pancake, Cherri M.

    Stephen Mahin, Robert ..., Consortium of Universities for Research in Earthquake Engineering, Richmond, California, USA; reitherman@curee.org. ... a program designed to advance earthquake engineering by infusing it with recent developments in information

  12. Earthquake Preparedness 101: Planning Guidelines for Colleges and Universities.

    ERIC Educational Resources Information Center

    California Governor's Office, Sacramento.

    This publication is a guide for California colleges and universities wishing to prepare for earthquakes. An introduction aimed at institutional leaders emphasizes that earthquake preparedness is required by law and argues that there is much that can be done to prepare for earthquakes. The second section, addressed to the disaster planner, offers…

  13. Earthquake Quiz

    NSDL National Science Digital Library

    This web site provides a short, interactive, four-question quiz on earthquakes focusing on the largest earthquakes in the world and in recent US history, preparedness, and the development of seismic instrumentation.

  14. Earthquake Glossary

    MedlinePLUS

  15. Earthquake Hazard in Italy, 2001–2030

    Microsoft Academic Search

    Roberto W. Romeo

    2005-01-01

    The study computes time-dependent earthquake probabilities on the basis of seismicity data deriving mainly from historical records. It provides a methodological approach useful for countries where the scarcity of instrumental data and/or paleoseismological evidence requires that historical information be emphasized. Thus, the conditional probability that damaging earthquakes (M ≥ 6) may occur in Italy in the next 30
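
    Time-dependent probabilities of this kind are conditional renewal probabilities: given the time elapsed since the last damaging event, the chance of another within the next exposure window. A minimal sketch using a lognormal recurrence model, one common choice for such calculations (the elapsed time, mean interval, and aperiodicity below are placeholders, not the paper's values):

```python
import math

def conditional_probability(t_elapsed, window, mean_ri, cov):
    """Conditional probability of an event in the next `window` years, given
    `t_elapsed` years since the last one, under a lognormal renewal model
    with mean recurrence interval `mean_ri` and coefficient of variation `cov`."""
    sigma2 = math.log(1.0 + cov ** 2)
    mu = math.log(mean_ri) - 0.5 * sigma2
    sigma = math.sqrt(sigma2)

    def cdf(t):
        # Lognormal CDF via the error function.
        if t <= 0.0:
            return 0.0
        return 0.5 * (1.0 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2.0))))

    # P(T <= t + w | T > t) = (F(t + w) - F(t)) / (1 - F(t))
    return (cdf(t_elapsed + window) - cdf(t_elapsed)) / (1.0 - cdf(t_elapsed))

# Example: 100 yr elapsed, 30 yr window, 150 yr mean interval, aperiodicity 0.5.
p = conditional_probability(100.0, 30.0, 150.0, 0.5)
```

    Longer exposure windows always give larger conditional probabilities; the model choice (lognormal, Brownian passage time, etc.) controls how the hazard varies with elapsed time.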

  16. Virtual Earthquake

    NSDL National Science Digital Library

    Gary Novak

    This interactive feature shows students how an earthquake epicenter is located and how Richter magnitude is determined. They will use recordings of seismograms from three stations (provided in the activity), learn the difference between the focus and epicenter of an earthquake, and that the magnitude of an earthquake is an estimate of the amount of energy that it has released.
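
    The activity's two steps, distance from the S-minus-P arrival time and magnitude from trace amplitude plus distance, can be sketched numerically. The crustal velocities and nomogram constants below are common textbook approximations, not necessarily the values used in the exercise:

```python
import math

def epicentral_distance_km(s_minus_p_sec, vp=6.0, vs=3.5):
    """Epicentral distance from the S-minus-P arrival-time difference,
    assuming constant crustal P and S velocities (km/s)."""
    return s_minus_p_sec * (vp * vs) / (vp - vs)

def richter_magnitude(amplitude_mm, s_minus_p_sec):
    """Local (Richter) magnitude from maximum trace amplitude (mm) and S-P
    time, via a common nomogram approximation:
    ML = log10(A) + 3*log10(8*dt) - 2.92."""
    return math.log10(amplitude_mm) + 3.0 * math.log10(8.0 * s_minus_p_sec) - 2.92

# A station with 24 s of S-P time records a 23 mm peak trace amplitude.
dist = epicentral_distance_km(24.0)  # ~202 km
ml = richter_magnitude(23.0, 24.0)   # roughly magnitude 5
```

    Repeating the distance calculation at three stations and intersecting the three circles locates the epicenter, which is the triangulation step the feature walks students through.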

  17. Earthquake prediction

    Microsoft Academic Search

    Tsuneji Rikitake

    1968-01-01

    Earthquake prediction research programmes in a number of countries are reviewed together with achievements in various disciplines involved in earthquake prediction research, i.e., geodetic work, tide gauge observation, continuous observation of crustal movement, seismic activity and seismological method, seismic wave velocity, geotectonic work, geomagnetic and geoelectric work and laboratory work and its application in the field. Present-day development of earthquake

  18. Earthquakes Rock!

    NSDL National Science Digital Library

    Integrated Teaching and Learning Program,

    Students learn the two main methods to measure earthquakes, the Richter Scale and the Mercalli Scale. They make a model of a seismograph—a measuring device that records an earthquake on a seismogram. Students also investigate which structural designs are most likely to survive an earthquake. And, they illustrate an informational guide to the Mercalli Scale.

  19. The 2004 Parkfield earthquake: Test of the electromagnetic precursor hypothesis

    Microsoft Academic Search

    Stephen K. Park; William Dalrymple; Jimmy C. Larsen

    2007-01-01

    A controversy has existed for 30 years concerning the possibility of earthquake prediction using electromagnetic precursors. Long-term electromagnetic monitoring prior to, during, and after the M6.0 earthquake at Parkfield, California, on 28 September 2004 now provides a definitive test of this hypothesis. During the earthquake our instruments recorded clearly documented electrical signals from an earthquake: impulsive changes of up to

  20. San Francisco Bay Area Earthquakes

    NSDL National Science Digital Library

    Larry Braile

    This activity is designed to provide a better understanding of earthquake activity, the locations of faults, and earthquake hazards in the San Francisco Bay area. Students study a false-color satellite photo of the Bay area on which earthquake epicenters for a seventeen-year period have been plotted. Students use a California highway map or a copy of the California or San Francisco Bay area map from an atlas to help in finding some locations on the satellite image and to become familiar with the geography represented in the satellite view. They will be guided in the recognition of some features and will be able to answer the questions based on the map and photograph.

  1. Geotechnical Extreme Events Reconnaissance Report on the Performance of Structures in Densely Urbanized Areas Affected by Surface Fault Rupture During the August 24, 2014 M6 South Napa Earthquake, California, USA.

    NASA Astrophysics Data System (ADS)

    Cohen-Waeber, J.; Lanzafame, R.; Bray, J.; Sitar, N.

    2014-12-01

    The August 24, 2014, Mw 6.0 South Napa earthquake is the largest seismic event to have occurred in the San Francisco Bay Region, California, USA, since the Mw 6.9 1989 Loma Prieta earthquake. The epicenter was at the south end of the Napa Valley, California, and the event principally ruptured northwest along parts of the active West Napa fault zone. Bounded by two major fault zones to the east and west (the Calaveras and Rodgers Creek, respectively), the Napa Valley is filled with up to 170 m of alluvial deposits, is considered moderately to very highly susceptible to liquefaction, and has the potential for violent shaking. While damage due to strong ground shaking was significant, remarkably little damage due to liquefaction- or landslide-induced ground deformations was observed. This may be due to recent drought in the region. Instead, the South Napa earthquake is the first to produce significant surface rupture in this area since the Mw 7.9 1906 San Andreas event, and the first in Northern California to rupture through a densely urbanized environment. Clear expressions of surface fault rupture extended approximately 12-15 km northward from the epicenter and approximately 1-2 km southeast, with a significant impact on infrastructure, including roads, lifelines, and residential structures. The National Science Foundation-funded Geotechnical Extreme Events Reconnaissance (GEER) Association presents here its observations on the performance of structures affected by surface fault rupture, in a densely populated residential neighborhood located approximately 10 km north of the epicenter. 
Based on the detailed mapping of 27 residential structures, a preliminary assessment of the quantitative descriptions of damage shows certain characteristic interactions between surface fault rupture and the overlying infrastructure: 48% of concrete slabs cracked up to 8 cm wide, 19% of structures shifted up to 11 cm off of their foundation and 44% of foundations cracked up to 3 cm. Of particular interest is the performance of pier and grade beam foundations which behaved more stiffly in comparison to typically observed shallow strip footing foundations.

  2. Application of Second-Moment Source Analysis to Three Problems in Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Donovan, J.; Jordan, T. H.

    2011-12-01

    Though earthquake forecasting models have often represented seismic sources as space-time points (usually hypocenters), a more complete hazard analysis requires the consideration of finite-source effects, such as rupture extent, orientation, directivity, and stress drop. The most compact source representation that includes these effects is the finite moment tensor (FMT), which approximates the degree-two polynomial moments of the stress glut by its projection onto the seismic (degree-zero) moment tensor. This projection yields a scalar space-time source function whose degree-one moments define the centroid moment tensor (CMT) and whose degree-two moments define the FMT. We apply this finite-source parameterization to three forecasting problems. The first is the question of hypocenter bias: can we reject the null hypothesis that the conditional probability of hypocenter location is uniformly distributed over the rupture area? This hypothesis is currently used to specify rupture sets in the "extended" earthquake forecasts that drive simulation-based hazard models, such as CyberShake. Following McGuire et al. (2002), we test the hypothesis using the distribution of FMT directivity ratios calculated from a global data set of source slip inversions. The second is the question of source identification: given an observed FMT (and its errors), can we identify it with an FMT in the complete rupture set that represents an extended fault-based rupture forecast? Solving this problem will facilitate operational earthquake forecasting, which requires the rapid updating of earthquake triggering and clustering models. Our proposed method uses the second-order uncertainties as a norm on the FMT parameter space to identify the closest member of the hypothetical rupture set and to test whether this closest member is an adequate representation of the observed event. 
Finally, we address the aftershock excitation problem: given a mainshock, what is the spatial distribution of aftershock probabilities? The FMT representation allows us to generalize the models typically used for this purpose (e.g., marked point process models, such as ETAS), which will again be necessary in operational earthquake forecasting. To quantify aftershock probabilities, we compare mainshock FMTs with the first and second spatial moments of weighted aftershock hypocenters. We will describe applications of these results to the Uniform California Earthquake Rupture Forecast, version 3, which is now under development by the Working Group on California Earthquake Probabilities.
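
    The aftershock comparison above rests on weighted first and second spatial moments of hypocenter locations. A minimal sketch with synthetic 2-D hypocenters (the data and the unit-weight scheme are illustrative):

```python
import numpy as np

def spatial_moments(points, weights):
    """Weighted centroid (degree-one moment) and central covariance
    (degree-two moment) of a set of hypocenter coordinates."""
    w = np.asarray(weights, float)
    w = w / w.sum()
    x = np.asarray(points, float)
    centroid = w @ x                 # degree-one moment
    dx = x - centroid
    cov = (w[:, None] * dx).T @ dx   # degree-two central moment
    return centroid, cov

# Two equally weighted epicenters 2 km apart along x.
centroid, cov = spatial_moments([[0.0, 0.0], [2.0, 0.0]], [1.0, 1.0])
# centroid = [1, 0]; variance 1 km^2 along x, 0 along y
```

    Comparing these aftershock moments against the mainshock FMT's second moments is the quantitative step behind the aftershock-excitation analysis described above.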

  3. STUDY ON EARTHQUAKE-PROOF REINFORCEMENT OF BREAKWATER IN FISHING PORT TO NANKAI EARTHQUAKE

    Microsoft Academic Search

    Kojiro OKABAYASHI; Kozo TAGAYA; Youya HAYASHI

    The Nankai earthquakes, whose epicenters lie along the Nankai Trough offshore of Tosa Bay, have occurred repeatedly every 90-150 years. The Japanese government has officially announced that the earthquake will occur with a probability of 50 percent within the coming 30 years, and of 80 percent within 50 years. According to the

  4. Rapid Estimates of Postseismic Slip from GPS Data in Northern California

    Microsoft Academic Search

    I. A. Johanson

    2010-01-01

    Rapid postseismic slip has followed a number of recent California earthquakes, including the 2004 Parkfield earthquake and the 2005 Alum Rock earthquake. Early knowledge of the occurrence and size of postseismic slip from real-time or daily GPS position estimates could be very valuable to investigators planning a scientific response to an earthquake. Strategically planned field campaigns following an earthquake would

  5. California Integrated Seismic Network (CISN)

    NSDL National Science Digital Library

    The California Integrated Seismic Network (CISN) was formed to serve the emergency response, engineering, and scientific communities with a better understanding of earthquakes and their mitigation. After learning about CISN's mission and organizational layout, visitors can discover upcoming seminars and other events. At the Earthquake Info link, users can find earthquake maps, felt reports, and data. Visitors can view and download shake maps, which represent the ground shaking produced by an earthquake. The website offers the latest and archived earthquake-related news and alerts.

  6. Earthquake Plotting

    NSDL National Science Digital Library

    Mr. Perry

    2008-11-18

    Do earthquakes tend to happen in certain locations on Earth? Are there predictable patterns to where earthquakes will occur? The Earth is divided into large tectonic plates that move on a ductile layer of material in the mantle (the asthenosphere). Earthquakes tend to occur along the boundaries where these plates either collide with one another or try to slide one past the other. Today you will plot on a map the location of every earthquake with a magnitude greater than 4.0 within the past week to see if any patterns appear. You will need Dynamic Crust lab #3 (Earthquake Plotting) from your lab books and your Earth Science Reference Tables. Vocabulary: Use the following website to find definitions to the vocabulary terms in the lab. Geology Dictionary Procedures: Go to this site to find a list of "Latest Earthquakes Magnitude 2.5 or Greater in the United States ...
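
    The lab's selection step, keeping only the past week's events above magnitude 4.0, can be sketched with synthetic records (the tuple layout below is an assumption for illustration; the real lab reads the USGS latest-earthquakes listing):

```python
def quakes_to_plot(events, min_magnitude=4.0):
    """Keep only the (lat, lon, magnitude) records above the plotting threshold."""
    return [(lat, lon, mag) for lat, lon, mag in events if mag > min_magnitude]

# Synthetic sample: one sub-threshold event and two the students would plot.
sample = [(35.9, -120.4, 2.5), (40.3, -124.6, 4.1), (51.2, 178.9, 5.6)]
to_plot = quakes_to_plot(sample)  # keeps the 4.1 and 5.6 events
```

    Plotting the surviving latitude-longitude pairs on a world map is what reveals the plate-boundary pattern the lab asks about.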

  7. Earthquake Plotting

    NSDL National Science Digital Library

    Mr. Kio

    2008-12-06

    Do earthquakes tend to happen in certain locations on Earth? Are there predictable patterns to where earthquakes will occur? The Earth is divided into large tectonic plates that move on a ductile layer of material in the mantle (the asthenosphere). Earthquakes tend to occur along the boundaries where these plates either collide with one another or try to slide one past the other. Today you will plot on a map the location of every earthquake with a magnitude greater than 4.0 within the past week to see if any patterns appear. You will need Dynamic Crust lab #3 (Earthquake Plotting) from your lab books and your Earth Science Reference Tables. Vocabulary: Use the following website to find definitions to the vocabulary terms in the lab. Geology Dictionary Procedures: Go to this site to find a list of "Latest Earthquakes Magnitude 2.5 or Greater in the United States ...

  8. Forecasting the Rupture Directivity of Large Earthquakes

    NASA Astrophysics Data System (ADS)

    Donovan, J. R.; Jordan, T. H.

    2013-12-01

    Forecasting the rupture directivity of large earthquakes is an important problem in probabilistic seismic hazard analysis (PSHA), because directivity strongly influences ground motions. We cast this forecasting problem in terms of the conditional hypocenter distribution (CHD), defined to be the probability distribution of a hypocenter given the spatial distribution of fault slip (moment release). The simplest CHD is a uniform distribution for which the hypocenter probability density equals the moment-release probability density. We have compiled samples of CHDs from a global distribution of large earthquakes using three estimation methods: (a) location of hypocenters within the slip distribution from finite-fault inversions, (b) location of hypocenters within early aftershock distributions, and (c) direct inversion for the directivity parameter D, defined in terms of the degree-two polynomial moments of the source space-time function. The data from method (a) are statistically inconsistent with the uniform CHD suggested by McGuire et al. (2002) using method (c). Instead, the data indicate a 'centroid-biased' CHD, in which the expected distance between the hypocenter and the hypocentroid is less than that of a uniform CHD; i.e., the directivities inferred from finite-fault models appear to be closer to bilateral than predicted by the uniform CHD. One source of this discrepancy may be centroid bias in the second-order moments owing to poor localization of the slip in finite-fault inversions. We compare these observational results with CHDs computed from a large set of theoretical ruptures in the Southern California fault system produced by the Rate-State Quake simulator (RSQSim) of Dieterich and Richards-Dinger (2010) and discuss the implications for rupture dynamics and fault-zone heterogeneities.
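
    The centroid-bias test described above compares the expected hypocenter-to-hypocentroid distance against the value implied by the uniform CHD, for which the hypocenter density equals the normalized moment-release density. A 1-D along-strike sketch with synthetic slip values:

```python
import numpy as np

def expected_hypocentroid_distance(x, slip):
    """Expected |hypocenter - centroid| along strike when the hypocenter pdf
    equals the normalized slip (moment-release) density, i.e. the uniform CHD."""
    p = np.asarray(slip, float)
    p = p / p.sum()            # hypocenter pdf under the uniform CHD
    x = np.asarray(x, float)
    centroid = p @ x           # moment centroid along strike
    return p @ np.abs(x - centroid)

# Uniform slip over a 3-point fault vs. slip concentrated near the center:
flat = expected_hypocentroid_distance([0.0, 1.0, 2.0], [1.0, 1.0, 1.0])    # 2/3
peaked = expected_hypocentroid_distance([0.0, 1.0, 2.0], [0.2, 1.0, 0.2])  # 2/7
# A centroid-biased CHD predicts distances smaller than this uniform-CHD value.
```

    Finite-fault hypocenters falling systematically short of the uniform-CHD expectation is exactly the 'closer to bilateral than predicted' signal the abstract reports.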

  9. Basin structure beneath the Santa Rosa Plain, Northern California: Implications for damage caused by the 1969 Santa Rosa and 1906 San Francisco earthquakes

    USGS Publications Warehouse

    McPhee, D.K.; Langenheim, V.E.; Hartzell, S.; McLaughlin, R.J.; Aagaard, B.T.; Jachens, R.C.; McCabe, C.

    2007-01-01

    Regional gravity data in the northern San Francisco Bay region reflect a complex basin configuration beneath the Santa Rosa plain that likely contributed to the significant damage to the city of Santa Rosa caused by the 1969 M 5.6, 5.7 Santa Rosa earthquakes and the 1906 M 7.9 San Francisco earthquake. Inversion of these data indicates that the Santa Rosa plain is underlain by two sedimentary basins about 2 km deep separated by the Trenton Ridge, a shallow west-northwest-striking bedrock ridge west of Santa Rosa. The city of Santa Rosa is situated above the 2-km-wide protruding northeast corner of the southern basin where damage from both the 1969 and 1906 earthquakes was concentrated. Ground-motion simulations of the 1969 and 1906 earthquakes, two events with opposing azimuths, using the gravity-defined basin surface, show enhanced ground motions along the northeastern edge of this corner, suggesting that basin-edge effects contributed to the concentration of shaking damage in this area in the past and may also contribute to strong shaking during future earthquakes.

  10. Remotely triggered earthquakes following moderate main shocks

    USGS Publications Warehouse

    Hough, S.E.

    2007-01-01

    Since 1992, remotely triggered earthquakes have been identified following large (M > 7) earthquakes in California as well as in other regions. These events, which occur at much greater distances than classic aftershocks, occur predominantly in active geothermal or volcanic regions, leading to theories that the earthquakes are triggered when passing seismic waves cause disruptions in magmatic or other fluid systems. In this paper, I focus on observations of remotely triggered earthquakes following moderate main shocks in diverse tectonic settings. I summarize evidence that remotely triggered earthquakes occur commonly in mid-continent and collisional zones. This evidence is derived from analysis of both historic earthquake sequences and instrumentally recorded M5-6 earthquakes in eastern Canada. The latter analysis suggests that, while remotely triggered earthquakes do not occur pervasively following moderate earthquakes in eastern North America, a low level of triggering often does occur at distances beyond conventional aftershock zones. The inferred triggered events occur at the distances at which SmS waves are known to significantly increase ground motions. A similar result was found for 28 recent M5.3-7.1 earthquakes in California. In California, seismicity is found to increase on average to a distance of at least 200 km following moderate main shocks. This supports the conclusion that, even at distances of ~100 km, dynamic stress changes control the occurrence of triggered events. There are two explanations that can account for the occurrence of remotely triggered earthquakes in intraplate settings: (1) they occur at local zones of weakness, or (2) they occur in zones of local stress concentration. © 2007 The Geological Society of America.

  11. Processed seismic motion records from Desert Hot Springs, California earthquake of April 22, 1992, recorded at seismic stations in southern Nevada

    SciTech Connect

    Lum, P.K.; Honda, K.K.

    1993-04-01

As part of the contract with the US Department of Energy, Nevada Field Office (DOE/NV), URS/John A. Blume & Associates, Engineers (URS/Blume) maintains a network of seismographs in southern Nevada to monitor the ground motion generated by the underground nuclear explosions (UNEs) at the Nevada Test Site (NTS). The seismographs are located in the communities surrounding the NTS and the Las Vegas valley. When these seismographs are not used for monitoring UNE-generated motions, a limited number of seismographs are maintained for monitoring motion generated by sources other than UNEs (e.g., earthquakes, wind, blasts). During the subject earthquake of April 22, 1992, a total of 19 of these systems recorded the earthquake motions. This report contains the recorded data.

  12. State and local authorities in parts of the central U.S. that are at risk from earthquakes

    E-print Network

    Stein, Seth

    the earthquake resistance of new buildings to levels similar to those in southern California. Here, we argue by careful analysis. Because most earthquake-related deaths result from the collapse of buildings -- Should Memphis Build for California's Earthquakes? PAGES 177, 184-185 BY SETH STEIN, JOSEPH TOMASELLO, AND ANDREW

  13. Predicted liquefaction in the greater Oakland area and northern Santa Clara Valley during a repeat of the 1868 Hayward Fault (M6.7-7.0) earthquake

    USGS Publications Warehouse

    Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.

    2010-01-01

    Probabilities of surface manifestations of liquefaction due to a repeat of the 1868 (M6.7-7.0) earthquake on the southern segment of the Hayward Fault were calculated for two areas along the margin of San Francisco Bay, California: greater Oakland and the northern Santa Clara Valley. Liquefaction is predicted to be more common in the greater Oakland area than in the northern Santa Clara Valley owing to the presence of 57 km2 of susceptible sandy artificial fill. Most of the fills were placed into San Francisco Bay during the first half of the 20th century to build military bases, port facilities, and shoreline communities like Alameda and Bay Farm Island. Probabilities of liquefaction in the area underlain by this sandy artificial fill range from 0.2 to ~0.5 for a M7.0 earthquake, and decrease to 0.1 to ~0.4 for a M6.7 earthquake. In the greater Oakland area, liquefaction probabilities generally are less than 0.05 for Holocene alluvial fan deposits, which underlie most of the remaining flat-lying urban area. In the northern Santa Clara Valley for a M7.0 earthquake on the Hayward Fault and an assumed water-table depth of 1.5 m (the historically shallowest water level), liquefaction probabilities range from 0.1 to 0.2 along Coyote and Guadalupe Creeks, but are less than 0.05 elsewhere. For a M6.7 earthquake, probabilities are greater than 0.1 along Coyote Creek but decrease along Guadalupe Creek to less than 0.1. Areas with high probabilities in the Santa Clara Valley are underlain by young Holocene levee deposits along major drainages where liquefaction and lateral spreading occurred during large earthquakes in 1868 and 1906.

  14. Earthquake Hazards.

    ERIC Educational Resources Information Center

    Donovan, Neville

    1979-01-01

    Provides a survey and a review of earthquake activity and global tectonics from the advancement of the theory of continental drift to the present. Topics include: an identification of the major seismic regions of the earth, seismic measurement techniques, seismic design criteria for buildings, and the prediction of earthquakes. (BT)

  15. G141 Earthquakes & Volcanoes Lab 2 Plate Tectonics Name _____________________________ G141 Lab 2: Exploring Plate Motion and Deformation in California Using GPS Data

    E-print Network

    Polly, David

    G141 Earthquakes & Volcanoes Lab 2 - Plate Tectonics Name _____________________________ 1 G141 Lab of modern GPS technology, we can actually observe the process of plate tectonics as the plates are moving of that segment of the Earth's tectonic plate. We will be using data recorded by the UNAVCO consortium (www

  16. Revised earthquake hazard of the Hat Creek fault, northern California: A case example of a normal fault dissecting variable-age basaltic lavas

    E-print Network

    Kattenhorn, Simon

    have distinctive surface morphologies where they cut through near-surface lavas (Peacock and Parfitt). Figure 1: Terrain map of northern California

  17. The World-Wide Earthquake Locator

    NSDL National Science Digital Library

    Bruce Gittings

    The World-Wide Earthquake Locator provides up-to-date information and detailed dynamic maps of earthquakes across the world within a maximum of 24 hours of their occurrence. Features include a current earthquake page with reports listed in chronological order, a catalog query page where users can search the earthquake database and map the results, and an animated map that shows worldwide activity for the last thirty days. There is also an earthquake map viewer that lets users construct their own maps of activity for the week, the previous week, or the rest of the month, with extra data such as plate boundaries, faults and volcanoes. The earthquake prediction map provides an estimate of earthquake probability for any region on Earth, also with extra data layers if desired. There are also links to news articles and to additional information from related sites.

  18. Earthquakes: Los Angeles

    NSDL National Science Digital Library

    Although the San Andreas Fault is the longest and one of the most active fault zones in California, it is not responsible for every earthquake in the state. This video segment describes the geologic setting of the San Andreas fault and a network of other active faults, particularly thrust faults, closer to Los Angeles, and explains why these may present a greater danger to the city than the San Andreas Fault. The segment is five minutes fifteen seconds in length. A background essay and discussion questions are included.

  19. Complex faulting associated with the 22 December 2003 Mw 6.5 San Simeon California, earthquake, aftershocks and postseismic surface deformation

    USGS Publications Warehouse

    McLaren, M.K.; Hardebeck, J.L.; van der Elst, N.; Unruh, J.R.; Bawden, G.W.; Blair, J.L.

    2008-01-01

    We use data from two seismic networks and satellite interferometric synthetic aperture radar (InSAR) imagery to characterize the 22 December 2003 Mw 6.5 San Simeon earthquake sequence. Absolute locations for the mainshock and nearly 10,000 aftershocks were determined using a new three-dimensional (3D) seismic velocity model; relative locations were obtained using double difference. The mainshock location found using the 3D velocity model is 35.704° N, 121.096° W at a depth of 9.7 ± 0.7 km. The aftershocks concentrate at the northwest and southeast parts of the aftershock zone, between the mapped traces of the Oceanic and Nacimiento fault zones. The northwest end of the mainshock rupture, as defined by the aftershocks, projects from the mainshock hypocenter to the surface a few kilometers west of the mapped trace of the Oceanic fault, near the Santa Lucia Range front and the > 5 mm postseismic InSAR imagery contour. The Oceanic fault in this area, as mapped by Hall (1991), is therefore probably a second-order synthetic thrust or reverse fault that splays upward from the main seismogenic fault at depth. The southeast end of the rupture projects closer to the mapped Oceanic fault trace, suggesting much of the slip was along this fault, or at a minimum is accommodating much of the postseismic deformation. InSAR imagery shows ~72 mm of postseismic uplift in the vicinity of maximum coseismic slip in the central section of the rupture, and ~48 and ~45 mm at the northwest and southeast end of the aftershock zone, respectively. From these observations, we model a ~30-km-long northwest-trending northeast-dipping mainshock rupture surface - called the mainthrust - which is likely the Oceanic fault at depth, a ~10-km-long southwest-dipping backthrust parallel to the mainthrust near the hypocenter, several smaller southwest-dipping structures in the southeast, and perhaps additional northeast-dipping or subvertical structures southeast of the mainshock plane.
Discontinuous backthrust features opposite the mainthrust in the southeast part of the aftershock zone may offset the relic Nacimiento fault zone at depth. The InSAR data image surface deformation associated with both aseismic slip and aftershock production on the mainthrust and the backthrusts at the northwest and southeast ends of the aftershock zone. The well-defined mainthrust at the latitude of the epicenter and antithetic backthrust illuminated by the aftershock zone indicate uplift of the Santa Lucia Range as a popup block; aftershocks in the southeast part of the zone also indicate a popup block, but it is less well defined. The absence of backthrust features in the central part of the zone suggests range-front uplift by fault-propagation folding, or backthrusts in the central part were not activated during the mainshock.

  20. Stress-based aftershock forecasts made within 24 h postmain shock: Expected north San Francisco Bay area seismicity changes after the 2014 M = 6.0 West Napa earthquake

    NASA Astrophysics Data System (ADS)

    Parsons, Tom; Segou, Margaret; Sevilgen, Volkan; Milner, Kevin; Field, Edward; Toda, Shinji; Stein, Ross S.

    2014-12-01

    We calculate stress changes resulting from the M = 6.0 West Napa earthquake on north San Francisco Bay area faults. The earthquake ruptured within a series of long faults that pose significant hazard to the Bay area, and we are thus concerned with potential increases in the probability of a large earthquake through stress transfer. We conduct this exercise as a prospective test because the skill of stress-based aftershock forecasting methodology is inconclusive. We apply three methods: (1) generalized mapping of regional Coulomb stress change, (2) stress changes resolved on Uniform California Earthquake Rupture Forecast faults, and (3) a mapped rate/state aftershock forecast. All calculations were completed within 24 h after the main shock and were made without benefit of known aftershocks, which will be used to evaluate the prospective forecast. All methods suggest that we should expect heightened seismicity on parts of the southern Rodgers Creek, northern Hayward, and Green Valley faults.
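
    Methods (1) and (2) both rest on the Coulomb failure criterion: a receiver fault moves toward failure when dCFF = d_tau + mu' * d_sigma_n is positive. A minimal sketch follows; the stress values, the effective friction of 0.4, and the ~0.01 MPa (0.1 bar) triggering threshold are illustrative assumptions, not results from this study, and real inputs would come from an elastic dislocation model of the West Napa rupture:

    ```python
    def dcff(d_tau, d_sigma_n, mu_eff=0.4):
        """Coulomb failure stress change (MPa) on a receiver fault:
        dCFF = d_tau + mu' * d_sigma_n, with unclamping (tension) positive.
        Positive dCFF moves the receiver fault toward failure."""
        return d_tau + mu_eff * d_sigma_n

    # Hypothetical shear/normal stress changes resolved on three receiver
    # faults; real values would come from a dislocation model of the rupture.
    receivers = {
        "southern Rodgers Creek": (0.030, 0.020),
        "northern Hayward": (0.020, -0.010),
        "Green Valley": (0.015, 0.010),
    }
    loaded = {name: dcff(t, s) for name, (t, s) in receivers.items()}
    # Flag receivers exceeding a ~0.01 MPa change, a commonly cited
    # plausible triggering threshold.
    flagged = [name for name, v in loaded.items() if v >= 0.01]
    ```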

  1. Earthquake Effects and Experiences

    NSDL National Science Digital Library

    This portion of the United States Geological Survey's (USGS) frequently-asked-questions feature on earthquakes addresses what individuals might actually experience during an earthquake. Topics include earthquake motion (rolling or shaking), earthquake effects (ground shaking, surface faulting, ground failure, etc.), earthquake magnitude, what an earthquake feels like, and others. There are also links to additional resources on earthquake effects and experiences.

  2. Earthquake and volcano hazard notices: An economic evaluation of changes in risk perceptions

    USGS Publications Warehouse

    Bernknopf, R.L.; Brookshire, D.S.; Thayer, M.A.

    1990-01-01

    Earthquake and volcano hazard notices were issued for the Mammoth Lakes, California area by the U.S. Geological Survey under the authority granted by the Disaster Relief Act of 1974. The effects on investment, recreation visitation, and risk perceptions are explored. The hazard notices did not affect recreation visitation, although investment was affected. A perceived loss in the market value of homes was documented. Risk perceptions were altered for property owners. Communication of the probability of an event over time would enhance hazard notices as a policy instrument and would mitigate unnecessary market perturbations. © 1990.

  3. A probabilistic neural network for earthquake magnitude prediction.

    PubMed

    Adeli, Hojjat; Panakkat, Ashif

    2009-09-01

    A probabilistic neural network (PNN) is presented for predicting the magnitude of the largest earthquake in a pre-defined future time period in a seismic region using eight mathematically computed parameters known as seismicity indicators. The indicators considered are the time elapsed during a particular number (n) of significant seismic events before the month in question, the slope of the Gutenberg-Richter inverse power law curve for the n events, the mean square deviation about the regression line based on the Gutenberg-Richter inverse power law for the n events, the average magnitude of the last n events, the difference between the observed maximum magnitude among the last n events and that expected through the Gutenberg-Richter relationship known as the magnitude deficit, the rate of square root of seismic energy released during the n events, the mean time or period between characteristic events, and the coefficient of variation of the mean time. Prediction accuracies of the model are evaluated using three different statistical measures: the probability of detection, the false alarm ratio, and the true skill score or R score. The PNN model is trained and tested using data for the Southern California region. The model yields good prediction accuracies for earthquakes of magnitude between 4.5 and 6.0. The PNN model presented in this paper complements the recurrent neural network model developed by the authors previously, where good results were reported for predicting earthquakes with magnitude greater than 6.0. PMID:19502005
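
    Two of the eight seismicity indicators, the Gutenberg-Richter slope b and the magnitude deficit, can be computed directly from a magnitude catalog. A hedged sketch follows; the binning width and the least-squares fit are our choices, and the authors' exact definitions may differ:

    ```python
    import numpy as np

    def gr_indicators(mags, dm=0.1):
        """Fit the Gutenberg-Richter law log10 N(>=M) = a - b*M by least
        squares over the event magnitudes, and return a, b, and the
        magnitude deficit (expected maximum minus observed maximum)."""
        mags = np.asarray(mags, dtype=float)
        bins = np.arange(mags.min(), mags.max(), dm)     # threshold magnitudes M
        counts = np.array([(mags >= m).sum() for m in bins])
        slope, a = np.polyfit(bins, np.log10(counts), 1)
        b = -slope                                       # GR convention
        deficit = a / b - mags.max()                     # M where N(>=M) = 1, minus observed max
        return a, b, deficit
    ```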

  4. Identified EM Earthquake Precursors

    NASA Astrophysics Data System (ADS)

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list alluding to a transmission related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous, high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After a number of custom rock experiments, two hypotheses were formed which could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate in creating strong EM fields, because rock has a very high electrical resistance making it a high quality insulator. Penetrative flow could not be corroborated as well, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating, long-wave generation with sufficient amplitude, and the capability of penetrating solid rock. Additionally, fracture spaces, either air or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field occurring with associated phases using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011, and outside the NE Texas town of Timpson in February, 2013. 
The antennae have mobility and observations were noted for recurrence, duration, and frequency response. At the Southern California field sites, one loop antenna was positioned for omni-directional reception and also detected a strong First Schumann Resonance; however, additional Schumann Resonances were absent. At the Timpson, TX field sites, loop antennae were positioned for directional reception, due to earthquake-induced, hydraulic fracturing activity currently conducted by the oil and gas industry. Two strong signals, one moderately strong signal, and approximately 6-8 weaker signals were detected in the immediate vicinity. The three stronger signals were mapped by a biangulation technique, followed by a triangulation technique for confirmation. This was the first antenna mapping technique ever performed for determining possible earthquake epicenters. Six and a half months later, Timpson experienced two M4 (M4.1 and M4.3) earthquakes on September 2, 2013 followed by a M2.4 earthquake three days later, all occurring at a depth of five kilometers. The Timpson earthquake activity now has a cyclical rate and a forecast was given to the proper authorities. As a result, the Southern California and Timpson, TX field results led to an improved design and construction of a third prototype antenna. With a loop antenna array, a viable communication system, and continuous monitoring, a full fracture cycle can be established and observed in real-time. In addition, field data could be reviewed quickly for assessment and lead to a much more improved earthquake forecasting capability. The EM precursors determined by this method appear to surpass all prior precursor claims, and the general public will finally receive long overdue forecasting.

  5. Tidal and surface wave triggering of earthquakes at injection and geothermal sites across the United States

    NASA Astrophysics Data System (ADS)

    Cooper, S.; Thomas, A.; Krogstad, R. D.

    2014-12-01

    We searched for evidence of tidal and surface wave triggering of earthquakes at thirteen different well injection sites around the United States. We assembled earthquake catalogs in regions surrounding thirteen injection/geothermal sites, seven of which are in California, one in Oregon, and five in the Midwestern United States. After temporally declustering the catalogs, we tested each site for evidence of tidal triggering by applying the Schuster test at periods between 10.8 and 26.4 hours. We defined "triggering" at a given period as the time when the Schuster probability exceeds 99% confidence. In general, there was little evidence for tidal triggering of earthquakes at the injection/geothermal sites we investigated. The one exception was seismicity at the Geysers, CA, which was triggered at the M2 semidiurnal lunar period. Many sites also had amplitudes that exceeded 99% confidence at 24-hour periods; however, since these sites had no evidence of triggering at the larger M2 tidal period, we consider this to be an anthropogenic effect. Next we searched for triggering by Mw 7 or above global earthquakes and Mw 4-7 regional earthquakes. We quantified the influence of the regional and global earthquakes on seismicity at each injection/geothermal site using the beta statistic, which is a measure of the change in seismicity rate before and after the triggering event occurs. In general, we found evidence for triggering by regional/global earthquakes at most sites. We also explored the relationship of activation to other site variables such as wellhead pressure, monthly injection, monthly production, and cumulative injection/production volumes when available. While the beta statistic does correlate with obvious variables such as the distance between the site and the triggering event, it doesn't appear to correlate with the aforementioned site-specific variables.
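
    The Schuster test applied above has a compact closed form: reduce the n event times to phases at the candidate period, form the vector sum of unit phasors, and compute p = exp(-D²/n), the probability that the observed phase alignment arises by chance. A minimal sketch follows; the declustering step and the sweep over periods are omitted, and the function name is ours:

    ```python
    import numpy as np

    def schuster_p(event_times_hours, period_hours):
        """Schuster test: probability that the observed clustering of event
        phases at the given period could arise by chance. Small values
        (e.g. < 0.01, i.e. > 99% confidence) suggest tidal triggering."""
        t = np.asarray(event_times_hours, dtype=float)
        phases = 2.0 * np.pi * (t % period_hours) / period_hours
        n = len(phases)
        # Squared length of the vector sum of unit phasors: large when phases cluster.
        d2 = np.sum(np.cos(phases)) ** 2 + np.sum(np.sin(phases)) ** 2
        return float(np.exp(-d2 / n))
    ```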

  6. Earthquake Prediction

    NSDL National Science Digital Library

    Earthquake prediction has never been an exact science or an easy job. In 1923, the debate between two Japanese seismologists, Akitsune Imamura and his superior at the University of Tokyo, Professor Omori, over whether a great earthquake was imminent, ended in tragedy as Omori prevailed and no preparations were made for the disaster. In this video segment, a contemporary seismologist tells the story of these two pioneers and describes the events of the Kanto Earthquake, in which 140,000 people were killed. The segment is two minutes fifty-seven seconds in length. A background essay and discussion questions are included.

  7. Spatial Detect Technology Applied on Earthquake's Impending Forecast

    Microsoft Academic Search

    Guo Ziqi; Hu Guiwen; Qian Shuqing

    2001-01-01

    Impending forecasting of strong earthquakes is still an unsolved scientific question worldwide. Based on our analysis of the disadvantages of present earthquake forecasting, this article discusses the possibility of using Spatial Detect Technology in earthquake forecasting, and summarizes the status, extent, and some unanswered questions of those new spatial detection techniques and methods. We can surely believe

  8. Diurnal changes of earthquake activity and geomagnetic Sq-variations

    Microsoft Academic Search

    G. Duma; Y. Ruzhin

    2003-01-01

    Statistical analyses demonstrate that the probability of earthquake occurrence in many earthquake regions strongly depends on the time of day, that is, on Local Time (e.g. Conrad, 1909, 1932; Shimshoni, 1971; Duma, 1997; Duma and Vilardo, 1998). This also applies to strong earthquake activity. Moreover, recent observations reveal an involvement of the regular diurnal variations of the Earth's magnetic field,

  9. The physics of an earthquake

    NASA Astrophysics Data System (ADS)

    McCloskey, John

    2008-03-01

    The Sumatra-Andaman earthquake of 26 December 2004 (Boxing Day 2004) and its tsunami will endure in our memories as one of the worst natural disasters of our time. For geophysicists, the scale of the devastation and the likelihood of another equally destructive earthquake set out a series of challenges of how we might use science not only to understand the earthquake and its aftermath but also to help in planning for future earthquakes in the region. In this article a brief account of these efforts is presented. Earthquake prediction is probably impossible, but earth scientists are now able to identify particularly dangerous places for future events by developing an understanding of the physics of stress interaction. Having identified such a dangerous area, a series of numerical Monte Carlo simulations is described which allow us to get an idea of what the most likely consequences of a future earthquake are by modelling the tsunami generated by lots of possible, individually unpredictable, future events. As this article was being written, another earthquake occurred in the region, which had many expected characteristics but was enigmatic in other ways. This has spawned a series of further theories which will contribute to our understanding of this extremely complex problem.

  10. The Distribution of Earthquakes: Landers Shakes Things Up

    NSDL National Science Digital Library

    In this activity, learners will investigate the seismicity in southern California around the time of the 1992 Landers, California earthquake, which at magnitude 7.3 was the largest earthquake to strike the region in 40 years. Scientific studies into the circumstances surrounding these events and their connection to the Landers earthquake were carried out, primarily because of the hazard that 'triggered' earthquakes (if real) could pose. The activity investigates whether the Landers earthquake of 1992 triggered earthquakes well outside of its 'aftershock zone'. It provides a walkthrough of the creation of a blink-comparison figure, and asks the learner to answer questions based upon their observations of the figure and their knowledge of how it was created.

  11. Late Pleistocene-Holocene Faulting History Along the Northern El Carrizal Fault, Baja California Sur, Mexico: Earthquake Recurrence at a Persistently Active Rifted Margin

    Microsoft Academic Search

    S. J. Maloney; P. J. Umhoefer; J. R. Arrowsmith; G. M. Gutiérrez; A. U. Santillanez; T. R. Rittenour

    2007-01-01

    The El Carrizal fault is a NW striking, east dipping normal fault located 25 km west of the city of La Paz, Baja California Sur, Mexico and is the westernmost bounding fault of the gulf-margin system at this latitude. The fault is ~70 km long onshore and ~50 km long offshore to the north in La Paz Bay. As many

  12. New Constraints on Deformation, Slip Rate, and Timing of the Most Recent Earthquake on the West Tahoe-Dollar Point Fault, Lake Tahoe Basin, California

    Microsoft Academic Search

    Daniel S. Brothers; Graham M. Kent; Neal W. Driscoll; Shane B. Smith; Robert Karlin; Jeffrey A. Dingler; Alistair J. Harding; Gordon G. Seitz; Jeffrey M. Babcock

    2009-01-01

    High-resolution seismic compressed high-intensity radar pulse (CHIRP) data and piston cores acquired in Fallen Leaf Lake (FLL) and Lake Tahoe provide new paleoseismic constraints on the West Tahoe-Dollar Point fault (WTDPF), the westernmost normal fault in the Lake Tahoe Basin, California. Paleoearthquake records along three sections of the WTDPF are investigated to determine the magnitude and recency of

  13. CyberShake: A Physics-Based Seismic Hazard Model for Southern California ROBERT GRAVES,1

    E-print Network

    Okaya, David

    California Earthquake Rupture Forecast, Version 2.0 (UCERF2.0) and identify all ruptures within 200 km-term earthquake rupture forecasts based on the CFM are now available (FIELD et al., 2009). These developments have--CyberShake, as part of the Southern California Earthquake Center's (SCEC) Community Modeling Environment

  14. A preliminary assessment of earthquake ground shaking hazard at Yucca Mountain, Nevada and implications to the Las Vegas region

    SciTech Connect

    Wong, I.G.; Green, R.K.; Sun, J.I. [Woodward-Clyde Federal Services, Oakland, CA (United States); Pezzopane, S.K. [Geological Survey, Denver, CO (United States); Abrahamson, N.A. [Abrahamson (Norm A.), Piedmont, CA (United States); Quittmeyer, R.C. [Woodward-Clyde Federal Services, Las Vegas, NV (United States)

    1996-12-31

    As part of early design studies for the potential Yucca Mountain nuclear waste repository, the authors have performed a preliminary probabilistic seismic hazard analysis of ground shaking. A total of 88 Quaternary faults within 100 km of the site were considered in the hazard analysis. They were characterized in terms of their probability of being seismogenic, and their geometry, maximum earthquake magnitude, recurrence model, and slip rate. Individual faults were characterized by maximum earthquakes that ranged from moment magnitude (M{sub w}) 5.1 to 7.6. Fault slip rates ranged from a very low 0.00001 mm/yr to as much as 4 mm/yr. An areal source zone representing background earthquakes up to M{sub w} 6 1/4 was also included in the analysis. Recurrence for these background events was based on the 1904--1994 historical record, which contains events up to M{sub w} 5.6. Based on this analysis, the peak horizontal rock accelerations are 0.16, 0.21, 0.28, and 0.50 g for return periods of 500, 1,000, 2,000, and 10,000 years, respectively. In general, the dominant contributors to the ground shaking hazard at Yucca Mountain are background earthquakes because of the low slip rates of the Basin and Range faults. A significant effect on the probabilistic ground motions is due to the inclusion of a new attenuation relation developed specifically for earthquakes in extensional tectonic regimes. This relation gives significantly lower peak accelerations than five other predominantly California-based relations used in the analysis, possibly due to the lower stress drops of extensional earthquakes compared to California events. Because Las Vegas is located within the same tectonic regime as Yucca Mountain, the seismic sources and path and site factors affecting the seismic hazard at Yucca Mountain also have implications to Las Vegas. These implications are discussed in this paper.

  15. Paleoseismic investigations in the Santa Cruz mountains, California: Implications for recurrence of large-magnitude earthquakes on the San Andreas Fault

    NASA Astrophysics Data System (ADS)

    Schwartz, D. P.; Pantosti, D.; Okumura, K.; Powers, T. J.; Hamilton, J. C.

    1998-08-01

    Trenching, microgeomorphic mapping, and tree ring analysis provide information on timing of paleoearthquakes and behavior of the San Andreas fault in the Santa Cruz mountains. At the Grizzly Flat site alluvial units dated at 1640-1659 A.D., 1679-1894 A.D., 1668-1893 A.D., and the present ground surface are displaced by a single event. This was the 1906 surface rupture. Combined trench dates and tree ring analysis suggest that the penultimate event occurred in the mid-1600s, possibly in an interval as narrow as 1632-1659 A.D. There is no direct evidence in the trenches for the 1838 or 1865 earthquakes, which have been proposed as occurring on this part of the fault zone. In a minimum time of about 340 years only one large surface faulting event (1906) occurred at Grizzly Flat, in contrast to previous recurrence estimates of 95-110 years for the Santa Cruz mountains segment. Comparison with dates of the penultimate San Andreas earthquake at sites north of San Francisco suggests that the San Andreas fault between Point Arena and the Santa Cruz mountains may have failed either as a sequence of closely timed earthquakes on adjacent segments or as a single long rupture similar in length to the 1906 rupture around the mid-1600s. The 1906 coseismic geodetic slip and the late Holocene geologic slip rate on the San Francisco peninsula and southward are about 50-70% and 70% of their values north of San Francisco, respectively. The slip gradient along the 1906 rupture section of the San Andreas reflects partitioning of plate boundary slip onto the San Gregorio, Sargent, and other faults south of the Golden Gate. If a mid-1600s event ruptured the same section of the fault that failed in 1906, it supports the concept that long strike-slip faults can contain master rupture segments that repeat in both length and slip distribution.
Recognition of a persistent slip rate gradient along the northern San Andreas fault and the concept of a master segment remove the requirement that lower slip sections of large events such as 1906 must fill in on a periodic basis with smaller and more frequent earthquakes.

  16. Landslides and ridge-top failures associated with the epicentral area of the Loma Prieta earthquake of October 17, 1989 Santa Cruz County, California

    SciTech Connect

    Spittler, T.E.; Sydnor, R.H.

    1990-01-01

    Extensive landslides and ridge-top failures occurred in the epicentral area of the Loma Prieta earthquake. These failures have been subdivided into four categories: (1) small rockfalls, dry debris flows, minor slumps, wedge-failure landslides along highway cut-slopes, and sea cliff failures; (2) ridge-top fractures dominated by tensional separation; (3) crown scarps of incipient landslides on the axes of steeply-plunging, steep-sided spur ridges; and (4) remobilized portions of existing large-scale rotational landslide complexes. These failures are described in detail.

  17. Earthquake! An Example of How to Develop Reading Skills Using a Topic of Current Interest.

    ERIC Educational Resources Information Center

    Montori, Laura; Lally, Julia

    A topic of student interest, earthquakes, is used as a vehicle for teaching reading and research skills in a California junior high school. Students develop geography skills by labeling fault lines on maps of the Pacific Basin, California, and San Francisco; develop their vocabulary by preparing a list of words about earthquakes; and practice word…

  18. Earthquake Exhibit (title provided or enhanced by cataloger)

    NSDL National Science Digital Library

    This on-line exhibit provides an in-depth look at earthquakes. It covers the technology and history of seismographs, plate boundaries and tectonics, faults, seismic waves and their uses, the Richter and Mercalli scales, and details about California earthquakes.

  19. 1920s prediction reveals some pitfalls of earthquake forecasting

    Microsoft Academic Search

    Carl-Henry Geschwind

    1997-01-01

    A number of seismologists are pursuing the possibility of making scientifically credible earthquake forecasts or predictions. Others, however, are concerned about how the public might react to such forecasts [National Research Council, 1978; Geller, 1997]. In the 1920s, the president of the Seismological Society of America publicly predicted that southern California would suffer a severe earthquake within the next ten

  20. Hypothesis testing and earthquake prediction.

    PubMed

    Jackson, D D

    1996-04-30

    Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"--that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earth-quakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions. PMID:11607663
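The first of Jackson's tests, comparing the number of actual earthquakes to the number predicted, is typically carried out under a Poisson assumption on the count. A minimal sketch of that idea (the function name and the choice of reporting both tails are illustrative, not taken from the abstract):

```python
import math

def poisson_n_test(n_observed, n_predicted):
    """Tail probabilities of the observed earthquake count under a
    Poisson model whose mean is the forecast rate times the test period.
    Small values flag under- or over-prediction of the rate."""
    pmf = [math.exp(-n_predicted) * n_predicted ** k / math.factorial(k)
           for k in range(n_observed + 1)]
    p_too_few = sum(pmf)               # P(N <= n_observed)
    p_too_many = 1.0 - sum(pmf[:-1])   # P(N >= n_observed)
    return p_too_few, p_too_many

# Example: 2 events observed where a forecast predicted 6 over the period.
p_low, p_high = poisson_n_test(2, 6.0)
```

A very small value of either tail would count against the forecast; the likelihood-ratio test (iii) would additionally compare scores against the null hypothesis.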

  1. Proceedings of the 8th U.S. National Conference on Earthquake Engineering

    E-print Network

    Komatitsch, Dimitri

    ... Francisco, California, USA ... IMPACT OF A LARGE SAN ANDREAS FAULT EARTHQUAKE ON TALL BUILDINGS IN SOUTHERN ... the Northridge earthquake, we determine the damage in 18-story steel moment-frame buildings in southern ... 1998; Petak and Elahi 2000). This earthquake exposed the vulnerability of steel moment-resisting frame

  2. Simulation of strong earthquake motion by explosions --experiments at the Lyaur testing range in Tajikistan

    E-print Network

    Southern California, University of

    of earthquake resistant design codes and for providing new independent data for development of improved methods of analysis and design. Since the 1933 Long Beach, California, earthquake, many strong motion recordings have ... Simulation of strong earthquake motion by explosions -- experiments at the Lyaur testing range

  3. Simulated Performance of Steel Moment-Resisting Frame Buildings in the 2003 Tokachi-oki Earthquake

    E-print Network

    Greer, Julia R.

    Simulated Performance of Steel Moment-Resisting Frame Buildings in the 2003 Tokachi-oki Earthquake Thomas Heaton, Jing Yang and John Hall Earthquake Engineering Research Laboratory, California Institute ... 1994 UBC) for ground motions recorded in the 2003 Tokachi-oki earthquake. We consider buildings

  4. SLIP ALONG THE SAN ANDREAS FAULT ASSOCIATED WITH THE EARTHQUAKE1

    E-print Network

    Tai, Yu-Chong

    SLIP ALONG THE SAN ANDREAS FAULT ASSOCIATED WITH THE EARTHQUAKE1 By KERRY E. SIEH, CALIFORNIA ... Imperial Valley earthquake occurred along other than the Imperial fault and the Brawley fault zone. More ... developed a discontinuous set of surficial fractures soon after the earthquake. This set of fractures

  5. After a damaging earthquake, emergency managers must quickly find answers to impor-

    E-print Network

    Torgersen, Christian

    After a damaging earthquake, emergency managers must quickly find answers to important questions ... shaking following an earthquake. ShakeMaps can be used for emergency response, loss estimation ... created for earthquakes in southern California for the last several years as part of the TriNet Project

  6. Seismic Intensity Estimation of Tall Buildings in Earthquake Early Warning System

    E-print Network

    Greer, Julia R.

    prediction equation (GMPE) that predicts response spectral amplitude from knowledge of earthquake magnitude ... Seismic Intensity Estimation of Tall Buildings in Earthquake Early Warning System M. H. Cheng & T. W. Graves U.S. Geological Survey, USA SUMMARY: In California, United States, an earthquake early

  7. Practical problems in testing earthquake forecasts

    NASA Astrophysics Data System (ADS)

    Kagan, Y. Y.; Jackson, D. D.

    2005-12-01

    In principle, earthquake forecasts can be stated as probabilities, probability densities, or rate densities, and their predictive value can be assessed using likelihood scores, receiver operating characteristics, or other statistical methods. However, earthquake behavior and the limitations of earthquake recording pose some practical problems. 1. Some users need long-term forecasts, decades or more, for planning construction and for making long-term decisions. But earthquakes are strongly clustered, and one big earthquake can change the prospects of others in minutes. In other words, the probabilities are highly conditional and the conditions change rapidly. Some possible approaches include using simulation of earthquake sequences to make more realistic forecasts, and testing only the largest earthquake in pre-assigned zones. 2. Prospective testing of fully specified forecasts is the gold standard, but many forecasts involve large earthquakes, and who wants to wait? Thus, retrospective testing will be inevitable. That implies posterior data selection, and complicates or even prevents valid statistical testing. 3. Earthquake data vary in quality, and even now the uncertainties in earthquake location, magnitude, and completeness are highly time dependent. Following a large earthquake, many moderate ones may still be missed because their waves overlap. Statistical tests must allow for data errors, so rather sophisticated error models are needed. So far, we don't have them. 4. Hypotheses that predict only a few events are unlikely to lead to definitive tests. To reject either of two typical, distinct hypotheses (e.g., fault-based vs. seismicity-based) may require that the expected number of events under one or both hypotheses be about 10.

  8. Medium-term Earthquake Forecasting with Numerical Earthquake Simulators: A Feasibility Study with a Comparison to the WGCEP

    Microsoft Academic Search

    J. B. Rundle; J. van Aalsburg; G. Morein; D. L. Turcotte; L. Grant-Ludwig; A. Donnellan; K. F. Tiampo; W. Klein

    2008-01-01

    Topologically realistic earthquake simulations are now possible using numerical codes such as Virtual California (VC). Currently, VC is written in modern object-oriented C++ code, and runs under MPI-II protocols on parallel HPC machines such as the NASA Columbia supercomputer. In VC, an earthquake fault system is modeled by a large number of Boundary Elements interacting by means of linear elasticity.

  9. Effects of topographic position and geology on shaking damage to residential wood-framed structures during the 2003 San Simeon earthquake, western San Luis obispo county, California

    USGS Publications Warehouse

    McCrink, T.P.; Wills, C.J.; Real, C.R.; Manson, M.W.

    2010-01-01

    A statistical evaluation of shaking damage to wood-framed houses caused by the 2003 M6.5 San Simeon earthquake indicates that both the rate and severity of damage, independent of structure type, are significantly greater on hilltops compared to hill slopes when underlain by Cretaceous or Tertiary sedimentary rocks. This increase in damage is interpreted to be the result of topographic amplification. An increase in the damage rate is found for all structures built on Plio-Pleistocene rocks independent of topographic position, and this is interpreted to be the result of amplified shaking caused by geologic site response. Damage rate and severity to houses built on Tertiary rocks suggest that amplification due to both topographic position and geologic site response may be occurring in these rocks, but effects from other topographic parameters cannot be ruled out. For all geologic and topographic conditions, houses with raised foundations are more frequently damaged than those with slab foundations. However, the severity of damage to houses on raised foundations is only significantly greater for those on hill slopes underlain by Tertiary rocks. Structures with some damage-resistant characteristics experienced greater damage severity on hilltops, suggesting a spectral response to topographic amplification. © 2010, Earthquake Engineering Research Institute.

  10. Paleoseismology and Global Positioning System: Earthquake-cycle effects and geodetic versus geologic fault slip rates in the Eastern California shear zone

    Microsoft Academic Search

    Timothy H. Dixon; E. Norabuena; L. Hotaling

    2003-01-01

    Published slip rates for the Owens Valley fault zone in eastern California based on geodetic data and elastic half-space models (5-7 mm/yr) are faster than longer term geologic rates (2-3 mm/yr). We use Global Positioning System data spanning the central Owens Valley, a more realistic rheological model with an elastic upper crust over a viscoelastic lower crust and

  11. Neural network models for earthquake magnitude prediction using multiple seismicity indicators.

    PubMed

    Panakkat, Ashif; Adeli, Hojjat

    2007-02-01

    Neural networks are investigated for predicting the magnitude of the largest seismic event in the following month based on the analysis of eight mathematically computed parameters known as seismicity indicators. The indicators are selected based on the Gutenberg-Richter and characteristic earthquake magnitude distribution and also on the conclusions drawn by recent earthquake prediction studies. Since there is no known established mathematical or even empirical relationship between these indicators and the location and magnitude of a succeeding earthquake in a particular time window, the problem is modeled using three different neural networks: a feed-forward Levenberg-Marquardt backpropagation (LMBP) neural network, a recurrent neural network, and a radial basis function (RBF) neural network. Prediction accuracies of the models are evaluated using four different statistical measures: the probability of detection, the false alarm ratio, the frequency bias, and the true skill score or R score. The models are trained and tested using data for two seismically different regions: Southern California and the San Francisco Bay region. Overall, the recurrent neural network model yields the best prediction accuracies compared with LMBP and RBF networks. While at present earthquake prediction cannot be made with a high degree of certainty, this research provides a scientific approach for evaluating the short-term seismic hazard potential of a region. PMID:17393560
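The four evaluation measures named above are standard categorical-forecast statistics computed from a 2x2 contingency table of forecast versus observed events. A sketch under that assumption (the example counts below are invented, not from the study):

```python
def forecast_skill(hits, false_alarms, misses, correct_negatives):
    """Categorical skill statistics for a yes/no earthquake forecast,
    derived from a 2x2 contingency table of forecast vs. observed."""
    pod = hits / (hits + misses)                    # probability of detection
    far = false_alarms / (hits + false_alarms)      # false alarm ratio
    bias = (hits + false_alarms) / (hits + misses)  # frequency bias
    pofd = false_alarms / (false_alarms + correct_negatives)
    tss = pod - pofd                                # true skill score (R score)
    return {"POD": pod, "FAR": far, "bias": bias, "TSS": tss}

# Invented example: 8 hits, 4 false alarms, 2 misses, 86 correct negatives.
scores = forecast_skill(8, 4, 2, 86)
```

A TSS near 1 indicates perfect discrimination; a TSS of 0 indicates no skill beyond chance.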

  12. Toward a Global Model for Predicting Earthquake-Induced Landslides in Near-Real Time

    NASA Astrophysics Data System (ADS)

    Nowicki, M. A.; Wald, D. J.; Hamburger, M. W.; Hearne, M.; Thompson, E.

    2013-12-01

    We present a newly developed statistical model for estimating the distribution of earthquake-triggered landslides in near-real time, which is designed for use in the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) and ShakeCast systems. We use standardized estimates of ground shaking from the USGS ShakeMap Atlas 2.0 to develop an empirical landslide probability model by combining shaking estimates with broadly available landslide susceptibility proxies, including topographic slope, surface geology, and climatic parameters. While the initial model was based on four earthquakes for which digitally mapped landslide inventories and well-constrained ShakeMaps are available -- the Guatemala (1976), Northridge, California (1994), Chi-Chi, Taiwan (1999), and Wenchuan, China (2008) earthquakes -- our improved model includes observations from approximately ten other events from a variety of tectonic and geomorphic settings for which we have obtained landslide inventories. Using logistic regression, this database is used to build a predictive model of the probability of landslide occurrence. We assess the performance of the regression model using statistical goodness-of-fit metrics to determine which combination of the tested landslide proxies provides the optimum prediction of observed landslides while minimizing 'false alarms' in non-landslide zones. Our initial results indicate strong correlations with peak ground acceleration and maximum slope, and weaker correlations with surface geological and soil wetness proxies. In terms of the original four events included, the global model predicts landslides most accurately when applied to the Wenchuan and Chi-Chi events, and less accurately when applied to the Northridge and Guatemala datasets.
Combined with near-real time ShakeMaps, the model can be used to make generalized predictions of whether or not landslides are likely to occur (and if so, where) for future earthquakes around the globe, and these estimates will be included in the loss estimates of the PAGER system after further development.
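A logistic regression of this kind maps shaking and susceptibility proxies onto a single probability. The sketch below is illustrative only: the coefficient values and the exact covariate set are placeholders, not the fitted PAGER model.

```python
import math

def landslide_probability(pga_g, slope_deg, wetness, coef=None):
    """Logistic model P = 1 / (1 + exp(-z)) combining a shaking estimate
    (PGA in g) with susceptibility proxies (slope in degrees, a 0-1
    wetness index).  Coefficients are illustrative placeholders, NOT
    the fitted PAGER values."""
    if coef is None:
        coef = {"b0": -6.0, "pga": 8.0, "slope": 0.08, "wet": 1.5}
    z = (coef["b0"] + coef["pga"] * pga_g
         + coef["slope"] * slope_deg + coef["wet"] * wetness)
    return 1.0 / (1.0 + math.exp(-z))
```

Evaluating the function on a grid of ShakeMap PGA values would produce the kind of near-real-time landslide probability map the abstract describes.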

  13. A method for producing digital probabilistic seismic landslide hazard maps; an example from the Los Angeles, California, area

    USGS Publications Warehouse

    Jibson, Randall W.; Harp, Edwin L.; Michael, John A.

    1998-01-01

    The 1994 Northridge, California, earthquake is the first earthquake for which we have all of the data sets needed to conduct a rigorous regional analysis of seismic slope instability. These data sets include (1) a comprehensive inventory of triggered landslides, (2) about 200 strong-motion records of the mainshock, (3) 1:24,000-scale geologic mapping of the region, (4) extensive data on engineering properties of geologic units, and (5) high-resolution digital elevation models of the topography. All of these data sets have been digitized and rasterized at 10-m grid spacing in the ARC/INFO GIS platform. Combining these data sets in a dynamic model based on Newmark's permanent-deformation (sliding-block) analysis yields estimates of coseismic landslide displacement in each grid cell from the Northridge earthquake. The modeled displacements are then compared with the digital inventory of landslides triggered by the Northridge earthquake to construct a probability curve relating predicted displacement to probability of failure. This probability function can be applied to predict and map the spatial variability in failure probability in any ground-shaking conditions of interest. We anticipate that this mapping procedure will be used to construct seismic landslide hazard maps that will assist in emergency preparedness planning and in making rational decisions regarding development and construction in areas susceptible to seismic slope failure.
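Probability curves relating modeled Newmark displacement to observed failure are often fit with a saturating Weibull-type function. A minimal sketch (pmax, a, and b below are illustrative values, not the published Northridge calibration):

```python
import math

def failure_probability(disp_cm, pmax=0.34, a=0.05, b=1.5):
    """Weibull-type curve relating modeled Newmark displacement (cm) in
    a grid cell to the probability that the cell failed:
    P = pmax * (1 - exp(-a * D**b)).  Parameter values are illustrative,
    not the fitted ones from the Northridge inventory."""
    return pmax * (1.0 - math.exp(-a * disp_cm ** b))
```

The curve rises monotonically from zero and saturates at pmax, reflecting that even cells with large modeled displacements do not all produce mapped landslides.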

  14. Heterogeneous rupture in the great Cascadia earthquake of 1700 inferred from coastal subsidence estimates

    USGS Publications Warehouse

    Wang, Pei-Ling; Engelhart, Simon E.; Wang, Kelin; Hawkes, Andrea D.; Horton, Benjamin P.; Nelson, Alan R.; Witter, Robert C.

    2013-01-01

    Past earthquake rupture models used to explain paleoseismic estimates of coastal subsidence during the great A.D. 1700 Cascadia earthquake have assumed a uniform slip distribution along the megathrust. Here we infer heterogeneous slip for the Cascadia margin in A.D. 1700 that is analogous to slip distributions during instrumentally recorded great subduction earthquakes worldwide. The assumption of uniform distribution in previous rupture models was due partly to the large uncertainties of then available paleoseismic data used to constrain the models. In this work, we use more precise estimates of subsidence in 1700 from detailed tidal microfossil studies. We develop a 3-D elastic dislocation model that allows the slip to vary both along strike and in the dip direction. Despite uncertainties in the updip and downdip slip extensions, the more precise subsidence estimates are best explained by a model with along-strike slip heterogeneity, with multiple patches of high-moment release separated by areas of low-moment release. For example, in A.D. 1700, there was very little slip near Alsea Bay, Oregon (~44.4°N), an area that coincides with a segment boundary previously suggested on the basis of gravity anomalies. A probable subducting seamount in this area may be responsible for impeding rupture during great earthquakes. Our results highlight the need for more precise, high-quality estimates of subsidence or uplift during prehistoric earthquakes from the coasts of southern British Columbia, northern Washington (north of 47°N), southernmost Oregon, and northern California (south of 43°N), where slip distributions of prehistoric earthquakes are poorly constrained.

  15. A Quantitative Analysis of the Pattern Informatics Technique to Characterize Future Earthquakes

    NASA Astrophysics Data System (ADS)

    Moulik, P.; Tiampo, K. F.; Shcherbakov, R.; Klein, W.; Rundle, J. B.

    2008-12-01

    The Pattern Informatics (PI) technique, which is based on the spatiotemporal changes in seismicity rate, has shown promising results in Southern California (Tiampo et al., 2002). However, there have been limited attempts to quantitatively account for the optimal forecasting parameters and the characteristics of the predicted earthquakes. A sensitivity analysis of the model parameters in the PI method is investigated using a receiver operating characteristic (ROC) diagram and Peirce's skill score to account for the optimal parameters (Holliday et al., 2006). The threshold PI value used for identifying the earthquake is evaluated by comparing Peirce's skill score for different threshold values. The optimal cell size for the PI, found by varying the cell sizes to find the maximum area under the ROC, is an important factor if the forecasted earthquakes are characterized using their rupture areas, which can transcend several cell sizes. Moreover, the duration of training and forecasting intervals as well as the lower magnitude cutoff is estimated using the ROC diagram to provide a better forecast. Identification of the spatial and temporal regions of the seismic catalogs for better forecasts is done using the Thirumalai-Mountain (TM) fluctuation metric (Thirumalai et al., 1989). The TM metric, a measure of effective ergodicity, is studied systematically to analyse its dependence on cell size and other forecasting parameters. Next, the magnitude of the forecasting events is empirically estimated from rupture dimensions using the roughness index, a variation of the PI index (Tiampo et al., 2006). As the exact epicentre of the future earthquake is unknown, a forecasting magnitude or rupture dimension map is constructed which may provide insight into the mechanisms behind the earthquakes. 
Thus, the present work aims to provide a quantitative way to determine the various optimal parameters used for identifying and using relevant regions of the seismic catalogs while using a modified PI index to empirically estimate the magnitude of future earthquakes at various probable epicentres in the region.

  16. Nonlinear Dynamics, Magnitude-Period Formula and Forecasts on Earthquake

    E-print Network

    Chang, Yi-Fang

    2008-01-01

    Based on the geodynamics, an earthquake does not take place until the momentum-energy exceeds a faulting threshold value of rock, due to the movement of the fluid layer under the rock layer and the transport and accumulation of the momentum. From the nonlinear equations of fluid mechanics, a simplified nonlinear solution of momentum corresponding to the accumulation of the energy could be derived. Otherwise, a chaos equation could be obtained, in which chaos corresponds to the earthquake; this shows the complexity of seismology and the impossibility of exact prediction of earthquakes. But, combining the Carlson-Langer model and the Gutenberg-Richter relation, the magnitude-period formula of the earthquake may be derived approximately, and some results can be calculated quantitatively. For example, we forecast a series of earthquakes of 2004, 2009 and 2014, especially in 2019 in California. Combining the Lorenz model, we discuss the earthquake migration to and fro. Moreover, many external causes for earthquake are merely...
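The Gutenberg-Richter relation invoked here gives the rate of events at or above a magnitude threshold, log10 N = a - b*M, from which a mean recurrence period follows as the reciprocal of the rate. A minimal sketch (the a and b values are illustrative, not fitted to any catalog):

```python
def gr_rate(magnitude, a=4.5, b=1.0):
    """Annual rate of events with magnitude >= M from the
    Gutenberg-Richter relation log10 N = a - b*M (a, b illustrative)."""
    return 10.0 ** (a - b * magnitude)

def mean_recurrence_years(magnitude, a=4.5, b=1.0):
    """Mean recurrence interval: the reciprocal of the annual rate."""
    return 1.0 / gr_rate(magnitude, a, b)
```

With b = 1, each unit increase in magnitude lowers the rate tenfold, which is the scaling that magnitude-period formulas of this kind build on.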

  17. THE LOMA PRIETA, CALIFORNIA, EARTHQUAKE OF OCTOBER 17, 1989: EARTHQUAKE OCCURRENCE

    E-print Network

    Silver, Paul

    ... from a periodic geyser in northern California exhibited precursory variations 1 to 3 days before the three largest earthquakes within a 250-km radius of the geyser, including the Loma Prieta earthquake ... a similar preseismic signal was recorded from a strainmeter located halfway between the geyser

  18. Earthquake and tsunami forecasts: relation of slow slip events to subsequent earthquake rupture.

    PubMed

    Dixon, Timothy H; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-12-01

    The 5 September 2012 M(w) 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr-Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential. PMID:25404327

  19. Earthquake and tsunami forecasts: Relation of slow slip events to subsequent earthquake rupture

    PubMed Central

    Dixon, Timothy H.; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-01-01

    The 5 September 2012 Mw 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr–Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential. PMID:25404327

  20. Arias intensity assessment of liquefaction test sites on the east side of San Francisco Bay affected by the Loma Prieta, California, earthquake of 17 October 1989

    USGS Publications Warehouse

    Kayen, R.E.

    1997-01-01

    Uncompacted artificial-fill deposits on the east side of San Francisco Bay suffered severe levels of soil liquefaction during the Loma Prieta earthquake of 17 October 1989. Damaged areas included maritime-port facilities, office buildings, and shoreline transportation arteries, ranging from 65 to 85 km from the north end of the Loma Prieta rupture zone. Typical of all these sites, which represent occurrences of liquefaction-induced damage farthest from the rupture zone, are low cone penetration test and Standard Penetration Test resistances in zones of cohesionless silty and sandy hydraulic fill, and underlying soft cohesive Holocene and Pleistocene sediment that strongly amplified ground motions. Postearthquake investigations at five study sites using standard penetration tests and cone penetration tests provide a basis for evaluation of the Arias intensity-based methodology for assessment of liquefaction susceptibility. © 1997 Kluwer Academic Publishers.
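Arias intensity, the measure underlying this assessment, is defined as Ia = (pi / 2g) times the time integral of the squared ground acceleration. A minimal numerical sketch (rectangle-rule integration; the sampling choices are mine):

```python
import math

def arias_intensity(accel_m_s2, dt_s, g=9.81):
    """Arias intensity Ia = (pi / 2g) * integral of a(t)^2 dt (m/s),
    approximated with a rectangle rule over an evenly sampled
    acceleration record in m/s^2."""
    return (math.pi / (2.0 * g)) * sum(a * a for a in accel_m_s2) * dt_s
```

Because it integrates the squared acceleration over the whole record, Arias intensity captures duration as well as amplitude, which is why it is attractive for liquefaction assessment at large distances.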

  1. Recurrence intervals for great earthquakes of the past 3,500 years at northeastern Willapa Bay, Washington

    USGS Publications Warehouse

    Atwater, Brian F.; Hemphill-Haley, Eileen

    1997-01-01

    Seven great earthquakes, or earthquake series, probably ruptured the southern Washington part of the Cascadia subduction zone in the past 3,500 years. Each earthquake was probably of magnitude 8 or larger. The earthquakes define six recurrence intervals that average about 500 years. The longest interval, about 700-1300 years, was followed by two of the shortest, which together lasted less than 800 years. Another long interval, 600-1000 years, ended with an earthquake 300 years ago.
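An average recurrence interval of about 500 years translates into a time-window hazard only under a model assumption; the simplest is a Poisson process, sketched below (the Poisson choice is an illustration, not the authors' model):

```python
import math

def prob_at_least_one(mean_interval_yr, window_yr):
    """Probability of one or more events in a time window, assuming a
    Poisson process with the given mean recurrence interval."""
    return 1.0 - math.exp(-window_yr / mean_interval_yr)

# ~500-yr average recurrence: chance of a great earthquake in 50 years.
p50 = prob_at_least_one(500.0, 50.0)
```

The irregular intervals noted in the abstract (700-1300 yr followed by two short ones) are exactly the kind of clustering a simple Poisson model cannot represent.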

  2. Fault structure and kinematics of the Long Valley Caldera region, California, revealed by high-accuracy earthquake hypocenters and focal mechanism stress inversions

    USGS Publications Warehouse

    Prejean, S.; Ellsworth, W.; Zoback, M.; Waldhauser, F.

    2002-01-01

    We have determined high-resolution hypocenters for 45,000+ earthquakes that occurred between 1980 and 2000 in the Long Valley caldera area using a double-difference earthquake location algorithm and routinely determined arrival times. The locations reveal numerous discrete fault planes in the southern caldera and adjacent Sierra Nevada block (SNB). Intracaldera faults include a series of east/west-striking right-lateral strike-slip faults beneath the caldera's south moat and a series of more northerly striking strike-slip/normal faults beneath the caldera's resurgent dome. Seismicity in the SNB south of the caldera is confined to a crustal block bounded on the west by an east-dipping oblique normal fault and on the east by the Hilton Creek fault. Two NE-striking left-lateral strike-slip faults are responsible for most seismicity within this block. To understand better the stresses driving seismicity, we performed stress inversions using focal mechanisms with 50 or more first motions. This analysis reveals that the least principal stress direction systematically rotates across the studied region, from NE to SW in the caldera's south moat to WNW-ESE in Round Valley, 25 km to the SE. Because WNW-ESE extension is characteristic of the western boundary of the Basin and Range province, caldera area stresses appear to be locally perturbed. This stress perturbation does not seem to result from magma chamber inflation but may be related to the significant (???20 km) left step in the locus of extension along the Sierra Nevada/Basin and Range province boundary. This implies that regional-scale tectonic processes are driving seismic deformation in the Long Valley caldera.

  3. Quantitative analysis of seismic fault zone waves in the rupture zone of the 1992 Landers, California, earthquake: Evidence for a shallow trapping structure

    USGS Publications Warehouse

    Peng, Z.; Ben-Zion, Y.; Michael, A.J.; Zhu, L.

    2003-01-01

    We analyse quantitatively a waveform data set of 238 earthquakes recorded by a dense seismic array across and along the rupture zone of the 1992 Landers earthquake. A grid-search method with station delay corrections is used to locate events that do not have catalogue locations. The quality of fault zone trapped waves generated by each event is determined from the ratios of seismic energy in time windows corresponding to trapped waves and direct S waves at stations close to and off the fault zone. Approximately 70 per cent of the events with S-P times of less than 2 s, including many clearly off the fault, produce considerable trapped wave energy. This distribution is in marked contrast with previous claims that trapped waves are generated only by sources close to or inside the Landers rupture zone. The time difference between the S arrival and trapped waves group does not grow systematically with increasing hypocentral distance and depth. The dispersion measured from the trapped waves is weak. These results imply that the seismic trapping structure at the Landers rupture zone is shallow and does not extend continuously along-strike by more than a few kilometres. Synthetic waveform modelling indicates that the fault zone waveguide has depth of approximately 2-4 km, a width of approximately 200 m, an S-wave velocity reduction relative to the host rock of approximately 30-40 per cent and an S-wave attenuation coefficient of approximately 20-30. The fault zone waveguide north of the array appears to be shallower and weaker than that south of the array. The waveform modelling also indicates that the seismic trapping structure below the array is centred approximately 100 m east of the surface break.

  4. Earthquake Formation

    NSDL National Science Digital Library

    Integrated Teaching and Learning Program,

    Students learn about the structure of the earth and how an earthquake happens. In one activity, students make a model of the earth including all of its layers. In a teacher-led demonstration, students learn about continental drift. In another activity, students create models demonstrating the different types of faults.

  5. Earthquake Facts

    MedlinePLUS

    ... California was felt in 1769 by the exploring expedition of Gaspar de Portola while the group was ... free oscillation of the Earth. The interior of Antarctica has icequakes which, although they are much smaller, ...

  6. Liquefaction probability curves for surficial geologic deposits

    USGS Publications Warehouse

    Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.

    2011-01-01

    Liquefaction probability curves that predict the probability of surface manifestations of earthquake-induced liquefaction are developed for 14 different types of surficial geologic units. The units consist of alluvial fan, beach ridge, river delta topset and foreset beds, eolian dune, point bar, flood basin, natural river and alluvial fan levees, abandoned river channel, deep-water lake, lagoonal, sandy artificial fill, and valley train deposits. Probability is conditioned on earthquake magnitude and peak ground acceleration. Curves are developed for water table depths of 1.5 and 5.0 m. Probabilities are derived from complementary cumulative frequency distributions of the liquefaction potential index (LPI) that were computed from 927 cone penetration tests. For natural deposits with a water table at 1.5 m and subjected to a M7.5 earthquake with peak ground acceleration (PGA) = 0.25g, probabilities are as high as 0.5 for beach ridge, point bar, and deltaic deposits. The curves also were used to assign ranges of liquefaction probabilities to the susceptibility categories proposed previously for different geologic deposits. For the earthquake described here, probabilities for susceptibility categories have ranges of 0–0.08 for low, 0.09–0.30 for moderate, 0.31–0.62 for high, and 0.63–1.00 for very high. Retrospective predictions of liquefaction during historical earthquakes based on the curves compare favorably to observations.
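A probability read from a complementary cumulative frequency distribution of LPI is simply an exceedance fraction over the CPT-derived values for a geologic unit. A minimal sketch (the threshold of 5 is a common LPI criterion for surface manifestation, assumed here rather than taken from the abstract):

```python
def manifestation_probability(lpi_values, threshold=5.0):
    """Fraction of CPT-derived LPI values for a geologic unit exceeding
    a threshold: an empirical exceedance (complementary cumulative)
    probability.  The threshold of 5 is a common LPI criterion for
    surface manifestation, assumed here."""
    exceed = sum(1 for v in lpi_values if v > threshold)
    return exceed / len(lpi_values)
```

Repeating this calculation for each unit over a grid of magnitude and PGA scenarios yields the family of probability curves the abstract describes.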

  7. California Geological Survey-Educational Resources Center

    NSDL National Science Digital Library

    2007-01-01

    How do we understand the Earth and its complexity? It's a crucial question in our age. Fortunately, the California Geological Survey is interested in these matters. The Survey's Educational Resources Center site features California geology maps, teachers' aids, and "California Geology 101." This last resource is an interactive index of online geologic field trip guides and related sites. The resources include an exploration of the 1906 San Francisco Earthquake, replies to questions posed by the "Earthquake DOC," and a glossary of rock and mineral terminology. The maps should not be missed either, as they include a fault activity map of California and a detailed map of the Golden State's geomorphic provinces.

  8. Engaging Students in Earthquake Science

    NASA Astrophysics Data System (ADS)

    Cooper, I. E.; Benthien, M.

    2004-12-01

    The Southern California Earthquake Center Communication, Education, and Outreach program (SCEC CEO) has been collaborating with the University of Southern California (USC) Joint Education Project (JEP) and the Education Consortium of Central Los Angeles (ECCLA) to work directly with the teachers and schools in the local community around USC. The community surrounding USC is 57% Hispanic (US Census, 2000) and 21% African American (US Census, 2000). Through the partnership with ECCLA, SCEC has created a three-week enrichment intersession program, targeting disadvantaged students at the fourth/fifth grade level, dedicated entirely to earthquakes. SCEC builds partnerships with the intersession teachers, working together to actively engage the students in learning about earthquakes. SCEC provides a support system for the teachers, supplying them with the necessary content background as well as classroom manipulatives. SCEC goes into the classrooms with guest speakers and takes the students out of the classroom on two field trips. There are four intersession programs each year. SCEC is also working with USC's Joint Education Project program. The JEP program has been recognized as one of the "oldest and best organized" Service-Learning programs in the country (TIME Magazine and the Princeton Review, 2000). Through this partnership SCEC is providing USC students with the necessary tools to go out to the local schools and teach students of all grade levels about earthquakes. SCEC works with the USC students to design engaging lesson plans that effectively convey content regarding earthquakes. USC students can check out hands-on/interactive materials to use in the classrooms from the SCEC Resource Library. Through both of these endeavors SCEC has expanded its outreach to the local community. SCEC is reaching over 200 minority children each year through these partnerships, and this number will increase as the programs grow.

  9. A RELM earthquake forecast based on pattern informatics

    Microsoft Academic Search

    James R. Holliday; Chien-chih Chen; Kristy F. Tiampo; John B. Rundle; Donald L. Turcotte; Andrea Donnelan

    2005-01-01

    We present a RELM forecast of future earthquakes in California that is primarily based on the pattern informatics (PI) method. This method identifies regions that have systematic fluctuations in seismicity, and it has been demonstrated to be successful. A PI forecast map originally published on 19 February 2002 for southern California successfully forecast the locations of sixteen of eighteen M>5 earthquakes.

  10. 75 FR 27846 - California Disaster # CA-00155

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-18

    ...major disaster for Public Assistance Only for the State of California (FEMA- 1911-DR), dated 05/07/2010. Incident: Earthquake. Incident Period: 04/04/2010 and continuing. Effective Date: 05/07/2010. Physical Loan Application Deadline...

  11. 77 FR 58901 - California Disaster #CA-00190

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-24

    ...notice of an Administrative declaration of a disaster for the State of California dated 09/14/2012. Incident: Brawley Earthquakes. Incident Period: 08/26/2012 and continuing. DATES: Effective Date: 09/14/2012. Physical Loan...

  12. Geologic evidence for recurrent moderate to large earthquakes near Charleston, South Carolina

    USGS Publications Warehouse

    Obermeier, S.F.; Gohn, G.S.; Weems, R.E.; Gelinas, R.L.; Rubin, M.

    1985-01-01

    Multiple generations of earthquake-induced sand blows in Quaternary sediments and soils near Charleston, South Carolina, are evidence of recurrent moderate to large earthquakes in that area. The large 1886 earthquake, the only historic earthquake known to have produced sand blows at Charleston, probably caused the youngest observed blows. Older (late Quaternary) sand blows in the Charleston area indicate at least two prehistoric earthquakes with shaking severities comparable to the 1886 event.

  13. The effects of earthquake measurement concepts and magnitude anchoring on individuals' perceptions of earthquake risk

    USGS Publications Warehouse

    Celsi, R.; Wolfinbarger, M.; Wald, D.

    2005-01-01

    The purpose of this research is to explore earthquake risk perceptions in California. Specifically, we examine the risk beliefs, feelings, and experiences of lay, professional, and expert individuals to explore how risk is perceived and how risk perceptions are formed relative to earthquakes. Our results indicate that individuals tend to perceptually underestimate the degree to which earthquake (EQ) events may affect them. This occurs in large part because individuals' personal felt experience of EQ events is generally overestimated relative to experienced magnitudes. An important finding is that individuals engage in a process of "cognitive anchoring" of their felt EQ experience towards the reported earthquake magnitude size. The anchoring effect is moderated by the degree to which individuals comprehend EQ magnitude measurement and EQ attenuation. Overall, the results of this research provide us with a deeper understanding of EQ risk perceptions, especially as they relate to individuals' understanding of EQ measurement and attenuation concepts. © 2005, Earthquake Engineering Research Institute.

  14. Probability Illustrator

    NSDL National Science Digital Library

    Zollman, Dean

    This resource shows the relation between quantum wave functions and probabilities. The user can draw a 1D wavefunction, and observe the resultant probability distribution. Both the probability density at a single point and the probability in any interval can be measured.
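    The relation the applet visualizes is standard quantum mechanics: for a 1D wavefunction ψ(x), the probability density is |ψ(x)|² and the probability of finding the particle in [a, b] is the integral of |ψ|² over that interval. A minimal numerical sketch (not the applet's code) samples ψ on a grid and approximates the integral with the trapezoidal rule:

```python
import math

def probability_in_interval(xs, psi, a, b):
    """Approximate the integral of |psi|^2 over [a, b] by the
    trapezoidal rule, for samples psi[i] taken at grid points xs[i]."""
    total = 0.0
    for i in range(len(xs) - 1):
        # Include only grid cells lying entirely inside [a, b].
        if xs[i] >= a and xs[i + 1] <= b:
            p0 = abs(psi[i]) ** 2
            p1 = abs(psi[i + 1]) ** 2
            total += 0.5 * (p0 + p1) * (xs[i + 1] - xs[i])
    return total

# Example: ground state of an infinite square well on [0, 1],
# psi(x) = sqrt(2) * sin(pi * x), which is already normalized.
n = 1000
xs = [i / n for i in range(n + 1)]
psi = [math.sqrt(2) * math.sin(math.pi * x) for x in xs]

total = probability_in_interval(xs, psi, 0.0, 1.0)  # close to 1.0
left_half = probability_in_interval(xs, psi, 0.0, 0.5)  # close to 0.5
```

    The full-interval probability comes out near 1 (the wavefunction is normalized), and the left half gives near 0.5 by symmetry.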

  15. Probability distributions of hydraulic conductivity for the hydrogeologic units of the Death Valley regional ground-water flow system, Nevada and California

    USGS Publications Warehouse

    Belcher, Wayne R.; Sweetkind, Donald S.; Elliott, Peggy E.

    2002-01-01

    The use of geologic information such as lithology and rock properties is important to constrain conceptual and numerical hydrogeologic models. This geologic information is difficult to apply explicitly to numerical modeling and analyses because it tends to be qualitative rather than quantitative. This study uses a compilation of hydraulic-conductivity measurements to derive estimates of the probability distributions for several hydrogeologic units within the Death Valley regional ground-water flow system, a geologically and hydrologically complex region underlain by basin-fill sediments and volcanic, intrusive, sedimentary, and metamorphic rocks. Probability distributions of hydraulic conductivity for general rock types have been studied previously; however, this study provides a more detailed definition of hydrogeologic units based on lithostratigraphy, lithology, alteration, and fracturing, and compares the probability distributions to the aquifer-test data. Results suggest that these probability distributions can be used for studies involving, for example, numerical flow modeling, recharge, evapotranspiration, and rainfall runoff, both for the hydrogeologic units in the region and for similar rock types elsewhere. Within the study area, fracturing appears to have the greatest influence on the hydraulic conductivity of carbonate bedrock hydrogeologic units. Similar to earlier studies, we find that alteration and welding in the Tertiary volcanic rocks greatly influence hydraulic conductivity. As alteration increases, hydraulic conductivity tends to decrease. Increasing degrees of welding appear to increase hydraulic conductivity because welding increases the brittleness of the volcanic rocks, thus increasing the amount of fracturing.
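    Hydraulic conductivity within a hydrogeologic unit is commonly treated as lognormally distributed, so a unit's distribution can be summarized by the mean and standard deviation of log10(K) computed from the measurements. The sketch below illustrates that fit; the sample values are illustrative, not data from the report:

```python
import math

def lognormal_params(k_values):
    """Fit a lognormal distribution to hydraulic-conductivity
    measurements by returning (mean, std) of log10(K).
    Uses the population standard deviation for simplicity."""
    logs = [math.log10(k) for k in k_values]
    mean = sum(logs) / len(logs)
    var = sum((v - mean) ** 2 for v in logs) / len(logs)
    return mean, math.sqrt(var)

# Illustrative measurements (m/day) for a hypothetical fractured unit
# spanning two orders of magnitude, as K data typically do.
sample = [0.1, 1.0, 10.0]
mu, sigma = lognormal_params(sample)
# mu = 0.0 corresponds to a geometric-mean K of 1 m/day.
```

    Working in log10(K) is what makes distributions from different rock types directly comparable.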

  16. A Review of Two Methods of Predicting Earthquakes

    NSDL National Science Digital Library

    This report reviews two methods of earthquake prediction: statistical analysis and observation of geophysical precursors of earthquakes. Statistical analysis involves observing the history of earthquake occurrences in a given region to see whether there are patterns or cycles; two case studies are provided as examples: Friuli, Italy, and Parkfield, California. Five geophysical precursors, changes in the physical state of the earth that precede earthquakes, are also discussed: changes in P-wave velocity, ground uplift and tilt, increases in radon emissions, decreases in electrical resistivity, and fluctuations in groundwater levels. Examples of each of these precursors are also provided. References are included.
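    The statistical approach can be illustrated with the simplest recurrence model (a sketch under assumed numbers, not a result from the report): if earthquakes in a region are treated as a Poisson process with mean recurrence interval T years, the probability of at least one event in the next t years is 1 - exp(-t/T):

```python
import math

def poisson_probability(t_years, mean_recurrence_years):
    """Probability of at least one event in a window of t_years,
    for a Poisson process with the given mean recurrence interval."""
    return 1.0 - math.exp(-t_years / mean_recurrence_years)

# Illustrative example: a 30-yr forecast window on a fault segment
# with an assumed ~140-yr mean recurrence interval.
p = poisson_probability(30.0, 140.0)  # about 0.19
```

    Time-dependent models (as used by the Working Groups on California Earthquake Probabilities) replace the memoryless exponential with renewal distributions conditioned on the time since the last event.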

  17. Comparison of seismic waveform inversion results for the rupture history of a finite fault: application to the 1986 North Palm Springs, California, earthquake

    USGS Publications Warehouse

    Hartzell, S.

    1989-01-01

    The July 8, 1986, North Palm Springs earthquake is used as a basis for comparison of several different approaches to the solution for the rupture history of a finite fault. The inversion of different waveform data is considered: both teleseismic P waveforms and local strong ground motion records. Linear parametrizations for slip amplitude are compared with nonlinear parametrizations for both slip amplitude and rupture time. Inversions using both synthetic and empirical Green's functions are considered. In general, accurate Green's functions are more readily calculable for the teleseismic problem, where simple ray theory and flat-layered velocity structures are usually sufficient. However, uncertainties in the variation of t* with frequency most strongly limit the resolution of teleseismic inversions. A set of empirical Green's functions that are well recorded at teleseismic distances could avoid the uncertainties in attenuation. In the inversion of strong motion data, the accurate calculation of propagation path effects other than attenuation effects is the limiting factor in the resolution of source parameters. -from Author
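    The linear parametrization mentioned above reduces to a least-squares problem: with slip amplitudes m on the subfaults and precomputed Green's functions G, the predicted waveform samples are d = G m. A toy sketch (two subfaults, three waveform samples, all numbers invented for illustration, not from the study) recovers the slips via the normal equations:

```python
# Toy finite-fault slip inversion: solve d = G m for slip amplitudes m
# by least squares via the normal equations (G^T G) m = G^T d.
# The Green's functions and "observed" data here are illustrative.

def solve_2x2(a, b, c, d, e, f):
    """Solve [[a, b], [c, d]] @ [x, y] = [e, f] by Cramer's rule."""
    det = a * d - b * c
    return ((e * d - b * f) / det, (a * f - e * c) / det)

# Green's functions: response of each waveform sample (rows) to unit
# slip on each of two subfaults (columns).
G = [[1.0, 0.5],
     [0.0, 1.0],
     [2.0, 0.0]]
true_m = (1.0, 2.0)

# Synthesize noise-free "observed" data from the true slip.
d = [G[i][0] * true_m[0] + G[i][1] * true_m[1] for i in range(3)]

# Normal equations: (G^T G) m = G^T d.
gtg = [[sum(G[i][r] * G[i][c] for i in range(3)) for c in range(2)]
       for r in range(2)]
gtd = [sum(G[i][r] * d[i] for i in range(3)) for r in range(2)]
m = solve_2x2(gtg[0][0], gtg[0][1], gtg[1][0], gtg[1][1],
              gtd[0], gtd[1])  # recovers (1.0, 2.0)
```

    Real inversions add positivity and smoothing constraints on m; the nonlinear parametrizations the abstract contrasts additionally solve for rupture time, which cannot be written as a linear system.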

  18. California Geological Survey: Seismic Hazards Zonation Program

    NSDL National Science Digital Library

    Users can access interactive mapping and Geographic Information Systems (GIS) data, and view or download maps and reports on California's most prominent earthquake hazards. These maps are intended to help avoid damage resulting from earthquakes and are used by cities and counties to regulate development and by property owners selling property within areas where seismic hazards have been identified.

  19. Earthquake Forecast via Neutrino Tomography

    E-print Network

    Bin Wang; Ya-Zheng Chen; Xue-Qian Li

    2011-03-29

    We discuss the possibility of forecasting earthquakes by means of (anti)neutrino tomography. Antineutrinos emitted from reactors are used as a probe. As the antineutrinos traverse through a region prone to earthquakes, observable variations in the matter effect on the antineutrino oscillation would provide a tomography of the vicinity of the region. In this preliminary work, we adopt a simplified model for the geometrical profile and matter density in a fault zone. We calculate the survival probability of electron antineutrinos for cases without and with an anomalous accumulation of electrons which can be considered as a clear signal of the coming earthquake, at the geological region with a fault zone, and find that the variation may reach as much as 3% for $\\bar \
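    The survival probability at the core of this idea can be sketched in its simplest vacuum, two-flavor form (the paper's calculation includes the matter effect, which is what carries the earthquake signal; the formula and parameter values below are standard textbook quantities, not the paper's):

```python
import math

def survival_probability(dm2_ev2, sin2_2theta, length_m, energy_mev):
    """Two-flavor vacuum survival probability for an electron
    antineutrino: P = 1 - sin^2(2*theta) * sin^2(1.27 * dm2 * L / E),
    with dm2 in eV^2, L in meters, and E in MeV."""
    phase = 1.27 * dm2_ev2 * length_m / energy_mev
    return 1.0 - sin2_2theta * math.sin(phase) ** 2

# Illustrative reactor-antineutrino numbers: E ~ 4 MeV, baseline 50 km.
p = survival_probability(7.5e-5, 0.85, 50_000.0, 4.0)
```

    In matter, the effective mixing angle and mass splitting depend on the electron density along the path, so an anomalous accumulation of electrons in a fault zone would shift P away from this vacuum value; that shift is the tomographic signal the authors compute.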

  20. Earthquake Forecast via Neutrino Tomography

    E-print Network

    Wang, Bin; Li, Xue-Qian

    2010-01-01

    We discuss the possibility of forecasting earthquakes by means of (anti)neutrino tomography. Antineutrinos emitted from reactors are used as a probe. As the antineutrinos traverse through a region prone to earthquakes, observable variations in the matter effect on the antineutrino oscillation would provide a tomography of the vicinity of the region. In this preliminary work, we adopt a simplified model for the geometrical profile and matter density in a fault zone. We calculate the survival probability of electron antineutrinos for cases without and with an anomalous accumulation of electrons which can be considered as a clear signal of the coming earthquake, at the geological region with a fault zone, and find that the variation may reach as much as 5% to 8.5% for $\\bar \