These are representative sample records from Science.gov related to your search topic.
For comprehensive and current results, perform a real-time search at Science.gov.
1

An empirical model for earthquake probabilities in the San Francisco Bay region, California, 2002-2031  

USGS Publications Warehouse

The moment magnitude M 7.8 earthquake in 1906 profoundly changed the rate of seismic activity over much of northern California. The low rate of seismic activity in the San Francisco Bay region (SFBR) since 1906, relative to that of the preceding 55 yr, is often explained as a stress-shadow effect of the 1906 earthquake. However, existing elastic and visco-elastic models of stress change fail to fully account for the duration of the lowered rate of earthquake activity. We use variations in the rate of earthquakes as a basis for a simple empirical model for estimating the probability of M ≥ 6.7 earthquakes in the SFBR. The model preserves the relative magnitude distribution of sources predicted by the Working Group on California Earthquake Probabilities' (WGCEP, 1999; WGCEP, 2002) model of characterized ruptures on SFBR faults and is consistent with the occurrence of the four M ≥ 6.7 earthquakes in the region since 1838. When the empirical model is extrapolated 30 yr forward from 2002, it gives a probability of 0.42 for one or more M ≥ 6.7 earthquakes in the SFBR. This result is lower than the probability of 0.5 estimated by WGCEP (1988), lower than the 30-yr Poisson probability of 0.60 obtained by WGCEP (1999) and WGCEP (2002), and lower than the 30-yr time-dependent probabilities of 0.67, 0.70, and 0.63 obtained by WGCEP (1990), WGCEP (1999), and WGCEP (2002), respectively, for the occurrence of one or more large earthquakes. This lower probability is consistent with the lack of adequate accounting for the 1906 stress-shadow in these earlier reports. The empirical model represents one possible approach toward accounting for the stress-shadow effect of the 1906 earthquake. However, the discrepancy between our result and those obtained with other modeling methods underscores the fact that the physics controlling the timing of earthquakes is not well understood. Hence, we advise against using the empirical model alone (or any other single probability model) for estimating the earthquake hazard and endorse the use of all credible earthquake probability models for the region, including the empirical model, with appropriate weighting, as was done in WGCEP (2002).
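
For context, the 30-yr probabilities quoted above relate to long-term earthquake rates through the standard Poisson formula (a generic relation, not a detail of this paper): P(at least one M ≥ 6.7 event in T years) = 1 - e^{-\lambda T}. The 30-yr Poisson probability of 0.60 cited for WGCEP (1999, 2002) thus corresponds to a rate of \lambda = -ln(0.40)/30 ≈ 0.031 events/yr, i.e. a mean recurrence of roughly 33 yr for M ≥ 6.7 events in the region.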

Reasenberg, P.A.; Hanks, T.C.; Bakun, W.H.

2003-01-01

2

Earthquake Rate Model 2 of the 2007 Working Group for California Earthquake Probabilities, Magnitude-Area Relationships  

USGS Publications Warehouse

The Working Group for California Earthquake Probabilities must transform fault lengths and their slip rates into earthquake moment-magnitudes. First, the down-dip coseismic fault dimension, W, must be inferred. We have chosen the Nazareth and Hauksson (2004) method, which uses the depth above which 99% of the background seismicity occurs to assign W. The product of the observed or inferred fault length, L, with the down-dip dimension, W, gives the fault area, A. We must then use a scaling relation to relate A to moment-magnitude, Mw. We assigned equal weight to the Ellsworth B (Working Group on California Earthquake Probabilities, 2003) and Hanks and Bakun (2007) equations. The former uses a single logarithmic relation fitted to the M ≥ 6.5 portion of the data of Wells and Coppersmith (1994); the latter uses a bilinear relation with a slope change at M=6.65 (A=537 km²) and also was tested against a greatly expanded dataset for large continental transform earthquakes. We also present an alternative power law relation, which fits the newly expanded Hanks and Bakun (2007) data best, and captures the change in slope that Hanks and Bakun attribute to a transition from area- to length-scaling of earthquake slip. We have not opted to use the alternative relation for the current model. The selections and weights were developed by unanimous consensus of the Executive Committee of the Working Group, following an open meeting of scientists, a solicitation of outside opinions from additional scientists, and presentation of our approach to the Scientific Review Panel. The magnitude-area relations and their assigned weights are unchanged from those used in Working Group (2003).
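
As a rough illustration of how the two magnitude-area relations are combined, the Python sketch below evaluates both and applies the equal weights described above. The bilinear break point (A = 537 km²) is taken from the abstract; the intercept constants (4.2, 3.98, 3.07) are commonly cited values and are assumptions here, not quoted from this report, and the Working Group applies the weights as alternative logic-tree branches rather than by averaging magnitudes as in this simplified example.

# Minimal sketch of the two magnitude-area scaling relations discussed above.
# Coefficients are commonly cited values (assumed here); check WGCEP (2003)
# and Hanks and Bakun (2007) before any real use.
import math

def ellsworth_b(area_km2):
    """Single log-linear relation, Mw = log10(A) + 4.2 (intercept assumed)."""
    return math.log10(area_km2) + 4.2

def hanks_bakun(area_km2):
    """Bilinear relation with a slope change near A = 537 km^2 (M ~ 6.65)."""
    if area_km2 <= 537.0:
        return math.log10(area_km2) + 3.98            # area-scaling branch (assumed intercept)
    return (4.0 / 3.0) * math.log10(area_km2) + 3.07  # length-scaling branch (assumed intercept)

def weighted_mw(length_km, width_km, w_eb=0.5, w_hb=0.5):
    """Equal weighting of the two relations (illustrative average, not a logic tree)."""
    area = length_km * width_km
    return w_eb * ellsworth_b(area) + w_hb * hanks_bakun(area)

print(round(weighted_mw(length_km=100.0, width_km=12.0), 2))  # ~7.2 for a 1200 km^2 rupture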

Stein, Ross S.

2008-01-01

3

Southern California regional earthquake probability estimated from continuous GPS geodetic data  

NASA Astrophysics Data System (ADS)

Current seismic hazard estimates are primarily based on seismic and geologic data, but geodetic measurements from large, dense arrays such as the Southern California Integrated GPS Network (SCIGN) can also be used to estimate earthquake probabilities and seismic hazard. Geodetically-derived earthquake probability estimates are particularly important in regions with poorly-constrained fault slip rates. In addition, they are useful because such estimates come with well-determined error bounds. Long-term planning is underway to incorporate geodetic data in the next generation of United States national seismic hazard maps, and techniques for doing so need further development. I present a new method for estimating the expected rates of earthquakes using strain rates derived from geodetic station velocities. I compute the strain rates using a new technique devised by Y. Hsu and M. Simons [Y. Hsu and M. Simons, pers. comm.], which computes the horizontal strain rate tensor (\dot{\epsilon}) at each node of a pre-defined regular grid, using all geodetic velocities in the data set weighted by distance and estimated uncertainty. In addition, they use a novel weighting to handle the effects of station distribution: they divide the region covered by the geodetic network into Voronoi cells using the station locations and weight each station's contribution to \dot{\epsilon} by the area of the Voronoi cell centered at that station. I convert \dot{\epsilon} into the equivalent seismic moment rate density (\dot{M}) using the method of Savage and Simpson [1997] and maximum seismogenic depths estimated from regional seismicity; \dot{M} gives the expected rate of seismic moment release in a region, based on the geodetic strain rates. Assuming the seismicity in the given region follows a Gutenberg-Richter relationship, I convert \dot{M} to an expected rate of earthquakes of a given magnitude. I will present results of a study applying this method to data from the SCIGN array to estimate earthquake rates in southern California, but my technique is generally applicable to any region with a sufficiently dense geodetic array and well-located seismicity. Savage, J. C. and R. W. Simpson, Surface strain accumulation and the seismic moment tensor, Bull. Seismol. Soc. Am., 87, 1345-1353, 1997.
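
The processing chain described in this abstract can be sketched generically as follows. The Savage and Simpson (1997) moment-rate formula is quoted from the literature; the Gutenberg-Richter conversion below uses a truncated-distribution moment balance, which is one standard choice and not necessarily the author's exact formulation, and all numerical inputs (shear modulus, b-value, maximum magnitude, the example cell) are illustrative assumptions.

# Sketch: geodetic strain rate -> seismic moment rate -> Gutenberg-Richter rates.
import math

MU = 3.0e10  # shear modulus, Pa (assumed)

def moment_rate(eps1, eps2, area_m2, seismogenic_depth_m, mu=MU):
    """Savage and Simpson (1997): Mdot = 2*mu*H*A*max(|e1|, |e2|, |e1+e2|),
    with e1, e2 the principal horizontal strain rates (per yr)."""
    return 2.0 * mu * seismogenic_depth_m * area_m2 * max(abs(eps1), abs(eps2), abs(eps1 + eps2))

def moment_Nm(m):
    """Seismic moment (N*m) for moment magnitude m."""
    return 10.0 ** (1.5 * m + 9.05)

def gr_rate(mdot_Nm_per_yr, m, m_max=8.0, b=1.0):
    """Yearly rate of events with magnitude >= m if the full geodetic moment
    rate is released by a truncated Gutenberg-Richter population (b < 1.5)."""
    a = mdot_Nm_per_yr * (1.5 - b) / (b * moment_Nm(m_max))
    return a * 10.0 ** (b * (m_max - m))

# Example: 100 nanostrain/yr over a 50 km x 50 km cell with 15 km seismogenic depth.
mdot = moment_rate(1.0e-7, 0.0, 50e3 * 50e3, 15e3)
print(mdot, gr_rate(mdot, m=6.5))   # ~2e17 N*m/yr, ~3e-3 M>=6.5 events/yr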

Anderson, G.

2002-12-01

4

Earthquake Rate Model 2.2 of the 2007 Working Group for California Earthquake Probabilities, Appendix D: Magnitude-Area Relationships  

USGS Publications Warehouse

Summary: To estimate the down-dip coseismic fault dimension, W, the Executive Committee has chosen the Nazareth and Hauksson (2004) method, which uses the 99% depth of background seismicity to assign W. For the predicted earthquake magnitude-fault area scaling used to estimate the maximum magnitude of an earthquake rupture from a fault's length, L, and W, the Committee has assigned equal weight to the Ellsworth B (Working Group on California Earthquake Probabilities, 2003) and Hanks and Bakun (2002) (as updated in 2007) equations. The former uses a single relation; the latter uses a bilinear relation which changes slope at M=6.65 (A=537 km²).

Stein, Ross S.

2007-01-01

5

Earthquake Probability Mapping  

NSDL National Science Digital Library

This mapping tool allows users to generate Earthquake Probability Maps (EPMs) for a region within 50 kilometers of a location specified by latitude and longitude or by ZIP code. The maps are color-coded; higher earthquake probabilities are indicated by orange and red colors, while lower probabilities are indicated by green or blue. Fault traces are marked in white; rivers are in blue. The maps are also produced in downloadable, printable format (PDF).

6

The earthquake prediction experiment at Parkfield, California  

Microsoft Academic Search

Since 1985, a focused earthquake prediction experiment has been in progress along the San Andreas fault near the town of Parkfield in central California. Parkfield has experienced six moderate earthquakes since 1857 at average intervals of 22 years, the most recent a magnitude 6 event in 1966. The probability of another moderate earthquake soon appears high, but studies assigning it a 95% chance of occurring before 1993 now appear to have been oversimplified.

Evelyn Roeloffs; John Langbein

1994-01-01

7

Ontology of Earthquake Probability: Metaphor  

E-print Network

[Slide presentation; the extracted text is garbled. Recoverable slide headings include "Intro", "Probability", "Rabbits", "Description", and "Predictions", with slide titles such as "Ontology of Earthquake Probability: Metaphor" and "Earthquake Poker"; one legible fragment reads "...be abandoned in favor of common sense."]

Stark, Philip B.

8

The Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2)  

USGS Publications Warehouse

California's 35 million people live among some of the most active earthquake faults in the United States. Public safety demands credible assessments of the earthquake hazard to maintain appropriate building codes for safe construction and earthquake insurance for loss protection. Seismic hazard analysis begins with an earthquake rupture forecast: a model of probabilities that earthquakes of specified magnitudes, locations, and faulting types will occur during a specified time interval. This report describes a new earthquake rupture forecast for California developed by the 2007 Working Group on California Earthquake Probabilities (WGCEP 2007).

2007 Working Group on California Earthquake Probabilities

2008-01-01

9

Parkfield, California: Earthquake History  

NSDL National Science Digital Library

This report describes the history of seismic activity at Parkfield, California, which is situated on the San Andreas Fault. It points out that moderate-size earthquakes have occurred on the Parkfield section of the San Andreas fault at fairly regular intervals, and that the earthquakes may have been 'characteristic' in the sense that they occurred with some regularity (mean repetition time of about 22 years). This indicates that they may have repeatedly ruptured the same area on the fault. A diagram shows the timing of the earthquakes, and illustrations of the seismic waveforms show the similarities between earthquakes occurring in 1922, 1934, and 1966.

10

Cosmogenic beryllium-10 exposure dating of probable earthquake-triggered rock avalanches in Yosemite Valley, California  

NASA Astrophysics Data System (ADS)

In Yosemite Valley, rock falls commonly originate from the glacially-steepened walls. Deposition of the smaller rock falls, from hundreds up to tens of thousands of cubic meters, is typically limited to the active talus slopes beneath the cliffs. The floor of Yosemite Valley, however, preserves at least seven extremely large rock fall deposits, here termed rock avalanches, up to several million cubic meters in volume. These deposits extend far beyond the base of active talus slopes onto the valley floor, and have occurred since the retreat of Last Glacial Maximum glaciers circa 15-17 ka. Using airborne LiDAR data that resolves individual boulders, we mapped the rock avalanche deposits in the field and in ArcGIS. Minimum exposed volumes range from hundreds of thousands to several million cubic meters. To assess the frequency of rock avalanche occurrence, we employed cosmogenic beryllium-10 surface exposure dating of large (>4 cubic meters) boulders embedded within the deposits. These deposits are ideal targets for cosmogenic 10Be exposure dating, as they are instantaneous events that excavate deep-seated quartz-rich granitic rocks, and once deposited, are essentially immune to post-depositional erosion or modification. Mean exposure ages indicate that failures occurred at 1.0, 1.8, 2.3, 3.7, 4.4, 6.4, and 11.6 ka. At least three of the deposits appear to represent two or more failures, separated in time by hundreds to thousands of years. Synchronous rock avalanches (within the uncertainty of the exposure ages (<200 yrs)) at different locations within the valley appear to have occurred at 3.7 ka, and possibly at 2.3 ka, suggesting possible coseismic triggering. Age correlations from paleoseismic work tentatively identify large earthquakes originating from the eastern Sierra Nevada or western Nevada as possible triggers for at least half of the rock avalanches. These unique and robust age data provide key information for accurately mapping rock avalanches in Yosemite Valley and for quantifying their recurrence intervals.
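
For reference, the age calculation underlying this kind of exposure dating reduces, for young surfaces with negligible erosion, to a simple relation (a textbook simplification, not a detail taken from this study): the measured nuclide concentration N (atoms per gram of quartz) divided by the local production rate P (atoms per gram per year) gives the exposure age, t ≈ N / P. The general form with radioactive decay (decay constant \lambda) is N(t) = (P/\lambda)(1 - e^{-\lambda t}); for ¹⁰Be (half-life ≈ 1.4 Myr) the decay correction is negligible over the ~1-12 ka ages reported here.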

Thompson, J. A.; Stock, G. M.; Rood, D.; Frankel, K. L.

2013-12-01

11

Paleoseismic event dating and the conditional probability of large earthquakes on the southern San Andreas fault, California  

USGS Publications Warehouse

We introduce a quantitative approach to paleoearthquake dating and apply it to paleoseismic data from the Wrightwood and Pallett Creek sites on the southern San Andreas fault. We illustrate how stratigraphic ordering, sedimentological, and historical data can be used quantitatively in the process of estimating earthquake ages. Calibrated radiocarbon age distributions are used directly from layer dating through recurrence intervals and recurrence probability estimation. The method does not eliminate subjective judgements in event dating, but it does provide a means of systematically and objectively approaching the dating process. Date distributions for the most recent 14 events at Wrightwood are based on sample and contextual evidence in Fumal et al. (2002) and site context and slip history in Weldon et al. (2002). Pallett Creek event and dating descriptions are from published sources. For the five most recent events at Wrightwood, our results are consistent with previously published estimates, with generally comparable or narrower uncertainties. For Pallett Creek, our earthquake date estimates generally overlap with previous results but typically have broader uncertainties. Some event date estimates are very sensitive to details of data interpretation. The historical earthquake in 1857 ruptured the ground at both sites but is not constrained by radiocarbon data. Radiocarbon ages, peat accumulation rates, and historical constraints at Pallett Creek for event X yield a date estimate in the earliest 1800s and preclude a date in the late 1600s. This event is almost certainly the historical 1812 earthquake, as previously concluded by Sieh et al. (1989). This earthquake also produced ground deformation at Wrightwood. All events at Pallett Creek, except for event T, about A.D. 1360, and possibly event I, about A.D. 960, have corresponding events at Wrightwood with some overlap in age ranges. Event T falls during a period of low sedimentation at Wrightwood when conditions were not favorable for recording earthquake evidence. Previously proposed correlations of Pallett Creek X with Wrightwood W3 in the 1690s and Pallett Creek event V with W5 around 1480 (Fumal et al., 1993) appear unlikely after our dating reevaluation. Apparent internal inconsistencies among event, layer, and dating relationships around events R and V identify them as candidates for further investigation at the site. Conditional probabilities of earthquake recurrence were estimated using Poisson, lognormal, and empirical models. The presence of 12 or 13 events at Wrightwood during the same interval that 10 events are reported at Pallett Creek is reflected in mean recurrence intervals of 105 and 135 years, respectively. Average Poisson model 30-year conditional probabilities are about 20% at Pallett Creek and 25% at Wrightwood. The lognormal model conditional probabilities are somewhat higher, about 25% for Pallett Creek and 34% for Wrightwood. Lognormal variance σ_ln estimates of 0.76 and 0.70, respectively, imply only weak time predictability. Conditional probabilities of 29% and 46%, respectively, were estimated for an empirical distribution derived from the data alone. Conditional probability uncertainties are dominated by the brevity of the event series; dating uncertainty contributes only secondarily. 
Wrightwood and Pallett Creek event chronologies both suggest variations in recurrence interval with time, hinting that some form of recurrence rate modulation may be at work, but formal testing shows that neither series is more ordered than might be produced by a Poisson process.
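
The conditional-probability calculation summarized above can be reproduced in outline with a short script. The mean recurrence (105 yr) and lognormal σ_ln (0.70) for Wrightwood are taken from the abstract; the elapsed time since the 1857 rupture and the exact calculation date are illustrative assumptions, and this is not the authors' code.

# Sketch: 30-yr conditional probabilities from a renewal model (lognormal vs. Poisson).
import math
from scipy.stats import lognorm, expon

def conditional_prob(dist, elapsed, window):
    """P(next event within `window` yr | no event during the `elapsed` yr so far)."""
    return (dist.cdf(elapsed + window) - dist.cdf(elapsed)) / dist.sf(elapsed)

mean_recurrence, sigma_ln = 105.0, 0.70      # Wrightwood values from the abstract
elapsed, window = 2002 - 1857, 30.0          # yr since the last (1857) surface rupture

lognormal = lognorm(s=sigma_ln, scale=mean_recurrence / math.exp(sigma_ln**2 / 2))
poisson   = expon(scale=mean_recurrence)     # memoryless reference model

print(round(conditional_prob(lognormal, elapsed, window), 2))  # ~0.33, cf. ~34% above
print(round(conditional_prob(poisson, elapsed, window), 2))    # ~0.25, cf. ~25% above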

Biasi, G.P.; Weldon, R.J., II; Fumal, T.E.; Seitz, G.G.

2002-01-01

12

Parkfield, California, earthquake prediction experiment  

Microsoft Academic Search

Five moderate (magnitude 6) earthquakes with similar features have occurred on the Parkfield section of the San Andreas fault in central California since 1857. The next moderate Parkfield earthquake is expected to occur before 1993. The Parkfield prediction experiment is designed to monitor the details of the final stages of the earthquake preparation process; observations and reports of seismicity and aseismic slip associated with the last moderate Parkfield earthquake in 1966 constitute much of the basis of the design of the experiment.

W. H. Bakun; A. G. Lindh

1985-01-01

13

The Parkfield, California, Earthquake Experiment  

NSDL National Science Digital Library

This report describes research being carried out in Parkfield, California, whose purpose is to better understand the physics of earthquakes: what actually happens on the fault and in the surrounding region before, during, and after an earthquake. Ultimately, scientists hope to better understand the earthquake process and, if possible, to provide a scientific basis for earthquake prediction. Topics include the scientific background for the experiment, including the tectonic setting at Parkfield, historical earthquake activity on this section of the San Andreas fault, the monitoring and data collecting activities currently being carried out, and plans for future research. Data are also available to view in real time and to download.

14

How Do Scientists Determine Earthquake Probabilities?  

NSDL National Science Digital Library

This site provides many links to articles, graphics, scientific papers, and podcasts to help students understand how scientists determine probabilities for earthquake occurrences. Topics include the locations of faults and how much they need to move in order to release the strain that accumulates; the study of past earthquakes on each fault to predict the size of possible earthquakes that could occur in the future; and the use of information on how long it has been since the last earthquake to estimate the probability that an earthquake will occur in the next few years. Links to additional information are embedded in the text.

15

Southern California Earthquake Center (SCEC)  

NSDL National Science Digital Library

The Southern California Earthquake Center (SCEC), a National Science Foundation (NSF) Science and Technology Center, aims to reduce earthquake hazard by defining the locations of future earthquakes, calculating expected ground motions, and conveying this information to the general public. SCEC's homepage contains access to research and data, including links to databases for strong motion and seismograms, and a searchable and sortable bibliographic database of publications. Also available are GPS data and a network of GPS stations. A link to the Earthquake Information Network provides a searchable list of up-to-date internet earthquake resources. Note that in order to access the SCEC Publications Database, a username and password are required: use your own name for the username, and enter -webview as the password. SCEC is a first-rate resource for earthquake engineers.

16

Southern California Earthquake Data Center  

NSDL National Science Digital Library

To say that there are a few earthquake research centers in Southern California is a bit like saying that Chicago sits on a lake of some size. It's a bit of an obvious remark, but given that there are a number of such projects, it's important to take a look at some of the more compelling ones out there. One such important resource is the Southern California Earthquake Data Center, sponsored by a host of organizations, including the California Institute of Technology and the United States Geological Survey. Visitors to the project site can peruse some of its recent work, which includes a clickable map of the region that features information on recent earthquakes in California and Nevada. Equally compelling is the clickable fault map of Southern California where visitors can learn about the local faults and recent activity along each fault. Another key element of the site is the historical earthquake database, which may be of interest to both the general public and those who are studying this area.

17

USGS Earthquake Hazards Program-Northern California: Special Features  

NSDL National Science Digital Library

This page describes current special features on seismology, faults, and earthquakes. The current articles cover shake maps for Northern and Southern California, real-time display of seismograms for Northern California, an earthquake probability study for the San Francisco Bay area, the landscape, seascape, and faults of the San Francisco Bay area, and the scientific expedition for the 1999 earthquake in Turkey. Visitors who feel significant earthquakes in this area are invited to participate in an internet survey of ground shaking and damage. Visitors may also view previous features.

18

Southern California Earthquake Center (SCEC) Home Page  

NSDL National Science Digital Library

This is the home page of the Southern California Earthquake Center (SCEC), a consortium of universities and research institutions dedicated to gathering information about earthquakes in Southern California, integrating that knowledge into a comprehensive and predictive understanding of earthquake phenomena, and communicating this understanding to end-users and the general public in order to increase earthquake awareness, reduce economic losses, and save lives. News of recent earthquake research, online resources, and educational information are available here.

19

Probability of derailment under earthquake conditions  

E-print Network

A quantitative assessment of the probability of derailment under earthquake conditions is presented. Two derailment modes are considered: by vibratory motion - during the ground motion - and by permanent track deformation ...

Guillaud, Lucile M. (Lucile Marie)

2006-01-01

20

The earthquake prediction experiment at Parkfield, California  

USGS Publications Warehouse

Since 1985, a focused earthquake prediction experiment has been in progress along the San Andreas fault near the town of Parkfield in central California. Parkfield has experienced six moderate earthquakes since 1857 at average intervals of 22 years, the most recent a magnitude 6 event in 1966. The probability of another moderate earthquake soon appears high, but studies assigning it a 95% chance of occurring before 1993 now appear to have been oversimplified. The identification of a Parkfield fault "segment" was initially based on geometric features in the surface trace of the San Andreas fault, but more recent microearthquake studies have demonstrated that those features do not extend to seismogenic depths. On the other hand, geodetic measurements are consistent with the existence of a "locked" patch on the fault beneath Parkfield that has presently accumulated a slip deficit equal to the slip in the 1966 earthquake. A magnitude 4.7 earthquake in October 1992 brought the Parkfield experiment to its highest level of alert, with a 72-hour public warning that there was a 37% chance of a magnitude 6 event. However, this warning proved to be a false alarm. Most data collected at Parkfield indicate that strain is accumulating at a constant rate on this part of the San Andreas fault, but some interesting departures from this behavior have been recorded. Here we outline the scientific arguments bearing on when the next Parkfield earthquake is likely to occur and summarize geophysical observations to date.

Roeloffs, E.; Langbein, J.

1994-01-01

21

The earthquake prediction experiment at Parkfield, California  

NASA Astrophysics Data System (ADS)

Since 1985, a focused earthquake prediction experiment has been in progress along the San Andreas fault near the town of Parkfield in central California. Parkfield has experienced six moderate earthquakes since 1857 at average intervals of 22 years, the most recent a magnitude 6 event in 1966. The probability of another moderate earthquake soon appears high, but studies assigning it a 95% chance of occurring before 1993 now appear to have been oversimplified. The identification of a Parkfield fault 'segment' was initially based on geometric features in the surface trace of the San Andreas fault, but more recent microearthquake studies have demonstrated that those features do not extend to seismogenic depths. On the other hand, geodetic measurements are consistent with the existence of a 'locked' patch on the fault beneath Parkfield that has presently accumulated a slip deficit equal to the slip in the 1966 earthquake. A magnitude 4.7 earthquake in October 1992 brought the Parkfield experiment to its highest level of alert, with a 72-hour public warning that there was a 37% chance of a magnitude 6 event. However, this warning proved to be a false alarm. Most data collected at Parkfield indicate that strain is accumulating at a constant rate on this part of the San Andreas fault, but some interesting departures from this behavior have been recorded. Here we outline the scientific arguments bearing on when the next Parkfield earthquake is likely to occur and summarize geophysical observations to date.

Roeloffs, Evelyn; Langbein, John

1994-08-01

22

Northern California Earthquake Data Center  

NSDL National Science Digital Library

A project between the University of California Berkeley Seismological Laboratory and the United States Geological Survey, the Northern California Earthquake Data Center (NCEDC) "is a long-term archive and distribution center for seismological and geodetic data for Northern and Central California." Educators and students can examine recent seismograms from the Berkeley Digital Seismic Network. Researchers will benefit from the site's enormous collection of data sets, including BARD, a system of 67 continuously operating Global Positioning System receivers in Northern California. By reading the annual reports, educators will also learn about the center's many outreach activities, from talks and lab tours to the production of classroom resources for kindergarten through twelfth grade teachers. This site is also reviewed in the October 17, 2003 NSDL Physical Sciences Report.

23

Northern California Earthquake Data Center (NCEDC)  

NSDL National Science Digital Library

This is the home page of the Northern California Earthquake Data Center (NCEDC) which is a joint project of the University of California Berkeley Seismological Laboratory and the U. S. Geological Survey at Menlo Park. The NCEDC is an archive for seismological and geodetic data for Northern and Central California. Accessible through this page are news items, recent earthquake information, links to earthquake catalogs, seismic waveform data sets, and Global Positioning System information. Most data sets are accessible for downloading via ftp.

24

Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California  

PubMed Central

The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor–Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most “successful” in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts. PMID:21949355

Lee, Ya-Ting; Turcotte, Donald L.; Holliday, James R.; Sachs, Michael K.; Rundle, John B.; Chen, Chien-Chih; Tiampo, Kristy F.

2011-01-01

25

Prospective Tests of Southern California Earthquake Forecasts  

NASA Astrophysics Data System (ADS)

We are testing earthquake forecast models prospectively using likelihood ratios. Several investigators have developed such models as part of the Southern California Earthquake Center's project called Regional Earthquake Likelihood Models (RELM). Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. Here we describe the testing procedure and present preliminary results. Forecasts are expressed as the yearly rate of earthquakes within pre-specified bins of longitude, latitude, magnitude, and focal mechanism parameters. We test models against each other in pairs, which requires that both forecasts in a pair be defined over the same set of bins. For this reason we specify a standard "menu" of bins and ground rules to guide forecasters in using common descriptions. One menu category includes five-year forecasts of magnitude 5.0 and larger. Contributors will be requested to submit forecasts in the form of a vector of yearly earthquake rates on a 0.1 degree grid at the beginning of the test. Focal mechanism forecasts, when available, are also archived and used in the tests. Interim progress will be evaluated yearly, but final conclusions would be made on the basis of cumulative five-year performance. The second category includes forecasts of earthquakes above magnitude 4.0 on a 0.1 degree grid, evaluated and renewed daily. Final evaluation would be based on cumulative performance over five years. Other types of forecasts with different magnitude, space, and time sampling are welcome and will be tested against other models with shared characteristics. Tests are based on the log likelihood scores derived from the probability that future earthquakes would occur where they do if a given forecast were true [Kagan and Jackson, J. Geophys. Res.,100, 3,943-3,959, 1995]. For each pair of forecasts, we compute alpha, the probability that the first would be wrongly rejected in favor of the second, and beta, the probability that the second would be wrongly rejected in favor of the first. Computing alpha and beta requires knowing the theoretical distribution of likelihood scores under each hypothesis, which we estimate by simulations. In this scheme, each forecast is given equal status; there is no "null hypothesis" which would be accepted by default. Forecasts and test results will be archived and posted on the RELM web site. Major problems under discussion include how to treat aftershocks, which clearly violate the variable-rate Poissonian hypotheses that we employ, and how to deal with the temporal variations in catalog completeness that follow large earthquakes.
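
The likelihood scoring referred to above can be illustrated with a minimal example: each forecast supplies an expected number of earthquakes per space-magnitude bin, and the score is the joint Poisson log-likelihood of the observed bin counts, with significance (the alpha and beta error rates) estimated by simulation. The toy rates and catalog below are invented; only the scoring formula follows the Kagan and Jackson approach cited in the abstract.

# Toy Poisson log-likelihood comparison of two gridded forecasts.
import math

def poisson_log_likelihood(forecast_rates, observed_counts):
    """Sum over bins of the log Poisson probability of each observed count."""
    return sum(-rate + n * math.log(rate) - math.lgamma(n + 1)
               for rate, n in zip(forecast_rates, observed_counts))

forecast_a = [0.02, 0.15, 0.01, 0.30]   # expected events per bin over the test period
forecast_b = [0.10, 0.05, 0.05, 0.28]
observed   = [0,    1,    0,    0]      # events actually observed in each bin

print(poisson_log_likelihood(forecast_a, observed) -
      poisson_log_likelihood(forecast_b, observed))   # > 0 favors forecast A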

Jackson, D. D.; Schorlemmer, D.; Gerstenberger, M.; Kagan, Y. Y.; Helmstetter, A.; Wiemer, S.; Field, N.

2004-12-01

26

An Investigation of Southern California Earthquakes  

NSDL National Science Digital Library

This site has directions for a classroom activity in which students plot locations of major Southern California earthquakes on a map. A table listing major earthquakes, when they occurred, their locations and their magnitudes is included. There is also a set of questions for the students to answer once they have plotted the earthquake data on their map. This site is in PDF format.

27

Earthquake probabilities: theoretical assessments and reality  

NASA Astrophysics Data System (ADS)

It is common knowledge that earthquakes are complex phenomena whose classification and sizing remain serious problems of contemporary seismology. In general, their frequency-magnitude distributions exhibit power law scaling. This scaling differs significantly when different time and/or space domains are considered. At the scale of a particular earthquake rupture zone the frequency of similar size events is usually estimated to be about once in several hundred years. Evidently, contemporary seismology does not possess enough reported instrumental data for any reliable quantification of an earthquake probability at a given place of expected event. Regrettably, most of the state-of-the-art theoretical approaches to assessing the probability of seismic events are based on trivial (e.g. Poisson, periodic, etc) or, conversely, delicately-designed (e.g. STEP, ETAS, etc) models of earthquake sequences. Some of these models are evidently erroneous, some can be rejected by the existing statistics, and some are hardly testable in our life-time. Nevertheless, such probabilistic counts, including seismic hazard assessment and earthquake forecasting, when used in practice eventually lead to scientifically groundless advice communicated to decision makers and to inappropriate decisions. As a result, the population of seismic regions continues facing unexpected risk and losses. The international project Global Earthquake Model (GEM) is on the wrong track if it continues to base seismic risk estimates on the standard, mainly probabilistic, methodology to assess seismic hazard. It is generally accepted that earthquakes are infrequent, low-probability events. However, they keep occurring at earthquake-prone areas with 100% certainty. Given the expectation of a seismic event once per hundred years, the daily probability of occurrence on a certain date may range from 0 to 100% depending on the choice of probability space (which is yet unknown and, therefore, made by a subjective lucky chance). How many days are needed to distinguish 0 from the average probability of 0.000027? Is it theoretically admissible to apply an average when seismic events, including mega-earthquakes, are evidently clustered in time and space, displaying behaviors that are far from independent? Is it possible to ignore a possibly fractal, definitely far from uniform, distribution in space when mapping seismic probability density away from the empirical earthquake locus embedded onto the boundaries of the lithosphere blocks? These are simple questions to those who advocate the existing probabilistic products for seismic hazard assessment and forecasting. Fortunately, the situation is not hopeless, owing to deterministic pattern recognition approaches applied to available geological evidence, specifically when the intent is to predict the predictable, rather than the exact size, site, date, and probability of a target event. Understanding, by modeling, the complexity of the non-linear dynamics of hierarchically organized systems of blocks-and-faults has already led to methodologies of neo-deterministic seismic hazard analysis and intermediate-term middle- to narrow-range earthquake prediction algorithms tested in real-time applications over the last decades.

Kossobokov, V. G.

2013-12-01

28

The Parkfield, California, Earthquake Prediction Experiment  

NASA Astrophysics Data System (ADS)

Five moderate (magnitude 6) earthquakes with similar features have occurred on the Parkfield section of the San Andreas fault in central California since 1857. The next moderate Parkfield earthquake is expected to occur before 1993. The Parkfield prediction experiment is designed to monitor the details of the final stages of the earthquake preparation process; observations and reports of seismicity and aseismic slip associated with the last moderate Parkfield earthquake in 1966 constitute much of the basis of the design of the experiment.

Bakun, W. H.; Lindh, A. G.

1985-08-01

29

Combining earthquake forecasts using differential probability gains  

NASA Astrophysics Data System (ADS)

We describe an iterative method to combine seismicity forecasts. With this method, we produce the next generation of a starting forecast by incorporating predictive skill from one or more input forecasts. For a single iteration, we use the differential probability gain of an input forecast relative to the starting forecast. At each point in space and time, the rate in the next-generation forecast is the product of the starting rate and the local differential probability gain. The main advantage of this method is that it can produce high forecast rates using all types of numerical forecast models, even those that are not rate-based. Naturally, a limitation of this method is that the input forecast must have some information not already contained in the starting forecast. We illustrate this method using the Every Earthquake a Precursor According to Scale (EEPAS) and Early Aftershocks Statistics (EAST) models, which are currently being evaluated at the US testing center of the Collaboratory for the Study of Earthquake Predictability. During a testing period from July 2009 to December 2011 (with 19 target earthquakes), the combined model we produce has better predictive performance - in terms of Molchan diagrams and likelihood - than the starting model (EEPAS) and the input model (EAST). Many of the target earthquakes occur in regions where the combined model has high forecast rates. Most importantly, the rates in these regions are substantially higher than if we had simply averaged the models.
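
The combination step described above amounts to a pointwise product of the starting forecast with a map of local differential probability gains; how the gain map itself is estimated is not reproduced here. A minimal sketch, with placeholder arrays and an optional renormalization that is an assumption rather than something stated in the abstract:

# Sketch: combine a starting forecast with local differential probability gains.
import numpy as np

def combine(starting_rates, differential_gain, renormalize=False):
    """Next-generation rate = starting rate x local differential probability gain."""
    combined = starting_rates * differential_gain
    if renormalize:
        # Optional (assumed, not stated above): preserve the starting forecast's
        # total expected number of events.
        combined *= starting_rates.sum() / combined.sum()
    return combined

starting = np.array([[0.01, 0.02], [0.05, 0.01]])   # e.g., EEPAS rates per space-time cell
gain     = np.array([[1.0,  3.0 ], [0.5,  2.0 ]])   # gains derived from an input model (e.g., EAST)
print(combine(starting, gain))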

Shebalin, Peter N.; Narteau, Clément; Zechar, Jeremy Douglas; Holschneider, Matthias

2014-12-01

30

The Parkfield, California, earthquake prediction experiment.  

PubMed

Five moderate (magnitude 6) earthquakes with similar features have occurred on the Parkfield section of the San Andreas fault in central California since 1857. The next moderate Parkfield earthquake is expected to occur before 1993. The Parkfield prediction experiment is designed to monitor the details of the final stages of the earthquake preparation process; observations and reports of seismicity and aseismic slip associated with the last moderate Parkfield earthquake in 1966 constitute much of the basis of the design of the experiment. PMID:17739363

Bakun, W H; Lindh, A G

1985-08-16

31

Probability based earthquake load and resistance factor design criteria for offshore platforms  

SciTech Connect

This paper describes a probability-based reliability formulation to determine earthquake Load and Resistance Factor Design (LRFD) parameters for conventional, steel, pile-supported, tubular-membered platforms, proposed as a basis for earthquake design criteria and guidelines for offshore platforms that are intended to have worldwide applicability. The formulation is illustrated with application to platforms located in five areas: offshore California, Venezuela (Rio Caribe), the East Coast of Canada, the Caspian Sea (Azeri), and the Norwegian sector of the North Sea.

Bea, R.G. [Univ. of California, Berkeley, CA (United States). Dept. of Civil Engineering]

1996-12-31

32

A physically-based earthquake recurrence model for estimation of long-term earthquake probabilities  

USGS Publications Warehouse

A physically-motivated model for earthquake recurrence based on the Brownian relaxation oscillator is introduced. The renewal process defining this point process model can be described by the steady rise of a state variable from the ground state to failure threshold as modulated by Brownian motion. Failure times in this model follow the Brownian passage time (BPT) distribution, which is specified by the mean time to failure, μ, and the aperiodicity of the mean, α (equivalent to the familiar coefficient of variation). Analysis of 37 series of recurrent earthquakes, M -0.7 to 9.2, suggests a provisional generic value of α = 0.5. For this value of α, the hazard function (instantaneous failure rate of survivors) exceeds the mean rate for times > μ/2, and is ≈ 2/μ for all times > μ. Application of this model to the next M 6 earthquake on the San Andreas fault at Parkfield, California suggests that the annual probability of the earthquake is between 1:10 and 1:13.
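
A minimal sketch of a BPT conditional-probability calculation, using the fact that the BPT distribution with mean recurrence time μ and aperiodicity α is an inverse-Gaussian distribution; in SciPy's parameterization this is invgauss(mu=α², scale=μ/α²). The Parkfield-like inputs (22-yr mean, 33 yr elapsed since 1966) are illustrative assumptions, not the paper's exact inputs.

# Sketch: annual conditional probability from the Brownian passage time (BPT) model.
from scipy.stats import invgauss

def bpt(mean, alpha):
    """BPT / inverse-Gaussian recurrence distribution with the given mean and aperiodicity."""
    return invgauss(mu=alpha**2, scale=mean / alpha**2)

def conditional_prob(dist, elapsed, window):
    """P(failure within `window` yr | survived `elapsed` yr since the last event)."""
    return (dist.cdf(elapsed + window) - dist.cdf(elapsed)) / dist.sf(elapsed)

dist = bpt(mean=22.0, alpha=0.5)       # generic alpha = 0.5 from the abstract
print(round(conditional_prob(dist, elapsed=33.0, window=1.0), 3))  # ~0.09, within the 1:10-1:13 range above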

Ellsworth, William L.; Matthews, Mark V.; Nadeau, Robert M.; Nishenko, Stuart P.; Reasenberg, Paul A.; Simpson, Robert W.

1999-01-01

33

California earthquakes: why only shallow focus?  

PubMed

Frictional sliding on sawcuts and faults in laboratory samples of granite and gabbro is markedly temperature-dependent. At pressures from 1 to 5 kilobars, stick-slip gave way to stable sliding as temperature was increased from 200 to 500 degrees Celsius. Increased temperature with depth could thus cause the abrupt disappearance of earthquakes noted at shallow depths in California. PMID:17759338

Brace, W F; Byerlee, J D

1970-06-26

34

Conditional Probabilities for Large Events Estimated by Small Earthquake Rate  

NASA Astrophysics Data System (ADS)

We examined forecasting quiescence and activation models to obtain the conditional probability that a large earthquake will occur in a specific time period on different scales in Taiwan. The basic idea of the quiescence and activation models is to use earthquakes that have magnitudes larger than the completeness magnitude to compute the expected properties of large earthquakes. We calculated the probability time series for the whole Taiwan region and for three subareas of Taiwan—the western, eastern, and northeastern Taiwan regions—using 40 years of data from the Central Weather Bureau catalog. In the probability time series for the eastern and northeastern Taiwan regions, a high probability value is usually yielded in cluster events such as events with foreshocks and events that all occur in a short time period. In addition to the time series, we produced probability maps by calculating the conditional probability for every grid point at the time just before a large earthquake. The probability maps show that high probability values are yielded around the epicenter before a large earthquake. The receiver operating characteristic (ROC) curves of the probability maps demonstrate that the probability maps are not random forecasts, but also suggest that lowering the magnitude of a forecasted large earthquake may not improve the forecast method itself. From both the probability time series and probability maps, it can be observed that the probability obtained from the quiescence model increases before a large earthquake and the probability obtained from the activation model increases as the large earthquakes occur. The results lead us to conclude that the quiescence model has better forecast potential than the activation model.

Wu, Yi-Hsuan; Chen, Chien-Chih; Li, Hsien-Chi

2015-01-01

35

Space- and Time-Dependent Probabilities for Earthquake Fault Systems from Numerical Simulations: Feasibility Study and First Results  

NASA Astrophysics Data System (ADS)

In weather forecasting, current and past observational data are routinely assimilated into numerical simulations to produce ensemble forecasts of future events in a process termed "model steering". Here we describe a similar approach that is motivated by analyses of previous forecasts of the Working Group on California Earthquake Probabilities (WGCEP). Our approach is adapted to the problem of earthquake forecasting using topologically realistic numerical simulations for the strike-slip fault system in California. By systematically comparing simulation data to observed paleoseismic data, a series of spatial probability density functions (PDFs) can be computed that describe the probable locations of future large earthquakes. We develop this approach and show examples of PDFs associated with magnitude M > 6.5 and M > 7.0 earthquakes in California.

van Aalsburg, Jordan; Rundle, John B.; Grant, Lisa B.; Rundle, Paul B.; Yakovlev, Gleb; Turcotte, Donald L.; Donnellan, Andrea; Tiampo, Kristy F.; Fernandez, Jose

2010-08-01

36

Conditional Probability Approaches for the Occurrence of Earthquake Generated Tsunamis  

NASA Astrophysics Data System (ADS)

Probabilistic tsunami hazard assessment is not an easy task because the number of events contained in the tsunami time series of a particular tsunamigenic zone is usually low and, therefore, does not allow for statistical significance of the results. By contrast, the earthquake data set contains more events, because not all earthquakes are tsunamigenic. As a consequence, if probabilistic tsunami hazard assessment is considered in association with approaches for probabilistic earthquake occurrence, more reliable tsunami probabilities may be obtained. We develop alternative approaches for the determination of conditional probabilities of tsunami occurrence by combining two interrelated time series, one for the earthquake events and another for the tsunami events. The approaches allow calculation of the probability that a future earthquake will be tsunamigenic or non-tsunamigenic, within a given time interval, under the conditions that (1) the last earthquake took place at time t before the calculation date, (2) the last tsunamigenic earthquake occurred at time T before time t, and (3) the ratio of tsunami-generating earthquakes is known. The approaches were applied to earthquake and tsunami data sets from the Pacific and the Mediterranean Sea, and the efficiency in calculating conditional probabilities was successfully tested. The alternative approaches are discussed with respect to the superiority of one over the other.

Orfanogiannaki, K.; Papadopoulos, G.

2004-12-01

37

Earthquake Alerting in California  

E-print Network

[Slide presentation by Tom Heaton, Prof. of Engineering Seismology, Caltech; the extracted text is garbled. Recoverable content: "Earthquake alerting ... a different kind of prediction. What if earthquakes were really slow, like the weather? We could recognize that an earthquake is beginning and then broadcast information on its development."]

Greer, Julia R.

38

Seismicity alert probabilities at Parkfield, California, revisited  

USGS Publications Warehouse

For a decade, the US Geological Survey has used the Parkfield Earthquake Prediction Experiment scenario document to estimate the probability that earthquakes observed on the San Andreas fault near Parkfield will turn out to be foreshocks followed by the expected magnitude six mainshock. During this time, we have learned much about the seismogenic process at Parkfield, about the long-term probability of the Parkfield mainshock, and about the estimation of these types of probabilities. The probabilities for potential foreshocks at Parkfield are reexamined and revised in light of these advances. As part of this process, we have confirmed both the rate of foreshocks before strike-slip earthquakes in the San Andreas physiographic province and the uniform distribution of foreshocks with magnitude proposed by earlier studies. Compared to the earlier assessment, these new estimates of the long-term probability of the Parkfield mainshock are lower, our estimate of the rate of background seismicity is higher, and we find that the assumption that foreshocks at Parkfield occur in a unique way is not statistically significant at the 95% confidence level. While the exact numbers vary depending on the assumptions that are made, the new alert probabilities are lower than previously estimated. Considering the various assumptions and the statistical uncertainties in the input parameters, we also compute a plausible range for the probabilities. The range is large, partly due to the extra knowledge that exists for the Parkfield segment, making us question the usefulness of these numbers.

Michael, A.J.; Jones, L.M.

1998-01-01

39

Earthquake Simulations and Historical Patterns of Events: Forecasting the Next Great Earthquake in California  

NASA Astrophysics Data System (ADS)

The fault system in California, combined with some of the United States' most densely populated regions, is a recipe for devastation. It has been estimated that a repeat of the 1906 M = 7.8 San Francisco earthquake could cause as much as $84 billion in damage. Earthquake forecasting can help alleviate the effects of these events by targeting disaster relief and preparedness in regions that will need it the most. However, accurate earthquake forecasting has proven difficult. We present a forecasting technique that uses simulated earthquake catalogs generated by Virtual California and patterns of historical events. As background, we also describe internal details of the Virtual California earthquake simulator.

Sachs, M. K.; Rundle, J. B.; Heien, E. M.; Turcotte, D. L.; Yikilmaz, M.; Kellogg, L. H.

2013-12-01

40

Earthquake probabilities in the San Francisco Bay Region: 2000 to 2030 - a summary of findings  

USGS Publications Warehouse

The San Francisco Bay region sits astride a dangerous “earthquake machine,” the tectonic boundary between the Pacific and North American Plates. The region has experienced major and destructive earthquakes in 1838, 1868, 1906, and 1989, and future large earthquakes are a certainty. The ability to prepare for large earthquakes is critical to saving lives and reducing damage to property and infrastructure. An increased understanding of the timing, size, location, and effects of these likely earthquakes is a necessary component in any effective program of preparedness. This study reports on the probabilities of occurrence of major earthquakes in the San Francisco Bay region (SFBR) for the three decades 2000 to 2030. The SFBR extends from Healdsburg on the northwest to Salinas on the southeast and encloses the entire metropolitan area, including its most rapidly expanding urban and suburban areas. In this study a “major” earthquake is defined as one with M ≥ 6.7 (where M is moment magnitude). As experience from the Northridge, California (M6.7, 1994) and Kobe, Japan (M6.9, 1995) earthquakes has shown us, earthquakes of this size can have a disastrous impact on the social and economic fabric of densely urbanized areas. To reevaluate the probability of large earthquakes striking the SFBR, the U.S. Geological Survey solicited data, interpretations, and analyses from dozens of scientists representing a wide cross-section of the Earth-science community (Appendix A). The primary approach of this new Working Group (WG99) was to develop a comprehensive, regional model for the long-term occurrence of earthquakes, founded on geologic and geophysical observations and constrained by plate tectonics. The model considers a broad range of observations and their possible interpretations. Using this model, we estimate the rates of occurrence of earthquakes and 30-year earthquake probabilities. Our study considers a range of magnitudes for earthquakes on the major faults in the region—an innovation over previous studies of the SFBR that considered only a small number of potential earthquakes of fixed magnitude.

Working Group on California Earthquake Probabilities

1999-01-01

41

Southern California Earthquake Center Geologic Vertical Motion Database  

Microsoft Academic Search

The Southern California Earthquake Center Geologic Vertical Motion Database (VMDB) integrates disparate sources of geologic uplift and subsidence data at 10⁴- to 10⁶-year time scales into a single resource for investigations of crustal deformation in southern California. Over 1800 vertical deformation rate data points in southern California and northern Baja California populate the database. Four mature data sets are now

Nathan A. Niemi; Michael Oskin; Thomas K. Rockwell

2008-01-01

42

Bayesian probabilities of earthquake occurrences in Longmenshan fault system (China)  

NASA Astrophysics Data System (ADS)

China has a long history of earthquake records, and the Longmenshan fault system (LFS) is a famous earthquake zone. We believed that the LFS could be divided into three seismogenic zones (north, central, and south zones) based on the geological structures and the earthquake catalog. We applied the Bayesian probability method using the extreme-value distribution of earthquake occurrences to estimate the seismic hazard in the LFS. The seismic moment, slip rate, earthquake recurrence rate, and magnitude were considered as the basic parameters for computing the Bayesian prior estimates of the seismicity. These estimates were then updated in terms of Bayes' theorem and historical estimates of seismicity in the LFS. Generally speaking, the north zone appears quite quiet compared with the central and south zones. The central zone is the most dangerous; however, the periodicity of earthquake occurrences for Ms = 8.0 is quite long (1,250 to 5,000 years). The selection of the upper bound probable magnitude influences the result, and the upper bound magnitude of the south zone may be 7.5. We obtained the empirical relationship of magnitude conversion between Ms and ML, the value of the magnitude of completeness Mc (3.5), and the Gutenberg-Richter b value before applying the Bayesian extreme-value distribution of earthquake occurrences method.
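
The abstract's Bayesian updating of prior seismicity estimates with the historical catalog can be illustrated with a generic conjugate example; note that this gamma-Poisson sketch is a stand-in for the idea, not the extreme-value formulation actually used in the paper, and all numbers are invented.

# Generic illustration of Bayesian updating of an earthquake occurrence rate.
from scipy.stats import gamma

# Prior on the rate of M >= 7 events (per yr), e.g. from slip-rate/moment arguments:
# gamma with shape a0 and rate b0 (mean a0/b0 = 1/1000 per yr here; invented values).
a0, b0 = 2.0, 2000.0

# Historical catalog: n events of M >= 7 observed in T years (invented values).
n, T = 3, 2000.0

# Conjugate gamma-Poisson update.
a_post, b_post = a0 + n, b0 + T
posterior = gamma(a=a_post, scale=1.0 / b_post)
print(posterior.mean())                               # updated mean rate, events per yr
print(1.0 - (b_post / (b_post + 50.0)) ** a_post)     # P(at least one event in the next 50 yr)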

Wang, Ying; Zhang, Keyin; Gan, Qigang; Zhou, Wen; Xiong, Liang; Zhang, Shihua; Liu, Chao

2015-01-01

43

The initial subevent of the 1994 Northridge, California, earthquake: Is earthquake size predictable?  

Microsoft Academic Search

We examine the initial subevent (ISE) of the M 6.7, 1994 Northridge, California, earthquake in order to discriminate between two end-member rupture initiation models: the ‘preslip’ and ‘cascade’ models. Final earthquake size may be predictable from an ISE's seismic signature in the preslip model but not in the cascade model. In the cascade model ISEs are simply small earthquakes that

D. Kilb; J. Gomberg

1999-01-01

44

Fundamental questions of earthquake statistics, source behavior, and the estimation of earthquake probabilities from possible foreshocks  

USGS Publications Warehouse

Estimates of the probability that an ML 4.8 earthquake, which occurred near the southern end of the San Andreas fault on 24 March 2009, would be followed by an M 7 mainshock over the following three days vary from 0.0009 using a Gutenberg–Richter model of aftershock statistics (Reasenberg and Jones, 1989) to 0.04 using a statistical model of foreshock behavior and long-term estimates of large earthquake probabilities, including characteristic earthquakes (Agnew and Jones, 1991). I demonstrate that the disparity between the existing approaches depends on whether or not they conform to Gutenberg–Richter behavior. While Gutenberg–Richter behavior is well established over large regions, it could be violated on individual faults if they have characteristic earthquakes or over small areas if the spatial distribution of large-event nucleations is disproportional to the rate of smaller events. I develop a new form of the aftershock model that includes characteristic behavior and combines the features of both models. This new model and the older foreshock model yield the same results when given the same inputs, but the new model has the advantage of producing probabilities for events of all magnitudes, rather than just for events larger than the initial one. Compared with the aftershock model, the new model has the advantage of taking into account long-term earthquake probability models. Using consistent parameters, the probability of an M 7 mainshock on the southernmost San Andreas fault is 0.0001 for three days from long-term models and the clustering probabilities following the ML 4.8 event are 0.00035 for a Gutenberg–Richter distribution and 0.013 for a characteristic-earthquake magnitude–frequency distribution. Our decisions about the existence of characteristic earthquakes and how large earthquakes nucleate have a first-order effect on the probabilities obtained from short-term clustering models for these large events.
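
To see why the choice of magnitude-frequency model matters so much here, note that under a Gutenberg-Richter distribution with b = 1 (an illustrative value, not taken from the paper) the fraction of earthquakes at or above M 7 among all events at or above M 4.8 is N(\ge 7)/N(\ge 4.8) = 10^{-b(7-4.8)} = 10^{-2.2} ≈ 6 × 10^{-3}, whereas a characteristic-earthquake distribution concentrates extra rate near the characteristic magnitude, so the same clustering calculation yields a much larger mainshock probability; this is the mechanism behind the order-of-magnitude spread (0.00035 versus 0.013) reported above.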

Michael, Andrew J.

2012-01-01

45

Detection of hydrothermal precursors to large northern California earthquakes.  

PubMed

During the period 1973 to 1991 the interval between eruptions from a periodic geyser in Northern California exhibited precursory variations 1 to 3 days before the three largest earthquakes within a 250-kilometer radius of the geyser. These include the magnitude 7.1 Loma Prieta earthquake of 18 October 1989 for which a similar preseismic signal was recorded by a strainmeter located halfway between the geyser and the earthquake. These data show that at least some earthquakes possess observable precursors, one of the prerequisites for successful earthquake prediction. All three earthquakes were further than 130 kilometers from the geyser, suggesting that precursors might be more easily found around rather than within the ultimate rupture zone of large California earthquakes. PMID:17738277

Silver, P G; Valette-Silver, N J

1992-09-01

46

Toward earthquake early warning in northern California  

Microsoft Academic Search

Earthquake early warning systems are an approach to earthquake hazard mitigation which takes advantage of the rapid availability of earthquake information to quantify the hazard associated with an earthquake and issue a prediction of impending ground motion prior to its arrival in populated or otherwise sensitive areas. One such method, Earthquake Alarm Systems (ElarmS) has been under development in southern

Gilead Wurman; Richard M. Allen; Peter Lombard

2007-01-01

47

Earthquake preparedness levels amongst youth and adults in Oakland, California  

NASA Astrophysics Data System (ADS)

The San Francisco Bay Area has not experienced a large earthquake since 1989. However, paleoseismic research shows that the Hayward fault is overdue for a large earthquake. To analyze the level of earthquake preparedness in the Oakland area (close to the Hayward fault), we surveyed over 150 people to assess their understanding of earthquakes. Our research evaluates whether increased earthquake knowledge affects people's preparedness for, and concern about, earthquake events. Data were collected using smart-phone technology and survey software at four sites across Oakland: North Oakland, Downtown, East Oakland, and a summer school program in East Oakland that draws youth from throughout the city. Preliminary results show that over 60% of interviewees have sufficient earthquake knowledge, but that over half of all interviewees are not prepared for a seismic event. Our study shows that earthquake preparedness levels in Oakland, California, vary, suggesting a need for more effective ways to disseminate information on earthquake preparedness.

Burris, M.; Arroyo-Ruiz, D.; Crockett, C.; Dixon, G.; Jones, M.; Lei, P.; Phillips, B.; Romero, D.; Scott, M.; Spears, D.; Tate, L.; Whitlock, J.; Diaz, J.; Chagolla, R.

2011-12-01

48

Mathematical principles of predicting the probabilities of large earthquakes  

E-print Network

A multicomponent random process is used as a model for the problem of space-time earthquake prediction; this allows us to develop consistent estimates of the conditional probabilities of large earthquakes when the values of the predictor characterizing the seismicity prehistory are known. We introduce tools for assessing prediction efficiency, including a separate determination of efficiency for "time prediction" and "location prediction": a generalized correlation coefficient and the density of information gain. We suggest a technique for testing the predictor to decide whether the hypothesis of no predictive power can be rejected.

Ghertzik, V M

2009-01-01

49

Recalculated probability of M ≥ 7 earthquakes beneath the Sea of Marmara, Turkey  

E-print Network

Keywords: earthquake probability, Sea of Marmara, seismic hazard, Turkey, stress interaction, North Anatolian fault. Citation: Parsons, T. (2004), Recalculated probability of M ≥ 7 earthquakes beneath the Sea of Marmara, Turkey.

50

The California Post-Earthquake Information Clearinghouse: A Plan to Learn From the Next Large California Earthquake  

Microsoft Academic Search

In the rush to remove debris after a damaging earthquake, perishable data related to a wide range of impacts on the physical, built and social environments can be lost. The California Post-Earthquake Information Clearinghouse is intended to prevent this data loss by supporting the earth scientists, engineers, and social and policy researchers who will conduct fieldwork in the affected areas

R. Loyd; S. Walter; J. Fenton; S. Tubbesing; M. Greene

2008-01-01

51

Simulations of the 1906 San Francisco Earthquake and Scenario Earthquakes in Northern California  

Microsoft Academic Search

3-D simulations of seismic ground motions are performed to better characterize the 1906 San Francisco earthquake and to investigate the seismic consequences from scenario events in northern California. Specifically, we perform simulations of: 1) the 1906 earthquake, which bilaterally ruptured a 480-km segment of the San Andreas fault from San Juan Bautista to Cape Mendocino (epicenter a few kilometers off

S. Larsen; D. Dreger; D. Dolenc

2006-01-01

52

The Effects of Static Coulomb Stress Change on Southern California Earthquake Forecasting  

NASA Astrophysics Data System (ADS)

I investigate how inclusion of static Coulomb stress changes, caused by tectonic loading and previous seismicity, contributes to the effectiveness and reliability of prospective earthquake forecasts. Several studies have shown that positive static Coulomb stress changes are associated with increased seismicity, relative to stress shadows. However, it is difficult to avoid bias when the learning and testing intervals are chosen retrospectively. I hypothesize that earthquake forecasts based on static Coulomb stress fields may improve upon existing earthquake forecasts based on historical seismicity. Within southern California, I have confirmed the aforementioned relationship between earthquake location and Coulomb stress change, but found no identifiable triggering threshold based on static Coulomb stress history at individual earthquake locations. I have also converted static Coulomb stress changes into spatially-varying earthquake rates by optimizing an index function and calculating probabilities of cells containing at least one earthquake based on Coulomb stress ranges. Inclusion of Coulomb stress effects gives an improvement in earthquake forecasts that is significant with 95% confidence, compared to smoothed seismicity null forecasts. Because of large uncertainties in Coulomb stress calculations near faults (and aftershock distributions), I combine static Coulomb stress and smoothed seismicity into a hybrid earthquake forecast. Evaluating such forecasts against those in which only Coulomb stress or smoothed seismicity determines earthquake rates indicates that Coulomb stress is more effective in the far field, whereas statistical seismology outperforms Coulomb stress near faults. Additionally, I test effects of receiver plane orientation, stress type (normal and shear components), and declustering receiver earthquakes. While static Coulomb stress shows significant potential in a prospective earthquake forecast, simplifying assumptions compromise its effectiveness. For example, we assume that crustal material within the study region is isotropic and homogeneous and purely elastic, and that pore fluid pressure variations do not significantly affect the static Coulomb stress field. Such assumptions require further research in order to detect direct earthquake triggering mechanisms.

Strader, Anne Elizabeth
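
A minimal sketch of one step described in the record above, turning a spatially varying rate forecast into per-cell probabilities (an editorial illustration assuming Poisson occurrence; the blending weight and rates are hypothetical, and the author's index-function optimization is not reproduced here).

    import math

    def prob_at_least_one(rate_per_year, years):
        """Poisson probability of at least one event in a cell over the forecast window."""
        return 1.0 - math.exp(-rate_per_year * years)

    def hybrid_rate(coulomb_rate, smoothed_rate, w_coulomb=0.5):
        """Simple linear blend of a Coulomb-stress-based rate and a smoothed-seismicity rate."""
        return w_coulomb * coulomb_rate + (1.0 - w_coulomb) * smoothed_rate

    # Hypothetical rates (events/yr) for one grid cell:
    r = hybrid_rate(coulomb_rate=0.02, smoothed_rate=0.05, w_coulomb=0.4)
    print(prob_at_least_one(r, years=5.0))   # ~0.17 for these made-up numbers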

53

Loma Prieta earthquake, October 17, 1989, Santa Cruz County, California  

SciTech Connect

On Tuesday, October 17, 1989 at 5:04 p.m. Pacific Daylight Time, a magnitude 7.1 earthquake occurred on the San Andreas fault 10 miles northeast of Santa Cruz. This earthquake was the largest earthquake to occur in the San Francisco Bay area since 1906, and the largest anywhere in California since 1952. The earthquake was responsible for 67 deaths and about 7 billion dollars worth of damage, making it the biggest dollar loss natural disaster in United States history. This article describes the seismological features of the earthquake, and briefly outlines a number of other geologic observations made during study of the earthquake, its aftershocks, and its effects. Much of the information in this article was provided by the U.S. Geological Survey (USGS).

McNutt, S.

1990-01-01

54

Keeping the History in Historical Seismology: The 1872 Owens Valley, California Earthquake  

NASA Astrophysics Data System (ADS)

The importance of historical earthquakes is being increasingly recognized. Careful investigations of key pre-instrumental earthquakes can provide critical information and insights for not only seismic hazard assessment but also earthquake science. In recent years, with the explosive growth in computational sophistication in Earth sciences, researchers have developed increasingly sophisticated methods to analyze macroseismic data quantitatively. These methodological developments can be extremely useful to exploit fully the temporally and spatially rich information source that seismic intensities often represent. For example, the exhaustive and painstaking investigations done by Ambraseys and his colleagues of early Himalayan earthquakes provide information that can be used to map out site response in the Ganges basin. In any investigation of macroseismic data, however, one must stay mindful that intensity values are not data but rather interpretations. The results of any subsequent analysis, regardless of the degree of sophistication of the methodology, will be only as reliable as the interpretations of available accounts, and only as complete as the research done to ferret out, and in many cases translate, these accounts. When intensities are assigned without an appreciation of historical setting and context, seemingly careful subsequent analysis can yield grossly inaccurate results. As a case study, I report here on the results of a recent investigation of the 1872 Owens Valley, California, earthquake. Careful consideration of macroseismic observations reveals that this event was probably larger than the great San Francisco earthquake of 1906, and possibly the largest historical earthquake in California. The results suggest that some large earthquakes in California will generate significantly larger ground motions than San Andreas fault events of comparable magnitude.

Hough, Susan E.

2008-07-01

55

Keeping the History in Historical Seismology: The 1872 Owens Valley, California Earthquake  

SciTech Connect

The importance of historical earthquakes is being increasingly recognized. Careful investigations of key pre-instrumental earthquakes can provide critical information and insights for not only seismic hazard assessment but also earthquake science. In recent years, with the explosive growth in computational sophistication in Earth sciences, researchers have developed increasingly sophisticated methods to analyze macroseismic data quantitatively. These methodological developments can be extremely useful to exploit fully the temporally and spatially rich information source that seismic intensities often represent. For example, the exhaustive and painstaking investigations done by Ambraseys and his colleagues of early Himalayan earthquakes provide information that can be used to map out site response in the Ganges basin. In any investigation of macroseismic data, however, one must stay mindful that intensity values are not data but rather interpretations. The results of any subsequent analysis, regardless of the degree of sophistication of the methodology, will be only as reliable as the interpretations of available accounts, and only as complete as the research done to ferret out, and in many cases translate, these accounts. When intensities are assigned without an appreciation of historical setting and context, seemingly careful subsequent analysis can yield grossly inaccurate results. As a case study, I report here on the results of a recent investigation of the 1872 Owens Valley, California, earthquake. Careful consideration of macroseismic observations reveals that this event was probably larger than the great San Francisco earthquake of 1906, and possibly the largest historical earthquake in California. The results suggest that some large earthquakes in California will generate significantly larger ground motions than San Andreas fault events of comparable magnitude.

Hough, Susan E. [U.S. Geological Survey, 525 South Wilson Avenue, Pasadena, California 91106 (United States)

2008-07-08

56

The initial subevent of the 1994 Northridge, California, earthquake: Is earthquake size predictable?  

USGS Publications Warehouse

We examine the initial subevent (ISE) of the M 6.7, 1994 Northridge, California, earthquake in order to discriminate between two end-member rupture initiation models: the 'preslip' and 'cascade' models. Final earthquake size may be predictable from an ISE's seismic signature in the preslip model but not in the cascade model. In the cascade model ISEs are simply small earthquakes that can be described as purely dynamic ruptures. In this model a large earthquake is triggered by smaller earthquakes; there is no size scaling between triggering and triggered events, and a variety of stress transfer mechanisms are possible. Alternatively, in the preslip model, a large earthquake nucleates as an aseismically slipping patch in which the patch dimension grows and scales with the earthquake's ultimate size; the byproduct of this loading process is the ISE. In this model, the duration of the ISE signal scales with the ultimate size of the earthquake, suggesting that nucleation and earthquake size are determined by a more predictable, measurable, and organized process. To distinguish between these two end-member models we use short period seismograms recorded by the Southern California Seismic Network. We address questions regarding the similarity in hypocenter locations and focal mechanisms of the ISE and the mainshock. We also compare the ISE's waveform characteristics to those of small earthquakes and to the beginnings of earthquakes with a range of magnitudes. We find that the focal mechanisms of the ISE and mainshock are indistinguishable, and both events may have nucleated on and ruptured the same fault plane. These results satisfy the requirements for both models and thus do not discriminate between them. However, further tests show the ISE's waveform characteristics are similar to those of typical small earthquakes in the vicinity and, more importantly, do not scale with the mainshock magnitude. These results are more consistent with the cascade model.

Kilb, Debi; Gomberg, J.

1999-01-01

57

Depth dependence of earthquake frequency-magnitude distributions in California: Implications for rupture initiation  

USGS Publications Warehouse

Statistics of earthquakes in California show linear frequency-magnitude relationships in the range of M2.0 to M5.5 for various data sets. Assuming Gutenberg-Richter distributions, there is a systematic decrease in b value with increasing depth of earthquakes. We find consistent results for various data sets from northern and southern California that both include and exclude the larger aftershock sequences. We suggest that at shallow depth (~0 to 6 km) conditions with more heterogeneous material properties and lower lithospheric stress prevail. Rupture initiations are more likely to stop before growing into large earthquakes, producing relatively more small earthquakes and consequently higher b values. These ideas help to explain the depth-dependent observations of foreshocks in the western United States. The higher occurrence rate of foreshocks preceding shallow earthquakes can be interpreted in terms of rupture initiations that are stopped before growing into the mainshock. At greater depth (9-15 km), any rupture initiation is more likely to continue growing into a larger event, so there are fewer foreshocks. If one assumes that frequency-magnitude statistics can be used to estimate probabilities of a small rupture initiation growing into a larger earthquake, then a small (M2) rupture initiation at 9 to 12 km depth is 18 times more likely to grow into a M5.5 or larger event, compared to the same small rupture initiation at 0 to 3 km. Copyright 1997 by the American Geophysical Union.

Mori, J.; Abercrombie, R.E.

1997-01-01
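
A quick worked example of the scaling argument in the record above (editor's addition; the b-values are hypothetical, not the paper's fitted values): if frequency-magnitude statistics stand in for the chance that an M2 initiation grows into an M5.5 or larger event, the ratio of that chance between two depth ranges depends only on the difference in b-value over the 3.5 magnitude units.

    def p_grow(b, m_start=2.0, m_target=5.5):
        """P(M >= m_target | M >= m_start) for a Gutenberg-Richter distribution with b-value b."""
        return 10.0 ** (-b * (m_target - m_start))

    b_shallow, b_deep = 1.21, 0.85     # hypothetical b-values for ~0-3 km and 9-12 km depth
    print(round(p_grow(b_deep) / p_grow(b_shallow), 1))   # ~18: deep initiations far likelier to grow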

58

Scientific Challenges in Developing the Next Uniform California Earthquake Rupture Forecast (UCERF3)  

NASA Astrophysics Data System (ADS)

The Working Group on California Earthquake Probabilities (WGCEP) is in the process of developing the next-generation Uniform California Earthquake Rupture Forecast (UCERF version 3). The main goals for this future model, which is being developed jointly by the United States Geological Survey, California Geological Survey, and Southern California Earthquake Center, are to include multi-fault ruptures and spatial and temporal clustering. While there is a broad range of challenges associated with the development, implementation, and use of this model, the intent of this presentation is to give an overview of some of the most pressing scientific issues. These questions can be distilled as follows: 1) Does every small volume of space exhibit a Gutenberg-Richter distribution of nucleations? 2) What is the average slip distribution of large events, both down dip and along strike? 3) How do we apply elastic rebound in an un-segmented fault model? 4) How can we quantify fault-to-fault rupture probabilities, especially given uncertainties in fault endpoints? 5) What constitutes “best available science” with respect to spatial and temporal clustering models? 6) What is the explanation for the apparent post-1906 seismicity-rate reduction? Each of these questions will be described and exemplified, together with our current plans for addressing them.

Field, E. H.

2009-12-01

59

Triggering of repeating earthquakes in central California  

USGS Publications Warehouse

Dynamic stresses carried by transient seismic waves have been found capable of triggering earthquakes instantly in various tectonic settings. Delayed triggering may be even more common, but the mechanisms are not well understood. Catalogs of repeating earthquakes, earthquakes that recur repeatedly at the same location, provide ideal data sets to test the effects of transient dynamic perturbations on the timing of earthquake occurrence. Here we employ a catalog of 165 families containing ~2500 total repeating earthquakes to test whether dynamic perturbations from local, regional, and teleseismic earthquakes change recurrence intervals. The distance to the earthquake generating the perturbing waves is a proxy for the relative potential contributions of static and dynamic deformations, because static deformations decay more rapidly with distance. Clear changes followed the nearby 2004 Mw6 Parkfield earthquake, so we study only repeaters prior to its origin time. We apply a Monte Carlo approach to compare the observed number of shortened recurrence intervals following dynamic perturbations with the distribution of this number estimated for randomized perturbation times. We examine the comparison for a series of dynamic stress peak amplitude and distance thresholds. The results suggest a weak correlation between dynamic perturbations in excess of ~20 kPa and shortened recurrence intervals, for both nearby and remote perturbations.

Wu, Chunquan; Gomberg, Joan; Ben-Naim, Eli; Johnson, Paul

2014-01-01
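
A sketch of the Monte Carlo logic summarized above, using synthetic numbers (an editorial illustration, not the authors' code): the observed count of shortened recurrence intervals following perturbations is compared with the distribution of that count when the perturbation labels are randomized.

    import random

    def count_shortened(intervals, perturbed_flags):
        """Count perturbed intervals that are shorter than the sequence median."""
        med = sorted(intervals)[len(intervals) // 2]
        return sum(1 for dt, hit in zip(intervals, perturbed_flags) if hit and dt < med)

    def monte_carlo_pvalue(intervals, perturbed_flags, n_trials=10000, seed=0):
        rng = random.Random(seed)
        observed = count_shortened(intervals, perturbed_flags)
        exceed = 0
        for _ in range(n_trials):
            shuffled = rng.sample(perturbed_flags, len(perturbed_flags))
            if count_shortened(intervals, shuffled) >= observed:
                exceed += 1
        return observed, exceed / n_trials   # a small p-value means chance shortening is unlikely

    # Synthetic example: 12 recurrence intervals (yr); 4 follow a >20 kPa dynamic transient.
    intervals = [2.1, 1.9, 2.3, 1.2, 2.0, 1.1, 2.2, 1.0, 2.4, 2.1, 1.3, 2.2]
    perturbed = [False, False, False, True, False, True, False, True, False, False, True, False]
    print(monte_carlo_pvalue(intervals, perturbed))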

60

Chapter F. The Loma Prieta, California, Earthquake of October 17, 1989 - Tectonic Processes and Models  

USGS Publications Warehouse

If there is a single theme that unifies the diverse papers in this chapter, it is the attempt to understand the role of the Loma Prieta earthquake in the context of the earthquake 'machine' in northern California: as the latest event in a long history of shocks in the San Francisco Bay region, as an incremental contributor to the regional deformation pattern, and as a possible harbinger of future large earthquakes. One of the surprises generated by the earthquake was the rather large amount of uplift that occurred as a result of the reverse component of slip on the southwest-dipping fault plane. Preearthquake conventional wisdom had been that large earthquakes in the region would probably be caused by horizontal, right-lateral, strike-slip motion on vertical fault planes. In retrospect, the high topography of the Santa Cruz Mountains and the elevated marine terraces along the coast should have provided some clues. With the observed ocean retreat and the obvious uplift of the coast near Santa Cruz that accompanied the earthquake, Mother Nature was finally caught in the act. Several investigators quickly saw the connection between the earthquake uplift and the long-term evolution of the Santa Cruz Mountains and realized that important insights were to be gained by attempting to quantify the process of crustal deformation in terms of Loma Prieta-type increments of northward transport and fault-normal shortening.

Simpson, Robert W.

1994-01-01

61

Earthquake Probability-based Automated Decision-making Framework for Earthquake Early Warning Applications  

NASA Astrophysics Data System (ADS)

The benefits and feasibility of Earthquake Early Warning (EEW) are becoming more appreciated throughout the world. An EEW system detects an earthquake initiation with a seismic sensor network and broadcasts a warning of the predicted location and magnitude shortly before the strong shaking reaches a site. The typical lead time is very short, on the order of tens of seconds to a minute, which poses a major challenge for applications taking advantage of EEW. As a result, a robust automated decision process about whether to initiate a mitigation action is essential. One recent approach proposes taking an action upon exceedance of a fixed threshold for an intensity, damage, or loss measure, but the determination of the threshold value remains an open question. Other approaches propose a decision-making framework based on cost-benefit analysis. However, a general framework that can handle multiple-action decisions, lead time, and its uncertainty is still missing. In this study, a more robust decision criterion based on a new cost-benefit analysis procedure is proposed as part of an earthquake probability-based automated decision-making (ePAD) framework. An illustrative example is presented to demonstrate how to explicitly handle multi-action decisions and lead-time uncertainty. (Figure: summary of the ePAD framework.)

Wu, S.; Beck, J.; Heaton, T. H.

2012-12-01
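
A minimal cost-benefit decision rule of the kind the record above argues for (an editorial sketch of the general idea, not the ePAD framework itself; all costs and probabilities are hypothetical): take a mitigation action when its expected cost, including the cost of acting, is lower than the expected cost of doing nothing.

    def expected_costs(action_cost, p_damaging, loss_unmitigated, loss_mitigated):
        """Expected cost of acting vs. waiting, given the probability of damaging shaking."""
        act = action_cost + p_damaging * loss_mitigated
        wait = p_damaging * loss_unmitigated
        return act, wait

    def decide(action_cost, p_damaging, loss_unmitigated, loss_mitigated):
        act, wait = expected_costs(action_cost, p_damaging, loss_unmitigated, loss_mitigated)
        return "take action" if act < wait else "do nothing"

    # e.g., halting a production line (arbitrary cost units):
    print(decide(action_cost=5.0, p_damaging=0.3, loss_unmitigated=100.0, loss_mitigated=20.0))
    # -> "take action", since 5 + 0.3*20 = 11 < 0.3*100 = 30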

62

Southern California Earthquake Center (SCEC) Community Fault Model (over 150 major faults of Southern California)  

NSDL National Science Digital Library

This is a movie made using the SCEC-VDO software showing a 3D animation of the SCEC Community Fault Model (over 150 major faults of Southern California). The movie highlights the San Andreas and Puente Hills faults. The Southern California Earthquake Center's Virtual Display of Objects (SCEC-VDO) is 3D visualization software that allows users to display, study and make movies of earthquakes as they occur globally. SCEC-VDO was developed by interns of SCEC Undergraduate Studies in Earthquake Information Technology (UseIT), under the supervision of Sue Perry and Tom Jordan.

63

Search for seismic forerunners to earthquakes in central California  

USGS Publications Warehouse

The relatively high seismicity of the San Andreas fault zone in central California provides an excellent opportunity to search for seismic forerunners to moderate earthquakes. Analysis of seismic traveltime and earthquake location data has resulted in the identification of two possible seismic forerunners. The first is a period of apparently late (0.3 sec) P-wave arrival times lasting several weeks preceding one earthquake of magnitude 5.0. The rays for these travel paths passed through - or very close to - the aftershock volume of the subsequent earthquake. The sources for these P-arrival time data were earthquakes in the distance range 20-70 km. Uncertainties in the influence of small changes in the hypocenters of the source earthquakes and in the identification of small P-arrivals raise the possibility that the apparently delayed arrivals are not the result of a decrease in P-velocity. The second possible precursor is an apparent increase in the average depth of earthquakes preceding two moderate earthquakes. This change might be only apparent, caused by a location bias introduced by a decrease in P-wave velocity, but numerical modeling for realistic possible changes in velocity suggests that the observed effect is more likely a true migration of earthquakes. To carry out this work - involving the manipulation of several thousand earthquake hypocenters and several hundred thousand readings of arrival time - a system of data storage was designed and manipulation programs for a large digital computer were executed. This system allows, for example, the automatic selection of earthquakes from a specific region, the extraction of all the observed arrival times for these events, and their relocation under a chosen set of assumptions. © 1977.

Wesson, R.L.; Robinson, R.; Bufe, C.G.; Ellsworth, W.L.; Pfluke, J.H.; Steppe, J.A.; Seekins, L.C.

1977-01-01

64

Earthquakes near Parkfield, California: Comparing the 1934 and 1966 sequences  

USGS Publications Warehouse

Moderate-sized earthquakes (Richter magnitude ML 5 1/2) have occurred four times this century (1901, 1922, 1934, and 1966) on the San Andreas fault near Parkfield in central California. In many respects the June 1966 sequence was a remarkably detailed repetition of the June 1934 sequence, suggesting a recurring recognizable pattern of stress and fault zone behavior.

Bakun, W.H.; McEvilly, T.V.

1979-01-01

65

Contrasts between source parameters of M ≥ 5.5 earthquakes in northern Baja California and southern California  

Microsoft Academic Search

Source parameters determined from the body waveform modeling of large (M ≥ 5.5) historic earthquakes occurring between 1915 and 1956 along the San Jacinto and Imperial fault zones of southern California and the Cerro Prieto, Tres Hermanas and San Miguel fault zones of Baja California have been combined with information from post-1960s events to study regional variations in source parameters.

Doser

1993-01-01

66

Distribution and Characteristics of Repeating Earthquakes in Northern California  

NASA Astrophysics Data System (ADS)

Repeating earthquakes are playing an increasingly important role in the study of fault processes and behavior, and have the potential to improve hazard assessment, earthquake forecast, and seismic monitoring capabilities. These events rupture the same fault patch repeatedly, generating virtually identical seismograms. In California, repeating earthquakes have been found predominately along the creeping section of the central San Andreas Fault, where they are believed to represent failing asperities on an otherwise creeping fault. Here, we use the northern California double-difference catalog of 450,000 precisely located events (1984-2009) and associated database of 2 billion waveform cross-correlation measurements to systematically search for repeating earthquakes across various tectonic regions. An initial search for pairs of earthquakes with high-correlation coefficients and similar magnitudes resulted in 4,610 clusters including a total of over 26,000 earthquakes. A subsequent double-difference re-analysis of these clusters resulted in 1,879 sequences (8,640 events) where a common rupture area can be resolved to the precision of a few tens of meters or less. These repeating earthquake sequences (RES) include between 3 and 24 events with magnitudes up to ML=4. We compute precise relative magnitudes between events in each sequence from differential amplitude measurements. Differences between these and standard coda-duration magnitudes have a standard deviation of 0.09. The RES occur throughout northern California, but RES with 10 or more events (6%) only occur along the central San Andreas and Calaveras faults. We are establishing baseline characteristics for each sequence, such as recurrence intervals and their coefficient of variation (CV), in order to compare them across tectonic regions. CVs for these clusters range from 0.002 to 2.6, indicating a range of behavior between periodic occurrence (CV~0), random occurrence, and temporal clustering. 10% of the RES show burst-like behavior with mean recurrence times smaller than one month. 5% of the RES have mean recurrence times greater than one year and include more than 10 earthquakes. Earthquakes in the 50 most periodic sequences (CV<0.2) do not appear to be predictable by either time- or slip-predictable models, consistent with previous findings. We demonstrate that changes in recurrence intervals of repeating earthquakes can be routinely monitored. This is especially important for sequences with CV~0, as they may indicate changes in the loading rate. We also present results from retrospective forecast experiments based on near-real time hazard functions.

Waldhauser, F.; Schaff, D. P.; Zechar, J. D.; Shaw, B. E.

2012-12-01
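
A short sketch of the recurrence-interval statistic used in the record above (editor's illustration with synthetic event times): the coefficient of variation, CV = standard deviation / mean of the inter-event times, is near 0 for quasi-periodic sequences, near 1 for Poisson-like behavior, and above 1 for burst-like clustering.

    import statistics

    def recurrence_cv(event_times):
        """Coefficient of variation of the recurrence intervals of one repeating sequence."""
        times = sorted(event_times)
        intervals = [t2 - t1 for t1, t2 in zip(times, times[1:])]
        return statistics.pstdev(intervals) / statistics.mean(intervals)

    quasi_periodic = [1990.0, 1992.1, 1994.0, 1996.1, 1998.0, 2000.1]   # years
    clustered      = [1990.0, 1990.2, 1990.3, 1999.0, 1999.1, 2006.0]
    print(round(recurrence_cv(quasi_periodic), 2))   # ~0.05, near-periodic
    print(round(recurrence_cv(clustered), 2))        # ~1.2, burst-like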

67

Intensity earthquake scenario (scenario event - a damaging earthquake with higher probability of occurrence) for the city of Sofia  

NASA Astrophysics Data System (ADS)

Among the many kinds of natural and man-made disasters, earthquakes dominate with regard to their social and economic impact on the urban environment. Global seismic risk is increasing steadily as urbanization and development occupy more areas that are prone to the effects of strong earthquakes. Additionally, the uncontrolled growth of mega cities in highly seismic areas around the world is often associated with the construction of seismically unsafe buildings and infrastructure, and is undertaken with insufficient knowledge of the regional seismicity and seismic hazard. The assessment of seismic hazard and the generation of earthquake scenarios are the first link in the prevention chain and the first step in the evaluation of seismic risk. Earthquake scenarios are intended as a basic input for developing detailed earthquake damage scenarios for cities and can be used in earthquake-safe town and infrastructure planning. The city of Sofia is the capital of Bulgaria. It is situated in the centre of the Sofia area, the most populated (more than 1.2 million inhabitants), industrial, and cultural region of Bulgaria, which faces considerable earthquake risk. Historical documents attest to the occurrence of destructive earthquakes in the Sofia zone during the 15th-18th centuries. In the 19th century the city of Sofia experienced two strong earthquakes: the 1818 earthquake with epicentral intensity I0=8-9 MSK and the 1858 earthquake with I0=9-10 MSK. During the 20th century the strongest event in the vicinity of the city of Sofia was the 1917 earthquake with MS=5.3 (I0=7-8 MSK). Almost a century (95 years) later, an earthquake of moment magnitude 5.6 (I0=7-8 MSK) hit the city of Sofia on May 22nd, 2012. In the present study, the deterministic scenario event is a damaging earthquake with a higher probability of occurrence that could affect the city with intensity less than or equal to VIII. Usable and realistic ground motion maps for urban areas are generated either from the assumption of a "reference earthquake" or directly, from the values of macroseismic intensity generated by a damaging, real earthquake. Applying the deterministic approach, an earthquake scenario in macroseismic intensity (a "model" earthquake scenario) is generated for the city of Sofia. The deterministic "model" intensity scenario, based on the assumption of a "reference earthquake," is compared with a scenario based on the macroseismic effects observed during the damaging 2012 earthquake (MW5.6). The difference between observed (Io) and predicted (Ip) intensity values is analyzed.

Aleksandrova, Irena; Simeonova, Stela; Solakov, Dimcho; Popova, Maria

2014-05-01

68

Long Period Earthquakes Beneath California's Young and Restless Volcanoes  

NASA Astrophysics Data System (ADS)

The newly established USGS California Volcano Observatory has the broad responsibility of monitoring and assessing hazards at California's potentially threatening volcanoes, most notably Mount Shasta, Medicine Lake, Clear Lake Volcanic Field, and Lassen Volcanic Center in northern California; and Long Valley Caldera, Mammoth Mountain, and Mono-Inyo Craters in east-central California. Volcanic eruptions occur in California about as frequently as the largest San Andreas Fault Zone earthquakes: more than ten eruptions have occurred in the last 1,000 years, most recently at Lassen Peak (1666 C.E. and 1914-1917 C.E.) and Mono-Inyo Craters (c. 1700 C.E.). The Long Valley region (Long Valley caldera and Mammoth Mountain) underwent several episodes of heightened unrest over the last three decades, including intense swarms of volcano-tectonic (VT) earthquakes, rapid caldera uplift, and hazardous CO2 emissions. Both Medicine Lake and Lassen are subsiding at appreciable rates, and along with Clear Lake, Long Valley Caldera, and Mammoth Mountain, sporadically experience long period (LP) earthquakes related to migration of magmatic or hydrothermal fluids. Worldwide, the last two decades have shown the importance of tracking LP earthquakes beneath young volcanic systems, as they often provide an indication of impending unrest or eruption. Herein we document the occurrence of LP earthquakes at several of California's young volcanoes, updating a previous study published in Pitt et al., 2002, SRL. All events were detected and located using data from stations within the Northern California Seismic Network (NCSN). Event detection was spatially and temporally uneven across the NCSN in the 1980s and 1990s, but additional stations, adoption of the Earthworm processing system, and heightened vigilance by seismologists have improved the catalog over the last decade. LP earthquakes are now relatively well-recorded under Lassen (~150 events since 2000), Clear Lake (~60 events), Mammoth Mountain (~320 events), and Long Valley Caldera (~40 events). LP earthquakes are notably absent under Mount Shasta. With the exception of Long Valley Caldera, where LP earthquakes occur at depths of ≤5 km, hypocenters are generally between 15 and 25 km. The rates of LP occurrence over the last decade have been relatively steady within the study areas, except at Mammoth Mountain, where LP activity, after years of gradual decline, abruptly increased following a swarm of unusually deep (20 km) VT earthquakes in October 2012. Epicenter locations relative to the sites of most recent volcanism vary across volcanic centers, but most LP earthquakes fall within 10 km of young vents. Source models for LP earthquakes often involve the resonance of fluid-filled cracks or nonlinear flow of fluids along irregular cracks (reviewed in Chouet and Matoza, 2013, JVGR). At mid-crustal depths the relevant fluids are likely to be low-viscosity basaltic melt and/or exsolved CO2-rich volatiles (Lassen, Clear Lake, Mammoth Mountain). In the shallow crust, however, hydrothermal waters/gases are likely involved in the generation of LP seismicity (Long Valley Caldera).

Pitt, A. M.; Dawson, P. B.; Shelly, D. R.; Hill, D. P.; Mangan, M.

2013-12-01

69

Southern California Earthquake Center: Operates 1991-present, $3-$5 million per year  

E-print Network

Southern California Earthquake Center: operates 1991-present at $3-$5 million per year (NSF, USC); high-profile seismic hazard reports from 1993; community databases (faults, earthquakes, 3-D faults); quake rates elsewhere; putting it all together in the Uniform California Earthquake Rupture Forecast.

70

Losses to single-family housing from ground motions in the 1994 Northridge, California, earthquake  

USGS Publications Warehouse

The distributions of insured losses to single-family housing following the 1994 Northridge, California, earthquake for 234 ZIP codes can be satisfactorily modeled with gamma distributions. Regressions of the parameters in the gamma distribution on estimates of ground motion, derived from ShakeMap estimates or from interpolated observations, provide a basis for developing curves of conditional probability of loss given a ground motion. Comparison of the resulting estimates of aggregate loss with the actual aggregate loss gives satisfactory agreement for several different ground-motion parameters. Estimates of loss based on a deterministic spatial model of the earthquake ground motion, using standard attenuation relationships and NEHRP soil factors, give satisfactory results for some ground-motion parameters if the input ground motions are increased about one and one-half standard deviations above the median, reflecting the fact that the ground motions for the Northridge earthquake tended to be higher than the median ground motion for other earthquakes with similar magnitude. The results give promise for making estimates of insured losses to a similar building stock under future earthquake loading. © 2004, Earthquake Engineering Research Institute.

Wesson, R.L.; Perkins, D.M.; Leyendecker, E.V.; Roth, R.J., Jr.; Petersen, M.D.

2004-01-01
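
A hedged sketch of the loss model described above (editor's illustration; the paper's fitted regressions and parameter values are not reproduced): losses in a ZIP code are treated as gamma-distributed with parameters tied to the ground-motion level, and the fitted distribution then gives conditional probabilities of exceeding a given loss.

    import random

    def prob_loss_exceeds(loss_fraction, shape, scale, n=200_000, seed=1):
        """Monte Carlo estimate of P(loss > loss_fraction) for a gamma(shape, scale) loss model."""
        rng = random.Random(seed)
        return sum(rng.gammavariate(shape, scale) > loss_fraction for _ in range(n)) / n

    # Hypothetical gamma parameters for one strongly shaken ZIP code (loss as a fraction of value):
    print(round(prob_loss_exceeds(0.10, shape=0.5, scale=0.08), 3))   # ~0.11 for these values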

71

Crustal deformation in great California earthquake cycles  

NASA Technical Reports Server (NTRS)

Periodic crustal deformation associated with repeated strike-slip earthquakes is computed for the following model: A depth L (less than or similar to H) extending downward from the Earth's surface at a transform boundary between uniform elastic lithospheric plates of thickness H is locked between earthquakes. It slips an amount consistent with remote plate velocity V_pl after each lapse of earthquake cycle time T_cy. Lower portions of the fault zone at the boundary slip continuously so as to maintain constant resistive shear stress. The plates are coupled at their base to a Maxwellian viscoelastic asthenosphere through which steady deep-seated mantle motions, compatible with plate velocity, are transmitted to the surface plates. The coupling is described approximately through a generalized Elsasser model. It is argued that the model gives a more realistic physical description of tectonic loading, including the time dependence of deep slip and crustal stress build-up throughout the earthquake cycle, than do simpler kinematic models in which loading is represented as imposed uniform dislocation slip on the fault below the locked zone.

Li, Victor C.; Rice, James R.

1986-01-01
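
The loading condition in the record above can be restated compactly (an editorial restatement, with illustrative numbers): in each cycle the locked depth range recovers the slip deficit accumulated at the remote plate velocity,

    \[
      \Delta u \;=\; V_{\mathrm{pl}}\, T_{\mathrm{cy}} ,
    \]

so a hypothetical plate velocity of 35 mm/yr and cycle time of 150 yr would imply roughly 5 m of coseismic slip.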

72

Southern California Earthquake Center (SCEC) Summer Internship Programs  

NASA Astrophysics Data System (ADS)

For the eleventh consecutive year, the Southern California Earthquake Center (SCEC) coordinated undergraduate research experiences in summer 2004, allowing 35 students with a broad array of backgrounds and interests to work with the world's preeminent earthquake scientists and specialists. Students participate in interdisciplinary, system-level earthquake science and information technology research, and several group activities throughout the summer. Funding for student stipends and activities is made possible by the NSF Research Experiences for Undergraduates (REU) program. SCEC coordinates two intern programs: The SCEC Summer Undergraduate Research Experience (SCEC/SURE) and the SCEC Undergraduate Summer in Earthquake Information Technology (SCEC/USEIT). SCEC/SURE interns work one-on-one with SCEC scientists at their institutions on a variety of earthquake science research projects. The goals of the program are to expand student participation in the earth sciences and related disciplines, encourage students to consider careers in research and education, and to increase diversity of students and researchers in the earth sciences. 13 students participated in this program in 2004. SCEC/USEIT is an NSF REU site that brings undergraduate students from across the country to the University of Southern California each summer. SCEC/USEIT interns interact in a team-oriented research environment and are mentored by some of the nation's most distinguished geoscience and computer science researchers. The goals of the program are to allow undergraduates to use advanced tools of information technology to solve problems in earthquake research; close the gap between computer science and geoscience; and engage non-geoscience majors in the application of earth science to the practical problems of reducing earthquake risk. SCEC/USEIT summer research goals are structured around a grand challenge problem in earthquake information technology. For the past three years the students have developed a new earthquake and fault visualization platform named "LA3D." 22 students participated in this program in 2004. SCEC Interns come together several times during the summer, beginning with a Communication Workshop that develops the student's oral and written communication skills. In mid-summer, a one-day SCEC Intern Colloquium is held, where student researchers present status reports on their research, followed by a three-day field trip of southern California geology and SCEC research locations. Finally, at the end of the summer each student presents a poster at the SCEC Annual Meeting.

Benthien, M. L.; Perry, S.; Jordan, T. H.

2004-12-01

73

Crustal deformation in Great California Earthquake cycles  

NASA Technical Reports Server (NTRS)

A model in which coupling is described approximately through a generalized Elsasser model is proposed for computation of the periodic crustal deformation associated with repeated strike-slip earthquakes. The model is found to provide a more realistic physical description of tectonic loading than do simpler kinematic models. Parameters are chosen to model the 1857 and 1906 San Andreas ruptures, and predictions are found to be consistent with data on variations of contemporary surface strain and displacement rates as a function of distance from the 1857 and 1906 rupture traces. Results indicate that the asthenosphere appropriate to describe crustal deformation on the earthquake cycle time scale lies in the lower crust and perhaps the crust-mantle transition zone.

Li, Victor C.; Rice, James R.

1987-01-01

74

Earthquake Prediction  

NSDL National Science Digital Library

This video segment adapted from NOVA tells the tragic story of two Japanese seismologists who disagreed about the threat of earthquakes in the early twentieth century. Today, seismologists in California offer residents estimates of the probability that an earthquake will occur.

2005-12-17

75

MOHO ORIENTATION BENEATH CENTRAL CALIFORNIA FROM REGIONAL EARTHQUAKE TRAVEL TIMES.  

USGS Publications Warehouse

This paper examines relative Pn arrival times, recorded by the U. S. Geological Survey seismic network in central and northern California from an azimuthally distributed set of regional earthquakes. Improved estimates are presented of upper mantle velocities in the Coast Ranges, Great Valley, and Sierra Nevada foothills and estimates of the orientation of the Moho throughout this region. Finally, the azimuthal distribution of apparent velocities, corrected for dip and individual station travel time effects, is then studied for evidence of upper mantle velocity anisotropy and for indications of lower crustal structure in central California.

Oppenheimer, David H.; Eaton, Jerry P.

1984-01-01

76

Chapter C. The Loma Prieta, California, Earthquake of October 17, 1989 - Landslides  

USGS Publications Warehouse

Central California, in the vicinity of San Francisco and Monterey Bays, has a history of fatal and damaging landslides, triggered by heavy rainfall, coastal and stream erosion, construction activity, and earthquakes. The great 1906 San Francisco earthquake (MS=8.2-8.3) generated more than 10,000 landslides throughout an area of 32,000 km2; these landslides killed at least 11 people and caused substantial damage to buildings, roads, railroads, and other civil works. Smaller numbers of landslides, which caused more localized damage, have also been reported from at least 20 other earthquakes that have occurred in the San Francisco Bay-Monterey Bay region since 1838. Conditions that make this region particularly susceptible to landslides include steep and rugged topography, weak rock and soil materials, seasonally heavy rainfall, and active seismicity. Given these conditions and history, it was no surprise that the 1989 Loma Prieta earthquake generated thousands of landslides throughout the region. Landslides caused one fatality and damaged at least 200 residences, numerous roads, and many other structures. Direct damage from landslides probably exceeded $30 million; additional, indirect economic losses were caused by long-term landslide blockage of two major highways and by delays in rebuilding brought about by concern over the potential long-term instability of some earthquake-damaged slopes.

Keefer, David K., (Edited By)

1998-01-01

77

Forecast model for moderate earthquakes near Parkfield, California  

NASA Astrophysics Data System (ADS)

Earthquake instability models have possible application to earthquake forecasting because the models simulate both preseismic and coseismic changes of fault slip and ground deformation. In the forecast procedure proposed here, repeated measurements of preseismic fault slip and ground deformation constrain the values of model parameters. The early part of the model simulation corresponds to the available field data, and the subsequent part constitutes an estimate of future faulting and ground deformation. In particular, the time, location, and size of unstable faulting are estimates of the pending earthquake parameters. The forecast accuracy depends on the model realism and parameter resolution. The forecast procedure is applied to fault creep and trilateration data measured near Parkfield, California, where at least five magnitude 5.5 to 6 earthquakes have occurred regularly since 1881, the last in 1966. The quasi-static model consists of a flat vertical plane embedded in an elastic half space. Spatially variable fault slip of strike-slip sense is driven by an increasing regional shear stress but is impeded by a relatively strong patch of brittle, strain-softening fault. The field data are consistent with these approximate values of patch parameters: radius of 3 km, patch center 5 km deep and 8 km southeast of the 1966 epicenter, and maximum brittle strength of 26 bars. Fluctuations in the available field data prevent estimating the earthquake time with any more precision than use of the 21±8 year recurrence interval. However, the model may later give a more precise estimate of the earthquake time if the fault slip rate near the inferred patch increases before the earthquake, as predicted by the model.

Stuart, William D.; Archuleta, Ralph J.; Lindh, Allan G.

1985-01-01

78

The California Post-Earthquake Information Clearinghouse: A Plan to Learn From the Next Large California Earthquake  

NASA Astrophysics Data System (ADS)

In the rush to remove debris after a damaging earthquake, perishable data related to a wide range of impacts on the physical, built and social environments can be lost. The California Post-Earthquake Information Clearinghouse is intended to prevent this data loss by supporting the earth scientists, engineers, and social and policy researchers who will conduct fieldwork in the affected areas in the hours and days following the earthquake to study these effects. First called for by Governor Ronald Reagan following the destructive M6.5 San Fernando earthquake in 1971, the concept of the Clearinghouse has since been incorporated into the response plans of the National Earthquake Hazard Reduction Program (USGS Circular 1242). This presentation is intended to acquaint scientists with the purpose, functions, and services of the Clearinghouse. Typically, the Clearinghouse is set up in the vicinity of the earthquake within 24 hours of the mainshock and is maintained for several days to several weeks. It provides a location where field researchers can assemble to share and discuss their observations, plan and coordinate subsequent field work, and communicate significant findings directly to the emergency responders and to the public through press conferences. As the immediate response effort winds down, the Clearinghouse will ensure that collected data are archived and made available through "lessons learned" reports and publications that follow significant earthquakes. Participants in the quarterly meetings of the Clearinghouse include representatives from state and federal agencies, universities, NGOs and other private groups. Overall management of the Clearinghouse is delegated to the agencies represented by the authors above.

Loyd, R.; Walter, S.; Fenton, J.; Tubbesing, S.; Greene, M.

2008-12-01

79

Variability of Near-Term Probability for the Next Great Earthquake on the Cascadia Subduction Zone  

Microsoft Academic Search

The threat of a great (M 9) earthquake along the Cascadia subduction zone is evidenced by both paleoseismology data and current strain accumulation along the fault. On the basis of recent information on the characteristics of this subduction system, we estimate the conditional probabilities of a great earthquake occurring within the next 50 years and their variabilities. The most important

Stephane Mazzotti; John Adams

2004-01-01

80

Dynamic Triggering of Earthquakes in the Salton Sea Region of Southern California from Large Regional and Teleseismic Earthquakes  

Microsoft Academic Search

We perform a systematic survey of dynamically triggered earthquakes in the Salton Sea region of southern California using borehole seismic data recordings (2007 to present). We define triggered events as high-frequency seismic energy during large-amplitude seismic waves of distant earthquakes. Our mainshock database includes 26 teleseismic events (epicentral distances > 1000 km; Mw >= 7.5), and 8 regional events (epicentral

A. Doran; X. Meng; Z. Peng; C. Wu; D. L. Kilb

2010-01-01

81

Earthquake epicenters and fault intersections in central and southern California  

NASA Technical Reports Server (NTRS)

The author has identified the following significant results. ERTS-1 imagery provided evidence for the existence of short transverse fault segments lodged between faults of the San Andreas system in the Coast Ranges, California. They indicate that an early episode of transverse shear affected the Coast Ranges prior to the establishment of the present San Andreas fault. The fault has been offset by transverse faults of the Transverse Ranges. It appears feasible to identify from ERTS-1 imagery geomorphic criteria of recent fault movements. Plots of historic earthquakes in the Coast Ranges and western Transverse Ranges show clusters in areas where structures are complicated by the interaction of two active fault systems. A fault lineament apparently not previously mapped was identified in the Uinta Mountains, Utah. Part of the lineament shows evidence of recent faulting, which corresponds to a moderate earthquake cluster.

Abdel-Gawad, M. (principal investigator); Silverstein, J.

1972-01-01

82

Seismic velocity variations along the rupture zone of the 1989 Loma Prieta earthquake, California  

E-print Network

Seismic velocity variations along the rupture zone of the 1989 Loma Prieta earthquake, California, are examined. The Mw 6.9 Loma Prieta earthquake was the first major event to occur along the San Andreas fault zone since the 1906 San Francisco earthquake, and the majority of previous tomographic studies in the area were carried out to study it.

Lin, Guoqing

83

Geometry and earthquake potential of the shoreline fault, central California  

USGS Publications Warehouse

The Shoreline fault is a vertical strike-slip fault running along the coastline near San Luis Obispo, California. Much is unknown about the Shoreline fault, including its slip rate and the details of its geometry. Here, I study the geometry of the Shoreline fault at seismogenic depth, as well as the adjacent section of the offshore Hosgri fault, using seismicity relocations and earthquake focal mechanisms. The Optimal Anisotropic Dynamic Clustering (OADC) algorithm (Ouillon et al., 2008) is used to objectively identify the simplest planar fault geometry that fits all of the earthquakes to within their location uncertainty. The OADC results show that the Shoreline fault is a single continuous structure that connects to the Hosgri fault. Discontinuities smaller than about 1 km may be undetected, but would be too small to be barriers to earthquake rupture. The Hosgri fault dips steeply to the east, while the Shoreline fault is essentially vertical, so the Hosgri fault dips towards and under the Shoreline fault as the two faults approach their intersection. The focal mechanisms generally agree with pure right-lateral strike-slip on the OADC planes, but suggest a non-planar Hosgri fault or another structure underlying the northern Shoreline fault. The Shoreline fault most likely transfers strike-slip motion between the Hosgri fault and other faults of the Pacific–North America plate boundary system to the east. A hypothetical earthquake rupturing the entire known length of the Shoreline fault would have a moment magnitude of 6.4–6.8. A hypothetical earthquake rupturing the Shoreline fault and the section of the Hosgri fault north of the Hosgri–Shoreline junction would have a moment magnitude of 7.2–7.5.

Hardebeck, Jeanne L.

2013-01-01
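
A back-of-the-envelope version of the magnitude estimate quoted in the record above (editor's addition; the fault dimensions and slip below are hypothetical, not the paper's values): the seismic moment M0 = mu * A * D follows from rigidity, rupture area, and average slip, and the standard relation Mw = (2/3)(log10 M0 - 9.1), with M0 in N*m, converts it to moment magnitude.

    import math

    def moment_magnitude(length_km, width_km, slip_m, rigidity_pa=3.0e10):
        """Mw from rupture length, down-dip width, and average slip (hypothetical inputs)."""
        area_m2 = (length_km * 1e3) * (width_km * 1e3)
        m0 = rigidity_pa * area_m2 * slip_m          # seismic moment, N*m
        return (2.0 / 3.0) * (math.log10(m0) - 9.1)

    print(round(moment_magnitude(length_km=45.0, width_km=12.0, slip_m=1.0), 1))   # ~6.7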

84

Helium soil-gas variations associated with recent central California earthquakes: precursor or coincidence?  

USGS Publications Warehouse

Decreases in the helium concentration of soil-gas have been observed to precede six of eight recent central California earthquakes. Ten monitoring stations were established near Hollister, California, and along the San Andreas Fault to permit gas collection. The data showed decreases occurring a few weeks before the earthquakes, and concentrations returned to prequake levels either shortly before or after the earthquakes.

Reimer, G.M.

1981-01-01

85

1957 Gobi-Altay, Mongolia, earthquake as a prototype for southern California's most devastating earthquake  

USGS Publications Warehouse

The 1957 Gobi-Altay earthquake was associated with both strike-slip and thrust faulting, processes similar to those along the San Andreas fault and the faults bounding the San Gabriel Mountains just north of Los Angeles, California. Clearly, a major rupture either on the San Andreas fault north of Los Angeles or on the thrust faults bounding the Los Angeles basin poses a serious hazard to inhabitants of that area. By analogy with the Gobi-Altay earthquake, we suggest that simultaneous rupturing of both the San Andreas fault and the thrust faults nearer Los Angeles is a real possibility that amplifies the hazard posed by ruptures on either fault system separately.

Bayarsayhan, C.; Bayasgalan, A.; Enhtuvshin, B.; Hudnut, K.W.; Kurushin, R.A.; Molnar, P.; Olziybat, M.

1996-01-01

86

Conditional Probability Approaches for the Occurrence of Earthquake Generated Tsunamis  

Microsoft Academic Search

Probabilistic tsunami hazard assessment is not an easy task because the number of events contained in the tsunami time series of a particular tsunamigenic zone is usually low and, therefore, does not allow for statistically significant results. In contrast, the earthquake data set contains more events, which is due to the fact that not all of

K. Orfanogiannaki; G. Papadopoulos

2004-01-01

87

Subduction zone earthquake probably triggered submarine hydrocarbon seepage offshore Pakistan  

NASA Astrophysics Data System (ADS)

Seepage of methane-dominated hydrocarbons is heterogeneous in space and time, and trigger mechanisms of episodic seep events are not well constrained. It is generally found that free hydrocarbon gas entering the local gas hydrate stability field in marine sediments is sequestered in gas hydrates. In this manner, gas hydrates can act as a buffer for carbon transport from the sediment into the ocean. However, the efficiency of gas hydrate-bearing sediments for retaining hydrocarbons may be compromised: hypothesized mechanisms include critical gas/fluid pressures beneath gas hydrate-bearing sediments, implying that these are susceptible to mechanical failure and subsequent gas release. Although gas hydrates often occur in seismically active regions, e.g., subduction zones, the role of earthquakes as potential triggers of hydrocarbon transport through gas hydrate-bearing sediments has hardly been explored. Based on a recent publication (Fischer et al., 2013), we present geochemical and transport/reaction-modelling data suggesting a substantial increase in upward gas flux and hydrocarbon emission into the water column following a major earthquake that occurred near the study sites in 1945. Calculating the formation time of authigenic barite enrichments identified in two sediment cores obtained from an anticlinal structure called "Nascent Ridge", we find they formed 38-91 years before sampling, which corresponds well to the time elapsed since the earthquake (62 years). Furthermore, applying a numerical model, we show that the local sulfate/methane transition zone shifted upward by several meters due to the increased methane flux, and simulated sulfate profiles very closely match measured ones in a comparable time frame of 50-70 years. We thus propose a causal relation between the earthquake and the amplified gas flux and present reflection seismic data supporting our hypothesis that co-seismic ground shaking induced mechanical fracturing of gas hydrate-bearing sediments, creating pathways for free gas to migrate from a shallow reservoir within the gas hydrate stability zone into the water column. Our results imply that free hydrocarbon gas trapped beneath a local gas hydrate seal was mobilized through earthquake-induced mechanical failure and in that way circumvented carbon sequestration within the sediment. These findings lead to the conclusion that hydrocarbon seepage triggered by earthquakes can play a role in carbon budgets at other seismically active continental margins. The newly identified process presented in our study may help interpret data from similar sites. Reference: Fischer, D., Mogollon, J.M., Strasser, M., Pape, T., Bohrmann, G., Fekete, N., Spieß, V. and Kasten, S., 2013. Subduction zone earthquake as potential trigger of submarine hydrocarbon seepage. Nature Geoscience 6: 647-651.

Fischer, David; José M., Mogollón; Michael, Strasser; Thomas, Pape; Gerhard, Bohrmann; Noemi, Fekete; Volkhard, Spiess; Sabine, Kasten

2014-05-01

88

Simulations of the 1906 San Francisco Earthquake and Scenario Earthquakes in Northern California  

NASA Astrophysics Data System (ADS)

3-D simulations of seismic ground motions are performed to better characterize the 1906 San Francisco earthquake and to investigate the seismic consequences from scenario events in northern California. Specifically, we perform simulations of: 1) the 1906 earthquake, which bilaterally ruptured a 480-km segment of the San Andreas fault from San Juan Bautista to Cape Mendocino (epicenter a few kilometers off the coast of San Francisco); 2) large scenario San Andreas events with different epicentral locations; and 3) smaller scenario events along faults local to the San Francisco Bay Area. Simulations of the 1906 earthquake indicate that significant ground motion occurred up and down the northern California coast and out into the Central Valley. Comparisons between the simulated motions and observed data (e.g., shaking intensities, Boatwright and Bundock, 2005), suggest that the moment magnitude of this event was between M7.8 and M7.9. Simulations of 1906-like scenario events along the San Andreas fault reveal that ground motions in the San Francisco Bay Area and in the Sacramento Delta region would be significantly stronger for earthquakes initiating along the northern section of the fault and rupturing to the southeast. Simulations of smaller scenario events in the San Francisco Bay Area indicate areas of concentrated shaking. These simulations are performed using a recently developed 3-D geologic model of northern California (Brocher and Thurber, 2005; Jachens et al., 2005), together with finite-difference codes (E3D and a new public domain package). The effects of topography and attenuation are included. The full computational domain spans most of the geologic model and is 630x320x50 km3. The minimum S-wave velocity is constrained to 500 m/s, except in water. Frequencies up to 1.0 Hz are modeled. The grid spacing ranges from 75 m to 200 m. High performance supercomputers are used for the simulations, which include models of over 23 billion grid nodes using 2000 processors. This work was performed under the auspices of the U.S. Department of Energy by University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48.
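As a rough plausibility check on the numbers quoted above, the sketch below (Python, with an assumed points-per-wavelength criterion, which the abstract does not state) reproduces the grid-node count and maximum resolvable frequency from the stated domain size, grid spacing, and minimum S-wave velocity.

def grid_nodes(lx_km, ly_km, lz_km, dx_m):
    """Number of nodes for a uniform grid with spacing dx_m (metres)."""
    nx = int(lx_km * 1000 / dx_m) + 1
    ny = int(ly_km * 1000 / dx_m) + 1
    nz = int(lz_km * 1000 / dx_m) + 1
    return nx * ny * nz

def max_frequency(v_min_m_s, dx_m, points_per_wavelength=6.0):
    """Rule-of-thumb maximum resolved frequency: f_max = v_min / (N_ppw * dx)."""
    return v_min_m_s / (points_per_wavelength * dx_m)

# Domain quoted above: 630 x 320 x 50 km, spacing 75-200 m, minimum S-wave velocity 500 m/s
for dx in (75.0, 100.0, 200.0):
    n = grid_nodes(630.0, 320.0, 50.0, dx)
    f = max_frequency(500.0, dx)
    print(f"dx = {dx:6.1f} m -> {n / 1e9:5.1f} billion nodes, f_max ~ {f:.2f} Hz")

At 75 m spacing this gives roughly 24 billion nodes and a maximum frequency near 1 Hz, consistent with the figures quoted above.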

Larsen, S.; Dreger, D.; Dolenc, D.

2006-12-01

89

Catalog of earthquakes along the San Andreas fault system in Central California, April-June 1972  

USGS Publications Warehouse

Numerous small earthquakes occur each day in the coast ranges of Central California. The detailed study of these earthquakes provides a tool for gaining insight into the tectonic and physical processes responsible for the generation of damaging earthquakes. This catalog contains the fundamental parameters for earthquakes located within and adjacent to the seismograph network operated by the National Center for Earthquake Research (NCER), U.S. Geological Survey, during the period April - June, 1972. The motivation for these detailed studies has been described by Pakiser and others (1969) and by Eaton and others (1970). Similar catalogs of earthquakes for the years 1969, 1970 and 1971 have been prepared by Lee and others (1972 b, c, d). A catalog for the first quarter of 1972 has been prepared by Wesson and others (1972). The basic data contained in these catalogs provide a foundation for further studies. This catalog contains data on 910 earthquakes in Central California. A substantial portion of the earthquakes reported in this catalog represents a continuation of the sequence of earthquakes in the Bear Valley area which began in February, 1972 (Wesson and others, 1972). Arrival times at 126 seismograph stations were used to locate the earthquakes listed in this catalog. Of these, 101 are telemetered stations operated by NCER. Readings from the remaining 25 stations were obtained through the courtesy of the Seismographic Stations, University of California, Berkeley (UCB); the Earthquake Mechanism Laboratory, National Oceanic and Atmospheric Administration, San Francisco (EML); and the California Department of Water Resources, Sacramento. The Seismographic Stations of the University of California, Berkeley, have for many years published a bulletin describing earthquakes in Northern California and the surrounding area, and readings at UCB Stations from more distant events. The purpose of the present catalog is not to replace the UCB Bulletin, but rather to supplement it, by describing the seismicity of a portion of central California in much greater detail.

Wesson, R.L.; Bennett, R.E.; Lester, F.W.

1973-01-01

90

Catalog of earthquakes along the San Andreas fault system in Central California: January-March, 1972  

USGS Publications Warehouse

Numerous small earthquakes occur each day in the Coast Ranges of Central California. The detailed study of these earthquakes provides a tool for gaining insight into the tectonic and physical processes responsible for the generation of damaging earthquakes. This catalog contains the fundamental parameters for earthquakes located within and adjacent to the seismograph network operated by the National Center for Earthquake Research (NCER), U.S. Geological Survey, during the period January - March, 1972. The motivation for these detailed studies has been described by Pakiser and others (1969) and by Eaton and others (1970). Similar catalogs of earthquakes for the years 1969, 1970 and 1971 have been prepared by Lee and others (1972 b, c, d). The basic data contained in these catalogs provide a foundation for further studies. This catalog contains data on 1,718 earthquakes in Central California. Of particular interest is a sequence of earthquakes in the Bear Valley area which contained single shocks with local magnitudes of 5.0 and 4.6. Earthquakes from this sequence make up roughly 66% of the total and are currently the subject of an interpretative study. Arrival times at 118 seismograph stations were used to locate the earthquakes listed in this catalog. Of these, 94 are telemetered stations operated by NCER. Readings from the remaining 24 stations were obtained through the courtesy of the Seismographic Stations, University of California, Berkeley (UCB); the Earthquake Mechanism Laboratory, National Oceanic and Atmospheric Administration, San Francisco (EML); and the California Department of Water Resources, Sacramento. The Seismographic Stations of the University of California, Berkeley, have for many years published a bulletin describing earthquakes in Northern California and the surrounding area, and readings at UCB Stations from more distant events. The purpose of the present catalog is not to replace the UCB Bulletin, but rather to supplement it, by describing the seismicity of a portion of central California in much greater detail.

Wesson, R.L.; Bennett, R.E.; Meagher, K.L.

1973-01-01

91

Recalculated probability of M ≥ 7 earthquakes beneath the Sea of Marmara, Turkey  

USGS Publications Warehouse

New earthquake probability calculations are made for the Sea of Marmara region and the city of Istanbul, providing a revised forecast and an evaluation of time-dependent interaction techniques. Calculations incorporate newly obtained bathymetric images of the North Anatolian fault beneath the Sea of Marmara [Le Pichon et al., 2001; Armijo et al., 2002]. Newly interpreted fault segmentation enables an improved regional A.D. 1500-2000 earthquake catalog and interevent model, which form the basis for time-dependent probability estimates. Calculations presented here also employ detailed models of coseismic and postseismic slip associated with the 17 August 1999 M = 7.4 Izmit earthquake to investigate effects of stress transfer on seismic hazard. Probability changes caused by the 1999 shock depend on Marmara Sea fault-stressing rates, which are calculated with a new finite element model. The combined 2004-2034 regional Poisson probability of M ≥ 7 earthquakes is ~38%, the regional time-dependent probability is 44 ± 18%, and incorporation of stress transfer raises it to 53 ± 18%. The most important effect of adding time dependence and stress transfer to the calculations is an increase in the 30 year probability of an M ≥ 7 earthquake affecting Istanbul. The 30 year Poisson probability at Istanbul is 21%, and the addition of time dependence and stress transfer raises it to 41 ± 14%. The ranges given on probability values are sensitivities of the calculations to input parameters determined by Monte Carlo analysis; 1000 calculations are made using parameters drawn at random from distributions. Sensitivities are large relative to mean probability values and enhancements caused by stress transfer, reflecting a poor understanding of large-earthquake aperiodicity.
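A minimal sketch of the kind of Monte Carlo sensitivity analysis described above, assuming a Brownian passage time (BPT) interevent model and hypothetical parameter distributions; the actual Marmara renewal model, stress-transfer terms, and parameter ranges are not reproduced here.

import numpy as np
from scipy.stats import invgauss

def bpt_conditional_prob(mean_ri, alpha, t_elapsed, dt=30.0):
    """P(event in [t_elapsed, t_elapsed + dt] | no event up to t_elapsed) for a BPT model."""
    # BPT(mean, alpha) is an inverse Gaussian with mean mean_ri and shape mean_ri / alpha^2
    dist = invgauss(alpha ** 2, scale=mean_ri / alpha ** 2)
    return (dist.cdf(t_elapsed + dt) - dist.cdf(t_elapsed)) / dist.sf(t_elapsed)

rng = np.random.default_rng(0)
n = 1000                                  # draws, as in the abstract above
mean_ri = rng.normal(250.0, 50.0, n)      # yr; hypothetical uncertainty on the mean recurrence
alpha = rng.uniform(0.3, 0.7, n)          # hypothetical range of aperiodicity
probs = [bpt_conditional_prob(m, a, t_elapsed=230.0) for m, a in zip(mean_ri, alpha)]
print(f"30-yr conditional probability: {np.mean(probs):.2f} +/- {np.std(probs):.2f}")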

Parsons, T.

2004-01-01

92

Occurrence probability of moderate to large earthquakes in Italy based on new geophysical methods  

NASA Astrophysics Data System (ADS)

We develop new approaches for calculating 30-year probabilities of occurrence of moderate-to-large earthquakes in Italy. Geodetic techniques and thin-shell finite-element modelling, the latter aimed at reproducing a large amount of neotectonic data, are used to separately calculate the expected seismicity rates inside seismogenic areas (polygons containing mapped faults and/or suspected or modelled faults). Thirty-year earthquake probabilities obtained from the two approaches show similarities in most of Italy: the largest probabilities are found in the southern Apennines, where they reach values between 10% and 20% for earthquakes of Mw ≥ 6.0, and lower than 10% for events with Mw ≥ 6.5.
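For reference, the conversion from an expected seismicity rate to a 30-year Poisson occurrence probability is P = 1 - exp(-rate * 30 yr). A minimal sketch with a made-up rate, not a value from the Italian study:

import math

def poisson_prob(annual_rate, years=30.0):
    """Probability of one or more events in `years` for a Poisson process."""
    return 1.0 - math.exp(-annual_rate * years)

rate_mw6 = 0.005   # hypothetical expected rate of Mw >= 6.0 events per year in one seismogenic area
print(f"30-yr probability: {poisson_prob(rate_mw6):.1%}")   # about 14%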

Slejko, Dario; Caporali, Alessandro; Stirling, Mark; Barba, Salvatore

2010-01-01

93

Differential Energy Radiation from Two Earthquakes with Similar Mw: The Baja California 2010 and Haiti 2010 Earthquakes  

Microsoft Academic Search

The April 4, 2010, Mw 7.2 Baja California, Mexico, earthquake occurred at shallow depth in northern Baja California along the principal plate boundary between the North American and Pacific plates, killing two people in the Mexicali area. The January 12, 2010, Mw 7.0 Haiti earthquake occurred in the vicinity of Port-au-Prince, the capital of Haiti, on the Enriquillo Plantain

L. Meng; B. Shi

2010-01-01

94

Catalog of earthquakes along the San Andreas fault system in Central California, July-September 1972  

USGS Publications Warehouse

Numerous small earthquakes occur each day in the coast ranges of Central California. The detailed study of these earthquakes provides a tool for gaining insight into the tectonic and physical processes responsible for the generation of damaging earthquakes. This catalog contains the fundamental parameters for earthquakes located within and adjacent to the seismograph network operated by the National Center for Earthquake Research (NCER), U.S. Geological Survey, during the period July - September, 1972. The motivation for these detailed studies has been described by Pakiser and others (1969) and by Eaton and others (1970). Similar catalogs of earthquakes for the years 1969, 1970 and 1971 have been prepared by Lee and others (1972 b, c, d). Catalogs for the first and second quarters of 1972 have been prepared by Wesson and others (1972 a, b). The basic data contained in these catalogs provide a foundation for further studies. This catalog contains data on 1254 earthquakes in Central California. Arrival times at 129 seismograph stations were used to locate the earthquakes listed in this catalog. Of these, 104 are telemetered stations operated by NCER. Readings from the remaining 25 stations were obtained through the courtesy of the Seismographic Stations, University of California, Berkeley (UCB); the Earthquake Mechanism Laboratory, National Oceanic and Atmospheric Administration, San Francisco (EML); and the California Department of Water Resources, Sacramento. The Seismographic Stations of the University of California, Berkeley, have for many years published a bulletin describing earthquakes in Northern California and the surrounding area, and readings at UCB Stations from more distant events. The purpose of the present catalog is not to replace the UCB Bulletin, but rather to supplement it, by describing the seismicity of a portion of central California in much greater detail.

Wesson, R.L.; Meagher, K.L.; Lester, F.W.

1973-01-01

95

UCERF3: A new earthquake forecast for California's complex fault system  

USGS Publications Warehouse

With innovations, fresh data, and lessons learned from recent earthquakes, scientists have developed a new earthquake forecast model for California, a region under constant threat from potentially damaging events. The new model, referred to as the third Uniform California Earthquake Rupture Forecast, or "UCERF" (http://www.WGCEP.org/UCERF3), provides authoritative estimates of the magnitude, location, and likelihood of earthquake fault rupture throughout the state. Overall the results confirm previous findings, but with some significant changes because of model improvements. For example, compared to the previous forecast (Uniform California Earthquake Rupture Forecast 2), the likelihood of moderate-sized earthquakes (magnitude 6.5 to 7.5) is lower, whereas that of larger events is higher. This is because of the inclusion of multifault ruptures, where earthquakes are no longer confined to separate, individual faults, but can occasionally rupture multiple faults simultaneously. The public-safety implications of this and other model improvements depend on several factors, including site location and type of structure (for example, family dwelling compared to a long-span bridge). Building codes, earthquake insurance products, emergency plans, and other risk-mitigation efforts will be updated accordingly. This model also serves as a reminder that damaging earthquakes are inevitable for California. Fortunately, there are many simple steps residents can take to protect lives and property.

Field, Edward H.; 2014 Working Group on California Earthquake Probabilities

2015-01-01

96

CISN ShakeAlert: Delivering test warnings for California earthquakes  

NASA Astrophysics Data System (ADS)

The Earthquake Early Warning (EEW) project team within the California Integrated Seismic Network (CISN) has been developing a test earthquake alerting system for the past five years. The project is a collaboration between Caltech, UC Berkeley, ETH Zurich, USC/SCEC and the USGS, and is funded by the USGS. We use data from the recently upgraded academic- and government-operated geophysical networks across the state to rapidly detect earthquakes and assess the hazard. The test system is operational statewide and delivers warnings to project scientists and other interested scientists. Three event detection and hazard assessment algorithms are currently used: Virtual Seismologist, Onsite and ElarmS, but others may be added to the system. The algorithms provide hazard assessments to a DecisionModule that aggregates the information. It then generates a unified alert stream that is broadcast to certified users. The alerts are received via the UserDisplay, a pop-up computer application. When it opens, the UserDisplay shows a map with the event location and magnitude, and tracks the propagation of the P- and S-waves. It also shows a countdown to shaking at the user's location and an estimate of the expected peak shaking intensity. The algorithm and DecisionModule outputs are archived by the project testing center for independent performance evaluation. Over the coming year the project intends to release alerts to a small group of test users. Identification and engagement of possible project partners is already well underway. Individual project components presentations will provide additional detail.
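The "countdown to shaking" shown on the UserDisplay amounts to comparing the S-wave travel time to the user's site with the time already spent on detection, processing, and telemetry. A simplified sketch with assumed wave speeds and latencies; these are round illustrative numbers, not the actual CISN ShakeAlert values.

VP_KM_S = 6.0   # assumed average P-wave speed (km/s)
VS_KM_S = 3.5   # assumed average S-wave speed (km/s)

def warning_time(epicentral_dist_km, detection_dist_km=20.0, latency_s=4.0):
    """Approximate seconds of warning before S-wave arrival at the user's site.

    detection_dist_km: distance from the source to the stations that produce the
    first alert; latency_s: assumed processing plus telemetry delay.
    """
    t_alert = detection_dist_km / VP_KM_S + latency_s   # when the alert reaches the user
    t_s_arrival = epicentral_dist_km / VS_KM_S          # when strong shaking arrives
    return max(0.0, t_s_arrival - t_alert)

for d in (30, 60, 120):
    print(f"{d:4d} km from the epicenter: ~{warning_time(d):.0f} s of warning")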

Allen, R. M.; Boese, M.; Brown, H.; Caprio, M.; Cua, G. B.; Fischer, M.; Given, D. D.; Hauksson, E.; Heaton, T. H.; Hellweg, M.; Henson, I.; Liukis, M.; Maechling, P. J.; Meier, M. A.; Neuhauser, D. S.; Oppenheimer, D. H.; Solanki, K.

2011-12-01

97

Heightened odds of large earthquakes near Istanbul: an interaction-based probability calculation  

USGS Publications Warehouse

We calculate the probability of strong shaking in Istanbul, an urban center of 10 million people, from the description of earthquakes on the North Anatolian fault system in the Marmara Sea during the past 500 years and test the resulting catalog against the frequency of damage in Istanbul during the preceding millennium. Departing from current practice, we include the time-dependent effect of stress transferred by the 1999 moment magnitude M = 7.4 Izmit earthquake to faults nearer to Istanbul. We find a 62 ± 15% probability (one standard deviation) of strong shaking during the next 30 years and 32 ± 12% during the next decade.

Parsons, T.; Toda, S.; Stein, R.S.; Barka, A.; Dieterich, J.H.

2000-01-01

98

Overview of the 2004 M6 Parkfield, California, Earthquake  

NASA Astrophysics Data System (ADS)

The long-anticipated Parkfield earthquake occurred on 28 September 2004. Although similar in size to the 1922, 1934, and 1966 Parkfield events, it exhibited significant differences from those earlier events. In particular, while prior events initiated rupture under Middle Mountain north of Parkfield and ruptured towards the southeast, the 2004 event nucleated near Gold Hill south of Parkfield and ruptured northwest towards Middle Mountain. And, unlike the 1934 and 1966 events, this event was not preceded by any foreshocks (M > 0). Because the U.S. Geological Survey, California Geological Survey, and partner universities had deployed a variety of instrumentation, including seismometers, strainmeters, GPS, resistivity, creepmeters, magnetometers, fluid pressure sensors, etc., the 2004 event was recorded in unprecedented detail. Preliminary observations include: 1) if there was pre-seismic slip on the fault, its moment was less than 10^-4 of the moment of the mainshock; 2) locations of aftershocks are similar to locations of the background seismicity; 3) the distribution of co-seismic slip is primarily concentrated beneath Middle Mountain, with a secondary concentration at the hypocenter adjacent to Gold Hill; 4) post-seismic deformation constitutes a significant portion of the total deformation and will decay to background rates in weeks to years; and 5) except for the lack of ground breakage south of Gold Hill, the pattern of surface ground breakage is similar to that from the 1966 earthquake, including rupture on the southwest fracture zone.

Langbein, J.

2004-12-01

99

Regression models for predicting the probability of near-fault earthquake ground motion pulses, and their period.  

E-print Network

Regression models are presented for predicting the probability of near-fault earthquake ground motions containing large velocity pulses, and their period. The pulse period is related primarily to the earthquake magnitude, but other predictive parameters are also considered and discussed.

Baker, Jack W.

100

Earthquake probability calculated from paleoseismic data (Tom Parsons, U.S. Geological Survey, 345 Middlefield Rd., Menlo Park, CA 94025)  

E-print Network

Earthquake probabilities are calculated from paleoseismic data, accounting for stress transfer and post-seismic viscoelastic relaxation from the great 1906 San Francisco earthquake. Stress reduction from the 1906 earthquake on the southern Hayward fault reduces probabilities to 5.2% in 10 years. (Proceedings of the Third Conference on Earthquake Hazards in the Eastern San Francisco Bay Area.)

101

Stress transferred by the 1995 Mw = 6.9 Kobe, Japan, shock: Effect on aftershocks and future earthquake probabilities  

Microsoft Academic Search

The Kobe earthquake struck at the edge of the densely populated Osaka-Kyoto corridor in southwest Japan. We investigate how the earthquake transferred stress to nearby faults, altering their proximity to failure and thus changing earthquake probabilities. We find that relative to the pre-Kobe seismicity, Kobe aftershocks were concentrated in regions of calculated Coulomb stress increase and less common in regions

Shinji Toda; Ross S. Stein; Paul A. Reasenberg; James H. Dieterich; Akio Yoshida

1998-01-01

102

What to Expect from the Virtual Seismologist: Delay Times and Uncertainties of Initial Earthquake Alerts in California  

NASA Astrophysics Data System (ADS)

The Virtual Seismologist (VS) method is a Bayesian approach to regional network-based earthquake early warning (EEW) originally formulated by Cua and Heaton (2007). Implementation of VS into real-time EEW codes has been an on-going effort of the Swiss Seismological Service at ETH Zürich since 2006, with support from ETH Zürich, various European projects, and the United States Geological Survey (USGS). VS is one of three EEW algorithms that form the basis of the California Integrated Seismic Network (CISN) ShakeAlert system, a USGS-funded prototype end-to-end EEW system that could potentially be implemented in California. In Europe, VS is currently operating as a real-time test system in Switzerland, western Greece and Istanbul. As part of the on-going EU project REAKT (Strategies and Tools for Real-Time Earthquake Risk Reduction), VS installations in southern Italy, Romania, and Iceland are planned or underway. The possible use cases for an EEW system will be determined by the speed and reliability of earthquake source parameter estimates. A thorough understanding of both is therefore essential to evaluate the usefulness of VS. For California, we present state-wide theoretical alert times for hypothetical earthquakes by analyzing time delays introduced by the different components in the VS EEW system. Taking advantage of the fully probabilistic formulation of the VS algorithm we further present an improved way to describe the uncertainties of every magnitude estimate by evaluating the width and shape of the probability density function that describes the relationship between waveform envelope amplitudes and magnitude. We evaluate these new uncertainty values for past seismicity in California through off-line playbacks and compare them to the previously defined static definitions of uncertainty based on real-time detections. Our results indicate where VS alerts are most useful in California and also suggest where most effective improvements to the VS EEW system can be made.
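The following sketch illustrates the general Bayesian idea of turning an observed envelope amplitude into a magnitude probability density by combining a likelihood with a Gutenberg-Richter prior. The amplitude model and all coefficients below are invented for illustration; they are not the Virtual Seismologist relationships.

import numpy as np

def magnitude_pdf(log_amp_obs, dist_km, mags=np.arange(2.0, 8.01, 0.01),
                  b_value=1.0, sigma=0.35):
    # Hypothetical envelope-amplitude model: log10(A) = c0 + c1*M - c2*log10(R)
    c0, c1, c2 = -2.0, 0.8, 1.5
    log_amp_pred = c0 + c1 * mags - c2 * np.log10(dist_km)
    log_like = -0.5 * ((log_amp_obs - log_amp_pred) / sigma) ** 2
    log_prior = -b_value * np.log(10.0) * mags       # Gutenberg-Richter prior, p(M) ~ 10^(-b*M)
    log_post = log_like + log_prior
    post = np.exp(log_post - log_post.max())
    post /= np.trapz(post, mags)                     # normalize to a probability density
    return mags, post

mags, post = magnitude_pdf(log_amp_obs=1.2, dist_km=40.0)
mean_m = np.trapz(mags * post, mags)
std_m = np.sqrt(np.trapz((mags - mean_m) ** 2 * post, mags))
print(f"M = {mean_m:.2f} +/- {std_m:.2f}")

The width of the resulting density plays the role of the magnitude uncertainty discussed above.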

Behr, Y.; Cua, G. B.; Clinton, J. F.; Racine, R.; Meier, M.; Cauzzi, C.

2013-12-01

104

Monte Carlo method for determining earthquake recurrence parameters from short paleoseismic catalogs: Example calculations for California  

USGS Publications Warehouse

Paleoearthquake observations often lack enough events at a given site to directly define a probability density function (PDF) for earthquake recurrence. Sites with fewer than 10-15 intervals do not provide enough information to reliably determine the shape of the PDF using standard maximum-likelihood techniques (e.g., Ellsworth et al., 1999). In this paper I present a method that attempts to fit wide ranges of distribution parameters to short paleoseismic series. From repeated Monte Carlo draws, it becomes possible to quantitatively estimate most likely recurrence PDF parameters, and a ranked distribution of parameters is returned that can be used to assess uncertainties in hazard calculations. In tests on short synthetic earthquake series, the method gives results that cluster around the mean of the input distribution, whereas maximum likelihood methods return the sample means (e.g., NIST/SEMATECH, 2006). For short series (fewer than 10 intervals), sample means tend to reflect the median of an asymmetric recurrence distribution, possibly leading to an overestimate of the hazard should they be used in probability calculations. Therefore a Monte Carlo approach may be useful for assessing recurrence from limited paleoearthquake records. Further, the degree of functional dependence among parameters like mean recurrence interval and coefficient of variation can be established. The method is described for use with time-independent and time-dependent PDFs, and results from 19 paleoseismic sequences on strike-slip faults throughout the state of California are given.
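A simplified sketch of the general approach described above: candidate recurrence parameters are drawn at random over wide ranges, weighted by their likelihood against a short interval series, and the weighted (ranked) parameter distribution is then summarized. The interval data and the lognormal recurrence model below are assumptions, not the paper's inputs or exact algorithm.

import numpy as np
from scipy.stats import lognorm

# Hypothetical short series of paleoseismic inter-event times (yr)
observed_intervals = np.array([95.0, 140.0, 210.0, 120.0, 180.0])

rng = np.random.default_rng(1)
n_draws = 50_000
means = rng.uniform(50.0, 400.0, n_draws)   # candidate mean recurrence intervals (yr)
covs = rng.uniform(0.2, 1.2, n_draws)       # candidate coefficients of variation

# Lognormal parameters from (mean, cov): sigma^2 = ln(1 + cov^2), scale = mean * exp(-sigma^2 / 2)
sigmas = np.sqrt(np.log(1.0 + covs ** 2))
scales = means * np.exp(-0.5 * sigmas ** 2)

# Likelihood of the observed intervals under each candidate parameter pair
loglike = lognorm.logpdf(observed_intervals[:, None], sigmas[None, :],
                         scale=scales[None, :]).sum(axis=0)
weights = np.exp(loglike - loglike.max())
weights /= weights.sum()

# Likelihood-weighted resample gives the ranked parameter distribution
keep = rng.choice(n_draws, size=5000, p=weights)
print(f"mean recurrence ~ {means[keep].mean():.0f} yr, COV ~ {covs[keep].mean():.2f}")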

Parsons, T.

2008-01-01

105

Statistical analysis of an earthquake-induced landslide distribution — the 1989 Loma Prieta, California event  

Microsoft Academic Search

The 1989 Loma Prieta, California earthquake (moment magnitude, M=6.9) generated landslides throughout an area of about 15,000km2 in central California. Most of these landslides occurred in an area of about 2000km2 in the mountainous terrain around the epicenter, where they were mapped during field investigations immediately following the earthquake. The distribution of these landslides is investigated statistically, using regression and

David K Keefer

2000-01-01

106

Source processes of industrially-induced earthquakes at The Geysers geothermal area, California  

Microsoft Academic Search

Microearthquake activity at The Geysers geothermal area, California, mirrors the steam production rate, suggesting that the earthquakes are industrially induced. A 15-station network of digital, three-component seismic stations was operated for one month in 1991, and 3,900 earthquakes were recorded. Highly-accurate moment tensors were derived for 30 of the best recorded earthquakes by tracing rays through tomographically derived 3-D V{sub

Alwyn Ross; G. R. Foulger; Bruce R. Julian

1999-01-01

107

Earthquake probability at the Kashiwazaki Kariwa nuclear power plant, Japan, assessed using bandwidth optimization  

NASA Astrophysics Data System (ADS)

On July 16, 2007, a strong 6.8 magnitude earthquake occurred on Japan's west coast, rocking the nearby Kashiwazaki Kariwa nuclear power plant, the largest nuclear power station on Earth. Shaking during this event produced ground accelerations of ~680 gal, exceeding the plant seismic design specification of 273 gal. This occurrence renews concerns regarding seismic hazards at nuclear facilities located in regions with persistent earthquake activity. Seismic hazard assessments depend upon an understanding of the spatial distribution of earthquakes to effectively assess future earthquake hazards. Earthquake spatial density is best estimated using kernel density functions based on the locations of past seismic events. Two longstanding problems encountered when using kernel density estimation are the selection of an optimal smoothing bandwidth and the quantification of the uncertainty inherent in these estimates. Currently, kernel bandwidths are often selected subjectively and the uncertainty in spatial density estimation is not calculated. As a result, hazards with potentially large consequences for society are poorly estimated. We solve these two problems by employing an optimal bandwidth selector algorithm to objectively identify an appropriately sized kernel bandwidth based on earthquake locations from catalog databases and by assessing uncertainty in the spatial density estimate using a modified smoothed bootstrap technique. After applying these methods to the Kashiwazaki Kariwa site, the calculated probability of one or more Mw 6-7 earthquakes within 10 km of the site during a 40 yr facility lifetime is between 0.005 and 0.02 with 95 percent confidence. This result is made more robust by calculating similar probabilities using alternative databases of earthquake locations and magnitudes. The objectivity and quantitative robustness of these techniques make them extremely beneficial for seismic hazard assessment.
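A minimal sketch of the kernel-density step, assuming a synthetic epicenter catalog, scipy's default Scott's-rule bandwidth in place of the optimal bandwidth selector described above, and hypothetical regional rates:

import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
# Synthetic catalog of epicenters (x, y in km, local coordinate system)
epicenters = rng.normal(loc=[[0.0], [0.0]], scale=40.0, size=(2, 300))

kde = gaussian_kde(epicenters)      # bandwidth: Scott's rule (a stand-in assumption)

# Fraction of the regional rate falling within 10 km of the site, by Monte Carlo
# integration of the estimated density over a disc
site = np.array([15.0, -5.0])
samples = kde.resample(200_000)
frac_near_site = np.mean(np.hypot(*(samples - site[:, None])) <= 10.0)

regional_rate_m6 = 0.02             # hypothetical rate of M6-7 events per year in the region
expected = regional_rate_m6 * frac_near_site * 40.0   # 40-yr facility lifetime
print(f"P(>=1 M6-7 within 10 km in 40 yr) ~ {1.0 - np.exp(-expected):.3f}")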

Connor, C. B.; Connor, L. J.

2007-12-01

108

Post-Earthquake Damage Evaluation and Reporting Procedures: A Guidebook for California Schools.  

ERIC Educational Resources Information Center

The California Office of the State Architect, Structural Safety Division (OSA/SSS) is responsible for evaluating public school structures after an earthquake. However, final authority on whether a building should be reoccupied after damage lies with the school district. This guidebook is designed to help school officials assess earthquake damage…

California State Office of Emergency Services, Sacramento.

109

Unacceptable Risk: Earthquake Hazard Mitigation in One California School District. Hazard Mitigation Case Study.  

ERIC Educational Resources Information Center

Earthquakes are a perpetual threat to California's school buildings. School administrators must be aware that hazard mitigation means much more than simply having a supply of water bottles in the school; it means getting everyone involved in efforts to prevent tragedies from occurring in school buildings in the event of an earthquake. The PTA in…

California State Office of Emergency Services, Sacramento.

110

Seismic Moment, Stress, and Source Dimensions for Earthquakes in the California-Nevada Region  

Microsoft Academic Search

The source mechanism of earthquakes in the California-Nevada region was studied using surface wave analyses, surface displacement observations in the source region, magnitude determinations, and accurate epicenter locations. Fourier analyses of surface waves from thirteen earthquakes in the Parkfield region have yielded the following relationship between seismic moment, M0, and Richter magnitude, ML: log M0 = 1.4 ML

Max Wyss; James N. Brune

1968-01-01

111

Time-dependent Earthquake Probability Calculations For Faults In The Sea of Marmara  

NASA Astrophysics Data System (ADS)

The earthquake migration along the North Anatolian Fault Zone during the last century left a gap in the Sea of Marmara, since the rupturing process stopped with the 1999 MW = 7.4 Izmit earthquake in the easternmost part of the Sea, the Gulf of Izmit. Furthermore, because the last M > 7 earthquake in the Sea of Marmara occurred in 1766, it is our aim to quantify the seismic hazard that threatens Istanbul with respect to the expected big shock on a fault in the Marmara Sea. A positive (negative) stress step on faults there induced by the 1999 Izmit earthquake is assumed to shorten (lengthen) the time required for tectonic stressing to bring a segment to failure. As will be shown, not all faults experienced a positive coseismic Coulomb failure stress change due to the Izmit shock, as one might have expected. Together with a mathematical expression for the seismicity rate based on a state-dependent constitutive formulation by Dieterich (1994), the calculated Coulomb stress step is incorporated into a probability calculation, leading to a time-variable probability change. Further important parameters for calculating these time-dependent probabilities are the mean recurrence time of a characteristic earthquake on a fault and the elapsed time since this event with respect to the stress-disturbing shock. Contrary to the time-independent Poisson probability, these 'conditional probabilities' consider the fact that the probability of failure increases the longer the time span since the last stress-releasing event grows. Furthermore, a transient effect due to the transient increase in the probability of additional earthquakes in the surrounding area after a strong event was modeled based on the constitutive formulation. We applied these tools - coseismic Coulomb stress modeling and time-dependent probability calculations - to four different sets of seismically explored faults in the Sea of Marmara to estimate the change in seismic hazard because of the 1999 Izmit event. It will be shown that the probability of an M > 7 earthquake in the Sea of Marmara on one of the mapped faults depends strongly on its exact location, and on whether peak values or mean values of the stress change are incorporated into the calculations, because of the strongly inhomogeneous Coulomb failure stress distribution.
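One common simplification for folding a coseismic stress step into a renewal-model probability is the "clock change," in which the step advances or delays the effective elapsed time by the stress change divided by the tectonic stressing rate. The sketch below uses that simplification with a lognormal renewal model and hypothetical numbers; it is not the Dieterich (1994) rate-and-state treatment used in the work above.

import numpy as np
from scipy.stats import lognorm

def conditional_prob(mean_ri, cov, t_elapsed, dt=30.0):
    """P(event in [t_elapsed, t_elapsed + dt] | no event up to t_elapsed), lognormal renewal model."""
    sigma = np.sqrt(np.log(1.0 + cov ** 2))
    dist = lognorm(sigma, scale=mean_ri * np.exp(-0.5 * sigma ** 2))
    return (dist.cdf(t_elapsed + dt) - dist.cdf(t_elapsed)) / dist.sf(t_elapsed)

mean_ri, cov, t_since_1766 = 250.0, 0.5, 235.0   # yr, hypothetical fault
delta_cfs = 0.5                                  # bar, coseismic stress step (hypothetical)
stressing_rate = 0.02                            # bar/yr, tectonic stressing rate (hypothetical)
clock_change = delta_cfs / stressing_rate        # +25 yr of effective elapsed time

p0 = conditional_prob(mean_ri, cov, t_since_1766)
p1 = conditional_prob(mean_ri, cov, t_since_1766 + clock_change)
print(f"30-yr conditional probability: {p0:.2f} -> {p1:.2f} after the stress step")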

Hillers, G.; Roth, F.; Harjes, H.-P.; Zschau, J.

112

Recalculated probability of M >= 7 earthquakes beneath the Sea of Marmara, Turkey  

Microsoft Academic Search

New earthquake probability calculations are made for the Sea of Marmara region and the city of Istanbul, providing a revised forecast and an evaluation of time-dependent interaction techniques. Calculations incorporate newly obtained bathymetric images of the North Anatolian fault beneath the Sea of Marmara [Le Pichon et al., 2001; Armijo et al., 2002]. Newly interpreted fault segmentation enables an improved

Tom Parsons

2004-01-01

114

Heightened Odds of Large Earthquakes Near Istanbul: An Interaction-Based Probability Calculation  

Microsoft Academic Search

We calculate the probability of strong shaking in Istanbul, an urban center of 10 million people, from the description of earthquakes on the North Anatolian fault system in the Marmara Sea during the past 500 years and test the resulting catalog against the frequency of damage in Istanbul during the preceding millennium. Departing from current practice, we include the time-dependent

Tom Parsons; Shinji Toda; Ross S. Stein; Aykut Barka; James H. Dieterich

2000-01-01

115

Probability of earthquake occurrence in Greece with special reference to the VAN predictions  

NASA Astrophysics Data System (ADS)

The VAN earthquake predictions were made on the basis of seismic electric signals (SES), but the debate seems to be directed toward the statistical significance of the predictions from seismic data only. Accordingly, applying a logistic regression model to seismicity, we present our estimation of the probability of earthquake occurrence in Greece. The main purpose of our study is to examine whether or not we can find a specific seismicity pattern that can be used to considerably increase probability estimates. Our estimation of the probability of occurrence of an earthquake of Ms ≥ 5.0 is less than 0.25 in all the cases that we have examined. If we lower the threshold magnitude from 5.0 to 4.3, we can find cases in which the probability becomes as high as 0.75, comparable to the success rate of the VAN method estimated by Hamada [1993]. In these cases, however, such a high probability is due mostly to aftershocks, and if aftershocks are removed from the data set, the probability falls below 0.5.
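A minimal sketch of the logistic-regression setup described above, with synthetic seismicity features and occurrence labels standing in for the Greek catalog:

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 500
rate_prev = rng.poisson(5.0, n)              # small-event count in the preceding window
max_mag_prev = rng.uniform(3.0, 5.0, n)      # largest magnitude in that window
X = np.column_stack([rate_prev, max_mag_prev])

# Synthetic "truth": occurrence loosely tied to prior activity (for illustration only)
logit = -4.0 + 0.15 * rate_prev + 0.5 * (max_mag_prev - 3.0)
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

model = LogisticRegression().fit(X, y)
p = model.predict_proba([[12, 4.8]])[0, 1]   # a hypothetically active window
print(f"Estimated probability of an Ms >= 5.0 event: {p:.2f}")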

Honkura, Y.; Tanaka, N.

116

THE GREAT SOUTHERN CALIFORNIA SHAKEOUT: Earthquake Science for 22 Million People  

NASA Astrophysics Data System (ADS)

Earthquake science is being communicated to and used by the 22 million residents of southern California to improve resiliency to future earthquakes through the Great Southern California ShakeOut. The ShakeOut began when the USGS partnered with the California Geological Survey, Southern California Earthquake Center and many other organizations to bring 300 scientists and engineers together to formulate a comprehensive description of a plausible major earthquake, released in May 2008, as the ShakeOut Scenario, a description of the impacts and consequences of a M7.8 earthquake on the Southern San Andreas Fault (USGS OFR2008-1150). The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. The ShakeOut drill occurred in houses, businesses, and public spaces throughout southern California at 10AM on November 13, 2008, when southern Californians were asked to pretend that the M7.8 scenario earthquake had occurred and to practice actions that could reduce the impact on their lives. Residents, organizations, schools and businesses registered to participate in the drill through www.shakeout.org where they could get accessible information about the scenario earthquake and share ideas for better preparation. As of September 8, 2008, over 2.7 million confirmed participants had been registered. The primary message of the ShakeOut is that what we do now, before a big earthquake, will determine what our lives will be like after. The goal of the ShakeOut has been to change the culture of earthquake preparedness in southern California, making earthquakes a reality that are regularly discussed. This implements the sociological finding that 'milling,' discussing a problem with loved ones, is a prerequisite to taking action. ShakeOut milling is taking place at all levels from individuals and families, to corporations and governments. Actions taken as a result of the ShakeOut include the adoption of earthquake response technologies by Los Angeles Unified School District and a top to bottom examination of Los Angeles County Fire Department's earthquake response strategies.

Jones, L.; Cox, D.; Perry, S.; Hudnut, K.; Benthien, M.; Bwarie, J.; Vinci, M.; Buchanan, M.; Long, K.; Sinha, S.; Collins, L.

2008-12-01

117

Northern California Earthquake Data Center Data Retrieval (title provided or enhanced by cataloger)  

NSDL National Science Digital Library

The Northern California Earthquake Data Center (NCEDC) offers various types of earthquake-related data. Most of the datasets are available on the WWW. A few require the establishment of a research account. Available information includes: earthquake catalogs and lists; seismic waveform data from the Berkeley Digital Seismic Network, the Northern California Seismic Network, the Parkfield High-Resolution Seismic Network, and the Calpine/Unocal Geysers Network; Global Positioning System data from continuous monitoring stations; and Berkeley Digital Seismic Network temperature, electromagnetic and strain data.

118

Error propagation in time-dependent probability of occurrence for characteristic earthquakes in Italy  

NASA Astrophysics Data System (ADS)

Time-dependent models for seismic hazard and earthquake probabilities are at the leading edge of research nowadays. In the framework of a 2-year national Italian project (2005-2007), we have applied the Brownian passage time (BPT) renewal model to the recently released Database of Individual Seismogenic Sources (DISS) to compute earthquake probability in the period 2007-2036. Observed interevent times on faults in Italy are absolutely insufficient to characterize the recurrence time. We, therefore, derived mean recurrence intervals indirectly. To estimate the uncertainty of the results, we resorted to the theory of error propagation with respect to the main parameters: magnitude and slip rate. The main issue concerned the high variability of slip rate, which could hardly be reduced by exploiting geodetic constraints. We did some validation tests, and interesting considerations were derived from seismic moment budgeting on the historical earthquake catalog. In a time-dependent perspective, i.e., when the date of the last event is known, only 10-15% of the 115 sources exhibit a probability of a characteristic earthquake in the next 30 years higher than the equivalent Poissonian probabilities. If we accept the Japanese conventional choice of probability threshold greater than 3% in 30 years to define “highly probable sources,” mainly intermediate earthquake faults with characteristic M < 6, having an elapsed time of 0.7-1.2 times the recurrence interval, are the most “prone” sources. The number of highly probable sources rises by increasing the aperiodicity coefficient (from 14 sources in the case of a variable α ranging between 0.22 and 0.36 to 31 sources out of 115 in the case of an α value fixed at 0.7). On the other hand, in stationary time-independent approaches, more than two thirds of all sources are considered probabilistically prone to an impending earthquake. The performed tests show the influence of the variability of the aperiodicity factor in the BPT renewal model on the absolute probability values. However, the influence on the relative ranking of sources is small. Future developments should give priority to a more accurate determination of the date of the last seismic event for a few seismogenic sources of the DISS catalog and to a careful check on the applicability of a purely characteristic model.
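The error-propagation step can be illustrated with a first-order calculation in which the mean recurrence interval is taken as seismic moment over moment rate. The fault area, slip rate, and uncertainties below are hypothetical, and the moment-magnitude relation is the standard Hanks and Kanamori (1979) form rather than anything specific to the DISS sources.

import numpy as np

MU = 3.0e10   # assumed shear modulus, Pa

def recurrence_and_sigma(mag, sigma_mag, area_km2, slip_mm_yr, sigma_slip_mm_yr):
    """Mean recurrence interval Tr = M0 / (mu * A * slip rate) and its 1-sigma uncertainty."""
    m0 = 10.0 ** (1.5 * mag + 9.05)                          # seismic moment, N*m (Hanks & Kanamori)
    moment_rate = MU * area_km2 * 1e6 * slip_mm_yr * 1e-3    # N*m per yr
    tr = m0 / moment_rate
    # First-order propagation: dTr/Tr = sqrt((1.5*ln10*dM)^2 + (ds/s)^2)
    rel = np.hypot(1.5 * np.log(10.0) * sigma_mag, sigma_slip_mm_yr / slip_mm_yr)
    return tr, tr * rel

tr, sigma_tr = recurrence_and_sigma(mag=6.3, sigma_mag=0.2,
                                    area_km2=300.0, slip_mm_yr=1.0, sigma_slip_mm_yr=0.5)
print(f"Tr = {tr:.0f} +/- {sigma_tr:.0f} yr")

With these illustrative values the slip-rate term alone contributes a 50% relative uncertainty, which is consistent with the point made above about slip-rate variability dominating the error budget.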

Peruzza, Laura; Pace, Bruno; Cavallini, Fabio

2010-01-01

119

Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment  

NASA Astrophysics Data System (ADS)

Operational earthquake forecasting (OEF) is the dissemination of authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes. Most previous work on the public utility of OEF has anticipated that forecasts would deliver high probabilities of large earthquakes; i.e., deterministic predictions with low error rates (false alarms and failures-to-predict) would be possible. This expectation has not been realized. An alternative to deterministic prediction is probabilistic forecasting based on empirical statistical models of aftershock triggering and seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains in excess of 100 relative to long-term forecasts. The utility of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing OEF in this sort of "low-probability environment." The need to move more quickly has been underscored by recent seismic crises, such as the 2009 L'Aquila earthquake sequence, in which an anxious public was confused by informal and inaccurate earthquake predictions. After the L'Aquila earthquake, the Italian Department of Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF), which I chaired, to recommend guidelines for OEF utilization. Our report (Ann. Geophys., 54, 4, 2011; doi: 10.4401/ag-5350) concludes: (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and need to convey epistemic uncertainties. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. (c) All operational models should be evaluated for reliability and skill by retrospective testing, and the models should be under continuous prospective testing against long-term forecasts and alternative time-dependent models. (d) Short-term models used in operational forecasting should be consistent with the long-term forecasts used in probabilistic seismic hazard analysis. (e) Alert procedures should be standardized to facilitate decisions at different levels of government, based in part on objective analysis of costs and benefits. (f) In establishing alert protocols, consideration should also be given to the less tangible aspects of value-of-information, such as gains in psychological preparedness and resilience. Authoritative statements of increased risk, even when the absolute probability is low, can provide a psychological benefit to the public by filling information vacuums that lead to informal predictions and misinformation. Formal OEF procedures based on probabilistic forecasting appropriately separate hazard estimation by scientists from the decision-making role of civil protection authorities. The prosecution of seven Italian scientists on manslaughter charges stemming from their actions before the L'Aquila earthquake makes clear why this separation should be explicit in defining OEF protocols.
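The "low-probability environment" point can be made with simple arithmetic: even a hundredfold gain over a typical long-term weekly rate leaves the short-term probability at the percent level. The background rate below is a hypothetical round number.

import math

background_rate = 2e-4    # hypothetical long-term rate of M >= 6 events per week in a region
gain = 100.0              # assumed short-term probability gain during an active sequence

p_week_long_term = 1.0 - math.exp(-background_rate)
p_week_short_term = 1.0 - math.exp(-background_rate * gain)
print(f"weekly probability: {p_week_long_term:.2%} -> {p_week_short_term:.1%}")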

Jordan, T. H.; the International Commission on Earthquake ForecastingCivil Protection

2011-12-01

120

Current Development at the Southern California Earthquake Data Center (SCEDC)  

NASA Astrophysics Data System (ADS)

Over the past year, the SCEDC completed or is near completion of three featured projects: Station Information System (SIS) Development: The SIS will provide users with an interface into complete and accurate station metadata for all current and historic data at the SCEDC. The goal of this project is to develop a system that can interact with a single database source to enter, update and retrieve station metadata easily and efficiently. The system will provide accurate station/channel information for active stations to the SCSN real-time processing system, as well as station/channel information for stations that have parametric data at the SCEDC, i.e., for users retrieving data via STP. Additionally, the SIS will supply information required to generate dataless SEED and COSMOS V0 volumes and allow stations to be added to the system with a minimum, but incomplete, set of information using predefined defaults that can be easily updated as more information becomes available. Finally, the system will facilitate statewide metadata exchange for both real-time processing and provide a common approach to CISN historic station metadata. Moment Tensor Solutions: The SCEDC is currently archiving and delivering Moment Magnitudes and Moment Tensor Solutions (MTS) produced by the SCSN in real-time and post-processing solutions for events spanning back to 1999. The automatic MTS runs on all local events with magnitudes > 3.0, and all regional events > 3.5. The distributed solution automatically creates links from all USGS Simpson Maps to a text e-mail summary solution, creates a .gif image of the solution, and updates the moment tensor database tables at the SCEDC. Searchable Scanned Waveforms Site: The Caltech Seismological Lab has made available 12,223 scanned images of pre-digital analog recordings of major earthquakes recorded in Southern California between 1962 and 1992 at http://www.data.scec.org/research/scans/. The SCEDC has developed a searchable web interface that allows users to search the available files, select multiple files for download and then retrieve a zipped file containing the results. Scanned images of paper records for M>3.5 southern California earthquakes and several significant teleseisms are available for download via the SCEDC through this search tool.

Appel, V. L.; Clayton, R. W.

2005-12-01

121

Source Characterization and Ground Motion Modeling of the 1892 Vacaville-Winters Earthquake Sequence, California  

NASA Astrophysics Data System (ADS)

We use a multidisciplinary approach that combines structural geologic analysis with geophysical modeling to evaluate the depth, geometry and segmentation of thrust faults that were the probable sources of the 1892 Winters-Vacaville earthquake sequence, which produced significant damage to towns in the southwestern Sacramento Valley, California. The largest event in this sequence occurred 19 April 1892 with a maximum Modified Mercalli Intensity (Imm) of IX and was assigned M 6.5 based on felt area. Through crustal velocity modeling and analysis of seismic reflection data, we interpret that the epicentral region of the 1892 sequence is underlain by a system of blind, west-dipping thrust faults at a minimum depth of about 8 km. Quaternary uplift along the valley margin is a result of fault-propagation folding above the tips of the thrust faults. From analysis of seismic reflection data, we interpret that two geometrically distinct thrust-fault segments are separated by a right en echelon step at the latitude of the town of Winters. The "Gordon Valley" segment, which probably was the source of the 1892 main shock, is 18 +/- 2 km long and extends between the towns of Winters and Vacaville. The empirical peak horizontal acceleration- and velocity-Imm relationships of Wald et al. (1999) were used with synthetic ground motion modeling to demonstrate that rupture of the Gordon Valley segment reproduces the distribution of Imm >= VI intensities documented from anecdotal accounts of the 19 and 21 April 1892 earthquakes, including probable directivity effects east of the range front. Integrated structural analysis and ground motion modeling also were used to assess the role of the northern Gordon Valley segment boundary in arresting the 19 April 1892 earthquake rupture, and the subsequent occurrence of the 21 April 1892 aftershock. Our results support the inferences of previous workers (Wong and Ely, 1983; Eaton, 1986; Toppozada, 1987; Bennett, 1987; Unruh and Moores, 1992) that the 1983 Coalinga earthquake, which occurred on a blind thrust fault beneath the western San Joaquin Valley, is an analogue for the 1892 Winters-Vacaville sequence.
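For reference, the intensity conversion mentioned above can be illustrated with the commonly quoted forms of the Wald et al. (1999) regressions between instrumental ground motion and Modified Mercalli Intensity, valid roughly for MMI V-VIII. The coefficients are taken as quoted in the literature, not from the abstract above, and the example ground-motion value is synthetic.

import math

def mmi_from_pga(pga_cm_s2):
    """Wald et al. (1999) regression, as commonly quoted (PGA in cm/s^2)."""
    return 3.66 * math.log10(pga_cm_s2) - 1.66

def mmi_from_pgv(pgv_cm_s):
    """Wald et al. (1999) regression, as commonly quoted (PGV in cm/s)."""
    return 3.47 * math.log10(pgv_cm_s) + 2.35

# Example with a synthetic ground motion of 0.2 g near the range front
print(f"PGA of 0.2 g -> MMI ~ {mmi_from_pga(0.2 * 981.0):.1f}")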

O'Connell, D. R.; Unruh, J.; Block, L. P.

2001-12-01

122

A search for long-term periodicities in large earthquakes of southern and coastal central California  

SciTech Connect

It has been occasionally suggested that large earthquakes may follow the 8.85-year and 18.6-year lunar-solar tidal cycles and possibly the ~11-year solar activity cycle. From a new study of earthquakes with magnitudes ≥ 5.5 in southern and coastal central California during the years 1855-1983, it is concluded that, at least in this selected area of the world, no statistically significant long-term periodicities in earthquake frequency occur. The sample size used is about twice that used in comparable earlier studies of this region, which concentrated on large earthquakes.

Stothers, R.B. (NASA, Goddard Space Flight Center, Greenbelt, MD (USA))

1990-10-01

123

Estimated ground motion from the 1994 Northridge, California, earthquake at the site of interstate 10 and La Cienega Boulevard bridge collapse, West Los Angeles, California  

USGS Publications Warehouse

We have estimated ground motions at the site of a bridge collapse during the 1994 Northridge, California, earthquake. The estimated motions are based on correcting motions recorded during the mainshock 2.3 km from the collapse site for the relative site response of the two sites. Shear-wave slownesses and damping based on analysis of borehole measurements at the two sites were used in the site response analysis. We estimate that the motions at the collapse site were probably larger, by factors ranging from 1.2 to 1.6, than at the site at which the ground motion was recorded, for periods less than about 1 sec.

Boore, D.M.; Gibbs, J.F.; Joyner, W.B.; Tinsley, J.C.; Ponti, D.J.

2003-01-01

124

The Effects of Static Coulomb Stress Change on Southern California Earthquake Forecasts  

NASA Astrophysics Data System (ADS)

In previous studies, we confirmed an association between static Coulomb stress change and earthquake location in southern California, when resolving stress tensors onto uniformly oriented northwest right-lateral strike-slip planes (Deng & Sykes, 1997). Using an optimized index function to convert static Coulomb stress change into normalized seismicity rates, we found that the Coulomb stress-based forecasts were not significantly more effective indicators of future earthquake locations than forecasts based on smoothed seismicity (Hiemer et al., 2011). These results were likely due to Coulomb stress uncertainties, particularly near stress singularities at the ends of fault sections where many earthquakes occurred. We evaluate hybrid Coulomb stress/smoothed seismicity earthquake forecasts against those with earthquake rates derived from only one component, within a southern California study area (32°N-37°N latitude, 122°W-114°W longitude). Using a weighted linear combination of earthquake rates derived from static Coulomb stress change and smoothed seismicity, we mitigate the effects of stress uncertainty through increasing the influence of Coulomb stress on earthquake rates with increasing distance from faults. We also evaluate time-dependent Coulomb stress earthquake forecasts based on rate-and-state friction (Toda & Enescu, 2011 and Dieterich, 1996) against a Poissonian null hypothesis, from the 10/16/1999 Hector Mine earthquake to the 4/4/2010 El Mayor Cucapah earthquake. From numerical integration, we establish a normalized seismicity rate for each day, during the target time interval, from Coulomb stress evolution and the times since all preceding source earthquakes. During each day we assume seismicity follows a Poissonian process, with expected rates defined as the rate-and-state seismicity rates. By pseudo-prospectively testing these spatial and spatiotemporal earthquake forecasts, we ascertain the role of static and quasi-static Coulomb stress change in indicating future earthquake locations and times.
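The rate-and-state ingredient referred to above (Dieterich) predicts how the seismicity rate responds to a sudden Coulomb stress step and then decays back to the background rate over the aftershock duration t_a = A*sigma / stressing rate. A sketch with typical, assumed parameter values, not those of the study:

import numpy as np

def rate_state_rate(t_yr, dcfs_bar, a_sigma_bar=0.4, stressing_rate_bar_yr=0.1):
    """Seismicity rate relative to background after a sudden Coulomb stress step (Dieterich, 1994)."""
    t_a = a_sigma_bar / stressing_rate_bar_yr    # aftershock duration, yr
    gamma = (np.exp(-dcfs_bar / a_sigma_bar) - 1.0) * np.exp(-t_yr / t_a) + 1.0
    return 1.0 / gamma

for t in (0.01, 0.1, 1.0, 10.0):
    print(f"t = {t:5.2f} yr after a 1-bar step: R/r = {rate_state_rate(t, dcfs_bar=1.0):6.2f}")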

Strader, A. E.; Jackson, D. D.

2013-12-01

125

Simplifying Construction of Complex Workflows for Non-Expert Users of the Southern California Earthquake Center Community Modeling Environment  

E-print Network

While the holy grail of this work is earthquake prediction, a more prosaic goal is to estimate the potential seismic hazard. The work focuses on simplifying the construction of the complex workflows that such estimates require within the Southern California Earthquake Center Community Modeling Environment, so that non-expert users can run them.

Kim, Jihie

126

Slip on the San Andreas Fault at Parkfield, California, over Two Earthquake Cycles, and the Implications for Seismic Hazard  

Microsoft Academic Search

Parkfield, California, which experienced M 6.0 earthquakes in 1934, 1966, and 2004, is one of the few locales for which geodetic observations span multiple earthquake cycles. We undertake a comprehensive study of deformation over the most recent earthquake cycle and explore the results in the context of geodetic data collected prior to the 1966 event. Through joint inversion of

Jessica Murray; John Langbein

2006-01-01

127

Forecasting California's earthquakes: What can we expect in the next 30 years?  

USGS Publications Warehouse

In a new comprehensive study, scientists have determined that the chance of having one or more magnitude 6.7 or larger earthquakes in the California area over the next 30 years is greater than 99%. Such quakes can be deadly, as shown by the 1989 magnitude 6.9 Loma Prieta and the 1994 magnitude 6.7 Northridge earthquakes. The likelihood of at least one even more powerful quake of magnitude 7.5 or greater in the next 30 years is 46%; such a quake is most likely to occur in the southern half of the State. Building codes, earthquake insurance, and emergency planning will be affected by these new results, which highlight the urgency to prepare now for the powerful quakes that are inevitable in California's future.

Field, Edward H.; Milner, Kevin R.; The 2007 Working Group on California Earthquake Probabilities

2008-01-01

128

Database of potential sources for earthquakes larger than magnitude 6 in Northern California  

USGS Publications Warehouse

The Northern California Earthquake Potential (NCEP) working group, composed of many contributors and reviewers in industry, academia and government, has pooled its collective expertise and knowledge of regional tectonics to identify potential sources of large earthquakes in northern California. We have created a map and database of active faults, both surficial and buried, that forms the basis for the northern California portion of the national map of probabilistic seismic hazard. The database contains 62 potential sources, including fault segments and areally distributed zones. The working group has integrated constraints from broadly based plate tectonic and VLBI models with local geologic slip rates, geodetic strain rate, and microseismicity. Our earthquake source database derives from a scientific consensus that accounts for conflict in the diverse data. Our preliminary product, as described in this report, brings to light many gaps in the data, including a need for better information on the proportion of deformation in fault systems that is aseismic.

Working Group on Northern California Earthquake Potential

1996-01-01

129

Short Notes Variability of Near-Term Probability for the Next Great Earthquake on the Cascadia Subduction Zone  

Microsoft Academic Search

The threat of a great (M 9) earthquake along the Cascadia subduction zone is evidenced by both paleoseismology data and current strain accumulation along the fault. On the basis of recent information on the characteristics of this subduction system, we estimate the conditional probabilities of a great earthquake occurring within the next 50 years and their variabilities. The most important

Stephane Mazzotti; John Adams

2004-01-01

130

Cross-fault triggering in the November 1987 Superstition Hills earthquake sequence, southern California  

Microsoft Academic Search

Two large strike-slip ruptures 11.4 hours apart occurred on intersecting, nearly orthogonal, vertical faults during the November 1987 Superstition Hills earthquake sequence in southern California. This sequence is the latest in a northwestward progression of earthquakes (1979, 1981, and 1987) rupturing a set of parallel left-lateral cross-faults that trend northeast between the Brawley seismic zone and Superstition Hills fault, a

K. W. Hudnut; L. Seeber; J. Pacheco

1989-01-01

131

Earthquake source mechanisms and transform fault tectonics in the Gulf of California  

NASA Technical Reports Server (NTRS)

The source parameters of 19 earthquakes in the Gulf of California were determined from an inversion of long-period P and SH waveforms. The inversion procedure is described and the estimated precision of the derived source parameters is examined, with particular attention given to source complexity, the resolution of slip vector azimuth, and the resolution of centroid depth. The implications of these earthquake source characteristics for the tectonic evolution of the gulf are discussed.

Goff, John A.; Bergman, Eric A.; Solomon, Sean C.

1987-01-01

132

Southern California Earthquake Center--Virtual Display of Objects (SCEC-VDO): An Earthquake Research and Education Tool  

NASA Astrophysics Data System (ADS)

Interns in the program Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT, an NSF Research Experience for Undergraduates Site) have designed, engineered, and distributed SCEC-VDO (Virtual Display of Objects), an interactive software used by earthquake scientists and educators to integrate and visualize global and regional, georeferenced datasets. SCEC-VDO is written in Java/Java3D with an extensible, scalable architecture. An increasing number of SCEC-VDO datasets are obtained on the fly through web services and connections to remote databases; and user sessions may be saved in xml-encoded files. Currently users may display time-varying sequences of earthquake hypocenters and focal mechanisms, several 3-dimensional fault and rupture models, satellite imagery - optionally draped over digital elevation models - and cultural datasets including political boundaries. The ability to juxtapose and interactively explore these data and their temporal and spatial relationships has been particularly important to SCEC scientists who are evaluating fault and deformation models, or who must quickly evaluate the menace of evolving earthquake sequences. Additionally, SCEC-VDO users can annotate the display, plus script and render animated movies with adjustable compression levels. SCEC-VDO movies are excellent communication tools and have been featured in scientific presentations, classrooms, press conferences, and television reports.

Perry, S.; Maechling, P.; Jordan, T.

2006-12-01

133

Changes in static stress on southern California faults after the 1992 Landers earthquake  

USGS Publications Warehouse

The magnitude 7.5 Landers earthquake of 28 June 1992 was the largest earthquake to strike California in 40 years. The slip that occurs in such an earthquake would be expected to induce large changes in the static stress on neighbouring faults; these changes in stress should in turn affect the likelihood of future earthquakes. Stress changes that load faults towards failure have been cited as the cause of small (refs 1-5), moderate (ref. 6) and large (ref. 7) earthquakes; conversely, those that relax neighbouring faults have been related to a decrease in seismicity (ref. 5). Here we use an elastic half-space model (ref. 8) to estimate the stress changes produced by the Landers earthquake on selected southern California faults, including the San Andreas. We find that the estimated stress changes are consistent with the triggering of four out of the five aftershocks with magnitude greater than 4.5, and that the largest changes (1-10 bar), occurring on part of the San Bernardino segment of the San Andreas fault, may have decreased the time to the next magnitude 8 earthquake by about 14 years.

Harris, R.A.; Simpson, R.W.

1992-01-01

134

Earthquakes  

MedlinePLUS

135

Earthquake Education and Public Information Centers: A Collaboration Between the Earthquake Country Alliance and Free-Choice Learning Institutions in California  

Microsoft Academic Search

In 1999 the Southern California Earthquake Center initiated an effort to expand its reach to multiple target audiences through the development of an interpretive trail on the San Andreas fault at Wallace Creek and an earthquake exhibit at Fingerprints Youth Museum in Hemet. These projects and involvement with the San Bernardino County Museum in Redlands beginning in 2007 led to

R. M. Degroot; K. Springer; C. J. Brooks; L. Schuman; D. Dalton; M. L. Benthien

2009-01-01

136

Chapter B. The Loma Prieta, California, Earthquake of October 17, 1989 - Forecasts  

USGS Publications Warehouse

The magnitude (Mw) 6.9 Loma Prieta earthquake struck the San Francisco Bay region of central California at 5:04 p.m. P.d.t. on October 17, 1989, killing 62 people and generating billions of dollars in property damage. Scientists were not surprised by the occurrence of a destructive earthquake in this region and had, in fact, been attempting to forecast the location of the next large earthquake in the San Francisco Bay region for decades. This paper summarizes more than 20 scientifically based forecasts made before the 1989 Loma Prieta earthquake for a large earthquake that might occur in the Loma Prieta area. The forecasts geographically closest to the actual earthquake primarily consisted of right-lateral strike-slip motion on the San Andreas Fault northwest of San Juan Bautista. Several of the forecasts did encompass the magnitude of the actual earthquake, and at least one approximately encompassed the along-strike rupture length. The 1989 Loma Prieta earthquake differed from most of the forecasted events in two ways: (1) it occurred with considerable dip-slip in addition to strike-slip motion, and (2) it was much deeper than expected.

Harris, Ruth A.

1998-01-01

137

Probability assessment on the recurring Meishan earthquake in central Taiwan with a new non-stationary analysis  

NASA Astrophysics Data System (ADS)

From theory to experience, earthquake probability should be increasing with time as far as the same fault is concerned, rather than being a stationary process or independent of the date of the last occurrence. With a new non-stationary model, we evaluated the earthquake probability associated with the Meishan fault in central Taiwan, a growing concern to the local community given a relatively short return period reported (i.e., around 160 years). The analysis shows that on the condition that the earthquake has not recurred by the end of year 2014, the earthquake probability in the next 50 years could be around 0.3 (mean value), with a 95% confidence interval from 0.26 to 0.36.
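
The conditional-probability calculation behind a result like this can be sketched as follows (Python). The lognormal renewal distribution, the aperiodicity value, and the assumption that the last Meishan earthquake occurred in 1906 are placeholders for illustration, not the authors' non-stationary model.

import numpy as np
from scipy.stats import lognorm

# P(event in (t0, t0 + dt] | no event by t0) = (F(t0 + dt) - F(t0)) / (1 - F(t0))
mean_ri, cv = 160.0, 0.5                      # ~160 yr return period from the abstract; cv assumed
sigma = np.sqrt(np.log(1.0 + cv**2))
scale = mean_ri / np.exp(0.5 * sigma**2)      # lognormal scale chosen so the mean equals mean_ri
F = lambda t: lognorm.cdf(t, sigma, scale=scale)

t0 = 2014 - 1906                              # open interval, assuming the last event was in 1906
p_next_50yr = (F(t0 + 50) - F(t0)) / (1.0 - F(t0))
print(round(p_next_50yr, 2))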

Wang, J. P.; Yun, X.

2014-07-01

138

Likelihood- and residual-based evaluation of medium-term earthquake forecast models for California  

NASA Astrophysics Data System (ADS)

Seven competing models for forecasting medium-term earthquake rates in California are quantitatively evaluated using the framework of the Collaboratory for the Study of Earthquake Predictability (CSEP). The model class consists of contrasting versions of the Every Earthquake a Precursor According to Size (EEPAS) and Proximity to Past Earthquakes (PPE) modelling approaches. Models are ranked by their performance on likelihood-based tests, which measure the consistency between a model forecast and observed earthquakes. To directly compare one model against another, we run a classical paired t-test and its non-parametric alternative on an information gain score based on the forecasts. These test scores are complemented by several residual-based methods, which offer detailed spatial information. The experiment period covers 2009 June-2012 September, when California experienced 23 earthquakes above the magnitude threshold. Though all models fail to capture seismicity during an earthquake sequence, spatio-temporal differences between models also emerge. The overall best-performing model has strong time- and magnitude-dependence, weights all earthquakes equally as medium-term precursors of larger events and has a full set of fitted parameters. Models with this time- and magnitude-dependence offer a statistically significant advantage over simpler baseline models. In addition, models that down-weight aftershocks when forecasting larger events have a desirable feature in that they do not overpredict following an observed earthquake sequence. This tendency towards overprediction differs between the simpler model, which is based on fewer parameters, and more complex models that include more parameters.
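
For readers unfamiliar with the paired comparison used here, the sketch below (Python) shows the basic per-event information-gain statistic and the classical and non-parametric tests applied to it. The per-event rates are made-up placeholders, and the correction for differing total forecast counts used in the full CSEP T-test is omitted for brevity.

import numpy as np
from scipy import stats

# Per-event information gain of model A over model B: log of the ratio of the
# forecast rates in the space-magnitude bin containing each observed earthquake.
rates_A = np.array([0.012, 0.030, 0.008, 0.021, 0.015, 0.009])   # hypothetical bin rates
rates_B = np.array([0.010, 0.025, 0.011, 0.018, 0.016, 0.012])

gains = np.log(rates_A / rates_B)
t_stat, p_t = stats.ttest_1samp(gains, 0.0)   # classical paired t-test on the gains
w_stat, p_w = stats.wilcoxon(gains)           # non-parametric alternative
print(gains.mean(), p_t, p_w)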

Schneider, Max; Clements, Robert; Rhoades, David; Schorlemmer, Danijel

2014-09-01

139

SURFACE RUPTURES ON CROSS-FAULTS IN THE 24 NOVEMBER 1987 SUPERSTITION HILLS, CALIFORNIA, EARTHQUAKE SEQUENCE  

Microsoft Academic Search

Left-lateral slip occurred on individual surface breaks along northeast-trending faults associated with the 24 November 1987 earthquake sequence in the Superstition Hills, Imperial Valley, California. This sequence included the Ms = 6.2 event on a left-lateral, northeast-trending

K. HUDNUT; L. SEEBER; T. ROCKWELL; J. GOODMACHER; R. KLINGER; S. LINDVALL; R. MCELWAIN

1989-01-01

140

What Parts of PTSD Are Normal: Intrusion, Avoidance, or Arousal? Data from the Northridge, California, Earthquake  

Microsoft Academic Search

The incidence and comorbidity of posttraumatic stress disorder (PTSD) are addressed in a study of 130 Northridge, California, earthquake survivors interviewed 3 months postdisaster. Only 13% of the sample met full PTSD criteria, but 48% met both the reexperiencing and the arousal symptom criteria, without meeting the avoidance and numbing symptom criterion. Psychiatric comorbidity was associated mostly with avoidance and

J. Curtis McMillen; Carol S. North; Elizabeth M. Smith

2000-01-01

141

COMPARISON OF EARTHQUAKE AND MICROTREMOR GROUND MOTIONS IN EL CENTRO, CALIFORNIA  

Microsoft Academic Search

Strong earthquake ground shaking has been investigated by the study of 15 events recorded in El Centro, California. The strong-motion records analyzed show that no simple features (e.g., local site conditions) govern the details of local ground shaking. Any effects of local subsoil conditions at this site appear to be overshadowed by the source mechanism and the transmission path,

F. E. UDWADIA; M. D. TRIFUNAC

1973-01-01

142

Analysis of similar event clusters in aftershocks of the 1994 Northridge, California, earthquake  

E-print Network

... parameter analysis to characterize the shape of each cluster and to compute best fitting planes. With L. Astiz and K. B. Richards-Dinger.

Shearer, Peter

143

Complexity of energy release during the Imperial Valley, California, earthquake of 1940  

Microsoft Academic Search

The pattern of energy release during the Imperial Valley, California, earthquake of 1940 is studied by analyzing the El Centro strong motion seismograph record and records from the Tinemaha seismograph station, 546 km from the epicenter. The earthquake was a multiple event sequence with at least 4 events recorded at El Centro in the first 25 seconds, followed by

M. D. Trifunac; JAMES N. BRUNE

1970-01-01

144

Earthquake Locations and Three-Dimensional Crustal Structure in the Coyote Lake Area, Central California  

Microsoft Academic Search

Previous work on the simultaneous inversion method has been improved and extended to incorporate iterative solution for earthquake locations and laterally heterogeneous structure. Approximate ray tracing and parameter separation are important elements of the improved method. Application of the method to P wave arrival time data recorded by stations of the U.S. Geological Survey Central California Network yields a three-dimensional

Clifford H. Thurber

1983-01-01

145

Stress drops and radiated energies of aftershocks of the 1994 Northridge, California, earthquake  

E-print Network

We study stress drops of aftershocks of the 1994 Northridge earthquake; we estimate static and dynamic stress drops from the source time functions and compare them to well ... (Received December 2000; revised 11 May 2003; accepted 25 June 2003; published 28 November 2003.)

Abercrombie, Rachel E.

146

Prediction of central California earthquakes from soil-gas helium fluctuations  

USGS Publications Warehouse

The observations of short-term decreases in helium soil-gas concentrations along the San Andreas Fault in central California have been correlated with subsequent earthquake activity. The area of study is elliptical in shape with radii approximately 160 × 80 km, centered near San Benito, and with the major axis parallel to the Fault. For 83 percent of the M>4 earthquakes in this area a helium decrease preceded seismic activity by 1.5 to 6.5 weeks. There were several earthquakes without a decrease and several decreases without a corresponding earthquake. Owing to complex and unresolved interaction of many geophysical and geochemical parameters, no suitable model is yet developed to explain the observations. © 1985 Birkhäuser Verlag.

Reimer, G.M.

1985-01-01

147

Earthquake swarm along the San Andreas fault near Palmdale, Southern California, 1976 to 1977  

USGS Publications Warehouse

Between November 1976 and November 1977 a swarm of small earthquakes (local magnitude ≤ 3) occurred on or near the San Andreas fault near Palmdale, California. This swarm was the first observed along this section of the San Andreas since cataloging of instrumental data began in 1932. The activity followed partial subsidence of the 35-centimeter vertical crustal uplift known as the Palmdale bulge along this "locked" section of the San Andreas, which last broke in the great (surface-wave magnitude = 8¼+) 1857 Fort Tejon earthquake. The swarm events exhibit characteristics previously observed for some foreshock sequences, such as tight clustering of hypocenters and time-dependent rotations of stress axes inferred from focal mechanisms. However, because of our present lack of understanding of the processes that precede earthquake faulting, the implications of the swarm for future large earthquakes on the San Andreas fault are unknown. Copyright © 1978 AAAS.

Mcnally, K.C.; Kanamori, H.; Pechmann, J.C.; Fuis, G.

1978-01-01

148

Nonvolcanic tremor evolution and the San Simeon and Parkfield, California, earthquakes.  

PubMed

Nonvolcanic tremors occur adjacent to locked faults and may be closely related to the generation of earthquakes. Monitoring of the San Andreas Fault in the Parkfield, California, region revealed that after two strong earthquakes, tremor activity increased in a nearly dormant tremor zone, increased and became periodic in a previously active zone, and has remained elevated and periodic for over 4 years. Static shear- and Coulomb-stress increases of 6 to 14 kilopascals from these two earthquakes are coincident with sudden increases in tremor rates. The persistent changes in tremor suggest that stress is now accumulating more rapidly beneath this part of the San Andreas Fault, which ruptured in the moment magnitude 7.8 Ft. Tejon earthquake of 1857. PMID:19589999

Nadeau, Robert M; Guilhem, Aurélie

2009-07-10

149

Earthquakes.  

ERIC Educational Resources Information Center

One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake

Pakiser, Louis C.

150

The 1936, 1945-1947, and 1950 earthquake sequences near Lassen Peak, California  

NASA Astrophysics Data System (ADS)

Three vigorous earthquake sequences occurred near Lassen Peak in 1936, between 1945 and 1947, and in 1950; the latter two sequences included mainshocks of magnitude 5.0 and 5.5, respectively, and thousands of smaller events. No comparable earthquake sequences have occurred near Lassen Peak since 1950. The epicentral area lies within 20 km of the southern boundary of Lassen Volcanic National Park, in a northwest striking seismic zone that extends from Lake Tahoe to the vicinity of Mount Shasta. In comparing their time history and magnitude distribution with other earthquake sequences that have occurred in regions of Cenozoic volcanism within and east of the Cascade Range and the Sierra Nevada, we find that the Lassen earthquake sequences show similar characteristics to two earthquake sequences that occurred on Basin and Range faults near Herlong, California, and Klamath Falls, Oregon. We interpret this similarity as evidence that the Lassen earthquakes were caused by Basin and Range extension and may have occurred on one or more Basin and Range faults in the Lassen region. However, the limitations of the data do not allow other possible sources, such as magmatic injection, to be ruled out. The most important implication of the Lassen earthquake sequences is that earthquakes of M 5 or greater may occur in the Lassen region, perhaps quite close to Lassen Peak or other volcanoes. The record of Holocene volcanism and fault displacements in the region indicates that earthquake sequences driven by either tectonic or magmatic processes may occur near Lassen Peak, and any significant earthquake sequence should be carefully monitored to assess its nature.

Norris, R. D.; Meagher, K. L.; Weaver, C. S.

1997-01-01

151

On the reported ionospheric precursor of the 1999 Hector Mine, California earthquake  

NASA Astrophysics Data System (ADS)

Using Global Positioning System (GPS) data from sites near the 16 Oct. 1999 Hector Mine, California earthquake, Pulinets et al. (2007) identified anomalous changes in the ionospheric total electron content (TEC) starting one week prior to the earthquake. Pulinets (2007) suggested that precursory phenomena of this type could be useful for predicting earthquakes. On the other hand, and in a separate analysis, Afraimovich et al. (2004) concluded that TEC variations near the epicenter were controlled by solar and geomagnetic activity that were unrelated to the earthquake. In an investigation of these very different results, we examine TEC time series of long duration from GPS stations near and far from the epicenter of the Hector Mine earthquake, and long before and long after the earthquake. While we can reproduce the essential time series results of Pulinets et al., we find that the signal they identified as being anomalous is not actually anomalous. Instead, it is just part of normal global-scale TEC variation. We conclude that the TEC anomaly reported by Pulinets et al. is unrelated to the Hector Mine earthquake.

Thomas, J. N.; Love, J. J.; Komjathy, A.; Verkhoglyadova, O. P.; Butala, M.; Rivera, N.

2012-03-01

152

On the reported ionospheric precursor of the Hector Mine, California earthquake  

USGS Publications Warehouse

Using Global Positioning System (GPS) data from sites near the 16 Oct. 1999 Hector Mine, California earthquake, Pulinets et al. (2007) identified anomalous changes in the ionospheric total electron content (TEC) starting one week prior to the earthquake. Pulinets (2007) suggested that precursory phenomena of this type could be useful for predicting earthquakes. On the other hand, and in a separate analysis, Afraimovich et al. (2004) concluded that TEC variations near the epicenter were controlled by solar and geomagnetic activity that were unrelated to the earthquake. In an investigation of these very different results, we examine TEC time series of long duration from GPS stations near and far from the epicenter of the Hector Mine earthquake, and long before and long after the earthquake. While we can reproduce the essential time series results of Pulinets et al., we find that the signal they identified as being anomalous is not actually anomalous. Instead, it is just part of normal global-scale TEC variation. We conclude that the TEC anomaly reported by Pulinets et al. is unrelated to the Hector Mine earthquake.

Thomas, J.N.; Love, J.J.; Komjathy, A.; Verkhoglyadova, O.P.; Butala, M.; Rivera, N.

2012-01-01

153

Earthquakes  

MedlinePLUS

An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a ...

154

Moment accumulation rate on faults in California inferred from viscoelastic earthquake cycle models (Invited)  

NASA Astrophysics Data System (ADS)

Calculations of moment accumulation rates on active faults require knowledge of long-term fault slip rates and the area of the fault that is locked interseismically. These parameters are routinely estimated from geodetic data using elastic block models with back slip on dislocations in an elastic half-space. Yet, the elastic models are inconsistent with studies that infer postseismic viscous flow in the lower crust and mantle occurring for decades following large earthquakes. Viscous flow in the lower crust and mantle generates rapid, localized deformation early in the earthquake cycle and slower, more diffuse deformation later in the cycle. Elastic models which neglect this time-dependent flow process may lead to biased estimates of fault slip rates and locking distribution. To address this issue we have developed a three-dimensional earthquake cycle model consisting of fault-bounded blocks in an elastic crust overlying a viscoelastic lower crust and uppermost mantle. It is a kinematic model in which long-term motions of fault-bounded blocks are imposed. Interseismic locking of faults and associated deformation is modeled with steady back-slip on faults and imposed periodic earthquakes. Creep on unlocked portions of the faults occurs at constant stress and therefore the instantaneous creep rate is proportional to the instantaneous stressing rate on the fault. We compare geologic slip rate estimates in southern California with model estimates using GPS data and show that elastic block models underpredict slip rates on several faults that are late in the earthquake cycle and overpredict slip rates on faults that are early in the earthquake cycle. The viscoelastic cycle model, constrained by earthquake timing from the geologic record, predicts fault slip rates that are entirely consistent with geologic estimates for all major faults in southern California. For northern California, fault slip rate estimates using geodetic data appear not to be strongly dependent on model assumptions and are generally consistent with geologic estimates; therefore we focus on estimates of the distribution of interseismic locking of faults. We constrain the locking distribution using nearly a century of triangulation measurements of strain following the M7.8 1906 San Francisco earthquake, contemporary GPS velocities, geologic slip rate and earthquake timing data, and the viscoelastic earthquake cycle model with spatially variable distributions of locking and stress-driven creep. We find considerable lateral variations in locking depths in the San Francisco Bay area. Compared with our models of spatially variable locking distribution, models that assume a typical 15 km uniform locking depth overpredict the moment accumulation rate by a factor of 2-3 on the Peninsular San Andreas, Calaveras, Rodgers Creek, and Green Valley faults.
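
A rough sense of the moment accumulation rates being compared can be had from the standard relation Mdot0 = mu * A_locked * slip-deficit rate. The short sketch below (Python) uses illustrative numbers; the fault length, the 15 km uniform locking depth, and the slip rate are assumptions for the example, not values from the study.

MU = 3.0e10                 # shear modulus, Pa
LENGTH_M = 60e3             # along-strike length of a fault segment, m (assumed)
LOCKING_DEPTH_M = 15e3      # the uniform locking depth discussed above, m
SLIP_RATE_M_PER_YR = 0.02   # slip-deficit rate, m/yr (20 mm/yr, assumed)

# Moment accumulation rate in N*m per year for a fully locked rectangular patch
moment_rate = MU * (LENGTH_M * LOCKING_DEPTH_M) * SLIP_RATE_M_PER_YR
print(f"{moment_rate:.1e} N*m/yr")   # ~5.4e17 N*m/yr for these inputs; halving the
                                     # locked depth halves the accumulation rate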

Johnson, K. M.

2009-12-01

155

A record of large earthquakes during the past two millennia on the southern Green Valley Fault, California  

USGS Publications Warehouse

We document evidence for surface-rupturing earthquakes (events) at two trench sites on the southern Green Valley fault, California (SGVF). The 75-80-km long dextral SGVF creeps ~1-4 mm/yr. We identify stratigraphic horizons disrupted by upward-flowering shears and in-filled fissures unlikely to have formed from creep alone. The Mason Rd site exhibits four events from ~1013 CE to the Present. The Lopes Ranch site (LR, 12 km to the south) exhibits three events from 18 BCE to Present including the most recent event (MRE), 1610 ±52 yr CE (1σ) and a two-event interval (18 BCE-238 CE) isolated by a millennium of low deposition. Using Oxcal to model the timing of the 4-event earthquake sequence from radiocarbon data and the LR MRE yields a mean recurrence interval (RI or μ) of 199 ±82 yr (1σ) and ±35 yr (standard error of the mean), the first based on geologic data. The time since the most recent earthquake (open window since MRE) is 402 yr ±52 yr, well past μ~200 yr. The shape of the probability density function (pdf) of the average RI from Oxcal resembles a Brownian Passage Time (BPT) pdf (i.e., rather than normal) that permits rarer longer ruptures potentially involving the Berryessa and Hunting Creek sections of the northernmost GVF. The model coefficient of variation (cv, σ/μ) is 0.41, but a larger value (cv ~0.6) fits better when using BPT. A BPT pdf with μ of 250 yr and cv of 0.6 yields 30-yr rupture probabilities of 20-25% versus a Poisson probability of 11-17%.
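
The 30-yr conditional probability quoted above can be approximated with the Brownian Passage Time (inverse Gaussian) distribution. The sketch below (Python) uses the abstract's point values μ = 250 yr, cv = 0.6 and the 402-yr open interval; the mapping of (μ, cv) onto scipy's invgauss parameters is the only added assumption, and the point estimate lands near the lower end of the 20-25% range quoted above (the published range presumably also reflects uncertainty in μ and cv).

from scipy.stats import invgauss

# BPT with mean mu_ri and aperiodicity cv: scipy's invgauss(mu_param, scale=scale)
# has mean mu_param * scale and cv = sqrt(mu_param), so mu_param = cv**2.
mu_ri, cv = 250.0, 0.6
elapsed, window = 402.0, 30.0
mu_param = cv ** 2
scale = mu_ri / mu_param
F = lambda t: invgauss.cdf(t, mu_param, scale=scale)

# Conditional probability of rupture in the next 30 yr given a 402-yr open interval
p30 = (F(elapsed + window) - F(elapsed)) / (1.0 - F(elapsed))
print(round(p30, 2))   # ~0.19 for these point values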

Lienkaemper, James J.; Baldwin, John N.; Turner, Robert; Sickler, Robert R.; Brown, Johnathan

2013-01-01

156

History of Modern Earthquake Hazard Mapping and Assessment in California Using a Deterministic or Scenario Approach  

NASA Astrophysics Data System (ADS)

Modern earthquake ground motion hazard mapping in California began following the 1971 San Fernando earthquake in the Los Angeles metropolitan area of southern California. Earthquake hazard assessment followed a traditional approach, later called Deterministic Seismic Hazard Analysis (DSHA) in order to distinguish it from the newer Probabilistic Seismic Hazard Analysis (PSHA). In DSHA, seismic hazard in the event of the Maximum Credible Earthquake (MCE) magnitude from each of the known seismogenic faults within and near the state is assessed. The likely occurrence of the MCE has been assumed qualitatively by using late Quaternary and younger faults that are presumed to be seismogenic, but not when or within what time intervals the MCE may occur. The MCE is the largest or upper-bound potential earthquake in moment magnitude, and it supersedes and automatically considers all other possible earthquakes on that fault. That moment magnitude is used for estimating ground motions by applying it to empirical attenuation relationships, and for calculating ground motions as in neo-DSHA (Zuccolo et al., 2008). The first deterministic California earthquake hazard map was published in 1974 by the California Division of Mines and Geology (CDMG), which has been called the California Geological Survey (CGS) since 2002, using the best available fault information and ground motion attenuation relationships at that time. The California Department of Transportation (Caltrans) later assumed responsibility for printing the refined and updated peak acceleration contour maps, which were heavily utilized by geologists, seismologists, and engineers for many years. Some engineers involved in the siting process of large important projects, for example, dams and nuclear power plants, continued to challenge the map(s). The second edition map was completed in 1985, incorporating more faults, improving the MCE estimation method, and using new ground motion attenuation relationships from the latest published results at that time. CDMG eventually published the second edition map in 1992 following the Governor's Board of Inquiry on the 1989 Loma Prieta earthquake and at the demand of Caltrans. The third edition map was published by Caltrans in 1996, utilizing GIS technology to manage data that include a simplified three-dimensional geometry of faults and to facilitate efficient corrections and revisions of data and the map. The spatial relationship of fault hazards with highways, bridges or any other attribute can now be efficiently managed and analyzed in GIS at Caltrans. There has been great confidence in using DSHA in bridge engineering and other applications in California, and it can be confidently applied in any other earthquake-prone region. Earthquake hazards defined by DSHA are: (1) transparent and stable, with robust MCE moment magnitudes; (2) flexible in their application to design considerations; (3) able to easily incorporate advances in ground motion simulations; and (4) economical. DSHA and neo-DSHA have the same approach and applicability. The accuracy of DSHA has proven to be quite reasonable for practical applications within engineering design, always applied with professional judgment. In the final analysis, DSHA is a reality check for public safety and PSHA results. Although PSHA has been acclaimed as a better approach for seismic hazard assessment, it is DSHA, not PSHA, that has actually been used in seismic hazard assessment for building and bridge engineering, particularly in California.

Mualchin, Lalliana

2011-03-01

157

Fluid-driven seismicity response of the Rinconada fault near Paso Robles, California, to the 2003 M 6.5 San Simeon earthquake  

USGS Publications Warehouse

The 2003 M 6.5 San Simeon, California, earthquake caused significant damage in the city of Paso Robles and a persistent cluster of aftershocks close to Paso Robles near the Rinconada fault. Given the importance of secondary aftershock triggering in sequences of large events, a concern is whether this cluster of events could trigger another damaging earthquake near Paso Robles. An epidemic-type aftershock sequence (ETAS) model is fit to the Rinconada seismicity, and multiple realizations indicate a 0.36% probability of at least one M ≥ 6.0 earthquake during the next 30 years. However, this probability estimate is only as good as the projection into the future of the ETAS model. There is evidence that the seismicity may be influenced by fluid pressure changes, which cannot be forecasted using ETAS. The strongest evidence for fluids is the delay between the San Simeon mainshock and a high rate of seismicity in mid to late 2004. This delay can be explained as having been caused by a pore pressure decrease due to an undrained response to the coseismic dilatation, followed by increased pore pressure during the return to equilibrium. Seismicity migration along the fault also suggests fluid involvement, although the migration is too slow to be consistent with pore pressure diffusion. All other evidence, including focal mechanisms and b-value, is consistent with tectonic earthquakes. This suggests a model where the role of fluid pressure changes is limited to the first seven months, while the fluid pressure equilibrates. The ETAS modeling adequately fits the events after July 2004 when the pore pressure stabilizes. The ETAS models imply that while the probability of a damaging earthquake on the Rinconada fault has approximately doubled due to the San Simeon earthquake, the absolute probability remains low.
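
As background for the ETAS modelling mentioned above, the sketch below (Python) writes out the standard epidemic-type aftershock sequence conditional intensity; the parameter values are illustrative placeholders, not those fitted to the Rinconada seismicity.

import numpy as np

def etas_rate(t, event_times, event_mags, mu=0.05, K=0.02, alpha=0.8, c=0.01, p=1.1, m0=2.5):
    # lambda(t) = mu + sum over past events of K * 10**(alpha*(M_i - m0)) / (t - t_i + c)**p
    past = event_times < t
    dt = t - event_times[past]
    return mu + np.sum(K * 10.0 ** (alpha * (event_mags[past] - m0)) / (dt + c) ** p)

# Example: expected rate (events per day above m0) one day after a hypothetical M 6.5 mainshock
print(etas_rate(1.0, np.array([0.0]), np.array([6.5])))

Forecast probabilities such as the 0.36% figure come from simulating many synthetic catalogs from an intensity of this form and counting the realizations that contain an M ≥ 6.0 event.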

Hardebeck, Jeanne L.

2012-01-01

158

California takes earthquakes very seriously. The state straddles two major tectonic plates and is subject to relatively frequent, often major, potentially devastating quakes.  

E-print Network

... -minute earthquake half again as powerful as the temblor that destroyed San Francisco in 1906--or 30 times ... on California and its San Andreas Fault. A team led by Southern California Earthquake Center (SCEC) director ... earthquake waves above 1 hertz. According to computational scientist Yifeng Cui of the San Diego

159

Identification and Reduction of Nonstructural Earthquake Hazards in California Schools.  

ERIC Educational Resources Information Center

It is necessary to identify nonstructural hazards at the school site to reduce the possibly of injury in the event of an earthquake. Nonstructural hazards can occur in every part of a building and all of its contents with the exception of structure. In other words, nonstructural elements are everything but the columns, beams, floors, load-bearing…

Greene, Marjorie; And Others

160

Real-time earthquake detection and hazard assessment by ElarmS across California  

NASA Astrophysics Data System (ADS)

ElarmS is a network-based methodology for rapid earthquake detection, location and hazard assessment in the form of magnitude estimation and peak ground motion prediction. The methodology is currently being tested as part of the real-time seismic system in California leveraging the resources of the California Integrated Seismic Network (CISN) and the Advanced National Seismic System. A total of 603 velocity and acceleration sensors at 383 sites across the state stream waveform data to ElarmS processing modules at three network processing centers where waveforms are reduced to a few parameters. These parameters are then collected and processed at UC Berkeley to provide a single statewide prediction of future ground shaking that is updated every second. The system successfully detected the Mw 5.4 Alum Rock earthquake in northern California for which it generated an accurate hazard prediction before peak shaking began in San Francisco. It also detected the Mw 5.4 Chino Hills earthquake in southern California. The median system latency is currently 11.8 sec; the median waveform data latency is 6.5 sec.

Allen, Richard M.; Brown, Holly; Hellweg, Margaret; Khainovski, Oleg; Lombard, Peter; Neuhauser, Douglas

2009-03-01

161

The Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) Internship Program  

NASA Astrophysics Data System (ADS)

Our undergraduate research program, SCEC/UseIT, an NSF Research Experience for Undergraduates site, provides software for earthquake researchers and educators, movies for outreach, and ways to strengthen the technical career pipeline. SCEC/UseIT motivates diverse undergraduates towards science and engineering careers through team-based research in the exciting field of earthquake information technology. UseIT provides the cross-training in computer science/information technology (CS/IT) and geoscience needed to make fundamental progress in earthquake system science. Our high and increasing participation of women and minority students is crucial given the nation's precipitous enrollment declines in CS/IT undergraduate degree programs, especially among women. UseIT also casts a "wider, farther" recruitment net that targets scholars interested in creative work but not traditionally attracted to summer science internships. Since 2002, SCEC/UseIT has challenged 79 students in three dozen majors from as many schools with difficult, real-world problems that require collaborative, interdisciplinary solutions. Interns design and engineer open-source software, creating increasingly sophisticated visualization tools (see "SCEC-VDO," session IN11), which are employed by SCEC researchers, in new curricula at the University of Southern California, and by outreach specialists who make animated movies for the public and the media. SCEC-VDO would be a valuable tool for research-oriented professional development programs.

Perry, S.; Jordan, T.

2006-12-01

162

Tree-ring responses to the 1978 earthquake at Stephens Pass, northeastern California  

NASA Astrophysics Data System (ADS)

The 1978 earthquake at Stephens Pass, northeastern California, dropped a series of grabens that average 4.5 m in width, extend up to 1 m in depth, and are found intermittently along a 2-km-long rupture zone. The formation of this graben series killed or otherwise affected many trees growing in or immediately adjacent to the rupture zone. Nine trees responded to the 1978 earthquake with anomalously narrow ring widths, beginning in 1979 and continuing for several years. One tree responded with anomalously wide latewood relative to total ring width. This example of tree-ring responses to a normal-fault earthquake complements other cases of tree-growth responses to earthquakes of thrust and strike-slip tectonic settings. The 1978 earthquake at Stephens Pass was unique in that it caused tree-ring responses even though it was only moderate (magnitude 4.6). This study serves as a specific calibration example for dendrochronologically studying prehistoric earthquakes, as well as eruptions, at the nearby Medicine Lake Highlands. Medicine Lake has been seismically and volcanically active during the past 1000 yr, and it supports a forest of several coniferous tree species that can be used for dendrochronologically studying geomorphological processes.

Sheppard, Paul R.; White, Lester O.

1995-02-01

163

Chapter B. The Loma Prieta, California, Earthquake of October 17, 1989 - Highway Systems  

USGS Publications Warehouse

This paper summarizes the impact of the Loma Prieta earthquake on highway systems. City streets, urban freeways, county roads, state routes, and the national highway system were all affected. There was damage to bridges, roads, tunnels, and other highway structures. The most serious damage occurred in the cities of San Francisco and Oakland, 60 miles from the fault rupture. The cost to repair and replace highways damaged by this earthquake was $2 billion. About half of this cost was to replace the Cypress Viaduct, a long, elevated double-deck expressway that had a devastating collapse which resulted in 42 deaths and 108 injuries. The earthquake also resulted in some positive changes for highway systems. Research on bridges and earthquakes began to be funded at a much higher level. Retrofit programs were started to upgrade the seismic performance of the nation's highways. The Loma Prieta earthquake changed earthquake policy and engineering practice for highway departments not only in California, but all over the world.

Yashinsky, Mark

1998-01-01

164

Earthquakes.  

ERIC Educational Resources Information Center

Presents an analysis of the causes of earthquakes. Topics discussed include (1) geological and seismological factors that determine the effect of a particular earthquake on a given structure; (2) description of some large earthquakes such as the San Francisco quake; and (3) prediction of earthquakes. (HM)

Walter, Edward J.

1977-01-01

165

Lg and Rg waves on the California regional networks from the December 23, 1985 Nahanni earthquake  

USGS Publications Warehouse

Investigates Lg and Rg propagation in California using the central and southern California regional networks. The waveforms recorded from the December 23, 1985, Nahanni, Canada, earthquake are used to construct three profiles along the propagation path (almost N-S) and three perpendicular to the propagation path (almost E-W) to look at the nature of propagation of these two types of surface waves. Groups of records from stations in various geological and tectonic provinces in California are also examined in order to establish regional characteristics of the surface waves. It is found that the propagation characteristics of Lg differ from those of Rg across California; Lg waves are apparently more sensitive to crustal heterogeneities. The most striking observations are the similarity of coda for both the Lg and the Rg waves within geologic provinces and the marked difference in coda between regions. -from Authors

Wald, L.A.; Heaton, T.H.

1991-01-01

166

Earthquakes  

NSDL National Science Digital Library

Provided by the British Geological Survey, the Earthquakes Web site contains numerous educational topics for kids. Best suited for junior high school students and older, the site contains information on macroseismology (or the observable effects of earthquakes on people, buildings, and nature); seismic hazards; earthquake monitoring; recent and historical earthquakes; and more. Other links on the site include a Questions and Answers page, earthquake references, and additional educational links culminating in an informative and helpful source of online science learning.

167

Space-Time Clustering and Correlations of Major Earthquakes  

SciTech Connect

Earthquake occurrence in nature is thought to result from correlated elastic stresses, leading to clustering in space and time. We show that the occurrence of major earthquakes in California correlates with time intervals when fluctuations in small earthquakes are suppressed relative to the long term average. We estimate a probability of less than 1% that this coincidence is due to random clustering.

Holliday, James R. [Center for Computational Science and Engineering, University of California, Davis, California 95616 (United States); Department of Physics, University of California, Davis, California 95616 (United States); Rundle, John B. [Center for Computational Science and Engineering, University of California, Davis, California 95616 (United States); Department of Physics, University of California, Davis, California 95616 (United States); Department of Geology, University of California, Davis, California 95616 (United States); Turcotte, Donald L. [Department of Geology, University of California, Davis, California 95616 (United States); Klein, William [Department of Physics, Boston University, Boston, Massachusetts 02215 (United States); Tiampo, Kristy F. [Department of Earth Sciences, University of Western Ontario, London, Ontario N6A 5B8 (Canada); Donnellan, Andrea [NASA Jet Propulsion Laboratory, Pasadena, California 91109 (United States)

2006-12-08

168

Do earthquakes exhibit self-organized criticality?  

PubMed

If earthquakes are phenomena of self-organized criticality (SOC), statistical characteristics of the earthquake time series should be invariant after the sequence of events in an earthquake catalog are randomly rearranged. In this Letter we argue that earthquakes are unlikely phenomena of SOC because our analysis of the Southern California Earthquake Catalog shows that the first-return-time probability PM(T) is apparently changed after the time series is rearranged. This suggests that the SOC theory should not be used to oppose the efforts of earthquake prediction. PMID:15245263
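
The shuffling test described above can be illustrated in a few lines of Python. The toy catalog below is a stand-in for the Southern California Earthquake Catalog and the magnitude threshold is arbitrary: compute the first-return (inter-event) times for events at or above a cutoff, then recompute them after randomly reassigning magnitudes to the occurrence times; under SOC the two distributions should look the same.

import numpy as np

rng = np.random.default_rng(0)

def return_times(times, mags, m_cut):
    # First-return times for events with magnitude >= m_cut
    t = np.sort(times[mags >= m_cut])
    return np.diff(t)

# Toy catalog (days, Gutenberg-Richter-like magnitudes) standing in for a real one
times = np.sort(rng.uniform(0.0, 3650.0, 5000))
mags = 2.0 + rng.exponential(0.45, 5000)

observed = return_times(times, mags, m_cut=4.0)
shuffled = return_times(times, rng.permutation(mags), m_cut=4.0)
print(np.median(observed), np.median(shuffled))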

Yang, Xiaosong; Du, Shuming; Ma, Jin

2004-06-01

169

Focal mechanisms of Southern California offshore earthquakes: the effects of incomplete geographical data coverage on understanding rupture patterns  

NASA Astrophysics Data System (ADS)

Calculating accurate focal mechanisms for offshore seismic events is difficult due to a lack of nearby seismic stations, limited azimuthal coverage, and uncertain velocity structure. We conducted an experiment to determine what effect data from island seismic stations in Southern California (San Miguel, Santa Rosa, Santa Cruz, Santa Barbara, San Nicolas, Santa Catalina, and San Clemente Islands), and ocean bottom seismometers (OBSs) have on constraining focal mechanisms for earthquakes in the California Borderland with a local magnitude greater than three. Thirty-four OBSs were deployed in August of 2010 with the ALBACORE project to collect data for over a year before being recovered in September of 2011. Waveform data from those stations as well as the Southern California Seismic Network were analyzed to determine P-wave first-motion polarities for twenty-nine earthquakes with an acceptable signal-to-noise ratio. These data were then used to calculate focal mechanisms with and without the offshore stations using HASH v.1.2 [Hardebeck and Shearer, 2002], an algorithm that accounts for errors in earthquake location, velocity model, and polarity observations. Comparisons of these results show that including offshore stations improves the errors in fault plane uncertainty and solution probability due to the increased azimuthal coverage and smaller source-receiver distance. Plots of these solutions on maps of the offshore region indicate that the San Clemente fault, San Diego Trough fault, Palos Verdes fault, and additional unmapped faults are currently active. These observations agree with maps of more comprehensive seismicity patterns from the past twenty years. Additionally, the focal mechanisms show that the San Clemente fault, San Diego Trough fault, and a region south of San Nicolas Island all exhibit right lateral movement. The Palos Verdes fault exhibits reverse faulting and a region west of the northern Channel Islands exhibits normal faulting. These observations provide evidence that offshore faults are not purely strike-slip, but have normal and reverse slip, and present the possibility of producing tsunamis that could threaten the highly populated areas of Southern California.

Brunner, K.; Kohler, M. D.; Weeraratne, D. S.

2011-12-01

170

Earthquake and Tsunami planning, outreach and awareness in Humboldt County, California  

NASA Astrophysics Data System (ADS)

Humboldt County has the longest coastline in California and is one of the most seismically active areas of the state. It is at risk from earthquakes located on and offshore and from tsunamis generated locally from faults associated with the Cascadia subduction zone (CSZ), other regional fault systems, and from distant sources elsewhere in the Pacific. In 1995 the California Division of Mines and Geology published the first earthquake scenario to include both strong ground shaking effects and a tsunami. As a result of the scenario, the Redwood Coast Tsunami Work Group (RCTWG), an organization of representatives from government agencies, tribes, service groups, academia and the private sector from the three northern coastal California counties, was formed in 1996 to coordinate and promote earthquake and tsunami hazard awareness and mitigation. The RCTWG and its member agencies have sponsored a variety of projects including education/outreach products and programs, tsunami hazard mapping, signage and siren planning, and have sponsored an Earthquake - Tsunami Education Room at the Humboldt County fair for the past eleven years. Three editions of Living on Shaky Ground, an earthquake-tsunami preparedness magazine for California's North Coast, have been published since 1993 and a fourth is due to be published in fall 2008. In 2007, Humboldt County was the first region in the country to participate in a tsunami training exercise at FEMA's Emergency Management Institute in Emmitsburg, MD and the first area in California to conduct a full-scale tsunami evacuation drill. The County has conducted numerous multi-agency, multi-discipline coordinated exercises using its county-wide tsunami response plan. Two Humboldt County communities were recognized as TsunamiReady by the National Weather Service in 2007. Over 300 tsunami hazard zone signs have been posted in Humboldt County since March 2008. Six assessment surveys from 1993 to 2006 have tracked preparedness actions and personal awareness of earthquake and tsunami hazards in the county and additional surveys have tracked public awareness and tourist concerns about tsunami hazard signs. Over the thirteen-year period covered by the surveys, the percent with houses secured to foundations has increased from 58 to 80 percent, respondents aware of a local tsunami hazard increased from 51 to 73 percent and knowing what the Cascadia subduction zone is from 16 to 42 percent.

Ozaki, V.; Nicolini, T.; Larkin, D.; Dengler, L.

2008-12-01

171

Historigraphical analysis of the 1857 Ft. Tejon earthquake, San Andreas Fault, California: Preliminary results  

NASA Astrophysics Data System (ADS)

Past historical analyses of the 1857 Fort Tejon earthquake include Townley and Allen (1939); Wood (1955) re-examined the earthquake and added some new material, and Agnew and Sieh (1978) published an extensive review of the previous publications and included primary sources not formerly known. Since 1978, most authors have reiterated the findings of Agnew and Sieh, with the exception of Meltzner and Wald's 1998 work that built on Sieh's foreshock research and included an extensive study of aftershocks. Approximately twenty-five years have passed since the last full investigation of the event. In the last several decades, libraries and archives have continued to gather additional documents. Staff members continually inventory new and existing collections, making them accessible to researchers today. As a result, we are conducting an updated examination, with the hope of new insight regarding the 1857 Fort Tejon earthquake. We use a new approach to the topic: the research skills of a historian in collaboration with a geologist to generate quantitative data on the nature and location of ground shaking associated with the earthquake. We analyze documents from the Huntington Library, California State Historical Society, California State Library-California Room, Utah Historical Association Information Center, the Church of Jesus Christ of Latter-day Saints (LDS) Archives and Historical Department, Cal Tech Archives, the National Archives, and the Fort Tejon State Park. New facilities reviewed also include Utah State University, University of Utah, and the LDS Family History Center. Each facility not only provided formerly quoted sources, but many offered new materials. For example, previous scholars examined popular, well-known newspapers; yet, publications in smaller towns and in languages other than English also existed. Thirty newspapers published in January 1857 were located. We find records of the event at least one year after the earthquake. One outcome of such a search includes letters and approximately eight pictures useful in structure-damage analysis. Over 170 newspapers were published during 1857 throughout California, Nevada, and New Mexico Territory, encompassing the area of Arizona and New Mexico today. Historical information regarding the settlement of areas also proved useful. Although earlier scholars knew of LDS settlement missions in San Bernardino, California and Las Vegas, Nevada, only brief information was located. Preliminary results include increasing the felt area to include Las Vegas, Nevada; support for a Mercalli Index of IX or even X for San Bernardino; VIII or greater for sites NE of Sacramento; a northwest to southeast rupture pattern; and reports of electromagnetic disturbances. Based on these results, we suggest that the 1857 Ft. Tejon earthquake was felt over a wider area, and in places created greater ground shaking, than previously documented.

Martindale, D.; Evans, J. P.

2002-12-01

172

DEFORMATION NEAR THE EPICENTER OF THE 1984 ROUND VALLEY, CALIFORNIA, EARTHQUAKE.  

USGS Publications Warehouse

A trilateration network extending from near Mammoth Lakes to Bishop, California, was resurveyed following the November 23, 1984, Round Valley earthquake (ML = 5.8). The network had previously been surveyed in 1982. Deformation apparently associated with the Round Valley earthquake was detected, as well as deformation due to the expansion of a magma chamber 8 km beneath the resurgent dome in the Long Valley caldera and right-lateral slip on the uppermost 2 km of the 1983 rupture surface in the south moat of the caldera. The deformation associated with the Round Valley earthquake suggests left-lateral slip on the north-northeasterly striking vertical plane defined by the aftershock hypocenters. (Edited author abstract) Refs.

Gross, W.K.; Savage, J.C.

1985-01-01

173

A Composite Chronology of Earthquakes From the Bidart fan Paleoseismic Site, San Andreas Fault, California  

NASA Astrophysics Data System (ADS)

Chronologies of earthquakes spanning at least ten ruptures at multiple sites are required for developing robust models of fault behavior and forecasts of future earthquakes. Such a long chronology can be obtained by placing multiple trenches across the San Andreas fault at the Bidart alluvial fan paleoseismic site in the Carrizo Plain to capture the spatio-temporal record of earthquakes created by the interplay of surface rupture and spatially varying deposition. Exposures from one trench reveal evidence of at least 6 and probably 7 earthquakes since 3000 BP. Evidence of 7 earthquakes since 2200 BP has been interpreted from exposures in 3 other trenches. Analysis of exposures from two new trenches is in progress. Excavations reveal alternating sequences of depositional preservation and gaps in the record of earthquakes. The "gaps" are massive featureless zones caused by bioturbation of the fan surface while that portion of the fan was depositionally inactive. When the depositional record of 4 trenches is combined, it yields a composite chronology of at least 10 surface ruptures over the last 3000 years, for a minimum average recurrence interval of 300 years if the most recent event exposed in all trenches is assumed to be the 1857 Fort Tejon earthquake. So far, the uncertainty in dates of pre-1857 ruptures ranges from decades to millennia, and at least 5 of the 10 recognized earthquakes are obscured by depositional gaps at one of the trench sites. Therefore, synchroneity of ruptures at different trench sites is difficult to establish, and there is the possibility that the existing record contains more than 10 earthquakes and/or additional ruptures may have occurred that are not preserved by deposition.

Grant, L. B.; Arrowsmith, J. R.; Akciz, S.

2005-12-01

174

Earthquakes  

NSDL National Science Digital Library

This outline of basic information on earthquakes starts with an explanation of an earthquake, including the forces acting on rock, (tension, compression, and shear) and plastic and elastic deformation of rock. Next, the principle of the seismograph, seismometer, and seismogram along with the three types of seismic waves are discussed. Information is then presented to help the student distinguish between the focus and epicenter of an earthquake, describe the world-wide distribution pattern of earthquake activity, and explain the earthquake magnitude (Richter) scale and the Modified Mercalli scale of earthquake intensity. This site also includes an explanation of how the epicenter of an earthquake can be located. There is a discussion of some past earthquakes along with a description of the effects of earthquake activity.

Pamela Gore

175

Geophysical Investigations Along the Hayward Fault, Northern California, and Their Implications on Earthquake Hazards  

NASA Astrophysics Data System (ADS)

Geophysical studies indicate that the Hayward Fault follows a pre-existing basement structure and that local geologic features play an important role in earthquake seismicity. The recent creeping trace of the Hayward Fault extends for about 90 km from San Pablo Bay in the northwest to Fremont in the southeast, and together with its northern extension, the Rodgers Creek Fault, is regarded as one of the most hazardous faults in northern California. The Hayward Fault is predominantly a right-lateral strike-slip fault that forms the western boundary of the East Bay Hills and separates Franciscan Complex rocks on the southwest from Coast Range Ophiolite and Great Valley Sequence basement rocks on the northeast. The Hayward Fault is characterized by distinct linear gravity and magnetic anomalies that correlate with changes in geology, structural trends, creep rates, and clusters of seismicity. These correlations indicate the existence of fault-zone discontinuities that probably reflect changes in mechanical properties. These fault-zone discontinuities may play a role in defining fault segments--locations where recurring seismic ruptures may tend to nucleate or terminate. Along the central part of the Hayward Fault, a prominent gravity and magnetic anomaly correlates with an exposed gabbro body, the San Leandro gabbro. Modeling of these anomalies reveals that the San Leandro gabbro is much more extensive in the subsurface than the outcrop pattern suggests, extending to a depth of about 6-8 km. The inferred extent of the San Leandro gabbro, its geologic setting, and associated seismicity suggest that the Hayward Fault evolved from a pre-existing basement feature, similar to the ancestral Coast Range Fault. Combined modeling and relocated double-difference seismicity data indicate that the dip of the fault surface varies from near vertical in the north to about 75 degrees in the central part to about 50 degrees in the south near Fremont and ultimately connects with the central Calaveras Fault. A seismicity cluster along the western edge of the San Leandro gabbro and a bend in the fault associated with the gravity and magnetic high along the gabbro body suggest that this mafic body influences fault geometry and behavior, and may serve as a nucleation point for large earthquakes on the fault.

Ponce, D. A.; Graymer, R. W.; Hildenbrand, T. G.; Jachens, R. C.; Simpson, R. W.

2007-12-01

176

Earthquakes  

NSDL National Science Digital Library

This lesson on earthquakes is based on naturalist John Muir's experiences with two significant earthquakes, the 1872 earthquake on the east side of the Sierra Nevada Mountains, and the Great San Francisco Earthquake of 1906. Students will learn to explain that earthquakes are sudden motions along breaks in the crust called faults, and to list the major geologic events (earthquakes, volcanic eruptions, and mountain building) that result from crustal plate motions. A downloadable, printable version (PDF) of the lesson plan is available.

177

The 3 August 2009 Mw 6.9 Canal de Ballenas Region, Gulf of California, Earthquake and Its Aftershocks  

E-print Network

The 3 August 2009 Mw 6.9 earthquake occurred near Canal de Ballenas, in the north-central region of the Gulf of California, Mexico. The focal ... August a third aftershock of Mw 5.7 was located in the Canal de Ballenas region. The events of August ...

Shearer, Peter

178

Slip budget and potential for a M7 earthquake in central California  

NASA Astrophysics Data System (ADS)

The slip rate budget of the San Andreas fault (SAF) in central California, which is approximately 33 mm/yr, is accounted for by a change in the slip release mechanism along the fault. In the NW section of the fault, between Bear Valley and Monarch Peak, creep apparently accounts for the slip budget, with seismicity contributing negligibly. The section at Parkfield marks the transition from a creeping to a locked fault trace. Since the M8 1857 earthquake, five M6 earthquakes have occurred but have not completely accounted for the slip budget. Southeast of Parkfield, the SAF has been locked since 1857. From Cholame to Bitterwater Valley, this section now lags the deep slip by the amount of slip released in 1857; consequently, faulting in this section could occur at the time of the next Parkfield earthquake. If this earthquake releases the slip deficit accumulated in the transition zone and in the locked zone, the earthquake will have a moment magnitude of 7.2.
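
As a rough illustration of the slip-budget arithmetic described above, the sketch below converts an accumulated slip deficit into a moment magnitude using the standard seismic-moment definition and the Hanks-Kanamori relation. The 33 mm/yr slip rate and the 1857 locking date come from the abstract; the fault length, width, and rigidity are illustrative assumptions only.

```python
import math

# Minimal sketch: convert an accumulated slip deficit into a moment magnitude
# using M0 = mu * A * D and the Hanks-Kanamori relation.
# The slip rate (~33 mm/yr) and the 1857 locking date come from the abstract;
# the fault length, width, and rigidity below are illustrative assumptions only.

slip_rate_m_per_yr = 0.033          # ~33 mm/yr deep slip rate (from the abstract)
years_locked = 1988 - 1857          # locked since the 1857 earthquake
slip_deficit_m = slip_rate_m_per_yr * years_locked   # ~4.3 m

fault_length_m = 60e3               # assumed rupture length (illustrative)
fault_width_m = 12e3                # assumed seismogenic width (illustrative)
rigidity_pa = 3.0e10                # typical crustal rigidity

moment_nm = rigidity_pa * fault_length_m * fault_width_m * slip_deficit_m
mw = (2.0 / 3.0) * math.log10(moment_nm) - 6.07   # Hanks & Kanamori, M0 in N*m

print(f"slip deficit ~{slip_deficit_m:.1f} m, M0 ~{moment_nm:.2e} N*m, Mw ~{mw:.1f}")
# With these assumed dimensions the result is close to the M7.2 quoted in the abstract.
```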

Harris, Ruth A.; Archuleta, Ralph J.

1988-10-01

179

Earthquake Education and Public Information Centers: A Collaboration Between the Earthquake Country Alliance and Free-Choice Learning Institutions in California  

NASA Astrophysics Data System (ADS)

In 1999 the Southern California Earthquake Center initiated an effort to expand its reach to multiple target audiences through the development of an interpretive trail on the San Andreas fault at Wallace Creek and an earthquake exhibit at Fingerprints Youth Museum in Hemet. These projects and involvement with the San Bernardino County Museum in Redlands beginning in 2007 led to the creation of Earthquake Education and Public Information Centers (EPIcenters) in 2008. The impetus for the development of the network was to broaden participation in The Great Southern California ShakeOut. By 2009 it had grown more comprehensive in scope, evolving into a statewide network. EPIcenters constitute a variety of free-choice learning institutions, representing museums, science centers, libraries, universities, parks, and other places visited by a variety of audiences including families, seniors, and school groups. They share a commitment to demonstrating and encouraging earthquake preparedness. EPIcenters coordinate Earthquake Country Alliance activities in their county or region, lead presentations or organize events in their communities, or in other ways demonstrate leadership in earthquake education and risk reduction. The San Bernardino County Museum (Southern California) and The Tech Museum of Innovation (Northern California) serve as EPIcenter regional coordinating institutions. They interact with over thirty institutional partners who have implemented a variety of activities from displays and talks to earthquake exhibitions. While many activities are focused on the time leading up to and just after the ShakeOut, most EPIcenter members conduct activities year round. Network members at Kidspace Museum in Pasadena and San Diego Natural History Museum have formed EPIcenter focus groups on early childhood education and safety and security. This presentation highlights the development of the EPIcenter network, synergistic activities resulting from this collaboration, and lessons learned from interacting with free-choice learning institutions.

Degroot, R. M.; Springer, K.; Brooks, C. J.; Schuman, L.; Dalton, D.; Benthien, M. L.

2009-12-01

180

Earthquake  

MedlinePLUS

... Earthquake App! Text "GETQUAKE" to 90999 or search "Red Cross Earthquake" in the Apple App Store or Google Play . Aplicación Terremoto - ahora disponible en español también! Be Red Cross Ready Are you Red Cross Ready? Click ...

181

Earthquakes  

ERIC Educational Resources Information Center

Describes the causes and effects of earthquakes, defines the meaning of magnitude (measured on the Richter Magnitude Scale) and intensity (measured on a modified Mercalli Intensity Scale) and discusses earthquake prediction and control. (JR)

Roper, Paul J.; Roper, Jere Gerard

1974-01-01

182

Cruise report for 01-99-SC: southern California earthquake hazards project  

USGS Publications Warehouse

The focus of the Southern California Earthquake Hazards project is to identify the landslide and earthquake hazards and related ground-deformation processes occurring in the offshore areas that have significant potential to impact the inhabitants of the Southern California coastal region. The project activity is supported through the Coastal and Marine Geology Program of the Geologic Division of the U. S. Geological Survey (USGS) and is a component of the Geologic Division's Science Strategy under Goal 1—Conduct Geologic Hazard Assessments for Mitigation Planning (Bohlen et al., 1998). The project research is specifically stated under Activity 1.1.2 of the Science Strategy: Earthquake Hazard Assessments and Loss Reduction Products in Urban Regions. This activity involves "research, seismic and geodetic monitoring, field studies, geologic mapping, and analyses needed to provide seismic hazard assessments of major urban centers in earthquake-prone regions including adjoining coastal and offshore areas." The southern California urban areas, which form the most populated urban corridor along the U.S. Pacific margin, are among a few specifically designated for special emphasis under the Division's science strategy (Bohlen et al., 1998). The primary objective of the project is to help mitigate the earthquake hazards for the Southern California region by improving our understanding of how deformation is distributed (spatially and temporally) in the offshore with respect to the onshore region. To meet this objective, we are conducting field investigations to observe the distribution, character, and relative intensity of active (i.e., primarily Holocene) deformation within the basins and along the shelf adjacent to the most highly populated areas (Fig. 1). In addition, acoustic imaging should help determine the subsurface dimensions of the faults and identify the size and frequency of submarine landslides, both of which are necessary for evaluating the potential for generating destructive tsunamis in the southern California offshore. In order to evaluate the strain associated with the offshore structures, the initial results from the field mapping under this project will be used to identify possible sites for deployment of acoustic geodetic instruments to monitor strain in the offshore region. A major goal of mapping under this project is to provide detailed geologic and geophysical information in GIS data bases that build on the earlier studies and use the new data to precisely locate active faults and to map recent submarine landslide deposits.

Normark, William R.; Reid, Jane A.; Sliter, Ray W.; Holton, David; Gutmacher, Christina E.; Fisher, Michael A.; Childs, Jonathan R.

1999-01-01

183

Time-dependent Earthquake Probability Calculations For Faults In The Sea of Marmara  

Microsoft Academic Search

The earthquake migration along the North Anatolian Fault Zone during the last century left a gap in the Sea of Marmara, since the rupturing process stopped with the 1999 MW = 7.4 Izmit earthquake in the easternmost part of the Sea, the Gulf of Izmit. Furthermore, because the last M > 7 earthquake in the Sea of Marmara occurred

G. Hillers; F. Roth; H.-P. Harjes; J. Zschau

2002-01-01

184

Earthquakes  

NSDL National Science Digital Library

For this exercise we meet in a computer lab and students access the IRIS Earthquake Browser to download geospatial information of earthquakes. Students use the GEON Integrated Data Viewer (IDV) to explore the location of earthquake zones and their 3-dimensional characteristics. Students compare the earthquake characteristics of subduction zones, mid-oceanic ridges, and transform faults. This leads into a discussion of plate tectonics.

Achim Herrmann

185

Web Services and Other Enhancements at the Northern California Earthquake Data Center  

NASA Astrophysics Data System (ADS)

The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, or MiniSEED depending on the service, and are compatible with the equivalent IRIS DMC web services. The NCEDC is currently providing the following Web Services: (1) Station inventory and channel response information delivered in StationXML format, (2) Channel response information delivered in RESP format, (3) Time series availability delivered in text and XML formats, (4) Single channel and bulk data request delivered in MiniSEED format. The NCEDC is also developing a rich Earthquake Catalog Web Service to allow users to query earthquake catalogs based on selection parameters such as time, location or geographic region, magnitude, depth, azimuthal gap, and rms. It will return (in QuakeML format) user-specified results that can include simple earthquake parameters, as well as observations such as phase arrivals, codas, amplitudes, and computed parameters such as first motion mechanisms, moment tensors, and rupture length. The NCEDC will work with both IRIS and the International Federation of Digital Seismograph Networks (FDSN) to define a uniform set of web service specifications that can be implemented by multiple data centers to provide users with a common data interface across data centers. The NCEDC now hosts earthquake catalogs and waveforms from the US Department of Energy (DOE) Enhanced Geothermal Systems (EGS) monitoring networks. These data can be accessed through the above web services and through special NCEDC web pages.
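
A minimal sketch of how the station-inventory service described above might be queried over HTTP. The base URL and the example network/station codes are assumptions for illustration (the parameter names follow common FDSN-style conventions); they are not taken from the abstract.

```python
import requests

# Sketch of a REST query against the kind of station-inventory web service the
# abstract describes. The base URL and parameters below follow common
# FDSN-style conventions and are assumptions, not taken from the abstract.
BASE = "https://service.ncedc.org/fdsnws/station/1/query"   # assumed endpoint

params = {
    "network": "BK",        # example network code
    "station": "CMB",       # example station code
    "level": "channel",     # return channel-level metadata
    "format": "xml",        # StationXML, as described in the abstract
}

response = requests.get(BASE, params=params, timeout=30)
response.raise_for_status()

# The response body is StationXML; save it for use with other tools.
with open("CMB_inventory.xml", "wb") as f:
    f.write(response.content)
print(f"Received {len(response.content)} bytes of StationXML")
```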

Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.

2012-12-01

186

Earthquakes  

NSDL National Science Digital Library

In this lesson, students explore the causes of earthquakes and their impact on the geology of an area and on human societies. They begin by looking at the role tectonic plates play in creating the forces that cause earthquakes, to help them understand why earthquakes occur when and where they do. Hands-on activities illustrate how rocks can withstand a certain amount of stress, but that every material has its breaking point. When rocks break underground, an earthquake occurs. In the last section, students explore the impact earthquakes have on humans and look at the efforts scientists are making to better understand and predict these sometimes deadly events.

187

Designing a network-based earthquake early warning system for California: ElarmS-2  

NASA Astrophysics Data System (ADS)

The California Integrated Seismic Network (CISN), funded by the USGS, is developing an Earthquake Early Warning (EEW) system for the state. Within this 'California ShakeAlert' project, three algorithms are being tested, one of which is the network-based Earthquake Alarm Systems (ElarmS) EEW. Over the last 3 years, the ElarmS algorithms have undergone a large-scale reassessment and have been re-coded to solve technological and methodological challenges. The improved algorithms in the new production-grade version of the code (E2) make full use of the current seismic network's configuration, hardware, and software capabilities, improving both the speed of the early warning processing and the accuracy of the results. E2 is designed as a modular code and consists of a new event monitor module with an improved associator that allows for more rapid association with fewer triggers, while also adding several new alert filter checks that help minimize false alarms. Here, we outline the methodology and summarize the performance of this new online real-time system. Online performance from 2 October 2012 to 15 February 2013 shows that, on average, ElarmS currently issues an alert 8.68 ± 3.73 s after the first P-wave detection for events across California. This time is reduced by 2 seconds in regions with dense station instrumentation. Standard deviations of magnitude and origin time are 0.4 magnitude units and 1.2 seconds, respectively, and the median location error is 3.8 km. E2 successfully detected 26 of 29 earthquakes (M>3.5) across California while issuing 2 false alarms. E2 is now delivering alerts to ShakeAlert, which in turn distributes warnings to test users.

Allen, R. M.; Kuyuk, H. S.; Henson, I. H.; Neuhauser, D. S.; Hellweg, M.

2013-12-01

188

Localization of intermediate-term earthquake prediction  

Microsoft Academic Search

Relative seismic quiescence within a region which has already been diagnosed as having entered a Time of Increased Probability (TIP) for the occurrence of a strong earthquake can be used to refine the locality in which the earthquake may be expected to occur. A simple algorithm with parameters fitted from the data in Northern California preceding the 1980 magnitude 7.0

V. G. Kossobokov; V. I. Keilis-Borok; S. W. Smith

1990-01-01

189

Surface fault slip associated with the 2004 Parkfield, California, earthquake  

USGS Publications Warehouse

Surface fracturing occurred along the San Andreas fault, the subparallel Southwest Fracture Zone, and six secondary faults in association with the 28 September 2004 (M 6.0) Parkfield earthquake. Fractures formed discontinuous breaks along a 32-km-long stretch of the San Andreas fault. Sense of slip was right lateral; only locally was there a minor (1-11 mm) vertical component of slip. Right-lateral slip in the first few weeks after the event, early in its afterslip period, ranged from 1 to 44 mm. Our observations in the weeks following the earthquake indicated that the highest slip values are in the Middle Mountain area, northwest of the mainshock epicenter (creepmeter measurements indicate a similar distribution of slip). Surface slip along the San Andreas fault developed soon after the mainshock; field checks in the area near Parkfield and about 5 km to the southeast indicated that surface slip developed more than 1 hr but generally less than 1 day after the event. Slip along the Southwest Fracture Zone developed coseismically and extended about 8 km. Sense of slip was right lateral; locally there was a minor to moderate (1-29 mm) vertical component of slip. Right-lateral slip ranged from 1 to 41 mm. Surface slip along secondary faults was right lateral; the right-lateral component of slip ranged from 3 to 5 mm. Surface slip in the 1966 and 2004 events occurred along both the San Andreas fault and the Southwest Fracture Zone. In 1966 the length of ground breakage along the San Andreas fault extended 5 km longer than that mapped in 2004. In contrast, the length of ground breakage along the Southwest Fracture Zone was the same in both events, yet the surface fractures were more continuous in 2004. Surface slip on secondary faults in 2004 indicated previously unmapped structural connections between the San Andreas fault and the Southwest Fracture Zone, further revealing aspects of the structural setting and fault interactions in the Parkfield area.

Rymer, M.J.; Tinsley, J. C., III; Treiman, J.A.; Arrowsmith, J.R.; Ciahan, K.B.; Rosinski, A.M.; Bryant, W.A.; Snyder, H.A.; Fuis, G.S.; Toke, N.A.; Bawden, G.W.

2006-01-01

190

Impact of a Large San Andreas Fault Earthquake on Tall Buildings in Southern California  

NASA Astrophysics Data System (ADS)

In 1857, an earthquake of magnitude 7.9 occurred on the San Andreas fault, starting at Parkfield and rupturing in a southeasterly direction for more than 300 km. Such a unilateral rupture produces significant directivity toward the San Fernando and Los Angeles basins. The strong shaking in the basins due to this earthquake would have had a significant long-period content (2-8 s). If such motions were to happen today, they could have a serious impact on tall buildings in Southern California. In order to study the effects of large San Andreas fault earthquakes on tall buildings in Southern California, we use the finite source of the magnitude 7.9 2002 Denali fault earthquake in Alaska and map it onto the San Andreas fault with the rupture originating at Parkfield and proceeding southward over a distance of 290 km. Using the SPECFEM3D spectral element seismic wave propagation code, we simulate a Denali-like earthquake on the San Andreas fault and compute ground motions at sites located on a grid with a 2.5-5.0 km spacing in the greater Southern California region. We subsequently analyze 3D structural models of an existing tall steel building designed in 1984 as well as one designed according to the current building code (Uniform Building Code, 1997) subjected to the computed ground motion. We use a sophisticated nonlinear building analysis program, FRAME3D, that has the ability to simulate damage in buildings due to three-component ground motion. We summarize the performance of these structural models on contour maps of carefully selected structural performance indices. This study could benefit the city in laying out emergency response strategies in the event of an earthquake on the San Andreas fault, in undertaking appropriate retrofit measures for tall buildings, and in formulating zoning regulations for new construction. In addition, the study would provide risk data associated with existing and new construction to insurance companies, real estate developers, and individual owners, so that they can make well-informed financial decisions.

Krishnan, S.; Ji, C.; Komatitsch, D.; Tromp, J.

2004-12-01

191

Premonitory patterns of seismicity months before a large earthquake: five case histories in Southern California.  

PubMed

This article explores the problem of short-term earthquake prediction based on spatio-temporal variations of seismicity. Previous approaches to this problem have used precursory seismicity patterns that precede large earthquakes with "intermediate" lead times of years. Examples include increases of earthquake correlation range and increases of seismic activity. Here, we look for a renormalization of these patterns that would reduce the predictive lead time from years to months. We demonstrate a combination of renormalized patterns that preceded within 1-7 months five large (M > or = 6.4) strike-slip earthquakes in southeastern California since 1960. An algorithm for short-term prediction is formulated. The algorithm is self-adapting to the level of seismicity: it can be transferred without readaptation from earthquake to earthquake and from area to area. Exhaustive retrospective tests show that the algorithm is stable to variations of its adjustable elements. This finding encourages further tests in other regions. The final test, as always, should be advance prediction. The suggested algorithm has a simple qualitative interpretation in terms of deformations around a soon-to-break fault: the blocks surrounding that fault began to move as a whole. A more general interpretation comes from the phenomenon of self-similarity since our premonitory patterns retain their predictive power after renormalization to smaller spatial and temporal scales. The suggested algorithm is designed to provide a short-term approximation to an intermediate-term prediction. It remains unclear whether it could be used independently. It seems worthwhile to explore similar renormalizations for other premonitory seismicity patterns. PMID:12482945

Keilis-Borok, V I; Shebalin, P N; Zaliapin, I V

2002-12-24

192

Broadband records of earthquakes in deep gold mines and a comparison with results from SAFOD, California  

USGS Publications Warehouse

For one week during September 2007, we deployed a temporary network of field recorders and accelerometers at four sites within two deep, seismically active mines. The ground-motion data, recorded at 200 samples/sec, are well suited to determining source and ground-motion parameters for the mining-induced earthquakes within and adjacent to our network. Four earthquakes with magnitudes close to 2 were recorded with high signal/noise at all four sites. Analysis of seismic moments and peak velocities, in conjunction with the results of laboratory stick-slip friction experiments, was used to estimate source processes that are key to understanding source physics and to assessing underground seismic hazard. The maximum displacements on the rupture surfaces can be estimated from the parameter Rv, where v is the peak ground velocity at a given recording site, and R is the hypocentral distance. For each earthquake, the maximum slip and seismic moment can be combined with results from laboratory friction experiments to estimate the maximum slip rate within the rupture zone. Analysis of the four M 2 earthquakes recorded during our deployment and one of special interest recorded by the in-mine seismic network in 2004 revealed maximum slips ranging from 4 to 27 mm and maximum slip rates from 1.1 to 6.3 m/sec. Applying the same analyses to an M 2.1 earthquake within a cluster of repeating earthquakes near the San Andreas Fault Observatory at Depth site, California, yielded similar results for maximum slip and slip rate, 14 mm and 4.0 m/sec.

McGarr, A.; Boettcher, M.; Fletcher, Joe B.; Sell, R.; Johnston, M.J.S.; Durrheim, R.; Spottiswoode, S.; Milev, A.

2009-01-01

193

Cruise report for A1-00-SC southern California earthquake hazards project, part A  

USGS Publications Warehouse

A three-week cruise to obtain high-resolution boomer and multichannel seismic-reflection profiles supported two project activities of the USGS Coastal and Marine Geology (CMG) Program: (1) evaluating the earthquake and related geologic hazards posed by faults in the near offshore area of southern California and (2) determining the pathways through which sea-water is intruding into aquifers of Los Angeles County in the area of the Long Beach and Los Angeles harbors. The 2000 cruise, A1-00-SC, is the third major data-collection effort in support of the first objective (Normark et al., 1999a, b); one more cruise is planned for 2002. This report deals primarily with the shipboard operations related to the earthquake-hazard activity. The sea-water intrusion survey is confined to shallow water and the techniques used are somewhat different from that of the hazards survey (see Edwards et al., in preparation).

Gutmacher, Christina E.; Normark, William R.; Ross, Stephanie L.; Edwards, Brian D.; Sliter, Ray; Hart, Patrick; Cooper, Becky; Childs, Jon; Reid, Jane A.

2000-01-01

194

Rupture process of the 29 May 2013 Mw 4.8 Isla Vista, California earthquake and its tectonic implication  

NASA Astrophysics Data System (ADS)

The Santa Barbara Channel is one of the most seismically active regions in southern California as the result of roughly north-south compressive deformation associated with the Western Transverse Ranges. On 29 May 2013, a Mw 4.8 earthquake occurred at a depth of 10 km beneath the northern Santa Barbara Channel, about 5 km west offshore of Isla Vista, California. This Isla Vista earthquake is the largest earthquake in the Santa Barbara region since the 1978 ML 5.1 (Mw 5.9) Santa Barbara earthquake; large historical earthquakes in the region also include the catastrophic 1925 ML 6.8 Santa Barbara earthquake. However, the causative fault planes of these two earthquakes are still debated, presumably because of limited observations. Analysis of the Isla Vista earthquake should therefore shed light on the potential for large-magnitude thrust earthquakes in this region. The CISN focal mechanism and the distribution of relocated aftershocks suggest that the 2013 Isla Vista earthquake was a pure thrust failure on a north-dipping, low-angle (~30°) fault plane oriented N287°E, consistent with the local tectonic setting. The Isla Vista earthquake was well recorded by CISN and CGS strong-motion stations, eight of which are located within 20 km of the epicenter. The observed peak ground acceleration is 0.265 g. A finite fault analysis with this dataset is being conducted to constrain the slip distribution. A preliminary analysis using just local P wave observations reveals a compact slip distribution on a low-angle fault plane with a slightly larger fault dip of 37°.

Li, X.; Ji, C.

2013-12-01

195

The 1999 Mw 7.1 Hector Mine, California, earthquake: A test of the stress shadow hypothesis?  

USGS Publications Warehouse

We test the stress shadow hypothesis for large earthquake interactions by examining the relationship between two large earthquakes that occurred in the Mojave Desert of southern California, the 1992 Mw 7.3 Landers and 1999 Mw 7.1 Hector Mine earthquakes. We want to determine if the 1999 Hector Mine earthquake occurred at a location where the Coulomb stress was increased (earthquake advance, stress trigger) or decreased (earthquake delay, stress shadow) by the previous large earthquake. Using four models of the Landers rupture and a range of possible hypocentral planes for the Hector Mine earthquake, we discover that most scenarios yield a Landers-induced relaxation (stress shadow) on the Hector Mine hypocentral plane. Although this result would seem to weigh against the stress shadow hypothesis, the results become considerably more uncertain when the effects of a nearby Landers aftershock, the 1992 ML 5.4 Pisgah earthquake, are taken into account. We calculate the combined static Coulomb stress changes due to the Landers and Pisgah earthquakes to range from -0.3 to +0.3 MPa (- 3 to +3 bars) at the possible Hector Mine hypocenters, depending on choice of rupture model and hypocenter. These varied results imply that the Hector Mine earthquake does not provide a good test of the stress shadow hypothesis for large earthquake interactions. We use a simple approach, that of static dislocations in an elastic half-space, yet we still obtain a wide range of both negative and positive Coulomb stress changes. Our findings serve as a caution that more complex models purporting to explain the triggering or shadowing relationship between the 1992 Landers and 1999 Hector Mine earthquakes need to also consider the parametric and geometric uncertainties raised here.
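
The test described above turns on the sign of the static Coulomb stress change on the receiver fault. Below is a minimal sketch of that calculation, using the standard form ΔCFS = Δτ + μ′Δσn with illustrative input values (not the Landers or Hector Mine model values).

```python
# Minimal sketch of the static Coulomb failure stress change used in tests like
# the one described above:  dCFS = d_tau + mu_eff * d_sigma_n,
# with d_sigma_n positive for unclamping (reduced normal stress -> promoted failure).
# The input values below are illustrative, not taken from the Landers/Hector Mine models.

def coulomb_stress_change(d_tau_mpa: float, d_sigma_n_mpa: float,
                          mu_eff: float = 0.4) -> float:
    """Return the Coulomb stress change (MPa) on a receiver fault plane.

    d_tau_mpa     : shear stress change resolved in the slip direction (MPa)
    d_sigma_n_mpa : normal stress change, positive = unclamping (MPa)
    mu_eff        : effective friction coefficient (commonly 0.2-0.8)
    """
    return d_tau_mpa + mu_eff * d_sigma_n_mpa

# Example: a small shear stress decrease partly offset by unclamping.
d_cfs = coulomb_stress_change(d_tau_mpa=-0.25, d_sigma_n_mpa=0.10)
print(f"dCFS = {d_cfs:+.2f} MPa "
      f"({'stress shadow (delay)' if d_cfs < 0 else 'stress trigger (advance)'})")
```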

Harris, R.A.; Simpson, R.W.

2002-01-01

196

LLNL-Generated Content for the California Academy of Sciences, Morrison Planetarium Full-Dome Show: Earthquake  

Microsoft Academic Search

The California Academy of Sciences (CAS) Morrison Planetarium is producing a 'full-dome' planetarium show on earthquakes and asked LLNL to produce content for the show. Specifically the show features numerical ground motion simulations of the M 7.9 1906 San Francisco and a possible future M 7.05 Hayward fault scenario earthquake. The show also features concepts of plate tectonics and mantle

A J Rodgers; N A Petersson; C E Morency; N A Simmons; B Sjogreen

2012-01-01

197

GPS Time Series Analysis of Southern California Associated with the 2010 M7.2 El Mayor/Cucapah Earthquake  

NASA Technical Reports Server (NTRS)

The magnitude 7.2 El-Mayor/Cucapah earthquake that occurred in Mexico on April 4, 2010, was well instrumented with continuous GPS stations in California. Large offsets were observed at the GPS stations as a result of deformation from the earthquake, providing information about the coseismic fault slip as well as fault slip from large aftershocks. Information can also be obtained from the position time series at each station.
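
A minimal sketch of how a coseismic offset can be extracted from a single-station position time series by least-squares fitting a linear trend plus a step at the earthquake time. The data below are synthetic and the numbers are illustrative; this is not the processing used for the study above.

```python
import numpy as np

# Sketch of extracting a coseismic offset from a GPS position time series by
# fitting position = a + b*t + c*H(t - t_eq) with least squares.
# The data below are synthetic and the numbers are illustrative only.

rng = np.random.default_rng(0)
t = np.arange(0.0, 2.0, 1.0 / 365.0)          # 2 years of daily solutions (years)
t_eq = 1.0                                    # earthquake time (years into series)
true_offset_mm = 35.0                         # synthetic coseismic step

pos_mm = 2.0 * t + true_offset_mm * (t >= t_eq) + rng.normal(0.0, 2.0, t.size)

# Design matrix: intercept, secular rate, Heaviside step at t_eq.
G = np.column_stack([np.ones_like(t), t, (t >= t_eq).astype(float)])
m, *_ = np.linalg.lstsq(G, pos_mm, rcond=None)

print(f"estimated secular rate = {m[1]:.2f} mm/yr, "
      f"estimated coseismic offset = {m[2]:.1f} mm")
```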

Granat, Robert; Donnellan, Andrea

2011-01-01

198

Stress/strain changes and triggered seismicity following the MW7.3 Landers, California, earthquake  

USGS Publications Warehouse

Calculations of dynamic stresses and strains, constrained by broadband seismograms, are used to investigate their role in generating the remotely triggered seismicity that followed the June 28, 1992, MW 7.3 Landers, California, earthquake. I compare straingrams and dynamic Coulomb failure functions calculated for the Landers earthquake at sites that did experience triggered seismicity with those at sites that did not. Bounds on triggering thresholds are obtained from analysis of dynamic strain spectra calculated for the Landers and MW 6.1 Joshua Tree, California, earthquakes at various sites, combined with results of static strain investigations by others. I interpret three principal results of this study with those of a companion study by Gomberg and Davis [this issue]. First, the dynamic elastic stress changes themselves cannot explain the spatial distribution of triggered seismicity, particularly the lack of triggered activity along the San Andreas fault system. In addition to the requirement to exceed a Coulomb failure stress level, this result implies the need to invoke and satisfy the requirements of appropriate slip instability theory. Second, results of this study are consistent with the existence of frequency- or rate-dependent stress/strain triggering thresholds, inferred from the companion study and interpreted in terms of earthquake initiation involving a competition of processes, one promoting failure and the other inhibiting it. Such competition is also part of relevant instability theories. Third, the triggering threshold must vary from site to site, suggesting that the potential for triggering strongly depends on site characteristics and response. The lack of triggering along the San Andreas fault system may be correlated with the advanced maturity of its fault gouge zone; the strains from the Landers earthquake were either insufficient to exceed its larger critical slip distance or some other critical failure parameter; or the faults failed stably as aseismic creep events. Variations in the triggering threshold at sites of triggered seismicity may be attributed to variations in gouge zone development and properties. Finally, these interpretations provide ready explanations for the time delays between the Landers earthquake and the triggered events.

Gomberg, J.

1996-01-01

199

Calculation of earthquake rupture histories using a hybrid global search algorithm: Application to the 1992 Landers, California, earthquake  

USGS Publications Warehouse

A method is presented for the simultaneous calculation of slip amplitudes and rupture times for a finite fault using a hybrid global search algorithm. The method we use combines simulated annealing with the downhill simplex method to produce a more efficient search algorithm than either of the two constituent parts. This formulation has advantages over traditional iterative or linearized approaches to the problem because it is able to escape local minima in its search through model space for the global optimum. We apply this global search method to the calculation of the rupture history for the Landers, California, earthquake. The rupture is modeled using three separate finite-fault planes to represent the three main fault segments that failed during this earthquake. Both the slip amplitude and the time of slip are calculated for a grid work of subfaults. The data used consist of digital, teleseismic P and SH body waves. Long-period, broadband, and short-period records are utilized to obtain a wideband characterization of the source. The results of the global search inversion are compared with a more traditional linear-least-squares inversion for only slip amplitudes. We use a multi-time-window linear analysis to relax the constraints on rupture time and rise time in the least-squares inversion. Both inversions produce similar slip distributions, although the linear-least-squares solution has a 10% larger moment (7.3 × 10^26 dyne-cm compared with 6.6 × 10^26 dyne-cm). Both inversions fit the data equally well and point out the importance of (1) using a parameterization with sufficient spatial and temporal flexibility to encompass likely complexities in the rupture process, (2) including suitable physically based constraints on the inversion to reduce instabilities in the solution, and (3) focusing on those robust rupture characteristics that rise above the details of the parameterization and data set.
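
A generic sketch of the hybrid strategy named above: an outer simulated-annealing walk with Metropolis acceptance, periodically refined by a downhill-simplex (Nelder-Mead) local search. The quadratic misfit is a stand-in for a waveform misfit; this is not the authors' parameterization or implementation.

```python
import numpy as np
from scipy.optimize import minimize

# Generic sketch of a hybrid global search: simulated annealing for the outer
# walk through model space, with periodic downhill-simplex (Nelder-Mead)
# refinement of the current best model. The quadratic "misfit" below is a
# stand-in for a waveform misfit; this is not the authors' implementation.

rng = np.random.default_rng(1)
target = np.array([1.0, -2.0, 0.5])           # "true" model for the toy misfit

def misfit(m):
    return float(np.sum((m - target) ** 2))

m_current = rng.uniform(-5, 5, size=3)
f_current = misfit(m_current)
m_best, f_best = m_current.copy(), f_current
temperature = 1.0

for step in range(2000):
    candidate = m_current + rng.normal(0.0, 0.5, size=3)   # random perturbation
    f_candidate = misfit(candidate)
    # Metropolis acceptance: always accept improvements, sometimes accept worse models.
    if f_candidate < f_current or rng.random() < np.exp((f_current - f_candidate) / temperature):
        m_current, f_current = candidate, f_candidate
        if f_current < f_best:
            m_best, f_best = m_current.copy(), f_current
    temperature *= 0.995                                    # cooling schedule

    # Periodic local refinement with the downhill simplex method.
    if step % 500 == 499:
        local = minimize(misfit, m_best, method="Nelder-Mead")
        if local.fun < f_best:
            m_best, f_best = local.x, local.fun

print(f"best model {np.round(m_best, 3)}, misfit {f_best:.2e}")
```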

Hartzell, S.; Liu, P.

1996-01-01

200

Products and Services Available from the Southern California Earthquake Data Center (SCEDC) and the Southern California Seismic Network (SCSN)  

NASA Astrophysics Data System (ADS)

Currently the SCEDC archives continuous and triggered data from nearly 5000 data channels from 425 SCSN recorded stations, processing and archiving an average of 12,000 earthquakes each year. The SCEDC provides public access to these earthquake parametric and waveform data through its website www.data.scec.org and through client applications such as STP and DHI. This poster will describe the most significant developments at the SCEDC in the past year.
Updated hardware:
- The SCEDC has more than doubled its waveform file storage capacity by migrating to 2 TB disks.
New data holdings:
- Waveform data: Beginning Jan 1, 2010 the SCEDC began continuously archiving all high-sample-rate strong-motion channels. All seismic channels recorded by SCSN are now continuously archived and available at SCEDC.
- Portable data from the El Mayor Cucapah 7.2 sequence: Seismic waveforms from portable stations installed by researchers (contributed by Elizabeth Cochran, Jamie Steidl, and Octavio Lazaro-Mancilla) have been added to the archive and are accessible through STP either as continuous data or associated with events in the SCEDC earthquake catalog. These additional data will help SCSN analysts and researchers improve event locations from the sequence.
- Real time GPS solutions from the El Mayor Cucapah 7.2 event: Three-component 1 Hz seismograms of California Real Time Network (CRTN) GPS stations, from the April 4, 2010, magnitude 7.2 El Mayor-Cucapah earthquake, are available in SAC format at the SCEDC. These time series were created by Brendan Crowell, Yehuda Bock, the project PI, and Mindy Squibb at SOPAC using data from the CRTN. The El Mayor-Cucapah earthquake demonstrated definitively the power of real-time high-rate GPS data: they measure dynamic displacements directly, they do not clip, and they are also able to detect the permanent (coseismic) surface deformation.
- Triggered data from the Quake Catcher Network (QCN) and Community Seismic Network (CSN): The SCEDC in cooperation with QCN and CSN is exploring ways to archive and distribute data from high-density, low-cost networks. As a starting point the SCEDC will store a dataset from QCN and CSN and distribute it through a separate STP client.
New archival methods:
- The SCEDC is exploring the feasibility of archiving and distributing waveform data using cloud computing such as Google Apps. A month of continuous data from the SCEDC archive will be stored in Google Apps and a client developed to access it in a manner similar to STP.
XML formats:
- The SCEDC is now distributing earthquake parameter data through web services in QuakeML format.
- The SCEDC, in collaboration with the Northern California Earthquake Data Center (NCEDC) and USGS Golden, has reviewed and revised the StationXML format to produce version 2.0. The new version includes rules for extending the schema, use of named complex types, and greater consistency in naming conventions. Based on this work we plan to develop readers and writers of the StationXML format.

Yu, E.; Bhaskaran, A.; Chen, S.; Chowdhury, F. R.; Meisenhelter, S.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.

2010-12-01

201

Coulomb static stress interactions between simulated M>7 earthquakes and major faults in Southern California  

NASA Astrophysics Data System (ADS)

We calculate the Coulomb stress changes imparted to major Southern California faults by thirteen simulated worst-case-scenario earthquakes for the region, including the “Big Ten” scenarios (Ely et al, in progress). The source models for the earthquakes are variable-slip simulations from the SCEC CyberShake project (Graves et al, 2010). We find strong stress interactions between the San Andreas and subparallel right-lateral faults, thrust faults under the Los Angeles basin, and the left-lateral Garlock Fault. M>7 earthquakes rupturing sections of the southern San Andreas generally decrease Coulomb stress on the San Jacinto and Elsinore faults and impart localized stress increases and decreases to the Garlock, San Cayetano, Puente Hills and Sierra Madre faults. A M=7.55 quake rupturing the San Andreas between Lake Hughes and San Gorgonio Pass increases Coulomb stress on the eastern San Cayetano fault, consistent with Deng and Sykes (1996). M>7 earthquakes rupturing the San Jacinto, Elsinore, Newport-Inglewood and Palos Verdes faults decrease stress on parallel right-lateral faults. A M=7.35 quake on the San Cayetano Fault decreases stress on the Garlock and imparts localized stress increases and decreases to the San Andreas. A M=7.15 quake on the Puente Hills Fault increases stress on the San Andreas and San Jacinto faults, decreases stress on the Sierra Madre Fault and imparts localized stress increases and decreases to the Newport-Inglewood and Palos Verdes faults. A M=7.25 shock on the Sierra Madre Fault increases stress on the San Andreas and decreases stress on the Puente Hills Fault. These findings may be useful for hazard assessment, paleoseismology, and comparison with dynamic stress interactions featuring the same set of earthquakes.

Rollins, J. C.; Ely, G. P.; Jordan, T. H.

2010-12-01

202

Repeating Earthquake and Nonvolcanic Tremor Observations of Aseismic Deep Fault Transients in Central California.  

NASA Astrophysics Data System (ADS)

Seismic indicators of fault zone deformation can complement geodetic measurements by providing information on aseismic transient deformation: 1) from deep within the fault zone, 2) on a regional scale, 3) with intermediate temporal resolution (weeks to months) and 4) that spans over 2 decades (1984 to early 2005), including pre-GPS and InSAR coverage. Along the San Andreas Fault (SAF) in central California, two types of seismic indicators are proving to be particularly useful for providing information on deep fault zone deformation. The first, characteristically repeating microearthquakes, provide long-term coverage (decades) on the evolution of aseismic fault slip rates at seismogenic depths along a large (~175 km) stretch of the SAF between the rupture zones of the ~M8 1906 San Francisco and 1857 Fort Tejon earthquakes. In Cascadia and Japan the second type of seismic indicator, nonvolcanic tremors, have shown a remarkable correlation between their activity rates and GPS and tiltmeter measurements of transient deformation in the deep (sub-seismogenic) fault zone. This correlation suggests that tremor rate changes and deep transient deformation are intimately related and that deformation associated with the tremor activity may be stressing the seismogenic zone in both areas. Along the SAF, nonvolcanic tremors have only recently been discovered (i.e., in the Parkfield-Cholame area), and knowledge of their full spatial extent is still relatively limited. Nonetheless the observed temporal correlation between earthquake and tremor activity in this area is consistent with a model in which sub-seismogenic deformation and seismogenic zone stress changes are closely related. We present observations of deep aseismic transient deformation associated with the 28 September 2004, M6 Parkfield earthquake from both repeating earthquake and nonvolcanic tremor data. Also presented are updated deep fault slip rate estimates from repeating quakes in the San Juan Bautista area with an assessment of their significance to previously reported quasi-periodic slip rate pulses and small to moderate magnitude (> M3.5) earthquake occurrence in the area.

Nadeau, R. M.; Traer, M.; Guilhem, A.

2005-12-01

203

Earthquakes!  

NSDL National Science Digital Library

A strong earthquake struck Istanbul, Turkey on Monday, only weeks after a major quake in the same area claimed more than 15,500 lives. This site, from The Why Files (see the August 9, 1996 Scout Report), offers background information on the science of earthquakes, with particular emphasis on the recent tectonic activity in Turkey.

204

Marine geology and earthquake hazards of the San Pedro Shelf region, southern California  

USGS Publications Warehouse

High-resolution seismic-reflection data have been combined with a variety of other geophysical and geological data to interpret the offshore structure and earthquake hazards of the San Pedro Shelf, near Los Angeles, California. Prominent structures investigated include the Wilmington Graben, the Palos Verdes Fault Zone, various faults below the western part of the shelf and slope, and the deep-water San Pedro Basin. The structure of the Palos Verdes Fault Zone changes markedly southeastward across the San Pedro Shelf and slope. Under the northern part of the shelf, this fault zone includes several strands, but the main strand dips west and is probably an oblique-slip fault. Under the slope, this fault zone consists of several fault strands having normal separation, most of which dip moderately east. To the southeast near Lasuen Knoll, the Palos Verdes Fault Zone locally is a low-angle fault that dips east, but elsewhere near this knoll the fault appears to dip steeply. Fresh sea-floor scarps near Lasuen Knoll indicate recent fault movement. The observed regional structural variation along the Palos Verdes Fault Zone is explained as the result of changes in strike and fault geometry along a master strike-slip fault at depth. The shallow summit and possible wavecut terraces on Lasuen Knoll indicate subaerial exposure during the last sea-level lowstand. Modeling of aeromagnetic data indicates the presence of a large magnetic body under the western part of the San Pedro Shelf and upper slope. This is interpreted to be a thick body of basalt of Miocene(?) age. Reflective sedimentary rocks overlying the basalt are tightly folded, whereas folds in sedimentary rocks east of the basalt have longer wavelengths. This difference might mean that the basalt was more competent during folding than the encasing sedimentary rocks. West of the Palos Verdes Fault Zone, other northwest-striking faults deform the outer shelf and slope. Evidence for recent movement along these faults is equivocal, because age dates on deformed or offset sediment are lacking.

Fisher, Michael A.; Normark, William R.; Langenheim, Victoria E.; Calvert, Andrew J.; Sliter, Ray

2004-01-01

205

Web Services and Data Enhancements at the Northern California Earthquake Data Center  

NASA Astrophysics Data System (ADS)

The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, simple text, or MiniSEED depending on the service and selected output format. The NCEDC offers the following web services that are compliant with the International Federation of Digital Seismograph Networks (FDSN) web services specifications: (1) fdsn-dataselect: time series data delivered in MiniSEED format, (2) fdsn-station: station and channel metadata and time series availability delivered in StationXML format, (3) fdsn-event: earthquake event information delivered in QuakeML format. In addition, the NCEDC offers the following IRIS-compatible web services: (1) sacpz: provide channel gains, poles, and zeros in SAC format, (2) resp: provide channel response information in RESP format, (3) dataless: provide station and channel metadata in Dataless SEED format. The NCEDC is also developing a web service to deliver time series from pre-assembled event waveform gathers. The NCEDC has waveform gathers for ~750,000 northern and central California events from 1984 to the present, many of which were created by the USGS NCSN prior to the establishment of the joint NCSS (Northern California Seismic System). We are currently adding waveforms to these older event gathers with time series from the UCB networks and other networks with waveforms archived at the NCEDC, and ensuring that each channel in the event gathers has the highest-quality waveform available from the archive.
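
A minimal sketch of how the FDSN-compliant fdsn-event and fdsn-dataselect services described above could be used through a standard FDSN client. It assumes ObsPy's FDSN client recognizes "NCEDC" as a known data-center key; the network/station codes and time window are illustrative only.

```python
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

# Sketch of querying the FDSN-compliant services described above through a
# standard FDSN client. This assumes ObsPy's FDSN client recognizes "NCEDC"
# as a known data-center key; the network/station codes are illustrative.
client = Client("NCEDC")

# Approximate origin time of the 2014 South Napa earthquake, used here only
# as an example time window.
t0 = UTCDateTime("2014-08-24T10:20:44")

# fdsn-event: earthquake parameters returned as a QuakeML-backed catalog.
catalog = client.get_events(starttime=t0 - 60, endtime=t0 + 3600, minmagnitude=3.0)
print(catalog)

# fdsn-dataselect: MiniSEED time series for one broadband channel.
stream = client.get_waveforms(network="BK", station="CMB", location="*",
                              channel="BHZ", starttime=t0, endtime=t0 + 300)
print(stream)
```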

Neuhauser, D. S.; Zuzlewski, S.; Lombard, P. N.; Allen, R. M.

2013-12-01

206

Caltech/USGS Southern California Seismic Network (SCSN): Infrastructure upgrade to support Earthquake Early Warning (EEW)  

NASA Astrophysics Data System (ADS)

The SCSN is the modern digital ground motion seismic network in Southern California and performs the following tasks: 1) operates remote seismic stations and the central data processing systems in Pasadena; 2) generates and reports real-time products including location, magnitude, ShakeMap, aftershock probabilities, and others; 3) responds to FEMA, CalOES, media, and public inquiries about earthquakes; 4) manages the production, archival, and distribution of waveforms, phase picks, and other data at the SCEDC; 5) contributes to development and implementation of the demonstration EEW system called CISN ShakeAlert. Initially, the ShakeAlert project was funded through the US Geological Survey (USGS), and in early 2012 the Gordon and Betty Moore Foundation provided three years of new funding for EEW research and development for the US west coast. Recently, we have also received some Urban Areas Security Initiative (UASI) funding to enhance the EEW capabilities for the local UASI region by making our system overall faster, more reliable, and more redundant than the existing system. The additional and upgraded stations will be capable of decreasing latency and ensuring data delivery by using more reliable and redundant telemetry pathways. Overall, this will enhance the reliability of the earthquake early warnings by providing denser station coverage and more resilient data centers than before.
- Seismic datalogger upgrade: replaces existing dataloggers with modern equipment capable of sending one-second uncompressed packets and utilizing redundant Ethernet telemetry.
- GPS upgrade: replaces the existing GPS receivers and antennas, especially at "zipper array" sites near the major faults, with receivers that perform on-board precise point positioning to calculate position and velocity in real time and stream continuous data for use in EEW calculations.
- New co-located seismic/GPS stations: increases station density and reduces early warning delays that are incurred by travel time of the seismic waves to the nearest station, and will increase the reliability of the early warning with multiple measurements from more than one reporting station.
- New server hardware: will allow for separate software development, testing/integration of algorithms, and production systems capable of testing with current data as well as playback of historical data. The new systems will also be used to develop and test new EEW algorithms such as slip detection (GPSlip) and Finite-Fault Rupture Detection (FinDer).
- Standardization and security: the new systems will allow us to standardize hardware installation and configuration procedures. They will also enable us to implement the latest computer and network security measures to secure the data and internal processing from malicious threats.
- System architecture: the new hardware will allow us to port existing EEW algorithms from Solaris to Linux. The new equipment will also allow us to experiment with different system architecture configurations, such as redundant servers with fail-over capabilities for the production EEW system.
When installed, the new and upgraded seismic dataloggers and GPS stations, as well as the new server hardware, will greatly improve the EEW capabilities of the SCSN network and the CISN ShakeAlert system in general, providing more resilience, robustness, and redundancy in the system.

Bhadha, R. J.; Hauksson, E.; Boese, M.; Felizardo, C.; Thomas, V. I.; Yu, E.; Given, D. D.; Heaton, T. H.; Hudnut, K. W.

2013-12-01

207

Reply to “Comment on “Should Memphis build for California's earthquakes?” From A.D. Frankel”  

NASA Astrophysics Data System (ADS)

Carl Sagan observed that “extraordinary claims require extraordinary evidence.” In our view, A.D. Frankel's arguments (see accompanying Comment piece) do not reach the level required to demonstrate the counter-intuitive propositions that the earthquake hazard in the New Madrid Seismic Zone (NMSZ) is comparable to that in coastal California, and that buildings should be built to similar standards. This interchange is the latest in an ongoing debate beginning with Newman et al.'s [1999a] recommendation, based on analysis of Global Positioning System and earthquake data, that Frankel et al.'s [1996] estimate of California-level seismic hazard for the NMSZ should be reduced. Most points at issue, except for those related to the costs and benefits of the proposed new International Building Code 2000, have already been argued at length by both sides in the literature [e.g., Schweig et al., 1999; Newman et al., 1999b, 2001; Cramer, 2001]. Hence, rather than rehash these points, we will try here to provide readers not enmeshed in this morass with an overview of the primary differences between our view and that of Frankel.

Stein, Seth; Tomasello, Joseph; Newman, Andrew

208

Earthquake source mechanisms and transform fault tectonics in the Gulf of California  

NASA Technical Reports Server (NTRS)

The source parameters of 19 large earthquakes in the Gulf of California were determined from inversions of long-period P and SH waveforms. The goal was to understand the recent slip history of this dominantly transform boundary between the Pacific and North American plates as well as the effect on earthquake characteristics of the transition from young oceanic to continental lithosphere. For the better recorded transform events, the fault strike is resolved to + or - 4 deg at 90 percent confidence. The slip vectors thus provide important constraints on the direction of relative plate motion. Most centroid depths are poorly resolved because of tradeoffs between depth and source time function. On the basis of waveform modeling, historical seismicity, and other factors, it is appropriate to divide the Gulf into three distinct zones. The difference in seismic character among the three zones is likely the result of differing levels of maturity of the processes of rifting, generation of oceanic crust, and formation of stable oceanic transform faults. The mechanism of an earthquake on the Tres Marias Escarpment is characterized by thrust faulting and likely indicates the direction of relative motion between the Rivera and North American plates. This mechanism requires revision in plate velocity models which predict strike slip motion at this location.

Goff, John A.; Bergman, Eric A.; Solomon, Sean C.

1987-01-01

209

Earthquake source mechanisms and transform fault tectonics in the Gulf of California  

NASA Astrophysics Data System (ADS)

The source parameters of 19 large earthquakes in the Gulf of California were determined from inversions of long-period P and SH waveforms. The goal was to understand the recent slip history of this dominantly transform boundary between the Pacific and North American plates as well as the effect on earthquake characteristics of the transition from young oceanic to continental lithosphere. For the better recorded transform events, the fault strike is resolved to + or - 4 deg at 90 percent confidence. The slip vectors thus provide important constraints on the direction of relative plate motion. Most centroid depths are poorly resolved because of tradeoffs between depth and source time function. On the basis of waveform modeling, historical seismicity, and other factors, it is appropriate to divide the Gulf into three distinct zones. The difference in seismic character among the three zones is likely the result of differing levels of maturity of the processes of rifting, generation of oceanic crust, and formation of stable oceanic transform faults. The mechanism of an earthquake on the Tres Marias Escarpment is characterized by thrust faulting and likely indicates the direction of relative motion between the Rivera and North American plates. This mechanism requires revision in plate velocity models which predict strike slip motion at this location.

Goff, John A.; Bergman, Eric A.; Solomon, Sean C.

1987-11-01

210

Holocene paleoseismicity, temporal clustering, and probabilities of future large (M > 7) earthquakes on the Wasatch fault zone, Utah  

USGS Publications Warehouse

The chronology of M>7 paleoearthquakes on the central five segments of the Wasatch fault zone (WFZ) is one of the best dated in the world and contains 16 earthquakes in the past 5600 years with an average repeat time of 350 years. Repeat times for individual segments vary by a factor of 2, and range from about 1200 to 2600 years. Four of the central five segments ruptured between about 620 ± 30 and 1230 ± 60 calendar years B.P. The remaining segment (Brigham City segment) has not ruptured in the past 2120 ± 100 years. Comparison of the WFZ space-time diagram of paleoearthquakes with synthetic paleoseismic histories indicates that the observed temporal clusters and gaps have about an equal probability (depending on model assumptions) of reflecting random coincidence as opposed to intersegment contagion. Regional seismicity suggests that for exposure times of 50 and 100 years, the probability for an earthquake of M>7 anywhere within the Wasatch Front region, based on a Poisson model, is 0.16 and 0.30, respectively. A fault-specific WFZ model predicts 50 and 100 year probabilities for a M>7 earthquake on the WFZ itself, based on a Poisson model, as 0.13 and 0.25, respectively. In contrast, segment-specific earthquake probabilities that assume quasi-periodic recurrence behavior on the Weber, Provo, and Nephi segments are less (0.01-0.07 in 100 years) than the regional or fault-specific estimates (0.25-0.30 in 100 years), due to the short elapsed times compared to average recurrence intervals on those segments. The Brigham City and Salt Lake City segments, however, have time-dependent probabilities that approach or exceed the regional and fault specific probabilities. For the Salt Lake City segment, these elevated probabilities are due to the elapsed time being approximately equal to the average late Holocene recurrence time. For the Brigham City segment, the elapsed time is significantly longer than the segment-specific late Holocene recurrence time.
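The Poisson exposure-time probabilities quoted above follow directly from an annual rate; a minimal sketch of that calculation (the rates here are back-calculated from the stated 50-yr values for illustration, not taken from the paper):

```python
import math

def poisson_prob(annual_rate, exposure_years):
    """Probability of one or more events in an exposure window under a Poisson model."""
    return 1.0 - math.exp(-annual_rate * exposure_years)

# Annual rates implied by the 50-yr probabilities quoted in the abstract (illustrative back-calculation).
regional_rate = -math.log(1.0 - 0.16) / 50.0   # Wasatch Front region, M>7
fault_rate = -math.log(1.0 - 0.13) / 50.0      # Wasatch fault zone only

for label, rate in [("regional", regional_rate), ("fault-specific", fault_rate)]:
    print(label, round(poisson_prob(rate, 50), 2), round(poisson_prob(rate, 100), 2))
# Prints ~0.16/0.29 and ~0.13/0.24, consistent with the quoted 0.30 and 0.25 after rounding.
```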

McCalpin, J.P.; Nishenko, S.P.

1996-01-01

211

Earthquakes  

MedlinePLUS

... earthquake is the sudden, rapid shaking of the earth, caused by the breaking and shifting of subterranean ... the forces of plate tectonics have shaped the earth, as the huge plates that form the earth’s ...

212

Earthquakes  

NSDL National Science Digital Library

To understand P and S waves, to observe some videos of earthquakes, and to find out where and when the last earthquake in Utah was. Print out this worksheet for the questions to accompany the following websites. Worksheet: Go to The Earth's Layers and read the information. Answer the following 4 questions on a separate sheet of paper. Name the four layers of the Earth in order from the outside to the center of the Earth. What causes ...

Mrs. Clemons

2010-11-02

213

Source processes of industrially-induced earthquakes at the Geysers geothermal area, California  

USGS Publications Warehouse

Microearthquake activity at The Geysers geothermal area, California, mirrors the steam production rate, suggesting that the earthquakes are industrially induced. A 15-station network of digital, three-component seismic stations was operated for one month in 1991, and 3,900 earthquakes were recorded. Highly accurate moment tensors were derived for 30 of the best recorded earthquakes by tracing rays through tomographically derived 3-D VP and VP/VS structures, and inverting P- and S-wave polarities and amplitude ratios. The orientations of the P- and T-axes are very scattered, suggesting that there is no strong, systematic deviatoric stress field in the reservoir, which could explain why the earthquakes are not large. Most of the events had significant non-double-couple (non-DC) components in their source mechanisms, with volumetric components up to ~30% of the total moment. Explosive and implosive sources were observed in approximately equal numbers, and must be caused by cavity creation (or expansion) and collapse. It is likely that there is a causal relationship between these processes and fluid reinjection and steam withdrawal. Compensated linear vector dipole (CLVD) components were up to 100% of the deviatoric component. Combinations of opening cracks and shear faults cannot explain all the observations, and rapid fluid flow may also be involved. The pattern of non-DC failure at The Geysers contrasts with that of the Hengill-Grensdalur area in Iceland, a largely unexploited water-dominated field in an extensional stress regime. These differences are poorly understood but may be linked to the contrasting regional stress regimes and the industrial exploitation at The Geysers.
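The isotropic and CLVD percentages discussed above come from a standard decomposition of the moment tensor; a minimal sketch of that decomposition (not the authors' code), using the common convention in which the CLVD measure is taken from the deviatoric eigenvalues:

```python
import numpy as np

def decompose_moment_tensor(M):
    """Split a 3x3 symmetric moment tensor into isotropic, DC, and CLVD measures.

    Uses the common convention eps = -lam_smallest_abs / |lam_largest_abs| on the
    deviatoric eigenvalues; 2*|eps| is 0 for pure DC and 1 for pure CLVD.
    """
    M = np.asarray(M, dtype=float)
    iso = np.trace(M) / 3.0
    M_dev = M - iso * np.eye(3)
    eig = np.linalg.eigvalsh(M_dev)          # ascending eigenvalues of the deviatoric part
    lam = eig[np.argsort(np.abs(eig))]       # re-sort by absolute size
    eps = -lam[0] / abs(lam[2]) if lam[2] != 0 else 0.0
    m0_dev = abs(lam[2])                     # scalar size of the deviatoric part
    return {
        "iso_fraction": abs(iso) / (abs(iso) + m0_dev),
        "clvd_fraction_of_dev": 2.0 * abs(eps),
        "dc_fraction_of_dev": 1.0 - 2.0 * abs(eps),
    }

# Example: a shear-dominated source with a small volumetric (opening) component.
M = np.array([[1.2, 0.9, 0.0],
              [0.9, 0.3, 0.0],
              [0.0, 0.0, 0.3]])
print(decompose_moment_tensor(M))
```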

Ross, A.; Foulger, G.R.; Julian, B.R.

1999-01-01

214

Satellite IR Thermal Measurements Prior to the September 2004 Earthquakes in Central California  

NASA Technical Reports Server (NTRS)

We present and discuss observed variations in thermal transients and radiation fields prior to the earthquakes of September 18 near Bodie (M5.5) and September 28, 2004 near Parkfield (M6.0) in California. Previous analyses of earthquake events have indicated the presence of a thermal anomaly, where temperatures increased or did not return to their usual nighttime values. The procedure used in our work is to analyze weather satellite data taken at night and to record the general condition in which the ground cools after sunset. Two days before the Bodie earthquake, lower-temperature radiation was observed by the NOAA/AVHRR satellite. This occurred when the entire region was relatively cloud-free. IR land-surface nighttime temperature from the MODIS instrument rose to +4 C in a 100 km radius around the Bodie epicenter. The thermal transient field recorded by MODIS in the vicinity of Parkfield, also under cloud-free conditions, was around +1 C, significantly smaller than that around the Bodie epicenter; for that period, however, it showed a steady increase 4 days prior to the earthquake and a significant drop the night before the quake. Geosynchronous weather satellite thermal IR measurements, taken every half hour from sunset to dawn, were also recorded for 10 days prior to the Parkfield event and 5 days after, as well as on the day of the quake. To establish a baseline we also obtained GOES data for the same Julian days; these data sets were then used to systematically observe and record any thermal anomaly prior to the events that deviated from the baseline. Our recent results support the hypothesis of a possible relationship between thermodynamic processes produced by increasing tectonic stress in the Earth's crust and a subsequent electro-chemical interaction between this crust and the atmosphere/ionosphere.

Ouzounov, D.; Logan, T.; Taylor, Patrick

2004-01-01

215

Earthquake geology of the northern San Andreas Fault near Point Arena, California  

SciTech Connect

Excavations into a Holocene alluvial fan provided exposures of a record of prehistoric earthquakes near Point Arena, California. At least five earthquakes were recognized in the section. All of these occurred since the deposition of a unit that is approximately 2000 years old. Radiocarbon dating allows constraints to be placed on the dates of these earthquakes. A buried Holocene (2356-2709 years old) channel has been offset a maximum of 64 ± 2 meters. This implies a maximum slip rate of 25.5 ± 2.5 mm/yr. These data suggest that the average recurrence interval for great earthquakes on this segment of the San Andreas fault is long - between about 200 and 400 years. Offset marine terrace risers near Point Arena and an offset landslide near Fort Ross provide estimates of the average slip rate since Late Pleistocene time. Near Fort Ross, an offset landslide implies a slip rate of less than 39 mm/yr. Correlation and age estimates of two marine terrace risers across the San Andreas fault near Point Arena suggest slip rates of about 18-19 mm/yr since Late Pleistocene time. Tentative correlation of the Pliocene Ohlson Ranch Formation in northwestern Sonoma County with deposits 50 km to the northwest near Point Arena provides piercing points to use in calculation of a Pliocene slip rate for the northern San Andreas fault. A fission-track age of 3.3 ± 0.8 Ma was determined for zircons separated from a tuff collected from the Ohlson Ranch Formation. The geomorphology of the region, especially of the two major river drainages, supports the proposed 50 km Pliocene offset. This implies a Pliocene slip rate of at least 12-20 mm/yr. These rates for different time periods imply that much of the Pacific-North American plate motion must be accommodated on other structures at this latitude.
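The Holocene slip-rate figure above is a simple offset-over-age quotient; a quick check of the numbers quoted in the abstract:

```python
# Offset channel: 64 ± 2 m of slip accumulated over an age bracket of 2356-2709 yr.
offset_m = (62.0, 66.0)        # minimum and maximum offset, m
age_yr = (2356.0, 2709.0)      # minimum and maximum channel age, yr

fastest = offset_m[1] / age_yr[0] * 1000.0   # mm/yr: largest offset over youngest age
slowest = offset_m[0] / age_yr[1] * 1000.0   # mm/yr: smallest offset over oldest age
print(f"{slowest:.1f}-{fastest:.1f} mm/yr")  # ~22.9-28.0 mm/yr, bracketing the quoted 25.5 ± 2.5 mm/yr
```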

Prentice, C.S.

1989-01-01

216

Products and Services Available from the Southern California Earthquake Data Center (SCEDC) and the Southern California Seismic Network (SCSN)  

NASA Astrophysics Data System (ADS)

Currently, the SCEDC archives continuous and triggered data from nearly 8400 data channels from 425 SCSN recorded stations, processing and archiving an average of 6.4 TB of continuous waveforms and 12,000 earthquakes each year. The SCEDC provides public access to these earthquake parametric and waveform data through its website www.data.scec.org and through client applications such as STP and DHI. This poster will describe the most significant developments at the SCEDC during 2011. New website design: the SCEDC has revamped its website. The changes make it easier for users to search the archive and to discover updates and new content; they also improve our ability to manage and update the site. New data holdings: post-processing on the El Mayor Cucapah 7.2 sequence continues; to date 11,847 events have been reviewed, and updates are available in the earthquake catalog immediately. A double-difference catalog (Hauksson et al., 2011) spanning 1981 to 6/30/11 will be available for download at www.data.scec.org and available via STP. A focal mechanism catalog determined by Yang et al. (2011) is available for distribution at www.data.scec.org. Waveforms from Southern California NetQuake stations are now being stored in the SCEDC archive and are available via STP as event-associated waveforms; amplitudes from these stations are also being stored in the archive and used by ShakeMap. As part of a NASA/AIST project in collaboration with JPL and SIO, the SCEDC will receive real-time 1 sps streams of GPS displacement solutions from the California Real Time Network (http://sopac.ucsd.edu/projects/realtime; Genrich and Bock, 2006, J. Geophys. Res.). These channels will be archived at the SCEDC as miniSEED waveforms, which can then be distributed to the user community via applications such as STP. Improvements in the user tool STP: STP sac output now includes picks from the SCSN. New archival methods: the SCEDC is exploring the feasibility of archiving and distributing waveform data using cloud computing such as Google Apps. A month of continuous data from the SCEDC archive will be stored in Google Apps and a client developed to access it in a manner similar to STP. The data are stored in miniSEED format with gzip compression. Time gaps between time series were padded with null values, which substantially increases search efficiency by making the records uniform in length.

Chen, S. E.; Yu, E.; Bhaskaran, A.; Chowdhury, F. R.; Meisenhelter, S.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.

2011-12-01

217

A public health issue related to collateral seismic hazards: The valley fever outbreak triggered by the 1994 Northridge, California earthquake  

USGS Publications Warehouse

Following the 17 January 1994 Northridge, California, earthquake (M = 6.7), Ventura County, California, experienced a major outbreak of coccidioidomycosis (CM), commonly known as valley fever, a respiratory disease contracted by inhaling airborne fungal spores. In the 8 weeks following the earthquake (24 January through 15 March), 203 outbreak-associated cases were reported, which is about an order of magnitude more than the expected number of cases, and three of these cases were fatal. Simi Valley, in easternmost Ventura County, had the highest attack rate in the county, and the attack rate decreased westward across the county. The temporal and spatial distribution of CM cases indicates that the outbreak resulted from inhalation of spore-contaminated dust generated by earthquake-triggered landslides. Canyons northeast of Simi Valley produced many highly disrupted, dust-generating landslides during the earthquake and its aftershocks. Winds after the earthquake were from the northeast, which transported dust into Simi Valley and beyond to communities to the west. The three fatalities from the CM epidemic accounted for 4 percent of the total earthquake-related fatalities.

Jibson, R.W.

2002-01-01

218

Robust features of the source process for the 2004 Parkfield, California, earthquake from strong-motion seismograms  

NASA Astrophysics Data System (ADS)

We explore a recently developed procedure for kinematic inversion based on an elliptical subfault approximation. In this method, the slip is modelled by a small set of elliptical patches, each ellipse having a Gaussian distribution of slip. We invert near-field strong ground motion for the 2004 September 28 Mw 6.0 Parkfield, California, earthquake. The data set consists of 10 digital three-component 18-s long displacement seismograms. The best model gives a moment of 1.21 × 10^18 N m, with slip on two distinct ellipses, one with a high-slip amplitude of 0.91 m located 20 km northwest of the hypocentre. The average rupture speed of the rupture process is ~2.7 km/s. We find no slip in the top 5 km. At this depth, a lineation of small aftershocks marks the transition from creeping above to locked below, in the interseismic period. The high-slip patch coincides spatially with the hypocentre of the 1966 Mw 6.0 Parkfield, California, earthquake. The larger earthquakes prior to the 2004 Parkfield earthquake and the aftershocks of the 2004 earthquake (Mw > 3) also lie around this high-slip patch, where our model images a sharp slip gradient. This observation suggests the presence of a permanent asperity that breaks during large earthquakes, and has important implications for the slip deficit observed on the Parkfield segment, which is necessary for reliable seismic hazard assessment.
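The elliptical-subfault parameterization described above assigns each patch a Gaussian slip distribution; a minimal sketch of evaluating one such patch on a fault plane (parameter names and values are illustrative, not those of the study):

```python
import numpy as np

def elliptical_gaussian_slip(x, y, x0, y0, a, b, rotation_deg, peak_slip):
    """Slip (m) on a fault plane from one elliptical patch with a Gaussian fall-off.

    (x0, y0): patch centre on the fault plane (km); a, b: semi-axes (km);
    rotation_deg: orientation of the major axis; peak_slip: slip at the centre (m).
    """
    theta = np.radians(rotation_deg)
    dx, dy = x - x0, y - y0
    u = dx * np.cos(theta) + dy * np.sin(theta)    # along-major-axis coordinate
    v = -dx * np.sin(theta) + dy * np.cos(theta)   # along-minor-axis coordinate
    return peak_slip * np.exp(-0.5 * ((u / a) ** 2 + (v / b) ** 2))

# Evaluate slip on a 40 km x 15 km fault plane discretized at 1 km spacing.
x, y = np.meshgrid(np.arange(0.0, 40.0, 1.0), np.arange(0.0, 15.0, 1.0))
slip = elliptical_gaussian_slip(x, y, x0=25.0, y0=8.0, a=6.0, b=3.0,
                                rotation_deg=0.0, peak_slip=0.9)
print(slip.max(), slip.shape)
```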

Twardzik, C.; Madariaga, R.; Das, S.; Custódio, S.

2012-12-01

219

Probable Earthquake Archaeological Effects in the ancient pyramids of Quetzalcóatl and Sun in Teotihuacán (Central Mexico)  

NASA Astrophysics Data System (ADS)

Teotihuacán was one of the largest and most flourishing cities of the Prehispanic cultural period in the central valley of México and is one of the most important archaeological sites on Earth. During its period of splendour (Middle-Late Classic Period, 350-650 AD), almost 125,000 inhabitants lived in a vast city with more than 2000 stucco and block buildings, including the great religious and ceremonial pyramids: the Great Sun Pyramid, built between 1 and 150 AD; the Moon Pyramid, built over a long time span (1-650 AD); and the outstanding Quetzalcóatl Pyramid (Feathered Snake Temple), built in two phases: the original edifice, built before 350 AD, and a second phase consisting mainly of repairs to the west side, dated post-350 AD. The Quetzalcóatl Pyramid (Q-pyramid) has a quadrangular base of ca. 3500 m2 with an extraordinary decoration of feathered snakes (attributed to the god Quetzalcóatl) and lizards. The second phase of construction consisted of a townhouse façade covering the west side of the pyramid (post-350 AD); to date there is no evidence to explain why this west side was covered. This ceremonial building was built within the Citadel, a complex area of Teotihuacán that also included residential and common zones (e.g., a market). A detailed view of the steps of the west-side stairs displays different patterns of deformation affecting the blocks of the stair. The original, ancient stair exhibits rotated, overturned and displaced blocks, with the deformation strongest at the base of the pyramid. Moreover, the upper corners of the blocks appear broken in a form similar to the seismic-related feature defined as dipping broken corners or chipped corners. The horizontal disposition of the blocks, however, suggests lateral vibration between them from horizontal shaking. This feature is less evident on the lower blocks of the annexed west façade, the only originally preserved ones. We have carried out a systematic measurement of this feature across the original west stairs of the Q-pyramid and the first stair level of the Sun Pyramid. These horizontal dipping broken corners were also observed on the new stairs of the annexed façade of the Q-pyramid. This suggests that seismic shaking could have produced the deformation, with a post quem date of about 350 AD. More data are necessary to properly test the earthquake occurrence and to bracket a probable intensity value.

Perez-Lopez, Raul; Rodríguez-Pascua, Miguel Angel; Garduño-Monroy, Victor Hugo; Oliveros, Arturo; Giner-Robles, Jorge L.; Silva, Pablo G.

2010-05-01

220

Probability  

NSDL National Science Digital Library

This application demonstrates simple probability concepts by having students rank the probability of an event on a probability line (from impossible to certain). After several trials the application then allows students to complete a simulation and collect data based on the probability task (retrieving balls from a machine). Several guiding questions are provided throughout the activity to encourage student dialogue.

2011-01-18

221

Uniform California earthquake rupture forecast, version 3 (UCERF3): the time-independent model  

USGS Publications Warehouse

In this report we present the time-independent component of the Uniform California Earthquake Rupture Forecast, Version 3 (UCERF3), which provides authoritative estimates of the magnitude, location, and time-averaged frequency of potentially damaging earthquakes in California. The primary achievements have been to relax fault segmentation assumptions and to include multifault ruptures, both limitations of the previous model (UCERF2). The rates of all earthquakes are solved for simultaneously, and from a broader range of data, using a system-level "grand inversion" that is both conceptually simple and extensible. The inverse problem is large and underdetermined, so a range of models is sampled using an efficient simulated annealing algorithm. The approach is more derivative than prescriptive (for example, magnitude-frequency distributions are no longer assumed), so new analysis tools were developed for exploring solutions. Epistemic uncertainties were also accounted for using 1,440 alternative logic tree branches, necessitating access to supercomputers. The most influential uncertainties include alternative deformation models (fault slip rates), a new smoothed seismicity algorithm, alternative values for the total rate of M ≥ 5 events, and different scaling relationships, virtually all of which are new. As a notable first, three deformation models are based on kinematically consistent inversions of geodetic and geologic data, also providing slip-rate constraints on faults previously excluded because of lack of geologic data. The grand inversion constitutes a system-level framework for testing hypotheses and balancing the influence of different experts. For example, we demonstrate serious challenges with the Gutenberg-Richter hypothesis for individual faults. UCERF3 is still an approximation of the system, however, and the range of models is limited (for example, constrained to stay close to UCERF2). Nevertheless, UCERF3 removes the apparent UCERF2 overprediction of M6.5–7 earthquake rates and also includes types of multifault ruptures seen in nature. Although UCERF3 fits the data better than UCERF2 overall, there may be areas that warrant further site-specific investigation. Supporting products may be of general interest, and we list key assumptions and avenues for future model improvements.
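At its core, the "grand inversion" described above is an underdetermined linear system (rupture rates constrained by slip rates, paleoseismic event rates, and regional magnitude-frequency data) solved with a positivity constraint. A toy illustration of that structure follows, using a bounded least-squares solver as a simplified stand-in for the simulated-annealing sampler UCERF3 actually employs; the matrix and data values are made up:

```python
import numpy as np
from scipy.optimize import lsq_linear

# Toy system: 3 data constraints on 5 unknown rupture rates. Values are illustrative only.
A = np.array([
    [1.0, 1.0, 0.0, 1.0, 0.0],   # ruptures contributing to fault section 1's slip-rate budget
    [0.0, 1.0, 1.0, 1.0, 1.0],   # ruptures contributing to fault section 2's slip-rate budget
    [1.0, 1.0, 1.0, 1.0, 1.0],   # total regional rate constraint
])
d = np.array([0.02, 0.03, 0.05])  # "observed" rates (events/yr), made up

# Non-negative least squares: rupture rates cannot be negative.
res = lsq_linear(A, d, bounds=(0.0, np.inf))
print(res.x)   # one of many rate models consistent with the data (the system is underdetermined)
```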

Field, Edward H.; Biasi, Glenn P.; Bird, Peter; Dawson, Timothy E.; Felzer, Karen R.; Jackson, David D.; Johnson, Kaj M.; Jordan, Thomas H.; Madden, Christopher; Michael, Andrew J.; Milner, Kevin R.; Page, Morgan T.; Parsons, Thomas; Powers, Peter M.; Shaw, Bruce E.; Thatcher, Wayne R.; Weldon, Ray J., II; Zeng, Yuehua; Working Group on CA Earthquake Probabilities

2013-01-01

222

Compilation of Slip in Last Earthquake Data for High-Slip Rate Faults in California for Input into Slip Dependent Rupture Forecast  

NASA Astrophysics Data System (ADS)

Slip in the last earthquake along a fault, in conjunction with the application of appropriate recurrence models, can be used to estimate the timing and size of future ground-rupturing earthquakes. Surface slip measurements are relatively easy to acquire along highly active faults because offsets from the last event are usually well preserved by geomorphic features in the landscape. We present a comprehensive database of slip measurements for high slip rate strike-slip and dip-slip faults in California for input into the slip-dependent 2011 Uniform California Earthquake Rupture Forecast (UCERF 3). Our database includes historic, paleoseismic, and geomorphic data on the slip in the last event and multi-event offsets. Faults were prioritized by highest slip rates and longest time since the last event relative to average recurrence interval. Slip rate, timing of the last event, and recurrence interval were obtained from past reports by the Working Group on California Earthquake Probabilities, unless more recently published data were available. A literature search determined the availability of offset data for the highest priority faults. We contacted authors of published slip studies to ascertain whether additional data exist in unpublished archives, gray literature, or publications in preparation. The lack of consistency in existing schemes to rate offset quality led us to develop a new semi-quantitative method to assess feature quality and tectonic quality for new and existing data. Recent analyses of newly available, high-resolution LiDAR topography for micro-geomorphic offsets have substantially increased the number of slip measurements available for our compilation. For faults with LiDAR coverage, but limited, poor, or unavailable offset data, we identified reaches with a high potential to preserve geomorphic offsets and calculated slip measurements. The methodology for our geomorphic analyses has been developed and implemented successfully in recent studies along the central San Jacinto Fault and 1857 earthquake reach of the San Andreas Fault. Last, we compiled data collected from our literature search and LiDAR analysis into a geodatabase. Our database contains multiple measurements for the same features using different techniques, making it a powerful tool to test the repeatability of slip measurements. Our compilation reveals that despite local variation, slip values tend to cluster around a reach-averaged mean, and slip can be similar at a point over multiple events.

Arrowsmith, R.; Madden, C.; Haddad, D. E.; Salisbury, J. B.; Weldon, R. J.

2011-12-01

223

Earthquakes  

NSDL National Science Digital Library

This is an online lesson that can be transferred into a classroom instructional activity by the teacher. The lesson simplifies the concepts while encouraging higher-order thinking, scaffolding the layers of the Earth, plate tectonics, P and S waves, and the creation of a model of an earthquake. Students enjoy this lesson and have been able to improve on assessments after completing it. Teachers will enjoy the online printable worksheets that correlate to the lesson/data sheets and the variety of choices when using the interactive tool for whole-group instruction. There are many choices for formative as well as summative assessment.

U.S. Geological Survey Joy Lopez, M.A., teacher Scott Hassler, Ph.D. Geologist

2011-10-14

224

Probability  

NSDL National Science Digital Library

This lesson is designed to develop students' understanding of probability in real life situations. Students will also be introduced to running experiments, experimental probability, and theoretical probability. This lesson provides links to discussions and activities related to probability as well as suggested ways to integrate them into the lesson. Finally, the lesson provides links to follow-up lessons designed for use in succession with the current one.

2011-05-24

225

Aftershocks of the Coyote Lake, California, earthquake of August 6, 1979: A detailed study  

NASA Astrophysics Data System (ADS)

Aftershock hypocenters and focal mechanism solutions for the Coyote Lake, California, earthquake reveal a geometrically complex fault structure, consisting of multiple slip surfaces. The faulting surface principally consists of two right stepping en echelon, northwest trending, partially overlapping, nearly vertical sheets and is similar in geometry to a slip surface inferred for the 1966 Parkfield, California, earthquake. The overlap occurs near a prominent bend in the surface trace of the Calaveras fault at San Felipe Lake. Slip during the main rupture, as inferred from the distribution of early aftershocks, appears to have been confined to a 14-km portion of the northeastern sheet between 4- and 10-km depth. Focal mechanisms and the hypocentral distribution of aftershocks suggest that the main rupture surface itself is geometrically complex, with left stepping imbricate structure. Seismic shear displacement on the southwestern slip surface commenced some 5 hours after the mainshock. Aftershocks in this zone define a single vertical plane 8 km long between 3- and 7-km depth. Within the overlap zone between the two main slip surfaces, the average strike of aftershock nodal planes is significantly rotated clockwise relative to the strike of the fault zone, in close agreement with the stress perturbations predicted by crack interaction models. Aftershock activity in the overlap zone is not associated with a simple dislocation surface. Space and time clustering within the entire aftershock set suggest an alternation of seismic displacement between the component parts of the fault zone. This alternation is consistent with local stress perturbations predicted by crack interaction models. We conclude that the fault structure is geometrically complex and that the displacements that occur on its component surfaces during the aftershock process dynamically interact by generating perturbations in the local stress field which, in turn, control the displacements.

Reasenberg, P.; Ellsworth, W. L.

1982-12-01

226

Comparison of four moderate-size earthquakes in southern California using seismology and InSAR  

USGS Publications Warehouse

Source parameters determined from interferometric synthetic aperture radar (InSAR) measurements and from seismic data are compared from four moderate-size (less than M 6) earthquakes in southern California. The goal is to verify approximate detection capabilities of InSAR, assess differences in the results, and test how the two results can be reconciled. First, we calculated the expected surface deformation from all earthquakes greater than magnitude 4 in areas with available InSAR data (347 events). A search for deformation from the events in the interferograms yielded four possible events with magnitudes less than 6. The search for deformation was based on a visual inspection as well as cross-correlation in two dimensions between the measured signal and the expected signal. A grid-search algorithm was then used to estimate focal mechanism and depth from the InSAR data. The results were compared with locations and focal mechanisms from published catalogs. An independent relocation using seismic data was also performed. The seismic locations fell within the area of the expected rupture zone for the three events that show clear surface deformation. Therefore, the technique shows the capability to resolve locations with high accuracy and is applicable worldwide. The depths determined by InSAR agree with well-constrained seismic locations determined in a 3D velocity model. Depth control for well-imaged shallow events using InSAR data is good, and better than the seismic constraints in some cases. A major difficulty for InSAR analysis is the poor temporal coverage of InSAR data, which may make it impossible to distinguish deformation due to different earthquakes at the same location.
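The detection step described above, comparing the measured interferogram against the deformation pattern expected from a catalog source, can be sketched with a normalized 2-D cross-correlation; synthetic arrays stand in for real InSAR data here:

```python
import numpy as np
from scipy.signal import fftconvolve

def normalized_xcorr_peak(observed, expected):
    """Peak of the normalized cross-correlation between an observed interferogram patch
    and the surface deformation expected from a catalog source (both 2-D arrays)."""
    obs = observed - observed.mean()
    exp = expected - expected.mean()
    corr = fftconvolve(obs, exp[::-1, ::-1], mode="same")   # correlation via flipped-kernel convolution
    norm = np.sqrt((obs ** 2).sum() * (exp ** 2).sum())
    return corr.max() / norm

# Synthetic test: a small Gaussian "deformation" signal buried in noise.
y, x = np.mgrid[-50:50, -50:50]
expected = np.exp(-(x ** 2 + y ** 2) / (2 * 10.0 ** 2))
observed = 0.02 * expected + 0.01 * np.random.default_rng(0).standard_normal(expected.shape)
print(normalized_xcorr_peak(observed, expected))   # clearly above the noise-only value
```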

Mellors, R.J.; Magistrale, H.; Earle, P.; Cogbill, A.H.

2004-01-01

227

Triggered reverse fault and earthquake due to crustal unloading, northwest Transverse Ranges, California.  

USGS Publications Warehouse

A reverse-right-oblique surface rupture, associated with a ML 2.5 earthquake, formed in a diatomite quarry near Lompoc, California, in the northwesternmost Transverse Ranges on April 7, 1981. The 575-m-long narrow zone of ruptures formed in clay interbeds in diatomite and diatomaceous shale of the Neogene Monterey Formation. The ruptures parallel bedding, dip 39°-59° S, and trend about N84°E on the north limb of an open symmetrical syncline. Maximum net slip was 25 cm; maximum reverse dip slip was 23 cm, maximum right-lateral strike slip was about 9 cm, and average net slip was about 12 cm. The seismic moment of the earthquake is estimated at 1 to 2 × 10^18 dyne-cm and the static stress drop at about 3 bar. The removal of an average of about 44 m of diatomite resulted in an average load reduction of about 5 bar, which decreased the normal stress by about 3.5 bar and increased the shear stress on the tilted bedding plane by about 2 bar. The April 7, 1981, event was a very shallow bedding-plane rupture, apparently triggered by crustal unloading. -Authors
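The load-reduction figure above follows from the thickness of material removed and the unit weight of the quarried rock; a quick check, with the diatomite bulk density treated as an assumption (it is not stated in the abstract):

```python
g = 9.81                      # gravitational acceleration, m/s^2
thickness_removed = 44.0      # m, average quarried thickness from the abstract
bulk_density = 1150.0         # kg/m^3, assumed bulk density for diatomite (not from the abstract)

delta_sigma_v = bulk_density * g * thickness_removed   # vertical stress reduction, Pa
print(delta_sigma_v / 1.0e5)  # ~5 bar of unloading, consistent with the quoted value
```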

Yerkes, R.F.; Ellsworth, W.L.; Tinsley, J.C.

1983-01-01

228

A Double-difference Earthquake location algorithm: Method and application to the Northern Hayward Fault, California  

USGS Publications Warehouse

We have developed an efficient method to determine high-resolution hypocenter locations over large distances. The location method incorporates ordinary absolute travel-time measurements and/or cross-correlation P- and S-wave differential travel-time measurements. Residuals between observed and theoretical travel-time differences (or double-differences) are minimized for pairs of earthquakes at each station while linking together all observed event-station pairs. A least-squares solution is found by iteratively adjusting the vector difference between hypocentral pairs. The double-difference algorithm minimizes errors due to unmodeled velocity structure without the use of station corrections. Because catalog and cross-correlation data are combined into one system of equations, interevent distances within multiplets are determined to the accuracy of the cross-correlation data, while the relative locations between multiplets and uncorrelated events are simultaneously determined to the accuracy of the absolute travel-time data. Statistical resampling methods are used to estimate data accuracy and location errors. Uncertainties in double-difference locations are improved by more than an order of magnitude compared to catalog locations. The algorithm is tested, and its performance is demonstrated on two clusters of earthquakes located on the northern Hayward fault, California. There it collapses the diffuse catalog locations into sharp images of seismicity and reveals horizontal lineations of hypocenters that define the narrow regions on the fault where stress is released by brittle failure.
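The double-difference residual that the algorithm minimizes can be written compactly; a minimal sketch of forming that residual for one event pair at one station (a straight-ray travel time in a uniform medium stands in for the real velocity model):

```python
import numpy as np

def travel_time(hypocenter, station, v=6.0):
    """Straight-ray P travel time (s) in a uniform 6 km/s medium (illustrative only)."""
    return np.linalg.norm(np.asarray(hypocenter) - np.asarray(station)) / v

def double_difference(obs_tt, hypos, station, pair):
    """dr_k^ij = (t_i - t_j)_obs - (t_i - t_j)_calc for event pair (i, j) at station k."""
    i, j = pair
    obs_diff = obs_tt[i] - obs_tt[j]
    calc_diff = travel_time(hypos[i], station) - travel_time(hypos[j], station)
    return obs_diff - calc_diff

# Two nearby trial hypocenters (x, y, z in km) and one surface station.
hypos = {0: (0.0, 0.0, 8.0), 1: (0.3, 0.1, 8.2)}
station = (15.0, 5.0, 0.0)
obs_tt = {0: 2.95, 1: 2.99}   # picked or cross-correlation-derived arrival times (s)
print(double_difference(obs_tt, hypos, station, (0, 1)))
```

In the full method, residuals of this form for all pairs and stations are assembled into a linear system that is solved iteratively for adjustments to the hypocentral separations.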

Waldhauser, F.; Ellsworth, W.L.

2000-01-01

229

The 1868 Hayward Earthquake Alliance: A Case Study - Using an Earthquake Anniversary to Promote Earthquake Preparedness  

NASA Astrophysics Data System (ADS)

Last October 21st marked the 140th anniversary of the M6.8 1868 Hayward Earthquake, the last damaging earthquake on the southern Hayward Fault. This anniversary was used to help publicize the seismic hazards associated with the fault because: (1) the past five such earthquakes on the Hayward Fault occurred about 140 years apart on average, and (2) the Hayward-Rodgers Creek Fault system is the most likely (with a 31 percent probability) fault in the Bay Area to produce a M6.7 or greater earthquake in the next 30 years. To promote earthquake awareness and preparedness, over 140 public and private agencies and companies and many individuals joined the public-private nonprofit 1868 Hayward Earthquake Alliance (1868alliance.org). The Alliance sponsored many activities including a public commemoration at Mission San Jose in Fremont, which survived the 1868 earthquake. This event was followed by an earthquake drill at Bay Area schools involving more than 70,000 students. The anniversary prompted the Silver Sentinel, an earthquake response exercise based on the scenario of an earthquake on the Hayward Fault conducted by Bay Area County Offices of Emergency Services. 60 other public and private agencies also participated in this exercise. The California Seismic Safety Commission and KPIX (CBS affiliate) produced professional videos designed for school classrooms promoting Drop, Cover, and Hold On. Starting in October 2007, the Alliance and the U.S. Geological Survey held a sequence of press conferences to announce the release of new research on the Hayward Fault as well as new loss estimates for a Hayward Fault earthquake. These included: (1) a ShakeMap for the 1868 Hayward earthquake, (2) a report by the U. S. Bureau of Labor Statistics forecasting the number of employees, employers, and wages predicted to be within areas most strongly shaken by a Hayward Fault earthquake, (3) new estimates of the losses associated with a Hayward Fault earthquake, (4) new ground motion simulations of a Hayward Fault earthquake, (5) a new USGS Fact Sheet about the earthquake and the Hayward Fault, (6) a virtual tour of the 1868 earthquake, and (7) a new online field trip guide to the Hayward Fault using locations accessible by car and public transit. Finally, the California Geological Survey and many other Alliance members sponsored the Third Conference on Earthquake Hazards in the East Bay at CSU East Bay in Hayward for the three days following the 140th anniversary. The 1868 Alliance hopes to commemorate the anniversary of the 1868 Hayward Earthquake every year to maintain and increase public awareness of this fault, the hazards it and other East Bay Faults pose, and the ongoing need for earthquake preparedness and mitigation.

Brocher, T. M.; Garcia, S.; Aagaard, B. T.; Boatwright, J. J.; Dawson, T.; Hellweg, M.; Knudsen, K. L.; Perkins, J.; Schwartz, D. P.; Stoffer, P. W.; Zoback, M.

2008-12-01

230

Mapping probability of fire occurrence in San Jacinto Mountains, California, USA  

NASA Astrophysics Data System (ADS)

An ecological data base for the San Jacinto Mountains, California, USA, was used to construct a probability model of wildland fire occurrence. The model incorporates both environmental and human factors, including vegetation, temperature, precipitation, human structures, and transportation. Spatial autocorrelation was examined for both fire activity and vegetation to determine the specification of neighborhood effects in the model. Parameters were estimated using stepwise logistic regressions. Among the explanatory variables, the variable that represents the neighborhood effects of spatial processes is shown to be of great importance in the distribution of wildland fires. An important implication of this result is that the management of wildland fires must take into consideration neighborhood effects in addition to environmental and human factors. The distribution of fire occurrence probability is more accurately mapped when the model incorporates the spatial term of neighborhood effects. The map of fire occurrence probability is useful for designing large-scale management strategies of wildfire prevention.
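A logistic model of the kind described, with a neighborhood (spatial autocorrelation) term alongside environmental and human covariates, might be sketched as follows; feature names and data below are placeholders, not the study's data set:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 500

# Placeholder covariates per grid cell: vegetation score, temperature, precipitation,
# distance to human structures/transport, and a neighborhood term (neighboring-cell fire activity).
X = np.column_stack([
    rng.uniform(0, 1, n),        # vegetation
    rng.normal(20, 5, n),        # temperature (deg C)
    rng.normal(400, 100, n),     # precipitation (mm)
    rng.uniform(0, 10, n),       # distance to structures/roads (km)
    rng.uniform(0, 1, n),        # neighborhood effect
])
# Synthetic truth in which the neighborhood term dominates, echoing the paper's finding.
logit = -2.0 + 1.0 * X[:, 0] + 0.05 * X[:, 1] - 0.005 * X[:, 2] - 0.1 * X[:, 3] + 3.0 * X[:, 4]
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

model = LogisticRegression(max_iter=1000).fit(X, y)
fire_probability = model.predict_proba(X)[:, 1]   # mapped back to cells to produce the probability map
print(model.coef_.round(2))
```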

Chou, Yue Hong; Minnich, Richard A.; Chase, Richard A.

1993-01-01

231

Survey of strong motion earthquake effects on thermal power plants in California with emphasis on piping systems. Volume 2, Appendices  

SciTech Connect

Volume 2 of the "Survey of Strong Motion Earthquake Effects on Thermal Power Plants in California with Emphasis on Piping Systems" contains appendices that detail the design and seismic response of several power plants subjected to strong motion earthquakes. The particular plants considered include the Ormond Beach, Long Beach and Seal Beach, Burbank, El Centro, Glendale, Humboldt Bay, Kern Valley, Pasadena and Valley power plants. Included are a typical power plant piping specification and photographs of typical piping and support installations for the plants surveyed. Detailed piping support spacing data are also included.

Stevenson, J.D. [Stevenson and Associates, Cleveland, OH (United States)

1995-11-01

232

Crustal refraction profile of the Long Valley caldera, California, from the January 1983 Mammoth Lakes earthquake swarm  

USGS Publications Warehouse

Seismic-refraction profiles recorded north of Mammoth Lakes, California, using earthquake sources from the January 1983 swarm complement earlier explosion refraction profiles and provide velocity information from deeper in the crust in the area of the Long Valley caldera. Eight earthquakes from a depth range of 4.9 to 8.0 km confirm the observation of basement rocks with seismic velocities ranging from 5.8 to 6.4 km/sec extending at least to depths of 20 km. The data provide further evidence for the existence of a partial melt zone beneath Long Valley caldera and constrain its geometry.

Luetgert, James H.; Mooney, Walter D.

1985-01-01

233

Earthquake Myths  

NSDL National Science Digital Library

This site serves to belie several popular myths about earthquakes. Students will learn that most earthquakes do not occur in the early morning and one cannot be swallowed up by an earthquake. In addition, there is no such thing as earthquake weather and California is not falling into the ocean. On the more practical side, students can learn that good building codes do not insure good buildings, it is safer under a table than in a doorway during an earthquake, and most people do not panic during an earthquake.

234

Precise relocations and stress change calculations for the Upland earthquake sequence in southern California  

NASA Astrophysics Data System (ADS)

We relocate earthquakes that occurred near the 1988 (ML = 4.7) and the 1990 (ML = 5.5) Upland, California, earthquakes to map the fault geometry of the poorly defined San Jose fault and to test the static stress triggering hypothesis for this sequence. We adopt the L1 norm, waveform cross-correlation method of Shearer [1997] to obtain precise relocations for 1573 events between 1981 and 1997 in the Upland area. To limit computation time, we only perform waveform cross correlation on 60 of the nearest neighbors of each relocated event. Our final relocations show two linear features. The first is imaged by the locations of the initial month of aftershocks of the 1988 Upland earthquake, which delineate a fault with a dip angle of ˜45° between 7 and 9 km depth, consistent with the mainshock focal mechanism. The second linear feature is a plane dipping at about 74° from 2 to 9 km depth, which is illuminated by both the 1988 and 1990 Upland sequences, in agreement with the inferred location of the San Jose fault at depth. However, below 9 km the event locations become more diffuse, giving rise to two different interpretations of the fate of the San Jose fault at depth. One possibility is that the fault shallows at depth, consistent with our relocations but not with the focal mechanism of a ML = 4.7 deep aftershock. Alternatively, the fault may be offset at depth by the more shallow dipping fault strand broken during the 1988 earthquake. Using these inferred fault geometries, we compute stress changes resulting from slip during the mainshocks to test whether the relocated aftershocks are consistent with the hypothesis that more aftershocks occur where the change in static Coulomb failure stress is positive (on faults optimally oriented for failure). This requires an extension of previous models of changes in the failure stress to three dimensions and arbitrary fault orientation. We find that patterns of change in Coulomb failure stress differ little between the different fault geometries: all are nearly symmetric about the fault and so do not match the aftershock distribution, in which most of the off-fault events occur to one side of the fault plane.
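The Coulomb failure stress change used in the triggering test above has a standard form; a minimal sketch of evaluating it on a receiver fault, given shear and normal stress changes already resolved from a slip model (the resolution step itself, e.g. via a dislocation formulation, is omitted here):

```python
def coulomb_stress_change(delta_shear, delta_normal, friction=0.4, skempton=0.0):
    """Change in Coulomb failure stress on a receiver fault (MPa).

    delta_shear  : shear stress change in the slip direction (MPa, positive = toward failure)
    delta_normal : normal stress change (MPa, positive = unclamping)
    friction     : coefficient of friction
    skempton     : Skempton's coefficient; effective friction = friction * (1 - skempton)
    """
    effective_friction = friction * (1.0 - skempton)
    return delta_shear + effective_friction * delta_normal

# Example: 0.10 MPa of shear loading with 0.05 MPa of clamping on the receiver plane.
print(coulomb_stress_change(0.10, -0.05, friction=0.4))   # 0.08 MPa, still positive (promotes failure)
```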

Astiz, Luciana; Shearer, Peter M.; Agnew, Duncan C.

2000-02-01

235

Revised earthquake hazard of the Hat Creek fault, northern California: A case example of a normal fault dissecting variable-age basaltic lavas  

E-print Network

Revised earthquake hazard of the Hat Creek fault, northern California: A case example of a normal that provide information about the slip behavior and earthquake potential. The 47-km-long Hat Creek fault be applied to any normal-faulted basalt environment. Applied to the Hat Creek fault, we estimate

Kattenhorn, Simon

236

Probability  

NSDL National Science Digital Library

What are the chances? What is probability? Math Glossary What are the chances that you will get a baby brother or a baby sister? Boy or Girl? If you flip more than one coin, what are the combinations you could get? What are the chances you will get each combination? Probability in Flipping Coins ...

Ms. Banks

2005-05-11

237

Potential Effects of a Scenario Earthquake on the Economy of Southern California: Small Business Exposure and Sensitivity Analysis to a Magnitude 7.8 Earthquake  

USGS Publications Warehouse

The Multi-Hazards Demonstration Project (MHDP) is a collaboration between the U.S. Geological Survey (USGS) and various partners from the public and private sectors and academia, meant to improve Southern California's resiliency to natural hazards (Jones and others, 2007). In support of the MHDP objectives, the ShakeOut Scenario was developed. It describes a magnitude 7.8 (M7.8) earthquake along the southernmost 300 kilometers (200 miles) of the San Andreas Fault, identified by geoscientists as a plausible event that will cause moderate to strong shaking over much of the eight-county (Imperial, Kern, Los Angeles, Orange, Riverside, San Bernardino, San Diego, and Ventura) Southern California region. This report contains an exposure and sensitivity analysis of small businesses in terms of labor and employment statistics. Exposure is measured as the absolute counts of labor market variables anticipated to experience each level of Instrumental Intensity (a proxy measure of damage). Sensitivity is the percentage of the exposure of each business establishment size category to each Instrumental Intensity level. The analysis concerns the direct effect of the earthquake on small businesses. The analysis is inspired by the Bureau of Labor Statistics (BLS) report that analyzed the labor market losses (exposure) of a M6.9 earthquake on the Hayward fault by overlaying geocoded labor market data on Instrumental Intensity values. The method used here is influenced by the ZIP-code-level data provided by the California Employment Development Department (CA EDD), which requires the assignment of Instrumental Intensities to ZIP codes. The ZIP-code-level labor market data includes the number of business establishments, employees, and quarterly payroll categorized by business establishment size.

Sherrouse, Benson C.; Hester, David J.; Wein, Anne M.

2008-01-01

238

TriNet "ShakeMaps": Rapid generation of peak ground motion and intensity maps for earthquakes in southern California  

USGS Publications Warehouse

Rapid (3-5 minutes) generation of maps of instrumental ground-motion and shaking intensity is accomplished through advances in real-time seismographic data acquisition combined with newly developed relationships between recorded ground-motion parameters and expected shaking intensity values. Estimation of shaking over the entire regional extent of southern California is obtained by the spatial interpolation of the measured ground motions with geologically based frequency and amplitude-dependent site corrections. Production of the maps is automatic, triggered by any significant earthquake in southern California. Maps are now made available within several minutes of the earthquake for public and scientific consumption via the World Wide Web; they will be made available with dedicated communications for emergency response agencies and critical users.
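The mapping step described above, interpolating sparse observations onto a grid and applying geologically based site corrections, might be sketched as follows; the amplification factors and station values here are illustrative, not the operational ShakeMap ones:

```python
import numpy as np
from scipy.interpolate import griddata

def shaking_map(station_xy, station_pga, station_amp, grid_x, grid_y, grid_amp):
    """Interpolate observed PGA to a grid with simple site corrections.

    Observations are first reduced to a reference (rock) condition by dividing out each
    station's amplification factor, interpolated spatially, then re-amplified with the
    geologically based factor at each grid cell.
    """
    pga_rock = np.asarray(station_pga) / np.asarray(station_amp)      # remove site effect
    grid_pga_rock = griddata(station_xy, pga_rock, (grid_x, grid_y), method="linear")
    return grid_pga_rock * grid_amp                                    # re-apply site effect

# Toy example: four stations, 2x2 output grid (units of g; distances in km).
station_xy = np.array([[0, 0], [10, 0], [0, 10], [10, 10]], float)
station_pga = [0.12, 0.30, 0.08, 0.20]
station_amp = [1.0, 1.5, 1.0, 1.2]
gx, gy = np.meshgrid([2.5, 7.5], [2.5, 7.5])
grid_amp = np.array([[1.0, 1.4], [1.1, 1.3]])
print(shaking_map(station_xy, station_pga, station_amp, gx, gy, grid_amp))
```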

Wald, D.J.; Quitoriano, V.; Heaton, T.H.; Kanamori, H.; Scrivner, C.W.; Worden, C.B.

1999-01-01

239

Detailed observations of California foreshock sequences: Implications for the earthquake initiation process  

USGS Publications Warehouse

We find that foreshocks provide clear evidence for an extended nucleation process before some earthquakes. In this study, we examine in detail the evolution of six California foreshock sequences, the 1986 Mount Lewis (ML = 5.5), the 1986 Chalfant (ML = 6.4), the 1986 Stone Canyon (ML = 4.7), the 1990 Upland (ML = 5.2), the 1992 Joshua Tree (MW = 6.1), and the 1992 Landers (MW = 7.3) sequence. Typically, uncertainties in hypocentral parameters are too large to establish the geometry of foreshock sequences and hence to understand their evolution. However, the similarity of location and focal mechanisms for the events in these sequences leads to similar foreshock waveforms that we cross correlate to obtain extremely accurate relative locations. We use these results to identify small-scale fault zone structures that could influence nucleation and to determine the stress evolution leading up to the mainshock. In general, these foreshock sequences are not compatible with a cascading failure nucleation model in which the foreshocks all occur on a single fault plane and trigger the mainshock by static stress transfer. Instead, the foreshocks seem to concentrate near structural discontinuities in the fault and may themselves be a product of an aseismic nucleation process. Fault zone heterogeneity may also be important in controlling the number of foreshocks, i.e., the stronger the heterogeneity, the greater the number of foreshocks. The size of the nucleation region, as measured by the extent of the foreshock sequence, appears to scale with mainshock moment in the same manner as determined independently by measurements of the seismic nucleation phase. We also find evidence for slip localization as predicted by some models of earthquake nucleation.

Dodge, D.A.; Beroza, G.C.; Ellsworth, W.L.

1996-01-01

240

Anomalous geomagnetic variations associated with Parkfield (Ms=6.0, 28-SEP-2004, California, USA) earthquake  

NASA Astrophysics Data System (ADS)

Analysis of geomagnetic and telluric data measured at station PRK (Parkfield, ULF flux-gate 3-axial magnetometer) during the week up to and including the day of the major EQ (earthquake, Ms = 6.0, 28-SEP-2004, 17:15:24) near Parkfield, California, USA, is presented. Spectral analysis reveals ULF geomagnetic disturbances observed the day before the event, Sep 27, at 15:00-20:00 UT, and on the day of the EQ, Sep 28, at 11:00-19:00. Filtering in the corresponding frequency band f = 0.25-0.5 Hz gives the following estimates of the signal amplitudes: up to 20 pT for the magnetic channels and 1.5 mkV/km for the telluric ones. The observed phenomena occur under quiet geomagnetic conditions (|Dst| < 20 nT); review of data from reference stations situated far from the EQ epicenter (330 km) does not reveal any similar effect. Moreover, the Quake Finder research group (http://www.quakefinder.com) obtained very similar results (ELF-range instrument, placed about 50 km from the EQ epicenter) for the day of the EQ. The above suggests a localized source, possibly of ionospheric or tectonic origin rather than magnetospheric. Comparative analysis of the two stations mentioned shows that we observed the lower-frequency part of a ULF-ELF burst, localized in the frequency range 0.25-1 Hz, generated 9 hours before the earthquake. Acknowledgements. The authors are grateful to Malcolm Johnston for providing us with the geomagnetic data.

Kotsarenko, A. A.; Pilinets, S. A.; Perez Enriquez, R.; Lopez Cruz Abeyro, J. A.

2007-05-01

241

The California Academy of Sciences, Grove Karl Gilbert, and Photographs of the 1906 Earthquake, mostly from the Archives of the Academy  

Microsoft Academic Search

In the early morning hours of 18 April 1906, a catastrophic earthquake struck the San Francisco region of central California. Buildings were severely damaged, transportation systems disrupted, water mains broken. When fires broke out in the downtown area, firefighters had no means to control them. The California Academy of Sciences museum building lay in the path of advancing

Alan E. Leviton; Michele L. Aldrich; Karren Elsbernd

242

Remotely triggered microearthquakes and tremor in central California following the 2010 Mw 8.8 Chile earthquake  

USGS Publications Warehouse

We examine remotely triggered microearthquakes and tectonic tremor in central California following the 2010 Mw 8.8 Chile earthquake. Several microearthquakes near the Coso Geothermal Field were apparently triggered, with the largest earthquake (Ml 3.5) occurring during the large-amplitude Love surface waves. The Chile mainshock also triggered numerous tremor bursts near the Parkfield-Cholame section of the San Andreas Fault (SAF). The locally triggered tremor bursts are partially masked at lower frequencies by the regionally triggered earthquake signals from Coso, but can be identified by applying high-pass or matched filters. Both triggered tremor along the SAF and the Ml 3.5 earthquake in Coso are consistent with frictional failure at different depths on critically-stressed faults under the Coulomb failure criteria. The triggered tremor, however, appears to be more phase-correlated with the surface waves than the triggered earthquakes, likely reflecting differences in constitutive properties between the brittle, seismogenic crust and the underlying lower crust.
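Isolating locally triggered tremor from the regionally triggered earthquake signals, as described above, comes down to high-pass filtering and template matching; a minimal sketch with purely synthetic data (no real waveforms or ObsPy handling here):

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, correlate

fs = 100.0                                   # samples per second
t = np.arange(0.0, 600.0, 1.0 / fs)
rng = np.random.default_rng(2)

# Synthetic record: low-frequency surface waves plus a short high-frequency "tremor" burst near 300 s.
record = np.sin(2 * np.pi * 0.05 * t)
burst = np.exp(-((t - 300.0) / 5.0) ** 2) * np.sin(2 * np.pi * 8.0 * t)
record = record + 0.3 * burst + 0.05 * rng.standard_normal(t.size)

# High-pass above 2 Hz to suppress the teleseismic surface waves.
sos = butter(4, 2.0, btype="highpass", fs=fs, output="sos")
hp = sosfiltfilt(sos, record)

# Matched filter: normalized correlation with a known tremor template (here, the burst itself).
template = sosfiltfilt(sos, 0.3 * burst)[int(295 * fs):int(305 * fs)]
cc = correlate(hp, template, mode="valid")
cc /= np.linalg.norm(template) * np.sqrt(np.convolve(hp ** 2, np.ones(template.size), "valid"))
print(t[np.argmax(cc)])                      # detection time near 295 s (start of the template window)
```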

Peng, Zhigang; Hill, David P.; Shelly, David R.; Aiken, Chastity

2010-01-01

243

Virtual Earthquake  

NSDL National Science Digital Library

Virtual Earthquake was created by California State University, Los Angeles, as part of the Electronic Desktop Project. This virtual simulation allows students to locate the epicenter of an earthquake and determine its magnitude on the Richter scale. Students can choose from four geographic areas for their simulation. Virtual Earthquake carefully guides the student through the steps required to calculate the epicenter and to determine the magnitude of a simulated earthquake. The actual epicenter is provided along with the epicenter determined by the user. The user can then determine the magnitude of the earthquake as measured on the Richter scale.

244

Evaluation of Real-Time Performance of the Virtual Seismologist Earthquake Early Warning Algorithm in Switzerland and California  

NASA Astrophysics Data System (ADS)

The Virtual Seismologist (VS) method is a Bayesian approach to regional network-based earthquake early warning (EEW) originally formulated by Cua and Heaton (2007). Implementation of VS into real-time EEW codes has been an on-going effort of the Swiss Seismological Service at ETH Zürich since 2006, with support from ETH Zürich, various European projects, and the United States Geological Survey (USGS). VS is one of three EEW algorithms - the other two being ElarmS (Allen and Kanamori, 2003) and On-Site (Wu and Kanamori, 2005; Boese et al., 2008) algorithms - that form the basis of the California Integrated Seismic Network (CISN) ShakeAlert system, a USGS-funded prototype end-to-end EEW system that could potentially be implemented in California. In Europe, VS is currently operating as a real-time test system in Switzerland. As part of the on-going EU project REAKT (Strategies and Tools for Real-Time Earthquake Risk Reduction), VS will be installed and tested at other European networks. VS has been running in real-time on stations of the Southern California Seismic Network (SCSN) since July 2008, and on stations of the Berkeley Digital Seismic Network (BDSN) and the USGS Menlo Park strong motion network in northern California since February 2009. In Switzerland, VS has been running in real-time on stations monitored by the Swiss Seismological Service (including stations from Austria, France, Germany, and Italy) since 2010. We present summaries of the real-time performance of VS in Switzerland and California over the past two and three years respectively. The empirical relationships used by VS to estimate magnitudes and ground motion, originally derived from southern California data, are demonstrated to perform well in northern California and Switzerland. Implementation in real-time and off-line testing in Europe will potentially be extended to southern Italy, western Greece, Istanbul, Romania, and Iceland. Integration of the VS algorithm into both the CISN Advanced Quake Monitoring System (AQMS) and the SeisComP3 real-time earthquake monitoring systems is underway. VS operation in California will eventually be fully transitioned to the CISN AQMS system. European installations of VS will most likely be based on the SeisComP3 platform.

Behr, Y.; Cua, G. B.; Clinton, J. F.; Heaton, T. H.

2012-12-01

245

Low Velocity Zones along the San Jacinto Fault, Southern California, inferred from Local Earthquakes  

NASA Astrophysics Data System (ADS)

Natural fault zones have regions of brittle damage leading to a low-velocity zone (LVZ) in the immediate vicinity of the main fault interface. The LVZ may amplify ground motion, modify rupture propagation, and impact derivation of earthquake properties. Here we image low-velocity fault zone structures along the San Jacinto Fault (SJF), southern California, using waveforms of local earthquakes that are recorded at several dense arrays across the SJFZ. We use generalized ray theory to compute synthetic travel times to track the direct and FZ-reflected waves bouncing from the FZ boundaries. This method can effectively reduce the trade-off between FZ width and velocity reduction relative to the host rock. Our preliminary results from travel time modeling show the clear signature of LVZs along the SJF, including the segment of the Anza seismic gap. At the southern part near the trifurcation area, the LVZ of the Clark Valley branch (array JF) has a width of ~200 m with ~55% reduction in Vp and Vs. This is consistent with what has been suggested by previous studies. In comparison, we find that the velocity reduction relative to the host rock across the Anza seismic gap (array RA) is ~50% for both Vp and Vs, nearly as prominent as that on the southern branches. The width of the LVZ is ~230 m. In addition, the LVZ across the Anza gap appears to be located on the northeast side of the RA array, implying a potential preferred propagation direction of past ruptures.

Li, Z.; Yang, H.; Peng, Z.; Ben-Zion, Y.; Vernon, F.

2013-12-01

246

A three-step Maximum-A-Posterior probability method for InSAR data inversion of coseismic rupture with application to four recent large earthquakes in Asia  

NASA Astrophysics Data System (ADS)

We develop a three-step Maximum-A-Posterior probability (MAP) method for coseismic rupture inversion, which aims at maximizing the a posterior probability density function (PDF) of elastic solutions of earthquake rupture. The method originates from the Fully Bayesian Inversion (FBI) and the Mixed linear-nonlinear Bayesian inversion (MBI) methods, shares the same a posterior PDF with them and keeps most of their merits, while overcoming their convergence difficulty when large numbers of low-quality data are used and greatly improving the convergence rate using optimization procedures. A highly efficient global optimization algorithm, Adaptive Simulated Annealing (ASA), is used to search for the maximum posterior probability in the first step. The non-slip parameters are determined by the global optimization method, and the slip parameters are inverted for using the least squares method, initially without a positivity constraint and then damped to a physically reasonable range. This first-step MAP inversion brings the inversion close to the 'true' solution quickly and jumps over local maximum regions in the high-dimensional parameter space. The second-step inversion approaches the 'true' solution further, with positivity constraints subsequently applied on the slip parameters using the Monte Carlo Inversion (MCI) technique, with all parameters obtained from step one as the initial solution. The slip artifacts are then eliminated from the slip models in the third-step MAP inversion, with fault geometry parameters fixed. We first used a designed model with a 45-degree dip angle and oblique slip, and corresponding synthetic InSAR data sets, to validate the efficiency and accuracy of the method. We then applied the method to four recent large earthquakes in Asia, namely the 2010 Yushu, China earthquake, the 2011 Burma earthquake, the 2011 New Zealand earthquake and the 2008 Qinghai, China earthquake, and compared our results with those from other groups. Our results show the effectiveness of the method in earthquake studies and a number of its advantages over other methods. The details will be reported at the meeting.
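A hedged sketch of the spirit of the first (global search) step follows, using SciPy's dual annealing in place of the Adaptive Simulated Annealing implementation the authors describe, on a one-parameter toy misfit; everything here, including the stand-in forward model, is illustrative:

```python
import numpy as np
from scipy.optimize import dual_annealing

# Toy "non-slip" parameter search: find the fault dip that minimizes the misfit
# between a stand-in forward model and "observed" InSAR line-of-sight displacements.
observed = np.array([0.8, 1.0, 0.6])

def forward_model(dip_deg):
    """Stand-in elastic forward model: LOS displacement pattern as a function of dip."""
    return np.array([0.8, 1.0, 0.6]) * np.sin(np.radians(dip_deg)) / np.sin(np.radians(45.0))

def negative_log_posterior(params):
    dip_deg = params[0]
    residual = observed - forward_model(dip_deg)
    return 0.5 * np.sum(residual ** 2) / 0.05 ** 2    # Gaussian data term; flat prior

result = dual_annealing(negative_log_posterior, bounds=[(10.0, 80.0)], seed=3)
print(result.x)   # ~45 degrees, the value used to generate the synthetic "observations"
```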

Sun, J.; Shen, Z.; Burgmann, R.; Liang, F.

2012-12-01

247

Seismicity remotely triggered by the magnitude 7.3 Landers, California, earthquake  

Microsoft Academic Search

The magnitude 7.3 Landers earthquake of 28 June 1992 triggered a remarkably sudden and widespread increase in earthquake activity across much of the western United States. The triggered earthquakes, which occurred at distances up to 1250 kilometers (17 source dimensions) from the Landers mainshock, were confined to areas of persistent seismicity and strike-slip to normal faulting. Many of the triggered

D. P. Hill; P. A. Reasenberg; A. Michael; W. J. Arabaz; G. Beroza; D. Brumbaugh; J. N. Brune; R. Castro; S. Davis; D. Depolo; R. B. Smith; L. Munguia; A. Vidal; V. Wong; J. Gomberg; S. Harmsen; L. House; S. M. Jackson; L. Jones; R. Keller; S. Malone; A. Sanford; S. Walter; J. Zollweg

1993-01-01

248

Earthquakes, active faults, and geothermal areas in the Imperial Valley, California  

USGS Publications Warehouse

A dense seismograph network in the Imperial Valley recorded a series of earthquake swarms along the Imperial and Brawley faults and a diffuse pattern of earthquakes along the San Jacinto fault. Two known geothermal areas are closely associated with these earthquake swarms. This seismicity pattern demonstrates that seismic slip is occurring along both the Imperial-Brawley and San Jacinto fault systems.

Hill, D.P.; Mowinckel, P.; Peake, L.G.

1975-01-01

249

Surface Rupture and Slip Distribution of the 1940 Imperial Valley Earthquake, Imperial Fault, Southern California: Implications  

E-print Network

... more than a century of ruptures following the 1891 Nobi and 1906 San Francisco earthquakes, where scientists ... [the 1940 Imperial Valley] earthquake to provide a higher-resolution distribution of displacement with which to test variation ...

Klinger, Yann

250

Forecasting the evolution of seismicity in southern California: Animations built on earthquake stress transfer  

USGS Publications Warehouse

We develop a forecast model to reproduce the distribution of main shocks, aftershocks and surrounding seismicity observed during 1986-2003 in a 300 × 310 km area centered on the 1992 M = 7.3 Landers earthquake. To parse the catalog into frames with equal numbers of aftershocks, we animate seismicity in log time increments that lengthen after each main shock; this reveals aftershock zone migration, expansion, and densification. We implement a rate/state algorithm that incorporates the static stress transferred by each M ≥ 6 shock and then evolves. Coulomb stress changes amplify the background seismicity, so small stress changes produce large changes in seismicity rate in areas of high background seismicity. Similarly, seismicity rate declines in the stress shadows are evident only in areas with previously high seismicity rates. Thus a key constituent of the model is the background seismicity rate, which we smooth from 1981 to 1986 seismicity. The mean correlation coefficient between observed and predicted M ≥ 1.4 shocks (the minimum magnitude of completeness) is 0.52 for 1986-2003 and 0.63 for 1992-2003; a control standard aftershock model yields 0.54 and 0.52 for the same periods. Four M ≥ 6.0 shocks struck during the test period; three are located at sites where the expected seismicity rate falls above the 92nd percentile, and one is located above the 75th percentile. The model thus reproduces much, but certainly not all, of the observed spatial and temporal seismicity, from which we infer that the decaying effect of stress transferred by successive main shocks influences seismicity for decades. Finally, we offer an M ≥ 5 earthquake forecast for 2005-2015, assigning probabilities to 324 10 × 10 km cells.
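
The basic ingredient such a rate/state forecast evolves is the Dieterich (1994) seismicity-rate response to a static Coulomb stress step, which can be written down in a few lines. The parameter values below are illustrative assumptions, not those calibrated in the study.

    # Dieterich (1994) rate/state seismicity rate after a static Coulomb stress step.
    # All parameter values are assumptions chosen only for illustration.
    import numpy as np

    r       = 10.0    # background seismicity rate in a cell, events/yr (assumed)
    A_sigma = 0.04    # constitutive parameter A*sigma, MPa (assumed)
    tau_dot = 0.004   # background stressing rate, MPa/yr (assumed)
    dCFF    = 0.1     # static Coulomb stress change from a mainshock, MPa (assumed)

    t_a = A_sigma / tau_dot                  # aftershock relaxation time, yr
    t = np.linspace(0.0, 50.0, 501)

    # The rate jumps immediately after the step and relaxes back to r over ~t_a.  Because
    # the change multiplies the *background* rate, the same stress change produces far
    # more earthquakes where r is already high, and stress shadows are visible only where
    # r was high to begin with (use a negative dCFF to see the rate drop).
    R = r / ((np.exp(-dCFF / A_sigma) - 1.0) * np.exp(-t / t_a) + 1.0)

    print(f"t_a = {t_a:.0f} yr; immediate rate amplification = {R[0] / r:.1f}x background")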

Toda, S.; Stein, R.S.; Richards-Dinger, K.; Bozkurt, S.B.

2005-01-01

251

Response of the San Andreas fault to the 1983 Coalinga-Nuñez earthquakes: An application of interaction-based probabilities for Parkfield  

NASA Astrophysics Data System (ADS)

The Parkfield-Cholame section of the San Andreas fault, site of an unfulfilled earthquake forecast in 1985, is the best monitored section of the world's most closely watched fault. In 1983, the M = 6.5 Coalinga and M = 6.0 Nuñez events struck 25 km northeast of Parkfield. Seismicity rates climbed for 18 months along the creeping section of the San Andreas north of Parkfield and dropped for 6 years along the locked section to the south. Right-lateral creep also slowed or reversed from Parkfield south. Here we calculate that the Coalinga sequence increased the shear and Coulomb stress on the creeping section, causing the rate of small shocks to rise until the added stress was shed by additional slip. However, the 1983 events decreased the shear and Coulomb stress on the Parkfield segment, causing surface creep and seismicity rates to drop. We use these observations to cast the likelihood of a Parkfield earthquake into an interaction-based probability, which includes both the renewal of stress following the 1966 Parkfield earthquake and the stress transfer from the 1983 Coalinga events. We calculate that the 1983 shocks dropped the 10-year probability of a M ~ 6 Parkfield earthquake by 22% (from 54 ± 22% to 42 ± 23%) and that the probability did not recover until about 1991, when seismicity and creep resumed. Our analysis may thus explain why the Parkfield earthquake did not strike in the 1980s, but not why it was absent in the 1990s. We calculate a 58 ± 17% probability of a M ~ 6 Parkfield earthquake during 2001-2011.
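
One minimal way to sketch an interaction-based probability of this kind is to combine a renewal model for recurrence with a stress-transfer clock change equal to the Coulomb stress change divided by the stressing rate. The distribution, recurrence parameters, and stress numbers below are hypothetical placeholders, not the values used in the study.

    # Sketch: lognormal renewal probability with a permanent clock delay from a
    # negative Coulomb stress change.  All numbers are illustrative assumptions.
    import numpy as np
    from scipy.stats import lognorm

    mean_recurrence = 25.0    # yr (assumed)
    cov = 0.4                 # coefficient of variation (assumed)
    sigma = np.sqrt(np.log(1.0 + cov**2))
    scale = mean_recurrence / np.exp(0.5 * sigma**2)   # so the distribution mean is 25 yr
    renewal = lognorm(s=sigma, scale=scale)

    def cond_prob(t_elapsed, window, clock_change=0.0):
        """P(event within window | quiet so far), with the elapsed time shifted by a
        stress-transfer clock advance (+) or delay (-)."""
        t_eff = max(t_elapsed + clock_change, 0.0)
        return (renewal.cdf(t_eff + window) - renewal.cdf(t_eff)) / renewal.sf(t_eff)

    dCFF, stressing_rate = -0.1, 0.01     # MPa and MPa/yr (assumed): a 10-yr clock delay
    delay = dCFF / stressing_rate
    print("10-yr probability, renewal only     :", round(cond_prob(17.0, 10.0), 2))
    print("10-yr probability, with stress delay:", round(cond_prob(17.0, 10.0, delay), 2))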

Toda, Shinji; Stein, Ross S.

2002-06-01

252

Geophysical setting of the 2000 ML 5.2 Yountville, California, earthquake: Implications for seismic Hazard in Napa Valley, California  

USGS Publications Warehouse

The epicenter of the 2000 ML 5.2 Yountville earthquake was located 5 km west of the surface trace of the West Napa fault, as defined by Helley and Herd (1977). On the basis of the re-examination of geologic data and the analysis of potential field data, the earthquake occurred on a strand of the West Napa fault, the main basin-bounding fault along the west side of Napa Valley. Linear aeromagnetic anomalies and a prominent gravity gradient extend the length of the fault to the latitude of Calistoga, suggesting that this fault may be capable of larger-magnitude earthquakes. Gravity data indicate an ~2-km-deep basin centered on the town of Napa, where damage was concentrated during the Yountville earthquake. The basin most likely played a minor role in enhancing shaking during this event but may lead to enhanced shaking caused by wave trapping during a larger-magnitude earthquake.

Langenheim, V.E.; Graymer, R.W.; Jachens, R.C.

2006-01-01

253

Public Education for Household Mitigation and Preparedness for Earthquakes in California: The Research Base and Program Innovations  

NASA Astrophysics Data System (ADS)

This presentation summarizes the findings from previous research in the social sciences regarding the factors and processes that enhance the effectiveness of public education efforts for household mitigation and preparedness actions for earthquakes. The conclusions from this research base include that the most effective efforts are those that are designed as an ongoing process with multiple channels and types of public communications. Second, an anticipated survey to measure household mitigation and preparedness actions in the State of California is summarized. This survey will measure actual household mitigation and preparedness actions taken, knowledge, perceived risk, and other factors that previous research suggests impact these actions and perceptions; each of these factors is reviewed. The presentation then illustrates how knowledge from previous research will be blended with the information obtained from the planned survey in order to design a state-of-the-art public education campaign in California that maximizes household mitigation and preparedness for earthquakes and mega-earthquakes. Among other things, this requires that government agencies, NGOs, and private sector organizations cooperate to coordinate their efforts to maximize program effectiveness. Finally, how this program might be evaluated to inform program refinements over time is discussed.

Mileti, D. S.

2007-05-01

254

Plotting Earthquakes  

NSDL National Science Digital Library

In this activity, learners discover how to plot earthquakes on a map by exploring recent earthquake activity in California and Nevada. Within this activity, learners also practice using latitudinal and longitudinal lines and make predictions. This detailed lesson plan includes key vocabulary words, background information for educators, extension ideas, and resources.

2012-06-26

255

Westward-derived conglomerates in Moenkopi formation of Southeastern California, and their probable tectonic significance  

SciTech Connect

The upper part of the Moenkopi Formation in the Northern Clark Mountains, Southeastern California, contains conglomerate beds whose clasts comprise igneous, metamorphic, and sedimentary rocks. Metamorphic clasts include foliated granite, meta-arkose, and quartzite, probably derived from older Precambrian basement and younger Precambrian clastic rocks. Volcanic clasts are altered plagioclase-bearing rocks, and sedimentary clasts were derived from Paleozoic miogeoclinal rocks. Paleocurrent data indicate that the clasts had a source to the southwest. An age of late Early or early Middle Triassic has been tentatively assigned to these conglomerates. These conglomerates indicate that Late Permian to Early Triassic deformational events in this part of the orogen affected rocks much farther east than has been previously recognized.

Walker, J.D.; Burchfiel, B.C.; Royden, L.H.

1983-02-01

256

Earthquake-cycle models of the Pacific-North America plate boundary at Point Reyes, California  

NASA Astrophysics Data System (ADS)

At Point Reyes, California, about 36 mm/yr of Pacific-North America relative plate motion is accommodated by (from west to east) the San Andreas, Rodgers Creek, Napa and Green Valley faults. We have developed a suite of viscoelastic earthquake cycle models which take into account the timing and recurrence intervals of large earthquakes on these faults, and are calibrated to the current GPS velocity field. We infer a locking depth of about 12 km for all four faults, consistent with previous analyses of local hypocenter depths (e.g., d'Alessio et al., 2005). Low-viscosity shear zones appear to be required for our models to fit the GPS velocities. In order to fit the high surface velocity gradient across this set of faults, the effective viscosity for the lower crust and mantle must exceed 10^20 Pa s. A modest contrast in effective viscosity of the lower crust and upper mantle across the San Andreas Fault, with higher viscosity values (at least 5 x 10^20 Pa s) to the east, is also indicated. In the region between the Rodgers Creek Fault and the Green Valley Fault, GPS data indicate a higher strain rate than our models can explain. Even after shifting the entire Green Valley Fault slip rate (9 mm/yr) westward to the Napa Fault, the misfit is not eliminated. Double-difference hypocenter data (Waldhauser and Schaff, 2008) suggest the presence of another fault zone between the Napa Fault and the Green Valley Fault, and that all three of these faults dip toward the west. This offsets their deep, creeping extensions several km from their surface traces. A preliminary model with a suitably offset, deep Green Valley Fault extension cuts the weighted residual sum of squares (WRSS) misfit to GPS site velocities by over a factor of two. Since non-vertical fault dips are often missed in seismic studies (e.g. Fuis et al., 2008), creeping shear zones at depth may routinely be offset by several kilometers from their surface traces, unless alternate evidence of their position at depth is available (e.g. Shelly et al., 2009). This may lead to incorrect inferences of material asymmetry, or errors in the attribution of slip rates to closely spaced, active faults.

Vaghri, A.; Hearn, E. H.

2011-12-01

257

In-situ fluid-pressure measurements for earthquake prediction: An example from a deep well at Hi Vista, California  

USGS Publications Warehouse

Short-term earthquake prediction requires sensitive instruments for measuring the small anomalous changes in stress and strain that precede earthquakes. Instruments installed at or near the surface have proven too noisy for measuring anomalies of the size expected to occur, and it is now recognized that even to have the possibility of a reliable earthquake-prediction system will require instruments installed in drill holes at depths sufficient to reduce the background noise to a level below that of the expected premonitory signals. We are conducting experiments to determine the maximum signal-to-noise improvement that can be obtained in drill holes. In a 592 m well in the Mojave Desert near Hi Vista, California, we measured water-level changes with amplitudes greater than 10 cm, induced by earth tides. By removing the effects of barometric pressure and the stress related to earth tides, we have achieved a sensitivity to volumetric strain rates of 10^-9 to 10^-10 per day. Further improvement may be possible, and it appears that a successful earthquake-prediction capability may be achieved with an array of instruments installed in drill holes at depths of about 1 km, assuming that the premonitory strain signals are, in fact, present. © 1985 Birkhäuser Verlag.
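
The correction step described above, removing barometric and tidal effects to expose small strain-related signals, can be sketched as a simple least-squares regression. The synthetic records and coefficients below are placeholders; real processing uses theoretical tides and more careful frequency-domain treatment.

    # Sketch: regress a water-level record on barometric pressure and a tidal proxy,
    # then inspect the residual.  The records below are synthetic placeholders.
    import numpy as np

    rng = np.random.default_rng(3)
    t = np.arange(0.0, 120.0, 1.0 / 24.0)     # 120 days of hourly samples

    baro = 5.0 * np.sin(2 * np.pi * t / 5.0) + rng.normal(0.0, 0.5, t.size)       # hPa (synthetic)
    tide = np.cos(2 * np.pi * t / 0.5175) + 0.5 * np.cos(2 * np.pi * t / 1.0)     # M2 + diurnal proxy
    water = -0.4 * baro + 3.0 * tide + rng.normal(0.0, 0.2, t.size)               # cm (synthetic)

    G = np.column_stack([baro, tide, np.ones_like(t)])    # regressors: pressure, tide, offset
    coef, *_ = np.linalg.lstsq(G, water, rcond=None)
    residual = water - G @ coef                           # where strain signals would be sought

    print("fitted barometric response:", round(coef[0], 2), "cm/hPa")
    print("residual standard deviation:", round(residual.std(), 2), "cm")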

Healy, J.H.; Urban, T.C.

1985-01-01

258

LLNL-Generated Content for the California Academy of Sciences, Morrison Planetarium Full-Dome Show: Earthquake  

SciTech Connect

The California Academy of Sciences (CAS) Morrison Planetarium is producing a 'full-dome' planetarium show on earthquakes and asked LLNL to produce content for the show. Specifically the show features numerical ground motion simulations of the M 7.9 1906 San Francisco and a possible future M 7.05 Hayward fault scenario earthquake. The show also features concepts of plate tectonics and mantle convection using images from LLNL's G3D global seismic tomography. This document describes the data that was provided to the CAS in support of production of the 'Earthquake' show. The CAS is located in Golden Gate Park, San Francisco and hosts over 1.6 million visitors. The Morrison Planetarium, within the CAS, is the largest all digital planetarium in the world. It features a 75-foot diameter spherical section projection screen tilted at a 30-degree angle. Six projectors cover the entire field of view and give a three-dimensional immersive experience. CAS shows strive to use scientifically accurate digital data in their productions. The show, entitled simply 'Earthquake', will debut on 26 May 2012. They are working on graphics and animations based on the same data sets for display on LLNL powerwalls and flat-screens as well as for public release.

Rodgers, A J; Petersson, N A; Morency, C E; Simmons, N A; Sjogreen, B

2012-01-23

259

Potential Effects of a Scenario Earthquake on the Economy of Southern California: Labor Market Exposure and Sensitivity Analysis to a Magnitude 7.8 Earthquake  

USGS Publications Warehouse

The Multi-Hazards Demonstration Project (MHDP) is a collaboration between the U.S. Geological Survey (USGS) and various partners from the public and private sectors and academia, meant to improve Southern California's resiliency to natural hazards (Jones and others, 2007). In support of the MHDP objectives, the ShakeOut Scenario was developed. It describes a magnitude 7.8 (M7.8) earthquake along the southernmost 300 kilometers (200 miles) of the San Andreas Fault, identified by geoscientists as a plausible event that will cause moderate to strong shaking over much of the eight-county (Imperial, Kern, Los Angeles, Orange, Riverside, San Bernardino, San Diego, and Ventura) Southern California region. This report contains an exposure and sensitivity analysis of economic Super Sectors in terms of labor and employment statistics. Exposure is measured as the absolute counts of labor market variables anticipated to experience each level of Instrumental Intensity (a proxy measure of damage). Sensitivity is the percentage of the exposure of each Super Sector to each Instrumental Intensity level. The analysis concerns the direct effect of the scenario earthquake on economic sectors and provides a baseline for the indirect and interactive analysis of an input-output model of the regional economy. The analysis is inspired by the Bureau of Labor Statistics (BLS) report that analyzed the labor market losses (exposure) of a M6.9 earthquake on the Hayward fault by overlaying geocoded labor market data on Instrumental Intensity values. The method used here is influenced by the ZIP-code-level data provided by the California Employment Development Department (CA EDD), which requires the assignment of Instrumental Intensities to ZIP codes. The ZIP-code-level labor market data includes the number of business establishments, employees, and quarterly payroll categorized by the North American Industry Classification System. According to the analysis results, nearly 225,000 business establishments, or 44 percent of all establishments, would experience Instrumental Intensities between VII (7) and X (10). This represents more than 4 million employees earning over $45 billion in quarterly payroll. Over 57,000 of these establishments, employing over 1 million employees earning over $10 billion in quarterly payroll, would experience Instrumental Intensities of IX (9) or X (10). Based upon absolute counts and percentages, the Trade, Transportation, and Utilities Super Sector and the Manufacturing Super Sector are estimated to have the greatest exposure and sensitivity respectively. The Information and the Natural Resources and Mining Super Sectors are estimated to be the least impacted. Areas estimated to experience an Instrumental Intensity of X (10) account for approximately 3 percent of the region's labor market.

Sherrouse, Benson C.; Hester, David J.; Wein, Anne M.

2008-01-01

260

Potential Effects of a Scenario Earthquake on the Economy of Southern California: Intraregional Commuter, Worker, and Earnings Flow Analysis  

USGS Publications Warehouse

The Multi-Hazards Demonstration Project (MHDP) is a collaboration between the U.S. Geological Survey (USGS) and various partners from the public and private sectors and academia, meant to improve Southern California's resiliency to natural hazards (Jones and others, 2007). In support of the MHDP objectives, the ShakeOut Scenario was developed. It describes a magnitude 7.8 (M7.8) earthquake along the southernmost 300 kilometers (200 miles) of the San Andreas Fault, identified by geoscientists as a plausible event that will cause moderate to strong shaking over much of the eight-county (Imperial, Kern, Los Angeles, Orange, Riverside, San Bernardino, San Diego, and Ventura) Southern California region (Jones and others, 2008). This report uses selected datasets from the U.S. Census Bureau and the State of California's Employment Development Department to develop preliminary estimates of the number and spatial distribution of commuters who cross the San Andreas Fault and to characterize these commuters by the industries in which they work and their total earnings. The analysis concerns the relative exposure of the region's economy to the effects of the earthquake as described by the location, volume, and earnings of those commuters who work in each of the region's economic sectors. It is anticipated that damage to transportation corridors traversing the fault would lead to at least short-term disruptions in the ability of commuters to travel between their places of residence and work.

Sherrouse, Benson C.; Hester, David J.

2008-01-01

261

Probability of San Francisco quake increased  

NASA Astrophysics Data System (ADS)

The probability that a large earthquake (magnitude 7.0 or greater) will occur in the San Francisco Bay area in the next 30 years is 67%, according to a recent report by the U.S. Geological Survey. This is a substantial increase from the 50% probability estimated in a similar report released by the USGS in 1988.The new report also says that scientists estimate the probability of a M-7.0 or larger quake in the Bay area in the next 20 years is 50%, compared to 30% from the 1988 report. For the next 10 years, probability is 33%, compared with 20% in the 1988 report. The new probability estimates for the nine-county Bay area are the result of a 6-month study by the California Working Group on Earthquake Probabilities. The group consists of a 12-person panel of earthquake specialists from the USGS, the University of California, the California Institute of Technology, the University of Southern California, Stanford University, the Lamont-Doherty Geologic Observatory at Columbia University, and the Pacific Gas and Electric Company.

262

Borehole velocity measurements at five sites that recorded the Cape Mendocino, California earthquake of 25 April, 1992  

USGS Publications Warehouse

The U.S. Geological Survey (USGS), as part of an ongoing program to acquire seismic velocity and geologic data at locations that recorded strong-ground motions during earthquakes, has investigated five sites in the Fortuna, California region (Figure 1). We selected drill sites at strong-motion stations that recorded high accelerations (Table 1) from the Cape Mendocino earthquake (M 7.0) of 25 April 1992 (Oppenheimer et al., 1993). The boreholes were drilled to a nominal depth of 95 meters (310 ft) and cased with schedule 80 pvc-casing grouted in place at each location. S-wave and P-wave data were acquired at each site using a surface source and a borehole three-component geophone. This report contains the velocity models interpreted from the borehole data and gives reference to locations and peak accelerations at the selected strong-motion stations.

Gibbs, James F.; Tinsley, John C.; Boore, David M.

2002-01-01

263

CISN ShakeAlert: Three Years of Comparative Real-Time Earthquake Early Warning Testing in California  

NASA Astrophysics Data System (ADS)

The California Integrated Seismic Network (CISN) recently concluded a three-year project (August 2006-July 2009) aimed at the implementation, real-time testing, and comparative performance evaluation of three participating earthquake early warning (EEW) algorithms: 1) the Tau-C/P-d onsite algorithm developed by the California Institute of Technology, 2) the ElarmS algorithm developed by UC Berkeley, and 3) the Virtual Seismologist (VS) algorithm developed by the Swiss Seismological Service at ETH Zurich. These 3 EEW algorithms were installed and tested, and continue to run in real-time, at the Southern California Seismic Network, the Berkeley Digital Seismic Network, and the USGS Menlo Park network. The OnSite algorithm provides single-station magnitude estimates and estimates peak ground velocity at a given station. ElarmS and VS both provide magnitude and location estimates, as well as estimates of the geographic distribution of peak ground shaking. Over the last three years, these EEW algorithms submitted real-time and automatic non-interactive offline event reports to the Southern California Earthquake Center (SCEC) EEW Testing Center, which independently evaluated algorithm performance relative to the ANSS earthquake catalogue and observed ground motion datasets. We quantify the performance of these participating algorithms in terms of the accuracy of magnitude, location, and peak ground motion estimation, as well as the speed at which the algorithms provide information. Based on the derived performance characteristics, we infer how a prototype system based on the three algorithms might operate given alternative conditions, such as shorter telemetry delays, faster processing times, or higher station densities. The 2006-2009 CISN-EEW project demonstrated the feasibility and potential benefits of EEW in California. A new USGS-funded effort is underway to develop key components of CISN ShakeAlert, a prototype EEW system that could potentially be implemented in California. These key components include: 1) integration of the three algorithms into a single processing thread capable of providing alerts, 2) development of partnerships with key test users, and 3) development of protocols and procedures in collaboration with the test users that will govern the form and release of EEW information, if such a system is built.

Cua, G. B.; Allen, R. M.; Boese, M.; Brown, H.; Given, D.; Fischer, M.; Hauksson, E.; Heaton, T. H.; Hellweg, M.; Jordan, T. H.; Khainovski, O.; Maechling, P. J.; Neuhauser, D. S.; Oppenheimer, D. H.; Solanki, K.

2009-12-01

264

Instrumental intensity distribution for the Hector Mine, California, and the Chi-Chi, Taiwan, earthquakes: Comparison of two methods  

USGS Publications Warehouse

We compare two methods of seismic-intensity estimation from ground-motion records for two recent strong earthquakes: the 1999 (M 7.1) Hector Mine, California, and the 1999 (M 7.6) Chi-Chi, Taiwan. The first technique utilizes the peak ground acceleration (PGA) and velocity (PGV), and it is used for rapid generation of the instrumental intensity map in California. The other method is based on the revised relationships between intensity and Fourier amplitude spectrum (FAS). The results of the two methods are compared with independently observed data and with each other. For the case of the Hector Mine earthquake, the calculated intensities in general agree with the observed values. For the case of the Chi-Chi earthquake, the areas of maximum calculated intensity correspond to the areas of the greatest damage and highest number of fatalities. However, the FAS method produces higher-intensity values than those of the peak amplitude method. The specific features of ground-motion excitation during the large, shallow, thrust earthquake may be considered a reason for the discrepancy. The use of PGA and PGV is simple; however, the use of FAS provides a natural consideration of site amplification by means of generalized or site-specific spectral ratios. Because the calculation of seismic-intensity maps requires rapid processing of data from a large network, it is very practical to generate a "first-order" map from the recorded peak motions. Then, a "second-order" map may be compiled using an amplitude-spectra method on the basis of available records and numerical modeling of the site-dependent spectra for the regions of sparse station spacing.

Sokolov, V.; Wald, D.J.

2002-01-01

265

Behavior of Repeating Earthquake Sequences in Central California and the Implications for Subsurface Fault Creep  

SciTech Connect

Repeating earthquakes (REs) are sequences of events that have nearly identical waveforms and are interpreted to represent fault asperities driven to failure by loading from aseismic creep on the surrounding fault surface at depth. We investigate the occurrence of these REs along faults in central California to determine which faults exhibit creep and the spatio-temporal distribution of this creep. At the juncture of the San Andreas and southern Calaveras-Paicines faults, both faults as well as a smaller secondary fault, the Quien Sabe fault, are observed to produce REs over the observation period of March 1984-May 2005. REs in this area reflect a heterogeneous creep distribution along the fault plane with significant variations in time. Cumulative slip over the observation period at individual sequence locations is determined to range from 5.5-58.2 cm on the San Andreas fault, 4.8-14.1 cm on the southern Calaveras-Paicines fault, and 4.9-24.8 cm on the Quien Sabe fault. Creep at depth appears to mimic the behaviors seen of creep on the surface in that evidence of steady slip, triggered slip, and episodic slip phenomena are also observed in the RE sequences. For comparison, we investigate the occurrence of REs west of the San Andreas fault within the southern Coast Range. Events within these RE sequences only occurred minutes to weeks apart from each other and then did not repeat again over the observation period, suggesting that REs in this area are not produced by steady aseismic creep of the surrounding fault surface.

Templeton, D C; Nadeau, R; Burgmann, R

2007-07-09

266

Bulletin of the Seismological Society of America, Vol. 86, No. 1B, pp. S49-S70, February 1996: The Slip History of the 1994 Northridge, California, Earthquake Determined  

E-print Network

... States since the great 1906 San Francisco earthquake (USGS and SCEC, 1994). Peak acceleration and velocity ... a rupture model of the Northridge earthquake, determined from the joint inversion of near-source strong-motion, teleseismic ...

Greer, Julia R.

267

The Magnitude 6.7 Northridge, California, Earthquake of 17 January 1994  

NASA Astrophysics Data System (ADS)

The most costly American earthquake since 1906 struck Los Angeles on 17 January 1994. The magnitude 6.7 Northridge earthquake resulted from more than 3 meters of reverse slip on a 15-kilometer-long south-dipping thrust fault that raised the Santa Susana mountains by as much as 70 centimeters. The fault appears to be truncated by the fault that broke in the 1971 San Fernando earthquake at a depth of 8 kilometers. Of these two events, the Northridge earthquake caused many times more damage, primarily because its causative fault is directly under the city. Many types of structures were damaged, but the fracture of welds in steel-frame buildings was the greatest surprise. The Northridge earthquake emphasizes the hazard posed to Los Angeles by concealed thrust faults and the potential for strong ground shaking in moderate earthquakes.

Scientists of the U.S. Geological Survey; The Southern California Earthquake Center

1994-10-01

268

Cross-sections and maps showing double-difference relocated earthquakes from 1984-2000 along the Hayward and Calaveras faults, California  

USGS Publications Warehouse

We present cross-section and map views of earthquakes that occurred from 1984 to 2000 in the vicinity of the Hayward and Calaveras faults in the San Francisco Bay region, California. These earthquakes came from a catalog of events relocated using the double-difference technique, which provides superior relative locations of nearby events. As a result, structures such as fault surfaces and alignments of events along these surfaces are more sharply defined than in previous catalogs.
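
For readers unfamiliar with the double-difference technique mentioned above, its defining condition (following Waldhauser and Ellsworth, 2000) can be summarized as follows; the notation is standard rather than taken from this report.

    dr_k^{ij} = \left(t_k^i - t_k^j\right)^{\mathrm{obs}} - \left(t_k^i - t_k^j\right)^{\mathrm{cal}}
              = \frac{\partial t_k^i}{\partial \mathbf{m}^i}\,\Delta \mathbf{m}^i
                - \frac{\partial t_k^j}{\partial \mathbf{m}^j}\,\Delta \mathbf{m}^j ,

where t_k^i is the travel time of event i at station k and \Delta\mathbf{m} = (\Delta x, \Delta y, \Delta z, \Delta\tau) are perturbations to each hypocenter and origin time. Minimizing all residuals dr_k^{ij} for pairs of nearby events in a least-squares sense largely cancels common path effects, which is why relative locations, and hence fault surfaces and event alignments, come out more sharply than in standard catalogs.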

Simpson, Robert W.; Graymer, Russell W.; Jachens, Robert C.; Ponce, David A.; Wentworth, Carl M.

2004-01-01

269

Earthquake-induced structures in sediments of Van Norman Lake, San Fernando, California  

USGS Publications Warehouse

The 9 February 1971 earthquake in the San Fernando Valley damaged the Lower Van Norman Dam severely enough to warrant draining the reservoir. In March 1972 the sediment deposited on the reservoir floor was examined to determine whether the 1971 earthquake had induced sediment deformation and, if so, what types. A zone of deformational structures characterized by small-scale loads and slightly recumbent folds associated with the 1971 earthquake was discovered, in addition to two older zones of load structures. Each of the zones has been tentatively correlated with an historic earthquake.

Sims, J.D.

1973-01-01

270

Damage and restoration of geodetic infrastructure caused by the 1994 Northridge, California, earthquake  

USGS Publications Warehouse

We seek to restore the integrity of the geodetic network in the San Fernando, Simi, and Santa Clarita Valleys and in the northern Los Angeles Basin by remeasurement of the network and identification of bench marks (BMs) that experienced non-tectonic displacements associated with the Northridge earthquake. We then use the observed displacement of BMs in the network to portray or predict the permanent vertical and horizontal deformation associated with the 1994 Northridge earthquake throughout the area, including sites where we lack geodetic measurements. To accomplish this, we find the fault geometry and earthquake slip that are most compatible with the geodetic and independent seismic observations of the earthquake. We then use that fault model to predict the deformation everywhere at the earth's surface, both at locations where geodetic observations exist and also where they are absent. We compare displacements predicted for a large number of numerical models of the earthquake faulting to the coseismic displacements, treating the earthquake fault as a cut or discontinuity embedded in a stiff elastic solid. This comparison is made after non-tectonic deformation has been removed from the measured elevation changes. The fault slip produces strain in the medium and deforms the ground surface. The model compatible with seismic observations that best fits the geodetic data within their uncertainties is selected. The acceptable model fault bisects the mainshock focus, and the earthquake size (magnitude) is compatible with the size measured seismically. Our fault model was used to identify geodetic monuments on engineered structures that were anomalously displaced by the earthquake.

Hodgkinson, Kathleen M.; Stein, Ross S.; Hudnut, Kenneth W.; Satalich, Jay; Richards, John H.

1996-01-01

271

A model of earthquake triggering probabilities and application to dynamic deformations constrained by ground motion observations  

USGS Publications Warehouse

We have used observations from Felzer and Brodsky (2006) of the variation of linear aftershock densities (i.e., aftershocks per unit length) with the magnitude of and distance from the main shock fault to derive constraints on how the probability of a main shock triggering a single aftershock at a point, P(r, D), varies as a function of distance, r, and main shock rupture dimension, D. We find that P(r, D) becomes independent of D as the triggering fault is approached. When r ≫ D, P(r, D) scales as D^m where m ≈ 2 and decays with distance approximately as r^-n with n = 2, with a possible change to r^-(n-1) at r > h, where h is the closest distance between the fault and the boundaries of the seismogenic zone. These constraints may be used to test hypotheses about the types of deformations and mechanisms that trigger aftershocks. We illustrate this using dynamic deformations (i.e., radiated seismic waves) and a posited proportionality with P(r, D). Deformation characteristics examined include peak displacements, peak accelerations and velocities (proportional to strain rates and strains, respectively), and two measures that account for cumulative deformations. Our model indicates that either peak strains alone or strain rates averaged over the duration of rupture may be responsible for aftershock triggering.
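
One way to summarize the scaling constraints stated above is the piecewise relation below; the symbols are those defined in the abstract, and the exact crossover behavior between regimes is not specified here.

    P(r, D) \;\propto\;
    \begin{cases}
      \text{independent of } D, & r \ll D,\\
      D^{m}\, r^{-n}, \quad m \approx 2,\; n \approx 2, & r \gg D,\ r \le h,\\
      D^{m}\, r^{-(n-1)}, & r > h.
    \end{cases}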

Gomberg, J.; Felzer, K.

2008-01-01

272

Paleoseismology of the southern Panamint Valley fault: Implications for regional earthquake occurrence and seismic hazard in southern California  

NASA Astrophysics Data System (ADS)

Data from the southern Panamint Valley fault (PVF) reveal evidence of at least four surface ruptures during late Holocene time (0.33-0.48 ka, 0.9-3.0 ka, 3.3-3.6 ka, and >4.1 ka). These paleo-earthquake ages indicate that the southern PVF has ruptured at least once and possibly twice during the ongoing (~1.5 ka) seismic cluster in the Mojave section of the eastern California shear zone (ECSZ). The most recent event (MRE) on the PVF is also similar in age to the 1872 Owens Valley earthquake and the geomorphically youthful MRE on the Death Valley fault. The timing of the three oldest events at our site shows that the PVF ruptured at least once and possibly three times during the well-defined 2-5 ka seismic lull in the Mojave section of the ECSZ. Interestingly, the 3.3-3.6 ka age of Event 3 overlaps with the 3.3-3.8 ka age of the penultimate (i.e., pre-1872) rupture on the central Owens Valley fault. These new PVF data support the notion that earthquake occurrence in the ECSZ may be spatially and temporally complex, with earthquake clusters occurring in different regions at different times. Coulomb failure function modeling of the Panamint Valley and Garlock faults reveals significant stress interactions between these two faults that may influence future earthquake occurrence. Specifically, our models suggest a possible rupture sequence whereby an event on the southern Panamint Valley fault can lead to the potential triggering of an event on the Garlock fault, which in turn could trigger the Mojave section of the San Andreas Fault.

McAuliffe, Lee J.; Dolan, James F.; Kirby, Eric; Rollins, Chris; Haravitch, Ben; Alm, Steve; Rittenour, Tammy M.

2013-09-01

273

Probability of introduction of exotic strains of bluetongue virus into the US and into California through importation of infected cattle.  

PubMed

Strategies designed to minimize the probability of bluetongue virus (BTV) introduction to new areas should be based on a quantitative assessment of the probability of actually establishing the virus once it is introduced. The risk of introducing a new strain of bluetongue virus into a region depends on the number of viremic animals that enter and the competency of local vectors to transmit the virus. We used Monte Carlo simulation to model the probability of introducing BTV into California, USA, and the US through importation of cattle. Records of cattle and calf imports into California and the US were obtained, as was seroprevalence information from the exporting countries. A simulation model was constructed to evaluate the probability of importing either a viremic PCR-negative animal after 14-day quarantine, a c-ELISA BTV-antibody-negative animal after 28-day quarantine, or an untested viremic animal after 100-day quarantine into California and into the US. We found that for animals imported to the US, the simulated (best to worst scenarios) median percentage that tested positive for BTV-antibody ranged from 5.4 to 7.2%, while for the subset imported to California, the simulated median percentage that tested positive for BTV-antibody ranged from 20.9 to 78.9%. Using PCR, for animals imported to the US these values were 71.8-85.3%, and for those imported to California, the simulated median percentage that tested positive ranged from 74.3 to 92.4%. The probability that an imported animal was BTV-viremic is very low regardless of the scenario selected (median probability=0.0%). The probability of introducing an exotic strain of BTV into California or the US by importing infected cattle was remote, and the current Office International des Epizooties (OIE) recommendation of either a final PCR test performed 14 days after entry into quarantine, a c-ELISA performed 28 days after entry into quarantine, or a 100-day quarantine with no testing requirement was adequate to protect cattle in the US and California from an exotic strain of BTV. PMID:15579336
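
A stripped-down Monte Carlo sketch of the kind of import-risk question posed above is shown below. The prevalence, viremia, and test-sensitivity values are hypothetical placeholders, not the distributions fitted in the study.

    # Monte Carlo sketch: how often does a viremic animal pass a quarantine-plus-PCR
    # protocol in a simulated shipment?  All rates below are assumptions.
    import numpy as np

    rng = np.random.default_rng(42)
    n_iter, herd_size = 10_000, 100

    seroprev        = 0.20   # probability an imported animal was ever infected (assumed)
    p_still_viremic = 0.05   # probability an infected animal is still viremic at export (assumed)
    pcr_sensitivity = 0.95   # probability PCR detects a viremic animal after quarantine (assumed)

    escapes = np.zeros(n_iter)
    for i in range(n_iter):
        infected = rng.random(herd_size) < seroprev
        viremic = infected & (rng.random(herd_size) < p_still_viremic)
        detected = viremic & (rng.random(herd_size) < pcr_sensitivity)
        escapes[i] = np.sum(viremic & ~detected)      # undetected viremic animals

    print("median undetected viremic animals per 100 head:", np.median(escapes))
    print("P(at least one slips through):", np.mean(escapes > 0))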

Hoar, Bruce R; Carpenter, Tim E; Singer, Randall S; Gardner, Ian A

2004-12-15

274

Ground-water-level monitoring for earthquake prediction; a progress report based on data collected in Southern California, 1976-79  

USGS Publications Warehouse

The U.S. Geological Survey is conducting a research program to determine if groundwater-level measurements can be used for earthquake prediction. Earlier studies suggest that water levels in wells may be responsive to small strains on the order of 10^-8 to 10^-10 (dimensionless). Water-level data being collected in the area of the southern California uplift show response to earthquakes and other natural and manmade effects. The data are presently (1979) being made ready for computer analysis. The completed analysis may indicate the presence of precursory earthquake information. (USGS)

Moyle, W.R., Jr.

1980-01-01

275

The Magnitude 6.7 Northridge, California, Earthquake of January 17, 1994  

NASA Technical Reports Server (NTRS)

The most damaging earthquake in the United States since 1906 struck northern Los Angeles on January 17, 1994. The magnitude 6.7 Northridge earthquake produced a maximum of more than 3 meters of reverse (up-dip) slip on a south-dipping thrust fault rooted under the San Fernando Valley and projecting north under the Santa Susana Mountains.

Donnellan, A.

1994-01-01

276

Investigation of ionospheric electron content variations before earthquakes in southern California, 2003–2004  

Microsoft Academic Search

It has been proposed that earthquakes are preceded by electromagnetic signals detectable from ground- and space-based measurements. Ionospheric anomalies, such as variations in the electron density a few days before earthquakes, are one of the precursory signals proposed. Since Global Positioning System (GPS) data can be used to measure the ionospheric total electron content (TEC), the technique has received attention

Thomas Dautermann; Eric Calais; Jennifer Haase; James Garrison

2007-01-01

277

Investigation of Ionospheric Electron Content Variations Before Earthquakes in Southern California, 2003-2004  

Microsoft Academic Search

It has been proposed that earthquakes are preceded by electromagnetic signals detectable from ground- and space-based measurements. Ionospheric anomalies, such as variations in the electron density a few days before earthquakes, are one of the precursory signals proposed. Since Global Positioning System (GPS) data can be used to measure the ionospheric Total Electron Content (TEC), the technique has received attention

T. Dautermann; E. Calais; J. Haase; J. L. Garrison

2006-01-01

278

Chapter C. The Loma Prieta, California, Earthquake of October 17, 1989 - Preseismic Observations  

USGS Publications Warehouse

The October 17, 1989, Loma Prieta, Calif., Ms=7.1 earthquake provided the first opportunity in the history of fault monitoring in the United States to gather multidisciplinary preearthquake data in the near field of an M=7 earthquake. The data obtained include observations on seismicity, continuous strain, long-term ground displacement, magnetic field, and hydrology. The papers in this chapter describe these data, their implications for fault-failure mechanisms, the scale of prerupture nucleation, and earthquake prediction in general. Of the 10 papers presented here, about half identify preearthquake anomalies in the data, but some of these results are equivocal. Seismicity in the Loma Prieta region during the 20 years leading up to the earthquake was unremarkable. In retrospect, however, it is apparent that the principal southwest-dipping segment of the subsequent Loma Prieta rupture was virtually aseismic during this period. Two M=5 earthquakes did occur near Lake Elsman near the junction of the Sargent and San Andreas faults within 2.5 and 15 months of, and 10 km to the north of, the Loma Prieta epicenter. Although these earthquakes were not on the subsequent rupture plane of the Loma Prieta earthquake and other M=5 earthquakes occurred in the preceding 25 years, it is now generally accepted that these events were, in some way, foreshocks to the main event.

Johnston, Malcolm J. S., (Edited By)

1993-01-01

279

Rheological controls on fault loading rates in northern California following the 1906 San Francisco earthquake  

Microsoft Academic Search

To estimate seismic hazard in complex, multiple fault systems, it is necessary to understand how stresses within the system are transferred. Accurate estimates of loading rate due to far-field tectonic motion and postseismic transients following large earthquakes are required. Using plausible lower crustal rheologies constrained by observations following the 1906 San Francisco earthquake, I show that postseismic relaxation may play

S. J. Kenner

2004-01-01

280

Offshore and onshore liquefaction at Moss Landing spit, central California - result of the October 17, 1989, Loma Prieta earthquake  

SciTech Connect

As a result of the October 17, 1989, Loma Prieta (Santa Cruz Mountains, California) earthquake, liquefaction of the fluvial, estuarine, eolian, and beach sediments under a sand spit destroyed the Moss Landing Marine Laboratories and damaged other structures and utilities. Initial studies suggested that the liquefaction was a local phenomenon. More detailed offshore investigations, however, indicate that it occurred over a large area (maximum 8 km²) during or shortly after the earthquake with movement of unconsolidated sediment toward and into the head of Monterey submarine canyon. This conclusion is supported by side-scan sonographs, high-resolution seismic-reflection and bathymetric profiles, onshore and sea-floor photographs, and underwater video tapes. Many distinct lobate features were identified on the shallow shelf. These features almost certainly were the result of the October 17 earthquake; they were subsequently destroyed by winter storms. In addition, fresh slump scars and recently dislodged mud debris were found on the upper, southern wall of Monterey submarine canyon.

Greene, H.G.; Chase, T.E.; Hicks, K.R. (Geological Survey, Menlo Park, CA (United States)); Gardner-Taggart, J.; Ledbetter, M.T.; Barminski, R. (Moss Landing Marine Labs., CA (United States)); Baxter, C. (Monterey Bay Aquarium Research Inst., CA (United States))

1991-09-01

281

Comments on baseline correction of digital strong-motion data: Examples from the 1999 Hector Mine, California, earthquake  

USGS Publications Warehouse

Residual displacements for large earthquakes can sometimes be determined from recordings on modern digital instruments, but baseline offsets of unknown origin make it difficult in many cases to do so. To recover the residual displacement, we suggest tailoring a correction scheme by studying the character of the velocity obtained by integration of zeroth-order-corrected acceleration and then seeing if the residual displacements are stable when the various parameters in the particular correction scheme are varied. For many seismological and engineering purposes, however, the residual displacements are of lesser importance than ground motions at periods less than about 20 sec. These ground motions are often recoverable with simple baseline correction and low-cut filtering. In this largely empirical study, we illustrate the consequences of various correction schemes, drawing primarily from digital recordings of the 1999 Hector Mine, California, earthquake. We show that with simple processing the displacement waveforms for this event are very similar for stations separated by as much as 20 km. We also show that a strong pulse on the transverse component was radiated from the Hector Mine earthquake and propagated with little distortion to distances exceeding 170 km; this pulse leads to large response spectral amplitudes around 10 sec.
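
The "simple processing" referred to above can be sketched as follows: remove the pre-event mean from the acceleration, integrate to velocity, fit and remove a late-time baseline, and integrate again. The record is synthetic, the fitting window is an arbitrary choice, and this is only one of many possible correction schemes, not the specific scheme recommended by the authors.

    # Sketch of a zeroth-order correction plus a velocity baseline fit on a synthetic record.
    import numpy as np
    from scipy.integrate import cumulative_trapezoid

    dt = 0.01
    t = np.arange(0.0, 60.0, dt)
    shaking = np.where(t >= 10.0,
                       0.2 * np.sin(2 * np.pi * 0.5 * (t - 10.0)) * np.exp(-0.1 * (t - 10.0)),
                       0.0)                              # toy ground acceleration, m/s^2
    offset = np.where(t >= 12.0, 2e-3, 0.0)              # small baseline shift appearing during shaking
    acc = shaking + offset

    acc0 = acc - acc[t < 10.0].mean()                    # zeroth-order correction (pre-event mean)
    vel = cumulative_trapezoid(acc0, t, initial=0.0)     # velocity acquires a spurious linear trend

    # Fit a line to the late, post-shaking velocity and remove it; varying this window is one
    # way to check whether the resulting residual displacement is stable.  (Real schemes
    # typically apply the correction only after an estimated offset time.)
    late = t > 40.0
    p = np.polyfit(t[late], vel[late], 1)
    vel_corr = vel - np.polyval(p, t)

    disp = cumulative_trapezoid(vel_corr, t, initial=0.0)   # velocity -> displacement
    print("residual displacement estimate:", round(disp[-1], 4), "m")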

Boore, D.M.; Stephens, C.D.; Joyner, W.B.

2002-01-01

282

Geodetic slip rate for the eastern California shear zone and the recurrence time of Mojave desert earthquakes  

USGS Publications Warehouse

Where the San Andreas fault passes along the southwestern margin of the Mojave desert, it exhibits a large change in trend, and the deformation associated with the Pacific/North American plate boundary is distributed broadly over a complex shear zone. The importance of understanding the partitioning of strain across this region, especially to the east of the Mojave segment of the San Andreas in a region known as the eastern California shear zone (ECSZ), was highlighted by the occurrence (on 28 June 1992) of the magnitude 7.3 Landers earthquake in this zone. Here we use geodetic observations in the central Mojave desert to obtain new estimates for the rate and distribution of strain across a segment of the ECSZ, and to determine a coseismic strain drop of ~770 μrad for the Landers earthquake. From these results we infer a strain energy recharge time of 3,500-5,000 yr for a Landers-type earthquake and a slip rate of ~12 mm yr^-1 across the faults of the central Mojave. The latter estimate implies that a greater fraction of plate motion than heretofore inferred from geodetic data is accommodated across the ECSZ.
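
The recharge-time logic in the abstract is simple enough to state as arithmetic: divide the coseismic strain drop by the interseismic strain rate. The shear-zone width used below to convert the ~12 mm/yr rate into a strain rate is an assumption for illustration, not a value taken from the study.

    # Recharge time ~ coseismic strain drop / interseismic strain rate.
    strain_drop = 770e-6     # ~770 microradians, from the abstract
    slip_rate = 12e-3        # m/yr across the central Mojave faults, from the abstract
    zone_width = 80e3        # m, assumed width of the shear zone accommodating that motion

    strain_rate = slip_rate / zone_width     # shear strain per year (~1.5e-7 /yr)
    print("recharge time:", round(strain_drop / strain_rate), "yr")   # roughly 5,000 yr

With this placeholder width the result falls near the upper end of the 3,500-5,000 yr range quoted above, but that agreement depends entirely on the assumed width.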

Sauber, J.; Thatcher, W.; Solomon, S.C.; Lisowski, M.

1994-01-01

283

Probability of detecting perchlorate under natural conditions in deep groundwater in California and the Southwestern United States  

USGS Publications Warehouse

We use data from 1626 groundwater samples collected in California, primarily from public drinking water supply wells, to investigate the distribution of perchlorate in deep groundwater under natural conditions. The wells were sampled for the California Groundwater Ambient Monitoring and Assessment Priority Basin Project. We develop a logistic regression model for predicting probabilities of detecting perchlorate at concentrations greater than multiple threshold concentrations as a function of climate (represented by an aridity index) and potential anthropogenic contributions of perchlorate (quantified as an anthropogenic score, AS). AS is a composite categorical variable including terms for nitrate, pesticides, and volatile organic compounds. Incorporating water-quality parameters in AS permits identification of perturbation of natural occurrence patterns by flushing of natural perchlorate salts from unsaturated zones by irrigation recharge as well as addition of perchlorate from industrial and agricultural sources. The data and model results indicate low concentrations (0.1-0.5 μg/L) of perchlorate occur under natural conditions in groundwater across a wide range of climates, beyond the arid to semiarid climates in which they mostly have been previously reported. The probability of detecting perchlorate at concentrations greater than 0.1 μg/L under natural conditions ranges from 50-70% in semiarid to arid regions of California and the Southwestern United States to 5-15% in the wettest regions sampled (the Northern California coast). The probability of concentrations above 1 μg/L under natural conditions is low (generally <3%).
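
A minimal sketch of the logistic-regression form described above, with detection probability driven by an aridity index and a categorical anthropogenic score (AS), is shown below. The coefficients are made-up placeholders chosen only to mimic the qualitative behavior reported (drier climate and higher AS give higher probability); they are not the fitted values from the study.

    # Hypothetical logistic model: P(perchlorate above a threshold) vs aridity and AS.
    import numpy as np

    def p_detect(aridity_index, anthro_score, b0=0.5, b_ai=-2.3, b_as=0.8):
        """Aridity index here taken as precipitation/PET, so a lower value means a drier site."""
        z = b0 + b_ai * aridity_index + b_as * anthro_score
        return 1.0 / (1.0 + np.exp(-z))

    for ai, label in [(0.05, "arid"), (0.4, "semiarid"), (1.2, "wet coastal")]:
        print(f"{label:12s} AS=0: {p_detect(ai, 0):.2f}   AS=2: {p_detect(ai, 2):.2f}")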

Fram, Miranda S.; Belitz, Kenneth

2011-01-01

284

Earthquake deformation cycle on the San Andreas Fault near Parkfield, California  

NASA Astrophysics Data System (ADS)

Six moderate earthquakes are thought to have ruptured the Parkfield segment of the San Andreas fault since 1857. The similar characteristics of the three seismically recorded events, together with the quasi-regular 22-year recurrence interval, have led to the forecast of a similar event in 1988 ± 5 years. In this study the potential for the hypothesized earthquake is assessed by estimating the time required for interseismic straining to recover the strain energy released by the most recent, June 1966, Parkfield earthquake. For a simple model of the earthquake cycle this interval is proportional to the ratio of the 1966 seismic moment to the interseismic moment deficit rate. Baseline lengths measured before and after the 1966 earthquake are inverted to estimate the magnitude and distribution of the combined coseismic and transient postseismic slip. Although the geodetic data determine only broad spatial averages of the fault slip, they do require that slip at depth exceeded the surface offsets measured in 1966. Inversions with extreme prior models are calculated to assess the range of seismic moment and moment deficit rates consistent with the available geodetic data. Inversions assuming only that the slip distributions are to some degree smooth yield strain energy recovery intervals of 14-25 years. Extreme values range from as little as 6 years to as much as 29 years. In comparison, historic recurrence intervals of Parkfield earthquakes range from 12 to 32 years. These results are consistent with the expectation of a magnitude 5½-6 earthquake near Parkfield by 1993.
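
The proportionality stated above, a recovery interval equal to the 1966 seismic moment divided by the interseismic moment deficit rate, can be made concrete with a few assumed numbers. The magnitude, patch dimensions, and slip-deficit rate below are illustrative assumptions, not the quantities inverted from the geodetic data.

    # Recovery interval ~ M0(1966) / interseismic moment deficit rate (assumed numbers).
    mu = 3.0e10                          # crustal shear modulus, Pa (typical assumption)
    Mw = 6.0                             # approximate magnitude of the 1966 Parkfield event
    M0 = 10 ** (1.5 * Mw + 9.1)          # seismic moment, N*m (standard moment-magnitude relation)

    L, W = 20e3, 10e3                    # assumed locked-patch length and down-dip width, m
    slip_deficit_rate = 0.012            # m/yr of slip not released by creep (assumed)

    moment_deficit_rate = mu * L * W * slip_deficit_rate
    print("recovery interval:", round(M0 / moment_deficit_rate, 1), "yr")   # ~18 yr

With these placeholder values the interval falls inside the 14-25 yr range quoted above, but that agreement is by construction of the assumptions.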

Segall, Paul; Harris, Ruth

1987-09-01

285

Why earthquakes correlate weakly with the solid Earth tides: Effects of periodic stress on the rate and probability of earthquake occurrence  

USGS Publications Warehouse

We provide an explanation why earthquake occurrence does not correlate well with the daily solid Earth tides. The explanation is derived from analysis of laboratory experiments in which faults are loaded to quasiperiodic failure by the combined action of a constant stressing rate, intended to simulate tectonic loading, and a small sinusoidal stress, analogous to the Earth tides. Event populations whose failure times correlate with the oscillating stress show two modes of response; the response mode depends on the stressing frequency. Correlation that is consistent with stress threshold failure models, e.g., Coulomb failure, results when the period of stress oscillation exceeds a characteristic time t_n; the degree of correlation between failure time and the phase of the driving stress depends on the amplitude and frequency of the stress oscillation and on the stressing rate. When the period of the oscillating stress is less than t_n, the correlation is not consistent with threshold failure models, and much higher stress amplitudes are required to induce detectable correlation with the oscillating stress. The physical interpretation of t_n is the duration of failure nucleation. Behavior at the higher frequencies is consistent with a second-order dependence of the fault strength on sliding rate which determines the duration of nucleation and damps the response to stress change at frequencies greater than 1/t_n. Simple extrapolation of these results to the Earth suggests a very weak correlation of earthquakes with the daily Earth tides, one that would require >13,000 earthquakes to detect. On the basis of our experiments and analysis, the absence of definitive daily triggering of earthquakes by the Earth tides requires that for earthquakes, t_n exceeds the daily tidal period. The experiments suggest that the minimum typical duration of earthquake nucleation on the San Andreas fault system is ~1 year.

Beeler, N.M.; Lockner, D.A.

2003-01-01

286

The M7 October 21, 1868 Hayward Earthquake, Northern California-140 Years Later  

NASA Astrophysics Data System (ADS)

October 21, 2008 marks the 140th anniversary of the M7 1868 Hayward earthquake. This large earthquake, which occurred slightly before 8 AM, caused extensive damage to the San Francisco Bay Area and remains the nation's 12th most lethal earthquake. Property loss was extensive and about 30 people were killed. This earthquake culminated a decade-long series of earthquakes in the Bay Area which started with an M~6 earthquake in the southern Peninsula in 1856, followed by a series of four M5.8 to M6.1 earthquakes along the northern Calaveras fault, and ended with a M~6.5 earthquake in the Santa Cruz Mountains in 1865. Despite this flurry of quakes, the shaking from the 1868 earthquake was the strongest that the new towns and growing cities of the Bay Area had ever experienced. The effect on the brick buildings of the time was devastating: walls collapsed in San Francisco, Oakland, and San Jose, and buildings cracked as far away as Napa, Santa Rosa, and Hollister. The area that was strongly shaken (at Modified Mercalli Intensity VII or higher) encompassed about 2,300 km². Aftershocks continued into November 1868. Surface cracking of the ground along the southern end of the Hayward Fault was traced from Warm Springs in Fremont northward 32 km to San Leandro. As Lawson (1908) reports, "the evidence to the northward of San Leandro is not very satisfactory. The country was then unsettled, and the information consisted of reports of cowboys riding on the range". Analysis of historical triangulation data suggests that the fault moved as far north as Berkeley, and from these data the average slip along the fault is inferred to be about 1.9 ± 0.4 meters. The paleoseismic record from the southern end of the Hayward Fault provides evidence for 10 earthquakes before 1868. The average interval between these earthquakes is 170 ± 80 years, but the last five earthquakes have had an average interval of only 140 ± 50 years. The 1868 Hayward earthquake and more recent analogs such as the 1995 Kobe earthquake are stark reminders of the awesome energy waiting to be released from below the east side of the San Francisco Bay along the Hayward Fault. The population at risk from a Hayward Fault earthquake is now 100 times larger than in 1868. The infrastructure in the San Francisco Bay Area has been tested only by the relatively remote 1989 M6.9 Loma Prieta earthquake. To help focus public attention on these hazards, the 1868 Hayward Earthquake Alliance has been formed, consisting of public and private sector agencies and corporations (see their website www.1868alliance.org). The Alliance is planning a series of activities leading up to the 140th anniversary on October 21, 2008. These include public forums, conferences, commemoration events, publications, websites, videos, and public service announcements.

Brocher, T. M.; Boatwright, J.; Lienkaemper, J. J.; Schwartz, D. P.; Garcia, S.

2007-12-01

287

Forecasting the Next Great San Francisco Earthquake  

NASA Astrophysics Data System (ADS)

The great San Francisco earthquake of 18 April 1906 and its subsequent fires killed more than 3,000 persons, and destroyed much of the city, leaving 225,000 out of 400,000 inhabitants homeless. The 1906 earthquake occurred on a segment of the San Andreas fault that runs from San Juan Bautista north to Cape Mendocino and is estimated to have had a moment magnitude of about 7.9. Observed surface displacements across the fault ranged up to several meters. As we approach the 100-year anniversary of this event, a critical concern is the hazard posed by another such earthquake. In this talk we examine the assumptions presently used to compute the probability of occurrence of these earthquakes. We also present the results of a numerical simulation of interacting faults on the San Andreas system. Called Virtual California, this simulation can be used to compute the times, locations and magnitudes of simulated earthquakes on the San Andreas fault in the vicinity of San Francisco. Of particular importance are new results for the statistical distribution of interval times between great earthquakes, results that are difficult or impossible to obtain from a purely field-based approach. We find that our results are fit well under most circumstances by the Weibull statistical distribution, and we compute waiting times to future earthquakes based upon our simulation results. A contrasting approach to the same problem has been adopted by the Working Group on California Earthquake Probabilities, who use observational data combined with statistical assumptions to compute probabilities of future earthquakes.
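
As a hedged illustration of the statistical step described above (not the Virtual California code itself), the sketch below fits a Weibull distribution to invented recurrence intervals and converts the fit into a conditional 30-yr probability.

```python
import numpy as np
from scipy import stats

# Hypothetical inter-event times (years) for great earthquakes, standing in
# for output of a fault-system simulator; values are invented for illustration.
intervals = np.array([152., 210., 178., 305., 260., 190., 240., 165., 220., 280.])

# Fit a two-parameter Weibull (location fixed at zero).
shape, loc, scale = stats.weibull_min.fit(intervals, floc=0.0)

def conditional_prob(t_elapsed, horizon, shape, scale):
    """P(event within `horizon` yr | quiet for `t_elapsed` yr) under the fit."""
    surv = stats.weibull_min.sf([t_elapsed, t_elapsed + horizon], shape, scale=scale)
    return 1.0 - surv[1] / surv[0]

print(f"Weibull shape = {shape:.2f}, scale = {scale:.1f} yr")
print(f"30-yr conditional probability after 100 quiet years: "
      f"{conditional_prob(100.0, 30.0, shape, scale):.2f}")
```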

Rundle, P.; Rundle, J. B.; Turcotte, D. L.; Donnellan, A.; Yakovlev, G.; Tiampo, K. F.

2005-12-01

288

Birth of a fault: Connecting the Kern County and Walker Pass, California, earthquakes  

USGS Publications Warehouse

A band of seismicity transects the southern Sierra Nevada range between the northeastern end of the site of the 1952 MW (moment magnitude) 7.3 Kern County earthquake and the site of the 1946 MW 6.1 Walker Pass earthquake. Relocated earthquakes in this band, which lacks a surface expression, better delineate the northeast-trending seismic lineament and resolve complex structure near the Walker Pass mainshock. Left-lateral earthquake focal planes are rotated counterclockwise from the strike of the seismic lineament, consistent with slip on shear fractures such as those observed in the early stages of fault development in laboratory experiments. We interpret this seismic lineament as a previously unrecognized, incipient, currently blind, strike-slip fault, a unique example of a newly forming structure.

Bawden, G.W.; Michael, A.J.; Kellogg, L.H.

1999-01-01

289

Chapter C. The Loma Prieta, California, Earthquake of October 17, 1989 - Building Structures  

USGS Publications Warehouse

Several approaches are used to assess the performance of the built environment following an earthquake -- preliminary damage surveys conducted by professionals, detailed studies of individual structures, and statistical analyses of groups of structures. Reports of damage that are issued by many organizations immediately following an earthquake play a key role in directing subsequent detailed investigations. Detailed studies of individual structures and statistical analyses of groups of structures may be motivated by particularly good or bad performance during an earthquake. Beyond this, practicing engineers typically perform stress analyses to assess the performance of a particular structure to vibrational levels experienced during an earthquake. The levels may be determined from recorded or estimated ground motions; actual levels usually differ from design levels. If a structure has seismic instrumentation to record response data, the estimated and recorded response and behavior of the structure can be compared.

Celebi, Mehmet

1998-01-01

290

Earthquake Photo Collections  

NSDL National Science Digital Library

This collection of earthquake photos, published by the United States Geological Survey (USGS), contains links to photos for specific earthquakes, as well as links to other USGS image collections and non-USGS collections. Highlights include photos from the 1906 San Francisco earthquake, the 1989 Loma Prieta earthquake, and the 1994 earthquake in Northridge, California. There is also a link to the USGS photo library (general geologic topics), and links to collections published by universities, museums, other government organizations, and professional organizations.

291

Surface Displacement of the 17 May 1993 Eureka Valley, California, Earthquake Observed by SAR Interferometry  

Microsoft Academic Search

Satellite synthetic aperture radar (SAR) interferometry shows that the magnitude 6.1 Eureka Valley earthquake of 17 May 1993 produced an elongated subsidence basin oriented north-northwest, parallel to the trend defined by the aftershock distribution, whereas the source mechanism of the earthquake implies a north-northeast-striking normal fault. The ±3-millimeter accuracy of the radar-observed displacement map over short spatial scales allowed identification

Gilles Peltzer; Paul Rosen

1995-01-01

292

The magnitude-frequency distribution of earthquakes recorded with deep seismometers at Cajon Pass, southern California  

Microsoft Academic Search

The cumulative b-value (the slope of the Gutenberg-Richter relationship between earthquake occurrence rate and magnitude) is commonly found to be constant (~1). Network catalogues, however, reveal a decrease at small magnitudes (<3). Some recent studies have suggested that this decrease in b-value is not just an artifact of catalogue incompleteness, but that small earthquakes are really not as numerous as
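
The b-value discussed here is conventionally estimated with Aki's (1965) maximum-likelihood formula; the sketch below applies it to a synthetic Gutenberg-Richter catalogue and is only an illustration, not the study's analysis.

```python
import numpy as np

def b_value_mle(magnitudes, m_complete):
    """Aki (1965) maximum-likelihood b-value for magnitudes above a
    completeness threshold (continuous magnitudes, no binning correction)."""
    m = np.asarray(magnitudes)
    m = m[m >= m_complete]
    return np.log10(np.e) / (m.mean() - m_complete)

# Synthetic Gutenberg-Richter catalogue with true b = 1.0 above M 1.0:
# magnitudes above the threshold are exponentially distributed with rate b*ln(10).
rng = np.random.default_rng(0)
mags = 1.0 + rng.exponential(scale=1.0 / np.log(10.0), size=5000)
print(f"Estimated b-value: {b_value_mle(mags, m_complete=1.0):.2f}")  # ~1.0
```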

Rachel E. Abercrombie

1996-01-01

293

Near-field postseismic deformation associated with the 1992 Landers and 1999 Hector Mine, California, earthquakes  

NASA Astrophysics Data System (ADS)

After the Landers earthquake (Mw = 7.3, 1992.489) a linear array of 10 monuments extending about 30 km N50°E on either side of the earthquake rupture plus a nearby off-trend reference monument were surveyed frequently by GPS until 2003.2. The array also spans the rupture of the subsequent Hector Mine earthquake (Mw = 7.1, 1999.792). The pre-Landers velocities of monuments in the array relative to interior North America were estimated from earlier trilateration and very long baseline interferometry measurements. Except at the reference monument, the post-Landers velocities of the individual monuments in the array relaxed to their preseismic values within 4 years. Following the Hector Mine earthquake the velocities of the monuments relaxed to steady rates within 1 year. Those steady rates for the east components are about equal to the pre-Landers rates as is the steady rate for the north component of the one monument east of the Hector Mine rupture. However, the steady rates for the north components of the 10 monuments west of the rupture are systematically ˜10 mm yr-1 larger than the pre-Landers rates. The relaxation to a steady rate is approximately exponential with decay times of 0.50 ± 0.10 year following the Landers earthquake and 0.32 ± 0.18 year following the Hector Mine earthquake. The postearthquake motions of the Landers array following the Landers earthquake are not well approximated by the viscoelastic-coupling model of [2000]. A similar viscoelastic-coupling model [, 2001] is more successful in representing the deformation after the Hector Mine earthquake.
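
The exponential relaxation described above can be recovered from a position time series with a simple nonlinear least-squares fit; the sketch below uses synthetic data with invented parameter values, not the GPS observations themselves.

```python
import numpy as np
from scipy.optimize import curve_fit

def postseismic_model(t, x0, v_ss, amp, tau):
    """Position vs. time: steady rate plus an exponentially decaying
    postseismic transient with relaxation time tau (years)."""
    return x0 + v_ss * t + amp * (1.0 - np.exp(-t / tau))

# Synthetic monument time series (years after the event, mm); illustrative only.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 4.0, 120)
obs = postseismic_model(t, 0.0, 20.0, 35.0, 0.5) + rng.normal(0.0, 2.0, t.size)

popt, pcov = curve_fit(postseismic_model, t, obs, p0=[0.0, 10.0, 10.0, 1.0])
tau, tau_err = popt[3], np.sqrt(pcov[3, 3])
print(f"Relaxation time: {tau:.2f} +/- {tau_err:.2f} yr")
```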

Savage, J. C.; Svarc, J. L.; Prescott, W. H.

2003-09-01

294

Chapter A. The Loma Prieta, California, Earthquake of October 17, 1989 - Lifelines  

USGS Publications Warehouse

To the general public who had their televisions tuned to watch the World Series, the 1989 Loma Prieta earthquake was a lifelines earthquake. It was the images seen around the world of the collapsed Cypress Street viaduct, with the frantic and heroic efforts to pull survivors from the structure that was billowing smoke; the collapsed section of the San Francisco-Oakland Bay Bridge and subsequent home video of a car plunging off the open span; and the spectacular fire in the Marina District of San Francisco fed by a broken gas line. To many of the residents of the San Francisco Bay region, the relation of lifelines to the earthquake was characterized by sitting in the dark because of power outages, the inability to make telephone calls because of network congestion, and the slow and snarled traffic. Had the public been aware of the actions of the engineers and tradespeople working for the utilities and other lifeline organizations on the emergency response and restoration of lifelines, the lifeline characteristics of this earthquake would have been even more significant. Unobserved by the public were the warlike devastation in several electrical-power substations, the 13 miles of gas-distribution lines that had to be replaced in several communities, and the more than 1,200 leaks and breaks in water mains and service connections that had to be excavated and repaired. Like the 1971 San Fernando, Calif., earthquake, which was a seminal event for activity to improve the earthquake performance of lifelines, the 1989 Loma Prieta earthquake demonstrated that the tasks of preparing lifelines in 'earthquake country' were incomplete; indeed, new lessons had to be learned.

Schiff, Anshel J., (Edited By)

1998-01-01

295

Sedimentary record of the 1872 earthquake and "Tsunami" at Owens Lake, southeast California  

USGS Publications Warehouse

In 1872, a magnitude 7.5-7.7 earthquake vertically offset the Owens Valley fault by more than a meter. An eyewitness reported a large wave on the surface of Owens Lake, presumably initiated by the earthquake. Physical evidence of this event is found in cores and trenches from Owens Lake, including soft-sediment deformation and fault offsets. A graded pebbly sand truncates these features, possibly over most of the lake floor, reflecting the "tsunami" wave. Confirmation of the timing of the event is provided by abnormally high lead concentrations, derived from lead-smelting plants that operated near the eastern lake margin from 1869 to 1876, in the sediment immediately above and below these proposed earthquake deposits. The bottom velocity in the deepest part of the lake needed to transport the coarsest grain sizes in the graded pebbly sand provides an estimate of the minimum initial 'tsunami' wave height of 37 cm. This is less than the wave height calculated from long-wave numerical models (about 55 cm) using average fault displacement during the earthquake. Two other graded sand deposits associated with soft-sediment deformation in the Owens Lake record are less than 3000 years old, and are interpreted as evidence of older earthquake and tsunami events. Offsets of the Owens Valley fault elsewhere in the valley indicate that at least two additional large earthquakes occurred during the Holocene, which is consistent with our observations in this lacustrine record.

Smoot, J.P.; Litwin, R.J.; Bischoff, J.L.; Lund, S.J.

2000-01-01

296

Chapter D. The Loma Prieta, California, Earthquake of October 17, 1989 - Aftershocks and Postseismic Effects  

USGS Publications Warehouse

While the damaging effects of the earthquake represent a significant social setback and economic loss, the geophysical effects have produced a wealth of data that have provided important insights into the structure and mechanics of the San Andreas Fault system. Generally, the period after a large earthquake is vitally important to monitor. During this part of the seismic cycle, the primary fault and the surrounding faults, rock bodies, and crustal fluids rapidly readjust in response to the earthquake's sudden movement. Geophysical measurements made at this time can provide unique information about fundamental properties of the fault zone, including its state of stress and the geometry and frictional/rheological properties of the faults within it. Because postseismic readjustments are rapid compared with corresponding changes occurring in the preseismic period, the amount and rate of information that is available during the postseismic period is relatively high. From a geophysical viewpoint, the occurrence of the Loma Prieta earthquake in a section of the San Andreas fault zone that is surrounded by multiple and extensive geophysical monitoring networks has produced nothing less than a scientific bonanza. The reports assembled in this chapter collectively examine available geophysical observations made before and after the earthquake and model the earthquake's principal postseismic effects. The chapter covers four broad categories of postseismic effect: (1) aftershocks; (2) postseismic fault movements; (3) postseismic surface deformation; and (4) changes in electrical conductivity and crustal fluids.

Reasenberg, Paul A., (Edited By)

1997-01-01

297

The Redwood Coast Tsunami Work Group: a unique organization promoting earthquake and tsunami resilience on California's North Coast  

NASA Astrophysics Data System (ADS)

The Northern California counties of Del Norte, Humboldt, and Mendocino account for over 30% of California's coastline and make up one of the most seismically active areas of the contiguous 48 states. The region is at risk from earthquakes located on- and offshore and from tsunamis generated locally from faults associated with the Cascadia subduction zone (CSZ) and from distant sources elsewhere in the Pacific. In 1995 the California Geological Survey (CGS) published a scenario for a CSZ earthquake that included both strong ground shaking effects and a tsunami. As a result of the scenario, the Redwood Coast Tsunami Work Group (RCTWG), an organization of government agencies, tribes, service groups, academia and the private sector, was formed to coordinate and promote earthquake and tsunami hazard awareness and mitigation in the three-county region. The RCTWG and its member agencies' projects include education/outreach products and programs, tsunami hazard mapping, signage and siren planning. Since 2008, RCTWG has worked with the California Emergency Management Agency (Cal EMA) in conducting tsunami warning communications tests on the North Coast. In 2007, RCTWG members helped develop and carry out the first tsunami training exercise at FEMA's Emergency Management Institute in Emmitsburg, MD. The RCTWG has facilitated numerous multi-agency, multi-discipline coordinated exercises, and RCTWG county tsunami response plans have been a model for other regions of the state and country. Eight North Coast communities have been recognized as TsunamiReady by the National Weather Service, including the first National Park, the first State Park, and the only tribe in California to be so recognized. Over 500 tsunami hazard zone signs have been posted in the RCTWG region since 2008. Eight assessment surveys from 1993 to 2010 have tracked preparedness actions and personal awareness of earthquake and tsunami hazards in the county, and additional surveys have tracked public awareness and tourist concerns about tsunami hazard signs. Over the seventeen-year period covered by the surveys, the percentage of respondents with houses secured to foundations has increased from 58 to 84 percent, the percentage aware of a local tsunami hazard has increased from 51 to 89 percent, and the percentage knowing what the Cascadia subduction zone is has risen from 16 to 57 percent. In 2009, the RCTWG was recognized by the Western States Seismic Policy Council (WSSPC) with an award for innovation, and in 2010 the RCTWG-sponsored class "Living on Shaky Ground" was awarded WSSPC's overall Award in Excellence. The RCTWG works closely with CGS and Cal EMA on a number of projects including tsunami mapping, evacuation zone planning, siren policy, tsunami safety for boaters, and public education messaging. Current projects include working with CGS to develop a "playbook" tsunami mapping product to illustrate the expected effects from a range of tsunami source events and assist local governments in focusing future response actions to reflect the range of expected impacts from distant-source events. Preparedness efforts paid off on March 11, 2011, when a tsunami warning was issued for the region and significant damage occurred in harbor regions of Del Norte County and Mendocino County. Full-scale evacuations were carried out in a coordinated manner and the majority of the commercial fishing fleet in Crescent City was able to exit the harbor before the tsunami arrived.

Dengler, L.; Henderson, C.; Larkin, D.; Nicolini, T.; Ozaki, V.

2012-12-01

298

Chapter F. The Loma Prieta, California, Earthquake of October 17, 1989 - Marina District  

USGS Publications Warehouse

During the earthquake, a total land area of about 4,300 km2 was shaken with seismic intensities that can cause significant damage to structures. The area of the Marina District of San Francisco is only 4.0 km2--less than 0.1 percent of the area most strongly affected by the earthquake--but its significance with respect to engineering, seismology, and planning far outstrips its proportion of shaken terrain and makes it a centerpiece for lessons learned from the earthquake. The Marina District provides perhaps the most comprehensive case history of seismic effects at a specific site developed for any earthquake. The reports assembled in this chapter, which provide an account of these seismic effects, constitute a unique collection of studies of site response, as well as infrastructure and societal response, that cover virtually all aspects of the earthquake, ranging from incoming ground waves to the outgoing airwaves used for emergency communication. The Marina District encompasses the area bounded by San Francisco Bay on the north, the Presidio on the west, and Lombard Street and Van Ness Avenue on the south and east, respectively. Nearly all of the earthquake damage in the Marina District, however, occurred within a considerably smaller area of about 0.75 km2, bounded by San Francisco Bay and Baker, Chestnut, and Buchanan Streets. At least five major aspects of earthquake response in the Marina District are covered by the reports in this chapter: (1) dynamic site response, (2) soil liquefaction, (3) lifeline performance, (4) building performance, and (5) emergency services.

O'Rourke, Thomas D., (Edited By)

1992-01-01

299

Statistical analysis of the induced Basel 2006 earthquake sequence: introducing a probability-based monitoring approach for Enhanced Geothermal Systems  

NASA Astrophysics Data System (ADS)

Geothermal energy is becoming an important clean energy source; however, the stimulation of a reservoir for an Enhanced Geothermal System (EGS) is associated with seismic risk due to induced seismicity. Seismicity occurring due to the water injection at depth has to be well recorded and monitored. To mitigate the seismic risk of a damaging event, an appropriate alarm system needs to be in place for each individual experiment. In recent experiments, the so-called traffic-light alarm system, based on public response, local magnitude and peak ground velocity, was used. We aim to improve the pre-defined alarm system by introducing a probability-based approach; we retrospectively model the ongoing seismicity in real time with multiple statistical forecast models and then translate the forecast to seismic hazard in terms of probabilities of exceeding a ground motion intensity level. One class of models accounts for the water injection rate, the main parameter that can be controlled by the operators during an experiment. By translating the models into time-varying probabilities of exceeding various intensity levels, we provide tools which are well understood by the decision makers and can be used to determine non-exceedance thresholds during a reservoir stimulation; this, however, remains an entrepreneurial or political decision of the responsible project coordinators. We introduce forecast models based on the data set of an EGS experiment in the city of Basel. Between 2006 December 2 and 8, approximately 11 500 m3 of water was injected into a 5-km-deep well at high pressures. A six-sensor borehole array was installed by the company Geothermal Explorers Limited (GEL) at depths between 300 and 2700 m around the well to monitor the induced seismicity. The network recorded approximately 11 200 events during the injection phase, more than 3500 of which were located. With the traffic-light system, actions were implemented after an ML 2.7 event, the water injection was reduced and then stopped after another ML 2.5 event. A few hours later, an earthquake with ML 3.4, felt within the city, occurred, which led to bleed-off of the well. A risk study was later issued with the outcome that the experiment could not be resumed. We analyse the statistical features of the sequence and show that the sequence is well modelled with the Omori-Utsu law following the termination of water injection. Based on this model, the sequence will take 31 +29/-14 years to reach the background level. We introduce statistical models based on Reasenberg and Jones and Epidemic Type Aftershock Sequence (ETAS) models, commonly used to model aftershock sequences. We compare and test different model setups to simulate the sequences, varying the number of fixed and free parameters. For one class of the ETAS models, we account for the flow rate at the injection borehole. We test the models against the observed data with standard likelihood tests and find the ETAS model accounting for the flow rate to perform best. Such a model may in future serve as a valuable tool for designing probabilistic alarm systems for EGS experiments.
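
The Omori-Utsu reasoning used to estimate the sequence duration reduces to inverting the decay law for the time at which the modelled rate falls to the background rate; the sketch below illustrates this with invented parameters, not the fitted Basel values.

```python
import numpy as np

def omori_rate(t, k, c, p):
    """Omori-Utsu aftershock rate (events/day) at time t days after shut-in."""
    return k / (c + t) ** p

def time_to_background(k, c, p, background_rate):
    """Time (days) at which the modelled rate first drops to the background rate,
    obtained by inverting k / (c + t)^p = background_rate analytically."""
    return (k / background_rate) ** (1.0 / p) - c

# Illustrative parameter values only (not the values fitted to the Basel data).
k, c, p = 50.0, 0.5, 1.1
background = 0.005  # events per day above the same magnitude threshold
t_days = time_to_background(k, c, p, background)
print(f"Rate reaches background after ~{t_days / 365.25:.0f} years")
```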

Bachmann, C. E.; Wiemer, S.; Woessner, J.; Hainzl, S.

2011-08-01

300

Processed seismic motion records from earthquakes (1982-1993): Recorded at Scotty's Castle, California  

SciTech Connect

As part of the contract with the US Department of Energy, Nevada Operations Office (DOE/NV), URS/John A. Blume & Associates, Engineers (URS/Blume) maintained a network of seismographs to monitor the ground motion generated by the underground nuclear explosions (UNEs) at the Nevada Test Site (NTS). The seismographs were located in the communities surrounding the NTS and the Las Vegas valley. When these seismographs were not used for monitoring the UNE generated motions, a limited number of seismographs were maintained for monitoring motion generated by other than UNEs (e.g. motion generated by earthquakes, wind, blast). Scotty's Castle was one of the selected earthquake monitoring stations. During the period from 1982 through 1993, numerous earthquakes of varied magnitudes and distances were recorded at Scotty's Castle. The records from 24 earthquakes were processed and included in this report. Tables 1 and 2 list the processed earthquakes in chronological order and in the order of epicentral distances, respectively. Figure 1 shows these epicenters and magnitudes. Due to the potential benefit of these data for the scientific community, DOE/NV and the National Park Service authorize the release of these records.

Lum, P.K.; Honda, K.K.

1993-10-01

301

Processed seismic motion records from earthquakes, 1982-1993: Recorded at Scotty's Castle, California  

NASA Astrophysics Data System (ADS)

As part of the contract with the US Department of Energy, Nevada Operations Office (DOE/NV), URS/John A. Blume & Associates, Engineers (URS/Blume) maintained a network of seismographs to monitor the ground motion generated by the underground nuclear explosions (UNEs) at the Nevada Test Site (NTS). The seismographs were located in the communities surrounding the NTS and the Las Vegas valley. When these seismographs were not used for monitoring the UNE generated motions, a limited number of seismographs were maintained for monitoring motion generated by other than UNEs (e.g. motion generated by earthquakes, wind, blast). Scotty's Castle was one of the selected earthquake monitoring stations. During the period from 1982 through 1993, numerous earthquakes of varied magnitudes and distances were recorded at Scotty's Castle. The records from 24 earthquakes were processed and included in this report. The processed earthquakes are listed in chronological order and in the order of epicentral distances, respectively. These epicenters and magnitudes are shown. Due to the potential benefit of these data for the scientific community, DOE/NV and the National Park Service authorize the release of these records.

Lum, P. K.; Honda, K. K.

1993-10-01

302

Probability  

E-print Network

Review: Probability and Statistics (Sam Roweis, October 2, 2002). Topics include the use of probabilities, careful definition of random variables and sample spaces (e.g., the Prisoner's paradox), entropy as a measure, and statistics as the inference of probabilistic quantities for data given fixed models.

Roweis, Sam

303

Acceleration and volumetric strain generated by the Parkfield 2004 earthquake on the GEOS strong-motion array near Parkfield, California  

USGS Publications Warehouse

An integrated array of 11 General Earthquake Observation System (GEOS) stations installed near Parkfield, CA provided on-scale, broadband, wide-dynamic-range measurements of acceleration and volumetric strain of the Parkfield earthquake (M 6.0) of September 28, 2004. Three-component measurements of acceleration were obtained at each of the stations. Measurements of collocated acceleration and volumetric strain were obtained at four of the stations. Measurements of velocity at most sites were on scale only for the initial P-wave arrival. When considered in the context of the extensive set of strong-motion recordings obtained on more than 40 analog stations by the California Strong-Motion Instrumentation Program (Shakal et al., 2004; http://www.quake.ca.gov/cisn-edc) and those on the dense array of Spudich et al. (1988), these recordings provide an unprecedented document of the nature of the near-source strong motion generated by an M 6.0 earthquake. The data set reported herein provides the most extensive set of near-field, broadband, wide-dynamic-range measurements of acceleration and volumetric strain for an earthquake as large as M 6 of which the authors are aware. As a result, considerable interest has been expressed in these data. This report is intended to describe the data and facilitate its use to resolve a number of scientific and engineering questions concerning earthquake rupture processes and resultant near-field motions and strains. This report provides a description of the array, its scientific objectives and the strong-motion recordings obtained of the main shock. The report provides copies of the uncorrected and corrected data. Copies of the inferred velocities, displacements, and pseudo-velocity response spectra are provided. Digital versions of these recordings are accessible with information available through the internet at several locations: the National Strong-Motion Program web site (http://agram.wr.usgs.gov/), the COSMOS Virtual Data Center Web site (http://www.cosmos-eq.org), and the CISN Engineering and Berkeley data centers (http://www.quake.ca.gov/cisn-edc). They are also accessible together with recordings on the GEOS Strong-motion Array near Parkfield, CA since its installation in 1987 through the USGS GEOS web site (http://nsmp.wr.usgs.gov/GEOS).

Borcherdt, Rodger D.; Johnston, Malcolm J.S.; Dietel, Christopher; Glassmoyer, Gary; Myren, Doug; Stephens, Christopher

2004-01-01

304

Preseismic and coseismic deformation associated with the Coyote Lake, California, earthquake.  

USGS Publications Warehouse

The Coyote Lake earthquake (ML=5.9; August 6, 1979; epicenter about 100 km SE of San Francisco) occurred on the Calaveras fault within a geodetic network that had been surveyed annually since 1972 to monitor strain accumulation. The rupture surface as defined by aftershocks is a vertical rectangle 20 km in length extending from a depth of 4 km to about 12 km. The observed deformation of the geodetic network constrains the average slip to be 0.33 ± 0.05 m right lateral. Although the geodetic data furnished an exceptionally detailed picture of the pre-earthquake deformation, no significant premonitory anomaly associated with the Coyote Lake earthquake can be identified.-Authors

King, N.E.; Savage, J.C.; Lisowski, M.; Prescott, W.H.

1981-01-01

305

Deformation following the 1994 Northridge earthquake (M=6.7), southern California  

USGS Publications Warehouse

Following the 1994 Mw=6.7 Northridge earthquake, a 65-km-long, north-south array of 11 geodetic monuments was established across the rupture. The array was surveyed with GPS ten times in the 4.25 yr after the earthquake. Although there is evidence for modest nonlinear postseismic relaxation in the first few weeks after the Northridge earthquake, the deformation in the subsequent four years can be adequately described by constant station velocities. The observed S70°E velocity components are consistent with the deformation expected from steady strain accumulation on the San Andreas fault. The N20°E velocity components indicate that the southern Northridge fault block is moving almost as a unit N20°E with respect to the northern fault block, the motion being accommodated by a zone of convergence (width 20 km) at the north end of the Northridge rupture.

Savage, J.C.; Svarc, J.L.; Prescott, W.H.; Hudnut, K.W.

1998-01-01

306

Comment on "Revisiting the 1872 owens valley, California, earthquake" by Susan E. Hough and Kate Hutton  

USGS Publications Warehouse

Bakun (2009) argues that the conclusions of Hough and Hutton (2008) are wrong because the study failed to take into account the Sierra Nevada attenuation model of Bakun (2006). In particular, Bakun (2009) argues that propagation effects can explain the relatively high intensities generated by the 1872 Owens Valley earthquake. Using an intensity attenuation model that attempts to account for attenuation through the Sierra Nevada, Bakun (2006) infers the magnitude estimate (Mw 7.4–7.5) that is currently accepted by the National Earthquake Information Center (NEIC).

Bakun, W.H.

2009-01-01

307

Chapter E. The Loma Prieta, California, Earthquake of October 17, 1989 - Geologic Setting and Crustal Structure  

USGS Publications Warehouse

Although some scientists considered the Ms=7.1 Loma Prieta, Calif., earthquake of 1989 to be an anticipated event, some aspects of the earthquake were surprising. It occurred 17 km beneath the Santa Cruz Mountains along a left-stepping restraining bend in the San Andreas fault system. Rupture on the southwest-dipping fault plane consisted of subequal amounts of right-lateral and reverse motion but did not reach the surface. In the area of maximum uplift, severe shaking and numerous ground cracks occurred along Summit Road and Skyland Ridge, several kilometers south of the main trace of the San Andreas fault. The relatively deep focus of the earthquake, the distribution of ground failure, the absence of throughgoing surface rupture on the San Andreas fault, and the large component of uplift raised several questions about the relation of the 1989 Loma Prieta earthquake to the San Andreas fault: Did the earthquake actually occur on the San Andreas fault? Where exactly is the San Andreas fault in the heavily forested Santa Cruz Mountains, and how does the fault relate to ground ruptures that occurred there in 1989 and 1906? What is the geometry of the San Andreas fault system at depth, and how does it relate to the major crustal blocks identified by geologic mapping? Subsequent geophysical and geologic investigations of crustal structure in the Loma Prieta region have addressed these and other questions about the relation of the earthquake to geologic structures observed in the southern Santa Cruz Mountains. The diverse papers in this chapter cover several topics: geologic mapping of the region, potential-field and electromagnetic modeling of crustal structure, and the velocity structure of the crust and mantle in and below the source region for the earthquake. Although these papers were mostly completed between 1992 and 1997, they provide critical documentation of the crustal structure of the Loma Prieta region. Together, they present a remarkably coherent, three-dimensional picture of the earthquake source region--a geologically complex volume of crust with a long history of both right-lateral faulting and fault-normal compression, thrusting, and uplift.

Wells, Ray E.

2004-01-01

308

Chapter A. The Loma Prieta, California, Earthquake of October 17, 1989 - Strong Ground Motion  

USGS Publications Warehouse

Strong ground motion generated by the Loma Prieta, Calif., earthquake (MS~7.1) of October 17, 1989, resulted in at least 63 deaths, more than 3,757 injuries, and damage estimated to exceed $5.9 billion. Strong ground motion severely damaged critical lifelines (freeway overpasses, bridges, and pipelines), caused severe damage to poorly constructed buildings, and induced a significant number of ground failures associated with liquefaction and landsliding. It also caused a significant proportion of the damage and loss of life at distances as far as 100 km from the epicenter. Consequently, understanding the characteristics of the strong ground motion associated with the earthquake is fundamental to understanding the earthquake's devastating impact on society. The papers assembled in this chapter address this problem. Damage to vulnerable structures from the earthquake varied substantially with the distance from the causative fault and the type of underlying geologic deposits. Most of the damage and loss of life occurred in areas underlain by 'soft soil'. Quantifying these effects is important for understanding the tragic concentrations of damage in such areas as Santa Cruz and the Marina and Embarcadero Districts of San Francisco, and the failures of the San Francisco-Oakland Bay Bridge and the Interstate Highway 880 overpass. Most importantly, understanding these effects is a necessary prerequisite for improving mitigation measures for larger earthquakes likely to occur much closer to densely urbanized areas in the San Francisco Bay region. The earthquake generated an especially important data set for understanding variations in the severity of strong ground motion. Instrumental strong-motion recordings were obtained at 131 sites located from about 6 to 175 km from the rupture zone. This set of recordings, the largest yet collected for an event of this size, was obtained from sites on various geologic deposits, including a unique set on 'soft soil' deposits (artificial fill and bay mud). These exceptional ground-motion data are used by the authors of the papers in this chapter to infer radiation characteristics of the earthquake source, identify dominant propagation characteristics of the Earth's crust, quantify amplification characteristics of near-surface geologic deposits, develop general amplification factors for site-dependent building-code provisions, and revise earthquake-hazard assessments for the San Francisco Bay region. Interpretations of additional data recorded in well-instrumented buildings, dams, and freeway overpasses are provided in other chapters of this report.

Borcherdt, Roger D.

1994-01-01

309

Continuous GPS observations of postseismic deformation following the 16 October 1999 Hector Mine, California, earthquake (Mw 7.1)  

USGS Publications Warehouse

Rapid field deployment of a new type of continuously operating Global Positioning System (GPS) network and data from Southern California Integrated GPS Network (SCIGN) stations that had recently begun operating in the area allow unique observations of the postseismic deformation associated with the 1999 Hector Mine earthquake. Innovative solutions in fieldcraft, devised for the 11 new GPS stations, provide high-quality observations with 1-year time histories on stable monuments at remote sites. We report on our results from processing the postseismic GPS data available from these sites, as well as 8 other SCIGN stations within 80 km of the event (a total of 19 sites). From these data, we analyze the temporal character and spatial pattern of the postseismic transients. Data from some sites display statistically significant time variation in their velocities. Although this is less certain, the spatial pattern of change in the postseismic velocity field also appears to have changed. The pattern now is similar to the pre-Landers (pre-1992) secular field, but laterally shifted and locally at twice the rate. We speculate that a 30 km × 50 km portion of crust (near Twentynine Palms), which was moving at nearly the North American plate rate (to within 3.5 mm/yr of that rate) prior to the 1992 Landers sequence, now is moving along with the crust to the west of it, as though it has been entrained in flow along with the Pacific Plate as a result of the Landers and Hector Mine earthquake sequence. The inboard axis of right-lateral shear deformation (at lower crustal to upper mantle depth) may have jumped 30 km farther into the continental crust at this fault junction that comprises the southern end of the eastern California shear zone.

Hudnut, K.W.; King, N.E.; Galetzka, J.E.; Stark, K.F.; Behr, J.A.; Aspiotes, A.; van Wyk, S.; Moffitt, R.; Dockter, S.; Wyatt, F.

2002-01-01

310

Rapid Centroid Moment Tensor (CMT) Inversion in 3D Earth Structure Model for Earthquakes in Southern California  

NASA Astrophysics Data System (ADS)

Accurate and rapid CMT inversion is important for seismic hazard analysis. We have developed an algorithm for very rapid CMT inversions in a 3D Earth structure model and applied it on small to medium-sized earthquakes recorded by the Southern California Seismic Network (SCSN). Our CMT inversion algorithm is an integral component of the scattering-integral (SI) method for full-3D waveform tomography (F3DT). In the SI method for F3DT, the sensitivity (Fréchet) kernels are constructed through the temporal convolution between the earthquake wavefield (EWF) and the receiver Green tensor (RGT), which is the wavefield generated by 3 orthogonal unit impulsive body forces acting at the receiver location. The RGTs are also the partial derivatives of the waveform with respect to the moment tensors. In this study, our RGTs are computed in a 3D seismic structure model for Southern California (CVM4SI1) using the finite-difference method, which allows us to account for 3D path effects in our source inversion. We used three component broadband waveforms below 0.2 Hz. An automated waveform-picking algorithm based on continuous wavelet transform is applied on observed waveforms to pick P, S and surface waves. A multi-scale grid-searching algorithm is then applied on the picked waveforms to find the optimal strike, dip and rake values that minimize the amplitude misfit and maximize the correlation coefficient. In general, our CMT solutions agree with solutions inverted using other methods and provide better fit to the observed waveforms.
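
A minimal sketch of the grid-search step described above, assuming the six moment-tensor partial-derivative waveforms are already available (here replaced by random toy kernels); it illustrates the approach only and is not the SCSN implementation.

```python
import numpy as np
from itertools import product

def moment_tensor(strike, dip, rake):
    """Unit double-couple moment tensor (Aki & Richards convention,
    x=north, y=east, z=down); angles in degrees, six independent components."""
    s, d, r = np.radians([strike, dip, rake])
    m_xx = -(np.sin(d) * np.cos(r) * np.sin(2 * s) + np.sin(2 * d) * np.sin(r) * np.sin(s) ** 2)
    m_yy = (np.sin(d) * np.cos(r) * np.sin(2 * s) - np.sin(2 * d) * np.sin(r) * np.cos(s) ** 2)
    m_zz = -(m_xx + m_yy)
    m_xy = (np.sin(d) * np.cos(r) * np.cos(2 * s) + 0.5 * np.sin(2 * d) * np.sin(r) * np.sin(2 * s))
    m_xz = -(np.cos(d) * np.cos(r) * np.cos(s) + np.cos(2 * d) * np.sin(r) * np.sin(s))
    m_yz = -(np.cos(d) * np.cos(r) * np.sin(s) - np.cos(2 * d) * np.sin(r) * np.cos(s))
    return np.array([m_xx, m_yy, m_zz, m_xy, m_xz, m_yz])

def synthetic(mt, kernels):
    """Synthetic waveform as a linear combination of the six moment-tensor
    partial-derivative waveforms (in practice derived from the RGTs)."""
    return mt @ kernels

def grid_search(observed, kernels, step=10.0):
    """Coarse grid search over strike/dip/rake minimizing the L2 amplitude misfit."""
    best = (np.inf, None)
    for strike, dip, rake in product(np.arange(0, 360, step),
                                     np.arange(step, 90 + step, step),
                                     np.arange(-180, 180, step)):
        syn = synthetic(moment_tensor(strike, dip, rake), kernels)
        misfit = np.sum((observed - syn) ** 2)
        if misfit < best[0]:
            best = (misfit, (strike, dip, rake))
    return best

# Toy demonstration with random "kernels" standing in for RGT-based partials.
rng = np.random.default_rng(3)
kernels = rng.normal(size=(6, 200))
observed = synthetic(moment_tensor(40.0, 60.0, -90.0), kernels)
misfit, mech = grid_search(observed, kernels)
print(f"Recovered (strike, dip, rake) = {mech}, misfit = {misfit:.2e}")
```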

Chen, P.; Lee, E.; Jordan, T. H.; Maechling, P. J.

2009-12-01

311

Southern California Permanent GPS Geodetic Array: Continuous measurements of regional crustal deformation between the 1992 Landers and 1994 Northridge earthquakes  

USGS Publications Warehouse

The southern California Permanent GPS Geodetic Array (PGGA) was established in 1990 across the Pacific-North America plate boundary to continuously monitor crustal deformation. We describe the development of the array and the time series of daily positions estimated for its first 10 sites in the 19-month period between the June 28, 1992 (Mw=7.3), Landers and January 17, 1994 (Mw=6.7), Northridge earthquakes. We compare displacement rates at four site locations with those reported by Feigl et al. [1993], which were derived from an independent set of Global Positioning System (GPS) and very long baseline interferometry (VLBI) measurements collected over nearly a decade prior to the Landers earthquake. The velocity differences for three sites 65-100 km from the earthquake's epicenter are of order of 3-5 mm/yr and are systematically coupled with the corresponding directions of coseismic displacement. The fourth site, 300 km from the epicenter, shows no significant velocity difference. These observations suggest large-scale postseismic deformation with a relaxation time of at least 800 days. The statistical significance of our observations is complicated by our incomplete knowledge of the noise properties of the two data sets; two possible noise models fit the PGGA data equally well as described in the companion paper by Zhang et al. [this issue]; the pre-Landers data are too sparse and heterogeneous to derive a reliable noise model. Under a fractal white noise model for the PGGA data we find that the velocity differences for all three sites are statistically different at the 99% significance level. A white noise plus flicker noise model results in significance levels of only 94%, 43%, and 88%. Additional investigations of the pre-Landers data, and analysis of longer spans of PGGA data, could have an important effect on the significance of these results and will be addressed in future work. Copyright 1997 by the American Geophysical Union.

Bock, Y.; Wdowinski, S.; Fang, P.; Zhang, J.; Williams, S.; Johnson, H.; Behr, J.; Genrich, J.; Dean, J.; Van Domselaar, M.; Agnew, D.; Wyatt, F.; Stark, K.; Oral, B.; Hudnut, K.; King, R.; Herring, T.; Dinardo, S.; Young, W.; Jackson, D.; Gurtner, W.

1997-01-01

312

Late Holocene slip rate and recurrence of great earthquakes on the San Andreas fault in northern California  

SciTech Connect

The slip rate of the San Andreas fault 45 km north of San Francisco at Olema, California, is determined by matching offset segments of a buried late Holocene stream channel. Stream deposits from 1,800 ± 78 yr B.P. are offset 42.5 ± 3.5 m across the active (1906) fault trace for a minimum late Holocene slip rate of 24 ± 3 mm/yr. When local maximum coseismic displacements of 4.9 to 5.5 m from the 1906 earthquake are considered with this slip rate, the recurrence of 1906-type earthquakes on the North Coast segment of the San Andreas fault falls within the interval of 221 ± 40 yr. Both comparable coseismic slip in 1906 and similar late Holocene geologic slip rates at the Olema site and a site 145 km northwest at Point Arena (Prentice, 1989) suggest that the North Coast segment behaves as a coherent rupture unit.
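
The slip-rate and recurrence arithmetic in this record reduces to two divisions; the sketch below reproduces it, treating the quoted uncertainties as independent one-sigma errors (an assumption made for illustration).

```python
import numpy as np

# Values quoted in the record: offset and age of the buried channel,
# and the 1906 local maximum coseismic slip used for the recurrence estimate.
offset_m, offset_err = 42.5, 3.5          # metres
age_yr, age_err = 1800.0, 78.0            # years B.P.
coseismic_slip = np.array([4.9, 5.5])     # metres

slip_rate = offset_m / age_yr * 1000.0    # mm/yr
# First-order propagation for a ratio, assuming independent errors.
rate_err = slip_rate * np.hypot(offset_err / offset_m, age_err / age_yr)
print(f"Slip rate: {slip_rate:.1f} +/- {rate_err:.1f} mm/yr")

recurrence = coseismic_slip * 1000.0 / slip_rate   # years
print(f"Recurrence of 1906-type slip: {recurrence[0]:.0f}-{recurrence[1]:.0f} yr")
```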

Niemi, T.M. (Stanford Univ., CA (United States) Earth Sciences Associates, Palo Alto, CA (United States)); Hall, N.T. (Geomatrix Consultants, San Francisco, CA (United States))

1992-03-01

313

Testing time-predictable earthquake recurrence by direct measurement of strain accumulation and release  

NASA Astrophysics Data System (ADS)

Probabilistic estimates of earthquake hazard use various models for the temporal distribution of earthquakes, including the `time-predictable' recurrence model formulated by Shimazaki and Nakata (which incorporates the concept of elastic rebound described as early as 1910 by H. F. Reid). This model states that an earthquake occurs when the fault recovers the stress relieved in the most recent earthquake. Unlike time-independent models (for example, Poisson probability), the time-predictable model is thought to encompass some of the physics behind the earthquake cycle, in that earthquake probability increases with time. The time-predictable model is therefore often preferred when adequate data are available, and it is incorporated in hazard predictions for many earthquake-prone regions, including northern California, southern California, New Zealand and Japan. Here we show that the model fails in what should be an ideal locale for its application - Parkfield, California. We estimate rigorous bounds on the predicted recurrence time of the magnitude ~6 1966 Parkfield earthquake through inversion of geodetic measurements and we show that, according to the time-predictable model, another earthquake should have occurred by 1987. The model's poor performance in a relatively simple tectonic setting does not bode well for its successful application to the many areas of the world characterized by complex fault interactions.

Murray, Jessica; Segall, Paul

2002-09-01

314

Direct and indirect evidence for earthquakes; an example from the Lake Tahoe Basin, California-Nevada  

NASA Astrophysics Data System (ADS)

High-resolution seismic CHIRP data can image direct evidence of earthquakes (i.e., offset strata) beneath lakes and the ocean. Nevertheless, direct evidence often is not imaged due to conditions such as gas in the sediments, or steep basement topography. In these cases, indirect evidence for earthquakes (i.e., debris flows) may provide insight into the paleoseismic record. The four sub-basins of the tectonically active Lake Tahoe Basin provide an ideal opportunity to image direct evidence for earthquake deformation and compare it to indirect earthquake proxies. We present results from high-resolution seismic CHIRP surveys in Emerald Bay, Fallen Leaf Lake, and Cascade Lake to constrain the recurrence interval on the West Tahoe Dollar Point Fault (WTDPF), which was previously identified as potentially the most hazardous fault in the Lake Tahoe Basin. Recently collected CHIRP profiles beneath Fallen Leaf Lake image slide deposits that appear synchronous with slides in other sub-basins. The temporal correlation of slides between multiple basins suggests triggering by events on the WTDPF. If correct, we postulate a recurrence interval for the WTDPF of ~3-4 k.y., indicating that the WTDPF is near its seismic recurrence cycle. In addition, CHIRP data beneath Cascade Lake image strands of the WTDPF that offset the lakefloor as much as ~7 m. The Cascade Lake data combined with onshore LiDAR allowed us to map the geometry of the WTDPF continuously across the southern Lake Tahoe Basin and yielded an improved geohazard assessment.

Maloney, J. M.; Noble, P. J.; Driscoll, N. W.; Kent, G.; Schmauder, G. C.

2012-12-01

315

Linearized inversion for fault rupture behavior: Application to the 1984 Morgan Hill, California, earthquake  

Microsoft Academic Search

We present a technique to infer the rupture history of an earthquake from near-source records of ground motion. Unlike most previous studies, each point on the fault is assumed to slip only once, when the rupture front passes, with a spatially variable slip intensity. In this parameterization the data are linearly related to slip intensity but nonlinearly related to rupture

Gregory C. Beroza; Paul Spudich

1988-01-01

316

Chapter E. The Loma Prieta, California, Earthquake of October 17, 1989 - Hydrologic Disturbances  

USGS Publications Warehouse

Seismic events have long been known to cause changes in the level of oceans, streams, lakes, and the water table. The great San Francisco earthquake of 1906 induced significant hydrologic changes that were qualitatively similar to those changes observed for the Loma Prieta earthquake. What is different is that the hydrologic data sets collected from the Loma Prieta event have enough detail to enable hypotheses on the causes for these changes to be tested. The papers in this chapter document changes in ocean level, stream morphology and flow, water table height, and ground-water flow rates in response to the earthquake. Although hydrologic disturbances may have occurred about 1 hour before the main shock, the papers in this chapter deal strictly with postevent hydrologic changes. The hydrologic responses reported here reflect changes that are not the result of surface rupture. They appear to be the result of landslides, the static displacements induced by the earthquake, and changes in the permeability of the near surface.

Rojstaczer, Stuart A., (Edited By)

1994-01-01

317

Chapter B. The Loma Prieta, California, Earthquake of October 17, 1989 - Public Response  

USGS Publications Warehouse

Major earthquakes provide seismologists and engineers an opportunity to examine the performance of the Earth and the man-made structures in response to the forces of the quake. So, too, do they provide social scientists an opportunity to delve into human responses evoked by the ground shaking and its physical consequences. The findings from such research can serve to guide the development and application of programs and practices designed to reduce death, injury, property losses, and social disruption in subsequent earthquakes. This chapter contains findings from studies focused mainly on public response to the Loma Prieta earthquake; that is, on the behavior and perceptions of the general population rather than on the activities of specific organizations or on the impact on procedures or policies. A major feature of several of these studies is that the information was collected from the population throughout the Bay area, not just from persons in the most badly damaged communities or who had suffered the greatest losses. This wide range serves to provide comparisons of behavior for those most directly affected by the earthquake with others who were less directly affected by it but still had to consider it very 'close to home.'

Bolton, Patricia A., (Edited By)

1993-01-01

318

Earthquake Record of the Peninsula Segment of the San Andreas fault, Portola Valley, California  

Microsoft Academic Search

Previous paleoseismic studies on the Peninsula segment provide evidence of poorly constrained large magnitude earthquakes occurring in the late Holocene (Wright et al., 1999; Hall et al., 2001), and possibly in 1838 (Toppozada and Borchardt, 1998); whereas paleoseismic investigations on the Santa Cruz Mountains and North Coast segments provide moderately well constrained event chronology information over the last approximately 2,000

S. T. Sundermann; J. N. Baldwin; C. Prentice

2008-01-01

319

Monitoring velocity variations in the crust using earthquake doublets: An application to the Calaveras fault, California  

Microsoft Academic Search

We present a technique that greatly improves the precision in measuring temporal variations of crustal velocities using an earthquake doublet, or pair of microearthquakes that have nearly identical waveforms and the same hypocenter and magnitude but occur on different dates. We compute differences in arrival times between seismograms recorded at the same station in the frequency domain by cross correlation

G. Poupinet; V. L. Ellsworth; J. Frechet

1984-01-01

320

The SAFRR Tsunami Scenario: Improving Resilience for California from a Plausible M9 Earthquake near the Alaska Peninsula  

NASA Astrophysics Data System (ADS)

The SAFRR Tsunami Scenario models a hypothetical but plausible tsunami, created by an Mw9.1 earthquake occurring offshore from the Alaskan peninsula, and its impacts on the California coast. We present the likely inundation areas, current velocities in key ports and harbors, physical damage and repair costs, economic consequences, environmental impacts, social vulnerability, emergency management, and policy implications for California associated with the tsunami scenario. The intended users are those who must make mitigation decisions before and rapid decisions during future tsunamis. Around a half million people would be present in the scenario's inundation area in residences, businesses, public venues, parks and beaches. Evacuation would likely be ordered for the State of California's maximum mapped tsunami inundation zone, evacuating an additional quarter million people from residences and businesses. Some island and peninsula communities would face particular evacuation challenges because of limited access options and short warning time, caused by the distance between Alaska and California. Evacuations may also be a challenge for certain dependent-care populations. One third of the boats in California's marinas could be damaged or sunk, costing at least $700 million in repairs to boats and docks, and potentially much more to address serious issues due to sediment transport and environmental contamination. Fires would likely start at many sites where fuel and petrochemicals are stored in ports and marinas. Tsunami surges and bores may travel several miles inland up coastal rivers. Debris clean-up and recovery of inundated and damaged areas will take days, months, or years depending on the severity of impacts and the available resources for recovery. The Ports of Los Angeles and Long Beach (POLA/LB) would be shut down for a minimum of two days due to strong currents. Inundation of dry land in the ports would result in $100 million in damages to cargo and additional downtime. The direct exposure of port trade value totals over $1.2 billion, while associated business interruption losses in the California economy could more than triple that value. Other estimated damages include $1.8 billion of property damage and $85 million for highway and railroad repairs. In total, we have estimated repair and replacement costs of almost $3 billion to California marinas, coastal properties and the POLA/LB. These damages could cause $6 billion of business interruption losses in the California economy, but that could be reduced by 80-90% with the implementation of business continuity or resilience strategies. This scenario provides the basis for improving preparedness, mitigation, and continuity planning for tsunamis, which can reduce damage and economic impacts and enhance recovery efforts. Two positive outcomes have already resulted from the SAFRR Tsunami Scenario. Emergency managers in areas where the scenario inundation exceeds the State's maximum inundation zone have been notified and evacuation plans have been updated appropriately. The State has also worked with NOAA's West Coast and Alaska Tsunami Warning Center to modify future message protocols to facilitate effective evacuations in California. While our specific results pertain to California, the lessons learned and our scenario approach can be applied to other regions.

Ross, S.; Jones, L.; Wilson, R. I.; Bahng, B.; Barberopoulou, A.; Borrero, J. C.; Brosnan, D.; Bwarie, J.; Geist, E. L.; Johnson, L.; Kirby, S. H.; Knight, W.; Long, K.; Lynett, P. J.; Miller, K.; Mortensen, C. E.; Nicolsky, D.; Oglesby, D. D.; Perry, S. C.; Plumlee, G. S.; Porter, K. A.; Real, C. R.; Ryan, K. J.; Suleimani, E.; Thio, H. K.; Titov, V.; Wein, A. M.; Whitmore, P.; Wood, N. J.

2013-12-01

321

Present-day loading rate of faults in southern California and northern Baja California, Mexico, and post-seismic deformation following the M7.2 April 4, 2010, El Mayor-Cucapah earthquake from GPS Geodesy  

NASA Astrophysics Data System (ADS)

We use 142 GPS velocity estimates from the SCEC Crustal Motion Map 4 and 59 GPS velocity estimates from additional sites to model the crustal velocity field of southern California, USA, and northern Baja California, Mexico, prior to the 2010 April 4 Mw 7.2 El Mayor-Cucapah (EMC) earthquake. The EMC earthquake is the largest event to occur along the southern San Andreas fault system in nearly two decades. In the year following the EMC earthquake, the EarthScope Plate Boundary Observatory (PBO) constructed eight new continuous GPS sites in northern Baja California, Mexico. We used our velocity model, which represents the period before the EMC earthquake, to assess postseismic velocity changes at the new PBO sites. Time series from the new PBO sites, which were constructed 4-18 months following the earthquake do not exhibit obvious exponential or logarithmic decay, showing instead fairly secular trends through the period of our analysis (2010.8-2012.5). The weighted RMS misfit to secular rates, accounting for periodic site motions is typically around 1.7 mm/yr, indicating high positioning precision and fairly linear site motion. Results of our research include new fault slip rate estimates for the greater San Andreas fault system, including model faults representing the Cerro Prieto (39.0±0.1 mm/yr), Imperial (35.7±0.1 mm/yr), and southernmost San Andreas (24.7±0.1 mm/yr), generally consistent with previous geodetic studies within the region. Velocity changes at the new PBO sites associated with the EMC earthquake are in the range 1.7±0.3 to 9.2±2.6 mm/yr. The maximum rate difference is found in Mexicali Valley, close to the rupture. Rate changes decay systematically with distance from the EMC epicenter and velocity orientations exhibit a butterfly pattern as expected from a strike slip earthquake. Sites to the south and southwest of the Baja California shear zone are moving more rapidly to the northwest relative to their motions prior to the earthquake. Sites to the west of the Laguna Salada fault zone are moving more westerly. Sites to the east of the EMC rupture move more southerly than prior to the EMC earthquake. Continued monitoring of these velocity changes will allow us to differentiate between lower crustal and upper mantle relaxation processes.
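
Fault slip rates of this kind are commonly illustrated with the Savage and Burford (1973) screw-dislocation profile; the sketch below fits that model to synthetic fault-parallel velocities and is not the block model used in this study.

```python
import numpy as np
from scipy.optimize import curve_fit

def screw_dislocation(x_km, slip_rate, locking_depth):
    """Savage & Burford (1973) interseismic velocity profile across a
    strike-slip fault: v(x) = (s / pi) * arctan(x / D)."""
    return (slip_rate / np.pi) * np.arctan(x_km / locking_depth)

# Synthetic fault-perpendicular profile of fault-parallel GPS velocities (mm/yr),
# with invented slip rate, locking depth, and noise level.
rng = np.random.default_rng(7)
x = np.linspace(-100.0, 100.0, 40)      # km from the fault trace
obs = screw_dislocation(x, 35.0, 10.0) + rng.normal(0.0, 1.0, x.size)

popt, pcov = curve_fit(screw_dislocation, x, obs, p0=[20.0, 15.0])
print(f"Slip rate = {popt[0]:.1f} mm/yr, locking depth = {popt[1]:.1f} km")
```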

Spinler, J. C.; Bennett, R. A.

2012-12-01

322

Three dimensional images of geothermal systems: local earthquake P-wave velocity tomography at the Hengill and Krafla geothermal areas, Iceland, and The Geysers, California  

USGS Publications Warehouse

Local earthquake tomography - the use of earthquake signals to form a 3-dimensional structural image - is now a mature geophysical analysis method, particularly suited to the study of geothermal reservoirs, which are often seismically active and severely laterally inhomogeneous. Studies have been conducted of the Hengill (Iceland), Krafla (Iceland) and The Geysers (California) geothermal areas. All three systems are exploited for electricity and/or heat production, and all are highly seismically active. Tomographic studies of volumes a few km in dimension were conducted for each area using the method of Thurber (1983).

Julian, B.R.; Prisk, A.; Foulger, G.R.; Evans, J.R.

1993-01-01

323

Superficial simplicity of the 2010 El Mayor-Cucapah earthquake of Baja California in Mexico  

E-print Network

The geometry of faults is usually thought to be more complicated at the surface than at depth and to control the initiation, propagation and arrest of seismic ruptures. The fault system that runs from southern California ...

Herring, Thomas A.

324

Toward petascale earthquake simulations  

Microsoft Academic Search

Earthquakes are among the most complex terrestrial phenomena, and modeling of earthquake dynamics is one of the most challenging computational problems in science. Computational capabilities have advanced to a state where we can perform wavefield simulations for realistic three-dimensional earth models, and gain more insights into the earthquakes that threaten California and many areas of the world. The Southern California

Yifeng Cui; Reagan Moore; Kim Olsen; Amit Chourasia; Philip Maechling; Bernard Minster; Steven Day; Yuanfang Hu; Jing Zhu; Thomas Jordan

2009-01-01

325

Holocene paleoseismicity, temporal clustering, and probabilities of future large (M > 7) earthquakes on the Wasatch fault zone, Utah  

Microsoft Academic Search

The chronology of M>7 paleoearthquakes on the central five segments of the Wasatch fault zone (WFZ) is one of the best dated in the world and contains 16 earthquakes in the past 5600 years with an average repeat time of 350 years. Repeat times for individual segments vary by a factor of 2, and range from about 1200 to 2600

J. P. McCalpin; S. P. Nishenko

1996-01-01

326

Probability of chance correlations of earthquakes with predictions in areas of heterogeneous seismicity rate: the VAN case  

Microsoft Academic Search

Evaluations of 22 claims of successful earthquake predictions in Greece by Varotsos and Lazaridou [1991] were performed using the Ms (surface wave) as well as the ML (local) magnitude scales. If we assume that the predicted magnitudes were Ms (the scale was not specified in the prediction telegrams), and use the Preliminary Determinations of Epicenters (PDE) to estimate the seismicity

M. Wyss; A. Allmann

1996-01-01

327

Stress Drop and Its Uncertainty for Central California Earthquakes M 3.8-5.5  

NASA Astrophysics Data System (ADS)

In studies where stress drop (tau) is estimated from the displacement spectrum of body waves (Brune, 1970, 1971), the standard deviation of tau (std-tau) is around a factor of four (Cotton et al., 2013). The std-tau is primarily due to uncertainty in the corner frequency (fc) because tau is proportional to fc^3. An alternative approach is to compute tau based on the acceleration spectrum, using the root-mean-square amplitude (A-rms) for frequencies greater than fc (Hanks, 1979). To compare these two methods we use borehole recordings at the Hollister downhole array (Steidl, 2006); earthquakes have M>3.8 to ensure a good signal-to-noise ratio. The recordings at depth remove the near-surface attenuation. However, the whole-path attenuation must still be accounted for. To date we have analyzed 5 earthquakes (M 3.8-4.5) at hypocentral distance 8-17 km and 6 earthquakes (M 3.9-5.5) at hypocentral distance 26-76 km. We find that it is critically important to correct the spectrum for attenuation. For these earthquakes we used Q=Qo x f^0.6. We find an average t* (=travel time/Qo) near 5.5 for 8 of the earthquakes. After correcting for Q, the spectrum has an ω^2 shape for frequencies up to 70 Hz for the close distances and up to 40 Hz for the far distances. This bandwidth allows for a stable estimate of the rms acceleration. With the corrected spectrum we estimated fc, the low-frequency asymptote (to determine seismic moment), and the rms acceleration. Using the A-rms method we find nearly the same mean value of tau: 1.3 MPa and 1.0 MPa for close and far distances, with an uncertainty of a factor of 1.9 and 2.1, respectively, whereas the same spectra produce a Brune tau of 1.1 MPa and 0.3 MPa for close and far distances, with an uncertainty of a factor of 3.5 and 3.4, respectively. Our analysis suggests that we can use the A-rms method for estimating tau for distances as large as 76 km provided we correct the spectrum for whole-path attenuation. This analysis suggests that the intrinsic variability in earthquake stress drop is around a factor of 2.
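
To make the fc-cubed sensitivity concrete, here is a minimal sketch (not the authors' processing) of a Brune-type stress-drop estimate; the moment, corner frequency, shear-wave speed, and the constant k of about 0.37 are illustrative assumptions.

import numpy as np

def brune_stress_drop(M0_Nm, fc_Hz, beta_m_s=3500.0, k=0.37):
    # Brune (1970) source: radius r = k * beta / fc, stress drop = 7*M0 / (16*r**3).
    r = k * beta_m_s / fc_Hz
    return 7.0 * M0_Nm / (16.0 * r ** 3)

M0 = 2.0e15   # seismic moment in N*m (roughly M 4.1; illustrative)
fc = 2.0      # corner frequency in Hz (illustrative)
print(f"stress drop ~ {brune_stress_drop(M0, fc) / 1e6:.1f} MPa")

# Because stress drop scales as fc^3, a factor-of-1.6 error in the corner
# frequency maps into roughly a factor-of-4 error in stress drop.
for factor in (1 / 1.6, 1.0, 1.6):
    tau = brune_stress_drop(M0, fc * factor) / 1e6
    print(f"fc scaled by {factor:.2f} -> stress drop = {tau:.1f} MPa")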

Ding, L.; Crempien, J.; Archuleta, R. J.

2013-12-01

328

The 1994 Northridge earthquake sequence in California: Seismological and tectonic aspects  

NASA Astrophysics Data System (ADS)

The Mw 6.7 Northridge earthquake occurred on January 17, 1994, beneath the San Fernando Valley. Two seismicity clusters, located 25 km to the south and 35 km to the north-northwest, preceded the mainshock by 7 days and 16 hours, respectively. The mainshock hypocenter was relatively deep, at 19 km depth in the lower crust. It had a thrust faulting focal mechanism with a rake of 100° on a fault plane dipping 35° to the south-southwest and striking N75°W. Because the mainshock did not rupture the surface, its association with surficial geological features remains difficult to resolve. Nonetheless, its occurrence reemphasized the seismic hazard of concealed faults associated with the contractional deformation of the Transverse Ranges. The Northridge earthquake is part of the temporal increase in earthquake activity in the Los Angeles area since 1970. The mainshock was followed by an energetic aftershock sequence. Eight aftershocks of M ≥ 5.0 and 48 aftershocks of 4 ≤ M < 5 occurred between January 17 and September 30, 1994. The aftershocks extend over most of the western San Fernando Valley and Santa Susana Mountains. They form a diffuse spatial distribution around the mainshock rupture plane, illuminating a previously unmapped thrust ramp, extending from 7-10 km depth into the lower crust to a depth of 23 km. No flattening of the aftershock distribution is observed near its bottom. At shallow depths, above 7-10 km, the thrust ramp is topped by a dense distribution of aftershock hypocenters bounded by some of the surficial faults. The dip of the ramp increases from east to west. The west side of the aftershock zone is characterized by a dense, steeply dipping, and north-northeast striking planar cluster of aftershocks that exhibited mostly thrust faulting. These events coincided with the Gillibrand Canyon lateral ramp. Along the east side of the aftershock zone the aftershocks also exhibited primarily thrust faulting focal mechanisms. The focal mechanisms of the aftershocks were dominated by thrust faulting in the large aftershocks, with some strike-slip and normal faulting in the smaller aftershocks. The 1971 San Fernando and the 1994 Northridge earthquakes ruptured partially abutting fault surfaces on opposite sides of a ridge. Both earthquakes accommodated north-south contractional deformation of the Transverse Ranges. The two earthquakes differ primarily in the dip direction of the faults and the depth of faulting. The 1971 north-northeast trend of left-lateral faulting (Chatsworth trend) was not activated in 1994.

Hauksson, Egill; Jones, Lucile M.; Hutton, Kate

1995-07-01

329

Local Public Health System Response to the Tsunami Threat in Coastal California following the Tōhoku Earthquake  

PubMed Central

Background On Friday March 11, 2011 a 9.0 magnitude earthquake triggered a tsunami off the eastern coast of Japan, resulting in thousands of lives lost and billions of dollars in damage around the Pacific Rim. The tsunami first reached the California coast on Friday, March 11th, causing more than $70 million in damage and at least one death. While the tsunami’s impact on California pales in comparison to the destruction caused in Japan and other areas of the Pacific, the event tested emergency responders’ ability to rapidly communicate and coordinate a response to a potential threat. Methods To evaluate the local public health system emergency response to the tsunami threat in California, we surveyed all local public health, emergency medical services (EMS), and emergency management agencies in coastal or floodplain counties about several domains related to the tsunami threat in California, including: (1) the extent to which their community was affected by the tsunami, (2) when and how they received notification of the event, (3) which public health response activities were carried out to address the tsunami threat in their community, and (4) which organizations contributed to the response. Public health activities were characterized using the Centers for Disease Control and Prevention (CDC) Public Health Preparedness Capabilities (PHEP) framework. Findings The tsunami's impact on coastal communities in California ranged widely, both in terms of the economic consequences and the response activities. Based on estimates from the National Oceanic and Atmospheric Administration (NOAA), ten jurisdictions in California reported tsunami-related damage, which ranged from $15,000 to $35 million. Respondents first became aware of the tsunami threat in California between the hours of 10:00pm Pacific Standard Time (PST) on Thursday March 10th and 2:00pm PST on Friday March 11th, a range of 16 hours, with notification occurring through both formal and informal channels. In response to this threat, the activities most commonly reported by the local government agencies included in this study were: emergency public information and warning, emergency operations coordination, and inter-organizational information sharing, which were reported by 86%, 75%, and 65% of all respondents, respectively. When looking at the distribution of responsibility, emergency management agencies were the most likely to report assuming a lead role in these common activities as well as those related to evacuation and community recovery. While activated less frequently, public health agencies carried out emergency response functions related to surveillance and epidemiology, environmental health, and mental health/psychological support. Both local public health and EMS agencies took part in mass care and medical material management activities. A large network of organizations contributed to response activities, with emergency management, law enforcement, fire, public health, public works, EMS, and media cited by more than half of respondents. Conclusions In response to the tsunami threat in California, we found that emergency management agencies assumed a lead role in the local response efforts. While public health and medical agencies played a supporting role in the response, they uniquely contributed to a number of specific activities. If the response to the recent tsunami is any indication, these support activities can be anticipated in planning for future events with similar characteristics to the tsunami threat. 
Additionally, we found that many respondents first learned of the tsunami through the media, rather than through rapid notification systems, which suggests that government agencies must continue to develop and maintain the ability to rapidly aggregate and analyze information in order to provide accurate assessments and guidance to a potentially well-informed public.

Hunter, Jennifer C.; Crawley, Adam W.; Petrie, Michael; Yang, Jane E.; Aragón, Tomás J.

2012-01-01

330

Estimating the probability of occurrence of earthquakes (M>6) in the Western part of the Corinth rift using fault-based and classical seismotectonic approaches.  

NASA Astrophysics Data System (ADS)

The Corinth rift, Greece, is one of the regions with highest strain rates in the Euro-Mediterranean area and as such it has long been identified as a site of major importance for earthquake studies in Europe (20 years of research by the Corinth Rift Laboratory and 4 years of in-depth studies by the ANR-SISCOR project). This enhanced knowledge, acquired in particular in the western part of the Gulf of Corinth, an area about 50 by 40 km between the city of Patras to the west and the city of Aigion to the east, provides an excellent opportunity to compare fault-based and classical seismotectonic approaches currently used in seismic hazard assessment studies. A homogeneous earthquake catalogue was first constructed for the Greek territory based on two existing earthquake catalogues available for Greece (National Observatory of Athens and Thessaloniki). In spite of numerous documented damaging earthquakes, only a limited number of macroseismic intensity data points are available in the existing databases for the damaging earthquakes affecting the west Corinth rift region. A re-interpretation of the macroseismic intensity field for numerous events was thus conducted, following an in-depth analysis of existing and newly found documentation (for details see Rovida et al. EGU2014-6346). In parallel, the construction of a comprehensive database of all relevant geological, geodetical and geophysical information (available in the literature and recently collected within the ANR-SISCOR project) allowed us to propose rupture geometries for the different fault systems identified in the study region. The combination of the new earthquake parameters and the newly defined fault geometries, together with the existing published paleoseismic data, allowed us to propose a suite of rupture scenarios including the activation of multiple fault segments. The methodology used to achieve this goal consisted of setting up a logic tree that reflected the opinion of all the members of the ANR-SISCOR Working Group. On the basis of this consensual logic tree, median probabilities of occurrence of M>=6 events were computed for the region of study. Time-dependent models (Brownian Passage Time and Weibull probability distributions) were also explored. The probability of a M>=6.0 event is found to be greater in the western region compared to the eastern part of the Corinth rift, whether a fault-based or a classical seismotectonic approach is used. Percentile probability estimates are also provided to represent the range of uncertainties in the results. The percentile results show that, in general, probability estimates following the classical approach (based on the definition of seismotectonic source zones) cover the median values estimated following the fault-based approach. However, the fault-based approach in this region is still affected by a high degree of uncertainty because of the poor constraints on the 3D geometries of the faults and the high uncertainties in their slip rates.
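
As an illustration of how weighted logic-tree branches can be summarized by a median and a percentile range of occurrence probabilities, consider the sketch below; the branch probabilities and weights are invented placeholders, not the ANR-SISCOR results.

import numpy as np

branch_prob = np.array([0.05, 0.08, 0.12, 0.15, 0.20])    # P(M>=6 in 30 yr) per branch (hypothetical)
branch_weight = np.array([0.10, 0.25, 0.30, 0.25, 0.10])  # branch weights, summing to 1

def weighted_quantile(values, weights, q):
    # Quantile of a discrete weighted distribution whose weights sum to 1.
    order = np.argsort(values)
    v, w = values[order], weights[order]
    return v[np.searchsorted(np.cumsum(w), q)]

for q in (0.16, 0.50, 0.84):
    p = weighted_quantile(branch_prob, branch_weight, q)
    print(f"{int(q * 100):2d}th percentile: P(M>=6) = {p:.2f}")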

Boiselet, Aurelien; Scotti, Oona; Lyon-Caen, Hélène

2014-05-01

331

Non-shear focal mechanisms of earthquakes at The Geysers, California and Hengill, Iceland, geothermal areas  

USGS Publications Warehouse

Several thousand earthquakes were recorded in each area. We report an initial investigation of the focal mechanisms based on P-wave polarities. Distortion by complicated three-dimensional crustal structure was minimized using tomographically derived three-dimensional crustal models. Events with explosive and implosive source mechanisms, suggesting cavity opening and collapse, have been tentatively identified at The Geysers. The new data show that some of these events do not fit the model of tensile cracking accompanied by isotropic pore pressure decreases that was suggested in earlier studies, but that they may instead involve a combination of explosive and shear processes. However, the confirmation of earthquakes dominated by explosive components supports the model that the events are caused by crack opening induced by thermal contraction of the heat source.

Julian, B.R.; Miller, A.D.; Foulger, G.R.

1993-01-01

332

The 1987 Whittier Narrows earthquake in the Los Angeles metropolitan area, California  

USGS Publications Warehouse

The Whittier Narrows earthquake sequence (local magnitude, ML=5.9), which caused over $358 million in damage, indicates that assessments of earthquake hazards in the Los Angeles metropolitan area may be underestimated. The sequence ruptured a previously unidentified thrust fault that may be part of a large system of thrust faults that extends across the entire east-west length of the northern margin of the Los Angeles basin. Peak horizontal accelerations from the main shock, which were measured at ground level and in structures, were as high as 0.6g (where g is the acceleration of gravity at sea level) within 50 kilometers of the epicenter. The distribution of the modified Mercalli intensity VII reflects a broad north-south elongated zone of damage that is approximately centered on the main shock epicenter.

Hauksson, E.; Jones, L.M.; Davis, T.L.; Hutton, L.K.; Brady, A.G.; Reasenberg, P.A.; Michael, A.J.; Yerkes, R.F.; Williams, Pat; Reagor, G.; Stover, C.W.; Bent, A.L.; Shakal, A.K.; Etheredge, E.; Porcella, R.L.; Bufe, C.G.; Johnston, M.J.S.; Cranswick, E.

1988-01-01

333

Slip partitioning of the Calaveras Fault, California, and prospects for future earthquakes  

USGS Publications Warehouse

Examination of main shock and microearthquake data from the Calaveras Fault during the last 20 years reveals that main shock hypocenters occur at depths of 8-9 km near the base of the zone of microearthquakes. Microseismicity extends between depths of 4 and 10 km and defines zones of concentrated microseismicity and aseismic zones. Estimates of the fault regions which slipped during the Coyote Lake and Morgan Hill earthquakes as derived from seismic radiation coincide with zones which are otherwise aseismic. We propose that these persistent aseismic zones represent stuck patches which slip only during moderate earthquakes. From the pattern of microearthquake locations we recognize six aseismic zones where we expect future main shocks will rupture the Calaveras Fault. -from Authors

Oppenheimer, D.H.; Bakun, W.H.; Lindh, A.G.

1990-01-01

334

The Cape Mendocino, California, earthquakes of April 1992: Subduction at the triple junction  

USGS Publications Warehouse

The 25 April 1992 magnitude 7.1 Cape Mendocino thrust earthquake demonstrated that the North America-Gorda plate boundary is seismogenic and illustrated hazards that could result from much larger earthquakes forecast for the Cascadia region. The shock occurred just north of the Mendocino Triple Junction and caused strong ground motion and moderate damage in the immediate area. Rupture initiated onshore at a depth of 10.5 kilometers and propagated up-dip and seaward. Slip on steep faults in the Gorda plate generated two magnitude 6.6 aftershocks on 26 April. The main shock did not produce surface rupture on land but caused coastal uplift and a tsunami. The emerging picture of seismicity and faulting at the triple junction suggests that the region is likely to continue experiencing significant seismicity.

Oppenheimer, D.; Beroza, G.; Carver, G.; Dengler, L.; Eaton, J.; Gee, L.; Gonzalez, F.; Jayko, A.; Li, W.H.; Lisowski, M.; Magee, M.; Marshall, G.; Murray, M.; McPherson, R.; Romanowicz, B.; Satake, K.; Simpson, R.; Somerville, P.; Stein, R.; Valentine, D.

1993-01-01

335

Chapter D. The Loma Prieta, California, Earthquake of October 17, 1989 - Recovery, Mitigation, and Reconstruction  

USGS Publications Warehouse

The papers in this chapter reflect the broad spectrum of issues that arise following a major damaging urban earthquake: the regional economic consequences, rehousing problems, reconstruction strategies and policies, and opportunities for mitigation before the next major seismic event. While some of these papers deal with structural or physical science topics, their significant social and policy implications make them relevant for improving our understanding of the processes and dynamics that take place during the recovery period.

Nigg, Joanne M., (Edited By)

1998-01-01

336

Earthquake recurrence on the Calaveras fault east of San Jose, California  

USGS Publications Warehouse

Occurrence of small (3 ≤ ML < 4) earthquakes on two 10-km segments of the Calaveras fault between Calaveras and Anderson reservoirs follows a simple linear pattern of elastic strain accumulation and release. The centers of these independent patches of earthquake activity are 20 km apart. Each region is characterized by a constant rate of seismic slip as computed from earthquake magnitudes, and is assumed to be an isolated locked patch on a creeping fault surface. By calculating seismic slip rates and the amount of seismic slip since the time of the last significant (M ≥ 3) earthquake, it is possible to estimate the most likely date of the next (M ≥ 3) event on each patch. The larger the last significant event, the longer the time until the next one. The recurrence time also appears to be increased according to the moment of smaller (2 < ML < 3) events in the interim. The anticipated times of future larger events on each patch, on the basis of preliminary location data through May 1977 and estimates of interim activity, are tabulated below with standard errors. The occurrence time for the southern zone is based on eight recurrent events since 1969, the northern zone on only three. The 95% confidence limits can be estimated as twice the standard error of the projected least-squares line. Events of M ≥ 3 should not occur in the specified zones at times outside these limits. The central region between the two zones was the locus of two events (M = 3.6, 3.3) on July 3, 1977. These events occurred prior to a window based on the three-point, post-1969 slip-time line for the central region. {A table is presented}. © 1979.
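
The bookkeeping behind such an estimate can be sketched as follows (a hedged illustration, not the authors' calculation): the waiting time after a significant event is the coseismic slip of that event, plus the slip equivalent of interim smaller events, divided by the patch's seismic slip rate. The patch area, rigidity, slip rate, and event list below are assumptions.

MU = 3.0e10           # rigidity in Pa (assumed)
PATCH_AREA = 2.5e5    # locked patch area in m^2 (about 0.5 km x 0.5 km, assumed)
SLIP_RATE = 0.015     # long-term seismic slip rate on the patch in m/yr (assumed)

def moment_Nm(magnitude):
    # Seismic moment from magnitude (Hanks-Kanamori scaling, N*m).
    return 10.0 ** (1.5 * magnitude + 9.1)

def slip_m(magnitude):
    # Average slip if the event is confined to the patch: u = M0 / (mu * A).
    return moment_Nm(magnitude) / (MU * PATCH_AREA)

last_mag, last_year = 3.4, 1974.5     # hypothetical last significant event
interim_mags = [2.3, 2.6, 2.1]        # hypothetical smaller events since then

budget = slip_m(last_mag) + sum(slip_m(m) for m in interim_mags)
print(f"slip to recover: {budget * 100:.1f} cm")
print(f"next M>=3 event expected near {last_year + budget / SLIP_RATE:.1f}")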

Bufe, C.G.; Harsh, P.W.; Burford, R.O.

1979-01-01

337

A Double-Difference Earthquake Location Algorithm: Method and Application to the Northern Hayward Fault, California  

Microsoft Academic Search

We have developed an efficient method to determine high-resolution hypocenter locations over large distances. The location method incorporates ordinary absolute travel-time measurements and/or cross-correlation P- and S-wave differential travel-time measurements. Residuals between observed and theoretical travel-time differences (or double-differences) are minimized for pairs of earthquakes at each station while linking together all observed event-station pairs. A least-squares solution is found
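
The core least-squares step of the double-difference approach can be illustrated with a toy example (a sketch, not the published hypoDD code): a homogeneous 2-D velocity model, a single event pair, and no station weighting, damping, or origin-time terms.

import numpy as np

V = 5.0  # assumed constant velocity, km/s

stations = np.array([[0.0, 30.0], [25.0, -10.0], [-20.0, 15.0], [10.0, -25.0]])
true_ev = np.array([[1.0, 2.0], [2.5, 0.5]])   # "true" epicenters, km (unknown in practice)
guess = np.zeros((2, 2))                        # common starting location for both events

def tt(ev, sta):
    # Straight-ray travel time in a homogeneous medium.
    return np.linalg.norm(sta - ev) / V

# "Observed" differential times for the event pair at each station.
d_obs = np.array([tt(true_ev[0], s) - tt(true_ev[1], s) for s in stations])

for _ in range(5):   # a few Gauss-Newton iterations
    rows, resid = [], []
    for s, dob in zip(stations, d_obs):
        d_cal = tt(guess[0], s) - tt(guess[1], s)
        g1 = (guess[0] - s) / (np.linalg.norm(guess[0] - s) * V)   # d(tt)/d(x,y), event 1
        g2 = (guess[1] - s) / (np.linalg.norm(guess[1] - s) * V)   # d(tt)/d(x,y), event 2
        rows.append(np.hstack([g1, -g2]))
        resid.append(dob - d_cal)
    dm, *_ = np.linalg.lstsq(np.array(rows), np.array(resid), rcond=None)
    guess = guess + dm.reshape(2, 2)

# Only the relative vector is constrained by double differences; absolute
# positions retain the arbitrary starting centroid.
print("recovered relative vector (event 1 - event 2):", np.round(guess[0] - guess[1], 2))
print("true relative vector:", true_ev[0] - true_ev[1])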

Felix Waldhauser; William L. Ellsworth

2000-01-01

338

Chapter A. The Loma Prieta, California, Earthquake of October 17, 1989 - Loss Estimation and Procedures  

USGS Publications Warehouse

This Professional Paper includes a collection of papers on subjects ranging from evaluation of building safety, to human injuries, to correlation of ground deformation with building damage. What these papers share is a common goal to improve the tools available to the research community to measure the nature, extent, and causes of damage and losses due to earthquakes. These measurement tools are critical to reducing future loss.

Tubbesing, Susan K., (Edited By)

1994-01-01

339

Chapter A. The Loma Prieta, California, Earthquake of October 17, 1989 - Main Shock Characteristics  

USGS Publications Warehouse

The October 17, 1989, Loma Prieta, Calif., earthquake (0004:15.2 G.m.t. October 18; lat 37.036° N., long 121.883° W.; 19-km depth) had a local magnitude (ML) of about 6.7, a surface-wave magnitude (MS) of 7.1, a seismic moment of 2.2×10^19 N-m to 3.5×10^19 N-m, a source duration of 6 to 15 s, and an average stress drop of at least 50 bars. Slip occurred on a dipping fault surface about 35 km long and was largely confined to a depth of about 7 to 20 km. The slip vector had a large vertical component, and slip was distributed in two main regions situated northwest and southeast of the hypocenter. This slip distribution caused about half of the earthquake's energy to be focused toward the urbanized San Francisco Bay region, while the other half was focused toward the southeast. Had the rupture initiated at the southeast end of the aftershock zone, shaking in the bay region would have been both longer and stronger. These source parameters suggest that the earthquake was not a typical shallow San Andreas-type event but a deeper event on a different fault with a recurrence interval of many hundreds of years. Therefore, the potential for a damaging shallow event on the San Andreas fault in the Santa Cruz Mountains may still exist.
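
For reference, the quoted seismic-moment range converts to moment magnitude with the standard Hanks-Kanamori relation Mw = (2/3)(log10 M0 - 9.1), with M0 in N-m (a routine conversion, not a calculation from the paper).

import math

for M0 in (2.2e19, 3.5e19):
    Mw = (2.0 / 3.0) * (math.log10(M0) - 9.1)
    print(f"M0 = {M0:.1e} N*m -> Mw = {Mw:.2f}")
# Gives Mw of roughly 6.8-7.0, consistent with the surface-wave magnitude of 7.1.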

Spudich, Paul, (Edited By)

1996-01-01

340

U.S.Geological Survey Grant No. 01HQGR0018 EARTHQUAKE POTENTIAL OF MAJOR FAULTS OFFSHORE SOUTHERN CALIFORNIA  

E-print Network

U.S. Geological Survey Grant No. 01HQGR0018, Earthquake Potential of Major Faults Offshore Southern California. Program Elements: I-Products for Earthquake Loss Reduction; II-Research on Earthquake Occurrence

Goldfinger, Chris

341

The 2007 M5.4 Alum Rock, California, earthquake: Implications for future earthquakes on the central and southern Calaveras Fault  

Microsoft Academic Search

The similarity of seismograms recorded by two seismic stations demonstrates that the 31 October 2007 moment magnitude M5.4 Alum Rock earthquake is a repeat of a 1955 ML5.5 earthquake. Both occurred on Oppenheimer et al.'s (1990) Zone V "stuck patch" on the central Calaveras fault, providing new support for their model of Calaveras fault earthquake activity. We suggest that Zone

David H. Oppenheimer; William H. Bakun; Tom Parsons; Robert W. Simpson; John Boatwright; Robert A. Uhrhammer

2010-01-01

343

Interaction of the san jacinto and san andreas fault zones, southern california: triggered earthquake migration and coupled recurrence intervals.  

PubMed

Two lines of evidence suggest that large earthquakes that occur on either the San Jacinto fault zone (SJFZ) or the San Andreas fault zone (SAFZ) may be triggered by large earthquakes that occur on the other. First, the great 1857 Fort Tejon earthquake in the SAFZ seems to have triggered a progressive sequence of earthquakes in the SJFZ. These earthquakes occurred at times and locations that are consistent with triggering by a strain pulse that propagated southeastward at a rate of 1.7 kilometers per year along the SJFZ after the 1857 earthquake. Second, the similarity in average recurrence intervals in the SJFZ (about 150 years) and in the Mojave segment of the SAFZ (132 years) suggests that large earthquakes in the northern SJFZ may stimulate the relatively frequent major earthquakes on the Mojave segment. Analysis of historic earthquake occurrence in the SJFZ suggests little likelihood of extended quiescence between earthquake sequences. PMID:17818388
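
The propagation relation described above amounts to simple arithmetic: a disturbance leaving the 1857 rupture and migrating along the San Jacinto fault zone at 1.7 km/yr reaches a point d kilometers down strike near year 1857 + d/1.7. The distances in the sketch below are illustrative, not the paper's data.

RATE_KM_PER_YR = 1.7   # migration rate inferred in the abstract

for d_km in (30, 60, 90, 120):   # hypothetical distances along strike from the 1857 rupture
    year = 1857 + d_km / RATE_KM_PER_YR
    print(f"{d_km:3d} km down strike -> predicted triggering year ~ {year:.0f}")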

Sanders, C O

1993-05-14

344

Avian Flu / Earthquake Prediction  

NSDL National Science Digital Library

This radio broadcast includes a discussion of the avian flu spreading though Southeast Asia, Russia and parts of Europe. Topics include whether the outbreak is a pandemic in the making, and what preparations might be made to control the outbreak. The next segment of the broadcast discusses earthquake prediction, in light of the 2005 earthquake in Pakistan. Two seismologists discuss what was learned in the Parkfield project, an experiment in earthquake prediction conducted in California. Other topics include the distribution of large versus small earthquakes; how poor construction magnifies earthquake devastation; and the relationship of plate tectonics to the Pakistan earthquake.

345

Coseismic and Initial Postseismic Deformation from the 2004 Parkfield, California, Earthquake, Observed by Global Positioning System, Electronic Distance Meter, Creepmeters, and Borehole Strainmeters  

Microsoft Academic Search

Global Positioning System (GPS), electronic distance meter, creepmeter, and strainmeter measurements spanning the M 6.0 Parkfield, California, earthquake are examined. Using these data from 100 sec through 9 months following the mainshock, Omori's law, with rate inversely related to time as 1/t^p and p ranging between 0.7 and 1.3, characterizes the time-dependent deformation during the postseismic period;

J. Langbein; J. R. Murray; H. A. Snyder

2006-01-01

346

The Pulse Azimuth effect as seen in induction coil magnetometers located in California and Peru 2007-2010, and its possible association with earthquakes  

Microsoft Academic Search

The QuakeFinder network of magnetometers has recorded geomagnetic field activity in California since 2000. Established as an effort to follow up observations of ULF activity reported from before and after the M = 7.1 Loma Prieta earthquake in 1989 by Stanford University, the QuakeFinder network has over 50 sites, fifteen of which are high-resolution QF1005 and QF1007 systems. Pairs of

J. C. Dunson; T. E. Bleier; S. Roth; J. Heraud; C. H. Alvarez; A. Lira

2011-01-01

347

INVERSION OF STRONG GROUND MOTION AND TELESEISMIC WAVEFORM DATA FOR THE FAULT RUPTURE HISTORY OF THE 1979 IMPERIAL VALLEY, CALIFORNIA, EARTHQUAKE  

Microsoft Academic Search

A least-squares point-by-point inversion of strong ground motion and teleseismic body waves is used to infer the fault rupture history of the 1979 Imperial Valley, California, earthquake. The Imperial fault is represented by a plane embedded in a half-space where the elastic properties vary with depth. The inversion yields both the spatial and temporal variations in dislocation on the

STEPHEN H. HARTZELL; THOMAS H. HEATON

1983-01-01

348

Survey of strong motion earthquake effects on thermal power plants in California with emphasis on piping systems. Volume 1, Main report  

SciTech Connect

Since 1982, there has been a major effort expended to evaluate the susceptibility of nuclear power plant equipment to failure and significant damage during seismic events. This was done by making use of data on the performance of electrical and mechanical equipment in conventional power plants and other similar industrial facilities during strong motion earthquakes. This report is intended as an extension of the seismic experience data collection effort and a compilation of experience data specific to power plant piping and supports, designed and constructed to US power piping code requirements, which have experienced strong motion earthquakes. Eight damaging (Richter magnitude 5.5 to 7.7) California earthquakes and their effects on 8 power generating facilities in California were reviewed. All of these facilities were visited and evaluated. Seven fossil-fueled (dual use natural gas and oil) plants and one nuclear-fueled plant, consisting of a total of 36 individual boiler or reactor units, were investigated. Peak horizontal ground accelerations that either had been recorded on site at these facilities or were considered applicable to these power plants on the basis of nearby recordings ranged between 0.20g and 0.51g, with strong motion durations which varied from 3.5 to 15 seconds. Most US nuclear power plants are designed for a safe shutdown earthquake peak ground acceleration equal to 0.20g or less with strong motion durations which vary from 10 to 15 seconds.

Stevenson, J.D. [Stevenson and Associates, Cleveland, OH (United States)

1995-11-01

349

Three-Dimensional Geologic Map of Northern California: A Foundation for Earthquake Simulations and Other Predictive Modeling  

NASA Astrophysics Data System (ADS)

Detailed, realistic models of the subsurface are needed for predicting damage patterns from future earthquakes and simulating other phenomena affecting human safety and well being. The simple models used in the past are no longer adequate. In support of a planned simulation of the ground shaking from the Great 1906 San Francisco earthquake, we constructed a three-dimensional (3D) geologic map of northern California that consists of specific geologic units separated by discrete boundaries. It is based on a century of geologic mapping, 50 years of gravity and magnetic surveying, double-difference relocated seismicity, seismic soundings, P-wave tomography, and well logs. The map is a rules-based construction composed of faults that break the map volume into fault blocks, which in turn are populated with geologic units defined by surfaces that represent their tops. The rules define how the faults and tops truncate one another. The map is easily updated as new information becomes available. The 3D map is made up of two related parts. An inner detailed map of central California centered on San Francisco extends from Clear Lake to Monterey, from the edge of the continental shelf to the western Great Valley, and to a depth of 45 km. This is embedded in a less detailed regional map that extends from north of Cape Mendocino to Parkfield, from the ocean basin to the foothills of the Sierra Nevada and Cascade Ranges, and also to a depth of 45 km. The detailed map volume is broken by 25 major faults including the active San Andreas, Hayward, and Calaveras faults. The fault blocks are populated with geologic units in the following groups: water, Plio-Quaternary deposits, Tertiary (or undifferentiated Cenozoic) sedimentary and volcanic deposits, Mesozoic sedimentary or plutonic rocks, mafic lower crust, and mantle rocks. The primary purpose of the regional map is: 1) to provide coverage of the entire reach of the San Andreas Fault that ruptured in 1906 (including the major bedrock units that occupy the fault faces); and 2) to provide a consistent `buffer' surrounding the detailed map to minimize modeling artifacts from boundary discontinuities. The regional map includes major Mesozoic and Tertiary bedrock units, a representation of the Great Valley sedimentary fill, the mafic lower crust, and the mantle. The 3D map was assigned physical properties (seismic wave velocities, densities, and intrinsic attenuations) according to geologic unit and depth, and provided to the seismic-wave modeling community. Successful simulations of ground shaking from the Great 1906 San Francisco earthquake and the 1989 Loma Prieta earthquake based on this 3D map specifically highlighted the role of sedimentary basins in amplifying and prolonging ground shaking, and more generally illustrated the benefits of a `geologic' approach for producing realistic earth models to support predictive process modeling. The present 3D map and its derivative physical property model are appropriate for incorporation into a statewide community fault model and a statewide seismic velocity model.

Jachens, R. C.; Simpson, R. W.; Graymer, R. W.; Wentworth, C. M.; Brocher, T. M.

2006-12-01

350

Deformation from the 1989 Loma Prieta earthquake near the southwest margin of the Santa Clara Valley, California  

USGS Publications Warehouse

To gain additional measurement of any permanent ground deformation that accompanied this damage, we compiled and conducted post-earthquake surveys along two 5-km lines of horizontal control and a 15-km level line. Measurements of horizontal distortion indicate approximately 0.1 m of shortening in a NE-SW direction across the valley margin, similar to the amount measured in the channel lining. Evaluation of precise leveling by the National Geodetic Survey showed a downwarp, with an amplitude of >0.1 m over a span of >12 km, that resembled regional geodetic models of coseismic deformation. Although the leveling indicates broad, regional warping, abrupt discontinuities characteristic of faulting characterize both the broad-scale distribution of damage and the local deformation of the channel lining. Reverse movement, largely along preexisting faults and probably enhanced significantly by warping combined with enhanced ground shaking, produced the documented coseismic ground deformation.

Schmidt, Kevin M.; Ellen, Stephen D.; Peterson, David M.

2014-01-01

351

Situated Preparedness: The Negotiation of a Future Catastrophic Earthquake in a California University  

ERIC Educational Resources Information Center

This dissertation examines disaster preparedness as engaged at a large university in southern California using inductive research and grounded theory data collection and analysis methods. The thesis consists of three parts, all addressing the problem of disaster preparedness as enacted in this at-risk context. I use in-depth interviews, archival…

Baker, Natalie Danielle

2013-01-01

352

Using Logistic Regression to Predict the Probability of Debris Flows in Areas Burned by Wildfires, Southern California, 2003-2006  

USGS Publications Warehouse

Logistic regression was used to develop statistical models that can be used to predict the probability of debris flows in areas recently burned by wildfires by using data from 14 wildfires that burned in southern California during 2003-2006. Twenty-eight independent variables describing the basin morphology, burn severity, rainfall, and soil properties of 306 drainage basins located within those burned areas were evaluated. The models were developed as follows: (1) Basins that did and did not produce debris flows soon after the 2003 to 2006 fires were delineated from data in the National Elevation Dataset using a geographic information system; (2) Data describing the basin morphology, burn severity, rainfall, and soil properties were compiled for each basin. These data were then input to a statistics software package for analysis using logistic regression; and (3) Relations between the occurrence or absence of debris flows and the basin morphology, burn severity, rainfall, and soil properties were evaluated, and five multivariate logistic regression models were constructed. All possible combinations of independent variables were evaluated to determine which combinations produced the most effective models, and the multivariate models that best predicted the occurrence of debris flows were identified. Percentage of high burn severity and 3-hour peak rainfall intensity were significant variables in all models. Soil organic matter content and soil clay content were significant variables in all models except Model 5. Soil slope was a significant variable in all models except Model 4. The most suitable model can be selected from these five models on the basis of the availability of independent variables in the particular area of interest and field checking of probability maps. The multivariate logistic regression models can be entered into a geographic information system, and maps showing the probability of debris flows can be constructed in recently burned areas of southern California. This study demonstrates that logistic regression is a valuable tool for developing models that predict the probability of debris flows occurring in recently burned landscapes.
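
A minimal sketch of this kind of model is given below, assuming scikit-learn and synthetic basin data; the predictors echo those named above (percent high burn severity, 3-hour peak rainfall intensity, a soil property), but the coefficients and data are invented, not the published models.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 300
X = np.column_stack([
    rng.uniform(0, 80, n),    # percent of basin burned at high severity
    rng.uniform(5, 40, n),    # 3-hour peak rainfall intensity, mm/h
    rng.uniform(1, 10, n),    # soil organic matter, percent
])
# Synthetic "truth": higher burn severity and rainfall raise the odds of a debris flow.
true_logit = -6.0 + 0.05 * X[:, 0] + 0.12 * X[:, 1] - 0.1 * X[:, 2]
y = rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))

model = LogisticRegression(max_iter=1000).fit(X, y)
basin = np.array([[60.0, 25.0, 3.0]])   # one hypothetical recently burned basin
print(f"modeled debris-flow probability: {model.predict_proba(basin)[0, 1]:.2f}")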

Rupert, Michael G.; Cannon, Susan H.; Gartner, Joseph E.; Michael, John A.; Helsel, Dennis R.

2008-01-01

353

Earthquake stress drops and inferred fault strength on the Hayward Fault, east San Francisco Bay, California  

USGS Publications Warehouse

We study variations in earthquake stress drop with respect to depth, faulting regime, creeping versus locked fault behavior, and wall-rock geology. We use the P-wave displacement spectra from borehole seismic recordings of M 1.0-4.2 earthquakes in the east San Francisco Bay to estimate stress drop using a stack-and-invert empirical Green's function method. The median stress drop is 8.7 MPa, and most stress drops are in the range between 0.4 and 130 MPa. An apparent correlation between stress drop and magnitude is entirely an artifact of the limited frequency band of 4-55 Hz. There is a trend of increasing stress drop with depth, with a median stress drop of ~5 MPa for 1-7 km depth, ~10 MPa for 7-13 km depth, and ~50 MPa deeper than 13 km. We use S/P amplitude ratios measured from the borehole records to better constrain the first-motion focal mechanisms. High stress drops are observed for a deep cluster of thrust-faulting earthquakes. The correlation of stress drops with depth and faulting regime implies that stress drop is related to the applied shear stress. We compare the spatial distribution of stress drops on the Hayward fault to a model of creeping versus locked behavior of the fault and find that high stress drops are concentrated around the major locked patch near Oakland. This also suggests a connection between stress drop and applied shear stress, as the locked patch may experience higher applied shear stress as a result of the difference in cumulative slip or the presence of higher-strength material. The stress drops do not directly correlate with the strength of the proposed wall-rock geology at depth, suggesting that the relationship between fault strength and the strength of the wall rock is complex.

Hardebeck, J.L.; Aron, A.

2009-01-01

354

Predicting catastrophic earthquakes  

NSDL National Science Digital Library

This resource provides an abstract. This article discusses a method based on the magnitude-frequency distribution of previous earthquakes in a region. It is used to examine the probability of a small earthquake growing into a catastrophic one. When a small earthquake is detected in a region where a catastrophic one is expected, early warning systems can be modified to determine the probability that this earthquake will grow in magnitude. It was found that if the observed earthquake magnitude reaches 6.5, the estimated probability that the final magnitude will reach 7.5 is between 25 and 41 percent.
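
For comparison, a naive Gutenberg-Richter baseline for the same conditional probability (not the article's region-specific method) is simply 10^(-b(m_big - m_now)).

def p_grow(m_now, m_big, b=1.0):
    # P(final magnitude >= m_big | magnitude already >= m_now) under a pure
    # Gutenberg-Richter (exponential) magnitude distribution with b-value b.
    return 10.0 ** (-b * (m_big - m_now))

print(f"plain G-R, b = 1: P(final M >= 7.5 | M >= 6.5) = {p_grow(6.5, 7.5):.2f}")
# About 0.10, noticeably below the 25-41 percent found with the article's
# region-specific magnitude-frequency distribution.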

Iwata et al.

355

Triggered surface slips in southern California associated with the 2010 El Mayor-Cucapah, Baja California, Mexico, earthquake  

USGS Publications Warehouse

Triggered slip in the Yuha Desert area occurred along more than two dozen faults, only some of which were recognized before the April 4, 2010, El Mayor-Cucapah earthquake. From east to northwest, slip occurred in seven general areas: (1) in the Northern Centinela Fault Zone (newly named), (2) along unnamed faults south of Pinto Wash, (3) along the Yuha Fault (newly named), (4) along both east and west branches of the Laguna Salada Fault, (5) along the Yuha Well Fault Zone (newly revised name) and related faults between it and the Yuha Fault, (6) along the Ocotillo Fault (newly named) and related faults to the north and south, and (7) along the southeasternmost section of the Elsinore Fault. Faults that slipped in the Yuha Desert area include northwest-trending right-lateral faults, northeast-trending left-lateral faults, and north-south faults, some of which had dominantly vertical offset. Triggered slip along the Ocotillo and Elsinore Faults appears to have occurred only in association with the June 14, 2010 (Mw5.7), aftershock. This aftershock also resulted in slip along other faults near the town of Ocotillo. Triggered offset on faults in the Yuha Desert area was mostly less than 20 mm, with three significant exceptions, including slip of about 50–60 mm on the Yuha Fault, 40 mm on a fault south of Pinto Wash, and about 85 mm on the Ocotillo Fault. All triggered slips in the Yuha Desert area occurred along preexisting faults, whether previously recognized or not.

Rymer, Michael J.; Treiman, Jerome A.; Kendrick, Katherine J.; Lienkaemper, James J.; Weldon, Ray J.; Bilham, Roger; Wei, Meng; Fielding, Eric J.; Hernandez, Janis L.; Olson, Brian P.E.; Irvine, Pamela J.; Knepprath, Nichole; Sickler, Robert R.; Tong, Xiaopeng; Siem, Martin E.

2011-01-01

356

Structural Constraints and Earthquake Recurrence Estimates for the West Tahoe-Dollar Point Fault, Lake Tahoe Basin, California  

NASA Astrophysics Data System (ADS)

Previous work in the Lake Tahoe Basin (LTB), California, identified the West Tahoe-Dollar Point Fault (WTDPF) as the most hazardous fault in the region. Onshore and offshore geophysical mapping delineated three segments of the WTDPF extending along the western margin of the LTB. The rupture patterns between the three WTDPF segments remain poorly understood. Fallen Leaf Lake (FLL), Cascade Lake, and Emerald Bay are three sub-basins of the LTB, located south of Lake Tahoe, that provide an opportunity to image primary earthquake deformation along the WTDPF and associated landslide deposits. We present results from recent (June 2011) high-resolution seismic CHIRP surveys in FLL and Cascade Lake, as well as complete multibeam swath bathymetry coverage of FLL. Radiocarbon dates obtained from the new piston cores acquired in FLL provide age constraints on the older FLL slide deposits and build on and complement previous work that dated the most recent event (MRE) in Fallen Leaf Lake at ~4.1-4.5 k.y. BP. The CHIRP data beneath FLL image slide deposits that appear to correlate with contemporaneous slide deposits in Emerald Bay and Lake Tahoe. A major slide imaged in FLL CHIRP data is slightly younger than the Tsoyowata ash (7950-7730 cal yrs BP) identified in sediment cores and appears synchronous with a major Lake Tahoe slide deposit (7890-7190 cal yrs BP). The equivalent age of these slides suggests the penultimate earthquake on the WTDPF may have triggered them. If correct, we postulate a recurrence interval of ~3-4 k.y. These results suggest the FLL segment of the WTDPF is near its seismic recurrence cycle. Additionally, CHIRP profiles acquired in Cascade Lake image the WTDPF for the first time in this sub-basin, which is located near the transition zone between the FLL and Rubicon Point Sections of the WTDPF. We observe two fault-strands trending N45°W across southern Cascade Lake for ~450 m. The strands produce scarps of ~5 m and ~2.7 m, respectively, on the lake floor, but offset increases down-section to ~14 m and ~8 m at the acoustic basement. Studying the style and timing of earthquake deformation in Fallen Leaf Lake, Cascade Lake, Emerald Bay and Lake Tahoe will help us to understand how strain is partitioned between adjacent segments and the potential rupture magnitude.

Maloney, J. M.; Driscoll, N. W.; Kent, G.; Brothers, D. S.; Baskin, R. L.; Babcock, J. M.; Noble, P. J.; Karlin, R. E.

2011-12-01

357

The 130-km-long Green Valley Fault Zone of Northern California: Discontinuities Regulate Its Earthquake Recurrence  

NASA Astrophysics Data System (ADS)

The Green Valley fault (GVF), a branch of the dextral strike-slip San Andreas fault system, connects the Northern Calaveras fault (NCF) to the Bartlett Springs fault (BSF) to the north. Although the GVF may occasionally rupture along its entire length to produce M7 earthquakes, 2-3 km discontinuities in its trace appear to modulate the length and frequency of ruptures. The global historical earthquake record suggests that ruptures tend to stop at such fault discontinuities (1-4 km steps) about half the time (Wesnousky and Biasi, 2011). The GVF has three sections: the 77-km-long southern GVF (SGVF), the 25-km Berryessa (BF), and the 30-km Hunting Creek (HCF). The SGVF itself could produce large (M6.7) events, and the BF and HCF somewhat smaller events (M6.3-6.6). The BF is centered on a compressional pop-up structure. It is separated to the north from the HCF by a ~2.5-3 km extensional stepover and to the south from the SGVF by a ~2.5-3 km extensional bend. At its south end, the GVF is separated from the NCF by the 5-km Alamo stepover, which is likely to stop all ruptures; and at its north end the GVF (HCF section) makes a 2.5 km right step to the BSF at Wilson Valley. The HCF apparently forms a significant transition between the BSF and the GVF. The overall trend of the GVF bends ~17° through the HCF and emerges on the BSF trend. Thus, this bend, along with the Wilson Valley step-over, would tend to inhibit ruptures between the BSF and sections of the GVF. Creep rates along most of the GVF (SGVF, HCF) range from 1 to 4 mm/yr. No creep is known for the BF section, but its microseismicity levels are similar to creeping parts of the GVF and BSF, so we assume that the BF may creep too. We estimate the slip rate on the GVF to be 6±2 mm/yr by interpolating rates on the BSF and the NCF. Lienkaemper and Brown (2009) estimated ~6.5 mm/yr for the average deep loading rate on the BSF using a rigid block model of the USGS-GPS site velocities observed in the central BSF. This rate is comparable to the 6 mm/yr Holocene slip rate observed on the NCF (Kelson et al., 1996). Microearthquakes on the GVF reach a depth of ~14 km. Using methods of Savage and Lisowski (1993) for the GVF suggests that creep may on average extend to depths of ~7.5 km, leaving a width of ~6.5 km of locked fault zone below. Trenching on the SGVF indicates 400 (±50) years have elapsed since the most recent large earthquake (MRE) in 1610±50 yr CE. Previous earthquake recurrence intervals (RI) in the past millennium indicate a mean RI of 200±80 yr (μ±σ) for the SGVF, which is much shorter than the 400-yr open interval. Preliminary evidence from trenching on the BF gives an MRE of 1630±100 yr CE, which may thus coincide with the MRE on the SGVF. If the MRE on the BF and SGVF sections is the same earthquake, then its expected larger size (M~6.9-7.0 vs 6.7) and greater fault complexity may have produced a large stress drop, which would possibly help explain the current long open interval. The SGVF paleoseismic recurrence model is consistent with a simple probabilistic rupture model (i.e., 50%-probable rupture across 1-4 km steps) and with a Brownian Passage Time recurrence model with a mean RI of 250 yr, CV (coefficient of variation, σ/μ) of 0.6, and a 30-yr rupture probability of 20-25%.
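
The quoted 30-yr probability can be sketched with a Brownian Passage Time (inverse-Gaussian) renewal calculation, assuming SciPy and the parameters given above (mean recurrence 250 yr, coefficient of variation 0.6, roughly 400 yr elapsed); this is an illustrative reproduction of the arithmetic, not the author's code.

from scipy.stats import invgauss

mean_ri, cv = 250.0, 0.6        # mean recurrence interval (yr) and aperiodicity
elapsed, window = 400.0, 30.0   # open interval since the MRE and forecast window (yr)

# A BPT model with mean m and aperiodicity a is an inverse Gaussian with mean m
# and shape m / a**2; in SciPy's parameterization: mu = a**2, scale = m / a**2.
dist = invgauss(mu=cv ** 2, scale=mean_ri / cv ** 2)

p_cond = (dist.cdf(elapsed + window) - dist.cdf(elapsed)) / dist.sf(elapsed)
print(f"conditional 30-yr rupture probability ~ {p_cond:.2f}")
# Comes out near 0.2 for these inputs, consistent with the 20-25% range quoted above.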

Lienkaemper, J. J.

2012-12-01

358

Reply to “Probability of earthquake occurrence in Greece with special reference to the VAN predictions,” by Y. Honkura and N. Tanaka  

NASA Astrophysics Data System (ADS)

The calculation of Honkura and Tanaka [1996] and that of Aceves et al. [1996] provide important tools for the clarification of the main question of this issue, i.e., whether or not the VAN predictions can be ascribed to chance. Honkura and Tanaka's [1996] calculation showed that, in a circular area with a radius of 120 km and for the time window of 22 days, the probability P of occurrence of an earthquake (EQ) with Ms ≥ 5.0 in Greece is less than 0.25, and even smaller for a time window of 11 days. For larger magnitude thresholds, i.e., Ms ≥ 5.5 or Ms ≥ 5.8 (and in view of the Gutenberg-Richter relation), their P-value has to be drastically smaller. A simple comparison of these P-values with the Tables of Mulargia and Gasperini [1992, 1996a] immediately reveals that VAN predictions cannot be ascribed to chance. Note that an inspection of the latter Tables leads to (i) the VAN success rate is 40-45% (when considering correlations with earthquakes having MEQ ≥ 5.0 only), and (ii) the VAN alarm rates increase with the (earthquake) magnitude threshold, reaching values of 50% and 60% for MEQ ≥ 5.5 and MEQ ≥ 5.8, respectively. Another important point, which emerges from the calculation of Honkura and Tanaka [1996], is that “aftershocks must be treated carefully.” This strengthens our remarks in Principles 4 and 5 of Varotsos et al. [1996a] that the inappropriate treatment of aftershocks in Mulargia and Gasperini’s [1992] calculation (which was based on a Poisson distribution): (i) changed drastically the values of the significance level and (ii) turned a true “forward time correlation,” between predictions and earthquakes, to a “backward time association.” The latter point is also separately checked by Honkura and Tanaka [1996], who conclude that: “… with the backwards time correlation in mind … we could not find cases in which a high probability arises for the occurrence of an EQ of Ms ≥ 5.0 in the target area.” In this Reply we also proceed to some necessary clarifications concerning the calculation of the “success rate” and “alarm rate” when a prediction method has, as expected, an experimental error in the magnitude determination.
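
The kind of chance-probability figure discussed here follows from a Poisson calculation; the sketch below uses an assumed regional rate (a placeholder, not Honkura and Tanaka's value) to show how window length enters.

import math

RATE_PER_DAY = 0.012   # assumed rate of M >= 5.0 events inside the 120-km target area

for window_days in (11, 22):
    p = 1.0 - math.exp(-RATE_PER_DAY * window_days)
    print(f"{window_days:2d}-day window: P(at least one qualifying event) = {p:.2f}")
# Longer windows and lower magnitude thresholds raise the chance-success probability;
# higher thresholds (Ms >= 5.5 or 5.8) lower it, as argued in the reply.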

Varotsos, P.; Lazaridou, M.

359

History of earthquakes and tsunamis along the eastern Aleutian-Alaska megathrust, with implications for tsunami hazards in the California Continental Borderland  

USGS Publications Warehouse

During the past several years, devastating tsunamis were generated along subduction zones in Indonesia, Chile, and most recently Japan. Both the Chile and Japan tsunamis traveled across the Pacific Ocean and caused localized damage at several coastal areas in California. The question remains as to whether coastal California, in particular the California Continental Borderland, is vulnerable to more extensive damage from a far-field tsunami sourced along a Pacific subduction zone. Assuming that the coast of California is at risk from a far-field tsunami, its coastline is most exposed to a trans-Pacific tsunami generated along the eastern Aleutian-Alaska subduction zone. We present the background geologic constraints that could control a possible giant (Mw ~9) earthquake sourced along the eastern Aleutian-Alaska megathrust. Previous great earthquakes (Mw ~8) in 1788, 1938, and 1946 ruptured single segments of the eastern Aleutian-Alaska megathrust. However, in order to generate a giant earthquake, it is necessary to rupture through multiple segments of the megathrust. Potential barriers to a throughgoing rupture, such as high-relief fracture zones or ridges, are absent on the subducting Pacific Plate between the Fox and Semidi Islands. Possible asperities (areas on the megathrust that are locked and therefore subject to infrequent but large slip) are identified by patches of high moment release observed in the historical earthquake record, geodetic studies, and the location of forearc basin gravity lows. Global Positioning System (GPS) data indicate that some areas of the eastern Aleutian-Alaska megathrust, such as that beneath Sanak Island, are weakly coupled. We suggest that although these areas will have reduced slip during a giant earthquake, they are not really large enough to form a barrier to rupture. A key aspect in defining an earthquake source for tsunami generation is determining the possibility of significant slip on the updip end of the megathrust near the trench. Large slip on the updip part of the eastern Aleutian-Alaska megathrust is a viable possibility owing to the small frontal accretionary prism and the presence of arc basement relatively close to the trench along most of the megathrust.

Ryan, Holly F.; von Huene, Roland; Wells, Ray E.; Scholl, David W.; Kirby, Stephen; Draut, Amy E.

2012-01-01

360

Late Quaternary uplift and earthquake potential of the San Joaquin Hills, southern Los Angeles basin, California  

Microsoft Academic Search

Analysis of emergent marine terraces in the San Joaquin Hills, California, and 230Th dating of solitary corals from the lowest terraces reveal that the San Joaquin Hills have risen at a rate of 0.21-0.27 m/k.y. during the past 122 k.y. Movement on a blind thrust fault in the southern Los Angeles basin has uplifted the San Joaquin Hills and has

Lisa B. Grant; Karl J. Mueller; Eldon M. Gath; Hai Cheng; R. Lawrence Edwards; Rosalind Munro; George L. Kennedy

1999-01-01

362

Earthquake Record of the Peninsula Segment of the San Andreas fault, Portola Valley, California  

NASA Astrophysics Data System (ADS)

Previous paleoseismic studies on the Peninsula segment provide evidence of poorly constrained large-magnitude earthquakes occurring in the late Holocene (Wright et al., 1999; Hall et al., 2001), and possibly in 1838 (Toppozada and Borchardt, 1998), whereas paleoseismic investigations on the Santa Cruz Mountains and North Coast segments provide moderately well constrained event chronologies over approximately the last 2,000 years (Fumal et al., 2004; Niemi et al., 2004; Kelson et al., 2006). The WGCEP (2003) report estimates mean recurrence intervals of 378 years for 1906-type events and 230 years for ruptures occurring only on the Peninsula segment of the northern San Andreas fault. More recent paleoseismic investigations along the Santa Cruz Mountains segment (Fumal et al., 2004) and North Coast segment (Niemi et al., 2004; Zhang et al., 2003) suggest the occurrence of surface-fault rupture at these sites as frequently as every 105 to 250 years. It is unknown whether these short recurrence intervals for the two segments represent local, smaller events, or whether they could have been associated with through-going events that included the Peninsula segment, located between the Santa Cruz Mountains and North Coast segments of the San Andreas fault. Paleoseismic trenching within the Portola Valley Town Center site, located on the Peninsula segment, provides preliminary late Holocene event chronology data for this segment of the San Andreas fault. At the Town of Portola Valley site, based on stratigraphic and structural relations, available radiocarbon dating, and the presence of historical artifacts, we interpret three, and possibly four, earthquake events within approximately the past 1,000 years, which from oldest to youngest are: Event 1 (A.D. 1030 to 1490); Event 2 (A.D. 1260 to 1490); and Event 3 (A.D. 1906). Event 2 can alternatively be interpreted as two separate events: Event 2A (A.D. 1260 to 1490) and Event 2B (A.D. 1410 to 1640). The youngest stratigraphic units do not provide the resolution necessary to assess whether or not the postulated, historical 1838 event ruptured the Woodside trace at this site. Select additional charcoal samples have been submitted for dating to further constrain our event chronology.

Sundermann, S. T.; Baldwin, J. N.; Prentice, C.

2008-12-01

363

Non-double-couple earthquake mechanisms at the Geysers geothermal area, California  

USGS Publications Warehouse

Inverting P- and S-wave polarities and P:SH amplitude ratios using linear programming methods suggests that about 20% of earthquakes at The Geysers geothermal area have significantly non-double-couple focal mechanisms, with explosive volumetric components as large as 33% of the seismic moment. This conclusion contrasts with those of earlier studies, which interpreted data in terms of double couples. The non-double-couple mechanisms are consistent with combined shear and tensile faulting, possibly caused by industrial water injection. Implosive mechanisms, which might be expected because of rapid steam withdrawal, have not been found. Significant compensated-linear-vector-dipole (CLVD) components in some mechanisms may indicate rapid fluid flow accompanying crack opening. Copyright 1996 by the American Geophysical Union.
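
A percentage such as the "explosive volumetric components as large as 33%" quoted above comes from decomposing a moment tensor into isotropic, double-couple (DC), and CLVD parts. The sketch below (Python) shows one common decomposition convention; the input tensor and the exact percentage recipe are illustrative assumptions, not the algorithm used in the study.

```python
import numpy as np

def decompose(M):
    """Isotropic / DC / CLVD percentages of a symmetric moment tensor
    (one common convention; others exist and give different percentages)."""
    M = np.asarray(M, dtype=float)
    iso = np.trace(M) / 3.0
    dev = M - iso * np.eye(3)
    e = np.linalg.eigvalsh(dev)
    e = e[np.argsort(np.abs(e))]              # deviatoric eigenvalues by absolute size
    eps = -e[0] / abs(e[2])                   # 0 = pure DC, +/-0.5 = pure CLVD
    m0 = abs(iso) + abs(e[2])                 # a simple total-moment measure
    iso_pct = 100.0 * abs(iso) / m0
    clvd_pct = 2.0 * abs(eps) * (100.0 - iso_pct)
    dc_pct = 100.0 - iso_pct - clvd_pct
    return iso_pct, dc_pct, clvd_pct

# Illustrative tensor with a small explosive (positive isotropic) component.
M = np.array([[1.2, 0.3, 0.0],
              [0.3, 0.2, 0.1],
              [0.0, 0.1, 0.1]])
print(decompose(M))
```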

Ross, A.; Foulger, G.R.; Julian, B.R.

1996-01-01

364

Deep Structure Of Long Valley, California, Based On Deep Reflections From Earthquakes  

SciTech Connect

Knowledge of the deep structure of Long Valley comes primarily from seismic studies. Most of these efforts have focused on delimiting the top of the inferred magma chamber. We present evidence for the location of the bottom of the low-velocity layer (LVL). Two other studies have provided similar information. Steeples and Iyer (1976) inferred from teleseismic P-wave delays that low-velocity material extends from 7 km depth to 25 to 40 km, depending on the velocities assumed. Luetgert and Mooney (1985) examined seismic refraction data from earthquake sources and identified a reflection that appears to be from the lower boundary of a magma chamber. They detected the reflection with a linear array of single-component stations and, assuming it traveled in a vertical plane, matched the travel time and apparent velocity (6.3 km/sec) to deduce that it was a P-P reflection from within a LVL. We recorded a similar phase with a two-dimensional array of three-component stations and carried out a similar analysis, but utilized additional information about the travel path, particle motions, and amplitudes to constrain our interpretation. Our data come from a passive seismic refraction experiment conducted during August 1982. Fourteen portable seismograph stations were deployed in a network with approximately 5 km station spacing in the Mono Craters region north of Long Valley (Figure 1). The network recorded earthquakes located south of Long Valley and in the south moat. Three components of motion were recorded at all sites. The data represent one of the few times that three-component data have been collected for raypaths through a magma chamber in the Long Valley area.

Zucca, J. J.; Kasameyer, P. W.

1987-01-01

365

Imaging the source region of the 2003 San Simeon earthquake within the weak Franciscan subduction complex, central California  

USGS Publications Warehouse

Data collected from the 2003 Mw 6.5 San Simeon earthquake sequence in central California and a 1986 seismic refraction experiment demonstrate that the weak Franciscan subduction complex suffered brittle failure in a region without significant velocity contrast across a slip plane. Relocated hypocenters suggest a spatial relationship between the seismicity and the Oceanic fault, although blind faulting on a nearby, unknown fault is an equally plausible alternative. The aftershock volume is sandwiched between the Nacimiento and Oceanic faults and is characterized by rocks of low compressional velocity (Vp) abutted to the east and west by rocks of higher Vp. This volume of inferred Franciscan rocks is embedded within the larger Santa Lucia anticline. Pore fluids, whose presence is implied by elevated Vp/Vs values, may locally decrease normal stress and limit the aftershock depth distribution to between 3 and 10 km within the hanging wall. The paucity of aftershocks along the mainshock rupture surface may reflect either the absence of a damage zone or an almost complete stress drop within the low-Vp or weak rock matrix surrounding the mainshock rupture. Copyright 2004 by the American Geophysical Union.

Hauksson, E.; Oppenheimer, D.; Brocher, T.M.

2004-01-01

366

Chronology of Postglacial Eruptive Activity and Calculation of Eruption Probabilities for Medicine Lake Volcano, Northern California  

USGS Publications Warehouse

Medicine Lake volcano has had 4 eruptive episodes in its postglacial history (since 13,000 years ago) comprising 16 eruptions. Time intervals between events within the episodes are relatively short, whereas time intervals between the episodes are much longer. An updated radiocarbon chronology for these eruptions is presented that uses paleomagnetic data to constrain the choice of calibrated ages. This chronology is used with exponential, Weibull, and mixed-exponential probability distributions to model the data for time intervals between eruptions. The mixed-exponential distribution is the best match to the data and provides estimates for the conditional probability of a future eruption given the time since the last eruption. The probability of an eruption at Medicine Lake volcano within the next year is 0.00028.
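
The conditional-probability idea described above can be sketched as follows (Python); the mixture weights and mean repose times are placeholders, not the values fitted in the study.

```python
import numpy as np

def mixed_exp_survival(t, p, lam1, lam2):
    """Survival function of a two-component mixed-exponential interevent model."""
    return p * np.exp(-lam1 * t) + (1 - p) * np.exp(-lam2 * t)

def conditional_prob(elapsed, window, p, lam1, lam2):
    """P(eruption within `window` years | `elapsed` years since the last eruption)."""
    return 1.0 - (mixed_exp_survival(elapsed + window, p, lam1, lam2)
                  / mixed_exp_survival(elapsed, p, lam1, lam2))

# Placeholder parameters (NOT the fitted values from the report): a short-repose
# component with a ~200-yr mean and a long-repose component with a ~4,000-yr mean.
p, lam1, lam2 = 0.7, 1.0 / 200.0, 1.0 / 4000.0
print(conditional_prob(elapsed=950.0, window=1.0, p=p, lam1=lam1, lam2=lam2))
```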

Nathenson, Manuel; Donnelly-Nolan, Julie M.; Champion, Duane E.; Lowenstern, Jacob B.

2007-01-01

367

Electrical structure in a region of the Transverse Ranges, southern California. [for earthquake prediction  

NASA Technical Reports Server (NTRS)

Magnetotelluric sounding at a site in the Transverse Ranges province in southern California indicates a low-resistivity region in the lower crust and possibly also the upper mantle. A two-dimensional model fit to the data indicates that the resistivity of this region is between 1 and 10 ohm-meters. The depth to the top surface of this zone is between 15 and 20 km. The lateral extent of this feature, which strikes N65 deg W, appears to be confined to the Transverse Ranges province. The petrological characteristics of this region cannot be deduced unambiguously from the magnetotelluric sounding alone.

Reddy, I. K.; Phillips, R. J.; Whitcomb, J. H.; Rankin, D.

1977-01-01

368

Statistical eruption forecast for the Chilean Southern Volcanic Zone: typical probabilities of volcanic eruptions as baseline for possibly enhanced activity following the large 2010 Concepción earthquake  

NASA Astrophysics Data System (ADS)

A probabilistic eruption forecast is provided for ten volcanoes of the Chilean Southern Volcanic Zone (SVZ). Since 70% of the Chilean population lives in this area, the estimation of future eruption likelihood is an important part of hazard assessment. After investigating the completeness and stationarity of the historical eruption time series, the exponential, Weibull, and log-logistic distribution functions are fit to the repose time distributions for the individual volcanoes and the models are evaluated. This procedure has been implemented in two different ways to methodologically compare details in the fitting process. With regard to the probability of at least one VEI ≥ 2 eruption in the next decade, Llaima, Villarrica and Nevados de Chillán are most likely to erupt, while Osorno shows the lowest eruption probability among the volcanoes analysed. In addition to giving a compilation of the statistical eruption forecasts for the historically most active volcanoes of the SVZ, this paper aims to give "typical" eruption probabilities, which may in the future help to identify possibly enhanced activity in the aftermath of the large 2010 Concepción earthquake.
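
As a rough illustration of fitting a repose-time distribution and converting it to a decadal eruption probability, the sketch below uses a Weibull model with made-up repose times; it is not the data set or the exact fitting procedure of the study.

```python
import numpy as np
from scipy.stats import weibull_min

# Assumed repose times (years) between VEI >= 2 eruptions; illustrative only.
repose = np.array([3.0, 7.0, 2.0, 15.0, 5.0, 9.0, 4.0, 11.0, 6.0, 8.0])

shape, loc, scale = weibull_min.fit(repose, floc=0)   # two-parameter Weibull fit

def prob_within(window, elapsed):
    """P(at least one eruption within `window` yr | `elapsed` yr of repose so far)."""
    sf = weibull_min.sf
    return 1.0 - sf(elapsed + window, shape, scale=scale) / sf(elapsed, shape, scale=scale)

print(prob_within(window=10.0, elapsed=3.0))
```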

Dzierma, Y.; Wehrmann, H.

2010-10-01

369

Spatially heterogeneous stress field in the source area of the 2011 Mw 6.6 Fukushima-Hamadori earthquake, NE Japan, probably caused by static stress change  

NASA Astrophysics Data System (ADS)

In order to know whether principal stress orientations in the source area rotated after the 2011 April 11 Mw 6.6 Fukushima-Hamadori earthquake in NE Japan, we investigated detailed spatial distributions of stress orientations for both the pre- and post-main shock periods using a large amount of focal mechanism data. We applied stress tensor inversions to focal mechanism data from Japan's National Research Institute for Earth Science and Disaster Prevention's F-net broadband seismic network and the Japan Meteorological Agency (JMA). The σ3-axes estimated for the pre-main shock period are predominantly oriented WSW-ENE, and are relatively homogeneous in space. In contrast, the orientations of the σ3-axes show a significantly heterogeneous distribution in space for the post-main shock period. In the northern subarea of the focal region, the σ3-axes are oriented NW-SE. In the east and west portions of the central subarea, they are oriented NNW-SSE and WNW-ESE, respectively, almost perpendicular to each other. In the southern subarea, the σ3-axes are oriented WSW-ENE. On the whole, the σ3-axis orientations show a concentric circle-like distribution surrounding the large slip area of the Mw 6.6 main shock rupture. The change of principal stress axis orientations after the earthquake is not significant because of the sparse data set for the pre-main shock period. We calculated static stress changes from the Mw 6.6 main shock and three Mw > 5.5 earthquakes to compare with the observed stress axis orientations in the post-main shock period. The σ3-axis orientations of the calculated total static stress change show a concentric circle-like distribution surrounding the large slip area of the main shock, similar to that noted above. This observation strongly suggests that the spatially heterogeneous stress orientations in the post-main shock period were caused by the static stress change from the Mw 6.6 main shock and other large earthquakes. In order to estimate the differential stress magnitude in the focal area, we calculated deviatoric stress tensors in the post-main shock period by assuming that they are the sum of the deviatoric stress tensors in the pre-main shock period and the static stress changes. Comparison of the calculated and observed stress tensors revealed differential stress magnitudes of 2-30 MPa that explain the observed stress orientations, considering the probable range of estimated stress ratios in the pre-main shock period.
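
The comparison described above (pre-main shock deviatoric stress plus static stress change, then re-derive the σ3 orientation) can be sketched with an eigendecomposition; the tensors below are arbitrary illustrative values in MPa, not results from the study.

```python
import numpy as np

def sigma3_axis(stress):
    """sigma3 (least compressive) principal axis of a symmetric stress tensor,
    with the convention that compression is positive."""
    vals, vecs = np.linalg.eigh(stress)   # eigenvalues in ascending order
    return vecs[:, 0]                     # eigenvector of the smallest eigenvalue

# Arbitrary illustrative tensors in MPa (not values from the study): a pre-main
# shock deviatoric stress and a coseismic static stress change.
pre = np.array([[ 5.0,  2.0,  0.0],
                [ 2.0, -1.0,  0.0],
                [ 0.0,  0.0, -4.0]])
dstress = np.array([[-3.0,  0.5,  0.0],
                    [ 0.5,  1.0,  0.0],
                    [ 0.0,  0.0,  2.0]])

print("pre-main shock sigma3 axis :", sigma3_axis(pre))
print("post-main shock sigma3 axis:", sigma3_axis(pre + dstress))
```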

Yoshida, Keisuke; Hasegawa, Akira; Okada, Tomomi

2015-05-01

370

Earthquakes Living Lab: Geology and the 1906 San Francisco Earthquake  

NSDL National Science Digital Library

Students examine the effects of geology on earthquake shaking and how engineers anticipate and prepare for these effects. Using information provided through the Earthquakes Living Lab interface, students investigate how geology, specifically soil type, can amplify earthquake shaking and its consequences. Students look in-depth at the historical 1906 San Francisco earthquake and its destruction through photographs and data. They compare the 1906 California earthquake to another historical earthquake in Kobe, Japan, looking at the geological differences and impacts in the two regions, and learning how engineers, geologists and seismologists work to predict earthquakes and minimize calamity. A worksheet serves as a student guide for the activity.

2014-09-18

371

Moment-tensor solutions for the 24 November 1987 Superstition Hills, California, earthquakes  

USGS Publications Warehouse

The teleseismic long-period waveforms recorded by the Global Digital Seismograph Network from the two largest Superstition Hills earthquakes are inverted using an algorithm based on optimal filter theory. These solutions differ slightly from those published in the Preliminary Determination of Epicenters Monthly Listing because a somewhat different, improved data set was used in the inversions and a time-dependent moment-tensor algorithm was used to investigate the complexity of the main shock. The foreshock (origin time 01:54:14.5, mb 5.7, Ms 6.2) had a scalar moment of 2.3 × 10^25 dyne-cm, a depth of 8 km, and a mechanism of strike 217°, dip 79°, rake 4°. The main shock (origin time 13:15:56.4, mb 6.0, Ms 6.6) was a complex event, consisting of at least two subevents, with a combined scalar moment of 1.0 × 10^26 dyne-cm, a depth of 10 km, and a mechanism of strike 303°, dip 89°, rake -180°. -Authors
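
The scalar moments quoted above map onto moment magnitudes via the standard Hanks and Kanamori (1979) relation; a minimal check (Python) gives values roughly consistent with the quoted Ms:

```python
import math

def moment_magnitude(m0_dyne_cm: float) -> float:
    """Hanks and Kanamori (1979): Mw = (2/3) * log10(M0 [dyne-cm]) - 10.7."""
    return (2.0 / 3.0) * math.log10(m0_dyne_cm) - 10.7

print(moment_magnitude(2.3e25))   # ~6.2 for the foreshock's scalar moment
print(moment_magnitude(1.0e26))   # ~6.6 for the combined main-shock moment
```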

Sipkin, S.A.

1989-01-01

372

Comparison of Repeating Magnitude 2 Earthquakes Near the SAFOD Site, California, With Similar-Magnitude Mining-Induced Earthquakes in South Africa  

NASA Astrophysics Data System (ADS)

A small patch of the San Andreas fault at a depth of about 2.7 km near the site of SAFOD (San Andreas Fault Observatory at Depth) produces magnitude 2 earthquakes that repeat at intervals of 2.89 years (Nadeau and Johnson, Bull. Seism. Soc. Am., 1998; Dreger et al., Geophys. Res. Lett., 2007). A repeat occurred on 20 October 2003 and was recorded by the SAFOD Pilot Hole Array, a vertical network of 3- component seismometers at 32 levels extending from 856 to 2096 m below the local ground surface (Imanishi and Ellsworth, AGU Geophysical Monograph 170, 2006). The ground velocity signals from this repeating earthquake, recorded in the Pilot Hole at hypocentral distances of several km, are remarkably similar to those recorded using a four-station network of IRIS/PASSCAL broadband recording units with accelerometers, deployed underground at depths between 2 and 3.5 km, within two of the deepest and most seismically active mines in South Africa. During our one-week deployment, four earthquakes, of seismic moment and hypocentral depth much the same as those of the repeating earthquakes near SAFOD, were recorded and analyzed to investigate their source processes. In addition to determining the traditional source parameters, including seismic moment, we used distance-corrected peak ground velocities to estimate maximum seismic slips within the four rupture zones, ranging from 4 to 27 mm. (Although our method of doing this involves several model assumptions, underground observations of maximum slip due to mining-induced earthquakes indicate that our approach yields realistic estimates.) These maximum slips, in conjunction with data from laboratory stick-slip friction experiments, were used to estimate maximum slip rates that fell in the range 2 to 7 m/s, typical for earthquakes in the continental crust. Applying the same methods to the 20 October 2003 earthquake data, as recorded in the Pilot Hole, revealed a maximum slip of about 17 mm and a maximum slip rate of 4.7 m/s. That is, our analysis has not indicated any aspect of this repeating earthquake that is out of the ordinary for crustal earthquakes of the same magnitude (McGarr and Fletcher, Bull. Seism. Soc. Am., 2003, 2007).

McGarr, A.; Fletcher, J. B.; Boettcher, M.; Ellsworth, W.

2008-12-01

373

Correlation of ground motion and intensity for the 17 January 1994 Northridge, California, earthquake  

USGS Publications Warehouse

We analyze the correlations between intensity and a set of ground-motion parameters obtained from 66 free-field stations in Los Angeles County that recorded the 1994 Northridge earthquake. We use the tagging intensities from Thywissen and Boatwright (1998) because these intensities are determined independently on census tracts, rather than interpolated from zip codes, as are the modified Mercalli isoseismals from Dewey et al. (1995). The ground-motion parameters we consider are the peak ground acceleration (PGA), the peak ground velocity (PGV), the 5% damped pseudovelocity response spectral (PSV) ordinates at 14 periods from 0.1 to 7.5 sec, and the rms average of these spectral ordinates from 0.3 to 3 sec. Visual comparisons of the distribution of tagging intensity with contours of PGA, PGV, and the average PSV suggest that PGV and the average PSV are better correlated with the intensity than PGA. The correlation coefficients between the intensity and the ground-motion parameters bear this out: r = 0.75 for PGA, 0.85 for PGV, and 0.85 for the average PSV. Correlations between the intensity and the PSV ordinates, as a function of period, are strongest at 1.5 sec (r = 0.83) and weakest at 0.2 sec (r = 0.66). Regressing the intensity on the logarithms of these ground-motion parameters yields relations of the form I ∝ m log(ground motion), with 3.0 ≤ m ≤ 5.2 for the parameters analyzed, where m = 4.4 ± 0.7 for PGA, 3.4 ± 0.4 for PGV, and 3.6 ± 0.5 for the average PSV.
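
A minimal sketch of the correlation-and-regression step (Python, using synthetic data rather than the 66-station Northridge data set; the slope and scatter are assumed for illustration):

```python
import numpy as np

# Synthetic stand-in for the 66-station data set; the slope (~3.4) and the
# scatter are assumed for illustration only.
rng = np.random.default_rng(0)
log_pgv = rng.uniform(0.0, 2.0, 66)                         # log10 PGV, arbitrary units
intensity = 3.4 * log_pgv + 4.0 + rng.normal(0.0, 0.5, 66)  # assumed linear relation + noise

r = np.corrcoef(intensity, log_pgv)[0, 1]       # correlation coefficient
m, b = np.polyfit(log_pgv, intensity, 1)        # regression I = m*log10(PGV) + b
print(f"r = {r:.2f}, slope m = {m:.1f}, intercept b = {b:.1f}")
```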

Boatwright, J.; Thywissen, K.; Seekins, L.C.

2001-01-01

374

Postearthquake relaxation after the 2004 M6 Parkfield, California, earthquake and rate-and-state friction  

USGS Publications Warehouse

An unusually complete set of measurements (including rapid-rate GPS over the first 10 days) of postseismic deformation is available at 12 continuous GPS stations located close to the epicenter of the 2004 M6.0 Parkfield earthquake. The principal component modes for the relaxation of the ensemble of those 12 GPS stations were determined. The first mode alone furnishes an adequate approximation to the data. Thus, the relaxation at all stations can be represented by the product of a common temporal function and distinct amplitudes for each component (north or east) of relaxation at each station. The distribution in space of the amplitudes indicates that the relaxation is dominantly strike slip. The temporal function, which spans times from about 5 min to 900 days postearthquake, can be fit by a superposition of three creep terms, each proportional to loge(1 + t/τ), with characteristic times τ = 4.06, 0.11, and 0.0001 days. It seems likely that what is actually involved is a broad spectrum of characteristic times, the individual components of which arise from afterslip on different fault patches. Perfettini and Avouac (2004) have shown that an individual creep term can be explained by the spring-slider model with rate-dependent (no state variable) friction. The observed temporal function can also be explained using a single spring-slider model (i.e., single fault patch) that includes rate-and-state-dependent friction, a single state variable, and either of the two commonly used (aging and slip) state evolution laws. In the latter fits, the rate-and-state friction parameter b is negative.
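
Fitting a superposition of logarithmic creep terms of this form can be sketched with a standard nonlinear least-squares call; the synthetic time series below simply reuses the quoted characteristic times as ground truth and is not the GPS data itself.

```python
import numpy as np
from scipy.optimize import curve_fit

def creep_sum(t, a1, a2, a3, tau1, tau2, tau3):
    """Superposition of three logarithmic creep terms a_i * ln(1 + t/tau_i)."""
    return (a1 * np.log1p(t / tau1)
            + a2 * np.log1p(t / tau2)
            + a3 * np.log1p(t / tau3))

# Synthetic displacement series (days): the quoted characteristic times are
# reused as ground truth; amplitudes and noise level are assumed.
t = np.logspace(-3, np.log10(900.0), 200)
obs = creep_sum(t, 5.0, 3.0, 1.0, 4.06, 0.11, 1e-4)
obs += np.random.default_rng(1).normal(0.0, 0.05, t.size)

p0 = [1.0, 1.0, 1.0, 1.0, 0.1, 1e-3]
popt, _ = curve_fit(creep_sum, t, obs, p0=p0, bounds=(1e-6, np.inf))
print("fitted characteristic times (days):", popt[3:])
```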

Savage, J.C.; Langbein, J.

2008-01-01

375

The Mw 6.5 offshore Northern California earthquake of 10 January 2010: Ordinary stress drop on a high-strength fault  

NASA Astrophysics Data System (ADS)

The 10 January 2010 Mw 6.5 earthquake offshore Northern California is one of the first intraplate earthquakes in oceanic lithosphere to be well captured by a GPS network. It presents an opportunity to evaluate rupture mechanics on a high-strength fault. Static inversion of the coseismic displacements shows that the slip peaks at the same depth as the expected strength envelope, where the differential stresses can be as high as 600 MPa. Laboratory experiments on peridotite predict dramatic dynamic weakening at these conditions. The observed ordinary stress drop, 2-20 MPa, may indicate that the lithosphere is much weaker than the strength envelope predicts or that the failure mechanisms seen in the laboratory are not occurring during the rupture. The GPS observations show very little postseismic signal, indicating that if a shear zone exists beneath the coseismic rupture, it operates at significantly greater stress levels than the coseismic stress change.
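
For context on the "strength envelope" argument, the sketch below computes a simple Byerlee-type frictional differential-stress limit versus depth; the friction coefficient, densities, pore-pressure assumption, and faulting regime are all illustrative assumptions, not the envelope used in the study, although the resulting values are of the same order as the ~600 MPa quoted above.

```python
import numpy as np

# Rough Byerlee-type frictional strength envelope (all assumptions: friction 0.6,
# rock density 3300 kg/m^3, hydrostatic seawater pore pressure, thrust regime).
mu, rho, rho_w, g = 0.6, 3300.0, 1000.0, 9.8
R = (np.sqrt(1.0 + mu**2) + mu) ** 2                 # limiting sigma1'/sigma3' ratio

for z_km in (5, 10, 15):
    sigma_v_eff = (rho - rho_w) * g * z_km * 1.0e3   # effective vertical stress, Pa
    diff_stress = (R - 1.0) * sigma_v_eff / 1.0e6    # differential stress, MPa
    print(f"{z_km:2d} km : ~{diff_stress:4.0f} MPa")
```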

Wei, Meng; McGuire, Jeffrey J.

2014-09-01

376

Predicting Earthquakes  

NSDL National Science Digital Library

Five moderate-to-strong earthquakes struck California in June 2005. Could the cluster of quakes be a harbinger of the Big One? Another earthquake-prone area, New Madrid, near Memphis, Tennessee, has had more than its share of impressive quakes and strain is building along its fault lines. This radio broadcast discusses these two seismic zones, the new data based on years of GPS (Global Positioning System) measurements that may give scientists more information, and how the Earth generates the stress which leads to earthquakes. There is also discussion of the danger of tsunamis in the Virgin Islands and the need for a worldwide tsunami warning network. The broadcast is 18 minutes in length.

377

Can earthquakes be predicted?

E-print Network

Can earthquakes be predicted? Karen Felzer, U.S. Geological Survey. Earthquake predictions that most seismologists agree with. Long-term earthquake probabilities. Journalists and the general public rush to any suggestion of earthquake prediction like hogs to a full trough.

Felzer, Karen

378

Dynamics, Patterns, and Migration in Earthquake Fault Systems  

NASA Astrophysics Data System (ADS)

Space-time patterns of earthquakes have been described for many years. These include migration of major earthquakes along fault systems, precursory quiescence, precursory activation, aftershock diffusion, Mogi donuts, long-range triggering as in the Landers earthquake, episodic tremor and slip, and precursory chains. A major goal is to use understanding of space-time patterns to inform predictions and forecasts of future activity. Online earthquake catalogs, combined with new analysis techniques based on statistical mechanics and with newly developed, sophisticated numerical simulations of fault systems, have led to new approaches for understanding these complex dynamical phenomena. In this talk, we summarize a number of these developments and show how these new methods can be used to deliver information on earthquake occurrence over the web. Examples of these methods include the Virtual California simulation and the Natural Time Weibull method for computing earthquake probabilities. In the former, we use information about fault system geometry, slip rates, and historic events to build a topologically realistic model of the fault system dynamics. Models of this type show evidence of many of the types of migration and chain behavior that natural earthquakes demonstrate. In the latter, the Natural Time Weibull model makes use of the fat-tailed statistics observed for natural earthquakes to compute earthquake probabilities. Using this method, for example, it can be seen that earthquake activity south of the island of Japan may have led to the triggering of the devastating M9.1 earthquake and tsunami of March 11, 2011. We describe prospects for active use of these models and methods of analysis in the future.

Rundle, J. B.; Sachs, M. K.; Holliday, J. R.; Heien, E. M.; Turcotte, D. L.; Donnellan, A.; Meadows, Z.

2012-12-01

379

The Distribution of Earthquakes: Where Do Large Earthquakes Occur?  

NSDL National Science Digital Library

In this activity, students investigate the distribution of large earthquakes (magnitude greater than 6) in Southern California. Using online maps of earthquake epicenters in Southern California and the Los Angeles Basin, they will compare these distributions with historic distributions (1932-1996), and with respect to the locations of major fault traces.

John Marquis

380

The 1994 Northridge, California, earthquake: Investigation of rupture velocity, risetime, and high-frequency radiation  

USGS Publications Warehouse

A hybrid global search algorithm is used to solve the nonlinear problem of calculating slip amplitude, rake, risetime, and rupture time on a finite fault. Thirty-five strong motion velocity records are inverted by this method over the frequency band from 0.1 to 1.0 Hz for the Northridge earthquake. Four regions of larger-amplitude slip are identified: one near the hypocenter at a depth of 17 km, a second west of the hypocenter at about the same depth, a third updip from the hypocenter at a depth of 10 km, and a fourth updip from the hypocenter and to the northwest. The results further show an initial fast rupture with a velocity of 2.8 to 3.0 km/s followed by a slow termination of the rupture with velocities of 2.0 to 2.5 km/s. The initial energetic rupture phase lasts for 3 s, extending out 10 km from the hypocenter. Slip near the hypocenter has a short risetime of 0.5 s, which increases to 1.5 s for the major slip areas removed from the hypocentral region. The energetic rupture phase is also shown to be the primary source of high-frequency radiation (1-15 Hz) by an inversion of acceleration envelopes. The same global search algorithm is used in the envelope inversion to calculate high-frequency radiation intensity on the fault and rupture time. The rupture timing from the low- and high-frequency inversions is similar, indicating that the high frequencies are produced primarily at the mainshock rupture front. Two major sources of high-frequency radiation are identified within the energetic rupture phase, one at the hypocenter and another deep source to the west of the hypocenter. The source at the hypocenter is associated with the initiation of rupture and the breaking of a high-stress-drop asperity and the second is associated with stopping of the rupture in a westerly direction.

Hartzell, S.; Liu, P.; Mendoza, C.

1996-01-01

381

Space-time model for repeating earthquakes and analysis of recurrence intervals on the San Andreas Fault near Parkfield, California  

NASA Astrophysics Data System (ADS)

We propose a stochastic model for characteristically repeating earthquake sequences to estimate the spatiotemporal change in static stress loading rate. These earthquakes recur by a cyclic mechanism where stress at a hypocenter is accumulated by tectonic forces until an earthquake occurs that releases the accumulated stress to a basal level. Renewal processes are frequently used to describe this repeating earthquake mechanism. Variations in the rate of tectonic loading due to large earthquakes and aseismic slip transients, however, introduce nonstationary effects into the repeating mechanism that result in nonstationary trends in interevent times, particularly for smaller-magnitude repeating events which have shorter interevent times. These trends are also similar among repeating earthquake sites having similar hypocenters. Therefore, we incorporate space-time structure represented by cubic B-spline functions into the renewal model and estimate their coefficient parameters by maximizing the integrated likelihood in a Bayesian framework. We apply our model to 31 repeating earthquake sequences including 824 events on the Parkfield segment of the San Andreas Fault and estimate the spatiotemporal transition of the loading rate on this segment. The result gives us details of the change in tectonic loading caused by an aseismic slip transient in 1993, the 2004 Parkfield M6 earthquake, and other nearby or remote seismic activities. The degree of periodicity of repeating event recurrence intervals also shows spatial trends that are preserved in time even after the 2004 Parkfield earthquake when time scales are normalized with respect to the estimated loading rate.
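
A toy version of the stress-accumulation-and-release cycle described above, with a loading-rate change partway through, is sketched below; all numbers are arbitrary assumptions, not the Parkfield estimates, and the actual model additionally uses B-spline space-time structure in a Bayesian framework.

```python
import numpy as np

# Toy stress-accumulation cycle for a repeating-earthquake asperity (all values
# assumed): the loading rate doubles at t = 10 yr, mimicking a loading-rate
# change such as an aseismic transient, and an event fires whenever the
# accumulated stress reaches a fixed threshold.
dt, threshold = 0.01, 1.0                  # years; arbitrary stress units

def loading_rate(t):
    return 0.35 if t < 10.0 else 0.70

t, stress, events = 0.0, 0.0, []
while t < 20.0:
    stress += loading_rate(t) * dt
    if stress >= threshold:                # "earthquake": release to the basal level
        events.append(round(t, 2))
        stress = 0.0
    t += dt

print("event times (yr):     ", events)
print("interevent times (yr):", np.round(np.diff(events), 2))
```

The shortened interevent times after the rate change illustrate why nonstationary loading produces the trends in recurrence intervals that the model is designed to capture.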

Nomura, Shunichi; Ogata, Yosihiko; Nadeau, Robert M.

2014-09-01

382

Staying Safe in Earthquake Country  

E-print Network

… that (together with the subsequent fire) led to the destruction of San Francisco in 1906. When most southern Californians think of earthquakes, their minds leap immediately to the San Andreas fault … the most frequent large earthquakes. In southern California, the most recent earthquake on the San Andreas …

de Lijser, Peter

383

Lower crustal structure in northern California: Implications from strain rate variations following the 1906 San Francisco earthquake  

E-print Network

Shelley J. Kenner and Paul Segall (Department of Geophysics, Stanford), Lower crustal structure in northern California: Implications from strain rate variations following the 1906 San Francisco earthquake, J. Geophys. Res., 108(B1), 2011. The study proceeds by comparing geodetic data from north of San Francisco Bay obtained between 1906 and 1995 to predictions from …

Segall, Paul

384

Evidence for large earthquakes on the San Andreas fault at the Wrightwood, California paleoseismic site: A.D. 500 to present  

USGS Publications Warehouse

We present structural and stratigraphic evidence from a paleoseismic site near Wrightwood, California, for 14 large earthquakes that occurred on the southern San Andreas fault during the past 1500 years. In a network of 38 trenches and creek-bank exposures, we have exposed a composite section of interbedded debris flow deposits and thin peat layers more than 24 m thick; fluvial deposits occur along the northern margin of the site. The site is a 150-m-wide zone of deformation bounded on the surface by a main fault zone along the northwest margin and a secondary fault zone to the southwest. Evidence for most of the 14 earthquakes occurs along structures within both zones. We identify paleoearthquake horizons using infilled fissures, scarps, multiple rupture terminations, and widespread folding and tilting of beds. Ages of stratigraphic units and earthquakes are constrained by historic data and 72 14C ages, mostly from samples of peat and some from plant fibers, wood, pine cones, and charcoal. Comparison of the long, well-resolved paleoseismic record at Wrightwood with records at other sites along the fault indicates that rupture lengths of past earthquakes were at least 100 km. Paleoseismic records at sites in the Coachella Valley suggest that each of the past five large earthquakes recorded there ruptured the fault at least as far northwest as Wrightwood. Comparisons with event chronologies at Pallett Creek and sites to the northwest suggest that approximately the same part of the fault that ruptured in 1857 may also have failed in the early to mid-sixteenth century and several other times during the past 1200 years. Records at Pallett Creek and Pitman Canyon suggest that, in addition to the 14 earthquakes we document, one and possibly two other large earthquakes ruptured the part of the fault including Wrightwood since about A.D. 500. These observations, and elapsed times that are significantly longer than mean recurrence intervals at Wrightwood and sites to the southeast, suggest that at least the southernmost 200 km of the San Andreas fault is near failure.

Fumal, T.E.; Weldon, R.J.; Biasi, G.P.; Dawson, T.E.; Seitz, G.G.; Frost, W.T.; Schwartz, D.P.

2002-01-01

385

Downscaling of slip distribution for strong earthquakes  

NASA Astrophysics Data System (ADS)

We intend to develop a downscaling model to enhance the resolution of earthquake slip distributions. Slip distributions have been obtained by other researchers using various inversion methods. As a downscaling model, we are discussing fractal models that include mono-fractal models (fractional Brownian motion, fBm; fractional Lévy motion, fLm) and multi-fractal models as candidates. Log-log linearity of E(k) (power spectrum) versus k (wavenumber) is a necessary condition for fractality: a slip distribution is expected to satisfy this log-log linearity if a fractal model can be applied to it as a downscaling model. Therefore, we conducted spectrum analyses using slip distributions of 11 earthquakes as explained below. 1) Spectrum analyses using one-dimensional slip distributions (strike direction) were conducted. 2) Some of the resulting power spectra were averaged (in the dip direction). Results show that, from the viewpoint of log-log linearity, applying a fractal model to slip distributions appears to be valid. We adopt the filtering method after Lavallée (2008) to generate fBm/fLm. In that method, generated white noises (random numbers) are filtered using a power-law-type filter (i.e., one with a log-log-linear spectrum). Lavallée (2008) described that the Lévy white noise that generates fLm is more appropriate than the Gaussian white noise that generates fBm. In addition, if the 'alpha' parameter of the Lévy law, which governs the degree of attenuation of the tails of the probability distribution, is 2.0, then the Lévy distribution is equivalent to the Gaussian distribution. We analyzed slip distributions of 11 earthquakes: the Tohoku earthquake
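
A minimal sketch of the spectral-filtering idea described above (Gaussian white noise filtered to a power-law spectrum; Lévy noise, which the text notes may be more appropriate, is omitted for simplicity, and the decay exponent is an assumed value):

```python
import numpy as np

rng = np.random.default_rng(42)
n, beta = 512, 2.0                       # beta: assumed spectral decay exponent

# Gaussian white noise filtered to a power-law spectrum gives an fBm-like series;
# Levy-stable noise (preferred in the text) would be substituted here for fLm.
white = rng.standard_normal(n)
W = np.fft.rfft(white)
k = np.fft.rfftfreq(n)
k[0] = k[1]                              # avoid dividing by zero at k = 0
slip = np.fft.irfft(W * k ** (-beta / 2.0), n)
slip -= slip.min()                       # shift so the synthetic "slip" is non-negative

# Check the log-log linearity of the resulting power spectrum E(k)
E = np.abs(np.fft.rfft(slip)) ** 2
slope = np.polyfit(np.log(k[1:]), np.log(E[1:]), 1)[0]
print(f"fitted spectral slope ~ {slope:.1f} (target: {-beta})")
```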