Sample records for nanbu earthquake hyogoken

  1. Use of fault striations and dislocation models to infer tectonic shear stress during the 1995 Hyogo-Ken Nanbu (Kobe) earthquake

    USGS Publications Warehouse

    Spudich, P.; Guatteri, Mariagiovanna; Otsuki, K.; Minagawa, J.

    1998-01-01

    Dislocation models of the 1995 Hyogo-ken Nanbu (Kobe) earthquake derived by Yoshida et al. (1996) show substantial changes in direction of slip with time at specific points on the Nojima and Rokko fault systems, as do striations we observed on exposures of the Nojima fault surface on Awaji Island. Spudich (1992) showed that the initial stress, that is, the shear traction on the fault before the earthquake origin time, can be derived at points on the fault where the slip rake rotates with time if slip velocity and stress change are known at these points. From Yoshida's slip model, we calculated dynamic stress changes on the ruptured fault surfaces. To estimate errors, we compared the slip velocities and dynamic stress changes of several published models of the earthquake. The differences between these models had an exponential distribution, not Gaussian. We developed a Bayesian method to estimate the probability density function (PDF) of initial stress from the striations and from Yoshida's slip model. Striations near Toshima and Hirabayashi give initial stresses of about 13 and 7 MPa, respectively. We obtained initial stresses of about 7 to 17 MPa at depths of 2 to 10 km on a subset of points on the Nojima and Rokko fault systems. Our initial stresses and coseismic stress changes agree well with postearthquake stresses measured by hydrofracturing in deep boreholes near Hirabayashi and Ogura on Awaji Island. Our results indicate that the Nojima fault slipped at very low shear stress, and fractional stress drop was complete near the surface and about 32% below depths of 2 km. Our results at depth depend on the accuracy of the rake rotations in Yoshida's model, which are probably correct on the Nojima fault but debatable on the Rokko fault. Our results imply that curved or cross-cutting fault striations can be formed in a single earthquake, contradicting a common assumption of structural geology.
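
    The sketch below is a minimal, hypothetical reconstruction (not the authors' code) of the kind of grid-search Bayesian estimate described above: candidate initial shear-traction vectors are scored by how well the direction of the total traction (initial stress plus the model-derived dynamic stress change) tracks the observed rake history, using a Laplace (exponential) error model consistent with the inter-model comparison. The function, variable names and the error-scale value are illustrative assumptions.

    ```python
    # Hedged sketch: grid-search Bayesian PDF of the initial shear traction tau0
    # from a time-dependent rake history, assuming (after Spudich, 1992) that the
    # total traction tau0 + dtau(t) is parallel to the instantaneous slip velocity.
    import numpy as np

    def initial_stress_pdf(rake_obs, dtau, mags, dirs, scale_deg=10.0):
        """Posterior PDF (flat prior) over candidate initial stress vectors.

        rake_obs : (nt,) observed slip-rake angles [rad] at one fault point
        dtau     : (nt, 2) dynamic shear-stress change history [MPa], strike/dip components
        mags     : (nm,) candidate |tau0| magnitudes [MPa]
        dirs     : (nd,) candidate tau0 directions [rad]
        scale_deg: Laplace scale of the rake misfit [deg] (hypothetical value)
        """
        rake_obs = np.asarray(rake_obs, dtype=float)
        dtau = np.asarray(dtau, dtype=float)
        mags = np.asarray(mags, dtype=float)
        dirs = np.asarray(dirs, dtype=float)
        scale = np.deg2rad(scale_deg)
        logp = np.zeros((mags.size, dirs.size))
        for i, m in enumerate(mags):
            for j, d in enumerate(dirs):
                tau0 = m * np.array([np.cos(d), np.sin(d)])
                total = tau0 + dtau                                   # (nt, 2) total traction
                rake_pred = np.arctan2(total[:, 1], total[:, 0])
                misfit = np.angle(np.exp(1j * (rake_pred - rake_obs)))  # wrapped to (-pi, pi]
                logp[i, j] = -np.sum(np.abs(misfit)) / scale            # Laplace log-likelihood
        pdf = np.exp(logp - logp.max())
        return pdf / pdf.sum()
    ```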

  2. Applicability of source scaling relations for crustal earthquakes to estimation of the ground motions of the 2016 Kumamoto earthquake

    NASA Astrophysics Data System (ADS)

    Irikura, Kojiro; Miyakoshi, Ken; Kamae, Katsuhiro; Yoshida, Kunikazu; Somei, Kazuhiro; Kurahashi, Susumu; Miyake, Hiroe

    2017-01-01

    A two-stage scaling relationship of the source parameters for crustal earthquakes in Japan has previously been constructed, in which source parameters obtained from the results of waveform inversion of strong motion data are combined with parameters estimated based on geological and geomorphological surveys. A three-stage scaling relationship was subsequently developed to extend scaling to crustal earthquakes with magnitudes greater than Mw 7.4. The effectiveness of these scaling relationships was then examined based on the results of waveform inversion of 18 recent crustal earthquakes (Mw 5.4-6.9) that occurred in Japan since the 1995 Hyogo-ken Nanbu earthquake. The 2016 Kumamoto earthquake, with Mw 7.0, was one of the largest earthquakes to occur since dense and accurate strong motion observation networks, such as K-NET and KiK-net, were deployed after the 1995 Hyogo-ken Nanbu earthquake. We examined the applicability of the scaling relationships of the source parameters of crustal earthquakes in Japan to the 2016 Kumamoto earthquake. The rupture area and asperity area were determined based on slip distributions obtained from waveform inversion of the 2016 Kumamoto earthquake observations. We found that the relationship between the rupture area and the seismic moment for the 2016 Kumamoto earthquake follows the second-stage scaling within one standard deviation (σ = 0.14). The ratio of the asperity area to the rupture area for the 2016 Kumamoto earthquake is nearly the same as ratios previously obtained for crustal earthquakes. Furthermore, we simulated the ground motions of this earthquake using a characterized source model consisting of strong motion generation areas (SMGAs) based on the empirical Green's function (EGF) method. The locations and areas of the SMGAs were determined through comparison between the synthetic ground motions and observed motions. The sizes of the SMGAs were nearly coincident with the asperities with large slip. The synthetic
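
    As a rough illustration of the consistency check described above, the sketch below converts moment magnitude to seismic moment using the standard Hanks-Kanamori definition and tests whether an observed rupture area lies within one standard deviation (in log10 units, σ = 0.14 as quoted) of a user-supplied power-law scaling relation S = a·M0^b. The coefficients a and b are deliberately left to the caller; they are not the published two- or three-stage values.

    ```python
    # Hedged sketch: consistency of a (Mw, rupture area) pair with a log-linear
    # source-scaling relation S = a * M0**b, within one standard deviation.
    # The coefficients a, b must come from the chosen scaling stage; they are
    # not reproduced here.
    import math

    def seismic_moment_Nm(mw):
        """Hanks-Kanamori moment-magnitude definition, M0 in N*m."""
        return 10 ** (1.5 * mw + 9.1)

    def within_one_sigma(mw, rupture_area_km2, a, b, sigma_log10=0.14):
        m0 = seismic_moment_Nm(mw)
        predicted_km2 = a * m0 ** b                      # scaling-stage prediction
        residual = math.log10(rupture_area_km2) - math.log10(predicted_km2)
        return abs(residual) <= sigma_log10, residual

    # Example call (coefficients and area are placeholders, not published values):
    # ok, res = within_one_sigma(7.0, 1100.0, a=..., b=...)
    ```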

  3. Rupture, waves and earthquakes.

    PubMed

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered a phenomenon of wave energy radiation caused by rupture (fracture) of the solid Earth. However, the physics of the dynamic processes around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, has not been fully understood yet. Instead, much previous investigation in seismology has evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components above 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to explain by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake, and more detailed study of rupture and wave dynamics, namely, the possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) the wave interaction that connects rupture (1) and failures (2), would be indispensable.

  4. Rupture, waves and earthquakes

    PubMed Central

    UENISHI, Koji

    2017-01-01

    Normally, an earthquake is considered a phenomenon of wave energy radiation caused by rupture (fracture) of the solid Earth. However, the physics of the dynamic processes around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, has not been fully understood yet. Instead, much previous investigation in seismology has evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components above 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but “extraordinary” phenomena that are difficult to explain by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake, and more detailed study of rupture and wave dynamics, namely, the possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) the wave interaction that connects rupture (1) and failures (2), would be indispensable. PMID:28077808

  5. Geochemical challenge to earthquake prediction.

    PubMed Central

    Wakita, H

    1996-01-01

    The current status of geochemical and groundwater observations for earthquake prediction in Japan is described. The development of the observations is discussed in relation to the progress of the earthquake prediction program in Japan. Three major findings obtained from our recent studies are outlined. (i) Long-term radon observation data over 18 years at the SKE (Suikoen) well indicate that the anomalous radon change before the 1978 Izu-Oshima-kinkai earthquake can with high probability be attributed to precursory changes. (ii) It is proposed that certain sensitive wells exist which have the potential to detect precursory changes. (iii) The appearance and nonappearance of coseismic radon drops at the KSM (Kashima) well reflect changes in the regional stress state of an observation area. In addition, some preliminary results of chemical changes of groundwater prior to the 1995 Kobe (Hyogo-ken nanbu) earthquake are presented. PMID:11607665

  6. Inferring rate and state friction parameters from a rupture model of the 1995 Hyogo-ken Nanbu (Kobe) Japan earthquake

    USGS Publications Warehouse

    Guatteri, Mariagiovanna; Spudich, P.; Beroza, G.C.

    2001-01-01

    We consider the applicability of laboratory-derived rate- and state-variable friction laws to the dynamic rupture of the 1995 Kobe earthquake. We analyze the shear stress and slip evolution of Ide and Takeo's [1997] dislocation model, fitting the inferred stress change time histories by calculating the dynamic load and the instantaneous friction at a series of points within the rupture area. For points exhibiting a fast-weakening behavior, the Dieterich-Ruina friction law, with values of dc = 0.01-0.05 m for critical slip, fits the stress change time series well. This range of dc is 10-20 times smaller than the slip distance over which the stress is released, Dc, which previous studies have equated with the slip-weakening distance. The limited resolution and low-pass character of the strong motion inversion degrades the resolution of the frictional parameters and suggests that the actual dc is less than this value. Stress time series at points characterized by a slow-weakening behavior are well fitted by the Dieterich-Ruina friction law with values of dc ≈ 0.01-0.05 m. The apparent fracture energy Gc can be estimated from waveform inversions more stably than the other friction parameters. We obtain Gc = 1.5 × 10^6 J m^-2 for the 1995 Kobe earthquake, in agreement with estimates for previous earthquakes. From this estimate and a plausible upper bound for the local rock strength we infer a lower bound for Dc of about 0.008 m. Copyright 2001 by the American Geophysical Union.
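
    For reference, one common form of the Dieterich-Ruina rate- and state-variable friction law referred to above is written below (aging form of the state evolution), together with the standard relation between apparent fracture energy and an equivalent linear slip-weakening description, which is the kind of bound used to infer a lower limit on Dc. Symbols follow conventional usage and are not taken from the paper itself.

    ```latex
    % Rate- and state-variable (Dieterich-Ruina) friction, aging-law state evolution:
    \tau = \sigma_n \left[ \mu_0 + a \ln\!\frac{V}{V_0} + b \ln\!\frac{V_0\,\theta}{d_c} \right],
    \qquad
    \frac{d\theta}{dt} = 1 - \frac{V\theta}{d_c}.
    % For an equivalent linear slip-weakening model, the apparent fracture energy is
    % G_c = \tfrac{1}{2}(\tau_p - \tau_r)\,D_c, so an upper bound on the strength drop
    % (\tau_p - \tau_r) gives a lower bound D_c \ge 2\,G_c / (\tau_p - \tau_r).
    ```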

  7. Preliminary map of peak horizontal ground acceleration for the Hanshin-Awaji earthquake of January 17, 1995, Japan - Description of Mapped Data Sets

    USGS Publications Warehouse

    Borcherdt, R.D.; Mark, R.K.

    1995-01-01

    The Hanshin-Awaji earthquake (also known as the Hyogo-ken Nanbu and the Great Hanshin earthquake) provided an unprecedented set of measurements of strong ground shaking. The measurements constitute the most comprehensive set of strong-motion recordings yet obtained for sites underlain by soft soil deposits of Holocene age within a few kilometers of the crustal rupture zone. The recordings, obtained on or near many important structures, provide an important new empirical data set for evaluating input ground motion levels and site amplification factors for codes and site-specific design procedures worldwide. This report describes the data used to prepare a preliminary map summarizing the strong motion data in relation to seismicity and underlying geology (Wentworth, Borcherdt, and Mark, 1995; Figure 1, hereafter referred to as Figure 1/I). The map shows station locations, peak acceleration values, and generalized acceleration contours superimposed on pertinent seismicity and the geologic map of Japan. The map (Figure 1/I) indicates a zone of high acceleration with ground motions throughout the zone greater than 400 gal and locally greater than 800 gal. This zone encompasses the area of most intense damage mapped as JMA intensity level 7, which extends through Kobe City. The zone of most intense damage is parallel to, but displaced slightly from, the surface projection of the crustal rupture zone implied by aftershock locations. The zone is underlain by soft-soil deposits of Holocene age.

  8. Particle Simulation of Coulomb Collisions: Comparing the Methods of Takizuka & Abe and Nanbu

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, C; Lin, T; Caflisch, R

    2007-05-22

    The interactions of charged particles in a plasma are governed by long-range Coulomb collisions. We compare two widely used Monte Carlo models for Coulomb collisions. One was developed by Takizuka and Abe in 1977, the other was developed by Nanbu in 1997. We perform deterministic and stochastic error analysis with respect to particle number and time step. The two models produce similar stochastic errors, but Nanbu's model gives smaller time step errors. Error comparisons between these two methods are presented.
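
    As context for the comparison, the sketch below shows the core update of the Takizuka and Abe (1977) binary-collision model for a single, equal-mass particle pair: the tangent of the half scattering angle is drawn from a zero-mean Gaussian whose variance grows linearly with the time step, and the relative velocity is rotated accordingly. The physical variance prefactor (charges, density, Coulomb logarithm, reduced mass) is deliberately left as a caller-supplied parameter rather than asserted here; this is an illustrative reconstruction, not the authors' code.

    ```python
    import numpy as np

    def takizuka_abe_pair(v1, v2, var_delta, rng=None):
        """One Takizuka-Abe collision for an equal-mass pair; var_delta scales with dt."""
        rng = rng or np.random.default_rng()
        u = v1 - v2                                    # relative velocity
        uabs = np.linalg.norm(u)
        uperp = np.hypot(u[0], u[1])
        delta = rng.normal(0.0, np.sqrt(var_delta))    # delta = tan(theta/2)
        sin_t = 2.0 * delta / (1.0 + delta**2)
        omc = 2.0 * delta**2 / (1.0 + delta**2)        # 1 - cos(theta)
        phi = rng.uniform(0.0, 2.0 * np.pi)
        cphi, sphi = np.cos(phi), np.sin(phi)
        if uperp > 0.0:
            du = np.array([
                (u[0] * u[2] / uperp) * sin_t * cphi - (u[1] / uperp) * uabs * sin_t * sphi - u[0] * omc,
                (u[1] * u[2] / uperp) * sin_t * cphi + (u[0] / uperp) * uabs * sin_t * sphi - u[1] * omc,
                -uperp * sin_t * cphi - u[2] * omc,
            ])
        else:                                          # u parallel to the z-axis
            du = np.array([uabs * sin_t * cphi, uabs * sin_t * sphi, -uabs * omc])
        # for unequal masses, scale du by m_reduced/m1 and m_reduced/m2 instead of 1/2
        return v1 + 0.5 * du, v2 - 0.5 * du
    ```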

  9. Particle simulation of Coulomb collisions: Comparing the methods of Takizuka and Abe and Nanbu

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang Chiaming; Lin, Tungyou; Caflisch, Russel

    2008-04-20

    The interactions of charged particles in a plasma are governed by long-range Coulomb collisions. We compare two widely used Monte Carlo models for Coulomb collisions. One was developed by Takizuka and Abe in 1977, the other was developed by Nanbu in 1997. We perform deterministic and statistical error analysis with respect to particle number and time step. The two models produce similar stochastic errors, but Nanbu's model gives smaller time step errors. Error comparisons between these two methods are presented.

  10. High-Speed Observations of Dynamic Fracture Propagation in Solids and Their Implications in Earthquake Rupture Dynamics

    NASA Astrophysics Data System (ADS)

    Uenishi, Koji

    2016-04-01

    This contribution outlines our experimental observations of seismicity-related fast fracture (rupture) propagation in solids utilising high-speed analog and digital photography (maximum frame rate 1,000,000 frames per second) over the last two decades. Dynamic fracture may be triggered or initiated in the monolithic or layered seismic models by detonation of micro explosives, a projectile launched by a gun, laser pulses and electric discharge impulses, etc. First, we have investigated strike-slip rupture along planes of weakness in transparent photoelastic (birefringent) materials at a laboratory scale and shown (at that time) extraordinarily fast rupture propagation in a bi-material system and its possible effect on the generation of large strong motion in the limited narrow areas in the Kobe region on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake (Uenishi Ph.D. thesis 1997, Uenishi et al. BSSA 1999). In this series of experiments, we have also modelled shallow dip-slip earthquakes and indicated a possible origin of the asymmetric ground motion in the hanging wall and footwall. In the photoelastic photographs, we have found the unique dynamic wave interaction and generation of specific shear and interface waves numerically predicted by Uenishi and Madariaga (Eos 2005), and considered as a case study the seismic motion associated with the 2014 Nagano-ken Hokubu (Kamishiro Fault), Japan, dip-slip earthquake (Uenishi EFA 2015). Second, we have experimentally shown that even in a monolithic material, rupture speed may exceed the local shear wave speed if we employ hyperelastically behaving materials like natural rubber (balloons) (Uenishi Eos 2006, Uenishi ICF 2009, Uenishi Trans. JSME A 2012) but fracture in typical monolithic thin fluid films (e.g. soap bubbles, which may be treated as a solid material) propagates at an ordinary subsonic (sub-Rayleigh) speed (Uenishi et al. SSJ 2006). More recent investigation handling three-dimensional rupture propagation

  11. Comparative study of two active faults in different stages of the earthquake cycle in central Japan -The Atera fault (with 1586 Tensho earthquake) and the Nojima fault (with 1995 Kobe earthquake)-

    NASA Astrophysics Data System (ADS)

    Matsuda, T.; Omura, K.; Ikeda, R.

    2003-12-01

    National Research Institute for Earth Science and Disaster Prevention (NIED) has been conducting "fault zone drilling". Fault zone drilling is especially important in understanding the structure, composition, and physical properties of an active fault. In the Chubu district of central Japan, large active faults such as the Atotsugawa (with the 1858 Hietsu earthquake) and the Atera (with the 1586 Tensho earthquake) faults exist. After the occurrence of the 1995 Kobe earthquake, it has been widely recognized that direct measurements in fault zones by drilling are important. Here we describe the Atera fault and the Nojima fault, because these two faults are similar in geological setting (mostly composed of granitic rocks), which makes a comparative study of the drilling investigations straightforward. The features of the Atera fault, which was dislocated by the 1586 Tensho earthquake, are as follows. Its total length is about 70 km. Its general trend is N45°W with left-lateral strike slip. The slip rate is estimated as 3-5 m per 1000 years. Seismicity is very low at present, and the lithologies around the fault are basically granitic rocks and rhyolite. Six boreholes have been drilled, with depths ranging from 400 m to 630 m. Four of these boreholes (Hatajiri, Fukuoka, Ueno and Kawaue) are located on a line crossing the Atera fault in a perpendicular direction. In the Kawaue well, mostly fractured and alternating granitic rock continued from the surface to the bottom at 630 m. X-ray fluorescence analysis (XRF) was conducted to estimate the amounts of major chemical elements using the glass bead method on core samples. The amounts of H2O+ are about 0.5 to 2.5 weight percent. This fractured zone is also characterized by logging data such as low resistivity, low P-wave velocity, low density and high neutron porosity. The 1995 Kobe (Hyogo-ken Nanbu) earthquake occurred along the NE-SW-trending Rokko-Awaji fault system, and the Nojima fault appeared on the surface on Awaji Island when this

  12. Large scale centrifuge test of a geomembrane-lined landfill subject to waste settlement and seismic loading.

    PubMed

    Kavazanjian, Edward; Gutierrez, Angel

    2017-10-01

    A large scale centrifuge test of a geomembrane-lined landfill subject to waste settlement and seismic loading was conducted to help validate a numerical model for performance based design of geomembrane liner systems. The test was conducted using the 240g-ton centrifuge at the University of California at Davis under the U.S. National Science Foundation Network for Earthquake Engineering Simulation Research (NEESR) program. A 0.05 mm thin film membrane was used to model the liner. The waste was modeled using a peat-sand mixture. The side slope membrane was underlain by lubricated low density polyethylene to maximize the difference between the interface shear strength on the top and bottom of the geomembrane and the induced tension in it. Instrumentation included thin film strain gages to monitor geomembrane strains and accelerometers to monitor seismic excitation. The model was subjected to an input design motion intended to simulate strong ground motion from the 1995 Hyogo-ken Nanbu earthquake. Results indicate that downdrag waste settlement and seismic loading together, and possibly each phenomenon individually, can induce potentially damaging tensile strains in geomembrane liners. The data collected from this test are publicly available and can be used to validate numerical models for the performance of geomembrane liner systems. Published by Elsevier Ltd.

  13. Seismic Risk Assessment of Italian Seaports Using GIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartolomei, Anna; Corigliano, Mirko; Lai, Carlo G.

    Seaports are crucial elements in the export and import of goods and/or the flow of travellers in the tourism industry of many industrialised nations, including Italy. Experience gained from recent earthquakes (e.g. 1989 Loma Prieta in the USA, 1995 Hyogoken-Nanbu and 2003 Tokachi-Oki in Japan) has dramatically demonstrated the seismic vulnerability of seaport structures and the severe damage that can be caused by ground shaking. In Italy, the Department of Civil Protection has funded a research project to develop a methodology for the seismic design of new marginal wharves and assessment of existing structures at seaports located in areas of medium or high seismicity. This paper shows part of the results of this research project, currently underway, with particular reference to the seismic risk assessment through an interactive, geographically referenced database (GIS). A standard risk assessment has been carried out for the Gioia Tauro port in Calabria (Italy) using the empirical curves implemented by the National Institute of Building Sciences (NIBS, 2004).

  14. Optical methods in fault dynamics

    NASA Astrophysics Data System (ADS)

    Uenishi, K.; Rossmanith, H. P.

    2003-10-01

    The Rayleigh pulse interaction with a pre-stressed, partially contacting interface between similar and dissimilar materials is investigated experimentally as well as numerically. This study is intended to obtain an improved understanding of the interface (fault) dynamics during the earthquake rupture process. Using dynamic photoelasticity in conjunction with high-speed cinematography, snapshots of time-dependent isochromatic fringe patterns associated with Rayleigh pulse-interface interaction are experimentally recorded. It is shown that interface slip (instability) can be triggered dynamically by a pulse which propagates along the interface at the Rayleigh wave speed. For the numerical investigation, the finite difference wave simulator SWIFD is used for solving the problem under different combinations of contacting materials. The effect of the acoustic impedance ratio of the two contacting materials on the wave patterns is discussed. The results indicate that upon interface rupture, Mach (head) waves, which carry a relatively large amount of energy in a concentrated form, can be generated and propagated from the interface contact region (asperity) into the acoustically softer material. Such Mach waves can cause severe damage to a particular region inside an adjacent acoustically softer area. This type of damage concentration might be a possible reason for the generation of the "damage belt" in Kobe, Japan, on the occasion of the 1995 Hyogo-ken Nanbu (Kobe) Earthquake.

  15. The quest for better quality-of-life - learning from large-scale shaking table tests

    NASA Astrophysics Data System (ADS)

    Nakashima, M.; Sato, E.; Nagae, T.; Kunio, F.; Takahito, I.

    2010-12-01

    Earthquake engineering has its origins in the practice of “learning from actual earthquakes and earthquake damages.” That is, we recognize serious problems by witnessing the actual damage to our structures, and then we develop and apply engineering solutions to solve these problems. This tradition in earthquake engineering, i.e., “learning from actual damage,” was an obvious engineering response to earthquakes and arose naturally as a practice in a civil and building engineering discipline that traditionally places more emphasis on experience than do other engineering disciplines. But with the rapid progress of urbanization, as society becomes denser, and as the many components that form our society interact with increasing complexity, the potential damage with which earthquakes threaten the society also increases. In such an era, the approach of ”learning from actual earthquake damages” becomes unacceptably dangerous and expensive. Among the practical alternatives to the old practice is to “learn from quasi-actual earthquake damages.” One tool for experiencing earthquake damages without attendant catastrophe is the large shaking table. E-Defense, the largest one we have, was developed in Japan after the 1995 Hyogoken-Nanbu (Kobe) earthquake. Since its inauguration in 2005, E-Defense has conducted over forty full-scale or large-scale shaking table tests, applied to a variety of structural systems. The tests supply detailed data on actual behavior and collapse of the tested structures, offering the earthquake engineering community opportunities to experience and assess the actual seismic performance of the structures, and to help society prepare for earthquakes. Notably, the data were obtained without having to wait for the aftermaths of actual earthquakes. Earthquake engineering has always been about life safety, but in recent years maintaining the quality of life has also become a critical issue. Quality-of-life concerns include nonstructural

  16. In-situ stress measurements using core-based methods in the vicinity of Nojima fault.

    NASA Astrophysics Data System (ADS)

    Yano, S.; Sugimoto, T.; Lin, W.; Lin, A.

    2017-12-01

    In the cycle of repeated earthquake occurrence, stress accumulates on the source fault and its surroundings during the interseismic period until the next earthquake, and is released abruptly when the earthquake occurs. However, the quantitative relationship between stress change and earthquake occurrence remains largely unknown. Hence, in order to improve our understanding of the mechanisms of earthquake generation, it is important to grasp the stress state in the vicinity of the source fault and to evaluate its change over time. In this study, we carried out in-situ stress measurements using core samples obtained from a scientific drilling hole that penetrated the Nojima fault, which ruptured and caused the 1995 Hyogo-ken Nanbu earthquake, Japan. Our stress measurements were conducted from 2016 to 2017, about 22 years after the earthquake. For this purpose, we applied the Anelastic Strain Recovery (ASR) method and Diametrical Core Deformation Analysis (DCDA). First, we measured the change of ASR with time of the cores soon after stress release and calculated the three-dimensional principal in-situ stress orientations and magnitudes from the ASR data. To ensure a sufficient amount of ASR, we conducted the measurements using cores collected within a short time (e.g. 2.5 - 3.5 hours) after stress release by drilling, at an on-site laboratory at the drilling site on Awaji Island, Japan. The site is located at the south-western part of the Nojima fault. For DCDA, we measured the core diameters in all (360°) azimuths and determined the difference between the two horizontal principal stresses and their orientation, using cores other than those used for ASR. The DCDA experiments were conducted indoors, a long time after core collection. The lithology of all the core samples used for ASR and DCDA is granite; 19 and 7 cores were used for ASR and DCDA, respectively. As a result, it was found that the stress state in the depth range of 500 - 560 m and
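
    As a small illustration of the DCDA step described above, the sketch below fits d(θ) = d0 + A·cos 2(θ - θ0) to diameters measured around the core circumference and returns the mean diameter, the maximum-minus-minimum diameter (2A), and the azimuth of the maximum diameter, taken here to parallel the maximum horizontal stress. Converting the normalized diameter difference into a stress difference requires an elastic factor from the DCDA formulation that is intentionally not reproduced here; the function and parameter names are illustrative.

    ```python
    # Hedged sketch of the DCDA curve-fitting step (not the authors' code).
    import numpy as np
    from scipy.optimize import curve_fit

    def fit_dcda(theta_deg, diameters_mm):
        theta = np.deg2rad(np.asarray(theta_deg, dtype=float))
        d = np.asarray(diameters_mm, dtype=float)

        def model(t, d0, a_cos, a_sin):
            # d0 + A*cos(2*(t - t0)) rewritten in linear-in-parameters form
            return d0 + a_cos * np.cos(2 * t) + a_sin * np.sin(2 * t)

        (d0, a_cos, a_sin), _ = curve_fit(model, theta, d, p0=[d.mean(), 0.0, 0.0])
        amp = np.hypot(a_cos, a_sin)                        # A
        theta0 = 0.5 * np.degrees(np.arctan2(a_sin, a_cos)) % 180.0
        return {"mean_diameter_mm": d0,
                "dmax_minus_dmin_mm": 2.0 * amp,
                "azimuth_of_dmax_deg": theta0}
    ```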

  17. Connecting slow earthquakes to huge earthquakes.

    PubMed

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  18. The HayWired Earthquake Scenario—Earthquake Hazards

    USGS Publications Warehouse

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  19. Earthquakes

    MedlinePlus

    An earthquake is the sudden, rapid shaking of the earth, ... by the breaking and shifting of underground rock. Earthquakes can cause buildings to collapse and cause heavy ...

  20. Earthquakes.

    ERIC Educational Resources Information Center

    Walter, Edward J.

    1977-01-01

    Presents an analysis of the causes of earthquakes. Topics discussed include (1) geological and seismological factors that determine the effect of a particular earthquake on a given structure; (2) description of some large earthquakes such as the San Francisco quake; and (3) prediction of earthquakes. (HM)

  1. Earthquakes.

    ERIC Educational Resources Information Center

    Pakiser, Louis C.

    One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…

  2. The 2004 Parkfield, CA Earthquake: A Teachable Moment for Exploring Earthquake Processes, Probability, and Earthquake Prediction

    NASA Astrophysics Data System (ADS)

    Kafka, A.; Barnett, M.; Ebel, J.; Bellegarde, H.; Campbell, L.

    2004-12-01

    The occurrence of the 2004 Parkfield earthquake provided a unique "teachable moment" for students in our science course for teacher education majors. The course uses seismology as a medium for teaching a wide variety of science topics appropriate for future teachers. The 2004 Parkfield earthquake occurred just 15 minutes after our students completed a lab on earthquake processes and earthquake prediction. That lab included a discussion of the Parkfield Earthquake Prediction Experiment as a motivation for the exercises they were working on that day. Furthermore, this earthquake was recorded on an AS1 seismograph right in their lab, just minutes after the students left. About an hour after we recorded the earthquake, the students were able to see their own seismogram of the event in the lecture part of the course, which provided an excellent teachable moment for a lecture/discussion on how the occurrence of the 2004 Parkfield earthquake might affect seismologists' ideas about earthquake prediction. The specific lab exercise that the students were working on just before we recorded this earthquake was a "sliding block" experiment that simulates earthquakes in the classroom. The experimental apparatus includes a flat board on top of which are blocks of wood attached to a bungee cord and a string wrapped around a hand crank. Plate motion is modeled by slowly turning the crank, and earthquakes are modeled as events in which the block slips ("blockquakes"). We scaled the earthquake data and the blockquake data (using how much the string moved as a proxy for time) so that we could compare blockquakes and earthquakes. This provided an opportunity to use interevent-time histograms to teach about earthquake processes, probability, and earthquake prediction, and to compare earthquake sequences with blockquake sequences. We were able to show the students, using data obtained directly from their own lab, how global earthquake data fit a Poisson exponential distribution better
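
    The classroom comparison described above (interevent-time histograms versus a Poisson model) can be reproduced with a few lines of analysis. The sketch below, with illustrative names only, computes interevent times from a list of event times and tests them against the exponential distribution implied by a Poisson process.

    ```python
    # Hedged sketch: interevent times of earthquakes or "blockquakes" compared
    # with the exponential distribution of a Poisson process.
    import numpy as np
    from scipy import stats

    def poisson_interevent_check(event_times):
        t = np.sort(np.asarray(event_times, dtype=float))
        dt = np.diff(t)                       # interevent times
        rate = 1.0 / dt.mean()                # maximum-likelihood Poisson rate
        # one-sample Kolmogorov-Smirnov test against Expon(scale = 1/rate)
        ks_stat, p_value = stats.kstest(dt, "expon", args=(0.0, 1.0 / rate))
        return {"rate_per_unit_time": rate, "ks_stat": ks_stat, "p_value": p_value}
    ```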

  3. Earthquakes

    MedlinePlus

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  4. High stresses stored in fault zones: example of the Nojima fault (Japan)

    NASA Astrophysics Data System (ADS)

    Boullier, Anne-Marie; Robach, Odile; Ildefonse, Benoît; Barou, Fabrice; Mainprice, David; Ohtani, Tomoyuki; Fujimoto, Koichiro

    2018-04-01

    During the last decade pulverized rocks have been described on outcrops along large active faults and attributed to damage related to a propagating seismic rupture front. Questions remain concerning the maximal lateral distance from the fault plane and maximal depth for dynamic damage to be imprinted in rocks. In order to document these questions, a representative core sample of granodiorite located 51.3 m from the Nojima fault (Japan) that was drilled after the Hyogo-ken Nanbu (Kobe) earthquake is studied by using electron backscattered diffraction (EBSD) and high-resolution X-ray Laue microdiffraction. Although located outside of the Nojima fault damage zone and macroscopically undeformed, the sample shows pervasive microfractures and local fragmentation. These features are attributed to the first stage of seismic activity along the Nojima fault characterized by laumontite as the main sealing mineral. EBSD mapping was used in order to characterize the crystallographic orientation and deformation microstructures in the sample, and X-ray microdiffraction was used to measure elastic strain and residual stresses on each point of the mapped quartz grain. Both methods give consistent results on the crystallographic orientation and show small and short wavelength misorientations associated with laumontite-sealed microfractures and alignments of tiny fluid inclusions. Deformation microstructures in quartz are symptomatic of the semi-brittle faulting regime, in which low-temperature brittle plastic deformation and stress-driven dissolution-deposition processes occur conjointly. This deformation occurred at a 3.7-11.1 km depth interval as indicated by the laumontite stability domain. Residual stresses are calculated, via Hooke's law, from the deviatoric elastic strain tensor measured using X-ray Laue microdiffraction. The modal value of the von Mises stress distribution is at 100 MPa and the mean at 141 MPa. Such stress values are comparable to the peak strength of a
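
    A minimal sketch of that last computation is given below, assuming isotropic elasticity with a single shear modulus for brevity (the study deals with quartz, whose anisotropic stiffness tensor would normally be used): the deviatoric stress follows from the measured deviatoric elastic strain via Hooke's law, and the von Mises equivalent stress is then reported. The modulus in the example comment is a rough, quartz-like assumption.

    ```python
    # Hedged sketch: von Mises residual stress from a deviatoric elastic strain
    # tensor, isotropic Hooke's law only.
    import numpy as np

    def von_mises_from_deviatoric_strain(eps_dev, shear_modulus_pa):
        eps_dev = np.asarray(eps_dev, dtype=float)
        # remove any residual trace so the tensor is strictly deviatoric
        eps_dev = eps_dev - np.trace(eps_dev) / 3.0 * np.eye(3)
        sigma_dev = 2.0 * shear_modulus_pa * eps_dev          # deviatoric Hooke's law
        return np.sqrt(1.5 * np.tensordot(sigma_dev, sigma_dev))

    # Example with a hypothetical strain of a few 1e-3 and G ~ 44 GPa (quartz-like):
    # vm = von_mises_from_deviatoric_strain(
    #     [[1e-3, 2e-4, 0.0], [2e-4, -5e-4, 0.0], [0.0, 0.0, -5e-4]], 44e9)
    ```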

  5. The 1868 Hayward Earthquake Alliance: A Case Study - Using an Earthquake Anniversary to Promote Earthquake Preparedness

    NASA Astrophysics Data System (ADS)

    Brocher, T. M.; Garcia, S.; Aagaard, B. T.; Boatwright, J. J.; Dawson, T.; Hellweg, M.; Knudsen, K. L.; Perkins, J.; Schwartz, D. P.; Stoffer, P. W.; Zoback, M.

    2008-12-01

    Last October 21st marked the 140th anniversary of the M6.8 1868 Hayward Earthquake, the last damaging earthquake on the southern Hayward Fault. This anniversary was used to help publicize the seismic hazards associated with the fault because: (1) the past five such earthquakes on the Hayward Fault occurred about 140 years apart on average, and (2) the Hayward-Rodgers Creek Fault system is the most likely (with a 31 percent probability) fault in the Bay Area to produce a M6.7 or greater earthquake in the next 30 years. To promote earthquake awareness and preparedness, over 140 public and private agencies and companies and many individuals joined the public-private nonprofit 1868 Hayward Earthquake Alliance (1868alliance.org). The Alliance sponsored many activities including a public commemoration at Mission San Jose in Fremont, which survived the 1868 earthquake. This event was followed by an earthquake drill at Bay Area schools involving more than 70,000 students. The anniversary prompted the Silver Sentinel, an earthquake response exercise based on the scenario of an earthquake on the Hayward Fault conducted by Bay Area County Offices of Emergency Services. Sixty other public and private agencies also participated in this exercise. The California Seismic Safety Commission and KPIX (CBS affiliate) produced professional videos designed for school classrooms promoting Drop, Cover, and Hold On. Starting in October 2007, the Alliance and the U.S. Geological Survey held a sequence of press conferences to announce the release of new research on the Hayward Fault as well as new loss estimates for a Hayward Fault earthquake. These included: (1) a ShakeMap for the 1868 Hayward earthquake, (2) a report by the U. S. Bureau of Labor Statistics forecasting the number of employees, employers, and wages predicted to be within areas most strongly shaken by a Hayward Fault earthquake, (3) new estimates of the losses associated with a Hayward Fault earthquake, (4) new ground motion

  6. The Loma Prieta, California, Earthquake of October 17, 1989: Earthquake Occurrence

    USGS Publications Warehouse

    Coordinated by Bakun, William H.; Prescott, William H.

    1993-01-01

    Professional Paper 1550 seeks to understand the M6.9 Loma Prieta earthquake itself. It examines how the fault that generated the earthquake ruptured, searches for and evaluates precursors that may have indicated an earthquake was coming, reviews forecasts of the earthquake, and describes the geology of the earthquake area and the crustal forces that affect this geology. Some significant findings were: * Slip during the earthquake occurred on 35 km of fault at depths ranging from 7 to 20 km. Maximum slip was approximately 2.3 m. The earthquake may not have released all of the strain stored in rocks next to the fault, indicating that the potential for another damaging earthquake in the Santa Cruz Mountains in the near future may still exist. * The earthquake involved a large amount of uplift on a dipping fault plane. Pre-earthquake conventional wisdom was that large earthquakes in the Bay area occurred as horizontal displacements on predominantly vertical faults. * The fault segment that ruptured approximately coincided with a fault segment identified in 1988 as having a 30% probability of generating a M7 earthquake in the next 30 years. This was one of more than 20 relevant earthquake forecasts made in the 83 years before the earthquake. * Calculations show that the Loma Prieta earthquake changed stresses on nearby faults in the Bay area. In particular, the earthquake reduced stresses on the Hayward Fault which decreased the frequency of small earthquakes on it. * Geological and geophysical mapping indicate that, although the San Andreas Fault can be mapped as a through-going fault in the epicentral region, the southwest-dipping Loma Prieta rupture surface is a separate fault strand and one of several along this part of the San Andreas that may be capable of generating earthquakes.

  7. PAGER--Rapid assessment of an earthquake's impact

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.; Marano, K.D.; Bausch, D.; Hearne, M.

    2010-01-01

    PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts--which were formerly sent based only on event magnitude and location, or population exposure to shaking--now will also be generated based on the estimated range of fatalities and economic losses.

  8. Geometry of the Nojima fault at Nojima-Hirabayashi, Japan - I. A simple damage structure inferred from borehole core permeability

    USGS Publications Warehouse

    Lockner, David A.; Tanaka, Hidemi; Ito, Hisao; Ikeda, Ryuji; Omura, Kentaro; Naka, Hisanobu

    2009-01-01

    The 1995 Kobe (Hyogo-ken Nanbu) earthquake, M = 7.2, ruptured the Nojima fault in southwest Japan. We have studied core samples taken from two scientific drillholes that crossed the fault zone SW of the epicentral region on Awaji Island. The shallower hole, drilled by the Geological Survey of Japan (GSJ), was started 75 m to the SE of the surface trace of the Nojima fault and crossed the fault at a depth of 624 m. A deeper hole, drilled by the National Research Institute for Earth Science and Disaster Prevention (NIED) was started 302 m to the SE of the fault and crossed fault strands below a depth of 1140 m. We have measured strength and matrix permeability of core samples taken from these two drillholes. We find a strong correlation between permeability and proximity to the fault zone shear axes. The half-width of the high permeability zone (approximately 15 to 25 m) is in good agreement with the fault zone width inferred from trapped seismic wave analysis and other evidence. The fault zone core or shear axis contains clays with permeabilities of approximately 0.1 to 1 microdarcy at 50 MPa effective confining pressure (10 to 30 microdarcy at in situ pressures). Within a few meters of the fault zone core, the rock is highly fractured but has sustained little net shear. Matrix permeability of this zone is approximately 30 to 60 microdarcy at 50 MPa effective confining pressure (300 to 1000 microdarcy at in situ pressures). Outside this damage zone, matrix permeability drops below 0.01 microdarcy. The clay-rich core material has the lowest strength with a coefficient of friction of approximately 0.55. Shear strength increases with distance from the shear axis. These permeability and strength observations reveal a simple fault zone structure with a relatively weak fine-grained core surrounded by a damage zone of fractured rock. In this case, the damage zone will act as a high-permeability conduit for vertical and horizontal flow in the plane of the

  9. Operational earthquake forecasting can enhance earthquake preparedness

    USGS Publications Warehouse

    Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C.

    2014-01-01

    We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time‐dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground‐motion exceedance probabilities as well as short‐term rupture probabilities—in concert with the long‐term forecasts of probabilistic seismic‐hazard analysis (PSHA).

  10. Earthquakes for Kids

    MedlinePlus

    ... across a fault to learn about past earthquakes. A GPS instrument measures slow movements of the ground. ...

  11. Missing great earthquakes

    USGS Publications Warehouse

    Hough, Susan E.

    2013-01-01

    The occurrence of three earthquakes with moment magnitude (Mw) greater than 8.8 and six earthquakes larger than Mw 8.5, since 2004, has raised interest in the long-term global rate of great earthquakes. Past studies have focused on the analysis of earthquakes since 1900, which roughly marks the start of the instrumental era in seismology. Before this time, the catalog is less complete and magnitude estimates are more uncertain. Yet substantial information is available for earthquakes before 1900, and the catalog of historical events is being used increasingly to improve hazard assessment. Here I consider the catalog of historical earthquakes and show that approximately half of all Mw ≥ 8.5 earthquakes are likely missing or underestimated in the 19th century. I further present a reconsideration of the felt effects of the 8 February 1843, Lesser Antilles earthquake, including a first thorough assessment of felt reports from the United States, and show it is an example of a known historical earthquake that was significantly larger than initially estimated. The results suggest that incorporation of best available catalogs of historical earthquakes will likely lead to a significant underestimation of seismic hazard and/or the maximum possible magnitude in many regions, including parts of the Caribbean.

  12. Analog earthquakes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hofmann, R.B.

    1995-09-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.

  13. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake

    NASA Astrophysics Data System (ADS)

    Durukal, E.; Sesetyan, K.; Erdik, M.

    2009-04-01

    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake, with an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The TCIP system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and with approximately one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, inadequate premium structure and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering incurred building losses in Istanbul in the event of a large earthquake. The annualized earthquake losses in Istanbul are between 140 and 300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification to the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers and development of a claim processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. In such a model the losses would not be indemnified, but would instead be calculated directly on the basis of indexed ground motion levels and damages. The immediate improvement of a parametric insurance model over the existing one will be the elimination of the claim processing

  14. Earthquake potential revealed by tidal influence on earthquake size-frequency statistics

    NASA Astrophysics Data System (ADS)

    Ide, Satoshi; Yabe, Suguru; Tanaka, Yoshiyuki

    2016-11-01

    The possibility that tidal stress can trigger earthquakes has long been debated. In particular, a clear causal relationship between small earthquakes and the phase of tidal stress is elusive. However, tectonic tremors deep within subduction zones are highly sensitive to tidal stress levels, with tremor rate increasing at an exponential rate with rising tidal stress. Thus, slow deformation and the possibility of earthquakes at subduction plate boundaries may be enhanced during periods of large tidal stress. Here we calculate the tidal stress history, and specifically the amplitude of tidal stress, on a fault plane in the two weeks before large earthquakes globally, based on data from the global, Japanese, and Californian earthquake catalogues. We find that very large earthquakes, including the 2004 Sumatra earthquake, the 2010 Maule earthquake in Chile and the 2011 Tohoku-Oki earthquake in Japan, tend to occur near the time of maximum tidal stress amplitude. This tendency is not obvious for small earthquakes. However, we also find that the fraction of large earthquakes increases (the b-value of the Gutenberg-Richter relation decreases) as the amplitude of tidal shear stress increases. The relationship is also reasonable, considering the well-known relationship between stress and the b-value. This suggests that the probability of a tiny rock failure expanding to a gigantic rupture increases with increasing tidal stress levels. We conclude that large earthquakes are more probable during periods of high tidal stress.
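
    The trend described above (b-value decreasing with tidal shear-stress amplitude) rests on estimating the Gutenberg-Richter b-value in groups of events. A minimal sketch using Aki's maximum-likelihood estimator, b = log10(e)/(mean(M) - Mc), is given below; completeness analysis and magnitude-binning corrections are omitted, and the names and bin count are illustrative.

    ```python
    # Hedged sketch: Gutenberg-Richter b-value within quantile bins of the tidal
    # shear-stress amplitude assigned to each event.
    import numpy as np

    LOG10_E = np.log10(np.e)

    def b_value(magnitudes, mc):
        """Aki maximum-likelihood b-value for events with M >= mc."""
        m = np.asarray(magnitudes, dtype=float)
        m = m[m >= mc]
        return LOG10_E / (m.mean() - mc)

    def b_vs_tidal_stress(magnitudes, tidal_amplitudes, mc, n_bins=5, min_events=30):
        mags = np.asarray(magnitudes, dtype=float)
        amps = np.asarray(tidal_amplitudes, dtype=float)
        edges = np.quantile(amps, np.linspace(0.0, 1.0, n_bins + 1))
        idx = np.digitize(amps, edges[1:-1])      # bin index 0 .. n_bins-1
        results = []
        for k in range(n_bins):
            sel = (idx == k) & (mags >= mc)
            if np.count_nonzero(sel) >= min_events:
                results.append((amps[idx == k].mean(), b_value(mags[sel], mc)))
        return results
    ```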

  15. Understanding earthquake from the granular physics point of view — Causes of earthquake, earthquake precursors and predictions

    NASA Astrophysics Data System (ADS)

    Lu, Kunquan; Hou, Meiying; Jiang, Zehui; Wang, Qiang; Sun, Gang; Liu, Jixing

    2018-03-01

    We treat the earth crust and mantle as large-scale discrete matter based on the principles of granular physics and existing experimental observations. The main outcomes are: A granular model of the structure and movement of the earth crust and mantle is established. The formation mechanism of the tectonic forces, which cause earthquakes, and a model of propagation for precursory information are proposed. Properties of the seismic precursory information and its relevance to earthquake occurrence are illustrated, and principles for detecting effective seismic precursors are elaborated. The mechanism of deep-focus earthquakes is also explained by the jamming-unjamming transition of the granular flow. Some earthquake phenomena which were previously difficult to understand are explained, and the predictability of earthquakes is discussed. Due to the discrete nature of the earth crust and mantle, continuum theory no longer applies during the quasi-static seismological process. In this paper, based on the principles of granular physics, we study the causes of earthquakes, earthquake precursors and predictions, and a new understanding, different from the traditional seismological viewpoint, is obtained.

  16. Redefining Earthquakes and the Earthquake Machine

    ERIC Educational Resources Information Center

    Hubenthal, Michael; Braile, Larry; Taber, John

    2008-01-01

    The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…

  17. OMG Earthquake! Can Twitter improve earthquake response?

    NASA Astrophysics Data System (ADS)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 to 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information

  18. Twitter earthquake detection: Earthquake monitoring in a social world

    USGS Publications Warehouse

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
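
    The detection procedure described above lends itself to a very compact illustration: the sketch below applies a short-term-average / long-term-average (STA/LTA) trigger to a per-minute count of tweets containing the word "earthquake". The window lengths and threshold are illustrative assumptions, not the operational USGS settings.

    ```python
    # Hedged sketch: STA/LTA trigger on a tweet-frequency time series.
    import numpy as np

    def sta_lta_triggers(counts_per_minute, sta_win=2, lta_win=60, threshold=10.0):
        x = np.asarray(counts_per_minute, dtype=float)
        triggers = []
        for i in range(lta_win, len(x)):
            sta = x[i - sta_win + 1:i + 1].mean()   # short-term average ending now
            lta = x[i - lta_win:i].mean()           # long-term average of the preceding window
            if lta > 0 and sta / lta >= threshold:
                triggers.append(i)                  # minute index at which the detector fires
        return triggers
    ```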

  19. Characterization of Seismogenic Faults of Central Japan by Geophysical Survey and Drilling

    NASA Astrophysics Data System (ADS)

    Ikeda, R.; Omura, K.; Matsuda, T.

    2004-12-01

    Integrated investigations of seismogenic faults by geophysical survey and drilling are indispensable for better understanding the deep structure and physical properties of a fault fracture zone. In central Japan, three large active faults, the Neodani, Atotsugawa, and Atera faults, are notable targets for research because they have the potential to produce magnitude 7 to 8 class earthquakes and display different characteristics of seismogenic activity. Each individual fault shows its own characteristic features, which may reflect different stages in an earthquake cycle. High seismicity is concentrated with a clear lineation on and around the Atotsugawa fault, recognized as aftershocks of the latest event, the 1858 Hida earthquake (M=7.0). On the other hand, extremely low seismicity is found around the Atera fault, some parts of which appear to have been dislocated by the 1586 Tensyo earthquake (M=7.9). As an example of the results of study at the Atera fault, we obtained a wide variety of information on fault structure, constituent materials, and the state of crustal stress and strength of the fault from geophysical surveys (resistivity and gravity) and in-situ borehole experiments. Our findings are as follows: (1) The fracture zone around the Atera fault shows a very wide and complex fracture structure, approximately 1 km to 4 km wide. (2) The average slip rate was estimated to be 5.3 m/1000 yr from the distribution of basalt dated at 1.5 Ma by radiometric dating. We infer that the Atera fault has been repeatedly active in recent geologic time; however, it is in a very weak state at present. (3) Stress magnitude decreases toward the center of the fracture zone. These are important results for evaluating fault activity. Recent in-situ downhole measurements and coring through active faults have provided us with new insights into the physical properties of fault zones. In the vicinity of the epicenter of the 1995 Hyogo-ken Nanbu (Kobe

  20. Crustal earthquake triggering by pre-historic great earthquakes on subduction zone thrusts

    USGS Publications Warehouse

    Sherrod, Brian; Gomberg, Joan

    2014-01-01

    Triggering of earthquakes on upper plate faults during and shortly after recent great (M>8.0) subduction thrust earthquakes raises concerns about earthquake triggering following Cascadia subduction zone earthquakes. Of particular concern for Cascadia was the previously noted, but only qualitatively identified, clustering of M>~6.5 crustal earthquakes in the Puget Sound region between about 1200–900 cal yr B.P. and the possibility that this cluster was triggered by a great Cascadia subduction thrust earthquake, and therefore portends future such clusters. We confirm quantitatively the extraordinary nature of the Puget Sound region crustal earthquake clustering between 1200–900 cal yr B.P., at least over the last 16,000 years. We conclude that this cluster was not triggered by the penultimate, and possibly full-margin, great Cascadia subduction thrust earthquake. However, we also show that the paleoseismic record for Cascadia is consistent with the conclusions of our companion study of the global modern record outside Cascadia, that M>8.6 subduction thrust events have a high probability of triggering at least one M>~6.5 crustal earthquake.

  1. Impact of earthquakes on sex ratio at birth: Eastern Marmara earthquakes

    PubMed Central

    Doğer, Emek; Çakıroğlu, Yiğit; Köpük, Şule Yıldırım; Ceylan, Yasin; Şimşek, Hayal Uzelli; Çalışkan, Eray

    2013-01-01

    Objective: Previous reports suggest that maternal exposure to acute stress related to earthquakes affects the sex ratio at birth. Our aim was to examine the change in sex ratio at birth after the Eastern Marmara earthquake disasters. Material and Methods: This study was performed using the official birth statistics from January 1997 to December 2002 – before and after 17 August 1999, the date of the Golcuk Earthquake – supplied by the Turkish Statistical Institute. The secondary sex ratio was expressed as the male proportion at birth, and the ratios for the affected and unaffected areas were calculated and compared on a monthly basis using the chi-square test. Results: We observed significant decreases in the secondary sex ratio in the 4th and 8th months following the earthquake in the affected region compared to the unaffected region (p= 0.001 and p= 0.024). In the earthquake region, the decrease observed in the secondary sex ratio during the 8th month after the earthquake was specific to the period after the earthquake. Conclusion: Our study indicated a significant reduction in the secondary sex ratio after an earthquake. These findings suggest that events causing sudden, intense stress, such as earthquakes, can affect the sex ratio at birth. PMID:24592082
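
    A minimal sketch of the kind of monthly comparison described above: a chi-square test on male/female birth counts from an affected and an unaffected region. The counts below are invented for illustration only; the study used official birth statistics.

    ```python
    # Sketch: 2x2 chi-square test comparing the male proportion at birth in an
    # affected vs. unaffected region for one month. Counts are invented.
    from scipy.stats import chi2_contingency

    #                   males  females
    affected_month   = [4800, 4950]
    unaffected_month = [5200, 5020]

    chi2, p, dof, _ = chi2_contingency([affected_month, unaffected_month])
    male_proportion_affected = affected_month[0] / sum(affected_month)
    print(f"male proportion (affected) = {male_proportion_affected:.3f}, p = {p:.3f}")
    ```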

  2. Large-Scale Earthquake Countermeasures Act and the Earthquake Prediction Council in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rikitake, T.

    1979-08-07

    The Large-Scale Earthquake Countermeasures Act was enacted in Japan in December 1978. This act aims at mitigating earthquake hazards by designating an area as an area under intensified measures against earthquake disaster, such designation being based on long-term earthquake prediction information, and by issuing an earthquake warning statement based on imminent prediction information, when possible. In an emergency case as defined by the law, the prime minister is empowered to take various actions which cannot be taken at ordinary times. For instance, he may ask the Self-Defense Force to come into the earthquake-threatened area before the earthquake occurrence. A Prediction Council has been formed in order to evaluate premonitory effects that might be observed over the Tokai area, which was designated an area under intensified measures against earthquake disaster in June 1979. An extremely dense observation network has been constructed over the area.

  3. Foreshocks, aftershocks, and earthquake probabilities: Accounting for the landers earthquake

    USGS Publications Warehouse

    Jones, Lucile M.

    1994-01-01

    The equation to determine the probability that an earthquake occurring near a major fault will be a foreshock to a mainshock on that fault is modified to include the case of aftershocks to a previous earthquake occurring near the fault. The addition of aftershocks to the background seismicity makes it less probable that an earthquake will be a foreshock, because nonforeshocks have become more common. As the aftershocks decay with time, the probability that an earthquake will be a foreshock increases. However, fault interactions between the first mainshock and the major fault can increase the long-term probability of a characteristic earthquake on that fault, which will, in turn, increase the probability that an event is a foreshock, compensating for the decrease caused by the aftershocks.
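
    The argument can be illustrated with a simple hedged calculation: if the chance that a nearby event is a foreshock is roughly the foreshock rate divided by the total rate of candidate events, then adding an Omori-decaying aftershock rate to the background lowers that chance early on and lets it recover as the aftershocks decay. The rates and Omori parameters below are placeholders, not values from Jones (1994).

    ```python
    # Illustrative only: how an Omori-decaying aftershock rate added to the
    # background dilutes the probability that a new nearby event is a foreshock.
    # All rates and Omori parameters are placeholders, not values from Jones (1994).
    def foreshock_probability(t_days, foreshock_rate=0.01, background_rate=0.19,
                              k=5.0, c=0.05, p=1.0):
        aftershock_rate = k / (t_days + c) ** p      # modified Omori law
        total = foreshock_rate + background_rate + aftershock_rate
        return foreshock_rate / total

    for t in (1, 10, 100, 1000):
        print(f"t = {t:4d} days   P(event is a foreshock) = {foreshock_probability(t):.4f}")
    ```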

  4. Earthquake catalog for estimation of maximum earthquake magnitude, Central and Eastern United States: Part B, historical earthquakes

    USGS Publications Warehouse

    Wheeler, Russell L.

    2014-01-01

    Computation of probabilistic earthquake hazard requires an estimate of Mmax: the moment magnitude of the largest earthquake that is thought to be possible within a specified geographic region. The region specified in this report is the Central and Eastern United States and adjacent Canada. Parts A and B of this report describe the construction of a global catalog of moderate to large earthquakes that occurred worldwide in tectonic analogs of the Central and Eastern United States. Examination of histograms of the magnitudes of these earthquakes allows estimation of Central and Eastern United States Mmax. The catalog and Mmax estimates derived from it are used in the 2014 edition of the U.S. Geological Survey national seismic-hazard maps. Part A deals with prehistoric earthquakes, and this part deals with historical events.

  5. Earthquake and tsunami forecasts: Relation of slow slip events to subsequent earthquake rupture

    PubMed Central

    Dixon, Timothy H.; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-01-01

    The 5 September 2012 Mw 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr–Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential. PMID:25404327

  6. Earthquake and tsunami forecasts: relation of slow slip events to subsequent earthquake rupture.

    PubMed

    Dixon, Timothy H; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-12-02

    The 5 September 2012 M(w) 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr-Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential.

  7. Understanding earthquake hazards in urban areas - Evansville Area Earthquake Hazards Mapping Project

    USGS Publications Warehouse

    Boyd, Oliver S.

    2012-01-01

    The region surrounding Evansville, Indiana, has experienced minor damage from earthquakes several times in the past 200 years. Because of this history and the proximity of Evansville to the Wabash Valley and New Madrid seismic zones, there is concern among nearby communities about hazards from earthquakes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as a result of an earthquake and are able to design structures to withstand this estimated ground shaking. Earthquake-hazard maps provide one way of conveying such information and can help the region of Evansville prepare for future earthquakes and reduce earthquake-caused loss of life and financial and structural loss. The Evansville Area Earthquake Hazards Mapping Project (EAEHMP) has produced three types of hazard maps for the Evansville area: (1) probabilistic seismic-hazard maps show the ground motion that is expected to be exceeded with a given probability within a given period of time; (2) scenario ground-shaking maps show the expected shaking from two specific scenario earthquakes; (3) liquefaction-potential maps show how likely the strong ground shaking from the scenario earthquakes is to produce liquefaction. These maps complement the U.S. Geological Survey's National Seismic Hazard Maps but are more detailed regionally and take into account surficial geology, soil thickness, and soil stiffness; these elements greatly affect ground shaking.

  8. Continuing megathrust earthquake potential in Chile after the 2014 Iquique earthquake

    USGS Publications Warehouse

    Hayes, Gavin P.; Herman, Matthew W.; Barnhart, William D.; Furlong, Kevin P.; Riquelme, Sebástian; Benz, Harley M.; Bergman, Eric; Barrientos, Sergio; Earle, Paul S.; Samsonov, Sergey

    2014-01-01

    The seismic gap theory identifies regions of elevated hazard based on a lack of recent seismicity in comparison with other portions of a fault. It has successfully explained past earthquakes (see, for example, ref. 2) and is useful for qualitatively describing where large earthquakes might occur. A large earthquake had been expected in the subduction zone adjacent to northern Chile which had not ruptured in a megathrust earthquake since a M ~8.8 event in 1877. On 1 April 2014 a M 8.2 earthquake occurred within this seismic gap. Here we present an assessment of the seismotectonics of the March–April 2014 Iquique sequence, including analyses of earthquake relocations, moment tensors, finite fault models, moment deficit calculations and cumulative Coulomb stress transfer. This ensemble of information allows us to place the sequence within the context of regional seismicity and to identify areas of remaining and/or elevated hazard. Our results constrain the size and spatial extent of rupture, and indicate that this was not the earthquake that had been anticipated. Significant sections of the northern Chile subduction zone have not ruptured in almost 150 years, so it is likely that future megathrust earthquakes will occur to the south and potentially to the north of the 2014 Iquique sequence.

  9. Continuing megathrust earthquake potential in Chile after the 2014 Iquique earthquake.

    PubMed

    Hayes, Gavin P; Herman, Matthew W; Barnhart, William D; Furlong, Kevin P; Riquelme, Sebástian; Benz, Harley M; Bergman, Eric; Barrientos, Sergio; Earle, Paul S; Samsonov, Sergey

    2014-08-21

    The seismic gap theory identifies regions of elevated hazard based on a lack of recent seismicity in comparison with other portions of a fault. It has successfully explained past earthquakes (see, for example, ref. 2) and is useful for qualitatively describing where large earthquakes might occur. A large earthquake had been expected in the subduction zone adjacent to northern Chile, which had not ruptured in a megathrust earthquake since a M ∼8.8 event in 1877. On 1 April 2014 a M 8.2 earthquake occurred within this seismic gap. Here we present an assessment of the seismotectonics of the March-April 2014 Iquique sequence, including analyses of earthquake relocations, moment tensors, finite fault models, moment deficit calculations and cumulative Coulomb stress transfer. This ensemble of information allows us to place the sequence within the context of regional seismicity and to identify areas of remaining and/or elevated hazard. Our results constrain the size and spatial extent of rupture, and indicate that this was not the earthquake that had been anticipated. Significant sections of the northern Chile subduction zone have not ruptured in almost 150 years, so it is likely that future megathrust earthquakes will occur to the south and potentially to the north of the 2014 Iquique sequence.

  10. Earthquakes: Predicting the unpredictable?

    USGS Publications Warehouse

    Hough, Susan E.

    2005-01-01

    The earthquake prediction pendulum has swung from optimism in the 1970s to rather extreme pessimism in the 1990s. Earlier work revealed evidence of possible earthquake precursors: physical changes in the planet that signal that a large earthquake is on the way. Some respected earthquake scientists later argued that earthquakes, like other complex natural systems, are fundamentally unpredictable. The fate of the Parkfield prediction experiment appeared to support their arguments: a moderate earthquake had been predicted along a specified segment of the central San Andreas fault within five years of 1988 but failed to materialize on schedule. At some point, however, the pendulum began to swing back. Reputable scientists began using the "P-word" not only in polite company, but also at meetings and even in print. If the optimism regarding earthquake prediction can be attributed to any single cause, it might be scientists' burgeoning understanding of the earthquake cycle.

  11. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

    PubMed

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

    2012-01-01

    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a routine basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.

  12. Injection-induced earthquakes

    USGS Publications Warehouse

    Ellsworth, William L.

    2013-01-01

    Earthquakes in unusual locations have become an important topic of discussion in both North America and Europe, owing to the concern that industrial activity could cause damaging earthquakes. It has long been understood that earthquakes can be induced by impoundment of reservoirs, surface and underground mining, withdrawal of fluids and gas from the subsurface, and injection of fluids into underground formations. Injection-induced earthquakes have, in particular, become a focus of discussion as the application of hydraulic fracturing to tight shale formations is enabling the production of oil and gas from previously unproductive formations. Earthquakes can be induced as part of the process to stimulate the production from tight shale formations, or by disposal of wastewater associated with stimulation and production. Here, I review recent seismic activity that may be associated with industrial activity, with a focus on the disposal of wastewater by injection in deep wells; assess the scientific understanding of induced earthquakes; and discuss the key scientific challenges to be met for assessing this hazard.

  13. Earthquakes; January-February 1982

    USGS Publications Warehouse

    Person, W.J.

    1982-01-01

    In the United States, a number of earthquakes occurred, but only minor damage was reported. Arkansas experienced a swarm of earthquakes beginning on January 12. Canada experienced one of its strongest earthquakes in a number of years on January 9; this earthquake caused slight damage in Maine. 

  14. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California.

    PubMed

    Lee, Ya-Ting; Turcotte, Donald L; Holliday, James R; Sachs, Michael K; Rundle, John B; Chen, Chien-Chih; Tiampo, Kristy F

    2011-10-04

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor-Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most "successful" in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts.

  15. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California

    PubMed Central

    Lee, Ya-Ting; Turcotte, Donald L.; Holliday, James R.; Sachs, Michael K.; Rundle, John B.; Chen, Chien-Chih; Tiampo, Kristy F.

    2011-01-01

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M≥4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M≥4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor–Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most “successful” in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts. PMID:21949355
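
    One generic way to score gridded forecasts like these is a per-cell Poisson log-likelihood of the observed event counts given the forecast rates. The sketch below uses that generic score with invented numbers; it is not the specific RELM/CSEP likelihood tests applied in the paper.

    ```python
    # Generic sketch of scoring a gridded earthquake forecast with a Poisson
    # log-likelihood per cell. Expected rates and observed counts are invented.
    import math

    expected = [0.02, 0.50, 0.10, 1.30, 0.05]   # forecast rate of M>=4.95 events per cell
    observed = [0,    1,    0,    2,    0   ]   # events that actually occurred per cell

    log_likelihood = sum(-lam + n * math.log(lam) - math.lgamma(n + 1)
                         for lam, n in zip(expected, observed))
    print(f"joint Poisson log-likelihood = {log_likelihood:.3f}")
    ```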

  16. Induced earthquake during the 2016 Kumamoto earthquake (Mw7.0): Importance of real-time shake monitoring for Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Hoshiba, M.; Ogiso, M.

    2016-12-01

    The sequence of the 2016 Kumamoto earthquakes (Mw6.2 on April 14, Mw7.0 on April 16, and many aftershocks) caused devastating damage in Kumamoto and Oita prefectures, Japan. During the Mw7.0 event, just after the direct S waves passed central Oita, another M6-class event occurred there, more than 80 km away from the Mw7.0 event. The M6 event is interpreted as an induced earthquake, but it brought stronger shaking to central Oita than the Mw7.0 event did. We discuss the induced earthquake from the viewpoint of Earthquake Early Warning. In terms of ground shaking such as PGA and PGV, the motions from the Mw7.0 event at central Oita are much smaller than those from the M6 induced earthquake (for example, about 1/8 of the PGA at station OIT009), so the two events are easy to discriminate. However, the PGD of the Mw7.0 event is larger than that of the induced earthquake, and it appears just before the occurrence of the induced earthquake. It is quite difficult to recognize the induced earthquake from displacement waveforms alone, because the displacement is strongly contaminated by that of the preceding Mw7.0 event. In many EEW methods (including the current JMA EEW system), magnitude is used to predict ground shaking through a Ground Motion Prediction Equation (GMPE), and the magnitude is often estimated from displacement. However, a displacement magnitude is not necessarily the best predictor of ground shaking such as PGA and PGV. In the case of the induced earthquake during the Kumamoto sequence, a displacement magnitude could not be estimated because of the strong contamination, and the JMA EEW system did not recognize the induced earthquake. One of the important lessons we have learned from eight years' operation of EEW is the issue of multiple simultaneous earthquakes, such as aftershocks of the 2011 Mw9.0 Tohoku earthquake. Based on this lesson, we have proposed enhancement of real-time monitoring of ground shaking itself instead of rapid estimation of

  17. Earthquakes, September-October 1986

    USGS Publications Warehouse

    Person, W.J.

    1987-01-01

    There was one great earthquake (8.0 and above) during this reporting period, in the South Pacific in the Kermadec Islands. There were no major earthquakes (7.0-7.9), but earthquake-related deaths were reported in Greece and in El Salvador. There were no destructive earthquakes in the United States.

  18. Post earthquake recovery in natural gas systems--1971 San Fernando Earthquake

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, W.T. Jr.

    1983-01-01

    In this paper, a concise summary of the post-earthquake investigations for the 1971 San Fernando Earthquake is presented. The effects of the earthquake upon buildings and other above-ground structures are briefly discussed. Then the damage and subsequent repairs in the natural gas systems are reported.

  19. Earthquakes; July-August, 1978

    USGS Publications Warehouse

    Person, W.J.

    1979-01-01

    Earthquake activity during this period was about normal. Deaths from earthquakes were reported from Greece and Guatemala. Three major earthquakes (magnitude 7.0-7.9) occurred in Taiwan, Chile, and Costa Rica. In the United States, the most significant earthquake was a magnitude 5.6 on August 13 in southern California. 

  20. Improvements of the offshore earthquake locations in the Earthquake Early Warning System

    NASA Astrophysics Data System (ADS)

    Chen, Ta-Yi; Hsu, Hsin-Chih

    2017-04-01

    Since 2014, the Earthworm Based Earthquake Alarm Reporting (eBEAR) system has been operated and used to issue warnings to schools. In 2015 the system started to provide warnings to the public in Taiwan via television and cell phones. Online performance of the eBEAR system indicates that the average reporting times afforded by the system are approximately 15 and 28 s for inland and offshore earthquakes, respectively. On average, the eBEAR system can provide more warning time than the current EEW system (3.2 s and 5.5 s for inland and offshore earthquakes, respectively). However, offshore earthquakes were usually located poorly because only P-wave arrivals are used in the eBEAR system. Additionally, in the early stage of an earthquake, only a few stations are available to the early warning system. The poor station coverage may explain why offshore earthquakes are difficult to locate accurately. Geiger's inversion procedure for earthquake location requires an initial hypocenter and origin time as input. For the initial hypocenter, we defined a set of test locations in the offshore area instead of using the average of the locations of the triggered stations. We ran 20 instances of Geiger's method concurrently, each with a different pre-defined initial position, to locate earthquakes. We assume that if a pre-defined initial position is close to the true earthquake location, the processing time of that instance during the Geiger iteration should be less than the others. The results show that using pre-defined trial hypocenters in the inversion procedure improves the location accuracy of offshore earthquakes. This matters especially for an EEW system: in its initial stage, using only three or five stations to locate earthquakes may lead to poor results because of poor station coverage. In this study, the pre-defined trial locations provide a feasible way to improve the estimations of
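
    A very simplified sketch of the strategy described above: run the same iterative, Geiger-style location from several pre-defined trial epicenters and prefer the start point that converges in the fewest iterations. The stations, arrival times, constant velocity, step damping, and tolerances are all illustrative assumptions, not the eBEAR implementation.

    ```python
    # Toy illustration: locate an epicenter from P arrivals with a homogeneous
    # velocity model, starting the Gauss-Newton (Geiger-style) iteration from
    # several pre-defined trial locations and keeping the fastest-converging one.
    import numpy as np

    V = 6.0  # assumed constant P-wave velocity, km/s

    stations = np.array([[0.0, 0.0], [50.0, 10.0], [20.0, 60.0], [70.0, 45.0]])  # x, y (km)
    true_epi, true_t0 = np.array([120.0, 40.0]), 5.0          # synthetic offshore event
    arrivals = true_t0 + np.linalg.norm(stations - true_epi, axis=1) / V

    def locate(start_xy, max_iter=100, tol=1e-3):
        """Gauss-Newton updates of (x, y, t0); returns the solution and iterations used."""
        x = np.array([start_xy[0], start_xy[1], float(arrivals.min())])
        for it in range(1, max_iter + 1):
            d = np.linalg.norm(stations - x[:2], axis=1)
            res = arrivals - (x[2] + d / V)
            J = np.column_stack([(x[0] - stations[:, 0]) / (V * d),
                                 (x[1] - stations[:, 1]) / (V * d),
                                 np.ones(len(stations))])
            dx, *_ = np.linalg.lstsq(J, res, rcond=None)
            dx[:2] = np.clip(dx[:2], -50.0, 50.0)             # damp large steps
            x += dx
            if np.linalg.norm(dx[:2]) < tol:
                return x, it
        return x, max_iter

    # Offshore trial hypocenters vs. the (poor) average of the triggered stations.
    trials = [stations.mean(axis=0), np.array([100.0, 30.0]), np.array([150.0, 60.0])]
    solutions = [locate(t) for t in trials]
    best, n_iter = min(solutions, key=lambda s: s[1])
    print("fastest-converging epicenter:", np.round(best[:2], 2), "after", n_iter, "iterations")
    ```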

  1. Megathrust earthquakes in Central Chile: What is next after the Maule 2010 earthquake?

    NASA Astrophysics Data System (ADS)

    Madariaga, R.

    2013-05-01

    The 27 February 2010 Maule earthquake occurred in a well identified gap in the Chilean subduction zone. The event has now been studied in detail using far-field and near-field seismic and geodetic data; we review the information gathered so far. The event broke a region that was much longer along strike than the gap left over from the 1835 Concepcion earthquake, sometimes called the Darwin earthquake because Darwin was in the area when it occurred and made many observations. Recent studies of contemporary documents by Udias et al. indicate that the area broken by the Maule earthquake in 2010 had previously broken in a similar earthquake in 1751, but several events in the magnitude 8 range also occurred in the area, principally the 1835 event already mentioned and, more recently, on 1 December 1928 to the north and on 21 May 1960 (1.5 days before the great Chilean earthquake of 1960). Currently the area of the 2010 earthquake and the region immediately to the north are undergoing a very large increase in seismicity, with numerous clusters of seismicity that migrate along the plate interface. Examination of the seismicity of Chile in the 18th and 19th centuries shows that the region immediately to the north of the 2010 earthquake broke in a very large megathrust event in July 1730. This is the largest known earthquake in central Chile. The region where this event occurred has broken on many occasions in M 8 range earthquakes, in 1822, 1880, 1906, 1971 and 1985. Is it preparing for a new very large megathrust event? The 1906 earthquake of Mw 8.3 filled the central part of the gap, but the area has broken again on several occasions, in 1971, 1973 and 1985. The main question is whether the 1906 earthquake relieved enough of the stress from the 1730 rupture zone. Geodetic data show that most of the region that broke in 1730 is currently almost fully locked, from the northern end of the Maule earthquake at 34.5°S to 30°S, near the southern end of the Mw 8.5 Atacama earthquake of 11

  2. Earthquakes; March-April 1975

    USGS Publications Warehouse

    Person, W.J.

    1975-01-01

    There were no major earthquakes (magnitude 7.0-7.9) in March or April; however, there were earthquake fatalities in Chile, Iran, and Venezuela, and approximately 35 earthquake-related injuries were reported around the world. In the United States, a magnitude 6.0 earthquake struck the Idaho-Utah border region. Damage was estimated at about a million dollars. The shock was felt over a wide area and was the largest to hit the continental United States since the San Fernando earthquake of February 1971.

  3. Protecting your family from earthquakes: The seven steps to earthquake safety

    USGS Publications Warehouse

    Developed by American Red Cross, Asian Pacific Fund

    2007-01-01

    This book is provided here because of the importance of preparing for earthquakes before they happen. Experts say it is very likely there will be a damaging San Francisco Bay Area earthquake in the next 30 years and that it will strike without warning. It may be hard to find the supplies and services we need after this earthquake. For example, hospitals may have more patients than they can treat, and grocery stores may be closed for weeks. You will need to provide for your family until help arrives. To keep our loved ones and our community safe, we must prepare now. Some of us come from places where earthquakes are also common. However, the dangers of earthquakes in our homelands may be very different than in the Bay Area. For example, many people in Asian countries die in major earthquakes when buildings collapse or from big sea waves called tsunami. In the Bay Area, the main danger is from objects inside buildings falling on people. Take action now to make sure your family will be safe in an earthquake. The first step is to read this book carefully and follow its advice. By making your home safer, you help make our community safer. Preparing for earthquakes is important, and together we can make sure our families and community are ready. English version p. 3-13; Chinese version p. 14-24; Vietnamese version p. 25-36; Korean version p. 37-48

  4. Earthquake Triggering in the September 2017 Mexican Earthquake Sequence

    NASA Astrophysics Data System (ADS)

    Fielding, E. J.; Gombert, B.; Duputel, Z.; Huang, M. H.; Liang, C.; Bekaert, D. P.; Moore, A. W.; Liu, Z.; Ampuero, J. P.

    2017-12-01

    Southern Mexico was struck by four earthquakes with Mw > 6 and numerous smaller earthquakes in September 2017, starting with the 8 September Mw 8.2 Tehuantepec earthquake beneath the Gulf of Tehuantepec offshore Chiapas and Oaxaca. We study whether this M8.2 earthquake triggered the three subsequent large M>6 quakes in southern Mexico to improve understanding of earthquake interactions and time-dependent risk. All four large earthquakes were extensional despite the subduction of the Cocos plate. By the traditional definition, an event is likely an aftershock if it occurs within two rupture lengths of the main shock soon afterwards. Two Mw 6.1 earthquakes, one half an hour after the M8.2 beneath the Tehuantepec gulf and one on 23 September near Ixtepec in Oaxaca, both fit as traditional aftershocks, within 200 km of the main rupture. The 19 September Mw 7.1 Puebla earthquake was 600 km away from the M8.2 shock, outside the standard aftershock zone. Geodetic measurements from interferometric analysis of synthetic aperture radar (InSAR) and time-series analysis of GPS station data constrain finite fault total slip models for the M8.2, M7.1, and M6.1 Ixtepec earthquakes. The early M6.1 aftershock was too close in time and space to the M8.2 to measure with InSAR or GPS. We analyzed InSAR data from the Copernicus Sentinel-1A and -1B satellites and the JAXA ALOS-2 satellite. Our preliminary geodetic slip model for the M8.2 quake shows significant slip extending > 150 km NW from the hypocenter, longer than the slip in the v1 finite-fault model (FFM) from teleseismic waveforms posted by G. Hayes at USGS NEIC. Our slip model for the M7.1 earthquake is similar to the v2 NEIC FFM. Interferograms for the M6.1 Ixtepec quake confirm the shallow depth in the upper-plate crust and show that the centroid is about 30 km SW of the NEIC epicenter, a significant NEIC location bias, but consistent with cluster relocations (E. Bergman, pers. comm.) and with the Mexican SSN location. Coulomb static stress

  5. Earthquake catalog for estimation of maximum earthquake magnitude, Central and Eastern United States: Part A, Prehistoric earthquakes

    USGS Publications Warehouse

    Wheeler, Russell L.

    2014-01-01

    Computation of probabilistic earthquake hazard requires an estimate of Mmax, the maximum earthquake magnitude thought to be possible within a specified geographic region. This report is Part A of an Open-File Report that describes the construction of a global catalog of moderate to large earthquakes, from which one can estimate Mmax for most of the Central and Eastern United States and adjacent Canada. The catalog and Mmax estimates derived from it were used in the 2014 edition of the U.S. Geological Survey national seismic-hazard maps. This Part A discusses prehistoric earthquakes that occurred in eastern North America, northwestern Europe, and Australia, whereas a separate Part B deals with historical events.

  6. The initial subevent of the 1994 Northridge, California, earthquake: Is earthquake size predictable?

    USGS Publications Warehouse

    Kilb, Debi; Gomberg, J.

    1999-01-01

    We examine the initial subevent (ISE) of the Mw 6.7, 1994 Northridge, California, earthquake in order to discriminate between two end-member rupture initiation models: the 'preslip' and 'cascade' models. Final earthquake size may be predictable from an ISE's seismic signature in the preslip model but not in the cascade model. In the cascade model ISEs are simply small earthquakes that can be described as purely dynamic ruptures. In this model a large earthquake is triggered by smaller earthquakes; there is no size scaling between triggering and triggered events and a variety of stress transfer mechanisms are possible. Alternatively, in the preslip model, a large earthquake nucleates as an aseismically slipping patch in which the patch dimension grows and scales with the earthquake's ultimate size; the byproduct of this loading process is the ISE. In this model, the duration of the ISE signal scales with the ultimate size of the earthquake, suggesting that nucleation and earthquake size are determined by a more predictable, measurable, and organized process. To distinguish between these two end-member models we use short period seismograms recorded by the Southern California Seismic Network. We address questions regarding the similarity in hypocenter locations and focal mechanisms of the ISE and the mainshock. We also compare the ISE's waveform characteristics to those of small earthquakes and to the beginnings of earthquakes with a range of magnitudes. We find that the focal mechanisms of the ISE and mainshock are indistinguishable, and both events may have nucleated on and ruptured the same fault plane. These results satisfy the requirements for both models and thus do not discriminate between them. However, further tests show the ISE's waveform characteristics are similar to those of typical small earthquakes in the vicinity and more importantly, do not scale with the mainshock magnitude. These results are more consistent with the cascade model.

  7. Identification of Deep Earthquakes

    DTIC Science & Technology

    2010-09-01

    discriminants that will reliably separate small, crustal earthquakes (magnitudes less than about 4 and depths less than about 40 to 50 km) from small...characteristics on discrimination plots designed to separate nuclear explosions from crustal earthquakes. Thus, reliably flagging these small, deep events is...Further, reliably identifying subcrustal earthquakes will allow us to eliminate deep events (previously misidentified as crustal earthquakes) from

  8. The 1985 central chile earthquake: a repeat of previous great earthquakes in the region?

    PubMed

    Comte, D; Eisenberg, A; Lorca, E; Pardo, M; Ponce, L; Saragoni, R; Singh, S K; Suárez, G

    1986-07-25

    A great earthquake (surface-wave magnitude, 7.8) occurred along the coast of central Chile on 3 March 1985, causing heavy damage to coastal towns. Intense foreshock activity near the epicenter of the main shock occurred for 11 days before the earthquake. The aftershocks of the 1985 earthquake define a rupture area of 170 by 110 square kilometers. The earthquake was forecast on the basis of the nearly constant repeat time (83 +/- 9 years) of great earthquakes in this region. An analysis of previous earthquakes suggests that the rupture lengths of great shocks in the region vary by a factor of about 3. The nearly constant repeat time and variable rupture lengths cannot be reconciled with time- or slip-predictable models of earthquake recurrence. The great earthquakes in the region seem to involve a variable rupture mode and yet, for unknown reasons, remain periodic. Historical data suggest that the region south of the 1985 rupture zone should now be considered a gap of high seismic potential that may rupture in a great earthquake in the next few tens of years.

  9. The relationship between earthquake exposure and posttraumatic stress disorder in 2013 Lushan earthquake

    NASA Astrophysics Data System (ADS)

    Wang, Yan; Lu, Yi

    2018-01-01

    The objective of this study is to explore the relationship between earthquake exposure and the incidence of PTSD. A stratified random sample survey was conducted to collect data along the Longmenshan thrust fault three years after the Lushan earthquake. We used the Children's Revised Impact of Event Scale (CRIES-13) and the Earthquake Experience Scale. Subjects in this study included 3944 student survivors in eleven local schools. The prevalence of probable PTSD was relatively higher among those who were trapped in the earthquake, were injured in the earthquake, or had relatives who died in the earthquake. We conclude that researchers need to pay more attention to children and adolescents. The government should pay more attention to these people and provide more economic support.

  10. Crowdsourced earthquake early warning.

    PubMed

    Minson, Sarah E; Brooks, Benjamin A; Glennie, Craig L; Murray, Jessica R; Langbein, John O; Owen, Susan E; Heaton, Thomas H; Iannucci, Robert A; Hauser, Darren L

    2015-04-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California's Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

  11. Crowdsourced earthquake early warning

    PubMed Central

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167

  12. Earthquakes, November-December 1992

    USGS Publications Warehouse

    Person, W.J.

    1993-01-01

    There were two major earthquakes (7.0≤M<8.0) during the last two months of the year: a magnitude 7.5 earthquake on December 12 in the Flores region, Indonesia, and a magnitude 7.0 earthquake on December 20 in the Banda Sea. Earthquakes caused fatalities in China and Indonesia. The greatest number of deaths (2,500) for the year occurred in Indonesia. In Switzerland, six people were killed by an accidental explosion recorded by seismographs. In the United States, a magnitude 5.3 earthquake caused slight damage at Big Bear in southern California.

  13. Crowdsourced earthquake early warning

    USGS Publications Warehouse

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

  14. Earthquake Clusters and Spatio-temporal Migration of earthquakes in Northeastern Tibetan Plateau: a Finite Element Modeling

    NASA Astrophysics Data System (ADS)

    Sun, Y.; Luo, G.

    2017-12-01

    Seismicity in a region is usually characterized by earthquake clusters and earthquake migration along its major fault zones. However, we do not fully understand why and how earthquake clusters and spatio-temporal migration of earthquakes occur. The northeastern Tibetan Plateau is a good example for us to investigate these problems. In this study, we construct and use a three-dimensional viscoelastoplastic finite-element model to simulate earthquake cycles and spatio-temporal migration of earthquakes along major fault zones in northeastern Tibetan Plateau. We calculate stress evolution and fault interactions, and explore effects of topographic loading and viscosity of middle-lower crust and upper mantle on model results. Model results show that earthquakes and fault interactions increase Coulomb stress on the neighboring faults or segments, accelerating the future earthquakes in this region. Thus, earthquakes occur sequentially in a short time, leading to regional earthquake clusters. Through long-term evolution, stresses on some seismogenic faults, which are far apart, may almost simultaneously reach the critical state of fault failure, probably also leading to regional earthquake clusters and earthquake migration. Based on our model synthetic seismic catalog and paleoseismic data, we analyze probability of earthquake migration between major faults in northeastern Tibetan Plateau. We find that following the 1920 M 8.5 Haiyuan earthquake and the 1927 M 8.0 Gulang earthquake, the next big event (M≥7) in northeastern Tibetan Plateau would be most likely to occur on the Haiyuan fault.

  15. Prototype operational earthquake prediction system

    USGS Publications Warehouse

    Spall, Henry

    1986-01-01

    An objective of the U.S. Earthquake Hazards Reduction Act of 1977 is to introduce, into all regions of the country that are subject to large and moderate earthquakes, systems for predicting earthquakes and assessing earthquake risk. In 1985, the USGS developed for the Secretary of the Interior a program for implementation of a prototype operational earthquake prediction system in southern California.

  16. Perception of earthquake risk in Taiwan: effects of gender and past earthquake experience.

    PubMed

    Kung, Yi-Wen; Chen, Sue-Huei

    2012-09-01

    This study explored how individuals in Taiwan perceive the risk of earthquake and the relationship of past earthquake experience and gender to risk perception. Participants (n= 1,405), including earthquake survivors and those in the general population without prior direct earthquake exposure, were selected and interviewed through a computer-assisted telephone interviewing procedure using a random sampling and stratification method covering all 24 regions of Taiwan. A factor analysis of the interview data yielded a two-factor structure of risk perception in regard to earthquake. The first factor, "personal impact," encompassed perception of threat and fear related to earthquakes. The second factor, "controllability," encompassed a sense of efficacy of self-protection in regard to earthquakes. The findings indicated prior earthquake survivors and females reported higher scores on the personal impact factor than males and those with no prior direct earthquake experience, although there were no group differences on the controllability factor. The findings support that risk perception has multiple components, and suggest that past experience (survivor status) and gender (female) affect the perception of risk. Exploration of potential contributions of other demographic factors such as age, education, and marital status to personal impact, especially for females and survivors, is discussed. Future research on and intervention program with regard to risk perception are suggested accordingly. © 2012 Society for Risk Analysis.

  17. Earthquakes in Alaska

    USGS Publications Warehouse

    Haeussler, Peter J.; Plafker, George

    1995-01-01

    Earthquake risk is high in much of the southern half of Alaska, but it is not the same everywhere. This map shows the overall geologic setting in Alaska that produces earthquakes. The Pacific plate (darker blue) is sliding northwestward past southeastern Alaska and then dives beneath the North American plate (light blue, green, and brown) in southern Alaska, the Alaska Peninsula, and the Aleutian Islands. Most earthquakes are produced where these two plates come into contact and slide past each other. Major earthquakes also occur throughout much of interior Alaska as a result of collision of a piece of crust with the southern margin.

  18. Investigations in site response from ground motion observations in vertical arrays

    NASA Astrophysics Data System (ADS)

    Baise, Laurie Gaskins

    The aim of the research is to improve the understanding of earthquake site response and to improve the techniques available to investigate issues in this field. Vertical array ground motion data paired with the empirical transfer function (ETF) methodology is shown to accurately characterize site response. This manuscript draws on methods developed in the field of signal processing and statistical time series analysis to parameterize the ETF as an autoregressive moving-average (ARMA) system, which is justified theoretically, historically, and by example. Site response is evaluated at six sites in California, Japan, and Taiwan using ETF estimates, correlation analysis, and full waveform modeling. Correlation analysis is proposed as a required data quality evaluation imperative to any subsequent site response analysis. ETF estimates and waveform modeling are used to decipher the site response at sites with simple and complex geologic structure, which provide simple time-invariant and time-variant methods for evaluating both linear site transfer functions and nonlinear site response for sites experiencing liquefaction of the soils. The Treasure and Yerba Buena Island sites, however, require 2-D waveform modeling to accurately evaluate the effects of the shallow sedimentary basin. ETFs are used to characterize the Port Island site and corresponding shake table tests before, during, and after liquefaction. ETFs derived from the shake table tests were demonstrated to consistently predict the linear field ground response below 16 m depth and the liquefied behavior above 15 m depth. The liquefied interval response was demonstrated to gradually return to pre-liquefied conditions within several weeks of the 1995 Hyogo-ken Nanbu earthquake. Both the site's and the shake table test's response were shown to be effectively linear up to 0.5 g in the native materials below 16 m depth. The effective linearity of the site response at GVDA, Chiba, and Lotung up to 0.1 g, 0.33 g, and
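
    A minimal sketch of the empirical-transfer-function idea under simple assumptions: treat the downhole record as the input and the surface record as the output, and fit a low-order ARX model by least squares. The synthetic signals and model orders below are placeholders; the dissertation's actual ARMA estimation and diagnostics are more involved.

    ```python
    # Minimal sketch: fit a low-order ARX model
    #   y[n] = a1*y[n-1] + a2*y[n-2] + b1*u[n-1] + b2*u[n-2]
    # relating a "downhole" input u to a "surface" output y by least squares.
    # Signals and model orders are synthetic placeholders.
    import numpy as np

    rng = np.random.default_rng(1)
    u = rng.standard_normal(2000)                 # synthetic downhole motion
    y = np.zeros_like(u)                          # synthetic surface motion:
    for n in range(2, len(u)):                    # a resonant 2nd-order system
        y[n] = 1.6 * y[n - 1] - 0.81 * y[n - 2] + 0.5 * u[n - 1]

    na, nb = 2, 2                                 # assumed AR and exogenous orders
    rows, targets = [], []
    for n in range(max(na, nb) + 1, len(u)):
        rows.append(np.r_[y[n - 1:n - na - 1:-1], u[n - 1:n - nb - 1:-1]])
        targets.append(y[n])

    coeffs, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    print("AR coefficients:", np.round(coeffs[:na], 3))   # expect ~[1.6, -0.81]
    print("X  coefficients:", np.round(coeffs[na:], 3))   # expect ~[0.5, 0.0]
    ```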

  19. Earthquakes, November-December 1973

    USGS Publications Warehouse

    Person, W.J.

    1974-01-01

    Other parts of the world suffered fatalities and significant damage from earthquakes. In Iran, an earthquake killed one person, injured many, and destroyed a number of homes. Earthquake fatalities also occurred in the Azores and in Algeria. 

  20. Comparative study of earthquake-related and non-earthquake-related head traumas using multidetector computed tomography

    PubMed Central

    Chu, Zhi-gang; Yang, Zhi-gang; Dong, Zhi-hui; Chen, Tian-wu; Zhu, Zhi-yu; Shao, Heng

    2011-01-01

    OBJECTIVE: The features of earthquake-related head injuries may be different from those of injuries obtained in daily life because of differences in circumstances. We aim to compare the features of head traumas caused by the Sichuan earthquake with those of other common head traumas using multidetector computed tomography. METHODS: In total, 221 patients with earthquake-related head traumas (the earthquake group) and 221 patients with other common head traumas (the non-earthquake group) were enrolled in our study, and their computed tomographic findings were compared. We focused on the differences in fractures and intracranial injuries and on the relationships between extracranial and intracranial injuries. RESULTS: More earthquake-related cases had only extracranial soft tissue injuries (50.7% vs. 26.2%, RR = 1.9), and fewer cases had intracranial injuries (17.2% vs. 50.7%, RR = 0.3) compared with the non-earthquake group. For patients with fractures and intracranial injuries, there were fewer cases with craniocerebral injuries in the earthquake group (60.6% vs. 77.9%, RR = 0.8), and the earthquake-injured patients had fewer fractures and intracranial injuries overall (1.5±0.9 vs. 2.5±1.8; 1.3±0.5 vs. 2.1±1.1). Compared with the non-earthquake group, the incidences of soft tissue injuries and cranial fractures combined with intracranial injuries in the earthquake group were significantly lower (9.8% vs. 43.7%, RR = 0.2; 35.1% vs. 82.2%, RR = 0.4). CONCLUSION: As depicted with computed tomography, the severity of earthquake-related head traumas in survivors was milder, and isolated extracranial injuries were more common in earthquake-related head traumas than in non-earthquake-related injuries, which may have been the result of different injury causes, mechanisms and settings. PMID:22012045

  1. Earthquakes, May-June 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    In the United States, a magnitude 5.8 earthquake in southern California on June 28 killed two people and caused considerable damage. Strong earthquakes hit Alaska on May 1 and May 30; the May 1 earthquake caused some minor damage. 

  2. Nowcasting Earthquakes: A Comparison of Induced Earthquakes in Oklahoma and at the Geysers, California

    NASA Astrophysics Data System (ADS)

    Luginbuhl, Molly; Rundle, John B.; Hawkins, Angela; Turcotte, Donald L.

    2018-01-01

    Nowcasting is a new method of statistically classifying seismicity and seismic risk (Rundle et al. 2016). In this paper, the method is applied to the induced seismicity at the Geysers geothermal region in California and to the induced seismicity due to fluid injection in Oklahoma. Nowcasting utilizes the catalogs of seismicity in these regions. Two earthquake magnitudes are selected, one large, say Mλ ≥ 4, and one small, say Mσ ≥ 2. The method utilizes the number of small earthquakes that occur between pairs of large earthquakes. The cumulative probability distribution of these values is obtained. The earthquake potential score (EPS) is defined by the number of small earthquakes that have occurred since the last large earthquake; the point where this number falls on the cumulative probability distribution of interevent counts defines the EPS. A major advantage of nowcasting is that it utilizes "natural time", earthquake counts between events, rather than clock time. Thus, it is not necessary to decluster aftershocks, and the results are applicable if the level of induced seismicity varies in time. The application of natural time to the accumulation of the seismic hazard depends on the applicability of Gutenberg-Richter (GR) scaling. The increasing number of small earthquakes that occur after a large earthquake can be scaled to give the risk of a large earthquake occurring. To illustrate our approach, we utilize the number of Mσ ≥ 2.75 earthquakes in Oklahoma to nowcast the number of Mλ ≥ 4.0 earthquakes in Oklahoma. The applicability of the scaling is illustrated during the rapid build-up of injection-induced seismicity between 2012 and 2016, and the subsequent reduction in seismicity associated with a reduction in fluid injection. The same method is applied to the geothermal-induced seismicity at the Geysers, California, for comparison.
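
    A minimal sketch of the earthquake potential score (EPS) described above, assuming a synthetic Gutenberg-Richter-like catalog: count small events between successive large events, build the empirical cumulative distribution of those interevent counts, and read off where the count since the last large event falls on that distribution. The magnitude thresholds follow the abstract; the catalog itself is a placeholder, not Oklahoma or Geysers data.

    ```python
    # Sketch of the nowcasting earthquake potential score (EPS): the empirical
    # CDF of small-event interevent counts, evaluated at the number of small
    # events since the last large event. Catalog values are synthetic.
    import numpy as np

    rng = np.random.default_rng(2)
    # Synthetic magnitudes in time order with a roughly Gutenberg-Richter decay.
    mags = 2.0 + rng.exponential(0.45, size=5000)

    m_small, m_large = 2.75, 4.0
    counts, n_small = [], 0
    for m in mags:
        if m >= m_large:
            counts.append(n_small)   # small events between successive large events
            n_small = 0
        elif m >= m_small:
            n_small += 1

    counts = np.sort(counts)
    current = n_small                # small events since the most recent large event
    eps = np.searchsorted(counts, current, side="right") / len(counts)
    print(f"{len(counts)} large-event intervals, current count = {current}, EPS = {eps:.2f}")
    ```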

  3. The 1906 earthquake and a century of progress in understanding earthquakes and their hazards

    USGS Publications Warehouse

    Zoback, M.L.

    2006-01-01

    The 18 April 1906 San Francisco earthquake killed nearly 3000 people and left 225,000 residents homeless. Three days after the earthquake, an eight-person Earthquake Investigation Commission, assisted by some 25 geologists, seismologists, geodesists, biologists and engineers, as well as some 300 others, started work under the supervision of Andrew Lawson to collect and document physical phenomena related to the quake. On 31 May 1906, the commission published a preliminary 17-page report titled "The Report of the State Earthquake Investigation Commission". The report included the bulk of the geological and morphological descriptions of the faulting, detailed reports on shaking intensity, as well as an impressive atlas of 40 oversized maps and folios. Nearly 100 years after its publication, the Commission Report remains a model for post-earthquake investigations. Because the diverse data sets were so complete and carefully documented, researchers continue to apply modern analysis techniques to learn from the 1906 earthquake. The earthquake marked a seminal event in the history of California and served as the impetus for the birth of modern earthquake science in the United States.

  4. Earthquakes; January-February, 1979

    USGS Publications Warehouse

    Person, W.J.

    1979-01-01

    The first major earthquake (magnitude 7.0 to 7.9) of the year struck in southeastern Alaska in a sparsely populated area on February 28. On January 16, Iran experienced the first destructive earthquake of the year, causing a number of casualties and considerable damage. Peru was hit by a destructive earthquake on February 16 that caused casualties and damage. A number of earthquakes were experienced in parts of the United States, but only minor damage was reported.

  5. Preliminary results on earthquake triggered landslides for the Haiti earthquake (January 2010)

    NASA Astrophysics Data System (ADS)

    van Westen, Cees; Gorum, Tolga

    2010-05-01

    This study presents the first results of an analysis of the landslides triggered by the Ms 7.0 Haiti earthquake that occurred on January 12, 2010 in the boundary region between the Caribbean plate and the North American plate. The fault is a left-lateral strike-slip fault with a clear surface expression. According to the USGS earthquake information, the Enriquillo-Plantain Garden fault system has not produced any major earthquake in the last 100 years, and historical earthquakes are known from 1860, 1770, 1761, 1751, 1684, 1673, and 1618, though none of these has been confirmed in the field as associated with this fault. We used high-resolution satellite imagery available for the pre- and post-earthquake situations, which was made freely available for the response and rescue operations. We made an interpretation of all co-seismic landslides in the epicentral area. We conclude that the earthquake mainly triggered landslides on the northern slope of the fault-related valley and in a number of isolated areas. The earthquake apparently did not trigger many visible landslides within the slum areas on the slopes in the southern part of Port-au-Prince and Carrefour. We also used ASTER DEM information to relate the landslide occurrences to DEM derivatives.

  6. Turkish Compulsory Earthquake Insurance (TCIP)

    NASA Astrophysics Data System (ADS)

    Erdik, M.; Durukal, E.; Sesetyan, K.

    2009-04-01

    Through a World Bank project, the government-sponsored Turkish Catastrophic Insurance Pool (TCIP) was created in 2000 with the essential aim of transferring the government's financial burden of replacing earthquake-damaged housing to international reinsurance and capital markets. Providing coverage to about 2.9 million homeowners, TCIP is the largest insurance program in the country, with about 0.5 billion USD in its own reserves and about 2.3 billion USD in total claims-paying capacity. The total payment for earthquake damage since 2000 (mostly small events, 226 earthquakes) amounts to about 13 million USD. The country-wide penetration rate is about 22%, highest in the Marmara region (30%) and lowest in southeast Turkey (9%). TCIP is the sole-source provider of earthquake loss coverage up to 90,000 USD per house. The annual premium, categorized on the basis of earthquake zone and type of structure, is about 90 USD for a 100 square meter reinforced concrete building in the most hazardous zone, with a 2% deductible. The earthquake engineering related shortcomings of the TCIP are exemplified by the fact that the average rate of 0.13% (for reinforced concrete buildings) with only a 2% deductible is rather low compared to countries with similar earthquake exposure. From an earthquake engineering point of view, the risk underwriting of the TCIP (typification of the housing units to be insured, earthquake intensity zonation, and the sum insured) needs to be overhauled. Especially for large cities, models can be developed in which the expected earthquake performance of a housing unit (and consequently its insurance premium) is assessed on the basis of the location of the unit (microzoned earthquake hazard) and its basic structural attributes (earthquake vulnerability relationships). With such an approach, the TCIP can in the future contribute to the control of construction through differentiation of premia on the basis of earthquake vulnerability.

  7. Limitation of the Predominant-Period Estimator for Earthquake Early Warning and the Initial Rupture of Earthquakes

    NASA Astrophysics Data System (ADS)

    Yamada, T.; Ide, S.

    2007-12-01

    Earthquake early warning is an important and challenging issue for the reduction of seismic damage, especially for the mitigation of human suffering. One of the most important problems in earthquake early warning systems is how quickly we can estimate the final size of an earthquake after we observe the ground motion. This is related to the question of whether the initial rupture of an earthquake carries information about its final size. Nakamura (1988) developed the Urgent Earthquake Detection and Alarm System (UrEDAS), which calculates the predominant period of the P wave (τp) and estimates the magnitude of an earthquake immediately after the P wave arrival from the value of τpmax, the maximum value of τp. A similar approach has been adopted by other earthquake alarm systems (e.g., Allen and Kanamori, 2003). To investigate the characteristics of the parameter τp and the effect of the length of the time window (TW) used in the τpmax calculation, we analyze high-frequency recordings of earthquakes at very close distances in the Mponeng mine in South Africa. We find that values of τpmax have upper and lower limits. For larger earthquakes whose source durations are longer than TW, the values of τpmax have an upper limit which depends on TW. On the other hand, the values for smaller earthquakes have a lower limit which is proportional to the sampling interval. For intermediate earthquakes, the values of τpmax are close to their typical source durations. These two limits and the slope for intermediate earthquakes yield an artificial final-size dependence of τpmax over a wide size range. The parameter τpmax is useful for detecting large earthquakes and broadcasting earthquake early warnings. However, its dependence on the final size of earthquakes does not imply that the earthquake rupture is deterministic, because τpmax does not always have a direct relation to the physical quantities of an earthquake.
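
    The recursive predominant-period estimator referred to above is commonly written as τp_i = 2π sqrt(X_i / D_i), where X_i and D_i are smoothed squares of the ground velocity and its time derivative (Nakamura 1988; Allen and Kanamori 2003). A hedged Python sketch follows; the smoothing constant and variable names are illustrative assumptions, not values used in this study.

      import math

      def tau_p_series(velocity, dt, alpha=0.99):
          """Recursive predominant-period estimate at each sample of a velocity trace."""
          x_s, d_s = 0.0, 1e-20          # smoothed v^2 and (dv/dt)^2
          prev = velocity[0]
          taus = []
          for v in velocity:
              dv = (v - prev) / dt       # numerical time derivative
              x_s = alpha * x_s + v * v
              d_s = alpha * d_s + dv * dv
              taus.append(2.0 * math.pi * math.sqrt(x_s / d_s))
              prev = v
          return taus

      # tau_p_max over a window TW after the P arrival is then
      # max(tau_p_series(trace, dt)[p_index : p_index + int(TW / dt)]).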

  8. Sun, Moon and Earthquakes

    NASA Astrophysics Data System (ADS)

    Kolvankar, V. G.

    2013-12-01

    During a study conducted to find the effect of Earth tides on the occurrence of earthquakes, for small areas [typically 1000 km x 1000 km] of high-seismicity regions, it was noticed that the Sun's position in terms of universal time [GMT] shows links to the sum of EMD [longitude of earthquake location - longitude of the Moon's footprint on Earth] and SEM [Sun-Earth-Moon angle]. This paper provides the details of this relationship after studying earthquake data for over forty high-seismicity regions of the world. It was found that over 98% of the earthquakes for these different regions, examined for the period 1973-2008, show a direct relationship between the Sun's position [GMT] and [EMD+SEM]. As the time changes from 00-24 hours, the factor [EMD+SEM] changes through 360 degrees, and plotting these two variables for earthquakes from different small regions reveals a simple 45-degree straight-line relationship between them. This relationship was tested for all earthquakes and earthquake sequences of magnitude 2.0 and above. This study conclusively proves how the Sun and the Moon govern all earthquakes. Fig. 12 [A+B]: the left-hand figure provides a 24-hour plot for forty consecutive days including the main event (00:58:23 on 26.12.2004, Lat. +3.30, Long. +95.980, Mb 9.0, EQ count 376); the right-hand figure provides an earthquake plot of (EMD+SEM) vs. GMT timings for the same data. All 376 events, including the main event, faithfully follow the straight-line curve.

  9. Are Earthquakes Predictable? A Study on Magnitude Correlations in Earthquake Catalog and Experimental Data

    NASA Astrophysics Data System (ADS)

    Stavrianaki, K.; Ross, G.; Sammonds, P. R.

    2015-12-01

    The clustering of earthquakes in time and space is widely accepted; however, the existence of correlations in earthquake magnitudes is more questionable. In standard models of seismic activity, it is usually assumed that magnitudes are independent and therefore in principle unpredictable. Our work seeks to test this assumption by analysing magnitude correlations between earthquakes and their aftershocks. To separate mainshocks from aftershocks, we perform stochastic declustering based on the widely used Epidemic Type Aftershock Sequence (ETAS) model, which allows us to compare the average magnitudes of aftershock sequences to those of their mainshocks. The results for earthquake magnitude correlations were compared with acoustic emissions (AE) from laboratory analog experiments, as fracturing generates both AE at the laboratory scale and earthquakes on a crustal scale. Constant stress and constant strain rate experiments were done on Darley Dale sandstone under confining pressure to simulate depth of burial. Microcracking activity inside the rock volume was analyzed with the AE technique as a proxy for earthquakes. Applying the ETAS model to the experimental data allowed us to validate our results and provide for the first time a holistic view of the correlation of earthquake magnitudes. Additionally, we examine the relationship between the conditional intensity estimates of the ETAS model and the earthquake magnitudes; a positive relation would suggest the existence of magnitude correlations. The aim of this study is to detect any dependence between the magnitudes of aftershock earthquakes and the earthquakes that trigger them.
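
    As a concrete reading of the comparison described above, the sketch below pairs each declustered mainshock with the mean magnitude of its aftershocks and reports their linear correlation. The input format and function name are assumptions for illustration, not the authors' implementation.

      import numpy as np

      def magnitude_correlation(sequences):
          """Pearson correlation between mainshock magnitude and mean aftershock magnitude.

          `sequences` is assumed to be a list of
          (mainshock_magnitude, list_of_aftershock_magnitudes) tuples
          produced by a stochastic declustering step.
          """
          pairs = [(m, np.mean(a)) for m, a in sequences if a]
          mains = np.array([p[0] for p in pairs])
          means = np.array([p[1] for p in pairs])
          return np.corrcoef(mains, means)[0, 1]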

  10. Putting down roots in earthquake country-Your handbook for earthquakes in the Central United States

    USGS Publications Warehouse

    Contributors: Dart, Richard; McCarthy, Jill; McCallister, Natasha; Williams, Robert A.

    2011-01-01

    This handbook provides information to residents of the Central United States about the threat of earthquakes in that area, particularly along the New Madrid seismic zone, and explains how to prepare for, survive, and recover from such events. It explains why those residents should be concerned about earthquakes and describes what one can expect during and after an earthquake. Much is known about the threat of earthquakes in the Central United States, including where they are likely to occur and what can be done to reduce losses from them, but not enough has been done to prepare for future earthquakes. The handbook describes preparations that individual residents can take before an earthquake to stay safe and protect property.

  11. Important Earthquake Engineering Resources

    Science.gov Websites

    Resource listing from the Pacific Earthquake Engineering Research Center (PEER) website pointing to earthquake engineering organizations, including the American Concrete Institute, the Consortium of Organizations for Strong-Motion Observation Systems (COSMOS), and the Consortium of Universities for Research in Earthquake Engineering.

  12. Repeating Earthquakes Following an Mw 4.4 Earthquake Near Luther, Oklahoma

    NASA Astrophysics Data System (ADS)

    Clements, T.; Keranen, K. M.; Savage, H. M.

    2015-12-01

    An Mw 4.4 earthquake on April 16, 2013 near Luther, OK was one of the earliest M4+ earthquakes in central Oklahoma following the Prague sequence in 2011. A network of four local broadband seismometers deployed within a day of the Mw 4.4 event, along with six Oklahoma NetQuakes stations, recorded more than 500 aftershocks in the two weeks following the Luther earthquake. Here we use HypoDD (Waldhauser & Ellsworth, 2000) and waveform cross-correlation to obtain precise aftershock locations. The location uncertainty, calculated using the SVD method in HypoDD, is ~15 m horizontally and ~35 m vertically. The earthquakes define a near-vertical, NE-SW striking fault plane. Events occur at depths from 2 km to 3.5 km within the granitic basement, with a small fraction of events shallower, near the sediment-basement interface. Earthquakes occur within a zone of ~200 m thickness on either side of the best-fitting fault surface. We use an equivalency class algorithm to identify clusters of repeating events, defined as event pairs with median three-component correlation > 0.97 across common stations (Aster & Scott, 1993). Repeating events occur as doublets of only two events in over 50% of cases; overall, 41% of the earthquakes recorded occur as repeating events. The recurrence intervals for the repeating events range from minutes to days, with common recurrence intervals of less than two minutes. While clusters occur within tight dimensions, commonly about 80 m x 200 m, aftershocks occur in three distinct ~2 km x 2 km patches along the fault. Our analysis suggests that, with rapidly deployed local arrays, the plethora of ~Mw 4 earthquakes occurring in Oklahoma and southern Kansas can be used to investigate the earthquake rupture process and the role of damage zones.
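
    The equivalency-class grouping described above can be sketched with a simple union-find pass over the correlated event pairs; the data layout and threshold handling below are illustrative assumptions in the spirit of Aster & Scott (1993), not the authors' code.

      def repeating_clusters(pairs, threshold=0.97):
          """Group events into clusters linked by highly correlated pairs.

          `pairs` is assumed to be a list of (event_a, event_b, median_cc) tuples,
          where median_cc is the median three-component cross-correlation.
          """
          parent = {}

          def find(x):
              parent.setdefault(x, x)
              while parent[x] != x:
                  parent[x] = parent[parent[x]]   # path compression
                  x = parent[x]
              return x

          def union(a, b):
              parent[find(a)] = find(b)

          for a, b, cc in pairs:
              if cc > threshold:
                  union(a, b)

          clusters = {}
          for event in parent:
              clusters.setdefault(find(event), set()).add(event)
          return [c for c in clusters.values() if len(c) > 1]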

  13. Earthquake Hazards.

    ERIC Educational Resources Information Center

    Donovan, Neville

    1979-01-01

    Provides a survey and a review of earthquake activity and global tectonics from the advancement of the theory of continental drift to the present. Topics include: an identification of the major seismic regions of the earth, seismic measurement techniques, seismic design criteria for buildings, and the prediction of earthquakes. (BT)

  14. Earthquakes, May-June 1981

    USGS Publications Warehouse

    Person, W.J.

    1981-01-01

    The months of May and June were somewhat quiet, seismically speaking. There was one major earthquake (7.0-7.9) off the west coast of South Island, New Zealand. The most destructive earthquake during this reporting period was in southern Iran on June 11 which caused fatalities and extensive damage. Peru also experienced a destructive earthquake on June 22 which caused fatalities and damage. In the United States, a number of earthquakes were experienced, but none caused significant damage. 

  15. ViscoSim Earthquake Simulator

    USGS Publications Warehouse

    Pollitz, Fred

    2012-01-01

    Synthetic seismicity simulations have been explored by the Southern California Earthquake Center (SCEC) Earthquake Simulators Group in order to guide long-term forecasting efforts related to the Uniform California Earthquake Rupture Forecast (Tullis et al., 2012a). In this study I describe the viscoelastic earthquake simulator (ViscoSim) of Pollitz (2009). Recapitulating to a large extent material previously presented by Pollitz (2009, 2011), I describe its implementation of synthetic ruptures and how it differs from other simulators being used by the group.

  16. Earthquake damage orientation to infer seismic parameters in archaeological sites and historical earthquakes

    NASA Astrophysics Data System (ADS)

    Martín-González, Fidel

    2018-01-01

    Studies that provide information concerning the seismic parameters and seismic sources of historical and archaeological seismic events are used to better evaluate the seismic hazard of a region. This is of special interest when no surface rupture is recorded or the seismogenic fault cannot be identified. The orientation pattern of the earthquake damage (ED) (e.g., fallen columns, dropped keystones) that affected architectural elements of cities after earthquakes has traditionally been used in historical and archaeoseismological studies to infer seismic parameters. However, the parameters that different authors propose can be obtained are contradictory (suggestions include the epicenter location, the orientation of the P waves, the orientation of the compressional strain, and the fault kinematics), and some authors even question these relations with the earthquake damage. The earthquakes of Lorca in 2011, Christchurch in 2011 and Emilia Romagna in 2012 present an opportunity to measure systematically a large number and wide variety of earthquake damage in historical buildings (the same structures that are used in historical and archaeological studies). The damage pattern orientation has been compared with modern instrumental data, which is not possible in historical and archaeoseismological studies. From measurements and quantification of the orientation patterns in the studied earthquakes, it is observed that there is a systematic pattern of earthquake damage orientation (EDO) in the proximity of the seismic source (fault trace) (<10 km). The EDO in these earthquakes is normal to the fault trend (±15°). This orientation can be generated by a pulse of motion that, in the near-fault region, has a distinguishable acceleration normal to the fault due to the polarization of the S waves. Therefore, the earthquake damage orientation could be used to estimate the seismogenic fault trend in studies of historical earthquakes where no instrumental data are available.

  17. Earthquakes, March-April, 1993

    USGS Publications Warehouse

    Person, Waverly J.

    1993-01-01

    Worldwide, only one major earthquake (7.0≤M<8.0) occurred during this reporting period. This earthquake, a magnitude 7.2 shock, struck the Santa Cruz Islands region in the South Pacific on March 6. Earthquake-related deaths occurred in the Fiji Islands, China, and Peru.

  18. 2016 National Earthquake Conference

    Science.gov Websites

    Website for the 2016 National Earthquake Conference (NEC), presented with sponsorship from the California Earthquake Authority, which brings together state government leaders, social science practitioners, and U.S. State and Territorial Earthquake Managers around the questions "What's New? What's Next? What's Your Role in Building a National Strategy?"

  19. The music of earthquakes and Earthquake Quartet #1

    USGS Publications Warehouse

    Michael, Andrew J.

    2013-01-01

    Earthquake Quartet #1, my composition for voice, trombone, cello, and seismograms, is the intersection of listening to earthquakes as a seismologist and performing music as a trombonist. Along the way, I realized there is a close relationship between what I do as a scientist and what I do as a musician. A musician controls the source of the sound and the path it travels through their instrument in order to make sound waves that we hear as music. An earthquake is the source of waves that travel along a path through the earth until reaching us as shaking. It is almost as if the earth is a musician and people, including seismologists, are metaphorically listening and trying to understand what the music means.

  20. Earthquake triggering in southeast Africa following the 2012 Indian Ocean earthquake

    NASA Astrophysics Data System (ADS)

    Neves, Miguel; Custódio, Susana; Peng, Zhigang; Ayorinde, Adebayo

    2018-02-01

    In this paper we present evidence of earthquake dynamic triggering in southeast Africa. We analysed seismic waveforms recorded at 53 broad-band and short-period stations in order to identify possible increases in the rate of microearthquakes and tremor due to the passage of teleseismic waves generated by the Mw8.6 2012 Indian Ocean earthquake. We found evidence of triggered local earthquakes and no evidence of triggered tremor in the region. We assessed the statistical significance of the increase in the number of local earthquakes using β-statistics. Statistically significant dynamic triggering of local earthquakes was observed at 7 out of the 53 analysed stations. Two of these stations are located on the northeast coast of Madagascar and the other five stations are located in the Kaapvaal Craton, southern Africa. We found no evidence of dynamically triggered seismic activity at stations located near the structures of the East African Rift System. Hydrothermal activity exists close to the stations that recorded dynamic triggering; however, it also exists near the East African Rift System structures where no triggering was observed. Our results suggest that factors other than tectonic regime and geothermalism alone are needed to explain the mechanisms that underlie earthquake triggering.
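
    The β-statistic test mentioned above compares the number of local events observed in a window after the teleseism with the number expected from the station's long-term rate. A minimal sketch follows, assuming the standard form of the statistic (Matthews and Reasenberg, 1988); the significance cutoff noted in the comment is a common rule of thumb rather than a value quoted from this paper.

      import math

      def beta_statistic(n_after, n_total, t_after, t_total):
          """Beta statistic for a seismicity rate change.

          n_after : events observed in the test window of length t_after
          n_total : events observed in the whole catalog of length t_total
          """
          p = t_after / t_total                      # expected fraction of events
          expected = n_total * p
          return (n_after - expected) / math.sqrt(n_total * p * (1.0 - p))

      # Values of beta above roughly 2 are often taken as a statistically
      # significant rate increase under the binomial null model.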

  1. Toward real-time regional earthquake simulation of Taiwan earthquakes

    NASA Astrophysics Data System (ADS)

    Lee, S.; Liu, Q.; Tromp, J.; Komatitsch, D.; Liang, W.; Huang, B.

    2013-12-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 minutes after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 minutes for a 70 sec ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  2. Geophysical Anomalies and Earthquake Prediction

    NASA Astrophysics Data System (ADS)

    Jackson, D. D.

    2008-12-01

    Finding anomalies is easy. Predicting earthquakes convincingly from such anomalies is far from easy. Why? Why have so many beautiful geophysical abnormalities not led to successful prediction strategies? What is earthquake prediction? By my definition it is convincing information that an earthquake of specified size is temporarily much more likely than usual in a specific region for a specified time interval. We know a lot about normal earthquake behavior, including locations where earthquake rates are higher than elsewhere, with estimable rates and size distributions. We know that earthquakes have power law size distributions over large areas, that they cluster in time and space, and that aftershocks follow with power-law dependence on time. These relationships justify prudent protective measures and scientific investigation. Earthquake prediction would justify exceptional temporary measures well beyond those normal prudent actions. Convincing earthquake prediction would result from methods that have demonstrated many successes with few false alarms. Predicting earthquakes convincingly is difficult for several profound reasons. First, earthquakes start in tiny volumes at inaccessible depth. The power law size dependence means that tiny unobservable ones are frequent almost everywhere and occasionally grow to larger size. Thus prediction of important earthquakes is not about nucleation, but about identifying the conditions for growth. Second, earthquakes are complex. They derive their energy from stress, which is perniciously hard to estimate or model because it is nearly singular at the margins of cracks and faults. Physical properties vary from place to place, so the preparatory processes certainly vary as well. Thus establishing the needed track record for validation is very difficult, especially for large events with immense interval times in any one location. Third, the anomalies are generally complex as well. Electromagnetic anomalies in particular require

  3. Historical earthquake research in Austria

    NASA Astrophysics Data System (ADS)

    Hammerl, Christa

    2017-12-01

    Austria has moderate seismicity, and on average the population feels 40 earthquakes per year, or approximately three earthquakes per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently; the average period of recurrence is about 75 years. For this reason, historical earthquake research has been of special importance in Austria. The interest in historical earthquakes in the Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant would start up. The applied methods are introduced briefly along with the most important studies, and, as an example of a recently completed case study, one of the strongest past earthquakes in Austria, the earthquake of 17 July 1670, is presented. The research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.

  4. Unraveling earthquake stresses: Insights from dynamically triggered and induced earthquakes

    NASA Astrophysics Data System (ADS)

    Velasco, A. A.; Alfaro-Diaz, R. A.

    2017-12-01

    Induced seismicity, earthquakes caused by anthropogenic activity, has more than doubled in the last several years as a result of practices related to oil and gas production. Furthermore, large earthquakes have been shown to promote the triggering of other events within two fault lengths (static triggering), due to static stresses caused by physical movement along the fault, and also remotely by the passage of seismic waves (dynamic triggering). Thus, in order to understand the mechanisms of earthquake failure, we investigate regions where natural, induced, and dynamically triggered events occur, and specifically target Oklahoma. We first analyze data from EarthScope's USArray Transportable Array (TA) and local seismic networks, implementing an optimized short-term average/long-term average (STA/LTA) detector in order to develop local detection and earthquake catalogs. We then identify triggered events through statistical analysis and perform a stress analysis to gain insight into the stress states leading to triggered earthquake failure. We use our observations to determine the role of different transient stresses in contributing to natural and induced seismicity by comparing these stresses to the regional stress orientation. We also delineate critically stressed regions of triggered seismicity that may indicate areas susceptible to earthquake hazards associated with sustained fluid injection in provinces of induced seismicity. Anthropogenic injection and extraction activity can alter the stress state and fluid flow within production basins. By analyzing the stress release on these ancient faults caused by dynamic stresses, we may be able to determine whether fluids are solely responsible for increased seismic activity in induced regions.
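
    A minimal sketch of the kind of STA/LTA detector referred to above; the window lengths and trigger threshold are placeholder values, not the optimized parameters used by the authors.

      import numpy as np

      def sta_lta(trace, dt, sta_win=1.0, lta_win=30.0):
          """Classic STA/LTA ratio for a 1-D array of ground-motion samples."""
          nsta = max(1, int(sta_win / dt))
          nlta = max(nsta + 1, int(lta_win / dt))
          energy = np.asarray(trace, dtype=float) ** 2
          csum = np.cumsum(energy)
          sta = (csum[nlta:] - csum[nlta - nsta:-nsta]) / nsta   # short-term average
          lta = (csum[nlta:] - csum[:-nlta]) / nlta              # long-term average
          return sta / np.maximum(lta, 1e-20)

      def trigger_onsets(ratio, threshold=4.0):
          """Sample indices where the STA/LTA ratio first rises above the threshold."""
          above = ratio >= threshold
          return np.flatnonzero(above[1:] & ~above[:-1]) + 1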

  5. Earthquake triggering by seismic waves following the landers and hector mine earthquakes

    USGS Publications Warehouse

    Gomberg, J.; Reasenberg, P.A.; Bodin, P.; Harris, R.A.

    2001-01-01

    The proximity and similarity of the 1992, magnitude 7.3 Landers and 1999, magnitude 7.1 Hector Mine earthquakes in California permit testing of earthquake triggering hypotheses not previously possible. The Hector Mine earthquake confirmed inferences that transient, oscillatory 'dynamic' deformations radiated as seismic waves can trigger seismicity rate increases, as proposed for the Landers earthquake (refs 1-6). Here we quantify the spatial and temporal patterns of the seismicity rate changes (ref. 7). The seismicity rate increase was to the north for the Landers earthquake and primarily to the south for the Hector Mine earthquake. We suggest that rupture directivity results in elevated dynamic deformations north and south of the Landers and Hector Mine faults, respectively, as evident in the asymmetry of the recorded seismic velocity fields. Both dynamic and static stress changes seem important for triggering in the near field with dynamic stress changes dominating at greater distances. Peak seismic velocities recorded for each earthquake suggest the existence of, and place bounds on, dynamic triggering thresholds. These thresholds vary from a few tenths to a few MPa in most places, depend on local conditions, and exceed inferred static thresholds by more than an order of magnitude. At some sites, the onset of triggering was delayed until after the dynamic deformations subsided. Physical mechanisms consistent with all these observations may be similar to those that give rise to liquefaction or cyclic fatigue.

  6. Analysis of pre-earthquake ionospheric anomalies before the global M = 7.0+ earthquakes in 2010

    NASA Astrophysics Data System (ADS)

    Yao, Y. B.; Chen, P.; Zhang, S.; Chen, J. J.; Yan, F.; Peng, W. F.

    2012-03-01

    The pre-earthquake ionospheric anomalies that occurred before the global M = 7.0+ earthquakes in 2010 are investigated using the total electron content (TEC) from the global ionosphere map (GIM). We analyze the possible causes of the ionospheric anomalies based on the space environment and magnetic field status. Results show that some anomalies are related to the earthquakes. By analyzing the time of occurrence, duration, and spatial distribution of these ionospheric anomalies, a number of new conclusions are drawn, as follows: earthquake-related ionospheric anomalies do not always appear; both positive and negative anomalies are likely to occur; and the earthquake-related ionospheric anomalies discussed in the current study occurred 0-2 days before the associated earthquakes and in the afternoon to sunset (i.e. between 12:00 and 20:00 local time). Pre-earthquake ionospheric anomalies occur mainly in areas near the epicenter. However, the maximum affected area in the ionosphere does not coincide with the vertical projection of the epicenter of the subsequent earthquake. The directions in which the anomalies deviate from the epicenters do not follow a fixed rule. The corresponding ionospheric effects can also be observed in the magnetically conjugated region. However, the probability of appearance and the extent of the anomalies in the magnetically conjugated region are smaller than for the anomalies near the epicenter. Deep-focus earthquakes may also exhibit very significant pre-earthquake ionospheric anomalies.

  7. Modeling, Forecasting and Mitigating Extreme Earthquakes

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations together with comprehensive modeling of earthquakes and forecasting extreme events. Extreme earthquakes (large magnitude and rare events) are manifestations of complex behavior of the lithosphere structured as a hierarchical system of blocks of different sizes. Understanding of physics and dynamics of the extreme events comes from observations, measurements and modeling. A quantitative approach to simulate earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events, influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of predictability of large earthquakes (how well can large earthquakes be predicted today?) will be also discussed along with possibilities in mitigation of earthquake disasters (e.g., on 'inverse' forensic investigations of earthquake disasters).

  8. Extending earthquakes' reach through cascading.

    PubMed

    Marsan, David; Lengliné, Olivier

    2008-02-22

    Earthquakes, whatever their size, can trigger other earthquakes. Mainshocks cause aftershocks to occur, which in turn activate their own local aftershock sequences, resulting in a cascade of triggering that extends the reach of the initial mainshock. A long-lasting difficulty is to determine which earthquakes are connected, either directly or indirectly. Here we show that this causal structure can be found probabilistically, with no a priori model nor parameterization. Large regional earthquakes are found to have a short direct influence in comparison to the overall aftershock sequence duration. Relative to these large mainshocks, small earthquakes collectively have a greater effect on triggering. Hence, cascade triggering is a key component in earthquake interactions.

  9. Earthquake Catalogue of the Caucasus

    NASA Astrophysics Data System (ADS)

    Godoladze, T.; Gok, R.; Tvaradze, N.; Tumanova, N.; Gunia, I.; Onur, T.

    2016-12-01

    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms˜7.0, Io=9), the Lechkhumi-Svaneti earthquake of 1350 (Ms˜7.0, Io=9), and the Alaverdi earthquake of 1742 (Ms˜6.8, Io=9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms˜6.5, Io=9) and the Akhalkalaki earthquake of 1899 (Ms˜6.3, Io=8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation are: Gori 1920; Tabatskuri 1940; Chkhalta 1963; the Racha earthquake of 1991 (Ms=7.0), the largest event ever recorded in the region; the Barisakho earthquake of 1992 (M=6.5); and the Spitak earthquake of 1988 (Ms=6.9, 100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of the various national networks (Georgia (˜25 stations), Azerbaijan (˜35 stations), Armenia (˜14 stations)). The data from the last 10 years of observation provide an opportunity to perform modern, fundamental scientific investigations. In order to improve seismic data quality, a catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences/NSMC, Ilia State University) in the framework of the regional joint project (Armenia, Azerbaijan, Georgia, Turkey, USA) "Probabilistic Seismic Hazard Assessment (PSHA) in the Caucasus". The catalogue consists of more than 80,000 events. First arrivals of each earthquake of Mw>=4.0 have been carefully examined. To reduce calculation errors, we corrected arrivals from the seismic records, improved the locations of the events, and recalculated moment magnitudes in order to obtain unified magnitudes.

  10. Earthquakes, July-August 1992

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    There were two major earthquakes (7.0≤M<8.0) during this reporting period. A magnitude 7.5 earthquake occurred in Kyrgyzstan on August 19 and a magnitude 7.0 quake struck the Ascension Island region on August 28. In southern California, aftershocks of the magnitude 7.6 earthquake on June 28, 1992, continued. One of these aftershocks caused damage and injuries, and at least one other aftershock caused additional damage. Earthquake-related fatalities were reported in Kyrgyzstan and Pakistan.

  11. Earthquakes: Recurrence and Interoccurrence Times

    NASA Astrophysics Data System (ADS)

    Abaimov, S. G.; Turcotte, D. L.; Shcherbakov, R.; Rundle, J. B.; Yakovlev, G.; Goltz, C.; Newman, W. I.

    2008-04-01

    The purpose of this paper is to discuss the statistical distributions of recurrence times of earthquakes. Recurrence times are the time intervals between successive earthquakes at a specified location on a specified fault. Although a number of statistical distributions have been proposed for recurrence times, we argue in favor of the Weibull distribution. The Weibull distribution is the only distribution that has a scale-invariant hazard function. We consider three sets of characteristic earthquakes on the San Andreas fault: (1) The Parkfield earthquakes, (2) the sequence of earthquakes identified by paleoseismic studies at the Wrightwood site, and (3) an example of a sequence of micro-repeating earthquakes at a site near San Juan Bautista. In each case we make a comparison with the applicable Weibull distribution. The number of earthquakes in each of these sequences is too small to make definitive conclusions. To overcome this difficulty we consider a sequence of earthquakes obtained from a one million year “Virtual California” simulation of San Andreas earthquakes. Very good agreement with a Weibull distribution is found. We also obtain recurrence statistics for two other model studies. The first is a modified forest-fire model and the second is a slider-block model. In both cases good agreements with Weibull distributions are obtained. Our conclusion is that the Weibull distribution is the preferred distribution for estimating the risk of future earthquakes on the San Andreas fault and elsewhere.
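
    For reference, the scale-invariance property invoked above can be made explicit. In the notation below (mine, not the authors'), the Weibull cumulative distribution and its hazard function for recurrence time t are

      % Weibull recurrence-time distribution with scale tau and shape beta
      P(t) = 1 - \exp\left[-\left(t/\tau\right)^{\beta}\right], \qquad
      h(t) = \frac{p(t)}{1 - P(t)} = \frac{\beta}{\tau}\left(\frac{t}{\tau}\right)^{\beta - 1},

    where p(t) = dP/dt is the probability density. The hazard is a pure power law in t, so rescaling the time axis only rescales h(t); this is the scale invariance singled out in the abstract.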

  12. Natural Time and Nowcasting Earthquakes: Are Large Global Earthquakes Temporally Clustered?

    NASA Astrophysics Data System (ADS)

    Luginbuhl, Molly; Rundle, John B.; Turcotte, Donald L.

    2018-02-01

    The objective of this paper is to analyze the temporal clustering of large global earthquakes with respect to natural time, or interevent count, as opposed to regular clock time. To do this, we use two techniques: (1) nowcasting, a new method of statistically classifying seismicity and seismic risk, and (2) time series analysis of interevent counts. We chose the sequences of M_λ ≥ 7.0 and M_λ ≥ 8.0 earthquakes from the global centroid moment tensor (CMT) catalog from 2004 to 2016 for analysis. A significant number of these earthquakes will be aftershocks of the largest events, but no satisfactory method of declustering the aftershocks in clock time is available. A major advantage of using natural time is that it eliminates the need for declustering aftershocks. The event count we utilize is the number of small earthquakes that occur between large earthquakes. The small earthquake magnitude is chosen to be as small as possible, such that the catalog is still complete based on the Gutenberg-Richter statistics. For the CMT catalog, starting in 2004, we found the completeness magnitude to be M_σ ≥ 5.1. For the nowcasting method, the cumulative probability distribution of these interevent counts is obtained. We quantify the distribution using the exponent, β, of the best fitting Weibull distribution; β = 1 for a random (exponential) distribution. We considered 197 earthquakes with M_λ ≥ 7.0 and found β = 0.83 ± 0.08. We considered 15 earthquakes with M_λ ≥ 8.0, but this number was considered too small to generate a meaningful distribution. For comparison, we generated synthetic catalogs of earthquakes that occur randomly with the Gutenberg-Richter frequency-magnitude statistics. We considered a synthetic catalog of 1.97 × 10^5 M_λ ≥ 7.0 earthquakes and found β = 0.99 ± 0.01. The random catalog converted to natural time was also random. We then generated 1.5 × 10^4 synthetic catalogs with 197 M_λ ≥ 7.0 in each catalog and
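
    A minimal sketch of the fitting step described above, using a standard SciPy Weibull fit on the list of natural-time interevent counts; this is an assumed, generic implementation, not the authors' code.

      from scipy.stats import weibull_min

      def weibull_beta(interevent_counts):
          """Shape exponent beta of the best-fitting Weibull distribution."""
          beta, loc, scale = weibull_min.fit(interevent_counts, floc=0)
          return beta

      # beta close to 1 indicates an exponential (random) set of counts;
      # beta < 1 indicates temporal clustering in natural time.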

  13. Earthquake and Tsunami booklet based on two Indonesia earthquakes

    NASA Astrophysics Data System (ADS)

    Hayashi, Y.; Aci, M.

    2014-12-01

    Many destructive earthquakes occurred during the last decade in Indonesia. These experiences are very important lessons for people around the world who live in earthquake and tsunami countries. We are collecting the testimonies of tsunami survivors to clarify successful evacuation processes and to make clear the characteristic physical behaviors of tsunamis near the coast. We researched two tsunami events, the 2004 Indian Ocean tsunami and the 2010 Mentawai slow-earthquake tsunami. Many videos and photographs were taken by people at some places during the 2004 Indian Ocean tsunami disaster; nevertheless, these cover only a few restricted points, and the tsunami behavior at other places remained unknown. In this study, we tried to collect extensive information about tsunami behavior, not only at many places but also over a wide time range after the strong shaking. In the Mentawai case, the earthquake occurred at night, so there are no impressive photos. To collect detailed information about the evacuation process from tsunamis, we devised an interview method that involves making pictures of the tsunami experience from the scenes of the victims' stories. In the 2004 Aceh case, none of the survivors knew about tsunami phenomena: because there had been no big earthquakes with tsunamis for one hundred years in the Sumatra region, the public had no knowledge of tsunamis. This situation was greatly improved in the 2010 Mentawai case. TV programs and NGO or governmental public education programs about tsunami evacuation are widespread in Indonesia, and many people now have fundamental knowledge of earthquake and tsunami disasters. We made a drill book based on the victims' stories and painted impressive scenes of the two events. We used the drill book in a disaster education event for a school committee in West Java. About 80% of the students and teachers evaluated the contents of the drill book as useful for correct understanding.

  14. Earthquakes: hydrogeochemical precursors

    USGS Publications Warehouse

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  15. Earthquakes, September-October 1993

    USGS Publications Warehouse

    Person, W.J.

    1993-01-01

    The fatalities in the United States were caused by two earthquakes in southern Oregon on September 21. These earthquakes, both with magnitude 6.0 and separated in time by about 2 hrs, led to the deaths of two people. One of these deaths was apparently due to a heart attack induced by the earthquake.

  16. Children's Ideas about Earthquakes

    ERIC Educational Resources Information Center

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, as natural disasters, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced correspondingly. In particular, good training regarding earthquakes, to be received in primary schools, is considered…

  17. Excel, Earthquakes, and Moneyball: exploring Cascadia earthquake probabilities using spreadsheets and baseball analogies

    NASA Astrophysics Data System (ADS)

    Campbell, M. R.; Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.

    2017-12-01

    Much recent media attention focuses on Cascadia's earthquake hazard. A widely cited magazine article starts "An earthquake will destroy a sizable portion of the coastal Northwest. The question is when." Stories include statements like "a massive earthquake is overdue", "in the next 50 years, there is a 1-in-10 chance a "really big one" will erupt," or "the odds of the big Cascadia earthquake happening in the next fifty years are roughly one in three." These lead students to ask where the quoted probabilities come from and what they mean. These probability estimates involve two primary choices: what data are used to describe when past earthquakes happened and what models are used to forecast when future earthquakes will happen. The data come from a 10,000-year record of large paleoearthquakes compiled from subsidence data on land and turbidites, offshore deposits recording submarine slope failure. Earthquakes seem to have happened in clusters of four or five events, separated by gaps. Earthquakes within a cluster occur more frequently and regularly than in the full record. Hence the next earthquake is more likely if we assume that we are in the recent cluster that started about 1700 years ago, than if we assume the cluster is over. Students can explore how changing assumptions drastically changes probability estimates using easy-to-write and display spreadsheets, like those shown below. Insight can also come from baseball analogies. The cluster issue is like deciding whether to assume that a hitter's performance in the next game is better described by his lifetime record, or by the past few games, since he may be hitting unusually well or in a slump. The other big choice is whether to assume that the probability of an earthquake is constant with time, or is small immediately after one occurs and then grows with time. This is like whether to assume that a player's performance is the same from year to year, or changes over their career. Thus saying "the chance of
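
    In the spirit of the spreadsheet exercise described above, the simplest time-independent (Poisson) estimate is a one-line calculation; the recurrence intervals below are placeholders chosen only to illustrate how strongly the assumed average recurrence drives the quoted probabilities, not values from the paleoearthquake record.

      import math

      def poisson_prob(window_years, mean_recurrence_years):
          """Probability of at least one event in the window under a Poisson model."""
          return 1.0 - math.exp(-window_years / mean_recurrence_years)

      print(poisson_prob(50, 500))    # ~0.10 if a ~500-year recurrence is assumed
      print(poisson_prob(50, 1000))   # ~0.05 if a ~1000-year recurrence is assumed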

  18. Do Earthquakes Shake Stock Markets?

    PubMed

    Ferreira, Susana; Karali, Berna

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan.

  19. Toward real-time regional earthquake simulation II: Real-time Online earthquake Simulation (ROS) of Taiwan earthquakes

    NASA Astrophysics Data System (ADS)

    Lee, Shiann-Jong; Liu, Qinya; Tromp, Jeroen; Komatitsch, Dimitri; Liang, Wen-Tzong; Huang, Bor-Shouh

    2014-06-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 min after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). A new island-wide, high resolution SEM mesh model is developed for the whole Taiwan in this study. We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 min for a 70 s ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  20. Sedimentary Signatures of Submarine Earthquakes: Deciphering the Extent of Sediment Remobilization from the 2011 Tohoku Earthquake and Tsunami and 2010 Haiti Earthquake

    NASA Astrophysics Data System (ADS)

    McHugh, C. M.; Seeber, L.; Moernaut, J.; Strasser, M.; Kanamatsu, T.; Ikehara, K.; Bopp, R.; Mustaque, S.; Usami, K.; Schwestermann, T.; Kioka, A.; Moore, L. M.

    2017-12-01

    The 2004 Sumatra-Andaman Mw9.3 and the 2011 Tohoku (Japan) Mw9.0 earthquakes and tsunamis were huge geological events with major societal consequences. Both were along subduction boundaries and ruptured portions of these boundaries that had been deemed incapable of such events. Submarine strike-slip earthquakes, such as the 2010 Mw7.0 in Haiti, are smaller but may be closer to population centers and can be similarly catastrophic. Both classes of earthquakes remobilize sediment and leave distinct signatures in the geologic record through a wide range of processes that depend on both environment and earthquake characteristics. Understanding them has the potential of greatly expanding the record of past earthquakes, which is critical for geohazard analysis. Recent events offer precious ground truth about the earthquakes, and short-lived radioisotopes offer invaluable tools to identify the sediments they remobilized. For the 2011 Mw9 Japan earthquake they document the spatial extent of remobilized sediment from water depths of 626 m on the forearc slope to trench depths of 8000 m. Subbottom profiles, multibeam bathymetry and 40 piston cores collected by the R/V Natsushima and R/V Sonne expeditions to the Japan Trench document multiple turbidites and high-density flows. Core tops enriched in excess 210Pb, 137Cs and 134Cs reveal sediment deposited by the 2011 Tohoku earthquake and tsunami. The thickest deposits (2 m) were documented on a mid-slope terrace and in the trench (4000-8000 m). Sediment was deposited on some terraces (600-3000 m), but shed from the steep forearc slope (3000-4000 m). The 2010 Haiti mainshock ruptured along the southern flank of Canal du Sud and triggered multiple nearshore sediment failures, generated turbidity currents and stirred fine sediment into suspension throughout this basin. A tsunami was modeled to stem from both sediment failures and tectonics. Remobilized sediment was tracked with short-lived radioisotopes from the nearshore, slope, in fault basins including the

  1. WGCEP Historical California Earthquake Catalog

    USGS Publications Warehouse

    Felzer, Karen R.; Cao, Tianqing

    2008-01-01

    This appendix provides an earthquake catalog for California and the surrounding area. Our goal is to provide a listing for all known M > 5.5 earthquakes that occurred from 1850-1932 and all known M > 4.0 earthquakes that occurred from 1932-2006 within the region of 31.0 to 43.0 degrees North and -126.0 to -114.0 degrees West. Some pre-1932 earthquakes 4 5, before the Northern California network was online. Some earthquakes from 1900-1932, and particularly from 1910-1932 are also based on instrumental readings, but the quality of the instrumental record and the resulting analysis are much less precise than for later listings. A partial exception is for some of the largest earthquakes, such as the San Francisco earthquake of April 18, 1906, for which global teleseismic records (Wald et al. 1993) and geodetic measurements (Thatcher et al. 1906) have been used to help determine magnitudes.

  2. Earthquakes, September-October 1978

    USGS Publications Warehouse

    Person, W.J.

    1979-01-01

    The months of September and October were somewhat quiet, seismically speaking. One major earthquake, magnitude (M) 7.7, occurred in Iran on September 16. In Germany, a magnitude 5.0 earthquake caused damage and considerable alarm to many people in parts of that country. In the United States, the largest earthquake occurred along the California-Nevada border region.

  3. Earthquakes, March-April 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    Two major earthquakes (7.0-7.9) occurred during this reporting period: a magnitude 7.6 in Costa Rica on April 22 and a magnitude 7.0 in the USSR on April 29. Destructive earthquakes hit northern Peru on April 4 and 5. There were no destructive earthquakes in the United States during this period.

  4. Can We Predict Earthquakes?

    ScienceCinema

    Johnson, Paul

    2018-01-16

    The only thing we know for sure about earthquakes is that one will happen again very soon. Earthquakes pose a vital yet puzzling set of research questions that have confounded scientists for decades, but new ways of looking at seismic information and innovative laboratory experiments are offering tantalizing clues to what triggers earthquakes — and when.

  5. Earthquake and Schools. [Videotape].

    ERIC Educational Resources Information Center

    Federal Emergency Management Agency, Washington, DC.

    Designing schools to make them more earthquake resistant and protect children from the catastrophic collapse of the school building is discussed in this videotape. It reveals that 44 of the 50 U.S. states are vulnerable to earthquakes, but most schools are structurally unprepared to take on the stresses that earthquakes exert. The cost to the…

  6. Earthquakes, September-October 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    There were two major earthquakes (7.0-7.9) during this reporting period. The first was in the Solomon Islands on October 14 and the second was in India on October 19. Earthquake-related deaths were reported in Guatemala and India. There were no significant earthquakes in the United States during the period covered in this report.

  7. Earthquakes in Arkansas and vicinity 1699-2010

    USGS Publications Warehouse

    Dart, Richard L.; Ausbrooks, Scott M.

    2011-01-01

    This map summarizes approximately 300 years of earthquake activity in Arkansas. It is one in a series of similar State earthquake history maps. Work on the Arkansas map was done in collaboration with the Arkansas Geological Survey. The earthquake data plotted on the map are from several sources: the Arkansas Geological Survey, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Mississippi Department of Environmental Quality. In addition to earthquake locations, other materials presented include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Arkansas and parts of adjacent states. Arkansas has undergone a number of significant felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931 and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Arkansas and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event on December 16, 1811, the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.

  8. Earthquake Forecasting System in Italy

    NASA Astrophysics Data System (ADS)

    Falcone, G.; Marzocchi, W.; Murru, M.; Taroni, M.; Faenza, L.

    2017-12-01

    In Italy, after the 2009 L'Aquila earthquake, a procedure was developed for gathering and disseminating authoritative information about the time dependence of seismic hazard to help communities prepare for a potentially destructive earthquake. The most striking time dependency of the earthquake occurrence process is time clustering, which is particularly pronounced in time windows of days and weeks. The Operational Earthquake Forecasting (OEF) system developed at the Seismic Hazard Center (Centro di Pericolosità Sismica, CPS) of the Istituto Nazionale di Geofisica e Vulcanologia (INGV) is the authoritative source of seismic hazard information for Italian Civil Protection. The philosophy of the system rests on a few basic concepts: transparency, reproducibility, and testability. In particular, the transparent, reproducible, and testable earthquake forecasting system developed at CPS is based on ensemble modeling and on a rigorous testing phase. This phase is carried out according to the guidance proposed by the Collaboratory for the Study of Earthquake Predictability (CSEP, an international infrastructure aimed at quantitatively evaluating earthquake prediction and forecast models through purely prospective and reproducible experiments). In the OEF system, the two most popular short-term models were used: the Epidemic-Type Aftershock Sequences (ETAS) model and the Short-Term Earthquake Probabilities (STEP) model. Here, we report the results of OEF's 24-hour earthquake forecasting during the main phases of the 2016-2017 sequence in the Central Apennines (Italy).
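
    For context, a commonly used temporal form of the ETAS conditional intensity (a generic textbook form, not necessarily the exact parameterization used by CPS) is:

        \lambda(t) = \mu + \sum_{i:\, t_i < t} \frac{K\, e^{\alpha (M_i - M_c)}}{(t - t_i + c)^p}

    where \mu is the background rate, the sum runs over prior events of magnitude M_i above the completeness magnitude M_c, and K, \alpha, c, p are empirical clustering parameters fitted to the catalog.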

  9. The 2008 Wenchuan Earthquake and the Rise and Fall of Earthquake Prediction in China

    NASA Astrophysics Data System (ADS)

    Chen, Q.; Wang, K.

    2009-12-01

    Regardless of the future potential of earthquake prediction, it is presently impractical to rely on it to mitigate earthquake disasters. The practical approach is to strengthen the resilience of our built environment to earthquakes based on hazard assessment. But this was not a common understanding in China when the M 7.9 Wenchuan earthquake struck Sichuan Province on 12 May 2008, claiming over 80,000 lives. In China, earthquake prediction is a government-sanctioned and law-regulated measure of disaster prevention. A sudden boom in the earthquake prediction program in 1966-1976 coincided with a succession of nine M > 7 damaging earthquakes in the densely populated region of the country and the political chaos of the Cultural Revolution. It climaxed with the prediction of the 1975 Haicheng earthquake, which was due mainly to an unusually pronounced foreshock sequence and the extraordinary readiness of some local officials to issue imminent warnings and evacuation orders. The Haicheng prediction was a success in practice and yielded useful lessons, but the experience cannot be applied to most other earthquakes and cultural environments. Since the disastrous Tangshan earthquake in 1976 that killed over 240,000 people, there have been two opposite trends in China: decreasing confidence in prediction and increasing emphasis on regulating construction design for earthquake resilience. In 1976, most of the seismic intensity XI areas of Tangshan were literally razed to the ground, but in 2008, many buildings in the intensity XI areas of Wenchuan did not collapse. Prediction did not save lives in either of these events; the difference was made by construction standards. For regular buildings, there was no seismic design in Tangshan to resist any earthquake shaking in 1976, but limited seismic design was required for the Wenchuan area in 2008. Although the construction standards were later recognized to be too low, those buildings that met the standards suffered much less

  10. Earthquake impact scale

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Bausch, D.

    2011-01-01

    With the advent of the USGS prompt assessment of global earthquakes for response (PAGER) system, which rapidly assesses earthquake impacts, U.S. and international earthquake responders are reconsidering their automatic alert and activation levels and response procedures. To help facilitate rapid and appropriate earthquake response, an Earthquake Impact Scale (EIS) is proposed on the basis of two complementary criteria: one, based on the estimated cost of damage, is most suitable for domestic events; the other, based on estimated ranges of fatalities, is generally more appropriate for global events, particularly in developing countries. Simple thresholds, derived from the systematic analysis of past earthquake impact and associated response levels, are quite effective in communicating predicted impact and response needed after an event through alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1,000, respectively. For damage impact, yellow, orange, and red thresholds are triggered by estimated losses reaching $1M, $100M, and $1B, respectively. The rationale for a dual approach to earthquake alerting stems from the recognition that relatively high fatalities, injuries, and homelessness predominate in countries in which local building practices typically lend themselves to high collapse and casualty rates, and these impacts lend themselves to prioritization for international response. In contrast, financial and overall societal impacts often trigger the level of response in regions or countries in which prevalent earthquake resistant construction practices greatly reduce building collapse and resulting fatalities. Any newly devised alert, whether economic- or casualty-based, should be intuitive and consistent with established lexicons and procedures. Useful alerts should
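
    A minimal sketch of the dual-criteria alerting logic described above, using the fatality and dollar-loss thresholds quoted in the abstract. The function name and the choice to combine the two criteria by taking the higher alert level are illustrative assumptions, not the PAGER implementation.

        def eis_alert(est_fatalities: float, est_loss_usd: float) -> str:
            """Return the higher of the fatality-based and loss-based alert levels."""
            def level_from_thresholds(value, thresholds):
                # thresholds: (yellow, orange, red) lower bounds
                if value >= thresholds[2]:
                    return 3  # red
                if value >= thresholds[1]:
                    return 2  # orange
                if value >= thresholds[0]:
                    return 1  # yellow
                return 0      # green

            fatality_level = level_from_thresholds(est_fatalities, (1, 100, 1_000))
            loss_level = level_from_thresholds(est_loss_usd, (1e6, 1e8, 1e9))
            return ["green", "yellow", "orange", "red"][max(fatality_level, loss_level)]

        # Example: ~20 estimated fatalities and ~$500M estimated losses
        print(eis_alert(20, 5e8))  # -> "orange" (driven by the loss criterion)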

  11. A post-Tohoku earthquake review of earthquake probabilities in the Southern Kanto District, Japan

    NASA Astrophysics Data System (ADS)

    Somerville, Paul G.

    2014-12-01

    The 2011 Mw 9.0 Tohoku earthquake generated an aftershock sequence that affected a large part of northern Honshu, and has given rise to widely divergent forecasts of changes in earthquake occurrence probabilities in northern Honshu. The objective of this review is to assess these forecasts as they relate to potential changes in the occurrence probabilities of damaging earthquakes in the Kanto Region. It is generally agreed that the 2011 Mw 9.0 Tohoku earthquake increased the stress on faults in the southern Kanto district. Toda and Stein (Geophys Res Lett 686, 40: doi:10.1002, 2013) further conclude that the probability of earthquakes in the Kanto Corridor has increased by a factor of 2.5 for the time period 11 March 2013 to 10 March 2018. Estimates of earthquake probabilities in a wider region of the Southern Kanto District by Nanjo et al. (Geophys J Int, doi:10.1093, 2013) indicate that any increase in the probability of earthquakes is insignificant in this larger region. Uchida et al. (Earth Planet Sci Lett 374: 81-91, 2013) conclude that the Philippine Sea plate extends well north of the northern margin of Tokyo Bay, inconsistent with the Kanto Fragment hypothesis of Toda et al. (Nat Geosci 1:1-6, 2008), which attributes deep earthquakes in this region, which they term the Kanto Corridor, to a broken fragment of the Pacific plate. The results of Uchida and Matsuzawa (J Geophys Res 115:B07309, 2013) support the conclusion that fault creep in southern Kanto may be slowly relaxing the stress increase caused by the Tohoku earthquake without causing more large earthquakes. Stress transfer calculations indicate a large stress transfer to the Off Boso Segment as a result of the 2011 Tohoku earthquake. However, Ozawa et al. (J Geophys Res 117:B07404, 2012) used onshore GPS measurements to infer large post-Tohoku creep on the plate interface in the Off-Boso region, and Uchida and Matsuzawa (ibid.) measured similar large creep off the Boso

  12. Post-earthquake building safety inspection: Lessons from the Canterbury, New Zealand, earthquakes

    USGS Publications Warehouse

    Marshall, J.; Jaiswal, Kishor; Gould, N.; Turner, F.; Lizundia, B.; Barnes, J.

    2013-01-01

    The authors discuss some of the unique aspects and lessons of the New Zealand post-earthquake building safety inspection program that was implemented following the Canterbury earthquake sequence of 2010–2011. The post-event safety assessment program was one of the largest and longest programs undertaken in recent times anywhere in the world. The effort engaged hundreds of engineering professionals throughout the country, and also sought expertise from outside, to perform post-earthquake structural safety inspections of more than 100,000 buildings in the city of Christchurch and the surrounding suburbs. While the building safety inspection procedure implemented was analogous to the ATC 20 program in the United States, many modifications were proposed and implemented in order to assess the large number of buildings that were subjected to strong and variable shaking during a period of two years. This note discusses some of the key aspects of the post-earthquake building safety inspection program and summarizes important lessons that can improve future earthquake response.

  13. Do Earthquakes Shake Stock Markets?

    PubMed Central

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan. PMID:26197482

  14. Continuing Megathrust Earthquake Potential in northern Chile after the 2014 Iquique Earthquake Sequence

    NASA Astrophysics Data System (ADS)

    Hayes, G. P.; Herman, M. W.; Barnhart, W. D.; Furlong, K. P.; Riquelme, S.; Benz, H.; Bergman, E.; Barrientos, S. E.; Earle, P. S.; Samsonov, S. V.

    2014-12-01

    The seismic gap theory, which identifies regions of elevated hazard based on a lack of recent seismicity in comparison to other portions of a fault, has successfully explained past earthquakes and is useful for qualitatively describing where future large earthquakes might occur. A large earthquake had been expected in the subduction zone adjacent to northern Chile, which until recently had not ruptured in a megathrust earthquake since a M~8.8 event in 1877. On April 1, 2014, a M 8.2 earthquake occurred within this northern Chile seismic gap, offshore of the city of Iquique; the size and spatial extent of the rupture indicate it was not the earthquake that had been anticipated. Here, we present a rapid assessment of the seismotectonics of the March-April 2014 seismic sequence offshore northern Chile, including analyses of earthquake (fore- and aftershock) relocations, moment tensors, finite fault models, moment deficit calculations, and cumulative Coulomb stress transfer calculations over the duration of the sequence. This ensemble of information allows us to place the current sequence within the context of historic seismicity in the region, and to assess areas of remaining and/or elevated hazard. Our results indicate that while accumulated strain has been released for a portion of the northern Chile seismic gap, significant sections have not ruptured in almost 150 years. These observations suggest that large-to-great-sized megathrust earthquakes will occur north and south of the 2014 Iquique sequence sooner than might be expected had the 2014 events ruptured the entire seismic gap.
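
    The abstract mentions moment deficit calculations; a minimal back-of-the-envelope version of such a calculation, with entirely assumed parameter values (not values from the study), looks like this:

        # Back-of-the-envelope moment-deficit estimate for a locked megathrust segment.
        # All numbers below are illustrative assumptions, not values from the study.
        import math

        mu = 3.0e10                   # shear modulus, Pa (assumed)
        length_m = 250e3              # along-strike length of the unruptured segment (assumed)
        width_m = 120e3               # down-dip width of the seismogenic zone (assumed)
        convergence_m_per_yr = 0.065  # plate convergence rate, ~65 mm/yr (assumed)
        coupling = 0.8                # assumed interseismic coupling fraction
        years_since_rupture = 137     # e.g. 1877 to 2014

        slip_deficit = coupling * convergence_m_per_yr * years_since_rupture  # metres
        m0 = mu * length_m * width_m * slip_deficit   # accumulated seismic moment, N m
        mw = (2.0 / 3.0) * (math.log10(m0) - 9.1)     # standard moment magnitude

        print(f"accumulated slip deficit ~{slip_deficit:.1f} m, equivalent Mw ~{mw:.1f}")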

  15. The Mw 7.7 Bhuj earthquake: Global lessons for earthquake hazard in intra-plate regions

    USGS Publications Warehouse

    Schweig, E.; Gomberg, J.; Petersen, M.; Ellis, M.; Bodin, P.; Mayrose, L.; Rastogi, B.K.

    2003-01-01

    The Mw 7.7 Bhuj earthquake occurred in the Kachchh District of the State of Gujarat, India on 26 January 2001, and was one of the most damaging intraplate earthquakes ever recorded. This earthquake is in many ways similar to the three great New Madrid earthquakes that occurred in the central United States in 1811-1812. An Indo-US team is studying the similarities and differences of these sequences in order to learn lessons for earthquake hazard in intraplate regions. Herein we present some preliminary conclusions from that study. Both the Kutch and New Madrid regions have a rift-type geotectonic setting. In both regions the strain rates are of the order of 10^-9/yr, and attenuation of seismic waves as inferred from observations of intensity and liquefaction is low. These strain rates predict recurrence intervals for Bhuj- or New Madrid-sized earthquakes of several thousand years or more. In contrast, intervals estimated from paleoseismic studies and from other independent data are significantly shorter, probably hundreds of years. All these observations together may suggest that earthquakes relax high ambient stresses that are locally concentrated by rheologic heterogeneities, rather than loading by plate-tectonic forces. The latter model generally underlies basic assumptions made in earthquake hazard assessment, that the long-term average rate of energy released by earthquakes is determined by the tectonic loading rate, which thus implies an inherent average periodicity of earthquake occurrence. Interpreting the observations in terms of the former model therefore may require re-examining the basic assumptions of hazard assessment.

  16. Earthquake precursors: spatial-temporal gravity changes before the great earthquakes in the Sichuan-Yunnan area

    NASA Astrophysics Data System (ADS)

    Zhu, Yi-Qing; Liang, Wei-Feng; Zhang, Song

    2018-01-01

    Using multiple-scale mobile gravity data in the Sichuan-Yunnan area, we systematically analyzed the relationships between spatial-temporal gravity changes and the 2014 Ludian, Yunnan Province Ms6.5 earthquake and the 2014 Kangding Ms6.3, 2013 Lushan Ms7.0, and 2008 Wenchuan Ms8.0 earthquakes in Sichuan Province. Our main results are as follows. (1) Before the occurrence of large earthquakes, gravity anomalies occur in a large area around the epicenters. The directions of gravity change gradient belts usually agree roughly with the directions of the main fault zones of the study area. Such gravity changes might reflect the increase of crustal stress, as well as the significant active tectonic movements and surface deformations along fault zones, during the period of gestation of great earthquakes. (2) Continuous significant changes of the multiple-scale gravity fields, as well as greater gravity changes with larger time scales, can be regarded as medium-range precursors of large earthquakes. The subsequent large earthquakes always occur in the area where the gravity changes greatly. (3) The spatial-temporal gravity changes are very useful in determining the epicenter of coming large earthquakes. The large gravity networks are useful to determine the general areas of coming large earthquakes. However, the local gravity networks with high spatial-temporal resolution are suitable for determining the location of epicenters. Therefore, denser gravity observation networks are necessary for better forecasts of the epicenters of large earthquakes. (4) Using gravity changes from mobile observation data, we made medium-range forecasts of the Kangding, Ludian, Lushan, and Wenchuan earthquakes, with especially successful forecasts of the location of their epicenters. Based on the above discussions, we emphasize that medium-/long-term potential for large earthquakes might exist nowadays in some areas with significant gravity anomalies in the study region. Thus, the monitoring

  17. Earthquakes in Mississippi and vicinity 1811-2010

    USGS Publications Warehouse

    Dart, Richard L.; Bograd, Michael B.E.

    2011-01-01

    This map summarizes two centuries of earthquake activity in Mississippi. Work on the Mississippi map was done in collaboration with the Mississippi Department of Environmental Quality, Office of Geology. The earthquake data plotted on the map are from several sources: the Mississippi Department of Environmental Quality, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Arkansas Geological Survey. In addition to earthquake locations, other materials include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Mississippi and parts of adjacent States. Mississippi has undergone a number of felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931, and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Mississippi and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event, on December 16, 1811; the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.

  18. Overestimation of the earthquake hazard along the Himalaya: constraints in bracketing of medieval earthquakes from paleoseismic studies

    NASA Astrophysics Data System (ADS)

    Arora, Shreya; Malik, Javed N.

    2017-12-01

    The Himalaya is one of the most seismically active regions of the world. The occurrence of several large magnitude earthquakes, viz. the 1905 Kangra earthquake (Mw 7.8), the 1934 Bihar-Nepal earthquake (Mw 8.2), the 1950 Assam earthquake (Mw 8.4), the 2005 Kashmir earthquake (Mw 7.6), and the 2015 Gorkha earthquake (Mw 7.8), is testimony to ongoing tectonic activity. In the last few decades, tremendous efforts have been made along the Himalayan arc to understand the patterns of earthquake occurrence, size, extent, and return periods. Some of the large magnitude earthquakes produced surface rupture, while some remained blind. Furthermore, due to the incompleteness of the earthquake catalogue, very few events can be correlated with medieval earthquakes. Based on the existing paleoseismic data, it is difficult to precisely determine the extent of surface rupture of these earthquakes, as well as of those events which occurred during historic times. In this paper, we have compiled the paleoseismological data and recalibrated the radiocarbon ages from the trenches excavated by previous workers along the entire Himalaya and compared the earthquake scenario with the past. Our studies suggest that there were multiple earthquake events with overlapping surface ruptures in small patches, with an average rupture length of 300 km limiting Mw 7.8-8.0 for the Himalayan arc, rather than two or three giant earthquakes rupturing the whole front. It has been identified that large magnitude Himalayan earthquakes, such as the 1905 Kangra, 1934 Bihar-Nepal, and 1950 Assam events, occurred within a time frame of 45 years. If such events are dated, there is a high possibility that, within a range of ±50 years, they may be considered remnants of one giant earthquake rupturing the entire Himalayan arc, leading to an overestimation of the seismic hazard scenario in the Himalaya.

  19. Simulating Earthquake Early Warning Systems in the Classroom as a New Approach to Teaching Earthquakes

    NASA Astrophysics Data System (ADS)

    D'Alessio, M. A.

    2010-12-01

    A discussion of P- and S-waves seems a ubiquitous part of studying earthquakes in the classroom. Textbooks from middle school through university level typically define the differences between the waves and illustrate the sense of motion. While many students successfully memorize the differences between wave types (often utilizing the first letter as a memory aid), textbooks rarely give tangible examples of how the two waves would "feel" to a person sitting on the ground. One reason for introducing the wave types is to explain how to calculate earthquake epicenters using seismograms and travel time charts -- very abstract representations of earthquakes. Even when the skill is mastered using paper-and-pencil activities or one of the excellent online interactive versions, locating an epicenter simply does not excite many of our students because it evokes little emotional impact, even in students located in earthquake-prone areas. Despite these limitations, huge numbers of students are mandated to complete the task. At the K-12 level, California requires that all students be able to locate earthquake epicenters in Grade 6; in New York, the skill is a required part of the Regent's Examination. Recent innovations in earthquake early warning systems around the globe give us the opportunity to address the same content standard, but with substantially more emotional impact on students. I outline a lesson about earthquakes focused on earthquake early warning systems. The introductory activities include video clips of actual earthquakes and emphasize the differences between the way P- and S-waves feel when they arrive (P arrives first, but is weaker). I include an introduction to the principle behind earthquake early warning (including a summary of possible uses of a few seconds warning about strong shaking) and show examples from Japan. Students go outdoors to simulate P-waves, S-waves, and occupants of two different cities who are talking to one another on cell phones
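
    A minimal illustration of the principle behind early warning mentioned above, using assumed average crustal velocities and an assumed alert latency (these values are for illustration only and are not from the lesson):

        # Rough warning-time estimate for earthquake early warning, assuming
        # straight-line travel at average crustal velocities.
        vp_km_s = 6.0   # assumed average P-wave speed
        vs_km_s = 3.5   # assumed average S-wave speed

        def warning_time_s(epicentral_distance_km: float, alert_latency_s: float = 5.0) -> float:
            """Seconds between an alert (issued shortly after P detection) and S-wave arrival."""
            p_arrival = epicentral_distance_km / vp_km_s
            s_arrival = epicentral_distance_km / vs_km_s
            return max(0.0, s_arrival - p_arrival - alert_latency_s)

        for d in (20, 50, 100, 200):  # km from the epicenter
            print(f"{d:>4} km: ~{warning_time_s(d):.0f} s of warning")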

  20. Demand surge following earthquakes

    USGS Publications Warehouse

    Olsen, Anna H.

    2012-01-01

    Demand surge is understood to be a socio-economic phenomenon where repair costs for the same damage are higher after large- versus small-scale natural disasters. It has reportedly increased monetary losses by 20 to 50%. In previous work, a model for the increased costs of reconstruction labor and materials was developed for hurricanes in the Southeast United States. The model showed that labor cost increases, rather than the material component, drove the total repair cost increases, and this finding could be extended to earthquakes. A study of past large-scale disasters suggested that there may be additional explanations for demand surge. Two such explanations specific to earthquakes are the exclusion of insurance coverage for earthquake damage and possible concurrent causation of damage from an earthquake followed by fire or tsunami. Additional research into these aspects might provide a better explanation for increased monetary losses after large- vs. small-scale earthquakes.

  1. Intermediate-term earthquake prediction

    USGS Publications Warehouse

    Knopoff, L.

    1990-01-01

    The problems in predicting earthquakes have been attacked by phenomenological methods from pre-historic times to the present. The associations of presumed precursors with large earthquakes often have been remarked upon. The difficulty in identifying whether such correlations are due to some chance coincidence or are real precursors is that usually one notes the associations only in the relatively short time intervals before the large events. Only rarely, if ever, is notice taken of whether the presumed precursor is to be found in the rather long intervals that follow large earthquakes, or in fact is absent in these post-earthquake intervals. If there are enough examples, the presumed correlation fails as a precursor in the former case, while in the latter case the precursor would be verified. Unfortunately, the observer is usually not concerned with the 'uninteresting' intervals that have no large earthquakes

  2. Earthquakes on Your Dinner Table

    NASA Astrophysics Data System (ADS)

    Alexeev, N. A.; Tape, C.; Alexeev, V. A.

    2016-12-01

    Earthquakes have interesting physics applicable to other phenomena, like the propagation of waves; they also affect human lives. This study focused on three questions: how depth, distance from the epicenter, and ground hardness affect earthquake strength. The experimental setup consisted of a gelatin slab to simulate the crust. The slab was hit with a weight, and earthquake amplitude was measured. It was found that earthquake amplitude was larger when the epicenter was deeper, which contradicts observations and was probably an artifact of the design. Earthquake strength was inversely proportional to the distance from the epicenter, which generally follows reality. Soft and medium jello were implanted into hard jello. It was found that earthquakes are stronger in softer jello, which was a result of resonant amplification in soft ground. Similar results are found in Minto Flats, where earthquakes are stronger and last longer than in the nearby hills. Earthquake waveforms from Minto Flats showed that the oscillations there have longer periods compared to the nearby hills with harder soil. Two gelatin pieces with identical shapes and different hardness were vibrated on a platform at varying frequencies in order to demonstrate that their resonant frequencies are statistically different. This phenomenon also occurs in Yukon Flats.

  3. Damaging earthquakes: A scientific laboratory

    USGS Publications Warehouse

    Hays, Walter W.; ,

    1996-01-01

    This paper reviews the principal lessons learned from multidisciplinary postearthquake investigations of damaging earthquakes throughout the world during the past 15 years. The unique laboratory provided by a damaging earthquake in culturally different but tectonically similar regions of the world has increased fundamental understanding of earthquake processes, added perishable scientific, technical, and socioeconomic data to the knowledge base, and led to changes in public policies and professional practices for earthquake loss reduction.

  4. 2010 Chile Earthquake Aftershock Response

    NASA Astrophysics Data System (ADS)

    Barientos, Sergio

    2010-05-01

    The Mw=8.8 earthquake off the coast of Chile on 27 February 2010 is the 5th largest megathrust earthquake ever to be recorded and provides an unprecedented opportunity to advance our understanding of megathrust earthquakes and associated phenomena. The 2010 Chile earthquake ruptured the Concepcion-Constitucion segment of the Nazca/South America plate boundary, south of the Central Chile region, and triggered a tsunami along the coast. Following the 2010 earthquake, a very energetic aftershock sequence is being observed in an area that extends 600 km along strike, from Valparaiso to 150 km south of Concepcion. Within the first three weeks there were over 260 aftershocks with magnitude 5.0 or greater and 18 with magnitude 6.0 or greater (NEIC, USGS). The Concepcion-Constitucion segment lies immediately north of the rupture zone associated with the great magnitude 9.5 Chile earthquake, and south of the 1906 and the 1985 Valparaiso earthquakes. The last great subduction earthquake in the region dates back to the February 1835 event described by Darwin (1871). Since 1835, part of the region was affected in the north by the Talca earthquake in December 1928, interpreted as a shallow dipping thrust event, and by the Chillan earthquake (Mw 7.9, January 1939), a slab-pull intermediate depth earthquake. For the last 30 years, geodetic studies in this area were consistent with a fully coupled elastic loading of the subduction interface at depth; this led to identifying the area as a mature seismic gap with potential for an earthquake of magnitude on the order of 8.5, or several earthquakes of lesser magnitude. What was less expected was the partial rupturing of the 1985 segment toward the north. Today, the 2010 earthquake raises some disturbing questions: Why and how did the rupture terminate where it did at the northern end? How did the 2010 earthquake load the adjacent segment to the north, and did the 1985 earthquake only partially rupture the plate interface, leaving loaded asperities since

  5. Sensing the earthquake

    NASA Astrophysics Data System (ADS)

    Bichisao, Marta; Stallone, Angela

    2017-04-01

    Making science visual plays a crucial role in the process of building knowledge. In this view, art can considerably facilitate the representation of the scientific content, by offering a different perspective on how a specific problem could be approached. Here we explore the possibility of presenting the earthquake process through visual dance. From a choreographer's point of view, the focus is always on the dynamic relationships between moving objects. The observed spatial patterns (coincidences, repetitions, double and rhythmic configurations) suggest how objects organize themselves in the environment and what are the principles underlying that organization. The identified set of rules is then implemented as a basis for the creation of a complex rhythmic and visual dance system. Recently, scientists have turned seismic waves into sound and animations, introducing the possibility of "feeling" the earthquakes. We try to implement these results into a choreographic model with the aim to convert earthquake sound to a visual dance system, which could return a transmedia representation of the earthquake process. In particular, we focus on a possible method to translate and transfer the metric language of seismic sound and animations into body language. The objective is to involve the audience into a multisensory exploration of the earthquake phenomenon, through the stimulation of the hearing, eyesight and perception of the movements (neuromotor system). In essence, the main goal of this work is to develop a method for a simultaneous visual and auditory representation of a seismic event by means of a structured choreographic model. This artistic representation could provide an original entryway into the physics of earthquakes.

  6. Discrepancy between earthquake rates implied by historic earthquakes and a consensus geologic source model for California

    USGS Publications Warehouse

    Petersen, M.D.; Cramer, C.H.; Reichle, M.S.; Frankel, A.D.; Hanks, T.C.

    2000-01-01

    We examine the difference between expected earthquake rates inferred from the historical earthquake catalog and the geologic data that were used to develop the consensus seismic source characterization for the state of California [California Department of Conservation, Division of Mines and Geology (CDMG) and U.S. Geological Survey (USGS) Petersen et al., 1996; Frankel et al., 1996]. On average, the historic earthquake catalog and the seismic source model both indicate about one M 6 or greater earthquake per year in the state of California. However, the overall rates of earthquakes with magnitudes (M) between 6 and 7 in this seismic source model are higher, by at least a factor of 2, than the mean historic rates for both southern and northern California. The earthquake rate discrepancy results from a seismic source model that includes earthquakes with characteristic (maximum) magnitudes that are primarily between M 6.4 and 7.1. Many of these faults are interpreted to accommodate high strain rates from geologic and geodetic data but have not ruptured in large earthquakes during historic time. Our sensitivity study indicates that the rate differences between magnitudes 6 and 7 can be reduced by adjusting the magnitude-frequency distribution of the source model to reflect more characteristic behavior, by decreasing the moment rate available for seismogenic slip along faults, by increasing the maximum magnitude of the earthquake on a fault, or by decreasing the maximum magnitude of the background seismicity. However, no single parameter can be adjusted, consistent with scientific consensus, to eliminate the earthquake rate discrepancy. Applying a combination of these parametric adjustments yields an alternative earthquake source model that is more compatible with the historic data. The 475-year return period hazard for peak ground and 1-sec spectral acceleration resulting from this alternative source model differs from the hazard resulting from the

  7. GPS Technologies as a Tool to Detect the Pre-Earthquake Signals Associated with Strong Earthquakes

    NASA Astrophysics Data System (ADS)

    Pulinets, S. A.; Krankowski, A.; Hernandez-Pajares, M.; Liu, J. Y. G.; Hattori, K.; Davidenko, D.; Ouzounov, D.

    2015-12-01

    The existence of ionospheric anomalies before earthquakes is now widely accepted. These phenomena have started to be considered by the GPS community as a way to mitigate GPS signal degradation over territories of earthquake preparation. The question remains open whether they could be useful for seismology and for short-term earthquake forecasting. More than a decade of intensive studies proved that ionospheric anomalies registered before earthquakes are initiated by processes in the boundary layer of the atmosphere over the earthquake preparation zone and are induced in the ionosphere by electromagnetic coupling through the Global Electric Circuit. A multiparameter approach based on the Lithosphere-Atmosphere-Ionosphere Coupling model demonstrated that earthquake forecasting is possible only if we consider the final stage of earthquake preparation in a multidimensional space where every dimension is one of many precursors in an ensemble, and they are synergistically connected. We demonstrate approaches developed in different countries (Russia, Taiwan, Japan, Spain, and Poland, within the framework of ISSI and ESA projects) to identify the ionospheric precursors. They are also useful for determining all three parameters necessary for an earthquake forecast: the impending earthquake's epicenter position, expectation time, and magnitude. These parameters are calculated using different technologies of GPS signal processing: time series, correlation, spectral analysis, ionospheric tomography, wave propagation, etc. Results obtained from different teams demonstrate a high level of statistical significance and physical justification, which gives us reason to suggest these methodologies for practical validation.

  8. Earthquake Fingerprints: Representing Earthquake Waveforms for Similarity-Based Detection

    NASA Astrophysics Data System (ADS)

    Bergen, K.; Beroza, G. C.

    2016-12-01

    New earthquake detection methods, such as Fingerprint and Similarity Thresholding (FAST), use fast approximate similarity search to identify similar waveforms in long-duration data without templates (Yoon et al. 2015). These methods have two key components: fingerprint extraction and an efficient search algorithm. Fingerprint extraction converts waveforms into fingerprints, compact signatures that represent short-duration waveforms for identification and search. Earthquakes are detected using an efficient indexing and search scheme, such as locality-sensitive hashing, that identifies similar waveforms in a fingerprint database. The quality of the search results, and thus the earthquake detection results, is strongly dependent on the fingerprinting scheme. Fingerprint extraction should map similar earthquake waveforms to similar waveform fingerprints to ensure a high detection rate, even under additive noise and small distortions. Additionally, fingerprints corresponding to noise intervals should be mutually dissimilar to minimize false detections. In this work, we compare the performance of multiple fingerprint extraction approaches for the earthquake waveform similarity search problem. We apply existing audio fingerprinting (used in content-based audio identification systems) and time series indexing techniques and present modified versions that are specifically adapted for seismic data. We also explore data-driven fingerprinting approaches that can take advantage of labeled or unlabeled waveform data. For each fingerprinting approach we measure its ability to identify similar waveforms in a low signal-to-noise setting, and quantify the trade-off between true and false detection rates in the presence of persistent noise sources. We compare the performance using known event waveforms from eight independent stations in the Northern California Seismic Network.
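
    A simplified illustration of the fingerprint-and-search idea, assuming a crude spectral fingerprint and direct Jaccard comparison; this is not the FAST implementation (which uses wavelet-based fingerprints and locality-sensitive hashing for scalable search):

        import numpy as np

        def binary_fingerprint(window: np.ndarray, keep_frac: float = 0.2) -> np.ndarray:
            """Binary spectral fingerprint: 1 where the FFT magnitude is in the top fraction."""
            mag = np.abs(np.fft.rfft(window * np.hanning(len(window))))
            threshold = np.quantile(mag, 1.0 - keep_frac)
            return (mag >= threshold).astype(np.uint8)

        def jaccard(fp1: np.ndarray, fp2: np.ndarray) -> float:
            """Similarity between two binary fingerprints (1.0 = identical support)."""
            inter = np.sum((fp1 & fp2) > 0)
            union = np.sum((fp1 | fp2) > 0)
            return inter / union if union else 0.0

        # Toy example: two noisy copies of the same synthetic "event" waveform
        rng = np.random.default_rng(0)
        t = np.linspace(0, 1, 1000)
        event = np.sin(2 * np.pi * 8 * t) * np.exp(-3 * t)
        fp_a = binary_fingerprint(event + 0.1 * rng.standard_normal(t.size))
        fp_b = binary_fingerprint(event + 0.1 * rng.standard_normal(t.size))
        fp_noise = binary_fingerprint(rng.standard_normal(t.size))
        print(jaccard(fp_a, fp_b), jaccard(fp_a, fp_noise))  # the similar pair scores higher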

  9. Earthquake-related versus non-earthquake-related injuries in spinal injury patients: differentiation with multidetector computed tomography

    PubMed Central

    2010-01-01

    Introduction In recent years, several massive earthquakes have occurred across the globe. Multidetector computed tomography (MDCT) is reliable in detecting spinal injuries. The purpose of this study was to compare the features of spinal injuries resulting from the Sichuan earthquake with those of non-earthquake-related spinal trauma using MDCT. Methods Features of spinal injuries of 223 Sichuan earthquake-exposed patients and 223 non-earthquake-related spinal injury patients were retrospectively compared using MDCT. The data for the non-earthquake-related spinal injury patients were collected from 1 May 2009 to 22 July 2009 to avoid the confounding effects of seasonal activity and clothing. We focused on anatomic sites, injury types and neurologic deficits related to spinal injuries. Major injuries were classified according to the grid 3-3-3 scheme of the Magerl (AO) classification system. Results A total of 185 patients (82.96%) in the earthquake-exposed cohort experienced crush injuries. In the earthquake and control groups, 65 and 92 patients, respectively, had neurologic deficits. The anatomic distribution of these two cohorts was significantly different (P < 0.001). Cervical spinal injuries were more common in the control group (risk ratio (RR) = 2.12, P < 0.001), whereas lumbar spinal injuries were more common in the earthquake-related spinal injuries group (277 of 501 injured vertebrae; 55.29%). The major types of injuries were significantly different between these cohorts (P = 0.002). Magerl AO type A lesions comprised most of the lesions seen in both of these cohorts. Type B lesions were more frequently seen in earthquake-related spinal injuries (RR = 1.27), while we observed type C lesions more frequently in subjects with non-earthquake-related spinal injuries (RR = 1.98, P = 0.0029). Conclusions Spinal injuries sustained in the Sichuan earthquake were located mainly in the lumbar spine, with a peak prevalence of type A lesions and a high occurrence of

  10. Earthquake Emergency Education in Dushanbe, Tajikistan

    ERIC Educational Resources Information Center

    Mohadjer, Solmaz; Bendick, Rebecca; Halvorson, Sarah J.; Saydullaev, Umed; Hojiboev, Orifjon; Stickler, Christine; Adam, Zachary R.

    2010-01-01

    We developed a middle school earthquake science and hazards curriculum to promote earthquake awareness to students in the Central Asian country of Tajikistan. These materials include pre- and post-assessment activities, six science activities describing physical processes related to earthquakes, five activities on earthquake hazards and mitigation…

  11. Intensity earthquake scenario (scenario event - a damaging earthquake with higher probability of occurrence) for the city of Sofia

    NASA Astrophysics Data System (ADS)

    Aleksandrova, Irena; Simeonova, Stela; Solakov, Dimcho; Popova, Maria

    2014-05-01

    Among the many kinds of natural and man-made disasters, earthquakes dominate with regard to their social and economic impact on the urban environment. Global seismic risk is increasing steadily as urbanization and development occupy more areas that are prone to the effects of strong earthquakes. Additionally, the uncontrolled growth of megacities in highly seismic areas around the world is often associated with the construction of seismically unsafe buildings and infrastructure, and is undertaken with insufficient knowledge of the regional seismicity peculiarities and seismic hazard. The assessment of seismic hazard and the generation of earthquake scenarios is the first link in the prevention chain and the first step in the evaluation of seismic risk. The earthquake scenarios are intended as a basic input for developing detailed earthquake damage scenarios for cities and can be used in earthquake-safe town and infrastructure planning. The city of Sofia is the capital of Bulgaria. It is situated in the centre of the Sofia area, which is the most populated (more than 1.2 million inhabitants), industrial and cultural region of Bulgaria, and it faces considerable earthquake risk. The available historical documents prove the occurrence of destructive earthquakes during the 15th-18th centuries in the Sofia zone. In the 19th century the city of Sofia experienced two strong earthquakes: the 1818 earthquake with epicentral intensity I0=8-9 MSK and the 1858 earthquake with I0=9-10 MSK. During the 20th century the strongest event that occurred in the vicinity of the city of Sofia was the 1917 earthquake with MS=5.3 (I0=7-8 MSK). Almost a century later (95 years), an earthquake of moment magnitude 5.6 (I0=7-8 MSK) hit the city of Sofia on May 22nd, 2012. In the present study, the deterministic scenario event considered is a damaging earthquake, with a higher probability of occurrence, that could affect the city with intensity less than or equal to VIII

  12. Earthquake recurrence models fail when earthquakes fail to reset the stress field

    USGS Publications Warehouse

    Tormann, Thessa; Wiemer, Stefan; Hardebeck, Jeanne L.

    2012-01-01

    Parkfield's regularly occurring M6 mainshocks, about every 25 years, have for over two decades stoked seismologists' hopes of successfully predicting an earthquake of significant size. However, with the longest known inter-event time of 38 years, the latest M6 in the series (28 Sep 2004) did not conform to any of the applied forecast models, questioning once more the predictability of earthquakes in general. Our study investigates the spatial pattern of b-values along the Parkfield segment through the seismic cycle and documents a stably stressed structure. The forecasted rate of M6 earthquakes based on Parkfield's microseismicity b-values corresponds well to observed rates. We interpret the observed b-value stability in terms of the evolution of the stress field in that area: the M6 Parkfield earthquakes do not fully unload the stress on the fault, explaining why time-recurrent models fail. We present the 1989 M6.9 Loma Prieta earthquake as a counterexample, which did release a significant portion of the stress along its fault segment and yields a substantial change in b-values.
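
    For reference, forecast rates of a target magnitude from microseismicity typically follow from the Gutenberg-Richter relation (a standard relation, not the paper's specific procedure):

        \log_{10} N(\geq M) = a - bM \quad\Rightarrow\quad N(\geq 6) = 10^{\,a - 6b}

    where N(>=M) is the annual rate of events at or above magnitude M, a measures overall productivity, and b is the b-value estimated from the microseismicity; a lower b implies relatively more large events.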

  13. The Virtual Quake Earthquake Simulator: Earthquake Probability Statistics for the El Mayor-Cucapah Region and Evidence of Predictability in Simulated Earthquake Sequences

    NASA Astrophysics Data System (ADS)

    Schultz, K.; Yoder, M. R.; Heien, E. M.; Rundle, J. B.; Turcotte, D. L.; Parker, J. W.; Donnellan, A.

    2015-12-01

    We introduce a framework for developing earthquake forecasts using Virtual Quake (VQ), the generalized successor to the perhaps better known Virtual California (VC) earthquake simulator. We discuss the basic merits and mechanics of the simulator, and we present several statistics of interest for earthquake forecasting. We also show that, though the system as a whole (in aggregate) behaves quite randomly, (simulated) earthquake sequences limited to specific fault sections exhibit measurable predictability in the form of increasing seismicity precursory to large m > 7 earthquakes. In order to quantify this, we develop an alert based forecasting metric similar to those presented in Keilis-Borok (2002); Molchan (1997), and show that it exhibits significant information gain compared to random forecasts. We also discuss the long standing question of activation vs quiescent type earthquake triggering. We show that VQ exhibits both behaviors separately for independent fault sections; some fault sections exhibit activation type triggering, while others are better characterized by quiescent type triggering. We discuss these aspects of VQ specifically with respect to faults in the Salton Basin and near the El Mayor-Cucapah region in southern California USA and northern Baja California Norte, Mexico.

  14. Earthquake Damage Assessment Using Objective Image Segmentation: A Case Study of 2010 Haiti Earthquake

    NASA Technical Reports Server (NTRS)

    Oommen, Thomas; Rebbapragada, Umaa; Cerminaro, Daniel

    2012-01-01

    In this study, we perform a case study on imagery from the Haiti earthquake that evaluates a novel object-based approach for characterizing earthquake-induced surface effects of liquefaction against a traditional pixel-based change detection technique. Our technique, which combines object-oriented change detection with discriminant/categorical functions, shows the power of distinguishing earthquake-induced surface effects from changes in buildings using the object properties concavity, convexity, orthogonality and rectangularity. Our results suggest that object-based analysis holds promise for automatically extracting earthquake-induced damage from high-resolution aerial/satellite imagery.
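
    Illustrative definitions of two of the object properties mentioned above, computed for a polygonal image object with shapely; these are common definitions of "convexity" and "rectangularity", and the study's exact formulations may differ:

        from shapely.geometry import Polygon

        def convexity(obj: Polygon) -> float:
            """Area of the object divided by the area of its convex hull (1.0 = convex)."""
            return obj.area / obj.convex_hull.area

        def rectangularity(obj: Polygon) -> float:
            """Area of the object divided by the area of its minimum rotated bounding box."""
            return obj.area / obj.minimum_rotated_rectangle.area

        # An L-shaped object (e.g. a building footprint) scores lower than a square would.
        l_shape = Polygon([(0, 0), (4, 0), (4, 1), (1, 1), (1, 3), (0, 3)])
        print(convexity(l_shape), rectangularity(l_shape))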

  15. The CATDAT damaging earthquakes database

    NASA Astrophysics Data System (ADS)

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

    2011-08-01

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies from, and greatly expand upon existing global databases, and to better understand the trends in vulnerability, exposure, and possible future impacts of historic earthquakes. In the authors' view, the lack of consistency and the errors in other frequently cited earthquake loss databases used in analyses were major shortcomings that needed to be improved upon. Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. Comparison of the 1923 Great Kanto earthquake (214 billion USD damage; 2011 HNDECI-adjusted dollars) with the 2011 Tohoku (>300 billion USD at the time of writing), 2008 Sichuan and 1995 Kobe earthquakes shows the increasing concern for economic loss in urban areas, a trend that should be expected to continue. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  16. Bayesian historical earthquake relocation: an example from the 1909 Taipei earthquake

    USGS Publications Warehouse

    Minson, Sarah E.; Lee, William H.K.

    2014-01-01

    Locating earthquakes from the beginning of the modern instrumental period is complicated by the fact that there are few good-quality seismograms and what traveltimes do exist may be corrupted by both large phase-pick errors and clock errors. Here, we outline a Bayesian approach to simultaneous inference of not only the hypocentre location but also the clock errors at each station and the origin time of the earthquake. This methodology improves the solution for the source location and also provides an uncertainty analysis on all of the parameters included in the inversion. As an example, we applied this Bayesian approach to the well-studied 1909 Mw 7 Taipei earthquake. While our epicentre location and origin time for the 1909 Taipei earthquake are consistent with earlier studies, our focal depth is significantly shallower suggesting a higher seismic hazard to the populous Taipei metropolitan area than previously supposed.
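
    A minimal sketch of the kind of joint inference described here, assuming a homogeneous-velocity travel-time model and Gaussian pick errors; all parameter names, priors, and values are illustrative, not those of the study, and in practice the posterior would be sampled with MCMC:

        import numpy as np

        VP_KM_S = 6.5  # assumed constant P velocity

        def travel_time(hypo_xyz_km, station_xyz_km):
            """Straight-ray P travel time in a homogeneous half-space (seconds)."""
            return np.linalg.norm(np.asarray(hypo_xyz_km) - np.asarray(station_xyz_km)) / VP_KM_S

        def log_posterior(params, stations_xyz_km, picks_s, pick_sigma_s=2.0, clock_sigma_s=10.0):
            """params = [x, y, z, origin_time, clock_err_1, ..., clock_err_N]."""
            x, y, z, t0 = params[:4]
            clock_errors = np.asarray(params[4:])
            if z < 0:                                                      # depth must be positive
                return -np.inf
            predicted = np.array([t0 + travel_time((x, y, z), s) for s in stations_xyz_km])
            residuals = np.asarray(picks_s) - (predicted + clock_errors)
            log_like = -0.5 * np.sum((residuals / pick_sigma_s) ** 2)      # Gaussian pick errors
            log_prior = -0.5 * np.sum((clock_errors / clock_sigma_s) ** 2) # clocks assumed near zero
            return log_like + log_prior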

  17. Comparison of aftershock sequences between 1975 Haicheng earthquake and 1976 Tangshan earthquake

    NASA Astrophysics Data System (ADS)

    Liu, B.

    2017-12-01

    The 1975 ML 7.3 Haicheng earthquake and the 1976 ML 7.8 Tangshan earthquake occurred in the same tectonic unit. There are significant differences in the spatial-temporal distribution, the number of aftershocks, and the time duration of the aftershock sequences that followed these two main shocks. As is well known, aftershocks can be triggered by the change in regional seismicity derived from the main shock, which is caused by the Coulomb stress perturbation. Based on the rate- and state-dependent friction law, we quantitatively estimated the possible aftershock time duration using a combination of seismicity data, and compared the results from different approaches. The results indicate that the aftershock time duration for the Tangshan main shock is several times that of the Haicheng main shock. This can be explained by the significant relationship between aftershock time duration and earthquake nucleation history, normal stress, and shear stress loading rate on the fault. In fact, the obvious difference in earthquake nucleation history between these two main shocks is the foreshocks: the 1975 Haicheng earthquake had clear and long-lasting foreshocks, while the 1976 Tangshan earthquake did not have clear foreshocks. In that case, abundant foreshocks may indicate a long and active nucleation process that may have changed (weakened) the rocks in the source region, so such events should have shorter aftershock sequences because stress in weak rocks decays faster.
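
    A commonly used expression for aftershock duration in the rate-and-state framework (Dieterich, 1994), consistent with the dependence on normal stress and stressing rate noted above, is:

        t_a = \frac{A\sigma}{\dot{\tau}}

    where A is the constitutive rate-and-state parameter, \sigma the effective normal stress, and \dot{\tau} the background shear stressing rate; a higher stressing rate therefore shortens the aftershock sequence.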

  18. Thoracic Injuries in earthquake-related versus non-earthquake-related trauma patients: differentiation via Multi-detector Computed Tomography.

    PubMed

    Dong, Zhi-Hui; Yang, Zhi-Gang; Chen, Tian-Wu; Chu, Zhi-Gang; Deng, Wen; Shao, Heng

    2011-01-01

    Massive earthquakes are harmful to humankind. This study of a historical cohort aimed to investigate the difference between earthquake-related crush thoracic traumas and thoracic traumas unrelated to earthquakes using multi-detector computed tomography (CT). We retrospectively compared an earthquake-exposed cohort of 215 thoracic trauma crush victims of the Sichuan earthquake to a cohort of 215 non-earthquake-related thoracic trauma patients, focusing on the lesions and coexisting injuries to the thoracic cage and the pulmonary parenchyma and pleura using multi-detector CT. The incidence of rib fracture was elevated in the earthquake-exposed cohort (143 vs. 66 patients in the non-earthquake-exposed cohort, risk ratio (RR) = 2.2; p < 0.001). Among these patients, those with more than 3 fractured ribs (106/143 vs. 41/66 patients, RR = 1.2; p < 0.05) or flail chest (45/143 vs. 11/66 patients, RR = 1.9; p < 0.05) were more frequently seen in the earthquake cohort. Earthquake-related crush injuries more frequently resulted in bilateral rib fractures (66/143 vs. 18/66 patients, RR = 1.7; p < 0.01). Additionally, the incidence of non-rib fracture was higher in the earthquake cohort (85 vs. 60 patients, RR = 1.4; p < 0.01). Pulmonary parenchymal and pleural injuries were more frequently seen in earthquake-related crush injuries (117 vs. 80 patients, RR = 1.5 for parenchymal injuries, and 146 vs. 74 patients, RR = 2.0 for pleural injuries; p < 0.001). Non-rib fractures and pulmonary parenchymal and pleural injuries had a significant positive correlation with rib fractures in these two cohorts. Thoracic crush traumas resulting from the earthquake were life-threatening, with a high incidence of bony thoracic fractures. The ribs were frequently involved in bilateral and severe types of fractures, which were accompanied by non-rib fractures and pulmonary parenchymal and pleural injuries.
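
    For readers unfamiliar with the risk-ratio figures quoted in these cohort comparisons, a minimal calculation reproducing the rib-fracture RR from the counts given above (143/215 exposed vs. 66/215 control); the confidence-interval formula is the standard log-scale approximation, added here for illustration:

        import math

        def risk_ratio(a, n1, b, n2):
            """RR and approximate 95% CI for event counts a/n1 (exposed) vs b/n2 (control)."""
            rr = (a / n1) / (b / n2)
            se_log_rr = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
            lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
            hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
            return rr, (lo, hi)

        rr, ci = risk_ratio(143, 215, 66, 215)
        print(f"RR = {rr:.2f}, 95% CI ~ ({ci[0]:.2f}, {ci[1]:.2f})")  # RR ≈ 2.17, consistent with the 2.2 quoted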

  19. Earthquake hazards: a national threat

    USGS Publications Warehouse

    ,

    2006-01-01

    Earthquakes are one of the most costly natural hazards faced by the Nation, posing a significant risk to 75 million Americans in 39 States. The risks that earthquakes pose to society, including death, injury, and economic loss, can be greatly reduced by (1) better planning, construction, and mitigation practices before earthquakes happen, and (2) providing critical and timely information to improve response after they occur. As part of the multi-agency National Earthquake Hazards Reduction Program, the U.S. Geological Survey (USGS) has the lead Federal responsibility to provide notification of earthquakes in order to enhance public safety and to reduce losses through effective forecasts based on the best possible scientific information.

  20. Discussion of New Approaches to Medium-Short-Term Earthquake Forecast in Practice of The Earthquake Prediction in Yunnan

    NASA Astrophysics Data System (ADS)

    Hong, F.

    2017-12-01

    After reviewing years of practice of earthquake prediction in the Yunnan area, it is widely considered that the fixed-point earthquake precursory anomalies mainly reflect field information. An increase in the amplitude and number of precursory anomalies can help to determine the occurrence time of earthquakes; however, it is difficult to obtain the spatial relevance between earthquakes and precursory anomalies, and thus we can hardly predict the spatial locations of earthquakes using precursory anomalies. Past practice has shown that seismic activity is superior to precursory anomalies in predicting earthquake locations, since increased seismicity was observed before 80% of the M≥6.0 earthquakes in the Yunnan area. Mobile geomagnetic anomalies have also turned out to be helpful in predicting earthquake locations in recent years; for instance, the forecasted occurrence time and area derived from the 1-year-scale geomagnetic anomalies before the M6.5 Ludian earthquake in 2014 were shorter and smaller than those derived from the seismicity enhancement region. Based on this past work, the author believes that the medium-short-term earthquake forecast level, as well as the objective understanding of seismogenic mechanisms, could be substantially improved by densely deploying observation arrays and capturing the dynamic process of physical property changes in the enhancement region of medium to small earthquakes.

  1. Earthquakes, May-June, 1992

    USGS Publications Warehouse

    Person, Waverly J.

    1992-01-01

    The months of May and June were very active in terms of earthquake occurrence. Six major earthquakes (7.0-7.9) occurred during this period. These earthquakes included a magnitude 7.1 in Papua New Guinea on May 15, a magnitude 7.1 followed by a magnitude 7.5 in the Philippine Islands on May 17, a magnitude 7.0 in the Cuba region on May 25, and a magnitude 7.3 in the Santa Cruz Islands of the Pacific on May 27. In the United States, a magnitude 7.6 earthquake struck in southern California on June 28, followed by a magnitude 6.7 quake about three hours later.

  2. Normal Fault Type Earthquakes Off Fukushima Region - Comparison of the 1938 Events and Recent Earthquakes -

    NASA Astrophysics Data System (ADS)

    Murotani, S.; Satake, K.

    2017-12-01

    Off the Fukushima region, Mjma 7.4 (event A) and 6.9 (event B) events occurred on November 6, 1938, following the thrust fault type earthquakes of Mjma 7.5 and 7.3 on the previous day. These earthquakes were estimated to be normal fault earthquakes by Abe (1977, Tectonophysics). An Mjma 7.0 earthquake occurred on July 12, 2014 near event B, and an Mjma 7.4 earthquake occurred on November 22, 2016 near event A. These recent events are the only M 7 class earthquakes to have occurred off Fukushima since 1938. Except for the two 1938 events, normal fault earthquakes did not occur in this region until the many aftershocks of the 2011 Tohoku earthquake. We compared the observed tsunami and seismic waveforms of the 1938, 2014, and 2016 earthquakes to examine the normal fault earthquakes that occurred off the Fukushima region. It is difficult to compare the tsunami waveforms of the 1938, 2014 and 2016 events because there were only a few observations at the same stations. The teleseismic body wave inversion of the 2016 earthquake yielded a focal mechanism with strike 42°, dip 35°, and rake -94°. Other source parameters were as follows: source area 70 km x 40 km, average slip 0.2 m, maximum slip 1.2 m, seismic moment 2.2 x 10^19 Nm, and Mw 6.8. A large slip area is located near the hypocenter, and it is compatible with the tsunami source area estimated from tsunami travel times. The 2016 tsunami source area is smaller than that of the 1938 event, consistent with the difference in Mw: 7.7 for event A as estimated by Abe (1977) and 6.8 for the 2016 event. Although the 2014 epicenter is very close to that of event B, the teleseismic waveforms of the 2014 event are similar to those of event A and the 2016 event. While Abe (1977) assumed that the mechanism of event B was the same as that of event A, the initial motions at some stations are opposite, indicating that the focal mechanisms of events A and B are different and that more detailed examination is needed. The normal fault type earthquake seems to occur following the
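
    As a consistency check on the quoted source parameters, the standard moment magnitude relation gives:

        M_w = \tfrac{2}{3}\left(\log_{10} M_0 - 9.1\right) = \tfrac{2}{3}\left(\log_{10}(2.2\times10^{19}) - 9.1\right) \approx 6.8

    with M_0 in N m, matching the Mw 6.8 reported above.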

  3. What Can Sounds Tell Us About Earthquake Interactions?

    NASA Astrophysics Data System (ADS)

    Aiken, C.; Peng, Z.

    2012-12-01

    It is important not only for seismologists but also for educators to effectively convey information about earthquakes and the influences earthquakes can have on each other. Recent studies using auditory display [e.g., Kilb et al., 2012; Peng et al., 2012] have depicted catastrophic earthquakes and the effects large earthquakes can have on other parts of the world. Auditory display of earthquakes, which combines static images with time-compressed sound of recorded seismic data, is a new approach to disseminating information to a general audience about earthquakes and earthquake interactions. Earthquake interactions are important for understanding the underlying physics of earthquakes and of other seismic phenomena such as tremor, in addition to their source characteristics (e.g., frequency content, amplitudes). Earthquake interactions can include, for example, a large, shallow earthquake followed by increased seismicity around the mainshock rupture (i.e., aftershocks) or even a large earthquake triggering earthquakes or tremors several hundreds to thousands of kilometers away [Hill and Prejean, 2007; Peng and Gomberg, 2010]. We use standard tools like MATLAB, QuickTime Pro, and Python to produce animations that illustrate earthquake interactions. Our efforts are focused on producing animations that depict cross-section (side) views of tremors triggered along the San Andreas Fault by distant earthquakes, as well as map (bird's eye) views of mainshock-aftershock sequences such as the 2011/08/23 Mw5.8 Virginia earthquake sequence. These examples of earthquake interactions include sonifying earthquake and tremor catalogs as musical notes (e.g., piano keys) as well as audifying seismic data using time-compression. Our overall goal is to use auditory display to invigorate a general interest in earthquake seismology that leads to the understanding of how earthquakes occur, how earthquakes influence one another as well as tremors, and what the musical properties of these
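
    The audification step described above boils down to time-compressing a seismic trace so that sub-audible ground-motion frequencies move into the audible band. The sketch below is a minimal illustration of that idea with NumPy and SciPy; the synthetic trace, sampling rate, and speed-up factor are assumptions for demonstration, not the authors' processing chain.

    ```python
    import numpy as np
    from scipy.io import wavfile

    def audify(trace, native_rate_hz, speedup=200, out="audified.wav"):
        """Time-compress a seismic trace and write it as audio.

        Playing the samples back at native_rate_hz * speedup shifts
        sub-audible seismic frequencies into the audible band.
        """
        x = np.asarray(trace, dtype=np.float64)
        x = x / (np.max(np.abs(x)) + 1e-12)          # normalize to [-1, 1]
        playback_rate = int(native_rate_hz * speedup)
        wavfile.write(out, playback_rate, (x * 32767).astype(np.int16))
        return out

    # Illustrative example: a synthetic 100 Hz trace standing in for real data;
    # one hour of "seismogram" plays back in about 18 seconds.
    t = np.arange(0, 3600, 1.0 / 100.0)
    synthetic = np.random.randn(t.size) * np.exp(-t / 600.0)
    audify(synthetic, native_rate_hz=100, speedup=200)
    ```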

  4. Earthquake Testing

    NASA Technical Reports Server (NTRS)

    1979-01-01

    During NASA's Apollo program, it was necessary to subject the mammoth Saturn V launch vehicle to extremely forceful vibrations to assure the moon booster's structural integrity in flight. Marshall Space Flight Center assigned vibration testing to a contractor, the Scientific Services and Systems Group of Wyle Laboratories, Norco, California. Wyle-3S, as the group is known, built a large facility at Huntsville, Alabama, and equipped it with an enormously forceful shock and vibration system to simulate the liftoff stresses the Saturn V would encounter. Saturn V is no longer in service, but Wyle-3S has found spinoff utility for its vibration facility. It is now being used to simulate earthquake effects on various kinds of equipment, principally equipment intended for use in nuclear power generation. Government regulations require that such equipment demonstrate its ability to survive earthquake conditions. In the upper left photo, Wyle-3S is preparing to conduct an earthquake test on a 25-ton diesel generator built by Atlas Polar Company, Ltd., Toronto, Canada, for emergency use in a Canadian nuclear power plant. Being readied for test in the lower left photo is a large circuit breaker to be used by Duke Power Company, Charlotte, North Carolina. Electro-hydraulic and electro-dynamic shakers in and around the pit simulate earthquake forces.

  5. Earthquakes in the United States

    USGS Publications Warehouse

    Stover, C.

    1977-01-01

    To supplement data in the report Preliminary Determination of Epicenters (PDE), the National Earthquake Information Service (NEIS) also publishes a quarterly circular, Earthquakes in the United States. This provides information on the felt area of U.S. earthquakes and their intensity. The main purpose is to describe the larger effects of these earthquakes so that they can be used in seismic risk studies, site evaluations for nuclear power plants, and in answering inquiries by the general public.

  6. A smartphone application for earthquakes that matter!

    NASA Astrophysics Data System (ADS)

    Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert

    2014-05-01

    Smartphone applications have swiftly become one of the most popular tools for rapid reception of earthquake information by the public, some of them having been downloaded more than 1 million times! The advantages are obvious: wherever their location, users can be automatically informed when an earthquake has struck. Just by setting a magnitude threshold and an area of interest, there is no longer any need to browse the internet, as the information reaches you automatically and instantaneously! One question remains: are the provided earthquake notifications always relevant for the public? What are the earthquakes that really matter to laypeople? One clue may be derived from newspaper reports showing that, a while after damaging earthquakes, many eyewitnesses scrap the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of the notifications relate to unfelt earthquakes, thereby only increasing anxiety among the population at each new update. Felt and damaging earthquakes are the ones that matter the most to the public (and authorities). They are the ones of societal importance, even when of small magnitude. A smartphone application developed by EMSC (Euro-Med Seismological Centre) with the financial support of the Fondation MAIF aims at providing suitable notifications for earthquakes by collating different information threads covering tsunamigenic, potentially damaging and felt earthquakes. Tsunamigenic earthquakes are considered here to be those that are the subject of alert or information messages from the PTWC (Pacific Tsunami Warning Centre). Potentially damaging earthquakes are identified through an automated system called EQIA (Earthquake Qualitative Impact Assessment) developed and operated at EMSC. This rapidly assesses earthquake impact by comparing the population exposed to each expected
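
    The filtering logic described above (a magnitude threshold plus an area of interest, overridden for felt, damaging, or tsunamigenic events) can be sketched as follows. The function and field names are hypothetical illustrations, not the EMSC application's actual code or API.

    ```python
    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance in kilometres between two points."""
        r = 6371.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def should_notify(event, user_lat, user_lon, min_mag=4.5, radius_km=300.0):
        """Notify if the event exceeds the magnitude threshold inside the area of
        interest, or if it is flagged as felt/damaging/tsunamigenic regardless of size."""
        if event.get("felt") or event.get("damaging") or event.get("tsunami_alert"):
            return True
        close = haversine_km(event["lat"], event["lon"], user_lat, user_lon) <= radius_km
        return close and event["mag"] >= min_mag

    # A small felt event near the user still triggers a notification.
    quake = {"mag": 4.1, "lat": 38.1, "lon": 23.8, "felt": True}
    print(should_notify(quake, user_lat=37.98, user_lon=23.73))  # True
    ```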

  7. Nowcasting Earthquakes and Tsunamis

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Turcotte, D. L.

    2017-12-01

    The term "nowcasting" refers to the estimation of the current uncertain state of a dynamical system, whereas "forecasting" is a calculation of probabilities of future state(s). Nowcasting is a term that originated in economics and finance, referring to the process of determining the uncertain state of the economy or market indicators such as GDP at the current time by indirect means. We have applied this idea to seismically active regions, where the goal is to determine the current state of a system of faults, and its current level of progress through the earthquake cycle (http://onlinelibrary.wiley.com/doi/10.1002/2016EA000185/full). Advantages of our nowcasting method over forecasting models include: 1) Nowcasting is simply data analysis and does not involve a model having parameters that must be fit to data; 2) We use only earthquake catalog data which generally has known errors and characteristics; and 3) We use area-based analysis rather than fault-based analysis, meaning that the methods work equally well on land and in subduction zones. To use the nowcast method to estimate how far the fault system has progressed through the "cycle" of large recurring earthquakes, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. We select a "small" region in which the nowcast is to be made, and compute the statistics of a much larger region around the small region. The statistics of the large region are then applied to the small region. For an application, we can define a small region around major global cities, for example a "small" circle of radius 150 km and a depth of 100 km, as well as a "large" earthquake magnitude, for example M6.0. The region of influence of such earthquakes is roughly 150 km radius x 100 km depth, which is the reason these values were selected. We can then compute and rank the seismic risk of the world's major cities in terms of their relative seismic risk

  8. Introduction to the special issue on the 2004 Parkfield earthquake and the Parkfield earthquake prediction experiment

    USGS Publications Warehouse

    Harris, R.A.; Arrowsmith, J.R.

    2006-01-01

    The 28 September 2004 M 6.0 Parkfield earthquake, a long-anticipated event on the San Andreas fault, is the world's best recorded earthquake to date, with state-of-the-art data obtained from geologic, geodetic, seismic, magnetic, and electrical field networks. This has allowed the preearthquake and postearthquake states of the San Andreas fault in this region to be analyzed in detail. Analyses of these data provide views into the San Andreas fault that show a complex geologic history, fault geometry, rheology, and response of the nearby region to the earthquake-induced ground movement. Although aspects of San Andreas fault zone behavior in the Parkfield region can be modeled simply over geological time frames, the Parkfield Earthquake Prediction Experiment and the 2004 Parkfield earthquake indicate that predicting the fine details of future earthquakes is still a challenge. Instead of a deterministic approach, forecasting future damaging behavior, such as that caused by strong ground motions, will likely continue to require probabilistic methods. However, the Parkfield Earthquake Prediction Experiment and the 2004 Parkfield earthquake have provided ample data to understand most of what did occur in 2004, culminating in significant scientific advances.

  9. Japanese earthquake predictability experiment with multiple runs before and after the 2011 Tohoku-oki earthquake

    NASA Astrophysics Data System (ADS)

    Hirata, N.; Tsuruoka, H.; Yokoi, S.

    2011-12-01

    The current Japanese national earthquake prediction program emphasizes the importance of modeling as well as monitoring for a sound scientific development of earthquake prediction research. One major focus of the current program is to move toward creating testable earthquake forecast models. For this purpose, in 2009 we joined the Collaboratory for the Study of Earthquake Predictability (CSEP) and installed, through an international collaboration, the CSEP Testing Centre, an infrastructure to encourage researchers to develop testable models for Japan. We started the Japanese earthquake predictability experiment on November 1, 2009. The experiment consists of 12 categories, with 4 testing classes with different time spans (1 day, 3 months, 1 year and 3 years) and 3 testing regions called 'All Japan,' 'Mainland,' and 'Kanto.' A total of 160 models, as of August 2013, were submitted and are currently under the CSEP official suite of tests for evaluating the performance of forecasts. We will present results of prospective forecasting and testing for periods before and after the 2011 Tohoku-oki earthquake. Because seismic activity changed dramatically after the 2011 event, the performance of the models has been strongly affected. In addition, because of problems with the completeness magnitude of the authorized catalogue, most models did not pass the CSEP consistency tests. We will also discuss the retrospective earthquake forecast experiments for aftershocks of the 2011 Tohoku-oki earthquake. Our aim is to describe what has turned out to be the first occasion for setting up a research environment for rigorous earthquake forecasting in Japan.

  10. Japanese earthquake predictability experiment with multiple runs before and after the 2011 Tohoku-oki earthquake

    NASA Astrophysics Data System (ADS)

    Hirata, N.; Tsuruoka, H.; Yokoi, S.

    2013-12-01

    The current Japanese national earthquake prediction program emphasizes the importance of modeling as well as monitoring for a sound scientific development of earthquake prediction research. One major focus of the current program is to move toward creating testable earthquake forecast models. For this purpose, in 2009 we joined the Collaboratory for the Study of Earthquake Predictability (CSEP) and installed, through an international collaboration, the CSEP Testing Centre, an infrastructure to encourage researchers to develop testable models for Japan. We started the Japanese earthquake predictability experiment on November 1, 2009. The experiment consists of 12 categories, with 4 testing classes with different time spans (1 day, 3 months, 1 year and 3 years) and 3 testing regions called 'All Japan,' 'Mainland,' and 'Kanto.' A total of 160 models, as of August 2013, were submitted and are currently under the CSEP official suite of tests for evaluating the performance of forecasts. We will present results of prospective forecasting and testing for periods before and after the 2011 Tohoku-oki earthquake. Because seismic activity changed dramatically after the 2011 event, the performance of the models has been strongly affected. In addition, because of problems with the completeness magnitude of the authorized catalogue, most models did not pass the CSEP consistency tests. We will also discuss the retrospective earthquake forecast experiments for aftershocks of the 2011 Tohoku-oki earthquake. Our aim is to describe what has turned out to be the first occasion for setting up a research environment for rigorous earthquake forecasting in Japan.

  11. Seismicity map tools for earthquake studies

    NASA Astrophysics Data System (ADS)

    Boucouvalas, Anthony; Kaskebes, Athanasios; Tselikas, Nikos

    2014-05-01

    We report on the development of a new online set of tools, for use within Google Maps, for earthquake research. We demonstrate this server-based, online platform (developed with PHP, Javascript, MySQL) and its new tools using a database system with earthquake data. The platform allows us to carry out statistical and deterministic analyses of earthquake data within Google Maps and to plot various seismicity graphs. The toolbox has been extended to draw line segments on the map, multiple horizontal and vertical straight lines, and multiple circles, including geodesic lines. The application is demonstrated using localized seismic data from the geographic region of Greece as well as other global earthquake data. The application also offers regional segmentation (NxN), which allows the study of earthquake clustering and of earthquake cluster shifts between segments in space. The platform offers many filters, such as for plotting selected magnitude ranges or time periods. The plotting facility allows statistically based plots such as cumulative earthquake magnitude plots and earthquake magnitude histograms, calculation of the b-value, etc. What is novel for the platform is the additional set of deterministic tools. Using the newly developed horizontal and vertical line and circle tools we have studied the spatial distribution trends of many earthquakes, and we show here for the first time the link between Fibonacci numbers and the spatiotemporal location of some earthquakes. The new tools are valuable for examining and visualizing trends in earthquake research, as they allow calculation of statistics as well as deterministic precursors. We plan to show many new results based on our newly developed platform.
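
    The b-value calculation mentioned above is commonly done with the Aki (1965) maximum-likelihood estimator; the sketch below shows that estimator on a synthetic Gutenberg-Richter sample and is an assumed, simplified version rather than the platform's own implementation (magnitude-binning corrections are omitted).

    ```python
    import numpy as np

    def b_value_mle(mags, m_c):
        """Aki (1965) maximum-likelihood b-value for magnitudes above completeness m_c:
        b = log10(e) / (mean(M) - m_c), ignoring magnitude-binning corrections."""
        m = np.asarray(mags, dtype=float)
        m = m[m >= m_c]
        return np.log10(np.e) / (m.mean() - m_c)

    # Synthetic Gutenberg-Richter sample with a true b of ~1.0 above m_c = 2.5.
    rng = np.random.default_rng(0)
    mags = 2.5 + rng.exponential(scale=1.0 / np.log(10), size=5000)
    print(round(b_value_mle(mags, m_c=2.5), 2))  # close to 1.0
    ```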

  12. Earthquakes, November-December 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    There were three major earthquakes (7.0-7.9) during the last two months of the year: a magnitude 7.0 on November 19 in Colombia, a magnitude 7.4 in the Kuril Islands on December 22, and a magnitude 7.1 in the South Sandwich Islands on December 27. Earthquake-related deaths were reported in Colombia, Yemen, and Iran. There were no significant earthquakes in the United States during this reporting period.

  13. Earthquakes, September-October 1980

    USGS Publications Warehouse

    Person, W.J.

    1981-01-01

    There were two major (magnitudes 7.0-7.9) earthquakes during this reporting period; a magnitude (M) 7.3 in Algeria where many people were killed or injured and extensive damage occurred, and an M=7.2 in the Loyalty Islands region of the South Pacific. Japan was struck by a damaging earthquake on September 24, killing two people and causing injuries. There were no damaging earthquakes in the United States. 

  14. Retrospective stress-forecasting of earthquakes

    NASA Astrophysics Data System (ADS)

    Gao, Yuan; Crampin, Stuart

    2015-04-01

    Observations of changes in azimuthally varying shear-wave splitting (SWS) above swarms of small earthquakes monitor stress-induced changes to the stress-aligned vertical microcracks pervading the upper crust, lower crust, and uppermost ~400 km of the mantle. (The microcracks are intergranular films of hydrolysed melt in the mantle.) Earthquakes release stress, and an appropriate amount of stress for the relevant magnitude must accumulate before each event. Iceland is on an extension of the Mid-Atlantic Ridge, where, uniquely, two transform zones run onshore. These onshore transform zones provide semi-continuous swarms of small earthquakes, the only place worldwide where SWS can be routinely monitored. Elsewhere, SWS must be monitored above occasional temporally active swarms of small earthquakes, or in infrequent SKS and other teleseismic reflections from the mantle. Observations of changes in SWS time-delays are attributed to stress-induced changes in crack aspect-ratios, allowing stress-accumulation and stress-relaxation to be identified. While monitoring SWS in SW Iceland in 1998, stress-accumulation before an impending earthquake was recognised and emails were exchanged between the University of Edinburgh (EU) and the Iceland Meteorological Office (IMO). On 10th November 1998, EU emailed IMO that a M5 earthquake could occur soon on a seismically-active fault plane where seismicity was still continuing following a M5.1 earthquake six months earlier. Three days later, IMO emailed EU that a M5 earthquake had just occurred on the specified fault-plane. We suggest this is a successful earthquake stress-forecast, where we refer to the procedure as stress-forecasting earthquakes, as opposed to predicting or forecasting, to emphasise the different formalism. Lack of funds has prevented us from monitoring SWS on Iceland seismograms; however, we have identified similar characteristic behaviour of SWS time-delays above swarms of small earthquakes which have enabled us to
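
    Stress-accumulation in this approach is read as a persistent increase in normalized SWS time-delays. The sketch below fits a simple least-squares slope to a synthetic delay series as a stand-in for that trend detection; the data and the bare slope measure are illustrative assumptions, not the authors' analysis.

    ```python
    import numpy as np

    def delay_trend(days, tdelays_ms_per_km):
        """Least-squares slope of normalized shear-wave-splitting time-delays.

        A persistently positive slope is the signature interpreted here as
        stress-accumulation before a larger earthquake; a rapid drop as relaxation.
        """
        slope, _intercept = np.polyfit(days, tdelays_ms_per_km, 1)
        return slope

    # Illustrative series: delays creeping upward over ~100 days (values made up).
    days = np.arange(0, 100, 5)
    delays = 4.0 + 0.02 * days + np.random.default_rng(1).normal(0, 0.3, days.size)
    print(f"slope = {delay_trend(days, delays):+.3f} ms/km per day")
    ```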

  15. Earthquake watch

    USGS Publications Warehouse

    Hill, M.

    1976-01-01

    When the time comes that earthquakes can be predicted accurately, what shall we do with the knowledge? This was the theme of a November 1975 conference on earthquake warning and response held in San Francisco, called by Assistant Secretary of the Interior Jack W. Carlson. Invited were officials of State and local governments from Alaska, California, Hawaii, Idaho, Montana, Nevada, Utah, Washington, and Wyoming and representatives of the news media.

  16. Charles Darwin's earthquake reports

    NASA Astrophysics Data System (ADS)

    Galiev, Shamil

    2010-05-01

    As 2009 was the 200th anniversary of Darwin's birth, it also marked 170 years since the publication of his book Journal of Researches. During the voyage Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake, which demolished hundreds of buildings, killing and injuring many people. Land was waved, lifted, and cracked; volcanoes awoke and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of a great earthquake during and immediately after. These effects have sometimes been repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science where experts remain largely in the dark. Darwin suggested that the effects were a result of ‘…the rending of strata, at a point not very deep below the surface of the earth…' and ‘…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the earth's evolution and its dynamics. These ideas set the tone for the tectonic plate theory to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes: ‘...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with results of the latest publications (see Nature 461, 870-872; 636-639 and 462, 42-43; 87-89). About 200 years ago Darwin gave oneself airs by the

  17. Focal mechanisms of earthquakes in Mongolia

    NASA Astrophysics Data System (ADS)

    Sodnomsambuu, D.; Natalia, R.; Gangaadorj, B.; Munkhuu, U.; Davaasuren, G.; Danzansan, E.; Yan, R.; Valentina, M.; Battsetseg, B.

    2011-12-01

    Focal mechanism data provide information on the relative magnitudes of the principal stresses, so that a tectonic regime can be assigned. Such information is especially useful for the study of intraplate seismically active regions. A study of earthquake focal mechanisms in the territory of Mongolia, a landlocked and intraplate region, was conducted. We present a map of focal mechanisms of earthquakes with M≥4.5 which occurred in Mongolia and neighboring regions. Focal mechanism solutions were constrained by first-motion solutions as well as by waveform modeling, particularly CMT solutions. Four earthquakes with magnitudes of about 8 and larger were recorded in Mongolia in the 20th century: the 1905 M7.9 Tsetserleg and M8.4 Bolnai earthquakes, the 1931 M8.0 Fu Yun earthquake, and the 1957 M8.1 Gobi-Altai earthquake. The map of focal mechanisms of earthquakes in Mongolia, however, reveals all of the seismically active structures: the Gobi Altay, the Mongolian Altay, the active fringe of the Hangay dome, the Hentii range, etc. Earthquakes in most of the Mongolian territory and the neighboring regions of China are characterized by strike-slip and reverse movements. Strike-slip movements are also typical of earthquakes in the Altay Range in Russia. The north of Mongolia and the southern part of the Baikal area form a region where earthquakes with different focal mechanisms have occurred. This region is a zone of transition between the compressive regime associated with the India-Eurasia collision and the extensional structures located in the north of the country, such as the Huvsgul area and the Baikal rift. Earthquakes in the Baikal basin itself are characterized by normal movements. Earthquakes in the Trans-Baikal zone and NW Mongolia are characterized dominantly by strike-slip movements. Based on an analysis of stress-axis orientations, the tectonic stress tensor is presented. The map of focal mechanisms of earthquakes in Mongolia could be a useful tool for researchers studying the geodynamics of Central Asia, particularly of the Mongolian and Baikal regions.

  18. Earthquakes and emergence

    NASA Astrophysics Data System (ADS)

    Earthquakes and emerging infections may not have a direct cause and effect relationship like tax evasion and jail, but new evidence suggests that there may be a link between the two human health hazards. Various media accounts have cited a massive 1993 earthquake in Maharashtra as a potential catalyst of the recent outbreak of plague in India that has claimed more than 50 lives and alarmed the world. The hypothesis is that the earthquake may have uprooted underground rat populations that carry the fleas infected with the bacterium that causes bubonic plague and can lead to the pneumonic form of the disease that is spread through the air.

  19. Earthquake predictions using seismic velocity ratios

    USGS Publications Warehouse

    Sherburne, R. W.

    1979-01-01

    Since the beginning of modern seismology, seismologists have contemplated predicting earthquakes. The usefulness of earthquake predictions in reducing human and economic losses and the value of long-range earthquake prediction for planning are obvious. Not as clear are the long-range economic and social impacts of earthquake prediction on a specific area. The general consensus among scientists and government officials, however, is that the quest for earthquake prediction is a worthwhile goal and should be pursued with a sense of urgency.

  20. Earthquakes and Schools

    ERIC Educational Resources Information Center

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  1. Is there a basis for preferring characteristic earthquakes over a Gutenberg–Richter distribution in probabilistic earthquake forecasting?

    USGS Publications Warehouse

    Parsons, Thomas E.; Geist, Eric L.

    2009-01-01

    The idea that faults rupture in repeated, characteristic earthquakes is central to most probabilistic earthquake forecasts. The concept is elegant in its simplicity, and if the same event has repeated itself multiple times in the past, we might anticipate the next. In practice however, assembling a fault-segmented characteristic earthquake rupture model can grow into a complex task laden with unquantified uncertainty. We weigh the evidence that supports characteristic earthquakes against a potentially simpler model made from extrapolation of a Gutenberg–Richter magnitude-frequency law to individual fault zones. We find that the Gutenberg–Richter model satisfies key data constraints used for earthquake forecasting equally well as a characteristic model. Therefore, judicious use of instrumental and historical earthquake catalogs enables large-earthquake-rate calculations with quantifiable uncertainty that should get at least equal weighting in probabilistic forecasting.
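
    The Gutenberg-Richter alternative amounts to extrapolating log10 N = a - b*M, fit to the small-earthquake rate of an individual fault zone, up to large magnitudes. The sketch below shows that extrapolation and the implied mean recurrence times; the a- and b-values are invented for illustration and are not from the paper.

    ```python
    def gr_rate(a, b, m):
        """Annual rate of earthquakes with magnitude >= m from log10 N = a - b*m."""
        return 10.0 ** (a - b * m)

    # Hypothetical fault-zone catalog parameters (not from the paper):
    a, b = 3.2, 1.0          # ~16 events/yr with M >= 2.0 in the zone
    for m in (6.0, 6.5, 7.0):
        rate = gr_rate(a, b, m)
        print(f"M>={m}: {rate:.4f}/yr, mean recurrence ~{1.0 / rate:.0f} yr")
    ```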

  2. Smoking prevalence increases following Canterbury earthquakes.

    PubMed

    Erskine, Nick; Daley, Vivien; Stevenson, Sue; Rhodes, Bronwen; Beckert, Lutz

    2013-01-01

    A magnitude 7.1 earthquake hit Canterbury in September 2010. This earthquake and associated aftershocks took the lives of 185 people and drastically changed residents' living, working, and social conditions. The aim was to explore the impact of the earthquakes on smoking status and levels of tobacco consumption in the residents of Christchurch. Semistructured interviews were carried out in two city malls and the central bus exchange 15 months after the first earthquake. A total of 1001 people were interviewed. In August 2010, prior to any earthquake, 409 (41%) participants had never smoked, 273 (27%) were currently smoking, and 316 (32%) were ex-smokers. Since the September 2010 earthquake, 76 (24%) of the 316 ex-smokers had smoked at least one cigarette and 29 (38.2%) had smoked more than 100 cigarettes. Of the 273 participants who were current smokers in August 2010, 93 (34.1%) had increased consumption following the earthquake, 94 (34.4%) had not changed, and 86 (31.5%) had decreased their consumption. 53 (57%) of the 93 people whose consumption increased reported the earthquake and subsequent lifestyle changes as a reason for increasing their smoking. 24% of ex-smokers resumed smoking following the earthquake, resulting in increased smoking prevalence. Tobacco consumption levels increased in around one-third of current smokers.

  3. Defining "Acceptable Risk" for Earthquakes Worldwide

    NASA Astrophysics Data System (ADS)

    Tucker, B.

    2001-05-01

    The greatest and most rapidly growing earthquake risk for mortality is in developing countries. Further, earthquake risk management actions of the last 50 years have reduced the average lethality of earthquakes in earthquake-threatened industrialized countries. (This is separate from the trend of increasing fiscal cost of earthquakes there.) Despite these clear trends, every new earthquake in developing countries is described in the media as a "wake-up" call announcing the risk these countries face. GeoHazards International (GHI) works at both the community and the policy levels to try to reduce earthquake risk. GHI reduces death and injury by helping vulnerable communities recognize their risk and the methods to manage it: raising awareness of the risk, building local institutions to manage that risk, and strengthening schools to protect and train the community's future generations. At the policy level, GHI, in collaboration with research partners, is examining whether "acceptance" of these large risks by people in these countries and by international aid and development organizations explains the lack of activity in reducing these risks. The goal of this pilot project - the Global Earthquake Safety Initiative (GESI) - is to develop and evaluate a means of measuring the risk and the effectiveness of risk mitigation actions in the world's largest, most vulnerable cities: in short, to develop an earthquake risk index. One application of this index is to compare the risk and the risk mitigation effort of "comparable" cities. By this means, Lima, for example, can compare the risk of its citizens dying due to earthquakes with the risk for citizens in Santiago and Guayaquil. The authorities of Delhi and Islamabad can compare the relative earthquake risk faced by their school children. This index can be used to measure the effectiveness of alternative mitigation projects, to set goals for mitigation projects, and to plot progress in meeting those goals. The preliminary

  4. Cyclic migration of weak earthquakes between Lunigiana earthquake of October 10, 1995 and Reggio Emilia earthquake of October 15, 1996 (Northern Italy)

    NASA Astrophysics Data System (ADS)

    di Giovambattista, R.; Tyupkin, Yu

    The cyclic migration of weak earthquakes (M ≥ 2.2) which occurred during the year prior to the October 15, 1996 (M = 4.9) Reggio Emilia earthquake is discussed in this paper. The onset of this migration was associated with the occurrence of the October 10, 1995 (M = 4.8) Lunigiana earthquake about 90 km southwest from the epicenter of the Reggio Emilia earthquake. At least three series of earthquakes migrating from the epicentral area of the Lunigiana earthquake in the northeast direction were observed. The migration of earthquakes of the first series terminated at a distance of about 30 km from the epicenter of the Reggio Emilia earthquake. The earthquake migration of the other two series halted at about 10 km from the Reggio Emilia epicenter. The average rate of earthquake migration was about 200-300 km/year, while the time of recurrence of the observed cycles varied from 68 to 178 days. Weak earthquakes migrated along the transversal fault zones and sometimes jumped from one fault to another. A correlation between the migrating earthquakes and tidal variations is analysed. We discuss the hypothesis that the analyzed area is in a state of stress approaching the limit of the long-term durability of crustal rocks and that the observed cyclic migration is a result of a combination of a more or less regular evolution of tectonic and tidal variations.

  5. The 2012 Mw5.6 earthquake in Sofia seismogenic zone - is it a slow earthquake

    NASA Astrophysics Data System (ADS)

    Raykova, Plamena; Solakov, Dimcho; Slavcheva, Krasimira; Simeonova, Stela; Aleksandrova, Irena

    2017-04-01

    Recently our understanding of tectonic faulting has been shaken by the discoveries of seismic tremor, low frequency earthquakes, slow slip events, and other modes of fault slip. These phenomena represent modes of failure that were thought to be non-existent and theoretically impossible only a few years ago. Slow earthquakes are seismic phenomena in which the rupture of geological faults in the earth's crust occurs gradually without creating strong tremors. Despite the growing number of observations of slow earthquakes, their origin remains unresolved. Studies show that the duration of slow earthquakes ranges from a few seconds to a few hundred seconds. The regular earthquakes with which most people are familiar release a burst of built-up stress in seconds, whereas slow earthquakes release energy in ways that do little damage. This study focuses on the characteristics of the Mw5.6 earthquake that occurred in the Sofia seismic zone on May 22nd, 2012. The Sofia area is the most populated, industrial and cultural region of Bulgaria, and it faces considerable earthquake risk. The Sofia seismic zone is located in south-western Bulgaria, an area with pronounced tectonic activity and proven crustal movement. In the 19th century the city of Sofia (situated in the centre of the Sofia seismic zone) experienced two strong earthquakes with epicentral intensity of 10 MSK. During the 20th century the strongest event to occur in the vicinity of the city of Sofia was the 1917 earthquake with MS=5.3 (I0=7-8 MSK64). The 2012 quake occurred in an area characterized by a long quiescence (of 95 years) for moderate events. Moreover, a reduced number of small earthquakes have also been registered in the recent past. The Mw5.6 earthquake was widely felt on the territory of Bulgaria and in neighbouring countries. No casualties or severe injuries have been reported. Mostly moderate damage was observed in the cities of Pernik and Sofia and their surroundings. These observations could be assumed indicative for a

  6. Thoracic Injuries in earthquake-related versus non-earthquake-related trauma patients: differentiation via Multi-detector Computed Tomography

    PubMed Central

    Dong, Zhi-hui; Yang, Zhi-gang; Chen, Tian-wu; Chu, Zhi-gang; Deng, Wen; Shao, Heng

    2011-01-01

    PURPOSE: Massive earthquakes are harmful to humankind. This study of a historical cohort aimed to investigate the differences between earthquake-related crush thoracic traumas and thoracic traumas unrelated to earthquakes using multi-detector computed tomography (CT). METHODS: We retrospectively compared an earthquake-exposed cohort of 215 thoracic trauma crush victims of the Sichuan earthquake to a cohort of 215 non-earthquake-related thoracic trauma patients, focusing on the lesions and coexisting injuries to the thoracic cage and the pulmonary parenchyma and pleura using multi-detector CT. RESULTS: The incidence of rib fracture was elevated in the earthquake-exposed cohort (143 vs. 66 patients in the non-earthquake-exposed cohort, Risk Ratio (RR) = 2.2; p<0.001). Among these patients, those with more than 3 fractured ribs (106/143 vs. 41/66 patients, RR = 1.2; p<0.05) or flail chest (45/143 vs. 11/66 patients, RR = 1.9; p<0.05) were more frequently seen in the earthquake cohort. Earthquake-related crush injuries more frequently resulted in bilateral rib fractures (66/143 vs. 18/66 patients, RR = 1.7; p<0.01). Additionally, the incidence of non-rib fracture was higher in the earthquake cohort (85 vs. 60 patients, RR = 1.4; p<0.01). Pulmonary parenchymal and pleural injuries were more frequently seen in earthquake-related crush injuries (117 vs. 80 patients, RR = 1.5 for parenchymal and 146 vs. 74 patients, RR = 2.0 for pleural injuries; p<0.001). Non-rib fractures, pulmonary parenchymal and pleural injuries had significant positive correlations with rib fractures in these two cohorts. CONCLUSIONS: Thoracic crush traumas resulting from the earthquake were life threatening with a high incidence of bony thoracic fractures. The ribs were frequently involved in bilateral and severe types of fractures, which were accompanied by non-rib fractures, pulmonary parenchymal and pleural injuries. PMID:21789386

  7. Issues on the Japanese Earthquake Hazard Evaluation

    NASA Astrophysics Data System (ADS)

    Hashimoto, M.; Fukushima, Y.; Sagiya, T.

    2013-12-01

    The 2011 Great East Japan Earthquake forced Japan to change its policy of countermeasures against earthquake disasters, including earthquake hazard evaluation. Before March 11, Japanese earthquake hazard evaluation was based on the history of repeatedly occurring earthquakes and the characteristic earthquake model. The source region of an earthquake was identified and its occurrence history was revealed. Then the conditional probability was estimated using a renewal model. However, the Japanese authorities changed the policy after the megathrust earthquake in 2011 such that the largest earthquake in a specific seismic zone should be assumed on the basis of available scientific knowledge. According to this policy, three important reports were issued during these two years. First, the Central Disaster Management Council issued a new estimate of the damage from a hypothetical Mw9 earthquake along the Nankai trough during 2011 and 2012. The model predicts a 34 m high tsunami on the southern Shikoku coast and intensity 6 or higher on the JMA scale in most areas of Southwest Japan as the maximum. Next, the Earthquake Research Council revised the long-term earthquake hazard evaluation of earthquakes along the Nankai trough in May 2013, which discarded the characteristic earthquake model and put much emphasis on the diversity of earthquakes. The so-called 'Tokai' earthquake was negated in this evaluation. Finally, another report by the CDMC concluded that, with current knowledge, it is hard to predict the occurrence of large earthquakes along the Nankai trough using present techniques, given the diversity of earthquake phenomena. These reports created sensations throughout the country, and local governments are struggling to prepare countermeasures. These reports commented on the large uncertainty in their evaluations near their ends, but are these messages transmitted properly to the public? Earthquake scientists, including the authors, are involved in
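
    The renewal-model step mentioned above combines a mean recurrence interval, an aperiodicity, and the elapsed time into a conditional probability. The sketch below does this with a Brownian Passage Time (inverse-Gaussian) distribution, a common choice for such evaluations; all numbers are illustrative and do not reproduce any official figure.

    ```python
    from scipy.stats import invgauss

    def bpt_conditional_prob(mean_interval, aperiodicity, elapsed, window):
        """P(event in the next `window` years | no event for `elapsed` years),
        using a Brownian Passage Time (inverse-Gaussian) renewal model.

        scipy's invgauss(mu=alpha**2, scale=T/alpha**2) has mean T and
        coefficient of variation alpha, which matches the BPT parametrization.
        """
        alpha2 = aperiodicity ** 2
        dist = invgauss(mu=alpha2, scale=mean_interval / alpha2)
        survive = dist.sf(elapsed)
        return (dist.cdf(elapsed + window) - dist.cdf(elapsed)) / survive

    # Illustrative numbers only (not an official evaluation): mean recurrence 110 yr,
    # aperiodicity 0.24, 70 yr elapsed; probability of rupture in the next 30 yr.
    print(round(bpt_conditional_prob(110.0, 0.24, 70.0, 30.0), 2))
    ```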

  8. The physics of an earthquake

    NASA Astrophysics Data System (ADS)

    McCloskey, John

    2008-03-01

    The Sumatra-Andaman earthquake of 26 December 2004 (Boxing Day 2004) and its tsunami will endure in our memories as one of the worst natural disasters of our time. For geophysicists, the scale of the devastation and the likelihood of another equally destructive earthquake set out a series of challenges of how we might use science not only to understand the earthquake and its aftermath but also to help in planning for future earthquakes in the region. In this article a brief account of these efforts is presented. Earthquake prediction is probably impossible, but earth scientists are now able to identify particularly dangerous places for future events by developing an understanding of the physics of stress interaction. Having identified such a dangerous area, a series of numerical Monte Carlo simulations is described which allow us to get an idea of what the most likely consequences of a future earthquake are by modelling the tsunami generated by lots of possible, individually unpredictable, future events. As this article was being written, another earthquake occurred in the region, which had many expected characteristics but was enigmatic in other ways. This has spawned a series of further theories which will contribute to our understanding of this extremely complex problem.

  9. Organizational changes at Earthquakes & Volcanoes

    USGS Publications Warehouse

    Gordon, David W.

    1992-01-01

    Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).

  10. Sense of Community and Depressive Symptoms among Older Earthquake Survivors Following the 2008 Earthquake in Chengdu China

    ERIC Educational Resources Information Center

    Li, Yawen; Sun, Fei; He, Xusong; Chan, Kin Sun

    2011-01-01

    This study examined the impact of an earthquake as well as the role of sense of community as a protective factor against depressive symptoms among older Chinese adults who survived an 8.0 magnitude earthquake in 2008. A household survey of a random sample was conducted 3 months after the earthquake and 298 older earthquake survivors participated…

  11. The Electronic Encyclopedia of Earthquakes

    NASA Astrophysics Data System (ADS)

    Benthien, M.; Marquis, J.; Jordan, T.

    2003-12-01

    The Electronic Encyclopedia of Earthquakes is a collaborative project of the Southern California Earthquake Center (SCEC), the Consortia of Universities for Research in Earthquake Engineering (CUREE) and the Incorporated Research Institutions for Seismology (IRIS). This digital library organizes earthquake information online as a partner with the NSF-funded National Science, Technology, Engineering and Mathematics (STEM) Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth Science, engineering, physics and mathematics. Although conceived primarily as an educational resource, the Encyclopedia is also a valuable portal to anyone seeking up-to-date earthquake information and authoritative technical sources. "E3" is a unique collaboration among earthquake scientists and engineers to articulate and document a common knowledge base with a shared terminology and conceptual framework. It is a platform for cross-training scientists and engineers in these complementary fields and will provide a basis for sustained communication and resource-building between major education and outreach activities. For example, the E3 collaborating organizations have leadership roles in the two largest earthquake engineering and earth science projects ever sponsored by NSF: the George E. Brown Network for Earthquake Engineering Simulation (CUREE) and the EarthScope Project (IRIS and SCEC). The E3 vocabulary and definitions are also being connected to a formal ontology under development by the SCEC/ITR project for knowledge management within the SCEC Collaboratory. The E3 development system is now fully operational, 165 entries are in the pipeline, and the development teams are capable of producing 20 new, fully reviewed encyclopedia entries each month. Over the next two years teams will

  12. Energy Partition and Variability of Earthquakes

    NASA Astrophysics Data System (ADS)

    Kanamori, H.

    2003-12-01

    During an earthquake the potential energy (strain energy + gravitational energy + rotational energy) is released, and the released potential energy (ΔW) is partitioned into radiated energy (E_R), fracture energy (E_G), and thermal energy (E_H). How ΔW is partitioned into these energies controls the behavior of an earthquake. The merit of the slip-weakening concept is that only E_R and E_G control the dynamics, and E_H can be treated separately to discuss the thermal characteristics of an earthquake. In general, if E_G/E_R is small, the event is "brittle"; if E_G/E_R is large, the event is "quasi-static" or, in more common terms, a "slow earthquake" or "creep". If E_H is very large, the event may well be called a thermal runaway rather than an earthquake. The difference in energy partition has important implications for the rupture initiation, evolution and excitation of long-period ground motions from very large earthquakes. We review the current state of knowledge on this problem in light of seismological observations and the basic physics of fracture. With seismological methods, we can measure only E_R and the lower bound of ΔW, ΔW_0, and estimation of the other energies involves many assumptions. E_R: Although E_R can be directly measured from the radiated waves, its determination is difficult because a large fraction of the energy radiated at the source is attenuated during propagation. With the commonly used teleseismic and regional methods, only for events with M_W > 7 and M_W > 4, respectively, can we directly measure more than 10% of the total radiated energy. The rest must be estimated after correction for attenuation. Thus, large uncertainties are involved, especially for small earthquakes. ΔW_0: To estimate ΔW_0, estimation of the source dimension is required. Again, only for large earthquakes can the source dimension be estimated reliably. With the source dimension, the static stress drop, Δσ_S, and ΔW_0 can be estimated. E_G: Seismologically, E_G is the energy
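
    A minimal sketch of the energy bookkeeping described above, under the slip-weakening view in which E_H is set aside: the radiation efficiency eta_R = E_R/(E_R + E_G) can be written as 2*mu*E_R/(Δσ*M_0), from which E_G/E_R follows. The input values below are illustrative, not taken from a specific event.

    ```python
    def radiation_efficiency(e_r, m0, stress_drop, rigidity=3.0e10):
        """eta_R = E_R / (E_R + E_G) = 2*mu*E_R / (delta_sigma * M0),
        the standard slip-weakening energy budget with E_H set aside."""
        return 2.0 * rigidity * e_r / (stress_drop * m0)

    # Illustrative values (not from a specific event): an Mw ~7 crustal earthquake.
    e_r, m0, dsigma = 5.0e14, 4.0e19, 3.0e6      # J, N*m, Pa
    eta = radiation_efficiency(e_r, m0, dsigma)
    print(f"eta_R = {eta:.2f}, E_G/E_R = {1.0 / eta - 1.0:.2f}")
    ```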

  13. Deviant Earthquakes: Data-driven Constraints on the Variability in Earthquake Source Properties and Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Trugman, Daniel Taylor

    The complexity of the earthquake rupture process makes earthquakes inherently unpredictable. Seismic hazard forecasts often presume that the rate of earthquake occurrence can be adequately modeled as a space-time homogeneous or stationary Poisson process and that the dynamical source properties of small and large earthquakes obey self-similar scaling relations. While these simplified models provide useful approximations and encapsulate the first-order statistical features of the historical seismic record, they are inconsistent with the complexity underlying earthquake occurrence and can lead to misleading assessments of seismic hazard when applied in practice. The six principal chapters of this thesis explore the extent to which the behavior of real earthquakes deviates from these simplified models, and the implications that the observed deviations have for our understanding of earthquake rupture processes and seismic hazard. Chapter 1 provides a brief thematic overview and introduction to the scope of this thesis. Chapter 2 examines the complexity of the 2010 M7.2 El Mayor-Cucapah earthquake, focusing on the relation between its unexpected and unprecedented occurrence and anthropogenic stresses from the nearby Cerro Prieto Geothermal Field. Chapter 3 compares long-term changes in seismicity within California's three largest geothermal fields in an effort to characterize the relative influence of natural and anthropogenic stress transients on local seismic hazard. Chapter 4 describes a hybrid, hierarchical clustering algorithm that can be used to relocate earthquakes using waveform cross-correlation, and applies the new algorithm to study the spatiotemporal evolution of two recent seismic swarms in western Nevada. Chapter 5 describes a new spectral decomposition technique that can be used to analyze the dynamic source properties of large datasets of earthquakes, and applies this approach to revisit the question of self-similar scaling of

  14. Predecessors of the giant 1960 Chile earthquake

    USGS Publications Warehouse

    Cisternas, M.; Atwater, B.F.; Torrejon, F.; Sawai, Y.; Machuca, G.; Lagos, M.; Eipert, A.; Youlton, C.; Salgado, I.; Kamataki, T.; Shishikura, M.; Rajendran, C.P.; Malik, J.K.; Rizal, Y.; Husni, M.

    2005-01-01

    It is commonly thought that the longer the time since last earthquake, the larger the next earthquake's slip will be. But this logical predictor of earthquake size, unsuccessful for large earthquakes on a strike-slip fault, fails also with the giant 1960 Chile earthquake of magnitude 9.5 (ref. 3). Although the time since the preceding earthquake spanned 123 years (refs 4, 5), the estimated slip in 1960, which occurred on a fault between the Nazca and South American tectonic plates, equalled 250-350 years' worth of the plate motion. Thus the average interval between such giant earthquakes on this fault should span several centuries. Here we present evidence that such long intervals were indeed typical of the last two millennia. We use buried soils and sand layers as records of tectonic subsidence and tsunami inundation at an estuary midway along the 1960 rupture. In these records, the 1960 earthquake ended a recurrence interval that had begun almost four centuries before, with an earthquake documented by Spanish conquistadors in 1575. Two later earthquakes, in 1737 and 1837, produced little if any subsidence or tsunami at the estuary and they therefore probably left the fault partly loaded with accumulated plate motion that the 1960 earthquake then expended. © 2005 Nature Publishing Group.

  15. Predecessors of the giant 1960 Chile earthquake.

    PubMed

    Cisternas, Marco; Atwater, Brian F; Torrejón, Fernando; Sawai, Yuki; Machuca, Gonzalo; Lagos, Marcelo; Eipert, Annaliese; Youlton, Cristián; Salgado, Ignacio; Kamataki, Takanobu; Shishikura, Masanobu; Rajendran, C P; Malik, Javed K; Rizal, Yan; Husni, Muhammad

    2005-09-15

    It is commonly thought that the longer the time since last earthquake, the larger the next earthquake's slip will be. But this logical predictor of earthquake size, unsuccessful for large earthquakes on a strike-slip fault, fails also with the giant 1960 Chile earthquake of magnitude 9.5 (ref. 3). Although the time since the preceding earthquake spanned 123 years (refs 4, 5), the estimated slip in 1960, which occurred on a fault between the Nazca and South American tectonic plates, equalled 250-350 years' worth of the plate motion. Thus the average interval between such giant earthquakes on this fault should span several centuries. Here we present evidence that such long intervals were indeed typical of the last two millennia. We use buried soils and sand layers as records of tectonic subsidence and tsunami inundation at an estuary midway along the 1960 rupture. In these records, the 1960 earthquake ended a recurrence interval that had begun almost four centuries before, with an earthquake documented by Spanish conquistadors in 1575. Two later earthquakes, in 1737 and 1837, produced little if any subsidence or tsunami at the estuary and they therefore probably left the fault partly loaded with accumulated plate motion that the 1960 earthquake then expended.

  16. Next-Day Earthquake Forecasts for California

    NASA Astrophysics Data System (ADS)

    Werner, M. J.; Jackson, D. D.; Kagan, Y. Y.

    2008-12-01

    We implemented a daily forecast of m > 4 earthquakes for California in the format suitable for testing in community-based earthquake predictability experiments: Regional Earthquake Likelihood Models (RELM) and the Collaboratory for the Study of Earthquake Predictability (CSEP). The forecast is based on near-real time earthquake reports from the ANSS catalog above magnitude 2 and will be available online. The model used to generate the forecasts is based on the Epidemic-Type Earthquake Sequence (ETES) model, a stochastic model of clustered and triggered seismicity. Our particular implementation is based on the earlier work of Helmstetter et al. (2006, 2007), but we extended the forecast to all of California, use more data to calibrate the model and its parameters, and made some modifications. Our forecasts will compete against the Short-Term Earthquake Probabilities (STEP) forecasts of Gerstenberger et al. (2005) and other models in the next-day testing class of the CSEP experiment in California. We illustrate our forecasts with examples and discuss preliminary results.
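
    The epidemic-type clustering idea behind such forecasts can be sketched as a background rate plus Omori-Utsu contributions from every past event, scaled by 10^(alpha*(m - m0)). The parameters and mini-catalog below are invented for illustration and are not the calibrated model described above.

    ```python
    import numpy as np

    def etas_rate(t, cat_times, cat_mags,
                  mu=0.2, k=0.02, alpha=1.0, c=0.01, p=1.1, m0=2.0):
        """Epidemic-type (ETAS-style) seismicity rate at time t (events/day):
        background mu plus Omori-Utsu contributions from all prior events."""
        times = np.asarray(cat_times)
        mags = np.asarray(cat_mags)
        past = times < t
        dt = t - times[past]
        return mu + np.sum(k * 10.0 ** (alpha * (mags[past] - m0)) * (dt + c) ** (-p))

    # Illustrative mini-catalog: a M5.5 at day 0 and a M4.0 aftershock at day 1.
    print(round(etas_rate(2.0, [0.0, 1.0], [5.5, 4.0]), 2))
    ```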

  17. The mechanism of earthquake

    NASA Astrophysics Data System (ADS)

    Lu, Kunquan; Cao, Zexian; Hou, Meiying; Jiang, Zehui; Shen, Rong; Wang, Qiang; Sun, Gang; Liu, Jixing

    2018-03-01

    The physical mechanism of earthquakes remains a challenging issue to be clarified. Seismologists used to attribute shallow earthquakes to the elastic rebound of crustal rocks. The seismic energy calculated following the elastic rebound theory and with the data of experimental results on rocks, however, shows a large discrepancy with measurement, a fact that has been dubbed “the heat flow paradox”. For the intermediate-focus and deep-focus earthquakes, both occurring in the region of the mantle, there is no reasonable explanation either. This paper will discuss the physical mechanism of earthquakes from a new perspective, starting from the fact that both the crust and the mantle are discrete collective systems of matter with slow dynamics, as well as from the basic principles of physics, especially some new concepts of condensed matter physics that have emerged in recent years. (1) Stress distribution in the earth’s crust: Without taking the tectonic force into account, according to the rheological principle that “everything flows”, the normal stress and transverse stress must be balanced due to the effect of gravitational pressure over a long period of time, thus no differential stress is to be expected in the original crustal rocks. The tectonic force is successively transferred and accumulated via stick-slip motions of rock blocks to squeeze the fault gouge and is then exerted upon other rock blocks. The superposition of such additional lateral tectonic force and the original stress gives rise to the real-time stress in crustal rocks. The mechanical characteristics of fault gouge are different from those of rocks, as it consists of granular matter. The elastic moduli of the fault gouges are much smaller than those of rocks, and they become larger with increasing pressure. This peculiarity of the fault gouge leads to a tectonic force that increases with depth in a nonlinear fashion. The distribution and variation of the tectonic stress in the crust are specified. (2) The

  18. A comparison study of 2006 Java earthquake and other Tsunami earthquakes

    NASA Astrophysics Data System (ADS)

    Ji, C.; Shao, G.

    2006-12-01

    We revise the slip process of the July 17, 2006 Java earthquake by jointly inverting teleseismic body waves, long-period surface waves, and the broadband records at Christmas Island (XMIS), which is 220 km away from the hypocenter and so far the closest observation of a tsunami earthquake. Compared with previous studies, our approach considers the amplitude variations of surface waves with source depth as well as the contribution of the ScS phase, which usually has an amplitude comparable with that of the direct S phase for such low-angle thrust earthquakes. The fault dip angles are also refined using the Love waves observed along the fault strike direction. Our results indicate that the 2006 event initiated at a depth of around 12 km and ruptured unilaterally toward the southeast for 150 s at a speed of 1.0 km/s. The revised fault dip is only about 6 degrees, smaller than the Harvard CMT value (10.5 degrees) but consistent with that of the 1994 Java earthquake. The smaller fault dip results in a larger moment magnitude (Mw=7.9) for a PREM earth, though it is dependent on the velocity structure used. After verification with 3D SEM forward simulation, we compare the inverted result with the revised slip models of the 1994 Java and 1992 Nicaragua earthquakes derived using the same wavelet-based finite fault inversion methodology.

  19. Turkish Children's Ideas about Earthquakes

    ERIC Educational Resources Information Center

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, a natural disaster, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced to that extent. In particular, good training regarding earthquakes, received in primary schools, is considered…

  20. Differential energy radiation from two earthquakes in Japan with identical Mw: The Kyushu 1996 and Tottori 2000 earthquakes

    USGS Publications Warehouse

    Choy, G.L.; Boatwright, J.

    2009-01-01

    We examine two closely located earthquakes in Japan that had identical moment magnitudes Mw but significantly different energy magnitudes Me. We use teleseismic data from the Global Seismograph Network and strong-motion data from the National Research Institute for Earth Science and Disaster Prevention's K-Net to analyze the 19 October 1996 Kyushu earthquake (Mw 6.7, Me 6.6) and the 6 October 2000 Tottori earthquake (Mw 6.7, Me 7.4). To obtain regional estimates of radiated energy E_S we apply a spectral technique to regional (<200 km) waveforms that are dominated by S and Lg waves. For the thrust-fault Kyushu earthquake, we estimate an average regional attenuation Q(f) = 230 f^0.65. For the strike-slip Tottori earthquake, the average regional attenuation is Q(f) = 180 f^0.6. These attenuation functions are similar to those derived from studies of both California and Japan earthquakes. The regional estimate of E_S for the Kyushu earthquake, 3.8 × 10^14 J, is significantly smaller than that for the Tottori earthquake, E_S = 1.3 × 10^15 J. These estimates correspond well with the teleseismic estimates of 3.9 × 10^14 J and 1.8 × 10^15 J, respectively. The apparent stress (τ_a = μE_S/M_0, with μ equal to rigidity) for the Kyushu earthquake is 4 times smaller than the apparent stress for the Tottori earthquake. In terms of the fault maturity model, the significantly greater release of energy by the strike-slip Tottori earthquake can be related to strong deformation in an immature intraplate setting. The relatively lower energy release of the thrust-fault Kyushu earthquake can be related to rupture on mature faults in a subduction environment. The consistency between teleseismic and regional estimates of E_S is particularly significant, as teleseismic data for computing E_S are routinely available for all large earthquakes whereas often there are no near-field data.
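
    The apparent-stress comparison can be reproduced approximately from the quantities quoted above: τ_a = μ·E_S/M_0, with M_0 recovered from Mw 6.7 via the standard Hanks-Kanamori relation and a nominal crustal rigidity of 30 GPa. The sketch below is an illustrative back-of-the-envelope check, not the authors' computation; the two values come out roughly a factor of 3-4 apart, in line with the contrast quoted above.

    ```python
    def seismic_moment(mw):
        """M0 in N*m from moment magnitude (Hanks & Kanamori, 1979)."""
        return 10.0 ** (1.5 * mw + 9.1)

    def apparent_stress(e_s, m0, rigidity=3.0e10):
        """tau_a = mu * E_S / M0, in Pa."""
        return rigidity * e_s / m0

    m0 = seismic_moment(6.7)                     # ~1.4e19 N*m for both events
    for name, e_s in (("Kyushu 1996", 3.8e14), ("Tottori 2000", 1.3e15)):
        print(f"{name}: tau_a ~ {apparent_stress(e_s, m0) / 1e6:.1f} MPa")
    ```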

  1. Global earthquake fatalities and population

    USGS Publications Warehouse

    Holzer, Thomas L.; Savage, James C.

    2013-01-01

    Modern global earthquake fatalities can be separated into two components: (1) fatalities from an approximately constant annual background rate that is independent of world population growth and (2) fatalities caused by earthquakes with large human death tolls, the frequency of which is dependent on world population. Earthquakes with death tolls greater than 100,000 (and 50,000) have increased with world population and obey a nonstationary Poisson distribution with rate proportional to population. We predict that the number of earthquakes with death tolls greater than 100,000 (50,000) will increase in the 21st century to 8.7±3.3 (20.5±4.3) from 4 (7) observed in the 20th century if world population reaches 10.1 billion in 2100. Combining fatalities caused by the background rate with fatalities caused by catastrophic earthquakes (>100,000 fatalities) indicates global fatalities in the 21st century will be 2.57±0.64 million if the average post-1900 death toll for catastrophic earthquakes (193,000) is assumed.
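
    The forecast logic described here, a Poisson process whose rate is proportional to world population, can be sketched in a few lines. The population averages below are rough, illustrative assumptions, so the result only roughly approximates the paper's 8.7 ± 3.3 figure.

```python
import math

# Nonstationary Poisson sketch: rate of catastrophic earthquakes (>100,000 deaths)
# taken proportional to world population, calibrated to the 20th-century count
# quoted in the abstract (4 events). Population figures are rough, illustrative
# assumptions, not values from the paper.

events_20th = 4                 # observed catastrophic events, 1900-1999 (from abstract)
mean_pop_20th = 3.5e9           # assumed mean 20th-century population (illustrative)
mean_pop_21st = 8.7e9           # assumed mean 21st-century population if it reaches 10.1e9

rate_per_person_year = events_20th / (mean_pop_20th * 100.0)
expected_21st = rate_per_person_year * mean_pop_21st * 100.0
sigma = math.sqrt(expected_21st)     # Poisson standard deviation

print(f"expected catastrophic events, 21st century: {expected_21st:.1f} +/- {sigma:.1f}")
```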

  2. Large earthquakes and creeping faults

    USGS Publications Warehouse

    Harris, Ruth A.

    2017-01-01

    Faults are ubiquitous throughout the Earth's crust. The majority are silent for decades to centuries, until they suddenly rupture and produce earthquakes. With a focus on shallow continental active-tectonic regions, this paper reviews a subset of faults that have a different behavior. These unusual faults slowly creep for long periods of time and produce many small earthquakes. The presence of fault creep and the related microseismicity helps illuminate faults that might not otherwise be located in fine detail, but there is also the question of how creeping faults contribute to seismic hazard. It appears that well-recorded creeping fault earthquakes of up to magnitude 6.6 that have occurred in shallow continental regions produce similar fault-surface rupture areas and similar peak ground shaking as their locked fault counterparts of the same earthquake magnitude. The behavior of much larger earthquakes on shallow creeping continental faults is less well known, because there is a dearth of comprehensive observations. Computational simulations provide an opportunity to fill the gaps in our understanding, particularly of the dynamic processes that occur during large earthquake rupture and arrest.

  3. How fault geometry controls earthquake magnitude

    NASA Astrophysics Data System (ADS)

    Bletery, Q.; Thomas, A.; Karlstrom, L.; Rempel, A. W.; Sladen, A.; De Barros, L.

    2016-12-01

    Recent large megathrust earthquakes, such as the Mw9.3 Sumatra-Andaman earthquake in 2004 and the Mw9.0 Tohoku-Oki earthquake in 2011, astonished the scientific community. The first event occurred in a relatively low-convergence-rate subduction zone where events of its size were unexpected. The second event involved 60 m of shallow slip in a region thought to be aseismically creeping and hence incapable of hosting very large magnitude earthquakes. These earthquakes highlight gaps in our understanding of mega-earthquake rupture processes and the factors controlling their global distribution. Here we show that gradients in dip angle exert a primary control on mega-earthquake occurrence. We calculate the curvature along the major subduction zones of the world and show that past mega-earthquakes occurred on flat (low-curvature) interfaces. A simplified analytic model demonstrates that shear strength heterogeneity increases with curvature. Stress loading on flat megathrusts is more homogeneous and hence more likely to be released simultaneously over large areas than on highly-curved faults. Therefore, the absence of asperities on large faults might counter-intuitively be a source of higher hazard.

  4. Observing Triggered Earthquakes Across Iran with Calibrated Earthquake Locations

    NASA Astrophysics Data System (ADS)

    Karasozen, E.; Bergman, E.; Ghods, A.; Nissen, E.

    2016-12-01

    We investigate earthquake triggering phenomena in Iran by analyzing patterns of aftershock activity around mapped surface ruptures. Iran has an intense level of seismicity (> 40,000 events listed in the ISC Bulletin since 1960) due to it accommodating a significant portion of the continental collision between Arabia and Eurasia. There are nearly thirty mapped surface ruptures associated with earthquakes of M 6-7.5, mostly in eastern and northwestern Iran, offering a rich potential to study the kinematics of earthquake nucleation, rupture propagation, and subsequent triggering. However, catalog earthquake locations are subject to up to 50 km of location bias from the combination of unknown Earth structure and unbalanced station coverage, making it challenging to assess both the rupture directivity of larger events and the spatial patterns of their aftershocks. To overcome this limitation, we developed a new two-tiered multiple-event relocation approach to obtain hypocentral parameters that are minimally biased and have realistic uncertainties. In the first stage, locations of small clusters of well-recorded earthquakes at local spatial scales (100s of events across 100 km length scales) are calibrated either by using near-source arrival times or independent location constraints (e.g. local aftershock studies, InSAR solutions), using an implementation of the Hypocentroidal Decomposition relocation technique called MLOC. Epicentral uncertainties are typically less than 5 km. Then, these events are used as prior constraints in the code BayesLoc, a Bayesian relocation technique that can handle larger datasets, to yield region-wide calibrated hypocenters (1000s of events over 1000 km length scales). With locations and errors both calibrated, the pattern of aftershock activity can reveal the type of the earthquake triggering: dynamic stress changes promote an increase in the seismicity rate in the direction of unilateral propagation, whereas static stress changes should

  5. Earthquake casualty models within the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.; Earle, Paul S.; Porter, Keith A.; Hearne, Mike

    2011-01-01

    Since the launch of the USGS’s Prompt Assessment of Global Earthquakes for Response (PAGER) system in fall of 2007, the time needed for the U.S. Geological Survey (USGS) to determine and comprehend the scope of any major earthquake disaster anywhere in the world has been dramatically reduced to less than 30 min. PAGER alerts consist of estimated shaking hazard from the ShakeMap system, estimates of population exposure at various shaking intensities, and a list of the most severely shaken cities in the epicentral area. These estimates help government, scientific, and relief agencies to guide their responses in the immediate aftermath of a significant earthquake. To account for wide variability and uncertainty associated with inventory, structural vulnerability and casualty data, PAGER employs three different global earthquake fatality/loss computation models. This article describes the development of the models and demonstrates the loss estimation capability for earthquakes that have occurred since 2007. The empirical model relies on country-specific earthquake loss data from past earthquakes and makes use of calibrated casualty rates for future prediction. The semi-empirical and analytical models are engineering-based and rely on complex datasets including building inventories, time-dependent population distributions within different occupancies, the vulnerability of regional building stocks, and casualty rates given structural collapse.

  6. In the shadow of 1857-the effect of the great Ft. Tejon earthquake on subsequent earthquakes in southern California

    USGS Publications Warehouse

    Harris, R.A.; Simpson, R.W.

    1996-01-01

    The great 1857 Fort Tejon earthquake is the largest earthquake to have hit southern California during the historic period. We investigated whether seismicity patterns following 1857 could be due to static stress changes generated by the 1857 earthquake. When post-1857 earthquakes with unknown focal mechanisms were assigned strike-slip mechanisms with strike and rake determined by the nearest active fault, 13 of the 13 southern California M ≥ 5.5 earthquakes between 1857 and 1907 were encouraged by the 1857 rupture. When post-1857 earthquakes in the Transverse Ranges with unknown focal mechanisms were assigned reverse mechanisms and all other events were assumed strike-slip, 11 of the 13 earthquakes were encouraged by the 1857 earthquake. These results show significant correlations between static stress changes and seismicity patterns. The correlation disappears around 1907, suggesting that tectonic loading began to overwhelm the effect of the 1857 earthquake early in the 20th century.

  7. Intraplate triggered earthquakes: Observations and interpretation

    USGS Publications Warehouse

    Hough, S.E.; Seeber, L.; Armbruster, J.G.

    2003-01-01

    We present evidence that at least two of the three 1811-1812 New Madrid, central United States, mainshocks and the 1886 Charleston, South Carolina, earthquake triggered earthquakes at regional distances. In addition to previously published evidence for triggered earthquakes in the northern Kentucky/southern Ohio region in 1812, we present evidence suggesting that triggered events might have occurred in the Wabash Valley, to the south of the New Madrid Seismic Zone, and near Charleston, South Carolina. We also discuss evidence that earthquakes might have been triggered in northern Kentucky within seconds of the passage of surface waves from the 23 January 1812 New Madrid mainshock. After the 1886 Charleston earthquake, accounts suggest that triggered events occurred near Moodus, Connecticut, and in southern Indiana. Notwithstanding the uncertainty associated with analysis of historical accounts, there is evidence that at least three out of the four known Mw 7 earthquakes in the central and eastern United States seem to have triggered earthquakes at distances beyond the typically assumed aftershock zone of 1-2 mainshock fault lengths. We explore the possibility that remotely triggered earthquakes might be common in low-strain-rate regions. We suggest that in a low-strain-rate environment, permanent, nonelastic deformation might play a more important role in stress accumulation than it does in interplate crust. Using a simple model incorporating elastic and anelastic strain release, we show that, for realistic parameter values, faults in intraplate crust remain close to their failure stress for a longer part of the earthquake cycle than do faults in high-strain-rate regions. Our results further suggest that remotely triggered earthquakes occur preferentially in regions of recent and/or future seismic activity, which suggests that faults are at a critical stress state in only some areas. Remotely triggered earthquakes may thus serve as beacons that identify regions of

  8. Earthquakes, July-August 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    There was one major earthquake during this reporting period-a magnitude 7.1 shock off the coast of Northern California on August 17. Earthquake-related deaths were reported from Indonesia, Romania, Peru, and Iraq. 

  9. Earthquakes-Rattling the Earth's Plumbing System

    USGS Publications Warehouse

    Sneed, Michelle; Galloway, Devin L.; Cunningham, William L.

    2003-01-01

    Hydrogeologic responses to earthquakes have been known for decades, and have occurred both close to, and thousands of miles from earthquake epicenters. Water wells have become turbid, dry or begun flowing, discharge of springs and ground water to streams has increased and new springs have formed, and well and surface-water quality have become degraded as a result of earthquakes. Earthquakes affect our Earth’s intricate plumbing system—whether you live near the notoriously active San Andreas Fault in California, or far from active faults in Florida, an earthquake near or far can affect you and the water resources you depend on.

  10. Measuring the size of an earthquake

    USGS Publications Warehouse

    Spence, W.

    1977-01-01

    Earthquakes occur in a broad range of sizes. A rock burst in an Idaho silver mine may involve the fracture of 1 meter of rock; the 1965 Rat Islands earthquake in the Aleutian arc involved a 650-kilometer length of Earth's crust. Earthquakes can be even smaller and even larger. If an earthquake is felt or causes perceptible surface damage, then its intensity of shaking can be subjectively estimated. But many large earthquakes occur in oceanic areas or at great focal depths. These are either simply not felt or their felt pattern does not really indicate their true size.

  11. Volunteers in the earthquake hazard reduction program

    USGS Publications Warehouse

    Ward, P.L.

    1978-01-01

    With this in mind, I organized a small workshop for approximately 30 people on February 2 and 3, 1978, in Menlo Park, Calif. The purpose of the meeting was to discuss methods of involving volunteers in a meaningful way in earthquake research and in educating the public about earthquake hazards. The emphasis was on earthquake prediction research, but the discussions covered the whole earthquake hazard reduction program. Representatives attended from the earthquake research community, from groups doing socioeconomic research on earthquake matters, and from a wide variety of organizations that might sponsor volunteers.

  12. Real-time earthquake source imaging: An offline test for the 2011 Tohoku earthquake

    NASA Astrophysics Data System (ADS)

    Zhang, Yong; Wang, Rongjiang; Zschau, Jochen; Parolai, Stefano; Dahm, Torsten

    2014-05-01

    In recent decades, great efforts have been expended in real-time seismology aiming at earthquake and tsunami early warning. One of the most important issues is the real-time assessment of earthquake rupture processes using near-field seismogeodetic networks. Currently, earthquake early warning systems are mostly based on the rapid estimate of P-wave magnitude, which generally carries large uncertainties and suffers from the known saturation problem. In the case of the 2011 Mw9.0 Tohoku earthquake, JMA (Japan Meteorological Agency) released the first warning of the event with M7.2 after 25 s. Subsequent magnitude updates even decreased to M6.3-6.6. Finally, the magnitude estimate stabilized at M8.1 after about two minutes. This consequently led to underestimated tsunami heights. By using the newly developed Iterative Deconvolution and Stacking (IDS) method for automatic source imaging, we demonstrate an offline test for the real-time analysis of the strong-motion and GPS seismograms of the 2011 Tohoku earthquake. The results show that it would have been theoretically possible to image the complex rupture process of the 2011 Tohoku earthquake automatically soon after, or even during, the rupture. In general, what had happened on the fault could be robustly imaged with a time delay of about 30 s by using either the strong-motion (KiK-net) or the GPS (GEONET) real-time data. This implies that the new real-time source imaging technique helps to reduce false and missing warnings, and therefore should play an important role in future tsunami early warning and earthquake rapid response systems.

  13. Protecting Your Family From Earthquakes-The Seven Steps to Earthquake Safety (in Spanish and English)

    USGS Publications Warehouse

    Developed by American Red Cross, Asian Pacific Fund

    2007-01-01

    This book is provided here to share an important message on emergency preparedness. Historically, we have suffered earthquakes here in the San Francisco Bay Area that have caused severe hardship for residents and incredible damage to our cities. It is likely we will experience a severe earthquake within the next 30 years. Many of us come from other countries where we have experienced earthquakes, so we believe that we understand them. However, the way we prepare for earthquakes in our home country may be different from the way it is necessary to prepare for earthquakes here. Very few people die from collapsing buildings in the Bay Area because most structures are built to stand up to the shaking. But it is quite possible that your family will be without medical care or grocery stores and separated from one another for several days to weeks. It will ultimately be up to you to keep your family safe until help arrives, so we are asking you to join us in learning to take care of your family before, during, and after an earthquake. The first step is to read this book. Everyone in your family, children and adults, can learn how to prepare for an earthquake. Then take advantage of the American Red Cross Earthquake Preparedness training courses offered in your community. These preparedness courses are free, and also offered in Spanish and available to everyone in the community regardless of family history, legal status, gender, or age. We encourage you to take one of these free training workshops. Look on the back cover for more information. Remember that an earthquake can occur without warning, and the only way that we can reduce the harm caused by earthquakes is to be prepared. Get Prepared!

  14. Links Between Earthquake Characteristics and Subducting Plate Heterogeneity in the 2016 Pedernales Ecuador Earthquake Rupture Zone

    NASA Astrophysics Data System (ADS)

    Bai, L.; Mori, J. J.

    2016-12-01

    The collision between the Indian and Eurasian plates formed the Himalayas, the largest orogenic belt on the Earth. The entire region accommodates shallow earthquakes, while intermediate-depth earthquakes are concentrated at the eastern and western Himalayan syntaxis. Here we investigate the focal depths, fault plane solutions, and source rupture process for three earthquake sequences, which are located at the western, central and eastern regions of the Himalayan orogenic belt. The Pamir-Hindu Kush region is located at the western Himalayan syntaxis and is characterized by extreme shortening of the upper crust and strong interaction of various layers of the lithosphere. Many shallow earthquakes occur on the Main Pamir Thrust at focal depths shallower than 20 km, while intermediate-deep earthquakes are mostly located below 75 km. Large intermediate-depth earthquakes occur frequently at the western Himalayan syntaxis about every 10 years on average. The 2015 Nepal earthquake is located in the central Himalayas. It is a typical megathrust earthquake that occurred on the shallow portion of the Main Himalayan Thrust (MHT). Many of the aftershocks are located above the MHT and illuminate faulting structures in the hanging wall with dip angles that are steeper than the MHT. These observations provide new constraints on the collision and uplift processes for the Himalaya orogenic belt. The Indo-Burma region is located south of the eastern Himalayan syntaxis, where the strike of the plate boundary suddenly changes from nearly east-west at the Himalayas to nearly north-south at the Burma Arc. The Burma arc subduction zone is a typical oblique plate convergence zone. The eastern boundary is the north-south striking dextral Sagaing fault, which hosts many shallow earthquakes with focal depth less than 25 km. In contrast, intermediate-depth earthquakes along the subduction zone reflect east-west trending reverse faulting.

  15. Prospective Validation of Pre-earthquake Atmospheric Signals and Their Potential for Short–term Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Ouzounov, Dimitar; Pulinets, Sergey; Hattori, Katsumi; Lee, Lou; Liu, Tiger; Kafatos, Menas

    2015-04-01

    We present the latest developments in multi-sensor observations of short-term pre-earthquake phenomena preceding major earthquakes. Our challenge question is: "Are such pre-earthquake atmospheric/ionospheric signals significant, and could they be useful for early warning of large earthquakes?" To check the predictive potential of atmospheric pre-earthquake signals we have started to validate anomalous ionospheric/atmospheric signals in retrospective and prospective modes. The integrated satellite and terrestrial framework (ISTF) is our method for validation and is based on a joint analysis of several physical and environmental parameters (satellite thermal infrared radiation (STIR), electron concentration in the ionosphere (GPS/TEC), radon/ion activities, air temperature, and seismicity patterns) that were found to be associated with earthquakes. The science rationale for multidisciplinary analysis is based on the concept of Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) [Pulinets and Ouzounov, 2011], which explains the synergy of different geospace processes and anomalous variations, usually named short-term pre-earthquake anomalies. Our validation process consists of two steps: (1) a continuous retrospective analysis performed over two different regions with high seismicity, Taiwan and Japan, for 2003-2009, and (2) prospective testing of STIR anomalies with potential for M5.5+ events. The retrospective tests (100+ major earthquakes, M>5.9, Taiwan and Japan) show anomalous STIR behavior before all of these events, with false negatives close to zero. The false-alarm ratio for false positives is less than 25%. The initial prospective testing for STIR shows a systematic appearance of anomalies in advance (1-30 days) of the M5.5+ events for Taiwan, Kamchatka-Sakhalin (Russia) and Japan. Our initial prospective results suggest that our approach shows a systematic appearance of atmospheric anomalies, one to several days prior to the largest earthquakes. That feature could be
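
    The validation bookkeeping quoted above (anomalies before all target events, false-alarm ratio below 25%) amounts to two simple ratios. A minimal sketch with placeholder alarm counts, not the actual Taiwan/Japan statistics:

```python
# Minimal bookkeeping for the validation statistics quoted above: hit rate over
# target earthquakes and false-alarm ratio over issued alarms. The counts are
# placeholders, not the actual Taiwan/Japan statistics.

def hit_rate(hits: int, targets: int) -> float:
    """Fraction of target earthquakes preceded by an anomaly."""
    return hits / targets

def false_alarm_ratio(false_alarms: int, alarms: int) -> float:
    """Fraction of issued alarms not followed by a target earthquake."""
    return false_alarms / alarms

targets, hits = 100, 100          # retrospective case: anomalies before all events
alarms, false_alarms = 130, 30    # assumed alarm bookkeeping (illustrative)

print(f"hit rate          = {hit_rate(hits, targets):.2f}")
print(f"false-alarm ratio = {false_alarm_ratio(false_alarms, alarms):.2f}")
```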

  16. Earthquakes, September-October 1984

    USGS Publications Warehouse

    Person, W.J.

    1985-01-01

    In the United States, Wyoming experienced a couple of moderate earthquakes, and off the coast of northern California, a strong earthquake shook much of the northern coast of California and parts of the Oregon coast. 

  17. Earthquakes, November-December 1977

    USGS Publications Warehouse

    Person, W.J.

    1978-01-01

    In the United States, the largest earthquake during this reporting period was a magnitude 6.6 in the Andreanof Islands, which are part of the Aleutian Islands chain, on November 4 that caused some minor damage. Northern California was struck by a magnitude 4.8 earthquake on November 22, causing moderate damage in the Willits area. This was the most damaging quake in the United States during the year. Two major earthquakes of magnitude 7.0 or above brought the total number of major earthquakes to 14 for the year.

  18. Early Earthquakes of the Americas

    NASA Astrophysics Data System (ADS)

    Ni, James

    2004-11-01

    Robert Kovach's second book looks at the interplay of earthquake and volcanic events, archeology, and history in the Americas. Throughout history, major earthquakes have caused the deaths of millions of people and have damaged countless cities. Earthquakes undoubtedly damaged prehistoric cities in the Americas, and evidence of these events could be preserved in archeological records. Kovach asks, Did indigenous native cultures-Indians of the Pacific Northwest, Aztecs, Mayas, and Incas-document their natural history? Some events have been explicitly documented, for example, in Mayan codices, but many may have been recorded as myth and legend. Kovach's discussions of how early cultures dealt with fearful events such as earthquakes and volcanic eruptions are colorful, informative, and entertaining, and include, for example, a depiction of how the Maya would talk to maize plants in their fields during earthquakes to reassure them.

  19. The earthquake disaster risk characteristic and the problem in the earthquake emergency rescue of mountainous southwestern Sichuan

    NASA Astrophysics Data System (ADS)

    Yuan, S.; Xin, C.; Ying, Z.

    2016-12-01

    In recent years, earthquake disasters have occurred frequently in the Chinese mainland, and the secondary disasters they cause are more serious in mountainous regions. Because of the influence of terrain and geological conditions, the difficulty of earthquake emergency rescue work increases greatly and rescue forces are stretched. Yet earthquake emergency rescue in mountainous regions has been studied little, and research on whether existing equipment can meet the actual needs of local earthquake emergency rescue is scarce. This paper intends to discuss and solve these problems. Through field research in the mountainous Ganzi and Liangshan prefectures of Sichuan, we investigated the process of earthquake emergency response and the deployment of rescue forces after an earthquake, and we also collected and collated basic data on local rescue forces. By consulting experts and statistically analyzing the basic data, we identify two main problems. The first concerns the local rescue forces: they are poorly equipped, lack knowledge of medical aid and of identifying building structures, have no sound mechanism for financial investment and protection, and lack updates and maintenance of rescue equipment. The second problem concerns the earthquake emergency rescue process: in the complicated geologic structure of mountainous regions, traffic and communication may be interrupted by landslides and mud-rock flows after an earthquake, outside rescue forces may not arrive in time, rescue equipment must be transported by manpower, and because earthquake disaster information is unknown, local rescue forces are deployed unreasonably. From the above, local government workers should analyze the characteristics of earthquake disasters in mountainous regions and research how to improve their earthquake emergency rescue ability. We think they can do so by strengthening and regulating the rescue force structure, enhancing skills and knowledge, and training rescue workers

  20. Simulating Earthquakes for Science and Society: New Earthquake Visualizations Ideal for Use in Science Communication

    NASA Astrophysics Data System (ADS)

    de Groot, R. M.; Benthien, M. L.

    2006-12-01

    The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. These types of visualizations are becoming pervasive in the teaching and learning of concepts related to earth science. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). Earthquakes are ideal candidates for visualization products: they cannot be predicted, are completed in a matter of seconds, occur deep in the earth, and the time between events can be on a geologic time scale. For example, the southern part of the San Andreas fault has not seen a major earthquake since about 1690, setting the stage for an earthquake as large as magnitude 7.7 -- the "big one." Since no one has experienced such an earthquake, visualizations can help people understand the scale of such an event. Accordingly, SCEC has developed a revolutionary simulation of this earthquake, with breathtaking visualizations that are now being distributed. According to Gordin and Pea (1995), theoretically visualization should make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.

  1. Earthquake number forecasts testing

    NASA Astrophysics Data System (ADS)

    Kagan, Yan Y.

    2017-10-01

    We study the distributions of earthquake numbers in two global earthquake catalogues: Global Centroid-Moment Tensor and Preliminary Determinations of Epicenters. The properties of these distributions are especially required to develop the number test for our forecasts of future seismic activity rate, tested by the Collaboratory for Study of Earthquake Predictability (CSEP). A common assumption, as used in the CSEP tests, is that the numbers are described by the Poisson distribution. It is clear, however, that the Poisson assumption for the earthquake number distribution is incorrect, especially for the catalogues with a lower magnitude threshold. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrences, the negative-binomial distribution (NBD) has two parameters. The second parameter can be used to characterize the clustering or overdispersion of a process. We also introduce and study a more complex three-parameter beta negative-binomial distribution. We investigate the dependence of parameters for both Poisson and NBD distributions on the catalogue magnitude threshold and on temporal subdivision of catalogue duration. First, we study whether the Poisson law can be statistically rejected for various catalogue subdivisions. We find that for most cases of interest, the Poisson distribution can be shown to be rejected statistically at a high significance level in favour of the NBD. Thereafter, we investigate whether these distributions fit the observed distributions of seismicity. For this purpose, we study upper statistical moments of earthquake numbers (skewness and kurtosis) and compare them to the theoretical values for both distributions. Empirical values for the skewness and the kurtosis increase for the smaller magnitude threshold and increase with even greater intensity for small temporal subdivision of catalogues. The Poisson distribution for large rate values approaches the Gaussian law, therefore its skewness
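
    The moment comparison described here, between the one-parameter Poisson law and the overdispersed negative-binomial distribution (NBD), can be reproduced with standard library routines. The parameter values below are illustrative placeholders, not the GCMT or PDE catalogue fits.

```python
from scipy import stats

# Compare theoretical moments of the Poisson and negative-binomial (NBD) models
# for earthquake counts. Parameters are illustrative placeholders, not the
# catalogue fits from the paper. scipy's nbinom(n, p) has mean n*(1-p)/p, so the
# NBD below is parameterized to share the Poisson mean while being overdispersed.

lam = 25.0                        # Poisson rate (events per time bin)
n, p = 5.0, 5.0 / (5.0 + lam)     # NBD with the same mean, larger variance

for name, dist in [("Poisson", stats.poisson(lam)),
                   ("NBD", stats.nbinom(n, p))]:
    mean, var, skew, kurt = dist.stats(moments="mvsk")
    print(f"{name:8s} mean={float(mean):.1f} var={float(var):.1f} "
          f"skew={float(skew):.2f} excess_kurtosis={float(kurt):.2f}")
```

    The NBD's larger variance, skewness and kurtosis at the same mean illustrate the overdispersion the abstract describes for small magnitude thresholds and short time bins.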

  2. Earthquake nucleation by transient deformations caused by the M = 7.9 Denali, Alaska, earthquake

    USGS Publications Warehouse

    Gomberg, J.; Bodin, P.; Larson, K.; Dragert, H.

    2004-01-01

    The permanent and dynamic (transient) stress changes inferred to trigger earthquakes are usually orders of magnitude smaller than the stresses relaxed by the earthquakes themselves, implying that triggering occurs on critically stressed faults. Triggered seismicity rate increases may therefore be most likely to occur in areas where loading rates are highest and elevated pore pressures, perhaps facilitated by high-temperature fluids, reduce frictional stresses and promote failure. Here we show that the 2002 magnitude M = 7.9 Denali, Alaska, earthquake triggered widespread seismicity rate increases throughout British Columbia and into the western United States. Dynamic triggering by seismic waves should be enhanced in directions where rupture directivity focuses radiated energy, and we verify this using seismic and new high-sample-rate GPS recordings of the Denali mainshock. These observations are comparable in scale only to the triggering caused by the 1992 M = 7.4 Landers, California, earthquake, and demonstrate that Landers triggering did not reflect some peculiarity of the region or the earthquake. However, the rate increases triggered by the Denali earthquake occurred in areas not obviously tectonically active, implying that even in areas of low ambient stressing rates, faults may still be critically stressed and that dynamic triggering may be ubiquitous and unpredictable.

  3. Prediction of earthquake-triggered landslide event sizes

    NASA Astrophysics Data System (ADS)

    Braun, Anika; Havenith, Hans-Balder; Schlögel, Romy

    2016-04-01

    Seismically induced landslides are a major environmental effect of earthquakes, which may significantly contribute to related losses. Moreover, in paleoseismology landslide event sizes are an important proxy for the estimation of the intensity and magnitude of past earthquakes, thus allowing us to improve seismic hazard assessment over longer terms. Not only earthquake intensity, but also factors such as the fault characteristics, topography, climatic conditions and the geological environment have a major impact on the intensity and spatial distribution of earthquake induced landslides. We present here a review of factors contributing to earthquake triggered slope failures based on an "event-by-event" classification approach. The objective of this analysis is to enable the short-term prediction of earthquake triggered landslide event sizes in terms of numbers and size of the affected area right after an earthquake occurs. Five main factors, 'Intensity', 'Fault', 'Topographic energy', 'Climatic conditions' and 'Surface geology', were used to establish a relationship to the number and spatial extent of landslides triggered by an earthquake. The relative weight of these factors was extracted from published data for numerous past earthquakes; topographic inputs were checked in Google Earth and through geographic information systems. Based on well-documented recent earthquakes (e.g. Haiti 2010, Wenchuan 2008) and on older events for which reliable extensive information was available (e.g. Northridge 1994, Loma Prieta 1989, Guatemala 1976, Peru 1970) the combination and relative weight of the factors was calibrated. The calibrated factor combination was then applied to more than 20 earthquake events for which landslide distribution characteristics could be cross-checked. One of our main findings is that the 'Fault' factor, which is based on characteristics of the fault, the surface rupture and its location with respect to mountain areas, has the most important
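
    The "event-by-event" factor combination described above is, in essence, a calibrated weighted score over the five factors. A minimal sketch with assumed weights and normalized factor scores (the calibrated weights are not given in the abstract):

```python
# Sketch of a weighted factor combination for earthquake-triggered landslide
# event size. Weights and factor scores are placeholders; only the idea that
# the 'Fault' factor carries the largest weight is taken from the abstract.

WEIGHTS = {
    "Intensity": 0.25,
    "Fault": 0.30,               # assumed highest weight, per the abstract's main finding
    "Topographic energy": 0.20,
    "Climatic conditions": 0.10,
    "Surface geology": 0.15,
}

def landslide_event_score(factors: dict) -> float:
    """Weighted sum of factor scores, each assumed normalized to the range 0-1."""
    return sum(WEIGHTS[name] * value for name, value in factors.items())

example = {"Intensity": 0.9, "Fault": 0.8, "Topographic energy": 0.7,
           "Climatic conditions": 0.4, "Surface geology": 0.6}
print(f"relative event-size score: {landslide_event_score(example):.2f}")
```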

  4. Evidence for Ancient Mesoamerican Earthquakes

    NASA Astrophysics Data System (ADS)

    Kovach, R. L.; Garcia, B.

    2001-12-01

    Evidence for past earthquake damage at Mesoamerican ruins is often overlooked because of the invasive effects of tropical vegetation and is usually not considered as a causal factor when restoration and reconstruction of many archaeological sites are undertaken. Yet the proximity of many ruins to zones of seismic activity would argue otherwise. Clues as to the types of damage which should be sought were offered in September 1999 when the M = 7.5 Oaxaca earthquake struck the ruins of Monte Alban, Mexico, where archaeological renovations were underway. More than 20 structures were damaged, 5 of them seriously. Damage features noted were walls out of plumb; fractures in walls, floors, basal platforms and tableros; toppling of columns; and deformation, settling and tumbling of walls. A Modified Mercalli Intensity of VII (ground accelerations 18-34 %g) occurred at the site. Within the diffuse landward extension of the Caribbean plate boundary zone, M = 7+ earthquakes occur with repeat times of hundreds of years, arguing that many Maya sites were subjected to earthquakes. Damage to re-erected and reinforced stelae, walls, and buildings was witnessed at Quirigua, Guatemala, during an expedition underway when the 1976 M = 7.5 Guatemala earthquake on the Motagua fault struck. Excavations also revealed evidence (domestic pottery vessels and the skeleton of a child crushed under fallen walls) of an ancient earthquake occurring about the time of the demise and abandonment of Quirigua in the late 9th century. Striking evidence for sudden earthquake building collapse at the end of the Mayan Classic Period ~A.D. 889 was found at Benque Viejo (Xunantunich), Belize, located 210 km north of Quirigua. It is argued that a M = 7.5 to 7.9 earthquake at the end of the Maya Classic period centered in the vicinity of the Chixoy-Polochic and Motagua fault zones could have produced the contemporaneous earthquake damage to the above sites. As a consequence this earthquake may have accelerated the

  5. Earthquake Potential Models for China

    NASA Astrophysics Data System (ADS)

    Rong, Y.; Jackson, D. D.

    2002-12-01

    We present three earthquake potential estimates for magnitude 5.4 and larger earthquakes for China. The potential is expressed as the rate density (probability per unit area, magnitude and time). The three methods employ smoothed seismicity-, geologic slip rate-, and geodetic strain rate data. We tested all three estimates, and the published Global Seismic Hazard Assessment Project (GSHAP) model, against earthquake data. We constructed a special earthquake catalog which combines previous catalogs covering different times. We used the special catalog to construct our smoothed seismicity model and to evaluate all models retrospectively. All our models employ a modified Gutenberg-Richter magnitude distribution with three parameters: a multiplicative "a-value," the slope or "b-value," and a "corner magnitude" marking a strong decrease of earthquake rate with magnitude. We assumed the b-value to be constant for the whole study area and estimated the other parameters from regional or local geophysical data. The smoothed seismicity method assumes that the rate density is proportional to the magnitude of past earthquakes and approximately as the reciprocal of the epicentral distance out to a few hundred kilometers. We derived the upper magnitude limit from the special catalog and estimated local a-values from smoothed seismicity. Earthquakes since January 1, 2000 are quite compatible with the model. For the geologic forecast we adopted the seismic source zones (based on geological, geodetic and seismicity data) of the GSHAP model. For each zone, we estimated a corner magnitude by applying the Wells and Coppersmith [1994] relationship to the longest fault in the zone, and we determined the a-value from fault slip rates and an assumed locking depth. The geological model fits the earthquake data better than the GSHAP model. We also applied the Wells and Coppersmith relationship to individual faults, but the results conflicted with the earthquake record. For our geodetic
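
    The magnitude distribution described here, a Gutenberg-Richter law modified by a corner magnitude, is commonly written as a power law in seismic moment with an exponential taper. The sketch below uses one standard tapered form; whether it matches the paper's exact parameterization is not stated in the abstract, and the zone parameters are illustrative.

```python
import math

# One common "tapered" Gutenberg-Richter form with a corner magnitude: the
# cumulative rate above magnitude m falls off as a power law in seismic moment,
# multiplied by an exponential taper at the corner moment. Illustrative
# parameterization only; the paper's exact form is not given in the abstract.

def moment(mw: float) -> float:
    """Seismic moment (N*m) from moment magnitude."""
    return 10 ** (1.5 * mw + 9.1)

def cumulative_rate(mw, a_rate, b_value, corner_mw, m_min=5.4):
    """Events per year with magnitude >= mw (tapered Gutenberg-Richter)."""
    beta = (2.0 / 3.0) * b_value
    m0, m_t, m_c = moment(mw), moment(m_min), moment(corner_mw)
    return a_rate * (m_t / m0) ** beta * math.exp((m_t - m0) / m_c)

# Example: a zone producing 0.5 events/yr above M5.4, b = 1.0, corner magnitude 8.0
for m in (5.4, 6.0, 7.0, 8.0):
    print(f"M>={m:.1f}: {cumulative_rate(m, 0.5, 1.0, 8.0):.4f} events/yr")
```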

  6. Post-earthquake building safety assessments for the Canterbury Earthquakes

    USGS Publications Warehouse

    Marshall, J.; Barnes, J.; Gould, N.; Jaiswal, K.; Lizundia, B.; Swanson, David A.; Turner, F.

    2012-01-01

    This paper explores the post-earthquake building assessment program that was utilized in Christchurch, New Zealand following the Canterbury Sequence of earthquakes beginning with the magnitude (Mw) 7.1 Darfield event in September 2010. The aftershocks or triggered events, two of which exceeded Mw 6.0, continued with events in February and June 2011 causing the greatest amount of damage. More than 70,000 building safety assessments were completed following the February event. The timeline and assessment procedures will be discussed, including the use of rapid response teams, selection of indicator buildings to monitor damage following aftershocks, risk assessments for demolition of red-tagged buildings, the use of task forces to address management of the heavily damaged downtown area and the process of demolition. Through the post-event safety assessment program that occurred throughout the Canterbury Sequence of earthquakes, many important lessons can be learned that will benefit future response to natural hazards that have potential to damage structures.

  7. Testing earthquake source inversion methodologies

    USGS Publications Warehouse

    Page, M.; Mai, P.M.; Schorlemmer, D.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  8. Testing an earthquake prediction algorithm

    USGS Publications Warehouse

    Kossobokov, V.G.; Healy, J.H.; Dewey, J.W.

    1997-01-01

    A test to evaluate earthquake prediction algorithms is being applied to a Russian algorithm known as M8. The M8 algorithm makes intermediate term predictions for earthquakes to occur in a large circle, based on integral counts of transient seismicity in the circle. In a retroactive prediction for the period January 1, 1985 to July 1, 1991 the algorithm as configured for the forward test would have predicted eight of ten strong earthquakes in the test area. A null hypothesis, based on random assignment of predictions, predicts eight earthquakes in 2.87% of the trials. The forward test began July 1, 1991 and will run through December 31, 1997. As of July 1, 1995, the algorithm had forward predicted five out of nine earthquakes in the test area, which success ratio would have been achieved in 53% of random trials with the null hypothesis.
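
    The chance-success probabilities quoted here can be checked in spirit with a simple binomial calculation: how often would random alarms catch at least k of n target earthquakes? The per-earthquake chance probability p (the fraction of space-time covered by alarms) is not given in the abstract, so the value below is a placeholder.

```python
from math import comb

# Sketch of the null-hypothesis check described in the abstract: the chance that
# randomly assigned "alarms" catch at least k of n target earthquakes. The
# per-earthquake chance probability p_alarm is NOT given in the abstract; the
# value below is an illustrative placeholder, so the printed probabilities will
# not exactly reproduce the 2.87% and 53% figures quoted above.

def prob_at_least(k: int, n: int, p: float) -> float:
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

p_alarm = 0.4  # assumed fraction of space-time covered by alarms (illustrative)
print(f"P(>=8 of 10 by chance) = {prob_at_least(8, 10, p_alarm):.4f}")
print(f"P(>=5 of 9  by chance) = {prob_at_least(5, 9,  p_alarm):.4f}")
```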

  9. Earthquakes, July-August, 1979

    USGS Publications Warehouse

    Person, W.J.

    1980-01-01

    In the United States, on August 6, central California experienced a moderately strong earthquake, which injured several people and caused some damage. A number of earthquakes occurred in other parts of the United States but caused very little damage. 

  10. Earthquake cycles and physical modeling of the process leading up to a large earthquake

    NASA Astrophysics Data System (ADS)

    Ohnaka, Mitiyasu

    2004-08-01

    A thorough discussion is made on what the rational constitutive law for earthquake ruptures ought to be from the standpoint of the physics of rock friction and fracture on the basis of solid facts observed in the laboratory. From this standpoint, it is concluded that the constitutive law should be a slip-dependent law with parameters that may depend on slip rate or time. With the long-term goal of establishing a rational methodology of forecasting large earthquakes, the entire process of one cycle for a typical, large earthquake is modeled, and a comprehensive scenario that unifies individual models for intermediate- and short-term (immediate) forecasts is presented within the framework based on the slip-dependent constitutive law and the earthquake cycle model. The earthquake cycle includes the phase of accumulation of elastic strain energy with tectonic loading (phase II), and the phase of rupture nucleation at the critical stage where an adequate amount of the elastic strain energy has been stored (phase III). Phase II plays a critical role in physical modeling of intermediate-term forecasting, and phase III in physical modeling of short-term (immediate) forecasting. The seismogenic layer and individual faults therein are inhomogeneous, and some of the physical quantities inherent in earthquake ruptures exhibit scale-dependence. It is therefore critically important to incorporate the properties of inhomogeneity and physical scaling, in order to construct realistic, unified scenarios with predictive capability. The scenario presented may be significant and useful as a necessary first step for establishing the methodology for forecasting large earthquakes.

  11. Mega-earthquakes rupture flat megathrusts.

    PubMed

    Bletery, Quentin; Thomas, Amanda M; Rempel, Alan W; Karlstrom, Leif; Sladen, Anthony; De Barros, Louis

    2016-11-25

    The 2004 Sumatra-Andaman and 2011 Tohoku-Oki earthquakes highlighted gaps in our understanding of mega-earthquake rupture processes and the factors controlling their global distribution: A fast convergence rate and young buoyant lithosphere are not required to produce mega-earthquakes. We calculated the curvature along the major subduction zones of the world, showing that mega-earthquakes preferentially rupture flat (low-curvature) interfaces. A simplified analytic model demonstrates that heterogeneity in shear strength increases with curvature. Shear strength on flat megathrusts is more homogeneous, and hence more likely to be exceeded simultaneously over large areas, than on highly curved faults. Copyright © 2016, American Association for the Advancement of Science.

  12. The USGS Earthquake Notification Service (ENS): Customizable notifications of earthquakes around the globe

    USGS Publications Warehouse

    Wald, Lisa A.; Wald, David J.; Schwarz, Stan; Presgrave, Bruce; Earle, Paul S.; Martinez, Eric; Oppenheimer, David

    2008-01-01

    At the beginning of 2006, the U.S. Geological Survey (USGS) Earthquake Hazards Program (EHP) introduced a new automated Earthquake Notification Service (ENS) to take the place of the National Earthquake Information Center (NEIC) "Bigquake" system and the various other individual EHP e-mail list-servers for separate regions in the United States. These included northern California, southern California, and the central and eastern United States. ENS is a "one-stop shopping" system that allows Internet users to subscribe to flexible and customizable notifications for earthquakes anywhere in the world. The customization capability allows users to define the what (magnitude threshold), the when (day and night thresholds), and the where (specific regions) for their notifications. Customization is achieved by employing a per-user based request profile, allowing the notifications to be tailored for each individual's requirements. Such earthquake-parameter-specific custom delivery was not possible with simple e-mail list-servers. Now that event and user profiles are in a structured query language (SQL) database, additional flexibility is possible. At the time of this writing, ENS had more than 114,000 subscribers, with more than 200,000 separate user profiles. On a typical day, more than 188,000 messages get sent to a variety of widely distributed users for a wide range of earthquake locations and magnitudes. The purpose of this article is to describe how ENS works, highlight the features it offers, and summarize plans for future developments.

  13. A note on evaluating VAN earthquake predictions

    NASA Astrophysics Data System (ADS)

    Tselentis, G.-Akis; Melis, Nicos S.

    The evaluation of the success level of an earthquake prediction method should not be based on approaches that apply generalized strict statistical laws and avoid the specific nature of the earthquake phenomenon. Fault rupture processes cannot be compared to gambling processes. The outcome of the present note is that even an ideal earthquake prediction method is still shown to be a matter of a “chancy” association between precursors and earthquakes if we apply the same procedure proposed by Mulargia and Gasperini [1992] in evaluating VAN earthquake predictions. Each individual VAN prediction has to be evaluated separately, taking always into account the specific circumstances and information available. The success level of epicenter prediction should depend on the earthquake magnitude, and magnitude and time predictions may depend on earthquake clustering and the tectonic regime respectively.

  14. Strategies for rapid global earthquake impact estimation: the Prompt Assessment of Global Earthquakes for Response (PAGER) system

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, D.J.

    2013-01-01

    This chapter summarizes the state-of-the-art for rapid earthquake impact estimation. It details the needs and challenges associated with quick estimation of earthquake losses following global earthquakes, and provides a brief literature review of various approaches that have been used in the past. With this background, the chapter introduces the operational earthquake loss estimation system developed by the U.S. Geological Survey (USGS) known as PAGER (for Prompt Assessment of Global Earthquakes for Response). It also details some of the ongoing developments of PAGER’s loss estimation models to better supplement the operational empirical models, and to produce value-added web content for a variety of PAGER users.

  15. Stigma in science: the case of earthquake prediction.

    PubMed

    Joffe, Helene; Rossetto, Tiziana; Bradley, Caroline; O'Connor, Cliodhna

    2018-01-01

    This paper explores how earthquake scientists conceptualise earthquake prediction, particularly given the conviction of six earthquake scientists for manslaughter (subsequently overturned) on 22 October 2012 for having given inappropriate advice to the public prior to the L'Aquila earthquake of 6 April 2009. In the first study of its kind, semi-structured interviews were conducted with 17 earthquake scientists and the transcribed interviews were analysed thematically. The scientists primarily denigrated earthquake prediction, showing strong emotive responses and distancing themselves from earthquake 'prediction' in favour of 'forecasting'. Earthquake prediction was regarded as impossible and harmful. The stigmatisation of the subject is discussed in the light of research on boundary work and stigma in science. The evaluation reveals how mitigation becomes the more favoured endeavour, creating a normative environment that disadvantages those who continue to pursue earthquake prediction research. Recommendations are made for communication with the public on earthquake risk, with a focus on how scientists portray uncertainty. © 2018 The Author(s). Disasters © Overseas Development Institute, 2018.

  16. Insignificant solar-terrestrial triggering of earthquakes

    USGS Publications Warehouse

    Love, Jeffrey J.; Thomas, Jeremy N.

    2013-01-01

    We examine the claim that solar-terrestrial interaction, as measured by sunspots, solar wind velocity, and geomagnetic activity, might play a role in triggering earthquakes. We count the number of earthquakes having magnitudes that exceed chosen thresholds in calendar years, months, and days, and we order these counts by the corresponding rank of annual, monthly, and daily averages of the solar-terrestrial variables. We measure the statistical significance of the difference between the earthquake-number distributions below and above the median of the solar-terrestrial averages by χ2 and Student's t tests. Across a range of earthquake magnitude thresholds, we find no consistent and statistically significant distributional differences. We also introduce time lags between the solar-terrestrial variables and the number of earthquakes, but again no statistically significant distributional difference is found. We cannot reject the null hypothesis of no solar-terrestrial triggering of earthquakes.
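
    The median-split comparison described here is straightforward to sketch: order annual earthquake counts by a solar-terrestrial variable, split at its median, and compare the two halves with Student's t and chi-square tests. The series below are synthetic stand-ins, not the real sunspot or earthquake catalogues.

```python
import numpy as np
from scipy import stats

# Sketch of the median-split test described in the abstract: split yearly
# earthquake counts by the median of a solar-terrestrial variable (e.g. sunspot
# number) and test whether the two halves differ. Data are synthetic stand-ins.

rng = np.random.default_rng(0)
years = 100
sunspots = rng.uniform(0, 200, years)      # stand-in solar-terrestrial variable
quake_counts = rng.poisson(15, years)      # stand-in annual earthquake counts

below = quake_counts[sunspots < np.median(sunspots)]
above = quake_counts[sunspots >= np.median(sunspots)]

t_stat, t_p = stats.ttest_ind(below, above, equal_var=False)
print(f"Student t-test: p = {t_p:.3f}")

# Chi-square test on the binned count distributions of the two halves
bins = np.histogram_bin_edges(quake_counts, bins=5)
obs = np.array([np.histogram(below, bins)[0], np.histogram(above, bins)[0]])
chi2, chi_p, _, _ = stats.chi2_contingency(obs + 1)   # +1 guards against empty cells
print(f"Chi-square test: p = {chi_p:.3f}")
```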

  17. Laboratory investigations of earthquake dynamics

    NASA Astrophysics Data System (ADS)

    Xia, Kaiwen

    In this thesis, this will be attempted through controlled laboratory experiments that are designed to mimic natural earthquake scenarios. The earthquake dynamic rupturing process itself is a complicated phenomenon, involving dynamic friction, wave propagation, and heat production. Because controlled experiments can produce results without the assumptions needed in theoretical and numerical analysis, the experimental method is advantageous over theoretical and numerical methods. Our laboratory fault is composed of carefully cut photoelastic polymer plates (Homalite-100, Polycarbonate) held together by uniaxial compression. As a unique unit of the experimental design, a controlled exploding wire technique provides the triggering mechanism of laboratory earthquakes. Three important components of real earthquakes (i.e., pre-existing fault, tectonic loading, and triggering mechanism) correspond to and are simulated by frictional contact, uniaxial compression, and the exploding wire technique. Dynamic rupturing processes are visualized using the photoelastic method and are recorded via a high-speed camera. Our experimental methodology, which is full-field, in situ, and non-intrusive, has better control and diagnostic capacity compared to other existing experimental methods. Using this experimental approach, we have investigated several problems: dynamics of earthquake faulting occurring along homogeneous faults separating identical materials, earthquake faulting along inhomogeneous faults separating materials with different wave speeds, and earthquake faulting along faults with a finite low wave speed fault core. We have observed supershear ruptures, sub-Rayleigh to supershear rupture transition, crack-like to pulse-like rupture transition, self-healing (Heaton) pulse, and rupture directionality.

  18. Physics of Earthquake Rupture Propagation

    NASA Astrophysics Data System (ADS)

    Xu, Shiqing; Fukuyama, Eiichi; Sagy, Amir; Doan, Mai-Linh

    2018-05-01

    A comprehensive understanding of earthquake rupture propagation requires the study of not only the sudden release of elastic strain energy during co-seismic slip, but also of other processes that operate at a variety of spatiotemporal scales. For example, the accumulation of the elastic strain energy usually takes decades to hundreds of years, and rupture propagation and termination modify the bulk properties of the surrounding medium that can influence the behavior of future earthquakes. To share recent findings in the multiscale investigation of earthquake rupture propagation, we held a session entitled "Physics of Earthquake Rupture Propagation" during the 2016 American Geophysical Union (AGU) Fall Meeting in San Francisco. The session included 46 poster and 32 oral presentations, reporting observations of natural earthquakes, numerical and experimental simulations of earthquake ruptures, and studies of earthquake fault friction. These presentations and discussions during and after the session suggested a need to document more formally the research findings, particularly new observations and views different from conventional ones, complexities in fault zone properties and loading conditions, the diversity of fault slip modes and their interactions, the evaluation of observational and model uncertainties, and comparison between empirical and physics-based models. Therefore, we organize this Special Issue (SI) of Tectonophysics under the same title as our AGU session, hoping to inspire future investigations. Eighteen articles (marked with "this issue") are included in this SI and grouped into the following six categories.

  19. Real Time Earthquake Information System in Japan

    NASA Astrophysics Data System (ADS)

    Doi, K.; Kato, T.

    2003-12-01

    An early earthquake notification system in Japan had been developed by the Japan Meteorological Agency (JMA) as a governmental organization responsible for issuing earthquake information and tsunami forecasts. The system was primarily developed for prompt provision of a tsunami forecast to the public with locating an earthquake and estimating its magnitude as quickly as possible. Years after, a system for a prompt provision of seismic intensity information as indices of degrees of disasters caused by strong ground motion was also developed so that concerned governmental organizations can decide whether it was necessary for them to launch emergency response or not. At present, JMA issues the following kinds of information successively when a large earthquake occurs. 1) Prompt report of occurrence of a large earthquake and major seismic intensities caused by the earthquake in about two minutes after the earthquake occurrence. 2) Tsunami forecast in around three minutes. 3) Information on expected arrival times and maximum heights of tsunami waves in around five minutes. 4) Information on a hypocenter and a magnitude of the earthquake, the seismic intensity at each observation station, the times of high tides in addition to the expected tsunami arrival times in 5-7 minutes. To issue information above, JMA has established; - An advanced nationwide seismic network with about 180 stations for seismic wave observation and about 3,400 stations for instrumental seismic intensity observation including about 2,800 seismic intensity stations maintained by local governments, - Data telemetry networks via landlines and partly via a satellite communication link, - Real-time data processing techniques, for example, the automatic calculation of earthquake location and magnitude, the database driven method for quantitative tsunami estimation, and - Dissemination networks, via computer-to-computer communications and facsimile through dedicated telephone lines. JMA operationally

  20. Earthquake precursory events around epicenters and local active faults; the cases of two inland earthquakes in Iran

    NASA Astrophysics Data System (ADS)

    Valizadeh Alvan, H.; Mansor, S.; Haydari Azad, F.

    2012-12-01

    The possibility of earthquake prediction in a time frame of several days to a few minutes before occurrence has recently stirred interest among researchers. Scientists believe that new theories and explanations of the mechanism of this natural phenomenon are trustworthy and can be the basis of future prediction efforts. During the last thirty years, experimental research has identified several pre-earthquake signals that are now recognized as warning signs (precursors) of past known earthquakes. With advances in in-situ measurement devices and data analysis capabilities and the emergence of satellite-based data collectors, monitoring the earth's surface is now routine work. Data providers are supplying researchers from all over the world with high-quality, validated imagery and non-imagery data. Surface Latent Heat Flux (SLHF), the amount of energy exchanged in the form of water vapor between the earth's surface and the atmosphere, has frequently been reported as an earthquake precursor in past years. The stress accumulated in the earth's crust during the preparation phase of earthquakes is said to be the main cause of temperature anomalies weeks to days before the main event and subsequent shocks. Chemical and physical interactions in the presence of underground water lead to higher water evaporation prior to inland earthquakes. On the other hand, the leakage of radon gas, released as rocks break during earthquake preparation, causes the formation of airborne ions and higher Air Temperature (AT) prior to the main event. Although joint analysis of direct and indirect observations of precursory events is considered a promising route to future successful earthquake prediction, without proper and thorough knowledge of the geological setting, atmospheric factors, and geodynamics of earthquake-prone regions we will not be able to identify anomalies due to seismic activity in the earth's crust. Active faulting is a key factor in identification of the

  1. Historical and recent large megathrust earthquakes in Chile

    NASA Astrophysics Data System (ADS)

    Ruiz, S.; Madariaga, R.

    2018-05-01

    Recent earthquakes in Chile (2014 Mw 8.2 Iquique, 2015 Mw 8.3 Illapel, and 2016 Mw 7.6 Chiloé) have exposed some problems with the straightforward application of ideas about seismic gaps, earthquake periodicity, and the general forecasting of large megathrust earthquakes. In northern Chile, before the 2014 Iquique earthquake, 4 large earthquakes were reported in written chronicles, in 1877, 1786, 1615 and 1543; in North-Central Chile, before the 2015 Illapel event, 3 large earthquakes were reported, in 1943, 1880 and 1730; and the 2016 Chiloé earthquake occurred in the southern zone of the 1960 Valdivia megathrust rupture, where other large earthquakes occurred in 1575, 1737 and 1837. The periodicity of these events has been proposed as a basis for good long-term forecasting. However, the seismological aspects of historical Chilean earthquakes were inferred mainly from old chronicles written before subduction in Chile was discovered. Here we use the original descriptions of earthquakes to re-analyze the historical archives. Our interpretation shows that a-priori ideas, like seismic gaps and characteristic earthquakes, influenced the estimation of magnitude, location and rupture area of the older Chilean events. On the other hand, advances in the characterization of the rheological aspects that control the contact between the Nazca and South American plates, together with the study of tsunami effects, provide better estimates of the location of historical earthquakes along the seismogenic plate interface. Our re-interpretation of historical earthquakes shows a large diversity of earthquake types; there is a major difference between giant earthquakes that break the entire plate interface and those of Mw 8.0 that break only a portion of it.

  2. Volcano-earthquake interaction at Mauna Loa volcano, Hawaii

    NASA Astrophysics Data System (ADS)

    Walter, Thomas R.; Amelung, Falk

    2006-05-01

    The activity at Mauna Loa volcano, Hawaii, is characterized by eruptive fissures that propagate into the Southwest Rift Zone (SWRZ) or into the Northeast Rift Zone (NERZ) and by large earthquakes at the basal decollement fault. In this paper we examine the historic eruption and earthquake catalogues, and we test the hypothesis that the events are interconnected in time and space. Earthquakes in the Kaoiki area occur in sequence with eruptions from the NERZ, and earthquakes in the Kona and Hilea areas occur in sequence with eruptions from the SWRZ. Using three-dimensional numerical models, we demonstrate that elastic stress transfer can explain the observed volcano-earthquake interaction. We examine stress changes due to typical intrusions and earthquakes. We find that intrusions change the Coulomb failure stress along the decollement fault so that NERZ intrusions encourage Kaoiki earthquakes and SWRZ intrusions encourage Kona and Hilea earthquakes. On the other hand, earthquakes decompress the magma chamber and unclamp part of the Mauna Loa rift zone, i.e., Kaoiki earthquakes encourage NERZ intrusions, whereas Kona and Hilea earthquakes encourage SWRZ intrusions. We discuss how changes of the static stress field affect the occurrence of earthquakes as well as the occurrence, location, and volume of dikes and of associated eruptions and also the lava composition and fumarolic activity.
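    As a rough illustration of the stress-transfer reasoning summarized above, the sketch below evaluates the usual Coulomb failure stress change, ΔCFS = Δτ + μ'Δσn. The effective friction coefficient and the example stress values are generic assumptions for illustration, not numbers taken from the paper.

    ```python
    # Hedged sketch of a Coulomb failure stress calculation of the kind used in
    # volcano-earthquake interaction studies; all numbers are placeholders.
    def coulomb_stress_change(d_shear_mpa, d_normal_mpa, mu_effective=0.4):
        """delta_CFS = delta_tau + mu' * delta_sigma_n.

        d_shear_mpa:  shear stress change resolved in the slip direction (MPa)
        d_normal_mpa: normal stress change, unclamping (tension) positive (MPa)
        Positive return values bring the receiver fault closer to failure.
        """
        return d_shear_mpa + mu_effective * d_normal_mpa

    # Example: an intrusion adding 0.10 MPa of shear load while unclamping the
    # decollement by 0.05 MPa would encourage failure by ~0.12 MPa.
    print(coulomb_stress_change(0.10, 0.05))
    ```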

  3. Seismicity in the source areas of the 1896 and 1933 Sanriku earthquakes and implications for large near-trench earthquake faults

    NASA Astrophysics Data System (ADS)

    Obana, Koichiro; Nakamura, Yasuyuki; Fujie, Gou; Kodaira, Shuichi; Kaiho, Yuka; Yamamoto, Yojiro; Miura, Seiichi

    2018-03-01

    In the northern part of the Japan Trench, the 1933 Showa-Sanriku earthquake (Mw 8.4), an outer-trench, normal-faulting earthquake, occurred 37 yr after the 1896 Meiji-Sanriku tsunami earthquake (Mw 8.0), a shallow, near-trench, plate-interface rupture. Tsunamis generated by both earthquakes caused severe damage along the Sanriku coast. Precise locations of earthquakes in the source areas of the 1896 and 1933 earthquakes have not previously been obtained because they occurred at considerable distances from the coast in deep water beyond the maximum operational depth of conventional ocean bottom seismographs (OBSs). In 2015, we incorporated OBSs designed for operation in deep water (ultradeep OBSs) in an OBS array during two months of seismic observations in the source areas of the 1896 and 1933 Sanriku earthquakes to investigate the relationship of seismicity there to outer-rise normal-faulting earthquakes and near-trench tsunami earthquakes. Our analysis showed that seismicity during our observation period occurred along three roughly linear trench-parallel trends in the outer-trench region. Seismic activity along these trends likely corresponds to aftershocks of the 1933 Showa-Sanriku earthquake and the Mw 7.4 normal-faulting earthquake that occurred 40 min after the 2011 Tohoku-Oki earthquake. Furthermore, changes of the clarity of reflections from the oceanic Moho on seismic reflection profiles and low-velocity anomalies within the oceanic mantle were observed near the linear trends of the seismicity. The focal mechanisms we determined indicate that an extensional stress regime extends to about 40 km depth, below which the stress regime is compressional. These observations suggest that rupture during the 1933 Showa-Sanriku earthquake did not extend to the base of the oceanic lithosphere and that compound rupture of multiple or segmented faults is a more plausible explanation for that earthquake. The source area of the 1896 Meiji-Sanriku tsunami earthquake is

  4. Earthquakes in the New Zealand Region.

    ERIC Educational Resources Information Center

    Wallace, Cleland

    1995-01-01

    Presents a thorough overview of earthquakes in New Zealand, discussing plate tectonics, seismic measurement, and historical occurrences. Includes 10 figures illustrating such aspects as earthquake distribution, intensity, and fissures in the continental crust. Tabular data includes a list of most destructive earthquakes and descriptive effects…

  5. Napa Earthquake impact on water systems

    NASA Astrophysics Data System (ADS)

    Wang, J.

    2014-12-01

    The South Napa earthquake occurred in Napa, California, on August 24 at 3 a.m. local time, with a magnitude of 6.0. The earthquake was the largest in the San Francisco Bay Area since the 1989 Loma Prieta earthquake, and economic losses topped $1 billion. Wine makers cleaned up and estimated the damage to tourism; around 15,000 cases of cabernet were spilled into the garden at the Hess Collection. Earthquakes can raise water pollution risks and could cause a water crisis. California has suffered water shortages in recent years, so understanding how to prevent pollution of groundwater and surface water by earthquakes could be helpful. This research gives a clear view of the drinking water system in California and of pollution in river systems, as well as an estimate of the earthquake's impact on water supply. The Sacramento-San Joaquin River delta (close to Napa) is the center of the state's water distribution system, delivering fresh water to more than 25 million residents and 3 million acres of farmland. Delta water conveyed through a network of levees is crucial to Southern California. The drought has significantly curtailed water exports, and salt water intrusion has reduced fresh water outflows. Strong shaking from a nearby earthquake can cause liquefaction of saturated, loose, sandy soils and could potentially damage major delta levee systems near Napa. The Napa earthquake is a wake-up call for Southern California: a similar event could damage the freshwater supply system.

  6. Earthquake Hazard Analysis Methods: A Review

    NASA Astrophysics Data System (ADS)

    Sari, A. M.; Fakhrurrozi, A.

    2018-02-01

    Earthquakes are among the natural disasters with the most significant impacts in terms of risk and damage. Countries such as China, Japan, and Indonesia lie on actively moving continental plates and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been applied, for example analyzing seismic zones and earthquake hazard micro-zonation, using the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and using remote sensing. In applying them, it is necessary to review the effectiveness of each technique in advance. Considering the efficiency of time and the accuracy of data, remote sensing is used as a reference to assess earthquake hazard accurately and quickly, since only limited time is available for sound decision-making shortly after a disaster. Exposed areas and areas potentially vulnerable to earthquake hazards can be easily analyzed using remote sensing. Technological developments in remote sensing, such as GeoEye-1, provide added value and advantages in the use of remote sensing as one of the methods for assessing earthquake risk and damage. Furthermore, the use of this technique is expected to be considered in designing policies for disaster management in particular and can reduce the risk of natural disasters such as earthquakes in Indonesia.

  7. Acceleration spectra for subduction zone earthquakes

    USGS Publications Warehouse

    Boatwright, J.; Choy, G.L.

    1989-01-01

    We estimate the source spectra of shallow earthquakes from digital recordings of teleseismic P wave groups, that is, P+pP+sP, by making frequency-dependent corrections for the attenuation and for the interference of the free surface. The correction for the interference of the free surface assumes that the earthquake radiates energy from a range of depths. We apply this spectral analysis to a set of 12 subduction zone earthquakes which range in size from Ms = 6.2 to 8.1, obtaining corrected P wave acceleration spectra on the frequency band from 0.01 to 2.0 Hz. Seismic moment estimates from surface waves and normal modes are used to extend these P wave spectra to the frequency band from 0.001 to 0.01 Hz. The acceleration spectra of large subduction zone earthquakes, that is, earthquakes whose seismic moments are greater than 10^27 dyn cm, exhibit intermediate slopes where ü(ω) ∝ ω^(5/4) for frequencies from 0.005 to 0.05 Hz. For these earthquakes, spectral shape appears to be a discontinuous function of seismic moment. Using reasonable assumptions for the phase characteristics, we transform the spectral shape observed for large earthquakes into the time domain to fit Ekstrom's (1987) moment rate functions for the Ms=8.1 Michoacan earthquake of September 19, 1985, and the Ms=7.6 Michoacan aftershock of September 21, 1985. -from Authors
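    The abstract's key processing step is a frequency-dependent correction of the observed teleseismic spectra. A generic version of the attenuation part of such a correction, the standard exp(πft*) operator, is sketched below purely as an illustration; the t* value and the placeholder spectral shape are assumptions, and the actual operators used by the authors (including the free-surface correction) may differ.

    ```python
    import numpy as np

    # Generic t* attenuation correction for a teleseismic P-wave amplitude spectrum.
    # Illustration only: t* = 1 s and the observed spectrum are placeholder values.
    def correct_for_attenuation(freq_hz, observed_spectrum, t_star_s=1.0):
        return observed_spectrum * np.exp(np.pi * freq_hz * t_star_s)

    freqs = np.logspace(-2, np.log10(2.0), 60)        # the 0.01-2.0 Hz band of the study
    observed = 1.0 / (1.0 + (freqs / 0.05) ** 2)      # made-up spectral shape
    source_spectrum = correct_for_attenuation(freqs, observed)
    ```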

  8. Earthquake Drill using the Earthquake Early Warning System at an Elementary School

    NASA Astrophysics Data System (ADS)

    Oki, Satoko; Yazaki, Yoshiaki; Koketsu, Kazuki

    2010-05-01

    Japan frequently suffers from many kinds of disasters such as earthquakes, typhoons, floods, volcanic eruptions, and landslides. On average, we have lost about 120 people a year to natural hazards in this decade. Earthquakes are especially noteworthy, since they may kill thousands of people in a moment, as in Kobe in 1995. People know that we may have "a big one" some day as long as we live on this land, and they know what to do: retrofit houses, fasten heavy furniture to walls, add latches to kitchen cabinets, and prepare emergency packs. Yet most of them do not take action, and this results in the loss of many lives. It is only the victims who learn something from the earthquake, and the lessons have never become the lore of the nation. One of the most essential ways to reduce the damage is to educate the general public to be able to make sound decisions about what to do at the moment an earthquake hits. This requires knowledge of the background of the ongoing phenomenon. The Ministry of Education, Culture, Sports, Science and Technology (MEXT) therefore called for applications to choose several model areas in which to bring scientific education to local elementary schools. This presentation reports on a year and a half of courses that we held at the model elementary school in the Tokyo Metropolitan Area. The tectonic setting of this area is very complicated; the Pacific and Philippine Sea plates are subducting beneath the North American and Eurasian plates. The subduction of the Philippine Sea plate causes mega-thrust earthquakes such as the 1923 Kanto earthquake (M 7.9), which caused 105,000 fatalities. A magnitude 7 or greater earthquake beneath this area has recently been evaluated to have a 70% probability of occurring within 30 years. This is of immediate concern for the devastating loss of life and property because the Tokyo urban region now has a population of 42 million and is the center of approximately 40% of the nation's activities, which may cause great global

  9. Earthquake Simulator Finds Tremor Triggers

    ScienceCinema

    Johnson, Paul

    2018-01-16

    Using a novel device that simulates earthquakes in a laboratory setting, a Los Alamos researcher has found that seismic waves (the sounds radiated from earthquakes) can induce earthquake aftershocks, often long after a quake has subsided. The research provides insight into how earthquakes may be triggered and how they recur. Los Alamos researcher Paul Johnson and colleague Chris Marone at Penn State have discovered how wave energy can be stored in certain types of granular materials, like the type found along certain fault lines across the globe, and how this stored energy can suddenly be released as an earthquake when hit by relatively small seismic waves far beyond the traditional "aftershock zone" of a main quake. Perhaps most surprising, researchers have found that the release of energy can occur minutes, hours, or even days after the sound waves pass; the cause of the delay remains a tantalizing mystery.

  10. Emergency medical rescue efforts after a major earthquake: lessons from the 2008 Wenchuan earthquake.

    PubMed

    Zhang, Lulu; Liu, Xu; Li, Youping; Liu, Yuan; Liu, Zhipeng; Lin, Juncong; Shen, Ji; Tang, Xuefeng; Zhang, Yi; Liang, Wannian

    2012-03-03

    Major earthquakes often result in incalculable environmental damage, loss of life, and threats to health. Tremendous progress has been made in response to many medical challenges resulting from earthquakes. However, emergency medical rescue is complicated, and great emphasis should be placed on its organisation to achieve the best results. The 2008 Wenchuan earthquake was one of the most devastating disasters in the past 10 years and caused more than 370,000 casualties. The lessons learnt from the medical disaster relief effort and the subsequent knowledge gained about the regulation and capabilities of medical and military back-up teams should be widely disseminated. In this Review we summarise and analyse the emergency medical rescue efforts after the Wenchuan earthquake. Establishment of a national disaster medical response system, an active and effective commanding system, successful coordination between rescue forces and government agencies, effective treatment, a moderate, timely and correct public health response, and long-term psychological support are all crucial to reduce mortality and morbidity and promote overall effectiveness of rescue efforts after a major earthquake. Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. Exploring Earthquakes in Real-Time

    NASA Astrophysics Data System (ADS)

    Bravo, T. K.; Kafka, A. L.; Coleman, B.; Taber, J. J.

    2013-12-01

    Earthquakes capture the attention of students and inspire them to explore the Earth. Adding the ability to view and explore recordings of significant and newsworthy earthquakes in real-time makes the subject even more compelling. To address this opportunity, the Incorporated Research Institutions for Seismology (IRIS), in collaboration with Moravian College, developed 'jAmaSeis', a cross-platform application that enables students to access real-time earthquake waveform data. Students can watch as the seismic waves are recorded on their computer, and can be among the first to analyze the data from an earthquake. jAmaSeis facilitates student-centered investigations of seismological concepts using either a low-cost educational seismograph or streamed data from other educational seismographs or from any seismic station that sends data to the IRIS Data Management System. After an earthquake, students can analyze the seismograms to determine characteristics of earthquakes such as time of occurrence, distance from the epicenter to the station, magnitude, and location. The software has been designed to provide graphical clues to guide students in the analysis and assist in their interpretations. Since jAmaSeis can simultaneously record up to three stations from anywhere on the planet, there are numerous opportunities for student-driven investigations. For example, students can explore differences in the seismograms from different distances from an earthquake and compare waveforms from different azimuthal directions. Students can simultaneously monitor seismicity at a tectonic plate boundary and in the middle of the plate regardless of their school location. This can help students discover for themselves the ideas underlying seismic wave propagation, regional earthquake hazards, magnitude-frequency relationships, and the details of plate tectonics. The real-time nature of the data keeps the investigations dynamic, and offers students countless opportunities to explore.
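    One of the student exercises mentioned above, estimating the distance from the epicenter to the station, usually reduces to the S-minus-P travel-time difference. The sketch below shows that back-of-the-envelope calculation; the velocities are generic crustal averages assumed for illustration and are not prescribed by jAmaSeis.

    ```python
    # Rough epicentral distance from the S-P arrival-time difference.
    # Assumes straight-line travel at average crustal velocities (illustrative only).
    def distance_from_sp_time(sp_seconds, vp_km_s=6.0, vs_km_s=3.5):
        # d/vs - d/vp = sp  =>  d = sp * vp * vs / (vp - vs)
        return sp_seconds * vp_km_s * vs_km_s / (vp_km_s - vs_km_s)

    print(round(distance_from_sp_time(10.0), 1))   # ~84 km for a 10 s S-P time
    ```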

  12. Spatial Evaluation and Verification of Earthquake Simulators

    NASA Astrophysics Data System (ADS)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of the fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against currently observed earthquake seismicity is necessary, and following past simulator and forecast model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock sequence (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed M>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
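    A minimal sketch of the kind of power-law spatial smoothing described above, in which each simulated event's seismicity is spread over a grid with a rate that decays with epicentral distance. The kernel exponent, smoothing distance, and grid are assumptions for illustration, not the values used in the paper.

    ```python
    import numpy as np

    # Spread one simulated event over a grid with a power-law decay in distance.
    # All parameters (d0, q, grid spacing) are illustrative assumptions.
    def smoothed_rate_map(grid_xy_km, epicenter_xy_km, d0_km=5.0, q=1.5):
        r = np.linalg.norm(grid_xy_km - epicenter_xy_km, axis=1)
        kernel = (r + d0_km) ** (-q)
        return kernel / kernel.sum()      # normalize so the event contributes unit rate

    grid = np.array([(x, y) for x in range(0, 200, 10) for y in range(0, 200, 10)], float)
    rates = smoothed_rate_map(grid, np.array([100.0, 100.0]))
    print(rates.max(), rates.sum())       # peak cell rate and total (== 1.0)
    ```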

  13. Earthquakes March-April 1992

    USGS Publications Warehouse

    Person, Waverly J.

    1992-01-01

    The months of March and April were quite active seismically speaking. There was one major earthquake (magnitude 7.0-7.9). Earthquake-related deaths were reported in Iran, Costa Rica, Turkey, and Germany.

  14. Children's emotional experience two years after an earthquake: An exploration of knowledge of earthquakes and associated emotions

    PubMed Central

    Burro, Roberto; Hall, Rob

    2017-01-01

    A major earthquake has a potentially highly traumatic impact on children's psychological functioning. However, while many studies on children describe negative consequences in terms of mental health and psychiatric disorders, little is known regarding how the developmental processes of emotions can be affected following exposure to disasters. Objectives: We explored whether and how exposure to a natural disaster such as the 2012 Emilia Romagna earthquake affected the development of children's emotional competence in terms of understanding, regulating, and expressing emotions, after two years, when compared with a control group not exposed to the earthquake. We also examined the role of class level and gender. Method: The sample included two groups of children (n = 127) attending primary school: the experimental group (n = 65) experienced the 2012 Emilia Romagna earthquake, while the control group (n = 62) did not. The data collection took place two years after the earthquake, when the children were seven or ten years old. Beyond assessing the children's understanding of emotions and regulating abilities with standardized instruments, we employed semi-structured interviews to explore their knowledge of earthquakes and associated emotions, and a structured task on the intensity of some target emotions. Results: We applied Generalized Linear Mixed Models. Exposure to the earthquake did not influence the understanding and regulation of emotions. The understanding of emotions varied according to class level and gender. Knowledge of earthquakes, emotional language, and emotions associated with earthquakes were, respectively, more complex, frequent, and intense for children who had experienced the earthquake, and at increasing ages. Conclusions: Our data extend the generalizability of theoretical models on children's psychological functioning following disasters, such as the dose-response model and the organizational-developmental model for child resilience, and

  15. The global distribution of magnitude 9 earthquakes

    NASA Astrophysics Data System (ADS)

    McCaffrey, R.

    2011-12-01

    The 2011 Tohoku M9 earthquake once again caught some in the earthquake community by surprise. The expectation of these massive quakes has been driven in the past by the over-reliance on our short, incomplete history of earthquakes and causal relationships derived from it. The logic applied is that if a great earthquake has not happened in the past, that we know of, one cannot happen in the future. Using the ~100-year global earthquake history, seismologists have promoted relationships between maximum earthquake sizes and other properties of subduction zones, leading to the notion that some subduction zones, like the Japan Trench, would never produce a magnitude ~9 event. The 2004 Andaman Mw = 9.2 earthquake, that occurred where there is slow subduction of old crust and a history of only moderate-sized earthquakes, seriously undermined such ideas. Given multi-century return times of the greatest earthquakes, ignorance of those return times and our very limited observation span, I suggest that we cannot yet make such determinations. Alternatively, using the length of a subduction zone that is available for slip as the predominant factor in determining maximum earthquake size, we cannot rule out that any subduction zone of a few hundred kilometers or more in length may be capable of producing a magnitude 9 or larger earthquake. Based on this method, the expected maximum size for the Japan Trench was 9.0 (McCaffrey, Geology, p. 263, 2008). The same approach portends a M > 9 for Java, with twice the population density as Honshu and much lower building standards. The Java Trench, and others where old crust subducts (Hikurangi, Marianas, Tonga, Kermadec), require increased awareness of the possibility for a great earthquake.

  16. Children's emotional experience two years after an earthquake: An exploration of knowledge of earthquakes and associated emotions.

    PubMed

    Raccanello, Daniela; Burro, Roberto; Hall, Rob

    2017-01-01

    We explored whether and how exposure to a natural disaster such as the 2012 Emilia Romagna earthquake affected the development of children's emotional competence in terms of understanding, regulating, and expressing emotions, after two years, when compared with a control group not exposed to the earthquake. We also examined the role of class level and gender. The sample included two groups of children (n = 127) attending primary school: the experimental group (n = 65) experienced the 2012 Emilia Romagna earthquake, while the control group (n = 62) did not. The data collection took place two years after the earthquake, when the children were seven or ten years old. Beyond assessing the children's understanding of emotions and regulating abilities with standardized instruments, we employed semi-structured interviews to explore their knowledge of earthquakes and associated emotions, and a structured task on the intensity of some target emotions. We applied Generalized Linear Mixed Models. Exposure to the earthquake did not influence the understanding and regulation of emotions. The understanding of emotions varied according to class level and gender. Knowledge of earthquakes, emotional language, and emotions associated with earthquakes were, respectively, more complex, frequent, and intense for children who had experienced the earthquake, and at increasing ages. Our data extend the generalizability of theoretical models on children's psychological functioning following disasters, such as the dose-response model and the organizational-developmental model for child resilience, and provide further knowledge on children's emotional resources related to natural disasters, as a basis for planning educational prevention programs.

  17. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 13 Business Credit and Assistance 1 2013-01-01 2013-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  18. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 13 Business Credit and Assistance 1 2014-01-01 2014-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  19. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  20. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  1. Earthquake magnitude estimation using the τ c and P d method for earthquake early warning systems

    NASA Astrophysics Data System (ADS)

    Jin, Xing; Zhang, Hongcai; Li, Jun; Wei, Yongxiang; Ma, Qiang

    2013-10-01

    Earthquake early warning (EEW) systems are one of the most effective ways to reduce earthquake disasters. Earthquake magnitude estimation is one of the most important and also the most difficult parts of an EEW system. In this paper, based on 142 earthquake events and 253 seismic records recorded by KiK-net in Japan and on aftershocks of the large Wenchuan earthquake in Sichuan, we obtained earthquake magnitude estimation relationships using the τc and Pd methods. The standard deviations of the magnitude estimates from these two formulas are ±0.65 and ±0.56, respectively. The Pd value can also be used to estimate the peak ground velocity, so that warning information can be released to the public rapidly according to the estimation results. In order to ensure the stability and reliability of the magnitude estimates, we propose a compatibility test based on the natures of these two parameters. The reliability of the early warning information is significantly improved through this test.
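    For readers unfamiliar with the two parameters, the sketch below computes τc and Pd from the first few seconds of a P-wave displacement record. The 3 s window is the conventional choice in the EEW literature, the synthetic waveform is a placeholder, and the regression coefficients that map these parameters to magnitude (fitted in the paper) are deliberately not reproduced.

    ```python
    import numpy as np

    # tau_c and Pd from a short P-wave window (illustrative synthetic data).
    def tau_c(displacement, velocity):
        # tau_c = 2*pi / sqrt( integral(v^2) / integral(u^2) ); the dt factors cancel.
        ratio = np.sum(velocity ** 2) / np.sum(displacement ** 2)
        return 2.0 * np.pi / np.sqrt(ratio)

    def peak_displacement(displacement):
        return float(np.max(np.abs(displacement)))

    dt = 0.01
    t = np.arange(0.0, 3.0, dt)                      # 3 s window after the P arrival
    disp = 1e-4 * np.sin(2.0 * np.pi * 1.0 * t)      # synthetic displacement trace (m)
    vel = np.gradient(disp, dt)                      # velocity by numerical differentiation

    print(tau_c(disp, vel))          # ~1 s for this 1 Hz synthetic signal
    print(peak_displacement(disp))   # Pd in metres
    ```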

  2. Earthquakes at North Atlantic passive margins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gregersen, S.; Basham, P.W.

    1989-01-01

    The main focus of this volume is the earthquakes that occur at and near the continental margins on both sides of the North Atlantic. The book, which contains the proceedings of the NATO workshop on Causes and Effects of Earthquakes at Passive Margins and in Areas of Postglacial Rebound on Both Sides of the North Atlantic, draws together the fields of geophysics, geology and geodesy to address the stress and strain in the Earth's crust. The resulting earthquakes produced on ancient geological fault zones and the associated seismic hazards these pose to man are also addressed. Postglacial rebound in North America and Fennoscandia is a minor source of earthquakes today, during the interglacial period, but evidence is presented to suggest that the ice sheets suppressed earthquake strain while they were in place, and released this strain as a pulse of significant earthquakes after the ice melted about 9000 years ago.

  3. Widespread Triggering of Earthquakes in the Central US by the 2011 M9.0 Tohoku-Oki Earthquake

    NASA Astrophysics Data System (ADS)

    Rubinstein, J. L.; Savage, H. M.

    2011-12-01

    The strong shaking of the 2011 M9.0 off-Tohoku earthquake triggered tectonic tremor and earthquakes in many locations around the world. We analyze broadband records from the USArray to identify triggered seismicity in more than 10 different locations in the Central United States. We identify triggered events in many states including Kansas, Nebraska, Arkansas, Minnesota, and Iowa. The locally triggered earthquakes are obscured in broadband records by the Tohoku-Oki mainshock but can be revealed with high-pass filtering. With the exception of one location (central Arkansas), the triggered seismicity occurred in regions that are seismically quiet. The coincidence of this seismicity with the Tohoku-Oki event suggests that these earthquakes were triggered. The triggered seismicity in Arkansas occurred in a region where there has been an active swarm of seismicity since August 2010. There are two lines of evidence indicating that the seismicity in Arkansas was triggered rather than simply part of the swarm: (1) we observe two earthquakes that initiate coincident with the arrivals of the shear wave and the Love wave; (2) the seismicity rate increased dramatically following the Tohoku-Oki mainshock. Our observations of widespread earthquake triggering in regions thought to be seismically quiet remind us that earthquakes can occur in almost any location. Studying additional teleseismic events has the potential to reveal regions with a propensity for earthquake triggering.
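    The key processing step mentioned above, revealing small local events hidden beneath the long-period mainshock wavefield, is typically a simple high-pass filter. A generic Butterworth high-pass is sketched below; the 5 Hz corner, 100 Hz sampling rate, and synthetic trace are assumptions for illustration, not the authors' actual parameters.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    # Zero-phase Butterworth high-pass to suppress long-period teleseismic energy.
    def highpass(trace, sampling_rate_hz, corner_hz=5.0, order=4):
        b, a = butter(order, corner_hz / (0.5 * sampling_rate_hz), btype="highpass")
        return filtfilt(b, a, trace)

    fs = 100.0
    t = np.arange(0.0, 60.0, 1.0 / fs)
    # Synthetic record: long-period "mainshock" energy plus weak high-frequency noise.
    trace = np.sin(2 * np.pi * 0.05 * t) + 0.02 * np.random.randn(t.size)
    filtered = highpass(trace, fs)
    ```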

  4. Update earthquake risk assessment in Cairo, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2017-07-01

    The Cairo earthquake (12 October 1992; mb = 5.8) remains, 25 years on, one of the most painful events etched in Egyptians' memory. This is not due to the strength of the earthquake but to the accompanying losses and damage (561 dead, 10,000 injured, and 3,000 families left homeless). Nowadays, the most frequent and important question is "what if this earthquake were repeated today?" In this study, we simulate the ground motion shaking of an earthquake of the same size as the 12 October 1992 event and the consequent socio-economic impacts in terms of losses and damage. Seismic hazard, earthquake catalogs, soil types, demographics, and building inventories were integrated into HAZUS-MH to produce a sound earthquake risk assessment for Cairo, including economic and social losses. Generally, the earthquake risk assessment clearly indicates that the losses and damage may be two to three times greater in Cairo than in the 1992 earthquake. The earthquake risk profile reveals that five districts (Al-Sahel, El Basateen, Dar El-Salam, Gharb, and Madinat Nasr sharq) lie at high seismic risk, and three districts (Manshiyat Naser, El-Waily, and Wassat (center)) are at a low seismic risk level. Moreover, the building damage estimates indicate that Gharb is the most vulnerable district. The analysis shows that the Cairo urban area faces high risk. Deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risks. For instance, more than 90% of the estimated building damage is concentrated within the most densely populated districts (El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Gharb). Moreover, about 75% of casualties are in the same districts. An earthquake risk assessment for Cairo thus represents a crucial application of the HAZUS earthquake loss estimation model for risk management. Finally, for mitigation, risk reduction, and to improve the seismic performance of structures and assure life safety

  5. Revisiting the 1872 Owens Valley, California, Earthquake

    USGS Publications Warehouse

    Hough, S.E.; Hutton, K.

    2008-01-01

    The 26 March 1872 Owens Valley earthquake is among the largest historical earthquakes in California. The felt area and maximum fault displacements have long been regarded as comparable to, if not greater than, those of the great San Andreas fault earthquakes of 1857 and 1906, but mapped surface ruptures of the latter two events were 2-3 times longer than that inferred for the 1872 rupture. The preferred magnitude estimate of the Owens Valley earthquake has thus been 7.4, based largely on the geological evidence. Reinterpreting macroseismic accounts of the Owens Valley earthquake, we infer generally lower intensity values than those estimated in earlier studies. Nonetheless, as recognized in the early twentieth century, the effects of this earthquake were still generally more dramatic at regional distances than the macroseismic effects from the 1906 earthquake, with light damage to masonry buildings at (nearest-fault) distances as large as 400 km. Macroseismic observations thus suggest a magnitude greater than that of the 1906 San Francisco earthquake, which appears to be at odds with geological observations. However, while the mapped rupture length of the Owens Valley earthquake is relatively low, the average slip was high. The surface rupture was also complex and extended over multiple fault segments. It was first mapped in detail over a century after the earthquake occurred, and recent evidence suggests it might have been longer than earlier studies indicated. Our preferred magnitude estimate is Mw 7.8-7.9, values that we show are consistent with the geological observations. The results of our study suggest that either the Owens Valley earthquake was larger than the 1906 San Francisco earthquake or that, by virtue of source properties and/or propagation effects, it produced systematically higher ground motions at regional distances. The latter possibility implies that some large earthquakes in California will generate significantly larger ground motions than San

  6. The effects of earthquake measurement concepts and magnitude anchoring on individuals' perceptions of earthquake risk

    USGS Publications Warehouse

    Celsi, R.; Wolfinbarger, M.; Wald, D.

    2005-01-01

    The purpose of this research is to explore earthquake risk perceptions in California. Specifically, we examine the risk beliefs, feelings, and experiences of lay, professional, and expert individuals to explore how risk is perceived and how risk perceptions are formed relative to earthquakes. Our results indicate that individuals tend to perceptually underestimate the degree to which earthquake (EQ) events may affect them. This occurs in large part because individuals' personal felt experience of EQ events is generally overestimated relative to the experienced magnitudes. An important finding is that individuals engage in a process of "cognitive anchoring" of their felt EQ experience towards the reported earthquake magnitude size. The anchoring effect is moderated by the degree to which individuals comprehend EQ magnitude measurement and EQ attenuation. Overall, the results of this research provide us with a deeper understanding of EQ risk perceptions, especially as they relate to individuals' understanding of EQ measurement and attenuation concepts. © 2005, Earthquake Engineering Research Institute.

  7. Earthquake triggering at alaskan volcanoes following the 3 November 2002 denali fault earthquake

    USGS Publications Warehouse

    Moran, S.C.; Power, J.A.; Stihler, S.D.; Sanchez, J.J.; Caplan-Auerbach, J.

    2004-01-01

    The 3 November 2002 Mw 7.9 Denali fault earthquake provided an excellent opportunity to investigate triggered earthquakes at Alaskan volcanoes. The Alaska Volcano Observatory operates short-period seismic networks on 24 historically active volcanoes in Alaska, 247-2159 km distant from the mainshock epicenter. We searched for evidence of triggered seismicity by examining the unfiltered waveforms for all stations in each volcano network for ~1 hr after the Mw 7.9 arrival time at each network and for significant increases in located earthquakes in the hours after the mainshock. We found compelling evidence for triggering only at the Katmai volcanic cluster (KVC, 720-755 km southwest of the epicenter), where small earthquakes with distinct P and S arrivals appeared within the mainshock coda at one station and a small increase in located earthquakes occurred for several hours after the mainshock. Peak dynamic stresses of ~0.1 MPa at Augustine Volcano (560 km southwest of the epicenter) are significantly lower than those recorded in Yellowstone and Utah (>3000 km southeast of the epicenter), suggesting that strong directivity effects were at least partly responsible for the lack of triggering at Alaskan volcanoes. We describe other incidents of earthquake-induced triggering in the KVC, and outline a qualitative magnitude/distance-dependent triggering threshold. We argue that triggering results from the perturbation of magmatic-hydrothermal systems in the KVC and suggest that the comparative lack of triggering at other Alaskan volcanoes could be a result of differences in the nature of their magmatic-hydrothermal systems.
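    Figures like the ~0.1 MPa peak dynamic stress quoted above usually come from the plane-wave approximation σ ≈ G·PGV/Vs. The sketch below applies that rule of thumb with typical crustal values for the shear modulus and shear-wave speed; these constants and the example velocity are generic assumptions, not numbers taken from the paper.

    ```python
    # Plane-wave rule of thumb: peak dynamic stress ~ shear modulus * PGV / Vs.
    # Constants are typical crustal values, used here only for illustration.
    def peak_dynamic_stress_mpa(pgv_m_s, shear_modulus_pa=3.0e10, vs_m_s=3500.0):
        return shear_modulus_pa * pgv_m_s / vs_m_s / 1.0e6

    # Roughly 1 cm/s of peak ground velocity corresponds to ~0.09 MPa of dynamic stress.
    print(round(peak_dynamic_stress_mpa(0.01), 3))
    ```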

  8. Understanding Earthquake Hazard & Disaster in Himalaya - A Perspective on Earthquake Forecast in Himalayan Region of South Central Tibet

    NASA Astrophysics Data System (ADS)

    Shanker, D.; Paudyal; Singh, H.

    2010-12-01

    Not only the basic understanding of the earthquake phenomenon and the resistance offered by designed structures, but also the understanding of socio-economic factors, the engineering properties of indigenous materials, local skills, and technology transfer models are of vital importance. It is important that the engineering aspects of mitigation be made part of public policy documents. Earthquakes, therefore, are and have been thought of as one of the worst enemies of mankind. Due to the very nature of the energy release, damage is evident, which, however, will not culminate in a disaster unless it strikes a populated area. The word mitigation may be defined as the reduction in severity of something. Earthquake disaster mitigation, therefore, implies measures that help reduce the severity of damage caused by earthquakes to life, property and environment. While "earthquake disaster mitigation" usually refers primarily to interventions to strengthen the built environment, "earthquake protection" is now considered to include the human, social and administrative aspects of reducing earthquake effects. It should, however, be noted that reduction of earthquake hazards through prediction is considered to be one of the effective measures, and much effort is spent on prediction strategies. Earthquake prediction does not guarantee safety, and even if an earthquake is predicted correctly, the potential damage to life and property on such a large scale warrants the use of other aspects of mitigation. While earthquake prediction may be of some help, mitigation remains the main focus of attention of civil society. The present study suggests that anomalous seismic activity/earthquake swarms existed prior to medium-size earthquakes in the Nepal Himalaya. The mainshocks were preceded by a quiescence period, which is an indication of the occurrence of future seismic activity. In all the cases, the identified episodes of anomalous seismic activity were

  9. Thermal Infrared Anomalies of Several Strong Earthquakes

    PubMed Central

    Wei, Congxin; Guo, Xiao; Qin, Manzhong

    2013-01-01

    In the history of earthquake thermal infrared research, it is undeniable that significant thermal infrared anomalies occur before and after strong earthquakes, and these have been interpreted as preseismic precursors for earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after 8 great earthquakes with magnitudes of Ms 7.0 and above, using satellite infrared remote sensing information. We used new types of data and a new method to extract the useful anomaly information. Based on the analyses of the 8 earthquakes, we obtained the following results. (1) There are significant thermal radiation anomalies before and after the earthquakes in all cases. The overall behavior of the anomalies includes two main stages: expanding first and narrowing later. We easily extracted and identified such seismic anomalies by the method of "time-frequency relative power spectrum." (2) There exist evident and distinct characteristic periods and magnitudes of anomalous thermal radiation for each case. (3) Thermal radiation anomalies are closely related to the geological structure. (4) Thermal radiation has obvious characteristics in anomaly duration, range, and morphology. In summary, earthquake thermal infrared anomalies are a useful precursor that can be used in earthquake prediction and forecasting. PMID:24222728

  10. Thermal infrared anomalies of several strong earthquakes.

    PubMed

    Wei, Congxin; Zhang, Yuansheng; Guo, Xiao; Hui, Shaoxing; Qin, Manzhong; Zhang, Ying

    2013-01-01

    In the history of earthquake thermal infrared research, it is undeniable that significant thermal infrared anomalies occur before and after strong earthquakes, and these have been interpreted as preseismic precursors for earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after 8 great earthquakes with magnitudes of Ms 7.0 and above, using satellite infrared remote sensing information. We used new types of data and a new method to extract the useful anomaly information. Based on the analyses of the 8 earthquakes, we obtained the following results. (1) There are significant thermal radiation anomalies before and after the earthquakes in all cases. The overall behavior of the anomalies includes two main stages: expanding first and narrowing later. We easily extracted and identified such seismic anomalies by the method of "time-frequency relative power spectrum." (2) There exist evident and distinct characteristic periods and magnitudes of anomalous thermal radiation for each case. (3) Thermal radiation anomalies are closely related to the geological structure. (4) Thermal radiation has obvious characteristics in anomaly duration, range, and morphology. In summary, earthquake thermal infrared anomalies are a useful precursor that can be used in earthquake prediction and forecasting.

  11. Pre-Earthquake Unipolar Electromagnetic Pulses

    NASA Astrophysics Data System (ADS)

    Scoville, J.; Freund, F.

    2013-12-01

    Transient ultralow frequency (ULF) electromagnetic (EM) emissions have been reported to occur before earthquakes [1,2]. They suggest powerful transient electric currents flowing deep in the crust [3,4]. Prior to the M=5.4 Alum Rock earthquake of Oct. 21, 2007 in California a QuakeFinder triaxial search-coil magnetometer located about 2 km from the epicenter recorded unusual unipolar pulses with the approximate shape of a half-cycle of a sine wave, reaching amplitudes up to 30 nT. The number of these unipolar pulses increased as the day of the earthquake approached. These pulses clearly originated around the hypocenter. The same pulses have since been recorded prior to several medium to moderate earthquakes in Peru, where they have been used to triangulate the location of the impending earthquakes [5]. To understand the mechanism of the unipolar pulses, we first have to address the question how single current pulses can be generated deep in the Earth's crust. Key to this question appears to be the break-up of peroxy defects in the rocks in the hypocenter as a result of the increase in tectonic stresses prior to an earthquake. We investigate the mechanism of the unipolar pulses by coupling the drift-diffusion model of semiconductor theory to Maxwell's equations, thereby producing a model describing the rock volume that generates the pulses in terms of electromagnetism and semiconductor physics. The system of equations is then solved numerically to explore the electromagnetic radiation associated with drift-diffusion currents of electron-hole pairs. [1] Sharma, A. K., P. A. V., and R. N. Haridas (2011), Investigation of ULF magnetic anomaly before moderate earthquakes, Exploration Geophysics 43, 36-46. [2] Hayakawa, M., Y. Hobara, K. Ohta, and K. Hattori (2011), The ultra-low-frequency magnetic disturbances associated with earthquakes, Earthquake Science, 24, 523-534. [3] Bortnik, J., T. E. Bleier, C. Dunson, and F. Freund (2010), Estimating the seismotelluric current

  12. Earthquake Warning Performance in Vallejo for the South Napa Earthquake

    NASA Astrophysics Data System (ADS)

    Wurman, G.; Price, M.

    2014-12-01

    In 2002 and 2003, Seismic Warning Systems, Inc. installed first-generation QuakeGuard™ earthquake warning devices at all eight fire stations in Vallejo, CA. These devices are designed to detect the P-wave of an earthquake and initiate predetermined protective actions if the impending shaking is estimated at approximately Modified Mercalli Intensity V or greater. At the Vallejo fire stations the devices were set up to sound an audio alert over the public address system and to command the equipment bay doors to open. In August 2014, after more than 11 years of operating in the fire stations with no false alarms, the five units that were still in use triggered correctly on the Mw 6.0 South Napa earthquake, less than 16 km away. The audio alert sounded in all five stations, providing fire fighters with 1.5 to 2.5 seconds of warning before the arrival of the S-wave, and the equipment bay doors opened in three of the stations. In one station the doors were disconnected from the QuakeGuard device, and another station lost power before the doors opened completely. These problems highlight just a small portion of the complexity associated with realizing actionable earthquake warnings. The issues experienced in this earthquake have already been addressed in subsequent QuakeGuard product generations, with downstream connection monitoring and backup power for critical systems. The fact that the fire fighters in Vallejo were afforded even two seconds of warning at these epicentral distances results from the design of the QuakeGuard devices, which focuses on rapid false positive rejection and ground motion estimates. We discuss the performance of the ground motion estimation algorithms, with an emphasis on the accuracy and timeliness of the estimates at close epicentral distances.

  13. Earthquake Shaking - Finding the "Hot Spots"

    USGS Publications Warehouse

    Field, Edward; Jones, Lucile; Jordan, Tom; Benthien, Mark; Wald, Lisa

    2001-01-01

    A new Southern California Earthquake Center study has quantified how local geologic conditions affect the shaking experienced in an earthquake. The important geologic factors at a site are softness of the rock or soil near the surface and thickness of the sediments above hard bedrock. Even when these 'site effects' are taken into account, however, each earthquake exhibits unique 'hotspots' of anomalously strong shaking. Better predictions of strong ground shaking will therefore require additional geologic data and more comprehensive computer simulations of individual earthquakes.

  14. Earth's rotation variations and earthquakes 2010-2011

    NASA Astrophysics Data System (ADS)

    Ostřihanský, L.

    2012-01-01

    In contrast to unsuccessful searching (lasting over 150 years) for a correlation of earthquakes with biweekly tides, the author found a correlation of earthquakes with the sidereal 13.66-day variation of the Earth's rotation, expressed as length of day (LOD) and measured daily by the International Earth Rotation Service (IERS). After brief mention of the M 7.9 Denali Fault, Alaska, earthquake of 3 November 2002, triggered at a LOD maximum, and the M 9.1 Great Sumatra earthquake of 26 December 2004, triggered at a LOD minimum and the full Moon, the main subject of this paper is the earthquakes of the period 2010-June 2011: the M 7.0 Haiti earthquake (12 January 2010) at a LOD minimum; the M 8.8 Maule, Chile earthquake (27 February 2010) at a LOD maximum; a map constructed for the Indian plate revealing 6 of 7 earthquakes at LOD minima in the Sumatra and Andaman Sea region; the M 7.1 New Zealand (Christchurch) earthquake of 4 September 2010 at a LOD minimum and the M 6.3 Christchurch earthquake of 21 February 2011 at a LOD maximum; and the M 9.1 Japan earthquake near the coast of Honshu on 11 March 2011 at a LOD minimum. It was found that LOD minima coincide with the full or new Moon only twice a year, at the solstices. To prove that the coincidences of earthquakes and LOD extremes stated above are not accidental, histograms were constructed of earthquake occurrences and their positions on the LOD graph far into the past, in some cases back to 1962, when the IERS began to measure the Earth's rotation variations. Evaluation of the histograms and of Schuster's test has shown that the majority of earthquakes are triggered during both deceleration and acceleration of the Earth's rotation. Because evident movements of the lithosphere, measured among other means by GPS, occur during these coincidences, it is concluded that the Earth's rotation variations contribute effectively to the movement of the lithospheric plates. A retrospective overview of past earthquakes revealed that the Great Sumatra earthquake of 26 December 2004 had its equivalent in the shape of the LOD graph, the full Moon position, and the character of aftershocks
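    Schuster's test, mentioned above, asks whether event times cluster at a particular phase of a periodic cycle (here the 13.66-day LOD variation). A minimal sketch is given below; the phases are synthetic placeholders, whereas in the study they would be derived from actual earthquake origin times and the LOD record.

    ```python
    import numpy as np

    # Schuster's test: p = exp(-R^2 / N), where R is the length of the vector sum
    # of unit phasors at the event phases. Small p means the phases are unlikely
    # to be uniformly random, i.e. the events cluster at some phase of the cycle.
    def schuster_p(phases_rad):
        n = len(phases_rad)
        r_squared = np.sum(np.cos(phases_rad)) ** 2 + np.sum(np.sin(phases_rad)) ** 2
        return float(np.exp(-r_squared / n))

    rng = np.random.default_rng(seed=0)
    print(schuster_p(rng.uniform(0.0, 2.0 * np.pi, 200)))            # uniform phases: p of order 0.1-1
    print(schuster_p(rng.normal(np.pi, 0.3, 200) % (2.0 * np.pi)))   # clustered phases: p << 1
    ```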

  15. Pre-earthquake magnetic pulses

    NASA Astrophysics Data System (ADS)

    Scoville, J.; Heraud, J.; Freund, F.

    2015-08-01

    A semiconductor model of rocks is shown to describe unipolar magnetic pulses, a phenomenon that has been observed prior to earthquakes. These pulses are suspected to be generated deep in the Earth's crust, in and around the hypocentral volume, days or even weeks before earthquakes. Their extremely long wavelength allows them to pass through kilometers of rock. Interestingly, when the sources of these pulses are triangulated, the locations coincide with the epicenters of future earthquakes. We couple a drift-diffusion semiconductor model to a magnetic field in order to describe the electromagnetic effects associated with electrical currents flowing within rocks. The resulting system of equations is solved numerically and it is seen that a volume of rock may act as a diode that produces transient currents when it switches bias. These unidirectional currents are expected to produce transient unipolar magnetic pulses similar in form, amplitude, and duration to those observed before earthquakes, and this suggests that the pulses could be the result of geophysical semiconductor processes.
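    A minimal, one-dimensional sketch of the drift-diffusion current density that underlies the semiconductor rock model described above. The mobilities, diffusivities, carrier profiles, and electric field are placeholder values, and the coupling of this current to Maxwell's equations (the step that actually produces the magnetic pulse) is not reproduced here.

    ```python
    import numpy as np

    # 1-D drift-diffusion current density for an electron-hole plasma:
    #   J = q*(mu_n*n + mu_p*p)*E + q*(D_n*dn/dx - D_p*dp/dx)
    # All parameter values below are illustrative placeholders.
    Q = 1.602e-19  # elementary charge (C)

    def drift_diffusion_current(n, p, E, dx, mu_n=1e-6, mu_p=1e-6, D_n=2.5e-8, D_p=2.5e-8):
        dn_dx = np.gradient(n, dx)
        dp_dx = np.gradient(p, dx)
        drift = Q * (mu_n * n + mu_p * p) * E
        diffusion = Q * (D_n * dn_dx - D_p * dp_dx)
        return drift + diffusion

    x = np.linspace(0.0, 1000.0, 201)                    # metres
    holes = 1e15 * np.exp(-((x - 500.0) / 100.0) ** 2)   # pulse of positive charge carriers
    electrons = np.full_like(x, 1e12)                    # uniform background electrons
    J = drift_diffusion_current(electrons, holes, E=1e-3, dx=x[1] - x[0])
    ```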

  16. Automatic Earthquake Detection by Active Learning

    NASA Astrophysics Data System (ADS)

    Bergen, K.; Beroza, G. C.

    2017-12-01

    In recent years, advances in machine learning have transformed fields such as image recognition, natural language processing and recommender systems. Many of these performance gains have relied on the availability of large, labeled data sets to train high-accuracy models; labeled data sets are those for which each sample includes a target class label, such as waveforms tagged as either earthquakes or noise. Earthquake seismologists are increasingly leveraging machine learning and data mining techniques to detect and analyze weak earthquake signals in large seismic data sets. One of the challenges in applying machine learning to seismic data sets is the limited labeled data problem; learning algorithms need to be given examples of earthquake waveforms, but the number of known events, taken from earthquake catalogs, may be insufficient to build an accurate detector. Furthermore, earthquake catalogs are known to be incomplete, resulting in training data that may be biased towards larger events and contain inaccurate labels. This challenge is compounded by the class imbalance problem; the events of interest, earthquakes, are infrequent relative to noise in continuous data sets, and many learning algorithms perform poorly on rare classes. In this work, we investigate the use of active learning for automatic earthquake detection. Active learning is a type of semi-supervised machine learning that uses a human-in-the-loop approach to strategically supplement a small initial training set. The learning algorithm incorporates domain expertise through interaction between a human expert and the algorithm, with the algorithm actively posing queries to the user to improve detection performance. We demonstrate the potential of active machine learning to improve earthquake detection performance with limited available training data.
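    The sketch below shows generic pool-based uncertainty sampling, the simplest form of the active learning loop the abstract describes. The synthetic features stand in for waveform-derived features, the "expert" is simulated by an oracle label array, and the classifier choice is an arbitrary stand-in; none of this reproduces the authors' actual detector.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Pool-based active learning with uncertainty sampling (illustrative only).
    rng = np.random.default_rng(seed=0)
    X_pool = rng.normal(size=(2000, 5))                                # stand-in features
    y_pool = (X_pool[:, 0] + 0.5 * X_pool[:, 1] > 1.5).astype(int)     # rare "earthquake" class

    # Small seed set containing both classes (mimicking a sparse catalog).
    labeled = list(np.where(y_pool == 1)[0][:10]) + list(np.where(y_pool == 0)[0][:10])

    for _ in range(10):                                                # query rounds
        clf = LogisticRegression(max_iter=1000).fit(X_pool[labeled], y_pool[labeled])
        proba = clf.predict_proba(X_pool)[:, 1]
        uncertainty = np.abs(proba - 0.5)                              # 0 = most uncertain
        uncertainty[labeled] = np.inf                                  # skip already-labeled samples
        query = int(np.argmin(uncertainty))                            # ask the "expert" for this one
        labeled.append(query)                                          # oracle label comes from y_pool
    ```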

  17. Urban Earthquakes - Reducing Building Collapse Through Education

    NASA Astrophysics Data System (ADS)

    Bilham, R.

    2004-12-01

    Fatalities from earthquakes rose from about 6,000 to 9,000 per year in the past decade, yet the ratio of earthquake fatalities to instantaneous population continues to fall. Since 1950 the ratio has declined worldwide by a factor of three, but in some countries it has changed little. In Iran, for example, 1 in 3,000 people can expect to die in an earthquake, a rate that has not changed significantly since 1890. Fatalities from earthquakes remain high in those countries that have traditionally suffered frequent large earthquakes (Turkey, Iran, Japan, and China), suggesting that the exposure time of recently increased urban populations in other countries may be too short to have interacted with earthquakes that have long recurrence intervals. This, in turn, suggests that disasters of unprecedented size (more than 1 million fatalities) will occur when future large earthquakes strike close to megacities. However, population growth is most rapid in cities of fewer than 1 million people in developing nations, where the financial ability to implement earthquake-resistant construction methods is limited. Given that structural collapse can often be traced to ignorance of the forces at work in an earthquake, the future collapse of buildings presently under construction could be much reduced were contractors, builders, and occupants educated in the principles of earthquake-resistant assembly. Education of builders who are tempted to cut assembly costs is likely to be more cost-effective than material aid.

  18. Earthquake effects at nuclear reactor facilities: San Fernando earthquake of February 9th, 1971

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Howard, G.; Ibanez, P.; Matthiesen, F.

    1972-02-01

    The effects of the San Fernando earthquake of February 9, 1971 on 26 reactor facilities located in California, Arizona, and Nevada are reported. The safety performance of the facilities during the earthquake is discussed. (JWR)

  19. On near-source earthquake triggering

    USGS Publications Warehouse

    Parsons, T.; Velasco, A.A.

    2009-01-01

    When one earthquake triggers others nearby, what connects them? Two processes are observed: static stress change from fault offset and dynamic stress changes from passing seismic waves. In the near-source region (r ≲ 50 km for M ≥ 5 sources) both processes may be operating, and since both mechanisms are expected to raise earthquake rates, it is difficult to isolate them. We thus compare explosions with earthquakes because only earthquakes cause significant static stress changes. We find that large explosions at the Nevada Test Site do not trigger earthquakes at rates comparable to similar magnitude earthquakes. Surface waves are associated with regional and long-range dynamic triggering, but we note that surface waves with low enough frequency to penetrate to depths where most aftershocks of the 1992 M = 5.7 Little Skull Mountain main shock occurred (~12 km) would not have developed significant amplitude within a 50-km radius. We therefore focus on the best candidate phases to cause local dynamic triggering, direct waves that pass through observed near-source aftershock clusters. We examine these phases, which arrived at the nearest (200-270 km) broadband station before the surface wave train and could thus be isolated for study. Direct comparison of spectral amplitudes of presurface wave arrivals shows that M ≥ 5 explosions and earthquakes deliver the same peak dynamic stresses into the near-source crust. We conclude that a static stress change model can readily explain observed aftershock patterns, whereas it is difficult to attribute near-source triggering to a dynamic process because of the dearth of aftershocks near large explosions.
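
    For context on the "peak dynamic stresses" compared above, a commonly used order-of-magnitude estimate (not taken from this paper) is that a passing wave with particle velocity v carries a dynamic stress of roughly G*v/c, where G is the shear modulus and c the phase velocity. The sketch below uses generic crustal values, which are assumptions for illustration only.

      def peak_dynamic_stress(pgv_m_per_s, shear_modulus_pa=3.0e10, phase_velocity_m_s=3500.0):
          """Order-of-magnitude dynamic stress carried by a seismic wave: sigma ~ G * v / c.
          The default modulus and phase velocity are generic crustal assumptions."""
          return shear_modulus_pa * pgv_m_per_s / phase_velocity_m_s

      # A peak ground velocity of 1 cm/s corresponds to roughly 0.09 MPa (~0.9 bar).
      print(peak_dynamic_stress(0.01) / 1e6, "MPa")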

  20. Remotely triggered earthquakes following moderate main shocks

    USGS Publications Warehouse

    Hough, S.E.

    2007-01-01

    Since 1992, remotely triggered earthquakes have been identified following large (M > 7) earthquakes in California as well as in other regions. These events, which occur at much greater distances than classic aftershocks, occur predominantly in active geothermal or volcanic regions, leading to theories that the earthquakes are triggered when passing seismic waves cause disruptions in magmatic or other fluid systems. In this paper, I focus on observations of remotely triggered earthquakes following moderate main shocks in diverse tectonic settings. I summarize evidence that remotely triggered earthquakes occur commonly in mid-continent and collisional zones. This evidence is derived from analysis of both historic earthquake sequences and instrumentally recorded M5-6 earthquakes in eastern Canada. The latter analysis suggests that, while remotely triggered earthquakes do not occur pervasively following moderate earthquakes in eastern North America, a low level of triggering often does occur at distances beyond conventional aftershock zones. The inferred triggered events occur at the distances at which SmS waves are known to significantly increase ground motions. A similar result was found for 28 recent M5.3-7.1 earthquakes in California. In California, seismicity is found to increase on average to a distance of at least 200 km following moderate main shocks. This supports the conclusion that, even at distances of ~100 km, dynamic stress changes control the occurrence of triggered events. There are two explanations that can account for the occurrence of remotely triggered earthquakes in intraplate settings: (1) they occur at local zones of weakness, or (2) they occur in zones of local stress concentration. © 2007 The Geological Society of America.

  1. Forearc deformation and great subduction earthquakes: implications for cascadia offshore earthquake potential.

    PubMed

    McCaffrey, R; Goldfinger, C

    1995-02-10

    The maximum size of thrust earthquakes at the world's subduction zones appears to be limited by anelastic deformation of the overriding plate. Anelastic strain in weak forearcs and roughness of the plate interface produced by faults cutting the forearc may limit the size of thrust earthquakes by inhibiting the buildup of elastic strain energy or slip propagation or both. Recently discovered active strike-slip faults in the submarine forearc of the Cascadia subduction zone show that the upper plate there deforms rapidly in response to arc-parallel shear. Thus, Cascadia, as a result of its weak, deforming upper plate, may be the type of subduction zone at which great (moment magnitude approximately 9) thrust earthquakes do not occur.

  2. Frog Swarms: Earthquake Precursors or False Alarms?

    PubMed Central

    Grant, Rachel A.; Conlan, Hilary

    2013-01-01

    Simple Summary Media reports linking unusual animal behaviour with earthquakes can potentially create false alarms and unnecessary anxiety among people that live in earthquake risk zones. Recently large frog swarms in China and elsewhere have been reported as earthquake precursors in the media. By examining international media reports of frog swarms since 1850 in comparison to earthquake data, it was concluded that frog swarms are naturally occurring dispersal behaviour of juveniles and are not associated with earthquakes. However, the media in seismic risk areas may be more likely to report frog swarms, and more likely to disseminate reports on frog swarms after earthquakes have occurred, leading to an apparent link between frog swarms and earthquakes. Abstract In short-term earthquake risk forecasting, the avoidance of false alarms is of utmost importance to preclude the possibility of unnecessary panic among populations in seismic hazard areas. Unusual animal behaviour prior to earthquakes has been reported for millennia but has rarely been scientifically documented. Recently large migrations or unusual behaviour of amphibians have been linked to large earthquakes, and media reports of large frog and toad migrations in areas of high seismic risk such as Greece and China have led to fears of a subsequent large earthquake. However, at certain times of year large migrations are part of the normal behavioural repertoire of amphibians. News reports of “frog swarms” from 1850 to the present day were examined for evidence that this behaviour is a precursor to large earthquakes. It was found that only two of 28 reported frog swarms preceded large earthquakes (Sichuan province, China in 2008 and 2010). All of the reported mass migrations of amphibians occurred in late spring, summer and autumn and appeared to relate to small juvenile anurans (frogs and toads). It was concluded that most reported “frog swarms” are actually normal behaviour, probably caused by

  3. Incubation of Chile's 1960 Earthquake

    NASA Astrophysics Data System (ADS)

    Atwater, B. F.; Cisternas, M.; Salgado, I.; Machuca, G.; Lagos, M.; Eipert, A.; Shishikura, M.

    2003-12-01

    Infrequent occurrence of giant events may help explain how the 1960 Chile earthquake attained M 9.5. Although old documents imply that this earthquake followed great earthquakes of 1575, 1737 and 1837, only three earthquakes of the past 1000 years produced geologic records like those for 1960. These earlier earthquakes include the 1575 event but not 1737 or 1837. Because the 1960 earthquake had nearly twice the seismic slip expected from plate convergence since 1837, much of the strain released in 1960 may have been accumulating since 1575. Geologic evidence for such incubation comes from new paleoseismic findings at the Río Maullín estuary, which indents the Pacific coast at 41.5° S midway along the 1960 rupture. The 1960 earthquake lowered the area by 1.5 m, and the ensuing tsunami spread sand across lowland soils. The subsidence killed forests and changed pastures into sandy tidal flats. Guided by these 1960 analogs, we inferred tsunami and earthquake history from sand sheets, tree rings, and old maps. At Chuyaquen, 10 km upriver from the sea, we studied sand sheets in 31 backhoe pits on a geologic transect 1 km long. Each sheet overlies the buried soil of a former marsh or meadow. The sand sheet from 1960 extends the entire length of the transect. Three earlier sheets can be correlated at least half that far. The oldest one, probably a tsunami deposit, surrounds herbaceous plants that date to AD 990-1160. Next comes a sandy tidal-flat deposit dated by stratigraphic position to about 1000-1500. The penultimate sheet is a tsunami deposit younger than twigs from 1410-1630. It probably represents the 1575 earthquake, whose accounts of shaking, tsunami, and landslides rival those of 1960. In that case, the record excludes the 1737 and 1837 events. The 1737 and 1837 events also appear missing in tree-ring evidence from islands of Misquihue, 30 km upriver from the sea. Here the subsidence in 1960 admitted brackish tidal water that defoliated tens of thousands of

  4. Bayesian exploration of recent Chilean earthquakes

    NASA Astrophysics Data System (ADS)

    Duputel, Zacharie; Jiang, Junle; Jolivet, Romain; Simons, Mark; Rivera, Luis; Ampuero, Jean-Paul; Liang, Cunren; Agram, Piyush; Owen, Susan; Ortega, Francisco; Minson, Sarah

    2016-04-01

    The South-American subduction zone is an exceptional natural laboratory for investigating the behavior of large faults over the earthquake cycle. It is also a playground to develop novel modeling techniques combining different datasets. Coastal Chile was impacted by two major earthquakes in the last two years: the 2015 M 8.3 Illapel earthquake in central Chile and the 2014 M 8.1 Iquique earthquake that ruptured the central portion of the 1877 seismic gap in northern Chile. To gain better understanding of the distribution of co-seismic slip for those two earthquakes, we derive joint kinematic finite fault models using a combination of static GPS offsets, radar interferograms, tsunami measurements, high-rate GPS waveforms and strong motion data. Our modeling approach follows a Bayesian formulation devoid of a priori smoothing thereby allowing us to maximize spatial resolution of the inferred family of models. The adopted approach also attempts to account for major sources of uncertainty in the Green's functions. The results reveal different rupture behaviors for the 2014 Iquique and 2015 Illapel earthquakes. The 2014 Iquique earthquake involved a sharp slip zone and did not rupture to the trench. The 2015 Illapel earthquake nucleated close to the coast and propagated toward the trench with significant slip apparently reaching the trench or at least very close to the trench. At the inherent resolution of our models, we also present the relationship of co-seismic models to the spatial distribution of foreshocks, aftershocks and fault coupling models.
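
    The Bayesian, smoothing-free approach described above amounts to sampling a posterior over slip given geodetic and seismic data. The toy sketch below is not the authors' method or data: it estimates a single slip value from three synthetic "GPS offsets" with Gaussian errors and a positivity prior, using a plain Metropolis random walk, just to make the sampling idea concrete. The Green's functions, noise level, and step size are hypothetical.

      import numpy as np

      rng = np.random.default_rng(2)

      # Hypothetical Green's functions: metres of surface offset per metre of slip on one patch.
      g = np.array([0.8, 0.5, 0.3])
      true_slip, sigma = 2.0, 0.05
      data = g * true_slip + rng.normal(0.0, sigma, g.size)   # synthetic GPS offsets

      def log_posterior(slip):
          if slip < 0.0:                       # positivity prior, no smoothing
              return -np.inf
          return -0.5 * np.sum(((data - g * slip) / sigma) ** 2)

      samples, slip = [], 1.0                  # arbitrary starting model
      for _ in range(20000):
          proposal = slip + rng.normal(0.0, 0.1)
          if np.log(rng.random()) < log_posterior(proposal) - log_posterior(slip):
              slip = proposal
          samples.append(slip)

      print("posterior mean slip (m):", round(float(np.mean(samples[5000:])), 2))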

  5. Scoring annual earthquake predictions in China

    NASA Astrophysics Data System (ADS)

    Zhuang, Jiancang; Jiang, Changsheng

    2012-02-01

    The Annual Consultation Meeting on Earthquake Tendency in China is held by the China Earthquake Administration (CEA) in order to provide one-year earthquake predictions over most of China. In these predictions, regions of concern are denoted together with the corresponding magnitude range of the largest earthquake expected during the next year. Evaluating the performance of these earthquake predictions is rather difficult, especially for regions that are of no concern, because they are made on arbitrary regions with flexible magnitude ranges. In the present study, the gambling score is used to evaluate the performance of these earthquake predictions. Based on a reference model, this scoring method rewards successful predictions and penalizes failures according to the risk (probability of failure) that the predictors have taken. Using the Poisson model, which is spatially inhomogeneous and temporally stationary, with the Gutenberg-Richter law for earthquake magnitudes as the reference model, we evaluate the CEA predictions based on (1) a partial score that evaluates whether the issuing of the alarmed regions is based on information that differs from the reference model (knowledge of the average seismicity level) and (2) a complete score that evaluates whether the overall performance of the prediction is better than the reference model. The predictions made by the Annual Consultation Meetings on Earthquake Tendency from 1990 to 2003 are found to include significant precursory information, but the overall performance is close to that of the reference model.
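
    A minimal sketch of the scoring idea, assuming the common form of the gambling score in which a successful alarm with reference probability p0 earns (1 - p0)/p0 points and a failed alarm loses 1 point, with p0 taken from a stationary Poisson model whose magnitudes follow the Gutenberg-Richter law. The rates, b-value, and alarms below are hypothetical and are not the CEA predictions.

      import numpy as np

      def reference_prob(rate_m0, b, m0, m_target, years=1.0):
          """P(at least one event >= m_target in the window) for a stationary Poisson
          model with Gutenberg-Richter magnitudes (hypothetical parameter values)."""
          rate = rate_m0 * 10.0 ** (-b * (m_target - m0))
          return 1.0 - np.exp(-rate * years)

      def gambling_score(outcomes, reference_probs):
          """Successful alarm with reference probability p0 earns (1 - p0)/p0; failure loses 1."""
          return sum((1.0 - p0) / p0 if hit else -1.0
                     for hit, p0 in zip(outcomes, reference_probs))

      # Three hypothetical one-year alarms for M >= 5.5, 6.0, 6.5 in a region with
      # 2 events/yr above M 4.0 and b = 1.0.
      p0s = [reference_prob(2.0, 1.0, 4.0, m) for m in (5.5, 6.0, 6.5)]
      print([round(p, 3) for p in p0s], round(gambling_score([True, False, True], p0s), 2))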

  6. Potential for a large earthquake near Los Angeles inferred from the 2014 La Habra earthquake.

    PubMed

    Donnellan, Andrea; Grant Ludwig, Lisa; Parker, Jay W; Rundle, John B; Wang, Jun; Pierce, Marlon; Blewitt, Geoffrey; Hensley, Scott

    2015-09-01

    Tectonic motion across the Los Angeles region is distributed across an intricate network of strike-slip and thrust faults that will be released in destructive earthquakes similar to or larger than the 1933 M 6.4 Long Beach and 1994 M 6.7 Northridge events. Here we show that Los Angeles regional thrust, strike-slip, and oblique faults are connected and move concurrently with measurable surface deformation, even in moderate magnitude earthquakes, as part of a fault system that accommodates north-south shortening and westerly tectonic escape of northern Los Angeles. The 28 March 2014 M 5.1 La Habra earthquake occurred on a northeast striking, northwest dipping left-lateral oblique thrust fault northeast of Los Angeles. We present crustal deformation observations spanning the earthquake showing that concurrent deformation occurred on several structures in the shallow crust. The seismic moment of the earthquake is 82% of the total geodetic moment released. Slip within the unconsolidated upper sedimentary layer may reflect shallow release of accumulated strain on still-locked deeper structures. A future M 6.1-6.3 earthquake would account for the accumulated strain. Such an event could occur on any one or several of these faults, which may not have been identified by geologic surface mapping.
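
    The comparison of seismic to geodetic moment above maps onto moment magnitude through the standard Hanks-Kanamori relation. The short sketch below uses illustrative numbers only (it is not the paper's calculation of the future M 6.1-6.3 event) and shows the conversion plus how little 82% of a given moment differs in magnitude units.

      import numpy as np

      def moment_to_mw(m0_newton_metres):
          """Hanks-Kanamori moment magnitude from seismic moment in N*m."""
          return (2.0 / 3.0) * (np.log10(m0_newton_metres) - 9.05)

      def mw_to_moment(mw):
          return 10.0 ** (1.5 * mw + 9.05)

      # An event releasing 82% of a given moment is only ~0.06 magnitude units smaller.
      print("Mw 5.1 moment (N*m):", f"{mw_to_moment(5.1):.2e}")
      print("magnitude difference for 82% of the moment:",
            round(moment_to_mw(0.82 * mw_to_moment(5.1)) - 5.1, 3))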

  7. Potential for a large earthquake near Los Angeles inferred from the 2014 La Habra earthquake

    PubMed Central

    Grant Ludwig, Lisa; Parker, Jay W.; Rundle, John B.; Wang, Jun; Pierce, Marlon; Blewitt, Geoffrey; Hensley, Scott

    2015-01-01

    Abstract Tectonic motion across the Los Angeles region is distributed across an intricate network of strike-slip and thrust faults that will be released in destructive earthquakes similar to or larger than the 1933 M6.4 Long Beach and 1994 M6.7 Northridge events. Here we show that Los Angeles regional thrust, strike-slip, and oblique faults are connected and move concurrently with measurable surface deformation, even in moderate magnitude earthquakes, as part of a fault system that accommodates north-south shortening and westerly tectonic escape of northern Los Angeles. The 28 March 2014 M5.1 La Habra earthquake occurred on a northeast striking, northwest dipping left-lateral oblique thrust fault northeast of Los Angeles. We present crustal deformation observations spanning the earthquake showing that concurrent deformation occurred on several structures in the shallow crust. The seismic moment of the earthquake is 82% of the total geodetic moment released. Slip within the unconsolidated upper sedimentary layer may reflect shallow release of accumulated strain on still-locked deeper structures. A future M6.1-6.3 earthquake would account for the accumulated strain. Such an event could occur on any one or several of these faults, which may not have been identified by geologic surface mapping. PMID:27981074

  8. Prompt Assessment of Global Earthquakes for Response (PAGER): A System for Rapidly Determining the Impact of Earthquakes Worldwide

    USGS Publications Warehouse

    Earle, Paul S.; Wald, David J.; Jaiswal, Kishor S.; Allen, Trevor I.; Hearne, Michael G.; Marano, Kristin D.; Hotovec, Alicia J.; Fee, Jeremy

    2009-01-01

    Within minutes of a significant earthquake anywhere on the globe, the U.S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system assesses its potential societal impact. PAGER automatically estimates the number of people exposed to severe ground shaking and the shaking intensity at affected cities. Accompanying maps of the epicentral region show the population distribution and estimated ground-shaking intensity. A regionally specific comment describes the inferred vulnerability of the regional building inventory and, when available, lists recent nearby earthquakes and their effects. PAGER's results are posted on the USGS Earthquake Program Web site (http://earthquake.usgs.gov/), consolidated in a concise one-page report, and sent in near real-time to emergency responders, government agencies, and the media. Both rapid and accurate results are obtained through manual and automatic updates of PAGER's content in the hours following significant earthquakes. These updates incorporate the most recent estimates of earthquake location, magnitude, faulting geometry, and first-hand accounts of shaking. PAGER relies on a rich set of earthquake analysis and assessment tools operated by the USGS and contributing Advanced National Seismic System (ANSS) regional networks. A focused research effort is underway to extend PAGER's near real-time capabilities beyond population exposure to quantitative estimates of fatalities, injuries, and displaced population.

  9. Application of τc*Pd for identifying damaging earthquakes for earthquake early warning

    NASA Astrophysics Data System (ADS)

    Huang, P. L.; Lin, T. L.; Wu, Y. M.

    2014-12-01

    An Earthquake Early Warning System (EEWS) is an effective approach to mitigating earthquake damage. In this study, we used seismic records from the Kiban Kyoshin network (KiK-net), because it has dense station coverage and borehole strong-motion seismometers co-located with the free-surface strong-motion seismometers. We used inland earthquakes with moment magnitude (Mw) from 5.0 to 7.3 between 1998 and 2012, choosing 135 events and 10,950 strong-motion accelerograms recorded by 696 accelerographs. Both the free-surface and the borehole data were used to calculate τc and Pd. The results show that τc*Pd correlates well with PGV and is a robust parameter for assessing the damage potential of an earthquake. We propose that the value of τc*Pd determined within a few seconds after the P-wave arrival could serve as a threshold for the on-site type of EEW.
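
    The two parameters combined here have standard definitions in the EEW literature: τc = 2π·sqrt(∫u² dt / ∫v² dt) over a short window after the P arrival, where u is displacement and v velocity, and Pd is the peak absolute displacement in that window. The sketch below computes both from an acceleration trace; the synthetic input, the 3 s window choice, and the omission of the usual high-pass filtering are simplifications, not the paper's processing.

      import numpy as np
      from scipy.integrate import cumulative_trapezoid

      def tau_c_pd(accel, dt, p_index, window=3.0):
          """tau_c (s) and Pd (m) from an acceleration trace (m/s^2) over a short
          window after the P-arrival sample index. Real EEW processing removes
          baseline drift with a high-pass filter first; that step is skipped here."""
          n = int(window / dt)
          a = accel[p_index:p_index + n]
          v = cumulative_trapezoid(a, dx=dt, initial=0.0)   # velocity
          u = cumulative_trapezoid(v, dx=dt, initial=0.0)   # displacement
          tau_c = 2.0 * np.pi * np.sqrt(np.sum(u ** 2) / np.sum(v ** 2))
          return tau_c, np.abs(u).max()

      # Synthetic example: a decaying 1 Hz pulse standing in for a P-wave onset.
      dt = 0.01
      t = np.arange(0.0, 5.0, dt)
      accel = np.sin(2.0 * np.pi * t) * np.exp(-t)
      tc, pd = tau_c_pd(accel, dt, p_index=0)
      print(f"tau_c = {tc:.2f} s, Pd = {pd:.4f} m, tau_c*Pd = {tc * pd:.4f}")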

  10. The repetition of large-earthquake ruptures.

    PubMed Central

    Sieh, K

    1996-01-01

    This survey of well-documented repeated fault rupture confirms that some faults have exhibited a "characteristic" behavior during repeated large earthquakes--that is, the magnitude, distribution, and style of slip on the fault has repeated during two or more consecutive events. In two cases faults exhibit slip functions that vary little from earthquake to earthquake. In one other well-documented case, however, fault lengths contrast markedly for two consecutive ruptures, but the amount of offset at individual sites was similar. Adjacent individual patches, 10 km or more in length, failed singly during one event and in tandem during the other. More complex cases of repetition may also represent the failure of several distinct patches. The faults of the 1992 Landers earthquake provide an instructive example of such complexity. Together, these examples suggest that large earthquakes commonly result from the failure of one or more patches, each characterized by a slip function that is roughly invariant through consecutive earthquake cycles. The persistence of these slip-patches through two or more large earthquakes indicates that some quasi-invariant physical property controls the pattern and magnitude of slip. These data seem incompatible with theoretical models that produce slip distributions that are highly variable in consecutive large events. PMID:11607662

  11. The Road to Total Earthquake Safety

    NASA Astrophysics Data System (ADS)

    Frohlich, Cliff

    Cinna Lomnitz is possibly the most distinguished earthquake seismologist in all of Central and South America. Among many other credentials, Lomnitz has personally experienced the shaking and devastation that accompanied no fewer than five major earthquakes—Chile, 1939; Kern County, California, 1952; Chile, 1960; Caracas, Venezuela, 1967; and Mexico City, 1985. Thus he clearly has much to teach someone like myself, who has never even actually felt a real earthquake. What is this slim book? The Road to Total Earthquake Safety summarizes Lomnitz's May 1999 presentation at the Seventh Mallet-Milne Lecture, sponsored by the Society for Earthquake and Civil Engineering Dynamics. His arguments are motivated by the damage that occurred in three earthquakes—Mexico City, 1985; Loma Prieta, California, 1989; and Kobe, Japan, 1995. All three quakes occurred in regions where earthquakes are common. Yet in all three some of the worst damage occurred in structures located a significant distance from the epicenter and engineered specifically to resist earthquakes. Some of the damage also indicated that the structures failed because they had experienced considerable rotational or twisting motion. Clearly, Lomnitz argues, there must be fundamental flaws in the usually accepted models explaining how earthquakes generate strong motions, and how we should design resistant structures.

  12. Possible seasonality in large deep-focus earthquakes

    NASA Astrophysics Data System (ADS)

    Zhan, Zhongwen; Shearer, Peter M.

    2015-09-01

    Large deep-focus earthquakes (magnitude > 7.0, depth > 500 km) have exhibited strong seasonality in their occurrence times since the beginning of global earthquake catalogs. Of 60 such events from 1900 to the present, 42 have occurred in the middle half of each year. The seasonality appears strongest in the northwest Pacific subduction zones and weakest in the Tonga region. Taken at face value, the surplus of northern hemisphere summer events is statistically significant, but due to the ex post facto hypothesis testing, the absence of seasonality in smaller deep earthquakes, and the lack of a known physical triggering mechanism, we cannot rule out that the observed seasonality is just random chance. However, we can make a testable prediction of seasonality in future large deep-focus earthquakes, which, given likely earthquake occurrence rates, should be verified or falsified within a few decades. If confirmed, deep earthquake seasonality would challenge our current understanding of deep earthquakes.
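
    The headline count (42 of 60 events in the middle half of the year) can be checked against a 50% null with a simple binomial test, as sketched below; as the abstract notes, the ex post facto nature of the hypothesis means such a p-value overstates the real significance. The call assumes a reasonably recent SciPy, which provides binomtest.

      from scipy.stats import binomtest

      # Two-sided binomial test of 42 of 60 large deep-focus events falling in the
      # middle half of the year, against a 50% null (ignores ex post facto selection).
      result = binomtest(42, n=60, p=0.5, alternative='two-sided')
      print(f"two-sided p-value: {result.pvalue:.4f}")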

  13. Earthquake forecasting test for Kanto district to reduce vulnerability of urban mega earthquake disasters

    NASA Astrophysics Data System (ADS)

    Yokoi, S.; Tsuruoka, H.; Nanjo, K.; Hirata, N.

    2012-12-01

    Collaboratory for the Study of Earthquake Predictability (CSEP) is a global project on earthquake predictability research. The final goal of this project is to search for the intrinsic predictability of the earthquake rupture process through forecast testing experiments. The Earthquake Research Institute of the University of Tokyo joined CSEP and started the Japanese testing center, called CSEP-Japan. This testing center provides open access to researchers contributing earthquake forecast models applied to Japan. More than 100 earthquake forecast models have now been submitted to the prospective experiment. The models are separated into 4 testing classes (1 day, 3 months, 1 year and 3 years) and 3 testing regions covering an area of Japan including sea areas, the Japanese mainland and the Kanto district. We evaluate the performance of the models in the official suite of tests defined by CSEP. Approximately 300 rounds of experiments have been implemented. These results provide new knowledge concerning statistical forecasting models. We have started a study on constructing a 3-dimensional earthquake forecasting model for the Kanto district in Japan based on CSEP experiments under the Special Project for Reducing Vulnerability for Urban Mega Earthquake Disasters. Because seismicity of the area ranges from the shallower part to a depth of 80 km due to the subducting Philippine Sea plate and Pacific plate, we need to study the effect of the depth distribution. We will develop models for forecasting based on the results of 2-D modeling. We defined the 3-D forecasting area in the Kanto region with test classes of 1 day, 3 months, 1 year and 3 years, and magnitudes from 4.0 to 9.0 as in CSEP-Japan. In the first step of the study, we will install the RI10K model (Nanjo, 2011) and the HISTETAS models (Ogata, 2011) to see whether those models perform as well as in the 3-month 2-D CSEP-Japan experiments in the Kanto region before the 2011 Tohoku event (Yokoi et al., in preparation). We use CSEP

  14. Temporal stress changes caused by earthquakes: A review

    USGS Publications Warehouse

    Hardebeck, Jeanne L.; Okada, Tomomi

    2018-01-01

    Earthquakes can change the stress field in the Earth’s lithosphere as they relieve and redistribute stress. Earthquake-induced stress changes have been observed as temporal rotations of the principal stress axes following major earthquakes in a variety of tectonic settings. The stress changes due to the 2011 Mw9.0 Tohoku-Oki, Japan, earthquake were particularly well documented. Earthquake stress rotations can inform our understanding of earthquake physics, most notably addressing the long-standing problem of whether the Earth’s crust at plate boundaries is “strong” or “weak.” Many of the observed stress rotations, including that due to the Tohoku-Oki earthquake, indicate near-complete stress drop in the mainshock. This implies low background differential stress, on the order of earthquake stress drop, supporting the weak crust model. Earthquake stress rotations can also be used to address other important geophysical questions, such as the level of crustal stress heterogeneity and the mechanisms of postseismic stress reloading. The quantitative interpretation of stress rotations is evolving from those based on simple analytical methods to those based on more sophisticated numerical modeling that can capture the spatial-temporal complexity of the earthquake stress changes.

  15. Temporal Stress Changes Caused by Earthquakes: A Review

    NASA Astrophysics Data System (ADS)

    Hardebeck, Jeanne L.; Okada, Tomomi

    2018-02-01

    Earthquakes can change the stress field in the Earth's lithosphere as they relieve and redistribute stress. Earthquake-induced stress changes have been observed as temporal rotations of the principal stress axes following major earthquakes in a variety of tectonic settings. The stress changes due to the 2011 Mw9.0 Tohoku-Oki, Japan, earthquake were particularly well documented. Earthquake stress rotations can inform our understanding of earthquake physics, most notably addressing the long-standing problem of whether the Earth's crust at plate boundaries is "strong" or "weak." Many of the observed stress rotations, including that due to the Tohoku-Oki earthquake, indicate near-complete stress drop in the mainshock. This implies low background differential stress, on the order of earthquake stress drop, supporting the weak crust model. Earthquake stress rotations can also be used to address other important geophysical questions, such as the level of crustal stress heterogeneity and the mechanisms of postseismic stress reloading. The quantitative interpretation of stress rotations is evolving from those based on simple analytical methods to those based on more sophisticated numerical modeling that can capture the spatial-temporal complexity of the earthquake stress changes.

  16. Earthquake hazards on the cascadia subduction zone.

    PubMed

    Heaton, T H; Hartzell, S H

    1987-04-10

    Large subduction earthquakes on the Cascadia subduction zone pose a potential seismic hazard. Very young oceanic lithosphere (10 million years old) is being subducted beneath North America at a rate of approximately 4 centimeters per year. The Cascadia subduction zone shares many characteristics with subduction zones in southern Chile, southwestern Japan, and Colombia, where comparably young oceanic lithosphere is also subducting. Very large subduction earthquakes, ranging in energy magnitude (M(w)) between 8 and 9.5, have occurred along these other subduction zones. If the Cascadia subduction zone is also storing elastic energy, a sequence of several great earthquakes (M(w) 8) or a giant earthquake (M(w) 9) would be necessary to fill this 1200-kilometer gap. The nature of strong ground motions recorded during subduction earthquakes of M(w) less than 8.2 is discussed. Strong ground motions from even larger earthquakes (M(w) up to 9.5) are estimated by simple simulations. If large subduction earthquakes occur in the Pacific Northwest, relatively strong shaking can be expected over a large region. Such earthquakes may also be accompanied by large local tsunamis.

  17. From Tornadoes to Earthquakes: Forecast Verification for Binary Events Applied to the 1999 Chi-Chi, Taiwan, Earthquake

    NASA Astrophysics Data System (ADS)

    Chen, C.; Rundle, J. B.; Holliday, J. R.; Nanjo, K.; Turcotte, D. L.; Li, S.; Tiampo, K. F.

    2005-12-01

    Forecast verification procedures for statistical events with binary outcomes typically rely on the use of contingency tables and Relative Operating Characteristic (ROC) diagrams. Originally developed for the statistical evaluation of tornado forecasts on a county-by-county basis, these methods can be adapted to the evaluation of competing earthquake forecasts. Here we apply these methods retrospectively to two forecasts for the m = 7.3 1999 Chi-Chi, Taiwan, earthquake. These forecasts are based on a method, Pattern Informatics (PI), that locates likely sites for future large earthquakes based on large change in activity of the smallest earthquakes. A competing null hypothesis, Relative Intensity (RI), is based on the idea that future large earthquake locations are correlated with sites having the greatest frequency of small earthquakes. We show that for Taiwan, the PI forecast method is superior to the RI forecast null hypothesis. Inspection of the two maps indicates that their forecast locations are indeed quite different. Our results confirm an earlier result suggesting that the earthquake preparation process for events such as the Chi-Chi earthquake involves anomalous changes in activation or quiescence, and that signatures of these processes can be detected in precursory seismicity data. Furthermore, we find that our methods can accurately forecast the locations of aftershocks from precursory seismicity changes alone, implying that the main shock together with its aftershocks represent a single manifestation of the formation of a high-stress region nucleating prior to the main shock.
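
    The contingency-table/ROC machinery referred to above reduces, for gridded alarm-based forecasts, to sweeping a threshold over the forecast score and plotting hit rate against false-alarm rate. The sketch below does this for a synthetic score field; the grid size, event rate, and informativeness of the score are invented for illustration and are not the PI or RI forecasts.

      import numpy as np

      def roc_curve_binary(scores, outcomes):
          """Hit rate vs. false-alarm rate as the alarm threshold sweeps over the
          forecast scores; outcomes are True where a target earthquake occurred."""
          scores = np.asarray(scores, float)
          outcomes = np.asarray(outcomes, bool)
          hits, falses = [], []
          for th in np.unique(scores)[::-1]:
              alarm = scores >= th
              hits.append((alarm & outcomes).sum() / max(outcomes.sum(), 1))
              falses.append((alarm & ~outcomes).sum() / max((~outcomes).sum(), 1))
          return np.array(falses), np.array(hits)

      # Toy example: 1000 cells, a mildly informative "PI-like" score.
      rng = np.random.default_rng(1)
      truth = rng.random(1000) < 0.05
      score = truth * 0.3 + rng.random(1000)
      f, h = roc_curve_binary(score, truth)
      auc = np.sum(np.diff(f) * (h[1:] + h[:-1]) / 2.0)      # trapezoidal area under ROC
      print("area under ROC:", round(float(auc), 3))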

  18. Living with earthquakes - development and usage of earthquake-resistant construction methods in European and Asian Antiquity

    NASA Astrophysics Data System (ADS)

    Kázmér, Miklós; Major, Balázs; Hariyadi, Agus; Pramumijoyo, Subagyo; Ditto Haryana, Yohanes

    2010-05-01

    Earthquakes are among the most horrible events of nature due to their unexpected occurrence, for which no spiritual means are available for protection. The only way of preserving life and property is applying earthquake-resistant construction methods. Ancient Greek architects of public buildings applied steel clamps embedded in lead casing to hold together columns and masonry walls during frequent earthquakes in the Aegean region. Elastic steel provided strength, while plastic lead casing absorbed minor shifts of blocks without fracturing rigid stone. Romans invented concrete and built all sizes of buildings as a single, inflexible unit. Masonry surrounding and decorating the concrete core of the wall did not bear load. Concrete resisted minor shaking, yielding only to forces higher than fracture limits. Roman building traditions survived the Dark Ages, and 12th century Crusader castles erected in earthquake-prone Syria survive until today in reasonably good condition. Concrete and steel clamping persisted side-by-side in the Roman Empire. Concrete was used for cheap construction as compared to building in masonry. Applying lead-encased steel increased costs and was avoided whenever possible. Columns of the various forums in Italian Pompeii mostly lack steel fittings despite being situated in a well-known earthquake-prone area. Whether the frequent recurrence of earthquakes in the Naples region was known to the inhabitants of Pompeii might be a matter of debate. Seemingly the shock of the AD 62 earthquake was not enough to prompt the application of well-known protective engineering methods throughout the reconstruction of the city before the AD 79 volcanic catastrophe. An independent engineering tradition developed on the island of Java (Indonesia). The mortar-less construction technique of 8-9th century Hindu masonry shrines around Yogyakarta would allow scattering of blocks during earthquakes. To prevent dilapidation an intricate mortise-and-tenon system was carved into adjacent faces of blocks. Only the

  19. Earthquake!

    ERIC Educational Resources Information Center

    Hernandez, Hildo

    2000-01-01

    Examines the types of damage experienced by California State University at Northridge during the 1994 earthquake and discusses the lessons learned in handling this emergency. The problem of loose asbestos is addressed. (GR)

  20. Assessment of earthquake-induced landslides hazard in El Salvador after the 2001 earthquakes using macroseismic analysis

    NASA Astrophysics Data System (ADS)

    Esposito, Eliana; Violante, Crescenzo; Giunta, Giuseppe; Ángel Hernández, Miguel

    2016-04-01

    Two strong earthquakes and a number of smaller aftershocks struck El Salvador in the year 2001. The January 13 2001 earthquake, Mw 7.7, occurred along the Cocos plate, 40 km off El Salvador's southern coast. It resulted in about 1300 deaths and widespread damage, mainly due to massive landsliding. Two of the largest earthquake-induced landslides, Las Barioleras and Las Colinas (about 2×10^5 m^3), produced major damage to buildings and infrastructure and 500 fatalities. A neighborhood in Santa Tecla, west of San Salvador, was destroyed. The February 13 2001 earthquake, Mw 6.5, occurred 40 km east-southeast of San Salvador. This earthquake caused over 300 fatalities and triggered several landslides over an area of 2,500 km^2, mostly in poorly consolidated volcaniclastic deposits. The La Leona landslide (5-7×10^5 m^3) caused 12 fatalities and extensive damage to the Panamerican Highway. Two very large landslides of 1.5 km^3 and 12 km^3 produced hazardous barrier lakes at Rio El Desague and Rio Jiboa, respectively. More than 16,000 landslides occurred throughout the country after both quakes; most of them occurred in pyroclastic deposits, with volumes of less than 1×10^3 m^3. The present work aims to define the relationship between the intensity of the earthquakes described above and the size and areal distribution of the induced landslides, as well as to refine the earthquake intensity in sparsely populated zones by using landslide effects. Landslides triggered by the 2001 seismic sequences provided useful indications for a realistic seismic hazard assessment, providing a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides.

  1. InSAR Analysis of the 2011 Hawthorne (Nevada) Earthquake Swarm: Implications of Earthquake Migration and Stress Transfer

    NASA Astrophysics Data System (ADS)

    Zha, X.; Dai, Z.; Lu, Z.

    2015-12-01

    The 2011 Hawthorne earthquake swarm occurred in the central Walker Lane zone, near the border between California and Nevada. The swarm included an Mw 4.4 event on April 13, an Mw 4.6 on April 17, and an Mw 3.9 on April 27. Due to the lack of near-field seismic instruments, it is difficult to obtain accurate source information from the seismic data for these moderate-magnitude events. ENVISAT InSAR observations captured the deformation caused mainly by three events during the 2011 Hawthorne earthquake swarm. The surface traces of the three seismogenic sources could be identified according to the local topography and interferogram phase discontinuities. The epicenters could be determined using the interferograms and the relocated earthquake distribution. An apparent earthquake migration is revealed by the InSAR observations and the earthquake distribution. Analysis and modeling of the InSAR data show that the three moderate-magnitude earthquakes were produced by slip on three previously unrecognized faults in the central Walker Lane. Two seismogenic sources are northwest-striking, right-lateral strike-slip faults with some thrust-slip components, and the other source is a northeast-striking, thrust-slip fault with some strike-slip components. The former two faults are roughly parallel to each other, and almost perpendicular to the latter one. This spatial correlation between the three seismogenic faults, together with their nature, suggests that the central Walker Lane has been undergoing southeast-northwest horizontal compressive deformation, consistent with the regional crustal movement revealed by GPS measurements. The Coulomb failure stresses on the fault planes were calculated using the preferred slip model and the Coulomb 3.4 software package. For the Mw 4.6 earthquake, the Coulomb stress change caused by the Mw 4.4 event was an increase of ~0.1 bar. For the Mw 3.9 event, the Coulomb stress change caused by the Mw 4.6 earthquake was an increase of ~1.0 bar. This indicates that the preceding
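
    The Coulomb failure stress change invoked above has the standard form ΔCFS = Δτ + μ′Δσn: the shear-stress change resolved in the slip direction plus the effective friction coefficient times the normal-stress change (positive for unclamping). The sketch below is a generic helper, not the Coulomb 3.4 calculation; the sample inputs are chosen only so the output matches the ~0.1 bar scale quoted in the abstract.

      def coulomb_stress_change(d_shear_mpa, d_normal_mpa, mu_eff=0.4):
          """Coulomb failure stress change (MPa): shear-stress change resolved in the
          slip direction plus effective friction times the normal-stress change
          (positive = unclamping). mu_eff = 0.4 is a commonly assumed value."""
          return d_shear_mpa + mu_eff * d_normal_mpa

      # Illustrative inputs giving 0.01 MPa = 0.1 bar, the scale reported for the Mw 4.6 fault.
      print(coulomb_stress_change(0.008, 0.005), "MPa")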

  2. Inter-Disciplinary Validation of Pre Earthquake Signals. Case Study for Major Earthquakes in Asia (2004-2010) and for 2011 Tohoku Earthquake

    NASA Technical Reports Server (NTRS)

    Ouzounov, D.; Pulinets, S.; Hattori, K.; Liu, J.-Y.; Yang. T. Y.; Parrot, M.; Kafatos, M.; Taylor, P.

    2012-01-01

    We carried out multi-sensor observations in our investigation of phenomena preceding major earthquakes. Our approach is based on a systematic analysis of several physical and environmental parameters that we found to be associated with the earthquake process: thermal infrared radiation, temperature and concentration of electrons in the ionosphere, radon/ion activities, and air temperature/humidity in the atmosphere. We used satellite and ground observations and interpreted them with the Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) model, one of the possible paradigms we study and support. We made two independent continuous hind-cast investigations in Taiwan and Japan for a total of 102 earthquakes (M>6) occurring from 2004 to 2011. We analyzed: (1) ionospheric electromagnetic radiation, plasma and energetic electron measurements from DEMETER; (2) outgoing longwave radiation (OLR) from NOAA/AVHRR and NASA/EOS; (3) radon/ion variations (in situ data); and (4) GPS Total Electron Content (TEC) measurements collected from space- and ground-based observations. This joint analysis of ground and satellite data has shown that one to six (or more) days prior to the largest earthquakes there were anomalies in all of the analyzed physical observations. For the March 11, 2011 Tohoku earthquake, our analysis again shows the same relationship between several independent observations characterizing the lithosphere/atmosphere coupling. On March 7th we found a rapid increase of emitted infrared radiation in the satellite data, and subsequently an anomaly developed near the epicenter. The GPS/TEC data indicated an increase and variation in electron density reaching a maximum value on March 8. Beginning on this day we confirmed an abnormal TEC variation over the epicenter in the lower ionosphere. These findings revealed the existence of atmospheric and ionospheric phenomena occurring prior to the 2011 Tohoku earthquake, which indicated new evidence of a distinct

  3. Post Earthquake Debris Management - An Overview

    NASA Astrophysics Data System (ADS)

    Sarkar, Raju

    Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunami, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as the waste generated during the reconstruction works, can place significant challenges on national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated from an earthquake is not hazardous. Soil, building material, and green waste, such as trees and shrubs, make up most of the volume of earthquake debris. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also impose economic burdens on the reconstruction phase. In practice, most of the debris may be either disposed of at landfill sites, reused as materials for construction, or recycled into useful commodities. Therefore, the debris clearance operation should focus on the geotechnical engineering approach as an important post-earthquake issue to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach, one that takes into account the different criteria related to the execution of the operation, is proposed by highlighting the key issues concerning the handling of the construction

  4. Simulating Earthquakes for Science and Society: Earthquake Visualizations Ideal for use in Science Communication and Education

    NASA Astrophysics Data System (ADS)

    de Groot, R.

    2008-12-01

    The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). For example, The Great Southern California ShakeOut was based on a potential magnitude 7.8 earthquake on the southern San Andreas fault. The visualization created for the ShakeOut was a key scientific and communication tool for the earthquake drill. This presentation will also feature SCEC Virtual Display of Objects visualization software developed by SCEC Undergraduate Studies in Earthquake Information Technology interns. According to Gordin and Pea (1995), theoretically visualization should make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.

  5. A revised “earthquake report” questionnaire

    USGS Publications Warehouse

    Stover, C.; Reagor, G.; Simon, R.

    1976-01-01

    The U.S. Geological Survey is responsible for conducting intensity and damage surveys following felt or destructive earthquakes in the United States. Shortly after a felt or damaging earthquake occurs, a canvass of the affected area is made. Specially developed questionnaires are mailed to volunteer observers located within the estimated felt area. These questionnaires, "Earthquake Reports," are filled out by the observers and returned to the Survey's National Earthquake Information Service, which is located in Colorado. They are then evaluated, and, based on answers to questions about physical effects seen or felt, each canvassed location is assigned an intensity value. After intensities are assigned to the various locations, they are plotted on an intensity distribution map. When all of the intensity data have been plotted, isoseismals can then be contoured through places where equal intensity was experienced. The completed isoseismal map yields a detailed picture of the earthquake, its effects, and its felt area. All of the data and maps are published quarterly in a U.S. Geological Survey Circular series entitled "Earthquakes in the United States".

  6. Local tsunamis and earthquake source parameters

    USGS Publications Warehouse

    Geist, Eric L.; Dmowska, Renata; Saltzman, Barry

    1999-01-01

    This chapter establishes the relationship among earthquake source parameters and the generation, propagation, and run-up of local tsunamis. In general terms, displacement of the seafloor during the earthquake rupture is modeled using the elastic dislocation theory for which the displacement field is dependent on the slip distribution, fault geometry, and the elastic response and properties of the medium. Specifically, nonlinear long-wave theory governs the propagation and run-up of tsunamis. A parametric study is devised to examine the relative importance of individual earthquake source parameters on local tsunamis, because the physics that describes tsunamis from generation through run-up is complex. Analysis of the source parameters of various tsunamigenic earthquakes has indicated that the details of the earthquake source, namely, nonuniform distribution of slip along the fault plane, have a significant effect on the local tsunami run-up. Numerical methods have been developed to address realistic bathymetric and shoreline conditions. The accuracy of determining the run-up on shore is directly dependent on the source parameters of the earthquake, which provide the initial conditions used for the hydrodynamic models.
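
    Two standard long-wave results sit behind the propagation part of this chapter: the shallow-water phase speed c = sqrt(g·h) and Green's law, by which a slowly varying, non-breaking wave amplifies roughly as depth^(-1/4) on approach to shore. The sketch below applies both with arbitrary example depths and amplitude; actual run-up requires the nonlinear modeling the chapter describes.

      import numpy as np

      G = 9.81  # gravitational acceleration, m/s^2

      def long_wave_speed(depth_m):
          """Shallow-water (long-wave) phase speed, valid when wavelength >> depth."""
          return np.sqrt(G * depth_m)

      def greens_law_amplitude(a_deep, h_deep, h_shallow):
          """Green's law shoaling: amplitude scales as depth**(-1/4) for a slowly
          varying, non-breaking long wave (run-up itself needs nonlinear models)."""
          return a_deep * (h_deep / h_shallow) ** 0.25

      print(f"speed in 4000 m of water: {long_wave_speed(4000.0):.0f} m/s")        # ~198 m/s
      print(f"0.5 m offshore wave at 10 m depth: {greens_law_amplitude(0.5, 4000.0, 10.0):.2f} m")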

  7. Are Earthquake Clusters/Supercycles Real or Random?

    NASA Astrophysics Data System (ADS)

    Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.

    2016-12-01

    Long records of earthquakes at plate boundaries such as the San Andreas or Cascadia often show that large earthquakes occur in temporal clusters, also termed supercycles, separated by less active intervals. These are intriguing because the boundary is presumably being loaded by steady plate motion. If so, earthquakes resulting from seismic cycles - in which their probability is small shortly after the past one, and then increases with time - should occur quasi-periodically rather than be more frequent in some intervals than others. We are exploring this issue with two approaches. One is to assess whether the clusters result purely by chance from a time-independent process that has no "memory." Thus a future earthquake is equally likely immediately after the past one and much later, so earthquakes can cluster in time. We analyze the agreement between such a model and inter-event times for Parkfield, Pallet Creek, and other records. A useful tool is transformation by the inverse cumulative distribution function, so the inter-event times have a uniform distribution when the memorylessness property holds. The second is via a time-variable model in which earthquake probability increases with time between earthquakes and decreases after an earthquake. The probability of an event increases with time until one happens, after which it decreases, but not to zero. Hence after a long period of quiescence, the probability of an earthquake can remain higher than the long-term average for several cycles. Thus the probability of another earthquake is path dependent, i.e. depends on the prior earthquake history over multiple cycles. Time histories resulting from simulations give clusters with properties similar to those observed. The sequences of earthquakes result from both the model parameters and chance, so two runs with the same parameters look different. The model parameters control the average time between events and the variation of the actual times around this average, so
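
    The inverse-CDF transformation mentioned above can be sketched directly: inter-event times from a memoryless (exponential) process, pushed through the fitted exponential CDF, should be uniform on [0, 1], which can then be screened with a Kolmogorov-Smirnov test. The interval values below are illustrative only (loosely Parkfield-like), and estimating the mean from the same data makes the p-value approximate.

      import numpy as np
      from scipy.stats import kstest

      def memoryless_test(inter_event_times_yr):
          """Transform inter-event times through the fitted exponential CDF; if the
          process is time-independent (memoryless), the result is uniform on [0, 1].
          Fitting the mean from the same data makes the KS p-value approximate."""
          t = np.asarray(inter_event_times_yr, float)
          u = 1.0 - np.exp(-t / t.mean())
          return kstest(u, 'uniform')

      # Hypothetical paleoseismic inter-event times in years.
      print(memoryless_test([24.0, 20.0, 21.0, 12.0, 32.0, 38.0]))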

  8. Intraslab Earthquakes: Dehydration of the Cascadia Slab

    USGS Publications Warehouse

    Preston, L.A.; Creager, K.C.; Crosson, R.S.; Brocher, T.M.; Trehu, A.M.

    2003-01-01

    We simultaneously invert travel times of refracted and wide-angle reflected waves for three-dimensional compressional-wave velocity structure, earthquake locations, and reflector geometry in northwest Washington state. The reflector, interpreted to be the crust-mantle boundary (Moho) of the subducting Juan de Fuca plate, separates intraslab earthquakes into two groups, permitting a new understanding of the origins of intraslab earthquakes in Cascadia. Earthquakes up-dip of the Moho's 45-kilometer depth contour occur below the reflector, in the subducted oceanic mantle, consistent with serpentinite dehydration; earthquakes located down-dip occur primarily within the subducted crust, consistent with the basalt-to-eclogite transformation.

  9. Earthquake scenarios based on lessons from the past

    NASA Astrophysics Data System (ADS)

    Solakov, Dimcho; Simeonova, Stella; Aleksandrova, Irena; Popova, Iliana

    2010-05-01

    Earthquakes are the most deadly of the natural disasters affecting the human environment; indeed, catastrophic earthquakes have marked the whole of human history. Global seismic hazard and vulnerability to earthquakes are increasing steadily as urbanization and development occupy more areas that are prone to the effects of strong earthquakes. Additionally, the uncontrolled growth of megacities in highly seismic areas around the world is often associated with the construction of seismically unsafe buildings and infrastructures, and is undertaken with insufficient knowledge of the regional seismicity peculiarities and seismic hazard. The assessment of seismic hazard and the generation of earthquake scenarios is the first link in the prevention chain and the first step in the evaluation of seismic risk. The implementation of earthquake scenarios into policies for seismic risk reduction will allow focusing on the prevention of earthquake effects rather than on intervention following the disasters. The territory of Bulgaria (situated in the eastern part of the Balkan Peninsula) represents a typical example of a high-seismic-risk area. Over the centuries, Bulgaria has experienced strong earthquakes. At the beginning of the 20th century (from 1901 to 1928), five earthquakes with magnitude larger than or equal to MS=7.0 occurred in Bulgaria. However, no such large earthquakes have occurred in Bulgaria since 1928, which may induce non-professionals to underestimate the earthquake risk. The 1986 earthquake of magnitude MS=5.7, which occurred in central northern Bulgaria (near the town of Strazhitsa), is the strongest quake since 1928. Moreover, the seismicity of the neighboring countries, such as Greece, Turkey, the former Yugoslavia and Romania (especially the Vrancea, Romania, intermediate-depth earthquakes), influences the seismic hazard in Bulgaria. In the present study, deterministic scenarios (expressed in seismic intensity) for two Bulgarian cities (Rouse and Plovdiv) are presented. The work on

  10. Possible scenarios for occurrence of M ~ 7 interplate earthquakes prior to and following the 2011 Tohoku-Oki earthquake based on numerical simulation.

    PubMed

    Nakata, Ryoko; Hori, Takane; Hyodo, Mamoru; Ariyoshi, Keisuke

    2016-05-10

    We show possible scenarios for the occurrence of M ~ 7 interplate earthquakes prior to and following the M ~ 9 earthquake along the Japan Trench, such as the 2011 Tohoku-Oki earthquake. One such M ~ 7 earthquake is the so-called Miyagi-ken-Oki earthquake, for which we conducted numerical simulations of earthquake generation cycles by using realistic three-dimensional (3D) geometry of the subducting Pacific Plate. In a number of scenarios, the time interval between the M ~ 9 earthquake and the subsequent Miyagi-ken-Oki earthquake was equal to or shorter than the average recurrence interval during the later stage of the M ~ 9 earthquake cycle. The scenarios successfully reproduced important characteristics such as the recurrence of M ~ 7 earthquakes, coseismic slip distribution, afterslip distribution, the largest foreshock, and the largest aftershock of the 2011 earthquake. Thus, these results suggest that we should prepare for future M ~ 7 earthquakes in the Miyagi-ken-Oki segment even though this segment recently experienced large coseismic slip in 2011.

  11. Possible scenarios for occurrence of M ~ 7 interplate earthquakes prior to and following the 2011 Tohoku-Oki earthquake based on numerical simulation

    PubMed Central

    Nakata, Ryoko; Hori, Takane; Hyodo, Mamoru; Ariyoshi, Keisuke

    2016-01-01

    We show possible scenarios for the occurrence of M ~ 7 interplate earthquakes prior to and following the M ~ 9 earthquake along the Japan Trench, such as the 2011 Tohoku-Oki earthquake. One such M ~ 7 earthquake is the so-called Miyagi-ken-Oki earthquake, for which we conducted numerical simulations of earthquake generation cycles by using realistic three-dimensional (3D) geometry of the subducting Pacific Plate. In a number of scenarios, the time interval between the M ~ 9 earthquake and the subsequent Miyagi-ken-Oki earthquake was equal to or shorter than the average recurrence interval during the later stage of the M ~ 9 earthquake cycle. The scenarios successfully reproduced important characteristics such as the recurrence of M ~ 7 earthquakes, coseismic slip distribution, afterslip distribution, the largest foreshock, and the largest aftershock of the 2011 earthquake. Thus, these results suggest that we should prepare for future M ~ 7 earthquakes in the Miyagi-ken-Oki segment even though this segment recently experienced large coseismic slip in 2011. PMID:27161897

  12. Ground Motions Due to Earthquakes on Creeping Faults

    NASA Astrophysics Data System (ADS)

    Harris, R.; Abrahamson, N. A.

    2014-12-01

    We investigate the peak ground motions from the largest well-recorded earthquakes on creeping strike-slip faults in active-tectonic continental regions. Our goal is to evaluate whether the strong ground motions from earthquakes on creeping faults are smaller than the strong ground motions from earthquakes on locked faults. Smaller ground motions might be expected from earthquakes on creeping faults if the fault sections that strongly radiate energy are surrounded by patches of fault that predominantly absorb energy. For our study we used the ground motion data available in the PEER NGA-West2 database, and the ground motion prediction equations that were developed from the PEER NGA-West2 dataset. We analyzed data for the eleven largest well-recorded creeping-fault earthquakes, which ranged in magnitude from M5.0 to 6.5. Our findings are that these earthquakes produced peak ground motions that are statistically indistinguishable from the peak ground motions produced by similar-magnitude earthquakes on locked faults. These findings may be implemented in earthquake hazard estimates for moderate-size earthquakes in creeping-fault regions. Further investigation is necessary to determine whether this result will also apply to larger earthquakes on creeping faults. Please also see: Harris, R.A., and N.A. Abrahamson (2014), Strong ground motions generated by earthquakes on creeping faults, Geophysical Research Letters, vol. 41, doi:10.1002/2014GL060228.

  13. Security Implications of Induced Earthquakes

    NASA Astrophysics Data System (ADS)

    Jha, B.; Rao, A.

    2016-12-01

    The increase in earthquakes induced or triggered by human activities motivates us to research how a malicious entity could weaponize earthquakes to cause damage. Specifically, we explore the feasibility of controlling the location, timing, and magnitude of an earthquake by activating a fault via injection and production of fluids into the subsurface. Here, we investigate the relationship of the magnitude and trigger time of an induced earthquake to the well-to-fault distance. The relationship between magnitude and distance is important for determining the farthest striking distance from which one could intentionally activate a fault to cause a certain level of damage. We use our novel computational framework to model the coupled multi-physics processes of fluid flow and fault poromechanics. We use synthetic models representative of the New Madrid Seismic Zone and the San Andreas Fault Zone to assess the risk in the continental US. We fix injection and production flow rates of the wells and vary their locations. We simulate injection-induced Coulomb destabilization of faults and the evolution of fault slip under quasi-static deformation. We find that the effect of distance on the magnitude and trigger time is monotonic, nonlinear, and time-dependent. Evolution of the maximum Coulomb stress on the fault provides insights into the effect of distance on rupture nucleation and propagation. The damage potential of induced earthquakes can be maintained even at longer distances because of the balance between pressure diffusion and poroelastic stress transfer mechanisms. We conclude that computational modeling of induced earthquakes allows us to assess the feasibility of weaponizing earthquakes and to develop effective defense mechanisms against such attacks.
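
    As context for the Coulomb destabilization step described above, the following is a minimal sketch (not the authors' computational framework) of the standard Coulomb failure stress change on a receiver fault; the friction coefficient, sign convention, and example stress values are illustrative assumptions.

      def coulomb_stress_change(d_tau, d_sigma_n, d_pore_pressure=0.0, friction=0.6):
          """Coulomb failure stress change, dCFS = d_tau + mu * (d_sigma_n + d_p).

          d_tau           : shear stress change in the slip direction (Pa, positive promotes slip)
          d_sigma_n       : normal stress change (Pa, positive = unclamping in this convention)
          d_pore_pressure : pore fluid pressure change (Pa)
          friction        : coefficient of friction on the receiver fault
          """
          return d_tau + friction * (d_sigma_n + d_pore_pressure)

      # Hypothetical example: injection raises pore pressure by 0.1 MPa on a fault
      # that receives only a small poroelastic shear/normal stress change.
      d_cfs = coulomb_stress_change(d_tau=0.02e6, d_sigma_n=-0.01e6, d_pore_pressure=0.1e6)
      print(f"dCFS = {d_cfs / 1e6:.3f} MPa (positive values move the fault toward failure)")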

  14. Laboratory generated M -6 earthquakes

    USGS Publications Warehouse

    McLaskey, Gregory C.; Kilgore, Brian D.; Lockner, David A.; Beeler, Nicholas M.

    2014-01-01

    We consider whether mm-scale earthquake-like seismic events generated in laboratory experiments are consistent with our understanding of the physics of larger earthquakes. This work focuses on a population of 48 very small shocks that are foreshocks and aftershocks of stick-slip events occurring on a 2.0 m by 0.4 m simulated strike-slip fault cut through a large granite sample. Unlike the larger stick-slip events that rupture the entirety of the simulated fault, the small foreshocks and aftershocks are contained events whose properties are controlled by the rigidity of the surrounding granite blocks rather than by characteristics of the experimental apparatus. The large size of the experimental apparatus, high-fidelity sensors, rigorous treatment of wave propagation effects, and in situ system calibration separate this study from traditional acoustic emission analyses and allow these sources to be studied with as much rigor as larger natural earthquakes. The tiny events have short (3-6 μs) rise times and are well modeled by simple double-couple focal mechanisms that are consistent with left-lateral slip occurring on a mm-scale patch of the precut fault surface. The repeatability of the experiments indicates that they are the result of frictional processes on the simulated fault surface rather than grain crushing or fracture of fresh rock. Our waveform analysis shows no significant differences (other than size) between the M -7 to M -5.5 earthquakes reported here and larger natural earthquakes. Their source characteristics, such as stress drop (1-10 MPa), appear to be entirely consistent with earthquake scaling laws derived for larger earthquakes.
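
    As a rough consistency check on the claim that mm-scale sources with ordinary stress drops produce M -6 events, the sketch below (not the authors' analysis) combines the Hanks-Kanamori moment-magnitude relation with the circular-crack stress-drop formula; the 3 MPa stress drop is an illustrative value from the quoted 1-10 MPa range.

      def moment_from_magnitude(mw):
          """Seismic moment in N*m from moment magnitude (Hanks & Kanamori)."""
          return 10 ** (1.5 * mw + 9.1)

      def source_radius(moment, stress_drop):
          """Circular-crack radius implied by a stress drop: delta_sigma = (7/16) * M0 / r**3."""
          return (7.0 * moment / (16.0 * stress_drop)) ** (1.0 / 3.0)

      m0 = moment_from_magnitude(-6.0)
      r = source_radius(m0, stress_drop=3.0e6)
      print(f"M0 = {m0:.2f} N*m, implied source radius ~ {r * 1e3:.1f} mm")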

  15. Impact of the Christchurch earthquakes on hospital staff.

    PubMed

    Tovaranonte, Pleayo; Cawood, Tom J

    2013-06-01

    On September 4, 2010 a major earthquake caused widespread damage, but no loss of life, to Christchurch city and surrounding areas. There were numerous aftershocks, including that of February 22, 2011 which, in contrast, caused substantial loss of life and major damage to the city. The research aim was to assess how these two earthquakes affected the staff of the General Medicine Department at Christchurch Hospital. Problem: To date there have been no published data assessing the impact of this type of natural disaster on hospital staff in Australasia. A questionnaire that examined seven domains (demographics, personal impact, psychological impact, emotional impact, impact on care for patients, work impact, and coping strategies) was handed out to General Medicine staff and students nine days after the September 2010 earthquake and 14 days after the February 2011 earthquake. Response rates were ≥ 99%. Sixty percent of responders were <30 years of age, and approximately 60% were female. The families of eight percent and 35% of responders had to move to another place because of the September and February earthquakes, respectively. A fifth to a third of people had to find an alternative route of transport to get to work, but only eight percent to 18% took time off work. The financial impact was more severe following the February earthquake, with 46% reporting damage of >NZ $1,000, compared with 15% following the September earthquake (P < .001). Significantly more people felt upset about the situation following the February earthquake than following the September earthquake (42% vs 69%, P < .001). Almost a quarter thought that the quality of patient care was affected in some way following the September earthquake, but this rose to 53% after the February earthquake (12/53 vs 45/85, P < .001). Half believed that discharges were delayed following the September earthquake, but this dropped significantly to 15% following the February earthquake (27/53 vs 13/62, P < .001). This survey provides a measure of the result of

  16. Understanding and responding to earthquake hazards

    NASA Technical Reports Server (NTRS)

    Raymond, C. A.; Lundgren, P. R.; Madsen, S. N.; Rundle, J. B.

    2002-01-01

    Advances in understanding the earthquake cycle and in assessing earthquake hazards are topics of great importance. Dynamic earthquake hazard assessments resolved for a range of spatial scales and time scales will allow a more systematic approach to prioritizing the retrofitting of vulnerable structures, relocating populations at risk, protecting lifelines, preparing for disasters, and educating the public.

  17. The Christchurch earthquake stroke incidence study.

    PubMed

    Wu, Teddy Y; Cheung, Jeanette; Cole, David; Fink, John N

    2014-03-01

    We examined the impact of major earthquakes on acute stroke admissions by a retrospective review of stroke admissions in the 6 weeks following the 4 September 2010 and 22 February 2011 earthquakes. The control period was the corresponding 6 weeks in the previous year. In the 6 weeks following the September 2010 earthquake there were 97 acute stroke admissions, with 79 (81.4%) ischaemic infarctions. This was similar to the 2009 control period which had 104 acute stroke admissions, of whom 80 (76.9%) had ischaemic infarction. In the 6 weeks following the February 2011 earthquake, there were 71 stroke admissions, and 61 (79.2%) were ischaemic infarction. This was less than the 96 strokes (72 [75%] ischaemic infarction) in the corresponding control period. None of the comparisons were statistically significant. There was also no difference in the rate of cardioembolic infarction from atrial fibrillation between the study periods. Patients admitted during the February 2011 earthquake period were less likely to be discharged directly home when compared to the control period (31.2% versus 46.9%, p=0.036). There was no observable trend in the number of weekly stroke admissions between the 2 weeks leading to and 6 weeks following the earthquakes. Our results suggest that severe psychological stress from earthquakes did not influence the subsequent short term risk of acute stroke, but the severity of the earthquake in February 2011 and associated civil structural damages may have influenced the pattern of discharge for stroke patients.

  18. Source modeling of the 2015 Mw 7.8 Nepal (Gorkha) earthquake sequence: Implications for geodynamics and earthquake hazards

    NASA Astrophysics Data System (ADS)

    McNamara, D. E.; Yeck, W. L.; Barnhart, W. D.; Schulte-Pelkum, V.; Bergman, E.; Adhikari, L. B.; Dixit, A.; Hough, S. E.; Benz, H. M.; Earle, P. S.

    2017-09-01

    The Gorkha earthquake on April 25th, 2015 was a long anticipated, low-angle thrust-faulting event on the shallow décollement between the India and Eurasia plates. We present a detailed multiple-event hypocenter relocation analysis of the Mw 7.8 Gorkha Nepal earthquake sequence, constrained by local seismic stations, and a geodetic rupture model based on InSAR and GPS data. We integrate these observations to place the Gorkha earthquake sequence into a seismotectonic context and evaluate potential earthquake hazard. Major results from this study include (1) a comprehensive catalog of calibrated hypocenters for the Gorkha earthquake sequence; (2) the Gorkha earthquake ruptured a 150 × 60 km patch of the Main Himalayan Thrust (MHT), the décollement defining the plate boundary at depth, over an area surrounding but predominantly north of the capital city of Kathmandu; (3) the distribution of aftershock seismicity surrounds the mainshock maximum slip patch; (4) aftershocks occur at or below the mainshock rupture plane with depths generally increasing to the north beneath the higher Himalaya, possibly outlining a 10-15 km thick subduction channel between the overriding Eurasian and subducting Indian plates; (5) the largest Mw 7.3 aftershock and the highest concentration of aftershocks occurred to the southeast of the mainshock rupture, on a segment of the MHT décollement that was positively stressed towards failure; (6) the near-surface portion of the MHT south of Kathmandu shows no aftershocks or slip during the mainshock. Results from this study characterize the details of the Gorkha earthquake sequence and provide constraints on where earthquake hazard remains high, and thus where future, damaging earthquakes may occur in this densely populated region. Up-dip segments of the MHT should be considered to be high hazard for future damaging earthquakes.

  19. Source modeling of the 2015 Mw 7.8 Nepal (Gorkha) earthquake sequence: Implications for geodynamics and earthquake hazards

    USGS Publications Warehouse

    McNamara, Daniel E.; Yeck, William; Barnhart, William D.; Schulte-Pelkum, V.; Bergman, E.; Adhikari, L. B.; Dixit, Amod; Hough, S.E.; Benz, Harley M.; Earle, Paul

    2017-01-01

    The Gorkha earthquake on April 25th, 2015 was a long anticipated, low-angle thrust-faulting event on the shallow décollement between the India and Eurasia plates. We present a detailed multiple-event hypocenter relocation analysis of the Mw 7.8 Gorkha Nepal earthquake sequence, constrained by local seismic stations, and a geodetic rupture model based on InSAR and GPS data. We integrate these observations to place the Gorkha earthquake sequence into a seismotectonic context and evaluate potential earthquake hazard. Major results from this study include (1) a comprehensive catalog of calibrated hypocenters for the Gorkha earthquake sequence; (2) the Gorkha earthquake ruptured a ~ 150 × 60 km patch of the Main Himalayan Thrust (MHT), the décollement defining the plate boundary at depth, over an area surrounding but predominantly north of the capital city of Kathmandu; (3) the distribution of aftershock seismicity surrounds the mainshock maximum slip patch; (4) aftershocks occur at or below the mainshock rupture plane with depths generally increasing to the north beneath the higher Himalaya, possibly outlining a 10–15 km thick subduction channel between the overriding Eurasian and subducting Indian plates; (5) the largest Mw 7.3 aftershock and the highest concentration of aftershocks occurred to the southeast of the mainshock rupture, on a segment of the MHT décollement that was positively stressed towards failure; (6) the near-surface portion of the MHT south of Kathmandu shows no aftershocks or slip during the mainshock. Results from this study characterize the details of the Gorkha earthquake sequence and provide constraints on where earthquake hazard remains high, and thus where future, damaging earthquakes may occur in this densely populated region. Up-dip segments of the MHT should be considered to be high hazard for future damaging earthquakes.

  20. The nature of earthquake prediction

    USGS Publications Warehouse

    Lindh, A.G.

    1991-01-01

    Earthquake prediction is inherently statistical. Although some people continue to think of earthquake prediction as the specification of the time, place, and magnitude of a future earthquake, it has been clear for at least a decade that this is an unrealistic and unreasonable definition. The reality is that earthquake prediction starts from long-term forecasts of place and magnitude, with very approximate time constraints, and progresses, at least in principle, to a gradual narrowing of the time window as data and understanding permit. Primitive long-term forecasts are clearly possible at this time on a few well-characterized fault systems. Tightly focused monitoring experiments aimed at short-term prediction are already underway in Parkfield, California, and in the Tokai region in Japan; only time will tell how much progress will be possible.

  1. Implications for earthquake risk reduction in the United States from the Kocaeli, Turkey, earthquake of August 17, 1999

    USGS Publications Warehouse

    ,

    2000-01-01

    This report documents implications for earthquake risk reduction in the U.S. The magnitude 7.4 earthquake caused 17,127 deaths, 43,953 injuries, and displaced more than 250,000 people from their homes. The report warns that similar disasters are possible in the United States where earthquakes of comparable size strike the heart of American urban areas. Another concern described in the report is the delayed emergency response that was caused by the inadequate seismic monitoring system in Turkey, a problem that contrasts sharply with rapid assessment and response to the September Chi-Chi earthquake in Taiwan. Additionally, the experience in Turkey suggests that techniques for forecasting earthquakes may be improving.

  2. The U.S. Earthquake Prediction Program

    USGS Publications Warehouse

    Wesson, R.L.; Filson, J.R.

    1981-01-01

    There are two distinct motivations for earthquake prediction. The mechanistic approach aims to understand the processes leading to a large earthquake. The empirical approach is governed by the immediate need to protect lives and property. With our current lack of knowledge about the earthquake process, future progress cannot be made without gathering a large body of measurements. These are required not only for the empirical prediction of earthquakes, but also for the testing and development of hypotheses that further our understanding of the processes at work. The earthquake prediction program is basically a program of scientific inquiry, but one which is motivated by social, political, economic, and scientific reasons. It is a pursuit that cannot rely on empirical observations alone, nor can it be carried out solely on a blackboard or in a laboratory. Experiments must be carried out in the real Earth.

  3. Modified-Fibonacci-Dual-Lucas method for earthquake prediction

    NASA Astrophysics Data System (ADS)

    Boucouvalas, A. C.; Gkasios, M.; Tselikas, N. T.; Drakatos, G.

    2015-06-01

    The FDL method makes use of Fibonacci, Dual and Lucas numbers and has shown considerable success in predicting earthquake events locally as well as globally. Predicting the location of the epicenter of an earthquake is one difficult challenge, the others being the timing and magnitude. One technique for predicting the onset of earthquakes is the use of cycles and the discovery of periodicity; the reported FDL method belongs to this category. The basis of the reported FDL method is the creation of FDL future dates based on the onset dates of significant earthquakes, the assumption being that each earthquake discontinuity can be thought of as a generating source of an FDL time series. The connection between past earthquakes and future earthquakes based on FDL numbers has also been reported with sample earthquakes since 1900. Using clustering methods it has been shown that significant earthquakes (>6.5R) can be predicted within a very good accuracy window (+-1 day). In this contribution we present an improved modification of the FDL method, the MFDL method, which performs better than the FDL. We use the FDL numbers to develop possible earthquake dates, but with the important difference that the starting seed date is a trigger planetary aspect prior to the earthquake. Typical planetary aspects are Moon conjunct Sun, Moon opposite Sun, and Moon conjunct or opposite the North or South Nodes. In order to test the improvement of the method we used all >8R earthquakes recorded since 1900 (86 earthquakes from USGS data). We developed the FDL numbers for each of those seeds, and examined the earthquake hit rates (for a window of 3 days, i.e., +-1 day of the target date) for earthquakes >6.5R. The successes are counted for each one of the 86 earthquake seeds and we compare the MFDL method with the FDL method. In every case we find improvement when the starting seed date is on the planetary trigger date prior to the earthquake. We observe no improvement only when a planetary trigger coincided with

  4. Seismic databases and earthquake catalogue of the Caucasus

    NASA Astrophysics Data System (ADS)

    Godoladze, Tea; Javakhishvili, Zurab; Tvaradze, Nino; Tumanova, Nino; Jorjiashvili, Nato; Gok, Rengen

    2016-04-01

    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms~7.0, Io=9), the Lechkhumi-Svaneti earthquake of 1350 (Ms~7.0, Io=9), and the Alaverdi earthquake of 1742 (Ms~6.8, Io=9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms~6.5, Io=9) and the Akhalkalaki earthquake of 1899 (Ms~6.3, Io=8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation are: Gori 1920; Tabatskuri 1940; Chkhalta 1963; the 1991 Ms=7.0 Racha earthquake, the largest event ever recorded in the region; the 1992 M=6.5 Barisakho earthquake; and the Ms=6.9 Spitak, Armenia earthquake (100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of various national networks (Georgia, ~25 stations; Azerbaijan, ~35 stations; Armenia, ~14 stations). The data from the last 10 years of observation provide an opportunity to perform modern, fundamental scientific investigations. A catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences, Ilia State University). The catalog consists of more than 80,000 events. Together with our colleagues from Armenia, Azerbaijan and Turkey, the database of Caucasus seismic events was compiled. We tried to improve the locations of the events and to calculate moment magnitudes for events larger than magnitude 4 in order to obtain a unified magnitude catalogue for the region. The results will serve as the input for seismic hazard assessment for the region.

  5. Statistical tests of simple earthquake cycle models

    NASA Astrophysics Data System (ADS)

    DeVries, Phoebe M. R.; Evans, Eileen L.

    2016-12-01

    A central goal of observing and modeling the earthquake cycle is to forecast when a particular fault may generate an earthquake: a fault late in its earthquake cycle may be more likely to generate an earthquake than a fault early in its earthquake cycle. Models that can explain geodetic observations throughout the entire earthquake cycle may be required to gain a more complete understanding of relevant physics and phenomenology. Previous efforts to develop unified earthquake models for strike-slip faults have largely focused on explaining both preseismic and postseismic geodetic observations available across a few faults in California, Turkey, and Tibet. An alternative approach leverages the global distribution of geodetic and geologic slip rate estimates on strike-slip faults worldwide. Here we use the Kolmogorov-Smirnov test for similarity of distributions to infer, in a statistically rigorous manner, viscoelastic earthquake cycle models that are inconsistent with 15 sets of observations across major strike-slip faults. We reject a large subset of two-layer models incorporating Burgers rheologies at a significance level of α = 0.05 (those with long-term Maxwell viscosities ηM < 4.0 × 10^19 Pa s and ηM > 4.6 × 10^20 Pa s) but cannot reject models on the basis of transient Kelvin viscosity ηK. Finally, we examine the implications of these results for the predicted earthquake cycle timing of the 15 faults considered and compare these predictions to the geologic and historical record.
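
    A minimal sketch of the kind of two-sample Kolmogorov-Smirnov comparison described above (not the authors' code): observed values for the 15 faults are compared against values predicted by one candidate viscoelastic cycle model, and the model is rejected when the KS p-value falls below α = 0.05. Both arrays here are synthetic placeholders.

      import numpy as np
      from scipy.stats import ks_2samp

      rng = np.random.default_rng(0)
      observed = rng.normal(loc=1.0, scale=0.2, size=15)           # e.g., slip-rate ratios for 15 faults (synthetic)
      model_predicted = rng.normal(loc=1.3, scale=0.2, size=1000)  # same quantity sampled from one candidate model

      result = ks_2samp(observed, model_predicted)
      verdict = "reject" if result.pvalue < 0.05 else "cannot reject"
      print(f"KS D = {result.statistic:.2f}, p = {result.pvalue:.3f} -> {verdict} this model at alpha = 0.05")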

  6. Testing an Earthquake Prediction Algorithm: The 2016 New Zealand and Chile Earthquakes

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir G.

    2017-05-01

    The 13 November 2016, M7.8, 54 km NNE of Amberley, New Zealand and the 25 December 2016, M7.6, 42 km SW of Puerto Quellon, Chile earthquakes happened outside the area of the on-going real-time global testing of the intermediate-term middle-range earthquake prediction algorithm M8, accepted in 1992 for the M7.5+ range. Naturally, over the past two decades, the level of registration of earthquakes worldwide has grown significantly and by now is sufficient for diagnosis of times of increased probability (TIPs) by the M8 algorithm on the entire territory of New Zealand and Southern Chile as far as below 40°S. The mid-2016 update of the M8 predictions determines TIPs in the additional circles of investigation (CIs) where the two earthquakes have happened. Thus, after 50 semiannual updates in the real-time prediction mode, we (1) confirm statistically approved high confidence of the M8-MSc predictions and (2) conclude a possibility of expanding the territory of the Global Test of the algorithms M8 and MSc in an apparently necessary revision of the 1992 settings.

  7. Triggering of repeating earthquakes in central California

    USGS Publications Warehouse

    Wu, Chunquan; Gomberg, Joan; Ben-Naim, Eli; Johnson, Paul

    2014-01-01

    Dynamic stresses carried by transient seismic waves have been found capable of triggering earthquakes instantly in various tectonic settings. Delayed triggering may be even more common, but the mechanisms are not well understood. Catalogs of repeating earthquakes, earthquakes that recur repeatedly at the same location, provide ideal data sets to test the effects of transient dynamic perturbations on the timing of earthquake occurrence. Here we employ a catalog of 165 families containing ~2500 total repeating earthquakes to test whether dynamic perturbations from local, regional, and teleseismic earthquakes change recurrence intervals. The distance to the earthquake generating the perturbing waves is a proxy for the relative potential contributions of static and dynamic deformations, because static deformations decay more rapidly with distance. Clear changes followed the nearby 2004 Mw6 Parkfield earthquake, so we study only repeaters prior to its origin time. We apply a Monte Carlo approach to compare the observed number of shortened recurrence intervals following dynamic perturbations with the distribution of this number estimated for randomized perturbation times. We examine the comparison for a series of dynamic stress peak amplitude and distance thresholds. The results suggest a weak correlation between dynamic perturbations in excess of ~20 kPa and shortened recurrence intervals, for both nearby and remote perturbations.
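
    The Monte Carlo comparison described above can be sketched as follows (synthetic inputs, not the authors' catalog or code): count the perturbations that land in a shorter-than-median recurrence interval, then compare that count with the distribution obtained when the perturbation times are randomized.

      import numpy as np

      rng = np.random.default_rng(42)
      event_times = np.cumsum(rng.exponential(100.0, size=300))   # synthetic repeater occurrence times (days)
      perturb_times = np.sort(rng.uniform(event_times[0], event_times[-1], size=20))
      median_interval = np.median(np.diff(event_times))

      def shortened_count(times, perturbations, threshold):
          """Number of perturbations that fall inside a recurrence interval shorter than `threshold`."""
          intervals = np.diff(times)
          starts = times[:-1]
          count = 0
          for t in perturbations:
              idx = np.searchsorted(starts, t) - 1                # index of the interval containing t
              if 0 <= idx < intervals.size and intervals[idx] < threshold:
                  count += 1
          return count

      observed = shortened_count(event_times, perturb_times, median_interval)
      randomized = [shortened_count(event_times,
                                    rng.uniform(event_times[0], event_times[-1], size=perturb_times.size),
                                    median_interval)
                    for _ in range(5000)]
      p_value = np.mean(np.array(randomized) >= observed)
      print(f"observed shortened intervals: {observed}, one-sided p = {p_value:.3f}")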

  8. Scaling of seismic memory with earthquake size

    NASA Astrophysics Data System (ADS)

    Zheng, Zeyu; Yamasaki, Kazuko; Tenenbaum, Joel; Podobnik, Boris; Tamura, Yoshiyasu; Stanley, H. Eugene

    2012-07-01

    It has been observed that discrete earthquake events possess memory, i.e., that events occurring in a particular location are dependent on the history of that location. We conduct an analysis to see whether continuous real-time data also display a similar memory and, if so, whether such autocorrelations depend on the size of earthquakes within close spatiotemporal proximity. We analyze the seismic waveform database recorded by 64 stations in Japan, including the 2011 “Great East Japan Earthquake,” one of the five most powerful earthquakes ever recorded, which resulted in a tsunami and devastating nuclear accidents. We explore the question of seismic memory through use of mean conditional intervals and detrended fluctuation analysis (DFA). We find that the waveform sign series show power-law anticorrelations while the interval series show power-law correlations. We find size dependence in earthquake autocorrelations: as the earthquake size increases, both of these correlation behaviors strengthen. We also find that the DFA scaling exponent α has no dependence on the earthquake hypocenter depth or epicentral distance.
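
    As an illustration of the DFA step mentioned above (a generic DFA-1 sketch, not the authors' implementation), the scaling exponent α is the slope of log F(n) versus log n; white noise should return α near 0.5, while correlated series return larger values.

      import numpy as np

      def dfa(x, scales):
          """Return the DFA-1 fluctuation function F(n) for each window size n in `scales`."""
          x = np.asarray(x, dtype=float)
          profile = np.cumsum(x - x.mean())              # integrated, mean-removed series
          fluctuations = []
          for n in scales:
              n_windows = profile.size // n
              segments = profile[: n_windows * n].reshape(n_windows, n)
              t = np.arange(n)
              rms = []
              for seg in segments:
                  coeffs = np.polyfit(t, seg, 1)         # local linear trend
                  detrended = seg - np.polyval(coeffs, t)
                  rms.append(np.sqrt(np.mean(detrended ** 2)))
              fluctuations.append(np.mean(rms))
          return np.array(fluctuations)

      rng = np.random.default_rng(1)
      series = rng.standard_normal(10_000)               # white noise -> alpha ~ 0.5
      scales = np.unique(np.logspace(1, 3, 15).astype(int))
      F = dfa(series, scales)
      alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
      print(f"estimated DFA exponent alpha = {alpha:.2f}")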

  9. Earthquakes in Ohio and Vicinity 1776-2007

    USGS Publications Warehouse

    Dart, Richard L.; Hansen, Michael C.

    2008-01-01

    This map summarizes two and a third centuries of earthquake activity. The seismic history consists of letters, journals, diaries, and newspaper and scholarly articles that supplement seismograph recordings (seismograms) dating from the early twentieth century to the present. All of the pre-instrumental (historical) earthquakes were large enough to be felt by people or to cause shaking damage to buildings and their contents. Later, widespread use of seismographs meant that tremors too small or distant to be felt could be detected and accurately located. Earthquakes are a legitimate concern in Ohio and parts of adjacent States. Ohio has experienced more than 160 felt earthquakes since 1776. Most of these events caused no damage or injuries. However, 15 Ohio earthquakes resulted in property damage and some minor injuries. The largest historic earthquake in the state occurred in 1937. This event had an estimated magnitude of 5.4 and caused considerable damage in the town of Anna and in several other western Ohio communities. The large map shows all historical and instrumentally located earthquakes from 1776 through 2007.

  10. Fundamental questions of earthquake statistics, source behavior, and the estimation of earthquake probabilities from possible foreshocks

    USGS Publications Warehouse

    Michael, Andrew J.

    2012-01-01

    Estimates of the probability that an ML 4.8 earthquake, which occurred near the southern end of the San Andreas fault on 24 March 2009, would be followed by an M 7 mainshock over the following three days vary from 0.0009 using a Gutenberg–Richter model of aftershock statistics (Reasenberg and Jones, 1989) to 0.04 using a statistical model of foreshock behavior and long‐term estimates of large earthquake probabilities, including characteristic earthquakes (Agnew and Jones, 1991). I demonstrate that the disparity between the existing approaches depends on whether or not they conform to Gutenberg–Richter behavior. While Gutenberg–Richter behavior is well established over large regions, it could be violated on individual faults if they have characteristic earthquakes or over small areas if the spatial distribution of large‐event nucleations is disproportional to the rate of smaller events. I develop a new form of the aftershock model that includes characteristic behavior and combines the features of both models. This new model and the older foreshock model yield the same results when given the same inputs, but the new model has the advantage of producing probabilities for events of all magnitudes, rather than just for events larger than the initial one. Compared with the aftershock model, the new model has the advantage of taking into account long‐term earthquake probability models. Using consistent parameters, the probability of an M 7 mainshock on the southernmost San Andreas fault is 0.0001 for three days from long‐term models and the clustering probabilities following the ML 4.8 event are 0.00035 for a Gutenberg–Richter distribution and 0.013 for a characteristic‐earthquake magnitude–frequency distribution. Our decisions about the existence of characteristic earthquakes and how large earthquakes nucleate have a first‐order effect on the probabilities obtained from short‐term clustering models for these large events.
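
    The dependence on the magnitude-frequency assumption can be illustrated with a back-of-the-envelope sketch (placeholder numbers, not the paper's values): given some probability that a possible foreshock is followed by a larger event, a Gutenberg-Richter distribution with b near 1 assigns only a small fraction of that probability to an M 7 outcome, whereas a characteristic-earthquake distribution can concentrate much of it at the characteristic magnitude.

      b_value = 1.0
      m_foreshock = 4.8
      m_target = 7.0
      p_larger = 0.05              # hypothetical probability that *some* larger event follows

      # Gutenberg-Richter: fraction of "larger" events that reach the target magnitude.
      frac_gr = 10 ** (-b_value * (m_target - m_foreshock))
      print(f"G-R case:            P(M>={m_target}) ~ {p_larger * frac_gr:.1e}")

      # Characteristic case: suppose a large share of ruptures on this fault are ~M7.
      frac_characteristic = 0.5    # hypothetical
      print(f"Characteristic case: P(M>={m_target}) ~ {p_larger * frac_characteristic:.1e}")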

  11. Lisbon 1755, a multiple-rupture earthquake

    NASA Astrophysics Data System (ADS)

    Fonseca, J. F. B. D.

    2017-12-01

    The Lisbon earthquake of 1755 poses a challenge to seismic hazard assessment. Reports pointing to MMI 8 or above at distances of the order of 500 km led to magnitude estimates near M9 in classic studies. A refined analysis of the coeval sources lowered the estimates to 8.7 (Johnston, 1998) and 8.5 (Martinez-Solares, 2004). I posit that even these lower magnitude values reflect the combined effect of multiple ruptures. Attempts to identify a single source capable of explaining the damage reports with published ground motion models did not gather consensus and, compounding the challenge, the analysis of tsunami traveltimes has led to disparate source models, sometimes separated by a few hundred kilometers. From this viewpoint, the most credible source would combine a subset of the multiple active structures identifiable in SW Iberia. No individual moment magnitude needs to be above M8.1, thus rendering the search for candidate structures less challenging. The possible combinations of active structures should be ranked as a function of their explaining power, for macroseismic intensities and tsunami traveltimes taken together. I argue that the Lisbon 1755 earthquake is an example of a distinct class of intraplate earthquake previously unrecognized, of which the Indian Ocean earthquake of 2012 is the first instrumentally recorded example, showing space and time correlation over scales of the order of a few hundred kilometers and a few minutes. Other examples may exist in the historical record, such as the M8 1556 Shaanxi earthquake, with an unusually large damage footprint (MMI equal to or above 6 in 10 provinces; 830,000 fatalities). The ability to trigger seismicity globally, observed after the 2012 Indian Ocean earthquake, may be a characteristic of this type of event: occurrences in Massachusetts (M5.9 Cape Ann earthquake on 18/11/1755), Morocco (M6.5 Fez earthquake on 27/11/1755) and Germany (M6.1 Duren earthquake on 18/02/1756) in all likelihood had a causal link to the Lisbon event.

  12. The Pocatello Valley, Idaho, earthquake

    USGS Publications Warehouse

    Rogers, A. M.; Langer, C.J.; Bucknam, R.C.

    1975-01-01

    A Richter magnitude 6.3 earthquake occurred at 8:31 p.m. Mountain Daylight Time on March 27, 1975, near the Utah-Idaho border in Pocatello Valley. The main shock epicenter was located at 42.094° N, 112.478° W, and the focal depth was 5.5 km. This earthquake was the largest in the continental United States since the destructive San Fernando earthquake of February 1971. The main shock was preceded by a magnitude 4.5 foreshock on March 26.

  13. Modified Mercalli Intensity for scenario earthquakes in Evansville, Indiana

    USGS Publications Warehouse

    Cramer, Chris; Haase, Jennifer; Boyd, Oliver

    2012-01-01

    Evansville, Indiana, has experienced minor damage from earthquakes several times in the past 200 years. Because of this history and the fact that Evansville is close to the Wabash Valley and New Madrid seismic zones, there is concern about the hazards from earthquakes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as a result of an earthquake. Earthquake-hazard maps provide one way of conveying such estimates of strong ground shaking and will help the region prepare for future earthquakes and reduce earthquake-caused losses.

  14. Evaluating spatial and temporal relationships between an earthquake cluster near Entiat, central Washington, and the large December 1872 Entiat earthquake

    USGS Publications Warehouse

    Brocher, Thomas M.; Blakely, Richard J.; Sherrod, Brian

    2017-01-01

    We investigate spatial and temporal relations between an ongoing and prolific seismicity cluster in central Washington, near Entiat, and the 14 December 1872 Entiat earthquake, the largest historic crustal earthquake in Washington. A fault scarp produced by the 1872 earthquake lies within the Entiat cluster; the locations and areas of both the cluster and the estimated 1872 rupture surface are comparable. Seismic intensities and the 1–2 m of coseismic displacement suggest a magnitude range between 6.5 and 7.0 for the 1872 earthquake. Aftershock forecast models for (1) the first several hours following the 1872 earthquake, (2) the largest felt earthquakes from 1900 to 1974, and (3) the seismicity within the Entiat cluster from 1976 through 2016 are also consistent with this magnitude range. Based on this aftershock modeling, most of the current seismicity in the Entiat cluster could represent aftershocks of the 1872 earthquake. Other earthquakes, especially those with long recurrence intervals, have long‐lived aftershock sequences, including the Mw 7.5 1891 Nobi earthquake in Japan, with aftershocks continuing 100 yrs after the mainshock. Although we do not rule out ongoing tectonic deformation in this region, a long‐lived aftershock sequence can account for these observations.

  15. Methodology to determine the parameters of historical earthquakes in China

    NASA Astrophysics Data System (ADS)

    Wang, Jian; Lin, Guoliang; Zhang, Zhe

    2017-12-01

    China is one of the countries with the longest cultural tradition. Meanwhile, China has suffered very heavy earthquake disasters, so there are abundant earthquake records. In this paper, we try to sketch out historical earthquake sources and research achievements in China. We introduce some basic information about the collections of historical earthquake sources, the establishment of an intensity scale, and the editions of historical earthquake catalogues. Spatial-temporal and magnitude distributions of historical earthquakes are analyzed briefly. Besides traditional methods, we also illustrate a new approach to amend the parameters of historical earthquakes or even identify candidate zones for large historical or palaeo-earthquakes. In the new method, a relationship between instrumentally recorded small earthquakes and strong historical earthquakes is built up. The abundant historical earthquake sources and the achievements of historical earthquake research in China are a valuable cultural heritage of the world.

  16. Earthquakes; July-August 1982

    USGS Publications Warehouse

    Person, W.J.

    1983-01-01

    During this reporting period, there were three major (7.0-7.9) earthquakes, all in unpopulated areas. The quakes occurred north of Macquarie Island on July 7, in the Santa Cruz Islands on August 5, and south of Panama on August 19. In the United States, a number of earthquakes occurred, but no damage was reported.

  17. Testing hypotheses of earthquake occurrence

    NASA Astrophysics Data System (ADS)

    Kagan, Y. Y.; Jackson, D. D.; Schorlemmer, D.; Gerstenberger, M.

    2003-12-01

    We present a relatively straightforward likelihood method for testing those earthquake hypotheses that can be stated as vectors of earthquake rate density in defined bins of area, magnitude, and time. We illustrate the method as it will be applied to the Regional Earthquake Likelihood Models (RELM) project of the Southern California Earthquake Center (SCEC). Several earthquake forecast models are being developed as part of this project, and additional contributed forecasts are welcome. Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. We would test models in pairs, requiring that both forecasts in a pair be defined over the same set of bins. Thus we offer a standard "menu" of bins and ground rules to encourage standardization. One menu category includes five-year forecasts of magnitude 5.0 and larger. Forecasts would be in the form of a vector of yearly earthquake rates on a 0.05 degree grid at the beginning of the test. Focal mechanism forecasts, when available, would also be archived and used in the tests. The five-year forecast category may be appropriate for testing hypotheses of stress shadows from large earthquakes. Interim progress will be evaluated yearly, but final conclusions would be made on the basis of cumulative five-year performance. The second category includes forecasts of earthquakes above magnitude 4.0 on a 0.05 degree grid, evaluated and renewed daily. Final evaluation would be based on cumulative performance over five years. Other types of forecasts with different magnitude, space, and time sampling are welcome and will be tested against other models with shared characteristics. All earthquakes would be counted, and no attempt made to separate foreshocks, main shocks, and aftershocks. Earthquakes would be considered as point sources located at the hypocenter. For each pair of forecasts, we plan to compute alpha, the probability that the first would be wrongly rejected in favor of
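
    The core of such a likelihood test can be sketched as follows (a simplified illustration, not the RELM/CSEP testing code): each space-magnitude bin carries a forecast rate, observed counts are assumed Poisson and independent, and two forecasts defined on the same bins are compared through their summed log-likelihoods.

      import numpy as np
      from scipy.special import gammaln

      def poisson_log_likelihood(rates, counts):
          """log L = sum_i [ n_i * log(lambda_i) - lambda_i - log(n_i!) ]"""
          rates = np.asarray(rates, dtype=float)
          counts = np.asarray(counts, dtype=float)
          return np.sum(counts * np.log(rates) - rates - gammaln(counts + 1))

      # Hypothetical 5-bin example: forecast A concentrates rate where the event occurred.
      forecast_a = np.array([0.5, 0.1, 0.1, 0.2, 0.1])
      forecast_b = np.array([0.2, 0.2, 0.2, 0.2, 0.2])
      observed   = np.array([1,   0,   0,   0,   0  ])

      print("log L(A) =", poisson_log_likelihood(forecast_a, observed))
      print("log L(B) =", poisson_log_likelihood(forecast_b, observed))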

  18. Prospective testing of Coulomb short-term earthquake forecasts

    NASA Astrophysics Data System (ADS)

    Jackson, D. D.; Kagan, Y. Y.; Schorlemmer, D.; Zechar, J. D.; Wang, Q.; Wong, K.

    2009-12-01

    Earthquake-induced Coulomb stresses, whether static or dynamic, suddenly change the probability of future earthquakes. Models to estimate stress and the resulting seismicity changes could help to illuminate earthquake physics and guide appropriate precautionary response. But do these models have improved forecasting power compared to empirical statistical models? The best answer lies in prospective testing, in which a fully specified model, with no subsequent parameter adjustments, is evaluated against future earthquakes. The Collaboratory for the Study of Earthquake Predictability (CSEP) facilitates such prospective testing of earthquake forecasts, including several short-term forecasts. Formulating Coulomb stress models for formal testing involves several practical problems, mostly shared with other short-term models. First, earthquake probabilities must be calculated after each “perpetrator” earthquake but before the triggered earthquakes, or “victims”. The time interval between a perpetrator and its victims may be very short, as characterized by the Omori law for aftershocks. CSEP evaluates short-term models daily, and allows daily updates of the models. However, lots can happen in a day. An alternative is to test and update models on the occurrence of each earthquake over a certain magnitude. To make such updates rapidly enough and to qualify as prospective, earthquake focal mechanisms, slip distributions, stress patterns, and earthquake probabilities would have to be made by computer without human intervention. This scheme would be more appropriate for evaluating scientific ideas, but it may be less useful for practical applications than daily updates. Second, triggered earthquakes are imperfectly recorded following larger events because their seismic waves are buried in the coda of the earlier event. To solve this problem, testing methods need to allow for “censoring” of early aftershock data, and a quantitative model for detection threshold as a function of

  19. Increased Earthquake Rates in the Central and Eastern US Portend Higher Earthquake Hazards

    NASA Astrophysics Data System (ADS)

    Llenos, A. L.; Rubinstein, J. L.; Ellsworth, W. L.; Mueller, C. S.; Michael, A. J.; McGarr, A.; Petersen, M. D.; Weingarten, M.; Holland, A. A.

    2014-12-01

    Since 2009 the central and eastern United States has experienced an unprecedented increase in the rate of M≥3 earthquakes that is unlikely to be due to natural variation. Where the rates have increased so has the seismic hazard, making it important to understand these changes. Areas with significant seismicity increases are limited to areas where oil and gas production takes place. By far the largest contributor to the seismicity increase is Oklahoma, where recent studies suggest that these rate changes may be due to fluid injection (e.g., Keranen et al., Geology, 2013; Science, 2014). Moreover, the area of increased seismicity in northern Oklahoma that began in 2013 coincides with the Mississippi Lime play, where well completions greatly increased the year before the seismicity increase. This suggests a link to oil and gas production either directly or from the disposal of significant amounts of produced water within the play. For the purpose of assessing the hazard due to these earthquakes, should they be treated differently from natural earthquakes? Previous studies suggest that induced seismicity may differ from natural seismicity in clustering characteristics or frequency-magnitude distributions (e.g., Bachmann et al., GJI, 2011; Llenos and Michael, BSSA, 2013). These differences could affect time-independent hazard computations, which typically assume that clustering and size distribution remain constant. In Oklahoma, as well as other areas of suspected induced seismicity, we find that earthquakes since 2009 tend to be considerably more clustered in space and time than before 2009. However, differences between various regional and national catalogs leave it unclear whether there are significant changes in magnitude distribution. Whether they are due to natural or industrial causes, the increased earthquake rates in these areas could increase the hazard in ways that are not accounted for in current hazard assessment practice. Clearly the possibility of induced

  20. Fixed recurrence and slip models better predict earthquake behavior than the time- and slip-predictable models: 2. Laboratory earthquakes

    NASA Astrophysics Data System (ADS)

    Rubinstein, Justin L.; Ellsworth, William L.; Beeler, Nicholas M.; Kilgore, Brian D.; Lockner, David A.; Savage, Heather M.

    2012-02-01

    The behavior of individual stick-slip events observed in three different laboratory experimental configurations is better explained by a "memoryless" earthquake model with fixed inter-event time or fixed slip than it is by the time- and slip-predictable models for earthquake occurrence. We make similar findings in the companion manuscript for the behavior of natural repeating earthquakes. Taken together, these results allow us to conclude that the predictions of a characteristic earthquake model that assumes either fixed slip or fixed recurrence interval should be preferred to the predictions of the time- and slip-predictable models for all earthquakes. Given that the fixed slip and recurrence models are the preferred models for all of the experiments we examine, we infer that in an event-to-event sense the elastic rebound model underlying the time- and slip-predictable models does not explain earthquake behavior. This does not indicate that the elastic rebound model should be rejected in a long-term sense, but it should be rejected for short-term predictions. The time- and slip-predictable models likely offer worse predictions of earthquake behavior because they rely on assumptions that are too simple to explain the behavior of earthquakes. Specifically, the time-predictable model assumes a constant failure threshold and the slip-predictable model assumes that there is a constant minimum stress. There is experimental and field evidence that these assumptions are not valid for all earthquakes.
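
    The competing predictions being compared can be written down compactly; the sketch below (synthetic numbers, not the laboratory data) scores the time-predictable and slip-predictable forecasts against memoryless fixed-recurrence and fixed-slip forecasts using a simple RMS misfit, which captures the flavor of the comparison without reproducing the study's statistics.

      import numpy as np

      rng = np.random.default_rng(3)
      v = 1.0                                      # loading rate (arbitrary units)
      n = 50
      slips = rng.uniform(0.8, 1.2, size=n)        # synthetic slip of event i
      intervals = rng.uniform(0.8, 1.2, size=n)    # synthetic time from event i to event i+1

      def rms(pred, obs):
          return np.sqrt(np.mean((pred - obs) ** 2))

      # Time-predictable: the interval *after* event i is set by the slip of event i.
      print("interval RMS, time-predictable :", rms(slips / v, intervals))
      print("interval RMS, fixed recurrence :", rms(np.full(n, intervals.mean()), intervals))
      # Slip-predictable: the slip of event i+1 is set by the interval before it.
      print("slip RMS, slip-predictable     :", rms(v * intervals[:-1], slips[1:]))
      print("slip RMS, fixed slip           :", rms(np.full(n - 1, slips.mean()), slips[1:]))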

  1. Method to Determine Appropriate Source Models of Large Earthquakes Including Tsunami Earthquakes for Tsunami Early Warning in Central America

    NASA Astrophysics Data System (ADS)

    Tanioka, Yuichiro; Miranda, Greyving Jose Arguello; Gusman, Aditya Riadi; Fujii, Yushiro

    2017-08-01

    Large earthquakes, such as the Mw 7.7 1992 Nicaragua earthquake, have occurred off the Pacific coasts of El Salvador and Nicaragua in Central America and have generated destructive tsunamis along these coasts. It is necessary to determine appropriate fault models before large tsunamis hit the coast. In this study, fault parameters were first estimated from the W-phase inversion, and then an appropriate fault model was determined from the fault parameters and scaling relationships with a depth-dependent rigidity. The method was tested for four large earthquakes, the 1992 Nicaragua tsunami earthquake (Mw7.7), the 2001 El Salvador earthquake (Mw7.7), the 2004 El Astillero earthquake (Mw7.0), and the 2012 El Salvador-Nicaragua earthquake (Mw7.3), which occurred off El Salvador and Nicaragua in Central America. Tsunami numerical simulations were carried out from the determined fault models. We found that the observed tsunami heights, run-up heights, and inundation areas were reasonably well explained by the computed ones. Therefore, for tsunami early warning purposes, our method should work to estimate a fault model that reproduces the tsunami heights near the coasts of El Salvador and Nicaragua due to large earthquakes in the subduction zone.
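
    The paper's specific scaling relationships are not reproduced here, but the general idea can be sketched with generic assumptions: take the W-phase moment magnitude and centroid depth, assume a constant stress drop to set the fault dimensions, and let a depth-dependent rigidity control the slip, so that a shallow (low-rigidity) source of the same magnitude yields much larger slip. The rigidity profile, stress drop, and square-fault geometry below are illustrative only.

      def rigidity(depth_km):
          """Hypothetical rigidity profile: lower in the shallow part of the subduction zone."""
          return 1.0e10 if depth_km < 15.0 else 4.0e10       # Pa

      def fault_model(mw, depth_km, stress_drop=3.0e6):
          m0 = 10 ** (1.5 * mw + 9.1)                        # seismic moment, N*m
          length = (m0 / stress_drop) ** (1.0 / 3.0)         # square fault with constant stress drop
          width = length
          slip = m0 / (rigidity(depth_km) * length * width)
          return length / 1e3, width / 1e3, slip             # km, km, m

      for mw, depth in [(7.7, 10.0), (7.7, 30.0)]:
          L, W, D = fault_model(mw, depth)
          print(f"Mw {mw}, depth {depth:4.1f} km -> L ~ {L:.0f} km, W ~ {W:.0f} km, slip ~ {D:.1f} m")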

  2. A physically-based earthquake recurrence model for estimation of long-term earthquake probabilities

    USGS Publications Warehouse

    Ellsworth, William L.; Matthews, Mark V.; Nadeau, Robert M.; Nishenko, Stuart P.; Reasenberg, Paul A.; Simpson, Robert W.

    1999-01-01

    A physically-motivated model for earthquake recurrence based on the Brownian relaxation oscillator is introduced. The renewal process defining this point process model can be described by the steady rise of a state variable from the ground state to failure threshold as modulated by Brownian motion. Failure times in this model follow the Brownian passage time (BPT) distribution, which is specified by the mean time to failure, μ, and the aperiodicity of the mean, α (equivalent to the familiar coefficient of variation). Analysis of 37 series of recurrent earthquakes, M -0.7 to 9.2, suggests a provisional generic value of α = 0.5. For this value of α, the hazard function (instantaneous failure rate of survivors) exceeds the mean rate for times > μ/2, and is ≈ 2/μ for all times > μ. Application of this model to the next M 6 earthquake on the San Andreas fault at Parkfield, California suggests that the annual probability of the earthquake is between 1:10 and 1:13.
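
    A minimal sketch of how the BPT model is used in practice (not the authors' code): with the standard BPT density for mean recurrence μ and aperiodicity α, the probability of failure in the next Δt years, conditioned on the fault having remained quiet for t years, is a ratio of two integrals of that density. The recurrence parameters below are placeholders.

      import numpy as np
      from scipy.integrate import quad

      def bpt_pdf(t, mu, alpha):
          """Brownian passage time (inverse Gaussian) density with mean mu and aperiodicity alpha."""
          return np.sqrt(mu / (2.0 * np.pi * alpha**2 * t**3)) * \
                 np.exp(-(t - mu) ** 2 / (2.0 * mu * alpha**2 * t))

      def conditional_probability(t_elapsed, dt, mu, alpha):
          """P(failure in [t_elapsed, t_elapsed + dt] | no failure before t_elapsed)."""
          num, _ = quad(bpt_pdf, t_elapsed, t_elapsed + dt, args=(mu, alpha))
          den, _ = quad(bpt_pdf, t_elapsed, np.inf, args=(mu, alpha))
          return num / den

      mu, alpha = 25.0, 0.5     # hypothetical mean recurrence (years) and the generic aperiodicity
      p = conditional_probability(t_elapsed=30.0, dt=1.0, mu=mu, alpha=alpha)
      print(f"one-year conditional probability after 30 quiet years: {p:.3f}")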

  3. Triggered earthquakes and the 1811-1812 New Madrid, central United States, earthquake sequence

    USGS Publications Warehouse

    Hough, S.E.

    2001-01-01

    The 1811-1812 New Madrid, central United States, earthquake sequence included at least three events with magnitudes estimated at well above M 7.0. I discuss evidence that the sequence also produced at least three substantial triggered events well outside the New Madrid Seismic Zone, most likely in the vicinity of Cincinnati, Ohio. The largest of these events is estimated to have a magnitude in the low to mid M 5 range. Events of this size are large enough to cause damage, especially in regions with low levels of preparedness. Remotely triggered earthquakes have been observed in tectonically active regions in recent years, but not previously in stable continental regions. The results of this study suggest, however, that potentially damaging triggered earthquakes may be common following large mainshocks in stable continental regions. Thus, in areas of low seismic activity such as central/eastern North America, the hazard associated with localized source zones might be more far-reaching than previously recognized. The results also provide additional evidence that intraplate crust is critically stressed, such that small stress changes are especially effective at triggering earthquakes.

  4. Real-time earthquake data feasible

    NASA Astrophysics Data System (ADS)

    Bush, Susan

    Scientists agree that early warning devices and monitoring of both Hurricane Hugo and the Mt. Pinatubo volcanic eruption saved thousands of lives. What would it take to develop this sort of early warning and monitoring system for earthquake activity? Not all that much, claims a panel assigned to study the feasibility, costs, and technology needed to establish a real-time earthquake monitoring (RTEM) system. The panel, drafted by the National Academy of Sciences' Committee on Seismology, has presented its findings in Real-Time Earthquake Monitoring. The recently released report states that “present technology is entirely capable of recording and processing data so as to provide real-time information, enabling people to mitigate somewhat the earthquake disaster.” RTEM systems would consist of two parts: an early warning system that would give a few seconds warning before severe shaking, and immediate postquake information within minutes of the quake that would give actual measurements of the magnitude. At this time, however, this type of warning system has not been addressed at the national level for the United States and is not included in the National Earthquake Hazard Reduction Program, according to the report.

  5. Critical behavior in earthquake energy dissipation

    NASA Astrophysics Data System (ADS)

    Wanliss, James; Muñoz, Víctor; Pastén, Denisse; Toledo, Benjamín; Valdivia, Juan Alejandro

    2017-09-01

    We explore bursty multiscale energy dissipation from earthquakes flanked by latitudes 29° S and 35.5° S, and longitudes 69.501° W and 73.944° W (in the Chilean central zone). Our work compares the predictions of a theory of nonequilibrium phase transitions with nonstandard statistical signatures of earthquake complex scaling behaviors. For temporal scales less than 84 hours, time development of earthquake radiated energy activity follows an algebraic arrangement consistent with estimates from the theory of nonequilibrium phase transitions. There are no characteristic scales for probability distributions of sizes and lifetimes of the activity bursts in the scaling region. The power-law exponents describing the probability distributions suggest that the main energy dissipation takes place due to largest bursts of activity, such as major earthquakes, as opposed to smaller activations which contribute less significantly though they have greater relative occurrence. The results obtained provide statistical evidence that earthquake energy dissipation mechanisms are essentially "scale-free", displaying statistical and dynamical self-similarity. Our results provide some evidence that earthquake radiated energy and directed percolation belong to a similar universality class.

  6. The next new Madrid earthquake

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Atkinson, W.

    1988-01-01

    Scientists who specialize in the study of Mississippi Valley earthquakes say that the region is overdue for a powerful tremor that will cause major damage and undoubtedly some casualties. The inevitability of a future quake and the lack of preparation by both individuals and communities provided the impetus for this book. It brings together applicable information from many disciplines: history, geology and seismology, engineering, zoology, politics and community planning, economics, environmental science, sociology, and psychology and mental health to provide a perspective of the myriad impacts of a major earthquake on the Mississippi Valley. The author addresses such basic questions as: What, actually, are earthquakes? How do they occur? Can they be predicted, perhaps even prevented? He also addresses those steps that individuals can take to improve their chances for survival both during and after an earthquake.

  7. Centrality in earthquake multiplex networks

    NASA Astrophysics Data System (ADS)

    Lotfi, Nastaran; Darooneh, Amir Hossein; Rodrigues, Francisco A.

    2018-06-01

    Seismic time series have been mapped as complex networks, where a geographical region is divided into square cells that represent the nodes and connections are defined according to the sequence of earthquakes. In this paper, we map a seismic time series to a temporal network, described by a multiplex network, and characterize the evolution of the network structure in terms of the eigenvector centrality measure. We generalize previous works that considered the single-layer representation of earthquake networks. Our results suggest that the multiplex representation captures earthquake activity better than methods based on single-layer networks. We also verify that the regions with the highest seismological activity in Iran and California can be identified from the network centrality analysis. The temporal modeling of seismic data provided here may open new possibilities for a better comprehension of the physics of earthquakes.
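
    A single-layer version of the construction described above can be sketched as follows (synthetic catalog, not the paper's multiplex method): epicenters are binned into grid cells, the cells of temporally successive events are linked, and cells are then ranked by eigenvector centrality.

      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(7)
      lats = rng.uniform(32.0, 36.0, size=500)        # synthetic epicentral latitudes
      lons = rng.uniform(-120.0, -115.0, size=500)    # synthetic epicentral longitudes

      cell = 0.5                                      # grid-cell size in degrees
      cells = [(int(lat // cell), int(lon // cell)) for lat, lon in zip(lats, lons)]

      G = nx.Graph()
      for a, b in zip(cells[:-1], cells[1:]):         # link cells of temporally successive events
          if a != b:
              G.add_edge(a, b)

      centrality = nx.eigenvector_centrality_numpy(G)
      top_cells = sorted(centrality, key=centrality.get, reverse=True)[:5]
      print("most central cells:", top_cells)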

  8. Parallelization of the Coupled Earthquake Model

    NASA Technical Reports Server (NTRS)

    Block, Gary; Li, P. Peggy; Song, Yuhe T.

    2007-01-01

    This Web-based tsunami simulation system allows users to remotely run a model on JPL's supercomputers for a given undersea earthquake. At the time of this reporting, predicting tsunamis over the Internet had never been done before. This new code directly couples the earthquake model and the ocean model on parallel computers and improves simulation speed. Seismometers can only detect information from earthquakes; they cannot detect whether or not a tsunami may occur as a result of the earthquake. When earthquake-tsunami models are coupled with the improved computational speed of modern, high-performance computers and constrained by remotely sensed data, they are able to provide early warnings for those coastal regions at risk. The software is capable of testing NASA's satellite observations of tsunamis. It has been successfully tested for several historical tsunamis, has passed all alpha and beta testing, and is well documented for users.

  9. Luzon earthquake strongest in 90 years

    NASA Astrophysics Data System (ADS)

    The magnitude 7.7 Philippine earthquake that took place 2 weeks ago was the strongest recorded on the island of Luzon in nearly 90 years and the strongest in all of the Philippines in nearly 14 years, according to the U.S. Geological Survey.The earthquake occurred 60 miles north of Manila and was the third strongest recorded on Luzon, exceeded only by an earthquake with an estimated magnitude of 7.8, on December 14, 1901, near Lucena, about 80 miles southeast of Manila, and an earthquake with an estimated magnitude of 7.9 on August 15, 1897, off the northwest coast of Luzon.

  10. St. Louis Area Earthquake Hazards Mapping Project

    USGS Publications Warehouse

    Williams, Robert A.; Steckel, Phyllis; Schweig, Eugene

    2007-01-01

    St. Louis has experienced minor earthquake damage at least 12 times in the past 200 years. Because of this history and its proximity to known active earthquake zones, the St. Louis Area Earthquake Hazards Mapping Project will produce digital maps that show variability of earthquake hazards in the St. Louis area. The maps will be available free via the internet. They can be customized by the user to show specific areas of interest, such as neighborhoods or transportation routes.

  11. The California Post-Earthquake Information Clearinghouse: A Plan to Learn From the Next Large California Earthquake

    NASA Astrophysics Data System (ADS)

    Loyd, R.; Walter, S.; Fenton, J.; Tubbesing, S.; Greene, M.

    2008-12-01

    In the rush to remove debris after a damaging earthquake, perishable data related to a wide range of impacts on the physical, built and social environments can be lost. The California Post-Earthquake Information Clearinghouse is intended to prevent this data loss by supporting the earth scientists, engineers, and social and policy researchers who will conduct fieldwork in the affected areas in the hours and days following the earthquake to study these effects. First called for by Governor Ronald Reagan following the destructive M6.5 San Fernando earthquake in 1971, the concept of the Clearinghouse has since been incorporated into the response plans of the National Earthquake Hazard Reduction Program (USGS Circular 1242). This presentation is intended to acquaint scientists with the purpose, functions, and services of the Clearinghouse. Typically, the Clearinghouse is set up in the vicinity of the earthquake within 24 hours of the mainshock and is maintained for several days to several weeks. It provides a location where field researchers can assemble to share and discuss their observations, plan and coordinate subsequent field work, and communicate significant findings directly to the emergency responders and to the public through press conferences. As the immediate response effort winds down, the Clearinghouse will ensure that collected data are archived and made available through "lessons learned" reports and publications that follow significant earthquakes. Participants in the quarterly meetings of the Clearinghouse include representatives from state and federal agencies, universities, NGOs and other private groups. Overall management of the Clearinghouse is delegated to the agencies represented by the authors above.

  12. Magnitude and location of historical earthquakes in Japan and implications for the 1855 Ansei Edo earthquake

    USGS Publications Warehouse

    Bakun, W.H.

    2005-01-01

    Japan Meteorological Agency (JMA) intensity assignments IJMA are used to derive intensity attenuation models suitable for estimating the location and an intensity magnitude Mjma for historical earthquakes in Japan. The intensity for shallow crustal earthquakes on Honshu is equal to -1.89 + 1.42 MJMA - 0.00887 Δh - 1.66 log Δh, where MJMA is the JMA magnitude, Δh = (Δ² + h²)^(1/2), and Δ and h are epicentral distance and focal depth (km), respectively. Four earthquakes located near the Japan Trench were used to develop a subducting-plate intensity attenuation model where intensity is equal to -8.33 + 2.19 MJMA - 0.00550 Δh - 1.14 log Δh. The IJMA assignments for the MJMA 7.9 great 1923 Kanto earthquake on the Philippine Sea-Eurasian plate interface are consistent with the subducting-plate model. Using the subducting-plate model and 226 IJMA IV-VI assignments, the location of the intensity center is 25 km north of the epicenter, Mjma is 7.7, and MJMA is 7.3-8.0 at the 1σ confidence level. Intensity assignments and reported aftershock activity for the enigmatic 11 November 1855 Ansei Edo earthquake are consistent with an MJMA 7.2 Philippine Sea-Eurasian interplate source or Philippine Sea intraslab source at about 30 km depth. If the 1855 earthquake was a Philippine Sea-Eurasian interplate event, the intensity center was adjacent to and downdip of the rupture area of the great 1923 Kanto earthquake, suggesting that the 1855 and 1923 events ruptured adjoining sections of the Philippine Sea-Eurasian plate interface.
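
    The two attenuation relations quoted above can be evaluated directly; the sketch below assumes the logarithm is base 10 and that distances and depths are in kilometers, as in the abstract.

```python
import math

# Sketch of the two intensity-attenuation relations quoted above.
# delta = epicentral distance (km), h = focal depth (km); log assumed base 10.

def ijma_crustal(mjma, delta, h):
    """Predicted JMA intensity for shallow crustal earthquakes on Honshu."""
    dh = math.sqrt(delta**2 + h**2)
    return -1.89 + 1.42 * mjma - 0.00887 * dh - 1.66 * math.log10(dh)

def ijma_subducting(mjma, delta, h):
    """Predicted JMA intensity for subducting-plate earthquakes."""
    dh = math.sqrt(delta**2 + h**2)
    return -8.33 + 2.19 * mjma - 0.00550 * dh - 1.14 * math.log10(dh)

# Example: an MJMA 7.9 interface event observed 50 km from the epicenter at 30 km depth
print(round(ijma_subducting(7.9, 50.0, 30.0), 1))
```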

  13. Earthquakes in Virginia and vicinity 1774 - 2004

    USGS Publications Warehouse

    Tarr, Arthur C.; Wheeler, Russell L.

    2006-01-01

    This map summarizes two and a third centuries of earthquake activity. The seismic history consists of letters, journals, diaries, and newspaper and scholarly articles that supplement seismograph recordings (seismograms) dating from the early twentieth century to the present. All of the pre-instrumental (historical) earthquakes were large enough to be felt by people or to cause shaking damage to buildings and their contents. Later, widespread use of seismographs meant that tremors too small or distant to be felt could be detected and accurately located. Earthquakes are a legitimate concern in Virginia and parts of adjacent States. Moderate earthquakes cause slight local damage somewhere in the map area about twice a decade on the average. Additionally, many buildings in the map area were constructed before earthquake protection was added to local building codes. The large map shows all historical and instrumentally located earthquakes from 1774 through 2004.

  14. Earthquake activity along the Himalayan orogenic belt

    NASA Astrophysics Data System (ADS)

    Bai, L.; Mori, J. J.

    2017-12-01

    The collision between the Indian and Eurasian plates formed the Himalayas, the largest orogenic belt on Earth. The entire region accommodates shallow earthquakes, while intermediate-depth earthquakes are concentrated at the eastern and western Himalayan syntaxes. Here we investigate the focal depths, fault plane solutions, and source rupture processes for three earthquake sequences, located in the western, central and eastern regions of the Himalayan orogenic belt. The Pamir-Hindu Kush region is located at the western Himalayan syntaxis and is characterized by extreme shortening of the upper crust and strong interaction of various layers of the lithosphere. Many shallow earthquakes occur on the Main Pamir Thrust at focal depths shallower than 20 km, while intermediate-depth earthquakes are mostly located below 75 km. Large intermediate-depth earthquakes occur frequently at the western Himalayan syntaxis, about every 10 years on average. The 2015 Nepal earthquake is located in the central Himalayas. It is a typical megathrust earthquake that occurred on the shallow portion of the Main Himalayan Thrust (MHT). Many of the aftershocks are located above the MHT and illuminate faulting structures in the hanging wall with dip angles that are steeper than the MHT. These observations provide new constraints on the collision and uplift processes of the Himalayan orogenic belt. The Indo-Burma region is located south of the eastern Himalayan syntaxis, where the strike of the plate boundary suddenly changes from nearly east-west at the Himalayas to nearly north-south at the Burma Arc. The Burma arc subduction zone is a typical oblique plate convergence zone. The eastern boundary is the north-south-striking dextral Sagaing fault, which hosts many shallow earthquakes with focal depths less than 25 km. In contrast, intermediate-depth earthquakes along the subduction zone reflect east-west-trending reverse faulting.

  15. Statistical tests of simple earthquake cycle models

    USGS Publications Warehouse

    Devries, Phoebe M. R.; Evans, Eileen

    2016-01-01

    A central goal of observing and modeling the earthquake cycle is to forecast when a particular fault may generate an earthquake: a fault late in its earthquake cycle may be more likely to generate an earthquake than a fault early in its earthquake cycle. Models that can explain geodetic observations throughout the entire earthquake cycle may be required to gain a more complete understanding of relevant physics and phenomenology. Previous efforts to develop unified earthquake models for strike-slip faults have largely focused on explaining both preseismic and postseismic geodetic observations available across a few faults in California, Turkey, and Tibet. An alternative approach leverages the global distribution of geodetic and geologic slip rate estimates on strike-slip faults worldwide. Here we use the Kolmogorov-Smirnov test for similarity of distributions to infer, in a statistically rigorous manner, viscoelastic earthquake cycle models that are inconsistent with 15 sets of observations across major strike-slip faults. We reject a large subset of two-layer models incorporating Burgers rheologies at a significance level of α = 0.05 (those with long-term Maxwell viscosities ηM ≲ 4.0 × 10^19 Pa s and ηM ≳ 4.6 × 10^20 Pa s) but cannot reject models on the basis of transient Kelvin viscosity ηK. Finally, we examine the implications of these results for the predicted earthquake cycle timing of the 15 faults considered and compare these predictions to the geologic and historical record.
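
    As an illustration of the statistical machinery named above, the sketch below applies a two-sample Kolmogorov-Smirnov test (scipy.stats.ks_2samp) to synthetic placeholder data; the observed and model-predicted samples are invented, not the 15 fault observations used in the study.

```python
import numpy as np
from scipy.stats import ks_2samp

# Compare an observed distribution (e.g., geodetic/geologic slip-rate ratios across
# faults) with the distribution predicted by a candidate earthquake-cycle model and
# reject the model at alpha = 0.05. All numbers below are placeholders.

rng = np.random.default_rng(0)
observed_ratios = rng.normal(loc=1.0, scale=0.2, size=15)      # placeholder observations
model_predicted = rng.normal(loc=1.4, scale=0.2, size=1000)    # placeholder model samples

stat, p_value = ks_2samp(observed_ratios, model_predicted)
alpha = 0.05
print(f"KS statistic = {stat:.3f}, p = {p_value:.3g}")
print("reject model" if p_value < alpha else "cannot reject model")
```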

  16. Response of a 14-story Anchorage, Alaska, building in 2002 to two close earthquakes and two distant Denali fault earthquakes

    USGS Publications Warehouse

    Celebi, M.

    2004-01-01

    The recorded responses of an Anchorage, Alaska, building during four significant earthquakes that occurred in 2002 are studied. Two earthquakes, including the 3 November 2002 M7.9 Denali fault earthquake, with epicenters approximately 275 km from the building, generated long trains of long-period (>1 s) surface waves. The other two smaller earthquakes occurred at subcrustal depths practically beneath Anchorage and produced higher frequency motions. These two pairs of earthquakes have different impacts on the response of the building. Higher modes are more pronounced in the building response during the smaller nearby events. The building responses indicate that the close-coupling of translational and torsional modes causes a significant beating effect. It is also possible that there is some resonance occurring due to the site frequency being close to the structural frequency. Identification of dynamic characteristics and behavior of buildings can provide important lessons for future earthquake-resistant designs and retrofit of existing buildings. ?? 2004, Earthquake Engineering Research Institute.

  17. Induced earthquake magnitudes are as large as (statistically) expected

    USGS Publications Warehouse

    Van Der Elst, Nicholas; Page, Morgan T.; Weiser, Deborah A.; Goebel, Thomas; Hosseini, S. Mehran

    2016-01-01

    A major question for the hazard posed by injection-induced seismicity is how large induced earthquakes can be. Are their maximum magnitudes determined by injection parameters or by tectonics? Deterministic limits on induced earthquake magnitudes have been proposed based on the size of the reservoir or the volume of fluid injected. However, if induced earthquakes occur on tectonic faults oriented favorably with respect to the tectonic stress field, then they may be limited only by the regional tectonics and connectivity of the fault network. In this study, we show that the largest magnitudes observed at fluid injection sites are consistent with the sampling statistics of the Gutenberg-Richter distribution for tectonic earthquakes, assuming no upper magnitude bound. The data pass three specific tests: (1) the largest observed earthquake at each site scales with the log of the total number of induced earthquakes, (2) the order of occurrence of the largest event is random within the induced sequence, and (3) the injected volume controls the total number of earthquakes rather than the total seismic moment. All three tests point to an injection control on earthquake nucleation but a tectonic control on earthquake magnitude. Given that the largest observed earthquakes are exactly as large as expected from the sampling statistics, we should not conclude that these are the largest earthquakes possible. Instead, the results imply that induced earthquake magnitudes should be treated with the same maximum magnitude bound that is currently used to treat seismic hazard from tectonic earthquakes.
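
    A small simulation illustrates the sampling-statistics argument (test 1 above): for an unbounded Gutenberg-Richter distribution, the expected maximum of N sampled magnitudes grows roughly as log10(N)/b above the completeness magnitude. The b-value, completeness magnitude, and event counts below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
b = 1.0       # assumed Gutenberg-Richter b-value
m_min = 0.5   # assumed completeness magnitude of an induced sequence

for n_events in (100, 1_000, 10_000):
    # GR magnitudes above m_min are exponentially distributed with rate b*ln(10)
    mags = m_min + rng.exponential(scale=1.0 / (b * np.log(10)), size=n_events)
    expected_max = m_min + np.log10(n_events) / b
    print(f"N={n_events:6d}  observed max={mags.max():.2f}  expected max≈{expected_max:.2f}")
```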

  18. Volcanotectonic earthquakes induced by propagating dikes

    NASA Astrophysics Data System (ADS)

    Gudmundsson, Agust

    2016-04-01

    Volcanotectonic earthquakes are of high frequency and mostly generated by slip on faults. During chamber expansion/contraction, earthquakes are distributed in the chamber roof. Following magma-chamber rupture and dike injection, however, earthquakes tend to concentrate around the dike and follow its propagation path, resulting in an earthquake swarm characterised by a number of earthquakes of similar magnitudes. I distinguish between two basic processes by which propagating dikes induce earthquakes. One is due to stress concentration in the process zone at the tip of the dike; the other relates to stresses induced in the walls and surrounding rocks on either side of the dike. As to the first process, some earthquakes generated at the dike tip are related to pure extension fracturing as the tip advances and the dike path forms. Formation of pure extension fractures normally induces non-double-couple earthquakes. There is also shear fracturing in the process zone, however, particularly normal faulting, which produces double-couple earthquakes. The second process relates primarily to slip on existing fractures in the host rock induced by the driving pressure of the propagating dike. Such pressures easily reach 5-20 MPa and induce compressive and shear stresses in the adjacent host rock, which already contains numerous fractures (mainly joints) of different attitudes. In piles of lava flows or sedimentary beds, the original joints are primarily vertical and horizontal. Similarly, the contacts between the layers/beds are originally horizontal. As the layers/beds become buried, the joints and contacts gradually tilt so that they become oblique to the horizontal compressive stress induced by the driving pressure of the (vertical) dike. Also, most of the hexagonal (or pentagonal) columnar joints in the lava flows are, from the beginning, oblique to an intrusive sheet of any attitude. Consequently, the joints and contacts function as potential shear

  19. Post-Earthquake Reconstruction — in Context of Housing

    NASA Astrophysics Data System (ADS)

    Sarkar, Raju

    Comprehensive rescue and relief operations are launched without loss of time after each natural disaster, with the active participation of the army, governmental agencies, donor agencies, NGOs, and other voluntary organizations. Natural disasters occur throughout the world all year round, and the earthquake is one of them. More than any other natural catastrophe, an earthquake represents the undoing of our most basic preconceptions of the earth as a source of stability, and the first distressing consequence of an earthquake is the collapse of our dwelling units. Earthquakes have affected buildings since people began constructing them, so after each earthquake a housing reconstruction program is essential, since housing is a shelter satisfying one of the so-called basic needs, next to food and clothing. It is a well-known fact that resettlement (after an earthquake) is often accompanied by the creation of ghettos and ensuing problems in the provision of infrastructure and employment. In fact, a housing project after the Bhuj earthquake in Gujarat, India, illustrates all the negative aspects of resettlement in the context of reconstruction. The main theme of this paper is to consider a few issues associated with post-earthquake reconstruction in the context of housing, all of which are significant to communities that have had to rebuild after a catastrophe or that will face such a need in the future. A few of them are as follows: (1) Why are rebuilding efforts time consuming? (2) What are the causes of failure in post-earthquake resettlement? (3) How can holistic planning after an earthquake be carried out? (4) What criteria should be checked for sustainable building materials? (5) What are the criteria for success in post-earthquake resettlement? (6) How can mitigation in post-earthquake housing be achieved using appropriate repair, restoration, and strengthening concepts?

  20. Investigating landslides caused by earthquakes - A historical review

    USGS Publications Warehouse

    Keefer, D.K.

    2002-01-01

    Post-earthquake field investigations of landslide occurrence have provided a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides. This paper traces the historical development of knowledge derived from these investigations. Before 1783, historical accounts of the occurrence of landslides in earthquakes are typically so incomplete and vague that conclusions based on these accounts are of limited usefulness. For example, the number of landslides triggered by a given event is almost always greatly underestimated. The first formal, scientific post-earthquake investigation that included systematic documentation of the landslides was undertaken in the Calabria region of Italy after the 1783 earthquake swarm. From then until the mid-twentieth century, the best information on earthquake-induced landslides came from a succession of post-earthquake investigations largely carried out by formal commissions that undertook extensive ground-based field studies. Beginning in the mid-twentieth century, when the use of aerial photography became widespread, comprehensive inventories of landslide occurrence have been made for several earthquakes in the United States, Peru, Guatemala, Italy, El Salvador, Japan, and Taiwan. Techniques have also been developed for performing "retrospective" analyses years or decades after an earthquake that attempt to reconstruct the distribution of landslides triggered by the event. The additional use of Geographic Information System (GIS) processing and digital mapping since about 1989 has greatly facilitated the level of analysis that can be applied to mapped distributions of landslides. Beginning in 1984, syntheses of worldwide and national data on earthquake-induced landslides have defined their general characteristics and the relations between their occurrence and various geologic and seismic parameters. However, the number of comprehensive post-earthquake studies of landslides is still

  1. Regional and Local Glacial-Earthquake Patterns in Greenland

    NASA Astrophysics Data System (ADS)

    Olsen, K.; Nettles, M.

    2016-12-01

    Icebergs calved from marine-terminating glaciers currently account for up to half of the 400 Gt of ice lost annually from the Greenland ice sheet (Enderlin et al., 2014). When large capsizing icebergs (~1 Gt of ice) calve, they produce elastic waves that propagate through the solid earth and are observed as teleseismically detectable MSW ~5 glacial earthquakes (e.g., Ekström et al., 2003; Nettles & Ekström, 2010; Tsai & Ekström, 2007; Veitch & Nettles, 2012). The annual number of these events has increased dramatically over the past two decades. We analyze glacial earthquakes from 2011-2013, which expands the glacial-earthquake catalog by 50%. The number of glacial-earthquake solutions now available allows us to investigate regional patterns across Greenland and link earthquake characteristics to changes in ice dynamics at individual glaciers. During the years of our study, Greenland's west coast dominated glacial-earthquake production. Kong Oscar Glacier, Upernavik Isstrøm, and Jakobshavn Isbræ all produced more glacial earthquakes during this time than in preceding years. We link patterns in glacial-earthquake production and cessation to the presence or absence of floating ice tongues at glaciers on both coasts of Greenland. The calving model predicts glacial-earthquake force azimuths oriented perpendicular to the calving front, and comparisons between seismic data and satellite imagery confirm this in most instances. At two glaciers we document force azimuths that have recently changed orientation and confirm that similar changes have occurred in the calving-front geometry. We also document glacial earthquakes at one previously quiescent glacier. Consistent with previous work, we model the glacial-earthquake force-time function as a boxcar with horizontal and vertical force components that vary synchronously. We investigate limitations of this approach and explore improvements that could lead to a more accurate representation of the glacial earthquake source.

  2. Post-earthquake dilatancy recovery

    NASA Technical Reports Server (NTRS)

    Scholz, C. H.

    1974-01-01

    Geodetic measurements of the 1964 Niigata, Japan earthquake and of three other examples are briefly examined. They show exponentially decaying subsidence for a year after the quakes. The observations confirm the dilatancy-fluid diffusion model of earthquake precursors and clarify the extent and properties of the dilatant zone. An analysis using one-dimensional consolidation theory is included which agrees well with this interpretation.

  3. Measures for groundwater security during and after the Hanshin-Awaji earthquake (1995) and the Great East Japan earthquake (2011), Japan

    NASA Astrophysics Data System (ADS)

    Tanaka, Tadashi

    2016-03-01

    Many big earthquakes have occurred in tectonically active regions of the world, especially in Japan. Earthquakes often cause damage to crucial lifeline services such as water, gas and electricity supply systems and even the sewage system in urban and rural areas. The most severe problem for people affected by earthquakes is access to water for drinking/cooking and toilet flushing. Securing safe water for daily life in an earthquake emergency requires the establishment of countermeasures, especially in a mega city like Tokyo. This paper describes some examples of groundwater use in earthquake emergencies, with reference to reports, books and newspapers published in Japan. The consensus is that groundwater, as a source of water, plays a major role in earthquake emergencies, especially where the accessibility of wells coincides with the emergency need. It is also important to introduce a registration system for citizen-owned and company wells that can form the basis of a cooperative during a disaster; such a registration system was implemented by many Japanese local governments after the Hanshin-Awaji Earthquake in 1995 and the Great East Japan Earthquake in 2011, and is one of the most effective countermeasures for groundwater use in an earthquake emergency. Emphasis is also placed on the importance of establishing a continuous monitoring system of groundwater conditions, for both quantity and quality, during non-emergency periods.

  4. Possibility of viscoelastic stress transfer triggering of the 2007 Chuetsu-Oki earthquake by the 2004 Chuetsu earthquake, Japan

    NASA Astrophysics Data System (ADS)

    Cho, I.; Ohtani, R.; Kuwahara, Y.; Abe, Y.

    2009-12-01

    The Chuetsu district, central Japan, recently experienced two large earthquakes separated by only about 40 km in space and just 3 years in time: the 2004 Chuetsu earthquake (Mw 6.5) and the 2007 Chuetsu-Oki earthquake (Mw 6.6). There has been debate about whether the 2007 Chuetsu-Oki earthquake was induced by the 2004 Chuetsu earthquake. The changes in the Coulomb failure function (DCFF) due to the 2004 earthquake showed negative values around the faults of the 2007 earthquake. However, the region where the two earthquakes occurred is characterized by thick sediments (6 km) and high geothermal gradients, which may not be appropriately modeled by the homogeneous elastic half-space assumed in the DCFF calculation. In this study, we examined the impacts of three-dimensional inhomogeneity and viscoelastic properties of the medium on the DCFF calculation in order to assess the possibility that the two earthquakes are related. We modeled the subsurface structure with three layers: an upper crust, a lower crust, and an upper mantle. The geometry of the layer boundaries, the Conrad and the Moho, was specified in two ways: either as horizontal planes at depths of 15 and 30 km, or as curved surfaces inferred from the seismic analysis of Zhao et al. (1992). As for the material properties, the upper crust was assumed elastic while the deeper two layers were assumed viscoelastic. To investigate the sensitivity of the DCFF calculation to viscosity, several combinations of viscosity coefficients were used for the lower crust and the upper mantle, respectively: {1e18, 1e18} Pa s, {1e19, 1e19} Pa s, {1e18, 1e19} Pa s, and so on. The elastic constants, P- and S-wave velocities, and densities were assumed (1) to be uniform throughout the medium, (2) to have representative values within each layer, and (3) to have three-dimensionally variable values based on the seismic tomography of Matsubara et al. (2008). A
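
    For reference, the quantity being computed is conceptually simple even though the stress changes themselves come from a layered (visco)elastic model; a minimal sketch, with placeholder numbers and an assumed effective friction coefficient, is given below (positive DCFF means the receiver fault is brought closer to failure).

```python
# DCFF = d_tau + mu' * d_sigma_n, where d_tau is the shear-stress change resolved in
# the slip direction, d_sigma_n the normal-stress change (positive = unclamping), and
# mu' an effective friction coefficient. The inputs would come from the layered model;
# the numbers here are placeholders.

def coulomb_failure_change(d_tau_mpa, d_sigma_n_mpa, mu_eff=0.4):
    """Return DCFF in MPa on a receiver fault."""
    return d_tau_mpa + mu_eff * d_sigma_n_mpa

# Example: 0.05 MPa shear-stress increase combined with 0.10 MPa of clamping
print(coulomb_failure_change(0.05, -0.10))   # -> 0.01 MPa
```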

  5. MyShake - A smartphone app to detect earthquake

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.; Schreier, L.; Kwon, Y. W.

    2015-12-01

    We designed an Android app that harnesses the accelerometers in personal smartphones to record earthquake-shaking data for research, hazard information, and warnings. The app distinguishes earthquake shaking from daily human activity based on the different patterns behind the movements. It can also be triggered by the traditional earthquake early warning (EEW) system to record for a certain amount of time to collect earthquake data. When the app is triggered by earthquake-like movement, it sends the trigger information, containing the time and location of the trigger, back to our server; at the same time, it stores the waveform data locally on the phone first and uploads it to our server later. Trigger information from multiple phones is processed in real time on the server to find coherent signals and confirm earthquakes. The app therefore provides the basis for a smartphone seismic network that can detect earthquakes and even provide warnings. A planned public roll-out of MyShake could collect millions of seismic recordings for large earthquakes in many regions around the world.
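
    A hedged sketch of the server-side idea, not the MyShake implementation: declare a candidate earthquake when enough phones trigger within a short time window and a small radius. The Trigger fields, the 10 s window, the 30 km radius, and the 4-phone threshold are assumptions for illustration.

```python
import math
from dataclasses import dataclass
from typing import List

@dataclass
class Trigger:
    t: float      # trigger time (s, epoch)
    lat: float
    lon: float

def _km(a: Trigger, b: Trigger) -> float:
    """Great-circle distance in km (haversine)."""
    r = 6371.0
    p1, p2 = math.radians(a.lat), math.radians(b.lat)
    dp, dl = p2 - p1, math.radians(b.lon - a.lon)
    h = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(h))

def coherent_event(triggers: List[Trigger], t_window=10.0, radius_km=30.0, min_phones=4) -> bool:
    """Return True if any trigger has >= min_phones neighbours in time and space."""
    for ref in triggers:
        neighbours = [tr for tr in triggers
                      if abs(tr.t - ref.t) <= t_window and _km(ref, tr) <= radius_km]
        if len(neighbours) >= min_phones:
            return True
    return False

# Five phones trigger within 10 s and a few km of each other -> candidate event
triggers = [Trigger(100.0 + i, 37.87 + 0.01 * i, -122.26) for i in range(5)]
print(coherent_event(triggers))   # True
```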

  6. Earthquake Education in Prime Time

    NASA Astrophysics Data System (ADS)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in spring 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and the Department of Homeland Security's Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU), on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story of earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

  7. Defeating Earthquakes

    NASA Astrophysics Data System (ADS)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question, how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake. GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  8. Larger earthquakes recur more periodically: New insights in the megathrust earthquake cycle from lacustrine turbidite records in south-central Chile

    NASA Astrophysics Data System (ADS)

    Moernaut, J.; Van Daele, M.; Fontijn, K.; Heirman, K.; Kempf, P.; Pino, M.; Valdebenito, G.; Urrutia, R.; Strasser, M.; De Batist, M.

    2018-01-01

    Historical and paleoseismic records in south-central Chile indicate that giant earthquakes on the subduction megathrust - such as in AD1960 (Mw 9.5) - reoccur on average every ∼300 yr. Based on geodetic calculations of the interseismic moment accumulation since AD1960, it was postulated that the area already has the potential for a Mw 8 earthquake. However, to estimate the probability of such a great earthquake to take place in the short term, one needs to frame this hypothesis within the long-term recurrence pattern of megathrust earthquakes in south-central Chile. Here we present two long lacustrine records, comprising up to 35 earthquake-triggered turbidites over the last 4800 yr. Calibration of turbidite extent with historical earthquake intensity reveals a different macroseismic intensity threshold (≥ VII½ vs. ≥ VI½) for the generation of turbidites at the coring sites. The strongest earthquakes (≥ VII½) have longer recurrence intervals (292 ± 93 yr) than earthquakes with intensity of ≥ VI½ (139 ± 69 yr). Moreover, distribution fitting and the coefficient of variation (CoV) of inter-event times indicate that the stronger earthquakes recur in a more periodic way (CoV: 0.32 vs. 0.5). Regional correlation of our multi-threshold shaking records with coastal paleoseismic data of complementary nature (tsunami, coseismic subsidence) suggests that the intensity ≥ VII½ events repeatedly ruptured the same part of the megathrust over a distance of at least ∼300 km and can be assigned to Mw ≥ 8.6. We hypothesize that a zone of high plate locking - identified by geodetic studies and large slip in AD 1960 - acts as a dominant regional asperity, on which elastic strain builds up over several centuries and mostly gets released in quasi-periodic great and giant earthquakes. Our paleo-records indicate that Poissonian recurrence models are inadequate to describe large megathrust earthquake recurrence in south-central Chile. Moreover, they show an enhanced
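
    The recurrence statistic quoted above (the CoV of inter-event times) is straightforward to compute; the sketch below uses invented turbidite ages, not the lake records, purely to show the calculation (CoV near 1 is Poisson-like, well below 1 is quasi-periodic).

```python
import numpy as np

# Invented event ages (years before present), oldest first, as placeholders
event_ages_yr = np.array([4700, 4380, 4050, 3790, 3480, 3150, 2900, 2560,
                          2300, 1980, 1700, 1420, 1100, 810, 520, 230, 0])

intervals = -np.diff(event_ages_yr)                     # inter-event times (yr)
cov = intervals.std(ddof=1) / intervals.mean()          # coefficient of variation
print(f"mean recurrence = {intervals.mean():.0f} yr, CoV = {cov:.2f}")
```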

  9. Earthquakes of Garhwal Himalaya region of NW Himalaya, India: A study of relocated earthquakes and their seismogenic source and stress

    NASA Astrophysics Data System (ADS)

    R, A. P.; Paul, A.; Singh, S.

    2017-12-01

    Since the continent-continent collision at 55 Ma, the Himalaya has accommodated about 2000 km of convergence along its arc. Strain energy accumulates at a rate of 37-44 mm/yr and is released at times as earthquakes. The Garhwal Himalaya is located on the western side of a seismic gap where a great earthquake has been overdue for at least 200 years. This seismic gap (Central Seismic Gap: CSG), with a 52% probability of a future great earthquake, is located between the rupture zones of two significant/great earthquakes, viz. the 1905 Kangra earthquake of M 7.8 and the 1934 Bihar-Nepal earthquake of M 8.0; the most recent one, the 2015 Gorkha earthquake of M 7.8, lies on the eastern side of this seismic gap (CSG). The Garhwal Himalaya is one of the ideal locations in the Himalaya where all the major Himalayan structures and the Himalayan Seismicity Belt (HSB) can be well described and studied. In the present study, we present a spatio-temporal analysis of relocated local micro- to moderate earthquakes recorded by a seismic monitoring network that has been operational since 2007. The earthquake locations are relocated using the HypoDD program (the double-difference hypocenter method for earthquake relocation). The dataset from July 2007 to September 2015 has been used in this study to estimate the spatio-temporal relationships, moment tensor (MT) solutions for earthquakes of M>3.0, stress tensors, and their interactions. We have also used composite focal mechanism solutions for small earthquakes. The majority of the MT solutions show thrust-type mechanisms and are located near the mid-crustal-ramp (MCR) structure of the detachment surface at 8-15 km depth beneath the outer Lesser Himalaya and Higher Himalaya regions. The prevailing stress has been identified as compressional, oriented NNE-SSW, which is the direction of relative plate motion between the Indian and Eurasian continental plates. The low friction coefficient estimated along with the stress inversions

  10. NEIC; the National Earthquake Information Center

    USGS Publications Warehouse

    Masse, R.P.; Needham, R.E.

    1989-01-01

    At least 9,500 people were killed, 30,000 were injured, and 100,000 were left homeless by this earthquake. According to some unconfirmed reports, the death toll may have been as high as 35,000. This earthquake is estimated to have seriously affected an area of 825,000 square kilometers, caused between 3 and 4 billion dollars in damage, and been felt by 20 million people.

  11. Tremor, remote triggering and earthquake cycle

    NASA Astrophysics Data System (ADS)

    Peng, Z.

    2012-12-01

    Deep tectonic tremor and episodic slow-slip events have been observed at major plate-boundary faults around the Pacific Rim. These events have much longer source durations than regular earthquakes and are generally located near or below the seismogenic zone where regular earthquakes occur. Tremor and slow-slip events appear to be extremely stress sensitive and can be instantaneously triggered by distant earthquakes and solid-earth tides. However, many important questions remain open. For example, it is still not clear what the necessary conditions for tremor generation are, and how remote triggering could affect the large-earthquake cycle. Here I report a global search for tremor triggered by recent large teleseismic earthquakes. We mainly focus on major subduction zones around the Pacific Rim. These include the southwest and northeast Japan subduction zones, the Hikurangi subduction zone in New Zealand, the Cascadia subduction zone, and the major subduction zones in Central and South America. In addition, we examine major strike-slip faults around the Caribbean plate, the Queen Charlotte fault along the northern Pacific Northwest coast, and the San Andreas fault system in California. In each place, we first identify triggered tremor as a high-frequency, non-impulsive signal that is in phase with the large-amplitude teleseismic waves. We also calculate the dynamic stress and check the triggering relationship with the Love and Rayleigh waves. Finally, we calculate the triggering potential from the local fault orientation and surface-wave incidence angles. Our results suggest that tremor exists at many plate-boundary faults in different tectonic environments and can be triggered by dynamic stresses as low as a few kPa. In addition, we summarize recent observations of slow-slip events and earthquake swarms triggered by large distant earthquakes. Finally, we propose several mechanisms that could explain the apparent clustering of large earthquakes around the world.

  12. Mexican Earthquakes and Tsunamis Catalog Reviewed

    NASA Astrophysics Data System (ADS)

    Ramirez-Herrera, M. T.; Castillo-Aja, R.

    2015-12-01

    Today the availability of information on the internet makes online catalogs very easy to access by both scholars and the general public. The catalog in the "Significant Earthquake Database", managed by the National Center for Environmental Information (NCEI, formerly NCDC), NOAA, allows access by deploying tabular and cartographic data related to earthquakes and tsunamis contained in the database. The NCEI catalog is the product of compiling previously existing catalogs, historical sources, newspapers, and scientific articles. Because the NCEI catalog has global coverage, the information is not homogeneous. The existence of historical information depends on the presence of people in the places where a disaster occurred and on the preservation of its description in documents and oral tradition. In the case of instrumental data, availability depends on the distribution and quality of seismic stations. Therefore, the availability of information for the first half of the 20th century can be improved by careful analysis of the available information and by searching for and resolving inconsistencies. This study shows the progress we have made in upgrading and refining data for the earthquake and tsunami catalog of Mexico from 1500 CE until today, presented as tables and maps. Data analysis allowed us to identify the following sources of error in the location of epicenters in existing catalogs: • incorrect coordinate entry; • erroneous or mistaken place names; • data too general to locate the epicenter, mainly for older earthquakes; • inconsistency between earthquake and tsunami occurrence, e.g., an epicenter located too far inland reported as tsunamigenic. The process of completing the catalogs depends directly on the availability of information; as new archives are opened for inspection, there are more opportunities to complete the history of large earthquakes and tsunamis in Mexico. Here, we also present new earthquake and
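
    A hedged sketch of one of the consistency checks listed above: flag entries marked as tsunamigenic whose epicenters lie implausibly far inland. The column names, the 50 km threshold, and the toy rows are assumptions; a real check would test against a coastline geometry rather than a precomputed distance column.

```python
import pandas as pd

catalog = pd.DataFrame({
    "year": [1787, 1932, 1985, 1995],
    "lat": [16.5, 19.5, 18.2, 19.1],
    "lon": [-99.0, -104.3, -102.7, -104.2],
    "tsunami": [True, True, False, True],
    "dist_to_coast_km": [12.0, 80.0, 30.0, 8.0],   # hypothetical precomputed values
})

# Tsunamigenic entries whose epicenter sits more than 50 km inland are suspect
suspect = catalog[(catalog["tsunami"]) & (catalog["dist_to_coast_km"] > 50.0)]
print(suspect[["year", "lat", "lon", "dist_to_coast_km"]])
```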

  13. Earthquake risk assessment of Alexandria, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Gaber, Hanan; Ibrahim, Hamza

    2015-01-01

    Throughout historical and recent times, Alexandria has suffered great damage due to earthquakes from both near- and far-field sources. Sometimes, the sources of such damage are not well known. During the twentieth century, the city was shaken by several earthquakes generated from inland dislocations (e.g., 29 Apr. 1974, 12 Oct. 1992, and 28 Dec. 1999) and from the African continental margin (e.g., 12 Sept. 1955 and 28 May 1998). Therefore, this study estimates the earthquake ground shaking and the consequent impacts in Alexandria on the basis of two earthquake scenarios. The simulation results show that Alexandria is affected by both earthquake scenarios in broadly the same manner, although the number of casualties in the first scenario (inland dislocation) is twice that of the second (African continental margin). Running the first scenario, an estimated 2.27 % of Alexandria's total building stock (12.9 million, 2006 census) would be affected, with injuries to 0.19 % and deaths of 0.01 % of the total population (4.1 million, 2006 census). The earthquake risk profile reveals that three districts (Al-Montazah, Al-Amriya, and Shark) lie at high seismic risk, two districts (Gharb and Wasat) at moderate risk, and two districts (Al-Gomrok and Burg El-Arab) at low seismic risk. Moreover, the building damage estimates show that Al-Montazah is the most vulnerable district, with 73 % of the expected damage concentrated there. The analysis shows that the Alexandria urban area faces high risk. Informal areas and deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risk. For instance, more than 90 % of the estimated earthquake risk (building damage) is concentrated in the most densely populated districts (Al-Montazah, Al-Amriya, and Shark). Moreover, about 75 % of casualties are in the same districts.

  14. A contrast study of the traumatic condition between the wounded in 5.12 Wenchuan earthquake and 4.25 Nepal earthquake.

    PubMed

    Ding, Sheng; Hu, Yonghe; Zhang, Zhongkui; Wang, Ting

    2015-01-01

    The 5.12 Wenchuan earthquake and the 4.25 Nepal earthquake were of similar magnitude, but the climate and geographic environment were totally different. Our team carried out medical rescue in both disasters, so we compared the traumatic conditions of the wounded in the two earthquakes. The clinical data of the wounded in the 5.12 Wenchuan earthquake and the 4.25 Nepal earthquake rescued by Chengdu Military General Hospital were retrospectively analyzed. A comparative study of the wounded was then conducted in terms of age, sex, injury mechanisms, traumatic conditions, complications, and prognosis. Three days after the 5.12 Wenchuan earthquake, 465 of the wounded were hospitalized in Chengdu Military General Hospital, including 245 males (52.7%) and 220 females (47.3%) with an average age of (47.6±22.7) years. Our team carried out humanitarian relief in Katmandu after the 4.25 Nepal earthquake. Three days after this disaster, 71 cases were treated in our field hospital, including 37 males (52.1%) and 34 females (47.9%) with a mean age of (44.8±22.9) years. There was no obvious difference in sex or mean age between the two groups, but the age distribution differed somewhat: there were more wounded over the age of 60 years in the 4.25 Nepal earthquake (p<0.01) and more wounded between 21 and 60 years in the 5.12 Wenchuan earthquake (p<0.05). The main cause of injury in both disasters was being struck by heavy falling objects, but the 5.12 Wenchuan earthquake had higher rates of bruise injury and crush injury (p<0.05), while the 4.25 Nepal earthquake had a higher rate of falling injury (p<0.01). Limb fracture was the most common injury type in both disasters. However, compared with the 5.12 Wenchuan earthquake, the 4.25 Nepal earthquake had much higher incidences of limb fractures (p<0.01), lung infection (p<0.01), and malnutrition (p<0.05), but lower incidences of thoracic injury (p<0.05) and multiple injury (p<0.05). The other complications and death rate

  15. Towards the Future "Earthquake" School in the Cloud: Near-real Time Earthquake Games Competition in Taiwan

    NASA Astrophysics Data System (ADS)

    Chen, K. H.; Liang, W. T.; Wu, Y. F.; Yen, E.

    2014-12-01

    To reduce future threats from natural disasters, it is important to understand how a disaster happened, why lives were lost, and what lessons have been learned. In this way, society's attitude toward natural disasters can be transformed from training to learning. The citizen-seismologists-in-Taiwan project is designed to elevate the quality of earthquake science education by incorporating earthquake/tsunami stories and a near-real-time earthquake games competition into the traditional school curricula. Through pilot courses and professional development workshops, we have worked closely with teachers from elementary, junior high, and senior high schools to design workable teaching plans built around the practical operation of seismic monitoring at home or at school. We will introduce how 9-year-olds pick P and S waves and measure seismic intensity through an interactive learning platform, how scientists and school teachers work together, and how we create an environment that facilitates continuous learning (i.e., the near-real-time earthquake games competition), to make earthquake science fun.

  16. Rapid earthquake characterization using MEMS accelerometers and volunteer hosts following the M 7.2 Darfield, New Zealand, Earthquake

    USGS Publications Warehouse

    Lawrence, J. F.; Cochran, E.S.; Chung, A.; Kaiser, A.; Christensen, C. M.; Allen, R.; Baker, J.W.; Fry, B.; Heaton, T.; Kilb, Debi; Kohler, M.D.; Taufer, M.

    2014-01-01

    We test the feasibility of rapidly detecting and characterizing earthquakes with the Quake‐Catcher Network (QCN) that connects low‐cost microelectromechanical systems accelerometers to a network of volunteer‐owned, Internet‐connected computers. Following the 3 September 2010 M 7.2 Darfield, New Zealand, earthquake, we installed over 180 QCN sensors in the Christchurch region to record the aftershock sequence. The sensors are monitored continuously by the host computer and send trigger reports to the central server. The central server correlates incoming triggers to detect when an earthquake has occurred. The location and magnitude are then rapidly estimated from a minimal set of received ground‐motion parameters. Full seismic time series are typically not retrieved for tens of minutes or even hours after an event. We benchmark the QCN real‐time detection performance against the GNS Science GeoNet earthquake catalog. Under normal network operations, QCN detects and characterizes earthquakes within 9.1 s of the earthquake rupture and determines the magnitude within 1 magnitude unit of that reported in the GNS catalog for 90% of the detections.
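
    An illustrative sketch, not the QCN algorithm: given trigger reports carrying station coordinates and peak ground acceleration, take an amplitude-weighted centroid as a first epicenter guess and invert a generic log-PGA attenuation relation for magnitude; the coefficients and trigger values below are placeholders.

```python
import numpy as np

triggers = np.array([
    # lat,    lon,     pga (g)  -- placeholder trigger reports
    [-43.53, 172.63, 0.08],
    [-43.50, 172.55, 0.12],
    [-43.60, 172.70, 0.06],
])

lat, lon, pga = triggers[:, 0], triggers[:, 1], triggers[:, 2]
w = pga / pga.sum()
epi_lat, epi_lon = float((w * lat).sum()), float((w * lon).sum())   # amplitude-weighted centroid

# hypothetical attenuation coefficients for log10(PGA) = a*M + b*log10(R) + c
a, b, c = 0.5, -1.5, -2.0
r_km = np.hypot((lat - epi_lat) * 111.0,
                (lon - epi_lon) * 111.0 * np.cos(np.radians(epi_lat))) + 5.0  # +5 km avoids R = 0
mags = (np.log10(pga) - b * np.log10(r_km) - c) / a
print(f"epicenter ≈ ({epi_lat:.2f}, {epi_lon:.2f}), magnitude ≈ {np.median(mags):.1f}")
```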

  17. Earthquake likelihood model testing

    USGS Publications Warehouse

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful, but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
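
    The consistency tests described above score each forecast by the joint likelihood of the observed bin counts; a minimal sketch assuming Poisson-distributed counts in each space-magnitude bin, with toy rates and counts, is shown below.

```python
import numpy as np
from scipy.special import gammaln

# Toy forecast: expected earthquake rate per space-magnitude bin, and observed counts
forecast_rates = np.array([0.10, 0.40, 1.20, 0.05, 0.80])
observed_counts = np.array([0,    1,    2,    0,    1])

def poisson_joint_log_likelihood(rates, counts):
    """log L = sum_i ( -lambda_i + n_i*log(lambda_i) - log(n_i!) )."""
    return np.sum(-rates + counts * np.log(rates) - gammaln(counts + 1))

print(f"log-likelihood = {poisson_joint_log_likelihood(forecast_rates, observed_counts):.2f}")
```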

  18. Modeling fast and slow earthquakes at various scales

    PubMed Central

    IDE, Satoshi

    2014-01-01

    Earthquake sources represent dynamic rupture within rocky materials at depth and often can be modeled as propagating shear slip controlled by friction laws. These laws provide boundary conditions on fault planes embedded in elastic media. Recent developments in observation networks, laboratory experiments, and methods of data analysis have expanded our knowledge of the physics of earthquakes. Newly discovered slow earthquakes are qualitatively different phenomena from ordinary fast earthquakes and provide independent information on slow deformation at depth. Many numerical simulations have been carried out to model both fast and slow earthquakes, but problems remain, especially with scaling laws. Some mechanisms are required to explain the power-law nature of earthquake rupture and the lack of characteristic length. Conceptual models that include a hierarchical structure over a wide range of scales would be helpful for characterizing diverse behavior in different seismic regions and for improving probabilistic forecasts of earthquakes. PMID:25311138

  19. Modeling fast and slow earthquakes at various scales.

    PubMed

    Ide, Satoshi

    2014-01-01

    Earthquake sources represent dynamic rupture within rocky materials at depth and often can be modeled as propagating shear slip controlled by friction laws. These laws provide boundary conditions on fault planes embedded in elastic media. Recent developments in observation networks, laboratory experiments, and methods of data analysis have expanded our knowledge of the physics of earthquakes. Newly discovered slow earthquakes are qualitatively different phenomena from ordinary fast earthquakes and provide independent information on slow deformation at depth. Many numerical simulations have been carried out to model both fast and slow earthquakes, but problems remain, especially with scaling laws. Some mechanisms are required to explain the power-law nature of earthquake rupture and the lack of characteristic length. Conceptual models that include a hierarchical structure over a wide range of scales would be helpful for characterizing diverse behavior in different seismic regions and for improving probabilistic forecasts of earthquakes.

  20. High resolution strain sensor for earthquake precursor observation and earthquake monitoring

    NASA Astrophysics Data System (ADS)

    Zhang, Wentao; Huang, Wenzhu; Li, Li; Liu, Wenyi; Li, Fang

    2016-05-01

    We propose a high-resolution static-strain sensor based on an FBG Fabry-Perot interferometer (FBG-FP) and a wavelet-domain cross-correlation algorithm. This sensor is used for crustal deformation measurement, which plays an important role in earthquake precursor observation. The Pound-Drever-Hall (PDH) technique, based on a narrow-linewidth tunable fiber laser, is used to interrogate the FBG-FPs. A demodulation algorithm based on wavelet-domain cross-correlation is used to calculate the wavelength difference. The FBG-FP sensor head is fixed on two steel alloy rods installed in the bedrock. A reference FBG-FP is placed nearby in a strain-free state to compensate for environmental temperature fluctuations. A static-strain resolution of 1.6 nε can be achieved. As a result, clear solid-tide signals and seismic signals can be recorded, which suggests that the proposed strain sensor can be applied to earthquake precursor observation and earthquake monitoring.
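
    A hedged sketch of the demodulation idea: estimate the wavelength shift of the reflected spectrum by cross-correlation against a reference spectrum, then convert it to strain using a typical FBG sensitivity of about 1.2 pm per microstrain near 1550 nm. This plain cross-correlation and the synthetic Gaussian spectra stand in for the wavelet-domain algorithm of the paper.

```python
import numpy as np

wavelength = np.linspace(1549.8, 1550.2, 4001)             # nm, 0.1 pm sample spacing

def spectrum(center_nm, fwhm_nm=0.02):
    """Synthetic Gaussian reflection spectrum (placeholder line shape)."""
    return np.exp(-4.0 * np.log(2.0) * ((wavelength - center_nm) / fwhm_nm) ** 2)

reference = spectrum(1550.0000)
measured = spectrum(1550.0016)                              # a 1.6 pm red shift

xcorr = np.correlate(measured - measured.mean(),
                     reference - reference.mean(), mode="full")
lag = int(np.argmax(xcorr)) - (len(reference) - 1)
d_lambda_pm = lag * (wavelength[1] - wavelength[0]) * 1e3   # sample spacing converted to pm

strain_sensitivity = 1.2                                    # pm per microstrain (assumed typical)
print(f"shift = {d_lambda_pm:.1f} pm -> strain = {d_lambda_pm / strain_sensitivity:.2f} microstrain")
```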

  1. Towards coupled earthquake dynamic rupture and tsunami simulations: The 2011 Tohoku earthquake.

    NASA Astrophysics Data System (ADS)

    Galvez, Percy; van Dinther, Ylona

    2016-04-01

    The 2011 Mw 9 Tohoku earthquake was recorded by a vast GPS and seismic network, giving seismologists an unprecedented chance to unveil complex rupture processes in a megathrust event. Seismic stations in the Miyagi region (e.g., MYGH013) show two clearly distinct waveforms separated by 40 seconds, suggesting two rupture fronts, possibly due to slip reactivation caused by frictional melting and thermal fluid pressurization effects. We created a 3D dynamic rupture model to reproduce this rupture reactivation pattern using SPECFEM3D (Galvez et al., 2014), based on slip-weakening friction with two sudden sequential stress drops (Galvez et al., 2015). Our model starts like an M7-8 earthquake that only weakly breaks toward the trench; then, after 40 seconds, a second rupture emerges close to the trench, producing additional slip capable of fully breaking the trench and transforming the earthquake into a megathrust event. The seismograms agree roughly with seismic records along the coast of Japan. The resulting sea-floor displacements are in agreement with 1 Hz GPS displacements (GEONET). The simulated sea-floor displacement reaches 8-10 meters of uplift close to the trench, which may explain the devastating tsunami that followed the Tohoku earthquake. To investigate the impact of such a large uplift, we ran tsunami simulations with the slip-reactivation model and plugged the sea-floor displacements into GeoClaw (a finite-volume code for tsunami simulations; George and LeVeque, 2006). Our recent results compare well with the water heights at the tsunami DART buoys 21401, 21413, 21418 and 21419 and show the potential of using fully dynamic rupture results for earthquake-tsunami scenario studies.

  2. Geodetic Finite-Fault-based Earthquake Early Warning Performance for Great Earthquakes Worldwide

    NASA Astrophysics Data System (ADS)

    Ruhl, C. J.; Melgar, D.; Grapenthin, R.; Allen, R. M.

    2017-12-01

    GNSS-based earthquake early warning (EEW) algorithms estimate fault finiteness and unsaturated moment magnitude for the largest, most damaging earthquakes. Because large events are infrequent, the algorithms are not regularly exercised and are insufficiently tested on the few available datasets. The Geodetic Alarm System (G-larmS) is a GNSS-based finite-fault algorithm developed as part of the ShakeAlert EEW system in the western US. Performance evaluations using synthetic earthquakes offshore Cascadia showed that G-larmS satisfactorily recovers magnitude and fault length, providing useful alerts 30-40 s after origin time and timely warnings of ground motion for onshore urban areas. An end-to-end test of the ShakeAlert system demonstrated the need for GNSS data to accurately estimate ground motions in real time. We replay real data from several subduction-zone earthquakes worldwide to demonstrate the value of GNSS-based EEW for the largest, most damaging events. We compare the peak ground acceleration (PGA) predicted from first-alert solutions with that recorded in major urban areas. In addition, where applicable, we compare observed tsunami heights to those predicted from the G-larmS solutions. We show that finite-fault inversion based on GNSS data is essential to achieving the goals of EEW.
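
    Part of why geodetic finite-fault solutions avoid magnitude saturation is that moment magnitude follows directly from fault area, average slip, and rigidity; the sketch below uses the standard moment-magnitude relation with illustrative fault dimensions, not a G-larmS solution.

```python
import numpy as np

def moment_magnitude(length_km, width_km, slip_m, mu_pa=3.0e10):
    """Mw from M0 = mu * A * D (N·m), via Mw = (log10(M0) - 9.1) / 1.5."""
    area_m2 = (length_km * 1e3) * (width_km * 1e3)
    m0 = mu_pa * area_m2 * slip_m
    return (np.log10(m0) - 9.1) / 1.5

# e.g. a 400 km x 150 km rupture with 15 m average slip (illustrative numbers)
print(f"Mw ≈ {moment_magnitude(400.0, 150.0, 15.0):.1f}")
```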

  3. Computer systems for automatic earthquake detection

    USGS Publications Warehouse

    Stewart, S.W.

    1974-01-01

    U.S. Geological Survey seismologists in Menlo Park, California, are utilizing the speed, reliability, and efficiency of minicomputers to monitor seismograph stations and to automatically detect earthquakes. An earthquake detection computer system, believed to be the only one of its kind in operation, automatically reports about 90 percent of all local earthquakes recorded by a network of over 100 central California seismograph stations. The system also monitors the stations for signs of malfunction or abnormal operation. Before the automatic system was put in operation, all of the earthquakes recorded had to be detected by manually searching the records, a time-consuming process. With the automatic detection system, the stations are efficiently monitored continuously.
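
    As an illustration of automatic detection, the sketch below implements a classic STA/LTA trigger on a synthetic trace; this is a generic textbook approach, not necessarily the algorithm of the 1970s USGS system, and the window lengths and threshold are arbitrary choices.

```python
import numpy as np

def sta_lta_triggers(trace, fs, sta_s=1.0, lta_s=20.0, threshold=4.0):
    """Return sample indices where the causal STA/LTA ratio first exceeds the threshold."""
    env = np.abs(trace)
    sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
    csum = np.concatenate(([0.0], np.cumsum(env)))
    onsets, triggered = [], False
    for i in range(sta_n + lta_n, len(env) + 1):
        sta = (csum[i] - csum[i - sta_n]) / sta_n                   # most recent sta_n samples
        lta = (csum[i - sta_n] - csum[i - sta_n - lta_n]) / lta_n   # the lta_n samples before that
        ratio = sta / (lta + 1e-12)
        if ratio > threshold and not triggered:
            onsets.append(i - 1)
            triggered = True
        elif ratio < threshold:
            triggered = False
    return np.array(onsets)

# Synthetic test: background noise with a stronger burst starting at t = 60 s
fs = 100.0
rng = np.random.default_rng(1)
trace = rng.normal(0.0, 1.0, int(120 * fs))
trace[int(60 * fs):int(65 * fs)] += 10.0 * rng.normal(0.0, 1.0, int(5 * fs))
print("trigger times (s):", sta_lta_triggers(trace, fs) / fs)
```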

  4. Sea-level changes before large earthquakes

    USGS Publications Warehouse

    Wyss, M.

    1978-01-01

    Changes in sea level have long been used as a measure of local uplift and subsidence associated with large earthquakes. For instance, in 1835, the British naturalist Charles Darwin observed that sea level dropped by 2.7 meters during the large earthquake in Concepcion, Chile. From this piece of evidence and the terraces along the beach that he saw, Darwin concluded that the Andes had grown to their present height through earthquakes. Much more recently, George Plafker and James C. Savage of the U.S. Geological Survey have shown, from barnacle lines, that the great 1960 Chile and 1964 Alaska earthquakes caused several meters of vertical displacement of the shoreline.

  5. Safety and survival in an earthquake

    USGS Publications Warehouse

    ,

    1969-01-01

    Many earth scientists in this country and abroad are focusing their studies on the search for means of predicting impending earthquakes, but, as yet, an accurate prediction of the time and place of such an event cannot be made. From past experience, however, one can assume that earthquakes will continue to harass mankind and that they will occur most frequently in the areas where they have been relatively common in the past. In the United States, earthquakes can be expected to occur most frequently in the western states, particularly in Alaska, California, Washington, Oregon, Nevada, Utah, and Montana. The danger, however, is not confined to any one part of the country; major earthquakes have occurred at widely scattered locations.

  6. Recent damaging earthquakes in Japan, 2003-2008

    USGS Publications Warehouse

    Kayen, Robert E

    2008-01-01

    During the last six years, from 2003 to 2008, Japan was struck by three significant and damaging earthquakes: the M6.6 Niigata Chuetsu Oki earthquake of July 16, 2007, off the coast of Kashiwazaki City; the M6.6 Niigata Chuetsu earthquake of October 23, 2004, located in Niigata Prefecture in the central Uonuma Hills; and the M8.0 Tokachi Oki earthquake of September 26, 2003, affecting southeastern Hokkaido Prefecture. These earthquakes stand out among many in a very active period of seismicity in Japan: within the upper 100 km of the crust during this period, Japan experienced 472 earthquakes of magnitude 6 or greater. Both Niigata events affected the south-central region of Tohoku, Japan, and the Tokachi Oki earthquake affected a broad region of the continental shelf and slope southeast of the island of Hokkaido. This report is synthesized from the work of scores of Japanese and US researchers who led and participated in post-earthquake reconnaissance of these earthquakes; their noteworthy and valuable contributions are listed in an extended acknowledgements section at the end of the paper. During the Niigata Chuetsu Oki event of 2007, damage to the Kashiwazaki-Kariwa nuclear power plant, structures, infrastructure, and ground was primarily the product of two factors: (1) high-intensity motions from this moderate-sized shallow event, and (2) soft, poorly performing, or liquefiable soils in the coastal region of southwestern Niigata Prefecture. Structural and geotechnical damage along the slopes of dunes was ubiquitous in the Kashiwazaki-Kariwa region. The 2004 Niigata Chuetsu earthquake was the most significant to affect Japan since the 1995 Kobe earthquake. Forty people were killed, almost 3,000 were injured, and many hundreds of landslides destroyed entire upland villages. Landslides were of all types; some dammed streams, temporarily creating lakes that threatened to overtop their new embankments and cause flash floods and mudslides. The numerous

  7. Earthquakes induced by fluid injection and explosion

    USGS Publications Warehouse

    Healy, J.H.; Hamilton, R.M.; Raleigh, C.B.

    1970-01-01

    Earthquakes generated by fluid injection near Denver, Colorado, are compared with earthquakes triggered by nuclear explosion at the Nevada Test Site. Spatial distributions of the earthquakes in both cases are compatible with the hypothesis that variation of fluid pressure in preexisting fractures controls the time distribution of the seismic events in an "aftershock" sequence. We suggest that the fluid pressure changes may also control the distribution in time and space of natural aftershock sequences and of earthquakes that have been reported near large reservoirs. © 1970.

  8. Geological and historical evidence of irregular recurrent earthquakes in Japan.

    PubMed

    Satake, Kenji

    2015-10-28

    Great (M∼8) earthquakes repeatedly occur along the subduction zones around Japan and cause fault slip of a few to several metres, releasing strains accumulated over decades to centuries of plate motion. Assuming a simple 'characteristic earthquake' model in which similar earthquakes repeat at regular intervals, probabilities of future earthquake occurrence have been calculated by a government committee. However, recent studies on past earthquakes, including geological traces of giant (M∼9) earthquakes, indicate a variety of sizes and recurrence intervals of interplate earthquakes. Along the Kuril Trench off Hokkaido, limited historical records indicate that the average recurrence interval of great earthquakes is approximately 100 years, but tsunami deposits show that giant earthquakes occurred at a much longer interval of approximately 400 years. Along the Japan Trench off northern Honshu, recurrence of giant earthquakes similar to the 2011 Tohoku earthquake with an interval of approximately 600 years is inferred from historical records and tsunami deposits. Along the Sagami Trough near Tokyo, two types of Kanto earthquakes with recurrence intervals of a few hundred years and a few thousand years had been recognized, but studies show that the three most recent Kanto earthquakes had different source extents. Along the Nankai Trough off western Japan, recurrence of great earthquakes with an interval of approximately 100 years has been identified from historical literature, but tsunami deposits indicate that the sizes of the recurrent earthquakes are variable. Such variability makes it difficult to apply a simple 'characteristic earthquake' model for the long-term forecast, and several attempts, such as the use of geological data for the evaluation of future earthquake probabilities or the estimation of maximum earthquake size in each subduction zone, are being conducted by government committees. © 2015 The Author(s).
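
    To make the contrast between a time-independent forecast and a renewal ('characteristic earthquake') forecast concrete, the sketch below computes occurrence probabilities under a Poisson model and under a simple renewal model with a lognormal recurrence-time distribution. The distribution choice and all numerical values are illustrative assumptions, not those used by the Japanese government committee.

```python
import math

def poisson_prob(mean_interval_yr, window_yr):
    """Time-independent probability of at least one event in the window."""
    return 1.0 - math.exp(-window_yr / mean_interval_yr)

def lognormal_conditional_prob(median_interval_yr, sigma_ln, elapsed_yr, window_yr):
    """Renewal-model conditional probability P(T <= t_e + w | T > t_e), using a
    lognormal recurrence-time distribution as an illustrative choice."""
    def cdf(t):
        z = (math.log(t) - math.log(median_interval_yr)) / (sigma_ln * math.sqrt(2.0))
        return 0.5 * (1.0 + math.erf(z))
    return (cdf(elapsed_yr + window_yr) - cdf(elapsed_yr)) / (1.0 - cdf(elapsed_yr))

# Hypothetical values: ~100-year recurrence, 80 years elapsed, 30-year window
print(poisson_prob(100.0, 30.0))
print(lognormal_conditional_prob(100.0, 0.5, 80.0, 30.0))
```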

  9. Building Loss Estimation for Earthquake Insurance Pricing

    NASA Astrophysics Data System (ADS)

    Durukal, E.; Erdik, M.; Sesetyan, K.; Demircioglu, M. B.; Fahjan, Y.; Siyahi, B.

    2005-12-01

    After the 1999 earthquakes in Turkey several changes in the insurance sector took place. A compulsory earthquake insurance scheme was introduced by the government. The reinsurance companies increased their rates. Some even suspended operations in the market. And, most important, the insurance companies realized the importance of portfolio analysis in shaping their future market strategies. The paper describes an earthquake loss assessment methodology that can be used for insurance pricing and portfolio loss estimation and that is based on our work experience in the insurance market. The basic ingredients are probabilistic and deterministic regional site-dependent earthquake hazard, regional building inventory (and/or portfolio), building vulnerabilities associated with typical construction systems in Turkey, and estimations of building replacement costs for different damage levels. Probable maximum and average annualized losses are estimated as the result of the analysis. There is a two-level earthquake insurance system in Turkey, the effect of which is incorporated in the algorithm: the national compulsory earthquake insurance scheme and the private earthquake insurance system. To buy private insurance one has to be covered by the national system, which has limited coverage. As a demonstration of the methodology we look at the case of Istanbul and use its building inventory data instead of a portfolio. A state-of-the-art time-dependent earthquake hazard model that portrays the increased earthquake expectancies in Istanbul is used. Intensity- and spectral-displacement-based vulnerability relationships are incorporated in the analysis. In particular we look at the uncertainty in the loss estimations that arises from the vulnerability relationships, and at the effect of the implemented repair cost ratios.
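
    The average annualized loss mentioned above can be illustrated with a very small calculation: sum, over discrete hazard levels, the annual rate of each level times the expected damage ratio times the replacement cost. The rates and damage ratios below are hypothetical placeholders, not values from the Istanbul study.

```python
import numpy as np

def average_annual_loss(annual_rates, mean_damage_ratios, replacement_cost):
    """Average annualized loss: rate-weighted sum of expected loss over a
    discrete set of ground-shaking levels (a deliberately coarse formulation)."""
    rates = np.asarray(annual_rates)       # occurrence rate of each level, 1/yr
    mdr = np.asarray(mean_damage_ratios)   # expected repair cost / replacement cost
    return float(np.sum(rates * mdr) * replacement_cost)

# Hypothetical four shaking levels for a single building
rates = [0.02, 0.008, 0.002, 0.0005]
mdr = [0.02, 0.08, 0.25, 0.60]
print(average_annual_loss(rates, mdr, replacement_cost=1.0e6))  # ~1840 per year
```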

  10. Universal Recurrence Time Statistics of Characteristic Earthquakes

    NASA Astrophysics Data System (ADS)

    Goltz, C.; Turcotte, D. L.; Abaimov, S.; Nadeau, R. M.

    2006-12-01

    Characteristic earthquakes are defined to occur quasi-periodically on major faults. Do recurrence time statistics of such earthquakes follow a particular statistical distribution? If so, which one? The answer is fundamental and has important implications for hazard assessment. The problem cannot be solved by comparing the goodness of statistical fits, as the available sequences are too short. The Parkfield sequence of M ≍ 6 earthquakes, one of the most extensive reliable data sets available, has grown to merely seven events with the last earthquake in 2004, for example. Recently, however, advances in seismological monitoring and improved processing methods have unveiled so-called micro-repeaters, micro-earthquakes which recur at exactly the same location on a fault. It seems plausible to regard these earthquakes as a miniature version of the classic characteristic earthquakes. Micro-repeaters are much more frequent than major earthquakes, leading to longer sequences for analysis. Due to their recent discovery, however, available sequences contain less than 20 events at present. In this paper we present results for the analysis of recurrence times for several micro-repeater sequences from Parkfield and adjacent regions. To improve the statistical significance of our findings, we combine several sequences into one by rescaling the individual sets by their respective mean recurrence intervals and Weibull exponents. This novel approach of rescaled combination yields the most extensive data set possible. We find that the resulting statistics can be fitted well by an exponential distribution, confirming the universal applicability of the Weibull distribution to characteristic earthquakes. A similar result is obtained from rescaled combination, however, with regard to the lognormal distribution.
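
    A minimal sketch of the rescaled-combination idea: normalize each short recurrence-time sequence by its own mean interval, pool the normalized values, and fit a Weibull distribution to the pooled set. For simplicity this sketch rescales only by the mean (not also by the Weibull exponent, as described above), and the sequences are synthetic.

```python
import numpy as np
from scipy import stats

def rescaled_combination(sequences):
    """Pool several recurrence-time sequences after dividing each by its own
    mean interval, then fit a Weibull distribution to the pooled sample."""
    pooled = np.concatenate([np.asarray(s) / np.mean(s) for s in sequences])
    shape, _, scale = stats.weibull_min.fit(pooled, floc=0)
    return pooled, shape, scale

# Three short synthetic micro-repeater sequences with different mean intervals
rng = np.random.default_rng(0)
seqs = [rng.weibull(1.1, size=15) * m for m in (0.8, 1.5, 3.0)]
pooled, shape, scale = rescaled_combination(seqs)
print(f"pooled n={pooled.size}, Weibull shape={shape:.2f}, scale={scale:.2f}")
```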

  11. Anomalies of rupture velocity in deep earthquakes

    NASA Astrophysics Data System (ADS)

    Suzuki, M.; Yagi, Y.

    2010-12-01

    Explaining deep seismicity is a long-standing challenge in earth science. Deeper than 300 km, the occurrence rate of earthquakes with depth remains at a low level until ~530 km depth, then rises until ~600 km, and finally terminates near 700 km. Given the difficulty of estimating fracture properties and observing the stress field in the mantle transition zone (410-660 km), the seismic source processes of deep earthquakes are the most important information for understanding the distribution of deep seismicity. However, in a compilation of seismic source models of deep earthquakes, the source parameters for individual deep earthquakes are quite varied [Frohlich, 2006]. Rupture velocities for deep earthquakes estimated using seismic waveforms range from 0.3 to 0.9 Vs, where Vs is the shear wave velocity, a considerably wider range than the velocities for shallow earthquakes. The uncertainty of seismic source models prevents us from determining the main characteristics of the rupture process and understanding the physical mechanisms of deep earthquakes. Recently, the back projection method has been used to derive a detailed and stable seismic source image from dense seismic network observations [e.g., Ishii et al., 2005; Walker et al., 2005]. Using this method, we can obtain an image of the seismic source process from the observed data without a priori constraints or discarding parameters. We applied the back projection method to teleseismic P-waveforms of 24 large, deep earthquakes (moment magnitude Mw ≥ 7.0, depth ≥ 300 km) recorded since 1994 by the Data Management Center of the Incorporated Research Institutions for Seismology (IRIS-DMC) and reported in the U.S. Geological Survey (USGS) catalog, and constructed seismic source models of deep earthquakes. By imaging the seismic rupture process for a set of recent deep earthquakes, we found that the rupture velocities are less than about 0.6 Vs except in the depth range of 530 to 600 km. This is consistent with the depth
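
    The back projection method referred to above can be illustrated with a toy stacking example: align teleseismic waveforms using the predicted travel-time delays for a trial source point and measure the stack power. The array geometry, delays, and waveforms below are synthetic, and real implementations add travel-time corrections, weighting, and a grid search over many source points.

```python
import numpy as np

def back_project(waveforms, station_delays, dt):
    """Stack waveforms after aligning them with the predicted travel-time delay
    for one trial source point; the stack power indicates how much energy that
    point radiated (highly simplified sketch)."""
    n_stations, _ = waveforms.shape
    stack = np.zeros(waveforms.shape[1])
    for i in range(n_stations):
        shift = int(round(station_delays[i] / dt))
        stack += np.roll(waveforms[i], -shift)
    return np.sum((stack / n_stations) ** 2)

# Synthetic test: 5 stations recording a common pulse with different delays
dt = 0.05
t = np.arange(0.0, 60.0, dt)
delays = np.array([2.0, 3.5, 5.0, 6.5, 8.0])
waves = np.array([np.exp(-((t - 20.0 - d) / 0.5) ** 2) for d in delays])
print(back_project(waves, delays, dt))        # aligned: large stack power
print(back_project(waves, np.zeros(5), dt))   # mis-aligned: smaller power
```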

  12. Designing an Earthquake-Resistant Building

    ERIC Educational Resources Information Center

    English, Lyn D.; King, Donna T.

    2016-01-01

    How do cross-bracing, geometry, and base isolation help buildings withstand earthquakes? These important structural design features involve fundamental geometry that elementary school students can readily model and understand. The problem activity, Designing an Earthquake-Resistant Building, was undertaken by several classes of sixth- grade…

  13. The HayWired Earthquake Scenario

    USGS Publications Warehouse

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    ForewordThe 1906 Great San Francisco earthquake (magnitude 7.8) and the 1989 Loma Prieta earthquake (magnitude 6.9) each motivated residents of the San Francisco Bay region to build countermeasures to earthquakes into the fabric of the region. Since Loma Prieta, bay-region communities, governments, and utilities have invested tens of billions of dollars in seismic upgrades and retrofits and replacements of older buildings and infrastructure. Innovation and state-of-the-art engineering, informed by science, including novel seismic-hazard assessments, have been applied to the challenge of increasing seismic resilience throughout the bay region. However, as long as people live and work in seismically vulnerable buildings or rely on seismically vulnerable transportation and utilities, more work remains to be done.With that in mind, the U.S. Geological Survey (USGS) and its partners developed the HayWired scenario as a tool to enable further actions that can change the outcome when the next major earthquake strikes. By illuminating the likely impacts to the present-day built environment, well-constructed scenarios can and have spurred officials and citizens to take steps that change the outcomes the scenario describes, whether used to guide more realistic response and recovery exercises or to launch mitigation measures that will reduce future risk.The HayWired scenario is the latest in a series of like-minded efforts to bring a special focus onto the impacts that could occur when the Hayward Fault again ruptures through the east side of the San Francisco Bay region as it last did in 1868. Cities in the east bay along the Richmond, Oakland, and Fremont corridor would be hit hardest by earthquake ground shaking, surface fault rupture, aftershocks, and fault afterslip, but the impacts would reach throughout the bay region and far beyond. The HayWired scenario name reflects our increased reliance on the Internet and telecommunications and also alludes to the

  14. The threat of silent earthquakes

    USGS Publications Warehouse

    Cervelli, Peter

    2004-01-01

    Not all earthquakes shake the ground. The so-called silent types are forcing scientists to rethink their understanding of the way quake-prone faults behave. In rare instances, silent earthquakes that occur along the flanks of seaside volcanoes may cascade into monstrous landslides that crash into the sea and trigger towering tsunamis. Silent earthquakes that take place within fault zones created by one tectonic plate diving under another may increase the chance of ground-shaking shocks. In other locations, however, silent slip may decrease the likelihood of destructive quakes, because it releases stress along faults that might otherwise seem ready to snap.

  15. Landslides and Earthquake Lakes from the Wenchuan, China Earthquake - Can it Happen in the U.S.?

    NASA Astrophysics Data System (ADS)

    Stenner, H.; Cydzik, K.; Hamilton, D.; Cattarossi, A.; Mathieson, E.

    2008-12-01

    The May 12, 2008 M7.9 Wenchuan, China earthquake destroyed five million homes and schools, causing over 87,650 deaths. Landslides, a secondary effect of the shaking, caused much of the devastation. Debris flows buried homes, rock falls crushed cars, and landslides dammed rivers. Blocked roads greatly impeded emergency access, delaying response. Our August 2008 field experience in the affected area reminded us that the western United States faces serious risks posed by earthquake-induced landslides. The topography of the western U.S. is less extreme than that near Wenchuan, but earthquakes may still cause devastating landslides, damming rivers and blocking access to affected areas. After the Wenchuan earthquake, lakes rapidly rose behind landslide dams, threatening millions of lives. One landslide above Beichuan City created Tangjiashan Lake, a massive body of water upstream of Mianyang, an area with 5.2 million people, 30,000 of whom were killed in the quake. Potential failure of the landslide dam put thousands more people at risk from catastrophic flooding. In 1959, the M7.4 Hebgen Lake earthquake in Montana caused a large landslide, which killed 19 people and dammed the Madison River. The Army Corps excavated sluices to keep the dam from failing catastrophically. The Hebgen Lake earthquake ultimately caused 28 deaths, mostly from landslides, but the affected region was sparsely populated. Slopes prone to strong earthquake shaking and landslides in California, Washington, and Oregon have much larger populations at risk. Landslide hazards continue after the earthquake due to the effect strong shaking has on hillslopes, particularly when subjected to subsequent rain. These hazards must be taken into account. Once a landslide blocks a river, rapid and thoughtful action is needed. The Chinese government quickly and safely mitigated landslide dams that posed the greatest risk to people downstream. It took expert geotechnical advice, the speed and resources of the army

  16. Napa earthquake: An earthquake in a highly connected world

    NASA Astrophysics Data System (ADS)

    Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.

    2014-12-01

    The Napa earthquake recently occurred close to Silicon Valley. This makes it a good candidate to study what social networks, wearable objects, and website traffic analysis (flashsourcing) can tell us about the way eyewitnesses react to ground shaking. In the first part, we compare the ratio of people publishing tweets and the ratio of people visiting the EMSC (European Mediterranean Seismological Centre) real-time information website in the first minutes following the earthquake with the results published by Jawbone, which show that the proportion of people waking up depends (naturally) on the epicentral distance. The key question to evaluate is whether the proportions of inhabitants tweeting or visiting the EMSC website are similar to the proportion of people waking up as shown by the Jawbone data. If so, this supports the premise that all methods provide a reliable image of the relative ratio of people waking up. The second part of the study focuses on the reaction time for both Twitter and EMSC website access. We show, similarly to what was demonstrated for the Mineral, Virginia, earthquake (Bossu et al., 2014), that hit times on the EMSC website follow the propagation of the P waves and that 2 minutes of website traffic is sufficient to determine the epicentral location of an earthquake on the other side of the Atlantic. We also compare with the publication time of messages on Twitter. Finally, we check whether the number of tweets and the number of visitors relative to the number of inhabitants are correlated with the local level of shaking. Together these results will tell us whether the reaction of eyewitnesses to ground shaking as observed through Twitter and the EMSC website analysis is tool-specific (i.e., specific to Twitter or the EMSC website) or whether it reflects people's actual reactions.

  17. Evaluation of earthquake potential in China

    NASA Astrophysics Data System (ADS)

    Rong, Yufang

    I present three earthquake potential estimates for magnitude 5.4 and larger earthquakes for China. The potential is expressed as the rate density (that is, the probability per unit area, magnitude and time). The three methods employ smoothed seismicity-, geologic slip rate-, and geodetic strain rate data. I test all three estimates, and another published estimate, against earthquake data. I constructed a special earthquake catalog which combines previous catalogs covering different times. I estimated moment magnitudes for some events using regression relationships that are derived in this study. I used the special catalog to construct the smoothed seismicity model and to test all models retrospectively. In all the models, I adopted a kind of Gutenberg-Richter magnitude distribution with modifications at higher magnitude. The assumed magnitude distribution depends on three parameters: a multiplicative " a-value," the slope or "b-value," and a "corner magnitude" marking a rapid decrease of earthquake rate with magnitude. I assumed the "b-value" to be constant for the whole study area and estimated the other parameters from regional or local geophysical data. The smoothed seismicity method assumes that the rate density is proportional to the magnitude of past earthquakes and declines as a negative power of the epicentral distance out to a few hundred kilometers. I derived the upper magnitude limit from the special catalog, and estimated local "a-values" from smoothed seismicity. I have begun a "prospective" test, and earthquakes since the beginning of 2000 are quite compatible with the model. For the geologic estimations, I adopted the seismic source zones that are used in the published Global Seismic Hazard Assessment Project (GSHAP) model. The zones are divided according to geological, geodetic and seismicity data. Corner magnitudes are estimated from fault length, while fault slip rates and an assumed locking depth determine earthquake rates. The geological model
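
    One common way to implement a Gutenberg-Richter distribution "with modifications at higher magnitude" is to multiply the power law by an exponential taper in seismic moment above a corner magnitude. The sketch below uses that form with made-up a-, b-, and corner-magnitude values; the exact functional form adopted in the study above may differ.

```python
import numpy as np

def tapered_gr_rate(m, a_value=4.0, b_value=1.0, corner_magnitude=8.0):
    """Cumulative rate of earthquakes with magnitude >= m: a Gutenberg-Richter
    power law multiplied by an exponential taper in seismic moment above a
    corner magnitude (one common parameterization; others exist)."""
    m = np.asarray(m, dtype=float)
    moment = 10.0 ** (1.5 * m + 9.1)
    corner_moment = 10.0 ** (1.5 * corner_magnitude + 9.1)
    return 10.0 ** (a_value - b_value * m) * np.exp(-moment / corner_moment)

print(tapered_gr_rate([5.4, 7.0, 8.5]))  # rates fall off sharply past the corner
```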

  18. Tidal controls on earthquake size-frequency statistics

    NASA Astrophysics Data System (ADS)

    Ide, S.; Yabe, S.; Tanaka, Y.

    2016-12-01

    The possibility that tidal stresses can trigger earthquakes is a long-standing issue in seismology. Except in some special cases, a causal relationship between seismicity and the phase of tidal stress has been rejected on the basis of studies using many small events. However, recently discovered deep tectonic tremors are highly sensitive to tidal stress levels, with the relationship being governed by a nonlinear law according to which the tremor rate increases exponentially with increasing stress; thus, slow deformation (and the probability of earthquakes) may be enhanced during periods of large tidal stress. Here, we show the influence of tidal stress on seismicity by calculating histories of tidal shear stress during the 2-week period before earthquakes. Very large earthquakes tend to occur near the time of maximum tidal stress, but this tendency is not obvious for small earthquakes. Rather, we found that tidal stress controls the earthquake size-frequency statistics; i.e., the fraction of large events increases (i.e. the b-value of the Gutenberg-Richter relation decreases) as the tidal shear stress increases. This correlation is apparent in data from the global catalog and in relatively homogeneous regional catalogues of earthquakes in Japan. The relationship is also reasonable, considering the well-known relationship between stress and the b-value. Our findings indicate that the probability of a tiny rock failure expanding to a gigantic rupture increases with increasing tidal stress levels. This finding has clear implications for probabilistic earthquake forecasting.
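
    The b-value comparison described above can be sketched as follows: bin events by the tidal shear stress at their origin time and compute a maximum-likelihood b-value (Aki, 1965) per bin. The catalogue below is synthetic and the quartile binning is an illustrative assumption, not the authors' procedure.

```python
import numpy as np

def aki_b_value(magnitudes, completeness_mag):
    """Maximum-likelihood b-value (Aki, 1965) for events above completeness."""
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= completeness_mag]
    return np.log10(np.e) / (np.mean(m) - completeness_mag)

def b_value_vs_tidal_stress(magnitudes, tidal_stress, completeness_mag, n_bins=4):
    """Group events by the tidal shear stress at their origin time and compute
    one b-value per group (quartile bins; an illustrative choice)."""
    edges = np.quantile(tidal_stress, np.linspace(0.0, 1.0, n_bins + 1))
    labels = np.digitize(tidal_stress, edges[1:-1])
    mags = np.asarray(magnitudes)
    return [aki_b_value(mags[labels == k], completeness_mag) for k in range(n_bins)]

# Synthetic catalogue: Gutenberg-Richter magnitudes (b ~ 1) and random stresses
rng = np.random.default_rng(1)
mags = 4.0 + rng.exponential(scale=1.0 / np.log(10.0), size=5000)
stress = rng.normal(0.0, 1.0, size=5000)   # kPa, hypothetical
print(b_value_vs_tidal_stress(mags, stress, completeness_mag=4.0))
```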

  19. Non-double-couple earthquakes. 1. Theory

    USGS Publications Warehouse

    Julian, B.R.; Miller, A.D.; Foulger, G.R.

    1998-01-01

    Historically, most quantitative seismological analyses have been based on the assumption that earthquakes are caused by shear faulting, for which the equivalent force system in an isotropic medium is a pair of force couples with no net torque (a 'double couple,' or DC). Observations of increasing quality and coverage, however, now resolve departures from the DC model for many earthquakes and find some earthquakes, especially in volcanic and geothermal areas, that have strongly non-DC mechanisms. Understanding non-DC earthquakes is important both for studying the process of faulting in detail and for identifying nonshear-faulting processes that apparently occur in some earthquakes. This paper summarizes the theory of 'moment tensor' expansions of equivalent-force systems and analyzes many possible physical non-DC earthquake processes. Contrary to long-standing assumption, sources within the Earth can sometimes have net force and torque components, described by first-rank and asymmetric second-rank moment tensors, which must be included in analyses of landslides and some volcanic phenomena. Non-DC processes that lead to conventional (symmetric second-rank) moment tensors include geometrically complex shear faulting, tensile faulting, shear faulting in an anisotropic medium, shear faulting in a heterogeneous region (e.g., near an interface), and polymorphic phase transformations. Undoubtedly, many non-DC earthquake processes remain to be discovered. Progress will be facilitated by experimental studies that use wave amplitudes, amplitude ratios, and complete waveforms in addition to wave polarities and thus avoid arbitrary assumptions such as the absence of volume changes or the temporal similarity of different moment tensor components.
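
    A small worked example of the moment-tensor view discussed above: decompose a symmetric moment tensor into isotropic, double-couple (DC), and compensated-linear-vector-dipole (CLVD) parts using the common epsilon measure based on the deviatoric eigenvalues. Several decomposition conventions exist; this is one simple choice, not necessarily the one used by the authors.

```python
import numpy as np

def decompose_moment_tensor(m):
    """Split a symmetric moment tensor into isotropic, DC, and CLVD parts
    using epsilon = |e2| / max(|e1|, |e3|) of the deviatoric eigenvalues
    (epsilon = 0 for a pure double couple, 0.5 for a pure CLVD)."""
    m = np.asarray(m, dtype=float)
    iso_moment = np.trace(m) / 3.0
    dev = m - iso_moment * np.eye(3)
    e1, e2, e3 = np.sort(np.linalg.eigvalsh(dev))
    eps = abs(e2) / max(abs(e1), abs(e3))
    return {"isotropic_moment": iso_moment,
            "deviatoric_eigenvalues": (e1, e2, e3),
            "clvd_fraction": 2.0 * eps,
            "dc_fraction": 1.0 - 2.0 * eps}

# Pure strike-slip double couple: expect clvd_fraction ~ 0, dc_fraction ~ 1
mt = np.array([[0.0, 1.0, 0.0],
               [1.0, 0.0, 0.0],
               [0.0, 0.0, 0.0]])
print(decompose_moment_tensor(mt))
```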

  20. Long-range dependence in earthquake-moment release and implications for earthquake occurrence probability.

    PubMed

    Barani, Simone; Mascandola, Claudia; Riccomagno, Eva; Spallarossa, Daniele; Albarello, Dario; Ferretti, Gabriele; Scafidi, Davide; Augliera, Paolo; Massa, Marco

    2018-03-28

    Since the beginning of the 1980s, when Mandelbrot observed that earthquakes occur on 'fractal' self-similar sets, many studies have investigated the dynamical mechanisms that lead to self-similarities in the earthquake process. Interpreting seismicity as a self-similar process is undoubtedly convenient to bypass the physical complexities related to the actual process. Self-similar processes are indeed invariant under suitable scaling of space and time. In this study, we show that long-range dependence is an inherent feature of the seismic process, and is universal. Examination of series of cumulative seismic moment both in Italy and worldwide through Hurst's rescaled range analysis shows that seismicity is a memory process with a Hurst exponent H ≈ 0.87. We observe that H is substantially space- and time-invariant, except in cases of catalog incompleteness. This has implications for earthquake forecasting. Hence, we have developed a probability model for earthquake occurrence that allows for long-range dependence in the seismic process. Unlike the Poisson model, dependent events are allowed. This model can be easily transferred to other disciplines that deal with self-similar processes.
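
    The rescaled range (R/S) analysis used above can be sketched in a few lines: for a range of window sizes, compute the range of the cumulative mean-adjusted series divided by its standard deviation, and estimate the Hurst exponent H from the log-log slope. The implementation below is a simplified illustration (white noise should give H near 0.5), not the authors' exact procedure.

```python
import numpy as np

def hurst_rescaled_range(series, min_window=8):
    """Estimate the Hurst exponent H of a series (e.g., cumulative seismic
    moment increments) from the slope of log(R/S) versus log(window size)."""
    x = np.asarray(series, dtype=float)
    n = x.size
    windows = np.unique(np.logspace(np.log10(min_window), np.log10(n // 2), 12).astype(int))
    rs = []
    for w in windows:
        chunks = x[: (n // w) * w].reshape(-1, w)
        dev = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
        r = dev.max(axis=1) - dev.min(axis=1)
        s = chunks.std(axis=1)
        rs.append(np.mean(r[s > 0] / s[s > 0]))
    slope, _ = np.polyfit(np.log(windows), np.log(rs), 1)
    return slope

# White noise should give H ~ 0.5; persistent (memory) processes give H > 0.5
rng = np.random.default_rng(2)
print(hurst_rescaled_range(rng.normal(size=4000)))
```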

  1. Fixed recurrence and slip models better predict earthquake behavior than the time- and slip-predictable models 1: repeating earthquakes

    USGS Publications Warehouse

    Rubinstein, Justin L.; Ellsworth, William L.; Chen, Kate Huihsuan; Uchida, Naoki

    2012-01-01

    The behavior of individual events in repeating earthquake sequences in California, Taiwan and Japan is better predicted by a model with fixed inter-event time or fixed slip than it is by the time- and slip-predictable models for earthquake occurrence. Given that repeating earthquakes are highly regular in both inter-event time and seismic moment, the time- and slip-predictable models seem ideally suited to explain their behavior. Taken together with evidence from the companion manuscript that shows similar results for laboratory experiments, we conclude that the short-term predictions of the time- and slip-predictable models should be rejected in favor of earthquake models that assume either fixed slip or fixed recurrence interval. This implies that the elastic rebound model underlying the time- and slip-predictable models offers no additional value in describing earthquake behavior in an event-to-event sense, but its value in a long-term sense cannot be determined. These models likely fail because they rely on assumptions that oversimplify the earthquake cycle. We note that the time and slip of these events are predicted quite well by fixed slip and fixed recurrence models, so in some sense they are time- and slip-predictable. While fixed recurrence and slip models better predict repeating earthquake behavior than the time- and slip-predictable models, we observe a correlation between slip and the preceding recurrence time for many repeating earthquake sequences in Parkfield, California. This correlation is not found in other regions, and the sequences with the correlative slip-predictable behavior are not distinguishable from nearby earthquake sequences that do not exhibit this behavior.

  2. Earthquakes in the Central United States, 1699-2010

    USGS Publications Warehouse

    Dart, Richard L.; Volpi, Christina M.

    2010-01-01

    This publication is an update of an earlier report, U.S. Geological Survey (USGS) Geologic Investigation I-2812 by Wheeler and others (2003), titled "Earthquakes in the Central United States-1699-2002." Like the original poster, the center of the updated poster is a map showing the pattern of earthquake locations in the most seismically active part of the central United States. Arrayed around the map are short explanatory texts and graphics, which describe the distribution of historical earthquakes and the effects of the most notable of them. The updated poster contains additional, post-2002, earthquake data. These are 38 earthquakes covering the time interval from January 2003 to June 2010, including the Mount Carmel, Illinois, earthquake of 2008. The USGS Preliminary Determination of Epicenters (PDE) was the source of these additional data. Like the I-2812 poster, this poster was prepared for a nontechnical audience and designed to inform the general public as to the widespread occurrence of felt and damaging earthquakes in the Central United States. Accordingly, the poster should not be used to assess earthquake hazard in small areas or at individual locations.

  3. Possible cause for an improbable earthquake: The 1997 MW 4.9 southern Alabama earthquake and hydrocarbon recovery

    USGS Publications Warehouse

    Gomberg, J.; Wolf, L.

    1999-01-01

    Circumstantial and physical evidence indicates that the 1997 MW 4.9 earthquake in southern Alabama may have been related to hydrocarbon recovery. Epicenters of this earthquake and its aftershocks were located within a few kilometers of active oil and gas extraction wells and two pressurized injection wells. Main shock and aftershock focal depths (2-6 km) are within a few kilometers of the injection and withdrawal depths. Strain accumulation at geologic rates sufficient to cause rupture at these shallow focal depths is not likely. A paucity of prior seismicity is difficult to reconcile with the occurrence of an earthquake of MW 4.9 and a magnitude-frequency relationship usually assumed for natural earthquakes. The normal-fault main-shock mechanism is consistent with reactivation of preexisting faults in the regional tectonic stress field. If the earthquake were purely tectonic, however, the question arises as to why it occurred on only the small fraction of a large, regional fault system coinciding with active hydrocarbon recovery. No obvious temporal correlation is apparent between the earthquakes and recovery activities. Although thus far little can be said quantitatively about the physical processes that may have caused the 1997 sequence, a plausible explanation involves the poroelastic response of the crust to extraction of hydrocarbons.

  4. 78 FR 64973 - Scientific Earthquake Studies Advisory Committee (SESAC)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-30

    ... DEPARTMENT OF THE INTERIOR Geological Survey [GX14GG009950000] Scientific Earthquake Studies... Public Law 106-503, the Scientific Earthquake Studies Advisory Committee (SESAC) will hold its next... warning and national earthquake hazard mapping. Meetings of the Scientific Earthquake Studies Advisory...

  5. Space technologies for short-term earthquake warning

    NASA Astrophysics Data System (ADS)

    Pulinets, S.

    Recent theoretical and experimental studies have explicitly demonstrated the ability of space technologies to identify and monitor the specific variations in near-Earth space plasma, the atmosphere, and the ground surface associated with approaching severe earthquakes (known as earthquake precursors), which appear several days (from 1 to 5) before the seismic shock over seismically active areas. Several countries and private companies are preparing (or have already launched) dedicated spacecraft for monitoring earthquake precursors from space and for short-term earthquake prediction. The present paper outlines an optimal algorithm for the creation of a space-borne system for earthquake precursor monitoring and short-term earthquake prediction. It takes into account the following considerations: selection of the precursors in terms of priority, taking into account their statistical and physical parameters; configuration of the spacecraft payload; configuration of the satellite constellation (orbit selection, satellite distribution, operation schedule); and a proposal of different options (a cheap microsatellite or a comprehensive multisatellite constellation). Taking into account that the most promising precursors are the ionospheric ones, special attention will be devoted to the radiophysical techniques of ionosphere monitoring. The advantages and disadvantages of such technologies as vertical sounding, in-situ probes, ionosphere tomography, GPS TEC and GPS MET will be considered.

  6. Space technologies for short-term earthquake warning

    NASA Astrophysics Data System (ADS)

    Pulinets, S. A.

    Recent theoretical and experimental studies have explicitly demonstrated the ability of space technologies to identify and monitor the specific variations in near-Earth space plasma, the atmosphere, and the ground surface associated with approaching severe earthquakes (known as earthquake precursors), which appear several days (from 1 to 5) before the seismic shock over seismically active areas. Several countries and private companies are preparing (or have already launched) dedicated spacecraft for monitoring earthquake precursors from space and for short-term earthquake prediction. The present paper outlines an optimal algorithm for the creation of a space-borne system for earthquake precursor monitoring and short-term earthquake prediction. It takes into account the following: selection of the precursors in terms of priority, considering their statistical and physical parameters; configuration of the spacecraft payload; configuration of the satellite constellation (orbit selection, satellite distribution, operation schedule); and different options for the satellite system (a cheap microsatellite or a comprehensive multisatellite constellation). Taking into account that the most promising precursors are the ionospheric ones, special attention is devoted to the radiophysical techniques of ionosphere monitoring. The advantages and disadvantages of such technologies as vertical sounding, in-situ probes, ionosphere tomography, GPS TEC and GPS MET are considered.

  7. Earthquake Occurrence in Bangladesh and Surrounding Region

    NASA Astrophysics Data System (ADS)

    Al-Hussaini, T. M.; Al-Noman, M.

    2011-12-01

    The collision of the northward-moving Indian plate with the Eurasian plate is the cause of frequent earthquakes in the region comprising Bangladesh and neighbouring India, Nepal and Myanmar. Historical records indicate that Bangladesh was affected by five major earthquakes of magnitude greater than 7.0 (Richter scale) between 1869 and 1930. This paper presents some statistical observations of earthquake occurrence as basic groundwork for a seismic hazard assessment of this region. An up-to-date catalogue covering earthquakes in the region bounded by 17°-30°N and 84°-97°E, from the historical period to 2010, is derived from various reputed international sources including ISC, IRIS, Indian sources and available publications. Careful scrutiny is done to remove duplicate or uncertain earthquake events. Earthquake magnitudes in the range of 1.8 to 8.1 have been obtained and relationships between different magnitude scales have been studied. Aftershocks are removed from the catalogue using magnitude-dependent space and time windows. The main-shock data are then analyzed to obtain the completeness period for different magnitudes, evaluating their temporal homogeneity. Spatial and temporal distributions of earthquakes, magnitude-depth histograms and other statistical analyses are performed to understand the distribution of seismic activity in this region.
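
    Magnitude-dependent space and time windows of the kind mentioned above are commonly taken from Gardner and Knopoff (1974). The sketch below uses the frequently quoted parameterization of those windows, which may differ from the exact coefficients adopted in this study.

```python
import numpy as np

def gardner_knopoff_windows(magnitude):
    """Magnitude-dependent space (km) and time (days) declustering windows of
    the Gardner and Knopoff (1974) type, using the commonly quoted
    coefficients; events within the window of a larger shock are treated as
    aftershocks and removed."""
    m = np.asarray(magnitude, dtype=float)
    distance_km = 10.0 ** (0.1238 * m + 0.983)
    time_days = np.where(m >= 6.5,
                         10.0 ** (0.032 * m + 2.7389),
                         10.0 ** (0.5409 * m - 0.547))
    return distance_km, time_days

print(gardner_knopoff_windows([5.0, 6.0, 7.0]))
```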

  8. A new way of telling earthquake stories: MOBEE - the MOBile Earthquake Exhibition

    NASA Astrophysics Data System (ADS)

    Tataru, Dragos; Toma-Danila, Dragos; Nastase, Eduard

    2016-04-01

    In the last decades, the demand for and acknowledged importance of science outreach, in general and of geophysics in particular, has grown, as demonstrated by many international and national projects and other activities performed by research institutes. The National Institute for Earth Physics (NIEP) in Romania is the leading national institution for earthquake monitoring and research, having at the same time a declared focus on informing and educating a wide audience about geosciences and especially seismology. This is more than welcome, since Romania is a very active country from a seismological point of view, but not too reactive when it comes to diminishing the possible effects of a major earthquake. Over the last few decades, the country has experienced several major earthquakes which have claimed thousands of lives and millions in property damage (the 1940, 1977, 1986 and 1990 Vrancea earthquakes). In this context, during a partnership started in 2014 together with the National Art University and the Siveco IT company, a group of researchers from NIEP initiated the MOBile Earthquake Exhibition (MOBEE) project. The main goal was to design a portable museum to bring on the road educational activities focused on seismology, seismic hazard and Earth science. The exhibition is mainly focused on school students of all ages, as it explains the main topics of geophysics through a unique combination of posters, digital animations and apps, large markets and exciting hands-on experiments, 3D-printed models and posters. This project is singular in Romania and aims to transmit properly reviewed, current information regarding the definition of earthquakes, the way natural hazards can affect people, buildings and the environment, and the measures to be taken to mitigate their aftermath. Many of the presented concepts can be used by teachers as a complementary way of demonstrating physics facts and concepts and explaining processes that shape the dynamic Earth's features. It also involves

  9. Clinical characteristics of patients with seizure following the 2016 Kumamoto earthquake.

    PubMed

    Inatomi, Yuichiro; Nakajima, Makoto; Yonehara, Toshiro; Ando, Yukio

    2017-06-01

    To investigate the clinical characteristics of patients with seizure following the 2016 Kumamoto earthquake, we retrospectively studied patients with seizure admitted to our hospital during the 12 weeks following the earthquake. We compared the clinical backgrounds and characteristics of the patients before (the same period in the previous 3 years) and after the earthquake, and between the early (first 2 weeks) and late (subsequent 10 weeks) phases. A total of 60 patients with seizure were admitted to the emergency room after the earthquake, and 175 (58.3/year) patients were admitted before the earthquake. Of them, 35 patients with seizure were hospitalized in the Department of Neurology after the earthquake, and 96 (32/year) patients were hospitalized before the earthquake. Among patients admitted after the earthquake, males and non-cerebrovascular diseases as the epileptogenic disease were seen more frequently than before the earthquake. During the early phase after the earthquake, female, first-attack, and non-focal-type patients were seen more frequently than during the late phase. These characteristics of patients with seizure during the early phase after the earthquake suggest that many patients had non-epileptic seizures. To prevent seizures following earthquakes, the mental stress and physical status of evacuees must be assessed. Copyright © 2017. Published by Elsevier Ltd.

  10. Estimating Casualties for Large Earthquakes Worldwide Using an Empirical Approach

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.; Hearne, Mike

    2009-01-01

    We developed an empirical country- and region-specific earthquake vulnerability model to be used as a candidate for post-earthquake fatality estimation by the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) system. The earthquake fatality rate is based on past fatal earthquakes (earthquakes causing one or more deaths) in individual countries where at least four fatal earthquakes occurred during the catalog period (since 1973). Because only a few dozen countries have experienced four or more fatal earthquakes since 1973, we propose a new global regionalization scheme based on idealization of countries that are expected to have similar susceptibility to future earthquake losses given the existing building stock, its vulnerability, and other socioeconomic characteristics. The fatality estimates obtained using an empirical country- or region-specific model will be used along with other selected engineering risk-based loss models for generation of automated earthquake alerts. These alerts could potentially benefit the rapid-earthquake-response agencies and governments for better response to reduce earthquake fatalities. Fatality estimates are also useful to stimulate earthquake preparedness planning and disaster mitigation. The proposed model has several advantages as compared with other candidate methods, and the country- or region-specific fatality rates can be readily updated when new data become available.
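
    PAGER-type empirical models express the fatality rate as a lognormal function of shaking intensity and sum population exposure over intensity bins. The sketch below shows that structure with hypothetical theta and beta parameters and a made-up exposure table; these are not published PAGER coefficients.

```python
import math

def fatality_rate(mmi, theta, beta):
    """Empirical fatality rate as a lognormal function of shaking intensity;
    theta and beta are country-specific parameters fitted to past fatal
    earthquakes (the values used below are hypothetical)."""
    z = (1.0 / beta) * math.log(mmi / theta)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))   # standard normal CDF

def estimated_fatalities(exposure_by_mmi, theta=12.0, beta=0.2):
    """Sum exposed population times the fatality rate over intensity bins."""
    return sum(pop * fatality_rate(mmi, theta, beta)
               for mmi, pop in exposure_by_mmi.items())

# Hypothetical exposure: population exposed at each shaking-intensity level
exposure = {6.0: 2_000_000, 7.0: 500_000, 8.0: 100_000, 9.0: 10_000}
print(round(estimated_fatalities(exposure)))
```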

  11. Earthquake source properties from pseudotachylite

    USGS Publications Warehouse

    Beeler, Nicholas M.; Di Toro, Giulio; Nielsen, Stefan

    2016-01-01

    The motions radiated from an earthquake contain information that can be interpreted as displacements within the source and therefore related to stress drop. Except in a few notable cases, the source displacements can neither be easily related to the absolute stress level or fault strength, nor attributed to a particular physical mechanism. In contrast, paleo-earthquakes recorded by exhumed pseudotachylite have a known dynamic mechanism whose properties constrain the co-seismic fault strength. Pseudotachylite can also be used to directly address a longstanding discrepancy between seismologically measured static stress drops, which are typically a few MPa, and the much larger dynamic stress drops expected from thermal weakening during localized slip at seismic speeds in crystalline rock [Sibson, 1973; McKenzie and Brune, 1969; Lachenbruch, 1980; Mase and Smith, 1986; Rice, 2006], as has been observed recently in laboratory experiments at high slip rates [Di Toro et al., 2006a]. This note places pseudotachylite-derived estimates of fault strength and inferred stress levels within the context and broader bounds of naturally observed earthquake source parameters: apparent stress, stress drop, and overshoot, including consideration of roughness of the fault surface, off-fault damage, fracture energy, and the 'strength excess'. The analysis, which assumes stress drop is related to corner frequency by the Madariaga [1976] source model, is restricted to the intermediate-sized earthquakes of the Gole Larghe fault zone in the Italian Alps, where the dynamic shear strength is well constrained by field and laboratory measurements. We find that radiated energy exceeds the shear-generated heat and that the maximum strength excess is ~16 MPa. More generally, these events have inferred earthquake source parameters that are rare; for instance, only a few percent of the global earthquake population has stress drops as large, unless fracture energy is routinely greater than existing models allow
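
    For reference, the Madariaga (1976) relation mentioned above links stress drop to seismic moment and corner frequency through the source radius of a circular crack. The sketch below applies the standard formula with illustrative inputs; the shear-wave speed and the example moment and corner frequency are assumptions, not values from the Gole Larghe study.

```python
import math

def madariaga_stress_drop(m0_nm, corner_freq_hz, beta_ms=3500.0, k=0.21):
    """Static stress drop for a circular Madariaga (1976) source:
    source radius r = k * beta / fc, and delta_sigma = 7 * M0 / (16 * r^3).
    k = 0.21 applies to S-wave corner frequencies (0.32 for P waves)."""
    radius_m = k * beta_ms / corner_freq_hz
    return 7.0 * m0_nm / (16.0 * radius_m ** 3)

# Assumed example: M0 = 2.2e17 N*m (roughly Mw 5.5) with a 0.5 Hz S-wave corner
print(madariaga_stress_drop(2.2e17, 0.5) / 1e6, "MPa")  # a few tens of MPa
```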

  12. Ionospheric precursors to large earthquakes: A case study of the 2011 Japanese Tohoku Earthquake

    NASA Astrophysics Data System (ADS)

    Carter, B. A.; Kellerman, A. C.; Kane, T. A.; Dyson, P. L.; Norman, R.; Zhang, K.

    2013-09-01

    Researchers have reported ionospheric electron distribution abnormalities, such as electron density enhancements and/or depletions, that they claimed were related to forthcoming earthquakes. In this study, the Tohoku earthquake is examined using ionosonde data to establish whether any otherwise unexplained ionospheric anomalies were detected in the days and hours prior to the event. As the choices for the ionospheric baseline are generally different between previous works, three separate baselines for the peak plasma frequency of the F2 layer, foF2, are employed here; the running 30-day median (commonly used in other works), the International Reference Ionosphere (IRI) model and the Thermosphere Ionosphere Electrodynamic General Circulation Model (TIE-GCM). It is demonstrated that the classification of an ionospheric perturbation is heavily reliant on the baseline used, with the 30-day median, the IRI and the TIE-GCM generally underestimating, approximately describing and overestimating the measured foF2, respectively, in the 1-month period leading up to the earthquake. A detailed analysis of the ionospheric variability in the 3 days before the earthquake is then undertaken, where a simultaneous increase in foF2 and the Es layer peak plasma frequency, foEs, relative to the 30-day median was observed within 1 h before the earthquake. A statistical search for similar simultaneous foF2 and foEs increases in 6 years of data revealed that this feature has been observed on many other occasions without related seismic activity. Therefore, it is concluded that one cannot confidently use this type of ionospheric perturbation to predict an impending earthquake. It is suggested that in order to achieve significant progress in our understanding of seismo-ionospheric coupling, better account must be taken of other known sources of ionospheric variability in addition to solar and geomagnetic activity, such as the thermospheric coupling.
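
    A minimal sketch of the running 30-day median baseline used above: for each hour of day, compare the observed foF2 with the trailing 30-day median for that hour and flag departures beyond a chosen fractional threshold. The synthetic data, threshold, and flagging rule are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def running_median_anomalies(fof2, window_days=30, samples_per_day=24, threshold=0.25):
    """Flag hours where observed foF2 departs from the trailing 30-day median
    for that hour of day by more than a fractional threshold."""
    x = np.asarray(fof2, dtype=float).reshape(-1, samples_per_day)   # days x hours
    flags = np.zeros_like(x, dtype=bool)
    for d in range(window_days, x.shape[0]):
        baseline = np.median(x[d - window_days:d], axis=0)           # per-hour median
        flags[d] = np.abs(x[d] - baseline) / baseline > threshold
    return flags

# Synthetic hourly foF2 (MHz): diurnal cycle plus noise, 60 days
rng = np.random.default_rng(3)
hours = np.tile(np.arange(24), 60)
fof2 = 6.0 + 3.0 * np.sin(2 * np.pi * (hours - 6) / 24) + rng.normal(0, 0.3, hours.size)
print(running_median_anomalies(fof2).sum(), "anomalous hours flagged")
```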

  13. Investigating Landslides Caused by Earthquakes A Historical Review

    NASA Astrophysics Data System (ADS)

    Keefer, David K.

    Post-earthquake field investigations of landslide occurrence have provided a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides. This paper traces the historical development of knowledge derived from these investigations. Before 1783, historical accounts of the occurrence of landslides in earthquakes are typically so incomplete and vague that conclusions based on these accounts are of limited usefulness. For example, the number of landslides triggered by a given event is almost always greatly underestimated. The first formal, scientific post-earthquake investigation that included systematic documentation of the landslides was undertaken in the Calabria region of Italy after the 1783 earthquake swarm. From then until the mid-twentieth century, the best information on earthquake-induced landslides came from a succession of post-earthquake investigations largely carried out by formal commissions that undertook extensive ground-based field studies. Beginning in the mid-twentieth century, when the use of aerial photography became widespread, comprehensive inventories of landslide occurrence have been made for several earthquakes in the United States, Peru, Guatemala, Italy, El Salvador, Japan, and Taiwan. Techniques have also been developed for performing "retrospective" analyses years or decades after an earthquake that attempt to reconstruct the distribution of landslides triggered by the event. The additional use of Geographic Information System (GIS) processing and digital mapping since about 1989 has greatly facilitated the level of analysis that can be applied to mapped distributions of landslides. Beginning in 1984, syntheses of worldwide and national data on earthquake-induced landslides have defined their general characteristics and the relations between their occurrence and various geologic and seismic parameters. However, the number of comprehensive post-earthquake studies of landslides is still

  14. 2016 update on induced earthquakes in the United States

    USGS Publications Warehouse

    Petersen, Mark D.

    2016-01-01

    During the past decade people living in numerous locations across the central U.S. experienced many more small to moderate sized earthquakes than ever before. This earthquake activity began increasing about 2009 and peaked during 2015 and into early 2016. For example, prior to 2009 Oklahoma typically experienced 1 or 2 small earthquakes per year with magnitude greater than 3.0 but by 2015 this number rose to over 900 earthquakes per year of that size and over 30 earthquakes greater than 4.0. These earthquakes can cause damage. In 2011 a magnitude 5.6 earthquake struck near the town of Prague, Oklahoma on a preexisting fault and caused severe damage to several houses and school buildings. During the past 6 years more than 1500 reports of damaging shaking levels were reported in areas of induced seismicity. This rapid increase and the potential for damaging ground shaking from induced earthquakes caused alarm to about 8 million people living nearby and officials responsible for public safety. They wanted to understand why earthquakes were increasing and the potential threats to society and buildings located nearby.

  15. Seismic Moment, Seismic Energy, and Source Duration of Slow Earthquakes: Application of Brownian slow earthquake model to three major subduction zones

    NASA Astrophysics Data System (ADS)

    Ide, Satoshi; Maury, Julie

    2018-04-01

    Tectonic tremors, low-frequency earthquakes, very low-frequency earthquakes, and slow slip events are all regarded as components of broadband slow earthquakes, which can be modeled as a stochastic process using Brownian motion. Here we show that the Brownian slow earthquake model provides theoretical relationships among the seismic moment, seismic energy, and source duration of slow earthquakes and that this model explains various estimates of these quantities in three major subduction zones: Japan, Cascadia, and Mexico. While the estimates for these three regions are similar at the seismological frequencies, the seismic moment rates are significantly different in the geodetic observation. This difference is ascribed to the difference in the characteristic times of the Brownian slow earthquake model, which is controlled by the width of the source area. We also show that the model can include non-Gaussian fluctuations, which better explains recent findings of a near-constant source duration for low-frequency earthquake families.

  16. The California Earthquake Advisory Plan: A history

    USGS Publications Warehouse

    Roeloffs, Evelyn A.; Goltz, James D.

    2017-01-01

    Since 1985, the California Office of Emergency Services (Cal OES) has issued advisory statements to local jurisdictions and the public following seismic activity that scientists on the California Earthquake Prediction Evaluation Council view as indicating elevated probability of a larger earthquake in the same area during the next several days. These advisory statements are motivated by statistical studies showing that about 5% of moderate earthquakes in California are followed by larger events within a 10-km, five-day space-time window (Jones, 1985; Agnew and Jones, 1991; Reasenberg and Jones, 1994). Cal OES issued four earthquake advisories from 1985 to 1989. In October, 1990, the California Earthquake Advisory Plan formalized this practice, and six Cal OES Advisories have been issued since then. This article describes that protocol’s scientific basis and evolution.

  17. The aftershock signature of supershear earthquakes.

    PubMed

    Bouchon, Michel; Karabulut, Hayrullah

    2008-06-06

    Recent studies show that earthquake faults may rupture at speeds exceeding the shear wave velocity of rocks. This supershear rupture produces in the ground a seismic shock wave similar to the sonic boom produced by a supersonic airplane. This shock wave may increase the destruction caused by the earthquake. We report that supershear earthquakes are characterized by a specific pattern of aftershocks: The fault plane itself is remarkably quiet whereas aftershocks cluster off the fault, on secondary structures that are activated by the supershear rupture. The post-earthquake quiescence of the fault shows that friction is relatively uniform over supershear segments, whereas the activation of off-fault structures is explained by the shock wave radiation, which produces high stresses over a wide zone surrounding the fault.

  18. Limits on great earthquake size at subduction zones

    NASA Astrophysics Data System (ADS)

    McCaffrey, R.

    2012-12-01

    Subduction zones are where the world's greatest earthquakes occur due to the large fault area available to slip. Yet some subduction zones are thought to be immune from these massive events, where quake size is limited by some physical processes or properties. Accordingly, the size of the 2011 Tohoku-oki Mw 9.0 earthquake caught some in the earthquake research community by surprise. The expectations of these massive quakes have been driven in the past by reliance on our short, incomplete history of earthquakes and causal relationships derived from it. The logic applied is that if a great earthquake has not happened in the past, that we know of, one cannot happen in the future. Using the ~100-year global earthquake seismological history, and in some cases extended with geologic observations, relationships between maximum earthquake sizes and other properties of subduction zones have been suggested, leading to the notion that some subduction zones, like the Japan Trench, would never produce a magnitude ~9 event. Empirical correlations of earthquake behavior with other subduction parameters can give false positive results when the data are incomplete or incorrect, the numbers are small, and numerous attributes are examined. Given the multi-century return times of the greatest earthquakes, our ignorance of those return times, and our relatively limited temporal observation span (in most places), I suggest that we cannot yet rule out great earthquakes at any subduction zone. Alternatively, using the length of a subduction zone that is available for slip as the predominant factor in determining maximum earthquake size, we cannot rule out that any subduction zone of a few hundred kilometers or more in length may be capable of producing a magnitude 9 or larger earthquake. Based on this method, the expected maximum size for the Japan Trench was 9.0 (McCaffrey, Geology, p. 263, 2008). The same approach indicates that a M > 9 off Java, with twice the population density as Honshu and much lower

  19. Aseismic blocks and destructive earthquakes in the Aegean

    NASA Astrophysics Data System (ADS)

    Stiros, Stathis

    2017-04-01

    Aseismic areas are not identified only in vast, geologically stable regions, but also within regions of active, intense, distributed deformation such as the Aegean. In the latter, "aseismic blocks" about 200 km across were recognized in the 1990's on the basis of the absence of instrumentally derived earthquake foci, in contrast to surrounding areas. This pattern was supported by the available historical seismicity data, as well as by geologic evidence. Interestingly, GPS evidence indicates that such blocks are among the areas characterized by small deformation rates relative to surrounding areas of higher deformation. Still, the largest and most destructive earthquake of the 1990's, the 1995 M6.6 earthquake in northern Greece, occurred at the center of one of these "aseismic" zones, which was found unprotected against seismic hazard. This case was indeed a repeat of the case of the tsunami-associated 1956 Amorgos Island M7.4 earthquake, the largest 20th-century event in the Aegean back-arc region: the 1956 earthquake occurred at the center of a geologically distinct region (the Cyclades Massif in the central Aegean), till then assumed aseismic. Interestingly, after 1956 the overall idea of aseismic regions remained valid, though a "promontory" of earthquake-prone areas intruding into the aseismic central Aegean was assumed. Exploitation of archaeological excavation evidence and careful, combined analysis of historical and archaeological data and other palaeoseismic, mostly coastal, data indicated that destructive and major earthquakes have left their traces in previously assumed aseismic blocks. In the latter, earthquakes typically occur with relatively long recurrence intervals (>200-300 years), much longer than in adjacent active areas. Interestingly, areas assumed aseismic in antiquity are among the most active in the last centuries, while areas hit by major earthquakes in the past are usually classified as areas of low seismic risk in official maps. Some reasons

  20. Tsunami evacuation plans for future megathrust earthquakes in Padang, Indonesia, considering stochastic earthquake scenarios

    NASA Astrophysics Data System (ADS)

    Muhammad, Ario; Goda, Katsuichiro; Alexander, Nicholas A.; Kongko, Widjo; Muhari, Abdul

    2017-12-01

    This study develops tsunami evacuation plans in Padang, Indonesia, using a stochastic tsunami simulation method. The stochastic results are based on multiple earthquake scenarios for different magnitudes (Mw 8.5, 8.75, and 9.0) that reflect asperity characteristics of the 1797 historical event in the same region. The generation of the earthquake scenarios involves probabilistic models of earthquake source parameters and stochastic synthesis of earthquake slip distributions. In total, 300 source models are generated to produce comprehensive tsunami evacuation plans in Padang. The tsunami hazard assessment results show that Padang may face significant tsunamis causing the maximum tsunami inundation height and depth of 15 and 10 m, respectively. A comprehensive tsunami evacuation plan - including horizontal evacuation area maps, assessment of temporary shelters considering the impact due to ground shaking and tsunami, and integrated horizontal-vertical evacuation time maps - has been developed based on the stochastic tsunami simulation results. The developed evacuation plans highlight that comprehensive mitigation policies can be produced from the stochastic tsunami simulation for future tsunamigenic events.
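
    A minimal sketch of the kind of stochastic source generation described above, assuming a generic spatially correlated (lognormal) slip field scaled to a target moment; the grid size, correlation length, rigidity, and magnitude are illustrative and do not reproduce the study's source models.

```python
import numpy as np

def stochastic_slip(nx=80, nz=30, dx_km=5.0, corr_km=30.0, mw=8.75,
                    rigidity=30e9, seed=None):
    """One spatially correlated, lognormal slip realization scaled to a target Mw."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal((nz, nx))
    kx = np.fft.fftfreq(nx, d=dx_km)                 # wavenumbers, cycles per km
    kz = np.fft.fftfreq(nz, d=dx_km)
    KZ, KX = np.meshgrid(kz, kx, indexing="ij")
    k = np.sqrt(KX**2 + KZ**2)
    taper = np.exp(-(np.pi * corr_km * k) ** 2)      # Gaussian correlation kernel
    field = np.real(np.fft.ifft2(np.fft.fft2(noise) * taper))
    field = (field - field.mean()) / field.std()     # zero-mean, unit-variance field
    slip = np.exp(0.6 * field)                       # lognormal, strictly positive
    m0_target = 10 ** (1.5 * mw + 9.1)               # target scalar moment, N*m
    cell_area_m2 = (dx_km * 1e3) ** 2
    slip *= m0_target / (rigidity * cell_area_m2 * slip.sum())
    return slip

# e.g. an ensemble of 300 source models for one magnitude bin, as in the study
scenarios = [stochastic_slip(mw=8.75, seed=i) for i in range(300)]
print(f"scenario 0: mean slip {scenarios[0].mean():.1f} m, "
      f"peak slip {scenarios[0].max():.1f} m")
```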

  1. Geophysical setting of the February 21, 2008 Mw 6 Wells earthquake, Nevada, and implications for earthquake hazards

    USGS Publications Warehouse

    Ponce, David A.; Watt, Janet T.; Bouligand, C.

    2011-01-01

    We utilize gravity and magnetic methods to investigate the regional geophysical setting of the Wells earthquake. In particular, we delineate major crustal structures that may have played a role in the location of the earthquake and discuss the geometry of a nearby sedimentary basin that may have contributed to observed ground shaking. The February 21, 2008 Mw 6.0 Wells earthquake, centered about 10 km northeast of Wells, Nevada, caused considerable damage to local buildings, especially in the historic old town area. The earthquake occurred on a previously unmapped normal fault and preliminary relocated events indicate a fault plane dipping about 55 degrees to the southeast. The epicenter lies near the intersection of major Basin and Range normal faults along the Ruby Mountains and Snake Mountains, and strike-slip faults in the southern Snake Mountains. Regionally, the Wells earthquake epicenter is aligned with a crustal-scale boundary along the edge of a basement gravity high that correlates to the Ruby Mountains fault zone. The Wells earthquake also occurred near a geophysically defined strike-slip fault that offsets buried plutonic rocks by about 30 km. In addition, a new depth-to-basement map, derived from the inversion of gravity data, indicates that the Wells earthquake and most of its associated aftershock sequence lie below a small oval- to rhomboid-shaped basin that reaches a depth of about 2 km. Although the basin is of limited areal extent, it could have contributed to increased ground shaking in the vicinity of the city of Wells, Nevada, due to basin amplification of seismic waves.
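
    For a first-order sense of how a residual gravity low maps to basin depth, the sketch below uses the infinite Bouguer slab approximation g = 2*pi*G*drho*h; the density contrast is an assumed value, and the actual depth-to-basement inversion in the study is a full 3-D iterative procedure.

```python
import math

G = 6.674e-11                      # gravitational constant, m^3 kg^-1 s^-2
MGAL = 1e-5                        # 1 mGal = 1e-5 m/s^2

def slab_depth_km(residual_anomaly_mgal, density_contrast_kgm3=-400.0):
    """Thickness of low-density basin fill from the infinite-slab formula (assumed contrast)."""
    g = residual_anomaly_mgal * MGAL
    return g / (2.0 * math.pi * G * density_contrast_kgm3) / 1e3

for anomaly in (-10.0, -15.0, -20.0):
    print(f"{anomaly:6.1f} mGal  ->  ~{slab_depth_km(anomaly):.1f} km of basin fill")
```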

  2. POST Earthquake Debris Management — AN Overview

    NASA Astrophysics Data System (ADS)

    Sarkar, Raju

    Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunamis, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as the waste generated during reconstruction, can pose significant challenges to national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated by an earthquake is not hazardous. Soil, building material, and green waste, such as trees and shrubs, make up most of the volume of earthquake debris. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also subsequently impose economic burdens on the reconstruction phase. In practice, most of the debris may be disposed of at landfill sites, reused as material for construction, or recycled into useful commodities. Therefore, the debris clearance operation should adopt a geotechnical engineering approach as an important post-earthquake issue, in order to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach, which takes into account the different criteria related to the operation's execution, is proposed by highlighting the key issues concerning the handling of the construction

  3. The Southern California Earthquake Survival Program

    USGS Publications Warehouse

    Harris, J.M.

    1989-01-01

    In July 1988, the Los Angeles County Board of Supervisors directed the Chief Administrative Office to develop an educational program aimed at improving earthquake preparedness among Los Angeles County residents. The board recognized that current earthquake education efforts were not only insufficient, but also often confusing and costly. The board unanimously approved the development of a program that would make earthquake preparedness a year-long effort by encouraging residents to take a different precaution each month.

  4. The great Lisbon earthquake and tsunami of 1755: lessons from the recent Sumatra earthquakes and possible link to Plato's Atlantis

    NASA Astrophysics Data System (ADS)

    Gutscher, M.-A.

    2006-05-01

    Great earthquakes and tsunami can have a tremendous societal impact. The Lisbon earthquake and tsunami of 1755 caused tens of thousands of deaths in Portugal, Spain and NW Morocco. Felt as far away as Hamburg and the Azores islands, its magnitude is estimated to be 8.5 to 9. However, because of the complex tectonics in Southern Iberia, the fault that produced the earthquake has not yet been clearly identified. Recently acquired data from the Gulf of Cadiz area (tomography, seismic profiles, high-resolution bathymetry, sampled active mud volcanoes) provide strong evidence for an active east-dipping subduction zone beneath Gibraltar. Eleven out of 12 of the strongest earthquakes (M > 8.5) of the past 100 years occurred along subduction zone megathrusts (including the December 2004 and March 2005 Sumatra earthquakes). Thus, it appears likely that the 1755 earthquake and tsunami were generated in a similar fashion, along the shallow east-dipping subduction fault plane. This implies that the Cadiz subduction zone is locked (like the Cascadia and Nankai/Japan subduction zones), with great earthquakes occurring over long return periods. Indeed, the regional paleoseismic record (contained in deep-water turbidites and shallow lagoon deposits) suggests great earthquakes off South West Iberia every 1500 to 2000 years. Tsunami deposits indicate that an earlier great earthquake struck SW Iberia around 200 BC, as noted by Roman records from Cadiz. A written record of even older events may also exist. According to Plato's dialogues The Critias and The Timaeus, Atlantis was destroyed by ‘strong earthquakes and floods … in a single day and night' at a date given as 11,600 BP. A 1 m thick turbidite deposit, containing coarse-grained sediments from underwater avalanches, has been dated at 12,000 BP and may correspond to the destructive earthquake and tsunami described by Plato. The effects on a paleo-island (Spartel) in the straits of Gibraltar would have been devastating, if inhabited, and may

  5. Prospects for earthquake prediction and control

    USGS Publications Warehouse

    Healy, J.H.; Lee, W.H.K.; Pakiser, L.C.; Raleigh, C.B.; Wood, M.D.

    1972-01-01

    The San Andreas fault is viewed, according to the concepts of seafloor spreading and plate tectonics, as a transform fault that separates the Pacific and North American plates and along which relative movements of 2 to 6 cm/year have been taking place. The resulting strain can be released by creep, by earthquakes of moderate size, or (as near San Francisco and Los Angeles) by great earthquakes. Microearthquakes, as mapped by a dense seismograph network in central California, generally coincide with zones of the San Andreas fault system that are creeping. Microearthquakes are few and scattered in zones where elastic energy is being stored. Changes in the rate of strain, as recorded by tiltmeter arrays, have been observed before several earthquakes of about magnitude 4. Changes in fluid pressure may control timing of seismic activity and make it possible to control natural earthquakes by controlling variations in fluid pressure in fault zones. An experiment in earthquake control is underway at the Rangely oil field in Colorado, where the rates of fluid injection and withdrawal in experimental wells are being controlled. © 1972.

  6. Evaluation of Earthquake-Induced Effects on Neighbouring Faults and Volcanoes: Application to the 2016 Pedernales Earthquake

    NASA Astrophysics Data System (ADS)

    Bejar, M.; Alvarez Gomez, J. A.; Staller, A.; Luna, M. P.; Perez Lopez, R.; Monserrat, O.; Chunga, K.; Herrera, G.; Jordá, L.; Lima, A.; Martínez-Díaz, J. J.

    2017-12-01

    It has long been recognized that earthquakes change the stress in the upper crust around the fault rupture and can influence the short-term behaviour of neighbouring faults and volcanoes. Rapid estimates of these stress changes can provide the authorities managing the post-disaster situation with a useful tool to identify and monitor potential threats and to update the estimates of seismic and volcanic hazard in a region. Space geodesy is now routinely used following an earthquake to image the displacement of the ground and estimate the rupture geometry and the distribution of slip. Using the obtained source model, it is possible to evaluate the remaining moment deficit and to infer the stress changes on nearby faults and volcanoes produced by the earthquake, which can be used to identify which faults and volcanoes are brought closer to failure or activation. Although these procedures are commonly used today, the transfer of these results to the authorities managing the post-disaster situation is not straightforward, and thus their usefulness is reduced in practice. Here we propose a methodology to evaluate the potential influence of an earthquake on nearby faults and volcanoes and create easy-to-understand maps for decision-making support after an earthquake. We apply this methodology to the Mw 7.8, 2016 Ecuador earthquake. Using Sentinel-1 SAR and continuous GPS data, we measure the coseismic ground deformation and estimate the distribution of slip. Then we use this model to evaluate the moment deficit on the subduction interface and changes of stress on the surrounding faults and volcanoes. The results are compared with the seismic and volcanic events that have occurred after the earthquake. We discuss the potential and limits of the methodology and the lessons learnt from discussion with local authorities.
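
    A minimal sketch of the Coulomb failure stress change commonly used to rank receiver faults, dCFS = d_tau + mu' * d_sigma_n (normal stress change positive in tension); the receiver names and stress values below are invented placeholders, not the Pedernales results.

```python
def coulomb_stress_change(d_shear_mpa, d_normal_mpa, effective_friction=0.4):
    """dCFS = d_tau + mu' * d_sigma_n; positive values load the receiver fault."""
    return d_shear_mpa + effective_friction * d_normal_mpa

receivers = {                      # (shear change, normal change) in MPa, invented
    "receiver fault A": (+0.12, -0.05),
    "receiver fault B": (-0.08, +0.10),
}
for name, (d_tau, d_sigma_n) in receivers.items():
    dcfs = coulomb_stress_change(d_tau, d_sigma_n)
    status = "brought closer to failure" if dcfs > 0 else "relaxed (stress shadow)"
    print(f"{name}: dCFS = {dcfs:+.3f} MPa  ->  {status}")
```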

  7. Stress triggering of the 1999 Hector Mine earthquake by transient deformation following the 1992 Landers earthquake

    USGS Publications Warehouse

    Pollitz, F.F.; Sacks, I.S.

    2002-01-01

    The M 7.3 June 28, 1992 Landers and M 7.1 October 16, 1999 Hector Mine earthquakes, California, both right lateral strike-slip events on NNW-trending subvertical faults, occurred in close proximity in space and time in a region where recurrence times for surface-rupturing earthquakes are thousands of years. This suggests a causal role for the Landers earthquake in triggering the Hector Mine earthquake. Previous modeling of the static stress change associated with the Landers earthquake shows that the area of peak Hector Mine slip lies where the Coulomb failure stress promoting right-lateral strike-slip failure was high, but the nucleation point of the Hector Mine rupture was neutrally to weakly promoted, depending on the assumed coefficient of friction. Possible explanations that could account for the 7-year delay between the two ruptures include background tectonic stressing, dissipation of fluid pressure gradients, rate- and state-dependent friction effects, and post-Landers viscoelastic relaxation of the lower crust and upper mantle. By employing a viscoelastic model calibrated by geodetic data collected during the time period between the Landers and Hector Mine events, we calculate that postseismic relaxation produced a transient increase in Coulomb failure stress of about 0.7 bars on the impending Hector Mine rupture surface. The increase is greatest over the broad surface that includes the 1999 nucleation point and the site of peak slip further north. Since stress changes of magnitude greater than or equal to 0.1 bar are associated with documented causal fault interactions elsewhere, viscoelastic relaxation likely contributed to the triggering of the Hector Mine earthquake. This interpretation relies on the assumption that the faults occupying the central Mojave Desert (i.e., both the Landers and Hector Mine rupturing faults) were critically stressed just prior to the Landers earthquake.

  8. Has El Salvador Fault Zone produced M ≥ 7.0 earthquakes? The 1719 El Salvador earthquake

    NASA Astrophysics Data System (ADS)

    Canora, C.; Martínez-Díaz, J.; Álvarez-Gómez, J.; Villamor, P.; Ínsua-Arévalo, J.; Alonso-Henar, J.; Capote, R.

    2013-05-01

    Historically, large earthquakes, Mw ≥ 7.0, in the El Salvador area have been attributed to activity in the Cocos-Caribbean subduction zone. This is correct for most of the earthquakes of magnitude greater than 6.5. However, recent paleoseismic evidence points to the existence of large earthquakes associated with rupture of the El Salvador Fault Zone, an E-W oriented strike-slip fault system that extends for 150 km through central El Salvador. To calibrate our results from paleoseismic studies, we have analyzed the historical seismicity of the area. In particular, we suggest that the 1719 earthquake can be associated with paleoseismic activity evidenced in the El Salvador Fault Zone. A reinterpreted isoseismal map for this event suggests that the damage reported could have been a consequence of rupture of the El Salvador Fault Zone, rather than rupture of the subduction zone. The isoseismal pattern is not different from those of other upper-crustal earthquakes in similar tectonovolcanic environments. We thus challenge the traditional assumption that only the subduction zone is capable of generating earthquakes of magnitude greater than 7.0 in this region. This result has broad implications for future risk management in the region. The potential occurrence of strong ground motion, significantly higher and closer to Salvadorian populations than assumed to date, must be considered in seismic hazard assessment studies in this area.

  9. Transportations Systems Modeling and Applications in Earthquake Engineering

    DTIC Science & Technology

    2010-07-01

    Figure 6: PGA map of a M7.7 earthquake on all three New Madrid fault segments (g). Table 1: Fragility parameters for MSC steel bridge (Padgett 2007). ...Memphis, Tennessee. The NMSZ was responsible for the devastating 1811-1812 New Madrid earthquakes, the largest earthquakes ever recorded in the

  10. Collaboratory for the Study of Earthquake Predictability

    NASA Astrophysics Data System (ADS)

    Schorlemmer, D.; Jordan, T. H.; Zechar, J. D.; Gerstenberger, M. C.; Wiemer, S.; Maechling, P. J.

    2006-12-01

    Earthquake prediction is one of the most difficult problems in physical science and, owing to its societal implications, one of the most controversial. The study of earthquake predictability has been impeded by the lack of an adequate experimental infrastructure: the capability to conduct scientific prediction experiments under rigorous, controlled conditions and evaluate them using accepted criteria specified in advance. To remedy this deficiency, the Southern California Earthquake Center (SCEC) is working with its international partners, which include the European Union (through the Swiss Seismological Service) and New Zealand (through GNS Science), to develop a virtual, distributed laboratory with a cyberinfrastructure adequate to support a global program of research on earthquake predictability. This Collaboratory for the Study of Earthquake Predictability (CSEP) will extend the testing activities of SCEC's Working Group on Regional Earthquake Likelihood Models, from which we will present first results. CSEP will support rigorous procedures for registering prediction experiments on regional and global scales, community-endorsed standards for assessing probability-based and alarm-based predictions, access to authorized data sets and monitoring products from designated natural laboratories, and software to allow researchers to participate in prediction experiments. CSEP will encourage research on earthquake predictability by supporting an environment for scientific prediction experiments that allows the predictive skill of proposed algorithms to be rigorously compared with standardized reference methods and data sets. It will thereby reduce the controversies surrounding earthquake prediction, and it will allow the results of prediction experiments to be communicated to the scientific community, governmental agencies, and the general public in an appropriate research context.

  11. New ideas about the physics of earthquakes

    NASA Astrophysics Data System (ADS)

    Rundle, John B.; Klein, William

    1995-07-01

    It may be no exaggeration to claim that this most recent quadrennium has seen more controversy and thus more progress in understanding the physics of earthquakes than any in recent memory. The most interesting development has clearly been the emergence of a large community of condensed matter physicists around the world who have begun working on the problem of earthquake physics. These scientists bring to the study of earthquakes an entirely new viewpoint, grounded in the physics of nucleation and critical phenomena in thermal, magnetic, and other systems. Moreover, a surprising technology transfer from geophysics to other fields has been made possible by the realization that models originally proposed to explain self-organization in earthquakes can also be used to explain similar processes in problems as disparate as brain dynamics in neurobiology (Hopfield, 1994), and charge density waves in solids (Brown and Gruner, 1994). An entirely new sub-discipline is emerging that is focused around the development and analysis of large scale numerical simulations of the dynamics of faults. At the same time, intriguing new laboratory and field data, together with insightful physical reasoning, have led to significant advances in our understanding of earthquake source physics. As a consequence, we can anticipate substantial improvement in our ability to understand the nature of earthquake occurrence. Moreover, while much research in the area of earthquake physics is fundamental in character, the results have many potential applications (Cornell et al., 1993) in the areas of earthquake risk and hazard analysis, and seismic zonation.

  12. The Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2)

    USGS Publications Warehouse

    ,

    2008-01-01

    California's 35 million people live among some of the most active earthquake faults in the United States. Public safety demands credible assessments of the earthquake hazard to maintain appropriate building codes for safe construction and earthquake insurance for loss protection. Seismic hazard analysis begins with an earthquake rupture forecast: a model of probabilities that earthquakes of specified magnitudes, locations, and faulting types will occur during a specified time interval. This report describes a new earthquake rupture forecast for California developed by the 2007 Working Group on California Earthquake Probabilities (WGCEP 2007).

  13. Earthquake damage to transportation systems

    USGS Publications Warehouse

    McCullough, Heather

    1994-01-01

    Earthquakes represent one of the most destructive natural hazards known to man. A large magnitude earthquake near a populated area can affect residents over thousands of square kilometers and cause billions of dollars in property damage. Such an event can kill or injure thousands of residents and disrupt the socioeconomic environment for months, sometimes years. A serious result of a large-magnitude earthquake is the disruption of transportation systems, which limits post-disaster emergency response. Movement of emergency vehicles, such as police cars, fire trucks and ambulances, is often severely restricted. Damage to transportation systems is categorized below by cause including: ground failure, faulting, vibration damage, and tsunamis.

  14. Do weak global stresses synchronize earthquakes?

    NASA Astrophysics Data System (ADS)

    Bendick, R.; Bilham, R.

    2017-08-01

    Insofar as slip in an earthquake is related to the strain accumulated near a fault since a previous earthquake, and this process repeats many times, the earthquake cycle approximates an autonomous oscillator. Its asymmetric slow accumulation of strain and rapid release is quite unlike the harmonic motion of a pendulum and need not be time predictable, but still resembles a class of repeating systems known as integrate-and-fire oscillators, whose behavior has been shown to demonstrate a remarkable ability to synchronize to either external or self-organized forcing. Given sufficient time and even very weak physical coupling, the phases of sets of such oscillators, with similar though not necessarily identical period, approach each other. Topological and time series analyses presented here demonstrate that earthquakes worldwide show evidence of such synchronization. Though numerous studies demonstrate that the composite temporal distribution of major earthquakes in the instrumental record is indistinguishable from random, the additional consideration of event renewal interval serves to identify earthquake groupings suggestive of synchronization that are absent in synthetic catalogs. We envisage the weak forces responsible for clustering originate from lithospheric strain induced by seismicity itself, by finite strains over teleseismic distances, or by other sources of lithospheric loading such as Earth's variable rotation. For example, quasi-periodic maxima in rotational deceleration are accompanied by increased global seismicity at multidecadal intervals.
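
    The toy below, a sketch assuming simple pulse-coupled integrate-and-fire oscillators with arbitrary parameters, illustrates the kind of weak-coupling phase clustering invoked above; it is not the topological or time series analysis used in the study.

```python
import numpy as np

# Each firing gives all other oscillators a small phase advance; with enough time
# such pulse coupling tends to cluster the phases of similar (but not identical) cycles.
rng = np.random.default_rng(0)
n, epsilon, steps, dt = 10, 0.01, 200_000, 1e-3
period = 1.0 + 0.02 * rng.standard_normal(n)    # similar though not identical periods
phase = rng.uniform(0.0, 1.0, n)                # oscillator state in [0, 1); fires at 1

def order_parameter(ph):
    """Kuramoto |R|: near 0 = phases spread out, near 1 = phases clustered."""
    return abs(np.mean(np.exp(2j * np.pi * ph)))

print(f"initial |R| = {order_parameter(phase):.2f}")
for _ in range(steps):
    phase += dt / period                        # slow interseismic loading
    fired = phase >= 1.0
    if fired.any():
        phase[fired] -= 1.0                     # oscillators that fired reset
        phase[~fired] += epsilon * fired.sum()  # each firing nudges all the others
        np.clip(phase, None, 1.0, out=phase)    # a nudged oscillator may fire next step
print(f"final   |R| = {order_parameter(phase):.2f}")
```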

  15. Investigating Lushan Earthquake Victims' Individual Behavior Response and Rescue Organization.

    PubMed

    Kang, Peng; Lv, Yipeng; Deng, Qiangyu; Liu, Yuan; Zhang, Yi; Liu, Xu; Zhang, Lulu

    2017-12-11

    Research concerning the impact of earthquake victims' individual behavior and its association with earthquake-related injuries is lacking. This study examined this relationship along with the effectiveness of earthquake rescue measures. The six most severely destroyed townships during the Lushan earthquake were examined; 28 villages and three earthquake victims' settlement camp areas were selected as research areas. Inclusion criteria comprised living in Lushan county for a long time, living in Lushan county during the 2013 Lushan earthquake, and having one's home destroyed. Earthquake victims with an intellectual disability or communication problems were excluded. The earthquake victims (N = 5165, male = 2396) completed a questionnaire (response rate: 94.7%). Among them, 209 were injured (5.61%). Teachers (p < 0.0001, OR (odds ratio) = 3.33) and medical staff (p = 0.001, OR = 4.35) were more vulnerable to the earthquake than were farmers. Individual behavior, such as the first reaction after the earthquake and fear, was directly related to injuries. There is an obvious connection between earthquake-related injury and individual behavioral characteristics. It is strongly suggested that victims receive mental health support from medical practitioners and the government to minimize negative effects. The initial reaction after an earthquake also played a vital role in victims' trauma; therefore, earthquake-related experience and education may prevent injuries. Self-aid and mutual help played key roles in emergency medical rescue efforts. © 2017 International Society for Apheresis, Japanese Society for Apheresis, and Japanese Society for Dialysis Therapy.

  16. Land-Ocean-Atmospheric Coupling Associated with Earthquakes

    NASA Astrophysics Data System (ADS)

    Prasad, A. K.; Singh, R. P.; Kumar, S.; Cervone, G.; Kafatos, M.; Zlotnicki, J.

    2007-12-01

    Earthquakes are well known to occur along plate boundaries and also on stable shields. Recent studies have shown the existence of strong coupling between land, ocean, and atmospheric parameters associated with earthquakes. We have carried out a detailed analysis of multi-sensor data (optical and microwave remote sensing) to show the existence of strong coupling between land, ocean, and atmospheric parameters associated with earthquakes with focal depths up to 30 km and magnitudes greater than 5.5. The complementary nature of various land, ocean, and atmospheric parameters in providing early warning information about an impending earthquake will be demonstrated.

  17. Magnitude Dependent Seismic Quiescence of 2008 Wenchuan Earthquake

    NASA Astrophysics Data System (ADS)

    Suyehiro, K.; Sacks, S. I.; Takanami, T.; Smith, D. E.; Rydelek, P. A.

    2014-12-01

    The change in seismicity leading to the Wenchuan earthquake in 2008 (Mw 7.9) has been studied by various authors based on statistics and/or pattern recognition (Huang, 2008; Yan et al., 2009; Chen and Wang, 2010; Yi et al., 2011). We show, in particular, that magnitude-dependent seismic quiescence is observed for the Wenchuan earthquake and that it adds to other similar observations. Studies of seismic quiescence prior to major earthquakes include the 1982 Urakawa-Oki earthquake (M 7.1) (Taylor et al., 1992), the 1994 Hokkaido-Toho-Oki earthquake (Mw = 8.2) (Takanami et al., 1996), and the 2011 Tohoku earthquake (Mw = 9.0) (Katsumata, 2011). Smith and Sacks (2013) proposed a magnitude-dependent quiescence based on a physical earthquake model (Rydelek and Sacks, 1995) and demonstrated that the quiescence can be reproduced by the introduction of "asperities" (dilatancy-hardened zones). Actual observations indicate that the change occurs in a broader area than the eventual earthquake fault zone. In order to accept this explanation, we need to verify the model, as it predicts somewhat controversial features of earthquakes such as magnitude-dependent stress drop at the lower magnitude range and dynamically appearing asperities with repeating slip in some parts of the rupture zone. We show supportive observations. We will also need to verify that dilatancy diffusion is taking place. So far, we have only indirect evidence, which needs to be more quantitatively substantiated.

  18. Monitoring the Earthquake source process in North America

    USGS Publications Warehouse

    Herrmann, Robert B.; Benz, H.; Ammon, C.J.

    2011-01-01

    With the implementation of the USGS National Earthquake Information Center Prompt Assessment of Global Earthquakes for Response system (PAGER), rapid determination of earthquake moment magnitude is essential, especially for earthquakes that are felt within the contiguous United States. We report an implementation of moment tensor processing for application to broad, seismically active areas of North America. This effort focuses on the selection of regional crustal velocity models, codification of data quality tests, and the development of procedures for rapid computation of the seismic moment tensor. We systematically apply these techniques to earthquakes with reported magnitude greater than 3.5 in continental North America that are not associated with a tectonic plate boundary. Using the 0.02-0.10 Hz passband, we can usually determine, with few exceptions, moment tensor solutions for earthquakes with Mw as small as 3.7. The threshold is significantly influenced by the density of stations, the location of the earthquake relative to the seismic stations and, of course, the signal-to-noise ratio. With the existing permanent broadband stations in North America operated for rapid earthquake response, the seismic moment tensor of most earthquakes that are Mw 4 or larger can be routinely computed. As expected, the nonuniform spatial pattern of these solutions reflects the seismicity pattern. However, the orientation of maximum compressive stress and the predominant style of faulting are spatially coherent across large regions of the continent.

  19. Late Holocene megathrust earthquakes in south central Chile

    NASA Astrophysics Data System (ADS)

    Garrett, Ed; Shennan, Ian; Gulliver, Pauline; Woodroffe, Sarah

    2013-04-01

    A lack of comprehensive understanding of the seismic hazards associated with a subduction zone can lead to inadequate anticipation of earthquake and tsunami magnitudes. Four hundred and fifty years of Chilean historical documents record the effects of numerous great earthquakes; however, with recurrence intervals between the largest megathrust earthquakes approaching 300 years, seismic hazard assessment requires longer chronologies. This research seeks to verify and extend historical records in south central Chile using a relative sea-level approach to palaeoseismology. Our quantitative, diatom-based approaches to relative sea-level reconstruction are successful in reconstructing the magnitude of coseismic deformation during recent, well-documented Chilean earthquakes. The few disparities between our estimates and independent data highlight the possibility of shaking-induced sediment consolidation in tidal marshes. Following this encouraging confirmation of the approach, we quantify land-level changes in longer sedimentary records from the centre of the rupture zone of the 1960 Valdivia earthquake. Here, laterally extensive marsh soils abruptly overlain by low intertidal sediments attest to the occurrence of four megathrust earthquakes. Sites preserve evidence of the 1960 and 1575 earthquakes, and we constrain the timing of two predecessors to 1270-1410 and 1050-1200. The sediments and biostratigraphy lack evidence for the historically documented 1737 and 1837 earthquakes.

  20. a Collaborative Cyberinfrastructure for Earthquake Seismology

    NASA Astrophysics Data System (ADS)

    Bossu, R.; Roussel, F.; Mazet-Roux, G.; Lefebvre, S.; Steed, R.

    2013-12-01

    One of the challenges in real-time seismology is the prediction of an earthquake's impact. This is particularly true for moderate earthquakes (around magnitude 6) located close to urbanised areas, where the slightest uncertainty in event location, depth, or magnitude estimates, and/or misevaluation of propagation characteristics, site effects and building vulnerability, can dramatically change the impact scenario. The Euro-Med Seismological Centre (EMSC) has developed a cyberinfrastructure to collect observations from eyewitnesses in order to provide in-situ constraints on actual damage. This cyberinfrastructure takes advantage of the natural convergence of earthquake eyewitnesses on the EMSC website (www.emsc-csem.org), the second most visited global earthquake information website, within tens of seconds of the occurrence of a felt event. It includes classical crowdsourcing tools such as online questionnaires available in 39 languages, and tools to collect geolocated pictures. It also comprises information derived from real-time analysis of the traffic on the EMSC website, a method named flashsourcing: in the case of a felt earthquake, eyewitnesses reach the EMSC website within tens of seconds to find out the cause of the shaking they have just been through. By analysing their geographical origin through their IP addresses, we automatically detect felt earthquakes and in some cases map the damaged areas through the loss of Internet visitors. We recently implemented a Quake Catcher Network (QCN) server in collaboration with Stanford University and the USGS to collect ground motion records from volunteers, and we are also involved in a project to detect earthquakes from ground motion sensors in smartphones. Strategies have been developed for several social media (Facebook, Twitter...) not only to distribute earthquake information but also to engage with citizens and optimise data collection. A smartphone application is currently under development. We will present an overview of this

  1. Potentially induced earthquakes in Oklahoma, USA: links between wastewater injection and the 2011 Mw 5.7 earthquake sequence

    USGS Publications Warehouse

    Keranen, Katie M.; Savage, Heather M.; Abers, Geoffrey A.; Cochran, Elizabeth S.

    2013-01-01

    Significant earthquakes are increasingly occurring within the continental interior of the United States, including five of moment magnitude (Mw) ≥ 5.0 in 2011 alone. Concurrently, the volume of fluid injected into the subsurface related to the production of unconventional resources continues to rise. Here we identify the largest earthquake potentially related to injection, an Mw 5.7 earthquake in November 2011 in Oklahoma. The earthquake was felt in at least 17 states and caused damage in the epicentral region. It occurred in a sequence, with 2 earthquakes of Mw 5.0 and a prolific sequence of aftershocks. We use the aftershocks to illuminate the faults that ruptured in the sequence, and show that the tip of the initial rupture plane is within ~200 m of active injection wells and within ~1 km of the surface; 30% of early aftershocks occur within the sedimentary section. Subsurface data indicate that fluid was injected into effectively sealed compartments, and we interpret that a net fluid volume increase after 18 yr of injection lowered effective stress on reservoir-bounding faults. Significantly, this case indicates that decades-long lags between the commencement of fluid injection and the onset of induced earthquakes are possible, and modifies our common criteria for fluid-induced events. The progressive rupture of three fault planes in this sequence suggests that stress changes from the initial rupture triggered the successive earthquakes, including one larger than the first.

  2. Earthquake stress triggers, stress shadows, and seismic hazard

    USGS Publications Warehouse

    Harris, R.A.

    2000-01-01

    Many aspects of earthquake mechanics remain an enigma at the beginning of the twenty-first century. One potential bright spot is the realization that simple calculations of stress changes may explain some earthquake interactions, just as previous and ongoing studies of stress changes have begun to explain human-induced seismicity. This paper, which is an update of an earlier review by Harris, reviews many published works and presents a compilation of quantitative earthquake-interaction studies from a stress change perspective. This synthesis supplies some clues about certain aspects of earthquake mechanics. It also demonstrates that much work remains to be done before we have a complete story of how earthquakes work.

  3. Earthquake Clustering in Noisy Viscoelastic Systems

    NASA Astrophysics Data System (ADS)

    Dicaprio, C. J.; Simons, M.; Williams, C. A.; Kenner, S. J.

    2006-12-01

    Geologic studies show evidence for temporal clustering of earthquakes on certain fault systems. Since post-seismic deformation may result in a variable loading rate on a fault throughout the interseismic period, it is reasonable to expect that the rheology of the non-seismogenic lower crust and mantle lithosphere may play a role in controlling earthquake recurrence times. Previously, the role of lithospheric rheology in the seismic cycle had been studied with a one-dimensional spring-dashpot-slider model (Kenner and Simons [2005]). In this study we use the finite element code PyLith to construct a two-dimensional continuum model of a strike-slip fault in an elastic medium overlying one or more linear Maxwell viscoelastic layers, loaded in the far field by a constant velocity boundary condition. Taking advantage of the linear properties of the model, we use the finite element solution to one earthquake as a spatio-temporal Green's function. Multiple Green's function solutions, scaled by the size of each earthquake, are then summed to form an earthquake sequence. When the shear stress on the fault reaches a predefined yield stress it is allowed to slip, relieving all accumulated shear stress. Random variation in the fault yield stress from one earthquake to the next results in a temporally clustered earthquake sequence. The amount of clustering depends on a non-dimensional number, W, called the Wallace number. For models with one viscoelastic layer, W is equal to the standard deviation of the earthquake stress drop divided by the viscosity times the tectonic loading rate. This definition of W is modified from the original one used in Kenner and Simons [2005] by using the standard deviation of the stress drop instead of the mean stress drop. We also use a new, more appropriate metric to measure the amount of temporal clustering of the system. W is the ratio of the viscoelastic relaxation rate of the system to the tectonic loading rate of the system. For values of
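
    The sketch below is a schematic, zero-dimensional stand-in for the mechanism described above (steady loading, a post-seismic reloading transient after each event, and a randomly varying yield stress); all parameter values are invented, and this is not the PyLith model.

```python
import numpy as np

rng = np.random.default_rng(1)
load_rate = 0.005                  # MPa/yr, far-field tectonic loading of the fault
t_relax = 20.0                     # yr, viscoelastic relaxation time
transfer = 0.6                     # fraction of each stress drop reloaded post-seismically
mean_yield, sd_yield = 3.0, 1.0    # MPa; the spread in yield stress drives clustering

def next_event_time(t0, stress_drop, yield_stress, dt=0.05):
    """Step forward until loading plus the relaxation transient reach the yield stress."""
    t = 0.0
    while True:
        t += dt
        stress = load_rate * t + transfer * stress_drop * (1.0 - np.exp(-t / t_relax))
        if stress >= yield_stress:
            return t0 + t

times, t, last_drop = [], 0.0, 0.0
for _ in range(200):
    yield_stress = max(0.5, rng.normal(mean_yield, sd_yield))
    t = next_event_time(t, last_drop, yield_stress)
    last_drop = yield_stress       # all accumulated stress is released in the event
    times.append(t)

intervals = np.diff(times)
print(f"mean recurrence {intervals.mean():.0f} yr, "
      f"CV of intervals = {intervals.std() / intervals.mean():.2f}")
```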

  4. The enigmatic Bala earthquake of 1974

    NASA Astrophysics Data System (ADS)

    Musson, R. M. W.

    2006-10-01

    The earthquake that shook most of North Wales on the night of 23 January 1974 appears unremarkable from its entry in the UK earthquake catalogue. With a magnitude of 3.5 ML it represents the size of earthquake to be expected in the UK with a return period of about one year. However, the prominent atmospheric lights observed at the time of the shock led to speculation that an aircraft had crashed, and search-and-rescue teams were deployed. Since nothing was discovered, it was concluded that a meteorite was responsible; more imaginative members of the public decided (and still believe) that a UFO had crashed. In this paper the record of events is set out, and the nature of the earthquake is discussed with reference to its geological setting.

  5. Visible Earthquakes: a web-based tool for visualizing and modeling InSAR earthquake data

    NASA Astrophysics Data System (ADS)

    Funning, G. J.; Cockett, R.

    2012-12-01

    InSAR (Interferometric Synthetic Aperture Radar) is a technique for measuring the deformation of the ground using satellite radar data. One of the principal applications of this method is in the study of earthquakes; in the past 20 years over 70 earthquakes have been studied in this way, and forthcoming satellite missions promise to enable the routine and timely study of events in the future. Despite the utility of the technique and its widespread adoption by the research community, InSAR does not feature in the teaching curricula of most university geoscience departments. This is, we believe, due to a lack of accessibility to software and data. Existing tools for the visualization and modeling of interferograms are often research-oriented, command line-based and/or prohibitively expensive. Here we present a new web-based interactive tool for comparing real InSAR data with simple elastic models. The overall design of this tool was focused on ease of access and use. This tool should allow interested nonspecialists to gain a feel for the use of such data and greatly facilitate integration of InSAR into upper division geoscience courses, giving students practice in comparing actual data to modeled results. The tool, provisionally named 'Visible Earthquakes', uses web-based technologies to instantly render the displacement field that would be observable using InSAR for a given fault location, geometry, orientation, and slip. The user can adjust these 'source parameters' using a simple, clickable interface, and see how these affect the resulting model interferogram. By visually matching the model interferogram to a real earthquake interferogram (processed separately and included in the web tool) a user can produce their own estimates of the earthquake's source parameters. Once satisfied with the fit of their models, users can submit their results and see how they compare with the distribution of all other contributed earthquake models, as well as the mean and median
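
    A minimal sketch of the quantity such a tool renders: line-of-sight (LOS) displacement converted to wrapped interferometric fringes via phase = (4*pi/lambda)*u_LOS; the smooth synthetic displacement bump here is a made-up stand-in for the elastic dislocation model actually used.

```python
import numpy as np

wavelength_m = 0.056                       # e.g. C-band radar, ~5.6 cm
x = np.linspace(-20e3, 20e3, 400)          # easting, m
y = np.linspace(-20e3, 20e3, 400)          # northing, m
X, Y = np.meshgrid(x, y)

# Placeholder deformation: ~30 cm of LOS displacement decaying away from the source
u_los = 0.30 * np.exp(-((X / 8e3) ** 2 + (Y / 4e3) ** 2))

phase = (4.0 * np.pi / wavelength_m) * u_los       # two-way path-length change
wrapped = np.angle(np.exp(1j * phase))             # fringes wrapped into (-pi, pi]
n_fringes = np.ptp(u_los) / (wavelength_m / 2.0)   # one fringe per half wavelength of LOS motion
print(f"peak LOS displacement {u_los.max() * 100:.0f} cm ≈ {n_fringes:.0f} fringes")
```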

  6. Earthquake Preparedness Among Japanese Hemodialysis Patients in Prefectures Heavily Damaged by the 2011 Great East Japan Earthquake.

    PubMed

    Sugisawa, Hidehiro; Shimizu, Yumiko; Kumagai, Tamaki; Sugisaki, Hiroaki; Ohira, Seiji; Shinoda, Toshio

    2017-08-01

    The purpose of this study was to explore the factors related to earthquake preparedness in Japanese hemodialysis patients. We focused on three aspects of the related factors: health condition factors, social factors, and the experience of disasters. A mail survey of all the members of the Japan Association of Kidney Disease Patients in three Japanese prefectures (N = 4085) was conducted in March, 2013. We obtained 1841 valid responses for analysis. The health factors covered were: activities of daily living (ADL), mental distress, primary renal diseases, and the duration of dialysis. The social factors were: socioeconomic status, family structure, informational social support, and the provision of information regarding earthquake preparedness from dialysis facilities. The results show that the average percentage of participants that had met each criterion of earthquake preparedness in 2013 was 53%. Hemodialysis patients without disabled ADL, without mental distress, and requiring longer periods of dialysis, were likely to meet more of the earthquake preparedness criteria. Hemodialysis patients who had received informational social support from family or friends, had lived with spouse and children in comparison to living alone, and had obtained information regarding earthquake preparedness from dialysis facilities, were also likely to meet more of the earthquake preparedness criteria. © 2017 International Society for Apheresis, Japanese Society for Apheresis, and Japanese Society for Dialysis Therapy.

  7. Combining multiple earthquake models in real time for earthquake early warning

    USGS Publications Warehouse

    Minson, Sarah E.; Wu, Stephen; Beck, James L; Heaton, Thomas H.

    2017-01-01

    The ultimate goal of earthquake early warning (EEW) is to provide local shaking information to users before the strong shaking from an earthquake reaches their location. This is accomplished by operating one or more real‐time analyses that attempt to predict shaking intensity, often by estimating the earthquake’s location and magnitude and then predicting the ground motion from that point source. Other EEW algorithms use finite rupture models or may directly estimate ground motion without first solving for an earthquake source. EEW performance could be improved if the information from these diverse and independent prediction models could be combined into one unified, ground‐motion prediction. In this article, we set the forecast shaking at each location as the common ground to combine all these predictions and introduce a Bayesian approach to creating better ground‐motion predictions. We also describe how this methodology could be used to build a new generation of EEW systems that provide optimal decisions customized for each user based on the user’s individual false‐alarm tolerance and the time necessary for that user to react.
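
    As a hedged illustration of the general idea rather than the paper's algorithm, the sketch below fuses several independent shaking predictions treated as Gaussians, weighting each by its precision; the algorithm names and numbers are placeholders, not real system output.

```python
import numpy as np

predictions = {                     # predicted log10(PGA in cm/s^2) and 1-sigma, invented
    "point-source algorithm":  (1.8, 0.30),
    "finite-fault algorithm":  (2.1, 0.20),
    "ground-motion algorithm": (2.0, 0.25),
}

weights = np.array([1.0 / sigma**2 for _, sigma in predictions.values()])
means = np.array([mu for mu, _ in predictions.values()])
combined_mean = np.sum(weights * means) / np.sum(weights)
combined_sigma = np.sqrt(1.0 / np.sum(weights))
print(f"combined log10(PGA) = {combined_mean:.2f} ± {combined_sigma:.2f}")
```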

  8. Earthquake Complex Network Analysis Before and After the Mw 8.2 Earthquake in Iquique, Chile

    NASA Astrophysics Data System (ADS)

    Pasten, D.

    2017-12-01

    Earthquake complex networks have been shown to reveal specific features in seismic data sets. In space, these networks show scale-free behavior of the connectivity distribution for directed networks, and small-world behavior for undirected networks. In this work, we present an earthquake complex network analysis for the large Mw 8.2 earthquake in the north of Chile (near Iquique) in April 2014. An earthquake complex network is built by dividing the three-dimensional space into cubic cells; if a cell contains a hypocenter, we label that cell as a node. The connections between nodes are generated in time: we follow the time sequence of seismic events and make the connections between the corresponding nodes. This yields two different networks, a directed and an undirected one. The directed network takes into consideration the time direction of the connections, which is very important for the connectivity of the network: we consider the connectivity ki of the i-th node to be the number of connections going out of node i plus the self-connections (if two successive seismic events occurred in the same cubic cell, we have a self-connection). The undirected network is made by removing the directions of the connections and the self-connections from the directed network; for undirected networks, we consider only whether two nodes are connected or not. We have built a directed complex network and an undirected complex network before and after the large Iquique earthquake, using magnitudes greater than Mw = 1.0 and Mw = 3.0. We find that this method can recognize the influence of small seismic events on the behavior of the network, and that the size of the cell used to build the network is another important factor in recognizing the influence of the large earthquake on this complex system. The method also shows a difference in the values of the critical exponent γ (for the probability
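
    A sketch of the bookkeeping described above, assuming an invented miniature catalog: hypocenters are binned into cubic cells, occupied cells become nodes, and consecutive events connect their cells with directed edges, self-connections included.

```python
from collections import defaultdict

def build_directed_network(catalog, cell_km=10.0):
    """catalog: list of (x_km, y_km, depth_km) hypocenters in time order."""
    def cell(hypo):
        return tuple(int(c // cell_km) for c in hypo)

    out_degree = defaultdict(int)          # connectivity k_i: outgoing plus self edges
    edges = []
    nodes = [cell(h) for h in catalog]
    for a, b in zip(nodes[:-1], nodes[1:]):
        edges.append((a, b))               # directed edge from earlier to later event
        out_degree[a] += 1                 # counts a self-connection when a == b
    return edges, out_degree

# Tiny invented catalog purely to show the construction
catalog = [(12.0, 3.0, 25.0), (13.5, 4.0, 28.0), (41.0, 7.0, 33.0), (12.5, 3.9, 22.0)]
edges, k = build_directed_network(catalog)
for node, ki in k.items():
    print(f"cell {node}: k = {ki}")
```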

  9. Make an Earthquake: Ground Shaking!

    ERIC Educational Resources Information Center

    Savasci, Funda

    2011-01-01

    The main purposes of this activity are to help students explore possible factors affecting the extent of the damage of earthquakes and learn the ways to reduce earthquake damages. In these inquiry-based activities, students have opportunities to develop science process skills and to build an understanding of the relationship among science,…

  10. Engineering geological aspect of Gorkha Earthquake 2015, Nepal

    NASA Astrophysics Data System (ADS)

    Adhikari, Basanta Raj; Andermann, Christoff; Cook, Kristen

    2016-04-01

    Strong shaking by earthquakes causes massive landsliding with severe effects on infrastructure and human lives. The distribution of landslides and other hazards depends on the combination of earthquake and local characteristics, which influence the dynamic response of hillslopes. The Himalayas are one of the most active mountain belts, with several kilometers of relief, and are very prone to catastrophic mass failure. Strong and shallow earthquakes are very common and cause widespread collapse of hillslopes, increasing the background landslide rate by several orders of magnitude. The Himalaya has experienced many small and large earthquakes in the past, e.g., the Bihar-Nepal earthquake of 1934 (Ms 8.2), the large Kangra earthquake of 1905 (Ms 7.8), and the Gorkha earthquake of 2015 (Mw 7.8). The Mw 7.8 Gorkha earthquake occurred on and around the Main Himalayan Thrust with a hypocentral depth of 15 km (GEER 2015), followed by a Mw 7.3 aftershock near Kodari, causing 8700+ deaths and leaving hundreds of thousands homeless. Most of the 3000 aftershocks located by the National Seismological Center (NSC) within the first 45 days following the Gorkha earthquake are concentrated in a narrow, 40 km-wide band at midcrustal to shallow depth along the strike of the southern slope of the high Himalaya (Adhikari et al. 2015), and the ground shaking was substantially lower in the short-period range than would be expected for an earthquake of this magnitude (Moss et al. 2015). The effects of this earthquake are distinctive in the affected areas, including topographic effects, liquefaction and land subsidence. More than 5000 landslides were triggered by this earthquake (Earthquake without Frontiers, 2015). Most of the landslides are shallow, occurred in weathered bedrock, and appear to have mobilized primarily as raveling failures, rock slides and rock falls. The majority of landslides are limited to a zone that runs east-west, approximately parallel to the lesser and higher Himalaya. There are numerous cracks in

  11. The surface latent heat flux anomalies related to major earthquake

    NASA Astrophysics Data System (ADS)

    Jing, Feng; Shen, Xuhui; Kang, Chunli; Xiong, Pan; Hong, Shunying

    2011-12-01

    SLHF (Surface Latent Heat Flux) is an atmospheric parameter that describes the heat released by phase changes and depends on meteorological parameters such as surface temperature, relative humidity, and wind speed. There is a sharp contrast between values over the ocean surface and the land surface. Recently, many studies of SLHF anomalies prior to earthquakes have been carried out. It has been shown that enhanced energy exchange between the coastal surface and the atmosphere prior to earthquakes can increase the rate of water-heat exchange, leading to an obvious increase in SLHF. In this paper, two earthquakes in 2010 (the Haiti earthquake and the earthquake southwest of Sumatra, Indonesia) are analyzed using SLHF data and an STD (standard deviation) threshold method. It is shown that SLHF anomalies may occur for interplate or intraplate earthquakes and for coastal or island earthquakes. The SLHF anomalies usually appear 5-6 days prior to an earthquake and then disappear quickly after the event. The process of anomaly evolution to a certain extent reflects a dynamic energy change during earthquake preparation, that is, weak-strong-weak-disappeared.
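
    A minimal sketch of an STD-threshold test of the sort named above, applied to synthetic daily SLHF values: a day is flagged when it exceeds the background mean for that calendar day by more than k standard deviations; the data and the threshold k = 2 are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)
years, days = 10, 90                        # 10 background years, 90-day window
background = 80 + 15 * rng.standard_normal((years, days))   # W/m^2, synthetic
current = 80 + 15 * rng.standard_normal(days)
current[60] += 70                           # inject one strong positive excursion

mean = background.mean(axis=0)              # day-by-day background mean
std = background.std(axis=0)                # day-by-day background standard deviation
k = 2.0
anomalous_days = np.where(current > mean + k * std)[0]
print("days exceeding mean + 2*sigma:", anomalous_days)
```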

  12. Salient Features of the 2015 Gorkha, Nepal Earthquake in Relation to Earthquake Cycle and Dynamic Rupture Models

    NASA Astrophysics Data System (ADS)

    Ampuero, J. P.; Meng, L.; Hough, S. E.; Martin, S. S.; Asimaki, D.

    2015-12-01

    Two salient features of the 2015 Gorkha, Nepal, earthquake provide new opportunities to evaluate models of the earthquake cycle and dynamic rupture. The Gorkha earthquake broke only partially across the seismogenic depth of the Main Himalayan Thrust: its slip was confined to a narrow depth range near the bottom of the locked zone. As indicated by the belt of background seismicity and decades of geodetic monitoring, this is an area of stress concentration induced by deep fault creep. Previous conceptual models attribute such intermediate-size events to rheological segmentation along-dip, including a fault segment with intermediate rheology in between the stable and unstable slip segments. We will present results from earthquake cycle models that, in contrast, highlight the role of stress loading concentration rather than frictional segmentation. These models produce "super-cycles" comprising recurrent characteristic events interspersed with deep, smaller non-characteristic events of overall increasing magnitude. Because the non-characteristic events are an intrinsic component of the earthquake super-cycle, the notion of Coulomb triggering or time-advance of the "big one" is ill-defined. The high-frequency (HF) ground motions produced in Kathmandu by the Gorkha earthquake were weaker than expected for such a magnitude and such a close distance to the rupture, as attested by strong motion recordings and by macroseismic data. Static slip reached close to Kathmandu but had a long rise time, consistent with control by the along-dip extent of the rupture. Moreover, the HF (1 Hz) radiation sources, imaged by teleseismic back-projection of multiple dense arrays calibrated by aftershock data, were deep and far from Kathmandu. We argue that HF rupture imaging provided a better predictor of shaking intensity than finite source inversion. The deep location of HF radiation can be attributed to rupture over heterogeneous initial stresses left by the background seismic activity

  13. High Attenuation Rate for Shallow, Small Earthquakes in Japan

    NASA Astrophysics Data System (ADS)

    Si, Hongjun; Koketsu, Kazuki; Miyake, Hiroe

    2017-09-01

    We compared the attenuation characteristics of peak ground accelerations (PGAs) and velocities (PGVs) of strong motion from shallow, small earthquakes that occurred in Japan with those predicted by the equations of Si and Midorikawa (J Struct Constr Eng 523:63-70, 1999). The observed PGAs and PGVs at stations far from the seismic source decayed more rapidly than the predicted ones. The same tendencies have been reported for deep, moderate, and large earthquakes, but not for shallow, moderate, and large earthquakes. This indicates that the peak values of ground motion from shallow, small earthquakes attenuate more steeply than those from shallow, moderate or large earthquakes. To investigate the reason for this difference, we numerically simulated strong ground motion for point sources of M w 4 and 6 earthquakes using a 2D finite difference method. The analyses of the synthetic waveforms suggested that the above differences are caused by surface waves, which are predominant at stations far from the seismic source for shallow, moderate earthquakes but not for shallow, small earthquakes. Thus, although loss due to reflection at the boundaries of the discontinuous Earth structure occurs in all shallow earthquakes, the apparent attenuation rate for a moderate or large earthquake is essentially the same as that of body waves propagating in a homogeneous medium due to the dominance of surface waves.

  14. From a physical approach to earthquake prediction, towards long and short term warnings ahead of large earthquakes

    NASA Astrophysics Data System (ADS)

    Stefansson, R.; Bonafede, M.

    2012-04-01

    For 20 years the South Iceland Seismic Zone (SISZ) was a test site for multinational earthquake prediction research, partly bridging the gap between laboratory test samples and the huge transform zones of the Earth. The approach was to explore the physics of processes leading up to large earthquakes. The book Advances in Earthquake Prediction: Research and Risk Mitigation, by R. Stefansson (2011), published by Springer/PRAXIS, and an article in the August issue of the BSSA by Stefansson, M. Bonafede and G. Gudmundsson (2011) contain a good overview of the findings and further references, as well as examples of partially successful long- and short-term warnings based on such an approach. Significant findings are: Earthquakes that occurred hundreds of years ago left scars in the crust, expressed in volumes of heterogeneity that demonstrate the size of their faults. Rheology and stress heterogeneity within these volumes are significantly variable in time and space. Crustal processes in and near such faults may be observed through microearthquake information decades before the sudden onset of a new large earthquake. High-pressure fluids of mantle origin may, in response to strain, especially near plate boundaries, migrate upward into the brittle/elastic crust and play a significant role in modifying crustal conditions over the long and short term. Preparatory processes of various earthquakes cannot be expected to be the same. We learn about an impending earthquake by observing long-term preparatory processes at the fault, finding a constitutive relationship that governs the processes, and then extrapolating that relationship into nearby space and the near future. This is a deterministic approach in earthquake prediction research. Such extrapolations contain many uncertainties. However, the long-term pattern of observations of the pre-earthquake fault process will help us to put probability constraints on our extrapolations and our warnings. The approach described is different from the usual

  15. Fault failure with moderate earthquakes

    USGS Publications Warehouse

    Johnston, M.J.S.; Linde, A.T.; Gladwin, M.T.; Borcherdt, R.D.

    1987-01-01

    High resolution strain and tilt recordings were made in the near-field of, and prior to, the May 1983 Coalinga earthquake (ML = 6.7, Δ = 51 km), the August 4, 1985, Kettleman Hills earthquake (ML = 5.5, Δ = 34 km), the April 1984 Morgan Hill earthquake (ML = 6.1, Δ = 55 km), the November 1984 Round Valley earthquake (ML = 5.8, Δ = 54 km), the January 14, 1978, Izu, Japan earthquake (ML = 7.0, Δ = 28 km), and several other smaller magnitude earthquakes. These recordings were made with near-surface instruments (resolution 10⁻⁸), with borehole dilatometers (resolution 10⁻¹⁰) and a 3-component borehole strainmeter (resolution 10⁻⁹). While observed coseismic offsets are generally in good agreement with expectations from elastic dislocation theory, and while post-seismic deformation continued, in some cases, with a moment comparable to that of the main shock, preseismic strain or tilt perturbations from hours to seconds (or less) before the main shock are not apparent above the present resolution. Precursory slip for these events, if any occurred, must have had a moment less than a few percent of that of the main event. To the extent that these records reflect general fault behavior, the strong constraint on the size and amount of slip triggering major rupture makes prediction of the onset times and final magnitudes of the rupture zones a difficult task unless the instruments are fortuitously installed near the rupture initiation point. These data are best explained by an inhomogeneous failure model for which various areas of the fault plane have either different stress-slip constitutive laws or spatially varying constitutive parameters. Other work on seismic waveform analysis and synthetic waveforms indicates that the rupturing process is inhomogeneous and controlled by points of higher strength. These models indicate that rupture initiation occurs at smaller regions of higher strength which, when broken, allow runaway catastrophic failure. © 1987.

  16. Statistical earthquake focal mechanism forecasts

    NASA Astrophysics Data System (ADS)

    Kagan, Yan Y.; Jackson, David D.

    2014-04-01

    Forecasts of the focal mechanisms of future shallow (depth 0-70 km) earthquakes are important for seismic hazard estimates and for Coulomb stress and other models of earthquake occurrence. Here we report on a high-resolution global forecast of earthquake rate density as a function of location, magnitude and focal mechanism. In previous publications we reported forecasts of 0.5° spatial resolution, covering the latitude range from -75° to +75°, based on the Global Central Moment Tensor earthquake catalogue. In the new forecasts we have improved the spatial resolution to 0.1° and extended the latitude range from pole to pole. Our focal mechanism estimates require distance-weighted combinations of observed focal mechanisms within 1000 km of each gridpoint. Simultaneously, we calculate an average rotation angle between the forecast mechanism and all the surrounding mechanisms, using the method of Kagan & Jackson proposed in 1994. This average angle reveals the level of tectonic complexity of a region and indicates the accuracy of the prediction. The procedure becomes problematic where longitude lines are not approximately parallel, and where shallow earthquakes are so sparse that an adequate sample spans very large distances. North or south of 75°, the azimuths of points 1000 km away may vary by about 35°. We solved this problem by calculating focal mechanisms on a plane tangent to the Earth's surface at each forecast point, correcting for the rotation of the longitude lines at the locations of earthquakes included in the averaging. The corrections are negligible between -30° and +30° latitude, but outside that band the uncorrected rotations can be significant. Improved forecasts at 0.5° and 0.1° resolution are posted at http://eq.ess.ucla.edu/kagan/glob_gcmt_index.html.
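
    The distance-weighted averaging described above can be sketched as follows; the catalogue rows, the 1/(distance + offset) weighting kernel, and the component-wise averaging of strike/dip/rake are simplifying assumptions for illustration (the study treats mechanisms as rotations, which this sketch does not).

      import numpy as np

      def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
          # Great-circle distance from a forecast gridpoint to catalogue epicentres.
          p1, p2 = np.radians(lat1), np.radians(lat2)
          dlat, dlon = p2 - p1, np.radians(lon2 - lon1)
          a = np.sin(dlat / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dlon / 2) ** 2
          return 2.0 * radius_km * np.arcsin(np.sqrt(a))

      # Hypothetical catalogue rows: latitude, longitude, strike, dip, rake (degrees)
      catalogue = np.array([[35.1, 139.2, 210.0, 45.0, 90.0],
                            [36.0, 141.0, 195.0, 30.0, 80.0],
                            [30.5, 131.0, 220.0, 60.0, 100.0]])

      grid_lat, grid_lon = 35.0, 140.0
      d = haversine_km(grid_lat, grid_lon, catalogue[:, 0], catalogue[:, 1])
      near = d <= 1000.0              # keep mechanisms within 1000 km, as in the abstract
      w = 1.0 / (d[near] + 50.0)      # assumed distance-weighting kernel
      w /= w.sum()
      print(np.round(w @ catalogue[near, 2:], 1))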

  17. Izmit, Turkey 1999 Earthquake Interferogram

    NASA Technical Reports Server (NTRS)

    2001-01-01

    This image is an interferogram that was created using pairs of images taken by Synthetic Aperture Radar (SAR). The images, acquired at two different times, have been combined to measure surface deformation or changes that may have occurred during the time between data acquisition. The images were collected by the European Space Agency's Remote Sensing satellite (ERS-2) on 13 August 1999 and 17 September 1999 and were combined to produce these image maps of the apparent surface deformation, or changes, during and after the 17 August 1999 Izmit, Turkey earthquake. This magnitude 7.6 earthquake was the largest in 60 years in Turkey and caused extensive damage and loss of life. Each of the color contours of the interferogram represents 28 mm (1.1 inches) of motion towards the satellite, or about 70 mm (2.8 inches) of horizontal motion. White areas are outside the SAR image or cover the water of seas and lakes. The North Anatolian Fault that broke during the Izmit earthquake moved more than 2.5 meters (8.1 feet) to produce the pattern measured by the interferogram. Thin red lines show the locations of fault breaks mapped on the surface. The SAR interferogram shows that the deformation and fault slip extended west of the surface faults, underneath the Gulf of Izmit. Thick black lines mark the fault rupture inferred from the SAR data. Scientists are using the SAR interferometry along with other data collected on the ground to estimate the pattern of slip that occurred during the Izmit earthquake. This is then used to improve computer models that predict how this deformation transferred stress to other faults and to the continuation of the North Anatolian Fault, which extends to the west past the large city of Istanbul. These models show that the Izmit earthquake further increased the already high probability of a major earthquake near Istanbul.
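
    The fringe-to-displacement arithmetic implied by the caption (28 mm of line-of-sight motion, roughly 70 mm of horizontal motion, per color contour) can be written out directly; the fringe count used below is hypothetical.

      # Each interferogram fringe = 28 mm toward the satellite (~70 mm horizontal here).
      fringes_counted = 10                 # hypothetical count across part of the image
      los_m = fringes_counted * 0.028
      horizontal_m = fringes_counted * 0.070
      print(los_m, "m line-of-sight,", horizontal_m, "m horizontal")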

  18. Living on an Active Earth: Perspectives on Earthquake Science

    NASA Astrophysics Data System (ADS)

    Lay, Thorne

    2004-02-01

    The annualized long-term loss due to earthquakes in the United States is now estimated at $4.4 billion per year. A repeat of the 1923 Kanto earthquake, near Tokyo, could cause direct losses of $2-3 trillion. With such grim numbers, which are guaranteed to make you take its work seriously, the NRC Committee on the Science of Earthquakes begins its overview of the emerging multidisciplinary field of earthquake science. An up-to-date and forward-looking survey of scientific investigation of earthquake phenomena and engineering response to associated hazards is presented at a suitable level for a general educated audience. Perspectives from the fields of seismology, geodesy, neo-tectonics, paleo-seismology, rock mechanics, earthquake engineering, and computer modeling of complex dynamic systems are integrated into a balanced definition of earthquake science that has never before been adequately articulated.

  19. Material contrast does not predict earthquake rupture propagation direction

    USGS Publications Warehouse

    Harris, R.A.; Day, S.M.

    2005-01-01

    Earthquakes often occur on faults that juxtapose different rocks. The result is rupture behavior that differs from that of an earthquake occurring on a fault in a homogeneous material. Previous 2D numerical simulations have studied simple cases of earthquake rupture propagation where there is a material contrast across a fault and have come to two different conclusions: 1) earthquake rupture propagation direction can be predicted from the material contrast, and 2) earthquake rupture propagation direction cannot be predicted from the material contrast. In this paper we provide observational evidence from 70 years of earthquakes at Parkfield, CA, and new 3D numerical simulations. Both the observations and the numerical simulations demonstrate that earthquake rupture propagation direction is unlikely to be predictable on the basis of a material contrast. Copyright 2005 by the American Geophysical Union.

  20. 77 FR 62523 - Scientific Earthquake Studies Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-15

    ... DEPARTMENT OF THE INTERIOR Geological Survey [GX13GG009950000] Scientific Earthquake Studies... Law 106-503, the Scientific Earthquake Studies Advisory Committee (SESAC) will hold its next meeting... the Scientific Earthquake Studies Advisory Committee are open to the public. DATES: October 29, 2012...

  1. Global Instrumental Seismic Catalog: earthquake relocations for 1900-present

    NASA Astrophysics Data System (ADS)

    Villasenor, A.; Engdahl, E.; Storchak, D. A.; Bondar, I.

    2010-12-01

    We present the current status of our efforts to produce a set of homogeneous earthquake locations and improved focal depths towards the compilation of a Global Catalog of instrumentally recorded earthquakes that will be complete down to the lowest magnitude threshold possible on a global scale and for the time period considered. This project is currently being carried out under the auspices of GEM (Global Earthquake Model). The resulting earthquake catalog will be a fundamental dataset not only for earthquake risk modeling and assessment on a global scale, but also for a large number of studies such as global and regional seismotectonics; the rupture zones and return time of large, damaging earthquakes; the spatial-temporal pattern of moment release along seismic zones and faults, etc. Our current goal is to re-locate all earthquakes with available station arrival data using the following magnitude thresholds: M5.5 for 1964-present, M6.25 for 1918-1963, M7.5 (complemented with significant events in continental regions) for 1900-1917. Phase arrival time data for earthquakes after 1963 are available in digital form from the International Seismological Centre (ISC). For earthquakes in the time period 1918-1963, phase data are obtained by scanning the printed International Seismological Summary (ISS) bulletins and applying optical character recognition routines. For earlier earthquakes we will collect phase data from individual station bulletins. We will illustrate some of the most significant results of this relocation effort, including aftershock distributions for large earthquakes, systematic differences in epicenter and depth with respect to previous locations, examples of grossly mislocated events, etc.
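
    A small helper capturing the magnitude thresholds quoted above for the relocation effort; note that significant continental events before 1918, which the project handles separately, are not represented by this simple check.

      def meets_relocation_threshold(year, magnitude):
          # Completeness thresholds stated in the record: M5.5 for 1964-present,
          # M6.25 for 1918-1963, M7.5 for 1900-1917.
          if year >= 1964:
              return magnitude >= 5.5
          if year >= 1918:
              return magnitude >= 6.25
          return magnitude >= 7.5

      print(meets_relocation_threshold(1950, 6.3), meets_relocation_threshold(1910, 7.0))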

  2. The persistence of directivity in small earthquakes

    USGS Publications Warehouse

    Boatwright, J.

    2007-01-01

    We derive a simple inversion of peak ground acceleration (PGA) or peak ground velocity (PGV) for rupture direction and rupture velocity and then test this inversion on the peak motions obtained from seven 3.5 ≤ M ≤ 4.1 earthquakes that occurred in two clusters in November 2002 and February 2003 near San Ramon, California. These clusters were located on two orthogonal strike-slip faults so that the events share the same approximate focal mechanism but not the same fault plane. Three earthquakes exhibit strong directivity, but the other four earthquakes exhibit relatively weak directivity. We use the residual PGAs and PGVs from the other six events to determine station corrections for each earthquake. The inferred rupture directions unambiguously identify the fault plane for the three earthquakes with strong directivity and for three of the four earthquakes with weak directivity. The events with strong directivity have fast rupture velocities (0.63β ≤ v ≤ 0.87β); the events with weak directivity either rupture more slowly (0.17β ≤ v ≤ 0.35β) or bilaterally. The simple unilateral inversion cannot distinguish between slow and bilateral ruptures: adding a bilateral rupture component degrades the fit of the rupture directions to the fault planes. By comparing PGAs from the events with strong and weak directivity, we show how an up-dip rupture in small events can distort the attenuation of peak ground motion with distance. When we compare the rupture directions of the earthquakes to the location of aftershocks in the two clusters, we find that almost all the aftershocks of the three earthquakes with strong directivity occur within 70° of the direction of rupture.
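
    A sketch of how azimuthal variation of station-corrected peak amplitudes can be searched for a unilateral rupture direction and rupture speed, using the standard directivity factor 1/(1 - (v/β)cosθ); the grid search, the station azimuths and the amplitudes are illustrative assumptions, not the inversion or data of the study.

      import numpy as np

      def directivity_factor(station_az_deg, rupture_az_deg, v_over_beta):
          # Unilateral directivity amplification 1 / (1 - (v/beta) * cos(theta)).
          theta = np.radians(station_az_deg - rupture_az_deg)
          return 1.0 / (1.0 - v_over_beta * np.cos(theta))

      # Hypothetical station azimuths (deg) and station-corrected log10 peak amplitudes
      az = np.array([0.0, 45.0, 90.0, 135.0, 180.0, 225.0, 270.0, 315.0])
      log_amp = np.array([0.42, 0.30, 0.05, -0.18, -0.25, -0.15, 0.02, 0.28])

      best = None
      for rup_az in range(0, 360, 5):
          for v in np.arange(0.10, 0.90, 0.02):
              pred = np.log10(directivity_factor(az, rup_az, v))
              misfit = np.sum(((log_amp - log_amp.mean()) - (pred - pred.mean())) ** 2)
              if best is None or misfit < best[0]:
                  best = (misfit, rup_az, round(float(v), 2))
      print("best-fit rupture azimuth (deg) and v/beta:", best[1], best[2])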

  3. Earthquake geology of the Bulnay Fault (Mongolia)

    USGS Publications Warehouse

    Rizza, Magali; Ritz, Jean-Franciois; Prentice, Carol S.; Vassallo, Ricardo; Braucher, Regis; Larroque, Christophe; Arzhannikova, A.; Arzhanikov, S.; Mahan, Shannon; Massault, M.; Michelot, J-L.; Todbileg, M.

    2015-01-01

    The Bulnay earthquake of July 23, 1905 (Mw 8.3-8.5), in north-central Mongolia, is one of the world's largest recorded intracontinental earthquakes and one of four great earthquakes that occurred in the region during the 20th century. The 375-km-long surface rupture of the left-lateral, strike-slip, N095°E trending Bulnay Fault associated with this earthquake is remarkable for its pronounced expression across the landscape and for the size of features produced by previous earthquakes. Our field observations suggest that in many areas the width and geometry of the rupture zone is the result of repeated earthquakes; however, in those areas where it is possible to determine that the geomorphic features are the result of the 1905 surface rupture alone, the size of the features produced by this single earthquake is singular in comparison to most other historical strike-slip surface ruptures worldwide. Along the 80 km stretch between 97.18°E and 98.33°E, the fault zone is several meters wide, and the mean left-lateral 1905 offset is 8.9 ± 0.6 m, with two measured cumulative offsets that are twice the 1905 slip. These observations suggest that the displacement produced during the penultimate event was similar to the 1905 slip. Morphotectonic analyses carried out at three sites along the eastern part of the Bulnay fault allow us to estimate a mean horizontal slip rate of 3.1 ± 1.7 mm/yr over the Late Pleistocene-Holocene period. In parallel, paleoseismological investigations show evidence for two earthquakes prior to the 1905 event, with recurrence intervals of ~2700-4000 years.

  4. Earthquake Hazard in the Heart of the Homeland

    USGS Publications Warehouse

    Gomberg, Joan; Schweig, Eugene

    2007-01-01

    Evidence that earthquakes threaten the Mississippi, Ohio, and Wabash River valleys of the Central United States abounds. In fact, several of the largest historical earthquakes to strike the continental United States occurred in the winter of 1811-1812 along the New Madrid seismic zone, which stretches from just west of Memphis, Tenn., into southern Illinois. Several times in the past century, moderate earthquakes have been widely felt in the Wabash Valley seismic zone along the southern border of Illinois and Indiana. Throughout the region, between 150 and 200 earthquakes are recorded annually by a network of monitoring instruments, although most are too small to be felt by people. Geologic evidence for prehistoric earthquakes throughout the region has been mounting since the late 1970s. But how significant is the threat? How likely are large earthquakes and, more importantly, what is the chance that the shaking they cause will be damaging?

  5. A century of induced earthquakes in Oklahoma?

    USGS Publications Warehouse

    Hough, Susan E.; Page, Morgan T.

    2015-01-01

    Seismicity rates have increased sharply since 2009 in the central and eastern United States, with especially high rates of activity in the state of Oklahoma. Growing evidence indicates that many of these events are induced, primarily by injection of wastewater in deep disposal wells. The upsurge in activity has raised two questions: What is the background rate of tectonic earthquakes in Oklahoma? How much has the rate varied throughout historical and early instrumental times? In this article, we show that (1) seismicity rates since 2009 surpass previously observed rates throughout the twentieth century; (2) several lines of evidence suggest that most of the significant earthquakes in Oklahoma during the twentieth century were likely induced by oil production activities, as they exhibit statistically significant temporal and spatial correspondence with disposal wells, and intensity measurements for the 1952 El Reno earthquake and possibly the 1956 Tulsa County earthquake follow the pattern observed in other induced earthquakes; and (3) there is evidence for a low level of tectonic seismicity in southeastern Oklahoma associated with the Ouachita structural belt. The 22 October 1882 Choctaw Nation earthquake, for which we estimate Mw 4.8, occurred in this zone.

  6. Unbonded Prestressed Columns for Earthquake Resistance

    DOT National Transportation Integrated Search

    2012-05-01

    Modern structures are able to survive significant shaking caused by earthquakes. By implementing unbonded post-tensioned tendons in bridge columns, the damage caused by an earthquake can be significantly lower than that of a standard reinforced concr...

  7. The October 1992 Parkfield, California, earthquake prediction

    USGS Publications Warehouse

    Langbein, J.

    1992-01-01

    A magnitude 4.7 earthquake occurred near Parkfield, California, on October 20, 1992, at 05:28 UTC (October 19 at 10:28 p.m. local or Pacific Daylight Time). This moderate shock, interpreted as the potential foreshock of a damaging earthquake on the San Andreas fault, triggered long-standing federal, state and local government plans to issue a public warning of an imminent magnitude 6 earthquake near Parkfield. Although the predicted earthquake did not take place, sophisticated suites of instruments deployed as part of the Parkfield Earthquake Prediction Experiment recorded valuable data associated with an unusual series of events. This article describes the geological aspects of these events, which occurred near Parkfield in October 1992. The accompanying article, an edited version of a press conference by Richard Andrews, the Director of the California Office of Emergency Services (OES), describes the governmental response to the prediction.

  8. Oklahoma’s recent earthquakes and saltwater disposal

    PubMed Central

    Walsh, F. Rall; Zoback, Mark D.

    2015-01-01

    Over the past 5 years, parts of Oklahoma have experienced marked increases in the number of small- to moderate-sized earthquakes. In three study areas that encompass the vast majority of the recent seismicity, we show that the increases in seismicity follow 5- to 10-fold increases in the rates of saltwater disposal. Adjacent areas where there has been relatively little saltwater disposal have had comparatively few recent earthquakes. In the areas of seismic activity, the saltwater disposal principally comes from “produced” water, saline pore water that is coproduced with oil and then injected into deeper sedimentary formations. These formations appear to be in hydraulic communication with potentially active faults in crystalline basement, where nearly all the earthquakes are occurring. Although most of the recent earthquakes have posed little danger to the public, the possibility of triggering damaging earthquakes on potentially active basement faults cannot be discounted. PMID:26601200

  9. Implications of fault constitutive properties for earthquake prediction

    USGS Publications Warehouse

    Dieterich, J.H.; Kilgore, B.

    1996-01-01

    The rate- and state-dependent constitutive formulation for fault slip characterizes an exceptional variety of materials over a wide range of sliding conditions. This formulation provides a unified representation of diverse sliding phenomena including slip weakening over a characteristic sliding distance D(c), apparent fracture energy at a rupture front, time-dependent healing after rapid slip, and various other transient and slip rate effects. Laboratory observations and theoretical models both indicate that earthquake nucleation is accompanied by long intervals of accelerating slip. Strains from the nucleation process on buried faults generally could not be detected if laboratory values of D(c) apply to faults in nature. However, scaling of D(c) is presently an open question and the possibility exists that measurable premonitory creep may precede some earthquakes. Earthquake activity is modeled as a sequence of earthquake nucleation events. In this model, earthquake clustering arises from sensitivity of nucleation times to the stress changes induced by prior earthquakes. The model gives the characteristic Omori aftershock decay law and assigns physical interpretation to aftershock parameters. The seismicity formulation predicts that large changes of earthquake probabilities result from stress changes. Two mechanisms for foreshocks are proposed that describe the observed frequency of occurrence of foreshock-mainshock pairs by time and magnitude. With the first mechanism, foreshocks represent a manifestation of earthquake clustering in which the stress change at the time of the foreshock increases the probability of earthquakes at all magnitudes including the eventual mainshock. With the second model, accelerating fault slip on the mainshock nucleation zone triggers foreshocks.
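
    The Omori aftershock decay mentioned above has the familiar modified form n(t) = K/(c + t)^p; the short sketch below simply evaluates that law with placeholder parameters (in the paper the Omori behaviour emerges from the rate- and state-dependent nucleation model rather than being assumed).

      import numpy as np

      def omori_rate(t_days, K=100.0, c=0.05, p=1.1):
          # Modified Omori law: aftershock rate = K / (c + t)^p.
          # K, c, p are illustrative placeholder values.
          return K / (c + t_days) ** p

      t = np.array([0.1, 1.0, 10.0, 100.0])
      print(np.round(omori_rate(t), 2))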

  10. Implications of fault constitutive properties for earthquake prediction.

    PubMed Central

    Dieterich, J H; Kilgore, B

    1996-01-01

    The rate- and state-dependent constitutive formulation for fault slip characterizes an exceptional variety of materials over a wide range of sliding conditions. This formulation provides a unified representation of diverse sliding phenomena including slip weakening over a characteristic sliding distance Dc, apparent fracture energy at a rupture front, time-dependent healing after rapid slip, and various other transient and slip rate effects. Laboratory observations and theoretical models both indicate that earthquake nucleation is accompanied by long intervals of accelerating slip. Strains from the nucleation process on buried faults generally could not be detected if laboratory values of Dc apply to faults in nature. However, scaling of Dc is presently an open question and the possibility exists that measurable premonitory creep may precede some earthquakes. Earthquake activity is modeled as a sequence of earthquake nucleation events. In this model, earthquake clustering arises from sensitivity of nucleation times to the stress changes induced by prior earthquakes. The model gives the characteristic Omori aftershock decay law and assigns physical interpretation to aftershock parameters. The seismicity formulation predicts that large changes of earthquake probabilities result from stress changes. Two mechanisms for foreshocks are proposed that describe the observed frequency of occurrence of foreshock-mainshock pairs by time and magnitude. With the first mechanism, foreshocks represent a manifestation of earthquake clustering in which the stress change at the time of the foreshock increases the probability of earthquakes at all magnitudes including the eventual mainshock. With the second model, accelerating fault slip on the mainshock nucleation zone triggers foreshocks. PMID:11607666

  11. Implications of fault constitutive properties for earthquake prediction.

    PubMed

    Dieterich, J H; Kilgore, B

    1996-04-30

    The rate- and state-dependent constitutive formulation for fault slip characterizes an exceptional variety of materials over a wide range of sliding conditions. This formulation provides a unified representation of diverse sliding phenomena including slip weakening over a characteristic sliding distance Dc, apparent fracture energy at a rupture front, time-dependent healing after rapid slip, and various other transient and slip rate effects. Laboratory observations and theoretical models both indicate that earthquake nucleation is accompanied by long intervals of accelerating slip. Strains from the nucleation process on buried faults generally could not be detected if laboratory values of Dc apply to faults in nature. However, scaling of Dc is presently an open question and the possibility exists that measurable premonitory creep may precede some earthquakes. Earthquake activity is modeled as a sequence of earthquake nucleation events. In this model, earthquake clustering arises from sensitivity of nucleation times to the stress changes induced by prior earthquakes. The model gives the characteristic Omori aftershock decay law and assigns physical interpretation to aftershock parameters. The seismicity formulation predicts that large changes of earthquake probabilities result from stress changes. Two mechanisms for foreshocks are proposed that describe the observed frequency of occurrence of foreshock-mainshock pairs by time and magnitude. With the first mechanism, foreshocks represent a manifestation of earthquake clustering in which the stress change at the time of the foreshock increases the probability of earthquakes at all magnitudes including the eventual mainshock. With the second model, accelerating fault slip on the mainshock nucleation zone triggers foreshocks.

  12. Statistical aspects and risks of human-caused earthquakes

    NASA Astrophysics Data System (ADS)

    Klose, C. D.

    2013-12-01

    The seismological community invests ample human capital and financial resources to research and predict risks associated with earthquakes. Industries such as the insurance and re-insurance sector are equally interested in using probabilistic risk models developed by the scientific community to transfer risks. These models are used to predict expected losses due to naturally occurring earthquakes. But what about the risks associated with human-caused earthquakes? Such risk models are largely absent from both industry and academic discourse. In countries around the world, informed citizens are becoming increasingly aware and concerned that this economic bias is not sustainable for long-term economic growth, environmental and human security. Ultimately, citizens look to their government officials to hold industry accountable. In the Netherlands, for example, the hydrocarbon industry is held accountable for causing earthquakes near Groningen. In Switzerland, geothermal power plants were shut down or suspended because they caused earthquakes in the cantons of Basel and St. Gallen. The public and the private non-extractive industry need access to information about earthquake risks in connection with sub/urban geoengineering activities, including natural gas production through fracking, geothermal energy production, carbon sequestration, mining and water irrigation. This presentation illuminates statistical aspects of human-caused earthquakes with respect to different geologic environments. Statistical findings are based on the first catalog of human-caused earthquakes (Klose, 2013). Findings are discussed which include the odds of dying during a medium-size earthquake that is set off by geomechanical pollution. Any kind of geoengineering activity causes this type of pollution and increases the likelihood of triggering nearby faults to rupture.

  13. Streamflow and water well responses to earthquakes.

    PubMed

    Montgomery, David R; Manga, Michael

    2003-06-27

    Earthquake-induced crustal deformation and ground shaking can alter stream flow and water levels in wells through consolidation of surficial deposits, fracturing of solid rocks, aquifer deformation, and the clearing of fracture-filling material. Although local conditions affect the type and amplitude of response, a compilation of reported observations of hydrological response to earthquakes indicates that the maximum distance to which changes in stream flow and water levels in wells have been reported is related to earthquake magnitude. Detectable streamflow changes occur in areas within tens to hundreds of kilometers of the epicenter, whereas changes in groundwater levels in wells can occur hundreds to thousands of kilometers from earthquake epicenters.

  14. Crowd-Sourced Global Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Minson, S. E.; Brooks, B. A.; Glennie, C. L.; Murray, J. R.; Langbein, J. O.; Owen, S. E.; Iannucci, B. A.; Hauser, D. L.

    2014-12-01

    Although earthquake early warning (EEW) has shown great promise for reducing loss of life and property, it has only been implemented in a few regions due, in part, to the prohibitive cost of building the required dense seismic and geodetic networks. However, many cars and consumer smartphones, tablets, laptops, and similar devices contain low-cost versions of the same sensors used for earthquake monitoring. If a workable EEW system could be implemented based on either crowd-sourced observations from consumer devices or very inexpensive networks of instruments built from consumer-quality sensors, EEW coverage could potentially be expanded worldwide. Controlled tests of several accelerometers and global navigation satellite system (GNSS) receivers typically found in consumer devices show that, while they are significantly noisier than scientific-grade instruments, they are still accurate enough to capture displacements from moderate and large magnitude earthquakes. The accuracy of these sensors varies greatly depending on the type of data collected. Raw coarse acquisition (C/A) code GPS data are relatively noisy. These observations have a surface displacement detection threshold approaching ~1 m and would thus only be useful in large Mw 8+ earthquakes. However, incorporating either satellite-based differential corrections or using a Kalman filter to combine the raw GNSS data with low-cost acceleration data (such as from a smartphone) decreases the noise dramatically. These approaches allow detection thresholds as low as 5 cm, potentially enabling accurate warnings for earthquakes as small as Mw 6.5. Simulated performance tests show that, with data contributed from only a very small fraction of the population, a crowd-sourced EEW system would be capable of warning San Francisco and San Jose of a Mw 7 rupture on California's Hayward fault and could have accurately issued both earthquake and tsunami warnings for the 2011 Mw 9 Tohoku-oki, Japan earthquake.
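
    The GNSS-plus-accelerometer fusion idea can be illustrated with a minimal one-dimensional Kalman filter in which the accelerometer drives the state prediction and noisy GNSS displacements correct it; the noise levels, time step and input signal below are assumptions for illustration, not the system described in the abstract.

      import numpy as np

      def fuse_gnss_accel(gnss_disp, accel, dt, r_gnss=0.05 ** 2, q=1e-4):
          # State x = [displacement, velocity]; acceleration enters as a control input.
          x, P = np.zeros(2), np.eye(2)
          F = np.array([[1.0, dt], [0.0, 1.0]])
          B = np.array([0.5 * dt ** 2, dt])
          H = np.array([[1.0, 0.0]])
          Q = q * np.eye(2)
          fused = []
          for z, a in zip(gnss_disp, accel):
              x = F @ x + B * a                    # predict using the accelerometer
              P = F @ P @ F.T + Q
              S = H @ P @ H.T + r_gnss             # innovation covariance
              K = P @ H.T / S                      # Kalman gain
              x = x + (K * (z - H @ x)).ravel()    # correct using the GNSS displacement
              P = (np.eye(2) - K @ H) @ P
              fused.append(x[0])
          return np.array(fused)

      dt = 1.0
      t = np.arange(0.0, 30.0, dt)
      true_disp = 0.2 * np.sin(2.0 * np.pi * t / 20.0)          # hypothetical motion (m)
      accel = np.gradient(np.gradient(true_disp, dt), dt)
      gnss = true_disp + np.random.normal(0.0, 0.05, t.size)    # ~5 cm GNSS noise
      print(np.round(fuse_gnss_accel(gnss, accel, dt)[-5:], 3))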

  15. Development and Progress of Education for Earthquake Disaster

    NASA Astrophysics Data System (ADS)

    Usui, Hiromoto

    We experienced the great Hanshin-Awaji earthquake disaster around ten years ago. Recently, passing the memory of that disaster on to the next generation has become an important task. Since the occurrence of a huge earthquake is expected in the near future, it is important to teach the lessons of the great Hanshin-Awaji earthquake disaster widely to the next generation, and this educational activity is also important for the disaster mitigation strategy in Japan. In this project, the accumulated data of disaster memory can be utilized to construct an educational system for earthquake disasters, and collaboration among Kobe University, the local and city governments, civic groups and media organizations can be exploited to characterize the educational system for earthquake disaster mitigation.

  16. Photointerpretation of Alaskan post-earthquake photography

    USGS Publications Warehouse

    Hackman, R.J.

    1965-01-01

    Aerial photographs taken after the March 27, 1964, Good Friday, Alaskan earthquake were examined stereoscopically to determine effects of the earthquake in areas remote from the towns, highways, and the railroad. The two thousand black and white photographs used in this study were taken in April, after the earthquake, by the U. S. Coast & Geodetic Survey and were generously supplied to the U. S. Geological Survey. Part of the photographs, at a scale of 1/24,000, provide blanket coverage of approximately 2,000 square miles of land area north and west of Prince William Sound, including parts of the mainland and some of the adjacent islands. The epicenter of the earthquake, near the head of Unakwik Inlet, is located in this area. The rest of the photographs, at scales ranging from 1/17,000 to 1/40,000, cover isolated strips of the coastline of the mainland and nearby islands in the general area of Prince William Sound. Figure 1 shows the area of new photo coverage used in this study. The objective of the study was to determine quickly whether geological features resulting from the earthquake, such as faults, changes in shoreline, cracks in surficial material, pressure ridges in lake ice, fractures in glaciers and lake ice, and rock slides and avalanches, might be identifiable by photointerpretation. The study was made without benefit of comparisons with older, or pre-earthquake, photography, which was not readily available for immediate use.

  17. Computing Earthquake Probabilities on Global Scales

    NASA Astrophysics Data System (ADS)

    Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.

    2016-03-01

    Large events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near mean field systems having long-range interactions, an example of which is earthquakes with their elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
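
    One way to express the count-to-probability step described above is through a conditional Weibull law on the number of small events since the last large one; the survivor function below and its parameters are placeholders standing in for values that would be fitted to a regional catalogue.

      import numpy as np

      def conditional_large_event_prob(n_small, delta_n, beta=300.0, k=1.4):
          # Survivor function S(n) = exp(-(n / beta)^k); the probability of a large
          # event within the next delta_n small events, given that n_small have
          # occurred, is 1 - S(n_small + delta_n) / S(n_small).
          survivor = lambda n: np.exp(-(n / beta) ** k)
          return 1.0 - survivor(n_small + delta_n) / survivor(n_small)

      print(round(conditional_large_event_prob(250, 50), 3))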

  18. 75 FR 2159 - Scientific Earthquake Studies Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-14

    ... DEPARTMENT OF THE INTERIOR U.S. Geological Survey Scientific Earthquake Studies Advisory Committee... Scientific Earthquake Studies Advisory Committee (SESAC) will hold its next meeting at the U.S. Geological.... Meetings of the Scientific Earthquake Studies Advisory Committee are open to the public. DATES: January 26...

  19. 76 FR 61113 - Scientific Earthquake Studies Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-03

    ... DEPARTMENT OF THE INTERIOR U.S. Geological Survey Scientific Earthquake Studies Advisory Committee...-503, the Scientific Earthquake Studies Advisory Committee (SESAC) will hold its next meeting at the.... Meetings of the Scientific Earthquake Studies Advisory Committee are open to the public. DATES: November 2...

  20. 78 FR 19004 - Scientific Earthquake Studies Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-28

    ... DEPARTMENT OF THE INTERIOR U.S. Geological Survey [GX13GG009950000] Scientific Earthquake Studies... Law 106-503, the Scientific Earthquake Studies Advisory Committee (SESAC) will hold its next meeting... international activities. Meetings of the Scientific Earthquake Studies Advisory Committee are open to the...

  1. 77 FR 12323 - Scientific Earthquake Studies Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-29

    ... DEPARTMENT OF THE INTERIOR U.S. Geological Survey Scientific Earthquake Studies Advisory Committee...-503, the Scientific Earthquake Studies Advisory Committee (SESAC) will hold its next meeting at the.... Meetings of the Scientific Earthquake Studies Advisory Committee are open to the public. DATES: March 29...

  2. Rescaled earthquake recurrence time statistics: application to microrepeaters

    NASA Astrophysics Data System (ADS)

    Goltz, Christian; Turcotte, Donald L.; Abaimov, Sergey G.; Nadeau, Robert M.; Uchida, Naoki; Matsuzawa, Toru

    2009-01-01

    Slip on major faults primarily occurs during 'characteristic' earthquakes. The recurrence statistics of characteristic earthquakes play an important role in seismic hazard assessment. A major problem in determining applicable statistics is the short sequences of characteristic earthquakes that are available worldwide. In this paper, we introduce a rescaling technique in which sequences can be superimposed to establish larger numbers of data points. We consider the Weibull and log-normal distributions; in both cases we rescale the data using means and standard deviations. We test our approach utilizing sequences of microrepeaters, micro-earthquakes that recur in the same location on a fault. It seems plausible to regard these earthquakes as a miniature version of the classic characteristic earthquakes. Microrepeaters are much more frequent than major earthquakes, leading to longer sequences for analysis. In this paper, we present results for the analysis of recurrence times for several microrepeater sequences from Parkfield, CA, as well as NE Japan. We find that, once the respective sequence can be considered to be of sufficient stationarity, the statistics can be well fitted by either a Weibull or a log-normal distribution. We clearly demonstrate this fact by our technique of rescaled combination. We conclude that the recurrence statistics of the microrepeater sequences we consider are similar to the recurrence statistics of characteristic earthquakes on major faults.
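
    The rescaling-and-stacking step can be sketched as below; here each sequence is simply normalized by its own mean before a Weibull fit with scipy, which is a simplification of the paper's rescaling by both means and standard deviations, and the recurrence times are invented for illustration.

      import numpy as np
      from scipy import stats

      # Hypothetical recurrence-time sequences (years) from two repeater families.
      sequences = [np.array([1.9, 2.3, 2.1, 2.6, 1.8]),
                   np.array([0.8, 1.1, 0.9, 1.2, 1.0, 0.95])]

      # Rescale each sequence by its own mean, then stack into one sample.
      rescaled = np.concatenate([s / s.mean() for s in sequences])

      # Fit a two-parameter Weibull distribution (location pinned at zero).
      shape, loc, scale = stats.weibull_min.fit(rescaled, floc=0.0)
      print(round(shape, 2), round(scale, 2))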

  3. Earthquake response analysis of 11-story RC building that suffered damage in 2011 East Japan Earthquake

    NASA Astrophysics Data System (ADS)

    Shibata, Akenori; Masuno, Hidemasa

    2017-10-01

    An eleven-story RC apartment building suffered medium damage in the 2011 East Japan earthquake and was retrofitted for re-use. Strong motion records were obtained near the building. This paper discusses the inelastic earthquake response analysis of the building using an equivalent single-degree-of-freedom (1-DOF) system to account for the observed features of the damage. The method of converting the building frame into a 1-DOF system with tri-linear reducing-stiffness restoring force characteristics is given. The inelastic response analysis of the building against the earthquake using the equivalent inelastic 1-DOF system could account well for the level of actual damage.
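
    The time-stepping framework behind such an equivalent single-degree-of-freedom analysis can be illustrated with a linear-elastic SDOF oscillator integrated by the Newmark average-acceleration method; the paper's system is inelastic with a tri-linear, reducing-stiffness restoring force, so the sketch below (with an invented input motion) shows only the elastic skeleton of the computation.

      import numpy as np

      def newmark_sdof(ground_acc, dt, period_s=1.0, damping=0.05):
          # Linear-elastic SDOF (unit mass) under base acceleration, Newmark
          # average-acceleration scheme (beta = 1/4, gamma = 1/2).
          w = 2.0 * np.pi / period_s
          m, c, k = 1.0, 2.0 * damping * w, w ** 2
          beta, gamma = 0.25, 0.5
          keff = k + gamma / (beta * dt) * c + m / (beta * dt ** 2)
          u = v = a = 0.0
          disp = []
          for ag in ground_acc:
              p = (-m * ag
                   + m * (u / (beta * dt ** 2) + v / (beta * dt) + (1 / (2 * beta) - 1) * a)
                   + c * (gamma / (beta * dt) * u + (gamma / beta - 1) * v
                          + dt * (gamma / (2 * beta) - 1) * a))
              u_new = p / keff
              v_new = (gamma / (beta * dt) * (u_new - u)
                       + (1 - gamma / beta) * v + dt * (1 - gamma / (2 * beta)) * a)
              a_new = ((u_new - u) / (beta * dt ** 2) - v / (beta * dt)
                       - (1 / (2 * beta) - 1) * a)
              u, v, a = u_new, v_new, a_new
              disp.append(u)
          return np.array(disp)

      dt = 0.02
      ground_acc = 0.3 * 9.81 * np.sin(2.0 * np.pi * np.arange(0.0, 5.0, dt))  # synthetic input
      print(round(float(np.abs(newmark_sdof(ground_acc, dt)).max()), 4))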

  4. Magnitude and intensity: Measures of earthquake size and severity

    USGS Publications Warehouse

    Spall, Henry

    1982-01-01

    Earthquakes can be measured in terms of either the amount of energy they release (magnitude) or the degree of ground shaking they cause at a particular locality (intensity). Although magnitude and intensity are basically different measures of an earthquake, they are frequently confused by the public and by news reports of earthquakes. Part of the confusion probably arises from the general similarity of the scales used to express these quantities. The various magnitude scales represent logarithmic expressions of the energy released by an earthquake. Magnitude is calculated from the record made by an earthquake on a calibrated seismograph. There are no upper or lower limits to magnitude, although no measured earthquakes have exceeded magnitude 8.9.
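
    A worked example of the logarithmic scaling mentioned above, using the commonly quoted Gutenberg-Richter energy relation log10 E = 1.5 M + 4.8 (E in joules, M a surface-wave magnitude); the relation itself is an addition here, not part of the record.

      def radiated_energy_joules(magnitude):
          # Gutenberg-Richter energy relation: log10(E) = 1.5 * M + 4.8 (E in joules).
          return 10.0 ** (1.5 * magnitude + 4.8)

      # One magnitude unit corresponds to roughly a 31.6-fold increase in radiated energy.
      print(round(radiated_energy_joules(7.0) / radiated_energy_joules(6.0), 1))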

  5. Tilt precursors before earthquakes on the San Andreas fault, California

    USGS Publications Warehouse

    Johnston, M.J.S.; Mortensen, C.E.

    1974-01-01

    An array of 14 biaxial shallow-borehole tiltmeters (at 10⁻⁷ radian sensitivity) has been installed along 85 kilometers of the San Andreas fault during the past year. Earthquake-related changes in tilt have been simultaneously observed on up to four independent instruments. At earthquake distances greater than 10 earthquake source dimensions, there are few clear indications of tilt change. For the four instruments with the longest records (>10 months), 26 earthquakes have occurred since July 1973 with at least one instrument closer than 10 source dimensions and 8 earthquakes with more than one instrument within that distance. Precursors in tilt direction have been observed before more than 10 earthquakes or groups of earthquakes, and no similar effect has yet been seen without the occurrence of an earthquake.

  6. The Differences in Source Dynamics Between Intermediate-Depth and Deep EARTHQUAKES:A Comparative Study Between the 2014 Rat Islands Intermediate-Depth Earthquake and the 2015 Bonin Islands Deep Earthquake

    NASA Astrophysics Data System (ADS)

    Twardzik, C.; Ji, C.

    2015-12-01

    It has been proposed that the mechanisms for intermediate-depth and deep earthquakes might be different. While previous extensive seismological studies suggested that such potential differences do not significantly affect the scaling relationships of earthquake parameters, there have been only a few investigations of their dynamic characteristics, especially fracture energy. In this work, the 2014 Mw 7.9 Rat Islands intermediate-depth (105 km) earthquake and the 2015 Mw 7.8 Bonin Islands deep (680 km) earthquake are studied from two different perspectives. First, their kinematic rupture models are constrained using teleseismic body waves. Our analysis reveals that the Rat Islands earthquake broke the entire cold core of the subducting slab, defined as the depth of the 650°C isotherm. The inverted stress drop is 4 MPa, comparable to that of intra-plate earthquakes at shallow depths. On the other hand, the kinematic rupture model of the Bonin Islands earthquake, which occurred in a region lacking seismicity for the past forty years according to the GCMT catalog, exhibits an energetic rupture within a 35 km by 30 km slip patch and a high stress drop of 24 MPa. It is of interest to note that although complex rupture patterns are allowed to match the observations, the inverted slip distributions of these two earthquakes are simple enough to be approximated as the summation of a few circular/elliptical slip patches. Thus, we subsequently investigate their dynamic rupture models. We use a simple modelling approach in which we assume that the dynamic rupture propagation obeys a slip-weakening friction law, and we describe the distribution of stress and friction on the fault as a set of elliptical patches. We will constrain the three dynamic parameters, namely the yield stress, the background stress prior to rupture and the slip-weakening distance, as well as the shape of the elliptical patches, directly from teleseismic body-wave observations. The study would help us

  7. Earthquake in Hindu Kush Region, Afghanistan

    NASA Image and Video Library

    2015-10-27

    On Oct. 26, 2015, NASA's Terra spacecraft acquired this image of northeastern Afghanistan, where a magnitude 7.5 earthquake struck the Hindu Kush region. The earthquake occurred at a depth of 130 miles (210 kilometers), on a probable shallowly dipping thrust fault. At this location, the Indian subcontinent moves northward and collides with Eurasia, subducting under the Asian continent and raising the highest mountains in the world. This type of earthquake is common in the area: a similar earthquake occurred 13 years ago about 12 miles (20 kilometers) away. This perspective image from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument on NASA's Terra spacecraft, looking southwest, shows the hypocenter with a star. The image was acquired July 8, 2015, and is located near 36.4 degrees north, 70.7 degrees east. http://photojournal.jpl.nasa.gov/catalog/PIA20035

  8. Earthquake Facts

    MedlinePlus

    ... recordings of large earthquakes, scientists built large spring-pendulum seismometers in an attempt to record the long- ... are moving away from one another. The first “pendulum seismoscope” to measure the shaking of the ground ...

  9. Earthquakes Threaten Many American Schools

    ERIC Educational Resources Information Center

    Bailey, Nancy E.

    2010-01-01

    Millions of U.S. children attend schools that are not safe from earthquakes, even though they are in earthquake-prone zones. Several cities and states have worked to identify and repair unsafe buildings, but many others have done little or nothing to fix the problem. The reasons for ignoring the problem include political and financial ones, but…

  10. Earthquake precursors: activation or quiescence?

    NASA Astrophysics Data System (ADS)

    Rundle, John B.; Holliday, James R.; Yoder, Mark; Sachs, Michael K.; Donnellan, Andrea; Turcotte, Donald L.; Tiampo, Kristy F.; Klein, William; Kellogg, Louise H.

    2011-10-01

    We discuss the long-standing question of whether the probability for large earthquake occurrence (magnitudes m > 6.0) is highest during time periods of smaller event activation, or highest during time periods of smaller event quiescence. The physics of the activation model is based on an idea from the theory of nucleation, that a small magnitude earthquake has a finite probability of growing into a large earthquake. The physics of the quiescence model is based on the idea that the occurrence of smaller earthquakes (here considered as magnitudes m > 3.5) may be due to a mechanism such as critical slowing down, in which fluctuations in systems with long-range interactions tend to be suppressed prior to large nucleation events. To illuminate this question, we construct two end-member forecast models illustrating, respectively, activation and quiescence. The activation model assumes only that activation can occur, either via aftershock nucleation or triggering, but expresses no choice as to which mechanism is preferred. Both of these models are in fact a means of filtering the seismicity time-series to compute probabilities. Using 25 yr of data from the California-Nevada catalogue of earthquakes, we show that of the two models, activation and quiescence, the latter appears to be the better model, as judged by backtesting (by a slight but not significant margin). We then examine simulation data from a topologically realistic earthquake model for California seismicity, Virtual California. This model includes not only earthquakes produced from increases in stress on the fault system, but also background and off-fault seismicity produced by a BASS-ETAS driving mechanism. Applying the activation and quiescence forecast models to the simulated data, we come to the opposite conclusion. Here, the activation forecast model is preferred to the quiescence model, presumably due to the fact that the BASS component of the model is essentially a model for activated seismicity. These

  11. Earthquake Protection Measures for People with Disabilities

    NASA Astrophysics Data System (ADS)

    Gountromichou, C.; Kourou, A.; Kerpelis, P.

    2009-04-01

    The problem of seismic safety for people with disabilities not only exists but is also urgent and of primary importance. Working towards disability equality, the Earthquake Planning and Protection Organization of Greece (E.P.P.O.) has developed an educational scheme for people with disabilities in order to guide them to develop skills to protect themselves as well as to take the appropriate safety measures before, during and after an earthquake. The framework of this initiative includes a number of actions that have already been undertaken, including the following: a. Recently, the main guidelines have been published to help people who have physical, cognitive, visual, or auditory disabilities to cope with a destructive earthquake. Of great importance, in the case of people with disabilities, is to be prepared for the disaster, with several measures that must be taken starting today. In the pre-earthquake period, it is important that these people, in addition to other measures, do the following: - Create a Personal Support Network. The Personal Support Network should be a group of at least three trusted people who can assist the disabled person to prepare for a disastrous event and to recover after it. - Complete a Personal Assessment. The environment may change after a destructive earthquake. People with disabilities are encouraged to make a list of their personal needs and their resources for meeting them in a disaster environment. b. Lectures and training seminars on earthquake protection are given for students, teachers and educators in Special Schools for disabled people, mainly for informing and familiarizing them with earthquakes and with safety measures. c. Many earthquake drills have already taken place, for each disability, in order to share good practices and lessons learned to further disaster reduction and to identify gaps and challenges. The final aim of this action is for all people with disabilities to be well informed and motivated towards a culture of earthquake

  12. Strike-slip earthquakes can also be detected in the ionosphere

    NASA Astrophysics Data System (ADS)

    Astafyeva, Elvira; Rolland, Lucie M.; Sladen, Anthony

    2014-11-01

    It is generally assumed that co-seismic ionospheric disturbances are generated by large vertical static displacements of the ground during an earthquake. Consequently, it is expected that co-seismic ionospheric disturbances are only observable after earthquakes with a significant dip-slip component. Therefore, earthquakes dominated by strike-slip motion, i.e. with very little vertical co-seismic component, are not expected to generate ionospheric perturbations. In this work, we use total electron content (TEC) measurements from ground-based GNSS receivers to study the ionospheric response to the six largest recent strike-slip earthquakes: the Mw 7.8 Kunlun earthquake of 14 November 2001, the Mw 8.1 Macquarie earthquake of 23 December 2004, the Sumatra earthquake doublet, Mw 8.6 and Mw 8.2, of 11 April 2012, the Mw 7.7 Balochistan earthquake of 24 September 2013 and the Mw 7.7 Scotia Sea earthquake of 17 November 2013. We show that large strike-slip earthquakes generate large ionospheric perturbations with amplitudes comparable to those induced by dip-slip earthquakes of equivalent magnitude. We consider that in the absence of significant vertical static co-seismic displacements of the ground, other seismological parameters (primarily the magnitude of co-seismic horizontal displacements, the seismic fault dimensions and the seismic slip) may contribute to the generation of large-amplitude ionospheric perturbations.

  13. Do I Really Sound Like That? Communicating Earthquake Science Following Significant Earthquakes at the NEIC

    NASA Astrophysics Data System (ADS)

    Hayes, G. P.; Earle, P. S.; Benz, H.; Wald, D. J.; Yeck, W. L.

    2017-12-01

    The U.S. Geological Survey's National Earthquake Information Center (NEIC) responds to about 160 magnitude 6.0 and larger earthquakes every year and is regularly inundated with information requests following earthquakes that cause significant impact. These requests often start within minutes after the shaking occurs and come from a wide user base including the general public, media, emergency managers, and government officials. Over the past several years, the NEIC's earthquake response has evolved its communications strategy to meet the changing needs of users and the evolving media landscape. The NEIC produces a cascade of products starting with basic hypocentral parameters and culminating with estimates of fatalities and economic loss. We speed the delivery of content by prepositioning and automatically generating products such as aftershock plots, regional tectonic summaries, maps of historical seismicity, and event summary posters. Our goal is to have information immediately available so we can quickly address the response needs of a particular event or sequence. This information is distributed to hundreds of thousands of users through social media, email alerts, programmatic data feeds, and webpages. Many of our products are included in event summary posters that can be downloaded and printed for local display. After significant earthquakes, keeping up with direct inquiries and interview requests from TV, radio, and print reporters is always challenging. The NEIC works with the USGS Office of Communications and the USGS Science Information Services to organize and respond to these requests. Written executive summary reports are produced and distributed to USGS personnel and collaborators throughout the country. These reports are updated during the response to keep our message consistent and the information up to date. This presentation will focus on communications during NEIC's rapid earthquake response but will also touch on the broader USGS traditional and

  14. Earthquakes in South Carolina and Vicinity 1698-2009

    USGS Publications Warehouse

    Dart, Richard L.; Talwani, Pradeep; Stevenson, Donald

    2010-01-01

    This map summarizes more than 300 years of South Carolina earthquake history. It is one in a series of three similar State earthquake history maps. The current map and the previous two for Virginia and Ohio are accessible at http://pubs.usgs.gov/of/2006/1017/ and http://pubs.usgs.gov/of/2008/1221/. All three State earthquake maps were collaborative efforts between the U.S. Geological Survey and respective State agencies. Work on the South Carolina map was done in collaboration with the Department of Geological Sciences, University of South Carolina. As with the two previous maps, the history of South Carolina earthquakes was derived from letters, journals, diaries, newspaper accounts, academic journal articles, and, beginning in the early 20th century, instrumental recordings (seismograms). All historical (preinstrumental) earthquakes that were large enough to be felt have been located based on felt reports. Some of these events caused damage to buildings and their contents. The more recent widespread use of seismographs has allowed many smaller earthquakes, previously undetected, to be recorded and accurately located. The seismicity map shows historically located and instrumentally recorded earthquakes in and near South Carolina

  15. Weather Satellite Thermal IR Responses Prior to Earthquakes

    NASA Technical Reports Server (NTRS)

    OConnor, Daniel P.

    2005-01-01

    A number of observers claim to have seen thermal anomalies prior to earthquakes, but subsequent analysis by others has failed to produce similar findings. What exactly are these anomalies? Might they be useful for earthquake prediction? The purpose of this study is to determine whether thermal anomalies can be found in association with known earthquakes by systematically co-registering weather satellite images at the sub-pixel level and then determining whether statistically significant responses occurred prior to the earthquake event. A new set of automatic co-registration procedures was developed for this task to accommodate all properties particular to weather satellite observations taken at night, and it relies on the general condition that the ground cools after sunset. Using these procedures, we can produce a set of temperature-sensitive satellite images for each of five selected earthquakes (Algeria 2003; Bhuj, India 2001; Izmit, Turkey 1999; Kunlun Shan, Tibet 2001; Turkmenistan 2000) and thus more effectively investigate heating trends close to the epicenters a few hours prior to the earthquake events. This study will lay the groundwork for further work in earthquake prediction and provoke the question of the exact nature of the thermal anomalies.

  16. Earthquake hazard assessment after Mexico (1985).

    PubMed

    Degg, M R

    1989-09-01

    The 1985 Mexican earthquake ranks foremost amongst the major earthquake disasters of the twentieth century. One of the few positive aspects of the disaster is that it provided massive quantities of data that would otherwise have been unobtainable. Every opportunity should be taken to incorporate the findings from these data in earthquake hazard assessments. The purpose of this paper is to provide a succinct summary of some of the more important lessons from Mexico. It stems from detailed field investigations, and subsequent analyses, conducted by the author on behalf of reinsurance companies.

  17. SITE AMPLIFICATION OF EARTHQUAKE GROUND MOTION.

    USGS Publications Warehouse

    Hays, Walter W.

    1986-01-01

    When the patterns of damage in an earthquake are analyzed, physical parameters of the total earthquake-site-structure system are correlated with the damage. Soil-structure interaction, the cause of damage in many earthquakes, involves the frequency-dependent response of both the soil-rock column and the structure. The response of the soil-rock column (called site amplification) is controversial because soil has strain-dependent properties that affect how the soil column filters the input body and surface seismic waves, modifying the amplitude and phase spectra and the duration of the surface ground motion.
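
    The frequency-dependent filtering attributed here to the soil column can be pictured with the textbook transfer function for a single damped soil layer over an elastic half-space (vertically incident SH waves). The sketch below is not from this paper; the layer thickness, velocities, densities, and damping ratio are illustrative assumptions.

      # Minimal sketch: amplification of a uniform damped soil layer over elastic
      # rock (vertically incident SH waves), |surface / rock outcrop| vs. frequency.
      # All parameter values are illustrative, not site-specific.
      import numpy as np

      def layer_amplification(freq_hz, h=30.0, vs_soil=200.0, rho_soil=1800.0,
                              vs_rock=1500.0, rho_rock=2400.0, damping=0.05):
          omega = 2.0 * np.pi * np.asarray(freq_hz, dtype=float)
          vs_star = vs_soil * (1.0 + 1j * damping)        # damped (complex) shear velocity
          impedance_ratio = (rho_soil * vs_star) / (rho_rock * vs_rock)
          kh = omega * h / vs_star                        # complex wavenumber times thickness
          return 1.0 / np.abs(np.cos(kh) + 1j * impedance_ratio * np.sin(kh))

      freqs = np.linspace(0.1, 10.0, 1000)
      amp = layer_amplification(freqs)
      print("predicted fundamental frequency ~ Vs/(4H) =", 200.0 / (4 * 30.0), "Hz")
      print("peak amplification %.1f at %.2f Hz" % (amp.max(), freqs[amp.argmax()]))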

  18. Geophysical advances triggered by 1964 Great Alaska Earthquake

    USGS Publications Warehouse

    Haeussler, Peter J.; Leith, William S.; Wald, David J.; Filson, John R.; Wolfe, Cecily; Applegate, David

    2014-01-01

    A little more than 50 years ago, on 27 March 1964, the Great Alaska earthquake and tsunami struck. At moment magnitude 9.2, this earthquake is notable as the largest in U.S. written history and as the second-largest ever recorded by instruments worldwide. But what resonates today are its impacts on the understanding of plate tectonics, tsunami generation, and earthquake history as well as on the development of national programs to reduce risk from earthquakes and tsunamis.

  19. Toward standardization of slow earthquake catalog -Development of database website-

    NASA Astrophysics Data System (ADS)

    Kano, M.; Aso, N.; Annoura, S.; Arai, R.; Ito, Y.; Kamaya, N.; Maury, J.; Nakamura, M.; Nishimura, T.; Obana, K.; Sugioka, H.; Takagi, R.; Takahashi, T.; Takeo, A.; Yamashita, Y.; Matsuzawa, T.; Ide, S.; Obara, K.

    2017-12-01

    Slow earthquakes have now been widely discovered around the world thanks to recent developments in geodetic and seismic observation. Many researchers detect a wide frequency range of slow earthquakes, including low-frequency tremors, low-frequency earthquakes, very-low-frequency earthquakes, and slow slip events, using various methods. Catalogs of the detected slow earthquakes are available in different formats, either from the individual papers that report them or through websites (e.g., Wech 2010; Idehara et al. 2014). However, we need to download catalogs from different sources, deal with unformatted catalogs, and understand the characteristics of the different catalogs, which may be somewhat complex, especially for those who are not familiar with slow earthquakes. In order to standardize slow earthquake catalogs and make such complicated work easier, the Scientific Research on Innovative Areas project "Science of Slow Earthquakes" has been developing a slow earthquake catalog website. On the website, we can plot the locations of various slow earthquakes on Google Maps by compiling a variety of slow earthquake catalogs, including slow slip events. This allows us to visualize spatial relations among slow earthquakes at a glance and to compare the regional activities of slow earthquakes or the locations listed in different catalogs. In addition, we can download catalogs in a unified format and refer to the information on each catalog on a single website. Such standardization will make it more convenient for users to build on previous achievements and will promote research on slow earthquakes, eventually leading to collaborations with researchers in various fields and to further understanding of the mechanisms, environmental conditions, and underlying physics of slow earthquakes. Furthermore, we expect the website to play a leading role in the international standardization of slow earthquake catalogs. We report an overview of the website and the progress of its construction. Acknowledgment: This
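
    The central task described here, compiling catalogs that arrive in different formats into one unified format, can be pictured with a small normalization step. The sketch below is hypothetical: the file contents, column names, unified schema, and catalog labels are assumptions for illustration and are not the project's actual formats.

      # Hypothetical sketch: mapping heterogeneous slow-earthquake catalogs onto a
      # single unified record format. Column names and schema are assumptions.
      import pandas as pd

      UNIFIED_COLUMNS = ["time", "latitude", "longitude", "depth_km",
                         "event_type", "catalog"]

      def normalize(df, column_map, event_type, catalog_name):
          """Rename catalog-specific columns to the unified schema and tag the source."""
          out = df.rename(columns=column_map)[list(column_map.values())].copy()
          out["time"] = pd.to_datetime(out["time"], utc=True)
          out["event_type"] = event_type      # e.g. "tremor", "SSE", "VLFE"
          out["catalog"] = catalog_name
          return out[UNIFIED_COLUMNS]

      # Toy inputs with different headers, standing in for downloaded catalogs.
      tremor = pd.DataFrame({"origin_time": ["2017-04-01T00:12:00"],
                             "lat": [33.1], "lon": [132.4], "dep": [30.0]})
      sse = pd.DataFrame({"start": ["2017-03-20"], "latitude": [33.4],
                          "longitude": [132.9], "depth": [28.0]})

      unified = pd.concat([
          normalize(tremor, {"origin_time": "time", "lat": "latitude",
                             "lon": "longitude", "dep": "depth_km"},
                    "tremor", "catalog_A"),
          normalize(sse, {"start": "time", "latitude": "latitude",
                          "longitude": "longitude", "depth": "depth_km"},
                    "SSE", "catalog_B"),
      ], ignore_index=True)
      print(unified)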

  20. Analysis of the Seismicity Preceding Large Earthquakes

    NASA Astrophysics Data System (ADS)

    Stallone, A.; Marzocchi, W.

    2016-12-01

    The most common earthquake forecasting models assume that the magnitude of the next earthquake is independent of the past. This feature is probably one of the most severe limitations on the capability to forecast large earthquakes. In this work, we investigate this specific aspect empirically, exploring whether spatial-temporal variations in seismicity encode some information on the magnitude of future earthquakes. For this purpose, and to verify the universality of the findings, we consider seismic catalogs covering quite different space-time-magnitude windows, such as the Alto Tiberina Near Fault Observatory (TABOO) catalogue and the California and Japanese seismic catalogs. Our method is inspired by the statistical methodology proposed by Zaliapin (2013) to distinguish triggered from background earthquakes, using nearest-neighbor clustering analysis in a two-dimensional plane defined by rescaled time and space. In particular, we generalize the nearest-neighbor metric to a k-nearest-neighbors metric, which allows us to consider the overall space-time-magnitude distribution of the k earthquakes (k foreshocks) that precede one target event (the mainshock); we then analyze the statistical properties of the clusters identified in this rescaled space. In essence, the main goal of this study is to verify whether different classes of mainshock magnitudes are characterized by distinctive k-foreshock distributions. The final step is to show how the findings of this work may (or may not) improve the skill of existing earthquake forecasting models.
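
    The rescaled time-space analysis this abstract builds on can be illustrated with a Zaliapin-style nearest-neighbor proximity; the k-nearest-neighbors generalization described above would keep the k smallest proximities per target event rather than only the minimum. The sketch below is not the authors' code; the parameter values (b-value, fractal dimension, time-space weighting) and the toy catalog are assumptions.

      # Minimal sketch of a Zaliapin-style nearest-neighbor proximity:
      # for each event, find the earlier event minimizing
      # eta = dt * dr**d_f * 10**(-b * m_parent), and split eta into rescaled
      # time (T) and rescaled distance (R). Parameters are illustrative.
      import numpy as np

      def nearest_neighbor_proximity(times, x_km, y_km, mags, b=1.0, d_f=1.6, q=0.5):
          """Return parent index, rescaled time T, and rescaled distance R per event."""
          n = len(times)
          parent = np.full(n, -1)
          T = np.full(n, np.inf)
          R = np.full(n, np.inf)
          for j in range(1, n):
              dt = times[j] - times[:j]                   # inter-event times (days)
              dr = np.hypot(x_km[j] - x_km[:j], y_km[j] - y_km[:j])
              dr = np.maximum(dr, 0.1)                    # floor to avoid zero distance
              eta = dt * dr ** d_f * 10.0 ** (-b * mags[:j])
              i = int(np.argmin(eta))
              parent[j] = i
              T[j] = dt[i] * 10.0 ** (-q * b * mags[i])
              R[j] = dr[i] ** d_f * 10.0 ** (-(1.0 - q) * b * mags[i])
          return parent, T, R

      # Toy catalog: times in days, epicentral coordinates in km, magnitudes.
      rng = np.random.default_rng(1)
      t = np.sort(rng.uniform(0, 365, 200))
      x, y = rng.uniform(0, 100, 200), rng.uniform(0, 100, 200)
      m = rng.exponential(1.0 / np.log(10), 200) + 2.0    # Gutenberg-Richter-like
      parent, T, R = nearest_neighbor_proximity(t, x, y, m)
      print("median log10(T * R):", np.median(np.log10(T[1:] * R[1:])))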