Science.gov

Sample records for operating basis earthquake

  1. Prototype operational earthquake prediction system

    USGS Publications Warehouse

    Spall, Henry

    1986-01-01

    An objective of the U.S. Earthquake Hazards Reduction Act of 1977 is to introduce systems for predicting earthquakes and assessing earthquake risk into all regions of the country that are subject to large and moderate earthquakes. In 1985, the USGS developed for the Secretary of the Interior a program for implementation of a prototype operational earthquake prediction system in southern California.

  2. Operational earthquake forecasting can enhance earthquake preparedness

    USGS Publications Warehouse

    Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C.

    2014-01-01

    We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time‐dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground‐motion exceedance probabilities as well as short‐term rupture probabilities—in concert with the long‐term forecasts of probabilistic seismic‐hazard analysis (PSHA).

  3. [Autism after an earthquake: the experience of L'Aquila (Central Italy) as a basis for an operative guideline].

    PubMed

    Valenti, Marco; Di Giovanni, Chiara; Mariano, Melania; Pino, Maria Chiara; Sconci, Vittorio; Mazza, Monica

    2016-01-01

    People with autism, their families, and their specialised caregivers are a social group at high health risk after a disruptive earthquake. They need emergency assistance and immediate structured support according to definite protocols and quality standards. We recommend establishing national guidelines for the care of people with autism after an earthquake. The adaptive behaviour of participants with autism declined dramatically in the first months after the earthquake in all the dimensions examined (i.e., communication, daily living, socialisation, and motor skills). After relatively stable conditions returned, and with immediate and intensive post-disaster intervention, children and adolescents with autism showed a trend towards partial recovery of adaptive functioning. As to the impact on services, this study indicates the need for supporting exposed caregivers at high risk of burnout over the first two years after the disaster and for an immediate reorganisation of person-tailored services.

  4. The potential uses of operational earthquake forecasting

    USGS Publications Warehouse

    Field, Ned; Jordan, Thomas; Jones, Lucille; Michael, Andrew; Blanpied, Michael L.

    2016-01-01

    This article reports on a workshop held to explore the potential uses of operational earthquake forecasting (OEF). We discuss the current status of OEF in the United States and elsewhere, the types of products that could be generated, the various potential users and uses of OEF, and the need for carefully crafted communication protocols. Although operationalization challenges remain, there was clear consensus among the stakeholders at the workshop that OEF could be useful.

  5. Operational Earthquake Forecasting of Aftershocks for New England

    NASA Astrophysics Data System (ADS)

    Ebel, J.; Fadugba, O. I.

    2015-12-01

    Although the forecasting of mainshocks is not possible, recent research demonstrates that probabilistic forecasts of expected aftershock activity following moderate and strong earthquakes are possible. Previous work has shown that aftershock sequences in intraplate regions behave similarly to those in California, and thus the operational aftershock forecasting methods currently employed in California can be adopted for use in areas of the eastern U.S. such as New England. In our application, immediately after a felt earthquake in New England, a forecast of expected aftershock activity for the next 7 days will be generated based on a generic aftershock activity model. Approximately 24 hours after the mainshock, the parameters of the aftershock model will be updated using the aftershock activity observed to that point in time, and a new forecast of expected aftershock activity for the next 7 days will be issued. The forecast will estimate the average number of weak, felt aftershocks and the average expected number of aftershocks based on the aftershock statistics of past New England earthquakes. The forecast also will estimate the probability that an earthquake stronger than the mainshock will take place during the next 7 days. The aftershock forecast will specify the expected aftershock locations as well as the areas over which aftershocks of different magnitudes could be felt. The system will use web pages, email, and text messages to distribute the aftershock forecasts. For protracted aftershock sequences, new forecasts will be issued on a regular basis, such as weekly. Initially, the distribution system of the aftershock forecasts will be limited, but it will be expanded as experience with and confidence in the system grows.
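
    The generic-model step can be made concrete with a small sketch. Below is a minimal illustration of a 7-day aftershock forecast, assuming a Reasenberg-Jones (1989) style rate lambda(t, M) = 10^(a + b(Mm - M)) / (t + c)^p with generic California parameter values; the calibrated New England parameters would come from the regional aftershock statistics the abstract describes.

      import math

      # Generic California values from Reasenberg & Jones (1989); illustrative only,
      # not the calibrated New England parameters.
      A, B, C, P = -1.67, 0.91, 0.05, 1.08

      def expected_aftershocks(m_main, m_min, days):
          """Expected number of aftershocks with M >= m_min within `days` of the
          mainshock, from the rate 10**(A + B*(m_main - M)) / (t + C)**P."""
          rate_factor = 10 ** (A + B * (m_main - m_min))
          # Closed-form integral of (t + C)**-P from 0 to `days` (valid for P != 1).
          integral = ((days + C) ** (1 - P) - C ** (1 - P)) / (1 - P)
          return rate_factor * integral

      def prob_at_least_one(m_main, m_min, days):
          """Poisson probability of at least one aftershock with M >= m_min."""
          return 1 - math.exp(-expected_aftershocks(m_main, m_min, days))

      # 7-day forecast after a magnitude 4.5 mainshock:
      print(expected_aftershocks(4.5, 2.5, 7.0))   # expected felt (M>=2.5) aftershocks
      print(prob_at_least_one(4.5, 4.5, 7.0))      # chance of an event at least as large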

  6. Linking earthquakes and hydraulic fracturing operations

    NASA Astrophysics Data System (ADS)

    Balcerak, Ernie

    2013-01-01

    Hydraulic fracturing, also known as fracking, to extract oil and gas from rock, has been a controversial but increasingly common practice; some studies have linked it to groundwater contamination and induced earthquakes. Scientists discussed several studies on the connection between fracking and earthquakes at the AGU Fall Meeting in San Francisco in December.

  7. The Bender-Dunne basis operators as Hilbert space operators

    SciTech Connect

    Bunao, Joseph; Galapon, Eric A. E-mail: eric.galapon@upd.edu.ph

    2014-02-15

    The Bender-Dunne basis operators, $T_{-m,n} = 2^{-n}\sum_{k=0}^{n} \binom{n}{k}\, q^{k} p^{-m} q^{n-k}$, where $q$ and $p$ are the position and momentum operators, respectively, are formal integral operators in position representation on the entire real line $\mathbb{R}$ for positive integers $n$ and $m$. We show, by explicit construction of a dense domain, that the operators $T_{-m,n}$ are densely defined operators in the Hilbert space $L^{2}(\mathbb{R})$.

  8. Ground motion following selection of SRS design basis earthquake and associated deterministic approach

    SciTech Connect

    Not Available

    1991-03-01

    This report summarizes the results of a deterministic assessment of earthquake ground motions at the Savannah River Site (SRS). The purpose of this study is to assist the Environmental Sciences Section of the Savannah River Laboratory in reevaluating the design basis earthquake (DBE) ground motion at SRS using approaches defined in Appendix A to 10 CFR Part 100. This work is in support of the Seismic Engineering Section's Seismic Qualification Program for reactor restart.

  9. Development of Probabilistic Design Basis Earthquake (DBE) Parameters for Moderate and High Hazard Facilities at INEEL

    SciTech Connect

    S. M. Payne; V. W. Gorman; S. A. Jensen; M. E. Nitzel; M. J. Russell; R. P. Smith

    2000-03-01

    Design Basis Earthquake (DBE) horizontal and vertical response spectra are developed for moderate and high hazard facilities or Performance Categories (PC) 3 and 4, respectively, at the Idaho National Engineering and Environmental Laboratory (INEEL). The probabilistic DBE response spectra will replace the deterministic DBE response spectra currently in the U.S. Department of Energy Idaho Operations Office (DOE-ID) Architectural Engineering Standards that govern seismic design criteria for several facility areas at the INEEL. Probabilistic DBE response spectra are recommended to DOE Naval Reactors for use at the Naval Reactor Facility at INEEL. The site-specific Uniform Hazard Spectra (UHS) developed by URS Greiner Woodward Clyde Federal Services are used as the basis for developing the DBE response spectra. In 1999, the UHS for all INEEL facility areas were recomputed using more appropriate attenuation relationships for the Basin and Range province. The revised UHS have lower ground motions than those produced in the 1996 INEEL site-wide probabilistic ground motion study. The DBE response spectra were developed by incorporating smoothed broadened regions of the peak accelerations, velocities, and displacements defined by the site-specific UHS. Portions of the DBE response spectra were adjusted to ensure conservatism for the structural design process.
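
    The peak-broadening step lends itself to a short sketch. The function below envelopes a spectrum after widening each spectral peak by +/-15% in period, a common design-spectrum practice; the broadening factor and the sample ordinates are illustrative assumptions, not values from the INEEL study.

      import numpy as np

      def broaden_spectrum(periods, sa, widen=0.15):
          """Envelope a response spectrum with each ordinate extended over a
          +/-15% period window, so narrow UHS peaks become flat plateaus."""
          out = np.array(sa, dtype=float)
          for T, s in zip(periods, sa):
              mask = (periods >= T * (1 - widen)) & (periods <= T * (1 + widen))
              out[mask] = np.maximum(out[mask], s)  # keep the envelope
          return out

      periods = np.array([0.05, 0.1, 0.2, 0.3, 0.5, 1.0, 2.0])        # s
      uhs_sa  = np.array([0.30, 0.55, 0.80, 0.60, 0.40, 0.20, 0.08])  # g, 5% damped
      print(broaden_spectrum(periods, uhs_sa))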

  10. Minimization of Basis Risk in Parametric Earthquake Cat Bonds

    NASA Astrophysics Data System (ADS)

    Franco, G.

    2009-12-01

    A catastrophe (cat) bond is an instrument used by insurance and reinsurance companies, by governments, or by groups of nations to cede catastrophic risk to the financial markets, which are capable of supplying cover for highly destructive events that surpass the typical capacity of traditional reinsurance contracts. Parametric cat bonds, a specific type of cat bond, use trigger mechanisms or indices that depend on physical event parameters published by respected third parties in order to determine whether part or all of the bond principal is to be paid for a certain event. First-generation cat bonds, or cat-in-a-box bonds, display a trigger mechanism that consists of a set of geographic zones in which certain conditions need to be met by an earthquake's magnitude and depth in order to trigger payment of the bond principal. Second-generation cat bonds use an index formulation that typically consists of a sum of products of a set of weights by a polynomial function of the ground motion variables reported by a geographically distributed seismic network. These instruments are especially appealing to developing countries with incipient insurance industries wishing to cede catastrophic losses to the financial markets, because the payment trigger mechanism is transparent and does not involve the parties ceding or accepting the risk, significantly reducing moral hazard. In order to be successful in the market, however, parametric cat bonds have typically been required to specify relatively simple trigger conditions. The consequence of such simplifications is an increase in basis risk. This risk represents the possibility that the trigger mechanism fails to accurately capture the actual losses of a catastrophic event, namely that it does not trigger for a highly destructive event or, vice versa, that a payment of the bond principal is caused by an event that produced insignificant losses. The first case disfavors the sponsor who was seeking cover for its losses while the
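
    The second-generation index mechanism described above reduces to a small calculation. In the sketch below, the station weights, polynomial power, and attachment/exhaustion points are all hypothetical; a real bond fixes these terms in its offering documents.

      def payout_fraction(pga, weights, attach=1.0, exhaust=2.0, power=2.0):
          """Second-generation parametric index: a weighted sum of a polynomial
          of reported ground motions. The index pays nothing below `attach`,
          the full principal above `exhaust`, and linearly in between."""
          index = sum(w * pga[s] ** power for s, w in weights.items())
          if index <= attach:
              return 0.0
          return min((index - attach) / (exhaust - attach), 1.0)

      # Hypothetical peak ground accelerations (g) reported by three stations:
      pga = {"STA1": 0.42, "STA2": 0.31, "STA3": 0.55}
      w   = {"STA1": 3.0,  "STA2": 2.0,  "STA3": 4.0}
      print(payout_fraction(pga, w))  # fraction of the principal triggered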

  11. Operational earthquake forecasting in the South Iceland Seismic Zone: improving the earthquake catalogue

    NASA Astrophysics Data System (ADS)

    Panzera, Francesco; Vogfjörd, Kristin; Zechar, J. Douglas; Eberhard, David

    2014-05-01

    A major earthquake sequence is ongoing in the South Iceland Seismic Zone (SISZ), where experts expect earthquakes of up to MW = 7.1 in the coming years to decades. The historical seismicity in this region is well known, and many major faults here and on the Reykjanes Peninsula (RP) have already been mapped. The faults are predominantly N-S with right-lateral strike-slip motion, while the overall motion in the SISZ is E-W oriented left-lateral motion. The area that we propose for operational earthquake forecasting (OEF) contains both the SISZ and the RP. The earthquake catalogue considered for OEF, called the SIL catalogue, spans the period from 1991 until September 2013 and contains more than 200,000 earthquakes. Some of these events have a large azimuthal gap between stations, and some have large horizontal and vertical uncertainties. We are interested in building seismicity models using high-quality data, so we filter the catalogue using the criteria proposed by Gomberg et al. (1990) and Bondar et al. (2004). The resulting filtered catalogue contains around 130,000 earthquakes. Magnitude estimates in the Iceland catalogue also require special attention. The SIL system uses two methods to estimate magnitude. The first method is based on an empirical local magnitude (ML) relationship. The other magnitude scale is a so-called "local moment magnitude" (MLW), originally constructed by Slunga et al. (1984) to agree with local magnitude scales in Sweden. In the SIL catalogue, there are two main problems with the magnitude estimates, and consequently it is not immediately possible to convert MLW to moment magnitude (MW). These problems are: (i) immediate aftershocks of large events are assigned magnitudes that are too high; and (ii) the seismic moment of large earthquakes is underestimated. For this reason, the magnitude values in the catalogue must be corrected before developing an OEF system. To obtain a reliable MW estimate, we calibrate a magnitude relationship based on
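
    The calibration step the abstract leads into can be shown schematically: fit a linear conversion from catalogue MLW to independently determined MW values, then apply it to the filtered catalogue. The paired magnitudes below are invented for illustration, not SIL catalogue values.

      import numpy as np

      # Events with both a catalogue MLW and an independent MW estimate (invented):
      mlw = np.array([3.2, 3.8, 4.1, 4.6, 5.0, 5.5, 6.0])
      mw  = np.array([3.0, 3.5, 3.9, 4.5, 5.0, 5.6, 6.3])

      # Least-squares fit of MW = alpha * MLW + beta.
      alpha, beta = np.polyfit(mlw, mw, 1)

      def mlw_to_mw(m):
          """Convert a catalogue MLW value to moment magnitude."""
          return alpha * m + beta

      print(round(mlw_to_mw(5.8), 2))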

  12. Operational Earthquake Forecasting: Proposed Guidelines for Implementation (Invited)

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2010-12-01

    The goal of operational earthquake forecasting (OEF) is to provide the public with authoritative information about how seismic hazards are changing with time. During periods of high seismic activity, short-term earthquake forecasts based on empirical statistical models can attain nominal probability gains in excess of 100 relative to the long-term forecasts used in probabilistic seismic hazard analysis (PSHA). Prospective experiments are underway by the Collaboratory for the Study of Earthquake Predictability (CSEP) to evaluate the reliability and skill of these seismicity-based forecasts in a variety of tectonic environments. How such information should be used for civil protection is by no means clear, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing formal procedures for OEF in this sort of “low-probability environment.” Nevertheless, the need to move more quickly towards OEF has been underscored by recent experiences, such as the 2009 L’Aquila earthquake sequence and other seismic crises in which an anxious public has been confused by informal, inconsistent earthquake forecasts. Whether scientists like it or not, rising public expectations for real-time information, accelerated by the use of social media, will require civil protection agencies to develop sources of authoritative information about the short-term earthquake probabilities. In this presentation, I will discuss guidelines for the implementation of OEF informed by my experience on the California Earthquake Prediction Evaluation Council, convened by CalEMA, and the International Commission on Earthquake Forecasting, convened by the Italian government following the L’Aquila disaster. (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and
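
    The arithmetic of this "low-probability environment" is worth making explicit: a hundredfold gain applied to a small long-term rate still yields a small short-term probability. A minimal illustration with invented numbers:

      import math

      def short_term_probability(annual_p, gain, days=7.0):
          """Convert an annual probability to a weekly rate, apply a nominal
          probability gain, and return the short-term probability."""
          weekly_rate = -math.log(1.0 - annual_p) * days / 365.25
          return 1.0 - math.exp(-gain * weekly_rate)

      # A fault with a 1%-per-year long-term probability and a gain of 100:
      print(short_term_probability(0.01, 100.0))  # ~0.019: still only ~2% per week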

  13. The Establishment of an Operational Earthquake Forecasting System in Italy

    NASA Astrophysics Data System (ADS)

    Marzocchi, Warner; Lombardi, Anna Maria; Casarotti, Emanuele

    2014-05-01

    Just after the Mw 6.2 earthquake that hit L'Aquila on April 6, 2009, the Civil Protection nominated an International Commission on Earthquake Forecasting (ICEF) that paved the way for the development of Operational Earthquake Forecasting (OEF), defined as the "procedures for gathering and disseminating authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes". In this paper we introduce the first official OEF system in Italy, developed by the newly founded Centro di Pericolosità Sismica at the Istituto Nazionale di Geofisica e Vulcanologia. The system provides a daily update of the weekly probabilities of ground shaking over the whole Italian territory. In this presentation, we describe in detail the philosophy behind the system, the scientific details, and the output format that has been preliminarily defined in agreement with Civil Protection. To our knowledge, this is the first operational system that fully satisfies the ICEF guidelines. Probably the most sensitive issue is the communication of such a message to the population. Acknowledging this inherent difficulty, in agreement with Civil Protection we are planning pilot tests to be carried out in a few selected areas in Italy; the purpose of these tests is to check the effectiveness of the message and to receive feedback.

  14. FB Line Basis for Interim Operation

    SciTech Connect

    Shedrow, B.

    1998-10-19

    The safety analysis of the FB-Line Facility indicates that the operation of FB-Line to support the current mission does not present undue risk to the facility and co-located workers, general public, or the environment.

  15. Design basis for the NRC Operations Center

    SciTech Connect

    Lindell, M.K.; Wise, J.A.; Griffin, B.N.; Desrosiers, A.E.; Meitzler, W.D.

    1983-05-01

    This report documents the development of a design for a new NRC Operations Center (NRCOC). The project was conducted in two phases: organizational analysis and facility design. In order to control the amount of traffic, congestion and noise within the facility, it is recommended that information flow in the new NRCOC be accomplished by means of an electronic Status Information Management System. Functional requirements and a conceptual design for this system are described. An idealized architectural design and a detailed design program are presented that provide the appropriate amount of space for operations, equipment and circulation within team areas. The overall layout provides controlled access to the facility and, through the use of a zoning concept, provides each team within the NRCOC the appropriate balance of ready access and privacy determined from the organizational analyses conducted during the initial phase of the project.

  16. Geological and seismological survey for new design-basis earthquake ground motion of Kashiwazaki-Kariwa NPS

    NASA Astrophysics Data System (ADS)

    Takao, M.; Mizutani, H.

    2009-05-01

    At about 10:13 on July 16, 2007, a strong earthquake named the 'Niigata-ken Chuetsu-oki Earthquake', of Mj 6.8 on the Japan Meteorological Agency's scale, occurred offshore of Niigata prefecture in Japan. However, all of the nuclear reactors at Kashiwazaki-Kariwa Nuclear Power Station (KKNPS) in Niigata prefecture, operated by Tokyo Electric Power Company, shut down safely. In other words, the automatic safety functions of shutdown, cooling, and containment worked as designed immediately after the earthquake. During the earthquake, the peak acceleration of the ground motion exceeded the design-basis ground motion (DBGM), but the force due to the earthquake applied to safety-significant facilities was about the same as or less than the design basis taken into account as static seismic force. In order to assess anew the safety of nuclear power plants, we have evaluated a new DBGM after conducting geomorphological, geological, geophysical, and seismological surveys and analyses. [Geomorphological, geological and geophysical survey] In the land area, aerial photograph interpretation was performed within at least a 30 km radius to extract landforms that could possibly be tectonic reliefs. Geological reconnaissance was then conducted to confirm whether the extracted landforms are tectonic reliefs or not. We investigated especially carefully the Nagaoka Plain Western Boundary Fault Zone (NPWBFZ), which consists of the Kakuda-Yahiko fault, Kihinomiya fault, and Katakai fault, because the NPWBFZ is one of the active faults in Japan with Mj 8 class potential. In addition to the geological survey, seismic reflection prospecting of approximately 120 km in total length was completed to evaluate the geological structure of the faults and to assess the continuity of the component faults of the NPWBFZ. As a result of the geomorphological, geological, and geophysical surveys, we evaluated that the three component faults of the NPWBFZ are independent of each other from the

  17. Lessons of L'Aquila for Operational Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2012-12-01

    The L'Aquila earthquake of 6 Apr 2009 (magnitude 6.3) killed 309 people and left tens of thousands homeless. The mainshock was preceded by a vigorous seismic sequence that prompted informal earthquake predictions and evacuations. In an attempt to calm the population, the Italian Department of Civil Protection (DPC) convened its Commission on the Forecasting and Prevention of Major Risk (MRC) in L'Aquila on 31 March 2009 and issued statements about the hazard that were widely received as an "anti-alarm"; i.e., a deterministic prediction that there would not be a major earthquake. On October 23, 2012, a court in L'Aquila convicted the vice-director of DPC and six scientists and engineers who attended the MRC meeting on charges of criminal manslaughter, and it sentenced each to six years in prison. A few weeks after the L'Aquila disaster, the Italian government convened an International Commission on Earthquake Forecasting for Civil Protection (ICEF) with the mandate to assess the status of short-term forecasting methods and to recommend how they should be used in civil protection. The ICEF, which I chaired, issued its findings and recommendations on 2 Oct 2009 and published its final report, "Operational Earthquake Forecasting: Status of Knowledge and Guidelines for Implementation," in Aug 2011 (www.annalsofgeophysics.eu/index.php/annals/article/view/5350). As defined by the Commission, operational earthquake forecasting (OEF) involves two key activities: the continual updating of authoritative information about the future occurrence of potentially damaging earthquakes, and the officially sanctioned dissemination of this information to enhance earthquake preparedness in threatened communities. Among the main lessons of L'Aquila is the need to separate the role of science advisors, whose job is to provide objective information about natural hazards, from that of civil decision-makers who must weigh the benefits of protective actions against the costs of false alarms

  18. Earthquakes

    MedlinePlus

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  19. Earthquakes

    MedlinePlus

    Earthquakes are sudden rolling or shaking events caused ... at any time of the year. Before an earthquake: Look around places where you spend time. Identify ...

  20. Is there a basis for preferring characteristic earthquakes over a Gutenberg-Richter distribution in probabilistic earthquake forecasting?

    USGS Publications Warehouse

    Parsons, T.; Geist, E.L.

    2009-01-01

    The idea that faults rupture in repeated, characteristic earthquakes is central to most probabilistic earthquake forecasts. The concept is elegant in its simplicity, and if the same event has repeated itself multiple times in the past, we might anticipate the next. In practice however, assembling a fault-segmented characteristic earthquake rupture model can grow into a complex task laden with unquantified uncertainty. We weigh the evidence that supports characteristic earthquakes against a potentially simpler model made from extrapolation of a Gutenberg-Richter magnitude-frequency law to individual fault zones. We find that the Gutenberg-Richter model satisfies key data constraints used for earthquake forecasting equally well as a characteristic model. Therefore, judicious use of instrumental and historical earthquake catalogs enables large-earthquake-rate calculations with quantifiable uncertainty that should get at least equal weighting in probabilistic forecasting.
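
    The Gutenberg-Richter alternative weighed here amounts to fitting log10 N = a - b*M to a fault zone's instrumental and historical catalog and extrapolating to large magnitudes. A minimal sketch with illustrative a- and b-values:

      def gr_annual_rate(a, b, m):
          """Annual rate of events with magnitude >= m from log10 N = a - b*m."""
          return 10 ** (a - b * m)

      # Illustrative fault-zone fit: a = 3.1 (events/yr at M >= 0), b = 1.0.
      a, b = 3.1, 1.0
      rate = gr_annual_rate(a, b, 7.0)
      print(rate, "M>=7 events/yr, i.e. about one per", round(1 / rate), "years")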

  21. Scientific and non-scientific challenges for Operational Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.

    2015-12-01

    Tracking the time evolution of seismic hazard in time windows shorter than the usual 50 years of long-term hazard models may offer additional opportunities to reduce the seismic risk. This is the target of operational earthquake forecasting (OEF). During the OEF development in Italy we identified several challenges that range from pure science to the more practical interface of science with society. From a scientific point of view, although earthquake clustering is the clearest empirical evidence about earthquake occurrence, and OEF clustering models are the most (successfully) tested hazard models in seismology, we note that some seismologists are still reluctant to accept their scientific reliability. After exploring the motivations of these scientific doubts, we also look into an issue that is often overlooked in this discussion, i.e., in any kind of hazard analysis, we do not use a model because it is the true one, but because it is better than anything else we can think of. The non-scientific aspects are mostly related to the fact that OEF usually provides weekly probabilities of large earthquakes smaller than 1%. These probabilities are considered by some seismologists too small to be of interest or useful. However, in a recent collaboration with engineers we show that such earthquake probabilities may lead to intolerable individual risk of death. Interestingly, this debate calls for a better definition of the still-fuzzy boundaries among the different kinds of expertise required for the whole risk mitigation process. The last and probably most pressing challenge is related to communication with the public. In fact, a wrong message could be useless or even counterproductive. Here we show some progress that we have made in this field working with communication experts in Italy.

  22. 78 FR 39781 - Consequence Study of a Beyond-Design-Basis Earthquake Affecting the Spent Fuel Pool for a U.S...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-02

    ... COMMISSION Consequence Study of a Beyond-Design-Basis Earthquake Affecting the Spent Fuel Pool for a U.S... comment, titled Consequence Study of a Beyond-Design-Basis Earthquake Affecting the Spent Fuel Pool for a... earthquakes present the dominant risk for spent fuel pools, the draft study evaluated how a potential...

  23. Earthquakes.

    ERIC Educational Resources Information Center

    Walter, Edward J.

    1977-01-01

    Presents an analysis of the causes of earthquakes. Topics discussed include (1) geological and seismological factors that determine the effect of a particular earthquake on a given structure; (2) description of some large earthquakes such as the San Francisco quake; and (3) prediction of earthquakes. (HM)

  24. Earthquakes.

    ERIC Educational Resources Information Center

    Pakiser, Louis C.

    One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…

  25. Ground motion following selection of SRS design basis earthquake and associated deterministic approach. Final report: Revision 1

    SciTech Connect

    Not Available

    1991-03-01

    This report summarizes the results of a deterministic assessment of earthquake ground motions at the Savannah River Site (SRS). The purpose of this study is to assist the Environmental Sciences Section of the Savannah River Laboratory in reevaluating the design basis earthquake (DBE) ground motion at SRS using approaches defined in Appendix A to 10 CFR Part 100. This work is in support of the Seismic Engineering Section's Seismic Qualification Program for reactor restart.

  26. Earthquakes

    ERIC Educational Resources Information Center

    Roper, Paul J.; Roper, Jere Gerard

    1974-01-01

    Describes the causes and effects of earthquakes, defines the meaning of magnitude (measured on the Richter Magnitude Scale) and intensity (measured on a modified Mercalli Intensity Scale) and discusses earthquake prediction and control. (JR)

  27. Earthquakes

    MedlinePlus

  28. Monitoring and control of lifeline systems to enhance post-earthquake operation

    SciTech Connect

    Ballantyne, D.

    1995-12-31

    This paper summarizes the problem of earthquake damage to lifeline systems, particularly buried pipe, and the high cost of mitigation by replacement. System monitoring and control is presented as an alternative. Earthquake hazard, structural, soils, and system operation parameters are identified as useful for system control; examples are presented. Monitoring and control system implementation issues are discussed including system configuration, local/centralized control, hardware, and appropriate types of systems for earthquake mitigation implementation.

  29. 1/f and the Earthquake Problem: Scaling constraints to facilitate operational earthquake forecasting

    NASA Astrophysics Data System (ADS)

    Yoder, M. R.; Rundle, J. B.; Glasscoe, M. T.

    2013-12-01

    The difficulty of forecasting earthquakes can fundamentally be attributed to the self-similar, or '1/f', nature of seismic sequences. Specifically, the rate of occurrence of earthquakes is inversely proportional to their magnitude m, or more accurately to their scalar moment M. With respect to this '1/f problem,' it can be argued that catalog selection (or equivalently, determining catalog constraints) constitutes the most significant challenge to seismicity-based earthquake forecasting. Here, we address and introduce a potential solution to this most daunting problem. Specifically, we introduce a framework to constrain, or partition, an earthquake catalog (a study region) in order to resolve local seismicity. In particular, we combine Gutenberg-Richter (GR), rupture length, and Omori scaling with various empirical measurements to relate the size (spatial and temporal extents) of a study area (or bins within a study area), in combination with a metric to quantify rate trends in local seismicity, to the local earthquake magnitude potential - the magnitudes of earthquakes the region is expected to experience. From this, we introduce a new type of time-dependent hazard map for which the tuning parameter space is nearly fully constrained. In a similar fashion, by combining various scaling relations and also by incorporating finite extents (rupture length, area, and duration) as constraints, we develop a method to estimate the Omori (temporal) and spatial aftershock decay parameters as a function of the parent earthquake's magnitude m. From this formulation, we develop an ETAS-type model that overcomes many point-source limitations of contemporary ETAS. These models demonstrate promise with respect to earthquake forecasting applications. Moreover, the methods employed suggest a general framework whereby earthquake and other complex-system, 1/f-type, problems can be constrained from scaling relations and finite extents.
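
    Two of the scaling ingredients named above, Omori decay and rupture-length scaling, can be sketched briefly. The Omori constants are illustrative, and the rupture-length relation is a Wells and Coppersmith (1994) style subsurface-rupture-length fit used here only to set a plausible catalog selection radius; the paper's actual constraints are more elaborate.

      def omori_rate(t_days, k=100.0, c=0.1, p=1.1):
          """Modified Omori law: aftershock rate k / (t + c)**p."""
          return k / (t_days + c) ** p

      def rupture_length_km(m):
          """Wells & Coppersmith (1994) style scaling, log10 L = -2.44 + 0.59*M
          (subsurface rupture length, all slip types)."""
          return 10 ** (-2.44 + 0.59 * m)

      m_parent = 6.5
      radius = 3.0 * rupture_length_km(m_parent)  # select within a few rupture lengths
      print(round(rupture_length_km(m_parent), 1), "km rupture;",
            round(radius, 1), "km selection radius;",
            round(omori_rate(1.0), 1), "events/day one day after the parent event")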

  30. Earthquake Response Modeling for a Parked and Operating Megawatt-Scale Wind Turbine

    SciTech Connect

    Prowell, I.; Elgamal, A.; Romanowitz, H.; Duggan, J. E.; Jonkman, J.

    2010-10-01

    Demand parameters for turbines, such as tower moment demand, are primarily driven by wind excitation and dynamics associated with operation. For that purpose, computational simulation platforms have been developed, such as FAST, maintained by the National Renewable Energy Laboratory (NREL). For seismically active regions, building codes also require the consideration of earthquake loading. Historically, it has been common to use simple building code approaches to estimate the structural demand from earthquake shaking, as an independent loading scenario. Currently, International Electrotechnical Commission (IEC) design requirements include the consideration of earthquake shaking while the turbine is operating. Numerical and analytical tools used to consider earthquake loads for buildings and other static civil structures are not well suited for modeling simultaneous wind and earthquake excitation in conjunction with operational dynamics. Through the addition of seismic loading capabilities to FAST, it is possible to simulate earthquake shaking in the time domain, which allows consideration of non-linear effects such as structural nonlinearities, aerodynamic hysteresis, control system influence, and transients. This paper presents a FAST model of a modern 900-kW wind turbine, which is calibrated based on field vibration measurements. With this calibrated model, both coupled and uncoupled simulations are conducted to examine the structural demand on the turbine tower. Response is compared under the conditions of normal operation and potential emergency shutdown due to earthquake-induced vibrations. The results highlight the availability of a numerical tool for conducting such studies, and provide insights into the combined wind-earthquake loading mechanism.

  31. Post Test Analysis of a PCCV Model Dynamically Tested Under Simulated Design-Basis Earthquakes

    SciTech Connect

    Cherry, J.; Chokshi, N.; James, R.J.; Rashid, Y.R.; Tsurumaki, S.; Zhang, L.

    1998-11-09

    In a collaborative program between the United States Nuclear Regulatory Commission (USNRC) and the Nuclear Power Engineering Corporation (NUPEC) of Japan under sponsorship of the Ministry of International Trade and Industry, the seismic behavior of Prestressed Concrete Containment Vessels (PCCV) is being investigated. A 1:10 scale PCCV model has been constructed by NUPEC and subjected to seismic simulation tests using the high performance shaking table at the Tadotsu Engineering Laboratory. A primary objective of the testing program is to demonstrate the capability of the PCCV to withstand design basis earthquakes with a significant safety margin against major damage or failure. As part of the collaborative program, Sandia National Laboratories (SNL) is conducting research in state-of-the-art analytical methods for predicting the seismic behavior of PCCV structures, with the eventual goal of understanding, validating, and improving calculations related to containment structure performance under design and severe seismic events. With the increased emphasis on risk-informed regulatory focus, more accurate characterization (less uncertainty) of containment structural and functional integrity is desirable. This paper presents results of post-test calculations conducted at ANATECH to simulate the design level scale model tests.

  32. Earthquake!

    ERIC Educational Resources Information Center

    Hernandez, Hildo

    2000-01-01

    Examines the types of damage experienced by California State University at Northridge during the 1994 earthquake and discusses the lessons learned in handling this emergency. The problem of loose asbestos is addressed. (GR)

  33. Earthquakes

    USGS Publications Warehouse

    Shedlock, Kaye M.; Pakiser, Louis Charles

    1998-01-01

    One of the most frightening and destructive phenomena of nature is a severe earthquake and its terrible aftereffects. An earthquake is a sudden movement of the Earth, caused by the abrupt release of strain that has accumulated over a long time. For hundreds of millions of years, the forces of plate tectonics have shaped the Earth as the huge plates that form the Earth's surface slowly move over, under, and past each other. Sometimes the movement is gradual. At other times, the plates are locked together, unable to release the accumulating energy. When the accumulated energy grows strong enough, the plates break free. If the earthquake occurs in a populated area, it may cause many deaths and injuries and extensive property damage. Today we are challenging the assumption that earthquakes must present an uncontrollable and unpredictable hazard to life and property. Scientists have begun to estimate the locations and likelihoods of future damaging earthquakes. Sites of greatest hazard are being identified, and definite progress is being made in designing structures that will withstand the effects of earthquakes.

  34. Earthquakes

    EPA Pesticide Factsheets

    Information on this page will help you understand environmental dangers related to earthquakes and what you can do to prepare and recover. It will also help you recognize possible environmental hazards and learn what you can do to protect yourself and your family.

  35. PBO Southwest Region: Baja Earthquake Response and Network Operations

    NASA Astrophysics Data System (ADS)

    Walls, C. P.; Basset, A.; Mann, D.; Lawrence, S.; Jarvis, C.; Feaux, K.; Jackson, M. E.

    2011-12-01

    The SW region of the Plate Boundary Observatory consists of 455 continuously operating GPS stations located principally along the transform system of the San Andreas fault and Eastern California Shear Zone. In the past year network uptime exceeded an average of 97% with greater than 99% data acquisition. Communications range from CDMA modem (307), radio (92), Vsat (30), and DSL/T1/other (25) to manual downloads (1). Sixty-three stations stream 1 Hz data over the VRS3Net, typically with <0.5 second latency. Over 620 maintenance activities were performed during 316 onsite visits out of approximately 368 engineer field days. Within the past year there have been 7 incidents of minor (attempted theft) to moderate (solar panel stolen) vandalism, with one total loss of receiver and communications gear. Security was enhanced at these sites through fencing and more secure station configurations. In the past 12 months, 4 new stations were installed to replace removed stations or to augment the network at strategic locations. Following the M7.2 El Mayor-Cucapah earthquake, CGPS station P796, a deep-drilled braced monument, was constructed in San Luis, AZ along the border within 5 weeks of the event. In addition, UNAVCO participated in a successful University of Arizona-led RAPID proposal for the installation of six continuous GPS stations for post-seismic observations. Six stations are installed and telemetered through a UNAM relay at the Sierra San Pedro Martir. Four of these stations have Vaisala WXT520 meteorological sensors. An additional site in the Sierra Cucapah (PTAX), built by Caltech and CICESE, an Associate UNAVCO Member institution in Mexico, has been integrated into PBO dataflow. The stations will be maintained as part of the PBO network in coordination with CICESE. UNAVCO is working with NOAA to upgrade PBO stations with WXT520 meteorological sensors and communications systems capable of streaming real-time GPS and met data. The real-time GPS and

  36. Operational earthquake forecasting in California: A prototype system combining UCERF3 and CyberShake

    NASA Astrophysics Data System (ADS)

    Milner, K. R.; Jordan, T. H.; Field, E. H.

    2014-12-01

    Operational earthquake forecasting (OEF) is the dissemination of authoritative information about time-dependent earthquake probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To attain this goal, OEF must provide a complete description of the seismic hazard—ground motion exceedance probabilities as well as short-term rupture probabilities—in concert with the long-term forecasts of probabilistic seismic hazard analysis. We have combined the Third Uniform California Earthquake Rupture Forecast (UCERF3) of the Working Group on California Earthquake Probabilities (Field et al., 2014) with the CyberShake ground-motion model of the Southern California Earthquake Center (Graves et al., 2011; Callaghan et al., this meeting) into a prototype OEF system for generating time-dependent hazard maps. UCERF3 represents future earthquake activity in terms of fault-rupture probabilities, incorporating both Reid-type renewal models and Omori-type clustering models. The current CyberShake model comprises approximately 415,000 earthquake rupture variations to represent the conditional probability of future shaking at 285 geographic sites in the Los Angeles region (~236 million horizontal-component seismograms). This combination provides significant probability gains relative to OEF models based on empirical ground-motion prediction equations (GMPEs), primarily because the physics-based CyberShake simulations account for the rupture directivity, basin effects, and directivity-basin coupling that are not represented by the GMPEs.
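
    In schematic form the combination works as follows: UCERF3 supplies rupture probabilities for the forecast window, CyberShake supplies the conditional probability that each rupture exceeds a target shaking level at a site, and the two are folded into a site exceedance probability. The sketch below assumes independent rupture occurrences and invented numbers; the real system aggregates roughly 415,000 rupture variations per site.

      def site_exceedance_prob(ruptures):
          """P(shaking exceeds the target at a site during the window)
          = 1 - prod over ruptures of (1 - P(rup) * P(exceed | rup)),
          assuming independent rupture occurrences."""
          p_none = 1.0
          for p_rup, p_exceed_given_rup in ruptures:
              p_none *= 1.0 - p_rup * p_exceed_given_rup
          return 1.0 - p_none

      # Invented (rupture probability, conditional exceedance) pairs for one site:
      ruptures = [(0.002, 0.80), (0.010, 0.25), (0.0005, 0.95)]
      print(site_exceedance_prob(ruptures))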

  37. Medical Response to Haiti Earthquake: Operation Unified Response

    DTIC Science & Technology

    2011-01-24

    A magnitude 7.0 earthquake struck Haiti in the vicinity of Port-au-Prince at 4:53 pm on Tuesday, January 12, 2010: approximately 230,000 deaths, 197,000 injured, 1.1M displaced people, 3,000,000 affected people, and 60% of government... The briefing's deployment timeline notes a 6.1 aftershock, the arrivals of COMFORT, the BATAAN ARG, and AFSOC elements, and JTF-H conducting a relief in place with ARSOUTH and final DHHS treatment operations in March.

  38. Plutonium uranium extraction (PUREX) end state basis for interim operation (BIO) for surveillance and maintenance

    SciTech Connect

    DODD, E.N.

    1999-05-12

    This Basis for Interim Operation (BIO) was developed for the PUREX end state condition following completion of the deactivation project. The deactivation project has removed or stabilized the hazardous materials within the facility structure and equipment to reduce the hazards posed by the facility during the surveillance and maintenance (S and M) period, and to reduce the costs associated with the S and M. This document serves as the authorization basis for the PUREX facility, excluding the storage tunnels, railroad cut, and associated tracks, for the deactivated end state condition during the S and M period. The storage tunnels, and associated systems and areas, are addressed in WHC-SD-HS-SAR-001, Rev. 1, PUREX Final Safety Analysis Report. During S and M, the mission of the facility is to maintain the conditions and equipment in a manner that ensures the safety of the workers, environment, and the public. The S and M phase will continue until the final decontamination and decommissioning (D and D) project and activities are begun. Based on the methodology of DOE-STD-1027-92, Hazards Categorization and Accident Analysis Techniques for Compliance with DOE Order 5480.23, Nuclear Safety Analysis Reports, the final facility hazards category is identified as hazards category 2. This considers the remaining material inventories, form and distribution of the material, and the energies present to initiate events of concern. Given the current facility configuration, conditions, and authorized S and M activities, there are no operational events identified resulting in significant hazard to any of the target receptor groups (e.g., workers, public, environment). The only accident scenarios identified with consequences to the onsite co-located workers were based on external natural phenomena, specifically an earthquake. The dose consequences of these events are within the current risk evaluation guidelines and are consistent with the expectations for a hazards category 2

  39. Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.; the International Commission on Earthquake ForecastingCivil Protection

    2011-12-01

    Operational earthquake forecasting (OEF) is the dissemination of authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes. Most previous work on the public utility of OEF has anticipated that forecasts would deliver high probabilities of large earthquakes; i.e., deterministic predictions with low error rates (false alarms and failures-to-predict) would be possible. This expectation has not been realized. An alternative to deterministic prediction is probabilistic forecasting based on empirical statistical models of aftershock triggering and seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains in excess of 100 relative to long-term forecasts. The utility of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing OEF in this sort of "low-probability environment." The need to move more quickly has been underscored by recent seismic crises, such as the 2009 L'Aquila earthquake sequence, in which an anxious public was confused by informal and inaccurate earthquake predictions. After the L'Aquila earthquake, the Italian Department of Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF), which I chaired, to recommend guidelines for OEF utilization. Our report (Ann. Geophys., 54, 4, 2011; doi: 10.4401/ag-5350) concludes: (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and need to convey epistemic uncertainties. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. (c) All operational models should be evaluated

  40. Development of Site-Specific Soil Design Basis Earthquake (DBE) Parameters for the Integrated Waste Treatment Unit (IWTU)

    SciTech Connect

    Payne, Suzette

    2008-08-01

    Horizontal and vertical PC 3 (2,500 yr) Soil Design Basis Earthquake (DBE) 5% damped spectra, corresponding time histories, and strain-compatible soil properties were developed for the Integrated Waste Treatment Unit (IWTU). The IWTU is located at the Idaho Nuclear Technology and Engineering Center (INTEC) at the Idaho National Laboratory (INL). Mean and 84th percentile horizontal DBE spectra derived from site-specific site response analyses were evaluated for the IWTU. The horizontal and vertical PC 3 (2,500 yr) Soil DBE 5% damped spectra at the 84th percentile were selected for Soil Structure Interaction (SSI) analyses at IWTU. The site response analyses were performed consistent with applicable Department of Energy (DOE) Standards, recommended guidance of the Nuclear Regulatory Commission (NRC), American Society of Civil Engineers (ASCE) Standards, and recommendations of the Blue Ribbon Panel (BRP) and Defense Nuclear Facilities Safety Board (DNFSB).

  41. The Earthquake Prediction Experiment on the Basis of the Jet Stream's Precursor

    NASA Astrophysics Data System (ADS)

    Wu, H. C.; Tikhonov, I. N.

    2014-12-01

    Simultaneous analysis of jet stream maps and EQ data of M > 6.0 has been made. 58 cases of EQs that occurred in 2006-2010 were studied. It has been found that an interruption or crossing of velocity flow lines above an EQ epicenter takes place 1-70 days prior to the event. The duration was 6-12 hours. The assumption is that the jet stream will go up or down near an epicenter. In 45 cases the distance between epicenters and the jet stream's precursor does not exceed 90 km. The success rate of forecasts during the 30 days before the EQ was 66.1% (Wu and Tikhonov, 2014). This technique has been used to predict strong EQs, with predictions pre-registered on the website (for example, the 23 October 2011, M 7.2 EQ (Turkey); the 20 May 2012, M 6.1 EQ (Italy); the 16 April 2013, M 7.8 EQ (Iran); the 12 November 2013, M 6.6 EQ (Russia); the 03 March 2014, M 6.7 Ryukyu EQ (Japan); the 21 July 2014, M 6.2 Kuril EQ). We obtain satisfactory accuracy of the epicenter location, and the alarm period is short. These are the positive aspects of the forecast. However, estimates of magnitude contain a large uncertainty. Reference: Wu, H.C., Tikhonov, I.N., 2014. Jet streams anomalies as possible short-term precursors of earthquakes with M > 6.0. Research in Geophysics, Special Issue on Earthquake Precursors. Vol. 4. No 1. doi:10.4081/rg.2014.4939. The precursor of the M9.0 Japan EQ on 2011/03/11 (fig. 1). A. M6.1 Italy EQ (2012/05/20, 44.80 N, 11.19 E, H = 5.1 km). Prediction: 2012/03/20~2012/04/20 (45.6 N, 10.5 E), M > 5.5 (fig. 2). http://ireport.cnn.com/docs/DOC-764800 B. M7.8 Iran EQ (2013/04/16, 28.11 N, 62.05 E, H = 82.0 km). Prediction: 2013/01/14~2013/02/04 (28.0 N, 61.3 E), M > 6.0 (fig. 3). http://ireport.cnn.com/docs/DOC-910919 C. M6.6 Russia EQ (2013/11/12, 54.68 N, 162.29 E, H = 47.2 km). Prediction: 2013/10/27~2013/11/13 (56.0 N, 162.9 E), M > 5.5. http://ireport.cnn.com/docs/DOC-1053599 D. M6.7 Japan EQ (2014/03/03, 27.41 N, 127.34 E, H = 111.2 km). Prediction: 2013/12/02~2014/01/15 (26.7 N, 128.1 E), M > 6.5 (fig. 4). http

  42. Planning a Preliminary Program for Earthquake Loss Estimation and Emergency Operation by Three-dimensional Structural Model of Active Faults

    NASA Astrophysics Data System (ADS)

    Ke, M. C.

    2015-12-01

    Large-scale earthquakes often cause serious economic losses and many deaths. Because the magnitude, time, and location of earthquakes still cannot be predicted, pre-disaster risk modeling and post-disaster operations are essential for reducing earthquake damage. To understand the disaster risk of earthquakes, simulation is usually used to build earthquake scenarios; point sources, fault line sources, and fault plane sources are the models most often used as seismic sources in such scenarios. The assessments these models produce for risk analysis and emergency operations are serviceable, but their accuracy can still be improved. This program invites experts and scholars from Taiwan University, National Central University, and National Cheng Kung University, and uses historical earthquake records, geological data, and geophysical data to build three-dimensional underground structural planes of active faults. The purpose is to replace projected fault planes with underground fault planes that are closer to reality, and the analysis accuracy of earthquake prevention efforts can be upgraded with this database. The three-dimensional data will then be applied at different stages of disaster prevention. Before a disaster, results of earthquake risk analysis obtained with the three-dimensional fault plane data are closer to real damage. During a disaster, the three-dimensional fault plane data can help estimate the aftershock distribution and the areas of serious damage. The program used 14 geological profiles to build the three-dimensional data of the Hsinchu and Hsincheng faults in 2015. Other active faults will be completed in 2018 and actually applied to earthquake disaster prevention.

  43. [How to solve hospital operating problems based on the experience of the Hanshin earthquake?].

    PubMed

    Yoshikawa, J

    1995-06-01

    Immediately after the Hanshin (Kobe-Osaka) earthquake, electricity, water, and gas supplies were discontinued at Kobe General Hospital, causing difficulties with many important hospital functions including the water-cooled independent electric power plant, respirators, and sterilizers. A large water storage facility is needed to keep a water-cooled independent power plant operating. Alternative plans, including the introduction of an air-cooled independent power plant and a seawater-to-freshwater exchange system, should be considered. Portable compressors are needed to retain the function of respirators in the absence of a water supply. The emergency use of propane gas should also be considered for sterilization and cooking facilities. There were very great problems in communication after the earthquake. The only method was the use of public phones, which have priority over private lines. Therefore, each hospital should have phones with similar priority. In addition, the use of personal computers and/or computer network methods should be considered to preserve a high level of communication after an earthquake. Otherwise, a hospital should be equipped with wireless phones. Immediately after the earthquake, the care of critically ill patients could not be achieved. Therefore, 20 cardiac patients requiring intensive care had to be transferred to other heart centers. The best method for the transfer is by helicopter. Kobe City suffered a transport crisis that began immediately after the earthquake and continued up to the end of March. A big helicopter or a special bus guided by a police car should be considered for hospital staff transport. (ABSTRACT TRUNCATED AT 250 WORDS)

  44. Planning Matters: Response Operations following the 30 September 2009 Sumatran Earthquake

    NASA Astrophysics Data System (ADS)

    Comfort, L. K.; Cedillos, V.; Rahayu, H.

    2009-12-01

    Response operations following the 9/30/2009 West Sumatra earthquake tested extensive planning that had been done in Indonesia since the 26 December 2004 Sumatran Earthquake and Tsunami. After massive destruction in Aceh Province in 2004, the Indonesian National Government revised its national disaster management plans. A key component was to select six cities in Indonesia exposed to significant risk and make a focused investment of resources, planning activities, and public education to reduce risk of major disasters. Padang City was selected for this national “showcase” for disaster preparedness, planning, and response. The question is whether planning improved governmental performance and coordination in practice. There is substantial evidence that disaster preparedness planning and training initiated over the past four years had a positive effect on Padang in terms of disaster risk reduction. The National Disaster Management Agency (BNPB, 10/28/09) reported the following casualties: Padang City: deaths, 383; severe injuries, 431, minor injuries, 771. Province of West Sumatra: deaths, 1209; severe injuries, 67; minor injuries, 1179. These figures contrasted markedly with the estimated losses following the 2004 Earthquake and Tsunami when no training had been done: Banda Aceh, deaths, 118,000; Aceh Province, dead/missing, 236,169 (ID Health Ministry 2/22/05). The 2004 events were more severe, yet the comparable scale of loss was significantly lower in the 9/30/09 earthquake. Three factors contributed to reducing disaster risk in Padang and West Sumatra. First, annual training exercises for tsunami warning and evacuation had been organized by national agencies since 2004. In 2008, all exercises and training activities were placed under the newly established BNPB. The exercise held in Padang in February, 2009 served as an organizing framework for response operations in the 9/30/09 earthquake. Public officers with key responsibilities for emergency operations

  6. Theoretical basis for operational ensemble forecasting of coronal mass ejections

    NASA Astrophysics Data System (ADS)

    Pizzo, V. J.; Koning, C.; Cash, M.; Millward, G.; Biesecker, D. A.; Puga, L.; Codrescu, M.; Odstrcil, D.

    2015-10-01

    We lay out the theoretical underpinnings for the application of the Wang-Sheeley-Arge-Enlil modeling system to ensemble forecasting of coronal mass ejections (CMEs) in an operational environment. In such models there is no magnetic cloud component, so our results pertain only to CME front properties, such as transit time to Earth. Within this framework, we find no evidence that the propagation is chaotic; therefore, CME forecasting calls for different tactics than those employed for terrestrial weather or hurricane forecasting. We explore a broad range of CME cone inputs and ambient states to flesh out differing CME evolutionary behavior in the various dynamical domains (e.g., large, fast CMEs launched into a slow ambient, and the converse, plus numerous permutations in between). CME propagation in both uniform and highly structured ambient flows is considered to assess how much the solar wind background affects the CME front properties at 1 AU. Graphical and analytic tools pertinent to an ensemble approach are developed to enable uncertainties in forecasting CME impact at Earth to be realistically estimated. We discuss how uncertainties in CME pointing relative to the Sun-Earth line affect the reliability of a forecast and how glancing blows become an issue for CME off-points greater than about the half width of the estimated input CME. While the basic results appear consistent with established impressions of CME behavior, the next step is to use existing records of well-observed CMEs at both Sun and Earth to verify that real events follow the systematic tendencies presented in this study.
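    As a rough illustration of the ensemble idea described above (not part of the WSA-Enlil system itself, which is a full 3-D MHD model), the sketch below propagates a Monte Carlo ensemble of CME initial speeds through a simple drag-based kinematic model to obtain a transit-time spread; all parameter values, uncertainties, and the `transit_time` helper are assumptions for demonstration.

    ```python
    # Illustrative Monte Carlo ensemble for CME transit-time uncertainty.
    import numpy as np

    rng = np.random.default_rng(42)
    AU = 1.496e8          # km
    R0 = 21.5 * 6.957e5   # assumed inner boundary (21.5 solar radii), km

    def transit_time(v0, w, gamma=1e-7):
        """Drag-based kinematics, dv/dt = -gamma*(v-w)*|v-w|, integrated to 1 AU."""
        r, v, t, dr = R0, v0, 0.0, 1e5
        while r < AU:
            a = -gamma * (v - w) * abs(v - w)     # drag toward ambient wind speed
            v = max(v + a * dr / v, 50.0)         # guard against stalling
            t += dr / v
            r += dr
        return t / 3600.0                          # hours

    # Perturb initial speed and ambient wind within assumed cone-fit errors.
    v0s = rng.normal(800.0, 100.0, 500)   # km/s
    ws  = rng.normal(400.0, 50.0, 500)    # km/s
    times = [transit_time(v, w) for v, w in zip(v0s, ws)]
    print(f"transit time: {np.mean(times):.1f} +/- {np.std(times):.1f} h")
    ```

    The spread of the printed distribution is the kind of uncertainty estimate an operational ensemble forecast would attach to an arrival-time prediction.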

  7. A century of oilfield operations and earthquakes in the greater Los Angeles Basin, southern California

    USGS Publications Warehouse

    Hauksson, Egill; Goebel, Thomas; Ampuero, Jean-Paul; Cochran, Elizabeth S.

    2015-01-01

    Most of the seismicity in the Los Angeles Basin (LA Basin) occurs at depth, below the sediments, and is caused by transpressional tectonics related to the big bend in the San Andreas fault. However, some of the seismicity could be associated with fluid extraction or injection in oil fields that have been in production for almost a century and cover ∼17% of the basin. In a recent study, the influence of industry operations was first evaluated by analyzing seismicity characteristics, including normalized seismicity rates, focal depths, and b-values; no significant difference was found between seismicity characteristics inside and outside the oil fields. In addition, to identify possible temporal correlations, the seismicity and the available monthly fluid extraction and injection volumes since 1977 were analyzed. Second, the production and deformation history of the Wilmington oil field was used to evaluate whether other oil fields are likely to experience similar surface deformation in the future. Third, the maximum earthquake magnitudes of events within the perimeters of the oil fields were analyzed to see whether they correlate with total net injected volumes, as suggested by previous studies. Similarly, maximum magnitudes were examined to see whether they increase with net extraction volume. Overall, no obvious previously unidentified induced earthquakes were found, and the management of balanced production and injection of fluids appears to reduce the risk of induced-earthquake activity in the oil fields.
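    The b-value comparison mentioned above can be made concrete with the standard Aki (1965) maximum-likelihood estimator; the sketch below applies it to two synthetic catalogs, since the actual LA Basin catalogs and completeness magnitude are not reproduced here.

    ```python
    # Maximum-likelihood b-value comparison, inside vs. outside oil fields.
    import numpy as np

    def b_value(mags, mc, dm=0.1):
        """Aki (1965) estimator for magnitudes >= completeness mc, bin width dm."""
        m = np.asarray(mags)
        m = m[m >= mc]
        return 1.0 / (np.log(10) * (m.mean() - (mc - dm / 2.0)))

    # Hypothetical catalogs: exponential magnitude excess above Mc = 2.0.
    inside_fields  = np.random.default_rng(0).exponential(0.40, 500) + 2.0
    outside_fields = np.random.default_rng(1).exponential(0.45, 800) + 2.0
    print("b inside :", round(b_value(inside_fields, mc=2.0), 2))
    print("b outside:", round(b_value(outside_fields, mc=2.0), 2))
    ```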

  8. Basis for Interim Operation for the K-Reactor in Cold Standby

    SciTech Connect

    Shedrow, B.

    1998-10-19

    The Basis for Interim Operation (BIO) document for K Reactor in Cold Standby and the L- and P-Reactor Disassembly Basins was prepared in accordance with the draft DOE standard for BIO preparation (dated October 26, 1993).

  9. Ground motions associated with the design basis earthquake at the Savannah River Site, South Carolina, based on a deterministic approach

    SciTech Connect

    Youngs, R.R.; Coppersmith, K.J. ); Stephenson, D.E. ); Silva, W. )

    1991-01-01

    Ground motion assessments are presented for evaluation of the seismic safety of K-Reactor at the Savannah River Site. Two earthquake sources are identified as the most significant to seismic hazard at the site: a M 7.5 earthquake occurring in Charleston, South Carolina, and a M 5 event occurring in the site vicinity. These events control the low-frequency and high-frequency portions of the spectrum, respectively. Three major issues were identified in the assessment of ground motions for the Savannah River Site: specification of the appropriate stress drop for the Charleston source earthquake, specification of the appropriate levels of soil damping at large depths for site response analyses, and the appropriateness of western US recordings for specification of ground motions in the eastern US.

  10. Ground motions associated with the design basis earthquake at the Savannah River Site, South Carolina, based on a deterministic approach

    SciTech Connect

    Youngs, R.R.; Coppersmith, K.J.; Stephenson, D.E.; Silva, W.

    1991-12-31

    Ground motion assessments are presented for evaluation of the seismic safety of K-Reactor at the Savannah River Site. Two earthquake sources are identified as the most significant to seismic hazard at the site: a M 7.5 earthquake occurring in Charleston, South Carolina, and a M 5 event occurring in the site vicinity. These events control the low-frequency and high-frequency portions of the spectrum, respectively. Three major issues were identified in the assessment of ground motions for the Savannah River Site: specification of the appropriate stress drop for the Charleston source earthquake, specification of the appropriate levels of soil damping at large depths for site response analyses, and the appropriateness of western US recordings for specification of ground motions in the eastern US.

  11. Distinction of Abnormality of Surgical Operation on the Basis of Surface EMG Signals

    NASA Astrophysics Data System (ADS)

    Nakaya, Yusuke; Ishii, Chiharu; Nakakuki, Takashi; Nishitani, Yosuke; Hikita, Mitsutaka

    In this paper, a novel method for automatic identification of a surgical operation and on-line recognition of the singularity of the identified surgical operation is proposed. Suturing is divided into six operations. The features of the operation are extracted from measurements of the movement of the forceps, and then, on the basis of threshold criteria for the six operations, a surgical operation is identified as one of the six. Next, the features of any singularity of the operation are extracted from the operator's surface electromyogram signals, and the identified surgical operation is classified as either normal or singular using a self-organizing map. Using a laparoscopic-surgery simulator built with two forceps, identification of each surgical operation and distinction of the singularity of the identified operation were carried out for a specific surgical operation, namely, insertion of a needle during suturing. Each surgical operation in suturing could be identified with more than 80% accuracy, and the singularity of the insertion operation could be distinguished with approximately 80% accuracy on average. The experimental results showed the effectiveness of the proposed method.
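    A minimal self-organizing-map classifier in the spirit of the normal/singular step above might look like the following; the EMG feature vectors, map size, and training schedule are synthetic placeholders, since the paper's actual features and SOM configuration are not reproduced here.

    ```python
    # Tiny 1-D self-organizing map, trained from scratch, then used to label
    # units by majority class and classify a new sample as normal/singular.
    import numpy as np

    rng = np.random.default_rng(0)

    def train_som(data, n_units=8, epochs=200, lr0=0.5, sigma0=2.0):
        w = rng.normal(size=(n_units, data.shape[1]))
        for t in range(epochs):
            lr = lr0 * (1 - t / epochs)
            sigma = max(sigma0 * (1 - t / epochs), 0.5)
            for x in rng.permutation(data):
                bmu = np.argmin(((w - x) ** 2).sum(axis=1))       # best-matching unit
                h = np.exp(-((np.arange(n_units) - bmu) ** 2) / (2 * sigma ** 2))
                w += lr * h[:, None] * (x - w)                     # neighborhood update
        return w

    # Synthetic 3-feature EMG summaries for normal vs. singular operations.
    normal   = rng.normal(0.0, 1.0, (100, 3))
    singular = rng.normal(3.0, 1.0, (100, 3))
    w = train_som(np.vstack([normal, singular]))

    def bmu(x):
        return np.argmin(((w - x) ** 2).sum(axis=1))

    # Label each map unit by the majority class of its training hits.
    hits = {}
    for x, y in [(v, 0) for v in normal] + [(v, 1) for v in singular]:
        hits.setdefault(bmu(x), []).append(y)
    unit_class = {u: int(np.mean(ys) > 0.5) for u, ys in hits.items()}

    test = rng.normal(3.0, 1.0, 3)   # a new, presumably singular sample
    print("classified as:", "singular" if unit_class.get(bmu(test), 0) else "normal")
    ```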

  12. Anthropogenic Triggering of Large Earthquakes

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Bizzarri, Andrea

    2014-08-01

    The physical mechanism of the anthropogenic triggering of large earthquakes on active faults is studied on the basis of experimental phenomenology, i.e., that earthquakes occur on active tectonic faults, that crustal stress values are those measured in situ and, on active faults, comply with the values of the stress drop measured for real earthquakes, that the static friction coefficients are those inferred on faults, and that the effective triggering stresses are those inferred for real earthquakes. Deriving the conditions for earthquake nucleation as a time-dependent solution of the Tresca-Von Mises criterion applied in the framework of poroelasticity yields that active faults can be triggered by fluid overpressures < 0.1 MPa. Comparing this with the deviatoric stresses at the depth of crustal hypocenters, which are of the order of 1-10 MPa, we find that injecting fluids into the subsoil at the pressures typical of oil and gas production and storage may trigger destructive earthquakes on active faults within a few tens of kilometers. Fluid pressure propagates as slow stress waves along geometric paths operating in a drained condition and can advance the natural occurrence of earthquakes by a substantial amount of time. Furthermore, it is illusory to control earthquake triggering by close monitoring of minor ``foreshocks'', since the induction may occur with a delay of up to several years.
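    A back-of-the-envelope version of this argument (illustrative numbers only, not the paper's poroelastic solution) checks a Coulomb-type failure condition on a near-critically stressed fault: with deviatoric stresses of a few MPa, an overpressure below 0.1 MPa is enough to close the remaining margin.

    ```python
    # Hypothetical near-critically stressed fault; all stresses in MPa.
    tau     = 2.95   # shear stress, within the 1-10 MPa deviatoric range cited
    sigma_n = 5.0    # effective normal stress before injection
    mu_s    = 0.6    # static friction coefficient, typical laboratory range

    def coulomb_margin(dp):
        """Frictional strength minus shear load after overpressure dp; <= 0 means slip."""
        return mu_s * (sigma_n - dp) - tau

    for dp in (0.0, 0.05, 0.10, 0.20):
        m = coulomb_margin(dp)
        print(f"overpressure {dp:.2f} MPa -> margin {m:+.3f} MPa"
              + ("  (failure)" if m <= 0 else ""))
    ```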

  13. Anthropogenic triggering of large earthquakes.

    PubMed

    Mulargia, Francesco; Bizzarri, Andrea

    2014-08-26

    The physical mechanism of the anthropogenic triggering of large earthquakes on active faults is studied on the basis of experimental phenomenology, i.e., that earthquakes occur on active tectonic faults, that crustal stress values are those measured in situ and, on active faults, comply with the values of the stress drop measured for real earthquakes, that the static friction coefficients are those inferred on faults, and that the effective triggering stresses are those inferred for real earthquakes. Deriving the conditions for earthquake nucleation as a time-dependent solution of the Tresca-Von Mises criterion applied in the framework of poroelasticity yields that active faults can be triggered by fluid overpressures < 0.1 MPa. Comparing this with the deviatoric stresses at the depth of crustal hypocenters, which are of the order of 1-10 MPa, we find that injecting fluids into the subsoil at the pressures typical of oil and gas production and storage may trigger destructive earthquakes on active faults within a few tens of kilometers. Fluid pressure propagates as slow stress waves along geometric paths operating in a drained condition and can advance the natural occurrence of earthquakes by a substantial amount of time. Furthermore, it is illusory to control earthquake triggering by close monitoring of minor "foreshocks", since the induction may occur with a delay of up to several years.

  14. Anthropogenic Triggering of Large Earthquakes

    PubMed Central

    Mulargia, Francesco; Bizzarri, Andrea

    2014-01-01

    The physical mechanism of the anthropogenic triggering of large earthquakes on active faults is studied on the basis of experimental phenomenology, i.e., that earthquakes occur on active tectonic faults, that crustal stress values are those measured in situ and, on active faults, comply with the values of the stress drop measured for real earthquakes, that the static friction coefficients are those inferred on faults, and that the effective triggering stresses are those inferred for real earthquakes. Deriving the conditions for earthquake nucleation as a time-dependent solution of the Tresca-Von Mises criterion applied in the framework of poroelasticity yields that active faults can be triggered by fluid overpressures < 0.1 MPa. Comparing this with the deviatoric stresses at the depth of crustal hypocenters, which are of the order of 1–10 MPa, we find that injecting fluids into the subsoil at the pressures typical of oil and gas production and storage may trigger destructive earthquakes on active faults within a few tens of kilometers. Fluid pressure propagates as slow stress waves along geometric paths operating in a drained condition and can advance the natural occurrence of earthquakes by a substantial amount of time. Furthermore, it is illusory to control earthquake triggering by close monitoring of minor “foreshocks”, since the induction may occur with a delay of up to several years. PMID:25156190

  15. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    Title 29 (Labor), Regulations Relating to Labor (Continued), Wage and Hour Division — Fair Labor Standards Act, Employment in Agriculture That Is Exempted From the Minimum Wage and Overtime ... Section 780.314, Operations customarily * * * paid on a piece rate basis * * *. Revised as of July 1, 2013.

  16. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    Title 29 (Labor), Regulations Relating to Labor (Continued), Wage and Hour Division — Fair Labor Standards Act, Employment in Agriculture That Is Exempted From the Minimum Wage and Overtime ... Section 780.314, Operations customarily * * * paid on a piece rate basis * * *. Revised as of July 1, 2014.

  17. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    Title 29 (Labor), Regulations Relating to Labor (Continued), Wage and Hour Division — Fair Labor Standards Act, Employment in Agriculture That Is Exempted From the Minimum Wage and Overtime ... Section 780.314, Operations customarily * * * paid on a piece rate basis * * *. Revised as of July 1, 2012.

  18. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 29 (Labor), Regulations Relating to Labor (Continued), Wage and Hour Division — Fair Labor Standards Act, Employment in Agriculture That Is Exempted From the Minimum Wage and Overtime ... Section 780.314, Operations customarily * * * paid on a piece rate basis * * *. Revised as of July 1, 2010.

  19. 29 CFR 780.314 - Operations customarily * * * paid on a piece rate basis * * *.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    Title 29 (Labor), Regulations Relating to Labor (Continued), Wage and Hour Division — Fair Labor Standards Act, Employment in Agriculture That Is Exempted From the Minimum Wage and Overtime ... Section 780.314, Operations customarily * * * paid on a piece rate basis * * *. Revised as of July 1, 2011.

  20. On the Physical Basis of Rate Law Formulations for River Evolution, and their Applicability to the Simulation of Evolution after Earthquakes

    NASA Astrophysics Data System (ADS)

    An, C.; Parker, G.; Fu, X.

    2015-12-01

    River morphology evolves in response to trade-offs among a series of environmental forcing factors, and this evolution is disturbed if such environmental factors change. One example of response to chronic disturbance is the intensive river evolution after earthquakes in the mountain areas of southwest China. When simulating river morphological response to environmental disturbance, an exponential rate law with a specified characteristic response time is often regarded as a practical tool for quantification. As conceptual models, empirical rate law formulations can describe broad-brush morphological response, but their physical basis is not solid, in that they do not consider the details of morphodynamic processes. Meanwhile, river evolution can also be simulated with physically based morphodynamic models that conserve sediment via the Exner equation. Here we study the links between the rate law formalism and the Exner equation by solving the Exner equation mathematically and numerically. The results show that, when a very simplified bedload transport relation is implemented, the Exner equation reduces to the diffusion equation, the solution of which is a Gaussian function. This solution coincides with the solution associated with rate laws, thus providing a physical basis for such formulations. However, when the complexities of a natural river are considered, the solution of the Exner equation is no longer a simple Gaussian function. Under such circumstances, the rate law becomes invalid, and a full understanding of the response of rivers to earthquakes requires a complete morphodynamic model.
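    The reduction described above can be written out in a few lines; the notation here (bed elevation η, porosity λ_p, bedload flux q_b) is assumed for illustration rather than taken from the paper:

    ```latex
    \begin{align}
      (1-\lambda_p)\,\frac{\partial \eta}{\partial t}
          &= -\,\frac{\partial q_b}{\partial x}
          && \text{(Exner sediment conservation)} \\
      q_b &\simeq k\,S = -\,k\,\frac{\partial \eta}{\partial x}
          && \text{(linearized bedload law, bed slope } S\text{)} \\
      \Rightarrow\;
      \frac{\partial \eta}{\partial t}
          &= \nu\,\frac{\partial^2 \eta}{\partial x^2},
          \qquad \nu = \frac{k}{1-\lambda_p}
          && \text{(diffusion equation)}
    \end{align}
    ```

    A point disturbance of the bed then relaxes as a Gaussian, which is exactly the exponential-like response that rate law formulations postulate.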

  1. Nuclear magnetic resonance of J-coupled quadrupolar nuclei: Use of the tensor operator product basis

    NASA Astrophysics Data System (ADS)

    Kemp-Harper, R.; Philp, D. J.; Kuchel, P. W.

    2001-08-01

    In nuclear magnetic resonance (NMR) of I=1/2 nuclei that are scalar coupled to quadrupolar spins, a tensor operator product (TOP) basis set provides a convenient description of the time evolution of the density operator. Expressions for the evolution of equivalent I=1/2 spins, coupled to an arbitrary spin S>1/2, were obtained by explicit algebraic density operator calculations in Mathematica, and specific examples are given for S=1 and S=3/2. Tensor operators are described by the convenient quantum numbers rank and order and this imparts to the TOP basis features that enable an intuitive understanding of NMR behavior of these spin systems. It is shown that evolution as a result of J coupling alone changes the rank of tensors for the coupling partner, generating higher-rank tensors, which allow efficient excitation of S-spin multiple-quantum coherences. Theoretical predictions obtained using the TOP formalism were confirmed using multiple-quantum filtered heteronuclear spin-echo experiments and were further employed to demonstrate polarization transfer directly to multiple-quantum transitions using the insensitive nucleus enhancement by polarization transfer pulse sequence. This latter experiment is the basis of two-dimensional heteronuclear correlation experiments and direct generation of multiple-quantum S-spin coherences can therefore be exploited to yield greater spectral resolution in such experiments. Simulated spectra and experimental results are presented.

  2. Theory of coherent two-photon NMR: Standard-basis operators and coherent averaging

    NASA Astrophysics Data System (ADS)

    Stepišnik, Janez

    1980-05-01

    A theory of two-photon coherent transitions for multilevel spin systems is developed by using coherent averaging of the time-evolution operator and a description of the spins in terms of standard-basis operators. The formalism employed provides a clear picture of the interactions that cause the multi-quantum transitions and makes it possible to evaluate not only two-photon but also multiphoton transitions. The theory has been applied to quadrupole-perturbed spin systems with s = 1 and s = 3/2, for which the effective double-quantum rf field has been evaluated.

  3. A probabilistic risk assessment of the LLNL Plutonium facility's evaluation basis fire operational accident

    SciTech Connect

    Brumburgh, G.

    1994-08-31

    The Lawrence Livermore National Laboratory (LLNL) Plutonium Facility conducts numerous operations involving plutonium, including device fabrication, development of fabrication techniques, metallurgy research, and laser isotope separation. A Safety Analysis Report (SAR) for the Building 332 Plutonium Facility was completed to demonstrate rational safety and acceptable risk to employees, the public, government property, and the environment. This paper outlines the PRA analysis of the Evaluation Basis Fire (EBF) operational accident. The EBF postulates the worst-case programmatic impact event for the Plutonium Facility.

  4. Lessons Learned from Eight Years' Experience of Actual Operation, and Future Prospects of JMA Earthquake Early Warning System

    NASA Astrophysics Data System (ADS)

    Hoshiba, M.; Nishimae, Y.

    2015-12-01

    Since 2007, the Japan Meteorological Agency (JMA) has gained experience in the actual operation of EEW. During this period, we have learned lessons from many M6- and M7-class earthquakes and from the Mw9.0 Tohoku earthquake. During the Mw9.0 Tohoku earthquake, the JMA system functioned well: it issued a warning message more than 15 s before strong ground shaking in the Tohoku district (at relatively short distances from the epicenter). However, it was not perfect: in addition to the problem of the large extent of the fault rupture, some false warning messages were issued because the system was confused by simultaneous multiple aftershocks occurring across the wide rupture area. To address these problems, JMA will introduce two new methods into the operational system this year to begin testing, aiming at practical operation within a couple of years. One is the Integrated Particle Filter (IPF) method, an integrated algorithm of multiple hypocenter determination techniques with Bayesian estimation, in which amplitude information is also used for hypocenter determination. The other is the Propagation of Local Undamped Motion (PLUM) method, in which a warning message is issued when strong ground shaking is detected at stations near the target site (e.g., within 30 km); no hypocenter or magnitude is required in PLUM. Aiming at application several years from now, we are investigating a new approach in which the current wavefield is estimated in real time and the future wavefield is then predicted time-evolutionally from the current situation using the physics of wave propagation. Here, hypocenter and magnitude are not necessarily required, but real-time observation of ground shaking is necessary. JMA also plans to predict long-period ground motion (up to 8 s) with the EEW system for earthquake damage mitigation in high-rise buildings. Testing of this capability will begin on the operational system in the near future.
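    The PLUM rule summarized above is simple enough to sketch directly: warn a target site as soon as any station within a fixed radius observes shaking above a threshold, with no hypocenter or magnitude estimate. The station list, coordinates, radius, and intensity threshold below are hypothetical.

    ```python
    # Minimal PLUM-style nearby-station warning check.
    from math import radians, sin, cos, asin, sqrt

    def km_between(lat1, lon1, lat2, lon2):
        """Haversine distance in km."""
        dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
        a = sin(dlat/2)**2 + cos(radians(lat1))*cos(radians(lat2))*sin(dlon/2)**2
        return 2 * 6371.0 * asin(sqrt(a))

    def plum_warn(target, stations, radius_km=30.0, intensity_threshold=4.5):
        """True if any station within radius already observes strong shaking."""
        return any(
            km_between(target[0], target[1], s["lat"], s["lon"]) <= radius_km
            and s["intensity"] >= intensity_threshold
            for s in stations
        )

    stations = [
        {"lat": 35.68, "lon": 139.77, "intensity": 5.1},   # strong shaking observed
        {"lat": 35.10, "lon": 139.10, "intensity": 2.0},
    ]
    print(plum_warn((35.60, 139.70), stations))  # True: warn without a hypocenter
    ```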

  5. Earthquake alarm; operating the seismograph station at the University of California, Berkeley.

    USGS Publications Warehouse

    Stump, B.

    1980-01-01

    At the University of California seismographic stations, the task of locating and determining magnitudes for both local and distant earthquakes is a continuous one. Teleseisms must be located rapidly so that events that occur in the Pacific can be identified and the Pacific Tsunami Warning System alerted. For great earthquakes anywhere, there is a responsibility to notify public agencies such as the California Office of Emergency Services, the Federal Disaster Assistance Administration, the Earthquake Engineering Research Institute, the California Seismic Safety Commission, and the American Red Cross. In the case of damaging local earthquakes, it is also necessary to alert the California Department of Water Resources, California Division of Mines and Geology, U.S. Army Corps of Engineers, Federal Bureau of Reclamation, and the Bay Area Rapid Transit. These days, any earthquakes that are felt in northern California cause immediate inquiries from the news media and an interested public. The series of earthquakes that jolted the Livermore area from January 24 to 26, 1980, is a good case in point.

  6. Computing single step operators of logic programming in radial basis function neural networks

    NASA Astrophysics Data System (ADS)

    Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong

    2014-07-01

    Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of a logic program is a mapping from ground atoms to false or true. The single-step operator of a logic program is defined as a function Tp: I→I. Logic programming is well suited to building artificial intelligence systems. In this study, we establish a new technique to compute the single-step operators of logic programs in radial basis function neural networks. To do so, we propose a new technique to generate training data sets for single-step operators. The training data sets are used to build the neural networks. We use recurrent radial basis function neural networks to reach the steady state (the fixed point of the operator). To improve the performance of the neural networks, we use the particle swarm optimization algorithm to train them.

  7. Computing single step operators of logic programming in radial basis function neural networks

    SciTech Connect

    Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong

    2014-07-10

    Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of a logic program is a mapping from ground atoms to false or true. The single-step operator of a logic program is defined as a function Tp: I→I. Logic programming is well suited to building artificial intelligence systems. In this study, we establish a new technique to compute the single-step operators of logic programs in radial basis function neural networks. To do so, we propose a new technique to generate training data sets for single-step operators. The training data sets are used to build the neural networks. We use recurrent radial basis function neural networks to reach the steady state (the fixed point of the operator). To improve the performance of the neural networks, we use the particle swarm optimization algorithm to train them.
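    A toy version of this pipeline is sketched below (illustrative throughout: the three-atom program, the 0/1 encoding of interpretations, and the RBF width are choices made here, and linear least squares stands in for the papers' particle swarm training). It tabulates the single-step operator T_P of a small normal program, fits a Gaussian RBF network to it, and iterates the network to a fixed point.

    ```python
    # Fit an RBF network to the single-step operator of a tiny logic program.
    import itertools
    import numpy as np

    atoms = ["a", "b", "c"]
    # Normal program: a <- b, not c ;  b <- c ;  c <- c   (head, positives, negatives)
    clauses = [("a", ["b"], ["c"]), ("b", ["c"], []), ("c", ["c"], [])]

    def tp(interp):
        """Single-step operator: a head is true iff some clause body holds in interp."""
        out = []
        for atom in atoms:
            fired = any(
                head == atom
                and all(interp[atoms.index(p)] for p in pos)
                and not any(interp[atoms.index(n)] for n in neg)
                for head, pos, neg in clauses
            )
            out.append(1.0 if fired else 0.0)
        return out

    # Training data: every interpretation and its image under T_P.
    X = np.array(list(itertools.product([0.0, 1.0], repeat=len(atoms))))
    Y = np.array([tp(x) for x in X])

    def phi(pts, centers, width=0.5):
        """Gaussian RBF layer with one unit centered on each training point."""
        d2 = ((pts[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * width ** 2))

    W, *_ = np.linalg.lstsq(phi(X, X), Y, rcond=None)   # output weights

    # Recurrent use: iterate the learned operator to its fixed point.
    state = np.array([[0.0, 0.0, 1.0]])                 # start from {c}
    for _ in range(10):
        state = (phi(state, X) @ W > 0.5).astype(float)
    print(dict(zip(atoms, state[0])))                   # fixed point: b and c true
    ```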

  8. The investigation of the impacts of major disasters, on the basis of the Van earthquake (October 23, 2011, Turkey), on the profile of the injuries due to occupational accidents.

    PubMed

    Hekimoglu, Yavuz; Dursun, Recep; Karadas, Sevdegul; Asirdizer, Mahmut

    2015-10-01

    The purpose of this study is to identify the impacts of major disasters, on the basis of the Van earthquake (October 23, 2011, Turkey), on the profile of injuries due to occupational accidents. In this study, we evaluated 245 patients injured in occupational accidents who were admitted to emergency services of Van city hospitals during the 1-year periods before and after the earthquake. We determined that there was a 63.4% (P < 0.05) increase in work-related accidents in the post-earthquake period compared to the pre-earthquake period. In addition, in the post-earthquake period, injuries due to occupational accidents increased 211% (P < 0.05) in the construction industry, the rate of injuries due to falls from height increased 168% (P < 0.05), and the rates of trauma to the head and upper limbs increased 200% (P < 0.05) and 130% (P < 0.05), respectively. We determined that the neglect of occupational health and safety measures by employers and employees during the rapid construction activities and post-earthquake restoration works undertaken to remove the effects of the earthquake increased the number of work accidents. In this study, the impact of disasters such as earthquakes on accidents at work was evaluated in a way we have not seen in the literature. This study emphasizes that governments should establish regulations and processes relating to post-disaster work before a disaster emerges, taking into account factors that may increase work-related accidents.

  9. The Mixed Waste Management Facility. Design basis integrated operations plan (Title I design)

    SciTech Connect

    1994-12-01

    The Mixed Waste Management Facility (MWMF) will be a fully integrated, pilot-scale facility for the demonstration of low-level, organic-matrix mixed waste treatment technologies. It will provide the bridge from bench-scale demonstrated technologies to the deployment and operation of full-scale treatment facilities. The MWMF is a key element in reducing the risk in deployment of effective and environmentally acceptable treatment processes for organic mixed-waste streams. The MWMF will provide the engineering test data, formal evaluation, and operating experience that will be required for these demonstration systems to become accepted by EPA and deployable in waste treatment facilities. The deployment will also demonstrate how to approach the permitting process with the regulatory agencies and how to operate and maintain the processes in a safe manner. This document describes, at a high level, how the facility will be designed and operated to achieve this mission. It frequently refers the reader to additional documentation that provides more detail in specific areas. Effective evaluation of a technology consists of a variety of informal and formal demonstrations involving individual technology systems or subsystems, integrated technology system combinations, or complete integrated treatment trains. Informal demonstrations will typically be used to gather general operating information and to establish a basis for development of formal demonstration plans. Formal demonstrations consist of a specific series of tests that are used to rigorously demonstrate the operation or performance of a specific system configuration.

  10. Gas storage project development, operation, and analysis: Basis guidelines for gas storage project development, operation, and operations analysis

    SciTech Connect

    Nowaczewski, S.F.

    1995-09-01

    Reservoir selection matches location, capacity, and deliverability to market demand; gathering, processing, compression, land acquisition, and pipeline connection significantly impact economics. Geologic considerations include field-wide variations in permeability, porosity, and pay thickness. Well deliverability and the number of wells required to meet targeted field deliverability can be estimated from kh or φh; analogous reservoir types can be used to estimate kh and φh ranges for particular fields. Capillary pressure data define pore size distribution and gas-water threshold pressure. Existing well location and log data are essential in mapping subtle stratigraphic relationships. Definitions of field type, trap type, and liquid phases are important to the economics of storage development and operations, since safe high-pressure storage is of greater benefit. Well construction considerations include location, type (vertical/slant/horizontal), and completion type to maximize drainage and deliverability; casing sizing to eliminate frictional pressure loss; and casing cementing for long-term mechanical integrity. Deliverability prediction uses well/gathering-system nodal pressure data. The importance of deliverability maintenance and enhancement is increasing as markets demand ever greater deliverability. By design, a field allows cycling of an expected volume; loss of potential decreases efficiency. Inventory verification relies on well pressure and fluid data, accurate metering, and estimation of losses or leaks and fuel use. Data quality, quantity, and management affect results in all these major areas of storage operations.

  11. Power systems after the Northridge earthquake: Emergency operations and changes in seismic equipment specifications, practice, and system configuration

    SciTech Connect

    Schiff, A.J.; Tognazzini, R.; Ostrom, D.

    1995-12-31

    The Northridge earthquake caused extensive damage to high-voltage substation equipment and, for the first time, the failure of transmission towers. Power was lost to much of the earthquake-impacted area; service was restored to 93% of customers within 24 hours. To restore service, damaged monitoring, communication, and protective equipment, such as current-voltage transformers, wave traps, and lightning arresters, was removed or bypassed and operation restored. To improve performance, some porcelain members in bushings, current-voltage transformers, and lightning arresters are being replaced with composite materials. Interim seismic specifications for equipment have been instituted. Some substations are being reconfigured, and rigid bus and conductors are being replaced with flexible conductors. Non-load-carrying conductors, such as those used on lightning arresters, are being reduced in size to reduce potential interaction problems. Better methods of documenting damage and repair costs are being considered.

  12. Representation of discrete Steklov-Poincare operator arising in domain decomposition methods in wavelet basis

    SciTech Connect

    Jemcov, A.; Matovic, M.D.

    1996-12-31

    This paper examines the sparse representation and preconditioning of a discrete Steklov-Poincare operator which arises in domain decomposition methods. A non-overlapping domain decomposition method is applied to a second-order self-adjoint elliptic operator (Poisson equation) with homogeneous boundary conditions as a model problem. It is shown that the discrete Steklov-Poincare operator allows sparse representation with a bounded condition number in a wavelet basis if the transformation is followed by thresholding and rescaling. These two steps combined enable the effective use of Krylov subspace methods as an iterative solution procedure for the system of linear equations. Finding the solution of an interface problem in domain decomposition methods, known as a Schur complement problem, has been shown to be equivalent to the discrete form of the Steklov-Poincare operator. A common way to obtain the Schur complement matrix is to order the matrix of the discrete differential operator into subdomain node groups and then block-eliminate the subdomain-interior nodes. The result is a dense matrix which corresponds to the interface problem. This is equivalent to reducing the original problem to several smaller differential problems and one boundary integral equation problem for the subdomain interface.
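    In the standard block notation (assumed here; the paper's own symbols are not reproduced), the construction just described reads:

    ```latex
    \begin{equation}
      A =
      \begin{pmatrix}
        A_{II} & A_{I\Gamma} \\
        A_{\Gamma I} & A_{\Gamma\Gamma}
      \end{pmatrix},
      \qquad
      S = A_{\Gamma\Gamma} - A_{\Gamma I}\, A_{II}^{-1}\, A_{I\Gamma},
      \qquad
      S\, u_\Gamma = g .
    \end{equation}
    ```

    The interface matrix S is dense even though A is sparse, which is why a wavelet transform followed by thresholding and rescaling is attractive: it recovers a sparse, well-conditioned approximation to S suitable for Krylov iterations.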

  13. Operant self-administration models for testing the neuropharmacological basis of ethanol consumption in rats.

    PubMed

    June, Harry L; Gilpin, Nicholas W

    2010-04-01

    Operant self-administration procedures are used to assess the neural basis of ethanol-seeking behavior under a wide range of experimental conditions. In general, rats do not spontaneously self-administer ethanol in pharmacologically meaningful amounts. This unit provides a step-by-step guide for training rats to self-administer quantities of ethanol that produce moderate to high blood-alcohol content. Different protocols are used for rats that are genetically heterogeneous versus rats that are selectively bred for high alcohol preference. Also, these protocols have different sets of advantages and disadvantages in terms of the ability to control for caloric intake and taste of solutions in operant testing. Basic self-administration protocols can also be altered to focus on different aspects of the motivational properties of ethanol (for example, those related to dependence). This unit provides multiple protocols that lead to alcohol intake in rats, which can be pharmacologically probed relative to a variety of control conditions.

  14. Real-time earthquake alert system for the greater San Francisco Bay Area: a prototype design to address operational issues

    SciTech Connect

    Harben, P.E.; Jarpe, S.; Hunter, S.

    1996-12-10

    The purpose of the earthquake alert system (EAS) is to outrun the seismic energy released in a large earthquake by using a geographically distributed network of strong motion sensors that telemeter data to a rapid CPU-processing station, which then issues an area-wide warning to a region before strong motion occurs. The warning times involved are short, from 0 to 30 seconds or so; consequently, most responses must be automated. The San Francisco Bay Area is particularly well suited for an EAS because (1) large earthquakes have relatively shallow hypocenters (10- to 20-kilometer depth), giving more favorable ray-path geometries, and hence larger warning times, than deeper earthquakes, and (2) the active faults are few in number and well characterized, which means far fewer geographically distributed strong motion sensors are required (about 50 in this region). An EAS prototype is being implemented in the San Francisco Bay Area. The system consists of four distinct subsystems: (1) a distributed strong motion seismic network, (2) a central processing station, (3) a warning communications system, and (4) user receiver and response systems. We have designed a simple, reliable, and inexpensive strong motion monitoring station that consists of a three-component Analog Devices ADXL05 accelerometer sensing unit, a vertical-component weak motion sensor for system testing, a 16-bit digitizer with multiplexing, and communication output ports for RS232 modem or radio telemetry. The unit is battery-powered and will be sited in fire stations. The prototype central computer analysis system consists of a PC data-acquisition platform that pipes the incoming strong motion data via Ethernet to Unix-based workstations for data processing. Simple real-time algorithms, particularly for magnitude estimation, are implemented to give estimates of the time since the earthquake's onset, its hypocenter location, its magnitude, and the reliability of the estimate. These parameters are calculated and transmitted
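    The warning-time budget that motivates this design can be illustrated with round numbers (the velocities, latency, and distances below are assumptions, not the prototype's measured values): the warning time at a site is the S-wave arrival there minus the P-wave detection time at the nearest sensor plus processing and telemetry delay.

    ```python
    # Rough warning-time budget for a distributed early-warning network.
    VP, VS = 6.0, 3.5   # km/s, generic crustal P- and S-wave speeds

    def warning_time(hypo_to_sensor_km, hypo_to_site_km, processing_s=3.0):
        detect = hypo_to_sensor_km / VP + processing_s   # first P pick + CPU/telemetry
        s_arrival = hypo_to_site_km / VS                 # damaging shaking at the site
        return s_arrival - detect

    # Event detected by a sensor 12 km from the hypocenter, site 60 km away:
    print(f"{warning_time(12, 60):.1f} s of warning")    # ~12 s under these numbers
    ```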

  15. GLASS 2.0: An Operational, Multimodal, Bayesian Earthquake Data Association Engine

    NASA Astrophysics Data System (ADS)

    Benz, H.; Johnson, C. E.; Patton, J. M.; McMahon, N. D.; Earle, P. S.

    2015-12-01

    The legacy approach to automated detection and determination of hypocenters relies on arrival-time stacking algorithms. Examples of such algorithms are the associator Binder, which has been in continuous use in many USGS-supported regional seismic networks since the 1980s, and its spherical-earth successor, GLASS 1.0, in service at the USGS National Earthquake Information Center for over 10 years. The principal shortcomings of the legacy approach are that 1) it can only use phase arrival times, 2) it does not adequately address the problem of extreme variations in station density worldwide, 3) it cannot incorporate multiple phase models or statistical attributes of phases with distance, and 4) it cannot incorporate noise-model attributes of individual stations. Previously we introduced a theoretical framework for a new associator using a Bayesian kernel stacking approach to approximate a joint probability density function for hypocenter localization. More recently we added station- and phase-specific Bayesian constraints to the association process. GLASS 2.0 incorporates a multiplicity of earthquake-related data, including phase arrival times, back-azimuth and slowness information from array beamforming, arrival times from waveform cross-correlation processing, and geographic constraints from real-time social media reports of ground shaking. We demonstrate its application by modeling an aftershock sequence using dozens of stations that recorded tens of thousands of earthquakes over a period of one month. We also demonstrate GLASS 2.0 performance regionally and teleseismically using the globally distributed real-time monitoring system at the NEIC.
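    The kernel-stacking idea can be sketched in one dimension: each pick votes, via a Gaussian kernel on its travel-time residual, for every trial hypocenter and origin time, and the maximum of the stack is the association. The uniform velocity, line geometry, and pick uncertainty below are toy assumptions; GLASS 2.0 itself handles multiple phases, beams, and station noise models.

    ```python
    # 1-D Bayesian kernel stacking for phase association / localization.
    import numpy as np

    V = 6.0                                         # km/s, assumed uniform velocity
    stations = np.array([0.0, 40.0, 90.0, 150.0])   # station positions on a line, km
    true_x, true_t = 70.0, 10.0                     # hidden event (for generating picks)
    picks = np.abs(stations - true_x) / V + true_t  # noise-free arrival times, s

    xs = np.linspace(0.0, 200.0, 401)               # trial locations
    ts = np.linspace(0.0, 20.0, 201)                # trial origin times
    X, T = np.meshgrid(xs, ts, indexing="ij")

    sigma = 0.5                                     # assumed pick uncertainty, s
    stack = np.zeros_like(X)
    for sta, t_obs in zip(stations, picks):
        residual = t_obs - (np.abs(sta - X) / V + T)         # observed - predicted
        stack += np.exp(-0.5 * (residual / sigma) ** 2)      # Gaussian kernel vote

    i, j = np.unravel_index(stack.argmax(), stack.shape)
    print(f"best hypocenter: x = {xs[i]:.1f} km, t0 = {ts[j]:.1f} s")  # 70.0 km, 10.0 s
    ```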

  16. An approach to estimating radiological risk of offsite release from a design basis earthquake for the Process Experimental Pilot Plant (PREPP)

    SciTech Connect

    Lucero, V.; Meale, B.M.; Reny, D.A.; Brown, A.N.

    1990-09-01

    In compliance with Department of Energy (DOE) Order 6430.1A, a seismic analysis was performed on DOE's Process Experimental Pilot Plant (PREPP), a facility for processing low-level and transuranic (TRU) waste. Because no hazard curves were available for the Idaho National Engineering Laboratory (INEL), DOE guidelines were used to estimate the frequency of the specified design-basis earthquake (DBE). A dynamic structural analysis of the building was performed using the DBE parameters, followed by a probabilistic risk assessment (PRA). For the PRA, a functional organization of the facility equipment was effected so that top events for a representative event tree model could be determined. Building response spectra (calculated from the structural analysis), in conjunction with generic fragility data, were used to generate fragility curves for the PREPP equipment. Using these curves, failure probabilities for each top event were calculated. These probabilities were integrated into the event tree model, and accident sequences and their respective probabilities were calculated through quantification. By combining the sequence failure probabilities with a transport analysis of the estimated airborne source term from a DBE, onsite and offsite consequences were calculated. The results of the comprehensive analysis substantiated the ability of the PREPP facility to withstand a DBE with negligible consequence (i.e., the estimated release was within personnel and environmental dose guidelines). 57 refs., 19 figs., 20 tabs.

  17. 29 CFR 780.407 - System must be nonprofit or operated on a share-crop basis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 29 (Labor), Wage and Hour Division — Fair Labor Standards Act, Requirements Under Section 13(b)(12), The Irrigation Exemption. Section 780.407, System must be nonprofit or operated on a share-crop basis: ... on facilities of any irrigation system unless the ditches, canals, reservoirs, or waterways ... Revised as of July 1, 2010.

  18. Tracking Earthquake Cascades

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2011-12-01

    In assessing their risk to society, earthquakes are best characterized as cascades that can propagate from the natural environment into the socio-economic (built) environment. Strong earthquakes rarely occur as isolated events; they usually cluster in foreshock-mainshock-aftershock sequences, seismic swarms, and extended sequences of large earthquakes that propagate along major fault systems. These cascades are regulated by stress-mediated interactions among faults driven by tectonic loading. Within these cascades, each large event can itself cause a chain reaction in which the primary effects of faulting and ground shaking induce secondary effects, including tsunamis, landslides, and liquefaction, and set off destructive processes within the built environment, such as fires and radiation leakage from nuclear plants. Recent earthquakes have demonstrated how the socio-economic effects of large earthquakes can reverberate for many years. To reduce earthquake risk and improve the resiliency of communities to earthquake damage, society depends on five geotechnologies for tracking earthquake cascades: long-term probabilistic seismic hazard analysis (PSHA), short-term (operational) earthquake forecasting, earthquake early warning, tsunami warning, and the rapid production of post-event information for response and recovery. In this presentation, I describe how recent advances in earthquake system science are leading to improvements in this geotechnology pipeline. In particular, I will highlight the role of earthquake simulations in predicting strong ground motions and their secondary effects before and during earthquake cascades

  19. A chemical basis for the partitioning of radionuclides in incinerator operation

    SciTech Connect

    Burger, L.L.

    1995-01-01

    Incineration as a method of treating radioactive or mixed waste is attractive because of volume reduction, but it may result in high concentrations of some hazardous components. For safety reasons during operation, and because of the environmental impact of the plant, it is important to know how these materials partition between the furnace slag, the fly ash, and the stack emission. The chemistry of about 50 elements is discussed, and through consideration of high-temperature thermodynamic equilibria, an attempt is made to provide a basis for predicting how various radionuclides and heavy metals behave in a typical incinerator. The chemistry of the individual elements is first considered, and a prediction of the most stable chemical species in the typical incinerator atmosphere is made. The treatment emphasizes volatility, and the parameters considered are temperature; acidity; oxygen, sulfur, and halogen content; and the presence of several other key non-radioactive elements. A computer model is used to calculate equilibrium concentrations of many species in several systems at temperatures ranging from 500 to 1600 K. It is suggested that deliberate addition of various feed chemicals can have a major impact on the fate of many radionuclides and heavy metals. Several problems concerning limitations and application of the data are considered.

  20. The power of simplification: Operator interface with the AP1000® during design-basis and beyond design-basis events

    SciTech Connect

    Williams, M. G.; Mouser, M. R.; Simon, J. B.

    2012-07-01

    The AP1000® plant is an 1100-MWe pressurized water reactor with passive safety features and extensive plant simplifications that enhance construction, operation, maintenance, safety and cost. The passive safety features are designed to function without safety-grade support systems such as component cooling water, service water, compressed air or HVAC. The AP1000 passive safety features achieve and maintain safe shutdown in case of a design-basis accident for 72 hours without need for operator action, meeting the expectations provided in the European Utility Requirements and the Utility Requirement Document for passive plants. Limited operator actions may be required to maintain safe conditions in the spent fuel pool (SFP) via passive means. This safety approach therefore minimizes the reliance on operator action for accident mitigation, and this paper examines the operator interaction with the Human-System Interface (HSI) as the severity of an accident increases from an anticipated transient to a design basis accident and finally, to a beyond-design-basis event. The AP1000 Control Room design provides an extremely effective environment for addressing the first 72 hours of design-basis events and transients, providing ease of information dissemination and minimal reliance upon operator actions. Symptom-based procedures including Emergency Operating Procedures (EOPs), Abnormal Operating Procedures (AOPs) and Alarm Response Procedures (ARPs) are used to mitigate design basis transients and accidents. Use of the Computerized Procedure System (CPS) aids the operators during mitigation of the event. The CPS provides cues and direction to the operators as the event progresses. If the event becomes progressively worse or lasts longer than 72 hours, and depending upon the nature of failures that may have occurred, minimal operator actions may be required outside of the control room in areas that have been designed to be accessible using components that have been

  1. The Electronic Encyclopedia of Earthquakes

    NASA Astrophysics Data System (ADS)

    Benthien, M.; Marquis, J.; Jordan, T.

    2003-12-01

    The Electronic Encyclopedia of Earthquakes is a collaborative project of the Southern California Earthquake Center (SCEC), the Consortia of Universities for Research in Earthquake Engineering (CUREE) and the Incorporated Research Institutions for Seismology (IRIS). This digital library organizes earthquake information online as a partner with the NSF-funded National Science, Technology, Engineering and Mathematics (STEM) Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth Science, engineering, physics and mathematics. Although conceived primarily as an educational resource, the Encyclopedia is also a valuable portal to anyone seeking up-to-date earthquake information and authoritative technical sources. "E3" is a unique collaboration among earthquake scientists and engineers to articulate and document a common knowledge base with a shared terminology and conceptual framework. It is a platform for cross-training scientists and engineers in these complementary fields and will provide a basis for sustained communication and resource-building between major education and outreach activities. For example, the E3 collaborating organizations have leadership roles in the two largest earthquake engineering and earth science projects ever sponsored by NSF: the George E. Brown Network for Earthquake Engineering Simulation (CUREE) and the EarthScope Project (IRIS and SCEC). The E3 vocabulary and definitions are also being connected to a formal ontology under development by the SCEC/ITR project for knowledge management within the SCEC Collaboratory. The E3 development system is now fully operational, 165 entries are in the pipeline, and the development teams are capable of producing 20 new, fully reviewed encyclopedia entries each month. Over the next two years teams will

  2. Transient Fluid Flow Along Basement Faults and Rupture Mechanics: Can We Expect Injection-Induced Earthquake Behavior to Correspond Directly With Injection Operations?

    NASA Astrophysics Data System (ADS)

    Norbeck, J. H.; Horne, R. N.

    2015-12-01

    We explored injection-induced earthquake behavior in geologic settings where basement faults are connected hydraulically to overlying saline aquifers targeted for wastewater disposal. Understanding how the interaction between natural geology and injection well operations affects the behavior of injection-induced earthquake sequences has important implications for characterizing seismic hazard. Numerical experiments were performed to investigate the extent to which seismicity is influenced by the migration of pressure perturbations along fault zones. Two distinct behaviors were observed: a) earthquake ruptures that were confined to the pressurized region of the fault and b) sustained earthquake ruptures that propagated far beyond the pressure front. These two faulting mechanisms have important implications for assessing the manner in which seismicity can be expected to respond to injection well operations. Based upon observations from the numerical experiments, we developed a criterion that can be used to classify the expected faulting behavior near wastewater disposal sites. The faulting criterion depends on the state of stress, the initial fluid pressure, the orientation of the fault, and the dynamic friction coefficient of the fault. If the initial ratio of shear to effective normal stress resolved on the fault (the prestress ratio) is less than the fault's dynamic friction coefficient, then earthquake ruptures will tend to be limited by the distance of the pressure front. In this case, parameters that affect seismic hazard assessment, like the maximum earthquake magnitude or earthquake recurrence interval, could correlate with injection well operational parameters. For example, the maximum earthquake magnitude might be expected to grow over time in a systematic manner as larger patches of the fault are exposed to significant pressure changes. In contrast, if the prestress ratio is greater than dynamic friction, a stress drop can occur outside of the pressurized
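    The classification criterion described above is easy to state in executable form; the stress values below are placeholders chosen only to land on either side of an assumed dynamic friction coefficient of 0.6.

    ```python
    # Prestress-ratio classification of expected injection-induced faulting.
    def faulting_regime(tau, sigma_n, p0, mu_d):
        """Compare tau / (sigma_n - p0) against dynamic friction mu_d."""
        prestress = tau / (sigma_n - p0)
        return "pressure-confined rupture" if prestress < mu_d else "runaway rupture"

    # Hypothetical basement fault, stresses in MPa:
    print(faulting_regime(tau=28.0, sigma_n=80.0, p0=30.0, mu_d=0.6))  # 0.56 -> confined
    print(faulting_regime(tau=33.0, sigma_n=80.0, p0=30.0, mu_d=0.6))  # 0.66 -> runaway
    ```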

  3. [Management of the operating room at the time of emergency outbreak--the experience of the 2011 Off the Pacific Coast of Tohoku Earthquake].

    PubMed

    Isosu, Tsuyoshi; Murakawa, Masahiro

    2012-03-01

    This article introduces the operating room disaster manual of our hospital. When "The 2011 Off the Pacific Coast of Tohoku Earthquake" (magnitude 9) occurred, nine operations were being performed in our hospital. Among these, general or regional anesthesia had been induced in eight cases, and in one case the patient was just leaving the operating room. General anesthesia was stopped in six cases. Our manual specifies that all operations should be stopped and then finished immediately if possible. No patient was injured in our hospital. This was the first time we had experienced such a large-scale earthquake. Close cooperation among anesthesiologists, surgeons, and the other co-medical staff appears to be very important in managing such an unusual situation.

  4. Martin Marietta Energy Systems, Inc. comprehensive earthquake management plan: Emergency Operations Center training manual

    SciTech Connect

    Not Available

    1990-02-28

    The objectives of this training are to describe the responsibilities, resources, and goals of the Emergency Operations Center and to enable staff to evaluate and interpret this information so as to best direct and allocate emergency, plant, and other resources to protect life and the Paducah Gaseous Diffusion Plant.

  5. Darwin's earthquake.

    PubMed

    Lee, Richard V

    2010-07-01

    Charles Darwin experienced a major earthquake in the Concepción-Valdivia region of Chile 175 years ago, in February 1835. His observations dramatically illustrated the geologic principles of James Hutton and Charles Lyell which maintained that the surface of the earth was subject to alterations by natural events, such as earthquakes, volcanoes, and the erosive action of wind and water, operating over very long periods of time. Changes in the land created new environments and fostered adaptations in life forms that could lead to the formation of new species. Without the demonstration of the accumulation of multiple crustal events over time in Chile, the biologic implications of the specific species of birds and tortoises found in the Galapagos Islands and the formulation of the concept of natural selection might have remained dormant.

  6. Earthquakes and Schools

    ERIC Educational Resources Information Center

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  7. Duration and predictors of emergency surgical operations - basis for medical management of mass casualty incidents

    PubMed Central

    2009-01-01

    Background: Hospitals have a critically important role in the management of mass casualty incidents (MCIs), yet there is little information to assist emergency planners. A significantly limiting factor of a hospital's capability to treat those affected is its surgical capacity. We therefore intended to provide data about the duration and predictors of life-saving operations. Methods: The data of 20,815 predominantly blunt trauma patients recorded in the Trauma Registry of the German Trauma Society were retrospectively analyzed to calculate the duration of life-saving operations as well as their predictors. Inclusion criteria were an ISS ≥ 16 and the performance of relevant ICPM-coded procedures within 6 h of admission. Results: Among 1,228 patients fulfilling the inclusion criteria, 1,793 operations were identified as life-saving operations. Acute injuries to the abdomen accounted for 54.1%, followed by head injuries (26.3%), pelvic injuries (11.5%), thoracic injuries (5.0%), and major amputations (3.1%). The mean cut-to-suture time was 130 min (IQR 65-165 min). Logistic regression revealed 8 variables associated with an emergency operation: abdominal AIS ≥ 3 (OR 4.00), ISS ≥ 35 (OR 2.94), hemoglobin level ≤ 8 mg/dL (OR 1.40), pulse rate on hospital admission < 40 or > 120/min (OR 1.39), blood pressure on hospital admission < 90 mmHg (OR 1.35), prehospital infusion volume ≥ 2000 ml (OR 1.34), GCS ≤ 8 (OR 1.32), and anisocoria (OR 1.28) on-scene. Conclusions: The mean operation time of 130 min calculated for emergency life-saving surgical operations provides a realistic guideline for the prospective treatment capacity, which can be estimated and projected into an actual incident admission capacity. Knowledge of predictive factors for life-saving emergency operations helps to identify those patients who need the most urgent operative treatment in case of a blunt-trauma MCI. PMID:20149987
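    As a worked example of turning the 130-minute mean into a planning number (the OR count and response window below are hypothetical, and turnover time is omitted for simplicity):

    ```python
    # Projecting mean operation duration into incident surgical capacity.
    mean_op_min   = 130     # mean cut-to-suture time from the registry analysis
    ors_available = 6       # hypothetical emergency-capable operating rooms
    window_h      = 12      # hypothetical initial response window, hours

    ops_per_or = (window_h * 60) // mean_op_min
    print(f"~{ops_per_or} life-saving operations per OR, "
          f"{ors_available * ops_per_or} total in {window_h} h")
    ```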

  8. Draft Function Allocation Framework and Preliminary Technical Basis for Advanced SMR Concepts of Operations

    SciTech Connect

    Jacques Hugo; John Forester; David Gertman; Jeffrey Joe; Heather Medema; Julius Persensky; April Whaley

    2013-04-01

    This report presents preliminary research results from the investigation into the development of new models and guidance for concepts of operations (ConOps) in advanced small modular reactor (aSMR) designs. In support of this objective, three important research areas were included: operating principles of multi-modular plants; functional allocation models and strategies that would affect the development of new, non-traditional concepts of operations; and the requirements for human performance, based upon work domain analysis and current regulatory requirements. As part of the approach for this report, we outline potential functions, including the theoretical and operational foundations for the development of a new functional allocation model, and identify specific regulatory requirements that will influence the development of future concepts of operations. The report also highlights changes in research strategy prompted by confirmation of the importance of applying the work domain analysis methodology to a reference aSMR design. It is described how this methodology will enrich the findings from this phase of the project in subsequent phases and help identify metrics and focused studies for the determination of human performance criteria that can be used to support the design process.

  9. Experience in Construction and Operation of the Distributed Information Systems on the Basis of the Z39.50 Protocol

    NASA Astrophysics Data System (ADS)

    Zhizhimov, Oleg; Mazov, Nikolay; Skibin, Sergey

    Questions concerning the construction and operation of distributed information systems based on the ANSI/NISO Z39.50 Information Retrieval Protocol are discussed in the paper, which draws on the authors' experience in developing the ZooPARK server. The architecture of distributed information systems, the reliability of such systems, the minimization of search time, and administration are examined. Problems encountered in developing distributed information systems are also described.

  10. Draft Function Allocation Framework and Preliminary Technical Basis for Advanced SMR Concepts of Operations

    SciTech Connect

    Jacques Hugo; John Forester; David Gertman; Jeffrey Joe; Heather Medema; Julius Persensky; April Whaley

    2013-08-01

    This report presents preliminary research results from the investigation into the development of new models and guidance for Concepts of Operations in advanced small modular reactor (AdvSMR) designs. AdvSMRs are nuclear power plants (NPPs), but unlike conventional large NPPs that are constructed on site, AdvSMR systems and components will be fabricated in a factory and then assembled on site. AdvSMRs will also use advanced digital instrumentation and control systems, and make greater use of automation. Some AdvSMR designs also propose to be operated in a multi-unit configuration with a single central control room as a way to be more cost-competitive with existing NPPs. These differences from conventional NPPs not only pose technical and operational challenges, but they will undoubtedly also have regulatory compliance implications, especially with respect to staffing requirements and safety standards.

  11. A chemical basis for the partitioning of radionuclides in incinerator operation

    SciTech Connect

    Burger, L.L.

    1994-09-01

    For waste containing small amounts of radioactivity, rad waste (RW), or mixed waste (MW) containing both radioactive and chemically hazardous components, incineration is a logical management candidate because of its inherent safety, waste volume reduction, and low cost. Successful operation requires that the facility be properly designed and operated to protect workers and to limit releases of hazardous materials. The large decrease in waste volume achieved by incineration also results in a higher concentration of most of the radionuclides and nonradioactive heavy metals in the ash products. These concentrations impact subsequent treatment and disposal. The various constituents (chemical elements) are not equal in concentration in the various incinerator feed materials, nor are they equal in their contribution to health risks on subsequent handling or accidental release. Thus, for management of the wastes it is important to be able to predict how the nuclides partition between the primary combustion residue, which may be an ash or a fused slag; the fine particulates or fly ash that is trapped in the burner off-gas by several different techniques; and the airborne fraction that escapes to the atmosphere. The objective of this report is to provide an estimate of how different elements of concern may behave in the chemical environment of the incinerator. The study briefly examines published incinerator operation data, then considers the properties of the elements of concern, and employs thermodynamic calculations to help predict the fate of these RW and MW constituents. Many types and configurations of incinerators have been designed and tested.

  12. A matrix representation of the translation operator with respect to a basis set of exponentially declining functions

    NASA Astrophysics Data System (ADS)

    Filter, Eckhard; Steinborn, E. Otto

    1980-12-01

    The matrix elements of the translation operator with respect to a complete orthonormal basis set of the Hilbert space L²(R³) are given in closed form as functions of the displacement vector. The basis functions are composed of an exponential, a Laguerre polynomial, and a regular solid spherical harmonic. With this formalism, a function which is defined with respect to a certain origin can be "shifted," i.e., expressed in terms of given functions which are defined with respect to another origin. In this paper we also demonstrate the feasibility of this method by applying it to problems that are of special interest in the theory of the electronic structure of molecules and solids. We present new one-center expansions for some exponential-type functions (ETFs), and a closed-form expression for a multicenter integral over ETFs is given and numerically tested.
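
    In schematic form (our notation, not the authors'), the expansion described above reads:

      % f expanded about one origin, then re-expressed about a shifted origin
      % via the matrix elements of the translation operator T(a).
      \begin{aligned}
        f(\mathbf{r}) &= \sum_m d_m\, \varphi_m(\mathbf{r}), \qquad
        \bigl(\hat{T}(\mathbf{a}) f\bigr)(\mathbf{r}) = f(\mathbf{r}-\mathbf{a})
          = \sum_n c_n(\mathbf{a})\, \varphi_n(\mathbf{r}), \\
        c_n(\mathbf{a}) &= \sum_m \bigl\langle \varphi_n \,\big|\, \hat{T}(\mathbf{a})\, \varphi_m \bigr\rangle\, d_m,
      \end{aligned}

    so it is the closed-form matrix elements of the translation operator that permit a function defined about one center to be re-expanded about another.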

  13. Waste Encapsulation and Storage Facility (WESF) Basis for Interim Operation (BIO)

    SciTech Connect

    COVEY, L.I.

    2000-11-28

    The Waste Encapsulation and Storage Facility (WESF) is located in the 200 East Area adjacent to B Plant on the Hanford Site north of Richland, Washington. The current WESF mission is to receive and store the cesium and strontium capsules that were manufactured at WESF in a safe manner and in compliance with all applicable rules and regulations. The scope of WESF operations is currently limited to receipt, inspection, decontamination, storage, and surveillance of capsules in addition to facility maintenance activities. The capsules are expected to be stored at WESF until the year 2017, at which time they will have been transferred for ultimate disposition. The WESF facility was designed and constructed to process, encapsulate, and store the extracted long-lived radionuclides, ⁹⁰Sr and ¹³⁷Cs, from wastes generated during the chemical processing of defense fuel on the Hanford Site thus ensuring isolation of hazardous radioisotopes from the environment. The construction of WESF started in 1971 and was completed in 1973. Some of the ¹³⁷Cs capsules were leased by private irradiators or transferred to other programs. All leased capsules have been returned to WESF. Capsules transferred to other programs will not be returned except for the seven powder and pellet Type W overpacks already stored at WESF.

  14. Modeling of the Reactor Core Isolation Cooling Response to Beyond Design Basis Operations - Interim Report

    SciTech Connect

    Ross, Kyle; Cardoni, Jeffrey N.; Wilson, Chisom Shawn; Morrow, Charles; Osborn, Douglas; Gauntt, Randall O.

    2015-12-01

    Efforts are being pursued to develop and qualify a system-level model of a reactor core isolation cooling (RCIC) steam-turbine-driven pump. The model is being developed with the intent of employing it to inform the design of experimental configurations for full-scale RCIC testing. The model is expected to be especially valuable in sizing equipment needed in the testing. An additional intent is to use the model to understand more fully how RCIC apparently managed to operate far removed from its design envelope in the Fukushima Daiichi Unit 2 accident. RCIC modeling is proceeding along two avenues that are expected to complement each other well. The first avenue is the continued development of the system-level RCIC model that will serve in simulating a full reactor system or full experimental configuration of which a RCIC system is part. The model reasonably represents a RCIC system today, especially given design operating conditions, but lacks specifics that are likely important in representing the off-design conditions a RCIC system might experience in an emergency situation such as a loss of all electrical power. One specific known to be lacking in the system model, for example, is the efficiency with which a flashing slug of water (as opposed to a concentrated jet of steam) could propel the rotating drive wheel of a RCIC turbine. To address this specific, the second avenue is being pursued, wherein computational fluid dynamics (CFD) analyses of such a jet are being carried out. The results of the CFD analyses will thus complement and inform the system modeling. The system modeling will, in turn, complement the CFD analysis by providing the system information needed to impose appropriate boundary conditions on the CFD simulations. The system model will be used to inform the selection of configurations and equipment best suited to supporting planned RCIC experimental testing. Preliminary investigations with the RCIC model indicate that liquid water ingestion by the turbine

  15. Turkish Compulsory Earthquake Insurance (TCIP)

    NASA Astrophysics Data System (ADS)

    Erdik, M.; Durukal, E.; Sesetyan, K.

    2009-04-01

    Through a World Bank project, a government-sponsored Turkish Catastrophe Insurance Pool (TCIP) was created in 2000 with the essential aim of transferring the government's financial burden of replacing earthquake-damaged housing to international reinsurance and capital markets. Providing coverage to about 2.9 million homeowners, TCIP is the largest insurance program in the country, with about 0.5 billion USD in its own reserves and about 2.3 billion USD in total claims-paying capacity. The total payment for earthquake damage since 2000 (mostly small; 226 earthquakes) amounts to about 13 million USD. The country-wide penetration rate is about 22%, highest in the Marmara region (30%) and lowest in southeast Turkey (9%). TCIP is the sole-source provider of earthquake loss coverage up to 90,000 USD per house. The annual premium, categorized on the basis of earthquake zone and type of structure, is about US$90 for a 100 square meter reinforced concrete building in the most hazardous zone, with a 2% deductible. The earthquake engineering related shortcomings of the TCIP are exemplified by the fact that the average rate of 0.13% (for reinforced concrete buildings) with only a 2% deductible is rather low compared to countries with similar earthquake exposure. From an earthquake engineering point of view, the risk underwriting (typification of housing units to be insured, earthquake intensity zonation and the sum insured) of the TCIP needs to be overhauled. Especially for large cities, models can be developed whereby the expected earthquake performance of a housing unit (and consequently its insurance premium) can be assessed on the basis of the location of the unit (microzoned earthquake hazard) and basic structural attributes (earthquake vulnerability relationships). With such an approach, the TCIP can in the future contribute to the control of construction through differentiation of premia on the basis of earthquake vulnerability.

  16. Stress and fluid-pressure changes associated with oil-field operations: A critical assessment of effects in the focal region of the earthquake

    SciTech Connect

    Segall, P.; Yerkes, R.F.

    1990-01-01

    The proximity of the May 2 earthquake to the active oil fields on Anticline Ridge has led to speculation that this earthquake might have been triggered by oil-field operations. Elsewhere, earthquakes have been associated with pore-pressure increases resulting from fluid injection and with subsidence resulting from fluid extraction. Simple calculations show that shale units, which underlie the oil-producing strata, hydraulically isolate the oil field from the earthquake focal region. The large volumes of fluid extracted from the oil fields caused a 50% decline in reservoir pressures from 1938 to 1983. These observations independently rule out substantial increases in pore pressure at focal depths due to fluid injection. The authors use a theoretical method, based on Biot's constitutive theory for fluid-infiltrated elastic media, to evaluate the change in stresses acting in the focal region resulting from fluid extraction in the overlying oil fields. As an independent check on this method, the subsidence of the Earth's surface in response to fluid withdrawal is calculated and compared with measured elevation changes on Anticline Ridge. The producing horizons are taken to be horizontal permeable layers, bounded above and below by impermeable horizons. Strains within the producing layers are related to extraction-induced changes in pore-fluid mass. Contraction of the producing layers causes the free surface to subside and strains the elastic surroundings. The calculated subsidence rate of Anticline Ridge between 1933 and 1972 is 3 mm/yr, in good agreement with the measured subsidence rate of 3.3 ± 0.7 mm/yr.

  17. Earthquake Facts

    MedlinePlus

    ... May 22, 1960. The earliest reported earthquake in California was felt in 1769 by the exploring expedition ... by wind or tides. Each year the southern California area has about 10,000 earthquakes . Most of ...

  18. Forecasting Earthquakes

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This video includes scenes of damage from the Northridge earthquake and interviews with Dr. Andrea Donnellan, a geophysicist at JPL, and Dr. Jim Dolan, an earthquake geologist from Caltech. The interviews discuss earthquake forecasting by tracking changes in the earth's crust using antennas that receive signals from the series of satellites called the Global Positioning System (GPS).

  19. Nowcasting earthquakes

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Turcotte, D. L.; Donnellan, A.; Grant Ludwig, L.; Luginbuhl, M.; Gong, G.

    2016-11-01

    Nowcasting is a term originating from economics and finance. It refers to the process of determining the uncertain state of the economy or markets at the current time by indirect means. We apply this idea to seismically active regions, where the goal is to determine the current state of the fault system and its current level of progress through the earthquake cycle. In our implementation of this idea, we use the global catalog of earthquakes, relying on "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. Our method does not involve any model other than the idea of an earthquake cycle. Rather, we define a specific region and a specific large earthquake magnitude of interest, ensuring that we have enough data to span at least 20 or more large earthquake cycles in the region. We then compute the earthquake potential score (EPS), which is defined as the cumulative probability distribution P(n < n(t)) for the current count n(t) of the small earthquakes in the region. From the count of small earthquakes since the last large earthquake, we determine the value of EPS = P(n < n(t)). EPS is therefore the current level of hazard and assigns a number between 0% and 100% to every region so defined, thus providing a unique measure. Physically, the EPS corresponds to an estimate of the level of progress through the earthquake cycle in the defined region at the current time.
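
    A minimal sketch of the counting procedure as we read it (variable names and the catalog interface are our assumptions):

      import numpy as np

      def earthquake_potential_score(mags_in_time_order, m_small, m_large, current_count):
          """EPS = P(n < n(t)): the fraction of past large-earthquake cycles in
          the region whose small-earthquake count stayed below the current
          count n(t) observed since the last large event."""
          cycle_counts, n = [], 0
          for m in mags_in_time_order:
              if m >= m_large:            # a large event closes a cycle
                  cycle_counts.append(n)
                  n = 0
              elif m >= m_small:          # small events accumulate within a cycle
                  n += 1
          counts = np.asarray(cycle_counts)
          if counts.size < 20:
              raise ValueError("need at least ~20 large-earthquake cycles")
          return 100.0 * np.mean(counts < current_count)  # percent, 0-100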

  20. Hidden Earthquakes.

    ERIC Educational Resources Information Center

    Stein, Ross S.; Yeats, Robert S.

    1989-01-01

    Points out that large earthquakes can take place not only on faults that cut the earth's surface but also on blind faults under folded terrain. Describes four examples of fold earthquakes. Discusses the fold earthquakes using several diagrams and pictures. (YP)

  1. PoroTomo Project - Subatask 6.2: Deploy and Operate DAS and DTS arrays - DAS Earthquake Data

    SciTech Connect

    Kurt Feigl

    2016-03-21

    The submitted data correspond to the vibration caused by an M 4.3 earthquake, captured by the DAS horizontal and vertical arrays during the PoroTomo experiment. Earthquake information: M 4.3 - 23 km ESE of Hawthorne, Nevada. Time: 2016-03-21 07:37:10 (UTC). Location: 38.479°N 118.366°W. Depth: 9.9 km. Files for the horizontal DAS array (each file is 30 s long and contains 8,700 channels): PoroTomo_iDAS16043_160321073721.sgy, PoroTomo_iDAS16043_160321073751.sgy. Files for the vertical DAS array (each file is 30 s long and contains 380 channels): PoroTomo_iDAS025_160321073717.sgy, PoroTomo_iDAS025_160321073747.sgy.

  2. The parkfield, california, earthquake prediction experiment.

    PubMed

    Bakun, W H; Lindh, A G

    1985-08-16

    Five moderate (magnitude 6) earthquakes with similar features have occurred on the Parkfield section of the San Andreas fault in central California since 1857. The next moderate Parkfield earthquake is expected to occur before 1993. The Parkfield prediction experiment is designed to monitor the details of the final stages of the earthquake preparation process; observations and reports of seismicity and aseismic slip associated with the last moderate Parkfield earthquake in 1966 constitute much of the basis of the design of the experiment.

  3. Earthquake impact scale

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Bausch, D.

    2011-01-01

    With the advent of the USGS prompt assessment of global earthquakes for response (PAGER) system, which rapidly assesses earthquake impacts, U.S. and international earthquake responders are reconsidering their automatic alert and activation levels and response procedures. To help facilitate rapid and appropriate earthquake response, an Earthquake Impact Scale (EIS) is proposed on the basis of two complementary criteria. On the basis of the estimated cost of damage, one is most suitable for domestic events; the other, on the basis of estimated ranges of fatalities, is generally more appropriate for global events, particularly in developing countries. Simple thresholds, derived from the systematic analysis of past earthquake impact and associated response levels, are quite effective in communicating predicted impact and response needed after an event through alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1,000, respectively. For damage impact, yellow, orange, and red thresholds are triggered by estimated losses reaching $1M, $100M, and $1B, respectively. The rationale for a dual approach to earthquake alerting stems from the recognition that relatively high fatalities, injuries, and homelessness predominate in countries in which local building practices typically lend themselves to high collapse and casualty rates, and these impacts lead to prioritization for international response. In contrast, financial and overall societal impacts often trigger the level of response in regions or countries in which prevalent earthquake-resistant construction practices greatly reduce building collapse and resulting fatalities. Any newly devised alert, whether economic- or casualty-based, should be intuitive and consistent with established lexicons and procedures. Useful alerts should
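
    The quoted thresholds translate directly into a lookup, as in the illustrative sketch below; combining the two criteria by taking the more severe one is our assumption, since the scale issues economic- and casualty-based alerts separately.

      def alert_color(value, thresholds):
          # thresholds for yellow/orange/red; below the first threshold is green
          colors = ("green", "yellow", "orange", "red")
          return colors[sum(value >= t for t in thresholds)]

      def pager_alert(est_fatalities, est_loss_usd):
          severity = ("green", "yellow", "orange", "red")
          casualty = alert_color(est_fatalities, (1, 100, 1000))
          economic = alert_color(est_loss_usd, (1e6, 1e8, 1e9))
          return max(casualty, economic, key=severity.index)

      print(pager_alert(est_fatalities=12, est_loss_usd=5e8))  # -> orange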

  4. Technical Basis for Safe Operations with Pu-239 in NMS and S Facilities (F and H Areas)

    SciTech Connect

    Bronikowski, M.G.

    1999-03-18

    Plutonium-239 is now being processed in HB-Line and H-Canyon as well as FB-Line and F-Canyon. As part of the effort to upgrade the Authorization Basis for H Area facilities relative to nuclear criticality, a literature review of Pu polymer characteristics was conducted to establish a more quantitative vs. qualitative technical basis for safe operations. The results are also applicable to processing in F Area facilities. The chemistry of Pu polymer formation, precipitation, and depolymerization is complex. Establishing limits on acid concentrations of solutions or changing the valence to Pu(III) or Pu(VI) can prevent plutonium polymer formation in tanks in the B lines and canyons. For Pu(IV) solutions of 7 g/L or less, 0.22 M HNO3 prevents polymer formation at ambient temperature. This concentration should remain the minimum acid limit for the canyons and B lines when processing Pu-239 solutions. If the minimum acid concentration is compromised, the solution may need to be sampled and tested for the presence of polymer. If polymer is not detected, processing may proceed. If polymer is detected, adding HNO3 to a final concentration above 4 M is the safest method for handling the solution. The solution could also be heated to speed up the depolymerization process. Heating with > 4 M HNO3 will depolymerize the solution for further processing. Adsorption of Pu(IV) polymer onto the steel walls of canyon and B line tanks is likely to be 11 mg/cm², a literature value for unpolished steel. This value will be confirmed by experimental work. Tank-to-tank transfers via steam jets are not expected to produce Pu(IV) polymer unless a larger than normal dilution occurs (e.g., >3 percent) at acidities below 0.4 M.

  5. Earthquakes: A Teacher's Package for K-6.

    ERIC Educational Resources Information Center

    National Science Teachers Association, Washington, DC.

    Like rain, an earthquake is a natural occurrence which may be mild or catastrophic. Although an earthquake may last only a few seconds, the processes that cause it have operated within the earth for millions of years. Until recently, the cause of earthquakes was a mystery and the subject of fanciful folklore to people all around the world. This…

  6. [Efficient OP management. Suggestions for optimisation of organisation and administration as a basis for establishing statutes for operating theatres].

    PubMed

    Geldner, G; Eberhart, L H J; Trunk, S; Dahmen, K G; Reissmann, T; Weiler, T; Bach, A

    2002-09-01

    Economic aspects have gained increasing importance in recent years. The operating room (OR) is the most cost-intensive sector and determines the turnover process of a surgical patient within the hospital. Thus, optimisation of workflow processes is of particular interest for health care providers. If the results of surgery are viewed as a product, everything associated with surgery can be evaluated analogously to a manufacturing process. All steps involved in producing the end result can and should be analysed with the goal of producing an efficient, economical and quality product. The leadership that physicians can provide to manage this process is important and leads to the introduction of a specialised "OR manager". This position must have the authority to issue directives to all other members of the OR team. The OR management reports directly to the hospital administration. By integrating and improving management of various elements of the surgical process, health care institutions are able to rationally trim costs while maintaining high-quality services. This paper gives a short introduction to the difficulties of organising an OR. Some suggestions are made to overcome common shortcomings in daily practice. A proposal for an "OR statute" is presented that should serve as a basis for discussion within the OR team. It must be modified according to individual needs and prerequisites in every hospital. The single best opportunity for dramatic improvement in effective resource use in surgical services lies in the perioperative process. The management strategy must focus on process measurement using information technology and feedback, implementing modern quality management tools. However, no short-term effects can be expected from these changes: improvements take about a year, and continuous feedback of all measures must accompany the reorganisation process.

  7. Earthquake prediction

    NASA Technical Reports Server (NTRS)

    Turcotte, Donald L.

    1991-01-01

    The state of the art in earthquake prediction is discussed. Short-term prediction based on seismic precursors, changes in the ratio of compressional velocity to shear velocity, tilt and strain precursors, electromagnetic precursors, hydrologic phenomena, chemical monitors, and animal behavior is examined. Seismic hazard assessment is addressed, and the applications of dynamical systems to earthquake prediction are discussed.

  8. Earthquake Hazards.

    ERIC Educational Resources Information Center

    Donovan, Neville

    1979-01-01

    Provides a survey and a review of earthquake activity and global tectonics from the advancement of the theory of continental drift to the present. Topics include: an identification of the major seismic regions of the earth, seismic measurement techniques, seismic design criteria for buildings, and the prediction of earthquakes. (BT)

  9. Analog earthquakes

    SciTech Connect

    Hofmann, R.B.

    1995-09-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.

  10. The 1988 earthquake in soviet armenia: implications for earthquake preparedness.

    PubMed

    Noji, E K

    1989-09-01

    An earthquake registering 6.9 on the Richter scale hit the northern part of the Armenian Republic of the Soviet Union on 7 December 1988, resulting in thousands of deaths and injuries. The majority of these resulted from the collapse of inadequately designed and constructed buildings. Analysis of the effects of the Armenian earthquake on the population, as well as of the rescue and medical response, has strong implications for earthquake preparedness and response in other seismically vulnerable parts of the world. Specifically, this paper will recommend a number of important endeavours deemed necessary to improve medical planning, preparedness and response to earthquakes. Strengthening the self-reliance of the community in disaster preparedness is suggested as the best way to improve the effectiveness of relief operations. In earthquake-prone areas, training and education in basic first aid and methods of rescue should be an integral part of any community preparedness programme.

  11. Earthquake technology fights crime

    USGS Publications Warehouse

    Lahr, John C.; Ward, Peter L.; Stauffer, Peter H.; Hendley, James W.

    1996-01-01

    Scientists with the U.S. Geological Survey have adapted their methods for quickly finding the exact source of an earthquake to the problem of locating gunshots. On the basis of this work, a private company is now testing an automated gunshot-locating system in a San Francisco Bay area community. This system allows police to rapidly pinpoint and respond to illegal gunfire, helping to reduce crime in our neighborhoods.

  12. Earthquake Facts

    MedlinePlus

    ... the source of earthquakes. Moonquakes (“earthquakes” on the moon) do occur, but they happen less frequently and ... with the varying distance between the Earth and Moon. They also occur at great depth, about halfway ...

  13. Earthquake Analysis.

    ERIC Educational Resources Information Center

    Espinoza, Fernando

    2000-01-01

    Indicates the importance of the development of students' measurement and estimation skills. Analyzes earthquake data recorded at seismograph stations and explains how to read and modify the graphs. Presents an activity for student evaluation. (YDS)

  14. Deep Earthquakes.

    ERIC Educational Resources Information Center

    Frohlich, Cliff

    1989-01-01

    Summarizes research to find the nature of deep earthquakes occurring hundreds of kilometers down in the earth's mantle. Describes further research problems in this area. Presents several illustrations and four references. (YP)

  15. Proposed plan/Statement of basis for the Grace Road Site (631-22G) operable unit: Final action

    SciTech Connect

    Palmer, E.

    1997-08-19

    This Statement of Basis/Proposed Plan is being issued by the U.S. Department of Energy (DOE), which functions as the lead agency for the Savannah River Site (SRS) remedial activities, with concurrence by the U.S. Environmental Protection Agency (EPA) and the South Carolina Department of Health and Environmental Control (SCDHEC). The purpose of this Statement of Basis/Proposed Plan is to describe the preferred alternative for addressing the Grace Road site (GRS), located at SRS in Aiken, South Carolina, and to provide an opportunity for public input into the remedial action selection process.

  16. Earthquake watch

    USGS Publications Warehouse

    Hill, M.

    1976-01-01

    When the time comes that earthquakes can be predicted accurately, what shall we do with the knowledge? This was the theme of a November 1975 conference on earthquake warning and response held in San Francisco, called by Assistant Secretary of the Interior Jack W. Carlson. Invited were officials of State and local governments from Alaska, California, Hawaii, Idaho, Montana, Nevada, Utah, Washington, and Wyoming and representatives of the news media.

  17. Rapid estimation of the economic consequences of global earthquakes

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.

    2011-01-01

    to reduce this time gap to more rapidly and effectively mobilize response. We present here a procedure to rapidly and approximately ascertain the economic impact immediately following a large earthquake anywhere in the world. In principle, the approach presented is similar to the empirical fatality estimation methodology proposed and implemented by Jaiswal and others (2009). In order to estimate economic losses, we need an assessment of the economic exposure at various levels of shaking intensity. The economic value of all the physical assets exposed at different locations in a given area is generally not known and extremely difficult to compile at a global scale. In the absence of such a dataset, we first estimate the total Gross Domestic Product (GDP) exposed at each shaking intensity by multiplying the per-capita GDP of the country by the total population exposed at that shaking intensity level. We then scale the total GDP estimated at each intensity by an exposure correction factor, which is a multiplying factor to account for the disparity between wealth and/or economic assets to the annual GDP. The economic exposure obtained using this procedure is thus a proxy estimate for the economic value of the actual inventory that is exposed to the earthquake. The economic loss ratio, defined in terms of a country-specific lognormal cumulative distribution function of shaking intensity, is derived and calibrated against the losses from past earthquakes. This report describes the development of a country or region-specific economic loss ratio model using economic loss data available for global earthquakes from 1980 to 2007. The proposed model is a potential candidate for directly estimating economic losses within the currently-operating PAGER system. PAGER's other loss models use indirect methods that require substantially more data (such as building/asset inventories, vulnerabilities, and the asset values exposed at the time of earthquake) to implement on a global basis
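
    A hedged sketch of the procedure as described (the parameter values below are illustrative placeholders, not the calibrated country coefficients from the report):

      import math

      def normal_cdf(z):
          return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

      def estimated_loss(pop_by_intensity, gdp_per_capita, alpha, theta, beta):
          """Loss = sum over shaking intensities of (per-capita GDP x exposed
          population x exposure correction factor alpha) x loss ratio, where
          the loss ratio is a lognormal CDF of intensity with country-specific
          parameters (theta, beta) calibrated against past-earthquake losses."""
          total = 0.0
          for intensity, population in pop_by_intensity.items():
              exposure = gdp_per_capita * population * alpha
              loss_ratio = normal_cdf(math.log(intensity / theta) / beta)
              total += exposure * loss_ratio
          return total

      # Illustrative only: exposed population per intensity level, US$12k per-capita GDP.
      print(estimated_loss({6: 2e5, 7: 5e4, 8: 1e4}, 12_000.0, 3.0, 9.0, 0.2))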

  18. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... unable to determine the applicable rate of absorption in effect during his taxable year, he shall compute his deduction on the basis of the rate of absorption in effect at the end of the company's taxable... adjustment will be made upon the final determination of the rate of absorption applicable to the taxable year....

  19. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... unable to determine the applicable rate of absorption in effect during his taxable year, he shall compute his deduction on the basis of the rate of absorption in effect at the end of the company's taxable... adjustment will be made upon the final determination of the rate of absorption applicable to the taxable year....

  20. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... determine the applicable rate of absorption in effect during his taxable year, he shall compute his deduction on the basis of the rate of absorption in effect at the end of the company's taxable year which... made upon the final determination of the rate of absorption applicable to the taxable year....

  1. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... unable to determine the applicable rate of absorption in effect during his taxable year, he shall compute his deduction on the basis of the rate of absorption in effect at the end of the company's taxable... adjustment will be made upon the final determination of the rate of absorption applicable to the taxable year....

  2. Earthquake Scaling, Simulation and Forecasting

    NASA Astrophysics Data System (ADS)

    Sachs, Michael Karl

    Earthquakes are among the most devastating natural events faced by society. In 2011, just two events, the magnitude 6.3 earthquake in Christchurch, New Zealand, on February 22 and the magnitude 9.0 Tohoku earthquake off the coast of Japan on March 11, caused a combined total of $226 billion in economic losses. Over the last decade, 791,721 deaths were caused by earthquakes. Yet, despite their impact, our ability to accurately predict when earthquakes will occur is limited. This is due, in large part, to the fact that the fault systems that produce earthquakes are non-linear: very small differences in the systems now produce very large differences in the future, making forecasting difficult. In spite of this, there are patterns that exist in earthquake data. These patterns are often in the form of frequency-magnitude scaling relations that relate the number of smaller events observed to the number of larger events observed. In many cases these scaling relations show consistent behavior over a wide range of scales. This consistency forms the basis of most forecasting techniques. However, the utility of these scaling relations is limited by the size of the earthquake catalogs which, especially in the case of large events, are fairly small and limited to a few hundred years of events. In this dissertation I discuss three areas of earthquake science. The first is an overview of scaling behavior in a variety of complex systems, both models and natural systems. The focus of this area is to understand how this scaling behavior breaks down. The second is a description of the development and testing of an earthquake simulator called Virtual California, designed to extend the observed catalog of earthquakes in California. This simulator uses novel techniques borrowed from statistical physics to enable the modeling of large fault systems over long periods of time. The third is an evaluation of existing earthquake forecasts, which focuses on the Regional
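
    The frequency-magnitude scaling relations referred to are typically of Gutenberg-Richter form, log10 N = a - bM. One standard way to estimate the b-value from a catalog is the Aki/Utsu maximum-likelihood formula, sketched below; using this particular estimator here is our illustration, not a claim about the dissertation.

      import math

      def b_value_mle(magnitudes, m_c, dm=0.1):
          """Aki/Utsu MLE: b = log10(e) / (mean(M) - (Mc - dm/2)) for events at
          or above the completeness magnitude Mc, with magnitudes binned to dm."""
          m = [x for x in magnitudes if x >= m_c]
          return math.log10(math.e) / (sum(m) / len(m) - (m_c - dm / 2.0))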

  3. Two grave issues concerning the expected Tokai Earthquake

    NASA Astrophysics Data System (ADS)

    Mogi, K.

    2004-08-01

    The possibility of a great shallow earthquake (M 8) in the Tokai region, central Honshu, in the near future was pointed out by Mogi in 1969 and by the Coordinating Committee for Earthquake Prediction (CCEP), Japan (1970). In 1978, the government enacted the Large-Scale Earthquake Countermeasures Law and began to set up intensified observations in this region for short-term prediction of the expected Tokai earthquake. In this paper, two serious issues are pointed out, which may contribute to catastrophic effects in connection with the Tokai earthquake: 1. The danger of black-and-white predictions: According to the scenario based on the Large-Scale Earthquake Countermeasures Law, if abnormal crustal changes are observed, the Earthquake Assessment Committee (EAC) will determine whether or not there is an imminent danger. The findings are reported to the Prime Minister who decides whether to issue an official warning statement. Administrative policy clearly stipulates the measures to be taken in response to such a warning, and because the law presupposes the ability to predict a large earthquake accurately, there are drastic measures appropriate to the situation. The Tokai region is a densely populated region with high social and economic activity, and it is traversed by several vital transportation arteries. When a warning statement is issued, all transportation is to be halted. The Tokyo capital region would be cut off from the Nagoya and Osaka regions, and there would be a great impact on all of Japan. I (the former chairman of EAC) maintained that in view of the variety and complexity of precursory phenomena, it was inadvisable to attempt a black-and-white judgment as the basis for a "warning statement". I urged that the government adopt a "soft warning" system that acknowledges the uncertainty factor and that countermeasures be designed with that uncertainty in mind. 2. The danger of nuclear power plants in the focal region: Although the possibility of the

  4. Implementation Guidance on Annual Compliance Certification Reporting and Statement of Basis Requirements for Title V Operating Permits

    EPA Pesticide Factsheets

    This document may be of assistance in applying the Title V air operating permit regulations. This document is part of the Title V Policy and Guidance Database available at www2.epa.gov/title-v-operating-permits/title-v-operating-permit-policy-and-guidance-document-index. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  5. Selection of operating parameters on the basis of hydrodynamics in centrifugal partition chromatography for the purification of nybomycin derivatives.

    PubMed

    Adelmann, S; Baldhoff, T; Koepcke, B; Schembecker, G

    2013-01-25

    The selection of solvent systems in centrifugal partition chromatography (CPC) is the most critical point in setting up a separation, and much research has therefore been devoted to the topic over the last decades. The selection of suitable operating parameters (mobile phase flow rate, rotational speed and mode of operation) with respect to hydrodynamics and the pressure drop limit in CPC, however, is still mainly driven by the experience of the chromatographer. In this work we used hydrodynamic analysis for the prediction of the most suitable operating parameters. After selection of different solvent systems with respect to partition coefficients for the target compound, the hydrodynamics were visualized. Based on flow pattern and retention, the operating parameters were selected for the purification runs of nybomycin derivatives, which were carried out with a 200 ml FCPC® rotor. The results have proven that the selection of optimized operating parameters by analysis of hydrodynamics alone is possible. As the hydrodynamics are predictable from the physical properties of the solvent system, the optimized operating parameters can be estimated, too. Additionally, we found that dispersion and especially retention are improved if the less viscous phase is mobile.

  6. A computational method for solving stochastic Itô–Volterra integral equations based on stochastic operational matrix for generalized hat basis functions

    SciTech Connect

    Heydari, M.H.; Hooshmandasl, M.R.; Maalek Ghaini, F.M.; Cattani, C.

    2014-08-01

    In this paper, a new computational method based on generalized hat basis functions is proposed for solving stochastic Itô–Volterra integral equations. In this way, a new stochastic operational matrix for generalized hat functions on the finite interval [0,T] is obtained. By using these basis functions and their stochastic operational matrix, such problems can be transformed into linear lower triangular systems of algebraic equations which can be directly solved by forward substitution. Also, the rate of convergence of the proposed method is considered, and it is shown to be O(1/n²). Further, in order to show the accuracy and reliability of the proposed method, the new approach is compared with the block pulse functions method on some examples. The obtained results reveal that the proposed method is more accurate and efficient than the block pulse functions method.
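
    The final solve the method relies on is ordinary forward substitution on a lower triangular system, as in this generic sketch:

      def forward_substitution(L, b):
          """Solve L x = b for lower-triangular L, row by row."""
          x = [0.0] * len(b)
          for i in range(len(b)):
              s = sum(L[i][j] * x[j] for j in range(i))
              x[i] = (b[i] - s) / L[i][i]
          return x

      L = [[2.0, 0.0, 0.0],
           [1.0, 3.0, 0.0],
           [4.0, 1.0, 5.0]]
      print(forward_substitution(L, [2.0, 5.0, 14.0]))  # [1.0, 1.333..., 1.733...]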

  7. Generation Of Manufacturing Routing And Operations Using Structured Knowledge As Basis To Application Of Computer Aided In Process Planning

    NASA Astrophysics Data System (ADS)

    Oswaldo, Luiz Agostinho

    2011-01-01

    The development of computer-aided resources for automating the generation of manufacturing routings and operations is mainly being accomplished through the search for similarities between existing ones, resulting in standard process routings grouped by analysis of similarities between parts or routings. This article proposes the development of manufacturing routings and the detailing of operations using a methodology whose steps define the initial, intermediate and final operations, starting from the rough piece and proceeding to the final specifications, which must have a one-to-one relationship with the part design specifications. Each step uses the so-called rules of precedence to link and chain the routing operations. The rules of precedence order and prioritize the knowledge of various manufacturing processes, taking into account the theories of machining, forging, assembly, and heat treatment; they also draw on the theories of tolerance accumulation and process capability, among others. The availability of manufacturing databases covering process tolerances, deviations of the machine tool, cutting tool, fixturing devices and workpiece, and process capabilities is also reinforced. Stating and applying rules of precedence that link and join manufacturing concepts in a logical and structured way within the methodology steps makes it viable to use structured knowledge, instead of the tacit knowledge currently available in manufacturing engineering departments, in the generation of manufacturing routings and operations. Consequently, the development of computer-aided process planning will be facilitated by the structured knowledge applied with this methodology.
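
    In software terms, rules of precedence define a directed acyclic graph over operations, and any admissible routing is a topological ordering of that graph. The sketch below uses hypothetical operation names and precedence rules purely for illustration:

      from graphlib import TopologicalSorter  # Python 3.9+

      # Each operation maps to the set of operations that must precede it.
      precedence = {
          "rough_turning": set(),
          "finish_turning": {"rough_turning"},
          "drilling": {"rough_turning"},
          "heat_treatment": {"finish_turning", "drilling"},
          "grinding": {"heat_treatment"},
          "final_inspection": {"grinding"},
      }
      print(list(TopologicalSorter(precedence).static_order()))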

  8. Using an internal coordinate Gaussian basis and a space-fixed Cartesian coordinate kinetic energy operator to compute a vibrational spectrum with rectangular collocation

    NASA Astrophysics Data System (ADS)

    Manzhos, Sergei; Carrington, Tucker

    2016-12-01

    We demonstrate that it is possible to use basis functions that depend on curvilinear internal coordinates to compute vibrational energy levels without deriving a kinetic energy operator (KEO) and without numerically computing coefficients of a KEO. This is done by using a space-fixed KEO and computing KEO matrix elements numerically. Whenever one has an excellent basis, more accurate solutions to the Schrödinger equation can be obtained by computing the KEO, potential, and overlap matrix elements numerically. Using a Gaussian basis and bond coordinates, we compute vibrational energy levels of formaldehyde. We show, for the first time, that it is possible with a Gaussian basis to solve a six-dimensional vibrational Schrödinger equation. For the zero-point energy (ZPE) and the lowest 50 vibrational transitions of H₂CO, we obtain a mean absolute error of less than 1 cm⁻¹; with 200 000 collocation points and 40 000 basis functions, most errors are less than 0.4 cm⁻¹.

  9. Using an internal coordinate Gaussian basis and a space-fixed Cartesian coordinate kinetic energy operator to compute a vibrational spectrum with rectangular collocation.

    PubMed

    Manzhos, Sergei; Carrington, Tucker

    2016-12-14

    We demonstrate that it is possible to use basis functions that depend on curvilinear internal coordinates to compute vibrational energy levels without deriving a kinetic energy operator (KEO) and without numerically computing coefficients of a KEO. This is done by using a space-fixed KEO and computing KEO matrix elements numerically. Whenever one has an excellent basis, more accurate solutions to the Schrödinger equation can be obtained by computing the KEO, potential, and overlap matrix elements numerically. Using a Gaussian basis and bond coordinates, we compute vibrational energy levels of formaldehyde. We show, for the first time, that it is possible with a Gaussian basis to solve a six-dimensional vibrational Schrödinger equation. For the zero-point energy (ZPE) and the lowest 50 vibrational transitions of H₂CO, we obtain a mean absolute error of less than 1 cm⁻¹; with 200 000 collocation points and 40 000 basis functions, most errors are less than 0.4 cm⁻¹.
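
    To illustrate the rectangular-collocation idea on a toy problem (a 1D harmonic oscillator with a distributed Gaussian basis; all parameter choices below are ours, not the authors'):

      import numpy as np

      centers = np.linspace(-4.0, 4.0, 20)       # Gaussian centers
      a = 2.0                                    # common Gaussian exponent
      x = np.linspace(-5.0, 5.0, 80)[:, None]    # more points than functions

      B = np.exp(-a * (x - centers) ** 2)                 # B[i, j] = phi_j(x_i)
      d2B = (4 * a**2 * (x - centers) ** 2 - 2 * a) * B   # phi_j'' at x_i
      H = -0.5 * d2B + 0.5 * x**2 * B                     # (T + V) phi_j at x_i

      # Least-squares generalized eigenproblem H c = E B c via the pseudoinverse;
      # no KEO matrix elements are derived analytically. The rcond cutoff
      # regularizes near-linear dependence of the overlapping Gaussians.
      E = np.sort(np.linalg.eigvals(np.linalg.pinv(B, rcond=1e-10) @ H).real)
      print(E[:4])  # should approach 0.5, 1.5, 2.5, 3.5 (atomic units)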

  10. Deep earthquakes

    SciTech Connect

    Frohlich, C.

    1989-01-01

    Earthquakes are often recorded at depths as great as 650 kilometers or more. These deep events mark regions where plates of the earth's surface are consumed in the mantle. But the earthquakes themselves present a conundrum: the high pressures and temperatures at such depths should keep rock from fracturing suddenly and generating a tremor. This paper reviews the research on this problem. Almost all deep earthquakes conform to the pattern described by Wadati: they generally occur at the edge of a deep ocean and define an inclined zone, known as the Wadati-Benioff zone, extending from near the surface to a depth of 600 kilometers or more. Several scenarios are described that have been proposed to explain the fracturing and slipping of rocks at these depths.

  11. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 26 Internal Revenue 8 2011-04-01 2011-04-01 false Policyholders of mutual fire or flood insurance...) Other Insurance Companies § 1.832-6 Policyholders of mutual fire or flood insurance companies operating..., a taxpayer insured by a mutual fire or flood insurance company under a policy for which the...

  12. Postseismic Transient after the 2002 Denali Fault Earthquake from VLBI Measurements at Fairbanks

    NASA Technical Reports Server (NTRS)

    MacMillan, Daniel; Cohen, Steven

    2004-01-01

    The VLBI antenna (GILCREEK) at Fairbanks, Alaska observes routinely twice a week in operational networks and, on a more uneven basis, in other networks on additional days. The Fairbanks antenna position is about 150 km north of the Denali fault and from the earthquake epicenter. We examine the transient behavior of the estimated VLBI position during the year following the earthquake to determine how the rate of postseismic deformation changed. This is compared with what is seen in the GPS site position series.
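
    A common way to quantify such a transient is to fit a secular rate plus a logarithmic decay to each position component; the sketch below is a generic illustration, and the functional form and relaxation time are our assumptions rather than the paper's.

      import numpy as np

      def fit_postseismic(t_years, pos_mm, tau_years=0.1):
          """Least-squares fit of x(t) = x0 + v t + c ln(1 + t/tau); the
          instantaneous rate v + c/(tau + t) shows how the deformation rate
          decays during the year after the earthquake."""
          A = np.column_stack([np.ones_like(t_years), t_years,
                               np.log1p(t_years / tau_years)])
          (x0, v, c), *_ = np.linalg.lstsq(A, pos_mm, rcond=None)
          return x0, v, c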

  13. The Effects of Degraded Digital Instrumentation and Control Systems on Human-system Interfaces and Operator Performance: HFE Review Guidance and Technical Basis

    SciTech Connect

    O'Hara, J.M.; W. Gunther, G. Martinez-Guridi

    2010-02-26

    New and advanced reactors will use integrated digital instrumentation and control (I&C) systems to support operators in their monitoring and control functions. Even though digital systems are typically highly reliable, their potential for degradation or failure could significantly affect operator performance and, consequently, impact plant safety. The U.S. Nuclear Regulatory Commission (NRC) supported this research project to investigate the effects of degraded I&C systems on human performance and plant operations. The objective was to develop human factors engineering (HFE) review guidance addressing the detection and management of degraded digital I&C conditions by plant operators. We reviewed pertinent standards and guidelines, empirical studies, and plant operating experience. In addition, we conducted an evaluation of the potential effects of selected failure modes of the digital feedwater system on human-system interfaces (HSIs) and operator performance. The results indicated that I&C degradations are prevalent in plants employing digital systems and the overall effects on plant behavior can be significant, such as causing a reactor trip or causing equipment to operate unexpectedly. I&C degradations can impact the HSIs used by operators to monitor and control the plant. For example, sensor degradations can make displays difficult to interpret and can sometimes mislead operators by making it appear that a process disturbance has occurred. We used the information obtained as the technical basis upon which to develop HFE review guidance. The guidance addresses the treatment of degraded I&C conditions as part of the design process and the HSI features and functions that support operators to monitor I&C performance and manage I&C degradations when they occur. In addition, we identified topics for future research.

  14. United States earthquakes, 1984

    SciTech Connect

    Stover, C.W.

    1988-01-01

    The report contains information for earthquakes in the 50 states and Puerto Rico and the areas near their shorelines. The data consist of earthquake locations (date, time, geographic coordinates, depth, and magnitudes), intensities, macroseismic information, and isoseismal and seismicity maps. Also included are sections detailing the activity of seismic networks operated by universities and other government agencies and a list of results from strong-motion seismograph records.

  15. Earthquake engineering research: 1982

    NASA Astrophysics Data System (ADS)

    The Committee on Earthquake Engineering Research addressed two questions: What progress has research produced in earthquake engineering, and which elements of the problem should future earthquake engineering research pursue? It examined and reported on, in separate chapters of the report: Applications of Past Research, Assessment of Earthquake Hazard, Earthquake Ground Motion, Soil Mechanics and Earth Structures, Analytical and Experimental Structural Dynamics, Earthquake Design of Structures, Seismic Interaction of Structures and Fluids, Social and Economic Aspects, Earthquake Engineering Education, and Research in Japan.

  16. Earthquake Testing

    NASA Technical Reports Server (NTRS)

    1979-01-01

    During NASA's Apollo program, it was necessary to subject the mammoth Saturn V launch vehicle to extremely forceful vibrations to assure the moon booster's structural integrity in flight. Marshall Space Flight Center assigned vibration testing to a contractor, the Scientific Services and Systems Group of Wyle Laboratories, Norco, California. Wyle-3S, as the group is known, built a large facility at Huntsville, Alabama, and equipped it with an enormously forceful shock and vibration system to simulate the liftoff stresses the Saturn V would encounter. Saturn V is no longer in service, but Wyle-3S has found spinoff utility for its vibration facility. It is now being used to simulate earthquake effects on various kinds of equipment, principally equipment intended for use in nuclear power generation. Government regulations require that such equipment demonstrate its ability to survive earthquake conditions. In the upper left photo, Wyle-3S is preparing to conduct an earthquake test on a 25-ton diesel generator built by Atlas Polar Company, Ltd., Toronto, Canada, for emergency use in a Canadian nuclear power plant. Being readied for test in the lower left photo is a large circuit breaker to be used by Duke Power Company, Charlotte, North Carolina. Electro-hydraulic and electro-dynamic shakers in and around the pit simulate earthquake forces.

  17. Earthquake tectonics

    SciTech Connect

    Steward, R.F.

    1991-02-01

    Earthquakes release a tremendous amount of energy into the subsurface in the form of seismic waves. The seismic wave energy of the 1906 San Francisco (M = 8.2) earthquake was equivalent to over 8 billion tons of TNT (3.3 × 10¹⁹ joules). Four basic wave types are propagated from seismic sources, two non-rotational and two rotational. As opposed to the non-rotational P and SH waves, the rotational compressional (RC) and rotational shear (RS) waves carry the bulk of the energy from a seismic source. RC wavefronts propagate in the subsurface and refract similarly to P waves, but are considerably slower. RC waves are critically refracted beneath the air-surface interface at velocities less than the velocity of sound in air because they refract at the velocity of sound in air minus the retrograde particle velocity at the top of the wave. They propagate as tsunami waves in the open ocean, and produce loud sounds on land that are heard by humans and animals during earthquakes. The energy of the RS wave dwarfs that of the P, SH, and even the RC wave. The RS wave is the same as what is currently called the S wave in earthquake seismology, and produces both folding and strike-slip faulting at considerable distances from the epicenter. RC and RS waves, propagated during earthquakes from the Santa Ynez fault and a right-slip fault on trend with the Red Mountain fault, produced the Santa Ynez Mountains in California beginning in the middle Pliocene and continuing until the present.

  18. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

    PubMed

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

    2012-01-01

    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a normal basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.

  19. Earthquakes and plate tectonics.

    USGS Publications Warehouse

    Spall, H.

    1982-01-01

    Earthquakes occur at the following three kinds of plate boundary: ocean ridges where the plates are pulled apart, margins where the plates scrape past one another, and margins where one plate is thrust under the other. Thus, we can predict the general regions on the earth's surface where we can expect large earthquakes in the future. We know that each year about 140 earthquakes of magnitude 6 or greater will occur within this area, which is 10% of the earth's surface. But on a worldwide basis we cannot say with much accuracy when these events will occur. The reason is that the processes in plate tectonics have been going on for millions of years. Averaged over this interval, plate motions amount to several mm per year. But at any instant in geologic time, for example the year 1982, we do not know exactly where we are in the worldwide cycle of strain build-up and strain release. Only by monitoring the stress and strain in small areas, for instance the San Andreas fault, in great detail can we hope to predict when renewed activity in that part of the plate tectonics arena is likely to take place. -from Author

  20. 2010 Chile Earthquake Aftershock Response

    NASA Astrophysics Data System (ADS)

    Barientos, Sergio

    2010-05-01

    1906? Since the number of M>7.0 aftershocks has been low, does the distribution of large-magnitude aftershocks differ from previous events of this size? What is the origin of the extensional-type aftershocks at shallow depths within the upper plate? The international seismological community (France, Germany, U.K., U.S.A.), in collaboration with the Chilean seismological community, responded with a total of 140 portable seismic stations to deploy in order to record aftershocks. Combined with the Chilean permanent seismic network in the area, this results in 180 stations now in operation, recording continuously at 100 cps. The seismic equipment is a mix of accelerometers, short-period and broadband seismic sensors deployed along the entire length of the aftershock zone that will record the aftershock sequence for three to six months. The collected seismic data will be merged and archived to produce an international data set open to the entire seismological community immediately after archiving. Each international group will submit their data as soon as possible in standard (miniSEED) format with accompanying metadata to the IRIS DMC, where the data will be merged into a combined data set and made available to individuals and other data centers. This will be by far the best-recorded aftershock sequence of a large megathrust earthquake. This outstanding international collaboration will provide an open data set for this important earthquake as well as provide a model for future aftershock deployments around the world.

  1. Issues on the Japanese Earthquake Hazard Evaluation

    NASA Astrophysics Data System (ADS)

    Hashimoto, M.; Fukushima, Y.; Sagiya, T.

    2013-12-01

    The 2011 Great East Japan Earthquake forced Japan to change its policy of countermeasures against earthquake disasters, including earthquake hazard evaluations. Before March 11, Japanese earthquake hazard evaluation was based on the history of earthquakes that repeatedly occur and the characteristic earthquake model. The source region of an earthquake was identified and its occurrence history was revealed. Then the conditional probability was estimated using the renewal model. However, the Japanese authorities changed the policy after the megathrust earthquake in 2011 such that the largest earthquake in a specific seismic zone should be assumed on the basis of available scientific knowledge. According to this policy, three important reports were issued during these two years. First, the Central Disaster Management Council issued a new estimate of damages by a hypothetical Mw 9 earthquake along the Nankai trough during 2011 and 2012. The model predicts a 34 m high tsunami on the southern Shikoku coast and intensity 6 or higher on the JMA scale in most areas of Southwest Japan as the maximum. Next, the Earthquake Research Council revised the long-term earthquake hazard evaluation of earthquakes along the Nankai trough in May 2013, which discarded the characteristic earthquake model and put much emphasis on the diversity of earthquakes. The so-called 'Tokai' earthquake was negated in this evaluation. Finally, another report by the CDMC concluded that, with the current knowledge, it is hard to predict the occurrence of large earthquakes along the Nankai trough using the present techniques, based on the diversity of earthquake phenomena. These reports created sensations throughout the country, and local governments are struggling to prepare countermeasures. These reports commented on the large uncertainty in their evaluations near their ends, but are these messages transmitted properly to the public? Earthquake scientists, including the authors, are involved in

  2. Connecting slow earthquakes to huge earthquakes.

    PubMed

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes.

  3. Earthquake ground motion: Chapter 3

    USGS Publications Warehouse

    Luco, Nicolas; Valley, Michael; Crouse, C.B.

    2012-01-01

    Most of the effort in seismic design of buildings and other structures is focused on structural design. This chapter addresses another key aspect of the design process—characterization of earthquake ground motion. Section 3.1 describes the basis of the earthquake ground motion maps in the Provisions and in ASCE 7. Section 3.2 has examples for the determination of ground motion parameters and spectra for use in design. Section 3.3 discusses and provides an example for the selection and scaling of ground motion records for use in response history analysis.

  4. Statistical tests of simple earthquake cycle models

    NASA Astrophysics Data System (ADS)

    DeVries, Phoebe M. R.; Evans, Eileen L.

    2016-12-01

    A central goal of observing and modeling the earthquake cycle is to forecast when a particular fault may generate an earthquake: a fault late in its earthquake cycle may be more likely to generate an earthquake than a fault early in its earthquake cycle. Models that can explain geodetic observations throughout the entire earthquake cycle may be required to gain a more complete understanding of relevant physics and phenomenology. Previous efforts to develop unified earthquake models for strike-slip faults have largely focused on explaining both preseismic and postseismic geodetic observations available across a few faults in California, Turkey, and Tibet. An alternative approach leverages the global distribution of geodetic and geologic slip rate estimates on strike-slip faults worldwide. Here we use the Kolmogorov-Smirnov test for similarity of distributions to infer, in a statistically rigorous manner, viscoelastic earthquake cycle models that are inconsistent with 15 sets of observations across major strike-slip faults. We reject a large subset of two-layer models incorporating Burgers rheologies at a significance level of α = 0.05 (those with long-term Maxwell viscosities ηM < 4.0 × 10^19 Pa s and ηM > 4.6 × 10^20 Pa s) but cannot reject models on the basis of transient Kelvin viscosity ηK. Finally, we examine the implications of these results for the predicted earthquake cycle timing of the 15 faults considered and compare these predictions to the geologic and historical record.
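
    A schematic Python sketch of the kind of two-sample test described above; the synthetic numbers below are placeholders, not the paper's data, and only illustrate rejecting a model distribution against 15 observations at α = 0.05:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      observed = rng.normal(10.0, 2.0, size=15)          # stand-in for 15 observed values
      model_predicted = rng.normal(12.5, 2.0, size=500)  # stand-in for model-predicted values

      stat, p_value = stats.ks_2samp(observed, model_predicted)
      if p_value < 0.05:
          print(f"reject model (KS statistic {stat:.2f}, p = {p_value:.4f})")
      else:
          print("cannot reject model at the 0.05 level")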

  5. Defeating Earthquakes

    NASA Astrophysics Data System (ADS)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the 2010 M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question: how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake. GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  6. Earthquake Archaeology: a logical approach?

    NASA Astrophysics Data System (ADS)

    Stewart, I. S.; Buck, V. A.

    2001-12-01

    Ancient earthquakes can leave their mark in the mythical and literary accounts of ancient peoples, the stratigraphy of their site histories, and the structural integrity of their constructions. Within this broad cross-disciplinary tramping ground, earthquake geologists have tended to focus on those aspects of the cultural record that are most familiar to them: the physical effects of seismic deformation on ancient constructions. One of the core difficulties with this 'earthquake archaeology' approach is that recent attempts to isolate structural criteria that are diagnostic or strongly suggestive of a seismic origin are undermined by the recognition that signs of ancient seismicity are generally indistinguishable from those of non-seismic mechanisms (poor construction, adverse geotechnical conditions). We illustrate the difficulties and inconsistencies in currently proposed 'earthquake diagnostic' schemes by reference to two case studies of archaeoseismic damage in central Greece. The first concerns fallen columns at various Classical temple localities in mainland Greece (Nemea, Sounio, Olympia, Bassai) which, on the basis of observed structural criteria, are earthquake-induced but which are alternatively explained by archaeologists as the action of human disturbance. The second re-examines the almost type example of the Kyparissi site in the Atalanti region as a Classical stoa offset across a seismic surface fault, arguing instead for its deformation by ground instability. Finally, in highlighting the inherent ambiguity of archaeoseismic data, we consider the value of a logic-tree approach for qualifying and quantifying our uncertainties for seismic-hazard analysis.

  7. On subduction zone earthquakes and the Pacific Northwest seismicity

    SciTech Connect

    Chung, Dae H.

    1991-12-01

    A short review of subduction zone earthquakes and the seismicity of the Pacific Northwest region of the United States is provided as a basis for assessing issues related to earthquake hazard evaluations for the region. This brief review of seismotectonics covers historical subduction zone earthquakes and more recent seismological studies pertaining to rupture processes of subduction zone earthquakes, with specific references to the Pacific Northwest. Subduction zone earthquakes tend to rupture updip and laterally from the hypocenter. Thus, the rupture surface tends to become more elongated as one considers larger earthquakes (there is limited updip distance that is strongly coupled, whereas rupture length can be quite large). The great Aleutian-Alaska earthquakes of 1957, 1964, and 1965 had rupture lengths of greater than 650 km. The largest earthquake observed instrumentally, the Mw 9.5 1960 Chile earthquake, had a rupture length over 1000 km. However, earthquakes of this magnitude are very unlikely on Cascadia. The degree of surface shaking has a very strong dependency on the depth and style of rupture. The rupture surface during a great earthquake shows heterogeneous stress drop, displacement, energy release, etc. The high-strength zones are traditionally termed asperities, and these asperities control when and how large an earthquake is generated. Mapping these asperities in specific subduction zones is very difficult before an earthquake; they show up more easily in inversions of dynamic source studies of earthquake ruptures, after an earthquake. Because seismic moment is based on the total radiated energy from an earthquake, the moment-based magnitude Mw is superior to all other magnitude estimates, such as ML, mb, MbLg, MS, etc. Probably, just to have a common language, non-moment magnitudes should be converted to Mw in any discussion of subduction zone earthquakes.

  8. Effects of the 2011 Tohoku Earthquake on VLBI Geodetic Measurements

    NASA Astrophysics Data System (ADS)

    MacMillan, D. S.; Kurihara, S.; Behrend, D.

    2011-12-01

    The VLBI antenna TSUKUB32 at Tsukuba, Japan regularly observes in 24-hour observing sessions once per week with the R1 operational network and on additional days with other networks on a more irregular basis. Further, the antenna is an endpoint of the single-baseline, 1-hour Intensive sessions observed on the weekends for determination of UT1. TSUKUB32 returned to normal operational observing 25 days after the earthquake. The antenna is 160 km west and 240 km south of the epicenter (about the same distance west of the plate subduction boundary). We looked at the transient behavior of the TSUKUB32 position time series following the earthquake and found that significant deformation is continuing. The eastward rate as of July 2011, 4 months after the earthquake, is 20 cm/yr greater than the long-term rate prior to the earthquake. The VLBI series agrees with the corresponding JPL GPS series (M. B. Heflin, http://sideshow.jpl.nasa.gov/mbh/series.html, 2011) measured by the co-located GPS antenna TSUK. The coseismic UEN displacement at Tsukuba was approximately (-90 mm, 550 mm, 50 mm). We examined the effect of the variation of TSUKUB32 position on EOP estimates and specifically how best to correct its position for estimation of UT1 in the Intensive experiments. For this purpose and to provide operational UT1, the IVS scheduled a series of weekend Intensive sessions observing on the Kokee-Wettzell baseline immediately before each of the two Tsukuba-Wettzell Intensive sessions. Comparisons between UT1 estimates from these pairs of sessions were used in validating a model for the post-seismic displacement of TSUKUB32.
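
    One common way to model such a post-seismic transient (a hedged illustration of ours, not the authors' analysis; the functional form, parameter values, and synthetic data below are assumptions) is a linear pre-event rate plus a logarithmic decay after the earthquake:

      import numpy as np
      from scipy.optimize import curve_fit

      def postseismic(t_days, offset, rate, amp, tau_days):
          # position (mm) vs. time since the earthquake: linear trend + log transient
          return offset + rate * t_days + amp * np.log1p(t_days / tau_days)

      t = np.arange(1.0, 121.0)  # 120 days of daily position estimates
      rng = np.random.default_rng(1)
      obs = postseismic(t, 0.0, 0.05, 60.0, 10.0) + rng.normal(0.0, 2.0, t.size)

      params, _ = curve_fit(postseismic, t, obs, p0=(0.0, 0.1, 30.0, 5.0))
      print("transient amplitude (mm):", round(params[2], 1), "tau (days):", round(params[3], 1))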

  9. Automated Microwave Complex on the Basis of a Continuous-Wave Gyrotron with an Operating Frequency of 263 GHz and an Output Power of 1 kW

    NASA Astrophysics Data System (ADS)

    Glyavin, M. Yu.; Morozkin, M. V.; Tsvetkov, A. I.; Lubyako, L. V.; Golubiatnikov, G. Yu.; Kuftin, A. N.; Zapevalov, V. E.; V. Kholoptsev, V.; Eremeev, A. G.; Sedov, A. S.; Malygin, V. I.; Chirkov, A. V.; Fokin, A. P.; Sokolov, E. V.; Denisov, G. G.

    2016-02-01

    We study experimentally the automated microwave complex for microwave spectroscopy and diagnostics of various media, which was developed at the Institute of Applied Physics of the Russian Academy of Sciences in cooperation with GYCOM Ltd. on the basis of a gyrotron with a frequency of 263 GHz and operated at the first gyrofrequency harmonic. In the process of the experiments, a controllable output power of 0.1-1 kW was achieved with an efficiency of up to 17% in the continuous-wave generation regime. The measured radiation spectrum with a relative width of about 10^-6 and the frequency values measured at various parameters of the device are presented. The results of measuring the parameters of the wave beam, which was formed by a built-in quasioptical converter, as well as the data obtained by measuring the heat loss in the cavity and the vacuum output window, are analyzed.

  10. Chern-Simons gravity with (curvature)^2 and (torsion)^2 terms and a basis of degree-of-freedom projection operators

    SciTech Connect

    Helayeel-Neto, J. A.; Hernaski, C. A.; Pereira-Dias, B.; Vargas-Paredes, A. A.; Vasquez-Otoya, V. J.

    2010-09-15

    The effects of (curvature)^2 and (torsion)^2 terms in the Einstein-Hilbert-Chern-Simons Lagrangian are investigated. The purposes are two-fold: (i) to show the efficacy of an orthogonal basis of degree-of-freedom projection operators recently proposed and to ascertain its adequacy for obtaining propagators of general parity-breaking gravity models in three dimensions; (ii) to analyze the role of the topological Chern-Simons term for the unitarity and the particle spectrum of the model with squared-curvature terms in connection with dynamical torsion. Our conclusion is that the Chern-Simons term does not influence the unitarity conditions imposed on the parameters of the Lagrangian but significantly modifies the particle spectrum.

  11. A smartphone application for earthquakes that matter!

    NASA Astrophysics Data System (ADS)

    Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert

    2014-05-01

    Smartphone applications have swiftly become one of the most popular tools for rapid reception of earthquake information by the public, some of them having been downloaded more than 1 million times! The advantages are obvious: wherever users are located, they can be automatically informed when an earthquake has struck. Just by setting a magnitude threshold and an area of interest, there is no longer any need to browse the internet, as the information reaches you automatically and instantaneously! One question remains: are the provided earthquake notifications always relevant for the public? Which earthquakes really matter to laypeople? One clue may be derived from newspaper reports showing that, a while after damaging earthquakes, many eyewitnesses scrap the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of the notifications are related to unfelt earthquakes, thereby only increasing anxiety among the population at each new update. Felt and damaging earthquakes are the ones that matter most to the public (and authorities). They are the ones of societal importance, even when of small magnitude. A smartphone application developed by EMSC (Euro-Med Seismological Centre) with the financial support of the Fondation MAIF aims at providing suitable notifications for earthquakes by collating different information threads covering tsunamigenic, potentially damaging and felt earthquakes. Tsunamigenic earthquakes are considered here to be those that are the subject of alert or information messages from the PTWC (Pacific Tsunami Warning Centre). Potentially damaging earthquakes, meanwhile, are identified through an automated system called EQIA (Earthquake Qualitative Impact Assessment) developed and operated at EMSC. This rapidly assesses earthquake impact by comparing the population exposed to each expected

  12. Can Earthquakes Induced by Deep Fluid Injection Projects Be Controlled or Limited?

    NASA Astrophysics Data System (ADS)

    McGarr, A.; Williams, C. F.; Hickman, S.; Oppenheimer, D. H.

    2011-12-01

    Projects that involve the injection of high-pressure fluids at depth include Enhanced Geothermal Systems (EGS), CO2 sequestration and liquid waste disposal. We consider some case histories to address the question of the extent to which earthquakes induced by fluid injection can be controlled or limited. For instance, can induced earthquakes be controlled in ways that don't compromise the effectiveness of a given injection project? It is difficult to answer this question definitively because, to our knowledge, only one successful experiment in earthquake control has been performed (Raleigh et al., Science, v. 191, pp. 1230-1237, 1976). Moreover, for numerous injection projects, the induced earthquakes of maximum magnitude have occurred after shut-in, e.g., at the Rocky Mountain Arsenal well, a liquid waste disposal project for which the three largest induced earthquakes occurred more than a year after injection had been terminated. For EGS operations requiring the injection of liquid into rock of low permeability, estimations of maximum magnitudes based on the volume of injected fluid have been moderately successful. For a typical magnitude distribution of induced earthquakes, it can be shown that the largest event accounts for about half of the total induced seismic moment, which is given by the volume of injected liquid multiplied by the modulus of rigidity (McGarr, J. Geophys. Res., v. 81, p. 1487, 1976). The Basel Deep Heat Mining project, an EGS injection of 11,500 cubic meters of water into low-permeability rock at a depth of five km, induced earthquakes with magnitudes that exceeded the safety threshold, and so injection was discontinued (Deichmann and Giardini, Seismol. Res. Letters, v. 80, p. 784, 2009). Approximately half a day after shut-in, however, an earthquake of magnitude 3.4 occurred, the largest event of the sequence. It is worth noting that the magnitude of this earthquake is quite close to what could have been estimated based on the volume of injected
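
    The volume-based bound mentioned above lends itself to a short worked example (a sketch of ours, assuming a typical crustal rigidity of 3 × 10^10 Pa and the abstract's rule that the largest event carries about half of the total induced moment):

      import math

      def largest_induced_magnitude(injected_volume_m3, rigidity_pa=3.0e10):
          total_moment = rigidity_pa * injected_volume_m3  # N*m, per McGarr (1976)
          largest_event_moment = 0.5 * total_moment        # largest event ~ half the total
          # moment magnitude (Hanks-Kanamori): Mw = (2/3) * (log10(M0) - 9.1)
          return (2.0 / 3.0) * (math.log10(largest_event_moment) - 9.1)

      # Basel EGS case from the abstract: ~11,500 cubic meters injected
      print(round(largest_induced_magnitude(11500.0), 1))  # ~3.4

    That this comes out near the observed magnitude 3.4 at Basel is consistent with the abstract's closing remark.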

  13. Stress Drops for Potentially Induced Earthquake Sequences

    NASA Astrophysics Data System (ADS)

    Huang, Y.; Beroza, G. C.; Ellsworth, W. L.

    2015-12-01

    Stress drop, the difference between shear stress acting across a fault before and after an earthquake, is a fundamental parameter of the earthquake source process and the generation of strong ground motions. Higher stress drops usually lead to more high-frequency ground motion. Hough [2014, 2015] observed low intensities in "Did You Feel It?" data for injection-induced earthquakes and interpreted them to be a result of low stress drops. It is also possible that the low recorded intensities could be a result of propagation effects; Atkinson et al. [2015] show that the shallow depth of injection-induced earthquakes can lead to a lack of high-frequency ground motion as well. We apply the spectral ratio method of Imanishi and Ellsworth [2006] to analyze stress drops of injection-induced earthquakes, using smaller earthquakes with similar waveforms as empirical Green's functions (eGfs). The effects of both path and linear site response should cancel out in the spectral ratio analysis. We apply this technique to the Guy-Greenbrier earthquake sequence in central Arkansas, where earthquakes migrated along the Guy-Greenbrier Fault while nearby injection wells were operating in 2010-2011. Huang and Beroza [GRL, 2015] improved the magnitude of completeness to about -1 using template matching and found that the earthquakes deviated from Gutenberg-Richter statistics during the operation of nearby injection wells. We identify 49 clusters of highly similar events in the Huang and Beroza [2015] catalog and calculate stress drops using the source model described in Imanishi and Ellsworth [2006]. Our results suggest that stress drops of the Guy-Greenbrier sequence are similar to those of tectonic earthquakes at Parkfield, California (see the attached figure). We will also present stress drop analysis of other suspected induced earthquake sequences using the same method.
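
    Once a moment M0 and a corner frequency fc have been measured from such spectral ratios, a Brune-type stress-drop estimate follows; a minimal sketch (ours, with an assumed near-source shear-wave speed, not the authors' code):

      def brune_stress_drop(moment_nm, corner_freq_hz, beta_m_s=3500.0):
          # source radius from Brune (1970): r = 0.37 * beta / fc
          radius_m = 0.37 * beta_m_s / corner_freq_hz
          # stress drop = 7 * M0 / (16 * r^3), in Pa
          return 7.0 * moment_nm / (16.0 * radius_m ** 3)

      # example: an M ~2 event (M0 ~ 1.3e12 N*m) with fc = 20 Hz
      print(f"{brune_stress_drop(1.3e12, 20.0) / 1e6:.1f} MPa")  # ~2 MPa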

  14. Evaluation of near-field earthquake effects

    SciTech Connect

    Shrivastava, H.P.

    1994-11-01

    Structures and equipment, which are qualified for the design basis earthquake (DBE) and have anchorage designed for the DBE loading, do not require an evaluation of near-field earthquake (NFE) effects. However, safety class 1 acceleration-sensitive equipment, such as electrical relays, must be evaluated for both NFE and DBE, since such equipment is known to malfunction when excited by high-frequency seismic motions.

  15. A Simplified Approach to the Basis Functions of Symmetry Operations and Terms of Metal Complexes in an Octahedral Field with d^1 to d^9 Configurations

    ERIC Educational Resources Information Center

    Lee, Liangshiu

    2010-01-01

    The basis sets for symmetry operations of d^1 to d^9 complexes in an octahedral field and the resulting terms are derived for the ground states and spin-allowed excited states. The basis sets are of fundamental importance in group theory. This work addresses such a fundamental issue, and the results are pedagogically…

  16. Identification of Deep Earthquakes

    DTIC Science & Technology

    2010-09-01

    develop a ground truth dataset of earthquakes at both normal crustal depths and earthquakes from subduction zones, below the overlying crust. Many...deep earthquakes (depths between about 50 and 300 km). These deep earthquakes are known to occur in the Asia-India continental collision zone ...and/or NIL, as these stations are within a few hundred km of the zone where deep earthquakes are known to occur. To date we have selected about 300

  17. Earthquake Shaking and Damage to Buildings: Recent evidence for severe ground shaking raises questions about the earthquake resistance of structures.

    PubMed

    Page, R A; Joyner, W B; Blume, J A

    1975-08-22

    Ground shaking close to the causative fault of an earthquake is more intense than it was previously believed to be. This raises the possibility that large numbers of buildings and other structures are not sufficiently resistant for the intense levels of shaking that can occur close to the fault. Many structures were built before earthquake codes were adopted; others were built according to codes formulated when less was known about the intensity of near-fault shaking. Although many building types are more resistant than conventional design analyses imply, the margin of safety is difficult to quantify. Many modern structures, such as freeways, have not been subjected to and tested by near-fault shaking in major earthquakes (magnitude 7 or greater). Damage patterns in recent moderate-sized earthquakes occurring in or adjacent to urbanized areas (17), however, indicate that many structures, including some modern ones designed to meet earthquake code requirements, cannot withstand the severe shaking that can occur close to a fault. It is necessary to review the ground motion assumed and the methods utilized in the design of important existing structures and, if necessary, to strengthen or modify the use of structures that are found to be weak. New structures situated close to active faults should be designed on the basis of ground motion estimates greater than those used in the past. The ultimate balance between risk of earthquake losses and cost for both remedial strengthening and improved earthquake-resistant construction must be decided by the public. Scientists and engineers must inform the public about earthquake shaking and its effect on structures. The exposure to damage from seismic shaking is steadily increasing because of continuing urbanization and the increasing complexity of lifeline systems, such as power, water, transportation, and communication systems. In the near future we should expect additional painful examples of the damage potential of moderate

  18. Material contrast does not predict earthquake rupture propagation direction

    USGS Publications Warehouse

    Harris, R.A.; Day, S.M.

    2005-01-01

    Earthquakes often occur on faults that juxtapose different rocks. The result is rupture behavior that differs from that of an earthquake occurring on a fault in a homogeneous material. Previous 2D numerical simulations have studied simple cases of earthquake rupture propagation where there is a material contrast across a fault and have come to two different conclusions: 1) earthquake rupture propagation direction can be predicted from the material contrast, and 2) earthquake rupture propagation direction cannot be predicted from the material contrast. In this paper we provide observational evidence from 70 years of earthquakes at Parkfield, CA, and new 3D numerical simulations. Both the observations and the numerical simulations demonstrate that earthquake rupture propagation direction is unlikely to be predictable on the basis of a material contrast. Copyright 2005 by the American Geophysical Union.

  19. High voltage electric substation performance in earthquakes

    SciTech Connect

    Eidinger, J.; Ostrom, D.; Matsuda, E.

    1995-12-31

    This paper examines the performance of several types of high voltage substation equipment in past earthquakes. Damage data is provided in chart form. This data is then developed into a tool for estimating the performance of a substation subjected to an earthquake. First, suggestions are made about the development of equipment class fragility curves that represent the expected earthquake performance of different voltages and types of equipment. Second, suggestions are made about how damage to individual pieces of equipment at a substation likely affects the post-earthquake performance of the substation as a whole. Finally, estimates are provided as to how quickly a substation, at various levels of damage, can be restored to operational service after the earthquake.
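
    The fragility-curve idea sketched above is commonly formalized as a lognormal damage probability in peak ground acceleration; a hedged illustration (the median capacity and dispersion below are invented for demonstration, not values from the paper):

      from math import log
      from statistics import NormalDist

      def damage_probability(pga_g, median_capacity_g, beta):
          # lognormal fragility: P(damage | PGA)
          return NormalDist().cdf(log(pga_g / median_capacity_g) / beta)

      # e.g. a hypothetical high-voltage equipment class: median capacity 0.5 g, beta 0.4
      for pga in (0.1, 0.3, 0.5, 0.7):
          print(f"PGA {pga} g -> P(damage) = {damage_probability(pga, 0.5, 0.4):.2f}")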

  20. Spatial Evaluation and Verification of Earthquake Simulators

    NASA Astrophysics Data System (ADS)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2016-09-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against currently observed earthquake seismicity is necessary, and following past simulator and forecast model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m > 6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
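
    A minimal sketch of the power-law smoothing described above (our illustration; the smoothing distance d and exponent q are placeholder values, not those of the paper):

      import numpy as np

      def smoothed_rate_map(event_xy, cell_xy, d_km=5.0, q=1.5):
          # each simulated epicenter contributes rate to every cell,
          # decaying with epicentral distance r as (r + d)^(-q)
          rates = np.zeros(len(cell_xy))
          for ex, ey in event_xy:
              r = np.hypot(cell_xy[:, 0] - ex, cell_xy[:, 1] - ey)
              rates += (r + d_km) ** (-q)
          return rates / rates.sum()  # normalized spatial rate map

      events = np.array([[0.0, 0.0], [10.0, 0.0], [20.0, 0.0]])  # on-fault epicenters
      cells = np.array([[0.0, 0.0], [0.0, 15.0], [25.0, 0.0], [50.0, 50.0]])
      print(smoothed_rate_map(events, cells).round(3))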

  1. Evidence for co-operativity in coenzyme binding to tetrameric Sulfolobus solfataricus alcohol dehydrogenase and its structural basis: fluorescence, kinetic and structural studies of the wild-type enzyme and non-co-operative N249Y mutant

    PubMed Central

    2005-01-01

    The interaction of coenzyme with thermostable homotetrameric NAD(H)-dependent alcohol dehydrogenase from the thermoacidophilic sulphur-dependent crenarchaeon Sulfolobus solfataricus (SsADH) and its N249Y (Asn-249→Tyr) mutant was studied using the high fluorescence sensitivity of its tryptophan residues Trp-95 and Trp-117 to the binding of coenzyme moieties. Fluorescence quenching studies performed at 25 °C show that SsADH exhibits linearity in NAD(H) binding [the Hill coefficient (h) ∼ 1] at pH 9.8 and at moderate ionic strength, in addition to positive co-operativity (h=2.0–2.4) at pH 7.8 and 6.8, and at pH 9.8 in the presence of salt. Furthermore, NADH binding is positively co-operative below 20 °C (h∼3) and negatively co-operative at 40–50 °C (h∼0.7), as determined at moderate ionic strength and pH 9.8. Steady-state kinetic measurements show that SsADH displays standard Michaelis–Menten kinetics between 35 and 45 °C, but exhibits positive and negative co-operativity for NADH oxidation below (h=3.3 at 20 °C) and above (h=0.7 at 70–80 °C) this range of temperatures, respectively. However, N249Y SsADH displays non-co-operative behaviour in coenzyme binding under the same experimental conditions used for the wild-type enzyme. In loop 270–275 of the coenzyme domain and segments at the interface of dimer A–B, analyses of the wild-type and mutant SsADH structures identified the structural elements involved in the intersubunit communication and suggested a possible structural basis for co-operativity. This is the first report of co-operativity in a tetrameric ADH and of temperature-induced co-operativity in a thermophilic enzyme. PMID:15651978
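
    The Hill formalism quoted above (h ∼ 1 non-co-operative, h > 1 positively co-operative) can be made concrete in a few lines (an illustration of ours, not the paper's analysis):

      def hill_saturation(ligand, k_half, h):
          # fractional saturation Y = L^h / (K^h + L^h)
          return ligand ** h / (k_half ** h + ligand ** h)

      # compare non-co-operative (h = 1) with positive co-operativity (h = 2.2)
      for conc in (0.25, 0.5, 1.0, 2.0):
          print(conc, round(hill_saturation(conc, 1.0, 1.0), 2),
                round(hill_saturation(conc, 1.0, 2.2), 2))

    The steeper transition of the h = 2.2 curve around the half-saturation point is the signature of the positive co-operativity reported for NAD(H) binding.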

  2. From Earthquake Prediction Research to Time-Variable Seismic Hazard Assessment Applications

    NASA Astrophysics Data System (ADS)

    Bormann, Peter

    2011-01-01

    The first part of the paper defines the terms and classifications common in earthquake prediction research and applications. This is followed by short reviews of major earthquake prediction programs initiated since World War II in several countries, for example the former USSR, China, Japan, the United States, and several European countries. It outlines the underlying expectations, concepts, and hypotheses, introduces the technologies and methodologies applied and some of the results obtained, which include both partial successes and failures. Emphasis is laid on discussing the scientific reasons why earthquake prediction research is so difficult and demanding and why the prospects are still so vague, at least as far as short-term and imminent predictions are concerned. However, classical probabilistic seismic hazard assessments, widely applied during the last few decades, have also clearly revealed their limitations. In their simple form, they are time-independent earthquake rupture forecasts based on the assumption of stable long-term recurrence of earthquakes in the seismotectonic areas under consideration. Therefore, during the last decade, earthquake prediction research and pilot applications have focused mainly on the development and rigorous testing of long and medium-term rupture forecast models in which event probabilities are conditioned by the occurrence of previous earthquakes, and on their integration into neo-deterministic approaches for improved time-variable seismic hazard assessment. The latter uses stress-renewal models that are calibrated for variations in the earthquake cycle as assessed on the basis of historical, paleoseismic, and other data, often complemented by multi-scale seismicity models, the use of pattern-recognition algorithms, and site-dependent strong-motion scenario modeling. International partnerships and a global infrastructure for comparative testing have recently been developed, for example the Collaboratory for the Study of

  3. EARTHQUAKE CAUSED RELEASES FROM A NUCLEAR FUEL CYCLE FACILITY

    SciTech Connect

    Charles W. Solbrig; Chad Pope; Jason Andrus

    2014-08-01

    The fuel cycle facility (FCF) at the Idaho National Laboratory is a nuclear facility which must be licensed in order to operate. A safety analysis is required for a license. This paper describes the analysis of the Design Basis Accident for this facility. This analysis involves a model of the transient behavior of the FCF inert-atmosphere hot cell following an earthquake-initiated breach of pipes passing through the cell boundary. The hot cell is used to process spent metallic nuclear fuel. Such breaches allow the introduction of air and subsequent burning of pyrophoric metals. The model predicts the pressure, temperature, volumetric releases, cell heat transfer, metal fuel combustion, heat generation rates, radiological releases and other quantities. The results show that releases from the cell are minimal and satisfactory for safety. This analysis method should be useful in other facilities that have potential for damage from an earthquake and could eliminate the need to backfit facilities with earthquake-proof boundaries or lessen the cost of new facilities.

  4. Performance of lifelines during the January 17, 1994 Northridge earthquake

    SciTech Connect

    Eguchi, R.T.; Chung, R.M.

    1995-12-31

    The occurrence of the January 17, 1994 Northridge earthquake has provided a unique opportunity to study the earthquake performance of lifeline systems. This particular area has experienced two major earthquake events in the last 25 years, each playing a significant role in changing the way in which one designs and constructs lifeline systems for earthquakes. In 1971, the San Fernando earthquake shook apart many lifeline systems, causing significant damage and service disruption to Los Angeles area residents and businesses. As a result of this earthquake, special investigations were initiated to better understand and design these systems to remain functional after moderate and major earthquakes. Because of these post-1971 efforts, significant damage to lifelines was minimized in the January event. In each new earthquake, however, new lessons are learned, and as a result of these lessons, changes in either design or operational procedures are made to reduce the effects in future events. In the Northridge earthquake, some of the most significant lessons include effects on electric power system components and older steel natural gas transmission pipelines. This paper attempts to identify where lessons from previous southern California earthquakes were useful in preparing for the Northridge earthquake. In addition, areas that deserve further research or analysis, as a result of new lessons learned from the Northridge earthquake, are identified.

  5. Toward real-time regional earthquake simulation of Taiwan earthquakes

    NASA Astrophysics Data System (ADS)

    Lee, S.; Liu, Q.; Tromp, J.; Komatitsch, D.; Liang, W.; Huang, B.

    2013-12-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 minutes after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 minutes for a 70 sec ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.
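
    The quoted 545 m grid spacing can be sanity-checked with the usual spectral-element rule of thumb of roughly five grid points per minimum wavelength (our back-of-envelope check; the minimum shear-wave speed is an assumed value):

      def required_grid_spacing(v_min_m_s, f_max_hz, points_per_wavelength=5.0):
          # minimum wavelength = v_min / f_max; spacing = wavelength / points
          return v_min_m_s / (f_max_hz * points_per_wavelength)

      # assuming ~2700 m/s minimum shear-wave speed near the surface
      print(required_grid_spacing(2700.0, 1.0))  # ~540 m, consistent with the 545 m quoted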

  6. Earthquake friction

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Bizzarri, Andrea

    2016-12-01

    Laboratory friction slip experiments on rocks provide firm evidence that the static friction coefficient μ has values ∼0.7. This would imply large amounts of heat produced by seismically active faults, but no heat flow anomaly is observed, and mineralogic evidence of frictional heating is virtually absent. This argues for lower μ values ∼0.2, as also required by the observed orientation of faults with respect to the maximum compressive stress. We show that accounting for the thermal and mechanical energy balance of the system removes this inconsistency, implying a multi-stage strain release process. The first stage consists of a small and slow aseismic slip at high friction on pre-existing stress concentrators within the fault volume but angled with the main fault as Riedel cracks. This introduces a second stage dominated by frictional temperature increase inducing local pressurization of pore fluids around the slip patches, which is in turn followed by a third stage in which thermal diffusion extends the frictionally heated zones, making them coalesce into a connected pressurized region oriented as the fault plane. Then, the system enters a state of equivalent low static friction in which it can undergo the fast elastic radiation slip prescribed by dislocation earthquake models.

  7. Post-Earthquake Debris Management - An Overview

    NASA Astrophysics Data System (ADS)

    Sarkar, Raju

    Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunamis, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as the waste generated during reconstruction work, can place significant challenges on national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated from an earthquake is not hazardous. Soil, building material, and green waste, such as trees and shrubs, make up most of the volume of earthquake debris. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also subsequently impose economic burdens on the reconstruction phase. In practice, most of the debris may be either disposed of at landfill sites, reused as materials for construction or recycled into useful commodities. Therefore, the debris clearance operation should focus on the geotechnical engineering approach as an important post-earthquake issue to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach that takes into account the different criteria related to the operation execution is proposed by highlighting the key issues concerning the handling of the construction

  8. The use of volunteer interpreters during the 2010 Haiti earthquake: lessons learned from the USNS COMFORT Operation Unified Response Haiti.

    PubMed

    Powell, Clydette; Pagliara-Miller, Claire

    2012-01-01

    On January 12, 2010, a magnitude 7.0 earthquake devastated Haiti, leading to the world's largest humanitarian effort in 60 years. The catastrophe led to massive destruction of homes and buildings, the loss of more than 200,000 lives, and overwhelmed the host nation response and its public health infrastructure. Among the many responders, the United States Government acted immediately by sending assistance to Haiti including a naval hospital ship as a tertiary care medical center, the USNS COMFORT. To adequately respond to the acute needs of patients, healthcare professionals on the USNS COMFORT relied on Haitian Creole-speaking volunteers who were recruited by the American Red Cross (ARC). These volunteers complemented full-time Creole-speaking military staff on board. The ARC provided 78 volunteers who were each able to serve up to 4 weeks on board. Volunteers' demographics, such as age and gender, as well as linguistic skills, work background, and prior humanitarian assistance experience varied. Volunteer efforts were critical in assisting with informed consent for surgery, family reunification processes, explanation of diagnosis and treatment, comfort to patients and families in various stages of grieving and death, and helping healthcare professionals to understand the cultural context and sensitivities unique to Haiti. This article explores key lessons learned in the use of volunteer interpreters in earthquake disaster relief in Haiti and highlights the approaches that optimize volunteer services in such a setting, and which may be applicable in similar future events.

  9. Earthquakes: hydrogeochemical precursors

    USGS Publications Warehouse

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  10. The size of earthquakes

    USGS Publications Warehouse

    Kanamori, H.

    1980-01-01

    How we should measure the size of an earthquake has been historically a very important, as well as a very difficult, seismological problem. For example, figure 1 shows the loss of life caused by earthquakes in recent times and clearly demonstrates that 1976 was the worst year for earthquake casualties in the 20th century. However, the damage caused by an earthquake is due not only to its physical size but also to other factors such as where and when it occurs; thus, figure 1 is not necessarily an accurate measure of the "size" of earthquakes in 1976. The point is that the physical process underlying an earthquake is highly complex; we therefore cannot express every detail of an earthquake by a simple straightforward parameter. Indeed, it would be very convenient if we could find a single number that represents the overall physical size of an earthquake. This was in fact the concept behind the Richter magnitude scale introduced in 1935.
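
    The convenience of a single size number can be illustrated with the Gutenberg-Richter energy relation, log10 E = 1.5 Ms + 4.8 (E in joules); a small sketch of ours, not part of the original paper:

      def radiated_energy_joules(magnitude):
          # Gutenberg-Richter magnitude-energy relation, E in joules
          return 10 ** (1.5 * magnitude + 4.8)

      # each unit of magnitude corresponds to ~32 times more radiated energy
      for m in (6.0, 7.0, 8.0):
          print(m, f"{radiated_energy_joules(m):.2e} J")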

  11. Earthquakes for Kids

    MedlinePlus

    Resource page for children with earthquake animations, science fair project ideas, and earthquake history, illustrated with photographs of a trench dug across a fault to learn about past earthquakes and of a scientist standing in front of a fault scarp in southern California.

  12. Speeding earthquake disaster relief

    USGS Publications Warehouse

    Mortensen, Carl; Donlin, Carolyn; Page, Robert A.; Ward, Peter

    1995-01-01

    In coping with recent multibillion-dollar earthquake disasters, scientists and emergency managers have found new ways to speed and improve relief efforts. This progress is founded on the rapid availability of earthquake information from seismograph networks.

  13. Earthquakes: Predicting the unpredictable?

    USGS Publications Warehouse

    Hough, S.E.

    2005-01-01

    The earthquake prediction pendulum has swung from optimism in the 1970s to rather extreme pessimism in the 1990s. Earlier work revealed evidence of possible earthquake precursors: physical changes in the planet that signal that a large earthquake is on the way. Some respected earthquake scientists argued that earthquakes are fundamentally unpredictable. The fate of the Parkfield prediction experiment appeared to support their arguments: a moderate earthquake had been predicted along a specified segment of the central San Andreas fault within five years of 1988, but had failed to materialize on schedule. At some point, however, the pendulum began to swing back. Reputable scientists began using the "P-word" not only in polite company, but also at meetings and even in print. If the optimism regarding earthquake prediction can be attributed to any single cause, it might be scientists' burgeoning understanding of the earthquake cycle.

  14. Estimating earthquake potential

    USGS Publications Warehouse

    Page, R.A.

    1980-01-01

    The hazards to life and property from earthquakes can be minimized in three ways. First, structures can be designed and built to resist the effects of earthquakes. Second, the location of structures and human activities can be chosen to avoid or to limit the use of areas known to be subject to serious earthquake hazards. Third, preparations for an earthquake in response to a prediction or warning can reduce the loss of life and damage to property as well as promote a rapid recovery from the disaster. The success of the first two strategies, earthquake engineering and land use planning, depends on being able to reliably estimate the earthquake potential. The key considerations in defining the potential of a region are the location, size, and character of future earthquakes and frequency of their occurrence. Both historic seismicity of the region and the geologic record are considered in evaluating earthquake potential. 

  15. Earthquake Loss Estimation Uncertainties

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Aleksander

    2013-04-01

    The paper addresses the reliability of strong-earthquake loss assessment by worldwide systems applied in emergency mode. Timely and correct action just after an event can result in significant benefits in saving lives. In this case, information about possible damage and the expected number of casualties is critical for making decisions about search and rescue operations and offering humanitarian assistance. Such rough information may be provided, first of all, by global systems operating in emergency mode. The experience of earthquake disasters in different earthquake-prone countries shows that the officials who are in charge of emergency response at national and international levels often lack prompt and reliable information on the disaster scope. Uncertainties on the parameters used in the estimation process are numerous and large: knowledge about the physical phenomena and uncertainties on the parameters used to describe them; global adequacy of modeling techniques to the actual physical phenomena; actual distribution of the population at risk at the very time of the shaking (with respect to the immediate threat: buildings or the like); knowledge about the source of shaking, etc. One need not be a sharp specialist to understand, for example, that the way a given building responds to a given shaking obeys mechanical laws which are poorly known (if not out of the reach of engineers for a large portion of the building stock); if a carefully engineered modern building is approximately predictable, this is far from the case for older buildings, which make up the bulk of inhabited buildings. The way the population inside the buildings at the time of shaking is affected by the physical damage caused to the buildings is not precisely known, by far. The paper analyzes the influence of uncertainties in strong event parameter determination by alert seismological surveys and of the simulation models used at all stages, from estimating shaking intensity

  16. A Century of Induced Earthquakes in Oklahoma

    NASA Astrophysics Data System (ADS)

    Hough, S. E.; Page, M. T.

    2015-12-01

    Seismicity rates have increased sharply since 2009 in the central and eastern United States, with especially high rates of activity in the state of Oklahoma. A growing body of evidence indicates that many of these events are induced, primarily by injection of wastewater in deep disposal wells. The upsurge in activity has raised two questions: what is the background rate of tectonic earthquakes in Oklahoma, and how much has the rate varied throughout historical and early instrumental times? We first review the historical catalog, including assessment of the completeness level of felt earthquakes, and show that seismicity rates since 2009 surpass previously observed rates throughout the 20th century. Furthermore, several lines of evidence suggest that most of the significant (Mw > 3.5) earthquakes in Oklahoma during the 20th century were likely induced by wastewater injection and/or enhanced oil recovery operations. We show that there is a statistically significant temporal and spatial correspondence between earthquakes and disposal wells permitted during the 1950s. The intensity distributions of the 1952 Mw 5.7 El Reno earthquake and the 1956 Mw 3.9 Tulsa County earthquake are similar to those of recent induced earthquakes, with significantly lower shaking than predicted by a regional intensity-prediction equation. The rate of tectonic earthquakes is thus inferred to be significantly lower than previously estimated throughout most of the state, but is difficult to estimate given scant incontrovertible evidence for significant tectonic earthquakes during the 20th century. We do find evidence for a low level of tectonic seismicity in southeastern Oklahoma associated with the Ouachita structural belt, and conclude that the 22 October 1882 Choctaw Nation earthquake, for which we estimate Mw 4.8, occurred in this zone.

  17. Redefining Earthquakes and the Earthquake Machine

    ERIC Educational Resources Information Center

    Hubenthal, Michael; Braile, Larry; Taber, John

    2008-01-01

    The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…

  18. Children's Ideas about Earthquakes

    ERIC Educational Resources Information Center

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, a natural disaster, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced to that extent. In particular, good training regarding earthquakes received in primary schools is considered…

  19. Can We Predict Earthquakes?

    SciTech Connect

    Johnson, Paul

    2016-08-31

    The only thing we know for sure about earthquakes is that one will happen again very soon. Earthquakes pose a vital yet puzzling set of research questions that have confounded scientists for decades, but new ways of looking at seismic information and innovative laboratory experiments are offering tantalizing clues to what triggers earthquakes — and when.

  20. Earthquake and Schools. [Videotape].

    ERIC Educational Resources Information Center

    Federal Emergency Management Agency, Washington, DC.

    Designing schools to make them more earthquake resistant and protect children from the catastrophic collapse of the school building is discussed in this videotape. It reveals that 44 of the 50 U.S. states are vulnerable to earthquakes, but most schools are structurally unprepared to take on the stresses that earthquakes exert. The cost to the…

  1. School Safety and Earthquakes.

    ERIC Educational Resources Information Center

    Dwelley, Laura; Tucker, Brian; Fernandez, Jeanette

    1997-01-01

    A recent assessment of earthquake risk to Quito, Ecuador, concluded that many of its public schools are vulnerable to collapse during major earthquakes. A subsequent examination of 60 buildings identified 15 high-risk buildings. These schools were retrofitted to meet standards that would prevent injury even during Quito's largest earthquakes. US…

  2. Real Earthquakes, Real Learning

    ERIC Educational Resources Information Center

    Schomburg, Aaron

    2003-01-01

    One teacher took her class on a year-long earthquake expedition. The goal was to monitor the occurrences of real earthquakes during the year and mark their locations with push pins on a wall-sized world map in the hallway outside the science room. The purpose of the project was to create a detailed picture of the earthquakes that occurred…

  3. Can We Predict Earthquakes?

    ScienceCinema

    Johnson, Paul

    2016-09-09

    The only thing we know for sure about earthquakes is that one will happen again very soon. Earthquakes pose a vital yet puzzling set of research questions that have confounded scientists for decades, but new ways of looking at seismic information and innovative laboratory experiments are offering tantalizing clues to what triggers earthquakes — and when.

  4. Safety Basis Report

    SciTech Connect

    R.J. Garrett

    2002-01-14

    As part of the internal Integrated Safety Management Assessment verification process, it was determined that there was a lack of documentation that summarizes the safety basis of the current Yucca Mountain Project (YMP) site characterization activities. It was noted that a safety basis would make it possible to establish a technically justifiable graded approach to the implementation of the requirements identified in the Standards/Requirements Identification Document. The Standards/Requirements Identification Documents commit a facility to compliance with specific requirements and, together with the hazard baseline documentation, provide a technical basis for ensuring that the public and workers are protected. This Safety Basis Report has been developed to establish and document the safety basis of the current site characterization activities, establish and document the hazard baseline, and provide the technical basis for identifying structures, systems, and components (SSCs) that perform functions necessary to protect the public, the worker, and the environment from hazards unique to the YMP site characterization activities. This technical basis for identifying SSCs serves as a grading process for the implementation of programs such as Conduct of Operations (DOE Order 5480.19) and the Suspect/Counterfeit Items Program. In addition, this report provides a consolidated summary of the hazards analyses processes developed to support the design, construction, and operation of the YMP site characterization facilities and, therefore, provides a tool for evaluating the safety impacts of changes to the design and operation of the YMP site characterization activities.

  5. Application of Seismic Array Processing to Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Meng, L.; Allen, R. M.; Ampuero, J. P.

    2013-12-01

    Earthquake early warning (EEW) systems that can issue warnings prior to the arrival of strong ground shaking during an earthquake are essential in mitigating seismic hazard. Many of the currently operating EEW systems work on the basis of empirical magnitude-amplitude/frequency scaling relations for a point source. This approach is of limited effectiveness for large events, such as the 2011 Tohoku-Oki earthquake, for which ignoring finite-source effects may result in underestimation of the magnitude. Here, we explore the concept of characterizing rupture dimensions in real time for EEW using clusters of dense low-cost accelerometers located near active faults. Back-tracing the waveforms recorded by such arrays allows the estimation of earthquake rupture size, duration, and directivity in real time, which enables EEW for M > 7 earthquakes. The concept is demonstrated with the 2004 Parkfield earthquake, one of the few big events (M > 6) that have been recorded by a local small-scale seismic array (UPSAR array, Fletcher et al., 2006). We first test the approach against synthetic rupture scenarios constructed by superposition of empirical Green's functions. We find it important to correct for the bias in back azimuth induced by dipping structures beneath the array. We applied the proposed methodology to the mainshock in a simulated real-time environment. After calibrating the dipping-layer effect with data from smaller events, we obtained an estimated rupture length of 9 km, consistent with the distance between the two main high-frequency subevents identified by back-projection using all local stations (Allman and Shearer, 2007). We propose deploying small-scale arrays every 30 km along the San Andreas Fault. The array processing is performed in local processing centers at each array. The output is compared with finite-fault solutions based on a real-time GPS system and then incorporated into the standard ElarmS system. The optimal aperture and array geometry is
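
    The array back-projection idea can be illustrated with a minimal delay-and-sum beamformer that grid-searches the horizontal slowness vector maximizing stacked energy. The geometry, sampling rate, and waveforms below are synthetic placeholders, and the dipping-layer back-azimuth correction discussed above is omitted:

      import numpy as np

      fs = 100.0                                   # samples per second
      xy = np.array([[0.0, 0.0], [0.5, 0.0],
                     [0.0, 0.5], [0.4, 0.4]])      # station offsets, km

      rng = np.random.default_rng(0)
      t = np.arange(0.0, 10.0, 1.0 / fs)
      wavelet = np.exp(-((t - 5.0) ** 2) / 0.05) * np.sin(2 * np.pi * 5.0 * t)
      true_s = np.array([0.10, 0.15])              # incoming slowness, s/km
      data = np.array([np.roll(wavelet, int(round(fs * xy[i] @ true_s)))
                       for i in range(len(xy))])
      data += 0.01 * rng.standard_normal(data.shape)

      best, best_pow = (0.0, 0.0), -np.inf
      for sx in np.linspace(-0.3, 0.3, 61):        # grid search over slowness
          for sy in np.linspace(-0.3, 0.3, 61):
              beam = np.mean([np.roll(data[i],
                              -int(round(fs * (xy[i, 0] * sx + xy[i, 1] * sy))))
                              for i in range(len(xy))], axis=0)
              p = np.sum(beam ** 2)
              if p > best_pow:
                  best_pow, best = p, (sx, sy)

      print("estimated slowness (sx, sy) = (%.2f, %.2f) s/km" % best)
      # The back azimuth follows from the orientation of this vector; a
      # dipping layer beneath the array biases it, hence the calibration
      # against smaller events noted above.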

  6. Crowdsourced earthquake early warning.

    PubMed

    Minson, Sarah E; Brooks, Benjamin A; Glennie, Craig L; Murray, Jessica R; Langbein, John O; Owen, Susan E; Heaton, Thomas H; Iannucci, Robert A; Hauser, Darren L

    2015-04-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California's Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.
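
    A common first stage in detecting shaking on low-cost sensors is a short-term-average/long-term-average (STA/LTA) trigger; the sketch below applies one to a synthetic single-component phone record. The window lengths and threshold are placeholders, not parameters from the study:

      import numpy as np

      def sta_lta(trace, fs, sta_win=0.5, lta_win=10.0):
          """STA/LTA ratio on squared amplitudes (placeholder windows)."""
          sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
          csum = np.cumsum(trace ** 2)
          sta = (csum[lta_n:] - csum[lta_n - sta_n:-sta_n]) / sta_n
          lta = (csum[lta_n:] - csum[:-lta_n]) / lta_n
          return sta / np.maximum(lta, 1e-12)

      fs = 50.0                                # a typical phone sampling rate
      rng = np.random.default_rng(1)
      acc = 0.002 * rng.standard_normal(int(60 * fs))          # sensor noise, g
      acc[1500:1700] += 0.05 * np.sin(np.linspace(0, 40 * np.pi, 200))  # "P wave"

      ratio = sta_lta(acc, fs)
      trig = np.argmax(ratio > 8.0) + int(10.0 * fs)  # offset by the LTA length
      print("trigger at t = %.2f s" % (trig / fs))    # ~30 s, the onset time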

  7. Crowdsourced earthquake early warning

    PubMed Central

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167

  8. Earthquake forecasting and warning

    SciTech Connect

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings of precursory phenomena are included. A summary of circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all magnitudes ≥ 7.0) in China and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  9. Operations

    ERIC Educational Resources Information Center

    Wilkins, Jesse L. M.; Norton, Anderson; Boyce, Steven J.

    2013-01-01

    Previous research has documented schemes and operations that undergird students' understanding of fractions. This prior research was based, in large part, on small-group teaching experiments. However, written assessments are needed in order for teachers and researchers to assess students' ways of operating on a whole-class scale. In this study,…

  10. Retrospective Evaluation of Earthquake Forecasts during the 2010-12 Canterbury, New Zealand, Earthquake Sequence

    NASA Astrophysics Data System (ADS)

    Werner, M. J.; Marzocchi, W.; Taroni, M.; Zechar, J. D.; Gerstenberger, M.; Liukis, M.; Rhoades, D. A.; Cattania, C.; Christophersen, A.; Hainzl, S.; Helmstetter, A.; Jimenez, A.; Steacy, S.; Jordan, T. H.

    2014-12-01

    The M7.1 Darfield, New Zealand (NZ), earthquake triggered a complex earthquake cascade that provides a wealth of new scientific data to study earthquake triggering and the predictive skill of statistical and physics-based forecasting models. To this end, the Collaboratory for the Study of Earthquake Predictability (CSEP) is conducting a retrospective evaluation of over a dozen short-term forecasting models that were developed by groups in New Zealand, Europe and the US. The statistical model group includes variants of the Epidemic-Type Aftershock Sequence (ETAS) model, non-parametric kernel smoothing models, and the Short-Term Earthquake Probabilities (STEP) model. The physics-based model group includes variants of the Coulomb stress triggering hypothesis, which are embedded either in Dieterich's (1994) rate-state formulation or in statistical Omori-Utsu clustering formulations (hybrid models). The goals of the CSEP evaluation are to improve our understanding of the physical mechanisms governing earthquake triggering, to improve short-term earthquake forecasting models and time-dependent hazard assessment for the Canterbury area, and to understand the influence of poor-quality, real-time data on the skill of operational (real-time) forecasts. To assess the latter, we use the earthquake catalog data that the NZ CSEP Testing Center archived in near real-time during the earthquake sequence and compare the predictive skill of models using the archived data as input with the skill attained using the best available data today. We present results of the retrospective model comparison and discuss implications for operational earthquake forecasting.
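
    Several of the statistical models named above are variants of ETAS; the conditional intensity they share can be sketched as follows. The parameter values are illustrative, not values fitted to the Canterbury sequence:

      import numpy as np

      def etas_rate(t, catalog, mu=0.2, K=0.05, alpha=1.0, c=0.01, p=1.2, m_c=3.0):
          """ETAS conditional intensity: lambda(t) = mu + sum of Omori-Utsu
          kernels scaled by parent magnitude. Parameters are illustrative."""
          rate = mu
          for t_i, m_i in catalog:
              if t_i < t:
                  rate += K * np.exp(alpha * (m_i - m_c)) * (t - t_i + c) ** (-p)
          return rate

      # Toy catalog of (origin time in days, magnitude); placeholder values.
      catalog = [(0.0, 7.1), (0.1, 5.8), (2.5, 6.2), (10.0, 5.0)]
      for t in (1.0, 5.0, 30.0):
          print("day %5.1f: lambda = %.3f events/day" % (t, etas_rate(t, catalog)))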
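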

  11. Earthquake Warning Performance in Vallejo for the South Napa Earthquake

    NASA Astrophysics Data System (ADS)

    Wurman, G.; Price, M.

    2014-12-01

    In 2002 and 2003, Seismic Warning Systems, Inc. installed first-generation QuakeGuard™ earthquake warning devices at all eight fire stations in Vallejo, CA. These devices are designed to detect the P-wave of an earthquake and initiate predetermined protective actions if the impending shaking is estimated at approximately Modified Mercalli Intensity V or greater. At the Vallejo fire stations the devices were set up to sound an audio alert over the public address system and to command the equipment bay doors to open. In August 2014, after more than 11 years of operating in the fire stations with no false alarms, the five units that were still in use triggered correctly on the MW 6.0 South Napa earthquake, less than 16 km away. The audio alert sounded in all five stations, providing fire fighters with 1.5 to 2.5 seconds of warning before the arrival of the S-wave, and the equipment bay doors opened in three of the stations. In one station the doors were disconnected from the QuakeGuard device, and another station lost power before the doors opened completely. These problems highlight just a small portion of the complexity associated with realizing actionable earthquake warnings. The issues experienced in this earthquake have already been addressed in subsequent QuakeGuard product generations, with downstream connection monitoring and backup power for critical systems. The fact that the fire fighters in Vallejo were afforded even two seconds of warning at these epicentral distances results from the design of the QuakeGuard devices, which focuses on rapid false positive rejection and ground motion estimates. We discuss the performance of the ground motion estimation algorithms, with an emphasis on the accuracy and timeliness of the estimates at close epicentral distances.

  12. Frog Swarms: Earthquake Precursors or False Alarms?

    PubMed Central

    Grant, Rachel A.; Conlan, Hilary

    2013-01-01

    juvenile animals migrating away from their breeding pond after a fruitful reproductive season. As amphibian populations undergo large fluctuations in numbers from year to year, this phenomenon will not occur on a yearly basis but will depend on successful reproduction, which is related to numerous climatic and geophysical factors. Hence, most large swarms of amphibians, particularly those involving very small frogs and occurring in late spring or summer, are not unusual and should not be considered earthquake precursors. In addition, it is likely that reports of several mass migrations of small toads prior to the Great Sichuan Earthquake in 2008 were not linked to the subsequent M = 7.9 event (some occurred at a great distance from the epicentre) and were probably coincidental. Statistical analysis of the data indicated frog swarms are unlikely to be connected with earthquakes. Reports of unusual behaviour giving rise to earthquake fears should be interpreted with caution, and consultation with experts in the field of earthquake biology is advised. PMID:26479746

  13. Earthquakes and the office-based surgeon.

    PubMed Central

    Conover, W A

    1992-01-01

    A major earthquake may strike while a surgeon is performing an operation in an office surgical facility. A sudden major fault disruption will lead to thousands of casualties and widespread destruction. Surgeons who operate in offices can help lessen havoc by careful preparation. These plans should coordinate with other disaster plans for effective triage, evacuation, and the treatment of casualties. PMID:1413756

  14. [Biomechanical characteristics of spinal cord tissue--basis for the development of modifications of the DREZ (dorsal root entry zone) operation].

    PubMed

    Spaić, M; Mikicić, D; Ilić, S; Milosavljević, I; Ivanović, S; Slavik, E; Antić, B

    2004-01-01

    Mechanical properties of the spinal cord tissue are the biological basis for the development of this modality of the DREZ surgery lesioning technique. Successful treatment of chronic neurogenic pain originating from spinal cord and cauda equina injury remains a significant management problem. The mechanism of this pain phenomenon has been shown to be related to neurochemical changes that lead to a state of hyperreactivity of the second-order dorsal horn neurons. DREZ surgery (dorsal root entry zone lesioning), designed to destroy the anatomical structures involved in pain generation and thus interrupt the neurogenic pain mechanism, has been performed as a causative procedure for this chronic pain using different technical modalities: radiofrequency (RF) coagulation, laser, ultrasound, and microsurgical DREZotomy. The purpose of the study was to assess the possibility of establishing a lesioning technique based on the natural difference in mechanical properties between the white and gray cord substance. We experimentally determined the mechanical properties of human cadaveric cord white versus gray tissue to test the possibility of selective suction of the dorsal horn gray substance as a DREZ lesioning procedure. Based on the difference in tissue elasticity between white and gray cord substance, we established a new and simple DREZ surgical lesioning technique that was tested on cadaver cord. To test and compare the size and shape of the achieved lesions, DREZ surgery was performed on cadaver cord employing selective dorsal horn suction as the lesioning method. After the procedure, the cadaver cord underwent histological fixation and analysis of the DREZ lesions achieved. Our results revealed that the white cord substance, with its longitudinal fiber structure, had four times higher dynamic viscosity than the gray substance with its local neuronal network structure (150 Pa·s versus 37.5 Pa·s), which provided

  15. Anthropogenic seismicity rates and operational parameters at the Salton Sea Geothermal Field.

    PubMed

    Brodsky, Emily E; Lajoie, Lia J

    2013-08-02

    Geothermal power is a growing energy source; however, efforts to increase production are tempered by concern over induced earthquakes. Although increased seismicity commonly accompanies geothermal production, induced earthquake rate cannot currently be forecast on the basis of fluid injection volumes or any other operational parameters. We show that at the Salton Sea Geothermal Field, the total volume of fluid extracted or injected tracks the long-term evolution of seismicity. After correcting for the aftershock rate, the net fluid volume (extracted-injected) provides the best correlation with seismicity in recent years. We model the background earthquake rate with a linear combination of injection and net production rates that allows us to track the secular development of the field as the number of earthquakes per fluid volume injected decreases over time.
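
    The linear rate model described above can be sketched as an ordinary least-squares fit of earthquake rate to injection and net-production rates. The monthly series below are synthetic placeholders, not Salton Sea data:

      import numpy as np

      rng = np.random.default_rng(2)
      months = 120
      injection = 1.0 + 0.5 * rng.random(months)       # injected volume rate
      production = 1.5 + 0.5 * rng.random(months)      # extracted volume rate
      net = production - injection                     # net fluid volume rate

      # Synthetic "observed" background seismicity rate for the demonstration
      quakes = 2.0 * injection + 1.0 * net + 0.3 * rng.standard_normal(months)

      # Least-squares fit: rate ~ a*injection + b*net + const
      G = np.column_stack([injection, net, np.ones(months)])
      coef, *_ = np.linalg.lstsq(G, quakes, rcond=None)
      print("a (injection) = %.2f, b (net) = %.2f, const = %.2f" % tuple(coef))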

  16. Comment on “Rethinking first-principles electron transport theories with projection operators: The problems caused by partitioning the basis set” [J. Chem. Phys. 139, 114104 (2013)]

    SciTech Connect

    Brandbyge, Mads

    2014-05-07

    In a recent paper Reuter and Harrison [J. Chem. Phys. 139, 114104 (2013)] question the widely used mean-field electron transport theories, which employ nonorthogonal localized basis sets. They claim these can violate an “implicit decoupling assumption,” leading to wrong results for the current, different from what would be obtained by using an orthogonal basis and dividing surfaces defined in real space. We argue that this assumption is not required to be fulfilled to get exact results. We show how the current/transmission calculated by the standard Green's function method is independent of whether or not the chosen basis set is nonorthogonal, and that the current for a given basis set is consistent with divisions in real space. The ambiguity known from charge population analysis for nonorthogonal bases does not carry over to calculations of charge flux.
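
    The basis-independence argument can be made concrete with the standard Green's-function transmission expressions, written here for a possibly nonorthogonal basis with overlap matrix S (this is the textbook form, not a formula quoted from the comment):

      % Retarded Green's function, broadening matrices, and transmission
      G^{r}(E) = \left[ E S - H - \Sigma_{L}(E) - \Sigma_{R}(E) \right]^{-1}
      \Gamma_{L/R}(E) = i \left[ \Sigma_{L/R}(E) - \Sigma_{L/R}^{\dagger}(E) \right]
      T(E) = \mathrm{Tr}\left[ \Gamma_{L}(E)\, G^{r}(E)\, \Gamma_{R}(E)\, G^{r\dagger}(E) \right]

    Because the transmission is a trace, it transforms invariantly under the change of representation connecting orthogonal and nonorthogonal bases, which is one way to see the point argued above.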

  17. Comment on "Rethinking first-principles electron transport theories with projection operators: The problems caused by partitioning the basis set" [J. Chem. Phys. 139, 114104 (2013)]

    NASA Astrophysics Data System (ADS)

    Brandbyge, Mads

    2014-05-01

    In a recent paper Reuter and Harrison [J. Chem. Phys. 139, 114104 (2013)] question the widely used mean-field electron transport theories, which employ nonorthogonal localized basis sets. They claim these can violate an "implicit decoupling assumption," leading to wrong results for the current, different from what would be obtained by using an orthogonal basis and dividing surfaces defined in real space. We argue that this assumption is not required to be fulfilled to get exact results. We show how the current/transmission calculated by the standard Green's function method is independent of whether or not the chosen basis set is nonorthogonal, and that the current for a given basis set is consistent with divisions in real space. The ambiguity known from charge population analysis for nonorthogonal bases does not carry over to calculations of charge flux.

  18. Maximum magnitude earthquakes induced by fluid injection

    NASA Astrophysics Data System (ADS)

    McGarr, A.

    2014-02-01

    Analysis of numerous case histories of earthquake sequences induced by fluid injection at depth reveals that the maximum magnitude appears to be limited according to the total volume of fluid injected. Similarly, the maximum seismic moment seems to have an upper bound proportional to the total volume of injected fluid. Activities involving fluid injection include (1) hydraulic fracturing of shale formations or coal seams to extract gas and oil, (2) disposal of wastewater from these gas and oil activities by injection into deep aquifers, and (3) the development of enhanced geothermal systems by injecting water into hot, low-permeability rock. Of these three operations, wastewater disposal is observed to be associated with the largest earthquakes, with maximum magnitudes sometimes exceeding 5. To estimate the maximum earthquake that could be induced by a given fluid injection project, the rock mass is assumed to be fully saturated and brittle, to respond to injection with a sequence of earthquakes localized to the region weakened by the pore-pressure increase of the injection operation, and to have a Gutenberg-Richter magnitude distribution with a b value of 1. If these assumptions correctly describe the circumstances of the largest earthquake, then the maximum seismic moment is limited to the volume of injected liquid times the modulus of rigidity. Observations from the available case histories of earthquakes induced by fluid injection are consistent with this bound on seismic moment. In view of the uncertainties in this analysis, however, this should not be regarded as an absolute physical limit.
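
    The stated bound (maximum seismic moment equal to the injected volume times the modulus of rigidity) converts directly to a magnitude, as the short sketch below shows; the rigidity value is a typical crustal number and the volumes are arbitrary examples:

      import math

      def mmax_bound(delta_v_m3, rigidity_pa=3.0e10):
          """Upper-bound moment M0 = G * dV, converted to Mw with the
          standard Hanks-Kanamori relation (M0 in N*m). The rigidity is
          a typical crustal value, not a site-specific measurement."""
          m0 = rigidity_pa * delta_v_m3
          return (2.0 / 3.0) * (math.log10(m0) - 9.1)

      for v in (1e4, 1e5, 1e6):              # injected volumes in cubic meters
          print("dV = %.0e m^3 -> Mw_max ~ %.1f" % (v, mmax_bound(v)))
      # e.g., 1e6 m^3 gives Mw_max ~ 4.9, consistent with the observation
      # that wastewater disposal magnitudes sometimes exceed 5 only for
      # very large cumulative volumes.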

  19. Missing great earthquakes

    USGS Publications Warehouse

    Hough, Susan E.

    2013-01-01

    The occurrence of three earthquakes with moment magnitude (Mw) greater than 8.8 and six earthquakes larger than Mw 8.5, since 2004, has raised interest in the long-term global rate of great earthquakes. Past studies have focused on the analysis of earthquakes since 1900, which roughly marks the start of the instrumental era in seismology. Before this time, the catalog is less complete and magnitude estimates are more uncertain. Yet substantial information is available for earthquakes before 1900, and the catalog of historical events is being used increasingly to improve hazard assessment. Here I consider the catalog of historical earthquakes and show that approximately half of all Mw ≥ 8.5 earthquakes are likely missing or underestimated in the 19th century. I further present a reconsideration of the felt effects of the 8 February 1843 Lesser Antilles earthquake, including a first thorough assessment of felt reports from the United States, and show it is an example of a known historical earthquake that was significantly larger than initially estimated. The results suggest that incorporation of best available catalogs of historical earthquakes will likely lead to a significant underestimation of seismic hazard and/or the maximum possible magnitude in many regions, including parts of the Caribbean.

  20. Earthquake risk assessment of Alexandria, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Gaber, Hanan; Ibrahim, Hamza

    2015-01-01

    Throughout historical and recent times, Alexandria has suffered great damage due to earthquakes from both near- and far-field sources. Sometimes, the sources of such damage are not well known. During the twentieth century, the city was shaken by several earthquakes generated from inland dislocations (e.g., 29 Apr. 1974, 12 Oct. 1992, and 28 Dec. 1999) and the African continental margin (e.g., 12 Sept. 1955 and 28 May 1998). Therefore, this study estimates the earthquake ground shaking and the consequent impacts in Alexandria on the basis of two earthquake scenarios. The simulation results show that Alexandria is affected by both earthquake scenarios in much the same manner, although the number of casualties in the first scenario (inland dislocation) is twice that of the second (African continental margin). Running the first scenario, an estimated 2.27 % of Alexandria's total constructions (12.9 million, 2006 census) will be affected, with injuries to 0.19 % and deaths of 0.01 % of the total population (4.1 million, 2006 census). The earthquake risk profile reveals that three districts (Al-Montazah, Al-Amriya, and Shark) lie at high seismic risk, two districts (Gharb and Wasat) at moderate, and two districts (Al-Gomrok and Burg El-Arab) at low seismic risk levels. Moreover, the building damage estimates show that Al-Montazah is the most vulnerable district, where 73 % of the expected damage is concentrated. The undertaken analysis shows that the Alexandria urban area faces high risk. Informal areas and deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risks. For instance, more than 90 % of the estimated earthquake risks (building damages) are concentrated in the most densely populated districts (Al-Montazah, Al-Amriya, and Shark). Moreover, about 75 % of casualties are in the same districts.

  1. Modified-Fibonacci-Dual-Lucas method for earthquake prediction

    NASA Astrophysics Data System (ADS)

    Boucouvalas, A. C.; Gkasios, M.; Tselikas, N. T.; Drakatos, G.

    2015-06-01

    The FDL method makes use of Fibonacci, Dual and Lucas numbers and has shown considerable success in predicting earthquake events locally as well as globally. Predicting the location of the epicenter of an earthquake is one difficult challenge, the others being the timing and magnitude. One technique for predicting the onset of earthquakes is the use of cycles and the discovery of periodicity; the reported FDL method belongs to this category. The basis of the reported FDL method is the creation of FDL future dates based on the onset dates of significant earthquakes, the assumption being that each earthquake discontinuity can be thought of as a generating source of an FDL time series. The connection between past earthquakes and future earthquakes based on FDL numbers has also been reported with sample earthquakes since 1900. Using clustering methods it has been shown that significant earthquakes (>6.5R) can be predicted within a very good accuracy window (±1 day). In this contribution we present an improved modification of the FDL method, the MFDL method, which performs better than the FDL. We use the FDL numbers to develop possible earthquake dates, but with the important difference that the starting seed date is a triggering planetary aspect prior to the earthquake. Typical planetary aspects are Moon conjunct Sun, Moon opposite Sun, and Moon conjunct or opposite the North or South Nodes. In order to test the improvement of the method we used all +8R earthquakes recorded since 1900 (86 earthquakes from USGS data). We developed the FDL numbers for each of those seeds and examined the earthquake hit rates (for a window of 3, i.e., ±1 day of the target date) for >6.5R events. The successes are counted for each one of the 86 earthquake seeds and we compare the MFDL method with the FDL method. In every case we find improvement when the starting seed date is on the planetary trigger date prior to the earthquake. We observe no improvement only when a planetary trigger coincided with
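
    As an illustration only, candidate dates in the spirit of the method might be generated by adding Fibonacci and Lucas numbers as day offsets to a seed date. The published method's exact series (including the "Dual" numbers) and windows differ, so this sketch is an assumption-laden reading, not the authors' algorithm:

      from datetime import date, timedelta

      def fib_lucas(n):
          """First n Fibonacci and Lucas numbers, merged and sorted;
          used here as illustrative day offsets."""
          fib, luc = [1, 2], [1, 3]
          while len(fib) < n:
              fib.append(fib[-1] + fib[-2])
              luc.append(luc[-1] + luc[-2])
          return sorted(set(fib + luc))

      def candidate_dates(seed, n=12):
          """Seed date plus each offset; a sketch of the FDL-style idea."""
          return [seed + timedelta(days=k) for k in fib_lucas(n)]

      for d in candidate_dates(date(1906, 4, 18)):   # seed: a significant event
          print(d.isoformat())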

  2. Compiling the 'Global Earthquake History' (1000-1903)

    NASA Astrophysics Data System (ADS)

    Albini, P.; Musson, R.; Locati, M.; Rovida, A.

    2013-12-01

    The study of historical earthquakes from historical sources, or historical seismology, is of wider interest than just the seismic hazard and risk community. Within the scope of the two-year project (October 2010-March 2013) "Global Earthquake History", developed in the framework of GEM, a reassessment of world historical seismicity was made from available published studies. The scope of the project is the time window 1000-1903, with magnitudes 7.0 and above. Events with lower magnitudes are included on a case-by-case, or region-by-region, basis. The Global Historical Earthquake Archive (GHEA) provides a complete account of the global situation in historical seismology. From GHEA, the Global Historical Earthquake Catalogue (GHEC, v1, available at http://www.emidius.eu/GEH/, under a Creative Commons licence) was derived, i.e. a world catalogue of earthquakes for the period 1000-1903, with magnitude 7 and over, using publicly available materials, as for the Archive. This is intended to be the best global historical catalogue of large earthquakes presently available, with the best parameters selected, duplications and fakes removed, and in some cases, new earthquakes discovered. GHEA and GHEC are conceived as providing a basis for coordinating future research into historical seismology in any part of the world and, hopefully, encouraging new historical earthquake research initiatives that will continue to improve the information available.

  3. Microearthquake networks and earthquake prediction

    USGS Publications Warehouse

    Lee, W.H.K.; Steward, S. W.

    1979-01-01

    A microearthquake network is a group of highly sensitive seismographic stations designed primarily to record local earthquakes of magnitudes less than 3. Depending on the application, a microearthquake network will consist of several stations or as many as a few hundred. They are usually classified as either permanent or temporary. In a permanent network, the seismic signal from each station is telemetered to a central recording site to cut down on operating costs and to allow more efficient and up-to-date processing of the data. However, telemetering can restrict the choice of sites because of the line-of-sight requirement for radio transmission or the need for telephone lines. Temporary networks are designed to be extremely portable and completely self-contained so that they can be very quickly deployed. They are most valuable for recording aftershocks of a major earthquake or for studies in remote areas.

  4. NCEER seminars on earthquakes

    USGS Publications Warehouse

    Pantelic, J.

    1987-01-01

    In May of 1986, the National Center for Earthquake Engineering Research (NCEER) in Buffalo, New York, held the first seminar in its new monthly forum called Seminars on Earthquakes. The Center's purpose in initiating the seminars was to educate the audience about earthquakes, to facilitate cooperation between the NCEER and visiting researchers, and to enable visiting speakers to learn more about the NCEER   

  5. Earthquakes, November-December 1973

    USGS Publications Warehouse

    Person, W.J.

    1974-01-01

    Other parts of the world suffered fatalities and significant damage from earthquakes. In Iran, an earthquake killed one person, injured many, and destroyed a number of homes. Earthquake fatalities also occurred in the Azores and in Algeria. 

  6. Earthquake history of Oregon

    USGS Publications Warehouse

    von Hake, C. A.

    1976-01-01

    Although situated between two States (California and Washington) that have had many violent earthquakes, Oregon is noticeably less active seismically. The greatest damage experienced resulted from a major shock near Olympia, Wash., in 1949. During the short historical record available (since 1841), 34 earthquakes of intensity V, Modified Mercalli Scale, or greater have centered within Oregon or near its borders. Only 13 of the earthquakes had an intensity above V, and many of the shocks were local. However, a 1936 earthquake in the eastern Oregon-Washington region caused extensive damage and was felt over an area of 272,000 square kilometers.

  7. Earthquakes of the Holocene.

    USGS Publications Warehouse

    Schwartz, D.P.

    1987-01-01

    Areas in which significant new data and insights have been obtained are: 1) fault slip rates; 2) earthquake recurrence models; 3) fault segmentation; 4) dating past earthquakes; 5) paleoseismicity in the eastern and central US; 6) folds and earthquakes; and 7) future earthquake behavior. The article summarizes important trends in each of these research areas based on information published between June 1982 and June 1986 and preprints of papers in press. The bibliography for this period contains mainly refereed publications in journals and books.-from Author

  8. Are Earthquake Magnitudes Clustered?

    SciTech Connect

    Davidsen, Joern; Green, Adam

    2011-03-11

    The question of earthquake predictability is a long-standing and important challenge. Recent results [Phys. Rev. Lett. 98, 098501 (2007); ibid. 100, 038501 (2008)] have suggested that earthquake magnitudes are clustered, thus indicating that they are not independent, in contrast to what is typically assumed. Here, we present evidence that the observed magnitude correlations are to a large extent, if not entirely, an artifact due to the incompleteness of earthquake catalogs and the well-known modified Omori law. The latter leads to variations in the frequency-magnitude distribution if the distribution is constrained to those earthquakes that are close in space and time to the directly following event.

  9. Investigating landslides caused by earthquakes - A historical review

    USGS Publications Warehouse

    Keefer, D.K.

    2002-01-01

    Post-earthquake field investigations of landslide occurrence have provided a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides. This paper traces the historical development of knowledge derived from these investigations. Before 1783, historical accounts of the occurrence of landslides in earthquakes are typically so incomplete and vague that conclusions based on these accounts are of limited usefulness. For example, the number of landslides triggered by a given event is almost always greatly underestimated. The first formal, scientific post-earthquake investigation that included systematic documentation of the landslides was undertaken in the Calabria region of Italy after the 1783 earthquake swarm. From then until the mid-twentieth century, the best information on earthquake-induced landslides came from a succession of post-earthquake investigations largely carried out by formal commissions that undertook extensive ground-based field studies. Beginning in the mid-twentieth century, when the use of aerial photography became widespread, comprehensive inventories of landslide occurrence have been made for several earthquakes in the United States, Peru, Guatemala, Italy, El Salvador, Japan, and Taiwan. Techniques have also been developed for performing "retrospective" analyses years or decades after an earthquake that attempt to reconstruct the distribution of landslides triggered by the event. The additional use of Geographic Information System (GIS) processing and digital mapping since about 1989 has greatly facilitated the level of analysis that can be applied to mapped distributions of landslides. Beginning in 1984, syntheses of worldwide and national data on earthquake-induced landslides have defined their general characteristics and the relations between their occurrence and various geologic and seismic parameters. However, the number of comprehensive post-earthquake studies of landslides is still

  10. Earthquake detection through computationally efficient similarity search.

    PubMed

    Yoon, Clara E; O'Reilly, Ossian; Bergen, Karianne J; Beroza, Gregory C

    2015-12-01

    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection-identification of seismic events in continuous data-is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact "fingerprints" of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes.
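
    The fingerprint-and-hash idea behind FAST can be sketched as follows, with a simplified fingerprint (sign of spectral deviations rather than Haar wavelets of spectrograms) and a banded exact-match hash standing in for MinHash; the data are synthetic:

      import numpy as np
      from collections import defaultdict

      rng = np.random.default_rng(3)

      def fingerprint(window):
          """Binary fingerprint from the amplitude spectrum; a simplification
          of FAST, which uses Haar wavelets of spectrogram images."""
          spec = np.abs(np.fft.rfft(window))[:64]
          return (spec > np.median(spec)).astype(np.uint8)

      def lsh_buckets(fps, band_size=8):
          """Group windows whose fingerprints match exactly on any band."""
          buckets = defaultdict(list)
          for idx, fp in enumerate(fps):
              for b in range(0, fp.size, band_size):
                  buckets[(b, fp[b:b + band_size].tobytes())].append(idx)
          return buckets

      # Continuous record: noise plus one repeating waveform at two offsets
      trace = rng.standard_normal(20000)
      event = 5.0 * rng.standard_normal(200)        # repeated source waveform
      trace[3000:3200] += event
      trace[15000:15200] += event

      wins = [trace[i:i + 200] for i in range(0, len(trace) - 200 + 1, 100)]
      fps = [fingerprint(w) for w in wins]

      candidates = set()                            # LSH candidate pairs
      for ids in lsh_buckets(fps).values():
          candidates.update((i, j) for i in ids for j in ids if i < j)
      detections = [(i, j) for i, j in candidates
                    if np.mean(fps[i] == fps[j]) > 0.8]   # verification step
      print(sorted(detections))   # the event windows (30, 150) should appear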

  11. Earthquake detection through computationally efficient similarity search

    PubMed Central

    Yoon, Clara E.; O’Reilly, Ossian; Bergen, Karianne J.; Beroza, Gregory C.

    2015-01-01

    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection—identification of seismic events in continuous data—is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact “fingerprints” of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes. PMID:26665176

  12. Earthquake activity in Oklahoma

    SciTech Connect

    Luza, K.V.; Lawson, J.E. Jr.

    1989-08-01

    Oklahoma is one of the most seismically active areas in the southern Mid-Continent. From 1897 to 1988, over 700 earthquakes are known to have occurred in Oklahoma. The earliest documented Oklahoma earthquake took place on December 2, 1897, near Jefferson, in Grant County. The largest known Oklahoma earthquake happened near El Reno on April 9, 1952. This magnitude 5.5 (mb) earthquake was felt from Austin, Texas, to Des Moines, Iowa, and covered a felt area of approximately 362,000 km². Prior to 1962, all earthquakes in Oklahoma (59) were either known from historical accounts or from seismograph stations outside the state. Over half of these events were located in Canadian County. In late 1961, the first seismographs were installed in Oklahoma. From 1962 through 1976, 70 additional earthquakes were added to the earthquake database. In 1977, a statewide network of seven semipermanent and three radio-telemetry seismograph stations were installed. The additional stations have improved earthquake detection and location in the state of Oklahoma. From 1977 to 1988, over 570 additional earthquakes were located in Oklahoma, mostly of magnitudes less than 2.5. Most of these events occurred on the eastern margin of the Anadarko basin along a zone 135 km long by 40 km wide that extends from Canadian County to the southern edge of Garvin County. Another general area of earthquake activity lies along and north of the Ouachita Mountains in the Arkoma basin. A few earthquakes have occurred in the shelves that border the Arkoma and Anadarko basins.

  13. Recent Progress and Development on Multi-parameters Remote Sensing Application in Earthquake Monitoring in China

    NASA Astrophysics Data System (ADS)

    Shen, Xuhui; Zhang, Xuemin; Hong, Shunying; Jing, Feng; Zhao, Shufan

    2014-05-01

    In the last ten years, several national research plans and scientific projects on remote sensing applications in earthquake monitoring have been implemented in China. Focusing on advancing earthquake monitoring capability and searching for a path toward earthquake prediction, satellite electromagnetic, satellite infrared, and D-InSAR technologies were developed systematically, and remarkable progress was achieved through statistical research on historical earthquakes, with an initial summary of space precursory characteristics that laid the foundation for gradually promoting practical use. On the basis of this work, the argumentation for the first space-based platform of China's earthquake stereoscopic observation system has been completed, and an integrated earthquake remote sensing application system has been designed comprehensively. Developing the space-based earthquake observation system has become a major trend of technological development in earthquake monitoring and prediction. More emphasis shall be placed on the construction of the space segment of the China earthquake stereoscopic observation system and on imminent major scientific projects: an earthquake deformation observation system and application research combining InSAR, satellite gravity, and GNSS for medium- and long-term earthquake monitoring and forecasting; infrared observation and technical systems and application research for medium- and short-term earthquake monitoring and forecasting; and satellite-based electromagnetic observation and technical systems for short-term and imminent earthquake monitoring.

  14. OMG Earthquake! Can Twitter improve earthquake response?

    NASA Astrophysics Data System (ADS)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 to 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information
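
    A minimal rate-spike detector over tweet timestamps captures the essence of the detection idea; the threshold and timings below are placeholders consistent with the background (<1 per hour) and burst (~150 per minute) rates quoted above:

      from collections import deque

      def tweet_spike(timestamps, window_s=60.0, threshold=10):
          """Return the time at which the one-minute count of 'earthquake'
          tweets first reaches a threshold far above background. The
          threshold is a placeholder, not an operational USGS setting."""
          window = deque()
          for t in timestamps:              # tweet arrival times, seconds
              window.append(t)
              while window[0] < t - window_s:
                  window.popleft()
              if len(window) >= threshold:
                  return t
          return None

      # Sparse background chatter, then a burst after an event at t = 600 s
      stamps = [100.0, 450.0] + [600.0 + 0.4 * k for k in range(300)]
      print("detection at t = %.1f s" % tweet_spike(stamps))   # ~603.6 s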

  15. A starting earthquake with harmonic effects

    NASA Astrophysics Data System (ADS)

    Babeshko, V. A.; Evdokimova, O. V.; Babeshko, O. M.

    2016-11-01

    The possibility of the occurrence of a starting earthquake with harmonic vibrations (caused by a vertical harmonic effect) of the lithospheric plates and the base on which the plates are resting is considered. This case differs from the static one [1], for which the boundary-problem operator is characterized by the presence of repeated eigenvalues. In the dynamic case, the eigenvalues of the operator are simple. It is found that the starting earthquake also occurs in this case and, in addition, earthquake hazard can increase due to the appearance of fatigue-breakdown conditions in the zone where the lithospheric plates approach each other. In turn, fatigue breakdown is related to periodic changes in the effective directions of maximal stresses in this zone.

  16. Earthquake history of Texas

    USGS Publications Warehouse

    von Hake, C. A.

    1977-01-01

    Seventeen earthquakes, intensity V or greater, have centered in Texas since 1882, when the first shock was reported. The strongest earthquake, a maximum intensity VIII, was in western Texas in 1931 and was felt over 1 165 000 km². Three shocks in the Panhandle region in 1925, 1936, and 1943 were widely felt.

  17. Earthquake research in China

    USGS Publications Warehouse

    Raleigh, B.

    1977-01-01

    The prediction of the Haicheng earthquake was an extraordinary achievement by the geophysical workers of the People's Republic of China, whose national program in earthquake research was less than 10 years old at the time. To study the background to this prediction, a delegation of 10 U.S. scientists, which I led, visited China in June 1976.

  18. Earthquake history of Oklahoma

    USGS Publications Warehouse

    von Hake, C. A.

    1976-01-01

    The strongest and most widely felt earthquake in Oklahoma occurred on April 9, 1952. The intensity VII (Modified Mercalli Scale) tremor was felt over 362,000 square kilometres. A second intensity VII earthquake, felt over a very small area, occurred in October 1956. In addition, 15 other shocks, intensity V or VI, have originated within Oklahoma.

  19. Earthquake history of Mississippi

    USGS Publications Warehouse

    von Hake, C. A.

    1974-01-01

    Since its admission into the Union in 1817, Mississippi has had only four earthquakes of intensity V or greater within its borders. Although the number of earthquakes known to have been centered within Mississippi's boundaries is small, the State has been affected by numerous shocks located in neighboring States. In 1811 and 1812, a series of great earthquakes near the New Madrid, Missouri, area was felt in Mississippi as far south as the Gulf Coast. The New Madrid series caused the banks of the Mississippi River to cave in as far as Vicksburg, more than 300 miles from the epicentral region. As a result of this great earthquake series, the northwest corner of Mississippi is in seismic risk zone 3, the highest risk zone. Except for the New Madrid series, effects in Mississippi from earthquakes located outside of the State have been less than intensity V.

  20. Demand surge following earthquakes

    USGS Publications Warehouse

    Olsen, Anna H.

    2012-01-01

    Demand surge is understood to be a socio-economic phenomenon where repair costs for the same damage are higher after large- versus small-scale natural disasters. It has reportedly increased monetary losses by 20 to 50%. In previous work, a model for the increased costs of reconstruction labor and materials was developed for hurricanes in the Southeast United States. The model showed that labor cost increases, rather than the material component, drove the total repair cost increases, and this finding could be extended to earthquakes. A study of past large-scale disasters suggested that there may be additional explanations for demand surge. Two such explanations specific to earthquakes are the exclusion of insurance coverage for earthquake damage and possible concurrent causation of damage from an earthquake followed by fire or tsunami. Additional research into these aspects might provide a better explanation for increased monetary losses after large- vs. small-scale earthquakes.

  1. Maximum Earthquake Magnitude Assessments by Japanese Government Committees (Invited)

    NASA Astrophysics Data System (ADS)

    Satake, K.

    2013-12-01

    The 2011 Tohoku earthquake (M 9.0) was the largest earthquake in Japanese history, and such a gigantic earthquake was not foreseen around Japan. After the 2011 disaster, various government committees in Japan have discussed and assessed the maximum credible earthquake size around Japan, but their values vary without definite consensus. I will review them with earthquakes along the Nankai Trough as an example. The Central Disaster Management Council, under the Cabinet Office, set up a policy for future tsunami disaster mitigation. Possible future tsunamis are classified into two levels: L1 and L2. The L2 tsunamis are the largest possible tsunamis with low frequency of occurrence, for which saving people's lives is the first priority, with soft measures such as tsunami hazard maps, evacuation facilities, or disaster education. The L1 tsunamis are expected to occur more frequently, typically once in a few decades, for which hard countermeasures such as breakwaters must be prepared. The assessments of L1 and L2 events are left to local governments. The CDMC also assigned M 9.1 as the maximum size of an earthquake along the Nankai trough, then computed the ground shaking and tsunami inundation for several scenario earthquakes. The estimated loss is about ten times that of the 2011 disaster, with maximum casualties of 320,000 and an economic loss of 2 trillion dollars. The Headquarters for Earthquake Research Promotion, under MEXT, was set up after the 1995 Kobe earthquake and has made long-term forecasts of large earthquakes and published national seismic hazard maps. The future probability of earthquake occurrence, for example in the next 30 years, was calculated from past data of large earthquakes, on the basis of the characteristic earthquake model. The HERP recently revised the long-term forecast of the Nankai trough earthquake; while the 30-year probability (60-70 %) is similar to the previous estimate, they noted the size can be M 8 to 9, considering the variability of past
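
    Renewal probabilities of this kind are commonly computed from a Brownian Passage Time (inverse Gaussian) distribution conditioned on the elapsed quiet period. The sketch below uses an illustrative mean recurrence, elapsed time, and aperiodicity rather than the committee's actual inputs, and happens to land in the quoted 60-70 % range:

      from scipy.stats import invgauss

      def conditional_prob(elapsed, horizon, mean_ri, alpha=0.24):
          """P(event within `horizon` yr | quiet for `elapsed` yr) under a
          BPT renewal model (an inverse-Gaussian distribution). mean_ri and
          alpha here are illustrative, not the committee's values."""
          mu = alpha ** 2                  # scipy shape giving CV = alpha
          scale = mean_ri / mu             # so the mean equals mean_ri
          F = lambda t: invgauss.cdf(t, mu, scale=scale)
          return (F(elapsed + horizon) - F(elapsed)) / (1.0 - F(elapsed))

      # e.g., ~88 yr mean recurrence, ~70 yr elapsed, 30-yr window
      print("30-yr probability: %.0f%%" % (100 * conditional_prob(70, 30, 88)))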

  2. Strategies for rapid global earthquake impact estimation: the Prompt Assessment of Global Earthquakes for Response (PAGER) system

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, D.J.

    2013-01-01

    This chapter summarizes the state-of-the-art for rapid earthquake impact estimation. It details the needs and challenges associated with quick estimation of earthquake losses following global earthquakes, and provides a brief literature review of various approaches that have been used in the past. With this background, the chapter introduces the operational earthquake loss estimation system developed by the U.S. Geological Survey (USGS) known as PAGER (for Prompt Assessment of Global Earthquakes for Response). It also details some of the ongoing developments of PAGER’s loss estimation models to better supplement the operational empirical models, and to produce value-added web content for a variety of PAGER users.

  3. Monitoring the Earthquake source process in North America

    USGS Publications Warehouse

    Herrmann, Robert B.; Benz, H.; Ammon, C.J.

    2011-01-01

    With the implementation of the USGS National Earthquake Information Center Prompt Assessment of Global Earthquakes for Response system (PAGER), rapid determination of earthquake moment magnitude is essential, especially for earthquakes that are felt within the contiguous United States. We report an implementation of moment tensor processing for application to broad, seismically active areas of North America. This effort focuses on the selection of regional crustal velocity models, codification of data quality tests, and the development of procedures for rapid computation of the seismic moment tensor. We systematically apply these techniques to earthquakes with reported magnitude greater than 3.5 in continental North America that are not associated with a tectonic plate boundary. Using the 0.02-0.10 Hz passband, we can usually determine, with few exceptions, moment tensor solutions for earthquakes with Mw as small as 3.7. The threshold is significantly influenced by the density of stations, the location of the earthquake relative to the seismic stations and, of course, the signal-to-noise ratio. With the existing permanent broadband stations in North America operated for rapid earthquake response, the seismic moment tensor of most earthquakes that are Mw 4 or larger can be routinely computed. As expected the nonuniform spatial pattern of these solutions reflects the seismicity pattern. However, the orientation of the direction of maximum compressive stress and the predominant style of faulting is spatially coherent across large regions of the continent.

  4. How Small the Number of Test Items Can Be for the Basis of Estimating the Operating Characteristics of the Discrete Responses to Unknown Test Items.

    ERIC Educational Resources Information Center

    Samejima, Fumiko; Changas, Paul S.

    The methods and approaches for estimating the operating characteristics of the discrete item responses without assuming any mathematical form have been developed and expanded. It has been made possible that, even if the test information function of a given test is not constant for the interval of ability of interest, it is used as the Old Test.…

  5. MyShake - A smartphone app to detect earthquake

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.; Schreier, L.; Kwon, Y. W.

    2015-12-01

    We designed an Android app that harnesses the accelerometers in personal smartphones to record earthquake-shaking data for research, hazard information, and warnings. The app has the ability to distinguish earthquake shaking from daily human activities based on the different patterns behind the movements. It can also be triggered by the traditional earthquake early warning (EEW) system to record for a certain amount of time to collect earthquake data. When the app is triggered by earthquake-like movements, it sends the trigger information, containing the time and location of the trigger, back to our server; at the same time, it stores the waveform data locally on the phone first and uploads it to our server later. Trigger information from multiple phones is processed in real time on the server to find the coherent signal that confirms an earthquake. The app therefore provides the basis for a smartphone seismic network that can detect earthquakes and even provide warnings. A planned public roll-out of MyShake could collect millions of seismic recordings for large earthquakes in many regions around the world.
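
    Published smartphone-seismology work separates earthquake shaking from everyday motion using simple waveform features such as interquartile range, zero-crossing rate, and cumulative absolute velocity, fed to a trained classifier. The sketch below computes those features and applies placeholder thresholds in place of MyShake's actual trained network:

      import numpy as np

      def features(acc, fs):
          """Interquartile range, zero-crossing rate (~dominant frequency),
          and cumulative absolute velocity for an acceleration window."""
          iqr = np.subtract(*np.percentile(acc, [75, 25]))
          zcr = 0.5 * np.mean(np.abs(np.diff(np.sign(acc))) > 0) * fs
          cav = np.sum(np.abs(acc)) / fs
          return iqr, zcr, cav

      def earthquake_like(acc, fs):
          iqr, zcr, _ = features(acc, fs)
          return iqr > 0.002 and zcr < 5.0     # placeholder decision rule

      fs = 50.0
      rng = np.random.default_rng(4)
      t = np.arange(0, 10, 1 / fs)
      handling = 0.3 * rng.standard_normal(t.size)        # broadband, high ZCR
      shaking = (0.05 * np.sin(2 * np.pi * 3 * t)
                 + 0.001 * rng.standard_normal(t.size))   # low-frequency sway
      print("handling ->", earthquake_like(handling, fs)) # expect False
      print("shaking  ->", earthquake_like(shaking, fs))  # expect True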

  6. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    NASA Astrophysics Data System (ADS)

    Saragoni, G. Rodolfo

    2008-07-01

    The recent commemoration of the centennial of the San Francisco and Valparaiso 1906 earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus Messina-Reggio Calabria 1908, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of some still-existing buildings that survived centennial earthquakes represents a challenge to better understand the limitations of our in-use earthquake design methods. Of the three centennial earthquakes considered, only the Valparaiso 1906 earthquake has been repeated, as the Central Chile, 1985, Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is done in the neighborhood of Valparaiso harbor. In this study, the only three centennial buildings of 3 stories that survived both earthquakes almost undamaged were identified. Since accelerograms for the 1985 earthquake were recorded both at El Almendral soil conditions and in rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. Therefore, it is recommended to use in the future more suitable instrumental parameters, such as the destructiveness potential factor, to describe earthquake demand.

  7. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    SciTech Connect

    Saragoni, G. Rodolfo

    2008-07-08

    The recent commemoration of the centennial of the San Francisco and Valparaiso 1906 earthquakes has given the opportunity to reanalyze the damage they caused from a modern earthquake engineering perspective. These two earthquakes, plus Messina-Reggio Calabria 1908, strongly influenced the birth and development of earthquake engineering. The study of the seismic performance of buildings still standing today that survived these centennial earthquakes represents a challenge to better understand the limitations of the earthquake design methods now in use. Of the three centennial earthquakes considered, only the Valparaiso 1906 earthquake has been repeated, as the Ms = 7.8 Central Chile earthquake of 1985. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is carried out in the neighborhood of Valparaiso harbor. The study identified the only three centennial three-story buildings that survived both earthquakes almost undamaged. Since accelerograms for the 1985 earthquake were recorded both at El Almendral soil conditions and on rock, the vulnerability analysis of these buildings is based on instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. It is therefore recommended that more suitable instrumental parameters, such as the destructiveness potential factor, be used in the future to describe earthquake demand.

  8. Triggered Earthquakes Following Parkfield?

    NASA Astrophysics Data System (ADS)

    Hough, S. E.

    2004-12-01

    When the M5.0 Arvin earthquake struck approximately 30 hours after the 28 September 2004 M6.0 Parkfield earthquake, it seemed likely if not obvious that the latter had triggered the former. The odds of a M5.0 or greater event occurring by random chance in a given 2-day window are low, on the order of 2%. However, previously published results suggest that remotely triggered earthquakes are observed only following much larger mainshocks, typically M7 or above. Moreover, using a standard beta-statistic approach, one finds no pervasive regional increase of seismicity in the weeks following the Parkfield mainshock. (Neither were any moderate events observed at regional distances following the 1934 and 1966 Parkfield earthquakes.) Was Arvin a remotely triggered earthquake? To address this issue further I compare the seismicity rate changes following the Parkfield mainshock with those following 14 previous M5.3-7.1 earthquakes in central and southern California. I show that, on average, seismicity increased to a distance of at least 120 km following these events. For all but the M7.1 Hector Mine mainshock, this is well beyond the radius of what would be considered a traditional aftershock zone. Average seismicity rates also increase, albeit more weakly, to a distance of about 220 km. These results suggest that even moderate mainshocks in central and southern California do trigger seismicity at distances up to 220 km, supporting the inference that Arvin was indeed a remotely triggered earthquake. In general, only weak triggering is expected following moderate (M5.5-6.5) mainshocks. However, as illustrated by Arvin and, in retrospect, the 1986 M5.5 Oceanside earthquake, which struck just 5 days after the M5.9 North Palm Springs earthquake, triggered events can sometimes be large enough to generate public interest and anxiety.
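
    The abstract invokes "a standard beta-statistic approach" for testing rate changes. A minimal sketch of the usual formulation (Matthews and Reasenberg, 1988) follows; the event counts in the example are invented for illustration.

    import math

    # Beta statistic for seismicity-rate change (Matthews & Reasenberg, 1988):
    # beta = (Na - N * p) / sqrt(N * p * (1 - p)), with p = t_after / t_total.
    # |beta| of roughly 2 or more is commonly read as a significant change.
    def beta_statistic(n_after, n_total, t_after, t_total):
        p = t_after / t_total
        expected = n_total * p
        return (n_after - expected) / math.sqrt(n_total * p * (1.0 - p))

    # Hypothetical example: 12 events in the 2 days after a mainshock, against
    # a background catalog of 400 events over 1000 days.
    print(beta_statistic(n_after=12, n_total=400, t_after=2.0, t_total=1000.0))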

  9. An empirical/theoretical model with dimensionless numbers to predict the performance of electrodialysis systems on the basis of operating conditions.

    PubMed

    Karimi, Leila; Ghassemi, Abbas

    2016-07-01

    Among the different technologies developed for desalination, the electrodialysis/electrodialysis reversal (ED/EDR) process is one of the most promising for treating brackish water with low salinity when there is high risk of scaling. Multiple researchers have investigated ED/EDR to optimize the process, determine the effects of operating parameters, and develop theoretical/empirical models. Previously published empirical/theoretical models have evaluated the effect of the hydraulic conditions of the ED/EDR on the limiting current density using dimensionless numbers. The reason for previous studies' emphasis on limiting current density is twofold: 1) to maximize ion removal, most ED/EDR systems are operated close to limiting current conditions when there is no scaling potential in the concentrate chamber due to a high concentration of less-soluble salts; and 2) for modeling the ED/EDR system with dimensionless numbers, it is more accurate and convenient to use limiting current density, where the boundary layer's characteristics are known at constant electrical conditions. To improve knowledge of ED/EDR systems, ED/EDR models should also be developed for the Ohmic region, where operation reduces energy consumption, facilitates targeted ion removal, and prolongs membrane life compared to limiting current conditions. In this paper, theoretical/empirical models were developed for ED/EDR performance over a wide range of operating conditions. The presented ion removal and selectivity models were developed for the removal of monovalent and divalent ions utilizing the dominant dimensionless numbers obtained from laboratory-scale electrodialysis experiments. At any system scale, these models can predict ED/EDR performance in terms of monovalent and divalent ion removal.
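
    The abstract does not report the fitted correlations, so the sketch below only illustrates the kind of dimensionless groups (Reynolds, Schmidt, Sherwood) that typically enter such ED/EDR models; the exponents and property values are placeholders, not the paper's results.

    # Dimensionless groups that typically enter ED/EDR limiting-current
    # correlations, with the Sherwood number as a power law in Reynolds and
    # Schmidt. The exponents a, b, c and the property values are placeholders.
    def reynolds(velocity, hydraulic_diameter, kinematic_viscosity):
        return velocity * hydraulic_diameter / kinematic_viscosity

    def schmidt(kinematic_viscosity, diffusivity):
        return kinematic_viscosity / diffusivity

    def sherwood(re, sc, a=0.5, b=0.5, c=0.33):
        return a * re**b * sc**c      # generic form: Sh = a * Re^b * Sc^c

    # Rough values for dilute brackish water near 25 C in a thin flow channel:
    re = reynolds(velocity=0.05, hydraulic_diameter=7e-4, kinematic_viscosity=1e-6)
    sc = schmidt(kinematic_viscosity=1e-6, diffusivity=1.6e-9)
    print(f"Re = {re:.0f}, Sc = {sc:.0f}, Sh = {sherwood(re, sc):.1f}")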

  10. Earthquakes and emergence

    NASA Astrophysics Data System (ADS)

    Earthquakes and emerging infections may not have a direct cause and effect relationship like tax evasion and jail, but new evidence suggests that there may be a link between the two human health hazards. Various media accounts have cited a massive 1993 earthquake in Maharashtra as a potential catalyst of the recent outbreak of plague in India that has claimed more than 50 lives and alarmed the world. The hypothesis is that the earthquake may have uprooted underground rat populations that carry the fleas infected with the bacterium that causes bubonic plague and can lead to the pneumonic form of the disease that is spread through the air.

  11. Earthquake engineering in Peru

    USGS Publications Warehouse

    Vargas, N.J.

    1983-01-01

    During the last decade, earthquake engineering research in Peru has been carried out at the Catholic University of Peru and at the Universidad Nacional de Ingeniera (UNI). The Geophysical Institute (IGP), under the auspices of the Organization of American States (OAS), has initiated other efforts in Peru in regional seismic hazard assessment programs with direct impact on the earthquake engineering program. Further details on these programs have been reported by L. Ocola in the Earthquake Information Bulletin, January-February 1982, vol. 14, no. 1, pp. 33-38.

  12. Properties of "started" earthquakes

    NASA Astrophysics Data System (ADS)

    Babeshko, V. A.; Evdokimova, O. V.; Babeshko, O. M.

    2016-04-01

    The properties of the earthquakes called "started" in [1] are studied. The problems associated with the method of revealing them, the expected behavior of the event, and the determination of its place, time, and intensity are discussed. Certain characteristic properties of real earthquakes are compared with the modeled ones. It is emphasized that there are no data on earthquakes of a similar type in scientific publications. A method using high-performance computations is proposed, embedding the investigation in topological spaces that have a wider spectrum of properties than functional spaces.

  13. On near-source earthquake triggering

    USGS Publications Warehouse

    Parsons, T.; Velasco, A.A.

    2009-01-01

    When one earthquake triggers others nearby, what connects them? Two processes are observed: static stress change from fault offset and dynamic stress changes from passing seismic waves. In the near-source region (r ≤ 50 km for M ≥ 5 sources) both processes may be operating, and since both mechanisms are expected to raise earthquake rates, it is difficult to isolate them. We thus compare explosions with earthquakes because only earthquakes cause significant static stress changes. We find that large explosions at the Nevada Test Site do not trigger earthquakes at rates comparable to similar magnitude earthquakes. Surface waves are associated with regional and long-range dynamic triggering, but we note that surface waves with low enough frequency to penetrate to depths where most aftershocks of the 1992 M = 5.7 Little Skull Mountain main shock occurred (~12 km) would not have developed significant amplitude within a 50-km radius. We therefore focus on the best candidate phases to cause local dynamic triggering, direct waves that pass through observed near-source aftershock clusters. We examine these phases, which arrived at the nearest (200-270 km) broadband station before the surface wave train and could thus be isolated for study. Direct comparison of spectral amplitudes of presurface wave arrivals shows that M ≥ 5 explosions and earthquakes deliver the same peak dynamic stresses into the near-source crust. We conclude that a static stress change model can readily explain observed aftershock patterns, whereas it is difficult to attribute near-source triggering to a dynamic process because of the dearth of aftershocks near large explosions.

  14. Seafloor earthquake measurement system, SEMS IV

    SciTech Connect

    Platzbecker, M.R.; Ehasz, J.P.; Franco, R.J.

    1997-07-01

    Staff of the Telemetry Technology Development Department (2664) have, in support of the U.S. Interior Department Minerals Management Service (MMS), developed and deployed the Seafloor Earthquake Measurement System IV (SEMS IV). The result of this development project is a series of three fully operational seafloor seismic monitor systems located at offshore platforms: Eureka, Grace, and Irene. The instrument probes are embedded from three to seven feet into the seafloor and hardwired to seismic data recorders installed topside at the offshore platforms. The probes and underwater cables were designed to survive the seafloor environment with an operational life of five years. The units have been operational for two years and have produced recordings of several minor earthquakes in that time. Sandia Labs will transfer operation of SEMS IV to MMS contractors in the coming months. 29 figs., 25 tabs.

  15. Harnessing the Collective Power of Eyewitnesses for Improved Earthquake Information

    NASA Astrophysics Data System (ADS)

    Bossu, R.; Lefebvre, S.; Mazet-Roux, G.; Steed, R.

    2013-12-01

    The Euro-Med Seismological Centre (EMSC) operates the second most visited global earthquake information website (www.emsc-csem.org), which attracts 2 million visits a month from about 200 different countries. We collect information about earthquake effects from eyewitnesses, such as online questionnaires and geolocated pictures, to rapidly constrain impact scenarios. At the beginning, the collection was purely intended to address a scientific issue: the rapid evaluation of an earthquake's impact. However, it rapidly became apparent that understanding eyewitnesses' expectations and motivations in the immediate aftermath of an earthquake was essential to optimise this data collection. Crowdsourcing information on earthquake effects does not apply to a pre-existing community: by definition, eyewitnesses only exist once the earthquake has struck. We developed a strategy on social networks (Facebook, Google+, Twitter...) to interface with spontaneously emerging online communities of eyewitnesses. The basic idea is to create a positive feedback loop: attract eyewitnesses and engage with them by providing expected earthquake information and services, collect their observations, and collate them into improved earthquake information services that attract more witnesses. We will present recent examples to illustrate how important the use of social networks is for engaging with eyewitnesses, especially in regions of low seismic activity where people are unaware of existing Internet resources dealing with earthquakes. A second type of information collated into our information services is derived from real-time analysis of the traffic on our website in the first minutes following an earthquake, an approach named flashsourcing. We show, using the example of the Mineral, Virginia earthquake, that the arrival times of eyewitnesses at our website follow the propagation of the generated seismic waves, and thus that eyewitnesses can be considered ground-motion sensors. Flashsourcing discriminates felt…
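
    As a toy illustration of the flashsourcing idea (and not EMSC's actual pipeline), the sketch below flags regions whose website traffic suddenly exceeds a multiple of its baseline rate; all region names, baselines, and thresholds are hypothetical.

    from collections import Counter

    # Toy flashsourcing: a surge of visits to an earthquake-information
    # website, grouped by visitor location, marks where shaking was felt.
    def felt_regions(hits, baseline, surge_factor=5.0):
        """hits: iterable of (minute, region) tuples from the web-server log.
        baseline: typical hits per minute per region. Returns regions whose
        current traffic exceeds surge_factor times their baseline."""
        per_region = Counter(region for _, region in hits)
        return [r for r, n in per_region.items()
                if n > surge_factor * baseline.get(r, 0.5)]

    log = ([(0, "Mineral VA")] * 40 + [(0, "Richmond VA")] * 25
           + [(0, "Boston MA")] * 2)
    print(felt_regions(log, baseline={"Mineral VA": 1, "Richmond VA": 2,
                                      "Boston MA": 3}))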

  16. On a Riesz basis of exponentials related to the eigenvalues of an analytic operator and application to a non-selfadjoint problem deduced from a perturbation method for sound radiation

    NASA Astrophysics Data System (ADS)

    Ellouz, Hanen; Feki, Ines; Jeribi, Aref

    2013-11-01

    In the present paper, we prove that the family of exponentials associated to the eigenvalues of the perturbed operator T(ɛ) ≔ T₀ + ɛT₁ + ɛ²T₂ + … + ɛᵏTₖ + … forms a Riesz basis in L²(0, T), T > 0, where ɛ ∈ ℂ, T₀ is a closed densely defined linear operator on a separable Hilbert space H with domain D(T₀) having isolated eigenvalues with multiplicity one, while T₁, T₂, … are linear operators on H having the same domain D ⊃ D(T₀) and satisfying a specific growing inequality. After that, we generalize this result using an H-Lipschitz function. As an application, we consider a non-selfadjoint problem deduced from a perturbation method for sound radiation.

  17. On a Riesz basis of exponentials related to the eigenvalues of an analytic operator and application to a non-selfadjoint problem deduced from a perturbation method for sound radiation

    SciTech Connect

    Ellouz, Hanen; Feki, Ines; Jeribi, Aref

    2013-11-15

    In the present paper, we prove that the family of exponentials associated to the eigenvalues of the perturbed operator T(ɛ) ≔ T₀ + ɛT₁ + ɛ²T₂ + … + ɛᵏTₖ + … forms a Riesz basis in L²(0, T), T > 0, where ɛ ∈ ℂ, T₀ is a closed densely defined linear operator on a separable Hilbert space H with domain D(T₀) having isolated eigenvalues with multiplicity one, while T₁, T₂, … are linear operators on H having the same domain D ⊃ D(T₀) and satisfying a specific growing inequality. After that, we generalize this result using an H-Lipschitz function. As an application, we consider a non-selfadjoint problem deduced from a perturbation method for sound radiation.

  18. Building losses assessment for Lushan earthquake utilization multisource remote sensing data and GIS

    NASA Astrophysics Data System (ADS)

    Nie, Juan; Yang, Siquan; Fan, Yida; Wen, Qi; Xu, Feng; Li, Lingling

    2015-12-01

    On 20 April 2013, a catastrophic earthquake of magnitude 7.0 struck Lushan County, northwestern Sichuan Province, China. This earthquake is known in China as the Lushan earthquake. The Lushan earthquake damaged many buildings, and the extent of building losses is one basis for emergency relief and reconstruction; thus, the building losses of the Lushan earthquake must be assessed. Remote sensing data and geographic information systems (GIS) can be employed to assess the building losses of the Lushan earthquake. This paper reports building loss assessment results for the Lushan earthquake disaster derived from multisource remote sensing data and GIS. The assessment results indicated that 3.2% of buildings in the affected areas completely collapsed, while 12% and 12.5% of buildings were heavily damaged and slightly damaged, respectively. The completely collapsed, heavily damaged, and slightly damaged buildings were mainly located in Danling County, Hongya County, Lushan County, Mingshan County, Qionglai County, Tianquan County, and Yingjing County.

  19. Earthquake swarms on Mount Erebus, Antarctica

    NASA Astrophysics Data System (ADS)

    Kaminuma, Katsutada; Baba, Megumi; Ueki, Sadato

    1986-12-01

    Mount Erebus (3794 m), located on Ross Island in McMurdo Sound, is one of the few active volcanoes in Antarctica. A high-sensitivity seismic network has been operated by Japanese and US parties on and around the volcano since December 1980. The results of these observations show two kinds of seismic activity on Ross Island: activity concentrated near the summit of Mount Erebus associated with Strombolian eruptions, and micro-earthquake activity spread through Mount Erebus and the surrounding area. Seismicity on Mount Erebus has been quite high, usually exceeding 20 volcanic earthquakes per day; they frequently occur in swarms with daily counts exceeding 100 events. Sixteen earthquake swarms with more than 250 events per day were recorded by the seismic network during the three-year period 1982-1984, and three notable earthquake swarms among the sixteen were recognized: in October 1982 (named 82-C), March-April 1984 (84-B), and July 1984 (84-F). Swarms 84-B and 84-F had a large total number of earthquakes and a large Ishimoto-Iida coefficient "m"; hence these two swarms are presumed to constitute one of the precursory phenomena to the new eruption, which took place on 13 September 1984 and lasted a few months.
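
    For reference, the Ishimoto-Iida coefficient m mentioned above comes from the standard amplitude-frequency relation (stated here from the general literature, not from this record):

    % Ishimoto-Iida amplitude-frequency relation: n(a) da is the number of
    % earthquakes whose maximum trace amplitude lies between a and a + da.
    n(a)\,\mathrm{d}a \;=\; k\,a^{-m}\,\mathrm{d}a
    % A larger coefficient m means the swarm is dominated by smaller events.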

  20. Metrics for comparing dynamic earthquake rupture simulations

    USGS Publications Warehouse

    Barall, Michael; Harris, Ruth A.

    2014-01-01

    Earthquakes are complex events that involve a myriad of interactions among multiple geologic features and processes. One of the tools available to assist with their study is computer simulation, particularly dynamic rupture simulation. A dynamic rupture simulation is a numerical model of the physical processes that occur during an earthquake. Starting with the fault geometry, friction constitutive law, initial stress conditions, and assumptions about the condition and response of the near‐fault rocks, a dynamic earthquake rupture simulation calculates the evolution of fault slip and stress over time as part of the elastodynamic numerical solution (Ⓔ see the simulation description in the electronic supplement to this article). The complexity of the computations in a dynamic rupture simulation makes it challenging to verify that the computer code is operating as intended, because there are no exact analytic solutions against which the codes' results can be directly compared. One approach for checking whether dynamic rupture computer codes are working satisfactorily is to compare each code's results with the results of other dynamic rupture codes running the same earthquake simulation benchmark. To perform such a comparison consistently, it is necessary to have quantitative metrics. In this paper, we present a new method for quantitatively comparing the results of dynamic earthquake rupture computer simulation codes.
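
    The abstract does not specify the paper's metrics, so the following is only a generic, hypothetical example of the kind of quantitative comparison involved: a normalized RMS misfit between two codes' slip time histories sampled on a common time grid.

    import numpy as np

    # Hypothetical comparison metric: normalized RMS misfit between two
    # codes' fault-slip time histories sampled on the same time grid.
    def normalized_rms_misfit(a, b):
        a, b = np.asarray(a, float), np.asarray(b, float)
        return np.sqrt(np.mean((a - b) ** 2)) / np.sqrt(np.mean(a ** 2))

    t = np.linspace(0.0, 10.0, 1001)
    slip_code1 = 1.0 - np.exp(-t)            # reference code's slip history
    slip_code2 = 1.0 - np.exp(-1.05 * t)     # slightly different solution
    print(f"misfit = {normalized_rms_misfit(slip_code1, slip_code2):.4f}")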

  1. Nonlinear processes in earthquakes

    SciTech Connect

    Jones, E.M.; Frohlich, C.

    1998-12-31

    This is the final report of a one-year, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). Three-dimensional, elastic-wave-propagation calculations were performed to define the effects of near-source geologic structure on the degree to which seismic signals produced by earthquakes resemble "non-double-couple" sources. Signals from sources embedded in a subducting slab showed significant phase and amplitude differences compared with a "no-slab" case. Modifications to the LANL elastic-wave propagation code enabled improved simulations of path effects on earthquake and explosion signals. These simulations demonstrate that near-source, shallow, low-velocity basins can introduce earthquake-like features into explosion signatures through conversion of compressive (P-wave) energy to shear (S- and R-wave) modes. Earthquake sources simulated to date do not show significant modifications.

  2. Historical Earthquakes and Active Structure for Georgia

    NASA Astrophysics Data System (ADS)

    Tsereteli, Nino; Varazanashivli, Otar

    2014-05-01

    Long-term seismic history is an important foundation for reliable assessment of seismic hazard and risk. Therefore, the completeness of earthquake catalogues in their longest, historical part is very important. Surviving historical sources, as well as special studies from institutes, museums, libraries and archives in Georgia, the Caucasus and the Middle East, indicate a high level of seismicity that entailed numerous human casualties and destruction on the territory of Georgia during the historical period. The study and detailed analysis of these original documents and studies have allowed us to create a new catalogue of historical earthquakes of Georgia from 1250 BC to 1900 AD. The method of the study is based on a multidisciplinary approach, i.e. on the joint use of methods of history, paleoseismology, archeoseismology, seismotectonics, geomorphology, etc. We present here a new parametric catalogue of 44 historic earthquakes of Georgia and a full "descriptor" of all the phenomena described in it. The summarized map of the distribution of maximum damage in the historical period (before 1900) on the territory of Georgia, constructed on its basis, clearly shows the main features of the seismic field during this period. In particular, in the axial part and on the southern slope of the Greater Caucasus there is a seismic gap, which was filled in 1991 by the strongest earthquake in Racha and its aftershocks. In addition, it is also obvious that the very high seismic activity in the central and eastern parts of the Javakheti highland is not described in the historical materials, and this fact requires further searches of various kinds of sources that contain data about historical earthquakes. We hope that this catalogue will enable the creation of a new joint (instrumental and historical) parametric earthquake catalogue of Georgia and will serve to assess the real seismic hazard and risk in the country.

  3. Dynamic stress changes during earthquake rupture

    USGS Publications Warehouse

    Day, S.M.; Yu, G.; Wald, D.J.

    1998-01-01

    We assess two competing dynamic interpretations that have been proposed for the short slip durations characteristic of kinematic earthquake models derived by inversion of earthquake waveform and geodetic data. The first interpretation would require a fault constitutive relationship in which rapid dynamic restrengthening of the fault surface occurs after passage of the rupture front, a hypothesized mechanical behavior that has been referred to as "self-healing." The second interpretation would require sufficient spatial heterogeneity of stress drop to permit rapid equilibration of elastic stresses with the residual dynamic friction level, a condition we refer to as "geometrical constraint." These interpretations imply contrasting predictions for the time dependence of the fault-plane shear stresses. We compare these predictions with dynamic shear stress changes for the 1992 Landers (M 7.3), 1994 Northridge (M 6.7), and 1995 Kobe (M 6.9) earthquakes. Stress changes are computed from kinematic slip models of these earthquakes, using a finite-difference method. For each event, static stress drop is highly variable spatially, with high stress-drop patches embedded in a background of low, and largely negative, stress drop. The time histories of stress change show predominantly monotonic stress change after passage of the rupture front, settling to a residual level, without significant evidence for dynamic restrengthening. The stress change at the rupture front is usually gradual rather than abrupt, probably reflecting the limited resolution inherent in the underlying kinematic inversions. On the basis of this analysis, as well as recent similar results obtained independently for the Kobe and Morgan Hill earthquakes, we conclude that, at the present time, the self-healing hypothesis is unnecessary to explain earthquake kinematics.

  4. Earthquake education in California

    USGS Publications Warehouse

    MacCabe, M. P.

    1980-01-01

    In a survey of community response to the earthquake threat in southern California, Ralph Turner and his colleagues in the Department of Sociology at the University of California, Los Angeles, found that the public very definitely wants to be educated about the kinds of problems and hazards they can expect during and after a damaging earthquake; and they also want to know how they can prepare themselves to minimize their vulnerability. Decisionmakers, too, are recognizing this new wave of public concern. 

  5. Source processes of strong earthquakes in the North Tien-Shan region

    NASA Astrophysics Data System (ADS)

    Kulikova, G.; Krueger, F.

    2013-12-01

    The Tien-Shan region attracts the attention of scientists worldwide due to its complexity and tectonic uniqueness. A series of very strong, destructive earthquakes occurred in Tien-Shan at the turn of the XIX and XX centuries. Such large intraplate earthquakes are rare in seismology, which increases the interest in the Tien-Shan region. The presented study focuses on the source processes of large earthquakes in Tien-Shan. The amount of seismic data is limited for those early times. In 1889, when a major earthquake occurred in Tien-Shan, seismic instruments were installed in very few locations in the world, and those analog records have not survived to the present day. Although around a hundred seismic stations were operating worldwide at the beginning of the XX century, it is not always possible to get high-quality analog seismograms. Digitizing seismograms is a very important step in the work with analog seismic records. While working with historical seismic records one has to take into account all the aspects and uncertainties of manual digitizing and the lack of accurate timing and instrument characteristics. In this study, we develop an easy-to-handle and fast digitization program on the basis of already existing software, which speeds up the digitizing process and accounts for all the recording-system uncertainties. Owing to the lack of absolute timing for the historical earthquakes (due to the absence of a universal clock at that time), we used time differences between P and S phases to relocate the earthquakes in North Tien-Shan and the body-wave amplitudes to estimate their magnitudes. Combining our results with geological data, five earthquakes in North Tien-Shan were precisely relocated. The digitizing of records can introduce steps into the seismograms, which makes restitution (removal of the instrument response) undesirable. To avoid restitution, we simulated historic seismograph recordings with given values for the damping and free period of the respective instrument and…
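
    The S-P relocation constraint mentioned above reduces, under a constant-velocity straight-ray assumption, to a simple distance formula; the sketch below uses generic crustal velocities, not values from the study.

    # Standard S-minus-P distance estimate under a constant-velocity,
    # straight-ray assumption: d = t_sp * vp * vs / (vp - vs). The velocities
    # are generic crustal averages (km/s), not values from the study.
    def sp_distance_km(t_sp_seconds, vp=6.0, vs=3.5):
        return t_sp_seconds * vp * vs / (vp - vs)

    # A 12 s S-P time puts the source roughly 100 km from the station:
    print(f"{sp_distance_km(12.0):.0f} km")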

  6. Injection-induced earthquakes.

    PubMed

    Ellsworth, William L

    2013-07-12

    Earthquakes in unusual locations have become an important topic of discussion in both North America and Europe, owing to the concern that industrial activity could cause damaging earthquakes. It has long been understood that earthquakes can be induced by impoundment of reservoirs, surface and underground mining, withdrawal of fluids and gas from the subsurface, and injection of fluids into underground formations. Injection-induced earthquakes have, in particular, become a focus of discussion as the application of hydraulic fracturing to tight shale formations is enabling the production of oil and gas from previously unproductive formations. Earthquakes can be induced as part of the process to stimulate the production from tight shale formations, or by disposal of wastewater associated with stimulation and production. Here, I review recent seismic activity that may be associated with industrial activity, with a focus on the disposal of wastewater by injection in deep wells; assess the scientific understanding of induced earthquakes; and discuss the key scientific challenges to be met for assessing this hazard.

  7. Earthquake prediction, societal implications

    NASA Astrophysics Data System (ADS)

    Aki, Keiiti

    1995-07-01

    "If I were a brilliant scientist, I would be working on earthquake prediction." This is a statement from a Los Angeles radio talk show I heard just after the Northridge earthquake of January 17, 1994. Five weeks later, at a monthly meeting of the Southern California Earthquake Center (SCEC), where more than two hundred scientists and engineers gathered to exchange notes on the earthquake, a distinguished French geologist who works on earthquake faults in China envied me for working now in southern California. This place is like northeastern China 20 years ago, when high seismicity and research activities led to the successful prediction of the Haicheng earthquake of February 4, 1975 with magnitude 7.3. A difficult question still haunting us [Aki, 1989] is whether the Haicheng prediction was founded on the physical reality of precursory phenomena or on the wishful thinking of observers subjected to the political pressure which encouraged precursor reporting. It is, however, true that a successful life-saving prediction like the Haicheng prediction can only be carried out by the coordinated efforts of decision makers and physical scientists.

  8. Injection-induced earthquakes

    USGS Publications Warehouse

    Ellsworth, William L.

    2013-01-01

    Earthquakes in unusual locations have become an important topic of discussion in both North America and Europe, owing to the concern that industrial activity could cause damaging earthquakes. It has long been understood that earthquakes can be induced by impoundment of reservoirs, surface and underground mining, withdrawal of fluids and gas from the subsurface, and injection of fluids into underground formations. Injection-induced earthquakes have, in particular, become a focus of discussion as the application of hydraulic fracturing to tight shale formations is enabling the production of oil and gas from previously unproductive formations. Earthquakes can be induced as part of the process to stimulate the production from tight shale formations, or by disposal of wastewater associated with stimulation and production. Here, I review recent seismic activity that may be associated with industrial activity, with a focus on the disposal of wastewater by injection in deep wells; assess the scientific understanding of induced earthquakes; and discuss the key scientific challenges to be met for assessing this hazard.

  9. Using a pruned basis, a non-product quadrature grid, and the exact Watson normal-coordinate kinetic energy operator to solve the vibrational Schrödinger equation for C2H4

    NASA Astrophysics Data System (ADS)

    Avila, Gustavo; Carrington, Tucker

    2011-08-01

    In this paper we propose and test a method for computing numerically exact vibrational energy levels of a molecule with six atoms. We use a pruned product basis, a non-product quadrature, the Lanczos algorithm, and the exact normal-coordinate kinetic energy operator (KEO) with the πᵗμπ term. The Lanczos algorithm is applied to a Hamiltonian with a KEO for which μ is evaluated at equilibrium. Eigenvalues and eigenvectors obtained from this calculation are used as a basis to obtain the final energy levels. The quadrature scheme is designed so that integrals for the most important terms in the potential will be exact. The procedure is tested on C2H4. All 12 coordinates are treated explicitly. We need only ~1.52 × 10⁸ quadrature points. A product Gauss grid with which one could calculate the same energy levels has at least 5.67 × 10¹³ points.
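
    For orientation, the πᵗμπ term referenced above belongs to the Watson normal-coordinate kinetic energy operator, which for total angular momentum J = 0 takes the standard form below (reproduced from the general literature, in units where ħ = 1, not from the paper itself):

    % Watson normal-coordinate kinetic energy operator for J = 0 (hbar = 1):
    \hat{T} \;=\; \frac{1}{2}\sum_{\alpha\beta}\hat{\pi}_\alpha\,
                  \mu_{\alpha\beta}\,\hat{\pi}_\beta
          \;+\; \frac{1}{2}\sum_{k}\hat{p}_k^{\,2}
          \;-\; \frac{1}{8}\sum_{\alpha}\mu_{\alpha\alpha}
    % pi_alpha: vibrational angular-momentum operators; mu: inverse effective
    % moment-of-inertia tensor; p_k: normal-mode momenta. The first sum is the
    % "pi^t mu pi" term cited in the abstract.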

  10. Charles Darwin's earthquake reports

    NASA Astrophysics Data System (ADS)

    Galiev, Shamil

    2010-05-01

    As well as being the 200th anniversary of Darwin's birth, 2009 marked 170 years since the publication of his book Journal of Researches. During the voyage, Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake that demolished hundreds of buildings, killing and injuring many people. Land was waved, lifted, and cracked; volcanoes awoke; and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of a great earthquake during and immediately after it. These effects have sometimes been repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science where experts remain largely in the dark. Darwin suggested that the effects were a result of '…the rending of strata, at a point not very deep below the surface of the earth…' and '…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the Earth's evolution and its dynamics. These ideas set the tone for the plate tectonic theory to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes: '...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with the results of recent publications (see Nature 461, 870-872; 636-639 and 462, 42-43; 87-89). About 200 years ago Darwin…

  11. An efficient repeating signal detector to investigate earthquake swarms

    NASA Astrophysics Data System (ADS)

    Skoumal, Robert J.; Brudzinski, Michael R.; Currie, Brian S.

    2016-08-01

    Repetitive earthquake swarms have been recognized as key signatures in fluid injection induced seismicity, precursors to volcanic eruptions, and slow slip events preceding megathrust earthquakes. We investigate earthquake swarms by developing a Repeating Signal Detector (RSD), a computationally efficient algorithm utilizing agglomerative clustering to identify similar waveforms buried in years of seismic recordings using a single seismometer. Instead of relying on existing catalogs of larger earthquakes, RSD identifies characteristic repetitive waveforms by rapidly identifying signals of interest above a low signal-to-noise ratio and then grouping them based on spectral and time domain characteristics, resulting in dramatically shorter processing time than more exhaustive autocorrelation approaches. We investigate seismicity in four regions using RSD: (1) volcanic seismicity at Mammoth Mountain, California, (2) subduction-related seismicity in Oaxaca, Mexico, (3) induced seismicity in Central Alberta, Canada, and (4) induced seismicity in Harrison County, Ohio. In each case, RSD detects a similar or larger number of earthquakes than existing catalogs created using more time-intensive methods. In Harrison County, RSD identifies 18 seismic sequences that correlate temporally and spatially to separate hydraulic fracturing operations, 15 of which were previously unreported. RSD utilizes a single seismometer for earthquake detection, which enables seismicity to be quickly identified in poorly instrumented regions, at the expense of relying on another method to locate the new detections. Due to the smaller computational overhead and success at distances up to ~50 km, RSD is well suited for real-time detection of low-magnitude earthquake swarms with permanent regional networks.
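
    As a minimal sketch of the clustering step described above (not the authors' RSD code), the following groups candidate waveforms by zero-lag correlation using hierarchical clustering; the linkage threshold and synthetic data are placeholders, and the real algorithm also handles lag alignment and spectral features.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    # Group candidate waveforms by zero-lag correlation with average-linkage
    # hierarchical (agglomerative) clustering.
    def cluster_waveforms(waveforms, cc_threshold=0.8):
        """waveforms: (n_events, n_samples) array. Returns cluster labels."""
        w = waveforms - waveforms.mean(axis=1, keepdims=True)
        w /= np.linalg.norm(w, axis=1, keepdims=True)
        cc = np.clip(w @ w.T, -1.0, 1.0)           # correlation matrix
        dist = squareform(1.0 - cc, checks=False)  # condensed distances
        tree = linkage(dist, method="average")
        return fcluster(tree, t=1.0 - cc_threshold, criterion="distance")

    rng = np.random.default_rng(0)
    template = np.sin(np.linspace(0, 20, 200))
    events = np.vstack([template + 0.1 * rng.standard_normal(200)
                        for _ in range(5)]
                       + [rng.standard_normal(200) for _ in range(3)])
    print(cluster_waveforms(events))     # repeaters share one label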

  12. Earthquakes; January-March 1976

    USGS Publications Warehouse

    Person, W.J.

    1976-01-01

    The year 1976 started out quite active, seismically. Four major earthquakes occurred in different parts of the world during the first 3 months of the year. Three earthquakes rattled the western rim of the Pacific Ocean from the Kuril Islands to the Kermadec Islands. The fourth major earthquake struck Guatemala, killing thousands of people, injuring many, and leaving thousands homeless. Earthquakes in Kentucky and Arkansas caused little damage but were felt in several States. Arizona experienced a sharp earthquake in the Chico Valley, which caused very little damage. Other States experienced earthquakes, but none caused damage. 

  13. Trial application of guidelines for nuclear plant response to an earthquake. Final report

    SciTech Connect

    Schmidt, W.; Oliver, R.; O'Connor, W.

    1993-09-01

    Guidelines have been developed to assist nuclear plant personnel in the preparation of earthquake response procedures for nuclear power plants. These guidelines are published in EPRI report NP-6695, "Guidelines for Nuclear Plant Response to an Earthquake," dated December 1989. This report includes two sets of nuclear plant procedures which were prepared to implement the guidelines of EPRI report NP-6695. The first set was developed by the Toledo Edison Company Davis-Besse plant. Davis-Besse is a pressurized water reactor (PWR) and contains relatively standard seismic monitoring instrumentation typical of many domestic nuclear plants. The second set of procedures was prepared by Yankee Atomic Electric Company for the Vermont Yankee facility. This plant is a boiling water reactor (BWR) with state-of-the-art seismic monitoring and PC-based data processing equipment, with software developed specifically to implement the OBE Exceedance Criterion presented in EPRI report NP-5930, "A Criterion for Determining Exceedance of the Operating Basis Earthquake." The two sets of procedures are intended to demonstrate how two different nuclear utilities have interpreted and applied the EPRI guidance given in report NP-6695.

  14. Developing ShakeCast statistical fragility analysis framework for rapid post-earthquake assessment

    USGS Publications Warehouse

    Lin, K.-W.; Wald, D.J.

    2012-01-01

    When an earthquake occurs, the U.S. Geological Survey (USGS) ShakeMap estimates the extent of potentially damaging shaking and provides overall information regarding the affected areas. The USGS ShakeCast system is a freely available, post-earthquake situational awareness application that automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users' facilities, sends notifications of potential damage to responsible parties, and generates facility damage assessment maps and other web-based products for emergency managers and responders. We describe notable improvements to the ShakeMap and ShakeCast applications. We present a design for comprehensive fragility implementation, integrating spatially varying ground-motion uncertainties into fragility curves for ShakeCast operations. For each facility, an overall inspection priority (or damage assessment) is assigned on the basis of combined component-based fragility curves using pre-defined logic. While regular ShakeCast users receive overall inspection priority designations for each facility, engineers can access the full fragility analyses for further evaluation.
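
    A component fragility curve of the kind combined in ShakeCast is commonly modeled as a lognormal CDF in a ground-motion intensity measure; the sketch below is a generic example with hypothetical median and dispersion, not values from the USGS implementation.

    from math import log
    from scipy.stats import norm

    # Generic lognormal component fragility curve: probability of reaching a
    # damage state given intensity measure im. theta (median) and beta
    # (dispersion) are hypothetical placeholders.
    def p_damage(im, theta=0.4, beta=0.6):
        """P = Phi(ln(im / theta) / beta); im and theta in the same units."""
        return norm.cdf(log(im / theta) / beta)

    for pga in (0.1, 0.4, 0.8):                    # PGA in g
        print(f"PGA {pga:.1f} g -> P(damage) = {p_damage(pga):.2f}")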

  15. Toward real-time regional earthquake simulation II: Real-time Online earthquake Simulation (ROS) of Taiwan earthquakes

    NASA Astrophysics Data System (ADS)

    Lee, Shiann-Jong; Liu, Qinya; Tromp, Jeroen; Komatitsch, Dimitri; Liang, Wen-Tzong; Huang, Bor-Shouh

    2014-06-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters, including the event origin time, hypocentral location, moment magnitude and focal mechanism, within 2 min after the occurrence of an earthquake. All of the source parameters are then automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). A new island-wide, high-resolution SEM mesh model is developed for the whole of Taiwan in this study. We have improved the SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible: users can conduct their own earthquake simulations by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 min for a 70 s ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real time.

  16. Prompt Assessment of Global Earthquakes for Response (PAGER): A System for Rapidly Determining the Impact of Earthquakes Worldwide

    USGS Publications Warehouse

    Earle, Paul S.; Wald, David J.; Jaiswal, Kishor S.; Allen, Trevor I.; Hearne, Michael G.; Marano, Kristin D.; Hotovec, Alicia J.; Fee, Jeremy

    2009-01-01

    Within minutes of a significant earthquake anywhere on the globe, the U.S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system assesses its potential societal impact. PAGER automatically estimates the number of people exposed to severe ground shaking and the shaking intensity at affected cities. Accompanying maps of the epicentral region show the population distribution and estimated ground-shaking intensity. A regionally specific comment describes the inferred vulnerability of the regional building inventory and, when available, lists recent nearby earthquakes and their effects. PAGER's results are posted on the USGS Earthquake Program Web site (http://earthquake.usgs.gov/), consolidated in a concise one-page report, and sent in near real-time to emergency responders, government agencies, and the media. Both rapid and accurate results are obtained through manual and automatic updates of PAGER's content in the hours following significant earthquakes. These updates incorporate the most recent estimates of earthquake location, magnitude, faulting geometry, and first-hand accounts of shaking. PAGER relies on a rich set of earthquake analysis and assessment tools operated by the USGS and contributing Advanced National Seismic System (ANSS) regional networks. A focused research effort is underway to extend PAGER's near real-time capabilities beyond population exposure to quantitative estimates of fatalities, injuries, and displaced population.

  17. Persistent earthquake clusters and gaps from slip on irregular faults

    NASA Astrophysics Data System (ADS)

    Parsons, Tom

    2008-01-01

    Earthquake-producing fault systems like the San Andreas fault in California show self-similar structural variation; earthquakes cluster in space, leaving aseismic gaps between clusters. Whether gaps represent overdue earthquakes or signify diminished risk is a question with which seismic-hazard forecasters wrestle. Here I use spectral analysis of the spatial distribution of seismicity along the San Andreas fault (for earthquakes of magnitude 2 or greater), which reveals that it obeys a power-law relationship, indicative of self-similarity in clusters across a range of spatial scales. To determine whether the observed clustering of earthquakes is the result of a heterogeneous stress distribution, I use a finite-element method to simulate the motion of two rigid blocks past each other along a model fault surface that shows three-dimensional complexity on the basis of mapped traces of the San Andreas fault. The results indicate that long-term slip on the model fault generates a temporally stable, spatially variable distribution of stress that shows the same power-law relationship as the earthquake distribution. At the highest rates of San Andreas fault slip (40 mm yr⁻¹), the stress patterns produced are stable over a minimum of 25,000 years before the model fault system evolves into a new configuration. These results suggest that although gaps are not immune to rupture propagation, they are less likely to be nucleation sites for earthquakes.
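
    A toy version of the spectral test described above might bin epicenter positions along the fault, Fourier-transform the counts, and fit the log-log slope of the power spectrum; the synthetic catalog below is illustrative only, not the paper's data or method in detail.

    import numpy as np

    # Bin positions along the fault, take the power spectrum of the counts,
    # and fit the log-log slope; a straight line in log-log space indicates
    # power-law (self-similar) clustering.
    def spectral_slope(positions_km, fault_length_km, n_bins=1024):
        counts, _ = np.histogram(positions_km, bins=n_bins,
                                 range=(0.0, fault_length_km))
        power = np.abs(np.fft.rfft(counts - counts.mean())) ** 2
        freq = np.fft.rfftfreq(n_bins, d=fault_length_km / n_bins)
        mask = freq > 0
        slope, _ = np.polyfit(np.log(freq[mask]),
                              np.log(power[mask] + 1e-12), 1)
        return slope

    rng = np.random.default_rng(1)
    events = rng.uniform(0.0, 1000.0, size=5000)   # stand-in catalog
    print(f"spectral slope ~ {spectral_slope(events, 1000.0):.2f}")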

  18. The earthquake potential of the New Madrid seismic zone

    USGS Publications Warehouse

    Tuttle, M.P.; Schweig, E.S.; Sims, J.D.; Lafferty, R.H.; Wolf, L.W.; Haynes, M.L.

    2002-01-01

    The fault system responsible for New Madrid seismicity has generated temporally clustered very large earthquakes in A.D. 900 ± 100 years and A.D. 1450 ± 150 years as well as in 1811-1812. Given the uncertainties in dating liquefaction features, the time between the past three New Madrid events may be as short as 200 years and as long as 800 years, with an average of 500 years. This advance in understanding the Late Holocene history of the New Madrid seismic zone and thus, the contemporary tectonic behavior of the associated fault system was made through studies of hundreds of earthquake-induced liquefaction features at more than 250 sites across the New Madrid region. We have found evidence that prehistoric sand blows, like those that formed during the 1811-1812 earthquakes, are probably compound structures resulting from multiple earthquakes closely clustered in time or earthquake sequences. From the spatial distribution and size of sand blows and their sedimentary units, we infer the source zones and estimate the magnitudes of earthquakes within each sequence and thereby characterize the detailed behavior of the fault system. It appears that fault rupture was complex and that the central branch of the seismic zone produced very large earthquakes during the A.D. 900 and A.D. 1450 events as well as in 1811-1812. On the basis of a minimum recurrence rate of 200 years, we are now entering the period during which the next 1811-1812-type event could occur.

  19. Large magnitude (M > 7.5) offshore earthquakes in 2012: few examples of absent or little tsunamigenesis, with implications for tsunami early warning

    NASA Astrophysics Data System (ADS)

    Pagnoni, Gianluca; Armigliato, Alberto; Tinti, Stefano

    2013-04-01

    We consider some examples of offshore earthquakes that occurred worldwide in 2012 and were characterised by a "large" magnitude (Mw equal to or larger than 7.5) but produced no or little tsunami effect. Here, "little" means "lower than expected on the basis of the parent earthquake magnitude". The examples we analyse include three earthquakes that occurred along the Pacific coast of Central America (20 March, Mw=7.8, Mexico; 5 September, Mw=7.6, Costa Rica; 7 November, Mw=7.5, Mexico), the Mw=7.6 and Mw=7.7 earthquakes that occurred on 31 August and 28 October offshore the Philippines and offshore Alaska, respectively, and the two Indian Ocean earthquakes registered on a single day (11 April) with Mw=8.6 and Mw=8.2. For each event, we approach the problem of its tsunamigenic potential from two different perspectives. The first is purely scientific and coincides with the question: why was the ensuing tsunami so weak? The answer can be related partly to the particular tectonic setting in the source area, partly to the particular position of the source with respect to the coastline, and finally to the focal mechanism of the earthquake and to the slip distribution on the ruptured fault. The first two pieces of information are available soon after the earthquake occurrence, while the third requires time periods on the order of tens of minutes. The second perspective is more "operational" and coincides with the tsunami early warning perspective, for which the question is: will the earthquake generate a significant tsunami and, if so, where will it strike? The Indian Ocean events of 11 April 2012 are perfect examples of the fact that information on the earthquake magnitude and position alone may not be sufficient to produce reliable tsunami warnings. We emphasise that it is of utmost importance that the focal mechanism determination be obtained in the future much more quickly than it is at present and that this…

  20. Applications of Multi-Cycle Earthquake Simulations to Earthquake Hazard

    NASA Astrophysics Data System (ADS)

    Gilchrist, Jacquelyn Joan

    This dissertation seeks to contribute to earthquake hazard analysis and forecasting by conducting a detailed study of the processes controlling the occurrence, and particularly the clustering, of large earthquakes, the probabilities of these large events, and the dynamics of their ruptures. We use the multi-cycle earthquake simulator RSQSim to investigate several fundamental aspects of earthquake occurrence in order to improve the understanding of earthquake hazard. RSQSim, a 3D boundary-element code that incorporates rate- and state-dependent friction to simulate earthquakes in fully interacting, complex fault systems, has been successful at modeling several aspects of fault slip and earthquake occurrence. Multi-event earthquake models with time-dependent nucleation based on rate- and state-dependent friction, such as RSQSim, provide a viable physics-based method for modeling earthquake processes. These models can improve the understanding of earthquake hazard by improving our knowledge of earthquake processes and probabilities. RSQSim is fast and efficient, and is therefore able to simulate very long sequences of earthquakes (from hundreds of thousands to millions of events). This makes RSQSim an ideal instrument for filling in the current gaps in earthquake data, from short and incomplete earthquake catalogs to the unrealistic initial conditions used for dynamic rupture models. RSQSim catalogs include foreshocks, aftershocks, and occasional clusters of large earthquakes, the statistics of which are important for the estimation of earthquake probabilities. Additionally, RSQSim finds a near-optimal nucleation location that enables ruptures to propagate at minimal stress conditions and thus can provide suites of heterogeneous initial conditions for dynamic rupture models, producing reduced ground motions compared to models with homogeneous initial stresses and arbitrary forced nucleation locations.

  1. Training for Success: Intelligence Training in Support of Humanitarian Assistance Operations

    DTIC Science & Technology

    2016-06-10

    following the 2015 Nepal earthquake. In visualizing these humanitarian operations, one mostly conjures up images of doctors conducting surgery in… earthquake recovery operation, a radiological disaster operation, and an infectious disease operation. This diversity strengthens the relevance of… deployment of a NGA team to natural disasters, specifically including earthquakes, floods, hurricanes, and wildfires in order to provide initial damage or…

  2. Earthquakes and Earthquake Engineering. LC Science Tracer Bullet.

    ERIC Educational Resources Information Center

    Buydos, John F., Comp.

    An earthquake is a shaking of the ground resulting from a disturbance in the earth's interior. Seismology is the study of (1) earthquakes; (2) the origin, propagation, and energy of seismic phenomena; (3) the prediction of these phenomena; and (4) the structure of the earth. Earthquake engineering or engineering seismology includes the…

  3. Estimating surface faulting impacts from the shakeout scenario earthquake

    USGS Publications Warehouse

    Treiman, J.A.; Ponti, D.J.

    2011-01-01

    An earthquake scenario, based on a kinematic rupture model, has been prepared for a Mw 7.8 earthquake on the southern San Andreas Fault. The rupture distribution, in the context of other historic large earthquakes, is judged reasonable for the purposes of this scenario. This model is used as the basis for generating a surface rupture map and for assessing potential direct impacts on lifelines and other infrastructure. Modeling the surface rupture involves identifying fault traces on which to place the rupture, assigning slip values to the fault traces, and characterizing the specific displacements that would occur to each lifeline impacted by the rupture. Different approaches were required to address variable slip distribution in response to a variety of fault patterns. Our results, involving judgment and experience, represent one plausible outcome and are not predictive because of the variable nature of surface rupture. © 2011, Earthquake Engineering Research Institute.

  4. Reducing the Risks of Nonstructural Earthquake Damage: A Practical Guide. Earthquake Hazards Reduction Series 1.

    ERIC Educational Resources Information Center

    Reitherman, Robert

    The purpose of this booklet is to provide practical information to owners, operators, and occupants of office and commercial buildings on the vulnerabilities posed by earthquake damage to nonstructural items and the means available to deal with these potential problems. Examples of dangerous nonstructural damages that have occurred in past…

  5. Rupture, waves and earthquakes.

    PubMed

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered a phenomenon of wave energy radiation caused by rupture (fracture) of the solid Earth. However, the physics of the dynamic processes around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, has not yet been fully understood. Instead, much former investigation in seismology has evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components above 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to explain by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake. More detailed study of rupture and wave dynamics, namely the possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) the wave interaction that connects rupture (1) and failures (2), would be indispensable.

  6. Rupture, waves and earthquakes

    NASA Astrophysics Data System (ADS)

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered a phenomenon of wave energy radiation caused by rupture (fracture) of the solid Earth. However, the physics of the dynamic processes around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, has not yet been fully understood. Instead, much former investigation in seismology has evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components above 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to explain by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake. More detailed study of rupture and wave dynamics, namely the possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) the wave interaction that connects rupture (1) and failures (2), would be indispensable.

  7. Earthquakes, September-October 1984

    USGS Publications Warehouse

    Person, W.J.

    1985-01-01

    In the United States, Wyoming experienced a couple of moderate earthquakes, and off the coast of northern California, a strong earthquake shook much of the northern coast of California and parts of the Oregon coast. 

  8. Earthquakes, July-August 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    There was one major earthquake during this reporting period: a magnitude 7.1 shock off the coast of Northern California on August 17. Earthquake-related deaths were reported from Indonesia, Romania, Peru, and Iraq.

  9. Distribution of similar earthquakes in aftershocks of inland earthquakes

    NASA Astrophysics Data System (ADS)

    Hayashi, M.; Hiramatsu, Y.; Aftershock Observations Of 2007 Noto Hanto, G.

    2010-12-01

    Frictional properties control the slip behavior on a fault surface, such as seismic slip and aseismic slip. An asperity, as a seismic slip area, is characterized by strong coupling in the interseismic period and large coseismic slip. On the other hand, steady slip or afterslip occurs in the aseismic slip area around the asperity. If an afterslip area includes small asperities, repeated rupture of a single asperity can generate similar earthquakes due to the stress accumulation caused by the afterslip. Here we investigate the detailed distribution of similar earthquakes among the aftershocks of the 2007 Noto Hanto earthquake (Mjma 6.9) and the 2000 Western Tottori earthquake (Mjma 7.3), large inland earthquakes in Japan. We use the data obtained by the groups for the aftershock observations of the 2007 Noto Hanto earthquake and of the 2000 Western Tottori earthquake. First, we select pairs of aftershocks whose cross-correlation coefficients, in a 10 s time window of band-pass filtered (1-4 Hz) waveforms, are greater than 0.95 at more than 5 stations, and divide those into groups linked by the cross-correlation coefficients. Second, we reexamine the arrival times of P and S waves and the maximum amplitude for the earthquakes of each group and apply the double-difference method (Waldhauser and Ellsworth, 2000) to relocate them. As a result of the analysis, we find 24 groups of similar earthquakes in the aftershocks on the source fault of the 2007 Noto Hanto earthquake and 86 groups on the source fault of the 2000 Western Tottori earthquake. Most of them are distributed around or outside the asperity of the main shock. Geodetic studies reported that postseismic deformation was detected for both earthquakes (Sagiya et al., 2002; Hashimoto et al., 2008). The source area of the similar earthquakes seems to correspond to the afterslip area. These features suggest that the similar earthquakes observed
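
    A minimal sketch of the pair-selection step described in this abstract, using synthetic single-station data; the sampling rate, filter design, and the union-find linkage are assumptions, and the multi-station criterion and relocation are omitted:

        import numpy as np
        from scipy.signal import butter, filtfilt

        FS = 100.0                 # sampling rate in Hz (assumed)
        WIN = int(10 * FS)         # 10 s correlation window

        def bandpass(x, lo=1.0, hi=4.0, fs=FS):
            b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
            return filtfilt(b, a, x)

        def max_norm_cc(x, y):
            """Peak normalized cross-correlation of two equal-length windows."""
            x = (x - x.mean()) / (x.std() * len(x))
            y = (y - y.mean()) / y.std()
            return np.abs(np.correlate(x, y, mode="full")).max()

        # Union-find linkage: pairs above the threshold share a group
        parent = {}
        def find(i):
            parent.setdefault(i, i)
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i
        def union(i, j):
            parent[find(i)] = find(j)

        rng = np.random.default_rng(0)
        waveforms = {k: rng.standard_normal(WIN) for k in range(5)}
        waveforms[3] = waveforms[0] + 0.05 * rng.standard_normal(WIN)  # near-twin

        filtered = {k: bandpass(v) for k, v in waveforms.items()}
        events = list(filtered)
        for i, a in enumerate(events):
            for b in events[i + 1:]:
                if max_norm_cc(filtered[a], filtered[b]) > 0.95:
                    union(a, b)

        groups = {}
        for e in events:
            groups.setdefault(find(e), []).append(e)
        print([g for g in groups.values() if len(g) > 1])   # e.g. [[0, 3]]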

  10. Earthquakes; January-February 1977

    USGS Publications Warehouse

    Person, W.J.

    1977-01-01

    There were no major earthquakes (7.0-7.9) during the first 2 months of the year, and no fatalities were reported. Three strong earthquakes occurred: in New Guinea, the Tadzhik S.S.R., and the Aleutian Islands. The Tadzhik earthquake on January 31 caused considerable damage and possible injuries. The United States experienced a number of earthquakes, but only very minor damage was reported.

  11. Earthquakes; January-February, 1979

    USGS Publications Warehouse

    Person, W.J.

    1979-01-01

    The first major earthquake (magnitude 7.0 to 7.9) of the year struck in southeastern Alaska in a sparsely populated area on February 28. On January 16, Iran experienced the first destructive earthquake of the year, causing a number of casualties and considerable damage. Peru was hit by a destructive earthquake on February 16 that left casualties and damage. A number of earthquakes were experienced in parts of the United States, but only minor damage was reported.

  12. Earthquakes, March-April, 1993

    USGS Publications Warehouse

    Person, Waverly J.

    1993-01-01

    Worldwide, only one major earthquake (7.0≤M<8.0) occurred during this reporting period. The earthquake, a magnitude 7.2 shock, struck the Santa Cruz Islands region in the South Pacific on March 6. Earthquake-related deaths occurred in the Fiji Islands, China, and Peru.

  13. Earthquakes, January-February 1992

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    In terms of seismic activity, the first two months of 1992 were somewhat quiet. There was one major earthquake (7.0-7.9) during this reporting period: a magnitude 7.1 earthquake in the Vanuatu Islands. There were no earthquake-related deaths during the first two months.

  14. Earthquakes, May-June 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    In the United States, a magnitude 5.8 earthquake in southern California on June 28 killed two people and caused considerable damage. Strong earthquakes hit Alaska on May 1 and May 30; the May 1 earthquake caused some minor damage. 

  15. Earthquakes, March-April 1978

    USGS Publications Warehouse

    Person, W.J.

    1978-01-01

    Earthquakes caused fatalities in Mexico and Sicily; injuries and damage were sustained in the eastern Kazakh SSR and Yugoslavia. There were four major earthquakes: one south of Honshu, Japan; two in the Kuril Islands region; and one in the Soviet Union. The United States experienced a number of earthquakes, but only very minor damage was reported.

  16. Earthquakes, May-June 1984

    USGS Publications Warehouse

    Person, W.J.

    1984-01-01

    No major earthquakes (7.0-7.9) occurred during this reporting period. Earthquake-related deaths were reported from Italy, the Dominican Republic, and Yugoslavia. A number of earthquakes occurred in the United States, but none caused casualties or any significant damage.

  17. Earthquakes, March-April 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    Two major earthquakes (7.0-7.9) occurred during this reporting period: a magnitude 7.6 in Costa Rica on April 22 and a magnitude 7.0 in the USSR on April 29. Destructive earthquakes hit northern Peru on April 4 and 5. There were no destructive earthquakes in the United States during this period.

  18. Earthquakes, September-October 1993

    USGS Publications Warehouse

    Person, W.J.

    1993-01-01

    The fatalities in the United States were caused by two earthquakes in southern Oregon on September 21. These earthquakes, both with magnitude 6.0 and separated in time by about 2 hours, led to the deaths of two people. One of these deaths was apparently due to a heart attack induced by the earthquake.

  19. Organizational changes at Earthquakes & Volcanoes

    USGS Publications Warehouse

    Gordon, David W.

    1992-01-01

    Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).

  20. Turkish Children's Ideas about Earthquakes

    ERIC Educational Resources Information Center

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, a type of natural disaster, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced to that extent. In particular, good training regarding earthquakes received in primary schools is considered…

  1. Earthquakes, September-October 1978

    USGS Publications Warehouse

    Person, W.J.

    1979-01-01

    The months of September and October were somewhat quiet, seismically speaking. One major earthquake, magnitude (M) 7.7, occurred in Iran on September 16. In Germany, a magnitude 5.0 earthquake caused damage and considerable alarm to many people in parts of that country. In the United States, the largest earthquake occurred along the California-Nevada border region.

  2. InSAR observations of the 2009 Racha earthquake, Georgia

    NASA Astrophysics Data System (ADS)

    Nikolaeva, Elena; Walter, Thomas R.

    2016-09-01

    Central Georgia is an area strongly affected by earthquake and landslide hazards. On 29 April 1991 a major earthquake (Mw = 7.0) struck the Racha region in Georgia, followed by aftershocks and significant afterslip. The same region was hit by another major event (Mw = 6.0) on 7 September 2009. The aim of the study reported here was to utilize interferometric synthetic aperture radar (InSAR) data to improve knowledge about the spatial pattern of deformation due to the 2009 earthquake. There had been no previous earthquake observations by InSAR in Georgia. We considered all available SAR images from different space agencies; however, due to the long wavelength and the frequent acquisitions, only the multi-temporal ALOS L-band SAR data allowed us to produce interferograms spanning the 2009 earthquake. We detected a local uplift of around 10 cm (along the line-of-sight propagation) in the interferogram near the earthquake's epicenter, whereas evidence of surface ruptures could not be found in the field along the active thrust fault. We simulated the deformation signal that could be created by the 2009 Racha earthquake on the basis of local seismic records, using an elastic dislocation model. We compared our modeled fault surface for the September 2009 event with the April 1991 Racha earthquake fault surfaces and identified the same fault, or a sub-parallel fault of the same system, as the origin. The patch that was active in 2009 is just adjacent to the 1991 patch, indicating a possible mainly westward propagation direction, with important implications for future earthquake hazards.
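
    To relate a modeled three-dimensional surface displacement to the quantity actually measured in an interferogram, the displacement is projected onto the radar line of sight (LOS). The sketch below uses one common angle convention; the incidence and look-azimuth values are illustrative, not the actual ALOS geometry of this study:

        import numpy as np

        def los_displacement(u_east, u_north, u_up, incidence_deg, look_az_deg):
            """Project an ENU displacement (m) onto the ground-to-satellite
            unit vector. incidence_deg is measured from vertical; look_az_deg
            is the azimuth (clockwise from north) of the horizontal projection
            of the ground-to-satellite vector. Positive = toward the satellite."""
            th, az = np.radians(incidence_deg), np.radians(look_az_deg)
            los = np.array([np.sin(th) * np.sin(az),   # east
                            np.sin(th) * np.cos(az),   # north
                            np.cos(th)])               # up
            return np.dot([u_east, u_north, u_up], los)

        # Pure 12 cm uplift viewed at 38 deg incidence: ~10 cm of LOS shortening
        print(f"{los_displacement(0.0, 0.0, 0.12, 38.0, 100.0):.3f} m")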

  3. PAGER--Rapid assessment of an earthquake?s impact

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.; Marano, K.D.; Bausch, D.; Hearne, M.

    2010-01-01

    PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts--which were formerly sent based only on event magnitude and location, or population exposure to shaking--now will also be generated based on the estimated range of fatalities and economic losses.
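
    The bookkeeping behind such an empirical loss estimate can be sketched as exposure times a fatality rate per intensity level; PAGER's published empirical model uses a two-parameter lognormal function of shaking intensity, but the theta/beta values and exposure numbers below are placeholders, not operational country coefficients:

        from math import log
        from statistics import NormalDist

        def fatality_rate(mmi, theta=12.0, beta=0.25):
            """Lognormal fatality rate vs. intensity (placeholder parameters)."""
            return NormalDist().cdf(log(mmi / theta) / beta)

        # Hypothetical population exposed per intensity bin (from a ShakeMap)
        exposure = {6: 2_000_000, 7: 500_000, 8: 120_000, 9: 15_000}

        estimate = sum(pop * fatality_rate(mmi) for mmi, pop in exposure.items())
        print(f"estimated fatalities: {estimate:,.0f}")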

  4. California earthquake history

    USGS Publications Warehouse

    Toppozada, T.; Branum, D.

    2004-01-01

    This paper presents an overview of the advancement in our knowledge of California's earthquake history since ~1800, and especially during the last 30 years. We first review the basic statewide research on earthquake occurrences that was published from 1928 through 2002, to show how the current catalogs and their levels of completeness have evolved with time. Then we review some of the significant new results in specific regions of California, and some of what remains to be done. Since 1850, 167 potentially damaging earthquakes of M ~6 or larger have been identified in California and its border regions, indicating an average rate of 1.1 such events per year. Table I lists the earthquakes of M ~6 to 6.5 that were also destructive since 1812 in California and its border regions, indicating an average rate of one such event every ~5 years. Many of these occurred before 1932, when epicenters and magnitudes started to be determined routinely using seismographs in California. The number of these early earthquakes is probably incomplete in sparsely populated remote parts of California before ~1870. For example, 6 of the 7 pre-1873 events in Table I are of M ≥ 7, suggesting that other earthquakes of M 6.5 to 6.9 occurred but were not properly identified, or were not destructive. The epicenters and magnitudes (M) of the pre-instrumental earthquakes were determined from isoseismal maps that were based on the Modified Mercalli Intensity of shaking (MMI) at the communities that reported feeling the earthquakes. The epicenters were estimated to be in the regions of most intense shaking, and values of M were estimated from the extent of the areas shaken at various MMI levels. MMI VII or greater shaking is the threshold of damage to weak buildings. Certain areas in the regions of Los Angeles, San Francisco, and Eureka were each shaken repeatedly at MMI VII or greater at least six times since ~1812, as depicted by Toppozada and Branum (2002, fig. 19).

  5. Earthquakes in New England

    USGS Publications Warehouse

    Fratto, E. S.; Ebel, J.E.; Kadinsky-Cade, K.

    1990-01-01

    New England has a long history of earthquakes. Some of the first explorers were startled when they experienced strong shaking and rumbling of the earth below their feet. They soon learned from the Indians that this was not an uncommon occurrence in the New World. The Plymouth Pilgrims felt their first earthquake in 1638. That first shock rattled dishes, doors, and buildings. The shaking so frightened those working in the fields that they threw down their tools and ran panic-stricken through the countryside.

  6. Subducted sediment thickness and Mw 9 earthquakes

    NASA Astrophysics Data System (ADS)

    Seno, Tetsuzo

    2017-01-01

    I measure the thickness of subducted sediment (Δss) beneath the décollement in the fore-arc wedge and show that the average value of Δss over a subduction zone segment, ⟨Δss⟩, is greater than 1.3 km in segments where Mw ≥ 9 earthquakes have occurred and less than 1.2 km in segments without such large earthquakes. In a previous study, I showed that the stress drop (Δσ) of large earthquakes (Mw ≥ 7) averaged over a subduction zone segment, ⟨Δσ⟩, is larger in segments where Mw ≥ 9 earthquakes have occurred than in segments without such an event. It has also been shown that ⟨Δσ⟩ is linearly related to 1 - λ (λ = the pore fluid pressure ratio on the interplate megathrust). In this study, I revise the previous estimates of ⟨Δσ⟩ and λ and show that there is a positive correlation between ⟨Δss⟩, ⟨Δσ⟩, and 1 - λ. I present a model that relates Δss to 1 - λ based on the porous flow of H2O in the subducted sediments, which gives a theoretical basis for the correlation between ⟨Δss⟩ and ⟨Δσ⟩. The combination of these parameters thus provides a better indicator for identifying segments where Mw ≥ 9 earthquakes may occur. Based on this, I propose that the tectonic environments where such huge events are likely to occur are (1) near collision zones, (2) near subduction of spreading centers, and (3) erosive margins with compressional fore arcs. Near the Japanese islands, SE Hokkaido is prone to such an event, but the Nankai Trough is not.

  7. Earthquakes, November-December 1992

    USGS Publications Warehouse

    Person, W.J.

    1993-01-01

    There were two major earthquakes (7.0≤M<8.0) during the last two months of the year: a magnitude 7.5 earthquake on December 12 in the Flores region, Indonesia, and a magnitude 7.0 earthquake on December 20 in the Banda Sea. Earthquakes caused fatalities in China and Indonesia. The greatest number of deaths (2,500) for the year occurred in Indonesia. In Switzerland, six people were killed by an accidental explosion recorded by seismographs. In the United States, a magnitude 5.3 earthquake caused slight damage at Big Bear in southern California.

  8. Earthquakes; March-April 1975

    USGS Publications Warehouse

    Person, W.J.

    1975-01-01

    There were no major earthquakes (magnitude 7.0-7.9) in March or April; however, there were earthquake fatalities in Chile, Iran, and Venezuela, and approximately 35 earthquake-related injuries were reported around the world. In the United States a magnitude 6.0 earthquake struck the Idaho-Utah border region. Damage was estimated at about a million dollars. The shock was felt over a wide area and was the largest to hit the continental United States since the San Fernando earthquake of February 1971.

  9. The Doctrinal Basis for Medical Stability Operations

    DTIC Science & Technology

    2010-01-01

    security forces (HNSFs). As most developing countries have poor health systems, all levels and branches of the health sector should be targeted including...actors will be available in more permissive environments, perhaps by orders of magnitude. Poor coordination fragments efforts, weakens health systems...further services unless there is immediate threat to life, limb, or eyesight. MEDCAPs may have a role in rural areas without services but government

  10. Monitoring road losses for Lushan 7.0 earthquake disaster utilization multisource remote sensing images

    NASA Astrophysics Data System (ADS)

    Huang, He; Yang, Siquan; Li, Suju; He, Haixia; Liu, Ming; Xu, Feng; Lin, Yueguan

    2015-12-01

    Earthquakes are among the major natural disasters in the world. At 8:02 on 20 April 2013, a catastrophic earthquake of surface-wave magnitude Ms 7.0 occurred in Sichuan province, China. The epicenter of this earthquake was located in the administrative region of Lushan County, and the event was named the Lushan earthquake. The Lushan earthquake caused heavy casualties and property losses in Sichuan province. After the earthquake, various emergency relief supplies had to be transported to the affected areas. The transportation network is the basis for emergency relief supplies transportation and allocation; thus, the road losses of the Lushan earthquake had to be monitored. The road-loss monitoring results for the Lushan earthquake disaster, obtained using multisource remote sensing images, are reported in this paper. The results indicate that 166 meters of national roads, 3707 meters of provincial roads, 3396 meters of county roads, 7254 meters of township roads, and 3943 meters of village roads were damaged during the Lushan earthquake disaster. The damaged roads were mainly located in Lushan County, Baoxing County, Tianquan County, Yucheng County, Mingshan County, and Qionglai County. The results can also be used as a decision-making information source by the disaster management government in China.

  11. Problems with Interagency Integration in Contemporary Operations

    DTIC Science & Technology

    2014-12-04

    systems. Case studies of the Haitian earthquake in 2010 and the Provincial Reconstruction Team effort in Afghanistan offer an insight into the sources...and disaster relief operations. 15. SUBJECT TERMS Interagency, Foreign Disaster Relief, Stability Operations, 2010 Haitian Earthquake, Provincial...Integration, however, remains a challenge and often requires time to establish efficient systems. Case studies of the Haitian earthquake in 2010 and the

  12. Measures for groundwater security during and after the Hanshin-Awaji earthquake (1995) and the Great East Japan earthquake (2011), Japan

    NASA Astrophysics Data System (ADS)

    Tanaka, Tadashi

    2016-03-01

    Many big earthquakes have occurred in the tectonic regions of the world, especially in Japan. Earthquakes often cause damage to crucial lifeline services such as water, gas and electricity supply systems, and even the sewage system, in urban and rural areas. The most severe problem for people affected by earthquakes is access to water for drinking/cooking and toilet flushing. Securing safe water for daily life in an earthquake emergency requires the establishment of countermeasures, especially in a mega city like Tokyo. This paper describes some examples of groundwater use in earthquake emergencies, with reference to reports, books and newspapers published in Japan. The consensus is that groundwater, as a source of water, plays a major role in earthquake emergencies, especially where the accessibility of wells coincides with the emergency need. It is also important to introduce a registration system for citizen-owned and company wells that can form the basis of a cooperative during a disaster; such a registration system was implemented by many Japanese local governments after the Hanshin-Awaji Earthquake in 1995 and the Great East Japan Earthquake in 2011, and is one of the most effective countermeasures for groundwater use in an earthquake emergency. Emphasis is also placed on the importance of establishing a continuous monitoring system of groundwater conditions, for both quantity and quality, during non-emergency periods.

  13. HOMOGENEOUS CATALOGS OF EARTHQUAKES*

    PubMed Central

    Knopoff, Leon; Gardner, J. K.

    1969-01-01

    The usual bias in earthquake catalogs against shocks of small magnitudes can be removed by testing the randomness of the magnitudes of successive shocks. The southern California catalog, 1933-1967, is found to be unbiased in the sense of the test at magnitude 4 or above; the cutoff is improved to M = 3 for the subcatalog 1953-1967. PMID:16578700
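
    The procedure can be illustrated by scanning trial cutoffs and testing the surviving magnitude sequence for serial randomness; the runs test below is one simple choice and is not necessarily the statistic the authors used (the synthetic catalog is complete by construction, so the lowest cutoff passes):

        import numpy as np

        def runs_test_z(x):
            """z-score of the runs count about the median (normal approx.)."""
            s = x > np.median(x)
            n1, n2 = s.sum(), (~s).sum()
            runs = 1 + np.count_nonzero(s[1:] != s[:-1])
            mu = 1 + 2 * n1 * n2 / (n1 + n2)
            var = (mu - 1) * (mu - 2) / (n1 + n2 - 1)
            return (runs - mu) / np.sqrt(var)

        def completeness_cutoff(mags, trial_cutoffs, z_crit=1.96):
            """Lowest trial cutoff whose magnitude sequence looks random."""
            for mc in sorted(trial_cutoffs):
                sub = mags[mags >= mc]
                if len(sub) > 50 and abs(runs_test_z(sub)) < z_crit:
                    return mc
            return None

        rng = np.random.default_rng(1)
        mags = 3.0 + rng.exponential(1 / np.log(10), 5000)   # G-R-like catalog
        print(completeness_cutoff(mags, [3.0, 3.5, 4.0, 4.5]))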

  14. Earthquake Prediction is Coming

    ERIC Educational Resources Information Center

    MOSAIC, 1977

    1977-01-01

    Describes (1) several methods used in earthquake research, including P:S ratio velocity studies, dilatancy models; and (2) techniques for gathering base-line data for prediction using seismographs, tiltmeters, laser beams, magnetic field changes, folklore, animal behavior. The mysterious Palmdale (California) bulge is discussed. (CS)

  15. Earthquake damage to schools

    USGS Publications Warehouse

    McCullough, Heather

    1994-01-01

    These unusual slides show earthquake damage to school and university buildings around the world. They graphically illustrate the potential danger to our schools, and to the welfare of our children, that results from major earthquakes. The slides range from Algeria, where a collapsed school roof is held up only by students' desks; to Anchorage, Alaska, where an elementary school structure has split in half; to California and other areas, where school buildings have sustained damage to walls, roofs, and chimneys. Interestingly, all the United States earthquakes depicted in this set of slides occurred either on a holiday or before or after school hours, except the 1935 tremor in Helena, Montana, which occurred at 11:35 am. It undoubtedly would have caused casualties had the schools not been closed days earlier by Helena city officials because of a damaging foreshock. Students in Algeria, the People's Republic of China, Armenia, and other stricken countries were not so fortunate. This set of slides represents 17 destructive earthquakes that occurred in 9 countries, and covers more than a century--from 1886 to 1988. Two of the tremors, both of which occurred in the United States, were magnitude 8+ on the Richter Scale, and four were magnitude 7-7.9. The events represented by the slides (see table below) claimed more than a quarter of a million lives.

  16. Road Damage Following Earthquake

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Ground shaking triggered liquefaction in a subsurface layer of water-saturated sand, producing differential lateral and vertical movement in an overlying carapace of unliquefied sand and silt, which moved from right to left towards the Pajaro River. This mode of ground failure, termed lateral spreading, was a principal cause of liquefaction-related damage in the Oct. 17, 1989, Loma Prieta earthquake. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions, such as earthquakes, or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: S.D. Ellen, U.S. Geological Survey

  17. Fractal dynamics of earthquakes

    SciTech Connect

    Bak, P.; Chen, K.

    1995-05-01

    Many objects in nature, from mountain landscapes to electrical breakdown and turbulence, have a self-similar fractal spatial structure. It seems obvious that to understand the origin of self-similar structures, one must understand the nature of the dynamical processes that created them: temporal and spatial properties must necessarily be completely interwoven. This is particularly true for earthquakes, which have a variety of fractal aspects. The distribution of energy released during earthquakes is given by the Gutenberg-Richter power law. The distribution of epicenters appears to be fractal with dimension D ≈ 1-1.3. The number of aftershocks decays as a function of time according to the Omori power law. There have been several attempts to explain the Gutenberg-Richter law by starting from a fractal distribution of faults or stresses. But this is a chicken-and-egg approach: to explain the Gutenberg-Richter law, one assumes the existence of another power law, the fractal distribution. The authors present results of a simple stick-slip model of earthquakes, which evolves to a self-organized critical state. Emphasis is on demonstrating that empirical power laws for earthquakes indicate that the Earth's crust is at the critical state, with no typical time, space, or energy scale. Of course the model is tremendously oversimplified; however, in analogy with equilibrium phenomena, the authors do not expect criticality to depend on details of the model (universality).
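
    For reference, the Gutenberg-Richter law states log10 N(≥M) = a - bM, and the b-value can be estimated from a catalog by the standard Aki/Utsu maximum-likelihood formula. The sketch below checks it on synthetic magnitudes drawn with a true b of 1.0; the binning width and completeness magnitude are assumptions:

        import numpy as np

        def b_value(mags, m_c, dm=0.1):
            """Aki (1965) ML b-value with Utsu's correction for binned data."""
            m = mags[mags >= m_c]
            return np.log10(np.e) / (m.mean() - (m_c - dm / 2))

        rng = np.random.default_rng(42)
        # Exponential magnitudes (true b = 1), rounded to 0.1 like a catalog
        mags = np.round(1.5 + rng.exponential(1 / np.log(10), 50_000), 1)
        print(f"estimated b = {b_value(mags, m_c=2.0):.2f}")   # close to 1.0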

  18. WGCEP Historical California Earthquake Catalog

    USGS Publications Warehouse

    Felzer, Karen R.; Cao, Tianqing

    2008-01-01

    This appendix provides an earthquake catalog for California and the surrounding area. Our goal is to provide a listing for all known M > 5.5 earthquakes that occurred from 1850-1932 and all known M > 4.0 earthquakes that occurred from 1932-2006 within the region of 31.0 to 43.0 degrees North and -126.0 to -114.0 degrees West. Some pre-1932 earthquakes 4 5, before the Northern California network was online. Some earthquakes from 1900-1932, and particularly from 1910-1932 are also based on instrumental readings, but the quality of the instrumental record and the resulting analysis are much less precise than for later listings. A partial exception is for some of the largest earthquakes, such as the San Francisco earthquake of April 18, 1906, for which global teleseismic records (Wald et al. 1993) and geodetic measurements (Thatcher et al. 1906) have been used to help determine magnitudes.

  19. The EM Earthquake Precursor

    NASA Astrophysics Data System (ADS)

    Jones, K. B., II; Saxton, P. T.

    2013-12-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After the 1989 Loma Prieta earthquake, American earthquake investigators predetermined magnetometer use and a minimum earthquake magnitude necessary for EM detection. This action was set in motion due to the extensive damage incurred and public outrage concerning earthquake forecasting; however, the magnetometers employed, grounded or buried, are completely subject to static and electric fields and have yet to correlate to an identifiable precursor. Secondly, there is neither a networked array for finding any epicentral locations, nor have there been any attempts to find even one. This methodology needs dismissal, because it is overly complicated, subject to continuous change, and provides no response time. As for the minimum magnitude threshold, which was set at M5, this is simply higher than what modern technological advances have attained. Detection can now be achieved at approximately M1, which greatly improves forecasting chances. A propagating precursor has now been detected in both the field and laboratory. Field antenna testing conducted outside the NE Texas town of Timpson in February, 2013, detected three strong EM sources along with numerous weaker signals. The antenna had mobility, and observations were noted for recurrence, duration, and frequency response. Next, two

  20. Detailed source process of the 2007 Tocopilla earthquake.

    NASA Astrophysics Data System (ADS)

    Peyrat, S.; Madariaga, R.; Campos, J.; Asch, G.; Favreau, P.; Bernard, P.; Vilotte, J.

    2008-05-01

    We investigated the detailed rupture process of the Tocopilla earthquake (Mw 7.7) of 14 November 2007, and of the main aftershocks that occurred in the southern part of the North Chile seismic gap, using strong-motion data. The earthquake happened in the middle of the permanent broadband and strong-motion network IPOC, newly installed by GFZ and IPGP, and of a digital strong-motion network operated by the University of Chile. The Tocopilla earthquake is the most recent large thrust subduction earthquake to have occurred since the major Iquique earthquake of 1877, which produced a destructive tsunami. The Arequipa (2001) and Antofagasta (1995) earthquakes already ruptured the northern and southern parts of the gap, and the intraplate intermediate-depth Tarapaca earthquake (2005) may have changed the tectonic loading of this part of the Peru-Chile subduction zone. For large earthquakes, the depth of the seismic rupture is bounded by the depth of the seismogenic zone. What controls the horizontal extent of the rupture for large earthquakes is less clear. Factors that influence the extent of the rupture include fault geometry, variations of material properties, and stress heterogeneities inherited from the previous rupture history. For subduction zones where structures are not well known, what may have stopped the rupture is not obvious. One crucial problem raised by the Tocopilla earthquake is to understand why this earthquake didn't extend further north and, to the south, what is the role of the Mejillones peninsula, which seems to act as a barrier. The focal mechanism was determined using teleseismic waveform inversion and a geodetic analysis (cf. Campos et al.; Bejarpi et al., in the same session). We studied the detailed source process using the available strong-motion data. This earthquake ruptured the interplate seismic zone over more than 150 km and generated several large aftershocks, mainly located south of the rupture area. The strong-motion data show clearly two S

  1. Rationalizing Hybrid Earthquake Probabilities

    NASA Astrophysics Data System (ADS)

    Gomberg, J.; Reasenberg, P.; Beeler, N.; Cocco, M.; Belardinelli, M.

    2003-12-01

    An approach to including stress transfer and frictional effects in estimates of the probability of failure of a single fault affected by a nearby earthquake has been suggested in Stein et al. (1997). This `hybrid' approach combines conditional probabilities, which depend on the time elapsed since the last earthquake on the affected fault, with Poissonian probabilities that account for friction and depend only on the time since the perturbing earthquake. The latter are based on the seismicity rate change model developed by Dieterich (1994) to explain the temporal behavior of aftershock sequences in terms of rate-state frictional processes. The model assumes an infinite population of nucleation sites that are near failure at the time of the perturbing earthquake. In the hybrid approach, assuming the Dieterich model can lead to significant transient increases in failure probability. We explore some of the implications of applying the Dieterich model to a single fault and its impact on the hybrid probabilities. We present two interpretations that we believe can rationalize the use of the hybrid approach. In the first, a statistical distribution representing uncertainties in elapsed and/or mean recurrence time on the fault serves as a proxy for Dieterich's population of nucleation sites. In the second, we imagine a population of nucleation patches distributed over the fault with a distribution of maturities. In both cases we find that the probability depends on the time since the last earthquake. In particular, the size of the transient probability increase may only be significant for faults already close to failure. Neglecting the maturity of a fault may lead to overestimated rate and probability increases.
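
    The Poissonian ingredient of this hybrid can be sketched directly from Dieterich's (1994) rate equation: a stress step amplifies the background rate and decays back over the aftershock duration, and integrating the rate gives a time-varying probability. All parameter values below are illustrative assumptions:

        import numpy as np

        def dieterich_rate(t, r, dtau, a_sigma, t_a):
            """Seismicity rate after a stress step dtau (Dieterich, 1994)."""
            return r / (1 + (np.exp(-dtau / a_sigma) - 1) * np.exp(-t / t_a))

        r, dtau, a_sigma, t_a = 0.01, 0.1, 0.04, 10.0   # /yr, MPa, MPa, yr
        t = np.linspace(0.0, 30.0, 30_001)               # 30-year horizon
        rate = dieterich_rate(t, r, dtau, a_sigma, t_a)
        expected = np.sum(0.5 * (rate[1:] + rate[:-1]) * np.diff(t))  # trapezoid
        print(f"P(>=1 event, 30 yr): {1 - np.exp(-r * 30):.3f} (background) ->"
              f" {1 - np.exp(-expected):.3f} (perturbed)")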

  2. The 1868 Hayward Earthquake Alliance: A Case Study - Using an Earthquake Anniversary to Promote Earthquake Preparedness

    NASA Astrophysics Data System (ADS)

    Brocher, T. M.; Garcia, S.; Aagaard, B. T.; Boatwright, J. J.; Dawson, T.; Hellweg, M.; Knudsen, K. L.; Perkins, J.; Schwartz, D. P.; Stoffer, P. W.; Zoback, M.

    2008-12-01

    Last October 21st marked the 140th anniversary of the M6.8 1868 Hayward earthquake, the last damaging earthquake on the southern Hayward Fault. This anniversary was used to help publicize the seismic hazards associated with the fault because: (1) the past five such earthquakes on the Hayward Fault occurred about 140 years apart on average, and (2) the Hayward-Rodgers Creek Fault system is the most likely fault in the Bay Area (with a 31 percent probability) to produce a M6.7 or greater earthquake in the next 30 years. To promote earthquake awareness and preparedness, over 140 public and private agencies and companies and many individuals joined the public-private nonprofit 1868 Hayward Earthquake Alliance (1868alliance.org). The Alliance sponsored many activities, including a public commemoration at Mission San Jose in Fremont, which survived the 1868 earthquake. This event was followed by an earthquake drill at Bay Area schools involving more than 70,000 students. The anniversary prompted the Silver Sentinel, an earthquake response exercise based on the scenario of an earthquake on the Hayward Fault conducted by Bay Area County Offices of Emergency Services; 60 other public and private agencies also participated in this exercise. The California Seismic Safety Commission and KPIX (CBS affiliate) produced professional videos, designed for school classrooms, promoting Drop, Cover, and Hold On. Starting in October 2007, the Alliance and the U.S. Geological Survey held a sequence of press conferences to announce the release of new research on the Hayward Fault as well as new loss estimates for a Hayward Fault earthquake. These included: (1) a ShakeMap for the 1868 Hayward earthquake, (2) a report by the U.S. Bureau of Labor Statistics forecasting the number of employees, employers, and wages predicted to be within areas most strongly shaken by a Hayward Fault earthquake, (3) new estimates of the losses associated with a Hayward Fault earthquake, (4) new ground motion

  3. Force and pressure characteristics for a series of nose inlets at Mach numbers from 1.59 to 1.99 V : analysis and comparison on basis of ram-jet aircraft range and operational characteristics

    NASA Technical Reports Server (NTRS)

    Howard, E; Luidens, R W; Allen, J L

    1951-01-01

    The performance of four experimentally investigated axially symmetric spike-type nose inlets is compared on the basis of ram-jet-engine aircraft range and operational problems. At design conditions, calculated peak engine efficiencies varied by 25 percent from the highest value, which indicates the importance of inlet design. Calculations for a typical supersonic aircraft indicate a possible increase in range if the engine is flown at a moderate angle of attack and the resulting engine lift is utilized. For engines with a fixed exhaust nozzle, propulsive thrust increases with increasing heat addition in the subcritical flow region in spite of increasing additive drag. For the perforated inlet there is a range of increasing total-temperature ratios in the subcritical flow region that does not yield an increase in propulsive thrust. The effects of inlet characteristics on the speed stability of a typical aircraft for three types of fuel control are discussed.

  4. Meeting the Challenge of Earthquake Risk Globalisation: Towards the Global Earthquake Model GEM (Sergey Soloviev Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Zschau, J.

    2009-04-01

    Earthquake risk, like natural risk in general, has become a highly dynamic and globally interdependent phenomenon. Due to the "urban explosion" in the Third World, an increasingly complex cross-linking of critical infrastructure and lifelines in the industrial nations, and a growing globalisation of the world's economies, we are presently facing a dramatic increase in our society's vulnerability to earthquakes in practically all seismic regions on our globe. Such fast and global changes cannot be captured with conventional earthquake risk models anymore. The sciences in this field are, therefore, asked to come up with new solutions that are no longer exclusively aiming at the best possible quantification of the present risks but also keep an eye on their changes with time and allow these to be projected into the future. This applies not only to the vulnerability component of earthquake risk but also to its hazard component, which has been realized to be time-dependent, too. The challenges of earthquake risk dynamics and globalisation have recently been accepted by the Global Science Forum of the Organisation for Economic Co-operation and Development (OECD-GSF), which initiated the "Global Earthquake Model (GEM)", a public-private partnership for establishing an independent standard to calculate, monitor and communicate earthquake risk globally, raise awareness and promote mitigation.

  5. Urban Earthquake Shaking and Loss Assessment

    NASA Astrophysics Data System (ADS)

    Hancilar, U.; Tuzun, C.; Yenidogan, C.; Zulfikar, C.; Durukal, E.; Erdik, M.

    2009-04-01

    of accelerometric and other macroseismic data (similar to the USGS ShakeMap System). The building inventory data for the Level 2 analysis consist of grid (geo-cell) based urban building and demographic inventories. For building grouping, the European building typology developed within the EU-FP5 RISK-UE project is used. The building vulnerability/fragility relationships can be user-selected from a list of applicable relationships developed on the basis of a comprehensive study. Both empirical and analytical relationships (based on the Coefficient Method, the Equivalent Linearization Method and the Reduction Factor Method of analysis) can be employed. Casualties in the Level 2 analysis are estimated from the number of buildings in different damage states and the casualty rates for each building type and damage level. Modifications to the casualty rates can be made if necessary. The ELER Level 2 analysis includes calculation of direct monetary losses as a result of building damage, allowing repair-cost estimations and specific investigations associated with earthquake insurance applications (PML and AAL estimations). ELER Level 2 loss results obtained for Istanbul for a scenario earthquake using different techniques will be presented, with comparisons using different earthquake damage assessment software. The urban earthquake shaking and loss information is intended for dissemination in a timely manner to related agencies for the planning and coordination of post-earthquake emergency response. However, the same software can also be used for scenario earthquake loss estimation, related Monte Carlo type simulations and earthquake insurance applications.
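
    The casualty bookkeeping described above reduces to a sum over building types and damage states; the counts, occupancies, and rates below are invented placeholders, not RISK-UE or ELER values:

        # damage_counts[building_type][damage_state] -> number of buildings
        damage_counts = {
            "RC frame": {"moderate": 1200, "extensive": 300, "complete": 60},
            "masonry":  {"moderate": 2500, "extensive": 900, "complete": 240},
        }
        occupants = {"RC frame": 40, "masonry": 12}      # per building
        casualty_rate = {"moderate": 0.0005, "extensive": 0.01, "complete": 0.10}

        total = sum(n * occupants[btype] * casualty_rate[state]
                    for btype, states in damage_counts.items()
                    for state, n in states.items())
        print(f"estimated casualties: {total:,.0f}")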

  6. The health effects of earthquakes in the mid-1990s.

    PubMed

    Alexander, D

    1996-09-01

    This paper gives an overview of the global pattern of casualties in earthquakes which occurred during the 30-month period from 1 September 1993 to 29 February 1996. It also describes some of the behavioural and logistical regularities associated with mortality and morbidity in these events. Of 83 earthquakes studied, there were casualties in 49. Lethal earthquakes occurred in rapid succession in Indonesia, China, Colombia and Iran. In the events studied, a disproportionate number of deaths and injuries occurred during the first six hours of the day and in earthquakes with magnitudes between 6.5 and 7.4. Ratios of death to injury varied markedly (though with some averages close to 1:3), as did the nature and causes of mortality and morbidity and the proportion of serious to slight injuries. As expected on the basis of previous knowledge, few problems were caused by post-earthquake illness and disease. Also, as expected, building collapse was the principal source of casualties: tsunamis, landslides, debris flows and bridge collapses were the main secondary causes. In addition, new findings are presented on the temporal sequence of casualty estimates after seismic disaster. In synthesis, though mortality in earthquakes may have been low in relation to long-term averages, the interval of time studied was probably typical of other periods in which seismic catastrophes were relatively limited in scope.

  7. Improving the RST Approach for Earthquake Prone Areas Monitoring: Results of Correlation Analysis among Significant Sequences of TIR Anomalies and Earthquakes (M>4) occurred in Italy during 2004-2014

    NASA Astrophysics Data System (ADS)

    Tramutoli, V.; Coviello, I.; Filizzola, C.; Genzano, N.; Lisi, M.; Paciello, R.; Pergola, N.

    2015-12-01

    Looking toward the assessment of a multi-parametric system for dynamically updating seismic hazard estimates and short-term (days to weeks) earthquake forecasting, a preliminary step is to identify those parameters (chemical, physical, biological, etc.) whose anomalous variations can be, to some extent, associated with the complex process of preparation of a big earthquake. Among the different parameters, fluctuations of Earth's thermally emitted radiation, as measured by satellite sensors operating in the Thermal Infra-Red (TIR) spectral range, have long been proposed as potential earthquake precursors. Since 2001, a general approach called Robust Satellite Techniques (RST) has been used to discriminate anomalous thermal signals, possibly associated with seismic activity, from normal fluctuations of Earth's thermal emission related to other causes (e.g., meteorological) independent of earthquake occurrence. Thanks to its full exportability to different satellite packages, RST has been implemented on TIR images acquired by polar (e.g., NOAA-AVHRR, EOS-MODIS) and geostationary (e.g., MSG-SEVIRI, NOAA-GOES/W, GMS-5/VISSR) satellite sensors, in order to verify the presence (or absence) of TIR anomalies in the presence (absence) of earthquakes (with M>4) in different seismogenic areas around the world (e.g., Italy, Turkey, Greece, California, Taiwan, etc.). In this paper, a refined RST data analysis approach and the RETIRA (Robust Estimator of TIR Anomalies) index were used to identify Significant Sequences of TIR Anomalies (SSTAs) during eleven years (May 2004 to December 2014) of TIR satellite records collected over Italy by the geostationary satellite sensor MSG-SEVIRI. On the basis of specific validation rules (mainly based on physical models and results obtained by applying the RST approach to several earthquakes all around the world), the level of space-time correlation among SSTAs and earthquakes (with M≥4
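
    The RETIRA-style index standardizes the difference between a pixel's TIR signal and the scene average against that pixel's historical mean and standard deviation from co-located, same-period scenes. The sketch below is a bare-bones version with synthetic data; real processing also needs cloud screening and careful construction of the reference set:

        import numpy as np

        def retira_index(current, reference_stack):
            """current: (ny, nx) scene; reference_stack: (nt, ny, nx) history."""
            dT_cur = current - np.nanmean(current)        # remove scene average
            dT_ref = reference_stack - np.nanmean(
                reference_stack, axis=(1, 2), keepdims=True)
            mu = np.nanmean(dT_ref, axis=0)               # pixel climatology
            sigma = np.nanstd(dT_ref, axis=0)
            return (dT_cur - mu) / sigma                  # anomaly, sigma units

        rng = np.random.default_rng(7)
        ref = 290 + rng.standard_normal((120, 50, 50))    # reference scenes
        cur = 290 + rng.standard_normal((50, 50))
        cur[20:23, 20:23] += 4.0                          # implanted warm patch
        print("pixels above 3 sigma:", int((retira_index(cur, ref) > 3).sum()))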

  8. Scientific aspects of the Tohoku earthquake and Fukushima nuclear accident

    NASA Astrophysics Data System (ADS)

    Koketsu, Kazuki

    2016-04-01

    We investigated the 2011 Tohoku earthquake, the accident at the Fukushima Daiichi nuclear power plant, and the assessments of earthquake and tsunami potential conducted beforehand for the Pacific offshore region of the Tohoku District. The results of our investigation show that all the assessments failed to foresee the earthquake and its related tsunami, which was the main cause of the accident. Therefore, the disaster caused by the earthquake and the accident were scientifically unforeseeable at the time. However, for a zone neighboring the reactors, a 2008 assessment showed tsunamis higher than the plant height. As a lesson learned from the accident, companies operating nuclear power plants should prepare using even such assessment results for neighboring zones.

  9. Listening to Earthquakes with Infrasound

    NASA Astrophysics Data System (ADS)

    Mucek, A. E.; Langston, C. A.

    2011-12-01

    A tripartite infrasound array was installed to listen to earthquakes occurring along the Guy-Greenbrier fault in Arkansas. The active earthquake swarm is believed to be caused by deep wastewater injection and allows us to examine the mechanisms causing the earthquake "booms" that have been heard during an earthquake. The array has an aperture of 50 meters and is installed next to the X301 seismograph station run by the Center for Earthquake Research and Information (CERI). This arrangement allows simultaneous recording of the seismic and acoustic signals from an earthquake. Other acoustic and seismic sources that have been recorded include thunder from thunderstorms, gunshots, quarry explosions and hydraulic fracturing activity from the local gas wells. The duration of the experiment is from the last week of June to the last week of September 2011. During the first month and a half, seven local earthquakes were recorded, along with numerous occurrences of the other infrasound sources. Phase arrival times of the recorded waves allow us to estimate the slowness and azimuth of infrasound events. Using these two properties, we can determine whether earthquake "booms" occur at a site from the arrival of the P-wave or whether they occur elsewhere and travel through the atmosphere. Preliminary results show that the infrasound correlates well with the ground motion during an earthquake for frequencies below 15 Hertz.
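
    For a small array, slowness and azimuth follow from a plane-wave fit to the relative arrival times, dt_i = s · r_i, solved by least squares for the horizontal slowness vector s. The station geometry and acoustic speed below are synthetic stand-ins, not the actual array layout:

        import numpy as np

        # Station offsets from a reference sensor, metres (east, north)
        xy = np.array([[0.0, 0.0], [43.0, 12.0], [18.0, 46.0]])

        # Synthetic truth: 340 m/s wave propagating toward azimuth 60 deg
        s_true = (1 / 340.0) * np.array([np.sin(np.radians(60)),
                                         np.cos(np.radians(60))])
        dt = xy @ s_true                       # relative delays (s)

        s_hat, *_ = np.linalg.lstsq(xy, dt, rcond=None)
        speed = 1 / np.linalg.norm(s_hat)
        az = np.degrees(np.arctan2(s_hat[0], s_hat[1])) % 360
        print(f"apparent speed {speed:.0f} m/s, propagating toward {az:.0f} deg")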

  10. Development of a High-Power Wideband Amplifier on the Basis of a Free-Electron Maser Having an Operating Frequency Near 30 GHz: Modeling and Results of the Initial Experiments

    NASA Astrophysics Data System (ADS)

    Bandurkin, I. V.; Donets, D. E.; Kaminsky, A. K.; Kuzikov, S. V.; Perel'shteyn, E. A.; Peskov, N. Yu.; Savilov, A. V.; Sedykh, S. N.

    2017-01-01

    We develop a high-power wideband amplifier based on a free-electron maser for particle acceleration, which will be operated in the 30 GHz frequency band, on the basis of the LIU-3000 linear induction accelerator forming an electron beam with an electron energy of 0.8 MeV, a current of 250 A, and a pulse duration of 200 ns. As the operating regime, we chose the regime of grazing of dispersion curves, since, according to the modeling performed, it allows one to ensure an instantaneous amplification band of about 5-7% in an undulator with regular winding for an output radiation power at a level of 20 MW and a gain of 30-35 dB. The results of the first experiments studying this FEM-based scheme are presented, in which the specified power level is achieved in the range around 30 GHz, and fast tuning of ±0.5 GHz in the band of variations in the frequency of the master magnetron is demonstrated. Modeling shows that the use of the non-resonance trapping/braking regime, which is realized in an undulator with profiled parameters, allows one to expect an increase in the radiation power of up to 35-40 MW with simultaneous widening of the amplification band up to 30% under the conditions of the LIU-3000 experiments.

  11. VLF/LF EM emissions as main precursor of earthquakes and their searching possibilities for Georgian s/a region

    NASA Astrophysics Data System (ADS)

    Kachakhidze, Manana; Kachakhidze, Nino

    2016-04-01

    The authors have developed a model, based on electrodynamics, of the generation of the Earth's electromagnetic emissions detected during earthquake preparation. The model gives a qualitative explanation of the mechanism of generation of the electromagnetic waves emitted in the earthquake preparation period. In addition, a scheme for an earthquake forecasting methodology is constructed, based on an avalanche-like unstable model of fault formation and an analogous model of an electromagnetic circuit, the synthesis of which is rather harmonious. According to the authors, electromagnetic emission in the radio band is more universal and reliable than other anomalous variations of various geophysical phenomena in the earthquake preparation period. Moreover, VLF/LF electromagnetic emission might be declared the main precursor of earthquakes, because it may prove very useful for the prediction of large (M ≥ 5) inland earthquakes and for tracking processes in the lithosphere-atmosphere-ionosphere coupling (LAIC) system. Since the other geophysical phenomena that may accompany the earthquake preparation process and expose themselves several months, weeks or days prior to earthquakes are less informative for earthquake forecasting, it is admissible to consider them earthquake indicators. The physical mechanisms of the mentioned phenomena are explained on the basis of the model of generation of electromagnetic emissions detected before an earthquake, where the process of earthquake preparation and its realization are considered taking into account the properties of distributed and conservative systems. Until now, an electromagnetic emission detection network did not exist in Georgia. European colleagues (Prof. Dr. P.F. Biagi, Prof. Dr. Aydın Büyüksaraç) helped us and made possible the installation of a receiver. We are going to develop the network and contribute to the earthquake forecasting problem. Participation in conference is supported by financial

  12. A Discrimination Analysis of Regional Seismic Data Recorded at Tonto Forest Observatory from Nevada Test Site Explosions and Nearby Earthquakes

    DTIC Science & Technology

    1981-12-01

    average earthquake Lg phase appears to be significantly richer in high frequency content than the corresponding average explosion Lg phase. On the basis...source types are quite different, with the average earthquake Pg spectrum being noticeably richer in high frequency content. However, the individual...seen that they show differences very similar to those noted for Pg, indicating that the average earthquake Lg phase is richer in high frequency content

  13. Earthquake triggering at alaskan volcanoes following the 3 November 2002 denali fault earthquake

    USGS Publications Warehouse

    Moran, S.C.; Power, J.A.; Stihler, S.D.; Sanchez, J.J.; Caplan-Auerbach, J.

    2004-01-01

    The 3 November 2002 Mw 7.9 Denali fault earthquake provided an excellent opportunity to investigate triggered earthquakes at Alaskan volcanoes. The Alaska Volcano Observatory operates short-period seismic networks on 24 historically active volcanoes in Alaska, 247-2159 km distant from the mainshock epicenter. We searched for evidence of triggered seismicity by examining the unfiltered waveforms for all stations in each volcano network for ~1 hr after the Mw 7.9 arrival time at each network, and for significant increases in located earthquakes in the hours after the mainshock. We found compelling evidence for triggering only at the Katmai volcanic cluster (KVC, 720-755 km southwest of the epicenter), where small earthquakes with distinct P and S arrivals appeared within the mainshock coda at one station and a small increase in located earthquakes occurred for several hours after the mainshock. Peak dynamic stresses of ~0.1 MPa at Augustine Volcano (560 km southwest of the epicenter) are significantly lower than those recorded in Yellowstone and Utah (>3000 km southeast of the epicenter), suggesting that strong directivity effects were at least partly responsible for the lack of triggering at Alaskan volcanoes. We describe other incidents of earthquake-induced triggering in the KVC, and outline a qualitative magnitude/distance-dependent triggering threshold. We argue that triggering results from the perturbation of magmatic-hydrothermal systems in the KVC and suggest that the comparative lack of triggering at other Alaskan volcanoes could be a result of differences in the nature of magmatic-hydrothermal systems.
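
    The ~0.1 MPa figure is consistent with the standard plane S-wave approximation relating peak dynamic stress to peak ground velocity; the rigidity, shear-wave speed, and velocity below are illustrative round numbers, not values from the study:

        % peak dynamic stress from peak ground velocity (plane S-wave)
        \sigma_{\mathrm{dyn}} \approx \mu\,\frac{\dot u_{\max}}{V_S}
          = \frac{(3\times 10^{10}\,\mathrm{Pa})(0.012\,\mathrm{m/s})}
                 {3500\,\mathrm{m/s}}
          \approx 1\times 10^{5}\,\mathrm{Pa} = 0.1\,\mathrm{MPa}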

  14. The Earthquake That Tweeted

    NASA Astrophysics Data System (ADS)

    Petersen, D.

    2011-12-01

    Advances in mobile technology and social networking are enabling new behaviors that were not possible even a few short years ago. When people experience a tiny earthquake, it's more likely they're going to reach for their phones and tell their friends about it than actually take cover under a desk. With 175 million Twitter accounts, 750 million Facebook users and more than five billion mobile phones in the world today, people are generating terrific amounts of data simply by going about their everyday lives. Given the right tools and guidance these connected individuals can act as the world's largest sensor network, doing everything from reporting on earthquakes to anticipating global crises. Drawing on the author's experience as a user researcher and experience designer, this presentation will discuss these trends in crowdsourcing the collection and analysis of data, and consider their implications for how the public encounters the earth sciences in their everyday lives.

  15. Identified EM Earthquake Precursors

    NASA Astrophysics Data System (ADS)

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list alluding to a transmission related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous, high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After a number of custom rock experiments, two hypotheses were formed which could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate in creating strong EM fields, because rock has a very high electrical resistance making it a high quality insulator. Penetrative flow could not be corroborated as well, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating, long-wave generation with sufficient amplitude, and the capability of penetrating solid rock. Additionally, fracture spaces, either air or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field occurring with associated phases using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011, and outside the NE Texas town of Timpson in February, 2013. The antennae have mobility and observations were noted for

  16. Pain after earthquake

    PubMed Central

    2012-01-01

    Introduction: On 6 April 2009, at 03:32 local time, an Mw 6.3 earthquake hit the Abruzzi region of central Italy, causing widespread damage in the city of L'Aquila and its nearby villages. The earthquake caused 308 casualties and over 1,500 injuries, displaced more than 25,000 people and induced significant damage to more than 10,000 buildings in the L'Aquila region. Objectives: This observational retrospective study evaluated the prevalence and drug treatment of pain in the five weeks following the L'Aquila earthquake (April 6, 2009). Methods: 958 triage documents were analysed for patients' pain severity, pain type, and treatment efficacy. Results: A third of patients reported pain, a prevalence of 34.6%. More than half of the pain patients reported severe pain (58.8%). Analgesic agents were limited to the available drugs: anti-inflammatory agents, paracetamol, and weak opioids. Reduction in verbal numerical pain scores within the first 24 hours after treatment was achieved with the medications at hand. Pain prevalence and characterization exhibited a biphasic pattern, with acute pain syndromes owing to trauma occurring in the first 15 days after the earthquake; traumatic pain then decreased and re-surged at around week five, owing to rebuilding efforts. In the second through fourth week, reports of pain occurred mainly owing to relapses of chronic conditions. Conclusions: This study indicates that pain is prevalent during natural disasters, may exhibit a discernible pattern over the weeks following the event, and current drug treatments in this region may be adequate for emergency situations. PMID:22747796

  17. Testing Earthquake Source Inversion Methodologies

    NASA Astrophysics Data System (ADS)

    Page, Morgan; Mai, P. Martin; Schorlemmer, Danijel

    2011-03-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  18. Do Earthquakes Shake Stock Markets?

    PubMed

    Ferreira, Susana; Karali, Berna

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan.

  19. Do Earthquakes Shake Stock Markets?

    PubMed Central

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan. PMID:26197482

  20. Tien Shan Geohazards Database: Earthquakes and landslides

    NASA Astrophysics Data System (ADS)

    Havenith, H. B.; Strom, A.; Torgoev, I.; Torgoev, A.; Lamair, L.; Ischuk, A.; Abdrakhmatov, K.

    2015-11-01

    In this paper we present new landslide and earthquake data for a large part of the Tien Shan, Central Asia, and review already existing data. For the same area, only partial databases for sub-regions had been presented previously; these were compiled, and new data were added to fill the gaps between them. Major new inputs are products of the Central Asia Seismic Risk Initiative (CASRI): a tentative digital map of active faults (with indication of characteristic or possible maximum magnitude) and the earthquake catalogue of Central Asia until 2009, which has now been updated with USGS data (to May 2014). The newly compiled landslide inventory contains existing records of 1600 previously mapped mass movements and more than 1800 new landslide records. Considering presently available seismo-tectonic and landslide data, a target region of 1200 km (E-W) by 600 km (N-S) was defined for the production of more or less continuous geohazards information. This target region includes the entire Kyrgyz Tien Shan, the South-Western Tien Shan in Tajikistan, the Fergana Basin (Kyrgyzstan, Tajikistan and Uzbekistan) as well as the western part in Uzbekistan, the north-easternmost part in Kazakhstan and a small part of the eastern Chinese Tien Shan (for the zones outside Kyrgyzstan and Tajikistan, only limited information was available and compiled). On the basis of the new landslide inventory and the updated earthquake catalogue, the link between landslide and earthquake activity is analysed. First, size-frequency relationships are studied for both types of geohazards, in terms of the Gutenberg-Richter law for the earthquakes and in terms of a probability density function for the landslides. For several regions and major earthquake events, case histories are presented to further outline the close connection between earthquake and landslide hazards in the Tien Shan. From this study, we conclude first that a major hazard component is still insufficiently known for both types of geohazards
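
    For reference, the Gutenberg-Richter size-frequency fit mentioned above is usually summarized by the b-value in log10 N(>=M) = a - b*M. A minimal sketch of Aki's maximum-likelihood estimator with Utsu's bin correction; the magnitude list is a hypothetical stand-in for a real catalogue:

      # Sketch: maximum-likelihood b-value for magnitudes complete above mc.
      import math

      def b_value(mags, mc, dm=0.1):
          """Aki (1965) estimator with Utsu's correction for binned magnitudes."""
          m = [x for x in mags if x >= mc]
          mean_m = sum(m) / len(m)
          return math.log10(math.e) / (mean_m - (mc - dm / 2.0))

      catalogue = [4.1, 4.3, 4.0, 5.2, 4.6, 4.2, 4.9, 4.0, 4.4, 6.1, 4.7, 4.1]  # hypothetical
      print(f"b = {b_value(catalogue, mc=4.0):.2f}")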

  1. Distant, delayed and ancient earthquake-induced landslides

    NASA Astrophysics Data System (ADS)

    Havenith, Hans-Balder; Torgoev, Almaz; Braun, Anika; Schlögel, Romy; Micu, Mihai

    2016-04-01

    On the basis of a new classification of seismically induced landslides, we outline particular effects related to the delayed and distant triggering of landslides. These cannot be predicted by state-of-the-art methods. First, for about a dozen events the 'predicted' extension of the affected area is clearly underestimated. The most problematic cases are those for which far-distant triggering of landslides has been reported, such as for the 1988 Saguenay earthquake. In Central Asia, such cases have been reported for areas marked by a thick cover of loess. One possible contributing effect could be a low-frequency resonance of the thick soils induced by distant earthquakes, especially those in the Pamir - Hindu Kush seismic region. Such deep-focal and high-magnitude (>>7) earthquakes are also found in Europe, first of all in the Vrancea region (Romania). For this area and others in Central Asia we computed landslide event sizes related to scenario earthquakes with M>7.5. The second particular and challenging type of triggering is the one delayed with respect to the main earthquake event: case histories have been reported for the Racha earthquake in 1991, when several larger landslides only started moving 2 or 3 days after the main shock. Similar observations were also made after other earthquake events in the U.S., such as after the 1906 San Francisco, the 1949 Tacoma, the 1959 Hebgen Lake and the 1983 Borah Peak earthquakes. Here, we present a series of detailed examples of (partly monitored) mass movements in Central Asia that mainly developed after earthquakes, some even several weeks after the main shock: e.g. the Tektonik and Kainama landslides triggered in 1992 and 2004, respectively. We believe that the development of the massive failures is a consequence of the opening of tension cracks during the seismic shaking and their filling with water during the precipitation that followed the earthquakes. The third particular aspect analysed here is the use of large

  2. Foreshocks of strong earthquakes

    NASA Astrophysics Data System (ADS)

    Guglielmi, A. V.; Sobisevich, L. E.; Sobisevich, A. L.; Lavrov, I. P.

    2014-07-01

    The specific enhancement of ultra-low-frequency (ULF) electromagnetic oscillations a few hours prior to strong earthquakes, previously noted in the literature, motivated us to search for distinctive features of the mechanical (foreshock) activity of the Earth's crust in the epicentral zones of future earthquakes. Activation of the foreshocks three hours before the main shock is revealed, roughly similar to the enhancement of the specific electromagnetic ULF emission. It is hypothesized that the round-the-world seismic echo signals from the earthquakes, which form the peak of energy release 2 h 50 min before the main events, act as triggers of the main shocks due to the cumulative action of the surface waves converging on the epicenter. It is established that the frequency of the fluctuations in the foreshock activity decreases at the final stages of the preparation of the main shocks, which probably testifies to so-called mode softening at the approach of the failure point, in accordance with catastrophe theory.
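
    The 2 h 50 min figure is consistent with a surface wave circling the Earth once, assuming a Rayleigh-wave group velocity of roughly 4 km/s (our assumption for this check, not a value from the paper):

      t \approx \frac{C_\oplus}{U} = \frac{4.0\times10^{4}\ \mathrm{km}}{4.0\ \mathrm{km\,s^{-1}}} = 1.0\times10^{4}\ \mathrm{s} \approx 2\ \mathrm{h}\ 47\ \mathrm{min}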

  3. Housing Damage Following Earthquake

    NASA Technical Reports Server (NTRS)

    1989-01-01

    An automobile lies crushed under the third story of this apartment building in the Marina District after the Oct. 17, 1989, Loma Prieta earthquake. The ground levels are no longer visible because of structural failure and sinking due to liquefaction. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: J.K. Nakata, U.S. Geological Survey.

  4. Sand Volcano Following Earthquake

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Sand boil or sand volcano measuring 2 m (6.6 ft.) in length erupted in the median of Interstate Highway 80 west of the Bay Bridge toll plaza when ground shaking transformed a loose, water-saturated deposit of subsurface sand into a sand-water slurry (liquefaction) in the October 17, 1989, Loma Prieta earthquake. Vented sand contains marine-shell fragments. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. (Credit: J.C. Tinsley, U.S. Geological Survey)

  5. Human casualties in earthquakes: modelling and mitigation

    USGS Publications Warehouse

    Spence, R.J.S.; So, E.K.M.

    2011-01-01

    Earthquake risk modelling is needed for the planning of post-event emergency operations, for the development of insurance schemes, for the planning of mitigation measures in the existing building stock, and for the development of appropriate building regulations; in all of these applications estimates of casualty numbers are essential. But there are many questions about casualty estimation which are still poorly understood. These questions relate to the causes and nature of the injuries and deaths, and the extent to which they can be quantified. This paper looks at the evidence on these questions from recent studies. It then reviews casualty estimation models available, and finally compares the performance of some casualty models in making rapid post-event casualty estimates in recent earthquakes.

  6. Evaluation of earthquake and tsunami on JSFR

    SciTech Connect

    Chikazawa, Y.; Enuma, Y.; Kisohara, N.; Yamano, H.; Kubo, S.; Hayafune, H.; Sagawa, H.; Okamura, S.; Shimakawa, Y.

    2012-07-01

    The effects of earthquake and tsunami on JSFR have been analyzed. For seismic design, safety components are confirmed to maintain their functions even against recent strong earthquakes. As for tsunami, some parts of the reactor building might be submerged, including the component cooling water system (CCWS), whose final heat sink is sea water. However, in the JSFR design, safety-grade components are independent of the CCWS. The JSFR emergency power supply adopts a gas turbine system with air cooling, since JSFR does not basically require quick start-up of the emergency power supply thanks to the natural-convection decay heat removal system (DHRS). Even in case of a long station blackout, the DHRS could be activated by emergency batteries or manually, and be operated continuously by natural convection. (authors)

  7. Pipeline experiment co-located with USGS Parkfield earthquake prediction project

    SciTech Connect

    Isenberg, J.; Richardson, E.

    1995-12-31

    A field experiment to investigate the response of buried pipelines to lateral offsets and traveling waves has been operational since June 1988 at the Owens' Pasture site near Parkfield, CA, where the US Geological Survey has predicted an M6 earthquake. Although the predicted earthquake has not yet occurred, the 1989 Loma Prieta earthquake and a 1992 M4.7 earthquake near Parkfield produced measurable response at the pipeline experiment. The present paper describes upgrades to the experiment which were introduced after Loma Prieta and which performed successfully in the 1992 event.

  8. Structural performance of the DOE's Idaho National Engineering Laboratory during the 1983 Borah Peak earthquake

    SciTech Connect

    Guenzler, R.C.; Gorman, V.W.

    1985-01-01

    The 1983 Borah Peak Earthquake (7.3 Richter magnitude) was the largest earthquake ever experienced by the DOE's Idaho National Engineering Laboratory (INEL). Reactor and plant facilities are generally located about 90 to 110 km (56 to 68 miles) from the epicenter. Several reactors were operating normally at the time of the earthquake. Based on detailed inspections, comparisons of measured accelerations with design levels, and instrumental seismograph information, it was concluded that the 1983 Borah Peak Earthquake created no safety problems for INEL reactors or other facilities. 10 refs., 16 figs., 2 tabs.

  9. Earthquakes; July-August 1977

    USGS Publications Warehouse

    Person, W.J.

    1978-01-01

    July and August were somewhat active seismically speaking, compared to previous months of this year. There were seven earthquakes having magnitudes of 6.5 or greater. The largest was a magnitude 8.0 earthquake south of Sumbawa Island on August 19 that killed at least 111 people. The United States experienced a number of earthquakes during this period, but only one, in California, caused some minor damage.

  10. Space geodesy and earthquake prediction

    NASA Technical Reports Server (NTRS)

    Bilham, Roger

    1987-01-01

    Earthquake prediction is discussed from the point of view of a new development in geodesy known as space geodesy, which involves the use of extraterrestrial sources or reflectors to measure earth-based distances. Space geodesy is explained, and its relation to terrestrial geodesy is examined. The characteristics of earthquakes are reviewed, and the ways that they can be exploited by space geodesy to predict earthquakes is demonstrated.

  11. Earthquakes, September-October 1980

    USGS Publications Warehouse

    Person, W.J.

    1981-01-01

    There were two major (magnitudes 7.0-7.9) earthquakes during this reporting period; a magnitude (M) 7.3 in Algeria where many people were killed or injured and extensive damage occurred, and an M=7.2 in the Loyalty Islands region of the South Pacific. Japan was struck by a damaging earthquake on September 24, killing two people and causing injuries. There were no damaging earthquakes in the United States. 

  12. Earthquakes, November-December 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    There were three major earthquakes (7.0-7.9) during the last two months of the year: a magnitude 7.0 on November 19 in Colombia, a magnitude 7.4 in the Kuril Islands on December 22, and a magnitude 7.1 in the South Sandwich Islands on December 27. Earthquake-related deaths were reported in Colombia, Yemen, and Iran. There were no significant earthquakes in the United States during this reporting period.

  13. Earthquake prediction: Simple methods for complex phenomena

    NASA Astrophysics Data System (ADS)

    Luen, Bradley

    2010-09-01

    Earthquake predictions are often either based on stochastic models or tested using stochastic models. Tests of predictions often tacitly assume predictions do not depend on past seismicity, which is false. We construct a naive predictor that, following each large earthquake, predicts another large earthquake will occur nearby soon. Because this "automatic alarm" strategy exploits clustering, it succeeds beyond "chance" according to a test that holds the predictions fixed. Some researchers try to remove clustering from earthquake catalogs and model the remaining events. There have been claims that the declustered catalogs are Poisson on the basis of statistical tests we show to be weak. Better tests show that declustered catalogs are not Poisson. In fact, there is evidence that events in declustered catalogs do not have exchangeable times given the locations, a necessary condition for the Poisson. If seismicity followed a stochastic process, an optimal predictor would turn on an alarm when the conditional intensity is high. The Epidemic-Type Aftershock Sequence (ETAS) model is a popular point-process model that includes clustering. It has many parameters, but is still a simplification of seismicity. Estimating the model is difficult, and estimated parameters often give a non-stationary model. Even if the model is ETAS, temporal predictions based on the ETAS conditional intensity are not much better than those of magnitude-dependent automatic (MDA) alarms, a much simpler strategy with only one parameter instead of five. For a catalog of Southern Californian seismicity, ETAS predictions again offer only slight improvement over MDA alarms
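
    To make the comparison concrete, here is a minimal sketch of the temporal ETAS conditional intensity and of one possible MDA-style alarm rule. All parameter values, and the exact MDA parameterization, are illustrative assumptions rather than the fitted values from this work:

      # lambda(t) = mu + sum over past events of K * 10**(alpha*(M_i - m0)) / (t - t_i + c)**p
      def etas_intensity(t, events, mu=0.02, K=0.05, alpha=1.0, c=0.01, p=1.1, m0=4.0):
          """events: list of (t_i, M_i) pairs; times in days."""
          rate = mu
          for t_i, m_i in events:
              if t_i < t:
                  rate += K * 10 ** (alpha * (m_i - m0)) / (t - t_i + c) ** p
          return rate

      # One simple magnitude-dependent automatic (MDA) alarm (assumed form): after an
      # event of magnitude M, keep the alarm on for u * 10**M days; u is the lone parameter.
      def mda_alarm_on(t, events, u=0.001):
          return any(t_i < t <= t_i + u * 10 ** m_i for t_i, m_i in events)

      events = [(0.0, 5.5), (0.4, 4.2), (2.0, 4.8)]  # hypothetical (day, magnitude) pairs
      print(etas_intensity(3.0, events), mda_alarm_on(3.0, events))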

  14. Earthquake early warning for the 2016 Kumamoto earthquake: performance evaluation of the current system and the next-generation methods of the Japan Meteorological Agency

    NASA Astrophysics Data System (ADS)

    Kodera, Yuki; Saitou, Jun; Hayashimoto, Naoki; Adachi, Shimpei; Morimoto, Masahiko; Nishimae, Yuji; Hoshiba, Mitsuyuki

    2016-12-01

    The 2016 Kumamoto earthquake (Kumamoto earthquake sequence) is an extremely high-seismicity event that has been occurring across Kumamoto and Oita Prefectures in Japan since April 14, 2016 (JST). The earthquake early warning system of the Japan Meteorological Agency (JMA) issued warnings for 19 events in the Kumamoto earthquake sequence from April 14 to 19, under some of the heaviest loading conditions since the system began operating in 2007. We analyzed the system performance for cases where a warning was issued and/or strong motion was actually observed. The results indicated that the system exhibited remarkable performance, especially for the most destructive earthquakes in the Kumamoto earthquake sequence. In addition, the system did not miss or seriously under-predict strong motion of any large earthquake from April 14 to 30. However, in four cases, the system issued over-predicted warnings due to the simultaneous occurrence of small earthquakes within a short distance, which implies a fundamental obstacle in trigger-data classifications based solely on arrival time. We also performed simulations using the integrated particle filter (IPF) and propagation of local undamped motion (PLUM) methods, which JMA plans to implement to address over-prediction for multiple simultaneous earthquakes and under-prediction for massive earthquakes with large rupture zones. The simulation results of the IPF method indicated that the IPF method is highly effective at minimizing over-prediction even for multiple simultaneous earthquakes within a short distance, since it adopts a trigger-data classification using velocity amplitude and hypocenter determinations using not-yet-arrived data. The simulation results of the PLUM method demonstrated that the PLUM method is capable of issuing warnings for destructive inland earthquakes more rapidly than the current system owing to the use of additional seismometers that can only be incorporated by this method.

  15. Performed Surgical Interventions After the 1999 Marmara Earthquake in Turkey, and Their Importance Regarding Nursing Practices.

    PubMed

    Gul, Asiye; Andsoy, Isil Isik

    2015-01-01

    Effectively dealing with earthquakes is especially important for the people who live in areas prone to earthquakes such as the country of Turkey. Trauma related to earthquakes has specific relevance to nursing practice. The purpose of this review was to describe the types of surgical interventions after the Marmara earthquake and to evaluate the implications for nursing care. English and Turkish articles about the Marmara earthquake were reviewed between May and July 2013. A total of 7 studies were evaluated. The number of patients admitted to the units, types of injuries, and surgical treatments were recorded, with a total of 2378 patients with earthquake-related injuries. The most commonly traumatized parts of the body were the extremities. Fasciotomy operations were performed on 286 patients and 75 patients underwent extremity amputations. Predetermining surgical problems and interventions may be useful in planning for possible future problems in the case of a disaster.

  16. Earthquakes in the United States

    USGS Publications Warehouse

    Stover, C.

    1977-01-01

    To supplement data in the report Preliminary Determination of Epicenters (PDE), the National Earthquake Information Service (NEIS) also publishes a quarterly circular, Earthquakes in the United States. This provides information on the felt area of U.S. earthquakes and their intensity. The main purpose is to describe the larger effects of these earthquakes so that they can be used in seismic risk studies, site evaluations for nuclear power plants, and answering inquiries by the general public.

  17. Earthquakes, January-February 1974

    USGS Publications Warehouse

    Person, W.J.

    1974-01-01

    During the first 2 months of 1974, earthquakes caused fatalities in Peru and Turkey. The largest earthquake during the period was a magnitude 7.2 shock in the New Hebrides Islands. A local tsunami was generated by a magnitude 7.0 earthquake in the Solomon Islands. The relative quiet that characterized world seismicity during the last year continued through the period. There have been no great earthquakes (magnitude 8.0 or larger) since January 10, 1971, when a magnitude 8.1 shock occurred in western New Guinea. 

  18. Radon in earthquake prediction research.

    PubMed

    Friedmann, H

    2012-04-01

    The observation of anomalies in the radon concentration in soil gas and ground water before earthquakes initiated systematic investigations on earthquake precursor phenomena. The question what is needed for a meaningful earthquake prediction as well as what types of precursory effects can be expected is shortly discussed. The basic ideas of the dilatancy theory are presented which in principle can explain the occurrence of earthquake forerunners. The reasons for radon anomalies in soil gas and in ground water are clarified and a possible classification of radon anomalies is given.

  19. Earthquakes, May-June 1981

    USGS Publications Warehouse

    Person, W.J.

    1981-01-01

    The months of May and June were somewhat quiet, seismically speaking. There was one major earthquake (7.0-7.9) off the west coast of South Island, New Zealand. The most destructive earthquake during this reporting period was in southern Iran on June 11 which caused fatalities and extensive damage. Peru also experienced a destructive earthquake on June 22 which caused fatalities and damage. In the United States, a number of earthquakes were experienced, but none caused significant damage. 

  20. Earthquakes, July-August 1992

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    There were two major earthquakes (7.0≤M<8.0) during this reporting period. A magnitude 7.5 earthquake occurred in Kyrgyzstan on August 19 and a magnitude 7.0 quake struck the Ascension Island region on August 28. In southern California, aftershocks of the magnitude 7.6 earthquake on June 28, 1992, continued. One of these aftershocks caused damage and injuries, and at least one other aftershock caused additional damage. Earthquake-related fatalities were reported in Kyrgyzstan and Pakistan.

  1. Atmospheric Baseline Monitoring Data Losses Due to the Samoa Earthquake

    NASA Astrophysics Data System (ADS)

    Schnell, R. C.; Cunningham, M. C.; Vasel, B. A.; Butler, J. H.

    2009-12-01

    The National Oceanic and Atmospheric Administration (NOAA) operates an Atmospheric Baseline Observatory at Cape Matatula on the north-eastern point of American Samoa, opened in 1973. The manned observatory conducts continuous measurements of a wide range of climate-forcing and atmospheric-composition data, including greenhouse gas concentrations, solar radiation, CFC and HFC concentrations, aerosols and ozone, as well as less frequent measurements of many other parameters. The onset of the September 29, 2009 earthquake is clearly visible in the continuous data streams in a variety of ways. The station's electrical generator came online when the Samoa power grid failed, so instruments were powered during and subsequent to the earthquake. Some instruments ceased operation in a spurt of spurious data followed by silence. Other instruments stopped sending data abruptly when the shaking from the earthquake broke data or power links, or when an integral part of the instrument was damaged. Others survived the shaking but were put out of calibration. Still others suffered damage after the earthquake as heaters ran uncontrolled or rotating shafts continued operating in a damaged environment, grinding away until they seized up or chewed out a new operating space. Some instruments operated as if there had been no earthquake; others were brought back online within a few days. Many of the more complex (and, in most cases, most expensive) instruments will be out of service, some for at least 6 months or more. This presentation will show these results and discuss the impact of the earthquake on long-term measurements of climate forcing agents and other critical climate measurements.

  2. Synthetic earthquake catalogs simulating seismic activity in the Corinth Gulf, Greece, fault system

    NASA Astrophysics Data System (ADS)

    Console, R.; Carluccio, R.; Papadimitriou, E. E.; Karakostas, V. G.

    2014-12-01

    The characteristic earthquake hypothesis is the basis of time-dependent modeling of earthquake recurrence on major faults, using the renewal-process methodology. However, the characteristic earthquake hypothesis is not strongly supported by observational data. Few fault segments have long historical or paleoseismic records of individually dated ruptures, and when data and parameter uncertainties are allowed for, the form of the recurrence distribution is difficult to establish. This is the case, for instance, for the Corinth Gulf fault system, for which documents about strong earthquakes exist for at least two thousand years but can be considered complete for magnitudes > 6.0 only for the latest 300 years, during which only a few characteristic earthquakes are reported for single fault segments. The use of a physics-based earthquake simulator has allowed the production of catalogs lasting 100,000 years and containing more than 500,000 events of magnitudes > 4.0. The main features of our simulation algorithm are (1) the imposition of an average slip rate released by earthquakes on every single segment recognized in the investigated fault system, (2) the interaction between earthquake sources, (3) a self-organized earthquake magnitude distribution, and (4) the effect of minor earthquakes in redistributing stress. The application of our simulation algorithm to the Corinth Gulf fault system has shown realistic features in the time, space and magnitude behavior of the seismicity. These features include long-term periodicity of strong earthquakes, short-term clustering of both strong and smaller events, and a realistic earthquake magnitude distribution departing from the Gutenberg-Richter distribution in the higher magnitude range.
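
    A toy sketch of feature (1), the imposed long-term slip-rate budget, with a crude stand-in for feature (2), stress transfer to neighbouring segments. All numbers are placeholders; this illustrates the idea, not the authors' algorithm:

      import random

      random.seed(1)
      segments = [{"rate": 5e-3, "strength": random.uniform(0.8, 1.2), "stress": 0.0}
                  for _ in range(10)]  # "rate" is stress-equivalent loading per year

      def step(segs, dt=1.0):
          """Advance one year; return indices of segments that ruptured."""
          events = []
          for i, s in enumerate(segs):
              s["stress"] += s["rate"] * dt
              if s["stress"] >= s["strength"]:
                  events.append(i)
                  s["stress"] = 0.0
                  for j in (i - 1, i + 1):          # crude interaction term
                      if 0 <= j < len(segs):
                          segs[j]["stress"] += 0.05 * s["strength"]
          return events

      for year in range(300):
          ruptures = step(segments)
          if ruptures:
              print(year, ruptures)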

  3. Some differences in seismic hazard assessment for natural and fluid-induced earthquakes

    NASA Astrophysics Data System (ADS)

    McGarr, A.

    2014-12-01

    Although there is little doubt that fluid-induced earthquakes contribute significantly to the seismic hazard in some parts of the United States, assessing this contribution in ways consistent with hazard assessment for natural earthquakes is proving to be challenging. For natural earthquakes, the hazard is considered to be independent of time whereas for fluid-induced seismicity there is considerable time dependence as evidenced, for instance, by the dramatic increase in recent years of the seismicity in Oklahoma. Case histories of earthquakes induced by the development of Enhanced Geothermal Systems and wastewater injection at depth illustrate a few of the problems. Analyses of earthquake sequences induced by these operations indicate that the rate of earthquake occurrence is proportional to the rate of injection, a factor that, on a broad scale, depends on the level of energy production activities. For natural earthquakes, in contrast, the rate of earthquake occurrence depends on time-independent tectonic factors including the long-term slip rates across known faults. Maximum magnitude assessments for natural and fluid-induced earthquake sources also show a contrast in behavior. For a natural earthquake source, maximum magnitude is commonly assessed from empirical relations between magnitude and the area of a potentially-active fault. The same procedure applied to fluid-induced earthquakes yields magnitudes that are systematically higher than what is observed. For instance, the maximum magnitude estimated from the fault area of the Prague, OK, main shock of 6 November 2011 is 6.2 whereas the magnitude measured from seismic data is 5.65 (Sun and Hartzell, 2014). For fluid-induced earthquakes, maximum magnitude appears to be limited according to the volume of fluid injected before the largest earthquake. This implies that for a given fluid-injection project, the upper limit on magnitude increases as long as injection continues.
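
    A minimal sketch of the volume-limited bound discussed above, M0_max = G * dV (McGarr, 2014), converted to moment magnitude with the Hanks-Kanamori relation; the shear modulus and injection volumes below are generic assumptions:

      import math

      G = 3.0e10  # Pa, assumed crustal modulus of rigidity

      def mw_max(injected_volume_m3):
          """Upper-bound moment magnitude for a given cumulative injected volume."""
          m0 = G * injected_volume_m3                   # bound on seismic moment, N*m
          return (2.0 / 3.0) * (math.log10(m0) - 9.1)   # Hanks-Kanamori

      for dv in (1e4, 1e5, 1e6):  # m^3
          print(f"dV = {dv:.0e} m^3  ->  Mw(max) ~ {mw_max(dv):.1f}")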

  4. Catalog of earthquakes along the San Andreas fault system in Central California: January-March, 1972

    USGS Publications Warehouse

    Wesson, R.L.; Bennett, R.E.; Meagher, K.L.

    1973-01-01

    Numerous small earthquakes occur each day in the Coast Ranges of Central California. The detailed study of these earthquakes provides a tool for gaining insight into the tectonic and physical processes responsible for the generation of damaging earthquakes. This catalog contains the fundamental parameters for earthquakes located within and adjacent to the seismograph network operated by the National Center for Earthquake Research (NCER), U.S. Geological Survey, during the period January - March, 1972. The motivation for these detailed studies has been described by Pakiser and others (1969) and by Eaton and others (1970). Similar catalogs of earthquakes for the years 1969, 1970 and 1971 have been prepared by Lee and others (1972 b,c,d). The basic data contained in these catalogs provide a foundation for further studies. This catalog contains data on 1,718 earthquakes in Central California. Of particular interest is a sequence of earthquakes in the Bear Valley area which contained single shocks with local magnitudes of S.O and 4.6. Earthquakes from this sequence make up roughly 66% of the total and are currently the subject of an interpretative study. Arrival times at 118 seismograph stations were used to locate the earthquakes listed in this catalog. Of these, 94 are telemetered stations operated by NCER. Readings from the remaining 24 stations were obtained through the courtesy of the Seismographic Stations, University of California, Berkeley (UCB); the Earthquake Mechanism Laboratory, National Oceanic and Atmospheric Administration, San Francisco (EML); and the California Department of Water Resources, Sacramento. The Seismographic Stations of the University of California, Berkeley,have for many years published a bulletin describing earthquakes in Northern California and the surrounding area, and readings at UCB Stations from more distant events. The purpose of the present catalog is not to replace the UCB Bulletin, but rather to supplement it, by describing the

  5. Catalog of earthquakes along the San Andreas fault system in Central California, April-June 1972

    USGS Publications Warehouse

    Wesson, R.L.; Bennett, R.E.; Lester, F.W.

    1973-01-01

    Numerous small earthquakes occur each day in the coast ranges of Central California. The detailed study of these earthquakes provides a tool for gaining insight into the tectonic and physical processes responsible for the generation of damaging earthquakes. This catalog contains the fundamental parameters for earthquakes located within and adjacent to the seismograph network operated by the National Center for Earthquake Research (NCER), U.S. Geological Survey, during the period April - June, 1972. The motivation for these detailed studies has been described by Pakiser and others (1969) and by Eaton and others (1970). Similar catalogs of earthquakes for the years 1969, 1970 and 1971 have been prepared by Lee and others (1972 b, c, d). A catalog for the first quarter of 1972 has been prepared by Wesson and others (1972). The basic data contained in these catalogs provide a foundation for further studies. This catalog contains data on 910 earthquakes in Central California. A substantial portion of the earthquakes reported in this catalog represents a continuation of the sequence of earthquakes in the Bear Valley area which began in February, 1972 (Wesson and others, 1972). Arrival times at 126 seismograph stations were used to locate the earthquakes listed in this catalog. Of these, 101 are telemetered stations operated by NCER. Readings from the remaining 25 stations were obtained through the courtesy of the Seismographic Stations, University of California, Berkeley (UCB); the Earthquake Mechanism Laboratory, National Oceanic and Atmospheric Administration, San Francisco (EML); and the California Department of Water Resources, Sacramento. The Seismographic Stations of the University of California, Berkeley, have for many years published a bulletin describing earthquakes in Northern California and the surrounding area, and readings at UCB Stations from more distant events. The purpose of the present catalog is not to replace the UCB Bulletin, but rather to supplement

  6. Source Rupture Process of the 2005 Tarapaca Intermediate Depth Earthquake

    NASA Astrophysics Data System (ADS)

    Peyrat, S.; Favreau, P.; de Chabalier, J.; Bouin, M.

    2007-12-01

    We investigate the details of the rupture process of the large (Mw 7.7) intermediate-depth earthquake that occurred on 13 June 2005 in the Tarapaca region of the Northern Chile seismic gap, using different data sets and different methods. The high quality and variety of seismic and geodetic data available for this event provided an unprecedented opportunity to study its source in detail. This earthquake is a slab-pull event with a down-dip extensional source mechanism. The aftershock distribution, determined from a post-seismic temporary array, indicates a sub-horizontal fault plane lying between the upper and lower planes of the double seismic zone. This earthquake was also recorded by a permanent digital strong-motion network operated by the University of Chile. These records have absolute time and high dynamic range, so they contain direct information about the rupture process. We used a systematic, fully nonlinear inversion method based on the neighbourhood algorithm to invert for the kinematic slip distribution using the accelerometric data set. This low-frequency inversion provides a relatively smooth image of the rupture history. The kinematic inversion shows that the earthquake occurred by the rupture of two asperities. Based on the kinematic inversion result, we propose dynamic rupture models in order to quantify the dynamic rupture process. We simulate the dynamic rupture process and the strong ground motion using a 3D finite-difference method. In our simulation, dynamic rupture grows under the simultaneous control of initial stress and rupture resistance by friction. We constrain the dynamic rupture parameters of the Tarapaca earthquake by simple trial and error. Large intraplate earthquakes in subduction zones are quite common, although very few have been studied in detail. These earthquakes occur at depths where the mechanism by which they are triggered remains poorly understood. Consequently, the determination of source rupture for intermediate

  7. Conditional Probabilities for Large Events Estimated by Small Earthquake Rate

    NASA Astrophysics Data System (ADS)

    Wu, Yi-Hsuan; Chen, Chien-Chih; Li, Hsien-Chi

    2016-01-01

    We examined forecasting quiescence and activation models to obtain the conditional probability that a large earthquake will occur in a specific time period on different scales in Taiwan. The basic idea of the quiescence and activation models is to use earthquakes that have magnitudes larger than the completeness magnitude to compute the expected properties of large earthquakes. We calculated the probability time series for the whole Taiwan region and for three subareas of Taiwan—the western, eastern, and northeastern Taiwan regions—using 40 years of data from the Central Weather Bureau catalog. In the probability time series for the eastern and northeastern Taiwan regions, a high probability value is usually yielded in cluster events such as events with foreshocks and events that all occur in a short time period. In addition to the time series, we produced probability maps by calculating the conditional probability for every grid point at the time just before a large earthquake. The probability maps show that high probability values are yielded around the epicenter before a large earthquake. The receiver operating characteristic (ROC) curves of the probability maps demonstrate that the probability maps are not random forecasts, but also suggest that lowering the magnitude of a forecasted large earthquake may not improve the forecast method itself. From both the probability time series and probability maps, it can be observed that the probability obtained from the quiescence model increases before a large earthquake and the probability obtained from the activation model increases as the large earthquakes occur. The results lead us to conclude that the quiescence model has better forecast potential than the activation model.
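
    A hand-rolled sketch of the ROC construction used to check the probability maps: sweep a threshold down the forecast probabilities and trace hit rate against false-alarm rate. The probability and outcome arrays are placeholders for grid-cell forecasts and observed large-event occurrences:

      def roc_points(probs, outcomes):
          """Return (false-positive rate, hit rate) pairs over descending thresholds."""
          pairs = sorted(zip(probs, outcomes), reverse=True)
          pos = sum(outcomes) or 1
          neg = (len(outcomes) - sum(outcomes)) or 1
          tp = fp = 0
          pts = [(0.0, 0.0)]
          for _, hit in pairs:
              if hit:
                  tp += 1
              else:
                  fp += 1
              pts.append((fp / neg, tp / pos))
          return pts

      probs = [0.9, 0.7, 0.6, 0.4, 0.3, 0.1]   # hypothetical cell probabilities
      outcomes = [1, 1, 0, 1, 0, 0]            # 1 = large event occurred in the cell
      print(roc_points(probs, outcomes))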

  8. Future Earth: Reducing Loss By Automating Response to Earthquake Shaking

    NASA Astrophysics Data System (ADS)

    Allen, R. M.

    2014-12-01

    Earthquakes pose a significant threat to society in the U.S. and around the world. The risk is easily forgotten given the infrequent recurrence of major damaging events, yet the likelihood of a major earthquake in California in the next 30 years is greater than 99%. As our societal infrastructure becomes ever more interconnected, the potential impacts of these future events are difficult to predict. Yet the same interconnected infrastructure also allows us to rapidly detect earthquakes as they begin and provide seconds, tens of seconds, or a few minutes of warning. A demonstration earthquake early warning system is now operating in California and is being expanded to the west coast (www.ShakeAlert.org). In recent earthquakes in the Los Angeles region, alerts were generated that could have provided warning to the vast majority of Los Angelinos who experienced the shaking. Efforts are underway to build a public system. Smartphone technology will be used not only to issue the alerts but also to collect data and improve the warnings. The MyShake project at UC Berkeley is currently testing an app that attempts to turn millions of smartphones into earthquake detectors. As our development of the technology continues, we can anticipate ever more automated response to earthquake alerts. Already, the BART system in the San Francisco Bay Area automatically stops trains based on the alerts. In the future, elevators will stop, machinery will pause, hazardous materials will be isolated, and self-driving cars will pull over to the side of the road. In this presentation we will review the current status of the earthquake early warning system in the US. We will illustrate how smartphones can contribute to the system. Finally, we will review applications of the information to reduce future losses.
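
    A back-of-envelope sketch of where those seconds to minutes of warning come from: an alert goes out once P waves reach nearby stations and are processed, while the damaging S waves travel more slowly. All speeds, distances, and latencies below are generic assumptions, not ShakeAlert parameters:

      VP, VS = 6.0, 3.5  # km/s, assumed crustal P- and S-wave speeds

      def warning_time_s(epicentral_km, station_km=20.0, processing_s=4.0):
          """Seconds between the alert and S-wave arrival at a given distance."""
          alert_at = station_km / VP + processing_s
          s_arrival = epicentral_km / VS
          return max(0.0, s_arrival - alert_at)

      for d in (20, 50, 100, 200):
          print(f"{d:>3} km from the epicenter -> ~{warning_time_s(d):.0f} s of warning")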

  9. Earthquake precursory events around epicenters and local active faults; the cases of two inland earthquakes in Iran

    NASA Astrophysics Data System (ADS)

    Valizadeh Alvan, H.; Mansor, S.; Haydari Azad, F.

    2012-12-01

    The possibility of earthquake prediction in the frame of several days to a few minutes before occurrence has recently stirred interest among researchers. Scientists believe that the new theories and explanations of the mechanism of this natural phenomenon are trustworthy and can be the basis of future prediction efforts. During the last thirty years, experimental research has identified some pre-earthquake events which are now recognized as confirmed warning signs (precursors) of past known earthquakes. With the advances in in-situ measurement devices and data analysis capabilities and the emergence of satellite-based data collectors, monitoring the earth's surface is now routine work. Data providers are supplying researchers from all over the world with high-quality, validated imagery and non-imagery data. Surface Latent Heat Flux (SLHF), the amount of energy exchanged in the form of water vapor between the earth's surface and the atmosphere, has frequently been reported as an earthquake precursor during the past years. The stress accumulated in the earth's crust during the preparation phase of earthquakes is said to be the main cause of temperature anomalies weeks to days before the main event and subsequent shakes. Chemical and physical interactions in the presence of underground water lead to higher water evaporation prior to inland earthquakes. On the other hand, the leakage of radon gas as rocks break during earthquake preparation causes the formation of airborne ions and higher air temperature (AT) prior to the main event. Although co-analysis of direct and indirect observations of precursory events is considered a promising method for future successful earthquake prediction, without proper and thorough knowledge of the geological setting, atmospheric factors and geodynamics of earthquake-prone regions we will not be able to identify anomalies due to seismic activity in the earth's crust. Active faulting is a key factor in identification of the

  10. Neoliberalism and criticisms of earthquake insurance arrangements in New Zealand.

    PubMed

    Hay, I

    1996-03-01

    Global collapse of the Fordist-Keynesian regime of accumulation and an attendant philosophical shift in New Zealand politics to neoliberalism have prompted criticisms of, and changes to, the Earthquake and War Damage Commission. Earthquake insurance arrangements made 50 years ago in an era of collectivist, welfarist political action are now set in an environment in which emphasis is given to competitive relations and individualism. Six specific criticisms of the Commission are identified, each of which is founded in the rhetoric and ideology of a neoliberal political project which has underpinned radical social and economic changes in New Zealand since the early 1980s. On the basis of those criticisms, and in terms of the Earthquake Commission Act 1993, the Commission has been restructured. The new Commission is withdrawing from its primary position as the nation's non-residential property hazards insurer and is restricting its coverage of residential properties.

  11. RICHTER: A Smartphone Application for Rapid Collection of Geo-Tagged Pictures of Earthquake Damage

    NASA Astrophysics Data System (ADS)

    Skinnemoen, H.; Bossu, R.; Furuheim, K.; Bjorgo, E.

    2010-12-01

    RICHTER (Rapid geo-Images for Collaborative Help Targeting Earthquake Response) is a smartphone version of a professional application developed to provide high quality geo-tagged image communication over challenging network links, such as satellites and poor mobile links. Developed for Android mobile phones, it allows eyewitnesses to share their pictures of earthquake damage easily and without cost with the Euro-Mediterranean Seismological Centre (EMSC). The goal is to engage citizens in the collection of the most up-to-date visual information on local damage for improved rapid impact assessment. RICHTER integrates the innovative and award winning ASIGN protocol initially developed for satellite communication between cameras / computers / satcom terminals and servers at HQ. ASIGN is a robust and optimal image and video communication management solution for bandwidth-limited communication networks which was developed for use particularly in emergency and disaster situations. Contrary to a simple Multimedia Messaging System (MMS), RICHTER allows access to high definition images with embedded location information. Location is automatically assigned from either the internal GPS, derived from the mobile network (triangulation) or the current Wi-Fi domain, in that order, as this corresponds to the expected positioning accuracy. Pictures are compressed to 20-30KB of data typically for fast transfer and to avoid network overload. Full size images can be requested by the EMSC either fully automatically, or on a case-by-case basis, depending on the user preferences. ASIGN was initially developed in coordination with INMARSAT and the European Space Agency. It was used by the Rapid Mapping Unit of the United Nations notably for the damage assessment of the January 12, 2010 Haiti earthquake where more than 700 photos were collected. RICHTER will be freely distributed on the EMSC website to eyewitnesses in the event of significantly damaging earthquakes. The EMSC is the second
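
    The location-assignment order described above (internal GPS, then mobile-network triangulation, then the current Wi-Fi domain) amounts to an accuracy-ordered fallback chain. A minimal sketch; the provider callables are hypothetical stand-ins, not the RICHTER/ASIGN API:

      from typing import Callable, Optional, Tuple

      Fix = Optional[Tuple[float, float]]  # (lat, lon), or None if unavailable

      def best_location(gps: Callable[[], Fix],
                        cell: Callable[[], Fix],
                        wifi: Callable[[], Fix]) -> Fix:
          """Try each source in order of expected positioning accuracy."""
          for source in (gps, cell, wifi):
              fix = source()
              if fix is not None:
                  return fix
          return None

      # Stubbed usage: GPS unavailable (say, indoors), network triangulation succeeds.
      print(best_location(lambda: None, lambda: (48.86, 2.35), lambda: None))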

  12. Rapid earthquake characterization using MEMS accelerometers and volunteer hosts following the M 7.2 Darfield, New Zealand, Earthquake

    USGS Publications Warehouse

    Lawrence, J. F.; Cochran, E.S.; Chung, A.; Kaiser, A.; Christensen, C. M.; Allen, R.; Baker, J.W.; Fry, B.; Heaton, T.; Kilb, Debi; Kohler, M.D.; Taufer, M.

    2014-01-01

    We test the feasibility of rapidly detecting and characterizing earthquakes with the Quake‐Catcher Network (QCN) that connects low‐cost microelectromechanical systems accelerometers to a network of volunteer‐owned, Internet‐connected computers. Following the 3 September 2010 M 7.2 Darfield, New Zealand, earthquake we installed over 180 QCN sensors in the Christchurch region to record the aftershock sequence. The sensors are monitored continuously by the host computer and send trigger reports to the central server. The central server correlates incoming triggers to detect when an earthquake has occurred. The location and magnitude are then rapidly estimated from a minimal set of received ground‐motion parameters. Full seismic time series are typically not retrieved for tens of minutes or even hours after an event. We benchmark the QCN real‐time detection performance against the GNS Science GeoNet earthquake catalog. Under normal network operations, QCN detects and characterizes earthquakes within 9.1 s of the earthquake rupture and determines the magnitude within 1 magnitude unit of that reported in the GNS catalog for 90% of the detections.
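
    A minimal sketch of the kind of server-side trigger correlation the description implies: declare an event when enough host triggers cluster within a short window. The threshold and window are assumptions for illustration, not QCN's operational values:

      def detect_events(trigger_times, min_triggers=5, window_s=3.0):
          """trigger_times: trigger epochs (s) from distinct hosts; returns event onsets."""
          events, i = [], 0
          ts = sorted(trigger_times)
          while i < len(ts):
              j = i
              while j < len(ts) and ts[j] - ts[i] <= window_s:
                  j += 1
              if j - i >= min_triggers:
                  events.append(ts[i])
                  i = j            # consume the whole cluster
              else:
                  i += 1
          return events

      print(detect_events([0.1, 0.4, 0.9, 1.2, 1.4, 1.9, 50.0, 51.0]))  # -> [0.1]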

  13. Integrated Seismicity Model to Detect Pairs of Possible Interdependent Earthquakes and Its Application to Aftershocks of the 2011 Tohoku-Oki Earthquake and Sequence of the 2014 Kermadec and Rat Islands Earthquakes

    NASA Astrophysics Data System (ADS)

    Miyazawa, M.; Tamura, R.

    2015-12-01

    We introduce an integrated seismicity model to stochastically evaluate the time intervals of consecutive earthquakes at global scales, making it possible to detect a pair of earthquakes that are remotely located and possibly related to each other. The model includes seismicity in non-overlapping areas and comprehensively explains the seismicity on the basis of point-process models, which include the stationary Poisson model, the aftershock decay model following the Omori-Utsu law, and/or the epidemic-type aftershock sequence (ETAS) model. By use of this model, we examine the possibility of remote triggering of the 2011 M6.4 eastern Shizuoka earthquake in the vicinity of Mt. Fuji, which occurred 4 days after the Mw9.0 Tohoku-Oki earthquake and 4 minutes after the M6.2 off-Fukushima earthquake located about 400 km away, and that of the 2014 Mw7.9 Rat Islands earthquake, which occurred within one hour after the Mw6.7 Kermadec earthquake located about 9,000 km away and followed two large (Mw6.9, 6.5) earthquakes in the region. Both target earthquakes occurred during the passage of surface waves propagating from the previous large events. We estimated the probability that the time interval is shorter than that between consecutive events and obtained dynamic stress changes on the faults. The results indicate that the M6.4 eastern Shizuoka event may rather have been triggered by the static stress changes from the Tohoku-Oki earthquake and that the Mw7.9 Rat Islands event may have been remotely triggered by the Kermadec events, possibly via cyclic fatigue.
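
    For reference, the aftershock decay component mentioned above is the Omori-Utsu law, whose rate and expected count over a window are (standard form; K, c and p are fitted per sequence):

      n(t) = \frac{K}{(t + c)^{p}}, \qquad
      N(0, T) = \int_{0}^{T} n(t)\,dt = \frac{K}{1 - p}\left[(T + c)^{1-p} - c^{1-p}\right] \quad (p \neq 1)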

  14. Statistical Earthquake Focal Mechanism Forecasts

    NASA Astrophysics Data System (ADS)

    Kagan, Y. Y.; Jackson, D. D.

    2013-12-01

    The new whole-Earth focal mechanism forecast, based on the GCMT catalog, has been created. In the present forecast, the sum of normalized seismic moment tensors within a 1000 km radius is calculated, and the P- and T-axes for the focal mechanism are evaluated on the basis of the sum. Simultaneously we calculate an average rotation angle between the forecasted mechanism and all the surrounding mechanisms. This average angle shows the tectonic complexity of a region and indicates the accuracy of the prediction. The method was originally proposed by Kagan and Jackson (1994, JGR). Recent interest by CSEP and GEM has motivated some improvements, particularly extending the previous forecast to polar and near-polar regions. The major problem in extending the forecast is the focal mechanism calculation on a spherical surface. In the previous forecast, when the average focal mechanism was computed, it was assumed that longitude lines are approximately parallel within the 1000 km radius. This is largely accurate in equatorial and near-equatorial areas. However, as one approaches 75 degrees latitude, the longitude lines are no longer parallel: the bearing (azimuthal) difference at points separated by 1000 km reaches about 35 degrees. In most situations a forecast point where we calculate an average focal mechanism is surrounded by earthquakes, so the bias should not be strong, owing to cancellation of the difference effect. But if we move into polar regions, the bearing difference can approach 180 degrees. In the modified program, focal mechanisms are projected onto a plane tangent to the sphere at the forecast point. New longitude axes, which are parallel in the tangent plane, are corrected for the bearing difference. A comparison with the old 75S-75N forecast shows that in equatorial regions the forecasted focal mechanisms are almost the same, and the difference in the forecasted focal-mechanism rotation angle is close to zero. However, though the forecasted focal mechanisms are similar
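
    The quoted value of about 35 degrees follows from meridian convergence: for two points on the same parallel at latitude phi separated by a distance d, the meridians differ in bearing by roughly (d / (R cos phi)) * sin phi. A quick spherical-Earth sketch reproduces it:

      import math

      R = 6371.0  # km, mean Earth radius

      def bearing_difference_deg(lat_deg, separation_km):
          """Approximate meridian convergence between points on one parallel."""
          phi = math.radians(lat_deg)
          dlon = separation_km / (R * math.cos(phi))  # longitude span, radians
          return math.degrees(dlon * math.sin(phi))

      print(f"{bearing_difference_deg(75.0, 1000.0):.1f} deg at 75 N")  # ~33.6
      print(f"{bearing_difference_deg(5.0, 1000.0):.1f} deg at 5 N")    # ~0.8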

  15. Self-Organized Earthquakes

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Turcotte, D. L.; Klein, W.

    2011-12-01

    Self-Organized Criticality (SOC) was proposed by Per Bak et al. [1] as a means of explaining scaling laws observed in driven natural systems, usually (slowly) driven threshold systems. The example used by Bak was a simple cellular automaton model of a sandpile, in which grains of sand were slowly dropped (randomly) onto a flat plate. After a period of time, during which the 'critical state' was approached, a series of self-similar avalanches would begin. Scaling exponents for the frequency-area statistics of the sandpile avalanches were found to be approximately 1, a value that characterizes 'flicker noise' in natural systems. SOC is associated with a critical point in the phase diagram of the system, and it was found that the usual 2-scaling-field theory applies. A model related to SOC is the Self-Organized Spinodal (SOS), or intermittent criticality, model. Here a slow but persistent driving force leads to a quasi-periodic approach to, and retreat from, the classical limit of stability, or spinodal. Scaling exponents for this model can be related to the Gutenberg-Richter and Omori exponents observed in earthquake systems. In contrast to SOC models, nucleation, of both classical and non-classical types, is possible in SOS systems. Tunneling or nucleation rates can be computed from Langer-Klein-Landau-Ginzburg theories for comparison to observations. Nucleating droplets play a role similar to characteristic earthquake events. Simulation of these systems reveals much of the phenomenology associated with earthquakes and other types of "burst" dynamics. Whereas SOC is characterized by the full scaling spectrum of avalanches, SOS is characterized both by system-size events above the nominal frequency-size scaling curve and by scaling of small events. Applications to other systems, including integrate-and-fire neural networks and financial crashes, will be discussed. [1] P. Bak, C. Tang and K. Wiesenfeld, Self-Organized Criticality, Phys. Rev. Lett., 59, 381 (1987).
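
    For readers who want to experiment, a minimal version of the Bak-Tang-Wiesenfeld sandpile automaton cited above: drop grains at random, topple any cell holding four or more grains (grains fall off at the edges), and record avalanche sizes for frequency-size statistics. Grid size and grain count are arbitrary choices:

      import random

      random.seed(0)
      N = 20
      grid = [[0] * N for _ in range(N)]

      def topple():
          """Relax the grid; return the number of topplings (avalanche size)."""
          size = 0
          unstable = [(i, j) for i in range(N) for j in range(N) if grid[i][j] >= 4]
          while unstable:
              i, j = unstable.pop()
              if grid[i][j] < 4:
                  continue
              grid[i][j] -= 4
              size += 1
              for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                  ni, nj = i + di, j + dj
                  if 0 <= ni < N and 0 <= nj < N:
                      grid[ni][nj] += 1
                      if grid[ni][nj] >= 4:
                          unstable.append((ni, nj))
          return size

      sizes = []
      for _ in range(20000):
          grid[random.randrange(N)][random.randrange(N)] += 1
          s = topple()
          if s:
              sizes.append(s)
      print(max(sizes), sum(sizes) / len(sizes))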

  16. Analysis of worldwide earthquake mortality using multivariate demographic and seismic data.

    PubMed

    Gutiérrez, E; Taucer, F; De Groeve, T; Al-Khudhairy, D H A; Zaldivar, J M

    2005-06-15

    In this paper, mortality in the immediate aftermath of an earthquake is studied on a worldwide scale using multivariate analysis. A statistical method is presented that analyzes reported earthquake fatalities as a function of a heterogeneous set of parameters selected on the basis of their presumed influence on earthquake mortality. The ensemble was compiled from demographic, seismic, and reported fatality data culled from available records of past earthquakes organized in a geographic information system. The authors consider the statistical relation between earthquake mortality and the available data ensemble, analyze the validity of the results in view of the parametric uncertainties, and propose a multivariate mortality analysis prediction method. The analysis reveals that, although the highest mortality rates are expected in poorly developed rural areas, high fatality counts can result from a wide range of mortality ratios that depend on the effective population size.
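
    A generic sketch of the kind of multivariate fit described: ordinary least squares of log fatalities on a few demographic and seismic predictors. The arrays are hypothetical stand-ins for the paper's GIS-derived ensemble, and the predictor choice is illustrative only:

      import numpy as np

      # columns: magnitude, log10(exposed population), development index (all hypothetical)
      X = np.array([[6.5, 5.2, 0.8],
                    [7.1, 6.0, 0.4],
                    [6.9, 5.5, 0.6],
                    [7.8, 6.3, 0.3],
                    [6.2, 4.9, 0.9]])
      y = np.log10(np.array([30.0, 2100.0, 400.0, 18000.0, 8.0]))  # reported fatalities

      A = np.column_stack([np.ones(len(X)), X])  # add intercept column
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)
      print(dict(zip(["intercept", "magnitude", "log_pop", "dev_index"], coef.round(2))))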

  17. Who cares about Mid-Ocean Ridge Earthquakes? And Why?

    NASA Astrophysics Data System (ADS)

    Tolstoy, M.

    2004-12-01

    Every day the surface of our planet is being slowly ripped apart by the forces of plate tectonics. Much of this activity occurs underwater and goes unnoticed except by a few marine seismologists who avidly follow the creaks and groans of the ocean floor in an attempt to understand the spreading and formation of oceanic crust. Are marine seismologists really the only ones who care? As it turns out, deep beneath the ocean surface, earthquakes play a fundamental role in a myriad of activity centered on mid-ocean ridges, where new crust forms and breaks on a regular basis. This activity takes the form of exotic geological structures hosting roasting hot fluids and bizarre chemosynthetic life forms. One of the fundamental drivers for this other world on the seafloor is earthquakes. Earthquakes provide cracks that allow seawater to penetrate the rocks, heat up, and resurface as hydrothermal vent fluids, thus providing chemicals to feed a thriving biological community. Earthquakes can cause pressure changes along cracks that fundamentally alter fluid flow rates and paths. Thus earthquakes can both cut off existing communities from their nutrient source and provide new oases on the seafloor around which life can thrive. This poster will present some of the fundamental physical principles of how earthquakes can impact fluid flow, and hence life, on the seafloor. Using these other-worldly landscapes and alien-like life forms to woo the unsuspecting passerby, we will sneak geophysics into the picture and tell the story of why earthquakes are so fundamental to life on the seafloor, and perhaps life elsewhere in the universe.

  18. Earthquakes; May-June 1977

    USGS Publications Warehouse

    Person, W.J.

    1977-01-01

    The months of May and June were somewhat quiet seismically speaking. There was only one significant earthquake, a magnitude 7.2 on June 22 in the Tonga Islands. In the United States, the two largest earthquakes occurred in California and on Hawaii.

  19. Earthquake prediction; fact and fallacy

    USGS Publications Warehouse

    Hunter, R.N.

    1976-01-01

    Earthquake prediction is a young and growing area in the field of seismology. Only a few years ago, experts in seismology were declaring flatly that it was impossible. Now, some successes have been achieved and more are expected. Within a few years, earthquakes may be predicted as routinely as the weather, and possibly with greater accuracy. 

  20. Earthquakes Threaten Many American Schools

    ERIC Educational Resources Information Center

    Bailey, Nancy E.

    2010-01-01

    Millions of U.S. children attend schools that are not safe from earthquakes, even though they are in earthquake-prone zones. Several cities and states have worked to identify and repair unsafe buildings, but many others have done little or nothing to fix the problem. The reasons for ignoring the problem include political and financial ones, but…

  1. Make an Earthquake: Ground Shaking!

    ERIC Educational Resources Information Center

    Savasci, Funda

    2011-01-01

    The main purposes of this activity are to help students explore possible factors affecting the extent of the damage of earthquakes and learn the ways to reduce earthquake damages. In these inquiry-based activities, students have opportunities to develop science process skills and to build an understanding of the relationship among science,…

  2. Heavy tails and earthquake probabilities

    USGS Publications Warehouse

    Ellsworth, William L.

    2012-01-01

    The 21st century has already seen its share of devastating earthquakes, some of which have been labeled as “unexpected,” at least in the eyes of some seismologists and more than a few journalists. A list of seismological surprises could include the 2004 Sumatra-Andaman Islands; 2008 Wenchuan, China; 2009 Haiti; 2011 Christchurch, New Zealand; and 2011 Tohoku, Japan, earthquakes.

  3. Earthquakes March-April 1992

    USGS Publications Warehouse

    Person, Waverly J.

    1992-01-01

    The months of March and April were quite active seismically speaking. There was one major earthquake (magnitude 7.0-7.9). Earthquake-related deaths were reported in Iran, Costa Rica, Turkey, and Germany.

  4. Towards Modelling slow Earthquakes with Geodynamics

    NASA Astrophysics Data System (ADS)

    Regenauer-Lieb, K.; Yuen, D. A.

    2006-12-01

    We explore a new, properly scaled, thermal-mechanical geodynamic model^1 that can generate timescales now very close to those of earthquakes and of the same order as slow earthquakes. In our simulations we encounter two basically different bifurcation phenomena: one in which the shear zone nucleates in the ductile field, and a second that is fully associated with elasto-plastic (brittle, pressure-dependent) displacements. A quartz/feldspar composite slab has both modes operating simultaneously at three different depth levels. The bottom of the crust is predominantly controlled by the elasto-visco-plastic mode, while the top is controlled by the elasto-plastic mode. The exchange between the two modes appears to communicate on a sub-horizontal layer in a flip-flop fashion, which may yield a fractal-like signature in time and collapses onto a critical temperature, around 500-580 K for crustal rocks, in the middle of the brittle-ductile transition zone. Near the critical temperature, stresses close to the ideal strength can be reached at the local, meter scale. Investigations of the thermal-mechanical properties under such extreme conditions are pivotal for understanding the physics of earthquakes. 1. Regenauer-Lieb, K., Weinberg, R. & Rosenbaum, G. The effect of energy feedbacks on continental strength. Nature 442, 67-70 (2006).

  5. Intermediate-term earthquake prediction

    USGS Publications Warehouse

    Knopoff, L.

    1990-01-01

    The problems in predicting earthquakes have been attacked by phenomenological methods from pre-historic times to the present. The associations of presumed precursors with large earthquakes have often been remarked upon. The difficulty in identifying whether such correlations are due to chance coincidence or are real precursors is that one usually notes the associations only in the relatively short time intervals before the large events. Only rarely, if ever, is notice taken of whether the presumed precursor is to be found in the rather long intervals that follow large earthquakes, or is in fact absent in these post-earthquake intervals. If there are enough examples, the presumed correlation fails as a precursor in the former case, while in the latter case the precursor would be verified. Unfortunately, the observer is usually not concerned with the 'uninteresting' intervals that have no large earthquakes.

  6. Exaggerated Claims About Earthquake Predictions

    NASA Astrophysics Data System (ADS)

    Kafka, Alan L.; Ebel, John E.

    2007-01-01

    The perennial promise of successful earthquake prediction captures the imagination of a public hungry for certainty in an uncertain world. Yet, given the lack of any reliable method of predicting earthquakes [e.g., Geller, 1997; Kagan and Jackson, 1996; Evans, 1997], seismologists regularly have to explain news stories of a supposedly successful earthquake prediction when it is far from clear just how successful that prediction actually was. When journalists and public relations offices report the latest `great discovery' regarding the prediction of earthquakes, seismologists are left with the much less glamorous task of explaining to the public the gap between the claimed success and the sober reality that there is no scientifically proven method of predicting earthquakes.

  7. Earthquake Simulator Finds Tremor Triggers

    SciTech Connect

    Johnson, Paul

    2015-03-27

    Using a novel device that simulates earthquakes in a laboratory setting, a Los Alamos researcher has found that seismic waves-the sounds radiated from earthquakes-can induce earthquake aftershocks, often long after a quake has subsided. The research provides insight into how earthquakes may be triggered and how they recur. Los Alamos researcher Paul Johnson and colleague Chris Marone at Penn State have discovered how wave energy can be stored in certain types of granular materials-like the type found along certain fault lines across the globe-and how this stored energy can suddenly be released as an earthquake when hit by relatively small seismic waves far beyond the traditional “aftershock zone” of a main quake. Perhaps most surprising, researchers have found that the release of energy can occur minutes, hours, or even days after the sound waves pass; the cause of the delay remains a tantalizing mystery.

  8. Early Earthquakes of the Americas

    NASA Astrophysics Data System (ADS)

    Ni, James

    2004-11-01

    Robert Kovach's second book looks at the interplay of earthquake and volcanic events, archeology, and history in the Americas. Throughout history, major earthquakes have caused the deaths of millions of people and have damaged countless cities. Earthquakes undoubtedly damaged prehistoric cities in the Americas, and evidence of these events could be preserved in archeological records. Kovach asks, Did indigenous native cultures-Indians of the Pacific Northwest, Aztecs, Mayas, and Incas-document their natural history? Some events have been explicitly documented, for example, in Mayan codices, but many may have been recorded as myth and legend. Kovach's discussions of how early cultures dealt with fearful events such as earthquakes and volcanic eruptions are colorful, informative, and entertaining, and include, for example, a depiction of how the Maya would talk to maize plants in their fields during earthquakes to reassure them.

  9. Earthquake hazards: a national threat

    USGS Publications Warehouse

    ,

    2006-01-01

    Earthquakes are one of the most costly natural hazards faced by the Nation, posing a significant risk to 75 million Americans in 39 States. The risks that earthquakes pose to society, including death, injury, and economic loss, can be greatly reduced by (1) better planning, construction, and mitigation practices before earthquakes happen, and (2) providing critical and timely information to improve response after they occur. As part of the multi-agency National Earthquake Hazards Reduction Program, the U.S. Geological Survey (USGS) has the lead Federal responsibility to provide notification of earthquakes in order to enhance public safety and to reduce losses through effective forecasts based on the best possible scientific information.

  10. Are Earthquakes a Critical Phenomenon?

    NASA Astrophysics Data System (ADS)

    Ramos, O.

    2014-12-01

    Earthquakes, granular avalanches, superconducting vortices, solar flares, and even stock markets are known to evolve through power-law distributed events. For decades, the formalism of equilibrium phase transitions has labeled these phenomena as critical, which implies that they are also unpredictable. This work revises these ideas and uses earthquakes as the paradigm to demonstrate that slowly driven systems evolving through uncorrelated and power-law distributed avalanches (UPLA) are not necessarily critical systems, and therefore not necessarily unpredictable. By linking the correlation length to the pdf of the distribution, and comparing it with the one obtained at a critical point, a condition of criticality is introduced. Simulations in the classical Olami-Feder-Christensen (OFC) earthquake model confirm the findings, showing that earthquakes are not a critical phenomenon. However, a single catastrophic earthquake may show critical properties and, paradoxically, the emergence of this temporal critical behaviour may eventually carry precursory signs of catastrophic events.
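
    As a concrete point of reference for the OFC model named above, here is a minimal Python sketch of its standard rules (uniform slow drive to a failure threshold, with a conservation parameter alpha redistributing stress to the four neighbors); the parameter values are illustrative only:

        import numpy as np

        def ofc(n=32, alpha=0.2, steps=2000, seed=1):
            """Minimal Olami-Feder-Christensen model on an n x n open lattice."""
            rng = np.random.default_rng(seed)
            f = rng.uniform(0.0, 1.0, size=(n, n))  # stress at each site
            sizes = []
            for _ in range(steps):
                f += 1.0 - f.max()  # slow drive until one site reaches threshold 1
                size = 0
                while True:
                    idx = np.argwhere(f >= 1.0)
                    if len(idx) == 0:
                        break
                    for i, j in idx:
                        df = f[i, j]
                        f[i, j] = 0.0  # site fails and relaxes
                        size += 1
                        for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
                            if 0 <= ni < n and 0 <= nj < n:
                                f[ni, nj] += alpha * df  # non-conservative transfer (4*alpha < 1)
                sizes.append(size)
            return sizes

        sizes = ofc()
        print("mean avalanche size:", sum(sizes) / len(sizes))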

  11. Testing earthquake source inversion methodologies

    USGS Publications Warehouse

    Page, M.; Mai, P.M.; Schorlemmer, D.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010. Nowadays, earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, the assumed fault geometry and velocity structure, and the chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  12. Earthquake Simulator Finds Tremor Triggers

    ScienceCinema

    Johnson, Paul

    2016-07-12

    Using a novel device that simulates earthquakes in a laboratory setting, a Los Alamos researcher has found that seismic waves-the sounds radiated from earthquakes-can induce earthquake aftershocks, often long after a quake has subsided. The research provides insight into how earthquakes may be triggered and how they recur. Los Alamos researcher Paul Johnson and colleague Chris Marone at Penn State have discovered how wave energy can be stored in certain types of granular materials-like the type found along certain fault lines across the globe-and how this stored energy can suddenly be released as an earthquake when hit by relatively small seismic waves far beyond the traditional “aftershock zone” of a main quake. Perhaps most surprising, researchers have found that the release of energy can occur minutes, hours, or even days after the sound waves pass; the cause of the delay remains a tantalizing mystery.

  13. Testing an earthquake prediction algorithm

    USGS Publications Warehouse

    Kossobokov, V.G.; Healy, J.H.; Dewey, J.W.

    1997-01-01

    A test to evaluate earthquake prediction algorithms is being applied to a Russian algorithm known as M8. The M8 algorithm makes intermediate-term predictions for earthquakes to occur in a large circle, based on integral counts of transient seismicity in the circle. In a retroactive prediction for the period January 1, 1985 to July 1, 1991, the algorithm as configured for the forward test would have predicted eight of ten strong earthquakes in the test area. A null hypothesis, based on random assignment of predictions, predicts eight earthquakes in only 2.87% of the trials. The forward test began July 1, 1991 and will run through December 31, 1997. As of July 1, 1995, the algorithm had forward predicted five out of nine earthquakes in the test area, a success ratio that would have been achieved in 53% of random trials under the null hypothesis.
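
    The quoted significance levels can be approximated with a simple binomial computation; the Python sketch below is a simplification (the actual null hypothesis randomizes predictions over space-time, and the alarm fraction p used here is an assumed free parameter), giving the chance of at least k hits among n target earthquakes when each is covered at random with probability p:

        from math import comb

        def p_at_least(k, n, p):
            """Probability of >= k successes in n independent Bernoulli(p) trials."""
            return sum(comb(n, m) * p**m * (1 - p)**(n - m) for m in range(k, n + 1))

        # Retroactive test: 8 of 10 hits is rarely matched by modest chance coverage.
        print(p_at_least(8, 10, 0.40))  # ~0.012
        # Forward test: 5 of 9 hits is unremarkable under ~50% chance coverage.
        print(p_at_least(5, 9, 0.50))   # 0.50, cf. the quoted 53%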

  14. The Lusi mud eruption was not triggered by an earthquake

    NASA Astrophysics Data System (ADS)

    Manga, M.; Rudolph, M. L.; Tingay, M. R.; Davies, R.; Wang, C.; Shirzaei, M.; Fukushima, Y.

    2013-12-01

    The Lusi mud eruption in East Java, Indonesia, has displaced tens of thousands of people, with economic costs that exceed $4 billion USD to date. Consequently, understanding the cause and future of the eruption is important. There has been considerable debate as to whether the eruption was triggered by the MW 6.3 Yogyakarta earthquake, which struck two days prior to the eruption, or by drilling operations at a gas exploration well (BJP-1) 200 m from the 700 m lineament along which mud first erupted. A recent letter by Lupi et al. (Nature Geoscience, 2013) argues for an earthquake trigger, invoking the presence of a seismically fast structure that amplifies seismic shaking in the mud source region. The absence of an eruption during larger and closer earthquakes indicates that an earthquake trigger is unlikely. Furthermore, the high seismic velocities central to the model of Lupi et al. are impossibly high and are primarily artifacts associated with steel casing installed in the well where the velocities were measured. Finally, the stress changes caused by drilling operations greatly exceeded those produced by the earthquake. Assuming no major changes in plumbing, we conclude by using satellite InSAR to reveal the evolution of surface deformation caused by the eruption and predict a 10-fold decrease in discharge in the next 5 years.

  15. Review of variations in Mw < 7 earthquake motions on position and TEC (Mw = 6.5 Aegean Sea earthquake sample)

    NASA Astrophysics Data System (ADS)

    Yildirim, Omer; Inyurt, Samed; Mekik, Cetin

    2016-02-01

    Turkey is a country located in the middle latitude zone, where tectonic activity is intensive. Recently, an earthquake of magnitude Mw 6.5 occurred offshore in the Aegean Sea on 24 May 2014 at 09:25 UTC and lasted about 40 s. The earthquake was also felt in Greece, Romania, and Bulgaria in addition to Turkey. In recent years, ionospheric anomaly detection studies related to seismicity have been carried out using total electron content (TEC) computed from global navigation satellite system (GNSS) signal delays, and several interesting findings have been published. In this study, both TEC and positional variations have been examined separately following this moderate-size earthquake in the Aegean Sea. The correlation of the aforementioned ionospheric variation with the positional variation has also been investigated. For this purpose, a total of 15 stations were used, including four continuously operating reference stations in Turkey (CORS-TR) and stations in the seismic zone (AYVL, CANA, IPSA, and YENC), as well as international GNSS service (IGS) and European reference frame permanent network (EPN) stations. The ionospheric and positional variations of the AYVL, CANA, IPSA, and YENC stations were examined using Bernese v5.0 software. When the precise point positioning TEC (PPP-TEC) values were examined, it was observed that the TEC values were approximately 4 TECU (total electron content units) above the upper-limit TEC value at the four stations located in Turkey, 3 days before the earthquake at 08:00 and 10:00 UTC. At the same stations, on the day before the earthquake at 06:00, 08:00, and 10:00 UTC, the TEC values were approximately 5 TECU below the lower-limit TEC value. The global ionosphere model TEC (GIM-TEC) values published by the Centre for Orbit Determination in Europe (CODE) were also examined. Three days before the earthquake, at all stations, it was observed that the TEC values in the time period between 08:00 and 10:00 UTC were approximately 2 TECU

  16. The CATDAT damaging earthquakes database

    NASA Astrophysics Data System (ADS)

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

    2011-08-01

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction, and fault rupture) database was developed to validate, remove discrepancies from, and greatly expand upon existing global databases, and to better understand the trends in vulnerability, exposure, and possible future impacts of historic earthquakes. In the authors' view, the lack of consistency and the errors in other frequently cited earthquake loss databases were major shortcomings that needed to be improved upon. Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. Comparison of the 1923 Great Kanto earthquake (214 billion USD damage; 2011 HNDECI-adjusted dollars) with the 2011 Tohoku (>300 billion USD at time of writing), 2008 Sichuan, and 1995 Kobe earthquakes shows the increasing concern for economic loss in urban areas, a trend that should be expected to continue. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  17. What to Expect from the Virtual Seismologist: Delay Times and Uncertainties of Initial Earthquake Alerts in California

    NASA Astrophysics Data System (ADS)

    Behr, Y.; Cua, G. B.; Clinton, J. F.; Racine, R.; Meier, M.; Cauzzi, C.

    2013-12-01

    The Virtual Seismologist (VS) method is a Bayesian approach to regional network-based earthquake early warning (EEW) originally formulated by Cua and Heaton (2007). Implementation of VS into real-time EEW codes has been an on-going effort of the Swiss Seismological Service at ETH Zürich since 2006, with support from ETH Zürich, various European projects, and the United States Geological Survey (USGS). VS is one of three EEW algorithms that form the basis of the California Integrated Seismic Network (CISN) ShakeAlert system, a USGS-funded prototype end-to-end EEW system that could potentially be implemented in California. In Europe, VS is currently operating as a real-time test system in Switzerland, western Greece, and Istanbul. As part of the on-going EU project REAKT (Strategies and Tools for Real-Time Earthquake Risk Reduction), VS installations in southern Italy, Romania, and Iceland are planned or underway. The possible use cases for an EEW system will be determined by the speed and reliability of its earthquake source parameter estimates, so a thorough understanding of both is essential to evaluate the usefulness of VS. For California, we present state-wide theoretical alert times for hypothetical earthquakes by analyzing the time delays introduced by the different components of the VS EEW system. Taking advantage of the fully probabilistic formulation of the VS algorithm, we further present an improved way to describe the uncertainty of every magnitude estimate by evaluating the width and shape of the probability density function that relates waveform envelope amplitudes to magnitude. We evaluate these new uncertainty values for past seismicity in California through off-line playbacks and compare them to the previously defined static definitions of uncertainty based on real-time detections. Our results indicate where VS alerts are most useful in California and also suggest where the most effective improvements to the VS EEW system
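
    The abstract characterizes magnitude uncertainty through the probability density relating envelope amplitudes to magnitude. As a generic, hedged illustration (the Gaussian form and all coefficients below are assumptions, not the published VS relationships), the following Python sketch evaluates a posterior over magnitude from a single log-amplitude observation and reports its width:

        import numpy as np

        def magnitude_pdf(log_amp, dist_km, a=0.8, b=-0.0025, c=-1.0, sigma=0.3):
            """Posterior over magnitude for one envelope log-amplitude observation.

            Assumed forward model: log_amp = a*M + b*dist + c with Gaussian errors
            and a flat prior on M over the grid.
            """
            mags = np.linspace(2.0, 8.0, 601)
            dm = mags[1] - mags[0]
            pred = a * mags + b * dist_km + c
            like = np.exp(-0.5 * ((log_amp - pred) / sigma) ** 2)
            pdf = like / (like.sum() * dm)  # normalize on the grid
            mean = (mags * pdf).sum() * dm
            std = np.sqrt((((mags - mean) ** 2) * pdf).sum() * dm)
            return mags, pdf, mean, std

        _, _, m, s = magnitude_pdf(log_amp=3.2, dist_km=40.0)
        print(f"M = {m:.2f} +/- {s:.2f}")  # the posterior width quantifies alert uncertainty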

  18. Performance Basis for Airborne Separation

    NASA Technical Reports Server (NTRS)

    Wing, David J.

    2008-01-01

    Emerging applications of Airborne Separation Assistance System (ASAS) technologies make possible new and powerful methods in Air Traffic Management (ATM) that may significantly improve the system-level performance of operations in the future ATM system. These applications typically involve the aircraft managing certain components of its Four Dimensional (4D) trajectory within the degrees of freedom defined by a set of operational constraints negotiated with the Air Navigation Service Provider. It is hypothesized that reliable individual performance by many aircraft will translate into higher total system-level performance. To actually realize this improvement, the new capabilities must be attracted to high demand and complexity regions where high ATM performance is critical. Operational approval for use in such environments will require participating aircraft to be certified to rigorous and appropriate performance standards. Currently, no formal basis exists for defining these standards. This paper provides a context for defining the performance basis for 4D-ASAS operations. The trajectory constraints to be met by the aircraft are defined, categorized, and assessed for performance requirements. A proposed extension of the existing Required Navigation Performance (RNP) construct into a dynamic standard (Dynamic RNP) is outlined. Sample data is presented from an ongoing high-fidelity batch simulation series that is characterizing the performance of an advanced 4D-ASAS application. Data of this type will contribute to the evaluation and validation of the proposed performance basis.

  19. MyShake: A smartphone seismic network for earthquake early warning and beyond.

    PubMed

    Kong, Qingkai; Allen, Richard M; Schreier, Louis; Kwon, Young-Woo

    2016-02-01

    Large magnitude earthquakes in urban environments continue to kill and injure tens to hundreds of thousands of people, inflicting lasting societal and economic disasters. Earthquake early warning (EEW) provides seconds to minutes of warning, allowing people to move to safe zones and automated slowdown and shutdown of transit and other machinery. The handful of EEW systems operating around the world use traditional seismic and geodetic networks that exist only in a few nations. Smartphones are much more prevalent than traditional networks and contain accelerometers that can also be used to detect earthquakes. We report on the development of a new type of seismic system, MyShake, that harnesses personal/private smartphone sensors to collect data and analyze earthquakes. We show that smartphones can record magnitude 5 earthquakes at distances of 10 km or less and develop an on-phone detection capability to separate earthquakes from other everyday shakes. Our proof-of-concept system then collects earthquake data at a central site where a network detection algorithm confirms that an earthquake is under way and estimates the location and magnitude in real time. This information can then be used to issue an alert of forthcoming ground shaking. MyShake could be used to enhance EEW in regions with traditional networks and could provide the only EEW capability in regions without. In addition, the seismic waveforms recorded could be used to deliver rapid microseism maps, study impacts on buildings, and possibly image shallow earth structure and earthquake rupture kinematics.

  20. Prevalence and predictors of posttraumatic stress disorder, anxiety, depression, and burnout in Pakistani earthquake recovery workers.

    PubMed

    Ehring, Thomas; Razik, Saiqa; Emmelkamp, Paul M G

    2011-01-30

    Past research has shown a substantial prevalence of emotional disorders in professionals involved in rescue and/or relief operations following natural disasters, including earthquakes. However, no published study to date has investigated whether disaster rehabilitation and reconstruction workers involved in later phases of the earthquake response are also affected by emotional problems. A nearly complete sample of earthquake rehabilitation and reconstruction workers (N=267) involved in the response to the 2005 earthquake in Northern Pakistan filled in a set of self-report questionnaires assessing emotional problems and predictor variables approximately 24 months after the earthquake. Most participants had experienced the disaster themselves and suffered from a number of stressors during and shortly after the acute earthquake phase. A substantial subgroup of participants reported clinically relevant levels of emotional disorders, especially earthquake-related posttraumatic stress disorder (42.6%), as well as depression and anxiety (approx. 20%). Levels of burnout were low. Symptom levels of posttraumatic stress disorder were associated with the severity of the earthquake experience, past traumas, work-related stressors, low social support, and female gender. The results document a high prevalence of emotional problems in earthquake rehabilitation and recovery workers.

  1. MyShake: A smartphone seismic network for earthquake early warning and beyond

    PubMed Central

    Kong, Qingkai; Allen, Richard M.; Schreier, Louis; Kwon, Young-Woo

    2016-01-01

    Large magnitude earthquakes in urban environments continue to kill and injure tens to hundreds of thousands of people, inflicting lasting societal and economic disasters. Earthquake early warning (EEW) provides seconds to minutes of warning, allowing people to move to safe zones and automated slowdown and shutdown of transit and other machinery. The handful of EEW systems operating around the world use traditional seismic and geodetic networks that exist only in a few nations. Smartphones are much more prevalent than traditional networks and contain accelerometers that can also be used to detect earthquakes. We report on the development of a new type of seismic system, MyShake, that harnesses personal/private smartphone sensors to collect data and analyze earthquakes. We show that smartphones can record magnitude 5 earthquakes at distances of 10 km or less and develop an on-phone detection capability to separate earthquakes from other everyday shakes. Our proof-of-concept system then collects earthquake data at a central site where a network detection algorithm confirms that an earthquake is under way and estimates the location and magnitude in real time. This information can then be used to issue an alert of forthcoming ground shaking. MyShake could be used to enhance EEW in regions with traditional networks and could provide the only EEW capability in regions without. In addition, the seismic waveforms recorded could be used to deliver rapid microseism maps, study impacts on buildings, and possibly image shallow earth structure and earthquake rupture kinematics. PMID:26933682

  2. Design and realization of RS application system for earthquake emergency based on digital earth

    NASA Astrophysics Data System (ADS)

    Yuan, Xiaoxiang; Wang, Xiaoqing; Guo, Jianxing; Dou, Aixia; Ding, Xiang

    2016-11-01

    The current remote sensing (RS)-based earthquake emergency system is mainly based on stand-alone software, which cannot meet the requirements of massive remote sensing data and parallel extraction of seismic damage information after a devastating earthquake. Taking Shaanxi Province as an example, this paper first explores the network-based working mode of seismic damage information extraction and the data management strategy for multi-user cooperative operation, based on an analysis of the workflow of RS applications in earthquake emergency response. Then, using the WorldWind Java SDK, an RS application system for earthquake emergency response based on a digital earth platform is designed in a client/server architecture. Finally, spatial data tables for the classification and grading of seismic damage are designed and the system is developed. This system realizes functions including 3D display; management of seismic RS imagery and GIS data obtained before and after an earthquake for different user levels; and cooperative extraction and publication, in real time, of seismic information such as building damage, traffic damage, and seismo-geological disasters caused by an earthquake. Applications to earthquake cases such as the 2014 Ms 6.5 Ludian earthquake show that this system can improve the efficiency of seismic damage information interpretation and data sharing, and provide important disaster information for decision making in earthquake emergency rescue and disaster relief.

  3. Statistical analysis of earthquakes after the 1999 MW 7.7 Chi-Chi, Taiwan, earthquake based on a modified Reasenberg-Jones model

    NASA Astrophysics Data System (ADS)

    Chen, Yuh-Ing; Huang, Chi-Shen; Liu, Jann-Yenq

    2015-12-01

    We investigated the temporal-spatial hazard of the earthquakes after the 1999 September 21 MW = 7.7 Chi-Chi shock in a continental region of Taiwan. The Reasenberg-Jones (RJ) model (Reasenberg and Jones, 1989, 1994), which combines the frequency-magnitude distribution (Gutenberg and Richter, 1944) and a time-decaying occurrence rate (Utsu et al., 1995), is conventionally employed for assessing the earthquake hazard after a large shock. However, we found that the b values in the frequency-magnitude distribution of the earthquakes in the study region dramatically decreased from background values after the Chi-Chi shock, and then gradually recovered. This observation of a time-dependent frequency-magnitude distribution motivated us to propose a modified RJ model (MRJ) for assessing the earthquake hazard. To see how the models perform in assessing short-term earthquake hazard, the RJ and MRJ models were separately used to sequentially forecast earthquakes in the study region. To depict the potential rupture area for future earthquakes, we further constructed relative hazard (RH) maps based on the two models. Receiver Operating Characteristic (ROC) curves (Swets, 1988) demonstrated that the RH map based on the MRJ model was, in general, superior to the one based on the original RJ model for exploring the spatial hazard of earthquakes in a short time after the Chi-Chi shock.
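
    For reference, the Reasenberg-Jones rate on which the abstract builds combines the Gutenberg-Richter magnitude distribution with Omori-Utsu temporal decay. In the usual notation (the paper's MRJ modification, a time-dependent b value, is not reproduced here), the rate of aftershocks with magnitude M or larger at time t after a mainshock of magnitude M_m is

        \lambda(t, M) = \frac{10^{\,a + b\,(M_m - M)}}{(t + c)^{p}}

    where a and b are the Gutenberg-Richter productivity and slope parameters and c and p are the Omori-Utsu decay constants.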

  4. Earthquakes: Risk, Monitoring, Notification, and Research

    DTIC Science & Technology

    2008-06-19

    far away as Bangladesh, Taiwan, Thailand, and Vietnam. Several large aftershocks have occurred since the main seismic event. The May 12 earthquake... motion of tectonic plates; earthquake geology and paleoseismology: studies of the history, effects, and mechanics of earthquakes; earthquake hazards

  5. Determination of controlling earthquakes from probabilistic seismic hazard analysis for nuclear reactor sites

    SciTech Connect

    Boissonnade, A.; Bernreuter, D.; Chokshi, N.; Murphy, A.

    1995-04-04

    Recently, the US Nuclear Regulatory Commission published for public comment a revision to 10 CFR Part 100. The proposed regulation acknowledges that uncertainties are inherent in estimates of the Safe Shutdown Earthquake Ground Motion (SSE) and requires that these uncertainties be addressed through an appropriate analysis. One element of this evaluation is the assessment of the controlling earthquake through probabilistic seismic hazard analysis (PSHA) and its use in determining the SSE. This paper reviews the basis for the various key choices in characterizing the controlling earthquake.

  6. Probability based earthquake load and resistance factor design criteria for offshore platforms

    SciTech Connect

    Bea, R.G.

    1996-12-31

    This paper describes a probability- and reliability-based formulation to determine earthquake Load and Resistance Factor Design (LRFD) parameters for conventional, steel, pile-supported, tubular-membered platforms, proposed as a basis for earthquake design criteria and guidelines for offshore platforms intended to have worldwide applicability. The formulation is illustrated with application to platforms located in five areas: offshore California, Venezuela (Rio Caribe), the East Coast of Canada, the Caspian Sea (Azeri), and the Norwegian sector of the North Sea.
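
    As background for the LRFD formulation mentioned above, the generic load and resistance factor design check (the standard form; the paper's specific earthquake calibration is not reproduced here) requires the factored nominal resistance to exceed the sum of the factored loads:

        \phi \, R_n \;\ge\; \sum_i \gamma_i \, L_i

    where \phi < 1 is the resistance factor, R_n the nominal resistance, and \gamma_i the load factors; a probability-based formulation calibrates \phi and the \gamma_i so that the platform achieves a target reliability under earthquake loading.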

  7. Earthquake rupture below the brittle-ductile transition in continental lithospheric mantle

    PubMed Central

    Prieto, Germán A.; Froment, Bérénice; Yu, Chunquan; Poli, Piero; Abercrombie, Rachel

    2017-01-01

    Earthquakes deep in the continental lithosphere are rare and hard to interpret in our current understanding of temperature control on brittle failure. The recent lithospheric mantle earthquake with a moment magnitude of 4.8 at a depth of ~75 km in the Wyoming Craton was exceptionally well recorded and thus enabled us to probe the cause of these unusual earthquakes. On the basis of complete earthquake energy balance estimates using broadband waveforms and temperature estimates using surface heat flow and shear wave velocities, we argue that this earthquake occurred in response to ductile deformation at temperatures above 750°C. The high stress drop, low rupture velocity, and low radiation efficiency are all consistent with a dissipative mechanism. Our results imply that earthquake nucleation in the lithospheric mantle is not exclusively limited to the brittle regime; weakening mechanisms in the ductile regime can allow earthquakes to initiate and propagate. This finding has significant implications for understanding deep earthquake rupture mechanics and rheology of the continental lithosphere. PMID:28345055

  8. Earthquake rupture below the brittle-ductile transition in continental lithospheric mantle.

    PubMed

    Prieto, Germán A; Froment, Bérénice; Yu, Chunquan; Poli, Piero; Abercrombie, Rachel

    2017-03-01

    Earthquakes deep in the continental lithosphere are rare and hard to interpret in our current understanding of temperature control on brittle failure. The recent lithospheric mantle earthquake with a moment magnitude of 4.8 at a depth of ~75 km in the Wyoming Craton was exceptionally well recorded and thus enabled us to probe the cause of these unusual earthquakes. On the basis of complete earthquake energy balance estimates using broadband waveforms and temperature estimates using surface heat flow and shear wave velocities, we argue that this earthquake occurred in response to ductile deformation at temperatures above 750°C. The high stress drop, low rupture velocity, and low radiation efficiency are all consistent with a dissipative mechanism. Our results imply that earthquake nucleation in the lithospheric mantle is not exclusively limited to the brittle regime; weakening mechanisms in the ductile regime can allow earthquakes to initiate and propagate. This finding has significant implications for understanding deep earthquake rupture mechanics and rheology of the continental lithosphere.

  9. Fracking, wastewater disposal, and earthquakes

    NASA Astrophysics Data System (ADS)

    McGarr, Arthur

    2016-03-01

    In the modern oil and gas industry, fracking of low-permeability reservoirs has resulted in a considerable increase in the production of oil and natural gas, but these fluid-injection activities also can induce earthquakes. Earthquakes induced by fracking are an inevitable consequence of the injection of fluid at high pressure, where the intent is to enhance permeability by creating a system of cracks and fissures that allow hydrocarbons to flow to the borehole. The micro-earthquakes induced during these highly-controlled procedures are generally much too small to be felt at the surface; indeed, the creation or reactivation of a large fault would be contrary to the goal of enhancing permeability evenly throughout the formation. Accordingly, the few case histories for which fracking has resulted in felt earthquakes have been due to unintended fault reactivation. Of greater consequence for inducing earthquakes, modern techniques for producing hydrocarbons, including fracking, have resulted in considerable quantities of coproduced wastewater, primarily formation brines. This wastewater is commonly disposed by injection into deep aquifers having high permeability and porosity. As reported in many case histories, pore pressure increases due to wastewater injection were channeled from the target aquifers into fault zones that were, in effect, lubricated, resulting in earthquake slip. These fault zones are often located in the brittle crystalline rocks in the basement. Magnitudes of earthquakes induced by wastewater disposal often exceed 4, the threshold for structural damage. Even though only a small fraction of disposal wells induce earthquakes large enough to be of concern to the public, there are so many of these wells that this source of seismicity contributes significantly to the seismic hazard in the United States, especially east of the Rocky Mountains where standards of building construction are generally not designed to resist shaking from large earthquakes.

  10. The key role of eyewitnesses in rapid earthquake impact assessment

    NASA Astrophysics Data System (ADS)

    Bossu, Rémy; Steed, Robert; Mazet-Roux, Gilles; Roussel, Frédéric; Etivant, Caroline

    2014-05-01

    Uncertainties in rapid earthquake impact models are intrinsically large, even when excluding potential indirect losses (fires, landslides, tsunami...). The reason is that they are based on several factors which are themselves difficult to constrain, such as the geographical distribution of shaking intensity, the building type inventory, and vulnerability functions. The difficulties can be illustrated by two boundary cases. For moderate (around M6) earthquakes, the size of the potential damage zone and the epicentral location uncertainty share a comparable dimension of about 10-15 km. When such an earthquake strikes close to an urban area, as in 1999 in Athens (M5.9), earthquake location uncertainties alone can lead to dramatically different impact scenarios. Furthermore, for moderate magnitudes, the overall impact is often controlled by individual accidents, as in 2002 in Molise, Italy (M5.7), in Bingol, Turkey (M6.4) in 2003, or in Christchurch, New Zealand (M6.3), where respectively 23 of 30, 84 of 176, and 115 of 185 of the casualties perished in a single building failure. In contrast, for major earthquakes (M>7), the point source approximation is no longer valid, and impact assessment requires knowing exactly where the seismic rupture took place, whether it was unilateral or bilateral, etc., and this information is not readily available directly after the earthquake's occurrence. In-situ observations of actual impact provided by eyewitnesses can dramatically reduce impact model uncertainties. We will present the overall strategy developed at the EMSC, which comprises crowdsourcing and flashsourcing techniques, the development of citizen-operated seismic networks, and the use of social networks to engage with eyewitnesses within minutes of an earthquake's occurrence. For instance, testimonies are collected through online questionnaires available in 32 languages and automatically processed into maps of effects. Geo-located pictures are collected and then

  11. Earthquake damage to transportation systems

    USGS Publications Warehouse

    McCullough, Heather

    1994-01-01

    Earthquakes represent one of the most destructive natural hazards known to man. A large magnitude earthquake near a populated area can affect residents over thousands of square kilometers and cause billions of dollars in property damage. Such an event can kill or injure thousands of residents and disrupt the socioeconomic environment for months, sometimes years. A serious result of a large-magnitude earthquake is the disruption of transportation systems, which limits post-disaster emergency response. Movement of emergency vehicles, such as police cars, fire trucks and ambulances, is often severely restricted. Damage to transportation systems is categorized below by cause including: ground failure, faulting, vibration damage, and tsunamis.

  12. Earthquakes: Thinking about the unpredictable

    NASA Astrophysics Data System (ADS)

    Geller, Robert J.

    The possibility of predicting earthquakes has been investigated by professionals and amateurs, seismologists and nonseismologists, for over 100 years. More than once, hopes of a workable earthquake prediction scheme have been raised only to be dashed. Such schemes—on some occasions accompanied by claims of an established track record—continue to be proposed, not only by Earth scientists, but also by workers in other fields. The assessment of these claims is not just a scientific or technical question. Public administrators and policy makers must make decisions regarding appropriate action in response to claims that some scheme has a predictive capability, or to specific predictions of imminent earthquakes.

  13. Seismology: dynamic triggering of earthquakes.

    PubMed

    Gomberg, Joan; Johnson, Paul

    2005-10-06

    After an earthquake, numerous smaller shocks are triggered over distances comparable to the dimensions of the mainshock fault rupture, although they are rare at larger distances. Here we analyse the scaling of dynamic deformations (the stresses and strains associated with seismic waves) with distance from, and magnitude of, their triggering earthquake, and show that they can cause further earthquakes at any distance if their amplitude exceeds several microstrain, regardless of their frequency content. These triggering requirements are remarkably similar to those measured in the laboratory for inducing dynamic elastic nonlinear behaviour, which suggests that the underlying physics is similar.
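
    To connect the several-microstrain threshold above to ground-motion records, the dynamic strain carried by a passing seismic wave can be estimated with the standard plane-wave approximation, strain ~ peak ground velocity / phase velocity; the numbers in this short Python snippet are illustrative assumptions:

        # Plane-wave approximation: dynamic strain ~ PGV / phase velocity.
        pgv = 0.01  # peak ground velocity in m/s (clearly felt shaking; assumed value)
        c = 3500.0  # surface-wave phase velocity in m/s (assumed value)
        strain = pgv / c
        print(f"dynamic strain ~ {strain:.1e}")  # ~2.9e-06, i.e. about 3 microstrain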

  14. The threat of silent earthquakes

    USGS Publications Warehouse

    Cervelli, Peter

    2004-01-01

    Not all earthquakes shake the ground. The so-called silent types are forcing scientists to rethink their understanding of the way quake-prone faults behave. In rare instances, silent earthquakes that occur along the flanks of seaside volcanoes may cascade into monstrous landslides that crash into the sea and trigger towering tsunamis. Silent earthquakes that take place within fault zones created by one tectonic plate diving under another may increase the chance of ground-shaking shocks. In other locations, however, silent slip may decrease the likelihood of destructive quakes, because it releases stress along faults that might otherwise be ready to snap.

  15. Earthquakes, November-December 1977

    USGS Publications Warehouse

    Person, W.J.

    1978-01-01

    In the United States, the largest earthquake during this reporting period was a magnitude 6.6 in the Andreanof Islands, part of the Aleutian Islands chain, on November 4 that caused some minor damage. Northern California was struck by a magnitude 4.8 earthquake on November 22, causing moderate damage in the Willits area. This was the most damaging quake in the United States during the year. Two major earthquakes of magnitude 7.0 or above brought the total to 14 for the year.

  16. Regional Earthquake Shaking and Loss Estimation

    NASA Astrophysics Data System (ADS)

    Sesetyan, K.; Demircioglu, M. B.; Zulfikar, C.; Durukal, E.; Erdik, M.

    2009-04-01

    basis source parameters the intensity distributions can be computed using: a) regional intensity attenuation relationships, b) intensity correlations with attenuation-relationship-based PGV, PGA, and spectral amplitudes, and c) intensity correlations with a synthetic Fourier amplitude spectrum. In Level 1 analysis, EMS98-based building vulnerability relationships are used for regional estimates of building damage and casualty distributions. Results obtained from pilot applications of the Level 0 and Level 1 analysis modes of the ELER software to the 1999 M 7.4 Kocaeli, 1995 M 6.1 Dinar, and 2007 M 5.4 Bingol earthquakes, in terms of ground shaking and losses, are presented, and comparisons with the observed losses are made. The regional earthquake shaking and loss information is intended for dissemination in a timely manner to related agencies for the planning and coordination of post-earthquake emergency response. However, the same software can also be used for scenario earthquake loss estimation and related Monte-Carlo-type simulations.
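
    One of the intensity correlations listed above maps instrumental ground motion to intensity. As a hedged Python example, the commonly used Wald et al. (1999) California relation between peak ground velocity and instrumental intensity has roughly this form (ELER's own regional correlations may differ):

        import math

        def mmi_from_pgv(pgv_cm_s, a=3.47, b=2.35):
            """Instrumental intensity from peak ground velocity (cm/s), using
            coefficients of the Wald et al. (1999) type; a regional implementation
            would substitute its own calibration."""
            return a * math.log10(pgv_cm_s) + b

        for pgv in (1, 5, 20, 60):
            print(pgv, "cm/s ->", round(mmi_from_pgv(pgv), 1))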

  17. Earthquakes Induced by Hydraulic Fracturing in Poland Township, Ohio

    NASA Astrophysics Data System (ADS)

    Skoumal, R.; Brudzinski, M. R.; Currie, B. S.

    2014-12-01

    Felt seismicity induced by hydraulic fracturing is very rare, with only a handful of reported cases worldwide. Using an optimized multi-station cross-correlation template-matching routine, 77 earthquakes were identified in Poland Township, Mahoning County, Ohio, that were closely related spatially and temporally to active hydraulic fracturing operations. We identified earthquakes as small as M ~1 up to M 3, one of the largest earthquakes induced by hydraulic fracturing in the United States. These events all occurred 4-12 March 2014, and the rate decayed once the Ohio Department of Natural Resources issued a shutdown of hydraulic fracturing at a nearby well on 10 March. Using a locally derived velocity model and double-difference relocation, the earthquake epicenters occurred during six stimulation stages along two horizontal well legs that were located ~0.8 km away. Nearly 100 stages in nearby wells at greater distances from the earthquake source region did not coincide with detected seismicity. During the sequence, hypocenters migrated ~600 m along an azimuth of 083 degrees, defining a vertically oriented plane of seismicity close to the top of the Precambrian basement. The focal mechanism determined for the M 3 event had a vertically oriented left-lateral fault plane, consistent with the earthquake distribution and the regional stress field. The focal mechanism, orientation, and depth of the hypocenters were similar to those of the 2011 Youngstown earthquake sequence that occurred ~20 km away, but that sequence was correlated with wastewater injection instead of hydraulic fracturing. Considering the relatively large magnitude of these events and the b-value of 0.85, it appears the hydraulic fracturing induced slip along a pre-existing fault/fracture zone optimally oriented in the regional stress field.

  18. Analysis of Recent Major Outer-Rise Earthquake Rupture Characteristics

    NASA Astrophysics Data System (ADS)

    Cleveland, M.; Ammon, C.; Lay, T.; Kanamori, H.

    2009-12-01

    events, analysis of large earthquake outer-rise ruptures can provide insight useful for the evaluation of seismic hazard and increase our understanding of stress transfer properties operating within subducting oceanic lithosphere.

  19. Twitter earthquake detection: Earthquake monitoring in a social world

    USGS Publications Warehouse

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average/long-term-average (STA/LTA) algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provide very short first-impression narratives from people who experienced the shaking.
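
    The STA/LTA trigger named above is straightforward to sketch. In the following minimal Python example (window lengths and threshold are illustrative, not the USGS settings), a detection fires whenever the short-window mean of the per-minute tweet counts greatly exceeds the long-window mean:

        import numpy as np

        def sta_lta_triggers(counts, sta=2, lta=60, threshold=8.0):
            """Indices where STA/LTA of a per-minute count series exceeds threshold."""
            counts = np.asarray(counts, dtype=float)
            triggers = []
            for i in range(lta, len(counts)):
                sta_mean = counts[i - sta:i].mean()
                lta_mean = counts[i - lta:i].mean()
                if lta_mean > 0 and sta_mean / lta_mean >= threshold:
                    triggers.append(i)
            return triggers

        # Synthetic tweet-frequency series: background chatter plus a sudden burst.
        rng = np.random.default_rng(0)
        series = rng.poisson(3, 300)
        series[200:204] += 120  # a widely felt event produces a burst of tweets
        print(sta_lta_triggers(series))  # indices near minute 200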

  20. Nonextensive models for earthquakes

    NASA Astrophysics Data System (ADS)

    Silva, R.; França, G. S.; Vilar, C. S.; Alcaniz, J. S.

    2006-02-01

    We have revisited the fragment-asperity interaction model recently introduced by Sotolongo-Costa and Posadas [Phys. Rev. Lett. 92, 048501 (2004)] by considering a different definition for mean values in the context of Tsallis nonextensive statistics and introducing a scale between the earthquake energy and the size of fragment, γ ∝ r³. The energy-distribution function (EDF) deduced in our approach is considerably different from the one obtained in the above reference. We have also tested the viability of this EDF with data from two different catalogs (in three different areas), namely, the NEIC and the Bulletin Seismic of the Revista Brasileira de Geofísica. Although both approaches provide very similar values for the nonextensive parameter q, other physical quantities, e.g., energy density, differ considerably by several orders of magnitude.
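
    This record and the PubMed record that follows rest on Tsallis nonextensive statistics, whose central object is the q-exponential. As background (the paper's specific energy-distribution function is not reproduced here), the q-exponential generalizes the ordinary exponential as

        e_q(x) = \left[ 1 + (1 - q)\,x \right]^{1/(1-q)}, \qquad e_q(x) \to e^{x} \ \text{as} \ q \to 1,

    so the nonextensive parameter q quoted in both abstracts measures the departure from standard Boltzmann-Gibbs statistics.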

  1. Nonextensive models for earthquakes.

    PubMed

    Silva, R; França, G S; Vilar, C S; Alcaniz, J S

    2006-02-01

    We have revisited the fragment-asperity interaction model recently introduced by Sotolongo-Costa and Posadas [Phys. Rev. Lett. 92, 048501 (2004)] by considering a different definition for mean values in the context of Tsallis nonextensive statistics and introducing a scale between the earthquake energy and the size of fragment, epsilon proportional to r³. The energy-distribution function (EDF) deduced in our approach is considerably different from the one obtained in the above reference. We have also tested the viability of this EDF with data from two different catalogs (in three different areas), namely, the NEIC and the Bulletin Seismic of the Revista Brasileira de Geofísica. Although both approaches provide very similar values for the nonextensive parameter q, other physical quantities, e.g., energy density, differ considerably by several orders of magnitude.

  2. Earthquakes - on the moon

    NASA Technical Reports Server (NTRS)

    Nakamura, Y.

    1981-01-01

    Information obtained with the Apollo lunar seismic stations is discussed. The four types of natural seismic sources that have been identified are described, viz., thermal moonquakes, deep moonquakes, meteoroid impacts, and shallow moonquakes. It is suggested that: (1) the thermal quakes represent the slow cracking and movement of surface rocks; (2) the deep quakes are induced by the tide-generating force of the earth's gravity; (3) the meteoroids responsible for most of the observed impacts are in the mass range from 1 to 100 kg and are clustered in groups near the earth's orbit; and (4) the shallow quakes are similar to intraplate earthquakes and indicate that the moon is as seismically active as the interior regions of the earth's tectonic plates. The structure of the lunar interior as inferred from seismic signals due to both the last three natural sources and 'artificial' impacts of used spacecraft is examined in detail.

  3. Sichuan Earthquake in China

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The Sichuan earthquake in China occurred on May 12, 2008, along faults within the mountains, but near and almost parallel the mountain front, northwest of the city of Chengdu. This major quake caused immediate and severe damage to many villages and cities in the area. Aftershocks pose a continuing danger, but another continuing hazard is the widespread occurrence of landslides that have formed new natural dams and consequently new lakes. These lakes are submerging roads and flooding previously developed lands. But an even greater concern is the possible rapid release of water as the lakes eventually overflow the new dams. The dams are generally composed of disintegrated rock debris that may easily erode, leading to greater release of water, which may then cause faster erosion and an even greater release of water. This possible 'positive feedback' between increasing erosion and increasing water release could result in catastrophic debris flows and/or flooding. The danger is well known to the Chinese earthquake response teams, which have been building spillways over some of the new natural dams.

    This ASTER image, acquired on June 1, 2008, shows two of the new large landslide dams and lakes upstream from the town of Chi-Kua-Kan at 32°12'N latitude and 104°50'E longitude. Vegetation is green, water is blue, and soil is grayish brown in this enhanced color view. New landslides appear bright off-white. The northern (top) lake is upstream from the southern lake. Close inspection shows a series of much smaller lakes in an elongated 'S' pattern along the original stream path. Note especially the large landslides that created the dams. Some other landslides in this area, such as the large one in the northeast corner of the image, occur only on the mountain slopes, so they do not block streams and do not form lakes.

  4. From a physical approach to earthquake prediction, towards long and short term warnings ahead of large earthquakes

    NASA Astrophysics Data System (ADS)

    Stefansson, R.; Bonafede, M.

    2012-04-01

    approach of statistics of universal precursors or stress level. The approach is more closely related to failure physics, studying the ongoing failure itself. But it requires watching and relevant modeling for years, even decades. Useful information on the fault process, and warnings, can be issued along the way, starting when we discover a fault showing signs of preparatory processes and continuing up to the time of the earthquake. Such information and warnings could be issued by government agencies, in cooperation with scientists, to the local Civil Protection committee closest to the fault, with guidance on how to prepare, including directives about enhanced watching. Such a warning service requires a continuously operating geo-watching system, applying modern computing technology to the multidisciplinary data, and a rule-based schedule for preparing adequate warnings.

  5. Lessons learned by the DOE complex from recent earthquakes

    SciTech Connect

    Eli, M.W.

    1993-07-01

    Recent earthquake damage investigations at various industrial facilities have provided the DOE complex with practical reminders of lessons for structures, systems, and components (SSCs) involving: confinement of hazardous materials; continuous, safe operations; occupant safety; and protection of DOE investments and mission-dependent items. Recent assessments are summarized, showing examples of damage caused by the 1992 California earthquakes (Cape Mendocino, Landers, and Big Bear) and the 1991 Costa Rica earthquake (Valle de la Estrella). These lessons, if applied along with the new DOE NPH Standards (1020-92 Series), can help ensure that DOE facilities will meet the intent of the seismic requirements in the new DOE NPH Order 5480.28.

  6. Selecting optimum groundwater monitoring stations for earthquake observation and prediction

    NASA Astrophysics Data System (ADS)

    Lee, H.; Woo, N. C.

    2011-12-01

    In Korea, the National Groundwater Monitoring Network (NGMN), consisting to date of 327 stations around the country, has been established and operated since 1995 to monitor the background level and quality of groundwater. From some of the monitoring wells, we identified abnormal changes in groundwater due to earthquakes. This project was then initiated with the following objectives: a) to identify and characterize groundwater changes due to earthquakes at the NGMN wells, and b) to suggest groundwater monitoring wells that can be used as supplementary monitoring stations for the present seismic network. To accomplish these objectives, we need to identify each well's history of responses to previous earthquakes and the well's hydrogeological setting. Groundwater responses to earthquake events are characterized by the direction of water-level movement (rise/fall), the amount of absolute change, and the time for recovery to the previous level. The spatial distribution of responding wells is then analyzed with GIS tools. Finally, statistical analyses are performed to identify the optimum monitoring stations, considering the geological features and hydrogeological settings of the stations and the earthquake epicenters. In this presentation, we report the results of the study to date as a part of the above-mentioned program.

  7. Earthquake Hoax in Ghana: Exploration of the Cry Wolf Hypothesis

    PubMed Central

    Aikins, Moses; Binka, Fred

    2012-01-01

    This paper investigated the belief of news of an impending earthquake from any source in the context of the Cry Wolf hypothesis, as well as the belief of news of any other imminent disaster from any source. We were also interested in the correlation between preparedness, risk perception, and antecedents. This explorative study consisted of interviews, literature and Internet reviews. Sampling was simple random, stratified by sex and residence type. The sample (N=400) consisted of 195 males and 205 females. Further stratification was based on the residential classification used by the municipalities. The study revealed that a person would believe news of an impending earthquake from any source (64.4%, model significance P=0.000) and news of any other impending disaster from any source (73.1%, P=0.003). There is an association between background, risk perception, and preparedness. Emergency preparedness is weak, and earthquake awareness needs to be reinforced. There is a critical need for public education on earthquake preparedness. The authors recommend developing an emergency response program for earthquakes and standard operating procedures for national risk communication through all media, including instant bulk messaging. PMID:28299086

  8. Artificial neural network model for earthquake prediction with radon monitoring.

    PubMed

    Külahci, Fatih; Inceöz, Murat; Doğru, Mahmut; Aksoy, Ercan; Baykara, Oktay

    2009-01-01

    Apart from linear monitoring studies of the relationship between radon and earthquakes, an artificial neural network (ANN) model approach is presented, starting from the non-linear changes of eight different parameters during earthquake occurrence. A three-layer feedforward network trained with the Levenberg-Marquardt learning algorithm is used to model the earthquake prediction process in the East Anatolian Fault System (EAFS). The proposed ANN system employs an individual training strategy with fixed-weight and supervised models leading to the estimations. The average relative error between the earthquake magnitudes estimated by the ANN and the measured data is about 2.3%, and the relative error between the test and earthquake data varies between 0% and 12%. In addition, factor analysis was applied to all data and the model output values to examine the statistical variation; four factors explained 80.18% of the total variance. Consequently, the ANN approach is a potential alternative to models requiring complex mathematical operations.
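
    A minimal sketch of the core idea, a small feedforward network fitted with a Levenberg-Marquardt least-squares solver, is given below. Layer sizes, the synthetic data, and all parameter choices are assumptions for illustration; this is not the authors' network or dataset.

    ```python
    # Sketch: tiny feedforward net trained by Levenberg-Marquardt via
    # scipy's least-squares solver. Data and sizes are illustrative.
    import numpy as np
    from scipy.optimize import least_squares

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 8))   # 8 predictors (e.g., radon, weather)
    y = np.tanh(X @ rng.normal(size=8)) + 0.05 * rng.normal(size=200)

    n_in, n_hid = 8, 4
    n_params = n_in * n_hid + n_hid + n_hid + 1   # W1, b1, w2, b2

    def unpack(theta):
        i = 0
        W1 = theta[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
        b1 = theta[i:i + n_hid]; i += n_hid
        w2 = theta[i:i + n_hid]; i += n_hid
        return W1, b1, w2, theta[i]

    def residuals(theta):
        W1, b1, w2, b2 = unpack(theta)
        pred = np.tanh(X @ W1 + b1) @ w2 + b2
        return pred - y

    theta0 = 0.1 * rng.normal(size=n_params)
    fit = least_squares(residuals, theta0, method="lm")  # Levenberg-Marquardt
    print("mean absolute error: %.3f" % np.mean(np.abs(fit.fun)))
    ```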

  9. Earthquakes, September-October, 1979

    USGS Publications Warehouse

    Person, W.J.

    1980-01-01

    In the United States, California experienced its strongest earthquake since 1971. The quake, of magnitude 6.8, occurred on October 15 in Baja California, Mexico, near the California border, and caused injuries and damage.

  10. Earthquakes; March-April, 1979

    USGS Publications Warehouse

    Person, W.J.

    1979-01-01

    In the United States, a number of earthquakes were experienced, the most damaging one in southern California on March 15. The aftershocks continued in southeastern Alaska but caused no additional damage. 

  11. Seismology: Remote-controlled earthquakes

    NASA Astrophysics Data System (ADS)

    Hayes, Gavin

    2016-04-01

    Large earthquakes cause other quakes near and far. Analyses of quakes in Pakistan and Chile suggest that such triggering can occur almost instantaneously, making triggered events hard to detect, and potentially enhancing the associated hazards.

  12. Sociological aspects of earthquake prediction

    USGS Publications Warehouse

    Spall, H.

    1979-01-01

    Henry Spall talked recently with Denis Mileti of the Department of Sociology, Colorado State University, Fort Collins, Colo. Dr. Mileti is a sociologist involved with research programs that study the socioeconomic impact of earthquake prediction.

  13. Earthquakes, May-June, 1992

    USGS Publications Warehouse

    Person, Waverly J.

    1992-01-01

    The months of May and June were very active in terms of earthquake occurrence. Six major earthquakes (magnitude 7.0 to 7.9) occurred during this period. These included a magnitude 7.1 in Papua New Guinea on May 15, a magnitude 7.1 followed by a magnitude 7.5 in the Philippine Islands on May 17, a magnitude 7.0 in the Cuba region on May 25, and a magnitude 7.3 in the Santa Cruz Islands of the Pacific on May 27. In the United States, a magnitude 7.6 earthquake struck southern California on June 28, followed by a magnitude 6.7 quake about three hours later.

  14. Crisis Intervention in an Earthquake

    ERIC Educational Resources Information Center

    Blaufarb, Herbert; Levine, Jules

    1972-01-01

    This article describes the crisis intervention techniques used by the San Fernando Valley Child Guidance Clinic to help families deal with the traumatic events experienced in the 1971 earthquake in California. (Author)

  15. Geochemical challenge to earthquake prediction.

    PubMed

    Wakita, H

    1996-04-30

    The current status of geochemical and groundwater observations for earthquake prediction in Japan is described. The development of the observations is discussed in relation to the progress of the earthquake prediction program in Japan. Three major findings obtained from our recent studies are outlined. (i) Long-term radon observation data over 18 years at the SKE (Suikoen) well indicate that the anomalous radon change before the 1978 Izu-Oshima-kinkai earthquake can with high probability be attributed to precursory changes. (ii) It is proposed that certain sensitive wells exist which have the potential to detect precursory changes. (iii) The appearance and nonappearance of coseismic radon drops at the KSM (Kashima) well reflect changes in the regional stress state of an observation area. In addition, some preliminary results of chemical changes of groundwater prior to the 1995 Kobe (Hyogo-ken nanbu) earthquake are presented.

  16. Medical complications associated with earthquakes.

    PubMed

    Bartels, Susan A; VanRooyen, Michael J

    2012-02-25

    Major earthquakes are some of the most devastating natural disasters. The epidemiology of earthquake-related injuries and mortality is unique for these disasters. Because earthquakes frequently affect populous urban areas with poor structural standards, they often result in high death rates and mass casualties with many traumatic injuries. These injuries are highly mechanical and often multisystem, requiring intensive curative medical and surgical care at a time when the local and regional medical response capacities have been at least partly disrupted. Many patients surviving blunt and penetrating trauma and crush injuries have subsequent complications that lead to additional morbidity and mortality. Here, we review and summarise earthquake-induced injuries and medical complications affecting major organ systems.

  17. The nature of earthquake prediction

    USGS Publications Warehouse

    Lindh, A.G.

    1991-01-01

    Earthquake prediction is inherently statistical. Although some people continue to think of earthquake prediction as the specification of the time, place, and magnitude of a future earthquake, it has been clear for at least a decade that this is an unrealistic and unreasonable definition. The reality is that earthquake prediction starts from long-term forecasts of place and magnitude, with very approximate time constraints, and progresses, at least in principle, to a gradual narrowing of the time window as data and understanding permit. Primitive long-term forecasts are clearly possible at this time on a few well-characterized fault systems. Tightly focused monitoring experiments aimed at short-term prediction are already underway in Parkfield, California, and in the Tokai region in Japan; only time will tell how much progress will be possible.

  18. Earthquakes in stable continental crust

    SciTech Connect

    Johnston, A.C.; Kanter, L.R.

    1990-03-01

    Earthquakes can strike even in stable crust, well away from the familiar earthquake zones at the edges of tectonic plates, and their occurrence there is a source of concern in planning critical facilities such as nuclear power plants. The authors sought answers to two major questions: Just how much seismic activity does take place within the stable parts of continents? And are there specific geologic features that make some areas of stable crust particularly susceptible to earthquakes? They began by studying North America alone, but it soon became clear that the fairly short record of these rare events on a single continent would not provide enough data for reliable analysis. Hence, they decided to substitute space for time, surveying earthquake frequency and distribution in stable continental areas worldwide. This paper discusses their findings.

  19. Earthquakes in Stable Continental Crust.

    ERIC Educational Resources Information Center

    Johnston, Arch C.; Kanter, Lisa R.

    1990-01-01

    Discussed are some of the reasons for earthquakes which occur in stable crust away from the familiar zones at the edges of tectonic plates. Crustal stability and the reactivation of old faults are described using examples from India and Australia. (CW)

  20. Coseismic ionospheric and geomagnetic disturbances caused by great earthquakes

    NASA Astrophysics Data System (ADS)

    Hao, Yongqiang; Zhang, Donghe; Xiao, Zuo

    2016-04-01

    Despite primary energy disturbances from the Sun, oscillations of the Earth's surface due to a large earthquake will couple with the atmosphere and therefore the ionosphere, so that coseismic ionospheric disturbances (CIDs) can be detected in the ionosphere. Using a combination of techniques (total electron content, HF Doppler, and ground magnetometers), a new time sequence of the propagation of such effects was developed on an observational basis, and ideas on its explanation are provided. In the cases of the 2008 Wenchuan and 2011 Tohoku earthquakes, infrasonic waves accompanying the propagation of seismic Rayleigh waves were observed in the ionosphere by all three techniques. This is the first report to present CIDs recorded by different techniques at co-located sites and profiled with regard to simultaneous changes in both ionospheric plasma and current (geomagnetic field). Comparison between the oceanic (2011 Tohoku) and inland (2008 Wenchuan) earthquakes revealed that the main directional lobe of the latter case is more distinct and is perpendicular to the direction of the fault rupture. We argue that the different fault slip (inland or submarine) may affect the way the lithosphere couples with the atmosphere. References: Zhao, B., and Y. Hao (2015), Ionospheric and geomagnetic disturbances caused by the 2008 Wenchuan earthquake: A revisit, J. Geophys. Res. Space Physics, 120, doi:10.1002/2015JA021035. Hao, Y. Q., Z. Xiao, and D. H. Zhang (2013), Teleseismic magnetic effects (TMDs) of 2011 Tohoku earthquake, J. Geophys. Res. Space Physics, 118, 3914-3923, doi:10.1002/jgra.50326. Hao, Y. Q., Z. Xiao, and D. H. Zhang (2012), Multi-instrument observation on co-seismic ionospheric effects after great Tohoku earthquake, J. Geophys. Res., 117, A02305, doi:10.1029/2011JA017036.

  1. Elastic energy release in great earthquakes and eruptions

    NASA Astrophysics Data System (ADS)

    Gudmundsson, Agust

    2014-05-01

    The sizes of earthquakes are measured using well-defined, measurable quantities such as seismic moment and released (transformed) elastic energy. No similar measures exist for the sizes of volcanic eruptions, making it difficult to compare the energies released in earthquakes and eruptions. Here I provide a new measure of the elastic energy (the potential mechanical energy) associated with magma chamber rupture and contraction (shrinkage) during an eruption. For earthquakes and eruptions, elastic energy derives from two sources: (1) the strain energy stored in the volcano/fault zone before rupture, and (2) the external applied load (force, pressure, stress, displacement) on the volcano/fault zone. From thermodynamic considerations it follows that the elastic energy released or transformed (dU) during an eruption is directly proportional to the excess pressure (pe) in the magma chamber at the time of rupture multiplied by the volume decrease (-dVc) of the chamber, so that dU = pe·(-dVc). This formula can be used as the basis for a new eruption magnitude scale, based on elastic energy released, which can be related to the moment-magnitude scale for earthquakes. For very large eruptions (>100 km³), the volume of the feeder-dike is negligible, so that the decrease in chamber volume during an eruption corresponds roughly to the associated volume of erupted materials Ve, and the elastic energy is dU ≈ pe·Ve. Using a typical excess pressure of 5 MPa, it is shown that the largest known eruptions on Earth, such as the explosive La Garita Caldera eruption (27-28 million years ago) and the largest single (effusive) Columbia River basalt lava flows (15-16 million years ago), both of which have estimated volumes of about 5000 km³, released elastic energy of the order of 10 EJ. For comparison, the seismic moment of the largest earthquake ever recorded, the M9.5 1960 Chile earthquake, is estimated at 100 ZJ and the associated elastic energy release at 10 EJ.
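
    The order of magnitude quoted above can be checked directly from dU ≈ pe·Ve with the stated values (excess pressure 5 MPa, erupted volume 5000 km³):

    ```python
    # Order-of-magnitude check of the elastic-energy estimate quoted above.
    pe = 5e6          # excess magma pressure at rupture, Pa (5 MPa)
    Ve = 5000 * 1e9   # erupted volume, m^3 (5000 km^3)
    dU = pe * Ve      # released elastic energy, J
    print("dU = %.1e J (~%.0f EJ)" % (dU, dU / 1e18))  # 2.5e19 J, order 10 EJ
    ```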

  2. Mitigating earthquakes; the federal role

    USGS Publications Warehouse

    Press, F.

    1977-01-01

    With the rapid approach of a capability to make reliable earthquake forecasts, it is essential that the Federal Government play a strong, positive role in formulating and implementing plans to reduce earthquake hazards. Many steps are being taken in this direction, with the President looking to the Office of Science and Technology Policy (OSTP) in his Executive Office to provide leadership in establishing and coordinating Federal activities.

  3. Tectonic summaries of magnitude 7 and greater earthquakes from 2000 to 2015

    USGS Publications Warehouse

    Hayes, Gavin P.; Meyers, Emma K.; Dewey, James W.; Briggs, Richard W.; Earle, Paul S.; Benz, Harley M.; Smoczyk, Gregory M.; Flamme, Hanna E.; Barnhart, William D.; Gold, Ryan D.; Furlong, Kevin P.

    2017-01-11

    This paper describes the tectonic summaries for all magnitude 7 and larger earthquakes in the period 2000–2015, as produced by the U.S. Geological Survey National Earthquake Information Center during their routine response operations to global earthquakes. The goal of such summaries is to provide important event-specific information to the public rapidly and concisely, such that recent earthquakes can be understood within a global and regional seismotectonic framework. We compile these summaries here to provide a long-term archive for this information, and so that the variability in tectonic setting and earthquake history from region to region, and sometimes within a given region, can be more clearly understood.

  4. Impact-based earthquake alerts with the U.S. Geological Survey's PAGER system: what's next?

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Garcia, D.; So, E.; Hearne, M.

    2012-01-01

    In September 2010, the USGS began publicly releasing earthquake alerts for significant earthquakes around the globe based on estimates of potential casualties and economic losses with its Prompt Assessment of Global Earthquakes for Response (PAGER) system. These estimates significantly enhanced the utility of the USGS PAGER system which had been, since 2006, providing estimated population exposures to specific shaking intensities. Quantifying earthquake impacts and communicating estimated losses (and their uncertainties) to the public, the media, humanitarian, and response communities required a new protocol—necessitating the development of an Earthquake Impact Scale—described herein and now deployed with the PAGER system. After two years of PAGER-based impact alerting, we now review operations, hazard calculations, loss models, alerting protocols, and our success rate for recent (2010-2011) events. This review prompts analyses of the strengths, limitations, opportunities, and pressures, allowing clearer definition of future research and development priorities for the PAGER system.
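
    The alerting logic described above amounts to binning estimated fatalities and economic losses into color-coded levels. The sketch below assumes the commonly published PAGER-style thresholds (powers of ten); the exact operational values and logic are not taken from this paper.

    ```python
    # Sketch of alert-level binning in the spirit of the PAGER Earthquake
    # Impact Scale. Thresholds are assumed, not quoted from the paper.
    def pager_alert(est_fatalities, est_losses_usd):
        def level(x, bins):            # bins: yellow/orange/red thresholds
            if x < bins[0]: return "green"
            if x < bins[1]: return "yellow"
            if x < bins[2]: return "orange"
            return "red"
        order = ["green", "yellow", "orange", "red"]
        fat = level(est_fatalities, (1, 100, 1000))
        eco = level(est_losses_usd, (1e6, 1e8, 1e9))
        return max(fat, eco, key=order.index)  # overall = worse of the two

    print(pager_alert(est_fatalities=40, est_losses_usd=5e8))  # -> "orange"
    ```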

  5. USGS GNSS Applications to Earthquake Disaster Response and Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Hudnut, K. W.; Murray, J. R.; Minson, S. E.

    2015-12-01

    Rapid characterization of earthquake rupture is important during a disaster because it establishes which fault ruptured and the extent and amount of fault slip. These key parameters, in turn, can augment in situ seismic sensors for identifying disruption to lifelines as well as localized damage along the fault break. Differential GNSS station positioning, along with imagery differencing, are important methods for augmenting seismic sensors. During response to recent earthquakes (1989 Loma Prieta, 1992 Landers, 1994 Northridge, 1999 Hector Mine, 2010 El Mayor-Cucapah, 2012 Brawley Swarm and 2014 South Napa earthquakes), GNSS co-seismic and post-seismic observations proved to be essential for rapid earthquake source characterization. Often, we find that GNSS results indicate key aspects of the earthquake source that would not have been known in the absence of GNSS data. Seismic, geologic, and imagery data alone, without GNSS, would miss important details of the earthquake source. That is, GNSS results provide important additional insight into the earthquake source properties, which in turn help understand the relationship between shaking and damage patterns. GNSS also adds to understanding of the distribution of slip along strike and with depth on a fault, which can help determine possible lifeline damage due to fault offset, as well as the vertical deformation and tilt that are vitally important for gravitationally driven water systems. The GNSS processing workflow that took more than one week 25 years ago now takes less than one second. Formerly, portable receivers needed to be set up at a site, operated for many hours, then data retrieved, processed and modeled by a series of manual steps. The establishment of continuously telemetered, continuously operating high-rate GNSS stations and the robust automation of all aspects of data retrieval and processing, has led to sub-second overall system latency. Within the past few years, the final challenges of

  6. Sumatran megathrust earthquakes: from science to saving lives.

    PubMed

    Sieh, Kerry

    2006-08-15

    Most of the loss of life, property and well-being stemming from the great Sumatran earthquake and tsunami of 2004 could have been avoided, and losses from similar future events can be largely prevented. However, achieving this goal requires forging a chain linking basic science (the study of why, when and where these events occur) to people's everyday lives. The intermediate links in this chain are emergency response preparedness, warning capability, education and infrastructural changes. In this article, I first describe our research on the Sumatran subduction zone. This research has allowed us to understand the basis of the earthquake cycle on the Sumatran megathrust and to reconstruct the sequence of great earthquakes that have occurred there in historic and prehistoric times. On the basis of our findings, we expect that one or two more great earthquakes and tsunamis, nearly as devastating as the 2004 event, are to be expected within the next few decades in a region of coastal Sumatra to the south of the zone affected in 2004. I go on to argue that preventing future tragedies does not necessarily involve hugely expensive or high-tech solutions such as the construction of coastal defences or sensor-based tsunami warning systems. More valuable and practical steps include extending the scientific research, educating the at-risk populations as to what to do in the event of a long-lasting earthquake (i.e. one that might be followed by a tsunami), taking simple measures to strengthen buildings against shaking, providing adequate escape routes and helping the residents of the vulnerable low-lying coastal strips to relocate their homes and businesses to land that is higher or farther from the coast. Such steps could save hundreds of thousands of lives in the coastal cities and offshore islands of western Sumatra, and have general applicability to strategies for helping the developing nations to deal with natural hazards.

  7. Two models for earthquake forerunners

    USGS Publications Warehouse

    Mjachkin, V.I.; Brace, W.F.; Sobolev, G.A.; Dieterich, J.H.

    1975-01-01

    Similar precursory phenomena have been observed before earthquakes in the United States, the Soviet Union, Japan, and China. Two quite different physical models are used to explain these phenomena. According to a model developed by US seismologists, the so-called dilatancy diffusion model, the earthquake occurs near maximum stress, following a period of dilatant crack expansion. Diffusion of water in and out of the dilatant volume is required to explain the recovery of seismic velocity before the earthquake. According to a model developed by Soviet scientists, growth of cracks is also involved, but diffusion of water in and out of the focal region is not required. With this model, the earthquake is assumed to occur during a period of falling stress, and the recovery of velocity here is due to crack closure as stress relaxes. In general, the dilatancy diffusion model gives a peaked precursor form, whereas the dry model gives a bay form, in which recovery is well under way before the earthquake. A number of field observations should help to distinguish between the two models: study of post-earthquake recovery, time variation of stress and pore pressure in the focal region, the occurrence of pre-existing faults, and any changes in direction of precursory phenomena during the anomalous period. © 1975 Birkhäuser Verlag.

  8. Global earthquake fatalities and population

    USGS Publications Warehouse

    Holzer, Thomas L.; Savage, James C.

    2013-01-01

    Modern global earthquake fatalities can be separated into two components: (1) fatalities from an approximately constant annual background rate that is independent of world population growth and (2) fatalities caused by earthquakes with large human death tolls, the frequency of which is dependent on world population. Earthquakes with death tolls greater than 100,000 (and 50,000) have increased with world population and obey a nonstationary Poisson distribution with rate proportional to population. We predict that the number of earthquakes with death tolls greater than 100,000 (50,000) will increase in the 21st century to 8.7±3.3 (20.5±4.3) from 4 (7) observed in the 20th century if world population reaches 10.1 billion in 2100. Combining fatalities caused by the background rate with fatalities caused by catastrophic earthquakes (>100,000 fatalities) indicates global fatalities in the 21st century will be 2.57±0.64 million if the average post-1900 death toll for catastrophic earthquakes (193,000) is assumed.
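
    The nonstationary-Poisson scaling described above can be reproduced to first order by calibrating a rate constant on the 4 catastrophic events observed in the 20th century and integrating an assumed population trajectory over the 21st. The population values and linear growth below are rough assumptions for illustration only.

    ```python
    # Sketch: expected count of catastrophic (>100,000-fatality) quakes
    # under a Poisson rate proportional to world population. Population
    # figures and linear interpolation are assumed for illustration.
    import numpy as np

    years_20 = np.linspace(1900, 2000, 101)
    years_21 = np.linspace(2000, 2100, 101)
    pop_20 = np.interp(years_20, [1900, 2000], [1.6e9, 6.1e9])   # assumed
    pop_21 = np.interp(years_21, [2000, 2100], [6.1e9, 10.1e9])  # assumed

    rate_const = 4 / np.trapz(pop_20, years_20)  # 4 events observed, 1900-2000
    expected_21 = rate_const * np.trapz(pop_21, years_21)
    print("expected catastrophic earthquakes, 2000-2100: %.1f" % expected_21)
    # ~8, consistent with the 8.7 +/- 3.3 quoted above
    ```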

  9. Hydrological signatures of earthquake strain

    SciTech Connect

    Muir-Wood, R.; King, G.C.P.

    1993-12-01

    The character of the hydrological changes that follow major earthquakes has been investigated and found to be dependent on the style of faulting. The most significant response is found to accompany major normal fault earthquakes. Increases in spring and river discharges peak a few days after the earthquake, and typically, excess flow is sustained for a period of 6-12 months. In contrast, hydrological changes accompanying pure reverse fault earthquakes are either undetected or indicate lowering of well levels and spring flows. Strike-slip and oblique-slip fault movements are associated with a mixture of responses but appear to release no more than 10% of the water volume of the same-sized normal fault event. For two major normal fault earthquakes in the western United States (those of Hebgen Lake on August 17, 1959, and Borah Peak on October 28, 1983), there is sufficient river flow information to allow the magnitude and extent of the postseismic discharge to be quantified. The discharge has been converted to a rainfall equivalent, which is found to exceed 100 mm close to the fault and to remain above 10 mm at distances greater than 50 km. Results suggest that water-filled cracks are ubiquitous throughout the brittle continental crust and that these cracks open and close throughout the earthquake cycle. The existence of tectonically induced fluid flows on the scale that we demonstrate has major implications for our understanding of the mechanical and chemical behavior of crustal rocks.
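
    The "rainfall equivalent" conversion above is simply excess discharge volume spread over the affected area. The numbers below are placeholder values chosen to show the arithmetic, not the Hebgen Lake or Borah Peak data.

    ```python
    # Sketch: convert an excess post-seismic discharge volume into an
    # equivalent rainfall depth. Both input numbers are illustrative.
    excess_discharge = 0.5e9      # excess flow over ~6 months, m^3
    catchment_area = 4000 * 1e6   # affected area, m^2 (4000 km^2)

    depth_m = excess_discharge / catchment_area
    print("rainfall equivalent: %.0f mm" % (depth_m * 1e3))  # 125 mm
    ```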

  10. Building with Earthquakes in Mind

    NASA Astrophysics Data System (ADS)

    Mangieri, Nicholas

    2016-04-01

    Earthquakes are some of the most elusive and destructive disasters humans interact with on this planet. Engineering structures to withstand earthquake shaking is critical to ensure minimal loss of life and property. However, the majority of buildings today in non-traditional earthquake prone areas are not built to withstand this devastating force. Understanding basic earthquake engineering principles and the effect of limited resources helps students grasp the challenge that lies ahead. The solution can be found in retrofitting existing buildings with proper reinforcements and designs to deal with this deadly disaster. The students were challenged in this project to construct a basic structure, using limited resources, that could withstand a simulated tremor through the use of an earthquake shake table. Groups of students had to work together to creatively manage their resources and ideas to design the most feasible and realistic type of building. This activity provided a wealth of opportunities for the students to learn more about a type of disaster they do not experience in this part of the country. Due to the fact that most buildings in New York City were not designed to withstand earthquake shaking, the students were able to gain an appreciation for how difficult it would be to prepare every structure in the city for this type of event.

  11. Mapping Tectonic Stress Using Earthquakes

    SciTech Connect

    Arnold, Richard; Townend, John; Vignaux, Tony

    2005-11-23

    An earthquake occurs when the forces acting on a fault overcome its intrinsic strength and cause it to slip abruptly. Understanding more specifically why earthquakes occur at particular locations and times is complicated because in many cases we do not know what these forces actually are, or indeed what processes ultimately trigger slip. The goal of this study is to develop, test, and implement a Bayesian method of reliably determining tectonic stresses using the most abundant stress gauges available: earthquakes themselves. Existing algorithms produce reasonable estimates of the principal stress directions, but yield unreliable error bounds as a consequence of the generally weak constraint on stress imposed by any single earthquake, observational errors, and an unavoidable ambiguity between the fault normal and the slip vector. A statistical treatment of the problem can take into account observational errors, combine data from multiple earthquakes in a consistent manner, and provide realistic error bounds on the estimated principal stress directions. We have developed a realistic physical framework for modelling multiple earthquakes and show how the strong physical and geometrical constraints present in this problem allow inference to be made about the orientation of the principal axes of stress in the Earth's crust.
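
    The geometric kernel of any such inversion is the comparison between the shear traction a candidate stress tensor resolves onto a fault plane and the observed slip direction. The fragment below evaluates that misfit for one illustrative stress tensor and one fault; a Bayesian scheme like the one described would combine such misfits over many earthquakes and many candidate tensors.

    ```python
    # Sketch: resolved shear traction on a fault plane vs. observed slip.
    # Stress tensor, fault normal, and slip vector are illustrative.
    import numpy as np

    def shear_direction(stress, normal):
        """Unit vector of the shear traction resolved on a plane."""
        traction = stress @ normal
        shear = traction - (traction @ normal) * normal
        return shear / np.linalg.norm(shear)

    stress = np.diag([1.0, 0.0, -1.0])            # candidate deviatoric stress
    n = np.array([1.0, 0.0, 1.0]) / np.sqrt(2)    # fault normal
    s_obs = np.array([1.0, 0.0, -1.0]) / np.sqrt(2)  # observed slip direction

    s_pred = shear_direction(stress, n)
    misfit = np.degrees(np.arccos(np.clip(s_pred @ s_obs, -1.0, 1.0)))
    print("predicted-vs-observed slip misfit: %.1f deg" % misfit)  # 0.0 here
    ```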

  12. Towards the Future "Earthquake" School in the Cloud: Near-real Time Earthquake Games Competition in Taiwan

    NASA Astrophysics Data System (ADS)

    Chen, K. H.; Liang, W. T.; Wu, Y. F.; Yen, E.

    2014-12-01

    To prevent future threats from natural disasters, it is important to understand how a disaster happened, why lives were lost, and what lessons have been learned. In this way, the attitude of society toward natural disasters can be transformed from training to learning. The citizen-seismologists-in-Taiwan project is designed to elevate the quality of earthquake science education by incorporating earthquake/tsunami stories and near-real-time earthquake games competitions into the traditional curricula in schools. Through pilot courses and professional development workshops, we have worked closely with teachers from elementary, junior high, and senior high schools to design workable teaching plans built around the practical operation of seismic monitoring at home or school. We will introduce how 9-year-olds do P- and S-wave picking and measure seismic intensity through an interactive learning platform, how scientists and school teachers work together, and how we create an environment that facilitates continuous learning (i.e., a near-real-time earthquake games competition), to make earthquake science fun.
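
    Classroom P-wave picking of the kind mentioned above is usually done with a short-term-average/long-term-average (STA/LTA) trigger. The sketch below implements a basic version on a synthetic trace; window lengths and the threshold are typical assumed values, not the project's settings.

    ```python
    # Minimal STA/LTA onset picker on a synthetic trace. Window lengths
    # and threshold are common illustrative choices.
    import numpy as np

    def sta_lta_pick(trace, fs, sta_win=1.0, lta_win=10.0, threshold=4.0):
        """Return the first sample where STA/LTA exceeds the threshold."""
        sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
        energy = trace ** 2
        for i in range(lta_n, len(trace) - sta_n):
            sta = energy[i:i + sta_n].mean()
            lta = energy[i - lta_n:i].mean()
            if lta > 0 and sta / lta > threshold:
                return i
        return None

    fs = 100.0                                            # samples per second
    trace = 0.1 * np.random.default_rng(1).normal(size=3000)
    trace[1500:] += np.sin(2 * np.pi * 5 * np.arange(1500) / fs)  # "P arrival"
    pick = sta_lta_pick(trace, fs)
    print("picked onset at t = %.2f s" % (pick / fs))     # close to 15 s
    ```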

  13. Catalog of earthquakes along the San Andreas fault system in Central California, July-September 1972

    USGS Publications Warehouse

    Wesson, R.L.; Meagher, K.L.; Lester, F.W.

    1973-01-01

    Numerous small earthquakes occur each day in the Coast Ranges of Central California. The detailed study of these earthquakes provides a tool for gaining insight into the tectonic and physical processes responsible for the generation of damaging earthquakes. This catalog contains the fundamental parameters for earthquakes located within and adjacent to the seismograph network operated by the National Center for Earthquake Research (NCER), U.S. Geological Survey, during the period July-September 1972. The motivation for these detailed studies has been described by Pakiser and others (1969) and by Eaton and others (1970). Similar catalogs of earthquakes for the years 1969, 1970 and 1971 have been prepared by Lee and others (1972b, c, d). Catalogs for the first and second quarters of 1972 have been prepared by Wesson and others (1972a, b). The basic data contained in these catalogs provide a foundation for further studies. This catalog contains data on 1254 earthquakes in Central California. Arrival times at 129 seismograph stations were used to locate the earthquakes listed in this catalog. Of these, 104 are telemetered stations operated by NCER. Readings from the remaining 25 stations were obtained through the courtesy of the Seismographic Stations, University of California, Berkeley (UCB); the Earthquake Mechanism Laboratory, National Oceanic and Atmospheric Administration, San Francisco (EML); and the California Department of Water Resources, Sacramento. The Seismographic Stations of the University of California, Berkeley, have for many years published a bulletin describing earthquakes in Northern California and the surrounding area, and readings at UCB stations from more distant events. The purpose of the present catalog is not to replace the UCB Bulletin, but rather to supplement it by describing the seismicity of a portion of central California in much greater detail.

  14. Was the 2015 Hindu-Kush intermediate-depth earthquake a repeat of the previous M~7 earthquakes ?

    NASA Astrophysics Data System (ADS)

    Harada, Tomoya; Satake, Kenji; Ishibashi, Katsuhiko

    2016-04-01

    On Oct. 26, 2015, an Mw 7.5 earthquake occurred at intermediate depth (230 km) beneath the Hindu Kush. This event took place in the source region of six previous M~7 earthquakes which recurred about every nine years: 1956 (mb 6.5), 1965 (mb 7.5), 1974 (mb 7.1), 1983 (Mw 7.4), 1993 (Mw 7.0), and 2002 (Mw 7.3). On the basis of these past events, Harada and Ishibashi (2012, EGU) proposed that the next event might be imminent in this region. However, the recurrence interval between the 2002 and 2015 events is longer than those of the events before 2002. In this study, in order to examine whether the 2015 earthquake re-ruptured the source region of the repeating M~7 earthquakes, we performed the same analysis as Harada and Ishibashi (2012) for the previous M~7 intermediate-depth earthquakes; namely, simultaneous relocation of the 1956 main shock and the earthquakes from 1964 to 2015, and mechanism determination / slip distribution estimation of the six events by teleseismic body-wave analysis. As a result, the 2015 main shock is located close to the 1956, 1965, 1974, and 1983 main shocks and to the 1993 foreshock (Mw 6.3) which occurred about 30 minutes before the 1993 main shock. The 2015 mechanism solution is very similar to those of the former six events (ESE-WNW striking, southward-dipping, high-angle reverse faulting with down-dip tension). However, the 2015 slip is distributed in the area left unruptured by the five earthquakes from 1965 to 2002. The 1965, 1974, 1983, and 1993 events ruptured the same region repeatedly. The main slips of the 1993, 2002, and 2015 events do not overlap each other; this was confirmed by re-analysis of the waveforms recorded at the same stations. As for the 1965, 1974, and 1983 earthquakes, the overlap of the slip distributions may be caused by the low quality of the waveform data. From the slip distributions, the M~7 earthquakes, at least the 1993, 2002, and 2015 events, may not be considered characteristic earthquakes. However, it is notable that main

  15. The Basis System

    SciTech Connect

    Dubois, P.F.

    1989-05-16

    This paper discusses the basis system. Basis is a program development system for scientific programs. It has been developed over the last five years at Lawrence Livermore National Laboratory (LLNL), where it is now used in about twenty major programming efforts. The Basis System includes two major components, a program development system and a run-time package. The run-time package provides the Basis Language interpreter, through which the user does input, output, plotting, and control of the program's subroutines and functions. Variables in the scientific packages are known to this interpreter, so that the user may arbitrarily print, plot, and calculate with, any major program variables. Also provided are facilities for dynamic memory management, terminal logs, error recovery, text-file i/o, and the attachment of non-Basis-developed packages.

  16. PRELIMINARY SELECTION OF MGR DESIGN BASIS EVENTS

    SciTech Connect

    J.A. Kappes

    1999-09-16

    The purpose of this analysis is to identify the preliminary design basis events (DBEs) for consideration in the design of the Monitored Geologic Repository (MGR). For external events and natural phenomena (e.g., earthquake), the objective is to identify those initiating events that the MGR will be designed to withstand. Design criteria will ensure that radiological release scenarios resulting from these initiating events are beyond design basis (i.e., have a scenario frequency less than once per million years). For internal (i.e., human-induced and random equipment failure) events, the objective is to identify credible event sequences that result in bounding radiological releases. These sequences will be used to establish the design basis criteria for MGR structures, systems, and components (SSCs) in order to prevent or mitigate radiological releases. The safety strategy presented in this analysis for preventing or mitigating DBEs is based on the preclosure safety strategy outlined in "Strategy to Mitigate Preclosure Offsite Exposure" (CRWMS M&O 1998f). DBE analysis is necessary to provide feedback and requirements to the design process, and also to demonstrate compliance with proposed 10 CFR 63 (Dyer 1999b) requirements. DBE analysis is also required to identify and classify the SSCs that are important to safety (ITS).

  17. Biological Indicators in Studies of Earthquake Precursors

    NASA Astrophysics Data System (ADS)

    Sidorin, A. Ya.; Deshcherevskii, A. V.

    2012-04-01

    Time series of data on variations in the electric activity (EA) of four specimens of the weakly electric fish Gnathonemus leopoldianus and in the moving activity (MA) of two catfishes Hoplosternum thoracatum and two groups of Columbian cockroaches Blaberus craniifer were analyzed. The observations were carried out in the Garm region of Tajikistan within the framework of experiments aimed at searching for earthquake precursors. An automatic recording system continuously recorded EA and MA over a period of several years, and hourly mean EA and MA values were processed. Approximately 100 different parameters were calculated on the basis of the six initial EA and MA time series, characterizing different variations in the EA and MA structure: amplitude of the signal and fluctuations of activity, parameters of diurnal rhythms, correlated changes in the activity of various biological indicators, and others. A detailed analysis of the statistical structure of the total array of parametric time series obtained in the experiment showed that the behavior of all the animals exhibits strong temporal variability. All calculated parameters are unstable and subject to frequent changes. A comparison of the data obtained with seismicity allows us to make the following conclusions: (1) The structure of variations in the studied parameters is represented by flicker noise or an even more complex process with permanent changes in its characteristics. Significant statistics are required to prove a cause-and-effect relationship between the specific features of such time series and seismicity. (2) The calculation of the reconstruction statistics in the EA and MA series structure demonstrated an increase in their frequency in the last hours or few days before an earthquake if the hypocenter distance is comparable to the source size. Sufficiently dramatic anomalies in the behavior of catfishes and cockroaches (changes in the amplitude of activity variation, distortions of diurnal rhythms, increase in the

  18. Seismic gaps and source zones of recent large earthquakes in coastal Peru

    USGS Publications Warehouse

    Dewey, J.W.; Spence, W.

    1979-01-01

    The earthquakes of central coastal Peru occur principally in two distinct zones of shallow earthquake activity that are inland of and parallel to the axis of the Peru Trench. The interface-thrust (IT) zone includes the great thrust-fault earthquakes of 17 October 1966 and 3 October 1974. The coastal-plate interior (CPI) zone includes the great earthquake of 31 May 1970, and is located about 50 km inland of and 30 km deeper than the interface thrust zone. The occurrence of a large earthquake in one zone may not relieve elastic strain in the adjoining zone, thus complicating the application of the seismic gap concept to central coastal Peru. However, recognition of two seismic zones may facilitate detection of seismicity precursory to a large earthquake in a given zone; removal of probable CPI-zone earthquakes from plots of seismicity prior to the 1974 main shock dramatically emphasizes the high seismic activity near the rupture zone of that earthquake in the five years preceding the main shock. Other conclusions on the seismicity of coastal Peru that affect the application of the seismic gap concept to this region are: (1) Aftershocks of the great earthquakes of 1966, 1970, and 1974 occurred in spatially separated clusters. Some clusters may represent distinct small source regions triggered by the main shock rather than delimiting the total extent of main-shock rupture. The uncertainty in the interpretation of aftershock clusters results in corresponding uncertainties in estimates of stress drop and estimates of the dimensions of the seismic gap that has been filled by a major earthquake. (2) Aftershocks of the great thrust-fault earthquakes of 1966 and 1974 generally did not extend seaward as far as the Peru Trench. (3) None of the three great earthquakes produced significant teleseismic activity in the following month in the source regions of the other two earthquakes. The earthquake hypocenters that form the basis of this study were relocated using station

  19. Evidence for Ancient Mesoamerican Earthquakes

    NASA Astrophysics Data System (ADS)

    Kovach, R. L.; Garcia, B.

    2001-12-01

    Evidence for past earthquake damage at Mesoamerican ruins is often overlooked because of the invasive effects of tropical vegetation and is usually not considered as a causal factor when restoration and reconstruction of many archaeological sites are undertaken. Yet the proximity of many ruins to zones of seismic activity would argue otherwise. Clues as to the types of damage which should be sought were offered in September 1999 when the M = 7.5 Oaxaca earthquake struck the ruins of Monte Alban, Mexico, where archaeological renovations were underway. More than 20 structures were damaged, 5 of them seriously. Damage features noted were walls out of plumb; fractures in walls, floors, basal platforms and tableros; toppling of columns; and deformation, settling and tumbling of walls. A Modified Mercalli Intensity of VII (ground accelerations 18-34%g) occurred at the site. Within the diffuse landward extension of the Caribbean plate boundary zone, M = 7+ earthquakes occur with repeat times of hundreds of years, arguing that many Maya sites were subjected to earthquakes. Damage to re-erected and reinforced stelae, walls, and buildings was witnessed at Quirigua, Guatemala, during an expedition underway when the 1976 M = 7.5 Guatemala earthquake on the Motagua fault struck. Excavations also revealed evidence (domestic pottery vessels and the skeleton of a child crushed under fallen walls) of an ancient earthquake occurring about the time of the demise and abandonment of Quirigua in the late 9th century. Striking evidence for sudden earthquake building collapse at the end of the Mayan Classic Period ~A.D. 889 was found at Benque Viejo (Xunantunich), Belize, located 210 km north of Quirigua. It is argued that a M = 7.5 to 7.9 earthquake at the end of the Maya Classic period, centered in the vicinity of the Chixoy-Polochic and Motagua fault zones, could have produced the contemporaneous earthquake damage at the above sites. As a consequence, this earthquake may have accelerated the

  20. Seismic responses of Baozhusi gravity dam upon MS 8.0 Wenchuan Earthquake

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Zhang, C. H.

    2012-04-01

    Baozhusi gravity dam was not destructively damaged during the Ms 8.0 Wenchuan earthquake even though the earthquake intensity at the dam site (0.2g) exceeded the design level of the dam (0.1g). In order to analyze the dam's performance in resisting the earthquake, we designed a three-dimensional model to simulate the dam's dynamic response with a finite element modeling scheme, taking into account the nonlinearities of contraction joint opening and different combination patterns of the three-component seismic processes. Then, with a 2D elasto-plastic yielding analysis technique, we reassess the seismic safety and discuss the possible failure modes of the dam during strong earthquakes with updated seismic fortification levels. The results demonstrate that (1) the cross-stream component of earthquake motion predominates in the dynamic response of the dam, while the stream component excites the dam relatively weakly, which is probably why the dam avoided heavy damage in the Wenchuan earthquake; (2) the concrete fracture that occurred near the permanent contraction joints at the top of the dam may have resulted from the impact of concrete blocks during joint opening; and (3) the dam's safety meets the requirement under the updated design earthquake (0.27g), but the margin is lower under the maximum credible earthquake (0.32g), which may affect reservoir operation and the dam's resistance to aftershocks.

  1. Web-Based Real Time Earthquake Forecasting and Personal Risk Management

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.

    2012-12-01

    Earthquake forecasts have been computed by a variety of countries and economies worldwide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. One example is the Working Group on California Earthquake Probabilities, which has been responsible for the official California earthquake forecast since 1988. However, in a time of increasingly severe global financial constraints, we are now moving inexorably towards personal risk management, wherein mitigating risk is becoming the responsibility of individual members of the public. Under these circumstances, open access to a variety of web-based tools, utilities and information is a necessity. Here we describe a web-based system that has been operational since 2009 at www.openhazards.com and www.quakesim.org. Models for earthquake physics and forecasting require input data, along with model parameters. The models we consider are the Natural Time Weibull (NTW) model for regional earthquake forecasting, together with models for activation and quiescence. These "seismicity-based" models use small earthquakes to forecast the occurrence of large earthquakes, either through varying rates of small-earthquake activity or via an accumulation of this activity over time. These approaches use data-mining algorithms combined with the ANSS earthquake catalog. The basic idea is to compute large-earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Each of these approaches has computational challenges associated with computing forecast information in real time. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we show that real-time forecasting is possible at a grid scale of 0.1°. We have analyzed the performance of these models using Reliability/Attributes and standard Receiver Operating Characteristic (ROC) tests. We show how the Reliability and
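
    The "count of small earthquakes since the last large one" idea can be expressed as a Weibull hazard in natural time. The sketch below computes the conditional probability of a large event within the next block of small events; the tau and beta values are illustrative placeholders, not the calibrated NTW parameters.

    ```python
    # Sketch of a Weibull hazard in natural time (count n of small quakes
    # since the last large one). Parameters are illustrative.
    import numpy as np

    def ntw_probability(n_now, n_ahead, tau=500.0, beta=1.4):
        """P(large event within next n_ahead small quakes | survival to n_now)."""
        survival = lambda n: np.exp(-(n / tau) ** beta)
        return 1.0 - survival(n_now + n_ahead) / survival(n_now)

    # e.g., 400 small events since the last large one; look 100 events ahead
    print("conditional probability: %.2f" % ntw_probability(400, 100))
    ```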

  2. Earthquake early warning for Romania - most recent improvements

    NASA Astrophysics Data System (ADS)

    Marmureanu, Alexandru; Elia, Luca; Martino, Claudio; Colombelli, Simona; Zollo, Aldo; Cioflan, Carmen; Toader, Victorin; Marmureanu, Gheorghe; Marius Craiu, George; Ionescu, Constantin

    2014-05-01

    The EEWS for Vrancea earthquakes uses the time interval (28-32 s) between the moment when an earthquake is detected by the local seismic network installed in the epicentral area (Vrancea) and the arrival time of the seismic waves in the protected area (Bucharest) to send an earthquake warning to users. In recent years, the National Institute for Earth Physics (NIEP) upgraded its seismic network in order to better cover the seismic zones of Romania. Currently NIEP operates a real-time seismic network designed to monitor the seismic activity on Romanian territory, dominated by the Vrancea intermediate-depth (60-200 km) earthquakes. The NIEP real-time network consists of 102 stations and two seismic arrays equipped with different high-quality digitizers (Kinemetrics K2, Quanterra Q330, Quanterra Q330HR, PS6-26, Basalt), broadband and short-period seismometers (CMG3ESP, CMG40T, KS2000, KS54000, CMG3T, STS2, SH-1, S13, Ranger, GS21, Mark L22) and acceleration sensors (Episensor). Recent improvement of the seismic network and real-time communication technologies allows implementation of a nation-wide EEWS for Vrancea and other seismic sources in Romania. We present a regional approach to earthquake early warning for Romanian earthquakes. The regional approach is based on the PRESTo (Probabilistic and Evolutionary early warning SysTem) software platform: PRESTo processes three-channel acceleration data streams in real time; once the P-wave arrivals have been detected, it provides earthquake location and magnitude estimates, and peak-ground-motion predictions at target sites. PRESTo has been running in real time at the National Institute for Earth Physics, Bucharest, for several months, in parallel with a secondary EEWS. The alert notification is issued only when both systems validate each other. Here we present the results obtained using offline earthquakes originating from the Vrancea area together with several real
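
    A back-of-envelope check of the quoted 28-32 s window: the warning lead time is roughly the S-wave travel time to the target minus the time needed to detect and process the event near the source. The distance, velocity, and delay below are assumed round numbers, not NIEP system parameters.

    ```python
    # Rough lead-time arithmetic for a Vrancea-to-Bucharest alert.
    # All three inputs are assumed round numbers.
    epi_distance = 150.0   # source to Bucharest, km (assumed)
    v_s = 3.5              # shear-wave speed, km/s (assumed)
    detect_delay = 12.0    # P-wave travel to local network + processing, s (assumed)

    lead_time = epi_distance / v_s - detect_delay
    print("approximate warning lead time: %.0f s" % lead_time)  # ~31 s
    ```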

  3. Volunteers in the earthquake hazard reduction program

    USGS Publications Warehouse

    Ward, P.L.

    1978-01-01

    With this in mind, I organized a small workshop for approximately 30 people on February 2 and 3, 1978, in Menlo Park, Calif. The purpose of the meeting was to discuss methods of involving volunteers in a meaningful way in earthquake research and in educating the public about earthquake hazards. The emphasis was on earthquake prediction research, but the discussions covered the whole earthquake hazard reduction program. Representatives attended from the earthquake research community, from groups doing socioeconomic research on earthquake matters, and from a wide variety of organizations that might sponsor volunteers.

  4. Do earthquakes exhibit self-organized criticality?

    PubMed

    Yang, Xiaosong; Du, Shuming; Ma, Jin

    2004-06-04

    If earthquakes are phenomena of self-organized criticality (SOC), the statistical characteristics of the earthquake time series should be invariant after the sequence of events in an earthquake catalog is randomly rearranged. In this Letter we argue that earthquakes are unlikely to be phenomena of SOC, because our analysis of the Southern California Earthquake Catalog shows that the first-return-time probability P_M(T) is appreciably changed after the time series is rearranged. This suggests that SOC theory should not be used to oppose efforts at earthquake prediction.
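
    The shuffling test described above can be sketched as follows: compare the distribution of first-return "times" (here, counts of events between successive quakes above a magnitude threshold) before and after randomly rearranging the catalog. A synthetic, crudely clustered catalog stands in for the Southern California data.

    ```python
    # Sketch of the catalog-shuffling test. Shuffling preserves the set of
    # magnitudes (so the mean return time is unchanged) but destroys
    # clustering, which shows up in the spread of return times.
    import numpy as np

    rng = np.random.default_rng(2)
    mags = rng.exponential(0.5, size=20000) + 2.0          # G-R-like magnitudes
    idx = np.where(mags >= 4.0)[0]
    mags[np.clip(idx + 1, 0, len(mags) - 1)] += 1.0        # crude aftershocks

    def return_times(series, thresh=4.0):
        hits = np.where(series >= thresh)[0]
        return np.diff(hits)

    orig = return_times(mags)
    shuf = return_times(rng.permutation(mags))
    print("mean return: original %.1f, shuffled %.1f" % (orig.mean(), shuf.mean()))
    print("std  return: original %.1f, shuffled %.1f" % (orig.std(), shuf.std()))
    ```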

  5. Earthquakes with non--double-couple mechanisms.

    PubMed

    Frohlich, C

    1994-05-06

    Seismological observations confirm that the pattern of seismic waves from some earthquakes cannot be produced by slip along a planar fault surface. More than one physical mechanism is required to explain the observed varieties of these non-double-couple earthquakes. The simplest explanation is that some earthquakes are complex, with stress released on two or more suitably oriented, nonparallel fault surfaces. However, some shallow earthquakes in volcanic and geothermal areas require other explanations. Current research focuses on whether fault complexity explains most observed non-double-couple earthquakes and to what extent ordinary earthquakes have non-double-couple components.
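
    Non-double-couple components are usually quantified by decomposing the moment tensor into isotropic, double-couple (DC), and CLVD parts. The sketch below follows one common convention (sign and normalization choices vary between authors); the input tensor is illustrative.

    ```python
    # Sketch: moment-tensor decomposition into isotropic, DC, and CLVD
    # parts, following one common convention. Input tensor is illustrative.
    import numpy as np

    M = np.array([[1.0, 0.2, 0.0],
                  [0.2, -0.6, 0.1],
                  [0.0, 0.1, -0.3]])   # symmetric moment tensor (arbitrary units)

    iso = np.trace(M) / 3.0
    deviatoric = M - iso * np.eye(3)
    eig = np.linalg.eigvalsh(deviatoric)

    lam_min = eig[np.argmin(np.abs(eig))]   # smallest-|.| deviatoric eigenvalue
    lam_max = eig[np.argmax(np.abs(eig))]   # largest-|.| deviatoric eigenvalue
    eps = -lam_min / abs(lam_max)           # 0 for pure DC, +/-0.5 for pure CLVD

    print("isotropic part: %.3f" % iso)
    print("CLVD parameter eps = %.3f (%.0f%% double-couple)"
          % (eps, 100 * (1 - 2 * abs(eps))))
    ```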

  6. Earthquake induced Landslides in the Sikkim Himalaya - A Consequences of the 18th September 2011 Earthquake

    NASA Astrophysics Data System (ADS)

    Sharma, Ashok Kumar

    2015-04-01

    On September 18, 2011, an earthquake of magnitude 6.8 on the Richter scale struck Sikkim at 18:11 hours IST. The epicenter of the quake was at latitude 27.7° North and longitude 88.2° East, about 64 km northwest of Gangtok, at the junction of the Teesta lineament and the Kanchenjunga fault in the North District of Sikkim. The high-intensity tremor triggered various types of natural calamities in the form of landslides, road blocks, falling boulders, lake bursts, flash floods, falling trees, etc., and caused severe damage to life and property in Sikkim. As the earthquake occurred during the monsoon season, heavy rain and landslides rendered rescue operations extremely difficult. Almost all road connectivity and communication networks were disrupted. Sikkim experiences landslides year after year, especially during the monsoons and periods of intense rain, and this hazard affects the economy of the State very badly. Due to the earthquake, many new and a few reactivated landslides have occurred in the Sikkim Himalaya.

  7. Authorization basis for the 209-E Building

    SciTech Connect

    TIFFANY, M.S.

    1999-02-23

    This Authorization Basis document is one of three documents that constitute the Authorization Basis for the 209-E Building. Per the U.S. Department of Energy, Richland Operations Office (RL) letter 98-WSD-074, this document, the 209-E Building Preliminary Hazards Analysis (WHC-SD-WM-TI-789), and the 209-E Building Safety Evaluation Report (97-WSD-074) constitute the Authorization Basis for the 209-E Building. This Authorization Basis and the associated controls and safety programs will remain in place until safety documentation addressing deactivation of the 209-E Building is developed by the contractor and approved by RL.

  8. Is Your Class a Natural Disaster? It can be... The Real Time Earthquake Education (RTEE) System

    NASA Astrophysics Data System (ADS)

    Whitlock, J. S.; Furlong, K.

    2003-12-01

    In cooperation with the U.S. Geological Survey (USGS) and its National Earthquake Information Center (NEIC) in Golden, Colorado, we have implemented an autonomous version of the NEIC's real-time earthquake database management and earthquake alert system (Earthworm). This is the same system used professionally by the USGS in its earthquake response operations. Utilizing this system, Penn State University students participating in natural hazard classes receive real-time alerts of worldwide earthquake events on cell phones distributed to the class. The students are then responsible for reacting to actual earthquake events, in real-time, with the same data (or lack thereof) as earthquake professionals. The project was first implemented in Spring 2002, and although it had an initial high intrigue and "coolness" factor, the interest of the students waned with time. Through student feedback, we observed that scientific data presented on its own without an educational context does not foster student learning. In order to maximize the impact of real-time data and the accompanying e-media, the students need to become personally involved. Therefore, in collaboration with the Incorporated Research Institutes of Seismology (IRIS), we have begun to develop an online infrastructure that will help teachers and faculty effectively use real-time earthquake information. The Real-Time Earthquake Education (RTEE) website promotes student learning by integrating inquiry-based education modules with real-time earthquake data. The first module guides the students through an exploration of real-time and historic earthquake datasets to model the most important criteria for determining the potential impact of an earthquake. Having provided the students with content knowledge in the first module, the second module presents a more authentic, open-ended educational experience by setting up an earthquake role-play situation. Through the Earthworm system, we have the ability to "set off

  9. Earthquake Forecast Science Research with a Small Satellite

    NASA Astrophysics Data System (ADS)

    Jason, Susan; da Silva Curiel, Alex; Pulinets, Sergey; Sweeting, Martin, , Sir

    events, as well as an indication of the seismic centre may also be possible. These mission data should also lead to improved knowledge of the physics of earthquakes, improved accuracy for GPS-based navigation models, and could be used to study the reaction of the global ionosphere during magnetic storms and other solar-terrestrial events. The poster presents an overview of the scientific basis, goals, and proposed platform for this research mission.

  10. Physical model for earthquakes, 2. Application to southern California

    SciTech Connect

    Rundle, J.B.

    1988-06-10

    The purpose of this paper is to apply ideas developed in a previous paper to the construction of a detailed model for earthquake dynamics in southern California. The basis upon which the approach is formulated is that earthquakes are perturbations on, or more specifically fluctuations about, the long-term motions of the plates. This concept is made mathematically precise by means of a ''fluctuation hypothesis,'' which states that all physical quantities associated with earthquakes can be expressed as integral expansions in a fluctuating quantity called the ''offset phase.'' While in general, the frictional stick-slip properties of the complex, interacting faults should properly come out of the underlying physics, a simplification is made here, and a simple, spatially varying friction law is assumed. Together with the complex geometry of the major active faults, an assumed, spatially varying Earth rheology, the average rates of long-term offsets on all the major faults, and the friction coefficients, one can generate synthetic earthquake histories for comparison to the real data.

  11. Ten Years of Real-Time Earthquake Loss Alerts

    NASA Astrophysics Data System (ADS)

    Wyss, M.

    2013-12-01

    In order of priority, the most important parameters of an earthquake disaster are: number of fatalities, number of injured, mean damage as a function of settlement, and expected intensity of shaking at critical facilities. The requirements for calculating these parameters in real time are: 1) availability of reliable earthquake source parameters within minutes; 2) capability of calculating expected intensities of strong ground shaking; 3) data sets on population distribution and conditions of the building stock as a function of settlement; 4) data on locations of critical facilities; 5) verified methods of calculating damage and losses; 6) personnel available on a 24/7 basis to perform and review these calculations. There are three services available that distribute information about the likely consequences of earthquakes within about half an hour of the event. Two of these calculate losses; one gives only general information. Although much progress has been made during the last ten years in improving the data sets and the calculation methods, much remains to be done. The data sets are only first-order approximations and the methods need refinement. Nevertheless, the quantitative loss estimates produced in real time after damaging earthquakes are generally correct in the sense that they allow distinguishing disastrous from inconsequential events.
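
    To make requirement 5) concrete, the sketch below shows the simplest form such a loss calculation can take: expected fatalities as a sum over settlements of population times an intensity-dependent casualty rate. The settlement list and the rate table are illustrative assumptions, not values used by any of the services described above.

        # Minimal sketch of a real-time loss estimate: expected fatalities as a
        # sum over settlements of population times an intensity-dependent
        # casualty rate. The rate table is purely illustrative, not calibrated.

        def casualty_rate(intensity):
            """Illustrative fraction of population killed at a given intensity."""
            table = {6: 0.0, 7: 1e-5, 8: 1e-4, 9: 1e-3, 10: 1e-2}
            return table.get(int(intensity), 0.0)

        settlements = [  # (name, population, estimated shaking intensity)
            ("Town A", 50_000, 8),
            ("Town B", 12_000, 9),
            ("Village C", 3_000, 7),
        ]

        expected_fatalities = sum(pop * casualty_rate(i) for _, pop, i in settlements)
        print(f"Expected fatalities: {expected_fatalities:.0f}")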

  12. Disaster Mitigation by Quick Response Against Strong Earthquake Motion

    NASA Astrophysics Data System (ADS)

    Nakamura, Y.

    2007-12-01

    The concept of earthquake early warning (EEW) was first reported for the San Francisco area by Dr. Cooper in the San Francisco Daily Evening Bulletin on 3 November 1868. According to him, the idea was prompted by a failure of earthquake prediction; it is striking that the reasoning is the same as today's. More than 100 years later, in 1982, the warning system for the Tohoku Shinkansen realized Cooper's idea for the first time in the world. After that, SAS for Mexico City started operating in 1991, and UrEDAS for the Tokaido Shinkansen, an evolutionary P-wave detection/warning system, began to operate in 1992. The UrEDAS technology is based on new concepts and methods for estimating the earthquake parameters, namely magnitude, location, and depth, in real time. In Japan in 1992, a new information service using UrEDAS technology had been prepared, but it was never launched owing to objections from the Japan Meteorological Agency (JMA). The same JMA will broadcast an information service called "Kinkyu Jishin Sokuho" nationwide from 1 October 2007. This implies that our UrEDAS information service plan was correct, which is gratifying. However, it will be rare in Japan for JMA's information to arrive before the shaking of an M7-class or smaller earthquake reaches the potentially damaged areas, because processing and transmission take a relatively long time. Only for M8-class earthquakes, whose occurrence probability in Japan is about once in ten years, is it possible to receive the information before strong shaking arrives in potentially damaged areas far from the epicenter. JMA should popularize "Kinkyu Jishin Sokuho" only after explaining these facts clearly. I am afraid that people will be misled by broadcasting a film of an evacuation drill under the unbelievable assumption of an earthquake of seismic intensity 7 (corresponding to MMI scale XII) arriving 20 seconds later. In any case, we can rely only on the onsite alarm in
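
    The timing argument above can be made explicit with a back-of-envelope calculation: a network-based alert can only beat the S wave where the S-wave travel time exceeds the detection, processing, and transmission delay. The velocities, station distance, and delay below are rough assumptions for illustration, not the actual system parameters.

        # Back-of-envelope warning time for a network-based alert such as
        # "Kinkyu Jishin Sokuho": the S wave must outrun detection, processing,
        # and transmission. Velocities and delays are crude assumptions.

        vp, vs = 7.0, 4.0   # km/s, rough crustal averages
        processing = 5.0    # s, assumed detection + processing + broadcast delay

        def warning_time(epicentral_km, station_km=20.0):
            t_alert = station_km / vp + processing  # time the alert goes out
            t_s = epicentral_km / vs                # time strong shaking arrives
            return t_s - t_alert

        for d in (20, 50, 100, 200):
            print(f"{d:4d} km from epicenter: {warning_time(d):6.1f} s of warning")
        # Near the epicenter the result is negative: no warning is possible.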

  13. Detection of change points in underlying earthquake rates, with application to global mega-earthquakes

    NASA Astrophysics Data System (ADS)

    Touati, Sarah; Naylor, Mark; Main, Ian

    2016-02-01

    The recent spate of mega-earthquakes since 2004 has led to speculation of an underlying change in the global `background' rate of large events. At a regional scale, detecting changes in background rate is also an important practical problem for operational forecasting and risk calculation, for example due to volcanic processes, seismicity induced by fluid injection or withdrawal, or due to redistribution of Coulomb stress after natural large events. Here we examine the general problem of detecting changes in background rate in earthquake catalogues with and without correlated events, for the first time using the Bayes factor as a discriminant for models of varying complexity. First we use synthetic Poisson (purely random) and Epidemic-Type Aftershock Sequence (ETAS) models (which also allow for earthquake triggering) to test the effectiveness of many standard methods of addressing this question. These fall into two classes: those that evaluate the relative likelihood of different models, for example using Information Criteria or the Bayes factor; and those that evaluate the probability of the observations (including extreme events or clusters of events) under a single null hypothesis, for example by applying the Kolmogorov-Smirnov and `runs' tests, and a variety of Z-score tests. The results demonstrate that effectiveness varies widely among these tests. Information Criteria worked at least as well as the more computationally expensive Bayes factor method, and the Kolmogorov-Smirnov and runs tests proved relatively ineffective at reliably detecting a change point. We then apply the methods tested to events at different thresholds above magnitude M ≥ 7 in the global earthquake catalogue since 1918, after first declustering the catalogue. This is most effectively done by removing likely correlated events using a much lower magnitude threshold (M ≥ 5), where triggering is much more obvious. We find no strong evidence that the background rate of large
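
    As a minimal sketch of the model-comparison approach tested above, the following compares a constant-rate Poisson model against a one-change-point model on a synthetic (already declustered) catalogue, using Akaike's Information Criterion. The rates and change time are arbitrary test values, not results from the study.

        import numpy as np

        rng = np.random.default_rng(0)
        # Synthetic catalogue: rate 0.5/day for 1000 days, then 1.5/day to day 1500.
        t1 = np.cumsum(rng.exponential(1 / 0.5, 600)); t1 = t1[t1 < 1000]
        t2 = 1000 + np.cumsum(rng.exponential(1 / 1.5, 1200)); t2 = t2[t2 < 1500]
        times, T = np.concatenate([t1, t2]), 1500.0

        def loglik(n, dur):  # Poisson-process log-likelihood at the MLE rate n/dur
            return n * np.log(n / dur) - n if n > 0 else 0.0

        # Model 0: constant rate (1 parameter).
        ll0 = loglik(len(times), T)
        # Model 1: one change point, profiled over candidate times tau (3 parameters).
        ll1 = max(loglik((times < tau).sum(), tau) +
                  loglik((times >= tau).sum(), T - tau)
                  for tau in np.linspace(50, T - 50, 300))
        aic0, aic1 = 2 * 1 - 2 * ll0, 2 * 3 - 2 * ll1
        print(f"AIC constant: {aic0:.1f}   AIC change-point: {aic1:.1f}")
        # The lower AIC wins; here the change-point model should be preferred.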

  14. The effects of the Yogyakarta earthquake at LUSI mud volcano, Indonesia

    NASA Astrophysics Data System (ADS)

    Lupi, M.; Saenger, E. H.; Fuchs, F.; Miller, S. A.

    2013-12-01

    The M6.3 Yogyakarta earthquake shook Central Java on May 27th, 2006. Forty-seven hours later, hot mud burst out at the surface near Sidoarjo, approximately 250 km from the earthquake epicentre. The mud eruption continued and gave rise to LUSI, the youngest mud-volcanic system on Earth. Since the beginning of the eruption, approximately 30,000 people have lost their homes and 13 people have died due to the mud flooding. The causes that initiated the eruption are still debated and rest on different geological observations. The earthquake-triggering hypothesis is supported by the fact that, at the time of the earthquake, ongoing drilling operations experienced a loss of drilling mud downhole. In addition, the mud eruption began only 47 hours after the Yogyakarta earthquake, and the mud reached the surface at several locations aligned along the Watukosek fault, the strike-slip fault on which LUSI resides. Moreover, the Yogyakarta earthquake also affected the volcanic activity of Mt. Semeru, which lies as far from the earthquake's epicentre as LUSI. The drilling-triggering hypothesis, however, points out that the earthquake was too far from LUSI to induce significant stress changes at depth, and emphasizes that the upwelling fluids that first reached the surface emerged only 200 m from the drilling rig that was operating at the time. So was LUSI triggered by the earthquake or by drilling operations? We conducted a seismic wave propagation study on a geological model based on vp, vs, and density values for the different lithologies and on seismic profiles of the crust beneath LUSI. Our analysis shows compelling evidence for the effects produced by the passage of seismic waves through the geological formations and highlights the importance of the overall geological structure, which focused and reflected the incoming seismic energy.

  15. Multi-parameter observation of pre-earthquake signals and their potential for short -term earthquake forecasting

    NASA Astrophysics Data System (ADS)

    Kalenda, Pavel; Ouzounov, Dimitar; Bobrovskiy, Vadim; Neumann, Libor; Boborykina, Olga; Nazarevych, Andrij; Šebela, Stanka; Kvetko, Július; Shen, Wen-Bin

    2013-04-01

    We present methodologies for multi-parameter observation of pre-earthquake phenomena and their retrospective/prospective testing. The hypothesis that the strongest earthquakes depend on the global stress field leads to global observations and a multi-parameter, multi-sensor approach. In 2012 we performed coordinated tests of several geophysical and environmental parameters associated with earthquake preparation processes, namely: 1) rock deformation measurements (Kalenda et al., 2012); 2) subterranean non-stationary electric processes (Bobrovskiy, 2011); 3) superconducting gravimeter (SG) records and broadband seismometer (BS) time series (Shen et al.); and 4) satellite infrared observations (10-13 μm) measured at the top of the atmosphere (Ouzounov et al., 2011). In retrospective tests for the two most recent major events in Asia, the Wenchuan earthquake (2008, China) and the Tohoku earthquake/tsunami (2011, Japan), our combined analysis showed a coordinated appearance of anomalies days in advance that could be explained by a coupling between the observed physical parameters and the earthquake preparation process. In 2012 three internal alerts were issued days in advance, associated with the following events: the M7.7 Sea of Okhotsk EQ of August 14, the M7.3 Honshu EQ of December 7, and the M7.1 Banda Sea EQ of December 10. Not all observations were able to detect anomalies before the M7.4 Guatemala EQ of November 11. We discuss the reliability of each observation, its time lag, and its ability to localize the main shock and estimate its magnitude. References: Bobrovskiy, V. (2011): Kamchatkan subterranean electric operative forerunners of the catastrophic earthquake with M9 that occurred close to Honshu Island on 2011/03/11. IUGG Meeting, Melbourne, 2011, poster. Kalenda, P. et al. (2012): Tilts, Global Tectonics and Earthquake Prediction. SWB, London, 247 pp. Ouzounov, D. et al. (2011): Atmosphere-Ionosphere Response to the M9 Tohoku

  16. Authorization basis requirements comparison report

    SciTech Connect

    Brantley, W.M.

    1997-08-18

    The TWRS Authorization Basis (AB) consists of a set of documents identified by TWRS management with the concurrence of DOE-RL. Upon implementation of the TWRS Basis for Interim Operation (BIO) and Technical Safety Requirements (TSRs), the AB list will be revised to include the BIO and TSRs. Some documents that currently form part of the AB will be removed from the list. This SD identifies each requirement from those documents and recommends a disposition for each, to ensure that necessary requirements are retained when the AB is revised to incorporate the BIO and TSRs. This SD also identifies documents that will remain part of the AB after the BIO and TSRs are implemented. This document does not change the AB, but provides guidance for the preparation of change documentation.

  17. 340 waste handling facility interim safety basis

    SciTech Connect

    VAIL, T.S.

    1999-04-01

    This document presents an interim safety basis for the 340 Waste Handling Facility, classifying the 340 Facility as a Hazard Category 3 facility. The hazard analysis quantifies the operating safety envelope for this facility and demonstrates that the facility can be operated without a significant threat to onsite or offsite people.

  18. The music of earthquakes and Earthquake Quartet #1

    USGS Publications Warehouse

    Michael, Andrew J.

    2013-01-01

    Earthquake Quartet #1, my composition for voice, trombone, cello, and seismograms, is the intersection of listening to earthquakes as a seismologist and performing music as a trombonist. Along the way, I realized there is a close relationship between what I do as a scientist and what I do as a musician. A musician controls the source of the sound and the path it travels through their instrument in order to make sound waves that we hear as music. An earthquake is the source of waves that travel along a path through the earth until reaching us as shaking. It is almost as if the earth is a musician and people, including seismologists, are metaphorically listening and trying to understand what the music means.

  19. Collaborative Comparison of Earthquake Simulators

    NASA Astrophysics Data System (ADS)

    Richards-Dinger, K.; Zielke, O.; Tullis, T. E.; Ward, S. N.; Kaneko, Y.; Shaw, B. E.; Lapusta, N.; Pollitz, F. F.; Morein, G.; Turcotte, D. L.; Robinson, R.; Dieterich, J. H.; Rundle, J. D.; Beeler, N. M.

    2008-12-01

    Earthquake simulators, i.e. computer models in which a series of earthquakes spontaneously occur, are important for understanding earthquake mechanics and earthquake predictability. However, to use earthquake simulators in hazard analysis they must show realistic behavior, and it is difficult to determine how realistic simulator results are. This is in part because of the complexity of their behavior and the limited database of long sequences of natural earthquakes, especially large ones, against which to compare a simulator's behavior. Due to limits on memory and computation speed it is presently impossible to construct a simulator that simultaneously incorporates everything known about frictional behavior of rock, includes full elastodynamics, and utilizes both small enough elements to properly represent a continuum and enough elements to cover a large geographic area and represent many faults. Consequently, all simulators make compromises. A wide variety of simulators exist, each with different compromises, and the effects of these compromises on simulator results are not currently known. Our goal is to gain a better understanding of the validity of the results of earthquake simulators. This is a joint effort to compare the behavior of our nine independently devised earthquake simulators. We have defined and studied two simple problems. The first checks that each simulator accurately gives the stresses due to slip on a simple vertical strike-slip fault. All simulators satisfactorily passed this test. The second is a comparison of the behavior of a simple strike-slip fault, with a simple bi-linear asymmetrically peaked initial stress distribution, and a constant loading rate. The fault constitutive properties have a fixed failure stress, higher than the peak in the initial stress, and a fixed dynamic sliding stress, although models utilizing rate and state friction only approximate this simple description. A series of earthquakes occur in the simulations and the
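
    A toy version of the second benchmark can be written in a few lines: fault elements loaded at a constant rate fail at a fixed failure stress, drop to a fixed dynamic sliding stress, and pass part of the stress drop to their neighbours. This is only a schematic stand-in for the nine simulators compared; all parameters are illustrative.

        import numpy as np

        # Toy quasi-static fault simulator: constant loading, fixed failure
        # stress, fixed dynamic sliding stress, partial stress transfer to
        # nearest neighbours. Parameters are illustrative only.

        n, fail, dyn, load_rate = 100, 1.0, 0.6, 1e-3
        rng = np.random.default_rng(1)
        stress = rng.uniform(dyn, fail, n)   # heterogeneous initial stress
        events = []

        for step in range(100_000):
            stress += load_rate
            failed = np.flatnonzero(stress >= fail)
            size = 0
            while failed.size:                    # cascade within one event
                size += failed.size
                drop = stress[failed] - dyn
                stress[failed] = dyn
                for i, d in zip(failed, drop):    # pass 60% of the drop on
                    stress[(i - 1) % n] += 0.3 * d
                    stress[(i + 1) % n] += 0.3 * d
                failed = np.flatnonzero(stress >= fail)
            if size:
                events.append(size)

        print(f"{len(events)} events; largest ruptured {max(events)} elements")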

  20. The Value, Protocols, and Scientific Ethics of Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Jordan, Thomas H.

    2013-04-01

    Earthquakes are different from other common natural hazards because precursory signals diagnostic of the magnitude, location, and time of impending seismic events have not yet been found. Consequently, the short-term, localized prediction of large earthquakes at high probabilities with low error rates (false alarms and failures-to-predict) is not yet feasible. An alternative is short-term probabilistic forecasting based on empirical statistical models of seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains up to 1000 relative to long-term forecasts. The value of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing operational forecasting protocols in this sort of "low-probability environment." This paper will explore the complex interrelations among the valuation of low-probability earthquake forecasting, which must account for social intangibles; the protocols of operational forecasting, which must factor in large uncertainties; and the ethics that guide scientists as participants in the forecasting process, who must honor scientific principles without doing harm. Earthquake forecasts possess no intrinsic societal value; rather, they acquire value through their ability to influence decisions made by users seeking to mitigate seismic risk and improve community resilience to earthquake disasters. According to the recommendations of the International Commission on Earthquake Forecasting (www.annalsofgeophysics.eu/index.php/annals/article/view/5350), operational forecasting systems should appropriately separate the hazard-estimation role of scientists from the decision-making role of civil protection authorities and individuals. They should
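
    The "low-probability environment" argument is easy to quantify: even large probability gains applied to a small baseline leave absolute probabilities modest. The baseline weekly probability below is an assumed illustration, not a value from the paper.

        # Worked example of the low-probability environment: even a
        # thousandfold gain leaves weekly probabilities at the percent level.
        baseline_weekly = 1e-5   # assumed long-term weekly probability of a large quake
        for gain in (10, 100, 1000):
            print(f"gain {gain:5d}: weekly probability = {baseline_weekly * gain:.4%}")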

  1. Response facilitation: implications for perceptual theory, psychotherapy, neurophysiology, and earthquake prediction.

    PubMed

    Medici, R G; Frey, A H; Frey, D

    1985-04-01

    There have been numerous naturalistic observations and anecdotal reports of abnormal animal behavior prior to earthquakes. Basic physiological and behavioral data have been brought together with geophysical data to develop a specific explanation to account for how animals could perceive and respond to precursors of impending earthquakes. The behavior predicted provides a reasonable approximation to the reported abnormal behaviors; that is, the behavior appears to be partly reflexive and partly operant. It can best be described as agitated stereotypic behavior. The explanation formulated has substantial implications for perceptual theory, psychotherapy, and neurophysiology, as well as for earthquake prediction. Testable predictions for biology, psychology, and geophysics can be derived from the explanation.

  2. Hydrothermal response to a volcano-tectonic earthquake swarm, Lassen, California

    USGS Publications Warehouse

    Ingebritsen, Steven E.; Shelly, David R.; Hsieh, Paul A.; Clor, Laura; P.H. Seward,; Evans, William C.

    2015-01-01

    The increasing capability of seismic, geodetic, and hydrothermal observation networks allows recognition of volcanic unrest that could previously have gone undetected, creating an imperative to diagnose and interpret unrest episodes. A November 2014 earthquake swarm near Lassen Volcanic National Park, California, which included the largest earthquake in the area in more than 60 years, was accompanied by a rarely observed outburst of hydrothermal fluids. Although the earthquake swarm likely reflects upward migration of endogenous H2O-CO2 fluids in the source region, there is no evidence that such fluids emerged at the surface. Instead, shaking from the modest-sized (moment magnitude 3.85) but proximal earthquake caused near-vent permeability increases that triggered increased outflow of hydrothermal fluids already present and equilibrated in a local hydrothermal aquifer. Long-term, multiparametric monitoring at Lassen and other well-instrumented volcanoes enhances interpretation of unrest and can provide a basis for detailed physical modeling.

  3. Irregular Recurrence of Large Earthquakes along the San Andreas Fault: Evidence from Trees

    NASA Astrophysics Data System (ADS)

    Jacoby, Gordon C.; Sheppard, Paul R.; Sieh, Kerry E.

    1988-07-01

    Old trees growing along the San Andreas fault near Wrightwood, California, record in their annual ring-width patterns the effects of a major earthquake in the fall or winter of 1812 to 1813. Paleoseismic data and historical information indicate that this event was the "San Juan Capistrano" earthquake of 8 December 1812, with a magnitude of 7.5. The discovery that at least 12 kilometers of the Mojave segment of the San Andreas fault ruptured in 1812, only 44 years before the great January 1857 rupture, demonstrates that intervals between large earthquakes on this part of the fault are highly variable. This variability increases the uncertainty of forecasting destructive earthquakes on the basis of past behavior and accentuates the need for a more fundamental knowledge of San Andreas fault dynamics.

  4. Structure of the tsunamigenic plate boundary and low-frequency earthquakes in the southern Ryukyu Trench

    NASA Astrophysics Data System (ADS)

    Arai, Ryuta; Takahashi, Tsutomu; Kodaira, Shuichi; Kaiho, Yuka; Nakanishi, Ayako; Fujie, Gou; Nakamura, Yasuyuki; Yamamoto, Yojiro; Ishihara, Yasushi; Miura, Seiichi; Kaneda, Yoshiyuki

    2016-07-01

    It has been recognized that even weakly coupled subduction zones may cause large interplate earthquakes leading to destructive tsunamis. The Ryukyu Trench is one of the best fields in which to study this phenomenon, since various slow earthquakes and tsunamis have occurred there; yet the fault structure and seismic activity are poorly constrained. Here we present seismological evidence from marine observation for megathrust faults and low-frequency earthquakes (LFEs). On the basis of passive observation we find that LFEs occur at depths of 15-18 km along the plate interface, and their distribution seems to bridge the gap between the shallow tsunamigenic zone and the deep slow-slip region. This suggests that the southern Ryukyu Trench is dominated by slow earthquakes at all depths and lacks a typical locked zone. The plate interface is overlain by a low-velocity wedge and is accompanied by polarity reversals of seismic reflections, indicating that fluids exist at various depths along the plate interface.

  5. Reduction of earthquake risk in the united states: Bridging the gap between research and practice

    USGS Publications Warehouse

    Hays, W.W.

    1998-01-01

    Continuing efforts under the auspices of the National Earthquake Hazards Reduction Program are under way to improve earthquake risk assessment and risk management in earthquake-prone regions of Alaska, California, Nevada, Washington, Oregon, Arizona, Utah, Wyoming, and Idaho, the New Madrid and Wabash Valley seismic zones in the central United States, the southeastern and northeastern United States, Puerto Rico, Virgin Islands, Guam, and Hawaii. Geologists, geophysicists, seismologists, architects, engineers, urban planners, emergency managers, health care specialists, and policymakers are having to work at the margins of their disciplines to bridge the gap between research and practice and to provide a social, technical, administrative, political, legal, and economic basis for changing public policies and professional practices in communities where the earthquake risk is unacceptable. © 1998 IEEE.

  6. Structure of the tsunamigenic plate boundary and low-frequency earthquakes in the southern Ryukyu Trench.

    PubMed

    Arai, Ryuta; Takahashi, Tsutomu; Kodaira, Shuichi; Kaiho, Yuka; Nakanishi, Ayako; Fujie, Gou; Nakamura, Yasuyuki; Yamamoto, Yojiro; Ishihara, Yasushi; Miura, Seiichi; Kaneda, Yoshiyuki

    2016-07-22

    It has been recognized that even weakly coupled subduction zones may cause large interplate earthquakes leading to destructive tsunamis. The Ryukyu Trench is one of the best fields in which to study this phenomenon, since various slow earthquakes and tsunamis have occurred there; yet the fault structure and seismic activity are poorly constrained. Here we present seismological evidence from marine observation for megathrust faults and low-frequency earthquakes (LFEs). On the basis of passive observation we find that LFEs occur at depths of 15-18 km along the plate interface, and their distribution seems to bridge the gap between the shallow tsunamigenic zone and the deep slow-slip region. This suggests that the southern Ryukyu Trench is dominated by slow earthquakes at all depths and lacks a typical locked zone. The plate interface is overlain by a low-velocity wedge and is accompanied by polarity reversals of seismic reflections, indicating that fluids exist at various depths along the plate interface.

  7. Structure of the tsunamigenic plate boundary and low-frequency earthquakes in the southern Ryukyu Trench

    PubMed Central

    Arai, Ryuta; Takahashi, Tsutomu; Kodaira, Shuichi; Kaiho, Yuka; Nakanishi, Ayako; Fujie, Gou; Nakamura, Yasuyuki; Yamamoto, Yojiro; Ishihara, Yasushi; Miura, Seiichi; Kaneda, Yoshiyuki

    2016-01-01

    It has been recognized that even weakly coupled subduction zones may cause large interplate earthquakes leading to destructive tsunamis. The Ryukyu Trench is one of the best fields in which to study this phenomenon, since various slow earthquakes and tsunamis have occurred there; yet the fault structure and seismic activity are poorly constrained. Here we present seismological evidence from marine observation for megathrust faults and low-frequency earthquakes (LFEs). On the basis of passive observation we find that LFEs occur at depths of 15–18 km along the plate interface, and their distribution seems to bridge the gap between the shallow tsunamigenic zone and the deep slow-slip region. This suggests that the southern Ryukyu Trench is dominated by slow earthquakes at all depths and lacks a typical locked zone. The plate interface is overlain by a low-velocity wedge and is accompanied by polarity reversals of seismic reflections, indicating that fluids exist at various depths along the plate interface. PMID:27447546

  8. Earthquake and Tsunami booklet based on two Indonesia earthquakes

    NASA Astrophysics Data System (ADS)

    Hayashi, Y.; Aci, M.

    2014-12-01

    Many destructive earthquakes struck Indonesia during the last decade. These experiences hold important lessons for people around the world who live in earthquake and tsunami country. We are collecting the testimonies of tsunami survivors to clarify successful evacuation processes and to characterize the physical behavior of tsunamis near the coast. We studied two tsunami events: the 2004 Indian Ocean tsunami and the 2010 Mentawai slow-earthquake tsunami. Many videos and photographs were taken during the 2004 Indian Ocean tsunami disaster, but only at a few restricted locations, so the tsunami's behavior elsewhere was unknown. In this study we tried to collect extensive information about tsunami behavior, covering not only many places but also a wide time range after the strong shaking. In the Mentawai case the earthquake occurred at night, so there are no impressive photographs. To collect detailed information about the evacuation process, we devised an interview method that includes making pictures of the tsunami experience from the scenes of the victims' stories. In the 2004 Aceh case, none of the survivors knew about tsunami phenomena: because there had been no big tsunamigenic earthquakes in the Sumatra region for a hundred years, the public had no knowledge of tsunamis. This situation had greatly improved by the 2010 Mentawai case; TV programs and NGO and governmental public-education programs about tsunami evacuation are now widespread in Indonesia, and many people have fundamental knowledge of earthquake and tsunami disasters. We made a drill book based on the victims' stories, with paintings of impressive scenes from the two events, and used it in a disaster-education event for a school committee in West Java. About 80% of the students and teachers evaluated the contents of the drill book as useful for correct understanding.

  9. Earthquakes and the urban environment. Volume I

    SciTech Connect

    Berlin, G.L.

    1980-01-01

    Because of the complex nature of earthquake effects, current investigations encompass many disciplines, including those of both the physical and social sciences. Research activities center on such diversified topics as earthquake mechanics, earthquake prediction and control, the prompt and accurate detection of tsunamis (seismic sea waves), earthquake-resistant construction, seismic building code improvements, land use zoning, earthquake risk and hazard perception, disaster preparedness, plus the study of the concerns and fears of people who have experienced the effects of an earthquake. This monograph attempts to amalgamate recent research input comprising the vivifying components of urban seismology at a level useful to those having an interest in the earthquake and its effects upon an urban environment. Volume 1 contains chapters on earthquake parameters and hazards.

  10. Earthquakes and the urban environment. Volume II

    SciTech Connect

    Berlin, G.L.

    1980-01-01

    Because of the complex nature of earthquake effects, current investigations encompass many disciplines, including those of both the physical and social sciences. Research activities center on such diversified topics as earthquake mechanics, earthquake prediction and control, the prompt and accurate detection of tsunamis (seismic sea waves), earthquake-resistant construction, seismic building code improvements, land use zoning, earthquake risk and hazard perception, disaster preparedness, plus the study of the concerns and fears of people who have experienced the effects of an earthquake. This monograph attempts to amalgamate recent research input comprising the vivifying components of urban seismology at a level useful to those having an interest in the earthquake and its effects upon an urban environment. Volume 2 contains chapters on earthquake prediction, control, building design and building response.

  11. The October 12, 1992, Dahshur, Egypt, Earthquake

    USGS Publications Warehouse

    Thenhaus, P.C.; Celebi, M.; Sharp, R.V.

    1993-01-01

    We were part of an international reconnaissance team that investigated the Dahshur earthquake. This article summarizes our findings and points out how even a relatively moderate-sized earthquake can cause widespread damage and a large number of casualties.

  12. Recent earthquake prediction research in Japan.

    PubMed

    Mogi, K

    1986-07-18

    Japan has experienced many major earthquake disasters in the past. Early in this century research began that was aimed at predicting the occurrence of earthquakes, and in 1965 an earthquake prediction program was started as a national project. In 1978 a program for constant monitoring and assessment was formally inaugurated with the goal of forecasting the major earthquake that is expected to occur in the near future in the Tokai district of central Honshu Island. The issue of predicting the anticipated Tokai earthquake is discussed in this article, as well as the results of research on major recent earthquakes in Japan: the Izu earthquakes (1978 and 1980) and the Japan Sea earthquake (1983).

  13. Earthquakes & Volcanoes, Volume 23, Number 6, 1992

    USGS Publications Warehouse

    ,; Gordon, David W.

    1993-01-01

    Earthquakes and Volcanoes is published bimonthly by the U.S. Geological Survey to provide current information on earthquakes and seismology, volcanoes, and related natural hazards of interest to both generalized and specialized readers.

  14. St. Louis Area Earthquake Hazards Mapping Project

    USGS Publications Warehouse

    Williams, Robert A.; Steckel, Phyllis; Schweig, Eugene

    2007-01-01

    St. Louis has experienced minor earthquake damage at least 12 times in the past 200 years. Because of this history and its proximity to known active earthquake zones, the St. Louis Area Earthquake Hazards Mapping Project will produce digital maps that show variability of earthquake hazards in the St. Louis area. The maps will be available free via the internet. They can be customized by the user to show specific areas of interest, such as neighborhoods or transportation routes.

  15. GPS Earthquake Early Warning in Cascadia

    NASA Astrophysics Data System (ADS)

    Melbourne, T. I.; Scrivner, C. W.; Santillan, V. M.; Webb, F.

    2011-12-01

    Over 400 GPS receivers of the combined PANGA and PBO networks currently operate along the Cascadia subduction zone, all high-rate and telemetered in real time. These receivers span the M9 megathrust, M7 crustal faults beneath population centers, several active Cascades volcanoes, and a host of other hazard sources, and together they enable new approaches to hazards mitigation. Data from the majority of the stations are received in real time at CWU and processed into one-second position estimates using 1) relative positioning within several reference frames, constrained by 2) absolute point positioning using streamed satellite orbit and clock corrections. While the former produces lower-noise time series, for earthquakes greater than ~M7 and ground displacements exceeding ~20 cm, point positioning alone is shown to provide very rapid and robust estimates of the location and amplitude of both dynamic strong ground motion and permanent deformation. The advantage of point positioning over relative positioning for earthquake applications lies primarily in the fact that each station's position is estimated independently, without double-differencing, in a reference frame defined by Earth's center of mass and the satellite orbits. Point positioning does not require a nearby stable reference station or network, whose motion (such as during a seismic event) would alias directly into fictitious displacement of the station in question. For real-time GPS earthquake characterization this is of great importance in ensuring a robust measurement. We are now producing real-time point positions using GIPSY5 and corrections to broadcast satellite clocks and orbits streamed live from the DLR in Germany. We have also developed a stream editor to flag and fix cycle slips and other data problems on the fly prior to positioning. We are achieving < 3 s latency and RMS scatter under 4 cm. For use in earthquake early warning, we have developed estimation routines
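
    A minimal sketch of extracting a permanent coseismic offset from a 1-Hz point-position stream: difference robust averages of pre- and post-event windows, skipping the interval of dynamic shaking. The synthetic series, noise level, and window lengths are assumptions for illustration, not the actual PANGA/PBO processing.

        import numpy as np

        # Synthetic 1-Hz east-component series standing in for a real stream;
        # noise ~4 cm RMS as quoted above, plus a 35 cm static offset and a
        # short burst of dynamic shaking at the event time.

        rng = np.random.default_rng(2)
        t = np.arange(600)                       # 10 minutes of 1-Hz epochs
        east = 0.04 * rng.standard_normal(600)   # metres of position noise
        east[300:] += 0.35                       # permanent offset at t = 300 s
        east[300:330] += 0.2 * np.exp(-(t[300:330] - 300) / 5.0)  # shaking

        def static_offset(series, t_eq, pre=120, post=120, skip=60):
            """Median post-event minus median pre-event position, skipping
            `skip` seconds of dynamic motion after the origin time."""
            before = np.median(series[t_eq - pre:t_eq])
            after = np.median(series[t_eq + skip:t_eq + skip + post])
            return after - before

        print(f"estimated offset: {static_offset(east, 300):.3f} m (true 0.350 m)")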

  16. Analysis of post-earthquake landslide activity and geo-environmental effects

    NASA Astrophysics Data System (ADS)

    Tang, Chenxiao; van Westen, Cees; Jetten, Victor

    2014-05-01

    Large earthquakes can cause huge losses to human society, due to ground shaking, fault rupture, and the high density of co-seismic landslides that can be triggered in mountainous areas. In areas affected by such large earthquakes the landslide threat continues after the event, because co-seismic landslides may be reactivated by high-intensity rainfall. Earthquakes leave huge amounts of landslide material on the slopes, leading to a high frequency of landslides and debris flows that threaten lives and create great difficulties for post-seismic reconstruction in the earthquake-hit regions. Without critical information such as the frequency and magnitude of landslides after a major earthquake, reconstruction planning and hazard mitigation are difficult. The area hit by the Mw 7.9 Wenchuan earthquake of 2008, Sichuan province, China, shows some typical examples of poor reconstruction planning due to lack of information: huge debris flows destroyed several reconstructed settlements. This research aims to analyze the decay of post-seismic landslide activity in areas hit by a major earthquake, taking the area affected by the 2008 Wenchuan earthquake as the study area. The study will analyze the factors that control post-earthquake landslide activity through quantification of landslide volume changes as well as through numerical simulation of their initiation process, to obtain a better understanding of the potential threat of post-earthquake landslides as a basis for mitigation planning. The research will make use of high-resolution stereo satellite images, UAVs, and Terrestrial Laser Scanning (TLS) to obtain multi-temporal DEMs to monitor changes in loose sediments and post-seismic landslide activity. A debris flow initiation model that incorporates the volume of source materials, vegetation re-growth, and intensity-duration of the triggering precipitation, and that evaluates

  17. On the short-term earthquake prediction: renormalization algorithm and observational evidence in S. California, E. Mediterranean, and Japan

    NASA Astrophysics Data System (ADS)

    Keilis-Borok, V.; Shebalin, P.; Zaliapin, I.; Novikova, O.; Gabrielov, A.

    2002-12-01

    Our point of departure is provided by premonitory seismicity patterns found in models and observations. They reflect an increase of earthquake correlation range and seismic activity within an "intermediate" lead time of years before a strong earthquake. A combination of these patterns, in renormalized definition, precedes within months eight out of nine strong earthquakes in S. California, E. Mediterranean, and Japan. We suggest on that basis a hypothetical short-term prediction algorithm, to be tested by advance prediction. The algorithm is self-adapting and can be transferred without readaptation from earthquake to earthquake and from area to area. If confirmed, it will have a simple, albeit non-unique, qualitative interpretation. The suggested algorithm is designed to provide a short-term approximation to an intermediate-term prediction. It remains unclear whether it could be used independently.

  18. Geophysical setting of the 2000 ML 5.2 Yountville, California, earthquake: Implications for seismic Hazard in Napa Valley, California

    USGS Publications Warehouse

    Langenheim, V.E.; Graymer, R.W.; Jachens, R.C.

    2006-01-01

    The epicenter of the 2000 ML 5.2 Yountville earthquake was located 5 km west of the surface trace of the West Napa fault, as defined by Helley and Herd (1977). On the basis of the re-examination of geologic data and the analysis of potential field data, the earthquake occurred on a strand of the West Napa fault, the main basin-bounding fault along the west side of Napa Valley. Linear aeromagnetic anomalies and a prominent gravity gradient extend the length of the fault to the latitude of Calistoga, suggesting that this fault may be capable of larger-magnitude earthquakes. Gravity data indicate a ∼2-km-deep basin centered on the town of Napa, where damage was concentrated during the Yountville earthquake. It most likely played a minor role in enhancing shaking during this event but may lead to enhanced shaking caused by wave trapping during a larger-magnitude earthquake.

  19. 100 Years after the San Francisco Earthquake of 1906: Earthquake Forecasting and Forecast Verification - Status, Prospects and Promise

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Turcotte, D. L.; Holliday, J. R.; Rundle, P. B.; Tiampo, K. F.; Chen, C.; Nanjo, K.; Donnellan, A.; Klein, W.

    2005-12-01

    The Elastic Rebound Hypothesis was proposed by H.F. Reid in the Carnegie Commission summary report published in 1910. As we approach the 100th year anniversary of the 1906 disaster, progress not possible in that era is made possible by the use of advanced computer models and simulations, combined with new data sets and data mining techniques, together with ideas about complex nonlinear systems. Modern computational technology allows us to construct models such as Virtual California that include many of the physical processes known to be important in earthquake dynamics. These include elastic interactions among the faults in the model, driving at the correct plate tectonic rates, and frictional physics on the faults using the physics obtained from laboratory models with parameters consistent with the occurrence of historic earthquakes. Models such as Virtual California represent "numerical laboratories" in which the event statistics and precursory patterns can be determined directly from simulations, rather than by assumption. Using such models, we have found that failure on groups of fault segments is usually best described by Weibull statistics. Simulations also allow us to develop new types of forecast methods such as the Pattern Informatics (PI) method, which develops forecast maps based upon locating areas of highest change in seismic activity of small earthquakes. A contrasting approach, which serves as a null hypothesis, is to compute maps based upon locating the highest rates of small earthquakes (Relative Intensity, or RI). Development, testing, and evaluation of these forecast methods is possible only if reliable testing mechanisms are available. The Relative Operating Characteristic (ROC) method of forecast verification has been developed in the study of binary forecasts of tornadoes and other weather-related events. We have adapted this test to earthquake forecasting and find that it allows the best objective assessment of which methods show optimal
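
    A compact sketch of the ROC test as adapted to gridded forecasts: threshold the forecast map at successively lower values, declare alarms, and trace hit rate against false-alarm rate. The forecast and "observed" maps below are synthetic placeholders, not PI or RI outputs.

        import numpy as np

        rng = np.random.default_rng(3)
        forecast = rng.random(10_000)                  # forecast score per grid cell
        truth = rng.random(10_000) < 0.02 * forecast   # cells where target quakes occur

        order = np.argsort(-forecast)                  # alarm cells, best first
        hit_rate = np.cumsum(truth[order]) / truth.sum()
        false_rate = np.cumsum(~truth[order]) / (~truth).sum()
        # Trapezoidal area under the ROC curve; 0.5 corresponds to no skill.
        auc = np.sum(np.diff(false_rate) * (hit_rate[1:] + hit_rate[:-1]) / 2)
        print(f"area under ROC curve: {auc:.3f} (0.5 = no skill)")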

  20. Broadband source spectrum, seismic energy, and stress drop of the 1989 Macquarie Ridge earthquake

    SciTech Connect

    Houston, H.

    1990-06-01

    The author computes the broadband source spectrum at periods from 1 to 50 seconds using teleseismic P body waves of the May 23, 1989 Macquarie Ridge earthquake (M_W = 8.1) recorded by the GDSN, GEOSCOPE, and IDA networks. The average source spectrum is obtained by windowing, tapering, and Fourier-transforming P waves, removing from the spectra the effects of attenuation, geometrical spreading, and radiation pattern, and averaging logarithmically over the stations. The source spectrum for the strike-slip Macquarie Ridge earthquake is higher than an average source spectrum of seven recent large earthquakes (scaled to be comparable to an M_W = 8.1 earthquake) by a factor of 2 to 3 at periods of 1 to 20 seconds. These other earthquakes were underthrusting events in subduction zones. Using Haskell's formulation and assuming a point source with no directivity, she estimates the seismically radiated energy from the source spectrum by integrating the square of the source spectrum in velocity and scaling the result. The seismic energy thus estimated for the Macquarie Ridge earthquake is 3 to 8 × 10^23 ergs. An Orowan stress drop can be obtained from the seismic energy and moment. The Orowan stress drop for the Macquarie Ridge earthquake is about 20 to 50 bars, much higher than similarly determined stress drops of other recent large earthquakes. There is a correlation between the Orowan stress drops and the time since the last earthquake of comparable or larger magnitude for seven recent large earthquakes. This correlation suggests that a healing process operates that may control the mechanical strength of the fault and is important on time scales of tens to hundreds of years.
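
    The quoted stress drop can be roughly cross-checked from the numbers in the abstract using the standard Orowan relation, stress drop = 2 mu E_s / M_0, together with the Hanks-Kanamori moment-magnitude relation. The rigidity below is an assumed crustal average, so the result (roughly 10-30 bars) agrees with the quoted 20-50 bars only to within the uncertainty in mu and M_0.

        # Cross-check of the Orowan stress drop from the energies quoted above.
        mu = 3.0e11                   # dyn/cm^2, assumed crustal rigidity
        Mw = 8.1
        M0 = 10 ** (1.5 * Mw + 16.1)  # dyn*cm, Hanks & Kanamori (1979) relation
        for Es in (3e23, 8e23):       # erg, energy range quoted in the abstract
            bars = 2 * mu * Es / M0 / 1e6   # 1 bar = 1e6 dyn/cm^2
            print(f"E_s = {Es:.0e} erg -> Orowan stress drop ~ {bars:.0f} bars")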

  1. Modeling, Forecasting and Mitigating Extreme Earthquakes

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations together with comprehensive modeling of earthquakes and forecasting extreme events. Extreme earthquakes (large magnitude and rare events) are manifestations of complex behavior of the lithosphere structured as a hierarchical system of blocks of different sizes. Understanding of physics and dynamics of the extreme events comes from observations, measurements and modeling. A quantitative approach to simulate earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events, influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of predictability of large earthquakes (how well can large earthquakes be predicted today?) will be also discussed along with possibilities in mitigation of earthquake disasters (e.g., on 'inverse' forensic investigations of earthquake disasters).

  2. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program...

  3. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 13 Business Credit and Assistance 1 2014-01-01 2014-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program...

  4. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 13 Business Credit and Assistance 1 2013-01-01 2013-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program...

  5. Understanding and responding to earthquake hazards

    NASA Technical Reports Server (NTRS)

    Raymond, C. A.; Lundgren, P. R.; Madsen, S. N.; Rundle, J. B.

    2002-01-01

    Advances in understanding the earthquake cycle and in assessing earthquake hazards are of great importance. Dynamic earthquake hazard assessments resolved for a range of spatial scales and time scales will allow a more systematic approach to prioritizing the retrofitting of vulnerable structures, relocating populations at risk, protecting lifelines, preparing for disasters, and educating the public.

  6. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program...

  7. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program...

  8. Earthquakes in the New Zealand Region.

    ERIC Educational Resources Information Center

    Wallace, Cleland

    1995-01-01

    Presents a thorough overview of earthquakes in New Zealand, discussing plate tectonics, seismic measurement, and historical occurrences. Includes 10 figures illustrating such aspects as earthquake distribution, intensity, and fissures in the continental crust. Tabular data includes a list of most destructive earthquakes and descriptive effects…

  9. Fault failure with moderate earthquakes

    USGS Publications Warehouse

    Johnston, M.J.S.; Linde, A.T.; Gladwin, M.T.; Borcherdt, R.D.

    1987-01-01

    High resolution strain and tilt recordings were made in the near-field of, and prior to, the May 1983 Coalinga earthquake (ML = 6.7, Δ = 51 km), the August 4, 1985, Kettleman Hills earthquake (ML = 5.5, Δ = 34 km), the April 1984 Morgan Hill earthquake (ML = 6.1, Δ = 55 km), the November 1984 Round Valley earthquake (ML = 5.8, Δ = 54 km), the January 14, 1978, Izu, Japan earthquake (ML = 7.0, Δ = 28 km), and several other smaller magnitude earthquakes. These recordings were made with near-surface instruments (resolution 10^-8), with borehole dilatometers (resolution 10^-10) and a 3-component borehole strainmeter (resolution 10^-9). While observed coseismic offsets are generally in good agreement with expectations from elastic dislocation theory, and while post-seismic deformation continued, in some cases, with a moment comparable to that of the main shock, preseismic strain or tilt perturbations from hours to seconds (or less) before the main shock are not apparent above the present resolution. Precursory slip for these events, if any occurred, must have had a moment less than a few percent of that of the main event. To the extent that these records reflect general fault behavior, the strong constraint on the size and amount of slip triggering major rupture makes prediction of the onset times and final magnitudes of the rupture zones a difficult task unless the instruments are fortuitously installed near the rupture initiation point. These data are best explained by an inhomogeneous failure model for which various areas of the fault plane have either different stress-slip constitutive laws or spatially varying constitutive parameters. Other work on seismic waveform analysis and synthetic waveforms indicates that the rupturing process is inhomogeneous and controlled by points of higher strength. These models indicate that rupture initiation occurs at smaller regions of higher strength which, when broken, allow runaway catastrophic failure. © 1987.

  10. Izmit, Turkey 1999 Earthquake Interferogram

    NASA Technical Reports Server (NTRS)

    2001-01-01

    This image is an interferogram that was created using pairs of images taken by Synthetic Aperture Radar (SAR). The images, acquired at two different times, have been combined to measure surface deformation or changes that may have occurred during the time between data acquisition. The images were collected by the European Space Agency's Remote Sensing satellite (ERS-2) on 13 August 1999 and 17 September 1999 and were combined to produce these image maps of the apparent surface deformation, or changes, during and after the 17 August 1999 Izmit, Turkey earthquake. This magnitude 7.6 earthquake was the largest in 60 years in Turkey and caused extensive damage and loss of life. Each of the color contours of the interferogram represents 28 mm (1.1 inches) of motion towards the satellite, or about 70 mm (2.8 inches) of horizontal motion. White areas are outside the SAR image or water of seas and lakes. The North Anatolian Fault that broke during the Izmit earthquake moved more than 2.5 meters (8.1 feet) to produce the pattern measured by the interferogram. Thin red lines show the locations of fault breaks mapped on the surface. The SAR interferogram shows that the deformation and fault slip extended west of the surface faults, underneath the Gulf of Izmit. Thick black lines mark the fault rupture inferred from the SAR data. Scientists are using the SAR interferometry along with other data collected on the ground to estimate the pattern of slip that occurred during the Izmit earthquake. This is then used to improve computer models that predict how this deformation transferred stress to other faults and to the continuation of the North Anatolian Fault, which extends to the west past the large city of Istanbul. These models show that the Izmit earthquake further increased the already high probability of a major earthquake near Istanbul.
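
    The two displacement figures quoted above are linked by simple geometry: one fringe corresponds to half the radar wavelength of line-of-sight (LOS) motion, and purely horizontal motion projects into the line of sight roughly by the sine of the incidence angle. The wavelength and incidence angle below are approximate ERS values, used here only to show that 28 mm LOS implies about 70 mm horizontal.

        import math

        wavelength_mm = 56.6    # approximate ERS C-band radar wavelength, mm
        incidence_deg = 23.0    # approximate ERS incidence (look) angle

        los_per_fringe = wavelength_mm / 2.0
        horiz_per_fringe = los_per_fringe / math.sin(math.radians(incidence_deg))
        print(f"LOS motion per fringe:     {los_per_fringe:.1f} mm")
        print(f"implied horizontal motion: {horiz_per_fringe:.1f} mm per fringe")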

  11. Statistical earthquake focal mechanism forecasts

    NASA Astrophysics Data System (ADS)

    Kagan, Yan Y.; Jackson, David D.

    2014-04-01

    Forecasts of the focal mechanisms of future shallow (depth 0-70 km) earthquakes are important for seismic hazard estimates and Coulomb stress, and other models of earthquake occurrence. Here we report on a high-resolution global forecast of earthquake rate density as a function of location, magnitude and focal mechanism. In previous publications we reported forecasts of 0.5° spatial resolution, covering the latitude range from -75° to +75°, based on the Global Central Moment Tensor earthquake catalogue. In the new forecasts we have improved the spatial resolution to 0.1° and the latitude range from pole to pole. Our focal mechanism estimates require distance-weighted combinations of observed focal mechanisms within 1000 km of each gridpoint. Simultaneously, we calculate an average rotation angle between the forecasted mechanism and all the surrounding mechanisms, using the method of Kagan & Jackson proposed in 1994. This average angle reveals the level of tectonic complexity of a region and indicates the accuracy of the prediction. The procedure becomes problematical where longitude lines are not approximately parallel, and where shallow earthquakes are so sparse that an adequate sample spans very large distances. North or south of 75°, the azimuths of points 1000 km away may vary by about 35°. We solved this problem by calculating focal mechanisms on a plane tangent to the Earth's surface at each forecast point, correcting for the rotation of the longitude lines at the locations of earthquakes included in the averaging. The corrections are negligible between -30° and +30° latitude, but outside that band uncorrected rotations can be significantly off. Improved forecasts at 0.5° and 0.1° resolution are posted at http://eq.ess.ucla.edu/kagan/glob_gcmt_index.html.
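
    The core smoothing operation behind such a forecast can be sketched generically: spread each past epicentre over the grid with a distance-weighted kernel, cut off at 1000 km as described above. The kernel form and the smoothing scale r0 below are assumed placeholders, not the published parameters.

        import numpy as np

        def haversine_km(lat1, lon1, lat2, lon2):
            """Great-circle distance in km between points given in degrees."""
            lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
            a = (np.sin((lat2 - lat1) / 2) ** 2 +
                 np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
            return 2 * 6371.0 * np.arcsin(np.sqrt(a))

        quakes = np.array([[35.0, 139.0], [36.2, 140.5], [34.1, 138.2]])  # lat, lon
        r0 = 50.0  # km, assumed smoothing distance

        def rate_density(lat, lon):
            d = haversine_km(quakes[:, 0], quakes[:, 1], lat, lon)
            w = 1.0 / (d ** 2 + r0 ** 2)   # power-law smoothing kernel
            w[d > 1000.0] = 0.0            # 1000 km cutoff, as in the text
            return w.sum()

        print(f"relative rate at (35.5N, 139.5E): {rate_density(35.5, 139.5):.2e}")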

  12. The great San Francisco earthquake

    USGS Publications Warehouse

    Nason, R. D.

    1981-01-01

    Seventy-five years ago, on April 18, 1906, the most devastating earthquake in United States history occurred in northern California. This earthquake, which struck at 5:12 in the morning just as dawn was breaking, came from rupture of the San Andreas fault from San Juan Bautista (near Hollister) northward for 270 miles to the coast near Eureka. Buildings were damaged everywhere in this region, over a north-south distance of 370 miles from Arcata to Salinas and an east-west width of 50 miles inland from the coast. The larger cities of San Francisco, San Jose, and Santa Rosa suffered the most severe damage.

  13. Earth science: lasting earthquake legacy

    USGS Publications Warehouse

    Parsons, Thomas E.

    2009-01-01

    On 31 August 1886, a magnitude-7 shock struck Charleston, South Carolina; low-level activity continues there today. One view of seismic hazard is that large earthquakes will return to New Madrid and Charleston at intervals of about 500 years. With expected ground motions that would be stronger than average, that prospect produces estimates of earthquake hazard that rival those at the plate boundaries marked by the San Andreas fault and Cascadia subduction zone. The result is two large 'bull's-eyes' on the US National Seismic Hazard Maps — which, for example, influence regional building codes and perceptions of public safety.

  14. Earthquake location in island arcs

    USGS Publications Warehouse

    Engdahl, E.R.; Dewey, J.W.; Fujita, K.

    1982-01-01

    A comprehensive data set of selected teleseismic P-wave arrivals and local-network P- and S-wave arrivals from large earthquakes occurring at all depths within a small section of the central Aleutians is used to examine the general problem of earthquake location in island arcs. Reference hypocenters for this special data set are determined for shallow earthquakes from local-network data and for deep earthquakes from combined local and teleseismic data by joint inversion for structure and location. The high-velocity lithospheric slab beneath the central Aleutians may displace hypocenters that are located using spherically symmetric Earth models; the amount of displacement depends on the position of the earthquakes with respect to the slab and on whether local or teleseismic data are used to locate the earthquakes. Hypocenters for trench and intermediate-depth events appear to be minimally biased by the effects of slab structure on rays to teleseismic stations. However, locations of intermediate-depth events based on only local data are systematically displaced southwards, the magnitude of the displacement being proportional to depth. Shallow-focus events along the main thrust zone, although well located using only local-network data, are severely shifted northwards and deeper, with displacements as large as 50 km, by slab effects on teleseismic travel times. Hypocenters determined by a method that utilizes seismic ray tracing through a three-dimensional velocity model of the subduction zone, derived by thermal modeling, are compared to results obtained by the method of joint hypocenter determination (JHD) that formally assumes a laterally homogeneous velocity model over the source region and treats all raypath anomalies as constant station corrections to the travel-time curve. The ray-tracing method has the theoretical advantage that it accounts for variations in travel-time anomalies within a group of events distributed over a sizable region of a dipping, high
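
    The contrast drawn above between JHD (constant station corrections) and 3-D ray tracing can be made concrete with a toy relocation scheme. The sketch below assumes a uniform velocity model and straight rays, locates each event by grid search, and alternates that with updating one correction per station (the mean residual). Real JHD solves the coupled inverse problem jointly, and, as the abstract stresses, constant corrections cannot capture the laterally varying slab structure.

        import numpy as np

        V = 6.0  # assumed uniform P velocity (km/s); real arcs are far more complex

        def tt(src, stas):
            """Straight-ray travel times (s) from one source to all stations (km)."""
            return np.linalg.norm(stas - src, axis=1) / V

        def locate(obs, stas, corr, grid):
            """Grid-search hypocenter: minimise residuals after removing the
            best origin-time shift and the current station corrections."""
            best, best_cost = None, np.inf
            for src in grid:               # grid: candidate (x, y, z) points
                r = obs - tt(src, stas) - corr
                r -= r.mean()              # absorb the unknown origin time
                cost = float(r @ r)
                if cost < best_cost:
                    best, best_cost = src, cost
            return best

        def jhd_like(obs_all, stas, grid, n_iter=5):
            """Alternate relocating events and updating one correction per
            station, mimicking joint hypocenter determination."""
            corr = np.zeros(len(stas))
            for _ in range(n_iter):
                locs = [locate(o, stas, corr, grid) for o in obs_all]
                res = np.array([o - tt(l, stas) - corr
                                for o, l in zip(obs_all, locs)])
                res -= res.mean(axis=1, keepdims=True)  # per-event origin shift
                corr += res.mean(axis=0)                # path anomalies -> station terms
            return locs, corr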

  15. Earthquakes triggered by fluid extraction

    USGS Publications Warehouse

    Segall, P.

    1989-01-01

    Seismicity is correlated in space and time with production from some oil and gas fields where pore pressures have declined by several tens of megapascals. Reverse faulting has occurred both above and below petroleum reservoirs, and normal faulting has occurred on the flanks of at least one reservoir. The theory of poroelasticity requires that fluid extraction locally alter the state of stress. Calculations with simple geometries predict stress perturbations that are consistent with observed earthquake locations and focal mechanisms. Measurements of surface displacement and strain, pore pressure, stress, and poroelastic rock properties in such areas could be used to test theoretical predictions and improve our understanding of earthquake mechanics. -Author
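
    For a rough sense of the magnitudes involved, the standard poroelastic result for a laterally extensive reservoir gives the change in horizontal total stress inside the reservoir as d_sigma_h = alpha * (1 - 2*nu) / (1 - nu) * dp. The sketch below applies it with assumed values of Biot's coefficient and Poisson's ratio; the paper's own calculations use specific reservoir geometries, so this is illustrative only.

        def horizontal_stress_change(dp_mpa, alpha=0.8, nu=0.25):
            """Change in horizontal total stress (MPa) inside a thin, laterally
            extensive reservoir whose pore pressure changes by dp_mpa.
            alpha (Biot's coefficient) and nu (Poisson's ratio) are assumed
            illustrative values, not the paper's."""
            return alpha * (1 - 2 * nu) / (1 - nu) * dp_mpa

        # A few tens of MPa of depletion, as cited in the abstract:
        print(horizontal_stress_change(-30.0))   # ~ -16 MPa stress reduction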

  16. Earthquake Education in Prime Time

    NASA Astrophysics Data System (ADS)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first aired in spring 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and the Department of Homeland Security's Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU), on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

  17. Earthquake prediction in Japan and natural time analysis of seismicity

    NASA Astrophysics Data System (ADS)

    Uyeda, S.; Varotsos, P.

    2011-12-01

    The M9 super-giant earthquake and its huge tsunami devastated East Japan on 11 March 2011, causing more than 20,000 casualties and serious damage to the Fukushima nuclear plant. This earthquake was not predicted, either short term or long term. Seismologists were shocked because such an event was not even considered possible at the East Japan subduction zone. However, it was not the only unpredicted earthquake. In fact, throughout several decades of the National Earthquake Prediction Project, not a single earthquake was predicted. In reality, practically no effective research has been conducted on the most important problem, short-term prediction. This happened because the Japanese National Project was devoted to the construction of elaborate seismic networks, which was not the best route to short-term prediction. After the Kobe disaster, in order to parry the mounting criticism of this history of no success, the project defiantly changed its policy to "stop aiming at short-term prediction because it is impossible and concentrate resources on fundamental research", which in effect meant "more funding for no-prediction research". The public was not, and still is not, informed of this change. Earthquake prediction will obviously become possible only when reliable precursory phenomena are captured, and we have insisted that this is most likely to be achieved through non-seismic means such as geochemical/hydrological and electromagnetic monitoring. Admittedly, the lack of convincing precursors for the M9 super-giant earthquake works against our case, although its epicenter lay far offshore, beyond the range of the operating monitoring systems. In this presentation, we show a new possibility of finding remarkable precursory signals, ironically, in ordinary seismological catalogs. In the framework of the new time domain termed natural time, an order parameter of seismicity, κ1, has been introduced. This is the variance of natural time χ weighted by the normalised energy release at each χ. In the case that Seismic Electric Signals
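
    The order parameter κ1 has a compact definition: with the k-th of N events assigned natural time χ_k = k/N and weight p_k equal to its normalised energy release, κ1 is the weighted variance of χ. A minimal sketch follows; the moment-magnitude conversion and the example magnitudes are illustrative assumptions, not values from the abstract.

        import numpy as np

        def kappa1(energies):
            """Order parameter kappa_1 of natural time analysis: the variance
            of natural time chi_k = k/N weighted by the normalised energy
            release p_k of the k-th event."""
            e = np.asarray(energies, dtype=float)
            n = len(e)
            chi = np.arange(1, n + 1) / n
            p = e / e.sum()
            return np.sum(p * chi**2) - np.sum(p * chi)**2

        # Energies may be taken as seismic moments from an ordinary catalogue,
        # e.g. via M0 ~ 10**(1.5*Mw + 9.1); magnitudes here are illustrative.
        mags = [4.1, 3.8, 4.5, 5.0, 4.2]
        moments = [10 ** (1.5 * m + 9.1) for m in mags]
        print(kappa1(moments))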

  18. Earthquake interevent time distribution in Kachchh, Northwestern India

    NASA Astrophysics Data System (ADS)

    Pasari, Sumanta; Dikshit, Onkar

    2015-08-01

    Statistical properties of earthquake interevent times have long been of interest to seismologists and earthquake professionals, mainly for hazard-related concerns. In this paper, we present a comprehensive study of the temporal statistics of earthquake interoccurrence times in the seismically active Kachchh peninsula (western India), using thirteen probability distributions: exponential, gamma, lognormal, Weibull, Levy, Maxwell, Pareto, Rayleigh, inverse Gaussian (Brownian passage time), inverse Weibull (Frechet), exponentiated exponential, exponentiated Rayleigh (Burr type X), and exponentiated Weibull. Statistical inference on the scale and shape parameters of these distributions is carried out by maximum likelihood estimation, with the Fisher information matrices used as a surrogate tool to appraise parametric uncertainty in the estimation process. Models are ranked by two goodness-of-fit criteria: the maximum likelihood criterion, modified to the Akaike information criterion (AIC), and the Kolmogorov-Smirnov (K-S) minimum distance criterion. The results reveal that (i) the exponential model provides the best fit, (ii) the gamma, lognormal, Weibull, inverse Gaussian, exponentiated exponential, exponentiated Rayleigh, and exponentiated Weibull models provide an intermediate fit, and (iii) the rest, namely Levy, Maxwell, Pareto, Rayleigh, and inverse Weibull, fit the earthquake catalog of Kachchh and its adjacent regions poorly. This study also analyzes the present-day seismicity in terms of the estimated recurrence interval and conditional probability curves (hazard curves). The estimated cumulative probability and the conditional probability of a magnitude 5.0 or higher event reach 0.8-0.9 by 2027-2036 and 2034-2043, respectively. These values have significant implications for a variety of practical applications including earthquake insurance, seismic zonation
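
    The fitting-and-ranking workflow described above is straightforward to reproduce for a reduced candidate set. The sketch below fits four of the thirteen distributions by maximum likelihood with SciPy, fixing the location parameter at zero, and ranks them by AIC and the K-S statistic. The candidate set, the fixed location, and the synthetic interevent times are simplifying assumptions, not the paper's full recipe.

        import numpy as np
        from scipy import stats

        def compare_models(intervals):
            """Fit candidate interevent-time models by maximum likelihood and
            rank them by AIC (lower is better) and the K-S distance."""
            candidates = {
                "exponential": stats.expon,
                "gamma": stats.gamma,
                "weibull": stats.weibull_min,
                "lognormal": stats.lognorm,
            }
            rows = []
            for name, dist in candidates.items():
                params = dist.fit(intervals, floc=0)      # location fixed at 0
                loglik = np.sum(dist.logpdf(intervals, *params))
                k = len(params) - 1                       # loc not estimated
                aic = 2 * k - 2 * loglik
                ks = stats.kstest(intervals, dist.cdf, args=params).statistic
                rows.append((name, aic, ks))
            return sorted(rows, key=lambda r: r[1])

        # Example with synthetic Poissonian interevent times (days):
        rng = np.random.default_rng(0)
        print(compare_models(rng.exponential(30.0, size=500)))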

  19. Utilizing online monitoring of water wells for detecting earthquake precursors

    NASA Astrophysics Data System (ADS)

    Reuveni, Y.; Anker, Y.; Inbar, N.; Yellin-Dror, A.; Guttman, J.; Flexer, A.

    2015-12-01

    Groundwater reaction to earthquakes is well known and documented, mostly as changes in water levels or spring discharge, but also as changes in groundwater chemistry. During 2004, groundwater level undulations preceded a series of moderate (ML~5) earthquakes that occurred along the Dead Sea Rift System (DSRS). In order to validate these preliminary observations, monitoring of several observation wells was initiated. The monitoring and telemetry infrastructure, as well as the wells, were allocated specifically for the research by the Israeli National Water Company (Mekorot LTD.). After several earthquake events were missed because of insufficient sampling frequency, and data were lost owing to insufficient storage capacity, it was decided to establish an independent monitoring system. This current stage of the research commenced in 2011 and has only recently become fully operational. At present there are four observation wells located along major faults adjacent to the DSRS. The wells must be inactive and have a confined production layer. The wells are equipped with sensors for groundwater level, water conductivity, and groundwater temperature measurements. Data are acquired and transferred at one-minute resolution, and the dataset is transferred over a GPRS network to a central database server. Since the start of the present research stage, most of the earthquakes recorded in the vicinity of the DSRS were smaller than ML 5, with groundwater responding only after the ground movement. Nonetheless, distant earthquakes (ML~3) occurring as far as 300 km away along a fault adjacent to the DSRS were registered at the observation wells. A recent recurrence of the precursory pattern was followed by an ML 5.5 earthquake with an epicenter near the eastern shore of the Red Sea, about 400 km south of the wells that signalled the quake. In both wells, anomalies in water levels and conductivity were found a few hours before the quake, although any single anomaly cannot
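
    The abstract does not specify how the pre-quake anomalies are detected, so the sketch below shows only one plausible detector: a rolling z-score over the one-minute telemetry, flagging samples that deviate strongly from the recent baseline. The window length, threshold, and series names are assumptions for illustration.

        import pandas as pd

        def flag_anomalies(series, window=1440, z_thresh=4.0):
            """Flag samples that deviate strongly from the recent baseline.
            A rolling z-score over a 24-hour window of 1-minute samples is an
            illustrative detector, not the project's actual algorithm."""
            base = series.rolling(window, min_periods=window // 2)
            z = (series - base.mean()) / base.std()
            return z.abs() > z_thresh

        # level and conductivity would be pandas Series indexed by timestamp
        # at 1-minute resolution (hypothetical names for the wells' telemetry):
        # alerts = flag_anomalies(level) | flag_anomalies(conductivity)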

  20. Predictability of population displacement after the 2010 Haiti earthquake.

    PubMed

    Lu, Xin; Bengtsson, Linus; Holme, Petter

    2012-07-17

    Most severe disasters cause large population movements. These movements make it difficult for relief organizations to efficiently reach people in need. Understanding and predicting the locations of affected people during disasters is key to effective humanitarian relief operations and to long-term societal reconstruction. We collaborated with the largest mobile phone operator in Haiti (Digicel) and analyzed the movements of 1.9 million mobile phone users during the period from 42 d before to 341 d after the devastating Haiti earthquake of January 12, 2010. Nineteen days after the earthquake, population movements had caused the population of the capital Port-au-Prince to decrease by an estimated 23%. Both the travel distances and the size of people's movement trajectories grew after the earthquake. These findings, in combination with the disorder present after the disaster, suggest that people's movements should have become less predictable. Instead, the predictability of people's trajectories remained high and even increased slightly during the three-month period after the earthquake. Moreover, the destinations of people who left the capital during the first three weeks after the earthquake were highly correlated with their mobility patterns during normal times, and specifically with the locations where people had significant social bonds. For the people who left Port-au-Prince, both the duration of their stay outside the city and the time of their return followed a skewed, fat-tailed distribution. The findings suggest that population movements during disasters may be significantly more predictable than previously thought.
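
    One common way to quantify the "size of people's movement trajectories" mentioned above is the radius of gyration: the RMS distance of visited locations from the trajectory's centre of mass. The sketch below assumes coordinates already projected to kilometres and uses toy values; it illustrates the metric, not the paper's full analysis pipeline.

        import numpy as np

        def radius_of_gyration(points_km):
            """RMS distance of a trajectory's visited points from its centre
            of mass; coordinates are assumed already projected to km."""
            pts = np.asarray(points_km, dtype=float)
            center = pts.mean(axis=0)
            return np.sqrt(((pts - center) ** 2).sum(axis=1).mean())

        # Toy trajectory (km east/north of an arbitrary origin):
        print(radius_of_gyration([[0, 0], [2, 1], [40, 35], [41, 36]]))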